# neuroskill

neuroskill is a command-line interface for the NeuroSkill real-time EXG analysis API. It communicates with a locally running Skill server over WebSocket or HTTP, giving you instant terminal access to EXG brain-state scores, sleep staging, session history, annotations, similarity search, and more.
This is the Python port of the TypeScript neuroskill CLI. It is a faithful port — same commands, same flags, same output format, same transport negotiation.
> ⚠️ **Research Use Only.** All metrics are experimental outputs derived from consumer-grade EXG hardware. They are not validated clinical measurements, not FDA/CE-cleared, and must not be used for diagnosis, treatment decisions, or any medical purpose.
- Features
- Requirements
- Installation
- Quick Start
- Transport
- Commands
- Output Modes
- Global Options
- Examples
- How to Cite
- License
## Features

- Real-time EXG scores — focus, relaxation, engagement, meditation, cognitive load, drowsiness
- Consciousness metrics — Lempel-Ziv Complexity proxy, wakefulness, information integration
- PPG / HRV — heart rate, RMSSD, SDNN, pNN50, LF/HF, SpO₂, Baevsky stress index
- Sleep staging — automatic per-epoch classification and session-level summary
- Session history — list, compare, and UMAP-project all past recording sessions
- Annotations — create timestamped labels and search them by free text or EXG similarity
- Interactive graph search — cross-modal 4-layer graph (labels → EXG → labels)
- Dual transport — WebSocket (full-duplex, live events) and HTTP REST (curl-friendly)
- Pipe-friendly — the `--json` flag emits clean JSON to stdout; informational lines go to stderr
- Cross-platform — Python ≥ 3.9, Windows / macOS / Linux
## Requirements

| Dependency | Requirement |
|---|---|
| Python | ≥ 3.9 |
| Skill server | running locally (auto-discovered via mDNS or lsof) |
## Installation

```shell
pip install neuroskill
```

or with uv:

```shell
uv tool install neuroskill
```

After installation the `neuroskill` binary is available globally:

```shell
neuroskill status
```

To run from source:

```shell
git clone https://github.com/NeuroSkill-com/neuroskill-py
cd neuroskill-py
uv sync
uv run neuroskill status
```

## Quick Start

```shell
# Full device / session / scores snapshot
neuroskill status

# Pipe raw JSON to jq (or python -m json.tool)
neuroskill status --json | python3 -m json.tool
neuroskill status --json | jq '.scores'

# Stream broadcast events for 10 seconds
neuroskill listen --seconds 10

# Print full help with examples
neuroskill --help
```

## Transport

neuroskill auto-discovers the Skill server port via:
1. `--port <n>` flag (skips all discovery)
2. mDNS (`_skill._tcp` service advertisement, 5 s timeout)
3. `lsof` / `pgrep` fallback (probes each TCP LISTEN port)
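The final fallback amounts to probing candidate localhost ports for a live TCP listener. A minimal Python sketch of that idea — the probe timeout and the notion of a pre-built candidate list are illustrative assumptions here, not the CLI's actual implementation:

```python
import socket
from typing import Optional


def probe_port(port: int, host: str = "127.0.0.1", timeout: float = 0.25) -> bool:
    """Return True if something accepts a TCP connection on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def discover(candidates: "list[int]") -> Optional[int]:
    """Return the first candidate port with a live listener, else None."""
    for port in candidates:
        if probe_port(port):
            return port
    return None
```

In the real CLI the candidate list comes from `lsof`/`pgrep` output rather than being hard-coded.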
### WebSocket

Full-duplex, low-latency. Supports live event streaming. Used automatically when the server is reachable.

```shell
neuroskill status --ws     # force WebSocket
```

### HTTP REST

Request/response only. Compatible with curl, Python requests, or any HTTP client.

```shell
neuroskill status --http   # force HTTP

# Equivalent curl call:
curl -s -X POST http://127.0.0.1:8375/ \
  -H "Content-Type: application/json" \
  -d '{"command":"status"}'
```

The CLI probes WebSocket first and silently falls back to HTTP. Informational messages go to stderr so JSON piping is never polluted.
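Since the HTTP transport is plain JSON-over-POST, the curl call above can be reproduced from Python with just the standard library. A sketch — the endpoint path and payload shape are taken from the curl example; the port is whatever discovery (or `--port`) yields:

```python
import json
import urllib.request


def send_command(command: str, port: int = 8375) -> dict:
    """POST a one-shot command to the Skill server's HTTP endpoint
    (same request the curl example sends) and parse the JSON reply."""
    payload = json.dumps({"command": command}).encode("utf-8")
    req = urllib.request.Request(
        f"http://127.0.0.1:{port}/",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.load(resp)
```

For example, `send_command("status")` is equivalent to `neuroskill raw '{"command":"status"}'` over HTTP.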
## Commands

| Command | Description |
|---|---|
| `status` | Full device / session / embeddings / scores snapshot |
| `session [index]` | All metrics + trends for one session (0 = latest, 1 = previous, …) |
| `sessions` | List all recording sessions across all days |
| `label "text"` | Create a timestamped annotation on the current moment |
| `search-labels "query"` | Search labels by free text (text / context / both modes) |
| `interactive "keyword"` | Cross-modal 4-layer graph search (labels → EXG → found labels) |
| `search` | ANN EXG-similarity search (auto: last session, k = 5) |
| `compare` | Side-by-side A/B metrics (auto: last 2 sessions) |
| `sleep [index]` | Sleep staging — index selects session (0 = latest) |
| `calibrations [list\|get <id>]` | List calibration profiles or inspect one by ID |
| `calibrate` | Open calibration window and start immediately |
| `timer` | Open focus-timer window and start work phase immediately |
| `notify "title" ["body"]` | Show a native OS notification |
| `say "text"` | Speak text aloud via on-device TTS |
| `umap` | 3-D UMAP projection with live progress bar |
| `listen` | Stream broadcast events for N seconds |
| `raw '{"command":"..."}'` | Send arbitrary JSON and print full response |
## Output Modes

| Flag | Behaviour |
|---|---|
| (none) | Human-readable colored summary to stdout |
| `--json` | Raw JSON only — pipe-safe, no colors |
| `--full` | Human-readable summary and colorized JSON |
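Because progress and informational lines go to stderr, `--json` output can be parsed straight off stdout with no filtering. A small helper sketch — the `neuroskill status --json` invocation in the usage line assumes the CLI is installed on your PATH:

```python
import json
import subprocess
from typing import List


def run_json(argv: List[str]) -> dict:
    """Run a CLI that emits JSON on stdout and parse it. stderr is
    captured separately, so progress lines never pollute the payload."""
    proc = subprocess.run(argv, capture_output=True, text=True, check=True)
    return json.loads(proc.stdout)
```

For example, `run_json(["neuroskill", "status", "--json"])["scores"]` mirrors the `jq '.scores'` pipe shown in the Quick Start.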
## Global Options

| Flag | Description |
|---|---|
| `--port <n>` | Connect to explicit port (skips mDNS discovery) |
| `--ws` | Force WebSocket transport |
| `--http` | Force HTTP REST transport |
| `--json` | Output raw JSON (pipeable to jq / python -m json.tool) |
| `--full` | Print JSON in addition to human-readable summary |
| `--poll <n>` | (status) Re-poll every N seconds |
| `--mode <m>` | Search mode for `search-labels`: `text`, `context`, or `both` (default: `text`) |
| `--k <n>` | Number of nearest neighbors for search / search-labels |
| `--ef <n>` | HNSW ef parameter for search-labels (default: max(k×4, 64)) |
| `--k-text <n>` | (interactive) k for text-label search (default: 5) |
| `--k-EXG <n>` | (interactive) k for EXG-similarity search (default: 5) |
| `--k-labels <n>` | (interactive) k for label-proximity search (default: 3) |
| `--reach <n>` | (interactive) Temporal reach in minutes around EXG points (default: 10) |
| `--dot` | (interactive) Output Graphviz DOT format |
| `--context "..."` | (label) Long-form annotation body stored with the label |
| `--at <utc>` | (label) Backdate to a specific unix second (default: now) |
| `--voice <name>` | (say) Voice name (e.g. Jasper); uses server default when omitted |
| `--profile <p>` | (calibrate) Profile name or UUID to run (default: active profile) |
| `--seconds <n>` | (listen) Duration in seconds (default: 5) |
| `--trends` | (sessions) Show per-session metric trends |
| `--no-color` | Disable ANSI colors (also honours NO_COLOR env var) |
| `--version` | Print CLI version and exit |
| `--help` | Show full help with examples |
## Examples

```shell
# Device snapshot
neuroskill status

# Pipe scores to jq or python
neuroskill status --json | jq '.scores.focus'
neuroskill status --json | python3 -c "import sys,json; d=json.load(sys.stdin); print(d['scores']['focus'])"

# Poll status every 5 seconds
neuroskill status --poll 5

# Latest session metrics + trends
neuroskill session 0

# List sessions, show per-session metric trends
neuroskill sessions --trends

# Label the current moment
neuroskill label "started meditation"
neuroskill label "breathwork" --context "box breathing 4-4-4-4, 10 min"
neuroskill label "retrospective note" --at 1740412800

# Search past labels
neuroskill search-labels "meditation" --mode both --k 10

# 4-layer interactive graph search
neuroskill interactive "focus" --k-EXG 10 --reach 15
neuroskill interactive "anxiety" --dot | dot -Tsvg > graph.svg

# Sleep staging for latest session
neuroskill sleep 0

# Compare last two sessions
neuroskill compare

# ANN EXG similarity search
neuroskill search

# UMAP projection
neuroskill umap

# Calibration management
neuroskill calibrations
neuroskill calibrations get 3
neuroskill calibrate --profile "Eyes Open/Closed"

# Focus timer
neuroskill timer

# TTS and notification
neuroskill say "Calibration complete."
neuroskill say "Break time." --voice Jasper
neuroskill notify "Session done" "Great work!"

# Stream events for 30 seconds
neuroskill listen --seconds 30

# Send arbitrary JSON command
neuroskill raw '{"command":"status"}'
neuroskill raw '{"command":"search","start_utc":1740412800,"end_utc":1740415500,"k":3}'

# Force HTTP + specific port
neuroskill status --http --port 8375
```

## How to Cite

If you use neuroskill or the Skill EXG platform in academic work, please cite it as:
```bibtex
@software{neuroskill2025,
  title   = {neuroskill: A Command-Line Interface for the Skill Real-Time EXG Analysis API},
  author  = {Nataliya Kosmyna and Eugene Hauptmann},
  year    = {2026},
  version = {0.0.1},
  url     = {https://github.com/NeuroSkill-com/neuroskill},
  note    = {Research use only. Not a validated clinical tool.}
}
```

If you are citing the underlying Skill EXG analysis platform specifically:
```bibtex
@software{skill2025,
  title  = {NeuroSkill: Real-Time EXG Analysis Platform},
  author = {Nataliya Kosmyna and Eugene Hauptmann},
  year   = {2026},
  url    = {https://neuroskill.com},
  note   = {Consumer-grade EXG processing pipeline with WebSocket and HTTP APIs.
            Research use only. Not FDA/CE-cleared.}
}
```

## License

GPLv3 — see LICENSE.