Commit 26817a7

Merge branch 'master' into import_pose
2 parents 384af5f + 2b71fb3 commit 26817a7

37 files changed (+4521 −96 lines)

CHANGELOG.md

Lines changed: 10 additions & 1 deletion

```diff
@@ -6,7 +6,10 @@

 - Ensure merge tables are declared during file insertion #1205
 - Update URL for DANDI Docs #1210
-- Improve cron job documentation and script #1226
+- Add common method `get_position_interval_epoch` #1056
+- Improve cron job documentation and script #1226, #1241
+- Update export process to include `~external` tables #1239
+- Only add merge parts to `source_class_dict` if present in codebase #1237

 ### Pipelines

@@ -18,6 +21,12 @@
 - Allow population of missing `PositionIntervalMap` entries during population
   of `DLCPoseEstimation` #1208
 - Enable import of existing pose data to `ImportedPose` in position pipeline #1225
+- Spikesorting
+    - Fix compatibility bug between v1 pipeline and `SortedSpikesGroup` unit
+      filtering #1238
+
+- Behavior
+    - Implement pipeline for keypoint-moseq extraction of behavior syllables #1056

 ## [0.5.4] (December 20, 2024)
```

dj_local_conf_example.json

Lines changed: 5 additions & 0 deletions

```diff
@@ -51,6 +51,11 @@
         "video": "/your/base/path/deeplabcut/video",
         "output": "/your/base/path/deeplabcut/output"
       },
+      "moseq_dirs": {
+        "base": "/your/base/path/moseq",
+        "project": "/your/base/path/moseq/projects",
+        "video": "/your/base/path/moseq/video"
+      },
       "kachery_zone": "franklab.default"
     }
 }
```
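The added `moseq_dirs` block can be sanity-checked before use. A minimal sketch (the `missing_moseq_keys` helper is hypothetical, not part of Spyglass), assuming the same three keys shown in the example above:

```python
import json

# Hypothetical helper (not part of Spyglass): report which of the three
# moseq_dirs keys from the example config are missing.
def missing_moseq_keys(config_text: str) -> set:
    config = json.loads(config_text)
    expected = {"base", "project", "video"}
    return expected - set(config.get("moseq_dirs", {}))

example = """
{
  "moseq_dirs": {
    "base": "/your/base/path/moseq",
    "project": "/your/base/path/moseq/projects",
    "video": "/your/base/path/moseq/video"
  }
}
"""

print(missing_moseq_keys(example))  # set() when all keys are present
```

Running this against a config missing `project` or `video` returns the missing key names, which can catch typos before DataJoint ever reads the file.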

docs/src/Features/Export.md

Lines changed: 39 additions & 0 deletions

````diff
@@ -165,3 +165,42 @@ Or, optionally, you can use the `RestrGraph` class to cascade hand-picked tables
 and restrictions without the background logging of `ExportMixin`. The assembled
 list of restricted free tables, `RestrGraph.all_ft`, can be passed to
 `Export.write_export` to generate a shell script for exporting the data.
+
+## Backwards Compatibility
+
+Spyglass databases that were declared before varchars were reduced to
+accommodate MySQL key length restrictions (see
+[#630](https://github.com/LorenFrankLab/spyglass/issues/630)) will have trouble
+importing exported data. Specifically, varchar mismatches will throw foreign key
+errors. To fix this, run the following bash script on the generated `sql`
+files before importing them into the new database.
+
+<details><summary>Script</summary>
+
+```bash
+#!/bin/bash
+
+for file in ./_Pop*sql; do
+    echo "$file"
+    sed -i 's/ DEFAULT CHARSET=[^ ]\w*//g' "$file"
+    sed -i 's/ DEFAULT COLLATE [^ ]\w*//g' "$file"
+    sed -i 's/ `nwb_file_name` varchar(255)/ `nwb_file_name` varchar(64)/g' "$file"
+    sed -i 's/ `analysis_file_name` varchar(255)/ `analysis_file_name` varchar(64)/g' "$file"
+    sed -i 's/ `interval_list_name` varchar(200)/ `interval_list_name` varchar(170)/g' "$file"
+    sed -i 's/ `position_info_param_name` varchar(80)/ `position_info_param_name` varchar(32)/g' "$file"
+    sed -i 's/ `mark_param_name` varchar(80)/ `mark_param_name` varchar(32)/g' "$file"
+    sed -i 's/ `artifact_removed_interval_list_name` varchar(200)/ `artifact_removed_interval_list_name` varchar(128)/g' "$file"
+    sed -i 's/ `metric_params_name` varchar(200)/ `metric_params_name` varchar(64)/g' "$file"
+    sed -i 's/ `auto_curation_params_name` varchar(200)/ `auto_curation_params_name` varchar(36)/g' "$file"
+    sed -i 's/ `sort_interval_name` varchar(200)/ `sort_interval_name` varchar(64)/g' "$file"
+    sed -i 's/ `preproc_params_name` varchar(200)/ `preproc_params_name` varchar(32)/g' "$file"
+    sed -i 's/ `sorter` varchar(200)/ `sorter` varchar(32)/g' "$file"
+    sed -i 's/ `sorter_params_name` varchar(200)/ `sorter_params_name` varchar(64)/g' "$file"
+done
+```
+
+</details>
+
+This is essentially a series of `sed` commands that adjust varchar lengths to
+their updated values. The script should be run in the directory containing the
+`_Populate*.sql` files generated by the export process.
````
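For environments without GNU `sed`, the same rewrites can be sketched in Python. This is a hedged sketch rather than an official Spyglass tool; it applies the same column-to-length mapping as the script above, with only three columns shown:

```python
import re

# Subset of the varchar adjustments from the sed script above
# (column name -> corrected length); extend with the remaining columns as needed.
VARCHAR_FIXES = {
    "nwb_file_name": 64,
    "analysis_file_name": 64,
    "interval_list_name": 170,
}

def fix_varchars(sql: str) -> str:
    """Rewrite `column` varchar(N) declarations to their corrected lengths."""
    for column, length in VARCHAR_FIXES.items():
        sql = re.sub(
            rf"`{column}` varchar\(\d+\)",
            f"`{column}` varchar({length})",
            sql,
        )
    return sql

line = "`nwb_file_name` varchar(255) NOT NULL,"
print(fix_varchars(line))  # `nwb_file_name` varchar(64) NOT NULL,
```

Note this version matches any existing length, not just the pre-#630 values, so apply it only to dumps known to need the adjustment.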

environment_moseq_cpu.yml

Lines changed: 42 additions & 0 deletions

```yaml
# 1. Install a conda distribution.
#    https://mamba.readthedocs.io/en/latest/installation/mamba-installation.html
# 2. Run: `mamba env create -f environment_moseq_cpu.yml`
# 3. Activate: `conda activate spyglass-moseq-cpu`
#
# (lines intentionally left blank)
#
#
name: spyglass-moseq-cpu
channels:
  - conda-forge
  # - defaults # deprecated
  - franklab
  - edeno
  # - pytorch # dlc-only
  # - anaconda # dlc-only, for cudatoolkit
dependencies:
  - bottleneck
  # - cudatoolkit=11.3 # dlc-only
  # - ffmpeg # dlc-only
  - ipympl
  - jupyterlab>=3.*
  # - libgcc # dlc-only
  - matplotlib
  - non_local_detector
  - numpy
  - pip
  - position_tools
  - pybind11 # req by mountainsort4 -> isosplit5
  - pydotplus
  - pyfftw<=0.12.0 # ghostipy req. install from conda-forge for Mac ARM
  - python>=3.9,<3.13
  - pytorch<1.12.0
  - ripple_detection
  - seaborn
  # - torchaudio # dlc-only
  # - torchvision # dlc-only
  - track_linearization>=2.3
  - pip:
      - ghostipy # for common_filter
      - mountainsort4
      - .[moseq-cpu]
```

environment_moseq_gpu.yml

Lines changed: 42 additions & 0 deletions

```yaml
# 1. Install a conda distribution.
#    https://mamba.readthedocs.io/en/latest/installation/mamba-installation.html
# 2. Run: `mamba env create -f environment_moseq_gpu.yml`
# 3. Activate: `conda activate spyglass-moseq-gpu`
#
# (lines intentionally left blank)
#
#
name: spyglass-moseq-gpu
channels:
  - conda-forge
  # - defaults # deprecated
  - franklab
  - edeno
  # - pytorch # dlc-only
  # - anaconda # dlc-only, for cudatoolkit
dependencies:
  - bottleneck
  # - cudatoolkit=11.3 # dlc-only
  # - ffmpeg # dlc-only
  - ipympl
  - jupyterlab>=3.*
  # - libgcc # dlc-only
  - matplotlib
  - non_local_detector
  - numpy
  - pip
  - position_tools
  - pybind11 # req by mountainsort4 -> isosplit5
  - pydotplus
  - pyfftw<=0.12.0 # ghostipy req. install from conda-forge for Mac ARM
  - python>=3.9,<3.13
  - pytorch<1.12.0
  - ripple_detection
  - seaborn
  # - torchaudio # dlc-only
  # - torchvision # dlc-only
  - track_linearization>=2.3
  - pip:
      - ghostipy # for common_filter
      - mountainsort4
      - .[moseq-gpu]
```

maintenance_scripts/README.md

Lines changed: 30 additions & 1 deletion

````diff
@@ -26,11 +26,40 @@ regularly as cron jobs.

 1. Clone the repository to the desired location.
 2. Set up a config file by copying `dj_local_conf_example.json` to
    `dj_local_conf.json` and filling in the necessary information.
-3. Set up a cron job to run `run_jobs.sh` at the desired interval by running
+3. Copy the `example.env` file to `.env` in the `maintenance_scripts` directory
+   and fill in the necessary information, including...
+    - `SPYGLASS_CONDA_ENV`: the name of the conda environment with Spyglass and
+      DataJoint installed.
+    - `SPYGLASS_REPO_PATH`: the path to the Spyglass repository.
+    - `SPYGLASS_LOG`: the path to the log file.
+    - Optional email settings. If not set, email notifications will not be sent.
+        - `SPYGLASS_EMAIL_SRC`: the email address from which to send notifications.
+        - `SPYGLASS_EMAIL_PASS`: the password for the email address.
+        - `SPYGLASS_EMAIL_DEST`: the email address to which to send notifications.
+4. Set up a cron job to run `run_jobs.sh` at the desired interval by running
    `crontab -e` and adding the script.

+Note that the log file will automatically be truncated to `SPYGLASS_MAX_LOG`
+lines on each run. 1000 lines should be sufficient.
+
+### Example Cron Job
+
 In the following example, the script is set to run every Monday at 4:00 AM.

 ```text
 0 4 * * 1 /path/to/run_jobs.sh
 ```
+
+### Email Service
+
+The script uses `curl` to send email notifications on failure. While this can
+work with
+[many email services](https://everything.curl.dev/usingcurl/smtp.html), Gmail is
+a common choice. To use Gmail, you will need to:
+
+1. Turn on [2-step verification](https://myaccount.google.com/security-checkup)
+2. Turn on [less secure apps](https://myaccount.google.com/lesssecureapps)
+3. Create an [app password](https://myaccount.google.com/apppasswords)
+
+`curl` will not work with your master Gmail password, so you will need to use
+the app password instead.
````

maintenance_scripts/example.env

Lines changed: 8 additions & 0 deletions
Original file line numberDiff line numberDiff line change
@@ -0,0 +1,8 @@
1+
# NOTE: Use quotes for strings with spaces
2+
export SPYGLASS_CONDA_ENV=spyglass # name of conda environment
3+
export SPYGLASS_REPO_PATH=/home/franklab/spyglass # path to spyglass repo
4+
export SPYGLASS_LOG=/home/franklab/spyglass.log # log file
5+
export SPYGLASS_EMAIL_SRC=[email protected] # optional, email sender
6+
export SPYGLASS_EMAIL_PASS="example password" # optional, email password
7+
export SPYGLASS_EMAIL_DEST=[email protected] # optional, email recipient
8+
export SPYGLASS_MAX_LOG=1000 # number of lines to keep in log
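These variables are validated at the top of `run_jobs.sh`. The guard can be exercised in isolation; a standalone sketch with placeholder values in place of a real `.env`:

```shell
#!/bin/bash
# Sketch of the required-variable guard from run_jobs.sh, run against
# placeholder values rather than a sourced .env file.
SPYGLASS_CONDA_ENV=spyglass
SPYGLASS_REPO_PATH=/home/franklab/spyglass
SPYGLASS_LOG=/home/franklab/spyglass.log

if [[ -z "${SPYGLASS_CONDA_ENV}" \
    || -z "${SPYGLASS_REPO_PATH}" \
    || -z "${SPYGLASS_LOG}" ]]; then
    echo "Error: SPYGLASS_CONDA_ENV, SPYGLASS_REPO_PATH, and SPYGLASS_LOG must be set"
    exit 1
fi
echo "all required variables set"
```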

maintenance_scripts/run_jobs.sh

Lines changed: 58 additions & 17 deletions

```diff
@@ -1,14 +1,52 @@
 #!/bin/bash
+# AUTHOR: Chris Brozdowski
+# DATE: 2025-02-20
+#
+# 1. Go to SPYGLASS_REPO_PATH and pull the latest changes from the master branch
+# 2. Test the SPYGLASS_CONDA_ENV conda environment
+# 3. Test the connection to the database
+# 4. Run the cleanup script
+#
+# This script is intended to be run as a cron job, weekly or more frequently.
+# It will store a log of its output in SPYGLASS_LOG.
+# If any of the operations fail, an email will be sent to SPYGLASS_EMAIL_DEST

-# SETUP:
-# 1. Create a conda environment with datajoint installed.
-# 2. Edit the variables below to match your setup.
-# 3. Set up the cron job (See README.md)
-# Note that the log file will be truncated to the last 1000 lines.
+SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
+source "$SCRIPT_DIR/.env" # load environment variables from this directory

-SPYGLASS_CONDA_ENV=spyglass
-SPYGLASS_REPO_PATH=/home/franklab/spyglass
-SPYGLASS_LOG=/home/franklab/spyglass/spyglass.log
+if [[ -z "${SPYGLASS_CONDA_ENV}" \
+    || -z "${SPYGLASS_REPO_PATH}" \
+    || -z "${SPYGLASS_LOG}" ]]; then
+    echo "Error: SPYGLASS_CONDA_ENV, SPYGLASS_REPO_PATH,
+    and SPYGLASS_LOG must be set in .env"
+    exit 1
+fi
+
+EMAIL_TEMPLATE=$(cat <<-EOF
+From: "Spyglass" <$SPYGLASS_EMAIL_SRC>
+To: $SPYGLASS_EMAIL_DEST
+Subject: cron fail - $(date "+%Y-%m-%d")
+
+%s
+EOF
+)
+
+on_fail() { # $1: error message. Echo message and send as email
+    echo "Error: $1"
+    if [ -z "$SPYGLASS_EMAIL_SRC" ]; then
+        return 1 # No email source, so don't send an email
+    fi
+    local error_msg="$1"
+    local content
+    content=$(printf "$EMAIL_TEMPLATE" "$error_msg")
+
+    curl -o /dev/null --ssl-reqd \
+        --url "smtps://smtp.gmail.com:465" \
+        --user "${SPYGLASS_EMAIL_SRC}:${SPYGLASS_EMAIL_PASS}" \
+        --mail-from "$SPYGLASS_EMAIL_SRC" \
+        --mail-rcpt "$SPYGLASS_EMAIL_DEST" \
+        -T <(echo "$content")
+}

 exec >> $SPYGLASS_LOG 2>&1

@@ -17,30 +55,33 @@ echo "SPYGLASS CRON JOB START: $(date +"%Y-%m-%d %H:%M:%S")"

 # Run from the root of the spyglass repository
 cd $SPYGLASS_REPO_PATH || \
-    { echo "Error: Could not change to the spyglass directory"; exit 1; }
+    { on_fail "Could not find repo path: $SPYGLASS_REPO_PATH"; exit 1; }
+

 # Update the spyglass repository
-git pull https://github.com/LorenFrankLab/spyglass.git master > /dev/null || \
-    { echo "Error: $PWD Could not update the spyglass repository"; exit 1; }
+git pull --quiet \
+    https://github.com/LorenFrankLab/spyglass.git master > /dev/null || \
+    { on_fail "Could not update the spyglass repo $PWD"; exit 1; }

 # Test conda environment
 if ! conda env list | grep -q $SPYGLASS_CONDA_ENV; then
-    echo "Error: Conda environment $SPYGLASS_CONDA_ENV not found"
-    exit 1
+    on_fail "Conda environment $SPYGLASS_CONDA_ENV not found"
+    exit 1
 fi

 # convenience function to run a command in the spyglass conda environment
 conda_run() { conda run --name $SPYGLASS_CONDA_ENV "$@"; }

 # Test connection to the database
-conda_run python -c "import datajoint as dj; dj.conn()" > /dev/null || \
-    { echo "Error: Could not connect to the database"; exit 1; }
+CONN_TEST="import datajoint as dj; dj.logger.setLevel('ERROR'); dj.conn()"
+conda_run python -c "$CONN_TEST" > /dev/null || \
+    { on_fail "Could not connect to the database"; exit 1; }

 # Run cleanup script
 conda_run python maintenance_scripts/cleanup.py

 echo "SPYGLASS CRON JOB END"

 # truncate long log file
-tail -n 1000 "$SPYGLASS_LOG" > "${SPYGLASS_LOG}.tmp" && \
-    mv "${SPYGLASS_LOG}.tmp" "$SPYGLASS_LOG"
+tail -n ${SPYGLASS_MAX_LOG:-1000} "$SPYGLASS_LOG" > "${SPYGLASS_LOG}.tmp" \
+    && mv "${SPYGLASS_LOG}.tmp" "$SPYGLASS_LOG"
```
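The truncation step at the end of the script can also be tried in isolation. A small demo, using a temp file and a tiny `SPYGLASS_MAX_LOG` in place of the real log:

```shell
#!/bin/bash
# Demo of the log-truncation step from run_jobs.sh: keep only the last
# SPYGLASS_MAX_LOG lines (default 1000) of the log file.
SPYGLASS_LOG=$(mktemp)
SPYGLASS_MAX_LOG=3
printf 'line %s\n' 1 2 3 4 5 > "$SPYGLASS_LOG"

tail -n "${SPYGLASS_MAX_LOG:-1000}" "$SPYGLASS_LOG" > "${SPYGLASS_LOG}.tmp" \
    && mv "${SPYGLASS_LOG}.tmp" "$SPYGLASS_LOG"

cat "$SPYGLASS_LOG"   # line 3, line 4, line 5
rm -f "$SPYGLASS_LOG"
```

Writing to a `.tmp` file and then `mv`-ing it back avoids truncating the log in place while `tail` is still reading from it.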

notebook-images/moseq_outline.png

59.7 KB

notebooks/00_Setup.ipynb

Lines changed: 43 additions & 3 deletions

````diff
@@ -285,7 +285,47 @@
     "The Decoding pipeline relies on `jax` to process data with GPUs. Please see\n",
     "their conda installation steps\n",
     "[here](https://jax.readthedocs.io/en/latest/installation.html#conda-installation).\n",
-    "\n"
+    "\n",
+    "#### DeepLabCut (DLC)\n",
+    "\n",
+    "Spyglass provides an environment build for using the DLC pipeline. To create an\n",
+    "environment with these features, please:\n",
+    "1. Navigate to your cloned spyglass repo.\n",
+    "2. Build the environment from the DLC version.\n",
+    "3. Activate the environment to use.\n",
+    "\n",
+    "```bash\n",
+    "cd /path/to/spyglass # 1\n",
+    "mamba env create -f environment_dlc.yml # 2\n",
+    "mamba activate spyglass-dlc # 3\n",
+    "```\n",
+    "\n",
+    "Alternatively, you can pip install using\n",
+    "```bash\n",
+    "pip install spyglass[dlc]\n",
+    "```\n",
+    "\n",
+    "#### Keypoint-Moseq\n",
+    "\n",
+    "Spyglass provides an environment build for using the Moseq pipeline. To create an\n",
+    "environment with these features, please:\n",
+    "1. Navigate to your cloned spyglass repo.\n",
+    "2. Build the environment from one of the moseq versions.\n",
+    "3. Activate the environment to use.\n",
+    "\n",
+    "```bash\n",
+    "cd /path/to/spyglass # 1\n",
+    "mamba env create -f environment_moseq_cpu.yml # 2\n",
+    "mamba activate spyglass-moseq-cpu # 3\n",
+    "```\n",
+    "\n",
+    "Alternatively, you can pip install using\n",
+    "```bash\n",
+    "pip install spyglass[moseq-cpu]\n",
+    "```\n",
+    "\n",
+    "To use a GPU-enabled version of the package, replace `cpu` with `gpu` in the\n",
+    "above commands.\n"
 ]
@@ -593,7 +633,7 @@
 "metadata": {
     "kernelspec": {
-        "display_name": "Python 3 (ipykernel)",
+        "display_name": "spyglass",
         "language": "python",
         "name": "python3"
     },
@@ -607,7 +647,7 @@
     "name": "python",
     "nbconvert_exporter": "python",
     "pygments_lexer": "ipython3",
-    "version": "3.9.19"
+    "version": "3.9.18"
 }
````
