What's Changed
- Update URL for DANDI Docs by @kabilar in #1210
- Ensure merge tables declared in new database by @samuelbray32 in #1205
- Fix missing pos interval map entry in pose estimation by @samuelbray32 in #1208
- Miscellaneous Fixes by @samuelbray32 in #1220
- Maintenance tool revision by @CBroz1 in #1226
- Fix the bug for computing quality metrics in spikesorting v0 by @sytseng in #1212
- Spike group v1 label compatibility by @samuelbray32 in #1238
- Resolve different code and database state for merge table parts by @samuelbray32 in #1237
- Add email-on-fail to cron job by @CBroz1 in #1241
- Export external tables by @CBroz1 in #1239
- Moseq Pipeline by @samuelbray32 in #1056
- Import pose by @samuelbray32 in #1225
- Revert "Import pose" by @CBroz1 in #1245
- Speedup get_sorting by @edeno in #1246
- Fix filtering on labels by @edeno in #1249
- Remove cli module by @edeno in #1250
- Small fixes by @samuelbray32 in #1256
- Import pose edited by @samuelbray32 in #1247
- Add script for drive space check by @CBroz1 in #1257
- Update jupysync by @CBroz1 in #1266
- Fix mismatch in `time_slice` typing in `SortedSpikesGroup.fetch_spike_data` by @samuelbray32 in #1261
- Add `v0.SpikeSortingRecording.cleanup` by @CBroz1 in #1263
- Revise cleanup scripts by @CBroz1 in #1271
- Quickfix: Turn off transactions for CuratedSpikeSorting by @samuelbray32 in #1288
- Quickfix: spikesorting v0 `_use_transaction` by @samuelbray32 in #1290
- Misc old issues by @CBroz1 in #1281
- Add fetch1_dataframe to sensor data by @edeno in #1291
- Generalizations for nwb ingestion by @samuelbray32 in #1278
- LFP improvements by @edeno in #1280
- Store arrays in AnalysisNwbfile by @samuelbray32 in #1298
- LFP Import fix by @samuelbray32 in #1302
- Cut down TODOs, part 1: 57 -> 43 by @CBroz1 in #1304
- Allow recompute via `_make_file` func by @CBroz1 in #1093
- Add badge to tests README by @CBroz1 in #1305
- Interval as object carries functions by @CBroz1 in #1293
- Burst merge curation by @CBroz1 in #1209
- Quickfix: Parse name for Session.Experimenter by @samuelbray32 in #1306
- Fix call to 'upper' in common_behav.py by @sophie-robertson in #1314
- Fix recompute `update_id` by @CBroz1 in #1311
- Spikeinterface channel_id ambiguity by @samuelbray32 in #1310
- Generate timestamps from rate and start time by @samuelbray32 in #1322
- Returned merge_id consistency for Merge.fetch_nwb by @samuelbray32 in #1320
- Cleanup Update: exit on fail to `chmod` by @CBroz1 in #1328
- Fix spikeinterface channel names by @samuelbray32 in #1334
- Import generalizations by @samuelbray32 in #1318
- DLC updates by @emreybroyles in #1339
- Add function for plotting specific interval lists side by side by @gshvarts in #1330
- Minor Issues by @samuelbray32 in #1270
- Add permissions to GitHub Actions workflows by @edeno in #1344
- Skip empty timestamp objs by @CBroz1 in #1347
- Improve position coverage by @CBroz1 in #1315
- Fix DataJoint query errors with NaN values in probe geometry fields by @Copilot in #1346
- Report table name in `accept_divergence` by @CBroz1 in #1350
- Update kwargs in DLC config by @samuelbray32 in #1352
- Table chains cascade shortest path #1353 by @CBroz1 in #1356
- Allow return of Interval obj from `to_indices` method by @CBroz1 in #1357
- More recompute fixes by @CBroz1 in #1340
- Add long-distance example to doc by @CBroz1 in #1361
- Bump version by @CBroz1 in #1316
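
One of the fixes above (#1322) generates timestamps from a sampling rate and start time instead of reading explicit timestamps from the file. A minimal sketch of that computation in plain NumPy — the function name and parameters here are illustrative, not Spyglass's actual API:

```python
import numpy as np

def make_timestamps(start_time: float, rate: float, n_samples: int) -> np.ndarray:
    """Generate evenly spaced timestamps (in seconds) from a start time and
    a sampling rate, the usual fallback when a data object stores rate
    rather than an explicit timestamps array."""
    return start_time + np.arange(n_samples) / rate

# e.g. 5 samples at 4 Hz starting at t=10.0
print(make_timestamps(10.0, 4.0, 5))  # [10.   10.25 10.5  10.75 11.  ]
```
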
New Contributors
- @kabilar made their first contribution in #1210 🥳
- @sytseng made their first contribution in #1212 🥳
- @sophie-robertson made their first contribution in #1314 🥳
- @gshvarts made their first contribution in #1330 🥳
- @Copilot made their first contribution in #1346
Database Changes
To update your database to reflect these changes, we recommend running the following script. If you have any issues, please contact maintainers via our Discussions page.
```python
# -- For TrackGraph --
from spyglass.linearization.v1.main import TrackGraph  # noqa

TrackGraph.alter()  # Add edge map parameter

# -- For dropping deprecated tables --
import datajoint as dj

dj.FreeTable(dj.conn(), "common_nwbfile.analysis_nwbfile_log").drop()
dj.FreeTable(dj.conn(), "common_session.session_group").drop()

# -- For v0 recompute --
from spyglass.spikesorting.v0.spikesorting_recording import (
    SpikeSortingRecording,
    SpikeSortingRecordingSelection,
    IntervalList,
)

SpikeSortingRecording().alter()
SpikeSortingRecording().update_ids()

# -- For v1 recompute --
from spyglass.spikesorting.v1.recording import (
    SpikeSortingRecording,
    SpikeSortingRecordingSelection,
    AnalysisNwbfile,
)

SpikeSortingRecording().alter()
SpikeSortingRecording().update_ids()

# -- For LFP pipeline --
from spyglass.lfp.lfp_imported import ImportedLFP
from spyglass.lfp.lfp_merge import LFPOutput

if len(ImportedLFP()) or len(LFPOutput.ImportedLFP()):
    raise ValueError(
        "Existing entries found and would be dropped in update. Please delete "
        + "entries or start a GitHub discussion for migration assistance."
        + f"\nImportedLFP: {len(ImportedLFP())}"
        + f"\nLFPOutput.ImportedLFP: {len(LFPOutput.ImportedLFP())}"
    )

table = LFPOutput().ImportedLFP()
table_name = table.full_table_name
if len(drop_list := table.connection.dependencies.descendants(table_name)) > 1:
    drop_list = [x for x in drop_list if x != table_name]
    raise ValueError(
        "Downstream tables exist and would be dropped in update. "
        + "Please drop the following tables first:\n"
        + "\n".join(str(t) for t in drop_list)
    )

LFPOutput().ImportedLFP().drop_quick()
ImportedLFP().drop()
```

Full Changelog: 0.5.4...0.5.5