Merged
26 commits
9f4d53c
try update to python 3.11
rcpeene Jul 9, 2024
f6b81ed
swap axes in TCA
rcpeene Jul 11, 2024
30754ad
print pip freeze
rcpeene Jul 11, 2024
042ea22
install ssm from pypi
rcpeene Jul 11, 2024
cb8c504
run build on dev PR also
rcpeene Jul 11, 2024
f4d061b
swap install dep steps
rcpeene Jul 11, 2024
f509c76
dont setup python version
rcpeene Jul 11, 2024
b3446f7
reran glo notebook. Jerome to examine get_spike_matrix in PR
rcpeene Jul 19, 2024
5e461ae
Merge branch 'main' of https://github.com/AllenInstitute/openscope_da…
rcpeene Jul 29, 2024
50863fc
try adding docker to workflow
rcpeene Nov 13, 2024
c5eb8ff
add docker to test
rcpeene Nov 13, 2024
f7ab6a9
fix typos
rcpeene Nov 13, 2024
078a763
add docker file
rcpeene Nov 13, 2024
82e69c5
add required ubuntu libraries for cell_matching notebook
rcpeene Nov 14, 2024
0ea9095
improve docker and add env dev tag to setup.py
rcpeene Nov 15, 2024
c38c9e5
don't git clone in docker to get files (then it will run older notebo…
rcpeene Nov 15, 2024
1e6c20d
include databook utils in env
rcpeene Nov 16, 2024
2c209b1
try workflow checking our branch to push to
rcpeene Nov 18, 2024
7e1bea1
fix variable name
rcpeene Nov 18, 2024
302d2c1
restore linderman version of ssm
rcpeene Nov 18, 2024
a24bccb
pin autograd
rcpeene Nov 19, 2024
5660323
fix docker bug and add docker usage instructions
rcpeene Nov 20, 2024
ede92f4
fix git perm issue
rcpeene Nov 20, 2024
04a2ee5
Merge pull request #433 from AllenInstitute/docker
rcpeene Nov 26, 2024
d70472d
Merge branch 'dev' of https://github.com/AllenInstitute/openscope_dat…
rcpeene Nov 26, 2024
24d8328
fix typo
rcpeene Nov 26, 2024
37 changes: 22 additions & 15 deletions .github/workflows/build.yml
@@ -4,14 +4,16 @@ on:
push:
branches: [ "main" ]
pull_request:
branches: [ "main" ]
branches: [ "main" , "dev"]

workflow_dispatch:

jobs:
build:
runs-on:
group: LargerInstance
container:
image: rcpeene/openscope_databook:latest

env:
DANDI_API_KEY: ${{ secrets.DANDI_API_KEY }}
@@ -21,24 +23,27 @@ jobs:
- uses: actions/checkout@v3
with:
fetch-depth: 0
ref: main
ref: ${{ github.ref }}

# - name: Set up Python
# uses: actions/setup-python@v4
# with:
# python-version: "3.11"

- name: Set up Python
uses: actions/setup-python@v4
# with:
# python-version: "3.9"
# - name: Upgrading pip
# run: pip install --upgrade pip

- name: Upgrading pip
run: pip install --upgrade pip
# - name: Install deps
# run: pip install cython numpy

- name: Install deps
run: pip install cython numpy
- name: pip freeze
run: pip freeze

- name: Installing package
run: pip install -e .
# - name: Installing packages again (this prevents a weird error)
# run: pip install -r requirements.txt

- name: Installing packages again (this prevents a weird error)
run: pip install -r requirements.txt
# - name: Installing package
# run: pip install -e .

- name: Installing build dependencies
run: |
@@ -82,7 +87,9 @@ jobs:
rm ./docs/embargoed/*.nwb

- name: Printing log
run: git status
run: |
git config --global --add safe.directory /__w/openscope_databook/openscope_databook
git status

- name: Printing shortlog
run: git log | git shortlog -sn
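The new "Printing log" step above works around git's ownership check: when the job runs inside the `rcpeene/openscope_databook` container, the checkout under `/__w/openscope_databook/openscope_databook` is owned by a different user than the one invoking `git`, so git refuses to read it. A minimal sketch of the same workaround outside the workflow (the path is the one hard-coded in the step above):

```shell
# git >= 2.35.2 refuses to operate on a repository owned by another user
# ("detected dubious ownership"). Marking the workspace path as safe,
# as the workflow step above does, silences that check:
git config --global --add safe.directory /__w/openscope_databook/openscope_databook

# confirm the entry was recorded in the global config
git config --global --get-all safe.directory
```

The same line appears in the Dockerfile below, so images built from it come pre-configured for the Actions workspace path.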
18 changes: 10 additions & 8 deletions .github/workflows/test.yml
@@ -11,6 +11,8 @@ jobs:
test:
runs-on:
group: LargerInstance
container:
image: rcpeene/openscope_databook:latest

env:
DANDI_API_KEY: ${{ secrets.DANDI_API_KEY }}
@@ -19,20 +21,20 @@
steps:
- uses: actions/checkout@v3

- name: Upgrading pip
run: pip install --upgrade pip
# - name: Upgrading pip
# run: pip install --upgrade pip

- name: print environment
run: pip freeze

- name: Install cython
run: pip install cython numpy
# - name: Install cython
# run: pip install cython numpy

- name: Installing package
run: pip install -e .
# - name: Installing package
# run: pip install -e .

- name: Installing requirements
run: pip install -r ./requirements.txt
# - name: Installing requirements
# run: pip install -r ./requirements.txt

- name: Installing build dependencies
run: |
22 changes: 22 additions & 0 deletions Dockerfile
@@ -0,0 +1,22 @@
FROM ubuntu:22.04
# base requirements
RUN apt-get update
RUN apt-get install -y coreutils
RUN apt-get install -y libgl1-mesa-glx
RUN apt-get install -y libglib2.0-0
RUN apt-get install -y python3 python3-pip
RUN apt-get install -y git

RUN git config --global --add safe.directory /__w/openscope_databook/openscope_databook

# copy databook setup files
COPY requirements.txt ./openscope_databook/requirements.txt
COPY setup.py ./openscope_databook/setup.py
COPY README.md ./openscope_databook/README.md
COPY LICENSE.txt ./openscope_databook/LICENSE.txt
COPY databook_utils ./openscope_databook/databook_utils

# for reasons I don't understand, these must be installed before the rest of the requirements
RUN pip install numpy cython
# set up databook dependencies
RUN pip install -e ./openscope_databook[dev]
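Given this Dockerfile, the image the workflows reference can be built and used locally. A hedged sketch (the `rcpeene/openscope_databook:latest` tag is taken from the workflow files above; the bind-mount path is an assumption — mounting the checkout means the container runs your current notebooks rather than files baked into the image):

```shell
# build the environment image from the repository root
docker build -t rcpeene/openscope_databook:latest .

# start a shell in the container with the local checkout mounted,
# so current notebooks are executed instead of baked-in copies
docker run -it --rm \
  -v "$PWD":/openscope_databook \
  -w /openscope_databook \
  rcpeene/openscope_databook:latest bash
```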
2 changes: 1 addition & 1 deletion docs/contribution.md
@@ -8,7 +8,7 @@ The Databook can be forked via the GitHub Web UI from the Databook's [GitHub rep
## Initialize Locally
A local repo can be made by pressing the `code` button on the front page of the forked repo, and copying the HTTPS url. Then locally, run the command `git clone <copied_url_here>`. For more information on cloning GitHub repos, check out GitHub's [Cloning a Repository](https://docs.github.com/en/repositories/creating-and-managing-repositories/cloning-a-repository) Page.

Then the environment must be set up. You may set up a conda environment if you don't want to interfere with your local environment. After installing conda, this can be done with the commands `conda create --name databook python=3.9` followed by `activate databook` (Windows) or `source activate databook` (Mac/Linux). Within or without the conda environment, the dependencies for the databook can be installed by navigating to the openscope_databook directory and running `pip install -e . --user`.
Then the environment must be set up. You may set up a conda environment if you don't want to interfere with your local environment. After installing conda, this can be done with the commands `conda create --name databook python=3.11` followed by `activate databook` (Windows) or `source activate databook` (Mac/Linux). Within or without the conda environment, the dependencies for the databook can be installed by navigating to the openscope_databook directory and running `pip install -e . --user`.

Finally, notebooks can be run with Jupyter notebook by running `jupyter notebook ./docs`

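The setup steps described in `docs/contribution.md` above can be condensed into one sketch (the clone URL assumes the upstream AllenInstitute repo — substitute your fork; on Windows, `activate databook` replaces `source activate databook`):

```shell
# clone the repo and enter it
git clone https://github.com/AllenInstitute/openscope_databook.git
cd openscope_databook

# optional: isolate dependencies in a conda environment
conda create --name databook python=3.11
source activate databook

# install the databook and its dependencies, then launch the notebooks
pip install -e . --user
jupyter notebook ./docs
```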
89 changes: 35 additions & 54 deletions docs/embargoed/cell_matching.ipynb
@@ -50,12 +50,10 @@
"import json\n",
"import os\n",
"\n",
"import matplotlib as mpl\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"\n",
"from PIL import Image\n",
"from time import sleep"
"from PIL import Image"
]
},
{
@@ -93,6 +91,13 @@
"id": "77d78e7d",
"metadata": {},
"outputs": [
{
"name": "stderr",
"output_type": "stream",
"text": [
"A newer version (0.63.1) of dandi/dandi-cli is available. You are using 0.61.2\n"
]
},
{
"name": "stdout",
"output_type": "stream",
@@ -255,66 +260,42 @@
"name": "stderr",
"output_type": "stream",
"text": [
"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\scipy\\__init__.py:169: UserWarning: A NumPy version >=1.18.5 and <1.26.0 is required for this version of SciPy (detected version 1.26.4\n",
" warnings.warn(f\"A NumPy version >={np_minversion} and <{np_maxversion}\"\n",
"WARNING:root:many=True not supported from argparse\n",
"INFO:NwayMatching:NWAY_COMMIT_SHA None\n",
"INFO:NwayMatching:Nway matching version 0.6.0\n",
"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\scipy\\__init__.py:169: UserWarning: A NumPy version >=1.18.5 and <1.26.0 is required for this version of SciPy (detected version 1.26.4\n",
" warnings.warn(f\"A NumPy version >={np_minversion} and <{np_maxversion}\"\n",
"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\scipy\\__init__.py:169: UserWarning: A NumPy version >=1.18.5 and <1.26.0 is required for this version of SciPy (detected version 1.26.4\n",
" warnings.warn(f\"A NumPy version >={np_minversion} and <{np_maxversion}\"\n",
"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\scipy\\__init__.py:169: UserWarning: A NumPy version >=1.18.5 and <1.26.0 is required for this version of SciPy (detected version 1.26.4\n",
" warnings.warn(f\"A NumPy version >={np_minversion} and <{np_maxversion}\"\n",
"WARNING:root:many=True not supported from argparse\n",
"WARNING:root:many=True not supported from argparse\n",
"INFO:PairwiseMatching:Matching 1193675753 to 1194754135\n",
"INFO:PairwiseMatching:Matching 1193675753 to 1194754135: best registration was ['Crop', 'CLAHE', 'PhaseCorrelate']\n",
"multiprocessing.pool.RemoteTraceback: \n",
"\"\"\"\n",
"Traceback (most recent call last):\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\multiprocessing\\pool.py\", line 125, in worker\n",
" result = (True, func(*args, **kwds))\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\multiprocessing\\pool.py\", line 48, in mapstar\n",
" return list(map(*args))\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\nway_matching.py\", line 121, in pair_match_job\n",
" pair_match.run()\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\pairwise_matching.py\", line 495, in run\n",
" segmask_moving_3d_registered = transform_mask(\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\pairwise_matching.py\", line 384, in transform_mask\n",
" dtype=np.int)\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\numpy\\__init__.py\", line 338, in __getattr__\n",
" raise AttributeError(__former_attrs__[attr])\n",
"AttributeError: module 'numpy' has no attribute 'int'.\n",
"`np.int` was a deprecated alias for the builtin `int`. To avoid this error in existing code, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.\n",
"The aliases was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at:\n",
" https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations\n",
"\"\"\"\n",
"\n",
"The above exception was the direct cause of the following exception:\n",
"\n",
"Traceback (most recent call last):\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\runpy.py\", line 196, in _run_module_as_main\n",
" return _run_code(code, main_globals, None,\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\runpy.py\", line 86, in _run_code\n",
" exec(code, run_globals)\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\nway_matching.py\", line 502, in <module>\n",
" nmod.run()\n",
" File \"c:\\Users\\carter.peene\\Desktop\\Projects\\openscope_databook\\databook_env\\lib\\site-packages\\nway\\nway_matching.py\", line 462, in run\n",
" self.pair_matches = pool.map(pair_match_job, pair_arg_list)\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\multiprocessing\\pool.py\", line 367, in map\n",
" return self._map_async(func, iterable, mapstar, chunksize).get()\n",
" File \"C:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\multiprocessing\\pool.py\", line 774, in get\n",
" raise self._value\n",
"AttributeError: module 'numpy' has no attribute 'int'.\n",
"`np.int` was a deprecated alias for the builtin `int`. To avoid this error in existing code, use `int` by itself. Doing this will not modify any behavior and is safe. When replacing `np.int`, you may wish to use e.g. `np.int64` or `np.int32` to specify the precision. If you wish to review your current use, check the release note link for additional information.\n",
"The aliases was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at:\n",
" https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations\n"
"c:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\nway\\utils.py:48: FutureWarning: In a future version of pandas all arguments of DataFrame.sort_index will be keyword-only.\n",
" df = df.sort_index(0)\n",
"c:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\nway\\utils.py:49: FutureWarning: In a future version of pandas all arguments of DataFrame.sort_index will be keyword-only.\n",
" df = df.sort_index(1)\n",
"INFO:NwayMatching:registration success(1) or failure (0):\n",
" 0 1\n",
"0 1 1\n",
"1 1 1\n",
"id map{\n",
" \"0\": 1193675753,\n",
" \"1\": 1194754135\n",
"}\n",
"c:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\nway\\nway_matching.py:208: FutureWarning: The frame.append method is deprecated and will be removed from pandas in a future version. Use pandas.concat instead.\n",
" matching_frame = matching_frame.append(pairframe)\n",
"INFO:NwayMatching:Nway matching is done!\n",
"INFO:NwayMatching:Creating match summary plots\n",
"WARNING:root:setting Dict fields not supported from argparse\n",
"c:\\Users\\carter.peene\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\argschema\\utils.py:346: FutureWarning: '--nway_output.nway_matches' is using old-style command-line syntax with each element as a separate argument. This will not be supported in argschema after 2.0. See http://argschema.readthedocs.io/en/master/user/intro.html#command-line-specification for details.\n",
" warnings.warn(warn_msg, FutureWarning)\n",
"WARNING:root:many=True not supported from argparse\n",
"INFO:NwayMatching:wrote matching_output\\nway_match_fraction_plot_2024_11_14_13_37_50.png\n",
"INFO:NwayMatching:wrote matching_output\\nway_warp_overlay_plot_2024_11_14_13_37_50.png\n",
"INFO:NwayMatching:wrote matching_output\\nway_warp_summary_plot_2024_11_14_13_37_50.png\n",
"INFO:NwayMatching:wrote ./output.json\n"
]
}
],
"source": [
"!python -m nway.nway_matching --input_json input.json --output_json \"./output.json\" --output_dir matching_output"
"!python3 -m nway.nway_matching --input_json input.json --output_json \"./output.json\" --output_dir matching_output"
]
},
{
@@ -385,7 +366,7 @@
{
"data": {
"text/plain": [
"<matplotlib.image.AxesImage at 0x1c3b53e35b0>"
"<matplotlib.image.AxesImage at 0x21dff47bfa0>"
]
},
"execution_count": 13,
@@ -421,7 +402,7 @@
{
"data": {
"text/plain": [
"<matplotlib.image.AxesImage at 0x1c3b7dbdf00>"
"<matplotlib.image.AxesImage at 0x21dff4fe680>"
]
},
"execution_count": 14,
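The stderr block removed from `cell_matching.ipynb` in this diff failed inside `nway` because `np.int` was a deprecated alias for the builtin `int`, removed in NumPy 1.24. A minimal hypothetical reproduction of the fix (not the actual `transform_mask` code):

```python
import numpy as np

# Old code passed dtype=np.int, deprecated in NumPy 1.20 and removed in
# 1.24, which raises AttributeError on current NumPy. The replacements
# are the builtin `int` or an explicit width such as np.int64:
mask = np.array([1.9, 2.1], dtype=np.int64)  # truncates toward zero
print(mask.tolist())  # → [1, 2]
```

Pinning the environment (as the "pin autograd" and Docker commits do) is the other half of the fix: it keeps the notebook's NumPy in a range the installed `nway` release supports.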