Commit 3400479

Merge pull request #1156 from mayofaulkner/IBL_project
fix typos
2 parents 2323330 + d6b78a7 commit 3400479

File tree

3 files changed: +35, -26 lines changed

projects/behavior_and_theory/IBL_behavior_data.ipynb

Lines changed: 5 additions & 1 deletion
@@ -38,6 +38,10 @@
 {
 "cell_type": "code",
 "source": [
+"# When running in jupyter set number of threads to 1\n",
+"import os\n",
+"os.environ.setdefault('ONE_HTTP_DL_THREADS', '1')\n",
+"\n",
 "from one.api import ONE\n",
 "ONE.setup(base_url='https://openalyx.internationalbrainlab.org', silent=True)\n",
 "one = ONE(password='international')"
@@ -63,7 +67,7 @@
 {
 "cell_type": "code",
 "source": [
-"# Supress some future warnings\n",
+"# Suppress some future warnings\n",
 "import warnings\n",
 "warnings.simplefilter(\"ignore\", FutureWarning)\n",
 "\n",
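The cell this hunk adds relies on `os.environ.setdefault`, which writes the variable only when it is not already defined, so a user's own `ONE_HTTP_DL_THREADS` setting survives. A minimal standalone sketch of that behaviour (the variable name comes from the diff; the rest is illustrative):

```python
import os

# Start from a clean slate so the demo is deterministic
os.environ.pop('ONE_HTTP_DL_THREADS', None)

# Unset -> setdefault writes the fallback value
os.environ.setdefault('ONE_HTTP_DL_THREADS', '1')
print(os.environ['ONE_HTTP_DL_THREADS'])  # prints 1

# Already set -> setdefault leaves the existing value untouched
os.environ['ONE_HTTP_DL_THREADS'] = '4'
os.environ.setdefault('ONE_HTTP_DL_THREADS', '1')
print(os.environ['ONE_HTTP_DL_THREADS'])  # prints 4
```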

projects/neurons/IBL_BWM_Neuromatch_tutorial.ipynb

Lines changed: 12 additions & 12 deletions
@@ -105,7 +105,7 @@
 "source": [
 "### **Decision making task**\n",
 "\n",
-"In the IBL decision making task a visual stimulus appears at the edge of the screen and mice must bring the stimulus to the screen centre by moving a wheel with their front two paws. If they successfully bring the stimulus to the screen centre, they recieve a reward of sugar water ; if however they move the wheel in the wrong direction until the stimulus reaches beyond the screen edge, a white noise tone is played. To initiate a trial, the mouse must hold the wheel still for a continous period between 0.4-0.7s ; they are then alerted of the start of the trial by the simultaneous presentation of the stimulus on the screen and a go cue tone. Mice have a maximum of 60s to make a decision before the trial times out and a white noise tone is played.\n",
+"In the IBL decision making task a visual stimulus appears at the edge of the screen and mice must bring the stimulus to the screen centre by moving a wheel with their front two paws. If they successfully bring the stimulus to the screen centre, they receive a reward of sugar water ; if however they move the wheel in the wrong direction until the stimulus reaches beyond the screen edge, a white noise tone is played. To initiate a trial, the mouse must hold the wheel still for a continuous period between 0.4-0.7s ; they are then alerted of the start of the trial by the simultaneous presentation of the stimulus on the screen and a go cue tone. Mice have a maximum of 60s to make a decision before the trial times out and a white noise tone is played.\n",
 "\n",
 "Varying contrasts of visual stimulus are shown throughout the session (100%, 25%, 12.5%, 6.25% and 0%). The probability of the stimulus appearing on the left or the right changes between blocks of trials. During 0% contrast trials (where no stimulus appears on the screen but a wheel response is required), the mice can use the inferred block structure to guide their decision.\n",
 "\n",
@@ -122,7 +122,7 @@
 "### **Accessing the data**\n",
 "For the purposes of this course, we have precomputed task-aligned peri-stimulus time histograms (PSTHs) for all good clusters in the dataset. This allows you to quickly begin working with the data using a simplified and accessible format. The tutorial below is based on this preprocessed data.\n",
 "\n",
-"To access the full dataset, including raw electrophysiology (action potential and LFP bands), spike sorting output, wheel movement, video recordings, and pose estimation data please refer to this [this introductary notebook](https://colab.research.google.com/drive/1_1qfa-DLDbezyFXguFOnJJWF5aJ5AH0i#scrollTo=-TJR7XEgtBxS) and [this tutorial](https://colab.research.google.com/github/NeuromatchAcademy/course-content/blob/main/projects/neurons/IBL_ONE_tutorial.ipynb).\n",
+"To access the full dataset, including raw electrophysiology (action potential and LFP bands), spike sorting output, wheel movement, video recordings, and pose estimation data please refer to this [this introductary notebook](https://colab.research.google.com/drive/1_1qfa-DLDbezyFXguFOnJJWF5aJ5AH0i#scrollTo=-TJR7XEgtBxS) and [this tutorial](https://colab.research.google.com/drive/1y3sRI1wC7qbWqN6skvulzPOp6xw8tLm7#scrollTo=hRZA78AoaBIC).\n",
 "\n",
 "\n",
 "\n",
@@ -176,6 +176,10 @@
 {
 "cell_type": "code",
 "source": [
+"# When running in jupyter set number of threads to 1\n",
+"import os\n",
+"os.environ.setdefault('ONE_HTTP_DL_THREADS', '1')\n",
+"\n",
 "from one.api import ONE\n",
 "ONE.setup(base_url='https://openalyx.internationalbrainlab.org', silent=True)\n",
 "one = ONE(password='international')"
@@ -2287,9 +2291,7 @@
 },
 {
 "cell_type": "markdown",
-"source": [
-"### **Exploring the visualisation webiste**"
-],
+"source": "### **Exploring the visualisation website**",
 "metadata": {
 "id": "rr7GcJXgb09h"
 }
@@ -3530,9 +3532,7 @@
 },
 {
 "cell_type": "markdown",
-"source": [
-"Simliar to above we can compute the **modulation index** to identify **responsive cells**."
-],
+"source": "Similar to above we can compute the **modulation index** to identify **responsive cells**.",
 "metadata": {
 "id": "DNpIAFXK72mg"
 }
@@ -3717,7 +3717,7 @@
 {
 "cell_type": "markdown",
 "source": [
-"One analysis that we can perform is to examine **when** the neural activity representing left versus right choices begins to diverge before the first movement is made. We will use a dimentionality reduction approach (PCA) to measure the **distance between the left and right choice representations** over a time window of interest.\n",
+"One analysis that we can perform is to examine **when** the neural activity representing left versus right choices begins to diverge before the first movement is made. We will use a dimensionality reduction approach (PCA) to measure the **distance between the left and right choice representations** over a time window of interest.\n",
 "\n",
 "\n",
 "We start by loading data from a specific brain region of interest, **GRN**. The trials are then split based on the **choice** made."
@@ -3774,7 +3774,7 @@
 {
 "cell_type": "code",
 "source": [
-"# Apply PCA to reduce to 2 dimentions\n",
+"# Apply PCA to reduce to 2 dimensions\n",
 "pca = PCA(n_components=2)\n",
 "trajs = pca.fit_transform(all_psth.T).T\n",
 "\n",
@@ -4105,7 +4105,7 @@
 "\n",
 " all_modulated.append(df)\n",
 "\n",
-"# Concatentate results into a single dataframe\n",
+"# Concatenate results into a single dataframe\n",
 "all_modulated = pd.concat(all_modulated)"
 ],
 "metadata": {
@@ -4567,7 +4567,7 @@
 "🟨 **Note**\n",
 "* This analysis requires access to the full dataset, rather than the pre-processed version used above. To get started with accessing the full IBL dataset, please refer to:\n",
 " * [This introductary notebook](https://colab.research.google.com/drive/1_1qfa-DLDbezyFXguFOnJJWF5aJ5AH0i#scrollTo=-TJR7XEgtBxS)\n",
-" * [This tutorial](https://colab.research.google.com/github/NeuromatchAcademy/course-content/blob/main/projects/neurons/IBL_ONE_tutorial.ipynb)\n",
+" * [This tutorial](https://colab.research.google.com/drive/1y3sRI1wC7qbWqN6skvulzPOp6xw8tLm7#scrollTo=hRZA78AoaBIC)\n",
 "\n",
 "\n",
 "The `Bayes Optimal` model is really the best an animal could do in estimating the block prior. However, mice are likely not that optimal.\n",
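The PCA hunk in this notebook reduces `all_psth` to two components (`pca.fit_transform(all_psth.T).T`) before measuring how far the left- and right-choice trajectories diverge over time. A numpy-only sketch of the same idea, using SVD in place of sklearn's `PCA`; the synthetic `all_psth` and the left/right split are assumptions for illustration:

```python
import numpy as np

# Hypothetical stand-in for the notebook's all_psth: rows are neurons,
# columns are time bins (left-choice bins followed by right-choice bins)
rng = np.random.default_rng(0)
n_neurons, n_bins = 50, 40
all_psth = rng.normal(size=(n_neurons, 2 * n_bins))

# PCA via SVD on mean-centred data (equivalent to sklearn's PCA projection)
X = all_psth.T                     # samples x features, as in fit_transform(all_psth.T)
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
trajs = (Xc @ Vt[:2].T).T          # shape (2, 2 * n_bins), matching trajs in the diff

# Split into left/right trajectories and measure their divergence per time bin
left, right = trajs[:, :n_bins], trajs[:, n_bins:]
distance = np.linalg.norm(left - right, axis=0)
print(distance.shape)              # one distance value per time bin
```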

projects/neurons/IBL_ONE_tutorial.ipynb

Lines changed: 18 additions & 13 deletions
@@ -116,9 +116,7 @@
 },
 {
 "cell_type": "markdown",
-"source": [
-"## Install relevent packages"
-],
+"source": "## Install relevant packages",
 "metadata": {
 "id": "CJ31GQkwGbKq"
 }
@@ -154,6 +152,17 @@
 "execution_count": null,
 "outputs": []
 },
+{
+"metadata": {},
+"cell_type": "code",
+"outputs": [],
+"execution_count": null,
+"source": [
+"# When running in jupyter set number of threads to 1\n",
+"import os\n",
+"os.environ.setdefault('ONE_HTTP_DL_THREADS', '1')"
+]
+},
 {
 "cell_type": "markdown",
 "source": [
@@ -956,9 +965,7 @@
 },
 {
 "cell_type": "markdown",
-"source": [
-"An `eid` represents a unique experiment identifier and can also encoded by a specific path with the following strucutre `lab/Subjects/subject_name/date/session_number`. We refer to this as the `session path` and this also represents the location on your local where data for each session is stored. We can convert between an `eid` and a `session path` using the following `ONE` methods"
-],
+"source": "An `eid` represents a unique experiment identifier and can also encoded by a specific path with the following structure `lab/Subjects/subject_name/date/session_number`. We refer to this as the `session path` and this also represents the location on your local where data for each session is stored. We can convert between an `eid` and a `session path` using the following `ONE` methods",
 "metadata": {
 "id": "JehqUWwacEoM"
 }
@@ -1200,9 +1207,7 @@
 },
 {
 "cell_type": "markdown",
-"source": [
-"A single dataset can be downloaded and loaded into memory by passing in the eid and dataset as arguemnts into the `one.load_dataset` method,"
-],
+"source": "A single dataset can be downloaded and loaded into memory by passing in the eid and dataset as arguments into the `one.load_dataset` method,",
 "metadata": {
 "id": "5E7JPMUFH-gq"
 }
@@ -1266,7 +1271,7 @@
 "print(list(clusters.keys()))\n",
 "\n",
 "# Only download the clusters object for probe01\n",
-"clusters = one.load_object(eid, 'clusters', collection=f'alf/{pname}/pykilosort', download_only=True)\n"
+"clusters = one.load_object(eid, 'clusters', collection=f'alf/{pname}/pykilosort', download_only=True)"
 ],
 "metadata": {
 "id": "9fDFfJrtJV2y",
@@ -1989,7 +1994,7 @@
 "probes = list(ephys_sess_loader.ephys.keys())\n",
 "print(f'Name of probes loaded: {probes}')\n",
 "\n",
-"# Acess the spikesorting data for the first probe\n",
+"# Access the spikesorting data for the first probe\n",
 "spikes = ephys_sess_loader.ephys[probes[0]]['spikes']\n",
 "print(f'Keys of spikes data: {spikes.keys()}')"
 ],
@@ -3036,7 +3041,7 @@
 "cell_type": "code",
 "source": [
 "# 4. Compute the firing rate of each cluster\n",
-"# The firing rate of each cluster can be found in the firing rate atrribute of the clusters object\n",
+"# The firing rate of each cluster can be found in the firing rate attribute of the clusters object\n",
 "firing_rate = clusters_good['firing_rate']\n",
 "\n",
 "# To show the interaction between the clusters and the spikes object we will show how you can compute\n",
@@ -4908,7 +4913,7 @@
 "- [Brain-wide map technical paper](https://figshare.com/articles/preprint/Data_release_-_Brainwide_map_-_Q4_2022/21400815)\n",
 "\n",
 "Where can I find out more information about available dataset releases?\n",
-"- [Publically available IBL data](https://int-brain-lab.github.io/iblenv/public_docs/public_introduction.html)\n",
+"- [Publicly available IBL data](https://int-brain-lab.github.io/iblenv/public_docs/public_introduction.html)\n",
 "\n",
 "Where can I read more about the science conducted in the IBL?\n",
 "- [List of publications](https://www.internationalbrainlab.com/publications)\n",
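The firing-rate hunk in this notebook reads `clusters_good['firing_rate']` directly, but the same quantity can be derived from the raw spikes object as spike count divided by recording duration. A sketch with synthetic data; the dict layout loosely mirrors the notebook's `spikes` object (flat arrays of spike times and cluster ids), and all numbers are made up:

```python
import numpy as np

# Hypothetical spikes object: spike times (seconds) and cluster id per spike
rng = np.random.default_rng(1)
n_spikes, n_clusters = 10_000, 20
spikes = {
    'times': np.sort(rng.uniform(0, 100, n_spikes)),
    'clusters': rng.integers(0, n_clusters, n_spikes),
}

# Firing rate per cluster = spike count / recording duration
duration = spikes['times'][-1] - spikes['times'][0]
counts = np.bincount(spikes['clusters'], minlength=n_clusters)
firing_rate = counts / duration     # one rate (Hz) per cluster
print(firing_rate.shape)
```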
