
Commit ce954b9

Deploy preview for PR 358 🛫
1 parent a1bc5c8 commit ce954b9

File tree

706 files changed: +101195 -0 lines changed
Lines changed: 42 additions & 0 deletions
@@ -0,0 +1,42 @@
<img alt="NWB:N" src="https://neurodatawithoutborders.github.io/images/nwb_n_logo.png" width="400">

# Welcome to the NWB:N Tutorial at Cosyne 2019!

The [Neurodata Without Borders: Neurophysiology (NWB:N)](https://neurodatawithoutborders.github.io/) team is holding a tutorial on the NWB:N data standard and on using [PyNWB](https://neurodatawithoutborders.github.io/pynwb) and [MatNWB](https://neurodatawithoutborders.github.io/matnwb) at the Cosyne 2019 Workshops.

The [NWB:N project](https://neurodatawithoutborders.github.io/) is an effort to standardize the description and storage of neurophysiology data and metadata. NWB:N enables data sharing and reuse and lowers the energy barrier to applying data analytics both within and across labs. NWB:N is more than just a file format: it defines an [ecosystem](https://neurodatawithoutborders.github.io/overview) of tools, methods, and standards for storing, sharing, and analyzing complex neurophysiology data.

We recently released [NWB:N 2.0](https://neurodatawithoutborders.github.io/news), and we are excited to teach our user base and potential new users about NWB:N and our software tools.

## Dates and Location

* **Date/Time:** March 4, 2019, 1-4pm
* **Location:** Hotel Miragem Cascais, Av. Marginal n.8554, 2754-536 Cascais, Portugal
* **Room:** Sala XVI
* A map of the conference rooms is available [here](https://www.cascaismirage.com/uploads/9/8/2/4/98249186/meeting_rooms_capacity_chart.pdf)

## Registration

Registration is still open! To attend the tutorial, please complete the [registration form](https://goo.gl/forms/LAMXakJ11p3Tlwdq2).

This tutorial is supported by the Kavli Foundation and participation is free, but registration is required. Completing the registration helps us plan attendance and target the event to attendees' interests. This registration is for the NWB:N tutorial only; to register for Cosyne 2019, please see the [Cosyne 2019 website](http://cosyne.org/c/index.php?title=Registration).

## Tutorial Program

* [NWB:N 2.0: Overview](https://drive.google.com/open?id=1Dq7zhQ4weiGv-3m6zD11ZrHQcObuddik)
* [Electrophysiology tutorial slides](https://docs.google.com/presentation/d/1Q03wU6NzMTOwuWaZIANtldNT-8ZKeM0fk3b0mEI-JFc/edit?usp=sharing)
* [Python Jupyter notebook](http://htmlpreview.github.io/?https://github.com/NeurodataWithoutBorders/nwb_hackathons/blob/master/Cosyne_2019/cosyne_NWB_tutorial_2019_python.html)
* [MATLAB code](http://htmlpreview.github.io/?https://github.com/NeurodataWithoutBorders/nwb_hackathons/blob/master/Cosyne_2019/cosyne_NWB_tutorial_2019_matlab.html)

## Organizing Committee

* Ben Dichter
* Oliver Ruebel, LBNL
* Stephanie Albin, Kavli Foundation

### Additional Organizational Support

- The Kavli Foundation
Lines changed: 295 additions & 0 deletions
@@ -0,0 +1,295 @@
{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Cosyne 2019 NWB:N Tutorial - Extracellular Electrophysiology"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Set up NWB file\n",
    "NWB files require a session start time to be entered with a timezone field."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from pynwb import NWBFile\n",
    "from datetime import datetime\n",
    "from dateutil import tz\n",
    "\n",
    "start_time = datetime(2018, 4, 25, 2, 30, 3, tzinfo=tz.gettz('US/Pacific'))\n",
    "\n",
    "nwbfile = NWBFile(identifier='Mouse5_Day3',\n",
    "                  session_description='mouse in open exploration and theta maze',  # required\n",
    "                  session_start_time=start_time,  # required\n",
    "                  experimenter='My Name',  # optional\n",
    "                  session_id='session_id',  # optional\n",
    "                  institution='University of My Institution',  # optional\n",
    "                  lab='My Lab Name',  # optional\n",
    "                  related_publications='DOI:10.1016/j.neuron.2016.12.011')  # optional"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Subject info"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from pynwb.file import Subject\n",
    "\n",
    "nwbfile.subject = Subject(age='9 months', description='mouse 5',\n",
    "                          species='Mus musculus', sex='M')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Position\n",
    "The `Position` object is a `MultiContainerInterface` that holds one or more `SpatialSeries` objects, which are a subclass of `TimeSeries`. Here, we put a `SpatialSeries` object called `'position'` in a `Position` object, and put that in a `ProcessingModule` named `'behavior'`.\n",
    "<img src=\"images/position.png\" width=\"800\">"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "from pynwb.behavior import SpatialSeries, Position\n",
    "\n",
    "position_data = np.array([\n",
    "    np.linspace(0, 10, 100),\n",
    "    np.linspace(1, 8, 100)]).T\n",
    "spatial_series_object = SpatialSeries(\n",
    "    name='position', data=position_data,\n",
    "    reference_frame='unknown',\n",
    "    conversion=1.0, resolution=np.nan,\n",
    "    timestamps=np.linspace(0, 100, num=100) / 200)  # one timestamp per sample"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "pos_obj = Position(spatial_series=spatial_series_object)\n",
    "behavior_module = nwbfile.create_processing_module(\n",
    "    name='behavior',\n",
    "    description='data relevant to behavior')\n",
    "\n",
    "behavior_module.add_data_interface(pos_obj)\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Write to file"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from pynwb import NWBHDF5IO\n",
    "\n",
    "with NWBHDF5IO('test_ephys.nwb', 'w') as io:\n",
    "    io.write(nwbfile)\n"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Electrodes table\n",
    "Extracellular electrodes are stored in an `electrodes` table, which is a `DynamicTable`. `electrodes` has several required fields: x, y, z, impedance, location, filtering, and electrode_group. Here, we also demonstrate how to add optional columns to a table by adding the `'label'` column.\n",
    "<img src=\"images/electrodes_table.png\" width=\"300\">"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "nwbfile.add_electrode_column('label', 'label of electrode')\n",
    "shank_channels = [4, 3]\n",
    "\n",
    "electrode_counter = 0\n",
    "device = nwbfile.create_device('implant')\n",
    "for shankn, nelecs in enumerate(shank_channels):\n",
    "    electrode_group = nwbfile.create_electrode_group(\n",
    "        name='shank{}'.format(shankn),\n",
    "        description='electrode group for shank {}'.format(shankn),\n",
    "        device=device,\n",
    "        location='brain area')\n",
    "    for ielec in range(nelecs):\n",
    "        nwbfile.add_electrode(\n",
    "            x=5.3, y=1.5, z=8.5, imp=np.nan,\n",
    "            location='unknown', filtering='unknown',\n",
    "            group=electrode_group,\n",
    "            label='shank{}elec{}'.format(shankn, ielec))\n",
    "        electrode_counter += 1\n",
    "\n",
    "all_table_region = nwbfile.create_electrode_table_region(\n",
    "    list(range(electrode_counter)), 'all electrodes')"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## LFP\n",
    "`LFP` is another `MultiContainerInterface`. It holds one or more `ElectricalSeries` objects, which are `TimeSeries`. Here, we put an `ElectricalSeries` named `'lfp'` in an `LFP` object, in a `ProcessingModule` named `'ecephys'`.\n",
    "<img src=\"images/lfp.png\" width=\"800\">"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from pynwb.ecephys import ElectricalSeries, LFP\n",
    "\n",
    "lfp_data = np.random.randn(100, 7)\n",
    "ecephys_module = nwbfile.create_processing_module(\n",
    "    name='ecephys',\n",
    "    description='extracellular electrophysiology data')\n",
    "ecephys_module.add_data_interface(\n",
    "    LFP(ElectricalSeries('lfp', lfp_data, all_table_region,\n",
    "                         rate=1000., resolution=.001, conversion=1.)))"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Spike Times\n",
    "Spike times are stored in another `DynamicTable` of subtype `Units`. The main `Units` table is at `/units` in the HDF5 file. You can add columns to the `Units` table just like you did for `electrodes`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "for shankn, channels in enumerate(shank_channels):\n",
    "    for n_units_per_shank in range(np.random.poisson(lam=5)):\n",
    "        n_spikes = np.random.poisson(lam=10)\n",
    "        spike_times = np.abs(np.random.randn(n_spikes))\n",
    "        nwbfile.add_unit(spike_times=spike_times)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Trials\n",
    "Trials is another `DynamicTable` that lives at `/intervals/trials`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "nwbfile.add_trial_column('correct', description='correct trial')\n",
    "nwbfile.add_trial(start_time=1.0, stop_time=5.0, correct=True)\n",
    "nwbfile.add_trial(start_time=6.0, stop_time=10.0, correct=False)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Write and read\n",
    "Data arrays are read lazily from the file. That means `TimeSeries.data` does not read the entire data object, but presents an h5py object that can be indexed to read data. Index this array just like a numpy array to read only a specific section of the array, or use the `[:]` operator to read the entire thing."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from pynwb import NWBHDF5IO\n",
    "\n",
    "with NWBHDF5IO('test_ephys.nwb', 'w') as io:\n",
    "    io.write(nwbfile)\n",
    "\n",
    "with NWBHDF5IO('test_ephys.nwb', 'r') as io:\n",
    "    nwbfile2 = io.read()\n",
    "\n",
    "    print(nwbfile2.modules['ecephys']['LFP'].electrical_series['lfp'].data[:])"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Accessing data regions\n",
    "You can easily read subsections of datasets."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "io = NWBHDF5IO('test_ephys.nwb', 'r')\n",
    "nwbfile2 = io.read()\n",
    "\n",
    "print('section of lfp:')\n",
    "print(nwbfile2.modules['ecephys']['LFP'].electrical_series['lfp'].data[:10, :5])\n",
    "print('')\n",
    "print('')\n",
    "print('spike times from first unit:')\n",
    "print(nwbfile2.units['spike_times'][0])\n",
    "io.close()"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.8"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
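The notebook notes that custom columns can be added to the `Units` table just as was done for `electrodes`, and that an electrode table region is simply a set of row indices into a `DynamicTable`. A minimal pure-Python sketch of that idea, as a simplified hypothetical stand-in (this is not the real pynwb `DynamicTable` API; `SimpleTable`, `add_column`, `add_row`, and `region` are invented names for illustration):

```python
# Hypothetical, simplified stand-in for pynwb's DynamicTable: named columns
# of equal length, rows added one at a time, plus index-based "regions".
class SimpleTable:
    def __init__(self):
        self.columns = {}  # column name -> list of values, one per row

    def add_column(self, name):
        # New columns start empty; the real API also takes a description.
        self.columns[name] = []

    def add_row(self, **values):
        # Every row must supply a value for every existing column.
        assert set(values) == set(self.columns), 'row must fill all columns'
        for name, value in values.items():
            self.columns[name].append(value)

    def region(self, indices):
        # A "table region" is just a list of row indices into this table,
        # resolved to full rows on demand.
        return [{name: col[i] for name, col in self.columns.items()}
                for i in indices]


electrodes = SimpleTable()
for name in ('x', 'y', 'z', 'group'):
    electrodes.add_column(name)
electrodes.add_column('label')  # optional, user-defined column

for shankn in range(2):
    for ielec in range(2):
        electrodes.add_row(x=5.3, y=1.5, z=8.5, group='shank%d' % shankn,
                           label='shank%delec%d' % (shankn, ielec))

all_region = electrodes.region(range(4))  # an "all electrodes" region
print(len(all_region), all_region[3]['label'])  # → 4 shank1elec1
```

The same pattern underlies `nwbfile.add_unit_column(...)` followed by `nwbfile.add_unit(...)` in the Spike Times section: adding a column first obliges each subsequent row to supply a value for it.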

0 commit comments
