
Commit a448a18

Merge pull request #28 from vathes/dev
merging from dev for internal deploy
2 parents b8e36c7 + d2c8e57 commit a448a18

76 files changed: +10554 / −4054 lines changed


README.md

Lines changed: 61 additions & 21 deletions
@@ -1,12 +1,44 @@
-## How to build and develop using the new dockerrized app.
+## How to build and develop using the new dockerized app.
+
+### Prerequisites
+
+If not already satisfied, add the following entry at the very top of your `/etc/hosts` file:
+
+```
+127.0.0.1 fakeservices.datajoint.io
+```
+
+This creates an alias to your `localhost` for requests to `fakeservices.datajoint.io`.
+
+Make sure to also define a `.env` file as follows:
+
+``` sh
+# minimum
+DJ_PASS=db_password
+AWS_ACCESS_KEY_ID=aws_key
+AWS_SECRET_ACCESS_KEY=aws_secret
+DEMO_PASSWORD=ibl_navigator_password
+JWT_SECRET=secret
+# utilized for remote deployment
+SUBDOMAINS=sub
+URL=example.com
+# utilized for load testing
+TEST_DJ_HOST=test_db_host
+TEST_DJ_USER=test_db_user
+TEST_DJ_PASS=test_db_password
+```
+
+### Build
+
+When building in local/dev mode, make sure not to commit the changes in the `frontend/src/environment` folder, especially the part where `backend_url` gets overwritten by the `fakeservices.datajoint.io` URL.
 
 To be 100% sure the new build is reflected, use
 `docker-compose -f docker-compose-dev.yml build --no-cache`
 
 Then,
 `docker-compose -f docker-compose-dev.yml up`
 to begin development in `ng serve` mode - go to
-localhost:9000 to see the site.
+fakeservices.datajoint.io:9000 to see the site.
 `docker-compose -f docker-compose-dev.yml down`
 when done developing.
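Before adding the `/etc/hosts` entry above, it may be worth checking whether it is already present. A small illustrative helper (not part of the repo; the function name and parsing rules are assumptions) that checks a hosts-file string:

```python
# Illustrative sketch: check whether a hosts-file string already maps an IP
# to a hostname, ignoring comments and blank lines.

def has_hosts_entry(hosts_text, ip, hostname):
    """Return True if any non-comment line maps `ip` to `hostname`."""
    for line in hosts_text.splitlines():
        line = line.split('#', 1)[0].strip()  # drop trailing comments
        if not line:
            continue
        fields = line.split()
        if fields[0] == ip and hostname in fields[1:]:
            return True
    return False

sample = "127.0.0.1 fakeservices.datajoint.io\n127.0.0.1 localhost\n"
print(has_hosts_entry(sample, '127.0.0.1', 'fakeservices.datajoint.io'))  # True
```

In practice you would call it with the contents of `/etc/hosts` and only append the entry when it returns `False`.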

@@ -17,48 +49,56 @@ For detached mode and to add log after the fact
 `docker-compose -f docker-compose-dev.yml up -d`
 `docker-compose -f docker-compose-dev.yml logs -f`
 
-To see the production build using `ng build --prod`,
-do the regular docker-compose up then go to localhost:8080
-`docker-compose up --build`
+**To see the production build using `ng build --prod`, make sure to increment the `vX.X.X` portion of the image tag and, if it relates to the public site, add `-public` at the end.**
+
+Do the regular docker-compose up, then go to localhost:9000
+`docker-compose -f docker-compose-build.yml up --build`
 
 To check inside docker:
 `docker-compose -f docker-compose-dev.yml exec ibl-node-server /bin/bash`
 
 --------------------------------
 for deploy (general)
 
-Before building, make sure `build: ./ibl-frontend` is UNcommented in docker-compose.yml.
-`docker-compose build ibl-navigator` once that's built,
-`docker push registry.vathes.com/ibl-navigator/frontend:v0.0`
+`docker-compose -f docker-compose-build.yml build ibl-navigator`; once that's built,
+`docker-compose -f docker-compose-build.yml push ibl-navigator`
 
-commentout the `build: ./ibl-frontend`
-
-repeat for other 3 `iblapi` `ibl-node-server` `nginx` and push to appropriate directory. Update the tags accordingly as well.
+Repeat for the other 2, `iblapi` and `ibl-node-server`, and push to the appropriate directory. Update the tags accordingly as well.
 
 for testdev deploy
 Comment out the test/* directory in `.dockerignore` (until a proper storage solution is in place).
+
+Make sure to update the `SUBDOMAINS` key in the `.env` file to `testdev`.
+Make sure to update the `URL` key in the `.env` file to `datajoint.io`.
+
+For test dev mode, in `docker-compose-deploy.yml` make sure `STAGING=true` is set in the `letsencrypt` > environment setting.
 
 `ssh testdev`, go to `ibl-navigator`
-`docker-compose down` to stop what's already running
+`docker-compose -f docker-compose-deploy.yml down` to stop what's already running
 `sudo rm -R letsencrypt-keys` to get rid of the key folder generated in the previous run.
-`git pull origin dev` to get the latest from the `mahos/ibl-navigator` repo.
+`git pull https://github.com/vathes/ibl-navigator.git dev` to get the latest from the `vathes/ibl-navigator` repo.
+Log in with your regular GitHub credentials (the ones registered under the vathes GitHub org).
 Make sure to move over to the `dev` branch with `git checkout dev`.
 `docker login registry.vathes.com` to get docker registry access.
-`docker-compose pull` to get the ibl-navigator container
-`docker-compose up --build -d`
+`docker-compose -f docker-compose-deploy.yml pull` to get the ibl-navigator container
+`docker-compose -f docker-compose-deploy.yml up -d`
 
 -----------------------------------
 
 for real deploy
-for client deploy mode, comment out `STAGING=true` for nginx > environment setting.
+
+Make sure to update the `SUBDOMAINS` key in the `.env` file to `djcompute`.
+Make sure to update the `URL` key in the `.env` file to `internationalbrainlab.org`.
+
+For client deploy mode, in `docker-compose-deploy.yml` make sure to comment out `STAGING=true` in the `letsencrypt` > environment setting.
 
 `ssh djcompute`, go to `nagivator-deployer/ibl-navigator`
-`docker-compose down` to stop what's already running
-`git pull origin master` to get the latest from `mahos/ibl-navigator` repo.
+`docker-compose -f docker-compose-deploy.yml down` to stop what's already running
+`git pull https://github.com/vathes/ibl-navigator.git master` to get the latest from the `vathes/ibl-navigator` repo.
+Log in with your regular GitHub credentials (the ones registered under the vathes GitHub org).
 Make sure to move over to the `master` branch with `git checkout master`.
 `docker login registry.vathes.com` to get docker registry access.
-`docker-compose pull` to get the ibl-navigator container
-`docker-compose up --build -d`
+`docker-compose -f docker-compose-deploy.yml pull` to get the ibl-navigator container
+`docker-compose -f docker-compose-deploy.yml up -d`
 
 -------------------------------------

backend/Dockerfile

Lines changed: 19 additions & 12 deletions
@@ -1,22 +1,29 @@
-FROM datajoint/jupyter:python3.6
+FROM raphaelguzman/djlab:py3.6-debian
 
-# for production builds
-ADD . /src/iblapi
-
-#RUN pip uninstall -y datajoint && pip install git+https://github.com/dimitri-yatsenko/datajoint-python.git@dev#egg=datajoint-python
+# RUN \
+#   pip uninstall -y datajoint && \
+#   pip install \
+#     git+https://github.com/dimitri-yatsenko/datajoint-python.git@dev#egg=datajoint-python
 
-RUN pip install --upgrade --pre datajoint
-
-RUN \
-  pip install -e /src/iblapi && \
-  chmod +x /src/iblapi/run-ibl-api.prod.sh && \
-  chmod +x /src/iblapi/run-ibl-api.dev.sh
+RUN pip install --upgrade --pre datajoint==0.12.9
 
 HEALTHCHECK \
   --timeout=3s \
   --retries=20 \
   CMD \
-    curl --fail http://localhost:5000/v0/lab || exit 1
+    wget --quiet --tries=1 --spider http://localhost:5000/v0/lab > /dev/null 2>&1 || exit 1
 
 
 ENTRYPOINT ["/src/iblapi/run-ibl-api.prod.sh"]
+
+# for production builds
+RUN mkdir -p /src/iblapi
+COPY --chown=dja:anaconda ["notebooks", "/src/iblapi/notebooks"]
+COPY --chown=dja:anaconda ["./*.txt", "./*.sh", "./*.rst", "./*.py", "/src/iblapi/"]
+
+RUN \
+  pip install -e /src/iblapi && \
+  chmod +x /src/iblapi/run-ibl-api.prod.sh && \
+  chmod +x /src/iblapi/run-ibl-api.dev.sh
+
+# COPY --chown=dja:anaconda ["tests", "/src/iblapi/tests"]
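The HEALTHCHECK above probes `GET /v0/lab` with `wget --spider` and reports health via exit status. The same check can be sketched in Python (an illustrative stand-in, not repo code); a throwaway local HTTP server plays the role of the API so the sketch runs anywhere:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class OkHandler(BaseHTTPRequestHandler):
    """Always answers 200, standing in for the iblapi service."""
    def do_GET(self):
        self.send_response(200)
        self.send_header('Content-Length', '0')
        self.end_headers()
    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(('127.0.0.1', 0), OkHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

def healthy(url, timeout=3):
    """Return True iff a GET to `url` answers HTTP 200 (wget --spider style)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False

ok = healthy('http://127.0.0.1:{}/v0/lab'.format(server.server_port))
server.shutdown()
print(ok)  # True
```

In the container the real check targets `http://localhost:5000/v0/lab`; a non-zero exit after 20 retries marks the container unhealthy.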

backend/iblapi.py

Lines changed: 91 additions & 37 deletions
@@ -41,10 +41,13 @@ def test_mkvmod(mod):
 plotting_behavior = mkvmod('plotting_behavior')
 analyses_behavior = mkvmod('analyses_behavior')
 plotting_ephys = mkvmod('plotting_ephys')
+plotting_histology = mkvmod('plotting_histology')
 test_plotting_ephys = test_mkvmod('plotting_ephys')
 ephys = mkvmod('ephys')
 histology = mkvmod('histology')
 test_histology = test_mkvmod('histology')
+original_max_join_size = dj.conn().query(
+    "show variables like 'max_join_size'").fetchall()[0][1]
 
 dj.config['stores'] = {
     'ephys': dict(
@@ -140,6 +143,7 @@ def dumps(cls, obj):
     'spikeamptimetemplate': plotting_ephys.SpikeAmpTimeTemplate,
     'waveformtemplate': plotting_ephys.WaveformTemplate,
     # 'depthbrainregions': test_histology.DepthBrainRegion,
+    'brainregions': reference.BrainRegion
 
 }
 dumps = DateTimeEncoder.dumps
@@ -153,44 +157,41 @@ def mkpath(path):
 def do_req(subpath):
     app.logger.info("method: '{}', path: {}, values: {}".format(
         request.method, request.path, request.values))
-
     # 1) parse request & arguments
     pathparts = request.path.split('/')[2:]  # ['', 'v0'] [ ... ]
     obj = pathparts[0]
-
     values = request.values
     postargs, jsonargs = {}, None
-
+    # construct kwargs
+    kwargs = {'as_dict': True}
     limit = int(request.values['__limit']) if '__limit' in values else None
     order = request.values['__order'] if '__order' in values else None
     proj = json.loads(request.values['__proj']) if '__proj' in values else None
-
-    special_fields = ['__json', '__limit', '__order', '__proj']
+    special_fields = ['__json', '__limit', '__order', '__proj', '__json_kwargs']
     for a in (v for v in values if v not in special_fields):
         # HACK: 'uuid' attrs -> UUID type (see also: datajoint-python #594)
         postargs[a] = UUID(values[a]) if 'uuid' in a else values[a]
-
     args = [postargs] if len(postargs) else []
     if '__json' in values:
         jsonargs = json.loads(request.values['__json'])
         args += jsonargs if type(jsonargs) == list else [jsonargs]
-
+    json_kwargs = {}
+    if '__json_kwargs' in values:
+        json_kwargs = json.loads(request.values['__json_kwargs'])
     args = {} if not args else dj.AndList(args)
-    kwargs = {i[0]: i[1] for i in (('as_dict', True,),
+    kwargs = {k: v for k, v in (('as_dict', True,),
                                    ('limit', limit,),
-                                   ('order_by', order,)) if i[1] is not None}
-
+                                   ('order_by', order,)) if v is not None}
     # 2) and dispatch
     app.logger.debug("args: '{}', kwargs: {}".format(args, kwargs))
     if obj not in reqmap:
         abort(404)
     elif obj == '_q':
-        return handle_q(pathparts[1], args, proj, **kwargs)
+        return handle_q(pathparts[1], args, proj, fetch_args=kwargs, **json_kwargs)
     else:
         q = (reqmap[obj] & args)
         if proj:
             q = q.proj(*proj)
-
     from time import time
     start = time()
     print('about to fetch requested object')
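The parsing above recognizes the special fields `__limit`, `__order`, `__proj`, `__json`, and the newly added `__json_kwargs`. A hypothetical client-side sketch (field names come from the diff; the concrete values and restriction keys are made up) of how those fields are assembled and then decoded by the handler:

```python
import json

# Client side: pack restrictions and keyword options into the special fields.
params = {
    '__limit': '25',
    '__order': 'session_start_time DESC',
    '__json': json.dumps([{'lab_name': 'somelab'}]),            # AND-ed restrictions
    '__json_kwargs': json.dumps({'brain_regions': ['AB', 'ABCa']}),  # forwarded to handle_q
}

# Server side: the handler decodes them roughly like this.
limit = int(params['__limit'])
restrictions = json.loads(params['__json'])
json_kwargs = json.loads(params['__json_kwargs'])
print(limit, restrictions[0]['lab_name'], json_kwargs['brain_regions'])
```

With this shape, `json_kwargs` arrives in `handle_q` as keyword arguments (e.g. `brain_regions=[...]`), while `limit`/`order` travel separately as fetch arguments.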
@@ -200,22 +201,24 @@ def do_req(subpath):
     print('Took {} seconds to fetch dataset'.format(dur))
     return dumps(fetched)
     # return dumps(q.fetch(**kwargs))
-
 
-
-def handle_q(subpath, args, proj, **kwargs):
+def handle_q(subpath, args, proj, fetch_args=None, **kwargs):
     '''
     special queries (under '/_q/ URL Space)
     - for sessionpage, provide:
         ((session * subject * lab * user) & arg).proj(flist)
     '''
     app.logger.info("handle_q: subpath: '{}', args: {}".format(subpath, args))
+    app.logger.info('key words: {}'.format(kwargs))
 
+    fetch_args = {} if fetch_args is None else fetch_args
     ret = []
     post_process = None
     if subpath == 'sessionpage':
+        print('type of args: {}'.format(type(args)))
         sess_proj = acquisition.Session().aggr(
-            acquisition.SessionProject().proj('session_project', dummy2='"x"') * dj.U('dummy2'),
+            acquisition.SessionProject().proj('session_project', dummy2='"x"')
+            * dj.U('dummy2'),
             session_project='IFNULL(session_project, "unassigned")',
             keep_all_rows=True
         )
@@ -226,18 +229,36 @@
             nplot='count(dummy)',
             keep_all_rows=True)
         ephys_data = acquisition.Session().aggr(
-            ephys.ProbeInsertion().proj(dummy2='"x"') * dj.U('dummy2'),
-            nprobe='count(dummy2)',
+            ephys.ProbeInsertion().proj(dummy3='"x"') * dj.U('dummy3'),
+            nprobe='count(dummy3)',
             keep_all_rows=True)
-        # q = (acquisition.Session() * sess_proj * psych_curve * ephys_data * subject.Subject() * subject.SubjectLab() * subject.SubjectUser() * analyses_behavior.SessionTrainingStatus()
-        #      & ((reference.Lab() * reference.LabMember())
-        #         & reference.LabMembership().proj('lab_name', 'user_name'))
-        #      & args)
-        q = (acquisition.Session() * sess_proj * psych_curve * ephys_data * subject.Subject() * subject.SubjectLab() * subject.SubjectUser() * analyses_behavior.SessionTrainingStatus()) & args
-        # training_status = acquisition.Session.aggr(analyses_behavior.SessionTrainingStatus.proj(dummy3='"x"') * dj.U('dummy3'), nstatus='count(dummy3)', keep_all_rows=True)
-        # q = acquisition.Session() * sess_proj * psych_curve * ephys_data * training_status * subject.Subject() * subject.SubjectLab() & ((reference.Lab() * reference.LabMember() & reference.LabMembership().proj('lab_name', 'user_name')))
+        trainingStatus = acquisition.Session().aggr(
+            analyses_behavior.SessionTrainingStatus().proj(dummy4='"x"') * dj.U('dummy4'),
+            keep_all_rows=True) * acquisition.Session().aggr(
+            (analyses_behavior.SessionTrainingStatus()),
+            training_status='training_status', good_enough_for_brainwide_map='good_enough_for_brainwide_map',
+            keep_all_rows=True
+        )
+        regions = kwargs.get('brain_regions', None)
+        # expected format of brain_regions = ["AB", "ABCa", "CS of TCV"]
+        if regions is not None and len(regions) > 0:
+            region_restr = [{'acronym': v} for v in regions]
+            brain_restriction = histology.ProbeBrainRegionTemp() & region_restr
+            # keep the temp table for internal site since temp table has more entries for internal users to see
+            # for public site replace ProbeBrainRegionTemp() with ProbeBrainRegion() table.
+        else:
+            brain_restriction = {}
+        # q = ((acquisition.Session() * sess_proj * psych_curve * ephys_data * subject.Subject() *
+        #       subject.SubjectLab() * subject.SubjectUser() *
+        #       analyses_behavior.SessionTrainingStatus()) & args & brain_restriction)
+
+        q = ((acquisition.Session() * sess_proj * psych_curve * ephys_data * subject.Subject() *
+              subject.SubjectLab() * subject.SubjectUser() * trainingStatus) & args & brain_restriction)
+
+        dj.conn().query("SET SESSION max_join_size={}".format('18446744073709551615'))
+        q = q.proj(*proj).fetch(**fetch_args) if proj else q.fetch(**fetch_args)
+        dj.conn().query("SET SESSION max_join_size={}".format(original_max_join_size))
     elif subpath == 'subjpage':
-        print('Args are:', args)
         proj_restr = None
         for e in args:
             if 'projects' in e and e['projects'] != 'unassigned':
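The sessionpage branch above temporarily raises MySQL's `max_join_size` for the big multi-table join and then restores the value saved at import time. That save/raise/restore pattern can be sketched as a context manager (illustrative only; `StubSession` stands in for `dj.conn()`, and the real code issues `SET SESSION max_join_size=...` queries instead):

```python
from contextlib import contextmanager

class StubSession:
    """Stand-in for a DB session; real code would go through dj.conn().query()."""
    def __init__(self):
        self.variables = {'max_join_size': '4096'}
    def get(self, name):
        return self.variables[name]
    def set(self, name, value):
        self.variables[name] = value

@contextmanager
def raised_variable(session, name, new_value):
    original = session.get(name)      # save the current value
    session.set(name, new_value)      # raise it for the expensive join
    try:
        yield
    finally:
        session.set(name, original)   # always restore, even on error

s = StubSession()
with raised_variable(s, 'max_join_size', '18446744073709551615'):
    inside = s.get('max_join_size')   # raised while the fetch would run
after = s.get('max_join_size')        # restored afterwards
print(inside, after)  # 18446744073709551615 4096
```

A context manager guarantees the restore runs even if the fetch raises, which the inline set/fetch/set sequence in the diff does not.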
@@ -302,14 +323,17 @@
         q = (ephys.DefaultCluster & args).proj(..., *exclude_attrs) * ephys.DefaultCluster.Metrics.proj('firing_rate')
         print(q)
     elif subpath == 'probetrajectory':
-        traj = histology.ProbeTrajectory * histology.InsertionDataSource
+        # keep the provenance and temp table for internal site
+        traj = histology.ProbeTrajectoryTemp * histology.Provenance
 
         traj_latest = traj * (dj.U('subject_uuid', 'session_start_time', 'probe_idx', 'provenance') & \
             (ephys.ProbeInsertion & args).aggr(traj, provenance='max(provenance)'))
-        # x, y, z, phi, theta, depth, roll, trajectory_source = traj_latest.fetch1('x', 'y', 'z', 'phi', 'theta', 'depth', 'roll', 'insertion_data_source')
-        # q = traj_latest.fetch1('x', 'y', 'z', 'phi', 'theta', 'depth', 'roll', 'insertion_data_source')
+
         q = traj * (dj.U('subject_uuid', 'session_start_time', 'probe_idx', 'provenance') & \
             (ephys.ProbeInsertion & args).aggr(traj, provenance='max(provenance)'))
+
+        # for public site we don't need the trajectory source info (uses provenance) anymore so -> traj = histology.ProbeTrajectory
+        # or basically for public -> q = histology.ProbeTrajectory & args
     elif subpath == 'rasterlight':
         # q = plotting_ephys.RasterLinkS3 & args
         q = plotting_ephys.Raster & args  # temp test table
@@ -406,17 +430,47 @@ def post_process(ret):
             parsed_items.append(parsed_item)
         return parsed_items
     elif subpath == 'depthbrainregions':
-        depth_region = histology.DepthBrainRegion * histology.InsertionDataSource
-
-        q = depth_region * (dj.U('subject_uuid', 'session_start_time', 'probe_idx', 'provenance') &
-                            (ephys.ProbeInsertion & args).aggr(depth_region, provenance='max(provenance)'))
+        # depth_region = histology.DepthBrainRegionTemp * histology.Provenance
+
+        # q = depth_region * (dj.U('subject_uuid', 'session_start_time', 'probe_idx', 'provenance') &
+        #                     (ephys.ProbeInsertion & args).aggr(depth_region, provenance='max(provenance)'))
+
+        # NEW: test this before deploy to internal
+        q = histology.DepthBrainRegion & args
+    elif subpath == 'spinningbrain':
+        q = plotting_histology.SubjectSpinningBrain & args
+        # # Switch to plotting_histology once ingested
+        # q = plotting_histology.SubjectSpinningBrain & args
+        def post_process(ret):
+            parsed_items = []
+            for item in ret:
+                parsed_item = dict(item)
+                if parsed_item['subject_spinning_brain_link'] != '':  # if empty link, skip
+                    parsed_item['subject_spinning_brain_link'] = \
+                        s3_client.generate_presigned_url('get_object',
+                            Params={'Bucket': 'ibl-dj-external', 'Key': parsed_item['subject_spinning_brain_link']},
+                            ExpiresIn=3*60*60)
+                parsed_items.append(parsed_item)
+            return parsed_items
+    elif subpath == 'coronalsections':
+        q = plotting_histology.ProbeTrajectoryCoronal & args
+        def post_process(ret):
+            parsed_items = []
+            for item in ret:
+                parsed_item = dict(item)
+                if parsed_item['probe_trajectory_coronal_link'] != '':  # if empty link, skip
+                    parsed_item['probe_trajectory_coronal_link'] = \
+                        s3_client.generate_presigned_url('get_object',
+                            Params={'Bucket': 'ibl-dj-external', 'Key': parsed_item['probe_trajectory_coronal_link']},
+                            ExpiresIn=3*60*60)
+                parsed_items.append(parsed_item)
+            return parsed_items
     else:
         abort(404)
+
 
-    if proj:
-        ret = q.proj(*proj).fetch(**kwargs)
-    else:
-        ret = q.fetch(**kwargs)
+    ret = q if isinstance(q, (list, dict)) else (q.proj(*proj).fetch(**fetch_args)
+                                                 if proj else q.fetch(**fetch_args))
 
     # print('D type', ret.dtype)
     # print(ret)