# neuropixels cloud lab information management system

Tools to fetch and update paths, metadata and state for Mindscope Neuropixels sessions, in the cloud.
- make a new Python >=3.9 virtual environment with `conda` or `venv` (the lighter option, since this package does not require `pandas`, `numpy`, etc.):

  ```shell
  python -m venv .venv
  ```
- activate the virtual environment:
  - Windows:

    ```shell
    .venv\Scripts\activate
    ```

  - Unix:

    ```shell
    source .venv/bin/activate
    ```
- install the package:

  ```shell
  python -m pip install npc_lims
  ```
- set up credentials
  - required environment variables:
    - AWS S3:
      - `AWS_DEFAULT_REGION`
      - `AWS_ACCESS_KEY_ID`
      - `AWS_SECRET_ACCESS_KEY`
      - needed to find and read files on S3
        - you must have read access on the relevant AIND buckets
        - credentials can also be stored in the standard `~/.aws` location, as used by the AWS CLI or `boto3`
    - CodeOcean API:
      - `CODE_OCEAN_API_TOKEN`
      - `CODE_OCEAN_DOMAIN`
      - needed to find processed data in "data assets" via the CodeOcean API
      - generated in CodeOcean:
        - right-click on `Account` (bottom left, person icon), then click `User Secrets`
          - these are secrets that can be made available as environment variables in CodeOcean capsules
        - go to `Access Tokens` and click `Generate new token`
          - this is for programmatically querying CodeOcean's databases
        - in `Token Name` enter `Codeocean API (read)` and check `read` on capsules and datasets
        - a token will be generated: click copy (store it in a password manager, if you use one)
        - head back to `User Secrets`, where we'll paste the token into a new secret via `Add secret > API credentials`:
          - in `description` enter `Codeocean API (read)`
          - in `API key` enter `CODE_OCEAN_API_KEY`
          - in `API secret` paste the copied token from before (it should start with `cop_...`)
      - `CODE_OCEAN_DOMAIN` is the CodeOcean https address, up to and including `.org`
  - environment variables can also be specified in a file named `.env` in the current working directory
    - example: https://www.dotenv.org/docs/security/env.html
    - be very careful that this file does not get pushed to public locations, e.g. GitHub
      - if using git, add it to a `.gitignore` file in your project's root directory:

        ```
        .env*
        ```
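As a quick sanity check (a minimal sketch using only the standard library; the variable names are the ones listed above, and the values are your own credentials), you can verify that the required environment variables are visible to Python before using the package:

```python
import os

# the environment variables listed above
REQUIRED = (
    "AWS_DEFAULT_REGION",
    "AWS_ACCESS_KEY_ID",
    "AWS_SECRET_ACCESS_KEY",
    "CODE_OCEAN_API_TOKEN",
    "CODE_OCEAN_DOMAIN",
)

# collect any variables that are unset or empty in the current environment
missing = [name for name in REQUIRED if not os.environ.get(name)]
if missing:
    print("missing credentials:", ", ".join(missing))
else:
    print("all credentials set")
```

If any names are reported missing, export them in your shell or add them to the `.env` file described above.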
- now, in Python, we can find sessions that are available to work with:

  ```python
  >>> import npc_lims

  # get a sequence of `SessionInfo` dataclass instances, one per session:
  >>> tracked_sessions: tuple[npc_lims.SessionInfo, ...] = npc_lims.get_session_info()

  # each `SessionInfo` instance has minimal metadata about its session:
  >>> tracked_sessions[0]             # doctest: +SKIP
  npc_lims.SessionInfo(id='626791_2022-08-15', subject=626791, date='2022-08-15', idx=0, project='DRPilotSession', is_ephys=True, is_sync=True, allen_path=PosixUPath('//allen/programs/mindscope/workgroups/dynamicrouting/PilotEphys/Task 2 pilot/DRpilot_626791_20220815'))
  >>> tracked_sessions[0].is_ephys    # doctest: +SKIP
  False

  # currently, we're only tracking behavior and ephys sessions that use variants of
  # https://github.com/samgale/DynamicRoutingTask/blob/main/TaskControl.py:
  >>> all(s.date.year >= 2022 for s in tracked_sessions)
  True
  ```
- "tracked sessions" are discovered via 3 routes:
  - https://github.com/AllenInstitute/npc_lims/blob/main/tracked_sessions.yaml
  - `\\allen\programs\mindscope\workgroups\dynamicrouting\DynamicRoutingTask\DynamicRoutingTraining.xlsx`
  - `\\allen\programs\mindscope\workgroups\dynamicrouting\DynamicRoutingTask\DynamicRoutingTrainingNSB.xlsx`
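Once sessions are retrievable, the `SessionInfo` tuple can be filtered like any other sequence of dataclasses. The sketch below uses a hypothetical stand-in dataclass with made-up example data (mirroring a subset of the fields shown in the repr above) so that it runs without credentials; with the real package you would start from `npc_lims.get_session_info()` instead:

```python
import datetime
from dataclasses import dataclass


# hypothetical stand-in for npc_lims.SessionInfo, with a subset of the
# fields shown above (made-up example data, not real sessions)
@dataclass(frozen=True)
class SessionInfo:
    id: str
    subject: int
    date: datetime.date
    is_ephys: bool
    is_sync: bool


tracked: tuple[SessionInfo, ...] = (
    SessionInfo("626791_2022-08-15", 626791, datetime.date(2022, 8, 15), True, True),
    SessionInfo("000000_2023-01-10", 0, datetime.date(2023, 1, 10), False, True),
)
# with the real package: tracked = npc_lims.get_session_info()

# keep only sessions with ephys data, as flagged by `is_ephys`
ephys_sessions = [s for s in tracked if s.is_ephys]
print([s.id for s in ephys_sessions])  # → ['626791_2022-08-15']
```

The same pattern works for any of the metadata fields, e.g. filtering by `project`, `subject`, or `date`.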