* refactor order of ops
* adjust code to remove .metadata
* add docker and dep
* move all and sequenced caselist generation to load.py
* remove not needed code
* update to current workflow
* add docker setup and tests to README
* [DPE-1453] Process ANDERS clinical dataset (#122)
* add anders dataset specific filtering, convert lens map to be string vals
* address PR comments
* [DPE-1468] Add neoantigen data into clinical sample data (#124)
* initial commit for incorporating neoantigen data
* rearrange code to have a general validation script
* add tests
* remove unused code
* remove unused code
* add unit tests and docstring
* update docstring order of ops
* add indicator in logs for any error that study failed, address PR comments
```diff
@@ -103,7 +103,7 @@ python3 local/iatlas/lens.py run --dataset_id <yaml-dataset-synapse-id> --s3_pre
 
 ### Overview
 
-#### maf_to_cbioportal.py
+#### maf.py
 
 This script will run the iatlas mutations data through genome nexus so it can be ingested by cbioportal team for visualization.
 
 The script does the following:
```
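As an aside (not part of the diff): a minimal sketch of annotating a single variant against the public Genome Nexus REST API. The `GET /annotation/{variant}` endpoint, the genomic HGVS input format, and the example variant are assumptions for illustration, not code taken from `maf.py`.

```python
# Illustrative only: query the public Genome Nexus REST API for one variant.
# The /annotation/{variant} endpoint and genomic HGVS notation are assumptions,
# not code lifted from maf.py.
import requests

GENOME_NEXUS_URL = "https://www.genomenexus.org"


def annotate_variant(hgvs_genomic: str) -> dict:
    """Return the Genome Nexus annotation for a genomic HGVS string, e.g. '7:g.140453136A>T'."""
    response = requests.get(f"{GENOME_NEXUS_URL}/annotation/{hgvs_genomic}", timeout=30)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    annotation = annotate_variant("7:g.140453136A>T")
    print(annotation.get("variant"))
```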
```diff
@@ -115,7 +115,7 @@ The script does the following:
 5. [Creates the required meta_* data](https://github.com/cBioPortal/datahub-study-curation-tools/tree/master/generate-meta-files)
 
 
-#### clinical_to_cbioportal.py
+#### clinical.py
 
 This script will process/transform the iatlas clinical data to be cbioportal format friendly so it can be ingested by cbioportal team for visualization.
 
 The script does the following:
```
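As an aside (not part of the diff): cBioPortal clinical files are tab-delimited with four `#`-prefixed metadata rows (display names, descriptions, datatypes, priorities) above the attribute names. Below is a minimal sketch of writing such a file with pandas, using hypothetical columns rather than the actual iatlas clinical attributes.

```python
# Illustrative only: write a cBioPortal-style clinical sample file
# (four '#' header rows, then attribute names, then tab-separated data).
# Column names here are hypothetical, not the actual iatlas clinical attributes.
import pandas as pd

samples = pd.DataFrame(
    {
        "PATIENT_ID": ["PT-01", "PT-02"],
        "SAMPLE_ID": ["SA-01", "SA-02"],
        "CANCER_TYPE": ["Melanoma", "Lung Adenocarcinoma"],
    }
)

header_rows = [
    "#Patient Identifier\tSample Identifier\tCancer Type",  # display names
    "#Patient identifier\tSample identifier\tCancer type",  # descriptions
    "#STRING\tSTRING\tSTRING",                              # datatypes
    "#1\t1\t1",                                             # priorities
]

with open("data_clinical_sample.txt", "w") as handle:
    handle.write("\n".join(header_rows) + "\n")
    samples.to_csv(handle, sep="\t", index=False)
```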
```diff
@@ -128,18 +128,59 @@ The script does the following:
 
 
 ### Setup
-- `pandas` == 2.0
-- `synapseclient`==4.8.0
+Prior to testing/developing/running this locally, you will need to setup the Docker image in order to run this.
+Optional: You can also build your environment via python env and install the `uv.lock` file
+
+1. Create and activate your venv
+
+```
+python3 -m venv <your_env_name>
+source <your_env_name>/bin/activate
+```
+
+2. Export dependencies from uv.lock
+
+```
+pip install uv
+uv export > requirements.txt
+```
+
+3. Install into your venv
+
+```
+pip install -r requirements.txt
+```
+
+But it is highly recommended you use the docker image
```