
Commit b48f164

fixed some alignementS and added tutorial to mkdocs

1 parent deafaed · commit b48f164

File tree

11 files changed (+25, −19 lines)


README.md

Lines changed: 6 additions & 2 deletions
@@ -3,8 +3,12 @@
 LSLAutoBIDS
 </h1>
 <p align="center"> Tools to convert LSL + friends automatically to BIDS, and upload it to a Dataverse </p>
-
-
+<p align="center">
+<a href="https://s-ccs.github.io/LSLAutoBIDS/" target="_blank">
+<img src="https://img.shields.io/badge/View%20Full%20Documentation-%230072C6?style=for-the-badge&logo=readthedocs&logoColor=white" alt="View Full Documentation">
+</a>
+</p>
+ss
 ## 🚀 Getting Started
 
 Get started with LSLAutoBIDS by installing the package and its dependencies.

docs/README.md

Lines changed: 0 additions & 1 deletion
This file was deleted.

docs/bids_convert_and_upload.md

Lines changed: 8 additions & 8 deletions
@@ -1,4 +1,4 @@
-### 4. BIDS Conversion and Upload Pipeline ⚙️ (`convert_to_bids_and_upload.py`)
+## BIDS Conversion and Upload Pipeline ⚙️ (`convert_to_bids_and_upload.py`)
 
 The pipeline is designed to ensure:
 
@@ -12,7 +12,7 @@ The pipeline is designed to ensure:
 
 5. The dataset is registered in Dataverse and optionally pushed/uploaded automatically.
 
-#### 1. Entry Point (`bids_process_and_upload()`)
+#### Entry Point (`bids_process_and_upload()`)
 
 - Reads the project configuration (`<project_name>_config.toml`) to check whether another computer (non-EEG files) was used (`otherFilesUsed: true`).
 
@@ -42,7 +42,7 @@ The pipeline is designed to ensure:
 
 - Pushes data to Dataverse automatically (`--yes`) or via user confirmation.
 
-#### 2. Convert to BIDS (`convert_to_bids()`)
+#### Convert to BIDS (`convert_to_bids()`)
 This function handles the core conversion of XDF files to BIDS format and constructs the dataset structure. It performs the following steps:
 
 1. Copy raw/behavioral/experiment files via `copy_source_files_to_bids()` (see section below).
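The files that `convert_to_bids()` writes follow standard BIDS entity naming. A minimal sketch of how such a filename is assembled; the real pipeline delegates this to `mne_bids.BIDSPath`, and the zero-padding shown here is a convention choice, not necessarily what LSLAutoBIDS emits.

```python
def bids_eeg_filename(subject: str, session: str, task: str, run: int,
                      ext: str = ".vhdr") -> str:
    """Assemble a BIDS-style EEG filename from its entities."""
    return f"sub-{subject}_ses-{session}_task-{task}_run-{run:03d}_eeg{ext}"

print(bids_eeg_filename("001", "001", "rest", 1))
# sub-001_ses-001_task-rest_run-001_eeg.vhdr
```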
@@ -77,7 +77,7 @@ This function handles the core conversion of XDF files to BIDS format and constructs the dataset structure.
 
 - 0: BIDS conversion done but validation failed
 
-#### 3. Copy Source Files (`copy_source_files_to_bids()`)
+#### Copy Source Files (`copy_source_files_to_bids()`)
 This function ensures that the original source files (EEG and other/behavioral files) are also part of our dataset. These files can't be directly converted to BIDS format, but we give the user the option to include them in the BIDS directory structure in a pseudo-BIDS format for completeness.
 
 - Copies the `.xdf` into the following structure:
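The copy step can be sketched with the stdlib; the target layout below (`sub-*/ses-*/eeg/`) is an assumption for illustration, not necessarily the exact pseudo-BIDS structure LSLAutoBIDS uses.

```python
import shutil
import tempfile
from pathlib import Path

def copy_xdf_to_bids(xdf_file: Path, bids_root: Path,
                     subject: str, session: str) -> Path:
    """Copy a raw .xdf file into a pseudo-BIDS location under the dataset root."""
    target_dir = bids_root / f"sub-{subject}" / f"ses-{session}" / "eeg"
    target_dir.mkdir(parents=True, exist_ok=True)
    target = target_dir / xdf_file.name
    shutil.copy2(xdf_file, target)  # copy2 preserves timestamps
    return target

# Demo with throwaway files
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    src = root / "recording.xdf"
    src.write_bytes(b"xdf-bytes")
    copied = copy_xdf_to_bids(src, root / "bids", "001", "001")
    ok = copied.exists()
    print(ok)
```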
@@ -104,7 +104,7 @@ If otherFilesUsed=True in the project config file:
 
 There is a flag in the `lslautobids run` command called `--redo_other_pc` which, when specified, forces overwriting of existing other and experiment files in the BIDS dataset. This is useful if there are updates or corrections to the other/behavioral data that need to be reflected in the BIDS dataset.
 
-#### 4. Create Raw XDF (`create_raw_xdf()`)
+#### Create Raw XDF (`create_raw_xdf()`)
 This function reads the XDF file and creates an MNE Raw object. It performs the following steps:
 - Select the EEG stream using `match_streaminfos(type="EEG")`.
 
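The stream-selection step can be illustrated without pyxdf installed: given the stream-info dicts that pyxdf resolves from an XDF file, keep those whose type is EEG. This is a simplified stand-in for `match_streaminfos`, not its actual implementation.

```python
def select_streams(stream_infos: list[dict], stream_type: str = "EEG") -> list[dict]:
    """Return stream infos whose 'type' matches, case-insensitively."""
    return [s for s in stream_infos
            if s.get("type", "").upper() == stream_type.upper()]

# Illustrative stream infos, mimicking what pyxdf resolves from an XDF file
streams = [
    {"stream_id": 1, "name": "Markers", "type": "Markers"},
    {"stream_id": 2, "name": "BrainAmp", "type": "EEG"},
]
print([s["stream_id"] for s in select_streams(streams)])  # [2]
```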
@@ -118,20 +118,20 @@ This function reads the XDF file and creates an MNE Raw object.
 
 This produces a clean, memory-efficient Raw object ready for BIDS conversion.
 
-#### 5. BIDS Validation (`validate_bids()`)
+#### BIDS Validation (`validate_bids()`)
 This function validates the generated BIDS files using the `bids-validator` package. It performs the following steps:
 - Walks through the BIDS directory.
 - Skips irrelevant files (`.xdf`, `.tar.gz`, behavioral files, hidden/system files).
 - Uses `BIDSValidator` to validate relative paths.
 - If any file fails validation, logs an error and returns 0; otherwise, logs success and returns 1.
 
-#### 6. Populate dataset_description.json (`populate_dataset_description_json()`)
+#### Populate dataset_description.json (`populate_dataset_description_json()`)
 This function generates the `dataset_description.json` file required by the BIDS standard. It performs the following steps:
 - Gathers metadata (title, authors, license, etc.) from the project TOML config file.
 - Calls `make_dataset_description()` from `mne_bids`.
 - Overwrites the existing file with updated values.
 
-#### 7. Datalad and Dataverse Integration
+#### Datalad and Dataverse Integration
 This part of the pipeline manages version control and data sharing using DataLad and Dataverse. After conversion, the following steps occur:
 
 - `generate_json_file()` → Generates supplementary metadata JSONs.
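The walk-and-skip logic of `validate_bids()` can be sketched as a pure filter. Here `is_valid` stands in for `BIDSValidator().is_bids()`; the helper names and the exact skip list are illustrative.

```python
from pathlib import Path

SKIP_SUFFIXES = (".xdf", ".tar.gz")  # raw/archived source files

def should_validate(rel_path: str) -> bool:
    """Return True if the file should be passed to the BIDS validator."""
    if Path(rel_path).name.startswith("."):   # hidden/system files
        return False
    if rel_path.endswith(SKIP_SUFFIXES):
        return False
    return True

def validate_tree(rel_paths: list[str], is_valid) -> int:
    """Return 1 if every non-skipped path validates, else 0."""
    return int(all(is_valid(p) for p in rel_paths if should_validate(p)))

paths = [
    "/sub-001/eeg/sub-001_task-rest_eeg.vhdr",
    "/sourcedata/rec.xdf",   # skipped
    "/.DS_Store",            # skipped
]
print(validate_tree(paths, lambda p: p.endswith(".vhdr")))  # 1
```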

docs/cli.md

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-### 1. CLI Module (`cli.py`)
+## CLI Module (`cli.py`)
 
 The `lslautobids` command-line interface provides the main entry point for the application:
 
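The docs mention an `lslautobids run` subcommand with `--yes` and `--redo_other_pc` flags. A minimal argparse sketch of such an entry point follows; the real CLI likely defines more subcommands and options.

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="lslautobids")
    sub = parser.add_subparsers(dest="command", required=True)
    run = sub.add_parser("run", help="Convert new recordings and upload them")
    run.add_argument("--yes", action="store_true",
                     help="Push to Dataverse without confirmation")
    run.add_argument("--redo_other_pc", action="store_true",
                     help="Force overwriting of other/experiment files")
    return parser

args = build_parser().parse_args(["run", "--yes"])
print(args.command, args.yes, args.redo_other_pc)  # run True False
```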

docs/configuration.md

Lines changed: 3 additions & 3 deletions
@@ -1,8 +1,8 @@
-### 2. Configuration System
+## Configuration System
 
 The configuration system manages Dataverse and project-specific settings using YAML and TOML files.
 
-#### 1. Dataverse and Project Root Configuration (`gen_dv_config.py`)
+#### Dataverse and Project Root Configuration (`gen_dv_config.py`)
 
 This module generates a global configuration file for Dataverse and project root directories. This is a one-time setup per system. The file is stored in `~/.config/lslautobids/autobids_config.yaml` and contains:
 - Paths for the BIDS, projects, and project_other directories: these let users specify where their EEG data, behavioral data, and converted BIDS data are stored. The paths should be strings relative to the home/users directory of your system.
@@ -19,7 +19,7 @@ _Currently, the package doesn't allow you to have multiple dataverse configurations._
 
 However, for testing purposes, we create a separate test configuration file `~/.config/lslautobids/test-autobids_config.yaml` which is used when running the tests.
 
-#### 2. Project Configuration (`gen_project_config.py`)
+#### Project Configuration (`gen_project_config.py`)
 This module generates a project-specific configuration file in TOML format. The file is stored at `projects/<PROJECT_NAME>/<PROJECT_NAME>_config.toml` and contains:
 - Project metadata: title, description, license, authors, etc.
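An illustrative `<PROJECT_NAME>_config.toml` skeleton based on the description above; the table and key names are assumptions, not the package's exact schema.

```toml
# Hypothetical project config skeleton (key names are illustrative)
[Dataset]
title = "MyStudy"
description = "Example EEG study"
license = "CC-BY-4.0"
authors = ["Jane Doe"]
```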

docs/contributions.md

Lines changed: 2 additions & 0 deletions
@@ -1,3 +1,5 @@
+## Contribution Guidelines
+
 First of all, thanks for the interest!
 
 We welcome all kinds of contributions, including, but not limited to, code, documentation, examples, configuration, issue creation, etc.

docs/datalad_integration.md

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-### 4. DataLad Integration (`datalad_create.py`)
+## DataLad Integration (`datalad_create.py`)
 
 The DataLad integration module manages the creation and updating of DataLad datasets for version control of the BIDS dataset. This function initializes a DataLad dataset at a given project path.
 The function performs the following steps:
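Initializing a DataLad dataset is equivalent to running `datalad create` on the project path. The sketch below only constructs the command (so it runs without DataLad installed); the module itself may call DataLad's Python API instead.

```python
from pathlib import Path

def datalad_create_cmd(dataset_path: Path, annex: bool = True) -> list[str]:
    """Build a `datalad create` command for initializing a dataset."""
    cmd = ["datalad", "create"]
    if not annex:
        cmd.append("--no-annex")  # plain git dataset, no git-annex
    cmd.append(str(dataset_path))
    return cmd

cmd = datalad_create_cmd(Path("projects/MyStudy"))
print(cmd)
# To actually run it: subprocess.run(cmd, check=True)
```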

docs/dataverse_integration.md

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-### 5. Dataverse Integration
+### Dataverse Integration
 
 #### 1. Dataverse Dataset Creation (`dataverse_dataset_create.py`)
 This module handles the creation of a new dataset in Dataverse using the `pyDataverse` library. The function performs the following steps:
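Creating a Dataverse dataset requires a citation-metadata payload (title, authors, etc.). A sketch of assembling a minimal payload as plain JSON; the field layout follows the Dataverse native API's citation block and is illustrative rather than the exact structure LSLAutoBIDS emits.

```python
import json

def minimal_dataset_metadata(title: str) -> dict:
    """Build a minimal Dataverse dataset payload with just a title field."""
    return {
        "datasetVersion": {
            "metadataBlocks": {
                "citation": {
                    "displayName": "Citation Metadata",
                    "fields": [
                        {"typeName": "title", "multiple": False,
                         "typeClass": "primitive", "value": title},
                    ],
                }
            }
        }
    }

meta = minimal_dataset_metadata("MyStudy")
payload = json.dumps(meta)  # what would be POSTed via pyDataverse
```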

docs/faq.md

Lines changed: 1 addition & 0 deletions
@@ -1,3 +1,4 @@
+# Frequently Asked Questions (FAQ)
 
 These are some frequently asked questions regarding the LSLAutoBIDS tool and workflow.
 

docs/file_processing.md

Lines changed: 1 addition & 1 deletion
@@ -1,4 +1,4 @@
-### 3. File Processing Pipeline (`processing_new_files.py`)
+### File Processing Pipeline (`processing_new_files.py`)
 
 The file processing part of the pipeline handles finding and processing new XDF files in the specified project directory:
 
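Finding new XDF files typically means globbing the project directory and skipping anything already processed. A stdlib sketch of that step; the real module's bookkeeping of processed files may differ.

```python
import tempfile
from pathlib import Path

def find_new_xdf_files(project_dir: Path, already_processed: set[str]) -> list[Path]:
    """Recursively collect .xdf files that have not been processed yet."""
    return sorted(
        p for p in project_dir.rglob("*.xdf")
        if p.name not in already_processed
    )

# Demo with a throwaway project directory
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    (root / "sub-001").mkdir()
    (root / "sub-001" / "old.xdf").touch()
    (root / "sub-001" / "new.xdf").touch()
    new_files = [p.name for p in find_new_xdf_files(root, {"old.xdf"})]
    print(new_files)  # ['new.xdf']
```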
