🧠 Project Cere

Multimodal AI for Social Good

AI4Good Lab @Mila • Montreal 2025 Cohort

πŸ” Project Overview

Project Cere develops multimodal machine learning models that integrate visual, textual, and audio data to address pressing social challenges. This repository contains our codebase, experiments, and documentation for creating interpretable AI systems with real-world impact.

✨ Features

  • Modular Architecture: Easy to extend with new classifiers
  • Multiple Classifier Types: Classical ML, Neural Networks, and Ensemble methods
  • Flexible Data Pipeline: Support for various fMRI data formats
  • Comprehensive Evaluation: Cross-validation, metrics, and visualization
  • Hyperparameter Optimization: Built-in grid search capabilities
  • Experiment Management: YAML-based configuration system
  • Extensible Design: Factory pattern for seamless classifier addition (see the sketch below)
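
As a rough illustration of that factory pattern, new classifiers might be registered under a string key and instantiated by name from configuration. This is a minimal sketch only; the registry, the register decorator, and the create_classifier helper are hypothetical names, not the actual API in models/.

    from typing import Callable, Dict, Type

    # Hypothetical registry mapping a string key to a classifier class.
    _CLASSIFIER_REGISTRY: Dict[str, Type] = {}

    def register(name: str) -> Callable[[Type], Type]:
        """Decorator that adds a classifier class to the registry."""
        def wrapper(cls: Type) -> Type:
            _CLASSIFIER_REGISTRY[name] = cls
            return cls
        return wrapper

    def create_classifier(name: str, **kwargs):
        """Instantiate a registered classifier by name."""
        if name not in _CLASSIFIER_REGISTRY:
            raise KeyError(f"Unknown classifier: {name}")
        return _CLASSIFIER_REGISTRY[name](**kwargs)

    @register("svm")
    class SVMClassifier:
        def __init__(self, C: float = 1.0):
            self.C = C

    # A config entry such as classifier: svm can then be resolved
    # without modifying any existing code:
    clf = create_classifier("svm", C=0.5)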

📋 Table of Contents

  • Project Overview
  • Features
  • Installation
  • Repository Structure
  • Usage
  • Project Roadmap
  • Team
  • License
  • Acknowledgments

πŸ› οΈ Installation

Prerequisites

  • Python 3.10+
  • Git

Setup

  1. Clone the repository:

    git clone https://github.com/marialagakos/AI4Good-MTL-Group-2.git
    cd AI4Good-MTL-Group-2
  2. Create and activate virtual environment:

    python -m venv cere-env
    # Linux/MacOS
    source cere-env/bin/activate
    # Windows (PowerShell)
    .\cere-env\Scripts\Activate.ps1
  3. Install dependencies:

    pip install --upgrade pip
    pip install -e .  # Editable install for development
    pip install -r requirements.txt  # Optional: Full dependency install
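  4. (Optional) Verify the editable install. The import name telepath is an assumption based on the src/telepath directory shown in the repository structure below; adjust it if the package is exposed under a different name:

    python -c "import telepath; print(telepath.__file__)"  # Assumed package name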

The following sections provide a concise team onboarding guide and command cheat sheet for working with this repository:

Creating A Fork

  1. Fork the Repository

  2. Clone Your Fork

    git clone https://github.com/YOUR-USERNAME/AI4Good-MTL-Group-2.git
    cd AI4Good-MTL-Group-2
  3. Set Up Remotes

    git remote add upstream https://github.com/marialagakos/AI4Good-MTL-Group-2.git
    git remote -v  # Verify: origin=your fork, upstream=original
  4. Sync with Upstream

    git fetch upstream
    git checkout main
    git merge upstream/main
    git push origin main  # Keep your fork updated

📂 Repository Structure

Project-Cere/
├── main.py                 # Main execution script
├── README.md               # Project documentation
├── data/                   # Raw and processed datasets
│   ├── feature_extraction.py    # Feature extraction utilities
│   ├── DATA_INSTRUCTIONS.md
│   ├── .ipynb_checkpoints/
│   ├── src/
│   │   ├── telepath/
│   │   ├── telepath.egg-info/
│   │   └── temp_audio_chunks/
│   ├── pca_data/
│   ├── loaders.py            # Data loading (fMRI, audio, text, visual)
│   ├── preprocessors.py      # Preprocessing
│   ├── transforms.py         # Transformations
│   ├── fmri/                    # fMRI data
│   ├── audio/                   # Audio samples
│   ├── transcripts/             # Text corpora
│   └── visual/                  # Image/video data
├── models/                    # Classifier implementations
│   ├── base_classifier.py    # Abstract base class
│   ├── classical/            # Traditional ML methods
│   │   ├── svm.py
│   │   ├── random_forest.py
│   │   └── logistic_regression.py
│   ├── neural/               # Neural network methods
│   │   ├── mlp.py
│   │   ├── cnn.py
│   │   ├── lstm.py
│   │   └── transformer.py
│   └── ensemble/             # Ensemble methods
│       ├── voting.py
│       └── stacking.py
├── utils/                     # Utility functions
│   ├── metrics.py            # Evaluation metrics
│   ├── visualization.py      # Plotting functions
│   └── io_utils.py           # I/O operations
├── experiments/              # Experiment management
│   ├── experiment_runner.py
│   └── hyperparameter_search.py
├── .gitignore              # Ignored files
├── docs/                   # Technical documentation
├── tests/                  # Unit and integration tests
└── LICENSE.md
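
models/base_classifier.py provides the abstract base class that the classical, neural, and ensemble implementations inherit from. The sketch below shows what such a contract typically looks like; the method names and signatures are assumptions, not a copy of the actual file.

    from abc import ABC, abstractmethod

    import numpy as np

    class BaseClassifier(ABC):
        """Sketch of the shared interface; method names are assumptions."""

        @abstractmethod
        def fit(self, X: np.ndarray, y: np.ndarray) -> "BaseClassifier":
            """Train the classifier on features X and labels y."""

        @abstractmethod
        def predict(self, X: np.ndarray) -> np.ndarray:
            """Return predicted labels for X."""

        def score(self, X: np.ndarray, y: np.ndarray) -> float:
            """Default accuracy score; subclasses may override."""
            return float((self.predict(X) == y).mean())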

🚀 Usage

Running the Pipeline

python src/main.py --modality all --config configs/default.yaml

Key Arguments

  • --modality: Choose audio, text, visual, or all
  • --config: Path to YAML configuration file (see the sketch below)
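
Internally, the entry point presumably parses these arguments and loads the YAML configuration before dispatching to the selected modality pipeline(s). The following is a hedged sketch of that flow, not the actual contents of main.py; the configuration keys are invented for illustration.

    import argparse

    import yaml  # PyYAML (assumed to be in requirements.txt)

    parser = argparse.ArgumentParser(description="Project Cere pipeline (sketch)")
    parser.add_argument("--modality", choices=["audio", "text", "visual", "all"], default="all")
    parser.add_argument("--config", default="configs/default.yaml", help="Path to YAML configuration file")
    args = parser.parse_args()

    with open(args.config) as f:
        config = yaml.safe_load(f)  # e.g. {"classifier": "svm", "cv_folds": 5}; keys are illustrative

    modalities = ["audio", "text", "visual"] if args.modality == "all" else [args.modality]
    for modality in modalities:
        print(f"Running {modality} pipeline with classifier={config.get('classifier')}")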

Jupyter Notebooks

jupyter lab notebooks/

Maintaining Forks

1. Start a New Feature

git checkout -b feature/your-feature-name  # e.g., feature/login-form

2. Commit & Push to Your Fork

git add .
git commit -m "Description of changes"
git push -u origin feature/your-feature-name

3. Sync with Upstream

git checkout main
git fetch upstream
git merge upstream/main  # Or use `git rebase upstream/main`
git push origin main

4. Update Your Feature Branch

git checkout feature/your-feature-name
git rebase main  # Apply your changes on top of latest updates
git push --force  # Only if you've rebased

5. Create a Pull Request (PR)

  1. Go to your fork on GitHub.
  2. Click "Compare & Pull Request" for your branch.
  3. Target marialagakos/AI4Good-MTL-Group-2:main as the base.

Key Rules for the Team

  • Never push directly to upstream (only PRs).
  • Always branch from main (no direct commits to main).
  • Rebase instead of merge to keep history clean (use git rebase main).

Troubleshooting

  • Permission denied?
    git remote set-url origin https://github.com/YOUR-USERNAME/AI4Good-MTL-Group-2.git
  • Broken branch?
    git checkout main
    git branch -D broken-branch

πŸ—ΊοΈ Project Roadmap

Phase          | Key Deliverables
Data Analysis  | EDA reports, preprocessing pipelines
Modeling       | Multimodal fusion architectures
Evaluation     | Cross-modal attention visualizations
Deployment     | Flask API for model serving
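
For the deployment phase, serving a trained model through a small Flask API could look roughly like the sketch below. This is purely illustrative of the planned direction; the /predict endpoint, payload format, and predict_fn placeholder are hypothetical.

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    def predict_fn(features):
        # Placeholder for a trained multimodal model; returns a dummy label.
        return "positive"

    @app.route("/predict", methods=["POST"])
    def predict():
        payload = request.get_json(force=True)  # e.g. {"features": [...]}
        label = predict_fn(payload.get("features", []))
        return jsonify({"prediction": label})

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=5000)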

👥 Team

📜 License

This project is licensed under the MIT License - see LICENSE.md for details.

πŸ™ Acknowledgments

We gratefully acknowledge:

  • Jennifer Addison and Yosra Kazemi for their expertise and leadership
  • The AI4Good Lab Montreal and Mila team for their support
  • Our TAs, Hugo Berard and Laetitia Constantin

Consulting Scholars and Mentors:

  • Rose Landry - Mila
  • Adel Halawa - McGill University
  • Dr. Lune Bellec - Université de Montréal
  • Dr. Mayada Elsabbagh - Transforming Autism Care Consortium
  • The Algonauts Project
  • Compute Canada for their computational resources
  • The Digital Research Alliance of Canada
