💻 Developed by BioMind AI Lab @ CUNY
📚 Comprehensive 15-week curriculum designed for aspiring ML practitioners
🛠️ Hands-on coding with Python, scikit-learn, PyTorch, and TensorFlow
💻 Jupyter Notebooks + Google Colab support for seamless experimentation
📊 Real-world datasets (UCI ML, TCIA, Kaggle) to bridge theory and practice
🎥 Curated video resources: handpicked free videos that visually explain key concepts (not my own recordings, but complementary to the course)
✨ Engaging explanations & interactive coding: text sections shaped by core ML concepts and real student questions, designed to reinforce understanding before you code
🤝 Collaborative discussions & knowledge-sharing
```bash
git clone https://github.com/PKhosravi-CityTech/ML15AI-CUNY.git
cd ML15AI-CUNY
```

To run the notebooks interactively without installation, open them in Google Colab:
🟠 Click the Colab badge inside each week's folder to open the corresponding notebooks.
🟡 Colab allows you to execute code without local setup.
🟢 You can read the theoretical explanations inside the Jupyter Notebooks.
🟣 For a more visual explanation of each topic, watch the accompanying videos.
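
If you prefer to run the notebooks locally, the short sketch below (an optional illustration, not part of the course notebooks) checks that the core libraries used throughout the course are importable in your environment:

```python
# Optional environment check (illustrative sketch, not part of the course notebooks):
# reports which of the core libraries used in the notebooks are importable locally.
import importlib

for name in ["numpy", "sklearn", "torch", "tensorflow"]:
    try:
        module = importlib.import_module(name)
        version = getattr(module, "__version__", "unknown")
        print(f"{name:<12} OK  (version {version})")
    except ImportError:
        print(f"{name:<12} MISSING - install it before running the related notebooks")
```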
Want to contribute? We welcome bug fixes, notebook improvements, dataset suggestions, and discussions!
- Fork this repository
- Create a new branch (`git checkout -b feature-branch`)
- Make your changes (improve notebooks, update README, etc.)
- Commit and push (`git push origin feature-branch`)
- Open a Pull Request
💡 Ask questions, share ideas, and collaborate with other students. Have suggestions?
🗣 Join the Discussion on GitHub
📅 Click to Expand 15-Week Course Structure
| Week | Topic | Key Concepts & Hands-On |
|---|---|---|
| 🟠 Week 01 | 🔥 Introduction to ML & AI | Types of ML (Supervised, Unsupervised, RL), AI vs. ML, Learning Paradigms |
| 🟡 Week 02 | 📊 Regression Techniques | Linear & Logistic Regression, Loss Functions, Bias-Variance Tradeoff |
| 🟢 Week 03 | 🌲 Decision Trees & Ensemble Methods | CART, Random Forests, Gradient Boosting (XGBoost, LightGBM) |
| 🔵 Week 04 | 🎯 SVMs & Kernel Methods | Large Margin Classifiers, Soft/Hard Margins, Kernel Trick |
| 🟣 Week 05 | 🔗 Clustering & Unsupervised Learning | K-Means, Hierarchical Clustering, DBSCAN, Silhouette Score |
| 🔴 Week 06 | ✂️ Dimensionality Reduction & Visualization | PCA, LDA, t-SNE, UMAP, Feature Selection vs. Extraction |
| ⚫ Week 07 | 📌 Probabilistic & Bayesian Learning | Naive Bayes, Gaussian Mixture Models, Bayesian Inference |
| ⚪ Week 08 | 🧠 Neural Networks Foundations | Perceptron, Feedforward Networks, Backpropagation, PyTorch Implementation |
| 🏆 Week 09 | 🖼️ Deep Learning for Vision (CNNs) | Convolutions, Pooling, Architectures (LeNet, ResNet), Transfer Learning |
| 🎙 Week 10 | 📖 NLP Fundamentals & Attention | Word Embeddings (Word2Vec, GloVe), RNNs, Seq2Seq, Basic Attention Mechanism |
| 🤖 Week 11 | 🚀 Transformers & Large Language Models (LLMs) | Self-Attention, BERT, GPT, Fine-tuning, Prompt Engineering |
| 🎮 Week 12 | 🕹 Reinforcement Learning | Markov Decision Processes, Q-Learning, Policy Gradients, OpenAI Gym |
| 🌐 Week 13 | 🔍 Explainable AI & Ethics | SHAP, LIME, Grad-CAM, Model Bias, Fairness, Accountability, AI Policy |
| 🧠 Week 14 | 🤝 Multimodal & Foundation Models | CLIP, DALL·E, Multimodal Transformers, Cross-modal Representations |
| 🎓 Week 15 | 📚 Research Trends & Capstone Projects | Student Presentations, Recent Advances (AutoML, Federated Learning, Causal Inference, Continual Learning), VC Theory & Generalization Bounds |
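
To give a taste of the hands-on style used throughout the weeks, here is a minimal sketch (illustrative only, not one of the course notebooks) that trains and evaluates a logistic regression classifier with scikit-learn, in the spirit of the Week 02 material:

```python
# Minimal illustrative sketch (not a course notebook): logistic regression with
# scikit-learn, in the spirit of the Week 02 material on regression techniques.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Load a small built-in tabular dataset (originally from the UCI ML repository).
X, y = load_breast_cancer(return_X_y=True)

# Hold out 20% of the samples for evaluation, preserving class balance.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# Fit a logistic regression classifier; max_iter is raised so the solver
# converges on this unscaled dataset.
model = LogisticRegression(max_iter=5000)
model.fit(X_train, y_train)

# Evaluate on the held-out split.
accuracy = accuracy_score(y_test, model.predict(X_test))
print(f"Test accuracy: {accuracy:.3f}")
```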