This directory contains research notebooks, educational materials, and autonomous research setups from various open-source projects.
Source: rasbt/reasoning-from-scratch
This repository contains the code for developing an LLM reasoning model and is the official code repository for the book Build a Reasoning Model (From Scratch).
Notebooks:
- reasoning-from-scratch/01_main-chapter-code/ch02_main.ipynb - Chapter 2 main code
- reasoning-from-scratch/01_main-chapter-code/ch02_exercise-solutions.ipynb - Chapter 2 exercise solutions
- reasoning-from-scratch/01_main-chapter-code/ch03_main.ipynb - Chapter 3 main code
- reasoning-from-scratch/01_main-chapter-code/ch03_exercise-solutions.ipynb - Chapter 3 exercise solutions
- reasoning-from-scratch/01_main-chapter-code/ch04_main.ipynb - Chapter 4 main code
- reasoning-from-scratch/01_main-chapter-code/ch04_exercise-solutions.ipynb - Chapter 4 exercise solutions
- reasoning-from-scratch/01_main-chapter-code/ch05_main.ipynb - Chapter 5 main code
- reasoning-from-scratch/01_main-chapter-code/ch05_exercise-solutions.ipynb - Chapter 5 exercise solutions
- reasoning-from-scratch/01_main-chapter-code/chC_main.ipynb - Appendix C main code
- reasoning-from-scratch/01_main-chapter-code/chF_main.ipynb - Appendix F main code
Source: karpathy/nanochat
A full-stack implementation of an LLM like ChatGPT in a single, clean, minimal, hackable, dependency-lite codebase. Designed to run on a single 8XH100 node, including tokenization, pretraining, finetuning, evaluation, inference, and web serving.
Notebooks:
- nanochat/dev/estimate_gpt3_core.ipynb - GPT-3 core estimation analysis
- nanochat/dev/scaling_analysis.ipynb - Scaling analysis notebook
Source: karpathy/autoresearch
AI agents running research on single-GPU nanochat training automatically. An agent is given a small but real LLM training setup and experiments autonomously: it modifies the code, trains for a fixed 5-minute budget, checks if the result improved (validation bits per byte), keeps or discards, and repeats. The human configures the agent via program.md (lightweight “skill” / instructions); the agent only edits train.py (model, optimizer, training loop). Self-contained: one GPU, one editable file, one metric.
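The improvement metric mentioned above, validation bits per byte, can be sketched as follows. This is a minimal illustration of the conversion, not the repository's actual evaluation code; the function name and arguments are hypothetical:

```python
import math

def bits_per_byte(total_loss_nats: float, total_bytes: int) -> float:
    """Convert a summed validation cross-entropy (in nats) into bits per
    byte: divide by ln(2) to convert nats to bits, then normalize by the
    number of raw bytes the validation text occupies. Lower is better,
    and the measure is tokenizer-independent."""
    return total_loss_nats / math.log(2) / total_bytes
```

Normalizing by bytes rather than tokens is what makes the metric comparable across runs even if an experiment changes the tokenization.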
Key files are mirrored in this repo under research/autoresearch/ for local use.
Key files:
- autoresearch/README.md - Local mirror overview and quick start
- autoresearch/program.md - Agent instructions (human-edited)
- autoresearch/train.py - Model, optimizer, training loop (agent-edited)
- autoresearch/prepare.py - Data prep and runtime utilities (fixed, do not modify)
- autoresearch/pyproject.toml - Dependencies (uv)
- autoresearch/analysis.ipynb - Analyze results.tsv and plot progress
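The keep-or-discard loop described above can be sketched in a few lines. This is a simplified, hypothetical rendering of the control flow, not the agent's actual implementation; `propose_edit` and `run_training` stand in for the agent's code edits and the budgeted training run:

```python
def experiment_loop(propose_edit, run_training, n_trials):
    """Keep-or-discard experimentation: try one edit per trial and keep
    it only if validation bits per byte improves.

    propose_edit() applies a candidate change and returns an undo
    callable; run_training() runs under the fixed time budget and
    returns the resulting validation bits per byte (lower is better).
    """
    best = run_training()          # baseline run with unmodified code
    history = [best]
    for _ in range(n_trials):
        undo = propose_edit()      # agent modifies the training setup
        bpb = run_training()       # budgeted run with the edit applied
        if bpb < best:
            best = bpb             # improvement: keep the change
        else:
            undo()                 # regression: revert to the snapshot
        history.append(best)
    return best, history
```

In the real setup the "undo" corresponds to restoring train.py from a snapshot, and the metric history lands in results.tsv for analysis.ipynb to plot.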
Source: nlp-with-transformers/notebooks
Companion notebooks for the book "Natural Language Processing with Transformers". Practical examples covering classification, NER, text generation, summarization, question answering, and more.
Notebooks:
- nlp-with-transformers/01_introduction.ipynb - Introduction to transformers
- nlp-with-transformers/02_classification.ipynb - Text classification
- nlp-with-transformers/03_transformer-anatomy.ipynb - Transformer architecture deep dive
- nlp-with-transformers/04_multilingual-ner.ipynb - Multilingual named entity recognition
- nlp-with-transformers/05_text-generation.ipynb - Text generation with transformers
- nlp-with-transformers/06_summarization.ipynb - Text summarization
- nlp-with-transformers/07_question-answering.ipynb - Question answering
- nlp-with-transformers/07_question_answering_v2.ipynb - Question answering (v2)
- nlp-with-transformers/08_model-compression.ipynb - Model compression techniques
- nlp-with-transformers/09_few-to-no-labels.ipynb - Few-shot and zero-shot learning
- nlp-with-transformers/10_transformers-from-scratch.ipynb - Building transformers from scratch
- nlp-with-transformers/11_future-directions.ipynb - Future directions in NLP
- nlp-with-transformers/SageMaker/01_introduction.ipynb - SageMaker introduction
- nlp-with-transformers/SageMaker/02_classification.ipynb - SageMaker classification
Source: karpathy/nn-zero-to-hero
Neural Networks: Zero to Hero - A course on neural networks from scratch. Learn how neural networks work by building them from the ground up.
Notebooks:
- nn-zero-to-hero/lectures/makemore/makemore_part1_bigrams.ipynb - Bigram language model
- nn-zero-to-hero/lectures/makemore/makemore_part2_mlp.ipynb - Multi-layer perceptron
- nn-zero-to-hero/lectures/makemore/makemore_part3_bn.ipynb - Batch normalization
- nn-zero-to-hero/lectures/makemore/makemore_part4_backprop.ipynb - Backpropagation
- nn-zero-to-hero/lectures/makemore/makemore_part5_cnn1.ipynb - Convolutional neural networks
- nn-zero-to-hero/lectures/micrograd/micrograd_lecture_first_half_roughly.ipynb - Micrograd first half
- nn-zero-to-hero/lectures/micrograd/micrograd_lecture_second_half_roughly.ipynb - Micrograd second half
Source: dair-ai/ML-Notebooks
Minimal and clean examples of machine learning algorithms and implementations. Well-organized notebooks covering fundamental ML concepts from linear regression to neural networks.
Notebooks:
- ML-Notebooks/notebooks/linear_regression.ipynb - Linear regression
- ML-Notebooks/notebooks/logistic_regression.ipynb - Logistic regression
- ML-Notebooks/notebooks/concise_log_reg.ipynb - Concise logistic regression
- ML-Notebooks/notebooks/first_nn.ipynb - First neural network
- ML-Notebooks/notebooks/nn_from_scratch.ipynb - Neural network from scratch
- ML-Notebooks/notebooks/bow.ipynb - Bag of words
- ML-Notebooks/notebooks/bow-dataloader.ipynb - Bag of words with dataloader
- ML-Notebooks/notebooks/cbow.ipynb - Continuous bag of words
- ML-Notebooks/notebooks/deep_cbow.ipynb - Deep continuous bag of words
- ML-Notebooks/notebooks/loglin-lm.ipynb - Log-linear language model
- ML-Notebooks/notebooks/loglin-lm-dataloader.ipynb - Log-linear language model with dataloader
- ML-Notebooks/notebooks/nn-lm.ipynb - Neural network language model
- ML-Notebooks/notebooks/nn-lm-batch.ipynb - Neural network language model with batching
- ML-Notebooks/notebooks/comp_graphs.ipynb - Computational graphs
- ML-Notebooks/notebooks/intro_gnn.ipynb - Introduction to graph neural networks
- ML-Notebooks/notebooks/pytorch_hello_world.ipynb - PyTorch hello world
- ML-Notebooks/notebooks/pytorch_gentle_intro.ipynb - PyTorch gentle introduction
- ML-Notebooks/notebooks/maths/algebra.ipynb - Linear algebra fundamentals
- ML-Notebooks/notebooks/maths/mean.ipynb - Mean calculation
- ML-Notebooks/notebooks/maths/feature-scaling.ipynb - Feature scaling
Source: udlbook/udlbook
Companion notebooks and book PDF for "Understanding Deep Learning" by Simon J.D. Prince. This directory contains a comprehensive collection of 68 Jupyter notebooks that accompany the book, covering deep learning from fundamentals to advanced topics.
Content:
- 68 notebooks organized by chapter (Chapters 1-21)
- Book PDF: UnderstandingDeepLearning_05_29_25_C.pdf (21 MB)
- Topics include: background mathematics, supervised learning, shallow and deep networks, loss functions, optimization, backpropagation, generalization, regularization, convolutional networks, residual networks, transformers, graph neural networks, GANs, normalizing flows, variational autoencoders, diffusion models, reinforcement learning, and ethics/explainability
- Each notebook provides hands-on code examples and practical implementations of concepts from the book
Notebooks: See understanding-deep-learning/README.md for the complete list of all 68 notebooks organized by chapter.
Notebook:
- Deepseek_OCR_(3B).ipynb - DeepSeek-OCR model notebook (3B parameters)
This content was automatically fetched from the original repositories. For the most up-to-date versions, please refer to the source repositories.