Releases: AI-Hypercomputer/maxtext
Recipe Branch for TPU performance results
Merge pull request #2539 from AI-Hypercomputer:qinwen/latest-tokamax PiperOrigin-RevId: 823749360
maxtext-tutorial-v1.0.0
Merge pull request #2538 from AI-Hypercomputer:mohit/fix_docker PiperOrigin-RevId: 822796389
tpu-recipes-v0.1.5
Use this release for tpu-recipes that require version tpu-recipes-v0.1.5
maxtext-v0.1.0
Our first MaxText PyPI package is here! MaxText is a high-performance, highly scalable, open-source LLM library and reference implementation, written in pure Python/JAX and targeting Google Cloud TPUs and GPUs for training. We are excited to make it easier than ever to get started.
Users can now install MaxText through pip, both for local development and through stable PyPI builds. Please see our MaxText Installation Guide for more setup details.
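As a minimal sketch of the two install paths mentioned above (assuming the PyPI package is published under the name `maxtext`; see the MaxText Installation Guide for the authoritative steps):

```shell
# Stable build from PyPI (package name "maxtext" is an assumption here)
pip install maxtext

# Or, for local development, install from a source checkout in editable mode
git clone https://github.com/AI-Hypercomputer/maxtext.git
cd maxtext
pip install -e .
```

Editable mode (`-e`) makes local source changes take effect without reinstalling, which suits the development workflow; the plain `pip install` path suits users who only want the stable release.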
Going forward, this page will document notable changes as we release new versions of MaxText.
tpu-recipes-v0.1.4
Use this release for tpu-recipes that require version tpu-recipes-v0.1.4
pre-nnx-v0.1.0
Use this release for the latest MaxText version that fully depends on Flax Linen (no NNX).
tpu-recipes-v0.1.3
Use this release for tpu-recipes that require version tpu-recipes-v0.1.3
tpu-recipes-v0.1.2
Use this release for tpu-recipes that require version tpu-recipes-v0.1.2
tpu-recipes-v0.1.1
Use this release for tpu-recipes that require version tpu-recipes-v0.1.1
pre-module-v0.1.0
Release prior to module refactor for older train API.
With this release and earlier: python3 MaxText/train.py MaxText/configs/base.yml run_name=...
With releases after this one: python3 -m MaxText.train MaxText/configs/base.yml run_name=...