This repository was archived by the owner on Dec 5, 2024. It is now read-only.

Commit e580b6d

Release 0.2.2 (#110)
1 parent 09f57aa commit e580b6d

File tree

4 files changed: +376 −362 lines changed

README.md

Lines changed: 17 additions & 4 deletions
````diff
@@ -6,11 +6,14 @@
 
 # PGMax
 
-PGMax implements general factor graphs for probabilistic graphical models (PGMs) with discrete variables, and hardware-accelerated differentiable loopy belief propagation (LBP) in [JAX](https://jax.readthedocs.io/en/latest/).
+PGMax implements general factor graphs for discrete probabilistic graphical models (PGMs), and hardware-accelerated differentiable loopy belief propagation (LBP) in [JAX](https://jax.readthedocs.io/en/latest/).
 
-- **General factor graphs**: PGMax goes beyond pairwise PGMs, and supports arbitrary factor graph topology, including higher-order factors.
+- **General factor graphs**: PGMax supports easy specification of general factor graphs with potentially complicated topology, factor definitions, and discrete variables with a varying number of states.
 - **LBP in JAX**: PGMax generates pure JAX functions implementing LBP for a given factor graph. The generated pure JAX functions run on modern accelerators (GPU/TPU), work with JAX transformations (e.g. `vmap` for processing batches of models/samples, `grad` for differentiating through the LBP iterative process), and can be easily used as part of a larger end-to-end differentiable system.
 
+[**Installation**](#installation)
+| [**Getting started**](#getting-started)
+
 ## Installation
 
 ### Install from PyPI
````
````diff
@@ -37,6 +40,16 @@ pre-commit install
 
 By default the above commands install JAX for CPU. If you have access to a GPU, follow the official instructions [here](https://github.com/google/jax#pip-installation-gpu-cuda) to install JAX for GPU.
 
+## Getting Started
+
+Here are a few self-contained Colab notebooks to help you get started on using PGMax:
+
+- [Tutorial on basic PGMax usage](https://colab.research.google.com/drive/1PQ9eVaOg336XzPqko-v_us3izEbjvWMW?usp=sharing)
+- [Implementing max-product LBP](https://colab.research.google.com/drive/1mSffrA1WgQwgIiJQd2pLULPa5YKAOJOX?usp=sharing) for [Recursive Cortical Networks](https://www.science.org/doi/10.1126/science.aag2612)
+- [End-to-end differentiable LBP for gradient-based PGM training](https://colab.research.google.com/drive/1yxDCLwhX0PVgFS7NHUcXG3ptMAY1CxMC?usp=sharing)
+
+
+
 ## Citing PGMax
 
 To cite this repository
@@ -45,8 +58,8 @@ To cite this repository
   author = {Guangyao Zhou* and Nishanth Kumar* and Miguel L\'{a}zaro-Gredilla and Dileep George},
   title = {{PGMax}: {F}actor graph on discrete variables and hardware-accelerated differentiable loopy belief propagation in {JAX}},
   howpublished={\url{http://github.com/vicariousinc/PGMax}},
-  version = {0.2.1},
-  year = {2021},
+  version = {0.2.2},
+  year = {2022},
 }
 ```
 where * indicates equal contribution.
````
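The "LBP in JAX" bullet changed above promises that the generated pure JAX functions compose with `vmap` and `grad`. As a hedged illustration of that composition pattern only (generic JAX with a stand-in iterative update, not the actual PGMax API), a pure function that runs a fixed number of update iterations can be batched and differentiated through like this:

```python
import jax
import jax.numpy as jnp


def run_iterations(params, x, num_iters=10):
    # Stand-in for a pure JAX iterative process (analogous to LBP
    # message updates): repeatedly apply a damped update, then
    # return a scalar readout of the final state.
    def step(carry, _):
        carry = 0.5 * carry + 0.5 * jnp.tanh(params * x + carry)
        return carry, None

    final, _ = jax.lax.scan(step, jnp.zeros_like(x), None, length=num_iters)
    return jnp.sum(final)


params = 0.3
xs = jnp.linspace(-1.0, 1.0, 4)

# vmap: run the same iterative process over a batch of inputs.
batched = jax.vmap(run_iterations, in_axes=(None, 0))(params, xs[:, None])

# grad: differentiate through all iterations w.r.t. params,
# i.e. "differentiating through the LBP iterative process".
g = jax.grad(run_iterations)(params, xs)
print(batched.shape, g)
```

The same shape of code applies when the iterated function is a generated LBP step: because it is pure, `vmap` and `grad` wrap it without modification.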

docs/source/conf.py

Lines changed: 1 addition & 1 deletion
````diff
@@ -26,7 +26,7 @@
 )
 
 # The full version, including alpha/beta/rc tags
-release = "0.2.1"
+release = "0.2.2"
 
 
 # -- General configuration ---------------------------------------------------
````

0 commit comments
