# PGMax
PGMax implements general factor graphs for discrete probabilistic graphical models (PGMs), and hardware-accelerated differentiable loopy belief propagation (LBP) in [JAX](https://jax.readthedocs.io/en/latest/).
- **General factor graphs**: PGMax supports easy specification of general factor graphs with potentially complicated topology, factor definitions, and discrete variables with a varying number of states.
- **LBP in JAX**: PGMax generates pure JAX functions implementing LBP for a given factor graph. The generated pure JAX functions run on modern accelerators (GPU/TPU), work with JAX transformations (e.g. `vmap` for processing batches of models/samples, `grad` for differentiating through the LBP iterative process), and can be easily used as part of a larger end-to-end differentiable system.
[**Installation**](#installation) | [**Getting started**](#getting-started)
## Installation
### Install from PyPI
By default the above commands install JAX for CPU. If you have access to a GPU, follow the official instructions [here](https://github.com/google/jax#pip-installation-gpu-cuda) to install JAX for GPU.
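
As a quick sanity check after installing, you can confirm which backend JAX actually picked up. This uses plain JAX calls, nothing PGMax-specific:

```python
# Prints the backend JAX selected ("cpu", "gpu", or "tpu") and the visible devices.
import jax

print(jax.default_backend())
print(jax.devices())
```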
## Getting Started
Here are a few self-contained Colab notebooks to help you get started with PGMax:
- [Tutorial on basic PGMax usage](https://colab.research.google.com/drive/1PQ9eVaOg336XzPqko-v_us3izEbjvWMW?usp=sharing)
- [Implementing max-product LBP](https://colab.research.google.com/drive/1mSffrA1WgQwgIiJQd2pLULPa5YKAOJOX?usp=sharing) for [Recursive Cortical Networks](https://www.science.org/doi/10.1126/science.aag2612)
- [End-to-end differentiable LBP for gradient-based PGM training](https://colab.research.google.com/drive/1yxDCLwhX0PVgFS7NHUcXG3ptMAY1CxMC?usp=sharing)
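
If you want a feel for the workflow before opening the notebooks, here is a minimal sketch of building a small model and running max-product LBP. The module layout (`vgroup`/`fgraph`/`fgroup`/`infer`) and call signatures below follow recent `pgmax` releases and have changed across versions, so treat the names as assumptions rather than a stable API reference:

```python
# Minimal sketch; module paths and signatures assumed from recent pgmax releases.
import numpy as np
from pgmax import fgraph, fgroup, infer, vgroup

# A 3x3 grid of binary variables.
variables = vgroup.NDVarArray(num_states=2, shape=(3, 3))
fg = fgraph.FactorGraph(variable_groups=variables)

# Pairwise factors between horizontal neighbors that reward agreement.
variables_for_factors = [
    [variables[i, j], variables[i, j + 1]] for i in range(3) for j in range(2)
]
fg.add_factors(
    fgroup.PairwiseFactorGroup(
        variables_for_factors=variables_for_factors,
        log_potential_matrix=np.array([[1.0, -1.0], [-1.0, 1.0]]),
    )
)

# Max-product LBP (temperature=0.0), then MAP decoding of the beliefs.
bp = infer.BP(fg.bp_state, temperature=0.0)
bp_arrays = bp.run_bp(bp.init(), num_iters=50, damping=0.5)
map_states = infer.decode_map_states(bp.get_beliefs(bp_arrays))[variables]
```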
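Because the generated LBP runner is a pure JAX computation, it also composes with transformations like `grad` and `vmap`. The sketch below (same assumed API as above, with a placeholder objective) differentiates a loss on the beliefs with respect to the evidence:

```python
# Differentiating through LBP; same assumed pgmax API as the sketch above.
import jax
import jax.numpy as jnp

def loss_fn(evidence):
    # Feed per-variable evidence, run LBP, and score the resulting beliefs.
    bp_arrays = bp.init(evidence_updates={variables: evidence})
    bp_arrays = bp.run_bp(bp_arrays, num_iters=50, damping=0.5)
    beliefs = bp.get_beliefs(bp_arrays)[variables]
    return jnp.sum(beliefs ** 2)  # placeholder objective, not a recommended loss

evidence_grads = jax.grad(loss_fn)(jnp.zeros((3, 3, 2)))
```

In the same way, `jax.vmap` over `loss_fn` processes a batch of evidence arrays in a single call.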
## Citing PGMax
To cite this repository:
    author = {Guangyao Zhou* and Nishanth Kumar* and Miguel L\'{a}zaro-Gredilla and Dileep George},
    title = {{PGMax}: {F}actor graph on discrete variables and hardware-accelerated differentiable loopy belief propagation in {JAX}},