
Commit 263b4ec: Update README.md
1 parent cd3b530

1 file changed: +19 -22 lines

README.md

Lines changed: 19 additions & 22 deletions
@@ -1,14 +1,16 @@
-<h1 style="margin: 0 0 0.35 rem 0; line-height: 1.1;">MixFlow</h1>
+<h1 style="margin: 0 0 0.35rem 0; line-height: 1.1;">MixFlow</h1>
 
-[![License: MIT](https://img.shields.io/badge/License-MIT-green.svg)](./LICENSE)
+[![Code License: MIT](https://img.shields.io/badge/Code%20License-MIT-green.svg)](./LICENSE)
+[![arXiv](https://img.shields.io/badge/arXiv-2601.11827v2-b31b1b.svg)](https://arxiv.org/abs/2601.11827)
 [![Python](https://img.shields.io/badge/Python-3.10%2B-blue.svg)](#)
 [![PyTorch](https://img.shields.io/badge/PyTorch-2.x-EE4C2C.svg)](#)
 [![Project status](https://img.shields.io/badge/Status-Research%20code-informational.svg)](#)
 [![Docs](https://img.shields.io/badge/Docs-local-blueviolet.svg)](./docs/index.md)
 
-Mixture-Conditioned Flow Matching for Out-of-Distribution Generalization.
+**Shortest-Path Flow Matching with Mixture-Conditioned Bases for OOD Generalization to Unseen Conditions.**
 
 ---
+
 <table>
 <tr>
 <td align="center">
@@ -19,48 +21,43 @@ Mixture-Conditioned Flow Matching for Out-of-Distribution Generalization.
 <td align="center">
 <img src="./docs/figs/manif_ptflow_B.png" width="100%">
 <br>
-<b>(B)</b> MixFlow
+<b>(B)</b> SP-FM
 </td>
 </tr>
 </table>
 
 ## Overview
 
-MixFlow is a conditional flow-matching framework for descriptor-controlled generation. Instead of relying on a single Gaussian base distribution, MixFlow learns a mixture base and a descriptor-conditioned flow jointly, trained via shortest-path flow matching. This joint modeling is designed to extrapolate smoothly to unseen conditions and improve out-of-distribution generalization across tasks.
+This repository contains research code for **shortest-path flow matching** with **descriptor-conditioned mixture bases** for descriptor-controlled generation.
+
+Instead of relying on a single Gaussian base distribution, the method learns a **condition-dependent mixture base** jointly with a **descriptor-conditioned flow field**, trained via shortest-path (optimal transport) flow matching. Conditioning the base enables the model to adapt its starting distribution across conditions, improving **out-of-distribution (OOD) generalization** to unseen conditions.
 
 ## Publication
 
-This project is based on the **MixFlow** manuscript.
+This repository accompanies the arXiv manuscript:
 
-- **Title:** MixFlow: Mixture-Conditioned Flow Matching for Out-of-Distribution Generalization
-- **Authors:** Andrea Rubbi, Amir Akbarnejad, Mohammad Vali Sanian, Aryan Yazdan Parast, Hesam Asadollahzadeh, Arian Amani, Naveed Akhtar, Sarah Cooper, Andrew Bassett, Lassi Paavolainen, Pietro Liò, Sattar Vakili, Mo Lotfollahi
-- **Link:** _TODO_
+- **Title:** *Shortest-Path Flow Matching with Mixture-Conditioned Bases for OOD Generalization to Unseen Conditions*
+- **arXiv:** 2601.11827v2 \[cs.LG\] (11 Feb 2026)
+- **Paper link:** https://arxiv.org/html/2601.11827v2
 
 ## Datasets
 
 ### Synthetic Data
 
-We construct a synthetic benchmark of letter populations, where each condition corresponds to a letter and a specific rotation. Each descriptor encodes the letter identity and rotation, and MixFlow learns a mixture base distribution per condition. This setup allows us to test extrapolation to unseen letters and rotation angles.
+We construct a synthetic benchmark of letter populations, where each condition corresponds to a letter and a specific rotation. Each descriptor encodes the letter identity and rotation, and the model learns a mixture base distribution per condition. This setup allows us to test extrapolation to unseen letters and rotation angles.
 
 ### Morphological Perturbations
 
-We evaluate MixFlow on high-content imaging data in feature space. Cells (from BBBC021 and RxRx1) are embedded with a vision backbone, and the model is trained to generate unseen phenotypic responses from compound descriptors alone.
+We evaluate on high-content imaging data in feature space. Cells (from BBBC021 and RxRx1) are embedded with a vision backbone, and the model is trained to generate unseen phenotypic responses from compound descriptors alone.
 
 ### Perturbation Datasets
 
-For transcriptomic perturbations, we use Chemical- or CRISPR-based single-cell datasets (Norman, Combosciplex, Replogle and iAstrocytes). Conditions correspond to perturbations' embeddings from pretrained models, and MixFlow is trained to model the distribution of perturbed cells.
+For transcriptomic perturbations, we use chemical- or CRISPR-based single-cell datasets (Norman, ComboSciPlex, Replogle, and iAstrocytes). Conditions correspond to perturbation embeddings from pretrained models, and the model is trained to capture the distribution of perturbed cells.
 
 ## Documentation
 
-Check the <a href="./docs/index.md"> documentation </a> for more information about how to use the model and get the data.
-
-## License
-
-This work is released with the MIT license, please see <a href="./LICENSE"> the license file </a> for more information.
-
-## Authors
+Check the <a href="./docs/index.md">documentation</a> for more information about how to use the model and get the data.
 
-Andrea Rubbi, Amir Akbarnejad, Mohammad Vali Sanian, Aryan Yazdan Parast, Hesam Asadollahzadeh, Arian Amani, Naveed Akhtar, Sarah Cooper,
-Andrew Bassett, Pietro Liò, Lassi Paavolainen, Sattar Vakili,
-Mo Lotfollahi
+## License
 
+This work is released under the MIT license; please see <a href="./LICENSE">the license file</a> for more information.
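
The updated Overview above describes learning a descriptor-conditioned mixture base jointly with a descriptor-conditioned velocity field, trained by shortest-path flow matching. The following is a minimal PyTorch sketch of that training setup, not the repository's implementation: the class and function names (`CondMixtureBase`, `CondVectorField`, `fm_step`), the network architectures, and the plain straight-line interpolant standing in for the full shortest-path/optimal-transport coupling are all assumptions made for illustration, and joint training of the mixture weights is omitted.

```python
import torch
import torch.nn as nn

class CondMixtureBase(nn.Module):
    """Hypothetical descriptor-conditioned Gaussian-mixture base p0(x | d)."""
    def __init__(self, d_desc: int, dim: int, n_comp: int = 8):
        super().__init__()
        self.n_comp, self.dim = n_comp, dim
        # Predict mixture logits, means, and log-stds from the descriptor.
        self.net = nn.Sequential(nn.Linear(d_desc, 128), nn.SiLU(),
                                 nn.Linear(128, n_comp * (1 + 2 * dim)))

    def sample(self, desc: torch.Tensor) -> torch.Tensor:
        B = desc.shape[0]
        p = self.net(desc).view(B, self.n_comp, 1 + 2 * self.dim)
        logits = p[..., 0]
        mu, log_std = p[..., 1:1 + self.dim], p[..., 1 + self.dim:]
        k = torch.distributions.Categorical(logits=logits).sample()   # one component per sample
        idx = k[:, None, None].expand(-1, 1, self.dim)
        mu_k = mu.gather(1, idx).squeeze(1)
        std_k = log_std.gather(1, idx).squeeze(1).exp()
        # Note: the component draw is non-differentiable; the actual method trains
        # the base jointly, which this sketch does not reproduce.
        return mu_k + std_k * torch.randn_like(mu_k)

class CondVectorField(nn.Module):
    """Velocity field v_theta(x_t, t, d) conditioned on time and descriptor."""
    def __init__(self, dim: int, d_desc: int, hidden: int = 256):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim + 1 + d_desc, hidden), nn.SiLU(),
                                 nn.Linear(hidden, hidden), nn.SiLU(),
                                 nn.Linear(hidden, dim))

    def forward(self, x, t, desc):
        return self.net(torch.cat([x, t, desc], dim=-1))

def fm_step(base: CondMixtureBase, vf: CondVectorField,
            x1: torch.Tensor, desc: torch.Tensor) -> torch.Tensor:
    """One flow-matching loss on straight paths x_t = (1 - t) * x0 + t * x1."""
    x0 = base.sample(desc)                           # start from the conditional mixture base
    t = torch.rand(x1.shape[0], 1, device=x1.device)
    xt = (1 - t) * x0 + t * x1                       # linear (shortest-path) interpolant
    target = x1 - x0                                 # constant velocity along the straight line
    return ((vf(xt, t, desc) - target) ** 2).mean()
```

At generation time, one would draw x0 from the conditional base for a new descriptor and integrate the learned velocity field from t = 0 to t = 1 with an ODE solver.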

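The synthetic benchmark in the diff above conditions each letter population on a letter identity and a rotation angle. Purely as an illustration, and not the descriptor format actually used by the repository, a descriptor could concatenate a one-hot letter encoding with a sin/cos encoding of the angle, which keeps nearby rotations close in descriptor space and makes extrapolation to unseen angles meaningful.

```python
import string
import torch

def letter_rotation_descriptor(letter: str, angle_deg: float) -> torch.Tensor:
    """Hypothetical descriptor: one-hot letter identity + sin/cos-encoded rotation."""
    letters = string.ascii_uppercase
    one_hot = torch.zeros(len(letters))
    one_hot[letters.index(letter.upper())] = 1.0
    theta = torch.deg2rad(torch.tensor(angle_deg))
    rot = torch.stack([torch.sin(theta), torch.cos(theta)])
    return torch.cat([one_hot, rot])                 # shape: (26 + 2,)

# Example: the condition "letter R rotated by 30 degrees"
desc = letter_rotation_descriptor("R", 30.0)
```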