Generation of Labeled Leaf Point Clouds for Plants Trait Estimation (Plant Phenomics, 2025)

Paper   •   Contact Us

Setup

Build docker image:

make build

You may need to install the NVIDIA Container Toolkit so that Docker can access the GPUs.

Data Samples and Weights

We provide one leaf point cloud together with its extracted skeleton. The same leaf-skeleton pair is replicated inside the data/val and data/train folders; this is needed because the distribution losses require more than one sample to work.

With this leaf, one can test the data loading, the training, and the testing with the provided network weights. To perform a full training, download the BonnBeetClouds or the Pheno4D dataset and extract the leaf skeletons.

We also provide the checkpoints needed to compute the generative metrics that we use in the paper.

Option 1: Manual download

Download the data here: data.zip,

the weights here: best.ckpt,

the checkpoint for PointNet here: pointnet_on_single_view.pth,

and the checkpoint for 3D+CLIP embeddings here: pointmlp_8k.pth.

Unzip data.zip, copy the data folder and the weights into the main folder, and copy the checkpoints into the src/metrics folder.
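After copying, the repository should look roughly like this (the exact tree is an assumption pieced together from the paths mentioned above, not a guaranteed layout):

```
3DLeafLabGen/
├── best.ckpt
├── data/
│   ├── train/
│   ├── val/
│   └── test/
└── src/
    └── metrics/
        ├── pointnet_on_single_view.pth
        └── pointmlp_8k.pth
```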

Option 2: Automated download

Execute

make download

Test the generation

You can test the point cloud generation by running

make generate

How to train the network

You can train the network by running

make train

By default, this will use the leaves in data, which contains three folders: train, val, and test.

How to Compute Metrics

You can compute the metrics between the leaves in data and the generated leaves (you need to generate the samples first!) by running

make compute_gen_metrics

By default, this will compare up to 1,000 leaves, using the 3D+CLIP model for all three metrics: FID, CMMD, and improved Precision and Recall. The realism threshold for precision and recall is set to 0.5.

All these parameters can be changed or passed directly to compute_generative_metrics.py.
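To give an intuition for one of these metrics: CMMD is built on the Maximum Mean Discrepancy (MMD) with a Gaussian kernel, computed on embedding vectors. Below is a minimal, illustrative NumPy sketch of a squared-MMD estimate; it uses random vectors as stand-ins for the 3D+CLIP embeddings and is not the exact estimator or bandwidth used by the scripts in this repository.

```python
import numpy as np

def rbf_mmd2(x, y, sigma=1.0):
    """Squared MMD with a Gaussian RBF kernel (the distance behind CMMD).

    MMD^2 = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)]
    """
    def k(a, b):
        # Pairwise squared distances via broadcasting: (n, m) matrix.
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2.0 * k(x, y).mean()

rng = np.random.default_rng(0)
real = rng.normal(size=(100, 8))          # stand-in for real-leaf embeddings
gen = rng.normal(loc=0.5, size=(100, 8))  # stand-in for generated-leaf embeddings

print(rbf_mmd2(real, real))  # -> 0.0 for identical sets
print(rbf_mmd2(real, gen))   # strictly positive for differing distributions
```

A smaller value means the generated embedding distribution is closer to the real one; identical sets give exactly zero with this (biased) estimator.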

Efficient Target Computation

We also provide a script to compute and save the embeddings extracted from the training data, so that they can be used during training to compute the distribution losses. You can run it with

make compute_target

This automatically saves a tensor, named after the dataset, in the metrics folder.
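The idea behind precomputing the targets can be sketched as follows. Everything here is illustrative: the placeholder embed function, the fake point clouds, and the file name bbc_targets.npy are all assumptions, while the real script runs the 3D+CLIP network over the training leaves.

```python
import numpy as np

def embed(point_cloud):
    # Placeholder embedding: the real pipeline runs the 3D+CLIP network.
    # Here we just use simple per-axis statistics of the points.
    return np.concatenate([point_cloud.mean(0), point_cloud.std(0)])

rng = np.random.default_rng(0)
train_leaves = [rng.normal(size=(512, 3)) for _ in range(4)]  # fake leaf clouds

# Stack one embedding per training leaf and save the tensor once,
# so every training run can reuse it for the distribution losses.
target = np.stack([embed(pc) for pc in train_leaves])
np.save("bbc_targets.npy", target)  # hypothetical dataset-derived file name
print(target.shape)  # (4, 6)
```

Saving the embeddings once avoids re-running the embedding network on the whole training set at every loss evaluation.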

Configuration Files

Notice that in the config folder there are three configuration files:

  1. config_bbc.yaml: dataset configuration file for training the network; it specifies information about the real-world data, the network parameters, and the size of the used skeletons.
  2. test_config.yaml: configuration used to compute the generative metrics; it specifies the locations of the real-world data and of the generated samples.
  3. generate_config.yaml: configuration for the generative procedure; it specifies information about the type of leaf skeleton and about the network.
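To give an idea of the shape such a file can take, here is a purely illustrative sketch of a generation config; none of these keys or values are guaranteed to match the actual schema used by this repository:

```yaml
# Illustrative only: the real keys are defined by the code in this repo.
skeleton:
  type: sugar_beet        # type of leaf skeleton to condition on
  num_nodes: 32           # size of the used skeletons
network:
  checkpoint: best.ckpt   # generator weights downloaded above
output:
  folder: generated/      # where the sampled point clouds are written
```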

Style Guidelines

In general, we follow the Python PEP 8 style guidelines. Please install black to format your Python code properly. To run the black code formatter, use the following command:

black -l 120 path/to/python/module/or/package/

To optimize and clean up your imports, feel free to have a look at this solution for PyCharm.
