Related project: ColorTransferLab, a web-based test bed built on this library, is available at https://github.com/hpotechius/ColorTransferLab and https://potechius.com/ColorTransferLab
ColorTransferLib is a library dedicated to color transfer, style transfer, and colorization, featuring a diverse range of published algorithms. Some methods have been re-implemented, while others are integrated from public repositories.
The primary goal of this project is to consolidate all existing color transfer, style transfer, and colorization techniques into a single library with a standardized API. This facilitates both the development and comparison of algorithms within the research community.
Currently, the library supports 11 color transfer, 5 style transfer, and 3 colorization methods across various data types, including images, point clouds, textured meshes, light fields, videos, volumetric videos, and Gaussian splatting. Additionally, it provides 20 evaluation metrics for assessing image-to-image color transfer performance.
A compatibility chart for supported data types and a detailed list of all algorithms can be found below.
For seamless integration, a new color transfer algorithm has to adhere to the API specification depicted in the figure below.
Each class requires three inputs: Source, Reference, and Options. Source and Reference must be of the Image, Video, VolumetricVideo, LightField, GaussianSplatting or Mesh class type, the latter encompassing both 3D point clouds and textured triangle meshes. The Options input consists of dictionaries, stored as a JSON file in the Options folder. For a sample option, see Listing 1. Each option describes one adjustable parameter of the algorithm.
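Listing 1 is not reproduced here; as an illustrative sketch only (the field names below are assumptions, not the library's documented schema), a single option entry might look like the following once loaded into Python:

```python
# Hypothetical option entry as it might appear in an Options/*.json file,
# shown as the Python dict it loads into (field names are illustrative):
example_option = {
    "name": "colorspace",              # parameter name exposed by the algorithm
    "type": "string",                  # expected value type
    "default": "lalphabeta",           # value used when the caller does not override it
    "values": ["lalphabeta", "rgb"],   # admissible choices
    "tooltip": "Color space in which the statistics are transferred."
}
```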
Save each new color transfer class in the ColorTransferLib repository under the Algorithms folder. The class must provide an apply(...) function, which receives the inputs and contains the core color transfer logic.
The output has to be a dictionary in the format outlined in Listing 2. A status code of 0 signifies a valid algorithm output, while -1 indicates a failure. The process time records the algorithm's execution duration, which is useful for subsequent evaluations. The 'object' key holds the result, which must have the same class type as the Source input.
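As a minimal sketch of this contract (the class name, the static method, and the exact key spelling "process_time" are assumptions; only status_code, response, object and the process time are described above), a new algorithm might be structured like this:

```python
import time

class MyColorTransfer:
    """Illustrative skeleton for a new algorithm placed under the Algorithms folder."""

    @staticmethod
    def apply(src, ref, opt):
        start = time.time()
        # ... core color transfer logic goes here; 'out_obj' must have the
        # same class type as the Source input (Image, Mesh, Video, ...)
        out_obj = src
        return {
            "status_code": 0,                     # 0 = valid output, -1 = failure
            "response": "",                       # error message when status_code is -1
            "object": out_obj,                    # the transferred result
            "process_time": time.time() - start   # execution duration for later evaluation
        }
```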

(1) Install the following packages:
# for running MATLAB code via Octave
sudo apt-get install octave-dev
# allows writing of mp4 with the h264 codec
sudo apt-get install ffmpeg
(2) Install the following Octave packages:
# activate octave environment
octave
# install packages
octave:1> pkg install -forge image
octave:2> pkg install -forge statistics
(3) Run gbvs_install.m to make the evaluation metric VSI runnable:
user@pc:~/<Project Path>/ColorTransferLib/Evaluation/VIS/gbvs$ octave
octave:1> gbvs_install
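# Install the latest release from PyPI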
pip install colortransferlib
pip install git+https://github.com/facebookresearch/detectron2.git@main
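# Alternatively, build and install the library from source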
pip install -r requirements/requirements.txt
python setup.py bdist_wheel
pip install ../ColorTransferLib/dist/ColorTransferLib-2.0.3-py3-none-any.whl
pip install git+https://github.com/facebookresearch/detectron2.git@main
from ColorTransferLib.ColorTransfer import ColorTransfer
from ColorTransferLib.DataTypes.Image import Image
src = Image(file_path='/media/source.png')
ref = Image(file_path='/media/reference.png')
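# Identifier of the color transfer algorithm to apply (see the method tables below)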
algo = "GLO"
ct = ColorTransfer(src, ref, algo)
out = ct.apply()
# No output file extension has to be given
if out["status_code"] == 0:
    out["object"].write("/media/out")
else:
    print("Error: " + out["response"])
from ColorTransferLib.ColorTransfer import ColorTransferEvaluation
from ColorTransferLib.DataTypes.Image import Image
src = Image(file_path='/media/source.png')
ref = Image(file_path='/media/reference.png')
out = Image(file_path='/media/output.png')
cte = ColorTransferEvaluation(src, ref, out)
method = "PSNR"   # metric identifier; "PSNR" is used here as an example
eva = cte.apply(method)
print(eva)
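Several metrics can be computed by simply repeating the call. The identifiers used below ("PSNR", "SSIM", "MSE") are assumptions based on the metric table further down and may differ from the exact strings expected by the library:

```python
# Evaluate the same source/reference/output triple with several metrics.
results = {}
for m in ["PSNR", "SSIM", "MSE"]:   # assumed metric identifiers
    results[m] = cte.apply(m)
print(results)
```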
# Test all Color Transfer algorithms with all data type combinations
python main.py --test all_CT --out_path "/media/out"
# Test all Style Transfer algorithms with all data type combinations
python main.py --test all_ST --out_path "/media/out"
# Test all Colorization algorithms with all data type combinations
python main.py --test all_CT --out_path "/media/out"
# Test all evaluation metric on src, ref and out images
python main.py --test all_EVAL
The following color transfer, style transfer and colorization methods are integrated in the library. Some of them are re-implemented based on the algorithm's description in the published paper, while others are adopted from existing repositories and adapted to fit the API; for the latter, a link to the original implementation is given next to the publication's name. Highlighted icons within the Support column indicate the supported data types (from left to right: (1) Gaussian Splatting, (2) Light Field, (3) Volumetric Video, (4) Video, (5) Point Cloud, (6) Mesh, (7) Image).
Year | ID | Support | Publication |
---|---|---|---|
2001 | | ![]() | Color Transfer between Images |
2003 | | ![]() | A Framework for Transfer Colors Based on the Basic Color Categories |
2005 | | ![]() | N-dimensional probability density function transfer and its application to color transfer |
2006 | | ![]() | Color transfer in correlated color space |
2007 | | ![]() | The Linear Monge-Kantorovitch Linear Colour Mapping for Example-Based Colour Transfer |
2009 | | ![]() | Color Transfer between Images |
2010 | | ![]() | An efficient fuzzy clustering-based color transfer method |
2019 | | ![]() | L2 Divergence for robust colour transfer - Original Implementation |
2020 | | ![]() | Deep Color Transfer using Histogram Analogy - Original Implementation |
2021 | | ![]() | HistoGAN: Controlling Colors of GAN-Generated and Real Images via Color Histograms |
2021 | | ![]() | Example-Based Colour Transfer for 3D Point Clouds |
Year | ID | Support | Publication |
---|---|---|---|
2015 | | ![]() | A Neural Algorithm of Artistic Style - Original Implementation |
2017 | | ![]() | Deep Photo Style Transfer - Original Implementation |
2020 | | ![]() | PSNet: A Style Transfer Network for Point Cloud Stylization on Geometry and Color - Original Implementation |
2021 | | ![]() | CAMS: Color-Aware Multi-Style Transfer - Original Implementation |
2022 | | ![]() | StyTr²: Image Style Transfer with Transformers - Original Implementation |
Year | ID | Support | Publication |
---|---|---|---|
2020 | | ![]() | Instance-aware Image Colorization - Original Implementation |
2022 | | ![]() | ColorFormer: Image Colorization via Color Memory assisted Hybrid-Attention Transformer - Original Implementation |
2023 | | ![]() | DDColor: Towards Photo-Realistic Image Colorization via Dual Decoders - Original Implementation |
Three classes of evaluation metrics are considered here: metrics that evaluate the color consistency with the reference image, metrics that evaluate the structural similarity with the source image, and non-reference metrics that evaluate the perceptual quality of the output image.
Year | ID | Name | Publication |
---|---|---|---|
/ | | Peak Signal-to-Noise Ratio | / |
/ | | Histogram Intersection | / |
/ | | Correlation | / |
/ | | Bhattacharyya Distance | / |
/ | | Mean-Squared Error | / |
/ | | Root-Mean-Squared Error | / |
2003 | | Colorfulness | Measuring Colourfulness in Natural Images |
2003 | | Multi-Scale Structural Similarity Index | Multiscale structural similarity for image quality assessment |
2004 | | Structural Similarity Index | Image quality assessment: from error visibility to structural similarity |
2006 | | Gradient-based Structural Similarity Index | Gradient-Based Structural Similarity for Image Quality Assessment |
2010 | | 4-component Structural Similarity Index | Content-partitioned structural similarity index for image quality assessment |
2011 | | 4-component enhanced Gradient-based Structural Similarity Index | An image similarity measure using enhanced human visual system characteristics |
2011 | | Feature Similarity Index | FSIM: A Feature Similarity Index for Image Quality Assessment |
2012 | | Blind/Referenceless Image Spatial Quality Evaluator | No-Reference Image Quality Assessment in the Spatial Domain |
2013 | | Naturalness Image Quality Evaluator | Making a “Completely Blind” Image Quality Analyzer |
2014 | | Visual Saliency-based Index | VSI: A Visual Saliency-Induced Index for Perceptual Image Quality Assessment |
2016 | | Color Transfer Quality Metric | Novel multi-color transfer algorithms and quality measure |
2018 | | Learned Perceptual Image Patch Similarity | The Unreasonable Effectiveness of Deep Features as a Perceptual Metric |
2018 | | Neural Image Assessment | NIMA: Neural Image Assessment |
2019 | | Color and Structure Similarity | Selective color transfer with multi-source images |
- PSN crashes if the point clouds are too large
If you use this code in your research, please cite:
@inproceedings{potechius2023,
  title={A software test bed for sharing and evaluating color transfer algorithms for images and 3D objects},
  author={Potechius, Herbert and Sikora, Thomas and Raja, Gunasekaran and Knorr, Sebastian},
  year={2023},
  booktitle={European Conference on Visual Media Production (CVMP)},
  doi={10.1145/3626495.3626509}
}