## ⚠️ Notice: Limited Maintenance

This project is no longer actively maintained. While existing releases remain available, there are no planned updates, bug fixes, new features, or security patches. Users should be aware that vulnerabilities may not be addressed.

Workflows can be used to compose an ensemble of PyTorch models and Python functions and package them in a war file. A workflow is executed as a DAG whose nodes are either PyTorch models packaged as mar files or function nodes specified in the workflow handler file. The DAG can be used to define both sequential and parallel pipelines.

As an example, a sequential pipeline may look something like this:

```
input -> function1 -> model1 -> model2 -> function2 -> output
```

And a parallel pipeline may look something like this:

```
                          model1
                         /       \
input -> preprocessing ->         -> aggregate_func
                         \       /
                          model2
```

You can experiment with much more complicated workflows by configuring a YAML file. We've included two reference examples: a sequential pipeline and a parallel pipeline.
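
For illustration, a workflow spec for the parallel pipeline above might look roughly like the sketch below. The model names, mar file names, and tuning values here are placeholders rather than taken from the reference examples; consult the reference examples for the exact keys that are supported.

```yaml
# Illustrative sketch of a workflow spec for the parallel pipeline above.
# model1.mar / model2.mar are placeholder model archives, and the tuning
# values are arbitrary defaults, not recommendations.
models:
    min-workers: 1         # workers started for each model
    max-workers: 1
    batch-size: 1
    max-batch-delay: 5000  # ms to wait while accumulating a batch
    retry-attempts: 3
    timeout-ms: 5000
    model1:
        url: model1.mar    # PyTorch model packaged as a mar file
    model2:
        url: model2.mar

dag:
    # preprocessing and aggregate_func are function nodes defined in the
    # workflow handler file; each key lists its downstream nodes.
    preprocessing: [model1, model2]
    model1: [aggregate_func]
    model2: [aggregate_func]
```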

For a more detailed explanation of Workflows and what is currently supported, please refer to the main documentation.