
Commit a3cc989

Authored by strint and jackalcooper

enterprise edition to enterprise solution (#1103)
<!-- This is an auto-generated comment: release notes by coderabbit.ai -->

## Summary by CodeRabbit

- **Documentation**
  - Renamed "onediff Enterprise Edition" to "onediff Enterprise Solution" for clarity on offerings.
  - Enhanced details on performance gains and support options within the "onediff Enterprise Solution" section.
  - Elevated the "Distributed Run" section to a prominent header, emphasizing distributed inference capabilities.
  - Adjusted content in the "Distributed Run" section to highlight the use of onediff's compiler in distributed inference engines.

<!-- end of auto-generated comment: release notes by coderabbit.ai -->

Co-authored-by: Shenghang Tsai <[email protected]>
1 parent eff625d commit a3cc989

File tree

1 file changed: 12 additions, 12 deletions


README.md

Lines changed: 12 additions & 12 deletions
````diff
@@ -62,8 +62,8 @@ If you have contributed significantly to open-source software and are interested
   + [PyTorch Module compilation](#pytorch-module-compilation)
   + [Avoid compilation time for new input shape](#avoid-compilation-time-for-new-input-shape)
   + [Avoid compilation time for online serving](#avoid-compilation-time-for-online-serving)
-  + [onediff Enterprise Edition](#onediff-enterprise-edition)
-* [Distributed Run](#distributed-run)
+  + [Distributed Run](#distributed-run)
+* [OneDiff Enterprise Solution](#onediff-enterprise-solution)
 <!-- tocstop -->
 
 ## Documentation
@@ -256,17 +256,17 @@ onediff supports the acceleration for SOTA models.
 Compile and save the compiled result offline, then load it online for serving
 - [Save and Load the compiled graph](https://github.com/siliconflow/onediff/blob/main/onediff_diffusers_extensions/examples/text_to_image_sdxl_save_load.py)
 - Compile at one device(such as device 0), then use the compiled result to other device(such as device 1~7). [Change device of the compiled graph to do multi-process serving](https://github.com/siliconflow/onediff/blob/main/onediff_diffusers_extensions/examples/text_to_image_sdxl_mp_load.py)
-#### onediff Enterprise Edition
-If you need Enterprise-level Support for your system or business, you can email us at [email protected], or contact us through the website: https://siliconflow.cn/pricing
-
-|   | onediff Enterprise Edition | onediff Community Edition |
-| --------------------------------------------------------------------------------------------------------- | --------------------------------------- | --------------------------------------- |
-| More Extreme and Dedicated optimization(usually another 20~100% performance gain) for the most used model | Yes | |
-| Technical Support for deployment | High priority support | Community |
+#### Distributed Run
+If you want to do distributed inference, you can use onediff's compiler to do single-device acceleration in a distributed inference engine such as [xDiT](https://github.com/xdit-project/xDiT)
 
-### Distributed Run
-If you want to do distributed inference, you can use onediff's compiler to do single-device acceleration in a distributed inference engine such as:
-- [xDiT](https://github.com/xdit-project/xDiT)
+### OneDiff Enterprise Solution
+If you need Enterprise-level Support for your system or business, you can email us at [email protected], or contact us through the website: https://siliconflow.cn/pricing
+| | Onediff Enterprise Solution |
+| -------------------------------------------------------- | ------------------------------------------------ |
+| More extreme compiler optimization for diffusion process | Usually another 20%~30% or more performance gain |
+| End-to-end workflow speedup solutions | Sometimes 200%~300% performance gain |
+| End-to-end workflow deployment solutions | Workflow to online model API |
+| Technical support for deployment | High priority support |
 
 ## Citation
 ```bibtex
````
