|
<br/>

<p align="center">
  <img src="./assets/icon.png" alt="Library Icon" width="164" height="164" />
  <h1 align="center">Diffusion Studio</h1>
</p>

<p align="center">
</p>

<br/>

# Getting Started
|
Diffusion Studio is an open-source, web-based framework for programmatic video editing. It enables developers to automate complex editing workflows, build AI-powered video editors and create videos at scale.

```sh
npm i @diffusionstudio/core
```
|
## Documentation

Visit https://docs.diffusion.studio to view the full documentation.

## Basic Usage

Let's take a look at an example:
|
```typescript
import * as core from '@diffusionstudio/core';

// ...
```
|
The composition, the track and the clips are each in a `1:n` relationship. You can find more [examples here](./examples), or give them a whirl at https://examples.diffusion.studio.
|
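To make the `1:n` relationship concrete, here is a minimal sketch of one composition holding one track that holds two clips. It is an illustration only: the class names, the `createTrack` call and the constructor arguments are assumptions for this sketch rather than documented API.

```typescript
import * as core from '@diffusionstudio/core';

// One composition holds many tracks, and each track holds many clips (1:n at each level).
// Class names and arguments below are illustrative assumptions, not verbatim API documentation.
const composition = new core.Composition();

// A hypothetical text track; the track type string is an assumption.
const track = composition.createTrack('text');

// Two clips appended to the same track illustrate the track : clip = 1:n relationship.
const text1 = new core.TextClip('Hello');
const text2 = new core.TextClip('World');

await track.appendClip(text1);
await track.appendClip(text2);
```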
https://github.com/user-attachments/assets/7a943407-e916-4d9f-b46a-3163dbff44c3

## How does Diffusion Studio compare to Remotion and Motion Canvas?
**Remotion** acts as a React-based video creation tool, enabling you to render the entire DOM tree and use the full suite of browser visualization features, such as HTML, CSS and Canvas. This makes Remotion ideal for beginners looking to create videos with code. However, it is limited to React and relies heavily on the CPU, which is less efficient than GPU-backed rendering.
In contrast, **Motion Canvas** renders through a Canvas 2D implementation. It is intended as a standalone editor for creating production-quality animations. Motion Canvas also uses an imperative API: instead of rendering markup based on timestamps, elements are added to the timeline procedurally. This approach is ideal for creating animations with code (its intended purpose). However, it generally demands static workflows with little variability, making it difficult to build dynamic applications on top of it.
**Diffusion Studio** combines the strengths of both Remotion and Motion Canvas by offering a declarative (yet framework-agnostic) API like Remotion, while also being GPU-backed like Motion Canvas. Diffusion Studio is optimized for video processing performance, utilizing the latest and greatest technologies (WebGPU and WebCodecs). Its API is specifically designed for building video editing applications and automating complex video workflows.
> **Note: Diffusion Studio eliminates the need to pay for rendering server infrastructure, since all processing is performed client-side!**
|
## Current features

* **Video/Audio** trim and offset
* **Tracks & Layering**
* **Splitting** clips
* **Html & Image** rendering
* **Text** with multiple styles
* Web & Local **Fonts**
* **Filters**
* **Keyframe** animations (see the sketch below the list)
  * **Numbers, Degrees and Colors**
  * **Easing** (easeIn, easeOut etc.)
  * **Extrapolation** `'clamp' | 'extend'`
* **Realtime playback**
* **Hardware accelerated** encoding via WebCodecs
* **Dynamic render resolution and framerate**
|
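As a rough sketch of the keyframe feature above: the example animates a clip's rotation with easing and clamped extrapolation. The `Keyframe` constructor shape (frame array, value array, options object), the `set` call and the `rotation` property are assumptions for illustration, not confirmed API.

```typescript
import * as core from '@diffusionstudio/core';

// Hypothetical sketch: rotate a clip from 0° to 180° over the first 30 frames,
// easing in and clamping the value outside the keyframe range.
// Constructor shape and property names are assumptions, not verbatim API documentation.
const rotation = new core.Keyframe(
  [0, 30],   // frames
  [0, 180],  // values, interpreted as degrees
  { easing: 'easeIn', extrapolate: 'clamp' },
);

const caption = new core.TextClip('Spinning caption');
caption.set({ rotation });
```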
## Compatibility
|
|
| Ogg | ✅ | ❌ |
| Wav | ✅ | N/A |
|
## Contributing

Contributions to Diffusion Studio are welcome and highly appreciated. Simply fork this repository and run:
|
```sh
npm install
```
|
Before submitting a pull request, please verify that all unit tests are still green by running:
|
```sh
npm run test
```
|
## Background
|
This project was started in March 2023 with the mission of creating the *"video processing toolkit for the era of AI"*. During an extensive research period, we quickly decided to fully embrace **WebGPU**, which offers a substantial performance improvement over its predecessor WebGL and similar technologies. The following implementations were evaluated:
* **C++ w/ Python bindings** - inefficient to develop.
* **Rust** - early ecosystem (might come back here).
* **TypeScript** - efficient to develop, great performance when GPU-based.
|
All of them support WebGPU; however, in the case of TypeScript, WebGPU is currently only available in Chromium-based browsers, which is why a WebGL fallback is mandatory.
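As a rough, framework-agnostic illustration of that fallback decision (generic browser APIs only, not code from this library), renderer selection could look like this:

```typescript
// Minimal sketch of WebGPU feature detection with a WebGL fallback.
// `navigator.gpu` typings normally come from @webgpu/types; a structural cast is used here instead.
async function pickRenderer(canvas: HTMLCanvasElement): Promise<'webgpu' | 'webgl' | 'none'> {
  const gpu = (navigator as Navigator & { gpu?: { requestAdapter(): Promise<unknown> } }).gpu;
  if (gpu) {
    const adapter = await gpu.requestAdapter();
    if (adapter) return 'webgpu'; // Chromium-based browsers typically land here.
  }
  // Fall back to WebGL where WebGPU is unavailable.
  if (canvas.getContext('webgl2') ?? canvas.getContext('webgl')) return 'webgl';
  return 'none';
}
```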