
Commit 4740c3b

v1.0.0-beta.13 release
1 parent: 409357a

220 files changed (+19780, -1128 lines)

.gitignore (-1)

@@ -11,7 +11,6 @@ node_modules
dist
dist-ssr
*.local
-.env

# Editor directories and files
.vscode/*

LICENSE (+364, -15): large diff not rendered.

README.md (+45, -70)

@@ -1,6 +1,7 @@
<br/>
<p align="center">
<img src="./assets/icon.png" alt="Library Icon" width="164" height="164" />
+<h1 align="center">Diffusion Studio</h1>
</p>

<p align="center">
@@ -11,45 +12,20 @@
</p>
<br/>

-# Diffusion Studio - A browser based video editing framework 🚀
+# Getting Started

-Yes that's right, DS does not require a backend! This is made possible by bleeding edge browser APIs such as WebGPU, WebCodecs and WebAssembly resulting in a blazingly fast render performance 🏎️ (fastest in town).
-
-### Background
-
-This project has been started in March 2023 with the mission of creating the *"video processing toolkit for the area of AI"*. During an extensive research period, we quickly decided to fully embrace **WebGPU**, which offers a substantial performance improvement over its predecessor WebGL and technologies alike. The following implementations were evaluated:
-* **C++ w/ Python bindings** - inefficient to develop.
-* **Rust** - early ecosystem (might come back here).
-* **Typescript** - efficient to develop, great performance when gpu based.
-
-They all support WebGPU, however, in the case of Typescript, WebGPU is currently only available in Chromium-based browsers, which is why a WebGL fallback is mandatory.
-
-## Current features
-* **Video/Audio** trim and offset
-* **Tracks & Layering**
-* **Splitting** clips
-* **Html & Image** rendering
-* **Text** with multiple styles
-* Web & Local **Fonts**
-* **Filters**
-* **Keyframe** animations
-* **Numbers, Degrees and Colors**
-* **Easing** (easeIn, easeOut etc.)
-* **Extrapolation** `'clamp' | 'extend'`
-* **Realtime playback**
-* **Hardware accelerated** encoding via WebCodecs
-* **Dynamic render resolution and framerate**
-
-## Setup
+Diffusion Studio is an open-source, web-based framework for programmatic video editing. It enables developers to automate complex editing workflows, build AI-powered video editors and create videos at scale.

```sh
npm i @diffusionstudio/core
```
-Diffusion Studio's render implementation was recently migrated to [Pixi.js](https://pixijs.com/), a **WebGL/WebGPU abstraction library**, to speed up development while providing full support for WebGPU and WebGL. It is listed as a peer dependency and will be installed if the dependency is not already satisfied.

-## Getting Started
-*For more information, please refer to our [documentation](https://docs.diffusion.studio)*
+## Documentation
+
+Visit https://docs.diffusion.studio to view the full documentation.
+
+## Basic Usage
+Let's take a look at an example:

```typescript
import * as core from '@diffusionstudio/core';
@@ -96,30 +72,35 @@ await track.appendClip(text2);
...
```

-The composition, the track and the clips are each in a relationship of `1:n`. You can find more examples here:
-
-* [Caption Presets](./examples/scripting/src/captions.ts)
-* [Custom Caption Presets](./examples/scripting/src/custom-captions.ts)
-* [Drag and Drop & File API](./examples/scripting/src/drag-and-drop.ts)
-* [Loading Webfonts](./examples/scripting/src/font.ts)
-* [Splitting Clips](./examples/scripting/src/split-video.ts)
-* [Video Trimming](./examples/scripting/src/video-trimming.ts)
-* [Reddit Stories](./examples/scripting/src/reddit-stories.ts)
-* [SSR with Puppeteer](./examples/puppeteer)
-
-Clone the repository and run `npm install && npm run dev` to conveniently test these examples.
+The composition, the track and the clips are each in a `1:n` relationship. You can find more [examples here](./examples), or give them a whirl at https://examples.diffusion.studio.
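To make the `1:n` relationship concrete, here is a minimal sketch of how a composition, a track and its clips might be wired together. Only the `@diffusionstudio/core` import and `track.appendClip(...)` are taken from the snippet above; `Composition`, `createTrack` and `TextClip` (and their arguments) are assumptions and should be verified against https://docs.diffusion.studio.

```typescript
import * as core from '@diffusionstudio/core';

// Assumed API sketch: one composition holds n tracks, each track holds n clips.
// Only `track.appendClip(...)` appears in the diff above; the other names are
// assumptions to be checked against the documentation.
const composition = new core.Composition();     // 1 composition
const track = composition.createTrack('text');  // : n tracks

const text1 = new core.TextClip('Hello');
const text2 = new core.TextClip('World');

await track.appendClip(text1);                  // : n clips per track
await track.appendClip(text2);
```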

https://github.com/user-attachments/assets/7a943407-e916-4d9f-b46a-3163dbff44c3

## How does Diffusion Studio compare to Remotion and Motion Canvas?

-**Remotion** acts as a React-based video creation tool, enabling you to utilize the entire DOM tree for video creation as well as the full suite of browser visualization features, such as HTML, CSS, Canvas, etc.. This makes Remotion ideal for beginners looking to create videos with code. However, it relies heavily on the CPU, which can be inefficient for rendering.
+**Remotion** acts as a React-based video creation tool, enabling you to render the entire DOM tree as well as the full suite of browser visualization features, such as HTML, CSS, Canvas, etc. This makes Remotion ideal for beginners looking to create videos with code. However, it is limited to React and relies heavily on the CPU, which is less efficient than GPU-backed rendering.

-**Motion Canvas** lives up to its name by utilizing a GPU-backed canvas element, primarily through a Canvas 2D implementation. It aims to be a standalone editor for creating production-quality animations. Moreover, Motion Canvas employs an imperative API. Instead of rendering markup based on timestamps, elements are added procedurally to the timeline. This approach makes it ideal for creating animations with code (it's intended prupose). However, it is suboptimal for dynamic applications or as the backbone of a video editing application.
+In contrast, **Motion Canvas** uses a Canvas 2D implementation for rendering. It is intended as a standalone editor for creating production-quality animations. In addition, Motion Canvas uses an imperative API: instead of rendering markup based on timestamps, elements are procedurally added to the timeline. This approach is ideal for creating animations with code (its intended purpose). However, it usually demands static workflows with little variability, making it difficult to build dynamic applications.

-**Diffusion Studio** combines the strengths of both Remotion and Motion Canvas by offering a declarative (yet framework-agnostic) API like Remotion, while also being GPU-backed like Motion Canvas. Diffusion Studio is optimized for video processing performance, utilizing the latest and greatest technologies (WebGPU, WebCodecs and WASM). Its API is specifically designed for building video editing and video processing applications, with a strong focus on automation software.
+**Diffusion Studio** combines the strengths of both Remotion and Motion Canvas by offering a declarative (yet framework-agnostic) API like Remotion, while also being GPU-backed like Motion Canvas. Diffusion Studio is optimized for video processing performance, utilizing the latest and greatest technologies (WebGPU and WebCodecs). Its API is specifically designed for building video editing apps and automating complex video workflows.

-One notable advantage of Diffusion Studio being client-side software is that it eliminates the need to pay for rendering server infrastructure. However, widespread support across all browsers may take some time to achieve.
+> **Note: Diffusion Studio eliminates the need to pay for rendering server infrastructure, since all processing is performed client-side!**
+
+## Current features
+* **Video/Audio** trim and offset
+* **Tracks & Layering**
+* **Splitting** clips
+* **Html & Image** rendering
+* **Text** with multiple styles
+* Web & Local **Fonts**
+* **Filters**
+* **Keyframe** animations
+* **Numbers, Degrees and Colors**
+* **Easing** (easeIn, easeOut etc.)
+* **Extrapolation** `'clamp' | 'extend'`
+* **Realtime playback**
+* **Hardware accelerated** encoding via WebCodecs (see the sketch below this list)
+* **Dynamic render resolution and framerate**
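To ground the hardware-accelerated encoding bullet, here is a minimal, self-contained sketch of the underlying browser API (WebCodecs `VideoEncoder`), independent of Diffusion Studio's own wrapper, which this diff does not show. The codec string, resolution, bitrate and frame count are arbitrary example values.

```typescript
// Standalone WebCodecs sketch (browser only): encode canvas frames with the
// hardware encoder when available. This illustrates the browser API itself,
// not Diffusion Studio's internal encoder.
const chunks: EncodedVideoChunk[] = [];

const encoder = new VideoEncoder({
  output: (chunk) => chunks.push(chunk), // encoded chunks; muxing happens elsewhere
  error: (e) => console.error('encoding failed', e),
});

encoder.configure({
  codec: 'avc1.42001f',                  // H.264 baseline, example value
  width: 1920,
  height: 1080,
  bitrate: 10_000_000,                   // 10 Mbit/s, example value
  framerate: 30,
  hardwareAcceleration: 'prefer-hardware',
});

const canvas = document.createElement('canvas');
canvas.width = 1920;
canvas.height = 1080;

for (let i = 0; i < 90; i++) {
  // ...draw frame i of the composition onto the canvas here...
  const frame = new VideoFrame(canvas, { timestamp: (i * 1_000_000) / 30 }); // microseconds
  encoder.encode(frame, { keyFrame: i % 30 === 0 });
  frame.close();                         // release the frame as soon as it is queued
}

await encoder.flush();                   // resolves once every frame has been emitted
```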

## Compatibility

@@ -172,30 +153,24 @@ One notable advantage of Diffusion Studio being client-side software is that it
| Ogg | | |
| Wav | | N/A |

-## License
-
-### License for Personal Use
-
-Copyright (c) 2024 Diffusion Studio GmbH
-
-Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to use the Software for personal, non-commercial purposes only, subject to the following conditions:
+## Contributing
+Contributions to Diffusion Studio are welcome and highly appreciated. Simply fork this repository and run:

-The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
-
-#### Restrictions
-
-- Redistribution, sublicensing, and republishing of the Software or any derivative works are strictly prohibited.
-- The Software is provided "as is", without warranty of any kind, express or implied, including but not limited to the warranties of merchantability, fitness for a particular purpose, and noninfringement. In no event shall the authors or copyright holders be liable for any claim, damages, or other liability, whether in an action of contract, tort, or otherwise, arising from, out of, or in connection with the Software or the use or other dealings in the Software.
+```sh
+npm install
+```

-#### Eligibility
+Before checking in a pull request, please verify that all unit tests are still green by running:

-You are eligible to use Diffusion Studio for free if you are:
+```sh
+npm run test
+```

-- an individual.
-- assessing whether Diffusion Studio is a suitable solution for your organization.
-- a for-profit organization with up to five (5) employees.
-- a non-profit organization.
+## Background

-### Commercial Use
+This project was started in March 2023 with the mission of creating the *"video processing toolkit for the era of AI"*. During an extensive research period, we quickly decided to fully embrace **WebGPU**, which offers a substantial performance improvement over its predecessor WebGL and similar technologies. The following implementations were evaluated:
+* **C++ w/ Python bindings** - inefficient to develop.
+* **Rust** - early ecosystem (might come back here).
+* **TypeScript** - efficient to develop, great performance when GPU-based.

-For commercial use, a separate commercial license must be obtained. Please contact license(at)diffusion.studio to obtain a commercial license.
+They all support WebGPU; however, in the case of TypeScript, WebGPU is currently only available in Chromium-based browsers, which is why a WebGL fallback is mandatory.

assets/composition.png (10.1 KB)

biome.json (+18)

@@ -0,0 +1,18 @@
+{
+  "$schema": "https://biomejs.dev/schemas/1.8.3/schema.json",
+  "organizeImports": {
+    "enabled": true
+  },
+  "linter": {
+    "enabled": false
+  },
+  "javascript": {
+    "formatter": {
+      "quoteStyle": "single",
+      "jsxQuoteStyle": "double",
+      "trailingCommas": "all",
+      "semicolons": "always",
+      "lineWidth": 100
+    }
+  }
+}
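For context, this configuration only turns on import organization and sets formatter options; the linter stays disabled. Assuming Biome is installed as a dev dependency, the settings above would be picked up by commands along the lines of:

```sh
# Format the repository in place using biome.json
npx @biomejs/biome format --write .

# Verify formatting and import order without writing changes (useful in CI)
npx @biomejs/biome check .
```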

docs/api/clip.md (-123): This file was deleted.
