Conversation

@rockwotj
Contributor

No description provided.

@mmatczuk
Contributor

mmatczuk commented Dec 15, 2025

Mind-blowing, this would be very useful for Connect.

@rockwotj rockwotj changed the title codesandbox: initial sandbox with quickjs codesandbox: initial sandbox with js and py Dec 15, 2025
@mmatczuk
Contributor

I wonder how the performance compares to plain Python in repeated loops - we don't have a Python JIT here, do we?

@rockwotj
Contributor Author

rockwotj commented Dec 15, 2025

I wonder how the performance compares to plain Python in repeated loops - we don't have a Python JIT here, do we?

Python isn't really JITted. There have been efforts to add a basic copy-and-patch JIT, but I don't remember if that has landed in newer Python versions yet. At least up to Python 3.12 it's a pure threaded bytecode interpreter.

The extra layer of sandboxing is going to be slower than pure Python compiled to x86 for sure. Here we have a slower interpreter (RustPython) compiled to wasm, then JITted to x86 and run in a sandbox. So it's going to be slower, but I suspect it would still be faster than Bloblang depending on what you're doing (I know Bloblang has some nice builtins that are probably faster than writing the equivalent yourself in Python/JS).

Also, for both of these bytecode interpreters, we could optimize things for Connect's use case. Connect compiles once and evaluates many times (and probably doesn't need to reset the sandbox like the AI use cases do), so we could add new APIs to cache the bytecode and avoid recompiling each time. A rough sketch of what that could look like is below.
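
A minimal Go sketch of the compile-once/eval-many shape being described. All of the type and method names here (Sandbox, Compile, Eval, CompiledScript) are hypothetical illustrations, not the actual codesandbox API:

```go
// Hypothetical sketch only: these types and methods are not the real
// codesandbox API, just an illustration of a compile-once/eval-many shape.
package main

import (
	"context"
	"fmt"
)

// CompiledScript would hold cached bytecode produced by the wasm guest
// so repeated evaluations skip the parse/compile step.
type CompiledScript struct {
	bytecode []byte
}

// Sandbox is a stand-in for the wasm-backed JS/Python sandbox.
type Sandbox struct{}

// Compile parses the source once and returns cached bytecode.
func (s *Sandbox) Compile(ctx context.Context, src string) (*CompiledScript, error) {
	// In a real implementation this would compile inside the wasm guest
	// and export the resulting bytecode for reuse.
	return &CompiledScript{bytecode: []byte(src)}, nil
}

// Eval runs previously compiled bytecode against one input message,
// without recompiling and without tearing down the sandbox.
func (s *Sandbox) Eval(ctx context.Context, c *CompiledScript, input []byte) ([]byte, error) {
	// In a real implementation this would push the input into the guest,
	// execute the cached bytecode, and pull the output back out.
	return input, nil
}

func main() {
	ctx := context.Background()
	sb := &Sandbox{}

	// Compile once at pipeline start-up.
	script, err := sb.Compile(ctx, `root = msg.upper()`)
	if err != nil {
		panic(err)
	}

	// Evaluate many times, one call per message.
	for _, msg := range [][]byte{[]byte("hello"), []byte("world")} {
		out, err := sb.Eval(ctx, script, msg)
		if err != nil {
			panic(err)
		}
		fmt.Println(string(out))
	}
}
```

This is the usage pattern that distinguishes Connect from the AI use cases mentioned above: one compile at start-up, many evaluations, and no sandbox reset between messages.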
