Replies: 1 comment 1 reply
No, for the technical reasons you have already stated, that isn't currently possible. Is the issue just that it takes too long to download everything again?
For FLL we wrote a lot of code that runs on the SPIKE hub and wraps things like PID control and acceleration in an abstraction layer. Now we are wondering if there is a way to hot-reload a single file and send only that to the SPIKE, instead of having to send all the boilerplate along with it every time you want to reload your runs.
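For context, this is roughly what the normal PC-side workflow looks like today; a minimal sketch assuming the current pybricksdev API (module paths and the exact signature of `PybricksHub.run` may differ between versions, and `main.py` stands in for the project's entry point):

```python
import asyncio

from pybricksdev.ble import find_device
from pybricksdev.connections.pybricks import PybricksHub


async def main():
    # Find a Pybricks hub advertising over BLE and connect to it.
    device = await find_device()
    hub = PybricksHub()
    await hub.connect(device)
    try:
        # run() compiles the whole project into one .mpy blob and downloads
        # it to the hub's RAM before starting it, so every reload re-sends
        # the PID/acceleration boilerplate along with the runs.
        await hub.run("main.py", wait=True)
    finally:
        await hub.disconnect()


asyncio.run(main())
```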
I've read through a lot of the pybricks and pybricksdev source code and concluded that all of the user's program files get compiled into a single blob of .mpy bytecode and stored in the SPIKE hub's RAM. I figured it might be possible to recompile just the runs file and overwrite it in RAM via the Pybricks GATT characteristics with Command.COMMAND_WRITE_USER_RAM. However, this isn't easy to do, because runs.py obviously changes in size and could potentially overwrite other parts of the program.
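To make that raw-write idea concrete, here is a rough sketch using bleak directly. The characteristic UUID, the command value, and the command/offset/data layout are assumptions taken from the Pybricks BLE profile rather than a supported API, real downloads also chunk the data to fit the BLE MTU, and the offset/payload names are hypothetical:

```python
import asyncio
import struct

from bleak import BleakClient
from pybricksdev.ble import find_device

# Assumed protocol details from the Pybricks BLE profile; see
# pybricksdev.ble.pybricks for the authoritative definitions.
PYBRICKS_COMMAND_EVENT_CHAR = "c5f50002-8280-46da-89f4-6d8051e4aeef"
COMMAND_WRITE_USER_RAM = 0x04  # what pybricksdev exposes as Command.COMMAND_WRITE_USER_RAM


async def patch_user_ram(offset: int, payload: bytes) -> None:
    """Overwrite part of the downloaded program in the hub's user RAM.

    This only helps if the recompiled module has exactly the same size and a
    known offset inside the single .mpy blob; if it grows, the write tramples
    whatever is stored after it, which is the problem described above.
    """
    device = await find_device()
    async with BleakClient(device) as client:
        # Message layout: <command id, uint8><offset, uint32 little-endian><data>
        message = struct.pack("<BI", COMMAND_WRITE_USER_RAM, offset) + payload
        await client.write_gatt_char(PYBRICKS_COMMAND_EVENT_CHAR, message, response=True)


# Hypothetical usage: asyncio.run(patch_user_ram(runs_offset, recompiled_runs_mpy))
```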
Do you have any suggestions or tips on how to tackle this problem? I would be very grateful.
Thank you in advance and have a nice day!