Taichi on mobile devices? #8774

@MarcWeber

Description

I'm having a somewhat hard time understanding which technology I should use to display some pixels. Taichi could be a good choice, so I will probably use taichi.js with wgpu and xrgpu (experimental, can be enabled in the latest Chrome versions).
I understand that shipping the LLVM toolchain inside an app is huge overhead.

But wouldn't it be easy to run Python on a mobile device and then send the Taichi kernels to a server, which AOT-compiles them and sends them back to the device? The workflow would then be as simple as starting a server and running the client. The main advantage would be not having to care about complicated ahead-of-time compilation setups or about tracking which kernels you do and don't use. Also, what if the kernels must adapt to the data, so that you have to create them as needed? I don't know enough details about how well taichi.js optimizes kernels, but sending Python strings to a server that returns WGSL or similar might also allow using the optimizing LLVM toolchain. Maybe it's not the best choice for production apps (which could indeed just cache the compiled kernels), but it might be the easiest setup during development.
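To make the proposal concrete, here is a minimal sketch of what such a compile server could look like, using only the Python standard library. The actual AOT step is a placeholder (`compile_kernel` just returns a stub string); a real server would invoke the Taichi AOT toolchain there, and the endpoint shape, caching strategy, and artifact format are all assumptions, not an existing Taichi API.

```python
import hashlib
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Cache keyed by a hash of the kernel source, so repeated requests
# for the same kernel skip recompilation (the "just cache" case above).
CACHE = {}

def compile_kernel(source: str) -> str:
    """Placeholder for the real AOT step (e.g. Taichi source -> WGSL/SPIR-V)."""
    key = hashlib.sha256(source.encode()).hexdigest()
    if key not in CACHE:
        # A real server would run the Taichi AOT toolchain here
        # and store the resulting shader/module bytes instead.
        CACHE[key] = f"// compiled artifact for kernel {key[:8]}"
    return CACHE[key]

class CompileHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Client POSTs the Python kernel source as the request body.
        length = int(self.headers.get("Content-Length", 0))
        source = self.rfile.read(length).decode()
        artifact = compile_kernel(source)
        body = json.dumps({"artifact": artifact}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # The mobile client would POST kernel sources here and load
    # the returned artifact into its GPU runtime.
    HTTPServer(("127.0.0.1", 8000), CompileHandler).serve_forever()
```

Because the cache key is the source hash, data-dependent kernels generated on the fly would each compile once and then be served from the cache on later requests.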

Metadata

Assignees: no one assigned
Labels: feature request (Suggest an idea on this project)
Type: no type
Projects: status Untriaged
Milestone: no milestone
Relationships: none yet
Development: no branches or pull requests