
Multiple GPUs not supported with Tensorflow Backend #424

@ashao

Description

Using the multi-GPU functions with TensorFlow models raises an error inside RedisAI, most likely due to an incompatibility in the call into the TensorFlow backend. Note that we can confirm multi-GPU works for Torch models.

How to reproduce

  1. Create a TensorFlow model.
  2. Use `set_model_multigpu` to load it into the database.
  3. An error is raised in the database.
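The steps above can be sketched with SmartRedis's Python client. This is a minimal reproduction sketch, not a definitive script: the database address, the `model.pb` path, and the `input`/`output` tensor names are hypothetical placeholders, and a RedisAI-enabled Redis with visible GPUs is assumed to be running.

```python
def gpu_devices(first_gpu, num_gpus):
    """RedisAI-style device strings for a contiguous GPU range,
    e.g. gpu_devices(0, 2) -> ["GPU:0", "GPU:1"]."""
    return [f"GPU:{i}" for i in range(first_gpu, first_gpu + num_gpus)]


def reproduce(db_address="127.0.0.1:6379"):
    """Steps 1-3: load a TF model onto multiple GPUs via SmartRedis.
    Requires a running RedisAI database; the error is raised by the
    set_model_multigpu call."""
    from smartredis import Client  # pip install smartredis

    client = Client(address=db_address, cluster=False)

    # Step 1: a serialized (frozen) TensorFlow graph; path is hypothetical.
    with open("model.pb", "rb") as f:
        model = f.read()

    # Step 2: set the model on GPUs 0..1. The TF backend also requires
    # input/output tensor names (placeholders here).
    client.set_model_multigpu(
        "repro_model", model, "TF",
        first_gpu=0, num_gpus=2,
        inputs=["input"], outputs=["output"],
    )
    # Step 3: RedisAI raises the error during the call above.
```

Calling `reproduce()` against a live multi-GPU database should surface the backend error; the equivalent call with `backend="TORCH"` and a TorchScript model succeeds.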

Expected behavior

Users should be able to set TensorFlow models and deploy them on multi-GPU machines.

Metadata

Assignees

No one assigned

    Labels

    area: third-party (Issues related to dependencies and third-party package integrations); bug: major (A major bug)
