Open
Labels: area: third-party (issues related to dependencies and third-party package integrations), bug: major (a major bug)
Description
Using the multi-GPU functions with TensorFlow models raises an error within RedisAI. This appears to be an incompatibility in the call to the TensorFlow backend. Note that we can confirm multi-GPU works for Torch models.
How to reproduce
- Create a TensorFlow model
- Use `set_model_multigpu` to load it into the database
- An error is raised in the database
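A minimal reproduction sketch of the steps above, assuming a RedisAI-enabled Redis database is reachable (e.g. via the `SSDB` environment variable) and a TensorFlow model has already been serialized to a frozen-graph protobuf; the model name, file path, and tensor names here are illustrative, not taken from the original report:

```python
# Hypothetical reproduction sketch -- requires a running RedisAI database
# with GPU support, so it cannot run standalone.
from smartredis import Client

client = Client(cluster=False)

# Read a TensorFlow model previously serialized as a frozen-graph protobuf.
with open("model.pb", "rb") as f:
    model_bytes = f.read()

# The equivalent call with backend="TORCH" works; with backend="TF" the
# database raises an error when the model is set.
client.set_model_multigpu(
    "my_model",       # key under which the model is stored
    model_bytes,
    "TF",
    first_gpu=0,
    num_gpus=2,
    inputs=["input"],    # TF models require graph input/output tensor names
    outputs=["output"],
)
```

The `set_model_multigpu` signature is taken from the SmartRedis Python client; since the error originates inside RedisAI, the same failure would be expected from the C, C++, and Fortran clients as well.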
Expected behavior
Users should be able to set TensorFlow models and deploy them on multi-GPU machines.