Using GPU on google colab #1409
Replies: 4 comments
-
@saksonita mmh this is weird... This normally works for me, but I'm also not using Google Colab. Did you try it on another GPU (e.g. Azure or local) to see if the problem still exists there?
-
Thanks for your response. I also tried on my local machine (GTX 1080 Ti), but I still encounter the same problem.
[image: image.png]
[image: image.png]
-
@leoniewgnr I have the same problem. I found that the model attributes look weird.
Does it mean the model is trained on GPU? But I do not see any GPU RAM usage.
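One way to check where a model actually lives is to inspect the device of its PyTorch parameters (NeuralProphet is built on PyTorch, so the same check applies to its underlying module). A minimal sketch, using a stand-in `nn.Linear` as a hypothetical substitute for the fitted model's module:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the PyTorch module inside the fitted model.
model = nn.Linear(4, 2)

# The device of the parameters tells you where computation actually runs;
# a freshly created module lives on CPU unless it is explicitly moved.
param_device = next(model.parameters()).device
print(param_device)  # cpu unless the model was moved with .to("cuda")
```

If this prints `cpu` while you expected GPU training, the model was never moved to the CUDA device, which would also explain seeing no GPU RAM usage.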
-
Sharing what worked for me.
-
I am stuck trying to use the GPU to run NeuralProphet on Google Colab. I configured the runtime accelerator to use the GPU, but training still uses the CPU. I need help with that.
Looking forward to your response.
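The first thing I would check is whether PyTorch can see the Colab GPU at all; NeuralProphet trains through PyTorch, so if torch cannot see the device, neither can the model. A minimal check, assuming a standard Colab runtime with torch preinstalled:

```python
import torch

if torch.cuda.is_available():
    # A CUDA device is visible to this runtime, so training *can* use it.
    print("GPU visible:", torch.cuda.get_device_name(0))
else:
    # The GPU runtime type was likely not applied, or the session
    # was not restarted after changing it (Runtime > Change runtime type).
    print("No GPU visible to PyTorch; training will fall back to CPU")
```

If this reports no GPU even after selecting the GPU accelerator, restarting the runtime usually resolves it; if it does report a GPU but training still uses CPU, the problem is in how the model is configured rather than in Colab.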