Replies: 2 comments
- I have the same error.
- I have the same error too.
-
Starting the web UI...
Warning: --cai-chat is deprecated. Use --chat instead.
===================================BUG REPORT===================================
Welcome to bitsandbytes. For bug reports, please submit your error trace to: https://github.com/TimDettmers/bitsandbytes/issues
CUDA SETUP: CUDA runtime path found: C:\Oobabooga\oobabooga-windows (4)\oobabooga-windows\installer_files\env\bin\cudart64_110.dll
CUDA SETUP: Highest compute capability among GPUs detected: 8.6
CUDA SETUP: Detected CUDA version 117
CUDA SETUP: Loading binary C:\Oobabooga\oobabooga-windows (4)\oobabooga-windows\installer_files\env\lib\site-packages\bitsandbytes\libbitsandbytes_cuda117.dll...
Loading the extension "gallery"... Ok.
Running on local URL: http://127.0.0.1:7861
To create a public link, set share=True in launch().
Traceback (most recent call last):
File "C:\Oobabooga\oobabooga-windows (4)\oobabooga-windows\installer_files\env\lib\site-packages\gradio\routes.py", line 393, in run_predict
output = await app.get_blocks().process_api(
File "C:\Oobabooga\oobabooga-windows (4)\oobabooga-windows\installer_files\env\lib\site-packages\gradio\blocks.py", line 1108, in process_api
result = await self.call_function(
File "C:\Oobabooga\oobabooga-windows (4)\oobabooga-windows\installer_files\env\lib\site-packages\gradio\blocks.py", line 929, in call_function
prediction = await anyio.to_thread.run_sync(
File "C:\Oobabooga\oobabooga-windows (4)\oobabooga-windows\installer_files\env\lib\site-packages\anyio\to_thread.py", line 31, in run_sync
return await get_asynclib().run_sync_in_worker_thread(
File "C:\Oobabooga\oobabooga-windows (4)\oobabooga-windows\installer_files\env\lib\site-packages\anyio\_backends\_asyncio.py", line 937, in run_sync_in_worker_thread
return await future
File "C:\Oobabooga\oobabooga-windows (4)\oobabooga-windows\installer_files\env\lib\site-packages\anyio\_backends\_asyncio.py", line 867, in run
result = context.run(func, *args)
File "C:\Oobabooga\oobabooga-windows (4)\oobabooga-windows\installer_files\env\lib\site-packages\gradio\utils.py", line 490, in async_iteration
return next(iterator)
File "C:\Oobabooga\oobabooga-windows (4)\oobabooga-windows\text-generation-webui\modules\chat.py", line 218, in cai_chatbot_wrapper
for history in chatbot_wrapper(text, state):
File "C:\Oobabooga\oobabooga-windows (4)\oobabooga-windows\text-generation-webui\modules\chat.py", line 155, in chatbot_wrapper
for reply in generate_reply(f"{prompt}{' ' if len(cumulative_reply) > 0 else ''}{cumulative_reply}", state, eos_token=eos_token, stopping_strings=stopping_strings):
File "C:\Oobabooga\oobabooga-windows (4)\oobabooga-windows\text-generation-webui\modules\text_generation.py", line 175, in generate_reply
input_ids = encode(question, add_bos_token=state['add_bos_token'], truncation_length=get_max_prompt_length(state))
File "C:\Oobabooga\oobabooga-windows (4)\oobabooga-windows\text-generation-webui\modules\text_generation.py", line 31, in encode
input_ids = shared.tokenizer.encode(str(prompt), return_tensors='pt', add_special_tokens=add_special_tokens)
AttributeError: 'NoneType' object has no attribute 'encode'
Hello everyone, I would like some help in solving this error! I am trying to use GPT-4 Alpaca through Oobabooga. What can I do to resolve this issue?
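For context on what the traceback is saying: the final frame shows `shared.tokenizer.encode(...)` raising `AttributeError: 'NoneType' object has no attribute 'encode'`, which means `shared.tokenizer` is still `None` when generation starts, i.e. the model/tokenizer was never actually loaded (a failed or skipped model load is the usual cause, not anything in the prompt itself). A minimal sketch of that failure mode, with a defensive check that turns it into a readable error (the `encode` helper below is a simplified stand-in, not the webui's actual function):

```python
# shared.tokenizer stays None when model loading fails or no model is selected.
tokenizer = None  # simulates the state that produced the AttributeError

def encode(prompt, tokenizer):
    # Guarding against a missing tokenizer surfaces the real problem
    # instead of the opaque "'NoneType' object has no attribute 'encode'".
    if tokenizer is None:
        raise RuntimeError(
            "Tokenizer is not loaded -- load or reload a model before generating."
        )
    return tokenizer.encode(str(prompt))

try:
    encode("Hello", tokenizer)
except RuntimeError as err:
    message = str(err)

print(message)
```

In practice this means the fix is upstream of the traceback: check the console for an earlier model-loading error (out-of-memory, missing model files, wrong model format) and make sure a model is successfully selected before chatting.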