ChatGLM-6B model loading issue #7

@Alternate-D

Description

Hello, author! After I enter text in the front-end input box, ChatGLM-6B inference fails with the following error:

INFO: Application startup complete.
INFO: 127.0.0.1:33704 - "OPTIONS / HTTP/1.1" 200 OK
Start generating...
The dtype of attention mask (torch.int64) is not bool
Traceback (most recent call last):
  File "/media/ln01/2t/usr/cw/ChatZoo/generator/chatbot.py", line 64, in chat
    output = self.generate(input_dict, gen_kwargs)
  File "/media/ln01/2t/usr/cw/ChatZoo/generator/transformersbot.py", line 69, in generate
    return self.model.generate(**input_dict, **gen_kwargs)
  File "/home/ln01/miniconda3/envs/ChatZoo/lib/python3.8/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/home/ln01/miniconda3/envs/ChatZoo/lib/python3.8/site-packages/transformers/generation/utils.py", line 1485, in generate
    return self.sample(
  File "/home/ln01/miniconda3/envs/ChatZoo/lib/python3.8/site-packages/transformers/generation/utils.py", line 2524, in sample
    outputs = self(
  File "/home/ln01/miniconda3/envs/ChatZoo/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/ln01/miniconda3/envs/ChatZoo/lib/python3.8/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/media/ln01/2t/usr/cw/ChatZoo/generator/models/chatglm/modeling_chatglm.py", line 1178, in forward
    transformer_outputs = self.transformer(
  File "/home/ln01/miniconda3/envs/ChatZoo/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/media/ln01/2t/usr/cw/ChatZoo/generator/models/chatglm/modeling_chatglm.py", line 984, in forward
    layer_ret = layer(
  File "/home/ln01/miniconda3/envs/ChatZoo/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/ln01/miniconda3/envs/ChatZoo/lib/python3.8/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/media/ln01/2t/usr/cw/ChatZoo/generator/models/chatglm/modeling_chatglm.py", line 624, in forward
    attention_input = self.input_layernorm(hidden_states)
  File "/home/ln01/miniconda3/envs/ChatZoo/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1501, in _call_impl
    return forward_call(*args, **kwargs)
  File "/home/ln01/miniconda3/envs/ChatZoo/lib/python3.8/site-packages/accelerate/hooks.py", line 165, in new_forward
    output = old_forward(*args, **kwargs)
  File "/home/ln01/miniconda3/envs/ChatZoo/lib/python3.8/site-packages/torch/nn/modules/normalization.py", line 190, in forward
    return F.layer_norm(
  File "/home/ln01/miniconda3/envs/ChatZoo/lib/python3.8/site-packages/torch/nn/functional.py", line 2515, in layer_norm
    return torch.layer_norm(input, normalized_shape, weight, bias, eps, torch.backends.cudnn.enabled)
RuntimeError: expected scalar type Half but found Float

Could you please help me with this? I would be very grateful!
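For reference, the failure at the bottom of the traceback is a dtype mismatch inside `input_layernorm`: the model's weights are in half precision while the incoming hidden states are float32. The sketch below is a minimal standalone reproduction of that mismatch, not the actual ChatZoo code; the module and tensor names are hypothetical.

```python
import torch

# A LayerNorm whose weights are float16 (as after loading a model with
# .half()), fed hidden states that are still float32 -- the same
# combination that triggers "expected scalar type Half but found Float".
ln = torch.nn.LayerNorm(8).half()      # module weights in half precision
hidden = torch.randn(2, 8)             # hidden states still in float32

print(ln.weight.dtype, hidden.dtype)   # torch.float16 torch.float32 -> mismatch

# A common fix is to make the dtypes agree before the forward pass,
# e.g. by casting the inputs to the module's parameter dtype:
hidden = hidden.to(ln.weight.dtype)
print(hidden.dtype)                    # torch.float16
```

This only illustrates the dtype rule; in the real setup the fix would be applied where the inputs (or the model) are prepared, so that both sides of the forward pass use the same precision.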
