
trying to run forecaster and I get this error: 'base_model.model.model.model.embed_tokens' #186

Open

Description

@rumcode

I slightly modified the code that I copied from the forecaster page, and I ran into an error. Any suggestions? Thanks in advance.
The code is:

"""
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel
import torch


base_model = AutoModelForCausalLM.from_pretrained(
    'meta-llama/Llama-2-7b-chat-hf',
    trust_remote_code=True,
    device_map="auto",
    torch_dtype=torch.float16,
    token='mytoken # optional if you have enough VRAM
)

tokenizer = AutoTokenizer.from_pretrained('meta-llama/Llama-2-7b-chat-hf',token='mytoken')
print("hi")
model = PeftModel.from_pretrained(base_model, 'FinGPT/fingpt-forecaster_dow30_llama2-7b_lora',token='mytoken')
print("hi2")
model = model.eval()


The error messages are:

C:\Users\xx\AppData\Roaming\Python\Python311\site-packages\torch\nn\modules\module.py:2047: UserWarning: for base_model.model.model.layers.31.mlp.down_proj.lora_B.default.weight: copying from a non-meta parameter in the checkpoint to a meta parameter in the current model, which is a no-op. (Did you mean to pass `assign=True` to assign items in the state dictionary to their corresponding key in the module instead of copying them in place?)
  warnings.warn(f'for {key}: copying from a non-meta parameter in the checkpoint to a meta '
Traceback (most recent call last):

  File c:\ProgramData\Anaconda3\Lib\site-packages\spyder_kernels\py3compat.py:356 in compat_exec
    exec(code, globals, locals)

  File c:\users\rruffley 3677\downloads\fingpt20240715try2.py:23
    model = PeftModel.from_pretrained(base_model, 'FinGPT/fingpt-forecaster_dow30_llama2-7b_lora',token='mytoken')

  File ~\AppData\Roaming\Python\Python311\site-packages\peft\peft_model.py:430 in from_pretrained
    model.load_adapter(model_id, adapter_name, is_trainable=is_trainable, **kwargs)

  File ~\AppData\Roaming\Python\Python311\site-packages\peft\peft_model.py:1022 in load_adapter
    self._update_offload(offload_index, adapters_weights)

  File ~\AppData\Roaming\Python\Python311\site-packages\peft\peft_model.py:908 in _update_offload
    safe_module = dict(self.named_modules())[extended_prefix]

KeyError: 'base_model.model.model.model.embed_tokens'
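For context, the KeyError is raised inside PEFT's offload handling (`_update_offload`), which only runs when `device_map="auto"` has offloaded part of the base model to CPU or disk (the meta-parameter warning above points the same way). Below is a minimal sketch of a possible workaround that keeps the whole base model on one GPU so the offload path is never taken; this is an assumption about the cause, not a confirmed fix, it requires roughly 14 GB of VRAM for the 7B model in fp16, and `'mytoken'` is a placeholder for a real Hugging Face token.

from transformers import AutoTokenizer, AutoModelForCausalLM
from peft import PeftModel
import torch

# Pin every module to GPU 0 instead of letting accelerate offload layers,
# so PeftModel.from_pretrained never enters _update_offload.
base_model = AutoModelForCausalLM.from_pretrained(
    'meta-llama/Llama-2-7b-chat-hf',
    trust_remote_code=True,
    device_map={'': 0},          # whole model on a single device
    torch_dtype=torch.float16,
    token='mytoken',             # placeholder token
)

tokenizer = AutoTokenizer.from_pretrained('meta-llama/Llama-2-7b-chat-hf', token='mytoken')
model = PeftModel.from_pretrained(
    base_model,
    'FinGPT/fingpt-forecaster_dow30_llama2-7b_lora',
    token='mytoken',
)
model = model.eval()

If VRAM is not sufficient for that, quantized loading (e.g. passing a BitsAndBytesConfig via quantization_config) may be an alternative, but that is untested here.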
