Since the latest transformers release is already a requirement and includes the pull request below, we can use the new RoPE scaling setting:
huggingface/transformers#24653
Here is the documentation for it: https://huggingface.co/docs/transformers/main/en/model_doc/llama#transformers.LlamaConfig.rope_scaling
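As a minimal sketch of how the setting is used (the scaling type and factor here are illustrative, not values this change prescribes), `rope_scaling` is passed as a dict with a `type` of `"linear"` or `"dynamic"` and a float `factor` greater than 1:

```python
from transformers import LlamaConfig

# factor 2.0 roughly doubles the context length the rotary embeddings
# cover; "dynamic" is the other supported scaling strategy.
config = LlamaConfig(rope_scaling={"type": "linear", "factor": 2.0})
print(config.rope_scaling)
```

The same dict can be passed through `from_pretrained` as a keyword argument, so no config file edit is needed to try different factors.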