Mistral-7B-Instruct-v0.3 has 36 layers, but this attention-pattern file only contains 32 rows (layers):
https://github.com/mit-han-lab/duo-attention/blob/main/attn_patterns/Mistral-7B-Instruct-v0.3/lr%3D0.02-reg%3D0.05-ctx%3D1000_32000-multi_passkey10/full_attention_heads.tsv
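For anyone who wants to reproduce the count, here is a minimal sketch that tallies rows and columns in such a TSV, assuming the DuoAttention format of one row per layer and one column per attention head (the helper name `count_layers_heads` and the toy 2x4 sample are illustrative, not from the repo):

```python
import csv
import io

def count_layers_heads(tsv_text: str):
    """Return (num_layers, num_heads) for a layers-x-heads TSV.

    Assumes each non-empty row corresponds to one transformer layer
    and each tab-separated column to one attention head.
    """
    rows = [r for r in csv.reader(io.StringIO(tsv_text), delimiter="\t") if r]
    if not rows:
        return (0, 0)
    return (len(rows), len(rows[0]))

# Toy example standing in for full_attention_heads.tsv: 2 layers x 4 heads.
sample = "1.0\t0.0\t1.0\t0.0\n0.0\t1.0\t0.0\t1.0\n"
print(count_layers_heads(sample))  # (2, 4)
```

Running the same check on the downloaded `full_attention_heads.tsv` and comparing the row count against `num_hidden_layers` in the model's `config.json` makes the mismatch easy to confirm.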