Missing model.layers.*.self_attn.rotary_emb.inv_freq weights

#11
opened by viktoroo

I am loading the model like this:

```python
import torch
import transformers

# path_raw and device are defined elsewhere in my setup
model_raw: transformers.PreTrainedModel = transformers.AutoModelForCausalLM.from_pretrained(
    path_raw,
    device_map={"": torch.device(device)},
    torch_dtype=torch.float32,
    low_cpu_mem_usage=True,
)
```

When I inspect the weights, there are no inv_freq tensors for the rotary embeddings. You can Ctrl+F here to confirm that they are supposed to be present.
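
For reference, here is how the mismatch shows up; a minimal sketch, assuming a transformers version where LlamaRotaryEmbedding registers inv_freq with persistent=False, so it exists as a runtime buffer but is never written to the saved checkpoint:

```python
# All buffers show up here, persistent or not:
buffer_keys = [name for name, _ in model_raw.named_buffers() if "inv_freq" in name]
print(buffer_keys)  # ['model.layers.0.self_attn.rotary_emb.inv_freq', ...]

# Non-persistent buffers are excluded from the state dict, so this comes back empty:
state_keys = [name for name in model_raw.state_dict() if "inv_freq" in name]
print(state_keys)  # []
```

If that is what is happening, the weights are not actually lost: inv_freq is recomputed from the config every time the model is constructed.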

I am trying (and failing) to apply this diff: https://huggingface.co/epfml/landmark-attention-llama7b-wdiff
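
One workaround that might unblock the diff script, sketched under the assumption that it fails on the missing rotary_emb.inv_freq keys: inv_freq is deterministic given the config (the standard RoPE inverse-frequency formula, 1 / base^(2i/d)), so it can be recomputed and injected into the state dict before the diff is applied. add_inv_freq below is a hypothetical helper, not something from the wdiff repo:

```python
import torch

def add_inv_freq(state_dict, num_layers=32, head_dim=128, base=10000.0):
    # Recompute the RoPE inverse frequencies and add them under the keys
    # the diff script expects. num_layers=32 and head_dim=128 are the
    # LLaMA-7B values; adjust for other model sizes.
    inv_freq = 1.0 / (base ** (torch.arange(0, head_dim, 2).float() / head_dim))
    for i in range(num_layers):
        key = f"model.layers.{i}.self_attn.rotary_emb.inv_freq"
        state_dict.setdefault(key, inv_freq.clone())
    return state_dict

patched = add_inv_freq(model_raw.state_dict())
```

Whether this is enough depends on what else the script validates, but it at least makes the key sets match.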
