# OpenMoE
---
datasets:
  - wikitext-2-v1
  - yizhongw/self_instruct
language:
  - en
library_name: transformers
metrics:
  - crossentropy
---
1. Install ColossalAI:

```bash
git clone https://github.com/marsggbo/ColossalAI
cd ColossalAI
pip install -e .
```
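A quick sanity check that the editable install is importable (a minimal sketch; the version string is whatever your checkout reports):

```python
# Verify the editable ColossalAI install is visible to Python.
import colossalai

print(colossalai.__version__)
```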
2. Download the checkpoint (`.pth`) file, e.g. with `huggingface_hub` as sketched below.
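A hedged sketch of the download step using `huggingface_hub`; the `repo_id` below is a placeholder, not confirmed by this README, so replace it with the repository that actually hosts the checkpoint:

```python
from huggingface_hub import hf_hub_download

# repo_id is hypothetical -- point it at the repo hosting the .pth file.
ckpt_path = hf_hub_download(
    repo_id="marsggbo/openmoe-base-yizhongw",  # placeholder
    filename="openmoe_base_yizhongw_super_natural_instruction_generation.pth",
)
```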
3. Load the state dict:

```python
import torch
from transformers import T5Tokenizer
from transformers.models.llama import LlamaConfig

# OpenMoeForCausalLM is provided by the ColossalAI OpenMoE example code
# (e.g. examples/language/openmoe/model/modeling_openmoe.py in the checkout
# above); adjust the import path to match your installation.
from model.modeling_openmoe import OpenMoeForCausalLM

config = LlamaConfig.from_pretrained("hpcaitech/openmoe-base")
model = OpenMoeForCausalLM(config)

ckpt = torch.load("openmoe_base_yizhongw_super_natural_instruction_generation.pth")

# Checkpoints saved from a DDP-wrapped model prefix every key with "module.";
# strip the prefix so the keys match the bare model.
state_dict = {}
for key, value in ckpt.items():
    if key.startswith("module."):
        state_dict[key[7:]] = value
    else:
        state_dict[key] = value

model.load_state_dict(state_dict)
```
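With the weights loaded, a minimal generation sketch follows. The tokenizer choice (`google/umt5-small`, matching the OpenMoE base model's tokenizer) is an assumption not stated in this README; swap in whichever tokenizer matches your checkpoint:

```python
# Hypothetical usage: the tokenizer repo is an assumption, not confirmed here.
tokenizer = T5Tokenizer.from_pretrained("google/umt5-small")

inputs = tokenizer("What is the capital of France?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```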