a continually pretrained phi3-mini sparse moe upcycle
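for context, "sparse moe upcycle" here means taking the dense phi-3-mini MLP weights, copying them into several experts, and adding a router that picks a couple of experts per token, then continuing pretraining. the sketch below is just a plain-pytorch illustration of that idea (module names and sizes are made up), not the actual code used to build this model:

```python
# illustrative sketch of sparse MoE upcycling (NOT the actual phi3-4x4b training code):
# copy a dense FFN's weights into N experts, add a freshly-initialized router,
# and send each token through its top-k experts.
import copy
import torch
import torch.nn as nn

class UpcycledMoE(nn.Module):
    def __init__(self, dense_mlp: nn.Module, hidden_size: int, num_experts: int = 4, top_k: int = 2):
        super().__init__()
        # each expert starts as an exact copy of the original dense MLP
        self.experts = nn.ModuleList([copy.deepcopy(dense_mlp) for _ in range(num_experts)])
        # the router is new and gets trained from scratch during continued pretraining
        self.router = nn.Linear(hidden_size, num_experts, bias=False)
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, hidden)
        logits = self.router(x)                                   # (b, s, num_experts)
        weights, idx = torch.topk(logits.softmax(dim=-1), self.top_k, dim=-1)
        weights = weights / weights.sum(dim=-1, keepdim=True)     # renormalize over chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., slot] == e
                if mask.any():
                    out[mask] += weights[..., slot][mask].unsqueeze(-1) * expert(x[mask])
        return out

# usage: wrap a dense MLP of matching hidden size
dense = nn.Sequential(nn.Linear(64, 256), nn.SiLU(), nn.Linear(256, 64))
moe = UpcycledMoE(dense, hidden_size=64, num_experts=4, top_k=2)
print(moe(torch.randn(2, 8, 64)).shape)  # torch.Size([2, 8, 64])
```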
| Benchmark | Microsoft/phi-3-4k-instruct | Fizzarolli/phi3-4x4b-v1 |
|---|---|---|
| MMLU acc. (0-shot) | 0.6799 | 0.6781 |
| Hellaswag acc. (0-shot) | 0.6053 | 0.5962 |
| ARC-E acc. (0-shot) | 0.8325 | 0.8367 |
| ARC-C acc. (0-shot) | 0.5546 | 0.5606 |
honestly i was expecting it to do worse :p, but those are all within the margin of error! so it didn't lose any performance, at least
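if you want to reproduce numbers like these yourself, something along these lines with lm-evaluation-harness should get you there; exact task names and result keys vary between harness versions, so treat this as a sketch rather than the exact eval setup used above:

```python
# rough sketch of a 0-shot eval with lm-evaluation-harness (pip install lm-eval);
# task names and metric keys depend on the harness version you have installed.
import json
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=Fizzarolli/phi3-4x4b-v1,dtype=bfloat16",
    tasks=["mmlu", "hellaswag", "arc_easy", "arc_challenge"],
    num_fewshot=0,
    batch_size=8,
)
print(json.dumps(results["results"], indent=2, default=str))
```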
todo!
please i need money to stay alive and keep making models
not trained on instruct data. it's pretty likely it won't behave much differently from phi-3 if you prompt it that way, if not worse, due to forgetting of the instruct format during the continued pretraining.
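as a rough usage sketch (assuming the repo loads through transformers; this isn't an official snippet), treat it like any other base model and do plain text completion instead of applying a chat template:

```python
# minimal sketch: load as a plain base model and do raw completion (no chat template),
# since the continued pretraining didn't use instruct data.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "Fizzarolli/phi3-4x4b-v1"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,  # the custom MoE architecture may need remote code
)

prompt = "Mixture-of-experts language models work by"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```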