This PEFT adapter has been trained with Flower, a friendly federated AI framework.
The adapter and benchmark results have been submitted to the FlowerTune LLM NLP Leaderboard.
Please check the following GitHub project for model details and evaluation results:
https://github.com/mrs83/FlowerTune-xLSTM-7b-NLP
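For context, the sketch below illustrates, in heavily simplified form, how a Flower `NumPyClient` can federate PEFT fine-tuning by exchanging only the adapter weights with the server. This is not the actual FlowerTune training code; the server address, example count, and local training step are placeholders, and the real setup lives in the GitHub project above.

```python
# Simplified sketch of federated PEFT fine-tuning with Flower (illustrative only).
# Requires the packages and the transformers fork installed as described below.
import flwr as fl
import torch
from peft import PeftModel, get_peft_model_state_dict, set_peft_model_state_dict
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("NX-AI/xLSTM-7b")
model = PeftModel.from_pretrained(base, "mrs83/FlowerTune-xLSTM-7b-NLP-PEFT", is_trainable=True)

class FlowerTuneClient(fl.client.NumPyClient):
    def get_parameters(self, config):
        # Only the small adapter tensors travel between client and server.
        return [t.detach().cpu().numpy() for t in get_peft_model_state_dict(model).values()]

    def set_parameters(self, parameters):
        keys = get_peft_model_state_dict(model).keys()
        set_peft_model_state_dict(model, {k: torch.tensor(v) for k, v in zip(keys, parameters)})

    def fit(self, parameters, config):
        self.set_parameters(parameters)
        # ... local instruction tuning on this client's NLP shard goes here ...
        return self.get_parameters(config), 1, {}  # placeholder example count

# Placeholder server address for a locally started Flower server.
fl.client.start_numpy_client(server_address="127.0.0.1:8080", client=FlowerTuneClient())
```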
First, install the `xlstm` and `mlstm_kernels` packages:

```bash
pip install xlstm
pip install mlstm_kernels
```
For now, install the `transformers` fork from NX-AI (until the xLSTM integration is merged upstream):

```bash
pip install 'transformers @ git+ssh://git@github.com/NX-AI/transformers.git@integrate_xlstm'
```
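As an optional sanity check (assuming the installs above succeeded), you can verify that the packages import and that the forked `transformers` resolves the xLSTM configuration from the Hub:

```python
# Optional sanity check for the installation steps above.
import xlstm            # noqa: F401
import mlstm_kernels    # noqa: F401
from transformers import AutoConfig

config = AutoConfig.from_pretrained("NX-AI/xLSTM-7b")
print(type(config).__name__)
```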
Use this model as:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("NX-AI/xLSTM-7b")
model = PeftModel.from_pretrained(base_model, "mrs83/FlowerTune-xLSTM-7b-NLP-PEFT")
```
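Once the adapter is loaded, generation works through the standard `transformers` API; the prompt and decoding settings below are only illustrative:

```python
# Illustrative generation example (prompt and settings are arbitrary).
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("NX-AI/xLSTM-7b")
inputs = tokenizer("Summarize federated learning in one sentence:", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```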
A `bitsandbytes` quantization config was used during training; the exact values are not reproduced in this card.
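As an illustration only, a typical 4-bit NF4 setup with `bitsandbytes` is configured as shown below; these values are assumptions, not the settings actually used for this adapter:

```python
# Illustrative bitsandbytes config; NOT the exact values used for this adapter.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
    bnb_4bit_use_double_quant=True,
)
base_model = AutoModelForCausalLM.from_pretrained("NX-AI/xLSTM-7b", quantization_config=bnb_config)
```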
Base model: NX-AI/xLSTM-7b