RocktimMBZ/dpo_model_merged_lr_2e_07
Safetensors
llama
Commit History (branch: main)
upload model · 11c984c · blorg469 committed 23 days ago
initial commit · b36f2de (verified) · RocktimMBZ committed 23 days ago