mii-llm / propaganda-dpo-sx-v0.1
Safetensors · qwen2
Commit History
Upload folder using huggingface_hub
02bf869 (verified) · giux78 committed 15 days ago
initial commit
37016c0 (verified) · giux78 committed 15 days ago