mllmTeam/minicpm-2b-dpo-mllm
minicpm-2b-dpo-mllm/minicpm-2b-dpo-q4_k.mllm (branch: main)
Commit History
Upload folder using huggingface_hub
c2f5a93 (verified), committed by mllmTeam on Aug 29