PDS DPO (pdsdpo)
https://pds-dpo.github.io/
AI & ML interests: None yet
Organizations: None yet
pdsdpo's activity
Liked 2 models (3 months ago):
pdsdpo/PDS-DPO-7B
Image-Text-to-Text • Updated Dec 26, 2024 • 29 downloads • 1 like
pdsdpo/PDS-DPO-7B-LoRA
Image-Text-to-Text • Updated Dec 26, 2024 • 14 downloads • 1 like
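Both checkpoints are tagged Image-Text-to-Text, so they can presumably be loaded through the generic transformers image-text-to-text pipeline. The snippet below is a minimal sketch under that assumption: the model id comes from the listing above, while the prompt, image URL, and generation settings are illustrative placeholders not taken from any model card.

# Minimal sketch, assuming pdsdpo/PDS-DPO-7B works with the standard
# image-text-to-text pipeline and ships a processor/chat template.
# The image URL and question below are placeholders.
from transformers import pipeline

pipe = pipeline("image-text-to-text", model="pdsdpo/PDS-DPO-7B")

# One user turn containing an image and a text question.
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/sample.jpg"},
            {"type": "text", "text": "What is shown in this image?"},
        ],
    }
]

outputs = pipe(text=messages, max_new_tokens=64, return_full_text=False)
print(outputs[0]["generated_text"])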