Barcenas-10b

Based on tiiuae/Falcon3-10B-Instruct and fine-tuned on the yahma/alpaca-cleaned dataset.

The objective of this model is to explore fine-tuning the new Falcon 3 models.
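Records in alpaca-cleaned are instruction/input/output triples, and fine-tunes on it conventionally render them with the standard Alpaca prompt template. A minimal sketch of that template follows, assuming this model kept the convention (the function name `build_alpaca_prompt` is hypothetical, not part of this repository):

```python
def build_alpaca_prompt(instruction: str, inp: str = "") -> str:
    """Render one alpaca-cleaned record into a prompt string.

    Assumption: this model follows the standard Alpaca template;
    check the training code before relying on the exact wording.
    """
    if inp:
        # Records that carry an "input" field use the longer preamble.
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{inp}\n\n"
            "### Response:\n"
        )
    # Input-free records drop the "### Input:" section entirely.
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )

prompt = build_alpaca_prompt("Summarize the Falcon 3 release in one sentence.")
```

At inference time the same template should be applied and generation read from after the `### Response:` marker, mirroring how the training targets were laid out.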

Made with ❤️ in Guadalupe, Nuevo Leon, Mexico 🇲🇽

Model size: 10.3B params (FP16, Safetensors).