Neuria_BERT_Contexto_2025_02_06

This model is a fine-tuned version of dccuchile/bert-base-spanish-wwm-cased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0410
  • Accuracy: 0.9318
  • Precision Micro: 0.9741
  • Recall Micro: 0.9464
  • F1 Micro: 0.9601
  • F1 Macro: 0.9682
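The micro scores pool every label decision across the evaluation set before computing the metric, while the macro score averages per-label F1, so the two diverge when labels are imbalanced. (The fact that micro precision and micro recall differ suggests a multi-label setup; the card does not publish the dataset or label set, so the predictions below are purely illustrative.) A minimal, dependency-free sketch of the difference:

```python
# Hypothetical multi-label predictions over 3 labels (illustrative only;
# the card's actual dataset and labels are not published).
y_true = [[1, 0, 1], [0, 1, 0], [1, 1, 0], [0, 0, 1]]
y_pred = [[1, 0, 1], [0, 1, 1], [1, 0, 0], [0, 0, 1]]

def f1(tp, fp, fn):
    return 2 * tp / (2 * tp + fp + fn) if tp else 0.0

n_labels = len(y_true[0])
per_label = []
tp_all = fp_all = fn_all = 0
for j in range(n_labels):
    tp = sum(t[j] and p[j] for t, p in zip(y_true, y_pred))
    fp = sum(p[j] and not t[j] for t, p in zip(y_true, y_pred))
    fn = sum(t[j] and not p[j] for t, p in zip(y_true, y_pred))
    per_label.append(f1(tp, fp, fn))
    tp_all += tp
    fp_all += fp
    fn_all += fn

f1_micro = f1(tp_all, fp_all, fn_all)   # pool all decisions, then score
f1_macro = sum(per_label) / n_labels    # score each label, then average
print(round(f1_micro, 4), round(f1_macro, 4))  # → 0.8333 0.8222
```

Here micro F1 exceeds macro F1 because the rarest label is also the worst-predicted one; in the card's own numbers the macro score is slightly higher, indicating fairly even per-label performance.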

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
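The training script is not published, but assuming the standard Hugging Face `Trainer` (the card lists Transformers 4.44.1), the hyperparameters above map onto `TrainingArguments` roughly as follows. The output directory is a placeholder:

```python
from transformers import TrainingArguments

# Sketch only: reconstructs the card's listed hyperparameters; the actual
# training script is not published. output_dir is a placeholder.
args = TrainingArguments(
    output_dir="Neuria_BERT_Contexto_2025_02_06",
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=4,  # total train batch size: 4 * 4 = 16
    lr_scheduler_type="linear",
    num_train_epochs=50,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the defaults
    # (adam_beta1, adam_beta2, adam_epsilon), so nothing to override.
)
```

Note that gradient accumulation explains the non-integer epoch values in the results table below: optimizer steps are counted in units of 4 mini-batches.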

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision Micro | Recall Micro | F1 Micro | F1 Macro |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:---------------:|:------------:|:--------:|:--------:|
| 0.3099 | 0.9979 | 117 | 0.1863 | 0.5160 | 0.9382 | 0.5846 | 0.7203 | 0.4031 |
| 0.1410 | 1.9957 | 234 | 0.1098 | 0.7846 | 0.9744 | 0.8291 | 0.8959 | 0.7057 |
| 0.0778 | 2.9936 | 351 | 0.0773 | 0.8571 | 0.9759 | 0.8811 | 0.9261 | 0.7911 |
| 0.0482 | 4.0 | 469 | 0.0655 | 0.8891 | 0.9713 | 0.9079 | 0.9385 | 0.8397 |
| 0.0346 | 4.9979 | 586 | 0.0560 | 0.9083 | 0.9787 | 0.9229 | 0.9500 | 0.8944 |
| 0.0260 | 5.9957 | 703 | 0.0517 | 0.9147 | 0.9688 | 0.9363 | 0.9523 | 0.9202 |
| 0.0202 | 6.9936 | 820 | 0.0452 | 0.9147 | 0.9703 | 0.9313 | 0.9504 | 0.9357 |
| 0.0167 | 8.0 | 938 | 0.0456 | 0.9339 | 0.9808 | 0.9414 | 0.9607 | 0.9575 |
| 0.0141 | 8.9979 | 1055 | 0.0423 | 0.9339 | 0.9691 | 0.9447 | 0.9567 | 0.9439 |
| 0.0124 | 9.9957 | 1172 | 0.0411 | 0.9296 | 0.9773 | 0.9380 | 0.9573 | 0.9424 |
| 0.0107 | 10.9936 | 1289 | 0.0404 | 0.9296 | 0.9756 | 0.9363 | 0.9556 | 0.9409 |
| 0.0093 | 12.0 | 1407 | 0.0469 | 0.9318 | 0.9690 | 0.9414 | 0.9550 | 0.9401 |
| 0.0084 | 12.9979 | 1524 | 0.0389 | 0.9275 | 0.9791 | 0.9414 | 0.9599 | 0.9683 |
| 0.0075 | 13.9957 | 1641 | 0.0374 | 0.9382 | 0.9776 | 0.9497 | 0.9635 | 0.9711 |
| 0.0069 | 14.9936 | 1758 | 0.0414 | 0.9232 | 0.9640 | 0.9430 | 0.9534 | 0.9408 |
| 0.0063 | 16.0 | 1876 | 0.0406 | 0.9296 | 0.9724 | 0.9447 | 0.9584 | 0.9571 |
| 0.0058 | 16.9979 | 1993 | 0.0421 | 0.9232 | 0.9706 | 0.9397 | 0.9549 | 0.9551 |
| 0.0057 | 17.9957 | 2110 | 0.0409 | 0.9254 | 0.9723 | 0.9397 | 0.9557 | 0.9415 |
| 0.0050 | 18.9936 | 2227 | 0.0412 | 0.9232 | 0.9706 | 0.9414 | 0.9558 | 0.9424 |
| 0.0046 | 20.0 | 2345 | 0.0417 | 0.9275 | 0.9740 | 0.9430 | 0.9583 | 0.9565 |
| 0.0043 | 20.9979 | 2462 | 0.0403 | 0.9318 | 0.9741 | 0.9464 | 0.9601 | 0.9689 |
| 0.0041 | 21.9957 | 2579 | 0.0415 | 0.9360 | 0.9775 | 0.9464 | 0.9617 | 0.9592 |
| 0.0038 | 22.9936 | 2696 | 0.0412 | 0.9318 | 0.9758 | 0.9447 | 0.9600 | 0.9690 |
| 0.0035 | 24.0 | 2814 | 0.0410 | 0.9318 | 0.9741 | 0.9464 | 0.9601 | 0.9682 |
(table continues below)
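The card includes no usage snippet. Since the metrics suggest a multi-label head (an assumption, not confirmed by the card), the model's logits would typically be decoded with a per-label sigmoid and a threshold rather than a softmax. A dependency-free sketch of that decoding step, with hypothetical label names:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def decode(logits, id2label, threshold=0.5):
    """Multi-label decode: keep every label whose sigmoid probability
    clears the threshold. (Multi-label is an assumption based on the
    card's micro/macro metrics, not a documented fact.)"""
    return [id2label[i] for i, z in enumerate(logits) if sigmoid(z) > threshold]

# Dummy logits and hypothetical label names for illustration only.
id2label = {0: "contexto_a", 1: "contexto_b", 2: "contexto_c"}
print(decode([2.0, -1.5, 0.3], id2label))  # → ['contexto_a', 'contexto_c']
```

In practice the logits would come from `AutoModelForSequenceClassification.from_pretrained("neuria99/Neuria_BERT_Contexto_2025_02_06")`, and the real label names from `model.config.id2label`.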

Framework versions

  • Transformers 4.44.1
  • PyTorch 2.4.1
  • Datasets 2.19.1
  • Tokenizers 0.19.1
Model size

  • 110M params (Safetensors, F32 tensors)
