
PhoBert_Lexical_Dataset55K

This model is a fine-tuned version of vinai/phobert-base-v2 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0598
  • Accuracy: 0.7970
  • F1: 0.8468
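
The card does not document the task or label set, so the following is a minimal inference sketch that assumes a sequence-classification head (the Accuracy/F1 metrics suggest classification). Note that PhoBERT models expect word-segmented Vietnamese input (e.g. produced with VnCoreNLP's RDRSegmenter).

```python
# Minimal inference sketch. Assumptions: sequence-classification head;
# the label set / id2label mapping is not documented on this card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "phunganhsang/PhoBert_Lexical_Dataset55K"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# PhoBERT expects pre-segmented Vietnamese (words joined with "_");
# this example string is already segmented.
text = "Chúng_tôi là những nghiên_cứu_viên ."

inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])
```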

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 15
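
A sketch of the corresponding Trainer configuration, assuming the standard Hugging Face Trainer API; the output_dir and the 200-step evaluation interval are inferred from the results table below, not documented explicitly:

```python
# TrainingArguments mirroring the hyperparameters listed above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="PhoBert_Lexical_Dataset55K",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,           # Adam betas/epsilon as listed
    adam_beta2=0.999,         # (these are also the Trainer defaults)
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    eval_strategy="steps",    # the results table logs an eval every 200 steps
    eval_steps=200,
)
```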

Training results

| Training Loss | Epoch   | Step  | Validation Loss | Accuracy | F1     |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|:------:|
| No log        | 0.2317  | 200   | 0.5915          | 0.7575   | 0.8196 |
| No log        | 0.4635  | 400   | 0.6149          | 0.7294   | 0.8009 |
| No log        | 0.6952  | 600   | 0.5838          | 0.7335   | 0.8038 |
| No log        | 0.9270  | 800   | 0.4301          | 0.8213   | 0.8606 |
| 0.3511        | 1.1587  | 1000  | 0.4820          | 0.8211   | 0.8608 |
| 0.3511        | 1.3905  | 1200  | 0.6798          | 0.7461   | 0.8126 |
| 0.3511        | 1.6222  | 1400  | 0.6930          | 0.7087   | 0.7868 |
| 0.3511        | 1.8540  | 1600  | 0.6207          | 0.7709   | 0.8292 |
| 0.2551        | 2.0857  | 1800  | 0.5473          | 0.8101   | 0.8545 |
| 0.2551        | 2.3175  | 2000  | 0.5228          | 0.8072   | 0.8526 |
| 0.2551        | 2.5492  | 2200  | 0.5701          | 0.7828   | 0.8371 |
| 0.2551        | 2.7810  | 2400  | 0.5560          | 0.7838   | 0.8378 |
| 0.214         | 3.0127  | 2600  | 0.7375          | 0.7332   | 0.8039 |
| 0.214         | 3.2445  | 2800  | 0.5101          | 0.8126   | 0.8565 |
| 0.214         | 3.4762  | 3000  | 0.5639          | 0.8139   | 0.8574 |
| 0.214         | 3.7080  | 3200  | 0.4802          | 0.8299   | 0.8671 |
| 0.214         | 3.9397  | 3400  | 0.6431          | 0.7773   | 0.8336 |
| 0.1835        | 4.1715  | 3600  | 0.8122          | 0.7347   | 0.8051 |
| 0.1835        | 4.4032  | 3800  | 0.4787          | 0.8506   | 0.8804 |
| 0.1835        | 4.6350  | 4000  | 0.4719          | 0.8476   | 0.8784 |
| 0.1835        | 4.8667  | 4200  | 0.4643          | 0.8419   | 0.8751 |
| 0.157         | 5.0985  | 4400  | 0.6827          | 0.7918   | 0.8432 |
| 0.157         | 5.3302  | 4600  | 0.5695          | 0.8225   | 0.8629 |
| 0.157         | 5.5620  | 4800  | 0.7416          | 0.7831   | 0.8376 |
| 0.157         | 5.7937  | 5000  | 0.7787          | 0.7643   | 0.8253 |
| 0.1337        | 6.0255  | 5200  | 0.6881          | 0.7963   | 0.8462 |
| 0.1337        | 6.2572  | 5400  | 0.6894          | 0.7953   | 0.8456 |
| 0.1337        | 6.4890  | 5600  | 0.6472          | 0.8156   | 0.8587 |
| 0.1337        | 6.7207  | 5800  | 0.5857          | 0.8222   | 0.8626 |
| 0.1337        | 6.9525  | 6000  | 0.6816          | 0.8113   | 0.8559 |
| 0.1154        | 7.1842  | 6200  | 0.9047          | 0.7590   | 0.8216 |
| 0.1154        | 7.4160  | 6400  | 0.7080          | 0.8088   | 0.8541 |
| 0.1154        | 7.6477  | 6600  | 0.6871          | 0.8148   | 0.8581 |
| 0.1154        | 7.8795  | 6800  | 0.7481          | 0.7900   | 0.8422 |
| 0.1002        | 8.1112  | 7000  | 0.7101          | 0.8236   | 0.8637 |
| 0.1002        | 8.3430  | 7200  | 0.8398          | 0.7933   | 0.8443 |
| 0.1002        | 8.5747  | 7400  | 0.7284          | 0.8239   | 0.8640 |
| 0.1002        | 8.8065  | 7600  | 0.7415          | 0.8117   | 0.8561 |
| 0.0843        | 9.0382  | 7800  | 0.8033          | 0.8066   | 0.8529 |
| 0.0843        | 9.2700  | 8000  | 0.8593          | 0.8080   | 0.8536 |
| 0.0843        | 9.5017  | 8200  | 0.8626          | 0.7953   | 0.8456 |
| 0.0843        | 9.7335  | 8400  | 0.7607          | 0.8192   | 0.8609 |
| 0.0843        | 9.9652  | 8600  | 0.9512          | 0.7875   | 0.8406 |
| 0.072         | 10.1970 | 8800  | 0.9709          | 0.7891   | 0.8418 |
| 0.072         | 10.4287 | 9000  | 0.9674          | 0.7948   | 0.8453 |
| 0.072         | 10.6605 | 9200  | 0.9815          | 0.7863   | 0.8399 |
| 0.072         | 10.8922 | 9400  | 1.0347          | 0.7839   | 0.8382 |
| 0.0617        | 11.1240 | 9600  | 1.1195          | 0.7774   | 0.8340 |
| 0.0617        | 11.3557 | 9800  | 1.0192          | 0.7916   | 0.8434 |
| 0.0617        | 11.5875 | 10000 | 0.9594          | 0.8022   | 0.8502 |
| 0.0617        | 11.8192 | 10200 | 1.0892          | 0.7750   | 0.8325 |
| 0.0541        | 12.0510 | 10400 | 1.0634          | 0.7886   | 0.8414 |
| 0.0541        | 12.2827 | 10600 | 1.0198          | 0.8036   | 0.8512 |
| 0.0541        | 12.5145 | 10800 | 1.0040          | 0.8016   | 0.8499 |
| 0.0541        | 12.7462 | 11000 | 1.0208          | 0.7962   | 0.8464 |
| 0.0541        | 12.9780 | 11200 | 1.0329          | 0.7927   | 0.8441 |
| 0.048         | 13.2097 | 11400 | 0.9916          | 0.8033   | 0.8510 |
| 0.048         | 13.4415 | 11600 | 1.0793          | 0.7902   | 0.8424 |
| 0.048         | 13.6732 | 11800 | 1.0424          | 0.7964   | 0.8465 |
| 0.048         | 13.9050 | 12000 | 1.0162          | 0.8018   | 0.8500 |
| 0.0426        | 14.1367 | 12200 | 1.0633          | 0.7936   | 0.8447 |
| 0.0426        | 14.3685 | 12400 | 1.0128          | 0.8022   | 0.8502 |
| 0.0426        | 14.6002 | 12600 | 1.0780          | 0.7924   | 0.8439 |
| 0.0426        | 14.8320 | 12800 | 1.0598          | 0.7970   | 0.8468 |
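
The Accuracy and F1 columns can be reproduced with a compute_metrics callback along these lines; the F1 averaging mode is not documented on this card, so average="weighted" is an assumption:

```python
# Sketch of a Trainer compute_metrics callback producing accuracy and F1.
# The averaging mode for F1 is assumed, not confirmed by the card.
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_score(labels, predictions),
        "f1": f1_score(labels, predictions, average="weighted"),
    }
```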

Framework versions

  • Transformers 4.44.0
  • PyTorch 2.1.2
  • Datasets 2.20.0
  • Tokenizers 0.19.1