---
base_model: dccuchile/bert-base-spanish-wwm-uncased
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: bert-base-spanish-wwm-uncased-finetuned-github_cybersecurity_READMEs
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# bert-base-spanish-wwm-uncased-finetuned-github_cybersecurity_READMEs

This model is a fine-tuned version of [dccuchile/bert-base-spanish-wwm-uncased](https://huggingface.co/dccuchile/bert-base-spanish-wwm-uncased) on an unspecified dataset of GitHub cybersecurity READMEs (inferred from the model name).
It achieves the following results on the evaluation set:
- Loss: 1.3626
- Accuracy: 0.7721
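
The base model is BETO, a Spanish BERT trained with whole-word masking, and the accuracy above is the masked-token prediction accuracy that the `Trainer` reports for masked language modeling, so the checkpoint is assumed below to expose a fill-mask head. A minimal usage sketch under that assumption (the repository id is taken from the model name and may need a namespace prefix):

```python
from transformers import pipeline

# Assumptions: the checkpoint is published under the repository id below (adjust
# the namespace if needed) and keeps the fill-mask head of the BETO base model.
model_id = "bert-base-spanish-wwm-uncased-finetuned-github_cybersecurity_READMEs"

fill_mask = pipeline("fill-mask", model=model_id)

# Hypothetical prompt in the model's domain (Spanish cybersecurity README text).
for prediction in fill_mask("Esta herramienta detecta [MASK] en aplicaciones web."):
    print(f"{prediction['token_str']}\t{prediction['score']:.3f}")
```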

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an equivalent `TrainingArguments` configuration is sketched after the list):
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 200
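
A minimal sketch of a `TrainingArguments` configuration matching the values above, assuming the standard Hugging Face `Trainer` was used; the output directory and evaluation cadence are illustrative assumptions, not recorded settings:

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above. Adam betas and epsilon
# are left at the library defaults, which equal the listed values.
training_args = TrainingArguments(
    output_dir="bert-base-spanish-wwm-uncased-finetuned-github_cybersecurity_READMEs",
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # 32 * 4 = 128 total train batch size on a single device
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=200,
    evaluation_strategy="epoch",    # assumed: the results table logs one evaluation per epoch
)
```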

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| No log | 0.97 | 14 | 6.6700 | 0.2868 |
| No log | 2.0 | 29 | 6.1329 | 0.3035 |
| No log | 2.97 | 43 | 5.3933 | 0.3612 |
| No log | 4.0 | 58 | 5.0109 | 0.3777 |
| No log | 4.97 | 72 | 4.8244 | 0.3982 |
| No log | 6.0 | 87 | 4.3103 | 0.4191 |
| No log | 6.97 | 101 | 3.9390 | 0.4472 |
| No log | 8.0 | 116 | 3.7105 | 0.4643 |
| No log | 8.97 | 130 | 3.6200 | 0.4682 |
| No log | 10.0 | 145 | 3.3792 | 0.4746 |
| No log | 10.97 | 159 | 3.2035 | 0.5035 |
| No log | 12.0 | 174 | 3.0204 | 0.5292 |
| No log | 12.97 | 188 | 2.9428 | 0.5446 |
| No log | 14.0 | 203 | 2.8275 | 0.5586 |
| No log | 14.97 | 217 | 2.8530 | 0.5389 |
| No log | 16.0 | 232 | 2.7320 | 0.5552 |
| No log | 16.97 | 246 | 2.6976 | 0.5569 |
| No log | 18.0 | 261 | 2.6423 | 0.5672 |
| No log | 18.97 | 275 | 2.5589 | 0.5768 |
| No log | 20.0 | 290 | 2.5393 | 0.5725 |
| No log | 20.97 | 304 | 2.4149 | 0.5883 |
| No log | 22.0 | 319 | 2.3377 | 0.6106 |
| No log | 22.97 | 333 | 2.3686 | 0.6006 |
| No log | 24.0 | 348 | 2.3694 | 0.5896 |
| No log | 24.97 | 362 | 2.3411 | 0.6006 |
| No log | 26.0 | 377 | 2.1990 | 0.6192 |
| No log | 26.97 | 391 | 2.1937 | 0.6187 |
| No log | 28.0 | 406 | 2.1599 | 0.6263 |
| No log | 28.97 | 420 | 2.1169 | 0.6288 |
| No log | 30.0 | 435 | 2.1136 | 0.6363 |
| No log | 30.97 | 449 | 2.1705 | 0.6269 |
| No log | 32.0 | 464 | 1.9909 | 0.6551 |
| No log | 32.97 | 478 | 1.9930 | 0.6452 |
| No log | 34.0 | 493 | 1.9380 | 0.6622 |
| 3.3393 | 34.97 | 507 | 2.0509 | 0.6429 |
| 3.3393 | 36.0 | 522 | 1.9449 | 0.6556 |
| 3.3393 | 36.97 | 536 | 1.9595 | 0.6500 |
| 3.3393 | 38.0 | 551 | 1.8646 | 0.6703 |
| 3.3393 | 38.97 | 565 | 1.9297 | 0.6553 |
| 3.3393 | 40.0 | 580 | 1.8071 | 0.6820 |
| 3.3393 | 40.97 | 594 | 1.9239 | 0.6564 |
| 3.3393 | 42.0 | 609 | 1.7737 | 0.6769 |
| 3.3393 | 42.97 | 623 | 1.7695 | 0.6889 |
| 3.3393 | 44.0 | 638 | 1.7444 | 0.6842 |
| 3.3393 | 44.97 | 652 | 1.7503 | 0.6839 |
| 3.3393 | 46.0 | 667 | 1.7654 | 0.6932 |
| 3.3393 | 46.97 | 681 | 1.7225 | 0.6862 |
| 3.3393 | 48.0 | 696 | 1.8165 | 0.6815 |
| 3.3393 | 48.97 | 710 | 1.7971 | 0.6840 |
| 3.3393 | 50.0 | 725 | 1.7177 | 0.6942 |
| 3.3393 | 50.97 | 739 | 1.6890 | 0.6982 |
| 3.3393 | 52.0 | 754 | 1.7212 | 0.6990 |
| 3.3393 | 52.97 | 768 | 1.7562 | 0.6892 |
| 3.3393 | 54.0 | 783 | 1.7142 | 0.6971 |
| 3.3393 | 54.97 | 797 | 1.6899 | 0.6955 |
| 3.3393 | 56.0 | 812 | 1.7568 | 0.6898 |
| 3.3393 | 56.97 | 826 | 1.6427 | 0.7137 |
| 3.3393 | 58.0 | 841 | 1.5932 | 0.7183 |
| 3.3393 | 58.97 | 855 | 1.6001 | 0.7193 |
| 3.3393 | 60.0 | 870 | 1.6482 | 0.7109 |
| 3.3393 | 60.97 | 884 | 1.5384 | 0.7211 |
| 3.3393 | 62.0 | 899 | 1.6092 | 0.7085 |
| 3.3393 | 62.97 | 913 | 1.6621 | 0.7068 |
| 3.3393 | 64.0 | 928 | 1.5781 | 0.7108 |
| 3.3393 | 64.97 | 942 | 1.5365 | 0.7297 |
| 3.3393 | 66.0 | 957 | 1.5426 | 0.7155 |
| 3.3393 | 66.97 | 971 | 1.6601 | 0.7051 |
| 3.3393 | 68.0 | 986 | 1.5874 | 0.7218 |
| 1.654 | 68.97 | 1000 | 1.6337 | 0.7148 |
| 1.654 | 70.0 | 1015 | 1.5324 | 0.7244 |
| 1.654 | 70.97 | 1029 | 1.5848 | 0.7245 |
| 1.654 | 72.0 | 1044 | 1.4755 | 0.7301 |
| 1.654 | 72.97 | 1058 | 1.5183 | 0.7323 |
| 1.654 | 74.0 | 1073 | 1.4930 | 0.7307 |
| 1.654 | 74.97 | 1087 | 1.4618 | 0.7350 |
| 1.654 | 76.0 | 1102 | 1.5082 | 0.7381 |
| 1.654 | 76.97 | 1116 | 1.4550 | 0.7402 |
| 1.654 | 78.0 | 1131 | 1.4609 | 0.7350 |
| 1.654 | 78.97 | 1145 | 1.5692 | 0.7258 |
| 1.654 | 80.0 | 1160 | 1.4066 | 0.7524 |
| 1.654 | 80.97 | 1174 | 1.5256 | 0.7283 |
| 1.654 | 82.0 | 1189 | 1.4466 | 0.7396 |
| 1.654 | 82.97 | 1203 | 1.4642 | 0.7357 |
| 1.654 | 84.0 | 1218 | 1.4985 | 0.7364 |
| 1.654 | 84.97 | 1232 | 1.4829 | 0.7421 |
| 1.654 | 86.0 | 1247 | 1.4528 | 0.7423 |
| 1.654 | 86.97 | 1261 | 1.3744 | 0.7470 |
| 1.654 | 88.0 | 1276 | 1.4098 | 0.7534 |
| 1.654 | 88.97 | 1290 | 1.4666 | 0.7439 |
| 1.654 | 90.0 | 1305 | 1.3889 | 0.7606 |
| 1.654 | 90.97 | 1319 | 1.4525 | 0.7436 |
| 1.654 | 92.0 | 1334 | 1.3673 | 0.7547 |
| 1.654 | 92.97 | 1348 | 1.4549 | 0.7430 |
| 1.654 | 94.0 | 1363 | 1.4008 | 0.7417 |
| 1.654 | 94.97 | 1377 | 1.3820 | 0.7472 |
| 1.654 | 96.0 | 1392 | 1.3900 | 0.7592 |
| 1.654 | 96.97 | 1406 | 1.4227 | 0.7458 |
| 1.654 | 98.0 | 1421 | 1.4179 | 0.7546 |
| 1.654 | 98.97 | 1435 | 1.4474 | 0.7476 |
| 1.654 | 100.0 | 1450 | 1.4092 | 0.7485 |
| 1.654 | 100.97 | 1464 | 1.3163 | 0.7678 |
| 1.654 | 102.0 | 1479 | 1.3801 | 0.7631 |
| 1.654 | 102.97 | 1493 | 1.4153 | 0.7496 |
| 1.1613 | 104.0 | 1508 | 1.3168 | 0.7616 |
| 1.1613 | 104.97 | 1522 | 1.3385 | 0.7607 |
| 1.1613 | 106.0 | 1537 | 1.4633 | 0.7406 |
| 1.1613 | 106.97 | 1551 | 1.4509 | 0.7473 |
| 1.1613 | 108.0 | 1566 | 1.3938 | 0.7577 |
| 1.1613 | 108.97 | 1580 | 1.4659 | 0.7451 |
| 1.1613 | 110.0 | 1595 | 1.4536 | 0.7403 |
| 1.1613 | 110.97 | 1609 | 1.4069 | 0.7529 |
| 1.1613 | 112.0 | 1624 | 1.2818 | 0.7721 |
| 1.1613 | 112.97 | 1638 | 1.3530 | 0.7618 |
| 1.1613 | 114.0 | 1653 | 1.3854 | 0.7555 |
| 1.1613 | 114.97 | 1667 | 1.3213 | 0.7589 |
| 1.1613 | 116.0 | 1682 | 1.3547 | 0.7578 |
| 1.1613 | 116.97 | 1696 | 1.4230 | 0.7544 |
| 1.1613 | 118.0 | 1711 | 1.3296 | 0.7650 |
| 1.1613 | 118.97 | 1725 | 1.3777 | 0.7616 |
| 1.1613 | 120.0 | 1740 | 1.3832 | 0.7639 |
| 1.1613 | 120.97 | 1754 | 1.4333 | 0.7524 |
| 1.1613 | 122.0 | 1769 | 1.3613 | 0.7655 |
| 1.1613 | 122.97 | 1783 | 1.4481 | 0.7533 |
| 1.1613 | 124.0 | 1798 | 1.4398 | 0.7550 |
| 1.1613 | 124.97 | 1812 | 1.3509 | 0.7678 |
| 1.1613 | 126.0 | 1827 | 1.3034 | 0.7705 |
| 1.1613 | 126.97 | 1841 | 1.4733 | 0.7468 |
| 1.1613 | 128.0 | 1856 | 1.4400 | 0.7557 |
| 1.1613 | 128.97 | 1870 | 1.3901 | 0.7599 |
| 1.1613 | 130.0 | 1885 | 1.3529 | 0.7683 |
| 1.1613 | 130.97 | 1899 | 1.3677 | 0.7568 |
| 1.1613 | 132.0 | 1914 | 1.4481 | 0.7561 |
| 1.1613 | 132.97 | 1928 | 1.2518 | 0.7826 |
| 1.1613 | 134.0 | 1943 | 1.4324 | 0.7527 |
| 1.1613 | 134.97 | 1957 | 1.3740 | 0.7591 |
| 1.1613 | 136.0 | 1972 | 1.3782 | 0.7628 |
| 1.1613 | 136.97 | 1986 | 1.2933 | 0.7735 |
| 0.9181 | 138.0 | 2001 | 1.3451 | 0.7709 |
| 0.9181 | 138.97 | 2015 | 1.4064 | 0.7646 |
| 0.9181 | 140.0 | 2030 | 1.3908 | 0.7661 |
| 0.9181 | 140.97 | 2044 | 1.3139 | 0.7692 |
| 0.9181 | 142.0 | 2059 | 1.3602 | 0.7698 |
| 0.9181 | 142.97 | 2073 | 1.3171 | 0.7763 |
| 0.9181 | 144.0 | 2088 | 1.3736 | 0.7627 |
| 0.9181 | 144.97 | 2102 | 1.3348 | 0.7670 |
| 0.9181 | 146.0 | 2117 | 1.3745 | 0.7672 |
| 0.9181 | 146.97 | 2131 | 1.3725 | 0.7657 |
| 0.9181 | 148.0 | 2146 | 1.3939 | 0.7662 |
| 0.9181 | 148.97 | 2160 | 1.3793 | 0.7654 |
| 0.9181 | 150.0 | 2175 | 1.3246 | 0.7713 |
| 0.9181 | 150.97 | 2189 | 1.2930 | 0.7767 |
| 0.9181 | 152.0 | 2204 | 1.2810 | 0.7786 |
| 0.9181 | 152.97 | 2218 | 1.3552 | 0.7677 |
| 0.9181 | 154.0 | 2233 | 1.4365 | 0.7662 |
| 0.9181 | 154.97 | 2247 | 1.3108 | 0.7701 |
| 0.9181 | 156.0 | 2262 | 1.2976 | 0.7802 |
| 0.9181 | 156.97 | 2276 | 1.3652 | 0.7743 |
| 0.9181 | 158.0 | 2291 | 1.3912 | 0.7628 |
| 0.9181 | 158.97 | 2305 | 1.3401 | 0.7689 |
| 0.9181 | 160.0 | 2320 | 1.2996 | 0.7723 |
| 0.9181 | 160.97 | 2334 | 1.3340 | 0.7764 |
| 0.9181 | 162.0 | 2349 | 1.2927 | 0.7751 |
| 0.9181 | 162.97 | 2363 | 1.3123 | 0.7766 |
| 0.9181 | 164.0 | 2378 | 1.3185 | 0.7712 |
| 0.9181 | 164.97 | 2392 | 1.3288 | 0.7737 |
| 0.9181 | 166.0 | 2407 | 1.3510 | 0.7685 |
| 0.9181 | 166.97 | 2421 | 1.3598 | 0.7699 |
| 0.9181 | 168.0 | 2436 | 1.3490 | 0.7638 |
| 0.9181 | 168.97 | 2450 | 1.3381 | 0.7643 |
| 0.9181 | 170.0 | 2465 | 1.3074 | 0.7761 |
| 0.9181 | 170.97 | 2479 | 1.3886 | 0.7631 |
| 0.9181 | 172.0 | 2494 | 1.3931 | 0.7634 |
| 0.7949 | 172.97 | 2508 | 1.3627 | 0.7662 |
| 0.7949 | 174.0 | 2523 | 1.4032 | 0.7653 |
| 0.7949 | 174.97 | 2537 | 1.3016 | 0.7740 |
| 0.7949 | 176.0 | 2552 | 1.3341 | 0.7710 |
| 0.7949 | 176.97 | 2566 | 1.3820 | 0.7624 |
| 0.7949 | 178.0 | 2581 | 1.3502 | 0.7761 |
| 0.7949 | 178.97 | 2595 | 1.3273 | 0.7752 |
| 0.7949 | 180.0 | 2610 | 1.3915 | 0.7623 |
| 0.7949 | 180.97 | 2624 | 1.4012 | 0.7616 |
| 0.7949 | 182.0 | 2639 | 1.3881 | 0.7692 |
| 0.7949 | 182.97 | 2653 | 1.2757 | 0.7807 |
| 0.7949 | 184.0 | 2668 | 1.3941 | 0.7629 |
| 0.7949 | 184.97 | 2682 | 1.3301 | 0.7800 |
| 0.7949 | 186.0 | 2697 | 1.3781 | 0.7735 |
| 0.7949 | 186.97 | 2711 | 1.3267 | 0.7782 |
| 0.7949 | 188.0 | 2726 | 1.3695 | 0.7688 |
| 0.7949 | 188.97 | 2740 | 1.3516 | 0.7752 |
| 0.7949 | 190.0 | 2755 | 1.3627 | 0.7733 |
| 0.7949 | 190.97 | 2769 | 1.3846 | 0.7713 |
| 0.7949 | 192.0 | 2784 | 1.3710 | 0.7662 |
| 0.7949 | 192.97 | 2798 | 1.3902 | 0.7660 |
| 0.7949 | 193.1 | 2800 | 1.4705 | 0.7550 |
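
If the validation loss above is the mean cross-entropy (in nats) over masked tokens, as the `Trainer` normally reports for masked language modeling, the final evaluation loss of 1.3626 corresponds to a perplexity of roughly exp(1.3626) ≈ 3.9:

```python
import math

# Treat the reported evaluation loss as mean cross-entropy (in nats) over the
# masked tokens and convert it to perplexity.
eval_loss = 1.3626
print(f"perplexity ≈ {math.exp(eval_loss):.2f}")  # ≈ 3.91
```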

### Framework versions

- Transformers 4.40.0.dev0
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2
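
To check whether a local environment matches these versions, a small sketch (assumes the four libraries are already installed):

```python
# Print the locally installed versions to compare against the list above.
import datasets
import tokenizers
import torch
import transformers

for module in (transformers, torch, datasets, tokenizers):
    print(module.__name__, module.__version__)
```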