---
license: apache-2.0
base_model: bert-base-uncased
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: bert-base-uncased-finetuned-github_cybersecurity_READMEs
    results: []
---

# bert-base-uncased-finetuned-github_cybersecurity_READMEs

This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset (per the model name, likely a collection of GitHub cybersecurity README files). It achieves the following results on the evaluation set:

- Loss: 2.2291
- Accuracy: 0.6479
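
Since the card reports a token-level accuracy, this checkpoint was presumably trained with the masked-language-modeling objective (the usual `run_mlm.py` setup); under that assumption, the final loss of 2.2291 corresponds to a perplexity of roughly exp(2.2291) ≈ 9.29. Below is a minimal usage sketch; the repository id is inferred from the model name and uploader and is not confirmed by this card.

```python
# Minimal fill-mask sketch. The repo id below is an inference from the
# model name and uploader, and the MLM objective is an assumption.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="sickcell69/bert-base-uncased-finetuned-github_cybersecurity_READMEs",
)

# BERT marks the position to predict with the [MASK] token.
for pred in fill_mask("This tool performs [MASK] scanning on the target network."):
    print(f"{pred['token_str']!r}: score={pred['score']:.3f}")
```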

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` reconstruction follows the list):

- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 100
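
The list above can be expressed as a `transformers.TrainingArguments` sketch. This is an illustrative reconstruction, not the author's script: `output_dir` and `evaluation_strategy` are assumptions (the results table suggests evaluation ran once per epoch), while the remaining values mirror the list, with `total_train_batch_size` 128 arising as 32 × 4 gradient-accumulation steps.

```python
# Hedged reconstruction of the reported training configuration.
# output_dir and evaluation_strategy are assumptions; Adam betas and
# epsilon match the transformers defaults, so they need no arguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="bert-base-uncased-finetuned-github_cybersecurity_READMEs",  # assumed
    learning_rate=3e-05,
    per_device_train_batch_size=32,  # train_batch_size
    per_device_eval_batch_size=32,   # eval_batch_size
    seed=42,
    gradient_accumulation_steps=4,   # effective batch size: 32 * 4 = 128
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=100,
    evaluation_strategy="epoch",     # assumed from the per-epoch results table
)
```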

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 0.97 | 14 | 4.1856 | 0.4305 |
| No log | 2.0 | 29 | 4.3178 | 0.4090 |
| No log | 2.97 | 43 | 4.0734 | 0.4342 |
| No log | 4.0 | 58 | 4.0470 | 0.4332 |
| No log | 4.97 | 72 | 4.0668 | 0.4270 |
| No log | 6.0 | 87 | 3.9068 | 0.4390 |
| No log | 6.97 | 101 | 3.8466 | 0.4468 |
| No log | 8.0 | 116 | 3.8330 | 0.4535 |
| No log | 8.97 | 130 | 3.7238 | 0.4516 |
| No log | 10.0 | 145 | 3.8113 | 0.4446 |
| No log | 10.97 | 159 | 3.6681 | 0.4607 |
| No log | 12.0 | 174 | 3.5627 | 0.4679 |
| No log | 12.97 | 188 | 3.4540 | 0.4794 |
| No log | 14.0 | 203 | 3.5997 | 0.4707 |
| No log | 14.97 | 217 | 3.4362 | 0.4860 |
| No log | 16.0 | 232 | 3.5471 | 0.4740 |
| No log | 16.97 | 246 | 3.4968 | 0.4803 |
| No log | 18.0 | 261 | 3.2938 | 0.4985 |
| No log | 18.97 | 275 | 3.4207 | 0.4765 |
| No log | 20.0 | 290 | 3.3869 | 0.4970 |
| No log | 20.97 | 304 | 3.3062 | 0.5012 |
| No log | 22.0 | 319 | 3.3184 | 0.4917 |
| No log | 22.97 | 333 | 3.2132 | 0.5136 |
| No log | 24.0 | 348 | 3.2027 | 0.5074 |
| No log | 24.97 | 362 | 3.3251 | 0.4923 |
| No log | 26.0 | 377 | 3.1569 | 0.5108 |
| No log | 26.97 | 391 | 3.0947 | 0.5194 |
| No log | 28.0 | 406 | 3.0470 | 0.5206 |
| No log | 28.97 | 420 | 3.0662 | 0.5182 |
| No log | 30.0 | 435 | 3.0845 | 0.5191 |
| No log | 30.97 | 449 | 3.0681 | 0.5219 |
| No log | 32.0 | 464 | 2.9902 | 0.5263 |
| No log | 32.97 | 478 | 2.8970 | 0.5448 |
| No log | 34.0 | 493 | 2.9269 | 0.5341 |
| 3.629 | 34.97 | 507 | 2.8605 | 0.5519 |
| 3.629 | 36.0 | 522 | 2.8657 | 0.5431 |
| 3.629 | 36.97 | 536 | 2.9391 | 0.5407 |
| 3.629 | 38.0 | 551 | 2.8960 | 0.5437 |
| 3.629 | 38.97 | 565 | 2.8819 | 0.5466 |
| 3.629 | 40.0 | 580 | 2.7555 | 0.5633 |
| 3.629 | 40.97 | 594 | 2.7425 | 0.5555 |
| 3.629 | 42.0 | 609 | 2.7960 | 0.5615 |
| 3.629 | 42.97 | 623 | 2.7382 | 0.5630 |
| 3.629 | 44.0 | 638 | 2.7967 | 0.5580 |
| 3.629 | 44.97 | 652 | 2.6611 | 0.5781 |
| 3.629 | 46.0 | 667 | 2.6877 | 0.5722 |
| 3.629 | 46.97 | 681 | 2.7917 | 0.5609 |
| 3.629 | 48.0 | 696 | 2.7029 | 0.5696 |
| 3.629 | 48.97 | 710 | 2.7408 | 0.5618 |
| 3.629 | 50.0 | 725 | 2.6450 | 0.5772 |
| 3.629 | 50.97 | 739 | 2.5569 | 0.5883 |
| 3.629 | 52.0 | 754 | 2.6646 | 0.5795 |
| 3.629 | 52.97 | 768 | 2.6803 | 0.5729 |
| 3.629 | 54.0 | 783 | 2.6233 | 0.5847 |
| 3.629 | 54.97 | 797 | 2.6027 | 0.5842 |
| 3.629 | 56.0 | 812 | 2.4090 | 0.6034 |
| 3.629 | 56.97 | 826 | 2.4978 | 0.6011 |
| 3.629 | 58.0 | 841 | 2.5106 | 0.5944 |
| 3.629 | 58.97 | 855 | 2.5039 | 0.5912 |
| 3.629 | 60.0 | 870 | 2.5792 | 0.5824 |
| 3.629 | 60.97 | 884 | 2.4764 | 0.6065 |
| 3.629 | 62.0 | 899 | 2.5348 | 0.6036 |
| 3.629 | 62.97 | 913 | 2.5338 | 0.6022 |
| 3.629 | 64.0 | 928 | 2.4646 | 0.6130 |
| 3.629 | 64.97 | 942 | 2.4532 | 0.6066 |
| 3.629 | 66.0 | 957 | 2.4526 | 0.6073 |
| 3.629 | 66.97 | 971 | 2.5369 | 0.5992 |
| 3.629 | 68.0 | 986 | 2.4170 | 0.6181 |
| 2.5556 | 68.97 | 1000 | 2.4493 | 0.6078 |
| 2.5556 | 70.0 | 1015 | 2.3939 | 0.6159 |
| 2.5556 | 70.97 | 1029 | 2.4793 | 0.6049 |
| 2.5556 | 72.0 | 1044 | 2.3225 | 0.6286 |
| 2.5556 | 72.97 | 1058 | 2.3551 | 0.6212 |
| 2.5556 | 74.0 | 1073 | 2.4702 | 0.6075 |
| 2.5556 | 74.97 | 1087 | 2.3489 | 0.6311 |
| 2.5556 | 76.0 | 1102 | 2.3455 | 0.6198 |
| 2.5556 | 76.97 | 1116 | 2.4500 | 0.6145 |
| 2.5556 | 78.0 | 1131 | 2.3223 | 0.6332 |
| 2.5556 | 78.97 | 1145 | 2.4375 | 0.6065 |
| 2.5556 | 80.0 | 1160 | 2.2743 | 0.6291 |
| 2.5556 | 80.97 | 1174 | 2.3255 | 0.6295 |
| 2.5556 | 82.0 | 1189 | 2.3785 | 0.6237 |
| 2.5556 | 82.97 | 1203 | 2.2722 | 0.6344 |
| 2.5556 | 84.0 | 1218 | 2.2392 | 0.6407 |
| 2.5556 | 84.97 | 1232 | 2.2322 | 0.6361 |
| 2.5556 | 86.0 | 1247 | 2.2206 | 0.6496 |
| 2.5556 | 86.97 | 1261 | 2.2419 | 0.6345 |
| 2.5556 | 88.0 | 1276 | 2.1919 | 0.6492 |
| 2.5556 | 88.97 | 1290 | 2.2616 | 0.6433 |
| 2.5556 | 90.0 | 1305 | 2.2227 | 0.6417 |
| 2.5556 | 90.97 | 1319 | 2.2847 | 0.6447 |
| 2.5556 | 92.0 | 1334 | 2.2916 | 0.6339 |
| 2.5556 | 92.97 | 1348 | 2.2684 | 0.6410 |
| 2.5556 | 94.0 | 1363 | 2.2432 | 0.6440 |
| 2.5556 | 94.97 | 1377 | 2.2510 | 0.6462 |
| 2.5556 | 96.0 | 1392 | 2.2970 | 0.6363 |
| 2.5556 | 96.55 | 1400 | 2.2197 | 0.6423 |
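
Validation loss falls steadily from 4.1856 at the first evaluation to 2.2197 at step 1400, while accuracy climbs from 0.4305 to 0.6423. The sketch below converts a few of the reported losses to perplexity, assuming they are mean cross-entropy values in nats:

```python
# Convert selected validation losses from the table above to perplexity,
# assuming they are mean cross-entropy losses in nats.
import math

checkpoints = [(0.97, 4.1856), (48.0, 2.7029), (96.55, 2.2197)]
for epoch, loss in checkpoints:
    print(f"epoch {epoch:>5}: loss={loss:.4f}  perplexity={math.exp(loss):.2f}")
```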

### Framework versions

- Transformers 4.40.0.dev0
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2