
llama3.1-8B_finetune_genQA_wiki_r64_v2

This model is a PEFT adapter fine-tuned from meta-llama/Llama-3.1-8B on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3744
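
Since the repository ships a PEFT adapter rather than full model weights (see the framework versions below), it must be loaded on top of the base model. The following is a minimal loading sketch, not an official recipe: it assumes access to the gated meta-llama/Llama-3.1-8B base weights, a device with enough memory for bfloat16 inference, and a guessed prompt format (the card does not document the training template).

```python
# Minimal sketch: attach this PEFT adapter to the Llama-3.1-8B base model.
# Assumptions: access to the gated base weights; bfloat16 fits on your device.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "meta-llama/Llama-3.1-8B"
adapter_id = "strongpear/llama3.1-8B_finetune_genQA_wiki_r64_v2"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id)  # loads the adapter weights

# Prompt format is a guess; the card does not document the training template.
inputs = tokenizer("Question: What is LoRA?\nAnswer:", return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```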

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 3.6e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1
  • mixed_precision_training: Native AMP
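
The list above maps directly onto transformers.TrainingArguments; the Adam settings shown are the library's adamw_torch defaults, so they need no explicit arguments. Below is a hedged reconstruction of a plausible configuration: the TrainingArguments fields mirror the listed hyperparameters ("Native AMP" is expressed here as fp16=True, though bf16 is also possible), while every LoraConfig value is an assumption — the rank r=64 is inferred only from the "_r64_" tag in the model name, and the remaining LoRA settings are common defaults, not documented on this card.

```python
# Sketch of a plausible training configuration reconstructed from this card.
# All LoraConfig values are assumptions; only the TrainingArguments fields
# are taken from the hyperparameter list above.
from transformers import TrainingArguments
from peft import LoraConfig

lora_config = LoraConfig(
    r=64,                                 # inferred from the "_r64_" name tag
    lora_alpha=16,                        # assumption (undocumented)
    lora_dropout=0.05,                    # assumption (undocumented)
    target_modules=["q_proj", "v_proj"],  # assumption (undocumented)
    task_type="CAUSAL_LM",
)

training_args = TrainingArguments(
    output_dir="llama3.1-8B_finetune_genQA_wiki_r64_v2",  # assumption
    learning_rate=3.6e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    fp16=True,             # "Native AMP" mixed precision
    eval_strategy="steps",
    eval_steps=200,        # matches the 200-step eval cadence in the table below
    logging_steps=200,
)
```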

Training results

Training Loss | Epoch | Step | Validation Loss
0.5552 0.0028 200 0.4789
0.3971 0.0057 400 0.4757
0.576 0.0085 600 0.4745
0.5104 0.0113 800 0.4738
0.6768 0.0142 1000 0.4725
0.4755 0.0170 1200 0.4715
0.304 0.0199 1400 0.4706
0.5975 0.0227 1600 0.4703
0.4878 0.0255 1800 0.4692
0.3835 0.0284 2000 0.4684
0.5211 0.0312 2200 0.4683
0.4565 0.0340 2400 0.4673
0.4741 0.0369 2600 0.4671
0.4993 0.0397 2800 0.4664
0.2234 0.0426 3000 0.4664
0.5535 0.0454 3200 0.4654
0.4021 0.0482 3400 0.4649
0.5119 0.0511 3600 0.4640
0.4791 0.0539 3800 0.4634
0.3927 0.0567 4000 0.4619
0.5627 0.0596 4200 0.4614
0.3563 0.0624 4400 0.4608
0.4719 0.0652 4600 0.4604
0.562 0.0681 4800 0.4593
0.5258 0.0709 5000 0.4591
0.4483 0.0738 5200 0.4583
0.4827 0.0766 5400 0.4572
0.3712 0.0794 5600 0.4577
0.3198 0.0823 5800 0.4565
0.4168 0.0851 6000 0.4567
0.4166 0.0879 6200 0.4566
0.3014 0.0908 6400 0.4557
0.397 0.0936 6600 0.4550
0.4803 0.0965 6800 0.4540
0.4119 0.0993 7000 0.4540
0.5685 0.1021 7200 0.4532
0.4883 0.1050 7400 0.4532
0.5795 0.1078 7600 0.4531
0.5066 0.1106 7800 0.4524
0.4414 0.1135 8000 0.4522
0.3382 0.1163 8200 0.4520
0.5088 0.1191 8400 0.4521
0.4083 0.1220 8600 0.4523
0.437 0.1248 8800 0.4518
0.3769 0.1277 9000 0.4513
0.3828 0.1305 9200 0.4513
0.5162 0.1333 9400 0.4506
0.3603 0.1362 9600 0.4501
0.6039 0.1390 9800 0.4500
0.3339 0.1418 10000 0.4493
0.3821 0.1447 10200 0.4491
0.478 0.1475 10400 0.4488
0.3994 0.1504 10600 0.4481
0.4471 0.1532 10800 0.4474
0.4102 0.1560 11000 0.4471
0.3164 0.1589 11200 0.4464
0.3487 0.1617 11400 0.4458
0.4359 0.1645 11600 0.4455
0.5473 0.1674 11800 0.4450
0.5629 0.1702 12000 0.4452
0.5864 0.1730 12200 0.4446
0.4339 0.1759 12400 0.4442
0.3727 0.1787 12600 0.4447
0.4229 0.1816 12800 0.4439
0.4888 0.1844 13000 0.4437
0.4252 0.1872 13200 0.4439
0.5414 0.1901 13400 0.4432
0.4577 0.1929 13600 0.4429
0.4698 0.1957 13800 0.4425
0.488 0.1986 14000 0.4423
0.3836 0.2014 14200 0.4419
0.4724 0.2043 14400 0.4415
0.4451 0.2071 14600 0.4412
0.4409 0.2099 14800 0.4408
0.4377 0.2128 15000 0.4410
0.4688 0.2156 15200 0.4408
0.4975 0.2184 15400 0.4406
0.5752 0.2213 15600 0.4405
0.4824 0.2241 15800 0.4399
0.379 0.2270 16000 0.4396
0.5237 0.2298 16200 0.4396
0.491 0.2326 16400 0.4389
0.4132 0.2355 16600 0.4388
0.4417 0.2383 16800 0.4385
0.4859 0.2411 17000 0.4380
0.4579 0.2440 17200 0.4377
0.5186 0.2468 17400 0.4366
0.4055 0.2496 17600 0.4362
0.5011 0.2525 17800 0.4359
0.3842 0.2553 18000 0.4350
0.5457 0.2582 18200 0.4352
0.5447 0.2610 18400 0.4350
0.4722 0.2638 18600 0.4349
0.4229 0.2667 18800 0.4340
0.5047 0.2695 19000 0.4338
0.2641 0.2723 19200 0.4340
0.3937 0.2752 19400 0.4333
0.4082 0.2780 19600 0.4332
0.3958 0.2809 19800 0.4317
0.4929 0.2837 20000 0.4315
0.4739 0.2865 20200 0.4313
0.5014 0.2894 20400 0.4316
0.3236 0.2922 20600 0.4308
0.3615 0.2950 20800 0.4299
0.4235 0.2979 21000 0.4297
0.2811 0.3007 21200 0.4295
0.2743 0.3035 21400 0.4287
0.4718 0.3064 21600 0.4290
0.5184 0.3092 21800 0.4289
0.3954 0.3121 22000 0.4289
0.4478 0.3149 22200 0.4288
0.4001 0.3177 22400 0.4288
0.4768 0.3206 22600 0.4288
0.3908 0.3234 22800 0.4284
0.4068 0.3262 23000 0.4280
0.3719 0.3291 23200 0.4270
0.3995 0.3319 23400 0.4266
0.4877 0.3348 23600 0.4262
0.5297 0.3376 23800 0.4262
0.4355 0.3404 24000 0.4259
0.4551 0.3433 24200 0.4255
0.4737 0.3461 24400 0.4253
0.3971 0.3489 24600 0.4252
0.3989 0.3518 24800 0.4247
0.5214 0.3546 25000 0.4246
0.4378 0.3574 25200 0.4244
0.5113 0.3603 25400 0.4246
0.4544 0.3631 25600 0.4246
0.4151 0.3660 25800 0.4244
0.3019 0.3688 26000 0.4243
0.4397 0.3716 26200 0.4242
0.5191 0.3745 26400 0.4238
0.3183 0.3773 26600 0.4237
0.203 0.3801 26800 0.4236
0.522 0.3830 27000 0.4231
0.4159 0.3858 27200 0.4232
0.348 0.3887 27400 0.4231
0.5155 0.3915 27600 0.4229
0.5407 0.3943 27800 0.4226
0.4152 0.3972 28000 0.4219
0.4148 0.4 28200 0.4213
0.5514 0.4028 28400 0.4206
0.4132 0.4057 28600 0.4201
0.4249 0.4085 28800 0.4199
0.4003 0.4113 29000 0.4199
0.3864 0.4142 29200 0.4194
0.3573 0.4170 29400 0.4192
0.4098 0.4199 29600 0.4185
0.3843 0.4227 29800 0.4187
0.5306 0.4255 30000 0.4187
0.2888 0.4284 30200 0.4183
0.2458 0.4312 30400 0.4176
0.3408 0.4340 30600 0.4169
0.4499 0.4369 30800 0.4162
0.4447 0.4397 31000 0.4157
0.523 0.4426 31200 0.4160
0.47 0.4454 31400 0.4160
0.3046 0.4482 31600 0.4155
0.4464 0.4511 31800 0.4156
0.4347 0.4539 32000 0.4152
0.5663 0.4567 32200 0.4152
0.4236 0.4596 32400 0.4150
0.4682 0.4624 32600 0.4143
0.2078 0.4652 32800 0.4138
0.4289 0.4681 33000 0.4135
0.4123 0.4709 33200 0.4134
0.2913 0.4738 33400 0.4132
0.3788 0.4766 33600 0.4131
0.3111 0.4794 33800 0.4132
0.5047 0.4823 34000 0.4124
0.3096 0.4851 34200 0.4124
0.5126 0.4879 34400 0.4121
0.5116 0.4908 34600 0.4117
0.3009 0.4936 34800 0.4115
0.5036 0.4965 35000 0.4108
0.4221 0.4993 35200 0.4109
0.5021 0.5021 35400 0.4109
0.2946 0.5050 35600 0.4108
0.4487 0.5078 35800 0.4104
0.3863 0.5106 36000 0.4103
0.3043 0.5135 36200 0.4097
0.5039 0.5163 36400 0.4092
0.537 0.5191 36600 0.4087
0.3525 0.5220 36800 0.4082
0.3099 0.5248 37000 0.4081
0.4568 0.5277 37200 0.4080
0.1907 0.5305 37400 0.4079
0.5096 0.5333 37600 0.4074
0.4411 0.5362 37800 0.4068
0.4075 0.5390 38000 0.4070
0.5252 0.5418 38200 0.4067
0.2606 0.5447 38400 0.4060
0.4556 0.5475 38600 0.4060
0.3456 0.5504 38800 0.4056
0.2922 0.5532 39000 0.4050
0.4582 0.5560 39200 0.4043
0.3284 0.5589 39400 0.4039
0.511 0.5617 39600 0.4035
0.4445 0.5645 39800 0.4035
0.2857 0.5674 40000 0.4035
0.4778 0.5702 40200 0.4030
0.4949 0.5730 40400 0.4025
0.4691 0.5759 40600 0.4021
0.6142 0.5787 40800 0.4022
0.2999 0.5816 41000 0.4011
0.4709 0.5844 41200 0.4010
0.3927 0.5872 41400 0.4004
0.4305 0.5901 41600 0.4001
0.4798 0.5929 41800 0.3988
0.5321 0.5957 42000 0.3988
0.5152 0.5986 42200 0.3987
0.3418 0.6014 42400 0.3984
0.4056 0.6043 42600 0.3984
0.2357 0.6071 42800 0.3983
0.3883 0.6099 43000 0.3980
0.4356 0.6128 43200 0.3978
0.3481 0.6156 43400 0.3977
0.3969 0.6184 43600 0.3971
0.459 0.6213 43800 0.3969
0.3185 0.6241 44000 0.3969
0.4342 0.6270 44200 0.3963
0.448 0.6298 44400 0.3961
0.4198 0.6326 44600 0.3958
0.3038 0.6355 44800 0.3957
0.3628 0.6383 45000 0.3953
0.4918 0.6411 45200 0.3954
0.5577 0.6440 45400 0.3953
0.2738 0.6468 45600 0.3952
0.481 0.6496 45800 0.3948
0.4624 0.6525 46000 0.3945
0.4184 0.6553 46200 0.3943
0.5 0.6582 46400 0.3942
0.3874 0.6610 46600 0.3941
0.4107 0.6638 46800 0.3939
0.4462 0.6667 47000 0.3938
0.4787 0.6695 47200 0.3934
0.2831 0.6723 47400 0.3932
0.3612 0.6752 47600 0.3933
0.2958 0.6780 47800 0.3928
0.2881 0.6809 48000 0.3926
0.3764 0.6837 48200 0.3927
0.3935 0.6865 48400 0.3926
0.448 0.6894 48600 0.3923
0.4448 0.6922 48800 0.3918
0.3697 0.6950 49000 0.3918
0.4227 0.6979 49200 0.3917
0.4456 0.7007 49400 0.3914
0.4413 0.7035 49600 0.3913
0.466 0.7064 49800 0.3912
0.2918 0.7092 50000 0.3910
0.3252 0.7121 50200 0.3908
0.198 0.7149 50400 0.3905
0.2525 0.7177 50600 0.3903
0.483 0.7206 50800 0.3901
0.458 0.7234 51000 0.3898
0.4184 0.7262 51200 0.3898
0.385 0.7291 51400 0.3894
0.3237 0.7319 51600 0.3893
0.3709 0.7348 51800 0.3889
0.2904 0.7376 52000 0.3890
0.362 0.7404 52200 0.3888
0.4254 0.7433 52400 0.3885
0.4099 0.7461 52600 0.3884
0.2335 0.7489 52800 0.3881
0.4477 0.7518 53000 0.3880
0.3345 0.7546 53200 0.3878
0.3674 0.7574 53400 0.3873
0.435 0.7603 53600 0.3874
0.4013 0.7631 53800 0.3873
0.3666 0.7660 54000 0.3868
0.3511 0.7688 54200 0.3864
0.4599 0.7716 54400 0.3862
0.4172 0.7745 54600 0.3859
0.3722 0.7773 54800 0.3857
0.459 0.7801 55000 0.3853
0.3358 0.7830 55200 0.3848
0.3544 0.7858 55400 0.3847
0.3254 0.7887 55600 0.3846
0.443 0.7915 55800 0.3845
0.1945 0.7943 56000 0.3844
0.4167 0.7972 56200 0.3842
0.4281 0.8 56400 0.3838
0.4496 0.8028 56600 0.3836
0.4004 0.8057 56800 0.3834
0.4789 0.8085 57000 0.3832
0.37 0.8113 57200 0.3831
0.3972 0.8142 57400 0.3830
0.4429 0.8170 57600 0.3827
0.5432 0.8199 57800 0.3825
0.4448 0.8227 58000 0.3825
0.4668 0.8255 58200 0.3820
0.3144 0.8284 58400 0.3815
0.3173 0.8312 58600 0.3812
0.4206 0.8340 58800 0.3811
0.3072 0.8369 59000 0.3810
0.3854 0.8397 59200 0.3806
0.3892 0.8426 59400 0.3805
0.4497 0.8454 59600 0.3805
0.4165 0.8482 59800 0.3805
0.3679 0.8511 60000 0.3802
0.4221 0.8539 60200 0.3796
0.4912 0.8567 60400 0.3796
0.4141 0.8596 60600 0.3794
0.3999 0.8624 60800 0.3792
0.3014 0.8652 61000 0.3788
0.5035 0.8681 61200 0.3787
0.4158 0.8709 61400 0.3784
0.3597 0.8738 61600 0.3783
0.4084 0.8766 61800 0.3782
0.3475 0.8794 62000 0.3782
0.3177 0.8823 62200 0.3781
0.3705 0.8851 62400 0.3778
0.4659 0.8879 62600 0.3776
0.4417 0.8908 62800 0.3774
0.435 0.8936 63000 0.3772
0.4289 0.8965 63200 0.3772
0.3551 0.8993 63400 0.3770
0.4331 0.9021 63600 0.3770
0.3531 0.9050 63800 0.3768
0.4663 0.9078 64000 0.3767
0.4011 0.9106 64200 0.3765
0.344 0.9135 64400 0.3763
0.3488 0.9163 64600 0.3762
0.2881 0.9191 64800 0.3761
0.4809 0.9220 65000 0.3760
0.4229 0.9248 65200 0.3759
0.4683 0.9277 65400 0.3757
0.483 0.9305 65600 0.3756
0.4342 0.9333 65800 0.3755
0.2609 0.9362 66000 0.3754
0.4405 0.9390 66200 0.3754
0.4036 0.9418 66400 0.3754
0.3688 0.9447 66600 0.3753
0.3391 0.9475 66800 0.3752
0.466 0.9504 67000 0.3751
0.4023 0.9532 67200 0.3751
0.4671 0.9560 67400 0.3750
0.2545 0.9589 67600 0.3750
0.2524 0.9617 67800 0.3749
0.3833 0.9645 68000 0.3749
0.4234 0.9674 68200 0.3749
0.4267 0.9702 68400 0.3748
0.3799 0.9730 68600 0.3747
0.2952 0.9759 68800 0.3747
0.2221 0.9787 69000 0.3746
0.4635 0.9816 69200 0.3746
0.2814 0.9844 69400 0.3745
0.3765 0.9872 69600 0.3745
0.4394 0.9901 69800 0.3745
0.4303 0.9929 70000 0.3744
0.2866 0.9957 70200 0.3744
0.4629 0.9986 70400 0.3744
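
Across the single training epoch, validation loss fell steadily from 0.4789 at step 200 to 0.3744 at step 70,400, with no sustained upturn that would suggest overfitting; the training-loss column reflects noisy per-interval logging and is not directly comparable.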

Framework versions

  • PEFT 0.12.0
  • Transformers 4.45.2
  • PyTorch 2.4.0+cu121
  • Datasets 3.0.0
  • Tokenizers 0.20.1
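
When reproducing results or debugging loading issues, it can help to confirm that the local environment matches the pins above. A minimal check, assuming the standard packages are importable:

```python
# Print installed versions to compare against the pins listed above.
import datasets, peft, tokenizers, torch, transformers

for mod in (peft, transformers, torch, datasets, tokenizers):
    print(f"{mod.__name__}=={mod.__version__}")
```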