bge-m3-2024-09-19_04-39-34

This model is a fine-tuned version of BAAI/bge-m3 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7622

Model description

More information needed
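
Pending a fuller description: the base model, BAAI/bge-m3, is a multilingual text-embedding model, so this fine-tune is presumably used the same way. Below is a minimal inference sketch, assuming the checkpoint is published under the repo id shown in the model tree at the bottom of this card and that it keeps BGE-M3's dense-embedding recipe (L2-normalized [CLS] pooling); neither assumption is confirmed by the card itself.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Assumption: the checkpoint lives under this Hub repo id (see the model tree below).
MODEL_ID = "strongpear/bge-m3-2024-09-19_04-39-34"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

sentences = [
    "What is BGE-M3?",
    "BGE-M3 is a multilingual text-embedding model from BAAI.",
]
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    last_hidden = model(**batch).last_hidden_state

# Dense embedding as in the base BGE-M3: take the [CLS] vector and L2-normalize it
# (assumption: this fine-tune keeps the base model's pooling scheme).
embeddings = torch.nn.functional.normalize(last_hidden[:, 0], p=2, dim=-1)
print(embeddings @ embeddings.T)  # cosine-similarity matrix
```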

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments reconstruction follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 2
  • mixed_precision_training: Native AMP
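
For reference, here is a sketch of how the settings above map onto transformers.TrainingArguments. The dataset, data collator, and loss are unknown (see above), so this reconstructs only the configuration, not the author's training script; output_dir and the eval/logging cadence (every 500 steps, matching the results table below) are inferred, not confirmed.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the reported hyperparameters; not the author's script.
training_args = TrainingArguments(
    output_dir="bge-m3-2024-09-19_04-39-34",  # assumed from the model name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,            # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=2,
    fp16=True,                 # "Native AMP"; assumes fp16 rather than bf16
    eval_strategy="steps",     # evaluation every 500 steps, matching the results table
    eval_steps=500,
    logging_steps=500,
)
```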

Training results

| Training Loss | Epoch  | Step  | Validation Loss |
|:-------------:|:------:|:-----:|:---------------:|
| 2.6055 | 0.1671 | 500 | 1.5444 |
| 1.4844 | 0.3342 | 1000 | 1.2380 |
| 1.2778 | 0.5013 | 1500 | 1.1190 |
| 1.1861 | 0.6684 | 2000 | 1.0531 |
| 1.124 | 0.8356 | 2500 | 1.0052 |
| 1.0862 | 1.0027 | 3000 | 0.9756 |
| 1.0458 | 1.1698 | 3500 | 0.9501 |
| 1.0193 | 1.3369 | 4000 | 0.9307 |
| 1.0057 | 1.5040 | 4500 | 0.9146 |
| 1.0373 | 0.2089 | 5000 | 0.9470 |
| 1.0261 | 0.2298 | 5500 | 0.9444 |
| 1.0148 | 0.2507 | 6000 | 0.9382 |
| 1.0154 | 0.2716 | 6500 | 0.9357 |
| 1.0136 | 0.2924 | 7000 | 0.9306 |
| 0.9924 | 0.3133 | 7500 | 0.9248 |
| 0.9902 | 0.3342 | 8000 | 0.9245 |
| 0.9873 | 0.3551 | 8500 | 0.9186 |
| 0.9719 | 0.3760 | 9000 | 0.9110 |
| 0.9844 | 0.3969 | 9500 | 0.9081 |
| 0.9851 | 0.4178 | 10000 | 0.9024 |
| 0.9636 | 0.4387 | 10500 | 0.8986 |
| 0.9824 | 0.4596 | 11000 | 0.9035 |
| 0.9563 | 0.4804 | 11500 | 0.8901 |
| 0.9586 | 0.5013 | 12000 | 0.8830 |
| 0.9573 | 0.5222 | 12500 | 0.8814 |
| 0.9551 | 0.5431 | 13000 | 0.8738 |
| 0.947 | 0.5640 | 13500 | 0.8757 |
| 0.944 | 0.5849 | 14000 | 0.8721 |
| 0.9497 | 0.6058 | 14500 | 0.8670 |
| 0.9331 | 0.6267 | 15000 | 0.8620 |
| 0.9349 | 0.6476 | 15500 | 0.8588 |
| 0.9132 | 0.6684 | 16000 | 0.8557 |
| 0.926 | 0.6893 | 16500 | 0.8570 |
| 0.9146 | 0.7102 | 17000 | 0.8499 |
| 0.9337 | 0.7311 | 17500 | 0.8477 |
| 0.913 | 0.7520 | 18000 | 0.8481 |
| 0.919 | 0.7729 | 18500 | 0.8452 |
| 0.9039 | 0.7938 | 19000 | 0.8398 |
| 0.9086 | 0.8147 | 19500 | 0.8371 |
| 0.9089 | 0.8356 | 20000 | 0.8376 |
| 0.9077 | 0.8565 | 20500 | 0.8359 |
| 0.8846 | 0.8773 | 21000 | 0.8309 |
| 0.9192 | 0.8982 | 21500 | 0.8317 |
| 0.9018 | 0.9191 | 22000 | 0.8251 |
| 0.8822 | 0.9400 | 22500 | 0.8271 |
| 0.8822 | 0.9609 | 23000 | 0.8241 |
| 0.8886 | 0.9818 | 23500 | 0.8205 |
| 0.9002 | 1.0027 | 24000 | 0.8190 |
| 0.8687 | 1.0236 | 24500 | 0.8174 |
| 0.8767 | 1.0445 | 25000 | 0.8141 |
| 0.8681 | 1.0653 | 25500 | 0.8136 |
| 0.8702 | 1.0862 | 26000 | 0.8134 |
| 0.8692 | 1.1071 | 26500 | 0.8094 |
| 0.8682 | 1.1280 | 27000 | 0.8106 |
| 0.8579 | 1.1489 | 27500 | 0.8043 |
| 0.8722 | 1.1698 | 28000 | 0.8038 |
| 0.8609 | 1.1907 | 28500 | 0.8062 |
| 0.8626 | 1.2116 | 29000 | 0.8008 |
| 0.8577 | 1.2325 | 29500 | 0.8002 |
| 0.8551 | 1.2533 | 30000 | 0.7984 |
| 0.8527 | 1.2742 | 30500 | 0.7949 |
| 0.8369 | 1.2951 | 31000 | 0.7942 |
| 0.8585 | 1.3160 | 31500 | 0.7953 |
| 0.8426 | 1.3369 | 32000 | 0.7914 |
| 0.8474 | 1.3578 | 32500 | 0.7925 |
| 0.8608 | 1.3787 | 33000 | 0.7898 |
| 0.8496 | 1.3996 | 33500 | 0.7884 |
| 0.8413 | 1.4205 | 34000 | 0.7846 |
| 0.8349 | 1.4413 | 34500 | 0.7863 |
| 0.8439 | 1.4622 | 35000 | 0.7837 |
| 0.8392 | 1.4831 | 35500 | 0.7834 |
| 0.8246 | 1.5040 | 36000 | 0.7817 |
| 0.825 | 1.5249 | 36500 | 0.7797 |
| 0.8255 | 1.5458 | 37000 | 0.7813 |
| 0.8384 | 1.5667 | 37500 | 0.7794 |
| 0.8305 | 1.5876 | 38000 | 0.7779 |
| 0.8346 | 1.6085 | 38500 | 0.7760 |
| 0.8202 | 1.6293 | 39000 | 0.7767 |
| 0.8085 | 1.6502 | 39500 | 0.7703 |
| 0.8366 | 1.6711 | 40000 | 0.7735 |
| 0.8416 | 1.6920 | 40500 | 0.7726 |
| 0.8303 | 1.7129 | 41000 | 0.7703 |
| 0.8267 | 1.7338 | 41500 | 0.7695 |
| 0.8239 | 1.7547 | 42000 | 0.7692 |
| 0.8249 | 1.7756 | 42500 | 0.7686 |
| 0.8297 | 1.7965 | 43000 | 0.7696 |
| 0.8133 | 1.8173 | 43500 | 0.7690 |
| 0.8177 | 1.8382 | 44000 | 0.7664 |
| 0.8204 | 1.8591 | 44500 | 0.7672 |
| 0.8221 | 1.8800 | 45000 | 0.7652 |
| 0.8146 | 1.9009 | 45500 | 0.7638 |
| 0.8183 | 1.9218 | 46000 | 0.7626 |
| 0.8053 | 1.9427 | 46500 | 0.7635 |
| 0.8232 | 1.9636 | 47000 | 0.7627 |
| 0.819 | 1.9845 | 47500 | 0.7622 |

Note: the Epoch column resets between step 4500 (epoch 1.5040) and step 5000 (epoch 0.2089). From step 5000 onward the epoch count is internally consistent (about 23.9k steps per epoch), so the first nine rows appear to come from a separate, shorter run; the values are reproduced exactly as logged.

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu121
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 568M parameters (F32 safetensors)

Model tree for strongpear/bge-m3-2024-09-19_04-39-34

  • Base model: BAAI/bge-m3