---
base_model: unsloth/mistral-7b-v0.3-bnb-4bit
library_name: peft
license: apache-2.0
tags:
  - unsloth
  - generated_from_trainer
model-index:
  - name: mistralai_mistral_7b_v0.3_imdatta0_Magiccoder_evol_10k_defaule
    results: []
---

# mistralai_mistral_7b_v0.3_imdatta0_Magiccoder_evol_10k_defaule

This model is a fine-tuned version of [unsloth/mistral-7b-v0.3-bnb-4bit](https://huggingface.co/unsloth/mistral-7b-v0.3-bnb-4bit) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 1.1508
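
This repository holds a PEFT (LoRA) adapter rather than a standalone checkpoint, so it must be loaded on top of the 4-bit base model. Below is a minimal usage sketch; the `adapter_id` is an assumption inferred from the model name above and should be replaced with the actual repo id of the adapter weights.

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "unsloth/mistral-7b-v0.3-bnb-4bit"
# Hypothetical adapter repo id, guessed from this card's title.
adapter_id = "imdatta0/mistralai_mistral_7b_v0.3_imdatta0_Magiccoder_evol_10k_defaule"

tokenizer = AutoTokenizer.from_pretrained(base_id)
# The base checkpoint ships with a bitsandbytes 4-bit quantization config,
# so it loads in 4-bit automatically.
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)

prompt = "Write a Python function that checks whether a string is a palindrome."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```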

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding `TrainingArguments` follows the list):

- learning_rate: 0.0003
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.02
- num_epochs: 1
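
As a rough guide, here is how these settings map onto `transformers.TrainingArguments`. The `output_dir` and the optimizer variant (`adamw_torch`) are assumptions, since the card reports only the optimizer family and its betas:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="outputs",            # placeholder, not stated in the card
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,   # 16 x 4 = 64 effective train batch size
    optim="adamw_torch",             # assumed; card says Adam, betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,
    num_train_epochs=1,
)
```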

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.1667        | 0.0261 | 4    | 1.1657          |
| 1.168         | 0.0523 | 8    | 1.1853          |
| 1.1834        | 0.0784 | 12   | 1.1752          |
| 1.0949        | 0.1046 | 16   | 1.1765          |
| 1.1669        | 0.1307 | 20   | 1.1847          |
| 1.06          | 0.1569 | 24   | 1.1693          |
| 1.1873        | 0.1830 | 28   | 1.1557          |
| 1.124         | 0.2092 | 32   | 1.1566          |
| 1.0828        | 0.2353 | 36   | 1.1538          |
| 1.1584        | 0.2614 | 40   | 1.1528          |
| 1.1773        | 0.2876 | 44   | 1.1493          |
| 1.1151        | 0.3137 | 48   | 1.1615          |
| 1.1327        | 0.3399 | 52   | 1.1592          |
| 1.094         | 0.3660 | 56   | 1.1487          |
| 1.1477        | 0.3922 | 60   | 1.1672          |
| 1.156         | 0.4183 | 64   | 1.1475          |
| 1.0724        | 0.4444 | 68   | 1.1658          |
| 1.0879        | 0.4706 | 72   | 1.1466          |
| 1.0652        | 0.4967 | 76   | 1.1522          |
| 1.1747        | 0.5229 | 80   | 1.1557          |
| 1.0867        | 0.5490 | 84   | 1.1524          |
| 1.1416        | 0.5752 | 88   | 1.1699          |
| 1.1987        | 0.6013 | 92   | 1.1498          |
| 1.1849        | 0.6275 | 96   | 1.1516          |
| 1.1133        | 0.6536 | 100  | 1.1447          |
| 1.136         | 0.6797 | 104  | 1.1526          |
| 1.1579        | 0.7059 | 108  | 1.1694          |
| 1.0263        | 0.7320 | 112  | 1.1502          |
| 1.093         | 0.7582 | 116  | 1.1325          |
| 1.0904        | 0.7843 | 120  | 1.1447          |
| 1.1481        | 0.8105 | 124  | 1.1550          |
| 1.1437        | 0.8366 | 128  | 1.1556          |
| 1.1645        | 0.8627 | 132  | 1.1541          |
| 1.0964        | 0.8889 | 136  | 1.1502          |
| 1.1825        | 0.9150 | 140  | 1.1487          |
| 1.0579        | 0.9412 | 144  | 1.1495          |
| 1.0728        | 0.9673 | 148  | 1.1504          |
| 1.2134        | 0.9935 | 152  | 1.1508          |

### Framework versions

- PEFT 0.12.0
- Transformers 4.44.0
- PyTorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1