---
base_model: unsloth/mistral-7b-v0.3-bnb-4bit
library_name: peft
license: apache-2.0
tags:
- unsloth
- generated_from_trainer
model-index:
- name: Mistral-7B-v0.3_magiccoder_ortho
  results: []
---

# Mistral-7B-v0.3_magiccoder_ortho

This model is a fine-tuned version of [unsloth/mistral-7b-v0.3-bnb-4bit](https://huggingface.co/unsloth/mistral-7b-v0.3-bnb-4bit) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 7.8291

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the code sketch at the end of this card for how they map onto `TrainingArguments`):
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.02
- num_epochs: 1

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 1.2179        | 0.0262 | 4    | 1.6151          |
| 4.1153        | 0.0523 | 8    | 11.4846         |
| 9.4045        | 0.0785 | 12   | 10.9796         |
| 8.1663        | 0.1047 | 16   | 8.2167          |
| 9.0474        | 0.1308 | 20   | 10.0032         |
| 8.4796        | 0.1570 | 24   | 8.3873          |
| 7.9286        | 0.1832 | 28   | 8.0296          |
| 7.8704        | 0.2093 | 32   | 7.9253          |
| 7.7139        | 0.2355 | 36   | 7.8579          |
| 7.9416        | 0.2617 | 40   | 7.7372          |
| 7.9342        | 0.2878 | 44   | 7.8272          |
| 7.7907        | 0.3140 | 48   | 7.8569          |
| 7.9106        | 0.3401 | 52   | 7.8776          |
| 7.8242        | 0.3663 | 56   | 7.8943          |
| 7.8321        | 0.3925 | 60   | 7.8261          |
| 7.861         | 0.4186 | 64   | 7.8201          |
| 7.9374        | 0.4448 | 68   | 7.8658          |
| 7.8396        | 0.4710 | 72   | 7.8735          |
| 7.8607        | 0.4971 | 76   | 7.8436          |
| 7.9294        | 0.5233 | 80   | 7.8951          |
| 7.9017        | 0.5495 | 84   | 7.8877          |
| 7.8512        | 0.5756 | 88   | 7.8694          |
| 7.9036        | 0.6018 | 92   | 7.8331          |
| 7.8496        | 0.6280 | 96   | 7.8269          |
| 7.8837        | 0.6541 | 100  | 7.8142          |
| 7.8718        | 0.6803 | 104  | 7.9025          |
| 7.934         | 0.7065 | 108  | 7.8767          |
| 7.8706        | 0.7326 | 112  | 7.8579          |
| 7.8889        | 0.7588 | 116  | 7.8467          |
| 7.8279        | 0.7850 | 120  | 7.7952          |
| 7.9176        | 0.8111 | 124  | 7.8180          |
| 7.8894        | 0.8373 | 128  | 7.8068          |
| 7.8625        | 0.8635 | 132  | 7.8081          |
| 7.8447        | 0.8896 | 136  | 7.8196          |
| 7.7559        | 0.9158 | 140  | 7.8307          |
| 7.8508        | 0.9419 | 144  | 7.8304          |
| 7.8058        | 0.9681 | 148  | 7.8295          |
| 7.8377        | 0.9943 | 152  | 7.8291          |

### Framework versions

- PEFT 0.12.0
- Transformers 4.44.0
- Pytorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
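
### Hyperparameters in code (sketch)

For reference, here is a minimal, hedged sketch of how the hyperparameters listed above could map onto `transformers.TrainingArguments`. The `output_dir` is an assumed placeholder, the actual run used the Unsloth/PEFT stack, and anything not listed in the card (data collation, adapter configuration) is omitted.

```python
# Hedged sketch only: maps the card's listed hyperparameters onto
# transformers.TrainingArguments. output_dir is an assumed placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="Mistral-7B-v0.3_magiccoder_ortho",  # assumed placeholder
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,  # 8 x 8 = total train batch size of 64
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,
    num_train_epochs=1,
)
```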
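
## How to use (sketch)

The card does not include usage instructions, so below is a minimal sketch of loading this PEFT adapter on top of the 4-bit base model with `peft` and `transformers`. The adapter repo id is a hypothetical placeholder, since the card does not state where the weights are published; `bitsandbytes` and `accelerate` are assumed to be installed for the 4-bit base model.

```python
# Minimal usage sketch (not from the original card). The adapter repo id is
# a hypothetical placeholder; substitute the actual published id.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "unsloth/mistral-7b-v0.3-bnb-4bit"
adapter_id = "your-namespace/Mistral-7B-v0.3_magiccoder_ortho"  # hypothetical

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)  # attach the trained adapter

inputs = tokenizer("def quicksort(arr):", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```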