nbroad committed
Commit 561f6b0 · verified · 1 Parent(s): 89a1037

End of training

Files changed (1): README.md (+5, -3)
README.md CHANGED
@@ -1,17 +1,19 @@
 ---
 base_model: mistralai/Ministral-8B-Instruct-2410
+datasets: nbroad/odesia-dipromats-seq-cls-v1
 library_name: transformers
-model_name: nbroad-odesia-dipro-t1-f5yiqx2u
+model_name: mistralai/Ministral-8B-Instruct-2410
 tags:
 - generated_from_trainer
+- odesia
 - trl
 - sft
 licence: license
 ---
 
-# Model Card for nbroad-odesia-dipro-t1-f5yiqx2u
+# Model Card for mistralai/Ministral-8B-Instruct-2410
 
-This model is a fine-tuned version of [mistralai/Ministral-8B-Instruct-2410](https://huggingface.co/mistralai/Ministral-8B-Instruct-2410).
+This model is a fine-tuned version of [mistralai/Ministral-8B-Instruct-2410](https://huggingface.co/mistralai/Ministral-8B-Instruct-2410) on the [nbroad/odesia-dipromats-seq-cls-v1](https://huggingface.co/datasets/nbroad/odesia-dipromats-seq-cls-v1) dataset.
 It has been trained using [TRL](https://github.com/huggingface/trl).
 
 ## Quick start
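
The body of the Quick start section is not part of this diff, so the exact usage snippet is unknown. As a minimal, unverified sketch of how a TRL SFT fine-tune like this one is typically queried with the transformers pipeline: the Hub repo id below is an assumption based on the old model_name field, and the prompt is a placeholder rather than the dataset's real template.

```python
# Hypothetical inference sketch for this TRL SFT fine-tune of Ministral-8B.
# The repo id is assumed from the old `model_name` in the diff above;
# replace it with the actual Hub id of the fine-tuned checkpoint.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="nbroad/nbroad-odesia-dipro-t1-f5yiqx2u",  # assumed repo id
    device_map="auto",
)

# Chat-formatted input; the real prompt format comes from the training
# dataset and is not shown in this commit, so this is only a placeholder.
messages = [{"role": "user", "content": "Example input in the dataset's prompt format goes here."}]
out = generator(messages, max_new_tokens=32, return_full_text=False)
print(out[0]["generated_text"])
```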