HimashaJ96 committed · Commit 274c845 · 1 Parent(s): 2d53ac5

End of training

README.md CHANGED
@@ -1,13 +1,12 @@
 ---
 license: mit
-
+base_model: TheBloke/zephyr-7B-beta-GPTQ
 tags:
 - trl
 - sft
 - generated_from_trainer
 metrics:
 - rouge
-base_model: TheBloke/zephyr-7B-beta-GPTQ
 model-index:
 - name: zephyr-support-chatbot
   results: []
@@ -20,11 +19,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [TheBloke/zephyr-7B-beta-GPTQ](https://huggingface.co/TheBloke/zephyr-7B-beta-GPTQ) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 1.
-- Rouge1: 0.
-- Rouge2: 0.
-- Rougel: 0.
-- Rougelsum: 0.
+- Loss: 1.2805
+- Rouge1: 0.6842
+- Rouge2: 0.4855
+- Rougel: 0.6563
+- Rougelsum: 0.6711
 
 ## Model description
 
@@ -43,115 +42,42 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate:
+- learning_rate: 2e-05
 - train_batch_size: 16
 - eval_batch_size: 8
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: cosine
-- num_epochs:
+- num_epochs: 20
 - mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum |
 |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|
-| 2.
[17 further deleted rows of the old results table (steps 20–180) are not recoverable from the rendered diff]
-| 0.0609 | 21.11 | 190 | 1.4582 | 0.7271 | 0.5469 | 0.7012 | 0.7136 |
-| 0.0599 | 22.22 | 200 | 1.5727 | 0.7365 | 0.5576 | 0.7073 | 0.7233 |
-| 0.0587 | 23.33 | 210 | 1.5053 | 0.7419 | 0.5605 | 0.7083 | 0.7265 |
-| 0.0532 | 24.44 | 220 | 1.5750 | 0.7372 | 0.5631 | 0.7109 | 0.7235 |
-| 0.0529 | 25.56 | 230 | 1.5663 | 0.7356 | 0.5515 | 0.7082 | 0.7234 |
-| 0.0519 | 26.67 | 240 | 1.5608 | 0.7403 | 0.5601 | 0.7106 | 0.7258 |
-| 0.0502 | 27.78 | 250 | 1.5099 | 0.7314 | 0.5467 | 0.6999 | 0.7183 |
-| 0.0562 | 28.89 | 260 | 1.5654 | 0.7317 | 0.5592 | 0.7051 | 0.7194 |
-| 0.0486 | 30.0 | 270 | 1.5988 | 0.7309 | 0.5556 | 0.7010 | 0.7171 |
-| 0.0451 | 31.11 | 280 | 1.5663 | 0.7301 | 0.5577 | 0.7003 | 0.7177 |
-| 0.0425 | 32.22 | 290 | 1.6243 | 0.7281 | 0.5563 | 0.7022 | 0.7160 |
-| 0.0436 | 33.33 | 300 | 1.6507 | 0.7253 | 0.5509 | 0.6955 | 0.7134 |
-| 0.0419 | 34.44 | 310 | 1.5603 | 0.7334 | 0.5520 | 0.7019 | 0.7195 |
-| 0.0428 | 35.56 | 320 | 1.6508 | 0.7282 | 0.5469 | 0.6954 | 0.7153 |
-| 0.0409 | 36.67 | 330 | 1.7279 | 0.7220 | 0.5396 | 0.6894 | 0.7118 |
-| 0.0406 | 37.78 | 340 | 1.6654 | 0.7324 | 0.5540 | 0.7055 | 0.7217 |
-| 0.0402 | 38.89 | 350 | 1.7581 | 0.7210 | 0.5397 | 0.6923 | 0.7106 |
-| 0.0405 | 40.0 | 360 | 1.6995 | 0.7250 | 0.5472 | 0.6959 | 0.7153 |
-| 0.0393 | 41.11 | 370 | 1.7305 | 0.7234 | 0.5399 | 0.6944 | 0.7138 |
-| 0.0387 | 42.22 | 380 | 1.7684 | 0.7177 | 0.5363 | 0.6884 | 0.7082 |
-| 0.0396 | 43.33 | 390 | 1.7825 | 0.7208 | 0.5390 | 0.6878 | 0.7095 |
-| 0.0391 | 44.44 | 400 | 1.7773 | 0.7222 | 0.5392 | 0.6929 | 0.7124 |
-| 0.0386 | 45.56 | 410 | 1.8209 | 0.7200 | 0.5415 | 0.6904 | 0.7086 |
-| 0.0383 | 46.67 | 420 | 1.7873 | 0.7210 | 0.5403 | 0.6901 | 0.7093 |
-| 0.0387 | 47.78 | 430 | 1.7906 | 0.7186 | 0.5396 | 0.6901 | 0.7095 |
-| 0.0385 | 48.89 | 440 | 1.8082 | 0.7224 | 0.5448 | 0.6954 | 0.7137 |
-| 0.0392 | 50.0 | 450 | 1.7851 | 0.7309 | 0.5472 | 0.6988 | 0.7188 |
-| 0.0386 | 51.11 | 460 | 1.8098 | 0.7201 | 0.5414 | 0.6937 | 0.7100 |
-| 0.038 | 52.22 | 470 | 1.8145 | 0.7214 | 0.5413 | 0.6931 | 0.7114 |
-| 0.0374 | 53.33 | 480 | 1.7956 | 0.7229 | 0.5408 | 0.6919 | 0.7120 |
-| 0.038 | 54.44 | 490 | 1.8609 | 0.7231 | 0.5386 | 0.6876 | 0.7093 |
-| 0.0375 | 55.56 | 500 | 1.8295 | 0.7253 | 0.5400 | 0.6924 | 0.7127 |
-| 0.0384 | 56.67 | 510 | 1.8193 | 0.7238 | 0.5419 | 0.6958 | 0.7138 |
-| 0.0374 | 57.78 | 520 | 1.8510 | 0.7202 | 0.5386 | 0.6890 | 0.7083 |
-| 0.0382 | 58.89 | 530 | 1.8385 | 0.7227 | 0.5403 | 0.6888 | 0.7098 |
-| 0.0374 | 60.0 | 540 | 1.8390 | 0.7203 | 0.5424 | 0.6895 | 0.7089 |
-| 0.0374 | 61.11 | 550 | 1.8651 | 0.7202 | 0.5398 | 0.6902 | 0.7084 |
-| 0.0378 | 62.22 | 560 | 1.8618 | 0.7236 | 0.5402 | 0.6882 | 0.7097 |
-| 0.0374 | 63.33 | 570 | 1.8483 | 0.7203 | 0.5369 | 0.6905 | 0.7097 |
-| 0.0363 | 64.44 | 580 | 1.8637 | 0.7190 | 0.5389 | 0.6897 | 0.7089 |
-| 0.0378 | 65.56 | 590 | 1.8953 | 0.7236 | 0.5369 | 0.6882 | 0.7099 |
-| 0.0377 | 66.67 | 600 | 1.8834 | 0.7210 | 0.5396 | 0.6909 | 0.7104 |
-| 0.037 | 67.78 | 610 | 1.8741 | 0.7210 | 0.5436 | 0.6937 | 0.7117 |
-| 0.0367 | 68.89 | 620 | 1.8890 | 0.7214 | 0.5419 | 0.6917 | 0.7097 |
-| 0.0384 | 70.0 | 630 | 1.8942 | 0.7238 | 0.5432 | 0.6921 | 0.7115 |
-| 0.0368 | 71.11 | 640 | 1.8945 | 0.7250 | 0.5414 | 0.6907 | 0.7116 |
-| 0.0369 | 72.22 | 650 | 1.9093 | 0.7235 | 0.5402 | 0.6896 | 0.7094 |
-| 0.0374 | 73.33 | 660 | 1.9073 | 0.7221 | 0.5432 | 0.6942 | 0.7093 |
-| 0.0368 | 74.44 | 670 | 1.8925 | 0.7202 | 0.5434 | 0.6936 | 0.7097 |
-| 0.0374 | 75.56 | 680 | 1.8965 | 0.7187 | 0.5434 | 0.6936 | 0.7084 |
-| 0.0369 | 76.67 | 690 | 1.9101 | 0.7200 | 0.5422 | 0.6931 | 0.7078 |
-| 0.0369 | 77.78 | 700 | 1.9184 | 0.7186 | 0.5407 | 0.6915 | 0.7074 |
-| 0.0368 | 78.89 | 710 | 1.9334 | 0.7218 | 0.5411 | 0.6896 | 0.7078 |
-| 0.0366 | 80.0 | 720 | 1.9221 | 0.7227 | 0.5411 | 0.6907 | 0.7090 |
-| 0.0364 | 81.11 | 730 | 1.9238 | 0.7227 | 0.5427 | 0.6922 | 0.7090 |
-| 0.0369 | 82.22 | 740 | 1.9318 | 0.7198 | 0.5432 | 0.6931 | 0.7068 |
-| 0.0364 | 83.33 | 750 | 1.9346 | 0.7210 | 0.5432 | 0.6931 | 0.7083 |
-| 0.0377 | 84.44 | 760 | 1.9375 | 0.7212 | 0.5438 | 0.6914 | 0.7070 |
-| 0.0358 | 85.56 | 770 | 1.9375 | 0.7217 | 0.5427 | 0.6922 | 0.7076 |
-| 0.0363 | 86.67 | 780 | 1.9339 | 0.7206 | 0.5427 | 0.6914 | 0.7065 |
-| 0.0376 | 87.78 | 790 | 1.9345 | 0.7206 | 0.5427 | 0.6914 | 0.7065 |
-| 0.0363 | 88.89 | 800 | 1.9342 | 0.7198 | 0.5432 | 0.6931 | 0.7068 |
-| 0.0361 | 90.0 | 810 | 1.9367 | 0.7186 | 0.5422 | 0.6931 | 0.7063 |
-| 0.0363 | 91.11 | 820 | 1.9384 | 0.7198 | 0.5432 | 0.6931 | 0.7068 |
-| 0.0366 | 92.22 | 830 | 1.9390 | 0.7186 | 0.5422 | 0.6931 | 0.7063 |
-| 0.0369 | 93.33 | 840 | 1.9403 | 0.7206 | 0.5438 | 0.6914 | 0.7070 |
-| 0.0358 | 94.44 | 850 | 1.9407 | 0.7212 | 0.5438 | 0.6914 | 0.7070 |
-| 0.0354 | 95.56 | 860 | 1.9409 | 0.7212 | 0.5438 | 0.6914 | 0.7070 |
-| 0.0369 | 96.67 | 870 | 1.9414 | 0.7212 | 0.5438 | 0.6914 | 0.7070 |
-| 0.0361 | 97.78 | 880 | 1.9417 | 0.7212 | 0.5438 | 0.6914 | 0.7070 |
-| 0.0365 | 98.89 | 890 | 1.9420 | 0.7212 | 0.5438 | 0.6914 | 0.7070 |
-| 0.0364 | 100.0 | 900 | 1.9415 | 0.7212 | 0.5438 | 0.6914 | 0.7070 |
+| 2.422 | 1.11 | 10 | 2.7640 | 0.4291 | 0.1054 | 0.3461 | 0.3890 |
+| 2.2454 | 2.22 | 20 | 2.5777 | 0.4423 | 0.1184 | 0.3607 | 0.4034 |
+| 2.1454 | 3.33 | 30 | 2.3809 | 0.4713 | 0.1437 | 0.3860 | 0.4288 |
+| 1.9437 | 4.44 | 40 | 2.1804 | 0.5021 | 0.1646 | 0.4027 | 0.4598 |
+| 1.7975 | 5.56 | 50 | 2.0124 | 0.5355 | 0.1786 | 0.4425 | 0.4941 |
+| 1.6621 | 6.67 | 60 | 1.8249 | 0.5540 | 0.2188 | 0.5011 | 0.5348 |
+| 1.5141 | 7.78 | 70 | 1.6004 | 0.6161 | 0.3377 | 0.5701 | 0.5961 |
+| 1.3291 | 8.89 | 80 | 1.4718 | 0.6513 | 0.3903 | 0.6072 | 0.6322 |
+| 1.2206 | 10.0 | 90 | 1.3916 | 0.6652 | 0.4218 | 0.6265 | 0.6471 |
+| 1.1767 | 11.11 | 100 | 1.3339 | 0.6840 | 0.4769 | 0.6489 | 0.6675 |
+| 1.1462 | 12.22 | 110 | 1.3115 | 0.6807 | 0.4785 | 0.6506 | 0.6665 |
+| 1.0924 | 13.33 | 120 | 1.2993 | 0.6843 | 0.4842 | 0.6539 | 0.6701 |
+| 1.0602 | 14.44 | 130 | 1.2917 | 0.6854 | 0.4845 | 0.6561 | 0.6717 |
+| 1.1177 | 15.56 | 140 | 1.2863 | 0.6835 | 0.4842 | 0.6547 | 0.6703 |
+| 1.0756 | 16.67 | 150 | 1.2830 | 0.6838 | 0.4825 | 0.6549 | 0.6705 |
+| 1.0894 | 17.78 | 160 | 1.2813 | 0.6838 | 0.4844 | 0.6560 | 0.6719 |
+| 1.0649 | 18.89 | 170 | 1.2806 | 0.6842 | 0.4855 | 0.6563 | 0.6711 |
+| 1.1019 | 20.0 | 180 | 1.2805 | 0.6842 | 0.4855 | 0.6563 | 0.6711 |
 
 
 ### Framework versions
 
-- PEFT 0.7.1
 - Transformers 4.35.2
 - Pytorch 2.1.0+cu121
 - Datasets 2.16.0
-- Tokenizers 0.15.0
+- Tokenizers 0.15.0
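The Rouge1/Rouge2/Rougel/Rougelsum numbers in this card are standard n-gram and longest-common-subsequence overlap scores. A minimal sketch of how trainer-generated cards typically compute them, using the `evaluate` library; the two strings below are placeholders, since the evaluation data behind this card is not published:

```python
# Minimal ROUGE sketch with Hugging Face's `evaluate` library. The strings
# are placeholder examples, not the card's actual evaluation data.
import evaluate

rouge = evaluate.load("rouge")
scores = rouge.compute(
    predictions=["Please reset your password from the account settings page."],
    references=["You can reset your password under account settings."],
)
# `scores` maps rouge1/rouge2/rougeL/rougeLsum to floats in [0, 1], the same
# quantities reported as Rouge1/Rouge2/Rougel/Rougelsum in this card.
print(scores)
```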
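The hyperparameter list in the card maps nearly one-to-one onto `transformers.TrainingArguments`. A minimal sketch of the likely configuration, assuming a TRL `SFTTrainer` run as the `trl`/`sft` tags suggest; the actual training script is not part of the card, so model and dataset wiring is omitted:

```python
# Sketch of TrainingArguments matching the hyperparameters listed in the card.
# output_dir is taken from the model-index name; anything the card does not
# state (model, dataset, SFTTrainer wiring) is left out on purpose.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="zephyr-support-chatbot",
    learning_rate=2e-5,              # learning_rate: 2e-05
    per_device_train_batch_size=16,  # train_batch_size: 16
    per_device_eval_batch_size=8,    # eval_batch_size: 8
    seed=42,                         # seed: 42
    adam_beta1=0.9,                  # optimizer: Adam with betas=(0.9,0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,               # ... and epsilon=1e-08
    lr_scheduler_type="cosine",      # lr_scheduler_type: cosine
    num_train_epochs=20,             # num_epochs: 20
    fp16=True,                       # mixed_precision_training: Native AMP
)
```

Consistent with these settings, the new results table logs one evaluation every 10 optimizer steps, and 20 epochs come to 180 steps at this batch size.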
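For completeness, a hedged inference sketch. The repo id `HimashaJ96/zephyr-support-chatbot` is inferred from the commit author and the model-index name, and the pre-commit card listed PEFT 0.7.1, so the fine-tune is assumed here to be a PEFT (LoRA) adapter on top of the GPTQ base; loading GPTQ weights additionally requires `optimum` and `auto-gptq` to be installed:

```python
# Hedged sketch, not the card's documented usage: assumes the fine-tune is a
# PEFT adapter over the GPTQ base, and that the repo id below (inferred from
# this commit page) is correct.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "TheBloke/zephyr-7B-beta-GPTQ"          # base model named in the card
adapter_id = "HimashaJ96/zephyr-support-chatbot"  # inferred repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)

inputs = tokenizer("How do I reset my password?", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```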