gjonesQ02 committed on
Commit
008a9bc
1 Parent(s): 5a7b680

End of training

Browse files
README.md ADDED
@@ -0,0 +1,206 @@
+ ---
+ license: apache-2.0
+ base_model: distilgpt2
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: StatementOfWork_Generator_Omega_BS_512_2
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # StatementOfWork_Generator_Omega_BS_512_2
+
+ This model is a fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.9847
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
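+ While the sections above are still to be completed, the snippet below is a minimal, untested sketch of how a card like this is typically exercised with the `transformers` text-generation pipeline. The repository id is inferred from the model name in this commit, and the prompt is purely illustrative.
+
+ ```python
+ from transformers import pipeline
+
+ # Hypothetical repo id, inferred from the model name above.
+ generator = pipeline(
+     "text-generation",
+     model="gjonesQ02/StatementOfWork_Generator_Omega_BS_512_2",
+ )
+
+ # distilgpt2 is a causal LM, so it continues whatever prompt it is given.
+ result = generator("Scope of Work:", max_length=50, do_sample=True)
+ print(result[0]["generated_text"])
+ ```
+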
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training (see the sketch after this list):
+ - learning_rate: 2e-05
+ - train_batch_size: 50
+ - eval_batch_size: 50
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 150
+
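+ As a reproduction aid, the sketch below maps the listed hyperparameters onto `transformers.TrainingArguments`. Everything not listed above (output directory, evaluation strategy, dataset wiring) is an assumption rather than something recorded in this commit; the Adam betas and epsilon in the list are the library defaults.
+
+ ```python
+ from transformers import TrainingArguments
+
+ training_args = TrainingArguments(
+     output_dir="StatementOfWork_Generator_Omega_BS_512_2",  # placeholder
+     learning_rate=2e-5,
+     per_device_train_batch_size=50,
+     per_device_eval_batch_size=50,
+     seed=42,
+     lr_scheduler_type="linear",
+     num_train_epochs=150,
+     evaluation_strategy="epoch",  # assumed: the results table logs one eval per epoch
+ )
+ ```
+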
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:-----:|:----:|:---------------:|
+ | No log | 1.0 | 4 | 1.3147 |
+ | No log | 2.0 | 8 | 1.3104 |
+ | No log | 3.0 | 12 | 1.3048 |
+ | No log | 4.0 | 16 | 1.2989 |
+ | No log | 5.0 | 20 | 1.2924 |
+ | No log | 6.0 | 24 | 1.2882 |
+ | No log | 7.0 | 28 | 1.2813 |
+ | No log | 8.0 | 32 | 1.2768 |
+ | No log | 9.0 | 36 | 1.2722 |
+ | No log | 10.0 | 40 | 1.2675 |
+ | No log | 11.0 | 44 | 1.2619 |
+ | No log | 12.0 | 48 | 1.2614 |
+ | No log | 13.0 | 52 | 1.2515 |
+ | No log | 14.0 | 56 | 1.2459 |
+ | No log | 15.0 | 60 | 1.2444 |
+ | No log | 16.0 | 64 | 1.2373 |
+ | No log | 17.0 | 68 | 1.2364 |
+ | No log | 18.0 | 72 | 1.2286 |
+ | No log | 19.0 | 76 | 1.2269 |
+ | No log | 20.0 | 80 | 1.2196 |
+ | No log | 21.0 | 84 | 1.2205 |
+ | No log | 22.0 | 88 | 1.2131 |
+ | No log | 23.0 | 92 | 1.2086 |
+ | No log | 24.0 | 96 | 1.2060 |
+ | No log | 25.0 | 100 | 1.1999 |
+ | No log | 26.0 | 104 | 1.1943 |
+ | No log | 27.0 | 108 | 1.1909 |
+ | No log | 28.0 | 112 | 1.1879 |
+ | No log | 29.0 | 116 | 1.1841 |
+ | No log | 30.0 | 120 | 1.1813 |
+ | No log | 31.0 | 124 | 1.1735 |
+ | No log | 32.0 | 128 | 1.1756 |
+ | No log | 33.0 | 132 | 1.1711 |
+ | No log | 34.0 | 136 | 1.1677 |
+ | No log | 35.0 | 140 | 1.1627 |
+ | No log | 36.0 | 144 | 1.1565 |
+ | No log | 37.0 | 148 | 1.1567 |
+ | No log | 38.0 | 152 | 1.1517 |
+ | No log | 39.0 | 156 | 1.1481 |
+ | No log | 40.0 | 160 | 1.1442 |
+ | No log | 41.0 | 164 | 1.1407 |
+ | No log | 42.0 | 168 | 1.1383 |
+ | No log | 43.0 | 172 | 1.1354 |
+ | No log | 44.0 | 176 | 1.1322 |
+ | No log | 45.0 | 180 | 1.1287 |
+ | No log | 46.0 | 184 | 1.1243 |
+ | No log | 47.0 | 188 | 1.1217 |
+ | No log | 48.0 | 192 | 1.1186 |
+ | No log | 49.0 | 196 | 1.1175 |
+ | No log | 50.0 | 200 | 1.1145 |
+ | No log | 51.0 | 204 | 1.1116 |
+ | No log | 52.0 | 208 | 1.1077 |
+ | No log | 53.0 | 212 | 1.1039 |
+ | No log | 54.0 | 216 | 1.1031 |
+ | No log | 55.0 | 220 | 1.1011 |
+ | No log | 56.0 | 224 | 1.0970 |
+ | No log | 57.0 | 228 | 1.0935 |
+ | No log | 58.0 | 232 | 1.0937 |
+ | No log | 59.0 | 236 | 1.0898 |
+ | No log | 60.0 | 240 | 1.0853 |
+ | No log | 61.0 | 244 | 1.0845 |
+ | No log | 62.0 | 248 | 1.0825 |
+ | No log | 63.0 | 252 | 1.0778 |
+ | No log | 64.0 | 256 | 1.0766 |
+ | No log | 65.0 | 260 | 1.0749 |
+ | No log | 66.0 | 264 | 1.0707 |
+ | No log | 67.0 | 268 | 1.0702 |
+ | No log | 68.0 | 272 | 1.0696 |
+ | No log | 69.0 | 276 | 1.0659 |
+ | No log | 70.0 | 280 | 1.0641 |
+ | No log | 71.0 | 284 | 1.0636 |
+ | No log | 72.0 | 288 | 1.0579 |
+ | No log | 73.0 | 292 | 1.0551 |
+ | No log | 74.0 | 296 | 1.0565 |
+ | No log | 75.0 | 300 | 1.0531 |
+ | No log | 76.0 | 304 | 1.0493 |
+ | No log | 77.0 | 308 | 1.0511 |
+ | No log | 78.0 | 312 | 1.0494 |
+ | No log | 79.0 | 316 | 1.0468 |
+ | No log | 80.0 | 320 | 1.0455 |
+ | No log | 81.0 | 324 | 1.0421 |
+ | No log | 82.0 | 328 | 1.0399 |
+ | No log | 83.0 | 332 | 1.0402 |
+ | No log | 84.0 | 336 | 1.0357 |
+ | No log | 85.0 | 340 | 1.0351 |
+ | No log | 86.0 | 344 | 1.0349 |
+ | No log | 87.0 | 348 | 1.0335 |
+ | No log | 88.0 | 352 | 1.0299 |
+ | No log | 89.0 | 356 | 1.0293 |
+ | No log | 90.0 | 360 | 1.0292 |
+ | No log | 91.0 | 364 | 1.0268 |
+ | No log | 92.0 | 368 | 1.0254 |
+ | No log | 93.0 | 372 | 1.0246 |
+ | No log | 94.0 | 376 | 1.0241 |
+ | No log | 95.0 | 380 | 1.0218 |
+ | No log | 96.0 | 384 | 1.0183 |
+ | No log | 97.0 | 388 | 1.0173 |
+ | No log | 98.0 | 392 | 1.0177 |
+ | No log | 99.0 | 396 | 1.0161 |
+ | No log | 100.0 | 400 | 1.0134 |
+ | No log | 101.0 | 404 | 1.0135 |
+ | No log | 102.0 | 408 | 1.0129 |
+ | No log | 103.0 | 412 | 1.0118 |
+ | No log | 104.0 | 416 | 1.0107 |
+ | No log | 105.0 | 420 | 1.0087 |
+ | No log | 106.0 | 424 | 1.0064 |
+ | No log | 107.0 | 428 | 1.0046 |
+ | No log | 108.0 | 432 | 1.0047 |
+ | No log | 109.0 | 436 | 1.0050 |
+ | No log | 110.0 | 440 | 1.0052 |
+ | No log | 111.0 | 444 | 1.0033 |
+ | No log | 112.0 | 448 | 1.0013 |
+ | No log | 113.0 | 452 | 1.0001 |
+ | No log | 114.0 | 456 | 1.0005 |
+ | No log | 115.0 | 460 | 1.0001 |
+ | No log | 116.0 | 464 | 0.9982 |
+ | No log | 117.0 | 468 | 0.9965 |
+ | No log | 118.0 | 472 | 0.9955 |
+ | No log | 119.0 | 476 | 0.9952 |
+ | No log | 120.0 | 480 | 0.9954 |
+ | No log | 121.0 | 484 | 0.9950 |
+ | No log | 122.0 | 488 | 0.9941 |
+ | No log | 123.0 | 492 | 0.9936 |
+ | No log | 124.0 | 496 | 0.9926 |
+ | 0.7114 | 125.0 | 500 | 0.9919 |
+ | 0.7114 | 126.0 | 504 | 0.9915 |
+ | 0.7114 | 127.0 | 508 | 0.9911 |
+ | 0.7114 | 128.0 | 512 | 0.9906 |
+ | 0.7114 | 129.0 | 516 | 0.9895 |
+ | 0.7114 | 130.0 | 520 | 0.9889 |
+ | 0.7114 | 131.0 | 524 | 0.9884 |
+ | 0.7114 | 132.0 | 528 | 0.9882 |
+ | 0.7114 | 133.0 | 532 | 0.9885 |
+ | 0.7114 | 134.0 | 536 | 0.9886 |
+ | 0.7114 | 135.0 | 540 | 0.9883 |
+ | 0.7114 | 136.0 | 544 | 0.9879 |
+ | 0.7114 | 137.0 | 548 | 0.9874 |
+ | 0.7114 | 138.0 | 552 | 0.9869 |
+ | 0.7114 | 139.0 | 556 | 0.9867 |
+ | 0.7114 | 140.0 | 560 | 0.9865 |
+ | 0.7114 | 141.0 | 564 | 0.9867 |
+ | 0.7114 | 142.0 | 568 | 0.9869 |
+ | 0.7114 | 143.0 | 572 | 0.9867 |
+ | 0.7114 | 144.0 | 576 | 0.9862 |
+ | 0.7114 | 145.0 | 580 | 0.9857 |
+ | 0.7114 | 146.0 | 584 | 0.9853 |
+ | 0.7114 | 147.0 | 588 | 0.9850 |
+ | 0.7114 | 148.0 | 592 | 0.9848 |
+ | 0.7114 | 149.0 | 596 | 0.9847 |
+ | 0.7114 | 150.0 | 600 | 0.9847 |
+
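+ For context, the final validation loss of 0.9847 corresponds to a perplexity of roughly 2.68, assuming the loss is the usual per-token cross-entropy for causal language modeling:
+
+ ```python
+ import math
+
+ # Perplexity is exp(cross-entropy loss) for a causal LM.
+ print(math.exp(0.9847))  # ~2.677
+ ```
+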
+
+ ### Framework versions
+
+ - Transformers 4.38.2
+ - PyTorch 2.2.1+cu121
+ - Datasets 2.18.0
+ - Tokenizers 0.15.2
config.json ADDED
@@ -0,0 +1,46 @@
+ {
+   "_name_or_path": "distilgpt2",
+   "_num_labels": 1,
+   "activation_function": "gelu_new",
+   "architectures": [
+     "GPT2LMHeadModel"
+   ],
+   "attn_pdrop": 0.1,
+   "bos_token_id": 50256,
+   "embd_pdrop": 0.1,
+   "eos_token_id": 50256,
+   "id2label": {
+     "0": "LABEL_0"
+   },
+   "initializer_range": 0.02,
+   "label2id": {
+     "LABEL_0": 0
+   },
+   "layer_norm_epsilon": 1e-05,
+   "model_type": "gpt2",
+   "n_ctx": 1024,
+   "n_embd": 768,
+   "n_head": 12,
+   "n_inner": null,
+   "n_layer": 6,
+   "n_positions": 1024,
+   "reorder_and_upcast_attn": false,
+   "resid_pdrop": 0.1,
+   "scale_attn_by_inverse_layer_idx": false,
+   "scale_attn_weights": true,
+   "summary_activation": null,
+   "summary_first_dropout": 0.1,
+   "summary_proj_to_labels": true,
+   "summary_type": "cls_index",
+   "summary_use_proj": true,
+   "task_specific_params": {
+     "text-generation": {
+       "do_sample": true,
+       "max_length": 50
+     }
+   },
+   "torch_dtype": "float32",
+   "transformers_version": "4.38.2",
+   "use_cache": true,
+   "vocab_size": 50257
+ }
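This is the stock distilgpt2 architecture: 6 transformer layers, 768-dimensional embeddings, 12 attention heads, and a 1024-token context window. As a small illustration (the repo id is inferred from this commit), the fields can be inspected without downloading the weights:

```python
from transformers import AutoConfig

# Hypothetical repo id; a local clone of this repository works as well.
config = AutoConfig.from_pretrained("gjonesQ02/StatementOfWork_Generator_Omega_BS_512_2")
print(config.n_layer, config.n_embd, config.n_head)  # 6 768 12
```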
generation_config.json ADDED
@@ -0,0 +1,6 @@
+ {
+   "_from_model_config": true,
+   "bos_token_id": 50256,
+   "eos_token_id": 50256,
+   "transformers_version": "4.38.2"
+ }
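This file only pins the BOS/EOS token ids, so generation otherwise falls back to the `task_specific_params` in config.json (sampling enabled, max length 50). A minimal sketch of direct generation under those defaults, with the repo id again inferred from this commit:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "gjonesQ02/StatementOfWork_Generator_Omega_BS_512_2"  # hypothetical id
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(repo)

inputs = tokenizer("Scope of Work:", return_tensors="pt")
# do_sample/max_length mirror the text-generation defaults in config.json.
outputs = model.generate(**inputs, do_sample=True, max_length=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```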
merges.txt ADDED
The diff for this file is too large to render. See raw diff
 
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d0971079e1f04a3e710f79cdc57ed4e1688b8b95dce8268423cf665b76e89483
+ size 327657928
runs/Jul01_12-53-02_viridian/events.out.tfevents.1719838388.viridian.3777261.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:52fd4d93b1045bf0c9ffe83871cc698ef4492173f64b5623af54b285e5d13a2f
+ size 5047
runs/Jul01_12-55-33_viridian/events.out.tfevents.1719838549.viridian.3777261.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9cc9df34c5e95c6084d4d92e7c07eabfb73a9816e624ee16b743ca5193b6d260
+ size 5047
runs/Jul01_13-28-02_viridian/events.out.tfevents.1719840502.viridian.3778847.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:eb47e1bd52ece937bb1f5b095eedb2764545e7127328b6a4f6b051a3164b59f8
+ size 13564
runs/Jul01_13-38-05_viridian/events.out.tfevents.1719841093.viridian.3786935.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1c9e5fd685e05aea763167f0edb277757448b7850932ba604787ff941a6846a3
+ size 46107
runs/Jul01_13-38-05_viridian/events.out.tfevents.1719843146.viridian.3786935.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b6fe1d2b3ed10cee051ae08da08993bfc32ab727804ed83e88f531588dd41907
+ size 46135
special_tokens_map.json ADDED
@@ -0,0 +1,5 @@
+ {
+   "bos_token": "<|endoftext|>",
+   "eos_token": "<|endoftext|>",
+   "unk_token": "<|endoftext|>"
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,19 @@
+ {
+   "add_prefix_space": false,
+   "added_tokens_decoder": {
+     "50256": {
+       "content": "<|endoftext|>",
+       "lstrip": false,
+       "normalized": true,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "bos_token": "<|endoftext|>",
+   "clean_up_tokenization_spaces": true,
+   "eos_token": "<|endoftext|>",
+   "model_max_length": 1024,
+   "tokenizer_class": "GPT2Tokenizer",
+   "unk_token": "<|endoftext|>"
+ }
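As with GPT-2 tokenizers generally, a single special token (`<|endoftext|>`, id 50256) serves as BOS, EOS, and UNK, and no pad token is defined. A short sketch of the usual consequence when batching (repo id inferred from this commit; reusing EOS as the pad token is a common convention, not something this file specifies):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gjonesQ02/StatementOfWork_Generator_Omega_BS_512_2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 ships without a pad token
print(tokenizer.bos_token_id, tokenizer.eos_token_id, tokenizer.unk_token_id)  # 50256 50256 50256
```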
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:df83f7a348c9e1e59cb73e220cb5e770e20b96ea2f0fd40e2f5dd8a85e864297
+ size 5048
vocab.json ADDED
The diff for this file is too large to render. See raw diff