End of training

- README.md +112 -0
- generation_config.json +6 -0

README.md ADDED
@@ -0,0 +1,112 @@
---
base_model: gpt2
library_name: distily
license: mit
tags:
- generated_from_trainer
model-index:
- name: distily_bench_gpt2_optim_extended2
  results: []
---

# distily_bench_gpt2_optim_extended2

This student model is distilled from the teacher model [gpt2](https://huggingface.co/gpt2) using the dataset (unspecified).

The [Distily](https://github.com/lapp0/distily) library was used for this distillation.

It achieves the following results on the evaluation set:
- eval_enwikippl: 859.1765
- eval_frwikippl: 4914.2627
- eval_zhwikippl: 7375.7471
- eval_loss: 6893.8027
- eval_runtime: 65.3454
- eval_samples_per_second: 45.91
- eval_steps_per_second: 11.477
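
The distilled student is an ordinary causal language model, so it can be loaded and sampled with `transformers` like any other checkpoint. A minimal sketch; the repository id below is a placeholder, since this card does not state where the model is published:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute the actual Hub path of this model.
repo_id = "distily_bench_gpt2_optim_extended2"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Knowledge distillation works by", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```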

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
-->

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- distillation_objective: 'legacy'
- loss_fn: kl
- train_embeddings: True
- learning_rate: 4e-05
- train_batch_size: 8
- eval_batch_size: 4
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- num_epochs: 1.0
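
The 'legacy' distillation objective is defined inside Distily and is not spelled out on this card. Purely to illustrate what the `loss_fn: kl` setting usually denotes, here is a minimal sketch of a token-level KL distillation loss in PyTorch; the function name and reduction choice are assumptions, not Distily's actual code:

```python
import torch.nn.functional as F

def kl_distillation_loss(student_logits, teacher_logits):
    # KL(teacher || student) over the vocabulary at each position,
    # averaged over the batch. Logits: (batch, seq_len, vocab_size).
    student_log_probs = F.log_softmax(student_logits, dim=-1)
    teacher_log_probs = F.log_softmax(teacher_logits, dim=-1)
    return F.kl_div(
        student_log_probs,   # input: student log-probabilities
        teacher_log_probs,   # target: teacher log-probabilities
        log_target=True,
        reduction="batchmean",
    )
```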

### Resource Usage
Peak GPU Memory: 8.3331 GB

### Eval-Phase Metrics
| step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| **teacher eval** | | 30.2385 | 57.2728 | | | | | 18.1772 |
| 0 | 0 | 59513.7266 | 59655.5039 | 330435.25 | 63.4949 | 47.248 | 11.812 | 53382.8789 |
| 500 | 0.0269 | 2863.2634 | 11508.8330 | 12259.9258 | 63.8689 | 46.971 | 11.743 | 38858.9141 |
| 1000 | 0.0539 | 2136.9172 | 7522.9854 | 10143.5098 | 63.9165 | 46.936 | 11.734 | 16435.1426 |
| 1500 | 0.0808 | 1930.7159 | 7313.2690 | 9805.8242 | 64.0662 | 46.827 | 11.707 | 14467.3398 |
| 2000 | 0.1077 | 1691.7502 | 6743.2197 | 9288.6826 | 64.7936 | 46.301 | 11.575 | 13923.3652 |
| 2500 | 0.1347 | 1564.1060 | 6471.2065 | 8980.0957 | 65.967 | 45.477 | 11.369 | 13381.9658 |
| 3000 | 0.1616 | 1452.3389 | 6264.2476 | 8701.7598 | 65.1396 | 46.055 | 11.514 | 14317.4297 |
| 3500 | 0.1886 | 1346.0464 | 6201.8452 | 8515.4990 | 64.711 | 46.36 | 11.59 | 14058.8223 |
| 4000 | 0.2155 | 1293.1278 | 5995.4678 | 8215.7871 | 64.881 | 46.238 | 11.56 | 14290.6924 |
| 4500 | 0.2424 | 1188.8944 | 5653.4883 | 7998.4854 | 64.8042 | 46.293 | 11.573 | 12600.6602 |
| 5000 | 0.2694 | 1150.8783 | 5734.7847 | 7816.5864 | 65.0575 | 46.113 | 11.528 | 11801.7627 |
| 5500 | 0.2963 | 1106.1051 | 5579.8325 | 7761.1094 | 64.8301 | 46.275 | 11.569 | 10491.8916 |
| 6000 | 0.3232 | 1062.4747 | 5325.9385 | 7568.8638 | 64.8331 | 46.273 | 11.568 | 9862.1807 |
| 6500 | 0.3502 | 1016.8871 | 5219.4351 | 7424.0107 | 65.1855 | 46.023 | 11.506 | 8266.7178 |
| 7000 | 0.3771 | 990.4482 | 5391.6797 | 7382.8267 | 64.9653 | 46.179 | 11.545 | 8352.1592 |
| 7500 | 0.4040 | 957.9040 | 5171.6309 | 7296.6401 | 64.9498 | 46.19 | 11.547 | 7235.2671 |
| 8000 | 0.4310 | 936.6096 | 5295.7979 | 7155.7334 | 64.9012 | 46.224 | 11.556 | 8302.1201 |
| 8500 | 0.4579 | 893.5040 | 5091.3115 | 7057.3013 | 65.0288 | 46.133 | 11.533 | 7185.6812 |
| 9000 | 0.4848 | 894.5970 | 5188.4321 | 6994.9229 | 65.1952 | 46.016 | 11.504 | 8320.9854 |
| 9500 | 0.5118 | 859.1765 | 4914.2627 | 6893.8027 | 65.3454 | 45.91 | 11.477 | 7375.7471 |
| 10000 | 0.5387 | 844.4594 | 5006.5923 | 6830.0801 | 64.9208 | 46.21 | 11.553 | 7772.1235 |
| 10500 | 0.5657 | 825.0941 | 4731.5112 | 6761.4507 | 64.7216 | 46.352 | 11.588 | 6593.5493 |
| 11000 | 0.5926 | 805.9855 | 4779.2881 | 6747.7651 | 65.2107 | 46.005 | 11.501 | 6463.2256 |
| 11500 | 0.6195 | 798.3549 | 4823.9761 | 6676.7681 | 65.0141 | 46.144 | 11.536 | 9802.4414 |
| 12000 | 0.6465 | 791.6104 | 4696.2798 | 6579.1147 | 64.8969 | 46.227 | 11.557 | 7084.2090 |
| 12500 | 0.6734 | 761.6416 | 4785.1895 | 6548.4907 | 64.8647 | 46.25 | 11.563 | 9604.8457 |
| 13000 | 0.7003 | 761.1536 | 4701.0835 | 6501.8882 | 65.0795 | 46.097 | 11.524 | 6464.0889 |
| 13500 | 0.7273 | 734.9013 | 4585.2031 | 6514.5386 | 65.3072 | 45.937 | 11.484 | 6561.9272 |
| 14000 | 0.7542 | 740.7595 | 4580.6797 | 6382.9121 | 65.4037 | 45.869 | 11.467 | 6729.2021 |
| 14500 | 0.7811 | 711.6922 | 4399.3062 | 6421.3013 | 64.9755 | 46.171 | 11.543 | 10408.1582 |
| 15000 | 0.8081 | 699.6238 | 4419.9839 | 6434.4214 | 64.9281 | 46.205 | 11.551 | 6650.1450 |
| 15500 | 0.8350 | 689.9260 | 4160.6733 | 6344.1709 | 65.0911 | 46.089 | 11.522 | 5445.5518 |
| 16000 | 0.8620 | 690.0198 | 4297.5234 | 6258.8799 | 64.8144 | 46.286 | 11.572 | 4706.0220 |
| 16500 | 0.8889 | 691.7233 | 4264.9258 | 6328.2451 | 65.5974 | 45.734 | 11.433 | 5371.5254 |
| 17000 | 0.9158 | 665.1479 | 4280.1362 | 6238.0371 | 64.909 | 46.219 | 11.555 | 7401.3955 |
| 17500 | 0.9428 | 668.0211 | 4168.7480 | 6233.2695 | 64.7669 | 46.32 | 11.58 | 6142.2017 |
| 18000 | 0.9697 | 653.9958 | 4099.5210 | 6188.6719 | 64.1993 | 46.729 | 11.682 | 5851.9600 |
| 18500 | 0.9966 | 644.6927 | 3924.1768 | 6175.1572 | 64.4251 | 46.566 | 11.641 | 5162.6572 |
| 18562 | 1.0000 | 649.2895 | 3895.5068 | 6124.0107 | 64.4839 | 46.523 | 11.631 | 4821.7959 |
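
The `*ppl` columns are perplexities (the en/fr/zh prefixes presumably denote English, French, and Chinese Wikipedia text); lower is better, and the gap between the teacher's 30.24 and the student's final 649.29 on enwiki shows how much headroom remains. For reference, a minimal sketch of how perplexity is commonly computed for a Hugging Face causal LM; the helper name is illustrative:

```python
import math
import torch

@torch.no_grad()
def perplexity(model, input_ids):
    # Passing labels=input_ids makes the causal LM head shift the labels
    # internally and return the mean token cross-entropy; exp() of that
    # mean is the perplexity.
    loss = model(input_ids=input_ids, labels=input_ids).loss
    return math.exp(loss.item())
```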

### Framework versions
- Distily 0.2.0
- Transformers 4.44.0
- Pytorch 2.3.0
- Datasets 2.20.0

generation_config.json ADDED
@@ -0,0 +1,6 @@
{
  "_from_model_config": true,
  "bos_token_id": 50256,
  "eos_token_id": 50256,
  "transformers_version": "4.44.0"
}
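
These defaults are picked up automatically by `transformers` at generation time; token id 50256 is GPT-2's end-of-text token, serving here as both BOS and EOS. A minimal sketch of inspecting them, with a placeholder model path:

```python
from transformers import GenerationConfig

# Placeholder path -- point at this model's Hub repo or local directory.
gen_config = GenerationConfig.from_pretrained("path/to/distilled-model")

# Both ids are 50256 (GPT-2's <|endoftext|>), so generation stops as
# soon as the model emits that token.
print(gen_config.bos_token_id, gen_config.eos_token_id)
```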