End of training
Browse files
- README.md +47 -47
- logs/optim=adalomo/events.out.tfevents.1723326007.93d6cbb3ad53 +3 -0
- logs/optim=paged_adamw_8bit/events.out.tfevents.1723328775.93d6cbb3ad53 +3 -0
- logs/optim=paged_adamw_8bit/events.out.tfevents.1723341311.93d6cbb3ad53 +3 -0
- logs/optim=paged_lion_32bit/events.out.tfevents.1723313305.93d6cbb3ad53 +3 -0
- logs/optim=paged_lion_32bit/events.out.tfevents.1723325781.93d6cbb3ad53 +3 -0
- logs/optim=paged_lion_8bit/events.out.tfevents.1723300341.93d6cbb3ad53 +3 -0
- logs/optim=paged_lion_8bit/events.out.tfevents.1723300593.93d6cbb3ad53 +3 -0
- logs/optim=paged_lion_8bit/events.out.tfevents.1723313087.93d6cbb3ad53 +3 -0
- model.safetensors +1 -1
- training_args.bin +2 -2
README.md
CHANGED
@@ -16,13 +16,13 @@ This student model is distilled from the teacher model [gpt2](https://huggingfac
 The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
 It achieves the following results on the evaluation set:
-- eval_enwikippl:
-- eval_frwikippl:
-- eval_zhwikippl:
-- eval_loss:
-- eval_runtime: 64.
-- eval_samples_per_second: 46.
-- eval_steps_per_second: 11.
+- eval_enwikippl: 1375.5275
+- eval_frwikippl: 6766.5586
+- eval_zhwikippl: 17695.9277
+- eval_loss: 8329.1631
+- eval_runtime: 64.7405
+- eval_samples_per_second: 46.339
+- eval_steps_per_second: 11.585
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment.
@@ -59,51 +59,51 @@ The following hyperparameters were used during training:
 - num_epochs: 1.0
 
 ### Resource Usage
-Peak GPU Memory: 8.
+Peak GPU Memory: 8.3354 GB
 
 ### Eval-Phase Metrics
 | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- |
 | **teacher eval** | | 30.2385 | 57.2728 | | | | | 18.1772 |
-| 0 | 0 |
-| 500 | 0.0269 |
-| 1000 | 0.0539 |
-| 1500 | 0.0808 |
-| 2000 | 0.1077 |
-| 2500 | 0.1347 |
-| 3000 | 0.1616 |
-| 3500 | 0.1886 |
-| 4000 | 0.2155 |
-| 4500 | 0.2424 |
-| 5000 | 0.2694 |
-| 5500 | 0.2963 |
-| 6000 | 0.3232 |
-| 6500 | 0.3502 |
-| 7000 | 0.3771 |
-| 7500 | 0.4040 |
-| 8000 | 0.4310 |
-| 8500 | 0.4579 |
-| 9000 | 0.4848 |
-| 9500 | 0.5118 |
-| 10000 | 0.5387 |
-| 10500 | 0.5657 |
-| 11000 | 0.5926 |
-| 11500 | 0.6195 |
-| 12000 | 0.6465 |
-| 12500 | 0.6734 |
-| 13000 | 0.7003 |
-| 13500 | 0.7273 |
-| 14000 | 0.7542 |
-| 14500 | 0.7811 |
-| 15000 | 0.8081 |
-| 15500 | 0.8350 |
-| 16000 | 0.8620 |
-| 16500 | 0.8889 |
-| 17000 | 0.9158 |
-| 17500 | 0.9428 |
-| 18000 | 0.9697 |
-| 18500 | 0.9966 |
-| 18562 | 1.0000 |
+| 0 | 0 | 59938.1289 | 59232.2031 | 331532.9688 | 64.3677 | 46.607 | 11.652 | 56882.4922 |
+| 500 | 0.0269 | 3156.7812 | 13407.7314 | 11482.2617 | 64.7552 | 46.328 | 11.582 | 46374.0977 |
+| 1000 | 0.0539 | 2456.4883 | 10286.8799 | 10423.125 | 66.7762 | 44.926 | 11.232 | 37178.7227 |
+| 1500 | 0.0808 | 2211.4065 | 9323.9248 | 9931.5410 | 68.5121 | 43.788 | 10.947 | 30078.5703 |
+| 2000 | 0.1077 | 2036.7498 | 8931.4150 | 9687.9463 | 65.8573 | 45.553 | 11.388 | 28182.8398 |
+| 2500 | 0.1347 | 1920.9956 | 8044.7114 | 9415.5947 | 65.1775 | 46.028 | 11.507 | 24796.8789 |
+| 3000 | 0.1616 | 1873.2666 | 8170.4624 | 9294.6562 | 65.2306 | 45.991 | 11.498 | 24111.1641 |
+| 3500 | 0.1886 | 1794.6318 | 8009.0542 | 9103.3604 | 64.6784 | 46.383 | 11.596 | 24227.3574 |
+| 4000 | 0.2155 | 1767.7296 | 7658.9282 | 9101.5039 | 65.4052 | 45.868 | 11.467 | 22171.6113 |
+| 4500 | 0.2424 | 1662.8684 | 7530.4146 | 8929.0029 | 64.9694 | 46.176 | 11.544 | 22204.2188 |
+| 5000 | 0.2694 | 1649.6533 | 7726.7241 | 8838.6562 | 65.1184 | 46.07 | 11.517 | 22987.1426 |
+| 5500 | 0.2963 | 1599.3251 | 7247.0552 | 8786.3145 | 64.8678 | 46.248 | 11.562 | 20469.9453 |
+| 6000 | 0.3232 | 1576.4832 | 7656.2266 | 8733.5889 | 63.9508 | 46.911 | 11.728 | 22970.2676 |
+| 6500 | 0.3502 | 1542.9945 | 7010.3413 | 8645.2266 | 64.1329 | 46.778 | 11.694 | 19388.3926 |
+| 7000 | 0.3771 | 1508.8114 | 6926.3296 | 8571.9463 | 64.1586 | 46.759 | 11.69 | 19860.0664 |
+| 7500 | 0.4040 | 1468.5557 | 6836.5732 | 8549.0029 | 64.4335 | 46.56 | 11.64 | 18730.5410 |
+| 8000 | 0.4310 | 1446.2615 | 6887.3745 | 8446.8584 | 64.7914 | 46.302 | 11.576 | 18665.6152 |
+| 8500 | 0.4579 | 1424.2772 | 6938.0576 | 8386.9863 | 64.6538 | 46.401 | 11.6 | 19722.6406 |
+| 9000 | 0.4848 | 1396.2977 | 6694.8984 | 8320.6289 | 64.4737 | 46.531 | 11.633 | 16747.5215 |
+| 9500 | 0.5118 | 1375.5275 | 6766.5586 | 8329.1631 | 64.7405 | 46.339 | 11.585 | 17695.9277 |
+| 10000 | 0.5387 | 1362.8486 | 6724.2305 | 8235.6797 | 64.3591 | 46.613 | 11.653 | 18138.5488 |
+| 10500 | 0.5657 | 1323.8888 | 6641.3037 | 8259.5732 | 64.5004 | 46.511 | 11.628 | 18308.9023 |
+| 11000 | 0.5926 | 1313.7504 | 6603.9517 | 8208.4697 | 63.9043 | 46.945 | 11.736 | 16451.6074 |
+| 11500 | 0.6195 | 1301.4640 | 6654.8970 | 8135.1362 | 63.8702 | 46.97 | 11.743 | 17253.7539 |
+| 12000 | 0.6465 | 1276.9374 | 6700.5708 | 8126.4321 | 63.8094 | 47.015 | 11.754 | 17780.0176 |
+| 12500 | 0.6734 | 1261.5624 | 6390.4980 | 8023.9038 | 64.2272 | 46.709 | 11.677 | 16394.5977 |
+| 13000 | 0.7003 | 1266.7156 | 6481.2490 | 8042.4531 | 63.9608 | 46.904 | 11.726 | 17241.0879 |
+| 13500 | 0.7273 | 1234.4520 | 6369.3550 | 8034.0479 | 64.8274 | 46.277 | 11.569 | 16078.0498 |
+| 14000 | 0.7542 | 1225.3788 | 6342.9141 | 7984.6187 | 64.523 | 46.495 | 11.624 | 16475.8027 |
+| 14500 | 0.7811 | 1191.1580 | 6149.5986 | 7939.2002 | 64.9092 | 46.218 | 11.555 | 14996.4697 |
+| 15000 | 0.8081 | 1184.7468 | 6450.7070 | 7919.1147 | 65.2526 | 45.975 | 11.494 | 16544.1367 |
+| 15500 | 0.8350 | 1166.7611 | 6195.7266 | 7864.8428 | 64.9659 | 46.178 | 11.545 | 14867.8652 |
+| 16000 | 0.8620 | 1162.8715 | 6133.1406 | 7857.3867 | 64.8395 | 46.268 | 11.567 | 14692.2109 |
+| 16500 | 0.8889 | 1153.1372 | 6122.7734 | 7785.0986 | 65.0504 | 46.118 | 11.53 | 15377.7285 |
+| 17000 | 0.9158 | 1137.9264 | 6099.0776 | 7778.0054 | 64.9115 | 46.217 | 11.554 | 14169.0732 |
+| 17500 | 0.9428 | 1129.4969 | 5922.5732 | 7767.8188 | 64.6909 | 46.374 | 11.594 | 13724.9141 |
+| 18000 | 0.9697 | 1111.5293 | 5840.4692 | 7714.6880 | 65.0093 | 46.147 | 11.537 | 12743.6494 |
+| 18500 | 0.9966 | 1116.8070 | 5738.8276 | 7720.9282 | 64.6499 | 46.404 | 11.601 | 12507.6094 |
+| 18562 | 1.0000 | 1113.0409 | 5667.2534 | 7727.6904 | 64.516 | 46.5 | 11.625 | 12154.4287 |
 
 ### Framework versions
 - Distily 0.2.0
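For context on the model card above: the distilled student is a GPT-2-sized causal language model saved as a standard `transformers` checkpoint (the `model.safetensors` updated below), so it should load like any other Hub model. A minimal sketch, assuming the repository id `lapp0/distily_student` (a hypothetical placeholder; substitute this repository's actual id) and that a tokenizer is included in the repo:

```python
# Minimal sketch: load the distilled student and generate a short sample.
# NOTE: "lapp0/distily_student" is a hypothetical placeholder repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "lapp0/distily_student"  # replace with this repository's actual id
tokenizer = AutoTokenizer.from_pretrained(repo_id)  # falls back to the gpt2 tokenizer if none is bundled
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Knowledge distillation compresses", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```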
logs/optim=adalomo/events.out.tfevents.1723326007.93d6cbb3ad53
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:c6f11e3a120d4788b2845f40375f1d8a41c07875ddc539a46ff0f036f4ebca56
+size 386346
logs/optim=paged_adamw_8bit/events.out.tfevents.1723328775.93d6cbb3ad53
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ee820b62fa9c2e38d84e6e2a3327d3bf76e89cf3fd6a40591075c840583a84df
+size 3949260
logs/optim=paged_adamw_8bit/events.out.tfevents.1723341311.93d6cbb3ad53
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:4f0f3cc04d5be16c692bff885beb5e9a9837a23f0c45f371ff9da853809fc0ef
+size 253
logs/optim=paged_lion_32bit/events.out.tfevents.1723313305.93d6cbb3ad53
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:f7741b5e740821252005fb316e1bfcd2d2898780bf4502db2420448d848adb9f
+size 3949220
logs/optim=paged_lion_32bit/events.out.tfevents.1723325781.93d6cbb3ad53
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:d138024348a62821b3d4427eb58877e313bd0de0c9e7efe7cd7cd3416e707bf8
+size 529
logs/optim=paged_lion_8bit/events.out.tfevents.1723300341.93d6cbb3ad53
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:9da38980a7c9a5fae9261533c4a631ea1c1b32a1b8c1c20aed6f261299bc76e3
+size 6007
logs/optim=paged_lion_8bit/events.out.tfevents.1723300593.93d6cbb3ad53
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:34f307783a45b61c5feb93b6f41334f33fb71319d192cbe5c09e73597e9eaf16
+size 3949218
logs/optim=paged_lion_8bit/events.out.tfevents.1723313087.93d6cbb3ad53
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:187019067cc901836fd98ff2d88b17f63fcccf65da38e007e94d66a01c6ce2d2
+size 529
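The `logs/optim=*` directories added above are standard TensorBoard event files, one sweep per optimizer setting (adalomo, paged_adamw_8bit, paged_lion_8bit, paged_lion_32bit). A minimal sketch for inspecting one of them locally, assuming the `tensorboard` package is installed and the repository has been cloned so the paths exist; the scalar tag names depend on the trainer and are not guaranteed here:

```python
# Minimal sketch: list the scalar metrics stored in one of the added
# TensorBoard event files (directory names taken from this commit).
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

log_dir = "logs/optim=adalomo"  # each optimizer sweep has its own subdirectory
acc = EventAccumulator(log_dir)
acc.Reload()  # parse the events.out.tfevents.* file(s) under log_dir

for tag in acc.Tags()["scalars"]:
    events = acc.Scalars(tag)  # list of (wall_time, step, value) records
    print(tag, "->", len(events), "points, last value:", events[-1].value)
```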
model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:4116d18319ae57e567bf43fd6960b662bdffd40803c25e5b2577f84c8f22de73
 size 248894656
training_args.bin
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size
+oid sha256:d9dc819218a49c4b66c04c2864cc57a5e435ef94ee3bed56d52d4013c94c43e3
+size 907106692
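The `model.safetensors` and `training_args.bin` entries above are Git LFS pointer files: the binaries themselves are stored separately and are identified by the `oid sha256:` digest and `size` fields. A minimal sketch for verifying a downloaded copy against its pointer, assuming the file has been pulled to the working directory (the helper name `verify_lfs_object` is illustrative, not part of any library):

```python
# Minimal sketch: check a downloaded file against the oid/size recorded in its
# Git LFS pointer ("version / oid sha256:... / size ..." entries above).
import hashlib
import os

def verify_lfs_object(path: str, expected_sha256: str, expected_size: int) -> bool:
    """Return True if the file's byte size and SHA-256 digest match the pointer."""
    if os.path.getsize(path) != expected_size:
        return False
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # read in 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256

# Values copied from the model.safetensors pointer in this commit.
print(verify_lfs_object(
    "model.safetensors",
    "4116d18319ae57e567bf43fd6960b662bdffd40803c25e5b2577f84c8f22de73",
    248894656,
))
```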