lapp0 committed (verified)
Commit 6c37e0a · 1 Parent(s): 9b26317

End of training

README.md CHANGED
@@ -16,13 +16,13 @@ This student model is distilled from the teacher model [gpt2](https://huggingfac
  The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
  It achieves the following results on the evaluation set:
- - eval_enwikippl: 603.2673
- - eval_frwikippl: 3866.3679
- - eval_zhwikippl: 9060.9883
- - eval_loss: 6355.0508
- - eval_runtime: 64.6366
- - eval_samples_per_second: 46.413
- - eval_steps_per_second: 11.603
+ - eval_enwikippl: 1375.5275
+ - eval_frwikippl: 6766.5586
+ - eval_zhwikippl: 17695.9277
+ - eval_loss: 8329.1631
+ - eval_runtime: 64.7405
+ - eval_samples_per_second: 46.339
+ - eval_steps_per_second: 11.585
 
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment.
@@ -59,51 +59,51 @@ The following hyperparameters were used during training:
  - num_epochs: 1.0
 
  ### Resource Usage
- Peak GPU Memory: 8.3344 GB
+ Peak GPU Memory: 8.3354 GB
 
  ### Eval-Phase Metrics
  | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl |
  | --- | --- | --- | --- | --- | --- | --- | --- | --- |
  | **teacher eval** | | 30.2385 | 57.2728 | | | | | 18.1772 |
- | 0 | 0 | 55332.9297 | 57511.9648 | 333834.9375 | 63.8516 | 46.984 | 11.746 | 57797.4375 |
- | 500 | 0.0269 | 2446.0188 | 10865.3799 | 11817.8984 | 64.1124 | 46.793 | 11.698 | 39870.7812 |
- | 1000 | 0.0539 | 1804.3785 | 6767.0361 | 9836.2031 | 64.095 | 46.806 | 11.701 | 19923.8262 |
- | 1500 | 0.0808 | 1456.1499 | 5625.2583 | 9170.7520 | 65.271 | 45.962 | 11.491 | 18979.7988 |
- | 2000 | 0.1077 | 1255.2349 | 5859.0298 | 8742.0908 | 64.4753 | 46.529 | 11.632 | 17829.9570 |
- | 2500 | 0.1347 | 1123.1558 | 5142.7266 | 8474.3467 | 64.6172 | 46.427 | 11.607 | 18204.0723 |
- | 3000 | 0.1616 | 1041.0769 | 5179.4790 | 8192.3838 | 64.0965 | 46.804 | 11.701 | 16922.875 |
- | 3500 | 0.1886 | 948.0062 | 4929.7056 | 7911.0400 | 64.5488 | 46.476 | 11.619 | 22088.8789 |
- | 4000 | 0.2155 | 899.1066 | 4752.2407 | 7641.6426 | 65.5993 | 45.732 | 11.433 | 16942.1074 |
- | 4500 | 0.2424 | 843.3125 | 4732.0117 | 7480.7788 | 64.5158 | 46.5 | 11.625 | 13217.6758 |
- | 5000 | 0.2694 | 796.5746 | 4456.2817 | 7343.6479 | 65.0161 | 46.142 | 11.536 | 12772.6074 |
- | 5500 | 0.2963 | 772.2271 | 4386.7627 | 7222.3145 | 65.0008 | 46.153 | 11.538 | 11082.3330 |
- | 6000 | 0.3232 | 723.9974 | 4267.7817 | 7016.9600 | 64.7743 | 46.315 | 11.579 | 9581.7812 |
- | 6500 | 0.3502 | 696.7773 | 4287.5391 | 6892.1387 | 64.6727 | 46.387 | 11.597 | 8422.7246 |
- | 7000 | 0.3771 | 679.4652 | 4046.8250 | 6773.9629 | 64.6977 | 46.369 | 11.592 | 7275.9604 |
- | 7500 | 0.4040 | 667.8522 | 4138.4370 | 6713.6533 | 65.028 | 46.134 | 11.533 | 8175.5986 |
- | 8000 | 0.4310 | 647.4772 | 3977.0999 | 6626.9331 | 64.3886 | 46.592 | 11.648 | 5914.0166 |
- | 8500 | 0.4579 | 627.8210 | 3850.3174 | 6548.4160 | 64.3532 | 46.618 | 11.654 | 7728.6548 |
- | 9000 | 0.4848 | 608.0646 | 3773.8511 | 6449.4614 | 64.3549 | 46.616 | 11.654 | 7419.7065 |
- | 9500 | 0.5118 | 603.2673 | 3866.3679 | 6355.0508 | 64.6366 | 46.413 | 11.603 | 9060.9883 |
- | 10000 | 0.5387 | 588.2559 | 3563.7371 | 6282.0479 | 65.1489 | 46.048 | 11.512 | 7187.1206 |
- | 10500 | 0.5657 | 569.4130 | 3654.1926 | 6309.9839 | 64.8852 | 46.235 | 11.559 | 7732.7837 |
- | 11000 | 0.5926 | 572.8280 | 3728.8887 | 6206.9868 | 65.1196 | 46.069 | 11.517 | 6973.9194 |
- | 11500 | 0.6195 | 551.1736 | 3640.4358 | 6146.9331 | 65.3439 | 45.911 | 11.478 | 5983.9292 |
- | 12000 | 0.6465 | 544.3150 | 3507.0312 | 6073.0454 | 65.3717 | 45.891 | 11.473 | 5726.3408 |
- | 12500 | 0.6734 | 538.8688 | 3312.2402 | 6032.6079 | 65.1968 | 46.015 | 11.504 | 5642.0854 |
- | 13000 | 0.7003 | 525.2048 | 3317.0325 | 6042.6240 | 65.216 | 46.001 | 11.5 | 11299.7695 |
- | 13500 | 0.7273 | 516.2283 | 3381.7358 | 5946.3682 | 67.4205 | 44.497 | 11.124 | 7501.9004 |
- | 14000 | 0.7542 | 508.5393 | 3201.6807 | 5921.8345 | 65.0932 | 46.088 | 11.522 | 7485.8843 |
- | 14500 | 0.7811 | 499.8382 | 3091.7612 | 5887.8721 | 65.2716 | 45.962 | 11.49 | 5927.4609 |
- | 15000 | 0.8081 | 491.9155 | 3132.6841 | 5930.3252 | 65.4781 | 45.817 | 11.454 | 7431.6040 |
- | 15500 | 0.8350 | 485.9736 | 3050.2964 | 5844.8960 | 65.1349 | 46.058 | 11.515 | 6106.6260 |
- | 16000 | 0.8620 | 483.0016 | 2964.3213 | 5828.2241 | 65.6241 | 45.715 | 11.429 | 5001.1572 |
- | 16500 | 0.8889 | 480.1220 | 2957.3284 | 5789.1626 | 65.5498 | 45.767 | 11.442 | 4932.5088 |
- | 17000 | 0.9158 | 470.7449 | 2851.3689 | 5783.3174 | 65.2632 | 45.968 | 11.492 | 4651.6655 |
- | 17500 | 0.9428 | 471.4951 | 2821.2729 | 5762.3945 | 65.89 | 45.53 | 11.383 | 4335.2710 |
- | 18000 | 0.9697 | 467.4575 | 2898.3936 | 5772.5654 | 65.0307 | 46.132 | 11.533 | 3703.9866 |
- | 18500 | 0.9966 | 465.3025 | 2792.5769 | 5640.9438 | 65.1725 | 46.032 | 11.508 | 4174.7715 |
- | 18562 | 1.0000 | 459.9143 | 2775.4995 | 5699.1255 | 65.9141 | 45.514 | 11.378 | 4052.5564 |
+ | 0 | 0 | 59938.1289 | 59232.2031 | 331532.9688 | 64.3677 | 46.607 | 11.652 | 56882.4922 |
+ | 500 | 0.0269 | 3156.7812 | 13407.7314 | 11482.2617 | 64.7552 | 46.328 | 11.582 | 46374.0977 |
+ | 1000 | 0.0539 | 2456.4883 | 10286.8799 | 10423.125 | 66.7762 | 44.926 | 11.232 | 37178.7227 |
+ | 1500 | 0.0808 | 2211.4065 | 9323.9248 | 9931.5410 | 68.5121 | 43.788 | 10.947 | 30078.5703 |
+ | 2000 | 0.1077 | 2036.7498 | 8931.4150 | 9687.9463 | 65.8573 | 45.553 | 11.388 | 28182.8398 |
+ | 2500 | 0.1347 | 1920.9956 | 8044.7114 | 9415.5947 | 65.1775 | 46.028 | 11.507 | 24796.8789 |
+ | 3000 | 0.1616 | 1873.2666 | 8170.4624 | 9294.6562 | 65.2306 | 45.991 | 11.498 | 24111.1641 |
+ | 3500 | 0.1886 | 1794.6318 | 8009.0542 | 9103.3604 | 64.6784 | 46.383 | 11.596 | 24227.3574 |
+ | 4000 | 0.2155 | 1767.7296 | 7658.9282 | 9101.5039 | 65.4052 | 45.868 | 11.467 | 22171.6113 |
+ | 4500 | 0.2424 | 1662.8684 | 7530.4146 | 8929.0029 | 64.9694 | 46.176 | 11.544 | 22204.2188 |
+ | 5000 | 0.2694 | 1649.6533 | 7726.7241 | 8838.6562 | 65.1184 | 46.07 | 11.517 | 22987.1426 |
+ | 5500 | 0.2963 | 1599.3251 | 7247.0552 | 8786.3145 | 64.8678 | 46.248 | 11.562 | 20469.9453 |
+ | 6000 | 0.3232 | 1576.4832 | 7656.2266 | 8733.5889 | 63.9508 | 46.911 | 11.728 | 22970.2676 |
+ | 6500 | 0.3502 | 1542.9945 | 7010.3413 | 8645.2266 | 64.1329 | 46.778 | 11.694 | 19388.3926 |
+ | 7000 | 0.3771 | 1508.8114 | 6926.3296 | 8571.9463 | 64.1586 | 46.759 | 11.69 | 19860.0664 |
+ | 7500 | 0.4040 | 1468.5557 | 6836.5732 | 8549.0029 | 64.4335 | 46.56 | 11.64 | 18730.5410 |
+ | 8000 | 0.4310 | 1446.2615 | 6887.3745 | 8446.8584 | 64.7914 | 46.302 | 11.576 | 18665.6152 |
+ | 8500 | 0.4579 | 1424.2772 | 6938.0576 | 8386.9863 | 64.6538 | 46.401 | 11.6 | 19722.6406 |
+ | 9000 | 0.4848 | 1396.2977 | 6694.8984 | 8320.6289 | 64.4737 | 46.531 | 11.633 | 16747.5215 |
+ | 9500 | 0.5118 | 1375.5275 | 6766.5586 | 8329.1631 | 64.7405 | 46.339 | 11.585 | 17695.9277 |
+ | 10000 | 0.5387 | 1362.8486 | 6724.2305 | 8235.6797 | 64.3591 | 46.613 | 11.653 | 18138.5488 |
+ | 10500 | 0.5657 | 1323.8888 | 6641.3037 | 8259.5732 | 64.5004 | 46.511 | 11.628 | 18308.9023 |
+ | 11000 | 0.5926 | 1313.7504 | 6603.9517 | 8208.4697 | 63.9043 | 46.945 | 11.736 | 16451.6074 |
+ | 11500 | 0.6195 | 1301.4640 | 6654.8970 | 8135.1362 | 63.8702 | 46.97 | 11.743 | 17253.7539 |
+ | 12000 | 0.6465 | 1276.9374 | 6700.5708 | 8126.4321 | 63.8094 | 47.015 | 11.754 | 17780.0176 |
+ | 12500 | 0.6734 | 1261.5624 | 6390.4980 | 8023.9038 | 64.2272 | 46.709 | 11.677 | 16394.5977 |
+ | 13000 | 0.7003 | 1266.7156 | 6481.2490 | 8042.4531 | 63.9608 | 46.904 | 11.726 | 17241.0879 |
+ | 13500 | 0.7273 | 1234.4520 | 6369.3550 | 8034.0479 | 64.8274 | 46.277 | 11.569 | 16078.0498 |
+ | 14000 | 0.7542 | 1225.3788 | 6342.9141 | 7984.6187 | 64.523 | 46.495 | 11.624 | 16475.8027 |
+ | 14500 | 0.7811 | 1191.1580 | 6149.5986 | 7939.2002 | 64.9092 | 46.218 | 11.555 | 14996.4697 |
+ | 15000 | 0.8081 | 1184.7468 | 6450.7070 | 7919.1147 | 65.2526 | 45.975 | 11.494 | 16544.1367 |
+ | 15500 | 0.8350 | 1166.7611 | 6195.7266 | 7864.8428 | 64.9659 | 46.178 | 11.545 | 14867.8652 |
+ | 16000 | 0.8620 | 1162.8715 | 6133.1406 | 7857.3867 | 64.8395 | 46.268 | 11.567 | 14692.2109 |
+ | 16500 | 0.8889 | 1153.1372 | 6122.7734 | 7785.0986 | 65.0504 | 46.118 | 11.53 | 15377.7285 |
+ | 17000 | 0.9158 | 1137.9264 | 6099.0776 | 7778.0054 | 64.9115 | 46.217 | 11.554 | 14169.0732 |
+ | 17500 | 0.9428 | 1129.4969 | 5922.5732 | 7767.8188 | 64.6909 | 46.374 | 11.594 | 13724.9141 |
+ | 18000 | 0.9697 | 1111.5293 | 5840.4692 | 7714.6880 | 65.0093 | 46.147 | 11.537 | 12743.6494 |
+ | 18500 | 0.9966 | 1116.8070 | 5738.8276 | 7720.9282 | 64.6499 | 46.404 | 11.601 | 12507.6094 |
+ | 18562 | 1.0000 | 1113.0409 | 5667.2534 | 7727.6904 | 64.516 | 46.5 | 11.625 | 12154.4287 |
 
  ### Framework versions
  - Distily 0.2.0
logs/optim=adalomo/events.out.tfevents.1723326007.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c6f11e3a120d4788b2845f40375f1d8a41c07875ddc539a46ff0f036f4ebca56
+ size 386346
logs/optim=paged_adamw_8bit/events.out.tfevents.1723328775.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ee820b62fa9c2e38d84e6e2a3327d3bf76e89cf3fd6a40591075c840583a84df
+ size 3949260
logs/optim=paged_adamw_8bit/events.out.tfevents.1723341311.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4f0f3cc04d5be16c692bff885beb5e9a9837a23f0c45f371ff9da853809fc0ef
+ size 253
logs/optim=paged_lion_32bit/events.out.tfevents.1723313305.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f7741b5e740821252005fb316e1bfcd2d2898780bf4502db2420448d848adb9f
+ size 3949220
logs/optim=paged_lion_32bit/events.out.tfevents.1723325781.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d138024348a62821b3d4427eb58877e313bd0de0c9e7efe7cd7cd3416e707bf8
+ size 529
logs/optim=paged_lion_8bit/events.out.tfevents.1723300341.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9da38980a7c9a5fae9261533c4a631ea1c1b32a1b8c1c20aed6f261299bc76e3
+ size 6007
logs/optim=paged_lion_8bit/events.out.tfevents.1723300593.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:34f307783a45b61c5feb93b6f41334f33fb71319d192cbe5c09e73597e9eaf16
+ size 3949218
logs/optim=paged_lion_8bit/events.out.tfevents.1723313087.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:187019067cc901836fd98ff2d88b17f63fcccf65da38e007e94d66a01c6ce2d2
+ size 529
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:58fd8b145fb283bec7b9b8f4ce6907a9f3180d8b5aa823cab21027cd8f6798e1
+ oid sha256:4116d18319ae57e567bf43fd6960b662bdffd40803c25e5b2577f84c8f22de73
  size 248894656
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:ab2a17e58b5c70f0462e9437f329ea0c36e51554ef5edd62cb2f16ca8250b778
- size 907106628
+ oid sha256:d9dc819218a49c4b66c04c2864cc57a5e435ef94ee3bed56d52d4013c94c43e3
+ size 907106692