Commit · 106940f
1 Parent(s): 3add897
initial commit

- README.md +17 -0
- config.json +26 -0
- pytorch_model.bin +3 -0
- special_tokens_map.json +1 -0
- tokenizer.json +0 -0
- tokenizer_config.json +1 -0
- training.log +63 -0
- vocab.txt +0 -0
README.md
ADDED
@@ -0,0 +1,17 @@
+---
+language: en
+tags:
+- bert
+- cola
+- glue
+- torchdistill
+license: apache-2.0
+datasets:
+- cola
+metrics:
+- matthews correlation
+---
+
+`bert-base-uncased` fine-tuned on the CoLA dataset, using [***torchdistill***](https://github.com/yoshitomo-matsubara/torchdistill) and [Google Colab](https://colab.research.google.com/github/yoshitomo-matsubara/torchdistill/blob/master/demo/glue_finetuning_and_submission.ipynb).
+The hyperparameters are the same as those in Hugging Face's example and/or the BERT paper, and the training configuration (including hyperparameters) is available [here](https://github.com/yoshitomo-matsubara/torchdistill/blob/main/configs/sample/glue/cola/ce/bert_base_uncased.yaml).
+I submitted prediction files to [the GLUE leaderboard](https://gluebenchmark.com/leaderboard), and the overall GLUE score was **77.9**.
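For context (not part of the committed README), a minimal usage sketch with 🤗 Transformers. The repo id below is an assumption; substitute the actual model repository id or a local path to this checkout.

```python
# Minimal sketch, assuming the model is published under a repo id like the
# hypothetical one below (or downloaded locally). Not part of this commit.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "yoshitomo-matsubara/bert-base-uncased-cola"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# CoLA is a binary acceptability task; the higher-scoring logit is the label.
inputs = tokenizer("The cat sat on the mat.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())
```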
config.json
ADDED
@@ -0,0 +1,26 @@
+{
+  "_name_or_path": "bert-base-uncased",
+  "architectures": [
+    "BertForSequenceClassification"
+  ],
+  "attention_probs_dropout_prob": 0.1,
+  "finetuning_task": "cola",
+  "gradient_checkpointing": false,
+  "hidden_act": "gelu",
+  "hidden_dropout_prob": 0.1,
+  "hidden_size": 768,
+  "initializer_range": 0.02,
+  "intermediate_size": 3072,
+  "layer_norm_eps": 1e-12,
+  "max_position_embeddings": 512,
+  "model_type": "bert",
+  "num_attention_heads": 12,
+  "num_hidden_layers": 12,
+  "pad_token_id": 0,
+  "position_embedding_type": "absolute",
+  "problem_type": "single_label_classification",
+  "transformers_version": "4.6.1",
+  "type_vocab_size": 2,
+  "use_cache": true,
+  "vocab_size": 30522
+}
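As an aside, a committed config like this can be inspected directly with Transformers; a minimal sketch, assuming `config.json` has been saved to the working directory:

```python
# Illustrative only: load and inspect the committed config above.
from transformers import BertConfig

config = BertConfig.from_json_file("config.json")  # assumed local path
# The values mirror bert-base-uncased: 12 layers, 12 heads, 768 hidden units.
print(config.num_hidden_layers, config.num_attention_heads, config.hidden_size)
```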
pytorch_model.bin
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:7d72c48db75cda8acc20864a566a1937f947ad8aa59af2edaf1ef677ac21f5bb
+size 438024457
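This is a Git LFS pointer rather than the weights themselves: `oid` is the SHA-256 of the real file and `size` its byte length. A sketch of verifying a downloaded copy against the pointer; the local path is an assumption:

```python
# Sketch: check a downloaded pytorch_model.bin against the LFS pointer above.
import hashlib
import os

path = "pytorch_model.bin"  # assumed local path to the fetched weights
expected_oid = "7d72c48db75cda8acc20864a566a1937f947ad8aa59af2edaf1ef677ac21f5bb"
expected_size = 438024457

sha256 = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        sha256.update(chunk)

assert os.path.getsize(path) == expected_size, "size mismatch"
assert sha256.hexdigest() == expected_oid, "sha256 mismatch"
print("pytorch_model.bin matches the LFS pointer")
```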
special_tokens_map.json
ADDED
@@ -0,0 +1 @@
+{"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
tokenizer.json
ADDED
The diff for this file is too large to render.
tokenizer_config.json
ADDED
@@ -0,0 +1 @@
+{"do_lower_case": true, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "do_lower": true, "model_max_length": 512, "special_tokens_map_file": null, "name_or_path": "bert-base-uncased"}
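The tokenizer files committed here (`tokenizer.json`, `vocab.txt`, and the two JSON configs above) load together via Transformers; a minimal sketch, assuming they sit in the current directory:

```python
# Illustrative only: load the committed tokenizer from a local checkout.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(".")  # assumed path to the repo root
# do_lower_case=true, so text is lowercased before WordPiece tokenization.
print(tokenizer.tokenize("The cat SAT on the mat."))
```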
training.log
ADDED
@@ -0,0 +1,63 @@
+2021-05-27 20:30:20,874 INFO __main__ Namespace(adjust_lr=False, config='torchdistill/configs/sample/glue/cola/ce/bert_base_uncased.yaml', log='log/glue/cola/ce/bert_base_uncased.txt', private_output='leaderboard/glue/standard/bert_base_uncased/', seed=None, student_only=False, task_name='cola', test_only=False, world_size=1)
+2021-05-27 20:30:20,901 INFO __main__ Distributed environment: NO
+Num processes: 1
+Process index: 0
+Local process index: 0
+Device: cuda
+Use FP16 precision: True
+
+2021-05-27 20:30:25,701 WARNING datasets.builder Reusing dataset glue (/root/.cache/huggingface/datasets/glue/cola/1.0.0/dacbe3125aa31d7f70367a07a8a9e72a5a0bfeb5fc42e75c9db75b96da6053ad)
+2021-05-27 20:30:26,821 INFO __main__ Start training
+2021-05-27 20:30:26,821 INFO torchdistill.models.util [student model]
+2021-05-27 20:30:26,822 INFO torchdistill.models.util Using the original student model
+2021-05-27 20:30:26,822 INFO torchdistill.core.training Loss = 1.0 * OrgLoss
+2021-05-27 20:30:29,248 INFO torchdistill.misc.log Epoch: [0] [ 0/535] eta: 0:00:41 lr: 4.9968847352024925e-05 sample/s: 54.10643773502151 loss: 0.6693 (0.6693) time: 0.0775 data: 0.0035 max mem: 842
+2021-05-27 20:30:33,369 INFO torchdistill.misc.log Epoch: [0] [ 50/535] eta: 0:00:39 lr: 4.841121495327103e-05 sample/s: 48.982427046990644 loss: 0.5547 (0.5976) time: 0.0834 data: 0.0016 max mem: 2142
+2021-05-27 20:30:37,481 INFO torchdistill.misc.log Epoch: [0] [100/535] eta: 0:00:35 lr: 4.685358255451713e-05 sample/s: 49.044858059103305 loss: 0.5561 (0.5635) time: 0.0824 data: 0.0017 max mem: 2142
+2021-05-27 20:30:41,654 INFO torchdistill.misc.log Epoch: [0] [150/535] eta: 0:00:31 lr: 4.529595015576324e-05 sample/s: 49.125278535015624 loss: 0.5053 (0.5476) time: 0.0832 data: 0.0018 max mem: 2142
+2021-05-27 20:30:45,773 INFO torchdistill.misc.log Epoch: [0] [200/535] eta: 0:00:27 lr: 4.373831775700935e-05 sample/s: 44.87794543669635 loss: 0.3340 (0.5200) time: 0.0836 data: 0.0017 max mem: 2142
+2021-05-27 20:30:49,874 INFO torchdistill.misc.log Epoch: [0] [250/535] eta: 0:00:23 lr: 4.218068535825546e-05 sample/s: 44.73637953837621 loss: 0.4407 (0.5087) time: 0.0827 data: 0.0017 max mem: 2142
+2021-05-27 20:30:53,937 INFO torchdistill.misc.log Epoch: [0] [300/535] eta: 0:00:19 lr: 4.0623052959501565e-05 sample/s: 48.89948002891319 loss: 0.4557 (0.4978) time: 0.0814 data: 0.0016 max mem: 2142
+2021-05-27 20:30:58,107 INFO torchdistill.misc.log Epoch: [0] [350/535] eta: 0:00:15 lr: 3.9065420560747665e-05 sample/s: 58.53960278580301 loss: 0.4266 (0.4919) time: 0.0810 data: 0.0017 max mem: 2298
+2021-05-27 20:31:02,230 INFO torchdistill.misc.log Epoch: [0] [400/535] eta: 0:00:11 lr: 3.750778816199377e-05 sample/s: 49.29256081795746 loss: 0.4043 (0.4870) time: 0.0819 data: 0.0017 max mem: 2298
+2021-05-27 20:31:06,351 INFO torchdistill.misc.log Epoch: [0] [450/535] eta: 0:00:07 lr: 3.595015576323988e-05 sample/s: 49.25204321277595 loss: 0.3922 (0.4763) time: 0.0816 data: 0.0016 max mem: 2298
+2021-05-27 20:31:10,480 INFO torchdistill.misc.log Epoch: [0] [500/535] eta: 0:00:02 lr: 3.4392523364485985e-05 sample/s: 49.238744819975814 loss: 0.4247 (0.4715) time: 0.0816 data: 0.0017 max mem: 2298
+2021-05-27 20:31:13,221 INFO torchdistill.misc.log Epoch: [0] Total time: 0:00:44
+2021-05-27 20:31:14,429 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/cola/default_experiment-1-0.arrow
+2021-05-27 20:31:14,429 INFO __main__ Validation: matthews_correlation = 0.5943825670451766
+2021-05-27 20:31:14,429 INFO __main__ Updating ckpt at ./resource/ckpt/glue/cola/ce/cola-bert-base-uncased
+2021-05-27 20:31:15,521 INFO torchdistill.misc.log Epoch: [1] [ 0/535] eta: 0:00:39 lr: 3.330218068535826e-05 sample/s: 56.5559720611634 loss: 0.3736 (0.3736) time: 0.0732 data: 0.0025 max mem: 2298
+2021-05-27 20:31:19,586 INFO torchdistill.misc.log Epoch: [1] [ 50/535] eta: 0:00:39 lr: 3.1744548286604364e-05 sample/s: 56.025459414874256 loss: 0.2157 (0.2235) time: 0.0829 data: 0.0016 max mem: 2298
+2021-05-27 20:31:23,717 INFO torchdistill.misc.log Epoch: [1] [100/535] eta: 0:00:35 lr: 3.018691588785047e-05 sample/s: 49.288650859169124 loss: 0.1558 (0.2244) time: 0.0834 data: 0.0016 max mem: 2298
+2021-05-27 20:31:27,825 INFO torchdistill.misc.log Epoch: [1] [150/535] eta: 0:00:31 lr: 2.8629283489096577e-05 sample/s: 49.013336293708136 loss: 0.1865 (0.2305) time: 0.0832 data: 0.0016 max mem: 2298
+2021-05-27 20:31:31,934 INFO torchdistill.misc.log Epoch: [1] [200/535] eta: 0:00:27 lr: 2.707165109034268e-05 sample/s: 58.04642408599769 loss: 0.2078 (0.2373) time: 0.0803 data: 0.0016 max mem: 2298
+2021-05-27 20:31:36,015 INFO torchdistill.misc.log Epoch: [1] [250/535] eta: 0:00:23 lr: 2.5514018691588787e-05 sample/s: 49.34460780997703 loss: 0.1068 (0.2396) time: 0.0819 data: 0.0016 max mem: 2298
+2021-05-27 20:31:40,185 INFO torchdistill.misc.log Epoch: [1] [300/535] eta: 0:00:19 lr: 2.3956386292834894e-05 sample/s: 49.097532424966055 loss: 0.1558 (0.2320) time: 0.0838 data: 0.0016 max mem: 2302
+2021-05-27 20:31:44,209 INFO torchdistill.misc.log Epoch: [1] [350/535] eta: 0:00:15 lr: 2.2398753894080997e-05 sample/s: 49.345623746301406 loss: 0.2053 (0.2402) time: 0.0821 data: 0.0016 max mem: 2302
+2021-05-27 20:31:48,346 INFO torchdistill.misc.log Epoch: [1] [400/535] eta: 0:00:11 lr: 2.0841121495327104e-05 sample/s: 48.23575615912966 loss: 0.2332 (0.2399) time: 0.0819 data: 0.0017 max mem: 2302
+2021-05-27 20:31:52,463 INFO torchdistill.misc.log Epoch: [1] [450/535] eta: 0:00:06 lr: 1.928348909657321e-05 sample/s: 44.758697780635316 loss: 0.1631 (0.2416) time: 0.0818 data: 0.0017 max mem: 2302
+2021-05-27 20:31:56,594 INFO torchdistill.misc.log Epoch: [1] [500/535] eta: 0:00:02 lr: 1.7725856697819314e-05 sample/s: 44.85514837005489 loss: 0.2411 (0.2436) time: 0.0816 data: 0.0017 max mem: 2302
+2021-05-27 20:31:59,372 INFO torchdistill.misc.log Epoch: [1] Total time: 0:00:43
+2021-05-27 20:32:00,587 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/cola/default_experiment-1-0.arrow
+2021-05-27 20:32:00,587 INFO __main__ Validation: matthews_correlation = 0.5998094084751417
+2021-05-27 20:32:00,588 INFO __main__ Updating ckpt at ./resource/ckpt/glue/cola/ce/cola-bert-base-uncased
+2021-05-27 20:32:01,635 INFO torchdistill.misc.log Epoch: [2] [ 0/535] eta: 0:00:45 lr: 1.663551401869159e-05 sample/s: 48.80772442042119 loss: 0.0228 (0.0228) time: 0.0846 data: 0.0026 max mem: 2302
+2021-05-27 20:32:05,729 INFO torchdistill.misc.log Epoch: [2] [ 50/535] eta: 0:00:39 lr: 1.5077881619937695e-05 sample/s: 49.13031357252463 loss: 0.0390 (0.1286) time: 0.0790 data: 0.0016 max mem: 2302
+2021-05-27 20:32:09,836 INFO torchdistill.misc.log Epoch: [2] [100/535] eta: 0:00:35 lr: 1.3520249221183801e-05 sample/s: 49.299947988445794 loss: 0.0097 (0.1387) time: 0.0811 data: 0.0016 max mem: 2302
+2021-05-27 20:32:14,007 INFO torchdistill.misc.log Epoch: [2] [150/535] eta: 0:00:31 lr: 1.1962616822429908e-05 sample/s: 49.242935929533935 loss: 0.0422 (0.1355) time: 0.0848 data: 0.0017 max mem: 2302
+2021-05-27 20:32:18,024 INFO torchdistill.misc.log Epoch: [2] [200/535] eta: 0:00:27 lr: 1.0404984423676013e-05 sample/s: 49.29676109434639 loss: 0.0012 (0.1284) time: 0.0793 data: 0.0016 max mem: 2302
+2021-05-27 20:32:22,137 INFO torchdistill.misc.log Epoch: [2] [250/535] eta: 0:00:23 lr: 8.84735202492212e-06 sample/s: 49.21852190851698 loss: 0.0336 (0.1363) time: 0.0818 data: 0.0018 max mem: 2302
+2021-05-27 20:32:26,212 INFO torchdistill.misc.log Epoch: [2] [300/535] eta: 0:00:19 lr: 7.289719626168225e-06 sample/s: 58.12827762071068 loss: 0.0005 (0.1429) time: 0.0802 data: 0.0016 max mem: 2302
+2021-05-27 20:32:30,347 INFO torchdistill.misc.log Epoch: [2] [350/535] eta: 0:00:15 lr: 5.7320872274143305e-06 sample/s: 49.27055727845127 loss: 0.0002 (0.1483) time: 0.0836 data: 0.0017 max mem: 2302
+2021-05-27 20:32:34,500 INFO torchdistill.misc.log Epoch: [2] [400/535] eta: 0:00:11 lr: 4.174454828660436e-06 sample/s: 49.26288960991294 loss: 0.0004 (0.1554) time: 0.0837 data: 0.0017 max mem: 2302
+2021-05-27 20:32:38,676 INFO torchdistill.misc.log Epoch: [2] [450/535] eta: 0:00:06 lr: 2.6168224299065425e-06 sample/s: 58.12908322361583 loss: 0.0001 (0.1709) time: 0.0835 data: 0.0016 max mem: 2302
+2021-05-27 20:32:42,778 INFO torchdistill.misc.log Epoch: [2] [500/535] eta: 0:00:02 lr: 1.059190031152648e-06 sample/s: 48.95069980772429 loss: 0.0002 (0.1783) time: 0.0830 data: 0.0016 max mem: 2302
+2021-05-27 20:32:45,536 INFO torchdistill.misc.log Epoch: [2] Total time: 0:00:43
+2021-05-27 20:32:46,745 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/cola/default_experiment-1-0.arrow
+2021-05-27 20:32:46,745 INFO __main__ Validation: matthews_correlation = 0.6104966084654571
+2021-05-27 20:32:46,746 INFO __main__ Updating ckpt at ./resource/ckpt/glue/cola/ce/cola-bert-base-uncased
+2021-05-27 20:32:51,744 INFO __main__ [Student: bert-base-uncased]
+2021-05-27 20:32:52,965 INFO /usr/local/lib/python3.7/dist-packages/datasets/metric.py Removing /root/.cache/huggingface/metrics/glue/cola/default_experiment-1-0.arrow
+2021-05-27 20:32:52,965 INFO __main__ Test: matthews_correlation = 0.6104966084654571
+2021-05-27 20:32:52,966 INFO __main__ Start prediction for private dataset(s)
+2021-05-27 20:32:52,966 INFO __main__ cola/test: 1063 samples
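The validation and test metric in this log is the Matthews correlation coefficient, CoLA's official metric (it ranges from -1 to 1, with 0 at chance level). A toy sketch of computing it with scikit-learn, not the code that produced this log:

```python
# Illustrative only: Matthews correlation coefficient on toy labels.
from sklearn.metrics import matthews_corrcoef

y_true = [1, 1, 0, 1, 0, 0, 1, 0]  # toy gold labels, not CoLA data
y_pred = [1, 0, 0, 1, 0, 1, 1, 0]  # toy predictions
print(matthews_corrcoef(y_true, y_pred))  # 1.0 = perfect, 0.0 = chance-level
```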
vocab.txt
ADDED
The diff for this file is too large to render.