DD0101 committed
Commit b5ea02a
Parent: f90c19a

End of training

README.md ADDED
@@ -0,0 +1,134 @@
+ ---
+ base_model: vinai/phobert-base
+ tags:
+ - generated_from_trainer
+ metrics:
+ - precision
+ - recall
+ - f1
+ - accuracy
+ model-index:
+ - name: disfluency-large-3
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # disfluency-large-3
+
+ This model is a fine-tuned version of [vinai/phobert-base](https://huggingface.co/vinai/phobert-base) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 38.9748
+ - Precision: 0.9892
+ - Recall: 0.9868
+ - F1: 0.9880
+ - Accuracy: 0.9956
+
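+ As a usage illustration (an addition, not part of the auto-generated card): a minimal token-classification sketch, assuming the checkpoint is published as `DD0101/disfluency-large-3` (inferred from the committer and model name, so treat the repo id as a guess) and that input text is word-segmented Vietnamese, as PhoBERT expects.
+
+ ```python
+ from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline
+
+ model_id = "DD0101/disfluency-large-3"  # hypothetical repo id
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ model = AutoModelForTokenClassification.from_pretrained(model_id)
+
+ # Merge sub-word predictions back into word-level spans.
+ tagger = pipeline("token-classification", model=model, tokenizer=tokenizer,
+                   aggregation_strategy="simple")
+ print(tagger("tôi muốn đặt vé đi à không đi Hà_Nội"))  # illustrative input
+ ```
+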
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 5e-05
+ - train_batch_size: 32
+ - eval_batch_size: 32
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+
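+ As a sketch, these values map onto `transformers.TrainingArguments` roughly as below; `output_dir` and the per-epoch evaluation cadence are assumptions rather than values recorded in this card:
+
+ ```python
+ from transformers import TrainingArguments
+
+ args = TrainingArguments(
+     output_dir="disfluency-large-3",  # placeholder
+     learning_rate=5e-05,
+     per_device_train_batch_size=32,
+     per_device_eval_batch_size=32,
+     seed=42,
+     lr_scheduler_type="linear",
+     num_train_epochs=100,
+     evaluation_strategy="epoch",  # assumed: the table below reports metrics once per epoch
+ )
+ # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default optimizer.
+ ```
+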
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
+ | No log        | 1.0   | 140  | 102.2763        | 0.7799    | 0.8666 | 0.8210 | 0.9512   |
+ | No log        | 2.0   | 280  | 45.4414         | 0.9380    | 0.9543 | 0.9461 | 0.9831   |
+ | No log        | 3.0   | 420  | 28.8766         | 0.9627    | 0.9621 | 0.9624 | 0.9894   |
+ | 144.9683      | 4.0   | 560  | 23.9895         | 0.9705    | 0.9688 | 0.9696 | 0.9913   |
+ | 144.9683      | 5.0   | 700  | 24.3438         | 0.9700    | 0.9724 | 0.9712 | 0.9917   |
+ | 144.9683      | 6.0   | 840  | 35.6997         | 0.9622    | 0.9639 | 0.9631 | 0.9879   |
+ | 144.9683      | 7.0   | 980  | 22.0471         | 0.9783    | 0.9760 | 0.9771 | 0.9937   |
+ | 25.8899       | 8.0   | 1120 | 27.7609         | 0.9712    | 0.9724 | 0.9718 | 0.9916   |
+ | 25.8899       | 9.0   | 1260 | 26.6561         | 0.9783    | 0.9760 | 0.9771 | 0.9930   |
+ | 25.8899       | 10.0  | 1400 | 24.3437         | 0.9819    | 0.9808 | 0.9814 | 0.9942   |
+ | 16.1779       | 11.0  | 1540 | 28.9594         | 0.9725    | 0.9778 | 0.9751 | 0.9928   |
+ | 16.1779       | 12.0  | 1680 | 27.7449         | 0.9790    | 0.9784 | 0.9787 | 0.9938   |
+ | 16.1779       | 13.0  | 1820 | 30.4554         | 0.9766    | 0.9790 | 0.9778 | 0.9923   |
+ | 16.1779       | 14.0  | 1960 | 24.9683         | 0.9856    | 0.9844 | 0.9850 | 0.9950   |
+ | 11.2418       | 15.0  | 2100 | 26.0186         | 0.9832    | 0.9838 | 0.9835 | 0.9946   |
+ | 11.2418       | 16.0  | 2240 | 25.6512         | 0.9826    | 0.9832 | 0.9829 | 0.9946   |
+ | 11.2418       | 17.0  | 2380 | 27.0076         | 0.9808    | 0.9826 | 0.9817 | 0.9941   |
+ | 8.4914        | 18.0  | 2520 | 35.0380         | 0.9789    | 0.9778 | 0.9784 | 0.9940   |
+ | 8.4914        | 19.0  | 2660 | 37.8171         | 0.9778    | 0.9784 | 0.9781 | 0.9928   |
+ | 8.4914        | 20.0  | 2800 | 34.0740         | 0.9843    | 0.9826 | 0.9835 | 0.9945   |
+ | 8.4914        | 21.0  | 2940 | 35.1558         | 0.9837    | 0.9820 | 0.9829 | 0.9948   |
+ | 8.5438        | 22.0  | 3080 | 35.5458         | 0.9850    | 0.9838 | 0.9844 | 0.9949   |
+ | 8.5438        | 23.0  | 3220 | 35.5941         | 0.9868    | 0.9850 | 0.9859 | 0.9952   |
+ | 8.5438        | 24.0  | 3360 | 38.6942         | 0.9843    | 0.9820 | 0.9832 | 0.9951   |
+ | 6.4481        | 25.0  | 3500 | 39.7245         | 0.9843    | 0.9826 | 0.9835 | 0.9945   |
+ | 6.4481        | 26.0  | 3640 | 51.0287         | 0.9789    | 0.9772 | 0.9780 | 0.9934   |
+ | 6.4481        | 27.0  | 3780 | 42.5358         | 0.9814    | 0.9808 | 0.9811 | 0.9944   |
+ | 6.4481        | 28.0  | 3920 | 45.3493         | 0.9850    | 0.9838 | 0.9844 | 0.9946   |
+ | 5.809         | 29.0  | 4060 | 45.2262         | 0.9861    | 0.9838 | 0.9850 | 0.9951   |
+ | 5.809         | 30.0  | 4200 | 48.4879         | 0.9802    | 0.9796 | 0.9799 | 0.9939   |
+ | 5.809         | 31.0  | 4340 | 42.5276         | 0.9844    | 0.9850 | 0.9847 | 0.9950   |
+ | 5.809         | 32.0  | 4480 | 42.3311         | 0.9862    | 0.9844 | 0.9853 | 0.9948   |
+ | 5.2809        | 33.0  | 4620 | 40.3374         | 0.9819    | 0.9802 | 0.9811 | 0.9947   |
+ | 5.2809        | 34.0  | 4760 | 39.5919         | 0.9849    | 0.9832 | 0.9841 | 0.9951   |
+ | 5.2809        | 35.0  | 4900 | 41.3088         | 0.9879    | 0.9838 | 0.9858 | 0.9952   |
+ | 4.165         | 36.0  | 5040 | 45.8545         | 0.9843    | 0.9826 | 0.9835 | 0.9949   |
+ | 4.165         | 37.0  | 5180 | 46.9784         | 0.9843    | 0.9826 | 0.9835 | 0.9942   |
+ | 4.165         | 38.0  | 5320 | 41.9215         | 0.9856    | 0.9850 | 0.9853 | 0.9947   |
+ | 4.165         | 39.0  | 5460 | 45.2609         | 0.9855    | 0.9826 | 0.9841 | 0.9948   |
+ | 3.6327        | 40.0  | 5600 | 43.3053         | 0.9880    | 0.9856 | 0.9868 | 0.9946   |
+ | 3.6327        | 41.0  | 5740 | 46.4860         | 0.9843    | 0.9820 | 0.9832 | 0.9949   |
+ | 3.6327        | 42.0  | 5880 | 47.4994         | 0.9838    | 0.9832 | 0.9835 | 0.9946   |
+ | 2.8287        | 43.0  | 6020 | 49.2580         | 0.9861    | 0.9838 | 0.9850 | 0.9948   |
+ | 2.8287        | 44.0  | 6160 | 43.4413         | 0.9849    | 0.9820 | 0.9834 | 0.9951   |
+ | 2.8287        | 45.0  | 6300 | 38.9748         | 0.9892    | 0.9868 | 0.9880 | 0.9956   |
+ | 2.8287        | 46.0  | 6440 | 39.2511         | 0.9885    | 0.9856 | 0.9871 | 0.9955   |
+ | 3.1047        | 47.0  | 6580 | 44.7982         | 0.9843    | 0.9826 | 0.9835 | 0.9943   |
+ | 3.1047        | 48.0  | 6720 | 44.1594         | 0.9850    | 0.9838 | 0.9844 | 0.9949   |
+ | 3.1047        | 49.0  | 6860 | 40.8717         | 0.9892    | 0.9862 | 0.9877 | 0.9955   |
+ | 2.7354        | 50.0  | 7000 | 57.3700         | 0.9849    | 0.9820 | 0.9834 | 0.9938   |
+ | 2.7354        | 51.0  | 7140 | 51.8525         | 0.9880    | 0.9856 | 0.9868 | 0.9945   |
+ | 2.7354        | 52.0  | 7280 | 45.2376         | 0.9879    | 0.9844 | 0.9862 | 0.9951   |
+ | 2.7354        | 53.0  | 7420 | 43.6209         | 0.9873    | 0.9850 | 0.9862 | 0.9955   |
+ | 1.7042        | 54.0  | 7560 | 43.9714         | 0.9880    | 0.9862 | 0.9871 | 0.9954   |
+ | 1.7042        | 55.0  | 7700 | 52.5784         | 0.9831    | 0.9802 | 0.9816 | 0.9943   |
+ | 1.7042        | 56.0  | 7840 | 50.2582         | 0.9849    | 0.9832 | 0.9841 | 0.9950   |
+ | 1.7042        | 57.0  | 7980 | 45.0711         | 0.9861    | 0.9838 | 0.9850 | 0.9950   |
+ | 1.6298        | 58.0  | 8120 | 45.4635         | 0.9867    | 0.9844 | 0.9856 | 0.9952   |
+ | 1.6298        | 59.0  | 8260 | 42.1318         | 0.9892    | 0.9868 | 0.9880 | 0.9957   |
+ | 1.6298        | 60.0  | 8400 | 45.4197         | 0.9885    | 0.9856 | 0.9871 | 0.9951   |
+ | 1.5229        | 61.0  | 8540 | 49.6159         | 0.9855    | 0.9826 | 0.9841 | 0.9950   |
+ | 1.5229        | 62.0  | 8680 | 47.5180         | 0.9849    | 0.9820 | 0.9834 | 0.9948   |
+ | 1.5229        | 63.0  | 8820 | 45.6821         | 0.9867    | 0.9832 | 0.9849 | 0.9955   |
+ | 1.5229        | 64.0  | 8960 | 44.1710         | 0.9897    | 0.9862 | 0.9880 | 0.9958   |
+ | 1.6783        | 65.0  | 9100 | 43.8102         | 0.9880    | 0.9856 | 0.9868 | 0.9956   |
+ | 1.6783        | 66.0  | 9240 | 41.8846         | 0.9868    | 0.9850 | 0.9859 | 0.9956   |
+ | 1.6783        | 67.0  | 9380 | 42.1225         | 0.9886    | 0.9862 | 0.9874 | 0.9960   |
+ | 1.9169        | 68.0  | 9520 | 42.4050         | 0.9880    | 0.9862 | 0.9871 | 0.9956   |
+ | 1.9169        | 69.0  | 9660 | 43.9178         | 0.9867    | 0.9844 | 0.9856 | 0.9956   |
+ | 1.9169        | 70.0  | 9800 | 48.1057         | 0.9879    | 0.9838 | 0.9858 | 0.9953   |
+
+
+ ### Framework versions
+
+ - Transformers 4.32.0
+ - Pytorch 2.0.1+cu118
+ - Datasets 2.14.4
+ - Tokenizers 0.13.3
added_tokens.json ADDED
@@ -0,0 +1,3 @@
+ {
+   "<mask>": 64000
+ }
bpe.codes ADDED
The diff for this file is too large to render.
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b0132f3079b3b2f694d1013ad54aa97df32a7d0530a53a00fbe3b4622ad1551f
+ size 540096317
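This is a Git LFS pointer rather than the weights themselves: it records only the blob's SHA-256 and byte size. As a sketch (assuming the real file has been fetched locally, e.g. with `git lfs pull`), the checksum can be verified like so:

```python
import hashlib

# Stream in 1 MiB chunks so the ~540 MB checkpoint never sits in memory at once.
h = hashlib.sha256()
with open("pytorch_model.bin", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):
        h.update(chunk)

expected = "b0132f3079b3b2f694d1013ad54aa97df32a7d0530a53a00fbe3b4622ad1551f"
assert h.hexdigest() == expected, "file does not match the LFS pointer"
```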
special_tokens_map.json ADDED
@@ -0,0 +1,9 @@
+ {
+   "bos_token": "<s>",
+   "cls_token": "<s>",
+   "eos_token": "</s>",
+   "mask_token": "<mask>",
+   "pad_token": "<pad>",
+   "sep_token": "</s>",
+   "unk_token": "<unk>"
+ }
tokenizer_config.json ADDED
@@ -0,0 +1,13 @@
+ {
+   "add_special_tokens": true,
+   "bos_token": "<s>",
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "<s>",
+   "eos_token": "</s>",
+   "mask_token": "<mask>",
+   "model_max_length": 256,
+   "pad_token": "<pad>",
+   "sep_token": "</s>",
+   "tokenizer_class": "PhobertTokenizer",
+   "unk_token": "<unk>"
+ }
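These settings are picked up automatically on load; a quick sanity check (the repo id is the same hypothetical guess as above):

```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("DD0101/disfluency-large-3")
print(type(tok).__name__)    # PhobertTokenizer, per tokenizer_class above
print(tok.model_max_length)  # 256
print(tok.mask_token_id)     # 64000, matching added_tokens.json
```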
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e4098162ddc76d1027cc5861d1a41685c983220c90e1771e384b44774b34999b
+ size 4091
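training_args.bin is the pickled TrainingArguments object that Trainer saves alongside a checkpoint; a minimal sketch for inspecting the exact run configuration (assumes transformers is installed so unpickling can resolve the class):

```python
import torch

args = torch.load("training_args.bin")  # a transformers.TrainingArguments instance
print(args.learning_rate, args.num_train_epochs, args.lr_scheduler_type)
```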
vocab.txt ADDED
The diff for this file is too large to render.