Commit c39bb06 (verified) by salbatarni · 1 parent: 61626cc

End of training

Files changed (1):
  1. README.md (+80 -80)

README.md CHANGED
@@ -3,20 +3,20 @@ base_model: aubmindlab/bert-base-arabertv02
 tags:
 - generated_from_trainer
 model-index:
-- name: arabert_cross_relevance_task1_fold5
+- name: arabert_cross_relevance_task1_fold6
   results: []
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# arabert_cross_relevance_task1_fold5
+# arabert_cross_relevance_task1_fold6
 
 This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.1878
-- Qwk: 0.3819
-- Mse: 0.1876
+- Loss: 0.3732
+- Qwk: 0.2161
+- Mse: 0.3733
 
 ## Model description
 
@@ -47,81 +47,81 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
 |:-------------:|:------:|:----:|:---------------:|:------:|:------:|
-| No log | 0.1333 | 2 | 0.4109 | 0.2336 | 0.4106 |
-| No log | 0.2667 | 4 | 0.3205 | 0.4919 | 0.3198 |
-| No log | 0.4 | 6 | 0.3796 | 0.1681 | 0.3786 |
-| No log | 0.5333 | 8 | 0.4000 | 0.1674 | 0.3992 |
-| No log | 0.6667 | 10 | 0.3695 | 0.3199 | 0.3690 |
-| No log | 0.8 | 12 | 0.2845 | 0.3350 | 0.2841 |
-| No log | 0.9333 | 14 | 0.2416 | 0.3864 | 0.2408 |
-| No log | 1.0667 | 16 | 0.3319 | 0.5930 | 0.3309 |
-| No log | 1.2 | 18 | 0.2997 | 0.5123 | 0.2988 |
-| No log | 1.3333 | 20 | 0.2367 | 0.2937 | 0.2361 |
-| No log | 1.4667 | 22 | 0.2587 | 0.3126 | 0.2583 |
-| No log | 1.6 | 24 | 0.2424 | 0.3355 | 0.2423 |
-| No log | 1.7333 | 26 | 0.2050 | 0.3596 | 0.2050 |
-| No log | 1.8667 | 28 | 0.1955 | 0.4152 | 0.1955 |
-| No log | 2.0 | 30 | 0.2132 | 0.4904 | 0.2130 |
-| No log | 2.1333 | 32 | 0.2250 | 0.4135 | 0.2247 |
-| No log | 2.2667 | 34 | 0.2143 | 0.3443 | 0.2140 |
-| No log | 2.4 | 36 | 0.1969 | 0.3501 | 0.1967 |
-| No log | 2.5333 | 38 | 0.1788 | 0.3703 | 0.1787 |
-| No log | 2.6667 | 40 | 0.1794 | 0.3757 | 0.1792 |
-| No log | 2.8 | 42 | 0.1996 | 0.3852 | 0.1993 |
-| No log | 2.9333 | 44 | 0.2126 | 0.3724 | 0.2122 |
-| No log | 3.0667 | 46 | 0.2175 | 0.3534 | 0.2172 |
-| No log | 3.2 | 48 | 0.1984 | 0.3771 | 0.1982 |
-| No log | 3.3333 | 50 | 0.1905 | 0.3695 | 0.1903 |
-| No log | 3.4667 | 52 | 0.1879 | 0.3695 | 0.1877 |
-| No log | 3.6 | 54 | 0.1891 | 0.3766 | 0.1889 |
-| No log | 3.7333 | 56 | 0.1833 | 0.4205 | 0.1831 |
-| No log | 3.8667 | 58 | 0.1756 | 0.3942 | 0.1755 |
-| No log | 4.0 | 60 | 0.1782 | 0.3642 | 0.1782 |
-| No log | 4.1333 | 62 | 0.1748 | 0.3617 | 0.1748 |
-| No log | 4.2667 | 64 | 0.1775 | 0.3736 | 0.1774 |
-| No log | 4.4 | 66 | 0.1912 | 0.3958 | 0.1910 |
-| No log | 4.5333 | 68 | 0.2067 | 0.3725 | 0.2064 |
-| No log | 4.6667 | 70 | 0.1994 | 0.3983 | 0.1991 |
-| No log | 4.8 | 72 | 0.1917 | 0.4319 | 0.1914 |
-| No log | 4.9333 | 74 | 0.1976 | 0.4129 | 0.1976 |
-| No log | 5.0667 | 76 | 0.2040 | 0.4133 | 0.2040 |
-| No log | 5.2 | 78 | 0.1996 | 0.3886 | 0.1996 |
-| No log | 5.3333 | 80 | 0.1919 | 0.3944 | 0.1917 |
-| No log | 5.4667 | 82 | 0.2038 | 0.3686 | 0.2035 |
-| No log | 5.6 | 84 | 0.2110 | 0.3572 | 0.2107 |
-| No log | 5.7333 | 86 | 0.2060 | 0.3677 | 0.2058 |
-| No log | 5.8667 | 88 | 0.1924 | 0.3835 | 0.1922 |
-| No log | 6.0 | 90 | 0.1834 | 0.4065 | 0.1833 |
-| No log | 6.1333 | 92 | 0.1793 | 0.4025 | 0.1792 |
-| No log | 6.2667 | 94 | 0.1797 | 0.4067 | 0.1796 |
-| No log | 6.4 | 96 | 0.1838 | 0.3854 | 0.1837 |
-| No log | 6.5333 | 98 | 0.1902 | 0.3712 | 0.1900 |
-| No log | 6.6667 | 100 | 0.1977 | 0.3732 | 0.1975 |
-| No log | 6.8 | 102 | 0.1991 | 0.3668 | 0.1989 |
-| No log | 6.9333 | 104 | 0.1975 | 0.3732 | 0.1973 |
-| No log | 7.0667 | 106 | 0.1963 | 0.3677 | 0.1961 |
-| No log | 7.2 | 108 | 0.1964 | 0.3641 | 0.1962 |
-| No log | 7.3333 | 110 | 0.1928 | 0.3659 | 0.1926 |
-| No log | 7.4667 | 112 | 0.1906 | 0.3765 | 0.1905 |
-| No log | 7.6 | 114 | 0.1938 | 0.3804 | 0.1936 |
-| No log | 7.7333 | 116 | 0.1973 | 0.3852 | 0.1971 |
-| No log | 7.8667 | 118 | 0.2049 | 0.3790 | 0.2047 |
-| No log | 8.0 | 120 | 0.2071 | 0.3725 | 0.2069 |
-| No log | 8.1333 | 122 | 0.2054 | 0.3763 | 0.2051 |
-| No log | 8.2667 | 124 | 0.2031 | 0.3715 | 0.2028 |
-| No log | 8.4 | 126 | 0.2018 | 0.3547 | 0.2016 |
-| No log | 8.5333 | 128 | 0.2011 | 0.3501 | 0.2009 |
-| No log | 8.6667 | 130 | 0.2020 | 0.3532 | 0.2018 |
-| No log | 8.8 | 132 | 0.2004 | 0.3532 | 0.2002 |
-| No log | 8.9333 | 134 | 0.1983 | 0.3587 | 0.1982 |
-| No log | 9.0667 | 136 | 0.1957 | 0.3632 | 0.1955 |
-| No log | 9.2 | 138 | 0.1933 | 0.3632 | 0.1931 |
-| No log | 9.3333 | 140 | 0.1911 | 0.3749 | 0.1909 |
-| No log | 9.4667 | 142 | 0.1894 | 0.3819 | 0.1893 |
-| No log | 9.6 | 144 | 0.1885 | 0.3819 | 0.1883 |
-| No log | 9.7333 | 146 | 0.1879 | 0.3819 | 0.1878 |
-| No log | 9.8667 | 148 | 0.1877 | 0.3819 | 0.1876 |
-| No log | 10.0 | 150 | 0.1878 | 0.3819 | 0.1876 |
+| No log | 0.1333 | 2 | 0.9235 | 0.0305 | 0.9241 |
+| No log | 0.2667 | 4 | 0.4213 | 0.0752 | 0.4205 |
+| No log | 0.4 | 6 | 0.4900 | 0.2872 | 0.4904 |
+| No log | 0.5333 | 8 | 0.3989 | 0.1492 | 0.3997 |
+| No log | 0.6667 | 10 | 0.2875 | 0.0828 | 0.2881 |
+| No log | 0.8 | 12 | 0.3103 | 0.1946 | 0.3107 |
+| No log | 0.9333 | 14 | 0.3505 | 0.1869 | 0.3507 |
+| No log | 1.0667 | 16 | 0.2987 | 0.2328 | 0.2991 |
+| No log | 1.2 | 18 | 0.2743 | 0.1987 | 0.2749 |
+| No log | 1.3333 | 20 | 0.2678 | 0.1801 | 0.2685 |
+| No log | 1.4667 | 22 | 0.2718 | 0.2380 | 0.2723 |
+| No log | 1.6 | 24 | 0.2750 | 0.2484 | 0.2754 |
+| No log | 1.7333 | 26 | 0.2802 | 0.2407 | 0.2807 |
+| No log | 1.8667 | 28 | 0.2967 | 0.2647 | 0.2971 |
+| No log | 2.0 | 30 | 0.3002 | 0.2386 | 0.3005 |
+| No log | 2.1333 | 32 | 0.3042 | 0.2307 | 0.3046 |
+| No log | 2.2667 | 34 | 0.2836 | 0.2266 | 0.2840 |
+| No log | 2.4 | 36 | 0.2566 | 0.2275 | 0.2571 |
+| No log | 2.5333 | 38 | 0.2566 | 0.2726 | 0.2572 |
+| No log | 2.6667 | 40 | 0.2685 | 0.2318 | 0.2689 |
+| No log | 2.8 | 42 | 0.3239 | 0.2291 | 0.3238 |
+| No log | 2.9333 | 44 | 0.3137 | 0.2140 | 0.3139 |
+| No log | 3.0667 | 46 | 0.2868 | 0.2462 | 0.2874 |
+| No log | 3.2 | 48 | 0.2840 | 0.2475 | 0.2845 |
+| No log | 3.3333 | 50 | 0.2972 | 0.2539 | 0.2975 |
+| No log | 3.4667 | 52 | 0.2987 | 0.2373 | 0.2989 |
+| No log | 3.6 | 54 | 0.2922 | 0.2472 | 0.2925 |
+| No log | 3.7333 | 56 | 0.2979 | 0.2276 | 0.2983 |
+| No log | 3.8667 | 58 | 0.3082 | 0.2134 | 0.3086 |
+| No log | 4.0 | 60 | 0.3245 | 0.2197 | 0.3249 |
+| No log | 4.1333 | 62 | 0.3302 | 0.2197 | 0.3305 |
+| No log | 4.2667 | 64 | 0.3164 | 0.2079 | 0.3169 |
+| No log | 4.4 | 66 | 0.3410 | 0.2131 | 0.3413 |
+| No log | 4.5333 | 68 | 0.3844 | 0.2178 | 0.3845 |
+| No log | 4.6667 | 70 | 0.3887 | 0.2178 | 0.3888 |
+| No log | 4.8 | 72 | 0.3995 | 0.2096 | 0.3996 |
+| No log | 4.9333 | 74 | 0.3677 | 0.2141 | 0.3679 |
+| No log | 5.0667 | 76 | 0.3517 | 0.2188 | 0.3520 |
+| No log | 5.2 | 78 | 0.3437 | 0.2188 | 0.3439 |
+| No log | 5.3333 | 80 | 0.3781 | 0.2096 | 0.3782 |
+| No log | 5.4667 | 82 | 0.4079 | 0.1975 | 0.4080 |
+| No log | 5.6 | 84 | 0.4051 | 0.2133 | 0.4052 |
+| No log | 5.7333 | 86 | 0.3469 | 0.2313 | 0.3471 |
+| No log | 5.8667 | 88 | 0.3091 | 0.2317 | 0.3095 |
+| No log | 6.0 | 90 | 0.3034 | 0.2341 | 0.3039 |
+| No log | 6.1333 | 92 | 0.3187 | 0.2175 | 0.3191 |
+| No log | 6.2667 | 94 | 0.3556 | 0.2123 | 0.3558 |
+| No log | 6.4 | 96 | 0.4256 | 0.1936 | 0.4256 |
+| No log | 6.5333 | 98 | 0.4370 | 0.1899 | 0.4370 |
+| No log | 6.6667 | 100 | 0.4141 | 0.2053 | 0.4142 |
+| No log | 6.8 | 102 | 0.4062 | 0.2053 | 0.4063 |
+| No log | 6.9333 | 104 | 0.3722 | 0.2178 | 0.3723 |
+| No log | 7.0667 | 106 | 0.3420 | 0.2313 | 0.3423 |
+| No log | 7.2 | 108 | 0.3323 | 0.2337 | 0.3326 |
+| No log | 7.3333 | 110 | 0.3482 | 0.2161 | 0.3484 |
+| No log | 7.4667 | 112 | 0.3690 | 0.2178 | 0.3691 |
+| No log | 7.6 | 114 | 0.3867 | 0.2169 | 0.3868 |
+| No log | 7.7333 | 116 | 0.4039 | 0.2029 | 0.4039 |
+| No log | 7.8667 | 118 | 0.4092 | 0.2029 | 0.4091 |
+| No log | 8.0 | 120 | 0.4179 | 0.2012 | 0.4178 |
+| No log | 8.1333 | 122 | 0.4160 | 0.2089 | 0.4160 |
+| No log | 8.2667 | 124 | 0.4199 | 0.2012 | 0.4199 |
+| No log | 8.4 | 126 | 0.4338 | 0.1899 | 0.4337 |
+| No log | 8.5333 | 128 | 0.4437 | 0.1935 | 0.4436 |
+| No log | 8.6667 | 130 | 0.4404 | 0.1973 | 0.4403 |
+| No log | 8.8 | 132 | 0.4283 | 0.2012 | 0.4282 |
+| No log | 8.9333 | 134 | 0.4152 | 0.2012 | 0.4151 |
+| No log | 9.0667 | 136 | 0.3937 | 0.2089 | 0.3937 |
+| No log | 9.2 | 138 | 0.3753 | 0.2141 | 0.3753 |
+| No log | 9.3333 | 140 | 0.3664 | 0.2097 | 0.3665 |
+| No log | 9.4667 | 142 | 0.3644 | 0.2097 | 0.3645 |
+| No log | 9.6 | 144 | 0.3651 | 0.2097 | 0.3652 |
+| No log | 9.7333 | 146 | 0.3680 | 0.2097 | 0.3681 |
+| No log | 9.8667 | 148 | 0.3713 | 0.2097 | 0.3714 |
+| No log | 10.0 | 150 | 0.3732 | 0.2161 | 0.3733 |
 
 
 ### Framework versions
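
For readers comparing the two folds: the Qwk column above is quadratic weighted Cohen's kappa, which credits near-miss predictions on an ordinal scale rather than exact matches only. Below is a minimal sketch of the metric using scikit-learn, assuming (the card does not say) that continuous model outputs are rounded to the nearest integer label before scoring:

```python
# Minimal Qwk sketch; the rounding of raw scores to integer labels is an
# assumption, as the card does not specify how predictions are binned.
import numpy as np
from sklearn.metrics import cohen_kappa_score

def qwk(y_true, y_pred):
    """Quadratic weighted kappa between integer labels and raw model scores."""
    y_pred_int = np.rint(np.asarray(y_pred)).astype(int)  # assumed rounding step
    return cohen_kappa_score(y_true, y_pred_int, weights="quadratic")

# Perfect agreement scores 1.0 and chance-level agreement about 0.0, which
# puts fold5 (0.3819) at modest agreement and fold6 (0.2161) noticeably weaker.
print(qwk([0, 1, 2, 2, 0], [0.1, 0.8, 2.2, 1.9, 0.3]))  # -> 1.0
```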
 
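
A minimal inference sketch for the updated checkpoint. The repo id below is an assumption pieced together from the committer name and model name, and the single-score regression head is inferred from the Loss/Mse metrics; neither detail is stated in the diff itself:

```python
# Hypothetical usage sketch; repo id and single-output head are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "salbatarni/arabert_cross_relevance_task1_fold6"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

# Score a sample Arabic sentence ("this is an example of an essay").
inputs = tokenizer("هذا مثال على مقال", return_tensors="pt")
with torch.no_grad():
    relevance = model(**inputs).logits.squeeze().item()
print(relevance)
```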