salbatarni committed on
Commit 479833c · verified · 1 Parent(s): 9f760ca

End of training

Files changed (1): README.md (+87 -82)
README.md CHANGED
@@ -3,20 +3,20 @@ base_model: aubmindlab/bert-base-arabertv02
 tags:
 - generated_from_trainer
 model-index:
-- name: arabert_cross_relevance_task7_fold4
+- name: arabert_cross_relevance_task7_fold5
   results: []
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# arabert_cross_relevance_task7_fold4
+# arabert_cross_relevance_task7_fold5
 
 This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.3071
-- Qwk: 0.5503
-- Mse: 0.3071
+- Loss: 0.2737
+- Qwk: 0.2531
+- Mse: 0.2737
 
 ## Model description
 
@@ -45,83 +45,88 @@ The following hyperparameters were used during training:
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
-|:-------------:|:------:|:----:|:---------------:|:------:|:------:|
-| No log | 0.1333 | 2 | 0.5503 | 0.1601 | 0.5503 |
-| No log | 0.2667 | 4 | 0.4709 | 0.1756 | 0.4709 |
-| No log | 0.4 | 6 | 0.5461 | 0.3162 | 0.5461 |
-| No log | 0.5333 | 8 | 0.4597 | 0.2317 | 0.4597 |
-| No log | 0.6667 | 10 | 0.3774 | 0.2343 | 0.3774 |
-| No log | 0.8 | 12 | 0.3353 | 0.2749 | 0.3353 |
-| No log | 0.9333 | 14 | 0.3174 | 0.2948 | 0.3174 |
-| No log | 1.0667 | 16 | 0.2902 | 0.3035 | 0.2902 |
-| No log | 1.2 | 18 | 0.2766 | 0.3168 | 0.2766 |
-| No log | 1.3333 | 20 | 0.2981 | 0.3060 | 0.2981 |
-| No log | 1.4667 | 22 | 0.2863 | 0.3304 | 0.2863 |
-| No log | 1.6 | 24 | 0.2902 | 0.3012 | 0.2902 |
-| No log | 1.7333 | 26 | 0.2762 | 0.3012 | 0.2762 |
-| No log | 1.8667 | 28 | 0.2673 | 0.3216 | 0.2673 |
-| No log | 2.0 | 30 | 0.2662 | 0.3673 | 0.2662 |
-| No log | 2.1333 | 32 | 0.2927 | 0.4416 | 0.2927 |
-| No log | 2.2667 | 34 | 0.3407 | 0.4358 | 0.3407 |
-| No log | 2.4 | 36 | 0.3384 | 0.4477 | 0.3384 |
-| No log | 2.5333 | 38 | 0.3010 | 0.4244 | 0.3010 |
-| No log | 2.6667 | 40 | 0.3009 | 0.3892 | 0.3009 |
-| No log | 2.8 | 42 | 0.3374 | 0.4316 | 0.3374 |
-| No log | 2.9333 | 44 | 0.3314 | 0.5561 | 0.3314 |
-| No log | 3.0667 | 46 | 0.3496 | 0.6218 | 0.3496 |
-| No log | 3.2 | 48 | 0.3271 | 0.5809 | 0.3271 |
-| No log | 3.3333 | 50 | 0.2795 | 0.4989 | 0.2795 |
-| No log | 3.4667 | 52 | 0.2819 | 0.4439 | 0.2819 |
-| No log | 3.6 | 54 | 0.3087 | 0.3899 | 0.3087 |
-| No log | 3.7333 | 56 | 0.3129 | 0.3657 | 0.3129 |
-| No log | 3.8667 | 58 | 0.3247 | 0.4403 | 0.3247 |
-| No log | 4.0 | 60 | 0.3497 | 0.5590 | 0.3497 |
-| No log | 4.1333 | 62 | 0.3554 | 0.5878 | 0.3554 |
-| No log | 4.2667 | 64 | 0.3325 | 0.5931 | 0.3325 |
-| No log | 4.4 | 66 | 0.2900 | 0.5432 | 0.2900 |
-| No log | 4.5333 | 68 | 0.2886 | 0.5107 | 0.2886 |
-| No log | 4.6667 | 70 | 0.3060 | 0.4830 | 0.3060 |
-| No log | 4.8 | 72 | 0.3388 | 0.4839 | 0.3388 |
-| No log | 4.9333 | 74 | 0.3650 | 0.4978 | 0.3650 |
-| No log | 5.0667 | 76 | 0.3668 | 0.5231 | 0.3668 |
-| No log | 5.2 | 78 | 0.3779 | 0.5912 | 0.3779 |
-| No log | 5.3333 | 80 | 0.3797 | 0.6118 | 0.3797 |
-| No log | 5.4667 | 82 | 0.3500 | 0.5912 | 0.3500 |
-| No log | 5.6 | 84 | 0.3089 | 0.5757 | 0.3089 |
-| No log | 5.7333 | 86 | 0.2927 | 0.5560 | 0.2927 |
-| No log | 5.8667 | 88 | 0.3153 | 0.5457 | 0.3153 |
-| No log | 6.0 | 90 | 0.3342 | 0.5489 | 0.3342 |
-| No log | 6.1333 | 92 | 0.3318 | 0.5833 | 0.3318 |
-| No log | 6.2667 | 94 | 0.3329 | 0.5929 | 0.3329 |
-| No log | 6.4 | 96 | 0.3261 | 0.5905 | 0.3261 |
-| No log | 6.5333 | 98 | 0.3147 | 0.5945 | 0.3147 |
-| No log | 6.6667 | 100 | 0.3034 | 0.5890 | 0.3034 |
-| No log | 6.8 | 102 | 0.3084 | 0.5863 | 0.3084 |
-| No log | 6.9333 | 104 | 0.3094 | 0.5743 | 0.3094 |
-| No log | 7.0667 | 106 | 0.3099 | 0.5635 | 0.3099 |
-| No log | 7.2 | 108 | 0.3101 | 0.5743 | 0.3101 |
-| No log | 7.3333 | 110 | 0.3126 | 0.5689 | 0.3126 |
-| No log | 7.4667 | 112 | 0.3295 | 0.5777 | 0.3295 |
-| No log | 7.6 | 114 | 0.3516 | 0.5943 | 0.3516 |
-| No log | 7.7333 | 116 | 0.3630 | 0.5902 | 0.3630 |
-| No log | 7.8667 | 118 | 0.3567 | 0.5791 | 0.3567 |
-| No log | 8.0 | 120 | 0.3425 | 0.5667 | 0.3425 |
-| No log | 8.1333 | 122 | 0.3293 | 0.5424 | 0.3293 |
-| No log | 8.2667 | 124 | 0.3261 | 0.5369 | 0.3261 |
-| No log | 8.4 | 126 | 0.3225 | 0.5728 | 0.3225 |
-| No log | 8.5333 | 128 | 0.3121 | 0.5644 | 0.3121 |
-| No log | 8.6667 | 130 | 0.3072 | 0.5740 | 0.3072 |
-| No log | 8.8 | 132 | 0.3079 | 0.5644 | 0.3079 |
-| No log | 8.9333 | 134 | 0.3012 | 0.5687 | 0.3012 |
-| No log | 9.0667 | 136 | 0.3021 | 0.5644 | 0.3021 |
-| No log | 9.2 | 138 | 0.3058 | 0.5644 | 0.3058 |
-| No log | 9.3333 | 140 | 0.3067 | 0.5557 | 0.3067 |
-| No log | 9.4667 | 142 | 0.3080 | 0.5663 | 0.3080 |
-| No log | 9.6 | 144 | 0.3079 | 0.5663 | 0.3079 |
-| No log | 9.7333 | 146 | 0.3063 | 0.5493 | 0.3063 |
-| No log | 9.8667 | 148 | 0.3065 | 0.5547 | 0.3065 |
-| No log | 10.0 | 150 | 0.3071 | 0.5503 | 0.3071 |
 
+| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
+|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
+| No log | 0.125 | 2 | 1.2635 | 0.0075 | 1.2635 |
+| No log | 0.25 | 4 | 0.4733 | 0.0856 | 0.4733 |
+| No log | 0.375 | 6 | 0.4780 | 0.1193 | 0.4780 |
+| No log | 0.5 | 8 | 0.4171 | 0.1125 | 0.4171 |
+| No log | 0.625 | 10 | 0.3393 | 0.1154 | 0.3393 |
+| No log | 0.75 | 12 | 0.3133 | 0.0877 | 0.3133 |
+| No log | 0.875 | 14 | 0.2954 | 0.0958 | 0.2954 |
+| No log | 1.0 | 16 | 0.2822 | 0.1293 | 0.2822 |
+| No log | 1.125 | 18 | 0.2616 | 0.1948 | 0.2616 |
+| No log | 1.25 | 20 | 0.2545 | 0.2304 | 0.2545 |
+| No log | 1.375 | 22 | 0.2454 | 0.2207 | 0.2454 |
+| No log | 1.5 | 24 | 0.2508 | 0.2316 | 0.2508 |
+| No log | 1.625 | 26 | 0.2477 | 0.2028 | 0.2477 |
+| No log | 1.75 | 28 | 0.2442 | 0.2000 | 0.2442 |
+| No log | 1.875 | 30 | 0.2513 | 0.2028 | 0.2513 |
+| No log | 2.0 | 32 | 0.2468 | 0.2138 | 0.2468 |
+| No log | 2.125 | 34 | 0.2358 | 0.2349 | 0.2358 |
+| No log | 2.25 | 36 | 0.2333 | 0.2259 | 0.2333 |
+| No log | 2.375 | 38 | 0.2391 | 0.2212 | 0.2391 |
+| No log | 2.5 | 40 | 0.2305 | 0.2355 | 0.2305 |
+| No log | 2.625 | 42 | 0.2393 | 0.2837 | 0.2393 |
+| No log | 2.75 | 44 | 0.2395 | 0.2837 | 0.2395 |
+| No log | 2.875 | 46 | 0.2326 | 0.2838 | 0.2326 |
+| No log | 3.0 | 48 | 0.2323 | 0.2435 | 0.2323 |
+| No log | 3.125 | 50 | 0.2326 | 0.2723 | 0.2326 |
+| No log | 3.25 | 52 | 0.2331 | 0.2723 | 0.2331 |
+| No log | 3.375 | 54 | 0.2328 | 0.2800 | 0.2328 |
+| No log | 3.5 | 56 | 0.2397 | 0.2677 | 0.2397 |
+| No log | 3.625 | 58 | 0.2444 | 0.2503 | 0.2444 |
+| No log | 3.75 | 60 | 0.2477 | 0.2371 | 0.2477 |
+| No log | 3.875 | 62 | 0.2481 | 0.2355 | 0.2481 |
+| No log | 4.0 | 64 | 0.2561 | 0.2148 | 0.2561 |
+| No log | 4.125 | 66 | 0.2682 | 0.2101 | 0.2682 |
+| No log | 4.25 | 68 | 0.2759 | 0.1725 | 0.2759 |
+| No log | 4.375 | 70 | 0.2719 | 0.1665 | 0.2719 |
+| No log | 4.5 | 72 | 0.2545 | 0.2022 | 0.2545 |
+| No log | 4.625 | 74 | 0.2459 | 0.2488 | 0.2459 |
+| No log | 4.75 | 76 | 0.2442 | 0.2674 | 0.2442 |
+| No log | 4.875 | 78 | 0.2574 | 0.2183 | 0.2574 |
+| No log | 5.0 | 80 | 0.2695 | 0.2133 | 0.2695 |
+| No log | 5.125 | 82 | 0.2908 | 0.2237 | 0.2908 |
+| No log | 5.25 | 84 | 0.2741 | 0.2523 | 0.2741 |
+| No log | 5.375 | 86 | 0.2535 | 0.2795 | 0.2535 |
+| No log | 5.5 | 88 | 0.2511 | 0.2795 | 0.2511 |
+| No log | 5.625 | 90 | 0.2503 | 0.2680 | 0.2503 |
+| No log | 5.75 | 92 | 0.2516 | 0.2536 | 0.2516 |
+| No log | 5.875 | 94 | 0.2541 | 0.2575 | 0.2541 |
+| No log | 6.0 | 96 | 0.2541 | 0.2575 | 0.2541 |
+| No log | 6.125 | 98 | 0.2569 | 0.2609 | 0.2569 |
+| No log | 6.25 | 100 | 0.2577 | 0.2609 | 0.2577 |
+| No log | 6.375 | 102 | 0.2619 | 0.2609 | 0.2619 |
+| No log | 6.5 | 104 | 0.2736 | 0.2575 | 0.2736 |
+| No log | 6.625 | 106 | 0.2782 | 0.2575 | 0.2782 |
+| No log | 6.75 | 108 | 0.2718 | 0.2793 | 0.2718 |
+| No log | 6.875 | 110 | 0.2666 | 0.2575 | 0.2666 |
+| No log | 7.0 | 112 | 0.2601 | 0.2575 | 0.2601 |
+| No log | 7.125 | 114 | 0.2517 | 0.2605 | 0.2517 |
+| No log | 7.25 | 116 | 0.2515 | 0.2641 | 0.2515 |
+| No log | 7.375 | 118 | 0.2527 | 0.2641 | 0.2527 |
+| No log | 7.5 | 120 | 0.2579 | 0.2565 | 0.2579 |
+| No log | 7.625 | 122 | 0.2612 | 0.2644 | 0.2612 |
+| No log | 7.75 | 124 | 0.2592 | 0.2644 | 0.2592 |
+| No log | 7.875 | 126 | 0.2572 | 0.2682 | 0.2572 |
+| No log | 8.0 | 128 | 0.2594 | 0.2682 | 0.2594 |
+| No log | 8.125 | 130 | 0.2600 | 0.2682 | 0.2600 |
+| No log | 8.25 | 132 | 0.2633 | 0.2609 | 0.2633 |
+| No log | 8.375 | 134 | 0.2691 | 0.2496 | 0.2691 |
+| No log | 8.5 | 136 | 0.2714 | 0.2426 | 0.2714 |
+| No log | 8.625 | 138 | 0.2651 | 0.2609 | 0.2651 |
+| No log | 8.75 | 140 | 0.2593 | 0.2682 | 0.2593 |
+| No log | 8.875 | 142 | 0.2587 | 0.2682 | 0.2587 |
+| No log | 9.0 | 144 | 0.2587 | 0.2682 | 0.2587 |
+| No log | 9.125 | 146 | 0.2620 | 0.2682 | 0.2620 |
+| No log | 9.25 | 148 | 0.2666 | 0.2570 | 0.2666 |
+| No log | 9.375 | 150 | 0.2697 | 0.2570 | 0.2697 |
+| No log | 9.5 | 152 | 0.2714 | 0.2570 | 0.2714 |
+| No log | 9.625 | 154 | 0.2724 | 0.2531 | 0.2724 |
+| No log | 9.75 | 156 | 0.2735 | 0.2458 | 0.2735 |
+| No log | 9.875 | 158 | 0.2736 | 0.2531 | 0.2736 |
+| No log | 10.0 | 160 | 0.2737 | 0.2531 | 0.2737 |
 
 
 ### Framework versions
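
The three evaluation columns in the tables above are related: Loss and Mse are identical at every logged step, which is consistent with a mean-squared-error training objective on a scalar score, while Qwk is quadratic weighted kappa. Below is a minimal sketch of how such numbers could be computed from gold and predicted scores; rounding continuous predictions to integer bins before kappa is an assumption here, since the card does not record the exact procedure.

```python
# Sketch only: assumes scalar relevance scores and integer binning for QWK,
# neither of which is documented in the card itself.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def card_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """Compute the three quantities reported in the training-results table."""
    mse = mean_squared_error(y_true, y_pred)
    qwk = cohen_kappa_score(
        np.rint(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",  # quadratic weighted kappa
    )
    # With an MSE objective, the eval "Loss" and "Mse" columns measure the
    # same quantity, which is why they match row by row in the table.
    return {"loss": mse, "qwk": qwk, "mse": mse}

# Toy usage with made-up scores:
gold = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
pred = np.array([0.2, 1.1, 1.7, 0.9, 0.1])
print(card_metrics(gold, pred))
```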
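Since the card lists only metrics and no usage instructions, here is a hedged inference sketch for the fine-tuned checkpoint. The repo id is inferred from the model name and may not match the actual Hub path, and the single-logit regression head is an assumption based on the MSE-style metrics, so treat both as hypothetical.

```python
# Hedged usage sketch, not part of the original card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "salbatarni/arabert_cross_relevance_task7_fold5"  # hypothetical id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

# Score one Arabic text; the task specifics (essay relevance, fold 5 of a
# cross-validation split) are inferred from the model name, not documented.
text = "هذا نص تجريبي للتقييم"
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 1) for a regression head
print(f"predicted relevance score: {logits.squeeze().item():.4f}")
```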