lombardata committed on
Commit e947dae
1 Parent(s): a96176b

Upload README.md

Files changed (1)
  1. README.md +207 -132
README.md CHANGED
@@ -1,156 +1,231 @@
 
  ---
- license: apache-2.0
- base_model: facebook/dinov2-large
  tags:
  - generated_from_trainer
- metrics:
- - accuracy
  model-index:
  - name: DinoVdeau-large-2024_09_05-batch-size32_epochs150_freeze
  results: []
  ---

- <!-- This model card has been generated automatically according to the information the Trainer had access to. You
- should probably proofread and complete it, then remove this comment. -->

- # DinoVdeau-large-2024_09_05-batch-size32_epochs150_freeze
-
- This model is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large) on the None dataset.
- It achieves the following results on the evaluation set:
  - Loss: 0.1209
  - F1 Micro: 0.8228
  - F1 Macro: 0.7175
  - Roc Auc: 0.8813
  - Accuracy: 0.3111
- - Learning Rate: 0.0000

- ## Model description

- More information needed

- ## Intended uses & limitations

- More information needed

- ## Training and evaluation data

- More information needed

- ## Training procedure

- ### Training hyperparameters

  The following hyperparameters were used during training:
- - learning_rate: 0.001
- - train_batch_size: 32
- - eval_batch_size: 32
- - seed: 42
- - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- - lr_scheduler_type: linear
- - num_epochs: 150
- - mixed_precision_training: Native AMP
-
- ### Training results
-
- | Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | Roc Auc | Accuracy | Rate |
- |:-------------:|:-----:|:-----:|:---------------:|:--------:|:--------:|:-------:|:--------:|:------:|
- | No log | 1.0 | 273 | 0.1690 | 0.7517 | 0.5430 | 0.8384 | 0.2231 | 0.001 |
- | 0.2719 | 2.0 | 546 | 0.1538 | 0.7657 | 0.5721 | 0.8396 | 0.2401 | 0.001 |
- | 0.2719 | 3.0 | 819 | 0.1483 | 0.7773 | 0.6138 | 0.8516 | 0.2346 | 0.001 |
- | 0.1694 | 4.0 | 1092 | 0.1480 | 0.7723 | 0.6225 | 0.8407 | 0.2495 | 0.001 |
- | 0.1694 | 5.0 | 1365 | 0.1458 | 0.7797 | 0.6302 | 0.8470 | 0.2495 | 0.001 |
- | 0.1625 | 6.0 | 1638 | 0.1450 | 0.7798 | 0.6093 | 0.8477 | 0.2481 | 0.001 |
- | 0.1625 | 7.0 | 1911 | 0.1475 | 0.7767 | 0.6248 | 0.8454 | 0.2526 | 0.001 |
- | 0.1592 | 8.0 | 2184 | 0.1457 | 0.7804 | 0.6249 | 0.8521 | 0.2574 | 0.001 |
- | 0.1592 | 9.0 | 2457 | 0.1417 | 0.7869 | 0.6526 | 0.8561 | 0.2574 | 0.001 |
- | 0.157 | 10.0 | 2730 | 0.1436 | 0.7757 | 0.6290 | 0.8403 | 0.2547 | 0.001 |
- | 0.1563 | 11.0 | 3003 | 0.1428 | 0.7887 | 0.6448 | 0.8569 | 0.2640 | 0.001 |
- | 0.1563 | 12.0 | 3276 | 0.1439 | 0.7905 | 0.6493 | 0.8638 | 0.2581 | 0.001 |
- | 0.1558 | 13.0 | 3549 | 0.1391 | 0.7907 | 0.6562 | 0.8551 | 0.2713 | 0.001 |
- | 0.1558 | 14.0 | 3822 | 0.1409 | 0.7838 | 0.6338 | 0.8485 | 0.2644 | 0.001 |
- | 0.1543 | 15.0 | 4095 | 0.1396 | 0.7907 | 0.6463 | 0.8603 | 0.2578 | 0.001 |
- | 0.1543 | 16.0 | 4368 | 0.1390 | 0.7913 | 0.6594 | 0.8564 | 0.2654 | 0.001 |
- | 0.1535 | 17.0 | 4641 | 0.1418 | 0.7940 | 0.6586 | 0.8665 | 0.2564 | 0.001 |
- | 0.1535 | 18.0 | 4914 | 0.1416 | 0.7957 | 0.6560 | 0.8646 | 0.2658 | 0.001 |
- | 0.1549 | 19.0 | 5187 | 0.1403 | 0.7886 | 0.6524 | 0.8536 | 0.2630 | 0.001 |
- | 0.1549 | 20.0 | 5460 | 0.1476 | 0.7911 | 0.6558 | 0.8568 | 0.2613 | 0.001 |
- | 0.154 | 21.0 | 5733 | 0.1429 | 0.7880 | 0.6397 | 0.8568 | 0.2658 | 0.001 |
- | 0.1529 | 22.0 | 6006 | 0.1414 | 0.7937 | 0.6508 | 0.8654 | 0.2613 | 0.001 |
- | 0.1529 | 23.0 | 6279 | 0.1415 | 0.7976 | 0.6618 | 0.8613 | 0.2685 | 0.0001 |
- | 0.1449 | 24.0 | 6552 | 0.1323 | 0.8045 | 0.6751 | 0.8665 | 0.2789 | 0.0001 |
- | 0.1449 | 25.0 | 6825 | 0.1310 | 0.8044 | 0.6724 | 0.8688 | 0.2793 | 0.0001 |
- | 0.1416 | 26.0 | 7098 | 0.1327 | 0.8036 | 0.6689 | 0.8646 | 0.2821 | 0.0001 |
- | 0.1416 | 27.0 | 7371 | 0.1317 | 0.8069 | 0.6797 | 0.8715 | 0.2817 | 0.0001 |
- | 0.1391 | 28.0 | 7644 | 0.1288 | 0.8072 | 0.6818 | 0.8698 | 0.2775 | 0.0001 |
- | 0.1391 | 29.0 | 7917 | 0.1294 | 0.8038 | 0.6808 | 0.8629 | 0.2845 | 0.0001 |
- | 0.138 | 30.0 | 8190 | 0.1294 | 0.8077 | 0.6826 | 0.8702 | 0.2859 | 0.0001 |
- | 0.138 | 31.0 | 8463 | 0.1274 | 0.8074 | 0.6779 | 0.8666 | 0.2879 | 0.0001 |
- | 0.1364 | 32.0 | 8736 | 0.1278 | 0.8104 | 0.6869 | 0.8728 | 0.2883 | 0.0001 |
- | 0.1359 | 33.0 | 9009 | 0.1277 | 0.8077 | 0.6811 | 0.8692 | 0.2869 | 0.0001 |
- | 0.1359 | 34.0 | 9282 | 0.1266 | 0.8109 | 0.6874 | 0.8714 | 0.2883 | 0.0001 |
- | 0.1341 | 35.0 | 9555 | 0.1262 | 0.8104 | 0.6885 | 0.8716 | 0.2904 | 0.0001 |
- | 0.1341 | 36.0 | 9828 | 0.1269 | 0.8070 | 0.6876 | 0.8657 | 0.2827 | 0.0001 |
- | 0.1339 | 37.0 | 10101 | 0.1266 | 0.8082 | 0.6834 | 0.8678 | 0.2866 | 0.0001 |
- | 0.1339 | 38.0 | 10374 | 0.1255 | 0.8106 | 0.6936 | 0.8707 | 0.2956 | 0.0001 |
- | 0.1307 | 39.0 | 10647 | 0.1249 | 0.8142 | 0.6986 | 0.8768 | 0.2928 | 0.0001 |
- | 0.1307 | 40.0 | 10920 | 0.1258 | 0.8138 | 0.6990 | 0.8773 | 0.2935 | 0.0001 |
- | 0.1317 | 41.0 | 11193 | 0.1253 | 0.8101 | 0.6924 | 0.8688 | 0.2924 | 0.0001 |
- | 0.1317 | 42.0 | 11466 | 0.1244 | 0.8138 | 0.6970 | 0.8738 | 0.3004 | 0.0001 |
- | 0.1308 | 43.0 | 11739 | 0.1245 | 0.8131 | 0.6956 | 0.8734 | 0.2949 | 0.0001 |
- | 0.1307 | 44.0 | 12012 | 0.1250 | 0.8130 | 0.6915 | 0.8743 | 0.2966 | 0.0001 |
- | 0.1307 | 45.0 | 12285 | 0.1240 | 0.8137 | 0.7051 | 0.8740 | 0.2963 | 0.0001 |
- | 0.1295 | 46.0 | 12558 | 0.1241 | 0.8131 | 0.6988 | 0.8733 | 0.2976 | 0.0001 |
- | 0.1295 | 47.0 | 12831 | 0.1243 | 0.8119 | 0.6958 | 0.8716 | 0.2956 | 0.0001 |
- | 0.1293 | 48.0 | 13104 | 0.1239 | 0.8135 | 0.6990 | 0.8744 | 0.2956 | 0.0001 |
- | 0.1293 | 49.0 | 13377 | 0.1243 | 0.8153 | 0.7007 | 0.8775 | 0.2997 | 0.0001 |
- | 0.1274 | 50.0 | 13650 | 0.1241 | 0.8152 | 0.7000 | 0.8769 | 0.2980 | 0.0001 |
- | 0.1274 | 51.0 | 13923 | 0.1248 | 0.8153 | 0.7056 | 0.8803 | 0.3011 | 0.0001 |
- | 0.1271 | 52.0 | 14196 | 0.1243 | 0.8157 | 0.7036 | 0.8751 | 0.3049 | 0.0001 |
- | 0.1271 | 53.0 | 14469 | 0.1241 | 0.8153 | 0.7032 | 0.8778 | 0.3021 | 0.0001 |
- | 0.1275 | 54.0 | 14742 | 0.1234 | 0.8152 | 0.7068 | 0.8753 | 0.3021 | 0.0001 |
- | 0.1256 | 55.0 | 15015 | 0.1231 | 0.8166 | 0.7076 | 0.8776 | 0.3018 | 0.0001 |
- | 0.1256 | 56.0 | 15288 | 0.1228 | 0.8190 | 0.7088 | 0.8822 | 0.3067 | 0.0001 |
- | 0.1258 | 57.0 | 15561 | 0.1226 | 0.8160 | 0.7080 | 0.8767 | 0.3070 | 0.0001 |
- | 0.1258 | 58.0 | 15834 | 0.1233 | 0.8170 | 0.7073 | 0.8773 | 0.3021 | 0.0001 |
- | 0.1258 | 59.0 | 16107 | 0.1227 | 0.8172 | 0.7135 | 0.8781 | 0.3021 | 0.0001 |
- | 0.1258 | 60.0 | 16380 | 0.1233 | 0.8143 | 0.7040 | 0.8729 | 0.3021 | 0.0001 |
- | 0.1252 | 61.0 | 16653 | 0.1234 | 0.8168 | 0.7121 | 0.8784 | 0.3042 | 0.0001 |
- | 0.1252 | 62.0 | 16926 | 0.1223 | 0.8169 | 0.7125 | 0.8764 | 0.3049 | 0.0001 |
- | 0.1238 | 63.0 | 17199 | 0.1231 | 0.8151 | 0.7090 | 0.8752 | 0.3035 | 0.0001 |
- | 0.1238 | 64.0 | 17472 | 0.1228 | 0.8183 | 0.7114 | 0.8785 | 0.3067 | 0.0001 |
- | 0.1247 | 65.0 | 17745 | 0.1231 | 0.8185 | 0.7156 | 0.8802 | 0.3035 | 0.0001 |
- | 0.123 | 66.0 | 18018 | 0.1225 | 0.8193 | 0.7084 | 0.8809 | 0.3021 | 0.0001 |
- | 0.123 | 67.0 | 18291 | 0.1222 | 0.8186 | 0.7136 | 0.8814 | 0.3032 | 0.0001 |
- | 0.1224 | 68.0 | 18564 | 0.1220 | 0.8201 | 0.7169 | 0.8818 | 0.3091 | 0.0001 |
- | 0.1224 | 69.0 | 18837 | 0.1228 | 0.8171 | 0.7165 | 0.8768 | 0.3018 | 0.0001 |
- | 0.1228 | 70.0 | 19110 | 0.1227 | 0.8177 | 0.7131 | 0.8765 | 0.3042 | 0.0001 |
- | 0.1228 | 71.0 | 19383 | 0.1232 | 0.8155 | 0.7123 | 0.8733 | 0.2980 | 0.0001 |
- | 0.1224 | 72.0 | 19656 | 0.1222 | 0.8177 | 0.7181 | 0.8780 | 0.3056 | 0.0001 |
- | 0.1224 | 73.0 | 19929 | 0.1221 | 0.8162 | 0.7047 | 0.8760 | 0.3077 | 0.0001 |
- | 0.122 | 74.0 | 20202 | 0.1230 | 0.8148 | 0.7070 | 0.8732 | 0.2973 | 0.0001 |
- | 0.122 | 75.0 | 20475 | 0.1214 | 0.8176 | 0.7124 | 0.8768 | 0.3049 | 1e-05 |
- | 0.1201 | 76.0 | 20748 | 0.1209 | 0.8213 | 0.7265 | 0.8828 | 0.3067 | 1e-05 |
- | 0.1192 | 77.0 | 21021 | 0.1216 | 0.8221 | 0.7249 | 0.8860 | 0.3073 | 1e-05 |
- | 0.1192 | 78.0 | 21294 | 0.1211 | 0.8210 | 0.7233 | 0.8828 | 0.3056 | 1e-05 |
- | 0.1178 | 79.0 | 21567 | 0.1211 | 0.8181 | 0.7158 | 0.8769 | 0.3056 | 1e-05 |
- | 0.1178 | 80.0 | 21840 | 0.1210 | 0.8200 | 0.7197 | 0.8824 | 0.3091 | 1e-05 |
- | 0.1178 | 81.0 | 22113 | 0.1205 | 0.8190 | 0.7194 | 0.8784 | 0.3105 | 1e-05 |
- | 0.1178 | 82.0 | 22386 | 0.1205 | 0.8187 | 0.7213 | 0.8782 | 0.3070 | 1e-05 |
- | 0.1162 | 83.0 | 22659 | 0.1215 | 0.8171 | 0.7136 | 0.8754 | 0.3049 | 1e-05 |
- | 0.1162 | 84.0 | 22932 | 0.1209 | 0.8212 | 0.7226 | 0.8817 | 0.3115 | 1e-05 |
- | 0.1174 | 85.0 | 23205 | 0.1206 | 0.8213 | 0.7219 | 0.8823 | 0.3094 | 1e-05 |
- | 0.1174 | 86.0 | 23478 | 0.1210 | 0.8207 | 0.7256 | 0.8811 | 0.3084 | 1e-05 |
- | 0.1167 | 87.0 | 23751 | 0.1210 | 0.8192 | 0.7163 | 0.8800 | 0.3073 | 1e-05 |
- | 0.116 | 88.0 | 24024 | 0.1208 | 0.8219 | 0.7180 | 0.8831 | 0.3094 | 1e-05 |
- | 0.116 | 89.0 | 24297 | 0.1213 | 0.8236 | 0.7293 | 0.8872 | 0.3125 | 0.0000 |
- | 0.1161 | 90.0 | 24570 | 0.1211 | 0.8228 | 0.7250 | 0.8869 | 0.3108 | 0.0000 |
- | 0.1161 | 91.0 | 24843 | 0.1206 | 0.8191 | 0.7187 | 0.8779 | 0.3105 | 0.0000 |
- | 0.1162 | 92.0 | 25116 | 0.1208 | 0.8196 | 0.7150 | 0.8793 | 0.3105 | 0.0000 |
-
-
- ### Framework versions
-
- - Transformers 4.41.1
- - Pytorch 2.3.0+cu121
- - Datasets 2.19.1
- - Tokenizers 0.19.1
+
  ---
+ language:
+ - eng
+ license: wtfpl
  tags:
+ - multilabel-image-classification
+ - multilabel
  - generated_from_trainer
+ base_model: facebook/dinov2-large
  model-index:
  - name: DinoVdeau-large-2024_09_05-batch-size32_epochs150_freeze
  results: []
  ---

+ DinoVd'eau is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large). It achieves the following results on the test set:

  - Loss: 0.1209
  - F1 Micro: 0.8228
  - F1 Macro: 0.7175
  - Roc Auc: 0.8813
  - Accuracy: 0.3111
 
+ ---
+
+ # Model description
+ DinoVd'eau is a model built on top of the dinov2 model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
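The card describes the head only at this level of detail. The PyTorch sketch below is one plausible reading of a "linear, ReLU, batch normalization, and dropout" head on top of a frozen DINOv2 encoder; the hidden size, dropout rate and layer ordering are illustrative assumptions rather than values taken from the training repository (only `num_labels=31` matches the class table further down).

```python
import torch
import torch.nn as nn
from transformers import Dinov2Model


class MultilabelHead(nn.Module):
    """Illustrative head: Linear -> BatchNorm -> ReLU -> Dropout -> Linear (sizes are assumptions)."""

    def __init__(self, in_dim: int, num_labels: int, hidden_dim: int = 512, dropout: float = 0.3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.BatchNorm1d(hidden_dim),
            nn.ReLU(),
            nn.Dropout(dropout),
            nn.Linear(hidden_dim, num_labels),
        )

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        return self.net(features)


backbone = Dinov2Model.from_pretrained("facebook/dinov2-large")
for param in backbone.parameters():      # "freeze" configuration: only the head is trained
    param.requires_grad = False

head = MultilabelHead(in_dim=backbone.config.hidden_size, num_labels=31)

pixel_values = torch.rand(2, 3, 224, 224)                      # dummy batch of images
features = backbone(pixel_values=pixel_values).pooler_output   # pooled CLS features
logits = head(features)                                        # one logit per class; a sigmoid gives probabilities
```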
 
+ The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).

+ - **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)

+ ---
+
+ # Intended uses & limitations
+ You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes and seagrass species.
+
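The card ships no usage snippet, so here is a minimal inference sketch. The hub id below is an assumption (the card's author namespace plus the model name above), and depending on how the checkpoint was exported you may need `trust_remote_code=True` or the loading utilities from the training repository.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumed hub id: author namespace + the model name from this card.
model_id = "lombardata/DinoVdeau-large-2024_09_05-batch-size32_epochs150_freeze"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)  # may require trust_remote_code=True
model.eval()

image = Image.open("underwater_photo.jpg").convert("RGB")          # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel prediction: an independent sigmoid per class, thresholded at 0.5.
probabilities = torch.sigmoid(logits)[0]
predicted_labels = [model.config.id2label[i] for i, p in enumerate(probabilities) if p > 0.5]
print(predicted_labels)
```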
+ ---

+ # Training and evaluation data
+ Details on the number of images for each class are given in the following table:
+ | Class | train | val | test | Total |
+ |:-------------------------|--------:|------:|-------:|--------:|
+ | Acropore_branched | 1469 | 464 | 475 | 2408 |
+ | Acropore_digitised | 568 | 160 | 160 | 888 |
+ | Acropore_sub_massive | 150 | 50 | 43 | 243 |
+ | Acropore_tabular | 999 | 297 | 293 | 1589 |
+ | Algae_assembly | 2546 | 847 | 845 | 4238 |
+ | Algae_drawn_up | 367 | 126 | 127 | 620 |
+ | Algae_limestone | 1652 | 557 | 563 | 2772 |
+ | Algae_sodding | 3148 | 984 | 985 | 5117 |
+ | Atra/Leucospilota | 1084 | 348 | 360 | 1792 |
+ | Bleached_coral | 219 | 71 | 70 | 360 |
+ | Blurred | 191 | 67 | 62 | 320 |
+ | Dead_coral | 1979 | 642 | 643 | 3264 |
+ | Fish | 2018 | 656 | 647 | 3321 |
+ | Homo_sapiens | 161 | 62 | 59 | 282 |
+ | Human_object | 157 | 58 | 55 | 270 |
+ | Living_coral | 406 | 154 | 141 | 701 |
+ | Millepore | 385 | 127 | 125 | 637 |
+ | No_acropore_encrusting | 441 | 130 | 154 | 725 |
+ | No_acropore_foliaceous | 204 | 36 | 46 | 286 |
+ | No_acropore_massive | 1031 | 336 | 338 | 1705 |
+ | No_acropore_solitary | 202 | 53 | 48 | 303 |
+ | No_acropore_sub_massive | 1401 | 433 | 422 | 2256 |
+ | Rock | 4489 | 1495 | 1473 | 7457 |
+ | Rubble | 3092 | 1030 | 1001 | 5123 |
+ | Sand | 5842 | 1939 | 1938 | 9719 |
+ | Sea_cucumber | 1408 | 439 | 447 | 2294 |
+ | Sea_urchins | 327 | 107 | 111 | 545 |
+ | Sponge | 269 | 96 | 105 | 470 |
+ | Syringodium_isoetifolium | 1212 | 392 | 391 | 1995 |
+ | Thalassodendron_ciliatum | 782 | 261 | 260 | 1303 |
+ | Useless | 579 | 193 | 193 | 965 |
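The label counts above are heavily imbalanced (compare Sand with Acropore_sub_massive). The card does not say whether any class weighting was applied during training; if you retrain on similar data, counts like these can seed per-class positive weights for a multilabel loss, as in the sketch below. The total image count is a made-up placeholder and the weighting recipe is just one common choice, not the repository's method.

```python
import torch

# Train-split positive counts for a few classes, copied from the table above (illustrative subset).
train_counts = {"Acropore_sub_massive": 150, "Acropore_branched": 1469, "Rock": 4489, "Sand": 5842}
num_train_images = 10000  # hypothetical total; not derivable from per-class label counts alone

# Common recipe: pos_weight = negatives / positives, so rarer classes weigh more in the loss.
pos_weight = torch.tensor(
    [(num_train_images - count) / count for count in train_counts.values()]
)
criterion = torch.nn.BCEWithLogitsLoss(pos_weight=pos_weight)
```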
 
+ ---

+ # Training procedure

+ ## Training hyperparameters

  The following hyperparameters were used during training:
+
+ - **Number of Epochs**: 150
+ - **Learning Rate**: 0.001
+ - **Train Batch Size**: 32
+ - **Eval Batch Size**: 32
+ - **Optimizer**: Adam
+ - **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
+ - **Freeze Encoder**: Yes
+ - **Data Augmentation**: Yes
+
+
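A minimal sketch of how these settings wire together in PyTorch (the stand-in model and the placeholder validation loss are hypothetical; the repository's actual training loop may differ):

```python
import torch
import torch.nn as nn

model = nn.Linear(16, 31)   # stand-in for the frozen-backbone classifier
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)   # Adam, learning rate 0.001
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5            # settings listed above
)

for epoch in range(150):
    # ... one epoch of training and validation would run here ...
    val_loss = torch.rand(1).item()   # placeholder for the real validation loss
    scheduler.step(val_loss)          # multiplies the LR by 0.1 after 5 epochs without improvement
```

This behaviour is visible in the learning-rate column of the results table below: 0.001 until epoch 22, 0.0001 from epoch 23, 1e-05 from epoch 75 and 1e-06 from epoch 89.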
+ ## Data Augmentation
+ Data were augmented using the following transformations:
+
+ Train Transforms
+ - **PreProcess**: No additional parameters
+ - **Resize**: probability=1.00
+ - **RandomHorizontalFlip**: probability=0.25
+ - **RandomVerticalFlip**: probability=0.25
+ - **ColorJiggle**: probability=0.25
+ - **RandomPerspective**: probability=0.25
+ - **Normalize**: probability=1.00
+
+ Val Transforms
+ - **PreProcess**: No additional parameters
+ - **Resize**: probability=1.00
+ - **Normalize**: probability=1.00
+
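The transform names match Kornia's augmentation API, so the pipeline plausibly looks like the sketch below. The image size, colour-jitter strengths and normalization statistics are not stated on the card and are assumptions here; the repository-specific `PreProcess` step is omitted.

```python
import torch
import kornia.augmentation as K

IMAGENET_MEAN = torch.tensor([0.485, 0.456, 0.406])   # assumed normalization statistics
IMAGENET_STD = torch.tensor([0.229, 0.224, 0.225])

train_transforms = K.AugmentationSequential(
    K.Resize((224, 224)),                               # assumed target size
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(0.1, 0.1, 0.1, 0.1, p=0.25),          # assumed jitter strengths
    K.RandomPerspective(distortion_scale=0.5, p=0.25),  # assumed distortion scale
    K.Normalize(mean=IMAGENET_MEAN, std=IMAGENET_STD),
)

val_transforms = K.AugmentationSequential(
    K.Resize((224, 224)),
    K.Normalize(mean=IMAGENET_MEAN, std=IMAGENET_STD),
)

images = torch.rand(8, 3, 256, 256)    # dummy batch of RGB images in [0, 1]
augmented = train_transforms(images)
```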
+
+
+ ## Training results
+ Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate
+ --- | --- | --- | --- | --- | ---
+ 1 | 0.16899551451206207 | 0.22314622314622315 | 0.7516596896274684 | 0.5430112866470752 | 0.001
+ 2 | 0.153842031955719 | 0.24012474012474014 | 0.765669700910273 | 0.5721428312627432 | 0.001
+ 3 | 0.14828726649284363 | 0.23458073458073458 | 0.7772688719253604 | 0.6137585525531024 | 0.001
+ 4 | 0.1479637324810028 | 0.2494802494802495 | 0.7722737615963591 | 0.6224730910908008 | 0.001
+ 5 | 0.14575305581092834 | 0.2494802494802495 | 0.779738930569409 | 0.6302307709949958 | 0.001
+ 6 | 0.14499613642692566 | 0.2480942480942481 | 0.7798061948433986 | 0.6092591780781843 | 0.001
+ 7 | 0.1474585235118866 | 0.2525987525987526 | 0.7767369242779079 | 0.624806622732382 | 0.001
+ 8 | 0.14568069577217102 | 0.25744975744975745 | 0.7803859753759638 | 0.6249401475720361 | 0.001
+ 9 | 0.14169421792030334 | 0.25744975744975745 | 0.7868685150535805 | 0.652642904607388 | 0.001
+ 10 | 0.1436299830675125 | 0.25467775467775466 | 0.7757335098168984 | 0.6289931868767601 | 0.001
+ 11 | 0.1428152322769165 | 0.26403326403326405 | 0.7886988341417751 | 0.6447870111639475 | 0.001
+ 12 | 0.1438700556755066 | 0.25814275814275817 | 0.7904845227679873 | 0.6493205009564239 | 0.001
+ 13 | 0.13913600146770477 | 0.2713097713097713 | 0.7906956746065871 | 0.6561811626743236 | 0.001
+ 14 | 0.14094506204128265 | 0.2643797643797644 | 0.783810807286006 | 0.6337626365639194 | 0.001
+ 15 | 0.1396123319864273 | 0.2577962577962578 | 0.7907172995780591 | 0.6463067634895379 | 0.001
+ 16 | 0.13904806971549988 | 0.2654192654192654 | 0.7913274487959551 | 0.6593840515969085 | 0.001
+ 17 | 0.1418265849351883 | 0.2564102564102564 | 0.7939832128313804 | 0.6585824628325464 | 0.001
+ 18 | 0.14155420660972595 | 0.26576576576576577 | 0.7957187827911858 | 0.6560187518750095 | 0.001
+ 19 | 0.14027266204357147 | 0.262993762993763 | 0.7885625699767461 | 0.6524018082903621 | 0.001
+ 20 | 0.14759798347949982 | 0.26126126126126126 | 0.7910696719558615 | 0.6558190248610255 | 0.001
+ 21 | 0.14285211265087128 | 0.26576576576576577 | 0.7879767016708474 | 0.6397027546064713 | 0.001
+ 22 | 0.141402930021286 | 0.26126126126126126 | 0.7936799099512236 | 0.650810186340724 | 0.001
+ 23 | 0.1415141373872757 | 0.26853776853776856 | 0.7975794766896787 | 0.6618136826297922 | 0.0001
+ 24 | 0.13230843842029572 | 0.27893277893277896 | 0.8044778018063861 | 0.6750686264509598 | 0.0001
+ 25 | 0.13101588189601898 | 0.27927927927927926 | 0.8044072500946213 | 0.6724022117445357 | 0.0001
+ 26 | 0.13268393278121948 | 0.28205128205128205 | 0.8035965398218775 | 0.6689442300740391 | 0.0001
+ 27 | 0.1317097693681717 | 0.2817047817047817 | 0.8068647969861867 | 0.679681812643572 | 0.0001
+ 28 | 0.12880520522594452 | 0.27754677754677753 | 0.8072126727334008 | 0.6818462300001074 | 0.0001
+ 29 | 0.12942521274089813 | 0.2844767844767845 | 0.8038088702067427 | 0.6807929806344717 | 0.0001
+ 30 | 0.12943296134471893 | 0.28586278586278585 | 0.8077149835761811 | 0.6825529208005033 | 0.0001
+ 31 | 0.12738928198814392 | 0.28794178794178793 | 0.8073808915025994 | 0.6779122940127521 | 0.0001
+ 32 | 0.12775012850761414 | 0.2882882882882883 | 0.8104185890445432 | 0.6868638344898197 | 0.0001
+ 33 | 0.12765593826770782 | 0.2869022869022869 | 0.8077248140635565 | 0.6810807224403135 | 0.0001
+ 34 | 0.12660712003707886 | 0.2882882882882883 | 0.8108837797932926 | 0.687361527737602 | 0.0001
+ 35 | 0.1262102574110031 | 0.29036729036729036 | 0.8103963941193815 | 0.688483181989703 | 0.0001
+ 36 | 0.12687553465366364 | 0.28274428274428276 | 0.8070400273399119 | 0.6876394944988364 | 0.0001
+ 37 | 0.12656189501285553 | 0.28655578655578656 | 0.8081597960050999 | 0.6833930255395054 | 0.0001
+ 38 | 0.12547720968723297 | 0.2955647955647956 | 0.8106371284826448 | 0.6936175483283518 | 0.0001
+ 39 | 0.12485096603631973 | 0.2927927927927928 | 0.8141880626875626 | 0.6985657340894045 | 0.0001
+ 40 | 0.1257668137550354 | 0.2934857934857935 | 0.8138017044273539 | 0.6989554260935754 | 0.0001
+ 41 | 0.12528541684150696 | 0.29244629244629244 | 0.8101351925856646 | 0.6923923602014324 | 0.0001
+ 42 | 0.12443084269762039 | 0.3004158004158004 | 0.8138018093835474 | 0.6970236383039276 | 0.0001
+ 43 | 0.12451612949371338 | 0.2948717948717949 | 0.8131470414948238 | 0.6956334056896907 | 0.0001
+ 44 | 0.12501148879528046 | 0.2966042966042966 | 0.812950847173293 | 0.6915470420512126 | 0.0001
+ 45 | 0.12397606670856476 | 0.29625779625779625 | 0.8136846971798428 | 0.7050548840380568 | 0.0001
+ 46 | 0.12409698963165283 | 0.29764379764379767 | 0.8130628734954971 | 0.6987723620069867 | 0.0001
+ 47 | 0.12429661303758621 | 0.2955647955647956 | 0.811911298838437 | 0.6957628076563835 | 0.0001
+ 48 | 0.12393072247505188 | 0.2955647955647956 | 0.8135280295401142 | 0.6990296569974817 | 0.0001
+ 49 | 0.1242954283952713 | 0.29972279972279975 | 0.8152993625265614 | 0.7007060102949784 | 0.0001
+ 50 | 0.12405084818601608 | 0.29799029799029797 | 0.8151919866444074 | 0.6999734070385492 | 0.0001
+ 51 | 0.12483017891645432 | 0.3011088011088011 | 0.8153039745759215 | 0.7055935576453343 | 0.0001
+ 52 | 0.12426182627677917 | 0.3049203049203049 | 0.8157241959217996 | 0.7035566403965832 | 0.0001
+ 53 | 0.12408608943223953 | 0.30214830214830213 | 0.8152648882600192 | 0.7031528349086803 | 0.0001
+ 54 | 0.12344320118427277 | 0.30214830214830213 | 0.8152251458307105 | 0.7067666695453366 | 0.0001
+ 55 | 0.12307523190975189 | 0.30180180180180183 | 0.8166332665330662 | 0.7075536762185066 | 0.0001
+ 56 | 0.12282071262598038 | 0.30665280665280664 | 0.8189626693095475 | 0.7087921855865761 | 0.0001
+ 57 | 0.12259934842586517 | 0.306999306999307 | 0.8160328019748128 | 0.7079839879234633 | 0.0001
+ 58 | 0.12334763258695602 | 0.30214830214830213 | 0.8170145133631687 | 0.7072503847729165 | 0.0001
+ 59 | 0.12272054702043533 | 0.30214830214830213 | 0.8172105834237543 | 0.713532815646164 | 0.0001
+ 60 | 0.12334387749433517 | 0.30214830214830213 | 0.8142579609764339 | 0.7039801220819605 | 0.0001
+ 61 | 0.12339764833450317 | 0.3042273042273042 | 0.816814564846061 | 0.7120578542808926 | 0.0001
+ 62 | 0.12234435975551605 | 0.3049203049203049 | 0.8169309505831026 | 0.7124854785684515 | 0.0001
+ 63 | 0.12311259657144547 | 0.30353430353430355 | 0.8151443922095366 | 0.709030237195192 | 0.0001
+ 64 | 0.12282687425613403 | 0.30665280665280664 | 0.8183222681531587 | 0.7114197657112039 | 0.0001
+ 65 | 0.12305620312690735 | 0.30353430353430355 | 0.8185065204751224 | 0.715610525327271 | 0.0001
+ 66 | 0.12252139300107956 | 0.30214830214830213 | 0.8193021036471515 | 0.7083957677770276 | 0.0001
+ 67 | 0.12215397506952286 | 0.3031878031878032 | 0.8185542268382505 | 0.713563304331985 | 0.0001
+ 68 | 0.12200037389993668 | 0.3090783090783091 | 0.8201218248870841 | 0.7169216330412181 | 0.0001
+ 69 | 0.12282921373844147 | 0.30180180180180183 | 0.8171493231633209 | 0.7165157275423649 | 0.0001
+ 70 | 0.12265007942914963 | 0.3042273042273042 | 0.8176893032631977 | 0.7130922408537738 | 0.0001
+ 71 | 0.12318737804889679 | 0.29799029799029797 | 0.8155257705805251 | 0.7123118599173115 | 0.0001
+ 72 | 0.12224896252155304 | 0.30561330561330563 | 0.8177146438270315 | 0.7181217472368024 | 0.0001
+ 73 | 0.12214501202106476 | 0.3076923076923077 | 0.8161570403926011 | 0.7046690012290543 | 0.0001
+ 74 | 0.12297073751688004 | 0.2972972972972973 | 0.8147835269271382 | 0.7070482653980339 | 0.0001
+ 75 | 0.12141965329647064 | 0.3049203049203049 | 0.8175831550689987 | 0.7123584497861349 | 1e-05
+ 76 | 0.12091591954231262 | 0.30665280665280664 | 0.8212704324436167 | 0.7265282519195887 | 1e-05
+ 77 | 0.12162773311138153 | 0.30734580734580735 | 0.8221009885557243 | 0.7249141687532618 | 1e-05
+ 78 | 0.12114103883504868 | 0.30561330561330563 | 0.821013443640124 | 0.7232913822219021 | 1e-05
+ 79 | 0.1210767850279808 | 0.30561330561330563 | 0.8181284095677717 | 0.7157592534107864 | 1e-05
+ 80 | 0.12099559605121613 | 0.3090783090783091 | 0.8200463116109824 | 0.7196736600383237 | 1e-05
+ 81 | 0.12053155153989792 | 0.31046431046431044 | 0.8189727287937092 | 0.7194056763702963 | 1e-05
+ 82 | 0.12050338089466095 | 0.306999306999307 | 0.8186875235267054 | 0.7212694332008583 | 1e-05
+ 83 | 0.12153622508049011 | 0.3049203049203049 | 0.817129142279675 | 0.7136069207682542 | 1e-05
+ 84 | 0.12091034650802612 | 0.3115038115038115 | 0.8212135055442501 | 0.72263281374496 | 1e-05
+ 85 | 0.12058679759502411 | 0.30942480942480943 | 0.8212908842183808 | 0.7219026145386024 | 1e-05
+ 86 | 0.1210218220949173 | 0.30838530838530837 | 0.8206727371003285 | 0.7255503995321377 | 1e-05
+ 87 | 0.12097787857055664 | 0.30734580734580735 | 0.81919187715867 | 0.7163464112504625 | 1e-05
+ 88 | 0.12078534066677094 | 0.30942480942480943 | 0.8219223445649475 | 0.7179611359738045 | 1e-05
+ 89 | 0.1213160827755928 | 0.3125433125433125 | 0.8235824319895118 | 0.7293063087262872 | 1.0000000000000002e-06
+ 90 | 0.12110408395528793 | 0.3108108108108108 | 0.8228019165403988 | 0.7249894355418997 | 1.0000000000000002e-06
+ 91 | 0.1205781027674675 | 0.31046431046431044 | 0.8191074795725959 | 0.7187027508297176 | 1.0000000000000002e-06
+ 92 | 0.12076584249734879 | 0.31046431046431044 | 0.8196009683612989 | 0.7150284118631205 | 1.0000000000000002e-06
+
+
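The card does not state how "Accuracy" is computed; for multilabel tasks it is usually the strict subset accuracy (every label of an image must be predicted correctly), which would explain why it stays near 0.31 while F1 Micro is above 0.82. A small scikit-learn sketch of the four reported metrics on toy data:

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

# Toy multilabel ground truth and sigmoid scores, shape (n_samples, n_classes).
y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0]])
y_score = np.array([[0.9, 0.2, 0.7], [0.1, 0.8, 0.4], [0.6, 0.3, 0.2]])
y_pred = (y_score > 0.5).astype(int)

f1_micro = f1_score(y_true, y_pred, average="micro")
f1_macro = f1_score(y_true, y_pred, average="macro")
roc_auc = roc_auc_score(y_true, y_score, average="micro")
subset_accuracy = accuracy_score(y_true, y_pred)   # all labels of a sample must match, hence much stricter
print(f1_micro, f1_macro, roc_auc, subset_accuracy)
```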
+ ---
+
+ # CO2 Emissions
+
+ The estimated CO2 emissions for training this model are documented below:
+
+ - **Emissions**: 1.414263264227963 grams of CO2
+ - **Source**: Code Carbon
+ - **Training Type**: fine-tuning
+ - **Geographical Location**: Brest, France
+ - **Hardware Used**: NVIDIA Tesla V100 PCIe 32 GB
+
+
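Code Carbon exposes a simple tracker API; the sketch below shows how a figure like this is typically measured (the actual integration in the training code may differ).

```python
from codecarbon import EmissionsTracker

tracker = EmissionsTracker()    # estimates energy use of the local hardware (here a Tesla V100)
tracker.start()

# ... the training loop would run here ...

emissions_kg = tracker.stop()   # estimated emissions in kilograms of CO2-equivalent
print(f"{emissions_kg * 1000:.3f} g CO2eq")
```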
+ ---
+
+ # Framework Versions
+
+ - **Transformers**: 4.41.1
+ - **Pytorch**: 2.3.0+cu121
+ - **Datasets**: 2.19.1
+ - **Tokenizers**: 0.19.1
+