---
language:
- en
license: wtfpl
tags:
- multilabel-image-classification
- multilabel
- generated_from_trainer
base_model: facebook/dinov2-large
model-index:
- name: DinoVdeau-large-2024_09_05-batch-size32_epochs150_freeze
results: []
---
DinoVd'eau is a fine-tuned version of [facebook/dinov2-large](https://huggingface.co/facebook/dinov2-large). It achieves the following results on the test set:
- Loss: 0.1209
- F1 Micro: 0.8228
- F1 Macro: 0.7175
- ROC AUC: 0.8813
- Accuracy: 0.3111
---
# Model description
DinoVd'eau is a model built on top of the DINOv2 model for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
The source code for training the model can be found in this [Git repository](https://github.com/SeatizenDOI/DinoVdeau).
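As a rough illustration, a head of that shape might look like the following PyTorch sketch. The hidden width, dropout rate, and layer count here are assumptions, not the released checkpoint's values; the linked repository holds the real definition:

```python
import torch.nn as nn

# Illustrative head only: the layer order matches the description above,
# but the sizes and dropout rate are assumptions.
hidden_size = 1024  # dinov2-large embedding width
num_classes = 31    # one logit per class in the table below

classification_head = nn.Sequential(
    nn.Linear(hidden_size, hidden_size),
    nn.ReLU(),
    nn.BatchNorm1d(hidden_size),
    nn.Dropout(p=0.5),
    nn.Linear(hidden_size, num_classes),  # raw logits, one per label
)
```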
- **Developed by:** [lombardata](https://huggingface.co/lombardata), credits to [César Leblanc](https://huggingface.co/CesarLeblanc) and [Victor Illien](https://huggingface.co/groderg)
---
# Intended uses & limitations
You can use the raw model to classify diverse marine imagery, covering coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
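A minimal inference sketch follows. It assumes the checkpoint loads through the standard transformers image-classification API and that a 0.5 sigmoid threshold per label is appropriate; both are assumptions rather than documented choices, and the image path is hypothetical:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "lombardata/DinoVdeau-large-2024_09_05-batch-size32_epochs150_freeze"

processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)
model.eval()

image = Image.open("underwater_photo.jpg").convert("RGB")  # hypothetical input
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel: score each class independently with a sigmoid,
# then keep every label above the (assumed) 0.5 threshold.
probs = torch.sigmoid(logits)[0]
labels = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(labels)
```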
---
# Training and evaluation data
Details on the number of images for each class are given in the following table:
| Class | train | val | test | Total |
|:-------------------------|--------:|------:|-------:|--------:|
| Acropore_branched | 1469 | 464 | 475 | 2408 |
| Acropore_digitised | 568 | 160 | 160 | 888 |
| Acropore_sub_massive | 150 | 50 | 43 | 243 |
| Acropore_tabular | 999 | 297 | 293 | 1589 |
| Algae_assembly | 2546 | 847 | 845 | 4238 |
| Algae_drawn_up | 367 | 126 | 127 | 620 |
| Algae_limestone | 1652 | 557 | 563 | 2772 |
| Algae_sodding | 3148 | 984 | 985 | 5117 |
| Atra/Leucospilota | 1084 | 348 | 360 | 1792 |
| Bleached_coral | 219 | 71 | 70 | 360 |
| Blurred | 191 | 67 | 62 | 320 |
| Dead_coral | 1979 | 642 | 643 | 3264 |
| Fish | 2018 | 656 | 647 | 3321 |
| Homo_sapiens | 161 | 62 | 59 | 282 |
| Human_object | 157 | 58 | 55 | 270 |
| Living_coral | 406 | 154 | 141 | 701 |
| Millepore | 385 | 127 | 125 | 637 |
| No_acropore_encrusting | 441 | 130 | 154 | 725 |
| No_acropore_foliaceous | 204 | 36 | 46 | 286 |
| No_acropore_massive | 1031 | 336 | 338 | 1705 |
| No_acropore_solitary | 202 | 53 | 48 | 303 |
| No_acropore_sub_massive | 1401 | 433 | 422 | 2256 |
| Rock | 4489 | 1495 | 1473 | 7457 |
| Rubble | 3092 | 1030 | 1001 | 5123 |
| Sand | 5842 | 1939 | 1938 | 9719 |
| Sea_cucumber | 1408 | 439 | 447 | 2294 |
| Sea_urchins | 327 | 107 | 111 | 545 |
| Sponge | 269 | 96 | 105 | 470 |
| Syringodium_isoetifolium | 1212 | 392 | 391 | 1995 |
| Thalassodendron_ciliatum | 782 | 261 | 260 | 1303 |
| Useless | 579 | 193 | 193 | 965 |
---
# Training procedure
## Training hyperparameters
The following hyperparameters were used during training (a PyTorch sketch of this configuration follows the list):
- **Number of Epochs**: 150
- **Learning Rate**: 0.001
- **Train Batch Size**: 32
- **Eval Batch Size**: 32
- **Optimizer**: Adam
- **LR Scheduler Type**: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
- **Freeze Encoder**: Yes
- **Data Augmentation**: Yes
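A minimal sketch of this optimizer and scheduler wiring is below; the model and validation loop are stand-ins, not the project's training code:

```python
import torch
import torch.nn as nn

# Placeholder network: with the encoder frozen, only head parameters train.
model = nn.Linear(1024, 31)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5
)

for epoch in range(150):
    # ... train and evaluate here ...
    val_loss = torch.rand(1).item()  # placeholder for the real validation loss
    scheduler.step(val_loss)  # divides the LR by 10 after 5 stale epochs
```

This matches the learning-rate column of the results table below, where the rate steps down from 0.001 to 0.0001 and onward as the validation loss plateaus.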
## Data Augmentation
Data were augmented using the following transformations (a Kornia sketch of the train pipeline follows the lists):
**Train Transforms**
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **RandomHorizontalFlip**: probability=0.25
- **RandomVerticalFlip**: probability=0.25
- **ColorJiggle**: probability=0.25
- **RandomPerspective**: probability=0.25
- **Normalize**: probability=1.00
**Val Transforms**
- **PreProcess**: No additional parameters
- **Resize**: probability=1.00
- **Normalize**: probability=1.00
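The transform names above match Kornia's augmentation API, so a train pipeline of this shape could look like the sketch below. The image size, color-jiggle magnitudes, and normalization statistics are assumptions, not values taken from the training code:

```python
import torch
import kornia.augmentation as K

train_transforms = K.AugmentationSequential(
    K.Resize((224, 224)),                       # assumed input size
    K.RandomHorizontalFlip(p=0.25),
    K.RandomVerticalFlip(p=0.25),
    K.ColorJiggle(0.1, 0.1, 0.1, 0.1, p=0.25),  # assumed jitter magnitudes
    K.RandomPerspective(p=0.25),
    K.Normalize(mean=torch.tensor([0.485, 0.456, 0.406]),  # assumed ImageNet stats
                std=torch.tensor([0.229, 0.224, 0.225])),
)

batch = torch.rand(8, 3, 256, 256)  # dummy image batch in [0, 1]
augmented = train_transforms(batch)
```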
## Training results
Epoch | Validation Loss | Accuracy | F1 Micro | F1 Macro | Learning Rate
--- | --- | --- | --- | --- | ---
1 | 0.16899551451206207 | 0.22314622314622315 | 0.7516596896274684 | 0.5430112866470752 | 0.001
2 | 0.153842031955719 | 0.24012474012474014 | 0.765669700910273 | 0.5721428312627432 | 0.001
3 | 0.14828726649284363 | 0.23458073458073458 | 0.7772688719253604 | 0.6137585525531024 | 0.001
4 | 0.1479637324810028 | 0.2494802494802495 | 0.7722737615963591 | 0.6224730910908008 | 0.001
5 | 0.14575305581092834 | 0.2494802494802495 | 0.779738930569409 | 0.6302307709949958 | 0.001
6 | 0.14499613642692566 | 0.2480942480942481 | 0.7798061948433986 | 0.6092591780781843 | 0.001
7 | 0.1474585235118866 | 0.2525987525987526 | 0.7767369242779079 | 0.624806622732382 | 0.001
8 | 0.14568069577217102 | 0.25744975744975745 | 0.7803859753759638 | 0.6249401475720361 | 0.001
9 | 0.14169421792030334 | 0.25744975744975745 | 0.7868685150535805 | 0.652642904607388 | 0.001
10 | 0.1436299830675125 | 0.25467775467775466 | 0.7757335098168984 | 0.6289931868767601 | 0.001
11 | 0.1428152322769165 | 0.26403326403326405 | 0.7886988341417751 | 0.6447870111639475 | 0.001
12 | 0.1438700556755066 | 0.25814275814275817 | 0.7904845227679873 | 0.6493205009564239 | 0.001
13 | 0.13913600146770477 | 0.2713097713097713 | 0.7906956746065871 | 0.6561811626743236 | 0.001
14 | 0.14094506204128265 | 0.2643797643797644 | 0.783810807286006 | 0.6337626365639194 | 0.001
15 | 0.1396123319864273 | 0.2577962577962578 | 0.7907172995780591 | 0.6463067634895379 | 0.001
16 | 0.13904806971549988 | 0.2654192654192654 | 0.7913274487959551 | 0.6593840515969085 | 0.001
17 | 0.1418265849351883 | 0.2564102564102564 | 0.7939832128313804 | 0.6585824628325464 | 0.001
18 | 0.14155420660972595 | 0.26576576576576577 | 0.7957187827911858 | 0.6560187518750095 | 0.001
19 | 0.14027266204357147 | 0.262993762993763 | 0.7885625699767461 | 0.6524018082903621 | 0.001
20 | 0.14759798347949982 | 0.26126126126126126 | 0.7910696719558615 | 0.6558190248610255 | 0.001
21 | 0.14285211265087128 | 0.26576576576576577 | 0.7879767016708474 | 0.6397027546064713 | 0.001
22 | 0.141402930021286 | 0.26126126126126126 | 0.7936799099512236 | 0.650810186340724 | 0.001
23 | 0.1415141373872757 | 0.26853776853776856 | 0.7975794766896787 | 0.6618136826297922 | 0.0001
24 | 0.13230843842029572 | 0.27893277893277896 | 0.8044778018063861 | 0.6750686264509598 | 0.0001
25 | 0.13101588189601898 | 0.27927927927927926 | 0.8044072500946213 | 0.6724022117445357 | 0.0001
26 | 0.13268393278121948 | 0.28205128205128205 | 0.8035965398218775 | 0.6689442300740391 | 0.0001
27 | 0.1317097693681717 | 0.2817047817047817 | 0.8068647969861867 | 0.679681812643572 | 0.0001
28 | 0.12880520522594452 | 0.27754677754677753 | 0.8072126727334008 | 0.6818462300001074 | 0.0001
29 | 0.12942521274089813 | 0.2844767844767845 | 0.8038088702067427 | 0.6807929806344717 | 0.0001
30 | 0.12943296134471893 | 0.28586278586278585 | 0.8077149835761811 | 0.6825529208005033 | 0.0001
31 | 0.12738928198814392 | 0.28794178794178793 | 0.8073808915025994 | 0.6779122940127521 | 0.0001
32 | 0.12775012850761414 | 0.2882882882882883 | 0.8104185890445432 | 0.6868638344898197 | 0.0001
33 | 0.12765593826770782 | 0.2869022869022869 | 0.8077248140635565 | 0.6810807224403135 | 0.0001
34 | 0.12660712003707886 | 0.2882882882882883 | 0.8108837797932926 | 0.687361527737602 | 0.0001
35 | 0.1262102574110031 | 0.29036729036729036 | 0.8103963941193815 | 0.688483181989703 | 0.0001
36 | 0.12687553465366364 | 0.28274428274428276 | 0.8070400273399119 | 0.6876394944988364 | 0.0001
37 | 0.12656189501285553 | 0.28655578655578656 | 0.8081597960050999 | 0.6833930255395054 | 0.0001
38 | 0.12547720968723297 | 0.2955647955647956 | 0.8106371284826448 | 0.6936175483283518 | 0.0001
39 | 0.12485096603631973 | 0.2927927927927928 | 0.8141880626875626 | 0.6985657340894045 | 0.0001
40 | 0.1257668137550354 | 0.2934857934857935 | 0.8138017044273539 | 0.6989554260935754 | 0.0001
41 | 0.12528541684150696 | 0.29244629244629244 | 0.8101351925856646 | 0.6923923602014324 | 0.0001
42 | 0.12443084269762039 | 0.3004158004158004 | 0.8138018093835474 | 0.6970236383039276 | 0.0001
43 | 0.12451612949371338 | 0.2948717948717949 | 0.8131470414948238 | 0.6956334056896907 | 0.0001
44 | 0.12501148879528046 | 0.2966042966042966 | 0.812950847173293 | 0.6915470420512126 | 0.0001
45 | 0.12397606670856476 | 0.29625779625779625 | 0.8136846971798428 | 0.7050548840380568 | 0.0001
46 | 0.12409698963165283 | 0.29764379764379767 | 0.8130628734954971 | 0.6987723620069867 | 0.0001
47 | 0.12429661303758621 | 0.2955647955647956 | 0.811911298838437 | 0.6957628076563835 | 0.0001
48 | 0.12393072247505188 | 0.2955647955647956 | 0.8135280295401142 | 0.6990296569974817 | 0.0001
49 | 0.1242954283952713 | 0.29972279972279975 | 0.8152993625265614 | 0.7007060102949784 | 0.0001
50 | 0.12405084818601608 | 0.29799029799029797 | 0.8151919866444074 | 0.6999734070385492 | 0.0001
51 | 0.12483017891645432 | 0.3011088011088011 | 0.8153039745759215 | 0.7055935576453343 | 0.0001
52 | 0.12426182627677917 | 0.3049203049203049 | 0.8157241959217996 | 0.7035566403965832 | 0.0001
53 | 0.12408608943223953 | 0.30214830214830213 | 0.8152648882600192 | 0.7031528349086803 | 0.0001
54 | 0.12344320118427277 | 0.30214830214830213 | 0.8152251458307105 | 0.7067666695453366 | 0.0001
55 | 0.12307523190975189 | 0.30180180180180183 | 0.8166332665330662 | 0.7075536762185066 | 0.0001
56 | 0.12282071262598038 | 0.30665280665280664 | 0.8189626693095475 | 0.7087921855865761 | 0.0001
57 | 0.12259934842586517 | 0.306999306999307 | 0.8160328019748128 | 0.7079839879234633 | 0.0001
58 | 0.12334763258695602 | 0.30214830214830213 | 0.8170145133631687 | 0.7072503847729165 | 0.0001
59 | 0.12272054702043533 | 0.30214830214830213 | 0.8172105834237543 | 0.713532815646164 | 0.0001
60 | 0.12334387749433517 | 0.30214830214830213 | 0.8142579609764339 | 0.7039801220819605 | 0.0001
61 | 0.12339764833450317 | 0.3042273042273042 | 0.816814564846061 | 0.7120578542808926 | 0.0001
62 | 0.12234435975551605 | 0.3049203049203049 | 0.8169309505831026 | 0.7124854785684515 | 0.0001
63 | 0.12311259657144547 | 0.30353430353430355 | 0.8151443922095366 | 0.709030237195192 | 0.0001
64 | 0.12282687425613403 | 0.30665280665280664 | 0.8183222681531587 | 0.7114197657112039 | 0.0001
65 | 0.12305620312690735 | 0.30353430353430355 | 0.8185065204751224 | 0.715610525327271 | 0.0001
66 | 0.12252139300107956 | 0.30214830214830213 | 0.8193021036471515 | 0.7083957677770276 | 0.0001
67 | 0.12215397506952286 | 0.3031878031878032 | 0.8185542268382505 | 0.713563304331985 | 0.0001
68 | 0.12200037389993668 | 0.3090783090783091 | 0.8201218248870841 | 0.7169216330412181 | 0.0001
69 | 0.12282921373844147 | 0.30180180180180183 | 0.8171493231633209 | 0.7165157275423649 | 0.0001
70 | 0.12265007942914963 | 0.3042273042273042 | 0.8176893032631977 | 0.7130922408537738 | 0.0001
71 | 0.12318737804889679 | 0.29799029799029797 | 0.8155257705805251 | 0.7123118599173115 | 0.0001
72 | 0.12224896252155304 | 0.30561330561330563 | 0.8177146438270315 | 0.7181217472368024 | 0.0001
73 | 0.12214501202106476 | 0.3076923076923077 | 0.8161570403926011 | 0.7046690012290543 | 0.0001
74 | 0.12297073751688004 | 0.2972972972972973 | 0.8147835269271382 | 0.7070482653980339 | 0.0001
75 | 0.12141965329647064 | 0.3049203049203049 | 0.8175831550689987 | 0.7123584497861349 | 1e-05
76 | 0.12091591954231262 | 0.30665280665280664 | 0.8212704324436167 | 0.7265282519195887 | 1e-05
77 | 0.12162773311138153 | 0.30734580734580735 | 0.8221009885557243 | 0.7249141687532618 | 1e-05
78 | 0.12114103883504868 | 0.30561330561330563 | 0.821013443640124 | 0.7232913822219021 | 1e-05
79 | 0.1210767850279808 | 0.30561330561330563 | 0.8181284095677717 | 0.7157592534107864 | 1e-05
80 | 0.12099559605121613 | 0.3090783090783091 | 0.8200463116109824 | 0.7196736600383237 | 1e-05
81 | 0.12053155153989792 | 0.31046431046431044 | 0.8189727287937092 | 0.7194056763702963 | 1e-05
82 | 0.12050338089466095 | 0.306999306999307 | 0.8186875235267054 | 0.7212694332008583 | 1e-05
83 | 0.12153622508049011 | 0.3049203049203049 | 0.817129142279675 | 0.7136069207682542 | 1e-05
84 | 0.12091034650802612 | 0.3115038115038115 | 0.8212135055442501 | 0.72263281374496 | 1e-05
85 | 0.12058679759502411 | 0.30942480942480943 | 0.8212908842183808 | 0.7219026145386024 | 1e-05
86 | 0.1210218220949173 | 0.30838530838530837 | 0.8206727371003285 | 0.7255503995321377 | 1e-05
87 | 0.12097787857055664 | 0.30734580734580735 | 0.81919187715867 | 0.7163464112504625 | 1e-05
88 | 0.12078534066677094 | 0.30942480942480943 | 0.8219223445649475 | 0.7179611359738045 | 1e-05
89 | 0.1213160827755928 | 0.3125433125433125 | 0.8235824319895118 | 0.7293063087262872 | 1e-06
90 | 0.12110408395528793 | 0.3108108108108108 | 0.8228019165403988 | 0.7249894355418997 | 1e-06
91 | 0.1205781027674675 | 0.31046431046431044 | 0.8191074795725959 | 0.7187027508297176 | 1e-06
92 | 0.12076584249734879 | 0.31046431046431044 | 0.8196009683612989 | 0.7150284118631205 | 1e-06
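The metrics in this table follow standard multilabel definitions. A toy scikit-learn example (not the project's evaluation code) shows why exact-match accuracy sits far below the F1 scores: a prediction only counts as accurate when every one of the 31 labels agrees.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

# Toy multilabel predictions: 3 samples, 3 classes.
y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0]])
y_pred = np.array([[1, 0, 0], [0, 1, 0], [1, 1, 0]])

print(f1_score(y_true, y_pred, average="micro"))  # pools TP/FP/FN over all labels
print(f1_score(y_true, y_pred, average="macro"))  # unweighted mean of per-class F1
print(accuracy_score(y_true, y_pred))             # exact-match: every label must agree
```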
---
# CO2 Emissions
The estimated CO2 emissions for training this model are documented below:
- **Emissions**: 1.414 grams of CO2
- **Source**: Code Carbon
- **Training Type**: fine-tuning
- **Geographical Location**: Brest, France
- **Hardware Used**: NVIDIA Tesla V100 PCIe 32 GB
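A minimal sketch of how Code Carbon is typically attached to a training run (generic codecarbon usage, not the project's exact integration):

```python
from codecarbon import EmissionsTracker

tracker = EmissionsTracker()  # inspects hardware and location to estimate emissions
tracker.start()
# ... training loop runs here ...
emissions_kg = tracker.stop()  # estimated kilograms of CO2-equivalent
print(f"{emissions_kg * 1000:.3f} g CO2eq")
```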
---
# Framework Versions
- **Transformers**: 4.41.1
- **PyTorch**: 2.3.0+cu121
- **Datasets**: 2.19.1
- **Tokenizers**: 0.19.1