DinoVd'eau is a fine-tuned version of facebook/dinov2-large. It achieves the following results on the test set:
- Loss: 0.1247
- F1 Micro: 0.8153
- F1 Macro: 0.7021
- ROC AUC: 0.8747
- Accuracy: 0.3144
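For reference, these four metrics can be computed from sigmoid outputs with scikit-learn. The arrays below are toy values, not model outputs; note that "Accuracy" here is subset (exact-match) accuracy, where every label of a sample must be correct, which is why it is much lower than the F1 scores.

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

# Toy multilabel ground truth and sigmoid scores: 4 samples x 3 classes
y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0], [0, 0, 1]])
y_prob = np.array([[0.9, 0.2, 0.8],
                   [0.1, 0.7, 0.3],
                   [0.8, 0.4, 0.4],
                   [0.2, 0.1, 0.9]])
y_pred = (y_prob >= 0.5).astype(int)  # threshold each class independently

f1_micro = f1_score(y_true, y_pred, average="micro")   # pools all labels together
f1_macro = f1_score(y_true, y_pred, average="macro")   # unweighted mean of per-class F1
roc_auc = roc_auc_score(y_true, y_prob, average="micro")
accuracy = accuracy_score(y_true, y_pred)              # subset (exact-match) accuracy
```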
## Model description
DinoVd'eau is a model built on top of the DINOv2 backbone for underwater multilabel image classification. The classification head is a combination of linear, ReLU, batch normalization, and dropout layers.
The source code for training the model can be found in this Git repository.
- Developed by: lombardata, credits to César Leblanc and Victor Illien
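As a sketch, the head described above might look like the following; the hidden width (512) and dropout rate are hypothetical (see the training repository for the actual architecture), while 1024 is the DINOv2-large embedding size and 31 matches the number of classes listed further down.

```python
import torch
from torch import nn

hidden_size = 1024  # dinov2-large embedding size
num_labels = 31     # one output per class in the class table

# Hypothetical reconstruction of the head: linear -> batch norm -> ReLU -> dropout -> linear
classifier = nn.Sequential(
    nn.Linear(hidden_size, 512),
    nn.BatchNorm1d(512),
    nn.ReLU(),
    nn.Dropout(p=0.2),
    nn.Linear(512, num_labels),  # raw logits; apply sigmoid for multilabel probabilities
)

features = torch.randn(8, hidden_size)       # a batch of DINOv2 embeddings
probs = torch.sigmoid(classifier(features))  # shape (8, num_labels), each in [0, 1]
```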
## Intended uses & limitations
You can use the raw model to classify diverse marine species, encompassing coral morphotype classes taken from the Global Coral Reef Monitoring Network (GCRMN), habitat classes, and seagrass species.
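Because this is a multilabel model, predictions are made per class with a sigmoid rather than a softmax. The sketch below shows the decoding step with toy logits and a toy `id2label` mapping; in practice you would load the checkpoint (e.g. via `transformers.AutoModelForImageClassification.from_pretrained("lombardata/dinov2-large-2024_05_27-_batch-size32_epochs150_freeze")`, assuming it follows the standard image-classification layout) and decode `model(**inputs).logits` the same way.

```python
import torch

# Toy stand-ins for model.config.id2label and one image's logits
id2label = {0: "Fish", 1: "Rock", 2: "Sand"}
logits = torch.tensor([[2.1, -0.3, 1.4]])

# Multilabel decision: sigmoid per class, keep everything above 0.5
probs = torch.sigmoid(logits)[0]
predicted = [id2label[i] for i, p in enumerate(probs.tolist()) if p > 0.5]
print(predicted)  # ['Fish', 'Sand']
```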
## Training and evaluation data
Details on the number of images for each class are given in the following table:
Class | train | val | test | Total |
---|---|---|---|---|
Acropore_branched | 1488 | 465 | 455 | 2408 |
Acropore_digitised | 566 | 169 | 153 | 888 |
Acropore_sub_massive | 147 | 48 | 48 | 243 |
Acropore_tabular | 997 | 290 | 302 | 1589 |
Algae_assembly | 2537 | 859 | 842 | 4238 |
Algae_drawn_up | 368 | 121 | 131 | 620 |
Algae_limestone | 1651 | 559 | 562 | 2772 |
Algae_sodding | 3155 | 980 | 982 | 5117 |
Atra/Leucospilota | 1090 | 359 | 343 | 1792 |
Bleached_coral | 219 | 69 | 72 | 360 |
Blurred | 190 | 63 | 67 | 320 |
Dead_coral | 1981 | 644 | 639 | 3264 |
Fish | 2029 | 657 | 635 | 3321 |
Homo_sapiens | 160 | 63 | 59 | 282 |
Human_object | 156 | 61 | 53 | 270 |
Living_coral | 854 | 289 | 271 | 1414 |
Millepore | 383 | 129 | 125 | 637 |
No_acropore_encrusting | 420 | 153 | 152 | 725 |
No_acropore_foliaceous | 204 | 44 | 38 | 286 |
No_acropore_massive | 1017 | 345 | 343 | 1705 |
No_acropore_solitary | 195 | 54 | 54 | 303 |
No_acropore_sub_massive | 1383 | 445 | 428 | 2256 |
Rock | 4469 | 1499 | 1489 | 7457 |
Rubble | 3089 | 1011 | 1023 | 5123 |
Sand | 5840 | 1949 | 1930 | 9719 |
Sea_cucumber | 1413 | 445 | 436 | 2294 |
Sea_urchins | 327 | 107 | 111 | 545 |
Sponge | 269 | 104 | 97 | 470 |
Syringodium_isoetifolium | 1214 | 388 | 393 | 1995 |
Thalassodendron_ciliatum | 781 | 262 | 260 | 1303 |
Useless | 579 | 193 | 193 | 965 |
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- Number of Epochs: 150
- Learning Rate: 0.001
- Train Batch Size: 32
- Eval Batch Size: 32
- Optimizer: Adam
- LR Scheduler Type: ReduceLROnPlateau with a patience of 5 epochs and a factor of 0.1
- Freeze Encoder: Yes
- Data Augmentation: Yes
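The optimizer and scheduler settings above can be sketched as follows; the `head` module is an illustrative stand-in for the trainable classification head, since the DINOv2 encoder is frozen during training.

```python
import torch
from torch import nn

head = nn.Linear(1024, 31)  # stand-in for the classification head
optimizer = torch.optim.Adam(head.parameters(), lr=0.001)
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=5
)

# Freezing the encoder means only head parameters receive gradient updates:
# for p in encoder.parameters():
#     p.requires_grad = False

for epoch in range(3):            # 150 epochs in the actual run
    val_loss = 1.0 / (epoch + 1)  # placeholder validation loss
    scheduler.step(val_loss)      # LR is cut 10x after 5 epochs without improvement
```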
### Data Augmentation
Data were augmented using the following transformations:
#### Train Transforms
- PreProcess: No additional parameters
- Resize: probability=1.00
- RandomHorizontalFlip: probability=0.25
- RandomVerticalFlip: probability=0.25
- ColorJiggle: probability=0.25
- RandomPerspective: probability=0.25
- Normalize: probability=1.00
#### Val Transforms
- PreProcess: No additional parameters
- Resize: probability=1.00
- Normalize: probability=1.00
### Training results
Epoch | Validation Loss | Accuracy | F1 Macro | F1 Micro | Learning Rate |
---|---|---|---|---|---|
1.0 | 0.1701383888721466 | 0.232726022688209 | 0.7380235658381353 | 0.4712591871520079 | 0.001 |
1.8315018315018317 | N/A | N/A | N/A | N/A | 0.001 |
2.0 | 0.15890291333198547 | 0.24888277758679958 | 0.7568708574323469 | 0.5722657636852799 | 0.001 |
3.0 | 0.15134122967720032 | 0.24785149535922998 | 0.7723932964583505 | 0.6104117366005594 | 0.001 |
3.663003663003663 | N/A | N/A | N/A | N/A | 0.001 |
4.0 | 0.15164224803447723 | 0.24853901684427637 | 0.7608496532472631 | 0.599745200497783 | 0.001 |
5.0 | 0.15243493020534515 | 0.24750773461670678 | 0.7692371752165224 | 0.5935106518877853 | 0.001 |
5.4945054945054945 | N/A | N/A | N/A | N/A | 0.001 |
6.0 | 0.14673969149589539 | 0.24269508422138192 | 0.7718080548414739 | 0.613788061665736 | 0.001 |
7.0 | 0.1506606936454773 | 0.2509453420419388 | 0.7732481363152289 | 0.6309898748773293 | 0.001 |
7.326007326007326 | N/A | N/A | N/A | N/A | 0.001 |
8.0 | 0.14430351555347443 | 0.2609144035751117 | 0.7828014555188422 | 0.6403740520777896 | 0.001 |
9.0 | 0.14617429673671722 | 0.25128910278446204 | 0.781416038551835 | 0.6366498775226099 | 0.001 |
9.157509157509157 | N/A | N/A | N/A | N/A | 0.001 |
10.0 | 0.14414818584918976 | 0.2688209006531454 | 0.7794745970641737 | 0.6254472467805158 | 0.001 |
10.989010989010989 | N/A | N/A | N/A | N/A | 0.001 |
11.0 | 0.14589238166809082 | 0.2595393606050189 | 0.7780349253103302 | 0.6357434835994208 | 0.001 |
12.0 | 0.14458976686000824 | 0.2554142316947405 | 0.7823495795575149 | 0.638389910481932 | 0.001 |
12.820512820512821 | N/A | N/A | N/A | N/A | 0.001 |
13.0 | 0.14142437279224396 | 0.2561017531797869 | 0.786284091383703 | 0.6574365741219022 | 0.001 |
14.0 | 0.1581379920244217 | 0.24682021313166036 | 0.7766990291262137 | 0.6245865910731833 | 0.001 |
14.652014652014651 | N/A | N/A | N/A | N/A | 0.001 |
15.0 | 0.1447945237159729 | 0.2598831213475421 | 0.7859620485615181 | 0.6552072842486797 | 0.001 |
16.0 | 0.1438169628381729 | 0.2605706428325885 | 0.7853051058530511 | 0.6495757946819554 | 0.001 |
16.483516483516482 | N/A | N/A | N/A | N/A | 0.001 |
17.0 | 0.14359386265277863 | 0.2506015812994156 | 0.7824457675812967 | 0.6310969679900952 | 0.001 |
18.0 | 0.1412857472896576 | 0.2564455139223101 | 0.7848311343456975 | 0.6531950395959965 | 0.001 |
18.315018315018314 | N/A | N/A | N/A | N/A | 0.001 |
19.0 | 0.14079046249389648 | 0.26022688209006534 | 0.7833830386020918 | 0.6486819478687708 | 0.001 |
20.0 | 0.14640754461288452 | 0.26400825025782054 | 0.7775968460747342 | 0.6262168341318395 | 0.001 |
20.146520146520146 | N/A | N/A | N/A | N/A | 0.001 |
21.0 | 0.1412632316350937 | 0.2653832932279134 | 0.7890916719110552 | 0.6582044080070929 | 0.001 |
21.978021978021978 | N/A | N/A | N/A | N/A | 0.001 |
22.0 | 0.14168681204319 | 0.2543829494671708 | 0.7871090517954659 | 0.6586947128782558 | 0.001 |
23.0 | 0.1393543779850006 | 0.269852182880715 | 0.7863651704353696 | 0.6427873985434494 | 0.001 |
23.80952380952381 | N/A | N/A | N/A | N/A | 0.001 |
24.0 | 0.14052371680736542 | 0.2588518391199725 | 0.7857706852844616 | 0.6618962794412713 | 0.001 |
25.0 | 0.1392364352941513 | 0.2653832932279134 | 0.7897693920335429 | 0.653320279245233 | 0.001 |
25.641025641025642 | N/A | N/A | N/A | N/A | 0.001 |
26.0 | 0.14239099621772766 | 0.27019594362323823 | 0.7838044308632545 | 0.6529431984792132 | 0.001 |
27.0 | 0.1386287957429886 | 0.2671020969405294 | 0.7974886125815585 | 0.6810613208979668 | 0.001 |
27.47252747252747 | N/A | N/A | N/A | N/A | 0.001 |
28.0 | 0.15519200265407562 | 0.2650395324853902 | 0.7791304347826087 | 0.6474807711800876 | 0.001 |
29.0 | 0.14190098643302917 | 0.27019594362323823 | 0.7913651213762871 | 0.6550381793679035 | 0.001 |
29.304029304029303 | N/A | N/A | N/A | N/A | 0.001 |
30.0 | 0.13986903429031372 | 0.2767273977311791 | 0.7857173292428311 | 0.663185953977854 | 0.001 |
31.0 | 0.13765402138233185 | 0.27260226882090066 | 0.7881844380403459 | 0.6554744698272669 | 0.001 |
31.135531135531135 | N/A | N/A | N/A | N/A | 0.001 |
32.0 | 0.13866138458251953 | 0.2677896184255758 | 0.7914770376499792 | 0.6596978272887946 | 0.001 |
32.967032967032964 | N/A | N/A | N/A | N/A | 0.001 |
33.0 | 0.13930276036262512 | 0.2605706428325885 | 0.7887546855476885 | 0.6583814800932023 | 0.001 |
34.0 | 0.1374826431274414 | 0.2763836369886559 | 0.795303262082937 | 0.6636727922636001 | 0.001 |
34.798534798534796 | N/A | N/A | N/A | N/A | 0.001 |
35.0 | 0.14001137018203735 | 0.25850807837744927 | 0.7860775988902434 | 0.6442491093834092 | 0.001 |
36.0 | 0.13899104297161102 | 0.26916466139566864 | 0.7890085033301218 | 0.6541220211466265 | 0.001 |
36.63003663003663 | N/A | N/A | N/A | N/A | 0.001 |
37.0 | 0.14101693034172058 | 0.2667583361980062 | 0.788356222091162 | 0.6602790790864311 | 0.001 |
38.0 | 0.13849563896656036 | 0.2633207287727741 | 0.7864065343433915 | 0.6508514926081754 | 0.001 |
38.46153846153846 | N/A | N/A | N/A | N/A | 0.001 |
39.0 | 0.14249388873577118 | 0.26263320728772777 | 0.7819844457738655 | 0.6513021077089046 | 0.001 |
40.0 | 0.1512959599494934 | 0.2633207287727741 | 0.7819497946916141 | 0.6421624481517915 | 0.001 |
40.29304029304029 | N/A | N/A | N/A | N/A | 0.0001 |
41.0 | 0.1416281908750534 | 0.27157098659333107 | 0.795353889863792 | 0.6708412782877394 | 0.0001 |
42.0 | 0.13480685651302338 | 0.2811962873839807 | 0.8014906832298136 | 0.6820172839356666 | 0.0001 |
42.124542124542124 | N/A | N/A | N/A | N/A | 0.0001 |
43.0 | 0.1342025101184845 | 0.2756961155036095 | 0.8014919187733112 | 0.681931169239128 | 0.0001 |
43.956043956043956 | N/A | N/A | N/A | N/A | 0.0001 |
44.0 | 0.1327475756406784 | 0.2811962873839807 | 0.8019789631231031 | 0.683693351140427 | 0.0001 |
45.0 | 0.1318245828151703 | 0.2811962873839807 | 0.8049446006284108 | 0.6900135704395078 | 0.0001 |
45.78754578754579 | N/A | N/A | N/A | N/A | 0.0001 |
46.0 | 0.13027183711528778 | 0.28910278446201443 | 0.8063969585520062 | 0.6920134474185277 | 0.0001 |
47.0 | 0.12985946238040924 | 0.284977655551736 | 0.8065087538619978 | 0.6938459582689339 | 0.0001 |
47.61904761904762 | N/A | N/A | N/A | N/A | 0.0001 |
48.0 | 0.12981055676937103 | 0.2853214162942592 | 0.8031727379553465 | 0.6917397436201066 | 0.0001 |
49.0 | 0.1301460713148117 | 0.2839463733241664 | 0.8081048867699644 | 0.6980761423122126 | 0.0001 |
49.45054945054945 | N/A | N/A | N/A | N/A | 0.0001 |
50.0 | 0.1294524371623993 | 0.2829150910965968 | 0.8056895691232739 | 0.6968263757426811 | 0.0001 |
51.0 | 0.12989668548107147 | 0.2846338948092128 | 0.8078541374474054 | 0.6981227572539419 | 0.0001 |
51.282051282051285 | N/A | N/A | N/A | N/A | 0.0001 |
52.0 | 0.13097986578941345 | 0.284977655551736 | 0.809621541745341 | 0.7032059573412642 | 0.0001 |
53.0 | 0.12910524010658264 | 0.288415262976968 | 0.8082875892525485 | 0.6952081515364695 | 0.0001 |
53.11355311355312 | N/A | N/A | N/A | N/A | 0.0001 |
54.0 | 0.1276824176311493 | 0.2860089377793056 | 0.8055729885778838 | 0.6914506394370794 | 0.0001 |
54.94505494505494 | N/A | N/A | N/A | N/A | 0.0001 |
55.0 | 0.12751279771327972 | 0.28979030594706084 | 0.8091508143727464 | 0.7051415507931676 | 0.0001 |
56.0 | 0.12798655033111572 | 0.2911653489171537 | 0.8077718065316246 | 0.6990943862949641 | 0.0001 |
56.776556776556774 | N/A | N/A | N/A | N/A | 0.0001 |
57.0 | 0.1279618740081787 | 0.29150910965967686 | 0.8107930240210597 | 0.7001268142729874 | 0.0001 |
58.0 | 0.1280883550643921 | 0.290134066689584 | 0.8108946874106743 | 0.7039327958876614 | 0.0001 |
58.608058608058606 | N/A | N/A | N/A | N/A | 0.0001 |
59.0 | 0.1287168562412262 | 0.2873839807493984 | 0.8071845383437488 | 0.699653006099352 | 0.0001 |
60.0 | 0.1270500272512436 | 0.28875902371949125 | 0.8103491168421926 | 0.7042073996338176 | 0.0001 |
60.43956043956044 | N/A | N/A | N/A | N/A | 0.0001 |
61.0 | 0.1269637793302536 | 0.28944654520453766 | 0.8072888368788399 | 0.6994480698947442 | 0.0001 |
62.0 | 0.12639474868774414 | 0.28979030594706084 | 0.8124407826982492 | 0.7105518005302388 | 0.0001 |
62.27106227106227 | N/A | N/A | N/A | N/A | 0.0001 |
63.0 | 0.12643341720104218 | 0.2918528704022001 | 0.8093336660843524 | 0.7042257858113937 | 0.0001 |
64.0 | 0.12570597231388092 | 0.2918528704022001 | 0.8119739624362535 | 0.7054117610081568 | 0.0001 |
64.1025641025641 | N/A | N/A | N/A | N/A | 0.0001 |
65.0 | 0.12599390745162964 | 0.29322791337229287 | 0.8103770839396333 | 0.7040599127700347 | 0.0001 |
65.93406593406593 | N/A | N/A | N/A | N/A | 0.0001 |
66.0 | 0.12674611806869507 | 0.29769680302509455 | 0.8141795311606633 | 0.7083351143800681 | 0.0001 |
67.0 | 0.12676431238651276 | 0.28979030594706084 | 0.8090950582963362 | 0.6998024530144022 | 0.0001 |
67.76556776556777 | N/A | N/A | N/A | N/A | 0.0001 |
68.0 | 0.12638631463050842 | 0.2928841526297697 | 0.8127327032445482 | 0.7034736625177254 | 0.0001 |
69.0 | 0.12608103454113007 | 0.2952904778274321 | 0.8131967584022379 | 0.7078892431331377 | 0.0001 |
69.59706959706959 | N/A | N/A | N/A | N/A | 0.0001 |
70.0 | 0.12582050263881683 | 0.29150910965967686 | 0.8136722606120435 | 0.7081157868651535 | 0.0001 |
71.0 | 0.12533149123191833 | 0.2918528704022001 | 0.8123295595405339 | 0.7044517956080781 | 1e-05 |
71.42857142857143 | N/A | N/A | N/A | N/A | 1e-05 |
72.0 | 0.1258901059627533 | 0.2966655207975249 | 0.8159506713723581 | 0.7099295458861072 | 1e-05 |
73.0 | 0.12526649236679077 | 0.2949467170849089 | 0.8159496670343587 | 0.7116557450872655 | 1e-05 |
73.26007326007326 | N/A | N/A | N/A | N/A | 1e-05 |
74.0 | 0.12490212172269821 | 0.29769680302509455 | 0.8156100747030249 | 0.7159515864206437 | 1e-05 |
75.0 | 0.12504002451896667 | 0.2966655207975249 | 0.8135426082669078 | 0.7082306828309269 | 1e-05 |
75.0915750915751 | N/A | N/A | N/A | N/A | 1e-05 |
76.0 | 0.12634462118148804 | 0.2966655207975249 | 0.8099675513769865 | 0.6998917153140419 | 1e-05 |
76.92307692307692 | N/A | N/A | N/A | N/A | 1e-05 |
77.0 | 0.1249643936753273 | 0.2966655207975249 | 0.8142915811088296 | 0.7104044870773909 | 1e-05 |
78.0 | 0.12509745359420776 | 0.2939154348573393 | 0.812339968613199 | 0.7076718539561497 | 1e-05 |
78.75457875457876 | N/A | N/A | N/A | N/A | 1e-05 |
79.0 | 0.12465520948171616 | 0.29838432451014096 | 0.8147326016360423 | 0.7097766100728804 | 1e-05 |
80.0 | 0.12526248395442963 | 0.2990718459951873 | 0.8166140393490405 | 0.7133791911991404 | 1e-05 |
80.58608058608058 | N/A | N/A | N/A | N/A | 1e-05 |
81.0 | 0.12510864436626434 | 0.2952904778274321 | 0.8121923983622152 | 0.705898272950067 | 1e-05 |
82.0 | 0.12532733380794525 | 0.29975936748023374 | 0.8150326797385622 | 0.7095032932540235 | 1e-05 |
82.41758241758242 | N/A | N/A | N/A | N/A | 1e-05 |
83.0 | 0.12474868446588516 | 0.29597799931247853 | 0.815855206584497 | 0.7124383950303705 | 1e-05 |
84.0 | 0.12511858344078064 | 0.3007906497078034 | 0.8175330467926365 | 0.7138847615465347 | 1e-05 |
84.24908424908425 | N/A | N/A | N/A | N/A | 1e-05 |
85.0 | 0.12457013875246048 | 0.2966655207975249 | 0.8132141082960754 | 0.7054571621251418 | 1e-05 |
86.0 | 0.1251869946718216 | 0.2946029563423857 | 0.8143732269868025 | 0.7142702846379808 | 1e-05 |
86.08058608058609 | N/A | N/A | N/A | N/A | 1e-05 |
87.0 | 0.12492978572845459 | 0.2935716741148161 | 0.8135328455150868 | 0.7081357577756824 | 1e-05 |
87.91208791208791 | N/A | N/A | N/A | N/A | 1e-05 |
88.0 | 0.12513719499111176 | 0.2990718459951873 | 0.815831263487927 | 0.7099379006276698 | 1e-05 |
89.0 | 0.12514576315879822 | 0.2963217600550017 | 0.8143914473684211 | 0.7092910188720426 | 1e-05 |
89.74358974358974 | N/A | N/A | N/A | N/A | 1e-05 |
90.0 | 0.1244530975818634 | 0.2942591955998625 | 0.8134516195584898 | 0.7121664381657501 | 1e-05 |
91.0 | 0.12501013278961182 | 0.2990718459951873 | 0.8153902768123646 | 0.7106178930468596 | 1e-05 |
91.57509157509158 | N/A | N/A | N/A | N/A | 1e-05 |
92.0 | 0.12525025010108948 | 0.2973530422825713 | 0.8163049232398094 | 0.7140173113811211 | 1e-05 |
93.0 | 0.12471849471330643 | 0.29872808525266414 | 0.8148661314641998 | 0.7129019083206937 | 1e-05 |
93.4065934065934 | N/A | N/A | N/A | N/A | 1e-05 |
94.0 | 0.12515641748905182 | 0.2980405637676177 | 0.8141884924726748 | 0.7053935701419592 | 1e-05 |
95.0 | 0.12481416761875153 | 0.30147817119284975 | 0.8165906870726147 | 0.7134995447430972 | 1e-05 |
95.23809523809524 | N/A | N/A | N/A | N/A | 1e-05 |
96.0 | 0.12492986023426056 | 0.2980405637676177 | 0.8160666176830762 | 0.7110442004495683 | 1e-05 |
97.0 | 0.12459924072027206 | 0.30147817119284975 | 0.8168590473093806 | 0.7158597011246477 | 1.0000000000000002e-06 |
97.06959706959707 | N/A | N/A | N/A | N/A | 1.0000000000000002e-06 |
98.0 | 0.12447398155927658 | 0.29975936748023374 | 0.8149457415323906 | 0.707122866121441 | 1.0000000000000002e-06 |
98.9010989010989 | N/A | N/A | N/A | N/A | 1.0000000000000002e-06 |
99.0 | 0.12462905794382095 | 0.30216569267789617 | 0.8165748111859562 | 0.7182970295785608 | 1.0000000000000002e-06 |
100.0 | 0.12463195621967316 | 0.30147817119284975 | 0.8161644284310514 | 0.7136275002413193 | 1.0000000000000002e-06 |
## CO2 Emissions
The estimated CO2 emissions for training this model are documented below:
- Emissions: 1.562242452449767 grams of CO2
- Source: Code Carbon
- Training Type: fine-tuning
- Geographical Location: Brest, France
- Hardware Used: NVIDIA Tesla V100 PCIe 32 GB
## Framework Versions
- Transformers: 4.41.1
- Pytorch: 2.3.0+cu121
- Datasets: 2.19.1
- Tokenizers: 0.19.1
## Model tree for lombardata/dinov2-large-2024_05_27-_batch-size32_epochs150_freeze
- Base model: facebook/dinov2-large