# dinov2-large-2024_01_19-with_data_aug_batch-size32_epochs50_freeze
This model is a fine-tuned version of facebook/dinov2-large on the multilabel_complete_dataset dataset. It achieves the following results on the evaluation set:
- Loss: 0.0893
- F1 Micro: 0.8609
- F1 Macro: 0.8255
- ROC AUC: 0.9106
- Accuracy: 0.5640
- Learning Rate: 0.0001
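Given the micro/macro F1 and ROC AUC metrics above, this is a multilabel classifier. A minimal inference sketch follows; the sigmoid activation and the 0.5 decision threshold are assumptions, not stated in this card:

```python
# Hypothetical multilabel inference sketch; the sigmoid + 0.5 threshold
# are assumptions, not confirmed by this card.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "lombardata/dinov2-large-2024_01_19-with_data_aug_batch-size32_epochs50_freeze"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)

image = Image.open("example.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Multilabel: score each class independently with a sigmoid, not a softmax.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```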
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
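The "freeze" suffix in the model name suggests the DINOv2 backbone was kept frozen while the classification head was trained. A hedged sketch of such a setup using the hyperparameters above; the frozen backbone, the Trainer-style API, and the label count are all assumptions, not confirmed by this card:

```python
# Hypothetical training setup mirroring the hyperparameters above.
# Freezing the backbone is inferred from the "_freeze" suffix in the model
# name; NUM_LABELS is a placeholder, the card does not state the label count.
from transformers import AutoModelForImageClassification, TrainingArguments

NUM_LABELS = 10  # placeholder value

model = AutoModelForImageClassification.from_pretrained(
    "facebook/dinov2-large",
    problem_type="multi_label_classification",  # assumption: multilabel head
    num_labels=NUM_LABELS,
)

# Train only the classification head; keep the DINOv2 backbone frozen.
for param in model.dinov2.parameters():
    param.requires_grad = False

args = TrainingArguments(
    output_dir="dinov2-large-multilabel",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    seed=42,
)
```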
### Training results

Training Loss | Epoch | Step | Validation Loss | F1 Micro | F1 Macro | ROC AUC | Accuracy | Learning Rate |
---|---|---|---|---|---|---|---|---|
No log | 1.0 | 274 | 0.1433 | 0.7651 | 0.6114 | 0.8511 | 0.4395 | 0.001 |
0.2465 | 2.0 | 548 | 0.1238 | 0.8065 | 0.7226 | 0.8761 | 0.4896 | 0.001 |
0.2465 | 3.0 | 822 | 0.1152 | 0.8172 | 0.7453 | 0.8789 | 0.5191 | 0.001 |
0.1402 | 4.0 | 1096 | 0.1138 | 0.8186 | 0.7360 | 0.8777 | 0.5184 | 0.001 |
0.1402 | 5.0 | 1370 | 0.1144 | 0.8230 | 0.7547 | 0.8817 | 0.5184 | 0.001 |
0.1317 | 6.0 | 1644 | 0.1100 | 0.8201 | 0.7469 | 0.8745 | 0.5261 | 0.001 |
0.1317 | 7.0 | 1918 | 0.1125 | 0.8241 | 0.7511 | 0.8797 | 0.5306 | 0.001 |
0.1284 | 8.0 | 2192 | 0.1084 | 0.8320 | 0.7709 | 0.8895 | 0.5411 | 0.001 |
0.1284 | 9.0 | 2466 | 0.1115 | 0.8266 | 0.7647 | 0.8936 | 0.5198 | 0.001 |
0.1267 | 10.0 | 2740 | 0.1100 | 0.8328 | 0.7685 | 0.8935 | 0.5334 | 0.001 |
0.1251 | 11.0 | 3014 | 0.1750 | 0.8081 | 0.7345 | 0.8760 | 0.5177 | 0.001 |
0.1251 | 12.0 | 3288 | 0.1086 | 0.8244 | 0.7612 | 0.8779 | 0.5379 | 0.001 |
0.1247 | 13.0 | 3562 | 0.1064 | 0.8295 | 0.7613 | 0.8870 | 0.5320 | 0.001 |
0.1247 | 14.0 | 3836 | 0.1050 | 0.8318 | 0.7684 | 0.8886 | 0.5289 | 0.001 |
0.123 | 15.0 | 4110 | 0.1043 | 0.8362 | 0.7696 | 0.8915 | 0.5341 | 0.001 |
0.123 | 16.0 | 4384 | 0.1045 | 0.8428 | 0.7891 | 0.9073 | 0.5341 | 0.001 |
0.1229 | 17.0 | 4658 | 0.1059 | 0.8386 | 0.7775 | 0.9033 | 0.5143 | 0.001 |
0.1229 | 18.0 | 4932 | 0.1063 | 0.8308 | 0.7606 | 0.8906 | 0.5261 | 0.001 |
0.1205 | 19.0 | 5206 | 0.1046 | 0.8367 | 0.7733 | 0.8916 | 0.5421 | 0.001 |
0.1205 | 20.0 | 5480 | 0.1091 | 0.8384 | 0.7787 | 0.9023 | 0.5379 | 0.001 |
0.1213 | 21.0 | 5754 | 0.1077 | 0.8323 | 0.7708 | 0.8907 | 0.5358 | 0.001 |
0.118 | 22.0 | 6028 | 0.1023 | 0.8446 | 0.7858 | 0.9041 | 0.5459 | 0.0001 |
0.118 | 23.0 | 6302 | 0.1009 | 0.8458 | 0.7888 | 0.8992 | 0.5546 | 0.0001 |
0.1102 | 24.0 | 6576 | 0.1000 | 0.8484 | 0.7925 | 0.9039 | 0.5546 | 0.0001 |
0.1102 | 25.0 | 6850 | 0.0974 | 0.8487 | 0.7919 | 0.9015 | 0.5557 | 0.0001 |
0.107 | 26.0 | 7124 | 0.1046 | 0.8510 | 0.7945 | 0.9057 | 0.5560 | 0.0001 |
0.107 | 27.0 | 7398 | 0.0967 | 0.8504 | 0.7968 | 0.9040 | 0.5602 | 0.0001 |
0.1051 | 28.0 | 7672 | 0.0941 | 0.8513 | 0.8002 | 0.9062 | 0.5550 | 0.0001 |
0.1051 | 29.0 | 7946 | 0.0941 | 0.8534 | 0.7979 | 0.9053 | 0.5612 | 0.0001 |
0.1039 | 30.0 | 8220 | 0.0949 | 0.8530 | 0.8028 | 0.9101 | 0.5553 | 0.0001 |
0.1039 | 31.0 | 8494 | 0.0942 | 0.8529 | 0.8005 | 0.9059 | 0.5637 | 0.0001 |
0.1025 | 32.0 | 8768 | 0.0951 | 0.8539 | 0.8021 | 0.9083 | 0.5623 | 0.0001 |
0.1016 | 33.0 | 9042 | 0.0974 | 0.8520 | 0.8013 | 0.9024 | 0.5612 | 0.0001 |
0.1016 | 34.0 | 9316 | 0.0926 | 0.8528 | 0.7981 | 0.9029 | 0.5609 | 0.0001 |
0.1004 | 35.0 | 9590 | 0.0924 | 0.8552 | 0.8059 | 0.9058 | 0.5689 | 0.0001 |
0.1004 | 36.0 | 9864 | 0.0927 | 0.8539 | 0.8087 | 0.9047 | 0.5630 | 0.0001 |
0.0993 | 37.0 | 10138 | 0.0915 | 0.8535 | 0.8050 | 0.9052 | 0.5665 | 0.0001 |
0.0993 | 38.0 | 10412 | 0.0923 | 0.8568 | 0.8059 | 0.9078 | 0.5672 | 0.0001 |
0.0992 | 39.0 | 10686 | 0.0930 | 0.8556 | 0.8078 | 0.9079 | 0.5654 | 0.0001 |
0.0992 | 40.0 | 10960 | 0.0932 | 0.8552 | 0.8109 | 0.9079 | 0.5612 | 0.0001 |
0.0984 | 41.0 | 11234 | 0.0922 | 0.8575 | 0.8114 | 0.9111 | 0.5598 | 0.0001 |
0.0973 | 42.0 | 11508 | 0.0927 | 0.8553 | 0.8147 | 0.9058 | 0.5672 | 0.0001 |
0.0973 | 43.0 | 11782 | 0.0911 | 0.8581 | 0.8148 | 0.9130 | 0.5637 | 0.0001 |
0.0973 | 44.0 | 12056 | 0.0915 | 0.8570 | 0.8127 | 0.9080 | 0.5665 | 0.0001 |
0.0973 | 45.0 | 12330 | 0.0908 | 0.8564 | 0.8094 | 0.9062 | 0.5640 | 0.0001 |
0.0954 | 46.0 | 12604 | 0.0900 | 0.8600 | 0.8157 | 0.9106 | 0.5696 | 0.0001 |
0.0954 | 47.0 | 12878 | 0.0901 | 0.8596 | 0.8204 | 0.9096 | 0.5672 | 0.0001 |
0.096 | 48.0 | 13152 | 0.0902 | 0.8594 | 0.8172 | 0.9082 | 0.5734 | 0.0001 |
0.096 | 49.0 | 13426 | 0.0903 | 0.8575 | 0.8128 | 0.9107 | 0.5717 | 0.0001 |
0.0939 | 50.0 | 13700 | 0.0897 | 0.8598 | 0.8164 | 0.9117 | 0.5696 | 0.0001 |
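The metric columns above are consistent with thresholded multilabel scoring as computed by scikit-learn. A hedged sketch of such a compute_metrics function; the 0.5 threshold and exact-match accuracy are assumptions, though they are consistent with the Accuracy column sitting well below F1 Micro:

```python
# Hypothetical metric computation consistent with the columns above.
# The 0.5 threshold and subset (exact-match) accuracy are assumptions.
from scipy.special import expit  # numerically stable sigmoid
from sklearn.metrics import accuracy_score, f1_score, roc_auc_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    probs = expit(logits)
    preds = (probs > 0.5).astype(int)
    return {
        "f1_micro": f1_score(labels, preds, average="micro"),
        "f1_macro": f1_score(labels, preds, average="macro"),
        "roc_auc": roc_auc_score(labels, probs, average="micro"),
        # Exact-match: a sample counts only if every label is correct.
        "accuracy": accuracy_score(labels, preds),
    }
```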
### Framework versions
- Transformers 4.36.2
- PyTorch 2.1.0+cu118
- Datasets 2.14.5
- Tokenizers 0.15.0