# swin-tiny-patch4-window7-224-swinnn
This model is a fine-tuned version of an unspecified Swin Transformer checkpoint, trained on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0883
- Accuracy: 0.8232
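The checkpoint can be loaded for image classification with the Transformers `pipeline` API. A minimal usage sketch follows; the repo id is assumed from the card title and may need the correct namespace prefix, and the image path is a placeholder.

```python
# Assumed repo id (taken from the card title; prepend the owner's namespace if needed).
checkpoint = "swin-tiny-patch4-window7-224-swinnn"

def predict(image_path: str):
    """Classify an image with the fine-tuned checkpoint.

    Downloads the model weights on first use, so network access is required.
    """
    from transformers import pipeline  # lazy import keeps the sketch lightweight

    classifier = pipeline("image-classification", model=checkpoint)
    return classifier(image_path)  # list of {"label": ..., "score": ...} dicts

# Example (hypothetical image path):
# predictions = predict("path/to/image.jpg")
```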
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
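The effective batch size and the linear warmup/decay schedule implied by these settings can be sketched as follows. The total step count of 17550 is taken from the final row of the results table below; everything else comes from the hyperparameters above.

```python
# Effective batch size: per-device batch size times gradient accumulation steps.
train_batch_size = 32
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 128

learning_rate = 5e-5
warmup_ratio = 0.1
total_steps = 17550  # final optimizer step reported in the results table
warmup_steps = int(warmup_ratio * total_steps)  # 1755

def linear_schedule_lr(step: int) -> float:
    """Linear warmup from 0 to the peak LR, then linear decay back to 0."""
    if step < warmup_steps:
        return learning_rate * step / warmup_steps
    return learning_rate * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

This mirrors what the `linear` scheduler with `warmup_ratio=0.1` does during training: the learning rate peaks at 5e-05 after the first 10% of optimizer steps and reaches zero at the final step.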
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
0.2802 | 0.9979 | 351 | 0.2783 | 0.3222 |
0.2702 | 1.9986 | 703 | 0.2652 | 0.376 |
0.2565 | 2.9993 | 1055 | 0.2474 | 0.431 |
0.2448 | 4.0 | 1407 | 0.2358 | 0.4558 |
0.2433 | 4.9979 | 1758 | 0.2223 | 0.4994 |
0.2095 | 5.9986 | 2110 | 0.2058 | 0.5434 |
0.2197 | 6.9993 | 2462 | 0.1963 | 0.568 |
0.2093 | 8.0 | 2814 | 0.1906 | 0.5764 |
0.2047 | 8.9979 | 3165 | 0.1888 | 0.5874 |
0.1952 | 9.9986 | 3517 | 0.1743 | 0.6192 |
0.1926 | 10.9993 | 3869 | 0.1740 | 0.6234 |
0.1838 | 12.0 | 4221 | 0.1667 | 0.6448 |
0.1822 | 12.9979 | 4572 | 0.1629 | 0.6468 |
0.1838 | 13.9986 | 4924 | 0.1587 | 0.6638 |
0.1689 | 14.9993 | 5276 | 0.1563 | 0.675 |
0.1697 | 16.0 | 5628 | 0.1472 | 0.6916 |
0.1643 | 16.9979 | 5979 | 0.1435 | 0.6912 |
0.1655 | 17.9986 | 6331 | 0.1395 | 0.706 |
0.1555 | 18.9993 | 6683 | 0.1371 | 0.714 |
0.1577 | 20.0 | 7035 | 0.1321 | 0.7258 |
0.1575 | 20.9979 | 7386 | 0.1318 | 0.7284 |
0.141 | 21.9986 | 7738 | 0.1228 | 0.7438 |
0.151 | 22.9993 | 8090 | 0.1260 | 0.7392 |
0.1403 | 24.0 | 8442 | 0.1178 | 0.7558 |
0.1434 | 24.9979 | 8793 | 0.1185 | 0.7534 |
0.1465 | 25.9986 | 9145 | 0.1162 | 0.759 |
0.1362 | 26.9993 | 9497 | 0.1121 | 0.769 |
0.138 | 28.0 | 9849 | 0.1099 | 0.769 |
0.1293 | 28.9979 | 10200 | 0.1094 | 0.7754 |
0.1273 | 29.9986 | 10552 | 0.1091 | 0.7768 |
0.1363 | 30.9993 | 10904 | 0.1078 | 0.7766 |
0.1293 | 32.0 | 11256 | 0.1091 | 0.7736 |
0.1275 | 32.9979 | 11607 | 0.1068 | 0.7806 |
0.1263 | 33.9986 | 11959 | 0.1040 | 0.7888 |
0.1243 | 34.9993 | 12311 | 0.1019 | 0.7954 |
0.1237 | 36.0 | 12663 | 0.1016 | 0.7958 |
0.1243 | 36.9979 | 13014 | 0.0993 | 0.7988 |
0.1194 | 37.9986 | 13366 | 0.1011 | 0.7986 |
0.1213 | 38.9993 | 13718 | 0.0959 | 0.8064 |
0.1155 | 40.0 | 14070 | 0.0942 | 0.8108 |
0.1179 | 40.9979 | 14421 | 0.0950 | 0.8072 |
0.1057 | 41.9986 | 14773 | 0.0924 | 0.8166 |
0.1042 | 42.9993 | 15125 | 0.0924 | 0.8152 |
0.1151 | 44.0 | 15477 | 0.0928 | 0.8132 |
0.1122 | 44.9979 | 15828 | 0.0920 | 0.8146 |
0.11 | 45.9986 | 16180 | 0.0906 | 0.8152 |
0.1096 | 46.9993 | 16532 | 0.0894 | 0.82 |
0.1082 | 48.0 | 16884 | 0.0885 | 0.821 |
0.108 | 48.9979 | 17235 | 0.0886 | 0.8204 |
0.112 | 49.8934 | 17550 | 0.0883 | 0.8232 |
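The step counts in the table are internally consistent: about 351 optimizer steps per epoch over 50 epochs yields the final reported step of 17550, which at an effective batch size of 128 suggests a training set of roughly 45k samples. A quick sanity check under that assumption (the sample count is an inference, not a figure stated on the card):

```python
import math

total_train_batch_size = 128  # 32 per device x 4 accumulation steps
steps_per_epoch = 351         # optimizer steps in epoch 1 of the table
num_epochs = 50

# Implied approximate training-set size (inferred, not stated on the card).
approx_train_samples = steps_per_epoch * total_train_batch_size  # 44,928

# Consistency checks against the table.
assert math.ceil(approx_train_samples / total_train_batch_size) == steps_per_epoch
assert steps_per_epoch * num_epochs == 17550  # matches the final reported step
```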
### Framework versions
- Transformers 4.46.3
- Pytorch 2.1.0a0+32f93b1
- Datasets 3.1.0
- Tokenizers 0.20.3