---
license: apache-2.0
base_model: microsoft/swinv2-tiny-patch4-window8-256
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: SWv2-DMAE-H-5-ps-clean-fix-U-40-Cross-2
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.9047619047619048
---
# SWv2-DMAE-H-5-ps-clean-fix-U-40-Cross-2
This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.3493
- Accuracy: 0.9048
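The card does not yet include usage instructions; a minimal inference sketch with the 🤗 Transformers `pipeline` API is shown below. The repo id and image path are placeholders (this card does not state the model's Hub path), so substitute the actual Hub repo or a local checkpoint directory:

```python
from transformers import pipeline

# Hypothetical repo id -- replace with the model's actual Hub path
# or a local directory containing the fine-tuned checkpoint.
classifier = pipeline(
    "image-classification",
    model="your-username/SWv2-DMAE-H-5-ps-clean-fix-U-40-Cross-2",
)

# The image processor resizes inputs to 256x256 before classification;
# the call returns the top labels with confidence scores.
predictions = classifier("path/to/image.jpg")
print(predictions)
```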
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 4e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 40
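For reference, a pure-Python sketch of how the numbers above fit together: the effective batch size is `train_batch_size × gradient_accumulation_steps`, and since the training log below ends at optimizer step 480, a warmup ratio of 0.1 corresponds to 48 warmup steps. The schedule function mirrors the linear warmup-then-decay behavior of `lr_scheduler_type: linear` (as implemented by `get_linear_schedule_with_warmup` in Transformers); exact step counts are inferred from the log, not stated by the trainer config.

```python
# Sanity-check arithmetic for the hyperparameters above.
train_batch_size = 16
gradient_accumulation_steps = 4
effective_batch = train_batch_size * gradient_accumulation_steps
assert effective_batch == 64  # matches total_train_batch_size

base_lr = 4e-5
total_steps = 480                       # final step in the training log
warmup_steps = int(0.1 * total_steps)   # lr_scheduler_warmup_ratio: 0.1 -> 48

def lr_at(step: int) -> float:
    """Learning rate after `step` optimizer steps:
    linear warmup to base_lr, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0, total_steps - step) / (total_steps - warmup_steps)

# Peak LR is reached at the end of warmup and decays to 0 at the last step.
print(lr_at(24), lr_at(warmup_steps), lr_at(total_steps))
```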
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
1.6089 | 0.98 | 12 | 1.6061 | 0.2024 |
1.6027 | 1.96 | 24 | 1.5770 | 0.2024 |
1.563 | 2.94 | 36 | 1.5561 | 0.2024 |
1.5141 | 4.0 | 49 | 1.4404 | 0.2024 |
1.3727 | 4.98 | 61 | 1.2293 | 0.6310 |
1.2375 | 5.96 | 73 | 0.9512 | 0.7738 |
1.0523 | 6.94 | 85 | 0.7222 | 0.75 |
0.9473 | 8.0 | 98 | 0.5629 | 0.8571 |
0.7895 | 8.98 | 110 | 0.4626 | 0.8571 |
0.7525 | 9.96 | 122 | 0.4545 | 0.8095 |
0.669 | 10.94 | 134 | 0.4193 | 0.8452 |
0.6892 | 12.0 | 147 | 0.4371 | 0.8333 |
0.6742 | 12.98 | 159 | 0.4079 | 0.8452 |
0.5667 | 13.96 | 171 | 0.3863 | 0.8690 |
0.56 | 14.94 | 183 | 0.4160 | 0.8571 |
0.5683 | 16.0 | 196 | 0.4555 | 0.7976 |
0.5347 | 16.98 | 208 | 0.3839 | 0.8571 |
0.4723 | 17.96 | 220 | 0.3763 | 0.8810 |
0.4566 | 18.94 | 232 | 0.3525 | 0.8690 |
0.4328 | 20.0 | 245 | 0.3548 | 0.8690 |
0.4691 | 20.98 | 257 | 0.3934 | 0.8571 |
0.3675 | 21.96 | 269 | 0.3728 | 0.8810 |
0.3744 | 22.94 | 281 | 0.3810 | 0.8690 |
0.3883 | 24.0 | 294 | 0.3982 | 0.8571 |
0.3633 | 24.98 | 306 | 0.4277 | 0.8690 |
0.3439 | 25.96 | 318 | 0.3579 | 0.8810 |
0.3924 | 26.94 | 330 | 0.3929 | 0.8571 |
0.3139 | 28.0 | 343 | 0.3637 | 0.8929 |
0.2987 | 28.98 | 355 | 0.3493 | 0.9048 |
0.3285 | 29.96 | 367 | 0.3524 | 0.9048 |
0.4061 | 30.94 | 379 | 0.3607 | 0.9048 |
0.3307 | 32.0 | 392 | 0.3616 | 0.8810 |
0.2882 | 32.98 | 404 | 0.3529 | 0.8810 |
0.3074 | 33.96 | 416 | 0.3429 | 0.8929 |
0.3141 | 34.94 | 428 | 0.3420 | 0.8929 |
0.3206 | 36.0 | 441 | 0.3537 | 0.8929 |
0.2772 | 36.98 | 453 | 0.3475 | 0.8929 |
0.2707 | 37.96 | 465 | 0.3461 | 0.8690 |
0.2857 | 38.94 | 477 | 0.3453 | 0.8929 |
0.277 | 39.18 | 480 | 0.3454 | 0.8929 |
### Framework versions
- Transformers 4.36.2
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.0