# wav2vec2-xls-r-1b-scandinavian-E4-100h-30-epochs-20250208_v3
This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: inf
- Wer: 17.8979
- Cer: 4.8286
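For reference, WER and CER are word- and character-level edit distances normalized by reference length, reported here as percentages. The following is a minimal pure-Python sketch of the metrics (not the evaluation code used for this model, which typically comes from the `evaluate`/`jiwer` libraries):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences (words or characters)."""
    m, n = len(ref), len(hyp)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            cur[j] = min(prev[j] + 1,         # deletion
                         cur[j - 1] + 1,      # insertion
                         prev[j - 1] + cost)  # substitution
        prev = cur
    return prev[n]

def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edits / reference word count, in percent."""
    ref_words = reference.split()
    return 100.0 * edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    """Character error rate: character-level edits / reference length, in percent."""
    return 100.0 * edit_distance(reference, hypothesis) / len(reference)
```

A WER of 17.90 therefore means roughly one word-level edit per 5.6 reference words.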
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5000
- num_epochs: 8
- mixed_precision_training: Native AMP
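The effective batch size and learning-rate schedule above can be sketched as follows. This is a hypothetical re-implementation of linear warmup followed by cosine decay (in practice the schedule comes from `transformers`' `get_cosine_schedule_with_warmup`); `total_steps=10000` is taken from the final step in the results table:

```python
import math

# Effective batch size: per-device batch size times gradient accumulation steps.
train_batch_size = 16
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps  # 32

def lr_at_step(step, peak_lr=1e-4, warmup_steps=5000, total_steps=10000):
    """Linear warmup from 0 to peak_lr, then cosine decay back to 0."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return peak_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

Note that with 5000 warmup steps and roughly 10000 optimizer steps in total, the learning rate is still warming up for the first half of training.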
## Training results
| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|---|---|---|---|---|---|
| 2.6196 | 0.3909 | 500 | inf | 100.0 | 65.7730 |
| 1.0801 | 0.7819 | 1000 | inf | 38.9103 | 10.8535 |
| 0.69 | 1.1728 | 1500 | inf | 28.1298 | 7.8743 |
| 0.5238 | 1.5637 | 2000 | inf | 24.6450 | 6.9218 |
| 0.4787 | 1.9547 | 2500 | inf | 23.5140 | 6.6324 |
| 0.41 | 2.3456 | 3000 | inf | 23.2639 | 6.4844 |
| 0.3868 | 2.7365 | 3500 | inf | 23.0202 | 6.4139 |
| 0.4987 | 3.1274 | 4000 | inf | 22.1255 | 6.1609 |
| 0.5173 | 3.5184 | 4500 | inf | 22.0348 | 6.1361 |
| 0.5159 | 3.9093 | 5000 | inf | 22.2732 | 6.2526 |
| 0.2712 | 4.3002 | 5500 | inf | 22.0759 | 6.1940 |
| 0.2565 | 4.6912 | 6000 | inf | 21.4397 | 5.9482 |
| 0.4129 | 5.0821 | 6500 | inf | 21.1116 | 5.8439 |
| 0.3925 | 5.4730 | 7000 | inf | 21.1506 | 5.9633 |
| 0.3943 | 5.8640 | 7500 | inf | 19.8160 | 5.4331 |
| 0.1971 | 6.2549 | 8000 | inf | 18.7778 | 5.1171 |
| 0.1898 | 6.6458 | 8500 | inf | 18.2018 | 4.8967 |
| 0.2553 | 7.0367 | 9000 | inf | 18.1142 | 4.9026 |
| 0.2291 | 7.4277 | 9500 | inf | 17.5730 | 4.7604 |
| 0.2655 | 7.8186 | 10000 | inf | 17.8979 | 4.8286 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0