# wav2vec2-xls-r-1b-E6-faroese-100h-30-epochs_20250209
This model is a fine-tuned version of davidilag/wav2vec2-xls-r-1b-scandinavian-E6-25h-30-epochs-20250208_v6 on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.1050
- WER: 18.8307
- CER: 4.0997
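The WER and CER values above are percentages: the word- and character-level Levenshtein edit distance between the model's hypothesis and the reference transcript, divided by the reference length. A minimal pure-Python sketch of the metric (illustrative only; the example sentences are made up and this is not the evaluation code used for this run):

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences, single-row DP."""
    dp = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        prev, dp[0] = dp[0], i
        for j, h in enumerate(hyp, 1):
            prev, dp[j] = dp[j], min(
                dp[j] + 1,          # deletion
                dp[j - 1] + 1,      # insertion
                prev + (r != h),    # substitution (free on match)
            )
    return dp[-1]

def error_rate(reference, hypothesis, unit="word"):
    """WER (unit='word') or CER (unit='char') as a percentage."""
    ref = reference.split() if unit == "word" else list(reference)
    hyp = hypothesis.split() if unit == "word" else list(hypothesis)
    return 100.0 * edit_distance(ref, hyp) / len(ref)
```

For example, `error_rate("hon er heima", "hon var heima")` counts one substituted word out of three.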
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5000
- num_epochs: 30
- mixed_precision_training: Native AMP
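The schedule above is a linear warmup over 5,000 optimizer steps followed by cosine decay, and the effective batch size is the per-device batch times the gradient-accumulation factor. A sketch of that schedule (the `total_steps` value is assumed for illustration; the training log below ends near step 61,000):

```python
import math

def lr_at_step(step, peak_lr=1e-4, warmup_steps=5000, total_steps=61000):
    """Linear warmup to peak_lr, then cosine decay to zero."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps
    progress = (step - warmup_steps) / (total_steps - warmup_steps)
    return peak_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

# Effective batch size: per-device train batch * gradient accumulation.
effective_batch = 16 * 2  # = 32, matching total_train_batch_size above
```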
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER | CER |
|---|---|---|---|---|---|
0.547 | 0.4877 | 1000 | 0.3462 | 41.2213 | 11.5859 |
0.4224 | 0.9754 | 2000 | 0.2269 | 32.1408 | 8.5514 |
0.3633 | 1.4628 | 3000 | 0.2090 | 30.8719 | 8.0929 |
0.3533 | 1.9505 | 4000 | 0.2088 | 30.2815 | 7.8933 |
0.2868 | 2.4379 | 5000 | 0.2205 | 31.0349 | 8.1158 |
0.2877 | 2.9256 | 6000 | 0.2017 | 29.4797 | 7.5548 |
0.2775 | 3.4131 | 7000 | 0.1781 | 28.1447 | 7.2258 |
0.2577 | 3.9008 | 8000 | 0.1727 | 28.0213 | 6.9702 |
0.231 | 4.3882 | 9000 | 0.1703 | 27.1886 | 6.8187 |
0.2474 | 4.8759 | 10000 | 0.1670 | 27.0785 | 6.7642 |
0.2099 | 5.3633 | 11000 | 0.1699 | 26.2766 | 6.5559 |
0.2237 | 5.8510 | 12000 | 0.1597 | 26.0563 | 6.4605 |
0.1814 | 6.3385 | 13000 | 0.1475 | 25.0738 | 6.1259 |
0.1749 | 6.8261 | 14000 | 0.1475 | 25.3514 | 6.2017 |
0.1635 | 7.3136 | 15000 | 0.1557 | 24.9284 | 6.0273 |
0.1556 | 7.8013 | 16000 | 0.1540 | 24.7169 | 5.9942 |
0.14 | 8.2887 | 17000 | 0.1444 | 24.3160 | 5.8545 |
0.1444 | 8.7764 | 18000 | 0.1433 | 24.0913 | 5.8214 |
0.1144 | 9.2638 | 19000 | 0.1367 | 23.8974 | 5.7070 |
0.1347 | 9.7515 | 20000 | 0.1364 | 23.6771 | 5.6446 |
0.1157 | 10.2390 | 21000 | 0.1308 | 23.3511 | 5.5428 |
0.1175 | 10.7267 | 22000 | 0.1259 | 22.9854 | 5.4426 |
0.116 | 11.2141 | 23000 | 0.1358 | 22.8444 | 5.3756 |
0.1112 | 11.7018 | 24000 | 0.1213 | 22.7255 | 5.3125 |
0.096 | 12.1892 | 25000 | 0.1305 | 22.4303 | 5.2722 |
0.097 | 12.6769 | 26000 | 0.1295 | 22.5052 | 5.3061 |
0.0805 | 13.1644 | 27000 | 0.1261 | 21.7650 | 5.0600 |
0.083 | 13.6520 | 28000 | 0.1234 | 21.9412 | 5.0868 |
0.0716 | 14.1395 | 29000 | 0.1292 | 21.7782 | 5.0355 |
0.0761 | 14.6272 | 30000 | 0.1184 | 21.5271 | 4.9779 |
0.0736 | 15.1146 | 31000 | 0.1198 | 21.5095 | 4.9116 |
0.0713 | 15.6023 | 32000 | 0.1161 | 21.2407 | 4.8517 |
0.0687 | 16.0897 | 33000 | 0.1176 | 21.2671 | 4.8603 |
0.0526 | 16.5774 | 34000 | 0.1221 | 20.9191 | 4.7286 |
0.0517 | 17.0649 | 35000 | 0.1182 | 20.7781 | 4.7081 |
0.0624 | 17.5525 | 36000 | 0.1165 | 20.7428 | 4.6757 |
0.0476 | 18.0400 | 37000 | 0.1186 | 20.6239 | 4.6741 |
0.0437 | 18.5277 | 38000 | 0.1243 | 20.5754 | 4.6513 |
0.0489 | 19.0151 | 39000 | 0.1117 | 20.2934 | 4.5447 |
0.0445 | 19.5028 | 40000 | 0.1138 | 20.1789 | 4.5274 |
0.042 | 19.9905 | 41000 | 0.1108 | 19.9542 | 4.4477 |
0.0502 | 20.4779 | 42000 | 0.1119 | 19.9101 | 4.4374 |
0.0431 | 20.9656 | 43000 | 0.1108 | 19.8881 | 4.4003 |
0.0351 | 21.4531 | 44000 | 0.1097 | 19.8132 | 4.3830 |
0.0419 | 21.9407 | 45000 | 0.1124 | 19.7559 | 4.3869 |
0.0288 | 22.4282 | 46000 | 0.1095 | 19.4783 | 4.3136 |
0.0342 | 22.9159 | 47000 | 0.1117 | 19.5753 | 4.3285 |
0.0362 | 23.4033 | 48000 | 0.1075 | 19.4343 | 4.2686 |
0.0388 | 23.8910 | 49000 | 0.1075 | 19.3682 | 4.2607 |
0.0334 | 24.3784 | 50000 | 0.1121 | 19.2052 | 4.2181 |
0.0267 | 24.8661 | 51000 | 0.1054 | 19.0862 | 4.1723 |
0.0338 | 25.3536 | 52000 | 0.1084 | 19.0113 | 4.1629 |
0.0291 | 25.8413 | 53000 | 0.1060 | 19.0950 | 4.1676 |
0.0274 | 26.3287 | 54000 | 0.1071 | 18.9144 | 4.1258 |
0.0255 | 26.8164 | 55000 | 0.1048 | 18.9188 | 4.1195 |
0.0227 | 27.3038 | 56000 | 0.1061 | 18.8968 | 4.1147 |
0.0302 | 27.7915 | 57000 | 0.1060 | 18.8659 | 4.1045 |
0.0298 | 28.2790 | 58000 | 0.1048 | 18.8791 | 4.1053 |
0.0281 | 28.7666 | 59000 | 0.1054 | 18.8659 | 4.1068 |
0.0341 | 29.2541 | 60000 | 0.1050 | 18.8351 | 4.1021 |
0.0393 | 29.7418 | 61000 | 0.1050 | 18.8307 | 4.0997 |
### Framework versions
- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
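To reproduce this environment, the pinned versions above can be installed with pip (a sketch, assuming the package names map one-to-one to the libraries listed; the CUDA 12.4 PyTorch build is served from the PyTorch wheel index):

```shell
pip install transformers==4.48.3 datasets==3.2.0 tokenizers==0.21.0
pip install torch==2.5.1 --index-url https://download.pytorch.org/whl/cu124
```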
## Base model

- facebook/wav2vec2-xls-r-1b