# wav2vec2-xls-r-1b-scandinavian-E6-25h-30-epochs-20250208_v6

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-1b](https://huggingface.co/facebook/wav2vec2-xls-r-1b) on an unknown dataset. It achieves the following results on the evaluation set (the infinite loss is typically a numerical-overflow artifact of mixed-precision training; the WER/CER values remain meaningful):

- Loss: inf
- Wer: 30.3141
- Cer: 10.6693
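
As a quick illustration of how the reported metrics are defined, here is a minimal pure-Python sketch of word error rate (WER) and character error rate (CER) via Levenshtein edit distance. This is a generic reference implementation, not the evaluation script actually used for this model:

```python
def edit_distance(ref, hyp):
    """Levenshtein distance between two sequences, O(len(hyp)) memory."""
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))  # distances from the empty prefix of ref
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(
                dp[j] + 1,                            # deletion
                dp[j - 1] + 1,                        # insertion
                prev + (ref[i - 1] != hyp[j - 1]),    # substitution (0 if equal)
            )
            prev = cur
    return dp[n]

def wer(reference, hypothesis):
    """Word error rate in percent, matching the scale used in this card."""
    ref_words = reference.split()
    return 100.0 * edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    """Character error rate in percent."""
    return 100.0 * edit_distance(reference, hypothesis) / len(reference)
```

A WER of 30.31 therefore means that roughly 30 word-level edits (substitutions, insertions, deletions) are needed per 100 reference words.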

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32 (train_batch_size × gradient_accumulation_steps)
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5000
- num_epochs: 30
- mixed_precision_training: Native AMP
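
The learning-rate trajectory implied by these settings can be sketched as below: linear warmup over 5,000 steps, then cosine decay to zero. The total step count (~37,260) is inferred from the results table (~1,242 optimizer steps per epoch × 30 epochs) and is an assumption, as is the exact decay formula, which follows the common warmup-plus-cosine recipe rather than any verified Trainer internals:

```python
import math

def lr_at(step, max_lr=1e-4, warmup=5000, total=37260):
    """Learning rate at a given optimizer step: linear warmup, then cosine decay.

    `total` is estimated from the training log (assumption, not from the config).
    """
    if step < warmup:
        return max_lr * step / warmup          # linear ramp from 0 to max_lr
    progress = (step - warmup) / (total - warmup)
    return max_lr * 0.5 * (1.0 + math.cos(math.pi * progress))  # cosine to 0
```

For example, the peak rate of 1e-4 is reached exactly at step 5,000 and decays smoothly toward zero by the final step.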

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.1557 | 0.8052 | 1000 | inf | 82.2931 | 29.3959 |
| 1.2064 | 1.6103 | 2000 | inf | 60.0400 | 20.7421 |
| 1.0098 | 2.4155 | 3000 | inf | 51.4387 | 17.2982 |
| 0.5777 | 3.2206 | 4000 | inf | 50.0820 | 16.9330 |
| 0.4035 | 4.0258 | 5000 | inf | 49.7392 | 16.9217 |
| 0.4148 | 4.8309 | 6000 | inf | 48.6947 | 16.5653 |
| 0.3909 | 5.6361 | 7000 | inf | 47.2167 | 16.1521 |
| 0.3957 | 6.4412 | 8000 | inf | 46.7938 | 15.9950 |
| 0.5195 | 7.2464 | 9000 | inf | 45.6720 | 15.6448 |
| 0.4977 | 8.0515 | 10000 | inf | 41.3393 | 14.0356 |
| 0.4793 | 8.8567 | 11000 | inf | 41.2713 | 14.0323 |
| 0.4175 | 9.6618 | 12000 | inf | 40.5189 | 13.7936 |
| 0.2602 | 10.4670 | 13000 | inf | 38.5700 | 13.2449 |
| 0.2566 | 11.2721 | 14000 | inf | 37.8977 | 13.0691 |
| 0.2215 | 12.0773 | 15000 | inf | 39.2877 | 13.5549 |
| 0.2557 | 12.8824 | 16000 | inf | 37.8510 | 12.8671 |
| 0.3672 | 13.6876 | 17000 | inf | 38.6660 | 13.1300 |
| 0.3282 | 14.4928 | 18000 | inf | 37.3481 | 12.7591 |
| 0.3101 | 15.2979 | 19000 | inf | 35.9034 | 12.3669 |
| 0.1821 | 16.1031 | 20000 | inf | 35.0097 | 12.1182 |
| 0.1719 | 16.9082 | 21000 | inf | 35.7754 | 12.4047 |
| 0.1529 | 17.7134 | 22000 | inf | 34.9030 | 12.0927 |
| 0.1338 | 18.5185 | 23000 | inf | 34.3040 | 11.8955 |
| 0.2186 | 19.3237 | 24000 | inf | 33.5490 | 11.7366 |
| 0.2311 | 20.1288 | 25000 | inf | 33.3462 | 11.5700 |
| 0.2306 | 20.9340 | 26000 | inf | 32.6766 | 11.3913 |
| 0.2267 | 21.7391 | 27000 | inf | 33.3049 | 11.4791 |
| 0.1381 | 22.5443 | 28000 | inf | 32.1603 | 11.2660 |
| 0.0838 | 23.3494 | 29000 | inf | 32.0910 | 11.2171 |
| 0.0821 | 24.1546 | 30000 | inf | 31.2506 | 10.9537 |
| 0.077 | 24.9597 | 31000 | inf | 31.0865 | 10.9077 |
| 0.1248 | 25.7649 | 32000 | inf | 30.6210 | 10.7582 |
| 0.1548 | 26.5700 | 33000 | inf | 30.7423 | 10.8239 |
| 0.1648 | 27.3752 | 34000 | inf | 30.2901 | 10.6713 |
| 0.1303 | 28.1804 | 35000 | inf | 30.3675 | 10.6884 |
| 0.135 | 28.9855 | 36000 | inf | 30.3075 | 10.6664 |
| 0.062 | 29.7907 | 37000 | inf | 30.3141 | 10.6693 |

### Framework versions

- Transformers 4.48.2
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0