---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-1b
tags:
  - generated_from_trainer
model-index:
  - name: wav2vec2-1b-E4
    results: []
---

# wav2vec2-1b-E4

This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0115
  • CER: 22.0865
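
The reported CER (character error rate) is the character-level edit distance between the model's transcription and the reference, as a percentage of the reference length. The training run most likely computed it with an evaluation library such as `evaluate` or `jiwer`; the toy implementation below is only a sketch of the metric itself.

```python
def edit_distance(ref: str, hyp: str) -> int:
    """Levenshtein distance (insertions, deletions, substitutions)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, start=1):
        curr = [i]
        for j, h in enumerate(hyp, start=1):
            cost = 0 if r == h else 1
            curr.append(min(prev[j] + 1,          # deletion
                            curr[j - 1] + 1,      # insertion
                            prev[j - 1] + cost))  # substitution
        prev = curr
    return prev[-1]

def cer(reference: str, hypothesis: str) -> float:
    """CER as a percentage: character edits / reference length * 100."""
    return 100.0 * edit_distance(reference, hypothesis) / len(reference)

# One substitution over an 11-character reference, roughly 9.09%.
print(cer("hello world", "hallo world"))
```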

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 1
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 16
  • total_train_batch_size: 16
  • optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 50
  • num_epochs: 3
  • mixed_precision_training: Native AMP
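
A few of these values combine: the effective batch size is train_batch_size × gradient_accumulation_steps = 1 × 16 = 16, which matches total_train_batch_size. The learning-rate schedule (`linear` with 50 warmup steps) ramps up linearly, then decays linearly to zero. The sketch below mirrors the behaviour of `get_linear_schedule_with_warmup` in transformers; `total_steps=2200` is an assumption taken from the last logged step in the results table below, not a value stated in this card.

```python
def linear_lr(step: int, base_lr: float = 1e-4,
              warmup_steps: int = 50, total_steps: int = 2200) -> float:
    """Linear warmup for `warmup_steps`, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```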

### Training results

| Training Loss | Epoch  | Step | Validation Loss | CER     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 11.1609       | 0.2580 | 200  | 6.3728          | 92.8689 |
| 4.3491        | 0.5161 | 400  | 5.6784          | 92.8160 |
| 4.0824        | 0.7741 | 600  | 5.3062          | 86.6424 |
| 3.4061        | 1.0322 | 800  | 3.3096          | 68.3917 |
| 2.5185        | 1.2902 | 1000 | 2.6009          | 54.2587 |
| 2.0009        | 1.5483 | 1200 | 2.2565          | 47.9323 |
| 1.6149        | 1.8063 | 1400 | 1.5933          | 32.3602 |
| 1.3834        | 2.0643 | 1600 | 1.7115          | 35.8494 |
| 1.1505        | 2.3224 | 1800 | 1.1848          | 26.5449 |
| 1.0075        | 2.5804 | 2000 | 1.1573          | 25.9868 |
| 0.967         | 2.8385 | 2200 | 1.0115          | 22.0865 |

### Framework versions

  • Transformers 4.46.2
  • PyTorch 2.5.1+cu121
  • Datasets 3.1.0
  • Tokenizers 0.20.3