---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-1b
tags:
  - generated_from_trainer
model-index:
  - name: wav2vec2-1b-Y2
    results: []
---

# wav2vec2-1b-Y2

This model is a fine-tuned version of facebook/wav2vec2-xls-r-1b on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.9063
- Cer: 22.3214
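The checkpoint can be used for transcription with the `transformers` ASR pipeline. A minimal sketch, assuming the model is hosted on the Hub under the repo id `Gummybear05/wav2vec2-1b-Y2` (substitute the actual path of this checkpoint) and that `sample.wav` is a local audio file:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint into an automatic-speech-recognition pipeline.
# Repo id below is an assumption based on this card's name.
asr = pipeline(
    "automatic-speech-recognition",
    model="Gummybear05/wav2vec2-1b-Y2",
)

# wav2vec2 models expect 16 kHz mono audio; the pipeline resamples if needed.
result = asr("sample.wav")
print(result["text"])
```

Note that loading a 1B-parameter model requires several GB of memory; pass `device=0` to the pipeline to run on GPU if one is available.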

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- num_epochs: 5
- mixed_precision_training: Native AMP
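The hyperparameters above map onto a `transformers` `TrainingArguments` configuration roughly as follows. This is a sketch reconstructed from the list, not the exact training script; `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-1b-Y2",        # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=8,      # effective train batch size: 2 * 8 = 16
    lr_scheduler_type="linear",
    warmup_steps=50,
    num_train_epochs=5,
    fp16=True,                          # "Native AMP" mixed precision
)
```

Note that the listed `total_train_batch_size` of 16 is not set directly: it is the product of `per_device_train_batch_size` (2) and `gradient_accumulation_steps` (8).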

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Cer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 14.1704       | 0.2580 | 200  | 4.3519          | 97.8090 |
| 2.4465        | 0.5160 | 400  | 2.2289          | 45.2126 |
| 1.3213        | 0.7741 | 600  | 1.7263          | 39.3386 |
| 1.0017        | 1.0321 | 800  | 1.2111          | 29.5348 |
| 0.8038        | 1.2901 | 1000 | 1.4298          | 34.3339 |
| 0.7428        | 1.5481 | 1200 | 1.3785          | 32.6656 |
| 0.705         | 1.8062 | 1400 | 1.2020          | 30.1104 |
| 0.5742        | 2.0642 | 1600 | 1.1142          | 27.8959 |
| 0.4805        | 2.3222 | 1800 | 1.3923          | 33.2237 |
| 0.471         | 2.5802 | 2000 | 1.1341          | 27.9840 |
| 0.4327        | 2.8383 | 2200 | 0.9932          | 25.4582 |
| 0.3677        | 3.0963 | 2400 | 0.8940          | 22.6739 |
| 0.2976        | 3.3543 | 2600 | 0.9628          | 24.6182 |
| 0.2802        | 3.6123 | 2800 | 0.8832          | 22.1570 |
| 0.2637        | 3.8703 | 3000 | 0.9087          | 22.8207 |
| 0.2175        | 4.1284 | 3200 | 0.9277          | 22.7326 |
| 0.1932        | 4.3864 | 3400 | 0.9152          | 22.6328 |
| 0.1724        | 4.6444 | 3600 | 0.9290          | 22.6621 |
| 0.1693        | 4.9024 | 3800 | 0.9063          | 22.3214 |

### Framework versions

- Transformers 4.45.2
- Pytorch 2.3.1.post100
- Datasets 2.19.1
- Tokenizers 0.20.1