---
library_name: transformers
language:
  - eu
license: apache-2.0
base_model: openai/whisper-tiny
tags:
  - whisper-event
  - generated_from_trainer
datasets:
  - mozilla-foundation/common_voice_17_0
metrics:
  - wer
model-index:
  - name: Whisper Tiny Basque
    results:
      - task:
          name: Automatic Speech Recognition
          type: automatic-speech-recognition
        dataset:
          name: mozilla-foundation/common_voice_17_0 eu
          type: mozilla-foundation/common_voice_17_0
          config: eu
          split: test
          args: eu
        metrics:
          - name: Wer
            type: wer
            value: 20.124409102568798
---

# Whisper Tiny Basque

This model is a fine-tuned version of [openai/whisper-tiny](https://huggingface.co/openai/whisper-tiny) on the mozilla-foundation/common_voice_17_0 eu dataset. It achieves the following results on the evaluation set:

- Loss: 0.4350
- Wer: 20.1244
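
The card provides no usage example, so here is a minimal sketch of transcribing Basque audio with this kind of fine-tuned Whisper checkpoint via the `transformers` ASR pipeline. The repository id `zuazo/whisper-tiny-eu` and the file name `audio.wav` are placeholders, not taken from the card.

```python
import torch
from transformers import pipeline

# Load the fine-tuned checkpoint into the speech-recognition pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="zuazo/whisper-tiny-eu",  # hypothetical repository id for this checkpoint
    device=0 if torch.cuda.is_available() else -1,
)

# Any audio file ffmpeg can decode works; it is resampled to 16 kHz internally.
result = asr(
    "audio.wav",  # placeholder path
    generate_kwargs={"language": "basque", "task": "transcribe"},
)
print(result["text"])
```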

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3.75e-05
- train_batch_size: 256
- eval_batch_size: 128
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 40000
- mixed_precision_training: Native AMP
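
These values map onto Hugging Face `Seq2SeqTrainingArguments` roughly as in the sketch below. This is a reconstruction under assumptions, not the original training script: the output directory, save cadence, fp16 flag, and per-device batch split are inferred or invented for illustration (the 1000-step evaluation cadence matches the results table).

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-tiny-eu",   # assumed, not stated on the card
    learning_rate=3.75e-5,
    per_device_train_batch_size=256,  # card lists train_batch_size: 256; device split is an assumption
    per_device_eval_batch_size=128,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=40000,
    fp16=True,                        # "Native AMP" mixed precision
    eval_strategy="steps",
    eval_steps=1000,                  # matches the evaluation cadence in the results table
    save_steps=1000,                  # assumed
    predict_with_generate=True,
)
```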

### Training results

| Training Loss | Epoch    | Step  | Validation Loss | Wer     |
|:-------------:|:--------:|:-----:|:---------------:|:-------:|
| 0.0348        | 9.3458   | 1000  | 0.3382          | 22.7152 |
| 0.0021        | 18.6916  | 2000  | 0.4092          | 21.7844 |
| 0.0009        | 28.0374  | 3000  | 0.4509          | 21.9026 |
| 0.0023        | 37.3832  | 4000  | 0.4062          | 20.7181 |
| 0.0003        | 46.7290  | 5000  | 0.4350          | 20.1244 |
| 0.0002        | 56.0748  | 6000  | 0.4546          | 20.1702 |
| 0.0001        | 65.4206  | 7000  | 0.4745          | 20.2179 |
| 0.0001        | 74.7664  | 8000  | 0.4941          | 20.1995 |
| 0.0           | 84.1121  | 9000  | 0.5142          | 20.3342 |
| 0.0           | 93.4579  | 10000 | 0.5353          | 20.4386 |
| 0.0           | 102.8037 | 11000 | 0.5567          | 20.5495 |
| 0.0           | 112.1495 | 12000 | 0.5788          | 20.6100 |
| 0.0           | 121.4953 | 13000 | 0.6023          | 20.6988 |
| 0.0           | 130.8411 | 14000 | 0.6253          | 20.7767 |
| 0.0           | 140.1869 | 15000 | 0.6486          | 20.8683 |
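
The Wer column is the word error rate in percent. A score like this is typically computed with the `evaluate` library, as in the sketch below; the transcripts are purely illustrative, and the card does not state which text normalization was applied.

```python
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["kaixo mundu hau da proba bat"]    # hypothetical model outputs
references = ["kaixo mundua hau da proba bat"]    # hypothetical reference transcripts

# compute() returns a fraction; multiply by 100 for the percent scale used above.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")
```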

### Framework versions

- Transformers 4.52.3
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.1