
hubert_zeroth_gpu_freeze

This model is a fine-tuned version of facebook/hubert-base-ls960 on the zeroth_korean_asr dataset. It achieves the following results on the evaluation set:

  • Loss: 4.8310
  • Wer: 1.0 (a word error rate of 100%, i.e. this checkpoint does not yet produce usable transcriptions; a loading sketch follows these results)
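
Since this is a HuBERT checkpoint fine-tuned for CTC-based ASR, it can in principle be loaded as shown below. This is a minimal sketch, not an official usage example: the repository id is a placeholder, and it assumes the repo ships a CTC head and a Wav2Vec2-style processor. Given the WER of 1.0 above, the decoded text will not be a meaningful transcription.

```python
# Minimal loading sketch. Assumptions: the repo contains a HubertForCTC head and
# a Wav2Vec2Processor; "<namespace>/hubert_zeroth_gpu_freeze" is a placeholder id.
import numpy as np
import torch
from transformers import HubertForCTC, Wav2Vec2Processor

model_id = "<namespace>/hubert_zeroth_gpu_freeze"  # replace with the actual repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = HubertForCTC.from_pretrained(model_id).eval()

# hubert-base-ls960 expects 16 kHz mono audio; one second of silence as a stand-in.
speech = np.zeros(16_000, dtype=np.float32)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```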

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments sketch follows this list):

  • learning_rate: 0.0003
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 30
  • mixed_precision_training: Native AMP
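
For readers who want to reproduce the setup, the hyperparameters above map onto transformers.TrainingArguments roughly as follows. This is a sketch, not the author's actual script: the output path and anything not listed above are assumptions. Note the effective batch size of 32 is simply 16 per device × 2 accumulation steps.

```python
# Hedged reconstruction of the training configuration above (Transformers 4.24).
# Only the hyperparameters listed in the card are taken as given; the output
# path is an assumption.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert_zeroth_gpu_freeze",  # assumed output path
    learning_rate=3e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_steps=500,
    seed=42,
    fp16=True,  # "Native AMP" mixed precision
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```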

Training results

| Training Loss | Epoch | Step  | Validation Loss | Wer |
|:-------------:|:-----:|:-----:|:---------------:|:---:|
| 26.2877 | 0.14 | 100 | 10.6810 | 1.0 |
| 6.4696 | 0.29 | 200 | 4.8799 | 1.0 |
| 4.841 | 0.43 | 300 | 4.8521 | 1.0 |
| 4.8366 | 0.57 | 400 | 4.8736 | 1.0 |
| 4.8311 | 0.72 | 500 | 4.8559 | 1.0 |
| 4.8383 | 0.86 | 600 | 4.8601 | 1.0 |
| 4.8288 | 1.01 | 700 | 4.8474 | 1.0 |
| 4.8283 | 1.15 | 800 | 4.8436 | 1.0 |
| 4.8283 | 1.29 | 900 | 4.8440 | 1.0 |
| 4.8299 | 1.44 | 1000 | 4.8518 | 1.0 |
| 4.8274 | 1.58 | 1100 | 4.8406 | 1.0 |
| 4.8308 | 1.72 | 1200 | 4.8384 | 1.0 |
| 4.8316 | 1.87 | 1300 | 4.8427 | 1.0 |
| 4.8298 | 2.01 | 1400 | 4.8423 | 1.0 |
| 4.8291 | 2.16 | 1500 | 4.8481 | 1.0 |
| 4.8326 | 2.3 | 1600 | 4.8426 | 1.0 |
| 4.83 | 2.44 | 1700 | 4.8362 | 1.0 |
| 4.8286 | 2.59 | 1800 | 4.8424 | 1.0 |
| 4.8269 | 2.73 | 1900 | 4.8362 | 1.0 |
| 4.8234 | 2.87 | 2000 | 4.8452 | 1.0 |
| 4.8179 | 3.02 | 2100 | 4.8416 | 1.0 |
| 4.825 | 3.16 | 2200 | 4.8519 | 1.0 |
| 4.8185 | 3.3 | 2300 | 4.8384 | 1.0 |
| 4.827 | 3.45 | 2400 | 4.8519 | 1.0 |
| 4.8316 | 3.59 | 2500 | 4.8467 | 1.0 |
| 4.825 | 3.74 | 2600 | 4.8465 | 1.0 |
| 4.8246 | 3.88 | 2700 | 4.8422 | 1.0 |
| 4.8228 | 4.02 | 2800 | 4.8326 | 1.0 |
| 4.8277 | 4.17 | 2900 | 4.8353 | 1.0 |
| 4.822 | 4.31 | 3000 | 4.8349 | 1.0 |
| 4.82 | 4.45 | 3100 | 4.8395 | 1.0 |
| 4.8252 | 4.6 | 3200 | 4.8350 | 1.0 |
| 4.8283 | 4.74 | 3300 | 4.8377 | 1.0 |
| 4.8229 | 4.89 | 3400 | 4.8344 | 1.0 |
| 4.8264 | 5.03 | 3500 | 4.8352 | 1.0 |
| 4.8237 | 5.17 | 3600 | 4.8337 | 1.0 |
| 4.8271 | 5.32 | 3700 | 4.8385 | 1.0 |
| 4.8332 | 5.46 | 3800 | 4.8392 | 1.0 |
| 4.8189 | 5.6 | 3900 | 4.8353 | 1.0 |
| 4.8209 | 5.75 | 4000 | 4.8355 | 1.0 |
| 4.8179 | 5.89 | 4100 | 4.8297 | 1.0 |
| 4.821 | 6.03 | 4200 | 4.8505 | 1.0 |
| 4.8243 | 6.18 | 4300 | 4.8371 | 1.0 |
| 4.8224 | 6.32 | 4400 | 4.8378 | 1.0 |
| 4.8261 | 6.47 | 4500 | 4.8368 | 1.0 |
| 4.8233 | 6.61 | 4600 | 4.8326 | 1.0 |
| 4.8252 | 6.75 | 4700 | 4.8364 | 1.0 |
| 4.8247 | 6.9 | 4800 | 4.8438 | 1.0 |
| 4.8139 | 7.04 | 4900 | 4.8435 | 1.0 |
| 4.8204 | 7.18 | 5000 | 4.8398 | 1.0 |
| 4.8197 | 7.33 | 5100 | 4.8382 | 1.0 |
| 4.82 | 7.47 | 5200 | 4.8371 | 1.0 |
| 4.8266 | 7.61 | 5300 | 4.8431 | 1.0 |
| 4.826 | 7.76 | 5400 | 4.8390 | 1.0 |
| 4.8216 | 7.9 | 5500 | 4.8381 | 1.0 |
| 4.82 | 8.05 | 5600 | 4.8339 | 1.0 |
| 4.8281 | 8.19 | 5700 | 4.8316 | 1.0 |
| 4.8246 | 8.33 | 5800 | 4.8361 | 1.0 |
| 4.8169 | 8.48 | 5900 | 4.8338 | 1.0 |
| 4.8175 | 8.62 | 6000 | 4.8341 | 1.0 |
| 4.8283 | 8.76 | 6100 | 4.8358 | 1.0 |
| 4.8232 | 8.91 | 6200 | 4.8356 | 1.0 |
| 4.8193 | 9.05 | 6300 | 4.8325 | 1.0 |
| 4.8146 | 9.2 | 6400 | 4.8297 | 1.0 |
| 4.8207 | 9.34 | 6500 | 4.8283 | 1.0 |
| 4.8221 | 9.48 | 6600 | 4.8334 | 1.0 |
| 4.8229 | 9.63 | 6700 | 4.8308 | 1.0 |
| 4.8239 | 9.77 | 6800 | 4.8352 | 1.0 |
| 4.8245 | 9.91 | 6900 | 4.8314 | 1.0 |
| 4.8173 | 10.06 | 7000 | 4.8300 | 1.0 |
| 4.8189 | 10.2 | 7100 | 4.8341 | 1.0 |
| 4.8209 | 10.34 | 7200 | 4.8287 | 1.0 |
| 4.823 | 10.49 | 7300 | 4.8320 | 1.0 |
| 4.8226 | 10.63 | 7400 | 4.8273 | 1.0 |
| 4.8241 | 10.78 | 7500 | 4.8308 | 1.0 |
| 4.8177 | 10.92 | 7600 | 4.8316 | 1.0 |
| 4.8235 | 11.06 | 7700 | 4.8274 | 1.0 |
| 4.8188 | 11.21 | 7800 | 4.8290 | 1.0 |
| 4.8183 | 11.35 | 7900 | 4.8355 | 1.0 |
| 4.8226 | 11.49 | 8000 | 4.8312 | 1.0 |
| 4.8209 | 11.64 | 8100 | 4.8307 | 1.0 |
| 4.8208 | 11.78 | 8200 | 4.8300 | 1.0 |
| 4.8221 | 11.93 | 8300 | 4.8281 | 1.0 |
| 4.82 | 12.07 | 8400 | 4.8306 | 1.0 |
| 4.8199 | 12.21 | 8500 | 4.8343 | 1.0 |
| 4.8212 | 12.36 | 8600 | 4.8314 | 1.0 |
| 4.8212 | 12.5 | 8700 | 4.8309 | 1.0 |
| 4.8228 | 12.64 | 8800 | 4.8310 | 1.0 |
| 4.8225 | 12.79 | 8900 | 4.8325 | 1.0 |
| 4.8146 | 12.93 | 9000 | 4.8364 | 1.0 |
| 4.8174 | 13.07 | 9100 | 4.8328 | 1.0 |
| 4.816 | 13.22 | 9200 | 4.8338 | 1.0 |
| 4.822 | 13.36 | 9300 | 4.8378 | 1.0 |
| 4.8253 | 13.51 | 9400 | 4.8411 | 1.0 |
| 4.8173 | 13.65 | 9500 | 4.8379 | 1.0 |
| 4.8227 | 13.79 | 9600 | 4.8374 | 1.0 |
| 4.8138 | 13.94 | 9700 | 4.8372 | 1.0 |
| 4.8191 | 14.08 | 9800 | 4.8327 | 1.0 |
| 4.8259 | 14.22 | 9900 | 4.8335 | 1.0 |
| 4.8098 | 14.37 | 10000 | 4.8301 | 1.0 |
| 4.8248 | 14.51 | 10100 | 4.8315 | 1.0 |
| 4.8199 | 14.66 | 10200 | 4.8304 | 1.0 |
| 4.8202 | 14.8 | 10300 | 4.8312 | 1.0 |
| 4.8159 | 14.94 | 10400 | 4.8316 | 1.0 |
| 4.8181 | 15.09 | 10500 | 4.8306 | 1.0 |
| 4.8217 | 15.23 | 10600 | 4.8350 | 1.0 |
| 4.8095 | 15.37 | 10700 | 4.8328 | 1.0 |
| 4.8249 | 15.52 | 10800 | 4.8329 | 1.0 |
| 4.8178 | 15.66 | 10900 | 4.8355 | 1.0 |
| 4.8192 | 15.8 | 11000 | 4.8342 | 1.0 |
| 4.8249 | 15.95 | 11100 | 4.8366 | 1.0 |
| 4.8096 | 16.09 | 11200 | 4.8385 | 1.0 |
| 4.8196 | 16.24 | 11300 | 4.8390 | 1.0 |
| 4.8271 | 16.38 | 11400 | 4.8352 | 1.0 |
| 4.8166 | 16.52 | 11500 | 4.8371 | 1.0 |
| 4.8206 | 16.67 | 11600 | 4.8348 | 1.0 |
| 4.817 | 16.81 | 11700 | 4.8347 | 1.0 |
| 4.8165 | 16.95 | 11800 | 4.8386 | 1.0 |
| 4.8159 | 17.1 | 11900 | 4.8376 | 1.0 |
| 4.8202 | 17.24 | 12000 | 4.8374 | 1.0 |
| 4.8157 | 17.39 | 12100 | 4.8370 | 1.0 |
| 4.8175 | 17.53 | 12200 | 4.8405 | 1.0 |
| 4.8189 | 17.67 | 12300 | 4.8321 | 1.0 |
| 4.8167 | 17.82 | 12400 | 4.8322 | 1.0 |
| 4.8229 | 17.96 | 12500 | 4.8353 | 1.0 |
| 4.8179 | 18.1 | 12600 | 4.8322 | 1.0 |
| 4.8183 | 18.25 | 12700 | 4.8379 | 1.0 |
| 4.8151 | 18.39 | 12800 | 4.8375 | 1.0 |
| 4.8211 | 18.53 | 12900 | 4.8355 | 1.0 |
| 4.8241 | 18.68 | 13000 | 4.8352 | 1.0 |
| 4.8185 | 18.82 | 13100 | 4.8350 | 1.0 |
| 4.8175 | 18.97 | 13200 | 4.8352 | 1.0 |
| 4.8094 | 19.11 | 13300 | 4.8337 | 1.0 |
| 4.8149 | 19.25 | 13400 | 4.8344 | 1.0 |
| 4.8131 | 19.4 | 13500 | 4.8386 | 1.0 |
| 4.8227 | 19.54 | 13600 | 4.8350 | 1.0 |
| 4.8175 | 19.68 | 13700 | 4.8325 | 1.0 |
| 4.8204 | 19.83 | 13800 | 4.8344 | 1.0 |
| 4.8228 | 19.97 | 13900 | 4.8322 | 1.0 |
| 4.8177 | 20.11 | 14000 | 4.8365 | 1.0 |
| 4.824 | 20.26 | 14100 | 4.8338 | 1.0 |
| 4.8151 | 20.4 | 14200 | 4.8342 | 1.0 |
| 4.8189 | 20.55 | 14300 | 4.8339 | 1.0 |
| 4.8115 | 20.69 | 14400 | 4.8325 | 1.0 |
| 4.8162 | 20.83 | 14500 | 4.8291 | 1.0 |
| 4.8182 | 20.98 | 14600 | 4.8321 | 1.0 |
| 4.8189 | 21.12 | 14700 | 4.8314 | 1.0 |
| 4.8123 | 21.26 | 14800 | 4.8318 | 1.0 |
| 4.8165 | 21.41 | 14900 | 4.8320 | 1.0 |
| 4.8247 | 21.55 | 15000 | 4.8315 | 1.0 |
| 4.8165 | 21.7 | 15100 | 4.8311 | 1.0 |
| 4.8151 | 21.84 | 15200 | 4.8352 | 1.0 |
| 4.8234 | 21.98 | 15300 | 4.8298 | 1.0 |
| 4.8136 | 22.13 | 15400 | 4.8282 | 1.0 |
| 4.8179 | 22.27 | 15500 | 4.8297 | 1.0 |
| 4.8128 | 22.41 | 15600 | 4.8307 | 1.0 |
| 4.8216 | 22.56 | 15700 | 4.8290 | 1.0 |
| 4.8177 | 22.7 | 15800 | 4.8286 | 1.0 |
| 4.8209 | 22.84 | 15900 | 4.8311 | 1.0 |
| 4.8183 | 22.99 | 16000 | 4.8276 | 1.0 |
| 4.8135 | 23.13 | 16100 | 4.8284 | 1.0 |
| 4.8116 | 23.28 | 16200 | 4.8279 | 1.0 |
| 4.8161 | 23.42 | 16300 | 4.8291 | 1.0 |
| 4.8202 | 23.56 | 16400 | 4.8292 | 1.0 |
| 4.8199 | 23.71 | 16500 | 4.8298 | 1.0 |
| 4.8203 | 23.85 | 16600 | 4.8293 | 1.0 |
| 4.8177 | 23.99 | 16700 | 4.8286 | 1.0 |
| 4.8153 | 24.14 | 16800 | 4.8273 | 1.0 |
| 4.8202 | 24.28 | 16900 | 4.8260 | 1.0 |
| 4.8189 | 24.43 | 17000 | 4.8289 | 1.0 |
| 4.8219 | 24.57 | 17100 | 4.8279 | 1.0 |
| 4.8148 | 24.71 | 17200 | 4.8284 | 1.0 |
| 4.8113 | 24.86 | 17300 | 4.8286 | 1.0 |
| 4.8133 | 25.0 | 17400 | 4.8299 | 1.0 |
| 4.8164 | 25.14 | 17500 | 4.8309 | 1.0 |
| 4.8231 | 25.29 | 17600 | 4.8279 | 1.0 |
| 4.8135 | 25.43 | 17700 | 4.8296 | 1.0 |
| 4.8118 | 25.57 | 17800 | 4.8293 | 1.0 |
| 4.8139 | 25.72 | 17900 | 4.8279 | 1.0 |
| 4.8144 | 25.86 | 18000 | 4.8281 | 1.0 |
| 4.8207 | 26.01 | 18100 | 4.8284 | 1.0 |
| 4.8096 | 26.15 | 18200 | 4.8285 | 1.0 |
| 4.8177 | 26.29 | 18300 | 4.8275 | 1.0 |
| 4.8221 | 26.44 | 18400 | 4.8288 | 1.0 |
| 4.8147 | 26.58 | 18500 | 4.8281 | 1.0 |
| 4.8148 | 26.72 | 18600 | 4.8281 | 1.0 |
| 4.819 | 26.87 | 18700 | 4.8282 | 1.0 |
| 4.8138 | 27.01 | 18800 | 4.8297 | 1.0 |
| 4.8094 | 27.16 | 18900 | 4.8291 | 1.0 |
| 4.8236 | 27.3 | 19000 | 4.8288 | 1.0 |
| 4.8208 | 27.44 | 19100 | 4.8292 | 1.0 |
| 4.816 | 27.59 | 19200 | 4.8279 | 1.0 |
| 4.8103 | 27.73 | 19300 | 4.8290 | 1.0 |
| 4.8152 | 27.87 | 19400 | 4.8296 | 1.0 |
| 4.8158 | 28.02 | 19500 | 4.8304 | 1.0 |
| 4.8122 | 28.16 | 19600 | 4.8293 | 1.0 |
| 4.8199 | 28.3 | 19700 | 4.8293 | 1.0 |
| 4.8185 | 28.45 | 19800 | 4.8287 | 1.0 |
| 4.8198 | 28.59 | 19900 | 4.8294 | 1.0 |
| 4.8102 | 28.74 | 20000 | 4.8291 | 1.0 |
| 4.8168 | 28.88 | 20100 | 4.8290 | 1.0 |
| 4.8117 | 29.02 | 20200 | 4.8303 | 1.0 |
| 4.8156 | 29.17 | 20300 | 4.8295 | 1.0 |
| 4.8127 | 29.31 | 20400 | 4.8298 | 1.0 |
| 4.8193 | 29.45 | 20500 | 4.8301 | 1.0 |
| 4.8174 | 29.6 | 20600 | 4.8301 | 1.0 |
| 4.8167 | 29.74 | 20700 | 4.8301 | 1.0 |
| 4.8137 | 29.89 | 20800 | 4.8310 | 1.0 |
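
For context, the Wer column is word error rate: word-level edit distance divided by the number of reference words, so 1.0 means the edits equal the reference length. The sketch below illustrates the metric with the jiwer package; this is an assumption about tooling, not the card's actual evaluation code.

```python
# Hedged illustration of the Wer metric reported above. jiwer is one common
# WER implementation; the underlying edit-distance definition is standard.
import jiwer

reference = ["안녕하세요 반갑습니다"]  # ground-truth transcript (2 words)
hypothesis = ["전혀 다른"]            # both words wrong: 2 substitutions / 2 words
print(jiwer.wer(reference, hypothesis))  # -> 1.0
```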

Framework versions

  • Transformers 4.24.0
  • Pytorch 1.13.0+cu117
  • Datasets 2.0.0
  • Tokenizers 0.13.2