AlexN committed
Commit 3770788 • 1 Parent(s): 2368511

Training in progress, step 4500

pytorch_model.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:b18a05fe3e872cd4dda6f2ab42d9fe2011ebfa27c9fd51548851eb83eeedbd84
+oid sha256:555c567d63047da5c9226f4c383755306d075b9c037698e784e3f654f575b531
 size 1262817457
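
Both versions of pytorch_model.bin are Git LFS pointers: only the sha256 oid changes between commits, while the size stays at 1262817457 bytes, as expected when the same architecture is re-saved at a new training step. A minimal sketch of checking a downloaded blob against such a pointer (hypothetical helper; both paths are placeholders):

```python
import hashlib

def verify_lfs_pointer(pointer_path: str, blob_path: str) -> bool:
    """Check a local file against the oid/size recorded in a Git LFS pointer.

    Hypothetical helper for illustration only.
    """
    # A pointer file has three lines: version, "oid sha256:<hex>", "size <bytes>".
    fields = {}
    with open(pointer_path) as f:
        for line in f:
            key, _, value = line.strip().partition(" ")
            fields[key] = value
    expected_oid = fields["oid"].split(":", 1)[1]
    expected_size = int(fields["size"])

    # Stream the blob through sha256 so a 1.2 GB checkpoint never sits in memory.
    digest = hashlib.sha256()
    actual_size = 0
    with open(blob_path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
            actual_size += len(chunk)
    return digest.hexdigest() == expected_oid and actual_size == expected_size

# Example (placeholder paths):
# verify_lfs_pointer("pytorch_model.bin.pointer", "pytorch_model.bin")
```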
wandb/run-20220129_131141-h6nhqm30/files/output.log CHANGED
@@ -4233,3 +4233,533 @@ Deleting older checkpoint [checkpoint-2000] due to args.save_total_limit
 Configuration saved in ./checkpoint-4000/config.json
 Model weights saved in ./checkpoint-4000/pytorch_model.bin
 Configuration saved in ./checkpoint-4000/preprocessor_config.json
+Configuration saved in ./preprocessor_config.json
+Deleting older checkpoint [checkpoint-2500] due to args.save_total_limit
+ 19%|█████████████████████████████▋ | 4100/21520 [3:22:57<7:45:28,  1.60s/it]
+ 20%|██████████████████████████████▏ | 4199/21520 [3:27:18<7:58:53,  1.66s/it]
+ 20%|███████████████████████████████▏ | 4299/21520 [3:31:42<8:00:50,  1.68s/it]
+ 20%|███████████████████████████████▉ | 4400/21520 [3:36:10<9:34:20,  2.01s/it]
+ 21%|████████████████████████████████▌ | 4500/21520 [3:40:32<9:18:48,  1.97s/it]
+The following columns in the evaluation set don't have a corresponding argument in `Wav2Vec2ForCTC.forward` and have been ignored: input_length.
+***** Running Evaluation *****
+  Num examples = 1839
+  Batch size = 64
+{'loss': 1.1671, 'learning_rate': 5.953496503496503e-05, 'epoch': 1.05}
+100%|██████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 29/29 [00:56<00:00,  1.81s/it]
+Configuration saved in ./checkpoint-4500/config.json
+Model weights saved in ./checkpoint-4500/pytorch_model.bin
+Configuration saved in ./checkpoint-4500/preprocessor_config.json
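
The step-4500 tail of output.log shows one full Trainer housekeeping cycle: the training bar at 21% (4500/21520 steps), an evaluation pass of 29 batches over 1839 examples (1839 / 64 rounds up to 29), the input_length column dropped because Wav2Vec2ForCTC.forward does not accept it, checkpoint-4500 written, and checkpoint-2500 rotated out by args.save_total_limit. A hedged TrainingArguments sketch consistent with that cadence; only the values visible in the log are grounded, the rest are illustrative assumptions:

```python
from transformers import TrainingArguments

# Sketch only: entries marked "from the log" are visible above; everything
# else (output_dir, the exact save_total_limit, train-side batch settings)
# is an assumption, not the run's recorded configuration.
training_args = TrainingArguments(
    output_dir="./",                 # checkpoints appear under ./checkpoint-<step>
    evaluation_strategy="steps",
    eval_steps=500,                  # from the log: evaluation fires at step 4500
    save_steps=500,                  # from the log: checkpoint-4500 follows the eval
    save_total_limit=4,              # assumption: old checkpoints (2000, 2500) get deleted
    per_device_eval_batch_size=64,   # from the log: "Batch size = 64"
    logging_steps=500,               # from the log: the {'loss': ...} line at step 4500
    report_to="wandb",               # run files live under wandb/run-20220129_131141-h6nhqm30
)
```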
wandb/run-20220129_131141-h6nhqm30/files/wandb-summary.json CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220129_131141-h6nhqm30/logs/debug-internal.log CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/run-20220129_131141-h6nhqm30/run-h6nhqm30.wandb CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:f9739eb30c365b55a391ae6ef9ae79354e19dafe75345239ba3cc9ab6c8ad81e
-size 27510055
+oid sha256:b91b547c6384170ef589da7269c4676c95497c03a83fe8ae18d858a1ce4be487
+size 31219824