2024-10-20 17:51:56,963 INFO [train.py:614] (1/4) Training started
2024-10-20 17:51:56,963 INFO [train.py:624] (1/4) Device: cuda:1
2024-10-20 17:51:56,966 INFO [train.py:639] (1/4) {'model_args': {'n_spks': 1, 'spk_emb_dim': 64, 'n_feats': 80, 'out_size': None, 'prior_loss': True, 'use_precomputed_durations': False, 'data_statistics': {'mel_mean': -5.523808479309082, 'mel_std': 2.066143751144409}, 'encoder': {'encoder_type': 'RoPE Encoder', 'encoder_params': {'n_feats': 80, 'n_channels': 192, 'filter_channels': 768, 'filter_channels_dp': 256, 'n_heads': 2, 'n_layers': 6, 'kernel_size': 3, 'p_dropout': 0.1, 'spk_emb_dim': 64, 'n_spks': 1, 'prenet': True}, 'duration_predictor_params': {'filter_channels_dp': 256, 'kernel_size': 3, 'p_dropout': 0.1}}, 'decoder': {'channels': [256, 256], 'dropout': 0.05, 'attention_head_dim': 64, 'n_blocks': 1, 'num_mid_blocks': 2, 'num_heads': 2, 'act_fn': 'snakebeta'}, 'cfm': {'name': 'CFM', 'solver': 'euler', 'sigma_min': 0.0001}, 'optimizer': {'lr': 0.0001, 'weight_decay': 0.0}, 'n_vocab': 159}, 'data_args': {'name': 'ljspeech', 'train_filelist_path': './filelists/ljs_audio_text_train_filelist.txt', 'valid_filelist_path': './filelists/ljs_audio_text_val_filelist.txt', 'cleaners': ['english_cleaners2'], 'add_blank': True, 'n_spks': 1, 'n_fft': 1024, 'n_feats': 80, 'sample_rate': 22050, 'hop_length': 256, 'win_length': 1024, 'f_min': 0, 'f_max': 8000, 'seed': 1234, 'load_durations': False, 'data_statistics': {'mel_mean': -5.523808479309082, 'mel_std': 2.066143751144409}}, 'best_train_loss': inf, 'best_valid_loss': inf, 'best_train_epoch': -1, 'best_valid_epoch': -1, 'batch_idx_train': -1, 'log_interval': 10, 'valid_interval': 1500, 'env_info': {'k2-version': '1.24.4', 'k2-build-type': 'Release', 'k2-with-cuda': True, 'k2-git-sha1': 'ff1d435a8d3c4eaa15828a84a7240678a70539a7', 'k2-git-date': 'Fri Feb 23 01:48:38 2024', 'lhotse-version': '1.26.0', 'torch-version': '2.2.0+cu118', 'torch-cuda-available': True, 'torch-cuda-version': '11.8', 'python-version': '3.8', 'icefall-git-branch': 'matcha-tts', 'icefall-git-sha1': '6a4cb112-dirty', 'icefall-git-date': 'Sun Oct 20 10:14:10 2024', 'icefall-path': '/star-fj/fangjun/open-source/icefall-2024', 'k2-path': '/star-fj/fangjun/py38/lib/python3.8/site-packages/k2/__init__.py', 'lhotse-path': '/star-fj/fangjun/py38/lib/python3.8/site-packages/lhotse/__init__.py', 'hostname': 'de-74279-k2-train-3-0904151514-6f47fc7cf9-gqs2c', 'IP address': '10.30.28.123'}, 'world_size': 4, 'master_port': 12335, 'tensorboard': True, 'num_epochs': 4000, 'start_epoch': 1, 'exp_dir': PosixPath('matcha/exp-new-3'), 'tokens': 'data/tokens.txt', 'cmvn': 'data/fbank/cmvn.json', 'seed': 42, 'save_every_n': 10, 'use_fp16': False, 'manifest_dir': PosixPath('data/fbank'), 'max_duration': 1000, 'bucketing_sampler': True, 'num_buckets': 30, 'on_the_fly_feats': False, 'shuffle': True, 'drop_last': True, 'return_cuts': False, 'num_workers': 4, 'input_strategy': 'PrecomputedFeatures', 'pad_id': 0, 'vocab_size': 159}
2024-10-20 17:51:56,966 INFO [train.py:642] (1/4) About to create model
2024-10-20 17:51:57,285 INFO [train.py:646] (1/4) Number of parameters: 18200545
2024-10-20 17:51:57,727 INFO [train.py:654] (1/4) Using DDP
2024-10-20 17:51:58,017 INFO [train.py:659] (1/4) About to create datamodule
2024-10-20 17:51:58,017 INFO [tts_datamodule.py:324] (1/4) About to get train cuts
2024-10-20 17:51:58,019 INFO [tts_datamodule.py:170] (1/4) About to create train dataset
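
The data_args in the configuration dump above fully determine the mel features (22050 Hz audio, 1024-point FFT and window, hop 256, 80 mel bins spanning 0-8000 Hz), and the data_statistics entry carries the global normalization constants from data/fbank/cmvn.json. A minimal sketch of an equivalent extractor in torchaudio; this is illustrative only, since the run uses input_strategy=PrecomputedFeatures and loads features from disk, and the log-compression detail is an assumption:

```python
import torch
import torchaudio

# Values copied from the data_args / data_statistics logged above.
MEL_MEAN, MEL_STD = -5.523808479309082, 2.066143751144409

mel_extractor = torchaudio.transforms.MelSpectrogram(
    sample_rate=22050,
    n_fft=1024,
    win_length=1024,
    hop_length=256,
    f_min=0,
    f_max=8000,
    n_mels=80,
)

def extract_normalized_mel(audio: torch.Tensor) -> torch.Tensor:
    """Log-compress and normalize, assuming log-mel statistics."""
    mel = mel_extractor(audio).clamp(min=1e-5).log()
    return (mel - MEL_MEAN) / MEL_STD
```
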
2024-10-20 17:51:58,020 INFO [tts_datamodule.py:197] (1/4) Using DynamicBucketingSampler.
2024-10-20 17:51:58,795 INFO [tts_datamodule.py:214] (1/4) About to create train dataloader
2024-10-20 17:51:58,796 INFO [tts_datamodule.py:331] (1/4) About to get validation cuts
2024-10-20 17:51:58,799 INFO [tts_datamodule.py:238] (1/4) About to create dev dataset
2024-10-20 17:51:58,808 INFO [tts_datamodule.py:269] (1/4) About to create valid dataloader
2024-10-20 17:51:58,808 INFO [train.py:682] (1/4) Start epoch 1
2024-10-20 17:52:21,728 INFO [train.py:561] (1/4) Epoch 1, batch 0, global_batch_idx: 0, batch size: 108, loss[dur_loss=0.7965, prior_loss=1.531, diff_loss=2.975, tot_loss=5.302, over 108.00 samples.], tot_loss[dur_loss=0.7965, prior_loss=1.531, diff_loss=2.975, tot_loss=5.302, over 108.00 samples.],
2024-10-20 17:52:23,141 INFO [train.py:579] (1/4) Computing validation loss
2024-10-20 17:52:40,328 INFO [train.py:589] (1/4) Epoch 1, validation: dur_loss=0.7527, prior_loss=1.435, diff_loss=2.269, tot_loss=4.456, over 100.00 samples.
2024-10-20 17:52:40,329 INFO [train.py:590] (1/4) Maximum memory allocated so far is 20443MB
2024-10-20 17:52:53,330 INFO [train.py:561] (1/4) Epoch 1, batch 10, global_batch_idx: 10, batch size: 111, loss[dur_loss=0.826, prior_loss=1.303, diff_loss=1.748, tot_loss=3.877, over 111.00 samples.], tot_loss[dur_loss=0.8065, prior_loss=1.385, diff_loss=2.159, tot_loss=4.351, over 1656.00 samples.],
2024-10-20 17:53:00,370 INFO [train.py:682] (1/4) Start epoch 2
2024-10-20 17:53:15,326 INFO [train.py:561] (1/4) Epoch 2, batch 4, global_batch_idx: 20, batch size: 189, loss[dur_loss=0.7422, prior_loss=1.265, diff_loss=1.612, tot_loss=3.62, over 189.00 samples.], tot_loss[dur_loss=0.7465, prior_loss=1.272, diff_loss=1.735, tot_loss=3.753, over 937.00 samples.],
2024-10-20 17:53:29,980 INFO [train.py:561] (1/4) Epoch 2, batch 14, global_batch_idx: 30, batch size: 142, loss[dur_loss=0.7347, prior_loss=1.239, diff_loss=1.457, tot_loss=3.431, over 142.00 samples.], tot_loss[dur_loss=0.7451, prior_loss=1.259, diff_loss=1.606, tot_loss=3.61, over 2210.00 samples.],
2024-10-20 17:53:31,372 INFO [train.py:682] (1/4) Start epoch 3
2024-10-20 17:53:51,501 INFO [train.py:561] (1/4) Epoch 3, batch 8, global_batch_idx: 40, batch size: 170, loss[dur_loss=0.6962, prior_loss=1.213, diff_loss=1.418, tot_loss=3.327, over 170.00 samples.], tot_loss[dur_loss=0.7021, prior_loss=1.223, diff_loss=1.506, tot_loss=3.431, over 1432.00 samples.],
2024-10-20 17:54:01,310 INFO [train.py:682] (1/4) Start epoch 4
2024-10-20 17:54:12,715 INFO [train.py:561] (1/4) Epoch 4, batch 2, global_batch_idx: 50, batch size: 203, loss[dur_loss=0.6749, prior_loss=1.211, diff_loss=1.405, tot_loss=3.291, over 203.00 samples.], tot_loss[dur_loss=0.6935, prior_loss=1.211, diff_loss=1.384, tot_loss=3.289, over 442.00 samples.],
2024-10-20 17:54:26,745 INFO [train.py:561] (1/4) Epoch 4, batch 12, global_batch_idx: 60, batch size: 152, loss[dur_loss=0.683, prior_loss=1.208, diff_loss=1.355, tot_loss=3.246, over 152.00 samples.], tot_loss[dur_loss=0.6821, prior_loss=1.21, diff_loss=1.424, tot_loss=3.316, over 1966.00 samples.],
2024-10-20 17:54:31,136 INFO [train.py:682] (1/4) Start epoch 5
2024-10-20 17:54:48,387 INFO [train.py:561] (1/4) Epoch 5, batch 6, global_batch_idx: 70, batch size: 106, loss[dur_loss=0.6858, prior_loss=1.208, diff_loss=1.321, tot_loss=3.215, over 106.00 samples.], tot_loss[dur_loss=0.6614, prior_loss=1.208, diff_loss=1.441, tot_loss=3.31, over 1142.00 samples.],
2024-10-20 17:55:01,183 INFO [train.py:682] (1/4) Start epoch 6
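
The tts_datamodule entries above report lhotse's DynamicBucketingSampler (lhotse 1.26.0 per env_info), which groups cuts of similar duration so that each batch packs roughly max_duration seconds of audio; that is why the logged batch sizes vary (106 to 203 utterances) while the per-batch cost stays even. A minimal sketch with the values from the config dump; the manifest path and the cuts variable are illustrative, not taken from the recipe:

```python
from lhotse import CutSet
from lhotse.dataset import DynamicBucketingSampler

# The real cuts come from the manifests under data/fbank; path is hypothetical.
cuts = CutSet.from_file("data/fbank/ljspeech_cuts_train.jsonl.gz")

sampler = DynamicBucketingSampler(
    cuts,
    max_duration=1000,  # seconds of audio per batch, from the config above
    num_buckets=30,
    shuffle=True,
    drop_last=True,
)
```
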
2024-10-20 17:55:10,079 INFO [train.py:561] (1/4) Epoch 6, batch 0, global_batch_idx: 80, batch size: 108, loss[dur_loss=0.6957, prior_loss=1.205, diff_loss=1.316, tot_loss=3.217, over 108.00 samples.], tot_loss[dur_loss=0.6957, prior_loss=1.205, diff_loss=1.316, tot_loss=3.217, over 108.00 samples.],
2024-10-20 17:55:24,013 INFO [train.py:561] (1/4) Epoch 6, batch 10, global_batch_idx: 90, batch size: 111, loss[dur_loss=0.6884, prior_loss=1.202, diff_loss=1.313, tot_loss=3.203, over 111.00 samples.], tot_loss[dur_loss=0.6582, prior_loss=1.204, diff_loss=1.389, tot_loss=3.251, over 1656.00 samples.],
2024-10-20 17:55:30,982 INFO [train.py:682] (1/4) Start epoch 7
2024-10-20 17:55:45,230 INFO [train.py:561] (1/4) Epoch 7, batch 4, global_batch_idx: 100, batch size: 189, loss[dur_loss=0.6108, prior_loss=1.191, diff_loss=1.338, tot_loss=3.139, over 189.00 samples.], tot_loss[dur_loss=0.613, prior_loss=1.193, diff_loss=1.435, tot_loss=3.241, over 937.00 samples.],
2024-10-20 17:55:59,961 INFO [train.py:561] (1/4) Epoch 7, batch 14, global_batch_idx: 110, batch size: 142, loss[dur_loss=0.5993, prior_loss=1.181, diff_loss=1.289, tot_loss=3.069, over 142.00 samples.], tot_loss[dur_loss=0.6138, prior_loss=1.189, diff_loss=1.357, tot_loss=3.16, over 2210.00 samples.],
2024-10-20 17:56:01,367 INFO [train.py:682] (1/4) Start epoch 8
2024-10-20 17:56:21,861 INFO [train.py:561] (1/4) Epoch 8, batch 8, global_batch_idx: 120, batch size: 170, loss[dur_loss=0.5665, prior_loss=1.174, diff_loss=1.306, tot_loss=3.046, over 170.00 samples.], tot_loss[dur_loss=0.5733, prior_loss=1.18, diff_loss=1.384, tot_loss=3.137, over 1432.00 samples.],
2024-10-20 17:56:32,034 INFO [train.py:682] (1/4) Start epoch 9
2024-10-20 17:56:43,636 INFO [train.py:561] (1/4) Epoch 9, batch 2, global_batch_idx: 130, batch size: 203, loss[dur_loss=0.5864, prior_loss=1.171, diff_loss=1.32, tot_loss=3.077, over 203.00 samples.], tot_loss[dur_loss=0.5863, prior_loss=1.17, diff_loss=1.305, tot_loss=3.061, over 442.00 samples.],
2024-10-20 17:56:57,862 INFO [train.py:561] (1/4) Epoch 9, batch 12, global_batch_idx: 140, batch size: 152, loss[dur_loss=0.634, prior_loss=1.156, diff_loss=1.296, tot_loss=3.086, over 152.00 samples.], tot_loss[dur_loss=0.6057, prior_loss=1.165, diff_loss=1.35, tot_loss=3.121, over 1966.00 samples.],
2024-10-20 17:57:02,332 INFO [train.py:682] (1/4) Start epoch 10
2024-10-20 17:57:19,683 INFO [train.py:561] (1/4) Epoch 10, batch 6, global_batch_idx: 150, batch size: 106, loss[dur_loss=0.6498, prior_loss=1.143, diff_loss=1.277, tot_loss=3.07, over 106.00 samples.], tot_loss[dur_loss=0.6378, prior_loss=1.149, diff_loss=1.382, tot_loss=3.169, over 1142.00 samples.],
2024-10-20 17:57:32,825 INFO [train.py:682] (1/4) Start epoch 11
2024-10-20 17:57:41,566 INFO [train.py:561] (1/4) Epoch 11, batch 0, global_batch_idx: 160, batch size: 108, loss[dur_loss=0.6589, prior_loss=1.137, diff_loss=1.267, tot_loss=3.063, over 108.00 samples.], tot_loss[dur_loss=0.6589, prior_loss=1.137, diff_loss=1.267, tot_loss=3.063, over 108.00 samples.],
2024-10-20 17:57:55,802 INFO [train.py:561] (1/4) Epoch 11, batch 10, global_batch_idx: 170, batch size: 111, loss[dur_loss=0.6578, prior_loss=1.134, diff_loss=1.269, tot_loss=3.061, over 111.00 samples.], tot_loss[dur_loss=0.6398, prior_loss=1.137, diff_loss=1.34, tot_loss=3.116, over 1656.00 samples.],
2024-10-20 17:58:02,878 INFO [train.py:682] (1/4) Start epoch 12
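
Each [train.py:561] line reports the three terms this recipe optimizes: dur_loss (duration predictor), prior_loss (encoder prior, enabled by 'prior_loss': True in the config), and diff_loss (the CFM decoder term). tot_loss is consistent with their plain sum; e.g. for epoch 6, batch 0 above, 0.6957 + 1.205 + 1.316 = 3.2167, printed as 3.217 at four significant digits. The first loss[...] block is the current batch, while the trailing tot_loss[...] block appears to accumulate over the epoch so far, which is why its "over N samples" count grows. A small parsing sketch that checks the sum (the log format is taken from the lines above; the helper is hypothetical):

```python
import re

LINE = ("Epoch 6, batch 0, global_batch_idx: 80, batch size: 108, "
        "loss[dur_loss=0.6957, prior_loss=1.205, diff_loss=1.316, "
        "tot_loss=3.217, over 108.00 samples.]")

def parse_losses(line: str) -> dict:
    """Extract name=value pairs from the first bracketed loss[...] field."""
    inner = re.search(r"loss\[([^\]]*)\]", line).group(1)
    return {k: float(v) for k, v in re.findall(r"(\w+)=([\d.]+)", inner)}

losses = parse_losses(LINE)
total = losses["dur_loss"] + losses["prior_loss"] + losses["diff_loss"]
# 0.6957 + 1.205 + 1.316 = 3.2167, logged as 3.217 (display rounding)
assert abs(total - losses["tot_loss"]) < 5e-3
```
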
2024-10-20 17:58:17,100 INFO [train.py:561] (1/4) Epoch 12, batch 4, global_batch_idx: 180, batch size: 189, loss[dur_loss=0.6292, prior_loss=1.132, diff_loss=1.288, tot_loss=3.05, over 189.00 samples.], tot_loss[dur_loss=0.6294, prior_loss=1.133, diff_loss=1.391, tot_loss=3.153, over 937.00 samples.],
2024-10-20 17:58:32,281 INFO [train.py:561] (1/4) Epoch 12, batch 14, global_batch_idx: 190, batch size: 142, loss[dur_loss=0.6379, prior_loss=1.126, diff_loss=1.251, tot_loss=3.014, over 142.00 samples.], tot_loss[dur_loss=0.636, prior_loss=1.131, diff_loss=1.315, tot_loss=3.082, over 2210.00 samples.],
2024-10-20 17:58:33,741 INFO [train.py:682] (1/4) Start epoch 13
2024-10-20 17:58:54,199 INFO [train.py:561] (1/4) Epoch 13, batch 8, global_batch_idx: 200, batch size: 170, loss[dur_loss=0.6371, prior_loss=1.126, diff_loss=1.268, tot_loss=3.031, over 170.00 samples.], tot_loss[dur_loss=0.6308, prior_loss=1.128, diff_loss=1.338, tot_loss=3.097, over 1432.00 samples.],
2024-10-20 17:59:04,434 INFO [train.py:682] (1/4) Start epoch 14
2024-10-20 17:59:16,014 INFO [train.py:561] (1/4) Epoch 14, batch 2, global_batch_idx: 210, batch size: 203, loss[dur_loss=0.6223, prior_loss=1.125, diff_loss=1.276, tot_loss=3.024, over 203.00 samples.], tot_loss[dur_loss=0.6319, prior_loss=1.124, diff_loss=1.258, tot_loss=3.014, over 442.00 samples.],
2024-10-20 17:59:30,381 INFO [train.py:561] (1/4) Epoch 14, batch 12, global_batch_idx: 220, batch size: 152, loss[dur_loss=0.6343, prior_loss=1.122, diff_loss=1.244, tot_loss=3, over 152.00 samples.], tot_loss[dur_loss=0.6303, prior_loss=1.123, diff_loss=1.305, tot_loss=3.059, over 1966.00 samples.],
2024-10-20 17:59:34,912 INFO [train.py:682] (1/4) Start epoch 15
2024-10-20 17:59:52,287 INFO [train.py:561] (1/4) Epoch 15, batch 6, global_batch_idx: 230, batch size: 106, loss[dur_loss=0.6268, prior_loss=1.119, diff_loss=1.226, tot_loss=2.971, over 106.00 samples.], tot_loss[dur_loss=0.6217, prior_loss=1.121, diff_loss=1.339, tot_loss=3.082, over 1142.00 samples.],
2024-10-20 18:00:05,360 INFO [train.py:682] (1/4) Start epoch 16
2024-10-20 18:00:14,173 INFO [train.py:561] (1/4) Epoch 16, batch 0, global_batch_idx: 240, batch size: 108, loss[dur_loss=0.6339, prior_loss=1.116, diff_loss=1.227, tot_loss=2.977, over 108.00 samples.], tot_loss[dur_loss=0.6339, prior_loss=1.116, diff_loss=1.227, tot_loss=2.977, over 108.00 samples.],
2024-10-20 18:00:28,476 INFO [train.py:561] (1/4) Epoch 16, batch 10, global_batch_idx: 250, batch size: 111, loss[dur_loss=0.6231, prior_loss=1.119, diff_loss=1.237, tot_loss=2.979, over 111.00 samples.], tot_loss[dur_loss=0.6198, prior_loss=1.117, diff_loss=1.3, tot_loss=3.037, over 1656.00 samples.],
2024-10-20 18:00:35,563 INFO [train.py:682] (1/4) Start epoch 17
2024-10-20 18:00:49,586 INFO [train.py:561] (1/4) Epoch 17, batch 4, global_batch_idx: 260, batch size: 189, loss[dur_loss=0.624, prior_loss=1.114, diff_loss=1.25, tot_loss=2.989, over 189.00 samples.], tot_loss[dur_loss=0.6089, prior_loss=1.114, diff_loss=1.345, tot_loss=3.068, over 937.00 samples.],
2024-10-20 18:01:04,417 INFO [train.py:561] (1/4) Epoch 17, batch 14, global_batch_idx: 270, batch size: 142, loss[dur_loss=0.6178, prior_loss=1.107, diff_loss=1.212, tot_loss=2.937, over 142.00 samples.], tot_loss[dur_loss=0.6178, prior_loss=1.112, diff_loss=1.27, tot_loss=3, over 2210.00 samples.],
2024-10-20 18:01:05,854 INFO [train.py:682] (1/4) Start epoch 18
2024-10-20 18:01:25,932 INFO [train.py:561] (1/4) Epoch 18, batch 8, global_batch_idx: 280, batch size: 170, loss[dur_loss=0.6083, prior_loss=1.104, diff_loss=1.219, tot_loss=2.931, over 170.00 samples.], tot_loss[dur_loss=0.6074, prior_loss=1.106, diff_loss=1.292, tot_loss=3.006, over 1432.00 samples.],
2024-10-20 18:01:36,097 INFO [train.py:682] (1/4) Start epoch 19
2024-10-20 18:01:47,642 INFO [train.py:561] (1/4) Epoch 19, batch 2, global_batch_idx: 290, batch size: 203, loss[dur_loss=0.5869, prior_loss=1.099, diff_loss=1.238, tot_loss=2.924, over 203.00 samples.], tot_loss[dur_loss=0.5985, prior_loss=1.099, diff_loss=1.215, tot_loss=2.913, over 442.00 samples.],
2024-10-20 18:02:01,937 INFO [train.py:561] (1/4) Epoch 19, batch 12, global_batch_idx: 300, batch size: 152, loss[dur_loss=0.5885, prior_loss=1.094, diff_loss=1.205, tot_loss=2.888, over 152.00 samples.], tot_loss[dur_loss=0.5964, prior_loss=1.098, diff_loss=1.261, tot_loss=2.956, over 1966.00 samples.],
2024-10-20 18:02:06,413 INFO [train.py:682] (1/4) Start epoch 20
2024-10-20 18:02:23,387 INFO [train.py:561] (1/4) Epoch 20, batch 6, global_batch_idx: 310, batch size: 106, loss[dur_loss=0.5926, prior_loss=1.093, diff_loss=1.184, tot_loss=2.869, over 106.00 samples.], tot_loss[dur_loss=0.5755, prior_loss=1.091, diff_loss=1.292, tot_loss=2.958, over 1142.00 samples.],
2024-10-20 18:02:36,576 INFO [train.py:682] (1/4) Start epoch 21
2024-10-20 18:02:45,299 INFO [train.py:561] (1/4) Epoch 21, batch 0, global_batch_idx: 320, batch size: 108, loss[dur_loss=0.5876, prior_loss=1.085, diff_loss=1.173, tot_loss=2.846, over 108.00 samples.], tot_loss[dur_loss=0.5876, prior_loss=1.085, diff_loss=1.173, tot_loss=2.846, over 108.00 samples.],
2024-10-20 18:02:59,597 INFO [train.py:561] (1/4) Epoch 21, batch 10, global_batch_idx: 330, batch size: 111, loss[dur_loss=0.6042, prior_loss=1.085, diff_loss=1.17, tot_loss=2.859, over 111.00 samples.], tot_loss[dur_loss=0.5754, prior_loss=1.085, diff_loss=1.247, tot_loss=2.907, over 1656.00 samples.],
2024-10-20 18:03:06,726 INFO [train.py:682] (1/4) Start epoch 22
2024-10-20 18:03:20,741 INFO [train.py:561] (1/4) Epoch 22, batch 4, global_batch_idx: 340, batch size: 189, loss[dur_loss=0.5552, prior_loss=1.081, diff_loss=1.197, tot_loss=2.832, over 189.00 samples.], tot_loss[dur_loss=0.5599, prior_loss=1.081, diff_loss=1.292, tot_loss=2.933, over 937.00 samples.],
2024-10-20 18:03:35,687 INFO [train.py:561] (1/4) Epoch 22, batch 14, global_batch_idx: 350, batch size: 142, loss[dur_loss=0.5633, prior_loss=1.076, diff_loss=1.153, tot_loss=2.792, over 142.00 samples.], tot_loss[dur_loss=0.5694, prior_loss=1.08, diff_loss=1.216, tot_loss=2.865, over 2210.00 samples.],
2024-10-20 18:03:37,120 INFO [train.py:682] (1/4) Start epoch 23
2024-10-20 18:03:57,146 INFO [train.py:561] (1/4) Epoch 23, batch 8, global_batch_idx: 360, batch size: 170, loss[dur_loss=0.5649, prior_loss=1.074, diff_loss=1.162, tot_loss=2.802, over 170.00 samples.], tot_loss[dur_loss=0.5573, prior_loss=1.075, diff_loss=1.233, tot_loss=2.866, over 1432.00 samples.],
2024-10-20 18:04:07,352 INFO [train.py:682] (1/4) Start epoch 24
2024-10-20 18:04:18,810 INFO [train.py:561] (1/4) Epoch 24, batch 2, global_batch_idx: 370, batch size: 203, loss[dur_loss=0.5452, prior_loss=1.071, diff_loss=1.174, tot_loss=2.791, over 203.00 samples.], tot_loss[dur_loss=0.5542, prior_loss=1.072, diff_loss=1.153, tot_loss=2.779, over 442.00 samples.],
2024-10-20 18:04:33,022 INFO [train.py:561] (1/4) Epoch 24, batch 12, global_batch_idx: 380, batch size: 152, loss[dur_loss=0.5537, prior_loss=1.069, diff_loss=1.14, tot_loss=2.762, over 152.00 samples.], tot_loss[dur_loss=0.5512, prior_loss=1.071, diff_loss=1.196, tot_loss=2.818, over 1966.00 samples.],
2024-10-20 18:04:37,462 INFO [train.py:682] (1/4) Start epoch 25
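
The growing sample counts in the tot_loss[...] blocks (e.g. 2210 samples by batch 14, i.e. all 15 batches of the epoch so far) are consistent with a sample-weighted running mean reset at each epoch. A sketch of such a tracker; the class is hypothetical, written in the spirit of the metrics tracking icefall recipes use, not copied from train.py:

```python
class RunningLoss:
    """Sample-weighted running mean, matching the 'over N samples' counts."""

    def __init__(self) -> None:
        self.weighted_sum = 0.0
        self.num_samples = 0

    def update(self, loss_value: float, batch_size: int) -> None:
        self.weighted_sum += loss_value * batch_size
        self.num_samples += batch_size

    @property
    def average(self) -> float:
        return self.weighted_sum / max(self.num_samples, 1)

# E.g. epoch 1: batch 0 contributes tot_loss=5.302 over 108 samples; after
# ten more batches the tracker covers 1656 samples with average 4.351.
tracker = RunningLoss()
tracker.update(5.302, 108)
```
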
2024-10-20 18:04:54,690 INFO [train.py:561] (1/4) Epoch 25, batch 6, global_batch_idx: 390, batch size: 106, loss[dur_loss=0.5457, prior_loss=1.068, diff_loss=1.113, tot_loss=2.727, over 106.00 samples.], tot_loss[dur_loss=0.5404, prior_loss=1.068, diff_loss=1.225, tot_loss=2.833, over 1142.00 samples.],
2024-10-20 18:05:07,769 INFO [train.py:682] (1/4) Start epoch 26
2024-10-20 18:05:16,493 INFO [train.py:561] (1/4) Epoch 26, batch 0, global_batch_idx: 400, batch size: 108, loss[dur_loss=0.5527, prior_loss=1.066, diff_loss=1.105, tot_loss=2.724, over 108.00 samples.], tot_loss[dur_loss=0.5527, prior_loss=1.066, diff_loss=1.105, tot_loss=2.724, over 108.00 samples.],
2024-10-20 18:05:30,778 INFO [train.py:561] (1/4) Epoch 26, batch 10, global_batch_idx: 410, batch size: 111, loss[dur_loss=0.5456, prior_loss=1.065, diff_loss=1.102, tot_loss=2.713, over 111.00 samples.], tot_loss[dur_loss=0.5332, prior_loss=1.065, diff_loss=1.177, tot_loss=2.775, over 1656.00 samples.],
2024-10-20 18:05:37,920 INFO [train.py:682] (1/4) Start epoch 27
2024-10-20 18:05:51,791 INFO [train.py:561] (1/4) Epoch 27, batch 4, global_batch_idx: 420, batch size: 189, loss[dur_loss=0.5186, prior_loss=1.063, diff_loss=1.119, tot_loss=2.701, over 189.00 samples.], tot_loss[dur_loss=0.5202, prior_loss=1.063, diff_loss=1.22, tot_loss=2.803, over 937.00 samples.],
2024-10-20 18:06:06,712 INFO [train.py:561] (1/4) Epoch 27, batch 14, global_batch_idx: 430, batch size: 142, loss[dur_loss=0.5244, prior_loss=1.06, diff_loss=1.083, tot_loss=2.668, over 142.00 samples.], tot_loss[dur_loss=0.5264, prior_loss=1.062, diff_loss=1.144, tot_loss=2.733, over 2210.00 samples.],
2024-10-20 18:06:08,146 INFO [train.py:682] (1/4) Start epoch 28
2024-10-20 18:06:28,175 INFO [train.py:561] (1/4) Epoch 28, batch 8, global_batch_idx: 440, batch size: 170, loss[dur_loss=0.5217, prior_loss=1.061, diff_loss=1.096, tot_loss=2.678, over 170.00 samples.], tot_loss[dur_loss=0.5173, prior_loss=1.061, diff_loss=1.161, tot_loss=2.739, over 1432.00 samples.],
2024-10-20 18:06:38,339 INFO [train.py:682] (1/4) Start epoch 29
2024-10-20 18:06:49,714 INFO [train.py:561] (1/4) Epoch 29, batch 2, global_batch_idx: 450, batch size: 203, loss[dur_loss=0.5043, prior_loss=1.058, diff_loss=1.099, tot_loss=2.662, over 203.00 samples.], tot_loss[dur_loss=0.5157, prior_loss=1.059, diff_loss=1.08, tot_loss=2.655, over 442.00 samples.],
2024-10-20 18:07:03,912 INFO [train.py:561] (1/4) Epoch 29, batch 12, global_batch_idx: 460, batch size: 152, loss[dur_loss=0.5191, prior_loss=1.057, diff_loss=1.065, tot_loss=2.641, over 152.00 samples.], tot_loss[dur_loss=0.5127, prior_loss=1.058, diff_loss=1.122, tot_loss=2.693, over 1966.00 samples.],
2024-10-20 18:07:08,369 INFO [train.py:682] (1/4) Start epoch 30
2024-10-20 18:07:25,337 INFO [train.py:561] (1/4) Epoch 30, batch 6, global_batch_idx: 470, batch size: 106, loss[dur_loss=0.5095, prior_loss=1.056, diff_loss=1.027, tot_loss=2.592, over 106.00 samples.], tot_loss[dur_loss=0.5017, prior_loss=1.056, diff_loss=1.149, tot_loss=2.707, over 1142.00 samples.],
2024-10-20 18:07:38,488 INFO [train.py:682] (1/4) Start epoch 31
2024-10-20 18:07:47,041 INFO [train.py:561] (1/4) Epoch 31, batch 0, global_batch_idx: 480, batch size: 108, loss[dur_loss=0.5195, prior_loss=1.056, diff_loss=1.029, tot_loss=2.604, over 108.00 samples.], tot_loss[dur_loss=0.5195, prior_loss=1.056, diff_loss=1.029, tot_loss=2.604, over 108.00 samples.],
2024-10-20 18:08:01,380 INFO [train.py:561] (1/4) Epoch 31, batch 10, global_batch_idx: 490, batch size: 111, loss[dur_loss=0.5183, prior_loss=1.056, diff_loss=1.02, tot_loss=2.594, over 111.00 samples.], tot_loss[dur_loss=0.5017, prior_loss=1.055, diff_loss=1.103, tot_loss=2.659, over 1656.00 samples.],
2024-10-20 18:08:08,493 INFO [train.py:682] (1/4) Start epoch 32
2024-10-20 18:08:22,402 INFO [train.py:561] (1/4) Epoch 32, batch 4, global_batch_idx: 500, batch size: 189, loss[dur_loss=0.4858, prior_loss=1.054, diff_loss=1.044, tot_loss=2.584, over 189.00 samples.], tot_loss[dur_loss=0.4885, prior_loss=1.053, diff_loss=1.142, tot_loss=2.684, over 937.00 samples.],
2024-10-20 18:08:37,337 INFO [train.py:561] (1/4) Epoch 32, batch 14, global_batch_idx: 510, batch size: 142, loss[dur_loss=0.4983, prior_loss=1.052, diff_loss=1, tot_loss=2.551, over 142.00 samples.], tot_loss[dur_loss=0.4976, prior_loss=1.053, diff_loss=1.066, tot_loss=2.617, over 2210.00 samples.],
2024-10-20 18:08:38,777 INFO [train.py:682] (1/4) Start epoch 33
2024-10-20 18:08:58,713 INFO [train.py:561] (1/4) Epoch 33, batch 8, global_batch_idx: 520, batch size: 170, loss[dur_loss=0.499, prior_loss=1.052, diff_loss=1.013, tot_loss=2.565, over 170.00 samples.], tot_loss[dur_loss=0.4901, prior_loss=1.052, diff_loss=1.083, tot_loss=2.625, over 1432.00 samples.],
2024-10-20 18:09:08,902 INFO [train.py:682] (1/4) Start epoch 34
2024-10-20 18:09:20,237 INFO [train.py:561] (1/4) Epoch 34, batch 2, global_batch_idx: 530, batch size: 203, loss[dur_loss=0.4814, prior_loss=1.051, diff_loss=1.02, tot_loss=2.552, over 203.00 samples.], tot_loss[dur_loss=0.4922, prior_loss=1.051, diff_loss=0.9995, tot_loss=2.543, over 442.00 samples.],
2024-10-20 18:09:34,513 INFO [train.py:561] (1/4) Epoch 34, batch 12, global_batch_idx: 540, batch size: 152, loss[dur_loss=0.4981, prior_loss=1.051, diff_loss=0.9908, tot_loss=2.54, over 152.00 samples.], tot_loss[dur_loss=0.4904, prior_loss=1.051, diff_loss=1.045, tot_loss=2.586, over 1966.00 samples.],
2024-10-20 18:09:38,977 INFO [train.py:682] (1/4) Start epoch 35
2024-10-20 18:09:56,166 INFO [train.py:561] (1/4) Epoch 35, batch 6, global_batch_idx: 550, batch size: 106, loss[dur_loss=0.4862, prior_loss=1.051, diff_loss=0.9705, tot_loss=2.507, over 106.00 samples.], tot_loss[dur_loss=0.4811, prior_loss=1.05, diff_loss=1.075, tot_loss=2.606, over 1142.00 samples.],
2024-10-20 18:10:09,297 INFO [train.py:682] (1/4) Start epoch 36
2024-10-20 18:10:17,963 INFO [train.py:561] (1/4) Epoch 36, batch 0, global_batch_idx: 560, batch size: 108, loss[dur_loss=0.5036, prior_loss=1.05, diff_loss=0.9611, tot_loss=2.515, over 108.00 samples.], tot_loss[dur_loss=0.5036, prior_loss=1.05, diff_loss=0.9611, tot_loss=2.515, over 108.00 samples.],
2024-10-20 18:10:32,270 INFO [train.py:561] (1/4) Epoch 36, batch 10, global_batch_idx: 570, batch size: 111, loss[dur_loss=0.4968, prior_loss=1.051, diff_loss=0.9388, tot_loss=2.486, over 111.00 samples.], tot_loss[dur_loss=0.4831, prior_loss=1.05, diff_loss=1.027, tot_loss=2.559, over 1656.00 samples.],
2024-10-20 18:10:39,408 INFO [train.py:682] (1/4) Start epoch 37
2024-10-20 18:10:53,323 INFO [train.py:561] (1/4) Epoch 37, batch 4, global_batch_idx: 580, batch size: 189, loss[dur_loss=0.4758, prior_loss=1.049, diff_loss=0.9597, tot_loss=2.485, over 189.00 samples.], tot_loss[dur_loss=0.4744, prior_loss=1.048, diff_loss=1.068, tot_loss=2.591, over 937.00 samples.],
2024-10-20 18:11:08,290 INFO [train.py:561] (1/4) Epoch 37, batch 14, global_batch_idx: 590, batch size: 142, loss[dur_loss=0.4853, prior_loss=1.047, diff_loss=0.9365, tot_loss=2.469, over 142.00 samples.], tot_loss[dur_loss=0.481, prior_loss=1.048, diff_loss=0.9947, tot_loss=2.524, over 2210.00 samples.],
2024-10-20 18:11:09,714 INFO [train.py:682] (1/4) Start epoch 38
2024-10-20 18:11:29,465 INFO [train.py:561] (1/4) Epoch 38, batch 8, global_batch_idx: 600, batch size: 170, loss[dur_loss=0.4816, prior_loss=1.048, diff_loss=0.943, tot_loss=2.473, over 170.00 samples.], tot_loss[dur_loss=0.4749, prior_loss=1.048, diff_loss=1.014, tot_loss=2.537, over 1432.00 samples.],
2024-10-20 18:11:39,640 INFO [train.py:682] (1/4) Start epoch 39
2024-10-20 18:11:51,049 INFO [train.py:561] (1/4) Epoch 39, batch 2, global_batch_idx: 610, batch size: 203, loss[dur_loss=0.4674, prior_loss=1.048, diff_loss=0.9533, tot_loss=2.468, over 203.00 samples.], tot_loss[dur_loss=0.4777, prior_loss=1.048, diff_loss=0.9322, tot_loss=2.458, over 442.00 samples.],
2024-10-20 18:12:05,304 INFO [train.py:561] (1/4) Epoch 39, batch 12, global_batch_idx: 620, batch size: 152, loss[dur_loss=0.4799, prior_loss=1.047, diff_loss=0.8994, tot_loss=2.426, over 152.00 samples.], tot_loss[dur_loss=0.4759, prior_loss=1.047, diff_loss=0.9739, tot_loss=2.497, over 1966.00 samples.],
2024-10-20 18:12:09,750 INFO [train.py:682] (1/4) Start epoch 40
2024-10-20 18:12:27,288 INFO [train.py:561] (1/4) Epoch 40, batch 6, global_batch_idx: 630, batch size: 106, loss[dur_loss=0.4795, prior_loss=1.047, diff_loss=0.9001, tot_loss=2.427, over 106.00 samples.], tot_loss[dur_loss=0.4678, prior_loss=1.046, diff_loss=1.005, tot_loss=2.519, over 1142.00 samples.],
2024-10-20 18:12:40,461 INFO [train.py:682] (1/4) Start epoch 41
2024-10-20 18:12:49,220 INFO [train.py:561] (1/4) Epoch 41, batch 0, global_batch_idx: 640, batch size: 108, loss[dur_loss=0.4898, prior_loss=1.047, diff_loss=0.886, tot_loss=2.423, over 108.00 samples.], tot_loss[dur_loss=0.4898, prior_loss=1.047, diff_loss=0.886, tot_loss=2.423, over 108.00 samples.],
2024-10-20 18:13:03,492 INFO [train.py:561] (1/4) Epoch 41, batch 10, global_batch_idx: 650, batch size: 111, loss[dur_loss=0.489, prior_loss=1.047, diff_loss=0.8701, tot_loss=2.406, over 111.00 samples.], tot_loss[dur_loss=0.471, prior_loss=1.046, diff_loss=0.958, tot_loss=2.475, over 1656.00 samples.],
2024-10-20 18:13:10,651 INFO [train.py:682] (1/4) Start epoch 42
2024-10-20 18:13:24,311 INFO [train.py:561] (1/4) Epoch 42, batch 4, global_batch_idx: 660, batch size: 189, loss[dur_loss=0.4614, prior_loss=1.047, diff_loss=0.8938, tot_loss=2.402, over 189.00 samples.], tot_loss[dur_loss=0.4613, prior_loss=1.045, diff_loss=0.9983, tot_loss=2.505, over 937.00 samples.],
2024-10-20 18:13:39,331 INFO [train.py:561] (1/4) Epoch 42, batch 14, global_batch_idx: 670, batch size: 142, loss[dur_loss=0.4737, prior_loss=1.044, diff_loss=0.8622, tot_loss=2.38, over 142.00 samples.], tot_loss[dur_loss=0.4704, prior_loss=1.045, diff_loss=0.9244, tot_loss=2.44, over 2210.00 samples.],
2024-10-20 18:13:40,751 INFO [train.py:682] (1/4) Start epoch 43
2024-10-20 18:14:00,728 INFO [train.py:561] (1/4) Epoch 43, batch 8, global_batch_idx: 680, batch size: 170, loss[dur_loss=0.4767, prior_loss=1.045, diff_loss=0.8854, tot_loss=2.408, over 170.00 samples.], tot_loss[dur_loss=0.4674, prior_loss=1.046, diff_loss=0.9546, tot_loss=2.468, over 1432.00 samples.],
2024-10-20 18:14:10,863 INFO [train.py:682] (1/4) Start epoch 44
2024-10-20 18:14:22,517 INFO [train.py:561] (1/4) Epoch 44, batch 2, global_batch_idx: 690, batch size: 203, loss[dur_loss=0.464, prior_loss=1.044, diff_loss=0.9141, tot_loss=2.422, over 203.00 samples.], tot_loss[dur_loss=0.4712, prior_loss=1.044, diff_loss=0.8826, tot_loss=2.398, over 442.00 samples.],
2024-10-20 18:14:36,807 INFO [train.py:561] (1/4) Epoch 44, batch 12, global_batch_idx: 700, batch size: 152, loss[dur_loss=0.4724, prior_loss=1.044, diff_loss=0.8459, tot_loss=2.362, over 152.00 samples.], tot_loss[dur_loss=0.4689, prior_loss=1.044, diff_loss=0.9184, tot_loss=2.432, over 1966.00 samples.],
2024-10-20 18:14:41,298 INFO [train.py:682] (1/4) Start epoch 45
2024-10-20 18:14:58,528 INFO [train.py:561] (1/4) Epoch 45, batch 6, global_batch_idx: 710, batch size: 106, loss[dur_loss=0.4696, prior_loss=1.045, diff_loss=0.8357, tot_loss=2.35, over 106.00 samples.], tot_loss[dur_loss=0.46, prior_loss=1.044, diff_loss=0.9443, tot_loss=2.448, over 1142.00 samples.],
2024-10-20 18:15:11,589 INFO [train.py:682] (1/4) Start epoch 46
2024-10-20 18:15:20,481 INFO [train.py:561] (1/4) Epoch 46, batch 0, global_batch_idx: 720, batch size: 108, loss[dur_loss=0.4839, prior_loss=1.045, diff_loss=0.8438, tot_loss=2.372, over 108.00 samples.], tot_loss[dur_loss=0.4839, prior_loss=1.045, diff_loss=0.8438, tot_loss=2.372, over 108.00 samples.],
2024-10-20 18:15:34,768 INFO [train.py:561] (1/4) Epoch 46, batch 10, global_batch_idx: 730, batch size: 111, loss[dur_loss=0.4795, prior_loss=1.045, diff_loss=0.828, tot_loss=2.352, over 111.00 samples.], tot_loss[dur_loss=0.464, prior_loss=1.044, diff_loss=0.8999, tot_loss=2.408, over 1656.00 samples.],
2024-10-20 18:15:41,924 INFO [train.py:682] (1/4) Start epoch 47
2024-10-20 18:15:55,857 INFO [train.py:561] (1/4) Epoch 47, batch 4, global_batch_idx: 740, batch size: 189, loss[dur_loss=0.454, prior_loss=1.044, diff_loss=0.8404, tot_loss=2.338, over 189.00 samples.], tot_loss[dur_loss=0.455, prior_loss=1.043, diff_loss=0.951, tot_loss=2.449, over 937.00 samples.],
2024-10-20 18:16:10,737 INFO [train.py:561] (1/4) Epoch 47, batch 14, global_batch_idx: 750, batch size: 142, loss[dur_loss=0.4644, prior_loss=1.041, diff_loss=0.8103, tot_loss=2.316, over 142.00 samples.], tot_loss[dur_loss=0.4631, prior_loss=1.043, diff_loss=0.8692, tot_loss=2.375, over 2210.00 samples.],
2024-10-20 18:16:12,173 INFO [train.py:682] (1/4) Start epoch 48
2024-10-20 18:16:32,475 INFO [train.py:561] (1/4) Epoch 48, batch 8, global_batch_idx: 760, batch size: 170, loss[dur_loss=0.4701, prior_loss=1.043, diff_loss=0.8196, tot_loss=2.333, over 170.00 samples.], tot_loss[dur_loss=0.4588, prior_loss=1.042, diff_loss=0.888, tot_loss=2.389, over 1432.00 samples.],
2024-10-20 18:16:42,785 INFO [train.py:682] (1/4) Start epoch 49
2024-10-20 18:16:54,457 INFO [train.py:561] (1/4) Epoch 49, batch 2, global_batch_idx: 770, batch size: 203, loss[dur_loss=0.4526, prior_loss=1.042, diff_loss=0.8415, tot_loss=2.336, over 203.00 samples.], tot_loss[dur_loss=0.4597, prior_loss=1.043, diff_loss=0.8156, tot_loss=2.318, over 442.00 samples.],
2024-10-20 18:17:08,767 INFO [train.py:561] (1/4) Epoch 49, batch 12, global_batch_idx: 780, batch size: 152, loss[dur_loss=0.4599, prior_loss=1.042, diff_loss=0.7834, tot_loss=2.285, over 152.00 samples.], tot_loss[dur_loss=0.4591, prior_loss=1.042, diff_loss=0.8558, tot_loss=2.357, over 1966.00 samples.],
2024-10-20 18:17:13,244 INFO [train.py:682] (1/4) Start epoch 50
2024-10-20 18:17:30,282 INFO [train.py:561] (1/4) Epoch 50, batch 6, global_batch_idx: 790, batch size: 106, loss[dur_loss=0.4616, prior_loss=1.042, diff_loss=0.7681, tot_loss=2.272, over 106.00 samples.], tot_loss[dur_loss=0.4531, prior_loss=1.042, diff_loss=0.8894, tot_loss=2.384, over 1142.00 samples.],
2024-10-20 18:17:43,649 INFO [train.py:682] (1/4) Start epoch 51
2024-10-20 18:17:52,655 INFO [train.py:561] (1/4) Epoch 51, batch 0, global_batch_idx: 800, batch size: 108, loss[dur_loss=0.479, prior_loss=1.042, diff_loss=0.7656, tot_loss=2.287, over 108.00 samples.], tot_loss[dur_loss=0.479, prior_loss=1.042, diff_loss=0.7656, tot_loss=2.287, over 108.00 samples.],
2024-10-20 18:18:06,989 INFO [train.py:561] (1/4) Epoch 51, batch 10, global_batch_idx: 810, batch size: 111, loss[dur_loss=0.473, prior_loss=1.042, diff_loss=0.7518, tot_loss=2.267, over 111.00 samples.], tot_loss[dur_loss=0.4561, prior_loss=1.042, diff_loss=0.8506, tot_loss=2.349, over 1656.00 samples.],
2024-10-20 18:18:14,124 INFO [train.py:682] (1/4) Start epoch 52
2024-10-20 18:18:28,309 INFO [train.py:561] (1/4) Epoch 52, batch 4, global_batch_idx: 820, batch size: 189, loss[dur_loss=0.4498, prior_loss=1.042, diff_loss=0.7904, tot_loss=2.282, over 189.00 samples.], tot_loss[dur_loss=0.4493, prior_loss=1.041, diff_loss=0.8959, tot_loss=2.386, over 937.00 samples.],
2024-10-20 18:18:43,430 INFO [train.py:561] (1/4) Epoch 52, batch 14, global_batch_idx: 830, batch size: 142, loss[dur_loss=0.4541, prior_loss=1.04, diff_loss=0.7436, tot_loss=2.237, over 142.00 samples.], tot_loss[dur_loss=0.4583, prior_loss=1.041, diff_loss=0.822, tot_loss=2.321, over 2210.00 samples.],
2024-10-20 18:18:44,867 INFO [train.py:682] (1/4) Start epoch 53
2024-10-20 18:19:05,029 INFO [train.py:561] (1/4) Epoch 53, batch 8, global_batch_idx: 840, batch size: 170, loss[dur_loss=0.4626, prior_loss=1.041, diff_loss=0.7746, tot_loss=2.279, over 170.00 samples.], tot_loss[dur_loss=0.4543, prior_loss=1.041, diff_loss=0.846, tot_loss=2.341, over 1432.00 samples.],
2024-10-20 18:19:15,195 INFO [train.py:682] (1/4) Start epoch 54
2024-10-20 18:19:26,556 INFO [train.py:561] (1/4) Epoch 54, batch 2, global_batch_idx: 850, batch size: 203, loss[dur_loss=0.4485, prior_loss=1.04, diff_loss=0.7718, tot_loss=2.261, over 203.00 samples.], tot_loss[dur_loss=0.4555, prior_loss=1.041, diff_loss=0.7614, tot_loss=2.258, over 442.00 samples.],
2024-10-20 18:19:40,853 INFO [train.py:561] (1/4) Epoch 54, batch 12, global_batch_idx: 860, batch size: 152, loss[dur_loss=0.4597, prior_loss=1.041, diff_loss=0.7749, tot_loss=2.275, over 152.00 samples.], tot_loss[dur_loss=0.4562, prior_loss=1.041, diff_loss=0.8166, tot_loss=2.314, over 1966.00 samples.],
2024-10-20 18:19:45,338 INFO [train.py:682] (1/4) Start epoch 55
2024-10-20 18:20:02,697 INFO [train.py:561] (1/4) Epoch 55, batch 6, global_batch_idx: 870, batch size: 106, loss[dur_loss=0.4618, prior_loss=1.041, diff_loss=0.7211, tot_loss=2.224, over 106.00 samples.], tot_loss[dur_loss=0.4507, prior_loss=1.04, diff_loss=0.839, tot_loss=2.33, over 1142.00 samples.],
2024-10-20 18:20:15,848 INFO [train.py:682] (1/4) Start epoch 56
2024-10-20 18:20:24,470 INFO [train.py:561] (1/4) Epoch 56, batch 0, global_batch_idx: 880, batch size: 108, loss[dur_loss=0.4707, prior_loss=1.041, diff_loss=0.7119, tot_loss=2.223, over 108.00 samples.], tot_loss[dur_loss=0.4707, prior_loss=1.041, diff_loss=0.7119, tot_loss=2.223, over 108.00 samples.],
2024-10-20 18:20:38,693 INFO [train.py:561] (1/4) Epoch 56, batch 10, global_batch_idx: 890, batch size: 111, loss[dur_loss=0.4725, prior_loss=1.041, diff_loss=0.729, tot_loss=2.243, over 111.00 samples.], tot_loss[dur_loss=0.4528, prior_loss=1.04, diff_loss=0.8057, tot_loss=2.298, over 1656.00 samples.],
2024-10-20 18:20:45,838 INFO [train.py:682] (1/4) Start epoch 57
2024-10-20 18:20:59,726 INFO [train.py:561] (1/4) Epoch 57, batch 4, global_batch_idx: 900, batch size: 189, loss[dur_loss=0.4487, prior_loss=1.04, diff_loss=0.7294, tot_loss=2.218, over 189.00 samples.], tot_loss[dur_loss=0.4458, prior_loss=1.039, diff_loss=0.8467, tot_loss=2.332, over 937.00 samples.],
2024-10-20 18:21:14,637 INFO [train.py:561] (1/4) Epoch 57, batch 14, global_batch_idx: 910, batch size: 142, loss[dur_loss=0.4506, prior_loss=1.038, diff_loss=0.718, tot_loss=2.207, over 142.00 samples.], tot_loss[dur_loss=0.4516, prior_loss=1.039, diff_loss=0.7725, tot_loss=2.263, over 2210.00 samples.],
2024-10-20 18:21:16,073 INFO [train.py:682] (1/4) Start epoch 58
2024-10-20 18:21:35,881 INFO [train.py:561] (1/4) Epoch 58, batch 8, global_batch_idx: 920, batch size: 170, loss[dur_loss=0.4568, prior_loss=1.04, diff_loss=0.759, tot_loss=2.255, over 170.00 samples.], tot_loss[dur_loss=0.4481, prior_loss=1.039, diff_loss=0.7992, tot_loss=2.286, over 1432.00 samples.],
2024-10-20 18:21:46,024 INFO [train.py:682] (1/4) Start epoch 59
2024-10-20 18:21:57,538 INFO [train.py:561] (1/4) Epoch 59, batch 2, global_batch_idx: 930, batch size: 203, loss[dur_loss=0.4424, prior_loss=1.039, diff_loss=0.7477, tot_loss=2.229, over 203.00 samples.], tot_loss[dur_loss=0.4514, prior_loss=1.039, diff_loss=0.7345, tot_loss=2.225, over 442.00 samples.],
2024-10-20 18:22:11,771 INFO [train.py:561] (1/4) Epoch 59, batch 12, global_batch_idx: 940, batch size: 152, loss[dur_loss=0.4537, prior_loss=1.039, diff_loss=0.7174, tot_loss=2.21, over 152.00 samples.], tot_loss[dur_loss=0.449, prior_loss=1.039, diff_loss=0.7732, tot_loss=2.261, over 1966.00 samples.],
2024-10-20 18:22:16,229 INFO [train.py:682] (1/4) Start epoch 60
2024-10-20 18:22:33,420 INFO [train.py:561] (1/4) Epoch 60, batch 6, global_batch_idx: 950, batch size: 106, loss[dur_loss=0.4491, prior_loss=1.039, diff_loss=0.6973, tot_loss=2.185, over 106.00 samples.], tot_loss[dur_loss=0.443, prior_loss=1.038, diff_loss=0.8092, tot_loss=2.291, over 1142.00 samples.],
2024-10-20 18:22:46,557 INFO [train.py:682] (1/4) Start epoch 61
2024-10-20 18:22:55,252 INFO [train.py:561] (1/4) Epoch 61, batch 0, global_batch_idx: 960, batch size: 108, loss[dur_loss=0.4679, prior_loss=1.039, diff_loss=0.7064, tot_loss=2.213, over 108.00 samples.], tot_loss[dur_loss=0.4679, prior_loss=1.039, diff_loss=0.7064, tot_loss=2.213, over 108.00 samples.],
2024-10-20 18:23:09,506 INFO [train.py:561] (1/4) Epoch 61, batch 10, global_batch_idx: 970, batch size: 111, loss[dur_loss=0.4661, prior_loss=1.041, diff_loss=0.7036, tot_loss=2.21, over 111.00 samples.], tot_loss[dur_loss=0.4473, prior_loss=1.038, diff_loss=0.7715, tot_loss=2.257, over 1656.00 samples.],
2024-10-20 18:23:16,620 INFO [train.py:682] (1/4) Start epoch 62
2024-10-20 18:23:30,284 INFO [train.py:561] (1/4) Epoch 62, batch 4, global_batch_idx: 980, batch size: 189, loss[dur_loss=0.4416, prior_loss=1.039, diff_loss=0.7192, tot_loss=2.199, over 189.00 samples.], tot_loss[dur_loss=0.4395, prior_loss=1.038, diff_loss=0.8124, tot_loss=2.289, over 937.00 samples.],
2024-10-20 18:23:45,167 INFO [train.py:561] (1/4) Epoch 62, batch 14, global_batch_idx: 990, batch size: 142, loss[dur_loss=0.4487, prior_loss=1.037, diff_loss=0.6876, tot_loss=2.173, over 142.00 samples.], tot_loss[dur_loss=0.4463, prior_loss=1.038, diff_loss=0.7445, tot_loss=2.229, over 2210.00 samples.],
2024-10-20 18:23:46,601 INFO [train.py:682] (1/4) Start epoch 63
2024-10-20 18:24:06,388 INFO [train.py:561] (1/4) Epoch 63, batch 8, global_batch_idx: 1000, batch size: 170, loss[dur_loss=0.4514, prior_loss=1.038, diff_loss=0.7059, tot_loss=2.196, over 170.00 samples.], tot_loss[dur_loss=0.4422, prior_loss=1.038, diff_loss=0.7707, tot_loss=2.251, over 1432.00 samples.],
2024-10-20 18:24:16,528 INFO [train.py:682] (1/4) Start epoch 64
2024-10-20 18:24:28,098 INFO [train.py:561] (1/4) Epoch 64, batch 2, global_batch_idx: 1010, batch size: 203, loss[dur_loss=0.439, prior_loss=1.037, diff_loss=0.7062, tot_loss=2.182, over 203.00 samples.], tot_loss[dur_loss=0.4462, prior_loss=1.037, diff_loss=0.6933, tot_loss=2.177, over 442.00 samples.],
2024-10-20 18:24:42,329 INFO [train.py:561] (1/4) Epoch 64, batch 12, global_batch_idx: 1020, batch size: 152, loss[dur_loss=0.4458, prior_loss=1.037, diff_loss=0.6848, tot_loss=2.167, over 152.00 samples.], tot_loss[dur_loss=0.4452, prior_loss=1.037, diff_loss=0.7404, tot_loss=2.223, over 1966.00 samples.],
2024-10-20 18:24:46,810 INFO [train.py:682] (1/4) Start epoch 65
2024-10-20 18:25:03,700 INFO [train.py:561] (1/4) Epoch 65, batch 6, global_batch_idx: 1030, batch size: 106, loss[dur_loss=0.4447, prior_loss=1.037, diff_loss=0.64, tot_loss=2.122, over 106.00 samples.], tot_loss[dur_loss=0.4393, prior_loss=1.037, diff_loss=0.7626, tot_loss=2.239, over 1142.00 samples.],
2024-10-20 18:25:16,699 INFO [train.py:682] (1/4) Start epoch 66
2024-10-20 18:25:25,865 INFO [train.py:561] (1/4) Epoch 66, batch 0, global_batch_idx: 1040, batch size: 108, loss[dur_loss=0.4656, prior_loss=1.038, diff_loss=0.6686, tot_loss=2.172, over 108.00 samples.], tot_loss[dur_loss=0.4656, prior_loss=1.038, diff_loss=0.6686, tot_loss=2.172, over 108.00 samples.],
2024-10-20 18:25:40,001 INFO [train.py:561] (1/4) Epoch 66, batch 10, global_batch_idx: 1050, batch size: 111, loss[dur_loss=0.4607, prior_loss=1.038, diff_loss=0.641, tot_loss=2.14, over 111.00 samples.], tot_loss[dur_loss=0.4433, prior_loss=1.037, diff_loss=0.7349, tot_loss=2.215, over 1656.00 samples.],
2024-10-20 18:25:47,090 INFO [train.py:682] (1/4) Start epoch 67
2024-10-20 18:26:00,643 INFO [train.py:561] (1/4) Epoch 67, batch 4, global_batch_idx: 1060, batch size: 189, loss[dur_loss=0.4392, prior_loss=1.038, diff_loss=0.7018, tot_loss=2.179, over 189.00 samples.], tot_loss[dur_loss=0.4382, prior_loss=1.037, diff_loss=0.7895, tot_loss=2.264, over 937.00 samples.],
2024-10-20 18:26:15,446 INFO [train.py:561] (1/4) Epoch 67, batch 14, global_batch_idx: 1070, batch size: 142, loss[dur_loss=0.4474, prior_loss=1.036, diff_loss=0.6362, tot_loss=2.119, over 142.00 samples.], tot_loss[dur_loss=0.4458, prior_loss=1.037, diff_loss=0.7129, tot_loss=2.195, over 2210.00 samples.],
2024-10-20 18:26:16,865 INFO [train.py:682] (1/4) Start epoch 68
2024-10-20 18:26:36,575 INFO [train.py:561] (1/4) Epoch 68, batch 8, global_batch_idx: 1080, batch size: 170, loss[dur_loss=0.45, prior_loss=1.037, diff_loss=0.6898, tot_loss=2.177, over 170.00 samples.], tot_loss[dur_loss=0.4408, prior_loss=1.036, diff_loss=0.7404, tot_loss=2.218, over 1432.00 samples.],
2024-10-20 18:26:46,727 INFO [train.py:682] (1/4) Start epoch 69
2024-10-20 18:26:58,073 INFO [train.py:561] (1/4) Epoch 69, batch 2, global_batch_idx: 1090, batch size: 203, loss[dur_loss=0.4406, prior_loss=1.036, diff_loss=0.6906, tot_loss=2.167, over 203.00 samples.], tot_loss[dur_loss=0.4471, prior_loss=1.037, diff_loss=0.6659, tot_loss=2.15, over 442.00 samples.],
2024-10-20 18:27:12,481 INFO [train.py:561] (1/4) Epoch 69, batch 12, global_batch_idx: 1100, batch size: 152, loss[dur_loss=0.4449, prior_loss=1.036, diff_loss=0.6696, tot_loss=2.151, over 152.00 samples.], tot_loss[dur_loss=0.4426, prior_loss=1.036, diff_loss=0.7121, tot_loss=2.191, over 1966.00 samples.],
2024-10-20 18:27:17,028 INFO [train.py:682] (1/4) Start epoch 70
2024-10-20 18:27:34,026 INFO [train.py:561] (1/4) Epoch 70, batch 6, global_batch_idx: 1110, batch size: 106, loss[dur_loss=0.4434, prior_loss=1.036, diff_loss=0.6147, tot_loss=2.094, over 106.00 samples.], tot_loss[dur_loss=0.437, prior_loss=1.036, diff_loss=0.7395, tot_loss=2.212, over 1142.00 samples.],
2024-10-20 18:27:47,256 INFO [train.py:682] (1/4) Start epoch 71
2024-10-20 18:27:55,879 INFO [train.py:561] (1/4) Epoch 71, batch 0, global_batch_idx: 1120, batch size: 108, loss[dur_loss=0.4611, prior_loss=1.037, diff_loss=0.649, tot_loss=2.147, over 108.00 samples.], tot_loss[dur_loss=0.4611, prior_loss=1.037, diff_loss=0.649, tot_loss=2.147, over 108.00 samples.],
2024-10-20 18:28:10,187 INFO [train.py:561] (1/4) Epoch 71, batch 10, global_batch_idx: 1130, batch size: 111, loss[dur_loss=0.4555, prior_loss=1.037, diff_loss=0.6197, tot_loss=2.113, over 111.00 samples.], tot_loss[dur_loss=0.4398, prior_loss=1.036, diff_loss=0.7057, tot_loss=2.181, over 1656.00 samples.],
2024-10-20 18:28:17,346 INFO [train.py:682] (1/4) Start epoch 72
2024-10-20 18:28:31,255 INFO [train.py:561] (1/4) Epoch 72, batch 4, global_batch_idx: 1140, batch size: 189, loss[dur_loss=0.4327, prior_loss=1.036, diff_loss=0.651, tot_loss=2.12, over 189.00 samples.], tot_loss[dur_loss=0.4327, prior_loss=1.035, diff_loss=0.7497, tot_loss=2.218, over 937.00 samples.],
2024-10-20 18:28:46,137 INFO [train.py:561] (1/4) Epoch 72, batch 14, global_batch_idx: 1150, batch size: 142, loss[dur_loss=0.4412, prior_loss=1.034, diff_loss=0.6134, tot_loss=2.089, over 142.00 samples.], tot_loss[dur_loss=0.4405, prior_loss=1.035, diff_loss=0.6884, tot_loss=2.164, over 2210.00 samples.],
2024-10-20 18:28:47,581 INFO [train.py:682] (1/4) Start epoch 73
2024-10-20 18:29:07,593 INFO [train.py:561] (1/4) Epoch 73, batch 8, global_batch_idx: 1160, batch size: 170, loss[dur_loss=0.4458, prior_loss=1.036, diff_loss=0.6544, tot_loss=2.136, over 170.00 samples.], tot_loss[dur_loss=0.4356, prior_loss=1.035, diff_loss=0.7137, tot_loss=2.184, over 1432.00 samples.],
2024-10-20 18:29:17,791 INFO [train.py:682] (1/4) Start epoch 74
2024-10-20 18:29:29,032 INFO [train.py:561] (1/4) Epoch 74, batch 2, global_batch_idx: 1170, batch size: 203, loss[dur_loss=0.435, prior_loss=1.035, diff_loss=0.6616, tot_loss=2.131, over 203.00 samples.], tot_loss[dur_loss=0.441, prior_loss=1.035, diff_loss=0.6389, tot_loss=2.115, over 442.00 samples.],
2024-10-20 18:29:43,286 INFO [train.py:561] (1/4) Epoch 74, batch 12, global_batch_idx: 1180, batch size: 152, loss[dur_loss=0.4467, prior_loss=1.035, diff_loss=0.6269, tot_loss=2.109, over 152.00 samples.], tot_loss[dur_loss=0.4404, prior_loss=1.035, diff_loss=0.6846, tot_loss=2.16, over 1966.00 samples.],
2024-10-20 18:29:47,733 INFO [train.py:682] (1/4) Start epoch 75
2024-10-20 18:30:04,818 INFO [train.py:561] (1/4) Epoch 75, batch 6, global_batch_idx: 1190, batch size: 106, loss[dur_loss=0.4408, prior_loss=1.035, diff_loss=0.6195, tot_loss=2.095, over 106.00 samples.], tot_loss[dur_loss=0.4337, prior_loss=1.034, diff_loss=0.7257, tot_loss=2.194, over 1142.00 samples.],
2024-10-20 18:30:17,926 INFO [train.py:682] (1/4) Start epoch 76
2024-10-20 18:30:26,449 INFO [train.py:561] (1/4) Epoch 76, batch 0, global_batch_idx: 1200, batch size: 108, loss[dur_loss=0.4582, prior_loss=1.036, diff_loss=0.6319, tot_loss=2.126, over 108.00 samples.], tot_loss[dur_loss=0.4582, prior_loss=1.036, diff_loss=0.6319, tot_loss=2.126, over 108.00 samples.],
2024-10-20 18:30:40,769 INFO [train.py:561] (1/4) Epoch 76, batch 10, global_batch_idx: 1210, batch size: 111, loss[dur_loss=0.4545, prior_loss=1.036, diff_loss=0.5983, tot_loss=2.089, over 111.00 samples.], tot_loss[dur_loss=0.438, prior_loss=1.035, diff_loss=0.6804, tot_loss=2.153, over 1656.00 samples.],
2024-10-20 18:30:47,928 INFO [train.py:682] (1/4) Start epoch 77
2024-10-20 18:31:01,751 INFO [train.py:561] (1/4) Epoch 77, batch 4, global_batch_idx: 1220, batch size: 189, loss[dur_loss=0.4325, prior_loss=1.036, diff_loss=0.6376, tot_loss=2.106, over 189.00 samples.], tot_loss[dur_loss=0.4296, prior_loss=1.034, diff_loss=0.7318, tot_loss=2.196, over 937.00 samples.],
2024-10-20 18:31:16,776 INFO [train.py:561] (1/4) Epoch 77, batch 14, global_batch_idx: 1230, batch size: 142, loss[dur_loss=0.4398, prior_loss=1.033, diff_loss=0.6011, tot_loss=2.074, over 142.00 samples.], tot_loss[dur_loss=0.4371, prior_loss=1.035, diff_loss=0.6607, tot_loss=2.132, over 2210.00 samples.],
2024-10-20 18:31:18,272 INFO [train.py:682] (1/4) Start epoch 78
2024-10-20 18:31:37,941 INFO [train.py:561] (1/4) Epoch 78, batch 8, global_batch_idx: 1240, batch size: 170, loss[dur_loss=0.437, prior_loss=1.035, diff_loss=0.6357, tot_loss=2.107, over 170.00 samples.], tot_loss[dur_loss=0.4321, prior_loss=1.034, diff_loss=0.6913, tot_loss=2.158, over 1432.00 samples.],
2024-10-20 18:31:48,050 INFO [train.py:682] (1/4) Start epoch 79
2024-10-20 18:31:59,516 INFO [train.py:561] (1/4) Epoch 79, batch 2, global_batch_idx: 1250, batch size: 203, loss[dur_loss=0.4358, prior_loss=1.034, diff_loss=0.6372, tot_loss=2.107, over 203.00 samples.], tot_loss[dur_loss=0.4403, prior_loss=1.034, diff_loss=0.6129, tot_loss=2.087, over 442.00 samples.],
2024-10-20 18:32:13,820 INFO [train.py:561] (1/4) Epoch 79, batch 12, global_batch_idx: 1260, batch size: 152, loss[dur_loss=0.4384, prior_loss=1.034, diff_loss=0.6245, tot_loss=2.097, over 152.00 samples.], tot_loss[dur_loss=0.4363, prior_loss=1.034, diff_loss=0.6694, tot_loss=2.14, over 1966.00 samples.],
2024-10-20 18:32:18,265 INFO [train.py:682] (1/4) Start epoch 80
2024-10-20 18:32:35,446 INFO [train.py:561] (1/4) Epoch 80, batch 6, global_batch_idx: 1270, batch size: 106, loss[dur_loss=0.4355, prior_loss=1.034, diff_loss=0.5894, tot_loss=2.059, over 106.00 samples.], tot_loss[dur_loss=0.4291, prior_loss=1.033, diff_loss=0.7104, tot_loss=2.173, over 1142.00 samples.],
2024-10-20 18:32:48,507 INFO [train.py:682] (1/4) Start epoch 81
2024-10-20 18:32:57,086 INFO [train.py:561] (1/4) Epoch 81, batch 0, global_batch_idx: 1280, batch size: 108, loss[dur_loss=0.4553, prior_loss=1.034, diff_loss=0.6065, tot_loss=2.096, over 108.00 samples.], tot_loss[dur_loss=0.4553, prior_loss=1.034, diff_loss=0.6065, tot_loss=2.096, over 108.00 samples.],
2024-10-20 18:33:11,374 INFO [train.py:561] (1/4) Epoch 81, batch 10, global_batch_idx: 1290, batch size: 111, loss[dur_loss=0.4501, prior_loss=1.035, diff_loss=0.6053, tot_loss=2.091, over 111.00 samples.], tot_loss[dur_loss=0.4333, prior_loss=1.034, diff_loss=0.6695, tot_loss=2.136, over 1656.00 samples.],
2024-10-20 18:33:18,504 INFO [train.py:682] (1/4) Start epoch 82
2024-10-20 18:33:32,040 INFO [train.py:561] (1/4) Epoch 82, batch 4, global_batch_idx: 1300, batch size: 189, loss[dur_loss=0.4292, prior_loss=1.034, diff_loss=0.6219, tot_loss=2.085, over 189.00 samples.], tot_loss[dur_loss=0.4272, prior_loss=1.033, diff_loss=0.7169, tot_loss=2.177, over 937.00 samples.],
2024-10-20 18:33:46,810 INFO [train.py:561] (1/4) Epoch 82, batch 14, global_batch_idx: 1310, batch size: 142, loss[dur_loss=0.4345, prior_loss=1.032, diff_loss=0.6043, tot_loss=2.071, over 142.00 samples.], tot_loss[dur_loss=0.4342, prior_loss=1.033, diff_loss=0.6557, tot_loss=2.123, over 2210.00 samples.],
2024-10-20 18:33:48,239 INFO [train.py:682] (1/4) Start epoch 83
2024-10-20 18:34:07,919 INFO [train.py:561] (1/4) Epoch 83, batch 8, global_batch_idx: 1320, batch size: 170, loss[dur_loss=0.4368, prior_loss=1.034, diff_loss=0.5953, tot_loss=2.066, over 170.00 samples.], tot_loss[dur_loss=0.4299, prior_loss=1.033, diff_loss=0.6628, tot_loss=2.126, over 1432.00 samples.],
2024-10-20 18:34:17,939 INFO [train.py:682] (1/4) Start epoch 84
2024-10-20 18:34:29,448 INFO [train.py:561] (1/4) Epoch 84, batch 2, global_batch_idx: 1330, batch size: 203, loss[dur_loss=0.4255, prior_loss=1.032, diff_loss=0.5939, tot_loss=2.052, over 203.00 samples.], tot_loss[dur_loss=0.4342, prior_loss=1.033, diff_loss=0.5767, tot_loss=2.044, over 442.00 samples.],
2024-10-20 18:34:43,716 INFO [train.py:561] (1/4) Epoch 84, batch 12, global_batch_idx: 1340, batch size: 152, loss[dur_loss=0.4336, prior_loss=1.033, diff_loss=0.5867, tot_loss=2.053, over 152.00 samples.], tot_loss[dur_loss=0.4316, prior_loss=1.033, diff_loss=0.6475, tot_loss=2.112, over 1966.00 samples.],
2024-10-20 18:34:48,171 INFO [train.py:682] (1/4) Start epoch 85
2024-10-20 18:35:04,882 INFO [train.py:561] (1/4) Epoch 85, batch 6, global_batch_idx: 1350, batch size: 106, loss[dur_loss=0.4363, prior_loss=1.033, diff_loss=0.5943, tot_loss=2.063, over 106.00 samples.], tot_loss[dur_loss=0.4283, prior_loss=1.033, diff_loss=0.6831, tot_loss=2.144, over 1142.00 samples.],
2024-10-20 18:35:17,894 INFO [train.py:682] (1/4) Start epoch 86
2024-10-20 18:35:26,653 INFO [train.py:561] (1/4) Epoch 86, batch 0, global_batch_idx: 1360, batch size: 108, loss[dur_loss=0.4541, prior_loss=1.034, diff_loss=0.6278, tot_loss=2.116, over 108.00 samples.], tot_loss[dur_loss=0.4541, prior_loss=1.034, diff_loss=0.6278, tot_loss=2.116, over 108.00 samples.],
2024-10-20 18:35:40,817 INFO [train.py:561] (1/4) Epoch 86, batch 10, global_batch_idx: 1370, batch size: 111, loss[dur_loss=0.4458, prior_loss=1.034, diff_loss=0.5498, tot_loss=2.03, over 111.00 samples.], tot_loss[dur_loss=0.4317, prior_loss=1.033, diff_loss=0.6523, tot_loss=2.117, over 1656.00 samples.],
2024-10-20 18:35:47,872 INFO [train.py:682] (1/4) Start epoch 87
2024-10-20 18:36:01,498 INFO [train.py:561] (1/4) Epoch 87, batch 4, global_batch_idx: 1380, batch size: 189, loss[dur_loss=0.4262, prior_loss=1.033, diff_loss=0.5944, tot_loss=2.054, over 189.00 samples.], tot_loss[dur_loss=0.4236, prior_loss=1.032, diff_loss=0.7077, tot_loss=2.163, over 937.00 samples.],
2024-10-20 18:36:16,327 INFO [train.py:561] (1/4) Epoch 87, batch 14, global_batch_idx: 1390, batch size: 142, loss[dur_loss=0.4341, prior_loss=1.031, diff_loss=0.5661, tot_loss=2.031, over 142.00 samples.], tot_loss[dur_loss=0.4322, prior_loss=1.032, diff_loss=0.6291, tot_loss=2.094, over 2210.00 samples.],
2024-10-20 18:36:17,750 INFO [train.py:682] (1/4) Start epoch 88
2024-10-20 18:36:37,856 INFO [train.py:561] (1/4) Epoch 88, batch 8, global_batch_idx: 1400, batch size: 170, loss[dur_loss=0.4346, prior_loss=1.033, diff_loss=0.6127, tot_loss=2.08, over 170.00 samples.], tot_loss[dur_loss=0.4283, prior_loss=1.032, diff_loss=0.6554, tot_loss=2.116, over 1432.00 samples.],
2024-10-20 18:36:47,981 INFO [train.py:682] (1/4) Start epoch 89
2024-10-20 18:36:59,391 INFO [train.py:561] (1/4) Epoch 89, batch 2, global_batch_idx: 1410, batch size: 203, loss[dur_loss=0.4253, prior_loss=1.032, diff_loss=0.5842, tot_loss=2.041, over 203.00 samples.], tot_loss[dur_loss=0.4332, prior_loss=1.032, diff_loss=0.5897, tot_loss=2.055, over 442.00 samples.],
2024-10-20 18:37:13,573 INFO [train.py:561] (1/4) Epoch 89, batch 12, global_batch_idx: 1420, batch size: 152, loss[dur_loss=0.4323, prior_loss=1.032, diff_loss=0.5496, tot_loss=2.014, over 152.00 samples.], tot_loss[dur_loss=0.4301, prior_loss=1.032, diff_loss=0.637, tot_loss=2.099, over 1966.00 samples.],
2024-10-20 18:37:18,014 INFO [train.py:682] (1/4) Start epoch 90
2024-10-20 18:37:34,855 INFO [train.py:561] (1/4) Epoch 90, batch 6, global_batch_idx: 1430, batch size: 106, loss[dur_loss=0.4323, prior_loss=1.033, diff_loss=0.5877, tot_loss=2.053, over 106.00 samples.], tot_loss[dur_loss=0.4253, prior_loss=1.032, diff_loss=0.6683, tot_loss=2.125, over 1142.00 samples.],
2024-10-20 18:37:47,848 INFO [train.py:682] (1/4) Start epoch 91
2024-10-20 18:37:56,363 INFO [train.py:561] (1/4) Epoch 91, batch 0, global_batch_idx: 1440, batch size: 108, loss[dur_loss=0.4531, prior_loss=1.033, diff_loss=0.5613, tot_loss=2.047, over 108.00 samples.], tot_loss[dur_loss=0.4531, prior_loss=1.033, diff_loss=0.5613, tot_loss=2.047, over 108.00 samples.],
2024-10-20 18:38:10,558 INFO [train.py:561] (1/4) Epoch 91, batch 10, global_batch_idx: 1450, batch size: 111, loss[dur_loss=0.4436, prior_loss=1.034, diff_loss=0.5964, tot_loss=2.074, over 111.00 samples.], tot_loss[dur_loss=0.4293, prior_loss=1.032, diff_loss=0.6391, tot_loss=2.1, over 1656.00 samples.],
2024-10-20 18:38:17,652 INFO [train.py:682] (1/4) Start epoch 92
2024-10-20 18:38:31,378 INFO [train.py:561] (1/4) Epoch 92, batch 4, global_batch_idx: 1460, batch size: 189, loss[dur_loss=0.4234, prior_loss=1.032, diff_loss=0.6115, tot_loss=2.067, over 189.00 samples.], tot_loss[dur_loss=0.4207, prior_loss=1.031, diff_loss=0.6967, tot_loss=2.149, over 937.00 samples.],
2024-10-20 18:38:46,254 INFO [train.py:561] (1/4) Epoch 92, batch 14, global_batch_idx: 1470, batch size: 142, loss[dur_loss=0.4297, prior_loss=1.031, diff_loss=0.5577, tot_loss=2.018, over 142.00 samples.], tot_loss[dur_loss=0.429, prior_loss=1.032, diff_loss=0.6199, tot_loss=2.08, over 2210.00 samples.],
2024-10-20 18:38:47,695 INFO [train.py:682] (1/4) Start epoch 93
2024-10-20 18:39:07,416 INFO [train.py:561] (1/4) Epoch 93, batch 8, global_batch_idx: 1480, batch size: 170, loss[dur_loss=0.4334, prior_loss=1.032, diff_loss=0.5593, tot_loss=2.025, over 170.00 samples.], tot_loss[dur_loss=0.4241, prior_loss=1.031, diff_loss=0.6443, tot_loss=2.1, over 1432.00 samples.],
2024-10-20 18:39:17,603 INFO [train.py:682] (1/4) Start epoch 94
2024-10-20 18:39:28,771 INFO [train.py:561] (1/4) Epoch 94, batch 2, global_batch_idx: 1490, batch size: 203, loss[dur_loss=0.4189, prior_loss=1.031, diff_loss=0.6101, tot_loss=2.06, over 203.00 samples.], tot_loss[dur_loss=0.4291, prior_loss=1.031, diff_loss=0.5839, tot_loss=2.044, over 442.00 samples.],
2024-10-20 18:39:42,932 INFO [train.py:561] (1/4) Epoch 94, batch 12, global_batch_idx: 1500, batch size: 152, loss[dur_loss=0.4316, prior_loss=1.031, diff_loss=0.5658, tot_loss=2.029, over 152.00 samples.], tot_loss[dur_loss=0.4269, prior_loss=1.031, diff_loss=0.6258, tot_loss=2.084, over 1966.00 samples.],
2024-10-20 18:39:44,554 INFO [train.py:579] (1/4) Computing validation loss
2024-10-20 18:40:18,597 INFO [train.py:589] (1/4) Epoch 94, validation: dur_loss=0.4399, prior_loss=1.035, diff_loss=0.5649, tot_loss=2.04, over 100.00 samples.
2024-10-20 18:40:18,598 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21048MB
2024-10-20 18:40:21,469 INFO [train.py:682] (1/4) Start epoch 95
2024-10-20 18:40:38,893 INFO [train.py:561] (1/4) Epoch 95, batch 6, global_batch_idx: 1510, batch size: 106, loss[dur_loss=0.4256, prior_loss=1.031, diff_loss=0.5572, tot_loss=2.014, over 106.00 samples.], tot_loss[dur_loss=0.4206, prior_loss=1.031, diff_loss=0.6563, tot_loss=2.108, over 1142.00 samples.],
2024-10-20 18:40:52,004 INFO [train.py:682] (1/4) Start epoch 96
2024-10-20 18:41:00,837 INFO [train.py:561] (1/4) Epoch 96, batch 0, global_batch_idx: 1520, batch size: 108, loss[dur_loss=0.4486, prior_loss=1.032, diff_loss=0.5236, tot_loss=2.004, over 108.00 samples.], tot_loss[dur_loss=0.4486, prior_loss=1.032, diff_loss=0.5236, tot_loss=2.004, over 108.00 samples.],
2024-10-20 18:41:14,997 INFO [train.py:561] (1/4) Epoch 96, batch 10, global_batch_idx: 1530, batch size: 111, loss[dur_loss=0.4442, prior_loss=1.033, diff_loss=0.5354, tot_loss=2.012, over 111.00 samples.], tot_loss[dur_loss=0.4269, prior_loss=1.031, diff_loss=0.6213, tot_loss=2.079, over 1656.00 samples.],
2024-10-20 18:41:22,066 INFO [train.py:682] (1/4) Start epoch 97
2024-10-20 18:41:36,108 INFO [train.py:561] (1/4) Epoch 97, batch 4, global_batch_idx: 1540, batch size: 189, loss[dur_loss=0.4191, prior_loss=1.031, diff_loss=0.5957, tot_loss=2.046, over 189.00 samples.], tot_loss[dur_loss=0.4187, prior_loss=1.03, diff_loss=0.6753, tot_loss=2.124, over 937.00 samples.],
2024-10-20 18:41:50,853 INFO [train.py:561] (1/4) Epoch 97, batch 14, global_batch_idx: 1550, batch size: 142, loss[dur_loss=0.43, prior_loss=1.03, diff_loss=0.5408, tot_loss=2.001, over 142.00 samples.], tot_loss[dur_loss=0.4276, prior_loss=1.031, diff_loss=0.6067, tot_loss=2.065, over 2210.00 samples.],
2024-10-20 18:41:52,270 INFO [train.py:682] (1/4) Start epoch 98
2024-10-20 18:42:12,052 INFO [train.py:561] (1/4) Epoch 98, batch 8, global_batch_idx: 1560, batch size: 170, loss[dur_loss=0.4341, prior_loss=1.031, diff_loss=0.5869, tot_loss=2.052, over 170.00 samples.], tot_loss[dur_loss=0.4248, prior_loss=1.031, diff_loss=0.6339, tot_loss=2.089, over 1432.00 samples.],
2024-10-20 18:42:22,103 INFO [train.py:682] (1/4) Start epoch 99
2024-10-20 18:42:33,437 INFO [train.py:561] (1/4) Epoch 99, batch 2, global_batch_idx: 1570, batch size: 203, loss[dur_loss=0.4182, prior_loss=1.03, diff_loss=0.5889, tot_loss=2.037, over 203.00 samples.], tot_loss[dur_loss=0.428, prior_loss=1.031, diff_loss=0.5522, tot_loss=2.011, over 442.00 samples.],
2024-10-20 18:42:47,700 INFO [train.py:561] (1/4) Epoch 99, batch 12, global_batch_idx: 1580, batch size: 152, loss[dur_loss=0.4347, prior_loss=1.031, diff_loss=0.5787, tot_loss=2.045, over 152.00 samples.], tot_loss[dur_loss=0.4248, prior_loss=1.031, diff_loss=0.6105, tot_loss=2.066, over 1966.00 samples.],
2024-10-20 18:42:52,124 INFO [train.py:682] (1/4) Start epoch 100
2024-10-20 18:43:09,259 INFO [train.py:561] (1/4) Epoch 100, batch 6, global_batch_idx: 1590, batch size: 106, loss[dur_loss=0.4295, prior_loss=1.031, diff_loss=0.5659, tot_loss=2.026, over 106.00 samples.], tot_loss[dur_loss=0.4206, prior_loss=1.03, diff_loss=0.6428, tot_loss=2.093, over 1142.00 samples.],
2024-10-20 18:43:22,305 INFO [train.py:682] (1/4) Start epoch 101
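
The "Maximum memory allocated so far" lines printed after each validation pass (20443MB after the first validation, 21048MB at epoch 94) track the peak CUDA allocation on this rank. A sketch of how such a figure is typically obtained with PyTorch; this is an assumed implementation, not necessarily train.py's exact code:

```python
import torch

def max_allocated_mb(device: torch.device) -> int:
    # torch.cuda.max_memory_allocated returns the peak allocation in bytes
    # since the start of the program (or since the last reset).
    return torch.cuda.max_memory_allocated(device) // (1024 * 1024)

# Produces a line like: "Maximum memory allocated so far is 21048MB"
print(f"Maximum memory allocated so far is "
      f"{max_allocated_mb(torch.device('cuda:1'))}MB")
```
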
2024-10-20 18:43:31,067 INFO [train.py:561] (1/4) Epoch 101, batch 0, global_batch_idx: 1600, batch size: 108, loss[dur_loss=0.4482, prior_loss=1.031, diff_loss=0.5409, tot_loss=2.02, over 108.00 samples.], tot_loss[dur_loss=0.4482, prior_loss=1.031, diff_loss=0.5409, tot_loss=2.02, over 108.00 samples.],
2024-10-20 18:43:45,280 INFO [train.py:561] (1/4) Epoch 101, batch 10, global_batch_idx: 1610, batch size: 111, loss[dur_loss=0.4398, prior_loss=1.032, diff_loss=0.5851, tot_loss=2.057, over 111.00 samples.], tot_loss[dur_loss=0.4246, prior_loss=1.03, diff_loss=0.6098, tot_loss=2.065, over 1656.00 samples.],
2024-10-20 18:43:52,355 INFO [train.py:682] (1/4) Start epoch 102
2024-10-20 18:44:05,969 INFO [train.py:561] (1/4) Epoch 102, batch 4, global_batch_idx: 1620, batch size: 189, loss[dur_loss=0.4156, prior_loss=1.03, diff_loss=0.563, tot_loss=2.009, over 189.00 samples.], tot_loss[dur_loss=0.4184, prior_loss=1.03, diff_loss=0.6611, tot_loss=2.109, over 937.00 samples.],
2024-10-20 18:44:20,789 INFO [train.py:561] (1/4) Epoch 102, batch 14, global_batch_idx: 1630, batch size: 142, loss[dur_loss=0.4281, prior_loss=1.029, diff_loss=0.5304, tot_loss=1.988, over 142.00 samples.], tot_loss[dur_loss=0.4254, prior_loss=1.03, diff_loss=0.5899, tot_loss=2.045, over 2210.00 samples.],
2024-10-20 18:44:22,221 INFO [train.py:682] (1/4) Start epoch 103
2024-10-20 18:44:42,042 INFO [train.py:561] (1/4) Epoch 103, batch 8, global_batch_idx: 1640, batch size: 170, loss[dur_loss=0.4282, prior_loss=1.031, diff_loss=0.5471, tot_loss=2.006, over 170.00 samples.], tot_loss[dur_loss=0.4221, prior_loss=1.03, diff_loss=0.6235, tot_loss=2.075, over 1432.00 samples.],
2024-10-20 18:44:52,138 INFO [train.py:682] (1/4) Start epoch 104
2024-10-20 18:45:03,649 INFO [train.py:561] (1/4) Epoch 104, batch 2, global_batch_idx: 1650, batch size: 203, loss[dur_loss=0.4226, prior_loss=1.031, diff_loss=0.5622, tot_loss=2.015, over 203.00 samples.], tot_loss[dur_loss=0.4295, prior_loss=1.031, diff_loss=0.5673, tot_loss=2.027, over 442.00 samples.],
2024-10-20 18:45:17,763 INFO [train.py:561] (1/4) Epoch 104, batch 12, global_batch_idx: 1660, batch size: 152, loss[dur_loss=0.4295, prior_loss=1.03, diff_loss=0.5437, tot_loss=2.003, over 152.00 samples.], tot_loss[dur_loss=0.4243, prior_loss=1.03, diff_loss=0.5991, tot_loss=2.053, over 1966.00 samples.],
2024-10-20 18:45:22,191 INFO [train.py:682] (1/4) Start epoch 105
2024-10-20 18:45:39,235 INFO [train.py:561] (1/4) Epoch 105, batch 6, global_batch_idx: 1670, batch size: 106, loss[dur_loss=0.4229, prior_loss=1.03, diff_loss=0.5193, tot_loss=1.972, over 106.00 samples.], tot_loss[dur_loss=0.4169, prior_loss=1.029, diff_loss=0.6404, tot_loss=2.086, over 1142.00 samples.],
2024-10-20 18:45:52,248 INFO [train.py:682] (1/4) Start epoch 106
2024-10-20 18:46:01,102 INFO [train.py:561] (1/4) Epoch 106, batch 0, global_batch_idx: 1680, batch size: 108, loss[dur_loss=0.4448, prior_loss=1.031, diff_loss=0.5439, tot_loss=2.019, over 108.00 samples.], tot_loss[dur_loss=0.4448, prior_loss=1.031, diff_loss=0.5439, tot_loss=2.019, over 108.00 samples.],
2024-10-20 18:46:15,363 INFO [train.py:561] (1/4) Epoch 106, batch 10, global_batch_idx: 1690, batch size: 111, loss[dur_loss=0.4393, prior_loss=1.031, diff_loss=0.5624, tot_loss=2.033, over 111.00 samples.], tot_loss[dur_loss=0.4211, prior_loss=1.029, diff_loss=0.607, tot_loss=2.057, over 1656.00 samples.],
2024-10-20 18:46:22,481 INFO [train.py:682] (1/4) Start epoch 107
2024-10-20 18:46:36,297 INFO [train.py:561] (1/4) Epoch 107, batch 4, global_batch_idx: 1700, batch size: 189, loss[dur_loss=0.4215, prior_loss=1.03, diff_loss=0.5582, tot_loss=2.01, over 189.00 samples.], tot_loss[dur_loss=0.4181, prior_loss=1.029, diff_loss=0.6605, tot_loss=2.108, over 937.00 samples.],
2024-10-20 18:46:51,169 INFO [train.py:561] (1/4) Epoch 107, batch 14, global_batch_idx: 1710, batch size: 142, loss[dur_loss=0.4239, prior_loss=1.029, diff_loss=0.549, tot_loss=2.002, over 142.00 samples.], tot_loss[dur_loss=0.4252, prior_loss=1.03, diff_loss=0.5943, tot_loss=2.049, over 2210.00 samples.],
2024-10-20 18:46:52,601 INFO [train.py:682] (1/4) Start epoch 108
2024-10-20 18:47:12,717 INFO [train.py:561] (1/4) Epoch 108, batch 8, global_batch_idx: 1720, batch size: 170, loss[dur_loss=0.4293, prior_loss=1.03, diff_loss=0.5477, tot_loss=2.007, over 170.00 samples.], tot_loss[dur_loss=0.4217, prior_loss=1.03, diff_loss=0.6201, tot_loss=2.072, over 1432.00 samples.],
2024-10-20 18:47:22,782 INFO [train.py:682] (1/4) Start epoch 109
2024-10-20 18:47:34,160 INFO [train.py:561] (1/4) Epoch 109, batch 2, global_batch_idx: 1730, batch size: 203, loss[dur_loss=0.4182, prior_loss=1.029, diff_loss=0.5534, tot_loss=2.001, over 203.00 samples.], tot_loss[dur_loss=0.4279, prior_loss=1.03, diff_loss=0.533, tot_loss=1.991, over 442.00 samples.],
2024-10-20 18:47:48,349 INFO [train.py:561] (1/4) Epoch 109, batch 12, global_batch_idx: 1740, batch size: 152, loss[dur_loss=0.4298, prior_loss=1.03, diff_loss=0.542, tot_loss=2.001, over 152.00 samples.], tot_loss[dur_loss=0.4243, prior_loss=1.03, diff_loss=0.5953, tot_loss=2.049, over 1966.00 samples.],
2024-10-20 18:47:52,787 INFO [train.py:682] (1/4) Start epoch 110
2024-10-20 18:48:09,481 INFO [train.py:561] (1/4) Epoch 110, batch 6, global_batch_idx: 1750, batch size: 106, loss[dur_loss=0.4331, prior_loss=1.03, diff_loss=0.4909, tot_loss=1.954, over 106.00 samples.], tot_loss[dur_loss=0.4192, prior_loss=1.029, diff_loss=0.6291, tot_loss=2.078, over 1142.00 samples.],
2024-10-20 18:48:22,498 INFO [train.py:682] (1/4) Start epoch 111
2024-10-20 18:48:31,005 INFO [train.py:561] (1/4) Epoch 111, batch 0, global_batch_idx: 1760, batch size: 108, loss[dur_loss=0.4438, prior_loss=1.031, diff_loss=0.5411, tot_loss=2.016, over 108.00 samples.], tot_loss[dur_loss=0.4438, prior_loss=1.031, diff_loss=0.5411, tot_loss=2.016, over 108.00 samples.],
2024-10-20 18:48:45,147 INFO [train.py:561] (1/4) Epoch 111, batch 10, global_batch_idx: 1770, batch size: 111, loss[dur_loss=0.4418, prior_loss=1.031, diff_loss=0.5564, tot_loss=2.029, over 111.00 samples.], tot_loss[dur_loss=0.4237, prior_loss=1.029, diff_loss=0.6138, tot_loss=2.067, over 1656.00 samples.],
2024-10-20 18:48:52,199 INFO [train.py:682] (1/4) Start epoch 112
2024-10-20 18:49:05,568 INFO [train.py:561] (1/4) Epoch 112, batch 4, global_batch_idx: 1780, batch size: 189, loss[dur_loss=0.4188, prior_loss=1.029, diff_loss=0.5743, tot_loss=2.022, over 189.00 samples.], tot_loss[dur_loss=0.4161, prior_loss=1.028, diff_loss=0.6637, tot_loss=2.108, over 937.00 samples.],
2024-10-20 18:49:20,368 INFO [train.py:561] (1/4) Epoch 112, batch 14, global_batch_idx: 1790, batch size: 142, loss[dur_loss=0.424, prior_loss=1.028, diff_loss=0.553, tot_loss=2.005, over 142.00 samples.], tot_loss[dur_loss=0.4227, prior_loss=1.029, diff_loss=0.5913, tot_loss=2.043, over 2210.00 samples.],
2024-10-20 18:49:21,788 INFO [train.py:682] (1/4) Start epoch 113
2024-10-20 18:49:41,608 INFO [train.py:561] (1/4) Epoch 113, batch 8, global_batch_idx: 1800, batch size: 170, loss[dur_loss=0.4216, prior_loss=1.029, diff_loss=0.5235, tot_loss=1.974, over 170.00 samples.], tot_loss[dur_loss=0.4168, prior_loss=1.028, diff_loss=0.6056, tot_loss=2.051, over 1432.00 samples.],
2024-10-20 18:49:51,710 INFO [train.py:682] (1/4) Start epoch 114
2024-10-20 18:50:02,894 INFO [train.py:561] (1/4) Epoch 114, batch 2, global_batch_idx: 1810, batch size: 203, loss[dur_loss=0.4116, prior_loss=1.028, diff_loss=0.5459, tot_loss=1.985, over 203.00 samples.], tot_loss[dur_loss=0.4205, prior_loss=1.029, diff_loss=0.5306, tot_loss=1.98, over 442.00 samples.],
2024-10-20 18:50:17,094 INFO [train.py:561] (1/4) Epoch 114, batch 12, global_batch_idx: 1820, batch size: 152, loss[dur_loss=0.4271, prior_loss=1.028, diff_loss=0.4985, tot_loss=1.954, over 152.00 samples.], tot_loss[dur_loss=0.4195, prior_loss=1.028, diff_loss=0.5802, tot_loss=2.028, over 1966.00 samples.],
2024-10-20 18:50:21,535 INFO [train.py:682] (1/4) Start epoch 115
2024-10-20 18:50:38,470 INFO [train.py:561] (1/4) Epoch 115, batch 6, global_batch_idx: 1830, batch size: 106, loss[dur_loss=0.4209, prior_loss=1.029, diff_loss=0.507, tot_loss=1.956, over 106.00 samples.], tot_loss[dur_loss=0.4154, prior_loss=1.028, diff_loss=0.6123, tot_loss=2.056, over 1142.00 samples.],
2024-10-20 18:50:51,460 INFO [train.py:682] (1/4) Start epoch 116
2024-10-20 18:51:00,046 INFO [train.py:561] (1/4) Epoch 116, batch 0, global_batch_idx: 1840, batch size: 108, loss[dur_loss=0.4401, prior_loss=1.029, diff_loss=0.5144, tot_loss=1.984, over 108.00 samples.], tot_loss[dur_loss=0.4401, prior_loss=1.029, diff_loss=0.5144, tot_loss=1.984, over 108.00 samples.],
2024-10-20 18:51:14,221 INFO [train.py:561] (1/4) Epoch 116, batch 10, global_batch_idx: 1850, batch size: 111, loss[dur_loss=0.4375, prior_loss=1.031, diff_loss=0.5122, tot_loss=1.98, over 111.00 samples.], tot_loss[dur_loss=0.4173, prior_loss=1.028, diff_loss=0.5937, tot_loss=2.039, over 1656.00 samples.],
2024-10-20 18:51:21,321 INFO [train.py:682] (1/4) Start epoch 117
2024-10-20 18:51:34,885 INFO [train.py:561] (1/4) Epoch 117, batch 4, global_batch_idx: 1860, batch size: 189, loss[dur_loss=0.4108, prior_loss=1.028, diff_loss=0.5368, tot_loss=1.976, over 189.00 samples.], tot_loss[dur_loss=0.4114, prior_loss=1.028, diff_loss=0.6429, tot_loss=2.082, over 937.00 samples.],
2024-10-20 18:51:49,691 INFO [train.py:561] (1/4) Epoch 117, batch 14, global_batch_idx: 1870, batch size: 142, loss[dur_loss=0.4191, prior_loss=1.027, diff_loss=0.5052, tot_loss=1.952, over 142.00 samples.], tot_loss[dur_loss=0.4185, prior_loss=1.028, diff_loss=0.5759, tot_loss=2.022, over 2210.00 samples.],
2024-10-20 18:51:51,114 INFO [train.py:682] (1/4) Start epoch 118
2024-10-20 18:52:10,916 INFO [train.py:561] (1/4) Epoch 118, batch 8, global_batch_idx: 1880, batch size: 170, loss[dur_loss=0.4232, prior_loss=1.028, diff_loss=0.5452, tot_loss=1.996, over 170.00 samples.], tot_loss[dur_loss=0.4153, prior_loss=1.028, diff_loss=0.6027, tot_loss=2.046, over 1432.00 samples.],
2024-10-20 18:52:21,008 INFO [train.py:682] (1/4) Start epoch 119
2024-10-20 18:52:32,287 INFO [train.py:561] (1/4) Epoch 119, batch 2, global_batch_idx: 1890, batch size: 203, loss[dur_loss=0.4109, prior_loss=1.028, diff_loss=0.5611, tot_loss=2, over 203.00 samples.], tot_loss[dur_loss=0.4212, prior_loss=1.028, diff_loss=0.5473, tot_loss=1.997, over 442.00 samples.],
2024-10-20 18:52:46,497 INFO [train.py:561] (1/4) Epoch 119, batch 12, global_batch_idx: 1900, batch size: 152, loss[dur_loss=0.4231, prior_loss=1.028, diff_loss=0.5158, tot_loss=1.967, over 152.00 samples.], tot_loss[dur_loss=0.419, prior_loss=1.028, diff_loss=0.5815, tot_loss=2.029, over 1966.00 samples.],
2024-10-20 18:52:50,926 INFO [train.py:682] (1/4) Start epoch 120
2024-10-20 18:53:07,550 INFO [train.py:561] (1/4) Epoch 120, batch 6, global_batch_idx: 1910, batch size: 106, loss[dur_loss=0.4302, prior_loss=1.028, diff_loss=0.5064, tot_loss=1.965, over 106.00 samples.], tot_loss[dur_loss=0.4149, prior_loss=1.028, diff_loss=0.6087, tot_loss=2.051, over 1142.00 samples.],
2024-10-20 18:53:20,518 INFO [train.py:682] (1/4) Start epoch 121
2024-10-20 18:53:29,555 INFO [train.py:561] (1/4) Epoch 121, batch 0, global_batch_idx: 1920, batch size: 108, loss[dur_loss=0.4388, prior_loss=1.029, diff_loss=0.5279, tot_loss=1.996, over 108.00 samples.], tot_loss[dur_loss=0.4388, prior_loss=1.029, diff_loss=0.5279, tot_loss=1.996, over 108.00 samples.],
2024-10-20 18:53:43,789 INFO [train.py:561] (1/4) Epoch 121, batch 10, global_batch_idx: 1930, batch size: 111, loss[dur_loss=0.4304, prior_loss=1.03, diff_loss=0.4697, tot_loss=1.93, over 111.00 samples.], tot_loss[dur_loss=0.416, prior_loss=1.028, diff_loss=0.5807, tot_loss=2.024, over 1656.00 samples.],
2024-10-20 18:53:50,904 INFO [train.py:682] (1/4) Start epoch 122
2024-10-20 18:54:04,452 INFO [train.py:561] (1/4) Epoch 122, batch 4, global_batch_idx: 1940, batch size: 189, loss[dur_loss=0.4127, prior_loss=1.028, diff_loss=0.5862, tot_loss=2.027, over 189.00 samples.], tot_loss[dur_loss=0.4095, prior_loss=1.027, diff_loss=0.6436, tot_loss=2.08, over 937.00 samples.],
2024-10-20 18:54:19,276 INFO [train.py:561] (1/4) Epoch 122, batch 14, global_batch_idx: 1950, batch size: 142, loss[dur_loss=0.4161, prior_loss=1.027, diff_loss=0.5223, tot_loss=1.965, over 142.00 samples.], tot_loss[dur_loss=0.4172, prior_loss=1.027, diff_loss=0.5705, tot_loss=2.015, over 2210.00 samples.],
2024-10-20 18:54:20,695 INFO [train.py:682] (1/4) Start epoch 123
2024-10-20 18:54:40,356 INFO [train.py:561] (1/4) Epoch 123, batch 8, global_batch_idx: 1960, batch size: 170, loss[dur_loss=0.4216, prior_loss=1.028, diff_loss=0.5061, tot_loss=1.955, over 170.00 samples.], tot_loss[dur_loss=0.4142, prior_loss=1.027, diff_loss=0.593, tot_loss=2.034, over 1432.00 samples.],
2024-10-20 18:54:50,513 INFO [train.py:682] (1/4) Start epoch 124
2024-10-20 18:55:01,869 INFO [train.py:561] (1/4) Epoch 124, batch 2, global_batch_idx: 1970, batch size: 203, loss[dur_loss=0.4125, prior_loss=1.027, diff_loss=0.5801, tot_loss=2.02, over 203.00 samples.], tot_loss[dur_loss=0.4211, prior_loss=1.028, diff_loss=0.5418, tot_loss=1.99, over 442.00 samples.],
2024-10-20 18:55:16,270 INFO [train.py:561] (1/4) Epoch 124, batch 12, global_batch_idx: 1980, batch size: 152, loss[dur_loss=0.4172, prior_loss=1.027, diff_loss=0.4832, tot_loss=1.928, over 152.00 samples.], tot_loss[dur_loss=0.4157, prior_loss=1.027, diff_loss=0.5728, tot_loss=2.016, over 1966.00 samples.],
2024-10-20 18:55:20,727 INFO [train.py:682] (1/4) Start epoch 125
2024-10-20 18:55:37,593 INFO [train.py:561] (1/4) Epoch 125, batch 6, global_batch_idx: 1990, batch size: 106, loss[dur_loss=0.4148, prior_loss=1.027, diff_loss=0.5218, tot_loss=1.964, over 106.00 samples.], tot_loss[dur_loss=0.4104, prior_loss=1.027, diff_loss=0.611, tot_loss=2.048, over 1142.00 samples.],
2024-10-20 18:55:50,774 INFO [train.py:682] (1/4) Start epoch 126
2024-10-20 18:55:59,760 INFO [train.py:561] (1/4) Epoch 126, batch 0, global_batch_idx: 2000, batch size: 108, loss[dur_loss=0.4371, prior_loss=1.028, diff_loss=0.4924, tot_loss=1.958, over 108.00 samples.], tot_loss[dur_loss=0.4371, prior_loss=1.028, diff_loss=0.4924, tot_loss=1.958, over 108.00 samples.],
2024-10-20 18:56:14,166 INFO [train.py:561] (1/4) Epoch 126, batch 10, global_batch_idx: 2010, batch size: 111, loss[dur_loss=0.4289, prior_loss=1.029, diff_loss=0.5626, tot_loss=2.021, over 111.00 samples.], tot_loss[dur_loss=0.4163, prior_loss=1.027, diff_loss=0.584, tot_loss=2.027, over 1656.00 samples.],
2024-10-20 18:56:21,369 INFO [train.py:682] (1/4) Start epoch 127
2024-10-20 18:56:35,172 INFO [train.py:561] (1/4) Epoch 127, batch 4, global_batch_idx: 2020, batch size: 189, loss[dur_loss=0.4124, prior_loss=1.027, diff_loss=0.5171, tot_loss=1.957, over 189.00 samples.], tot_loss[dur_loss=0.4069, prior_loss=1.026, diff_loss=0.625, tot_loss=2.058, over 937.00 samples.],
2024-10-20 18:56:50,342 INFO [train.py:561] (1/4) Epoch 127, batch 14, global_batch_idx: 2030, batch size: 142, loss[dur_loss=0.4174, prior_loss=1.026, diff_loss=0.516, tot_loss=1.959, over 142.00 samples.], tot_loss[dur_loss=0.4152, prior_loss=1.027, diff_loss=0.5564, tot_loss=1.998, over 2210.00 samples.],
2024-10-20 18:56:51,798 INFO [train.py:682] (1/4) Start epoch 128
2024-10-20 18:57:11,766 INFO [train.py:561] (1/4) Epoch 128, batch 8, global_batch_idx: 2040, batch size: 170, loss[dur_loss=0.4223, prior_loss=1.027, diff_loss=0.4964, tot_loss=1.946, over 170.00 samples.], tot_loss[dur_loss=0.4122, prior_loss=1.026, diff_loss=0.5899, tot_loss=2.029, over 1432.00 samples.],
2024-10-20 18:57:21,951 INFO [train.py:682] (1/4) Start epoch 129
2024-10-20 18:57:33,729 INFO [train.py:561] (1/4) Epoch 129, batch 2, global_batch_idx: 2050, batch size: 203, loss[dur_loss=0.4075, prior_loss=1.026, diff_loss=0.5535, tot_loss=1.987, over 203.00 samples.], tot_loss[dur_loss=0.4166, prior_loss=1.027, diff_loss=0.5316, tot_loss=1.975, over 442.00 samples.],
2024-10-20 18:57:47,944 INFO [train.py:561] (1/4) Epoch 129, batch 12, global_batch_idx: 2060, batch size: 152, loss[dur_loss=0.4197, prior_loss=1.027, diff_loss=0.5042, tot_loss=1.95, over 152.00 samples.], tot_loss[dur_loss=0.4139, prior_loss=1.026, diff_loss=0.5653, tot_loss=2.006, over 1966.00 samples.],
2024-10-20 18:57:52,384 INFO [train.py:682] (1/4) Start epoch 130
2024-10-20 18:58:09,310 INFO [train.py:561] (1/4) Epoch 130, batch 6, global_batch_idx: 2070, batch size: 106, loss[dur_loss=0.4155, prior_loss=1.027, diff_loss=0.4926, tot_loss=1.935, over 106.00 samples.], tot_loss[dur_loss=0.4094, prior_loss=1.026, diff_loss=0.6028, tot_loss=2.038, over 1142.00 samples.],
2024-10-20 18:58:22,366 INFO [train.py:682] (1/4) Start epoch 131
2024-10-20 18:58:31,036 INFO [train.py:561] (1/4) Epoch 131, batch 0, global_batch_idx: 2080, batch size: 108, loss[dur_loss=0.4364, prior_loss=1.028, diff_loss=0.4995, tot_loss=1.963, over 108.00 samples.], tot_loss[dur_loss=0.4364, prior_loss=1.028, diff_loss=0.4995, tot_loss=1.963, over 108.00 samples.],
2024-10-20 18:58:45,310 INFO [train.py:561] (1/4) Epoch 131, batch 10, global_batch_idx: 2090, batch size: 111, loss[dur_loss=0.4282, prior_loss=1.028, diff_loss=0.5178, tot_loss=1.974, over 111.00 samples.], tot_loss[dur_loss=0.4137, prior_loss=1.026, diff_loss=0.5705, tot_loss=2.011, over 1656.00 samples.],
2024-10-20 18:58:52,398 INFO [train.py:682] (1/4) Start epoch 132
2024-10-20 18:59:06,179 INFO [train.py:561] (1/4) Epoch 132, batch 4, global_batch_idx: 2100, batch size: 189, loss[dur_loss=0.409, prior_loss=1.027, diff_loss=0.5445, tot_loss=1.98, over 189.00 samples.], tot_loss[dur_loss=0.4064, prior_loss=1.026, diff_loss=0.6317, tot_loss=2.064, over 937.00 samples.],
2024-10-20 18:59:20,948 INFO [train.py:561] (1/4) Epoch 132, batch 14, global_batch_idx: 2110, batch size: 142, loss[dur_loss=0.4156, prior_loss=1.026, diff_loss=0.4986, tot_loss=1.94, over 142.00 samples.], tot_loss[dur_loss=0.4135, prior_loss=1.026, diff_loss=0.5549, tot_loss=1.995, over 2210.00 samples.],
2024-10-20 18:59:22,363 INFO [train.py:682] (1/4) Start epoch 133
2024-10-20 18:59:42,295 INFO [train.py:561] (1/4) Epoch 133, batch 8, global_batch_idx: 2120, batch size: 170, loss[dur_loss=0.4142, prior_loss=1.027, diff_loss=0.5357, tot_loss=1.977, over 170.00 samples.], tot_loss[dur_loss=0.4099, prior_loss=1.026, diff_loss=0.5782, tot_loss=2.014, over 1432.00 samples.],
2024-10-20 18:59:52,353 INFO [train.py:682] (1/4) Start epoch 134
2024-10-20 19:00:03,482 INFO [train.py:561] (1/4) Epoch 134, batch 2, global_batch_idx: 2130, batch size: 203, loss[dur_loss=0.4108, prior_loss=1.026, diff_loss=0.5823, tot_loss=2.019, over 203.00 samples.], tot_loss[dur_loss=0.4176, prior_loss=1.026, diff_loss=0.5389, tot_loss=1.983, over 442.00 samples.],
2024-10-20 19:00:17,659 INFO [train.py:561] (1/4) Epoch 134, batch 12, global_batch_idx: 2140, batch size: 152, loss[dur_loss=0.4156, prior_loss=1.026, diff_loss=0.4967, tot_loss=1.938, over 152.00 samples.], tot_loss[dur_loss=0.4137, prior_loss=1.026, diff_loss=0.5674, tot_loss=2.007, over 1966.00 samples.],
2024-10-20 19:00:22,104 INFO [train.py:682] (1/4) Start epoch 135
2024-10-20 19:00:39,065 INFO [train.py:561] (1/4) Epoch 135, batch 6, global_batch_idx: 2150, batch size: 106, loss[dur_loss=0.4253, prior_loss=1.027, diff_loss=0.4992, tot_loss=1.952, over 106.00 samples.], tot_loss[dur_loss=0.4102, prior_loss=1.026, diff_loss=0.5941, tot_loss=2.03, over 1142.00 samples.],
2024-10-20 19:00:52,067 INFO [train.py:682] (1/4) Start epoch 136
2024-10-20 19:01:00,543 INFO [train.py:561] (1/4) Epoch 136, batch 0, global_batch_idx: 2160, batch size: 108, loss[dur_loss=0.4347, prior_loss=1.027, diff_loss=0.5277, tot_loss=1.99, over 108.00 samples.], tot_loss[dur_loss=0.4347, prior_loss=1.027, diff_loss=0.5277, tot_loss=1.99, over 108.00 samples.],
2024-10-20 19:01:14,822 INFO [train.py:561] (1/4) Epoch 136, batch 10, global_batch_idx: 2170, batch size: 111, loss[dur_loss=0.43, prior_loss=1.028, diff_loss=0.5119, tot_loss=1.97, over 111.00 samples.], tot_loss[dur_loss=0.4129, prior_loss=1.026, diff_loss=0.579, tot_loss=2.018, over 1656.00 samples.],
2024-10-20 19:01:21,924 INFO [train.py:682] (1/4) Start epoch 137
2024-10-20 19:01:35,864 INFO [train.py:561] (1/4) Epoch 137, batch 4, global_batch_idx: 2180, batch size: 189, loss[dur_loss=0.407, prior_loss=1.026, diff_loss=0.539, tot_loss=1.972, over 189.00 samples.], tot_loss[dur_loss=0.405, prior_loss=1.026, diff_loss=0.6305, tot_loss=2.061, over 937.00 samples.],
2024-10-20 19:01:50,752 INFO [train.py:561] (1/4) Epoch 137, batch 14, global_batch_idx: 2190, batch size: 142, loss[dur_loss=0.4145, prior_loss=1.025, diff_loss=0.4655, tot_loss=1.905, over 142.00 samples.], tot_loss[dur_loss=0.4134, prior_loss=1.026, diff_loss=0.5523, tot_loss=1.992, over 2210.00 samples.],
2024-10-20 19:01:52,193 INFO [train.py:682] (1/4) Start epoch 138
2024-10-20 19:02:12,544 INFO [train.py:561] (1/4) Epoch 138, batch 8, global_batch_idx: 2200, batch size: 170, loss[dur_loss=0.419, prior_loss=1.026, diff_loss=0.5043, tot_loss=1.949, over 170.00 samples.], tot_loss[dur_loss=0.412, prior_loss=1.026, diff_loss=0.5811, tot_loss=2.019, over 1432.00 samples.],
2024-10-20 19:02:22,783 INFO [train.py:682] (1/4) Start epoch 139
2024-10-20 19:02:34,237 INFO [train.py:561] (1/4) Epoch 139, batch 2, global_batch_idx: 2210, batch size: 203, loss[dur_loss=0.4058, prior_loss=1.025, diff_loss=0.5343, tot_loss=1.965, over 203.00 samples.], tot_loss[dur_loss=0.4145, prior_loss=1.026, diff_loss=0.4906, tot_loss=1.931, over 442.00 samples.],
2024-10-20 19:02:48,483 INFO [train.py:561] (1/4) Epoch 139, batch 12, global_batch_idx: 2220, batch size: 152, loss[dur_loss=0.4148, prior_loss=1.025, diff_loss=0.4946, tot_loss=1.935, over 152.00 samples.], tot_loss[dur_loss=0.4112, prior_loss=1.025, diff_loss=0.554, tot_loss=1.991, over 1966.00 samples.],
2024-10-20 19:02:52,995 INFO [train.py:682] (1/4) Start epoch 140
2024-10-20 19:03:10,153 INFO [train.py:561] (1/4) Epoch 140, batch 6, global_batch_idx: 2230, batch size: 106, loss[dur_loss=0.4178, prior_loss=1.026, diff_loss=0.4572, tot_loss=1.901, over 106.00 samples.], tot_loss[dur_loss=0.4054, prior_loss=1.025, diff_loss=0.5881, tot_loss=2.019, over 1142.00 samples.],
2024-10-20 19:03:23,273 INFO [train.py:682] (1/4) Start epoch 141
2024-10-20 19:03:31,942 INFO [train.py:561] (1/4) Epoch 141, batch 0, global_batch_idx: 2240, batch size: 108, loss[dur_loss=0.4302, prior_loss=1.026, diff_loss=0.484, tot_loss=1.94, over 108.00 samples.], tot_loss[dur_loss=0.4302, prior_loss=1.026, diff_loss=0.484, tot_loss=1.94, over 108.00 samples.],
2024-10-20 19:03:46,274 INFO [train.py:561] (1/4) Epoch 141, batch 10, global_batch_idx: 2250, batch size: 111, loss[dur_loss=0.4238, prior_loss=1.027, diff_loss=0.4708, tot_loss=1.922, over 111.00 samples.], tot_loss[dur_loss=0.4092, prior_loss=1.025, diff_loss=0.5531, tot_loss=1.988, over 1656.00 samples.],
2024-10-20 19:03:53,423 INFO [train.py:682] (1/4) Start epoch 142
2024-10-20 19:04:07,234 INFO [train.py:561] (1/4) Epoch 142, batch 4, global_batch_idx: 2260, batch size: 189, loss[dur_loss=0.4066, prior_loss=1.026, diff_loss=0.5561, tot_loss=1.989, over 189.00 samples.], tot_loss[dur_loss=0.4045, prior_loss=1.025, diff_loss=0.6188, tot_loss=2.048, over 937.00 samples.],
2024-10-20 19:04:22,433 INFO [train.py:561] (1/4) Epoch 142, batch 14, global_batch_idx: 2270, batch size: 142, loss[dur_loss=0.4139, prior_loss=1.024, diff_loss=0.449, tot_loss=1.887, over 142.00 samples.], tot_loss[dur_loss=0.4111, prior_loss=1.025, diff_loss=0.5452, tot_loss=1.981, over 2210.00 samples.],
2024-10-20 19:04:23,873 INFO [train.py:682] (1/4) Start epoch 143
2024-10-20 19:04:43,974 INFO [train.py:561] (1/4) Epoch 143, batch 8, global_batch_idx: 2280, batch size: 170, loss[dur_loss=0.414, prior_loss=1.025, diff_loss=0.5116, tot_loss=1.951, over 170.00 samples.], tot_loss[dur_loss=0.4065, prior_loss=1.025, diff_loss=0.5791, tot_loss=2.01, over 1432.00 samples.],
2024-10-20 19:04:54,138 INFO [train.py:682] (1/4) Start epoch 144
2024-10-20 19:05:05,557 INFO [train.py:561] (1/4) Epoch 144, batch 2, global_batch_idx: 2290, batch size: 203, loss[dur_loss=0.4036, prior_loss=1.025, diff_loss=0.5173, tot_loss=1.946, over 203.00 samples.], tot_loss[dur_loss=0.412, prior_loss=1.025, diff_loss=0.5053, tot_loss=1.942, over 442.00 samples.],
2024-10-20 19:05:19,869 INFO [train.py:561] (1/4) Epoch 144, batch 12, global_batch_idx: 2300, batch size: 152, loss[dur_loss=0.4177, prior_loss=1.025, diff_loss=0.5057, tot_loss=1.949, over 152.00 samples.], tot_loss[dur_loss=0.4089, prior_loss=1.025, diff_loss=0.5427, tot_loss=1.976, over 1966.00 samples.],
2024-10-20 19:05:24,337 INFO [train.py:682] (1/4) Start epoch 145
2024-10-20 19:05:41,141 INFO [train.py:561] (1/4) Epoch 145, batch 6, global_batch_idx: 2310, batch size: 106, loss[dur_loss=0.4136, prior_loss=1.025, diff_loss=0.4612, tot_loss=1.9, over 106.00 samples.], tot_loss[dur_loss=0.4056, prior_loss=1.025, diff_loss=0.5764, tot_loss=2.007, over 1142.00 samples.],
2024-10-20 19:05:54,323 INFO [train.py:682] (1/4) Start epoch 146
2024-10-20 19:06:03,375 INFO [train.py:561] (1/4) Epoch 146, batch 0, global_batch_idx: 2320, batch size: 108, loss[dur_loss=0.4324, prior_loss=1.026, diff_loss=0.5316, tot_loss=1.99, over 108.00 samples.], tot_loss[dur_loss=0.4324, prior_loss=1.026, diff_loss=0.5316, tot_loss=1.99, over 108.00 samples.],
2024-10-20 19:06:17,632 INFO [train.py:561] (1/4) Epoch 146, batch 10, global_batch_idx: 2330, batch size: 111, loss[dur_loss=0.4261, prior_loss=1.027, diff_loss=0.5214, tot_loss=1.974, over 111.00 samples.], tot_loss[dur_loss=0.4083, prior_loss=1.025, diff_loss=0.5612, tot_loss=1.994, over 1656.00 samples.],
2024-10-20 19:06:24,734 INFO [train.py:682] (1/4) Start epoch 147
2024-10-20 19:06:38,536 INFO [train.py:561] (1/4) Epoch 147, batch 4, global_batch_idx: 2340, batch size: 189, loss[dur_loss=0.4064, prior_loss=1.025, diff_loss=0.559, tot_loss=1.991, over 189.00 samples.], tot_loss[dur_loss=0.4033, prior_loss=1.024, diff_loss=0.6185, tot_loss=2.046, over 937.00 samples.],
2024-10-20 19:06:53,460 INFO [train.py:561] (1/4) Epoch 147, batch 14, global_batch_idx: 2350, batch size: 142, loss[dur_loss=0.4147, prior_loss=1.024, diff_loss=0.4886, tot_loss=1.927, over 142.00 samples.], tot_loss[dur_loss=0.4105, prior_loss=1.025, diff_loss=0.5416, tot_loss=1.977, over 2210.00 samples.],
2024-10-20 19:06:54,885 INFO [train.py:682] (1/4) Start epoch 148
2024-10-20 19:07:15,192 INFO [train.py:561] (1/4) Epoch 148, batch 8, global_batch_idx: 2360, batch size: 170, loss[dur_loss=0.4109, prior_loss=1.025, diff_loss=0.5151, tot_loss=1.951, over 170.00 samples.], tot_loss[dur_loss=0.406, prior_loss=1.024, diff_loss=0.5633, tot_loss=1.994, over 1432.00 samples.],
2024-10-20 19:07:25,444 INFO [train.py:682] (1/4) Start epoch 149
2024-10-20 19:07:37,134 INFO [train.py:561] (1/4) Epoch 149, batch 2, global_batch_idx: 2370, batch size: 203, loss[dur_loss=0.4039, prior_loss=1.024, diff_loss=0.4866, tot_loss=1.915, over 203.00 samples.], tot_loss[dur_loss=0.4106, prior_loss=1.025, diff_loss=0.4922, tot_loss=1.928, over 442.00 samples.],
2024-10-20 19:07:51,484 INFO [train.py:561] (1/4) Epoch 149, batch 12, global_batch_idx: 2380, batch size: 152, loss[dur_loss=0.4105, prior_loss=1.024, diff_loss=0.4893, tot_loss=1.924, over 152.00 samples.], tot_loss[dur_loss=0.4074, prior_loss=1.024, diff_loss=0.5453, tot_loss=1.977, over 1966.00 samples.],
2024-10-20 19:07:55,964 INFO [train.py:682] (1/4) Start epoch 150
2024-10-20 19:08:13,190 INFO [train.py:561] (1/4) Epoch 150, batch 6, global_batch_idx: 2390, batch size: 106, loss[dur_loss=0.4078, prior_loss=1.025, diff_loss=0.4437, tot_loss=1.876, over 106.00 samples.], tot_loss[dur_loss=0.4025, prior_loss=1.024, diff_loss=0.5723, tot_loss=1.999, over 1142.00 samples.],
2024-10-20 19:08:26,384 INFO [train.py:682] (1/4) Start epoch 151
2024-10-20 19:08:35,057 INFO [train.py:561] (1/4) Epoch 151, batch 0, global_batch_idx: 2400, batch size: 108, loss[dur_loss=0.4278, prior_loss=1.025, diff_loss=0.4578, tot_loss=1.911, over 108.00 samples.], tot_loss[dur_loss=0.4278, prior_loss=1.025, diff_loss=0.4578, tot_loss=1.911, over 108.00 samples.],
2024-10-20 19:08:49,418 INFO [train.py:561] (1/4) Epoch 151, batch 10, global_batch_idx: 2410, batch size: 111, loss[dur_loss=0.425, prior_loss=1.027, diff_loss=0.4604, tot_loss=1.912, over 111.00 samples.], tot_loss[dur_loss=0.4087, prior_loss=1.025, diff_loss=0.5539, tot_loss=1.987, over 1656.00 samples.],
2024-10-20 19:08:56,470 INFO [train.py:682] (1/4) Start epoch 152
2024-10-20 19:09:10,530 INFO [train.py:561] (1/4) Epoch 152, batch 4, global_batch_idx: 2420, batch size: 189, loss[dur_loss=0.4017, prior_loss=1.025, diff_loss=0.5476, tot_loss=1.974, over 189.00 samples.], tot_loss[dur_loss=0.4026, prior_loss=1.024, diff_loss=0.614, tot_loss=2.041, over 937.00 samples.],
2024-10-20 19:09:25,521 INFO [train.py:561] (1/4) Epoch 152, batch 14, global_batch_idx: 2430, batch size: 142, loss[dur_loss=0.4118, prior_loss=1.024, diff_loss=0.5138, tot_loss=1.949, over 142.00 samples.], tot_loss[dur_loss=0.4095, prior_loss=1.024, diff_loss=0.5467, tot_loss=1.981, over 2210.00 samples.],
2024-10-20 19:09:26,956 INFO [train.py:682] (1/4) Start epoch 153
2024-10-20 19:09:46,808 INFO [train.py:561] (1/4) Epoch 153, batch 8, global_batch_idx: 2440, batch size: 170, loss[dur_loss=0.4136, prior_loss=1.024, diff_loss=0.5105, tot_loss=1.949, over 170.00 samples.], tot_loss[dur_loss=0.4053, prior_loss=1.024, diff_loss=0.5692, tot_loss=1.999, over 1432.00 samples.],
2024-10-20 19:09:57,003 INFO [train.py:682] (1/4) Start epoch 154
2024-10-20 19:10:08,382 INFO [train.py:561] (1/4) Epoch 154, batch 2, global_batch_idx: 2450, batch size: 203, loss[dur_loss=0.4029, prior_loss=1.024, diff_loss=0.5094, tot_loss=1.936, over 203.00 samples.], tot_loss[dur_loss=0.4106, prior_loss=1.024, diff_loss=0.4957, tot_loss=1.931, over 442.00 samples.],
2024-10-20 19:10:22,677 INFO [train.py:561] (1/4) Epoch 154, batch 12, global_batch_idx: 2460, batch size: 152, loss[dur_loss=0.4136, prior_loss=1.024, diff_loss=0.4713, tot_loss=1.909, over 152.00 samples.], tot_loss[dur_loss=0.4057, prior_loss=1.024, diff_loss=0.5326, tot_loss=1.962, over 1966.00 samples.],
2024-10-20 19:10:27,136 INFO [train.py:682] (1/4) Start epoch 155
2024-10-20 19:10:44,172 INFO [train.py:561] (1/4) Epoch 155, batch 6, global_batch_idx: 2470, batch size: 106, loss[dur_loss=0.4045, prior_loss=1.024, diff_loss=0.473, tot_loss=1.901, over 106.00 samples.], tot_loss[dur_loss=0.4009, prior_loss=1.023, diff_loss=0.5918, tot_loss=2.016, over 1142.00 samples.],
2024-10-20 19:10:57,340 INFO [train.py:682] (1/4) Start epoch 156
2024-10-20 19:11:05,926 INFO [train.py:561] (1/4) Epoch 156, batch 0, global_batch_idx: 2480, batch size: 108, loss[dur_loss=0.4262, prior_loss=1.025, diff_loss=0.5312, tot_loss=1.982, over 108.00 samples.], tot_loss[dur_loss=0.4262, prior_loss=1.025, diff_loss=0.5312, tot_loss=1.982, over 108.00 samples.],
2024-10-20 19:11:20,137 INFO [train.py:561] (1/4) Epoch 156, batch 10, global_batch_idx: 2490, batch size: 111, loss[dur_loss=0.4207, prior_loss=1.026, diff_loss=0.4607, tot_loss=1.908, over 111.00 samples.], tot_loss[dur_loss=0.4044, prior_loss=1.024, diff_loss=0.5542, tot_loss=1.982, over 1656.00 samples.],
2024-10-20 19:11:27,219 INFO [train.py:682] (1/4) Start epoch 157
2024-10-20 19:11:40,973 INFO [train.py:561] (1/4) Epoch 157, batch 4, global_batch_idx: 2500, batch size: 189, loss[dur_loss=0.4025, prior_loss=1.024, diff_loss=0.4839, tot_loss=1.91, over 189.00 samples.], tot_loss[dur_loss=0.3979, prior_loss=1.023, diff_loss=0.5915, tot_loss=2.013, over 937.00 samples.],
2024-10-20 19:11:55,808 INFO [train.py:561] (1/4) Epoch 157, batch 14, global_batch_idx: 2510, batch size: 142, loss[dur_loss=0.4032, prior_loss=1.023, diff_loss=0.5008, tot_loss=1.926, over 142.00 samples.], tot_loss[dur_loss=0.4042, prior_loss=1.023, diff_loss=0.5318, tot_loss=1.959, over 2210.00 samples.],
2024-10-20 19:11:57,226 INFO [train.py:682] (1/4) Start epoch 158
2024-10-20 19:12:16,967 INFO [train.py:561] (1/4) Epoch 158, batch 8, global_batch_idx: 2520, batch size: 170, loss[dur_loss=0.4059, prior_loss=1.024, diff_loss=0.4716, tot_loss=1.901, over 170.00 samples.], tot_loss[dur_loss=0.4008, prior_loss=1.023, diff_loss=0.5526, tot_loss=1.977, over 1432.00 samples.],
2024-10-20 19:12:27,099 INFO [train.py:682] (1/4) Start epoch 159
2024-10-20 19:12:38,256 INFO [train.py:561] (1/4) Epoch 159, batch 2, global_batch_idx: 2530, batch size: 203, loss[dur_loss=0.3997, prior_loss=1.023, diff_loss=0.4927, tot_loss=1.915, over 203.00 samples.], tot_loss[dur_loss=0.4076, prior_loss=1.023, diff_loss=0.4824, tot_loss=1.913, over 442.00 samples.],
2024-10-20 19:12:52,482 INFO [train.py:561] (1/4) Epoch 159, batch 12, global_batch_idx: 2540, batch size: 152, loss[dur_loss=0.4094, prior_loss=1.023, diff_loss=0.4567, tot_loss=1.889, over 152.00 samples.], tot_loss[dur_loss=0.4039, prior_loss=1.023, diff_loss=0.5433, tot_loss=1.97, over 1966.00 samples.],
2024-10-20 19:12:56,897 INFO [train.py:682] (1/4) Start epoch 160
2024-10-20 19:13:13,983 INFO [train.py:561] (1/4) Epoch 160, batch 6, global_batch_idx: 2550, batch size: 106, loss[dur_loss=0.4053, prior_loss=1.024, diff_loss=0.4909, tot_loss=1.92, over 106.00 samples.], tot_loss[dur_loss=0.4002, prior_loss=1.023, diff_loss=0.5792, tot_loss=2.002, over 1142.00 samples.],
2024-10-20 19:13:27,043 INFO [train.py:682] (1/4) Start epoch 161
2024-10-20 19:13:36,184 INFO [train.py:561] (1/4) Epoch 161, batch 0, global_batch_idx: 2560, batch size: 108, loss[dur_loss=0.4262, prior_loss=1.025, diff_loss=0.4793, tot_loss=1.93, over 108.00 samples.], tot_loss[dur_loss=0.4262, prior_loss=1.025, diff_loss=0.4793, tot_loss=1.93, over 108.00 samples.],
2024-10-20 19:13:50,577 INFO [train.py:561] (1/4) Epoch 161, batch 10, global_batch_idx: 2570, batch size: 111, loss[dur_loss=0.4191, prior_loss=1.025, diff_loss=0.5001, tot_loss=1.945, over 111.00 samples.], tot_loss[dur_loss=0.4019, prior_loss=1.023, diff_loss=0.5511, tot_loss=1.976, over 1656.00 samples.],
2024-10-20 19:13:57,728 INFO [train.py:682] (1/4) Start epoch 162
2024-10-20 19:14:11,401 INFO [train.py:561] (1/4) Epoch 162, batch 4, global_batch_idx: 2580, batch size: 189, loss[dur_loss=0.3963, prior_loss=1.023, diff_loss=0.5159, tot_loss=1.936, over 189.00 samples.], tot_loss[dur_loss=0.3953, prior_loss=1.023, diff_loss=0.5981, tot_loss=2.016, over 937.00 samples.],
2024-10-20 19:14:26,321 INFO [train.py:561] (1/4) Epoch 162, batch 14, global_batch_idx: 2590, batch size: 142, loss[dur_loss=0.4088, prior_loss=1.022, diff_loss=0.4333, tot_loss=1.864, over 142.00 samples.], tot_loss[dur_loss=0.4029, prior_loss=1.023, diff_loss=0.5226, tot_loss=1.949, over 2210.00 samples.],
2024-10-20 19:14:27,743 INFO [train.py:682] (1/4) Start epoch 163
2024-10-20 19:14:48,166 INFO [train.py:561] (1/4) Epoch 163, batch 8, global_batch_idx: 2600, batch size: 170, loss[dur_loss=0.4072, prior_loss=1.023, diff_loss=0.48, tot_loss=1.91, over 170.00 samples.], tot_loss[dur_loss=0.4006, prior_loss=1.023, diff_loss=0.5553, tot_loss=1.979, over 1432.00 samples.],
2024-10-20 19:14:58,368 INFO [train.py:682] (1/4) Start epoch 164
2024-10-20 19:15:09,607 INFO [train.py:561] (1/4) Epoch 164, batch 2, global_batch_idx: 2610, batch size: 203, loss[dur_loss=0.3989, prior_loss=1.023, diff_loss=0.5142, tot_loss=1.936, over 203.00 samples.], tot_loss[dur_loss=0.4058, prior_loss=1.023, diff_loss=0.4926, tot_loss=1.922, over 442.00 samples.],
2024-10-20 19:15:23,867 INFO [train.py:561] (1/4) Epoch 164, batch 12, global_batch_idx: 2620, batch size: 152, loss[dur_loss=0.4069, prior_loss=1.022, diff_loss=0.4697, tot_loss=1.899, over 152.00 samples.], tot_loss[dur_loss=0.4021, prior_loss=1.023, diff_loss=0.5274, tot_loss=1.952, over 1966.00 samples.],
2024-10-20 19:15:28,325 INFO [train.py:682] (1/4) Start epoch 165
2024-10-20 19:15:45,399 INFO [train.py:561] (1/4) Epoch 165, batch 6, global_batch_idx: 2630, batch size: 106, loss[dur_loss=0.4048, prior_loss=1.023, diff_loss=0.5079, tot_loss=1.936, over 106.00 samples.], tot_loss[dur_loss=0.395, prior_loss=1.022, diff_loss=0.5772, tot_loss=1.994, over 1142.00 samples.],
2024-10-20 19:15:58,506 INFO [train.py:682] (1/4) Start epoch 166
2024-10-20 19:16:07,093 INFO [train.py:561] (1/4) Epoch 166, batch 0, global_batch_idx: 2640, batch size: 108, loss[dur_loss=0.4226, prior_loss=1.024, diff_loss=0.4526, tot_loss=1.899, over 108.00 samples.], tot_loss[dur_loss=0.4226, prior_loss=1.024, diff_loss=0.4526, tot_loss=1.899, over 108.00 samples.],
2024-10-20 19:16:21,331 INFO [train.py:561] (1/4) Epoch 166, batch 10, global_batch_idx: 2650, batch size: 111, loss[dur_loss=0.4173, prior_loss=1.025, diff_loss=0.4931, tot_loss=1.935, over 111.00 samples.], tot_loss[dur_loss=0.4007, prior_loss=1.023, diff_loss=0.5347, tot_loss=1.958, over 1656.00 samples.],
2024-10-20 19:16:28,423 INFO [train.py:682] (1/4) Start epoch 167
2024-10-20 19:16:42,262 INFO [train.py:561] (1/4) Epoch 167, batch 4, global_batch_idx: 2660, batch size: 189, loss[dur_loss=0.3961, prior_loss=1.023, diff_loss=0.5322, tot_loss=1.951, over 189.00 samples.], tot_loss[dur_loss=0.3944, prior_loss=1.022, diff_loss=0.6014, tot_loss=2.018, over 937.00 samples.],
2024-10-20 19:16:57,108 INFO [train.py:561] (1/4) Epoch 167, batch 14, global_batch_idx: 2670, batch size: 142, loss[dur_loss=0.4016, prior_loss=1.022, diff_loss=0.4992, tot_loss=1.923, over 142.00 samples.], tot_loss[dur_loss=0.4013, prior_loss=1.022, diff_loss=0.5286, tot_loss=1.952, over 2210.00 samples.],
2024-10-20 19:16:58,541 INFO [train.py:682] (1/4) Start epoch 168
2024-10-20 19:17:18,395 INFO [train.py:561] (1/4) Epoch 168, batch 8, global_batch_idx: 2680, batch size: 170, loss[dur_loss=0.4092, prior_loss=1.023, diff_loss=0.5227, tot_loss=1.955, over 170.00 samples.], tot_loss[dur_loss=0.399, prior_loss=1.022, diff_loss=0.5486, tot_loss=1.97, over 1432.00 samples.],
2024-10-20 19:17:28,508 INFO [train.py:682] (1/4) Start epoch 169
2024-10-20 19:17:39,968 INFO [train.py:561] (1/4) Epoch 169, batch 2, global_batch_idx: 2690, batch size: 203, loss[dur_loss=0.3954, prior_loss=1.022, diff_loss=0.5095, tot_loss=1.927, over 203.00 samples.], tot_loss[dur_loss=0.4044, prior_loss=1.022, diff_loss=0.4845, tot_loss=1.911, over 442.00 samples.],
2024-10-20 19:17:54,156 INFO [train.py:561] (1/4) Epoch 169, batch 12, global_batch_idx: 2700, batch size: 152, loss[dur_loss=0.4058, prior_loss=1.022, diff_loss=0.4724, tot_loss=1.9, over 152.00 samples.], tot_loss[dur_loss=0.4004, prior_loss=1.022, diff_loss=0.527, tot_loss=1.95, over 1966.00 samples.],
2024-10-20 19:17:58,580 INFO [train.py:682] (1/4) Start epoch 170
2024-10-20 19:18:15,677 INFO [train.py:561] (1/4) Epoch 170, batch 6, global_batch_idx: 2710, batch size: 106, loss[dur_loss=0.4043, prior_loss=1.022, diff_loss=0.4533, tot_loss=1.88, over 106.00 samples.], tot_loss[dur_loss=0.3963, prior_loss=1.022, diff_loss=0.5696, tot_loss=1.988, over 1142.00 samples.],
2024-10-20 19:18:28,699 INFO [train.py:682] (1/4) Start epoch 171
2024-10-20 19:18:37,190 INFO [train.py:561] (1/4) Epoch 171, batch 0, global_batch_idx: 2720, batch size: 108, loss[dur_loss=0.4169, prior_loss=1.023, diff_loss=0.4506, tot_loss=1.891, over 108.00 samples.], tot_loss[dur_loss=0.4169, prior_loss=1.023, diff_loss=0.4506, tot_loss=1.891, over 108.00 samples.],
2024-10-20 19:18:51,338 INFO [train.py:561] (1/4) Epoch 171, batch 10, global_batch_idx: 2730, batch size: 111, loss[dur_loss=0.4162, prior_loss=1.025, diff_loss=0.4363, tot_loss=1.877, over 111.00 samples.], tot_loss[dur_loss=0.3974, prior_loss=1.022, diff_loss=0.5439, tot_loss=1.963, over 1656.00 samples.],
2024-10-20 19:18:58,433 INFO [train.py:682] (1/4) Start epoch 172
2024-10-20 19:19:12,131 INFO [train.py:561] (1/4) Epoch 172, batch 4, global_batch_idx: 2740, batch size: 189, loss[dur_loss=0.3923, prior_loss=1.023, diff_loss=0.5149, tot_loss=1.93, over 189.00 samples.], tot_loss[dur_loss=0.3917, prior_loss=1.022, diff_loss=0.5828, tot_loss=1.996, over 937.00 samples.],
2024-10-20 19:19:26,924 INFO [train.py:561] (1/4) Epoch 172, batch 14, global_batch_idx: 2750, batch size: 142, loss[dur_loss=0.4021, prior_loss=1.021, diff_loss=0.4869, tot_loss=1.91, over 142.00 samples.], tot_loss[dur_loss=0.3997, prior_loss=1.022, diff_loss=0.5169, tot_loss=1.939, over 2210.00 samples.],
2024-10-20 19:19:28,348 INFO [train.py:682] (1/4) Start epoch 173
2024-10-20 19:19:48,058 INFO [train.py:561] (1/4) Epoch 173, batch 8, global_batch_idx: 2760, batch size: 170, loss[dur_loss=0.3985, prior_loss=1.022, diff_loss=0.4791, tot_loss=1.9, over 170.00 samples.], tot_loss[dur_loss=0.396, prior_loss=1.022, diff_loss=0.5451, tot_loss=1.963, over 1432.00 samples.],
2024-10-20 19:19:58,129 INFO [train.py:682] (1/4) Start epoch 174
2024-10-20 19:20:09,507 INFO [train.py:561] (1/4) Epoch 174, batch 2, global_batch_idx: 2770, batch size: 203, loss[dur_loss=0.399, prior_loss=1.022, diff_loss=0.5324, tot_loss=1.953, over 203.00 samples.], tot_loss[dur_loss=0.4047, prior_loss=1.022, diff_loss=0.4891, tot_loss=1.916, over 442.00 samples.],
2024-10-20 19:20:23,637 INFO [train.py:561] (1/4) Epoch 174, batch 12, global_batch_idx: 2780, batch size: 152, loss[dur_loss=0.4014, prior_loss=1.022, diff_loss=0.4774, tot_loss=1.901, over 152.00 samples.], tot_loss[dur_loss=0.3999, prior_loss=1.022, diff_loss=0.5247, tot_loss=1.947, over 1966.00 samples.],
2024-10-20 19:20:28,097 INFO [train.py:682] (1/4) Start epoch 175
2024-10-20 19:20:44,769 INFO [train.py:561] (1/4) Epoch 175, batch 6, global_batch_idx: 2790, batch size: 106, loss[dur_loss=0.4024, prior_loss=1.022, diff_loss=0.4644, tot_loss=1.889, over 106.00 samples.], tot_loss[dur_loss=0.3939, prior_loss=1.022, diff_loss=0.5721, tot_loss=1.988, over 1142.00 samples.],
2024-10-20 19:20:57,749 INFO [train.py:682] (1/4) Start epoch 176
2024-10-20 19:21:06,214 INFO [train.py:561] (1/4) Epoch 176, batch 0, global_batch_idx: 2800, batch size: 108, loss[dur_loss=0.4225, prior_loss=1.023, diff_loss=0.4609, tot_loss=1.906, over 108.00 samples.], tot_loss[dur_loss=0.4225, prior_loss=1.023, diff_loss=0.4609, tot_loss=1.906, over 108.00 samples.],
2024-10-20 19:21:20,302 INFO [train.py:561] (1/4) Epoch 176, batch 10, global_batch_idx: 2810, batch size: 111, loss[dur_loss=0.4189, prior_loss=1.024, diff_loss=0.4464, tot_loss=1.89, over 111.00 samples.], tot_loss[dur_loss=0.3986, prior_loss=1.022, diff_loss=0.5293, tot_loss=1.95, over 1656.00 samples.],
2024-10-20 19:21:27,372 INFO [train.py:682] (1/4) Start epoch 177
2024-10-20 19:21:41,056 INFO [train.py:561] (1/4) Epoch 177, batch 4, global_batch_idx: 2820, batch size: 189, loss[dur_loss=0.3971, prior_loss=1.023, diff_loss=0.4955, tot_loss=1.915, over 189.00 samples.], tot_loss[dur_loss=0.3934, prior_loss=1.022, diff_loss=0.5915, tot_loss=2.006, over 937.00 samples.],
2024-10-20 19:21:55,828 INFO [train.py:561] (1/4) Epoch 177, batch 14, global_batch_idx: 2830, batch size: 142, loss[dur_loss=0.3979, prior_loss=1.021, diff_loss=0.4541, tot_loss=1.873, over 142.00 samples.], tot_loss[dur_loss=0.3999, prior_loss=1.022, diff_loss=0.5208, tot_loss=1.943, over 2210.00 samples.],
2024-10-20 19:21:57,242 INFO [train.py:682] (1/4) Start epoch 178
2024-10-20 19:22:17,041 INFO [train.py:561] (1/4) Epoch 178, batch 8, global_batch_idx: 2840, batch size: 170, loss[dur_loss=0.3969, prior_loss=1.022, diff_loss=0.5102, tot_loss=1.929, over 170.00 samples.], tot_loss[dur_loss=0.3953, prior_loss=1.022, diff_loss=0.5496, tot_loss=1.967, over 1432.00 samples.],
2024-10-20 19:22:27,093 INFO [train.py:682] (1/4) Start epoch 179
2024-10-20 19:22:38,237 INFO [train.py:561] (1/4) Epoch 179, batch 2, global_batch_idx: 2850, batch size: 203, loss[dur_loss=0.395, prior_loss=1.022, diff_loss=0.5135, tot_loss=1.93, over 203.00 samples.], tot_loss[dur_loss=0.4025, prior_loss=1.022, diff_loss=0.4862, tot_loss=1.911, over 442.00 samples.],
2024-10-20 19:22:52,362 INFO [train.py:561] (1/4) Epoch 179, batch 12, global_batch_idx: 2860, batch size: 152, loss[dur_loss=0.4071, prior_loss=1.022, diff_loss=0.4507, tot_loss=1.88, over 152.00 samples.], tot_loss[dur_loss=0.3991, prior_loss=1.022, diff_loss=0.5273, tot_loss=1.948, over 1966.00 samples.],
2024-10-20 19:22:56,795 INFO [train.py:682] (1/4) Start epoch 180
2024-10-20 19:23:13,651 INFO [train.py:561] (1/4) Epoch 180, batch 6, global_batch_idx: 2870, batch size: 106, loss[dur_loss=0.3964, prior_loss=1.021, diff_loss=0.4886, tot_loss=1.906, over 106.00 samples.], tot_loss[dur_loss=0.3917, prior_loss=1.021, diff_loss=0.5561, tot_loss=1.969, over 1142.00 samples.],
2024-10-20 19:23:26,623 INFO [train.py:682] (1/4) Start epoch 181
2024-10-20 19:23:35,211 INFO [train.py:561] (1/4) Epoch 181, batch 0, global_batch_idx: 2880, batch size: 108, loss[dur_loss=0.4208, prior_loss=1.023, diff_loss=0.4474, tot_loss=1.891, over 108.00 samples.], tot_loss[dur_loss=0.4208, prior_loss=1.023, diff_loss=0.4474, tot_loss=1.891, over 108.00 samples.],
2024-10-20 19:23:49,396 INFO [train.py:561] (1/4) Epoch 181, batch 10, global_batch_idx: 2890, batch size: 111, loss[dur_loss=0.4147, prior_loss=1.023, diff_loss=0.4481, tot_loss=1.886, over 111.00 samples.], tot_loss[dur_loss=0.3965, prior_loss=1.021, diff_loss=0.5204, tot_loss=1.938, over 1656.00 samples.],
2024-10-20 19:23:56,515 INFO [train.py:682] (1/4) Start epoch 182
2024-10-20 19:24:10,092 INFO [train.py:561] (1/4) Epoch 182, batch 4, global_batch_idx: 2900, batch size: 189, loss[dur_loss=0.3921, prior_loss=1.021, diff_loss=0.4882, tot_loss=1.901, over 189.00 samples.], tot_loss[dur_loss=0.3909, prior_loss=1.021, diff_loss=0.5836, tot_loss=1.995, over 937.00 samples.],
2024-10-20 19:24:24,844 INFO [train.py:561] (1/4) Epoch 182, batch 14, global_batch_idx: 2910, batch size: 142, loss[dur_loss=0.3974, prior_loss=1.02, diff_loss=0.4474, tot_loss=1.865, over 142.00 samples.], tot_loss[dur_loss=0.3968, prior_loss=1.021, diff_loss=0.5152, tot_loss=1.933, over 2210.00 samples.],
2024-10-20 19:24:26,273 INFO [train.py:682] (1/4) Start epoch 183
2024-10-20 19:24:46,651 INFO [train.py:561] (1/4) Epoch 183, batch 8, global_batch_idx: 2920, batch size: 170, loss[dur_loss=0.3993, prior_loss=1.021, diff_loss=0.4548, tot_loss=1.875, over 170.00 samples.], tot_loss[dur_loss=0.3938, prior_loss=1.021, diff_loss=0.5437, tot_loss=1.958, over 1432.00 samples.],
2024-10-20 19:24:56,814 INFO [train.py:682] (1/4) Start epoch 184
2024-10-20 19:25:08,052 INFO [train.py:561] (1/4) Epoch 184, batch 2, global_batch_idx: 2930, batch size: 203, loss[dur_loss=0.3866, prior_loss=1.021, diff_loss=0.509, tot_loss=1.916, over 203.00 samples.], tot_loss[dur_loss=0.3978, prior_loss=1.021, diff_loss=0.4705, tot_loss=1.89, over 442.00 samples.],
2024-10-20 19:25:22,209 INFO [train.py:561] (1/4) Epoch 184, batch 12, global_batch_idx: 2940, batch size: 152, loss[dur_loss=0.4025, prior_loss=1.021, diff_loss=0.4514, tot_loss=1.875, over 152.00 samples.], tot_loss[dur_loss=0.3951, prior_loss=1.021, diff_loss=0.5141, tot_loss=1.93, over 1966.00 samples.],
2024-10-20 19:25:26,638 INFO [train.py:682] (1/4) Start epoch 185
2024-10-20 19:25:43,735 INFO [train.py:561] (1/4) Epoch 185, batch 6, global_batch_idx: 2950, batch size: 106, loss[dur_loss=0.3981, prior_loss=1.021, diff_loss=0.4491, tot_loss=1.868, over 106.00 samples.], tot_loss[dur_loss=0.3907, prior_loss=1.021, diff_loss=0.5563, tot_loss=1.968, over 1142.00 samples.],
2024-10-20 19:25:56,811 INFO [train.py:682] (1/4) Start epoch 186
2024-10-20 19:26:05,810 INFO [train.py:561] (1/4) Epoch 186, batch 0, global_batch_idx: 2960, batch size: 108, loss[dur_loss=0.4188, prior_loss=1.022, diff_loss=0.4591, tot_loss=1.9, over 108.00 samples.], tot_loss[dur_loss=0.4188, prior_loss=1.022, diff_loss=0.4591, tot_loss=1.9, over 108.00 samples.],
2024-10-20 19:26:20,000 INFO [train.py:561] (1/4) Epoch 186, batch 10, global_batch_idx: 2970, batch size: 111, loss[dur_loss=0.4139, prior_loss=1.023, diff_loss=0.4617, tot_loss=1.899, over 111.00 samples.], tot_loss[dur_loss=0.3959, prior_loss=1.021, diff_loss=0.5413, tot_loss=1.958, over 1656.00 samples.],
2024-10-20 19:26:27,132 INFO [train.py:682] (1/4) Start epoch 187
2024-10-20 19:26:40,721 INFO [train.py:561] (1/4) Epoch 187, batch 4, global_batch_idx: 2980, batch size: 189, loss[dur_loss=0.3945, prior_loss=1.022, diff_loss=0.4901, tot_loss=1.906, over 189.00 samples.], tot_loss[dur_loss=0.3924, prior_loss=1.021, diff_loss=0.5889, tot_loss=2.003, over 937.00 samples.],
2024-10-20 19:26:55,539 INFO [train.py:561] (1/4) Epoch 187, batch 14, global_batch_idx: 2990, batch size: 142, loss[dur_loss=0.4083, prior_loss=1.021, diff_loss=0.4886, tot_loss=1.918, over 142.00 samples.], tot_loss[dur_loss=0.3989, prior_loss=1.022, diff_loss=0.5172, tot_loss=1.938, over 2210.00 samples.],
2024-10-20 19:26:56,967 INFO [train.py:682] (1/4) Start epoch 188
2024-10-20 19:27:16,829 INFO [train.py:561] (1/4) Epoch 188, batch 8, global_batch_idx: 3000, batch size: 170, loss[dur_loss=0.3928, prior_loss=1.021, diff_loss=0.4623, tot_loss=1.876, over 170.00 samples.], tot_loss[dur_loss=0.3932, prior_loss=1.021, diff_loss=0.5342, tot_loss=1.948, over 1432.00 samples.],
2024-10-20 19:27:18,324 INFO [train.py:579] (1/4) Computing validation loss
2024-10-20 19:27:44,800 INFO [train.py:589] (1/4) Epoch 188, validation: dur_loss=0.4169, prior_loss=1.029, diff_loss=0.4498, tot_loss=1.896, over 100.00 samples.
2024-10-20 19:27:44,801 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21048MB
2024-10-20 19:27:53,484 INFO [train.py:682] (1/4) Start epoch 189
2024-10-20 19:28:04,836 INFO [train.py:561] (1/4) Epoch 189, batch 2, global_batch_idx: 3010, batch size: 203, loss[dur_loss=0.3902, prior_loss=1.021, diff_loss=0.4748, tot_loss=1.886, over 203.00 samples.], tot_loss[dur_loss=0.3977, prior_loss=1.021, diff_loss=0.4509, tot_loss=1.87, over 442.00 samples.],
2024-10-20 19:28:19,087 INFO [train.py:561] (1/4) Epoch 189, batch 12, global_batch_idx: 3020, batch size: 152, loss[dur_loss=0.3981, prior_loss=1.02, diff_loss=0.4695, tot_loss=1.888, over 152.00 samples.], tot_loss[dur_loss=0.3937, prior_loss=1.021, diff_loss=0.5145, tot_loss=1.929, over 1966.00 samples.],
2024-10-20 19:28:23,513 INFO [train.py:682] (1/4) Start epoch 190
2024-10-20 19:28:40,362 INFO [train.py:561] (1/4) Epoch 190, batch 6, global_batch_idx: 3030, batch size: 106, loss[dur_loss=0.3964, prior_loss=1.021, diff_loss=0.4474, tot_loss=1.865, over 106.00 samples.], tot_loss[dur_loss=0.3895, prior_loss=1.02, diff_loss=0.5516, tot_loss=1.961, over 1142.00 samples.],
2024-10-20 19:28:53,417 INFO [train.py:682] (1/4) Start epoch 191
2024-10-20 19:29:02,347 INFO [train.py:561] (1/4) Epoch 191, batch 0, global_batch_idx: 3040, batch size: 108, loss[dur_loss=0.4138, prior_loss=1.022, diff_loss=0.4668, tot_loss=1.902, over 108.00 samples.], tot_loss[dur_loss=0.4138, prior_loss=1.022, diff_loss=0.4668, tot_loss=1.902, over 108.00 samples.],
2024-10-20 19:29:16,676 INFO [train.py:561] (1/4) Epoch 191, batch 10, global_batch_idx: 3050, batch size: 111, loss[dur_loss=0.4114, prior_loss=1.023, diff_loss=0.451, tot_loss=1.885, over 111.00 samples.], tot_loss[dur_loss=0.3937, prior_loss=1.02, diff_loss=0.5254, tot_loss=1.94, over 1656.00 samples.],
2024-10-20 19:29:23,810 INFO [train.py:682] (1/4) Start epoch 192
2024-10-20 19:29:37,726 INFO [train.py:561] (1/4) Epoch 192, batch 4, global_batch_idx: 3060, batch size: 189, loss[dur_loss=0.3871, prior_loss=1.021, diff_loss=0.4573, tot_loss=1.865, over 189.00 samples.], tot_loss[dur_loss=0.3881, prior_loss=1.02, diff_loss=0.5698, tot_loss=1.978, over 937.00 samples.],
2024-10-20 19:29:52,680 INFO [train.py:561] (1/4) Epoch 192, batch 14, global_batch_idx: 3070, batch size: 142, loss[dur_loss=0.3968, prior_loss=1.019, diff_loss=0.461, tot_loss=1.877, over 142.00 samples.], tot_loss[dur_loss=0.3946, prior_loss=1.02, diff_loss=0.5079, tot_loss=1.923, over 2210.00 samples.],
2024-10-20 19:29:54,108 INFO [train.py:682] (1/4) Start epoch 193
2024-10-20 19:30:14,063 INFO [train.py:561] (1/4) Epoch 193, batch 8, global_batch_idx: 3080, batch size: 170, loss[dur_loss=0.3953, prior_loss=1.02, diff_loss=0.4923, tot_loss=1.908, over 170.00 samples.], tot_loss[dur_loss=0.3881, prior_loss=1.02, diff_loss=0.5501, tot_loss=1.958, over 1432.00 samples.],
2024-10-20 19:30:24,265 INFO [train.py:682] (1/4) Start epoch 194
2024-10-20 19:30:35,734 INFO [train.py:561] (1/4) Epoch 194, batch 2, global_batch_idx: 3090, batch size: 203, loss[dur_loss=0.3897, prior_loss=1.02, diff_loss=0.4972, tot_loss=1.907, over 203.00 samples.], tot_loss[dur_loss=0.3964, prior_loss=1.021, diff_loss=0.4692, tot_loss=1.886, over 442.00 samples.],
2024-10-20 19:30:49,986 INFO [train.py:561] (1/4) Epoch 194, batch 12, global_batch_idx: 3100, batch size: 152, loss[dur_loss=0.3939, prior_loss=1.02, diff_loss=0.4691, tot_loss=1.883, over 152.00 samples.], tot_loss[dur_loss=0.3921, prior_loss=1.02, diff_loss=0.5148, tot_loss=1.927, over 1966.00 samples.],
2024-10-20 19:30:54,420 INFO [train.py:682] (1/4) Start epoch 195
2024-10-20 19:31:12,054 INFO [train.py:561] (1/4) Epoch 195, batch 6, global_batch_idx: 3110, batch size: 106, loss[dur_loss=0.3955, prior_loss=1.02, diff_loss=0.4462, tot_loss=1.862, over 106.00 samples.], tot_loss[dur_loss=0.3881, prior_loss=1.02, diff_loss=0.5538, tot_loss=1.962, over 1142.00 samples.],
2024-10-20 19:31:25,104 INFO [train.py:682] (1/4) Start epoch 196
2024-10-20 19:31:33,867 INFO [train.py:561] (1/4) Epoch 196, batch 0, global_batch_idx: 3120, batch size: 108, loss[dur_loss=0.4097, prior_loss=1.021, diff_loss=0.4556, tot_loss=1.887, over 108.00 samples.], tot_loss[dur_loss=0.4097, prior_loss=1.021, diff_loss=0.4556, tot_loss=1.887, over 108.00 samples.],
2024-10-20 19:31:48,144 INFO [train.py:561] (1/4) Epoch 196, batch 10, global_batch_idx: 3130, batch size: 111, loss[dur_loss=0.408, prior_loss=1.022, diff_loss=0.4638, tot_loss=1.894, over 111.00 samples.], tot_loss[dur_loss=0.3906, prior_loss=1.02, diff_loss=0.5248, tot_loss=1.935, over 1656.00 samples.],
2024-10-20 19:31:55,214 INFO [train.py:682] (1/4) Start epoch 197
2024-10-20 19:32:08,899 INFO [train.py:561] (1/4) Epoch 197, batch 4, global_batch_idx: 3140, batch size: 189, loss[dur_loss=0.3821, prior_loss=1.02, diff_loss=0.479, tot_loss=1.881, over 189.00 samples.], tot_loss[dur_loss=0.3838, prior_loss=1.019, diff_loss=0.5741, tot_loss=1.977, over 937.00 samples.],
2024-10-20 19:32:23,677 INFO [train.py:561] (1/4) Epoch 197, batch 14, global_batch_idx: 3150, batch size: 142, loss[dur_loss=0.3886, prior_loss=1.019, diff_loss=0.44, tot_loss=1.848, over 142.00 samples.], tot_loss[dur_loss=0.3918, prior_loss=1.02, diff_loss=0.5049, tot_loss=1.917, over 2210.00 samples.],
2024-10-20 19:32:25,100 INFO [train.py:682] (1/4) Start epoch 198
2024-10-20 19:32:45,158 INFO [train.py:561] (1/4) Epoch 198, batch 8, global_batch_idx: 3160, batch size: 170, loss[dur_loss=0.3891, prior_loss=1.02, diff_loss=0.5036, tot_loss=1.913, over 170.00 samples.], tot_loss[dur_loss=0.3886, prior_loss=1.02, diff_loss=0.5363, tot_loss=1.945, over 1432.00 samples.],
2024-10-20 19:32:55,222 INFO [train.py:682] (1/4) Start epoch 199
2024-10-20 19:33:06,586 INFO [train.py:561] (1/4) Epoch 199, batch 2, global_batch_idx: 3170, batch size: 203, loss[dur_loss=0.3945, prior_loss=1.02, diff_loss=0.486, tot_loss=1.901, over 203.00 samples.], tot_loss[dur_loss=0.3988, prior_loss=1.02, diff_loss=0.4641, tot_loss=1.883, over 442.00 samples.],
2024-10-20 19:33:20,733 INFO [train.py:561] (1/4) Epoch 199, batch 12, global_batch_idx: 3180, batch size: 152, loss[dur_loss=0.3969, prior_loss=1.02, diff_loss=0.4476, tot_loss=1.864, over 152.00 samples.], tot_loss[dur_loss=0.3919, prior_loss=1.02, diff_loss=0.5119, tot_loss=1.924, over 1966.00 samples.],
2024-10-20 19:33:25,150 INFO [train.py:682] (1/4) Start epoch 200
2024-10-20 19:33:42,172 INFO [train.py:561] (1/4) Epoch 200, batch 6, global_batch_idx: 3190, batch size: 106, loss[dur_loss=0.3895, prior_loss=1.02, diff_loss=0.465, tot_loss=1.874, over 106.00 samples.], tot_loss[dur_loss=0.3874, prior_loss=1.019, diff_loss=0.5615, tot_loss=1.968, over 1142.00 samples.],
2024-10-20 19:33:55,246 INFO [train.py:682] (1/4) Start epoch 201
2024-10-20 19:34:03,991 INFO [train.py:561] (1/4) Epoch 201, batch 0, global_batch_idx: 3200, batch size: 108, loss[dur_loss=0.4148, prior_loss=1.021, diff_loss=0.4405, tot_loss=1.877, over 108.00 samples.], tot_loss[dur_loss=0.4148, prior_loss=1.021, diff_loss=0.4405, tot_loss=1.877, over 108.00 samples.],
2024-10-20 19:34:18,286 INFO [train.py:561] (1/4) Epoch 201, batch 10, global_batch_idx: 3210, batch size: 111, loss[dur_loss=0.4074, prior_loss=1.022, diff_loss=0.4894, tot_loss=1.919, over 111.00 samples.], tot_loss[dur_loss=0.3904, prior_loss=1.02, diff_loss=0.5252, tot_loss=1.935, over 1656.00 samples.],
2024-10-20 19:34:25,473 INFO [train.py:682] (1/4) Start epoch 202
2024-10-20 19:34:39,600 INFO [train.py:561] (1/4) Epoch 202, batch 4, global_batch_idx: 3220, batch size: 189, loss[dur_loss=0.3845, prior_loss=1.02, diff_loss=0.4903, tot_loss=1.895, over 189.00 samples.], tot_loss[dur_loss=0.3851, prior_loss=1.019, diff_loss=0.576, tot_loss=1.98, over 937.00 samples.],
2024-10-20 19:34:54,398 INFO [train.py:561] (1/4) Epoch 202, batch 14, global_batch_idx: 3230, batch size: 142, loss[dur_loss=0.3925, prior_loss=1.019, diff_loss=0.4756, tot_loss=1.887, over 142.00 samples.], tot_loss[dur_loss=0.3918, prior_loss=1.02, diff_loss=0.5057, tot_loss=1.917, over 2210.00 samples.],
2024-10-20 19:34:55,869 INFO [train.py:682] (1/4) Start epoch 203
2024-10-20 19:35:15,766 INFO [train.py:561] (1/4) Epoch 203, batch 8, global_batch_idx: 3240, batch size: 170, loss[dur_loss=0.392, prior_loss=1.02, diff_loss=0.4669, tot_loss=1.878, over 170.00 samples.], tot_loss[dur_loss=0.3882, prior_loss=1.019, diff_loss=0.532, tot_loss=1.939, over 1432.00 samples.],
2024-10-20 19:35:25,816 INFO [train.py:682] (1/4) Start epoch 204
2024-10-20 19:35:37,123 INFO [train.py:561] (1/4) Epoch 204, batch 2, global_batch_idx: 3250, batch size: 203, loss[dur_loss=0.3857, prior_loss=1.019, diff_loss=0.5032, tot_loss=1.908, over 203.00 samples.], tot_loss[dur_loss=0.3914, prior_loss=1.02, diff_loss=0.4699, tot_loss=1.881, over 442.00 samples.],
2024-10-20 19:35:51,569 INFO [train.py:561] (1/4) Epoch 204, batch 12, global_batch_idx: 3260, batch size: 152, loss[dur_loss=0.3956, prior_loss=1.019, diff_loss=0.4584, tot_loss=1.873, over 152.00 samples.], tot_loss[dur_loss=0.3892, prior_loss=1.019, diff_loss=0.517, tot_loss=1.925, over 1966.00 samples.],
2024-10-20 19:35:56,051 INFO [train.py:682] (1/4) Start epoch 205
2024-10-20 19:36:13,109 INFO [train.py:561] (1/4) Epoch 205, batch 6, global_batch_idx: 3270, batch size: 106, loss[dur_loss=0.3889, prior_loss=1.02, diff_loss=0.4529, tot_loss=1.861, over 106.00 samples.], tot_loss[dur_loss=0.3839, prior_loss=1.019, diff_loss=0.5471, tot_loss=1.95, over 1142.00 samples.],
2024-10-20 19:36:26,235 INFO [train.py:682] (1/4) Start epoch 206
2024-10-20 19:36:35,030 INFO [train.py:561] (1/4) Epoch 206, batch 0, global_batch_idx: 3280, batch size: 108, loss[dur_loss=0.4087, prior_loss=1.021, diff_loss=0.4605, tot_loss=1.89, over 108.00 samples.], tot_loss[dur_loss=0.4087, prior_loss=1.021, diff_loss=0.4605, tot_loss=1.89, over 108.00 samples.],
2024-10-20 19:36:49,294 INFO [train.py:561] (1/4) Epoch 206, batch 10, global_batch_idx: 3290, batch size: 111, loss[dur_loss=0.4045, prior_loss=1.022, diff_loss=0.431, tot_loss=1.857, over 111.00 samples.], tot_loss[dur_loss=0.3881, prior_loss=1.019, diff_loss=0.5092, tot_loss=1.917, over 1656.00 samples.],
2024-10-20 19:36:56,394 INFO [train.py:682] (1/4) Start epoch 207
2024-10-20 19:37:10,075 INFO [train.py:561] (1/4) Epoch 207, batch 4, global_batch_idx: 3300, batch size: 189, loss[dur_loss=0.3808, prior_loss=1.019, diff_loss=0.4699, tot_loss=1.87, over 189.00 samples.], tot_loss[dur_loss=0.3823, prior_loss=1.019, diff_loss=0.5661, tot_loss=1.967, over 937.00 samples.],
2024-10-20 19:37:24,876 INFO [train.py:561] (1/4) Epoch 207, batch 14, global_batch_idx: 3310, batch size: 142, loss[dur_loss=0.389, prior_loss=1.018, diff_loss=0.4394, tot_loss=1.847, over 142.00 samples.], tot_loss[dur_loss=0.3892, prior_loss=1.019, diff_loss=0.5009, tot_loss=1.909, over 2210.00 samples.],
2024-10-20 19:37:26,335 INFO [train.py:682] (1/4) Start epoch 208
2024-10-20 19:37:46,514 INFO [train.py:561] (1/4) Epoch 208, batch 8, global_batch_idx: 3320, batch size: 170, loss[dur_loss=0.3885, prior_loss=1.019, diff_loss=0.4544, tot_loss=1.862, over 170.00 samples.], tot_loss[dur_loss=0.3868, prior_loss=1.019, diff_loss=0.5298, tot_loss=1.935, over 1432.00 samples.],
2024-10-20 19:37:56,676 INFO [train.py:682] (1/4) Start epoch 209
2024-10-20 19:38:08,099 INFO [train.py:561] (1/4) Epoch 209, batch 2, global_batch_idx: 3330, batch size: 203, loss[dur_loss=0.3817, prior_loss=1.019, diff_loss=0.4585, tot_loss=1.859, over 203.00 samples.], tot_loss[dur_loss=0.3906, prior_loss=1.019, diff_loss=0.4557, tot_loss=1.866, over 442.00 samples.],
2024-10-20 19:38:22,353 INFO [train.py:561] (1/4) Epoch 209, batch 12, global_batch_idx: 3340, batch size: 152, loss[dur_loss=0.3924, prior_loss=1.019, diff_loss=0.4603, tot_loss=1.872, over 152.00 samples.], tot_loss[dur_loss=0.3864, prior_loss=1.019, diff_loss=0.5106, tot_loss=1.916, over 1966.00 samples.],
2024-10-20 19:38:26,777 INFO [train.py:682] (1/4) Start epoch 210
2024-10-20 19:38:43,877 INFO [train.py:561] (1/4) Epoch 210, batch 6, global_batch_idx: 3350, batch size: 106, loss[dur_loss=0.39, prior_loss=1.019, diff_loss=0.4661, tot_loss=1.875, over 106.00 samples.], tot_loss[dur_loss=0.3823, prior_loss=1.019, diff_loss=0.5395, tot_loss=1.94, over 1142.00 samples.],
2024-10-20 19:38:56,848 INFO [train.py:682] (1/4) Start epoch 211
2024-10-20 19:39:05,660 INFO [train.py:561] (1/4) Epoch 211, batch 0, global_batch_idx: 3360, batch size: 108, loss[dur_loss=0.405, prior_loss=1.02, diff_loss=0.4908, tot_loss=1.916, over 108.00 samples.], tot_loss[dur_loss=0.405, prior_loss=1.02, diff_loss=0.4908, tot_loss=1.916, over 108.00 samples.],
2024-10-20 19:39:19,870 INFO [train.py:561] (1/4) Epoch 211, batch 10, global_batch_idx: 3370, batch size: 111, loss[dur_loss=0.4081, prior_loss=1.021, diff_loss=0.4255, tot_loss=1.855, over 111.00 samples.], tot_loss[dur_loss=0.3857, prior_loss=1.019, diff_loss=0.5186, tot_loss=1.923, over 1656.00 samples.],
2024-10-20 19:39:26,930 INFO [train.py:682] (1/4) Start epoch 212
2024-10-20 19:39:40,697 INFO [train.py:561] (1/4) Epoch 212, batch 4, global_batch_idx: 3380, batch size: 189, loss[dur_loss=0.3784, prior_loss=1.019, diff_loss=0.4691, tot_loss=1.867, over 189.00 samples.], tot_loss[dur_loss=0.379, prior_loss=1.018, diff_loss=0.5574, tot_loss=1.955, over 937.00 samples.],
2024-10-20 19:39:55,418 INFO [train.py:561] (1/4) Epoch 212, batch 14, global_batch_idx: 3390, batch size: 142, loss[dur_loss=0.3891, prior_loss=1.018, diff_loss=0.4544, tot_loss=1.862, over 142.00 samples.], tot_loss[dur_loss=0.3865, prior_loss=1.019, diff_loss=0.4963, tot_loss=1.902, over 2210.00 samples.],
2024-10-20 19:39:56,879 INFO [train.py:682] (1/4) Start epoch 213
2024-10-20 19:40:16,528 INFO [train.py:561] (1/4) Epoch 213, batch 8, global_batch_idx: 3400, batch size: 170, loss[dur_loss=0.3873, prior_loss=1.019, diff_loss=0.4454, tot_loss=1.851, over 170.00 samples.], tot_loss[dur_loss=0.3844, prior_loss=1.018, diff_loss=0.5224, tot_loss=1.925, over 1432.00 samples.],
2024-10-20 19:40:26,619 INFO [train.py:682] (1/4) Start epoch 214
2024-10-20 19:40:38,245 INFO [train.py:561] (1/4) Epoch
214, batch 2, global_batch_idx: 3410, batch size: 203, loss[dur_loss=0.3937, prior_loss=1.019, diff_loss=0.4683, tot_loss=1.881, over 203.00 samples.], tot_loss[dur_loss=0.3961, prior_loss=1.019, diff_loss=0.448, tot_loss=1.863, over 442.00 samples.], 2024-10-20 19:40:52,453 INFO [train.py:561] (1/4) Epoch 214, batch 12, global_batch_idx: 3420, batch size: 152, loss[dur_loss=0.3893, prior_loss=1.018, diff_loss=0.4922, tot_loss=1.9, over 152.00 samples.], tot_loss[dur_loss=0.3885, prior_loss=1.019, diff_loss=0.5051, tot_loss=1.912, over 1966.00 samples.], 2024-10-20 19:40:56,905 INFO [train.py:682] (1/4) Start epoch 215 2024-10-20 19:41:13,889 INFO [train.py:561] (1/4) Epoch 215, batch 6, global_batch_idx: 3430, batch size: 106, loss[dur_loss=0.3848, prior_loss=1.018, diff_loss=0.4172, tot_loss=1.82, over 106.00 samples.], tot_loss[dur_loss=0.3808, prior_loss=1.018, diff_loss=0.5367, tot_loss=1.936, over 1142.00 samples.], 2024-10-20 19:41:26,945 INFO [train.py:682] (1/4) Start epoch 216 2024-10-20 19:41:35,556 INFO [train.py:561] (1/4) Epoch 216, batch 0, global_batch_idx: 3440, batch size: 108, loss[dur_loss=0.4044, prior_loss=1.02, diff_loss=0.4139, tot_loss=1.838, over 108.00 samples.], tot_loss[dur_loss=0.4044, prior_loss=1.02, diff_loss=0.4139, tot_loss=1.838, over 108.00 samples.], 2024-10-20 19:41:49,685 INFO [train.py:561] (1/4) Epoch 216, batch 10, global_batch_idx: 3450, batch size: 111, loss[dur_loss=0.3999, prior_loss=1.021, diff_loss=0.4479, tot_loss=1.868, over 111.00 samples.], tot_loss[dur_loss=0.3841, prior_loss=1.018, diff_loss=0.5131, tot_loss=1.916, over 1656.00 samples.], 2024-10-20 19:41:56,761 INFO [train.py:682] (1/4) Start epoch 217 2024-10-20 19:42:10,327 INFO [train.py:561] (1/4) Epoch 217, batch 4, global_batch_idx: 3460, batch size: 189, loss[dur_loss=0.3791, prior_loss=1.018, diff_loss=0.4717, tot_loss=1.869, over 189.00 samples.], tot_loss[dur_loss=0.3776, prior_loss=1.018, diff_loss=0.5606, tot_loss=1.956, over 937.00 samples.], 2024-10-20 19:42:25,133 INFO [train.py:561] (1/4) Epoch 217, batch 14, global_batch_idx: 3470, batch size: 142, loss[dur_loss=0.3832, prior_loss=1.017, diff_loss=0.4507, tot_loss=1.851, over 142.00 samples.], tot_loss[dur_loss=0.3847, prior_loss=1.018, diff_loss=0.5001, tot_loss=1.903, over 2210.00 samples.], 2024-10-20 19:42:26,575 INFO [train.py:682] (1/4) Start epoch 218 2024-10-20 19:42:46,582 INFO [train.py:561] (1/4) Epoch 218, batch 8, global_batch_idx: 3480, batch size: 170, loss[dur_loss=0.3853, prior_loss=1.018, diff_loss=0.4743, tot_loss=1.878, over 170.00 samples.], tot_loss[dur_loss=0.3815, prior_loss=1.018, diff_loss=0.5247, tot_loss=1.924, over 1432.00 samples.], 2024-10-20 19:42:56,666 INFO [train.py:682] (1/4) Start epoch 219 2024-10-20 19:43:08,370 INFO [train.py:561] (1/4) Epoch 219, batch 2, global_batch_idx: 3490, batch size: 203, loss[dur_loss=0.3834, prior_loss=1.018, diff_loss=0.4869, tot_loss=1.888, over 203.00 samples.], tot_loss[dur_loss=0.3885, prior_loss=1.018, diff_loss=0.4688, tot_loss=1.876, over 442.00 samples.], 2024-10-20 19:43:22,480 INFO [train.py:561] (1/4) Epoch 219, batch 12, global_batch_idx: 3500, batch size: 152, loss[dur_loss=0.3958, prior_loss=1.019, diff_loss=0.4576, tot_loss=1.872, over 152.00 samples.], tot_loss[dur_loss=0.3859, prior_loss=1.018, diff_loss=0.5095, tot_loss=1.914, over 1966.00 samples.], 2024-10-20 19:43:26,922 INFO [train.py:682] (1/4) Start epoch 220 2024-10-20 19:43:43,842 INFO [train.py:561] (1/4) Epoch 220, batch 6, global_batch_idx: 3510, batch size: 106, 
loss[dur_loss=0.3844, prior_loss=1.019, diff_loss=0.4273, tot_loss=1.831, over 106.00 samples.], tot_loss[dur_loss=0.3811, prior_loss=1.018, diff_loss=0.5425, tot_loss=1.942, over 1142.00 samples.], 2024-10-20 19:43:56,861 INFO [train.py:682] (1/4) Start epoch 221 2024-10-20 19:44:05,464 INFO [train.py:561] (1/4) Epoch 221, batch 0, global_batch_idx: 3520, batch size: 108, loss[dur_loss=0.4071, prior_loss=1.02, diff_loss=0.4449, tot_loss=1.872, over 108.00 samples.], tot_loss[dur_loss=0.4071, prior_loss=1.02, diff_loss=0.4449, tot_loss=1.872, over 108.00 samples.], 2024-10-20 19:44:19,581 INFO [train.py:561] (1/4) Epoch 221, batch 10, global_batch_idx: 3530, batch size: 111, loss[dur_loss=0.4009, prior_loss=1.021, diff_loss=0.4406, tot_loss=1.862, over 111.00 samples.], tot_loss[dur_loss=0.3868, prior_loss=1.019, diff_loss=0.5306, tot_loss=1.936, over 1656.00 samples.], 2024-10-20 19:44:26,635 INFO [train.py:682] (1/4) Start epoch 222 2024-10-20 19:44:40,210 INFO [train.py:561] (1/4) Epoch 222, batch 4, global_batch_idx: 3540, batch size: 189, loss[dur_loss=0.3745, prior_loss=1.018, diff_loss=0.4924, tot_loss=1.885, over 189.00 samples.], tot_loss[dur_loss=0.378, prior_loss=1.018, diff_loss=0.5642, tot_loss=1.96, over 937.00 samples.], 2024-10-20 19:44:55,074 INFO [train.py:561] (1/4) Epoch 222, batch 14, global_batch_idx: 3550, batch size: 142, loss[dur_loss=0.3923, prior_loss=1.018, diff_loss=0.4418, tot_loss=1.852, over 142.00 samples.], tot_loss[dur_loss=0.3856, prior_loss=1.018, diff_loss=0.4922, tot_loss=1.896, over 2210.00 samples.], 2024-10-20 19:44:56,509 INFO [train.py:682] (1/4) Start epoch 223 2024-10-20 19:45:16,657 INFO [train.py:561] (1/4) Epoch 223, batch 8, global_batch_idx: 3560, batch size: 170, loss[dur_loss=0.3844, prior_loss=1.018, diff_loss=0.4516, tot_loss=1.854, over 170.00 samples.], tot_loss[dur_loss=0.3824, prior_loss=1.018, diff_loss=0.5325, tot_loss=1.933, over 1432.00 samples.], 2024-10-20 19:45:26,902 INFO [train.py:682] (1/4) Start epoch 224 2024-10-20 19:45:38,104 INFO [train.py:561] (1/4) Epoch 224, batch 2, global_batch_idx: 3570, batch size: 203, loss[dur_loss=0.3844, prior_loss=1.018, diff_loss=0.4563, tot_loss=1.858, over 203.00 samples.], tot_loss[dur_loss=0.3894, prior_loss=1.018, diff_loss=0.4496, tot_loss=1.857, over 442.00 samples.], 2024-10-20 19:45:52,451 INFO [train.py:561] (1/4) Epoch 224, batch 12, global_batch_idx: 3580, batch size: 152, loss[dur_loss=0.388, prior_loss=1.018, diff_loss=0.441, tot_loss=1.847, over 152.00 samples.], tot_loss[dur_loss=0.3839, prior_loss=1.018, diff_loss=0.5046, tot_loss=1.907, over 1966.00 samples.], 2024-10-20 19:45:56,901 INFO [train.py:682] (1/4) Start epoch 225 2024-10-20 19:46:13,958 INFO [train.py:561] (1/4) Epoch 225, batch 6, global_batch_idx: 3590, batch size: 106, loss[dur_loss=0.3805, prior_loss=1.018, diff_loss=0.3899, tot_loss=1.788, over 106.00 samples.], tot_loss[dur_loss=0.378, prior_loss=1.017, diff_loss=0.5473, tot_loss=1.943, over 1142.00 samples.], 2024-10-20 19:46:27,077 INFO [train.py:682] (1/4) Start epoch 226 2024-10-20 19:46:35,893 INFO [train.py:561] (1/4) Epoch 226, batch 0, global_batch_idx: 3600, batch size: 108, loss[dur_loss=0.403, prior_loss=1.019, diff_loss=0.4488, tot_loss=1.871, over 108.00 samples.], tot_loss[dur_loss=0.403, prior_loss=1.019, diff_loss=0.4488, tot_loss=1.871, over 108.00 samples.], 2024-10-20 19:46:50,103 INFO [train.py:561] (1/4) Epoch 226, batch 10, global_batch_idx: 3610, batch size: 111, loss[dur_loss=0.3986, prior_loss=1.02, diff_loss=0.4064, 
tot_loss=1.825, over 111.00 samples.], tot_loss[dur_loss=0.3836, prior_loss=1.018, diff_loss=0.5034, tot_loss=1.905, over 1656.00 samples.], 2024-10-20 19:46:57,178 INFO [train.py:682] (1/4) Start epoch 227 2024-10-20 19:47:11,077 INFO [train.py:561] (1/4) Epoch 227, batch 4, global_batch_idx: 3620, batch size: 189, loss[dur_loss=0.3829, prior_loss=1.018, diff_loss=0.4433, tot_loss=1.844, over 189.00 samples.], tot_loss[dur_loss=0.3777, prior_loss=1.017, diff_loss=0.5557, tot_loss=1.951, over 937.00 samples.], 2024-10-20 19:47:25,931 INFO [train.py:561] (1/4) Epoch 227, batch 14, global_batch_idx: 3630, batch size: 142, loss[dur_loss=0.3873, prior_loss=1.017, diff_loss=0.451, tot_loss=1.856, over 142.00 samples.], tot_loss[dur_loss=0.3837, prior_loss=1.018, diff_loss=0.4889, tot_loss=1.89, over 2210.00 samples.], 2024-10-20 19:47:27,351 INFO [train.py:682] (1/4) Start epoch 228 2024-10-20 19:47:47,502 INFO [train.py:561] (1/4) Epoch 228, batch 8, global_batch_idx: 3640, batch size: 170, loss[dur_loss=0.3819, prior_loss=1.018, diff_loss=0.4824, tot_loss=1.882, over 170.00 samples.], tot_loss[dur_loss=0.3798, prior_loss=1.017, diff_loss=0.528, tot_loss=1.925, over 1432.00 samples.], 2024-10-20 19:47:57,695 INFO [train.py:682] (1/4) Start epoch 229 2024-10-20 19:48:09,010 INFO [train.py:561] (1/4) Epoch 229, batch 2, global_batch_idx: 3650, batch size: 203, loss[dur_loss=0.3811, prior_loss=1.017, diff_loss=0.4578, tot_loss=1.856, over 203.00 samples.], tot_loss[dur_loss=0.387, prior_loss=1.018, diff_loss=0.442, tot_loss=1.847, over 442.00 samples.], 2024-10-20 19:48:23,346 INFO [train.py:561] (1/4) Epoch 229, batch 12, global_batch_idx: 3660, batch size: 152, loss[dur_loss=0.3891, prior_loss=1.017, diff_loss=0.4711, tot_loss=1.878, over 152.00 samples.], tot_loss[dur_loss=0.3811, prior_loss=1.017, diff_loss=0.4943, tot_loss=1.893, over 1966.00 samples.], 2024-10-20 19:48:27,862 INFO [train.py:682] (1/4) Start epoch 230 2024-10-20 19:48:44,773 INFO [train.py:561] (1/4) Epoch 230, batch 6, global_batch_idx: 3670, batch size: 106, loss[dur_loss=0.3818, prior_loss=1.017, diff_loss=0.4314, tot_loss=1.831, over 106.00 samples.], tot_loss[dur_loss=0.378, prior_loss=1.017, diff_loss=0.5385, tot_loss=1.934, over 1142.00 samples.], 2024-10-20 19:48:57,849 INFO [train.py:682] (1/4) Start epoch 231 2024-10-20 19:49:06,749 INFO [train.py:561] (1/4) Epoch 231, batch 0, global_batch_idx: 3680, batch size: 108, loss[dur_loss=0.4006, prior_loss=1.019, diff_loss=0.4215, tot_loss=1.841, over 108.00 samples.], tot_loss[dur_loss=0.4006, prior_loss=1.019, diff_loss=0.4215, tot_loss=1.841, over 108.00 samples.], 2024-10-20 19:49:21,033 INFO [train.py:561] (1/4) Epoch 231, batch 10, global_batch_idx: 3690, batch size: 111, loss[dur_loss=0.4005, prior_loss=1.019, diff_loss=0.4175, tot_loss=1.837, over 111.00 samples.], tot_loss[dur_loss=0.3803, prior_loss=1.017, diff_loss=0.4976, tot_loss=1.895, over 1656.00 samples.], 2024-10-20 19:49:28,138 INFO [train.py:682] (1/4) Start epoch 232 2024-10-20 19:49:41,751 INFO [train.py:561] (1/4) Epoch 232, batch 4, global_batch_idx: 3700, batch size: 189, loss[dur_loss=0.3806, prior_loss=1.018, diff_loss=0.4529, tot_loss=1.852, over 189.00 samples.], tot_loss[dur_loss=0.3771, prior_loss=1.017, diff_loss=0.557, tot_loss=1.951, over 937.00 samples.], 2024-10-20 19:49:56,630 INFO [train.py:561] (1/4) Epoch 232, batch 14, global_batch_idx: 3710, batch size: 142, loss[dur_loss=0.3864, prior_loss=1.017, diff_loss=0.4357, tot_loss=1.839, over 142.00 samples.], tot_loss[dur_loss=0.3835, 
prior_loss=1.017, diff_loss=0.4822, tot_loss=1.883, over 2210.00 samples.], 2024-10-20 19:49:58,053 INFO [train.py:682] (1/4) Start epoch 233 2024-10-20 19:50:17,741 INFO [train.py:561] (1/4) Epoch 233, batch 8, global_batch_idx: 3720, batch size: 170, loss[dur_loss=0.3824, prior_loss=1.017, diff_loss=0.4711, tot_loss=1.871, over 170.00 samples.], tot_loss[dur_loss=0.3777, prior_loss=1.017, diff_loss=0.5265, tot_loss=1.921, over 1432.00 samples.], 2024-10-20 19:50:27,865 INFO [train.py:682] (1/4) Start epoch 234 2024-10-20 19:50:39,253 INFO [train.py:561] (1/4) Epoch 234, batch 2, global_batch_idx: 3730, batch size: 203, loss[dur_loss=0.3767, prior_loss=1.017, diff_loss=0.4532, tot_loss=1.847, over 203.00 samples.], tot_loss[dur_loss=0.3838, prior_loss=1.018, diff_loss=0.4511, tot_loss=1.852, over 442.00 samples.], 2024-10-20 19:50:53,378 INFO [train.py:561] (1/4) Epoch 234, batch 12, global_batch_idx: 3740, batch size: 152, loss[dur_loss=0.3837, prior_loss=1.017, diff_loss=0.4592, tot_loss=1.86, over 152.00 samples.], tot_loss[dur_loss=0.3792, prior_loss=1.017, diff_loss=0.4908, tot_loss=1.887, over 1966.00 samples.], 2024-10-20 19:50:57,788 INFO [train.py:682] (1/4) Start epoch 235 2024-10-20 19:51:14,838 INFO [train.py:561] (1/4) Epoch 235, batch 6, global_batch_idx: 3750, batch size: 106, loss[dur_loss=0.3809, prior_loss=1.017, diff_loss=0.4236, tot_loss=1.821, over 106.00 samples.], tot_loss[dur_loss=0.3768, prior_loss=1.017, diff_loss=0.5389, tot_loss=1.932, over 1142.00 samples.], 2024-10-20 19:51:27,882 INFO [train.py:682] (1/4) Start epoch 236 2024-10-20 19:51:36,652 INFO [train.py:561] (1/4) Epoch 236, batch 0, global_batch_idx: 3760, batch size: 108, loss[dur_loss=0.4018, prior_loss=1.019, diff_loss=0.4458, tot_loss=1.866, over 108.00 samples.], tot_loss[dur_loss=0.4018, prior_loss=1.019, diff_loss=0.4458, tot_loss=1.866, over 108.00 samples.], 2024-10-20 19:51:50,867 INFO [train.py:561] (1/4) Epoch 236, batch 10, global_batch_idx: 3770, batch size: 111, loss[dur_loss=0.3943, prior_loss=1.02, diff_loss=0.4461, tot_loss=1.86, over 111.00 samples.], tot_loss[dur_loss=0.3789, prior_loss=1.017, diff_loss=0.5014, tot_loss=1.897, over 1656.00 samples.], 2024-10-20 19:51:57,938 INFO [train.py:682] (1/4) Start epoch 237 2024-10-20 19:52:11,603 INFO [train.py:561] (1/4) Epoch 237, batch 4, global_batch_idx: 3780, batch size: 189, loss[dur_loss=0.3742, prior_loss=1.017, diff_loss=0.4514, tot_loss=1.842, over 189.00 samples.], tot_loss[dur_loss=0.373, prior_loss=1.017, diff_loss=0.5479, tot_loss=1.937, over 937.00 samples.], 2024-10-20 19:52:26,437 INFO [train.py:561] (1/4) Epoch 237, batch 14, global_batch_idx: 3790, batch size: 142, loss[dur_loss=0.3805, prior_loss=1.016, diff_loss=0.4462, tot_loss=1.843, over 142.00 samples.], tot_loss[dur_loss=0.379, prior_loss=1.017, diff_loss=0.4849, tot_loss=1.881, over 2210.00 samples.], 2024-10-20 19:52:27,855 INFO [train.py:682] (1/4) Start epoch 238 2024-10-20 19:52:47,628 INFO [train.py:561] (1/4) Epoch 238, batch 8, global_batch_idx: 3800, batch size: 170, loss[dur_loss=0.3799, prior_loss=1.017, diff_loss=0.4763, tot_loss=1.873, over 170.00 samples.], tot_loss[dur_loss=0.3768, prior_loss=1.017, diff_loss=0.5106, tot_loss=1.904, over 1432.00 samples.], 2024-10-20 19:52:57,777 INFO [train.py:682] (1/4) Start epoch 239 2024-10-20 19:53:09,303 INFO [train.py:561] (1/4) Epoch 239, batch 2, global_batch_idx: 3810, batch size: 203, loss[dur_loss=0.3769, prior_loss=1.017, diff_loss=0.4875, tot_loss=1.881, over 203.00 samples.], 
tot_loss[dur_loss=0.3828, prior_loss=1.017, diff_loss=0.4482, tot_loss=1.848, over 442.00 samples.], 2024-10-20 19:53:23,665 INFO [train.py:561] (1/4) Epoch 239, batch 12, global_batch_idx: 3820, batch size: 152, loss[dur_loss=0.3867, prior_loss=1.017, diff_loss=0.4443, tot_loss=1.848, over 152.00 samples.], tot_loss[dur_loss=0.3796, prior_loss=1.017, diff_loss=0.4889, tot_loss=1.885, over 1966.00 samples.], 2024-10-20 19:53:28,138 INFO [train.py:682] (1/4) Start epoch 240 2024-10-20 19:53:45,130 INFO [train.py:561] (1/4) Epoch 240, batch 6, global_batch_idx: 3830, batch size: 106, loss[dur_loss=0.3839, prior_loss=1.018, diff_loss=0.4805, tot_loss=1.882, over 106.00 samples.], tot_loss[dur_loss=0.3755, prior_loss=1.016, diff_loss=0.5327, tot_loss=1.925, over 1142.00 samples.], 2024-10-20 19:53:58,215 INFO [train.py:682] (1/4) Start epoch 241 2024-10-20 19:54:06,780 INFO [train.py:561] (1/4) Epoch 241, batch 0, global_batch_idx: 3840, batch size: 108, loss[dur_loss=0.3949, prior_loss=1.018, diff_loss=0.4326, tot_loss=1.846, over 108.00 samples.], tot_loss[dur_loss=0.3949, prior_loss=1.018, diff_loss=0.4326, tot_loss=1.846, over 108.00 samples.], 2024-10-20 19:54:21,020 INFO [train.py:561] (1/4) Epoch 241, batch 10, global_batch_idx: 3850, batch size: 111, loss[dur_loss=0.3948, prior_loss=1.019, diff_loss=0.422, tot_loss=1.836, over 111.00 samples.], tot_loss[dur_loss=0.3783, prior_loss=1.017, diff_loss=0.4993, tot_loss=1.895, over 1656.00 samples.], 2024-10-20 19:54:28,096 INFO [train.py:682] (1/4) Start epoch 242 2024-10-20 19:54:41,810 INFO [train.py:561] (1/4) Epoch 242, batch 4, global_batch_idx: 3860, batch size: 189, loss[dur_loss=0.3777, prior_loss=1.017, diff_loss=0.4664, tot_loss=1.861, over 189.00 samples.], tot_loss[dur_loss=0.3742, prior_loss=1.016, diff_loss=0.5646, tot_loss=1.955, over 937.00 samples.], 2024-10-20 19:54:56,744 INFO [train.py:561] (1/4) Epoch 242, batch 14, global_batch_idx: 3870, batch size: 142, loss[dur_loss=0.3822, prior_loss=1.016, diff_loss=0.4526, tot_loss=1.851, over 142.00 samples.], tot_loss[dur_loss=0.3786, prior_loss=1.017, diff_loss=0.491, tot_loss=1.886, over 2210.00 samples.], 2024-10-20 19:54:58,174 INFO [train.py:682] (1/4) Start epoch 243 2024-10-20 19:55:17,960 INFO [train.py:561] (1/4) Epoch 243, batch 8, global_batch_idx: 3880, batch size: 170, loss[dur_loss=0.375, prior_loss=1.017, diff_loss=0.4561, tot_loss=1.848, over 170.00 samples.], tot_loss[dur_loss=0.3736, prior_loss=1.016, diff_loss=0.5203, tot_loss=1.91, over 1432.00 samples.], 2024-10-20 19:55:28,120 INFO [train.py:682] (1/4) Start epoch 244 2024-10-20 19:55:39,506 INFO [train.py:561] (1/4) Epoch 244, batch 2, global_batch_idx: 3890, batch size: 203, loss[dur_loss=0.3772, prior_loss=1.017, diff_loss=0.4721, tot_loss=1.866, over 203.00 samples.], tot_loss[dur_loss=0.3826, prior_loss=1.017, diff_loss=0.4505, tot_loss=1.85, over 442.00 samples.], 2024-10-20 19:55:53,865 INFO [train.py:561] (1/4) Epoch 244, batch 12, global_batch_idx: 3900, batch size: 152, loss[dur_loss=0.3821, prior_loss=1.016, diff_loss=0.4243, tot_loss=1.822, over 152.00 samples.], tot_loss[dur_loss=0.3785, prior_loss=1.016, diff_loss=0.4898, tot_loss=1.885, over 1966.00 samples.], 2024-10-20 19:55:58,356 INFO [train.py:682] (1/4) Start epoch 245 2024-10-20 19:56:15,616 INFO [train.py:561] (1/4) Epoch 245, batch 6, global_batch_idx: 3910, batch size: 106, loss[dur_loss=0.3772, prior_loss=1.016, diff_loss=0.4417, tot_loss=1.835, over 106.00 samples.], tot_loss[dur_loss=0.3722, prior_loss=1.016, diff_loss=0.5304, 
tot_loss=1.919, over 1142.00 samples.], 2024-10-20 19:56:28,897 INFO [train.py:682] (1/4) Start epoch 246 2024-10-20 19:56:37,643 INFO [train.py:561] (1/4) Epoch 246, batch 0, global_batch_idx: 3920, batch size: 108, loss[dur_loss=0.3973, prior_loss=1.017, diff_loss=0.4347, tot_loss=1.849, over 108.00 samples.], tot_loss[dur_loss=0.3973, prior_loss=1.017, diff_loss=0.4347, tot_loss=1.849, over 108.00 samples.], 2024-10-20 19:56:52,012 INFO [train.py:561] (1/4) Epoch 246, batch 10, global_batch_idx: 3930, batch size: 111, loss[dur_loss=0.3906, prior_loss=1.019, diff_loss=0.4242, tot_loss=1.834, over 111.00 samples.], tot_loss[dur_loss=0.3758, prior_loss=1.016, diff_loss=0.4964, tot_loss=1.888, over 1656.00 samples.], 2024-10-20 19:56:59,093 INFO [train.py:682] (1/4) Start epoch 247 2024-10-20 19:57:12,907 INFO [train.py:561] (1/4) Epoch 247, batch 4, global_batch_idx: 3940, batch size: 189, loss[dur_loss=0.3704, prior_loss=1.016, diff_loss=0.4868, tot_loss=1.873, over 189.00 samples.], tot_loss[dur_loss=0.3715, prior_loss=1.016, diff_loss=0.5514, tot_loss=1.939, over 937.00 samples.], 2024-10-20 19:57:27,865 INFO [train.py:561] (1/4) Epoch 247, batch 14, global_batch_idx: 3950, batch size: 142, loss[dur_loss=0.377, prior_loss=1.015, diff_loss=0.4343, tot_loss=1.827, over 142.00 samples.], tot_loss[dur_loss=0.3765, prior_loss=1.016, diff_loss=0.4821, tot_loss=1.875, over 2210.00 samples.], 2024-10-20 19:57:29,296 INFO [train.py:682] (1/4) Start epoch 248 2024-10-20 19:57:49,228 INFO [train.py:561] (1/4) Epoch 248, batch 8, global_batch_idx: 3960, batch size: 170, loss[dur_loss=0.3756, prior_loss=1.016, diff_loss=0.4438, tot_loss=1.835, over 170.00 samples.], tot_loss[dur_loss=0.3737, prior_loss=1.016, diff_loss=0.5128, tot_loss=1.902, over 1432.00 samples.], 2024-10-20 19:57:59,417 INFO [train.py:682] (1/4) Start epoch 249 2024-10-20 19:58:10,588 INFO [train.py:561] (1/4) Epoch 249, batch 2, global_batch_idx: 3970, batch size: 203, loss[dur_loss=0.3751, prior_loss=1.016, diff_loss=0.4733, tot_loss=1.864, over 203.00 samples.], tot_loss[dur_loss=0.3799, prior_loss=1.016, diff_loss=0.4347, tot_loss=1.831, over 442.00 samples.], 2024-10-20 19:58:24,907 INFO [train.py:561] (1/4) Epoch 249, batch 12, global_batch_idx: 3980, batch size: 152, loss[dur_loss=0.3805, prior_loss=1.016, diff_loss=0.4227, tot_loss=1.819, over 152.00 samples.], tot_loss[dur_loss=0.3757, prior_loss=1.016, diff_loss=0.4834, tot_loss=1.875, over 1966.00 samples.], 2024-10-20 19:58:29,375 INFO [train.py:682] (1/4) Start epoch 250 2024-10-20 19:58:46,397 INFO [train.py:561] (1/4) Epoch 250, batch 6, global_batch_idx: 3990, batch size: 106, loss[dur_loss=0.3778, prior_loss=1.016, diff_loss=0.4278, tot_loss=1.822, over 106.00 samples.], tot_loss[dur_loss=0.3704, prior_loss=1.016, diff_loss=0.5301, tot_loss=1.916, over 1142.00 samples.], 2024-10-20 19:58:59,479 INFO [train.py:682] (1/4) Start epoch 251 2024-10-20 19:59:08,005 INFO [train.py:561] (1/4) Epoch 251, batch 0, global_batch_idx: 4000, batch size: 108, loss[dur_loss=0.3935, prior_loss=1.017, diff_loss=0.3994, tot_loss=1.81, over 108.00 samples.], tot_loss[dur_loss=0.3935, prior_loss=1.017, diff_loss=0.3994, tot_loss=1.81, over 108.00 samples.], 2024-10-20 19:59:22,176 INFO [train.py:561] (1/4) Epoch 251, batch 10, global_batch_idx: 4010, batch size: 111, loss[dur_loss=0.3892, prior_loss=1.019, diff_loss=0.4668, tot_loss=1.875, over 111.00 samples.], tot_loss[dur_loss=0.3747, prior_loss=1.016, diff_loss=0.498, tot_loss=1.889, over 1656.00 samples.], 2024-10-20 
19:59:29,254 INFO [train.py:682] (1/4) Start epoch 252 2024-10-20 19:59:42,778 INFO [train.py:561] (1/4) Epoch 252, batch 4, global_batch_idx: 4020, batch size: 189, loss[dur_loss=0.3685, prior_loss=1.016, diff_loss=0.5063, tot_loss=1.891, over 189.00 samples.], tot_loss[dur_loss=0.3713, prior_loss=1.016, diff_loss=0.5519, tot_loss=1.939, over 937.00 samples.], 2024-10-20 19:59:57,684 INFO [train.py:561] (1/4) Epoch 252, batch 14, global_batch_idx: 4030, batch size: 142, loss[dur_loss=0.3787, prior_loss=1.016, diff_loss=0.4139, tot_loss=1.808, over 142.00 samples.], tot_loss[dur_loss=0.3757, prior_loss=1.016, diff_loss=0.4856, tot_loss=1.877, over 2210.00 samples.], 2024-10-20 19:59:59,126 INFO [train.py:682] (1/4) Start epoch 253 2024-10-20 20:00:19,191 INFO [train.py:561] (1/4) Epoch 253, batch 8, global_batch_idx: 4040, batch size: 170, loss[dur_loss=0.3753, prior_loss=1.016, diff_loss=0.4355, tot_loss=1.826, over 170.00 samples.], tot_loss[dur_loss=0.3714, prior_loss=1.015, diff_loss=0.5052, tot_loss=1.892, over 1432.00 samples.], 2024-10-20 20:00:29,338 INFO [train.py:682] (1/4) Start epoch 254 2024-10-20 20:00:40,612 INFO [train.py:561] (1/4) Epoch 254, batch 2, global_batch_idx: 4050, batch size: 203, loss[dur_loss=0.3738, prior_loss=1.016, diff_loss=0.4674, tot_loss=1.857, over 203.00 samples.], tot_loss[dur_loss=0.379, prior_loss=1.016, diff_loss=0.4402, tot_loss=1.835, over 442.00 samples.], 2024-10-20 20:00:54,961 INFO [train.py:561] (1/4) Epoch 254, batch 12, global_batch_idx: 4060, batch size: 152, loss[dur_loss=0.3832, prior_loss=1.016, diff_loss=0.4684, tot_loss=1.868, over 152.00 samples.], tot_loss[dur_loss=0.3748, prior_loss=1.016, diff_loss=0.49, tot_loss=1.88, over 1966.00 samples.], 2024-10-20 20:00:59,458 INFO [train.py:682] (1/4) Start epoch 255 2024-10-20 20:01:16,217 INFO [train.py:561] (1/4) Epoch 255, batch 6, global_batch_idx: 4070, batch size: 106, loss[dur_loss=0.3798, prior_loss=1.016, diff_loss=0.4293, tot_loss=1.825, over 106.00 samples.], tot_loss[dur_loss=0.3709, prior_loss=1.015, diff_loss=0.5302, tot_loss=1.917, over 1142.00 samples.], 2024-10-20 20:01:29,251 INFO [train.py:682] (1/4) Start epoch 256 2024-10-20 20:01:37,819 INFO [train.py:561] (1/4) Epoch 256, batch 0, global_batch_idx: 4080, batch size: 108, loss[dur_loss=0.3957, prior_loss=1.017, diff_loss=0.4462, tot_loss=1.859, over 108.00 samples.], tot_loss[dur_loss=0.3957, prior_loss=1.017, diff_loss=0.4462, tot_loss=1.859, over 108.00 samples.], 2024-10-20 20:01:52,111 INFO [train.py:561] (1/4) Epoch 256, batch 10, global_batch_idx: 4090, batch size: 111, loss[dur_loss=0.3902, prior_loss=1.018, diff_loss=0.4022, tot_loss=1.81, over 111.00 samples.], tot_loss[dur_loss=0.3737, prior_loss=1.016, diff_loss=0.5074, tot_loss=1.897, over 1656.00 samples.], 2024-10-20 20:01:59,174 INFO [train.py:682] (1/4) Start epoch 257 2024-10-20 20:02:12,725 INFO [train.py:561] (1/4) Epoch 257, batch 4, global_batch_idx: 4100, batch size: 189, loss[dur_loss=0.3688, prior_loss=1.016, diff_loss=0.4427, tot_loss=1.827, over 189.00 samples.], tot_loss[dur_loss=0.3676, prior_loss=1.015, diff_loss=0.5477, tot_loss=1.931, over 937.00 samples.], 2024-10-20 20:02:27,608 INFO [train.py:561] (1/4) Epoch 257, batch 14, global_batch_idx: 4110, batch size: 142, loss[dur_loss=0.3758, prior_loss=1.015, diff_loss=0.4262, tot_loss=1.817, over 142.00 samples.], tot_loss[dur_loss=0.3739, prior_loss=1.016, diff_loss=0.4783, tot_loss=1.868, over 2210.00 samples.], 2024-10-20 20:02:29,032 INFO [train.py:682] (1/4) Start epoch 258 
2024-10-20 20:02:48,758 INFO [train.py:561] (1/4) Epoch 258, batch 8, global_batch_idx: 4120, batch size: 170, loss[dur_loss=0.3722, prior_loss=1.016, diff_loss=0.445, tot_loss=1.833, over 170.00 samples.], tot_loss[dur_loss=0.3708, prior_loss=1.016, diff_loss=0.5067, tot_loss=1.893, over 1432.00 samples.], 2024-10-20 20:02:58,859 INFO [train.py:682] (1/4) Start epoch 259 2024-10-20 20:03:10,066 INFO [train.py:561] (1/4) Epoch 259, batch 2, global_batch_idx: 4130, batch size: 203, loss[dur_loss=0.375, prior_loss=1.015, diff_loss=0.476, tot_loss=1.866, over 203.00 samples.], tot_loss[dur_loss=0.379, prior_loss=1.016, diff_loss=0.4335, tot_loss=1.828, over 442.00 samples.], 2024-10-20 20:03:24,262 INFO [train.py:561] (1/4) Epoch 259, batch 12, global_batch_idx: 4140, batch size: 152, loss[dur_loss=0.3829, prior_loss=1.015, diff_loss=0.4525, tot_loss=1.851, over 152.00 samples.], tot_loss[dur_loss=0.3749, prior_loss=1.015, diff_loss=0.4832, tot_loss=1.874, over 1966.00 samples.], 2024-10-20 20:03:28,683 INFO [train.py:682] (1/4) Start epoch 260 2024-10-20 20:03:45,853 INFO [train.py:561] (1/4) Epoch 260, batch 6, global_batch_idx: 4150, batch size: 106, loss[dur_loss=0.3751, prior_loss=1.016, diff_loss=0.4365, tot_loss=1.827, over 106.00 samples.], tot_loss[dur_loss=0.3681, prior_loss=1.015, diff_loss=0.5352, tot_loss=1.918, over 1142.00 samples.], 2024-10-20 20:03:58,916 INFO [train.py:682] (1/4) Start epoch 261 2024-10-20 20:04:07,492 INFO [train.py:561] (1/4) Epoch 261, batch 0, global_batch_idx: 4160, batch size: 108, loss[dur_loss=0.3893, prior_loss=1.016, diff_loss=0.4258, tot_loss=1.831, over 108.00 samples.], tot_loss[dur_loss=0.3893, prior_loss=1.016, diff_loss=0.4258, tot_loss=1.831, over 108.00 samples.], 2024-10-20 20:04:21,598 INFO [train.py:561] (1/4) Epoch 261, batch 10, global_batch_idx: 4170, batch size: 111, loss[dur_loss=0.3917, prior_loss=1.018, diff_loss=0.4379, tot_loss=1.847, over 111.00 samples.], tot_loss[dur_loss=0.3728, prior_loss=1.016, diff_loss=0.4922, tot_loss=1.881, over 1656.00 samples.], 2024-10-20 20:04:28,680 INFO [train.py:682] (1/4) Start epoch 262 2024-10-20 20:04:42,414 INFO [train.py:561] (1/4) Epoch 262, batch 4, global_batch_idx: 4180, batch size: 189, loss[dur_loss=0.3702, prior_loss=1.016, diff_loss=0.4388, tot_loss=1.826, over 189.00 samples.], tot_loss[dur_loss=0.3677, prior_loss=1.015, diff_loss=0.5421, tot_loss=1.925, over 937.00 samples.], 2024-10-20 20:04:57,426 INFO [train.py:561] (1/4) Epoch 262, batch 14, global_batch_idx: 4190, batch size: 142, loss[dur_loss=0.3709, prior_loss=1.015, diff_loss=0.4258, tot_loss=1.812, over 142.00 samples.], tot_loss[dur_loss=0.3747, prior_loss=1.016, diff_loss=0.4785, tot_loss=1.869, over 2210.00 samples.], 2024-10-20 20:04:58,887 INFO [train.py:682] (1/4) Start epoch 263 2024-10-20 20:05:19,014 INFO [train.py:561] (1/4) Epoch 263, batch 8, global_batch_idx: 4200, batch size: 170, loss[dur_loss=0.3782, prior_loss=1.016, diff_loss=0.426, tot_loss=1.82, over 170.00 samples.], tot_loss[dur_loss=0.3706, prior_loss=1.015, diff_loss=0.5101, tot_loss=1.896, over 1432.00 samples.], 2024-10-20 20:05:29,239 INFO [train.py:682] (1/4) Start epoch 264 2024-10-20 20:05:40,615 INFO [train.py:561] (1/4) Epoch 264, batch 2, global_batch_idx: 4210, batch size: 203, loss[dur_loss=0.3686, prior_loss=1.015, diff_loss=0.4437, tot_loss=1.828, over 203.00 samples.], tot_loss[dur_loss=0.3749, prior_loss=1.016, diff_loss=0.428, tot_loss=1.819, over 442.00 samples.], 2024-10-20 20:05:54,922 INFO [train.py:561] (1/4) Epoch 264, batch 
12, global_batch_idx: 4220, batch size: 152, loss[dur_loss=0.3773, prior_loss=1.015, diff_loss=0.4251, tot_loss=1.817, over 152.00 samples.], tot_loss[dur_loss=0.3711, prior_loss=1.015, diff_loss=0.4747, tot_loss=1.861, over 1966.00 samples.], 2024-10-20 20:05:59,381 INFO [train.py:682] (1/4) Start epoch 265 2024-10-20 20:06:16,209 INFO [train.py:561] (1/4) Epoch 265, batch 6, global_batch_idx: 4230, batch size: 106, loss[dur_loss=0.3727, prior_loss=1.015, diff_loss=0.4018, tot_loss=1.79, over 106.00 samples.], tot_loss[dur_loss=0.3672, prior_loss=1.015, diff_loss=0.5246, tot_loss=1.907, over 1142.00 samples.], 2024-10-20 20:06:29,311 INFO [train.py:682] (1/4) Start epoch 266 2024-10-20 20:06:37,776 INFO [train.py:561] (1/4) Epoch 266, batch 0, global_batch_idx: 4240, batch size: 108, loss[dur_loss=0.3881, prior_loss=1.016, diff_loss=0.4311, tot_loss=1.835, over 108.00 samples.], tot_loss[dur_loss=0.3881, prior_loss=1.016, diff_loss=0.4311, tot_loss=1.835, over 108.00 samples.], 2024-10-20 20:06:52,037 INFO [train.py:561] (1/4) Epoch 266, batch 10, global_batch_idx: 4250, batch size: 111, loss[dur_loss=0.3898, prior_loss=1.017, diff_loss=0.4178, tot_loss=1.825, over 111.00 samples.], tot_loss[dur_loss=0.3687, prior_loss=1.015, diff_loss=0.4956, tot_loss=1.879, over 1656.00 samples.], 2024-10-20 20:06:59,098 INFO [train.py:682] (1/4) Start epoch 267 2024-10-20 20:07:12,900 INFO [train.py:561] (1/4) Epoch 267, batch 4, global_batch_idx: 4260, batch size: 189, loss[dur_loss=0.3662, prior_loss=1.015, diff_loss=0.4754, tot_loss=1.857, over 189.00 samples.], tot_loss[dur_loss=0.3629, prior_loss=1.014, diff_loss=0.5523, tot_loss=1.93, over 937.00 samples.], 2024-10-20 20:07:27,785 INFO [train.py:561] (1/4) Epoch 267, batch 14, global_batch_idx: 4270, batch size: 142, loss[dur_loss=0.3736, prior_loss=1.014, diff_loss=0.4071, tot_loss=1.795, over 142.00 samples.], tot_loss[dur_loss=0.3702, prior_loss=1.015, diff_loss=0.4807, tot_loss=1.866, over 2210.00 samples.], 2024-10-20 20:07:29,236 INFO [train.py:682] (1/4) Start epoch 268 2024-10-20 20:07:49,270 INFO [train.py:561] (1/4) Epoch 268, batch 8, global_batch_idx: 4280, batch size: 170, loss[dur_loss=0.3784, prior_loss=1.015, diff_loss=0.4188, tot_loss=1.813, over 170.00 samples.], tot_loss[dur_loss=0.3677, prior_loss=1.015, diff_loss=0.4926, tot_loss=1.875, over 1432.00 samples.], 2024-10-20 20:07:59,354 INFO [train.py:682] (1/4) Start epoch 269 2024-10-20 20:08:10,511 INFO [train.py:561] (1/4) Epoch 269, batch 2, global_batch_idx: 4290, batch size: 203, loss[dur_loss=0.3694, prior_loss=1.015, diff_loss=0.4428, tot_loss=1.827, over 203.00 samples.], tot_loss[dur_loss=0.3746, prior_loss=1.015, diff_loss=0.4216, tot_loss=1.811, over 442.00 samples.], 2024-10-20 20:08:25,033 INFO [train.py:561] (1/4) Epoch 269, batch 12, global_batch_idx: 4300, batch size: 152, loss[dur_loss=0.374, prior_loss=1.015, diff_loss=0.421, tot_loss=1.81, over 152.00 samples.], tot_loss[dur_loss=0.3697, prior_loss=1.015, diff_loss=0.4747, tot_loss=1.859, over 1966.00 samples.], 2024-10-20 20:08:29,486 INFO [train.py:682] (1/4) Start epoch 270 2024-10-20 20:08:46,301 INFO [train.py:561] (1/4) Epoch 270, batch 6, global_batch_idx: 4310, batch size: 106, loss[dur_loss=0.3718, prior_loss=1.015, diff_loss=0.4398, tot_loss=1.826, over 106.00 samples.], tot_loss[dur_loss=0.367, prior_loss=1.014, diff_loss=0.5284, tot_loss=1.91, over 1142.00 samples.], 2024-10-20 20:08:59,357 INFO [train.py:682] (1/4) Start epoch 271 2024-10-20 20:09:07,913 INFO [train.py:561] (1/4) Epoch 271, 
batch 0, global_batch_idx: 4320, batch size: 108, loss[dur_loss=0.3879, prior_loss=1.016, diff_loss=0.4192, tot_loss=1.823, over 108.00 samples.], tot_loss[dur_loss=0.3879, prior_loss=1.016, diff_loss=0.4192, tot_loss=1.823, over 108.00 samples.], 2024-10-20 20:09:22,083 INFO [train.py:561] (1/4) Epoch 271, batch 10, global_batch_idx: 4330, batch size: 111, loss[dur_loss=0.3845, prior_loss=1.017, diff_loss=0.4459, tot_loss=1.847, over 111.00 samples.], tot_loss[dur_loss=0.3666, prior_loss=1.015, diff_loss=0.4877, tot_loss=1.869, over 1656.00 samples.], 2024-10-20 20:09:29,167 INFO [train.py:682] (1/4) Start epoch 272 2024-10-20 20:09:42,578 INFO [train.py:561] (1/4) Epoch 272, batch 4, global_batch_idx: 4340, batch size: 189, loss[dur_loss=0.364, prior_loss=1.015, diff_loss=0.4528, tot_loss=1.831, over 189.00 samples.], tot_loss[dur_loss=0.3637, prior_loss=1.014, diff_loss=0.538, tot_loss=1.916, over 937.00 samples.], 2024-10-20 20:09:57,445 INFO [train.py:561] (1/4) Epoch 272, batch 14, global_batch_idx: 4350, batch size: 142, loss[dur_loss=0.3687, prior_loss=1.014, diff_loss=0.4387, tot_loss=1.822, over 142.00 samples.], tot_loss[dur_loss=0.3688, prior_loss=1.015, diff_loss=0.4756, tot_loss=1.859, over 2210.00 samples.], 2024-10-20 20:09:58,877 INFO [train.py:682] (1/4) Start epoch 273 2024-10-20 20:10:18,877 INFO [train.py:561] (1/4) Epoch 273, batch 8, global_batch_idx: 4360, batch size: 170, loss[dur_loss=0.3713, prior_loss=1.015, diff_loss=0.4291, tot_loss=1.815, over 170.00 samples.], tot_loss[dur_loss=0.3675, prior_loss=1.014, diff_loss=0.4886, tot_loss=1.871, over 1432.00 samples.], 2024-10-20 20:10:29,029 INFO [train.py:682] (1/4) Start epoch 274 2024-10-20 20:10:40,262 INFO [train.py:561] (1/4) Epoch 274, batch 2, global_batch_idx: 4370, batch size: 203, loss[dur_loss=0.3703, prior_loss=1.015, diff_loss=0.4603, tot_loss=1.846, over 203.00 samples.], tot_loss[dur_loss=0.3748, prior_loss=1.015, diff_loss=0.4397, tot_loss=1.83, over 442.00 samples.], 2024-10-20 20:10:54,539 INFO [train.py:561] (1/4) Epoch 274, batch 12, global_batch_idx: 4380, batch size: 152, loss[dur_loss=0.3718, prior_loss=1.014, diff_loss=0.4459, tot_loss=1.832, over 152.00 samples.], tot_loss[dur_loss=0.3685, prior_loss=1.015, diff_loss=0.4826, tot_loss=1.866, over 1966.00 samples.], 2024-10-20 20:10:59,003 INFO [train.py:682] (1/4) Start epoch 275 2024-10-20 20:11:15,923 INFO [train.py:561] (1/4) Epoch 275, batch 6, global_batch_idx: 4390, batch size: 106, loss[dur_loss=0.3728, prior_loss=1.014, diff_loss=0.4391, tot_loss=1.826, over 106.00 samples.], tot_loss[dur_loss=0.3641, prior_loss=1.014, diff_loss=0.5322, tot_loss=1.91, over 1142.00 samples.], 2024-10-20 20:11:29,010 INFO [train.py:682] (1/4) Start epoch 276 2024-10-20 20:11:37,986 INFO [train.py:561] (1/4) Epoch 276, batch 0, global_batch_idx: 4400, batch size: 108, loss[dur_loss=0.3852, prior_loss=1.016, diff_loss=0.4287, tot_loss=1.83, over 108.00 samples.], tot_loss[dur_loss=0.3852, prior_loss=1.016, diff_loss=0.4287, tot_loss=1.83, over 108.00 samples.], 2024-10-20 20:11:52,165 INFO [train.py:561] (1/4) Epoch 276, batch 10, global_batch_idx: 4410, batch size: 111, loss[dur_loss=0.3819, prior_loss=1.017, diff_loss=0.4349, tot_loss=1.834, over 111.00 samples.], tot_loss[dur_loss=0.3675, prior_loss=1.014, diff_loss=0.488, tot_loss=1.87, over 1656.00 samples.], 2024-10-20 20:11:59,209 INFO [train.py:682] (1/4) Start epoch 277 2024-10-20 20:12:12,799 INFO [train.py:561] (1/4) Epoch 277, batch 4, global_batch_idx: 4420, batch size: 189, 
loss[dur_loss=0.3616, prior_loss=1.015, diff_loss=0.4866, tot_loss=1.863, over 189.00 samples.], tot_loss[dur_loss=0.3606, prior_loss=1.014, diff_loss=0.5569, tot_loss=1.931, over 937.00 samples.], 2024-10-20 20:12:27,723 INFO [train.py:561] (1/4) Epoch 277, batch 14, global_batch_idx: 4430, batch size: 142, loss[dur_loss=0.3746, prior_loss=1.014, diff_loss=0.4381, tot_loss=1.827, over 142.00 samples.], tot_loss[dur_loss=0.3676, prior_loss=1.014, diff_loss=0.4814, tot_loss=1.863, over 2210.00 samples.], 2024-10-20 20:12:29,150 INFO [train.py:682] (1/4) Start epoch 278 2024-10-20 20:12:49,295 INFO [train.py:561] (1/4) Epoch 278, batch 8, global_batch_idx: 4440, batch size: 170, loss[dur_loss=0.368, prior_loss=1.014, diff_loss=0.4679, tot_loss=1.85, over 170.00 samples.], tot_loss[dur_loss=0.3635, prior_loss=1.014, diff_loss=0.5124, tot_loss=1.89, over 1432.00 samples.], 2024-10-20 20:12:59,413 INFO [train.py:682] (1/4) Start epoch 279 2024-10-20 20:13:10,985 INFO [train.py:561] (1/4) Epoch 279, batch 2, global_batch_idx: 4450, batch size: 203, loss[dur_loss=0.3629, prior_loss=1.014, diff_loss=0.4806, tot_loss=1.858, over 203.00 samples.], tot_loss[dur_loss=0.3704, prior_loss=1.014, diff_loss=0.4538, tot_loss=1.839, over 442.00 samples.], 2024-10-20 20:13:25,202 INFO [train.py:561] (1/4) Epoch 279, batch 12, global_batch_idx: 4460, batch size: 152, loss[dur_loss=0.3714, prior_loss=1.014, diff_loss=0.4272, tot_loss=1.813, over 152.00 samples.], tot_loss[dur_loss=0.3659, prior_loss=1.014, diff_loss=0.4887, tot_loss=1.869, over 1966.00 samples.], 2024-10-20 20:13:29,633 INFO [train.py:682] (1/4) Start epoch 280 2024-10-20 20:13:46,541 INFO [train.py:561] (1/4) Epoch 280, batch 6, global_batch_idx: 4470, batch size: 106, loss[dur_loss=0.3757, prior_loss=1.014, diff_loss=0.4234, tot_loss=1.814, over 106.00 samples.], tot_loss[dur_loss=0.3635, prior_loss=1.014, diff_loss=0.5169, tot_loss=1.894, over 1142.00 samples.], 2024-10-20 20:13:59,658 INFO [train.py:682] (1/4) Start epoch 281 2024-10-20 20:14:08,223 INFO [train.py:561] (1/4) Epoch 281, batch 0, global_batch_idx: 4480, batch size: 108, loss[dur_loss=0.386, prior_loss=1.016, diff_loss=0.4444, tot_loss=1.846, over 108.00 samples.], tot_loss[dur_loss=0.386, prior_loss=1.016, diff_loss=0.4444, tot_loss=1.846, over 108.00 samples.], 2024-10-20 20:14:22,351 INFO [train.py:561] (1/4) Epoch 281, batch 10, global_batch_idx: 4490, batch size: 111, loss[dur_loss=0.3821, prior_loss=1.016, diff_loss=0.4164, tot_loss=1.815, over 111.00 samples.], tot_loss[dur_loss=0.3661, prior_loss=1.014, diff_loss=0.486, tot_loss=1.866, over 1656.00 samples.], 2024-10-20 20:14:29,457 INFO [train.py:682] (1/4) Start epoch 282 2024-10-20 20:14:42,978 INFO [train.py:561] (1/4) Epoch 282, batch 4, global_batch_idx: 4500, batch size: 189, loss[dur_loss=0.3675, prior_loss=1.014, diff_loss=0.432, tot_loss=1.814, over 189.00 samples.], tot_loss[dur_loss=0.3611, prior_loss=1.014, diff_loss=0.5521, tot_loss=1.927, over 937.00 samples.], 2024-10-20 20:14:44,605 INFO [train.py:579] (1/4) Computing validation loss 2024-10-20 20:15:01,998 INFO [train.py:589] (1/4) Epoch 282, validation: dur_loss=0.4134, prior_loss=1.028, diff_loss=0.4022, tot_loss=1.843, over 100.00 samples. 
2024-10-20 20:15:01,999 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-20 20:15:15,330 INFO [train.py:561] (1/4) Epoch 282, batch 14, global_batch_idx: 4510, batch size: 142, loss[dur_loss=0.3665, prior_loss=1.014, diff_loss=0.393, tot_loss=1.773, over 142.00 samples.], tot_loss[dur_loss=0.3667, prior_loss=1.014, diff_loss=0.4812, tot_loss=1.862, over 2210.00 samples.], 2024-10-20 20:15:16,751 INFO [train.py:682] (1/4) Start epoch 283 2024-10-20 20:15:37,131 INFO [train.py:561] (1/4) Epoch 283, batch 8, global_batch_idx: 4520, batch size: 170, loss[dur_loss=0.3662, prior_loss=1.014, diff_loss=0.4327, tot_loss=1.813, over 170.00 samples.], tot_loss[dur_loss=0.3641, prior_loss=1.014, diff_loss=0.5041, tot_loss=1.882, over 1432.00 samples.], 2024-10-20 20:15:47,323 INFO [train.py:682] (1/4) Start epoch 284 2024-10-20 20:15:59,083 INFO [train.py:561] (1/4) Epoch 284, batch 2, global_batch_idx: 4530, batch size: 203, loss[dur_loss=0.3675, prior_loss=1.014, diff_loss=0.4658, tot_loss=1.847, over 203.00 samples.], tot_loss[dur_loss=0.3726, prior_loss=1.014, diff_loss=0.4441, tot_loss=1.831, over 442.00 samples.], 2024-10-20 20:16:13,287 INFO [train.py:561] (1/4) Epoch 284, batch 12, global_batch_idx: 4540, batch size: 152, loss[dur_loss=0.3711, prior_loss=1.014, diff_loss=0.4196, tot_loss=1.804, over 152.00 samples.], tot_loss[dur_loss=0.3664, prior_loss=1.014, diff_loss=0.4809, tot_loss=1.861, over 1966.00 samples.], 2024-10-20 20:16:17,727 INFO [train.py:682] (1/4) Start epoch 285 2024-10-20 20:16:34,778 INFO [train.py:561] (1/4) Epoch 285, batch 6, global_batch_idx: 4550, batch size: 106, loss[dur_loss=0.3685, prior_loss=1.014, diff_loss=0.4042, tot_loss=1.787, over 106.00 samples.], tot_loss[dur_loss=0.3623, prior_loss=1.013, diff_loss=0.5139, tot_loss=1.889, over 1142.00 samples.], 2024-10-20 20:16:47,793 INFO [train.py:682] (1/4) Start epoch 286 2024-10-20 20:16:56,900 INFO [train.py:561] (1/4) Epoch 286, batch 0, global_batch_idx: 4560, batch size: 108, loss[dur_loss=0.3829, prior_loss=1.015, diff_loss=0.4226, tot_loss=1.821, over 108.00 samples.], tot_loss[dur_loss=0.3829, prior_loss=1.015, diff_loss=0.4226, tot_loss=1.821, over 108.00 samples.], 2024-10-20 20:17:11,072 INFO [train.py:561] (1/4) Epoch 286, batch 10, global_batch_idx: 4570, batch size: 111, loss[dur_loss=0.3829, prior_loss=1.016, diff_loss=0.3956, tot_loss=1.795, over 111.00 samples.], tot_loss[dur_loss=0.3657, prior_loss=1.014, diff_loss=0.4872, tot_loss=1.867, over 1656.00 samples.], 2024-10-20 20:17:18,105 INFO [train.py:682] (1/4) Start epoch 287 2024-10-20 20:17:31,944 INFO [train.py:561] (1/4) Epoch 287, batch 4, global_batch_idx: 4580, batch size: 189, loss[dur_loss=0.3635, prior_loss=1.014, diff_loss=0.4689, tot_loss=1.846, over 189.00 samples.], tot_loss[dur_loss=0.3604, prior_loss=1.013, diff_loss=0.5467, tot_loss=1.92, over 937.00 samples.], 2024-10-20 20:17:46,809 INFO [train.py:561] (1/4) Epoch 287, batch 14, global_batch_idx: 4590, batch size: 142, loss[dur_loss=0.3732, prior_loss=1.013, diff_loss=0.4094, tot_loss=1.796, over 142.00 samples.], tot_loss[dur_loss=0.3661, prior_loss=1.014, diff_loss=0.4749, tot_loss=1.855, over 2210.00 samples.], 2024-10-20 20:17:48,242 INFO [train.py:682] (1/4) Start epoch 288 2024-10-20 20:18:08,245 INFO [train.py:561] (1/4) Epoch 288, batch 8, global_batch_idx: 4600, batch size: 170, loss[dur_loss=0.3659, prior_loss=1.014, diff_loss=0.4614, tot_loss=1.841, over 170.00 samples.], tot_loss[dur_loss=0.3646, prior_loss=1.014, diff_loss=0.5164, 
tot_loss=1.895, over 1432.00 samples.], 2024-10-20 20:18:18,420 INFO [train.py:682] (1/4) Start epoch 289 2024-10-20 20:18:29,759 INFO [train.py:561] (1/4) Epoch 289, batch 2, global_batch_idx: 4610, batch size: 203, loss[dur_loss=0.362, prior_loss=1.013, diff_loss=0.474, tot_loss=1.849, over 203.00 samples.], tot_loss[dur_loss=0.3689, prior_loss=1.014, diff_loss=0.44, tot_loss=1.823, over 442.00 samples.], 2024-10-20 20:18:43,951 INFO [train.py:561] (1/4) Epoch 289, batch 12, global_batch_idx: 4620, batch size: 152, loss[dur_loss=0.3681, prior_loss=1.013, diff_loss=0.4459, tot_loss=1.827, over 152.00 samples.], tot_loss[dur_loss=0.3651, prior_loss=1.013, diff_loss=0.4819, tot_loss=1.86, over 1966.00 samples.], 2024-10-20 20:18:48,381 INFO [train.py:682] (1/4) Start epoch 290 2024-10-20 20:19:05,810 INFO [train.py:561] (1/4) Epoch 290, batch 6, global_batch_idx: 4630, batch size: 106, loss[dur_loss=0.3669, prior_loss=1.014, diff_loss=0.4174, tot_loss=1.798, over 106.00 samples.], tot_loss[dur_loss=0.3588, prior_loss=1.013, diff_loss=0.5124, tot_loss=1.884, over 1142.00 samples.], 2024-10-20 20:19:18,820 INFO [train.py:682] (1/4) Start epoch 291 2024-10-20 20:19:27,573 INFO [train.py:561] (1/4) Epoch 291, batch 0, global_batch_idx: 4640, batch size: 108, loss[dur_loss=0.3815, prior_loss=1.015, diff_loss=0.4156, tot_loss=1.812, over 108.00 samples.], tot_loss[dur_loss=0.3815, prior_loss=1.015, diff_loss=0.4156, tot_loss=1.812, over 108.00 samples.], 2024-10-20 20:19:41,779 INFO [train.py:561] (1/4) Epoch 291, batch 10, global_batch_idx: 4650, batch size: 111, loss[dur_loss=0.3788, prior_loss=1.016, diff_loss=0.3949, tot_loss=1.789, over 111.00 samples.], tot_loss[dur_loss=0.3641, prior_loss=1.013, diff_loss=0.4894, tot_loss=1.867, over 1656.00 samples.], 2024-10-20 20:19:48,833 INFO [train.py:682] (1/4) Start epoch 292 2024-10-20 20:20:02,506 INFO [train.py:561] (1/4) Epoch 292, batch 4, global_batch_idx: 4660, batch size: 189, loss[dur_loss=0.3614, prior_loss=1.014, diff_loss=0.4624, tot_loss=1.838, over 189.00 samples.], tot_loss[dur_loss=0.3581, prior_loss=1.013, diff_loss=0.5444, tot_loss=1.916, over 937.00 samples.], 2024-10-20 20:20:17,384 INFO [train.py:561] (1/4) Epoch 292, batch 14, global_batch_idx: 4670, batch size: 142, loss[dur_loss=0.3625, prior_loss=1.013, diff_loss=0.4243, tot_loss=1.8, over 142.00 samples.], tot_loss[dur_loss=0.3645, prior_loss=1.013, diff_loss=0.4783, tot_loss=1.856, over 2210.00 samples.], 2024-10-20 20:20:18,813 INFO [train.py:682] (1/4) Start epoch 293 2024-10-20 20:20:39,310 INFO [train.py:561] (1/4) Epoch 293, batch 8, global_batch_idx: 4680, batch size: 170, loss[dur_loss=0.3631, prior_loss=1.013, diff_loss=0.4696, tot_loss=1.846, over 170.00 samples.], tot_loss[dur_loss=0.3621, prior_loss=1.013, diff_loss=0.5141, tot_loss=1.889, over 1432.00 samples.], 2024-10-20 20:20:49,433 INFO [train.py:682] (1/4) Start epoch 294 2024-10-20 20:21:00,720 INFO [train.py:561] (1/4) Epoch 294, batch 2, global_batch_idx: 4690, batch size: 203, loss[dur_loss=0.361, prior_loss=1.013, diff_loss=0.4641, tot_loss=1.838, over 203.00 samples.], tot_loss[dur_loss=0.3663, prior_loss=1.014, diff_loss=0.4218, tot_loss=1.802, over 442.00 samples.], 2024-10-20 20:21:14,966 INFO [train.py:561] (1/4) Epoch 294, batch 12, global_batch_idx: 4700, batch size: 152, loss[dur_loss=0.3704, prior_loss=1.013, diff_loss=0.3905, tot_loss=1.774, over 152.00 samples.], tot_loss[dur_loss=0.3624, prior_loss=1.013, diff_loss=0.4655, tot_loss=1.841, over 1966.00 samples.], 2024-10-20 20:21:19,420 
INFO [train.py:682] (1/4) Start epoch 295 2024-10-20 20:21:36,421 INFO [train.py:561] (1/4) Epoch 295, batch 6, global_batch_idx: 4710, batch size: 106, loss[dur_loss=0.3611, prior_loss=1.013, diff_loss=0.4552, tot_loss=1.829, over 106.00 samples.], tot_loss[dur_loss=0.3586, prior_loss=1.013, diff_loss=0.5269, tot_loss=1.898, over 1142.00 samples.], 2024-10-20 20:21:49,376 INFO [train.py:682] (1/4) Start epoch 296 2024-10-20 20:21:58,115 INFO [train.py:561] (1/4) Epoch 296, batch 0, global_batch_idx: 4720, batch size: 108, loss[dur_loss=0.3787, prior_loss=1.015, diff_loss=0.4138, tot_loss=1.807, over 108.00 samples.], tot_loss[dur_loss=0.3787, prior_loss=1.015, diff_loss=0.4138, tot_loss=1.807, over 108.00 samples.], 2024-10-20 20:22:12,424 INFO [train.py:561] (1/4) Epoch 296, batch 10, global_batch_idx: 4730, batch size: 111, loss[dur_loss=0.3741, prior_loss=1.015, diff_loss=0.4407, tot_loss=1.83, over 111.00 samples.], tot_loss[dur_loss=0.3604, prior_loss=1.013, diff_loss=0.4875, tot_loss=1.861, over 1656.00 samples.], 2024-10-20 20:22:19,576 INFO [train.py:682] (1/4) Start epoch 297 2024-10-20 20:22:33,817 INFO [train.py:561] (1/4) Epoch 297, batch 4, global_batch_idx: 4740, batch size: 189, loss[dur_loss=0.3707, prior_loss=1.014, diff_loss=0.4335, tot_loss=1.818, over 189.00 samples.], tot_loss[dur_loss=0.3594, prior_loss=1.013, diff_loss=0.5272, tot_loss=1.9, over 937.00 samples.], 2024-10-20 20:22:48,799 INFO [train.py:561] (1/4) Epoch 297, batch 14, global_batch_idx: 4750, batch size: 142, loss[dur_loss=0.3639, prior_loss=1.013, diff_loss=0.4229, tot_loss=1.799, over 142.00 samples.], tot_loss[dur_loss=0.3634, prior_loss=1.013, diff_loss=0.4664, tot_loss=1.843, over 2210.00 samples.], 2024-10-20 20:22:50,226 INFO [train.py:682] (1/4) Start epoch 298 2024-10-20 20:23:10,203 INFO [train.py:561] (1/4) Epoch 298, batch 8, global_batch_idx: 4760, batch size: 170, loss[dur_loss=0.3614, prior_loss=1.013, diff_loss=0.4853, tot_loss=1.86, over 170.00 samples.], tot_loss[dur_loss=0.3594, prior_loss=1.013, diff_loss=0.5095, tot_loss=1.882, over 1432.00 samples.], 2024-10-20 20:23:20,381 INFO [train.py:682] (1/4) Start epoch 299 2024-10-20 20:23:31,828 INFO [train.py:561] (1/4) Epoch 299, batch 2, global_batch_idx: 4770, batch size: 203, loss[dur_loss=0.3563, prior_loss=1.013, diff_loss=0.4786, tot_loss=1.848, over 203.00 samples.], tot_loss[dur_loss=0.3649, prior_loss=1.013, diff_loss=0.4406, tot_loss=1.819, over 442.00 samples.], 2024-10-20 20:23:46,080 INFO [train.py:561] (1/4) Epoch 299, batch 12, global_batch_idx: 4780, batch size: 152, loss[dur_loss=0.3692, prior_loss=1.013, diff_loss=0.4159, tot_loss=1.798, over 152.00 samples.], tot_loss[dur_loss=0.3611, prior_loss=1.013, diff_loss=0.4765, tot_loss=1.851, over 1966.00 samples.], 2024-10-20 20:23:50,546 INFO [train.py:682] (1/4) Start epoch 300 2024-10-20 20:24:07,845 INFO [train.py:561] (1/4) Epoch 300, batch 6, global_batch_idx: 4790, batch size: 106, loss[dur_loss=0.366, prior_loss=1.013, diff_loss=0.4181, tot_loss=1.797, over 106.00 samples.], tot_loss[dur_loss=0.3578, prior_loss=1.013, diff_loss=0.5082, tot_loss=1.879, over 1142.00 samples.], 2024-10-20 20:24:21,028 INFO [train.py:682] (1/4) Start epoch 301 2024-10-20 20:24:29,790 INFO [train.py:561] (1/4) Epoch 301, batch 0, global_batch_idx: 4800, batch size: 108, loss[dur_loss=0.3769, prior_loss=1.014, diff_loss=0.4616, tot_loss=1.853, over 108.00 samples.], tot_loss[dur_loss=0.3769, prior_loss=1.014, diff_loss=0.4616, tot_loss=1.853, over 108.00 samples.], 2024-10-20 
20:24:44,120 INFO [train.py:561] (1/4) Epoch 301, batch 10, global_batch_idx: 4810, batch size: 111, loss[dur_loss=0.3752, prior_loss=1.015, diff_loss=0.4616, tot_loss=1.852, over 111.00 samples.], tot_loss[dur_loss=0.3599, prior_loss=1.013, diff_loss=0.4922, tot_loss=1.865, over 1656.00 samples.], 2024-10-20 20:24:51,272 INFO [train.py:682] (1/4) Start epoch 302 2024-10-20 20:25:04,898 INFO [train.py:561] (1/4) Epoch 302, batch 4, global_batch_idx: 4820, batch size: 189, loss[dur_loss=0.3574, prior_loss=1.013, diff_loss=0.4529, tot_loss=1.823, over 189.00 samples.], tot_loss[dur_loss=0.3562, prior_loss=1.012, diff_loss=0.5414, tot_loss=1.91, over 937.00 samples.], 2024-10-20 20:25:19,823 INFO [train.py:561] (1/4) Epoch 302, batch 14, global_batch_idx: 4830, batch size: 142, loss[dur_loss=0.366, prior_loss=1.012, diff_loss=0.3694, tot_loss=1.748, over 142.00 samples.], tot_loss[dur_loss=0.3616, prior_loss=1.013, diff_loss=0.4743, tot_loss=1.849, over 2210.00 samples.], 2024-10-20 20:25:21,261 INFO [train.py:682] (1/4) Start epoch 303 2024-10-20 20:25:41,345 INFO [train.py:561] (1/4) Epoch 303, batch 8, global_batch_idx: 4840, batch size: 170, loss[dur_loss=0.3599, prior_loss=1.013, diff_loss=0.4375, tot_loss=1.81, over 170.00 samples.], tot_loss[dur_loss=0.3586, prior_loss=1.013, diff_loss=0.4923, tot_loss=1.863, over 1432.00 samples.], 2024-10-20 20:25:51,444 INFO [train.py:682] (1/4) Start epoch 304 2024-10-20 20:26:02,718 INFO [train.py:561] (1/4) Epoch 304, batch 2, global_batch_idx: 4850, batch size: 203, loss[dur_loss=0.3595, prior_loss=1.013, diff_loss=0.4501, tot_loss=1.822, over 203.00 samples.], tot_loss[dur_loss=0.3646, prior_loss=1.013, diff_loss=0.4306, tot_loss=1.808, over 442.00 samples.], 2024-10-20 20:26:16,920 INFO [train.py:561] (1/4) Epoch 304, batch 12, global_batch_idx: 4860, batch size: 152, loss[dur_loss=0.3659, prior_loss=1.013, diff_loss=0.4395, tot_loss=1.818, over 152.00 samples.], tot_loss[dur_loss=0.3608, prior_loss=1.013, diff_loss=0.4737, tot_loss=1.847, over 1966.00 samples.], 2024-10-20 20:26:21,378 INFO [train.py:682] (1/4) Start epoch 305 2024-10-20 20:26:38,280 INFO [train.py:561] (1/4) Epoch 305, batch 6, global_batch_idx: 4870, batch size: 106, loss[dur_loss=0.3611, prior_loss=1.013, diff_loss=0.4137, tot_loss=1.788, over 106.00 samples.], tot_loss[dur_loss=0.3555, prior_loss=1.012, diff_loss=0.5194, tot_loss=1.887, over 1142.00 samples.], 2024-10-20 20:26:51,341 INFO [train.py:682] (1/4) Start epoch 306 2024-10-20 20:27:00,012 INFO [train.py:561] (1/4) Epoch 306, batch 0, global_batch_idx: 4880, batch size: 108, loss[dur_loss=0.3748, prior_loss=1.014, diff_loss=0.4027, tot_loss=1.791, over 108.00 samples.], tot_loss[dur_loss=0.3748, prior_loss=1.014, diff_loss=0.4027, tot_loss=1.791, over 108.00 samples.], 2024-10-20 20:27:14,219 INFO [train.py:561] (1/4) Epoch 306, batch 10, global_batch_idx: 4890, batch size: 111, loss[dur_loss=0.3763, prior_loss=1.015, diff_loss=0.4548, tot_loss=1.846, over 111.00 samples.], tot_loss[dur_loss=0.3598, prior_loss=1.013, diff_loss=0.4956, tot_loss=1.868, over 1656.00 samples.], 2024-10-20 20:27:21,354 INFO [train.py:682] (1/4) Start epoch 307 2024-10-20 20:27:34,964 INFO [train.py:561] (1/4) Epoch 307, batch 4, global_batch_idx: 4900, batch size: 189, loss[dur_loss=0.3574, prior_loss=1.013, diff_loss=0.4505, tot_loss=1.821, over 189.00 samples.], tot_loss[dur_loss=0.3544, prior_loss=1.012, diff_loss=0.5313, tot_loss=1.898, over 937.00 samples.], 2024-10-20 20:27:49,809 INFO [train.py:561] (1/4) Epoch 307, batch 14, 
global_batch_idx: 4910, batch size: 142, loss[dur_loss=0.36, prior_loss=1.012, diff_loss=0.4097, tot_loss=1.782, over 142.00 samples.], tot_loss[dur_loss=0.3608, prior_loss=1.013, diff_loss=0.4672, tot_loss=1.84, over 2210.00 samples.], 2024-10-20 20:27:51,226 INFO [train.py:682] (1/4) Start epoch 308 2024-10-20 20:28:11,248 INFO [train.py:561] (1/4) Epoch 308, batch 8, global_batch_idx: 4920, batch size: 170, loss[dur_loss=0.3588, prior_loss=1.013, diff_loss=0.4173, tot_loss=1.789, over 170.00 samples.], tot_loss[dur_loss=0.3573, prior_loss=1.012, diff_loss=0.4898, tot_loss=1.859, over 1432.00 samples.], 2024-10-20 20:28:21,375 INFO [train.py:682] (1/4) Start epoch 309 2024-10-20 20:28:32,784 INFO [train.py:561] (1/4) Epoch 309, batch 2, global_batch_idx: 4930, batch size: 203, loss[dur_loss=0.3575, prior_loss=1.013, diff_loss=0.4655, tot_loss=1.836, over 203.00 samples.], tot_loss[dur_loss=0.3637, prior_loss=1.013, diff_loss=0.4424, tot_loss=1.819, over 442.00 samples.], 2024-10-20 20:28:46,956 INFO [train.py:561] (1/4) Epoch 309, batch 12, global_batch_idx: 4940, batch size: 152, loss[dur_loss=0.3657, prior_loss=1.012, diff_loss=0.4193, tot_loss=1.797, over 152.00 samples.], tot_loss[dur_loss=0.3604, prior_loss=1.012, diff_loss=0.4765, tot_loss=1.849, over 1966.00 samples.], 2024-10-20 20:28:51,395 INFO [train.py:682] (1/4) Start epoch 310 2024-10-20 20:29:08,132 INFO [train.py:561] (1/4) Epoch 310, batch 6, global_batch_idx: 4950, batch size: 106, loss[dur_loss=0.36, prior_loss=1.012, diff_loss=0.4321, tot_loss=1.805, over 106.00 samples.], tot_loss[dur_loss=0.3544, prior_loss=1.012, diff_loss=0.513, tot_loss=1.879, over 1142.00 samples.], 2024-10-20 20:29:21,093 INFO [train.py:682] (1/4) Start epoch 311 2024-10-20 20:29:29,666 INFO [train.py:561] (1/4) Epoch 311, batch 0, global_batch_idx: 4960, batch size: 108, loss[dur_loss=0.3762, prior_loss=1.014, diff_loss=0.3965, tot_loss=1.786, over 108.00 samples.], tot_loss[dur_loss=0.3762, prior_loss=1.014, diff_loss=0.3965, tot_loss=1.786, over 108.00 samples.], 2024-10-20 20:29:43,776 INFO [train.py:561] (1/4) Epoch 311, batch 10, global_batch_idx: 4970, batch size: 111, loss[dur_loss=0.3746, prior_loss=1.015, diff_loss=0.4515, tot_loss=1.841, over 111.00 samples.], tot_loss[dur_loss=0.3579, prior_loss=1.012, diff_loss=0.4796, tot_loss=1.85, over 1656.00 samples.], 2024-10-20 20:29:50,824 INFO [train.py:682] (1/4) Start epoch 312 2024-10-20 20:30:04,653 INFO [train.py:561] (1/4) Epoch 312, batch 4, global_batch_idx: 4980, batch size: 189, loss[dur_loss=0.357, prior_loss=1.012, diff_loss=0.4853, tot_loss=1.855, over 189.00 samples.], tot_loss[dur_loss=0.3538, prior_loss=1.012, diff_loss=0.5364, tot_loss=1.902, over 937.00 samples.], 2024-10-20 20:30:19,543 INFO [train.py:561] (1/4) Epoch 312, batch 14, global_batch_idx: 4990, batch size: 142, loss[dur_loss=0.362, prior_loss=1.012, diff_loss=0.4322, tot_loss=1.806, over 142.00 samples.], tot_loss[dur_loss=0.3602, prior_loss=1.012, diff_loss=0.4697, tot_loss=1.842, over 2210.00 samples.], 2024-10-20 20:30:20,975 INFO [train.py:682] (1/4) Start epoch 313 2024-10-20 20:30:40,667 INFO [train.py:561] (1/4) Epoch 313, batch 8, global_batch_idx: 5000, batch size: 170, loss[dur_loss=0.3587, prior_loss=1.012, diff_loss=0.4485, tot_loss=1.82, over 170.00 samples.], tot_loss[dur_loss=0.3583, prior_loss=1.012, diff_loss=0.4962, tot_loss=1.867, over 1432.00 samples.], 2024-10-20 20:30:50,738 INFO [train.py:682] (1/4) Start epoch 314 2024-10-20 20:31:02,136 INFO [train.py:561] (1/4) Epoch 314, batch 2, 
global_batch_idx: 5010, batch size: 203, loss[dur_loss=0.3558, prior_loss=1.012, diff_loss=0.4456, tot_loss=1.814, over 203.00 samples.], tot_loss[dur_loss=0.3619, prior_loss=1.013, diff_loss=0.4288, tot_loss=1.803, over 442.00 samples.], 2024-10-20 20:31:16,320 INFO [train.py:561] (1/4) Epoch 314, batch 12, global_batch_idx: 5020, batch size: 152, loss[dur_loss=0.3636, prior_loss=1.012, diff_loss=0.3987, tot_loss=1.774, over 152.00 samples.], tot_loss[dur_loss=0.3598, prior_loss=1.012, diff_loss=0.4764, tot_loss=1.849, over 1966.00 samples.], 2024-10-20 20:31:20,753 INFO [train.py:682] (1/4) Start epoch 315 2024-10-20 20:31:37,650 INFO [train.py:561] (1/4) Epoch 315, batch 6, global_batch_idx: 5030, batch size: 106, loss[dur_loss=0.3669, prior_loss=1.013, diff_loss=0.3881, tot_loss=1.768, over 106.00 samples.], tot_loss[dur_loss=0.3548, prior_loss=1.012, diff_loss=0.5089, tot_loss=1.876, over 1142.00 samples.], 2024-10-20 20:31:50,692 INFO [train.py:682] (1/4) Start epoch 316 2024-10-20 20:31:59,352 INFO [train.py:561] (1/4) Epoch 316, batch 0, global_batch_idx: 5040, batch size: 108, loss[dur_loss=0.371, prior_loss=1.013, diff_loss=0.3832, tot_loss=1.767, over 108.00 samples.], tot_loss[dur_loss=0.371, prior_loss=1.013, diff_loss=0.3832, tot_loss=1.767, over 108.00 samples.], 2024-10-20 20:32:13,567 INFO [train.py:561] (1/4) Epoch 316, batch 10, global_batch_idx: 5050, batch size: 111, loss[dur_loss=0.3776, prior_loss=1.015, diff_loss=0.4147, tot_loss=1.807, over 111.00 samples.], tot_loss[dur_loss=0.3587, prior_loss=1.012, diff_loss=0.4819, tot_loss=1.853, over 1656.00 samples.], 2024-10-20 20:32:20,641 INFO [train.py:682] (1/4) Start epoch 317 2024-10-20 20:32:34,591 INFO [train.py:561] (1/4) Epoch 317, batch 4, global_batch_idx: 5060, batch size: 189, loss[dur_loss=0.3552, prior_loss=1.012, diff_loss=0.4282, tot_loss=1.796, over 189.00 samples.], tot_loss[dur_loss=0.3527, prior_loss=1.012, diff_loss=0.5297, tot_loss=1.894, over 937.00 samples.], 2024-10-20 20:32:49,420 INFO [train.py:561] (1/4) Epoch 317, batch 14, global_batch_idx: 5070, batch size: 142, loss[dur_loss=0.3626, prior_loss=1.012, diff_loss=0.3953, tot_loss=1.77, over 142.00 samples.], tot_loss[dur_loss=0.3594, prior_loss=1.012, diff_loss=0.4652, tot_loss=1.837, over 2210.00 samples.], 2024-10-20 20:32:50,839 INFO [train.py:682] (1/4) Start epoch 318 2024-10-20 20:33:10,546 INFO [train.py:561] (1/4) Epoch 318, batch 8, global_batch_idx: 5080, batch size: 170, loss[dur_loss=0.362, prior_loss=1.012, diff_loss=0.4145, tot_loss=1.788, over 170.00 samples.], tot_loss[dur_loss=0.3557, prior_loss=1.012, diff_loss=0.5032, tot_loss=1.87, over 1432.00 samples.], 2024-10-20 20:33:20,646 INFO [train.py:682] (1/4) Start epoch 319 2024-10-20 20:33:31,824 INFO [train.py:561] (1/4) Epoch 319, batch 2, global_batch_idx: 5090, batch size: 203, loss[dur_loss=0.3539, prior_loss=1.012, diff_loss=0.432, tot_loss=1.797, over 203.00 samples.], tot_loss[dur_loss=0.3606, prior_loss=1.012, diff_loss=0.4304, tot_loss=1.803, over 442.00 samples.], 2024-10-20 20:33:45,955 INFO [train.py:561] (1/4) Epoch 319, batch 12, global_batch_idx: 5100, batch size: 152, loss[dur_loss=0.3648, prior_loss=1.012, diff_loss=0.4376, tot_loss=1.814, over 152.00 samples.], tot_loss[dur_loss=0.3573, prior_loss=1.012, diff_loss=0.4798, tot_loss=1.849, over 1966.00 samples.], 2024-10-20 20:33:50,359 INFO [train.py:682] (1/4) Start epoch 320 2024-10-20 20:34:07,233 INFO [train.py:561] (1/4) Epoch 320, batch 6, global_batch_idx: 5110, batch size: 106, loss[dur_loss=0.3604, 
prior_loss=1.012, diff_loss=0.3968, tot_loss=1.769, over 106.00 samples.], tot_loss[dur_loss=0.3544, prior_loss=1.012, diff_loss=0.5139, tot_loss=1.88, over 1142.00 samples.], 2024-10-20 20:34:20,267 INFO [train.py:682] (1/4) Start epoch 321 2024-10-20 20:34:28,783 INFO [train.py:561] (1/4) Epoch 321, batch 0, global_batch_idx: 5120, batch size: 108, loss[dur_loss=0.3733, prior_loss=1.013, diff_loss=0.4126, tot_loss=1.799, over 108.00 samples.], tot_loss[dur_loss=0.3733, prior_loss=1.013, diff_loss=0.4126, tot_loss=1.799, over 108.00 samples.], 2024-10-20 20:34:43,323 INFO [train.py:561] (1/4) Epoch 321, batch 10, global_batch_idx: 5130, batch size: 111, loss[dur_loss=0.3663, prior_loss=1.014, diff_loss=0.437, tot_loss=1.817, over 111.00 samples.], tot_loss[dur_loss=0.3561, prior_loss=1.012, diff_loss=0.4911, tot_loss=1.859, over 1656.00 samples.], 2024-10-20 20:34:50,479 INFO [train.py:682] (1/4) Start epoch 322 2024-10-20 20:35:04,023 INFO [train.py:561] (1/4) Epoch 322, batch 4, global_batch_idx: 5140, batch size: 189, loss[dur_loss=0.3518, prior_loss=1.012, diff_loss=0.4177, tot_loss=1.781, over 189.00 samples.], tot_loss[dur_loss=0.3507, prior_loss=1.011, diff_loss=0.5241, tot_loss=1.886, over 937.00 samples.], 2024-10-20 20:35:19,056 INFO [train.py:561] (1/4) Epoch 322, batch 14, global_batch_idx: 5150, batch size: 142, loss[dur_loss=0.3531, prior_loss=1.011, diff_loss=0.4091, tot_loss=1.773, over 142.00 samples.], tot_loss[dur_loss=0.3561, prior_loss=1.012, diff_loss=0.4614, tot_loss=1.829, over 2210.00 samples.], 2024-10-20 20:35:20,490 INFO [train.py:682] (1/4) Start epoch 323 2024-10-20 20:35:40,502 INFO [train.py:561] (1/4) Epoch 323, batch 8, global_batch_idx: 5160, batch size: 170, loss[dur_loss=0.3549, prior_loss=1.012, diff_loss=0.4357, tot_loss=1.802, over 170.00 samples.], tot_loss[dur_loss=0.3539, prior_loss=1.012, diff_loss=0.4923, tot_loss=1.858, over 1432.00 samples.], 2024-10-20 20:35:50,648 INFO [train.py:682] (1/4) Start epoch 324 2024-10-20 20:36:01,826 INFO [train.py:561] (1/4) Epoch 324, batch 2, global_batch_idx: 5170, batch size: 203, loss[dur_loss=0.351, prior_loss=1.011, diff_loss=0.4528, tot_loss=1.815, over 203.00 samples.], tot_loss[dur_loss=0.3601, prior_loss=1.012, diff_loss=0.4291, tot_loss=1.801, over 442.00 samples.], 2024-10-20 20:36:15,987 INFO [train.py:561] (1/4) Epoch 324, batch 12, global_batch_idx: 5180, batch size: 152, loss[dur_loss=0.3608, prior_loss=1.012, diff_loss=0.4251, tot_loss=1.797, over 152.00 samples.], tot_loss[dur_loss=0.3574, prior_loss=1.012, diff_loss=0.4735, tot_loss=1.843, over 1966.00 samples.], 2024-10-20 20:36:20,408 INFO [train.py:682] (1/4) Start epoch 325 2024-10-20 20:36:37,504 INFO [train.py:561] (1/4) Epoch 325, batch 6, global_batch_idx: 5190, batch size: 106, loss[dur_loss=0.3613, prior_loss=1.012, diff_loss=0.4162, tot_loss=1.789, over 106.00 samples.], tot_loss[dur_loss=0.3515, prior_loss=1.011, diff_loss=0.5087, tot_loss=1.871, over 1142.00 samples.], 2024-10-20 20:36:50,570 INFO [train.py:682] (1/4) Start epoch 326 2024-10-20 20:36:59,325 INFO [train.py:561] (1/4) Epoch 326, batch 0, global_batch_idx: 5200, batch size: 108, loss[dur_loss=0.3713, prior_loss=1.012, diff_loss=0.388, tot_loss=1.772, over 108.00 samples.], tot_loss[dur_loss=0.3713, prior_loss=1.012, diff_loss=0.388, tot_loss=1.772, over 108.00 samples.], 2024-10-20 20:37:13,530 INFO [train.py:561] (1/4) Epoch 326, batch 10, global_batch_idx: 5210, batch size: 111, loss[dur_loss=0.3666, prior_loss=1.013, diff_loss=0.3873, tot_loss=1.767, over 111.00 
samples.], tot_loss[dur_loss=0.3538, prior_loss=1.011, diff_loss=0.4765, tot_loss=1.842, over 1656.00 samples.], 2024-10-20 20:37:20,568 INFO [train.py:682] (1/4) Start epoch 327 2024-10-20 20:37:34,118 INFO [train.py:561] (1/4) Epoch 327, batch 4, global_batch_idx: 5220, batch size: 189, loss[dur_loss=0.3524, prior_loss=1.012, diff_loss=0.4347, tot_loss=1.799, over 189.00 samples.], tot_loss[dur_loss=0.3488, prior_loss=1.011, diff_loss=0.5279, tot_loss=1.888, over 937.00 samples.], 2024-10-20 20:37:49,081 INFO [train.py:561] (1/4) Epoch 327, batch 14, global_batch_idx: 5230, batch size: 142, loss[dur_loss=0.3633, prior_loss=1.011, diff_loss=0.4159, tot_loss=1.791, over 142.00 samples.], tot_loss[dur_loss=0.3557, prior_loss=1.011, diff_loss=0.4594, tot_loss=1.826, over 2210.00 samples.], 2024-10-20 20:37:50,498 INFO [train.py:682] (1/4) Start epoch 328 2024-10-20 20:38:10,487 INFO [train.py:561] (1/4) Epoch 328, batch 8, global_batch_idx: 5240, batch size: 170, loss[dur_loss=0.3568, prior_loss=1.012, diff_loss=0.4487, tot_loss=1.817, over 170.00 samples.], tot_loss[dur_loss=0.3521, prior_loss=1.011, diff_loss=0.4877, tot_loss=1.851, over 1432.00 samples.], 2024-10-20 20:38:20,702 INFO [train.py:682] (1/4) Start epoch 329 2024-10-20 20:38:31,943 INFO [train.py:561] (1/4) Epoch 329, batch 2, global_batch_idx: 5250, batch size: 203, loss[dur_loss=0.3525, prior_loss=1.011, diff_loss=0.4695, tot_loss=1.833, over 203.00 samples.], tot_loss[dur_loss=0.3577, prior_loss=1.011, diff_loss=0.4383, tot_loss=1.807, over 442.00 samples.], 2024-10-20 20:38:46,129 INFO [train.py:561] (1/4) Epoch 329, batch 12, global_batch_idx: 5260, batch size: 152, loss[dur_loss=0.3589, prior_loss=1.011, diff_loss=0.4276, tot_loss=1.798, over 152.00 samples.], tot_loss[dur_loss=0.3545, prior_loss=1.011, diff_loss=0.4758, tot_loss=1.841, over 1966.00 samples.], 2024-10-20 20:38:50,633 INFO [train.py:682] (1/4) Start epoch 330 2024-10-20 20:39:07,372 INFO [train.py:561] (1/4) Epoch 330, batch 6, global_batch_idx: 5270, batch size: 106, loss[dur_loss=0.3562, prior_loss=1.011, diff_loss=0.3909, tot_loss=1.758, over 106.00 samples.], tot_loss[dur_loss=0.3504, prior_loss=1.011, diff_loss=0.5006, tot_loss=1.862, over 1142.00 samples.], 2024-10-20 20:39:20,373 INFO [train.py:682] (1/4) Start epoch 331 2024-10-20 20:39:29,257 INFO [train.py:561] (1/4) Epoch 331, batch 0, global_batch_idx: 5280, batch size: 108, loss[dur_loss=0.3702, prior_loss=1.012, diff_loss=0.3914, tot_loss=1.774, over 108.00 samples.], tot_loss[dur_loss=0.3702, prior_loss=1.012, diff_loss=0.3914, tot_loss=1.774, over 108.00 samples.], 2024-10-20 20:39:43,521 INFO [train.py:561] (1/4) Epoch 331, batch 10, global_batch_idx: 5290, batch size: 111, loss[dur_loss=0.3668, prior_loss=1.013, diff_loss=0.4386, tot_loss=1.819, over 111.00 samples.], tot_loss[dur_loss=0.3544, prior_loss=1.011, diff_loss=0.4832, tot_loss=1.849, over 1656.00 samples.], 2024-10-20 20:39:50,615 INFO [train.py:682] (1/4) Start epoch 332 2024-10-20 20:40:04,382 INFO [train.py:561] (1/4) Epoch 332, batch 4, global_batch_idx: 5300, batch size: 189, loss[dur_loss=0.3539, prior_loss=1.012, diff_loss=0.4215, tot_loss=1.787, over 189.00 samples.], tot_loss[dur_loss=0.3506, prior_loss=1.011, diff_loss=0.5279, tot_loss=1.889, over 937.00 samples.], 2024-10-20 20:40:19,255 INFO [train.py:561] (1/4) Epoch 332, batch 14, global_batch_idx: 5310, batch size: 142, loss[dur_loss=0.3533, prior_loss=1.011, diff_loss=0.4075, tot_loss=1.772, over 142.00 samples.], tot_loss[dur_loss=0.3553, prior_loss=1.011, 
diff_loss=0.4642, tot_loss=1.831, over 2210.00 samples.], 2024-10-20 20:40:20,672 INFO [train.py:682] (1/4) Start epoch 333 2024-10-20 20:40:40,925 INFO [train.py:561] (1/4) Epoch 333, batch 8, global_batch_idx: 5320, batch size: 170, loss[dur_loss=0.3572, prior_loss=1.011, diff_loss=0.4275, tot_loss=1.796, over 170.00 samples.], tot_loss[dur_loss=0.352, prior_loss=1.011, diff_loss=0.4891, tot_loss=1.852, over 1432.00 samples.], 2024-10-20 20:40:50,957 INFO [train.py:682] (1/4) Start epoch 334 2024-10-20 20:41:02,206 INFO [train.py:561] (1/4) Epoch 334, batch 2, global_batch_idx: 5330, batch size: 203, loss[dur_loss=0.3555, prior_loss=1.011, diff_loss=0.4347, tot_loss=1.801, over 203.00 samples.], tot_loss[dur_loss=0.3588, prior_loss=1.011, diff_loss=0.4327, tot_loss=1.803, over 442.00 samples.], 2024-10-20 20:41:16,618 INFO [train.py:561] (1/4) Epoch 334, batch 12, global_batch_idx: 5340, batch size: 152, loss[dur_loss=0.3556, prior_loss=1.011, diff_loss=0.4406, tot_loss=1.807, over 152.00 samples.], tot_loss[dur_loss=0.3541, prior_loss=1.011, diff_loss=0.4696, tot_loss=1.835, over 1966.00 samples.], 2024-10-20 20:41:21,122 INFO [train.py:682] (1/4) Start epoch 335 2024-10-20 20:41:38,052 INFO [train.py:561] (1/4) Epoch 335, batch 6, global_batch_idx: 5350, batch size: 106, loss[dur_loss=0.361, prior_loss=1.011, diff_loss=0.4343, tot_loss=1.807, over 106.00 samples.], tot_loss[dur_loss=0.3517, prior_loss=1.011, diff_loss=0.5024, tot_loss=1.865, over 1142.00 samples.], 2024-10-20 20:41:51,152 INFO [train.py:682] (1/4) Start epoch 336 2024-10-20 20:41:59,778 INFO [train.py:561] (1/4) Epoch 336, batch 0, global_batch_idx: 5360, batch size: 108, loss[dur_loss=0.3664, prior_loss=1.012, diff_loss=0.362, tot_loss=1.741, over 108.00 samples.], tot_loss[dur_loss=0.3664, prior_loss=1.012, diff_loss=0.362, tot_loss=1.741, over 108.00 samples.], 2024-10-20 20:42:13,966 INFO [train.py:561] (1/4) Epoch 336, batch 10, global_batch_idx: 5370, batch size: 111, loss[dur_loss=0.3673, prior_loss=1.013, diff_loss=0.3925, tot_loss=1.773, over 111.00 samples.], tot_loss[dur_loss=0.3534, prior_loss=1.011, diff_loss=0.4641, tot_loss=1.828, over 1656.00 samples.], 2024-10-20 20:42:21,031 INFO [train.py:682] (1/4) Start epoch 337 2024-10-20 20:42:34,754 INFO [train.py:561] (1/4) Epoch 337, batch 4, global_batch_idx: 5380, batch size: 189, loss[dur_loss=0.3503, prior_loss=1.011, diff_loss=0.4387, tot_loss=1.801, over 189.00 samples.], tot_loss[dur_loss=0.3481, prior_loss=1.011, diff_loss=0.545, tot_loss=1.904, over 937.00 samples.], 2024-10-20 20:42:49,489 INFO [train.py:561] (1/4) Epoch 337, batch 14, global_batch_idx: 5390, batch size: 142, loss[dur_loss=0.3588, prior_loss=1.011, diff_loss=0.4017, tot_loss=1.771, over 142.00 samples.], tot_loss[dur_loss=0.3538, prior_loss=1.011, diff_loss=0.4728, tot_loss=1.838, over 2210.00 samples.], 2024-10-20 20:42:50,897 INFO [train.py:682] (1/4) Start epoch 338 2024-10-20 20:43:10,693 INFO [train.py:561] (1/4) Epoch 338, batch 8, global_batch_idx: 5400, batch size: 170, loss[dur_loss=0.3494, prior_loss=1.011, diff_loss=0.3892, tot_loss=1.749, over 170.00 samples.], tot_loss[dur_loss=0.3503, prior_loss=1.011, diff_loss=0.4783, tot_loss=1.839, over 1432.00 samples.], 2024-10-20 20:43:20,745 INFO [train.py:682] (1/4) Start epoch 339 2024-10-20 20:43:31,858 INFO [train.py:561] (1/4) Epoch 339, batch 2, global_batch_idx: 5410, batch size: 203, loss[dur_loss=0.3504, prior_loss=1.011, diff_loss=0.4366, tot_loss=1.798, over 203.00 samples.], tot_loss[dur_loss=0.3558, 
prior_loss=1.011, diff_loss=0.4181, tot_loss=1.785, over 442.00 samples.], 2024-10-20 20:43:46,243 INFO [train.py:561] (1/4) Epoch 339, batch 12, global_batch_idx: 5420, batch size: 152, loss[dur_loss=0.3643, prior_loss=1.011, diff_loss=0.437, tot_loss=1.813, over 152.00 samples.], tot_loss[dur_loss=0.3548, prior_loss=1.011, diff_loss=0.4706, tot_loss=1.836, over 1966.00 samples.], 2024-10-20 20:43:50,687 INFO [train.py:682] (1/4) Start epoch 340 2024-10-20 20:44:07,753 INFO [train.py:561] (1/4) Epoch 340, batch 6, global_batch_idx: 5430, batch size: 106, loss[dur_loss=0.3616, prior_loss=1.011, diff_loss=0.4212, tot_loss=1.794, over 106.00 samples.], tot_loss[dur_loss=0.3503, prior_loss=1.011, diff_loss=0.4953, tot_loss=1.856, over 1142.00 samples.], 2024-10-20 20:44:20,929 INFO [train.py:682] (1/4) Start epoch 341 2024-10-20 20:44:29,720 INFO [train.py:561] (1/4) Epoch 341, batch 0, global_batch_idx: 5440, batch size: 108, loss[dur_loss=0.3688, prior_loss=1.012, diff_loss=0.4433, tot_loss=1.824, over 108.00 samples.], tot_loss[dur_loss=0.3688, prior_loss=1.012, diff_loss=0.4433, tot_loss=1.824, over 108.00 samples.], 2024-10-20 20:44:44,041 INFO [train.py:561] (1/4) Epoch 341, batch 10, global_batch_idx: 5450, batch size: 111, loss[dur_loss=0.3701, prior_loss=1.013, diff_loss=0.3874, tot_loss=1.77, over 111.00 samples.], tot_loss[dur_loss=0.3515, prior_loss=1.011, diff_loss=0.4831, tot_loss=1.845, over 1656.00 samples.], 2024-10-20 20:44:51,126 INFO [train.py:682] (1/4) Start epoch 342 2024-10-20 20:45:05,212 INFO [train.py:561] (1/4) Epoch 342, batch 4, global_batch_idx: 5460, batch size: 189, loss[dur_loss=0.3472, prior_loss=1.011, diff_loss=0.4483, tot_loss=1.806, over 189.00 samples.], tot_loss[dur_loss=0.3462, prior_loss=1.01, diff_loss=0.5241, tot_loss=1.881, over 937.00 samples.], 2024-10-20 20:45:19,980 INFO [train.py:561] (1/4) Epoch 342, batch 14, global_batch_idx: 5470, batch size: 142, loss[dur_loss=0.3567, prior_loss=1.011, diff_loss=0.3985, tot_loss=1.766, over 142.00 samples.], tot_loss[dur_loss=0.3523, prior_loss=1.011, diff_loss=0.4542, tot_loss=1.817, over 2210.00 samples.], 2024-10-20 20:45:21,394 INFO [train.py:682] (1/4) Start epoch 343 2024-10-20 20:45:41,170 INFO [train.py:561] (1/4) Epoch 343, batch 8, global_batch_idx: 5480, batch size: 170, loss[dur_loss=0.3569, prior_loss=1.011, diff_loss=0.42, tot_loss=1.788, over 170.00 samples.], tot_loss[dur_loss=0.35, prior_loss=1.011, diff_loss=0.4903, tot_loss=1.851, over 1432.00 samples.], 2024-10-20 20:45:51,285 INFO [train.py:682] (1/4) Start epoch 344 2024-10-20 20:46:02,695 INFO [train.py:561] (1/4) Epoch 344, batch 2, global_batch_idx: 5490, batch size: 203, loss[dur_loss=0.353, prior_loss=1.011, diff_loss=0.4276, tot_loss=1.791, over 203.00 samples.], tot_loss[dur_loss=0.3557, prior_loss=1.011, diff_loss=0.4219, tot_loss=1.789, over 442.00 samples.], 2024-10-20 20:46:16,899 INFO [train.py:561] (1/4) Epoch 344, batch 12, global_batch_idx: 5500, batch size: 152, loss[dur_loss=0.3539, prior_loss=1.01, diff_loss=0.4368, tot_loss=1.801, over 152.00 samples.], tot_loss[dur_loss=0.3509, prior_loss=1.011, diff_loss=0.4642, tot_loss=1.826, over 1966.00 samples.], 2024-10-20 20:46:21,314 INFO [train.py:682] (1/4) Start epoch 345 2024-10-20 20:46:38,305 INFO [train.py:561] (1/4) Epoch 345, batch 6, global_batch_idx: 5510, batch size: 106, loss[dur_loss=0.3529, prior_loss=1.011, diff_loss=0.4008, tot_loss=1.764, over 106.00 samples.], tot_loss[dur_loss=0.3465, prior_loss=1.01, diff_loss=0.5061, tot_loss=1.863, over 1142.00 
samples.], 2024-10-20 20:46:51,331 INFO [train.py:682] (1/4) Start epoch 346 2024-10-20 20:46:59,974 INFO [train.py:561] (1/4) Epoch 346, batch 0, global_batch_idx: 5520, batch size: 108, loss[dur_loss=0.3595, prior_loss=1.011, diff_loss=0.4223, tot_loss=1.793, over 108.00 samples.], tot_loss[dur_loss=0.3595, prior_loss=1.011, diff_loss=0.4223, tot_loss=1.793, over 108.00 samples.], 2024-10-20 20:47:14,181 INFO [train.py:561] (1/4) Epoch 346, batch 10, global_batch_idx: 5530, batch size: 111, loss[dur_loss=0.3624, prior_loss=1.012, diff_loss=0.3929, tot_loss=1.768, over 111.00 samples.], tot_loss[dur_loss=0.3495, prior_loss=1.01, diff_loss=0.4732, tot_loss=1.833, over 1656.00 samples.], 2024-10-20 20:47:21,266 INFO [train.py:682] (1/4) Start epoch 347 2024-10-20 20:47:34,842 INFO [train.py:561] (1/4) Epoch 347, batch 4, global_batch_idx: 5540, batch size: 189, loss[dur_loss=0.348, prior_loss=1.01, diff_loss=0.4351, tot_loss=1.793, over 189.00 samples.], tot_loss[dur_loss=0.3453, prior_loss=1.01, diff_loss=0.5255, tot_loss=1.881, over 937.00 samples.], 2024-10-20 20:47:49,806 INFO [train.py:561] (1/4) Epoch 347, batch 14, global_batch_idx: 5550, batch size: 142, loss[dur_loss=0.3536, prior_loss=1.01, diff_loss=0.3751, tot_loss=1.739, over 142.00 samples.], tot_loss[dur_loss=0.3508, prior_loss=1.01, diff_loss=0.4589, tot_loss=1.82, over 2210.00 samples.], 2024-10-20 20:47:51,230 INFO [train.py:682] (1/4) Start epoch 348 2024-10-20 20:48:10,974 INFO [train.py:561] (1/4) Epoch 348, batch 8, global_batch_idx: 5560, batch size: 170, loss[dur_loss=0.3533, prior_loss=1.01, diff_loss=0.4497, tot_loss=1.813, over 170.00 samples.], tot_loss[dur_loss=0.3483, prior_loss=1.01, diff_loss=0.4851, tot_loss=1.843, over 1432.00 samples.], 2024-10-20 20:48:21,110 INFO [train.py:682] (1/4) Start epoch 349 2024-10-20 20:48:32,578 INFO [train.py:561] (1/4) Epoch 349, batch 2, global_batch_idx: 5570, batch size: 203, loss[dur_loss=0.3531, prior_loss=1.01, diff_loss=0.4299, tot_loss=1.793, over 203.00 samples.], tot_loss[dur_loss=0.3562, prior_loss=1.011, diff_loss=0.4198, tot_loss=1.787, over 442.00 samples.], 2024-10-20 20:48:46,890 INFO [train.py:561] (1/4) Epoch 349, batch 12, global_batch_idx: 5580, batch size: 152, loss[dur_loss=0.3531, prior_loss=1.01, diff_loss=0.3782, tot_loss=1.741, over 152.00 samples.], tot_loss[dur_loss=0.3498, prior_loss=1.01, diff_loss=0.4646, tot_loss=1.825, over 1966.00 samples.], 2024-10-20 20:48:51,327 INFO [train.py:682] (1/4) Start epoch 350 2024-10-20 20:49:08,069 INFO [train.py:561] (1/4) Epoch 350, batch 6, global_batch_idx: 5590, batch size: 106, loss[dur_loss=0.353, prior_loss=1.01, diff_loss=0.4071, tot_loss=1.77, over 106.00 samples.], tot_loss[dur_loss=0.347, prior_loss=1.01, diff_loss=0.5166, tot_loss=1.873, over 1142.00 samples.], 2024-10-20 20:49:21,337 INFO [train.py:682] (1/4) Start epoch 351 2024-10-20 20:49:30,189 INFO [train.py:561] (1/4) Epoch 351, batch 0, global_batch_idx: 5600, batch size: 108, loss[dur_loss=0.362, prior_loss=1.012, diff_loss=0.4069, tot_loss=1.78, over 108.00 samples.], tot_loss[dur_loss=0.362, prior_loss=1.012, diff_loss=0.4069, tot_loss=1.78, over 108.00 samples.], 2024-10-20 20:49:44,464 INFO [train.py:561] (1/4) Epoch 351, batch 10, global_batch_idx: 5610, batch size: 111, loss[dur_loss=0.361, prior_loss=1.012, diff_loss=0.3995, tot_loss=1.773, over 111.00 samples.], tot_loss[dur_loss=0.3477, prior_loss=1.01, diff_loss=0.473, tot_loss=1.831, over 1656.00 samples.], 2024-10-20 20:49:51,567 INFO [train.py:682] (1/4) Start epoch 352 
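[Editor's note] Each training entry above carries two bracketed groups: loss[...] for the current batch and tot_loss[...] aggregated over the samples seen so far in the epoch (the sample count accumulates, e.g. 108.00 at batch 0 versus 1656.00 by batch 10, and the two groups coincide at batch 0). Within either group, the printed tot_loss is, to the four significant digits the logger emits, the plain sum of dur_loss, prior_loss and diff_loss. A minimal standard-library Python sketch checking this against one entry copied verbatim from the log above (Epoch 321, batch 0); the regex and the rounding tolerance are illustrative assumptions, not part of the training code:

    import re

    # One training entry copied from the log above (Epoch 321, batch 0).
    entry = ("loss[dur_loss=0.3733, prior_loss=1.013, diff_loss=0.4126, "
             "tot_loss=1.799, over 108.00 samples.]")

    # Pull the four loss fields out of the bracketed group.
    fields = dict(re.findall(r"(\w+_loss)=([\d.]+)", entry))
    component_sum = sum(float(fields[k])
                        for k in ("dur_loss", "prior_loss", "diff_loss"))

    # 0.3733 + 1.013 + 0.4126 = 1.7989, which the logger prints as 1.799.
    assert abs(component_sum - float(fields["tot_loss"])) < 5e-3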
2024-10-20 20:50:05,221 INFO [train.py:561] (1/4) Epoch 352, batch 4, global_batch_idx: 5620, batch size: 189, loss[dur_loss=0.3466, prior_loss=1.011, diff_loss=0.433, tot_loss=1.79, over 189.00 samples.], tot_loss[dur_loss=0.3465, prior_loss=1.01, diff_loss=0.5246, tot_loss=1.881, over 937.00 samples.], 2024-10-20 20:50:20,108 INFO [train.py:561] (1/4) Epoch 352, batch 14, global_batch_idx: 5630, batch size: 142, loss[dur_loss=0.3558, prior_loss=1.01, diff_loss=0.399, tot_loss=1.765, over 142.00 samples.], tot_loss[dur_loss=0.3511, prior_loss=1.01, diff_loss=0.4576, tot_loss=1.819, over 2210.00 samples.], 2024-10-20 20:50:21,521 INFO [train.py:682] (1/4) Start epoch 353 2024-10-20 20:50:41,256 INFO [train.py:561] (1/4) Epoch 353, batch 8, global_batch_idx: 5640, batch size: 170, loss[dur_loss=0.3515, prior_loss=1.01, diff_loss=0.4033, tot_loss=1.765, over 170.00 samples.], tot_loss[dur_loss=0.3474, prior_loss=1.01, diff_loss=0.4806, tot_loss=1.838, over 1432.00 samples.], 2024-10-20 20:50:51,295 INFO [train.py:682] (1/4) Start epoch 354 2024-10-20 20:51:02,754 INFO [train.py:561] (1/4) Epoch 354, batch 2, global_batch_idx: 5650, batch size: 203, loss[dur_loss=0.3475, prior_loss=1.01, diff_loss=0.4311, tot_loss=1.788, over 203.00 samples.], tot_loss[dur_loss=0.3521, prior_loss=1.01, diff_loss=0.3986, tot_loss=1.761, over 442.00 samples.], 2024-10-20 20:51:16,864 INFO [train.py:561] (1/4) Epoch 354, batch 12, global_batch_idx: 5660, batch size: 152, loss[dur_loss=0.3539, prior_loss=1.01, diff_loss=0.413, tot_loss=1.777, over 152.00 samples.], tot_loss[dur_loss=0.3486, prior_loss=1.01, diff_loss=0.4661, tot_loss=1.825, over 1966.00 samples.], 2024-10-20 20:51:21,249 INFO [train.py:682] (1/4) Start epoch 355 2024-10-20 20:51:38,024 INFO [train.py:561] (1/4) Epoch 355, batch 6, global_batch_idx: 5670, batch size: 106, loss[dur_loss=0.35, prior_loss=1.01, diff_loss=0.4261, tot_loss=1.786, over 106.00 samples.], tot_loss[dur_loss=0.3463, prior_loss=1.01, diff_loss=0.511, tot_loss=1.867, over 1142.00 samples.], 2024-10-20 20:51:50,956 INFO [train.py:682] (1/4) Start epoch 356 2024-10-20 20:51:59,472 INFO [train.py:561] (1/4) Epoch 356, batch 0, global_batch_idx: 5680, batch size: 108, loss[dur_loss=0.3619, prior_loss=1.011, diff_loss=0.3961, tot_loss=1.769, over 108.00 samples.], tot_loss[dur_loss=0.3619, prior_loss=1.011, diff_loss=0.3961, tot_loss=1.769, over 108.00 samples.], 2024-10-20 20:52:13,850 INFO [train.py:561] (1/4) Epoch 356, batch 10, global_batch_idx: 5690, batch size: 111, loss[dur_loss=0.3612, prior_loss=1.012, diff_loss=0.4009, tot_loss=1.774, over 111.00 samples.], tot_loss[dur_loss=0.3477, prior_loss=1.01, diff_loss=0.4593, tot_loss=1.817, over 1656.00 samples.], 2024-10-20 20:52:21,020 INFO [train.py:682] (1/4) Start epoch 357 2024-10-20 20:52:35,018 INFO [train.py:561] (1/4) Epoch 357, batch 4, global_batch_idx: 5700, batch size: 189, loss[dur_loss=0.3448, prior_loss=1.01, diff_loss=0.4226, tot_loss=1.777, over 189.00 samples.], tot_loss[dur_loss=0.3433, prior_loss=1.009, diff_loss=0.5317, tot_loss=1.884, over 937.00 samples.], 2024-10-20 20:52:49,900 INFO [train.py:561] (1/4) Epoch 357, batch 14, global_batch_idx: 5710, batch size: 142, loss[dur_loss=0.3518, prior_loss=1.01, diff_loss=0.4165, tot_loss=1.778, over 142.00 samples.], tot_loss[dur_loss=0.3493, prior_loss=1.01, diff_loss=0.4682, tot_loss=1.827, over 2210.00 samples.], 2024-10-20 20:52:51,325 INFO [train.py:682] (1/4) Start epoch 358 2024-10-20 20:53:11,162 INFO [train.py:561] (1/4) Epoch 358, batch 8, 
global_batch_idx: 5720, batch size: 170, loss[dur_loss=0.3555, prior_loss=1.01, diff_loss=0.4394, tot_loss=1.805, over 170.00 samples.], tot_loss[dur_loss=0.3472, prior_loss=1.01, diff_loss=0.4881, tot_loss=1.845, over 1432.00 samples.], 2024-10-20 20:53:21,242 INFO [train.py:682] (1/4) Start epoch 359 2024-10-20 20:53:32,423 INFO [train.py:561] (1/4) Epoch 359, batch 2, global_batch_idx: 5730, batch size: 203, loss[dur_loss=0.3458, prior_loss=1.01, diff_loss=0.3831, tot_loss=1.739, over 203.00 samples.], tot_loss[dur_loss=0.3506, prior_loss=1.01, diff_loss=0.4063, tot_loss=1.767, over 442.00 samples.], 2024-10-20 20:53:46,689 INFO [train.py:561] (1/4) Epoch 359, batch 12, global_batch_idx: 5740, batch size: 152, loss[dur_loss=0.3531, prior_loss=1.009, diff_loss=0.4415, tot_loss=1.804, over 152.00 samples.], tot_loss[dur_loss=0.3475, prior_loss=1.01, diff_loss=0.4574, tot_loss=1.815, over 1966.00 samples.], 2024-10-20 20:53:51,141 INFO [train.py:682] (1/4) Start epoch 360 2024-10-20 20:54:08,092 INFO [train.py:561] (1/4) Epoch 360, batch 6, global_batch_idx: 5750, batch size: 106, loss[dur_loss=0.3512, prior_loss=1.01, diff_loss=0.3848, tot_loss=1.746, over 106.00 samples.], tot_loss[dur_loss=0.3433, prior_loss=1.009, diff_loss=0.4952, tot_loss=1.848, over 1142.00 samples.], 2024-10-20 20:54:21,070 INFO [train.py:682] (1/4) Start epoch 361 2024-10-20 20:54:29,624 INFO [train.py:561] (1/4) Epoch 361, batch 0, global_batch_idx: 5760, batch size: 108, loss[dur_loss=0.3637, prior_loss=1.011, diff_loss=0.4224, tot_loss=1.797, over 108.00 samples.], tot_loss[dur_loss=0.3637, prior_loss=1.011, diff_loss=0.4224, tot_loss=1.797, over 108.00 samples.], 2024-10-20 20:54:43,853 INFO [train.py:561] (1/4) Epoch 361, batch 10, global_batch_idx: 5770, batch size: 111, loss[dur_loss=0.3622, prior_loss=1.012, diff_loss=0.4144, tot_loss=1.788, over 111.00 samples.], tot_loss[dur_loss=0.3472, prior_loss=1.01, diff_loss=0.4828, tot_loss=1.84, over 1656.00 samples.], 2024-10-20 20:54:50,879 INFO [train.py:682] (1/4) Start epoch 362 2024-10-20 20:55:04,651 INFO [train.py:561] (1/4) Epoch 362, batch 4, global_batch_idx: 5780, batch size: 189, loss[dur_loss=0.3491, prior_loss=1.01, diff_loss=0.4473, tot_loss=1.806, over 189.00 samples.], tot_loss[dur_loss=0.3433, prior_loss=1.009, diff_loss=0.5173, tot_loss=1.87, over 937.00 samples.], 2024-10-20 20:55:19,807 INFO [train.py:561] (1/4) Epoch 362, batch 14, global_batch_idx: 5790, batch size: 142, loss[dur_loss=0.3507, prior_loss=1.01, diff_loss=0.4126, tot_loss=1.773, over 142.00 samples.], tot_loss[dur_loss=0.3474, prior_loss=1.01, diff_loss=0.4538, tot_loss=1.811, over 2210.00 samples.], 2024-10-20 20:55:21,240 INFO [train.py:682] (1/4) Start epoch 363 2024-10-20 20:55:41,203 INFO [train.py:561] (1/4) Epoch 363, batch 8, global_batch_idx: 5800, batch size: 170, loss[dur_loss=0.347, prior_loss=1.01, diff_loss=0.4178, tot_loss=1.774, over 170.00 samples.], tot_loss[dur_loss=0.3439, prior_loss=1.009, diff_loss=0.4702, tot_loss=1.823, over 1432.00 samples.], 2024-10-20 20:55:51,424 INFO [train.py:682] (1/4) Start epoch 364 2024-10-20 20:56:02,650 INFO [train.py:561] (1/4) Epoch 364, batch 2, global_batch_idx: 5810, batch size: 203, loss[dur_loss=0.3479, prior_loss=1.01, diff_loss=0.4307, tot_loss=1.788, over 203.00 samples.], tot_loss[dur_loss=0.3508, prior_loss=1.01, diff_loss=0.4039, tot_loss=1.765, over 442.00 samples.], 2024-10-20 20:56:16,895 INFO [train.py:561] (1/4) Epoch 364, batch 12, global_batch_idx: 5820, batch size: 152, loss[dur_loss=0.357, 
prior_loss=1.009, diff_loss=0.4435, tot_loss=1.81, over 152.00 samples.], tot_loss[dur_loss=0.3471, prior_loss=1.009, diff_loss=0.4628, tot_loss=1.819, over 1966.00 samples.], 2024-10-20 20:56:21,339 INFO [train.py:682] (1/4) Start epoch 365 2024-10-20 20:56:38,486 INFO [train.py:561] (1/4) Epoch 365, batch 6, global_batch_idx: 5830, batch size: 106, loss[dur_loss=0.3455, prior_loss=1.009, diff_loss=0.3904, tot_loss=1.745, over 106.00 samples.], tot_loss[dur_loss=0.3433, prior_loss=1.009, diff_loss=0.5082, tot_loss=1.86, over 1142.00 samples.], 2024-10-20 20:56:51,552 INFO [train.py:682] (1/4) Start epoch 366 2024-10-20 20:57:00,518 INFO [train.py:561] (1/4) Epoch 366, batch 0, global_batch_idx: 5840, batch size: 108, loss[dur_loss=0.3593, prior_loss=1.011, diff_loss=0.3935, tot_loss=1.763, over 108.00 samples.], tot_loss[dur_loss=0.3593, prior_loss=1.011, diff_loss=0.3935, tot_loss=1.763, over 108.00 samples.], 2024-10-20 20:57:15,019 INFO [train.py:561] (1/4) Epoch 366, batch 10, global_batch_idx: 5850, batch size: 111, loss[dur_loss=0.3595, prior_loss=1.011, diff_loss=0.4096, tot_loss=1.781, over 111.00 samples.], tot_loss[dur_loss=0.345, prior_loss=1.009, diff_loss=0.4677, tot_loss=1.822, over 1656.00 samples.], 2024-10-20 20:57:22,195 INFO [train.py:682] (1/4) Start epoch 367 2024-10-20 20:57:35,876 INFO [train.py:561] (1/4) Epoch 367, batch 4, global_batch_idx: 5860, batch size: 189, loss[dur_loss=0.3453, prior_loss=1.01, diff_loss=0.4143, tot_loss=1.77, over 189.00 samples.], tot_loss[dur_loss=0.3407, prior_loss=1.009, diff_loss=0.5248, tot_loss=1.875, over 937.00 samples.], 2024-10-20 20:57:50,819 INFO [train.py:561] (1/4) Epoch 367, batch 14, global_batch_idx: 5870, batch size: 142, loss[dur_loss=0.3519, prior_loss=1.009, diff_loss=0.4481, tot_loss=1.809, over 142.00 samples.], tot_loss[dur_loss=0.3476, prior_loss=1.01, diff_loss=0.4567, tot_loss=1.814, over 2210.00 samples.], 2024-10-20 20:57:52,239 INFO [train.py:682] (1/4) Start epoch 368 2024-10-20 20:58:12,066 INFO [train.py:561] (1/4) Epoch 368, batch 8, global_batch_idx: 5880, batch size: 170, loss[dur_loss=0.3487, prior_loss=1.01, diff_loss=0.4113, tot_loss=1.77, over 170.00 samples.], tot_loss[dur_loss=0.3457, prior_loss=1.01, diff_loss=0.4831, tot_loss=1.838, over 1432.00 samples.], 2024-10-20 20:58:22,212 INFO [train.py:682] (1/4) Start epoch 369 2024-10-20 20:58:33,469 INFO [train.py:561] (1/4) Epoch 369, batch 2, global_batch_idx: 5890, batch size: 203, loss[dur_loss=0.3444, prior_loss=1.01, diff_loss=0.4193, tot_loss=1.773, over 203.00 samples.], tot_loss[dur_loss=0.3495, prior_loss=1.01, diff_loss=0.4175, tot_loss=1.777, over 442.00 samples.], 2024-10-20 20:58:47,743 INFO [train.py:561] (1/4) Epoch 369, batch 12, global_batch_idx: 5900, batch size: 152, loss[dur_loss=0.3599, prior_loss=1.01, diff_loss=0.4081, tot_loss=1.778, over 152.00 samples.], tot_loss[dur_loss=0.3472, prior_loss=1.01, diff_loss=0.4628, tot_loss=1.82, over 1966.00 samples.], 2024-10-20 20:58:52,197 INFO [train.py:682] (1/4) Start epoch 370 2024-10-20 20:59:09,119 INFO [train.py:561] (1/4) Epoch 370, batch 6, global_batch_idx: 5910, batch size: 106, loss[dur_loss=0.3492, prior_loss=1.01, diff_loss=0.3866, tot_loss=1.746, over 106.00 samples.], tot_loss[dur_loss=0.3431, prior_loss=1.009, diff_loss=0.5025, tot_loss=1.855, over 1142.00 samples.], 2024-10-20 20:59:22,246 INFO [train.py:682] (1/4) Start epoch 371 2024-10-20 20:59:31,126 INFO [train.py:561] (1/4) Epoch 371, batch 0, global_batch_idx: 5920, batch size: 108, loss[dur_loss=0.3592, 
prior_loss=1.01, diff_loss=0.3892, tot_loss=1.759, over 108.00 samples.], tot_loss[dur_loss=0.3592, prior_loss=1.01, diff_loss=0.3892, tot_loss=1.759, over 108.00 samples.], 2024-10-20 20:59:45,284 INFO [train.py:561] (1/4) Epoch 371, batch 10, global_batch_idx: 5930, batch size: 111, loss[dur_loss=0.3566, prior_loss=1.011, diff_loss=0.388, tot_loss=1.756, over 111.00 samples.], tot_loss[dur_loss=0.3444, prior_loss=1.009, diff_loss=0.4713, tot_loss=1.825, over 1656.00 samples.], 2024-10-20 20:59:52,413 INFO [train.py:682] (1/4) Start epoch 372 2024-10-20 21:00:05,983 INFO [train.py:561] (1/4) Epoch 372, batch 4, global_batch_idx: 5940, batch size: 189, loss[dur_loss=0.3482, prior_loss=1.01, diff_loss=0.4284, tot_loss=1.786, over 189.00 samples.], tot_loss[dur_loss=0.3416, prior_loss=1.009, diff_loss=0.5093, tot_loss=1.86, over 937.00 samples.], 2024-10-20 21:00:20,967 INFO [train.py:561] (1/4) Epoch 372, batch 14, global_batch_idx: 5950, batch size: 142, loss[dur_loss=0.349, prior_loss=1.009, diff_loss=0.4023, tot_loss=1.761, over 142.00 samples.], tot_loss[dur_loss=0.3472, prior_loss=1.009, diff_loss=0.4466, tot_loss=1.803, over 2210.00 samples.], 2024-10-20 21:00:22,393 INFO [train.py:682] (1/4) Start epoch 373 2024-10-20 21:00:42,117 INFO [train.py:561] (1/4) Epoch 373, batch 8, global_batch_idx: 5960, batch size: 170, loss[dur_loss=0.3433, prior_loss=1.009, diff_loss=0.425, tot_loss=1.777, over 170.00 samples.], tot_loss[dur_loss=0.3419, prior_loss=1.009, diff_loss=0.4847, tot_loss=1.835, over 1432.00 samples.], 2024-10-20 21:00:52,317 INFO [train.py:682] (1/4) Start epoch 374 2024-10-20 21:01:03,922 INFO [train.py:561] (1/4) Epoch 374, batch 2, global_batch_idx: 5970, batch size: 203, loss[dur_loss=0.3452, prior_loss=1.009, diff_loss=0.429, tot_loss=1.783, over 203.00 samples.], tot_loss[dur_loss=0.3486, prior_loss=1.009, diff_loss=0.4115, tot_loss=1.77, over 442.00 samples.], 2024-10-20 21:01:18,149 INFO [train.py:561] (1/4) Epoch 374, batch 12, global_batch_idx: 5980, batch size: 152, loss[dur_loss=0.3492, prior_loss=1.009, diff_loss=0.4313, tot_loss=1.79, over 152.00 samples.], tot_loss[dur_loss=0.3442, prior_loss=1.009, diff_loss=0.4636, tot_loss=1.817, over 1966.00 samples.], 2024-10-20 21:01:22,620 INFO [train.py:682] (1/4) Start epoch 375 2024-10-20 21:01:39,525 INFO [train.py:561] (1/4) Epoch 375, batch 6, global_batch_idx: 5990, batch size: 106, loss[dur_loss=0.351, prior_loss=1.009, diff_loss=0.4201, tot_loss=1.781, over 106.00 samples.], tot_loss[dur_loss=0.3412, prior_loss=1.009, diff_loss=0.4963, tot_loss=1.846, over 1142.00 samples.], 2024-10-20 21:01:52,605 INFO [train.py:682] (1/4) Start epoch 376 2024-10-20 21:02:01,413 INFO [train.py:561] (1/4) Epoch 376, batch 0, global_batch_idx: 6000, batch size: 108, loss[dur_loss=0.359, prior_loss=1.01, diff_loss=0.4096, tot_loss=1.779, over 108.00 samples.], tot_loss[dur_loss=0.359, prior_loss=1.01, diff_loss=0.4096, tot_loss=1.779, over 108.00 samples.], 2024-10-20 21:02:02,804 INFO [train.py:579] (1/4) Computing validation loss 2024-10-20 21:02:17,512 INFO [train.py:589] (1/4) Epoch 376, validation: dur_loss=0.4181, prior_loss=1.027, diff_loss=0.4324, tot_loss=1.877, over 100.00 samples. 
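[Editor's note] Unlike the per-batch entries, the validation entry just above (Epoch 376, at global batch 6000) reports the same three loss components plus their total over a fixed 100-sample set, so successive validation lines are directly comparable. A hedged sketch for extracting these entries from a saved copy of a log in this format; the file path and helper name are hypothetical, and the regex simply mirrors the line format printed above:

    import re

    VALIDATION = re.compile(
        r"Epoch (\d+), validation: dur_loss=([\d.]+), prior_loss=([\d.]+), "
        r"diff_loss=([\d.]+), tot_loss=([\d.]+)"
    )

    def validation_history(log_path):
        # Yield (epoch, dur_loss, prior_loss, diff_loss, tot_loss)
        # for every validation entry found in the log file.
        with open(log_path) as f:
            for line in f:
                m = VALIDATION.search(line)
                if m:
                    epoch, *losses = m.groups()
                    yield (int(epoch), *map(float, losses))

    # For the entry above this yields (376, 0.4181, 1.027, 0.4324, 1.877).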
2024-10-20 21:02:17,513 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-20 21:02:30,427 INFO [train.py:561] (1/4) Epoch 376, batch 10, global_batch_idx: 6010, batch size: 111, loss[dur_loss=0.3576, prior_loss=1.011, diff_loss=0.3675, tot_loss=1.736, over 111.00 samples.], tot_loss[dur_loss=0.3443, prior_loss=1.009, diff_loss=0.4596, tot_loss=1.813, over 1656.00 samples.], 2024-10-20 21:02:37,503 INFO [train.py:682] (1/4) Start epoch 377 2024-10-20 21:02:51,349 INFO [train.py:561] (1/4) Epoch 377, batch 4, global_batch_idx: 6020, batch size: 189, loss[dur_loss=0.3413, prior_loss=1.009, diff_loss=0.4608, tot_loss=1.811, over 189.00 samples.], tot_loss[dur_loss=0.3401, prior_loss=1.009, diff_loss=0.544, tot_loss=1.893, over 937.00 samples.], 2024-10-20 21:03:06,154 INFO [train.py:561] (1/4) Epoch 377, batch 14, global_batch_idx: 6030, batch size: 142, loss[dur_loss=0.3507, prior_loss=1.009, diff_loss=0.4115, tot_loss=1.771, over 142.00 samples.], tot_loss[dur_loss=0.3447, prior_loss=1.009, diff_loss=0.463, tot_loss=1.817, over 2210.00 samples.], 2024-10-20 21:03:07,575 INFO [train.py:682] (1/4) Start epoch 378 2024-10-20 21:03:27,403 INFO [train.py:561] (1/4) Epoch 378, batch 8, global_batch_idx: 6040, batch size: 170, loss[dur_loss=0.3524, prior_loss=1.009, diff_loss=0.4119, tot_loss=1.773, over 170.00 samples.], tot_loss[dur_loss=0.3417, prior_loss=1.009, diff_loss=0.4747, tot_loss=1.825, over 1432.00 samples.], 2024-10-20 21:03:37,444 INFO [train.py:682] (1/4) Start epoch 379 2024-10-20 21:03:48,930 INFO [train.py:561] (1/4) Epoch 379, batch 2, global_batch_idx: 6050, batch size: 203, loss[dur_loss=0.3407, prior_loss=1.009, diff_loss=0.4275, tot_loss=1.777, over 203.00 samples.], tot_loss[dur_loss=0.3455, prior_loss=1.009, diff_loss=0.423, tot_loss=1.778, over 442.00 samples.], 2024-10-20 21:04:03,096 INFO [train.py:561] (1/4) Epoch 379, batch 12, global_batch_idx: 6060, batch size: 152, loss[dur_loss=0.3498, prior_loss=1.009, diff_loss=0.4016, tot_loss=1.76, over 152.00 samples.], tot_loss[dur_loss=0.3424, prior_loss=1.009, diff_loss=0.4636, tot_loss=1.815, over 1966.00 samples.], 2024-10-20 21:04:07,547 INFO [train.py:682] (1/4) Start epoch 380 2024-10-20 21:04:24,506 INFO [train.py:561] (1/4) Epoch 380, batch 6, global_batch_idx: 6070, batch size: 106, loss[dur_loss=0.3443, prior_loss=1.009, diff_loss=0.4295, tot_loss=1.783, over 106.00 samples.], tot_loss[dur_loss=0.3379, prior_loss=1.008, diff_loss=0.498, tot_loss=1.844, over 1142.00 samples.], 2024-10-20 21:04:37,404 INFO [train.py:682] (1/4) Start epoch 381 2024-10-20 21:04:46,281 INFO [train.py:561] (1/4) Epoch 381, batch 0, global_batch_idx: 6080, batch size: 108, loss[dur_loss=0.357, prior_loss=1.01, diff_loss=0.377, tot_loss=1.744, over 108.00 samples.], tot_loss[dur_loss=0.357, prior_loss=1.01, diff_loss=0.377, tot_loss=1.744, over 108.00 samples.], 2024-10-20 21:05:00,350 INFO [train.py:561] (1/4) Epoch 381, batch 10, global_batch_idx: 6090, batch size: 111, loss[dur_loss=0.355, prior_loss=1.011, diff_loss=0.4014, tot_loss=1.767, over 111.00 samples.], tot_loss[dur_loss=0.3416, prior_loss=1.009, diff_loss=0.4713, tot_loss=1.822, over 1656.00 samples.], 2024-10-20 21:05:07,377 INFO [train.py:682] (1/4) Start epoch 382 2024-10-20 21:05:21,337 INFO [train.py:561] (1/4) Epoch 382, batch 4, global_batch_idx: 6100, batch size: 189, loss[dur_loss=0.342, prior_loss=1.009, diff_loss=0.4174, tot_loss=1.768, over 189.00 samples.], tot_loss[dur_loss=0.3368, prior_loss=1.008, diff_loss=0.5153, tot_loss=1.86, 
over 937.00 samples.], 2024-10-20 21:05:36,111 INFO [train.py:561] (1/4) Epoch 382, batch 14, global_batch_idx: 6110, batch size: 142, loss[dur_loss=0.3543, prior_loss=1.009, diff_loss=0.3864, tot_loss=1.749, over 142.00 samples.], tot_loss[dur_loss=0.3434, prior_loss=1.009, diff_loss=0.4493, tot_loss=1.801, over 2210.00 samples.], 2024-10-20 21:05:37,526 INFO [train.py:682] (1/4) Start epoch 383 2024-10-20 21:05:57,352 INFO [train.py:561] (1/4) Epoch 383, batch 8, global_batch_idx: 6120, batch size: 170, loss[dur_loss=0.346, prior_loss=1.009, diff_loss=0.4578, tot_loss=1.813, over 170.00 samples.], tot_loss[dur_loss=0.3404, prior_loss=1.008, diff_loss=0.4842, tot_loss=1.833, over 1432.00 samples.], 2024-10-20 21:06:07,341 INFO [train.py:682] (1/4) Start epoch 384 2024-10-20 21:06:18,724 INFO [train.py:561] (1/4) Epoch 384, batch 2, global_batch_idx: 6130, batch size: 203, loss[dur_loss=0.3386, prior_loss=1.008, diff_loss=0.4259, tot_loss=1.773, over 203.00 samples.], tot_loss[dur_loss=0.3435, prior_loss=1.009, diff_loss=0.4201, tot_loss=1.773, over 442.00 samples.], 2024-10-20 21:06:32,831 INFO [train.py:561] (1/4) Epoch 384, batch 12, global_batch_idx: 6140, batch size: 152, loss[dur_loss=0.3558, prior_loss=1.009, diff_loss=0.4151, tot_loss=1.78, over 152.00 samples.], tot_loss[dur_loss=0.3425, prior_loss=1.009, diff_loss=0.4646, tot_loss=1.816, over 1966.00 samples.], 2024-10-20 21:06:37,222 INFO [train.py:682] (1/4) Start epoch 385 2024-10-20 21:06:54,174 INFO [train.py:561] (1/4) Epoch 385, batch 6, global_batch_idx: 6150, batch size: 106, loss[dur_loss=0.3418, prior_loss=1.009, diff_loss=0.3815, tot_loss=1.732, over 106.00 samples.], tot_loss[dur_loss=0.3375, prior_loss=1.008, diff_loss=0.4953, tot_loss=1.841, over 1142.00 samples.], 2024-10-20 21:07:07,091 INFO [train.py:682] (1/4) Start epoch 386 2024-10-20 21:07:15,523 INFO [train.py:561] (1/4) Epoch 386, batch 0, global_batch_idx: 6160, batch size: 108, loss[dur_loss=0.3586, prior_loss=1.01, diff_loss=0.4239, tot_loss=1.793, over 108.00 samples.], tot_loss[dur_loss=0.3586, prior_loss=1.01, diff_loss=0.4239, tot_loss=1.793, over 108.00 samples.], 2024-10-20 21:07:29,684 INFO [train.py:561] (1/4) Epoch 386, batch 10, global_batch_idx: 6170, batch size: 111, loss[dur_loss=0.356, prior_loss=1.011, diff_loss=0.4159, tot_loss=1.783, over 111.00 samples.], tot_loss[dur_loss=0.341, prior_loss=1.009, diff_loss=0.4743, tot_loss=1.824, over 1656.00 samples.], 2024-10-20 21:07:36,743 INFO [train.py:682] (1/4) Start epoch 387 2024-10-20 21:07:50,560 INFO [train.py:561] (1/4) Epoch 387, batch 4, global_batch_idx: 6180, batch size: 189, loss[dur_loss=0.334, prior_loss=1.009, diff_loss=0.4134, tot_loss=1.756, over 189.00 samples.], tot_loss[dur_loss=0.3357, prior_loss=1.008, diff_loss=0.5166, tot_loss=1.86, over 937.00 samples.], 2024-10-20 21:08:05,277 INFO [train.py:561] (1/4) Epoch 387, batch 14, global_batch_idx: 6190, batch size: 142, loss[dur_loss=0.3506, prior_loss=1.009, diff_loss=0.4066, tot_loss=1.766, over 142.00 samples.], tot_loss[dur_loss=0.3426, prior_loss=1.008, diff_loss=0.4492, tot_loss=1.8, over 2210.00 samples.], 2024-10-20 21:08:06,701 INFO [train.py:682] (1/4) Start epoch 388 2024-10-20 21:08:26,494 INFO [train.py:561] (1/4) Epoch 388, batch 8, global_batch_idx: 6200, batch size: 170, loss[dur_loss=0.3475, prior_loss=1.009, diff_loss=0.4428, tot_loss=1.799, over 170.00 samples.], tot_loss[dur_loss=0.3401, prior_loss=1.008, diff_loss=0.4803, tot_loss=1.829, over 1432.00 samples.], 2024-10-20 21:08:36,527 INFO [train.py:682] 
(1/4) Start epoch 389 2024-10-20 21:08:47,547 INFO [train.py:561] (1/4) Epoch 389, batch 2, global_batch_idx: 6210, batch size: 203, loss[dur_loss=0.3401, prior_loss=1.008, diff_loss=0.4308, tot_loss=1.779, over 203.00 samples.], tot_loss[dur_loss=0.3431, prior_loss=1.009, diff_loss=0.4051, tot_loss=1.757, over 442.00 samples.], 2024-10-20 21:09:01,640 INFO [train.py:561] (1/4) Epoch 389, batch 12, global_batch_idx: 6220, batch size: 152, loss[dur_loss=0.3499, prior_loss=1.008, diff_loss=0.4296, tot_loss=1.788, over 152.00 samples.], tot_loss[dur_loss=0.3419, prior_loss=1.008, diff_loss=0.4575, tot_loss=1.808, over 1966.00 samples.], 2024-10-20 21:09:06,051 INFO [train.py:682] (1/4) Start epoch 390 2024-10-20 21:09:23,022 INFO [train.py:561] (1/4) Epoch 390, batch 6, global_batch_idx: 6230, batch size: 106, loss[dur_loss=0.3469, prior_loss=1.009, diff_loss=0.351, tot_loss=1.707, over 106.00 samples.], tot_loss[dur_loss=0.3376, prior_loss=1.008, diff_loss=0.4854, tot_loss=1.831, over 1142.00 samples.], 2024-10-20 21:09:35,993 INFO [train.py:682] (1/4) Start epoch 391 2024-10-20 21:09:44,768 INFO [train.py:561] (1/4) Epoch 391, batch 0, global_batch_idx: 6240, batch size: 108, loss[dur_loss=0.3571, prior_loss=1.01, diff_loss=0.3933, tot_loss=1.76, over 108.00 samples.], tot_loss[dur_loss=0.3571, prior_loss=1.01, diff_loss=0.3933, tot_loss=1.76, over 108.00 samples.], 2024-10-20 21:09:58,877 INFO [train.py:561] (1/4) Epoch 391, batch 10, global_batch_idx: 6250, batch size: 111, loss[dur_loss=0.3526, prior_loss=1.01, diff_loss=0.3797, tot_loss=1.743, over 111.00 samples.], tot_loss[dur_loss=0.3411, prior_loss=1.008, diff_loss=0.467, tot_loss=1.816, over 1656.00 samples.], 2024-10-20 21:10:05,926 INFO [train.py:682] (1/4) Start epoch 392 2024-10-20 21:10:19,612 INFO [train.py:561] (1/4) Epoch 392, batch 4, global_batch_idx: 6260, batch size: 189, loss[dur_loss=0.3342, prior_loss=1.009, diff_loss=0.423, tot_loss=1.766, over 189.00 samples.], tot_loss[dur_loss=0.3348, prior_loss=1.008, diff_loss=0.5189, tot_loss=1.862, over 937.00 samples.], 2024-10-20 21:10:34,396 INFO [train.py:561] (1/4) Epoch 392, batch 14, global_batch_idx: 6270, batch size: 142, loss[dur_loss=0.3478, prior_loss=1.008, diff_loss=0.3993, tot_loss=1.755, over 142.00 samples.], tot_loss[dur_loss=0.3423, prior_loss=1.008, diff_loss=0.4553, tot_loss=1.806, over 2210.00 samples.], 2024-10-20 21:10:35,812 INFO [train.py:682] (1/4) Start epoch 393 2024-10-20 21:10:55,549 INFO [train.py:561] (1/4) Epoch 393, batch 8, global_batch_idx: 6280, batch size: 170, loss[dur_loss=0.3449, prior_loss=1.008, diff_loss=0.418, tot_loss=1.771, over 170.00 samples.], tot_loss[dur_loss=0.3404, prior_loss=1.008, diff_loss=0.4857, tot_loss=1.834, over 1432.00 samples.], 2024-10-20 21:11:05,642 INFO [train.py:682] (1/4) Start epoch 394 2024-10-20 21:11:16,767 INFO [train.py:561] (1/4) Epoch 394, batch 2, global_batch_idx: 6290, batch size: 203, loss[dur_loss=0.3408, prior_loss=1.008, diff_loss=0.4257, tot_loss=1.775, over 203.00 samples.], tot_loss[dur_loss=0.3445, prior_loss=1.009, diff_loss=0.4129, tot_loss=1.766, over 442.00 samples.], 2024-10-20 21:11:30,889 INFO [train.py:561] (1/4) Epoch 394, batch 12, global_batch_idx: 6300, batch size: 152, loss[dur_loss=0.3517, prior_loss=1.009, diff_loss=0.4038, tot_loss=1.764, over 152.00 samples.], tot_loss[dur_loss=0.3424, prior_loss=1.008, diff_loss=0.456, tot_loss=1.807, over 1966.00 samples.], 2024-10-20 21:11:35,360 INFO [train.py:682] (1/4) Start epoch 395 2024-10-20 21:11:52,327 INFO [train.py:561] 
(1/4) Epoch 395, batch 6, global_batch_idx: 6310, batch size: 106, loss[dur_loss=0.3419, prior_loss=1.008, diff_loss=0.4194, tot_loss=1.77, over 106.00 samples.], tot_loss[dur_loss=0.3383, prior_loss=1.008, diff_loss=0.4848, tot_loss=1.831, over 1142.00 samples.], 2024-10-20 21:12:05,327 INFO [train.py:682] (1/4) Start epoch 396 2024-10-20 21:12:13,875 INFO [train.py:561] (1/4) Epoch 396, batch 0, global_batch_idx: 6320, batch size: 108, loss[dur_loss=0.3539, prior_loss=1.01, diff_loss=0.3764, tot_loss=1.74, over 108.00 samples.], tot_loss[dur_loss=0.3539, prior_loss=1.01, diff_loss=0.3764, tot_loss=1.74, over 108.00 samples.], 2024-10-20 21:12:28,236 INFO [train.py:561] (1/4) Epoch 396, batch 10, global_batch_idx: 6330, batch size: 111, loss[dur_loss=0.3528, prior_loss=1.01, diff_loss=0.3649, tot_loss=1.728, over 111.00 samples.], tot_loss[dur_loss=0.3408, prior_loss=1.008, diff_loss=0.4588, tot_loss=1.808, over 1656.00 samples.], 2024-10-20 21:12:35,279 INFO [train.py:682] (1/4) Start epoch 397 2024-10-20 21:12:48,931 INFO [train.py:561] (1/4) Epoch 397, batch 4, global_batch_idx: 6340, batch size: 189, loss[dur_loss=0.3386, prior_loss=1.009, diff_loss=0.4494, tot_loss=1.797, over 189.00 samples.], tot_loss[dur_loss=0.3358, prior_loss=1.008, diff_loss=0.5206, tot_loss=1.864, over 937.00 samples.], 2024-10-20 21:13:03,828 INFO [train.py:561] (1/4) Epoch 397, batch 14, global_batch_idx: 6350, batch size: 142, loss[dur_loss=0.3444, prior_loss=1.008, diff_loss=0.3825, tot_loss=1.735, over 142.00 samples.], tot_loss[dur_loss=0.342, prior_loss=1.008, diff_loss=0.4544, tot_loss=1.805, over 2210.00 samples.], 2024-10-20 21:13:05,242 INFO [train.py:682] (1/4) Start epoch 398 2024-10-20 21:13:25,028 INFO [train.py:561] (1/4) Epoch 398, batch 8, global_batch_idx: 6360, batch size: 170, loss[dur_loss=0.3398, prior_loss=1.008, diff_loss=0.4381, tot_loss=1.786, over 170.00 samples.], tot_loss[dur_loss=0.338, prior_loss=1.008, diff_loss=0.4844, tot_loss=1.83, over 1432.00 samples.], 2024-10-20 21:13:35,148 INFO [train.py:682] (1/4) Start epoch 399 2024-10-20 21:13:46,350 INFO [train.py:561] (1/4) Epoch 399, batch 2, global_batch_idx: 6370, batch size: 203, loss[dur_loss=0.3364, prior_loss=1.008, diff_loss=0.4231, tot_loss=1.767, over 203.00 samples.], tot_loss[dur_loss=0.3405, prior_loss=1.008, diff_loss=0.4144, tot_loss=1.763, over 442.00 samples.], 2024-10-20 21:14:00,555 INFO [train.py:561] (1/4) Epoch 399, batch 12, global_batch_idx: 6380, batch size: 152, loss[dur_loss=0.3458, prior_loss=1.008, diff_loss=0.4313, tot_loss=1.785, over 152.00 samples.], tot_loss[dur_loss=0.3381, prior_loss=1.008, diff_loss=0.4505, tot_loss=1.796, over 1966.00 samples.], 2024-10-20 21:14:05,003 INFO [train.py:682] (1/4) Start epoch 400 2024-10-20 21:14:22,205 INFO [train.py:561] (1/4) Epoch 400, batch 6, global_batch_idx: 6390, batch size: 106, loss[dur_loss=0.343, prior_loss=1.008, diff_loss=0.3778, tot_loss=1.729, over 106.00 samples.], tot_loss[dur_loss=0.3354, prior_loss=1.008, diff_loss=0.5062, tot_loss=1.849, over 1142.00 samples.], 2024-10-20 21:14:35,228 INFO [train.py:682] (1/4) Start epoch 401 2024-10-20 21:14:43,814 INFO [train.py:561] (1/4) Epoch 401, batch 0, global_batch_idx: 6400, batch size: 108, loss[dur_loss=0.3521, prior_loss=1.009, diff_loss=0.3961, tot_loss=1.757, over 108.00 samples.], tot_loss[dur_loss=0.3521, prior_loss=1.009, diff_loss=0.3961, tot_loss=1.757, over 108.00 samples.], 2024-10-20 21:14:57,971 INFO [train.py:561] (1/4) Epoch 401, batch 10, global_batch_idx: 6410, batch size: 111, 
loss[dur_loss=0.3527, prior_loss=1.01, diff_loss=0.4087, tot_loss=1.771, over 111.00 samples.], tot_loss[dur_loss=0.3388, prior_loss=1.008, diff_loss=0.4742, tot_loss=1.821, over 1656.00 samples.], 2024-10-20 21:15:04,993 INFO [train.py:682] (1/4) Start epoch 402 2024-10-20 21:15:18,404 INFO [train.py:561] (1/4) Epoch 402, batch 4, global_batch_idx: 6420, batch size: 189, loss[dur_loss=0.3319, prior_loss=1.008, diff_loss=0.4141, tot_loss=1.754, over 189.00 samples.], tot_loss[dur_loss=0.3337, prior_loss=1.007, diff_loss=0.5055, tot_loss=1.847, over 937.00 samples.], 2024-10-20 21:15:33,298 INFO [train.py:561] (1/4) Epoch 402, batch 14, global_batch_idx: 6430, batch size: 142, loss[dur_loss=0.3515, prior_loss=1.008, diff_loss=0.3865, tot_loss=1.746, over 142.00 samples.], tot_loss[dur_loss=0.3389, prior_loss=1.008, diff_loss=0.4443, tot_loss=1.791, over 2210.00 samples.], 2024-10-20 21:15:34,718 INFO [train.py:682] (1/4) Start epoch 403 2024-10-20 21:15:54,491 INFO [train.py:561] (1/4) Epoch 403, batch 8, global_batch_idx: 6440, batch size: 170, loss[dur_loss=0.344, prior_loss=1.008, diff_loss=0.4324, tot_loss=1.784, over 170.00 samples.], tot_loss[dur_loss=0.3357, prior_loss=1.008, diff_loss=0.4727, tot_loss=1.816, over 1432.00 samples.], 2024-10-20 21:16:04,538 INFO [train.py:682] (1/4) Start epoch 404 2024-10-20 21:16:15,638 INFO [train.py:561] (1/4) Epoch 404, batch 2, global_batch_idx: 6450, batch size: 203, loss[dur_loss=0.3394, prior_loss=1.008, diff_loss=0.4289, tot_loss=1.776, over 203.00 samples.], tot_loss[dur_loss=0.3428, prior_loss=1.008, diff_loss=0.4191, tot_loss=1.77, over 442.00 samples.], 2024-10-20 21:16:29,714 INFO [train.py:561] (1/4) Epoch 404, batch 12, global_batch_idx: 6460, batch size: 152, loss[dur_loss=0.3447, prior_loss=1.007, diff_loss=0.3973, tot_loss=1.749, over 152.00 samples.], tot_loss[dur_loss=0.3384, prior_loss=1.008, diff_loss=0.4595, tot_loss=1.806, over 1966.00 samples.], 2024-10-20 21:16:34,121 INFO [train.py:682] (1/4) Start epoch 405 2024-10-20 21:16:50,956 INFO [train.py:561] (1/4) Epoch 405, batch 6, global_batch_idx: 6470, batch size: 106, loss[dur_loss=0.3408, prior_loss=1.008, diff_loss=0.4233, tot_loss=1.772, over 106.00 samples.], tot_loss[dur_loss=0.3355, prior_loss=1.007, diff_loss=0.4988, tot_loss=1.842, over 1142.00 samples.], 2024-10-20 21:17:04,052 INFO [train.py:682] (1/4) Start epoch 406 2024-10-20 21:17:12,798 INFO [train.py:561] (1/4) Epoch 406, batch 0, global_batch_idx: 6480, batch size: 108, loss[dur_loss=0.3537, prior_loss=1.009, diff_loss=0.3805, tot_loss=1.743, over 108.00 samples.], tot_loss[dur_loss=0.3537, prior_loss=1.009, diff_loss=0.3805, tot_loss=1.743, over 108.00 samples.], 2024-10-20 21:17:27,041 INFO [train.py:561] (1/4) Epoch 406, batch 10, global_batch_idx: 6490, batch size: 111, loss[dur_loss=0.3501, prior_loss=1.009, diff_loss=0.3673, tot_loss=1.727, over 111.00 samples.], tot_loss[dur_loss=0.3374, prior_loss=1.008, diff_loss=0.4722, tot_loss=1.817, over 1656.00 samples.], 2024-10-20 21:17:34,189 INFO [train.py:682] (1/4) Start epoch 407 2024-10-20 21:17:47,819 INFO [train.py:561] (1/4) Epoch 407, batch 4, global_batch_idx: 6500, batch size: 189, loss[dur_loss=0.3345, prior_loss=1.008, diff_loss=0.4103, tot_loss=1.752, over 189.00 samples.], tot_loss[dur_loss=0.3326, prior_loss=1.007, diff_loss=0.5221, tot_loss=1.862, over 937.00 samples.], 2024-10-20 21:18:02,736 INFO [train.py:561] (1/4) Epoch 407, batch 14, global_batch_idx: 6510, batch size: 142, loss[dur_loss=0.3437, prior_loss=1.008, diff_loss=0.3691, 
tot_loss=1.72, over 142.00 samples.], tot_loss[dur_loss=0.3379, prior_loss=1.007, diff_loss=0.4495, tot_loss=1.795, over 2210.00 samples.], 2024-10-20 21:18:04,161 INFO [train.py:682] (1/4) Start epoch 408 2024-10-20 21:18:23,793 INFO [train.py:561] (1/4) Epoch 408, batch 8, global_batch_idx: 6520, batch size: 170, loss[dur_loss=0.3381, prior_loss=1.007, diff_loss=0.4816, tot_loss=1.827, over 170.00 samples.], tot_loss[dur_loss=0.3345, prior_loss=1.007, diff_loss=0.4833, tot_loss=1.825, over 1432.00 samples.], 2024-10-20 21:18:33,924 INFO [train.py:682] (1/4) Start epoch 409 2024-10-20 21:18:45,405 INFO [train.py:561] (1/4) Epoch 409, batch 2, global_batch_idx: 6530, batch size: 203, loss[dur_loss=0.3394, prior_loss=1.008, diff_loss=0.4322, tot_loss=1.779, over 203.00 samples.], tot_loss[dur_loss=0.3423, prior_loss=1.008, diff_loss=0.4048, tot_loss=1.755, over 442.00 samples.], 2024-10-20 21:18:59,656 INFO [train.py:561] (1/4) Epoch 409, batch 12, global_batch_idx: 6540, batch size: 152, loss[dur_loss=0.3463, prior_loss=1.008, diff_loss=0.4192, tot_loss=1.773, over 152.00 samples.], tot_loss[dur_loss=0.3378, prior_loss=1.007, diff_loss=0.4562, tot_loss=1.801, over 1966.00 samples.], 2024-10-20 21:19:04,088 INFO [train.py:682] (1/4) Start epoch 410 2024-10-20 21:19:20,784 INFO [train.py:561] (1/4) Epoch 410, batch 6, global_batch_idx: 6550, batch size: 106, loss[dur_loss=0.3375, prior_loss=1.007, diff_loss=0.3588, tot_loss=1.704, over 106.00 samples.], tot_loss[dur_loss=0.3334, prior_loss=1.007, diff_loss=0.4964, tot_loss=1.837, over 1142.00 samples.], 2024-10-20 21:19:33,778 INFO [train.py:682] (1/4) Start epoch 411 2024-10-20 21:19:42,414 INFO [train.py:561] (1/4) Epoch 411, batch 0, global_batch_idx: 6560, batch size: 108, loss[dur_loss=0.3518, prior_loss=1.008, diff_loss=0.3567, tot_loss=1.717, over 108.00 samples.], tot_loss[dur_loss=0.3518, prior_loss=1.008, diff_loss=0.3567, tot_loss=1.717, over 108.00 samples.], 2024-10-20 21:19:56,520 INFO [train.py:561] (1/4) Epoch 411, batch 10, global_batch_idx: 6570, batch size: 111, loss[dur_loss=0.3523, prior_loss=1.009, diff_loss=0.3846, tot_loss=1.746, over 111.00 samples.], tot_loss[dur_loss=0.3361, prior_loss=1.007, diff_loss=0.4486, tot_loss=1.792, over 1656.00 samples.], 2024-10-20 21:20:03,655 INFO [train.py:682] (1/4) Start epoch 412 2024-10-20 21:20:17,184 INFO [train.py:561] (1/4) Epoch 412, batch 4, global_batch_idx: 6580, batch size: 189, loss[dur_loss=0.3345, prior_loss=1.008, diff_loss=0.4127, tot_loss=1.756, over 189.00 samples.], tot_loss[dur_loss=0.3319, prior_loss=1.007, diff_loss=0.5108, tot_loss=1.85, over 937.00 samples.], 2024-10-20 21:20:32,186 INFO [train.py:561] (1/4) Epoch 412, batch 14, global_batch_idx: 6590, batch size: 142, loss[dur_loss=0.3447, prior_loss=1.008, diff_loss=0.4231, tot_loss=1.775, over 142.00 samples.], tot_loss[dur_loss=0.3382, prior_loss=1.008, diff_loss=0.4513, tot_loss=1.797, over 2210.00 samples.], 2024-10-20 21:20:33,639 INFO [train.py:682] (1/4) Start epoch 413 2024-10-20 21:20:53,658 INFO [train.py:561] (1/4) Epoch 413, batch 8, global_batch_idx: 6600, batch size: 170, loss[dur_loss=0.3394, prior_loss=1.007, diff_loss=0.4251, tot_loss=1.772, over 170.00 samples.], tot_loss[dur_loss=0.3342, prior_loss=1.007, diff_loss=0.4747, tot_loss=1.816, over 1432.00 samples.], 2024-10-20 21:21:03,972 INFO [train.py:682] (1/4) Start epoch 414 2024-10-20 21:21:15,496 INFO [train.py:561] (1/4) Epoch 414, batch 2, global_batch_idx: 6610, batch size: 203, loss[dur_loss=0.3365, prior_loss=1.007, 
diff_loss=0.454, tot_loss=1.798, over 203.00 samples.], tot_loss[dur_loss=0.3402, prior_loss=1.008, diff_loss=0.419, tot_loss=1.767, over 442.00 samples.], 2024-10-20 21:21:29,737 INFO [train.py:561] (1/4) Epoch 414, batch 12, global_batch_idx: 6620, batch size: 152, loss[dur_loss=0.3457, prior_loss=1.007, diff_loss=0.3999, tot_loss=1.753, over 152.00 samples.], tot_loss[dur_loss=0.3369, prior_loss=1.007, diff_loss=0.4517, tot_loss=1.796, over 1966.00 samples.], 2024-10-20 21:21:34,248 INFO [train.py:682] (1/4) Start epoch 415 2024-10-20 21:21:51,466 INFO [train.py:561] (1/4) Epoch 415, batch 6, global_batch_idx: 6630, batch size: 106, loss[dur_loss=0.34, prior_loss=1.007, diff_loss=0.4055, tot_loss=1.753, over 106.00 samples.], tot_loss[dur_loss=0.3332, prior_loss=1.007, diff_loss=0.4893, tot_loss=1.829, over 1142.00 samples.], 2024-10-20 21:22:04,530 INFO [train.py:682] (1/4) Start epoch 416 2024-10-20 21:22:13,065 INFO [train.py:561] (1/4) Epoch 416, batch 0, global_batch_idx: 6640, batch size: 108, loss[dur_loss=0.3533, prior_loss=1.009, diff_loss=0.3835, tot_loss=1.745, over 108.00 samples.], tot_loss[dur_loss=0.3533, prior_loss=1.009, diff_loss=0.3835, tot_loss=1.745, over 108.00 samples.], 2024-10-20 21:22:27,310 INFO [train.py:561] (1/4) Epoch 416, batch 10, global_batch_idx: 6650, batch size: 111, loss[dur_loss=0.3456, prior_loss=1.009, diff_loss=0.3767, tot_loss=1.732, over 111.00 samples.], tot_loss[dur_loss=0.3343, prior_loss=1.007, diff_loss=0.4598, tot_loss=1.801, over 1656.00 samples.], 2024-10-20 21:22:34,390 INFO [train.py:682] (1/4) Start epoch 417 2024-10-20 21:22:48,137 INFO [train.py:561] (1/4) Epoch 417, batch 4, global_batch_idx: 6660, batch size: 189, loss[dur_loss=0.3365, prior_loss=1.008, diff_loss=0.4193, tot_loss=1.763, over 189.00 samples.], tot_loss[dur_loss=0.3316, prior_loss=1.007, diff_loss=0.5046, tot_loss=1.843, over 937.00 samples.], 2024-10-20 21:23:02,971 INFO [train.py:561] (1/4) Epoch 417, batch 14, global_batch_idx: 6670, batch size: 142, loss[dur_loss=0.3409, prior_loss=1.007, diff_loss=0.3555, tot_loss=1.703, over 142.00 samples.], tot_loss[dur_loss=0.337, prior_loss=1.007, diff_loss=0.4402, tot_loss=1.784, over 2210.00 samples.], 2024-10-20 21:23:04,406 INFO [train.py:682] (1/4) Start epoch 418 2024-10-20 21:23:24,179 INFO [train.py:561] (1/4) Epoch 418, batch 8, global_batch_idx: 6680, batch size: 170, loss[dur_loss=0.3447, prior_loss=1.007, diff_loss=0.4324, tot_loss=1.784, over 170.00 samples.], tot_loss[dur_loss=0.334, prior_loss=1.007, diff_loss=0.4693, tot_loss=1.81, over 1432.00 samples.], 2024-10-20 21:23:34,313 INFO [train.py:682] (1/4) Start epoch 419 2024-10-20 21:23:45,454 INFO [train.py:561] (1/4) Epoch 419, batch 2, global_batch_idx: 6690, batch size: 203, loss[dur_loss=0.3376, prior_loss=1.008, diff_loss=0.392, tot_loss=1.737, over 203.00 samples.], tot_loss[dur_loss=0.3399, prior_loss=1.008, diff_loss=0.3842, tot_loss=1.732, over 442.00 samples.], 2024-10-20 21:23:59,646 INFO [train.py:561] (1/4) Epoch 419, batch 12, global_batch_idx: 6700, batch size: 152, loss[dur_loss=0.3394, prior_loss=1.007, diff_loss=0.3738, tot_loss=1.72, over 152.00 samples.], tot_loss[dur_loss=0.3357, prior_loss=1.007, diff_loss=0.4482, tot_loss=1.791, over 1966.00 samples.], 2024-10-20 21:24:04,106 INFO [train.py:682] (1/4) Start epoch 420 2024-10-20 21:24:21,181 INFO [train.py:561] (1/4) Epoch 420, batch 6, global_batch_idx: 6710, batch size: 106, loss[dur_loss=0.3368, prior_loss=1.007, diff_loss=0.374, tot_loss=1.718, over 106.00 samples.], 
tot_loss[dur_loss=0.3333, prior_loss=1.007, diff_loss=0.4846, tot_loss=1.825, over 1142.00 samples.], 2024-10-20 21:24:34,247 INFO [train.py:682] (1/4) Start epoch 421 2024-10-20 21:24:42,724 INFO [train.py:561] (1/4) Epoch 421, batch 0, global_batch_idx: 6720, batch size: 108, loss[dur_loss=0.3502, prior_loss=1.008, diff_loss=0.3969, tot_loss=1.755, over 108.00 samples.], tot_loss[dur_loss=0.3502, prior_loss=1.008, diff_loss=0.3969, tot_loss=1.755, over 108.00 samples.], 2024-10-20 21:24:56,957 INFO [train.py:561] (1/4) Epoch 421, batch 10, global_batch_idx: 6730, batch size: 111, loss[dur_loss=0.3502, prior_loss=1.009, diff_loss=0.3896, tot_loss=1.749, over 111.00 samples.], tot_loss[dur_loss=0.3365, prior_loss=1.007, diff_loss=0.4707, tot_loss=1.814, over 1656.00 samples.], 2024-10-20 21:25:04,062 INFO [train.py:682] (1/4) Start epoch 422 2024-10-20 21:25:17,757 INFO [train.py:561] (1/4) Epoch 422, batch 4, global_batch_idx: 6740, batch size: 189, loss[dur_loss=0.3326, prior_loss=1.008, diff_loss=0.4386, tot_loss=1.779, over 189.00 samples.], tot_loss[dur_loss=0.3291, prior_loss=1.007, diff_loss=0.5155, tot_loss=1.851, over 937.00 samples.], 2024-10-20 21:25:32,602 INFO [train.py:561] (1/4) Epoch 422, batch 14, global_batch_idx: 6750, batch size: 142, loss[dur_loss=0.3461, prior_loss=1.007, diff_loss=0.3866, tot_loss=1.74, over 142.00 samples.], tot_loss[dur_loss=0.3359, prior_loss=1.007, diff_loss=0.448, tot_loss=1.791, over 2210.00 samples.], 2024-10-20 21:25:34,009 INFO [train.py:682] (1/4) Start epoch 423 2024-10-20 21:25:53,878 INFO [train.py:561] (1/4) Epoch 423, batch 8, global_batch_idx: 6760, batch size: 170, loss[dur_loss=0.3378, prior_loss=1.007, diff_loss=0.4111, tot_loss=1.756, over 170.00 samples.], tot_loss[dur_loss=0.3336, prior_loss=1.007, diff_loss=0.478, tot_loss=1.818, over 1432.00 samples.], 2024-10-20 21:26:03,965 INFO [train.py:682] (1/4) Start epoch 424 2024-10-20 21:26:15,058 INFO [train.py:561] (1/4) Epoch 424, batch 2, global_batch_idx: 6770, batch size: 203, loss[dur_loss=0.3314, prior_loss=1.007, diff_loss=0.4388, tot_loss=1.777, over 203.00 samples.], tot_loss[dur_loss=0.3366, prior_loss=1.007, diff_loss=0.4287, tot_loss=1.772, over 442.00 samples.], 2024-10-20 21:26:29,186 INFO [train.py:561] (1/4) Epoch 424, batch 12, global_batch_idx: 6780, batch size: 152, loss[dur_loss=0.339, prior_loss=1.007, diff_loss=0.4056, tot_loss=1.751, over 152.00 samples.], tot_loss[dur_loss=0.3342, prior_loss=1.007, diff_loss=0.4514, tot_loss=1.792, over 1966.00 samples.], 2024-10-20 21:26:33,616 INFO [train.py:682] (1/4) Start epoch 425 2024-10-20 21:26:50,458 INFO [train.py:561] (1/4) Epoch 425, batch 6, global_batch_idx: 6790, batch size: 106, loss[dur_loss=0.3422, prior_loss=1.007, diff_loss=0.427, tot_loss=1.777, over 106.00 samples.], tot_loss[dur_loss=0.3313, prior_loss=1.006, diff_loss=0.4944, tot_loss=1.832, over 1142.00 samples.], 2024-10-20 21:27:03,507 INFO [train.py:682] (1/4) Start epoch 426 2024-10-20 21:27:12,166 INFO [train.py:561] (1/4) Epoch 426, batch 0, global_batch_idx: 6800, batch size: 108, loss[dur_loss=0.3489, prior_loss=1.008, diff_loss=0.3939, tot_loss=1.751, over 108.00 samples.], tot_loss[dur_loss=0.3489, prior_loss=1.008, diff_loss=0.3939, tot_loss=1.751, over 108.00 samples.], 2024-10-20 21:27:26,309 INFO [train.py:561] (1/4) Epoch 426, batch 10, global_batch_idx: 6810, batch size: 111, loss[dur_loss=0.3475, prior_loss=1.008, diff_loss=0.3812, tot_loss=1.737, over 111.00 samples.], tot_loss[dur_loss=0.3343, prior_loss=1.007, diff_loss=0.4688, 
tot_loss=1.81, over 1656.00 samples.], 2024-10-20 21:27:33,397 INFO [train.py:682] (1/4) Start epoch 427 2024-10-20 21:27:46,940 INFO [train.py:561] (1/4) Epoch 427, batch 4, global_batch_idx: 6820, batch size: 189, loss[dur_loss=0.3305, prior_loss=1.008, diff_loss=0.4227, tot_loss=1.761, over 189.00 samples.], tot_loss[dur_loss=0.3297, prior_loss=1.006, diff_loss=0.5076, tot_loss=1.844, over 937.00 samples.], 2024-10-20 21:28:01,688 INFO [train.py:561] (1/4) Epoch 427, batch 14, global_batch_idx: 6830, batch size: 142, loss[dur_loss=0.3423, prior_loss=1.007, diff_loss=0.4114, tot_loss=1.761, over 142.00 samples.], tot_loss[dur_loss=0.3358, prior_loss=1.007, diff_loss=0.4466, tot_loss=1.789, over 2210.00 samples.], 2024-10-20 21:28:03,129 INFO [train.py:682] (1/4) Start epoch 428 2024-10-20 21:28:22,807 INFO [train.py:561] (1/4) Epoch 428, batch 8, global_batch_idx: 6840, batch size: 170, loss[dur_loss=0.3398, prior_loss=1.007, diff_loss=0.4291, tot_loss=1.776, over 170.00 samples.], tot_loss[dur_loss=0.3319, prior_loss=1.006, diff_loss=0.4795, tot_loss=1.818, over 1432.00 samples.], 2024-10-20 21:28:32,937 INFO [train.py:682] (1/4) Start epoch 429 2024-10-20 21:28:44,355 INFO [train.py:561] (1/4) Epoch 429, batch 2, global_batch_idx: 6850, batch size: 203, loss[dur_loss=0.3348, prior_loss=1.007, diff_loss=0.4396, tot_loss=1.781, over 203.00 samples.], tot_loss[dur_loss=0.3384, prior_loss=1.007, diff_loss=0.402, tot_loss=1.748, over 442.00 samples.], 2024-10-20 21:28:58,534 INFO [train.py:561] (1/4) Epoch 429, batch 12, global_batch_idx: 6860, batch size: 152, loss[dur_loss=0.3423, prior_loss=1.007, diff_loss=0.4107, tot_loss=1.76, over 152.00 samples.], tot_loss[dur_loss=0.3348, prior_loss=1.007, diff_loss=0.4557, tot_loss=1.797, over 1966.00 samples.], 2024-10-20 21:29:02,972 INFO [train.py:682] (1/4) Start epoch 430 2024-10-20 21:29:19,879 INFO [train.py:561] (1/4) Epoch 430, batch 6, global_batch_idx: 6870, batch size: 106, loss[dur_loss=0.3348, prior_loss=1.007, diff_loss=0.4011, tot_loss=1.743, over 106.00 samples.], tot_loss[dur_loss=0.3288, prior_loss=1.006, diff_loss=0.4978, tot_loss=1.833, over 1142.00 samples.], 2024-10-20 21:29:32,934 INFO [train.py:682] (1/4) Start epoch 431 2024-10-20 21:29:42,586 INFO [train.py:561] (1/4) Epoch 431, batch 0, global_batch_idx: 6880, batch size: 108, loss[dur_loss=0.3459, prior_loss=1.008, diff_loss=0.3771, tot_loss=1.73, over 108.00 samples.], tot_loss[dur_loss=0.3459, prior_loss=1.008, diff_loss=0.3771, tot_loss=1.73, over 108.00 samples.], 2024-10-20 21:29:56,861 INFO [train.py:561] (1/4) Epoch 431, batch 10, global_batch_idx: 6890, batch size: 111, loss[dur_loss=0.3443, prior_loss=1.008, diff_loss=0.3867, tot_loss=1.739, over 111.00 samples.], tot_loss[dur_loss=0.3319, prior_loss=1.006, diff_loss=0.4576, tot_loss=1.796, over 1656.00 samples.], 2024-10-20 21:30:04,012 INFO [train.py:682] (1/4) Start epoch 432 2024-10-20 21:30:17,682 INFO [train.py:561] (1/4) Epoch 432, batch 4, global_batch_idx: 6900, batch size: 189, loss[dur_loss=0.3281, prior_loss=1.007, diff_loss=0.4357, tot_loss=1.771, over 189.00 samples.], tot_loss[dur_loss=0.3275, prior_loss=1.006, diff_loss=0.5128, tot_loss=1.846, over 937.00 samples.], 2024-10-20 21:30:32,517 INFO [train.py:561] (1/4) Epoch 432, batch 14, global_batch_idx: 6910, batch size: 142, loss[dur_loss=0.3391, prior_loss=1.007, diff_loss=0.3673, tot_loss=1.713, over 142.00 samples.], tot_loss[dur_loss=0.3333, prior_loss=1.007, diff_loss=0.4389, tot_loss=1.779, over 2210.00 samples.], 2024-10-20 21:30:33,952 
INFO [train.py:682] (1/4) Start epoch 433 2024-10-20 21:30:54,110 INFO [train.py:561] (1/4) Epoch 433, batch 8, global_batch_idx: 6920, batch size: 170, loss[dur_loss=0.3326, prior_loss=1.006, diff_loss=0.3925, tot_loss=1.731, over 170.00 samples.], tot_loss[dur_loss=0.3299, prior_loss=1.006, diff_loss=0.4762, tot_loss=1.812, over 1432.00 samples.], 2024-10-20 21:31:04,278 INFO [train.py:682] (1/4) Start epoch 434 2024-10-20 21:31:15,525 INFO [train.py:561] (1/4) Epoch 434, batch 2, global_batch_idx: 6930, batch size: 203, loss[dur_loss=0.3313, prior_loss=1.007, diff_loss=0.4126, tot_loss=1.75, over 203.00 samples.], tot_loss[dur_loss=0.3343, prior_loss=1.007, diff_loss=0.4027, tot_loss=1.744, over 442.00 samples.], 2024-10-20 21:31:29,665 INFO [train.py:561] (1/4) Epoch 434, batch 12, global_batch_idx: 6940, batch size: 152, loss[dur_loss=0.3391, prior_loss=1.006, diff_loss=0.389, tot_loss=1.734, over 152.00 samples.], tot_loss[dur_loss=0.3323, prior_loss=1.006, diff_loss=0.4486, tot_loss=1.787, over 1966.00 samples.], 2024-10-20 21:31:34,129 INFO [train.py:682] (1/4) Start epoch 435 2024-10-20 21:31:51,045 INFO [train.py:561] (1/4) Epoch 435, batch 6, global_batch_idx: 6950, batch size: 106, loss[dur_loss=0.3383, prior_loss=1.007, diff_loss=0.4251, tot_loss=1.77, over 106.00 samples.], tot_loss[dur_loss=0.3303, prior_loss=1.006, diff_loss=0.4922, tot_loss=1.829, over 1142.00 samples.], 2024-10-20 21:32:04,084 INFO [train.py:682] (1/4) Start epoch 436 2024-10-20 21:32:12,880 INFO [train.py:561] (1/4) Epoch 436, batch 0, global_batch_idx: 6960, batch size: 108, loss[dur_loss=0.3402, prior_loss=1.007, diff_loss=0.419, tot_loss=1.767, over 108.00 samples.], tot_loss[dur_loss=0.3402, prior_loss=1.007, diff_loss=0.419, tot_loss=1.767, over 108.00 samples.], 2024-10-20 21:32:27,029 INFO [train.py:561] (1/4) Epoch 436, batch 10, global_batch_idx: 6970, batch size: 111, loss[dur_loss=0.339, prior_loss=1.008, diff_loss=0.3978, tot_loss=1.745, over 111.00 samples.], tot_loss[dur_loss=0.3302, prior_loss=1.006, diff_loss=0.4655, tot_loss=1.802, over 1656.00 samples.], 2024-10-20 21:32:34,099 INFO [train.py:682] (1/4) Start epoch 437 2024-10-20 21:32:47,732 INFO [train.py:561] (1/4) Epoch 437, batch 4, global_batch_idx: 6980, batch size: 189, loss[dur_loss=0.3289, prior_loss=1.007, diff_loss=0.4444, tot_loss=1.78, over 189.00 samples.], tot_loss[dur_loss=0.3268, prior_loss=1.006, diff_loss=0.5117, tot_loss=1.844, over 937.00 samples.], 2024-10-20 21:33:02,446 INFO [train.py:561] (1/4) Epoch 437, batch 14, global_batch_idx: 6990, batch size: 142, loss[dur_loss=0.3378, prior_loss=1.006, diff_loss=0.343, tot_loss=1.687, over 142.00 samples.], tot_loss[dur_loss=0.3328, prior_loss=1.006, diff_loss=0.4413, tot_loss=1.78, over 2210.00 samples.], 2024-10-20 21:33:03,861 INFO [train.py:682] (1/4) Start epoch 438 2024-10-20 21:33:23,620 INFO [train.py:561] (1/4) Epoch 438, batch 8, global_batch_idx: 7000, batch size: 170, loss[dur_loss=0.3354, prior_loss=1.007, diff_loss=0.3895, tot_loss=1.731, over 170.00 samples.], tot_loss[dur_loss=0.3291, prior_loss=1.006, diff_loss=0.4646, tot_loss=1.8, over 1432.00 samples.], 2024-10-20 21:33:33,797 INFO [train.py:682] (1/4) Start epoch 439 2024-10-20 21:33:45,251 INFO [train.py:561] (1/4) Epoch 439, batch 2, global_batch_idx: 7010, batch size: 203, loss[dur_loss=0.3342, prior_loss=1.006, diff_loss=0.4251, tot_loss=1.766, over 203.00 samples.], tot_loss[dur_loss=0.3362, prior_loss=1.007, diff_loss=0.3934, tot_loss=1.736, over 442.00 samples.], 2024-10-20 21:33:59,398 INFO 
[train.py:561] (1/4) Epoch 439, batch 12, global_batch_idx: 7020, batch size: 152, loss[dur_loss=0.333, prior_loss=1.006, diff_loss=0.3847, tot_loss=1.724, over 152.00 samples.], tot_loss[dur_loss=0.3313, prior_loss=1.006, diff_loss=0.4561, tot_loss=1.794, over 1966.00 samples.], 2024-10-20 21:34:03,826 INFO [train.py:682] (1/4) Start epoch 440 2024-10-20 21:34:20,969 INFO [train.py:561] (1/4) Epoch 440, batch 6, global_batch_idx: 7030, batch size: 106, loss[dur_loss=0.332, prior_loss=1.006, diff_loss=0.4071, tot_loss=1.745, over 106.00 samples.], tot_loss[dur_loss=0.3272, prior_loss=1.006, diff_loss=0.4849, tot_loss=1.818, over 1142.00 samples.], 2024-10-20 21:34:34,013 INFO [train.py:682] (1/4) Start epoch 441 2024-10-20 21:34:42,763 INFO [train.py:561] (1/4) Epoch 441, batch 0, global_batch_idx: 7040, batch size: 108, loss[dur_loss=0.3472, prior_loss=1.007, diff_loss=0.3874, tot_loss=1.742, over 108.00 samples.], tot_loss[dur_loss=0.3472, prior_loss=1.007, diff_loss=0.3874, tot_loss=1.742, over 108.00 samples.], 2024-10-20 21:34:57,174 INFO [train.py:561] (1/4) Epoch 441, batch 10, global_batch_idx: 7050, batch size: 111, loss[dur_loss=0.3396, prior_loss=1.008, diff_loss=0.3627, tot_loss=1.71, over 111.00 samples.], tot_loss[dur_loss=0.3295, prior_loss=1.006, diff_loss=0.4522, tot_loss=1.788, over 1656.00 samples.], 2024-10-20 21:35:04,310 INFO [train.py:682] (1/4) Start epoch 442 2024-10-20 21:35:18,377 INFO [train.py:561] (1/4) Epoch 442, batch 4, global_batch_idx: 7060, batch size: 189, loss[dur_loss=0.3268, prior_loss=1.007, diff_loss=0.4056, tot_loss=1.739, over 189.00 samples.], tot_loss[dur_loss=0.3258, prior_loss=1.006, diff_loss=0.503, tot_loss=1.834, over 937.00 samples.], 2024-10-20 21:35:33,212 INFO [train.py:561] (1/4) Epoch 442, batch 14, global_batch_idx: 7070, batch size: 142, loss[dur_loss=0.336, prior_loss=1.006, diff_loss=0.3916, tot_loss=1.734, over 142.00 samples.], tot_loss[dur_loss=0.3315, prior_loss=1.006, diff_loss=0.4418, tot_loss=1.779, over 2210.00 samples.], 2024-10-20 21:35:34,642 INFO [train.py:682] (1/4) Start epoch 443 2024-10-20 21:35:54,648 INFO [train.py:561] (1/4) Epoch 443, batch 8, global_batch_idx: 7080, batch size: 170, loss[dur_loss=0.3398, prior_loss=1.006, diff_loss=0.3926, tot_loss=1.739, over 170.00 samples.], tot_loss[dur_loss=0.33, prior_loss=1.006, diff_loss=0.4709, tot_loss=1.807, over 1432.00 samples.], 2024-10-20 21:36:04,793 INFO [train.py:682] (1/4) Start epoch 444 2024-10-20 21:36:16,296 INFO [train.py:561] (1/4) Epoch 444, batch 2, global_batch_idx: 7090, batch size: 203, loss[dur_loss=0.3306, prior_loss=1.006, diff_loss=0.3905, tot_loss=1.727, over 203.00 samples.], tot_loss[dur_loss=0.3339, prior_loss=1.006, diff_loss=0.3953, tot_loss=1.736, over 442.00 samples.], 2024-10-20 21:36:30,446 INFO [train.py:561] (1/4) Epoch 444, batch 12, global_batch_idx: 7100, batch size: 152, loss[dur_loss=0.3349, prior_loss=1.006, diff_loss=0.3598, tot_loss=1.701, over 152.00 samples.], tot_loss[dur_loss=0.33, prior_loss=1.006, diff_loss=0.4413, tot_loss=1.777, over 1966.00 samples.], 2024-10-20 21:36:34,868 INFO [train.py:682] (1/4) Start epoch 445 2024-10-20 21:36:51,637 INFO [train.py:561] (1/4) Epoch 445, batch 6, global_batch_idx: 7110, batch size: 106, loss[dur_loss=0.3304, prior_loss=1.006, diff_loss=0.3504, tot_loss=1.687, over 106.00 samples.], tot_loss[dur_loss=0.3256, prior_loss=1.006, diff_loss=0.4842, tot_loss=1.815, over 1142.00 samples.], 2024-10-20 21:37:04,577 INFO [train.py:682] (1/4) Start epoch 446 2024-10-20 21:37:13,156 INFO 
[train.py:561] (1/4) Epoch 446, batch 0, global_batch_idx: 7120, batch size: 108, loss[dur_loss=0.3464, prior_loss=1.007, diff_loss=0.3652, tot_loss=1.719, over 108.00 samples.], tot_loss[dur_loss=0.3464, prior_loss=1.007, diff_loss=0.3652, tot_loss=1.719, over 108.00 samples.], 2024-10-20 21:37:27,243 INFO [train.py:561] (1/4) Epoch 446, batch 10, global_batch_idx: 7130, batch size: 111, loss[dur_loss=0.3415, prior_loss=1.008, diff_loss=0.3862, tot_loss=1.736, over 111.00 samples.], tot_loss[dur_loss=0.3303, prior_loss=1.006, diff_loss=0.4577, tot_loss=1.794, over 1656.00 samples.], 2024-10-20 21:37:34,294 INFO [train.py:682] (1/4) Start epoch 447 2024-10-20 21:37:47,873 INFO [train.py:561] (1/4) Epoch 447, batch 4, global_batch_idx: 7140, batch size: 189, loss[dur_loss=0.3328, prior_loss=1.007, diff_loss=0.4129, tot_loss=1.753, over 189.00 samples.], tot_loss[dur_loss=0.326, prior_loss=1.006, diff_loss=0.5073, tot_loss=1.839, over 937.00 samples.], 2024-10-20 21:38:02,580 INFO [train.py:561] (1/4) Epoch 447, batch 14, global_batch_idx: 7150, batch size: 142, loss[dur_loss=0.3402, prior_loss=1.006, diff_loss=0.3963, tot_loss=1.742, over 142.00 samples.], tot_loss[dur_loss=0.333, prior_loss=1.006, diff_loss=0.4376, tot_loss=1.777, over 2210.00 samples.], 2024-10-20 21:38:03,994 INFO [train.py:682] (1/4) Start epoch 448 2024-10-20 21:38:24,045 INFO [train.py:561] (1/4) Epoch 448, batch 8, global_batch_idx: 7160, batch size: 170, loss[dur_loss=0.3339, prior_loss=1.006, diff_loss=0.3824, tot_loss=1.722, over 170.00 samples.], tot_loss[dur_loss=0.3285, prior_loss=1.006, diff_loss=0.4769, tot_loss=1.811, over 1432.00 samples.], 2024-10-20 21:38:34,055 INFO [train.py:682] (1/4) Start epoch 449 2024-10-20 21:38:45,350 INFO [train.py:561] (1/4) Epoch 449, batch 2, global_batch_idx: 7170, batch size: 203, loss[dur_loss=0.3325, prior_loss=1.006, diff_loss=0.4135, tot_loss=1.752, over 203.00 samples.], tot_loss[dur_loss=0.3367, prior_loss=1.007, diff_loss=0.3887, tot_loss=1.732, over 442.00 samples.], 2024-10-20 21:38:59,469 INFO [train.py:561] (1/4) Epoch 449, batch 12, global_batch_idx: 7180, batch size: 152, loss[dur_loss=0.3333, prior_loss=1.006, diff_loss=0.3803, tot_loss=1.719, over 152.00 samples.], tot_loss[dur_loss=0.3302, prior_loss=1.006, diff_loss=0.4485, tot_loss=1.785, over 1966.00 samples.], 2024-10-20 21:39:03,885 INFO [train.py:682] (1/4) Start epoch 450 2024-10-20 21:39:20,871 INFO [train.py:561] (1/4) Epoch 450, batch 6, global_batch_idx: 7190, batch size: 106, loss[dur_loss=0.3281, prior_loss=1.006, diff_loss=0.4292, tot_loss=1.763, over 106.00 samples.], tot_loss[dur_loss=0.3233, prior_loss=1.005, diff_loss=0.4882, tot_loss=1.817, over 1142.00 samples.], 2024-10-20 21:39:33,782 INFO [train.py:682] (1/4) Start epoch 451 2024-10-20 21:39:42,502 INFO [train.py:561] (1/4) Epoch 451, batch 0, global_batch_idx: 7200, batch size: 108, loss[dur_loss=0.3452, prior_loss=1.007, diff_loss=0.3471, tot_loss=1.699, over 108.00 samples.], tot_loss[dur_loss=0.3452, prior_loss=1.007, diff_loss=0.3471, tot_loss=1.699, over 108.00 samples.], 2024-10-20 21:39:56,539 INFO [train.py:561] (1/4) Epoch 451, batch 10, global_batch_idx: 7210, batch size: 111, loss[dur_loss=0.3387, prior_loss=1.007, diff_loss=0.3621, tot_loss=1.708, over 111.00 samples.], tot_loss[dur_loss=0.3292, prior_loss=1.006, diff_loss=0.4564, tot_loss=1.791, over 1656.00 samples.], 2024-10-20 21:40:03,524 INFO [train.py:682] (1/4) Start epoch 452 2024-10-20 21:40:17,264 INFO [train.py:561] (1/4) Epoch 452, batch 4, global_batch_idx: 
7220, batch size: 189, loss[dur_loss=0.3241, prior_loss=1.006, diff_loss=0.4099, tot_loss=1.74, over 189.00 samples.], tot_loss[dur_loss=0.3244, prior_loss=1.005, diff_loss=0.5152, tot_loss=1.845, over 937.00 samples.], 2024-10-20 21:40:31,903 INFO [train.py:561] (1/4) Epoch 452, batch 14, global_batch_idx: 7230, batch size: 142, loss[dur_loss=0.3327, prior_loss=1.006, diff_loss=0.4032, tot_loss=1.742, over 142.00 samples.], tot_loss[dur_loss=0.3296, prior_loss=1.006, diff_loss=0.4376, tot_loss=1.773, over 2210.00 samples.], 2024-10-20 21:40:33,313 INFO [train.py:682] (1/4) Start epoch 453 2024-10-20 21:40:53,007 INFO [train.py:561] (1/4) Epoch 453, batch 8, global_batch_idx: 7240, batch size: 170, loss[dur_loss=0.3358, prior_loss=1.006, diff_loss=0.4094, tot_loss=1.751, over 170.00 samples.], tot_loss[dur_loss=0.3256, prior_loss=1.005, diff_loss=0.4659, tot_loss=1.797, over 1432.00 samples.], 2024-10-20 21:41:03,028 INFO [train.py:682] (1/4) Start epoch 454 2024-10-20 21:41:14,165 INFO [train.py:561] (1/4) Epoch 454, batch 2, global_batch_idx: 7250, batch size: 203, loss[dur_loss=0.3252, prior_loss=1.006, diff_loss=0.4087, tot_loss=1.739, over 203.00 samples.], tot_loss[dur_loss=0.332, prior_loss=1.006, diff_loss=0.3972, tot_loss=1.735, over 442.00 samples.], 2024-10-20 21:41:28,163 INFO [train.py:561] (1/4) Epoch 454, batch 12, global_batch_idx: 7260, batch size: 152, loss[dur_loss=0.3365, prior_loss=1.005, diff_loss=0.3497, tot_loss=1.692, over 152.00 samples.], tot_loss[dur_loss=0.3279, prior_loss=1.005, diff_loss=0.4396, tot_loss=1.773, over 1966.00 samples.], 2024-10-20 21:41:32,606 INFO [train.py:682] (1/4) Start epoch 455 2024-10-20 21:41:49,221 INFO [train.py:561] (1/4) Epoch 455, batch 6, global_batch_idx: 7270, batch size: 106, loss[dur_loss=0.3312, prior_loss=1.006, diff_loss=0.3739, tot_loss=1.711, over 106.00 samples.], tot_loss[dur_loss=0.3236, prior_loss=1.005, diff_loss=0.4947, tot_loss=1.823, over 1142.00 samples.], 2024-10-20 21:42:02,104 INFO [train.py:682] (1/4) Start epoch 456 2024-10-20 21:42:10,505 INFO [train.py:561] (1/4) Epoch 456, batch 0, global_batch_idx: 7280, batch size: 108, loss[dur_loss=0.3403, prior_loss=1.006, diff_loss=0.3934, tot_loss=1.74, over 108.00 samples.], tot_loss[dur_loss=0.3403, prior_loss=1.006, diff_loss=0.3934, tot_loss=1.74, over 108.00 samples.], 2024-10-20 21:42:24,559 INFO [train.py:561] (1/4) Epoch 456, batch 10, global_batch_idx: 7290, batch size: 111, loss[dur_loss=0.3389, prior_loss=1.007, diff_loss=0.3678, tot_loss=1.714, over 111.00 samples.], tot_loss[dur_loss=0.3276, prior_loss=1.005, diff_loss=0.451, tot_loss=1.784, over 1656.00 samples.], 2024-10-20 21:42:31,605 INFO [train.py:682] (1/4) Start epoch 457 2024-10-20 21:42:45,439 INFO [train.py:561] (1/4) Epoch 457, batch 4, global_batch_idx: 7300, batch size: 189, loss[dur_loss=0.3244, prior_loss=1.006, diff_loss=0.4184, tot_loss=1.749, over 189.00 samples.], tot_loss[dur_loss=0.3228, prior_loss=1.005, diff_loss=0.5177, tot_loss=1.845, over 937.00 samples.], 2024-10-20 21:43:00,052 INFO [train.py:561] (1/4) Epoch 457, batch 14, global_batch_idx: 7310, batch size: 142, loss[dur_loss=0.3354, prior_loss=1.006, diff_loss=0.4103, tot_loss=1.751, over 142.00 samples.], tot_loss[dur_loss=0.3278, prior_loss=1.005, diff_loss=0.4477, tot_loss=1.781, over 2210.00 samples.], 2024-10-20 21:43:01,454 INFO [train.py:682] (1/4) Start epoch 458 2024-10-20 21:43:20,995 INFO [train.py:561] (1/4) Epoch 458, batch 8, global_batch_idx: 7320, batch size: 170, loss[dur_loss=0.3327, prior_loss=1.005, 
diff_loss=0.3921, tot_loss=1.73, over 170.00 samples.], tot_loss[dur_loss=0.325, prior_loss=1.005, diff_loss=0.4596, tot_loss=1.79, over 1432.00 samples.], 2024-10-20 21:43:31,088 INFO [train.py:682] (1/4) Start epoch 459 2024-10-20 21:43:42,420 INFO [train.py:561] (1/4) Epoch 459, batch 2, global_batch_idx: 7330, batch size: 203, loss[dur_loss=0.3298, prior_loss=1.005, diff_loss=0.4165, tot_loss=1.752, over 203.00 samples.], tot_loss[dur_loss=0.3317, prior_loss=1.006, diff_loss=0.404, tot_loss=1.741, over 442.00 samples.], 2024-10-20 21:43:56,516 INFO [train.py:561] (1/4) Epoch 459, batch 12, global_batch_idx: 7340, batch size: 152, loss[dur_loss=0.3318, prior_loss=1.005, diff_loss=0.3915, tot_loss=1.728, over 152.00 samples.], tot_loss[dur_loss=0.3271, prior_loss=1.005, diff_loss=0.4484, tot_loss=1.781, over 1966.00 samples.], 2024-10-20 21:44:00,950 INFO [train.py:682] (1/4) Start epoch 460 2024-10-20 21:44:17,685 INFO [train.py:561] (1/4) Epoch 460, batch 6, global_batch_idx: 7350, batch size: 106, loss[dur_loss=0.3289, prior_loss=1.005, diff_loss=0.386, tot_loss=1.72, over 106.00 samples.], tot_loss[dur_loss=0.3221, prior_loss=1.005, diff_loss=0.4851, tot_loss=1.812, over 1142.00 samples.], 2024-10-20 21:44:30,670 INFO [train.py:682] (1/4) Start epoch 461 2024-10-20 21:44:39,308 INFO [train.py:561] (1/4) Epoch 461, batch 0, global_batch_idx: 7360, batch size: 108, loss[dur_loss=0.3398, prior_loss=1.006, diff_loss=0.3869, tot_loss=1.733, over 108.00 samples.], tot_loss[dur_loss=0.3398, prior_loss=1.006, diff_loss=0.3869, tot_loss=1.733, over 108.00 samples.], 2024-10-20 21:44:53,438 INFO [train.py:561] (1/4) Epoch 461, batch 10, global_batch_idx: 7370, batch size: 111, loss[dur_loss=0.3387, prior_loss=1.007, diff_loss=0.3891, tot_loss=1.735, over 111.00 samples.], tot_loss[dur_loss=0.3255, prior_loss=1.005, diff_loss=0.4626, tot_loss=1.793, over 1656.00 samples.], 2024-10-20 21:45:00,485 INFO [train.py:682] (1/4) Start epoch 462 2024-10-20 21:45:14,076 INFO [train.py:561] (1/4) Epoch 462, batch 4, global_batch_idx: 7380, batch size: 189, loss[dur_loss=0.3208, prior_loss=1.006, diff_loss=0.4079, tot_loss=1.734, over 189.00 samples.], tot_loss[dur_loss=0.3215, prior_loss=1.005, diff_loss=0.5035, tot_loss=1.83, over 937.00 samples.], 2024-10-20 21:45:28,793 INFO [train.py:561] (1/4) Epoch 462, batch 14, global_batch_idx: 7390, batch size: 142, loss[dur_loss=0.3346, prior_loss=1.005, diff_loss=0.3658, tot_loss=1.706, over 142.00 samples.], tot_loss[dur_loss=0.3267, prior_loss=1.005, diff_loss=0.4387, tot_loss=1.771, over 2210.00 samples.], 2024-10-20 21:45:30,208 INFO [train.py:682] (1/4) Start epoch 463 2024-10-20 21:45:49,936 INFO [train.py:561] (1/4) Epoch 463, batch 8, global_batch_idx: 7400, batch size: 170, loss[dur_loss=0.3321, prior_loss=1.005, diff_loss=0.3887, tot_loss=1.726, over 170.00 samples.], tot_loss[dur_loss=0.3248, prior_loss=1.005, diff_loss=0.4665, tot_loss=1.796, over 1432.00 samples.], 2024-10-20 21:45:59,927 INFO [train.py:682] (1/4) Start epoch 464 2024-10-20 21:46:11,511 INFO [train.py:561] (1/4) Epoch 464, batch 2, global_batch_idx: 7410, batch size: 203, loss[dur_loss=0.3242, prior_loss=1.005, diff_loss=0.4261, tot_loss=1.755, over 203.00 samples.], tot_loss[dur_loss=0.3296, prior_loss=1.005, diff_loss=0.401, tot_loss=1.736, over 442.00 samples.], 2024-10-20 21:46:25,537 INFO [train.py:561] (1/4) Epoch 464, batch 12, global_batch_idx: 7420, batch size: 152, loss[dur_loss=0.3315, prior_loss=1.005, diff_loss=0.4127, tot_loss=1.749, over 152.00 samples.], 
tot_loss[dur_loss=0.3265, prior_loss=1.005, diff_loss=0.4463, tot_loss=1.778, over 1966.00 samples.], 2024-10-20 21:46:29,947 INFO [train.py:682] (1/4) Start epoch 465 2024-10-20 21:46:46,878 INFO [train.py:561] (1/4) Epoch 465, batch 6, global_batch_idx: 7430, batch size: 106, loss[dur_loss=0.3243, prior_loss=1.005, diff_loss=0.3808, tot_loss=1.71, over 106.00 samples.], tot_loss[dur_loss=0.3217, prior_loss=1.005, diff_loss=0.5006, tot_loss=1.827, over 1142.00 samples.], 2024-10-20 21:46:59,768 INFO [train.py:682] (1/4) Start epoch 466 2024-10-20 21:47:08,292 INFO [train.py:561] (1/4) Epoch 466, batch 0, global_batch_idx: 7440, batch size: 108, loss[dur_loss=0.3372, prior_loss=1.006, diff_loss=0.3543, tot_loss=1.698, over 108.00 samples.], tot_loss[dur_loss=0.3372, prior_loss=1.006, diff_loss=0.3543, tot_loss=1.698, over 108.00 samples.], 2024-10-20 21:47:22,266 INFO [train.py:561] (1/4) Epoch 466, batch 10, global_batch_idx: 7450, batch size: 111, loss[dur_loss=0.3366, prior_loss=1.007, diff_loss=0.3713, tot_loss=1.715, over 111.00 samples.], tot_loss[dur_loss=0.3246, prior_loss=1.005, diff_loss=0.4458, tot_loss=1.775, over 1656.00 samples.], 2024-10-20 21:47:29,290 INFO [train.py:682] (1/4) Start epoch 467 2024-10-20 21:47:42,722 INFO [train.py:561] (1/4) Epoch 467, batch 4, global_batch_idx: 7460, batch size: 189, loss[dur_loss=0.3228, prior_loss=1.006, diff_loss=0.4313, tot_loss=1.76, over 189.00 samples.], tot_loss[dur_loss=0.3211, prior_loss=1.004, diff_loss=0.5222, tot_loss=1.848, over 937.00 samples.], 2024-10-20 21:47:57,505 INFO [train.py:561] (1/4) Epoch 467, batch 14, global_batch_idx: 7470, batch size: 142, loss[dur_loss=0.3337, prior_loss=1.005, diff_loss=0.3826, tot_loss=1.722, over 142.00 samples.], tot_loss[dur_loss=0.3269, prior_loss=1.005, diff_loss=0.4461, tot_loss=1.778, over 2210.00 samples.], 2024-10-20 21:47:58,924 INFO [train.py:682] (1/4) Start epoch 468 2024-10-20 21:48:18,948 INFO [train.py:561] (1/4) Epoch 468, batch 8, global_batch_idx: 7480, batch size: 170, loss[dur_loss=0.3309, prior_loss=1.005, diff_loss=0.4438, tot_loss=1.78, over 170.00 samples.], tot_loss[dur_loss=0.3231, prior_loss=1.005, diff_loss=0.4721, tot_loss=1.8, over 1432.00 samples.], 2024-10-20 21:48:29,039 INFO [train.py:682] (1/4) Start epoch 469 2024-10-20 21:48:40,260 INFO [train.py:561] (1/4) Epoch 469, batch 2, global_batch_idx: 7490, batch size: 203, loss[dur_loss=0.3229, prior_loss=1.005, diff_loss=0.4077, tot_loss=1.736, over 203.00 samples.], tot_loss[dur_loss=0.3278, prior_loss=1.005, diff_loss=0.3936, tot_loss=1.727, over 442.00 samples.], 2024-10-20 21:48:54,320 INFO [train.py:561] (1/4) Epoch 469, batch 12, global_batch_idx: 7500, batch size: 152, loss[dur_loss=0.3275, prior_loss=1.005, diff_loss=0.3977, tot_loss=1.73, over 152.00 samples.], tot_loss[dur_loss=0.325, prior_loss=1.005, diff_loss=0.4463, tot_loss=1.776, over 1966.00 samples.], 2024-10-20 21:48:55,926 INFO [train.py:579] (1/4) Computing validation loss 2024-10-20 21:49:36,462 INFO [train.py:589] (1/4) Epoch 469, validation: dur_loss=0.425, prior_loss=1.027, diff_loss=0.3621, tot_loss=1.814, over 100.00 samples. 
2024-10-20 21:49:36,463 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-20 21:49:39,321 INFO [train.py:682] (1/4) Start epoch 470 2024-10-20 21:49:56,551 INFO [train.py:561] (1/4) Epoch 470, batch 6, global_batch_idx: 7510, batch size: 106, loss[dur_loss=0.3269, prior_loss=1.005, diff_loss=0.3943, tot_loss=1.726, over 106.00 samples.], tot_loss[dur_loss=0.3226, prior_loss=1.005, diff_loss=0.472, tot_loss=1.799, over 1142.00 samples.], 2024-10-20 21:50:09,523 INFO [train.py:682] (1/4) Start epoch 471 2024-10-20 21:50:18,340 INFO [train.py:561] (1/4) Epoch 471, batch 0, global_batch_idx: 7520, batch size: 108, loss[dur_loss=0.3365, prior_loss=1.006, diff_loss=0.395, tot_loss=1.737, over 108.00 samples.], tot_loss[dur_loss=0.3365, prior_loss=1.006, diff_loss=0.395, tot_loss=1.737, over 108.00 samples.], 2024-10-20 21:50:32,392 INFO [train.py:561] (1/4) Epoch 471, batch 10, global_batch_idx: 7530, batch size: 111, loss[dur_loss=0.3408, prior_loss=1.007, diff_loss=0.3504, tot_loss=1.698, over 111.00 samples.], tot_loss[dur_loss=0.3258, prior_loss=1.005, diff_loss=0.4585, tot_loss=1.789, over 1656.00 samples.], 2024-10-20 21:50:39,362 INFO [train.py:682] (1/4) Start epoch 472 2024-10-20 21:50:53,512 INFO [train.py:561] (1/4) Epoch 472, batch 4, global_batch_idx: 7540, batch size: 189, loss[dur_loss=0.3254, prior_loss=1.007, diff_loss=0.4285, tot_loss=1.76, over 189.00 samples.], tot_loss[dur_loss=0.323, prior_loss=1.005, diff_loss=0.5029, tot_loss=1.831, over 937.00 samples.], 2024-10-20 21:51:08,092 INFO [train.py:561] (1/4) Epoch 472, batch 14, global_batch_idx: 7550, batch size: 142, loss[dur_loss=0.3311, prior_loss=1.005, diff_loss=0.3856, tot_loss=1.722, over 142.00 samples.], tot_loss[dur_loss=0.3278, prior_loss=1.005, diff_loss=0.4362, tot_loss=1.769, over 2210.00 samples.], 2024-10-20 21:51:09,491 INFO [train.py:682] (1/4) Start epoch 473 2024-10-20 21:51:29,279 INFO [train.py:561] (1/4) Epoch 473, batch 8, global_batch_idx: 7560, batch size: 170, loss[dur_loss=0.3285, prior_loss=1.005, diff_loss=0.431, tot_loss=1.764, over 170.00 samples.], tot_loss[dur_loss=0.3226, prior_loss=1.005, diff_loss=0.4647, tot_loss=1.792, over 1432.00 samples.], 2024-10-20 21:51:39,232 INFO [train.py:682] (1/4) Start epoch 474 2024-10-20 21:51:50,502 INFO [train.py:561] (1/4) Epoch 474, batch 2, global_batch_idx: 7570, batch size: 203, loss[dur_loss=0.3234, prior_loss=1.005, diff_loss=0.4164, tot_loss=1.745, over 203.00 samples.], tot_loss[dur_loss=0.3277, prior_loss=1.005, diff_loss=0.392, tot_loss=1.725, over 442.00 samples.], 2024-10-20 21:52:04,497 INFO [train.py:561] (1/4) Epoch 474, batch 12, global_batch_idx: 7580, batch size: 152, loss[dur_loss=0.3291, prior_loss=1.005, diff_loss=0.4107, tot_loss=1.744, over 152.00 samples.], tot_loss[dur_loss=0.3245, prior_loss=1.005, diff_loss=0.4428, tot_loss=1.772, over 1966.00 samples.], 2024-10-20 21:52:08,881 INFO [train.py:682] (1/4) Start epoch 475 2024-10-20 21:52:25,664 INFO [train.py:561] (1/4) Epoch 475, batch 6, global_batch_idx: 7590, batch size: 106, loss[dur_loss=0.3234, prior_loss=1.005, diff_loss=0.3847, tot_loss=1.713, over 106.00 samples.], tot_loss[dur_loss=0.3206, prior_loss=1.004, diff_loss=0.4922, tot_loss=1.817, over 1142.00 samples.], 2024-10-20 21:52:38,590 INFO [train.py:682] (1/4) Start epoch 476 2024-10-20 21:52:47,428 INFO [train.py:561] (1/4) Epoch 476, batch 0, global_batch_idx: 7600, batch size: 108, loss[dur_loss=0.3419, prior_loss=1.006, diff_loss=0.3492, tot_loss=1.697, over 108.00 samples.], 
tot_loss[dur_loss=0.3419, prior_loss=1.006, diff_loss=0.3492, tot_loss=1.697, over 108.00 samples.], 2024-10-20 21:53:01,496 INFO [train.py:561] (1/4) Epoch 476, batch 10, global_batch_idx: 7610, batch size: 111, loss[dur_loss=0.3333, prior_loss=1.006, diff_loss=0.3726, tot_loss=1.712, over 111.00 samples.], tot_loss[dur_loss=0.323, prior_loss=1.005, diff_loss=0.4534, tot_loss=1.781, over 1656.00 samples.], 2024-10-20 21:53:08,492 INFO [train.py:682] (1/4) Start epoch 477 2024-10-20 21:53:22,096 INFO [train.py:561] (1/4) Epoch 477, batch 4, global_batch_idx: 7620, batch size: 189, loss[dur_loss=0.3186, prior_loss=1.005, diff_loss=0.4464, tot_loss=1.77, over 189.00 samples.], tot_loss[dur_loss=0.3176, prior_loss=1.004, diff_loss=0.5204, tot_loss=1.842, over 937.00 samples.], 2024-10-20 21:53:36,842 INFO [train.py:561] (1/4) Epoch 477, batch 14, global_batch_idx: 7630, batch size: 142, loss[dur_loss=0.3286, prior_loss=1.005, diff_loss=0.4002, tot_loss=1.733, over 142.00 samples.], tot_loss[dur_loss=0.3235, prior_loss=1.004, diff_loss=0.4474, tot_loss=1.775, over 2210.00 samples.], 2024-10-20 21:53:38,254 INFO [train.py:682] (1/4) Start epoch 478 2024-10-20 21:53:57,850 INFO [train.py:561] (1/4) Epoch 478, batch 8, global_batch_idx: 7640, batch size: 170, loss[dur_loss=0.3275, prior_loss=1.005, diff_loss=0.3754, tot_loss=1.707, over 170.00 samples.], tot_loss[dur_loss=0.323, prior_loss=1.004, diff_loss=0.467, tot_loss=1.794, over 1432.00 samples.], 2024-10-20 21:54:07,788 INFO [train.py:682] (1/4) Start epoch 479 2024-10-20 21:54:18,848 INFO [train.py:561] (1/4) Epoch 479, batch 2, global_batch_idx: 7650, batch size: 203, loss[dur_loss=0.3227, prior_loss=1.004, diff_loss=0.4093, tot_loss=1.736, over 203.00 samples.], tot_loss[dur_loss=0.3272, prior_loss=1.005, diff_loss=0.3795, tot_loss=1.712, over 442.00 samples.], 2024-10-20 21:54:32,866 INFO [train.py:561] (1/4) Epoch 479, batch 12, global_batch_idx: 7660, batch size: 152, loss[dur_loss=0.3282, prior_loss=1.004, diff_loss=0.3859, tot_loss=1.718, over 152.00 samples.], tot_loss[dur_loss=0.3236, prior_loss=1.004, diff_loss=0.4347, tot_loss=1.763, over 1966.00 samples.], 2024-10-20 21:54:37,239 INFO [train.py:682] (1/4) Start epoch 480 2024-10-20 21:54:54,541 INFO [train.py:561] (1/4) Epoch 480, batch 6, global_batch_idx: 7670, batch size: 106, loss[dur_loss=0.3255, prior_loss=1.005, diff_loss=0.3532, tot_loss=1.684, over 106.00 samples.], tot_loss[dur_loss=0.321, prior_loss=1.004, diff_loss=0.4713, tot_loss=1.797, over 1142.00 samples.], 2024-10-20 21:55:07,423 INFO [train.py:682] (1/4) Start epoch 481 2024-10-20 21:55:15,914 INFO [train.py:561] (1/4) Epoch 481, batch 0, global_batch_idx: 7680, batch size: 108, loss[dur_loss=0.3409, prior_loss=1.006, diff_loss=0.3708, tot_loss=1.718, over 108.00 samples.], tot_loss[dur_loss=0.3409, prior_loss=1.006, diff_loss=0.3708, tot_loss=1.718, over 108.00 samples.], 2024-10-20 21:55:29,923 INFO [train.py:561] (1/4) Epoch 481, batch 10, global_batch_idx: 7690, batch size: 111, loss[dur_loss=0.3369, prior_loss=1.006, diff_loss=0.409, tot_loss=1.752, over 111.00 samples.], tot_loss[dur_loss=0.3228, prior_loss=1.004, diff_loss=0.4647, tot_loss=1.792, over 1656.00 samples.], 2024-10-20 21:55:36,886 INFO [train.py:682] (1/4) Start epoch 482 2024-10-20 21:55:50,351 INFO [train.py:561] (1/4) Epoch 482, batch 4, global_batch_idx: 7700, batch size: 189, loss[dur_loss=0.3206, prior_loss=1.005, diff_loss=0.3804, tot_loss=1.706, over 189.00 samples.], tot_loss[dur_loss=0.3182, prior_loss=1.004, diff_loss=0.4912, 
tot_loss=1.813, over 937.00 samples.], 2024-10-20 21:56:05,011 INFO [train.py:561] (1/4) Epoch 482, batch 14, global_batch_idx: 7710, batch size: 142, loss[dur_loss=0.3316, prior_loss=1.005, diff_loss=0.363, tot_loss=1.699, over 142.00 samples.], tot_loss[dur_loss=0.324, prior_loss=1.004, diff_loss=0.4298, tot_loss=1.758, over 2210.00 samples.], 2024-10-20 21:56:06,420 INFO [train.py:682] (1/4) Start epoch 483 2024-10-20 21:56:26,359 INFO [train.py:561] (1/4) Epoch 483, batch 8, global_batch_idx: 7720, batch size: 170, loss[dur_loss=0.3257, prior_loss=1.004, diff_loss=0.4182, tot_loss=1.748, over 170.00 samples.], tot_loss[dur_loss=0.3201, prior_loss=1.004, diff_loss=0.4611, tot_loss=1.785, over 1432.00 samples.], 2024-10-20 21:56:36,392 INFO [train.py:682] (1/4) Start epoch 484 2024-10-20 21:56:47,872 INFO [train.py:561] (1/4) Epoch 484, batch 2, global_batch_idx: 7730, batch size: 203, loss[dur_loss=0.3221, prior_loss=1.004, diff_loss=0.4014, tot_loss=1.728, over 203.00 samples.], tot_loss[dur_loss=0.3257, prior_loss=1.005, diff_loss=0.3877, tot_loss=1.718, over 442.00 samples.], 2024-10-20 21:57:02,043 INFO [train.py:561] (1/4) Epoch 484, batch 12, global_batch_idx: 7740, batch size: 152, loss[dur_loss=0.3258, prior_loss=1.004, diff_loss=0.401, tot_loss=1.731, over 152.00 samples.], tot_loss[dur_loss=0.3217, prior_loss=1.004, diff_loss=0.443, tot_loss=1.769, over 1966.00 samples.], 2024-10-20 21:57:06,422 INFO [train.py:682] (1/4) Start epoch 485 2024-10-20 21:57:23,363 INFO [train.py:561] (1/4) Epoch 485, batch 6, global_batch_idx: 7750, batch size: 106, loss[dur_loss=0.3231, prior_loss=1.005, diff_loss=0.3457, tot_loss=1.673, over 106.00 samples.], tot_loss[dur_loss=0.3174, prior_loss=1.004, diff_loss=0.4925, tot_loss=1.814, over 1142.00 samples.], 2024-10-20 21:57:36,329 INFO [train.py:682] (1/4) Start epoch 486 2024-10-20 21:57:44,895 INFO [train.py:561] (1/4) Epoch 486, batch 0, global_batch_idx: 7760, batch size: 108, loss[dur_loss=0.3326, prior_loss=1.005, diff_loss=0.3928, tot_loss=1.731, over 108.00 samples.], tot_loss[dur_loss=0.3326, prior_loss=1.005, diff_loss=0.3928, tot_loss=1.731, over 108.00 samples.], 2024-10-20 21:57:59,002 INFO [train.py:561] (1/4) Epoch 486, batch 10, global_batch_idx: 7770, batch size: 111, loss[dur_loss=0.3332, prior_loss=1.006, diff_loss=0.38, tot_loss=1.72, over 111.00 samples.], tot_loss[dur_loss=0.3201, prior_loss=1.004, diff_loss=0.4543, tot_loss=1.779, over 1656.00 samples.], 2024-10-20 21:58:06,004 INFO [train.py:682] (1/4) Start epoch 487 2024-10-20 21:58:19,966 INFO [train.py:561] (1/4) Epoch 487, batch 4, global_batch_idx: 7780, batch size: 189, loss[dur_loss=0.3241, prior_loss=1.005, diff_loss=0.4085, tot_loss=1.737, over 189.00 samples.], tot_loss[dur_loss=0.3185, prior_loss=1.004, diff_loss=0.5012, tot_loss=1.823, over 937.00 samples.], 2024-10-20 21:58:34,864 INFO [train.py:561] (1/4) Epoch 487, batch 14, global_batch_idx: 7790, batch size: 142, loss[dur_loss=0.3272, prior_loss=1.004, diff_loss=0.3708, tot_loss=1.702, over 142.00 samples.], tot_loss[dur_loss=0.3227, prior_loss=1.004, diff_loss=0.4347, tot_loss=1.762, over 2210.00 samples.], 2024-10-20 21:58:36,286 INFO [train.py:682] (1/4) Start epoch 488 2024-10-20 21:58:56,014 INFO [train.py:561] (1/4) Epoch 488, batch 8, global_batch_idx: 7800, batch size: 170, loss[dur_loss=0.3292, prior_loss=1.004, diff_loss=0.3764, tot_loss=1.71, over 170.00 samples.], tot_loss[dur_loss=0.319, prior_loss=1.004, diff_loss=0.4699, tot_loss=1.793, over 1432.00 samples.], 2024-10-20 21:59:06,158 
INFO [train.py:682] (1/4) Start epoch 489 2024-10-20 21:59:17,745 INFO [train.py:561] (1/4) Epoch 489, batch 2, global_batch_idx: 7810, batch size: 203, loss[dur_loss=0.3187, prior_loss=1.004, diff_loss=0.4083, tot_loss=1.731, over 203.00 samples.], tot_loss[dur_loss=0.3258, prior_loss=1.005, diff_loss=0.4103, tot_loss=1.741, over 442.00 samples.], 2024-10-20 21:59:31,932 INFO [train.py:561] (1/4) Epoch 489, batch 12, global_batch_idx: 7820, batch size: 152, loss[dur_loss=0.3262, prior_loss=1.004, diff_loss=0.3863, tot_loss=1.717, over 152.00 samples.], tot_loss[dur_loss=0.321, prior_loss=1.004, diff_loss=0.445, tot_loss=1.77, over 1966.00 samples.], 2024-10-20 21:59:36,359 INFO [train.py:682] (1/4) Start epoch 490 2024-10-20 21:59:53,369 INFO [train.py:561] (1/4) Epoch 490, batch 6, global_batch_idx: 7830, batch size: 106, loss[dur_loss=0.3282, prior_loss=1.004, diff_loss=0.3819, tot_loss=1.715, over 106.00 samples.], tot_loss[dur_loss=0.3186, prior_loss=1.004, diff_loss=0.4862, tot_loss=1.809, over 1142.00 samples.], 2024-10-20 22:00:06,427 INFO [train.py:682] (1/4) Start epoch 491 2024-10-20 22:00:14,987 INFO [train.py:561] (1/4) Epoch 491, batch 0, global_batch_idx: 7840, batch size: 108, loss[dur_loss=0.3334, prior_loss=1.005, diff_loss=0.3821, tot_loss=1.721, over 108.00 samples.], tot_loss[dur_loss=0.3334, prior_loss=1.005, diff_loss=0.3821, tot_loss=1.721, over 108.00 samples.], 2024-10-20 22:00:29,269 INFO [train.py:561] (1/4) Epoch 491, batch 10, global_batch_idx: 7850, batch size: 111, loss[dur_loss=0.3338, prior_loss=1.006, diff_loss=0.3528, tot_loss=1.692, over 111.00 samples.], tot_loss[dur_loss=0.3203, prior_loss=1.004, diff_loss=0.4512, tot_loss=1.775, over 1656.00 samples.], 2024-10-20 22:00:36,380 INFO [train.py:682] (1/4) Start epoch 492 2024-10-20 22:00:49,961 INFO [train.py:561] (1/4) Epoch 492, batch 4, global_batch_idx: 7860, batch size: 189, loss[dur_loss=0.32, prior_loss=1.004, diff_loss=0.4216, tot_loss=1.746, over 189.00 samples.], tot_loss[dur_loss=0.3154, prior_loss=1.003, diff_loss=0.4988, tot_loss=1.818, over 937.00 samples.], 2024-10-20 22:01:04,801 INFO [train.py:561] (1/4) Epoch 492, batch 14, global_batch_idx: 7870, batch size: 142, loss[dur_loss=0.3296, prior_loss=1.004, diff_loss=0.3946, tot_loss=1.729, over 142.00 samples.], tot_loss[dur_loss=0.321, prior_loss=1.004, diff_loss=0.4346, tot_loss=1.76, over 2210.00 samples.], 2024-10-20 22:01:06,217 INFO [train.py:682] (1/4) Start epoch 493 2024-10-20 22:01:25,959 INFO [train.py:561] (1/4) Epoch 493, batch 8, global_batch_idx: 7880, batch size: 170, loss[dur_loss=0.3291, prior_loss=1.005, diff_loss=0.406, tot_loss=1.74, over 170.00 samples.], tot_loss[dur_loss=0.3199, prior_loss=1.004, diff_loss=0.4502, tot_loss=1.774, over 1432.00 samples.], 2024-10-20 22:01:36,097 INFO [train.py:682] (1/4) Start epoch 494 2024-10-20 22:01:47,521 INFO [train.py:561] (1/4) Epoch 494, batch 2, global_batch_idx: 7890, batch size: 203, loss[dur_loss=0.3183, prior_loss=1.004, diff_loss=0.424, tot_loss=1.746, over 203.00 samples.], tot_loss[dur_loss=0.3227, prior_loss=1.004, diff_loss=0.403, tot_loss=1.73, over 442.00 samples.], 2024-10-20 22:02:01,685 INFO [train.py:561] (1/4) Epoch 494, batch 12, global_batch_idx: 7900, batch size: 152, loss[dur_loss=0.3241, prior_loss=1.004, diff_loss=0.4298, tot_loss=1.758, over 152.00 samples.], tot_loss[dur_loss=0.3192, prior_loss=1.004, diff_loss=0.4484, tot_loss=1.772, over 1966.00 samples.], 2024-10-20 22:02:06,115 INFO [train.py:682] (1/4) Start epoch 495 2024-10-20 22:02:23,022 INFO 
[train.py:561] (1/4) Epoch 495, batch 6, global_batch_idx: 7910, batch size: 106, loss[dur_loss=0.321, prior_loss=1.004, diff_loss=0.3865, tot_loss=1.712, over 106.00 samples.], tot_loss[dur_loss=0.3175, prior_loss=1.003, diff_loss=0.4798, tot_loss=1.801, over 1142.00 samples.], 2024-10-20 22:02:36,061 INFO [train.py:682] (1/4) Start epoch 496 2024-10-20 22:02:44,726 INFO [train.py:561] (1/4) Epoch 496, batch 0, global_batch_idx: 7920, batch size: 108, loss[dur_loss=0.3358, prior_loss=1.005, diff_loss=0.3848, tot_loss=1.726, over 108.00 samples.], tot_loss[dur_loss=0.3358, prior_loss=1.005, diff_loss=0.3848, tot_loss=1.726, over 108.00 samples.], 2024-10-20 22:02:59,059 INFO [train.py:561] (1/4) Epoch 496, batch 10, global_batch_idx: 7930, batch size: 111, loss[dur_loss=0.3313, prior_loss=1.006, diff_loss=0.3691, tot_loss=1.706, over 111.00 samples.], tot_loss[dur_loss=0.3204, prior_loss=1.004, diff_loss=0.4515, tot_loss=1.776, over 1656.00 samples.], 2024-10-20 22:03:06,114 INFO [train.py:682] (1/4) Start epoch 497 2024-10-20 22:03:20,047 INFO [train.py:561] (1/4) Epoch 497, batch 4, global_batch_idx: 7940, batch size: 189, loss[dur_loss=0.3163, prior_loss=1.004, diff_loss=0.4056, tot_loss=1.726, over 189.00 samples.], tot_loss[dur_loss=0.3138, prior_loss=1.003, diff_loss=0.5029, tot_loss=1.82, over 937.00 samples.], 2024-10-20 22:03:34,927 INFO [train.py:561] (1/4) Epoch 497, batch 14, global_batch_idx: 7950, batch size: 142, loss[dur_loss=0.3311, prior_loss=1.004, diff_loss=0.41, tot_loss=1.745, over 142.00 samples.], tot_loss[dur_loss=0.3205, prior_loss=1.004, diff_loss=0.4376, tot_loss=1.762, over 2210.00 samples.], 2024-10-20 22:03:36,350 INFO [train.py:682] (1/4) Start epoch 498 2024-10-20 22:03:56,183 INFO [train.py:561] (1/4) Epoch 498, batch 8, global_batch_idx: 7960, batch size: 170, loss[dur_loss=0.3267, prior_loss=1.004, diff_loss=0.3885, tot_loss=1.719, over 170.00 samples.], tot_loss[dur_loss=0.3194, prior_loss=1.004, diff_loss=0.4648, tot_loss=1.788, over 1432.00 samples.], 2024-10-20 22:04:06,281 INFO [train.py:682] (1/4) Start epoch 499 2024-10-20 22:04:17,538 INFO [train.py:561] (1/4) Epoch 499, batch 2, global_batch_idx: 7970, batch size: 203, loss[dur_loss=0.3243, prior_loss=1.004, diff_loss=0.4104, tot_loss=1.739, over 203.00 samples.], tot_loss[dur_loss=0.326, prior_loss=1.004, diff_loss=0.4014, tot_loss=1.732, over 442.00 samples.], 2024-10-20 22:04:31,695 INFO [train.py:561] (1/4) Epoch 499, batch 12, global_batch_idx: 7980, batch size: 152, loss[dur_loss=0.3231, prior_loss=1.004, diff_loss=0.3739, tot_loss=1.701, over 152.00 samples.], tot_loss[dur_loss=0.3215, prior_loss=1.004, diff_loss=0.4422, tot_loss=1.768, over 1966.00 samples.], 2024-10-20 22:04:36,079 INFO [train.py:682] (1/4) Start epoch 500 2024-10-20 22:04:53,137 INFO [train.py:561] (1/4) Epoch 500, batch 6, global_batch_idx: 7990, batch size: 106, loss[dur_loss=0.3237, prior_loss=1.004, diff_loss=0.3934, tot_loss=1.721, over 106.00 samples.], tot_loss[dur_loss=0.3168, prior_loss=1.003, diff_loss=0.478, tot_loss=1.798, over 1142.00 samples.], 2024-10-20 22:05:06,147 INFO [train.py:682] (1/4) Start epoch 501 2024-10-20 22:05:14,763 INFO [train.py:561] (1/4) Epoch 501, batch 0, global_batch_idx: 8000, batch size: 108, loss[dur_loss=0.3309, prior_loss=1.005, diff_loss=0.4025, tot_loss=1.738, over 108.00 samples.], tot_loss[dur_loss=0.3309, prior_loss=1.005, diff_loss=0.4025, tot_loss=1.738, over 108.00 samples.], 2024-10-20 22:05:28,810 INFO [train.py:561] (1/4) Epoch 501, batch 10, global_batch_idx: 8010, 
batch size: 111, loss[dur_loss=0.3385, prior_loss=1.006, diff_loss=0.3707, tot_loss=1.716, over 111.00 samples.], tot_loss[dur_loss=0.3209, prior_loss=1.004, diff_loss=0.4512, tot_loss=1.776, over 1656.00 samples.], 2024-10-20 22:05:35,822 INFO [train.py:682] (1/4) Start epoch 502 2024-10-20 22:05:49,307 INFO [train.py:561] (1/4) Epoch 502, batch 4, global_batch_idx: 8020, batch size: 189, loss[dur_loss=0.3229, prior_loss=1.004, diff_loss=0.4512, tot_loss=1.779, over 189.00 samples.], tot_loss[dur_loss=0.3181, prior_loss=1.003, diff_loss=0.5122, tot_loss=1.834, over 937.00 samples.], 2024-10-20 22:06:03,927 INFO [train.py:561] (1/4) Epoch 502, batch 14, global_batch_idx: 8030, batch size: 142, loss[dur_loss=0.3256, prior_loss=1.004, diff_loss=0.4043, tot_loss=1.734, over 142.00 samples.], tot_loss[dur_loss=0.3218, prior_loss=1.004, diff_loss=0.4429, tot_loss=1.769, over 2210.00 samples.], 2024-10-20 22:06:05,339 INFO [train.py:682] (1/4) Start epoch 503 2024-10-20 22:06:24,908 INFO [train.py:561] (1/4) Epoch 503, batch 8, global_batch_idx: 8040, batch size: 170, loss[dur_loss=0.3247, prior_loss=1.004, diff_loss=0.409, tot_loss=1.738, over 170.00 samples.], tot_loss[dur_loss=0.3188, prior_loss=1.003, diff_loss=0.4686, tot_loss=1.791, over 1432.00 samples.], 2024-10-20 22:06:34,938 INFO [train.py:682] (1/4) Start epoch 504 2024-10-20 22:06:45,975 INFO [train.py:561] (1/4) Epoch 504, batch 2, global_batch_idx: 8050, batch size: 203, loss[dur_loss=0.3197, prior_loss=1.004, diff_loss=0.4112, tot_loss=1.734, over 203.00 samples.], tot_loss[dur_loss=0.3249, prior_loss=1.004, diff_loss=0.3948, tot_loss=1.724, over 442.00 samples.], 2024-10-20 22:07:00,027 INFO [train.py:561] (1/4) Epoch 504, batch 12, global_batch_idx: 8060, batch size: 152, loss[dur_loss=0.3236, prior_loss=1.003, diff_loss=0.3637, tot_loss=1.691, over 152.00 samples.], tot_loss[dur_loss=0.3195, prior_loss=1.004, diff_loss=0.4383, tot_loss=1.761, over 1966.00 samples.], 2024-10-20 22:07:04,390 INFO [train.py:682] (1/4) Start epoch 505 2024-10-20 22:07:21,242 INFO [train.py:561] (1/4) Epoch 505, batch 6, global_batch_idx: 8070, batch size: 106, loss[dur_loss=0.327, prior_loss=1.004, diff_loss=0.3785, tot_loss=1.71, over 106.00 samples.], tot_loss[dur_loss=0.315, prior_loss=1.003, diff_loss=0.4824, tot_loss=1.801, over 1142.00 samples.], 2024-10-20 22:07:34,143 INFO [train.py:682] (1/4) Start epoch 506 2024-10-20 22:07:42,718 INFO [train.py:561] (1/4) Epoch 506, batch 0, global_batch_idx: 8080, batch size: 108, loss[dur_loss=0.3356, prior_loss=1.005, diff_loss=0.3959, tot_loss=1.736, over 108.00 samples.], tot_loss[dur_loss=0.3356, prior_loss=1.005, diff_loss=0.3959, tot_loss=1.736, over 108.00 samples.], 2024-10-20 22:07:56,756 INFO [train.py:561] (1/4) Epoch 506, batch 10, global_batch_idx: 8090, batch size: 111, loss[dur_loss=0.3324, prior_loss=1.006, diff_loss=0.3391, tot_loss=1.677, over 111.00 samples.], tot_loss[dur_loss=0.3173, prior_loss=1.004, diff_loss=0.4457, tot_loss=1.767, over 1656.00 samples.], 2024-10-20 22:08:03,763 INFO [train.py:682] (1/4) Start epoch 507 2024-10-20 22:08:17,156 INFO [train.py:561] (1/4) Epoch 507, batch 4, global_batch_idx: 8100, batch size: 189, loss[dur_loss=0.3214, prior_loss=1.004, diff_loss=0.4191, tot_loss=1.744, over 189.00 samples.], tot_loss[dur_loss=0.3161, prior_loss=1.003, diff_loss=0.4933, tot_loss=1.812, over 937.00 samples.], 2024-10-20 22:08:31,838 INFO [train.py:561] (1/4) Epoch 507, batch 14, global_batch_idx: 8110, batch size: 142, loss[dur_loss=0.3229, prior_loss=1.003, 
diff_loss=0.375, tot_loss=1.701, over 142.00 samples.], tot_loss[dur_loss=0.3198, prior_loss=1.003, diff_loss=0.4302, tot_loss=1.753, over 2210.00 samples.], 2024-10-20 22:08:33,244 INFO [train.py:682] (1/4) Start epoch 508 2024-10-20 22:08:52,625 INFO [train.py:561] (1/4) Epoch 508, batch 8, global_batch_idx: 8120, batch size: 170, loss[dur_loss=0.3214, prior_loss=1.003, diff_loss=0.388, tot_loss=1.713, over 170.00 samples.], tot_loss[dur_loss=0.3166, prior_loss=1.003, diff_loss=0.4684, tot_loss=1.788, over 1432.00 samples.], 2024-10-20 22:09:02,600 INFO [train.py:682] (1/4) Start epoch 509 2024-10-20 22:09:13,875 INFO [train.py:561] (1/4) Epoch 509, batch 2, global_batch_idx: 8130, batch size: 203, loss[dur_loss=0.3223, prior_loss=1.003, diff_loss=0.3862, tot_loss=1.712, over 203.00 samples.], tot_loss[dur_loss=0.3244, prior_loss=1.004, diff_loss=0.3775, tot_loss=1.706, over 442.00 samples.], 2024-10-20 22:09:27,940 INFO [train.py:561] (1/4) Epoch 509, batch 12, global_batch_idx: 8140, batch size: 152, loss[dur_loss=0.3197, prior_loss=1.003, diff_loss=0.3997, tot_loss=1.722, over 152.00 samples.], tot_loss[dur_loss=0.3179, prior_loss=1.003, diff_loss=0.4254, tot_loss=1.747, over 1966.00 samples.], 2024-10-20 22:09:32,335 INFO [train.py:682] (1/4) Start epoch 510 2024-10-20 22:09:48,951 INFO [train.py:561] (1/4) Epoch 510, batch 6, global_batch_idx: 8150, batch size: 106, loss[dur_loss=0.3191, prior_loss=1.004, diff_loss=0.3423, tot_loss=1.665, over 106.00 samples.], tot_loss[dur_loss=0.3151, prior_loss=1.003, diff_loss=0.4705, tot_loss=1.788, over 1142.00 samples.], 2024-10-20 22:10:01,878 INFO [train.py:682] (1/4) Start epoch 511 2024-10-20 22:10:10,487 INFO [train.py:561] (1/4) Epoch 511, batch 0, global_batch_idx: 8160, batch size: 108, loss[dur_loss=0.3316, prior_loss=1.004, diff_loss=0.3776, tot_loss=1.713, over 108.00 samples.], tot_loss[dur_loss=0.3316, prior_loss=1.004, diff_loss=0.3776, tot_loss=1.713, over 108.00 samples.], 2024-10-20 22:10:24,491 INFO [train.py:561] (1/4) Epoch 511, batch 10, global_batch_idx: 8170, batch size: 111, loss[dur_loss=0.3285, prior_loss=1.005, diff_loss=0.376, tot_loss=1.71, over 111.00 samples.], tot_loss[dur_loss=0.3191, prior_loss=1.004, diff_loss=0.4594, tot_loss=1.782, over 1656.00 samples.], 2024-10-20 22:10:31,546 INFO [train.py:682] (1/4) Start epoch 512 2024-10-20 22:10:44,980 INFO [train.py:561] (1/4) Epoch 512, batch 4, global_batch_idx: 8180, batch size: 189, loss[dur_loss=0.3161, prior_loss=1.004, diff_loss=0.3924, tot_loss=1.713, over 189.00 samples.], tot_loss[dur_loss=0.3152, prior_loss=1.003, diff_loss=0.5193, tot_loss=1.838, over 937.00 samples.], 2024-10-20 22:10:59,648 INFO [train.py:561] (1/4) Epoch 512, batch 14, global_batch_idx: 8190, batch size: 142, loss[dur_loss=0.3255, prior_loss=1.004, diff_loss=0.3693, tot_loss=1.698, over 142.00 samples.], tot_loss[dur_loss=0.3186, prior_loss=1.003, diff_loss=0.4409, tot_loss=1.763, over 2210.00 samples.], 2024-10-20 22:11:01,052 INFO [train.py:682] (1/4) Start epoch 513 2024-10-20 22:11:20,763 INFO [train.py:561] (1/4) Epoch 513, batch 8, global_batch_idx: 8200, batch size: 170, loss[dur_loss=0.3176, prior_loss=1.004, diff_loss=0.4047, tot_loss=1.726, over 170.00 samples.], tot_loss[dur_loss=0.3169, prior_loss=1.003, diff_loss=0.4543, tot_loss=1.774, over 1432.00 samples.], 2024-10-20 22:11:30,747 INFO [train.py:682] (1/4) Start epoch 514 2024-10-20 22:11:41,935 INFO [train.py:561] (1/4) Epoch 514, batch 2, global_batch_idx: 8210, batch size: 203, loss[dur_loss=0.3182, 
prior_loss=1.003, diff_loss=0.4151, tot_loss=1.737, over 203.00 samples.], tot_loss[dur_loss=0.323, prior_loss=1.004, diff_loss=0.3964, tot_loss=1.723, over 442.00 samples.], 2024-10-20 22:11:55,989 INFO [train.py:561] (1/4) Epoch 514, batch 12, global_batch_idx: 8220, batch size: 152, loss[dur_loss=0.3235, prior_loss=1.003, diff_loss=0.42, tot_loss=1.746, over 152.00 samples.], tot_loss[dur_loss=0.3185, prior_loss=1.003, diff_loss=0.4473, tot_loss=1.769, over 1966.00 samples.], 2024-10-20 22:12:00,366 INFO [train.py:682] (1/4) Start epoch 515 2024-10-20 22:12:17,088 INFO [train.py:561] (1/4) Epoch 515, batch 6, global_batch_idx: 8230, batch size: 106, loss[dur_loss=0.3178, prior_loss=1.003, diff_loss=0.3818, tot_loss=1.703, over 106.00 samples.], tot_loss[dur_loss=0.3147, prior_loss=1.003, diff_loss=0.4821, tot_loss=1.8, over 1142.00 samples.], 2024-10-20 22:12:29,990 INFO [train.py:682] (1/4) Start epoch 516 2024-10-20 22:12:38,368 INFO [train.py:561] (1/4) Epoch 516, batch 0, global_batch_idx: 8240, batch size: 108, loss[dur_loss=0.3356, prior_loss=1.004, diff_loss=0.3695, tot_loss=1.71, over 108.00 samples.], tot_loss[dur_loss=0.3356, prior_loss=1.004, diff_loss=0.3695, tot_loss=1.71, over 108.00 samples.], 2024-10-20 22:12:52,423 INFO [train.py:561] (1/4) Epoch 516, batch 10, global_batch_idx: 8250, batch size: 111, loss[dur_loss=0.3309, prior_loss=1.005, diff_loss=0.3871, tot_loss=1.724, over 111.00 samples.], tot_loss[dur_loss=0.3191, prior_loss=1.003, diff_loss=0.4494, tot_loss=1.772, over 1656.00 samples.], 2024-10-20 22:12:59,480 INFO [train.py:682] (1/4) Start epoch 517 2024-10-20 22:13:13,074 INFO [train.py:561] (1/4) Epoch 517, batch 4, global_batch_idx: 8260, batch size: 189, loss[dur_loss=0.3201, prior_loss=1.004, diff_loss=0.4529, tot_loss=1.777, over 189.00 samples.], tot_loss[dur_loss=0.3135, prior_loss=1.003, diff_loss=0.5167, tot_loss=1.833, over 937.00 samples.], 2024-10-20 22:13:27,694 INFO [train.py:561] (1/4) Epoch 517, batch 14, global_batch_idx: 8270, batch size: 142, loss[dur_loss=0.3243, prior_loss=1.004, diff_loss=0.4012, tot_loss=1.729, over 142.00 samples.], tot_loss[dur_loss=0.3186, prior_loss=1.003, diff_loss=0.4401, tot_loss=1.762, over 2210.00 samples.], 2024-10-20 22:13:29,107 INFO [train.py:682] (1/4) Start epoch 518 2024-10-20 22:13:48,747 INFO [train.py:561] (1/4) Epoch 518, batch 8, global_batch_idx: 8280, batch size: 170, loss[dur_loss=0.322, prior_loss=1.003, diff_loss=0.3647, tot_loss=1.69, over 170.00 samples.], tot_loss[dur_loss=0.3157, prior_loss=1.003, diff_loss=0.4536, tot_loss=1.772, over 1432.00 samples.], 2024-10-20 22:13:58,738 INFO [train.py:682] (1/4) Start epoch 519 2024-10-20 22:14:09,855 INFO [train.py:561] (1/4) Epoch 519, batch 2, global_batch_idx: 8290, batch size: 203, loss[dur_loss=0.3185, prior_loss=1.003, diff_loss=0.4287, tot_loss=1.751, over 203.00 samples.], tot_loss[dur_loss=0.3229, prior_loss=1.004, diff_loss=0.3958, tot_loss=1.722, over 442.00 samples.], 2024-10-20 22:14:23,886 INFO [train.py:561] (1/4) Epoch 519, batch 12, global_batch_idx: 8300, batch size: 152, loss[dur_loss=0.32, prior_loss=1.003, diff_loss=0.3898, tot_loss=1.713, over 152.00 samples.], tot_loss[dur_loss=0.3182, prior_loss=1.003, diff_loss=0.4456, tot_loss=1.767, over 1966.00 samples.], 2024-10-20 22:14:28,289 INFO [train.py:682] (1/4) Start epoch 520 2024-10-20 22:14:45,032 INFO [train.py:561] (1/4) Epoch 520, batch 6, global_batch_idx: 8310, batch size: 106, loss[dur_loss=0.3163, prior_loss=1.003, diff_loss=0.3569, tot_loss=1.677, over 106.00 
samples.], tot_loss[dur_loss=0.3143, prior_loss=1.003, diff_loss=0.4796, tot_loss=1.797, over 1142.00 samples.], 2024-10-20 22:14:57,946 INFO [train.py:682] (1/4) Start epoch 521 2024-10-20 22:15:06,644 INFO [train.py:561] (1/4) Epoch 521, batch 0, global_batch_idx: 8320, batch size: 108, loss[dur_loss=0.3329, prior_loss=1.004, diff_loss=0.3751, tot_loss=1.712, over 108.00 samples.], tot_loss[dur_loss=0.3329, prior_loss=1.004, diff_loss=0.3751, tot_loss=1.712, over 108.00 samples.], 2024-10-20 22:15:20,620 INFO [train.py:561] (1/4) Epoch 521, batch 10, global_batch_idx: 8330, batch size: 111, loss[dur_loss=0.329, prior_loss=1.005, diff_loss=0.3861, tot_loss=1.72, over 111.00 samples.], tot_loss[dur_loss=0.3175, prior_loss=1.003, diff_loss=0.4476, tot_loss=1.768, over 1656.00 samples.], 2024-10-20 22:15:27,634 INFO [train.py:682] (1/4) Start epoch 522 2024-10-20 22:15:41,131 INFO [train.py:561] (1/4) Epoch 522, batch 4, global_batch_idx: 8340, batch size: 189, loss[dur_loss=0.3183, prior_loss=1.004, diff_loss=0.4027, tot_loss=1.725, over 189.00 samples.], tot_loss[dur_loss=0.3139, prior_loss=1.003, diff_loss=0.4962, tot_loss=1.813, over 937.00 samples.], 2024-10-20 22:15:55,952 INFO [train.py:561] (1/4) Epoch 522, batch 14, global_batch_idx: 8350, batch size: 142, loss[dur_loss=0.3282, prior_loss=1.004, diff_loss=0.3688, tot_loss=1.701, over 142.00 samples.], tot_loss[dur_loss=0.3183, prior_loss=1.003, diff_loss=0.4347, tot_loss=1.756, over 2210.00 samples.], 2024-10-20 22:15:57,406 INFO [train.py:682] (1/4) Start epoch 523 2024-10-20 22:16:17,402 INFO [train.py:561] (1/4) Epoch 523, batch 8, global_batch_idx: 8360, batch size: 170, loss[dur_loss=0.3198, prior_loss=1.003, diff_loss=0.3887, tot_loss=1.712, over 170.00 samples.], tot_loss[dur_loss=0.3169, prior_loss=1.003, diff_loss=0.4609, tot_loss=1.781, over 1432.00 samples.], 2024-10-20 22:16:27,522 INFO [train.py:682] (1/4) Start epoch 524 2024-10-20 22:16:38,950 INFO [train.py:561] (1/4) Epoch 524, batch 2, global_batch_idx: 8370, batch size: 203, loss[dur_loss=0.3231, prior_loss=1.003, diff_loss=0.3982, tot_loss=1.725, over 203.00 samples.], tot_loss[dur_loss=0.3247, prior_loss=1.004, diff_loss=0.4011, tot_loss=1.729, over 442.00 samples.], 2024-10-20 22:16:53,148 INFO [train.py:561] (1/4) Epoch 524, batch 12, global_batch_idx: 8380, batch size: 152, loss[dur_loss=0.3185, prior_loss=1.003, diff_loss=0.3625, tot_loss=1.684, over 152.00 samples.], tot_loss[dur_loss=0.3182, prior_loss=1.003, diff_loss=0.4314, tot_loss=1.753, over 1966.00 samples.], 2024-10-20 22:16:57,580 INFO [train.py:682] (1/4) Start epoch 525 2024-10-20 22:17:14,635 INFO [train.py:561] (1/4) Epoch 525, batch 6, global_batch_idx: 8390, batch size: 106, loss[dur_loss=0.3199, prior_loss=1.003, diff_loss=0.3616, tot_loss=1.685, over 106.00 samples.], tot_loss[dur_loss=0.3161, prior_loss=1.003, diff_loss=0.4649, tot_loss=1.784, over 1142.00 samples.], 2024-10-20 22:17:27,436 INFO [train.py:682] (1/4) Start epoch 526 2024-10-20 22:17:35,992 INFO [train.py:561] (1/4) Epoch 526, batch 0, global_batch_idx: 8400, batch size: 108, loss[dur_loss=0.3273, prior_loss=1.004, diff_loss=0.3413, tot_loss=1.673, over 108.00 samples.], tot_loss[dur_loss=0.3273, prior_loss=1.004, diff_loss=0.3413, tot_loss=1.673, over 108.00 samples.], 2024-10-20 22:17:49,991 INFO [train.py:561] (1/4) Epoch 526, batch 10, global_batch_idx: 8410, batch size: 111, loss[dur_loss=0.3321, prior_loss=1.004, diff_loss=0.3412, tot_loss=1.678, over 111.00 samples.], tot_loss[dur_loss=0.3177, prior_loss=1.003, 
diff_loss=0.4423, tot_loss=1.763, over 1656.00 samples.], 2024-10-20 22:17:56,938 INFO [train.py:682] (1/4) Start epoch 527 2024-10-20 22:18:10,461 INFO [train.py:561] (1/4) Epoch 527, batch 4, global_batch_idx: 8420, batch size: 189, loss[dur_loss=0.3204, prior_loss=1.005, diff_loss=0.3942, tot_loss=1.719, over 189.00 samples.], tot_loss[dur_loss=0.3143, prior_loss=1.003, diff_loss=0.4883, tot_loss=1.805, over 937.00 samples.], 2024-10-20 22:18:25,096 INFO [train.py:561] (1/4) Epoch 527, batch 14, global_batch_idx: 8430, batch size: 142, loss[dur_loss=0.3212, prior_loss=1.003, diff_loss=0.3451, tot_loss=1.67, over 142.00 samples.], tot_loss[dur_loss=0.318, prior_loss=1.003, diff_loss=0.4289, tot_loss=1.75, over 2210.00 samples.], 2024-10-20 22:18:26,506 INFO [train.py:682] (1/4) Start epoch 528 2024-10-20 22:18:46,348 INFO [train.py:561] (1/4) Epoch 528, batch 8, global_batch_idx: 8440, batch size: 170, loss[dur_loss=0.3197, prior_loss=1.003, diff_loss=0.4464, tot_loss=1.77, over 170.00 samples.], tot_loss[dur_loss=0.3169, prior_loss=1.003, diff_loss=0.4713, tot_loss=1.791, over 1432.00 samples.], 2024-10-20 22:18:56,376 INFO [train.py:682] (1/4) Start epoch 529 2024-10-20 22:19:07,893 INFO [train.py:561] (1/4) Epoch 529, batch 2, global_batch_idx: 8450, batch size: 203, loss[dur_loss=0.3228, prior_loss=1.003, diff_loss=0.4068, tot_loss=1.733, over 203.00 samples.], tot_loss[dur_loss=0.3252, prior_loss=1.004, diff_loss=0.3938, tot_loss=1.723, over 442.00 samples.], 2024-10-20 22:19:21,891 INFO [train.py:561] (1/4) Epoch 529, batch 12, global_batch_idx: 8460, batch size: 152, loss[dur_loss=0.3168, prior_loss=1.003, diff_loss=0.3761, tot_loss=1.696, over 152.00 samples.], tot_loss[dur_loss=0.3178, prior_loss=1.003, diff_loss=0.4355, tot_loss=1.756, over 1966.00 samples.], 2024-10-20 22:19:26,255 INFO [train.py:682] (1/4) Start epoch 530 2024-10-20 22:19:42,971 INFO [train.py:561] (1/4) Epoch 530, batch 6, global_batch_idx: 8470, batch size: 106, loss[dur_loss=0.3174, prior_loss=1.003, diff_loss=0.4108, tot_loss=1.731, over 106.00 samples.], tot_loss[dur_loss=0.3142, prior_loss=1.002, diff_loss=0.483, tot_loss=1.8, over 1142.00 samples.], 2024-10-20 22:19:55,811 INFO [train.py:682] (1/4) Start epoch 531 2024-10-20 22:20:04,420 INFO [train.py:561] (1/4) Epoch 531, batch 0, global_batch_idx: 8480, batch size: 108, loss[dur_loss=0.3355, prior_loss=1.004, diff_loss=0.3721, tot_loss=1.712, over 108.00 samples.], tot_loss[dur_loss=0.3355, prior_loss=1.004, diff_loss=0.3721, tot_loss=1.712, over 108.00 samples.], 2024-10-20 22:20:18,525 INFO [train.py:561] (1/4) Epoch 531, batch 10, global_batch_idx: 8490, batch size: 111, loss[dur_loss=0.3317, prior_loss=1.004, diff_loss=0.3743, tot_loss=1.71, over 111.00 samples.], tot_loss[dur_loss=0.3176, prior_loss=1.003, diff_loss=0.4422, tot_loss=1.763, over 1656.00 samples.], 2024-10-20 22:20:25,507 INFO [train.py:682] (1/4) Start epoch 532 2024-10-20 22:20:39,103 INFO [train.py:561] (1/4) Epoch 532, batch 4, global_batch_idx: 8500, batch size: 189, loss[dur_loss=0.3189, prior_loss=1.004, diff_loss=0.4212, tot_loss=1.744, over 189.00 samples.], tot_loss[dur_loss=0.3122, prior_loss=1.002, diff_loss=0.4952, tot_loss=1.81, over 937.00 samples.], 2024-10-20 22:20:53,734 INFO [train.py:561] (1/4) Epoch 532, batch 14, global_batch_idx: 8510, batch size: 142, loss[dur_loss=0.3248, prior_loss=1.003, diff_loss=0.388, tot_loss=1.716, over 142.00 samples.], tot_loss[dur_loss=0.3167, prior_loss=1.003, diff_loss=0.4292, tot_loss=1.749, over 2210.00 samples.], 2024-10-20 
22:20:55,150 INFO [train.py:682] (1/4) Start epoch 533 2024-10-20 22:21:14,619 INFO [train.py:561] (1/4) Epoch 533, batch 8, global_batch_idx: 8520, batch size: 170, loss[dur_loss=0.3163, prior_loss=1.003, diff_loss=0.4066, tot_loss=1.726, over 170.00 samples.], tot_loss[dur_loss=0.3138, prior_loss=1.002, diff_loss=0.4573, tot_loss=1.774, over 1432.00 samples.], 2024-10-20 22:21:24,571 INFO [train.py:682] (1/4) Start epoch 534 2024-10-20 22:21:35,883 INFO [train.py:561] (1/4) Epoch 534, batch 2, global_batch_idx: 8530, batch size: 203, loss[dur_loss=0.3186, prior_loss=1.003, diff_loss=0.4353, tot_loss=1.757, over 203.00 samples.], tot_loss[dur_loss=0.321, prior_loss=1.003, diff_loss=0.3976, tot_loss=1.722, over 442.00 samples.], 2024-10-20 22:21:49,946 INFO [train.py:561] (1/4) Epoch 534, batch 12, global_batch_idx: 8540, batch size: 152, loss[dur_loss=0.318, prior_loss=1.003, diff_loss=0.3509, tot_loss=1.672, over 152.00 samples.], tot_loss[dur_loss=0.3158, prior_loss=1.003, diff_loss=0.4259, tot_loss=1.744, over 1966.00 samples.], 2024-10-20 22:21:54,343 INFO [train.py:682] (1/4) Start epoch 535 2024-10-20 22:22:11,003 INFO [train.py:561] (1/4) Epoch 535, batch 6, global_batch_idx: 8550, batch size: 106, loss[dur_loss=0.3196, prior_loss=1.003, diff_loss=0.3482, tot_loss=1.671, over 106.00 samples.], tot_loss[dur_loss=0.3139, prior_loss=1.002, diff_loss=0.4745, tot_loss=1.791, over 1142.00 samples.], 2024-10-20 22:22:23,887 INFO [train.py:682] (1/4) Start epoch 536 2024-10-20 22:22:32,563 INFO [train.py:561] (1/4) Epoch 536, batch 0, global_batch_idx: 8560, batch size: 108, loss[dur_loss=0.3267, prior_loss=1.003, diff_loss=0.3418, tot_loss=1.672, over 108.00 samples.], tot_loss[dur_loss=0.3267, prior_loss=1.003, diff_loss=0.3418, tot_loss=1.672, over 108.00 samples.], 2024-10-20 22:22:46,661 INFO [train.py:561] (1/4) Epoch 536, batch 10, global_batch_idx: 8570, batch size: 111, loss[dur_loss=0.3257, prior_loss=1.004, diff_loss=0.386, tot_loss=1.716, over 111.00 samples.], tot_loss[dur_loss=0.3144, prior_loss=1.002, diff_loss=0.4341, tot_loss=1.751, over 1656.00 samples.], 2024-10-20 22:22:53,700 INFO [train.py:682] (1/4) Start epoch 537 2024-10-20 22:23:07,588 INFO [train.py:561] (1/4) Epoch 537, batch 4, global_batch_idx: 8580, batch size: 189, loss[dur_loss=0.3183, prior_loss=1.003, diff_loss=0.3742, tot_loss=1.696, over 189.00 samples.], tot_loss[dur_loss=0.3124, prior_loss=1.002, diff_loss=0.4854, tot_loss=1.8, over 937.00 samples.], 2024-10-20 22:23:22,596 INFO [train.py:561] (1/4) Epoch 537, batch 14, global_batch_idx: 8590, batch size: 142, loss[dur_loss=0.3218, prior_loss=1.003, diff_loss=0.3762, tot_loss=1.701, over 142.00 samples.], tot_loss[dur_loss=0.3155, prior_loss=1.002, diff_loss=0.4231, tot_loss=1.741, over 2210.00 samples.], 2024-10-20 22:23:24,017 INFO [train.py:682] (1/4) Start epoch 538 2024-10-20 22:23:44,072 INFO [train.py:561] (1/4) Epoch 538, batch 8, global_batch_idx: 8600, batch size: 170, loss[dur_loss=0.3166, prior_loss=1.003, diff_loss=0.3916, tot_loss=1.711, over 170.00 samples.], tot_loss[dur_loss=0.3135, prior_loss=1.002, diff_loss=0.4514, tot_loss=1.767, over 1432.00 samples.], 2024-10-20 22:23:54,108 INFO [train.py:682] (1/4) Start epoch 539 2024-10-20 22:24:05,565 INFO [train.py:561] (1/4) Epoch 539, batch 2, global_batch_idx: 8610, batch size: 203, loss[dur_loss=0.3191, prior_loss=1.003, diff_loss=0.3937, tot_loss=1.715, over 203.00 samples.], tot_loss[dur_loss=0.3211, prior_loss=1.003, diff_loss=0.3782, tot_loss=1.702, over 442.00 samples.], 
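Every training record in this stretch has the same shape: Epoch E, batch B, global_batch_idx: G, batch size: S, then a per-batch loss[...] tuple and a cumulative tot_loss[...] tuple, with one record logged every 10 global batches (8220, 8230, 8240, ... above). A minimal sketch for pulling the numbers out of a dump like this one, for plotting, say; the regexes and the parse_records helper are illustrative, not part of train.py:

```python
import re

# One record in this log reads:
#   Epoch 514, batch 12, global_batch_idx: 8220, batch size: 152,
#   loss[dur_loss=..., tot_loss=..., over 152.00 samples.],
#   tot_loss[dur_loss=..., tot_loss=..., over 1966.00 samples.],
RECORD = re.compile(
    r"Epoch (?P<epoch>\d+), batch (?P<batch>\d+), "
    r"global_batch_idx: (?P<gbi>\d+), batch size: \d+, "
    r"loss\[(?P<cur>[^\]]*)\], tot_loss\[(?P<tot>[^\]]*)\]"
)
VALUE = re.compile(r"([a-z_]+_loss)=([0-9.]+)")

def parse_records(text):
    """Yield (epoch, global_batch_idx, per-batch losses, cumulative losses)."""
    text = " ".join(text.split())  # collapse the hard wraps in this dump
    for m in RECORD.finditer(text):
        cur = {k: float(v) for k, v in VALUE.findall(m.group("cur"))}
        tot = {k: float(v) for k, v in VALUE.findall(m.group("tot"))}
        yield int(m.group("epoch")), int(m.group("gbi")), cur, tot
```

Fed this file, parse_records yields one tuple per logged batch, e.g. (514, 8220, {'dur_loss': 0.3235, ...}, {'dur_loss': 0.3185, ...}) for the Epoch 514, batch 12 record above.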
2024-10-20 22:24:19,627 INFO [train.py:561] (1/4) Epoch 539, batch 12, global_batch_idx: 8620, batch size: 152, loss[dur_loss=0.3194, prior_loss=1.003, diff_loss=0.3732, tot_loss=1.695, over 152.00 samples.], tot_loss[dur_loss=0.3156, prior_loss=1.003, diff_loss=0.4326, tot_loss=1.751, over 1966.00 samples.], 2024-10-20 22:24:24,054 INFO [train.py:682] (1/4) Start epoch 540 2024-10-20 22:24:40,895 INFO [train.py:561] (1/4) Epoch 540, batch 6, global_batch_idx: 8630, batch size: 106, loss[dur_loss=0.3189, prior_loss=1.003, diff_loss=0.3704, tot_loss=1.693, over 106.00 samples.], tot_loss[dur_loss=0.3124, prior_loss=1.002, diff_loss=0.4865, tot_loss=1.801, over 1142.00 samples.], 2024-10-20 22:24:53,947 INFO [train.py:682] (1/4) Start epoch 541 2024-10-20 22:25:02,510 INFO [train.py:561] (1/4) Epoch 541, batch 0, global_batch_idx: 8640, batch size: 108, loss[dur_loss=0.3285, prior_loss=1.003, diff_loss=0.3609, tot_loss=1.693, over 108.00 samples.], tot_loss[dur_loss=0.3285, prior_loss=1.003, diff_loss=0.3609, tot_loss=1.693, over 108.00 samples.], 2024-10-20 22:25:16,779 INFO [train.py:561] (1/4) Epoch 541, batch 10, global_batch_idx: 8650, batch size: 111, loss[dur_loss=0.3221, prior_loss=1.004, diff_loss=0.3485, tot_loss=1.675, over 111.00 samples.], tot_loss[dur_loss=0.3141, prior_loss=1.002, diff_loss=0.4408, tot_loss=1.757, over 1656.00 samples.], 2024-10-20 22:25:23,773 INFO [train.py:682] (1/4) Start epoch 542 2024-10-20 22:25:37,866 INFO [train.py:561] (1/4) Epoch 542, batch 4, global_batch_idx: 8660, batch size: 189, loss[dur_loss=0.3142, prior_loss=1.003, diff_loss=0.4045, tot_loss=1.722, over 189.00 samples.], tot_loss[dur_loss=0.3103, prior_loss=1.002, diff_loss=0.5006, tot_loss=1.813, over 937.00 samples.], 2024-10-20 22:25:52,581 INFO [train.py:561] (1/4) Epoch 542, batch 14, global_batch_idx: 8670, batch size: 142, loss[dur_loss=0.3191, prior_loss=1.002, diff_loss=0.3853, tot_loss=1.707, over 142.00 samples.], tot_loss[dur_loss=0.3142, prior_loss=1.002, diff_loss=0.4323, tot_loss=1.749, over 2210.00 samples.], 2024-10-20 22:25:53,991 INFO [train.py:682] (1/4) Start epoch 543 2024-10-20 22:26:13,871 INFO [train.py:561] (1/4) Epoch 543, batch 8, global_batch_idx: 8680, batch size: 170, loss[dur_loss=0.3182, prior_loss=1.002, diff_loss=0.4108, tot_loss=1.731, over 170.00 samples.], tot_loss[dur_loss=0.3135, prior_loss=1.002, diff_loss=0.4531, tot_loss=1.769, over 1432.00 samples.], 2024-10-20 22:26:23,909 INFO [train.py:682] (1/4) Start epoch 544 2024-10-20 22:26:35,099 INFO [train.py:561] (1/4) Epoch 544, batch 2, global_batch_idx: 8690, batch size: 203, loss[dur_loss=0.3182, prior_loss=1.003, diff_loss=0.4187, tot_loss=1.74, over 203.00 samples.], tot_loss[dur_loss=0.32, prior_loss=1.003, diff_loss=0.3918, tot_loss=1.715, over 442.00 samples.], 2024-10-20 22:26:49,100 INFO [train.py:561] (1/4) Epoch 544, batch 12, global_batch_idx: 8700, batch size: 152, loss[dur_loss=0.3174, prior_loss=1.003, diff_loss=0.3761, tot_loss=1.696, over 152.00 samples.], tot_loss[dur_loss=0.3142, prior_loss=1.002, diff_loss=0.4371, tot_loss=1.754, over 1966.00 samples.], 2024-10-20 22:26:53,479 INFO [train.py:682] (1/4) Start epoch 545 2024-10-20 22:27:10,434 INFO [train.py:561] (1/4) Epoch 545, batch 6, global_batch_idx: 8710, batch size: 106, loss[dur_loss=0.3247, prior_loss=1.004, diff_loss=0.4203, tot_loss=1.748, over 106.00 samples.], tot_loss[dur_loss=0.3138, prior_loss=1.002, diff_loss=0.4755, tot_loss=1.792, over 1142.00 samples.], 2024-10-20 22:27:23,219 INFO [train.py:682] (1/4) Start 
epoch 546 2024-10-20 22:27:31,815 INFO [train.py:561] (1/4) Epoch 546, batch 0, global_batch_idx: 8720, batch size: 108, loss[dur_loss=0.3244, prior_loss=1.004, diff_loss=0.4125, tot_loss=1.741, over 108.00 samples.], tot_loss[dur_loss=0.3244, prior_loss=1.004, diff_loss=0.4125, tot_loss=1.741, over 108.00 samples.], 2024-10-20 22:27:45,844 INFO [train.py:561] (1/4) Epoch 546, batch 10, global_batch_idx: 8730, batch size: 111, loss[dur_loss=0.321, prior_loss=1.004, diff_loss=0.3823, tot_loss=1.707, over 111.00 samples.], tot_loss[dur_loss=0.3146, prior_loss=1.002, diff_loss=0.4565, tot_loss=1.774, over 1656.00 samples.], 2024-10-20 22:27:52,843 INFO [train.py:682] (1/4) Start epoch 547 2024-10-20 22:28:06,103 INFO [train.py:561] (1/4) Epoch 547, batch 4, global_batch_idx: 8740, batch size: 189, loss[dur_loss=0.315, prior_loss=1.003, diff_loss=0.3902, tot_loss=1.708, over 189.00 samples.], tot_loss[dur_loss=0.308, prior_loss=1.002, diff_loss=0.4975, tot_loss=1.807, over 937.00 samples.], 2024-10-20 22:28:20,727 INFO [train.py:561] (1/4) Epoch 547, batch 14, global_batch_idx: 8750, batch size: 142, loss[dur_loss=0.3195, prior_loss=1.003, diff_loss=0.3462, tot_loss=1.668, over 142.00 samples.], tot_loss[dur_loss=0.3134, prior_loss=1.002, diff_loss=0.4301, tot_loss=1.746, over 2210.00 samples.], 2024-10-20 22:28:22,133 INFO [train.py:682] (1/4) Start epoch 548 2024-10-20 22:28:41,903 INFO [train.py:561] (1/4) Epoch 548, batch 8, global_batch_idx: 8760, batch size: 170, loss[dur_loss=0.3175, prior_loss=1.002, diff_loss=0.3658, tot_loss=1.686, over 170.00 samples.], tot_loss[dur_loss=0.3129, prior_loss=1.002, diff_loss=0.46, tot_loss=1.775, over 1432.00 samples.], 2024-10-20 22:28:51,984 INFO [train.py:682] (1/4) Start epoch 549 2024-10-20 22:29:03,001 INFO [train.py:561] (1/4) Epoch 549, batch 2, global_batch_idx: 8770, batch size: 203, loss[dur_loss=0.3142, prior_loss=1.002, diff_loss=0.4132, tot_loss=1.73, over 203.00 samples.], tot_loss[dur_loss=0.3156, prior_loss=1.002, diff_loss=0.399, tot_loss=1.717, over 442.00 samples.], 2024-10-20 22:29:17,075 INFO [train.py:561] (1/4) Epoch 549, batch 12, global_batch_idx: 8780, batch size: 152, loss[dur_loss=0.3154, prior_loss=1.002, diff_loss=0.3726, tot_loss=1.69, over 152.00 samples.], tot_loss[dur_loss=0.311, prior_loss=1.002, diff_loss=0.4374, tot_loss=1.75, over 1966.00 samples.], 2024-10-20 22:29:21,461 INFO [train.py:682] (1/4) Start epoch 550 2024-10-20 22:29:38,229 INFO [train.py:561] (1/4) Epoch 550, batch 6, global_batch_idx: 8790, batch size: 106, loss[dur_loss=0.3142, prior_loss=1.002, diff_loss=0.3906, tot_loss=1.707, over 106.00 samples.], tot_loss[dur_loss=0.3083, prior_loss=1.001, diff_loss=0.4799, tot_loss=1.79, over 1142.00 samples.], 2024-10-20 22:29:51,047 INFO [train.py:682] (1/4) Start epoch 551 2024-10-20 22:29:59,471 INFO [train.py:561] (1/4) Epoch 551, batch 0, global_batch_idx: 8800, batch size: 108, loss[dur_loss=0.328, prior_loss=1.004, diff_loss=0.3406, tot_loss=1.672, over 108.00 samples.], tot_loss[dur_loss=0.328, prior_loss=1.004, diff_loss=0.3406, tot_loss=1.672, over 108.00 samples.], 2024-10-20 22:30:13,473 INFO [train.py:561] (1/4) Epoch 551, batch 10, global_batch_idx: 8810, batch size: 111, loss[dur_loss=0.3213, prior_loss=1.004, diff_loss=0.3629, tot_loss=1.688, over 111.00 samples.], tot_loss[dur_loss=0.3128, prior_loss=1.002, diff_loss=0.4436, tot_loss=1.758, over 1656.00 samples.], 2024-10-20 22:30:20,467 INFO [train.py:682] (1/4) Start epoch 552 2024-10-20 22:30:33,833 INFO [train.py:561] (1/4) Epoch 552, 
batch 4, global_batch_idx: 8820, batch size: 189, loss[dur_loss=0.3153, prior_loss=1.003, diff_loss=0.4099, tot_loss=1.728, over 189.00 samples.], tot_loss[dur_loss=0.3064, prior_loss=1.001, diff_loss=0.4887, tot_loss=1.796, over 937.00 samples.], 2024-10-20 22:30:48,499 INFO [train.py:561] (1/4) Epoch 552, batch 14, global_batch_idx: 8830, batch size: 142, loss[dur_loss=0.3161, prior_loss=1.002, diff_loss=0.3609, tot_loss=1.679, over 142.00 samples.], tot_loss[dur_loss=0.3115, prior_loss=1.002, diff_loss=0.4208, tot_loss=1.734, over 2210.00 samples.], 2024-10-20 22:30:49,898 INFO [train.py:682] (1/4) Start epoch 553 2024-10-20 22:31:09,495 INFO [train.py:561] (1/4) Epoch 553, batch 8, global_batch_idx: 8840, batch size: 170, loss[dur_loss=0.3117, prior_loss=1.002, diff_loss=0.3723, tot_loss=1.686, over 170.00 samples.], tot_loss[dur_loss=0.3101, prior_loss=1.002, diff_loss=0.4598, tot_loss=1.772, over 1432.00 samples.], 2024-10-20 22:31:19,492 INFO [train.py:682] (1/4) Start epoch 554 2024-10-20 22:31:30,613 INFO [train.py:561] (1/4) Epoch 554, batch 2, global_batch_idx: 8850, batch size: 203, loss[dur_loss=0.3127, prior_loss=1.002, diff_loss=0.3938, tot_loss=1.708, over 203.00 samples.], tot_loss[dur_loss=0.3149, prior_loss=1.002, diff_loss=0.381, tot_loss=1.698, over 442.00 samples.], 2024-10-20 22:31:44,757 INFO [train.py:561] (1/4) Epoch 554, batch 12, global_batch_idx: 8860, batch size: 152, loss[dur_loss=0.3132, prior_loss=1.002, diff_loss=0.4119, tot_loss=1.727, over 152.00 samples.], tot_loss[dur_loss=0.3096, prior_loss=1.002, diff_loss=0.4313, tot_loss=1.742, over 1966.00 samples.], 2024-10-20 22:31:49,177 INFO [train.py:682] (1/4) Start epoch 555 2024-10-20 22:32:05,822 INFO [train.py:561] (1/4) Epoch 555, batch 6, global_batch_idx: 8870, batch size: 106, loss[dur_loss=0.3141, prior_loss=1.002, diff_loss=0.3365, tot_loss=1.653, over 106.00 samples.], tot_loss[dur_loss=0.3068, prior_loss=1.001, diff_loss=0.4593, tot_loss=1.767, over 1142.00 samples.], 2024-10-20 22:32:18,845 INFO [train.py:682] (1/4) Start epoch 556 2024-10-20 22:32:27,442 INFO [train.py:561] (1/4) Epoch 556, batch 0, global_batch_idx: 8880, batch size: 108, loss[dur_loss=0.323, prior_loss=1.003, diff_loss=0.3244, tot_loss=1.65, over 108.00 samples.], tot_loss[dur_loss=0.323, prior_loss=1.003, diff_loss=0.3244, tot_loss=1.65, over 108.00 samples.], 2024-10-20 22:32:41,522 INFO [train.py:561] (1/4) Epoch 556, batch 10, global_batch_idx: 8890, batch size: 111, loss[dur_loss=0.3213, prior_loss=1.003, diff_loss=0.3522, tot_loss=1.677, over 111.00 samples.], tot_loss[dur_loss=0.3085, prior_loss=1.001, diff_loss=0.4327, tot_loss=1.743, over 1656.00 samples.], 2024-10-20 22:32:48,529 INFO [train.py:682] (1/4) Start epoch 557 2024-10-20 22:33:01,893 INFO [train.py:561] (1/4) Epoch 557, batch 4, global_batch_idx: 8900, batch size: 189, loss[dur_loss=0.3131, prior_loss=1.002, diff_loss=0.3937, tot_loss=1.709, over 189.00 samples.], tot_loss[dur_loss=0.3069, prior_loss=1.001, diff_loss=0.494, tot_loss=1.802, over 937.00 samples.], 2024-10-20 22:33:16,537 INFO [train.py:561] (1/4) Epoch 557, batch 14, global_batch_idx: 8910, batch size: 142, loss[dur_loss=0.3147, prior_loss=1.002, diff_loss=0.408, tot_loss=1.724, over 142.00 samples.], tot_loss[dur_loss=0.3107, prior_loss=1.001, diff_loss=0.4348, tot_loss=1.747, over 2210.00 samples.], 2024-10-20 22:33:17,954 INFO [train.py:682] (1/4) Start epoch 558 2024-10-20 22:33:37,268 INFO [train.py:561] (1/4) Epoch 558, batch 8, global_batch_idx: 8920, batch size: 170, 
loss[dur_loss=0.3133, prior_loss=1.002, diff_loss=0.4129, tot_loss=1.728, over 170.00 samples.], tot_loss[dur_loss=0.3085, prior_loss=1.001, diff_loss=0.4577, tot_loss=1.767, over 1432.00 samples.], 2024-10-20 22:33:47,316 INFO [train.py:682] (1/4) Start epoch 559 2024-10-20 22:33:58,717 INFO [train.py:561] (1/4) Epoch 559, batch 2, global_batch_idx: 8930, batch size: 203, loss[dur_loss=0.312, prior_loss=1.002, diff_loss=0.4256, tot_loss=1.739, over 203.00 samples.], tot_loss[dur_loss=0.3153, prior_loss=1.002, diff_loss=0.3914, tot_loss=1.709, over 442.00 samples.], 2024-10-20 22:34:12,794 INFO [train.py:561] (1/4) Epoch 559, batch 12, global_batch_idx: 8940, batch size: 152, loss[dur_loss=0.3106, prior_loss=1.001, diff_loss=0.3625, tot_loss=1.674, over 152.00 samples.], tot_loss[dur_loss=0.3091, prior_loss=1.001, diff_loss=0.4367, tot_loss=1.747, over 1966.00 samples.], 2024-10-20 22:34:17,222 INFO [train.py:682] (1/4) Start epoch 560 2024-10-20 22:34:33,767 INFO [train.py:561] (1/4) Epoch 560, batch 6, global_batch_idx: 8950, batch size: 106, loss[dur_loss=0.311, prior_loss=1.002, diff_loss=0.385, tot_loss=1.698, over 106.00 samples.], tot_loss[dur_loss=0.308, prior_loss=1.001, diff_loss=0.4755, tot_loss=1.784, over 1142.00 samples.], 2024-10-20 22:34:46,618 INFO [train.py:682] (1/4) Start epoch 561 2024-10-20 22:34:55,487 INFO [train.py:561] (1/4) Epoch 561, batch 0, global_batch_idx: 8960, batch size: 108, loss[dur_loss=0.3221, prior_loss=1.002, diff_loss=0.3874, tot_loss=1.712, over 108.00 samples.], tot_loss[dur_loss=0.3221, prior_loss=1.002, diff_loss=0.3874, tot_loss=1.712, over 108.00 samples.], 2024-10-20 22:35:09,443 INFO [train.py:561] (1/4) Epoch 561, batch 10, global_batch_idx: 8970, batch size: 111, loss[dur_loss=0.3202, prior_loss=1.003, diff_loss=0.3617, tot_loss=1.685, over 111.00 samples.], tot_loss[dur_loss=0.3104, prior_loss=1.001, diff_loss=0.443, tot_loss=1.755, over 1656.00 samples.], 2024-10-20 22:35:16,423 INFO [train.py:682] (1/4) Start epoch 562 2024-10-20 22:35:29,997 INFO [train.py:561] (1/4) Epoch 562, batch 4, global_batch_idx: 8980, batch size: 189, loss[dur_loss=0.314, prior_loss=1.002, diff_loss=0.427, tot_loss=1.743, over 189.00 samples.], tot_loss[dur_loss=0.3071, prior_loss=1.001, diff_loss=0.5003, tot_loss=1.808, over 937.00 samples.], 2024-10-20 22:35:44,599 INFO [train.py:561] (1/4) Epoch 562, batch 14, global_batch_idx: 8990, batch size: 142, loss[dur_loss=0.3133, prior_loss=1.002, diff_loss=0.377, tot_loss=1.692, over 142.00 samples.], tot_loss[dur_loss=0.3101, prior_loss=1.001, diff_loss=0.4316, tot_loss=1.743, over 2210.00 samples.], 2024-10-20 22:35:46,001 INFO [train.py:682] (1/4) Start epoch 563 2024-10-20 22:36:05,718 INFO [train.py:561] (1/4) Epoch 563, batch 8, global_batch_idx: 9000, batch size: 170, loss[dur_loss=0.3135, prior_loss=1.002, diff_loss=0.3638, tot_loss=1.679, over 170.00 samples.], tot_loss[dur_loss=0.3086, prior_loss=1.001, diff_loss=0.4471, tot_loss=1.757, over 1432.00 samples.], 2024-10-20 22:36:07,185 INFO [train.py:579] (1/4) Computing validation loss 2024-10-20 22:36:55,951 INFO [train.py:589] (1/4) Epoch 563, validation: dur_loss=0.4257, prior_loss=1.027, diff_loss=0.407, tot_loss=1.86, over 100.00 samples. 
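The validation entry above (global_batch_idx 9000) is the only one in this stretch, and like every training record it reports tot_loss as the plain sum of dur_loss, prior_loss and diff_loss. Checking that against the logged numbers, treating the summation as read off the log rather than quoted from train.py:

```python
# Validation pass at global_batch_idx 9000 (Epoch 563), values copied from the log:
dur_loss, prior_loss, diff_loss = 0.4257, 1.027, 0.407
tot_loss = dur_loss + prior_loss + diff_loss   # 1.8597
assert abs(tot_loss - 1.86) < 5e-3             # logged: tot_loss=1.86

# The same holds for the per-batch training tuple logged just before it
# (Epoch 563, batch 8): 0.3135 + 1.002 + 0.3638 = 1.6793, logged as 1.679.
assert abs((0.3135 + 1.002 + 0.3638) - 1.679) < 5e-4
```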
2024-10-20 22:36:55,952 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-20 22:37:04,675 INFO [train.py:682] (1/4) Start epoch 564 2024-10-20 22:37:16,063 INFO [train.py:561] (1/4) Epoch 564, batch 2, global_batch_idx: 9010, batch size: 203, loss[dur_loss=0.309, prior_loss=1.002, diff_loss=0.4011, tot_loss=1.712, over 203.00 samples.], tot_loss[dur_loss=0.3112, prior_loss=1.002, diff_loss=0.3855, tot_loss=1.698, over 442.00 samples.], 2024-10-20 22:37:30,225 INFO [train.py:561] (1/4) Epoch 564, batch 12, global_batch_idx: 9020, batch size: 152, loss[dur_loss=0.3142, prior_loss=1.002, diff_loss=0.3429, tot_loss=1.659, over 152.00 samples.], tot_loss[dur_loss=0.3086, prior_loss=1.001, diff_loss=0.4265, tot_loss=1.736, over 1966.00 samples.], 2024-10-20 22:37:34,638 INFO [train.py:682] (1/4) Start epoch 565 2024-10-20 22:37:51,556 INFO [train.py:561] (1/4) Epoch 565, batch 6, global_batch_idx: 9030, batch size: 106, loss[dur_loss=0.3107, prior_loss=1.001, diff_loss=0.3639, tot_loss=1.676, over 106.00 samples.], tot_loss[dur_loss=0.3075, prior_loss=1.001, diff_loss=0.4722, tot_loss=1.781, over 1142.00 samples.], 2024-10-20 22:38:04,587 INFO [train.py:682] (1/4) Start epoch 566 2024-10-20 22:38:13,143 INFO [train.py:561] (1/4) Epoch 566, batch 0, global_batch_idx: 9040, batch size: 108, loss[dur_loss=0.3214, prior_loss=1.002, diff_loss=0.3977, tot_loss=1.721, over 108.00 samples.], tot_loss[dur_loss=0.3214, prior_loss=1.002, diff_loss=0.3977, tot_loss=1.721, over 108.00 samples.], 2024-10-20 22:38:27,452 INFO [train.py:561] (1/4) Epoch 566, batch 10, global_batch_idx: 9050, batch size: 111, loss[dur_loss=0.3181, prior_loss=1.003, diff_loss=0.3511, tot_loss=1.672, over 111.00 samples.], tot_loss[dur_loss=0.3095, prior_loss=1.001, diff_loss=0.4388, tot_loss=1.75, over 1656.00 samples.], 2024-10-20 22:38:34,576 INFO [train.py:682] (1/4) Start epoch 567 2024-10-20 22:38:48,380 INFO [train.py:561] (1/4) Epoch 567, batch 4, global_batch_idx: 9060, batch size: 189, loss[dur_loss=0.3124, prior_loss=1.002, diff_loss=0.4078, tot_loss=1.722, over 189.00 samples.], tot_loss[dur_loss=0.305, prior_loss=1.001, diff_loss=0.5001, tot_loss=1.806, over 937.00 samples.], 2024-10-20 22:39:03,301 INFO [train.py:561] (1/4) Epoch 567, batch 14, global_batch_idx: 9070, batch size: 142, loss[dur_loss=0.3144, prior_loss=1.001, diff_loss=0.3431, tot_loss=1.659, over 142.00 samples.], tot_loss[dur_loss=0.3099, prior_loss=1.001, diff_loss=0.4355, tot_loss=1.747, over 2210.00 samples.], 2024-10-20 22:39:04,735 INFO [train.py:682] (1/4) Start epoch 568 2024-10-20 22:39:24,583 INFO [train.py:561] (1/4) Epoch 568, batch 8, global_batch_idx: 9080, batch size: 170, loss[dur_loss=0.3097, prior_loss=1.002, diff_loss=0.3726, tot_loss=1.684, over 170.00 samples.], tot_loss[dur_loss=0.3068, prior_loss=1.001, diff_loss=0.4457, tot_loss=1.753, over 1432.00 samples.], 2024-10-20 22:39:34,711 INFO [train.py:682] (1/4) Start epoch 569 2024-10-20 22:39:45,987 INFO [train.py:561] (1/4) Epoch 569, batch 2, global_batch_idx: 9090, batch size: 203, loss[dur_loss=0.3139, prior_loss=1.001, diff_loss=0.4197, tot_loss=1.735, over 203.00 samples.], tot_loss[dur_loss=0.3159, prior_loss=1.002, diff_loss=0.4066, tot_loss=1.724, over 442.00 samples.], 2024-10-20 22:40:00,344 INFO [train.py:561] (1/4) Epoch 569, batch 12, global_batch_idx: 9100, batch size: 152, loss[dur_loss=0.3102, prior_loss=1.001, diff_loss=0.3476, tot_loss=1.659, over 152.00 samples.], tot_loss[dur_loss=0.309, prior_loss=1.001, diff_loss=0.4383, 
tot_loss=1.748, over 1966.00 samples.], 2024-10-20 22:40:04,818 INFO [train.py:682] (1/4) Start epoch 570 2024-10-20 22:40:22,334 INFO [train.py:561] (1/4) Epoch 570, batch 6, global_batch_idx: 9110, batch size: 106, loss[dur_loss=0.3084, prior_loss=1.001, diff_loss=0.3795, tot_loss=1.689, over 106.00 samples.], tot_loss[dur_loss=0.3058, prior_loss=1.001, diff_loss=0.4813, tot_loss=1.788, over 1142.00 samples.], 2024-10-20 22:40:35,399 INFO [train.py:682] (1/4) Start epoch 571 2024-10-20 22:40:44,118 INFO [train.py:561] (1/4) Epoch 571, batch 0, global_batch_idx: 9120, batch size: 108, loss[dur_loss=0.3201, prior_loss=1.002, diff_loss=0.3669, tot_loss=1.689, over 108.00 samples.], tot_loss[dur_loss=0.3201, prior_loss=1.002, diff_loss=0.3669, tot_loss=1.689, over 108.00 samples.], 2024-10-20 22:40:58,296 INFO [train.py:561] (1/4) Epoch 571, batch 10, global_batch_idx: 9130, batch size: 111, loss[dur_loss=0.315, prior_loss=1.003, diff_loss=0.3769, tot_loss=1.694, over 111.00 samples.], tot_loss[dur_loss=0.3073, prior_loss=1.001, diff_loss=0.4399, tot_loss=1.748, over 1656.00 samples.], 2024-10-20 22:41:05,382 INFO [train.py:682] (1/4) Start epoch 572 2024-10-20 22:41:19,019 INFO [train.py:561] (1/4) Epoch 572, batch 4, global_batch_idx: 9140, batch size: 189, loss[dur_loss=0.307, prior_loss=1.001, diff_loss=0.3913, tot_loss=1.7, over 189.00 samples.], tot_loss[dur_loss=0.3025, prior_loss=1, diff_loss=0.4891, tot_loss=1.792, over 937.00 samples.], 2024-10-20 22:41:33,867 INFO [train.py:561] (1/4) Epoch 572, batch 14, global_batch_idx: 9150, batch size: 142, loss[dur_loss=0.3149, prior_loss=1.001, diff_loss=0.3707, tot_loss=1.687, over 142.00 samples.], tot_loss[dur_loss=0.3071, prior_loss=1.001, diff_loss=0.4247, tot_loss=1.733, over 2210.00 samples.], 2024-10-20 22:41:35,297 INFO [train.py:682] (1/4) Start epoch 573 2024-10-20 22:41:55,118 INFO [train.py:561] (1/4) Epoch 573, batch 8, global_batch_idx: 9160, batch size: 170, loss[dur_loss=0.3128, prior_loss=1.001, diff_loss=0.4153, tot_loss=1.729, over 170.00 samples.], tot_loss[dur_loss=0.3063, prior_loss=1.001, diff_loss=0.4548, tot_loss=1.762, over 1432.00 samples.], 2024-10-20 22:42:05,220 INFO [train.py:682] (1/4) Start epoch 574 2024-10-20 22:42:16,520 INFO [train.py:561] (1/4) Epoch 574, batch 2, global_batch_idx: 9170, batch size: 203, loss[dur_loss=0.318, prior_loss=1.001, diff_loss=0.4082, tot_loss=1.728, over 203.00 samples.], tot_loss[dur_loss=0.316, prior_loss=1.001, diff_loss=0.393, tot_loss=1.71, over 442.00 samples.], 2024-10-20 22:42:30,829 INFO [train.py:561] (1/4) Epoch 574, batch 12, global_batch_idx: 9180, batch size: 152, loss[dur_loss=0.3095, prior_loss=1.001, diff_loss=0.3932, tot_loss=1.704, over 152.00 samples.], tot_loss[dur_loss=0.3083, prior_loss=1.001, diff_loss=0.436, tot_loss=1.745, over 1966.00 samples.], 2024-10-20 22:42:35,357 INFO [train.py:682] (1/4) Start epoch 575 2024-10-20 22:42:52,160 INFO [train.py:561] (1/4) Epoch 575, batch 6, global_batch_idx: 9190, batch size: 106, loss[dur_loss=0.3143, prior_loss=1.001, diff_loss=0.3488, tot_loss=1.664, over 106.00 samples.], tot_loss[dur_loss=0.3052, prior_loss=1.001, diff_loss=0.4812, tot_loss=1.787, over 1142.00 samples.], 2024-10-20 22:43:05,213 INFO [train.py:682] (1/4) Start epoch 576 2024-10-20 22:43:13,913 INFO [train.py:561] (1/4) Epoch 576, batch 0, global_batch_idx: 9200, batch size: 108, loss[dur_loss=0.3185, prior_loss=1.002, diff_loss=0.3629, tot_loss=1.683, over 108.00 samples.], tot_loss[dur_loss=0.3185, prior_loss=1.002, diff_loss=0.3629, 
tot_loss=1.683, over 108.00 samples.], 2024-10-20 22:43:28,144 INFO [train.py:561] (1/4) Epoch 576, batch 10, global_batch_idx: 9210, batch size: 111, loss[dur_loss=0.3169, prior_loss=1.002, diff_loss=0.3933, tot_loss=1.713, over 111.00 samples.], tot_loss[dur_loss=0.3079, prior_loss=1.001, diff_loss=0.4486, tot_loss=1.757, over 1656.00 samples.], 2024-10-20 22:43:35,228 INFO [train.py:682] (1/4) Start epoch 577 2024-10-20 22:43:48,843 INFO [train.py:561] (1/4) Epoch 577, batch 4, global_batch_idx: 9220, batch size: 189, loss[dur_loss=0.3119, prior_loss=1.002, diff_loss=0.3802, tot_loss=1.694, over 189.00 samples.], tot_loss[dur_loss=0.3048, prior_loss=1.001, diff_loss=0.4939, tot_loss=1.799, over 937.00 samples.], 2024-10-20 22:44:03,717 INFO [train.py:561] (1/4) Epoch 577, batch 14, global_batch_idx: 9230, batch size: 142, loss[dur_loss=0.3135, prior_loss=1.002, diff_loss=0.3951, tot_loss=1.71, over 142.00 samples.], tot_loss[dur_loss=0.3087, prior_loss=1.001, diff_loss=0.4303, tot_loss=1.74, over 2210.00 samples.], 2024-10-20 22:44:05,143 INFO [train.py:682] (1/4) Start epoch 578 2024-10-20 22:44:25,132 INFO [train.py:561] (1/4) Epoch 578, batch 8, global_batch_idx: 9240, batch size: 170, loss[dur_loss=0.3052, prior_loss=1.001, diff_loss=0.3893, tot_loss=1.695, over 170.00 samples.], tot_loss[dur_loss=0.3069, prior_loss=1.001, diff_loss=0.4496, tot_loss=1.757, over 1432.00 samples.], 2024-10-20 22:44:35,249 INFO [train.py:682] (1/4) Start epoch 579 2024-10-20 22:44:46,391 INFO [train.py:561] (1/4) Epoch 579, batch 2, global_batch_idx: 9250, batch size: 203, loss[dur_loss=0.3167, prior_loss=1.002, diff_loss=0.3934, tot_loss=1.712, over 203.00 samples.], tot_loss[dur_loss=0.3159, prior_loss=1.002, diff_loss=0.3892, tot_loss=1.707, over 442.00 samples.], 2024-10-20 22:45:00,600 INFO [train.py:561] (1/4) Epoch 579, batch 12, global_batch_idx: 9260, batch size: 152, loss[dur_loss=0.3068, prior_loss=1.001, diff_loss=0.3773, tot_loss=1.685, over 152.00 samples.], tot_loss[dur_loss=0.31, prior_loss=1.001, diff_loss=0.4335, tot_loss=1.745, over 1966.00 samples.], 2024-10-20 22:45:05,025 INFO [train.py:682] (1/4) Start epoch 580 2024-10-20 22:45:22,007 INFO [train.py:561] (1/4) Epoch 580, batch 6, global_batch_idx: 9270, batch size: 106, loss[dur_loss=0.3097, prior_loss=1.001, diff_loss=0.3638, tot_loss=1.675, over 106.00 samples.], tot_loss[dur_loss=0.3075, prior_loss=1.001, diff_loss=0.477, tot_loss=1.785, over 1142.00 samples.], 2024-10-20 22:45:35,088 INFO [train.py:682] (1/4) Start epoch 581 2024-10-20 22:45:43,700 INFO [train.py:561] (1/4) Epoch 581, batch 0, global_batch_idx: 9280, batch size: 108, loss[dur_loss=0.3224, prior_loss=1.002, diff_loss=0.3394, tot_loss=1.664, over 108.00 samples.], tot_loss[dur_loss=0.3224, prior_loss=1.002, diff_loss=0.3394, tot_loss=1.664, over 108.00 samples.], 2024-10-20 22:45:58,010 INFO [train.py:561] (1/4) Epoch 581, batch 10, global_batch_idx: 9290, batch size: 111, loss[dur_loss=0.3211, prior_loss=1.003, diff_loss=0.3618, tot_loss=1.686, over 111.00 samples.], tot_loss[dur_loss=0.31, prior_loss=1.001, diff_loss=0.447, tot_loss=1.758, over 1656.00 samples.], 2024-10-20 22:46:05,132 INFO [train.py:682] (1/4) Start epoch 582 2024-10-20 22:46:18,636 INFO [train.py:561] (1/4) Epoch 582, batch 4, global_batch_idx: 9300, batch size: 189, loss[dur_loss=0.3112, prior_loss=1.003, diff_loss=0.4191, tot_loss=1.733, over 189.00 samples.], tot_loss[dur_loss=0.3059, prior_loss=1.001, diff_loss=0.4921, tot_loss=1.799, over 937.00 samples.], 2024-10-20 22:46:33,542 
INFO [train.py:561] (1/4) Epoch 582, batch 14, global_batch_idx: 9310, batch size: 142, loss[dur_loss=0.313, prior_loss=1.001, diff_loss=0.3994, tot_loss=1.714, over 142.00 samples.], tot_loss[dur_loss=0.3104, prior_loss=1.001, diff_loss=0.425, tot_loss=1.737, over 2210.00 samples.], 2024-10-20 22:46:34,980 INFO [train.py:682] (1/4) Start epoch 583 2024-10-20 22:46:54,968 INFO [train.py:561] (1/4) Epoch 583, batch 8, global_batch_idx: 9320, batch size: 170, loss[dur_loss=0.3098, prior_loss=1.001, diff_loss=0.3741, tot_loss=1.685, over 170.00 samples.], tot_loss[dur_loss=0.3073, prior_loss=1.001, diff_loss=0.4475, tot_loss=1.756, over 1432.00 samples.], 2024-10-20 22:47:05,110 INFO [train.py:682] (1/4) Start epoch 584 2024-10-20 22:47:16,445 INFO [train.py:561] (1/4) Epoch 584, batch 2, global_batch_idx: 9330, batch size: 203, loss[dur_loss=0.3095, prior_loss=1.001, diff_loss=0.4084, tot_loss=1.719, over 203.00 samples.], tot_loss[dur_loss=0.313, prior_loss=1.001, diff_loss=0.3853, tot_loss=1.7, over 442.00 samples.], 2024-10-20 22:47:30,817 INFO [train.py:561] (1/4) Epoch 584, batch 12, global_batch_idx: 9340, batch size: 152, loss[dur_loss=0.305, prior_loss=1.001, diff_loss=0.3939, tot_loss=1.7, over 152.00 samples.], tot_loss[dur_loss=0.3093, prior_loss=1.001, diff_loss=0.4399, tot_loss=1.75, over 1966.00 samples.], 2024-10-20 22:47:35,315 INFO [train.py:682] (1/4) Start epoch 585 2024-10-20 22:47:52,274 INFO [train.py:561] (1/4) Epoch 585, batch 6, global_batch_idx: 9350, batch size: 106, loss[dur_loss=0.307, prior_loss=1.001, diff_loss=0.4177, tot_loss=1.726, over 106.00 samples.], tot_loss[dur_loss=0.3051, prior_loss=1, diff_loss=0.4695, tot_loss=1.775, over 1142.00 samples.], 2024-10-20 22:48:05,443 INFO [train.py:682] (1/4) Start epoch 586 2024-10-20 22:48:13,958 INFO [train.py:561] (1/4) Epoch 586, batch 0, global_batch_idx: 9360, batch size: 108, loss[dur_loss=0.3177, prior_loss=1.002, diff_loss=0.365, tot_loss=1.684, over 108.00 samples.], tot_loss[dur_loss=0.3177, prior_loss=1.002, diff_loss=0.365, tot_loss=1.684, over 108.00 samples.], 2024-10-20 22:48:28,238 INFO [train.py:561] (1/4) Epoch 586, batch 10, global_batch_idx: 9370, batch size: 111, loss[dur_loss=0.3198, prior_loss=1.002, diff_loss=0.3997, tot_loss=1.722, over 111.00 samples.], tot_loss[dur_loss=0.3063, prior_loss=1.001, diff_loss=0.445, tot_loss=1.752, over 1656.00 samples.], 2024-10-20 22:48:35,381 INFO [train.py:682] (1/4) Start epoch 587 2024-10-20 22:48:49,533 INFO [train.py:561] (1/4) Epoch 587, batch 4, global_batch_idx: 9380, batch size: 189, loss[dur_loss=0.3052, prior_loss=1.001, diff_loss=0.4143, tot_loss=1.721, over 189.00 samples.], tot_loss[dur_loss=0.3026, prior_loss=1, diff_loss=0.4977, tot_loss=1.8, over 937.00 samples.], 2024-10-20 22:49:04,452 INFO [train.py:561] (1/4) Epoch 587, batch 14, global_batch_idx: 9390, batch size: 142, loss[dur_loss=0.3089, prior_loss=1.001, diff_loss=0.3796, tot_loss=1.689, over 142.00 samples.], tot_loss[dur_loss=0.3073, prior_loss=1.001, diff_loss=0.4268, tot_loss=1.735, over 2210.00 samples.], 2024-10-20 22:49:05,885 INFO [train.py:682] (1/4) Start epoch 588 2024-10-20 22:49:25,645 INFO [train.py:561] (1/4) Epoch 588, batch 8, global_batch_idx: 9400, batch size: 170, loss[dur_loss=0.3129, prior_loss=1.001, diff_loss=0.3778, tot_loss=1.692, over 170.00 samples.], tot_loss[dur_loss=0.305, prior_loss=1, diff_loss=0.4505, tot_loss=1.756, over 1432.00 samples.], 2024-10-20 22:49:35,780 INFO [train.py:682] (1/4) Start epoch 589 2024-10-20 22:49:47,091 INFO [train.py:561] 
(1/4) Epoch 589, batch 2, global_batch_idx: 9410, batch size: 203, loss[dur_loss=0.3096, prior_loss=1, diff_loss=0.4057, tot_loss=1.716, over 203.00 samples.], tot_loss[dur_loss=0.3108, prior_loss=1.001, diff_loss=0.3967, tot_loss=1.708, over 442.00 samples.], 2024-10-20 22:50:01,381 INFO [train.py:561] (1/4) Epoch 589, batch 12, global_batch_idx: 9420, batch size: 152, loss[dur_loss=0.3048, prior_loss=1, diff_loss=0.3834, tot_loss=1.689, over 152.00 samples.], tot_loss[dur_loss=0.3057, prior_loss=1, diff_loss=0.4301, tot_loss=1.736, over 1966.00 samples.], 2024-10-20 22:50:05,825 INFO [train.py:682] (1/4) Start epoch 590 2024-10-20 22:50:22,912 INFO [train.py:561] (1/4) Epoch 590, batch 6, global_batch_idx: 9430, batch size: 106, loss[dur_loss=0.3077, prior_loss=1.001, diff_loss=0.3775, tot_loss=1.686, over 106.00 samples.], tot_loss[dur_loss=0.3002, prior_loss=1, diff_loss=0.4701, tot_loss=1.77, over 1142.00 samples.], 2024-10-20 22:50:36,068 INFO [train.py:682] (1/4) Start epoch 591 2024-10-20 22:50:44,749 INFO [train.py:561] (1/4) Epoch 591, batch 0, global_batch_idx: 9440, batch size: 108, loss[dur_loss=0.3208, prior_loss=1.002, diff_loss=0.3765, tot_loss=1.699, over 108.00 samples.], tot_loss[dur_loss=0.3208, prior_loss=1.002, diff_loss=0.3765, tot_loss=1.699, over 108.00 samples.], 2024-10-20 22:50:59,024 INFO [train.py:561] (1/4) Epoch 591, batch 10, global_batch_idx: 9450, batch size: 111, loss[dur_loss=0.3148, prior_loss=1.002, diff_loss=0.3644, tot_loss=1.681, over 111.00 samples.], tot_loss[dur_loss=0.3043, prior_loss=1, diff_loss=0.4277, tot_loss=1.732, over 1656.00 samples.], 2024-10-20 22:51:06,163 INFO [train.py:682] (1/4) Start epoch 592 2024-10-20 22:51:20,013 INFO [train.py:561] (1/4) Epoch 592, batch 4, global_batch_idx: 9460, batch size: 189, loss[dur_loss=0.3064, prior_loss=1.001, diff_loss=0.3859, tot_loss=1.693, over 189.00 samples.], tot_loss[dur_loss=0.3012, prior_loss=0.9999, diff_loss=0.4921, tot_loss=1.793, over 937.00 samples.], 2024-10-20 22:51:35,162 INFO [train.py:561] (1/4) Epoch 592, batch 14, global_batch_idx: 9470, batch size: 142, loss[dur_loss=0.309, prior_loss=1.001, diff_loss=0.3725, tot_loss=1.682, over 142.00 samples.], tot_loss[dur_loss=0.3057, prior_loss=1, diff_loss=0.432, tot_loss=1.738, over 2210.00 samples.], 2024-10-20 22:51:36,602 INFO [train.py:682] (1/4) Start epoch 593 2024-10-20 22:51:56,606 INFO [train.py:561] (1/4) Epoch 593, batch 8, global_batch_idx: 9480, batch size: 170, loss[dur_loss=0.3129, prior_loss=1.001, diff_loss=0.3968, tot_loss=1.71, over 170.00 samples.], tot_loss[dur_loss=0.3043, prior_loss=1, diff_loss=0.4475, tot_loss=1.752, over 1432.00 samples.], 2024-10-20 22:52:06,780 INFO [train.py:682] (1/4) Start epoch 594 2024-10-20 22:52:17,969 INFO [train.py:561] (1/4) Epoch 594, batch 2, global_batch_idx: 9490, batch size: 203, loss[dur_loss=0.309, prior_loss=1.001, diff_loss=0.4181, tot_loss=1.728, over 203.00 samples.], tot_loss[dur_loss=0.311, prior_loss=1.001, diff_loss=0.4086, tot_loss=1.721, over 442.00 samples.], 2024-10-20 22:52:32,371 INFO [train.py:561] (1/4) Epoch 594, batch 12, global_batch_idx: 9500, batch size: 152, loss[dur_loss=0.3061, prior_loss=1, diff_loss=0.3878, tot_loss=1.694, over 152.00 samples.], tot_loss[dur_loss=0.3065, prior_loss=1, diff_loss=0.4403, tot_loss=1.747, over 1966.00 samples.], 2024-10-20 22:52:36,881 INFO [train.py:682] (1/4) Start epoch 595 2024-10-20 22:52:54,008 INFO [train.py:561] (1/4) Epoch 595, batch 6, global_batch_idx: 9510, batch size: 106, loss[dur_loss=0.3077, 
prior_loss=1.001, diff_loss=0.3305, tot_loss=1.639, over 106.00 samples.], tot_loss[dur_loss=0.3016, prior_loss=0.9998, diff_loss=0.4593, tot_loss=1.761, over 1142.00 samples.], 2024-10-20 22:53:07,114 INFO [train.py:682] (1/4) Start epoch 596 2024-10-20 22:53:15,768 INFO [train.py:561] (1/4) Epoch 596, batch 0, global_batch_idx: 9520, batch size: 108, loss[dur_loss=0.3181, prior_loss=1.001, diff_loss=0.3411, tot_loss=1.661, over 108.00 samples.], tot_loss[dur_loss=0.3181, prior_loss=1.001, diff_loss=0.3411, tot_loss=1.661, over 108.00 samples.], 2024-10-20 22:53:30,062 INFO [train.py:561] (1/4) Epoch 596, batch 10, global_batch_idx: 9530, batch size: 111, loss[dur_loss=0.3164, prior_loss=1.002, diff_loss=0.3838, tot_loss=1.702, over 111.00 samples.], tot_loss[dur_loss=0.3045, prior_loss=1, diff_loss=0.4336, tot_loss=1.738, over 1656.00 samples.], 2024-10-20 22:53:37,159 INFO [train.py:682] (1/4) Start epoch 597 2024-10-20 22:53:50,816 INFO [train.py:561] (1/4) Epoch 597, batch 4, global_batch_idx: 9540, batch size: 189, loss[dur_loss=0.3033, prior_loss=1.001, diff_loss=0.404, tot_loss=1.708, over 189.00 samples.], tot_loss[dur_loss=0.2997, prior_loss=0.9996, diff_loss=0.4865, tot_loss=1.786, over 937.00 samples.], 2024-10-20 22:54:05,693 INFO [train.py:561] (1/4) Epoch 597, batch 14, global_batch_idx: 9550, batch size: 142, loss[dur_loss=0.3074, prior_loss=1.001, diff_loss=0.3696, tot_loss=1.678, over 142.00 samples.], tot_loss[dur_loss=0.3042, prior_loss=1, diff_loss=0.4218, tot_loss=1.726, over 2210.00 samples.], 2024-10-20 22:54:07,120 INFO [train.py:682] (1/4) Start epoch 598 2024-10-20 22:54:26,894 INFO [train.py:561] (1/4) Epoch 598, batch 8, global_batch_idx: 9560, batch size: 170, loss[dur_loss=0.3039, prior_loss=1, diff_loss=0.3844, tot_loss=1.689, over 170.00 samples.], tot_loss[dur_loss=0.3029, prior_loss=0.9999, diff_loss=0.4576, tot_loss=1.76, over 1432.00 samples.], 2024-10-20 22:54:37,088 INFO [train.py:682] (1/4) Start epoch 599 2024-10-20 22:54:48,864 INFO [train.py:561] (1/4) Epoch 599, batch 2, global_batch_idx: 9570, batch size: 203, loss[dur_loss=0.3101, prior_loss=1, diff_loss=0.3969, tot_loss=1.707, over 203.00 samples.], tot_loss[dur_loss=0.3095, prior_loss=1.001, diff_loss=0.3726, tot_loss=1.683, over 442.00 samples.], 2024-10-20 22:55:03,216 INFO [train.py:561] (1/4) Epoch 599, batch 12, global_batch_idx: 9580, batch size: 152, loss[dur_loss=0.3049, prior_loss=0.9999, diff_loss=0.403, tot_loss=1.708, over 152.00 samples.], tot_loss[dur_loss=0.3039, prior_loss=1, diff_loss=0.4311, tot_loss=1.735, over 1966.00 samples.], 2024-10-20 22:55:07,699 INFO [train.py:682] (1/4) Start epoch 600 2024-10-20 22:55:24,799 INFO [train.py:561] (1/4) Epoch 600, batch 6, global_batch_idx: 9590, batch size: 106, loss[dur_loss=0.3067, prior_loss=1.001, diff_loss=0.3622, tot_loss=1.669, over 106.00 samples.], tot_loss[dur_loss=0.3022, prior_loss=0.9997, diff_loss=0.4703, tot_loss=1.772, over 1142.00 samples.], 2024-10-20 22:55:37,887 INFO [train.py:682] (1/4) Start epoch 601 2024-10-20 22:55:46,465 INFO [train.py:561] (1/4) Epoch 601, batch 0, global_batch_idx: 9600, batch size: 108, loss[dur_loss=0.3182, prior_loss=1.001, diff_loss=0.3932, tot_loss=1.713, over 108.00 samples.], tot_loss[dur_loss=0.3182, prior_loss=1.001, diff_loss=0.3932, tot_loss=1.713, over 108.00 samples.], 2024-10-20 22:56:00,757 INFO [train.py:561] (1/4) Epoch 601, batch 10, global_batch_idx: 9610, batch size: 111, loss[dur_loss=0.3171, prior_loss=1.002, diff_loss=0.3571, tot_loss=1.676, over 111.00 samples.], 
tot_loss[dur_loss=0.3049, prior_loss=1, diff_loss=0.436, tot_loss=1.741, over 1656.00 samples.], 2024-10-20 22:56:07,850 INFO [train.py:682] (1/4) Start epoch 602 2024-10-20 22:56:21,769 INFO [train.py:561] (1/4) Epoch 602, batch 4, global_batch_idx: 9620, batch size: 189, loss[dur_loss=0.3039, prior_loss=1.001, diff_loss=0.3956, tot_loss=1.7, over 189.00 samples.], tot_loss[dur_loss=0.3013, prior_loss=0.9998, diff_loss=0.491, tot_loss=1.792, over 937.00 samples.], 2024-10-20 22:56:36,668 INFO [train.py:561] (1/4) Epoch 602, batch 14, global_batch_idx: 9630, batch size: 142, loss[dur_loss=0.3062, prior_loss=1, diff_loss=0.3638, tot_loss=1.67, over 142.00 samples.], tot_loss[dur_loss=0.3044, prior_loss=1, diff_loss=0.4213, tot_loss=1.726, over 2210.00 samples.], 2024-10-20 22:56:38,091 INFO [train.py:682] (1/4) Start epoch 603 2024-10-20 22:56:58,087 INFO [train.py:561] (1/4) Epoch 603, batch 8, global_batch_idx: 9640, batch size: 170, loss[dur_loss=0.3044, prior_loss=0.9999, diff_loss=0.4028, tot_loss=1.707, over 170.00 samples.], tot_loss[dur_loss=0.3031, prior_loss=0.9999, diff_loss=0.4481, tot_loss=1.751, over 1432.00 samples.], 2024-10-20 22:57:08,265 INFO [train.py:682] (1/4) Start epoch 604 2024-10-20 22:57:19,719 INFO [train.py:561] (1/4) Epoch 604, batch 2, global_batch_idx: 9650, batch size: 203, loss[dur_loss=0.3042, prior_loss=1, diff_loss=0.3826, tot_loss=1.687, over 203.00 samples.], tot_loss[dur_loss=0.3071, prior_loss=1, diff_loss=0.3781, tot_loss=1.686, over 442.00 samples.], 2024-10-20 22:57:33,938 INFO [train.py:561] (1/4) Epoch 604, batch 12, global_batch_idx: 9660, batch size: 152, loss[dur_loss=0.3102, prior_loss=1, diff_loss=0.397, tot_loss=1.708, over 152.00 samples.], tot_loss[dur_loss=0.3037, prior_loss=1, diff_loss=0.4258, tot_loss=1.73, over 1966.00 samples.], 2024-10-20 22:57:38,401 INFO [train.py:682] (1/4) Start epoch 605 2024-10-20 22:57:55,192 INFO [train.py:561] (1/4) Epoch 605, batch 6, global_batch_idx: 9670, batch size: 106, loss[dur_loss=0.303, prior_loss=1, diff_loss=0.3609, tot_loss=1.664, over 106.00 samples.], tot_loss[dur_loss=0.3037, prior_loss=1, diff_loss=0.4848, tot_loss=1.789, over 1142.00 samples.], 2024-10-20 22:58:08,253 INFO [train.py:682] (1/4) Start epoch 606 2024-10-20 22:58:17,142 INFO [train.py:561] (1/4) Epoch 606, batch 0, global_batch_idx: 9680, batch size: 108, loss[dur_loss=0.3175, prior_loss=1.001, diff_loss=0.3203, tot_loss=1.639, over 108.00 samples.], tot_loss[dur_loss=0.3175, prior_loss=1.001, diff_loss=0.3203, tot_loss=1.639, over 108.00 samples.], 2024-10-20 22:58:31,368 INFO [train.py:561] (1/4) Epoch 606, batch 10, global_batch_idx: 9690, batch size: 111, loss[dur_loss=0.3093, prior_loss=1.001, diff_loss=0.3513, tot_loss=1.662, over 111.00 samples.], tot_loss[dur_loss=0.3023, prior_loss=0.9999, diff_loss=0.4339, tot_loss=1.736, over 1656.00 samples.], 2024-10-20 22:58:38,429 INFO [train.py:682] (1/4) Start epoch 607 2024-10-20 22:58:51,922 INFO [train.py:561] (1/4) Epoch 607, batch 4, global_batch_idx: 9700, batch size: 189, loss[dur_loss=0.3012, prior_loss=1, diff_loss=0.4082, tot_loss=1.71, over 189.00 samples.], tot_loss[dur_loss=0.2983, prior_loss=0.9992, diff_loss=0.497, tot_loss=1.795, over 937.00 samples.], 2024-10-20 22:59:06,666 INFO [train.py:561] (1/4) Epoch 607, batch 14, global_batch_idx: 9710, batch size: 142, loss[dur_loss=0.3049, prior_loss=0.9999, diff_loss=0.3811, tot_loss=1.686, over 142.00 samples.], tot_loss[dur_loss=0.3028, prior_loss=0.9998, diff_loss=0.4321, tot_loss=1.735, over 2210.00 samples.], 
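Two regularities in the records make the aggregation explicit: at batch 0 of every epoch the loss[...] and tot_loss[...] tuples are identical, and within an epoch the cumulative sample count grows monotonically (108, 442, 937, 1142, 1432, 1656, 1966, 2210 at logged batches 0 through 14). Both are consistent with the bracketed tot_loss[...] being a sample-weighted running average over the epoch so far. A minimal sketch of that bookkeeping, under that assumption; it is not taken from train.py itself:

```python
# Sample-weighted running average over one epoch: an assumption that fits
# the growing "over N samples" counts, not train.py's actual tracker.
class RunningLoss:
    def __init__(self) -> None:
        self.weighted_sum = 0.0
        self.num_samples = 0

    def update(self, batch_loss: float, batch_size: int) -> None:
        # Per-batch losses are per-sample averages, so weight by batch size.
        self.weighted_sum += batch_loss * batch_size
        self.num_samples += batch_size

    @property
    def value(self) -> float:
        return self.weighted_sum / self.num_samples
```

Under this reading a single noisy batch moves the cumulative figure by only batch_size / num_samples of its deviation, which is why the bracketed aggregates drift far less from record to record than the per-batch values do.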
2024-10-20 22:59:08,081 INFO [train.py:682] (1/4) Start epoch 608 2024-10-20 22:59:27,736 INFO [train.py:561] (1/4) Epoch 608, batch 8, global_batch_idx: 9720, batch size: 170, loss[dur_loss=0.3042, prior_loss=1, diff_loss=0.3746, tot_loss=1.679, over 170.00 samples.], tot_loss[dur_loss=0.3018, prior_loss=0.9996, diff_loss=0.4431, tot_loss=1.745, over 1432.00 samples.], 2024-10-20 22:59:37,858 INFO [train.py:682] (1/4) Start epoch 609 2024-10-20 22:59:48,955 INFO [train.py:561] (1/4) Epoch 609, batch 2, global_batch_idx: 9730, batch size: 203, loss[dur_loss=0.3067, prior_loss=1, diff_loss=0.3864, tot_loss=1.693, over 203.00 samples.], tot_loss[dur_loss=0.3068, prior_loss=1, diff_loss=0.3717, tot_loss=1.679, over 442.00 samples.], 2024-10-20 23:00:03,384 INFO [train.py:561] (1/4) Epoch 609, batch 12, global_batch_idx: 9740, batch size: 152, loss[dur_loss=0.3031, prior_loss=0.9997, diff_loss=0.3843, tot_loss=1.687, over 152.00 samples.], tot_loss[dur_loss=0.302, prior_loss=0.9996, diff_loss=0.421, tot_loss=1.723, over 1966.00 samples.], 2024-10-20 23:00:07,827 INFO [train.py:682] (1/4) Start epoch 610 2024-10-20 23:00:24,694 INFO [train.py:561] (1/4) Epoch 610, batch 6, global_batch_idx: 9750, batch size: 106, loss[dur_loss=0.305, prior_loss=1, diff_loss=0.3468, tot_loss=1.652, over 106.00 samples.], tot_loss[dur_loss=0.3, prior_loss=0.9994, diff_loss=0.4708, tot_loss=1.77, over 1142.00 samples.], 2024-10-20 23:00:37,791 INFO [train.py:682] (1/4) Start epoch 611 2024-10-20 23:00:46,676 INFO [train.py:561] (1/4) Epoch 611, batch 0, global_batch_idx: 9760, batch size: 108, loss[dur_loss=0.3141, prior_loss=1.001, diff_loss=0.3925, tot_loss=1.707, over 108.00 samples.], tot_loss[dur_loss=0.3141, prior_loss=1.001, diff_loss=0.3925, tot_loss=1.707, over 108.00 samples.], 2024-10-20 23:01:00,995 INFO [train.py:561] (1/4) Epoch 611, batch 10, global_batch_idx: 9770, batch size: 111, loss[dur_loss=0.313, prior_loss=1.001, diff_loss=0.3606, tot_loss=1.675, over 111.00 samples.], tot_loss[dur_loss=0.3018, prior_loss=0.9995, diff_loss=0.4334, tot_loss=1.735, over 1656.00 samples.], 2024-10-20 23:01:08,118 INFO [train.py:682] (1/4) Start epoch 612 2024-10-20 23:01:21,579 INFO [train.py:561] (1/4) Epoch 612, batch 4, global_batch_idx: 9780, batch size: 189, loss[dur_loss=0.3049, prior_loss=1, diff_loss=0.4046, tot_loss=1.71, over 189.00 samples.], tot_loss[dur_loss=0.2967, prior_loss=0.9989, diff_loss=0.4989, tot_loss=1.794, over 937.00 samples.], 2024-10-20 23:01:36,420 INFO [train.py:561] (1/4) Epoch 612, batch 14, global_batch_idx: 9790, batch size: 142, loss[dur_loss=0.3045, prior_loss=0.9998, diff_loss=0.3787, tot_loss=1.683, over 142.00 samples.], tot_loss[dur_loss=0.3012, prior_loss=0.9994, diff_loss=0.4272, tot_loss=1.728, over 2210.00 samples.], 2024-10-20 23:01:37,856 INFO [train.py:682] (1/4) Start epoch 613 2024-10-20 23:01:57,908 INFO [train.py:561] (1/4) Epoch 613, batch 8, global_batch_idx: 9800, batch size: 170, loss[dur_loss=0.3066, prior_loss=0.9999, diff_loss=0.3812, tot_loss=1.688, over 170.00 samples.], tot_loss[dur_loss=0.3001, prior_loss=0.9993, diff_loss=0.44, tot_loss=1.739, over 1432.00 samples.], 2024-10-20 23:02:08,034 INFO [train.py:682] (1/4) Start epoch 614 2024-10-20 23:02:19,333 INFO [train.py:561] (1/4) Epoch 614, batch 2, global_batch_idx: 9810, batch size: 203, loss[dur_loss=0.3026, prior_loss=1, diff_loss=0.4066, tot_loss=1.709, over 203.00 samples.], tot_loss[dur_loss=0.3052, prior_loss=1, diff_loss=0.391, tot_loss=1.696, over 442.00 samples.], 2024-10-20 23:02:33,708 
INFO [train.py:561] (1/4) Epoch 614, batch 12, global_batch_idx: 9820, batch size: 152, loss[dur_loss=0.3033, prior_loss=0.9994, diff_loss=0.372, tot_loss=1.675, over 152.00 samples.], tot_loss[dur_loss=0.3016, prior_loss=0.9995, diff_loss=0.4346, tot_loss=1.736, over 1966.00 samples.], 2024-10-20 23:02:38,149 INFO [train.py:682] (1/4) Start epoch 615 2024-10-20 23:02:55,350 INFO [train.py:561] (1/4) Epoch 615, batch 6, global_batch_idx: 9830, batch size: 106, loss[dur_loss=0.3025, prior_loss=0.9998, diff_loss=0.3913, tot_loss=1.694, over 106.00 samples.], tot_loss[dur_loss=0.298, prior_loss=0.999, diff_loss=0.4761, tot_loss=1.773, over 1142.00 samples.], 2024-10-20 23:03:08,476 INFO [train.py:682] (1/4) Start epoch 616 2024-10-20 23:03:17,177 INFO [train.py:561] (1/4) Epoch 616, batch 0, global_batch_idx: 9840, batch size: 108, loss[dur_loss=0.3134, prior_loss=1.001, diff_loss=0.3813, tot_loss=1.695, over 108.00 samples.], tot_loss[dur_loss=0.3134, prior_loss=1.001, diff_loss=0.3813, tot_loss=1.695, over 108.00 samples.], 2024-10-20 23:03:31,584 INFO [train.py:561] (1/4) Epoch 616, batch 10, global_batch_idx: 9850, batch size: 111, loss[dur_loss=0.3131, prior_loss=1.001, diff_loss=0.3983, tot_loss=1.712, over 111.00 samples.], tot_loss[dur_loss=0.3016, prior_loss=0.9994, diff_loss=0.4468, tot_loss=1.748, over 1656.00 samples.], 2024-10-20 23:03:38,790 INFO [train.py:682] (1/4) Start epoch 617 2024-10-20 23:03:52,549 INFO [train.py:561] (1/4) Epoch 617, batch 4, global_batch_idx: 9860, batch size: 189, loss[dur_loss=0.3017, prior_loss=0.9999, diff_loss=0.3959, tot_loss=1.697, over 189.00 samples.], tot_loss[dur_loss=0.2983, prior_loss=0.9988, diff_loss=0.4995, tot_loss=1.797, over 937.00 samples.], 2024-10-20 23:04:07,536 INFO [train.py:561] (1/4) Epoch 617, batch 14, global_batch_idx: 9870, batch size: 142, loss[dur_loss=0.3046, prior_loss=0.9996, diff_loss=0.3515, tot_loss=1.656, over 142.00 samples.], tot_loss[dur_loss=0.3017, prior_loss=0.9993, diff_loss=0.4236, tot_loss=1.725, over 2210.00 samples.], 2024-10-20 23:04:08,973 INFO [train.py:682] (1/4) Start epoch 618 2024-10-20 23:04:29,056 INFO [train.py:561] (1/4) Epoch 618, batch 8, global_batch_idx: 9880, batch size: 170, loss[dur_loss=0.3047, prior_loss=1, diff_loss=0.3943, tot_loss=1.699, over 170.00 samples.], tot_loss[dur_loss=0.2973, prior_loss=0.9991, diff_loss=0.4582, tot_loss=1.755, over 1432.00 samples.], 2024-10-20 23:04:39,298 INFO [train.py:682] (1/4) Start epoch 619 2024-10-20 23:04:51,087 INFO [train.py:561] (1/4) Epoch 619, batch 2, global_batch_idx: 9890, batch size: 203, loss[dur_loss=0.3003, prior_loss=0.9994, diff_loss=0.415, tot_loss=1.715, over 203.00 samples.], tot_loss[dur_loss=0.3043, prior_loss=0.9998, diff_loss=0.3888, tot_loss=1.693, over 442.00 samples.], 2024-10-20 23:05:05,422 INFO [train.py:561] (1/4) Epoch 619, batch 12, global_batch_idx: 9900, batch size: 152, loss[dur_loss=0.3, prior_loss=0.9992, diff_loss=0.3752, tot_loss=1.674, over 152.00 samples.], tot_loss[dur_loss=0.3001, prior_loss=0.9993, diff_loss=0.4245, tot_loss=1.724, over 1966.00 samples.], 2024-10-20 23:05:09,895 INFO [train.py:682] (1/4) Start epoch 620 2024-10-20 23:05:26,931 INFO [train.py:561] (1/4) Epoch 620, batch 6, global_batch_idx: 9910, batch size: 106, loss[dur_loss=0.3028, prior_loss=0.9997, diff_loss=0.3051, tot_loss=1.608, over 106.00 samples.], tot_loss[dur_loss=0.2976, prior_loss=0.9989, diff_loss=0.4536, tot_loss=1.75, over 1142.00 samples.], 2024-10-20 23:05:39,995 INFO [train.py:682] (1/4) Start epoch 621 2024-10-20 
23:05:48,533 INFO [train.py:561] (1/4) Epoch 621, batch 0, global_batch_idx: 9920, batch size: 108, loss[dur_loss=0.3135, prior_loss=1.001, diff_loss=0.3746, tot_loss=1.689, over 108.00 samples.], tot_loss[dur_loss=0.3135, prior_loss=1.001, diff_loss=0.3746, tot_loss=1.689, over 108.00 samples.],
2024-10-20 23:06:03,003 INFO [train.py:561] (1/4) Epoch 621, batch 10, global_batch_idx: 9930, batch size: 111, loss[dur_loss=0.3121, prior_loss=1.001, diff_loss=0.3671, tot_loss=1.68, over 111.00 samples.], tot_loss[dur_loss=0.3003, prior_loss=0.9992, diff_loss=0.4407, tot_loss=1.74, over 1656.00 samples.],
2024-10-20 23:06:10,101 INFO [train.py:682] (1/4) Start epoch 622
2024-10-20 23:06:23,892 INFO [train.py:561] (1/4) Epoch 622, batch 4, global_batch_idx: 9940, batch size: 189, loss[dur_loss=0.3058, prior_loss=1, diff_loss=0.4131, tot_loss=1.719, over 189.00 samples.], tot_loss[dur_loss=0.2958, prior_loss=0.9988, diff_loss=0.4848, tot_loss=1.779, over 937.00 samples.],
2024-10-20 23:06:38,723 INFO [train.py:561] (1/4) Epoch 622, batch 14, global_batch_idx: 9950, batch size: 142, loss[dur_loss=0.3015, prior_loss=0.9998, diff_loss=0.3788, tot_loss=1.68, over 142.00 samples.], tot_loss[dur_loss=0.3003, prior_loss=0.9994, diff_loss=0.4245, tot_loss=1.724, over 2210.00 samples.],
2024-10-20 23:06:40,141 INFO [train.py:682] (1/4) Start epoch 623
2024-10-20 23:07:00,008 INFO [train.py:561] (1/4) Epoch 623, batch 8, global_batch_idx: 9960, batch size: 170, loss[dur_loss=0.3008, prior_loss=0.9994, diff_loss=0.3929, tot_loss=1.693, over 170.00 samples.], tot_loss[dur_loss=0.299, prior_loss=0.9992, diff_loss=0.4526, tot_loss=1.751, over 1432.00 samples.],
2024-10-20 23:07:10,142 INFO [train.py:682] (1/4) Start epoch 624
2024-10-20 23:07:21,382 INFO [train.py:561] (1/4) Epoch 624, batch 2, global_batch_idx: 9970, batch size: 203, loss[dur_loss=0.302, prior_loss=0.9995, diff_loss=0.4113, tot_loss=1.713, over 203.00 samples.], tot_loss[dur_loss=0.3054, prior_loss=0.9997, diff_loss=0.394, tot_loss=1.699, over 442.00 samples.],
2024-10-20 23:07:35,602 INFO [train.py:561] (1/4) Epoch 624, batch 12, global_batch_idx: 9980, batch size: 152, loss[dur_loss=0.3044, prior_loss=0.9992, diff_loss=0.3584, tot_loss=1.662, over 152.00 samples.], tot_loss[dur_loss=0.2998, prior_loss=0.9992, diff_loss=0.4327, tot_loss=1.732, over 1966.00 samples.],
2024-10-20 23:07:40,042 INFO [train.py:682] (1/4) Start epoch 625
2024-10-20 23:07:57,410 INFO [train.py:561] (1/4) Epoch 625, batch 6, global_batch_idx: 9990, batch size: 106, loss[dur_loss=0.306, prior_loss=0.9999, diff_loss=0.4177, tot_loss=1.724, over 106.00 samples.], tot_loss[dur_loss=0.3, prior_loss=0.9991, diff_loss=0.4722, tot_loss=1.771, over 1142.00 samples.],
2024-10-20 23:08:10,621 INFO [train.py:682] (1/4) Start epoch 626
2024-10-20 23:08:19,311 INFO [train.py:561] (1/4) Epoch 626, batch 0, global_batch_idx: 10000, batch size: 108, loss[dur_loss=0.317, prior_loss=1.001, diff_loss=0.3465, tot_loss=1.664, over 108.00 samples.], tot_loss[dur_loss=0.317, prior_loss=1.001, diff_loss=0.3465, tot_loss=1.664, over 108.00 samples.],
2024-10-20 23:08:33,632 INFO [train.py:561] (1/4) Epoch 626, batch 10, global_batch_idx: 10010, batch size: 111, loss[dur_loss=0.3142, prior_loss=1.001, diff_loss=0.3454, tot_loss=1.661, over 111.00 samples.], tot_loss[dur_loss=0.301, prior_loss=0.9995, diff_loss=0.4297, tot_loss=1.73, over 1656.00 samples.],
2024-10-20 23:08:40,823 INFO [train.py:682] (1/4) Start epoch 627
2024-10-20 23:08:55,069 INFO [train.py:561] (1/4) Epoch 627, batch 4, global_batch_idx: 10020, batch size: 189, loss[dur_loss=0.3027, prior_loss=1, diff_loss=0.399, tot_loss=1.702, over 189.00 samples.], tot_loss[dur_loss=0.2963, prior_loss=0.9988, diff_loss=0.4764, tot_loss=1.771, over 937.00 samples.],
2024-10-20 23:09:09,956 INFO [train.py:561] (1/4) Epoch 627, batch 14, global_batch_idx: 10030, batch size: 142, loss[dur_loss=0.3042, prior_loss=0.9993, diff_loss=0.3733, tot_loss=1.677, over 142.00 samples.], tot_loss[dur_loss=0.301, prior_loss=0.9993, diff_loss=0.4132, tot_loss=1.714, over 2210.00 samples.],
2024-10-20 23:09:11,385 INFO [train.py:682] (1/4) Start epoch 628
2024-10-20 23:09:31,332 INFO [train.py:561] (1/4) Epoch 628, batch 8, global_batch_idx: 10040, batch size: 170, loss[dur_loss=0.3055, prior_loss=0.9999, diff_loss=0.4201, tot_loss=1.725, over 170.00 samples.], tot_loss[dur_loss=0.3012, prior_loss=0.9996, diff_loss=0.4478, tot_loss=1.749, over 1432.00 samples.],
2024-10-20 23:09:41,483 INFO [train.py:682] (1/4) Start epoch 629
2024-10-20 23:09:52,846 INFO [train.py:561] (1/4) Epoch 629, batch 2, global_batch_idx: 10050, batch size: 203, loss[dur_loss=0.3058, prior_loss=1, diff_loss=0.3903, tot_loss=1.696, over 203.00 samples.], tot_loss[dur_loss=0.3048, prior_loss=1, diff_loss=0.3798, tot_loss=1.685, over 442.00 samples.],
2024-10-20 23:10:07,070 INFO [train.py:561] (1/4) Epoch 629, batch 12, global_batch_idx: 10060, batch size: 152, loss[dur_loss=0.3036, prior_loss=0.9993, diff_loss=0.3801, tot_loss=1.683, over 152.00 samples.], tot_loss[dur_loss=0.3015, prior_loss=0.9995, diff_loss=0.4334, tot_loss=1.734, over 1966.00 samples.],
2024-10-20 23:10:11,521 INFO [train.py:682] (1/4) Start epoch 630
2024-10-20 23:10:28,288 INFO [train.py:561] (1/4) Epoch 630, batch 6, global_batch_idx: 10070, batch size: 106, loss[dur_loss=0.3024, prior_loss=0.9998, diff_loss=0.3593, tot_loss=1.661, over 106.00 samples.], tot_loss[dur_loss=0.2959, prior_loss=0.9989, diff_loss=0.4636, tot_loss=1.758, over 1142.00 samples.],
2024-10-20 23:10:41,278 INFO [train.py:682] (1/4) Start epoch 631
2024-10-20 23:10:49,992 INFO [train.py:561] (1/4) Epoch 631, batch 0, global_batch_idx: 10080, batch size: 108, loss[dur_loss=0.311, prior_loss=1, diff_loss=0.3439, tot_loss=1.655, over 108.00 samples.], tot_loss[dur_loss=0.311, prior_loss=1, diff_loss=0.3439, tot_loss=1.655, over 108.00 samples.],
2024-10-20 23:11:04,172 INFO [train.py:561] (1/4) Epoch 631, batch 10, global_batch_idx: 10090, batch size: 111, loss[dur_loss=0.3117, prior_loss=1.001, diff_loss=0.3526, tot_loss=1.665, over 111.00 samples.], tot_loss[dur_loss=0.3001, prior_loss=0.9992, diff_loss=0.4363, tot_loss=1.736, over 1656.00 samples.],
2024-10-20 23:11:11,281 INFO [train.py:682] (1/4) Start epoch 632
2024-10-20 23:11:24,888 INFO [train.py:561] (1/4) Epoch 632, batch 4, global_batch_idx: 10100, batch size: 189, loss[dur_loss=0.2986, prior_loss=0.9998, diff_loss=0.3913, tot_loss=1.69, over 189.00 samples.], tot_loss[dur_loss=0.2958, prior_loss=0.9986, diff_loss=0.4826, tot_loss=1.777, over 937.00 samples.],
2024-10-20 23:11:39,733 INFO [train.py:561] (1/4) Epoch 632, batch 14, global_batch_idx: 10110, batch size: 142, loss[dur_loss=0.2995, prior_loss=0.999, diff_loss=0.3681, tot_loss=1.667, over 142.00 samples.], tot_loss[dur_loss=0.2999, prior_loss=0.9991, diff_loss=0.4208, tot_loss=1.72, over 2210.00 samples.],
2024-10-20 23:11:41,176 INFO [train.py:682] (1/4) Start epoch 633
2024-10-20 23:12:01,276 INFO [train.py:561] (1/4) Epoch 633, batch 8, global_batch_idx: 10120, batch size: 170, loss[dur_loss=0.3007, prior_loss=0.9995, diff_loss=0.3914, tot_loss=1.692, over 170.00 samples.], tot_loss[dur_loss=0.2981, prior_loss=0.9991, diff_loss=0.4517, tot_loss=1.749, over 1432.00 samples.],
2024-10-20 23:12:11,410 INFO [train.py:682] (1/4) Start epoch 634
2024-10-20 23:12:22,921 INFO [train.py:561] (1/4) Epoch 634, batch 2, global_batch_idx: 10130, batch size: 203, loss[dur_loss=0.3075, prior_loss=1, diff_loss=0.3881, tot_loss=1.696, over 203.00 samples.], tot_loss[dur_loss=0.3057, prior_loss=1, diff_loss=0.3763, tot_loss=1.682, over 442.00 samples.],
2024-10-20 23:12:37,175 INFO [train.py:561] (1/4) Epoch 634, batch 12, global_batch_idx: 10140, batch size: 152, loss[dur_loss=0.2997, prior_loss=0.9992, diff_loss=0.3644, tot_loss=1.663, over 152.00 samples.], tot_loss[dur_loss=0.3004, prior_loss=0.9994, diff_loss=0.4237, tot_loss=1.723, over 1966.00 samples.],
2024-10-20 23:12:41,671 INFO [train.py:682] (1/4) Start epoch 635
2024-10-20 23:12:58,661 INFO [train.py:561] (1/4) Epoch 635, batch 6, global_batch_idx: 10150, batch size: 106, loss[dur_loss=0.3021, prior_loss=1, diff_loss=0.3976, tot_loss=1.7, over 106.00 samples.], tot_loss[dur_loss=0.2982, prior_loss=0.999, diff_loss=0.4658, tot_loss=1.763, over 1142.00 samples.],
2024-10-20 23:13:11,727 INFO [train.py:682] (1/4) Start epoch 636
2024-10-20 23:13:20,628 INFO [train.py:561] (1/4) Epoch 636, batch 0, global_batch_idx: 10160, batch size: 108, loss[dur_loss=0.3099, prior_loss=1.001, diff_loss=0.3609, tot_loss=1.671, over 108.00 samples.], tot_loss[dur_loss=0.3099, prior_loss=1.001, diff_loss=0.3609, tot_loss=1.671, over 108.00 samples.],
2024-10-20 23:13:35,153 INFO [train.py:561] (1/4) Epoch 636, batch 10, global_batch_idx: 10170, batch size: 111, loss[dur_loss=0.3104, prior_loss=1.001, diff_loss=0.3804, tot_loss=1.692, over 111.00 samples.], tot_loss[dur_loss=0.2994, prior_loss=0.9993, diff_loss=0.4339, tot_loss=1.733, over 1656.00 samples.],
2024-10-20 23:13:42,358 INFO [train.py:682] (1/4) Start epoch 637
2024-10-20 23:13:56,331 INFO [train.py:561] (1/4) Epoch 637, batch 4, global_batch_idx: 10180, batch size: 189, loss[dur_loss=0.3058, prior_loss=1, diff_loss=0.4096, tot_loss=1.715, over 189.00 samples.], tot_loss[dur_loss=0.2974, prior_loss=0.9987, diff_loss=0.4864, tot_loss=1.783, over 937.00 samples.],
2024-10-20 23:14:11,330 INFO [train.py:561] (1/4) Epoch 637, batch 14, global_batch_idx: 10190, batch size: 142, loss[dur_loss=0.2992, prior_loss=0.999, diff_loss=0.3911, tot_loss=1.689, over 142.00 samples.], tot_loss[dur_loss=0.3009, prior_loss=0.9993, diff_loss=0.4258, tot_loss=1.726, over 2210.00 samples.],
2024-10-20 23:14:12,766 INFO [train.py:682] (1/4) Start epoch 638
2024-10-20 23:14:32,654 INFO [train.py:561] (1/4) Epoch 638, batch 8, global_batch_idx: 10200, batch size: 170, loss[dur_loss=0.3021, prior_loss=0.9991, diff_loss=0.3836, tot_loss=1.685, over 170.00 samples.], tot_loss[dur_loss=0.2969, prior_loss=0.9986, diff_loss=0.4316, tot_loss=1.727, over 1432.00 samples.],
2024-10-20 23:14:42,886 INFO [train.py:682] (1/4) Start epoch 639
2024-10-20 23:14:54,124 INFO [train.py:561] (1/4) Epoch 639, batch 2, global_batch_idx: 10210, batch size: 203, loss[dur_loss=0.3019, prior_loss=0.999, diff_loss=0.3805, tot_loss=1.681, over 203.00 samples.], tot_loss[dur_loss=0.3029, prior_loss=0.9992, diff_loss=0.3677, tot_loss=1.67, over 442.00 samples.],
2024-10-20 23:15:08,674 INFO [train.py:561] (1/4) Epoch 639, batch 12, global_batch_idx: 10220, batch size: 152, loss[dur_loss=0.305, prior_loss=0.9989, diff_loss=0.3705, tot_loss=1.674, over 152.00 samples.], tot_loss[dur_loss=0.299, prior_loss=0.9987, diff_loss=0.4181, tot_loss=1.716, over 1966.00 samples.],
2024-10-20 23:15:13,105 INFO [train.py:682] (1/4) Start epoch 640
2024-10-20 23:15:30,024 INFO [train.py:561] (1/4) Epoch 640, batch 6, global_batch_idx: 10230, batch size: 106, loss[dur_loss=0.2991, prior_loss=0.999, diff_loss=0.3802, tot_loss=1.678, over 106.00 samples.], tot_loss[dur_loss=0.2937, prior_loss=0.9983, diff_loss=0.4721, tot_loss=1.764, over 1142.00 samples.],
2024-10-20 23:15:43,129 INFO [train.py:682] (1/4) Start epoch 641
2024-10-20 23:15:51,693 INFO [train.py:561] (1/4) Epoch 641, batch 0, global_batch_idx: 10240, batch size: 108, loss[dur_loss=0.3077, prior_loss=0.9997, diff_loss=0.3503, tot_loss=1.658, over 108.00 samples.], tot_loss[dur_loss=0.3077, prior_loss=0.9997, diff_loss=0.3503, tot_loss=1.658, over 108.00 samples.],
2024-10-20 23:16:05,970 INFO [train.py:561] (1/4) Epoch 641, batch 10, global_batch_idx: 10250, batch size: 111, loss[dur_loss=0.3091, prior_loss=1, diff_loss=0.346, tot_loss=1.655, over 111.00 samples.], tot_loss[dur_loss=0.2978, prior_loss=0.9985, diff_loss=0.4309, tot_loss=1.727, over 1656.00 samples.],
2024-10-20 23:16:13,059 INFO [train.py:682] (1/4) Start epoch 642
2024-10-20 23:16:26,608 INFO [train.py:561] (1/4) Epoch 642, batch 4, global_batch_idx: 10260, batch size: 189, loss[dur_loss=0.299, prior_loss=0.9996, diff_loss=0.4045, tot_loss=1.703, over 189.00 samples.], tot_loss[dur_loss=0.2938, prior_loss=0.9981, diff_loss=0.4932, tot_loss=1.785, over 937.00 samples.],
2024-10-20 23:16:41,524 INFO [train.py:561] (1/4) Epoch 642, batch 14, global_batch_idx: 10270, batch size: 142, loss[dur_loss=0.2973, prior_loss=0.9988, diff_loss=0.361, tot_loss=1.657, over 142.00 samples.], tot_loss[dur_loss=0.2977, prior_loss=0.9986, diff_loss=0.4186, tot_loss=1.715, over 2210.00 samples.],
2024-10-20 23:16:42,966 INFO [train.py:682] (1/4) Start epoch 643
2024-10-20 23:17:03,024 INFO [train.py:561] (1/4) Epoch 643, batch 8, global_batch_idx: 10280, batch size: 170, loss[dur_loss=0.3017, prior_loss=0.9988, diff_loss=0.3603, tot_loss=1.661, over 170.00 samples.], tot_loss[dur_loss=0.2963, prior_loss=0.9985, diff_loss=0.442, tot_loss=1.737, over 1432.00 samples.],
2024-10-20 23:17:13,200 INFO [train.py:682] (1/4) Start epoch 644
2024-10-20 23:17:24,489 INFO [train.py:561] (1/4) Epoch 644, batch 2, global_batch_idx: 10290, batch size: 203, loss[dur_loss=0.2954, prior_loss=0.9989, diff_loss=0.3964, tot_loss=1.691, over 203.00 samples.], tot_loss[dur_loss=0.3013, prior_loss=0.9994, diff_loss=0.3708, tot_loss=1.671, over 442.00 samples.],
2024-10-20 23:17:38,738 INFO [train.py:561] (1/4) Epoch 644, batch 12, global_batch_idx: 10300, batch size: 152, loss[dur_loss=0.2993, prior_loss=0.9988, diff_loss=0.3643, tot_loss=1.662, over 152.00 samples.], tot_loss[dur_loss=0.2969, prior_loss=0.9988, diff_loss=0.4313, tot_loss=1.727, over 1966.00 samples.],
2024-10-20 23:17:43,212 INFO [train.py:682] (1/4) Start epoch 645
2024-10-20 23:18:00,298 INFO [train.py:561] (1/4) Epoch 645, batch 6, global_batch_idx: 10310, batch size: 106, loss[dur_loss=0.2978, prior_loss=0.9987, diff_loss=0.3875, tot_loss=1.684, over 106.00 samples.], tot_loss[dur_loss=0.2928, prior_loss=0.9982, diff_loss=0.4594, tot_loss=1.75, over 1142.00 samples.],
2024-10-20 23:18:13,383 INFO [train.py:682] (1/4) Start epoch 646
2024-10-20 23:18:22,102 INFO [train.py:561] (1/4) Epoch 646, batch 0, global_batch_idx: 10320, batch size: 108, loss[dur_loss=0.3069, prior_loss=0.9998, diff_loss=0.3397, tot_loss=1.646, over 108.00 samples.], tot_loss[dur_loss=0.3069, prior_loss=0.9998, diff_loss=0.3397, tot_loss=1.646, over 108.00 samples.],
2024-10-20 23:18:36,352 INFO [train.py:561] (1/4) Epoch 646, batch 10, global_batch_idx: 10330, batch size: 111, loss[dur_loss=0.3106, prior_loss=1.001, diff_loss=0.3878, tot_loss=1.699, over 111.00 samples.], tot_loss[dur_loss=0.2975, prior_loss=0.9988, diff_loss=0.4303, tot_loss=1.727, over 1656.00 samples.],
2024-10-20 23:18:43,477 INFO [train.py:682] (1/4) Start epoch 647
2024-10-20 23:18:57,136 INFO [train.py:561] (1/4) Epoch 647, batch 4, global_batch_idx: 10340, batch size: 189, loss[dur_loss=0.2978, prior_loss=0.9992, diff_loss=0.3972, tot_loss=1.694, over 189.00 samples.], tot_loss[dur_loss=0.2941, prior_loss=0.9982, diff_loss=0.4927, tot_loss=1.785, over 937.00 samples.],
2024-10-20 23:19:11,994 INFO [train.py:561] (1/4) Epoch 647, batch 14, global_batch_idx: 10350, batch size: 142, loss[dur_loss=0.3, prior_loss=0.9987, diff_loss=0.357, tot_loss=1.656, over 142.00 samples.], tot_loss[dur_loss=0.2992, prior_loss=0.9988, diff_loss=0.4212, tot_loss=1.719, over 2210.00 samples.],
2024-10-20 23:19:13,422 INFO [train.py:682] (1/4) Start epoch 648
2024-10-20 23:19:33,118 INFO [train.py:561] (1/4) Epoch 648, batch 8, global_batch_idx: 10360, batch size: 170, loss[dur_loss=0.2985, prior_loss=0.9989, diff_loss=0.3858, tot_loss=1.683, over 170.00 samples.], tot_loss[dur_loss=0.2954, prior_loss=0.9985, diff_loss=0.4402, tot_loss=1.734, over 1432.00 samples.],
2024-10-20 23:19:43,306 INFO [train.py:682] (1/4) Start epoch 649
2024-10-20 23:19:54,950 INFO [train.py:561] (1/4) Epoch 649, batch 2, global_batch_idx: 10370, batch size: 203, loss[dur_loss=0.3036, prior_loss=0.9989, diff_loss=0.3853, tot_loss=1.688, over 203.00 samples.], tot_loss[dur_loss=0.3056, prior_loss=0.9994, diff_loss=0.3796, tot_loss=1.685, over 442.00 samples.],
2024-10-20 23:20:09,216 INFO [train.py:561] (1/4) Epoch 649, batch 12, global_batch_idx: 10380, batch size: 152, loss[dur_loss=0.2962, prior_loss=0.9985, diff_loss=0.4104, tot_loss=1.705, over 152.00 samples.], tot_loss[dur_loss=0.2985, prior_loss=0.9987, diff_loss=0.4296, tot_loss=1.727, over 1966.00 samples.],
2024-10-20 23:20:13,682 INFO [train.py:682] (1/4) Start epoch 650
2024-10-20 23:20:30,565 INFO [train.py:561] (1/4) Epoch 650, batch 6, global_batch_idx: 10390, batch size: 106, loss[dur_loss=0.301, prior_loss=0.9988, diff_loss=0.3331, tot_loss=1.633, over 106.00 samples.], tot_loss[dur_loss=0.2936, prior_loss=0.9982, diff_loss=0.4595, tot_loss=1.751, over 1142.00 samples.],
2024-10-20 23:20:43,692 INFO [train.py:682] (1/4) Start epoch 651
2024-10-20 23:20:52,271 INFO [train.py:561] (1/4) Epoch 651, batch 0, global_batch_idx: 10400, batch size: 108, loss[dur_loss=0.3093, prior_loss=0.9999, diff_loss=0.3319, tot_loss=1.641, over 108.00 samples.], tot_loss[dur_loss=0.3093, prior_loss=0.9999, diff_loss=0.3319, tot_loss=1.641, over 108.00 samples.],
2024-10-20 23:21:06,835 INFO [train.py:561] (1/4) Epoch 651, batch 10, global_batch_idx: 10410, batch size: 111, loss[dur_loss=0.3068, prior_loss=1, diff_loss=0.3678, tot_loss=1.675, over 111.00 samples.], tot_loss[dur_loss=0.2963, prior_loss=0.9986, diff_loss=0.4284, tot_loss=1.723, over 1656.00 samples.],
2024-10-20 23:21:13,952 INFO [train.py:682] (1/4) Start epoch 652
2024-10-20 23:21:27,700 INFO [train.py:561] (1/4) Epoch 652, batch 4, global_batch_idx: 10420, batch size: 189, loss[dur_loss=0.3017, prior_loss=0.9993, diff_loss=0.4029, tot_loss=1.704, over 189.00 samples.], tot_loss[dur_loss=0.2936, prior_loss=0.998, diff_loss=0.4693, tot_loss=1.761, over 937.00 samples.],
2024-10-20 23:21:42,634 INFO [train.py:561] (1/4) Epoch 652, batch 14, global_batch_idx: 10430, batch size: 142, loss[dur_loss=0.3034, prior_loss=0.9997, diff_loss=0.3782, tot_loss=1.681, over 142.00 samples.], tot_loss[dur_loss=0.298, prior_loss=0.9987, diff_loss=0.4041, tot_loss=1.701, over 2210.00 samples.],
2024-10-20 23:21:44,067 INFO [train.py:682] (1/4) Start epoch 653
2024-10-20 23:22:03,803 INFO [train.py:561] (1/4) Epoch 653, batch 8, global_batch_idx: 10440, batch size: 170, loss[dur_loss=0.3007, prior_loss=0.9986, diff_loss=0.399, tot_loss=1.698, over 170.00 samples.], tot_loss[dur_loss=0.2957, prior_loss=0.9983, diff_loss=0.4457, tot_loss=1.74, over 1432.00 samples.],
2024-10-20 23:22:14,165 INFO [train.py:682] (1/4) Start epoch 654
2024-10-20 23:22:25,425 INFO [train.py:561] (1/4) Epoch 654, batch 2, global_batch_idx: 10450, batch size: 203, loss[dur_loss=0.2987, prior_loss=0.9989, diff_loss=0.396, tot_loss=1.694, over 203.00 samples.], tot_loss[dur_loss=0.2999, prior_loss=0.9991, diff_loss=0.3807, tot_loss=1.68, over 442.00 samples.],
2024-10-20 23:22:39,782 INFO [train.py:561] (1/4) Epoch 654, batch 12, global_batch_idx: 10460, batch size: 152, loss[dur_loss=0.2998, prior_loss=0.9984, diff_loss=0.349, tot_loss=1.647, over 152.00 samples.], tot_loss[dur_loss=0.2968, prior_loss=0.9984, diff_loss=0.4238, tot_loss=1.719, over 1966.00 samples.],
2024-10-20 23:22:44,282 INFO [train.py:682] (1/4) Start epoch 655
2024-10-20 23:23:01,632 INFO [train.py:561] (1/4) Epoch 655, batch 6, global_batch_idx: 10470, batch size: 106, loss[dur_loss=0.2995, prior_loss=0.9987, diff_loss=0.3967, tot_loss=1.695, over 106.00 samples.], tot_loss[dur_loss=0.2925, prior_loss=0.9981, diff_loss=0.4775, tot_loss=1.768, over 1142.00 samples.],
2024-10-20 23:23:14,774 INFO [train.py:682] (1/4) Start epoch 656
2024-10-20 23:23:23,439 INFO [train.py:561] (1/4) Epoch 656, batch 0, global_batch_idx: 10480, batch size: 108, loss[dur_loss=0.3064, prior_loss=0.9994, diff_loss=0.3759, tot_loss=1.682, over 108.00 samples.], tot_loss[dur_loss=0.3064, prior_loss=0.9994, diff_loss=0.3759, tot_loss=1.682, over 108.00 samples.],
2024-10-20 23:23:37,707 INFO [train.py:561] (1/4) Epoch 656, batch 10, global_batch_idx: 10490, batch size: 111, loss[dur_loss=0.3087, prior_loss=1, diff_loss=0.3829, tot_loss=1.692, over 111.00 samples.], tot_loss[dur_loss=0.295, prior_loss=0.9983, diff_loss=0.4305, tot_loss=1.724, over 1656.00 samples.],
2024-10-20 23:23:44,866 INFO [train.py:682] (1/4) Start epoch 657
2024-10-20 23:23:58,815 INFO [train.py:561] (1/4) Epoch 657, batch 4, global_batch_idx: 10500, batch size: 189, loss[dur_loss=0.2967, prior_loss=0.9994, diff_loss=0.362, tot_loss=1.658, over 189.00 samples.], tot_loss[dur_loss=0.2906, prior_loss=0.9979, diff_loss=0.4755, tot_loss=1.764, over 937.00 samples.],
2024-10-20 23:24:00,451 INFO [train.py:579] (1/4) Computing validation loss
2024-10-20 23:24:35,693 INFO [train.py:589] (1/4) Epoch 657, validation: dur_loss=0.4324, prior_loss=1.028, diff_loss=0.3907, tot_loss=1.851, over 100.00 samples.
2024-10-20 23:24:35,695 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB
2024-10-20 23:24:49,077 INFO [train.py:561] (1/4) Epoch 657, batch 14, global_batch_idx: 10510, batch size: 142, loss[dur_loss=0.3007, prior_loss=0.9984, diff_loss=0.3503, tot_loss=1.649, over 142.00 samples.], tot_loss[dur_loss=0.2964, prior_loss=0.9984, diff_loss=0.4156, tot_loss=1.71, over 2210.00 samples.],
2024-10-20 23:24:50,512 INFO [train.py:682] (1/4) Start epoch 658
2024-10-20 23:25:10,653 INFO [train.py:561] (1/4) Epoch 658, batch 8, global_batch_idx: 10520, batch size: 170, loss[dur_loss=0.2978, prior_loss=0.9989, diff_loss=0.3617, tot_loss=1.658, over 170.00 samples.], tot_loss[dur_loss=0.2936, prior_loss=0.9981, diff_loss=0.435, tot_loss=1.727, over 1432.00 samples.],
2024-10-20 23:25:20,783 INFO [train.py:682] (1/4) Start epoch 659
2024-10-20 23:25:32,207 INFO [train.py:561] (1/4) Epoch 659, batch 2, global_batch_idx: 10530, batch size: 203, loss[dur_loss=0.303, prior_loss=0.9989, diff_loss=0.4019, tot_loss=1.704, over 203.00 samples.], tot_loss[dur_loss=0.3024, prior_loss=0.999, diff_loss=0.3805, tot_loss=1.682, over 442.00 samples.],
2024-10-20 23:25:46,505 INFO [train.py:561] (1/4) Epoch 659, batch 12, global_batch_idx: 10540, batch size: 152, loss[dur_loss=0.296, prior_loss=0.9986, diff_loss=0.3799, tot_loss=1.674, over 152.00 samples.], tot_loss[dur_loss=0.2966, prior_loss=0.9986, diff_loss=0.4261, tot_loss=1.721, over 1966.00 samples.],
2024-10-20 23:25:50,957 INFO [train.py:682] (1/4) Start epoch 660
2024-10-20 23:26:08,091 INFO [train.py:561] (1/4) Epoch 660, batch 6, global_batch_idx: 10550, batch size: 106, loss[dur_loss=0.2987, prior_loss=0.9984, diff_loss=0.3673, tot_loss=1.664, over 106.00 samples.], tot_loss[dur_loss=0.2922, prior_loss=0.998, diff_loss=0.4644, tot_loss=1.755, over 1142.00 samples.],
2024-10-20 23:26:21,169 INFO [train.py:682] (1/4) Start epoch 661
2024-10-20 23:26:30,275 INFO [train.py:561] (1/4) Epoch 661, batch 0, global_batch_idx: 10560, batch size: 108, loss[dur_loss=0.304, prior_loss=0.9992, diff_loss=0.3746, tot_loss=1.678, over 108.00 samples.], tot_loss[dur_loss=0.304, prior_loss=0.9992, diff_loss=0.3746, tot_loss=1.678, over 108.00 samples.],
2024-10-20 23:26:44,481 INFO [train.py:561] (1/4) Epoch 661, batch 10, global_batch_idx: 10570, batch size: 111, loss[dur_loss=0.3053, prior_loss=0.9999, diff_loss=0.3484, tot_loss=1.654, over 111.00 samples.], tot_loss[dur_loss=0.2947, prior_loss=0.9983, diff_loss=0.4302, tot_loss=1.723, over 1656.00 samples.],
2024-10-20 23:26:51,543 INFO [train.py:682] (1/4) Start epoch 662
2024-10-20 23:27:05,354 INFO [train.py:561] (1/4) Epoch 662, batch 4, global_batch_idx: 10580, batch size: 189, loss[dur_loss=0.2958, prior_loss=0.9992, diff_loss=0.3998, tot_loss=1.695, over 189.00 samples.], tot_loss[dur_loss=0.2901, prior_loss=0.9976, diff_loss=0.484, tot_loss=1.772, over 937.00 samples.],
2024-10-20 23:27:20,200 INFO [train.py:561] (1/4) Epoch 662, batch 14, global_batch_idx: 10590, batch size: 142, loss[dur_loss=0.299, prior_loss=0.9983, diff_loss=0.3577, tot_loss=1.655, over 142.00 samples.], tot_loss[dur_loss=0.2955, prior_loss=0.9983, diff_loss=0.4127, tot_loss=1.707, over 2210.00 samples.],
2024-10-20 23:27:21,626 INFO [train.py:682] (1/4) Start epoch 663
2024-10-20 23:27:41,548 INFO [train.py:561] (1/4) Epoch 663, batch 8, global_batch_idx: 10600, batch size: 170, loss[dur_loss=0.2979, prior_loss=0.9987, diff_loss=0.3828, tot_loss=1.679, over 170.00 samples.], tot_loss[dur_loss=0.2934, prior_loss=0.9979, diff_loss=0.4534, tot_loss=1.745, over 1432.00 samples.],
2024-10-20 23:27:51,673 INFO [train.py:682] (1/4) Start epoch 664
2024-10-20 23:28:02,982 INFO [train.py:561] (1/4) Epoch 664, batch 2, global_batch_idx: 10610, batch size: 203, loss[dur_loss=0.2952, prior_loss=0.9984, diff_loss=0.4019, tot_loss=1.696, over 203.00 samples.], tot_loss[dur_loss=0.2984, prior_loss=0.9986, diff_loss=0.385, tot_loss=1.682, over 442.00 samples.],
2024-10-20 23:28:17,249 INFO [train.py:561] (1/4) Epoch 664, batch 12, global_batch_idx: 10620, batch size: 152, loss[dur_loss=0.2967, prior_loss=0.9983, diff_loss=0.3754, tot_loss=1.67, over 152.00 samples.], tot_loss[dur_loss=0.2938, prior_loss=0.998, diff_loss=0.412, tot_loss=1.704, over 1966.00 samples.],
2024-10-20 23:28:21,713 INFO [train.py:682] (1/4) Start epoch 665
2024-10-20 23:28:38,665 INFO [train.py:561] (1/4) Epoch 665, batch 6, global_batch_idx: 10630, batch size: 106, loss[dur_loss=0.2964, prior_loss=0.9984, diff_loss=0.3397, tot_loss=1.635, over 106.00 samples.], tot_loss[dur_loss=0.293, prior_loss=0.9978, diff_loss=0.463, tot_loss=1.754, over 1142.00 samples.],
2024-10-20 23:28:51,716 INFO [train.py:682] (1/4) Start epoch 666
2024-10-20 23:29:00,879 INFO [train.py:561] (1/4) Epoch 666, batch 0, global_batch_idx: 10640, batch size: 108, loss[dur_loss=0.3069, prior_loss=0.9991, diff_loss=0.3301, tot_loss=1.636, over 108.00 samples.], tot_loss[dur_loss=0.3069, prior_loss=0.9991, diff_loss=0.3301, tot_loss=1.636, over 108.00 samples.],
2024-10-20 23:29:15,286 INFO [train.py:561] (1/4) Epoch 666, batch 10, global_batch_idx: 10650, batch size: 111, loss[dur_loss=0.3097, prior_loss=1, diff_loss=0.3781, tot_loss=1.688, over 111.00 samples.], tot_loss[dur_loss=0.2954, prior_loss=0.9981, diff_loss=0.432, tot_loss=1.726, over 1656.00 samples.],
2024-10-20 23:29:22,440 INFO [train.py:682] (1/4) Start epoch 667
2024-10-20 23:29:36,289 INFO [train.py:561] (1/4) Epoch 667, batch 4, global_batch_idx: 10660, batch size: 189, loss[dur_loss=0.2966, prior_loss=0.9985, diff_loss=0.395, tot_loss=1.69, over 189.00 samples.], tot_loss[dur_loss=0.2915, prior_loss=0.9975, diff_loss=0.4922, tot_loss=1.781, over 937.00 samples.],
2024-10-20 23:29:51,276 INFO [train.py:561] (1/4) Epoch 667, batch 14, global_batch_idx: 10670, batch size: 142, loss[dur_loss=0.2993, prior_loss=0.9985, diff_loss=0.3711, tot_loss=1.669, over 142.00 samples.], tot_loss[dur_loss=0.2965, prior_loss=0.9982, diff_loss=0.4238, tot_loss=1.718, over 2210.00 samples.],
2024-10-20 23:29:52,696 INFO [train.py:682] (1/4) Start epoch 668
2024-10-20 23:30:12,939 INFO [train.py:561] (1/4) Epoch 668, batch 8, global_batch_idx: 10680, batch size: 170, loss[dur_loss=0.3033, prior_loss=0.9989, diff_loss=0.3853, tot_loss=1.688, over 170.00 samples.], tot_loss[dur_loss=0.2945, prior_loss=0.9979, diff_loss=0.4503, tot_loss=1.743, over 1432.00 samples.],
2024-10-20 23:30:23,071 INFO [train.py:682] (1/4) Start epoch 669
2024-10-20 23:30:34,413 INFO [train.py:561] (1/4) Epoch 669, batch 2, global_batch_idx: 10690, batch size: 203, loss[dur_loss=0.3001, prior_loss=0.9982, diff_loss=0.3951, tot_loss=1.693, over 203.00 samples.], tot_loss[dur_loss=0.3011, prior_loss=0.9984, diff_loss=0.3732, tot_loss=1.673, over 442.00 samples.],
2024-10-20 23:30:48,981 INFO [train.py:561] (1/4) Epoch 669, batch 12, global_batch_idx: 10700, batch size: 152, loss[dur_loss=0.2966, prior_loss=0.9979, diff_loss=0.3959, tot_loss=1.69, over 152.00 samples.], tot_loss[dur_loss=0.2954, prior_loss=0.998, diff_loss=0.4247, tot_loss=1.718, over 1966.00 samples.],
2024-10-20 23:30:53,409 INFO [train.py:682] (1/4) Start epoch 670
2024-10-20 23:31:10,612 INFO [train.py:561] (1/4) Epoch 670, batch 6, global_batch_idx: 10710, batch size: 106, loss[dur_loss=0.2972, prior_loss=0.9985, diff_loss=0.3345, tot_loss=1.63, over 106.00 samples.], tot_loss[dur_loss=0.2919, prior_loss=0.9976, diff_loss=0.4597, tot_loss=1.749, over 1142.00 samples.],
2024-10-20 23:31:23,611 INFO [train.py:682] (1/4) Start epoch 671
2024-10-20 23:31:32,374 INFO [train.py:561] (1/4) Epoch 671, batch 0, global_batch_idx: 10720, batch size: 108, loss[dur_loss=0.304, prior_loss=0.9992, diff_loss=0.3712, tot_loss=1.674, over 108.00 samples.], tot_loss[dur_loss=0.304, prior_loss=0.9992, diff_loss=0.3712, tot_loss=1.674, over 108.00 samples.],
2024-10-20 23:31:46,621 INFO [train.py:561] (1/4) Epoch 671, batch 10, global_batch_idx: 10730, batch size: 111, loss[dur_loss=0.3054, prior_loss=0.9996, diff_loss=0.3533, tot_loss=1.658, over 111.00 samples.], tot_loss[dur_loss=0.2942, prior_loss=0.9981, diff_loss=0.4317, tot_loss=1.724, over 1656.00 samples.],
2024-10-20 23:31:53,713 INFO [train.py:682] (1/4) Start epoch 672
2024-10-20 23:32:07,359 INFO [train.py:561] (1/4) Epoch 672, batch 4, global_batch_idx: 10740, batch size: 189, loss[dur_loss=0.2997, prior_loss=0.999, diff_loss=0.4014, tot_loss=1.7, over 189.00 samples.], tot_loss[dur_loss=0.2914, prior_loss=0.9973, diff_loss=0.4975, tot_loss=1.786, over 937.00 samples.],
2024-10-20 23:32:22,190 INFO [train.py:561] (1/4) Epoch 672, batch 14, global_batch_idx: 10750, batch size: 142, loss[dur_loss=0.2978, prior_loss=0.9982, diff_loss=0.3974, tot_loss=1.693, over 142.00 samples.], tot_loss[dur_loss=0.2955, prior_loss=0.998, diff_loss=0.4284, tot_loss=1.722, over 2210.00 samples.],
2024-10-20 23:32:23,607 INFO [train.py:682] (1/4) Start epoch 673
2024-10-20 23:32:43,766 INFO [train.py:561] (1/4) Epoch 673, batch 8, global_batch_idx: 10760, batch size: 170, loss[dur_loss=0.2967, prior_loss=0.9984, diff_loss=0.4129, tot_loss=1.708, over 170.00 samples.], tot_loss[dur_loss=0.2931, prior_loss=0.9976, diff_loss=0.444, tot_loss=1.735, over 1432.00 samples.],
2024-10-20 23:32:53,884 INFO [train.py:682] (1/4) Start epoch 674
2024-10-20 23:33:05,226 INFO [train.py:561] (1/4) Epoch 674, batch 2, global_batch_idx: 10770, batch size: 203, loss[dur_loss=0.2949, prior_loss=0.9981, diff_loss=0.3851, tot_loss=1.678, over 203.00 samples.], tot_loss[dur_loss=0.2976, prior_loss=0.9983, diff_loss=0.3649, tot_loss=1.661, over 442.00 samples.],
2024-10-20 23:33:19,715 INFO [train.py:561] (1/4) Epoch 674, batch 12, global_batch_idx: 10780, batch size: 152, loss[dur_loss=0.2997, prior_loss=0.9979, diff_loss=0.3935, tot_loss=1.691, over 152.00 samples.], tot_loss[dur_loss=0.2948, prior_loss=0.9978, diff_loss=0.4178, tot_loss=1.71, over 1966.00 samples.],
2024-10-20 23:33:24,230 INFO [train.py:682] (1/4) Start epoch 675
2024-10-20 23:33:41,435 INFO [train.py:561] (1/4) Epoch 675, batch 6, global_batch_idx: 10790, batch size: 106, loss[dur_loss=0.2965, prior_loss=0.9978, diff_loss=0.3301, tot_loss=1.624, over 106.00 samples.], tot_loss[dur_loss=0.2905, prior_loss=0.9971, diff_loss=0.452, tot_loss=1.74, over 1142.00 samples.],
2024-10-20 23:33:54,463 INFO [train.py:682] (1/4) Start epoch 676
2024-10-20 23:34:03,538 INFO [train.py:561] (1/4) Epoch 676, batch 0, global_batch_idx: 10800, batch size: 108, loss[dur_loss=0.305, prior_loss=0.9988, diff_loss=0.3245, tot_loss=1.628, over 108.00 samples.], tot_loss[dur_loss=0.305, prior_loss=0.9988, diff_loss=0.3245, tot_loss=1.628, over 108.00 samples.],
2024-10-20 23:34:17,833 INFO [train.py:561] (1/4) Epoch 676, batch 10, global_batch_idx: 10810, batch size: 111, loss[dur_loss=0.3047, prior_loss=0.9991, diff_loss=0.3561, tot_loss=1.66, over 111.00 samples.], tot_loss[dur_loss=0.2929, prior_loss=0.9975, diff_loss=0.4311, tot_loss=1.722, over 1656.00 samples.],
2024-10-20 23:34:24,910 INFO [train.py:682] (1/4) Start epoch 677
2024-10-20 23:34:38,658 INFO [train.py:561] (1/4) Epoch 677, batch 4, global_batch_idx: 10820, batch size: 189, loss[dur_loss=0.2971, prior_loss=0.9984, diff_loss=0.3834, tot_loss=1.679, over 189.00 samples.], tot_loss[dur_loss=0.288, prior_loss=0.997, diff_loss=0.4739, tot_loss=1.759, over 937.00 samples.],
2024-10-20 23:34:53,448 INFO [train.py:561] (1/4) Epoch 677, batch 14, global_batch_idx: 10830, batch size: 142, loss[dur_loss=0.2956, prior_loss=0.9982, diff_loss=0.3441, tot_loss=1.638, over 142.00 samples.], tot_loss[dur_loss=0.2934, prior_loss=0.9976, diff_loss=0.4112, tot_loss=1.702, over 2210.00 samples.],
2024-10-20 23:34:54,877 INFO [train.py:682] (1/4) Start epoch 678
2024-10-20 23:35:15,000 INFO [train.py:561] (1/4) Epoch 678, batch 8, global_batch_idx: 10840, batch size: 170, loss[dur_loss=0.2963, prior_loss=0.9982, diff_loss=0.3644, tot_loss=1.659, over 170.00 samples.], tot_loss[dur_loss=0.2913, prior_loss=0.9975, diff_loss=0.4394, tot_loss=1.728, over 1432.00 samples.],
2024-10-20 23:35:25,244 INFO [train.py:682] (1/4) Start epoch 679
2024-10-20 23:35:37,043 INFO [train.py:561] (1/4) Epoch 679, batch 2, global_batch_idx: 10850, batch size: 203, loss[dur_loss=0.2953, prior_loss=0.9985, diff_loss=0.3814, tot_loss=1.675, over 203.00 samples.], tot_loss[dur_loss=0.2976, prior_loss=0.9986, diff_loss=0.378, tot_loss=1.674, over 442.00 samples.],
2024-10-20 23:35:51,380 INFO [train.py:561] (1/4) Epoch 679, batch 12, global_batch_idx: 10860, batch size: 152, loss[dur_loss=0.2995, prior_loss=0.9975, diff_loss=0.3444, tot_loss=1.641, over 152.00 samples.], tot_loss[dur_loss=0.2941, prior_loss=0.9977, diff_loss=0.4196, tot_loss=1.711, over 1966.00 samples.],
2024-10-20 23:35:55,838 INFO [train.py:682] (1/4) Start epoch 680
2024-10-20 23:36:13,067 INFO [train.py:561] (1/4) Epoch 680, batch 6, global_batch_idx: 10870, batch size: 106, loss[dur_loss=0.2945, prior_loss=0.9981, diff_loss=0.3369, tot_loss=1.63, over 106.00 samples.], tot_loss[dur_loss=0.2911, prior_loss=0.9975, diff_loss=0.4642, tot_loss=1.753, over 1142.00 samples.],
2024-10-20 23:36:26,099 INFO [train.py:682] (1/4) Start epoch 681
2024-10-20 23:36:34,714 INFO [train.py:561] (1/4) Epoch 681, batch 0, global_batch_idx: 10880, batch size: 108, loss[dur_loss=0.3034, prior_loss=0.9987, diff_loss=0.3267, tot_loss=1.629, over 108.00 samples.], tot_loss[dur_loss=0.3034, prior_loss=0.9987, diff_loss=0.3267, tot_loss=1.629, over 108.00 samples.],
2024-10-20 23:36:48,945 INFO [train.py:561] (1/4) Epoch 681, batch 10, global_batch_idx: 10890, batch size: 111, loss[dur_loss=0.3063, prior_loss=0.9993, diff_loss=0.3495, tot_loss=1.655, over 111.00 samples.], tot_loss[dur_loss=0.2939, prior_loss=0.9976, diff_loss=0.4265, tot_loss=1.718, over 1656.00 samples.],
2024-10-20 23:36:56,029 INFO [train.py:682] (1/4) Start epoch 682
2024-10-20 23:37:09,924 INFO [train.py:561] (1/4) Epoch 682, batch 4, global_batch_idx: 10900, batch size: 189, loss[dur_loss=0.3003, prior_loss=0.9984, diff_loss=0.387, tot_loss=1.686, over 189.00 samples.], tot_loss[dur_loss=0.2907, prior_loss=0.9972, diff_loss=0.4779, tot_loss=1.766, over 937.00 samples.],
2024-10-20 23:37:24,787 INFO [train.py:561] (1/4) Epoch 682, batch 14, global_batch_idx: 10910, batch size: 142, loss[dur_loss=0.298, prior_loss=0.9983, diff_loss=0.3703, tot_loss=1.667, over 142.00 samples.], tot_loss[dur_loss=0.2948, prior_loss=0.9977, diff_loss=0.4162, tot_loss=1.709, over 2210.00 samples.],
2024-10-20 23:37:26,225 INFO [train.py:682] (1/4) Start epoch 683
2024-10-20 23:37:46,158 INFO [train.py:561] (1/4) Epoch 683, batch 8, global_batch_idx: 10920, batch size: 170, loss[dur_loss=0.2966, prior_loss=0.9982, diff_loss=0.3613, tot_loss=1.656, over 170.00 samples.], tot_loss[dur_loss=0.2923, prior_loss=0.9975, diff_loss=0.442, tot_loss=1.732, over 1432.00 samples.],
2024-10-20 23:37:56,314 INFO [train.py:682] (1/4) Start epoch 684
2024-10-20 23:38:07,926 INFO [train.py:561] (1/4) Epoch 684, batch 2, global_batch_idx: 10930, batch size: 203, loss[dur_loss=0.2973, prior_loss=0.9977, diff_loss=0.4144, tot_loss=1.709, over 203.00 samples.], tot_loss[dur_loss=0.2973, prior_loss=0.9981, diff_loss=0.3723, tot_loss=1.668, over 442.00 samples.],
2024-10-20 23:38:22,023 INFO [train.py:561] (1/4) Epoch 684, batch 12, global_batch_idx: 10940, batch size: 152, loss[dur_loss=0.2953, prior_loss=0.9975, diff_loss=0.3796, tot_loss=1.672, over 152.00 samples.], tot_loss[dur_loss=0.2951, prior_loss=0.9976, diff_loss=0.4235, tot_loss=1.716, over 1966.00 samples.],
2024-10-20 23:38:26,487 INFO [train.py:682] (1/4) Start epoch 685
2024-10-20 23:38:43,602 INFO [train.py:561] (1/4) Epoch 685, batch 6, global_batch_idx: 10950, batch size: 106, loss[dur_loss=0.2972, prior_loss=0.998, diff_loss=0.3256, tot_loss=1.621, over 106.00 samples.], tot_loss[dur_loss=0.2892, prior_loss=0.9969, diff_loss=0.458, tot_loss=1.744, over 1142.00 samples.],
2024-10-20 23:38:56,676 INFO [train.py:682] (1/4) Start epoch 686
2024-10-20 23:39:05,312 INFO [train.py:561] (1/4) Epoch 686, batch 0, global_batch_idx: 10960, batch size: 108, loss[dur_loss=0.2998, prior_loss=0.9983, diff_loss=0.3492, tot_loss=1.647, over 108.00 samples.], tot_loss[dur_loss=0.2998, prior_loss=0.9983, diff_loss=0.3492, tot_loss=1.647, over 108.00 samples.],
2024-10-20 23:39:19,556 INFO [train.py:561] (1/4) Epoch 686, batch 10, global_batch_idx: 10970, batch size: 111, loss[dur_loss=0.3038, prior_loss=0.9989, diff_loss=0.3337, tot_loss=1.636, over 111.00 samples.], tot_loss[dur_loss=0.291, prior_loss=0.9973, diff_loss=0.439, tot_loss=1.727, over 1656.00 samples.],
2024-10-20 23:39:26,676 INFO [train.py:682] (1/4) Start epoch 687
2024-10-20 23:39:40,566 INFO [train.py:561] (1/4) Epoch 687, batch 4, global_batch_idx: 10980, batch size: 189, loss[dur_loss=0.2991, prior_loss=0.9983, diff_loss=0.4179, tot_loss=1.715, over 189.00 samples.], tot_loss[dur_loss=0.2887, prior_loss=0.9967, diff_loss=0.4924, tot_loss=1.778, over 937.00 samples.],
2024-10-20 23:39:55,505 INFO [train.py:561] (1/4) Epoch 687, batch 14, global_batch_idx: 10990, batch size: 142, loss[dur_loss=0.294, prior_loss=0.9977, diff_loss=0.3706, tot_loss=1.662, over 142.00 samples.], tot_loss[dur_loss=0.2932, prior_loss=0.9973, diff_loss=0.4192, tot_loss=1.71, over 2210.00 samples.],
2024-10-20 23:39:56,942 INFO [train.py:682] (1/4) Start epoch 688
2024-10-20 23:40:17,049 INFO [train.py:561] (1/4) Epoch 688, batch 8, global_batch_idx: 11000, batch size: 170, loss[dur_loss=0.2975, prior_loss=0.9976, diff_loss=0.3623, tot_loss=1.657, over 170.00 samples.], tot_loss[dur_loss=0.29, prior_loss=0.9969, diff_loss=0.4386, tot_loss=1.726, over 1432.00 samples.],
2024-10-20 23:40:27,262 INFO [train.py:682] (1/4) Start epoch 689
2024-10-20 23:40:39,074 INFO [train.py:561] (1/4) Epoch 689, batch 2, global_batch_idx: 11010, batch size: 203, loss[dur_loss=0.2946, prior_loss=0.9977, diff_loss=0.4073, tot_loss=1.7, over 203.00 samples.], tot_loss[dur_loss=0.296, prior_loss=0.9979, diff_loss=0.3845, tot_loss=1.678, over 442.00 samples.],
2024-10-20 23:40:53,377 INFO [train.py:561] (1/4) Epoch 689, batch 12, global_batch_idx: 11020, batch size: 152, loss[dur_loss=0.2951, prior_loss=0.997, diff_loss=0.3963, tot_loss=1.688, over 152.00 samples.], tot_loss[dur_loss=0.2917, prior_loss=0.9971, diff_loss=0.4261, tot_loss=1.715, over 1966.00 samples.],
2024-10-20 23:40:57,831 INFO [train.py:682] (1/4) Start epoch 690
2024-10-20 23:41:14,871 INFO [train.py:561] (1/4) Epoch 690, batch 6, global_batch_idx: 11030, batch size: 106, loss[dur_loss=0.2907, prior_loss=0.9975, diff_loss=0.3427, tot_loss=1.631, over 106.00 samples.], tot_loss[dur_loss=0.2874, prior_loss=0.9966, diff_loss=0.4553, tot_loss=1.739, over 1142.00 samples.],
2024-10-20 23:41:27,950 INFO [train.py:682] (1/4) Start epoch 691
2024-10-20 23:41:36,695 INFO [train.py:561] (1/4) Epoch 691, batch 0, global_batch_idx: 11040, batch size: 108, loss[dur_loss=0.3019, prior_loss=0.998, diff_loss=0.3578, tot_loss=1.658, over 108.00 samples.], tot_loss[dur_loss=0.3019, prior_loss=0.998, diff_loss=0.3578, tot_loss=1.658, over 108.00 samples.],
2024-10-20 23:41:50,984 INFO [train.py:561] (1/4) Epoch 691, batch 10, global_batch_idx: 11050, batch size: 111, loss[dur_loss=0.3011, prior_loss=0.9986, diff_loss=0.3389, tot_loss=1.639, over 111.00 samples.], tot_loss[dur_loss=0.2907, prior_loss=0.997, diff_loss=0.4284, tot_loss=1.716, over 1656.00 samples.],
2024-10-20 23:41:58,088 INFO [train.py:682] (1/4) Start epoch 692
2024-10-20 23:42:11,783 INFO [train.py:561] (1/4) Epoch 692, batch 4, global_batch_idx: 11060, batch size: 189, loss[dur_loss=0.2936, prior_loss=0.9975, diff_loss=0.3973, tot_loss=1.688, over 189.00 samples.], tot_loss[dur_loss=0.2858, prior_loss=0.9964, diff_loss=0.4894, tot_loss=1.772, over 937.00 samples.],
2024-10-20 23:42:26,694 INFO [train.py:561] (1/4) Epoch 692, batch 14, global_batch_idx: 11070, batch size: 142, loss[dur_loss=0.2946, prior_loss=0.9973, diff_loss=0.3335, tot_loss=1.625, over 142.00 samples.], tot_loss[dur_loss=0.2907, prior_loss=0.9971, diff_loss=0.4182, tot_loss=1.706, over 2210.00 samples.],
2024-10-20 23:42:28,128 INFO [train.py:682] (1/4) Start epoch 693
2024-10-20 23:42:48,417 INFO [train.py:561] (1/4) Epoch 693, batch 8, global_batch_idx: 11080, batch size: 170, loss[dur_loss=0.2951, prior_loss=0.9975, diff_loss=0.4037, tot_loss=1.696, over 170.00 samples.], tot_loss[dur_loss=0.2902, prior_loss=0.997, diff_loss=0.4533, tot_loss=1.74, over 1432.00 samples.],
2024-10-20 23:42:58,621 INFO [train.py:682] (1/4) Start epoch 694
2024-10-20 23:43:10,101 INFO [train.py:561] (1/4) Epoch 694, batch 2, global_batch_idx: 11090, batch size: 203, loss[dur_loss=0.2954, prior_loss=0.9974, diff_loss=0.4072, tot_loss=1.7, over 203.00 samples.], tot_loss[dur_loss=0.2957, prior_loss=0.9977, diff_loss=0.3901, tot_loss=1.683, over 442.00 samples.],
2024-10-20 23:43:24,463 INFO [train.py:561] (1/4) Epoch 694, batch 12, global_batch_idx: 11100, batch size: 152, loss[dur_loss=0.2954, prior_loss=0.997, diff_loss=0.3965, tot_loss=1.689, over 152.00 samples.], tot_loss[dur_loss=0.2916, prior_loss=0.997, diff_loss=0.4199, tot_loss=1.708, over 1966.00 samples.],
2024-10-20 23:43:29,005 INFO [train.py:682] (1/4) Start epoch 695
2024-10-20 23:43:46,491 INFO [train.py:561] (1/4) Epoch 695, batch 6, global_batch_idx: 11110, batch size: 106, loss[dur_loss=0.2931, prior_loss=0.9973, diff_loss=0.4213, tot_loss=1.712, over 106.00 samples.], tot_loss[dur_loss=0.2856, prior_loss=0.9964, diff_loss=0.461, tot_loss=1.743, over 1142.00 samples.],
2024-10-20 23:43:59,591 INFO [train.py:682] (1/4) Start epoch 696
2024-10-20 23:44:08,311 INFO [train.py:561] (1/4) Epoch 696, batch 0, global_batch_idx: 11120, batch size: 108, loss[dur_loss=0.3011, prior_loss=0.998, diff_loss=0.3371, tot_loss=1.636, over 108.00 samples.], tot_loss[dur_loss=0.3011, prior_loss=0.998, diff_loss=0.3371, tot_loss=1.636, over 108.00 samples.],
2024-10-20 23:44:22,816 INFO [train.py:561] (1/4) Epoch 696, batch 10, global_batch_idx: 11130, batch size: 111, loss[dur_loss=0.3038, prior_loss=0.9986, diff_loss=0.3445, tot_loss=1.647, over 111.00 samples.], tot_loss[dur_loss=0.2903, prior_loss=0.9969, diff_loss=0.4321, tot_loss=1.719, over 1656.00 samples.],
2024-10-20 23:44:29,942 INFO [train.py:682] (1/4) Start epoch 697
2024-10-20 23:44:43,778 INFO [train.py:561] (1/4) Epoch 697, batch 4, global_batch_idx: 11140, batch size: 189, loss[dur_loss=0.2897, prior_loss=0.9977, diff_loss=0.4289, tot_loss=1.716, over 189.00 samples.], tot_loss[dur_loss=0.2863, prior_loss=0.9964, diff_loss=0.4969, tot_loss=1.78, over 937.00 samples.],
2024-10-20 23:44:58,783 INFO [train.py:561] (1/4) Epoch 697, batch 14, global_batch_idx: 11150, batch size: 142, loss[dur_loss=0.2911, prior_loss=0.997, diff_loss=0.4172, tot_loss=1.705, over 142.00 samples.], tot_loss[dur_loss=0.2915, prior_loss=0.997, diff_loss=0.4249, tot_loss=1.713, over 2210.00 samples.],
2024-10-20 23:45:00,210 INFO [train.py:682] (1/4) Start epoch 698
2024-10-20 23:45:20,597 INFO [train.py:561] (1/4) Epoch 698, batch 8, global_batch_idx: 11160, batch size: 170, loss[dur_loss=0.2934, prior_loss=0.9972, diff_loss=0.408, tot_loss=1.699, over 170.00 samples.], tot_loss[dur_loss=0.2892, prior_loss=0.9968, diff_loss=0.4414, tot_loss=1.727, over 1432.00 samples.],
2024-10-20 23:45:30,925 INFO [train.py:682] (1/4) Start epoch 699
2024-10-20 23:45:42,577 INFO [train.py:561] (1/4) Epoch 699, batch 2, global_batch_idx: 11170, batch size: 203, loss[dur_loss=0.2948, prior_loss=0.9974, diff_loss=0.4173, tot_loss=1.709, over 203.00 samples.], tot_loss[dur_loss=0.2951, prior_loss=0.9976, diff_loss=0.391, tot_loss=1.684, over 442.00 samples.],
2024-10-20 23:45:56,859 INFO [train.py:561] (1/4) Epoch 699, batch 12, global_batch_idx: 11180, batch size: 152, loss[dur_loss=0.2943, prior_loss=0.9971, diff_loss=0.3358, tot_loss=1.627, over 152.00 samples.], tot_loss[dur_loss=0.2904, prior_loss=0.997, diff_loss=0.4236, tot_loss=1.711, over 1966.00 samples.],
2024-10-20 23:46:01,308 INFO [train.py:682] (1/4) Start epoch 700
2024-10-20 23:46:18,546 INFO [train.py:561] (1/4) Epoch 700, batch 6, global_batch_idx: 11190, batch size: 106, loss[dur_loss=0.2911, prior_loss=0.9972, diff_loss=0.3778, tot_loss=1.666, over 106.00 samples.], tot_loss[dur_loss=0.2869, prior_loss=0.9965, diff_loss=0.4598, tot_loss=1.743, over 1142.00 samples.],
2024-10-20 23:46:31,691 INFO [train.py:682] (1/4) Start epoch 701
2024-10-20 23:46:40,928 INFO [train.py:561] (1/4) Epoch 701, batch 0, global_batch_idx: 11200, batch size: 108, loss[dur_loss=0.2992, prior_loss=0.9978, diff_loss=0.3488, tot_loss=1.646, over 108.00 samples.], tot_loss[dur_loss=0.2992, prior_loss=0.9978, diff_loss=0.3488, tot_loss=1.646, over 108.00 samples.],
2024-10-20 23:46:55,386 INFO [train.py:561] (1/4) Epoch 701, batch 10, global_batch_idx: 11210, batch size: 111, loss[dur_loss=0.301, prior_loss=0.9988, diff_loss=0.3728, tot_loss=1.673, over 111.00 samples.], tot_loss[dur_loss=0.2899, prior_loss=0.9969, diff_loss=0.4242, tot_loss=1.711, over 1656.00 samples.],
2024-10-20 23:47:02,589 INFO [train.py:682] (1/4) Start epoch 702
2024-10-20 23:47:16,527 INFO [train.py:561] (1/4) Epoch 702, batch 4, global_batch_idx: 11220, batch size: 189, loss[dur_loss=0.2899, prior_loss=0.9976, diff_loss=0.3746, tot_loss=1.662, over 189.00 samples.], tot_loss[dur_loss=0.2828, prior_loss=0.9964, diff_loss=0.4768, tot_loss=1.756, over 937.00 samples.],
2024-10-20 23:47:31,563 INFO [train.py:561] (1/4) Epoch 702, batch 14, global_batch_idx: 11230, batch size: 142, loss[dur_loss=0.2969, prior_loss=0.9974, diff_loss=0.3817, tot_loss=1.676, over 142.00 samples.], tot_loss[dur_loss=0.2902, prior_loss=0.997, diff_loss=0.415, tot_loss=1.702, over 2210.00 samples.],
2024-10-20 23:47:32,981 INFO [train.py:682] (1/4) Start epoch 703
2024-10-20 23:47:53,234 INFO [train.py:561] (1/4) Epoch 703, batch 8, global_batch_idx: 11240, batch size: 170, loss[dur_loss=0.2915, prior_loss=0.9974, diff_loss=0.3954, tot_loss=1.684, over 170.00 samples.], tot_loss[dur_loss=0.2874, prior_loss=0.9965, diff_loss=0.4424, tot_loss=1.726, over 1432.00 samples.],
2024-10-20 23:48:03,497 INFO [train.py:682] (1/4) Start epoch 704
2024-10-20 23:48:15,161 INFO [train.py:561] (1/4) Epoch 704, batch 2, global_batch_idx: 11250, batch size: 203, loss[dur_loss=0.2924, prior_loss=0.9974, diff_loss=0.4121, tot_loss=1.702, over 203.00 samples.], tot_loss[dur_loss=0.2942, prior_loss=0.9975, diff_loss=0.3787, tot_loss=1.67, over 442.00 samples.],
2024-10-20 23:48:29,582 INFO [train.py:561] (1/4) Epoch 704, batch 12, global_batch_idx: 11260, batch size: 152, loss[dur_loss=0.2954, prior_loss=0.9967, diff_loss=0.3788, tot_loss=1.671, over 152.00 samples.], tot_loss[dur_loss=0.2903, prior_loss=0.9968, diff_loss=0.428, tot_loss=1.715, over 1966.00 samples.],
2024-10-20 23:48:34,084 INFO [train.py:682] (1/4) Start epoch 705
2024-10-20 23:48:51,345 INFO [train.py:561] (1/4) Epoch 705, batch 6, global_batch_idx: 11270, batch size: 106, loss[dur_loss=0.294, prior_loss=0.997, diff_loss=0.3535, tot_loss=1.645, over 106.00 samples.], tot_loss[dur_loss=0.2868, prior_loss=0.9962, diff_loss=0.4556, tot_loss=1.739, over 1142.00 samples.],
2024-10-20 23:49:04,460 INFO [train.py:682] (1/4) Start epoch 706
2024-10-20 23:49:13,463 INFO [train.py:561] (1/4) Epoch 706, batch 0, global_batch_idx: 11280, batch size: 108, loss[dur_loss=0.3001, prior_loss=0.998, diff_loss=0.3532, tot_loss=1.651, over 108.00 samples.], tot_loss[dur_loss=0.3001, prior_loss=0.998, diff_loss=0.3532, tot_loss=1.651, over 108.00 samples.],
2024-10-20 23:49:27,716 INFO [train.py:561] (1/4) Epoch 706, batch 10, global_batch_idx: 11290, batch size: 111, loss[dur_loss=0.3011, prior_loss=0.9985, diff_loss=0.3874, tot_loss=1.687, over 111.00 samples.], tot_loss[dur_loss=0.2902, prior_loss=0.9969, diff_loss=0.4188, tot_loss=1.706, over 1656.00 samples.],
2024-10-20 23:49:34,797 INFO [train.py:682] (1/4) Start epoch 707
2024-10-20 23:49:48,776 INFO [train.py:561] (1/4) Epoch 707, batch 4, global_batch_idx: 11300, batch size: 189, loss[dur_loss=0.2948, prior_loss=0.9973, diff_loss=0.4083, tot_loss=1.7, over 189.00 samples.], tot_loss[dur_loss=0.2862, prior_loss=0.9961, diff_loss=0.4759, tot_loss=1.758, over 937.00 samples.],
2024-10-20 23:50:03,744 INFO [train.py:561] (1/4) Epoch 707, batch 14, global_batch_idx: 11310, batch size: 142, loss[dur_loss=0.2913, prior_loss=0.997, diff_loss=0.3471, tot_loss=1.635, over 142.00 samples.], tot_loss[dur_loss=0.2903, prior_loss=0.9967, diff_loss=0.413, tot_loss=1.7, over 2210.00 samples.],
2024-10-20 23:50:05,179 INFO [train.py:682] (1/4) Start epoch 708
2024-10-20 23:50:25,381 INFO [train.py:561] (1/4) Epoch 708, batch 8, global_batch_idx: 11320, batch size: 170, loss[dur_loss=0.2897, prior_loss=0.9967, diff_loss=0.374, tot_loss=1.66, over 170.00 samples.], tot_loss[dur_loss=0.2868, prior_loss=0.9963, diff_loss=0.4443, tot_loss=1.727, over 1432.00 samples.],
2024-10-20 23:50:35,605 INFO [train.py:682] (1/4) Start epoch 709
2024-10-20 23:50:47,659 INFO [train.py:561] (1/4) Epoch 709, batch 2, global_batch_idx: 11330, batch size: 203, loss[dur_loss=0.2959, prior_loss=0.9972, diff_loss=0.3873, tot_loss=1.68, over 203.00 samples.], tot_loss[dur_loss=0.2939, prior_loss=0.9973, diff_loss=0.3861, tot_loss=1.677, over 442.00 samples.],
2024-10-20 23:51:02,146 INFO [train.py:561] (1/4) Epoch 709, batch 12, global_batch_idx: 11340, batch size: 152, loss[dur_loss=0.2985, prior_loss=0.9967, diff_loss=0.3984, tot_loss=1.694, over 152.00 samples.], tot_loss[dur_loss=0.291, prior_loss=0.9969, diff_loss=0.4287, tot_loss=1.717, over 1966.00 samples.],
2024-10-20 23:51:06,648 INFO [train.py:682] (1/4) Start epoch 710
2024-10-20 23:51:23,874 INFO [train.py:561] (1/4) Epoch 710, batch 6, global_batch_idx: 11350, batch size: 106, loss[dur_loss=0.2885, prior_loss=0.9972, diff_loss=0.3641, tot_loss=1.65, over 106.00 samples.], tot_loss[dur_loss=0.2874, prior_loss=0.9965, diff_loss=0.464, tot_loss=1.748, over 1142.00 samples.],
2024-10-20 23:51:36,995 INFO [train.py:682] (1/4) Start epoch 711
2024-10-20 23:51:45,758 INFO [train.py:561] (1/4) Epoch 711, batch 0, global_batch_idx: 11360, batch size: 108, loss[dur_loss=0.3027, prior_loss=0.9981, diff_loss=0.3832, tot_loss=1.684, over 108.00 samples.], tot_loss[dur_loss=0.3027, prior_loss=0.9981, diff_loss=0.3832, tot_loss=1.684, over 108.00 samples.],
2024-10-20 23:52:00,054 INFO [train.py:561] (1/4) Epoch 711, batch 10, global_batch_idx: 11370, batch size: 111, loss[dur_loss=0.3026, prior_loss=0.9989, diff_loss=0.3359, tot_loss=1.637, over 111.00 samples.], tot_loss[dur_loss=0.2896, prior_loss=0.9969, diff_loss=0.4306, tot_loss=1.717, over 1656.00 samples.],
2024-10-20 23:52:07,194 INFO [train.py:682] (1/4) Start epoch 712
2024-10-20 23:52:21,020 INFO [train.py:561] (1/4) Epoch 712, batch 4, global_batch_idx: 11380, batch size: 189, loss[dur_loss=0.2921, prior_loss=0.9981, diff_loss=0.4083, tot_loss=1.699, over 189.00 samples.], tot_loss[dur_loss=0.2843, prior_loss=0.9964, diff_loss=0.4789, tot_loss=1.76, over 937.00 samples.],
2024-10-20 23:52:36,014 INFO [train.py:561] (1/4) Epoch 712, batch 14, global_batch_idx: 11390, batch size: 142, loss[dur_loss=0.2905, prior_loss=0.9969, diff_loss=0.3581, tot_loss=1.645, over 142.00 samples.], tot_loss[dur_loss=0.2901, prior_loss=0.9969, diff_loss=0.4146, tot_loss=1.702, over 2210.00 samples.],
2024-10-20 23:52:37,441 INFO [train.py:682] (1/4) Start epoch 713
2024-10-20 23:52:57,716 INFO [train.py:561] (1/4) Epoch 713, batch 8, global_batch_idx: 11400, batch size: 170, loss[dur_loss=0.2925, prior_loss=0.9974, diff_loss=0.3789, tot_loss=1.669, over 170.00 samples.], tot_loss[dur_loss=0.2866, prior_loss=0.9965, diff_loss=0.4453, tot_loss=1.728, over 1432.00 samples.],
2024-10-20 23:53:07,846 INFO [train.py:682] (1/4) Start epoch 714
2024-10-20 23:53:19,165 INFO [train.py:561] (1/4) Epoch 714, batch 2, global_batch_idx: 11410, batch size: 203, loss[dur_loss=0.2927, prior_loss=0.9967, diff_loss=0.4056, tot_loss=1.695, over 203.00 samples.], tot_loss[dur_loss=0.2938, prior_loss=0.997, diff_loss=0.3756, tot_loss=1.666, over 442.00 samples.],
2024-10-20 23:53:33,332 INFO [train.py:561] (1/4) Epoch 714, batch 12, global_batch_idx: 11420, batch size: 152, loss[dur_loss=0.2881, prior_loss=0.9963, diff_loss=0.3457, tot_loss=1.63, over 152.00 samples.], tot_loss[dur_loss=0.2884, prior_loss=0.9964, diff_loss=0.4246, tot_loss=1.709, over 1966.00 samples.],
2024-10-20 23:53:37,737 INFO [train.py:682] (1/4) Start epoch 715
2024-10-20 23:53:54,901 INFO [train.py:561] (1/4) Epoch 715, batch 6, global_batch_idx: 11430, batch size: 106, loss[dur_loss=0.2896, prior_loss=0.9968, diff_loss=0.3601, tot_loss=1.647, over 106.00 samples.], tot_loss[dur_loss=0.2849, prior_loss=0.9961, diff_loss=0.455, tot_loss=1.736, over 1142.00 samples.],
2024-10-20 23:54:07,948 INFO [train.py:682] (1/4) Start epoch 716
2024-10-20 23:54:16,790 INFO [train.py:561] (1/4) Epoch 716, batch 0, global_batch_idx: 11440, batch size: 108, loss[dur_loss=0.2992, prior_loss=0.9974, diff_loss=0.3698, tot_loss=1.666, over 108.00 samples.], tot_loss[dur_loss=0.2992, prior_loss=0.9974, diff_loss=0.3698, tot_loss=1.666, over 108.00 samples.],
2024-10-20 23:54:30,900 INFO [train.py:561] (1/4) Epoch 716, batch 10, global_batch_idx: 11450, batch size: 111, loss[dur_loss=0.2999, prior_loss=0.9981, diff_loss=0.3795, tot_loss=1.677, over 111.00 samples.], tot_loss[dur_loss=0.2867, prior_loss=0.9962, diff_loss=0.4317, tot_loss=1.715, over 1656.00 samples.],
2024-10-20 23:54:37,976 INFO [train.py:682] (1/4) Start epoch 717
2024-10-20 23:54:52,336 INFO [train.py:561] (1/4) Epoch 717, batch 4, global_batch_idx: 11460, batch size: 189, loss[dur_loss=0.2904, prior_loss=0.9969, diff_loss=0.3903, tot_loss=1.678, over 189.00 samples.], tot_loss[dur_loss=0.2843, prior_loss=0.9957, diff_loss=0.4793, tot_loss=1.759, over 937.00 samples.],
2024-10-20 23:55:07,262 INFO [train.py:561] (1/4) Epoch 717, batch 14, global_batch_idx: 11470, batch size: 142, loss[dur_loss=0.2928, prior_loss=0.9967, diff_loss=0.3525, tot_loss=1.642, over 142.00 samples.], tot_loss[dur_loss=0.2884, prior_loss=0.9963, diff_loss=0.4105, tot_loss=1.695, over 2210.00 samples.],
2024-10-20 23:55:08,710 INFO [train.py:682] (1/4) Start epoch 718
2024-10-20 23:55:28,731 INFO [train.py:561] (1/4) Epoch 718, batch 8, global_batch_idx: 11480, batch size: 170, loss[dur_loss=0.2885, prior_loss=0.9968, diff_loss=0.3584, tot_loss=1.644, over 170.00 samples.], tot_loss[dur_loss=0.286, prior_loss=0.9962, diff_loss=0.4385, tot_loss=1.721, over 1432.00 samples.],
2024-10-20 23:55:38,830 INFO [train.py:682] (1/4) Start epoch 719
2024-10-20 23:55:50,245 INFO [train.py:561] (1/4) Epoch 719, batch 2, global_batch_idx: 11490, batch size: 203, loss[dur_loss=0.2839, prior_loss=0.9965, diff_loss=0.376, tot_loss=1.656, over 203.00 samples.], tot_loss[dur_loss=0.2882, prior_loss=0.9968, diff_loss=0.3725, tot_loss=1.657, over 442.00 samples.],
2024-10-20 23:56:04,265 INFO [train.py:561] (1/4) Epoch 719, batch 12, global_batch_idx: 11500, batch size: 152, loss[dur_loss=0.2896, prior_loss=0.9963, diff_loss=0.3644, tot_loss=1.65, over 152.00 samples.], tot_loss[dur_loss=0.2865, prior_loss=0.9962, diff_loss=0.4176, tot_loss=1.7, over 1966.00 samples.],
2024-10-20 23:56:08,698 INFO [train.py:682] (1/4) Start epoch 720
2024-10-20 23:56:25,721 INFO [train.py:561] (1/4) Epoch 720, batch 6, global_batch_idx: 11510, batch size: 106, loss[dur_loss=0.2912, prior_loss=0.9969, diff_loss=0.3018, tot_loss=1.59, over 106.00 samples.], tot_loss[dur_loss=0.2839, prior_loss=0.996, diff_loss=0.4511, tot_loss=1.731, over 1142.00 samples.],
2024-10-20 23:56:38,848 INFO [train.py:682] (1/4) Start epoch 721
2024-10-20 23:56:47,599 INFO [train.py:561] (1/4) Epoch 721, batch 0, global_batch_idx: 11520, batch size: 108, loss[dur_loss=0.2998, prior_loss=0.9973, diff_loss=0.3616, tot_loss=1.659, over 108.00 samples.], tot_loss[dur_loss=0.2998, prior_loss=0.9973, diff_loss=0.3616, tot_loss=1.659, over 108.00 samples.],
2024-10-20 23:57:01,834 INFO [train.py:561] (1/4) Epoch 721, batch 10, global_batch_idx: 11530, batch size: 111, loss[dur_loss=0.3006, prior_loss=0.9988, diff_loss=0.3635, tot_loss=1.663, over 111.00 samples.], tot_loss[dur_loss=0.2913, prior_loss=0.9969, diff_loss=0.4198, tot_loss=1.708, over 1656.00 samples.],
2024-10-20 23:57:08,980 INFO [train.py:682] (1/4) Start epoch 722
2024-10-20 23:57:23,216 INFO [train.py:561] (1/4) Epoch 722, batch 4, global_batch_idx: 11540, batch size: 189, loss[dur_loss=0.2943, prior_loss=0.9978, diff_loss=0.3841, tot_loss=1.676, over 189.00 samples.], tot_loss[dur_loss=0.2861, prior_loss=0.9962, diff_loss=0.4739, tot_loss=1.756, over 937.00 samples.],
2024-10-20 23:57:38,084 INFO [train.py:561] (1/4) Epoch 722, batch 14, global_batch_idx: 11550, batch size: 142, loss[dur_loss=0.2926, prior_loss=0.9968, diff_loss=0.3671, tot_loss=1.657, over 142.00 samples.], tot_loss[dur_loss=0.2915, prior_loss=0.9969, diff_loss=0.4126, tot_loss=1.701, over 2210.00 samples.],
2024-10-20 23:57:39,528 INFO [train.py:682] (1/4) Start epoch 723
2024-10-20 23:57:59,557 INFO [train.py:561] (1/4) Epoch 723, batch 8, global_batch_idx: 11560, batch size: 170, loss[dur_loss=0.301, prior_loss=0.9993, diff_loss=0.4068, tot_loss=1.707, over 170.00 samples.], tot_loss[dur_loss=0.2893, prior_loss=0.9967, diff_loss=0.4367, tot_loss=1.723, over 1432.00 samples.],
2024-10-20 23:58:09,700 INFO [train.py:682] (1/4) Start epoch 724
2024-10-20 23:58:21,356 INFO [train.py:561] (1/4) Epoch 724, batch 2, global_batch_idx: 11570, batch size: 203, loss[dur_loss=0.2914, prior_loss=0.9972, diff_loss=0.4367, tot_loss=1.725, over 203.00 samples.], tot_loss[dur_loss=0.2924, prior_loss=0.9974, diff_loss=0.401, tot_loss=1.691, over 442.00 samples.],
2024-10-20 23:58:35,589 INFO [train.py:561] (1/4) Epoch 724, batch 12, global_batch_idx: 11580, batch size: 152, loss[dur_loss=0.2911, prior_loss=0.9967, diff_loss=0.3602, tot_loss=1.648, over 152.00 samples.], tot_loss[dur_loss=0.2894, prior_loss=0.997, diff_loss=0.4205, tot_loss=1.707, over 1966.00 samples.],
2024-10-20 23:58:40,073 INFO [train.py:682] (1/4) Start epoch 725
2024-10-20 23:58:57,610 INFO [train.py:561] (1/4) Epoch 725, batch 6, global_batch_idx: 11590, batch size: 106, loss[dur_loss=0.2917, prior_loss=0.997, diff_loss=0.4055, tot_loss=1.694, over 106.00 samples.], tot_loss[dur_loss=0.2857, prior_loss=0.9962, diff_loss=0.4576, tot_loss=1.74, over 1142.00 samples.],
2024-10-20 23:59:10,703 INFO [train.py:682] (1/4) Start epoch 726
2024-10-20 23:59:19,570 INFO [train.py:561] (1/4) Epoch 726, batch 0, global_batch_idx: 11600, batch size: 108, loss[dur_loss=0.3012, prior_loss=0.998, diff_loss=0.3426, tot_loss=1.642, over 108.00 samples.], tot_loss[dur_loss=0.3012, prior_loss=0.998, diff_loss=0.3426, tot_loss=1.642, over 108.00 samples.],
2024-10-20 23:59:33,759 INFO [train.py:561] (1/4) Epoch 726, batch 10, global_batch_idx: 11610, batch size: 111, loss[dur_loss=0.3005, prior_loss=0.9984, diff_loss=0.3726, tot_loss=1.672, over 111.00 samples.], tot_loss[dur_loss=0.2875, prior_loss=0.9965, diff_loss=0.4337, tot_loss=1.718, over 1656.00 samples.],
2024-10-20 23:59:40,796 INFO [train.py:682] (1/4) Start epoch 727
2024-10-20 23:59:54,803 INFO [train.py:561] (1/4) Epoch 727, batch 4, global_batch_idx: 11620, batch size: 189, loss[dur_loss=0.2906, prior_loss=0.997, diff_loss=0.3971, tot_loss=1.685, over 189.00 samples.], tot_loss[dur_loss=0.286, prior_loss=0.9958, diff_loss=0.474, tot_loss=1.756, over 937.00 samples.],
2024-10-21 00:00:09,590 INFO [train.py:561] (1/4) Epoch 727, batch 14, global_batch_idx: 11630, batch size: 142, loss[dur_loss=0.2899, prior_loss=0.9965, diff_loss=0.374, tot_loss=1.66, over 142.00 samples.], tot_loss[dur_loss=0.2898, prior_loss=0.9964, diff_loss=0.4076, tot_loss=1.694, over 2210.00 samples.],
2024-10-21 00:00:11,022 INFO [train.py:682] (1/4) Start epoch 728
2024-10-21 00:00:31,124 INFO [train.py:561] (1/4) Epoch 728, batch 8, global_batch_idx: 11640, batch size: 170, loss[dur_loss=0.2883, prior_loss=0.9969, diff_loss=0.3596, tot_loss=1.645, over 170.00 samples.], tot_loss[dur_loss=0.2862, prior_loss=0.9961, diff_loss=0.4404, tot_loss=1.723, over 1432.00 samples.],
2024-10-21 00:00:41,238 INFO [train.py:682] (1/4) Start epoch 729
2024-10-21 00:00:52,717 INFO [train.py:561] (1/4) Epoch 729, batch 2, global_batch_idx: 11650, batch size: 203, loss[dur_loss=0.287, prior_loss=0.9961, diff_loss=0.3834, tot_loss=1.667, over 203.00 samples.], tot_loss[dur_loss=0.29, prior_loss=0.9964, diff_loss=0.3799, tot_loss=1.666, over 442.00 samples.],
2024-10-21 00:01:06,973 INFO [train.py:561] (1/4) Epoch 729, batch 12, global_batch_idx: 11660, batch size: 152, loss[dur_loss=0.2916, prior_loss=0.9961, diff_loss=0.3836, tot_loss=1.671, over 152.00 samples.], tot_loss[dur_loss=0.2877, prior_loss=0.9961, diff_loss=0.4208, tot_loss=1.705, over 1966.00 samples.],
2024-10-21 00:01:11,433 INFO [train.py:682] (1/4) Start epoch 730
2024-10-21 00:01:28,691 INFO [train.py:561] (1/4) Epoch 730, batch 6, global_batch_idx: 11670, batch size: 106, loss[dur_loss=0.2952, prior_loss=0.9966, diff_loss=0.3662, tot_loss=1.658, over 106.00 samples.], tot_loss[dur_loss=0.2828, prior_loss=0.9957, diff_loss=0.455, tot_loss=1.733, over 1142.00 samples.],
2024-10-21 00:01:41,726 INFO [train.py:682] (1/4) Start epoch 731
2024-10-21 00:01:50,559 INFO [train.py:561] (1/4) Epoch 731, batch 0, global_batch_idx: 11680, batch size: 108, loss[dur_loss=0.2964, prior_loss=0.9968, diff_loss=0.3169, tot_loss=1.61, over 108.00 samples.], tot_loss[dur_loss=0.2964, prior_loss=0.9968, diff_loss=0.3169, tot_loss=1.61, over 108.00 samples.],
2024-10-21 00:02:04,765 INFO [train.py:561] (1/4) Epoch 731, batch 10, global_batch_idx: 11690, batch size: 111, loss[dur_loss=0.2951, prior_loss=0.9978, diff_loss=0.3797, tot_loss=1.673, over 111.00 samples.], tot_loss[dur_loss=0.2858, prior_loss=0.996, diff_loss=0.4272, tot_loss=1.709, over 1656.00 samples.],
2024-10-21 00:02:11,797 INFO [train.py:682] (1/4) Start epoch 732
2024-10-21 00:02:25,515 INFO [train.py:561] (1/4) Epoch 732, batch 4, global_batch_idx: 11700, batch size: 189, loss[dur_loss=0.2892, prior_loss=0.9971, diff_loss=0.3797, tot_loss=1.666, over 189.00 samples.], tot_loss[dur_loss=0.2809, prior_loss=0.9956, diff_loss=0.49, tot_loss=1.767, over 937.00 samples.],
2024-10-21 00:02:40,262 INFO [train.py:561] (1/4) Epoch 732, batch 14, global_batch_idx: 11710, batch size: 142, loss[dur_loss=0.2923, prior_loss=0.9968, diff_loss=0.3612, tot_loss=1.65, over 142.00 samples.], tot_loss[dur_loss=0.2869, prior_loss=0.9962, diff_loss=0.4181, tot_loss=1.701, over 2210.00 samples.],
2024-10-21 00:02:41,687 INFO [train.py:682] (1/4) Start epoch 733
2024-10-21 00:03:02,149 INFO [train.py:561] (1/4) Epoch 733, batch 8, global_batch_idx: 11720, batch size: 170, loss[dur_loss=0.2884, prior_loss=0.9965, diff_loss=0.366, tot_loss=1.651, over 170.00 samples.], tot_loss[dur_loss=0.2855, prior_loss=0.996, diff_loss=0.4366, tot_loss=1.718, over 1432.00 samples.],
2024-10-21 00:03:12,330 INFO [train.py:682] (1/4) Start epoch 734
2024-10-21 00:03:24,579 INFO [train.py:561] (1/4) Epoch 734, batch 2, global_batch_idx: 11730, batch size: 203, loss[dur_loss=0.2903, prior_loss=0.9967, diff_loss=0.375, tot_loss=1.662, over 203.00 samples.], tot_loss[dur_loss=0.291, prior_loss=0.9969, diff_loss=0.3794, tot_loss=1.667, over 442.00 samples.],
2024-10-21 00:03:39,145 INFO [train.py:561] (1/4) Epoch 734, batch 12, global_batch_idx: 11740, batch size: 152, loss[dur_loss=0.2928, prior_loss=0.9964, diff_loss=0.3585, tot_loss=1.648, over 152.00 samples.], tot_loss[dur_loss=0.2861, prior_loss=0.9962, diff_loss=0.4097, tot_loss=1.692, over 1966.00 samples.],
2024-10-21 00:03:43,649 INFO [train.py:682] (1/4) Start epoch 735
2024-10-21 00:04:01,026 INFO [train.py:561] (1/4) Epoch 735, batch 6, global_batch_idx: 11750, batch size: 106, loss[dur_loss=0.2892, prior_loss=0.9962, diff_loss=0.348, tot_loss=1.633, over 106.00 samples.], tot_loss[dur_loss=0.2833, prior_loss=0.9956, diff_loss=0.4556, tot_loss=1.734, over 1142.00 samples.],
2024-10-21 00:04:14,329 INFO [train.py:682] (1/4) Start epoch 736
2024-10-21 00:04:23,560 INFO [train.py:561] (1/4) Epoch 736, batch 0, global_batch_idx: 11760, batch size: 108, loss[dur_loss=0.2973, prior_loss=0.9973, diff_loss=0.3489, tot_loss=1.644, over 108.00 samples.], tot_loss[dur_loss=0.2973, prior_loss=0.9973, diff_loss=0.3489, tot_loss=1.644, over 108.00 samples.],
2024-10-21 00:04:37,983 INFO [train.py:561] (1/4) Epoch 736, batch 10, global_batch_idx: 11770, batch size: 111, loss[dur_loss=0.2974, prior_loss=0.9976, diff_loss=0.3658, tot_loss=1.661, over 111.00 samples.], tot_loss[dur_loss=0.2876, prior_loss=0.9961, diff_loss=0.4294, tot_loss=1.713, over 1656.00 samples.],
2024-10-21 00:04:45,197 INFO [train.py:682] (1/4) Start epoch 737
2024-10-21 00:04:58,998 INFO [train.py:561] (1/4) Epoch 737, batch 4, global_batch_idx: 11780, batch size: 189, loss[dur_loss=0.2888, prior_loss=0.9968, diff_loss=0.394, tot_loss=1.68, over 189.00 samples.], tot_loss[dur_loss=0.2817, prior_loss=0.9953, diff_loss=0.4792, tot_loss=1.756, over 937.00 samples.],
2024-10-21 00:05:13,973 INFO [train.py:561] (1/4) Epoch 737, batch 14, global_batch_idx: 11790, batch size: 142, loss[dur_loss=0.2951, prior_loss=0.9965, diff_loss=0.3369, tot_loss=1.628, over 142.00 samples.], tot_loss[dur_loss=0.2876, prior_loss=0.9961, diff_loss=0.4106, tot_loss=1.694, over 2210.00 samples.],
2024-10-21 00:05:15,417 INFO [train.py:682] (1/4) Start epoch 738
2024-10-21 00:05:35,623 INFO [train.py:561] (1/4) Epoch 738, batch 8, global_batch_idx: 11800, batch size: 170, loss[dur_loss=0.2885, prior_loss=0.9968, diff_loss=0.3755, tot_loss=1.661, over 170.00 samples.], tot_loss[dur_loss=0.2869, prior_loss=0.9961, diff_loss=0.4333, tot_loss=1.716, over 1432.00 samples.],
2024-10-21 00:05:45,798 INFO [train.py:682] (1/4) Start epoch 739
2024-10-21 00:05:57,262 INFO [train.py:561] (1/4) Epoch 739, batch 2, global_batch_idx: 11810, batch size: 203, loss[dur_loss=0.2893, prior_loss=0.9968, diff_loss=0.4088, tot_loss=1.695, over 203.00 samples.], tot_loss[dur_loss=0.2895, prior_loss=0.9969, diff_loss=0.3789, tot_loss=1.665, over 442.00 samples.],
2024-10-21 00:06:11,660 INFO [train.py:561] (1/4) Epoch 739, batch 12, global_batch_idx: 11820, batch size: 152, loss[dur_loss=0.2885, prior_loss=0.996, diff_loss=0.3719, tot_loss=1.656, over 152.00 samples.], tot_loss[dur_loss=0.2866, prior_loss=0.9965, diff_loss=0.4211, tot_loss=1.704, over 1966.00 samples.],
2024-10-21 00:06:16,136 INFO [train.py:682] (1/4) Start epoch 740
2024-10-21 00:06:33,071 INFO [train.py:561] (1/4) Epoch 740, batch 6, global_batch_idx: 11830, batch size: 106, loss[dur_loss=0.2969, prior_loss=0.997, diff_loss=0.326, tot_loss=1.62, over 106.00 samples.], tot_loss[dur_loss=0.2844, prior_loss=0.9958, diff_loss=0.453, tot_loss=1.733, over 1142.00 samples.],
2024-10-21 00:06:46,198 INFO [train.py:682] (1/4) Start epoch 741
2024-10-21 00:06:55,222 INFO [train.py:561] (1/4) Epoch 741, batch 0, global_batch_idx: 11840, batch size: 108, loss[dur_loss=0.2988, prior_loss=0.9971, diff_loss=0.357, tot_loss=1.653, over 108.00 samples.], tot_loss[dur_loss=0.2988, prior_loss=0.9971, diff_loss=0.357, tot_loss=1.653, over 108.00 samples.],
2024-10-21 00:07:09,396 INFO [train.py:561] (1/4) Epoch 741, batch 10, global_batch_idx: 11850, batch size: 111, loss[dur_loss=0.2957, prior_loss=0.9978, diff_loss=0.3002, tot_loss=1.594, over 111.00 samples.], tot_loss[dur_loss=0.2857, prior_loss=0.996, diff_loss=0.4224, tot_loss=1.704, over 1656.00 samples.],
2024-10-21 00:07:16,436 INFO [train.py:682] (1/4) Start epoch 742
2024-10-21 00:07:30,296 INFO [train.py:561] (1/4) Epoch 742, batch 4, global_batch_idx: 11860, batch size: 189, loss[dur_loss=0.2857, prior_loss=0.9963, diff_loss=0.3911, tot_loss=1.673, over 189.00 samples.], tot_loss[dur_loss=0.2813, prior_loss=0.9953, diff_loss=0.4828, tot_loss=1.759, over 937.00 samples.],
2024-10-21 00:07:45,129 INFO [train.py:561] (1/4) Epoch 742, batch 14, global_batch_idx: 11870, batch size: 142, loss[dur_loss=0.2886, prior_loss=0.996, diff_loss=0.3339, tot_loss=1.619, over 142.00 samples.], tot_loss[dur_loss=0.2862, prior_loss=0.9959, diff_loss=0.4111, tot_loss=1.693, over 2210.00 samples.],
2024-10-21 00:07:46,561 INFO [train.py:682] (1/4) Start epoch 743
2024-10-21 00:08:06,516 INFO [train.py:561] (1/4) Epoch 743, batch 8, global_batch_idx: 11880, batch size: 170, loss[dur_loss=0.2898, prior_loss=0.997, diff_loss=0.3693, tot_loss=1.656, over 170.00 samples.], tot_loss[dur_loss=0.2835, prior_loss=0.9956, diff_loss=0.434, tot_loss=1.713, over 1432.00 samples.],
2024-10-21 00:08:16,711 INFO [train.py:682] (1/4) Start epoch 744
2024-10-21 00:08:28,263 INFO [train.py:561] (1/4) Epoch 744, batch 2, global_batch_idx: 11890, batch size: 203, loss[dur_loss=0.2876, prior_loss=0.9958, diff_loss=0.3925, tot_loss=1.676, over 203.00 samples.], tot_loss[dur_loss=0.2886, prior_loss=0.9962, diff_loss=0.36, tot_loss=1.645, over 442.00 samples.],
2024-10-21 00:08:42,489 INFO [train.py:561] (1/4) Epoch 744, batch 12, global_batch_idx: 11900, batch size: 152, loss[dur_loss=0.2913, prior_loss=0.9963, diff_loss=0.3822, tot_loss=1.67, over 152.00 samples.], tot_loss[dur_loss=0.2863, prior_loss=0.996, diff_loss=0.4136, tot_loss=1.696, over 1966.00 samples.],
2024-10-21 00:08:46,945 INFO [train.py:682] (1/4) Start epoch 745
2024-10-21 00:09:03,910 INFO [train.py:561] (1/4) Epoch 745, batch 6, global_batch_idx: 11910, batch size: 106, loss[dur_loss=0.292, prior_loss=0.9968, diff_loss=0.3445, tot_loss=1.633, over 106.00 samples.], tot_loss[dur_loss=0.2829, prior_loss=0.9957, diff_loss=0.4434, tot_loss=1.722, over 1142.00 samples.],
2024-10-21 00:09:16,995 INFO [train.py:682] (1/4) Start epoch 746
2024-10-21 00:09:25,666 INFO [train.py:561] (1/4) Epoch 746, batch 0, global_batch_idx: 11920, batch size: 108, loss[dur_loss=0.2987, prior_loss=0.9971, diff_loss=0.3595, tot_loss=1.655, over 108.00 samples.], tot_loss[dur_loss=0.2987, prior_loss=0.9971, diff_loss=0.3595, tot_loss=1.655, over 108.00 samples.],
2024-10-21 00:09:40,182 INFO [train.py:561] (1/4) Epoch 746, batch 10, global_batch_idx: 11930, batch size: 111, loss[dur_loss=0.2951, prior_loss=0.9979, diff_loss=0.3413, tot_loss=1.634, over 111.00 samples.], tot_loss[dur_loss=0.2865, prior_loss=0.9961, diff_loss=0.424, tot_loss=1.707, over 1656.00 samples.],
2024-10-21 00:09:47,394 INFO [train.py:682] (1/4) Start epoch 747
2024-10-21 00:10:01,627 INFO [train.py:561] (1/4) Epoch 747, batch 4, global_batch_idx: 11940, batch size: 189, loss[dur_loss=0.2878, prior_loss=0.9967, diff_loss=0.4035, tot_loss=1.688, over 189.00 samples.], tot_loss[dur_loss=0.2813, prior_loss=0.9952, diff_loss=0.4771, tot_loss=1.754, over 937.00 samples.],
2024-10-21 00:10:16,677 INFO [train.py:561] (1/4) Epoch 747, batch 14, global_batch_idx: 11950, batch size: 142, loss[dur_loss=0.2909, prior_loss=0.9957, diff_loss=0.3436, tot_loss=1.63, over 142.00 samples.], tot_loss[dur_loss=0.2866, prior_loss=0.9959, diff_loss=0.4092, tot_loss=1.692, over 2210.00 samples.],
2024-10-21 00:10:18,110 INFO [train.py:682] (1/4) Start epoch 748
2024-10-21 00:10:38,307 INFO [train.py:561] (1/4) Epoch 748, batch 8, global_batch_idx: 11960, batch size: 170, loss[dur_loss=0.2902, prior_loss=0.9968, diff_loss=0.3616, tot_loss=1.649, over 170.00 samples.], tot_loss[dur_loss=0.2841, prior_loss=0.9956, diff_loss=0.4358, tot_loss=1.716, over 1432.00 samples.],
2024-10-21 00:10:48,511 INFO [train.py:682] (1/4) Start epoch 749
2024-10-21 00:11:00,165 INFO [train.py:561] (1/4) Epoch 749, batch 2, global_batch_idx: 11970, batch size: 203, loss[dur_loss=0.2856, prior_loss=0.996, diff_loss=0.3877, tot_loss=1.669, over 203.00 samples.], tot_loss[dur_loss=0.2864, prior_loss=0.9964, diff_loss=0.3816, tot_loss=1.664, over 442.00 samples.],
2024-10-21 00:11:14,297 INFO [train.py:561] (1/4) Epoch 749, batch 12, global_batch_idx: 11980, batch size: 152, loss[dur_loss=0.2855, prior_loss=0.9954, diff_loss=0.3525, tot_loss=1.633, over 152.00 samples.], tot_loss[dur_loss=0.2858, prior_loss=0.996, diff_loss=0.4288, tot_loss=1.711, over 1966.00 samples.],
2024-10-21 00:11:18,767 INFO [train.py:682] (1/4) Start epoch 750
2024-10-21 00:11:36,047 INFO [train.py:561] (1/4) Epoch 750, batch 6, global_batch_idx: 11990, batch size: 106, loss[dur_loss=0.2949, prior_loss=0.9969, diff_loss=0.3851, tot_loss=1.677, over 106.00 samples.], tot_loss[dur_loss=0.2815, prior_loss=0.9953, diff_loss=0.4628, tot_loss=1.74, over 1142.00 samples.],
2024-10-21 00:11:49,100 INFO [train.py:682] (1/4) Start epoch 751
2024-10-21 00:11:57,915 INFO [train.py:561] (1/4) Epoch 751, batch 0, global_batch_idx: 12000, batch size: 108, loss[dur_loss=0.2951, prior_loss=0.9967, diff_loss=0.3591, tot_loss=1.651, over 108.00 samples.], tot_loss[dur_loss=0.2951, prior_loss=0.9967, diff_loss=0.3591, tot_loss=1.651, over 108.00 samples.],
2024-10-21 00:11:59,356 INFO [train.py:579] (1/4) Computing validation loss
2024-10-21 00:12:27,400 INFO [train.py:589] (1/4) Epoch 751,
validation: dur_loss=0.432, prior_loss=1.03, diff_loss=0.4049, tot_loss=1.867, over 100.00 samples. 2024-10-21 00:12:27,402 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 00:12:40,410 INFO [train.py:561] (1/4) Epoch 751, batch 10, global_batch_idx: 12010, batch size: 111, loss[dur_loss=0.2933, prior_loss=0.9972, diff_loss=0.3597, tot_loss=1.65, over 111.00 samples.], tot_loss[dur_loss=0.2836, prior_loss=0.9955, diff_loss=0.4241, tot_loss=1.703, over 1656.00 samples.], 2024-10-21 00:12:47,585 INFO [train.py:682] (1/4) Start epoch 752 2024-10-21 00:13:01,934 INFO [train.py:561] (1/4) Epoch 752, batch 4, global_batch_idx: 12020, batch size: 189, loss[dur_loss=0.2877, prior_loss=0.9965, diff_loss=0.3855, tot_loss=1.67, over 189.00 samples.], tot_loss[dur_loss=0.2797, prior_loss=0.995, diff_loss=0.4707, tot_loss=1.745, over 937.00 samples.], 2024-10-21 00:13:16,894 INFO [train.py:561] (1/4) Epoch 752, batch 14, global_batch_idx: 12030, batch size: 142, loss[dur_loss=0.2891, prior_loss=0.9957, diff_loss=0.3672, tot_loss=1.652, over 142.00 samples.], tot_loss[dur_loss=0.2854, prior_loss=0.9956, diff_loss=0.4048, tot_loss=1.686, over 2210.00 samples.], 2024-10-21 00:13:18,328 INFO [train.py:682] (1/4) Start epoch 753 2024-10-21 00:13:38,589 INFO [train.py:561] (1/4) Epoch 753, batch 8, global_batch_idx: 12040, batch size: 170, loss[dur_loss=0.2894, prior_loss=0.9961, diff_loss=0.3711, tot_loss=1.657, over 170.00 samples.], tot_loss[dur_loss=0.2839, prior_loss=0.9956, diff_loss=0.4429, tot_loss=1.722, over 1432.00 samples.], 2024-10-21 00:13:48,779 INFO [train.py:682] (1/4) Start epoch 754 2024-10-21 00:14:00,497 INFO [train.py:561] (1/4) Epoch 754, batch 2, global_batch_idx: 12050, batch size: 203, loss[dur_loss=0.2837, prior_loss=0.9964, diff_loss=0.3922, tot_loss=1.672, over 203.00 samples.], tot_loss[dur_loss=0.2871, prior_loss=0.9964, diff_loss=0.3764, tot_loss=1.66, over 442.00 samples.], 2024-10-21 00:14:14,784 INFO [train.py:561] (1/4) Epoch 754, batch 12, global_batch_idx: 12060, batch size: 152, loss[dur_loss=0.2885, prior_loss=0.9963, diff_loss=0.3683, tot_loss=1.653, over 152.00 samples.], tot_loss[dur_loss=0.2867, prior_loss=0.9961, diff_loss=0.4143, tot_loss=1.697, over 1966.00 samples.], 2024-10-21 00:14:19,235 INFO [train.py:682] (1/4) Start epoch 755 2024-10-21 00:14:36,475 INFO [train.py:561] (1/4) Epoch 755, batch 6, global_batch_idx: 12070, batch size: 106, loss[dur_loss=0.294, prior_loss=0.9965, diff_loss=0.3712, tot_loss=1.662, over 106.00 samples.], tot_loss[dur_loss=0.2836, prior_loss=0.9955, diff_loss=0.4583, tot_loss=1.737, over 1142.00 samples.], 2024-10-21 00:14:49,530 INFO [train.py:682] (1/4) Start epoch 756 2024-10-21 00:14:58,495 INFO [train.py:561] (1/4) Epoch 756, batch 0, global_batch_idx: 12080, batch size: 108, loss[dur_loss=0.2926, prior_loss=0.9967, diff_loss=0.3668, tot_loss=1.656, over 108.00 samples.], tot_loss[dur_loss=0.2926, prior_loss=0.9967, diff_loss=0.3668, tot_loss=1.656, over 108.00 samples.], 2024-10-21 00:15:12,901 INFO [train.py:561] (1/4) Epoch 756, batch 10, global_batch_idx: 12090, batch size: 111, loss[dur_loss=0.2967, prior_loss=0.9971, diff_loss=0.3078, tot_loss=1.602, over 111.00 samples.], tot_loss[dur_loss=0.2846, prior_loss=0.9954, diff_loss=0.4241, tot_loss=1.704, over 1656.00 samples.], 2024-10-21 00:15:20,052 INFO [train.py:682] (1/4) Start epoch 757 2024-10-21 00:15:33,981 INFO [train.py:561] (1/4) Epoch 757, batch 4, global_batch_idx: 12100, batch size: 189, loss[dur_loss=0.287, prior_loss=0.9964, 
diff_loss=0.3722, tot_loss=1.656, over 189.00 samples.], tot_loss[dur_loss=0.2787, prior_loss=0.9947, diff_loss=0.4686, tot_loss=1.742, over 937.00 samples.], 2024-10-21 00:15:48,921 INFO [train.py:561] (1/4) Epoch 757, batch 14, global_batch_idx: 12110, batch size: 142, loss[dur_loss=0.2893, prior_loss=0.9954, diff_loss=0.365, tot_loss=1.65, over 142.00 samples.], tot_loss[dur_loss=0.2841, prior_loss=0.9953, diff_loss=0.4069, tot_loss=1.686, over 2210.00 samples.], 2024-10-21 00:15:50,354 INFO [train.py:682] (1/4) Start epoch 758 2024-10-21 00:16:11,058 INFO [train.py:561] (1/4) Epoch 758, batch 8, global_batch_idx: 12120, batch size: 170, loss[dur_loss=0.286, prior_loss=0.9958, diff_loss=0.3694, tot_loss=1.651, over 170.00 samples.], tot_loss[dur_loss=0.2819, prior_loss=0.9951, diff_loss=0.4288, tot_loss=1.706, over 1432.00 samples.], 2024-10-21 00:16:21,043 INFO [train.py:682] (1/4) Start epoch 759 2024-10-21 00:16:32,541 INFO [train.py:561] (1/4) Epoch 759, batch 2, global_batch_idx: 12130, batch size: 203, loss[dur_loss=0.2837, prior_loss=0.9953, diff_loss=0.3891, tot_loss=1.668, over 203.00 samples.], tot_loss[dur_loss=0.2859, prior_loss=0.9957, diff_loss=0.373, tot_loss=1.655, over 442.00 samples.], 2024-10-21 00:16:46,688 INFO [train.py:561] (1/4) Epoch 759, batch 12, global_batch_idx: 12140, batch size: 152, loss[dur_loss=0.2866, prior_loss=0.9954, diff_loss=0.3545, tot_loss=1.637, over 152.00 samples.], tot_loss[dur_loss=0.2832, prior_loss=0.9953, diff_loss=0.416, tot_loss=1.694, over 1966.00 samples.], 2024-10-21 00:16:51,100 INFO [train.py:682] (1/4) Start epoch 760 2024-10-21 00:17:08,542 INFO [train.py:561] (1/4) Epoch 760, batch 6, global_batch_idx: 12150, batch size: 106, loss[dur_loss=0.2879, prior_loss=0.9957, diff_loss=0.3531, tot_loss=1.637, over 106.00 samples.], tot_loss[dur_loss=0.28, prior_loss=0.9949, diff_loss=0.4597, tot_loss=1.735, over 1142.00 samples.], 2024-10-21 00:17:21,694 INFO [train.py:682] (1/4) Start epoch 761 2024-10-21 00:17:30,620 INFO [train.py:561] (1/4) Epoch 761, batch 0, global_batch_idx: 12160, batch size: 108, loss[dur_loss=0.2915, prior_loss=0.9962, diff_loss=0.4211, tot_loss=1.709, over 108.00 samples.], tot_loss[dur_loss=0.2915, prior_loss=0.9962, diff_loss=0.4211, tot_loss=1.709, over 108.00 samples.], 2024-10-21 00:17:44,836 INFO [train.py:561] (1/4) Epoch 761, batch 10, global_batch_idx: 12170, batch size: 111, loss[dur_loss=0.2953, prior_loss=0.9972, diff_loss=0.3406, tot_loss=1.633, over 111.00 samples.], tot_loss[dur_loss=0.2834, prior_loss=0.9953, diff_loss=0.4178, tot_loss=1.696, over 1656.00 samples.], 2024-10-21 00:17:51,866 INFO [train.py:682] (1/4) Start epoch 762 2024-10-21 00:18:05,556 INFO [train.py:561] (1/4) Epoch 762, batch 4, global_batch_idx: 12180, batch size: 189, loss[dur_loss=0.2872, prior_loss=0.996, diff_loss=0.3542, tot_loss=1.637, over 189.00 samples.], tot_loss[dur_loss=0.2787, prior_loss=0.9947, diff_loss=0.4745, tot_loss=1.748, over 937.00 samples.], 2024-10-21 00:18:20,320 INFO [train.py:561] (1/4) Epoch 762, batch 14, global_batch_idx: 12190, batch size: 142, loss[dur_loss=0.289, prior_loss=0.9953, diff_loss=0.335, tot_loss=1.619, over 142.00 samples.], tot_loss[dur_loss=0.2843, prior_loss=0.9953, diff_loss=0.4069, tot_loss=1.687, over 2210.00 samples.], 2024-10-21 00:18:21,742 INFO [train.py:682] (1/4) Start epoch 763 2024-10-21 00:18:41,822 INFO [train.py:561] (1/4) Epoch 763, batch 8, global_batch_idx: 12200, batch size: 170, loss[dur_loss=0.282, prior_loss=0.9955, diff_loss=0.3687, tot_loss=1.646, over 
170.00 samples.], tot_loss[dur_loss=0.2799, prior_loss=0.9949, diff_loss=0.4337, tot_loss=1.709, over 1432.00 samples.], 2024-10-21 00:18:51,798 INFO [train.py:682] (1/4) Start epoch 764 2024-10-21 00:19:03,241 INFO [train.py:561] (1/4) Epoch 764, batch 2, global_batch_idx: 12210, batch size: 203, loss[dur_loss=0.2836, prior_loss=0.9952, diff_loss=0.3788, tot_loss=1.658, over 203.00 samples.], tot_loss[dur_loss=0.2857, prior_loss=0.9955, diff_loss=0.358, tot_loss=1.639, over 442.00 samples.], 2024-10-21 00:19:17,365 INFO [train.py:561] (1/4) Epoch 764, batch 12, global_batch_idx: 12220, batch size: 152, loss[dur_loss=0.2855, prior_loss=0.9952, diff_loss=0.3545, tot_loss=1.635, over 152.00 samples.], tot_loss[dur_loss=0.2822, prior_loss=0.995, diff_loss=0.4152, tot_loss=1.692, over 1966.00 samples.], 2024-10-21 00:19:21,784 INFO [train.py:682] (1/4) Start epoch 765 2024-10-21 00:19:38,882 INFO [train.py:561] (1/4) Epoch 765, batch 6, global_batch_idx: 12230, batch size: 106, loss[dur_loss=0.2861, prior_loss=0.9954, diff_loss=0.3578, tot_loss=1.639, over 106.00 samples.], tot_loss[dur_loss=0.28, prior_loss=0.9946, diff_loss=0.4576, tot_loss=1.732, over 1142.00 samples.], 2024-10-21 00:19:51,949 INFO [train.py:682] (1/4) Start epoch 766 2024-10-21 00:20:01,195 INFO [train.py:561] (1/4) Epoch 766, batch 0, global_batch_idx: 12240, batch size: 108, loss[dur_loss=0.2924, prior_loss=0.9958, diff_loss=0.3859, tot_loss=1.674, over 108.00 samples.], tot_loss[dur_loss=0.2924, prior_loss=0.9958, diff_loss=0.3859, tot_loss=1.674, over 108.00 samples.], 2024-10-21 00:20:15,417 INFO [train.py:561] (1/4) Epoch 766, batch 10, global_batch_idx: 12250, batch size: 111, loss[dur_loss=0.2911, prior_loss=0.9965, diff_loss=0.3433, tot_loss=1.631, over 111.00 samples.], tot_loss[dur_loss=0.2808, prior_loss=0.9949, diff_loss=0.4305, tot_loss=1.706, over 1656.00 samples.], 2024-10-21 00:20:22,437 INFO [train.py:682] (1/4) Start epoch 767 2024-10-21 00:20:36,288 INFO [train.py:561] (1/4) Epoch 767, batch 4, global_batch_idx: 12260, batch size: 189, loss[dur_loss=0.2911, prior_loss=0.9958, diff_loss=0.3827, tot_loss=1.67, over 189.00 samples.], tot_loss[dur_loss=0.2785, prior_loss=0.9946, diff_loss=0.4642, tot_loss=1.737, over 937.00 samples.], 2024-10-21 00:20:51,049 INFO [train.py:561] (1/4) Epoch 767, batch 14, global_batch_idx: 12270, batch size: 142, loss[dur_loss=0.2889, prior_loss=0.9952, diff_loss=0.3383, tot_loss=1.622, over 142.00 samples.], tot_loss[dur_loss=0.2833, prior_loss=0.9951, diff_loss=0.4025, tot_loss=1.681, over 2210.00 samples.], 2024-10-21 00:20:52,464 INFO [train.py:682] (1/4) Start epoch 768 2024-10-21 00:21:12,655 INFO [train.py:561] (1/4) Epoch 768, batch 8, global_batch_idx: 12280, batch size: 170, loss[dur_loss=0.2855, prior_loss=0.9956, diff_loss=0.38, tot_loss=1.661, over 170.00 samples.], tot_loss[dur_loss=0.2821, prior_loss=0.9949, diff_loss=0.4312, tot_loss=1.708, over 1432.00 samples.], 2024-10-21 00:21:22,823 INFO [train.py:682] (1/4) Start epoch 769 2024-10-21 00:21:34,564 INFO [train.py:561] (1/4) Epoch 769, batch 2, global_batch_idx: 12290, batch size: 203, loss[dur_loss=0.2817, prior_loss=0.9953, diff_loss=0.3926, tot_loss=1.67, over 203.00 samples.], tot_loss[dur_loss=0.284, prior_loss=0.9956, diff_loss=0.3794, tot_loss=1.659, over 442.00 samples.], 2024-10-21 00:21:48,780 INFO [train.py:561] (1/4) Epoch 769, batch 12, global_batch_idx: 12300, batch size: 152, loss[dur_loss=0.2882, prior_loss=0.9957, diff_loss=0.3792, tot_loss=1.663, over 152.00 samples.], 
tot_loss[dur_loss=0.2834, prior_loss=0.9952, diff_loss=0.4204, tot_loss=1.699, over 1966.00 samples.], 2024-10-21 00:21:53,207 INFO [train.py:682] (1/4) Start epoch 770 2024-10-21 00:22:10,381 INFO [train.py:561] (1/4) Epoch 770, batch 6, global_batch_idx: 12310, batch size: 106, loss[dur_loss=0.2883, prior_loss=0.9954, diff_loss=0.3155, tot_loss=1.599, over 106.00 samples.], tot_loss[dur_loss=0.2798, prior_loss=0.9947, diff_loss=0.445, tot_loss=1.719, over 1142.00 samples.], 2024-10-21 00:22:23,203 INFO [train.py:682] (1/4) Start epoch 771 2024-10-21 00:22:32,106 INFO [train.py:561] (1/4) Epoch 771, batch 0, global_batch_idx: 12320, batch size: 108, loss[dur_loss=0.296, prior_loss=0.9962, diff_loss=0.3464, tot_loss=1.639, over 108.00 samples.], tot_loss[dur_loss=0.296, prior_loss=0.9962, diff_loss=0.3464, tot_loss=1.639, over 108.00 samples.], 2024-10-21 00:22:46,230 INFO [train.py:561] (1/4) Epoch 771, batch 10, global_batch_idx: 12330, batch size: 111, loss[dur_loss=0.295, prior_loss=0.9974, diff_loss=0.3345, tot_loss=1.627, over 111.00 samples.], tot_loss[dur_loss=0.2821, prior_loss=0.995, diff_loss=0.4339, tot_loss=1.711, over 1656.00 samples.], 2024-10-21 00:22:53,208 INFO [train.py:682] (1/4) Start epoch 772 2024-10-21 00:23:07,415 INFO [train.py:561] (1/4) Epoch 772, batch 4, global_batch_idx: 12340, batch size: 189, loss[dur_loss=0.2839, prior_loss=0.9953, diff_loss=0.4098, tot_loss=1.689, over 189.00 samples.], tot_loss[dur_loss=0.2778, prior_loss=0.9943, diff_loss=0.4866, tot_loss=1.759, over 937.00 samples.], 2024-10-21 00:23:22,096 INFO [train.py:561] (1/4) Epoch 772, batch 14, global_batch_idx: 12350, batch size: 142, loss[dur_loss=0.2869, prior_loss=0.9952, diff_loss=0.3465, tot_loss=1.629, over 142.00 samples.], tot_loss[dur_loss=0.2835, prior_loss=0.995, diff_loss=0.4121, tot_loss=1.691, over 2210.00 samples.], 2024-10-21 00:23:23,529 INFO [train.py:682] (1/4) Start epoch 773 2024-10-21 00:23:43,414 INFO [train.py:561] (1/4) Epoch 773, batch 8, global_batch_idx: 12360, batch size: 170, loss[dur_loss=0.2871, prior_loss=0.9952, diff_loss=0.3779, tot_loss=1.66, over 170.00 samples.], tot_loss[dur_loss=0.2806, prior_loss=0.9946, diff_loss=0.4465, tot_loss=1.722, over 1432.00 samples.], 2024-10-21 00:23:53,428 INFO [train.py:682] (1/4) Start epoch 774 2024-10-21 00:24:05,458 INFO [train.py:561] (1/4) Epoch 774, batch 2, global_batch_idx: 12370, batch size: 203, loss[dur_loss=0.2854, prior_loss=0.9951, diff_loss=0.3614, tot_loss=1.642, over 203.00 samples.], tot_loss[dur_loss=0.2852, prior_loss=0.9953, diff_loss=0.3553, tot_loss=1.636, over 442.00 samples.], 2024-10-21 00:24:20,072 INFO [train.py:561] (1/4) Epoch 774, batch 12, global_batch_idx: 12380, batch size: 152, loss[dur_loss=0.284, prior_loss=0.9952, diff_loss=0.3573, tot_loss=1.636, over 152.00 samples.], tot_loss[dur_loss=0.2821, prior_loss=0.9948, diff_loss=0.4189, tot_loss=1.696, over 1966.00 samples.], 2024-10-21 00:24:24,629 INFO [train.py:682] (1/4) Start epoch 775 2024-10-21 00:24:42,372 INFO [train.py:561] (1/4) Epoch 775, batch 6, global_batch_idx: 12390, batch size: 106, loss[dur_loss=0.2919, prior_loss=0.9955, diff_loss=0.3564, tot_loss=1.644, over 106.00 samples.], tot_loss[dur_loss=0.279, prior_loss=0.9945, diff_loss=0.4557, tot_loss=1.729, over 1142.00 samples.], 2024-10-21 00:24:55,568 INFO [train.py:682] (1/4) Start epoch 776 2024-10-21 00:25:04,491 INFO [train.py:561] (1/4) Epoch 776, batch 0, global_batch_idx: 12400, batch size: 108, loss[dur_loss=0.2936, prior_loss=0.9962, diff_loss=0.3801, 
tot_loss=1.67, over 108.00 samples.], tot_loss[dur_loss=0.2936, prior_loss=0.9962, diff_loss=0.3801, tot_loss=1.67, over 108.00 samples.], 2024-10-21 00:25:18,859 INFO [train.py:561] (1/4) Epoch 776, batch 10, global_batch_idx: 12410, batch size: 111, loss[dur_loss=0.2915, prior_loss=0.9967, diff_loss=0.3549, tot_loss=1.643, over 111.00 samples.], tot_loss[dur_loss=0.2822, prior_loss=0.995, diff_loss=0.4352, tot_loss=1.712, over 1656.00 samples.], 2024-10-21 00:25:26,071 INFO [train.py:682] (1/4) Start epoch 777 2024-10-21 00:25:40,206 INFO [train.py:561] (1/4) Epoch 777, batch 4, global_batch_idx: 12420, batch size: 189, loss[dur_loss=0.2823, prior_loss=0.9954, diff_loss=0.3939, tot_loss=1.672, over 189.00 samples.], tot_loss[dur_loss=0.2769, prior_loss=0.9942, diff_loss=0.4736, tot_loss=1.745, over 937.00 samples.], 2024-10-21 00:25:55,219 INFO [train.py:561] (1/4) Epoch 777, batch 14, global_batch_idx: 12430, batch size: 142, loss[dur_loss=0.293, prior_loss=0.9951, diff_loss=0.3903, tot_loss=1.678, over 142.00 samples.], tot_loss[dur_loss=0.2817, prior_loss=0.9948, diff_loss=0.4026, tot_loss=1.679, over 2210.00 samples.], 2024-10-21 00:25:56,658 INFO [train.py:682] (1/4) Start epoch 778 2024-10-21 00:26:17,340 INFO [train.py:561] (1/4) Epoch 778, batch 8, global_batch_idx: 12440, batch size: 170, loss[dur_loss=0.2867, prior_loss=0.996, diff_loss=0.3681, tot_loss=1.651, over 170.00 samples.], tot_loss[dur_loss=0.2813, prior_loss=0.9948, diff_loss=0.4379, tot_loss=1.714, over 1432.00 samples.], 2024-10-21 00:26:27,608 INFO [train.py:682] (1/4) Start epoch 779 2024-10-21 00:26:39,354 INFO [train.py:561] (1/4) Epoch 779, batch 2, global_batch_idx: 12450, batch size: 203, loss[dur_loss=0.2831, prior_loss=0.9958, diff_loss=0.3601, tot_loss=1.639, over 203.00 samples.], tot_loss[dur_loss=0.2854, prior_loss=0.9957, diff_loss=0.3492, tot_loss=1.63, over 442.00 samples.], 2024-10-21 00:26:53,782 INFO [train.py:561] (1/4) Epoch 779, batch 12, global_batch_idx: 12460, batch size: 152, loss[dur_loss=0.2853, prior_loss=0.9947, diff_loss=0.3554, tot_loss=1.635, over 152.00 samples.], tot_loss[dur_loss=0.2825, prior_loss=0.995, diff_loss=0.4142, tot_loss=1.692, over 1966.00 samples.], 2024-10-21 00:26:58,290 INFO [train.py:682] (1/4) Start epoch 780 2024-10-21 00:27:15,534 INFO [train.py:561] (1/4) Epoch 780, batch 6, global_batch_idx: 12470, batch size: 106, loss[dur_loss=0.287, prior_loss=0.9955, diff_loss=0.329, tot_loss=1.612, over 106.00 samples.], tot_loss[dur_loss=0.2792, prior_loss=0.9943, diff_loss=0.4554, tot_loss=1.729, over 1142.00 samples.], 2024-10-21 00:27:28,653 INFO [train.py:682] (1/4) Start epoch 781 2024-10-21 00:27:37,736 INFO [train.py:561] (1/4) Epoch 781, batch 0, global_batch_idx: 12480, batch size: 108, loss[dur_loss=0.2858, prior_loss=0.9958, diff_loss=0.3813, tot_loss=1.663, over 108.00 samples.], tot_loss[dur_loss=0.2858, prior_loss=0.9958, diff_loss=0.3813, tot_loss=1.663, over 108.00 samples.], 2024-10-21 00:27:52,155 INFO [train.py:561] (1/4) Epoch 781, batch 10, global_batch_idx: 12490, batch size: 111, loss[dur_loss=0.2884, prior_loss=0.996, diff_loss=0.36, tot_loss=1.644, over 111.00 samples.], tot_loss[dur_loss=0.2808, prior_loss=0.9946, diff_loss=0.4242, tot_loss=1.7, over 1656.00 samples.], 2024-10-21 00:27:59,308 INFO [train.py:682] (1/4) Start epoch 782 2024-10-21 00:28:13,136 INFO [train.py:561] (1/4) Epoch 782, batch 4, global_batch_idx: 12500, batch size: 189, loss[dur_loss=0.2821, prior_loss=0.9951, diff_loss=0.3738, tot_loss=1.651, over 189.00 samples.], 
tot_loss[dur_loss=0.2756, prior_loss=0.9939, diff_loss=0.4826, tot_loss=1.752, over 937.00 samples.], 2024-10-21 00:28:28,016 INFO [train.py:561] (1/4) Epoch 782, batch 14, global_batch_idx: 12510, batch size: 142, loss[dur_loss=0.2825, prior_loss=0.9947, diff_loss=0.3513, tot_loss=1.628, over 142.00 samples.], tot_loss[dur_loss=0.2807, prior_loss=0.9945, diff_loss=0.4118, tot_loss=1.687, over 2210.00 samples.], 2024-10-21 00:28:29,464 INFO [train.py:682] (1/4) Start epoch 783 2024-10-21 00:28:49,938 INFO [train.py:561] (1/4) Epoch 783, batch 8, global_batch_idx: 12520, batch size: 170, loss[dur_loss=0.283, prior_loss=0.9946, diff_loss=0.4032, tot_loss=1.681, over 170.00 samples.], tot_loss[dur_loss=0.2773, prior_loss=0.9943, diff_loss=0.4365, tot_loss=1.708, over 1432.00 samples.], 2024-10-21 00:29:00,092 INFO [train.py:682] (1/4) Start epoch 784 2024-10-21 00:29:11,752 INFO [train.py:561] (1/4) Epoch 784, batch 2, global_batch_idx: 12530, batch size: 203, loss[dur_loss=0.2803, prior_loss=0.9944, diff_loss=0.3769, tot_loss=1.652, over 203.00 samples.], tot_loss[dur_loss=0.2824, prior_loss=0.9947, diff_loss=0.3738, tot_loss=1.651, over 442.00 samples.], 2024-10-21 00:29:26,039 INFO [train.py:561] (1/4) Epoch 784, batch 12, global_batch_idx: 12540, batch size: 152, loss[dur_loss=0.2774, prior_loss=0.9944, diff_loss=0.4058, tot_loss=1.678, over 152.00 samples.], tot_loss[dur_loss=0.2795, prior_loss=0.9944, diff_loss=0.4199, tot_loss=1.694, over 1966.00 samples.], 2024-10-21 00:29:30,521 INFO [train.py:682] (1/4) Start epoch 785 2024-10-21 00:29:47,528 INFO [train.py:561] (1/4) Epoch 785, batch 6, global_batch_idx: 12550, batch size: 106, loss[dur_loss=0.282, prior_loss=0.9947, diff_loss=0.3233, tot_loss=1.6, over 106.00 samples.], tot_loss[dur_loss=0.2756, prior_loss=0.9939, diff_loss=0.4432, tot_loss=1.713, over 1142.00 samples.], 2024-10-21 00:30:00,704 INFO [train.py:682] (1/4) Start epoch 786 2024-10-21 00:30:09,586 INFO [train.py:561] (1/4) Epoch 786, batch 0, global_batch_idx: 12560, batch size: 108, loss[dur_loss=0.29, prior_loss=0.9955, diff_loss=0.3271, tot_loss=1.613, over 108.00 samples.], tot_loss[dur_loss=0.29, prior_loss=0.9955, diff_loss=0.3271, tot_loss=1.613, over 108.00 samples.], 2024-10-21 00:30:23,859 INFO [train.py:561] (1/4) Epoch 786, batch 10, global_batch_idx: 12570, batch size: 111, loss[dur_loss=0.2881, prior_loss=0.996, diff_loss=0.3419, tot_loss=1.626, over 111.00 samples.], tot_loss[dur_loss=0.2789, prior_loss=0.9943, diff_loss=0.4158, tot_loss=1.689, over 1656.00 samples.], 2024-10-21 00:30:30,969 INFO [train.py:682] (1/4) Start epoch 787 2024-10-21 00:30:45,069 INFO [train.py:561] (1/4) Epoch 787, batch 4, global_batch_idx: 12580, batch size: 189, loss[dur_loss=0.2847, prior_loss=0.9964, diff_loss=0.4089, tot_loss=1.69, over 189.00 samples.], tot_loss[dur_loss=0.2751, prior_loss=0.9942, diff_loss=0.4695, tot_loss=1.739, over 937.00 samples.], 2024-10-21 00:30:59,935 INFO [train.py:561] (1/4) Epoch 787, batch 14, global_batch_idx: 12590, batch size: 142, loss[dur_loss=0.2829, prior_loss=0.9948, diff_loss=0.3615, tot_loss=1.639, over 142.00 samples.], tot_loss[dur_loss=0.2801, prior_loss=0.9947, diff_loss=0.4067, tot_loss=1.682, over 2210.00 samples.], 2024-10-21 00:31:01,382 INFO [train.py:682] (1/4) Start epoch 788 2024-10-21 00:31:21,455 INFO [train.py:561] (1/4) Epoch 788, batch 8, global_batch_idx: 12600, batch size: 170, loss[dur_loss=0.2824, prior_loss=0.9956, diff_loss=0.3865, tot_loss=1.665, over 170.00 samples.], tot_loss[dur_loss=0.2787, 
prior_loss=0.9944, diff_loss=0.4319, tot_loss=1.705, over 1432.00 samples.], 2024-10-21 00:31:31,467 INFO [train.py:682] (1/4) Start epoch 789 2024-10-21 00:31:42,993 INFO [train.py:561] (1/4) Epoch 789, batch 2, global_batch_idx: 12610, batch size: 203, loss[dur_loss=0.2819, prior_loss=0.9946, diff_loss=0.3941, tot_loss=1.671, over 203.00 samples.], tot_loss[dur_loss=0.2831, prior_loss=0.9948, diff_loss=0.3626, tot_loss=1.641, over 442.00 samples.], 2024-10-21 00:31:57,036 INFO [train.py:561] (1/4) Epoch 789, batch 12, global_batch_idx: 12620, batch size: 152, loss[dur_loss=0.286, prior_loss=0.9946, diff_loss=0.3908, tot_loss=1.671, over 152.00 samples.], tot_loss[dur_loss=0.2799, prior_loss=0.9944, diff_loss=0.4229, tot_loss=1.697, over 1966.00 samples.], 2024-10-21 00:32:01,493 INFO [train.py:682] (1/4) Start epoch 790 2024-10-21 00:32:18,625 INFO [train.py:561] (1/4) Epoch 790, batch 6, global_batch_idx: 12630, batch size: 106, loss[dur_loss=0.2802, prior_loss=0.9948, diff_loss=0.3232, tot_loss=1.598, over 106.00 samples.], tot_loss[dur_loss=0.2753, prior_loss=0.9941, diff_loss=0.448, tot_loss=1.717, over 1142.00 samples.], 2024-10-21 00:32:31,713 INFO [train.py:682] (1/4) Start epoch 791 2024-10-21 00:32:40,570 INFO [train.py:561] (1/4) Epoch 791, batch 0, global_batch_idx: 12640, batch size: 108, loss[dur_loss=0.2865, prior_loss=0.9953, diff_loss=0.3447, tot_loss=1.626, over 108.00 samples.], tot_loss[dur_loss=0.2865, prior_loss=0.9953, diff_loss=0.3447, tot_loss=1.626, over 108.00 samples.], 2024-10-21 00:32:54,836 INFO [train.py:561] (1/4) Epoch 791, batch 10, global_batch_idx: 12650, batch size: 111, loss[dur_loss=0.2883, prior_loss=0.996, diff_loss=0.3577, tot_loss=1.642, over 111.00 samples.], tot_loss[dur_loss=0.2781, prior_loss=0.9943, diff_loss=0.4211, tot_loss=1.693, over 1656.00 samples.], 2024-10-21 00:33:01,943 INFO [train.py:682] (1/4) Start epoch 792 2024-10-21 00:33:15,712 INFO [train.py:561] (1/4) Epoch 792, batch 4, global_batch_idx: 12660, batch size: 189, loss[dur_loss=0.2775, prior_loss=0.9948, diff_loss=0.3777, tot_loss=1.65, over 189.00 samples.], tot_loss[dur_loss=0.274, prior_loss=0.9936, diff_loss=0.4684, tot_loss=1.736, over 937.00 samples.], 2024-10-21 00:33:30,559 INFO [train.py:561] (1/4) Epoch 792, batch 14, global_batch_idx: 12670, batch size: 142, loss[dur_loss=0.2797, prior_loss=0.9943, diff_loss=0.3554, tot_loss=1.629, over 142.00 samples.], tot_loss[dur_loss=0.2796, prior_loss=0.9943, diff_loss=0.4016, tot_loss=1.675, over 2210.00 samples.], 2024-10-21 00:33:31,977 INFO [train.py:682] (1/4) Start epoch 793 2024-10-21 00:33:52,186 INFO [train.py:561] (1/4) Epoch 793, batch 8, global_batch_idx: 12680, batch size: 170, loss[dur_loss=0.281, prior_loss=0.9943, diff_loss=0.391, tot_loss=1.666, over 170.00 samples.], tot_loss[dur_loss=0.277, prior_loss=0.9939, diff_loss=0.4504, tot_loss=1.721, over 1432.00 samples.], 2024-10-21 00:34:02,307 INFO [train.py:682] (1/4) Start epoch 794 2024-10-21 00:34:13,870 INFO [train.py:561] (1/4) Epoch 794, batch 2, global_batch_idx: 12690, batch size: 203, loss[dur_loss=0.2843, prior_loss=0.9943, diff_loss=0.3616, tot_loss=1.64, over 203.00 samples.], tot_loss[dur_loss=0.284, prior_loss=0.9946, diff_loss=0.3681, tot_loss=1.647, over 442.00 samples.], 2024-10-21 00:34:28,090 INFO [train.py:561] (1/4) Epoch 794, batch 12, global_batch_idx: 12700, batch size: 152, loss[dur_loss=0.2826, prior_loss=0.9945, diff_loss=0.3512, tot_loss=1.628, over 152.00 samples.], tot_loss[dur_loss=0.2795, prior_loss=0.9942, diff_loss=0.4066, 
tot_loss=1.68, over 1966.00 samples.], 2024-10-21 00:34:32,533 INFO [train.py:682] (1/4) Start epoch 795 2024-10-21 00:34:49,622 INFO [train.py:561] (1/4) Epoch 795, batch 6, global_batch_idx: 12710, batch size: 106, loss[dur_loss=0.2863, prior_loss=0.9947, diff_loss=0.3558, tot_loss=1.637, over 106.00 samples.], tot_loss[dur_loss=0.2769, prior_loss=0.9938, diff_loss=0.4493, tot_loss=1.72, over 1142.00 samples.], 2024-10-21 00:35:02,682 INFO [train.py:682] (1/4) Start epoch 796 2024-10-21 00:35:11,625 INFO [train.py:561] (1/4) Epoch 796, batch 0, global_batch_idx: 12720, batch size: 108, loss[dur_loss=0.2881, prior_loss=0.9953, diff_loss=0.3477, tot_loss=1.631, over 108.00 samples.], tot_loss[dur_loss=0.2881, prior_loss=0.9953, diff_loss=0.3477, tot_loss=1.631, over 108.00 samples.], 2024-10-21 00:35:25,867 INFO [train.py:561] (1/4) Epoch 796, batch 10, global_batch_idx: 12730, batch size: 111, loss[dur_loss=0.288, prior_loss=0.996, diff_loss=0.3394, tot_loss=1.623, over 111.00 samples.], tot_loss[dur_loss=0.2787, prior_loss=0.9943, diff_loss=0.4218, tot_loss=1.695, over 1656.00 samples.], 2024-10-21 00:35:32,972 INFO [train.py:682] (1/4) Start epoch 797 2024-10-21 00:35:46,806 INFO [train.py:561] (1/4) Epoch 797, batch 4, global_batch_idx: 12740, batch size: 189, loss[dur_loss=0.2832, prior_loss=0.9952, diff_loss=0.3652, tot_loss=1.644, over 189.00 samples.], tot_loss[dur_loss=0.2749, prior_loss=0.9937, diff_loss=0.4815, tot_loss=1.75, over 937.00 samples.], 2024-10-21 00:36:01,690 INFO [train.py:561] (1/4) Epoch 797, batch 14, global_batch_idx: 12750, batch size: 142, loss[dur_loss=0.2856, prior_loss=0.9942, diff_loss=0.3281, tot_loss=1.608, over 142.00 samples.], tot_loss[dur_loss=0.281, prior_loss=0.9943, diff_loss=0.4142, tot_loss=1.69, over 2210.00 samples.], 2024-10-21 00:36:03,106 INFO [train.py:682] (1/4) Start epoch 798 2024-10-21 00:36:22,940 INFO [train.py:561] (1/4) Epoch 798, batch 8, global_batch_idx: 12760, batch size: 170, loss[dur_loss=0.2811, prior_loss=0.9945, diff_loss=0.3756, tot_loss=1.651, over 170.00 samples.], tot_loss[dur_loss=0.2776, prior_loss=0.9939, diff_loss=0.4401, tot_loss=1.712, over 1432.00 samples.], 2024-10-21 00:36:33,100 INFO [train.py:682] (1/4) Start epoch 799 2024-10-21 00:36:44,592 INFO [train.py:561] (1/4) Epoch 799, batch 2, global_batch_idx: 12770, batch size: 203, loss[dur_loss=0.2802, prior_loss=0.9944, diff_loss=0.3886, tot_loss=1.663, over 203.00 samples.], tot_loss[dur_loss=0.2829, prior_loss=0.9947, diff_loss=0.3698, tot_loss=1.647, over 442.00 samples.], 2024-10-21 00:36:58,795 INFO [train.py:561] (1/4) Epoch 799, batch 12, global_batch_idx: 12780, batch size: 152, loss[dur_loss=0.281, prior_loss=0.994, diff_loss=0.3495, tot_loss=1.625, over 152.00 samples.], tot_loss[dur_loss=0.2784, prior_loss=0.994, diff_loss=0.4183, tot_loss=1.691, over 1966.00 samples.], 2024-10-21 00:37:03,272 INFO [train.py:682] (1/4) Start epoch 800 2024-10-21 00:37:20,487 INFO [train.py:561] (1/4) Epoch 800, batch 6, global_batch_idx: 12790, batch size: 106, loss[dur_loss=0.2789, prior_loss=0.9943, diff_loss=0.3574, tot_loss=1.631, over 106.00 samples.], tot_loss[dur_loss=0.274, prior_loss=0.9936, diff_loss=0.4429, tot_loss=1.71, over 1142.00 samples.], 2024-10-21 00:37:33,585 INFO [train.py:682] (1/4) Start epoch 801 2024-10-21 00:37:42,363 INFO [train.py:561] (1/4) Epoch 801, batch 0, global_batch_idx: 12800, batch size: 108, loss[dur_loss=0.2888, prior_loss=0.9952, diff_loss=0.3259, tot_loss=1.61, over 108.00 samples.], tot_loss[dur_loss=0.2888, 
prior_loss=0.9952, diff_loss=0.3259, tot_loss=1.61, over 108.00 samples.], 2024-10-21 00:37:56,807 INFO [train.py:561] (1/4) Epoch 801, batch 10, global_batch_idx: 12810, batch size: 111, loss[dur_loss=0.2853, prior_loss=0.9956, diff_loss=0.3649, tot_loss=1.646, over 111.00 samples.], tot_loss[dur_loss=0.2778, prior_loss=0.9941, diff_loss=0.4167, tot_loss=1.689, over 1656.00 samples.], 2024-10-21 00:38:03,917 INFO [train.py:682] (1/4) Start epoch 802 2024-10-21 00:38:18,225 INFO [train.py:561] (1/4) Epoch 802, batch 4, global_batch_idx: 12820, batch size: 189, loss[dur_loss=0.2826, prior_loss=0.9952, diff_loss=0.401, tot_loss=1.679, over 189.00 samples.], tot_loss[dur_loss=0.2748, prior_loss=0.9936, diff_loss=0.4853, tot_loss=1.754, over 937.00 samples.], 2024-10-21 00:38:33,068 INFO [train.py:561] (1/4) Epoch 802, batch 14, global_batch_idx: 12830, batch size: 142, loss[dur_loss=0.284, prior_loss=0.9941, diff_loss=0.3396, tot_loss=1.618, over 142.00 samples.], tot_loss[dur_loss=0.2792, prior_loss=0.9942, diff_loss=0.414, tot_loss=1.687, over 2210.00 samples.], 2024-10-21 00:38:34,494 INFO [train.py:682] (1/4) Start epoch 803 2024-10-21 00:38:54,823 INFO [train.py:561] (1/4) Epoch 803, batch 8, global_batch_idx: 12840, batch size: 170, loss[dur_loss=0.279, prior_loss=0.9943, diff_loss=0.3376, tot_loss=1.611, over 170.00 samples.], tot_loss[dur_loss=0.2771, prior_loss=0.9939, diff_loss=0.4245, tot_loss=1.695, over 1432.00 samples.], 2024-10-21 00:39:04,948 INFO [train.py:682] (1/4) Start epoch 804 2024-10-21 00:39:16,727 INFO [train.py:561] (1/4) Epoch 804, batch 2, global_batch_idx: 12850, batch size: 203, loss[dur_loss=0.2782, prior_loss=0.9946, diff_loss=0.3774, tot_loss=1.65, over 203.00 samples.], tot_loss[dur_loss=0.2811, prior_loss=0.9948, diff_loss=0.3653, tot_loss=1.641, over 442.00 samples.], 2024-10-21 00:39:30,847 INFO [train.py:561] (1/4) Epoch 804, batch 12, global_batch_idx: 12860, batch size: 152, loss[dur_loss=0.2814, prior_loss=0.9941, diff_loss=0.369, tot_loss=1.644, over 152.00 samples.], tot_loss[dur_loss=0.2778, prior_loss=0.994, diff_loss=0.4108, tot_loss=1.683, over 1966.00 samples.], 2024-10-21 00:39:35,304 INFO [train.py:682] (1/4) Start epoch 805 2024-10-21 00:39:52,670 INFO [train.py:561] (1/4) Epoch 805, batch 6, global_batch_idx: 12870, batch size: 106, loss[dur_loss=0.2813, prior_loss=0.9947, diff_loss=0.3391, tot_loss=1.615, over 106.00 samples.], tot_loss[dur_loss=0.2757, prior_loss=0.9937, diff_loss=0.444, tot_loss=1.713, over 1142.00 samples.], 2024-10-21 00:40:05,590 INFO [train.py:682] (1/4) Start epoch 806 2024-10-21 00:40:14,629 INFO [train.py:561] (1/4) Epoch 806, batch 0, global_batch_idx: 12880, batch size: 108, loss[dur_loss=0.2901, prior_loss=0.9951, diff_loss=0.3384, tot_loss=1.624, over 108.00 samples.], tot_loss[dur_loss=0.2901, prior_loss=0.9951, diff_loss=0.3384, tot_loss=1.624, over 108.00 samples.], 2024-10-21 00:40:29,022 INFO [train.py:561] (1/4) Epoch 806, batch 10, global_batch_idx: 12890, batch size: 111, loss[dur_loss=0.2857, prior_loss=0.9958, diff_loss=0.3664, tot_loss=1.648, over 111.00 samples.], tot_loss[dur_loss=0.2767, prior_loss=0.994, diff_loss=0.4247, tot_loss=1.695, over 1656.00 samples.], 2024-10-21 00:40:36,203 INFO [train.py:682] (1/4) Start epoch 807 2024-10-21 00:40:50,516 INFO [train.py:561] (1/4) Epoch 807, batch 4, global_batch_idx: 12900, batch size: 189, loss[dur_loss=0.2845, prior_loss=0.9952, diff_loss=0.3784, tot_loss=1.658, over 189.00 samples.], tot_loss[dur_loss=0.276, prior_loss=0.9935, diff_loss=0.4591, 
tot_loss=1.729, over 937.00 samples.], 2024-10-21 00:41:05,530 INFO [train.py:561] (1/4) Epoch 807, batch 14, global_batch_idx: 12910, batch size: 142, loss[dur_loss=0.2819, prior_loss=0.9941, diff_loss=0.3439, tot_loss=1.62, over 142.00 samples.], tot_loss[dur_loss=0.2798, prior_loss=0.994, diff_loss=0.3941, tot_loss=1.668, over 2210.00 samples.], 2024-10-21 00:41:06,977 INFO [train.py:682] (1/4) Start epoch 808 2024-10-21 00:41:26,954 INFO [train.py:561] (1/4) Epoch 808, batch 8, global_batch_idx: 12920, batch size: 170, loss[dur_loss=0.2801, prior_loss=0.9944, diff_loss=0.3495, tot_loss=1.624, over 170.00 samples.], tot_loss[dur_loss=0.2767, prior_loss=0.9937, diff_loss=0.4313, tot_loss=1.702, over 1432.00 samples.], 2024-10-21 00:41:37,143 INFO [train.py:682] (1/4) Start epoch 809 2024-10-21 00:41:48,851 INFO [train.py:561] (1/4) Epoch 809, batch 2, global_batch_idx: 12930, batch size: 203, loss[dur_loss=0.2795, prior_loss=0.9943, diff_loss=0.389, tot_loss=1.663, over 203.00 samples.], tot_loss[dur_loss=0.2808, prior_loss=0.9944, diff_loss=0.3595, tot_loss=1.635, over 442.00 samples.], 2024-10-21 00:42:03,137 INFO [train.py:561] (1/4) Epoch 809, batch 12, global_batch_idx: 12940, batch size: 152, loss[dur_loss=0.2806, prior_loss=0.9939, diff_loss=0.3416, tot_loss=1.616, over 152.00 samples.], tot_loss[dur_loss=0.2775, prior_loss=0.9939, diff_loss=0.4083, tot_loss=1.68, over 1966.00 samples.], 2024-10-21 00:42:07,608 INFO [train.py:682] (1/4) Start epoch 810 2024-10-21 00:42:24,628 INFO [train.py:561] (1/4) Epoch 810, batch 6, global_batch_idx: 12950, batch size: 106, loss[dur_loss=0.2832, prior_loss=0.9947, diff_loss=0.3561, tot_loss=1.634, over 106.00 samples.], tot_loss[dur_loss=0.2749, prior_loss=0.9935, diff_loss=0.4482, tot_loss=1.717, over 1142.00 samples.], 2024-10-21 00:42:37,867 INFO [train.py:682] (1/4) Start epoch 811 2024-10-21 00:42:46,826 INFO [train.py:561] (1/4) Epoch 811, batch 0, global_batch_idx: 12960, batch size: 108, loss[dur_loss=0.2884, prior_loss=0.9953, diff_loss=0.3468, tot_loss=1.63, over 108.00 samples.], tot_loss[dur_loss=0.2884, prior_loss=0.9953, diff_loss=0.3468, tot_loss=1.63, over 108.00 samples.], 2024-10-21 00:43:01,107 INFO [train.py:561] (1/4) Epoch 811, batch 10, global_batch_idx: 12970, batch size: 111, loss[dur_loss=0.2866, prior_loss=0.9952, diff_loss=0.3653, tot_loss=1.647, over 111.00 samples.], tot_loss[dur_loss=0.2788, prior_loss=0.9941, diff_loss=0.4207, tot_loss=1.694, over 1656.00 samples.], 2024-10-21 00:43:08,281 INFO [train.py:682] (1/4) Start epoch 812 2024-10-21 00:43:22,103 INFO [train.py:561] (1/4) Epoch 812, batch 4, global_batch_idx: 12980, batch size: 189, loss[dur_loss=0.2829, prior_loss=0.9954, diff_loss=0.3865, tot_loss=1.665, over 189.00 samples.], tot_loss[dur_loss=0.2738, prior_loss=0.9935, diff_loss=0.4762, tot_loss=1.743, over 937.00 samples.], 2024-10-21 00:43:37,097 INFO [train.py:561] (1/4) Epoch 812, batch 14, global_batch_idx: 12990, batch size: 142, loss[dur_loss=0.2863, prior_loss=0.9943, diff_loss=0.3539, tot_loss=1.634, over 142.00 samples.], tot_loss[dur_loss=0.2793, prior_loss=0.9941, diff_loss=0.4048, tot_loss=1.678, over 2210.00 samples.], 2024-10-21 00:43:38,548 INFO [train.py:682] (1/4) Start epoch 813 2024-10-21 00:43:58,863 INFO [train.py:561] (1/4) Epoch 813, batch 8, global_batch_idx: 13000, batch size: 170, loss[dur_loss=0.2812, prior_loss=0.9944, diff_loss=0.3616, tot_loss=1.637, over 170.00 samples.], tot_loss[dur_loss=0.2787, prior_loss=0.994, diff_loss=0.4368, tot_loss=1.71, over 1432.00 
samples.], 2024-10-21 00:44:09,043 INFO [train.py:682] (1/4) Start epoch 814 2024-10-21 00:44:20,608 INFO [train.py:561] (1/4) Epoch 814, batch 2, global_batch_idx: 13010, batch size: 203, loss[dur_loss=0.2805, prior_loss=0.9943, diff_loss=0.386, tot_loss=1.661, over 203.00 samples.], tot_loss[dur_loss=0.2822, prior_loss=0.9946, diff_loss=0.3801, tot_loss=1.657, over 442.00 samples.], 2024-10-21 00:44:34,901 INFO [train.py:561] (1/4) Epoch 814, batch 12, global_batch_idx: 13020, batch size: 152, loss[dur_loss=0.2791, prior_loss=0.9937, diff_loss=0.3703, tot_loss=1.643, over 152.00 samples.], tot_loss[dur_loss=0.2775, prior_loss=0.9939, diff_loss=0.4123, tot_loss=1.684, over 1966.00 samples.], 2024-10-21 00:44:39,348 INFO [train.py:682] (1/4) Start epoch 815 2024-10-21 00:44:56,973 INFO [train.py:561] (1/4) Epoch 815, batch 6, global_batch_idx: 13030, batch size: 106, loss[dur_loss=0.28, prior_loss=0.9941, diff_loss=0.3574, tot_loss=1.631, over 106.00 samples.], tot_loss[dur_loss=0.2751, prior_loss=0.9937, diff_loss=0.4595, tot_loss=1.728, over 1142.00 samples.], 2024-10-21 00:45:10,180 INFO [train.py:682] (1/4) Start epoch 816 2024-10-21 00:45:19,152 INFO [train.py:561] (1/4) Epoch 816, batch 0, global_batch_idx: 13040, batch size: 108, loss[dur_loss=0.2841, prior_loss=0.9946, diff_loss=0.316, tot_loss=1.595, over 108.00 samples.], tot_loss[dur_loss=0.2841, prior_loss=0.9946, diff_loss=0.316, tot_loss=1.595, over 108.00 samples.], 2024-10-21 00:45:33,485 INFO [train.py:561] (1/4) Epoch 816, batch 10, global_batch_idx: 13050, batch size: 111, loss[dur_loss=0.2881, prior_loss=0.9953, diff_loss=0.3507, tot_loss=1.634, over 111.00 samples.], tot_loss[dur_loss=0.2766, prior_loss=0.9938, diff_loss=0.4148, tot_loss=1.685, over 1656.00 samples.], 2024-10-21 00:45:40,610 INFO [train.py:682] (1/4) Start epoch 817 2024-10-21 00:45:54,280 INFO [train.py:561] (1/4) Epoch 817, batch 4, global_batch_idx: 13060, batch size: 189, loss[dur_loss=0.2771, prior_loss=0.9941, diff_loss=0.364, tot_loss=1.635, over 189.00 samples.], tot_loss[dur_loss=0.271, prior_loss=0.993, diff_loss=0.4876, tot_loss=1.752, over 937.00 samples.], 2024-10-21 00:46:09,201 INFO [train.py:561] (1/4) Epoch 817, batch 14, global_batch_idx: 13070, batch size: 142, loss[dur_loss=0.2814, prior_loss=0.9937, diff_loss=0.3653, tot_loss=1.64, over 142.00 samples.], tot_loss[dur_loss=0.2769, prior_loss=0.9936, diff_loss=0.4103, tot_loss=1.681, over 2210.00 samples.], 2024-10-21 00:46:10,657 INFO [train.py:682] (1/4) Start epoch 818 2024-10-21 00:46:30,903 INFO [train.py:561] (1/4) Epoch 818, batch 8, global_batch_idx: 13080, batch size: 170, loss[dur_loss=0.2792, prior_loss=0.9937, diff_loss=0.3508, tot_loss=1.624, over 170.00 samples.], tot_loss[dur_loss=0.2754, prior_loss=0.9933, diff_loss=0.4202, tot_loss=1.689, over 1432.00 samples.], 2024-10-21 00:46:41,053 INFO [train.py:682] (1/4) Start epoch 819 2024-10-21 00:46:52,435 INFO [train.py:561] (1/4) Epoch 819, batch 2, global_batch_idx: 13090, batch size: 203, loss[dur_loss=0.2794, prior_loss=0.9936, diff_loss=0.383, tot_loss=1.656, over 203.00 samples.], tot_loss[dur_loss=0.2814, prior_loss=0.9939, diff_loss=0.3684, tot_loss=1.644, over 442.00 samples.], 2024-10-21 00:47:06,710 INFO [train.py:561] (1/4) Epoch 819, batch 12, global_batch_idx: 13100, batch size: 152, loss[dur_loss=0.2749, prior_loss=0.9933, diff_loss=0.3642, tot_loss=1.632, over 152.00 samples.], tot_loss[dur_loss=0.276, prior_loss=0.9934, diff_loss=0.418, tot_loss=1.687, over 1966.00 samples.], 2024-10-21 00:47:11,152 INFO 
[train.py:682] (1/4) Start epoch 820 2024-10-21 00:47:28,387 INFO [train.py:561] (1/4) Epoch 820, batch 6, global_batch_idx: 13110, batch size: 106, loss[dur_loss=0.2822, prior_loss=0.9941, diff_loss=0.3672, tot_loss=1.644, over 106.00 samples.], tot_loss[dur_loss=0.2725, prior_loss=0.9931, diff_loss=0.4525, tot_loss=1.718, over 1142.00 samples.], 2024-10-21 00:47:41,570 INFO [train.py:682] (1/4) Start epoch 821 2024-10-21 00:47:50,441 INFO [train.py:561] (1/4) Epoch 821, batch 0, global_batch_idx: 13120, batch size: 108, loss[dur_loss=0.2871, prior_loss=0.9949, diff_loss=0.3327, tot_loss=1.615, over 108.00 samples.], tot_loss[dur_loss=0.2871, prior_loss=0.9949, diff_loss=0.3327, tot_loss=1.615, over 108.00 samples.], 2024-10-21 00:48:04,784 INFO [train.py:561] (1/4) Epoch 821, batch 10, global_batch_idx: 13130, batch size: 111, loss[dur_loss=0.2809, prior_loss=0.9951, diff_loss=0.3249, tot_loss=1.601, over 111.00 samples.], tot_loss[dur_loss=0.2753, prior_loss=0.9934, diff_loss=0.4176, tot_loss=1.686, over 1656.00 samples.], 2024-10-21 00:48:11,938 INFO [train.py:682] (1/4) Start epoch 822 2024-10-21 00:48:26,013 INFO [train.py:561] (1/4) Epoch 822, batch 4, global_batch_idx: 13140, batch size: 189, loss[dur_loss=0.281, prior_loss=0.9946, diff_loss=0.4247, tot_loss=1.7, over 189.00 samples.], tot_loss[dur_loss=0.2724, prior_loss=0.993, diff_loss=0.4752, tot_loss=1.741, over 937.00 samples.], 2024-10-21 00:48:40,966 INFO [train.py:561] (1/4) Epoch 822, batch 14, global_batch_idx: 13150, batch size: 142, loss[dur_loss=0.2803, prior_loss=0.9934, diff_loss=0.3381, tot_loss=1.612, over 142.00 samples.], tot_loss[dur_loss=0.2769, prior_loss=0.9936, diff_loss=0.4061, tot_loss=1.677, over 2210.00 samples.], 2024-10-21 00:48:42,445 INFO [train.py:682] (1/4) Start epoch 823 2024-10-21 00:49:03,168 INFO [train.py:561] (1/4) Epoch 823, batch 8, global_batch_idx: 13160, batch size: 170, loss[dur_loss=0.2792, prior_loss=0.9938, diff_loss=0.36, tot_loss=1.633, over 170.00 samples.], tot_loss[dur_loss=0.2747, prior_loss=0.9934, diff_loss=0.4226, tot_loss=1.691, over 1432.00 samples.], 2024-10-21 00:49:13,478 INFO [train.py:682] (1/4) Start epoch 824 2024-10-21 00:49:25,010 INFO [train.py:561] (1/4) Epoch 824, batch 2, global_batch_idx: 13170, batch size: 203, loss[dur_loss=0.277, prior_loss=0.9937, diff_loss=0.3913, tot_loss=1.662, over 203.00 samples.], tot_loss[dur_loss=0.2789, prior_loss=0.9939, diff_loss=0.3782, tot_loss=1.651, over 442.00 samples.], 2024-10-21 00:49:39,342 INFO [train.py:561] (1/4) Epoch 824, batch 12, global_batch_idx: 13180, batch size: 152, loss[dur_loss=0.2773, prior_loss=0.9936, diff_loss=0.3682, tot_loss=1.639, over 152.00 samples.], tot_loss[dur_loss=0.2769, prior_loss=0.9937, diff_loss=0.4219, tot_loss=1.692, over 1966.00 samples.], 2024-10-21 00:49:43,831 INFO [train.py:682] (1/4) Start epoch 825 2024-10-21 00:50:01,028 INFO [train.py:561] (1/4) Epoch 825, batch 6, global_batch_idx: 13190, batch size: 106, loss[dur_loss=0.2772, prior_loss=0.994, diff_loss=0.3468, tot_loss=1.618, over 106.00 samples.], tot_loss[dur_loss=0.272, prior_loss=0.9932, diff_loss=0.4512, tot_loss=1.716, over 1142.00 samples.], 2024-10-21 00:50:14,194 INFO [train.py:682] (1/4) Start epoch 826 2024-10-21 00:50:22,959 INFO [train.py:561] (1/4) Epoch 826, batch 0, global_batch_idx: 13200, batch size: 108, loss[dur_loss=0.288, prior_loss=0.9946, diff_loss=0.4124, tot_loss=1.695, over 108.00 samples.], tot_loss[dur_loss=0.288, prior_loss=0.9946, diff_loss=0.4124, tot_loss=1.695, over 108.00 samples.], 
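
Each per-batch record in this log follows one fixed pattern: timestamp, rank, source location, then "Epoch E, batch B, global_batch_idx: G, batch size: N", a loss[...] group for the current batch, and a tot_loss[...] group aggregated over the samples seen so far in the epoch. Below is a minimal parsing sketch for offline analysis of a dump like this one; it assumes only the field layout visible above, skips the less frequent validation records, and the helper name parse_entries is hypothetical rather than anything provided by train.py.

import re

# Matches the per-batch records visible in this log; the \s+ separators
# tolerate the line wrapping introduced when the log was dumped as text.
ENTRY = re.compile(
    r"Epoch\s+(?P<epoch>\d+),\s+batch\s+(?P<batch>\d+),\s+"
    r"global_batch_idx:\s+(?P<gbi>\d+),\s+batch\s+size:\s+(?P<bsz>\d+),\s+"
    r"loss\[(?P<cur>[^\]]*)\],\s+tot_loss\[(?P<run>[^\]]*)\]"
)
# name=number pairs inside a bracketed group, e.g. "dur_loss=0.2805".
FIELD = re.compile(r"(\w+)=([\d.]+)")

def parse_entries(text):
    # Hypothetical helper for offline log analysis, not part of train.py.
    # Yields (epoch, batch, global_batch_idx, batch_size, cur, run), where
    # cur and run are dicts keyed by dur_loss/prior_loss/diff_loss/tot_loss.
    for m in ENTRY.finditer(text):
        cur = {k: float(v) for k, v in FIELD.findall(m.group("cur"))}
        run = {k: float(v) for k, v in FIELD.findall(m.group("run"))}
        yield (int(m.group("epoch")), int(m.group("batch")),
               int(m.group("gbi")), int(m.group("bsz")), cur, run)

Plotting run["tot_loss"] against global_batch_idx from these records gives the smoothed training curve; over this stretch it drifts only from roughly 1.72 near batch 11,610 to roughly 1.68 near batch 13,500, while the validation records interleaved every 1500 global batches (e.g. at global_batch_idx 12000 above) stay near tot_loss 1.87.
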
2024-10-21 00:50:37,295 INFO [train.py:561] (1/4) Epoch 826, batch 10, global_batch_idx: 13210, batch size: 111, loss[dur_loss=0.2846, prior_loss=0.9947, diff_loss=0.3429, tot_loss=1.622, over 111.00 samples.], tot_loss[dur_loss=0.2764, prior_loss=0.9935, diff_loss=0.4297, tot_loss=1.7, over 1656.00 samples.], 2024-10-21 00:50:44,419 INFO [train.py:682] (1/4) Start epoch 827 2024-10-21 00:50:58,627 INFO [train.py:561] (1/4) Epoch 827, batch 4, global_batch_idx: 13220, batch size: 189, loss[dur_loss=0.2802, prior_loss=0.9944, diff_loss=0.3897, tot_loss=1.664, over 189.00 samples.], tot_loss[dur_loss=0.2716, prior_loss=0.9929, diff_loss=0.4696, tot_loss=1.734, over 937.00 samples.], 2024-10-21 00:51:13,520 INFO [train.py:561] (1/4) Epoch 827, batch 14, global_batch_idx: 13230, batch size: 142, loss[dur_loss=0.2802, prior_loss=0.9939, diff_loss=0.384, tot_loss=1.658, over 142.00 samples.], tot_loss[dur_loss=0.2763, prior_loss=0.9934, diff_loss=0.4099, tot_loss=1.68, over 2210.00 samples.], 2024-10-21 00:51:14,947 INFO [train.py:682] (1/4) Start epoch 828 2024-10-21 00:51:34,884 INFO [train.py:561] (1/4) Epoch 828, batch 8, global_batch_idx: 13240, batch size: 170, loss[dur_loss=0.2769, prior_loss=0.9942, diff_loss=0.3939, tot_loss=1.665, over 170.00 samples.], tot_loss[dur_loss=0.2747, prior_loss=0.9933, diff_loss=0.4326, tot_loss=1.701, over 1432.00 samples.], 2024-10-21 00:51:44,971 INFO [train.py:682] (1/4) Start epoch 829 2024-10-21 00:51:56,555 INFO [train.py:561] (1/4) Epoch 829, batch 2, global_batch_idx: 13250, batch size: 203, loss[dur_loss=0.28, prior_loss=0.9937, diff_loss=0.3764, tot_loss=1.65, over 203.00 samples.], tot_loss[dur_loss=0.2795, prior_loss=0.9939, diff_loss=0.3783, tot_loss=1.652, over 442.00 samples.], 2024-10-21 00:52:10,666 INFO [train.py:561] (1/4) Epoch 829, batch 12, global_batch_idx: 13260, batch size: 152, loss[dur_loss=0.2783, prior_loss=0.9933, diff_loss=0.3851, tot_loss=1.657, over 152.00 samples.], tot_loss[dur_loss=0.276, prior_loss=0.9933, diff_loss=0.4181, tot_loss=1.687, over 1966.00 samples.], 2024-10-21 00:52:15,068 INFO [train.py:682] (1/4) Start epoch 830 2024-10-21 00:52:32,238 INFO [train.py:561] (1/4) Epoch 830, batch 6, global_batch_idx: 13270, batch size: 106, loss[dur_loss=0.2787, prior_loss=0.9937, diff_loss=0.3744, tot_loss=1.647, over 106.00 samples.], tot_loss[dur_loss=0.2723, prior_loss=0.9929, diff_loss=0.462, tot_loss=1.727, over 1142.00 samples.], 2024-10-21 00:52:45,371 INFO [train.py:682] (1/4) Start epoch 831 2024-10-21 00:52:54,070 INFO [train.py:561] (1/4) Epoch 831, batch 0, global_batch_idx: 13280, batch size: 108, loss[dur_loss=0.2847, prior_loss=0.9945, diff_loss=0.3571, tot_loss=1.636, over 108.00 samples.], tot_loss[dur_loss=0.2847, prior_loss=0.9945, diff_loss=0.3571, tot_loss=1.636, over 108.00 samples.], 2024-10-21 00:53:08,250 INFO [train.py:561] (1/4) Epoch 831, batch 10, global_batch_idx: 13290, batch size: 111, loss[dur_loss=0.2868, prior_loss=0.9956, diff_loss=0.3279, tot_loss=1.61, over 111.00 samples.], tot_loss[dur_loss=0.2748, prior_loss=0.9934, diff_loss=0.42, tot_loss=1.688, over 1656.00 samples.], 2024-10-21 00:53:15,351 INFO [train.py:682] (1/4) Start epoch 832 2024-10-21 00:53:29,634 INFO [train.py:561] (1/4) Epoch 832, batch 4, global_batch_idx: 13300, batch size: 189, loss[dur_loss=0.2742, prior_loss=0.9938, diff_loss=0.3924, tot_loss=1.66, over 189.00 samples.], tot_loss[dur_loss=0.2716, prior_loss=0.9927, diff_loss=0.4602, tot_loss=1.725, over 937.00 samples.], 2024-10-21 00:53:44,489 INFO 
[train.py:561] (1/4) Epoch 832, batch 14, global_batch_idx: 13310, batch size: 142, loss[dur_loss=0.2779, prior_loss=0.9937, diff_loss=0.3476, tot_loss=1.619, over 142.00 samples.], tot_loss[dur_loss=0.2765, prior_loss=0.9934, diff_loss=0.398, tot_loss=1.668, over 2210.00 samples.], 2024-10-21 00:53:45,910 INFO [train.py:682] (1/4) Start epoch 833 2024-10-21 00:54:05,928 INFO [train.py:561] (1/4) Epoch 833, batch 8, global_batch_idx: 13320, batch size: 170, loss[dur_loss=0.2805, prior_loss=0.9937, diff_loss=0.3712, tot_loss=1.646, over 170.00 samples.], tot_loss[dur_loss=0.2738, prior_loss=0.9932, diff_loss=0.4415, tot_loss=1.708, over 1432.00 samples.], 2024-10-21 00:54:16,138 INFO [train.py:682] (1/4) Start epoch 834 2024-10-21 00:54:27,672 INFO [train.py:561] (1/4) Epoch 834, batch 2, global_batch_idx: 13330, batch size: 203, loss[dur_loss=0.277, prior_loss=0.9933, diff_loss=0.3879, tot_loss=1.658, over 203.00 samples.], tot_loss[dur_loss=0.2789, prior_loss=0.9936, diff_loss=0.3689, tot_loss=1.641, over 442.00 samples.], 2024-10-21 00:54:41,859 INFO [train.py:561] (1/4) Epoch 834, batch 12, global_batch_idx: 13340, batch size: 152, loss[dur_loss=0.2747, prior_loss=0.9934, diff_loss=0.367, tot_loss=1.635, over 152.00 samples.], tot_loss[dur_loss=0.2752, prior_loss=0.9932, diff_loss=0.4101, tot_loss=1.678, over 1966.00 samples.], 2024-10-21 00:54:46,360 INFO [train.py:682] (1/4) Start epoch 835 2024-10-21 00:55:03,376 INFO [train.py:561] (1/4) Epoch 835, batch 6, global_batch_idx: 13350, batch size: 106, loss[dur_loss=0.278, prior_loss=0.9937, diff_loss=0.352, tot_loss=1.624, over 106.00 samples.], tot_loss[dur_loss=0.2706, prior_loss=0.9928, diff_loss=0.457, tot_loss=1.72, over 1142.00 samples.], 2024-10-21 00:55:16,381 INFO [train.py:682] (1/4) Start epoch 836 2024-10-21 00:55:25,396 INFO [train.py:561] (1/4) Epoch 836, batch 0, global_batch_idx: 13360, batch size: 108, loss[dur_loss=0.2835, prior_loss=0.9943, diff_loss=0.3624, tot_loss=1.64, over 108.00 samples.], tot_loss[dur_loss=0.2835, prior_loss=0.9943, diff_loss=0.3624, tot_loss=1.64, over 108.00 samples.], 2024-10-21 00:55:39,594 INFO [train.py:561] (1/4) Epoch 836, batch 10, global_batch_idx: 13370, batch size: 111, loss[dur_loss=0.2802, prior_loss=0.9949, diff_loss=0.3224, tot_loss=1.597, over 111.00 samples.], tot_loss[dur_loss=0.2744, prior_loss=0.9931, diff_loss=0.4129, tot_loss=1.68, over 1656.00 samples.], 2024-10-21 00:55:46,629 INFO [train.py:682] (1/4) Start epoch 837 2024-10-21 00:56:00,421 INFO [train.py:561] (1/4) Epoch 837, batch 4, global_batch_idx: 13380, batch size: 189, loss[dur_loss=0.2762, prior_loss=0.994, diff_loss=0.3711, tot_loss=1.641, over 189.00 samples.], tot_loss[dur_loss=0.27, prior_loss=0.9925, diff_loss=0.475, tot_loss=1.737, over 937.00 samples.], 2024-10-21 00:56:15,335 INFO [train.py:561] (1/4) Epoch 837, batch 14, global_batch_idx: 13390, batch size: 142, loss[dur_loss=0.2783, prior_loss=0.9931, diff_loss=0.3642, tot_loss=1.636, over 142.00 samples.], tot_loss[dur_loss=0.2748, prior_loss=0.9931, diff_loss=0.4122, tot_loss=1.68, over 2210.00 samples.], 2024-10-21 00:56:16,758 INFO [train.py:682] (1/4) Start epoch 838 2024-10-21 00:56:36,794 INFO [train.py:561] (1/4) Epoch 838, batch 8, global_batch_idx: 13400, batch size: 170, loss[dur_loss=0.2819, prior_loss=0.9939, diff_loss=0.3817, tot_loss=1.657, over 170.00 samples.], tot_loss[dur_loss=0.2738, prior_loss=0.9929, diff_loss=0.4285, tot_loss=1.695, over 1432.00 samples.], 2024-10-21 00:56:46,918 INFO [train.py:682] (1/4) Start epoch 839 
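
Two arithmetic properties of these records are worth noting. First, within every bracketed group tot_loss is the plain sum of the three components, dur_loss + prior_loss + diff_loss. Second, at batch 0 of each epoch the running tot_loss[...] group coincides exactly with that batch's own loss[...] group over the same sample count (see e.g. Epoch 836, batch 0 above), which is consistent with the running figures being a sample-count-weighted average, run_loss = sum_i(loss_i * n_i) / sum_i(n_i), over the batches of the epoch so far. A quick sanity check with values copied verbatim from records above; the 5e-4 tolerance allows for the four-significant-digit rounding in the log:

# Epoch 836, batch 0 (global_batch_idx 13360), per-batch group:
dur, prior, diff, tot = 0.2835, 0.9943, 0.3624, 1.64
assert abs((dur + prior + diff) - tot) < 5e-4   # 1.6402 rounds to 1.64

# Epoch 751 validation group (global_batch_idx 12000):
vdur, vprior, vdiff, vtot = 0.432, 1.03, 0.4049, 1.867
assert abs((vdur + vprior + vdiff) - vtot) < 5e-4  # 1.8669 rounds to 1.867

The same identity appears to hold for every record in this section, so the three component losses, rather than tot_loss itself, are the independent quantities to track; note also that the validation dur_loss and prior_loss (about 0.43 and 1.03) sit visibly above the running training values (about 0.28 and 0.995), while the validation diff_loss is comparable to the training one.
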
2024-10-21 00:56:58,158 INFO [train.py:561] (1/4) Epoch 839, batch 2, global_batch_idx: 13410, batch size: 203, loss[dur_loss=0.277, prior_loss=0.9931, diff_loss=0.3982, tot_loss=1.668, over 203.00 samples.], tot_loss[dur_loss=0.2786, prior_loss=0.9934, diff_loss=0.3652, tot_loss=1.637, over 442.00 samples.], 2024-10-21 00:57:12,396 INFO [train.py:561] (1/4) Epoch 839, batch 12, global_batch_idx: 13420, batch size: 152, loss[dur_loss=0.2774, prior_loss=0.9933, diff_loss=0.3287, tot_loss=1.599, over 152.00 samples.], tot_loss[dur_loss=0.2743, prior_loss=0.993, diff_loss=0.409, tot_loss=1.676, over 1966.00 samples.], 2024-10-21 00:57:16,828 INFO [train.py:682] (1/4) Start epoch 840 2024-10-21 00:57:33,759 INFO [train.py:561] (1/4) Epoch 840, batch 6, global_batch_idx: 13430, batch size: 106, loss[dur_loss=0.2769, prior_loss=0.9938, diff_loss=0.3489, tot_loss=1.62, over 106.00 samples.], tot_loss[dur_loss=0.2716, prior_loss=0.9927, diff_loss=0.4539, tot_loss=1.718, over 1142.00 samples.], 2024-10-21 00:57:46,755 INFO [train.py:682] (1/4) Start epoch 841 2024-10-21 00:57:55,768 INFO [train.py:561] (1/4) Epoch 841, batch 0, global_batch_idx: 13440, batch size: 108, loss[dur_loss=0.2805, prior_loss=0.9939, diff_loss=0.3586, tot_loss=1.633, over 108.00 samples.], tot_loss[dur_loss=0.2805, prior_loss=0.9939, diff_loss=0.3586, tot_loss=1.633, over 108.00 samples.], 2024-10-21 00:58:09,961 INFO [train.py:561] (1/4) Epoch 841, batch 10, global_batch_idx: 13450, batch size: 111, loss[dur_loss=0.283, prior_loss=0.9946, diff_loss=0.357, tot_loss=1.635, over 111.00 samples.], tot_loss[dur_loss=0.2748, prior_loss=0.993, diff_loss=0.4183, tot_loss=1.686, over 1656.00 samples.], 2024-10-21 00:58:16,995 INFO [train.py:682] (1/4) Start epoch 842 2024-10-21 00:58:30,709 INFO [train.py:561] (1/4) Epoch 842, batch 4, global_batch_idx: 13460, batch size: 189, loss[dur_loss=0.277, prior_loss=0.994, diff_loss=0.3589, tot_loss=1.63, over 189.00 samples.], tot_loss[dur_loss=0.2709, prior_loss=0.9925, diff_loss=0.4649, tot_loss=1.728, over 937.00 samples.], 2024-10-21 00:58:45,552 INFO [train.py:561] (1/4) Epoch 842, batch 14, global_batch_idx: 13470, batch size: 142, loss[dur_loss=0.2789, prior_loss=0.9932, diff_loss=0.3446, tot_loss=1.617, over 142.00 samples.], tot_loss[dur_loss=0.2746, prior_loss=0.9931, diff_loss=0.4033, tot_loss=1.671, over 2210.00 samples.], 2024-10-21 00:58:46,979 INFO [train.py:682] (1/4) Start epoch 843 2024-10-21 00:59:07,230 INFO [train.py:561] (1/4) Epoch 843, batch 8, global_batch_idx: 13480, batch size: 170, loss[dur_loss=0.2802, prior_loss=0.9938, diff_loss=0.36, tot_loss=1.634, over 170.00 samples.], tot_loss[dur_loss=0.2737, prior_loss=0.9931, diff_loss=0.4315, tot_loss=1.698, over 1432.00 samples.], 2024-10-21 00:59:17,361 INFO [train.py:682] (1/4) Start epoch 844 2024-10-21 00:59:28,690 INFO [train.py:561] (1/4) Epoch 844, batch 2, global_batch_idx: 13490, batch size: 203, loss[dur_loss=0.2737, prior_loss=0.9936, diff_loss=0.3591, tot_loss=1.626, over 203.00 samples.], tot_loss[dur_loss=0.2778, prior_loss=0.9939, diff_loss=0.3666, tot_loss=1.638, over 442.00 samples.], 2024-10-21 00:59:43,235 INFO [train.py:561] (1/4) Epoch 844, batch 12, global_batch_idx: 13500, batch size: 152, loss[dur_loss=0.2769, prior_loss=0.9934, diff_loss=0.3705, tot_loss=1.641, over 152.00 samples.], tot_loss[dur_loss=0.2754, prior_loss=0.9933, diff_loss=0.4089, tot_loss=1.678, over 1966.00 samples.], 2024-10-21 00:59:44,864 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 01:00:12,989 INFO 
[train.py:589] (1/4) Epoch 844, validation: dur_loss=0.4361, prior_loss=1.03, diff_loss=0.4063, tot_loss=1.872, over 100.00 samples. 2024-10-21 01:00:12,990 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 01:00:15,884 INFO [train.py:682] (1/4) Start epoch 845 2024-10-21 01:00:33,393 INFO [train.py:561] (1/4) Epoch 845, batch 6, global_batch_idx: 13510, batch size: 106, loss[dur_loss=0.2743, prior_loss=0.9938, diff_loss=0.3128, tot_loss=1.581, over 106.00 samples.], tot_loss[dur_loss=0.2697, prior_loss=0.9927, diff_loss=0.4409, tot_loss=1.703, over 1142.00 samples.], 2024-10-21 01:00:46,243 INFO [train.py:682] (1/4) Start epoch 846 2024-10-21 01:00:55,701 INFO [train.py:561] (1/4) Epoch 846, batch 0, global_batch_idx: 13520, batch size: 108, loss[dur_loss=0.2844, prior_loss=0.9942, diff_loss=0.3729, tot_loss=1.652, over 108.00 samples.], tot_loss[dur_loss=0.2844, prior_loss=0.9942, diff_loss=0.3729, tot_loss=1.652, over 108.00 samples.], 2024-10-21 01:01:09,858 INFO [train.py:561] (1/4) Epoch 846, batch 10, global_batch_idx: 13530, batch size: 111, loss[dur_loss=0.2827, prior_loss=0.9944, diff_loss=0.3794, tot_loss=1.657, over 111.00 samples.], tot_loss[dur_loss=0.2731, prior_loss=0.9929, diff_loss=0.4226, tot_loss=1.689, over 1656.00 samples.], 2024-10-21 01:01:16,923 INFO [train.py:682] (1/4) Start epoch 847 2024-10-21 01:01:31,130 INFO [train.py:561] (1/4) Epoch 847, batch 4, global_batch_idx: 13540, batch size: 189, loss[dur_loss=0.2741, prior_loss=0.9938, diff_loss=0.3894, tot_loss=1.657, over 189.00 samples.], tot_loss[dur_loss=0.2692, prior_loss=0.9922, diff_loss=0.4727, tot_loss=1.734, over 937.00 samples.], 2024-10-21 01:01:45,920 INFO [train.py:561] (1/4) Epoch 847, batch 14, global_batch_idx: 13550, batch size: 142, loss[dur_loss=0.276, prior_loss=0.9929, diff_loss=0.3514, tot_loss=1.62, over 142.00 samples.], tot_loss[dur_loss=0.2739, prior_loss=0.9929, diff_loss=0.402, tot_loss=1.669, over 2210.00 samples.], 2024-10-21 01:01:47,344 INFO [train.py:682] (1/4) Start epoch 848 2024-10-21 01:02:07,617 INFO [train.py:561] (1/4) Epoch 848, batch 8, global_batch_idx: 13560, batch size: 170, loss[dur_loss=0.2805, prior_loss=0.9938, diff_loss=0.3811, tot_loss=1.655, over 170.00 samples.], tot_loss[dur_loss=0.2733, prior_loss=0.9929, diff_loss=0.4255, tot_loss=1.692, over 1432.00 samples.], 2024-10-21 01:02:17,634 INFO [train.py:682] (1/4) Start epoch 849 2024-10-21 01:02:29,517 INFO [train.py:561] (1/4) Epoch 849, batch 2, global_batch_idx: 13570, batch size: 203, loss[dur_loss=0.2764, prior_loss=0.9931, diff_loss=0.391, tot_loss=1.66, over 203.00 samples.], tot_loss[dur_loss=0.2784, prior_loss=0.9933, diff_loss=0.3557, tot_loss=1.627, over 442.00 samples.], 2024-10-21 01:02:43,810 INFO [train.py:561] (1/4) Epoch 849, batch 12, global_batch_idx: 13580, batch size: 152, loss[dur_loss=0.2761, prior_loss=0.9928, diff_loss=0.3567, tot_loss=1.626, over 152.00 samples.], tot_loss[dur_loss=0.274, prior_loss=0.9928, diff_loss=0.4084, tot_loss=1.675, over 1966.00 samples.], 2024-10-21 01:02:48,267 INFO [train.py:682] (1/4) Start epoch 850 2024-10-21 01:03:05,747 INFO [train.py:561] (1/4) Epoch 850, batch 6, global_batch_idx: 13590, batch size: 106, loss[dur_loss=0.2772, prior_loss=0.9934, diff_loss=0.3692, tot_loss=1.64, over 106.00 samples.], tot_loss[dur_loss=0.2687, prior_loss=0.9923, diff_loss=0.4581, tot_loss=1.719, over 1142.00 samples.], 2024-10-21 01:03:19,031 INFO [train.py:682] (1/4) Start epoch 851 2024-10-21 01:03:27,912 INFO [train.py:561] (1/4) Epoch 
851, batch 0, global_batch_idx: 13600, batch size: 108, loss[dur_loss=0.2827, prior_loss=0.9937, diff_loss=0.352, tot_loss=1.628, over 108.00 samples.], tot_loss[dur_loss=0.2827, prior_loss=0.9937, diff_loss=0.352, tot_loss=1.628, over 108.00 samples.], 2024-10-21 01:03:42,210 INFO [train.py:561] (1/4) Epoch 851, batch 10, global_batch_idx: 13610, batch size: 111, loss[dur_loss=0.279, prior_loss=0.9939, diff_loss=0.3305, tot_loss=1.603, over 111.00 samples.], tot_loss[dur_loss=0.2733, prior_loss=0.9926, diff_loss=0.416, tot_loss=1.682, over 1656.00 samples.], 2024-10-21 01:03:49,356 INFO [train.py:682] (1/4) Start epoch 852 2024-10-21 01:04:03,405 INFO [train.py:561] (1/4) Epoch 852, batch 4, global_batch_idx: 13620, batch size: 189, loss[dur_loss=0.2756, prior_loss=0.9935, diff_loss=0.3426, tot_loss=1.612, over 189.00 samples.], tot_loss[dur_loss=0.2688, prior_loss=0.9922, diff_loss=0.4666, tot_loss=1.728, over 937.00 samples.], 2024-10-21 01:04:18,471 INFO [train.py:561] (1/4) Epoch 852, batch 14, global_batch_idx: 13630, batch size: 142, loss[dur_loss=0.2758, prior_loss=0.9929, diff_loss=0.3413, tot_loss=1.61, over 142.00 samples.], tot_loss[dur_loss=0.2734, prior_loss=0.9928, diff_loss=0.4071, tot_loss=1.673, over 2210.00 samples.], 2024-10-21 01:04:19,901 INFO [train.py:682] (1/4) Start epoch 853 2024-10-21 01:04:40,030 INFO [train.py:561] (1/4) Epoch 853, batch 8, global_batch_idx: 13640, batch size: 170, loss[dur_loss=0.2784, prior_loss=0.9931, diff_loss=0.3915, tot_loss=1.663, over 170.00 samples.], tot_loss[dur_loss=0.2717, prior_loss=0.9925, diff_loss=0.425, tot_loss=1.689, over 1432.00 samples.], 2024-10-21 01:04:50,153 INFO [train.py:682] (1/4) Start epoch 854 2024-10-21 01:05:01,803 INFO [train.py:561] (1/4) Epoch 854, batch 2, global_batch_idx: 13650, batch size: 203, loss[dur_loss=0.2741, prior_loss=0.9927, diff_loss=0.3747, tot_loss=1.641, over 203.00 samples.], tot_loss[dur_loss=0.2757, prior_loss=0.9931, diff_loss=0.3496, tot_loss=1.618, over 442.00 samples.], 2024-10-21 01:05:16,116 INFO [train.py:561] (1/4) Epoch 854, batch 12, global_batch_idx: 13660, batch size: 152, loss[dur_loss=0.2745, prior_loss=0.9931, diff_loss=0.3824, tot_loss=1.65, over 152.00 samples.], tot_loss[dur_loss=0.2725, prior_loss=0.9927, diff_loss=0.4146, tot_loss=1.68, over 1966.00 samples.], 2024-10-21 01:05:20,610 INFO [train.py:682] (1/4) Start epoch 855 2024-10-21 01:05:37,961 INFO [train.py:561] (1/4) Epoch 855, batch 6, global_batch_idx: 13670, batch size: 106, loss[dur_loss=0.2769, prior_loss=0.993, diff_loss=0.3449, tot_loss=1.615, over 106.00 samples.], tot_loss[dur_loss=0.2688, prior_loss=0.9922, diff_loss=0.4407, tot_loss=1.702, over 1142.00 samples.], 2024-10-21 01:05:51,146 INFO [train.py:682] (1/4) Start epoch 856 2024-10-21 01:06:00,516 INFO [train.py:561] (1/4) Epoch 856, batch 0, global_batch_idx: 13680, batch size: 108, loss[dur_loss=0.2825, prior_loss=0.9938, diff_loss=0.3799, tot_loss=1.656, over 108.00 samples.], tot_loss[dur_loss=0.2825, prior_loss=0.9938, diff_loss=0.3799, tot_loss=1.656, over 108.00 samples.], 2024-10-21 01:06:14,949 INFO [train.py:561] (1/4) Epoch 856, batch 10, global_batch_idx: 13690, batch size: 111, loss[dur_loss=0.2821, prior_loss=0.994, diff_loss=0.3306, tot_loss=1.607, over 111.00 samples.], tot_loss[dur_loss=0.2723, prior_loss=0.9925, diff_loss=0.407, tot_loss=1.672, over 1656.00 samples.], 2024-10-21 01:06:22,101 INFO [train.py:682] (1/4) Start epoch 857 2024-10-21 01:06:36,187 INFO [train.py:561] (1/4) Epoch 857, batch 4, global_batch_idx: 13700, 
batch size: 189, loss[dur_loss=0.2715, prior_loss=0.993, diff_loss=0.3933, tot_loss=1.658, over 189.00 samples.], tot_loss[dur_loss=0.2686, prior_loss=0.9918, diff_loss=0.4756, tot_loss=1.736, over 937.00 samples.], 2024-10-21 01:06:51,481 INFO [train.py:561] (1/4) Epoch 857, batch 14, global_batch_idx: 13710, batch size: 142, loss[dur_loss=0.2778, prior_loss=0.9927, diff_loss=0.3402, tot_loss=1.611, over 142.00 samples.], tot_loss[dur_loss=0.2724, prior_loss=0.9925, diff_loss=0.4059, tot_loss=1.671, over 2210.00 samples.], 2024-10-21 01:06:52,926 INFO [train.py:682] (1/4) Start epoch 858 2024-10-21 01:07:13,337 INFO [train.py:561] (1/4) Epoch 858, batch 8, global_batch_idx: 13720, batch size: 170, loss[dur_loss=0.276, prior_loss=0.9927, diff_loss=0.3694, tot_loss=1.638, over 170.00 samples.], tot_loss[dur_loss=0.2698, prior_loss=0.9921, diff_loss=0.4236, tot_loss=1.686, over 1432.00 samples.], 2024-10-21 01:07:23,728 INFO [train.py:682] (1/4) Start epoch 859 2024-10-21 01:07:35,554 INFO [train.py:561] (1/4) Epoch 859, batch 2, global_batch_idx: 13730, batch size: 203, loss[dur_loss=0.2725, prior_loss=0.9926, diff_loss=0.3939, tot_loss=1.659, over 203.00 samples.], tot_loss[dur_loss=0.2749, prior_loss=0.9928, diff_loss=0.3586, tot_loss=1.626, over 442.00 samples.], 2024-10-21 01:07:50,019 INFO [train.py:561] (1/4) Epoch 859, batch 12, global_batch_idx: 13740, batch size: 152, loss[dur_loss=0.2713, prior_loss=0.9926, diff_loss=0.3673, tot_loss=1.631, over 152.00 samples.], tot_loss[dur_loss=0.2711, prior_loss=0.9923, diff_loss=0.4046, tot_loss=1.668, over 1966.00 samples.], 2024-10-21 01:07:54,518 INFO [train.py:682] (1/4) Start epoch 860 2024-10-21 01:08:11,810 INFO [train.py:561] (1/4) Epoch 860, batch 6, global_batch_idx: 13750, batch size: 106, loss[dur_loss=0.2751, prior_loss=0.9928, diff_loss=0.3093, tot_loss=1.577, over 106.00 samples.], tot_loss[dur_loss=0.2689, prior_loss=0.9919, diff_loss=0.4455, tot_loss=1.706, over 1142.00 samples.], 2024-10-21 01:08:24,925 INFO [train.py:682] (1/4) Start epoch 861 2024-10-21 01:08:34,164 INFO [train.py:561] (1/4) Epoch 861, batch 0, global_batch_idx: 13760, batch size: 108, loss[dur_loss=0.2815, prior_loss=0.9934, diff_loss=0.3475, tot_loss=1.622, over 108.00 samples.], tot_loss[dur_loss=0.2815, prior_loss=0.9934, diff_loss=0.3475, tot_loss=1.622, over 108.00 samples.], 2024-10-21 01:08:48,469 INFO [train.py:561] (1/4) Epoch 861, batch 10, global_batch_idx: 13770, batch size: 111, loss[dur_loss=0.2786, prior_loss=0.9937, diff_loss=0.3799, tot_loss=1.652, over 111.00 samples.], tot_loss[dur_loss=0.2713, prior_loss=0.9923, diff_loss=0.4111, tot_loss=1.675, over 1656.00 samples.], 2024-10-21 01:08:55,558 INFO [train.py:682] (1/4) Start epoch 862 2024-10-21 01:09:09,454 INFO [train.py:561] (1/4) Epoch 862, batch 4, global_batch_idx: 13780, batch size: 189, loss[dur_loss=0.2715, prior_loss=0.9936, diff_loss=0.3542, tot_loss=1.619, over 189.00 samples.], tot_loss[dur_loss=0.2678, prior_loss=0.992, diff_loss=0.4659, tot_loss=1.726, over 937.00 samples.], 2024-10-21 01:09:24,466 INFO [train.py:561] (1/4) Epoch 862, batch 14, global_batch_idx: 13790, batch size: 142, loss[dur_loss=0.2743, prior_loss=0.9926, diff_loss=0.3348, tot_loss=1.602, over 142.00 samples.], tot_loss[dur_loss=0.2725, prior_loss=0.9925, diff_loss=0.3954, tot_loss=1.66, over 2210.00 samples.], 2024-10-21 01:09:25,925 INFO [train.py:682] (1/4) Start epoch 863 2024-10-21 01:09:46,520 INFO [train.py:561] (1/4) Epoch 863, batch 8, global_batch_idx: 13800, batch size: 170, 
loss[dur_loss=0.2777, prior_loss=0.9925, diff_loss=0.345, tot_loss=1.615, over 170.00 samples.], tot_loss[dur_loss=0.2701, prior_loss=0.9921, diff_loss=0.4256, tot_loss=1.688, over 1432.00 samples.], 2024-10-21 01:09:56,765 INFO [train.py:682] (1/4) Start epoch 864 2024-10-21 01:10:08,783 INFO [train.py:561] (1/4) Epoch 864, batch 2, global_batch_idx: 13810, batch size: 203, loss[dur_loss=0.2737, prior_loss=0.9926, diff_loss=0.4103, tot_loss=1.677, over 203.00 samples.], tot_loss[dur_loss=0.2738, prior_loss=0.9928, diff_loss=0.3731, tot_loss=1.64, over 442.00 samples.], 2024-10-21 01:10:23,204 INFO [train.py:561] (1/4) Epoch 864, batch 12, global_batch_idx: 13820, batch size: 152, loss[dur_loss=0.2724, prior_loss=0.9922, diff_loss=0.3513, tot_loss=1.616, over 152.00 samples.], tot_loss[dur_loss=0.2704, prior_loss=0.9923, diff_loss=0.4101, tot_loss=1.673, over 1966.00 samples.], 2024-10-21 01:10:27,710 INFO [train.py:682] (1/4) Start epoch 865 2024-10-21 01:10:45,316 INFO [train.py:561] (1/4) Epoch 865, batch 6, global_batch_idx: 13830, batch size: 106, loss[dur_loss=0.2751, prior_loss=0.993, diff_loss=0.3785, tot_loss=1.647, over 106.00 samples.], tot_loss[dur_loss=0.2679, prior_loss=0.9918, diff_loss=0.4431, tot_loss=1.703, over 1142.00 samples.], 2024-10-21 01:10:58,518 INFO [train.py:682] (1/4) Start epoch 866 2024-10-21 01:11:07,317 INFO [train.py:561] (1/4) Epoch 866, batch 0, global_batch_idx: 13840, batch size: 108, loss[dur_loss=0.2807, prior_loss=0.9933, diff_loss=0.3561, tot_loss=1.63, over 108.00 samples.], tot_loss[dur_loss=0.2807, prior_loss=0.9933, diff_loss=0.3561, tot_loss=1.63, over 108.00 samples.], 2024-10-21 01:11:21,698 INFO [train.py:561] (1/4) Epoch 866, batch 10, global_batch_idx: 13850, batch size: 111, loss[dur_loss=0.279, prior_loss=0.9938, diff_loss=0.3568, tot_loss=1.63, over 111.00 samples.], tot_loss[dur_loss=0.2703, prior_loss=0.9923, diff_loss=0.4162, tot_loss=1.679, over 1656.00 samples.], 2024-10-21 01:11:28,858 INFO [train.py:682] (1/4) Start epoch 867 2024-10-21 01:11:42,962 INFO [train.py:561] (1/4) Epoch 867, batch 4, global_batch_idx: 13860, batch size: 189, loss[dur_loss=0.2753, prior_loss=0.9934, diff_loss=0.3895, tot_loss=1.658, over 189.00 samples.], tot_loss[dur_loss=0.2673, prior_loss=0.9919, diff_loss=0.4829, tot_loss=1.742, over 937.00 samples.], 2024-10-21 01:11:57,980 INFO [train.py:561] (1/4) Epoch 867, batch 14, global_batch_idx: 13870, batch size: 142, loss[dur_loss=0.2758, prior_loss=0.9925, diff_loss=0.3396, tot_loss=1.608, over 142.00 samples.], tot_loss[dur_loss=0.2725, prior_loss=0.9924, diff_loss=0.4095, tot_loss=1.674, over 2210.00 samples.], 2024-10-21 01:11:59,448 INFO [train.py:682] (1/4) Start epoch 868 2024-10-21 01:12:19,582 INFO [train.py:561] (1/4) Epoch 868, batch 8, global_batch_idx: 13880, batch size: 170, loss[dur_loss=0.277, prior_loss=0.993, diff_loss=0.392, tot_loss=1.662, over 170.00 samples.], tot_loss[dur_loss=0.2705, prior_loss=0.9921, diff_loss=0.4335, tot_loss=1.696, over 1432.00 samples.], 2024-10-21 01:12:29,796 INFO [train.py:682] (1/4) Start epoch 869 2024-10-21 01:12:41,629 INFO [train.py:561] (1/4) Epoch 869, batch 2, global_batch_idx: 13890, batch size: 203, loss[dur_loss=0.276, prior_loss=0.9927, diff_loss=0.3695, tot_loss=1.638, over 203.00 samples.], tot_loss[dur_loss=0.2766, prior_loss=0.9929, diff_loss=0.3694, tot_loss=1.639, over 442.00 samples.], 2024-10-21 01:12:55,903 INFO [train.py:561] (1/4) Epoch 869, batch 12, global_batch_idx: 13900, batch size: 152, loss[dur_loss=0.2746, 
prior_loss=0.9925, diff_loss=0.347, tot_loss=1.614, over 152.00 samples.], tot_loss[dur_loss=0.2732, prior_loss=0.9924, diff_loss=0.4081, tot_loss=1.674, over 1966.00 samples.], 2024-10-21 01:13:00,398 INFO [train.py:682] (1/4) Start epoch 870 2024-10-21 01:13:18,469 INFO [train.py:561] (1/4) Epoch 870, batch 6, global_batch_idx: 13910, batch size: 106, loss[dur_loss=0.2757, prior_loss=0.9931, diff_loss=0.3848, tot_loss=1.654, over 106.00 samples.], tot_loss[dur_loss=0.2683, prior_loss=0.9919, diff_loss=0.4544, tot_loss=1.715, over 1142.00 samples.], 2024-10-21 01:13:31,699 INFO [train.py:682] (1/4) Start epoch 871 2024-10-21 01:13:40,727 INFO [train.py:561] (1/4) Epoch 871, batch 0, global_batch_idx: 13920, batch size: 108, loss[dur_loss=0.2768, prior_loss=0.9936, diff_loss=0.3276, tot_loss=1.598, over 108.00 samples.], tot_loss[dur_loss=0.2768, prior_loss=0.9936, diff_loss=0.3276, tot_loss=1.598, over 108.00 samples.], 2024-10-21 01:13:55,265 INFO [train.py:561] (1/4) Epoch 871, batch 10, global_batch_idx: 13930, batch size: 111, loss[dur_loss=0.2801, prior_loss=0.994, diff_loss=0.3643, tot_loss=1.639, over 111.00 samples.], tot_loss[dur_loss=0.2709, prior_loss=0.9923, diff_loss=0.4197, tot_loss=1.683, over 1656.00 samples.], 2024-10-21 01:14:02,382 INFO [train.py:682] (1/4) Start epoch 872 2024-10-21 01:14:16,828 INFO [train.py:561] (1/4) Epoch 872, batch 4, global_batch_idx: 13940, batch size: 189, loss[dur_loss=0.2718, prior_loss=0.9928, diff_loss=0.3811, tot_loss=1.646, over 189.00 samples.], tot_loss[dur_loss=0.265, prior_loss=0.9916, diff_loss=0.4787, tot_loss=1.735, over 937.00 samples.], 2024-10-21 01:14:31,660 INFO [train.py:561] (1/4) Epoch 872, batch 14, global_batch_idx: 13950, batch size: 142, loss[dur_loss=0.2736, prior_loss=0.9929, diff_loss=0.352, tot_loss=1.618, over 142.00 samples.], tot_loss[dur_loss=0.2714, prior_loss=0.9923, diff_loss=0.4091, tot_loss=1.673, over 2210.00 samples.], 2024-10-21 01:14:33,072 INFO [train.py:682] (1/4) Start epoch 873 2024-10-21 01:14:53,367 INFO [train.py:561] (1/4) Epoch 873, batch 8, global_batch_idx: 13960, batch size: 170, loss[dur_loss=0.2757, prior_loss=0.9926, diff_loss=0.3598, tot_loss=1.628, over 170.00 samples.], tot_loss[dur_loss=0.2688, prior_loss=0.992, diff_loss=0.4328, tot_loss=1.694, over 1432.00 samples.], 2024-10-21 01:15:03,376 INFO [train.py:682] (1/4) Start epoch 874 2024-10-21 01:15:14,728 INFO [train.py:561] (1/4) Epoch 874, batch 2, global_batch_idx: 13970, batch size: 203, loss[dur_loss=0.2713, prior_loss=0.9925, diff_loss=0.3863, tot_loss=1.65, over 203.00 samples.], tot_loss[dur_loss=0.2745, prior_loss=0.9927, diff_loss=0.3755, tot_loss=1.643, over 442.00 samples.], 2024-10-21 01:15:28,851 INFO [train.py:561] (1/4) Epoch 874, batch 12, global_batch_idx: 13980, batch size: 152, loss[dur_loss=0.2749, prior_loss=0.9929, diff_loss=0.3415, tot_loss=1.609, over 152.00 samples.], tot_loss[dur_loss=0.2719, prior_loss=0.9923, diff_loss=0.4115, tot_loss=1.676, over 1966.00 samples.], 2024-10-21 01:15:33,359 INFO [train.py:682] (1/4) Start epoch 875 2024-10-21 01:15:50,637 INFO [train.py:561] (1/4) Epoch 875, batch 6, global_batch_idx: 13990, batch size: 106, loss[dur_loss=0.2765, prior_loss=0.9936, diff_loss=0.3231, tot_loss=1.593, over 106.00 samples.], tot_loss[dur_loss=0.2691, prior_loss=0.9922, diff_loss=0.4408, tot_loss=1.702, over 1142.00 samples.], 2024-10-21 01:16:03,764 INFO [train.py:682] (1/4) Start epoch 876 2024-10-21 01:16:12,697 INFO [train.py:561] (1/4) Epoch 876, batch 0, global_batch_idx: 14000, batch 
size: 108, loss[dur_loss=0.2883, prior_loss=0.9949, diff_loss=0.3238, tot_loss=1.607, over 108.00 samples.], tot_loss[dur_loss=0.2883, prior_loss=0.9949, diff_loss=0.3238, tot_loss=1.607, over 108.00 samples.], 2024-10-21 01:16:27,208 INFO [train.py:561] (1/4) Epoch 876, batch 10, global_batch_idx: 14010, batch size: 111, loss[dur_loss=0.2798, prior_loss=0.9939, diff_loss=0.3632, tot_loss=1.637, over 111.00 samples.], tot_loss[dur_loss=0.2722, prior_loss=0.9926, diff_loss=0.4189, tot_loss=1.684, over 1656.00 samples.], 2024-10-21 01:16:34,340 INFO [train.py:682] (1/4) Start epoch 877 2024-10-21 01:16:48,571 INFO [train.py:561] (1/4) Epoch 877, batch 4, global_batch_idx: 14020, batch size: 189, loss[dur_loss=0.2739, prior_loss=0.9935, diff_loss=0.418, tot_loss=1.685, over 189.00 samples.], tot_loss[dur_loss=0.2684, prior_loss=0.9919, diff_loss=0.4822, tot_loss=1.742, over 937.00 samples.], 2024-10-21 01:17:03,665 INFO [train.py:561] (1/4) Epoch 877, batch 14, global_batch_idx: 14030, batch size: 142, loss[dur_loss=0.2768, prior_loss=0.9924, diff_loss=0.341, tot_loss=1.61, over 142.00 samples.], tot_loss[dur_loss=0.2733, prior_loss=0.9925, diff_loss=0.413, tot_loss=1.679, over 2210.00 samples.], 2024-10-21 01:17:05,090 INFO [train.py:682] (1/4) Start epoch 878 2024-10-21 01:17:25,606 INFO [train.py:561] (1/4) Epoch 878, batch 8, global_batch_idx: 14040, batch size: 170, loss[dur_loss=0.271, prior_loss=0.9927, diff_loss=0.3564, tot_loss=1.62, over 170.00 samples.], tot_loss[dur_loss=0.2699, prior_loss=0.9923, diff_loss=0.4278, tot_loss=1.69, over 1432.00 samples.], 2024-10-21 01:17:35,943 INFO [train.py:682] (1/4) Start epoch 879 2024-10-21 01:17:47,648 INFO [train.py:561] (1/4) Epoch 879, batch 2, global_batch_idx: 14050, batch size: 203, loss[dur_loss=0.2693, prior_loss=0.9924, diff_loss=0.4155, tot_loss=1.677, over 203.00 samples.], tot_loss[dur_loss=0.2725, prior_loss=0.9925, diff_loss=0.3748, tot_loss=1.64, over 442.00 samples.], 2024-10-21 01:18:02,059 INFO [train.py:561] (1/4) Epoch 879, batch 12, global_batch_idx: 14060, batch size: 152, loss[dur_loss=0.2688, prior_loss=0.9921, diff_loss=0.3278, tot_loss=1.589, over 152.00 samples.], tot_loss[dur_loss=0.269, prior_loss=0.9921, diff_loss=0.4085, tot_loss=1.67, over 1966.00 samples.], 2024-10-21 01:18:06,611 INFO [train.py:682] (1/4) Start epoch 880 2024-10-21 01:18:24,019 INFO [train.py:561] (1/4) Epoch 880, batch 6, global_batch_idx: 14070, batch size: 106, loss[dur_loss=0.2768, prior_loss=0.9928, diff_loss=0.3357, tot_loss=1.605, over 106.00 samples.], tot_loss[dur_loss=0.269, prior_loss=0.9917, diff_loss=0.436, tot_loss=1.697, over 1142.00 samples.], 2024-10-21 01:18:37,254 INFO [train.py:682] (1/4) Start epoch 881 2024-10-21 01:18:46,503 INFO [train.py:561] (1/4) Epoch 881, batch 0, global_batch_idx: 14080, batch size: 108, loss[dur_loss=0.2764, prior_loss=0.9929, diff_loss=0.3338, tot_loss=1.603, over 108.00 samples.], tot_loss[dur_loss=0.2764, prior_loss=0.9929, diff_loss=0.3338, tot_loss=1.603, over 108.00 samples.], 2024-10-21 01:19:00,955 INFO [train.py:561] (1/4) Epoch 881, batch 10, global_batch_idx: 14090, batch size: 111, loss[dur_loss=0.2784, prior_loss=0.9938, diff_loss=0.3515, tot_loss=1.624, over 111.00 samples.], tot_loss[dur_loss=0.2686, prior_loss=0.992, diff_loss=0.4096, tot_loss=1.67, over 1656.00 samples.], 2024-10-21 01:19:08,123 INFO [train.py:682] (1/4) Start epoch 882 2024-10-21 01:19:22,313 INFO [train.py:561] (1/4) Epoch 882, batch 4, global_batch_idx: 14100, batch size: 189, loss[dur_loss=0.2708, 
prior_loss=0.9928, diff_loss=0.3876, tot_loss=1.651, over 189.00 samples.], tot_loss[dur_loss=0.2649, prior_loss=0.9914, diff_loss=0.4796, tot_loss=1.736, over 937.00 samples.], 2024-10-21 01:19:37,364 INFO [train.py:561] (1/4) Epoch 882, batch 14, global_batch_idx: 14110, batch size: 142, loss[dur_loss=0.2738, prior_loss=0.9921, diff_loss=0.3502, tot_loss=1.616, over 142.00 samples.], tot_loss[dur_loss=0.2701, prior_loss=0.9921, diff_loss=0.4088, tot_loss=1.671, over 2210.00 samples.], 2024-10-21 01:19:38,821 INFO [train.py:682] (1/4) Start epoch 883 2024-10-21 01:19:59,031 INFO [train.py:561] (1/4) Epoch 883, batch 8, global_batch_idx: 14120, batch size: 170, loss[dur_loss=0.2752, prior_loss=0.9928, diff_loss=0.3456, tot_loss=1.614, over 170.00 samples.], tot_loss[dur_loss=0.2683, prior_loss=0.9918, diff_loss=0.4195, tot_loss=1.68, over 1432.00 samples.], 2024-10-21 01:20:09,311 INFO [train.py:682] (1/4) Start epoch 884 2024-10-21 01:20:21,058 INFO [train.py:561] (1/4) Epoch 884, batch 2, global_batch_idx: 14130, batch size: 203, loss[dur_loss=0.2695, prior_loss=0.9921, diff_loss=0.3647, tot_loss=1.626, over 203.00 samples.], tot_loss[dur_loss=0.2712, prior_loss=0.9923, diff_loss=0.3516, tot_loss=1.615, over 442.00 samples.], 2024-10-21 01:20:35,376 INFO [train.py:561] (1/4) Epoch 884, batch 12, global_batch_idx: 14140, batch size: 152, loss[dur_loss=0.2753, prior_loss=0.9921, diff_loss=0.3898, tot_loss=1.657, over 152.00 samples.], tot_loss[dur_loss=0.2688, prior_loss=0.9919, diff_loss=0.4088, tot_loss=1.669, over 1966.00 samples.], 2024-10-21 01:20:39,876 INFO [train.py:682] (1/4) Start epoch 885 2024-10-21 01:20:57,121 INFO [train.py:561] (1/4) Epoch 885, batch 6, global_batch_idx: 14150, batch size: 106, loss[dur_loss=0.2734, prior_loss=0.9924, diff_loss=0.3498, tot_loss=1.616, over 106.00 samples.], tot_loss[dur_loss=0.266, prior_loss=0.9915, diff_loss=0.4512, tot_loss=1.709, over 1142.00 samples.], 2024-10-21 01:21:10,282 INFO [train.py:682] (1/4) Start epoch 886 2024-10-21 01:21:19,337 INFO [train.py:561] (1/4) Epoch 886, batch 0, global_batch_idx: 14160, batch size: 108, loss[dur_loss=0.2781, prior_loss=0.9928, diff_loss=0.3149, tot_loss=1.586, over 108.00 samples.], tot_loss[dur_loss=0.2781, prior_loss=0.9928, diff_loss=0.3149, tot_loss=1.586, over 108.00 samples.], 2024-10-21 01:21:33,644 INFO [train.py:561] (1/4) Epoch 886, batch 10, global_batch_idx: 14170, batch size: 111, loss[dur_loss=0.2751, prior_loss=0.9933, diff_loss=0.3387, tot_loss=1.607, over 111.00 samples.], tot_loss[dur_loss=0.2688, prior_loss=0.9919, diff_loss=0.4249, tot_loss=1.686, over 1656.00 samples.], 2024-10-21 01:21:40,743 INFO [train.py:682] (1/4) Start epoch 887 2024-10-21 01:21:54,934 INFO [train.py:561] (1/4) Epoch 887, batch 4, global_batch_idx: 14180, batch size: 189, loss[dur_loss=0.2729, prior_loss=0.9927, diff_loss=0.3896, tot_loss=1.655, over 189.00 samples.], tot_loss[dur_loss=0.2644, prior_loss=0.9912, diff_loss=0.4713, tot_loss=1.727, over 937.00 samples.], 2024-10-21 01:22:10,007 INFO [train.py:561] (1/4) Epoch 887, batch 14, global_batch_idx: 14190, batch size: 142, loss[dur_loss=0.2747, prior_loss=0.9922, diff_loss=0.3417, tot_loss=1.609, over 142.00 samples.], tot_loss[dur_loss=0.2708, prior_loss=0.9921, diff_loss=0.4017, tot_loss=1.665, over 2210.00 samples.], 2024-10-21 01:22:11,441 INFO [train.py:682] (1/4) Start epoch 888 2024-10-21 01:22:31,803 INFO [train.py:561] (1/4) Epoch 888, batch 8, global_batch_idx: 14200, batch size: 170, loss[dur_loss=0.2755, prior_loss=0.9924, 
diff_loss=0.3469, tot_loss=1.615, over 170.00 samples.], tot_loss[dur_loss=0.2682, prior_loss=0.9917, diff_loss=0.418, tot_loss=1.678, over 1432.00 samples.], 2024-10-21 01:22:42,003 INFO [train.py:682] (1/4) Start epoch 889 2024-10-21 01:22:53,661 INFO [train.py:561] (1/4) Epoch 889, batch 2, global_batch_idx: 14210, batch size: 203, loss[dur_loss=0.2668, prior_loss=0.9919, diff_loss=0.379, tot_loss=1.638, over 203.00 samples.], tot_loss[dur_loss=0.2717, prior_loss=0.9922, diff_loss=0.3543, tot_loss=1.618, over 442.00 samples.], 2024-10-21 01:23:07,942 INFO [train.py:561] (1/4) Epoch 889, batch 12, global_batch_idx: 14220, batch size: 152, loss[dur_loss=0.27, prior_loss=0.9921, diff_loss=0.3703, tot_loss=1.632, over 152.00 samples.], tot_loss[dur_loss=0.2684, prior_loss=0.9919, diff_loss=0.4068, tot_loss=1.667, over 1966.00 samples.], 2024-10-21 01:23:12,451 INFO [train.py:682] (1/4) Start epoch 890 2024-10-21 01:23:29,625 INFO [train.py:561] (1/4) Epoch 890, batch 6, global_batch_idx: 14230, batch size: 106, loss[dur_loss=0.2738, prior_loss=0.9922, diff_loss=0.3412, tot_loss=1.607, over 106.00 samples.], tot_loss[dur_loss=0.2651, prior_loss=0.9914, diff_loss=0.4459, tot_loss=1.702, over 1142.00 samples.], 2024-10-21 01:23:42,733 INFO [train.py:682] (1/4) Start epoch 891 2024-10-21 01:23:51,914 INFO [train.py:561] (1/4) Epoch 891, batch 0, global_batch_idx: 14240, batch size: 108, loss[dur_loss=0.2827, prior_loss=0.993, diff_loss=0.3624, tot_loss=1.638, over 108.00 samples.], tot_loss[dur_loss=0.2827, prior_loss=0.993, diff_loss=0.3624, tot_loss=1.638, over 108.00 samples.], 2024-10-21 01:24:06,235 INFO [train.py:561] (1/4) Epoch 891, batch 10, global_batch_idx: 14250, batch size: 111, loss[dur_loss=0.2811, prior_loss=0.9936, diff_loss=0.3371, tot_loss=1.612, over 111.00 samples.], tot_loss[dur_loss=0.2705, prior_loss=0.992, diff_loss=0.4105, tot_loss=1.673, over 1656.00 samples.], 2024-10-21 01:24:13,304 INFO [train.py:682] (1/4) Start epoch 892 2024-10-21 01:24:27,149 INFO [train.py:561] (1/4) Epoch 892, batch 4, global_batch_idx: 14260, batch size: 189, loss[dur_loss=0.276, prior_loss=0.9926, diff_loss=0.3754, tot_loss=1.644, over 189.00 samples.], tot_loss[dur_loss=0.2659, prior_loss=0.9912, diff_loss=0.4762, tot_loss=1.733, over 937.00 samples.], 2024-10-21 01:24:41,933 INFO [train.py:561] (1/4) Epoch 892, batch 14, global_batch_idx: 14270, batch size: 142, loss[dur_loss=0.2741, prior_loss=0.992, diff_loss=0.3496, tot_loss=1.616, over 142.00 samples.], tot_loss[dur_loss=0.2716, prior_loss=0.9919, diff_loss=0.4049, tot_loss=1.668, over 2210.00 samples.], 2024-10-21 01:24:43,354 INFO [train.py:682] (1/4) Start epoch 893 2024-10-21 01:25:03,387 INFO [train.py:561] (1/4) Epoch 893, batch 8, global_batch_idx: 14280, batch size: 170, loss[dur_loss=0.2744, prior_loss=0.9921, diff_loss=0.3652, tot_loss=1.632, over 170.00 samples.], tot_loss[dur_loss=0.2691, prior_loss=0.9916, diff_loss=0.4265, tot_loss=1.687, over 1432.00 samples.], 2024-10-21 01:25:13,498 INFO [train.py:682] (1/4) Start epoch 894 2024-10-21 01:25:24,839 INFO [train.py:561] (1/4) Epoch 894, batch 2, global_batch_idx: 14290, batch size: 203, loss[dur_loss=0.271, prior_loss=0.9918, diff_loss=0.3893, tot_loss=1.652, over 203.00 samples.], tot_loss[dur_loss=0.2723, prior_loss=0.9922, diff_loss=0.3714, tot_loss=1.636, over 442.00 samples.], 2024-10-21 01:25:39,049 INFO [train.py:561] (1/4) Epoch 894, batch 12, global_batch_idx: 14300, batch size: 152, loss[dur_loss=0.274, prior_loss=0.9921, diff_loss=0.3772, tot_loss=1.643, over 
152.00 samples.], tot_loss[dur_loss=0.2696, prior_loss=0.9917, diff_loss=0.4041, tot_loss=1.665, over 1966.00 samples.], 2024-10-21 01:25:43,479 INFO [train.py:682] (1/4) Start epoch 895 2024-10-21 01:26:00,681 INFO [train.py:561] (1/4) Epoch 895, batch 6, global_batch_idx: 14310, batch size: 106, loss[dur_loss=0.2675, prior_loss=0.9919, diff_loss=0.363, tot_loss=1.622, over 106.00 samples.], tot_loss[dur_loss=0.2636, prior_loss=0.9912, diff_loss=0.4444, tot_loss=1.699, over 1142.00 samples.], 2024-10-21 01:26:13,691 INFO [train.py:682] (1/4) Start epoch 896 2024-10-21 01:26:22,396 INFO [train.py:561] (1/4) Epoch 896, batch 0, global_batch_idx: 14320, batch size: 108, loss[dur_loss=0.281, prior_loss=0.9931, diff_loss=0.3415, tot_loss=1.616, over 108.00 samples.], tot_loss[dur_loss=0.281, prior_loss=0.9931, diff_loss=0.3415, tot_loss=1.616, over 108.00 samples.], 2024-10-21 01:26:36,615 INFO [train.py:561] (1/4) Epoch 896, batch 10, global_batch_idx: 14330, batch size: 111, loss[dur_loss=0.2782, prior_loss=0.9934, diff_loss=0.3743, tot_loss=1.646, over 111.00 samples.], tot_loss[dur_loss=0.2686, prior_loss=0.9918, diff_loss=0.4161, tot_loss=1.677, over 1656.00 samples.], 2024-10-21 01:26:43,741 INFO [train.py:682] (1/4) Start epoch 897 2024-10-21 01:26:57,505 INFO [train.py:561] (1/4) Epoch 897, batch 4, global_batch_idx: 14340, batch size: 189, loss[dur_loss=0.272, prior_loss=0.9925, diff_loss=0.35, tot_loss=1.614, over 189.00 samples.], tot_loss[dur_loss=0.2646, prior_loss=0.9912, diff_loss=0.4688, tot_loss=1.725, over 937.00 samples.], 2024-10-21 01:27:12,347 INFO [train.py:561] (1/4) Epoch 897, batch 14, global_batch_idx: 14350, batch size: 142, loss[dur_loss=0.2769, prior_loss=0.9921, diff_loss=0.3611, tot_loss=1.63, over 142.00 samples.], tot_loss[dur_loss=0.2702, prior_loss=0.9919, diff_loss=0.4003, tot_loss=1.662, over 2210.00 samples.], 2024-10-21 01:27:13,771 INFO [train.py:682] (1/4) Start epoch 898 2024-10-21 01:27:33,534 INFO [train.py:561] (1/4) Epoch 898, batch 8, global_batch_idx: 14360, batch size: 170, loss[dur_loss=0.2724, prior_loss=0.9924, diff_loss=0.3626, tot_loss=1.627, over 170.00 samples.], tot_loss[dur_loss=0.2674, prior_loss=0.9915, diff_loss=0.4244, tot_loss=1.683, over 1432.00 samples.], 2024-10-21 01:27:43,552 INFO [train.py:682] (1/4) Start epoch 899 2024-10-21 01:27:54,853 INFO [train.py:561] (1/4) Epoch 899, batch 2, global_batch_idx: 14370, batch size: 203, loss[dur_loss=0.2749, prior_loss=0.992, diff_loss=0.3576, tot_loss=1.624, over 203.00 samples.], tot_loss[dur_loss=0.274, prior_loss=0.9921, diff_loss=0.3662, tot_loss=1.632, over 442.00 samples.], 2024-10-21 01:28:08,900 INFO [train.py:561] (1/4) Epoch 899, batch 12, global_batch_idx: 14380, batch size: 152, loss[dur_loss=0.27, prior_loss=0.9922, diff_loss=0.3552, tot_loss=1.617, over 152.00 samples.], tot_loss[dur_loss=0.2693, prior_loss=0.9917, diff_loss=0.3993, tot_loss=1.66, over 1966.00 samples.], 2024-10-21 01:28:13,273 INFO [train.py:682] (1/4) Start epoch 900 2024-10-21 01:28:30,436 INFO [train.py:561] (1/4) Epoch 900, batch 6, global_batch_idx: 14390, batch size: 106, loss[dur_loss=0.2721, prior_loss=0.9924, diff_loss=0.3565, tot_loss=1.621, over 106.00 samples.], tot_loss[dur_loss=0.2667, prior_loss=0.9915, diff_loss=0.4508, tot_loss=1.709, over 1142.00 samples.], 2024-10-21 01:28:43,408 INFO [train.py:682] (1/4) Start epoch 901 2024-10-21 01:28:52,083 INFO [train.py:561] (1/4) Epoch 901, batch 0, global_batch_idx: 14400, batch size: 108, loss[dur_loss=0.2765, prior_loss=0.9929, 
diff_loss=0.3241, tot_loss=1.594, over 108.00 samples.], tot_loss[dur_loss=0.2765, prior_loss=0.9929, diff_loss=0.3241, tot_loss=1.594, over 108.00 samples.], 2024-10-21 01:29:06,460 INFO [train.py:561] (1/4) Epoch 901, batch 10, global_batch_idx: 14410, batch size: 111, loss[dur_loss=0.2781, prior_loss=0.993, diff_loss=0.3293, tot_loss=1.6, over 111.00 samples.], tot_loss[dur_loss=0.2677, prior_loss=0.9916, diff_loss=0.4181, tot_loss=1.677, over 1656.00 samples.], 2024-10-21 01:29:13,677 INFO [train.py:682] (1/4) Start epoch 902 2024-10-21 01:29:27,639 INFO [train.py:561] (1/4) Epoch 902, batch 4, global_batch_idx: 14420, batch size: 189, loss[dur_loss=0.2698, prior_loss=0.9923, diff_loss=0.3762, tot_loss=1.638, over 189.00 samples.], tot_loss[dur_loss=0.2625, prior_loss=0.991, diff_loss=0.4727, tot_loss=1.726, over 937.00 samples.], 2024-10-21 01:29:42,525 INFO [train.py:561] (1/4) Epoch 902, batch 14, global_batch_idx: 14430, batch size: 142, loss[dur_loss=0.2741, prior_loss=0.9916, diff_loss=0.3682, tot_loss=1.634, over 142.00 samples.], tot_loss[dur_loss=0.2684, prior_loss=0.9917, diff_loss=0.3978, tot_loss=1.658, over 2210.00 samples.], 2024-10-21 01:29:43,954 INFO [train.py:682] (1/4) Start epoch 903 2024-10-21 01:30:03,837 INFO [train.py:561] (1/4) Epoch 903, batch 8, global_batch_idx: 14440, batch size: 170, loss[dur_loss=0.2717, prior_loss=0.9923, diff_loss=0.3606, tot_loss=1.625, over 170.00 samples.], tot_loss[dur_loss=0.2655, prior_loss=0.9914, diff_loss=0.422, tot_loss=1.679, over 1432.00 samples.], 2024-10-21 01:30:14,038 INFO [train.py:682] (1/4) Start epoch 904 2024-10-21 01:30:25,585 INFO [train.py:561] (1/4) Epoch 904, batch 2, global_batch_idx: 14450, batch size: 203, loss[dur_loss=0.273, prior_loss=0.9917, diff_loss=0.392, tot_loss=1.657, over 203.00 samples.], tot_loss[dur_loss=0.2724, prior_loss=0.992, diff_loss=0.359, tot_loss=1.623, over 442.00 samples.], 2024-10-21 01:30:39,874 INFO [train.py:561] (1/4) Epoch 904, batch 12, global_batch_idx: 14460, batch size: 152, loss[dur_loss=0.2698, prior_loss=0.992, diff_loss=0.3207, tot_loss=1.583, over 152.00 samples.], tot_loss[dur_loss=0.2688, prior_loss=0.9917, diff_loss=0.4108, tot_loss=1.671, over 1966.00 samples.], 2024-10-21 01:30:44,347 INFO [train.py:682] (1/4) Start epoch 905 2024-10-21 01:31:01,658 INFO [train.py:561] (1/4) Epoch 905, batch 6, global_batch_idx: 14470, batch size: 106, loss[dur_loss=0.2703, prior_loss=0.9923, diff_loss=0.3496, tot_loss=1.612, over 106.00 samples.], tot_loss[dur_loss=0.2657, prior_loss=0.9913, diff_loss=0.4406, tot_loss=1.698, over 1142.00 samples.], 2024-10-21 01:31:14,860 INFO [train.py:682] (1/4) Start epoch 906 2024-10-21 01:31:23,597 INFO [train.py:561] (1/4) Epoch 906, batch 0, global_batch_idx: 14480, batch size: 108, loss[dur_loss=0.2785, prior_loss=0.993, diff_loss=0.3319, tot_loss=1.603, over 108.00 samples.], tot_loss[dur_loss=0.2785, prior_loss=0.993, diff_loss=0.3319, tot_loss=1.603, over 108.00 samples.], 2024-10-21 01:31:37,857 INFO [train.py:561] (1/4) Epoch 906, batch 10, global_batch_idx: 14490, batch size: 111, loss[dur_loss=0.2769, prior_loss=0.9932, diff_loss=0.3282, tot_loss=1.598, over 111.00 samples.], tot_loss[dur_loss=0.2684, prior_loss=0.9917, diff_loss=0.4201, tot_loss=1.68, over 1656.00 samples.], 2024-10-21 01:31:45,010 INFO [train.py:682] (1/4) Start epoch 907 2024-10-21 01:31:58,838 INFO [train.py:561] (1/4) Epoch 907, batch 4, global_batch_idx: 14500, batch size: 189, loss[dur_loss=0.2732, prior_loss=0.9926, diff_loss=0.3782, tot_loss=1.644, over 
189.00 samples.], tot_loss[dur_loss=0.2644, prior_loss=0.9912, diff_loss=0.4562, tot_loss=1.712, over 937.00 samples.], 2024-10-21 01:32:13,792 INFO [train.py:561] (1/4) Epoch 907, batch 14, global_batch_idx: 14510, batch size: 142, loss[dur_loss=0.2747, prior_loss=0.992, diff_loss=0.3707, tot_loss=1.637, over 142.00 samples.], tot_loss[dur_loss=0.2698, prior_loss=0.9918, diff_loss=0.3945, tot_loss=1.656, over 2210.00 samples.], 2024-10-21 01:32:15,255 INFO [train.py:682] (1/4) Start epoch 908 2024-10-21 01:32:35,115 INFO [train.py:561] (1/4) Epoch 908, batch 8, global_batch_idx: 14520, batch size: 170, loss[dur_loss=0.2779, prior_loss=0.9929, diff_loss=0.3671, tot_loss=1.638, over 170.00 samples.], tot_loss[dur_loss=0.268, prior_loss=0.9917, diff_loss=0.4231, tot_loss=1.683, over 1432.00 samples.], 2024-10-21 01:32:45,399 INFO [train.py:682] (1/4) Start epoch 909 2024-10-21 01:32:56,938 INFO [train.py:561] (1/4) Epoch 909, batch 2, global_batch_idx: 14530, batch size: 203, loss[dur_loss=0.2682, prior_loss=0.992, diff_loss=0.3712, tot_loss=1.631, over 203.00 samples.], tot_loss[dur_loss=0.2722, prior_loss=0.9923, diff_loss=0.3479, tot_loss=1.612, over 442.00 samples.], 2024-10-21 01:33:10,998 INFO [train.py:561] (1/4) Epoch 909, batch 12, global_batch_idx: 14540, batch size: 152, loss[dur_loss=0.2731, prior_loss=0.9927, diff_loss=0.3519, tot_loss=1.618, over 152.00 samples.], tot_loss[dur_loss=0.2685, prior_loss=0.9919, diff_loss=0.4019, tot_loss=1.662, over 1966.00 samples.], 2024-10-21 01:33:15,453 INFO [train.py:682] (1/4) Start epoch 910 2024-10-21 01:33:32,410 INFO [train.py:561] (1/4) Epoch 910, batch 6, global_batch_idx: 14550, batch size: 106, loss[dur_loss=0.2731, prior_loss=0.9921, diff_loss=0.3146, tot_loss=1.58, over 106.00 samples.], tot_loss[dur_loss=0.2669, prior_loss=0.9913, diff_loss=0.4439, tot_loss=1.702, over 1142.00 samples.], 2024-10-21 01:33:45,378 INFO [train.py:682] (1/4) Start epoch 911 2024-10-21 01:33:54,275 INFO [train.py:561] (1/4) Epoch 911, batch 0, global_batch_idx: 14560, batch size: 108, loss[dur_loss=0.2812, prior_loss=0.9932, diff_loss=0.322, tot_loss=1.596, over 108.00 samples.], tot_loss[dur_loss=0.2812, prior_loss=0.9932, diff_loss=0.322, tot_loss=1.596, over 108.00 samples.], 2024-10-21 01:34:08,443 INFO [train.py:561] (1/4) Epoch 911, batch 10, global_batch_idx: 14570, batch size: 111, loss[dur_loss=0.2783, prior_loss=0.9934, diff_loss=0.3164, tot_loss=1.588, over 111.00 samples.], tot_loss[dur_loss=0.2681, prior_loss=0.9917, diff_loss=0.4055, tot_loss=1.665, over 1656.00 samples.], 2024-10-21 01:34:15,520 INFO [train.py:682] (1/4) Start epoch 912 2024-10-21 01:34:29,255 INFO [train.py:561] (1/4) Epoch 912, batch 4, global_batch_idx: 14580, batch size: 189, loss[dur_loss=0.2712, prior_loss=0.992, diff_loss=0.3898, tot_loss=1.653, over 189.00 samples.], tot_loss[dur_loss=0.2647, prior_loss=0.991, diff_loss=0.4693, tot_loss=1.725, over 937.00 samples.], 2024-10-21 01:34:44,185 INFO [train.py:561] (1/4) Epoch 912, batch 14, global_batch_idx: 14590, batch size: 142, loss[dur_loss=0.2724, prior_loss=0.9916, diff_loss=0.3743, tot_loss=1.638, over 142.00 samples.], tot_loss[dur_loss=0.2696, prior_loss=0.9916, diff_loss=0.4041, tot_loss=1.665, over 2210.00 samples.], 2024-10-21 01:34:45,613 INFO [train.py:682] (1/4) Start epoch 913 2024-10-21 01:35:06,023 INFO [train.py:561] (1/4) Epoch 913, batch 8, global_batch_idx: 14600, batch size: 170, loss[dur_loss=0.2723, prior_loss=0.9921, diff_loss=0.3735, tot_loss=1.638, over 170.00 samples.], 
tot_loss[dur_loss=0.2663, prior_loss=0.9913, diff_loss=0.4336, tot_loss=1.691, over 1432.00 samples.], 2024-10-21 01:35:16,288 INFO [train.py:682] (1/4) Start epoch 914 2024-10-21 01:35:27,980 INFO [train.py:561] (1/4) Epoch 914, batch 2, global_batch_idx: 14610, batch size: 203, loss[dur_loss=0.27, prior_loss=0.9917, diff_loss=0.3576, tot_loss=1.619, over 203.00 samples.], tot_loss[dur_loss=0.2717, prior_loss=0.9918, diff_loss=0.3361, tot_loss=1.6, over 442.00 samples.], 2024-10-21 01:35:42,420 INFO [train.py:561] (1/4) Epoch 914, batch 12, global_batch_idx: 14620, batch size: 152, loss[dur_loss=0.2702, prior_loss=0.9914, diff_loss=0.3477, tot_loss=1.609, over 152.00 samples.], tot_loss[dur_loss=0.2675, prior_loss=0.9913, diff_loss=0.3938, tot_loss=1.653, over 1966.00 samples.], 2024-10-21 01:35:46,917 INFO [train.py:682] (1/4) Start epoch 915 2024-10-21 01:36:03,998 INFO [train.py:561] (1/4) Epoch 915, batch 6, global_batch_idx: 14630, batch size: 106, loss[dur_loss=0.2701, prior_loss=0.9917, diff_loss=0.3341, tot_loss=1.596, over 106.00 samples.], tot_loss[dur_loss=0.2638, prior_loss=0.9908, diff_loss=0.45, tot_loss=1.705, over 1142.00 samples.], 2024-10-21 01:36:17,051 INFO [train.py:682] (1/4) Start epoch 916 2024-10-21 01:36:25,828 INFO [train.py:561] (1/4) Epoch 916, batch 0, global_batch_idx: 14640, batch size: 108, loss[dur_loss=0.2758, prior_loss=0.9923, diff_loss=0.3414, tot_loss=1.609, over 108.00 samples.], tot_loss[dur_loss=0.2758, prior_loss=0.9923, diff_loss=0.3414, tot_loss=1.609, over 108.00 samples.], 2024-10-21 01:36:40,111 INFO [train.py:561] (1/4) Epoch 916, batch 10, global_batch_idx: 14650, batch size: 111, loss[dur_loss=0.275, prior_loss=0.9927, diff_loss=0.3535, tot_loss=1.621, over 111.00 samples.], tot_loss[dur_loss=0.2664, prior_loss=0.9912, diff_loss=0.4241, tot_loss=1.682, over 1656.00 samples.], 2024-10-21 01:36:47,225 INFO [train.py:682] (1/4) Start epoch 917 2024-10-21 01:37:01,087 INFO [train.py:561] (1/4) Epoch 917, batch 4, global_batch_idx: 14660, batch size: 189, loss[dur_loss=0.2699, prior_loss=0.9924, diff_loss=0.398, tot_loss=1.66, over 189.00 samples.], tot_loss[dur_loss=0.2649, prior_loss=0.9908, diff_loss=0.4708, tot_loss=1.727, over 937.00 samples.], 2024-10-21 01:37:15,968 INFO [train.py:561] (1/4) Epoch 917, batch 14, global_batch_idx: 14670, batch size: 142, loss[dur_loss=0.2722, prior_loss=0.9915, diff_loss=0.3507, tot_loss=1.614, over 142.00 samples.], tot_loss[dur_loss=0.2687, prior_loss=0.9914, diff_loss=0.4046, tot_loss=1.665, over 2210.00 samples.], 2024-10-21 01:37:17,396 INFO [train.py:682] (1/4) Start epoch 918 2024-10-21 01:37:37,387 INFO [train.py:561] (1/4) Epoch 918, batch 8, global_batch_idx: 14680, batch size: 170, loss[dur_loss=0.273, prior_loss=0.9923, diff_loss=0.3575, tot_loss=1.623, over 170.00 samples.], tot_loss[dur_loss=0.2661, prior_loss=0.9913, diff_loss=0.4196, tot_loss=1.677, over 1432.00 samples.], 2024-10-21 01:37:47,487 INFO [train.py:682] (1/4) Start epoch 919 2024-10-21 01:37:58,938 INFO [train.py:561] (1/4) Epoch 919, batch 2, global_batch_idx: 14690, batch size: 203, loss[dur_loss=0.2677, prior_loss=0.9917, diff_loss=0.3656, tot_loss=1.625, over 203.00 samples.], tot_loss[dur_loss=0.27, prior_loss=0.9919, diff_loss=0.3693, tot_loss=1.631, over 442.00 samples.], 2024-10-21 01:38:13,193 INFO [train.py:561] (1/4) Epoch 919, batch 12, global_batch_idx: 14700, batch size: 152, loss[dur_loss=0.2683, prior_loss=0.9915, diff_loss=0.3562, tot_loss=1.616, over 152.00 samples.], tot_loss[dur_loss=0.267, 
prior_loss=0.9914, diff_loss=0.4163, tot_loss=1.675, over 1966.00 samples.], 2024-10-21 01:38:17,686 INFO [train.py:682] (1/4) Start epoch 920 2024-10-21 01:38:34,781 INFO [train.py:561] (1/4) Epoch 920, batch 6, global_batch_idx: 14710, batch size: 106, loss[dur_loss=0.2718, prior_loss=0.9924, diff_loss=0.3309, tot_loss=1.595, over 106.00 samples.], tot_loss[dur_loss=0.2653, prior_loss=0.9912, diff_loss=0.4436, tot_loss=1.7, over 1142.00 samples.], 2024-10-21 01:38:48,142 INFO [train.py:682] (1/4) Start epoch 921 2024-10-21 01:38:57,317 INFO [train.py:561] (1/4) Epoch 921, batch 0, global_batch_idx: 14720, batch size: 108, loss[dur_loss=0.2779, prior_loss=0.9927, diff_loss=0.3725, tot_loss=1.643, over 108.00 samples.], tot_loss[dur_loss=0.2779, prior_loss=0.9927, diff_loss=0.3725, tot_loss=1.643, over 108.00 samples.], 2024-10-21 01:39:11,653 INFO [train.py:561] (1/4) Epoch 921, batch 10, global_batch_idx: 14730, batch size: 111, loss[dur_loss=0.2758, prior_loss=0.993, diff_loss=0.352, tot_loss=1.621, over 111.00 samples.], tot_loss[dur_loss=0.2681, prior_loss=0.9916, diff_loss=0.4276, tot_loss=1.687, over 1656.00 samples.], 2024-10-21 01:39:18,807 INFO [train.py:682] (1/4) Start epoch 922 2024-10-21 01:39:32,516 INFO [train.py:561] (1/4) Epoch 922, batch 4, global_batch_idx: 14740, batch size: 189, loss[dur_loss=0.2697, prior_loss=0.9924, diff_loss=0.3931, tot_loss=1.655, over 189.00 samples.], tot_loss[dur_loss=0.2649, prior_loss=0.9909, diff_loss=0.4605, tot_loss=1.716, over 937.00 samples.], 2024-10-21 01:39:47,475 INFO [train.py:561] (1/4) Epoch 922, batch 14, global_batch_idx: 14750, batch size: 142, loss[dur_loss=0.2718, prior_loss=0.9915, diff_loss=0.3259, tot_loss=1.589, over 142.00 samples.], tot_loss[dur_loss=0.2686, prior_loss=0.9915, diff_loss=0.3939, tot_loss=1.654, over 2210.00 samples.], 2024-10-21 01:39:48,888 INFO [train.py:682] (1/4) Start epoch 923 2024-10-21 01:40:09,112 INFO [train.py:561] (1/4) Epoch 923, batch 8, global_batch_idx: 14760, batch size: 170, loss[dur_loss=0.271, prior_loss=0.992, diff_loss=0.3965, tot_loss=1.659, over 170.00 samples.], tot_loss[dur_loss=0.2642, prior_loss=0.9912, diff_loss=0.4354, tot_loss=1.691, over 1432.00 samples.], 2024-10-21 01:40:19,392 INFO [train.py:682] (1/4) Start epoch 924 2024-10-21 01:40:30,790 INFO [train.py:561] (1/4) Epoch 924, batch 2, global_batch_idx: 14770, batch size: 203, loss[dur_loss=0.2686, prior_loss=0.9916, diff_loss=0.3924, tot_loss=1.653, over 203.00 samples.], tot_loss[dur_loss=0.2725, prior_loss=0.9919, diff_loss=0.3649, tot_loss=1.629, over 442.00 samples.], 2024-10-21 01:40:45,009 INFO [train.py:561] (1/4) Epoch 924, batch 12, global_batch_idx: 14780, batch size: 152, loss[dur_loss=0.2713, prior_loss=0.9914, diff_loss=0.3393, tot_loss=1.602, over 152.00 samples.], tot_loss[dur_loss=0.2684, prior_loss=0.9914, diff_loss=0.4086, tot_loss=1.668, over 1966.00 samples.], 2024-10-21 01:40:49,451 INFO [train.py:682] (1/4) Start epoch 925 2024-10-21 01:41:06,623 INFO [train.py:561] (1/4) Epoch 925, batch 6, global_batch_idx: 14790, batch size: 106, loss[dur_loss=0.2698, prior_loss=0.9919, diff_loss=0.3736, tot_loss=1.635, over 106.00 samples.], tot_loss[dur_loss=0.2655, prior_loss=0.991, diff_loss=0.4321, tot_loss=1.689, over 1142.00 samples.], 2024-10-21 01:41:19,655 INFO [train.py:682] (1/4) Start epoch 926 2024-10-21 01:41:28,451 INFO [train.py:561] (1/4) Epoch 926, batch 0, global_batch_idx: 14800, batch size: 108, loss[dur_loss=0.2784, prior_loss=0.9925, diff_loss=0.3702, tot_loss=1.641, over 108.00 
samples.], tot_loss[dur_loss=0.2784, prior_loss=0.9925, diff_loss=0.3702, tot_loss=1.641, over 108.00 samples.], 2024-10-21 01:41:42,740 INFO [train.py:561] (1/4) Epoch 926, batch 10, global_batch_idx: 14810, batch size: 111, loss[dur_loss=0.2759, prior_loss=0.9929, diff_loss=0.3808, tot_loss=1.65, over 111.00 samples.], tot_loss[dur_loss=0.2662, prior_loss=0.9913, diff_loss=0.4134, tot_loss=1.671, over 1656.00 samples.], 2024-10-21 01:41:49,863 INFO [train.py:682] (1/4) Start epoch 927 2024-10-21 01:42:03,765 INFO [train.py:561] (1/4) Epoch 927, batch 4, global_batch_idx: 14820, batch size: 189, loss[dur_loss=0.2677, prior_loss=0.9916, diff_loss=0.3628, tot_loss=1.622, over 189.00 samples.], tot_loss[dur_loss=0.2626, prior_loss=0.9906, diff_loss=0.4646, tot_loss=1.718, over 937.00 samples.], 2024-10-21 01:42:18,660 INFO [train.py:561] (1/4) Epoch 927, batch 14, global_batch_idx: 14830, batch size: 142, loss[dur_loss=0.2716, prior_loss=0.9914, diff_loss=0.355, tot_loss=1.618, over 142.00 samples.], tot_loss[dur_loss=0.2671, prior_loss=0.9913, diff_loss=0.3925, tot_loss=1.651, over 2210.00 samples.], 2024-10-21 01:42:20,103 INFO [train.py:682] (1/4) Start epoch 928 2024-10-21 01:42:40,107 INFO [train.py:561] (1/4) Epoch 928, batch 8, global_batch_idx: 14840, batch size: 170, loss[dur_loss=0.2685, prior_loss=0.992, diff_loss=0.3579, tot_loss=1.618, over 170.00 samples.], tot_loss[dur_loss=0.2657, prior_loss=0.9913, diff_loss=0.4297, tot_loss=1.687, over 1432.00 samples.], 2024-10-21 01:42:50,236 INFO [train.py:682] (1/4) Start epoch 929 2024-10-21 01:43:01,998 INFO [train.py:561] (1/4) Epoch 929, batch 2, global_batch_idx: 14850, batch size: 203, loss[dur_loss=0.2721, prior_loss=0.9917, diff_loss=0.3895, tot_loss=1.653, over 203.00 samples.], tot_loss[dur_loss=0.272, prior_loss=0.9918, diff_loss=0.3639, tot_loss=1.628, over 442.00 samples.], 2024-10-21 01:43:16,965 INFO [train.py:561] (1/4) Epoch 929, batch 12, global_batch_idx: 14860, batch size: 152, loss[dur_loss=0.2705, prior_loss=0.9914, diff_loss=0.3399, tot_loss=1.602, over 152.00 samples.], tot_loss[dur_loss=0.2676, prior_loss=0.9913, diff_loss=0.4027, tot_loss=1.662, over 1966.00 samples.], 2024-10-21 01:43:21,487 INFO [train.py:682] (1/4) Start epoch 930 2024-10-21 01:43:38,887 INFO [train.py:561] (1/4) Epoch 930, batch 6, global_batch_idx: 14870, batch size: 106, loss[dur_loss=0.2709, prior_loss=0.9918, diff_loss=0.3566, tot_loss=1.619, over 106.00 samples.], tot_loss[dur_loss=0.2646, prior_loss=0.9907, diff_loss=0.4423, tot_loss=1.698, over 1142.00 samples.], 2024-10-21 01:43:52,116 INFO [train.py:682] (1/4) Start epoch 931 2024-10-21 01:44:00,720 INFO [train.py:561] (1/4) Epoch 931, batch 0, global_batch_idx: 14880, batch size: 108, loss[dur_loss=0.2764, prior_loss=0.9924, diff_loss=0.3203, tot_loss=1.589, over 108.00 samples.], tot_loss[dur_loss=0.2764, prior_loss=0.9924, diff_loss=0.3203, tot_loss=1.589, over 108.00 samples.], 2024-10-21 01:44:15,000 INFO [train.py:561] (1/4) Epoch 931, batch 10, global_batch_idx: 14890, batch size: 111, loss[dur_loss=0.275, prior_loss=0.9926, diff_loss=0.3382, tot_loss=1.606, over 111.00 samples.], tot_loss[dur_loss=0.2661, prior_loss=0.991, diff_loss=0.4144, tot_loss=1.672, over 1656.00 samples.], 2024-10-21 01:44:22,197 INFO [train.py:682] (1/4) Start epoch 932 2024-10-21 01:44:36,347 INFO [train.py:561] (1/4) Epoch 932, batch 4, global_batch_idx: 14900, batch size: 189, loss[dur_loss=0.2666, prior_loss=0.9914, diff_loss=0.3902, tot_loss=1.648, over 189.00 samples.], 
tot_loss[dur_loss=0.2624, prior_loss=0.9903, diff_loss=0.4649, tot_loss=1.718, over 937.00 samples.], 2024-10-21 01:44:51,485 INFO [train.py:561] (1/4) Epoch 932, batch 14, global_batch_idx: 14910, batch size: 142, loss[dur_loss=0.268, prior_loss=0.9907, diff_loss=0.3287, tot_loss=1.587, over 142.00 samples.], tot_loss[dur_loss=0.2664, prior_loss=0.9909, diff_loss=0.3953, tot_loss=1.653, over 2210.00 samples.], 2024-10-21 01:44:52,972 INFO [train.py:682] (1/4) Start epoch 933 2024-10-21 01:45:13,069 INFO [train.py:561] (1/4) Epoch 933, batch 8, global_batch_idx: 14920, batch size: 170, loss[dur_loss=0.2718, prior_loss=0.9914, diff_loss=0.3571, tot_loss=1.62, over 170.00 samples.], tot_loss[dur_loss=0.2642, prior_loss=0.9906, diff_loss=0.4232, tot_loss=1.678, over 1432.00 samples.], 2024-10-21 01:45:23,363 INFO [train.py:682] (1/4) Start epoch 934 2024-10-21 01:45:35,279 INFO [train.py:561] (1/4) Epoch 934, batch 2, global_batch_idx: 14930, batch size: 203, loss[dur_loss=0.2719, prior_loss=0.9913, diff_loss=0.3624, tot_loss=1.626, over 203.00 samples.], tot_loss[dur_loss=0.272, prior_loss=0.9915, diff_loss=0.3535, tot_loss=1.617, over 442.00 samples.], 2024-10-21 01:45:49,536 INFO [train.py:561] (1/4) Epoch 934, batch 12, global_batch_idx: 14940, batch size: 152, loss[dur_loss=0.2642, prior_loss=0.9909, diff_loss=0.394, tot_loss=1.649, over 152.00 samples.], tot_loss[dur_loss=0.2662, prior_loss=0.9909, diff_loss=0.4051, tot_loss=1.662, over 1966.00 samples.], 2024-10-21 01:45:53,994 INFO [train.py:682] (1/4) Start epoch 935 2024-10-21 01:46:11,195 INFO [train.py:561] (1/4) Epoch 935, batch 6, global_batch_idx: 14950, batch size: 106, loss[dur_loss=0.2642, prior_loss=0.9911, diff_loss=0.3387, tot_loss=1.594, over 106.00 samples.], tot_loss[dur_loss=0.2621, prior_loss=0.9905, diff_loss=0.4425, tot_loss=1.695, over 1142.00 samples.], 2024-10-21 01:46:24,369 INFO [train.py:682] (1/4) Start epoch 936 2024-10-21 01:46:33,264 INFO [train.py:561] (1/4) Epoch 936, batch 0, global_batch_idx: 14960, batch size: 108, loss[dur_loss=0.2743, prior_loss=0.9919, diff_loss=0.2998, tot_loss=1.566, over 108.00 samples.], tot_loss[dur_loss=0.2743, prior_loss=0.9919, diff_loss=0.2998, tot_loss=1.566, over 108.00 samples.], 2024-10-21 01:46:47,539 INFO [train.py:561] (1/4) Epoch 936, batch 10, global_batch_idx: 14970, batch size: 111, loss[dur_loss=0.2753, prior_loss=0.9923, diff_loss=0.381, tot_loss=1.649, over 111.00 samples.], tot_loss[dur_loss=0.2652, prior_loss=0.9908, diff_loss=0.4092, tot_loss=1.665, over 1656.00 samples.], 2024-10-21 01:46:54,635 INFO [train.py:682] (1/4) Start epoch 937 2024-10-21 01:47:08,754 INFO [train.py:561] (1/4) Epoch 937, batch 4, global_batch_idx: 14980, batch size: 189, loss[dur_loss=0.27, prior_loss=0.9916, diff_loss=0.3695, tot_loss=1.631, over 189.00 samples.], tot_loss[dur_loss=0.2628, prior_loss=0.9903, diff_loss=0.4618, tot_loss=1.715, over 937.00 samples.], 2024-10-21 01:47:23,712 INFO [train.py:561] (1/4) Epoch 937, batch 14, global_batch_idx: 14990, batch size: 142, loss[dur_loss=0.2715, prior_loss=0.991, diff_loss=0.3624, tot_loss=1.625, over 142.00 samples.], tot_loss[dur_loss=0.2669, prior_loss=0.9909, diff_loss=0.3994, tot_loss=1.657, over 2210.00 samples.], 2024-10-21 01:47:25,148 INFO [train.py:682] (1/4) Start epoch 938 2024-10-21 01:47:45,471 INFO [train.py:561] (1/4) Epoch 938, batch 8, global_batch_idx: 15000, batch size: 170, loss[dur_loss=0.2679, prior_loss=0.9912, diff_loss=0.3474, tot_loss=1.607, over 170.00 samples.], tot_loss[dur_loss=0.2636, 
prior_loss=0.9906, diff_loss=0.4199, tot_loss=1.674, over 1432.00 samples.],
2024-10-21 01:47:46,987 INFO [train.py:579] (1/4) Computing validation loss
2024-10-21 01:48:23,072 INFO [train.py:589] (1/4) Epoch 938, validation: dur_loss=0.4412, prior_loss=1.032, diff_loss=0.4441, tot_loss=1.917, over 100.00 samples.
2024-10-21 01:48:23,073 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB
2024-10-21 01:48:31,936 INFO [train.py:682] (1/4) Start epoch 939
2024-10-21 01:48:43,430 INFO [train.py:561] (1/4) Epoch 939, batch 2, global_batch_idx: 15010, batch size: 203, loss[dur_loss=0.2697, prior_loss=0.9911, diff_loss=0.3936, tot_loss=1.654, over 203.00 samples.], tot_loss[dur_loss=0.2696, prior_loss=0.9913, diff_loss=0.3708, tot_loss=1.632, over 442.00 samples.],
2024-10-21 01:48:57,683 INFO [train.py:561] (1/4) Epoch 939, batch 12, global_batch_idx: 15020, batch size: 152, loss[dur_loss=0.2699, prior_loss=0.991, diff_loss=0.354, tot_loss=1.615, over 152.00 samples.], tot_loss[dur_loss=0.2657, prior_loss=0.9908, diff_loss=0.4128, tot_loss=1.669, over 1966.00 samples.],
2024-10-21 01:49:02,164 INFO [train.py:682] (1/4) Start epoch 940
2024-10-21 01:49:19,688 INFO [train.py:561] (1/4) Epoch 940, batch 6, global_batch_idx: 15030, batch size: 106, loss[dur_loss=0.2696, prior_loss=0.9917, diff_loss=0.3101, tot_loss=1.571, over 106.00 samples.], tot_loss[dur_loss=0.2631, prior_loss=0.9905, diff_loss=0.428, tot_loss=1.682, over 1142.00 samples.],
2024-10-21 01:49:32,810 INFO [train.py:682] (1/4) Start epoch 941
2024-10-21 01:49:41,760 INFO [train.py:561] (1/4) Epoch 941, batch 0, global_batch_idx: 15040, batch size: 108, loss[dur_loss=0.272, prior_loss=0.992, diff_loss=0.3128, tot_loss=1.577, over 108.00 samples.], tot_loss[dur_loss=0.272, prior_loss=0.992, diff_loss=0.3128, tot_loss=1.577, over 108.00 samples.],
2024-10-21 01:49:55,956 INFO [train.py:561] (1/4) Epoch 941, batch 10, global_batch_idx: 15050, batch size: 111, loss[dur_loss=0.2764, prior_loss=0.9926, diff_loss=0.3226, tot_loss=1.592, over 111.00 samples.], tot_loss[dur_loss=0.2657, prior_loss=0.9909, diff_loss=0.4159, tot_loss=1.672, over 1656.00 samples.],
2024-10-21 01:50:03,014 INFO [train.py:682] (1/4) Start epoch 942
2024-10-21 01:50:17,047 INFO [train.py:561] (1/4) Epoch 942, batch 4, global_batch_idx: 15060, batch size: 189, loss[dur_loss=0.2697, prior_loss=0.9918, diff_loss=0.3775, tot_loss=1.639, over 189.00 samples.], tot_loss[dur_loss=0.261, prior_loss=0.9904, diff_loss=0.4587, tot_loss=1.71, over 937.00 samples.],
2024-10-21 01:50:31,862 INFO [train.py:561] (1/4) Epoch 942, batch 14, global_batch_idx: 15070, batch size: 142, loss[dur_loss=0.2699, prior_loss=0.9911, diff_loss=0.365, tot_loss=1.626, over 142.00 samples.], tot_loss[dur_loss=0.2656, prior_loss=0.9909, diff_loss=0.3979, tot_loss=1.654, over 2210.00 samples.],
2024-10-21 01:50:33,292 INFO [train.py:682] (1/4) Start epoch 943
2024-10-21 01:50:53,356 INFO [train.py:561] (1/4) Epoch 943, batch 8, global_batch_idx: 15080, batch size: 170, loss[dur_loss=0.2711, prior_loss=0.9915, diff_loss=0.3731, tot_loss=1.636, over 170.00 samples.], tot_loss[dur_loss=0.2644, prior_loss=0.9907, diff_loss=0.4239, tot_loss=1.679, over 1432.00 samples.],
2024-10-21 01:51:03,456 INFO [train.py:682] (1/4) Start epoch 944
2024-10-21 01:51:14,865 INFO [train.py:561] (1/4) Epoch 944, batch 2, global_batch_idx: 15090, batch size: 203, loss[dur_loss=0.2676, prior_loss=0.9908, diff_loss=0.355, tot_loss=1.613, over 203.00 samples.], tot_loss[dur_loss=0.2685, prior_loss=0.9911, diff_loss=0.3501, tot_loss=1.61, over 442.00 samples.],
2024-10-21 01:51:29,134 INFO [train.py:561] (1/4) Epoch 944, batch 12, global_batch_idx: 15100, batch size: 152, loss[dur_loss=0.2675, prior_loss=0.9908, diff_loss=0.3577, tot_loss=1.616, over 152.00 samples.], tot_loss[dur_loss=0.2652, prior_loss=0.9906, diff_loss=0.3965, tot_loss=1.652, over 1966.00 samples.],
2024-10-21 01:51:33,589 INFO [train.py:682] (1/4) Start epoch 945
2024-10-21 01:51:50,933 INFO [train.py:561] (1/4) Epoch 945, batch 6, global_batch_idx: 15110, batch size: 106, loss[dur_loss=0.2698, prior_loss=0.9914, diff_loss=0.3381, tot_loss=1.599, over 106.00 samples.], tot_loss[dur_loss=0.2619, prior_loss=0.9902, diff_loss=0.4419, tot_loss=1.694, over 1142.00 samples.],
2024-10-21 01:52:04,014 INFO [train.py:682] (1/4) Start epoch 946
2024-10-21 01:52:12,819 INFO [train.py:561] (1/4) Epoch 946, batch 0, global_batch_idx: 15120, batch size: 108, loss[dur_loss=0.2761, prior_loss=0.992, diff_loss=0.3335, tot_loss=1.601, over 108.00 samples.], tot_loss[dur_loss=0.2761, prior_loss=0.992, diff_loss=0.3335, tot_loss=1.601, over 108.00 samples.],
2024-10-21 01:52:27,099 INFO [train.py:561] (1/4) Epoch 946, batch 10, global_batch_idx: 15130, batch size: 111, loss[dur_loss=0.271, prior_loss=0.992, diff_loss=0.3096, tot_loss=1.573, over 111.00 samples.], tot_loss[dur_loss=0.2647, prior_loss=0.9908, diff_loss=0.4078, tot_loss=1.663, over 1656.00 samples.],
2024-10-21 01:52:34,229 INFO [train.py:682] (1/4) Start epoch 947
2024-10-21 01:52:47,920 INFO [train.py:561] (1/4) Epoch 947, batch 4, global_batch_idx: 15140, batch size: 189, loss[dur_loss=0.2676, prior_loss=0.9915, diff_loss=0.3612, tot_loss=1.62, over 189.00 samples.], tot_loss[dur_loss=0.2609, prior_loss=0.9902, diff_loss=0.472, tot_loss=1.723, over 937.00 samples.],
2024-10-21 01:53:02,833 INFO [train.py:561] (1/4) Epoch 947, batch 14, global_batch_idx: 15150, batch size: 142, loss[dur_loss=0.2692, prior_loss=0.9905, diff_loss=0.3081, tot_loss=1.568, over 142.00 samples.], tot_loss[dur_loss=0.2651, prior_loss=0.9907, diff_loss=0.3993, tot_loss=1.655, over 2210.00 samples.],
2024-10-21 01:53:04,246 INFO [train.py:682] (1/4) Start epoch 948
2024-10-21 01:53:24,436 INFO [train.py:561] (1/4) Epoch 948, batch 8, global_batch_idx: 15160, batch size: 170, loss[dur_loss=0.2686, prior_loss=0.9915, diff_loss=0.3793, tot_loss=1.64, over 170.00 samples.], tot_loss[dur_loss=0.2637, prior_loss=0.9905, diff_loss=0.4199, tot_loss=1.674, over 1432.00 samples.],
2024-10-21 01:53:34,505 INFO [train.py:682] (1/4) Start epoch 949
2024-10-21 01:53:46,089 INFO [train.py:561] (1/4) Epoch 949, batch 2, global_batch_idx: 15170, batch size: 203, loss[dur_loss=0.2674, prior_loss=0.9906, diff_loss=0.3643, tot_loss=1.622, over 203.00 samples.], tot_loss[dur_loss=0.2679, prior_loss=0.991, diff_loss=0.3574, tot_loss=1.616, over 442.00 samples.],
2024-10-21 01:54:00,309 INFO [train.py:561] (1/4) Epoch 949, batch 12, global_batch_idx: 15180, batch size: 152, loss[dur_loss=0.2674, prior_loss=0.991, diff_loss=0.3543, tot_loss=1.613, over 152.00 samples.], tot_loss[dur_loss=0.2641, prior_loss=0.9906, diff_loss=0.3958, tot_loss=1.65, over 1966.00 samples.],
2024-10-21 01:54:04,743 INFO [train.py:682] (1/4) Start epoch 950
2024-10-21 01:54:21,638 INFO [train.py:561] (1/4) Epoch 950, batch 6, global_batch_idx: 15190, batch size: 106, loss[dur_loss=0.2661, prior_loss=0.9909, diff_loss=0.3597, tot_loss=1.617, over 106.00 samples.], tot_loss[dur_loss=0.2618, prior_loss=0.9903, diff_loss=0.451, tot_loss=1.703, over 1142.00 samples.],
2024-10-21 01:54:34,678 INFO [train.py:682] (1/4) Start epoch 951
2024-10-21 01:54:43,261 INFO [train.py:561] (1/4) Epoch 951, batch 0, global_batch_idx: 15200, batch size: 108, loss[dur_loss=0.2755, prior_loss=0.9923, diff_loss=0.3256, tot_loss=1.593, over 108.00 samples.], tot_loss[dur_loss=0.2755, prior_loss=0.9923, diff_loss=0.3256, tot_loss=1.593, over 108.00 samples.],
2024-10-21 01:54:57,426 INFO [train.py:561] (1/4) Epoch 951, batch 10, global_batch_idx: 15210, batch size: 111, loss[dur_loss=0.2748, prior_loss=0.9921, diff_loss=0.3133, tot_loss=1.58, over 111.00 samples.], tot_loss[dur_loss=0.2641, prior_loss=0.9906, diff_loss=0.4117, tot_loss=1.666, over 1656.00 samples.],
2024-10-21 01:55:04,490 INFO [train.py:682] (1/4) Start epoch 952
2024-10-21 01:55:18,369 INFO [train.py:561] (1/4) Epoch 952, batch 4, global_batch_idx: 15220, batch size: 189, loss[dur_loss=0.2663, prior_loss=0.9914, diff_loss=0.3687, tot_loss=1.626, over 189.00 samples.], tot_loss[dur_loss=0.2617, prior_loss=0.9901, diff_loss=0.4664, tot_loss=1.718, over 937.00 samples.],
2024-10-21 01:55:33,182 INFO [train.py:561] (1/4) Epoch 952, batch 14, global_batch_idx: 15230, batch size: 142, loss[dur_loss=0.2691, prior_loss=0.9911, diff_loss=0.3764, tot_loss=1.637, over 142.00 samples.], tot_loss[dur_loss=0.2657, prior_loss=0.9908, diff_loss=0.3985, tot_loss=1.655, over 2210.00 samples.],
2024-10-21 01:55:34,610 INFO [train.py:682] (1/4) Start epoch 953
2024-10-21 01:55:54,451 INFO [train.py:561] (1/4) Epoch 953, batch 8, global_batch_idx: 15240, batch size: 170, loss[dur_loss=0.2631, prior_loss=0.991, diff_loss=0.3835, tot_loss=1.638, over 170.00 samples.], tot_loss[dur_loss=0.2625, prior_loss=0.9905, diff_loss=0.4187, tot_loss=1.672, over 1432.00 samples.],
2024-10-21 01:56:04,581 INFO [train.py:682] (1/4) Start epoch 954
2024-10-21 01:56:16,256 INFO [train.py:561] (1/4) Epoch 954, batch 2, global_batch_idx: 15250, batch size: 203, loss[dur_loss=0.267, prior_loss=0.991, diff_loss=0.3768, tot_loss=1.635, over 203.00 samples.], tot_loss[dur_loss=0.2679, prior_loss=0.9912, diff_loss=0.3741, tot_loss=1.633, over 442.00 samples.],
2024-10-21 01:56:30,472 INFO [train.py:561] (1/4) Epoch 954, batch 12, global_batch_idx: 15260, batch size: 152, loss[dur_loss=0.2685, prior_loss=0.9909, diff_loss=0.3669, tot_loss=1.626, over 152.00 samples.], tot_loss[dur_loss=0.2648, prior_loss=0.9907, diff_loss=0.4115, tot_loss=1.667, over 1966.00 samples.],
2024-10-21 01:56:34,927 INFO [train.py:682] (1/4) Start epoch 955
2024-10-21 01:56:52,084 INFO [train.py:561] (1/4) Epoch 955, batch 6, global_batch_idx: 15270, batch size: 106, loss[dur_loss=0.2665, prior_loss=0.9909, diff_loss=0.3345, tot_loss=1.592, over 106.00 samples.], tot_loss[dur_loss=0.2617, prior_loss=0.9901, diff_loss=0.4473, tot_loss=1.699, over 1142.00 samples.],
2024-10-21 01:57:05,081 INFO [train.py:682] (1/4) Start epoch 956
2024-10-21 01:57:13,974 INFO [train.py:561] (1/4) Epoch 956, batch 0, global_batch_idx: 15280, batch size: 108, loss[dur_loss=0.2729, prior_loss=0.992, diff_loss=0.3497, tot_loss=1.615, over 108.00 samples.], tot_loss[dur_loss=0.2729, prior_loss=0.992, diff_loss=0.3497, tot_loss=1.615, over 108.00 samples.],
2024-10-21 01:57:28,273 INFO [train.py:561] (1/4) Epoch 956, batch 10, global_batch_idx: 15290, batch size: 111, loss[dur_loss=0.2738, prior_loss=0.9926, diff_loss=0.3559, tot_loss=1.622, over 111.00 samples.], tot_loss[dur_loss=0.2657, prior_loss=0.9908, diff_loss=0.422, tot_loss=1.679, over 1656.00 samples.],
2024-10-21 01:57:35,408 INFO [train.py:682] (1/4) Start epoch 957
2024-10-21 01:57:49,263 INFO [train.py:561] (1/4) Epoch 957, batch 4, global_batch_idx: 15300, batch size: 189, loss[dur_loss=0.2694, prior_loss=0.9912, diff_loss=0.3666, tot_loss=1.627, over 189.00 samples.], tot_loss[dur_loss=0.2611, prior_loss=0.99, diff_loss=0.4659, tot_loss=1.717, over 937.00 samples.],
2024-10-21 01:58:04,189 INFO [train.py:561] (1/4) Epoch 957, batch 14, global_batch_idx: 15310, batch size: 142, loss[dur_loss=0.2675, prior_loss=0.9903, diff_loss=0.3553, tot_loss=1.613, over 142.00 samples.], tot_loss[dur_loss=0.2654, prior_loss=0.9906, diff_loss=0.3969, tot_loss=1.653, over 2210.00 samples.],
2024-10-21 01:58:05,607 INFO [train.py:682] (1/4) Start epoch 958
2024-10-21 01:58:25,569 INFO [train.py:561] (1/4) Epoch 958, batch 8, global_batch_idx: 15320, batch size: 170, loss[dur_loss=0.2687, prior_loss=0.9919, diff_loss=0.3619, tot_loss=1.622, over 170.00 samples.], tot_loss[dur_loss=0.2629, prior_loss=0.9905, diff_loss=0.4192, tot_loss=1.673, over 1432.00 samples.],
2024-10-21 01:58:35,706 INFO [train.py:682] (1/4) Start epoch 959
2024-10-21 01:58:47,056 INFO [train.py:561] (1/4) Epoch 959, batch 2, global_batch_idx: 15330, batch size: 203, loss[dur_loss=0.266, prior_loss=0.9907, diff_loss=0.383, tot_loss=1.64, over 203.00 samples.], tot_loss[dur_loss=0.2671, prior_loss=0.991, diff_loss=0.3695, tot_loss=1.628, over 442.00 samples.],
2024-10-21 01:59:01,330 INFO [train.py:561] (1/4) Epoch 959, batch 12, global_batch_idx: 15340, batch size: 152, loss[dur_loss=0.2688, prior_loss=0.9907, diff_loss=0.3615, tot_loss=1.621, over 152.00 samples.], tot_loss[dur_loss=0.2656, prior_loss=0.9907, diff_loss=0.4007, tot_loss=1.657, over 1966.00 samples.],
2024-10-21 01:59:05,799 INFO [train.py:682] (1/4) Start epoch 960
2024-10-21 01:59:22,818 INFO [train.py:561] (1/4) Epoch 960, batch 6, global_batch_idx: 15350, batch size: 106, loss[dur_loss=0.2676, prior_loss=0.991, diff_loss=0.2902, tot_loss=1.549, over 106.00 samples.], tot_loss[dur_loss=0.262, prior_loss=0.9901, diff_loss=0.4422, tot_loss=1.694, over 1142.00 samples.],
2024-10-21 01:59:35,948 INFO [train.py:682] (1/4) Start epoch 961
2024-10-21 01:59:44,829 INFO [train.py:561] (1/4) Epoch 961, batch 0, global_batch_idx: 15360, batch size: 108, loss[dur_loss=0.271, prior_loss=0.9919, diff_loss=0.3295, tot_loss=1.592, over 108.00 samples.], tot_loss[dur_loss=0.271, prior_loss=0.9919, diff_loss=0.3295, tot_loss=1.592, over 108.00 samples.],
2024-10-21 01:59:59,176 INFO [train.py:561] (1/4) Epoch 961, batch 10, global_batch_idx: 15370, batch size: 111, loss[dur_loss=0.2735, prior_loss=0.9923, diff_loss=0.3776, tot_loss=1.643, over 111.00 samples.], tot_loss[dur_loss=0.2653, prior_loss=0.9907, diff_loss=0.4095, tot_loss=1.665, over 1656.00 samples.],
2024-10-21 02:00:06,266 INFO [train.py:682] (1/4) Start epoch 962
2024-10-21 02:00:20,289 INFO [train.py:561] (1/4) Epoch 962, batch 4, global_batch_idx: 15380, batch size: 189, loss[dur_loss=0.2678, prior_loss=0.9913, diff_loss=0.395, tot_loss=1.654, over 189.00 samples.], tot_loss[dur_loss=0.2608, prior_loss=0.9901, diff_loss=0.4652, tot_loss=1.716, over 937.00 samples.],
2024-10-21 02:00:35,138 INFO [train.py:561] (1/4) Epoch 962, batch 14, global_batch_idx: 15390, batch size: 142, loss[dur_loss=0.2653, prior_loss=0.9906, diff_loss=0.3447, tot_loss=1.601, over 142.00 samples.], tot_loss[dur_loss=0.2656, prior_loss=0.9908, diff_loss=0.3979, tot_loss=1.654, over 2210.00 samples.],
2024-10-21 02:00:36,535 INFO [train.py:682] (1/4) Start epoch 963
2024-10-21 02:00:56,655 INFO [train.py:561] (1/4) Epoch 963, batch 8, global_batch_idx: 15400, batch size: 170, loss[dur_loss=0.2721, prior_loss=0.9911, diff_loss=0.3872, tot_loss=1.65, over 170.00 samples.], tot_loss[dur_loss=0.2636, prior_loss=0.9903, diff_loss=0.4332, tot_loss=1.687, over 1432.00 samples.],
2024-10-21 02:01:06,779 INFO [train.py:682] (1/4) Start epoch 964
2024-10-21 02:01:18,332 INFO [train.py:561] (1/4) Epoch 964, batch 2, global_batch_idx: 15410, batch size: 203, loss[dur_loss=0.2657, prior_loss=0.9907, diff_loss=0.4228, tot_loss=1.679, over 203.00 samples.], tot_loss[dur_loss=0.2668, prior_loss=0.9909, diff_loss=0.3794, tot_loss=1.637, over 442.00 samples.],
2024-10-21 02:01:32,551 INFO [train.py:561] (1/4) Epoch 964, batch 12, global_batch_idx: 15420, batch size: 152, loss[dur_loss=0.2658, prior_loss=0.9909, diff_loss=0.3397, tot_loss=1.596, over 152.00 samples.], tot_loss[dur_loss=0.2644, prior_loss=0.9907, diff_loss=0.4062, tot_loss=1.661, over 1966.00 samples.],
2024-10-21 02:01:37,007 INFO [train.py:682] (1/4) Start epoch 965
2024-10-21 02:01:54,179 INFO [train.py:561] (1/4) Epoch 965, batch 6, global_batch_idx: 15430, batch size: 106, loss[dur_loss=0.2729, prior_loss=0.9911, diff_loss=0.3356, tot_loss=1.6, over 106.00 samples.], tot_loss[dur_loss=0.2618, prior_loss=0.9902, diff_loss=0.44, tot_loss=1.692, over 1142.00 samples.],
2024-10-21 02:02:07,246 INFO [train.py:682] (1/4) Start epoch 966
2024-10-21 02:02:16,045 INFO [train.py:561] (1/4) Epoch 966, batch 0, global_batch_idx: 15440, batch size: 108, loss[dur_loss=0.27, prior_loss=0.9918, diff_loss=0.3439, tot_loss=1.606, over 108.00 samples.], tot_loss[dur_loss=0.27, prior_loss=0.9918, diff_loss=0.3439, tot_loss=1.606, over 108.00 samples.],
2024-10-21 02:02:30,258 INFO [train.py:561] (1/4) Epoch 966, batch 10, global_batch_idx: 15450, batch size: 111, loss[dur_loss=0.2747, prior_loss=0.9921, diff_loss=0.382, tot_loss=1.649, over 111.00 samples.], tot_loss[dur_loss=0.2655, prior_loss=0.9905, diff_loss=0.4221, tot_loss=1.678, over 1656.00 samples.],
2024-10-21 02:02:37,321 INFO [train.py:682] (1/4) Start epoch 967
2024-10-21 02:02:51,151 INFO [train.py:561] (1/4) Epoch 967, batch 4, global_batch_idx: 15460, batch size: 189, loss[dur_loss=0.2671, prior_loss=0.9916, diff_loss=0.3508, tot_loss=1.61, over 189.00 samples.], tot_loss[dur_loss=0.2608, prior_loss=0.9902, diff_loss=0.4542, tot_loss=1.705, over 937.00 samples.],
2024-10-21 02:03:05,971 INFO [train.py:561] (1/4) Epoch 967, batch 14, global_batch_idx: 15470, batch size: 142, loss[dur_loss=0.2679, prior_loss=0.9908, diff_loss=0.3373, tot_loss=1.596, over 142.00 samples.], tot_loss[dur_loss=0.2652, prior_loss=0.9908, diff_loss=0.3968, tot_loss=1.653, over 2210.00 samples.],
2024-10-21 02:03:07,401 INFO [train.py:682] (1/4) Start epoch 968
2024-10-21 02:03:27,336 INFO [train.py:561] (1/4) Epoch 968, batch 8, global_batch_idx: 15480, batch size: 170, loss[dur_loss=0.2686, prior_loss=0.9905, diff_loss=0.3844, tot_loss=1.643, over 170.00 samples.], tot_loss[dur_loss=0.2617, prior_loss=0.9901, diff_loss=0.4289, tot_loss=1.681, over 1432.00 samples.],
2024-10-21 02:03:37,460 INFO [train.py:682] (1/4) Start epoch 969
2024-10-21 02:03:48,820 INFO [train.py:561] (1/4) Epoch 969, batch 2, global_batch_idx: 15490, batch size: 203, loss[dur_loss=0.2655, prior_loss=0.9905, diff_loss=0.3846, tot_loss=1.641, over 203.00 samples.], tot_loss[dur_loss=0.2669, prior_loss=0.9908, diff_loss=0.3641, tot_loss=1.622, over 442.00 samples.],
2024-10-21 02:04:02,977 INFO [train.py:561] (1/4) Epoch 969, batch 12, global_batch_idx: 15500, batch size: 152, loss[dur_loss=0.2678, prior_loss=0.9905, diff_loss=0.3569, tot_loss=1.615, over 152.00 samples.], tot_loss[dur_loss=0.2642, prior_loss=0.9902, diff_loss=0.407, tot_loss=1.661, over 1966.00 samples.],
2024-10-21 02:04:07,325 INFO [train.py:682] (1/4) Start epoch 970
2024-10-21 02:04:24,798 INFO [train.py:561] (1/4) Epoch 970, batch 6, global_batch_idx: 15510, batch size: 106, loss[dur_loss=0.2631, prior_loss=0.9905, diff_loss=0.3232, tot_loss=1.577, over 106.00 samples.], tot_loss[dur_loss=0.2608, prior_loss=0.9899, diff_loss=0.4387, tot_loss=1.689, over 1142.00 samples.],
2024-10-21 02:04:37,905 INFO [train.py:682] (1/4) Start epoch 971
2024-10-21 02:04:46,481 INFO [train.py:561] (1/4) Epoch 971, batch 0, global_batch_idx: 15520, batch size: 108, loss[dur_loss=0.2751, prior_loss=0.9915, diff_loss=0.3312, tot_loss=1.598, over 108.00 samples.], tot_loss[dur_loss=0.2751, prior_loss=0.9915, diff_loss=0.3312, tot_loss=1.598, over 108.00 samples.],
2024-10-21 02:05:00,584 INFO [train.py:561] (1/4) Epoch 971, batch 10, global_batch_idx: 15530, batch size: 111, loss[dur_loss=0.2738, prior_loss=0.9917, diff_loss=0.3398, tot_loss=1.605, over 111.00 samples.], tot_loss[dur_loss=0.2635, prior_loss=0.9904, diff_loss=0.4154, tot_loss=1.669, over 1656.00 samples.],
2024-10-21 02:05:07,630 INFO [train.py:682] (1/4) Start epoch 972
2024-10-21 02:05:21,343 INFO [train.py:561] (1/4) Epoch 972, batch 4, global_batch_idx: 15540, batch size: 189, loss[dur_loss=0.2668, prior_loss=0.9915, diff_loss=0.3349, tot_loss=1.593, over 189.00 samples.], tot_loss[dur_loss=0.2592, prior_loss=0.9897, diff_loss=0.4609, tot_loss=1.71, over 937.00 samples.],
2024-10-21 02:05:36,118 INFO [train.py:561] (1/4) Epoch 972, batch 14, global_batch_idx: 15550, batch size: 142, loss[dur_loss=0.2631, prior_loss=0.9901, diff_loss=0.3287, tot_loss=1.582, over 142.00 samples.], tot_loss[dur_loss=0.2636, prior_loss=0.9903, diff_loss=0.3939, tot_loss=1.648, over 2210.00 samples.],
2024-10-21 02:05:37,538 INFO [train.py:682] (1/4) Start epoch 973
2024-10-21 02:05:57,188 INFO [train.py:561] (1/4) Epoch 973, batch 8, global_batch_idx: 15560, batch size: 170, loss[dur_loss=0.2674, prior_loss=0.9904, diff_loss=0.3611, tot_loss=1.619, over 170.00 samples.], tot_loss[dur_loss=0.2616, prior_loss=0.99, diff_loss=0.4206, tot_loss=1.672, over 1432.00 samples.],
2024-10-21 02:06:07,224 INFO [train.py:682] (1/4) Start epoch 974
2024-10-21 02:06:18,750 INFO [train.py:561] (1/4) Epoch 974, batch 2, global_batch_idx: 15570, batch size: 203, loss[dur_loss=0.2693, prior_loss=0.9903, diff_loss=0.3537, tot_loss=1.613, over 203.00 samples.], tot_loss[dur_loss=0.267, prior_loss=0.9906, diff_loss=0.349, tot_loss=1.607, over 442.00 samples.],
2024-10-21 02:06:32,959 INFO [train.py:561] (1/4) Epoch 974, batch 12, global_batch_idx: 15580, batch size: 152, loss[dur_loss=0.2651, prior_loss=0.9905, diff_loss=0.3746, tot_loss=1.63, over 152.00 samples.], tot_loss[dur_loss=0.2634, prior_loss=0.9902, diff_loss=0.402, tot_loss=1.656, over 1966.00 samples.],
2024-10-21 02:06:37,372 INFO [train.py:682] (1/4) Start epoch 975
2024-10-21 02:06:54,274 INFO [train.py:561] (1/4) Epoch 975, batch 6, global_batch_idx: 15590, batch size: 106, loss[dur_loss=0.2665, prior_loss=0.9907, diff_loss=0.3541, tot_loss=1.611, over 106.00 samples.], tot_loss[dur_loss=0.2601, prior_loss=0.9899, diff_loss=0.446, tot_loss=1.696, over 1142.00 samples.],
2024-10-21 02:07:07,137 INFO [train.py:682] (1/4) Start epoch 976
2024-10-21 02:07:18,875 INFO [train.py:561] (1/4) Epoch 976, batch 0, global_batch_idx: 15600, batch size: 108, loss[dur_loss=0.2745, prior_loss=0.9914, diff_loss=0.3482, tot_loss=1.614, over 108.00 samples.], tot_loss[dur_loss=0.2745, prior_loss=0.9914, diff_loss=0.3482, tot_loss=1.614, over 108.00 samples.],
2024-10-21 02:07:33,211 INFO [train.py:561] (1/4) Epoch 976, batch 10, global_batch_idx: 15610, batch size: 111, loss[dur_loss=0.2681, prior_loss=0.9916, diff_loss=0.3883, tot_loss=1.648, over 111.00 samples.], tot_loss[dur_loss=0.2617, prior_loss=0.99, diff_loss=0.413, tot_loss=1.665, over 1656.00 samples.],
2024-10-21 02:07:40,233 INFO [train.py:682] (1/4) Start epoch 977
2024-10-21 02:07:53,958 INFO [train.py:561] (1/4) Epoch 977, batch 4, global_batch_idx: 15620, batch size: 189, loss[dur_loss=0.2674, prior_loss=0.9911, diff_loss=0.3523, tot_loss=1.611, over 189.00 samples.], tot_loss[dur_loss=0.2584, prior_loss=0.9895, diff_loss=0.4666, tot_loss=1.714, over 937.00 samples.],
2024-10-21 02:08:08,891 INFO [train.py:561] (1/4) Epoch 977, batch 14, global_batch_idx: 15630, batch size: 142, loss[dur_loss=0.2616, prior_loss=0.9903, diff_loss=0.389, tot_loss=1.641, over 142.00 samples.], tot_loss[dur_loss=0.2631, prior_loss=0.9902, diff_loss=0.4006, tot_loss=1.654, over 2210.00 samples.],
2024-10-21 02:08:10,332 INFO [train.py:682] (1/4) Start epoch 978
2024-10-21 02:08:30,368 INFO [train.py:561] (1/4) Epoch 978, batch 8, global_batch_idx: 15640, batch size: 170, loss[dur_loss=0.2612, prior_loss=0.9907, diff_loss=0.3528, tot_loss=1.605, over 170.00 samples.], tot_loss[dur_loss=0.2605, prior_loss=0.9899, diff_loss=0.4154, tot_loss=1.666, over 1432.00 samples.],
2024-10-21 02:08:40,615 INFO [train.py:682] (1/4) Start epoch 979
2024-10-21 02:08:52,585 INFO [train.py:561] (1/4) Epoch 979, batch 2, global_batch_idx: 15650, batch size: 203, loss[dur_loss=0.264, prior_loss=0.9903, diff_loss=0.3722, tot_loss=1.626, over 203.00 samples.], tot_loss[dur_loss=0.2672, prior_loss=0.9907, diff_loss=0.3609, tot_loss=1.619, over 442.00 samples.],
2024-10-21 02:09:06,968 INFO [train.py:561] (1/4) Epoch 979, batch 12, global_batch_idx: 15660, batch size: 152, loss[dur_loss=0.2674, prior_loss=0.9902, diff_loss=0.3469, tot_loss=1.604, over 152.00 samples.], tot_loss[dur_loss=0.2641, prior_loss=0.9902, diff_loss=0.4011, tot_loss=1.655, over 1966.00 samples.],
2024-10-21 02:09:11,438 INFO [train.py:682] (1/4) Start epoch 980
2024-10-21 02:09:28,373 INFO [train.py:561] (1/4) Epoch 980, batch 6, global_batch_idx: 15670, batch size: 106, loss[dur_loss=0.2647, prior_loss=0.9907, diff_loss=0.333, tot_loss=1.588, over 106.00 samples.], tot_loss[dur_loss=0.261, prior_loss=0.9899, diff_loss=0.4437, tot_loss=1.695, over 1142.00 samples.],
2024-10-21 02:09:41,581 INFO [train.py:682] (1/4) Start epoch 981
2024-10-21 02:09:50,170 INFO [train.py:561] (1/4) Epoch 981, batch 0, global_batch_idx: 15680, batch size: 108, loss[dur_loss=0.2691, prior_loss=0.9908, diff_loss=0.3117, tot_loss=1.572, over 108.00 samples.], tot_loss[dur_loss=0.2691, prior_loss=0.9908, diff_loss=0.3117, tot_loss=1.572, over 108.00 samples.],
2024-10-21 02:10:04,734 INFO [train.py:561] (1/4) Epoch 981, batch 10, global_batch_idx: 15690, batch size: 111, loss[dur_loss=0.2669, prior_loss=0.9915, diff_loss=0.3265, tot_loss=1.585, over 111.00 samples.], tot_loss[dur_loss=0.2616, prior_loss=0.99, diff_loss=0.4157, tot_loss=1.667, over 1656.00 samples.],
2024-10-21 02:10:11,874 INFO [train.py:682] (1/4) Start epoch 982
2024-10-21 02:10:25,530 INFO [train.py:561] (1/4) Epoch 982, batch 4, global_batch_idx: 15700, batch size: 189, loss[dur_loss=0.2635, prior_loss=0.9905, diff_loss=0.3683, tot_loss=1.622, over 189.00 samples.], tot_loss[dur_loss=0.2564, prior_loss=0.9893, diff_loss=0.4588, tot_loss=1.705, over 937.00 samples.],
2024-10-21 02:10:40,487 INFO [train.py:561] (1/4) Epoch 982, batch 14, global_batch_idx: 15710, batch size: 142, loss[dur_loss=0.2647, prior_loss=0.99, diff_loss=0.3321, tot_loss=1.587, over 142.00 samples.], tot_loss[dur_loss=0.2624, prior_loss=0.99, diff_loss=0.3969, tot_loss=1.649, over 2210.00 samples.],
2024-10-21 02:10:41,923 INFO [train.py:682] (1/4) Start epoch 983
2024-10-21 02:11:01,791 INFO [train.py:561] (1/4) Epoch 983, batch 8, global_batch_idx: 15720, batch size: 170, loss[dur_loss=0.2661, prior_loss=0.9907, diff_loss=0.3238, tot_loss=1.581, over 170.00 samples.], tot_loss[dur_loss=0.2603, prior_loss=0.9898, diff_loss=0.416, tot_loss=1.666, over 1432.00 samples.],
2024-10-21 02:11:11,984 INFO [train.py:682] (1/4) Start epoch 984
2024-10-21 02:11:23,328 INFO [train.py:561] (1/4) Epoch 984, batch 2, global_batch_idx: 15730, batch size: 203, loss[dur_loss=0.2647, prior_loss=0.99, diff_loss=0.3591, tot_loss=1.614, over 203.00 samples.], tot_loss[dur_loss=0.2657, prior_loss=0.9903, diff_loss=0.3483, tot_loss=1.604, over 442.00 samples.],
2024-10-21 02:11:37,502 INFO [train.py:561] (1/4) Epoch 984, batch 12, global_batch_idx: 15740, batch size: 152, loss[dur_loss=0.2644, prior_loss=0.9903, diff_loss=0.3618, tot_loss=1.616, over 152.00 samples.], tot_loss[dur_loss=0.2626, prior_loss=0.99, diff_loss=0.3955, tot_loss=1.648, over 1966.00 samples.],
2024-10-21 02:11:41,853 INFO [train.py:682] (1/4) Start epoch 985
2024-10-21 02:11:59,859 INFO [train.py:561] (1/4) Epoch 985, batch 6, global_batch_idx: 15750, batch size: 106, loss[dur_loss=0.2664, prior_loss=0.9908, diff_loss=0.3716, tot_loss=1.629, over 106.00 samples.], tot_loss[dur_loss=0.2593, prior_loss=0.9897, diff_loss=0.4379, tot_loss=1.687, over 1142.00 samples.],
2024-10-21 02:12:13,109 INFO [train.py:682] (1/4) Start epoch 986
2024-10-21 02:12:21,871 INFO [train.py:561] (1/4) Epoch 986, batch 0, global_batch_idx: 15760, batch size: 108, loss[dur_loss=0.2703, prior_loss=0.9912, diff_loss=0.3037, tot_loss=1.565, over 108.00 samples.], tot_loss[dur_loss=0.2703, prior_loss=0.9912, diff_loss=0.3037, tot_loss=1.565, over 108.00 samples.],
2024-10-21 02:12:36,202 INFO [train.py:561] (1/4) Epoch 986, batch 10, global_batch_idx: 15770, batch size: 111, loss[dur_loss=0.2713, prior_loss=0.9918, diff_loss=0.3684, tot_loss=1.632, over 111.00 samples.], tot_loss[dur_loss=0.2625, prior_loss=0.99, diff_loss=0.4163, tot_loss=1.669, over 1656.00 samples.],
2024-10-21 02:12:43,539 INFO [train.py:682] (1/4) Start epoch 987
2024-10-21 02:12:57,559 INFO [train.py:561] (1/4) Epoch 987, batch 4, global_batch_idx: 15780, batch size: 189, loss[dur_loss=0.2673, prior_loss=0.9909, diff_loss=0.3542, tot_loss=1.612, over 189.00 samples.], tot_loss[dur_loss=0.2599, prior_loss=0.9894, diff_loss=0.4681, tot_loss=1.717, over 937.00 samples.],
2024-10-21 02:13:12,557 INFO [train.py:561] (1/4) Epoch 987, batch 14, global_batch_idx: 15790, batch size: 142, loss[dur_loss=0.2659, prior_loss=0.9899, diff_loss=0.3469, tot_loss=1.603, over 142.00 samples.], tot_loss[dur_loss=0.2634, prior_loss=0.99, diff_loss=0.3965, tot_loss=1.65, over 2210.00 samples.],
2024-10-21 02:13:14,000 INFO [train.py:682] (1/4) Start epoch 988
2024-10-21 02:13:34,048 INFO [train.py:561] (1/4) Epoch 988, batch 8, global_batch_idx: 15800, batch size: 170, loss[dur_loss=0.2658, prior_loss=0.9905, diff_loss=0.3716, tot_loss=1.628, over 170.00 samples.], tot_loss[dur_loss=0.2615, prior_loss=0.9898, diff_loss=0.4253, tot_loss=1.677, over 1432.00 samples.],
2024-10-21 02:13:44,275 INFO [train.py:682] (1/4) Start epoch 989
2024-10-21 02:13:55,795 INFO [train.py:561] (1/4) Epoch 989, batch 2, global_batch_idx: 15810, batch size: 203, loss[dur_loss=0.2621, prior_loss=0.9898, diff_loss=0.3775, tot_loss=1.629, over 203.00 samples.], tot_loss[dur_loss=0.263, prior_loss=0.9902, diff_loss=0.3784, tot_loss=1.632, over 442.00 samples.],
2024-10-21 02:14:10,086 INFO [train.py:561] (1/4) Epoch 989, batch 12, global_batch_idx: 15820, batch size: 152, loss[dur_loss=0.2637, prior_loss=0.99, diff_loss=0.337, tot_loss=1.591, over 152.00 samples.], tot_loss[dur_loss=0.2616, prior_loss=0.9898, diff_loss=0.4086, tot_loss=1.66, over 1966.00 samples.],
2024-10-21 02:14:14,590 INFO [train.py:682] (1/4) Start epoch 990
2024-10-21 02:14:31,501 INFO [train.py:561] (1/4) Epoch 990, batch 6, global_batch_idx: 15830, batch size: 106, loss[dur_loss=0.266, prior_loss=0.9905, diff_loss=0.3224, tot_loss=1.579, over 106.00 samples.], tot_loss[dur_loss=0.2596, prior_loss=0.9895, diff_loss=0.442, tot_loss=1.691, over 1142.00 samples.],
2024-10-21 02:14:44,594 INFO [train.py:682] (1/4) Start epoch 991
2024-10-21 02:14:53,699 INFO [train.py:561] (1/4) Epoch 991, batch 0, global_batch_idx: 15840, batch size: 108, loss[dur_loss=0.2693, prior_loss=0.9908, diff_loss=0.3574, tot_loss=1.617, over 108.00 samples.], tot_loss[dur_loss=0.2693, prior_loss=0.9908, diff_loss=0.3574, tot_loss=1.617, over 108.00 samples.],
2024-10-21 02:15:07,981 INFO [train.py:561] (1/4) Epoch 991, batch 10, global_batch_idx: 15850, batch size: 111, loss[dur_loss=0.273, prior_loss=0.9918, diff_loss=0.3414, tot_loss=1.606, over 111.00 samples.], tot_loss[dur_loss=0.262, prior_loss=0.99, diff_loss=0.4175, tot_loss=1.67, over 1656.00 samples.],
2024-10-21 02:15:15,107 INFO [train.py:682] (1/4) Start epoch 992
2024-10-21 02:15:28,816 INFO [train.py:561] (1/4) Epoch 992, batch 4, global_batch_idx: 15860, batch size: 189, loss[dur_loss=0.2629, prior_loss=0.9906, diff_loss=0.3677, tot_loss=1.621, over 189.00 samples.], tot_loss[dur_loss=0.2575, prior_loss=0.9892, diff_loss=0.4508, tot_loss=1.697, over 937.00 samples.],
2024-10-21 02:15:43,785 INFO [train.py:561] (1/4) Epoch 992, batch 14, global_batch_idx: 15870, batch size: 142, loss[dur_loss=0.2638, prior_loss=0.9897, diff_loss=0.3597, tot_loss=1.613, over 142.00 samples.], tot_loss[dur_loss=0.2622, prior_loss=0.9898, diff_loss=0.3884, tot_loss=1.64, over 2210.00 samples.],
2024-10-21 02:15:45,230 INFO [train.py:682] (1/4) Start epoch 993
2024-10-21 02:16:05,472 INFO [train.py:561] (1/4) Epoch 993, batch 8, global_batch_idx: 15880, batch size: 170, loss[dur_loss=0.2638, prior_loss=0.9905, diff_loss=0.3964, tot_loss=1.651, over 170.00 samples.], tot_loss[dur_loss=0.2594, prior_loss=0.9896, diff_loss=0.4377, tot_loss=1.687, over 1432.00 samples.],
2024-10-21 02:16:15,679 INFO [train.py:682] (1/4) Start epoch 994
2024-10-21 02:16:27,239 INFO [train.py:561] (1/4) Epoch 994, batch 2, global_batch_idx: 15890, batch size: 203, loss[dur_loss=0.2601, prior_loss=0.9899, diff_loss=0.3903, tot_loss=1.64, over 203.00 samples.], tot_loss[dur_loss=0.2635, prior_loss=0.9902, diff_loss=0.3562, tot_loss=1.61, over 442.00 samples.],
2024-10-21 02:16:41,563 INFO [train.py:561] (1/4) Epoch 994, batch 12, global_batch_idx: 15900, batch size: 152, loss[dur_loss=0.2659, prior_loss=0.99, diff_loss=0.3638, tot_loss=1.62, over 152.00 samples.], tot_loss[dur_loss=0.263, prior_loss=0.9898, diff_loss=0.402, tot_loss=1.655, over 1966.00 samples.],
2024-10-21 02:16:46,092 INFO [train.py:682] (1/4) Start epoch 995
2024-10-21 02:17:03,148 INFO [train.py:561] (1/4) Epoch 995, batch 6, global_batch_idx: 15910, batch size: 106, loss[dur_loss=0.2673, prior_loss=0.9907, diff_loss=0.3547, tot_loss=1.613, over 106.00 samples.], tot_loss[dur_loss=0.2595, prior_loss=0.9895, diff_loss=0.4328, tot_loss=1.682, over 1142.00 samples.],
2024-10-21 02:17:16,296 INFO [train.py:682] (1/4) Start epoch 996
2024-10-21 02:17:24,897 INFO [train.py:561] (1/4) Epoch 996, batch 0, global_batch_idx: 15920, batch size: 108, loss[dur_loss=0.2681, prior_loss=0.9908, diff_loss=0.3584, tot_loss=1.617, over 108.00 samples.], tot_loss[dur_loss=0.2681, prior_loss=0.9908, diff_loss=0.3584, tot_loss=1.617, over 108.00 samples.],
2024-10-21 02:17:39,160 INFO [train.py:561] (1/4) Epoch 996, batch 10, global_batch_idx: 15930, batch size: 111, loss[dur_loss=0.269, prior_loss=0.9914, diff_loss=0.3682, tot_loss=1.629, over 111.00 samples.], tot_loss[dur_loss=0.2607, prior_loss=0.9897, diff_loss=0.4082, tot_loss=1.659, over 1656.00 samples.],
2024-10-21 02:17:46,370 INFO [train.py:682] (1/4) Start epoch 997
2024-10-21 02:18:00,421 INFO [train.py:561] (1/4) Epoch 997, batch 4, global_batch_idx: 15940, batch size: 189, loss[dur_loss=0.2632, prior_loss=0.99, diff_loss=0.3657, tot_loss=1.619, over 189.00 samples.], tot_loss[dur_loss=0.2567, prior_loss=0.9889, diff_loss=0.4601, tot_loss=1.706, over 937.00 samples.],
2024-10-21 02:18:15,186 INFO [train.py:561] (1/4) Epoch 997, batch 14, global_batch_idx: 15950, batch size: 142, loss[dur_loss=0.2624, prior_loss=0.9896, diff_loss=0.3377, tot_loss=1.59, over 142.00 samples.], tot_loss[dur_loss=0.2611, prior_loss=0.9897, diff_loss=0.3994, tot_loss=1.65, over 2210.00 samples.],
2024-10-21 02:18:16,612 INFO [train.py:682] (1/4) Start epoch 998
2024-10-21 02:18:36,693 INFO [train.py:561] (1/4) Epoch 998, batch 8, global_batch_idx: 15960, batch size: 170, loss[dur_loss=0.2606, prior_loss=0.9901, diff_loss=0.3811, tot_loss=1.632, over 170.00 samples.], tot_loss[dur_loss=0.2606, prior_loss=0.9895, diff_loss=0.4307, tot_loss=1.681, over 1432.00 samples.],
2024-10-21 02:18:46,909 INFO [train.py:682] (1/4) Start epoch 999
2024-10-21 02:18:58,179 INFO [train.py:561] (1/4) Epoch 999, batch 2, global_batch_idx: 15970, batch size: 203, loss[dur_loss=0.2639, prior_loss=0.9898, diff_loss=0.3632, tot_loss=1.617, over 203.00 samples.], tot_loss[dur_loss=0.2646, prior_loss=0.99, diff_loss=0.3579, tot_loss=1.613, over 442.00 samples.],
2024-10-21 02:19:12,730 INFO [train.py:561] (1/4) Epoch 999, batch 12, global_batch_idx: 15980, batch size: 152, loss[dur_loss=0.264, prior_loss=0.9897, diff_loss=0.3646, tot_loss=1.618, over 152.00 samples.], tot_loss[dur_loss=0.2608, prior_loss=0.9896, diff_loss=0.4013, tot_loss=1.652, over 1966.00 samples.],
2024-10-21 02:19:17,244 INFO [train.py:682] (1/4) Start epoch 1000
2024-10-21 02:19:34,444 INFO [train.py:561] (1/4) Epoch 1000, batch 6, global_batch_idx: 15990, batch size: 106, loss[dur_loss=0.266, prior_loss=0.9902, diff_loss=0.3233, tot_loss=1.58, over 106.00 samples.], tot_loss[dur_loss=0.2585, prior_loss=0.9893, diff_loss=0.4319, tot_loss=1.68, over 1142.00 samples.],
2024-10-21 02:19:47,631 INFO [train.py:682] (1/4) Start epoch 1001
2024-10-21 02:19:56,498 INFO [train.py:561] (1/4) Epoch 1001, batch 0, global_batch_idx: 16000, batch size: 108, loss[dur_loss=0.2718, prior_loss=0.9906, diff_loss=0.3523, tot_loss=1.615, over 108.00 samples.], tot_loss[dur_loss=0.2718, prior_loss=0.9906, diff_loss=0.3523, tot_loss=1.615, over 108.00 samples.],
2024-10-21 02:20:10,729 INFO [train.py:561] (1/4) Epoch 1001, batch 10, global_batch_idx: 16010, batch size: 111, loss[dur_loss=0.2653, prior_loss=0.9908, diff_loss=0.3395, tot_loss=1.596, over 111.00 samples.], tot_loss[dur_loss=0.2589, prior_loss=0.9893, diff_loss=0.4102, tot_loss=1.658, over 1656.00 samples.],
2024-10-21 02:20:17,861 INFO [train.py:682] (1/4) Start epoch 1002
2024-10-21 02:20:31,676 INFO [train.py:561] (1/4) Epoch 1002, batch 4, global_batch_idx: 16020, batch size: 189, loss[dur_loss=0.2654, prior_loss=0.9903, diff_loss=0.3729, tot_loss=1.629, over 189.00 samples.], tot_loss[dur_loss=0.2561, prior_loss=0.9888, diff_loss=0.4581, tot_loss=1.703, over 937.00 samples.],
2024-10-21 02:20:46,563 INFO [train.py:561] (1/4) Epoch 1002, batch 14, global_batch_idx: 16030, batch size: 142, loss[dur_loss=0.2644, prior_loss=0.9898, diff_loss=0.3149, tot_loss=1.569, over 142.00 samples.], tot_loss[dur_loss=0.2609, prior_loss=0.9895, diff_loss=0.3899, tot_loss=1.64, over 2210.00 samples.],
2024-10-21 02:20:48,022 INFO [train.py:682] (1/4) Start epoch 1003
2024-10-21 02:21:08,487 INFO [train.py:561] (1/4) Epoch 1003, batch 8, global_batch_idx: 16040, batch size: 170, loss[dur_loss=0.2617, prior_loss=0.9899, diff_loss=0.3482, tot_loss=1.6, over 170.00 samples.], tot_loss[dur_loss=0.2606, prior_loss=0.9894, diff_loss=0.4176, tot_loss=1.668, over 1432.00 samples.],
2024-10-21 02:21:18,707 INFO [train.py:682] (1/4) Start epoch 1004
2024-10-21 02:21:30,251 INFO [train.py:561] (1/4) Epoch 1004, batch 2, global_batch_idx: 16050, batch size: 203, loss[dur_loss=0.2655, prior_loss=0.9898, diff_loss=0.391, tot_loss=1.646, over 203.00 samples.], tot_loss[dur_loss=0.2644, prior_loss=0.99, diff_loss=0.3572, tot_loss=1.612, over 442.00 samples.],
2024-10-21 02:21:44,496 INFO [train.py:561] (1/4) Epoch 1004, batch 12, global_batch_idx: 16060, batch size: 152, loss[dur_loss=0.261, prior_loss=0.9896, diff_loss=0.3657, tot_loss=1.616, over 152.00 samples.], tot_loss[dur_loss=0.2608, prior_loss=0.9896, diff_loss=0.4048, tot_loss=1.655, over 1966.00 samples.],
2024-10-21 02:21:48,985 INFO [train.py:682] (1/4) Start epoch 1005
2024-10-21 02:22:06,297 INFO [train.py:561] (1/4) Epoch 1005, batch 6, global_batch_idx: 16070, batch size: 106, loss[dur_loss=0.2658, prior_loss=0.9901, diff_loss=0.3466, tot_loss=1.603, over 106.00 samples.], tot_loss[dur_loss=0.2568, prior_loss=0.989, diff_loss=0.4404, tot_loss=1.686, over 1142.00 samples.],
2024-10-21 02:22:19,655 INFO [train.py:682] (1/4) Start epoch 1006
2024-10-21 02:22:28,644 INFO [train.py:561] (1/4) Epoch 1006, batch 0, global_batch_idx: 16080, batch size: 108, loss[dur_loss=0.2693, prior_loss=0.9906, diff_loss=0.3239, tot_loss=1.584, over 108.00 samples.], tot_loss[dur_loss=0.2693, prior_loss=0.9906, diff_loss=0.3239, tot_loss=1.584, over 108.00 samples.],
2024-10-21 02:22:43,096 INFO [train.py:561] (1/4) Epoch 1006, batch 10, global_batch_idx: 16090, batch size: 111, loss[dur_loss=0.2672, prior_loss=0.9914, diff_loss=0.3231, tot_loss=1.582, over 111.00 samples.], tot_loss[dur_loss=0.2585, prior_loss=0.9895, diff_loss=0.4079, tot_loss=1.656, over 1656.00 samples.],
2024-10-21 02:22:50,303 INFO [train.py:682] (1/4) Start epoch 1007
2024-10-21 02:23:04,323 INFO [train.py:561] (1/4) Epoch 1007, batch 4, global_batch_idx: 16100, batch size: 189, loss[dur_loss=0.2679, prior_loss=0.9901, diff_loss=0.3595, tot_loss=1.618, over 189.00 samples.], tot_loss[dur_loss=0.2567, prior_loss=0.9889, diff_loss=0.4434, tot_loss=1.689, over 937.00 samples.],
2024-10-21 02:23:19,658 INFO [train.py:561] (1/4) Epoch 1007, batch 14, global_batch_idx: 16110, batch size: 142, loss[dur_loss=0.2661, prior_loss=0.9898, diff_loss=0.3399, tot_loss=1.596, over 142.00 samples.], tot_loss[dur_loss=0.2605, prior_loss=0.9895, diff_loss=0.3899, tot_loss=1.64, over 2210.00 samples.],
2024-10-21 02:23:21,106 INFO [train.py:682] (1/4) Start epoch 1008
2024-10-21 02:23:41,308 INFO [train.py:561] (1/4) Epoch 1008, batch 8, global_batch_idx: 16120, batch size: 170, loss[dur_loss=0.2629, prior_loss=0.9899, diff_loss=0.3211, tot_loss=1.574, over 170.00 samples.], tot_loss[dur_loss=0.2592, prior_loss=0.9893, diff_loss=0.4083, tot_loss=1.657, over 1432.00 samples.],
2024-10-21 02:23:51,604 INFO [train.py:682] (1/4) Start epoch 1009
2024-10-21 02:24:02,901 INFO [train.py:561] (1/4) Epoch 1009, batch 2, global_batch_idx: 16130, batch size: 203, loss[dur_loss=0.2603, prior_loss=0.9895, diff_loss=0.3898, tot_loss=1.64, over 203.00 samples.], tot_loss[dur_loss=0.2618, prior_loss=0.9897, diff_loss=0.3474, tot_loss=1.599, over 442.00 samples.],
2024-10-21 02:24:17,185 INFO [train.py:561] (1/4) Epoch 1009, batch 12, global_batch_idx: 16140, batch size: 152, loss[dur_loss=0.264, prior_loss=0.9894, diff_loss=0.3434, tot_loss=1.597, over 152.00 samples.], tot_loss[dur_loss=0.2596, prior_loss=0.9893, diff_loss=0.399, tot_loss=1.648, over 1966.00 samples.],
2024-10-21 02:24:21,698 INFO [train.py:682] (1/4) Start epoch 1010
2024-10-21 02:24:39,409 INFO [train.py:561] (1/4) Epoch 1010, batch 6, global_batch_idx: 16150, batch size: 106, loss[dur_loss=0.2658, prior_loss=0.9898, diff_loss=0.3572, tot_loss=1.613, over 106.00 samples.], tot_loss[dur_loss=0.2569, prior_loss=0.9887, diff_loss=0.4341, tot_loss=1.68, over 1142.00 samples.],
2024-10-21 02:24:52,624 INFO [train.py:682] (1/4) Start epoch 1011
2024-10-21 02:25:01,741 INFO [train.py:561] (1/4) Epoch 1011, batch 0, global_batch_idx: 16160, batch size: 108, loss[dur_loss=0.2666, prior_loss=0.9901, diff_loss=0.3332, tot_loss=1.59, over 108.00 samples.], tot_loss[dur_loss=0.2666, prior_loss=0.9901, diff_loss=0.3332, tot_loss=1.59, over 108.00 samples.],
2024-10-21 02:25:16,081 INFO [train.py:561] (1/4) Epoch 1011, batch 10, global_batch_idx: 16170, batch size: 111, loss[dur_loss=0.2658, prior_loss=0.9908, diff_loss=0.3604, tot_loss=1.617, over 111.00 samples.], tot_loss[dur_loss=0.2602, prior_loss=0.9891, diff_loss=0.409, tot_loss=1.658, over 1656.00 samples.],
2024-10-21 02:25:23,217 INFO [train.py:682] (1/4) Start epoch 1012
2024-10-21 02:25:37,237 INFO [train.py:561] (1/4) Epoch 1012, batch 4, global_batch_idx: 16180, batch size: 189, loss[dur_loss=0.2639, prior_loss=0.99, diff_loss=0.3726, tot_loss=1.626, over 189.00 samples.], tot_loss[dur_loss=0.2569, prior_loss=0.9887, diff_loss=0.4666, tot_loss=1.712, over 937.00 samples.],
2024-10-21 02:25:52,032 INFO [train.py:561] (1/4) Epoch 1012, batch 14, global_batch_idx: 16190, batch size: 142, loss[dur_loss=0.2612, prior_loss=0.9891, diff_loss=0.3113, tot_loss=1.562, over 142.00 samples.], tot_loss[dur_loss=0.2604, prior_loss=0.9893, diff_loss=0.3947, tot_loss=1.644, over 2210.00 samples.],
2024-10-21 02:25:53,458 INFO [train.py:682] (1/4) Start epoch 1013
2024-10-21 02:26:13,345 INFO [train.py:561] (1/4) Epoch 1013, batch 8, global_batch_idx: 16200, batch size: 170, loss[dur_loss=0.2604, prior_loss=0.9896, diff_loss=0.334, tot_loss=1.584, over 170.00 samples.], tot_loss[dur_loss=0.2576, prior_loss=0.989, diff_loss=0.4215, tot_loss=1.668, over 1432.00 samples.],
2024-10-21 02:26:23,583 INFO [train.py:682] (1/4) Start epoch 1014
2024-10-21 02:26:35,258 INFO [train.py:561] (1/4) Epoch 1014, batch 2, global_batch_idx: 16210, batch size: 203, loss[dur_loss=0.2626, prior_loss=0.9895, diff_loss=0.3758, tot_loss=1.628, over 203.00 samples.], tot_loss[dur_loss=0.2624, prior_loss=0.9896, diff_loss=0.3523, tot_loss=1.604, over 442.00 samples.],
2024-10-21 02:26:49,500 INFO [train.py:561] (1/4) Epoch 1014, batch 12, global_batch_idx: 16220, batch size: 152, loss[dur_loss=0.2635, prior_loss=0.9892, diff_loss=0.3351, tot_loss=1.588, over 152.00 samples.], tot_loss[dur_loss=0.2597, prior_loss=0.9891, diff_loss=0.3995, tot_loss=1.648, over 1966.00 samples.],
2024-10-21 02:26:53,977 INFO [train.py:682] (1/4) Start epoch 1015
2024-10-21 02:27:11,074 INFO [train.py:561] (1/4) Epoch 1015, batch 6, global_batch_idx: 16230, batch size: 106, loss[dur_loss=0.2631, prior_loss=0.9897, diff_loss=0.3074, tot_loss=1.56, over 106.00 samples.], tot_loss[dur_loss=0.2567, prior_loss=0.9888, diff_loss=0.4464, tot_loss=1.692, over 1142.00 samples.],
2024-10-21 02:27:24,179 INFO [train.py:682] (1/4) Start epoch 1016
2024-10-21 02:27:32,969 INFO [train.py:561] (1/4) Epoch 1016, batch 0, global_batch_idx: 16240, batch size: 108, loss[dur_loss=0.2708, prior_loss=0.9909, diff_loss=0.316, tot_loss=1.578, over 108.00 samples.], tot_loss[dur_loss=0.2708, prior_loss=0.9909, diff_loss=0.316, tot_loss=1.578, over 108.00 samples.],
2024-10-21 02:27:47,228 INFO [train.py:561] (1/4) Epoch 1016, batch 10, global_batch_idx: 16250, batch size: 111, loss[dur_loss=0.2654, prior_loss=0.9909, diff_loss=0.3502, tot_loss=1.607, over 111.00 samples.], tot_loss[dur_loss=0.2588, prior_loss=0.9893, diff_loss=0.4119, tot_loss=1.66, over 1656.00 samples.],
2024-10-21 02:27:54,363 INFO [train.py:682] (1/4) Start epoch 1017
2024-10-21 02:28:08,348 INFO [train.py:561] (1/4) Epoch 1017, batch 4, global_batch_idx: 16260, batch size: 189, loss[dur_loss=0.2607, prior_loss=0.9897, diff_loss=0.37, tot_loss=1.62, over 189.00 samples.], tot_loss[dur_loss=0.2547, prior_loss=0.9884, diff_loss=0.4677, tot_loss=1.711, over 937.00 samples.],
2024-10-21 02:28:23,301 INFO [train.py:561] (1/4) Epoch 1017, batch 14, global_batch_idx: 16270, batch size: 142, loss[dur_loss=0.2625, prior_loss=0.9894, diff_loss=0.3509, tot_loss=1.603, over 142.00 samples.], tot_loss[dur_loss=0.2596, prior_loss=0.9892, diff_loss=0.3941, tot_loss=1.643, over 2210.00 samples.],
2024-10-21 02:28:24,735 INFO [train.py:682] (1/4) Start epoch 1018
2024-10-21 02:28:44,907 INFO [train.py:561] (1/4) Epoch 1018, batch 8, global_batch_idx: 16280, batch size: 170, loss[dur_loss=0.2628, prior_loss=0.9894, diff_loss=0.3607, tot_loss=1.613, over 170.00 samples.], tot_loss[dur_loss=0.2574, prior_loss=0.9889, diff_loss=0.4315, tot_loss=1.678, over 1432.00 samples.],
2024-10-21 02:28:55,075 INFO [train.py:682] (1/4) Start epoch 1019
2024-10-21 02:29:06,769 INFO [train.py:561] (1/4) Epoch 1019, batch 2, global_batch_idx: 16290, batch size: 203, loss[dur_loss=0.2596, prior_loss=0.9893, diff_loss=0.3959, tot_loss=1.645, over 203.00 samples.], tot_loss[dur_loss=0.2618, prior_loss=0.9896, diff_loss=0.3646, tot_loss=1.616, over 442.00 samples.],
2024-10-21 02:29:20,852 INFO [train.py:561] (1/4) Epoch 1019, batch 12, global_batch_idx: 16300, batch size: 152, loss[dur_loss=0.26, prior_loss=0.9893, diff_loss=0.3292, tot_loss=1.578, over 152.00 samples.], tot_loss[dur_loss=0.2584, prior_loss=0.9891, diff_loss=0.4033, tot_loss=1.651, over 1966.00 samples.],
2024-10-21 02:29:25,252 INFO [train.py:682] (1/4) Start epoch 1020
2024-10-21 02:29:42,585 INFO [train.py:561] (1/4) Epoch 1020, batch 6, global_batch_idx: 16310, batch size: 106, loss[dur_loss=0.2645, prior_loss=0.9896, diff_loss=0.3384, tot_loss=1.592, over 106.00 samples.], tot_loss[dur_loss=0.2556, prior_loss=0.9888, diff_loss=0.4394, tot_loss=1.684, over 1142.00 samples.],
2024-10-21 02:29:55,767 INFO [train.py:682] (1/4) Start epoch 1021
2024-10-21 02:30:04,804 INFO [train.py:561] (1/4) Epoch 1021, batch 0, global_batch_idx: 16320, batch size: 108, loss[dur_loss=0.2698, prior_loss=0.9903, diff_loss=0.3343, tot_loss=1.594, over 108.00 samples.], tot_loss[dur_loss=0.2698, prior_loss=0.9903, diff_loss=0.3343, tot_loss=1.594, over 108.00 samples.],
2024-10-21 02:30:19,159 INFO [train.py:561] (1/4) Epoch 1021, batch 10, global_batch_idx: 16330, batch size: 111, loss[dur_loss=0.2651, prior_loss=0.9911, diff_loss=0.3765, tot_loss=1.633, over 111.00 samples.], tot_loss[dur_loss=0.258, prior_loss=0.9891, diff_loss=0.4217, tot_loss=1.669, over 1656.00 samples.],
2024-10-21 02:30:26,319 INFO [train.py:682] (1/4) Start epoch 1022
2024-10-21 02:30:40,202 INFO [train.py:561] (1/4) Epoch 1022, batch 4, global_batch_idx: 16340, batch size: 189, loss[dur_loss=0.2604, prior_loss=0.9897, diff_loss=0.3861, tot_loss=1.636, over 189.00 samples.], tot_loss[dur_loss=0.256, prior_loss=0.9886, diff_loss=0.4716, tot_loss=1.716, over 937.00 samples.],
2024-10-21 02:30:55,068 INFO [train.py:561] (1/4) Epoch 1022, batch 14, global_batch_idx: 16350, batch size: 142, loss[dur_loss=0.2655, prior_loss=0.9898, diff_loss=0.352, tot_loss=1.607, over 142.00 samples.], tot_loss[dur_loss=0.2605, prior_loss=0.9893, diff_loss=0.3973, tot_loss=1.647, over 2210.00 samples.],
2024-10-21 02:30:56,494 INFO [train.py:682] (1/4) Start epoch 1023
2024-10-21 02:31:16,503 INFO [train.py:561] (1/4) Epoch 1023, batch 8, global_batch_idx: 16360, batch size: 170, loss[dur_loss=0.2625, prior_loss=0.9896, diff_loss=0.3394, tot_loss=1.592, over 170.00 samples.], tot_loss[dur_loss=0.2596, prior_loss=0.9891, diff_loss=0.417, tot_loss=1.666, over 1432.00 samples.],
2024-10-21 02:31:26,721 INFO [train.py:682] (1/4) Start epoch 1024
2024-10-21 02:31:38,564 INFO [train.py:561] (1/4) Epoch 1024, batch 2, global_batch_idx: 16370, batch size: 203, loss[dur_loss=0.262, prior_loss=0.9894, diff_loss=0.3756, tot_loss=1.627, over 203.00 samples.], tot_loss[dur_loss=0.2643, prior_loss=0.9897, diff_loss=0.3614, tot_loss=1.615, over 442.00 samples.],
2024-10-21 02:31:52,604 INFO [train.py:561] (1/4) Epoch 1024, batch 12, global_batch_idx: 16380, batch size: 152, loss[dur_loss=0.2601, prior_loss=0.989, diff_loss=0.3414, tot_loss=1.591, over 152.00 samples.], tot_loss[dur_loss=0.259, prior_loss=0.9891, diff_loss=0.3893, tot_loss=1.637, over 1966.00 samples.],
2024-10-21 02:31:56,968 INFO [train.py:682] (1/4) Start epoch 1025
2024-10-21 02:32:14,161 INFO [train.py:561] (1/4) Epoch 1025, batch 6, global_batch_idx: 16390, batch size: 106, loss[dur_loss=0.2658, prior_loss=0.9899, diff_loss=0.3649, tot_loss=1.621, over 106.00 samples.], tot_loss[dur_loss=0.2557, prior_loss=0.9887, diff_loss=0.4373, tot_loss=1.682, over 1142.00 samples.],
2024-10-21 02:32:27,289 INFO [train.py:682] (1/4) Start epoch 1026
2024-10-21 02:32:36,161 INFO [train.py:561] (1/4) Epoch 1026, batch 0, global_batch_idx: 16400, batch size: 108, loss[dur_loss=0.2688, prior_loss=0.9903, diff_loss=0.3147, tot_loss=1.574, over 108.00 samples.], tot_loss[dur_loss=0.2688, prior_loss=0.9903, diff_loss=0.3147, tot_loss=1.574, over 108.00 samples.],
2024-10-21 02:32:50,409 INFO [train.py:561] (1/4) Epoch 1026, batch 10, global_batch_idx: 16410, batch size: 111, loss[dur_loss=0.2628, prior_loss=0.9906, diff_loss=0.3474, tot_loss=1.601, over 111.00 samples.], tot_loss[dur_loss=0.2577, prior_loss=0.989, diff_loss=0.4055, tot_loss=1.652, over 1656.00 samples.],
2024-10-21 02:32:57,542 INFO [train.py:682] (1/4) Start epoch 1027
2024-10-21 02:33:11,265 INFO [train.py:561] (1/4) Epoch 1027, batch 4, global_batch_idx: 16420, batch size: 189, loss[dur_loss=0.2597, prior_loss=0.9898, diff_loss=0.3652, tot_loss=1.615, over 189.00 samples.], tot_loss[dur_loss=0.2575, prior_loss=0.9887, diff_loss=0.4642, tot_loss=1.71, over 937.00 samples.],
2024-10-21 02:33:26,221 INFO [train.py:561] (1/4) Epoch 1027, batch 14, global_batch_idx: 16430, batch size: 142, loss[dur_loss=0.2593, prior_loss=0.9892, diff_loss=0.3406, tot_loss=1.589, over 142.00 samples.], tot_loss[dur_loss=0.2602, prior_loss=0.9892, diff_loss=0.3867, tot_loss=1.636, over 2210.00 samples.],
2024-10-21 02:33:27,654 INFO [train.py:682] (1/4) Start epoch 1028
2024-10-21 02:33:47,778 INFO [train.py:561] (1/4) Epoch 1028, batch 8, global_batch_idx: 16440, batch size: 170, loss[dur_loss=0.2605, prior_loss=0.9892, diff_loss=0.3663, tot_loss=1.616, over 170.00 samples.], tot_loss[dur_loss=0.2586, prior_loss=0.9891, diff_loss=0.4299, tot_loss=1.678, over 1432.00 samples.],
2024-10-21 02:33:57,932 INFO [train.py:682] (1/4) Start epoch 1029
2024-10-21 02:34:09,704 INFO [train.py:561] (1/4) Epoch 1029, batch 2, global_batch_idx: 16450, batch size: 203, loss[dur_loss=0.2605, prior_loss=0.9893, diff_loss=0.3592, tot_loss=1.609, over 203.00 samples.], tot_loss[dur_loss=0.2622, prior_loss=0.9895, diff_loss=0.3573, tot_loss=1.609, over 442.00 samples.],
2024-10-21 02:34:23,966 INFO [train.py:561] (1/4) Epoch 1029, batch 12, global_batch_idx: 16460, batch size: 152, loss[dur_loss=0.2615, prior_loss=0.989, diff_loss=0.3671, tot_loss=1.618, over 152.00 samples.], tot_loss[dur_loss=0.2579, prior_loss=0.9889, diff_loss=0.4009, tot_loss=1.648, over 1966.00 samples.],
2024-10-21 02:34:28,438 INFO [train.py:682] (1/4) Start epoch 1030
2024-10-21 02:34:45,627 INFO [train.py:561] (1/4) Epoch 1030, batch 6, global_batch_idx: 16470, batch size: 106, loss[dur_loss=0.2634, prior_loss=0.9895, diff_loss=0.3417, tot_loss=1.595, over 106.00 samples.], tot_loss[dur_loss=0.256, prior_loss=0.9886, diff_loss=0.4345, tot_loss=1.679, over 1142.00 samples.],
2024-10-21 02:34:58,653 INFO [train.py:682] (1/4) Start epoch 1031
2024-10-21 02:35:07,546 INFO [train.py:561] (1/4) Epoch 1031, batch 0, global_batch_idx: 16480, batch size: 108, loss[dur_loss=0.2687, prior_loss=0.9903, diff_loss=0.3473, tot_loss=1.606, over 108.00 samples.], tot_loss[dur_loss=0.2687, prior_loss=0.9903, diff_loss=0.3473, tot_loss=1.606, over 108.00 samples.],
2024-10-21 02:35:21,753 INFO [train.py:561] (1/4) Epoch 1031, batch 10, global_batch_idx: 16490, batch size: 111, loss[dur_loss=0.2636, prior_loss=0.9904, diff_loss=0.3643, tot_loss=1.618, over 111.00 samples.], tot_loss[dur_loss=0.2579, prior_loss=0.989, diff_loss=0.4228, tot_loss=1.67, over 1656.00 samples.],
2024-10-21 02:35:28,866 INFO [train.py:682] (1/4) Start epoch 1032
2024-10-21 02:35:42,610 INFO [train.py:561] (1/4) Epoch 1032, batch 4, global_batch_idx: 16500, batch size: 189, loss[dur_loss=0.2609, prior_loss=0.9902, diff_loss=0.36, tot_loss=1.611, over 189.00 samples.], tot_loss[dur_loss=0.2556, prior_loss=0.9886, diff_loss=0.4578, tot_loss=1.702, over 937.00 samples.],
2024-10-21 02:35:44,261 INFO [train.py:579] (1/4) Computing validation loss
2024-10-21 02:36:15,178 INFO [train.py:589] (1/4) Epoch 1032, validation: dur_loss=0.4439, prior_loss=1.032, diff_loss=0.4171, tot_loss=1.893, over 100.00 samples.
2024-10-21 02:36:15,179 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB
2024-10-21 02:36:28,506 INFO [train.py:561] (1/4) Epoch 1032, batch 14, global_batch_idx: 16510, batch size: 142, loss[dur_loss=0.262, prior_loss=0.989, diff_loss=0.3265, tot_loss=1.578, over 142.00 samples.], tot_loss[dur_loss=0.2593, prior_loss=0.9891, diff_loss=0.3963, tot_loss=1.645, over 2210.00 samples.],
2024-10-21 02:36:29,947 INFO [train.py:682] (1/4) Start epoch 1033
2024-10-21 02:36:50,020 INFO [train.py:561] (1/4) Epoch 1033, batch 8, global_batch_idx: 16520, batch size: 170, loss[dur_loss=0.2618, prior_loss=0.9895, diff_loss=0.3543, tot_loss=1.606, over 170.00 samples.], tot_loss[dur_loss=0.257, prior_loss=0.9889, diff_loss=0.4242, tot_loss=1.67, over 1432.00 samples.],
2024-10-21 02:37:00,139 INFO [train.py:682] (1/4) Start epoch 1034
2024-10-21 02:37:11,952 INFO [train.py:561] (1/4) Epoch 1034, batch 2, global_batch_idx: 16530, batch size: 203, loss[dur_loss=0.2614, prior_loss=0.9894, diff_loss=0.365, tot_loss=1.616, over 203.00 samples.], tot_loss[dur_loss=0.2617, prior_loss=0.9895, diff_loss=0.359, tot_loss=1.61, over 442.00 samples.],
2024-10-21 02:37:26,141 INFO [train.py:561] (1/4) Epoch 1034, batch 12, global_batch_idx: 16540, batch size: 152, loss[dur_loss=0.2629, prior_loss=0.9894, diff_loss=0.3453, tot_loss=1.598, over 152.00 samples.], tot_loss[dur_loss=0.2591, prior_loss=0.989, diff_loss=0.3993, tot_loss=1.647, over 1966.00 samples.],
2024-10-21 02:37:30,609 INFO [train.py:682] (1/4) Start epoch 1035
2024-10-21 02:37:47,726 INFO [train.py:561] (1/4) Epoch 1035, batch 6, global_batch_idx: 16550, batch size: 106, loss[dur_loss=0.263, prior_loss=0.9892, diff_loss=0.3509, tot_loss=1.603, over 106.00 samples.], tot_loss[dur_loss=0.2553, prior_loss=0.9885, diff_loss=0.4447, tot_loss=1.688, over 1142.00 samples.],
2024-10-21 02:38:00,679 INFO [train.py:682] (1/4) Start epoch 1036
2024-10-21 02:38:09,505 INFO [train.py:561] (1/4) Epoch 1036, batch 0, global_batch_idx: 16560, batch size: 108, loss[dur_loss=0.2687, prior_loss=0.9902, diff_loss=0.3186, tot_loss=1.577, over 108.00 samples.], tot_loss[dur_loss=0.2687, prior_loss=0.9902, diff_loss=0.3186, tot_loss=1.577, over 108.00 samples.],
2024-10-21 02:38:23,824 INFO [train.py:561] (1/4) Epoch 1036, batch 10, global_batch_idx: 16570, batch size: 111, loss[dur_loss=0.2637, prior_loss=0.9904, diff_loss=0.2922, tot_loss=1.546, over 111.00 samples.], tot_loss[dur_loss=0.2574, prior_loss=0.9888, diff_loss=0.4126, tot_loss=1.659, over 1656.00 samples.],
2024-10-21 02:38:30,953 INFO [train.py:682] (1/4) Start epoch 1037
2024-10-21 02:38:44,841 INFO [train.py:561] (1/4) Epoch 1037, batch 4, global_batch_idx: 16580, batch size: 189, loss[dur_loss=0.2602, prior_loss=0.99, diff_loss=0.3607, tot_loss=1.611, over 189.00 samples.], tot_loss[dur_loss=0.2539, prior_loss=0.9883, diff_loss=0.4746, tot_loss=1.717, over 937.00 samples.],
2024-10-21 02:38:59,454 INFO [train.py:561] (1/4) Epoch 1037, batch 14, global_batch_idx: 16590, batch size: 142, loss[dur_loss=0.2621, prior_loss=0.9894, diff_loss=0.3502, tot_loss=1.602, over 142.00 samples.], tot_loss[dur_loss=0.2584, prior_loss=0.9891, diff_loss=0.4003, tot_loss=1.648, over 2210.00 samples.],
2024-10-21 02:39:00,894 INFO [train.py:682] (1/4) Start epoch 1038
2024-10-21 02:39:20,935 INFO [train.py:561] (1/4) Epoch 1038, batch 8, global_batch_idx: 16600, batch size: 170, loss[dur_loss=0.2657, prior_loss=0.9898, diff_loss=0.3726, tot_loss=1.628, over 170.00 samples.], tot_loss[dur_loss=0.2585, prior_loss=0.9889, diff_loss=0.423, tot_loss=1.67, over 1432.00 samples.],
2024-10-21 02:39:31,077 INFO [train.py:682] (1/4) Start epoch 1039
2024-10-21 02:39:42,480 INFO [train.py:561] (1/4) Epoch 1039, batch 2, global_batch_idx: 16610, batch size: 203, loss[dur_loss=0.2614, prior_loss=0.9891, diff_loss=0.3694, tot_loss=1.62, over 203.00 samples.], tot_loss[dur_loss=0.262, prior_loss=0.9893, diff_loss=0.3485, tot_loss=1.6, over 442.00 samples.],
2024-10-21 02:39:56,706 INFO [train.py:561] (1/4) Epoch 1039, batch 12, global_batch_idx: 16620, batch size: 152, loss[dur_loss=0.2632, prior_loss=0.9898, diff_loss=0.3469, tot_loss=1.6, over 152.00 samples.], tot_loss[dur_loss=0.2584, prior_loss=0.9891, diff_loss=0.3977, tot_loss=1.645, over 1966.00 samples.],
2024-10-21 02:40:01,187 INFO [train.py:682] (1/4) Start epoch 1040
2024-10-21 02:40:18,528 INFO [train.py:561] (1/4) Epoch 1040, batch 6, global_batch_idx: 16630, batch size: 106, loss[dur_loss=0.2607, prior_loss=0.9895, diff_loss=0.3394, tot_loss=1.59, over 106.00 samples.], tot_loss[dur_loss=0.2544, prior_loss=0.9884, diff_loss=0.4295, tot_loss=1.672, over 1142.00 samples.],
2024-10-21 02:40:31,567 INFO [train.py:682] (1/4) Start epoch 1041
2024-10-21 02:40:40,289 INFO [train.py:561] (1/4) Epoch 1041, batch 0, global_batch_idx: 16640, batch size: 108, loss[dur_loss=0.2685, prior_loss=0.9899, diff_loss=0.3389, tot_loss=1.597, over 108.00 samples.], tot_loss[dur_loss=0.2685, prior_loss=0.9899, diff_loss=0.3389, tot_loss=1.597, over 108.00 samples.],
2024-10-21 02:40:54,523 INFO [train.py:561] (1/4) Epoch 1041, batch 10, global_batch_idx: 16650, batch size: 111, loss[dur_loss=0.264, prior_loss=0.9903, diff_loss=0.337, tot_loss=1.591, over 111.00 samples.], tot_loss[dur_loss=0.2581, prior_loss=0.9887, diff_loss=0.3982, tot_loss=1.645, over 1656.00 samples.],
2024-10-21 02:41:01,620 INFO [train.py:682] (1/4) Start epoch 1042
2024-10-21 02:41:15,398 INFO [train.py:561] (1/4) Epoch 1042, batch 4, global_batch_idx: 16660, batch size: 189, loss[dur_loss=0.2604, prior_loss=0.9892, diff_loss=0.3721, tot_loss=1.622, over 189.00 samples.], tot_loss[dur_loss=0.254, prior_loss=0.9881, diff_loss=0.4531, tot_loss=1.695, over 937.00 samples.],
2024-10-21 02:41:30,320 INFO [train.py:561] (1/4) Epoch 1042, batch 14, global_batch_idx: 16670, batch size: 142, loss[dur_loss=0.2628, prior_loss=0.9893, diff_loss=0.3447, tot_loss=1.597, over 142.00 samples.], tot_loss[dur_loss=0.2583, prior_loss=0.9889, diff_loss=0.3893, tot_loss=1.636, over 2210.00 samples.],
2024-10-21 02:41:31,739 INFO [train.py:682] (1/4) Start epoch 1043
2024-10-21 02:41:51,790 INFO [train.py:561] (1/4) Epoch 1043, batch 8, global_batch_idx: 16680, batch size: 170, loss[dur_loss=0.2626, prior_loss=0.9892, diff_loss=0.3463, tot_loss=1.598, over 170.00 samples.], tot_loss[dur_loss=0.2566, prior_loss=0.9886, diff_loss=0.4259, tot_loss=1.671, over 1432.00 samples.],
2024-10-21 02:42:01,917 INFO [train.py:682] (1/4) Start epoch 1044
2024-10-21 02:42:13,230 INFO [train.py:561] (1/4) Epoch 1044, batch 2, global_batch_idx: 16690, batch size: 203, loss[dur_loss=0.2606, prior_loss=0.9889, diff_loss=0.368, tot_loss=1.617, over 203.00 samples.], tot_loss[dur_loss=0.2625, prior_loss=0.9891, diff_loss=0.3501, tot_loss=1.602, over 442.00 samples.],
2024-10-21 02:42:27,485 INFO [train.py:561] (1/4) Epoch 1044, batch 12, global_batch_idx: 16700, batch size: 152, loss[dur_loss=0.2567, prior_loss=0.9894, diff_loss=0.3368, tot_loss=1.583, over 152.00 samples.], tot_loss[dur_loss=0.2582, prior_loss=0.9889, diff_loss=0.4036, tot_loss=1.651, over 1966.00 samples.],
2024-10-21 02:42:31,983 INFO [train.py:682] (1/4) Start epoch 1045
2024-10-21 02:42:49,124 INFO [train.py:561] (1/4) Epoch 1045, batch 6, global_batch_idx: 16710, batch size: 106, loss[dur_loss=0.2615, prior_loss=0.9889, diff_loss=0.3592, tot_loss=1.61, over 106.00 samples.], tot_loss[dur_loss=0.255, prior_loss=0.9884, diff_loss=0.4377, tot_loss=1.681, over 1142.00 samples.],
2024-10-21 02:43:02,209 INFO [train.py:682] (1/4) Start epoch 1046
2024-10-21 02:43:10,996 INFO [train.py:561] (1/4) Epoch 1046, batch 0, global_batch_idx: 16720, batch size: 108, loss[dur_loss=0.2642, prior_loss=0.9901, diff_loss=0.3474, tot_loss=1.602, over 108.00 samples.], tot_loss[dur_loss=0.2642, prior_loss=0.9901, diff_loss=0.3474, tot_loss=1.602, over 108.00 samples.],
2024-10-21 02:43:25,264 INFO [train.py:561] (1/4) Epoch 1046, batch 10, global_batch_idx: 16730, batch size: 111, loss[dur_loss=0.2611, prior_loss=0.9899, diff_loss=0.3102, tot_loss=1.561, over 111.00 samples.], tot_loss[dur_loss=0.2566, prior_loss=0.9887, diff_loss=0.4039, tot_loss=1.649, over 1656.00 samples.],
2024-10-21 02:43:32,305 INFO [train.py:682] (1/4) Start epoch 1047
2024-10-21 02:43:45,997 INFO [train.py:561] (1/4) Epoch 1047, batch 4, global_batch_idx: 16740, batch size: 189, loss[dur_loss=0.2602, prior_loss=0.9893, diff_loss=0.3572, tot_loss=1.607, over 189.00 samples.], tot_loss[dur_loss=0.2531, prior_loss=0.9883, diff_loss=0.4619, tot_loss=1.703, over 937.00 samples.],
2024-10-21 02:44:00,843 INFO [train.py:561] (1/4) Epoch 1047, batch 14, global_batch_idx: 16750, batch size: 142, loss[dur_loss=0.2628, prior_loss=0.989, diff_loss=0.2966, tot_loss=1.548, over 142.00 samples.], tot_loss[dur_loss=0.2577, prior_loss=0.9889, diff_loss=0.3924, tot_loss=1.639, over 2210.00 samples.],
2024-10-21 02:44:02,256 INFO [train.py:682] (1/4) Start epoch 1048
2024-10-21 02:44:22,051 INFO [train.py:561] (1/4) Epoch 1048, batch 8, global_batch_idx: 16760, batch size: 170, loss[dur_loss=0.2585, prior_loss=0.9896, diff_loss=0.3456, tot_loss=1.594, over 170.00 samples.], tot_loss[dur_loss=0.2564, prior_loss=0.9886, diff_loss=0.4166, tot_loss=1.662, over 1432.00 samples.],
2024-10-21 02:44:32,132 INFO [train.py:682] (1/4) Start epoch 1049
2024-10-21 02:44:43,454 INFO [train.py:561] (1/4) Epoch 1049, batch 2, global_batch_idx: 16770, batch size: 203, loss[dur_loss=0.2562, prior_loss=0.9889, diff_loss=0.3812, tot_loss=1.626, over 203.00 samples.], tot_loss[dur_loss=0.2593, prior_loss=0.9892, diff_loss=0.3526, tot_loss=1.601, over 442.00 samples.],
2024-10-21 02:44:57,667 INFO [train.py:561] (1/4) Epoch 1049, batch 12, global_batch_idx: 16780, batch size: 152, loss[dur_loss=0.2578, prior_loss=0.989, diff_loss=0.3307, tot_loss=1.577, over 152.00 samples.], tot_loss[dur_loss=0.2577, prior_loss=0.989, diff_loss=0.397, tot_loss=1.644, over 1966.00 samples.],
2024-10-21 02:45:02,255 INFO [train.py:682] (1/4) Start epoch 1050
2024-10-21 02:45:19,355 INFO [train.py:561] (1/4) Epoch 1050, batch 6, global_batch_idx: 16790, batch size: 106, loss[dur_loss=0.2638, prior_loss=0.9893, diff_loss=0.3022, tot_loss=1.555, over 106.00 samples.], tot_loss[dur_loss=0.2561, prior_loss=0.9885, diff_loss=0.4381, tot_loss=1.683, over 1142.00 samples.],
2024-10-21 02:45:32,443 INFO [train.py:682] (1/4) Start epoch 1051
2024-10-21 02:45:41,384 INFO [train.py:561] (1/4) Epoch 1051, batch 0, global_batch_idx: 16800, batch size: 108, loss[dur_loss=0.265, prior_loss=0.9898, diff_loss=0.3264, tot_loss=1.581, over 108.00 samples.], tot_loss[dur_loss=0.265, prior_loss=0.9898, diff_loss=0.3264, tot_loss=1.581, over 108.00 samples.],
2024-10-21 02:45:55,912 INFO [train.py:561] (1/4) Epoch 1051, batch 10, global_batch_idx: 16810, batch size: 111, loss[dur_loss=0.2649, prior_loss=0.9905, diff_loss=0.3338, tot_loss=1.589, over 111.00 samples.], tot_loss[dur_loss=0.2571, prior_loss=0.9889, diff_loss=0.4077, tot_loss=1.654, over 1656.00 samples.],
2024-10-21 02:46:03,041 INFO [train.py:682] (1/4) Start epoch 1052
2024-10-21 02:46:17,124 INFO [train.py:561] (1/4) Epoch 1052, batch 4, global_batch_idx: 16820, batch size: 189, loss[dur_loss=0.2623, prior_loss=0.9903, diff_loss=0.3555, tot_loss=1.608, over 189.00 samples.], tot_loss[dur_loss=0.2552, prior_loss=0.9883, diff_loss=0.4625, tot_loss=1.706, over 937.00 samples.],
2024-10-21 02:46:32,038 INFO [train.py:561] (1/4) Epoch 1052, batch 14, global_batch_idx: 16830, batch size: 142, loss[dur_loss=0.2625, prior_loss=0.989, diff_loss=0.3499, tot_loss=1.601, over 142.00 samples.], tot_loss[dur_loss=0.2597, prior_loss=0.989, diff_loss=0.3982, tot_loss=1.647, over 2210.00 samples.],
2024-10-21 02:46:33,471 INFO [train.py:682] (1/4) Start epoch 1053
2024-10-21 02:46:53,377 INFO [train.py:561] (1/4) Epoch 1053, batch 8, global_batch_idx: 16840, batch size: 170, loss[dur_loss=0.2661, prior_loss=0.9896, diff_loss=0.3643, tot_loss=1.62, over 170.00 samples.], tot_loss[dur_loss=0.2581, prior_loss=0.9888, diff_loss=0.4079, tot_loss=1.655, over 1432.00 samples.],
2024-10-21 02:47:03,480 INFO [train.py:682] (1/4) Start epoch 1054
2024-10-21 02:47:15,065 INFO [train.py:561] (1/4) Epoch 1054, batch 2, global_batch_idx: 16850, batch size: 203, loss[dur_loss=0.265, prior_loss=0.9891, diff_loss=0.3713, tot_loss=1.625, over 203.00 samples.], tot_loss[dur_loss=0.2646, prior_loss=0.9892, diff_loss=0.3481, tot_loss=1.602, over 442.00 samples.],
2024-10-21 02:47:29,392 INFO [train.py:561] (1/4) Epoch 1054, batch 12, global_batch_idx: 16860, batch size: 152, loss[dur_loss=0.2595, prior_loss=0.9892, diff_loss=0.3181, tot_loss=1.567, over 152.00 samples.], tot_loss[dur_loss=0.2587, prior_loss=0.9888, diff_loss=0.3915, tot_loss=1.639, over 1966.00 samples.],
2024-10-21 02:47:33,874 INFO [train.py:682] (1/4) Start epoch 1055
2024-10-21 02:47:50,973 INFO [train.py:561] (1/4) Epoch 1055, batch 6, global_batch_idx: 16870, batch size: 106, loss[dur_loss=0.2639, prior_loss=0.989, diff_loss=0.3287, tot_loss=1.582, over 106.00 samples.], tot_loss[dur_loss=0.2558, prior_loss=0.9882, diff_loss=0.438, tot_loss=1.682, over 1142.00 samples.],
2024-10-21 02:48:03,975 INFO [train.py:682] (1/4) Start epoch 1056
2024-10-21 02:48:12,823 INFO [train.py:561] (1/4) Epoch 1056, batch 0, global_batch_idx: 16880, batch size: 108, loss[dur_loss=0.2676, prior_loss=0.9899, diff_loss=0.3684, tot_loss=1.626, over 108.00 samples.], tot_loss[dur_loss=0.2676, prior_loss=0.9899, diff_loss=0.3684, tot_loss=1.626, over 108.00 samples.],
2024-10-21 02:48:27,197 INFO [train.py:561] (1/4) Epoch 1056, batch 10, global_batch_idx: 16890, batch size: 111, loss[dur_loss=0.2657, prior_loss=0.9904, diff_loss=0.3152, tot_loss=1.571, over 111.00 samples.], tot_loss[dur_loss=0.257, prior_loss=0.9889, diff_loss=0.4066, tot_loss=1.653, over 1656.00 samples.],
2024-10-21 02:48:34,379 INFO [train.py:682] (1/4) Start epoch 1057
2024-10-21 02:48:48,489 INFO [train.py:561] (1/4) Epoch 1057, batch 4, global_batch_idx: 16900, batch size: 189, loss[dur_loss=0.2609, prior_loss=0.9896, diff_loss=0.3497, tot_loss=1.6, over 189.00 samples.], tot_loss[dur_loss=0.2555, prior_loss=0.9884, diff_loss=0.4682, tot_loss=1.712, over 937.00 samples.],
2024-10-21 02:49:03,395 INFO [train.py:561] (1/4) Epoch 1057, batch 14, global_batch_idx: 16910, batch size: 142, loss[dur_loss=0.2626, prior_loss=0.9889, diff_loss=0.3556, tot_loss=1.607, over 142.00 samples.], tot_loss[dur_loss=0.2586, prior_loss=0.9889, diff_loss=0.3945, tot_loss=1.642, over 2210.00 samples.],
2024-10-21 02:49:04,813 INFO [train.py:682] (1/4) Start epoch 1058
2024-10-21 02:49:24,950 INFO [train.py:561] (1/4) Epoch 1058, batch 8, global_batch_idx: 16920, batch size: 170, loss[dur_loss=0.2599, prior_loss=0.9893, diff_loss=0.3253, tot_loss=1.575, over 170.00 samples.], tot_loss[dur_loss=0.257, prior_loss=0.9885, diff_loss=0.4092, tot_loss=1.655, over 1432.00 samples.],
2024-10-21 02:49:35,133 INFO [train.py:682] (1/4) Start epoch 1059
2024-10-21 02:49:46,580 INFO [train.py:561] (1/4) Epoch 1059, batch 2, global_batch_idx: 16930, batch size: 203, loss[dur_loss=0.2583, prior_loss=0.9891, diff_loss=0.3781, tot_loss=1.625, over 203.00 samples.], tot_loss[dur_loss=0.2625, prior_loss=0.9893, diff_loss=0.3647, tot_loss=1.617, over 442.00 samples.],
2024-10-21 02:50:00,838 INFO [train.py:561] (1/4) Epoch 1059, batch 12, global_batch_idx: 16940, batch size: 152, loss[dur_loss=0.257, prior_loss=0.9886, diff_loss=0.3722, tot_loss=1.618, over 152.00 samples.], tot_loss[dur_loss=0.2562, prior_loss=0.9886, diff_loss=0.4033, tot_loss=1.648, over 1966.00 samples.],
2024-10-21 02:50:05,278 INFO [train.py:682] (1/4) Start epoch 1060
2024-10-21 02:50:22,626 INFO [train.py:561] (1/4) Epoch 1060, batch 6, global_batch_idx: 16950, batch size: 106, loss[dur_loss=0.2604, prior_loss=0.9891, diff_loss=0.3429, tot_loss=1.592, over 106.00 samples.], tot_loss[dur_loss=0.2542, prior_loss=0.9883, diff_loss=0.4308, tot_loss=1.673, over 1142.00 samples.],
2024-10-21 02:50:35,775 INFO [train.py:682] (1/4) Start epoch 1061
2024-10-21 02:50:44,754 INFO [train.py:561] (1/4) Epoch 1061, batch 0, global_batch_idx: 16960, batch size: 108, loss[dur_loss=0.2642, prior_loss=0.9898, diff_loss=0.3034, tot_loss=1.557, over 108.00 samples.], tot_loss[dur_loss=0.2642, prior_loss=0.9898, diff_loss=0.3034, tot_loss=1.557, over 108.00 samples.],
2024-10-21 02:50:58,969 INFO [train.py:561] (1/4) Epoch 1061, batch 10, global_batch_idx: 16970, batch size: 111, loss[dur_loss=0.2624, prior_loss=0.9898, diff_loss=0.3601, tot_loss=1.612, over 111.00 samples.], tot_loss[dur_loss=0.2572, prior_loss=0.9884, diff_loss=0.4101, tot_loss=1.656, over 1656.00 samples.],
2024-10-21 02:51:06,063 INFO [train.py:682] (1/4) Start epoch 1062
2024-10-21 02:51:19,775 INFO [train.py:561] (1/4) Epoch 1062, batch 4, global_batch_idx: 16980, batch size: 189, loss[dur_loss=0.2585, prior_loss=0.9891, diff_loss=0.3758, tot_loss=1.623, over 189.00 samples.], tot_loss[dur_loss=0.2547, prior_loss=0.988, diff_loss=0.4568, tot_loss=1.7, over 937.00 samples.],
2024-10-21 02:51:34,764 INFO [train.py:561] (1/4) Epoch 1062, batch 14, global_batch_idx: 16990, batch size: 142, loss[dur_loss=0.2565, prior_loss=0.9886, diff_loss=0.3561, tot_loss=1.601, over 142.00 samples.], tot_loss[dur_loss=0.257, prior_loss=0.9886, diff_loss=0.3986, tot_loss=1.644, over 2210.00 samples.],
2024-10-21 02:51:36,194 INFO [train.py:682] (1/4) Start epoch 1063
2024-10-21 02:51:56,165 INFO [train.py:561] (1/4) Epoch 1063, batch 8, global_batch_idx: 17000, batch size: 170, loss[dur_loss=0.2625, prior_loss=0.9895, diff_loss=0.3842, tot_loss=1.636, over 170.00 samples.], tot_loss[dur_loss=0.2563, prior_loss=0.9885, diff_loss=0.425, tot_loss=1.67, over 1432.00 samples.],
2024-10-21 02:52:06,338 INFO [train.py:682] (1/4) Start epoch 1064
2024-10-21 02:52:17,526 INFO [train.py:561] (1/4) Epoch 1064, batch 2, global_batch_idx: 17010, batch size: 203, loss[dur_loss=0.2594, prior_loss=0.989, diff_loss=0.3863, tot_loss=1.635, over 203.00 samples.], tot_loss[dur_loss=0.2623, prior_loss=0.9892, diff_loss=0.3521, tot_loss=1.604, over 442.00 samples.],
2024-10-21 02:52:31,899 INFO [train.py:561] (1/4) Epoch 1064, batch 12, global_batch_idx: 17020, batch size: 152, loss[dur_loss=0.2585, prior_loss=0.9891, diff_loss=0.3579, tot_loss=1.606, over 152.00 samples.], tot_loss[dur_loss=0.2584, prior_loss=0.9888, diff_loss=0.3987, tot_loss=1.646, over 1966.00 samples.],
2024-10-21 02:52:36,423 INFO [train.py:682] (1/4) Start epoch 1065
2024-10-21 02:52:53,622 INFO [train.py:561] (1/4) Epoch 1065, batch 6, global_batch_idx: 17030, batch size: 106, loss[dur_loss=0.2574, prior_loss=0.9892, diff_loss=0.3472, tot_loss=1.594, over 106.00 samples.], tot_loss[dur_loss=0.2561, prior_loss=0.9884, diff_loss=0.4398, tot_loss=1.684, over 1142.00 samples.],
2024-10-21 02:53:06,767 INFO [train.py:682] (1/4) Start epoch 1066
2024-10-21 02:53:15,591 INFO [train.py:561] (1/4) Epoch 1066, batch 0, global_batch_idx: 17040, batch size: 108, loss[dur_loss=0.2644, prior_loss=0.99, diff_loss=0.3599, tot_loss=1.614, over 108.00 samples.], tot_loss[dur_loss=0.2644, prior_loss=0.99, diff_loss=0.3599, tot_loss=1.614, over 108.00 samples.],
2024-10-21 02:53:30,018 INFO [train.py:561] (1/4) Epoch 1066, batch 10, global_batch_idx: 17050, batch size: 111, loss[dur_loss=0.265, prior_loss=0.9906, diff_loss=0.358, tot_loss=1.614, over 111.00 samples.], tot_loss[dur_loss=0.2583, prior_loss=0.9886, diff_loss=0.4098, tot_loss=1.657, over 1656.00 samples.],
2024-10-21 02:53:37,194 INFO [train.py:682] (1/4) Start epoch 1067
2024-10-21 02:53:51,126 INFO [train.py:561] (1/4) Epoch 1067, batch 4, global_batch_idx: 17060, batch size: 189, loss[dur_loss=0.2579, prior_loss=0.9888, diff_loss=0.3749, tot_loss=1.622, over 189.00 samples.], tot_loss[dur_loss=0.254, prior_loss=0.9879, diff_loss=0.4593, tot_loss=1.701, over 937.00 samples.],
2024-10-21 02:54:06,035 INFO [train.py:561] (1/4) Epoch 1067, batch 14, global_batch_idx: 17070, batch size: 142, loss[dur_loss=0.2591, prior_loss=0.9889, diff_loss=0.3261, tot_loss=1.574, over 142.00 samples.], tot_loss[dur_loss=0.2576, prior_loss=0.9886, diff_loss=0.3919, tot_loss=1.638, over 2210.00 samples.],
2024-10-21 02:54:07,455 INFO [train.py:682] (1/4) Start epoch 1068
2024-10-21 02:54:27,702 INFO [train.py:561] (1/4) Epoch 1068, batch 8, global_batch_idx: 17080, batch size: 170, loss[dur_loss=0.2604, prior_loss=0.9887, diff_loss=0.346, tot_loss=1.595, over 170.00 samples.], tot_loss[dur_loss=0.256, prior_loss=0.9883, diff_loss=0.4195, tot_loss=1.664, over 1432.00 samples.],
2024-10-21 02:54:37,872 INFO [train.py:682] (1/4) Start epoch 1069
2024-10-21 02:54:49,244 INFO [train.py:561] (1/4) Epoch 1069, batch 2,
global_batch_idx: 17090, batch size: 203, loss[dur_loss=0.2585, prior_loss=0.9888, diff_loss=0.3711, tot_loss=1.618, over 203.00 samples.], tot_loss[dur_loss=0.2608, prior_loss=0.989, diff_loss=0.3614, tot_loss=1.611, over 442.00 samples.], 2024-10-21 02:55:03,568 INFO [train.py:561] (1/4) Epoch 1069, batch 12, global_batch_idx: 17100, batch size: 152, loss[dur_loss=0.2587, prior_loss=0.9886, diff_loss=0.345, tot_loss=1.592, over 152.00 samples.], tot_loss[dur_loss=0.2564, prior_loss=0.9885, diff_loss=0.4053, tot_loss=1.65, over 1966.00 samples.], 2024-10-21 02:55:08,037 INFO [train.py:682] (1/4) Start epoch 1070 2024-10-21 02:55:24,987 INFO [train.py:561] (1/4) Epoch 1070, batch 6, global_batch_idx: 17110, batch size: 106, loss[dur_loss=0.2604, prior_loss=0.9886, diff_loss=0.3614, tot_loss=1.61, over 106.00 samples.], tot_loss[dur_loss=0.2534, prior_loss=0.9879, diff_loss=0.4394, tot_loss=1.681, over 1142.00 samples.], 2024-10-21 02:55:38,096 INFO [train.py:682] (1/4) Start epoch 1071 2024-10-21 02:55:46,576 INFO [train.py:561] (1/4) Epoch 1071, batch 0, global_batch_idx: 17120, batch size: 108, loss[dur_loss=0.263, prior_loss=0.9896, diff_loss=0.3263, tot_loss=1.579, over 108.00 samples.], tot_loss[dur_loss=0.263, prior_loss=0.9896, diff_loss=0.3263, tot_loss=1.579, over 108.00 samples.], 2024-10-21 02:56:00,940 INFO [train.py:561] (1/4) Epoch 1071, batch 10, global_batch_idx: 17130, batch size: 111, loss[dur_loss=0.2663, prior_loss=0.9899, diff_loss=0.3723, tot_loss=1.629, over 111.00 samples.], tot_loss[dur_loss=0.2575, prior_loss=0.9884, diff_loss=0.4109, tot_loss=1.657, over 1656.00 samples.], 2024-10-21 02:56:08,075 INFO [train.py:682] (1/4) Start epoch 1072 2024-10-21 02:56:21,941 INFO [train.py:561] (1/4) Epoch 1072, batch 4, global_batch_idx: 17140, batch size: 189, loss[dur_loss=0.26, prior_loss=0.9892, diff_loss=0.3758, tot_loss=1.625, over 189.00 samples.], tot_loss[dur_loss=0.2533, prior_loss=0.9878, diff_loss=0.4518, tot_loss=1.693, over 937.00 samples.], 2024-10-21 02:56:36,909 INFO [train.py:561] (1/4) Epoch 1072, batch 14, global_batch_idx: 17150, batch size: 142, loss[dur_loss=0.26, prior_loss=0.9887, diff_loss=0.3467, tot_loss=1.595, over 142.00 samples.], tot_loss[dur_loss=0.2568, prior_loss=0.9884, diff_loss=0.3855, tot_loss=1.631, over 2210.00 samples.], 2024-10-21 02:56:38,336 INFO [train.py:682] (1/4) Start epoch 1073 2024-10-21 02:56:58,005 INFO [train.py:561] (1/4) Epoch 1073, batch 8, global_batch_idx: 17160, batch size: 170, loss[dur_loss=0.2575, prior_loss=0.9886, diff_loss=0.4075, tot_loss=1.654, over 170.00 samples.], tot_loss[dur_loss=0.2561, prior_loss=0.988, diff_loss=0.4327, tot_loss=1.677, over 1432.00 samples.], 2024-10-21 02:57:08,105 INFO [train.py:682] (1/4) Start epoch 1074 2024-10-21 02:57:19,644 INFO [train.py:561] (1/4) Epoch 1074, batch 2, global_batch_idx: 17170, batch size: 203, loss[dur_loss=0.2615, prior_loss=0.9887, diff_loss=0.3427, tot_loss=1.593, over 203.00 samples.], tot_loss[dur_loss=0.2606, prior_loss=0.9888, diff_loss=0.3403, tot_loss=1.59, over 442.00 samples.], 2024-10-21 02:57:33,934 INFO [train.py:561] (1/4) Epoch 1074, batch 12, global_batch_idx: 17180, batch size: 152, loss[dur_loss=0.2569, prior_loss=0.9885, diff_loss=0.365, tot_loss=1.61, over 152.00 samples.], tot_loss[dur_loss=0.2567, prior_loss=0.9884, diff_loss=0.3992, tot_loss=1.644, over 1966.00 samples.], 2024-10-21 02:57:38,446 INFO [train.py:682] (1/4) Start epoch 1075 2024-10-21 02:57:55,477 INFO [train.py:561] (1/4) Epoch 1075, batch 6, global_batch_idx: 17190, 
batch size: 106, loss[dur_loss=0.2567, prior_loss=0.9887, diff_loss=0.3437, tot_loss=1.589, over 106.00 samples.], tot_loss[dur_loss=0.253, prior_loss=0.9879, diff_loss=0.4474, tot_loss=1.688, over 1142.00 samples.], 2024-10-21 02:58:08,568 INFO [train.py:682] (1/4) Start epoch 1076 2024-10-21 02:58:17,092 INFO [train.py:561] (1/4) Epoch 1076, batch 0, global_batch_idx: 17200, batch size: 108, loss[dur_loss=0.2598, prior_loss=0.9894, diff_loss=0.3238, tot_loss=1.573, over 108.00 samples.], tot_loss[dur_loss=0.2598, prior_loss=0.9894, diff_loss=0.3238, tot_loss=1.573, over 108.00 samples.], 2024-10-21 02:58:31,300 INFO [train.py:561] (1/4) Epoch 1076, batch 10, global_batch_idx: 17210, batch size: 111, loss[dur_loss=0.2637, prior_loss=0.9898, diff_loss=0.3292, tot_loss=1.583, over 111.00 samples.], tot_loss[dur_loss=0.2548, prior_loss=0.988, diff_loss=0.3969, tot_loss=1.64, over 1656.00 samples.], 2024-10-21 02:58:38,348 INFO [train.py:682] (1/4) Start epoch 1077 2024-10-21 02:58:52,675 INFO [train.py:561] (1/4) Epoch 1077, batch 4, global_batch_idx: 17220, batch size: 189, loss[dur_loss=0.2553, prior_loss=0.9885, diff_loss=0.337, tot_loss=1.581, over 189.00 samples.], tot_loss[dur_loss=0.253, prior_loss=0.9875, diff_loss=0.4642, tot_loss=1.705, over 937.00 samples.], 2024-10-21 02:59:07,503 INFO [train.py:561] (1/4) Epoch 1077, batch 14, global_batch_idx: 17230, batch size: 142, loss[dur_loss=0.2559, prior_loss=0.988, diff_loss=0.3564, tot_loss=1.6, over 142.00 samples.], tot_loss[dur_loss=0.2561, prior_loss=0.9881, diff_loss=0.3906, tot_loss=1.635, over 2210.00 samples.], 2024-10-21 02:59:08,936 INFO [train.py:682] (1/4) Start epoch 1078 2024-10-21 02:59:28,920 INFO [train.py:561] (1/4) Epoch 1078, batch 8, global_batch_idx: 17240, batch size: 170, loss[dur_loss=0.2608, prior_loss=0.9884, diff_loss=0.361, tot_loss=1.61, over 170.00 samples.], tot_loss[dur_loss=0.2551, prior_loss=0.9878, diff_loss=0.4199, tot_loss=1.663, over 1432.00 samples.], 2024-10-21 02:59:39,060 INFO [train.py:682] (1/4) Start epoch 1079 2024-10-21 02:59:50,284 INFO [train.py:561] (1/4) Epoch 1079, batch 2, global_batch_idx: 17250, batch size: 203, loss[dur_loss=0.2593, prior_loss=0.9883, diff_loss=0.3522, tot_loss=1.6, over 203.00 samples.], tot_loss[dur_loss=0.2586, prior_loss=0.9886, diff_loss=0.3524, tot_loss=1.599, over 442.00 samples.], 2024-10-21 03:00:04,643 INFO [train.py:561] (1/4) Epoch 1079, batch 12, global_batch_idx: 17260, batch size: 152, loss[dur_loss=0.2576, prior_loss=0.9883, diff_loss=0.3381, tot_loss=1.584, over 152.00 samples.], tot_loss[dur_loss=0.2551, prior_loss=0.988, diff_loss=0.3939, tot_loss=1.637, over 1966.00 samples.], 2024-10-21 03:00:09,141 INFO [train.py:682] (1/4) Start epoch 1080 2024-10-21 03:00:26,414 INFO [train.py:561] (1/4) Epoch 1080, batch 6, global_batch_idx: 17270, batch size: 106, loss[dur_loss=0.2609, prior_loss=0.9888, diff_loss=0.3084, tot_loss=1.558, over 106.00 samples.], tot_loss[dur_loss=0.2505, prior_loss=0.9876, diff_loss=0.4315, tot_loss=1.67, over 1142.00 samples.], 2024-10-21 03:00:39,530 INFO [train.py:682] (1/4) Start epoch 1081 2024-10-21 03:00:48,084 INFO [train.py:561] (1/4) Epoch 1081, batch 0, global_batch_idx: 17280, batch size: 108, loss[dur_loss=0.2653, prior_loss=0.9892, diff_loss=0.3257, tot_loss=1.58, over 108.00 samples.], tot_loss[dur_loss=0.2653, prior_loss=0.9892, diff_loss=0.3257, tot_loss=1.58, over 108.00 samples.], 2024-10-21 03:01:02,421 INFO [train.py:561] (1/4) Epoch 1081, batch 10, global_batch_idx: 17290, batch size: 111, 
loss[dur_loss=0.2646, prior_loss=0.9896, diff_loss=0.3394, tot_loss=1.594, over 111.00 samples.], tot_loss[dur_loss=0.2547, prior_loss=0.988, diff_loss=0.4002, tot_loss=1.643, over 1656.00 samples.], 2024-10-21 03:01:09,517 INFO [train.py:682] (1/4) Start epoch 1082 2024-10-21 03:01:23,088 INFO [train.py:561] (1/4) Epoch 1082, batch 4, global_batch_idx: 17300, batch size: 189, loss[dur_loss=0.2579, prior_loss=0.9886, diff_loss=0.3789, tot_loss=1.625, over 189.00 samples.], tot_loss[dur_loss=0.2502, prior_loss=0.9873, diff_loss=0.46, tot_loss=1.698, over 937.00 samples.], 2024-10-21 03:01:37,877 INFO [train.py:561] (1/4) Epoch 1082, batch 14, global_batch_idx: 17310, batch size: 142, loss[dur_loss=0.2552, prior_loss=0.9883, diff_loss=0.322, tot_loss=1.565, over 142.00 samples.], tot_loss[dur_loss=0.2546, prior_loss=0.988, diff_loss=0.3916, tot_loss=1.634, over 2210.00 samples.], 2024-10-21 03:01:39,284 INFO [train.py:682] (1/4) Start epoch 1083 2024-10-21 03:01:59,626 INFO [train.py:561] (1/4) Epoch 1083, batch 8, global_batch_idx: 17320, batch size: 170, loss[dur_loss=0.2573, prior_loss=0.9885, diff_loss=0.3628, tot_loss=1.609, over 170.00 samples.], tot_loss[dur_loss=0.2534, prior_loss=0.9878, diff_loss=0.4269, tot_loss=1.668, over 1432.00 samples.], 2024-10-21 03:02:09,820 INFO [train.py:682] (1/4) Start epoch 1084 2024-10-21 03:02:21,159 INFO [train.py:561] (1/4) Epoch 1084, batch 2, global_batch_idx: 17330, batch size: 203, loss[dur_loss=0.2551, prior_loss=0.9882, diff_loss=0.3667, tot_loss=1.61, over 203.00 samples.], tot_loss[dur_loss=0.2571, prior_loss=0.9884, diff_loss=0.3331, tot_loss=1.579, over 442.00 samples.], 2024-10-21 03:02:35,527 INFO [train.py:561] (1/4) Epoch 1084, batch 12, global_batch_idx: 17340, batch size: 152, loss[dur_loss=0.2585, prior_loss=0.9883, diff_loss=0.3373, tot_loss=1.584, over 152.00 samples.], tot_loss[dur_loss=0.2551, prior_loss=0.9879, diff_loss=0.3858, tot_loss=1.629, over 1966.00 samples.], 2024-10-21 03:02:40,000 INFO [train.py:682] (1/4) Start epoch 1085 2024-10-21 03:02:57,125 INFO [train.py:561] (1/4) Epoch 1085, batch 6, global_batch_idx: 17350, batch size: 106, loss[dur_loss=0.2544, prior_loss=0.9886, diff_loss=0.3082, tot_loss=1.551, over 106.00 samples.], tot_loss[dur_loss=0.2526, prior_loss=0.9877, diff_loss=0.437, tot_loss=1.677, over 1142.00 samples.], 2024-10-21 03:03:10,226 INFO [train.py:682] (1/4) Start epoch 1086 2024-10-21 03:03:18,849 INFO [train.py:561] (1/4) Epoch 1086, batch 0, global_batch_idx: 17360, batch size: 108, loss[dur_loss=0.2619, prior_loss=0.9889, diff_loss=0.287, tot_loss=1.538, over 108.00 samples.], tot_loss[dur_loss=0.2619, prior_loss=0.9889, diff_loss=0.287, tot_loss=1.538, over 108.00 samples.], 2024-10-21 03:03:33,089 INFO [train.py:561] (1/4) Epoch 1086, batch 10, global_batch_idx: 17370, batch size: 111, loss[dur_loss=0.2615, prior_loss=0.9897, diff_loss=0.3542, tot_loss=1.605, over 111.00 samples.], tot_loss[dur_loss=0.2553, prior_loss=0.9881, diff_loss=0.4005, tot_loss=1.644, over 1656.00 samples.], 2024-10-21 03:03:40,186 INFO [train.py:682] (1/4) Start epoch 1087 2024-10-21 03:03:53,874 INFO [train.py:561] (1/4) Epoch 1087, batch 4, global_batch_idx: 17380, batch size: 189, loss[dur_loss=0.2578, prior_loss=0.9888, diff_loss=0.3421, tot_loss=1.589, over 189.00 samples.], tot_loss[dur_loss=0.2517, prior_loss=0.9875, diff_loss=0.4448, tot_loss=1.684, over 937.00 samples.], 2024-10-21 03:04:08,730 INFO [train.py:561] (1/4) Epoch 1087, batch 14, global_batch_idx: 17390, batch size: 142, loss[dur_loss=0.257, 
prior_loss=0.9879, diff_loss=0.3191, tot_loss=1.564, over 142.00 samples.], tot_loss[dur_loss=0.2553, prior_loss=0.988, diff_loss=0.3878, tot_loss=1.631, over 2210.00 samples.], 2024-10-21 03:04:10,168 INFO [train.py:682] (1/4) Start epoch 1088 2024-10-21 03:04:30,136 INFO [train.py:561] (1/4) Epoch 1088, batch 8, global_batch_idx: 17400, batch size: 170, loss[dur_loss=0.2589, prior_loss=0.9887, diff_loss=0.3435, tot_loss=1.591, over 170.00 samples.], tot_loss[dur_loss=0.2546, prior_loss=0.9879, diff_loss=0.4136, tot_loss=1.656, over 1432.00 samples.], 2024-10-21 03:04:40,396 INFO [train.py:682] (1/4) Start epoch 1089 2024-10-21 03:04:51,800 INFO [train.py:561] (1/4) Epoch 1089, batch 2, global_batch_idx: 17410, batch size: 203, loss[dur_loss=0.2599, prior_loss=0.9882, diff_loss=0.3524, tot_loss=1.6, over 203.00 samples.], tot_loss[dur_loss=0.2586, prior_loss=0.9884, diff_loss=0.3423, tot_loss=1.589, over 442.00 samples.], 2024-10-21 03:05:06,158 INFO [train.py:561] (1/4) Epoch 1089, batch 12, global_batch_idx: 17420, batch size: 152, loss[dur_loss=0.2553, prior_loss=0.9879, diff_loss=0.2972, tot_loss=1.54, over 152.00 samples.], tot_loss[dur_loss=0.2544, prior_loss=0.9879, diff_loss=0.3886, tot_loss=1.631, over 1966.00 samples.], 2024-10-21 03:05:10,642 INFO [train.py:682] (1/4) Start epoch 1090 2024-10-21 03:05:27,637 INFO [train.py:561] (1/4) Epoch 1090, batch 6, global_batch_idx: 17430, batch size: 106, loss[dur_loss=0.2626, prior_loss=0.9886, diff_loss=0.3352, tot_loss=1.586, over 106.00 samples.], tot_loss[dur_loss=0.251, prior_loss=0.9876, diff_loss=0.4334, tot_loss=1.672, over 1142.00 samples.], 2024-10-21 03:05:40,759 INFO [train.py:682] (1/4) Start epoch 1091 2024-10-21 03:05:49,632 INFO [train.py:561] (1/4) Epoch 1091, batch 0, global_batch_idx: 17440, batch size: 108, loss[dur_loss=0.2636, prior_loss=0.9893, diff_loss=0.3569, tot_loss=1.61, over 108.00 samples.], tot_loss[dur_loss=0.2636, prior_loss=0.9893, diff_loss=0.3569, tot_loss=1.61, over 108.00 samples.], 2024-10-21 03:06:03,901 INFO [train.py:561] (1/4) Epoch 1091, batch 10, global_batch_idx: 17450, batch size: 111, loss[dur_loss=0.2577, prior_loss=0.9893, diff_loss=0.3302, tot_loss=1.577, over 111.00 samples.], tot_loss[dur_loss=0.2543, prior_loss=0.9879, diff_loss=0.4016, tot_loss=1.644, over 1656.00 samples.], 2024-10-21 03:06:11,024 INFO [train.py:682] (1/4) Start epoch 1092 2024-10-21 03:06:24,797 INFO [train.py:561] (1/4) Epoch 1092, batch 4, global_batch_idx: 17460, batch size: 189, loss[dur_loss=0.2614, prior_loss=0.9892, diff_loss=0.3467, tot_loss=1.597, over 189.00 samples.], tot_loss[dur_loss=0.2503, prior_loss=0.9874, diff_loss=0.449, tot_loss=1.687, over 937.00 samples.], 2024-10-21 03:06:39,547 INFO [train.py:561] (1/4) Epoch 1092, batch 14, global_batch_idx: 17470, batch size: 142, loss[dur_loss=0.2568, prior_loss=0.9877, diff_loss=0.3284, tot_loss=1.573, over 142.00 samples.], tot_loss[dur_loss=0.2546, prior_loss=0.988, diff_loss=0.3898, tot_loss=1.632, over 2210.00 samples.], 2024-10-21 03:06:40,949 INFO [train.py:682] (1/4) Start epoch 1093 2024-10-21 03:07:00,739 INFO [train.py:561] (1/4) Epoch 1093, batch 8, global_batch_idx: 17480, batch size: 170, loss[dur_loss=0.2567, prior_loss=0.9883, diff_loss=0.3823, tot_loss=1.627, over 170.00 samples.], tot_loss[dur_loss=0.2536, prior_loss=0.9877, diff_loss=0.4219, tot_loss=1.663, over 1432.00 samples.], 2024-10-21 03:07:10,919 INFO [train.py:682] (1/4) Start epoch 1094 2024-10-21 03:07:22,411 INFO [train.py:561] (1/4) Epoch 1094, batch 2, global_batch_idx: 
17490, batch size: 203, loss[dur_loss=0.257, prior_loss=0.9879, diff_loss=0.3631, tot_loss=1.608, over 203.00 samples.], tot_loss[dur_loss=0.2575, prior_loss=0.9882, diff_loss=0.3505, tot_loss=1.596, over 442.00 samples.], 2024-10-21 03:07:36,577 INFO [train.py:561] (1/4) Epoch 1094, batch 12, global_batch_idx: 17500, batch size: 152, loss[dur_loss=0.2555, prior_loss=0.9882, diff_loss=0.3317, tot_loss=1.575, over 152.00 samples.], tot_loss[dur_loss=0.2541, prior_loss=0.9878, diff_loss=0.3885, tot_loss=1.63, over 1966.00 samples.], 2024-10-21 03:07:40,991 INFO [train.py:682] (1/4) Start epoch 1095 2024-10-21 03:07:57,809 INFO [train.py:561] (1/4) Epoch 1095, batch 6, global_batch_idx: 17510, batch size: 106, loss[dur_loss=0.2572, prior_loss=0.9883, diff_loss=0.3279, tot_loss=1.573, over 106.00 samples.], tot_loss[dur_loss=0.2497, prior_loss=0.9873, diff_loss=0.4296, tot_loss=1.667, over 1142.00 samples.], 2024-10-21 03:08:10,714 INFO [train.py:682] (1/4) Start epoch 1096 2024-10-21 03:08:19,721 INFO [train.py:561] (1/4) Epoch 1096, batch 0, global_batch_idx: 17520, batch size: 108, loss[dur_loss=0.2639, prior_loss=0.9889, diff_loss=0.3219, tot_loss=1.575, over 108.00 samples.], tot_loss[dur_loss=0.2639, prior_loss=0.9889, diff_loss=0.3219, tot_loss=1.575, over 108.00 samples.], 2024-10-21 03:08:33,993 INFO [train.py:561] (1/4) Epoch 1096, batch 10, global_batch_idx: 17530, batch size: 111, loss[dur_loss=0.2599, prior_loss=0.9891, diff_loss=0.3232, tot_loss=1.572, over 111.00 samples.], tot_loss[dur_loss=0.253, prior_loss=0.9877, diff_loss=0.4074, tot_loss=1.648, over 1656.00 samples.], 2024-10-21 03:08:41,156 INFO [train.py:682] (1/4) Start epoch 1097 2024-10-21 03:08:54,944 INFO [train.py:561] (1/4) Epoch 1097, batch 4, global_batch_idx: 17540, batch size: 189, loss[dur_loss=0.257, prior_loss=0.9884, diff_loss=0.3742, tot_loss=1.62, over 189.00 samples.], tot_loss[dur_loss=0.2504, prior_loss=0.9872, diff_loss=0.4653, tot_loss=1.703, over 937.00 samples.], 2024-10-21 03:09:09,835 INFO [train.py:561] (1/4) Epoch 1097, batch 14, global_batch_idx: 17550, batch size: 142, loss[dur_loss=0.2574, prior_loss=0.9877, diff_loss=0.3062, tot_loss=1.551, over 142.00 samples.], tot_loss[dur_loss=0.2546, prior_loss=0.9877, diff_loss=0.3927, tot_loss=1.635, over 2210.00 samples.], 2024-10-21 03:09:11,257 INFO [train.py:682] (1/4) Start epoch 1098 2024-10-21 03:09:31,146 INFO [train.py:561] (1/4) Epoch 1098, batch 8, global_batch_idx: 17560, batch size: 170, loss[dur_loss=0.2583, prior_loss=0.988, diff_loss=0.378, tot_loss=1.624, over 170.00 samples.], tot_loss[dur_loss=0.2535, prior_loss=0.9875, diff_loss=0.4216, tot_loss=1.663, over 1432.00 samples.], 2024-10-21 03:09:41,379 INFO [train.py:682] (1/4) Start epoch 1099 2024-10-21 03:09:52,914 INFO [train.py:561] (1/4) Epoch 1099, batch 2, global_batch_idx: 17570, batch size: 203, loss[dur_loss=0.2552, prior_loss=0.9877, diff_loss=0.3684, tot_loss=1.611, over 203.00 samples.], tot_loss[dur_loss=0.2558, prior_loss=0.988, diff_loss=0.3472, tot_loss=1.591, over 442.00 samples.], 2024-10-21 03:10:07,136 INFO [train.py:561] (1/4) Epoch 1099, batch 12, global_batch_idx: 17580, batch size: 152, loss[dur_loss=0.2561, prior_loss=0.988, diff_loss=0.3809, tot_loss=1.625, over 152.00 samples.], tot_loss[dur_loss=0.2534, prior_loss=0.9877, diff_loss=0.3989, tot_loss=1.64, over 1966.00 samples.], 2024-10-21 03:10:11,602 INFO [train.py:682] (1/4) Start epoch 1100 2024-10-21 03:10:28,694 INFO [train.py:561] (1/4) Epoch 1100, batch 6, global_batch_idx: 17590, batch size: 
106, loss[dur_loss=0.255, prior_loss=0.9883, diff_loss=0.3663, tot_loss=1.61, over 106.00 samples.], tot_loss[dur_loss=0.2511, prior_loss=0.9873, diff_loss=0.4275, tot_loss=1.666, over 1142.00 samples.], 2024-10-21 03:10:41,788 INFO [train.py:682] (1/4) Start epoch 1101 2024-10-21 03:10:50,720 INFO [train.py:561] (1/4) Epoch 1101, batch 0, global_batch_idx: 17600, batch size: 108, loss[dur_loss=0.2597, prior_loss=0.9888, diff_loss=0.3403, tot_loss=1.589, over 108.00 samples.], tot_loss[dur_loss=0.2597, prior_loss=0.9888, diff_loss=0.3403, tot_loss=1.589, over 108.00 samples.], 2024-10-21 03:11:04,982 INFO [train.py:561] (1/4) Epoch 1101, batch 10, global_batch_idx: 17610, batch size: 111, loss[dur_loss=0.2622, prior_loss=0.9894, diff_loss=0.3468, tot_loss=1.598, over 111.00 samples.], tot_loss[dur_loss=0.2541, prior_loss=0.9878, diff_loss=0.4081, tot_loss=1.65, over 1656.00 samples.], 2024-10-21 03:11:12,045 INFO [train.py:682] (1/4) Start epoch 1102 2024-10-21 03:11:26,037 INFO [train.py:561] (1/4) Epoch 1102, batch 4, global_batch_idx: 17620, batch size: 189, loss[dur_loss=0.2621, prior_loss=0.9891, diff_loss=0.3529, tot_loss=1.604, over 189.00 samples.], tot_loss[dur_loss=0.2501, prior_loss=0.9872, diff_loss=0.4524, tot_loss=1.69, over 937.00 samples.], 2024-10-21 03:11:40,871 INFO [train.py:561] (1/4) Epoch 1102, batch 14, global_batch_idx: 17630, batch size: 142, loss[dur_loss=0.2615, prior_loss=0.9879, diff_loss=0.3069, tot_loss=1.556, over 142.00 samples.], tot_loss[dur_loss=0.2543, prior_loss=0.9879, diff_loss=0.3912, tot_loss=1.633, over 2210.00 samples.], 2024-10-21 03:11:42,304 INFO [train.py:682] (1/4) Start epoch 1103 2024-10-21 03:12:02,204 INFO [train.py:561] (1/4) Epoch 1103, batch 8, global_batch_idx: 17640, batch size: 170, loss[dur_loss=0.2571, prior_loss=0.9882, diff_loss=0.3904, tot_loss=1.636, over 170.00 samples.], tot_loss[dur_loss=0.253, prior_loss=0.9877, diff_loss=0.4212, tot_loss=1.662, over 1432.00 samples.], 2024-10-21 03:12:12,287 INFO [train.py:682] (1/4) Start epoch 1104 2024-10-21 03:12:23,898 INFO [train.py:561] (1/4) Epoch 1104, batch 2, global_batch_idx: 17650, batch size: 203, loss[dur_loss=0.2573, prior_loss=0.988, diff_loss=0.3636, tot_loss=1.609, over 203.00 samples.], tot_loss[dur_loss=0.2582, prior_loss=0.9882, diff_loss=0.3307, tot_loss=1.577, over 442.00 samples.], 2024-10-21 03:12:38,170 INFO [train.py:561] (1/4) Epoch 1104, batch 12, global_batch_idx: 17660, batch size: 152, loss[dur_loss=0.2575, prior_loss=0.9879, diff_loss=0.342, tot_loss=1.587, over 152.00 samples.], tot_loss[dur_loss=0.2546, prior_loss=0.9879, diff_loss=0.392, tot_loss=1.635, over 1966.00 samples.], 2024-10-21 03:12:42,626 INFO [train.py:682] (1/4) Start epoch 1105 2024-10-21 03:12:59,731 INFO [train.py:561] (1/4) Epoch 1105, batch 6, global_batch_idx: 17670, batch size: 106, loss[dur_loss=0.26, prior_loss=0.9885, diff_loss=0.2964, tot_loss=1.545, over 106.00 samples.], tot_loss[dur_loss=0.2505, prior_loss=0.9872, diff_loss=0.4344, tot_loss=1.672, over 1142.00 samples.], 2024-10-21 03:13:12,826 INFO [train.py:682] (1/4) Start epoch 1106 2024-10-21 03:13:21,478 INFO [train.py:561] (1/4) Epoch 1106, batch 0, global_batch_idx: 17680, batch size: 108, loss[dur_loss=0.2597, prior_loss=0.9886, diff_loss=0.293, tot_loss=1.541, over 108.00 samples.], tot_loss[dur_loss=0.2597, prior_loss=0.9886, diff_loss=0.293, tot_loss=1.541, over 108.00 samples.], 2024-10-21 03:13:35,693 INFO [train.py:561] (1/4) Epoch 1106, batch 10, global_batch_idx: 17690, batch size: 111, 
loss[dur_loss=0.2609, prior_loss=0.9894, diff_loss=0.3326, tot_loss=1.583, over 111.00 samples.], tot_loss[dur_loss=0.2528, prior_loss=0.9876, diff_loss=0.4012, tot_loss=1.642, over 1656.00 samples.], 2024-10-21 03:13:42,746 INFO [train.py:682] (1/4) Start epoch 1107 2024-10-21 03:13:56,758 INFO [train.py:561] (1/4) Epoch 1107, batch 4, global_batch_idx: 17700, batch size: 189, loss[dur_loss=0.2548, prior_loss=0.9882, diff_loss=0.3475, tot_loss=1.591, over 189.00 samples.], tot_loss[dur_loss=0.2481, prior_loss=0.9869, diff_loss=0.4544, tot_loss=1.689, over 937.00 samples.], 2024-10-21 03:14:11,503 INFO [train.py:561] (1/4) Epoch 1107, batch 14, global_batch_idx: 17710, batch size: 142, loss[dur_loss=0.2571, prior_loss=0.9876, diff_loss=0.3667, tot_loss=1.611, over 142.00 samples.], tot_loss[dur_loss=0.2532, prior_loss=0.9876, diff_loss=0.3877, tot_loss=1.628, over 2210.00 samples.], 2024-10-21 03:14:12,906 INFO [train.py:682] (1/4) Start epoch 1108 2024-10-21 03:14:32,961 INFO [train.py:561] (1/4) Epoch 1108, batch 8, global_batch_idx: 17720, batch size: 170, loss[dur_loss=0.2556, prior_loss=0.988, diff_loss=0.3473, tot_loss=1.591, over 170.00 samples.], tot_loss[dur_loss=0.2532, prior_loss=0.9874, diff_loss=0.4183, tot_loss=1.659, over 1432.00 samples.], 2024-10-21 03:14:43,198 INFO [train.py:682] (1/4) Start epoch 1109 2024-10-21 03:14:54,879 INFO [train.py:561] (1/4) Epoch 1109, batch 2, global_batch_idx: 17730, batch size: 203, loss[dur_loss=0.2526, prior_loss=0.9877, diff_loss=0.3696, tot_loss=1.61, over 203.00 samples.], tot_loss[dur_loss=0.2553, prior_loss=0.9879, diff_loss=0.3369, tot_loss=1.58, over 442.00 samples.], 2024-10-21 03:15:09,153 INFO [train.py:561] (1/4) Epoch 1109, batch 12, global_batch_idx: 17740, batch size: 152, loss[dur_loss=0.2534, prior_loss=0.9876, diff_loss=0.323, tot_loss=1.564, over 152.00 samples.], tot_loss[dur_loss=0.2531, prior_loss=0.9875, diff_loss=0.3936, tot_loss=1.634, over 1966.00 samples.], 2024-10-21 03:15:13,652 INFO [train.py:682] (1/4) Start epoch 1110 2024-10-21 03:15:30,965 INFO [train.py:561] (1/4) Epoch 1110, batch 6, global_batch_idx: 17750, batch size: 106, loss[dur_loss=0.2612, prior_loss=0.9882, diff_loss=0.3197, tot_loss=1.569, over 106.00 samples.], tot_loss[dur_loss=0.2502, prior_loss=0.9871, diff_loss=0.4235, tot_loss=1.661, over 1142.00 samples.], 2024-10-21 03:15:44,130 INFO [train.py:682] (1/4) Start epoch 1111 2024-10-21 03:15:53,021 INFO [train.py:561] (1/4) Epoch 1111, batch 0, global_batch_idx: 17760, batch size: 108, loss[dur_loss=0.2609, prior_loss=0.9887, diff_loss=0.3112, tot_loss=1.561, over 108.00 samples.], tot_loss[dur_loss=0.2609, prior_loss=0.9887, diff_loss=0.3112, tot_loss=1.561, over 108.00 samples.], 2024-10-21 03:16:07,239 INFO [train.py:561] (1/4) Epoch 1111, batch 10, global_batch_idx: 17770, batch size: 111, loss[dur_loss=0.2618, prior_loss=0.9894, diff_loss=0.3179, tot_loss=1.569, over 111.00 samples.], tot_loss[dur_loss=0.2529, prior_loss=0.9876, diff_loss=0.4053, tot_loss=1.646, over 1656.00 samples.], 2024-10-21 03:16:14,361 INFO [train.py:682] (1/4) Start epoch 1112 2024-10-21 03:16:28,135 INFO [train.py:561] (1/4) Epoch 1112, batch 4, global_batch_idx: 17780, batch size: 189, loss[dur_loss=0.2545, prior_loss=0.9879, diff_loss=0.3808, tot_loss=1.623, over 189.00 samples.], tot_loss[dur_loss=0.2476, prior_loss=0.9869, diff_loss=0.4661, tot_loss=1.701, over 937.00 samples.], 2024-10-21 03:16:43,059 INFO [train.py:561] (1/4) Epoch 1112, batch 14, global_batch_idx: 17790, batch size: 142, 
loss[dur_loss=0.2534, prior_loss=0.9873, diff_loss=0.33, tot_loss=1.571, over 142.00 samples.], tot_loss[dur_loss=0.2531, prior_loss=0.9876, diff_loss=0.3954, tot_loss=1.636, over 2210.00 samples.], 2024-10-21 03:16:44,462 INFO [train.py:682] (1/4) Start epoch 1113 2024-10-21 03:17:04,526 INFO [train.py:561] (1/4) Epoch 1113, batch 8, global_batch_idx: 17800, batch size: 170, loss[dur_loss=0.2591, prior_loss=0.9877, diff_loss=0.3354, tot_loss=1.582, over 170.00 samples.], tot_loss[dur_loss=0.2532, prior_loss=0.9873, diff_loss=0.412, tot_loss=1.652, over 1432.00 samples.], 2024-10-21 03:17:14,591 INFO [train.py:682] (1/4) Start epoch 1114 2024-10-21 03:17:26,368 INFO [train.py:561] (1/4) Epoch 1114, batch 2, global_batch_idx: 17810, batch size: 203, loss[dur_loss=0.2534, prior_loss=0.9878, diff_loss=0.3816, tot_loss=1.623, over 203.00 samples.], tot_loss[dur_loss=0.2562, prior_loss=0.9879, diff_loss=0.3604, tot_loss=1.604, over 442.00 samples.], 2024-10-21 03:17:40,580 INFO [train.py:561] (1/4) Epoch 1114, batch 12, global_batch_idx: 17820, batch size: 152, loss[dur_loss=0.2547, prior_loss=0.9878, diff_loss=0.3219, tot_loss=1.564, over 152.00 samples.], tot_loss[dur_loss=0.2534, prior_loss=0.9875, diff_loss=0.3938, tot_loss=1.635, over 1966.00 samples.], 2024-10-21 03:17:45,030 INFO [train.py:682] (1/4) Start epoch 1115 2024-10-21 03:18:02,357 INFO [train.py:561] (1/4) Epoch 1115, batch 6, global_batch_idx: 17830, batch size: 106, loss[dur_loss=0.2569, prior_loss=0.9879, diff_loss=0.3383, tot_loss=1.583, over 106.00 samples.], tot_loss[dur_loss=0.2501, prior_loss=0.987, diff_loss=0.4398, tot_loss=1.677, over 1142.00 samples.], 2024-10-21 03:18:15,434 INFO [train.py:682] (1/4) Start epoch 1116 2024-10-21 03:18:24,597 INFO [train.py:561] (1/4) Epoch 1116, batch 0, global_batch_idx: 17840, batch size: 108, loss[dur_loss=0.2595, prior_loss=0.9883, diff_loss=0.3203, tot_loss=1.568, over 108.00 samples.], tot_loss[dur_loss=0.2595, prior_loss=0.9883, diff_loss=0.3203, tot_loss=1.568, over 108.00 samples.], 2024-10-21 03:18:38,954 INFO [train.py:561] (1/4) Epoch 1116, batch 10, global_batch_idx: 17850, batch size: 111, loss[dur_loss=0.2589, prior_loss=0.9892, diff_loss=0.3378, tot_loss=1.586, over 111.00 samples.], tot_loss[dur_loss=0.2525, prior_loss=0.9874, diff_loss=0.4147, tot_loss=1.655, over 1656.00 samples.], 2024-10-21 03:18:46,125 INFO [train.py:682] (1/4) Start epoch 1117 2024-10-21 03:19:00,275 INFO [train.py:561] (1/4) Epoch 1117, batch 4, global_batch_idx: 17860, batch size: 189, loss[dur_loss=0.2519, prior_loss=0.9879, diff_loss=0.3667, tot_loss=1.607, over 189.00 samples.], tot_loss[dur_loss=0.2486, prior_loss=0.9868, diff_loss=0.4513, tot_loss=1.687, over 937.00 samples.], 2024-10-21 03:19:15,325 INFO [train.py:561] (1/4) Epoch 1117, batch 14, global_batch_idx: 17870, batch size: 142, loss[dur_loss=0.2528, prior_loss=0.9877, diff_loss=0.3711, tot_loss=1.612, over 142.00 samples.], tot_loss[dur_loss=0.2529, prior_loss=0.9874, diff_loss=0.3913, tot_loss=1.632, over 2210.00 samples.], 2024-10-21 03:19:16,756 INFO [train.py:682] (1/4) Start epoch 1118 2024-10-21 03:19:36,758 INFO [train.py:561] (1/4) Epoch 1118, batch 8, global_batch_idx: 17880, batch size: 170, loss[dur_loss=0.2572, prior_loss=0.9876, diff_loss=0.3187, tot_loss=1.564, over 170.00 samples.], tot_loss[dur_loss=0.2514, prior_loss=0.9872, diff_loss=0.4046, tot_loss=1.643, over 1432.00 samples.], 2024-10-21 03:19:46,896 INFO [train.py:682] (1/4) Start epoch 1119 2024-10-21 03:19:58,272 INFO [train.py:561] (1/4) Epoch 1119, 
batch 2, global_batch_idx: 17890, batch size: 203, loss[dur_loss=0.2557, prior_loss=0.9875, diff_loss=0.365, tot_loss=1.608, over 203.00 samples.], tot_loss[dur_loss=0.256, prior_loss=0.9878, diff_loss=0.3518, tot_loss=1.596, over 442.00 samples.], 2024-10-21 03:20:12,545 INFO [train.py:561] (1/4) Epoch 1119, batch 12, global_batch_idx: 17900, batch size: 152, loss[dur_loss=0.2551, prior_loss=0.9875, diff_loss=0.3624, tot_loss=1.605, over 152.00 samples.], tot_loss[dur_loss=0.252, prior_loss=0.9873, diff_loss=0.3964, tot_loss=1.636, over 1966.00 samples.], 2024-10-21 03:20:17,022 INFO [train.py:682] (1/4) Start epoch 1120 2024-10-21 03:20:34,786 INFO [train.py:561] (1/4) Epoch 1120, batch 6, global_batch_idx: 17910, batch size: 106, loss[dur_loss=0.2536, prior_loss=0.9879, diff_loss=0.3417, tot_loss=1.583, over 106.00 samples.], tot_loss[dur_loss=0.2522, prior_loss=0.987, diff_loss=0.4327, tot_loss=1.672, over 1142.00 samples.], 2024-10-21 03:20:47,902 INFO [train.py:682] (1/4) Start epoch 1121 2024-10-21 03:20:57,041 INFO [train.py:561] (1/4) Epoch 1121, batch 0, global_batch_idx: 17920, batch size: 108, loss[dur_loss=0.2586, prior_loss=0.9884, diff_loss=0.3434, tot_loss=1.59, over 108.00 samples.], tot_loss[dur_loss=0.2586, prior_loss=0.9884, diff_loss=0.3434, tot_loss=1.59, over 108.00 samples.], 2024-10-21 03:21:11,307 INFO [train.py:561] (1/4) Epoch 1121, batch 10, global_batch_idx: 17930, batch size: 111, loss[dur_loss=0.2598, prior_loss=0.989, diff_loss=0.3824, tot_loss=1.631, over 111.00 samples.], tot_loss[dur_loss=0.252, prior_loss=0.9873, diff_loss=0.4084, tot_loss=1.648, over 1656.00 samples.], 2024-10-21 03:21:18,464 INFO [train.py:682] (1/4) Start epoch 1122 2024-10-21 03:21:32,294 INFO [train.py:561] (1/4) Epoch 1122, batch 4, global_batch_idx: 17940, batch size: 189, loss[dur_loss=0.2537, prior_loss=0.9878, diff_loss=0.3951, tot_loss=1.637, over 189.00 samples.], tot_loss[dur_loss=0.248, prior_loss=0.9867, diff_loss=0.46, tot_loss=1.695, over 937.00 samples.], 2024-10-21 03:21:47,228 INFO [train.py:561] (1/4) Epoch 1122, batch 14, global_batch_idx: 17950, batch size: 142, loss[dur_loss=0.2524, prior_loss=0.9872, diff_loss=0.3367, tot_loss=1.576, over 142.00 samples.], tot_loss[dur_loss=0.2522, prior_loss=0.9874, diff_loss=0.3857, tot_loss=1.625, over 2210.00 samples.], 2024-10-21 03:21:48,661 INFO [train.py:682] (1/4) Start epoch 1123 2024-10-21 03:22:12,224 INFO [train.py:561] (1/4) Epoch 1123, batch 8, global_batch_idx: 17960, batch size: 170, loss[dur_loss=0.2546, prior_loss=0.9875, diff_loss=0.3631, tot_loss=1.605, over 170.00 samples.], tot_loss[dur_loss=0.2507, prior_loss=0.987, diff_loss=0.4138, tot_loss=1.652, over 1432.00 samples.], 2024-10-21 03:22:22,428 INFO [train.py:682] (1/4) Start epoch 1124 2024-10-21 03:22:34,103 INFO [train.py:561] (1/4) Epoch 1124, batch 2, global_batch_idx: 17970, batch size: 203, loss[dur_loss=0.2535, prior_loss=0.9876, diff_loss=0.3855, tot_loss=1.627, over 203.00 samples.], tot_loss[dur_loss=0.2541, prior_loss=0.9877, diff_loss=0.3602, tot_loss=1.602, over 442.00 samples.], 2024-10-21 03:22:48,387 INFO [train.py:561] (1/4) Epoch 1124, batch 12, global_batch_idx: 17980, batch size: 152, loss[dur_loss=0.2526, prior_loss=0.9875, diff_loss=0.3433, tot_loss=1.583, over 152.00 samples.], tot_loss[dur_loss=0.2525, prior_loss=0.9872, diff_loss=0.3984, tot_loss=1.638, over 1966.00 samples.], 2024-10-21 03:22:52,837 INFO [train.py:682] (1/4) Start epoch 1125 2024-10-21 03:23:09,849 INFO [train.py:561] (1/4) Epoch 1125, batch 6, 
global_batch_idx: 17990, batch size: 106, loss[dur_loss=0.2542, prior_loss=0.988, diff_loss=0.3009, tot_loss=1.543, over 106.00 samples.], tot_loss[dur_loss=0.2491, prior_loss=0.9868, diff_loss=0.4285, tot_loss=1.664, over 1142.00 samples.], 2024-10-21 03:23:22,971 INFO [train.py:682] (1/4) Start epoch 1126 2024-10-21 03:23:32,026 INFO [train.py:561] (1/4) Epoch 1126, batch 0, global_batch_idx: 18000, batch size: 108, loss[dur_loss=0.2588, prior_loss=0.9884, diff_loss=0.3365, tot_loss=1.584, over 108.00 samples.], tot_loss[dur_loss=0.2588, prior_loss=0.9884, diff_loss=0.3365, tot_loss=1.584, over 108.00 samples.], 2024-10-21 03:23:33,447 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 03:24:25,896 INFO [train.py:589] (1/4) Epoch 1126, validation: dur_loss=0.4461, prior_loss=1.032, diff_loss=0.387, tot_loss=1.866, over 100.00 samples. 2024-10-21 03:24:25,897 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 03:24:39,032 INFO [train.py:561] (1/4) Epoch 1126, batch 10, global_batch_idx: 18010, batch size: 111, loss[dur_loss=0.2566, prior_loss=0.9888, diff_loss=0.3261, tot_loss=1.571, over 111.00 samples.], tot_loss[dur_loss=0.2504, prior_loss=0.9871, diff_loss=0.4074, tot_loss=1.645, over 1656.00 samples.], 2024-10-21 03:24:46,157 INFO [train.py:682] (1/4) Start epoch 1127 2024-10-21 03:25:00,286 INFO [train.py:561] (1/4) Epoch 1127, batch 4, global_batch_idx: 18020, batch size: 189, loss[dur_loss=0.255, prior_loss=0.9878, diff_loss=0.3831, tot_loss=1.626, over 189.00 samples.], tot_loss[dur_loss=0.2461, prior_loss=0.9866, diff_loss=0.4572, tot_loss=1.69, over 937.00 samples.], 2024-10-21 03:25:15,272 INFO [train.py:561] (1/4) Epoch 1127, batch 14, global_batch_idx: 18030, batch size: 142, loss[dur_loss=0.2508, prior_loss=0.9872, diff_loss=0.3179, tot_loss=1.556, over 142.00 samples.], tot_loss[dur_loss=0.2512, prior_loss=0.9872, diff_loss=0.3874, tot_loss=1.626, over 2210.00 samples.], 2024-10-21 03:25:16,701 INFO [train.py:682] (1/4) Start epoch 1128 2024-10-21 03:25:37,225 INFO [train.py:561] (1/4) Epoch 1128, batch 8, global_batch_idx: 18040, batch size: 170, loss[dur_loss=0.2542, prior_loss=0.9874, diff_loss=0.3391, tot_loss=1.581, over 170.00 samples.], tot_loss[dur_loss=0.2489, prior_loss=0.9868, diff_loss=0.4006, tot_loss=1.636, over 1432.00 samples.], 2024-10-21 03:25:47,381 INFO [train.py:682] (1/4) Start epoch 1129 2024-10-21 03:25:59,231 INFO [train.py:561] (1/4) Epoch 1129, batch 2, global_batch_idx: 18050, batch size: 203, loss[dur_loss=0.2508, prior_loss=0.9873, diff_loss=0.3528, tot_loss=1.591, over 203.00 samples.], tot_loss[dur_loss=0.2518, prior_loss=0.9875, diff_loss=0.3299, tot_loss=1.569, over 442.00 samples.], 2024-10-21 03:26:13,608 INFO [train.py:561] (1/4) Epoch 1129, batch 12, global_batch_idx: 18060, batch size: 152, loss[dur_loss=0.2533, prior_loss=0.9875, diff_loss=0.3524, tot_loss=1.593, over 152.00 samples.], tot_loss[dur_loss=0.251, prior_loss=0.9871, diff_loss=0.3997, tot_loss=1.638, over 1966.00 samples.], 2024-10-21 03:26:18,093 INFO [train.py:682] (1/4) Start epoch 1130 2024-10-21 03:26:35,358 INFO [train.py:561] (1/4) Epoch 1130, batch 6, global_batch_idx: 18070, batch size: 106, loss[dur_loss=0.2569, prior_loss=0.9878, diff_loss=0.332, tot_loss=1.577, over 106.00 samples.], tot_loss[dur_loss=0.249, prior_loss=0.9868, diff_loss=0.4401, tot_loss=1.676, over 1142.00 samples.], 2024-10-21 03:26:48,386 INFO [train.py:682] (1/4) Start epoch 1131 2024-10-21 03:26:57,728 INFO [train.py:561] (1/4) Epoch 1131, batch 0, 
global_batch_idx: 18080, batch size: 108, loss[dur_loss=0.2582, prior_loss=0.9884, diff_loss=0.3399, tot_loss=1.587, over 108.00 samples.], tot_loss[dur_loss=0.2582, prior_loss=0.9884, diff_loss=0.3399, tot_loss=1.587, over 108.00 samples.], 2024-10-21 03:27:12,122 INFO [train.py:561] (1/4) Epoch 1131, batch 10, global_batch_idx: 18090, batch size: 111, loss[dur_loss=0.2601, prior_loss=0.9886, diff_loss=0.3257, tot_loss=1.574, over 111.00 samples.], tot_loss[dur_loss=0.2516, prior_loss=0.9871, diff_loss=0.41, tot_loss=1.649, over 1656.00 samples.], 2024-10-21 03:27:19,305 INFO [train.py:682] (1/4) Start epoch 1132 2024-10-21 03:27:33,721 INFO [train.py:561] (1/4) Epoch 1132, batch 4, global_batch_idx: 18100, batch size: 189, loss[dur_loss=0.2581, prior_loss=0.9882, diff_loss=0.3778, tot_loss=1.624, over 189.00 samples.], tot_loss[dur_loss=0.249, prior_loss=0.9865, diff_loss=0.4515, tot_loss=1.687, over 937.00 samples.], 2024-10-21 03:27:48,582 INFO [train.py:561] (1/4) Epoch 1132, batch 14, global_batch_idx: 18110, batch size: 142, loss[dur_loss=0.2546, prior_loss=0.9874, diff_loss=0.3451, tot_loss=1.587, over 142.00 samples.], tot_loss[dur_loss=0.2526, prior_loss=0.9874, diff_loss=0.3878, tot_loss=1.628, over 2210.00 samples.], 2024-10-21 03:27:49,999 INFO [train.py:682] (1/4) Start epoch 1133 2024-10-21 03:28:10,254 INFO [train.py:561] (1/4) Epoch 1133, batch 8, global_batch_idx: 18120, batch size: 170, loss[dur_loss=0.2571, prior_loss=0.9878, diff_loss=0.3361, tot_loss=1.581, over 170.00 samples.], tot_loss[dur_loss=0.2502, prior_loss=0.987, diff_loss=0.4167, tot_loss=1.654, over 1432.00 samples.], 2024-10-21 03:28:20,497 INFO [train.py:682] (1/4) Start epoch 1134 2024-10-21 03:28:32,152 INFO [train.py:561] (1/4) Epoch 1134, batch 2, global_batch_idx: 18130, batch size: 203, loss[dur_loss=0.2541, prior_loss=0.9873, diff_loss=0.3642, tot_loss=1.606, over 203.00 samples.], tot_loss[dur_loss=0.2555, prior_loss=0.9875, diff_loss=0.3431, tot_loss=1.586, over 442.00 samples.], 2024-10-21 03:28:46,437 INFO [train.py:561] (1/4) Epoch 1134, batch 12, global_batch_idx: 18140, batch size: 152, loss[dur_loss=0.2516, prior_loss=0.9875, diff_loss=0.3364, tot_loss=1.576, over 152.00 samples.], tot_loss[dur_loss=0.2512, prior_loss=0.9872, diff_loss=0.3941, tot_loss=1.632, over 1966.00 samples.], 2024-10-21 03:28:50,902 INFO [train.py:682] (1/4) Start epoch 1135 2024-10-21 03:29:07,909 INFO [train.py:561] (1/4) Epoch 1135, batch 6, global_batch_idx: 18150, batch size: 106, loss[dur_loss=0.2579, prior_loss=0.9878, diff_loss=0.3241, tot_loss=1.57, over 106.00 samples.], tot_loss[dur_loss=0.2481, prior_loss=0.9866, diff_loss=0.4292, tot_loss=1.664, over 1142.00 samples.], 2024-10-21 03:29:20,913 INFO [train.py:682] (1/4) Start epoch 1136 2024-10-21 03:29:30,174 INFO [train.py:561] (1/4) Epoch 1136, batch 0, global_batch_idx: 18160, batch size: 108, loss[dur_loss=0.2583, prior_loss=0.9882, diff_loss=0.3441, tot_loss=1.591, over 108.00 samples.], tot_loss[dur_loss=0.2583, prior_loss=0.9882, diff_loss=0.3441, tot_loss=1.591, over 108.00 samples.], 2024-10-21 03:29:44,627 INFO [train.py:561] (1/4) Epoch 1136, batch 10, global_batch_idx: 18170, batch size: 111, loss[dur_loss=0.2575, prior_loss=0.9886, diff_loss=0.3213, tot_loss=1.567, over 111.00 samples.], tot_loss[dur_loss=0.2489, prior_loss=0.9869, diff_loss=0.3957, tot_loss=1.632, over 1656.00 samples.], 2024-10-21 03:29:51,759 INFO [train.py:682] (1/4) Start epoch 1137 2024-10-21 03:30:05,866 INFO [train.py:561] (1/4) Epoch 1137, batch 4, global_batch_idx: 
18180, batch size: 189, loss[dur_loss=0.2609, prior_loss=0.9882, diff_loss=0.3703, tot_loss=1.619, over 189.00 samples.], tot_loss[dur_loss=0.2481, prior_loss=0.9867, diff_loss=0.4549, tot_loss=1.69, over 937.00 samples.], 2024-10-21 03:30:20,807 INFO [train.py:561] (1/4) Epoch 1137, batch 14, global_batch_idx: 18190, batch size: 142, loss[dur_loss=0.2538, prior_loss=0.9872, diff_loss=0.3171, tot_loss=1.558, over 142.00 samples.], tot_loss[dur_loss=0.2514, prior_loss=0.9872, diff_loss=0.3901, tot_loss=1.629, over 2210.00 samples.], 2024-10-21 03:30:22,241 INFO [train.py:682] (1/4) Start epoch 1138 2024-10-21 03:30:42,353 INFO [train.py:561] (1/4) Epoch 1138, batch 8, global_batch_idx: 18200, batch size: 170, loss[dur_loss=0.2588, prior_loss=0.988, diff_loss=0.3573, tot_loss=1.604, over 170.00 samples.], tot_loss[dur_loss=0.2503, prior_loss=0.987, diff_loss=0.4222, tot_loss=1.659, over 1432.00 samples.], 2024-10-21 03:30:52,418 INFO [train.py:682] (1/4) Start epoch 1139 2024-10-21 03:31:03,981 INFO [train.py:561] (1/4) Epoch 1139, batch 2, global_batch_idx: 18210, batch size: 203, loss[dur_loss=0.2537, prior_loss=0.9872, diff_loss=0.3882, tot_loss=1.629, over 203.00 samples.], tot_loss[dur_loss=0.2559, prior_loss=0.9876, diff_loss=0.353, tot_loss=1.596, over 442.00 samples.], 2024-10-21 03:31:18,197 INFO [train.py:561] (1/4) Epoch 1139, batch 12, global_batch_idx: 18220, batch size: 152, loss[dur_loss=0.2535, prior_loss=0.9876, diff_loss=0.3551, tot_loss=1.596, over 152.00 samples.], tot_loss[dur_loss=0.251, prior_loss=0.9872, diff_loss=0.3968, tot_loss=1.635, over 1966.00 samples.], 2024-10-21 03:31:22,633 INFO [train.py:682] (1/4) Start epoch 1140 2024-10-21 03:31:39,760 INFO [train.py:561] (1/4) Epoch 1140, batch 6, global_batch_idx: 18230, batch size: 106, loss[dur_loss=0.2585, prior_loss=0.9877, diff_loss=0.3128, tot_loss=1.559, over 106.00 samples.], tot_loss[dur_loss=0.2495, prior_loss=0.9867, diff_loss=0.4317, tot_loss=1.668, over 1142.00 samples.], 2024-10-21 03:31:52,700 INFO [train.py:682] (1/4) Start epoch 1141 2024-10-21 03:32:01,501 INFO [train.py:561] (1/4) Epoch 1141, batch 0, global_batch_idx: 18240, batch size: 108, loss[dur_loss=0.26, prior_loss=0.9881, diff_loss=0.2923, tot_loss=1.54, over 108.00 samples.], tot_loss[dur_loss=0.26, prior_loss=0.9881, diff_loss=0.2923, tot_loss=1.54, over 108.00 samples.], 2024-10-21 03:32:15,801 INFO [train.py:561] (1/4) Epoch 1141, batch 10, global_batch_idx: 18250, batch size: 111, loss[dur_loss=0.2599, prior_loss=0.9886, diff_loss=0.3289, tot_loss=1.578, over 111.00 samples.], tot_loss[dur_loss=0.2504, prior_loss=0.9869, diff_loss=0.4032, tot_loss=1.64, over 1656.00 samples.], 2024-10-21 03:32:22,884 INFO [train.py:682] (1/4) Start epoch 1142 2024-10-21 03:32:36,995 INFO [train.py:561] (1/4) Epoch 1142, batch 4, global_batch_idx: 18260, batch size: 189, loss[dur_loss=0.2489, prior_loss=0.9874, diff_loss=0.3678, tot_loss=1.604, over 189.00 samples.], tot_loss[dur_loss=0.2461, prior_loss=0.9864, diff_loss=0.4561, tot_loss=1.689, over 937.00 samples.], 2024-10-21 03:32:51,784 INFO [train.py:561] (1/4) Epoch 1142, batch 14, global_batch_idx: 18270, batch size: 142, loss[dur_loss=0.2517, prior_loss=0.9873, diff_loss=0.3313, tot_loss=1.57, over 142.00 samples.], tot_loss[dur_loss=0.251, prior_loss=0.9871, diff_loss=0.3936, tot_loss=1.632, over 2210.00 samples.], 2024-10-21 03:32:53,214 INFO [train.py:682] (1/4) Start epoch 1143 2024-10-21 03:33:13,508 INFO [train.py:561] (1/4) Epoch 1143, batch 8, global_batch_idx: 18280, batch size: 170, 
loss[dur_loss=0.2544, prior_loss=0.9876, diff_loss=0.3564, tot_loss=1.598, over 170.00 samples.], tot_loss[dur_loss=0.2502, prior_loss=0.9868, diff_loss=0.4077, tot_loss=1.645, over 1432.00 samples.], 2024-10-21 03:33:23,654 INFO [train.py:682] (1/4) Start epoch 1144 2024-10-21 03:33:35,168 INFO [train.py:561] (1/4) Epoch 1144, batch 2, global_batch_idx: 18290, batch size: 203, loss[dur_loss=0.2539, prior_loss=0.9872, diff_loss=0.3856, tot_loss=1.627, over 203.00 samples.], tot_loss[dur_loss=0.2549, prior_loss=0.9875, diff_loss=0.3447, tot_loss=1.587, over 442.00 samples.], 2024-10-21 03:33:49,445 INFO [train.py:561] (1/4) Epoch 1144, batch 12, global_batch_idx: 18300, batch size: 152, loss[dur_loss=0.2549, prior_loss=0.9876, diff_loss=0.3369, tot_loss=1.579, over 152.00 samples.], tot_loss[dur_loss=0.2512, prior_loss=0.9871, diff_loss=0.3927, tot_loss=1.631, over 1966.00 samples.], 2024-10-21 03:33:53,941 INFO [train.py:682] (1/4) Start epoch 1145 2024-10-21 03:34:13,169 INFO [train.py:561] (1/4) Epoch 1145, batch 6, global_batch_idx: 18310, batch size: 106, loss[dur_loss=0.2534, prior_loss=0.9874, diff_loss=0.3628, tot_loss=1.604, over 106.00 samples.], tot_loss[dur_loss=0.2492, prior_loss=0.9867, diff_loss=0.4472, tot_loss=1.683, over 1142.00 samples.], 2024-10-21 03:34:26,299 INFO [train.py:682] (1/4) Start epoch 1146 2024-10-21 03:34:35,276 INFO [train.py:561] (1/4) Epoch 1146, batch 0, global_batch_idx: 18320, batch size: 108, loss[dur_loss=0.2589, prior_loss=0.9882, diff_loss=0.3088, tot_loss=1.556, over 108.00 samples.], tot_loss[dur_loss=0.2589, prior_loss=0.9882, diff_loss=0.3088, tot_loss=1.556, over 108.00 samples.], 2024-10-21 03:34:49,555 INFO [train.py:561] (1/4) Epoch 1146, batch 10, global_batch_idx: 18330, batch size: 111, loss[dur_loss=0.2586, prior_loss=0.9887, diff_loss=0.3374, tot_loss=1.585, over 111.00 samples.], tot_loss[dur_loss=0.2507, prior_loss=0.987, diff_loss=0.4071, tot_loss=1.645, over 1656.00 samples.], 2024-10-21 03:34:56,648 INFO [train.py:682] (1/4) Start epoch 1147 2024-10-21 03:35:10,462 INFO [train.py:561] (1/4) Epoch 1147, batch 4, global_batch_idx: 18340, batch size: 189, loss[dur_loss=0.2536, prior_loss=0.9876, diff_loss=0.3421, tot_loss=1.583, over 189.00 samples.], tot_loss[dur_loss=0.2471, prior_loss=0.9863, diff_loss=0.441, tot_loss=1.674, over 937.00 samples.], 2024-10-21 03:35:25,283 INFO [train.py:561] (1/4) Epoch 1147, batch 14, global_batch_idx: 18350, batch size: 142, loss[dur_loss=0.2529, prior_loss=0.9872, diff_loss=0.3359, tot_loss=1.576, over 142.00 samples.], tot_loss[dur_loss=0.2507, prior_loss=0.9871, diff_loss=0.3773, tot_loss=1.615, over 2210.00 samples.], 2024-10-21 03:35:26,715 INFO [train.py:682] (1/4) Start epoch 1148 2024-10-21 03:35:47,133 INFO [train.py:561] (1/4) Epoch 1148, batch 8, global_batch_idx: 18360, batch size: 170, loss[dur_loss=0.2553, prior_loss=0.9879, diff_loss=0.3648, tot_loss=1.608, over 170.00 samples.], tot_loss[dur_loss=0.2504, prior_loss=0.9869, diff_loss=0.4208, tot_loss=1.658, over 1432.00 samples.], 2024-10-21 03:35:57,249 INFO [train.py:682] (1/4) Start epoch 1149 2024-10-21 03:36:08,805 INFO [train.py:561] (1/4) Epoch 1149, batch 2, global_batch_idx: 18370, batch size: 203, loss[dur_loss=0.2551, prior_loss=0.9873, diff_loss=0.3585, tot_loss=1.601, over 203.00 samples.], tot_loss[dur_loss=0.2543, prior_loss=0.9873, diff_loss=0.3485, tot_loss=1.59, over 442.00 samples.], 2024-10-21 03:36:22,973 INFO [train.py:561] (1/4) Epoch 1149, batch 12, global_batch_idx: 18380, batch size: 152, 
loss[dur_loss=0.255, prior_loss=0.9872, diff_loss=0.3542, tot_loss=1.596, over 152.00 samples.], tot_loss[dur_loss=0.2506, prior_loss=0.9869, diff_loss=0.3942, tot_loss=1.632, over 1966.00 samples.], 2024-10-21 03:36:27,391 INFO [train.py:682] (1/4) Start epoch 1150 2024-10-21 03:36:44,853 INFO [train.py:561] (1/4) Epoch 1150, batch 6, global_batch_idx: 18390, batch size: 106, loss[dur_loss=0.2526, prior_loss=0.9874, diff_loss=0.3329, tot_loss=1.573, over 106.00 samples.], tot_loss[dur_loss=0.2457, prior_loss=0.9864, diff_loss=0.4279, tot_loss=1.66, over 1142.00 samples.], 2024-10-21 03:36:57,980 INFO [train.py:682] (1/4) Start epoch 1151 2024-10-21 03:37:07,056 INFO [train.py:561] (1/4) Epoch 1151, batch 0, global_batch_idx: 18400, batch size: 108, loss[dur_loss=0.2587, prior_loss=0.988, diff_loss=0.3369, tot_loss=1.584, over 108.00 samples.], tot_loss[dur_loss=0.2587, prior_loss=0.988, diff_loss=0.3369, tot_loss=1.584, over 108.00 samples.], 2024-10-21 03:37:21,306 INFO [train.py:561] (1/4) Epoch 1151, batch 10, global_batch_idx: 18410, batch size: 111, loss[dur_loss=0.2605, prior_loss=0.9887, diff_loss=0.3534, tot_loss=1.603, over 111.00 samples.], tot_loss[dur_loss=0.2497, prior_loss=0.9871, diff_loss=0.4079, tot_loss=1.645, over 1656.00 samples.], 2024-10-21 03:37:28,388 INFO [train.py:682] (1/4) Start epoch 1152 2024-10-21 03:37:42,184 INFO [train.py:561] (1/4) Epoch 1152, batch 4, global_batch_idx: 18420, batch size: 189, loss[dur_loss=0.2531, prior_loss=0.9882, diff_loss=0.3668, tot_loss=1.608, over 189.00 samples.], tot_loss[dur_loss=0.2466, prior_loss=0.9864, diff_loss=0.4575, tot_loss=1.691, over 937.00 samples.], 2024-10-21 03:37:57,075 INFO [train.py:561] (1/4) Epoch 1152, batch 14, global_batch_idx: 18430, batch size: 142, loss[dur_loss=0.2526, prior_loss=0.9869, diff_loss=0.3566, tot_loss=1.596, over 142.00 samples.], tot_loss[dur_loss=0.2514, prior_loss=0.9873, diff_loss=0.394, tot_loss=1.633, over 2210.00 samples.], 2024-10-21 03:37:58,519 INFO [train.py:682] (1/4) Start epoch 1153 2024-10-21 03:38:18,445 INFO [train.py:561] (1/4) Epoch 1153, batch 8, global_batch_idx: 18440, batch size: 170, loss[dur_loss=0.2556, prior_loss=0.9874, diff_loss=0.3403, tot_loss=1.583, over 170.00 samples.], tot_loss[dur_loss=0.2492, prior_loss=0.9867, diff_loss=0.4049, tot_loss=1.641, over 1432.00 samples.], 2024-10-21 03:38:28,790 INFO [train.py:682] (1/4) Start epoch 1154 2024-10-21 03:38:40,636 INFO [train.py:561] (1/4) Epoch 1154, batch 2, global_batch_idx: 18450, batch size: 203, loss[dur_loss=0.2518, prior_loss=0.9871, diff_loss=0.3691, tot_loss=1.608, over 203.00 samples.], tot_loss[dur_loss=0.2534, prior_loss=0.9873, diff_loss=0.3458, tot_loss=1.587, over 442.00 samples.], 2024-10-21 03:38:54,921 INFO [train.py:561] (1/4) Epoch 1154, batch 12, global_batch_idx: 18460, batch size: 152, loss[dur_loss=0.2504, prior_loss=0.9874, diff_loss=0.3779, tot_loss=1.616, over 152.00 samples.], tot_loss[dur_loss=0.2503, prior_loss=0.987, diff_loss=0.3938, tot_loss=1.631, over 1966.00 samples.], 2024-10-21 03:38:59,382 INFO [train.py:682] (1/4) Start epoch 1155 2024-10-21 03:39:16,467 INFO [train.py:561] (1/4) Epoch 1155, batch 6, global_batch_idx: 18470, batch size: 106, loss[dur_loss=0.2549, prior_loss=0.9877, diff_loss=0.358, tot_loss=1.601, over 106.00 samples.], tot_loss[dur_loss=0.2464, prior_loss=0.9866, diff_loss=0.4487, tot_loss=1.682, over 1142.00 samples.], 2024-10-21 03:39:29,588 INFO [train.py:682] (1/4) Start epoch 1156 2024-10-21 03:39:38,189 INFO [train.py:561] (1/4) Epoch 1156, 
batch 0, global_batch_idx: 18480, batch size: 108, loss[dur_loss=0.2564, prior_loss=0.988, diff_loss=0.3275, tot_loss=1.572, over 108.00 samples.], tot_loss[dur_loss=0.2564, prior_loss=0.988, diff_loss=0.3275, tot_loss=1.572, over 108.00 samples.], 2024-10-21 03:39:52,481 INFO [train.py:561] (1/4) Epoch 1156, batch 10, global_batch_idx: 18490, batch size: 111, loss[dur_loss=0.2606, prior_loss=0.9885, diff_loss=0.3202, tot_loss=1.569, over 111.00 samples.], tot_loss[dur_loss=0.2512, prior_loss=0.9869, diff_loss=0.4014, tot_loss=1.64, over 1656.00 samples.], 2024-10-21 03:39:59,657 INFO [train.py:682] (1/4) Start epoch 1157 2024-10-21 03:40:13,549 INFO [train.py:561] (1/4) Epoch 1157, batch 4, global_batch_idx: 18500, batch size: 189, loss[dur_loss=0.2513, prior_loss=0.9877, diff_loss=0.3478, tot_loss=1.587, over 189.00 samples.], tot_loss[dur_loss=0.2456, prior_loss=0.9863, diff_loss=0.452, tot_loss=1.684, over 937.00 samples.], 2024-10-21 03:40:28,503 INFO [train.py:561] (1/4) Epoch 1157, batch 14, global_batch_idx: 18510, batch size: 142, loss[dur_loss=0.2515, prior_loss=0.9869, diff_loss=0.3546, tot_loss=1.593, over 142.00 samples.], tot_loss[dur_loss=0.2499, prior_loss=0.9869, diff_loss=0.3852, tot_loss=1.622, over 2210.00 samples.], 2024-10-21 03:40:29,962 INFO [train.py:682] (1/4) Start epoch 1158 2024-10-21 03:40:50,221 INFO [train.py:561] (1/4) Epoch 1158, batch 8, global_batch_idx: 18520, batch size: 170, loss[dur_loss=0.2546, prior_loss=0.9875, diff_loss=0.3892, tot_loss=1.631, over 170.00 samples.], tot_loss[dur_loss=0.2484, prior_loss=0.9866, diff_loss=0.4158, tot_loss=1.651, over 1432.00 samples.], 2024-10-21 03:41:00,475 INFO [train.py:682] (1/4) Start epoch 1159 2024-10-21 03:41:11,756 INFO [train.py:561] (1/4) Epoch 1159, batch 2, global_batch_idx: 18530, batch size: 203, loss[dur_loss=0.2515, prior_loss=0.9868, diff_loss=0.3347, tot_loss=1.573, over 203.00 samples.], tot_loss[dur_loss=0.252, prior_loss=0.9871, diff_loss=0.3259, tot_loss=1.565, over 442.00 samples.], 2024-10-21 03:41:26,161 INFO [train.py:561] (1/4) Epoch 1159, batch 12, global_batch_idx: 18540, batch size: 152, loss[dur_loss=0.2508, prior_loss=0.9875, diff_loss=0.3524, tot_loss=1.591, over 152.00 samples.], tot_loss[dur_loss=0.25, prior_loss=0.987, diff_loss=0.4023, tot_loss=1.639, over 1966.00 samples.], 2024-10-21 03:41:30,679 INFO [train.py:682] (1/4) Start epoch 1160 2024-10-21 03:41:47,728 INFO [train.py:561] (1/4) Epoch 1160, batch 6, global_batch_idx: 18550, batch size: 106, loss[dur_loss=0.2501, prior_loss=0.9871, diff_loss=0.3539, tot_loss=1.591, over 106.00 samples.], tot_loss[dur_loss=0.246, prior_loss=0.9864, diff_loss=0.4341, tot_loss=1.666, over 1142.00 samples.], 2024-10-21 03:42:00,850 INFO [train.py:682] (1/4) Start epoch 1161 2024-10-21 03:42:09,566 INFO [train.py:561] (1/4) Epoch 1161, batch 0, global_batch_idx: 18560, batch size: 108, loss[dur_loss=0.2589, prior_loss=0.988, diff_loss=0.315, tot_loss=1.562, over 108.00 samples.], tot_loss[dur_loss=0.2589, prior_loss=0.988, diff_loss=0.315, tot_loss=1.562, over 108.00 samples.], 2024-10-21 03:42:23,994 INFO [train.py:561] (1/4) Epoch 1161, batch 10, global_batch_idx: 18570, batch size: 111, loss[dur_loss=0.258, prior_loss=0.9882, diff_loss=0.3249, tot_loss=1.571, over 111.00 samples.], tot_loss[dur_loss=0.25, prior_loss=0.9868, diff_loss=0.3988, tot_loss=1.636, over 1656.00 samples.], 2024-10-21 03:42:31,218 INFO [train.py:682] (1/4) Start epoch 1162 2024-10-21 03:42:44,769 INFO [train.py:561] (1/4) Epoch 1162, batch 4, global_batch_idx: 
18580, batch size: 189, loss[dur_loss=0.2494, prior_loss=0.9876, diff_loss=0.3585, tot_loss=1.595, over 189.00 samples.], tot_loss[dur_loss=0.2462, prior_loss=0.9863, diff_loss=0.4493, tot_loss=1.682, over 937.00 samples.], 2024-10-21 03:42:59,789 INFO [train.py:561] (1/4) Epoch 1162, batch 14, global_batch_idx: 18590, batch size: 142, loss[dur_loss=0.2547, prior_loss=0.9871, diff_loss=0.2984, tot_loss=1.54, over 142.00 samples.], tot_loss[dur_loss=0.2503, prior_loss=0.9869, diff_loss=0.3798, tot_loss=1.617, over 2210.00 samples.], 2024-10-21 03:43:01,240 INFO [train.py:682] (1/4) Start epoch 1163 2024-10-21 03:43:21,581 INFO [train.py:561] (1/4) Epoch 1163, batch 8, global_batch_idx: 18600, batch size: 170, loss[dur_loss=0.258, prior_loss=0.9876, diff_loss=0.3305, tot_loss=1.576, over 170.00 samples.], tot_loss[dur_loss=0.2485, prior_loss=0.9866, diff_loss=0.4143, tot_loss=1.649, over 1432.00 samples.], 2024-10-21 03:43:31,778 INFO [train.py:682] (1/4) Start epoch 1164 2024-10-21 03:43:43,162 INFO [train.py:561] (1/4) Epoch 1164, batch 2, global_batch_idx: 18610, batch size: 203, loss[dur_loss=0.2519, prior_loss=0.9871, diff_loss=0.3733, tot_loss=1.612, over 203.00 samples.], tot_loss[dur_loss=0.2526, prior_loss=0.9874, diff_loss=0.3532, tot_loss=1.593, over 442.00 samples.], 2024-10-21 03:43:57,457 INFO [train.py:561] (1/4) Epoch 1164, batch 12, global_batch_idx: 18620, batch size: 152, loss[dur_loss=0.2515, prior_loss=0.9876, diff_loss=0.357, tot_loss=1.596, over 152.00 samples.], tot_loss[dur_loss=0.2497, prior_loss=0.9869, diff_loss=0.3985, tot_loss=1.635, over 1966.00 samples.], 2024-10-21 03:44:01,935 INFO [train.py:682] (1/4) Start epoch 1165 2024-10-21 03:44:19,059 INFO [train.py:561] (1/4) Epoch 1165, batch 6, global_batch_idx: 18630, batch size: 106, loss[dur_loss=0.2512, prior_loss=0.9874, diff_loss=0.3159, tot_loss=1.554, over 106.00 samples.], tot_loss[dur_loss=0.2464, prior_loss=0.9865, diff_loss=0.4347, tot_loss=1.668, over 1142.00 samples.], 2024-10-21 03:44:32,169 INFO [train.py:682] (1/4) Start epoch 1166 2024-10-21 03:44:41,321 INFO [train.py:561] (1/4) Epoch 1166, batch 0, global_batch_idx: 18640, batch size: 108, loss[dur_loss=0.2554, prior_loss=0.988, diff_loss=0.3735, tot_loss=1.617, over 108.00 samples.], tot_loss[dur_loss=0.2554, prior_loss=0.988, diff_loss=0.3735, tot_loss=1.617, over 108.00 samples.], 2024-10-21 03:44:55,690 INFO [train.py:561] (1/4) Epoch 1166, batch 10, global_batch_idx: 18650, batch size: 111, loss[dur_loss=0.2562, prior_loss=0.9881, diff_loss=0.3271, tot_loss=1.571, over 111.00 samples.], tot_loss[dur_loss=0.2493, prior_loss=0.9868, diff_loss=0.402, tot_loss=1.638, over 1656.00 samples.], 2024-10-21 03:45:02,785 INFO [train.py:682] (1/4) Start epoch 1167 2024-10-21 03:45:16,580 INFO [train.py:561] (1/4) Epoch 1167, batch 4, global_batch_idx: 18660, batch size: 189, loss[dur_loss=0.2503, prior_loss=0.9872, diff_loss=0.4061, tot_loss=1.644, over 189.00 samples.], tot_loss[dur_loss=0.2474, prior_loss=0.9861, diff_loss=0.4557, tot_loss=1.689, over 937.00 samples.], 2024-10-21 03:45:31,477 INFO [train.py:561] (1/4) Epoch 1167, batch 14, global_batch_idx: 18670, batch size: 142, loss[dur_loss=0.256, prior_loss=0.9875, diff_loss=0.3465, tot_loss=1.59, over 142.00 samples.], tot_loss[dur_loss=0.2508, prior_loss=0.9869, diff_loss=0.385, tot_loss=1.623, over 2210.00 samples.], 2024-10-21 03:45:32,911 INFO [train.py:682] (1/4) Start epoch 1168 2024-10-21 03:45:52,692 INFO [train.py:561] (1/4) Epoch 1168, batch 8, global_batch_idx: 18680, batch size: 
170, loss[dur_loss=0.2542, prior_loss=0.9873, diff_loss=0.3222, tot_loss=1.564, over 170.00 samples.], tot_loss[dur_loss=0.2478, prior_loss=0.9866, diff_loss=0.4095, tot_loss=1.644, over 1432.00 samples.], 2024-10-21 03:46:02,821 INFO [train.py:682] (1/4) Start epoch 1169 2024-10-21 03:46:14,435 INFO [train.py:561] (1/4) Epoch 1169, batch 2, global_batch_idx: 18690, batch size: 203, loss[dur_loss=0.2527, prior_loss=0.9871, diff_loss=0.3964, tot_loss=1.636, over 203.00 samples.], tot_loss[dur_loss=0.2527, prior_loss=0.9872, diff_loss=0.3745, tot_loss=1.614, over 442.00 samples.], 2024-10-21 03:46:28,838 INFO [train.py:561] (1/4) Epoch 1169, batch 12, global_batch_idx: 18700, batch size: 152, loss[dur_loss=0.2546, prior_loss=0.9872, diff_loss=0.3764, tot_loss=1.618, over 152.00 samples.], tot_loss[dur_loss=0.2496, prior_loss=0.9867, diff_loss=0.4091, tot_loss=1.645, over 1966.00 samples.], 2024-10-21 03:46:33,317 INFO [train.py:682] (1/4) Start epoch 1170 2024-10-21 03:46:50,360 INFO [train.py:561] (1/4) Epoch 1170, batch 6, global_batch_idx: 18710, batch size: 106, loss[dur_loss=0.2552, prior_loss=0.9874, diff_loss=0.3053, tot_loss=1.548, over 106.00 samples.], tot_loss[dur_loss=0.2464, prior_loss=0.9863, diff_loss=0.4308, tot_loss=1.664, over 1142.00 samples.], 2024-10-21 03:47:03,513 INFO [train.py:682] (1/4) Start epoch 1171 2024-10-21 03:47:12,168 INFO [train.py:561] (1/4) Epoch 1171, batch 0, global_batch_idx: 18720, batch size: 108, loss[dur_loss=0.2607, prior_loss=0.9882, diff_loss=0.3339, tot_loss=1.583, over 108.00 samples.], tot_loss[dur_loss=0.2607, prior_loss=0.9882, diff_loss=0.3339, tot_loss=1.583, over 108.00 samples.], 2024-10-21 03:47:26,591 INFO [train.py:561] (1/4) Epoch 1171, batch 10, global_batch_idx: 18730, batch size: 111, loss[dur_loss=0.2558, prior_loss=0.9882, diff_loss=0.3674, tot_loss=1.611, over 111.00 samples.], tot_loss[dur_loss=0.2495, prior_loss=0.9867, diff_loss=0.4064, tot_loss=1.643, over 1656.00 samples.], 2024-10-21 03:47:33,693 INFO [train.py:682] (1/4) Start epoch 1172 2024-10-21 03:47:47,656 INFO [train.py:561] (1/4) Epoch 1172, batch 4, global_batch_idx: 18740, batch size: 189, loss[dur_loss=0.2475, prior_loss=0.9868, diff_loss=0.3554, tot_loss=1.59, over 189.00 samples.], tot_loss[dur_loss=0.2452, prior_loss=0.9859, diff_loss=0.4647, tot_loss=1.696, over 937.00 samples.], 2024-10-21 03:48:02,369 INFO [train.py:561] (1/4) Epoch 1172, batch 14, global_batch_idx: 18750, batch size: 142, loss[dur_loss=0.2508, prior_loss=0.9866, diff_loss=0.3288, tot_loss=1.566, over 142.00 samples.], tot_loss[dur_loss=0.2497, prior_loss=0.9866, diff_loss=0.3905, tot_loss=1.627, over 2210.00 samples.], 2024-10-21 03:48:03,799 INFO [train.py:682] (1/4) Start epoch 1173 2024-10-21 03:48:23,795 INFO [train.py:561] (1/4) Epoch 1173, batch 8, global_batch_idx: 18760, batch size: 170, loss[dur_loss=0.258, prior_loss=0.9872, diff_loss=0.3787, tot_loss=1.624, over 170.00 samples.], tot_loss[dur_loss=0.2488, prior_loss=0.9864, diff_loss=0.4229, tot_loss=1.658, over 1432.00 samples.], 2024-10-21 03:48:34,173 INFO [train.py:682] (1/4) Start epoch 1174 2024-10-21 03:48:46,410 INFO [train.py:561] (1/4) Epoch 1174, batch 2, global_batch_idx: 18770, batch size: 203, loss[dur_loss=0.2483, prior_loss=0.987, diff_loss=0.3796, tot_loss=1.615, over 203.00 samples.], tot_loss[dur_loss=0.2514, prior_loss=0.9871, diff_loss=0.3784, tot_loss=1.617, over 442.00 samples.], 2024-10-21 03:49:00,762 INFO [train.py:561] (1/4) Epoch 1174, batch 12, global_batch_idx: 18780, batch size: 152, 
loss[dur_loss=0.2528, prior_loss=0.9866, diff_loss=0.3464, tot_loss=1.586, over 152.00 samples.], tot_loss[dur_loss=0.2497, prior_loss=0.9865, diff_loss=0.4064, tot_loss=1.643, over 1966.00 samples.], 2024-10-21 03:49:05,267 INFO [train.py:682] (1/4) Start epoch 1175 2024-10-21 03:49:22,770 INFO [train.py:561] (1/4) Epoch 1175, batch 6, global_batch_idx: 18790, batch size: 106, loss[dur_loss=0.2522, prior_loss=0.9869, diff_loss=0.2975, tot_loss=1.537, over 106.00 samples.], tot_loss[dur_loss=0.2443, prior_loss=0.986, diff_loss=0.4259, tot_loss=1.656, over 1142.00 samples.], 2024-10-21 03:49:35,949 INFO [train.py:682] (1/4) Start epoch 1176 2024-10-21 03:49:44,844 INFO [train.py:561] (1/4) Epoch 1176, batch 0, global_batch_idx: 18800, batch size: 108, loss[dur_loss=0.2601, prior_loss=0.9876, diff_loss=0.3168, tot_loss=1.565, over 108.00 samples.], tot_loss[dur_loss=0.2601, prior_loss=0.9876, diff_loss=0.3168, tot_loss=1.565, over 108.00 samples.], 2024-10-21 03:49:59,420 INFO [train.py:561] (1/4) Epoch 1176, batch 10, global_batch_idx: 18810, batch size: 111, loss[dur_loss=0.2547, prior_loss=0.9877, diff_loss=0.2971, tot_loss=1.539, over 111.00 samples.], tot_loss[dur_loss=0.2478, prior_loss=0.9864, diff_loss=0.4095, tot_loss=1.644, over 1656.00 samples.], 2024-10-21 03:50:06,653 INFO [train.py:682] (1/4) Start epoch 1177 2024-10-21 03:50:20,583 INFO [train.py:561] (1/4) Epoch 1177, batch 4, global_batch_idx: 18820, batch size: 189, loss[dur_loss=0.2503, prior_loss=0.9867, diff_loss=0.3769, tot_loss=1.614, over 189.00 samples.], tot_loss[dur_loss=0.2442, prior_loss=0.9858, diff_loss=0.4634, tot_loss=1.693, over 937.00 samples.], 2024-10-21 03:50:35,648 INFO [train.py:561] (1/4) Epoch 1177, batch 14, global_batch_idx: 18830, batch size: 142, loss[dur_loss=0.2532, prior_loss=0.9866, diff_loss=0.328, tot_loss=1.568, over 142.00 samples.], tot_loss[dur_loss=0.2488, prior_loss=0.9864, diff_loss=0.3995, tot_loss=1.635, over 2210.00 samples.], 2024-10-21 03:50:37,075 INFO [train.py:682] (1/4) Start epoch 1178 2024-10-21 03:50:57,272 INFO [train.py:561] (1/4) Epoch 1178, batch 8, global_batch_idx: 18840, batch size: 170, loss[dur_loss=0.2535, prior_loss=0.9869, diff_loss=0.3121, tot_loss=1.552, over 170.00 samples.], tot_loss[dur_loss=0.2483, prior_loss=0.9863, diff_loss=0.4095, tot_loss=1.644, over 1432.00 samples.], 2024-10-21 03:51:07,455 INFO [train.py:682] (1/4) Start epoch 1179 2024-10-21 03:51:18,897 INFO [train.py:561] (1/4) Epoch 1179, batch 2, global_batch_idx: 18850, batch size: 203, loss[dur_loss=0.2505, prior_loss=0.9866, diff_loss=0.3509, tot_loss=1.588, over 203.00 samples.], tot_loss[dur_loss=0.2513, prior_loss=0.9869, diff_loss=0.3291, tot_loss=1.567, over 442.00 samples.], 2024-10-21 03:51:33,252 INFO [train.py:561] (1/4) Epoch 1179, batch 12, global_batch_idx: 18860, batch size: 152, loss[dur_loss=0.2485, prior_loss=0.9866, diff_loss=0.3603, tot_loss=1.595, over 152.00 samples.], tot_loss[dur_loss=0.2486, prior_loss=0.9866, diff_loss=0.3877, tot_loss=1.623, over 1966.00 samples.], 2024-10-21 03:51:37,712 INFO [train.py:682] (1/4) Start epoch 1180 2024-10-21 03:51:54,885 INFO [train.py:561] (1/4) Epoch 1180, batch 6, global_batch_idx: 18870, batch size: 106, loss[dur_loss=0.2558, prior_loss=0.9872, diff_loss=0.3059, tot_loss=1.549, over 106.00 samples.], tot_loss[dur_loss=0.2477, prior_loss=0.9862, diff_loss=0.4276, tot_loss=1.662, over 1142.00 samples.], 2024-10-21 03:52:07,977 INFO [train.py:682] (1/4) Start epoch 1181 2024-10-21 03:52:16,875 INFO [train.py:561] (1/4) Epoch 
1181, batch 0, global_batch_idx: 18880, batch size: 108, loss[dur_loss=0.2563, prior_loss=0.9875, diff_loss=0.3539, tot_loss=1.598, over 108.00 samples.], tot_loss[dur_loss=0.2563, prior_loss=0.9875, diff_loss=0.3539, tot_loss=1.598, over 108.00 samples.], 2024-10-21 03:52:31,280 INFO [train.py:561] (1/4) Epoch 1181, batch 10, global_batch_idx: 18890, batch size: 111, loss[dur_loss=0.2551, prior_loss=0.9876, diff_loss=0.355, tot_loss=1.598, over 111.00 samples.], tot_loss[dur_loss=0.2488, prior_loss=0.9866, diff_loss=0.4033, tot_loss=1.639, over 1656.00 samples.], 2024-10-21 03:52:38,475 INFO [train.py:682] (1/4) Start epoch 1182 2024-10-21 03:52:52,216 INFO [train.py:561] (1/4) Epoch 1182, batch 4, global_batch_idx: 18900, batch size: 189, loss[dur_loss=0.2485, prior_loss=0.9867, diff_loss=0.3837, tot_loss=1.619, over 189.00 samples.], tot_loss[dur_loss=0.2439, prior_loss=0.9857, diff_loss=0.4584, tot_loss=1.688, over 937.00 samples.], 2024-10-21 03:53:07,180 INFO [train.py:561] (1/4) Epoch 1182, batch 14, global_batch_idx: 18910, batch size: 142, loss[dur_loss=0.2477, prior_loss=0.9867, diff_loss=0.3135, tot_loss=1.548, over 142.00 samples.], tot_loss[dur_loss=0.2485, prior_loss=0.9864, diff_loss=0.393, tot_loss=1.628, over 2210.00 samples.], 2024-10-21 03:53:08,589 INFO [train.py:682] (1/4) Start epoch 1183 2024-10-21 03:53:28,342 INFO [train.py:561] (1/4) Epoch 1183, batch 8, global_batch_idx: 18920, batch size: 170, loss[dur_loss=0.2516, prior_loss=0.9867, diff_loss=0.373, tot_loss=1.611, over 170.00 samples.], tot_loss[dur_loss=0.2467, prior_loss=0.986, diff_loss=0.4169, tot_loss=1.65, over 1432.00 samples.], 2024-10-21 03:53:38,567 INFO [train.py:682] (1/4) Start epoch 1184 2024-10-21 03:53:50,048 INFO [train.py:561] (1/4) Epoch 1184, batch 2, global_batch_idx: 18930, batch size: 203, loss[dur_loss=0.2508, prior_loss=0.9866, diff_loss=0.3568, tot_loss=1.594, over 203.00 samples.], tot_loss[dur_loss=0.2505, prior_loss=0.9868, diff_loss=0.3426, tot_loss=1.58, over 442.00 samples.], 2024-10-21 03:54:04,384 INFO [train.py:561] (1/4) Epoch 1184, batch 12, global_batch_idx: 18940, batch size: 152, loss[dur_loss=0.2516, prior_loss=0.9863, diff_loss=0.349, tot_loss=1.587, over 152.00 samples.], tot_loss[dur_loss=0.2484, prior_loss=0.9863, diff_loss=0.3948, tot_loss=1.629, over 1966.00 samples.], 2024-10-21 03:54:08,868 INFO [train.py:682] (1/4) Start epoch 1185 2024-10-21 03:54:25,796 INFO [train.py:561] (1/4) Epoch 1185, batch 6, global_batch_idx: 18950, batch size: 106, loss[dur_loss=0.2499, prior_loss=0.9867, diff_loss=0.3264, tot_loss=1.563, over 106.00 samples.], tot_loss[dur_loss=0.2458, prior_loss=0.986, diff_loss=0.436, tot_loss=1.668, over 1142.00 samples.], 2024-10-21 03:54:38,969 INFO [train.py:682] (1/4) Start epoch 1186 2024-10-21 03:54:47,617 INFO [train.py:561] (1/4) Epoch 1186, batch 0, global_batch_idx: 18960, batch size: 108, loss[dur_loss=0.2592, prior_loss=0.9875, diff_loss=0.3333, tot_loss=1.58, over 108.00 samples.], tot_loss[dur_loss=0.2592, prior_loss=0.9875, diff_loss=0.3333, tot_loss=1.58, over 108.00 samples.], 2024-10-21 03:55:01,984 INFO [train.py:561] (1/4) Epoch 1186, batch 10, global_batch_idx: 18970, batch size: 111, loss[dur_loss=0.2573, prior_loss=0.9876, diff_loss=0.3427, tot_loss=1.588, over 111.00 samples.], tot_loss[dur_loss=0.2479, prior_loss=0.9862, diff_loss=0.3955, tot_loss=1.63, over 1656.00 samples.], 2024-10-21 03:55:08,993 INFO [train.py:682] (1/4) Start epoch 1187 2024-10-21 03:55:23,117 INFO [train.py:561] (1/4) Epoch 1187, batch 4, 
global_batch_idx: 18980, batch size: 189, loss[dur_loss=0.2473, prior_loss=0.9868, diff_loss=0.3638, tot_loss=1.598, over 189.00 samples.], tot_loss[dur_loss=0.2453, prior_loss=0.9857, diff_loss=0.4545, tot_loss=1.685, over 937.00 samples.], 2024-10-21 03:55:38,118 INFO [train.py:561] (1/4) Epoch 1187, batch 14, global_batch_idx: 18990, batch size: 142, loss[dur_loss=0.2513, prior_loss=0.9862, diff_loss=0.3059, tot_loss=1.543, over 142.00 samples.], tot_loss[dur_loss=0.2488, prior_loss=0.9863, diff_loss=0.3819, tot_loss=1.617, over 2210.00 samples.], 2024-10-21 03:55:39,559 INFO [train.py:682] (1/4) Start epoch 1188 2024-10-21 03:55:59,601 INFO [train.py:561] (1/4) Epoch 1188, batch 8, global_batch_idx: 19000, batch size: 170, loss[dur_loss=0.2502, prior_loss=0.987, diff_loss=0.3724, tot_loss=1.61, over 170.00 samples.], tot_loss[dur_loss=0.2488, prior_loss=0.9863, diff_loss=0.4061, tot_loss=1.641, over 1432.00 samples.], 2024-10-21 03:56:09,913 INFO [train.py:682] (1/4) Start epoch 1189 2024-10-21 03:56:21,632 INFO [train.py:561] (1/4) Epoch 1189, batch 2, global_batch_idx: 19010, batch size: 203, loss[dur_loss=0.2494, prior_loss=0.9867, diff_loss=0.3937, tot_loss=1.63, over 203.00 samples.], tot_loss[dur_loss=0.2501, prior_loss=0.9868, diff_loss=0.3658, tot_loss=1.603, over 442.00 samples.], 2024-10-21 03:56:35,967 INFO [train.py:561] (1/4) Epoch 1189, batch 12, global_batch_idx: 19020, batch size: 152, loss[dur_loss=0.2472, prior_loss=0.9866, diff_loss=0.3266, tot_loss=1.56, over 152.00 samples.], tot_loss[dur_loss=0.2484, prior_loss=0.9865, diff_loss=0.3928, tot_loss=1.628, over 1966.00 samples.], 2024-10-21 03:56:40,454 INFO [train.py:682] (1/4) Start epoch 1190 2024-10-21 03:56:57,550 INFO [train.py:561] (1/4) Epoch 1190, batch 6, global_batch_idx: 19030, batch size: 106, loss[dur_loss=0.2476, prior_loss=0.9868, diff_loss=0.3632, tot_loss=1.598, over 106.00 samples.], tot_loss[dur_loss=0.2462, prior_loss=0.986, diff_loss=0.4195, tot_loss=1.652, over 1142.00 samples.], 2024-10-21 03:57:10,713 INFO [train.py:682] (1/4) Start epoch 1191 2024-10-21 03:57:19,343 INFO [train.py:561] (1/4) Epoch 1191, batch 0, global_batch_idx: 19040, batch size: 108, loss[dur_loss=0.2536, prior_loss=0.9873, diff_loss=0.3703, tot_loss=1.611, over 108.00 samples.], tot_loss[dur_loss=0.2536, prior_loss=0.9873, diff_loss=0.3703, tot_loss=1.611, over 108.00 samples.], 2024-10-21 03:57:33,603 INFO [train.py:561] (1/4) Epoch 1191, batch 10, global_batch_idx: 19050, batch size: 111, loss[dur_loss=0.2536, prior_loss=0.9876, diff_loss=0.377, tot_loss=1.618, over 111.00 samples.], tot_loss[dur_loss=0.2479, prior_loss=0.9862, diff_loss=0.4018, tot_loss=1.636, over 1656.00 samples.], 2024-10-21 03:57:40,743 INFO [train.py:682] (1/4) Start epoch 1192 2024-10-21 03:57:54,392 INFO [train.py:561] (1/4) Epoch 1192, batch 4, global_batch_idx: 19060, batch size: 189, loss[dur_loss=0.2517, prior_loss=0.9868, diff_loss=0.3407, tot_loss=1.579, over 189.00 samples.], tot_loss[dur_loss=0.2432, prior_loss=0.9856, diff_loss=0.4413, tot_loss=1.67, over 937.00 samples.], 2024-10-21 03:58:09,349 INFO [train.py:561] (1/4) Epoch 1192, batch 14, global_batch_idx: 19070, batch size: 142, loss[dur_loss=0.2461, prior_loss=0.9862, diff_loss=0.3198, tot_loss=1.552, over 142.00 samples.], tot_loss[dur_loss=0.2477, prior_loss=0.9862, diff_loss=0.3757, tot_loss=1.61, over 2210.00 samples.], 2024-10-21 03:58:10,775 INFO [train.py:682] (1/4) Start epoch 1193 2024-10-21 03:58:30,932 INFO [train.py:561] (1/4) Epoch 1193, batch 8, global_batch_idx: 
19080, batch size: 170, loss[dur_loss=0.2546, prior_loss=0.9869, diff_loss=0.3519, tot_loss=1.593, over 170.00 samples.], tot_loss[dur_loss=0.2474, prior_loss=0.9862, diff_loss=0.4177, tot_loss=1.651, over 1432.00 samples.], 2024-10-21 03:58:41,142 INFO [train.py:682] (1/4) Start epoch 1194 2024-10-21 03:58:52,463 INFO [train.py:561] (1/4) Epoch 1194, batch 2, global_batch_idx: 19090, batch size: 203, loss[dur_loss=0.2501, prior_loss=0.9866, diff_loss=0.3583, tot_loss=1.595, over 203.00 samples.], tot_loss[dur_loss=0.2512, prior_loss=0.9867, diff_loss=0.3331, tot_loss=1.571, over 442.00 samples.], 2024-10-21 03:59:06,702 INFO [train.py:561] (1/4) Epoch 1194, batch 12, global_batch_idx: 19100, batch size: 152, loss[dur_loss=0.2498, prior_loss=0.9866, diff_loss=0.3028, tot_loss=1.539, over 152.00 samples.], tot_loss[dur_loss=0.2482, prior_loss=0.9863, diff_loss=0.3937, tot_loss=1.628, over 1966.00 samples.], 2024-10-21 03:59:11,184 INFO [train.py:682] (1/4) Start epoch 1195 2024-10-21 03:59:28,188 INFO [train.py:561] (1/4) Epoch 1195, batch 6, global_batch_idx: 19110, batch size: 106, loss[dur_loss=0.2538, prior_loss=0.9867, diff_loss=0.3703, tot_loss=1.611, over 106.00 samples.], tot_loss[dur_loss=0.2468, prior_loss=0.9858, diff_loss=0.4352, tot_loss=1.668, over 1142.00 samples.], 2024-10-21 03:59:41,335 INFO [train.py:682] (1/4) Start epoch 1196 2024-10-21 03:59:50,158 INFO [train.py:561] (1/4) Epoch 1196, batch 0, global_batch_idx: 19120, batch size: 108, loss[dur_loss=0.2575, prior_loss=0.9873, diff_loss=0.335, tot_loss=1.58, over 108.00 samples.], tot_loss[dur_loss=0.2575, prior_loss=0.9873, diff_loss=0.335, tot_loss=1.58, over 108.00 samples.], 2024-10-21 04:00:04,497 INFO [train.py:561] (1/4) Epoch 1196, batch 10, global_batch_idx: 19130, batch size: 111, loss[dur_loss=0.2534, prior_loss=0.9875, diff_loss=0.3634, tot_loss=1.604, over 111.00 samples.], tot_loss[dur_loss=0.2463, prior_loss=0.9861, diff_loss=0.396, tot_loss=1.628, over 1656.00 samples.], 2024-10-21 04:00:11,659 INFO [train.py:682] (1/4) Start epoch 1197 2024-10-21 04:00:25,424 INFO [train.py:561] (1/4) Epoch 1197, batch 4, global_batch_idx: 19140, batch size: 189, loss[dur_loss=0.2482, prior_loss=0.9866, diff_loss=0.3379, tot_loss=1.573, over 189.00 samples.], tot_loss[dur_loss=0.2433, prior_loss=0.9854, diff_loss=0.4517, tot_loss=1.68, over 937.00 samples.], 2024-10-21 04:00:40,344 INFO [train.py:561] (1/4) Epoch 1197, batch 14, global_batch_idx: 19150, batch size: 142, loss[dur_loss=0.2493, prior_loss=0.986, diff_loss=0.3337, tot_loss=1.569, over 142.00 samples.], tot_loss[dur_loss=0.2471, prior_loss=0.9861, diff_loss=0.3868, tot_loss=1.62, over 2210.00 samples.], 2024-10-21 04:00:41,766 INFO [train.py:682] (1/4) Start epoch 1198 2024-10-21 04:01:01,822 INFO [train.py:561] (1/4) Epoch 1198, batch 8, global_batch_idx: 19160, batch size: 170, loss[dur_loss=0.2534, prior_loss=0.9868, diff_loss=0.3309, tot_loss=1.571, over 170.00 samples.], tot_loss[dur_loss=0.2476, prior_loss=0.9861, diff_loss=0.4046, tot_loss=1.638, over 1432.00 samples.], 2024-10-21 04:01:12,034 INFO [train.py:682] (1/4) Start epoch 1199 2024-10-21 04:01:23,747 INFO [train.py:561] (1/4) Epoch 1199, batch 2, global_batch_idx: 19170, batch size: 203, loss[dur_loss=0.2467, prior_loss=0.9864, diff_loss=0.3454, tot_loss=1.579, over 203.00 samples.], tot_loss[dur_loss=0.2477, prior_loss=0.9865, diff_loss=0.3405, tot_loss=1.575, over 442.00 samples.], 2024-10-21 04:01:37,987 INFO [train.py:561] (1/4) Epoch 1199, batch 12, global_batch_idx: 19180, batch size: 
152, loss[dur_loss=0.2495, prior_loss=0.9862, diff_loss=0.3418, tot_loss=1.578, over 152.00 samples.], tot_loss[dur_loss=0.2472, prior_loss=0.9862, diff_loss=0.3885, tot_loss=1.622, over 1966.00 samples.], 2024-10-21 04:01:42,468 INFO [train.py:682] (1/4) Start epoch 1200 2024-10-21 04:01:59,402 INFO [train.py:561] (1/4) Epoch 1200, batch 6, global_batch_idx: 19190, batch size: 106, loss[dur_loss=0.2488, prior_loss=0.9868, diff_loss=0.3549, tot_loss=1.591, over 106.00 samples.], tot_loss[dur_loss=0.2453, prior_loss=0.9857, diff_loss=0.4231, tot_loss=1.654, over 1142.00 samples.], 2024-10-21 04:02:12,532 INFO [train.py:682] (1/4) Start epoch 1201 2024-10-21 04:02:21,209 INFO [train.py:561] (1/4) Epoch 1201, batch 0, global_batch_idx: 19200, batch size: 108, loss[dur_loss=0.2537, prior_loss=0.9875, diff_loss=0.3306, tot_loss=1.572, over 108.00 samples.], tot_loss[dur_loss=0.2537, prior_loss=0.9875, diff_loss=0.3306, tot_loss=1.572, over 108.00 samples.], 2024-10-21 04:02:35,447 INFO [train.py:561] (1/4) Epoch 1201, batch 10, global_batch_idx: 19210, batch size: 111, loss[dur_loss=0.2525, prior_loss=0.9874, diff_loss=0.3339, tot_loss=1.574, over 111.00 samples.], tot_loss[dur_loss=0.2483, prior_loss=0.9861, diff_loss=0.4058, tot_loss=1.64, over 1656.00 samples.], 2024-10-21 04:02:42,584 INFO [train.py:682] (1/4) Start epoch 1202 2024-10-21 04:02:56,396 INFO [train.py:561] (1/4) Epoch 1202, batch 4, global_batch_idx: 19220, batch size: 189, loss[dur_loss=0.2518, prior_loss=0.9866, diff_loss=0.3394, tot_loss=1.578, over 189.00 samples.], tot_loss[dur_loss=0.2438, prior_loss=0.9854, diff_loss=0.4457, tot_loss=1.675, over 937.00 samples.], 2024-10-21 04:03:11,243 INFO [train.py:561] (1/4) Epoch 1202, batch 14, global_batch_idx: 19230, batch size: 142, loss[dur_loss=0.2459, prior_loss=0.9862, diff_loss=0.3539, tot_loss=1.586, over 142.00 samples.], tot_loss[dur_loss=0.2471, prior_loss=0.9861, diff_loss=0.3876, tot_loss=1.621, over 2210.00 samples.], 2024-10-21 04:03:12,685 INFO [train.py:682] (1/4) Start epoch 1203 2024-10-21 04:03:32,582 INFO [train.py:561] (1/4) Epoch 1203, batch 8, global_batch_idx: 19240, batch size: 170, loss[dur_loss=0.2478, prior_loss=0.9865, diff_loss=0.3424, tot_loss=1.577, over 170.00 samples.], tot_loss[dur_loss=0.2459, prior_loss=0.9859, diff_loss=0.4051, tot_loss=1.637, over 1432.00 samples.], 2024-10-21 04:03:42,735 INFO [train.py:682] (1/4) Start epoch 1204 2024-10-21 04:03:54,056 INFO [train.py:561] (1/4) Epoch 1204, batch 2, global_batch_idx: 19250, batch size: 203, loss[dur_loss=0.2469, prior_loss=0.9865, diff_loss=0.3554, tot_loss=1.589, over 203.00 samples.], tot_loss[dur_loss=0.2494, prior_loss=0.9867, diff_loss=0.35, tot_loss=1.586, over 442.00 samples.], 2024-10-21 04:04:08,286 INFO [train.py:561] (1/4) Epoch 1204, batch 12, global_batch_idx: 19260, batch size: 152, loss[dur_loss=0.2494, prior_loss=0.9865, diff_loss=0.3442, tot_loss=1.58, over 152.00 samples.], tot_loss[dur_loss=0.2465, prior_loss=0.9862, diff_loss=0.392, tot_loss=1.625, over 1966.00 samples.], 2024-10-21 04:04:12,769 INFO [train.py:682] (1/4) Start epoch 1205 2024-10-21 04:04:29,936 INFO [train.py:561] (1/4) Epoch 1205, batch 6, global_batch_idx: 19270, batch size: 106, loss[dur_loss=0.2564, prior_loss=0.9871, diff_loss=0.3334, tot_loss=1.577, over 106.00 samples.], tot_loss[dur_loss=0.2458, prior_loss=0.9858, diff_loss=0.4374, tot_loss=1.669, over 1142.00 samples.], 2024-10-21 04:04:43,009 INFO [train.py:682] (1/4) Start epoch 1206 2024-10-21 04:04:51,715 INFO [train.py:561] (1/4) Epoch 
1206, batch 0, global_batch_idx: 19280, batch size: 108, loss[dur_loss=0.2513, prior_loss=0.9873, diff_loss=0.3398, tot_loss=1.578, over 108.00 samples.], tot_loss[dur_loss=0.2513, prior_loss=0.9873, diff_loss=0.3398, tot_loss=1.578, over 108.00 samples.], 2024-10-21 04:05:05,818 INFO [train.py:561] (1/4) Epoch 1206, batch 10, global_batch_idx: 19290, batch size: 111, loss[dur_loss=0.2561, prior_loss=0.9878, diff_loss=0.2997, tot_loss=1.544, over 111.00 samples.], tot_loss[dur_loss=0.2474, prior_loss=0.9862, diff_loss=0.4, tot_loss=1.634, over 1656.00 samples.], 2024-10-21 04:05:13,019 INFO [train.py:682] (1/4) Start epoch 1207 2024-10-21 04:05:27,123 INFO [train.py:561] (1/4) Epoch 1207, batch 4, global_batch_idx: 19300, batch size: 189, loss[dur_loss=0.2482, prior_loss=0.9867, diff_loss=0.3707, tot_loss=1.606, over 189.00 samples.], tot_loss[dur_loss=0.2425, prior_loss=0.9855, diff_loss=0.4487, tot_loss=1.677, over 937.00 samples.], 2024-10-21 04:05:42,069 INFO [train.py:561] (1/4) Epoch 1207, batch 14, global_batch_idx: 19310, batch size: 142, loss[dur_loss=0.2549, prior_loss=0.9867, diff_loss=0.3435, tot_loss=1.585, over 142.00 samples.], tot_loss[dur_loss=0.2479, prior_loss=0.9863, diff_loss=0.3898, tot_loss=1.624, over 2210.00 samples.], 2024-10-21 04:05:43,519 INFO [train.py:682] (1/4) Start epoch 1208 2024-10-21 04:06:03,664 INFO [train.py:561] (1/4) Epoch 1208, batch 8, global_batch_idx: 19320, batch size: 170, loss[dur_loss=0.2505, prior_loss=0.9865, diff_loss=0.3483, tot_loss=1.585, over 170.00 samples.], tot_loss[dur_loss=0.2469, prior_loss=0.9861, diff_loss=0.4041, tot_loss=1.637, over 1432.00 samples.], 2024-10-21 04:06:13,899 INFO [train.py:682] (1/4) Start epoch 1209 2024-10-21 04:06:25,479 INFO [train.py:561] (1/4) Epoch 1209, batch 2, global_batch_idx: 19330, batch size: 203, loss[dur_loss=0.2494, prior_loss=0.9862, diff_loss=0.3738, tot_loss=1.609, over 203.00 samples.], tot_loss[dur_loss=0.2503, prior_loss=0.9865, diff_loss=0.3499, tot_loss=1.587, over 442.00 samples.], 2024-10-21 04:06:39,738 INFO [train.py:561] (1/4) Epoch 1209, batch 12, global_batch_idx: 19340, batch size: 152, loss[dur_loss=0.2493, prior_loss=0.9863, diff_loss=0.3298, tot_loss=1.565, over 152.00 samples.], tot_loss[dur_loss=0.247, prior_loss=0.9861, diff_loss=0.3905, tot_loss=1.624, over 1966.00 samples.], 2024-10-21 04:06:44,221 INFO [train.py:682] (1/4) Start epoch 1210 2024-10-21 04:07:01,327 INFO [train.py:561] (1/4) Epoch 1210, batch 6, global_batch_idx: 19350, batch size: 106, loss[dur_loss=0.2477, prior_loss=0.9865, diff_loss=0.3321, tot_loss=1.566, over 106.00 samples.], tot_loss[dur_loss=0.2435, prior_loss=0.9855, diff_loss=0.4303, tot_loss=1.659, over 1142.00 samples.], 2024-10-21 04:07:14,442 INFO [train.py:682] (1/4) Start epoch 1211 2024-10-21 04:07:23,530 INFO [train.py:561] (1/4) Epoch 1211, batch 0, global_batch_idx: 19360, batch size: 108, loss[dur_loss=0.2538, prior_loss=0.987, diff_loss=0.3302, tot_loss=1.571, over 108.00 samples.], tot_loss[dur_loss=0.2538, prior_loss=0.987, diff_loss=0.3302, tot_loss=1.571, over 108.00 samples.], 2024-10-21 04:07:37,834 INFO [train.py:561] (1/4) Epoch 1211, batch 10, global_batch_idx: 19370, batch size: 111, loss[dur_loss=0.2531, prior_loss=0.9877, diff_loss=0.332, tot_loss=1.573, over 111.00 samples.], tot_loss[dur_loss=0.2469, prior_loss=0.9861, diff_loss=0.4027, tot_loss=1.636, over 1656.00 samples.], 2024-10-21 04:07:45,023 INFO [train.py:682] (1/4) Start epoch 1212 2024-10-21 04:07:59,020 INFO [train.py:561] (1/4) Epoch 1212, batch 4, 
global_batch_idx: 19380, batch size: 189, loss[dur_loss=0.2472, prior_loss=0.9868, diff_loss=0.3215, tot_loss=1.555, over 189.00 samples.], tot_loss[dur_loss=0.244, prior_loss=0.9854, diff_loss=0.4455, tot_loss=1.675, over 937.00 samples.], 2024-10-21 04:08:13,899 INFO [train.py:561] (1/4) Epoch 1212, batch 14, global_batch_idx: 19390, batch size: 142, loss[dur_loss=0.2489, prior_loss=0.9861, diff_loss=0.3435, tot_loss=1.579, over 142.00 samples.], tot_loss[dur_loss=0.2478, prior_loss=0.9861, diff_loss=0.3903, tot_loss=1.624, over 2210.00 samples.], 2024-10-21 04:08:15,364 INFO [train.py:682] (1/4) Start epoch 1213 2024-10-21 04:08:35,640 INFO [train.py:561] (1/4) Epoch 1213, batch 8, global_batch_idx: 19400, batch size: 170, loss[dur_loss=0.2475, prior_loss=0.9865, diff_loss=0.3254, tot_loss=1.559, over 170.00 samples.], tot_loss[dur_loss=0.2452, prior_loss=0.9859, diff_loss=0.4129, tot_loss=1.644, over 1432.00 samples.], 2024-10-21 04:08:45,913 INFO [train.py:682] (1/4) Start epoch 1214 2024-10-21 04:08:57,707 INFO [train.py:561] (1/4) Epoch 1214, batch 2, global_batch_idx: 19410, batch size: 203, loss[dur_loss=0.2473, prior_loss=0.9861, diff_loss=0.3427, tot_loss=1.576, over 203.00 samples.], tot_loss[dur_loss=0.2491, prior_loss=0.9863, diff_loss=0.335, tot_loss=1.57, over 442.00 samples.], 2024-10-21 04:09:12,065 INFO [train.py:561] (1/4) Epoch 1214, batch 12, global_batch_idx: 19420, batch size: 152, loss[dur_loss=0.248, prior_loss=0.9861, diff_loss=0.3684, tot_loss=1.603, over 152.00 samples.], tot_loss[dur_loss=0.2471, prior_loss=0.9859, diff_loss=0.3889, tot_loss=1.622, over 1966.00 samples.], 2024-10-21 04:09:16,486 INFO [train.py:682] (1/4) Start epoch 1215 2024-10-21 04:09:34,247 INFO [train.py:561] (1/4) Epoch 1215, batch 6, global_batch_idx: 19430, batch size: 106, loss[dur_loss=0.2521, prior_loss=0.9865, diff_loss=0.3323, tot_loss=1.571, over 106.00 samples.], tot_loss[dur_loss=0.2448, prior_loss=0.9854, diff_loss=0.4243, tot_loss=1.655, over 1142.00 samples.], 2024-10-21 04:09:47,624 INFO [train.py:682] (1/4) Start epoch 1216 2024-10-21 04:09:56,366 INFO [train.py:561] (1/4) Epoch 1216, batch 0, global_batch_idx: 19440, batch size: 108, loss[dur_loss=0.2517, prior_loss=0.9871, diff_loss=0.3595, tot_loss=1.598, over 108.00 samples.], tot_loss[dur_loss=0.2517, prior_loss=0.9871, diff_loss=0.3595, tot_loss=1.598, over 108.00 samples.], 2024-10-21 04:10:10,639 INFO [train.py:561] (1/4) Epoch 1216, batch 10, global_batch_idx: 19450, batch size: 111, loss[dur_loss=0.251, prior_loss=0.9873, diff_loss=0.3257, tot_loss=1.564, over 111.00 samples.], tot_loss[dur_loss=0.2456, prior_loss=0.9858, diff_loss=0.4022, tot_loss=1.634, over 1656.00 samples.], 2024-10-21 04:10:17,782 INFO [train.py:682] (1/4) Start epoch 1217 2024-10-21 04:10:31,727 INFO [train.py:561] (1/4) Epoch 1217, batch 4, global_batch_idx: 19460, batch size: 189, loss[dur_loss=0.2451, prior_loss=0.9863, diff_loss=0.3484, tot_loss=1.58, over 189.00 samples.], tot_loss[dur_loss=0.2404, prior_loss=0.9851, diff_loss=0.4535, tot_loss=1.679, over 937.00 samples.], 2024-10-21 04:10:46,760 INFO [train.py:561] (1/4) Epoch 1217, batch 14, global_batch_idx: 19470, batch size: 142, loss[dur_loss=0.2475, prior_loss=0.9859, diff_loss=0.3309, tot_loss=1.564, over 142.00 samples.], tot_loss[dur_loss=0.2456, prior_loss=0.9858, diff_loss=0.3875, tot_loss=1.619, over 2210.00 samples.], 2024-10-21 04:10:48,228 INFO [train.py:682] (1/4) Start epoch 1218 2024-10-21 04:11:08,363 INFO [train.py:561] (1/4) Epoch 1218, batch 8, global_batch_idx: 
19480, batch size: 170, loss[dur_loss=0.2501, prior_loss=0.9866, diff_loss=0.3415, tot_loss=1.578, over 170.00 samples.], tot_loss[dur_loss=0.2453, prior_loss=0.9858, diff_loss=0.4168, tot_loss=1.648, over 1432.00 samples.],
2024-10-21 04:11:18,595 INFO [train.py:682] (1/4) Start epoch 1219
2024-10-21 04:11:30,064 INFO [train.py:561] (1/4) Epoch 1219, batch 2, global_batch_idx: 19490, batch size: 203, loss[dur_loss=0.251, prior_loss=0.9859, diff_loss=0.3384, tot_loss=1.575, over 203.00 samples.], tot_loss[dur_loss=0.251, prior_loss=0.9862, diff_loss=0.3302, tot_loss=1.567, over 442.00 samples.],
2024-10-21 04:11:44,266 INFO [train.py:561] (1/4) Epoch 1219, batch 12, global_batch_idx: 19500, batch size: 152, loss[dur_loss=0.2472, prior_loss=0.9863, diff_loss=0.3206, tot_loss=1.554, over 152.00 samples.], tot_loss[dur_loss=0.2467, prior_loss=0.9859, diff_loss=0.3905, tot_loss=1.623, over 1966.00 samples.],
2024-10-21 04:11:45,877 INFO [train.py:579] (1/4) Computing validation loss
2024-10-21 04:12:11,984 INFO [train.py:589] (1/4) Epoch 1219, validation: dur_loss=0.448, prior_loss=1.032, diff_loss=0.4297, tot_loss=1.91, over 100.00 samples.
2024-10-21 04:12:11,985 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB
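A note on reading these entries: the figures in each loss[...] block are for the current batch, while the trailing tot_loss[...] block appears to be a running average over all samples seen so far in the current epoch, and in both blocks tot_loss is, up to the 4-digit rounding of the log, the sum dur_loss + prior_loss + diff_loss (e.g. 0.2472 + 0.9863 + 0.3206 ≈ 1.554 in the epoch 1219, batch 12 entry above). Below is a minimal parsing sketch for these lines, assuming only the bracket layout visible in this log; it is an illustrative helper, not part of icefall's train.py.

```python
# Illustrative sketch, not part of icefall: pull the loss[...] / tot_loss[...]
# blocks out of a train.py log line and sanity-check that tot_loss equals the
# sum of its components up to the log's 4-significant-digit rounding.
import re

LOSS_BLOCK = re.compile(
    r"(?P<kind>tot_loss|loss)\["
    r"dur_loss=(?P<dur>[0-9.]+), prior_loss=(?P<prior>[0-9.]+), "
    r"diff_loss=(?P<diff>[0-9.]+), tot_loss=(?P<tot>[0-9.]+), "
    r"over (?P<n>[0-9.]+) samples\.\]"
)

def parse_loss_blocks(line):
    """Return one dict per loss[...]/tot_loss[...] block found in `line`."""
    blocks = []
    for m in LOSS_BLOCK.finditer(line):
        d = {k: float(v) for k, v in m.groupdict().items() if k != "kind"}
        d["kind"] = m["kind"]  # "loss" = current batch, "tot_loss" = running
        blocks.append(d)
    return blocks

# Values taken verbatim from the epoch 1219, batch 12 entry above.
line = ("loss[dur_loss=0.2472, prior_loss=0.9863, diff_loss=0.3206, "
        "tot_loss=1.554, over 152.00 samples.]")
for b in parse_loss_blocks(line):
    assert abs(b["dur"] + b["prior"] + b["diff"] - b["tot"]) < 5e-3
```

The validation entry above (the only one in this span, triggered after global batch 19500) reports the same fields over the 100-sample validation set, and the "Maximum memory allocated" figure is presumably torch.cuda.max_memory_allocated() expressed in MB.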
2024-10-21 04:12:14,852 INFO [train.py:682] (1/4) Start epoch 1220
2024-10-21 04:12:32,456 INFO [train.py:561] (1/4) Epoch 1220, batch 6, global_batch_idx: 19510, batch size: 106, loss[dur_loss=0.2534, prior_loss=0.9865, diff_loss=0.2904, tot_loss=1.53, over 106.00 samples.], tot_loss[dur_loss=0.2435, prior_loss=0.9854, diff_loss=0.4314, tot_loss=1.66, over 1142.00 samples.],
2024-10-21 04:12:45,596 INFO [train.py:682] (1/4) Start epoch 1221
2024-10-21 04:12:54,572 INFO [train.py:561] (1/4) Epoch 1221, batch 0, global_batch_idx: 19520, batch size: 108, loss[dur_loss=0.2531, prior_loss=0.987, diff_loss=0.2981, tot_loss=1.538, over 108.00 samples.], tot_loss[dur_loss=0.2531, prior_loss=0.987, diff_loss=0.2981, tot_loss=1.538, over 108.00 samples.],
2024-10-21 04:13:08,966 INFO [train.py:561] (1/4) Epoch 1221, batch 10, global_batch_idx: 19530, batch size: 111, loss[dur_loss=0.2543, prior_loss=0.9871, diff_loss=0.3151, tot_loss=1.556, over 111.00 samples.], tot_loss[dur_loss=0.2468, prior_loss=0.9858, diff_loss=0.3923, tot_loss=1.625, over 1656.00 samples.],
2024-10-21 04:13:16,093 INFO [train.py:682] (1/4) Start epoch 1222
2024-10-21 04:13:30,028 INFO [train.py:561] (1/4) Epoch 1222, batch 4, global_batch_idx: 19540, batch size: 189, loss[dur_loss=0.2468, prior_loss=0.9861, diff_loss=0.3705, tot_loss=1.603, over 189.00 samples.], tot_loss[dur_loss=0.2421, prior_loss=0.985, diff_loss=0.4502, tot_loss=1.677, over 937.00 samples.],
2024-10-21 04:13:45,041 INFO [train.py:561] (1/4) Epoch 1222, batch 14, global_batch_idx: 19550, batch size: 142, loss[dur_loss=0.2481, prior_loss=0.9859, diff_loss=0.3689, tot_loss=1.603, over 142.00 samples.], tot_loss[dur_loss=0.246, prior_loss=0.9857, diff_loss=0.3841, tot_loss=1.616, over 2210.00 samples.],
2024-10-21 04:13:46,472 INFO [train.py:682] (1/4) Start epoch 1223
2024-10-21 04:14:06,705 INFO [train.py:561] (1/4) Epoch 1223, batch 8, global_batch_idx: 19560, batch size: 170, loss[dur_loss=0.2492, prior_loss=0.9865, diff_loss=0.3574, tot_loss=1.593, over 170.00 samples.], tot_loss[dur_loss=0.2452, prior_loss=0.9856, diff_loss=0.4181, tot_loss=1.649, over 1432.00 samples.],
2024-10-21 04:14:16,926 INFO [train.py:682] (1/4) Start epoch 1224
2024-10-21 04:14:28,787 INFO [train.py:561] (1/4) Epoch 1224, batch 2, global_batch_idx: 19570, batch size: 203, loss[dur_loss=0.2482, prior_loss=0.9858, diff_loss=0.3635, tot_loss=1.597, over 203.00 samples.], tot_loss[dur_loss=0.2483, prior_loss=0.9861, diff_loss=0.3435, tot_loss=1.578, over 442.00 samples.],
2024-10-21 04:14:43,171 INFO [train.py:561] (1/4) Epoch 1224, batch 12, global_batch_idx: 19580, batch size: 152, loss[dur_loss=0.2476, prior_loss=0.986, diff_loss=0.3256, tot_loss=1.559, over 152.00 samples.], tot_loss[dur_loss=0.2455, prior_loss=0.9857, diff_loss=0.3865, tot_loss=1.618, over 1966.00 samples.],
2024-10-21 04:14:47,586 INFO [train.py:682] (1/4) Start epoch 1225
2024-10-21 04:15:05,090 INFO [train.py:561] (1/4) Epoch 1225, batch 6, global_batch_idx: 19590, batch size: 106, loss[dur_loss=0.2491, prior_loss=0.986, diff_loss=0.2994, tot_loss=1.535, over 106.00 samples.], tot_loss[dur_loss=0.2432, prior_loss=0.9852, diff_loss=0.4189, tot_loss=1.647, over 1142.00 samples.],
2024-10-21 04:15:18,272 INFO [train.py:682] (1/4) Start epoch 1226
2024-10-21 04:15:27,225 INFO [train.py:561] (1/4) Epoch 1226, batch 0, global_batch_idx: 19600, batch size: 108, loss[dur_loss=0.2553, prior_loss=0.9869, diff_loss=0.3257, tot_loss=1.568, over 108.00 samples.], tot_loss[dur_loss=0.2553, prior_loss=0.9869, diff_loss=0.3257, tot_loss=1.568, over 108.00 samples.],
2024-10-21 04:15:41,554 INFO [train.py:561] (1/4) Epoch 1226, batch 10, global_batch_idx: 19610, batch size: 111, loss[dur_loss=0.2516, prior_loss=0.987, diff_loss=0.3254, tot_loss=1.564, over 111.00 samples.], tot_loss[dur_loss=0.245, prior_loss=0.9858, diff_loss=0.4024, tot_loss=1.633, over 1656.00 samples.],
2024-10-21 04:15:48,631 INFO [train.py:682] (1/4) Start epoch 1227
2024-10-21 04:16:02,532 INFO [train.py:561] (1/4) Epoch 1227, batch 4, global_batch_idx: 19620, batch size: 189, loss[dur_loss=0.2429, prior_loss=0.9862, diff_loss=0.3374, tot_loss=1.566, over 189.00 samples.], tot_loss[dur_loss=0.2406, prior_loss=0.9851, diff_loss=0.443, tot_loss=1.669, over 937.00 samples.],
2024-10-21 04:16:17,425 INFO [train.py:561] (1/4) Epoch 1227, batch 14, global_batch_idx: 19630, batch size: 142, loss[dur_loss=0.2493, prior_loss=0.9857, diff_loss=0.2985, tot_loss=1.533, over 142.00 samples.], tot_loss[dur_loss=0.2461, prior_loss=0.9858, diff_loss=0.3789, tot_loss=1.611, over 2210.00 samples.],
2024-10-21 04:16:18,854 INFO [train.py:682] (1/4) Start epoch 1228
2024-10-21 04:16:39,066 INFO [train.py:561] (1/4) Epoch 1228, batch 8, global_batch_idx: 19640, batch size: 170, loss[dur_loss=0.2484, prior_loss=0.9862, diff_loss=0.3543, tot_loss=1.589, over 170.00 samples.], tot_loss[dur_loss=0.2444, prior_loss=0.9855, diff_loss=0.4133, tot_loss=1.643, over 1432.00 samples.],
2024-10-21 04:16:49,290 INFO [train.py:682] (1/4) Start epoch 1229
2024-10-21 04:17:00,698 INFO [train.py:561] (1/4) Epoch 1229, batch 2, global_batch_idx: 19650, batch size: 203, loss[dur_loss=0.2482, prior_loss=0.986, diff_loss=0.3486, tot_loss=1.583, over 203.00 samples.], tot_loss[dur_loss=0.249, prior_loss=0.9862, diff_loss=0.3401, tot_loss=1.575, over 442.00 samples.],
2024-10-21 04:17:14,959 INFO [train.py:561] (1/4) Epoch 1229, batch 12, global_batch_idx: 19660, batch size: 152, loss[dur_loss=0.2463, prior_loss=0.986, diff_loss=0.3311, tot_loss=1.563, over 152.00 samples.], tot_loss[dur_loss=0.2455, prior_loss=0.9856, diff_loss=0.3905, tot_loss=1.622, over 1966.00 samples.],
2024-10-21 04:17:19,394 INFO [train.py:682] (1/4) Start epoch 1230
2024-10-21 04:17:36,433 INFO [train.py:561] (1/4) Epoch 1230, batch 6, global_batch_idx: 19670, batch size:
106, loss[dur_loss=0.2484, prior_loss=0.9863, diff_loss=0.3294, tot_loss=1.564, over 106.00 samples.], tot_loss[dur_loss=0.2431, prior_loss=0.9853, diff_loss=0.4401, tot_loss=1.669, over 1142.00 samples.], 2024-10-21 04:17:49,607 INFO [train.py:682] (1/4) Start epoch 1231 2024-10-21 04:17:59,003 INFO [train.py:561] (1/4) Epoch 1231, batch 0, global_batch_idx: 19680, batch size: 108, loss[dur_loss=0.2514, prior_loss=0.9867, diff_loss=0.3317, tot_loss=1.57, over 108.00 samples.], tot_loss[dur_loss=0.2514, prior_loss=0.9867, diff_loss=0.3317, tot_loss=1.57, over 108.00 samples.], 2024-10-21 04:18:13,375 INFO [train.py:561] (1/4) Epoch 1231, batch 10, global_batch_idx: 19690, batch size: 111, loss[dur_loss=0.2499, prior_loss=0.987, diff_loss=0.2976, tot_loss=1.534, over 111.00 samples.], tot_loss[dur_loss=0.2451, prior_loss=0.9858, diff_loss=0.3986, tot_loss=1.63, over 1656.00 samples.], 2024-10-21 04:18:20,554 INFO [train.py:682] (1/4) Start epoch 1232 2024-10-21 04:18:34,607 INFO [train.py:561] (1/4) Epoch 1232, batch 4, global_batch_idx: 19700, batch size: 189, loss[dur_loss=0.2445, prior_loss=0.986, diff_loss=0.3764, tot_loss=1.607, over 189.00 samples.], tot_loss[dur_loss=0.241, prior_loss=0.985, diff_loss=0.4668, tot_loss=1.693, over 937.00 samples.], 2024-10-21 04:18:49,663 INFO [train.py:561] (1/4) Epoch 1232, batch 14, global_batch_idx: 19710, batch size: 142, loss[dur_loss=0.2486, prior_loss=0.9857, diff_loss=0.3657, tot_loss=1.6, over 142.00 samples.], tot_loss[dur_loss=0.2456, prior_loss=0.9857, diff_loss=0.3922, tot_loss=1.624, over 2210.00 samples.], 2024-10-21 04:18:51,104 INFO [train.py:682] (1/4) Start epoch 1233 2024-10-21 04:19:11,255 INFO [train.py:561] (1/4) Epoch 1233, batch 8, global_batch_idx: 19720, batch size: 170, loss[dur_loss=0.2516, prior_loss=0.9863, diff_loss=0.3789, tot_loss=1.617, over 170.00 samples.], tot_loss[dur_loss=0.2442, prior_loss=0.9855, diff_loss=0.4169, tot_loss=1.647, over 1432.00 samples.], 2024-10-21 04:19:21,403 INFO [train.py:682] (1/4) Start epoch 1234 2024-10-21 04:19:32,752 INFO [train.py:561] (1/4) Epoch 1234, batch 2, global_batch_idx: 19730, batch size: 203, loss[dur_loss=0.2451, prior_loss=0.986, diff_loss=0.3459, tot_loss=1.577, over 203.00 samples.], tot_loss[dur_loss=0.2463, prior_loss=0.9861, diff_loss=0.3314, tot_loss=1.564, over 442.00 samples.], 2024-10-21 04:19:47,169 INFO [train.py:561] (1/4) Epoch 1234, batch 12, global_batch_idx: 19740, batch size: 152, loss[dur_loss=0.248, prior_loss=0.9857, diff_loss=0.3598, tot_loss=1.594, over 152.00 samples.], tot_loss[dur_loss=0.2444, prior_loss=0.9857, diff_loss=0.3977, tot_loss=1.628, over 1966.00 samples.], 2024-10-21 04:19:51,674 INFO [train.py:682] (1/4) Start epoch 1235 2024-10-21 04:20:09,189 INFO [train.py:561] (1/4) Epoch 1235, batch 6, global_batch_idx: 19750, batch size: 106, loss[dur_loss=0.2495, prior_loss=0.9861, diff_loss=0.3104, tot_loss=1.546, over 106.00 samples.], tot_loss[dur_loss=0.2418, prior_loss=0.9853, diff_loss=0.4302, tot_loss=1.657, over 1142.00 samples.], 2024-10-21 04:20:22,340 INFO [train.py:682] (1/4) Start epoch 1236 2024-10-21 04:20:31,307 INFO [train.py:561] (1/4) Epoch 1236, batch 0, global_batch_idx: 19760, batch size: 108, loss[dur_loss=0.2558, prior_loss=0.9869, diff_loss=0.3149, tot_loss=1.558, over 108.00 samples.], tot_loss[dur_loss=0.2558, prior_loss=0.9869, diff_loss=0.3149, tot_loss=1.558, over 108.00 samples.], 2024-10-21 04:20:45,536 INFO [train.py:561] (1/4) Epoch 1236, batch 10, global_batch_idx: 19770, batch size: 111, 
loss[dur_loss=0.2485, prior_loss=0.9867, diff_loss=0.3251, tot_loss=1.56, over 111.00 samples.], tot_loss[dur_loss=0.2459, prior_loss=0.9855, diff_loss=0.4016, tot_loss=1.633, over 1656.00 samples.], 2024-10-21 04:20:52,659 INFO [train.py:682] (1/4) Start epoch 1237 2024-10-21 04:21:06,348 INFO [train.py:561] (1/4) Epoch 1237, batch 4, global_batch_idx: 19780, batch size: 189, loss[dur_loss=0.2442, prior_loss=0.9859, diff_loss=0.3484, tot_loss=1.579, over 189.00 samples.], tot_loss[dur_loss=0.2411, prior_loss=0.9849, diff_loss=0.4512, tot_loss=1.677, over 937.00 samples.], 2024-10-21 04:21:21,348 INFO [train.py:561] (1/4) Epoch 1237, batch 14, global_batch_idx: 19790, batch size: 142, loss[dur_loss=0.2495, prior_loss=0.9859, diff_loss=0.3482, tot_loss=1.584, over 142.00 samples.], tot_loss[dur_loss=0.2456, prior_loss=0.9856, diff_loss=0.3811, tot_loss=1.612, over 2210.00 samples.], 2024-10-21 04:21:22,769 INFO [train.py:682] (1/4) Start epoch 1238 2024-10-21 04:21:43,021 INFO [train.py:561] (1/4) Epoch 1238, batch 8, global_batch_idx: 19800, batch size: 170, loss[dur_loss=0.2517, prior_loss=0.9861, diff_loss=0.3501, tot_loss=1.588, over 170.00 samples.], tot_loss[dur_loss=0.2452, prior_loss=0.9853, diff_loss=0.4146, tot_loss=1.645, over 1432.00 samples.], 2024-10-21 04:21:53,189 INFO [train.py:682] (1/4) Start epoch 1239 2024-10-21 04:22:04,398 INFO [train.py:561] (1/4) Epoch 1239, batch 2, global_batch_idx: 19810, batch size: 203, loss[dur_loss=0.2456, prior_loss=0.9856, diff_loss=0.3705, tot_loss=1.602, over 203.00 samples.], tot_loss[dur_loss=0.248, prior_loss=0.986, diff_loss=0.3227, tot_loss=1.557, over 442.00 samples.], 2024-10-21 04:22:18,780 INFO [train.py:561] (1/4) Epoch 1239, batch 12, global_batch_idx: 19820, batch size: 152, loss[dur_loss=0.2464, prior_loss=0.986, diff_loss=0.3371, tot_loss=1.57, over 152.00 samples.], tot_loss[dur_loss=0.2445, prior_loss=0.9857, diff_loss=0.387, tot_loss=1.617, over 1966.00 samples.], 2024-10-21 04:22:23,263 INFO [train.py:682] (1/4) Start epoch 1240 2024-10-21 04:22:40,528 INFO [train.py:561] (1/4) Epoch 1240, batch 6, global_batch_idx: 19830, batch size: 106, loss[dur_loss=0.2474, prior_loss=0.986, diff_loss=0.3853, tot_loss=1.619, over 106.00 samples.], tot_loss[dur_loss=0.2427, prior_loss=0.9855, diff_loss=0.4305, tot_loss=1.659, over 1142.00 samples.], 2024-10-21 04:22:53,604 INFO [train.py:682] (1/4) Start epoch 1241 2024-10-21 04:23:02,222 INFO [train.py:561] (1/4) Epoch 1241, batch 0, global_batch_idx: 19840, batch size: 108, loss[dur_loss=0.2551, prior_loss=0.9872, diff_loss=0.3175, tot_loss=1.56, over 108.00 samples.], tot_loss[dur_loss=0.2551, prior_loss=0.9872, diff_loss=0.3175, tot_loss=1.56, over 108.00 samples.], 2024-10-21 04:23:16,566 INFO [train.py:561] (1/4) Epoch 1241, batch 10, global_batch_idx: 19850, batch size: 111, loss[dur_loss=0.2578, prior_loss=0.9875, diff_loss=0.3446, tot_loss=1.59, over 111.00 samples.], tot_loss[dur_loss=0.2464, prior_loss=0.9857, diff_loss=0.4026, tot_loss=1.635, over 1656.00 samples.], 2024-10-21 04:23:23,735 INFO [train.py:682] (1/4) Start epoch 1242 2024-10-21 04:23:37,596 INFO [train.py:561] (1/4) Epoch 1242, batch 4, global_batch_idx: 19860, batch size: 189, loss[dur_loss=0.2457, prior_loss=0.9862, diff_loss=0.3527, tot_loss=1.585, over 189.00 samples.], tot_loss[dur_loss=0.2406, prior_loss=0.9849, diff_loss=0.4395, tot_loss=1.665, over 937.00 samples.], 2024-10-21 04:23:52,470 INFO [train.py:561] (1/4) Epoch 1242, batch 14, global_batch_idx: 19870, batch size: 142, loss[dur_loss=0.2479, 
prior_loss=0.9858, diff_loss=0.3232, tot_loss=1.557, over 142.00 samples.], tot_loss[dur_loss=0.245, prior_loss=0.9856, diff_loss=0.3766, tot_loss=1.607, over 2210.00 samples.], 2024-10-21 04:23:53,900 INFO [train.py:682] (1/4) Start epoch 1243 2024-10-21 04:24:14,077 INFO [train.py:561] (1/4) Epoch 1243, batch 8, global_batch_idx: 19880, batch size: 170, loss[dur_loss=0.2493, prior_loss=0.9858, diff_loss=0.3511, tot_loss=1.586, over 170.00 samples.], tot_loss[dur_loss=0.2437, prior_loss=0.9852, diff_loss=0.4101, tot_loss=1.639, over 1432.00 samples.], 2024-10-21 04:24:24,342 INFO [train.py:682] (1/4) Start epoch 1244 2024-10-21 04:24:35,683 INFO [train.py:561] (1/4) Epoch 1244, batch 2, global_batch_idx: 19890, batch size: 203, loss[dur_loss=0.2435, prior_loss=0.9852, diff_loss=0.3673, tot_loss=1.596, over 203.00 samples.], tot_loss[dur_loss=0.2456, prior_loss=0.9856, diff_loss=0.3338, tot_loss=1.565, over 442.00 samples.], 2024-10-21 04:24:49,925 INFO [train.py:561] (1/4) Epoch 1244, batch 12, global_batch_idx: 19900, batch size: 152, loss[dur_loss=0.243, prior_loss=0.9856, diff_loss=0.3247, tot_loss=1.553, over 152.00 samples.], tot_loss[dur_loss=0.2433, prior_loss=0.9853, diff_loss=0.3916, tot_loss=1.62, over 1966.00 samples.], 2024-10-21 04:24:54,375 INFO [train.py:682] (1/4) Start epoch 1245 2024-10-21 04:25:11,350 INFO [train.py:561] (1/4) Epoch 1245, batch 6, global_batch_idx: 19910, batch size: 106, loss[dur_loss=0.2467, prior_loss=0.9857, diff_loss=0.3166, tot_loss=1.549, over 106.00 samples.], tot_loss[dur_loss=0.2403, prior_loss=0.9849, diff_loss=0.4286, tot_loss=1.654, over 1142.00 samples.], 2024-10-21 04:25:24,465 INFO [train.py:682] (1/4) Start epoch 1246 2024-10-21 04:25:33,523 INFO [train.py:561] (1/4) Epoch 1246, batch 0, global_batch_idx: 19920, batch size: 108, loss[dur_loss=0.2505, prior_loss=0.9865, diff_loss=0.2613, tot_loss=1.498, over 108.00 samples.], tot_loss[dur_loss=0.2505, prior_loss=0.9865, diff_loss=0.2613, tot_loss=1.498, over 108.00 samples.], 2024-10-21 04:25:47,863 INFO [train.py:561] (1/4) Epoch 1246, batch 10, global_batch_idx: 19930, batch size: 111, loss[dur_loss=0.2499, prior_loss=0.987, diff_loss=0.3096, tot_loss=1.547, over 111.00 samples.], tot_loss[dur_loss=0.2436, prior_loss=0.9854, diff_loss=0.3823, tot_loss=1.611, over 1656.00 samples.], 2024-10-21 04:25:54,983 INFO [train.py:682] (1/4) Start epoch 1247 2024-10-21 04:26:08,753 INFO [train.py:561] (1/4) Epoch 1247, batch 4, global_batch_idx: 19940, batch size: 189, loss[dur_loss=0.2459, prior_loss=0.9858, diff_loss=0.3578, tot_loss=1.59, over 189.00 samples.], tot_loss[dur_loss=0.2416, prior_loss=0.9847, diff_loss=0.4561, tot_loss=1.682, over 937.00 samples.], 2024-10-21 04:26:26,123 INFO [train.py:561] (1/4) Epoch 1247, batch 14, global_batch_idx: 19950, batch size: 142, loss[dur_loss=0.2484, prior_loss=0.9852, diff_loss=0.3199, tot_loss=1.554, over 142.00 samples.], tot_loss[dur_loss=0.2449, prior_loss=0.9853, diff_loss=0.3923, tot_loss=1.623, over 2210.00 samples.], 2024-10-21 04:26:27,554 INFO [train.py:682] (1/4) Start epoch 1248 2024-10-21 04:26:48,124 INFO [train.py:561] (1/4) Epoch 1248, batch 8, global_batch_idx: 19960, batch size: 170, loss[dur_loss=0.2494, prior_loss=0.9855, diff_loss=0.3638, tot_loss=1.599, over 170.00 samples.], tot_loss[dur_loss=0.2436, prior_loss=0.9851, diff_loss=0.4145, tot_loss=1.643, over 1432.00 samples.], 2024-10-21 04:26:58,821 INFO [train.py:682] (1/4) Start epoch 1249 2024-10-21 04:27:10,520 INFO [train.py:561] (1/4) Epoch 1249, batch 2, 
global_batch_idx: 19970, batch size: 203, loss[dur_loss=0.2424, prior_loss=0.9854, diff_loss=0.3362, tot_loss=1.564, over 203.00 samples.], tot_loss[dur_loss=0.2451, prior_loss=0.9857, diff_loss=0.341, tot_loss=1.572, over 442.00 samples.], 2024-10-21 04:27:24,899 INFO [train.py:561] (1/4) Epoch 1249, batch 12, global_batch_idx: 19980, batch size: 152, loss[dur_loss=0.2459, prior_loss=0.9854, diff_loss=0.3367, tot_loss=1.568, over 152.00 samples.], tot_loss[dur_loss=0.2429, prior_loss=0.9852, diff_loss=0.394, tot_loss=1.622, over 1966.00 samples.], 2024-10-21 04:27:29,395 INFO [train.py:682] (1/4) Start epoch 1250 2024-10-21 04:27:46,466 INFO [train.py:561] (1/4) Epoch 1250, batch 6, global_batch_idx: 19990, batch size: 106, loss[dur_loss=0.2435, prior_loss=0.9855, diff_loss=0.326, tot_loss=1.555, over 106.00 samples.], tot_loss[dur_loss=0.2408, prior_loss=0.9848, diff_loss=0.4297, tot_loss=1.655, over 1142.00 samples.], 2024-10-21 04:27:59,596 INFO [train.py:682] (1/4) Start epoch 1251 2024-10-21 04:28:08,254 INFO [train.py:561] (1/4) Epoch 1251, batch 0, global_batch_idx: 20000, batch size: 108, loss[dur_loss=0.2498, prior_loss=0.9865, diff_loss=0.354, tot_loss=1.59, over 108.00 samples.], tot_loss[dur_loss=0.2498, prior_loss=0.9865, diff_loss=0.354, tot_loss=1.59, over 108.00 samples.], 2024-10-21 04:28:22,528 INFO [train.py:561] (1/4) Epoch 1251, batch 10, global_batch_idx: 20010, batch size: 111, loss[dur_loss=0.2547, prior_loss=0.9871, diff_loss=0.3199, tot_loss=1.562, over 111.00 samples.], tot_loss[dur_loss=0.2444, prior_loss=0.9853, diff_loss=0.3986, tot_loss=1.628, over 1656.00 samples.], 2024-10-21 04:28:29,661 INFO [train.py:682] (1/4) Start epoch 1252 2024-10-21 04:28:43,513 INFO [train.py:561] (1/4) Epoch 1252, batch 4, global_batch_idx: 20020, batch size: 189, loss[dur_loss=0.2523, prior_loss=0.9863, diff_loss=0.3578, tot_loss=1.596, over 189.00 samples.], tot_loss[dur_loss=0.242, prior_loss=0.9848, diff_loss=0.4561, tot_loss=1.683, over 937.00 samples.], 2024-10-21 04:28:58,400 INFO [train.py:561] (1/4) Epoch 1252, batch 14, global_batch_idx: 20030, batch size: 142, loss[dur_loss=0.2478, prior_loss=0.9855, diff_loss=0.3258, tot_loss=1.559, over 142.00 samples.], tot_loss[dur_loss=0.2456, prior_loss=0.9855, diff_loss=0.381, tot_loss=1.612, over 2210.00 samples.], 2024-10-21 04:28:59,839 INFO [train.py:682] (1/4) Start epoch 1253 2024-10-21 04:29:20,038 INFO [train.py:561] (1/4) Epoch 1253, batch 8, global_batch_idx: 20040, batch size: 170, loss[dur_loss=0.2474, prior_loss=0.9855, diff_loss=0.3317, tot_loss=1.565, over 170.00 samples.], tot_loss[dur_loss=0.2443, prior_loss=0.9853, diff_loss=0.4069, tot_loss=1.636, over 1432.00 samples.], 2024-10-21 04:29:30,289 INFO [train.py:682] (1/4) Start epoch 1254 2024-10-21 04:29:41,715 INFO [train.py:561] (1/4) Epoch 1254, batch 2, global_batch_idx: 20050, batch size: 203, loss[dur_loss=0.243, prior_loss=0.9856, diff_loss=0.3929, tot_loss=1.621, over 203.00 samples.], tot_loss[dur_loss=0.2449, prior_loss=0.9858, diff_loss=0.3714, tot_loss=1.602, over 442.00 samples.], 2024-10-21 04:29:56,239 INFO [train.py:561] (1/4) Epoch 1254, batch 12, global_batch_idx: 20060, batch size: 152, loss[dur_loss=0.2447, prior_loss=0.9855, diff_loss=0.35, tot_loss=1.58, over 152.00 samples.], tot_loss[dur_loss=0.2435, prior_loss=0.9853, diff_loss=0.392, tot_loss=1.621, over 1966.00 samples.], 2024-10-21 04:30:00,726 INFO [train.py:682] (1/4) Start epoch 1255 2024-10-21 04:30:17,997 INFO [train.py:561] (1/4) Epoch 1255, batch 6, global_batch_idx: 20070, 
batch size: 106, loss[dur_loss=0.2509, prior_loss=0.9859, diff_loss=0.3385, tot_loss=1.575, over 106.00 samples.], tot_loss[dur_loss=0.2423, prior_loss=0.9849, diff_loss=0.4301, tot_loss=1.657, over 1142.00 samples.], 2024-10-21 04:30:31,167 INFO [train.py:682] (1/4) Start epoch 1256 2024-10-21 04:30:40,099 INFO [train.py:561] (1/4) Epoch 1256, batch 0, global_batch_idx: 20080, batch size: 108, loss[dur_loss=0.2499, prior_loss=0.9867, diff_loss=0.3196, tot_loss=1.556, over 108.00 samples.], tot_loss[dur_loss=0.2499, prior_loss=0.9867, diff_loss=0.3196, tot_loss=1.556, over 108.00 samples.], 2024-10-21 04:30:54,497 INFO [train.py:561] (1/4) Epoch 1256, batch 10, global_batch_idx: 20090, batch size: 111, loss[dur_loss=0.2552, prior_loss=0.9869, diff_loss=0.3672, tot_loss=1.609, over 111.00 samples.], tot_loss[dur_loss=0.2444, prior_loss=0.9853, diff_loss=0.4056, tot_loss=1.635, over 1656.00 samples.], 2024-10-21 04:31:01,640 INFO [train.py:682] (1/4) Start epoch 1257 2024-10-21 04:31:15,505 INFO [train.py:561] (1/4) Epoch 1257, batch 4, global_batch_idx: 20100, batch size: 189, loss[dur_loss=0.2464, prior_loss=0.9859, diff_loss=0.3399, tot_loss=1.572, over 189.00 samples.], tot_loss[dur_loss=0.2396, prior_loss=0.9847, diff_loss=0.4511, tot_loss=1.675, over 937.00 samples.], 2024-10-21 04:31:30,528 INFO [train.py:561] (1/4) Epoch 1257, batch 14, global_batch_idx: 20110, batch size: 142, loss[dur_loss=0.2452, prior_loss=0.9854, diff_loss=0.3116, tot_loss=1.542, over 142.00 samples.], tot_loss[dur_loss=0.2439, prior_loss=0.9854, diff_loss=0.3792, tot_loss=1.608, over 2210.00 samples.], 2024-10-21 04:31:31,998 INFO [train.py:682] (1/4) Start epoch 1258 2024-10-21 04:31:52,407 INFO [train.py:561] (1/4) Epoch 1258, batch 8, global_batch_idx: 20120, batch size: 170, loss[dur_loss=0.2471, prior_loss=0.9857, diff_loss=0.341, tot_loss=1.574, over 170.00 samples.], tot_loss[dur_loss=0.2425, prior_loss=0.9851, diff_loss=0.4081, tot_loss=1.636, over 1432.00 samples.], 2024-10-21 04:32:02,643 INFO [train.py:682] (1/4) Start epoch 1259 2024-10-21 04:32:14,383 INFO [train.py:561] (1/4) Epoch 1259, batch 2, global_batch_idx: 20130, batch size: 203, loss[dur_loss=0.2445, prior_loss=0.9855, diff_loss=0.367, tot_loss=1.597, over 203.00 samples.], tot_loss[dur_loss=0.2468, prior_loss=0.9857, diff_loss=0.3545, tot_loss=1.587, over 442.00 samples.], 2024-10-21 04:32:30,027 INFO [train.py:561] (1/4) Epoch 1259, batch 12, global_batch_idx: 20140, batch size: 152, loss[dur_loss=0.2454, prior_loss=0.9857, diff_loss=0.3635, tot_loss=1.595, over 152.00 samples.], tot_loss[dur_loss=0.2442, prior_loss=0.9853, diff_loss=0.3992, tot_loss=1.629, over 1966.00 samples.], 2024-10-21 04:32:34,498 INFO [train.py:682] (1/4) Start epoch 1260 2024-10-21 04:32:51,828 INFO [train.py:561] (1/4) Epoch 1260, batch 6, global_batch_idx: 20150, batch size: 106, loss[dur_loss=0.2471, prior_loss=0.986, diff_loss=0.3289, tot_loss=1.562, over 106.00 samples.], tot_loss[dur_loss=0.241, prior_loss=0.985, diff_loss=0.4297, tot_loss=1.656, over 1142.00 samples.], 2024-10-21 04:33:04,991 INFO [train.py:682] (1/4) Start epoch 1261 2024-10-21 04:33:13,888 INFO [train.py:561] (1/4) Epoch 1261, batch 0, global_batch_idx: 20160, batch size: 108, loss[dur_loss=0.2504, prior_loss=0.9865, diff_loss=0.3392, tot_loss=1.576, over 108.00 samples.], tot_loss[dur_loss=0.2504, prior_loss=0.9865, diff_loss=0.3392, tot_loss=1.576, over 108.00 samples.], 2024-10-21 04:33:28,243 INFO [train.py:561] (1/4) Epoch 1261, batch 10, global_batch_idx: 20170, batch size: 111, 
loss[dur_loss=0.2535, prior_loss=0.987, diff_loss=0.3137, tot_loss=1.554, over 111.00 samples.], tot_loss[dur_loss=0.2448, prior_loss=0.9854, diff_loss=0.3998, tot_loss=1.63, over 1656.00 samples.], 2024-10-21 04:33:35,388 INFO [train.py:682] (1/4) Start epoch 1262 2024-10-21 04:33:49,226 INFO [train.py:561] (1/4) Epoch 1262, batch 4, global_batch_idx: 20180, batch size: 189, loss[dur_loss=0.2491, prior_loss=0.9861, diff_loss=0.3629, tot_loss=1.598, over 189.00 samples.], tot_loss[dur_loss=0.2412, prior_loss=0.9847, diff_loss=0.4556, tot_loss=1.682, over 937.00 samples.], 2024-10-21 04:34:04,164 INFO [train.py:561] (1/4) Epoch 1262, batch 14, global_batch_idx: 20190, batch size: 142, loss[dur_loss=0.2424, prior_loss=0.985, diff_loss=0.3494, tot_loss=1.577, over 142.00 samples.], tot_loss[dur_loss=0.2446, prior_loss=0.9853, diff_loss=0.3899, tot_loss=1.62, over 2210.00 samples.], 2024-10-21 04:34:05,617 INFO [train.py:682] (1/4) Start epoch 1263 2024-10-21 04:34:25,525 INFO [train.py:561] (1/4) Epoch 1263, batch 8, global_batch_idx: 20200, batch size: 170, loss[dur_loss=0.2487, prior_loss=0.9856, diff_loss=0.3622, tot_loss=1.597, over 170.00 samples.], tot_loss[dur_loss=0.2418, prior_loss=0.9849, diff_loss=0.4119, tot_loss=1.639, over 1432.00 samples.], 2024-10-21 04:34:35,723 INFO [train.py:682] (1/4) Start epoch 1264 2024-10-21 04:34:47,775 INFO [train.py:561] (1/4) Epoch 1264, batch 2, global_batch_idx: 20210, batch size: 203, loss[dur_loss=0.2424, prior_loss=0.9852, diff_loss=0.3842, tot_loss=1.612, over 203.00 samples.], tot_loss[dur_loss=0.2462, prior_loss=0.9855, diff_loss=0.3596, tot_loss=1.591, over 442.00 samples.], 2024-10-21 04:35:02,177 INFO [train.py:561] (1/4) Epoch 1264, batch 12, global_batch_idx: 20220, batch size: 152, loss[dur_loss=0.2505, prior_loss=0.9856, diff_loss=0.3443, tot_loss=1.58, over 152.00 samples.], tot_loss[dur_loss=0.2445, prior_loss=0.9851, diff_loss=0.3886, tot_loss=1.618, over 1966.00 samples.], 2024-10-21 04:35:06,645 INFO [train.py:682] (1/4) Start epoch 1265 2024-10-21 04:35:23,695 INFO [train.py:561] (1/4) Epoch 1265, batch 6, global_batch_idx: 20230, batch size: 106, loss[dur_loss=0.2499, prior_loss=0.986, diff_loss=0.3222, tot_loss=1.558, over 106.00 samples.], tot_loss[dur_loss=0.2415, prior_loss=0.9848, diff_loss=0.4228, tot_loss=1.649, over 1142.00 samples.], 2024-10-21 04:35:36,842 INFO [train.py:682] (1/4) Start epoch 1266 2024-10-21 04:35:45,784 INFO [train.py:561] (1/4) Epoch 1266, batch 0, global_batch_idx: 20240, batch size: 108, loss[dur_loss=0.2489, prior_loss=0.9862, diff_loss=0.3155, tot_loss=1.551, over 108.00 samples.], tot_loss[dur_loss=0.2489, prior_loss=0.9862, diff_loss=0.3155, tot_loss=1.551, over 108.00 samples.], 2024-10-21 04:36:00,111 INFO [train.py:561] (1/4) Epoch 1266, batch 10, global_batch_idx: 20250, batch size: 111, loss[dur_loss=0.2493, prior_loss=0.9865, diff_loss=0.3556, tot_loss=1.591, over 111.00 samples.], tot_loss[dur_loss=0.2423, prior_loss=0.9851, diff_loss=0.3965, tot_loss=1.624, over 1656.00 samples.], 2024-10-21 04:36:07,270 INFO [train.py:682] (1/4) Start epoch 1267 2024-10-21 04:36:21,092 INFO [train.py:561] (1/4) Epoch 1267, batch 4, global_batch_idx: 20260, batch size: 189, loss[dur_loss=0.2481, prior_loss=0.9856, diff_loss=0.3664, tot_loss=1.6, over 189.00 samples.], tot_loss[dur_loss=0.2403, prior_loss=0.9847, diff_loss=0.4538, tot_loss=1.679, over 937.00 samples.], 2024-10-21 04:36:36,060 INFO [train.py:561] (1/4) Epoch 1267, batch 14, global_batch_idx: 20270, batch size: 142, 
loss[dur_loss=0.2478, prior_loss=0.985, diff_loss=0.3436, tot_loss=1.576, over 142.00 samples.], tot_loss[dur_loss=0.2438, prior_loss=0.9852, diff_loss=0.3848, tot_loss=1.614, over 2210.00 samples.], 2024-10-21 04:36:37,497 INFO [train.py:682] (1/4) Start epoch 1268 2024-10-21 04:36:57,772 INFO [train.py:561] (1/4) Epoch 1268, batch 8, global_batch_idx: 20280, batch size: 170, loss[dur_loss=0.2467, prior_loss=0.9861, diff_loss=0.3869, tot_loss=1.62, over 170.00 samples.], tot_loss[dur_loss=0.2426, prior_loss=0.9853, diff_loss=0.4144, tot_loss=1.642, over 1432.00 samples.], 2024-10-21 04:37:08,443 INFO [train.py:682] (1/4) Start epoch 1269 2024-10-21 04:37:19,987 INFO [train.py:561] (1/4) Epoch 1269, batch 2, global_batch_idx: 20290, batch size: 203, loss[dur_loss=0.2463, prior_loss=0.9853, diff_loss=0.3549, tot_loss=1.586, over 203.00 samples.], tot_loss[dur_loss=0.2464, prior_loss=0.9855, diff_loss=0.3502, tot_loss=1.582, over 442.00 samples.], 2024-10-21 04:37:34,362 INFO [train.py:561] (1/4) Epoch 1269, batch 12, global_batch_idx: 20300, batch size: 152, loss[dur_loss=0.249, prior_loss=0.9853, diff_loss=0.3558, tot_loss=1.59, over 152.00 samples.], tot_loss[dur_loss=0.2447, prior_loss=0.9853, diff_loss=0.3975, tot_loss=1.627, over 1966.00 samples.], 2024-10-21 04:37:38,843 INFO [train.py:682] (1/4) Start epoch 1270 2024-10-21 04:37:55,691 INFO [train.py:561] (1/4) Epoch 1270, batch 6, global_batch_idx: 20310, batch size: 106, loss[dur_loss=0.2471, prior_loss=0.9857, diff_loss=0.3644, tot_loss=1.597, over 106.00 samples.], tot_loss[dur_loss=0.2404, prior_loss=0.9848, diff_loss=0.4301, tot_loss=1.655, over 1142.00 samples.], 2024-10-21 04:38:08,714 INFO [train.py:682] (1/4) Start epoch 1271 2024-10-21 04:38:17,267 INFO [train.py:561] (1/4) Epoch 1271, batch 0, global_batch_idx: 20320, batch size: 108, loss[dur_loss=0.2522, prior_loss=0.9863, diff_loss=0.3027, tot_loss=1.541, over 108.00 samples.], tot_loss[dur_loss=0.2522, prior_loss=0.9863, diff_loss=0.3027, tot_loss=1.541, over 108.00 samples.], 2024-10-21 04:38:31,681 INFO [train.py:561] (1/4) Epoch 1271, batch 10, global_batch_idx: 20330, batch size: 111, loss[dur_loss=0.2511, prior_loss=0.9865, diff_loss=0.3286, tot_loss=1.566, over 111.00 samples.], tot_loss[dur_loss=0.2428, prior_loss=0.9851, diff_loss=0.3898, tot_loss=1.618, over 1656.00 samples.], 2024-10-21 04:38:38,810 INFO [train.py:682] (1/4) Start epoch 1272 2024-10-21 04:38:52,448 INFO [train.py:561] (1/4) Epoch 1272, batch 4, global_batch_idx: 20340, batch size: 189, loss[dur_loss=0.2437, prior_loss=0.9856, diff_loss=0.3587, tot_loss=1.588, over 189.00 samples.], tot_loss[dur_loss=0.2374, prior_loss=0.9845, diff_loss=0.4459, tot_loss=1.668, over 937.00 samples.], 2024-10-21 04:39:07,313 INFO [train.py:561] (1/4) Epoch 1272, batch 14, global_batch_idx: 20350, batch size: 142, loss[dur_loss=0.245, prior_loss=0.9848, diff_loss=0.3362, tot_loss=1.566, over 142.00 samples.], tot_loss[dur_loss=0.2422, prior_loss=0.9851, diff_loss=0.3789, tot_loss=1.606, over 2210.00 samples.], 2024-10-21 04:39:08,769 INFO [train.py:682] (1/4) Start epoch 1273 2024-10-21 04:39:29,229 INFO [train.py:561] (1/4) Epoch 1273, batch 8, global_batch_idx: 20360, batch size: 170, loss[dur_loss=0.2471, prior_loss=0.9856, diff_loss=0.3386, tot_loss=1.571, over 170.00 samples.], tot_loss[dur_loss=0.2422, prior_loss=0.9849, diff_loss=0.4098, tot_loss=1.637, over 1432.00 samples.], 2024-10-21 04:39:39,346 INFO [train.py:682] (1/4) Start epoch 1274 2024-10-21 04:39:50,547 INFO [train.py:561] (1/4) Epoch 1274, 
batch 2, global_batch_idx: 20370, batch size: 203, loss[dur_loss=0.2434, prior_loss=0.9853, diff_loss=0.3829, tot_loss=1.612, over 203.00 samples.], tot_loss[dur_loss=0.245, prior_loss=0.9855, diff_loss=0.35, tot_loss=1.581, over 442.00 samples.], 2024-10-21 04:40:04,817 INFO [train.py:561] (1/4) Epoch 1274, batch 12, global_batch_idx: 20380, batch size: 152, loss[dur_loss=0.2433, prior_loss=0.9853, diff_loss=0.3442, tot_loss=1.573, over 152.00 samples.], tot_loss[dur_loss=0.2427, prior_loss=0.9851, diff_loss=0.3939, tot_loss=1.622, over 1966.00 samples.], 2024-10-21 04:40:09,273 INFO [train.py:682] (1/4) Start epoch 1275 2024-10-21 04:40:26,398 INFO [train.py:561] (1/4) Epoch 1275, batch 6, global_batch_idx: 20390, batch size: 106, loss[dur_loss=0.2472, prior_loss=0.9855, diff_loss=0.3234, tot_loss=1.556, over 106.00 samples.], tot_loss[dur_loss=0.2404, prior_loss=0.9848, diff_loss=0.438, tot_loss=1.663, over 1142.00 samples.], 2024-10-21 04:40:39,565 INFO [train.py:682] (1/4) Start epoch 1276 2024-10-21 04:40:48,329 INFO [train.py:561] (1/4) Epoch 1276, batch 0, global_batch_idx: 20400, batch size: 108, loss[dur_loss=0.2499, prior_loss=0.9863, diff_loss=0.3744, tot_loss=1.611, over 108.00 samples.], tot_loss[dur_loss=0.2499, prior_loss=0.9863, diff_loss=0.3744, tot_loss=1.611, over 108.00 samples.], 2024-10-21 04:41:02,600 INFO [train.py:561] (1/4) Epoch 1276, batch 10, global_batch_idx: 20410, batch size: 111, loss[dur_loss=0.2468, prior_loss=0.9867, diff_loss=0.3406, tot_loss=1.574, over 111.00 samples.], tot_loss[dur_loss=0.243, prior_loss=0.9851, diff_loss=0.4049, tot_loss=1.633, over 1656.00 samples.], 2024-10-21 04:41:09,774 INFO [train.py:682] (1/4) Start epoch 1277 2024-10-21 04:41:23,483 INFO [train.py:561] (1/4) Epoch 1277, batch 4, global_batch_idx: 20420, batch size: 189, loss[dur_loss=0.2478, prior_loss=0.9858, diff_loss=0.3445, tot_loss=1.578, over 189.00 samples.], tot_loss[dur_loss=0.2388, prior_loss=0.9845, diff_loss=0.4521, tot_loss=1.675, over 937.00 samples.], 2024-10-21 04:41:38,444 INFO [train.py:561] (1/4) Epoch 1277, batch 14, global_batch_idx: 20430, batch size: 142, loss[dur_loss=0.2464, prior_loss=0.9852, diff_loss=0.3515, tot_loss=1.583, over 142.00 samples.], tot_loss[dur_loss=0.2436, prior_loss=0.9851, diff_loss=0.3886, tot_loss=1.617, over 2210.00 samples.], 2024-10-21 04:41:39,874 INFO [train.py:682] (1/4) Start epoch 1278 2024-10-21 04:42:00,079 INFO [train.py:561] (1/4) Epoch 1278, batch 8, global_batch_idx: 20440, batch size: 170, loss[dur_loss=0.2496, prior_loss=0.9856, diff_loss=0.3363, tot_loss=1.572, over 170.00 samples.], tot_loss[dur_loss=0.2423, prior_loss=0.9848, diff_loss=0.4037, tot_loss=1.631, over 1432.00 samples.], 2024-10-21 04:42:10,153 INFO [train.py:682] (1/4) Start epoch 1279 2024-10-21 04:42:21,736 INFO [train.py:561] (1/4) Epoch 1279, batch 2, global_batch_idx: 20450, batch size: 203, loss[dur_loss=0.2443, prior_loss=0.9852, diff_loss=0.3757, tot_loss=1.605, over 203.00 samples.], tot_loss[dur_loss=0.2448, prior_loss=0.9854, diff_loss=0.35, tot_loss=1.58, over 442.00 samples.], 2024-10-21 04:42:35,992 INFO [train.py:561] (1/4) Epoch 1279, batch 12, global_batch_idx: 20460, batch size: 152, loss[dur_loss=0.2452, prior_loss=0.9857, diff_loss=0.3197, tot_loss=1.551, over 152.00 samples.], tot_loss[dur_loss=0.2426, prior_loss=0.9851, diff_loss=0.3885, tot_loss=1.616, over 1966.00 samples.], 2024-10-21 04:42:40,480 INFO [train.py:682] (1/4) Start epoch 1280 2024-10-21 04:42:57,367 INFO [train.py:561] (1/4) Epoch 1280, batch 6, 
global_batch_idx: 20470, batch size: 106, loss[dur_loss=0.2509, prior_loss=0.9857, diff_loss=0.3505, tot_loss=1.587, over 106.00 samples.], tot_loss[dur_loss=0.2417, prior_loss=0.9848, diff_loss=0.4307, tot_loss=1.657, over 1142.00 samples.], 2024-10-21 04:43:10,505 INFO [train.py:682] (1/4) Start epoch 1281 2024-10-21 04:43:19,577 INFO [train.py:561] (1/4) Epoch 1281, batch 0, global_batch_idx: 20480, batch size: 108, loss[dur_loss=0.2492, prior_loss=0.9865, diff_loss=0.3387, tot_loss=1.574, over 108.00 samples.], tot_loss[dur_loss=0.2492, prior_loss=0.9865, diff_loss=0.3387, tot_loss=1.574, over 108.00 samples.], 2024-10-21 04:43:33,839 INFO [train.py:561] (1/4) Epoch 1281, batch 10, global_batch_idx: 20490, batch size: 111, loss[dur_loss=0.2476, prior_loss=0.9864, diff_loss=0.3451, tot_loss=1.579, over 111.00 samples.], tot_loss[dur_loss=0.2417, prior_loss=0.9849, diff_loss=0.3972, tot_loss=1.624, over 1656.00 samples.], 2024-10-21 04:43:40,985 INFO [train.py:682] (1/4) Start epoch 1282 2024-10-21 04:43:54,620 INFO [train.py:561] (1/4) Epoch 1282, batch 4, global_batch_idx: 20500, batch size: 189, loss[dur_loss=0.2454, prior_loss=0.9861, diff_loss=0.3447, tot_loss=1.576, over 189.00 samples.], tot_loss[dur_loss=0.2397, prior_loss=0.9845, diff_loss=0.4509, tot_loss=1.675, over 937.00 samples.], 2024-10-21 04:44:09,530 INFO [train.py:561] (1/4) Epoch 1282, batch 14, global_batch_idx: 20510, batch size: 142, loss[dur_loss=0.2456, prior_loss=0.9854, diff_loss=0.297, tot_loss=1.528, over 142.00 samples.], tot_loss[dur_loss=0.2442, prior_loss=0.9853, diff_loss=0.3802, tot_loss=1.61, over 2210.00 samples.], 2024-10-21 04:44:10,965 INFO [train.py:682] (1/4) Start epoch 1283 2024-10-21 04:44:31,115 INFO [train.py:561] (1/4) Epoch 1283, batch 8, global_batch_idx: 20520, batch size: 170, loss[dur_loss=0.2487, prior_loss=0.9856, diff_loss=0.3268, tot_loss=1.561, over 170.00 samples.], tot_loss[dur_loss=0.2432, prior_loss=0.9851, diff_loss=0.4188, tot_loss=1.647, over 1432.00 samples.], 2024-10-21 04:44:41,289 INFO [train.py:682] (1/4) Start epoch 1284 2024-10-21 04:44:52,637 INFO [train.py:561] (1/4) Epoch 1284, batch 2, global_batch_idx: 20530, batch size: 203, loss[dur_loss=0.243, prior_loss=0.9854, diff_loss=0.3496, tot_loss=1.578, over 203.00 samples.], tot_loss[dur_loss=0.2457, prior_loss=0.9856, diff_loss=0.3419, tot_loss=1.573, over 442.00 samples.], 2024-10-21 04:45:07,019 INFO [train.py:561] (1/4) Epoch 1284, batch 12, global_batch_idx: 20540, batch size: 152, loss[dur_loss=0.2437, prior_loss=0.9853, diff_loss=0.3439, tot_loss=1.573, over 152.00 samples.], tot_loss[dur_loss=0.2437, prior_loss=0.9851, diff_loss=0.3843, tot_loss=1.613, over 1966.00 samples.], 2024-10-21 04:45:11,518 INFO [train.py:682] (1/4) Start epoch 1285 2024-10-21 04:45:28,523 INFO [train.py:561] (1/4) Epoch 1285, batch 6, global_batch_idx: 20550, batch size: 106, loss[dur_loss=0.2477, prior_loss=0.9857, diff_loss=0.3093, tot_loss=1.543, over 106.00 samples.], tot_loss[dur_loss=0.2405, prior_loss=0.9847, diff_loss=0.4199, tot_loss=1.645, over 1142.00 samples.], 2024-10-21 04:45:41,664 INFO [train.py:682] (1/4) Start epoch 1286 2024-10-21 04:45:50,623 INFO [train.py:561] (1/4) Epoch 1286, batch 0, global_batch_idx: 20560, batch size: 108, loss[dur_loss=0.2508, prior_loss=0.9865, diff_loss=0.2918, tot_loss=1.529, over 108.00 samples.], tot_loss[dur_loss=0.2508, prior_loss=0.9865, diff_loss=0.2918, tot_loss=1.529, over 108.00 samples.], 2024-10-21 04:46:04,966 INFO [train.py:561] (1/4) Epoch 1286, batch 10, 
global_batch_idx: 20570, batch size: 111, loss[dur_loss=0.2482, prior_loss=0.9865, diff_loss=0.3512, tot_loss=1.586, over 111.00 samples.], tot_loss[dur_loss=0.243, prior_loss=0.9851, diff_loss=0.3942, tot_loss=1.622, over 1656.00 samples.], 2024-10-21 04:46:12,100 INFO [train.py:682] (1/4) Start epoch 1287 2024-10-21 04:46:25,787 INFO [train.py:561] (1/4) Epoch 1287, batch 4, global_batch_idx: 20580, batch size: 189, loss[dur_loss=0.2491, prior_loss=0.9857, diff_loss=0.331, tot_loss=1.566, over 189.00 samples.], tot_loss[dur_loss=0.2397, prior_loss=0.9845, diff_loss=0.4369, tot_loss=1.661, over 937.00 samples.], 2024-10-21 04:46:40,741 INFO [train.py:561] (1/4) Epoch 1287, batch 14, global_batch_idx: 20590, batch size: 142, loss[dur_loss=0.2479, prior_loss=0.9859, diff_loss=0.3423, tot_loss=1.576, over 142.00 samples.], tot_loss[dur_loss=0.2439, prior_loss=0.9853, diff_loss=0.3752, tot_loss=1.604, over 2210.00 samples.], 2024-10-21 04:46:42,200 INFO [train.py:682] (1/4) Start epoch 1288 2024-10-21 04:47:02,131 INFO [train.py:561] (1/4) Epoch 1288, batch 8, global_batch_idx: 20600, batch size: 170, loss[dur_loss=0.2511, prior_loss=0.9857, diff_loss=0.3713, tot_loss=1.608, over 170.00 samples.], tot_loss[dur_loss=0.2436, prior_loss=0.9849, diff_loss=0.4064, tot_loss=1.635, over 1432.00 samples.], 2024-10-21 04:47:12,355 INFO [train.py:682] (1/4) Start epoch 1289 2024-10-21 04:47:24,156 INFO [train.py:561] (1/4) Epoch 1289, batch 2, global_batch_idx: 20610, batch size: 203, loss[dur_loss=0.2481, prior_loss=0.9853, diff_loss=0.3839, tot_loss=1.617, over 203.00 samples.], tot_loss[dur_loss=0.2464, prior_loss=0.9854, diff_loss=0.3465, tot_loss=1.578, over 442.00 samples.], 2024-10-21 04:47:38,316 INFO [train.py:561] (1/4) Epoch 1289, batch 12, global_batch_idx: 20620, batch size: 152, loss[dur_loss=0.2453, prior_loss=0.9853, diff_loss=0.3302, tot_loss=1.561, over 152.00 samples.], tot_loss[dur_loss=0.2426, prior_loss=0.985, diff_loss=0.3862, tot_loss=1.614, over 1966.00 samples.], 2024-10-21 04:47:42,781 INFO [train.py:682] (1/4) Start epoch 1290 2024-10-21 04:47:59,779 INFO [train.py:561] (1/4) Epoch 1290, batch 6, global_batch_idx: 20630, batch size: 106, loss[dur_loss=0.248, prior_loss=0.9855, diff_loss=0.3725, tot_loss=1.606, over 106.00 samples.], tot_loss[dur_loss=0.2379, prior_loss=0.9845, diff_loss=0.4316, tot_loss=1.654, over 1142.00 samples.], 2024-10-21 04:48:12,824 INFO [train.py:682] (1/4) Start epoch 1291 2024-10-21 04:48:21,663 INFO [train.py:561] (1/4) Epoch 1291, batch 0, global_batch_idx: 20640, batch size: 108, loss[dur_loss=0.2496, prior_loss=0.9862, diff_loss=0.3038, tot_loss=1.54, over 108.00 samples.], tot_loss[dur_loss=0.2496, prior_loss=0.9862, diff_loss=0.3038, tot_loss=1.54, over 108.00 samples.], 2024-10-21 04:48:35,909 INFO [train.py:561] (1/4) Epoch 1291, batch 10, global_batch_idx: 20650, batch size: 111, loss[dur_loss=0.2478, prior_loss=0.9865, diff_loss=0.3297, tot_loss=1.564, over 111.00 samples.], tot_loss[dur_loss=0.243, prior_loss=0.9851, diff_loss=0.41, tot_loss=1.638, over 1656.00 samples.], 2024-10-21 04:48:43,050 INFO [train.py:682] (1/4) Start epoch 1292 2024-10-21 04:48:56,763 INFO [train.py:561] (1/4) Epoch 1292, batch 4, global_batch_idx: 20660, batch size: 189, loss[dur_loss=0.2436, prior_loss=0.9854, diff_loss=0.3704, tot_loss=1.599, over 189.00 samples.], tot_loss[dur_loss=0.238, prior_loss=0.9842, diff_loss=0.4481, tot_loss=1.67, over 937.00 samples.], 2024-10-21 04:49:11,786 INFO [train.py:561] (1/4) Epoch 1292, batch 14, global_batch_idx: 
20670, batch size: 142, loss[dur_loss=0.248, prior_loss=0.9852, diff_loss=0.3427, tot_loss=1.576, over 142.00 samples.], tot_loss[dur_loss=0.2433, prior_loss=0.985, diff_loss=0.3874, tot_loss=1.616, over 2210.00 samples.], 2024-10-21 04:49:13,242 INFO [train.py:682] (1/4) Start epoch 1293 2024-10-21 04:49:33,128 INFO [train.py:561] (1/4) Epoch 1293, batch 8, global_batch_idx: 20680, batch size: 170, loss[dur_loss=0.247, prior_loss=0.9854, diff_loss=0.3433, tot_loss=1.576, over 170.00 samples.], tot_loss[dur_loss=0.2412, prior_loss=0.9846, diff_loss=0.4098, tot_loss=1.636, over 1432.00 samples.], 2024-10-21 04:49:43,355 INFO [train.py:682] (1/4) Start epoch 1294 2024-10-21 04:49:54,895 INFO [train.py:561] (1/4) Epoch 1294, batch 2, global_batch_idx: 20690, batch size: 203, loss[dur_loss=0.2441, prior_loss=0.9848, diff_loss=0.3856, tot_loss=1.615, over 203.00 samples.], tot_loss[dur_loss=0.2453, prior_loss=0.9851, diff_loss=0.3469, tot_loss=1.577, over 442.00 samples.], 2024-10-21 04:50:09,202 INFO [train.py:561] (1/4) Epoch 1294, batch 12, global_batch_idx: 20700, batch size: 152, loss[dur_loss=0.2424, prior_loss=0.985, diff_loss=0.3513, tot_loss=1.579, over 152.00 samples.], tot_loss[dur_loss=0.2413, prior_loss=0.9848, diff_loss=0.4001, tot_loss=1.626, over 1966.00 samples.], 2024-10-21 04:50:13,697 INFO [train.py:682] (1/4) Start epoch 1295 2024-10-21 04:50:30,734 INFO [train.py:561] (1/4) Epoch 1295, batch 6, global_batch_idx: 20710, batch size: 106, loss[dur_loss=0.2462, prior_loss=0.9852, diff_loss=0.3711, tot_loss=1.602, over 106.00 samples.], tot_loss[dur_loss=0.2383, prior_loss=0.9842, diff_loss=0.4268, tot_loss=1.649, over 1142.00 samples.], 2024-10-21 04:50:43,815 INFO [train.py:682] (1/4) Start epoch 1296 2024-10-21 04:50:52,695 INFO [train.py:561] (1/4) Epoch 1296, batch 0, global_batch_idx: 20720, batch size: 108, loss[dur_loss=0.2479, prior_loss=0.9859, diff_loss=0.346, tot_loss=1.58, over 108.00 samples.], tot_loss[dur_loss=0.2479, prior_loss=0.9859, diff_loss=0.346, tot_loss=1.58, over 108.00 samples.], 2024-10-21 04:51:06,957 INFO [train.py:561] (1/4) Epoch 1296, batch 10, global_batch_idx: 20730, batch size: 111, loss[dur_loss=0.2477, prior_loss=0.9861, diff_loss=0.3316, tot_loss=1.565, over 111.00 samples.], tot_loss[dur_loss=0.2409, prior_loss=0.9847, diff_loss=0.4037, tot_loss=1.629, over 1656.00 samples.], 2024-10-21 04:51:14,127 INFO [train.py:682] (1/4) Start epoch 1297 2024-10-21 04:51:28,065 INFO [train.py:561] (1/4) Epoch 1297, batch 4, global_batch_idx: 20740, batch size: 189, loss[dur_loss=0.2391, prior_loss=0.9853, diff_loss=0.3773, tot_loss=1.602, over 189.00 samples.], tot_loss[dur_loss=0.236, prior_loss=0.9841, diff_loss=0.453, tot_loss=1.673, over 937.00 samples.], 2024-10-21 04:51:43,086 INFO [train.py:561] (1/4) Epoch 1297, batch 14, global_batch_idx: 20750, batch size: 142, loss[dur_loss=0.2411, prior_loss=0.9848, diff_loss=0.3321, tot_loss=1.558, over 142.00 samples.], tot_loss[dur_loss=0.241, prior_loss=0.9848, diff_loss=0.3856, tot_loss=1.611, over 2210.00 samples.], 2024-10-21 04:51:44,526 INFO [train.py:682] (1/4) Start epoch 1298 2024-10-21 04:52:07,001 INFO [train.py:561] (1/4) Epoch 1298, batch 8, global_batch_idx: 20760, batch size: 170, loss[dur_loss=0.2512, prior_loss=0.9855, diff_loss=0.3438, tot_loss=1.581, over 170.00 samples.], tot_loss[dur_loss=0.2414, prior_loss=0.9846, diff_loss=0.4067, tot_loss=1.633, over 1432.00 samples.], 2024-10-21 04:52:17,227 INFO [train.py:682] (1/4) Start epoch 1299 2024-10-21 04:52:28,512 INFO [train.py:561] 
(1/4) Epoch 1299, batch 2, global_batch_idx: 20770, batch size: 203, loss[dur_loss=0.2443, prior_loss=0.985, diff_loss=0.37, tot_loss=1.599, over 203.00 samples.], tot_loss[dur_loss=0.2455, prior_loss=0.9851, diff_loss=0.3467, tot_loss=1.577, over 442.00 samples.], 2024-10-21 04:52:42,796 INFO [train.py:561] (1/4) Epoch 1299, batch 12, global_batch_idx: 20780, batch size: 152, loss[dur_loss=0.2427, prior_loss=0.9853, diff_loss=0.331, tot_loss=1.559, over 152.00 samples.], tot_loss[dur_loss=0.2419, prior_loss=0.9849, diff_loss=0.3831, tot_loss=1.61, over 1966.00 samples.], 2024-10-21 04:52:47,278 INFO [train.py:682] (1/4) Start epoch 1300 2024-10-21 04:53:04,554 INFO [train.py:561] (1/4) Epoch 1300, batch 6, global_batch_idx: 20790, batch size: 106, loss[dur_loss=0.2478, prior_loss=0.9853, diff_loss=0.3895, tot_loss=1.623, over 106.00 samples.], tot_loss[dur_loss=0.2403, prior_loss=0.9844, diff_loss=0.4247, tot_loss=1.649, over 1142.00 samples.], 2024-10-21 04:53:17,719 INFO [train.py:682] (1/4) Start epoch 1301 2024-10-21 04:53:26,473 INFO [train.py:561] (1/4) Epoch 1301, batch 0, global_batch_idx: 20800, batch size: 108, loss[dur_loss=0.2496, prior_loss=0.9859, diff_loss=0.3248, tot_loss=1.56, over 108.00 samples.], tot_loss[dur_loss=0.2496, prior_loss=0.9859, diff_loss=0.3248, tot_loss=1.56, over 108.00 samples.], 2024-10-21 04:53:40,761 INFO [train.py:561] (1/4) Epoch 1301, batch 10, global_batch_idx: 20810, batch size: 111, loss[dur_loss=0.2443, prior_loss=0.9862, diff_loss=0.3175, tot_loss=1.548, over 111.00 samples.], tot_loss[dur_loss=0.2414, prior_loss=0.9847, diff_loss=0.396, tot_loss=1.622, over 1656.00 samples.], 2024-10-21 04:53:47,907 INFO [train.py:682] (1/4) Start epoch 1302 2024-10-21 04:54:01,707 INFO [train.py:561] (1/4) Epoch 1302, batch 4, global_batch_idx: 20820, batch size: 189, loss[dur_loss=0.2477, prior_loss=0.9853, diff_loss=0.3695, tot_loss=1.602, over 189.00 samples.], tot_loss[dur_loss=0.238, prior_loss=0.984, diff_loss=0.4487, tot_loss=1.671, over 937.00 samples.], 2024-10-21 04:54:16,668 INFO [train.py:561] (1/4) Epoch 1302, batch 14, global_batch_idx: 20830, batch size: 142, loss[dur_loss=0.2462, prior_loss=0.9848, diff_loss=0.3079, tot_loss=1.539, over 142.00 samples.], tot_loss[dur_loss=0.2426, prior_loss=0.9847, diff_loss=0.3809, tot_loss=1.608, over 2210.00 samples.], 2024-10-21 04:54:18,105 INFO [train.py:682] (1/4) Start epoch 1303 2024-10-21 04:54:38,098 INFO [train.py:561] (1/4) Epoch 1303, batch 8, global_batch_idx: 20840, batch size: 170, loss[dur_loss=0.2483, prior_loss=0.9854, diff_loss=0.3444, tot_loss=1.578, over 170.00 samples.], tot_loss[dur_loss=0.2414, prior_loss=0.9845, diff_loss=0.3938, tot_loss=1.62, over 1432.00 samples.], 2024-10-21 04:54:48,384 INFO [train.py:682] (1/4) Start epoch 1304 2024-10-21 04:55:00,265 INFO [train.py:561] (1/4) Epoch 1304, batch 2, global_batch_idx: 20850, batch size: 203, loss[dur_loss=0.2449, prior_loss=0.9849, diff_loss=0.3621, tot_loss=1.592, over 203.00 samples.], tot_loss[dur_loss=0.2448, prior_loss=0.9851, diff_loss=0.3376, tot_loss=1.568, over 442.00 samples.], 2024-10-21 04:55:14,609 INFO [train.py:561] (1/4) Epoch 1304, batch 12, global_batch_idx: 20860, batch size: 152, loss[dur_loss=0.2425, prior_loss=0.9853, diff_loss=0.3214, tot_loss=1.549, over 152.00 samples.], tot_loss[dur_loss=0.2414, prior_loss=0.9847, diff_loss=0.3946, tot_loss=1.621, over 1966.00 samples.], 2024-10-21 04:55:19,155 INFO [train.py:682] (1/4) Start epoch 1305 2024-10-21 04:55:36,700 INFO [train.py:561] (1/4) Epoch 1305, batch 
6, global_batch_idx: 20870, batch size: 106, loss[dur_loss=0.2445, prior_loss=0.9851, diff_loss=0.3574, tot_loss=1.587, over 106.00 samples.], tot_loss[dur_loss=0.2398, prior_loss=0.9843, diff_loss=0.4335, tot_loss=1.658, over 1142.00 samples.], 2024-10-21 04:55:50,056 INFO [train.py:682] (1/4) Start epoch 1306 2024-10-21 04:55:58,965 INFO [train.py:561] (1/4) Epoch 1306, batch 0, global_batch_idx: 20880, batch size: 108, loss[dur_loss=0.2477, prior_loss=0.9857, diff_loss=0.3055, tot_loss=1.539, over 108.00 samples.], tot_loss[dur_loss=0.2477, prior_loss=0.9857, diff_loss=0.3055, tot_loss=1.539, over 108.00 samples.], 2024-10-21 04:56:13,343 INFO [train.py:561] (1/4) Epoch 1306, batch 10, global_batch_idx: 20890, batch size: 111, loss[dur_loss=0.2435, prior_loss=0.9858, diff_loss=0.3502, tot_loss=1.58, over 111.00 samples.], tot_loss[dur_loss=0.2406, prior_loss=0.9846, diff_loss=0.3995, tot_loss=1.625, over 1656.00 samples.], 2024-10-21 04:56:20,539 INFO [train.py:682] (1/4) Start epoch 1307 2024-10-21 04:56:34,474 INFO [train.py:561] (1/4) Epoch 1307, batch 4, global_batch_idx: 20900, batch size: 189, loss[dur_loss=0.241, prior_loss=0.9851, diff_loss=0.3428, tot_loss=1.569, over 189.00 samples.], tot_loss[dur_loss=0.2367, prior_loss=0.9839, diff_loss=0.4388, tot_loss=1.659, over 937.00 samples.], 2024-10-21 04:56:49,162 INFO [train.py:561] (1/4) Epoch 1307, batch 14, global_batch_idx: 20910, batch size: 142, loss[dur_loss=0.245, prior_loss=0.9846, diff_loss=0.3478, tot_loss=1.577, over 142.00 samples.], tot_loss[dur_loss=0.2411, prior_loss=0.9846, diff_loss=0.3774, tot_loss=1.603, over 2210.00 samples.], 2024-10-21 04:56:50,605 INFO [train.py:682] (1/4) Start epoch 1308 2024-10-21 04:57:11,348 INFO [train.py:561] (1/4) Epoch 1308, batch 8, global_batch_idx: 20920, batch size: 170, loss[dur_loss=0.2408, prior_loss=0.985, diff_loss=0.3506, tot_loss=1.576, over 170.00 samples.], tot_loss[dur_loss=0.2394, prior_loss=0.9845, diff_loss=0.4093, tot_loss=1.633, over 1432.00 samples.], 2024-10-21 04:57:21,654 INFO [train.py:682] (1/4) Start epoch 1309 2024-10-21 04:57:33,206 INFO [train.py:561] (1/4) Epoch 1309, batch 2, global_batch_idx: 20930, batch size: 203, loss[dur_loss=0.2431, prior_loss=0.9849, diff_loss=0.3456, tot_loss=1.574, over 203.00 samples.], tot_loss[dur_loss=0.2431, prior_loss=0.9849, diff_loss=0.3307, tot_loss=1.559, over 442.00 samples.], 2024-10-21 04:57:47,434 INFO [train.py:561] (1/4) Epoch 1309, batch 12, global_batch_idx: 20940, batch size: 152, loss[dur_loss=0.2425, prior_loss=0.985, diff_loss=0.3313, tot_loss=1.559, over 152.00 samples.], tot_loss[dur_loss=0.2407, prior_loss=0.9845, diff_loss=0.3823, tot_loss=1.608, over 1966.00 samples.], 2024-10-21 04:57:51,934 INFO [train.py:682] (1/4) Start epoch 1310 2024-10-21 04:58:08,952 INFO [train.py:561] (1/4) Epoch 1310, batch 6, global_batch_idx: 20950, batch size: 106, loss[dur_loss=0.244, prior_loss=0.985, diff_loss=0.31, tot_loss=1.539, over 106.00 samples.], tot_loss[dur_loss=0.2387, prior_loss=0.9841, diff_loss=0.4254, tot_loss=1.648, over 1142.00 samples.], 2024-10-21 04:58:22,022 INFO [train.py:682] (1/4) Start epoch 1311 2024-10-21 04:58:30,715 INFO [train.py:561] (1/4) Epoch 1311, batch 0, global_batch_idx: 20960, batch size: 108, loss[dur_loss=0.251, prior_loss=0.9857, diff_loss=0.3098, tot_loss=1.547, over 108.00 samples.], tot_loss[dur_loss=0.251, prior_loss=0.9857, diff_loss=0.3098, tot_loss=1.547, over 108.00 samples.], 2024-10-21 04:58:45,062 INFO [train.py:561] (1/4) Epoch 1311, batch 10, global_batch_idx: 
20970, batch size: 111, loss[dur_loss=0.2462, prior_loss=0.9862, diff_loss=0.341, tot_loss=1.573, over 111.00 samples.], tot_loss[dur_loss=0.2426, prior_loss=0.9849, diff_loss=0.4034, tot_loss=1.631, over 1656.00 samples.], 2024-10-21 04:58:52,188 INFO [train.py:682] (1/4) Start epoch 1312 2024-10-21 04:59:06,234 INFO [train.py:561] (1/4) Epoch 1312, batch 4, global_batch_idx: 20980, batch size: 189, loss[dur_loss=0.2412, prior_loss=0.9847, diff_loss=0.3643, tot_loss=1.59, over 189.00 samples.], tot_loss[dur_loss=0.2362, prior_loss=0.9839, diff_loss=0.4545, tot_loss=1.675, over 937.00 samples.], 2024-10-21 04:59:21,211 INFO [train.py:561] (1/4) Epoch 1312, batch 14, global_batch_idx: 20990, batch size: 142, loss[dur_loss=0.2455, prior_loss=0.9846, diff_loss=0.3531, tot_loss=1.583, over 142.00 samples.], tot_loss[dur_loss=0.2412, prior_loss=0.9846, diff_loss=0.3859, tot_loss=1.612, over 2210.00 samples.], 2024-10-21 04:59:22,654 INFO [train.py:682] (1/4) Start epoch 1313 2024-10-21 04:59:42,967 INFO [train.py:561] (1/4) Epoch 1313, batch 8, global_batch_idx: 21000, batch size: 170, loss[dur_loss=0.2456, prior_loss=0.9848, diff_loss=0.3188, tot_loss=1.549, over 170.00 samples.], tot_loss[dur_loss=0.24, prior_loss=0.9843, diff_loss=0.4092, tot_loss=1.633, over 1432.00 samples.], 2024-10-21 04:59:44,448 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 05:00:17,891 INFO [train.py:589] (1/4) Epoch 1313, validation: dur_loss=0.4506, prior_loss=1.033, diff_loss=0.3921, tot_loss=1.876, over 100.00 samples. 2024-10-21 05:00:17,892 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 05:00:26,668 INFO [train.py:682] (1/4) Start epoch 1314 2024-10-21 05:00:38,864 INFO [train.py:561] (1/4) Epoch 1314, batch 2, global_batch_idx: 21010, batch size: 203, loss[dur_loss=0.2425, prior_loss=0.9848, diff_loss=0.3898, tot_loss=1.617, over 203.00 samples.], tot_loss[dur_loss=0.2437, prior_loss=0.985, diff_loss=0.3497, tot_loss=1.578, over 442.00 samples.], 2024-10-21 05:00:53,253 INFO [train.py:561] (1/4) Epoch 1314, batch 12, global_batch_idx: 21020, batch size: 152, loss[dur_loss=0.2405, prior_loss=0.9847, diff_loss=0.3756, tot_loss=1.601, over 152.00 samples.], tot_loss[dur_loss=0.2414, prior_loss=0.9845, diff_loss=0.3873, tot_loss=1.613, over 1966.00 samples.], 2024-10-21 05:00:57,769 INFO [train.py:682] (1/4) Start epoch 1315 2024-10-21 05:01:15,265 INFO [train.py:561] (1/4) Epoch 1315, batch 6, global_batch_idx: 21030, batch size: 106, loss[dur_loss=0.2459, prior_loss=0.9847, diff_loss=0.339, tot_loss=1.57, over 106.00 samples.], tot_loss[dur_loss=0.239, prior_loss=0.9841, diff_loss=0.4229, tot_loss=1.646, over 1142.00 samples.], 2024-10-21 05:01:28,371 INFO [train.py:682] (1/4) Start epoch 1316 2024-10-21 05:01:37,560 INFO [train.py:561] (1/4) Epoch 1316, batch 0, global_batch_idx: 21040, batch size: 108, loss[dur_loss=0.2477, prior_loss=0.9855, diff_loss=0.3311, tot_loss=1.564, over 108.00 samples.], tot_loss[dur_loss=0.2477, prior_loss=0.9855, diff_loss=0.3311, tot_loss=1.564, over 108.00 samples.], 2024-10-21 05:01:51,859 INFO [train.py:561] (1/4) Epoch 1316, batch 10, global_batch_idx: 21050, batch size: 111, loss[dur_loss=0.2462, prior_loss=0.9861, diff_loss=0.3669, tot_loss=1.599, over 111.00 samples.], tot_loss[dur_loss=0.2412, prior_loss=0.9846, diff_loss=0.3948, tot_loss=1.62, over 1656.00 samples.], 2024-10-21 05:01:58,971 INFO [train.py:682] (1/4) Start epoch 1317 2024-10-21 05:02:13,080 INFO [train.py:561] (1/4) Epoch 1317, batch 4, global_batch_idx: 
21060, batch size: 189, loss[dur_loss=0.2405, prior_loss=0.9847, diff_loss=0.3304, tot_loss=1.556, over 189.00 samples.], tot_loss[dur_loss=0.2353, prior_loss=0.9838, diff_loss=0.439, tot_loss=1.658, over 937.00 samples.], 2024-10-21 05:02:27,953 INFO [train.py:561] (1/4) Epoch 1317, batch 14, global_batch_idx: 21070, batch size: 142, loss[dur_loss=0.2436, prior_loss=0.9847, diff_loss=0.3263, tot_loss=1.555, over 142.00 samples.], tot_loss[dur_loss=0.24, prior_loss=0.9845, diff_loss=0.3804, tot_loss=1.605, over 2210.00 samples.], 2024-10-21 05:02:29,376 INFO [train.py:682] (1/4) Start epoch 1318 2024-10-21 05:02:49,508 INFO [train.py:561] (1/4) Epoch 1318, batch 8, global_batch_idx: 21080, batch size: 170, loss[dur_loss=0.2434, prior_loss=0.985, diff_loss=0.3449, tot_loss=1.573, over 170.00 samples.], tot_loss[dur_loss=0.2381, prior_loss=0.9842, diff_loss=0.4088, tot_loss=1.631, over 1432.00 samples.], 2024-10-21 05:02:59,523 INFO [train.py:682] (1/4) Start epoch 1319 2024-10-21 05:03:11,071 INFO [train.py:561] (1/4) Epoch 1319, batch 2, global_batch_idx: 21090, batch size: 203, loss[dur_loss=0.2431, prior_loss=0.9848, diff_loss=0.3502, tot_loss=1.578, over 203.00 samples.], tot_loss[dur_loss=0.2438, prior_loss=0.9849, diff_loss=0.346, tot_loss=1.575, over 442.00 samples.], 2024-10-21 05:03:25,290 INFO [train.py:561] (1/4) Epoch 1319, batch 12, global_batch_idx: 21100, batch size: 152, loss[dur_loss=0.2418, prior_loss=0.9847, diff_loss=0.2947, tot_loss=1.521, over 152.00 samples.], tot_loss[dur_loss=0.2407, prior_loss=0.9843, diff_loss=0.3882, tot_loss=1.613, over 1966.00 samples.], 2024-10-21 05:03:29,722 INFO [train.py:682] (1/4) Start epoch 1320 2024-10-21 05:03:47,214 INFO [train.py:561] (1/4) Epoch 1320, batch 6, global_batch_idx: 21110, batch size: 106, loss[dur_loss=0.2441, prior_loss=0.9846, diff_loss=0.3351, tot_loss=1.564, over 106.00 samples.], tot_loss[dur_loss=0.2371, prior_loss=0.9839, diff_loss=0.4313, tot_loss=1.652, over 1142.00 samples.], 2024-10-21 05:04:00,264 INFO [train.py:682] (1/4) Start epoch 1321 2024-10-21 05:04:09,175 INFO [train.py:561] (1/4) Epoch 1321, batch 0, global_batch_idx: 21120, batch size: 108, loss[dur_loss=0.2463, prior_loss=0.9855, diff_loss=0.3706, tot_loss=1.602, over 108.00 samples.], tot_loss[dur_loss=0.2463, prior_loss=0.9855, diff_loss=0.3706, tot_loss=1.602, over 108.00 samples.], 2024-10-21 05:04:23,494 INFO [train.py:561] (1/4) Epoch 1321, batch 10, global_batch_idx: 21130, batch size: 111, loss[dur_loss=0.2476, prior_loss=0.9859, diff_loss=0.2805, tot_loss=1.514, over 111.00 samples.], tot_loss[dur_loss=0.2403, prior_loss=0.9843, diff_loss=0.4018, tot_loss=1.626, over 1656.00 samples.], 2024-10-21 05:04:30,668 INFO [train.py:682] (1/4) Start epoch 1322 2024-10-21 05:04:44,969 INFO [train.py:561] (1/4) Epoch 1322, batch 4, global_batch_idx: 21140, batch size: 189, loss[dur_loss=0.2378, prior_loss=0.9846, diff_loss=0.321, tot_loss=1.543, over 189.00 samples.], tot_loss[dur_loss=0.2363, prior_loss=0.9837, diff_loss=0.4435, tot_loss=1.663, over 937.00 samples.], 2024-10-21 05:05:00,024 INFO [train.py:561] (1/4) Epoch 1322, batch 14, global_batch_idx: 21150, batch size: 142, loss[dur_loss=0.245, prior_loss=0.9849, diff_loss=0.3544, tot_loss=1.584, over 142.00 samples.], tot_loss[dur_loss=0.2398, prior_loss=0.9844, diff_loss=0.3867, tot_loss=1.611, over 2210.00 samples.], 2024-10-21 05:05:01,460 INFO [train.py:682] (1/4) Start epoch 1323 2024-10-21 05:05:21,803 INFO [train.py:561] (1/4) Epoch 1323, batch 8, global_batch_idx: 21160, batch size: 
170, loss[dur_loss=0.2449, prior_loss=0.9849, diff_loss=0.3199, tot_loss=1.55, over 170.00 samples.], tot_loss[dur_loss=0.2409, prior_loss=0.9844, diff_loss=0.4105, tot_loss=1.636, over 1432.00 samples.], 2024-10-21 05:05:31,967 INFO [train.py:682] (1/4) Start epoch 1324 2024-10-21 05:05:43,546 INFO [train.py:561] (1/4) Epoch 1324, batch 2, global_batch_idx: 21170, batch size: 203, loss[dur_loss=0.244, prior_loss=0.9846, diff_loss=0.3704, tot_loss=1.599, over 203.00 samples.], tot_loss[dur_loss=0.2439, prior_loss=0.9849, diff_loss=0.3552, tot_loss=1.584, over 442.00 samples.], 2024-10-21 05:05:57,854 INFO [train.py:561] (1/4) Epoch 1324, batch 12, global_batch_idx: 21180, batch size: 152, loss[dur_loss=0.2428, prior_loss=0.9847, diff_loss=0.3486, tot_loss=1.576, over 152.00 samples.], tot_loss[dur_loss=0.2402, prior_loss=0.9845, diff_loss=0.3944, tot_loss=1.619, over 1966.00 samples.], 2024-10-21 05:06:02,306 INFO [train.py:682] (1/4) Start epoch 1325 2024-10-21 05:06:19,368 INFO [train.py:561] (1/4) Epoch 1325, batch 6, global_batch_idx: 21190, batch size: 106, loss[dur_loss=0.2436, prior_loss=0.9848, diff_loss=0.3662, tot_loss=1.595, over 106.00 samples.], tot_loss[dur_loss=0.2385, prior_loss=0.984, diff_loss=0.4245, tot_loss=1.647, over 1142.00 samples.], 2024-10-21 05:06:32,448 INFO [train.py:682] (1/4) Start epoch 1326 2024-10-21 05:06:41,197 INFO [train.py:561] (1/4) Epoch 1326, batch 0, global_batch_idx: 21200, batch size: 108, loss[dur_loss=0.246, prior_loss=0.9855, diff_loss=0.3328, tot_loss=1.564, over 108.00 samples.], tot_loss[dur_loss=0.246, prior_loss=0.9855, diff_loss=0.3328, tot_loss=1.564, over 108.00 samples.], 2024-10-21 05:06:55,478 INFO [train.py:561] (1/4) Epoch 1326, batch 10, global_batch_idx: 21210, batch size: 111, loss[dur_loss=0.2491, prior_loss=0.9861, diff_loss=0.3241, tot_loss=1.559, over 111.00 samples.], tot_loss[dur_loss=0.2407, prior_loss=0.9843, diff_loss=0.405, tot_loss=1.63, over 1656.00 samples.], 2024-10-21 05:07:02,611 INFO [train.py:682] (1/4) Start epoch 1327 2024-10-21 05:07:16,666 INFO [train.py:561] (1/4) Epoch 1327, batch 4, global_batch_idx: 21220, batch size: 189, loss[dur_loss=0.246, prior_loss=0.9848, diff_loss=0.3191, tot_loss=1.55, over 189.00 samples.], tot_loss[dur_loss=0.237, prior_loss=0.9838, diff_loss=0.4411, tot_loss=1.662, over 937.00 samples.], 2024-10-21 05:07:31,607 INFO [train.py:561] (1/4) Epoch 1327, batch 14, global_batch_idx: 21230, batch size: 142, loss[dur_loss=0.2432, prior_loss=0.9845, diff_loss=0.3649, tot_loss=1.593, over 142.00 samples.], tot_loss[dur_loss=0.241, prior_loss=0.9845, diff_loss=0.3769, tot_loss=1.602, over 2210.00 samples.], 2024-10-21 05:07:33,028 INFO [train.py:682] (1/4) Start epoch 1328 2024-10-21 05:07:53,115 INFO [train.py:561] (1/4) Epoch 1328, batch 8, global_batch_idx: 21240, batch size: 170, loss[dur_loss=0.2495, prior_loss=0.9849, diff_loss=0.3508, tot_loss=1.585, over 170.00 samples.], tot_loss[dur_loss=0.2397, prior_loss=0.9842, diff_loss=0.4141, tot_loss=1.638, over 1432.00 samples.], 2024-10-21 05:08:03,335 INFO [train.py:682] (1/4) Start epoch 1329 2024-10-21 05:08:14,628 INFO [train.py:561] (1/4) Epoch 1329, batch 2, global_batch_idx: 21250, batch size: 203, loss[dur_loss=0.242, prior_loss=0.9847, diff_loss=0.3827, tot_loss=1.609, over 203.00 samples.], tot_loss[dur_loss=0.2444, prior_loss=0.985, diff_loss=0.3499, tot_loss=1.579, over 442.00 samples.], 2024-10-21 05:08:28,872 INFO [train.py:561] (1/4) Epoch 1329, batch 12, global_batch_idx: 21260, batch size: 152, 
loss[dur_loss=0.2412, prior_loss=0.9847, diff_loss=0.3261, tot_loss=1.552, over 152.00 samples.], tot_loss[dur_loss=0.2406, prior_loss=0.9845, diff_loss=0.3895, tot_loss=1.615, over 1966.00 samples.], 2024-10-21 05:08:33,320 INFO [train.py:682] (1/4) Start epoch 1330 2024-10-21 05:08:50,628 INFO [train.py:561] (1/4) Epoch 1330, batch 6, global_batch_idx: 21270, batch size: 106, loss[dur_loss=0.2448, prior_loss=0.9847, diff_loss=0.3206, tot_loss=1.55, over 106.00 samples.], tot_loss[dur_loss=0.2392, prior_loss=0.984, diff_loss=0.4299, tot_loss=1.653, over 1142.00 samples.], 2024-10-21 05:09:03,726 INFO [train.py:682] (1/4) Start epoch 1331 2024-10-21 05:09:12,531 INFO [train.py:561] (1/4) Epoch 1331, batch 0, global_batch_idx: 21280, batch size: 108, loss[dur_loss=0.2467, prior_loss=0.9857, diff_loss=0.3224, tot_loss=1.555, over 108.00 samples.], tot_loss[dur_loss=0.2467, prior_loss=0.9857, diff_loss=0.3224, tot_loss=1.555, over 108.00 samples.], 2024-10-21 05:09:26,768 INFO [train.py:561] (1/4) Epoch 1331, batch 10, global_batch_idx: 21290, batch size: 111, loss[dur_loss=0.2443, prior_loss=0.9858, diff_loss=0.2981, tot_loss=1.528, over 111.00 samples.], tot_loss[dur_loss=0.2407, prior_loss=0.9844, diff_loss=0.3934, tot_loss=1.619, over 1656.00 samples.], 2024-10-21 05:09:33,881 INFO [train.py:682] (1/4) Start epoch 1332 2024-10-21 05:09:47,870 INFO [train.py:561] (1/4) Epoch 1332, batch 4, global_batch_idx: 21300, batch size: 189, loss[dur_loss=0.2429, prior_loss=0.9857, diff_loss=0.3469, tot_loss=1.575, over 189.00 samples.], tot_loss[dur_loss=0.2368, prior_loss=0.984, diff_loss=0.4528, tot_loss=1.674, over 937.00 samples.], 2024-10-21 05:10:02,890 INFO [train.py:561] (1/4) Epoch 1332, batch 14, global_batch_idx: 21310, batch size: 142, loss[dur_loss=0.2404, prior_loss=0.9843, diff_loss=0.3261, tot_loss=1.551, over 142.00 samples.], tot_loss[dur_loss=0.241, prior_loss=0.9846, diff_loss=0.3902, tot_loss=1.616, over 2210.00 samples.], 2024-10-21 05:10:04,322 INFO [train.py:682] (1/4) Start epoch 1333 2024-10-21 05:10:24,355 INFO [train.py:561] (1/4) Epoch 1333, batch 8, global_batch_idx: 21320, batch size: 170, loss[dur_loss=0.2462, prior_loss=0.985, diff_loss=0.3415, tot_loss=1.573, over 170.00 samples.], tot_loss[dur_loss=0.2413, prior_loss=0.9845, diff_loss=0.4043, tot_loss=1.63, over 1432.00 samples.], 2024-10-21 05:10:34,378 INFO [train.py:682] (1/4) Start epoch 1334 2024-10-21 05:10:45,649 INFO [train.py:561] (1/4) Epoch 1334, batch 2, global_batch_idx: 21330, batch size: 203, loss[dur_loss=0.2445, prior_loss=0.9849, diff_loss=0.3593, tot_loss=1.589, over 203.00 samples.], tot_loss[dur_loss=0.2446, prior_loss=0.985, diff_loss=0.3452, tot_loss=1.575, over 442.00 samples.], 2024-10-21 05:11:00,041 INFO [train.py:561] (1/4) Epoch 1334, batch 12, global_batch_idx: 21340, batch size: 152, loss[dur_loss=0.2432, prior_loss=0.9848, diff_loss=0.3643, tot_loss=1.592, over 152.00 samples.], tot_loss[dur_loss=0.2414, prior_loss=0.9845, diff_loss=0.3892, tot_loss=1.615, over 1966.00 samples.], 2024-10-21 05:11:04,491 INFO [train.py:682] (1/4) Start epoch 1335 2024-10-21 05:11:21,710 INFO [train.py:561] (1/4) Epoch 1335, batch 6, global_batch_idx: 21350, batch size: 106, loss[dur_loss=0.2468, prior_loss=0.9849, diff_loss=0.3524, tot_loss=1.584, over 106.00 samples.], tot_loss[dur_loss=0.2381, prior_loss=0.9839, diff_loss=0.4317, tot_loss=1.654, over 1142.00 samples.], 2024-10-21 05:11:34,873 INFO [train.py:682] (1/4) Start epoch 1336 2024-10-21 05:11:43,522 INFO [train.py:561] (1/4) Epoch 1336, 
batch 0, global_batch_idx: 21360, batch size: 108, loss[dur_loss=0.2437, prior_loss=0.9853, diff_loss=0.3252, tot_loss=1.554, over 108.00 samples.], tot_loss[dur_loss=0.2437, prior_loss=0.9853, diff_loss=0.3252, tot_loss=1.554, over 108.00 samples.], 2024-10-21 05:11:57,940 INFO [train.py:561] (1/4) Epoch 1336, batch 10, global_batch_idx: 21370, batch size: 111, loss[dur_loss=0.2413, prior_loss=0.9855, diff_loss=0.3218, tot_loss=1.549, over 111.00 samples.], tot_loss[dur_loss=0.2385, prior_loss=0.9841, diff_loss=0.3911, tot_loss=1.614, over 1656.00 samples.], 2024-10-21 05:12:05,089 INFO [train.py:682] (1/4) Start epoch 1337 2024-10-21 05:12:18,916 INFO [train.py:561] (1/4) Epoch 1337, batch 4, global_batch_idx: 21380, batch size: 189, loss[dur_loss=0.2422, prior_loss=0.9847, diff_loss=0.347, tot_loss=1.574, over 189.00 samples.], tot_loss[dur_loss=0.2343, prior_loss=0.9836, diff_loss=0.4433, tot_loss=1.661, over 937.00 samples.], 2024-10-21 05:12:33,953 INFO [train.py:561] (1/4) Epoch 1337, batch 14, global_batch_idx: 21390, batch size: 142, loss[dur_loss=0.2384, prior_loss=0.9843, diff_loss=0.3196, tot_loss=1.542, over 142.00 samples.], tot_loss[dur_loss=0.2395, prior_loss=0.9843, diff_loss=0.3738, tot_loss=1.598, over 2210.00 samples.], 2024-10-21 05:12:35,389 INFO [train.py:682] (1/4) Start epoch 1338 2024-10-21 05:12:55,490 INFO [train.py:561] (1/4) Epoch 1338, batch 8, global_batch_idx: 21400, batch size: 170, loss[dur_loss=0.2435, prior_loss=0.9845, diff_loss=0.3321, tot_loss=1.56, over 170.00 samples.], tot_loss[dur_loss=0.2383, prior_loss=0.984, diff_loss=0.3998, tot_loss=1.622, over 1432.00 samples.], 2024-10-21 05:13:05,557 INFO [train.py:682] (1/4) Start epoch 1339 2024-10-21 05:13:17,184 INFO [train.py:561] (1/4) Epoch 1339, batch 2, global_batch_idx: 21410, batch size: 203, loss[dur_loss=0.2437, prior_loss=0.9843, diff_loss=0.3598, tot_loss=1.588, over 203.00 samples.], tot_loss[dur_loss=0.2441, prior_loss=0.9845, diff_loss=0.3301, tot_loss=1.559, over 442.00 samples.], 2024-10-21 05:13:31,300 INFO [train.py:561] (1/4) Epoch 1339, batch 12, global_batch_idx: 21420, batch size: 152, loss[dur_loss=0.2414, prior_loss=0.9845, diff_loss=0.3492, tot_loss=1.575, over 152.00 samples.], tot_loss[dur_loss=0.2401, prior_loss=0.9841, diff_loss=0.3788, tot_loss=1.603, over 1966.00 samples.], 2024-10-21 05:13:35,720 INFO [train.py:682] (1/4) Start epoch 1340 2024-10-21 05:13:53,112 INFO [train.py:561] (1/4) Epoch 1340, batch 6, global_batch_idx: 21430, batch size: 106, loss[dur_loss=0.2425, prior_loss=0.9848, diff_loss=0.2907, tot_loss=1.518, over 106.00 samples.], tot_loss[dur_loss=0.2376, prior_loss=0.9838, diff_loss=0.4371, tot_loss=1.659, over 1142.00 samples.], 2024-10-21 05:14:06,353 INFO [train.py:682] (1/4) Start epoch 1341 2024-10-21 05:14:15,697 INFO [train.py:561] (1/4) Epoch 1341, batch 0, global_batch_idx: 21440, batch size: 108, loss[dur_loss=0.2422, prior_loss=0.9852, diff_loss=0.324, tot_loss=1.551, over 108.00 samples.], tot_loss[dur_loss=0.2422, prior_loss=0.9852, diff_loss=0.324, tot_loss=1.551, over 108.00 samples.], 2024-10-21 05:14:30,027 INFO [train.py:561] (1/4) Epoch 1341, batch 10, global_batch_idx: 21450, batch size: 111, loss[dur_loss=0.2441, prior_loss=0.9855, diff_loss=0.2999, tot_loss=1.529, over 111.00 samples.], tot_loss[dur_loss=0.2377, prior_loss=0.984, diff_loss=0.3961, tot_loss=1.618, over 1656.00 samples.], 2024-10-21 05:14:37,057 INFO [train.py:682] (1/4) Start epoch 1342 2024-10-21 05:14:50,706 INFO [train.py:561] (1/4) Epoch 1342, batch 4, 
global_batch_idx: 21460, batch size: 189, loss[dur_loss=0.2377, prior_loss=0.9843, diff_loss=0.3333, tot_loss=1.555, over 189.00 samples.], tot_loss[dur_loss=0.2336, prior_loss=0.9834, diff_loss=0.4491, tot_loss=1.666, over 937.00 samples.], 2024-10-21 05:15:05,743 INFO [train.py:561] (1/4) Epoch 1342, batch 14, global_batch_idx: 21470, batch size: 142, loss[dur_loss=0.2382, prior_loss=0.9838, diff_loss=0.3456, tot_loss=1.568, over 142.00 samples.], tot_loss[dur_loss=0.2388, prior_loss=0.984, diff_loss=0.3795, tot_loss=1.602, over 2210.00 samples.], 2024-10-21 05:15:07,178 INFO [train.py:682] (1/4) Start epoch 1343 2024-10-21 05:15:27,454 INFO [train.py:561] (1/4) Epoch 1343, batch 8, global_batch_idx: 21480, batch size: 170, loss[dur_loss=0.2455, prior_loss=0.9843, diff_loss=0.3594, tot_loss=1.589, over 170.00 samples.], tot_loss[dur_loss=0.2378, prior_loss=0.9837, diff_loss=0.3954, tot_loss=1.617, over 1432.00 samples.], 2024-10-21 05:15:37,699 INFO [train.py:682] (1/4) Start epoch 1344 2024-10-21 05:15:49,050 INFO [train.py:561] (1/4) Epoch 1344, batch 2, global_batch_idx: 21490, batch size: 203, loss[dur_loss=0.2401, prior_loss=0.9842, diff_loss=0.3783, tot_loss=1.603, over 203.00 samples.], tot_loss[dur_loss=0.241, prior_loss=0.9844, diff_loss=0.3395, tot_loss=1.565, over 442.00 samples.], 2024-10-21 05:16:03,256 INFO [train.py:561] (1/4) Epoch 1344, batch 12, global_batch_idx: 21500, batch size: 152, loss[dur_loss=0.2403, prior_loss=0.9844, diff_loss=0.3502, tot_loss=1.575, over 152.00 samples.], tot_loss[dur_loss=0.2387, prior_loss=0.984, diff_loss=0.3884, tot_loss=1.611, over 1966.00 samples.], 2024-10-21 05:16:07,709 INFO [train.py:682] (1/4) Start epoch 1345 2024-10-21 05:16:24,628 INFO [train.py:561] (1/4) Epoch 1345, batch 6, global_batch_idx: 21510, batch size: 106, loss[dur_loss=0.2428, prior_loss=0.9844, diff_loss=0.2996, tot_loss=1.527, over 106.00 samples.], tot_loss[dur_loss=0.2354, prior_loss=0.9835, diff_loss=0.4185, tot_loss=1.637, over 1142.00 samples.], 2024-10-21 05:16:37,722 INFO [train.py:682] (1/4) Start epoch 1346 2024-10-21 05:16:46,562 INFO [train.py:561] (1/4) Epoch 1346, batch 0, global_batch_idx: 21520, batch size: 108, loss[dur_loss=0.2471, prior_loss=0.9852, diff_loss=0.3369, tot_loss=1.569, over 108.00 samples.], tot_loss[dur_loss=0.2471, prior_loss=0.9852, diff_loss=0.3369, tot_loss=1.569, over 108.00 samples.], 2024-10-21 05:17:00,854 INFO [train.py:561] (1/4) Epoch 1346, batch 10, global_batch_idx: 21530, batch size: 111, loss[dur_loss=0.2433, prior_loss=0.9854, diff_loss=0.3184, tot_loss=1.547, over 111.00 samples.], tot_loss[dur_loss=0.238, prior_loss=0.984, diff_loss=0.3954, tot_loss=1.617, over 1656.00 samples.], 2024-10-21 05:17:07,993 INFO [train.py:682] (1/4) Start epoch 1347 2024-10-21 05:17:22,099 INFO [train.py:561] (1/4) Epoch 1347, batch 4, global_batch_idx: 21540, batch size: 189, loss[dur_loss=0.2406, prior_loss=0.9846, diff_loss=0.3622, tot_loss=1.587, over 189.00 samples.], tot_loss[dur_loss=0.236, prior_loss=0.9834, diff_loss=0.4497, tot_loss=1.669, over 937.00 samples.], 2024-10-21 05:17:36,951 INFO [train.py:561] (1/4) Epoch 1347, batch 14, global_batch_idx: 21550, batch size: 142, loss[dur_loss=0.2443, prior_loss=0.9845, diff_loss=0.3239, tot_loss=1.553, over 142.00 samples.], tot_loss[dur_loss=0.2396, prior_loss=0.9842, diff_loss=0.3836, tot_loss=1.607, over 2210.00 samples.], 2024-10-21 05:17:38,396 INFO [train.py:682] (1/4) Start epoch 1348 2024-10-21 05:17:58,344 INFO [train.py:561] (1/4) Epoch 1348, batch 8, global_batch_idx: 
21560, batch size: 170, loss[dur_loss=0.2424, prior_loss=0.9844, diff_loss=0.3511, tot_loss=1.578, over 170.00 samples.], tot_loss[dur_loss=0.2387, prior_loss=0.984, diff_loss=0.4167, tot_loss=1.639, over 1432.00 samples.], 2024-10-21 05:18:08,554 INFO [train.py:682] (1/4) Start epoch 1349 2024-10-21 05:18:19,855 INFO [train.py:561] (1/4) Epoch 1349, batch 2, global_batch_idx: 21570, batch size: 203, loss[dur_loss=0.2406, prior_loss=0.9842, diff_loss=0.3646, tot_loss=1.589, over 203.00 samples.], tot_loss[dur_loss=0.2411, prior_loss=0.9845, diff_loss=0.3397, tot_loss=1.565, over 442.00 samples.], 2024-10-21 05:18:34,157 INFO [train.py:561] (1/4) Epoch 1349, batch 12, global_batch_idx: 21580, batch size: 152, loss[dur_loss=0.2416, prior_loss=0.9848, diff_loss=0.3364, tot_loss=1.563, over 152.00 samples.], tot_loss[dur_loss=0.2389, prior_loss=0.9841, diff_loss=0.3854, tot_loss=1.608, over 1966.00 samples.], 2024-10-21 05:18:38,594 INFO [train.py:682] (1/4) Start epoch 1350 2024-10-21 05:18:55,790 INFO [train.py:561] (1/4) Epoch 1350, batch 6, global_batch_idx: 21590, batch size: 106, loss[dur_loss=0.2433, prior_loss=0.9847, diff_loss=0.3031, tot_loss=1.531, over 106.00 samples.], tot_loss[dur_loss=0.2374, prior_loss=0.9837, diff_loss=0.4137, tot_loss=1.635, over 1142.00 samples.], 2024-10-21 05:19:08,905 INFO [train.py:682] (1/4) Start epoch 1351 2024-10-21 05:19:17,669 INFO [train.py:561] (1/4) Epoch 1351, batch 0, global_batch_idx: 21600, batch size: 108, loss[dur_loss=0.2448, prior_loss=0.9849, diff_loss=0.3133, tot_loss=1.543, over 108.00 samples.], tot_loss[dur_loss=0.2448, prior_loss=0.9849, diff_loss=0.3133, tot_loss=1.543, over 108.00 samples.], 2024-10-21 05:19:31,998 INFO [train.py:561] (1/4) Epoch 1351, batch 10, global_batch_idx: 21610, batch size: 111, loss[dur_loss=0.2453, prior_loss=0.9853, diff_loss=0.3115, tot_loss=1.542, over 111.00 samples.], tot_loss[dur_loss=0.2384, prior_loss=0.9839, diff_loss=0.3943, tot_loss=1.616, over 1656.00 samples.], 2024-10-21 05:19:39,099 INFO [train.py:682] (1/4) Start epoch 1352 2024-10-21 05:19:52,938 INFO [train.py:561] (1/4) Epoch 1352, batch 4, global_batch_idx: 21620, batch size: 189, loss[dur_loss=0.2384, prior_loss=0.9844, diff_loss=0.3629, tot_loss=1.586, over 189.00 samples.], tot_loss[dur_loss=0.2357, prior_loss=0.9833, diff_loss=0.4556, tot_loss=1.675, over 937.00 samples.], 2024-10-21 05:20:07,858 INFO [train.py:561] (1/4) Epoch 1352, batch 14, global_batch_idx: 21630, batch size: 142, loss[dur_loss=0.2382, prior_loss=0.9838, diff_loss=0.326, tot_loss=1.548, over 142.00 samples.], tot_loss[dur_loss=0.2391, prior_loss=0.984, diff_loss=0.3762, tot_loss=1.599, over 2210.00 samples.], 2024-10-21 05:20:09,287 INFO [train.py:682] (1/4) Start epoch 1353 2024-10-21 05:20:29,298 INFO [train.py:561] (1/4) Epoch 1353, batch 8, global_batch_idx: 21640, batch size: 170, loss[dur_loss=0.2432, prior_loss=0.9842, diff_loss=0.3406, tot_loss=1.568, over 170.00 samples.], tot_loss[dur_loss=0.2379, prior_loss=0.9838, diff_loss=0.412, tot_loss=1.634, over 1432.00 samples.], 2024-10-21 05:20:39,445 INFO [train.py:682] (1/4) Start epoch 1354 2024-10-21 05:20:51,089 INFO [train.py:561] (1/4) Epoch 1354, batch 2, global_batch_idx: 21650, batch size: 203, loss[dur_loss=0.2382, prior_loss=0.9843, diff_loss=0.3463, tot_loss=1.569, over 203.00 samples.], tot_loss[dur_loss=0.2403, prior_loss=0.9844, diff_loss=0.3342, tot_loss=1.559, over 442.00 samples.], 2024-10-21 05:21:05,467 INFO [train.py:561] (1/4) Epoch 1354, batch 12, global_batch_idx: 21660, batch 
size: 152, loss[dur_loss=0.2439, prior_loss=0.9841, diff_loss=0.3223, tot_loss=1.55, over 152.00 samples.], tot_loss[dur_loss=0.2392, prior_loss=0.984, diff_loss=0.3802, tot_loss=1.603, over 1966.00 samples.], 2024-10-21 05:21:09,940 INFO [train.py:682] (1/4) Start epoch 1355 2024-10-21 05:21:27,294 INFO [train.py:561] (1/4) Epoch 1355, batch 6, global_batch_idx: 21670, batch size: 106, loss[dur_loss=0.2433, prior_loss=0.9846, diff_loss=0.2893, tot_loss=1.517, over 106.00 samples.], tot_loss[dur_loss=0.235, prior_loss=0.9836, diff_loss=0.4308, tot_loss=1.649, over 1142.00 samples.], 2024-10-21 05:21:40,457 INFO [train.py:682] (1/4) Start epoch 1356 2024-10-21 05:21:49,206 INFO [train.py:561] (1/4) Epoch 1356, batch 0, global_batch_idx: 21680, batch size: 108, loss[dur_loss=0.2459, prior_loss=0.9852, diff_loss=0.3543, tot_loss=1.585, over 108.00 samples.], tot_loss[dur_loss=0.2459, prior_loss=0.9852, diff_loss=0.3543, tot_loss=1.585, over 108.00 samples.], 2024-10-21 05:22:03,486 INFO [train.py:561] (1/4) Epoch 1356, batch 10, global_batch_idx: 21690, batch size: 111, loss[dur_loss=0.246, prior_loss=0.9855, diff_loss=0.3118, tot_loss=1.543, over 111.00 samples.], tot_loss[dur_loss=0.2376, prior_loss=0.9839, diff_loss=0.3947, tot_loss=1.616, over 1656.00 samples.], 2024-10-21 05:22:10,606 INFO [train.py:682] (1/4) Start epoch 1357 2024-10-21 05:22:24,497 INFO [train.py:561] (1/4) Epoch 1357, batch 4, global_batch_idx: 21700, batch size: 189, loss[dur_loss=0.2363, prior_loss=0.9842, diff_loss=0.3439, tot_loss=1.564, over 189.00 samples.], tot_loss[dur_loss=0.2347, prior_loss=0.9833, diff_loss=0.4464, tot_loss=1.664, over 937.00 samples.], 2024-10-21 05:22:39,423 INFO [train.py:561] (1/4) Epoch 1357, batch 14, global_batch_idx: 21710, batch size: 142, loss[dur_loss=0.2406, prior_loss=0.9838, diff_loss=0.2822, tot_loss=1.507, over 142.00 samples.], tot_loss[dur_loss=0.2383, prior_loss=0.9839, diff_loss=0.3785, tot_loss=1.601, over 2210.00 samples.], 2024-10-21 05:22:40,839 INFO [train.py:682] (1/4) Start epoch 1358 2024-10-21 05:23:01,040 INFO [train.py:561] (1/4) Epoch 1358, batch 8, global_batch_idx: 21720, batch size: 170, loss[dur_loss=0.244, prior_loss=0.9844, diff_loss=0.3029, tot_loss=1.531, over 170.00 samples.], tot_loss[dur_loss=0.2364, prior_loss=0.9837, diff_loss=0.3926, tot_loss=1.613, over 1432.00 samples.], 2024-10-21 05:23:11,157 INFO [train.py:682] (1/4) Start epoch 1359 2024-10-21 05:23:22,840 INFO [train.py:561] (1/4) Epoch 1359, batch 2, global_batch_idx: 21730, batch size: 203, loss[dur_loss=0.2381, prior_loss=0.9841, diff_loss=0.3583, tot_loss=1.58, over 203.00 samples.], tot_loss[dur_loss=0.2399, prior_loss=0.9843, diff_loss=0.3357, tot_loss=1.56, over 442.00 samples.], 2024-10-21 05:23:37,060 INFO [train.py:561] (1/4) Epoch 1359, batch 12, global_batch_idx: 21740, batch size: 152, loss[dur_loss=0.2392, prior_loss=0.9842, diff_loss=0.3528, tot_loss=1.576, over 152.00 samples.], tot_loss[dur_loss=0.2379, prior_loss=0.9839, diff_loss=0.3905, tot_loss=1.612, over 1966.00 samples.], 2024-10-21 05:23:41,495 INFO [train.py:682] (1/4) Start epoch 1360 2024-10-21 05:24:14,411 INFO [train.py:561] (1/4) Epoch 1360, batch 6, global_batch_idx: 21750, batch size: 106, loss[dur_loss=0.2413, prior_loss=0.9847, diff_loss=0.3487, tot_loss=1.575, over 106.00 samples.], tot_loss[dur_loss=0.2349, prior_loss=0.9836, diff_loss=0.4308, tot_loss=1.649, over 1142.00 samples.], 2024-10-21 05:24:27,514 INFO [train.py:682] (1/4) Start epoch 1361 2024-10-21 05:24:36,166 INFO [train.py:561] (1/4) 
Epoch 1361, batch 0, global_batch_idx: 21760, batch size: 108, loss[dur_loss=0.2432, prior_loss=0.9852, diff_loss=0.3364, tot_loss=1.565, over 108.00 samples.], tot_loss[dur_loss=0.2432, prior_loss=0.9852, diff_loss=0.3364, tot_loss=1.565, over 108.00 samples.],
2024-10-21 05:24:50,490 INFO [train.py:561] (1/4) Epoch 1361, batch 10, global_batch_idx: 21770, batch size: 111, loss[dur_loss=0.2483, prior_loss=0.9855, diff_loss=0.3312, tot_loss=1.565, over 111.00 samples.], tot_loss[dur_loss=0.238, prior_loss=0.9838, diff_loss=0.3901, tot_loss=1.612, over 1656.00 samples.],
2024-10-21 05:24:57,627 INFO [train.py:682] (1/4) Start epoch 1362
2024-10-21 05:25:11,834 INFO [train.py:561] (1/4) Epoch 1362, batch 4, global_batch_idx: 21780, batch size: 189, loss[dur_loss=0.236, prior_loss=0.9841, diff_loss=0.399, tot_loss=1.619, over 189.00 samples.], tot_loss[dur_loss=0.2342, prior_loss=0.9831, diff_loss=0.452, tot_loss=1.669, over 937.00 samples.],
2024-10-21 05:25:26,844 INFO [train.py:561] (1/4) Epoch 1362, batch 14, global_batch_idx: 21790, batch size: 142, loss[dur_loss=0.2413, prior_loss=0.9837, diff_loss=0.3307, tot_loss=1.556, over 142.00 samples.], tot_loss[dur_loss=0.2377, prior_loss=0.9838, diff_loss=0.3816, tot_loss=1.603, over 2210.00 samples.],
2024-10-21 05:25:28,282 INFO [train.py:682] (1/4) Start epoch 1363
2024-10-21 05:25:48,566 INFO [train.py:561] (1/4) Epoch 1363, batch 8, global_batch_idx: 21800, batch size: 170, loss[dur_loss=0.2401, prior_loss=0.9842, diff_loss=0.3065, tot_loss=1.531, over 170.00 samples.], tot_loss[dur_loss=0.2364, prior_loss=0.9837, diff_loss=0.4037, tot_loss=1.624, over 1432.00 samples.],
2024-10-21 05:25:58,740 INFO [train.py:682] (1/4) Start epoch 1364
2024-10-21 05:26:10,467 INFO [train.py:561] (1/4) Epoch 1364, batch 2, global_batch_idx: 21810, batch size: 203, loss[dur_loss=0.2389, prior_loss=0.984, diff_loss=0.3685, tot_loss=1.591, over 203.00 samples.], tot_loss[dur_loss=0.2404, prior_loss=0.9841, diff_loss=0.3376, tot_loss=1.562, over 442.00 samples.],
2024-10-21 05:26:24,880 INFO [train.py:561] (1/4) Epoch 1364, batch 12, global_batch_idx: 21820, batch size: 152, loss[dur_loss=0.2376, prior_loss=0.9842, diff_loss=0.3733, tot_loss=1.595, over 152.00 samples.], tot_loss[dur_loss=0.2371, prior_loss=0.9838, diff_loss=0.3881, tot_loss=1.609, over 1966.00 samples.],
2024-10-21 05:26:29,458 INFO [train.py:682] (1/4) Start epoch 1365
2024-10-21 05:26:46,697 INFO [train.py:561] (1/4) Epoch 1365, batch 6, global_batch_idx: 21830, batch size: 106, loss[dur_loss=0.2423, prior_loss=0.9845, diff_loss=0.2801, tot_loss=1.507, over 106.00 samples.], tot_loss[dur_loss=0.2361, prior_loss=0.9833, diff_loss=0.4256, tot_loss=1.645, over 1142.00 samples.],
2024-10-21 05:26:59,877 INFO [train.py:682] (1/4) Start epoch 1366
2024-10-21 05:27:08,567 INFO [train.py:561] (1/4) Epoch 1366, batch 0, global_batch_idx: 21840, batch size: 108, loss[dur_loss=0.2424, prior_loss=0.9848, diff_loss=0.3511, tot_loss=1.578, over 108.00 samples.], tot_loss[dur_loss=0.2424, prior_loss=0.9848, diff_loss=0.3511, tot_loss=1.578, over 108.00 samples.],
2024-10-21 05:27:22,825 INFO [train.py:561] (1/4) Epoch 1366, batch 10, global_batch_idx: 21850, batch size: 111, loss[dur_loss=0.2432, prior_loss=0.9851, diff_loss=0.3119, tot_loss=1.54, over 111.00 samples.], tot_loss[dur_loss=0.2366, prior_loss=0.9836, diff_loss=0.3971, tot_loss=1.617, over 1656.00 samples.],
2024-10-21 05:27:29,947 INFO [train.py:682] (1/4) Start epoch 1367
2024-10-21 05:27:44,138 INFO [train.py:561] (1/4) Epoch 1367, batch 4, global_batch_idx: 21860, batch size: 189, loss[dur_loss=0.2422, prior_loss=0.9845, diff_loss=0.3527, tot_loss=1.579, over 189.00 samples.], tot_loss[dur_loss=0.235, prior_loss=0.9833, diff_loss=0.4518, tot_loss=1.67, over 937.00 samples.],
2024-10-21 05:27:59,190 INFO [train.py:561] (1/4) Epoch 1367, batch 14, global_batch_idx: 21870, batch size: 142, loss[dur_loss=0.2408, prior_loss=0.9837, diff_loss=0.3394, tot_loss=1.564, over 142.00 samples.], tot_loss[dur_loss=0.2378, prior_loss=0.9838, diff_loss=0.3728, tot_loss=1.594, over 2210.00 samples.],
2024-10-21 05:28:00,627 INFO [train.py:682] (1/4) Start epoch 1368
2024-10-21 05:28:20,652 INFO [train.py:561] (1/4) Epoch 1368, batch 8, global_batch_idx: 21880, batch size: 170, loss[dur_loss=0.2395, prior_loss=0.9841, diff_loss=0.3596, tot_loss=1.583, over 170.00 samples.], tot_loss[dur_loss=0.2358, prior_loss=0.9836, diff_loss=0.4115, tot_loss=1.631, over 1432.00 samples.],
2024-10-21 05:28:30,892 INFO [train.py:682] (1/4) Start epoch 1369
2024-10-21 05:28:42,506 INFO [train.py:561] (1/4) Epoch 1369, batch 2, global_batch_idx: 21890, batch size: 203, loss[dur_loss=0.2396, prior_loss=0.9843, diff_loss=0.3796, tot_loss=1.603, over 203.00 samples.], tot_loss[dur_loss=0.2402, prior_loss=0.9842, diff_loss=0.3544, tot_loss=1.579, over 442.00 samples.],
2024-10-21 05:28:56,952 INFO [train.py:561] (1/4) Epoch 1369, batch 12, global_batch_idx: 21900, batch size: 152, loss[dur_loss=0.241, prior_loss=0.984, diff_loss=0.3199, tot_loss=1.545, over 152.00 samples.], tot_loss[dur_loss=0.2377, prior_loss=0.9838, diff_loss=0.3885, tot_loss=1.61, over 1966.00 samples.],
2024-10-21 05:29:01,467 INFO [train.py:682] (1/4) Start epoch 1370
2024-10-21 05:29:18,613 INFO [train.py:561] (1/4) Epoch 1370, batch 6, global_batch_idx: 21910, batch size: 106, loss[dur_loss=0.2407, prior_loss=0.9843, diff_loss=0.3578, tot_loss=1.583, over 106.00 samples.], tot_loss[dur_loss=0.234, prior_loss=0.9833, diff_loss=0.435, tot_loss=1.652, over 1142.00 samples.],
2024-10-21 05:29:31,780 INFO [train.py:682] (1/4) Start epoch 1371
2024-10-21 05:29:40,801 INFO [train.py:561] (1/4) Epoch 1371, batch 0, global_batch_idx: 21920, batch size: 108, loss[dur_loss=0.2413, prior_loss=0.9846, diff_loss=0.3579, tot_loss=1.584, over 108.00 samples.], tot_loss[dur_loss=0.2413, prior_loss=0.9846, diff_loss=0.3579, tot_loss=1.584, over 108.00 samples.],
2024-10-21 05:29:55,101 INFO [train.py:561] (1/4) Epoch 1371, batch 10, global_batch_idx: 21930, batch size: 111, loss[dur_loss=0.2387, prior_loss=0.9854, diff_loss=0.339, tot_loss=1.563, over 111.00 samples.], tot_loss[dur_loss=0.2368, prior_loss=0.9838, diff_loss=0.4028, tot_loss=1.623, over 1656.00 samples.],
2024-10-21 05:30:02,212 INFO [train.py:682] (1/4) Start epoch 1372
2024-10-21 05:30:16,243 INFO [train.py:561] (1/4) Epoch 1372, batch 4, global_batch_idx: 21940, batch size: 189, loss[dur_loss=0.2356, prior_loss=0.984, diff_loss=0.3596, tot_loss=1.579, over 189.00 samples.], tot_loss[dur_loss=0.2342, prior_loss=0.983, diff_loss=0.4405, tot_loss=1.658, over 937.00 samples.],
2024-10-21 05:30:31,172 INFO [train.py:561] (1/4) Epoch 1372, batch 14, global_batch_idx: 21950, batch size: 142, loss[dur_loss=0.2386, prior_loss=0.9836, diff_loss=0.312, tot_loss=1.534, over 142.00 samples.], tot_loss[dur_loss=0.2376, prior_loss=0.9837, diff_loss=0.3736, tot_loss=1.595, over 2210.00 samples.],
2024-10-21 05:30:32,603 INFO [train.py:682] (1/4) Start epoch 1373
2024-10-21 05:30:52,809 INFO [train.py:561] (1/4) Epoch 1373, batch 8, global_batch_idx: 21960, batch size: 170, loss[dur_loss=0.244, prior_loss=0.9841, diff_loss=0.2914, tot_loss=1.52, over 170.00 samples.], tot_loss[dur_loss=0.2372, prior_loss=0.9834, diff_loss=0.3925, tot_loss=1.613, over 1432.00 samples.],
2024-10-21 05:31:02,939 INFO [train.py:682] (1/4) Start epoch 1374
2024-10-21 05:31:14,519 INFO [train.py:561] (1/4) Epoch 1374, batch 2, global_batch_idx: 21970, batch size: 203, loss[dur_loss=0.2373, prior_loss=0.9836, diff_loss=0.3696, tot_loss=1.591, over 203.00 samples.], tot_loss[dur_loss=0.2395, prior_loss=0.9838, diff_loss=0.3559, tot_loss=1.579, over 442.00 samples.],
2024-10-21 05:31:28,911 INFO [train.py:561] (1/4) Epoch 1374, batch 12, global_batch_idx: 21980, batch size: 152, loss[dur_loss=0.2399, prior_loss=0.984, diff_loss=0.3613, tot_loss=1.585, over 152.00 samples.], tot_loss[dur_loss=0.237, prior_loss=0.9836, diff_loss=0.3837, tot_loss=1.604, over 1966.00 samples.],
2024-10-21 05:31:33,425 INFO [train.py:682] (1/4) Start epoch 1375
2024-10-21 05:31:50,958 INFO [train.py:561] (1/4) Epoch 1375, batch 6, global_batch_idx: 21990, batch size: 106, loss[dur_loss=0.2442, prior_loss=0.9841, diff_loss=0.3106, tot_loss=1.539, over 106.00 samples.], tot_loss[dur_loss=0.2348, prior_loss=0.9833, diff_loss=0.425, tot_loss=1.643, over 1142.00 samples.],
2024-10-21 05:32:04,126 INFO [train.py:682] (1/4) Start epoch 1376
2024-10-21 05:32:12,971 INFO [train.py:561] (1/4) Epoch 1376, batch 0, global_batch_idx: 22000, batch size: 108, loss[dur_loss=0.2442, prior_loss=0.9849, diff_loss=0.3391, tot_loss=1.568, over 108.00 samples.], tot_loss[dur_loss=0.2442, prior_loss=0.9849, diff_loss=0.3391, tot_loss=1.568, over 108.00 samples.],
2024-10-21 05:32:27,267 INFO [train.py:561] (1/4) Epoch 1376, batch 10, global_batch_idx: 22010, batch size: 111, loss[dur_loss=0.2394, prior_loss=0.985, diff_loss=0.3398, tot_loss=1.564, over 111.00 samples.], tot_loss[dur_loss=0.2362, prior_loss=0.9836, diff_loss=0.398, tot_loss=1.618, over 1656.00 samples.],
2024-10-21 05:32:34,444 INFO [train.py:682] (1/4) Start epoch 1377
2024-10-21 05:32:48,498 INFO [train.py:561] (1/4) Epoch 1377, batch 4, global_batch_idx: 22020, batch size: 189, loss[dur_loss=0.2397, prior_loss=0.9839, diff_loss=0.3631, tot_loss=1.587, over 189.00 samples.], tot_loss[dur_loss=0.2331, prior_loss=0.9829, diff_loss=0.4453, tot_loss=1.661, over 937.00 samples.],
2024-10-21 05:33:03,503 INFO [train.py:561] (1/4) Epoch 1377, batch 14, global_batch_idx: 22030, batch size: 142, loss[dur_loss=0.2413, prior_loss=0.9837, diff_loss=0.3362, tot_loss=1.561, over 142.00 samples.], tot_loss[dur_loss=0.2368, prior_loss=0.9836, diff_loss=0.3791, tot_loss=1.6, over 2210.00 samples.],
2024-10-21 05:33:04,961 INFO [train.py:682] (1/4) Start epoch 1378
2024-10-21 05:33:25,244 INFO [train.py:561] (1/4) Epoch 1378, batch 8, global_batch_idx: 22040, batch size: 170, loss[dur_loss=0.2399, prior_loss=0.9839, diff_loss=0.3579, tot_loss=1.582, over 170.00 samples.], tot_loss[dur_loss=0.2347, prior_loss=0.9832, diff_loss=0.4012, tot_loss=1.619, over 1432.00 samples.],
2024-10-21 05:33:35,414 INFO [train.py:682] (1/4) Start epoch 1379
2024-10-21 05:33:47,167 INFO [train.py:561] (1/4) Epoch 1379, batch 2, global_batch_idx: 22050, batch size: 203, loss[dur_loss=0.2406, prior_loss=0.9838, diff_loss=0.338, tot_loss=1.562, over 203.00 samples.], tot_loss[dur_loss=0.2411, prior_loss=0.984, diff_loss=0.3202, tot_loss=1.545, over 442.00 samples.],
2024-10-21 05:34:01,499 INFO [train.py:561] (1/4) Epoch 1379, batch 12, global_batch_idx: 22060, batch size: 152, loss[dur_loss=0.2371, prior_loss=0.9839, diff_loss=0.3609, tot_loss=1.582, over 152.00 samples.], tot_loss[dur_loss=0.2367, prior_loss=0.9836, diff_loss=0.3846, tot_loss=1.605, over 1966.00 samples.],
2024-10-21 05:34:05,997 INFO [train.py:682] (1/4) Start epoch 1380
2024-10-21 05:34:23,234 INFO [train.py:561] (1/4) Epoch 1380, batch 6, global_batch_idx: 22070, batch size: 106, loss[dur_loss=0.2405, prior_loss=0.984, diff_loss=0.3201, tot_loss=1.545, over 106.00 samples.], tot_loss[dur_loss=0.2348, prior_loss=0.9831, diff_loss=0.426, tot_loss=1.644, over 1142.00 samples.],
2024-10-21 05:34:36,458 INFO [train.py:682] (1/4) Start epoch 1381
2024-10-21 05:34:45,348 INFO [train.py:561] (1/4) Epoch 1381, batch 0, global_batch_idx: 22080, batch size: 108, loss[dur_loss=0.2419, prior_loss=0.9845, diff_loss=0.2967, tot_loss=1.523, over 108.00 samples.], tot_loss[dur_loss=0.2419, prior_loss=0.9845, diff_loss=0.2967, tot_loss=1.523, over 108.00 samples.],
2024-10-21 05:34:59,564 INFO [train.py:561] (1/4) Epoch 1381, batch 10, global_batch_idx: 22090, batch size: 111, loss[dur_loss=0.2424, prior_loss=0.9851, diff_loss=0.3077, tot_loss=1.535, over 111.00 samples.], tot_loss[dur_loss=0.2359, prior_loss=0.9835, diff_loss=0.3898, tot_loss=1.609, over 1656.00 samples.],
2024-10-21 05:35:06,685 INFO [train.py:682] (1/4) Start epoch 1382
2024-10-21 05:35:20,335 INFO [train.py:561] (1/4) Epoch 1382, batch 4, global_batch_idx: 22100, batch size: 189, loss[dur_loss=0.2363, prior_loss=0.9838, diff_loss=0.336, tot_loss=1.556, over 189.00 samples.], tot_loss[dur_loss=0.233, prior_loss=0.9829, diff_loss=0.4354, tot_loss=1.651, over 937.00 samples.],
2024-10-21 05:35:35,243 INFO [train.py:561] (1/4) Epoch 1382, batch 14, global_batch_idx: 22110, batch size: 142, loss[dur_loss=0.2409, prior_loss=0.9835, diff_loss=0.3363, tot_loss=1.561, over 142.00 samples.], tot_loss[dur_loss=0.2371, prior_loss=0.9836, diff_loss=0.3768, tot_loss=1.597, over 2210.00 samples.],
2024-10-21 05:35:36,662 INFO [train.py:682] (1/4) Start epoch 1383
2024-10-21 05:35:56,870 INFO [train.py:561] (1/4) Epoch 1383, batch 8, global_batch_idx: 22120, batch size: 170, loss[dur_loss=0.2426, prior_loss=0.9842, diff_loss=0.3577, tot_loss=1.585, over 170.00 samples.], tot_loss[dur_loss=0.2359, prior_loss=0.9833, diff_loss=0.4042, tot_loss=1.623, over 1432.00 samples.],
2024-10-21 05:36:06,959 INFO [train.py:682] (1/4) Start epoch 1384
2024-10-21 05:36:18,418 INFO [train.py:561] (1/4) Epoch 1384, batch 2, global_batch_idx: 22130, batch size: 203, loss[dur_loss=0.2403, prior_loss=0.9839, diff_loss=0.3421, tot_loss=1.566, over 203.00 samples.], tot_loss[dur_loss=0.2392, prior_loss=0.9839, diff_loss=0.3283, tot_loss=1.551, over 442.00 samples.],
2024-10-21 05:36:32,612 INFO [train.py:561] (1/4) Epoch 1384, batch 12, global_batch_idx: 22140, batch size: 152, loss[dur_loss=0.2382, prior_loss=0.9836, diff_loss=0.3045, tot_loss=1.526, over 152.00 samples.], tot_loss[dur_loss=0.2374, prior_loss=0.9836, diff_loss=0.3896, tot_loss=1.611, over 1966.00 samples.],
2024-10-21 05:36:37,055 INFO [train.py:682] (1/4) Start epoch 1385
2024-10-21 05:36:54,119 INFO [train.py:561] (1/4) Epoch 1385, batch 6, global_batch_idx: 22150, batch size: 106, loss[dur_loss=0.2374, prior_loss=0.984, diff_loss=0.3129, tot_loss=1.534, over 106.00 samples.], tot_loss[dur_loss=0.2333, prior_loss=0.983, diff_loss=0.4201, tot_loss=1.636, over 1142.00 samples.],
2024-10-21 05:37:07,198 INFO [train.py:682] (1/4) Start epoch 1386
2024-10-21 05:37:16,149 INFO [train.py:561] (1/4) Epoch 1386, batch 0, global_batch_idx: 22160, batch size: 108, loss[dur_loss=0.2455, prior_loss=0.9846, diff_loss=0.3505, tot_loss=1.581, over 108.00 samples.], tot_loss[dur_loss=0.2455, prior_loss=0.9846, diff_loss=0.3505, tot_loss=1.581, over 108.00 samples.],
2024-10-21 05:37:30,396 INFO [train.py:561] (1/4) Epoch 1386, batch 10, global_batch_idx: 22170, batch size: 111, loss[dur_loss=0.2438, prior_loss=0.985, diff_loss=0.3157, tot_loss=1.545, over 111.00 samples.], tot_loss[dur_loss=0.2371, prior_loss=0.9835, diff_loss=0.3896, tot_loss=1.61, over 1656.00 samples.],
2024-10-21 05:37:37,535 INFO [train.py:682] (1/4) Start epoch 1387
2024-10-21 05:37:51,391 INFO [train.py:561] (1/4) Epoch 1387, batch 4, global_batch_idx: 22180, batch size: 189, loss[dur_loss=0.2356, prior_loss=0.9841, diff_loss=0.3537, tot_loss=1.573, over 189.00 samples.], tot_loss[dur_loss=0.2343, prior_loss=0.983, diff_loss=0.4466, tot_loss=1.664, over 937.00 samples.],
2024-10-21 05:38:06,344 INFO [train.py:561] (1/4) Epoch 1387, batch 14, global_batch_idx: 22190, batch size: 142, loss[dur_loss=0.2403, prior_loss=0.9838, diff_loss=0.3202, tot_loss=1.544, over 142.00 samples.], tot_loss[dur_loss=0.2384, prior_loss=0.9837, diff_loss=0.3772, tot_loss=1.599, over 2210.00 samples.],
2024-10-21 05:38:07,756 INFO [train.py:682] (1/4) Start epoch 1388
2024-10-21 05:38:27,716 INFO [train.py:561] (1/4) Epoch 1388, batch 8, global_batch_idx: 22200, batch size: 170, loss[dur_loss=0.2423, prior_loss=0.9837, diff_loss=0.3725, tot_loss=1.599, over 170.00 samples.], tot_loss[dur_loss=0.2368, prior_loss=0.9834, diff_loss=0.4153, tot_loss=1.635, over 1432.00 samples.],
2024-10-21 05:38:37,823 INFO [train.py:682] (1/4) Start epoch 1389
2024-10-21 05:38:49,354 INFO [train.py:561] (1/4) Epoch 1389, batch 2, global_batch_idx: 22210, batch size: 203, loss[dur_loss=0.2372, prior_loss=0.9838, diff_loss=0.3824, tot_loss=1.603, over 203.00 samples.], tot_loss[dur_loss=0.2387, prior_loss=0.9841, diff_loss=0.3617, tot_loss=1.584, over 442.00 samples.],
2024-10-21 05:39:03,545 INFO [train.py:561] (1/4) Epoch 1389, batch 12, global_batch_idx: 22220, batch size: 152, loss[dur_loss=0.2386, prior_loss=0.9838, diff_loss=0.3145, tot_loss=1.537, over 152.00 samples.], tot_loss[dur_loss=0.2374, prior_loss=0.9836, diff_loss=0.3933, tot_loss=1.614, over 1966.00 samples.],
2024-10-21 05:39:07,976 INFO [train.py:682] (1/4) Start epoch 1390
2024-10-21 05:39:24,805 INFO [train.py:561] (1/4) Epoch 1390, batch 6, global_batch_idx: 22230, batch size: 106, loss[dur_loss=0.2405, prior_loss=0.984, diff_loss=0.3246, tot_loss=1.549, over 106.00 samples.], tot_loss[dur_loss=0.2343, prior_loss=0.9832, diff_loss=0.4193, tot_loss=1.637, over 1142.00 samples.],
2024-10-21 05:39:37,898 INFO [train.py:682] (1/4) Start epoch 1391
2024-10-21 05:39:46,852 INFO [train.py:561] (1/4) Epoch 1391, batch 0, global_batch_idx: 22240, batch size: 108, loss[dur_loss=0.2433, prior_loss=0.9849, diff_loss=0.3836, tot_loss=1.612, over 108.00 samples.], tot_loss[dur_loss=0.2433, prior_loss=0.9849, diff_loss=0.3836, tot_loss=1.612, over 108.00 samples.],
2024-10-21 05:40:01,025 INFO [train.py:561] (1/4) Epoch 1391, batch 10, global_batch_idx: 22250, batch size: 111, loss[dur_loss=0.2459, prior_loss=0.9855, diff_loss=0.3256, tot_loss=1.557, over 111.00 samples.], tot_loss[dur_loss=0.2376, prior_loss=0.9837, diff_loss=0.3983, tot_loss=1.62, over 1656.00 samples.],
2024-10-21 05:40:08,107 INFO [train.py:682] (1/4) Start epoch 1392
2024-10-21 05:40:21,882 INFO [train.py:561] (1/4) Epoch 1392, batch 4, global_batch_idx: 22260, batch size: 189, loss[dur_loss=0.2373, prior_loss=0.9838, diff_loss=0.357, tot_loss=1.578, over 189.00 samples.], tot_loss[dur_loss=0.2323, prior_loss=0.9828, diff_loss=0.4581, tot_loss=1.673, over 937.00 samples.],
2024-10-21 05:40:36,635 INFO [train.py:561] (1/4) Epoch 1392, batch 14, global_batch_idx: 22270, batch size: 142, loss[dur_loss=0.2411, prior_loss=0.9839, diff_loss=0.3142, tot_loss=1.539, over 142.00 samples.], tot_loss[dur_loss=0.2371, prior_loss=0.9836, diff_loss=0.3864, tot_loss=1.607, over 2210.00 samples.],
2024-10-21 05:40:38,054 INFO [train.py:682] (1/4) Start epoch 1393
2024-10-21 05:40:57,847 INFO [train.py:561] (1/4) Epoch 1393, batch 8, global_batch_idx: 22280, batch size: 170, loss[dur_loss=0.2424, prior_loss=0.9837, diff_loss=0.3307, tot_loss=1.557, over 170.00 samples.], tot_loss[dur_loss=0.2358, prior_loss=0.9832, diff_loss=0.3917, tot_loss=1.611, over 1432.00 samples.],
2024-10-21 05:41:08,045 INFO [train.py:682] (1/4) Start epoch 1394
2024-10-21 05:41:19,684 INFO [train.py:561] (1/4) Epoch 1394, batch 2, global_batch_idx: 22290, batch size: 203, loss[dur_loss=0.2405, prior_loss=0.9835, diff_loss=0.3519, tot_loss=1.576, over 203.00 samples.], tot_loss[dur_loss=0.2393, prior_loss=0.9838, diff_loss=0.3047, tot_loss=1.528, over 442.00 samples.],
2024-10-21 05:41:34,036 INFO [train.py:561] (1/4) Epoch 1394, batch 12, global_batch_idx: 22300, batch size: 152, loss[dur_loss=0.2355, prior_loss=0.9835, diff_loss=0.3623, tot_loss=1.581, over 152.00 samples.], tot_loss[dur_loss=0.2366, prior_loss=0.9834, diff_loss=0.3867, tot_loss=1.607, over 1966.00 samples.],
2024-10-21 05:41:38,556 INFO [train.py:682] (1/4) Start epoch 1395
2024-10-21 05:41:55,841 INFO [train.py:561] (1/4) Epoch 1395, batch 6, global_batch_idx: 22310, batch size: 106, loss[dur_loss=0.2382, prior_loss=0.9838, diff_loss=0.3101, tot_loss=1.532, over 106.00 samples.], tot_loss[dur_loss=0.2333, prior_loss=0.9829, diff_loss=0.4267, tot_loss=1.643, over 1142.00 samples.],
2024-10-21 05:42:09,134 INFO [train.py:682] (1/4) Start epoch 1396
2024-10-21 05:42:18,138 INFO [train.py:561] (1/4) Epoch 1396, batch 0, global_batch_idx: 22320, batch size: 108, loss[dur_loss=0.243, prior_loss=0.9845, diff_loss=0.3064, tot_loss=1.534, over 108.00 samples.], tot_loss[dur_loss=0.243, prior_loss=0.9845, diff_loss=0.3064, tot_loss=1.534, over 108.00 samples.],
2024-10-21 05:42:32,595 INFO [train.py:561] (1/4) Epoch 1396, batch 10, global_batch_idx: 22330, batch size: 111, loss[dur_loss=0.2426, prior_loss=0.9849, diff_loss=0.3257, tot_loss=1.553, over 111.00 samples.], tot_loss[dur_loss=0.2364, prior_loss=0.9834, diff_loss=0.3804, tot_loss=1.6, over 1656.00 samples.],
2024-10-21 05:42:39,726 INFO [train.py:682] (1/4) Start epoch 1397
2024-10-21 05:42:53,786 INFO [train.py:561] (1/4) Epoch 1397, batch 4, global_batch_idx: 22340, batch size: 189, loss[dur_loss=0.2367, prior_loss=0.9839, diff_loss=0.3597, tot_loss=1.58, over 189.00 samples.], tot_loss[dur_loss=0.2323, prior_loss=0.9827, diff_loss=0.4425, tot_loss=1.657, over 937.00 samples.],
2024-10-21 05:43:08,644 INFO [train.py:561] (1/4) Epoch 1397, batch 14, global_batch_idx: 22350, batch size: 142, loss[dur_loss=0.2377, prior_loss=0.9838, diff_loss=0.3275, tot_loss=1.549, over 142.00 samples.], tot_loss[dur_loss=0.2368, prior_loss=0.9835, diff_loss=0.3664, tot_loss=1.587, over 2210.00 samples.],
2024-10-21 05:43:10,083 INFO [train.py:682] (1/4) Start epoch 1398
2024-10-21 05:43:30,495 INFO [train.py:561] (1/4) Epoch 1398, batch 8, global_batch_idx: 22360, batch size: 170, loss[dur_loss=0.2457, prior_loss=0.9841, diff_loss=0.3559, tot_loss=1.586, over 170.00 samples.], tot_loss[dur_loss=0.2363, prior_loss=0.9834, diff_loss=0.4077, tot_loss=1.627, over 1432.00 samples.],
2024-10-21 05:43:40,652 INFO [train.py:682] (1/4) Start epoch 1399
2024-10-21 05:43:51,912 INFO [train.py:561] (1/4) Epoch 1399, batch 2, global_batch_idx: 22370, batch size: 203, loss[dur_loss=0.239, prior_loss=0.9838, diff_loss=0.3687, tot_loss=1.592, over 203.00 samples.], tot_loss[dur_loss=0.24, prior_loss=0.984, diff_loss=0.3469, tot_loss=1.571, over 442.00 samples.],
2024-10-21 05:44:06,124 INFO [train.py:561] (1/4) Epoch 1399, batch 12, global_batch_idx: 22380, batch size: 152, loss[dur_loss=0.2373, prior_loss=0.9836, diff_loss=0.3176, tot_loss=1.539, over 152.00 samples.], tot_loss[dur_loss=0.2376, prior_loss=0.9836, diff_loss=0.3799, tot_loss=1.601, over 1966.00 samples.],
2024-10-21 05:44:10,601 INFO [train.py:682] (1/4) Start epoch 1400
2024-10-21 05:44:28,035 INFO [train.py:561] (1/4) Epoch 1400, batch 6, global_batch_idx: 22390, batch size: 106, loss[dur_loss=0.2388, prior_loss=0.9836, diff_loss=0.3362, tot_loss=1.559, over 106.00 samples.], tot_loss[dur_loss=0.2353, prior_loss=0.9831, diff_loss=0.4375, tot_loss=1.656, over 1142.00 samples.],
2024-10-21 05:44:41,088 INFO [train.py:682] (1/4) Start epoch 1401
2024-10-21 05:44:49,748 INFO [train.py:561] (1/4) Epoch 1401, batch 0, global_batch_idx: 22400, batch size: 108, loss[dur_loss=0.2422, prior_loss=0.9847, diff_loss=0.3153, tot_loss=1.542, over 108.00 samples.], tot_loss[dur_loss=0.2422, prior_loss=0.9847, diff_loss=0.3153, tot_loss=1.542, over 108.00 samples.],
2024-10-21 05:45:03,744 INFO [train.py:561] (1/4) Epoch 1401, batch 10, global_batch_idx: 22410, batch size: 111, loss[dur_loss=0.2434, prior_loss=0.9848, diff_loss=0.3136, tot_loss=1.542, over 111.00 samples.], tot_loss[dur_loss=0.236, prior_loss=0.9834, diff_loss=0.3895, tot_loss=1.609, over 1656.00 samples.],
2024-10-21 05:45:10,730 INFO [train.py:682] (1/4) Start epoch 1402
2024-10-21 05:45:24,668 INFO [train.py:561] (1/4) Epoch 1402, batch 4, global_batch_idx: 22420, batch size: 189, loss[dur_loss=0.2361, prior_loss=0.9839, diff_loss=0.3618, tot_loss=1.582, over 189.00 samples.], tot_loss[dur_loss=0.2331, prior_loss=0.9827, diff_loss=0.4562, tot_loss=1.672, over 937.00 samples.],
2024-10-21 05:45:39,437 INFO [train.py:561] (1/4) Epoch 1402, batch 14, global_batch_idx: 22430, batch size: 142, loss[dur_loss=0.2367, prior_loss=0.9836, diff_loss=0.3115, tot_loss=1.532, over 142.00 samples.], tot_loss[dur_loss=0.2369, prior_loss=0.9834, diff_loss=0.3824, tot_loss=1.603, over 2210.00 samples.],
2024-10-21 05:45:40,862 INFO [train.py:682] (1/4) Start epoch 1403
2024-10-21 05:46:00,944 INFO [train.py:561] (1/4) Epoch 1403, batch 8, global_batch_idx: 22440, batch size: 170, loss[dur_loss=0.2431, prior_loss=0.9842, diff_loss=0.3309, tot_loss=1.558, over 170.00 samples.], tot_loss[dur_loss=0.2365, prior_loss=0.9833, diff_loss=0.4086, tot_loss=1.628, over 1432.00 samples.],
2024-10-21 05:46:11,057 INFO [train.py:682] (1/4) Start epoch 1404
2024-10-21 05:46:22,609 INFO [train.py:561] (1/4) Epoch 1404, batch 2, global_batch_idx: 22450, batch size: 203, loss[dur_loss=0.2412, prior_loss=0.9836, diff_loss=0.3642, tot_loss=1.589, over 203.00 samples.], tot_loss[dur_loss=0.2399, prior_loss=0.9839, diff_loss=0.3375, tot_loss=1.561, over 442.00 samples.],
2024-10-21 05:46:36,817 INFO [train.py:561] (1/4) Epoch 1404, batch 12, global_batch_idx: 22460, batch size: 152, loss[dur_loss=0.2409, prior_loss=0.9842, diff_loss=0.3319, tot_loss=1.557, over 152.00 samples.], tot_loss[dur_loss=0.2357, prior_loss=0.9835, diff_loss=0.3834, tot_loss=1.603, over 1966.00 samples.],
2024-10-21 05:46:41,229 INFO [train.py:682] (1/4) Start epoch 1405
2024-10-21 05:46:58,394 INFO [train.py:561] (1/4) Epoch 1405, batch 6, global_batch_idx: 22470, batch size: 106, loss[dur_loss=0.2426, prior_loss=0.9843, diff_loss=0.3008, tot_loss=1.528, over 106.00 samples.], tot_loss[dur_loss=0.2352, prior_loss=0.9833, diff_loss=0.4177, tot_loss=1.636, over 1142.00 samples.],
2024-10-21 05:47:11,415 INFO [train.py:682] (1/4) Start epoch 1406
2024-10-21 05:47:20,245 INFO [train.py:561] (1/4) Epoch 1406, batch 0, global_batch_idx: 22480, batch size: 108, loss[dur_loss=0.2421, prior_loss=0.9846, diff_loss=0.3154, tot_loss=1.542, over 108.00 samples.], tot_loss[dur_loss=0.2421, prior_loss=0.9846, diff_loss=0.3154, tot_loss=1.542, over 108.00 samples.],
2024-10-21 05:47:34,373 INFO [train.py:561] (1/4) Epoch 1406, batch 10, global_batch_idx: 22490, batch size: 111, loss[dur_loss=0.2436, prior_loss=0.9847, diff_loss=0.2966, tot_loss=1.525, over 111.00 samples.], tot_loss[dur_loss=0.2367, prior_loss=0.9834, diff_loss=0.3832, tot_loss=1.603, over 1656.00 samples.],
2024-10-21 05:47:41,364 INFO [train.py:682] (1/4) Start epoch 1407
2024-10-21 05:47:55,129 INFO [train.py:561] (1/4) Epoch 1407, batch 4, global_batch_idx: 22500, batch size: 189, loss[dur_loss=0.2365, prior_loss=0.9838, diff_loss=0.3678, tot_loss=1.588, over 189.00 samples.], tot_loss[dur_loss=0.2321, prior_loss=0.9826, diff_loss=0.4648, tot_loss=1.679, over 937.00 samples.],
2024-10-21 05:47:56,760 INFO [train.py:579] (1/4) Computing validation loss
2024-10-21 05:48:18,397 INFO [train.py:589] (1/4) Epoch 1407, validation: dur_loss=0.4532, prior_loss=1.034, diff_loss=0.3919, tot_loss=1.879, over 100.00 samples.
2024-10-21 05:48:18,398 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB
2024-10-21 05:48:31,715 INFO [train.py:561] (1/4) Epoch 1407, batch 14, global_batch_idx: 22510, batch size: 142, loss[dur_loss=0.2367, prior_loss=0.983, diff_loss=0.3306, tot_loss=1.55, over 142.00 samples.], tot_loss[dur_loss=0.2359, prior_loss=0.9833, diff_loss=0.3974, tot_loss=1.617, over 2210.00 samples.],
2024-10-21 05:48:33,144 INFO [train.py:682] (1/4) Start epoch 1408
2024-10-21 05:48:53,522 INFO [train.py:561] (1/4) Epoch 1408, batch 8, global_batch_idx: 22520, batch size: 170, loss[dur_loss=0.2388, prior_loss=0.9835, diff_loss=0.3394, tot_loss=1.562, over 170.00 samples.], tot_loss[dur_loss=0.2343, prior_loss=0.983, diff_loss=0.4024, tot_loss=1.62, over 1432.00 samples.],
2024-10-21 05:49:03,671 INFO [train.py:682] (1/4) Start epoch 1409
2024-10-21 05:49:15,362 INFO [train.py:561] (1/4) Epoch 1409, batch 2, global_batch_idx: 22530, batch size: 203, loss[dur_loss=0.2374, prior_loss=0.9833, diff_loss=0.3317, tot_loss=1.552, over 203.00 samples.], tot_loss[dur_loss=0.2375, prior_loss=0.9835, diff_loss=0.3499, tot_loss=1.571, over 442.00 samples.],
2024-10-21 05:49:29,689 INFO [train.py:561] (1/4) Epoch 1409, batch 12, global_batch_idx: 22540, batch size: 152, loss[dur_loss=0.2369, prior_loss=0.9836, diff_loss=0.3124, tot_loss=1.533, over 152.00 samples.], tot_loss[dur_loss=0.2345, prior_loss=0.9832, diff_loss=0.3881, tot_loss=1.606, over 1966.00 samples.],
2024-10-21 05:49:34,095 INFO [train.py:682] (1/4) Start epoch 1410
2024-10-21 05:49:51,189 INFO [train.py:561] (1/4) Epoch 1410, batch 6, global_batch_idx: 22550, batch size: 106, loss[dur_loss=0.2369, prior_loss=0.9835, diff_loss=0.3134, tot_loss=1.534, over 106.00 samples.], tot_loss[dur_loss=0.2334, prior_loss=0.9828, diff_loss=0.4211, tot_loss=1.637, over 1142.00 samples.],
2024-10-21 05:50:04,065 INFO [train.py:682] (1/4) Start epoch 1411
2024-10-21 05:50:13,237 INFO [train.py:561] (1/4) Epoch 1411, batch 0, global_batch_idx: 22560, batch size: 108, loss[dur_loss=0.2433, prior_loss=0.9844, diff_loss=0.2947, tot_loss=1.522, over 108.00 samples.], tot_loss[dur_loss=0.2433, prior_loss=0.9844, diff_loss=0.2947, tot_loss=1.522, over 108.00 samples.],
2024-10-21 05:50:27,362 INFO [train.py:561] (1/4) Epoch 1411, batch 10, global_batch_idx: 22570, batch size: 111, loss[dur_loss=0.2404, prior_loss=0.9846, diff_loss=0.3182, tot_loss=1.543, over 111.00 samples.], tot_loss[dur_loss=0.2347, prior_loss=0.9831, diff_loss=0.3927, tot_loss=1.611, over 1656.00 samples.],
2024-10-21 05:50:34,348 INFO [train.py:682] (1/4) Start epoch 1412
2024-10-21 05:50:48,042 INFO [train.py:561] (1/4) Epoch 1412, batch 4, global_batch_idx: 22580, batch size: 189, loss[dur_loss=0.2404, prior_loss=0.9841, diff_loss=0.3581, tot_loss=1.583, over 189.00 samples.], tot_loss[dur_loss=0.2317, prior_loss=0.9827, diff_loss=0.4504, tot_loss=1.665, over 937.00 samples.],
2024-10-21 05:51:02,798 INFO [train.py:561] (1/4) Epoch 1412, batch 14, global_batch_idx: 22590, batch size: 142, loss[dur_loss=0.2369, prior_loss=0.9832, diff_loss=0.3173, tot_loss=1.537, over 142.00 samples.], tot_loss[dur_loss=0.2358, prior_loss=0.9834, diff_loss=0.3885, tot_loss=1.608, over 2210.00 samples.],
2024-10-21 05:51:04,220 INFO [train.py:682] (1/4) Start epoch 1413
2024-10-21 05:51:24,180 INFO [train.py:561] (1/4) Epoch 1413, batch 8, global_batch_idx: 22600, batch size: 170, loss[dur_loss=0.2392, prior_loss=0.9837, diff_loss=0.3407, tot_loss=1.564, over 170.00 samples.], tot_loss[dur_loss=0.2336, prior_loss=0.9832, diff_loss=0.4036, tot_loss=1.62, over 1432.00 samples.],
2024-10-21 05:51:34,273 INFO [train.py:682] (1/4) Start epoch 1414
2024-10-21 05:51:46,152 INFO [train.py:561] (1/4) Epoch 1414, batch 2, global_batch_idx: 22610, batch size: 203, loss[dur_loss=0.2376, prior_loss=0.9834, diff_loss=0.3484, tot_loss=1.569, over 203.00 samples.], tot_loss[dur_loss=0.2392, prior_loss=0.9837, diff_loss=0.3393, tot_loss=1.562, over 442.00 samples.],
2024-10-21 05:52:00,370 INFO [train.py:561] (1/4) Epoch 1414, batch 12, global_batch_idx: 22620, batch size: 152, loss[dur_loss=0.2375, prior_loss=0.9835, diff_loss=0.365, tot_loss=1.586, over 152.00 samples.], tot_loss[dur_loss=0.2355, prior_loss=0.9832, diff_loss=0.3898, tot_loss=1.608, over 1966.00 samples.],
2024-10-21 05:52:04,764 INFO [train.py:682] (1/4) Start epoch 1415
2024-10-21 05:52:21,900 INFO [train.py:561] (1/4) Epoch 1415, batch 6, global_batch_idx: 22630, batch size: 106, loss[dur_loss=0.2379, prior_loss=0.9837, diff_loss=0.3021, tot_loss=1.524, over 106.00 samples.], tot_loss[dur_loss=0.2326, prior_loss=0.9828, diff_loss=0.4235, tot_loss=1.639, over 1142.00 samples.],
2024-10-21 05:52:34,781 INFO [train.py:682] (1/4) Start epoch 1416
2024-10-21 05:52:43,809 INFO [train.py:561] (1/4) Epoch 1416, batch 0, global_batch_idx: 22640, batch size: 108, loss[dur_loss=0.2438, prior_loss=0.9843, diff_loss=0.3187, tot_loss=1.547, over 108.00 samples.], tot_loss[dur_loss=0.2438, prior_loss=0.9843, diff_loss=0.3187, tot_loss=1.547, over 108.00 samples.],
2024-10-21 05:52:57,806 INFO [train.py:561] (1/4) Epoch 1416, batch 10, global_batch_idx: 22650, batch size: 111, loss[dur_loss=0.2405, prior_loss=0.9847, diff_loss=0.3083, tot_loss=1.533, over 111.00 samples.], tot_loss[dur_loss=0.2359, prior_loss=0.9832, diff_loss=0.3977, tot_loss=1.617, over 1656.00 samples.],
2024-10-21 05:53:04,802 INFO [train.py:682] (1/4) Start epoch 1417
2024-10-21 05:53:18,558 INFO [train.py:561] (1/4) Epoch 1417, batch 4, global_batch_idx: 22660, batch size: 189, loss[dur_loss=0.2396, prior_loss=0.9838, diff_loss=0.3789, tot_loss=1.602, over 189.00 samples.], tot_loss[dur_loss=0.231, prior_loss=0.9824, diff_loss=0.452, tot_loss=1.665, over 937.00 samples.],
2024-10-21 05:53:33,208 INFO [train.py:561] (1/4) Epoch 1417, batch 14, global_batch_idx: 22670, batch size: 142, loss[dur_loss=0.2365, prior_loss=0.9832, diff_loss=0.3345, tot_loss=1.554, over 142.00 samples.], tot_loss[dur_loss=0.2346, prior_loss=0.9831, diff_loss=0.3793, tot_loss=1.597, over 2210.00 samples.],
2024-10-21 05:53:34,609 INFO [train.py:682] (1/4) Start epoch 1418
2024-10-21 05:53:54,526 INFO [train.py:561] (1/4) Epoch 1418, batch 8, global_batch_idx: 22680, batch size: 170, loss[dur_loss=0.2397, prior_loss=0.9835, diff_loss=0.3489, tot_loss=1.572, over 170.00 samples.], tot_loss[dur_loss=0.2342, prior_loss=0.9829, diff_loss=0.4041, tot_loss=1.621, over 1432.00 samples.],
2024-10-21 05:54:04,562 INFO [train.py:682] (1/4) Start epoch 1419
2024-10-21 05:54:16,073 INFO [train.py:561] (1/4) Epoch 1419, batch 2, global_batch_idx: 22690, batch size: 203, loss[dur_loss=0.2392, prior_loss=0.9833, diff_loss=0.3273, tot_loss=1.55, over 203.00 samples.], tot_loss[dur_loss=0.2397, prior_loss=0.9836, diff_loss=0.3248, tot_loss=1.548, over 442.00 samples.],
2024-10-21 05:54:30,128 INFO [train.py:561] (1/4) Epoch 1419, batch 12, global_batch_idx: 22700, batch size: 152, loss[dur_loss=0.2364, prior_loss=0.9834, diff_loss=0.3397, tot_loss=1.56, over 152.00 samples.], tot_loss[dur_loss=0.2349, prior_loss=0.9831, diff_loss=0.3879, tot_loss=1.606, over 1966.00 samples.],
2024-10-21 05:54:34,527 INFO [train.py:682] (1/4) Start epoch 1420
2024-10-21 05:54:51,810 INFO [train.py:561] (1/4) Epoch 1420, batch 6, global_batch_idx: 22710, batch size: 106, loss[dur_loss=0.2398, prior_loss=0.984, diff_loss=0.3245, tot_loss=1.548, over 106.00 samples.], tot_loss[dur_loss=0.2336, prior_loss=0.9829, diff_loss=0.4299, tot_loss=1.646, over 1142.00 samples.],
2024-10-21 05:55:04,678 INFO [train.py:682] (1/4) Start epoch 1421
2024-10-21 05:55:13,476 INFO [train.py:561] (1/4) Epoch 1421, batch 0, global_batch_idx: 22720, batch size: 108, loss[dur_loss=0.2422, prior_loss=0.9843, diff_loss=0.3248, tot_loss=1.551, over 108.00 samples.], tot_loss[dur_loss=0.2422, prior_loss=0.9843, diff_loss=0.3248, tot_loss=1.551, over 108.00 samples.],
2024-10-21 05:55:27,505 INFO [train.py:561] (1/4) Epoch 1421, batch 10, global_batch_idx: 22730, batch size: 111, loss[dur_loss=0.2402, prior_loss=0.9844, diff_loss=0.3148, tot_loss=1.539, over 111.00 samples.], tot_loss[dur_loss=0.2346, prior_loss=0.9832, diff_loss=0.3946, tot_loss=1.612, over 1656.00 samples.],
2024-10-21 05:55:34,522 INFO [train.py:682] (1/4) Start epoch 1422
2024-10-21 05:55:49,002 INFO [train.py:561] (1/4) Epoch 1422, batch 4, global_batch_idx: 22740, batch size: 189, loss[dur_loss=0.2373, prior_loss=0.9833, diff_loss=0.3797, tot_loss=1.6, over 189.00 samples.], tot_loss[dur_loss=0.2312, prior_loss=0.9824, diff_loss=0.4567, tot_loss=1.67, over 937.00 samples.],
2024-10-21 05:56:03,868 INFO [train.py:561] (1/4) Epoch 1422, batch 14, global_batch_idx: 22750, batch size: 142, loss[dur_loss=0.2349, prior_loss=0.9828, diff_loss=0.3324, tot_loss=1.55, over 142.00 samples.], tot_loss[dur_loss=0.2357, prior_loss=0.9831, diff_loss=0.3813, tot_loss=1.6, over 2210.00 samples.],
2024-10-21 05:56:05,306 INFO [train.py:682] (1/4) Start epoch 1423
2024-10-21 05:56:25,823 INFO [train.py:561] (1/4) Epoch 1423, batch 8, global_batch_idx: 22760, batch size: 170, loss[dur_loss=0.2395, prior_loss=0.9833, diff_loss=0.3248, tot_loss=1.548, over 170.00 samples.], tot_loss[dur_loss=0.2336, prior_loss=0.9826, diff_loss=0.3956, tot_loss=1.612, over 1432.00 samples.],
2024-10-21 05:56:36,024 INFO [train.py:682] (1/4) Start epoch 1424
2024-10-21 05:56:47,688 INFO [train.py:561] (1/4) Epoch 1424, batch 2, global_batch_idx: 22770, batch size: 203, loss[dur_loss=0.234, prior_loss=0.983, diff_loss=0.3411, tot_loss=1.558, over 203.00 samples.], tot_loss[dur_loss=0.2353, prior_loss=0.9832, diff_loss=0.3558, tot_loss=1.574, over 442.00 samples.],
2024-10-21 05:57:02,058 INFO [train.py:561] (1/4) Epoch 1424, batch 12, global_batch_idx: 22780, batch size: 152, loss[dur_loss=0.2344, prior_loss=0.9832, diff_loss=0.326, tot_loss=1.544, over 152.00 samples.], tot_loss[dur_loss=0.2333, prior_loss=0.9829, diff_loss=0.3809, tot_loss=1.597, over 1966.00 samples.],
2024-10-21 05:57:06,543 INFO [train.py:682] (1/4) Start epoch 1425
2024-10-21 05:57:23,772 INFO [train.py:561] (1/4) Epoch 1425, batch 6, global_batch_idx: 22790, batch size: 106, loss[dur_loss=0.2399, prior_loss=0.9837, diff_loss=0.3399, tot_loss=1.564, over 106.00 samples.], tot_loss[dur_loss=0.2318, prior_loss=0.9825, diff_loss=0.4256, tot_loss=1.64, over 1142.00 samples.],
2024-10-21 05:57:36,918 INFO [train.py:682] (1/4) Start epoch 1426
2024-10-21 05:57:46,193 INFO [train.py:561] (1/4) Epoch 1426, batch 0, global_batch_idx: 22800, batch size: 108, loss[dur_loss=0.2441, prior_loss=0.9841, diff_loss=0.3512, tot_loss=1.579, over 108.00 samples.], tot_loss[dur_loss=0.2441, prior_loss=0.9841, diff_loss=0.3512, tot_loss=1.579, over 108.00 samples.],
2024-10-21 05:58:00,479 INFO [train.py:561] (1/4) Epoch 1426, batch 10, global_batch_idx: 22810, batch size: 111, loss[dur_loss=0.2374, prior_loss=0.9844, diff_loss=0.3538, tot_loss=1.576, over 111.00 samples.], tot_loss[dur_loss=0.2346, prior_loss=0.9829, diff_loss=0.3995, tot_loss=1.617, over 1656.00 samples.],
2024-10-21 05:58:07,609 INFO [train.py:682] (1/4) Start epoch 1427
2024-10-21 05:58:21,586 INFO [train.py:561] (1/4) Epoch 1427, batch 4, global_batch_idx: 22820, batch size: 189, loss[dur_loss=0.235, prior_loss=0.9836, diff_loss=0.3563, tot_loss=1.575, over 189.00 samples.], tot_loss[dur_loss=0.2297, prior_loss=0.9823, diff_loss=0.4523, tot_loss=1.664, over 937.00 samples.],
2024-10-21 05:58:36,509 INFO [train.py:561] (1/4) Epoch 1427, batch 14, global_batch_idx: 22830, batch size: 142, loss[dur_loss=0.2368, prior_loss=0.983, diff_loss=0.3092, tot_loss=1.529, over 142.00 samples.], tot_loss[dur_loss=0.234, prior_loss=0.983, diff_loss=0.3868, tot_loss=1.604, over 2210.00 samples.],
2024-10-21 05:58:37,937 INFO [train.py:682] (1/4) Start epoch 1428
2024-10-21 05:58:58,002 INFO [train.py:561] (1/4) Epoch 1428, batch 8, global_batch_idx: 22840, batch size: 170, loss[dur_loss=0.2399, prior_loss=0.9835, diff_loss=0.3339, tot_loss=1.557, over 170.00 samples.], tot_loss[dur_loss=0.2331, prior_loss=0.9827, diff_loss=0.4025, tot_loss=1.618, over 1432.00 samples.],
2024-10-21 05:59:08,222 INFO [train.py:682] (1/4) Start epoch 1429
2024-10-21 05:59:20,066 INFO [train.py:561] (1/4) Epoch 1429, batch 2, global_batch_idx: 22850, batch size: 203, loss[dur_loss=0.2349, prior_loss=0.9832, diff_loss=0.3656, tot_loss=1.584, over 203.00 samples.], tot_loss[dur_loss=0.236, prior_loss=0.9833, diff_loss=0.345, tot_loss=1.564, over 442.00 samples.],
2024-10-21 05:59:34,385 INFO [train.py:561] (1/4) Epoch 1429, batch 12, global_batch_idx: 22860, batch size: 152, loss[dur_loss=0.2333, prior_loss=0.9832, diff_loss=0.3279, tot_loss=1.544, over 152.00 samples.], tot_loss[dur_loss=0.2338, prior_loss=0.9828, diff_loss=0.3896, tot_loss=1.606, over 1966.00 samples.],
2024-10-21 05:59:38,896 INFO [train.py:682] (1/4) Start epoch 1430
2024-10-21 05:59:56,387 INFO [train.py:561] (1/4) Epoch 1430, batch 6, global_batch_idx: 22870, batch size: 106, loss[dur_loss=0.238, prior_loss=0.9835, diff_loss=0.277, tot_loss=1.499, over 106.00 samples.], tot_loss[dur_loss=0.2316, prior_loss=0.9825, diff_loss=0.423, tot_loss=1.637, over 1142.00 samples.],
2024-10-21 06:00:09,651 INFO [train.py:682] (1/4) Start epoch 1431
2024-10-21 06:00:18,559 INFO [train.py:561] (1/4) Epoch 1431, batch 0, global_batch_idx: 22880, batch size: 108, loss[dur_loss=0.2421, prior_loss=0.9837, diff_loss=0.327, tot_loss=1.553, over 108.00 samples.], tot_loss[dur_loss=0.2421, prior_loss=0.9837, diff_loss=0.327, tot_loss=1.553, over 108.00 samples.],
2024-10-21 06:00:33,046 INFO [train.py:561] (1/4) Epoch 1431, batch 10, global_batch_idx: 22890, batch size: 111, loss[dur_loss=0.2416, prior_loss=0.9841, diff_loss=0.2998, tot_loss=1.525, over 111.00 samples.], tot_loss[dur_loss=0.2335, prior_loss=0.9828, diff_loss=0.3912, tot_loss=1.607, over 1656.00 samples.],
2024-10-21 06:00:40,177 INFO [train.py:682] (1/4) Start epoch 1432
2024-10-21 06:00:54,365 INFO [train.py:561] (1/4) Epoch 1432, batch 4, global_batch_idx: 22900, batch size: 189, loss[dur_loss=0.2357, prior_loss=0.9835, diff_loss=0.3525, tot_loss=1.572, over 189.00 samples.], tot_loss[dur_loss=0.23, prior_loss=0.9821, diff_loss=0.4552, tot_loss=1.667, over 937.00 samples.],
2024-10-21 06:01:09,332 INFO [train.py:561] (1/4) Epoch 1432, batch 14, global_batch_idx: 22910, batch size: 142, loss[dur_loss=0.2371, prior_loss=0.9829, diff_loss=0.384, tot_loss=1.604, over 142.00 samples.], tot_loss[dur_loss=0.2341, prior_loss=0.9829, diff_loss=0.3877, tot_loss=1.605, over 2210.00 samples.],
2024-10-21 06:01:10,777 INFO [train.py:682] (1/4) Start epoch 1433
2024-10-21 06:01:30,810 INFO [train.py:561] (1/4) Epoch 1433, batch 8, global_batch_idx: 22920, batch size: 170, loss[dur_loss=0.2388, prior_loss=0.9833, diff_loss=0.3182, tot_loss=1.54, over 170.00 samples.], tot_loss[dur_loss=0.2322, prior_loss=0.9826, diff_loss=0.4034, tot_loss=1.618, over 1432.00 samples.],
2024-10-21 06:01:41,009 INFO [train.py:682] (1/4) Start epoch 1434
2024-10-21 06:01:52,736 INFO [train.py:561] (1/4) Epoch 1434, batch 2, global_batch_idx: 22930, batch size: 203, loss[dur_loss=0.2362, prior_loss=0.9834, diff_loss=0.3634, tot_loss=1.583, over 203.00 samples.], tot_loss[dur_loss=0.2364, prior_loss=0.9835, diff_loss=0.3422, tot_loss=1.562, over 442.00 samples.],
2024-10-21 06:02:06,983 INFO [train.py:561] (1/4) Epoch 1434, batch 12, global_batch_idx: 22940, batch size: 152, loss[dur_loss=0.2343, prior_loss=0.983, diff_loss=0.2656, tot_loss=1.483, over 152.00 samples.], tot_loss[dur_loss=0.2337, prior_loss=0.983, diff_loss=0.3862, tot_loss=1.603, over 1966.00 samples.],
2024-10-21 06:02:11,435 INFO [train.py:682] (1/4) Start epoch 1435
2024-10-21 06:02:28,565 INFO [train.py:561] (1/4) Epoch 1435, batch 6, global_batch_idx: 22950, batch size: 106, loss[dur_loss=0.2425, prior_loss=0.9837, diff_loss=0.3474, tot_loss=1.574, over 106.00 samples.], tot_loss[dur_loss=0.2332, prior_loss=0.9825, diff_loss=0.4498, tot_loss=1.666, over 1142.00 samples.],
2024-10-21 06:02:41,653 INFO [train.py:682] (1/4) Start epoch 1436
2024-10-21 06:02:50,647 INFO [train.py:561] (1/4) Epoch 1436, batch 0, global_batch_idx: 22960, batch size: 108, loss[dur_loss=0.2394, prior_loss=0.9839, diff_loss=0.3574, tot_loss=1.581, over 108.00 samples.], tot_loss[dur_loss=0.2394, prior_loss=0.9839, diff_loss=0.3574, tot_loss=1.581, over 108.00 samples.],
2024-10-21 06:03:05,059 INFO [train.py:561] (1/4) Epoch 1436, batch 10, global_batch_idx: 22970, batch size: 111, loss[dur_loss=0.2398, prior_loss=0.9844, diff_loss=0.336, tot_loss=1.56, over 111.00 samples.], tot_loss[dur_loss=0.2344, prior_loss=0.983, diff_loss=0.3907, tot_loss=1.608, over 1656.00 samples.],
2024-10-21 06:03:12,155 INFO [train.py:682] (1/4) Start epoch 1437
2024-10-21 06:03:26,094 INFO [train.py:561] (1/4) Epoch 1437, batch 4, global_batch_idx: 22980, batch size: 189, loss[dur_loss=0.2394, prior_loss=0.9837, diff_loss=0.3623, tot_loss=1.585, over 189.00 samples.], tot_loss[dur_loss=0.2324, prior_loss=0.9823, diff_loss=0.4513, tot_loss=1.666, over 937.00 samples.],
2024-10-21 06:03:41,052 INFO [train.py:561] (1/4) Epoch 1437, batch 14, global_batch_idx: 22990, batch size: 142, loss[dur_loss=0.2365, prior_loss=0.983, diff_loss=0.3297, tot_loss=1.549, over 142.00 samples.], tot_loss[dur_loss=0.2357, prior_loss=0.9831, diff_loss=0.3795, tot_loss=1.598, over 2210.00 samples.],
2024-10-21 06:03:42,505 INFO [train.py:682] (1/4) Start epoch 1438
2024-10-21 06:04:02,594 INFO [train.py:561] (1/4) Epoch 1438, batch 8, global_batch_idx: 23000, batch size: 170, loss[dur_loss=0.238, prior_loss=0.9836, diff_loss=0.3396, tot_loss=1.561, over 170.00 samples.], tot_loss[dur_loss=0.2346, prior_loss=0.9829, diff_loss=0.4054, tot_loss=1.623, over 1432.00 samples.],
2024-10-21 06:04:12,838 INFO [train.py:682] (1/4) Start epoch 1439
2024-10-21 06:04:24,524 INFO [train.py:561] (1/4) Epoch 1439, batch 2, global_batch_idx: 23010, batch size: 203, loss[dur_loss=0.2364, prior_loss=0.983, diff_loss=0.311, tot_loss=1.53, over 203.00 samples.], tot_loss[dur_loss=0.2366, prior_loss=0.9831, diff_loss=0.3212, tot_loss=1.541, over 442.00 samples.],
2024-10-21 06:04:38,770 INFO [train.py:561] (1/4) Epoch 1439, batch 12, global_batch_idx: 23020, batch size: 152, loss[dur_loss=0.2353, prior_loss=0.9835, diff_loss=0.3137, tot_loss=1.532, over 152.00 samples.], tot_loss[dur_loss=0.234, prior_loss=0.9829, diff_loss=0.3789, tot_loss=1.596, over 1966.00 samples.],
2024-10-21 06:04:43,260 INFO [train.py:682] (1/4) Start epoch 1440
2024-10-21 06:05:00,490 INFO [train.py:561] (1/4) Epoch 1440, batch 6, global_batch_idx: 23030, batch size: 106, loss[dur_loss=0.2389, prior_loss=0.9833, diff_loss=0.3168, tot_loss=1.539, over 106.00 samples.], tot_loss[dur_loss=0.2307, prior_loss=0.9824, diff_loss=0.4351, tot_loss=1.648, over 1142.00 samples.],
2024-10-21 06:05:13,513 INFO [train.py:682] (1/4) Start epoch 1441
2024-10-21 06:05:22,170 INFO [train.py:561] (1/4) Epoch 1441, batch 0, global_batch_idx: 23040, batch size: 108, loss[dur_loss=0.2411, prior_loss=0.9838, diff_loss=0.3116, tot_loss=1.537, over 108.00 samples.], tot_loss[dur_loss=0.2411, prior_loss=0.9838, diff_loss=0.3116, tot_loss=1.537, over 108.00 samples.],
2024-10-21 06:05:36,520 INFO [train.py:561] (1/4) Epoch 1441, batch 10, global_batch_idx: 23050, batch size: 111, loss[dur_loss=0.2392, prior_loss=0.9844, diff_loss=0.2912, tot_loss=1.515, over 111.00 samples.], tot_loss[dur_loss=0.2341, prior_loss=0.9828, diff_loss=0.3944, tot_loss=1.611, over 1656.00 samples.],
2024-10-21 06:05:43,590 INFO [train.py:682] (1/4) Start epoch 1442
2024-10-21 06:05:57,609 INFO [train.py:561] (1/4) Epoch 1442, batch 4, global_batch_idx: 23060, batch size: 189, loss[dur_loss=0.2352, prior_loss=0.9833, diff_loss=0.3523, tot_loss=1.571, over 189.00 samples.], tot_loss[dur_loss=0.2311, prior_loss=0.9822, diff_loss=0.4405, tot_loss=1.654, over 937.00 samples.],
2024-10-21 06:06:12,393 INFO [train.py:561] (1/4) Epoch 1442, batch 14, global_batch_idx: 23070, batch size: 142, loss[dur_loss=0.2346, prior_loss=0.9827, diff_loss=0.3197, tot_loss=1.537, over 142.00 samples.], tot_loss[dur_loss=0.2351, prior_loss=0.9829, diff_loss=0.3806, tot_loss=1.599, over 2210.00 samples.],
2024-10-21 06:06:13,797 INFO [train.py:682] (1/4) Start epoch 1443
2024-10-21 06:06:33,677 INFO [train.py:561] (1/4) Epoch 1443, batch 8, global_batch_idx: 23080, batch size: 170, loss[dur_loss=0.2388, prior_loss=0.9833, diff_loss=0.3426, tot_loss=1.565, over 170.00 samples.], tot_loss[dur_loss=0.2328, prior_loss=0.9826, diff_loss=0.4144, tot_loss=1.63, over 1432.00 samples.],
2024-10-21 06:06:43,663 INFO [train.py:682] (1/4) Start epoch 1444
2024-10-21 06:06:55,146 INFO [train.py:561] (1/4) Epoch 1444, batch 2, global_batch_idx: 23090, batch size: 203, loss[dur_loss=0.2341, prior_loss=0.9832, diff_loss=0.3527, tot_loss=1.57, over 203.00 samples.], tot_loss[dur_loss=0.2349, prior_loss=0.9833, diff_loss=0.3396, tot_loss=1.558, over 442.00 samples.],
2024-10-21 06:07:09,363 INFO [train.py:561] (1/4) Epoch 1444, batch 12, global_batch_idx: 23100, batch size: 152, loss[dur_loss=0.2341, prior_loss=0.9828, diff_loss=0.3028, tot_loss=1.52, over 152.00 samples.], tot_loss[dur_loss=0.2335, prior_loss=0.9828, diff_loss=0.3883, tot_loss=1.605, over 1966.00 samples.],
2024-10-21 06:07:13,801 INFO [train.py:682] (1/4) Start epoch 1445
2024-10-21 06:07:30,616 INFO [train.py:561] (1/4) Epoch 1445, batch 6, global_batch_idx: 23110, batch size: 106, loss[dur_loss=0.2352, prior_loss=0.9831, diff_loss=0.297, tot_loss=1.515, over 106.00 samples.], tot_loss[dur_loss=0.231, prior_loss=0.9823, diff_loss=0.4177, tot_loss=1.631, over 1142.00 samples.],
2024-10-21 06:07:43,531 INFO [train.py:682] (1/4) Start epoch 1446
2024-10-21 06:07:52,530 INFO [train.py:561] (1/4) Epoch 1446, batch 0, global_batch_idx: 23120, batch size: 108, loss[dur_loss=0.2437, prior_loss=0.9838, diff_loss=0.3232, tot_loss=1.551, over 108.00 samples.], tot_loss[dur_loss=0.2437, prior_loss=0.9838, diff_loss=0.3232, tot_loss=1.551, over 108.00 samples.],
2024-10-21 06:08:06,652 INFO [train.py:561] (1/4) Epoch 1446, batch 10, global_batch_idx: 23130, batch size: 111, loss[dur_loss=0.2408, prior_loss=0.9842, diff_loss=0.2948, tot_loss=1.52, over 111.00 samples.], tot_loss[dur_loss=0.2333, prior_loss=0.9827, diff_loss=0.3919, tot_loss=1.608, over 1656.00 samples.],
2024-10-21 06:08:13,623 INFO [train.py:682] (1/4) Start epoch 1447
2024-10-21 06:08:27,252 INFO [train.py:561] (1/4) Epoch 1447, batch 4, global_batch_idx: 23140, batch size: 189, loss[dur_loss=0.2335, prior_loss=0.9829, diff_loss=0.3722, tot_loss=1.589, over 189.00 samples.], tot_loss[dur_loss=0.2305, prior_loss=0.982, diff_loss=0.4503, tot_loss=1.663, over 937.00 samples.],
2024-10-21 06:08:42,023 INFO [train.py:561] (1/4) Epoch 1447, batch 14, global_batch_idx: 23150, batch size: 142, loss[dur_loss=0.2369, prior_loss=0.9829, diff_loss=0.3244, tot_loss=1.544, over 142.00 samples.], tot_loss[dur_loss=0.2346, prior_loss=0.9828, diff_loss=0.3811, tot_loss=1.599, over 2210.00 samples.],
2024-10-21 06:08:43,424 INFO [train.py:682] (1/4) Start epoch 1448
2024-10-21 06:09:03,157 INFO [train.py:561] (1/4) Epoch 1448, batch 8, global_batch_idx: 23160, batch size: 170, loss[dur_loss=0.2406, prior_loss=0.9833, diff_loss=0.3502, tot_loss=1.574, over 170.00 samples.], tot_loss[dur_loss=0.2328, prior_loss=0.9825, diff_loss=0.4094, tot_loss=1.625, over 1432.00 samples.],
2024-10-21 06:09:13,200 INFO [train.py:682] (1/4) Start epoch 1449
2024-10-21 06:09:24,652 INFO [train.py:561] (1/4) Epoch 1449, batch 2, global_batch_idx: 23170, batch size: 203, loss[dur_loss=0.2342, prior_loss=0.9828, diff_loss=0.3281, tot_loss=1.545, over 203.00 samples.], tot_loss[dur_loss=0.2352, prior_loss=0.9829, diff_loss=0.3116, tot_loss=1.53, over 442.00 samples.],
2024-10-21 06:09:38,822 INFO [train.py:561] (1/4) Epoch 1449, batch 12, global_batch_idx: 23180, batch size: 152, loss[dur_loss=0.236, prior_loss=0.9829, diff_loss=0.3456, tot_loss=1.565, over 152.00 samples.], tot_loss[dur_loss=0.2333, prior_loss=0.9826, diff_loss=0.3708, tot_loss=1.587, over 1966.00 samples.],
2024-10-21 06:09:43,238 INFO [train.py:682] (1/4) Start epoch 1450
2024-10-21 06:10:00,147 INFO [train.py:561] (1/4) Epoch 1450, batch 6, global_batch_idx: 23190, batch size: 106, loss[dur_loss=0.2391, prior_loss=0.9831, diff_loss=0.3052, tot_loss=1.527, over 106.00 samples.], tot_loss[dur_loss=0.2316, prior_loss=0.9822, diff_loss=0.4205, tot_loss=1.634, over 1142.00 samples.],
2024-10-21 06:10:13,087 INFO [train.py:682] (1/4) Start epoch 1451
2024-10-21 06:10:21,927 INFO [train.py:561] (1/4) Epoch 1451, batch 0, global_batch_idx: 23200, batch size: 108, loss[dur_loss=0.2411, prior_loss=0.9836, diff_loss=0.3355, tot_loss=1.56, over 108.00 samples.], tot_loss[dur_loss=0.2411, prior_loss=0.9836, diff_loss=0.3355, tot_loss=1.56, over 108.00 samples.],
2024-10-21 06:10:36,079 INFO [train.py:561] (1/4) Epoch 1451, batch 10, global_batch_idx: 23210, batch size: 111, loss[dur_loss=0.2408, prior_loss=0.9843, diff_loss=0.321, tot_loss=1.546, over 111.00 samples.], tot_loss[dur_loss=0.2336, prior_loss=0.9827, diff_loss=0.3888, tot_loss=1.605, over 1656.00 samples.],
2024-10-21 06:10:43,101 INFO [train.py:682] (1/4) Start epoch 1452
2024-10-21 06:10:57,013 INFO [train.py:561] (1/4) Epoch 1452, batch 4, global_batch_idx: 23220, batch size: 189, loss[dur_loss=0.2304, prior_loss=0.9834, diff_loss=0.3192, tot_loss=1.533, over 189.00 samples.], tot_loss[dur_loss=0.2289, prior_loss=0.9821, diff_loss=0.4343, tot_loss=1.645, over 937.00 samples.],
2024-10-21 06:11:11,736 INFO [train.py:561] (1/4) Epoch 1452, batch 14, global_batch_idx: 23230, batch size: 142, loss[dur_loss=0.237, prior_loss=0.9826, diff_loss=0.3413, tot_loss=1.561, over 142.00 samples.], tot_loss[dur_loss=0.2336, prior_loss=0.9827, diff_loss=0.3765, tot_loss=1.593, over 2210.00 samples.],
2024-10-21 06:11:13,142 INFO [train.py:682] (1/4) Start epoch 1453
2024-10-21 06:11:32,923 INFO [train.py:561] (1/4) Epoch 1453, batch 8, global_batch_idx: 23240, batch size: 170, loss[dur_loss=0.2394, prior_loss=0.9832, diff_loss=0.3334, tot_loss=1.556, over 170.00 samples.], tot_loss[dur_loss=0.2342, prior_loss=0.9825, diff_loss=0.4052, tot_loss=1.622, over 1432.00 samples.],
2024-10-21 06:11:42,943 INFO [train.py:682] (1/4) Start epoch 1454
2024-10-21 06:11:54,467 INFO [train.py:561] (1/4) Epoch 1454, batch 2, global_batch_idx: 23250, batch size: 203, loss[dur_loss=0.237, prior_loss=0.9827, diff_loss=0.3708, tot_loss=1.591, over 203.00 samples.], tot_loss[dur_loss=0.2354, prior_loss=0.9829, diff_loss=0.3471, tot_loss=1.565, over 442.00 samples.],
2024-10-21 06:12:08,661 INFO [train.py:561] (1/4) Epoch 1454, batch 12, global_batch_idx: 23260, batch size: 152, loss[dur_loss=0.2321, prior_loss=0.9829, diff_loss=0.3148, tot_loss=1.53, over 152.00 samples.], tot_loss[dur_loss=0.2334, prior_loss=0.9826, diff_loss=0.3848, tot_loss=1.601, over 1966.00 samples.],
2024-10-21 06:12:13,137 INFO [train.py:682] (1/4) Start epoch 1455
2024-10-21 06:12:30,001 INFO [train.py:561] (1/4) Epoch 1455, batch 6, global_batch_idx: 23270, batch size: 106, loss[dur_loss=0.2344, prior_loss=0.9829, diff_loss=0.3409, tot_loss=1.558, over 106.00 samples.], tot_loss[dur_loss=0.2308, prior_loss=0.9822, diff_loss=0.4292, tot_loss=1.642, over 1142.00 samples.],
2024-10-21 06:12:42,959 INFO [train.py:682] (1/4) Start epoch 1456
2024-10-21 06:12:51,747 INFO [train.py:561] (1/4) Epoch 1456, batch 0, global_batch_idx: 23280, batch size: 108, loss[dur_loss=0.2405, prior_loss=0.9837, diff_loss=0.3203, tot_loss=1.545, over 108.00 samples.], tot_loss[dur_loss=0.2405, prior_loss=0.9837, diff_loss=0.3203, tot_loss=1.545, over 108.00 samples.],
2024-10-21 06:13:06,214 INFO [train.py:561] (1/4) Epoch 1456, batch 10, global_batch_idx: 23290, batch size: 111, loss[dur_loss=0.2402, prior_loss=0.9842, diff_loss=0.3347, tot_loss=1.559, over 111.00 samples.], tot_loss[dur_loss=0.2336, prior_loss=0.9827, diff_loss=0.3927, tot_loss=1.609, over 1656.00 samples.],
2024-10-21 06:13:13,380 INFO [train.py:682] (1/4) Start epoch 1457
2024-10-21 06:13:27,002 INFO [train.py:561] (1/4) Epoch 1457, batch 4, global_batch_idx: 23300, batch size: 189, loss[dur_loss=0.2332, prior_loss=0.9831, diff_loss=0.3584, tot_loss=1.575, over 189.00 samples.], tot_loss[dur_loss=0.2306, prior_loss=0.982, diff_loss=0.447, tot_loss=1.66, over 937.00 samples.],
2024-10-21 06:13:41,906 INFO [train.py:561] (1/4) Epoch 1457, batch 14, global_batch_idx: 23310, batch size: 142, loss[dur_loss=0.2351, prior_loss=0.9827, diff_loss=0.3228, tot_loss=1.541, over 142.00 samples.], tot_loss[dur_loss=0.2342, prior_loss=0.9827, diff_loss=0.376, tot_loss=1.593, over 2210.00 samples.],
2024-10-21 06:13:43,338 INFO [train.py:682] (1/4) Start epoch 1458
2024-10-21 06:14:03,533 INFO [train.py:561] (1/4) Epoch 1458, batch 8, global_batch_idx: 23320, batch size: 170, loss[dur_loss=0.2412, prior_loss=0.9831, diff_loss=0.3442, tot_loss=1.569, over 170.00 samples.], tot_loss[dur_loss=0.2329, prior_loss=0.9824, diff_loss=0.3992, tot_loss=1.615, over 1432.00 samples.],
2024-10-21 06:14:13,737 INFO [train.py:682] (1/4) Start epoch 1459
2024-10-21 06:14:25,205 INFO [train.py:561] (1/4) Epoch 1459, batch 2, global_batch_idx: 23330, batch size: 203, loss[dur_loss=0.2361, prior_loss=0.9828, diff_loss=0.3376, tot_loss=1.556, over 203.00 samples.], tot_loss[dur_loss=0.2364, prior_loss=0.9829, diff_loss=0.3331, tot_loss=1.552, over 442.00 samples.],
2024-10-21 06:14:39,535 INFO [train.py:561] (1/4) Epoch 1459, batch 12, global_batch_idx: 23340, batch size: 152, loss[dur_loss=0.2365, prior_loss=0.983, diff_loss=0.347, tot_loss=1.567, over 152.00 samples.], tot_loss[dur_loss=0.2347, prior_loss=0.9827, diff_loss=0.3759, tot_loss=1.593, over 1966.00 samples.],
2024-10-21 06:14:44,009 INFO [train.py:682] (1/4) Start epoch 1460
2024-10-21 06:15:01,233 INFO [train.py:561] (1/4) Epoch 1460, batch 6, global_batch_idx: 23350, batch size: 106, loss[dur_loss=0.2379, prior_loss=0.9832, diff_loss=0.3104, tot_loss=1.532, over 106.00 samples.], tot_loss[dur_loss=0.2305, prior_loss=0.9823, diff_loss=0.4158, tot_loss=1.629, over 1142.00 samples.],
2024-10-21 06:15:14,350 INFO [train.py:682] (1/4) Start epoch 1461
2024-10-21 06:15:23,112 INFO [train.py:561] (1/4) Epoch 1461, batch 0, global_batch_idx: 23360, batch size: 108, loss[dur_loss=0.2412, prior_loss=0.984, diff_loss=0.3196, tot_loss=1.545, over 108.00 samples.], tot_loss[dur_loss=0.2412, prior_loss=0.984, diff_loss=0.3196, tot_loss=1.545, over 108.00 samples.],
2024-10-21 06:15:37,455 INFO [train.py:561] (1/4) Epoch 1461, batch 10, global_batch_idx: 23370, batch size: 111, loss[dur_loss=0.2368, prior_loss=0.984, diff_loss=0.3189, tot_loss=1.54, over 111.00 samples.], tot_loss[dur_loss=0.2335, prior_loss=0.9827, diff_loss=0.3885, tot_loss=1.605, over 1656.00 samples.],
2024-10-21 06:15:44,604 INFO [train.py:682] (1/4) Start epoch 1462
2024-10-21 06:15:58,333 INFO [train.py:561] (1/4) Epoch 1462, batch 4, global_batch_idx: 23380, batch size: 189, loss[dur_loss=0.2332, prior_loss=0.983, diff_loss=0.3437, tot_loss=1.56, over 189.00 samples.], tot_loss[dur_loss=0.2295, prior_loss=0.9819, diff_loss=0.4361, tot_loss=1.648, over 937.00 samples.],
2024-10-21 06:16:13,290 INFO [train.py:561] (1/4) Epoch 1462, batch 14, global_batch_idx: 23390, batch size: 142, loss[dur_loss=0.2311, prior_loss=0.9824, diff_loss=0.3036, tot_loss=1.517, over 142.00 samples.], tot_loss[dur_loss=0.233, prior_loss=0.9826, diff_loss=0.3704, tot_loss=1.586, over 2210.00 samples.],
2024-10-21 06:16:14,733 INFO [train.py:682] (1/4) Start epoch 1463
2024-10-21 06:16:34,712 INFO [train.py:561] (1/4) Epoch 1463, batch 8, global_batch_idx: 23400, batch size: 170, loss[dur_loss=0.2424, prior_loss=0.9832, diff_loss=0.345, tot_loss=1.571, over 170.00 samples.], tot_loss[dur_loss=0.2331, prior_loss=0.9826, diff_loss=0.4059, tot_loss=1.622, over 1432.00 samples.],
2024-10-21 06:16:44,875 INFO [train.py:682] (1/4) Start epoch 1464
2024-10-21 06:16:56,303 INFO [train.py:561] (1/4) Epoch 1464, batch 2, global_batch_idx: 23410, batch size: 203, loss[dur_loss=0.2355, prior_loss=0.9827, diff_loss=0.381, tot_loss=1.599, over 203.00 samples.], tot_loss[dur_loss=0.2364, prior_loss=0.9831, diff_loss=0.3426, tot_loss=1.562, over 442.00 samples.],
2024-10-21 06:17:10,536 INFO [train.py:561] (1/4) Epoch 1464, batch 12, global_batch_idx: 23420, batch size: 152, loss[dur_loss=0.2385, prior_loss=0.9834, diff_loss=0.3436, tot_loss=1.565, over 152.00 samples.], tot_loss[dur_loss=0.2342, prior_loss=0.9828, diff_loss=0.379, tot_loss=1.596, over 1966.00 samples.],
2024-10-21 06:17:15,026 INFO [train.py:682] (1/4) Start epoch 1465
2024-10-21 06:17:31,910 INFO [train.py:561] (1/4) Epoch 1465, batch 6, global_batch_idx: 23430, batch size: 106, loss[dur_loss=0.2419, prior_loss=0.9835, diff_loss=0.3082, tot_loss=1.534, over 106.00 samples.], tot_loss[dur_loss=0.2328, prior_loss=0.9825, diff_loss=0.4203, tot_loss=1.636, over 1142.00 samples.],
2024-10-21 06:17:44,977 INFO [train.py:682] (1/4) Start epoch 1466
2024-10-21 06:17:53,944 INFO [train.py:561] (1/4) Epoch 1466, batch 0, global_batch_idx: 23440, batch size: 108, loss[dur_loss=0.2405, prior_loss=0.984, diff_loss=0.322, tot_loss=1.547, over 108.00 samples.], tot_loss[dur_loss=0.2405, prior_loss=0.984, diff_loss=0.322, tot_loss=1.547, over 108.00 samples.],
2024-10-21 06:18:08,215 INFO [train.py:561] (1/4) Epoch 1466, batch 10, global_batch_idx: 23450, batch size: 111, loss[dur_loss=0.2372, prior_loss=0.9837, diff_loss=0.2968, tot_loss=1.518, over 111.00 samples.], tot_loss[dur_loss=0.2333, prior_loss=0.9825, diff_loss=0.3868, tot_loss=1.603, over 1656.00 samples.],
2024-10-21 06:18:15,302 INFO [train.py:682] (1/4) Start epoch 1467
2024-10-21 06:18:28,999 INFO [train.py:561] (1/4) Epoch 1467, batch 4, global_batch_idx: 23460, batch size: 189, loss[dur_loss=0.2309, prior_loss=0.9831, diff_loss=0.3407, tot_loss=1.555, over 189.00 samples.], tot_loss[dur_loss=0.2284, prior_loss=0.982, diff_loss=0.4483, tot_loss=1.659, over 937.00 samples.],
2024-10-21 06:18:43,947 INFO [train.py:561] (1/4) Epoch 1467, batch 14, global_batch_idx: 23470, batch size: 142, loss[dur_loss=0.2371, prior_loss=0.9827, diff_loss=0.349, tot_loss=1.569, over 142.00 samples.], tot_loss[dur_loss=0.2326, prior_loss=0.9826, diff_loss=0.3815, tot_loss=1.597, over 2210.00 samples.],
2024-10-21 06:18:45,386 INFO [train.py:682] (1/4) Start epoch 1468
2024-10-21 06:19:05,220 INFO [train.py:561] (1/4) Epoch 1468, batch 8, global_batch_idx: 23480, batch size: 170, loss[dur_loss=0.2398, prior_loss=0.983, diff_loss=0.337, tot_loss=1.56, over 170.00 samples.], tot_loss[dur_loss=0.2325, prior_loss=0.9824, diff_loss=0.4002, tot_loss=1.615, over 1432.00 samples.],
2024-10-21 06:19:15,457 INFO [train.py:682] (1/4) Start epoch 1469
2024-10-21 06:19:27,262 INFO [train.py:561] (1/4) Epoch 1469, batch 2, global_batch_idx: 23490, batch size: 203, loss[dur_loss=0.2342, prior_loss=0.9828, diff_loss=0.3467, tot_loss=1.564, over 203.00 samples.], tot_loss[dur_loss=0.2359, prior_loss=0.983, diff_loss=0.3266, tot_loss=1.545, over 442.00 samples.],
2024-10-21 06:19:41,773 INFO [train.py:561] (1/4) Epoch 1469, batch 12, global_batch_idx: 23500, batch size: 152, loss[dur_loss=0.236, prior_loss=0.9831, diff_loss=0.3264, tot_loss=1.545, over 152.00 samples.], tot_loss[dur_loss=0.2341, prior_loss=0.9826, diff_loss=0.3798, tot_loss=1.596, over 1966.00 samples.],
2024-10-21 06:19:46,301 INFO [train.py:682] (1/4) Start epoch 1470
2024-10-21 06:20:03,356 INFO [train.py:561] (1/4) Epoch 1470, batch 6, global_batch_idx: 23510, batch size: 106, loss[dur_loss=0.2416, prior_loss=0.9831, diff_loss=0.3475, tot_loss=1.572, over 106.00 samples.], tot_loss[dur_loss=0.2308, prior_loss=0.9822, diff_loss=0.4257, tot_loss=1.639, over 1142.00 samples.],
2024-10-21 06:20:16,509 INFO [train.py:682] (1/4) Start epoch 1471
2024-10-21 06:20:25,209 INFO [train.py:561] (1/4) Epoch 1471, batch 0, global_batch_idx: 23520, batch size: 108, loss[dur_loss=0.2379, prior_loss=0.9837, diff_loss=0.3182, tot_loss=1.54, over 108.00 samples.], tot_loss[dur_loss=0.2379, prior_loss=0.9837, diff_loss=0.3182, tot_loss=1.54, over 108.00 samples.],
2024-10-21 06:20:39,497 INFO [train.py:561] (1/4) Epoch 1471, batch 10, global_batch_idx: 23530, batch size: 111, loss[dur_loss=0.2359, prior_loss=0.984, diff_loss=0.3655, tot_loss=1.585, over 111.00 samples.], tot_loss[dur_loss=0.2321, prior_loss=0.9824, diff_loss=0.3932, tot_loss=1.608, over 1656.00 samples.],
2024-10-21 06:20:46,635 INFO [train.py:682] (1/4) Start epoch 1472
2024-10-21 06:21:00,477 INFO [train.py:561] (1/4) Epoch 1472, batch 4, global_batch_idx: 23540, batch size: 189, loss[dur_loss=0.2332, prior_loss=0.9829, diff_loss=0.3385, tot_loss=1.555, over 189.00 samples.], tot_loss[dur_loss=0.2299, prior_loss=0.9818, diff_loss=0.4316, tot_loss=1.643, over 937.00 samples.],
2024-10-21 06:21:15,420 INFO [train.py:561] (1/4) Epoch 1472, batch 14, global_batch_idx: 23550, batch size: 142, loss[dur_loss=0.2349, prior_loss=0.9825, diff_loss=0.3086, tot_loss=1.526, over 142.00 samples.], tot_loss[dur_loss=0.2334, prior_loss=0.9825, diff_loss=0.3672, tot_loss=1.583, over 2210.00 samples.],
2024-10-21 06:21:16,848 INFO [train.py:682] (1/4) Start epoch 1473
2024-10-21 06:21:36,655 INFO [train.py:561] (1/4) Epoch 1473, batch 8, global_batch_idx: 23560, batch size: 170, loss[dur_loss=0.242, prior_loss=0.983, diff_loss=0.3512, tot_loss=1.576, over 170.00 samples.], tot_loss[dur_loss=0.2318, prior_loss=0.9823, diff_loss=0.3994, tot_loss=1.613, over 1432.00 samples.],
2024-10-21 06:21:46,676 INFO [train.py:682] (1/4) Start epoch 1474
2024-10-21 06:21:57,968 INFO [train.py:561] (1/4) Epoch 1474, batch 2, global_batch_idx: 23570, batch size: 203, loss[dur_loss=0.2362, prior_loss=0.9828, diff_loss=0.3893, tot_loss=1.608, over 203.00 samples.], tot_loss[dur_loss=0.2372, prior_loss=0.983, diff_loss=0.3535, tot_loss=1.574, over 442.00 samples.],
2024-10-21 06:22:12,237 INFO [train.py:561] (1/4) Epoch 1474, batch 12, global_batch_idx: 23580, batch size: 152, loss[dur_loss=0.2387, prior_loss=0.9828, diff_loss=0.327, tot_loss=1.548, over 152.00 samples.], tot_loss[dur_loss=0.2329, prior_loss=0.9825, diff_loss=0.3761, tot_loss=1.591, over 1966.00 samples.],
2024-10-21 06:22:16,753 INFO [train.py:682] (1/4) Start epoch 1475
2024-10-21 06:22:34,980 INFO [train.py:561] (1/4) Epoch 1475, batch 6, global_batch_idx: 23590, batch size: 106, loss[dur_loss=0.241, prior_loss=0.983, diff_loss=0.3061, tot_loss=1.53, over 106.00 samples.], tot_loss[dur_loss=0.2305, prior_loss=0.9821, diff_loss=0.4091, tot_loss=1.622, over 1142.00 samples.],
2024-10-21 06:22:48,117 INFO [train.py:682] (1/4) Start epoch 1476
2024-10-21 06:22:56,904 INFO [train.py:561] (1/4) Epoch 1476, batch 0, global_batch_idx: 23600, batch size: 108, loss[dur_loss=0.2373, prior_loss=0.9836, diff_loss=0.3401, tot_loss=1.561, over 108.00 samples.], tot_loss[dur_loss=0.2373, prior_loss=0.9836, diff_loss=0.3401, tot_loss=1.561, over 108.00 samples.],
2024-10-21 06:23:11,226 INFO [train.py:561] (1/4) Epoch 1476, batch 10, global_batch_idx: 23610, batch size: 111, loss[dur_loss=0.2392, prior_loss=0.9837, diff_loss=0.3587, tot_loss=1.582, over 111.00 samples.], tot_loss[dur_loss=0.232, prior_loss=0.9824, diff_loss=0.4015, tot_loss=1.616, over 1656.00 samples.],
2024-10-21 06:23:18,356 INFO [train.py:682] (1/4) Start epoch 1477
2024-10-21 06:23:32,021 INFO [train.py:561] (1/4) Epoch 1477, batch 4, global_batch_idx: 23620, batch size: 189, loss[dur_loss=0.2378, prior_loss=0.9829, diff_loss=0.3348, tot_loss=1.555, over 189.00 samples.], tot_loss[dur_loss=0.2309, prior_loss=0.9818, diff_loss=0.4332, tot_loss=1.646, over 937.00 samples.],
2024-10-21 06:23:46,944 INFO [train.py:561] (1/4) Epoch 1477, batch 14, global_batch_idx: 23630, batch size: 142, loss[dur_loss=0.2333, prior_loss=0.9821, diff_loss=0.2915, tot_loss=1.507, over 142.00 samples.], tot_loss[dur_loss=0.2334, prior_loss=0.9824, diff_loss=0.3688, tot_loss=1.585, over 2210.00 samples.],
2024-10-21 06:23:48,383 INFO [train.py:682] (1/4) Start epoch 1478
2024-10-21 06:24:08,452 INFO [train.py:561] (1/4) Epoch 1478, batch 8, global_batch_idx: 23640, batch size: 170, loss[dur_loss=0.2357, prior_loss=0.9827, diff_loss=0.3419, tot_loss=1.56, over 170.00 samples.], tot_loss[dur_loss=0.2311, prior_loss=0.9821, diff_loss=0.3989, tot_loss=1.612, over 1432.00 samples.],
2024-10-21 06:24:18,821 INFO [train.py:682] (1/4) Start epoch 1479
2024-10-21 06:24:30,541 INFO [train.py:561] (1/4) Epoch 1479, batch 2, global_batch_idx: 23650, batch size: 203, loss[dur_loss=0.2325, prior_loss=0.9824, diff_loss=0.3286, tot_loss=1.543, over 203.00 samples.], tot_loss[dur_loss=0.2337, prior_loss=0.9827, diff_loss=0.3233, tot_loss=1.54, over 442.00 samples.],
2024-10-21 06:24:44,822 INFO [train.py:561] (1/4) Epoch 1479, batch 12, global_batch_idx: 23660, batch size: 152, loss[dur_loss=0.2348, prior_loss=0.9826, diff_loss=0.3686, tot_loss=1.586, over 152.00 samples.], tot_loss[dur_loss=0.2315, prior_loss=0.9823, diff_loss=0.3843, tot_loss=1.598, over 1966.00 samples.],
2024-10-21 06:24:49,320 INFO [train.py:682] (1/4) Start epoch 1480
2024-10-21 06:25:06,184 INFO [train.py:561] (1/4) Epoch 1480, batch 6, global_batch_idx: 23670, batch size: 106, loss[dur_loss=0.2331, prior_loss=0.9829, diff_loss=0.2949, tot_loss=1.511, over 106.00 samples.], tot_loss[dur_loss=0.2303, prior_loss=0.982, diff_loss=0.4113, tot_loss=1.624, over 1142.00 samples.],
2024-10-21 06:25:19,296 INFO [train.py:682] (1/4) Start epoch 1481
2024-10-21 06:25:28,073 INFO [train.py:561] (1/4) Epoch 1481, batch 0, global_batch_idx: 23680, batch size: 108, loss[dur_loss=0.2394, prior_loss=0.9836, diff_loss=0.294, tot_loss=1.517, over 108.00 samples.], tot_loss[dur_loss=0.2394, prior_loss=0.9836, diff_loss=0.294, tot_loss=1.517, over 108.00 samples.],
2024-10-21 06:25:42,354 INFO [train.py:561] (1/4) Epoch 1481, batch 10, global_batch_idx: 23690, batch size: 111, loss[dur_loss=0.2369, prior_loss=0.9838, diff_loss=0.359, tot_loss=1.58, over 111.00 samples.], tot_loss[dur_loss=0.2318, prior_loss=0.9823, diff_loss=0.3886, tot_loss=1.603, over 1656.00 samples.],
2024-10-21 06:25:49,530 INFO [train.py:682] (1/4) Start epoch 1482
2024-10-21 06:26:03,159 INFO [train.py:561] (1/4) Epoch 1482, batch 4, global_batch_idx: 23700, batch size: 189, loss[dur_loss=0.2317, prior_loss=0.9831, diff_loss=0.3656, tot_loss=1.58, over 189.00 samples.], tot_loss[dur_loss=0.2282, prior_loss=0.9818, diff_loss=0.4382, tot_loss=1.648, over 937.00 samples.],
2024-10-21 06:26:18,360 INFO [train.py:561] (1/4) Epoch 1482, batch 14, global_batch_idx: 23710, batch size: 142, loss[dur_loss=0.2337, prior_loss=0.9823, diff_loss=0.3229, tot_loss=1.539, over 142.00 samples.], tot_loss[dur_loss=0.2324, prior_loss=0.9825, diff_loss=0.372, tot_loss=1.587, over 2210.00 samples.],
2024-10-21 06:26:19,814 INFO [train.py:682] (1/4) Start epoch 1483
2024-10-21 06:26:40,072 INFO [train.py:561] (1/4) Epoch 1483, batch 8, global_batch_idx: 23720, batch size: 170, loss[dur_loss=0.238, prior_loss=0.983, diff_loss=0.3894, tot_loss=1.61, over 170.00 samples.], tot_loss[dur_loss=0.2311, prior_loss=0.9822, diff_loss=0.4001, tot_loss=1.613, over 1432.00 samples.],
2024-10-21 06:26:50,355 INFO [train.py:682] (1/4) Start epoch 1484
2024-10-21 06:27:01,969 INFO [train.py:561] (1/4) Epoch 1484, batch 2, global_batch_idx: 23730, batch size: 203, loss[dur_loss=0.2364, prior_loss=0.9826, diff_loss=0.3489, tot_loss=1.568, over 203.00 samples.], tot_loss[dur_loss=0.2351, prior_loss=0.9827, diff_loss=0.3223, tot_loss=1.54, over 442.00 samples.],
2024-10-21 06:27:16,245 INFO [train.py:561] (1/4) Epoch 1484, batch 12, global_batch_idx: 23740, batch size: 152, loss[dur_loss=0.237, prior_loss=0.9826, diff_loss=0.3448, tot_loss=1.564, over 152.00 samples.], tot_loss[dur_loss=0.2332, prior_loss=0.9823, diff_loss=0.3778, tot_loss=1.593, over 1966.00 samples.],
2024-10-21 06:27:20,775 INFO [train.py:682] (1/4) Start epoch 1485
2024-10-21 06:27:37,911 INFO [train.py:561] (1/4) Epoch 1485, batch 6, global_batch_idx: 23750, batch size: 106, loss[dur_loss=0.2383, prior_loss=0.9831, diff_loss=0.2872, tot_loss=1.509, over 106.00 samples.], tot_loss[dur_loss=0.2297, prior_loss=0.9819, diff_loss=0.4181, tot_loss=1.63, over 1142.00 samples.],
2024-10-21 06:27:51,132 INFO [train.py:682] (1/4) Start epoch 1486
2024-10-21 06:27:59,692 INFO [train.py:561] (1/4) Epoch 1486, batch 0, global_batch_idx: 23760, batch size: 108, loss[dur_loss=0.2388, prior_loss=0.9834, diff_loss=0.3222, tot_loss=1.544, over 108.00 samples.], tot_loss[dur_loss=0.2388, prior_loss=0.9834, diff_loss=0.3222, tot_loss=1.544, over 108.00 samples.],
2024-10-21 06:28:14,000 INFO [train.py:561] (1/4) Epoch 1486, batch 10, global_batch_idx: 23770, batch size: 111, loss[dur_loss=0.2386, prior_loss=0.9841, diff_loss=0.3093, tot_loss=1.532, over 111.00 samples.], tot_loss[dur_loss=0.2321, prior_loss=0.9823, diff_loss=0.3971, tot_loss=1.611, over 1656.00 samples.],
2024-10-21 06:28:21,134 INFO [train.py:682] (1/4) Start epoch 1487
2024-10-21 06:28:35,184 INFO [train.py:561] (1/4) Epoch 1487, batch 4, global_batch_idx: 23780, batch size: 189, loss[dur_loss=0.2335, prior_loss=0.983, diff_loss=0.3388, tot_loss=1.555, over 189.00 samples.], tot_loss[dur_loss=0.229, prior_loss=0.9817, diff_loss=0.4487, tot_loss=1.659, over 937.00 samples.],
2024-10-21 06:28:50,171 INFO [train.py:561] (1/4) Epoch 1487, batch 14, global_batch_idx: 23790, batch size: 142, loss[dur_loss=0.2341, prior_loss=0.9824, diff_loss=0.3388, tot_loss=1.555, over 142.00 samples.], tot_loss[dur_loss=0.2327, prior_loss=0.9824, diff_loss=0.3842, tot_loss=1.599, over 2210.00 samples.],
2024-10-21 06:28:51,605 INFO [train.py:682] (1/4) Start epoch 1488
2024-10-21 06:29:11,665 INFO [train.py:561] (1/4) Epoch 1488, batch 8, global_batch_idx: 23800, batch size: 170, loss[dur_loss=0.2422, prior_loss=0.9831, diff_loss=0.3388, tot_loss=1.564, over 170.00 samples.],
tot_loss[dur_loss=0.2317, prior_loss=0.9822, diff_loss=0.4078, tot_loss=1.622, over 1432.00 samples.], 2024-10-21 06:29:21,992 INFO [train.py:682] (1/4) Start epoch 1489 2024-10-21 06:29:33,415 INFO [train.py:561] (1/4) Epoch 1489, batch 2, global_batch_idx: 23810, batch size: 203, loss[dur_loss=0.2365, prior_loss=0.9825, diff_loss=0.3555, tot_loss=1.574, over 203.00 samples.], tot_loss[dur_loss=0.2351, prior_loss=0.9827, diff_loss=0.3321, tot_loss=1.55, over 442.00 samples.], 2024-10-21 06:29:47,676 INFO [train.py:561] (1/4) Epoch 1489, batch 12, global_batch_idx: 23820, batch size: 152, loss[dur_loss=0.2304, prior_loss=0.9824, diff_loss=0.309, tot_loss=1.522, over 152.00 samples.], tot_loss[dur_loss=0.2333, prior_loss=0.9824, diff_loss=0.3843, tot_loss=1.6, over 1966.00 samples.], 2024-10-21 06:29:52,180 INFO [train.py:682] (1/4) Start epoch 1490 2024-10-21 06:30:09,055 INFO [train.py:561] (1/4) Epoch 1490, batch 6, global_batch_idx: 23830, batch size: 106, loss[dur_loss=0.2347, prior_loss=0.9827, diff_loss=0.3126, tot_loss=1.53, over 106.00 samples.], tot_loss[dur_loss=0.2292, prior_loss=0.9817, diff_loss=0.4184, tot_loss=1.629, over 1142.00 samples.], 2024-10-21 06:30:22,258 INFO [train.py:682] (1/4) Start epoch 1491 2024-10-21 06:30:31,401 INFO [train.py:561] (1/4) Epoch 1491, batch 0, global_batch_idx: 23840, batch size: 108, loss[dur_loss=0.2394, prior_loss=0.9831, diff_loss=0.3054, tot_loss=1.528, over 108.00 samples.], tot_loss[dur_loss=0.2394, prior_loss=0.9831, diff_loss=0.3054, tot_loss=1.528, over 108.00 samples.], 2024-10-21 06:30:45,884 INFO [train.py:561] (1/4) Epoch 1491, batch 10, global_batch_idx: 23850, batch size: 111, loss[dur_loss=0.2342, prior_loss=0.9837, diff_loss=0.3147, tot_loss=1.533, over 111.00 samples.], tot_loss[dur_loss=0.2319, prior_loss=0.9821, diff_loss=0.3823, tot_loss=1.596, over 1656.00 samples.], 2024-10-21 06:30:53,007 INFO [train.py:682] (1/4) Start epoch 1492 2024-10-21 06:31:06,918 INFO [train.py:561] (1/4) Epoch 1492, batch 4, global_batch_idx: 23860, batch size: 189, loss[dur_loss=0.2325, prior_loss=0.9826, diff_loss=0.3947, tot_loss=1.61, over 189.00 samples.], tot_loss[dur_loss=0.2281, prior_loss=0.9816, diff_loss=0.4528, tot_loss=1.662, over 937.00 samples.], 2024-10-21 06:31:22,133 INFO [train.py:561] (1/4) Epoch 1492, batch 14, global_batch_idx: 23870, batch size: 142, loss[dur_loss=0.231, prior_loss=0.9819, diff_loss=0.2815, tot_loss=1.494, over 142.00 samples.], tot_loss[dur_loss=0.2317, prior_loss=0.9822, diff_loss=0.3743, tot_loss=1.588, over 2210.00 samples.], 2024-10-21 06:31:23,579 INFO [train.py:682] (1/4) Start epoch 1493 2024-10-21 06:31:43,798 INFO [train.py:561] (1/4) Epoch 1493, batch 8, global_batch_idx: 23880, batch size: 170, loss[dur_loss=0.2359, prior_loss=0.9827, diff_loss=0.3392, tot_loss=1.558, over 170.00 samples.], tot_loss[dur_loss=0.2306, prior_loss=0.9819, diff_loss=0.4189, tot_loss=1.631, over 1432.00 samples.], 2024-10-21 06:31:54,023 INFO [train.py:682] (1/4) Start epoch 1494 2024-10-21 06:32:05,595 INFO [train.py:561] (1/4) Epoch 1494, batch 2, global_batch_idx: 23890, batch size: 203, loss[dur_loss=0.2333, prior_loss=0.9823, diff_loss=0.3675, tot_loss=1.583, over 203.00 samples.], tot_loss[dur_loss=0.2336, prior_loss=0.9824, diff_loss=0.3278, tot_loss=1.544, over 442.00 samples.], 2024-10-21 06:32:19,759 INFO [train.py:561] (1/4) Epoch 1494, batch 12, global_batch_idx: 23900, batch size: 152, loss[dur_loss=0.2329, prior_loss=0.9823, diff_loss=0.347, tot_loss=1.562, over 152.00 samples.], 
tot_loss[dur_loss=0.2319, prior_loss=0.9822, diff_loss=0.3817, tot_loss=1.596, over 1966.00 samples.], 2024-10-21 06:32:24,223 INFO [train.py:682] (1/4) Start epoch 1495 2024-10-21 06:32:41,453 INFO [train.py:561] (1/4) Epoch 1495, batch 6, global_batch_idx: 23910, batch size: 106, loss[dur_loss=0.2326, prior_loss=0.9827, diff_loss=0.2876, tot_loss=1.503, over 106.00 samples.], tot_loss[dur_loss=0.2287, prior_loss=0.9818, diff_loss=0.4113, tot_loss=1.622, over 1142.00 samples.], 2024-10-21 06:32:54,581 INFO [train.py:682] (1/4) Start epoch 1496 2024-10-21 06:33:03,397 INFO [train.py:561] (1/4) Epoch 1496, batch 0, global_batch_idx: 23920, batch size: 108, loss[dur_loss=0.2431, prior_loss=0.9833, diff_loss=0.3102, tot_loss=1.537, over 108.00 samples.], tot_loss[dur_loss=0.2431, prior_loss=0.9833, diff_loss=0.3102, tot_loss=1.537, over 108.00 samples.], 2024-10-21 06:33:17,607 INFO [train.py:561] (1/4) Epoch 1496, batch 10, global_batch_idx: 23930, batch size: 111, loss[dur_loss=0.2365, prior_loss=0.9836, diff_loss=0.2976, tot_loss=1.518, over 111.00 samples.], tot_loss[dur_loss=0.2326, prior_loss=0.9822, diff_loss=0.3904, tot_loss=1.605, over 1656.00 samples.], 2024-10-21 06:33:24,739 INFO [train.py:682] (1/4) Start epoch 1497 2024-10-21 06:33:38,480 INFO [train.py:561] (1/4) Epoch 1497, batch 4, global_batch_idx: 23940, batch size: 189, loss[dur_loss=0.2317, prior_loss=0.9825, diff_loss=0.3549, tot_loss=1.569, over 189.00 samples.], tot_loss[dur_loss=0.2279, prior_loss=0.9815, diff_loss=0.4436, tot_loss=1.653, over 937.00 samples.], 2024-10-21 06:33:53,397 INFO [train.py:561] (1/4) Epoch 1497, batch 14, global_batch_idx: 23950, batch size: 142, loss[dur_loss=0.2335, prior_loss=0.9821, diff_loss=0.3311, tot_loss=1.547, over 142.00 samples.], tot_loss[dur_loss=0.232, prior_loss=0.9822, diff_loss=0.3775, tot_loss=1.592, over 2210.00 samples.], 2024-10-21 06:33:54,841 INFO [train.py:682] (1/4) Start epoch 1498 2024-10-21 06:34:14,593 INFO [train.py:561] (1/4) Epoch 1498, batch 8, global_batch_idx: 23960, batch size: 170, loss[dur_loss=0.2393, prior_loss=0.9828, diff_loss=0.3477, tot_loss=1.57, over 170.00 samples.], tot_loss[dur_loss=0.2308, prior_loss=0.982, diff_loss=0.4032, tot_loss=1.616, over 1432.00 samples.], 2024-10-21 06:34:24,776 INFO [train.py:682] (1/4) Start epoch 1499 2024-10-21 06:34:36,545 INFO [train.py:561] (1/4) Epoch 1499, batch 2, global_batch_idx: 23970, batch size: 203, loss[dur_loss=0.2346, prior_loss=0.9823, diff_loss=0.365, tot_loss=1.582, over 203.00 samples.], tot_loss[dur_loss=0.236, prior_loss=0.9824, diff_loss=0.3382, tot_loss=1.557, over 442.00 samples.], 2024-10-21 06:34:50,792 INFO [train.py:561] (1/4) Epoch 1499, batch 12, global_batch_idx: 23980, batch size: 152, loss[dur_loss=0.2324, prior_loss=0.9825, diff_loss=0.3373, tot_loss=1.552, over 152.00 samples.], tot_loss[dur_loss=0.2316, prior_loss=0.9822, diff_loss=0.386, tot_loss=1.6, over 1966.00 samples.], 2024-10-21 06:34:55,266 INFO [train.py:682] (1/4) Start epoch 1500 2024-10-21 06:35:12,170 INFO [train.py:561] (1/4) Epoch 1500, batch 6, global_batch_idx: 23990, batch size: 106, loss[dur_loss=0.2337, prior_loss=0.9826, diff_loss=0.3517, tot_loss=1.568, over 106.00 samples.], tot_loss[dur_loss=0.2291, prior_loss=0.9816, diff_loss=0.4249, tot_loss=1.636, over 1142.00 samples.], 2024-10-21 06:35:25,127 INFO [train.py:682] (1/4) Start epoch 1501 2024-10-21 06:35:34,397 INFO [train.py:561] (1/4) Epoch 1501, batch 0, global_batch_idx: 24000, batch size: 108, loss[dur_loss=0.2381, prior_loss=0.983, 
diff_loss=0.3208, tot_loss=1.542, over 108.00 samples.], tot_loss[dur_loss=0.2381, prior_loss=0.983, diff_loss=0.3208, tot_loss=1.542, over 108.00 samples.], 2024-10-21 06:35:35,855 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 06:36:10,631 INFO [train.py:589] (1/4) Epoch 1501, validation: dur_loss=0.453, prior_loss=1.034, diff_loss=0.4248, tot_loss=1.911, over 100.00 samples. 2024-10-21 06:36:10,633 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 06:36:23,613 INFO [train.py:561] (1/4) Epoch 1501, batch 10, global_batch_idx: 24010, batch size: 111, loss[dur_loss=0.2364, prior_loss=0.9835, diff_loss=0.3035, tot_loss=1.523, over 111.00 samples.], tot_loss[dur_loss=0.2324, prior_loss=0.9822, diff_loss=0.3921, tot_loss=1.607, over 1656.00 samples.], 2024-10-21 06:36:30,779 INFO [train.py:682] (1/4) Start epoch 1502 2024-10-21 06:36:44,857 INFO [train.py:561] (1/4) Epoch 1502, batch 4, global_batch_idx: 24020, batch size: 189, loss[dur_loss=0.2353, prior_loss=0.9826, diff_loss=0.3554, tot_loss=1.573, over 189.00 samples.], tot_loss[dur_loss=0.2265, prior_loss=0.9816, diff_loss=0.4436, tot_loss=1.652, over 937.00 samples.], 2024-10-21 06:36:59,804 INFO [train.py:561] (1/4) Epoch 1502, batch 14, global_batch_idx: 24030, batch size: 142, loss[dur_loss=0.2335, prior_loss=0.9823, diff_loss=0.3225, tot_loss=1.538, over 142.00 samples.], tot_loss[dur_loss=0.2311, prior_loss=0.9823, diff_loss=0.3782, tot_loss=1.592, over 2210.00 samples.], 2024-10-21 06:37:01,252 INFO [train.py:682] (1/4) Start epoch 1503 2024-10-21 06:37:21,262 INFO [train.py:561] (1/4) Epoch 1503, batch 8, global_batch_idx: 24040, batch size: 170, loss[dur_loss=0.2408, prior_loss=0.9828, diff_loss=0.3476, tot_loss=1.571, over 170.00 samples.], tot_loss[dur_loss=0.2304, prior_loss=0.9818, diff_loss=0.4029, tot_loss=1.615, over 1432.00 samples.], 2024-10-21 06:37:31,432 INFO [train.py:682] (1/4) Start epoch 1504 2024-10-21 06:37:43,059 INFO [train.py:561] (1/4) Epoch 1504, batch 2, global_batch_idx: 24050, batch size: 203, loss[dur_loss=0.2303, prior_loss=0.9824, diff_loss=0.325, tot_loss=1.538, over 203.00 samples.], tot_loss[dur_loss=0.2339, prior_loss=0.9826, diff_loss=0.3292, tot_loss=1.546, over 442.00 samples.], 2024-10-21 06:37:57,357 INFO [train.py:561] (1/4) Epoch 1504, batch 12, global_batch_idx: 24060, batch size: 152, loss[dur_loss=0.2325, prior_loss=0.9824, diff_loss=0.3114, tot_loss=1.526, over 152.00 samples.], tot_loss[dur_loss=0.2313, prior_loss=0.9821, diff_loss=0.381, tot_loss=1.594, over 1966.00 samples.], 2024-10-21 06:38:01,809 INFO [train.py:682] (1/4) Start epoch 1505 2024-10-21 06:38:18,981 INFO [train.py:561] (1/4) Epoch 1505, batch 6, global_batch_idx: 24070, batch size: 106, loss[dur_loss=0.2391, prior_loss=0.9827, diff_loss=0.3699, tot_loss=1.592, over 106.00 samples.], tot_loss[dur_loss=0.2268, prior_loss=0.9816, diff_loss=0.4302, tot_loss=1.639, over 1142.00 samples.], 2024-10-21 06:38:32,040 INFO [train.py:682] (1/4) Start epoch 1506 2024-10-21 06:38:41,029 INFO [train.py:561] (1/4) Epoch 1506, batch 0, global_batch_idx: 24080, batch size: 108, loss[dur_loss=0.2368, prior_loss=0.9832, diff_loss=0.3271, tot_loss=1.547, over 108.00 samples.], tot_loss[dur_loss=0.2368, prior_loss=0.9832, diff_loss=0.3271, tot_loss=1.547, over 108.00 samples.], 2024-10-21 06:38:55,271 INFO [train.py:561] (1/4) Epoch 1506, batch 10, global_batch_idx: 24090, batch size: 111, loss[dur_loss=0.2354, prior_loss=0.9835, diff_loss=0.3276, tot_loss=1.547, over 111.00 samples.], 
tot_loss[dur_loss=0.2299, prior_loss=0.982, diff_loss=0.3922, tot_loss=1.604, over 1656.00 samples.], 2024-10-21 06:39:02,346 INFO [train.py:682] (1/4) Start epoch 1507 2024-10-21 06:39:16,134 INFO [train.py:561] (1/4) Epoch 1507, batch 4, global_batch_idx: 24100, batch size: 189, loss[dur_loss=0.2309, prior_loss=0.9824, diff_loss=0.3436, tot_loss=1.557, over 189.00 samples.], tot_loss[dur_loss=0.2263, prior_loss=0.9813, diff_loss=0.4444, tot_loss=1.652, over 937.00 samples.], 2024-10-21 06:39:31,090 INFO [train.py:561] (1/4) Epoch 1507, batch 14, global_batch_idx: 24110, batch size: 142, loss[dur_loss=0.2333, prior_loss=0.9822, diff_loss=0.3035, tot_loss=1.519, over 142.00 samples.], tot_loss[dur_loss=0.2308, prior_loss=0.982, diff_loss=0.3677, tot_loss=1.581, over 2210.00 samples.], 2024-10-21 06:39:32,520 INFO [train.py:682] (1/4) Start epoch 1508 2024-10-21 06:39:52,792 INFO [train.py:561] (1/4) Epoch 1508, batch 8, global_batch_idx: 24120, batch size: 170, loss[dur_loss=0.2371, prior_loss=0.9824, diff_loss=0.329, tot_loss=1.548, over 170.00 samples.], tot_loss[dur_loss=0.2301, prior_loss=0.9817, diff_loss=0.3986, tot_loss=1.61, over 1432.00 samples.], 2024-10-21 06:40:02,914 INFO [train.py:682] (1/4) Start epoch 1509 2024-10-21 06:40:14,584 INFO [train.py:561] (1/4) Epoch 1509, batch 2, global_batch_idx: 24130, batch size: 203, loss[dur_loss=0.234, prior_loss=0.9822, diff_loss=0.3401, tot_loss=1.556, over 203.00 samples.], tot_loss[dur_loss=0.2342, prior_loss=0.9824, diff_loss=0.3413, tot_loss=1.558, over 442.00 samples.], 2024-10-21 06:40:28,751 INFO [train.py:561] (1/4) Epoch 1509, batch 12, global_batch_idx: 24140, batch size: 152, loss[dur_loss=0.2284, prior_loss=0.9823, diff_loss=0.341, tot_loss=1.552, over 152.00 samples.], tot_loss[dur_loss=0.2312, prior_loss=0.9819, diff_loss=0.3806, tot_loss=1.594, over 1966.00 samples.], 2024-10-21 06:40:33,171 INFO [train.py:682] (1/4) Start epoch 1510 2024-10-21 06:40:50,485 INFO [train.py:561] (1/4) Epoch 1510, batch 6, global_batch_idx: 24150, batch size: 106, loss[dur_loss=0.2389, prior_loss=0.9826, diff_loss=0.3242, tot_loss=1.546, over 106.00 samples.], tot_loss[dur_loss=0.2292, prior_loss=0.9816, diff_loss=0.411, tot_loss=1.622, over 1142.00 samples.], 2024-10-21 06:41:03,530 INFO [train.py:682] (1/4) Start epoch 1511 2024-10-21 06:41:12,587 INFO [train.py:561] (1/4) Epoch 1511, batch 0, global_batch_idx: 24160, batch size: 108, loss[dur_loss=0.2358, prior_loss=0.9832, diff_loss=0.336, tot_loss=1.555, over 108.00 samples.], tot_loss[dur_loss=0.2358, prior_loss=0.9832, diff_loss=0.336, tot_loss=1.555, over 108.00 samples.], 2024-10-21 06:41:26,849 INFO [train.py:561] (1/4) Epoch 1511, batch 10, global_batch_idx: 24170, batch size: 111, loss[dur_loss=0.2364, prior_loss=0.9835, diff_loss=0.3113, tot_loss=1.531, over 111.00 samples.], tot_loss[dur_loss=0.2302, prior_loss=0.9819, diff_loss=0.3888, tot_loss=1.601, over 1656.00 samples.], 2024-10-21 06:41:34,002 INFO [train.py:682] (1/4) Start epoch 1512 2024-10-21 06:41:48,061 INFO [train.py:561] (1/4) Epoch 1512, batch 4, global_batch_idx: 24180, batch size: 189, loss[dur_loss=0.234, prior_loss=0.9826, diff_loss=0.3733, tot_loss=1.59, over 189.00 samples.], tot_loss[dur_loss=0.226, prior_loss=0.9813, diff_loss=0.4537, tot_loss=1.661, over 937.00 samples.], 2024-10-21 06:42:02,974 INFO [train.py:561] (1/4) Epoch 1512, batch 14, global_batch_idx: 24190, batch size: 142, loss[dur_loss=0.2332, prior_loss=0.9823, diff_loss=0.3394, tot_loss=1.555, over 142.00 samples.], 
tot_loss[dur_loss=0.2308, prior_loss=0.9821, diff_loss=0.3766, tot_loss=1.59, over 2210.00 samples.], 2024-10-21 06:42:04,415 INFO [train.py:682] (1/4) Start epoch 1513 2024-10-21 06:42:24,363 INFO [train.py:561] (1/4) Epoch 1513, batch 8, global_batch_idx: 24200, batch size: 170, loss[dur_loss=0.2372, prior_loss=0.9824, diff_loss=0.346, tot_loss=1.566, over 170.00 samples.], tot_loss[dur_loss=0.2303, prior_loss=0.9819, diff_loss=0.4072, tot_loss=1.619, over 1432.00 samples.], 2024-10-21 06:42:34,474 INFO [train.py:682] (1/4) Start epoch 1514 2024-10-21 06:42:46,103 INFO [train.py:561] (1/4) Epoch 1514, batch 2, global_batch_idx: 24210, batch size: 203, loss[dur_loss=0.2316, prior_loss=0.9821, diff_loss=0.3493, tot_loss=1.563, over 203.00 samples.], tot_loss[dur_loss=0.2345, prior_loss=0.9824, diff_loss=0.3294, tot_loss=1.546, over 442.00 samples.], 2024-10-21 06:43:00,492 INFO [train.py:561] (1/4) Epoch 1514, batch 12, global_batch_idx: 24220, batch size: 152, loss[dur_loss=0.2363, prior_loss=0.9825, diff_loss=0.3146, tot_loss=1.533, over 152.00 samples.], tot_loss[dur_loss=0.2322, prior_loss=0.9822, diff_loss=0.3832, tot_loss=1.598, over 1966.00 samples.], 2024-10-21 06:43:04,932 INFO [train.py:682] (1/4) Start epoch 1515 2024-10-21 06:43:21,853 INFO [train.py:561] (1/4) Epoch 1515, batch 6, global_batch_idx: 24230, batch size: 106, loss[dur_loss=0.2389, prior_loss=0.9828, diff_loss=0.3061, tot_loss=1.528, over 106.00 samples.], tot_loss[dur_loss=0.2287, prior_loss=0.9817, diff_loss=0.4248, tot_loss=1.635, over 1142.00 samples.], 2024-10-21 06:43:34,857 INFO [train.py:682] (1/4) Start epoch 1516 2024-10-21 06:43:43,466 INFO [train.py:561] (1/4) Epoch 1516, batch 0, global_batch_idx: 24240, batch size: 108, loss[dur_loss=0.2384, prior_loss=0.9834, diff_loss=0.3243, tot_loss=1.546, over 108.00 samples.], tot_loss[dur_loss=0.2384, prior_loss=0.9834, diff_loss=0.3243, tot_loss=1.546, over 108.00 samples.], 2024-10-21 06:43:58,051 INFO [train.py:561] (1/4) Epoch 1516, batch 10, global_batch_idx: 24250, batch size: 111, loss[dur_loss=0.2374, prior_loss=0.9835, diff_loss=0.2921, tot_loss=1.513, over 111.00 samples.], tot_loss[dur_loss=0.2311, prior_loss=0.982, diff_loss=0.3902, tot_loss=1.603, over 1656.00 samples.], 2024-10-21 06:44:05,178 INFO [train.py:682] (1/4) Start epoch 1517 2024-10-21 06:44:18,984 INFO [train.py:561] (1/4) Epoch 1517, batch 4, global_batch_idx: 24260, batch size: 189, loss[dur_loss=0.231, prior_loss=0.9823, diff_loss=0.3432, tot_loss=1.557, over 189.00 samples.], tot_loss[dur_loss=0.2268, prior_loss=0.9814, diff_loss=0.4382, tot_loss=1.646, over 937.00 samples.], 2024-10-21 06:44:33,929 INFO [train.py:561] (1/4) Epoch 1517, batch 14, global_batch_idx: 24270, batch size: 142, loss[dur_loss=0.2334, prior_loss=0.982, diff_loss=0.3577, tot_loss=1.573, over 142.00 samples.], tot_loss[dur_loss=0.2315, prior_loss=0.9821, diff_loss=0.3744, tot_loss=1.588, over 2210.00 samples.], 2024-10-21 06:44:35,353 INFO [train.py:682] (1/4) Start epoch 1518 2024-10-21 06:44:55,546 INFO [train.py:561] (1/4) Epoch 1518, batch 8, global_batch_idx: 24280, batch size: 170, loss[dur_loss=0.2389, prior_loss=0.9825, diff_loss=0.3172, tot_loss=1.539, over 170.00 samples.], tot_loss[dur_loss=0.2311, prior_loss=0.9819, diff_loss=0.396, tot_loss=1.609, over 1432.00 samples.], 2024-10-21 06:45:05,792 INFO [train.py:682] (1/4) Start epoch 1519 2024-10-21 06:45:17,096 INFO [train.py:561] (1/4) Epoch 1519, batch 2, global_batch_idx: 24290, batch size: 203, loss[dur_loss=0.2333, prior_loss=0.9825, 
diff_loss=0.3784, tot_loss=1.594, over 203.00 samples.], tot_loss[dur_loss=0.2339, prior_loss=0.9826, diff_loss=0.3362, tot_loss=1.553, over 442.00 samples.], 2024-10-21 06:45:31,412 INFO [train.py:561] (1/4) Epoch 1519, batch 12, global_batch_idx: 24300, batch size: 152, loss[dur_loss=0.2317, prior_loss=0.9825, diff_loss=0.3069, tot_loss=1.521, over 152.00 samples.], tot_loss[dur_loss=0.2318, prior_loss=0.9822, diff_loss=0.3811, tot_loss=1.595, over 1966.00 samples.], 2024-10-21 06:45:35,893 INFO [train.py:682] (1/4) Start epoch 1520 2024-10-21 06:45:52,984 INFO [train.py:561] (1/4) Epoch 1520, batch 6, global_batch_idx: 24310, batch size: 106, loss[dur_loss=0.2399, prior_loss=0.9827, diff_loss=0.3265, tot_loss=1.549, over 106.00 samples.], tot_loss[dur_loss=0.2288, prior_loss=0.9816, diff_loss=0.4129, tot_loss=1.623, over 1142.00 samples.], 2024-10-21 06:46:06,084 INFO [train.py:682] (1/4) Start epoch 1521 2024-10-21 06:46:14,694 INFO [train.py:561] (1/4) Epoch 1521, batch 0, global_batch_idx: 24320, batch size: 108, loss[dur_loss=0.2376, prior_loss=0.9833, diff_loss=0.2962, tot_loss=1.517, over 108.00 samples.], tot_loss[dur_loss=0.2376, prior_loss=0.9833, diff_loss=0.2962, tot_loss=1.517, over 108.00 samples.], 2024-10-21 06:46:28,904 INFO [train.py:561] (1/4) Epoch 1521, batch 10, global_batch_idx: 24330, batch size: 111, loss[dur_loss=0.2341, prior_loss=0.9833, diff_loss=0.3318, tot_loss=1.549, over 111.00 samples.], tot_loss[dur_loss=0.2299, prior_loss=0.9819, diff_loss=0.3922, tot_loss=1.604, over 1656.00 samples.], 2024-10-21 06:46:35,960 INFO [train.py:682] (1/4) Start epoch 1522 2024-10-21 06:46:49,743 INFO [train.py:561] (1/4) Epoch 1522, batch 4, global_batch_idx: 24340, batch size: 189, loss[dur_loss=0.2326, prior_loss=0.9824, diff_loss=0.3339, tot_loss=1.549, over 189.00 samples.], tot_loss[dur_loss=0.2266, prior_loss=0.9813, diff_loss=0.4449, tot_loss=1.653, over 937.00 samples.], 2024-10-21 06:47:04,564 INFO [train.py:561] (1/4) Epoch 1522, batch 14, global_batch_idx: 24350, batch size: 142, loss[dur_loss=0.2299, prior_loss=0.9817, diff_loss=0.3155, tot_loss=1.527, over 142.00 samples.], tot_loss[dur_loss=0.2305, prior_loss=0.9819, diff_loss=0.372, tot_loss=1.584, over 2210.00 samples.], 2024-10-21 06:47:05,986 INFO [train.py:682] (1/4) Start epoch 1523 2024-10-21 06:47:25,735 INFO [train.py:561] (1/4) Epoch 1523, batch 8, global_batch_idx: 24360, batch size: 170, loss[dur_loss=0.2344, prior_loss=0.9822, diff_loss=0.3669, tot_loss=1.584, over 170.00 samples.], tot_loss[dur_loss=0.2312, prior_loss=0.9818, diff_loss=0.4112, tot_loss=1.624, over 1432.00 samples.], 2024-10-21 06:47:35,787 INFO [train.py:682] (1/4) Start epoch 1524 2024-10-21 06:47:47,003 INFO [train.py:561] (1/4) Epoch 1524, batch 2, global_batch_idx: 24370, batch size: 203, loss[dur_loss=0.2319, prior_loss=0.9824, diff_loss=0.3546, tot_loss=1.569, over 203.00 samples.], tot_loss[dur_loss=0.233, prior_loss=0.9825, diff_loss=0.3513, tot_loss=1.567, over 442.00 samples.], 2024-10-21 06:48:01,258 INFO [train.py:561] (1/4) Epoch 1524, batch 12, global_batch_idx: 24380, batch size: 152, loss[dur_loss=0.2342, prior_loss=0.9825, diff_loss=0.337, tot_loss=1.554, over 152.00 samples.], tot_loss[dur_loss=0.231, prior_loss=0.9821, diff_loss=0.3812, tot_loss=1.594, over 1966.00 samples.], 2024-10-21 06:48:05,669 INFO [train.py:682] (1/4) Start epoch 1525 2024-10-21 06:48:22,306 INFO [train.py:561] (1/4) Epoch 1525, batch 6, global_batch_idx: 24390, batch size: 106, loss[dur_loss=0.234, prior_loss=0.9823, diff_loss=0.3271, 
tot_loss=1.543, over 106.00 samples.], tot_loss[dur_loss=0.2282, prior_loss=0.9816, diff_loss=0.427, tot_loss=1.637, over 1142.00 samples.], 2024-10-21 06:48:35,370 INFO [train.py:682] (1/4) Start epoch 1526 2024-10-21 06:48:44,207 INFO [train.py:561] (1/4) Epoch 1526, batch 0, global_batch_idx: 24400, batch size: 108, loss[dur_loss=0.2371, prior_loss=0.9828, diff_loss=0.3045, tot_loss=1.524, over 108.00 samples.], tot_loss[dur_loss=0.2371, prior_loss=0.9828, diff_loss=0.3045, tot_loss=1.524, over 108.00 samples.], 2024-10-21 06:48:58,511 INFO [train.py:561] (1/4) Epoch 1526, batch 10, global_batch_idx: 24410, batch size: 111, loss[dur_loss=0.2345, prior_loss=0.983, diff_loss=0.3683, tot_loss=1.586, over 111.00 samples.], tot_loss[dur_loss=0.2296, prior_loss=0.9818, diff_loss=0.3989, tot_loss=1.61, over 1656.00 samples.], 2024-10-21 06:49:05,575 INFO [train.py:682] (1/4) Start epoch 1527 2024-10-21 06:49:19,154 INFO [train.py:561] (1/4) Epoch 1527, batch 4, global_batch_idx: 24420, batch size: 189, loss[dur_loss=0.2323, prior_loss=0.9825, diff_loss=0.3768, tot_loss=1.592, over 189.00 samples.], tot_loss[dur_loss=0.2273, prior_loss=0.9813, diff_loss=0.4354, tot_loss=1.644, over 937.00 samples.], 2024-10-21 06:49:33,941 INFO [train.py:561] (1/4) Epoch 1527, batch 14, global_batch_idx: 24430, batch size: 142, loss[dur_loss=0.2314, prior_loss=0.9818, diff_loss=0.3183, tot_loss=1.531, over 142.00 samples.], tot_loss[dur_loss=0.2304, prior_loss=0.9819, diff_loss=0.3723, tot_loss=1.585, over 2210.00 samples.], 2024-10-21 06:49:35,359 INFO [train.py:682] (1/4) Start epoch 1528 2024-10-21 06:49:55,503 INFO [train.py:561] (1/4) Epoch 1528, batch 8, global_batch_idx: 24440, batch size: 170, loss[dur_loss=0.236, prior_loss=0.9821, diff_loss=0.3016, tot_loss=1.52, over 170.00 samples.], tot_loss[dur_loss=0.2292, prior_loss=0.9816, diff_loss=0.3948, tot_loss=1.606, over 1432.00 samples.], 2024-10-21 06:50:05,675 INFO [train.py:682] (1/4) Start epoch 1529 2024-10-21 06:50:16,936 INFO [train.py:561] (1/4) Epoch 1529, batch 2, global_batch_idx: 24450, batch size: 203, loss[dur_loss=0.2309, prior_loss=0.9821, diff_loss=0.3643, tot_loss=1.577, over 203.00 samples.], tot_loss[dur_loss=0.2332, prior_loss=0.9823, diff_loss=0.3438, tot_loss=1.559, over 442.00 samples.], 2024-10-21 06:50:31,127 INFO [train.py:561] (1/4) Epoch 1529, batch 12, global_batch_idx: 24460, batch size: 152, loss[dur_loss=0.2315, prior_loss=0.9824, diff_loss=0.334, tot_loss=1.548, over 152.00 samples.], tot_loss[dur_loss=0.2316, prior_loss=0.9819, diff_loss=0.3852, tot_loss=1.599, over 1966.00 samples.], 2024-10-21 06:50:35,528 INFO [train.py:682] (1/4) Start epoch 1530 2024-10-21 06:50:52,541 INFO [train.py:561] (1/4) Epoch 1530, batch 6, global_batch_idx: 24470, batch size: 106, loss[dur_loss=0.235, prior_loss=0.9823, diff_loss=0.3637, tot_loss=1.581, over 106.00 samples.], tot_loss[dur_loss=0.2291, prior_loss=0.9816, diff_loss=0.4102, tot_loss=1.621, over 1142.00 samples.], 2024-10-21 06:51:05,618 INFO [train.py:682] (1/4) Start epoch 1531 2024-10-21 06:51:14,278 INFO [train.py:561] (1/4) Epoch 1531, batch 0, global_batch_idx: 24480, batch size: 108, loss[dur_loss=0.2381, prior_loss=0.9831, diff_loss=0.3193, tot_loss=1.54, over 108.00 samples.], tot_loss[dur_loss=0.2381, prior_loss=0.9831, diff_loss=0.3193, tot_loss=1.54, over 108.00 samples.], 2024-10-21 06:51:28,430 INFO [train.py:561] (1/4) Epoch 1531, batch 10, global_batch_idx: 24490, batch size: 111, loss[dur_loss=0.231, prior_loss=0.9832, diff_loss=0.3566, tot_loss=1.571, over 
111.00 samples.], tot_loss[dur_loss=0.2307, prior_loss=0.9819, diff_loss=0.4039, tot_loss=1.617, over 1656.00 samples.], 2024-10-21 06:51:35,506 INFO [train.py:682] (1/4) Start epoch 1532 2024-10-21 06:51:49,183 INFO [train.py:561] (1/4) Epoch 1532, batch 4, global_batch_idx: 24500, batch size: 189, loss[dur_loss=0.2277, prior_loss=0.9822, diff_loss=0.3781, tot_loss=1.588, over 189.00 samples.], tot_loss[dur_loss=0.2266, prior_loss=0.9813, diff_loss=0.4562, tot_loss=1.664, over 937.00 samples.], 2024-10-21 06:52:03,877 INFO [train.py:561] (1/4) Epoch 1532, batch 14, global_batch_idx: 24510, batch size: 142, loss[dur_loss=0.2286, prior_loss=0.9815, diff_loss=0.305, tot_loss=1.515, over 142.00 samples.], tot_loss[dur_loss=0.2298, prior_loss=0.9818, diff_loss=0.3829, tot_loss=1.595, over 2210.00 samples.], 2024-10-21 06:52:05,289 INFO [train.py:682] (1/4) Start epoch 1533 2024-10-21 06:52:25,218 INFO [train.py:561] (1/4) Epoch 1533, batch 8, global_batch_idx: 24520, batch size: 170, loss[dur_loss=0.2331, prior_loss=0.982, diff_loss=0.3323, tot_loss=1.547, over 170.00 samples.], tot_loss[dur_loss=0.229, prior_loss=0.9815, diff_loss=0.4107, tot_loss=1.621, over 1432.00 samples.], 2024-10-21 06:52:35,398 INFO [train.py:682] (1/4) Start epoch 1534 2024-10-21 06:52:46,581 INFO [train.py:561] (1/4) Epoch 1534, batch 2, global_batch_idx: 24530, batch size: 203, loss[dur_loss=0.2315, prior_loss=0.9818, diff_loss=0.3473, tot_loss=1.561, over 203.00 samples.], tot_loss[dur_loss=0.2331, prior_loss=0.9821, diff_loss=0.3267, tot_loss=1.542, over 442.00 samples.], 2024-10-21 06:53:00,730 INFO [train.py:561] (1/4) Epoch 1534, batch 12, global_batch_idx: 24540, batch size: 152, loss[dur_loss=0.2291, prior_loss=0.982, diff_loss=0.3554, tot_loss=1.566, over 152.00 samples.], tot_loss[dur_loss=0.2296, prior_loss=0.9817, diff_loss=0.3774, tot_loss=1.589, over 1966.00 samples.], 2024-10-21 06:53:05,149 INFO [train.py:682] (1/4) Start epoch 1535 2024-10-21 06:53:21,743 INFO [train.py:561] (1/4) Epoch 1535, batch 6, global_batch_idx: 24550, batch size: 106, loss[dur_loss=0.234, prior_loss=0.9821, diff_loss=0.3124, tot_loss=1.529, over 106.00 samples.], tot_loss[dur_loss=0.2264, prior_loss=0.9813, diff_loss=0.4239, tot_loss=1.632, over 1142.00 samples.], 2024-10-21 06:53:34,666 INFO [train.py:682] (1/4) Start epoch 1536 2024-10-21 06:53:43,322 INFO [train.py:561] (1/4) Epoch 1536, batch 0, global_batch_idx: 24560, batch size: 108, loss[dur_loss=0.2351, prior_loss=0.9828, diff_loss=0.3262, tot_loss=1.544, over 108.00 samples.], tot_loss[dur_loss=0.2351, prior_loss=0.9828, diff_loss=0.3262, tot_loss=1.544, over 108.00 samples.], 2024-10-21 06:53:57,465 INFO [train.py:561] (1/4) Epoch 1536, batch 10, global_batch_idx: 24570, batch size: 111, loss[dur_loss=0.2352, prior_loss=0.983, diff_loss=0.352, tot_loss=1.57, over 111.00 samples.], tot_loss[dur_loss=0.2293, prior_loss=0.9816, diff_loss=0.3939, tot_loss=1.605, over 1656.00 samples.], 2024-10-21 06:54:04,501 INFO [train.py:682] (1/4) Start epoch 1537 2024-10-21 06:54:17,897 INFO [train.py:561] (1/4) Epoch 1537, batch 4, global_batch_idx: 24580, batch size: 189, loss[dur_loss=0.2292, prior_loss=0.9821, diff_loss=0.3339, tot_loss=1.545, over 189.00 samples.], tot_loss[dur_loss=0.2252, prior_loss=0.9811, diff_loss=0.4367, tot_loss=1.643, over 937.00 samples.], 2024-10-21 06:54:32,813 INFO [train.py:561] (1/4) Epoch 1537, batch 14, global_batch_idx: 24590, batch size: 142, loss[dur_loss=0.2301, prior_loss=0.9817, diff_loss=0.3276, tot_loss=1.539, over 142.00 samples.], 
tot_loss[dur_loss=0.2295, prior_loss=0.9817, diff_loss=0.3659, tot_loss=1.577, over 2210.00 samples.], 2024-10-21 06:54:34,240 INFO [train.py:682] (1/4) Start epoch 1538 2024-10-21 06:54:53,982 INFO [train.py:561] (1/4) Epoch 1538, batch 8, global_batch_idx: 24600, batch size: 170, loss[dur_loss=0.2322, prior_loss=0.9819, diff_loss=0.3374, tot_loss=1.551, over 170.00 samples.], tot_loss[dur_loss=0.2289, prior_loss=0.9815, diff_loss=0.4067, tot_loss=1.617, over 1432.00 samples.], 2024-10-21 06:55:04,032 INFO [train.py:682] (1/4) Start epoch 1539 2024-10-21 06:55:15,174 INFO [train.py:561] (1/4) Epoch 1539, batch 2, global_batch_idx: 24610, batch size: 203, loss[dur_loss=0.2331, prior_loss=0.9819, diff_loss=0.3514, tot_loss=1.566, over 203.00 samples.], tot_loss[dur_loss=0.2319, prior_loss=0.982, diff_loss=0.3292, tot_loss=1.543, over 442.00 samples.], 2024-10-21 06:55:29,353 INFO [train.py:561] (1/4) Epoch 1539, batch 12, global_batch_idx: 24620, batch size: 152, loss[dur_loss=0.2313, prior_loss=0.9818, diff_loss=0.3615, tot_loss=1.575, over 152.00 samples.], tot_loss[dur_loss=0.2298, prior_loss=0.9816, diff_loss=0.382, tot_loss=1.593, over 1966.00 samples.], 2024-10-21 06:55:33,780 INFO [train.py:682] (1/4) Start epoch 1540 2024-10-21 06:55:50,768 INFO [train.py:561] (1/4) Epoch 1540, batch 6, global_batch_idx: 24630, batch size: 106, loss[dur_loss=0.2327, prior_loss=0.9823, diff_loss=0.3341, tot_loss=1.549, over 106.00 samples.], tot_loss[dur_loss=0.2255, prior_loss=0.9811, diff_loss=0.4321, tot_loss=1.639, over 1142.00 samples.], 2024-10-21 06:56:03,752 INFO [train.py:682] (1/4) Start epoch 1541 2024-10-21 06:56:12,275 INFO [train.py:561] (1/4) Epoch 1541, batch 0, global_batch_idx: 24640, batch size: 108, loss[dur_loss=0.2331, prior_loss=0.9825, diff_loss=0.3207, tot_loss=1.536, over 108.00 samples.], tot_loss[dur_loss=0.2331, prior_loss=0.9825, diff_loss=0.3207, tot_loss=1.536, over 108.00 samples.], 2024-10-21 06:56:26,387 INFO [train.py:561] (1/4) Epoch 1541, batch 10, global_batch_idx: 24650, batch size: 111, loss[dur_loss=0.2369, prior_loss=0.983, diff_loss=0.3553, tot_loss=1.575, over 111.00 samples.], tot_loss[dur_loss=0.2287, prior_loss=0.9815, diff_loss=0.394, tot_loss=1.604, over 1656.00 samples.], 2024-10-21 06:56:33,393 INFO [train.py:682] (1/4) Start epoch 1542 2024-10-21 06:56:47,064 INFO [train.py:561] (1/4) Epoch 1542, batch 4, global_batch_idx: 24660, batch size: 189, loss[dur_loss=0.2267, prior_loss=0.9823, diff_loss=0.3402, tot_loss=1.549, over 189.00 samples.], tot_loss[dur_loss=0.2254, prior_loss=0.981, diff_loss=0.4358, tot_loss=1.642, over 937.00 samples.], 2024-10-21 06:57:01,733 INFO [train.py:561] (1/4) Epoch 1542, batch 14, global_batch_idx: 24670, batch size: 142, loss[dur_loss=0.235, prior_loss=0.9817, diff_loss=0.3041, tot_loss=1.521, over 142.00 samples.], tot_loss[dur_loss=0.2299, prior_loss=0.9817, diff_loss=0.3642, tot_loss=1.576, over 2210.00 samples.], 2024-10-21 06:57:03,142 INFO [train.py:682] (1/4) Start epoch 1543 2024-10-21 06:57:22,792 INFO [train.py:561] (1/4) Epoch 1543, batch 8, global_batch_idx: 24680, batch size: 170, loss[dur_loss=0.2357, prior_loss=0.9824, diff_loss=0.3444, tot_loss=1.562, over 170.00 samples.], tot_loss[dur_loss=0.2273, prior_loss=0.9815, diff_loss=0.4134, tot_loss=1.622, over 1432.00 samples.], 2024-10-21 06:57:32,865 INFO [train.py:682] (1/4) Start epoch 1544 2024-10-21 06:57:44,027 INFO [train.py:561] (1/4) Epoch 1544, batch 2, global_batch_idx: 24690, batch size: 203, loss[dur_loss=0.2265, prior_loss=0.9815, 
diff_loss=0.3594, tot_loss=1.567, over 203.00 samples.], tot_loss[dur_loss=0.23, prior_loss=0.9818, diff_loss=0.3408, tot_loss=1.553, over 442.00 samples.], 2024-10-21 06:57:58,075 INFO [train.py:561] (1/4) Epoch 1544, batch 12, global_batch_idx: 24700, batch size: 152, loss[dur_loss=0.2313, prior_loss=0.9818, diff_loss=0.3611, tot_loss=1.574, over 152.00 samples.], tot_loss[dur_loss=0.2294, prior_loss=0.9816, diff_loss=0.3824, tot_loss=1.593, over 1966.00 samples.], 2024-10-21 06:58:02,451 INFO [train.py:682] (1/4) Start epoch 1545 2024-10-21 06:58:19,455 INFO [train.py:561] (1/4) Epoch 1545, batch 6, global_batch_idx: 24710, batch size: 106, loss[dur_loss=0.2317, prior_loss=0.9823, diff_loss=0.292, tot_loss=1.506, over 106.00 samples.], tot_loss[dur_loss=0.2258, prior_loss=0.9814, diff_loss=0.424, tot_loss=1.631, over 1142.00 samples.], 2024-10-21 06:58:32,426 INFO [train.py:682] (1/4) Start epoch 1546 2024-10-21 06:58:41,009 INFO [train.py:561] (1/4) Epoch 1546, batch 0, global_batch_idx: 24720, batch size: 108, loss[dur_loss=0.2373, prior_loss=0.9827, diff_loss=0.3257, tot_loss=1.546, over 108.00 samples.], tot_loss[dur_loss=0.2373, prior_loss=0.9827, diff_loss=0.3257, tot_loss=1.546, over 108.00 samples.], 2024-10-21 06:58:55,112 INFO [train.py:561] (1/4) Epoch 1546, batch 10, global_batch_idx: 24730, batch size: 111, loss[dur_loss=0.2359, prior_loss=0.9829, diff_loss=0.3252, tot_loss=1.544, over 111.00 samples.], tot_loss[dur_loss=0.23, prior_loss=0.9815, diff_loss=0.3828, tot_loss=1.594, over 1656.00 samples.], 2024-10-21 06:59:02,146 INFO [train.py:682] (1/4) Start epoch 1547 2024-10-21 06:59:15,798 INFO [train.py:561] (1/4) Epoch 1547, batch 4, global_batch_idx: 24740, batch size: 189, loss[dur_loss=0.2287, prior_loss=0.9818, diff_loss=0.3614, tot_loss=1.572, over 189.00 samples.], tot_loss[dur_loss=0.2243, prior_loss=0.9808, diff_loss=0.4399, tot_loss=1.645, over 937.00 samples.], 2024-10-21 06:59:30,582 INFO [train.py:561] (1/4) Epoch 1547, batch 14, global_batch_idx: 24750, batch size: 142, loss[dur_loss=0.2316, prior_loss=0.9815, diff_loss=0.3187, tot_loss=1.532, over 142.00 samples.], tot_loss[dur_loss=0.2291, prior_loss=0.9816, diff_loss=0.3713, tot_loss=1.582, over 2210.00 samples.], 2024-10-21 06:59:31,998 INFO [train.py:682] (1/4) Start epoch 1548 2024-10-21 06:59:51,604 INFO [train.py:561] (1/4) Epoch 1548, batch 8, global_batch_idx: 24760, batch size: 170, loss[dur_loss=0.235, prior_loss=0.9819, diff_loss=0.3227, tot_loss=1.54, over 170.00 samples.], tot_loss[dur_loss=0.2282, prior_loss=0.9813, diff_loss=0.4007, tot_loss=1.61, over 1432.00 samples.], 2024-10-21 07:00:01,706 INFO [train.py:682] (1/4) Start epoch 1549 2024-10-21 07:00:13,133 INFO [train.py:561] (1/4) Epoch 1549, batch 2, global_batch_idx: 24770, batch size: 203, loss[dur_loss=0.2316, prior_loss=0.982, diff_loss=0.3586, tot_loss=1.572, over 203.00 samples.], tot_loss[dur_loss=0.2327, prior_loss=0.9821, diff_loss=0.3523, tot_loss=1.567, over 442.00 samples.], 2024-10-21 07:00:27,270 INFO [train.py:561] (1/4) Epoch 1549, batch 12, global_batch_idx: 24780, batch size: 152, loss[dur_loss=0.2348, prior_loss=0.9822, diff_loss=0.3353, tot_loss=1.552, over 152.00 samples.], tot_loss[dur_loss=0.2306, prior_loss=0.9816, diff_loss=0.3905, tot_loss=1.603, over 1966.00 samples.], 2024-10-21 07:00:31,680 INFO [train.py:682] (1/4) Start epoch 1550 2024-10-21 07:00:48,680 INFO [train.py:561] (1/4) Epoch 1550, batch 6, global_batch_idx: 24790, batch size: 106, loss[dur_loss=0.2312, prior_loss=0.9819, diff_loss=0.3246, 
tot_loss=1.538, over 106.00 samples.], tot_loss[dur_loss=0.2258, prior_loss=0.9812, diff_loss=0.4226, tot_loss=1.63, over 1142.00 samples.], 2024-10-21 07:01:01,683 INFO [train.py:682] (1/4) Start epoch 1551 2024-10-21 07:01:10,315 INFO [train.py:561] (1/4) Epoch 1551, batch 0, global_batch_idx: 24800, batch size: 108, loss[dur_loss=0.2334, prior_loss=0.9824, diff_loss=0.2986, tot_loss=1.514, over 108.00 samples.], tot_loss[dur_loss=0.2334, prior_loss=0.9824, diff_loss=0.2986, tot_loss=1.514, over 108.00 samples.], 2024-10-21 07:01:24,443 INFO [train.py:561] (1/4) Epoch 1551, batch 10, global_batch_idx: 24810, batch size: 111, loss[dur_loss=0.2355, prior_loss=0.9831, diff_loss=0.3501, tot_loss=1.569, over 111.00 samples.], tot_loss[dur_loss=0.2289, prior_loss=0.9815, diff_loss=0.3889, tot_loss=1.599, over 1656.00 samples.], 2024-10-21 07:01:31,468 INFO [train.py:682] (1/4) Start epoch 1552 2024-10-21 07:01:44,951 INFO [train.py:561] (1/4) Epoch 1552, batch 4, global_batch_idx: 24820, batch size: 189, loss[dur_loss=0.2315, prior_loss=0.9821, diff_loss=0.3627, tot_loss=1.576, over 189.00 samples.], tot_loss[dur_loss=0.2246, prior_loss=0.9809, diff_loss=0.4378, tot_loss=1.643, over 937.00 samples.], 2024-10-21 07:01:59,677 INFO [train.py:561] (1/4) Epoch 1552, batch 14, global_batch_idx: 24830, batch size: 142, loss[dur_loss=0.2318, prior_loss=0.9816, diff_loss=0.3299, tot_loss=1.543, over 142.00 samples.], tot_loss[dur_loss=0.2293, prior_loss=0.9816, diff_loss=0.3704, tot_loss=1.581, over 2210.00 samples.], 2024-10-21 07:02:01,085 INFO [train.py:682] (1/4) Start epoch 1553 2024-10-21 07:02:20,596 INFO [train.py:561] (1/4) Epoch 1553, batch 8, global_batch_idx: 24840, batch size: 170, loss[dur_loss=0.2379, prior_loss=0.9822, diff_loss=0.3588, tot_loss=1.579, over 170.00 samples.], tot_loss[dur_loss=0.2296, prior_loss=0.9814, diff_loss=0.4015, tot_loss=1.612, over 1432.00 samples.], 2024-10-21 07:02:30,665 INFO [train.py:682] (1/4) Start epoch 1554 2024-10-21 07:02:42,020 INFO [train.py:561] (1/4) Epoch 1554, batch 2, global_batch_idx: 24850, batch size: 203, loss[dur_loss=0.2323, prior_loss=0.9822, diff_loss=0.3672, tot_loss=1.582, over 203.00 samples.], tot_loss[dur_loss=0.2315, prior_loss=0.9821, diff_loss=0.3496, tot_loss=1.563, over 442.00 samples.], 2024-10-21 07:02:56,150 INFO [train.py:561] (1/4) Epoch 1554, batch 12, global_batch_idx: 24860, batch size: 152, loss[dur_loss=0.23, prior_loss=0.982, diff_loss=0.3188, tot_loss=1.531, over 152.00 samples.], tot_loss[dur_loss=0.23, prior_loss=0.9816, diff_loss=0.3774, tot_loss=1.589, over 1966.00 samples.], 2024-10-21 07:03:00,602 INFO [train.py:682] (1/4) Start epoch 1555 2024-10-21 07:03:17,255 INFO [train.py:561] (1/4) Epoch 1555, batch 6, global_batch_idx: 24870, batch size: 106, loss[dur_loss=0.2312, prior_loss=0.9821, diff_loss=0.3411, tot_loss=1.554, over 106.00 samples.], tot_loss[dur_loss=0.2256, prior_loss=0.9811, diff_loss=0.4174, tot_loss=1.624, over 1142.00 samples.], 2024-10-21 07:03:30,145 INFO [train.py:682] (1/4) Start epoch 1556 2024-10-21 07:03:38,822 INFO [train.py:561] (1/4) Epoch 1556, batch 0, global_batch_idx: 24880, batch size: 108, loss[dur_loss=0.2355, prior_loss=0.9826, diff_loss=0.3139, tot_loss=1.532, over 108.00 samples.], tot_loss[dur_loss=0.2355, prior_loss=0.9826, diff_loss=0.3139, tot_loss=1.532, over 108.00 samples.], 2024-10-21 07:03:52,859 INFO [train.py:561] (1/4) Epoch 1556, batch 10, global_batch_idx: 24890, batch size: 111, loss[dur_loss=0.2346, prior_loss=0.9828, diff_loss=0.3122, tot_loss=1.53, 
over 111.00 samples.], tot_loss[dur_loss=0.2285, prior_loss=0.9815, diff_loss=0.3808, tot_loss=1.591, over 1656.00 samples.], 2024-10-21 07:03:59,906 INFO [train.py:682] (1/4) Start epoch 1557 2024-10-21 07:04:13,383 INFO [train.py:561] (1/4) Epoch 1557, batch 4, global_batch_idx: 24900, batch size: 189, loss[dur_loss=0.2296, prior_loss=0.9819, diff_loss=0.3542, tot_loss=1.566, over 189.00 samples.], tot_loss[dur_loss=0.2236, prior_loss=0.9807, diff_loss=0.4393, tot_loss=1.644, over 937.00 samples.], 2024-10-21 07:04:28,174 INFO [train.py:561] (1/4) Epoch 1557, batch 14, global_batch_idx: 24910, batch size: 142, loss[dur_loss=0.2316, prior_loss=0.9818, diff_loss=0.3366, tot_loss=1.55, over 142.00 samples.], tot_loss[dur_loss=0.2289, prior_loss=0.9816, diff_loss=0.3731, tot_loss=1.584, over 2210.00 samples.], 2024-10-21 07:04:29,582 INFO [train.py:682] (1/4) Start epoch 1558 2024-10-21 07:04:49,473 INFO [train.py:561] (1/4) Epoch 1558, batch 8, global_batch_idx: 24920, batch size: 170, loss[dur_loss=0.2379, prior_loss=0.9824, diff_loss=0.3379, tot_loss=1.558, over 170.00 samples.], tot_loss[dur_loss=0.2282, prior_loss=0.9813, diff_loss=0.4054, tot_loss=1.615, over 1432.00 samples.], 2024-10-21 07:04:59,489 INFO [train.py:682] (1/4) Start epoch 1559 2024-10-21 07:05:10,720 INFO [train.py:561] (1/4) Epoch 1559, batch 2, global_batch_idx: 24930, batch size: 203, loss[dur_loss=0.2322, prior_loss=0.9819, diff_loss=0.3588, tot_loss=1.573, over 203.00 samples.], tot_loss[dur_loss=0.2327, prior_loss=0.982, diff_loss=0.3435, tot_loss=1.558, over 442.00 samples.], 2024-10-21 07:05:24,811 INFO [train.py:561] (1/4) Epoch 1559, batch 12, global_batch_idx: 24940, batch size: 152, loss[dur_loss=0.2284, prior_loss=0.9815, diff_loss=0.2712, tot_loss=1.481, over 152.00 samples.], tot_loss[dur_loss=0.2293, prior_loss=0.9815, diff_loss=0.3794, tot_loss=1.59, over 1966.00 samples.], 2024-10-21 07:05:29,258 INFO [train.py:682] (1/4) Start epoch 1560 2024-10-21 07:05:45,843 INFO [train.py:561] (1/4) Epoch 1560, batch 6, global_batch_idx: 24950, batch size: 106, loss[dur_loss=0.2356, prior_loss=0.982, diff_loss=0.3451, tot_loss=1.563, over 106.00 samples.], tot_loss[dur_loss=0.2272, prior_loss=0.9812, diff_loss=0.425, tot_loss=1.633, over 1142.00 samples.], 2024-10-21 07:05:58,789 INFO [train.py:682] (1/4) Start epoch 1561 2024-10-21 07:06:07,269 INFO [train.py:561] (1/4) Epoch 1561, batch 0, global_batch_idx: 24960, batch size: 108, loss[dur_loss=0.2341, prior_loss=0.9824, diff_loss=0.2864, tot_loss=1.503, over 108.00 samples.], tot_loss[dur_loss=0.2341, prior_loss=0.9824, diff_loss=0.2864, tot_loss=1.503, over 108.00 samples.], 2024-10-21 07:06:21,313 INFO [train.py:561] (1/4) Epoch 1561, batch 10, global_batch_idx: 24970, batch size: 111, loss[dur_loss=0.2352, prior_loss=0.9828, diff_loss=0.3086, tot_loss=1.527, over 111.00 samples.], tot_loss[dur_loss=0.229, prior_loss=0.9814, diff_loss=0.3831, tot_loss=1.594, over 1656.00 samples.], 2024-10-21 07:06:28,355 INFO [train.py:682] (1/4) Start epoch 1562 2024-10-21 07:06:41,954 INFO [train.py:561] (1/4) Epoch 1562, batch 4, global_batch_idx: 24980, batch size: 189, loss[dur_loss=0.2298, prior_loss=0.9817, diff_loss=0.3162, tot_loss=1.528, over 189.00 samples.], tot_loss[dur_loss=0.2252, prior_loss=0.9807, diff_loss=0.4237, tot_loss=1.63, over 937.00 samples.], 2024-10-21 07:06:56,648 INFO [train.py:561] (1/4) Epoch 1562, batch 14, global_batch_idx: 24990, batch size: 142, loss[dur_loss=0.2288, prior_loss=0.9816, diff_loss=0.3238, tot_loss=1.534, over 142.00 
samples.], tot_loss[dur_loss=0.2287, prior_loss=0.9814, diff_loss=0.368, tot_loss=1.578, over 2210.00 samples.], 2024-10-21 07:06:58,064 INFO [train.py:682] (1/4) Start epoch 1563 2024-10-21 07:07:17,656 INFO [train.py:561] (1/4) Epoch 1563, batch 8, global_batch_idx: 25000, batch size: 170, loss[dur_loss=0.233, prior_loss=0.9822, diff_loss=0.3092, tot_loss=1.524, over 170.00 samples.], tot_loss[dur_loss=0.2276, prior_loss=0.9812, diff_loss=0.3894, tot_loss=1.598, over 1432.00 samples.], 2024-10-21 07:07:27,789 INFO [train.py:682] (1/4) Start epoch 1564 2024-10-21 07:07:38,888 INFO [train.py:561] (1/4) Epoch 1564, batch 2, global_batch_idx: 25010, batch size: 203, loss[dur_loss=0.2305, prior_loss=0.9819, diff_loss=0.3566, tot_loss=1.569, over 203.00 samples.], tot_loss[dur_loss=0.2309, prior_loss=0.9819, diff_loss=0.3271, tot_loss=1.54, over 442.00 samples.], 2024-10-21 07:07:52,989 INFO [train.py:561] (1/4) Epoch 1564, batch 12, global_batch_idx: 25020, batch size: 152, loss[dur_loss=0.2265, prior_loss=0.9819, diff_loss=0.2912, tot_loss=1.5, over 152.00 samples.], tot_loss[dur_loss=0.2297, prior_loss=0.9816, diff_loss=0.3869, tot_loss=1.598, over 1966.00 samples.], 2024-10-21 07:07:57,413 INFO [train.py:682] (1/4) Start epoch 1565 2024-10-21 07:08:14,603 INFO [train.py:561] (1/4) Epoch 1565, batch 6, global_batch_idx: 25030, batch size: 106, loss[dur_loss=0.2347, prior_loss=0.982, diff_loss=0.3014, tot_loss=1.518, over 106.00 samples.], tot_loss[dur_loss=0.2261, prior_loss=0.9811, diff_loss=0.4186, tot_loss=1.626, over 1142.00 samples.], 2024-10-21 07:08:27,711 INFO [train.py:682] (1/4) Start epoch 1566 2024-10-21 07:08:36,471 INFO [train.py:561] (1/4) Epoch 1566, batch 0, global_batch_idx: 25040, batch size: 108, loss[dur_loss=0.2373, prior_loss=0.9829, diff_loss=0.3593, tot_loss=1.58, over 108.00 samples.], tot_loss[dur_loss=0.2373, prior_loss=0.9829, diff_loss=0.3593, tot_loss=1.58, over 108.00 samples.], 2024-10-21 07:08:50,585 INFO [train.py:561] (1/4) Epoch 1566, batch 10, global_batch_idx: 25050, batch size: 111, loss[dur_loss=0.2347, prior_loss=0.9831, diff_loss=0.3404, tot_loss=1.558, over 111.00 samples.], tot_loss[dur_loss=0.229, prior_loss=0.9814, diff_loss=0.4069, tot_loss=1.617, over 1656.00 samples.], 2024-10-21 07:08:57,624 INFO [train.py:682] (1/4) Start epoch 1567 2024-10-21 07:09:11,393 INFO [train.py:561] (1/4) Epoch 1567, batch 4, global_batch_idx: 25060, batch size: 189, loss[dur_loss=0.2313, prior_loss=0.9822, diff_loss=0.3505, tot_loss=1.564, over 189.00 samples.], tot_loss[dur_loss=0.2254, prior_loss=0.9809, diff_loss=0.4441, tot_loss=1.65, over 937.00 samples.], 2024-10-21 07:09:26,307 INFO [train.py:561] (1/4) Epoch 1567, batch 14, global_batch_idx: 25070, batch size: 142, loss[dur_loss=0.2299, prior_loss=0.9816, diff_loss=0.3424, tot_loss=1.554, over 142.00 samples.], tot_loss[dur_loss=0.2295, prior_loss=0.9816, diff_loss=0.3729, tot_loss=1.584, over 2210.00 samples.], 2024-10-21 07:09:27,747 INFO [train.py:682] (1/4) Start epoch 1568 2024-10-21 07:09:47,458 INFO [train.py:561] (1/4) Epoch 1568, batch 8, global_batch_idx: 25080, batch size: 170, loss[dur_loss=0.2378, prior_loss=0.9821, diff_loss=0.323, tot_loss=1.543, over 170.00 samples.], tot_loss[dur_loss=0.2279, prior_loss=0.9812, diff_loss=0.3986, tot_loss=1.608, over 1432.00 samples.], 2024-10-21 07:09:57,551 INFO [train.py:682] (1/4) Start epoch 1569 2024-10-21 07:10:08,916 INFO [train.py:561] (1/4) Epoch 1569, batch 2, global_batch_idx: 25090, batch size: 203, loss[dur_loss=0.2325, prior_loss=0.9817, 
diff_loss=0.3776, tot_loss=1.592, over 203.00 samples.], tot_loss[dur_loss=0.2318, prior_loss=0.9819, diff_loss=0.3531, tot_loss=1.567, over 442.00 samples.], 2024-10-21 07:10:23,144 INFO [train.py:561] (1/4) Epoch 1569, batch 12, global_batch_idx: 25100, batch size: 152, loss[dur_loss=0.2292, prior_loss=0.9818, diff_loss=0.3244, tot_loss=1.535, over 152.00 samples.], tot_loss[dur_loss=0.2293, prior_loss=0.9815, diff_loss=0.3905, tot_loss=1.601, over 1966.00 samples.], 2024-10-21 07:10:27,620 INFO [train.py:682] (1/4) Start epoch 1570 2024-10-21 07:10:44,603 INFO [train.py:561] (1/4) Epoch 1570, batch 6, global_batch_idx: 25110, batch size: 106, loss[dur_loss=0.238, prior_loss=0.9822, diff_loss=0.2968, tot_loss=1.517, over 106.00 samples.], tot_loss[dur_loss=0.2255, prior_loss=0.9812, diff_loss=0.4213, tot_loss=1.628, over 1142.00 samples.], 2024-10-21 07:10:57,594 INFO [train.py:682] (1/4) Start epoch 1571 2024-10-21 07:11:06,057 INFO [train.py:561] (1/4) Epoch 1571, batch 0, global_batch_idx: 25120, batch size: 108, loss[dur_loss=0.2369, prior_loss=0.9829, diff_loss=0.3468, tot_loss=1.567, over 108.00 samples.], tot_loss[dur_loss=0.2369, prior_loss=0.9829, diff_loss=0.3468, tot_loss=1.567, over 108.00 samples.], 2024-10-21 07:11:20,153 INFO [train.py:561] (1/4) Epoch 1571, batch 10, global_batch_idx: 25130, batch size: 111, loss[dur_loss=0.2352, prior_loss=0.9831, diff_loss=0.3782, tot_loss=1.597, over 111.00 samples.], tot_loss[dur_loss=0.2291, prior_loss=0.9815, diff_loss=0.396, tot_loss=1.606, over 1656.00 samples.], 2024-10-21 07:11:27,163 INFO [train.py:682] (1/4) Start epoch 1572 2024-10-21 07:11:41,072 INFO [train.py:561] (1/4) Epoch 1572, batch 4, global_batch_idx: 25140, batch size: 189, loss[dur_loss=0.2299, prior_loss=0.9819, diff_loss=0.3266, tot_loss=1.538, over 189.00 samples.], tot_loss[dur_loss=0.2246, prior_loss=0.9809, diff_loss=0.441, tot_loss=1.647, over 937.00 samples.], 2024-10-21 07:11:55,808 INFO [train.py:561] (1/4) Epoch 1572, batch 14, global_batch_idx: 25150, batch size: 142, loss[dur_loss=0.2307, prior_loss=0.9812, diff_loss=0.3196, tot_loss=1.531, over 142.00 samples.], tot_loss[dur_loss=0.229, prior_loss=0.9815, diff_loss=0.3704, tot_loss=1.581, over 2210.00 samples.], 2024-10-21 07:11:57,218 INFO [train.py:682] (1/4) Start epoch 1573 2024-10-21 07:12:16,835 INFO [train.py:561] (1/4) Epoch 1573, batch 8, global_batch_idx: 25160, batch size: 170, loss[dur_loss=0.2335, prior_loss=0.9818, diff_loss=0.3183, tot_loss=1.534, over 170.00 samples.], tot_loss[dur_loss=0.2288, prior_loss=0.9814, diff_loss=0.3929, tot_loss=1.603, over 1432.00 samples.], 2024-10-21 07:12:26,889 INFO [train.py:682] (1/4) Start epoch 1574 2024-10-21 07:12:38,048 INFO [train.py:561] (1/4) Epoch 1574, batch 2, global_batch_idx: 25170, batch size: 203, loss[dur_loss=0.2318, prior_loss=0.9815, diff_loss=0.3622, tot_loss=1.576, over 203.00 samples.], tot_loss[dur_loss=0.2319, prior_loss=0.9818, diff_loss=0.3305, tot_loss=1.544, over 442.00 samples.], 2024-10-21 07:12:52,130 INFO [train.py:561] (1/4) Epoch 1574, batch 12, global_batch_idx: 25180, batch size: 152, loss[dur_loss=0.2285, prior_loss=0.9818, diff_loss=0.3537, tot_loss=1.564, over 152.00 samples.], tot_loss[dur_loss=0.2292, prior_loss=0.9815, diff_loss=0.3824, tot_loss=1.593, over 1966.00 samples.], 2024-10-21 07:12:56,522 INFO [train.py:682] (1/4) Start epoch 1575 2024-10-21 07:13:13,114 INFO [train.py:561] (1/4) Epoch 1575, batch 6, global_batch_idx: 25190, batch size: 106, loss[dur_loss=0.2332, prior_loss=0.9821, 
diff_loss=0.2836, tot_loss=1.499, over 106.00 samples.], tot_loss[dur_loss=0.227, prior_loss=0.9811, diff_loss=0.4109, tot_loss=1.619, over 1142.00 samples.], 2024-10-21 07:13:26,096 INFO [train.py:682] (1/4) Start epoch 1576 2024-10-21 07:13:34,848 INFO [train.py:561] (1/4) Epoch 1576, batch 0, global_batch_idx: 25200, batch size: 108, loss[dur_loss=0.2351, prior_loss=0.9826, diff_loss=0.311, tot_loss=1.529, over 108.00 samples.], tot_loss[dur_loss=0.2351, prior_loss=0.9826, diff_loss=0.311, tot_loss=1.529, over 108.00 samples.], 2024-10-21 07:13:48,899 INFO [train.py:561] (1/4) Epoch 1576, batch 10, global_batch_idx: 25210, batch size: 111, loss[dur_loss=0.2357, prior_loss=0.9828, diff_loss=0.2801, tot_loss=1.499, over 111.00 samples.], tot_loss[dur_loss=0.2283, prior_loss=0.9813, diff_loss=0.3951, tot_loss=1.605, over 1656.00 samples.], 2024-10-21 07:13:55,929 INFO [train.py:682] (1/4) Start epoch 1577 2024-10-21 07:14:09,289 INFO [train.py:561] (1/4) Epoch 1577, batch 4, global_batch_idx: 25220, batch size: 189, loss[dur_loss=0.231, prior_loss=0.9819, diff_loss=0.367, tot_loss=1.58, over 189.00 samples.], tot_loss[dur_loss=0.225, prior_loss=0.9807, diff_loss=0.4383, tot_loss=1.644, over 937.00 samples.], 2024-10-21 07:14:24,003 INFO [train.py:561] (1/4) Epoch 1577, batch 14, global_batch_idx: 25230, batch size: 142, loss[dur_loss=0.2307, prior_loss=0.9813, diff_loss=0.3193, tot_loss=1.531, over 142.00 samples.], tot_loss[dur_loss=0.2286, prior_loss=0.9813, diff_loss=0.3758, tot_loss=1.586, over 2210.00 samples.], 2024-10-21 07:14:25,422 INFO [train.py:682] (1/4) Start epoch 1578 2024-10-21 07:14:45,206 INFO [train.py:561] (1/4) Epoch 1578, batch 8, global_batch_idx: 25240, batch size: 170, loss[dur_loss=0.2349, prior_loss=0.9817, diff_loss=0.3581, tot_loss=1.575, over 170.00 samples.], tot_loss[dur_loss=0.2279, prior_loss=0.9811, diff_loss=0.3941, tot_loss=1.603, over 1432.00 samples.], 2024-10-21 07:14:55,261 INFO [train.py:682] (1/4) Start epoch 1579 2024-10-21 07:15:06,402 INFO [train.py:561] (1/4) Epoch 1579, batch 2, global_batch_idx: 25250, batch size: 203, loss[dur_loss=0.2313, prior_loss=0.9815, diff_loss=0.3584, tot_loss=1.571, over 203.00 samples.], tot_loss[dur_loss=0.2308, prior_loss=0.9817, diff_loss=0.3358, tot_loss=1.548, over 442.00 samples.], 2024-10-21 07:15:20,451 INFO [train.py:561] (1/4) Epoch 1579, batch 12, global_batch_idx: 25260, batch size: 152, loss[dur_loss=0.2294, prior_loss=0.9817, diff_loss=0.3445, tot_loss=1.556, over 152.00 samples.], tot_loss[dur_loss=0.2286, prior_loss=0.9813, diff_loss=0.3782, tot_loss=1.588, over 1966.00 samples.], 2024-10-21 07:15:24,875 INFO [train.py:682] (1/4) Start epoch 1580 2024-10-21 07:15:41,563 INFO [train.py:561] (1/4) Epoch 1580, batch 6, global_batch_idx: 25270, batch size: 106, loss[dur_loss=0.2319, prior_loss=0.9817, diff_loss=0.3128, tot_loss=1.526, over 106.00 samples.], tot_loss[dur_loss=0.2257, prior_loss=0.9809, diff_loss=0.4201, tot_loss=1.627, over 1142.00 samples.], 2024-10-21 07:15:54,571 INFO [train.py:682] (1/4) Start epoch 1581 2024-10-21 07:16:03,502 INFO [train.py:561] (1/4) Epoch 1581, batch 0, global_batch_idx: 25280, batch size: 108, loss[dur_loss=0.2364, prior_loss=0.9822, diff_loss=0.2985, tot_loss=1.517, over 108.00 samples.], tot_loss[dur_loss=0.2364, prior_loss=0.9822, diff_loss=0.2985, tot_loss=1.517, over 108.00 samples.], 2024-10-21 07:16:17,642 INFO [train.py:561] (1/4) Epoch 1581, batch 10, global_batch_idx: 25290, batch size: 111, loss[dur_loss=0.232, prior_loss=0.9829, diff_loss=0.3149, 
tot_loss=1.53, over 111.00 samples.], tot_loss[dur_loss=0.2287, prior_loss=0.9813, diff_loss=0.3816, tot_loss=1.592, over 1656.00 samples.], 2024-10-21 07:16:24,770 INFO [train.py:682] (1/4) Start epoch 1582 2024-10-21 07:16:38,872 INFO [train.py:561] (1/4) Epoch 1582, batch 4, global_batch_idx: 25300, batch size: 189, loss[dur_loss=0.2307, prior_loss=0.9821, diff_loss=0.3311, tot_loss=1.544, over 189.00 samples.], tot_loss[dur_loss=0.2263, prior_loss=0.981, diff_loss=0.4408, tot_loss=1.648, over 937.00 samples.], 2024-10-21 07:16:53,694 INFO [train.py:561] (1/4) Epoch 1582, batch 14, global_batch_idx: 25310, batch size: 142, loss[dur_loss=0.2291, prior_loss=0.9811, diff_loss=0.3317, tot_loss=1.542, over 142.00 samples.], tot_loss[dur_loss=0.2294, prior_loss=0.9814, diff_loss=0.3763, tot_loss=1.587, over 2210.00 samples.], 2024-10-21 07:16:55,160 INFO [train.py:682] (1/4) Start epoch 1583 2024-10-21 07:17:15,236 INFO [train.py:561] (1/4) Epoch 1583, batch 8, global_batch_idx: 25320, batch size: 170, loss[dur_loss=0.2343, prior_loss=0.982, diff_loss=0.3588, tot_loss=1.575, over 170.00 samples.], tot_loss[dur_loss=0.2286, prior_loss=0.9812, diff_loss=0.4082, tot_loss=1.618, over 1432.00 samples.], 2024-10-21 07:17:25,332 INFO [train.py:682] (1/4) Start epoch 1584 2024-10-21 07:17:36,473 INFO [train.py:561] (1/4) Epoch 1584, batch 2, global_batch_idx: 25330, batch size: 203, loss[dur_loss=0.2288, prior_loss=0.9815, diff_loss=0.3432, tot_loss=1.553, over 203.00 samples.], tot_loss[dur_loss=0.2306, prior_loss=0.9818, diff_loss=0.3355, tot_loss=1.548, over 442.00 samples.], 2024-10-21 07:17:50,950 INFO [train.py:561] (1/4) Epoch 1584, batch 12, global_batch_idx: 25340, batch size: 152, loss[dur_loss=0.2279, prior_loss=0.9815, diff_loss=0.3409, tot_loss=1.55, over 152.00 samples.], tot_loss[dur_loss=0.2278, prior_loss=0.9813, diff_loss=0.3794, tot_loss=1.589, over 1966.00 samples.], 2024-10-21 07:17:55,344 INFO [train.py:682] (1/4) Start epoch 1585 2024-10-21 07:18:12,252 INFO [train.py:561] (1/4) Epoch 1585, batch 6, global_batch_idx: 25350, batch size: 106, loss[dur_loss=0.229, prior_loss=0.9815, diff_loss=0.2944, tot_loss=1.505, over 106.00 samples.], tot_loss[dur_loss=0.2262, prior_loss=0.9808, diff_loss=0.4085, tot_loss=1.616, over 1142.00 samples.], 2024-10-21 07:18:25,190 INFO [train.py:682] (1/4) Start epoch 1586 2024-10-21 07:18:34,037 INFO [train.py:561] (1/4) Epoch 1586, batch 0, global_batch_idx: 25360, batch size: 108, loss[dur_loss=0.2344, prior_loss=0.9822, diff_loss=0.2935, tot_loss=1.51, over 108.00 samples.], tot_loss[dur_loss=0.2344, prior_loss=0.9822, diff_loss=0.2935, tot_loss=1.51, over 108.00 samples.], 2024-10-21 07:18:48,174 INFO [train.py:561] (1/4) Epoch 1586, batch 10, global_batch_idx: 25370, batch size: 111, loss[dur_loss=0.2337, prior_loss=0.9825, diff_loss=0.3049, tot_loss=1.521, over 111.00 samples.], tot_loss[dur_loss=0.2274, prior_loss=0.9811, diff_loss=0.3746, tot_loss=1.583, over 1656.00 samples.], 2024-10-21 07:18:55,250 INFO [train.py:682] (1/4) Start epoch 1587 2024-10-21 07:19:08,767 INFO [train.py:561] (1/4) Epoch 1587, batch 4, global_batch_idx: 25380, batch size: 189, loss[dur_loss=0.2302, prior_loss=0.9816, diff_loss=0.3442, tot_loss=1.556, over 189.00 samples.], tot_loss[dur_loss=0.2251, prior_loss=0.9806, diff_loss=0.4418, tot_loss=1.647, over 937.00 samples.], 2024-10-21 07:19:23,464 INFO [train.py:561] (1/4) Epoch 1587, batch 14, global_batch_idx: 25390, batch size: 142, loss[dur_loss=0.2319, prior_loss=0.9812, diff_loss=0.3272, tot_loss=1.54, over 
142.00 samples.], tot_loss[dur_loss=0.2293, prior_loss=0.9812, diff_loss=0.3733, tot_loss=1.584, over 2210.00 samples.], 2024-10-21 07:19:24,888 INFO [train.py:682] (1/4) Start epoch 1588 2024-10-21 07:19:44,943 INFO [train.py:561] (1/4) Epoch 1588, batch 8, global_batch_idx: 25400, batch size: 170, loss[dur_loss=0.2326, prior_loss=0.9814, diff_loss=0.3401, tot_loss=1.554, over 170.00 samples.], tot_loss[dur_loss=0.2268, prior_loss=0.9809, diff_loss=0.4, tot_loss=1.608, over 1432.00 samples.], 2024-10-21 07:19:55,026 INFO [train.py:682] (1/4) Start epoch 1589 2024-10-21 07:20:06,295 INFO [train.py:561] (1/4) Epoch 1589, batch 2, global_batch_idx: 25410, batch size: 203, loss[dur_loss=0.2284, prior_loss=0.9815, diff_loss=0.3756, tot_loss=1.586, over 203.00 samples.], tot_loss[dur_loss=0.231, prior_loss=0.9816, diff_loss=0.3565, tot_loss=1.569, over 442.00 samples.], 2024-10-21 07:20:20,333 INFO [train.py:561] (1/4) Epoch 1589, batch 12, global_batch_idx: 25420, batch size: 152, loss[dur_loss=0.2286, prior_loss=0.9816, diff_loss=0.3349, tot_loss=1.545, over 152.00 samples.], tot_loss[dur_loss=0.2285, prior_loss=0.9812, diff_loss=0.382, tot_loss=1.592, over 1966.00 samples.], 2024-10-21 07:20:24,766 INFO [train.py:682] (1/4) Start epoch 1590 2024-10-21 07:20:41,879 INFO [train.py:561] (1/4) Epoch 1590, batch 6, global_batch_idx: 25430, batch size: 106, loss[dur_loss=0.2321, prior_loss=0.9819, diff_loss=0.3391, tot_loss=1.553, over 106.00 samples.], tot_loss[dur_loss=0.2263, prior_loss=0.9809, diff_loss=0.4155, tot_loss=1.623, over 1142.00 samples.], 2024-10-21 07:20:54,812 INFO [train.py:682] (1/4) Start epoch 1591 2024-10-21 07:21:03,887 INFO [train.py:561] (1/4) Epoch 1591, batch 0, global_batch_idx: 25440, batch size: 108, loss[dur_loss=0.2326, prior_loss=0.9824, diff_loss=0.3637, tot_loss=1.579, over 108.00 samples.], tot_loss[dur_loss=0.2326, prior_loss=0.9824, diff_loss=0.3637, tot_loss=1.579, over 108.00 samples.], 2024-10-21 07:21:18,058 INFO [train.py:561] (1/4) Epoch 1591, batch 10, global_batch_idx: 25450, batch size: 111, loss[dur_loss=0.2343, prior_loss=0.9827, diff_loss=0.3748, tot_loss=1.592, over 111.00 samples.], tot_loss[dur_loss=0.2284, prior_loss=0.9813, diff_loss=0.3982, tot_loss=1.608, over 1656.00 samples.], 2024-10-21 07:21:25,137 INFO [train.py:682] (1/4) Start epoch 1592 2024-10-21 07:21:38,878 INFO [train.py:561] (1/4) Epoch 1592, batch 4, global_batch_idx: 25460, batch size: 189, loss[dur_loss=0.2299, prior_loss=0.9819, diff_loss=0.3255, tot_loss=1.537, over 189.00 samples.], tot_loss[dur_loss=0.2257, prior_loss=0.9807, diff_loss=0.4316, tot_loss=1.638, over 937.00 samples.], 2024-10-21 07:21:53,620 INFO [train.py:561] (1/4) Epoch 1592, batch 14, global_batch_idx: 25470, batch size: 142, loss[dur_loss=0.227, prior_loss=0.981, diff_loss=0.3085, tot_loss=1.517, over 142.00 samples.], tot_loss[dur_loss=0.2288, prior_loss=0.9813, diff_loss=0.3687, tot_loss=1.579, over 2210.00 samples.], 2024-10-21 07:21:55,052 INFO [train.py:682] (1/4) Start epoch 1593 2024-10-21 07:22:14,676 INFO [train.py:561] (1/4) Epoch 1593, batch 8, global_batch_idx: 25480, batch size: 170, loss[dur_loss=0.2328, prior_loss=0.9817, diff_loss=0.3511, tot_loss=1.566, over 170.00 samples.], tot_loss[dur_loss=0.2285, prior_loss=0.981, diff_loss=0.4056, tot_loss=1.615, over 1432.00 samples.], 2024-10-21 07:22:24,756 INFO [train.py:682] (1/4) Start epoch 1594 2024-10-21 07:22:36,361 INFO [train.py:561] (1/4) Epoch 1594, batch 2, global_batch_idx: 25490, batch size: 203, loss[dur_loss=0.2284, 
prior_loss=0.9813, diff_loss=0.3728, tot_loss=1.583, over 203.00 samples.], tot_loss[dur_loss=0.2303, prior_loss=0.9816, diff_loss=0.3444, tot_loss=1.556, over 442.00 samples.], 2024-10-21 07:22:50,422 INFO [train.py:561] (1/4) Epoch 1594, batch 12, global_batch_idx: 25500, batch size: 152, loss[dur_loss=0.2319, prior_loss=0.9814, diff_loss=0.2891, tot_loss=1.502, over 152.00 samples.], tot_loss[dur_loss=0.228, prior_loss=0.9811, diff_loss=0.3839, tot_loss=1.593, over 1966.00 samples.], 2024-10-21 07:22:52,054 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 07:23:49,064 INFO [train.py:589] (1/4) Epoch 1594, validation: dur_loss=0.4527, prior_loss=1.034, diff_loss=0.3886, tot_loss=1.875, over 100.00 samples. 2024-10-21 07:23:49,065 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 07:23:51,891 INFO [train.py:682] (1/4) Start epoch 1595 2024-10-21 07:24:08,749 INFO [train.py:561] (1/4) Epoch 1595, batch 6, global_batch_idx: 25510, batch size: 106, loss[dur_loss=0.2313, prior_loss=0.9819, diff_loss=0.3059, tot_loss=1.519, over 106.00 samples.], tot_loss[dur_loss=0.2242, prior_loss=0.9807, diff_loss=0.4157, tot_loss=1.621, over 1142.00 samples.], 2024-10-21 07:24:21,647 INFO [train.py:682] (1/4) Start epoch 1596 2024-10-21 07:24:30,258 INFO [train.py:561] (1/4) Epoch 1596, batch 0, global_batch_idx: 25520, batch size: 108, loss[dur_loss=0.2321, prior_loss=0.9824, diff_loss=0.3287, tot_loss=1.543, over 108.00 samples.], tot_loss[dur_loss=0.2321, prior_loss=0.9824, diff_loss=0.3287, tot_loss=1.543, over 108.00 samples.], 2024-10-21 07:24:44,383 INFO [train.py:561] (1/4) Epoch 1596, batch 10, global_batch_idx: 25530, batch size: 111, loss[dur_loss=0.233, prior_loss=0.9825, diff_loss=0.2997, tot_loss=1.515, over 111.00 samples.], tot_loss[dur_loss=0.228, prior_loss=0.9811, diff_loss=0.3909, tot_loss=1.6, over 1656.00 samples.], 2024-10-21 07:24:51,450 INFO [train.py:682] (1/4) Start epoch 1597 2024-10-21 07:25:05,192 INFO [train.py:561] (1/4) Epoch 1597, batch 4, global_batch_idx: 25540, batch size: 189, loss[dur_loss=0.2291, prior_loss=0.9816, diff_loss=0.3522, tot_loss=1.563, over 189.00 samples.], tot_loss[dur_loss=0.2244, prior_loss=0.9805, diff_loss=0.4513, tot_loss=1.656, over 937.00 samples.], 2024-10-21 07:25:19,828 INFO [train.py:561] (1/4) Epoch 1597, batch 14, global_batch_idx: 25550, batch size: 142, loss[dur_loss=0.2285, prior_loss=0.9809, diff_loss=0.3642, tot_loss=1.574, over 142.00 samples.], tot_loss[dur_loss=0.2284, prior_loss=0.9811, diff_loss=0.3804, tot_loss=1.59, over 2210.00 samples.], 2024-10-21 07:25:21,232 INFO [train.py:682] (1/4) Start epoch 1598 2024-10-21 07:25:40,893 INFO [train.py:561] (1/4) Epoch 1598, batch 8, global_batch_idx: 25560, batch size: 170, loss[dur_loss=0.2358, prior_loss=0.9819, diff_loss=0.3236, tot_loss=1.541, over 170.00 samples.], tot_loss[dur_loss=0.2272, prior_loss=0.9809, diff_loss=0.4071, tot_loss=1.615, over 1432.00 samples.], 2024-10-21 07:25:50,886 INFO [train.py:682] (1/4) Start epoch 1599 2024-10-21 07:26:02,516 INFO [train.py:561] (1/4) Epoch 1599, batch 2, global_batch_idx: 25570, batch size: 203, loss[dur_loss=0.2308, prior_loss=0.9815, diff_loss=0.3352, tot_loss=1.547, over 203.00 samples.], tot_loss[dur_loss=0.2295, prior_loss=0.9816, diff_loss=0.3338, tot_loss=1.545, over 442.00 samples.], 2024-10-21 07:26:16,570 INFO [train.py:561] (1/4) Epoch 1599, batch 12, global_batch_idx: 25580, batch size: 152, loss[dur_loss=0.2268, prior_loss=0.9812, diff_loss=0.3604, tot_loss=1.568, over 152.00 
samples.], tot_loss[dur_loss=0.2273, prior_loss=0.981, diff_loss=0.3786, tot_loss=1.587, over 1966.00 samples.], 2024-10-21 07:26:20,955 INFO [train.py:682] (1/4) Start epoch 1600 2024-10-21 07:26:37,863 INFO [train.py:561] (1/4) Epoch 1600, batch 6, global_batch_idx: 25590, batch size: 106, loss[dur_loss=0.2321, prior_loss=0.9816, diff_loss=0.3277, tot_loss=1.541, over 106.00 samples.], tot_loss[dur_loss=0.2271, prior_loss=0.9806, diff_loss=0.4272, tot_loss=1.635, over 1142.00 samples.], 2024-10-21 07:26:50,740 INFO [train.py:682] (1/4) Start epoch 1601 2024-10-21 07:26:59,552 INFO [train.py:561] (1/4) Epoch 1601, batch 0, global_batch_idx: 25600, batch size: 108, loss[dur_loss=0.2353, prior_loss=0.9823, diff_loss=0.3142, tot_loss=1.532, over 108.00 samples.], tot_loss[dur_loss=0.2353, prior_loss=0.9823, diff_loss=0.3142, tot_loss=1.532, over 108.00 samples.], 2024-10-21 07:27:13,603 INFO [train.py:561] (1/4) Epoch 1601, batch 10, global_batch_idx: 25610, batch size: 111, loss[dur_loss=0.2314, prior_loss=0.9824, diff_loss=0.3014, tot_loss=1.515, over 111.00 samples.], tot_loss[dur_loss=0.2273, prior_loss=0.981, diff_loss=0.3844, tot_loss=1.593, over 1656.00 samples.], 2024-10-21 07:27:20,606 INFO [train.py:682] (1/4) Start epoch 1602 2024-10-21 07:27:34,359 INFO [train.py:561] (1/4) Epoch 1602, batch 4, global_batch_idx: 25620, batch size: 189, loss[dur_loss=0.2238, prior_loss=0.9814, diff_loss=0.3208, tot_loss=1.526, over 189.00 samples.], tot_loss[dur_loss=0.223, prior_loss=0.9804, diff_loss=0.4397, tot_loss=1.643, over 937.00 samples.], 2024-10-21 07:27:49,016 INFO [train.py:561] (1/4) Epoch 1602, batch 14, global_batch_idx: 25630, batch size: 142, loss[dur_loss=0.2273, prior_loss=0.981, diff_loss=0.3525, tot_loss=1.561, over 142.00 samples.], tot_loss[dur_loss=0.227, prior_loss=0.9811, diff_loss=0.376, tot_loss=1.584, over 2210.00 samples.], 2024-10-21 07:27:50,426 INFO [train.py:682] (1/4) Start epoch 1603 2024-10-21 07:28:10,174 INFO [train.py:561] (1/4) Epoch 1603, batch 8, global_batch_idx: 25640, batch size: 170, loss[dur_loss=0.2379, prior_loss=0.9816, diff_loss=0.3375, tot_loss=1.557, over 170.00 samples.], tot_loss[dur_loss=0.2263, prior_loss=0.9808, diff_loss=0.3956, tot_loss=1.603, over 1432.00 samples.], 2024-10-21 07:28:20,240 INFO [train.py:682] (1/4) Start epoch 1604 2024-10-21 07:28:31,408 INFO [train.py:561] (1/4) Epoch 1604, batch 2, global_batch_idx: 25650, batch size: 203, loss[dur_loss=0.2299, prior_loss=0.9814, diff_loss=0.342, tot_loss=1.553, over 203.00 samples.], tot_loss[dur_loss=0.2303, prior_loss=0.9814, diff_loss=0.3218, tot_loss=1.533, over 442.00 samples.], 2024-10-21 07:28:45,508 INFO [train.py:561] (1/4) Epoch 1604, batch 12, global_batch_idx: 25660, batch size: 152, loss[dur_loss=0.2247, prior_loss=0.9813, diff_loss=0.3451, tot_loss=1.551, over 152.00 samples.], tot_loss[dur_loss=0.2276, prior_loss=0.9811, diff_loss=0.3777, tot_loss=1.586, over 1966.00 samples.], 2024-10-21 07:28:49,960 INFO [train.py:682] (1/4) Start epoch 1605 2024-10-21 07:29:06,884 INFO [train.py:561] (1/4) Epoch 1605, batch 6, global_batch_idx: 25670, batch size: 106, loss[dur_loss=0.2312, prior_loss=0.9814, diff_loss=0.302, tot_loss=1.515, over 106.00 samples.], tot_loss[dur_loss=0.2243, prior_loss=0.9806, diff_loss=0.4222, tot_loss=1.627, over 1142.00 samples.], 2024-10-21 07:29:19,812 INFO [train.py:682] (1/4) Start epoch 1606 2024-10-21 07:29:28,595 INFO [train.py:561] (1/4) Epoch 1606, batch 0, global_batch_idx: 25680, batch size: 108, loss[dur_loss=0.2353, prior_loss=0.9824, 
diff_loss=0.2829, tot_loss=1.501, over 108.00 samples.], tot_loss[dur_loss=0.2353, prior_loss=0.9824, diff_loss=0.2829, tot_loss=1.501, over 108.00 samples.], 2024-10-21 07:29:42,740 INFO [train.py:561] (1/4) Epoch 1606, batch 10, global_batch_idx: 25690, batch size: 111, loss[dur_loss=0.2335, prior_loss=0.9823, diff_loss=0.3299, tot_loss=1.546, over 111.00 samples.], tot_loss[dur_loss=0.2271, prior_loss=0.9809, diff_loss=0.3792, tot_loss=1.587, over 1656.00 samples.], 2024-10-21 07:29:49,783 INFO [train.py:682] (1/4) Start epoch 1607 2024-10-21 07:30:03,517 INFO [train.py:561] (1/4) Epoch 1607, batch 4, global_batch_idx: 25700, batch size: 189, loss[dur_loss=0.2277, prior_loss=0.9816, diff_loss=0.3367, tot_loss=1.546, over 189.00 samples.], tot_loss[dur_loss=0.2231, prior_loss=0.9804, diff_loss=0.4414, tot_loss=1.645, over 937.00 samples.], 2024-10-21 07:30:18,280 INFO [train.py:561] (1/4) Epoch 1607, batch 14, global_batch_idx: 25710, batch size: 142, loss[dur_loss=0.2306, prior_loss=0.9813, diff_loss=0.3308, tot_loss=1.543, over 142.00 samples.], tot_loss[dur_loss=0.2279, prior_loss=0.9811, diff_loss=0.3697, tot_loss=1.579, over 2210.00 samples.], 2024-10-21 07:30:19,696 INFO [train.py:682] (1/4) Start epoch 1608 2024-10-21 07:30:39,599 INFO [train.py:561] (1/4) Epoch 1608, batch 8, global_batch_idx: 25720, batch size: 170, loss[dur_loss=0.237, prior_loss=0.9818, diff_loss=0.2958, tot_loss=1.515, over 170.00 samples.], tot_loss[dur_loss=0.2265, prior_loss=0.981, diff_loss=0.3857, tot_loss=1.593, over 1432.00 samples.], 2024-10-21 07:30:49,606 INFO [train.py:682] (1/4) Start epoch 1609 2024-10-21 07:31:00,777 INFO [train.py:561] (1/4) Epoch 1609, batch 2, global_batch_idx: 25730, batch size: 203, loss[dur_loss=0.2258, prior_loss=0.9812, diff_loss=0.347, tot_loss=1.554, over 203.00 samples.], tot_loss[dur_loss=0.2278, prior_loss=0.9814, diff_loss=0.3226, tot_loss=1.532, over 442.00 samples.], 2024-10-21 07:31:15,031 INFO [train.py:561] (1/4) Epoch 1609, batch 12, global_batch_idx: 25740, batch size: 152, loss[dur_loss=0.2251, prior_loss=0.981, diff_loss=0.2966, tot_loss=1.503, over 152.00 samples.], tot_loss[dur_loss=0.2267, prior_loss=0.981, diff_loss=0.3773, tot_loss=1.585, over 1966.00 samples.], 2024-10-21 07:31:19,463 INFO [train.py:682] (1/4) Start epoch 1610 2024-10-21 07:31:36,462 INFO [train.py:561] (1/4) Epoch 1610, batch 6, global_batch_idx: 25750, batch size: 106, loss[dur_loss=0.2309, prior_loss=0.9815, diff_loss=0.3211, tot_loss=1.533, over 106.00 samples.], tot_loss[dur_loss=0.2245, prior_loss=0.9806, diff_loss=0.425, tot_loss=1.63, over 1142.00 samples.], 2024-10-21 07:31:49,443 INFO [train.py:682] (1/4) Start epoch 1611 2024-10-21 07:31:58,001 INFO [train.py:561] (1/4) Epoch 1611, batch 0, global_batch_idx: 25760, batch size: 108, loss[dur_loss=0.2363, prior_loss=0.9822, diff_loss=0.3298, tot_loss=1.548, over 108.00 samples.], tot_loss[dur_loss=0.2363, prior_loss=0.9822, diff_loss=0.3298, tot_loss=1.548, over 108.00 samples.], 2024-10-21 07:32:12,265 INFO [train.py:561] (1/4) Epoch 1611, batch 10, global_batch_idx: 25770, batch size: 111, loss[dur_loss=0.2335, prior_loss=0.9825, diff_loss=0.316, tot_loss=1.532, over 111.00 samples.], tot_loss[dur_loss=0.2274, prior_loss=0.981, diff_loss=0.3908, tot_loss=1.599, over 1656.00 samples.], 2024-10-21 07:32:19,278 INFO [train.py:682] (1/4) Start epoch 1612 2024-10-21 07:32:32,879 INFO [train.py:561] (1/4) Epoch 1612, batch 4, global_batch_idx: 25780, batch size: 189, loss[dur_loss=0.2323, prior_loss=0.9816, diff_loss=0.3547, 
tot_loss=1.569, over 189.00 samples.], tot_loss[dur_loss=0.2255, prior_loss=0.9804, diff_loss=0.4417, tot_loss=1.648, over 937.00 samples.], 2024-10-21 07:32:47,640 INFO [train.py:561] (1/4) Epoch 1612, batch 14, global_batch_idx: 25790, batch size: 142, loss[dur_loss=0.2307, prior_loss=0.981, diff_loss=0.3106, tot_loss=1.522, over 142.00 samples.], tot_loss[dur_loss=0.2289, prior_loss=0.9811, diff_loss=0.3692, tot_loss=1.579, over 2210.00 samples.], 2024-10-21 07:32:49,059 INFO [train.py:682] (1/4) Start epoch 1613 2024-10-21 07:33:08,696 INFO [train.py:561] (1/4) Epoch 1613, batch 8, global_batch_idx: 25800, batch size: 170, loss[dur_loss=0.2353, prior_loss=0.9817, diff_loss=0.3407, tot_loss=1.558, over 170.00 samples.], tot_loss[dur_loss=0.2274, prior_loss=0.9809, diff_loss=0.3967, tot_loss=1.605, over 1432.00 samples.], 2024-10-21 07:33:18,834 INFO [train.py:682] (1/4) Start epoch 1614 2024-10-21 07:33:29,901 INFO [train.py:561] (1/4) Epoch 1614, batch 2, global_batch_idx: 25810, batch size: 203, loss[dur_loss=0.226, prior_loss=0.9815, diff_loss=0.341, tot_loss=1.548, over 203.00 samples.], tot_loss[dur_loss=0.2285, prior_loss=0.9816, diff_loss=0.3307, tot_loss=1.541, over 442.00 samples.], 2024-10-21 07:33:43,991 INFO [train.py:561] (1/4) Epoch 1614, batch 12, global_batch_idx: 25820, batch size: 152, loss[dur_loss=0.2296, prior_loss=0.9813, diff_loss=0.3303, tot_loss=1.541, over 152.00 samples.], tot_loss[dur_loss=0.2273, prior_loss=0.9811, diff_loss=0.3767, tot_loss=1.585, over 1966.00 samples.], 2024-10-21 07:33:48,376 INFO [train.py:682] (1/4) Start epoch 1615 2024-10-21 07:34:05,474 INFO [train.py:561] (1/4) Epoch 1615, batch 6, global_batch_idx: 25830, batch size: 106, loss[dur_loss=0.234, prior_loss=0.9816, diff_loss=0.3189, tot_loss=1.535, over 106.00 samples.], tot_loss[dur_loss=0.2246, prior_loss=0.9806, diff_loss=0.4128, tot_loss=1.618, over 1142.00 samples.], 2024-10-21 07:34:18,561 INFO [train.py:682] (1/4) Start epoch 1616 2024-10-21 07:34:27,468 INFO [train.py:561] (1/4) Epoch 1616, batch 0, global_batch_idx: 25840, batch size: 108, loss[dur_loss=0.2323, prior_loss=0.982, diff_loss=0.3035, tot_loss=1.518, over 108.00 samples.], tot_loss[dur_loss=0.2323, prior_loss=0.982, diff_loss=0.3035, tot_loss=1.518, over 108.00 samples.], 2024-10-21 07:34:41,688 INFO [train.py:561] (1/4) Epoch 1616, batch 10, global_batch_idx: 25850, batch size: 111, loss[dur_loss=0.2363, prior_loss=0.9828, diff_loss=0.3325, tot_loss=1.552, over 111.00 samples.], tot_loss[dur_loss=0.2277, prior_loss=0.9811, diff_loss=0.3959, tot_loss=1.605, over 1656.00 samples.], 2024-10-21 07:34:48,828 INFO [train.py:682] (1/4) Start epoch 1617 2024-10-21 07:35:02,552 INFO [train.py:561] (1/4) Epoch 1617, batch 4, global_batch_idx: 25860, batch size: 189, loss[dur_loss=0.229, prior_loss=0.9812, diff_loss=0.3547, tot_loss=1.565, over 189.00 samples.], tot_loss[dur_loss=0.2235, prior_loss=0.9802, diff_loss=0.4375, tot_loss=1.641, over 937.00 samples.], 2024-10-21 07:35:17,286 INFO [train.py:561] (1/4) Epoch 1617, batch 14, global_batch_idx: 25870, batch size: 142, loss[dur_loss=0.2276, prior_loss=0.9809, diff_loss=0.2991, tot_loss=1.508, over 142.00 samples.], tot_loss[dur_loss=0.2279, prior_loss=0.981, diff_loss=0.3717, tot_loss=1.581, over 2210.00 samples.], 2024-10-21 07:35:18,717 INFO [train.py:682] (1/4) Start epoch 1618 2024-10-21 07:35:38,349 INFO [train.py:561] (1/4) Epoch 1618, batch 8, global_batch_idx: 25880, batch size: 170, loss[dur_loss=0.2344, prior_loss=0.9816, diff_loss=0.3378, tot_loss=1.554, over 
170.00 samples.], tot_loss[dur_loss=0.2273, prior_loss=0.9807, diff_loss=0.3947, tot_loss=1.603, over 1432.00 samples.], 2024-10-21 07:35:48,358 INFO [train.py:682] (1/4) Start epoch 1619 2024-10-21 07:35:59,479 INFO [train.py:561] (1/4) Epoch 1619, batch 2, global_batch_idx: 25890, batch size: 203, loss[dur_loss=0.228, prior_loss=0.9812, diff_loss=0.3441, tot_loss=1.553, over 203.00 samples.], tot_loss[dur_loss=0.2304, prior_loss=0.9813, diff_loss=0.3284, tot_loss=1.54, over 442.00 samples.], 2024-10-21 07:36:13,638 INFO [train.py:561] (1/4) Epoch 1619, batch 12, global_batch_idx: 25900, batch size: 152, loss[dur_loss=0.228, prior_loss=0.9815, diff_loss=0.3256, tot_loss=1.535, over 152.00 samples.], tot_loss[dur_loss=0.2275, prior_loss=0.981, diff_loss=0.3773, tot_loss=1.586, over 1966.00 samples.], 2024-10-21 07:36:18,221 INFO [train.py:682] (1/4) Start epoch 1620 2024-10-21 07:36:35,296 INFO [train.py:561] (1/4) Epoch 1620, batch 6, global_batch_idx: 25910, batch size: 106, loss[dur_loss=0.2326, prior_loss=0.9815, diff_loss=0.2839, tot_loss=1.498, over 106.00 samples.], tot_loss[dur_loss=0.2249, prior_loss=0.9805, diff_loss=0.4135, tot_loss=1.619, over 1142.00 samples.], 2024-10-21 07:36:48,297 INFO [train.py:682] (1/4) Start epoch 1621 2024-10-21 07:36:56,886 INFO [train.py:561] (1/4) Epoch 1621, batch 0, global_batch_idx: 25920, batch size: 108, loss[dur_loss=0.2303, prior_loss=0.982, diff_loss=0.3272, tot_loss=1.539, over 108.00 samples.], tot_loss[dur_loss=0.2303, prior_loss=0.982, diff_loss=0.3272, tot_loss=1.539, over 108.00 samples.], 2024-10-21 07:37:10,984 INFO [train.py:561] (1/4) Epoch 1621, batch 10, global_batch_idx: 25930, batch size: 111, loss[dur_loss=0.2367, prior_loss=0.9831, diff_loss=0.3529, tot_loss=1.573, over 111.00 samples.], tot_loss[dur_loss=0.2279, prior_loss=0.981, diff_loss=0.3921, tot_loss=1.601, over 1656.00 samples.], 2024-10-21 07:37:18,000 INFO [train.py:682] (1/4) Start epoch 1622 2024-10-21 07:37:31,427 INFO [train.py:561] (1/4) Epoch 1622, batch 4, global_batch_idx: 25940, batch size: 189, loss[dur_loss=0.2257, prior_loss=0.9811, diff_loss=0.387, tot_loss=1.594, over 189.00 samples.], tot_loss[dur_loss=0.2219, prior_loss=0.9803, diff_loss=0.4452, tot_loss=1.647, over 937.00 samples.], 2024-10-21 07:37:46,179 INFO [train.py:561] (1/4) Epoch 1622, batch 14, global_batch_idx: 25950, batch size: 142, loss[dur_loss=0.2272, prior_loss=0.9808, diff_loss=0.338, tot_loss=1.546, over 142.00 samples.], tot_loss[dur_loss=0.2264, prior_loss=0.9809, diff_loss=0.3735, tot_loss=1.581, over 2210.00 samples.], 2024-10-21 07:37:47,603 INFO [train.py:682] (1/4) Start epoch 1623 2024-10-21 07:38:07,237 INFO [train.py:561] (1/4) Epoch 1623, batch 8, global_batch_idx: 25960, batch size: 170, loss[dur_loss=0.2331, prior_loss=0.9813, diff_loss=0.3406, tot_loss=1.555, over 170.00 samples.], tot_loss[dur_loss=0.2266, prior_loss=0.9806, diff_loss=0.4058, tot_loss=1.613, over 1432.00 samples.], 2024-10-21 07:38:17,287 INFO [train.py:682] (1/4) Start epoch 1624 2024-10-21 07:38:28,390 INFO [train.py:561] (1/4) Epoch 1624, batch 2, global_batch_idx: 25970, batch size: 203, loss[dur_loss=0.2303, prior_loss=0.9815, diff_loss=0.3519, tot_loss=1.564, over 203.00 samples.], tot_loss[dur_loss=0.2297, prior_loss=0.9814, diff_loss=0.3319, tot_loss=1.543, over 442.00 samples.], 2024-10-21 07:38:42,521 INFO [train.py:561] (1/4) Epoch 1624, batch 12, global_batch_idx: 25980, batch size: 152, loss[dur_loss=0.2299, prior_loss=0.9815, diff_loss=0.3452, tot_loss=1.557, over 152.00 samples.], 
tot_loss[dur_loss=0.2268, prior_loss=0.981, diff_loss=0.3747, tot_loss=1.582, over 1966.00 samples.], 2024-10-21 07:38:46,934 INFO [train.py:682] (1/4) Start epoch 1625 2024-10-21 07:39:03,627 INFO [train.py:561] (1/4) Epoch 1625, batch 6, global_batch_idx: 25990, batch size: 106, loss[dur_loss=0.2293, prior_loss=0.9815, diff_loss=0.3255, tot_loss=1.536, over 106.00 samples.], tot_loss[dur_loss=0.2239, prior_loss=0.9806, diff_loss=0.4222, tot_loss=1.627, over 1142.00 samples.], 2024-10-21 07:39:16,543 INFO [train.py:682] (1/4) Start epoch 1626 2024-10-21 07:39:25,253 INFO [train.py:561] (1/4) Epoch 1626, batch 0, global_batch_idx: 26000, batch size: 108, loss[dur_loss=0.2298, prior_loss=0.9821, diff_loss=0.3189, tot_loss=1.531, over 108.00 samples.], tot_loss[dur_loss=0.2298, prior_loss=0.9821, diff_loss=0.3189, tot_loss=1.531, over 108.00 samples.], 2024-10-21 07:39:39,527 INFO [train.py:561] (1/4) Epoch 1626, batch 10, global_batch_idx: 26010, batch size: 111, loss[dur_loss=0.2348, prior_loss=0.9829, diff_loss=0.2615, tot_loss=1.479, over 111.00 samples.], tot_loss[dur_loss=0.2281, prior_loss=0.981, diff_loss=0.3893, tot_loss=1.598, over 1656.00 samples.], 2024-10-21 07:39:46,616 INFO [train.py:682] (1/4) Start epoch 1627 2024-10-21 07:40:00,345 INFO [train.py:561] (1/4) Epoch 1627, batch 4, global_batch_idx: 26020, batch size: 189, loss[dur_loss=0.2273, prior_loss=0.9815, diff_loss=0.3662, tot_loss=1.575, over 189.00 samples.], tot_loss[dur_loss=0.2238, prior_loss=0.9806, diff_loss=0.4508, tot_loss=1.655, over 937.00 samples.], 2024-10-21 07:40:15,173 INFO [train.py:561] (1/4) Epoch 1627, batch 14, global_batch_idx: 26030, batch size: 142, loss[dur_loss=0.2295, prior_loss=0.981, diff_loss=0.3345, tot_loss=1.545, over 142.00 samples.], tot_loss[dur_loss=0.2277, prior_loss=0.9812, diff_loss=0.3762, tot_loss=1.585, over 2210.00 samples.], 2024-10-21 07:40:16,592 INFO [train.py:682] (1/4) Start epoch 1628 2024-10-21 07:40:36,295 INFO [train.py:561] (1/4) Epoch 1628, batch 8, global_batch_idx: 26040, batch size: 170, loss[dur_loss=0.2354, prior_loss=0.9817, diff_loss=0.3354, tot_loss=1.553, over 170.00 samples.], tot_loss[dur_loss=0.226, prior_loss=0.9808, diff_loss=0.4028, tot_loss=1.61, over 1432.00 samples.], 2024-10-21 07:40:46,311 INFO [train.py:682] (1/4) Start epoch 1629 2024-10-21 07:40:57,512 INFO [train.py:561] (1/4) Epoch 1629, batch 2, global_batch_idx: 26050, batch size: 203, loss[dur_loss=0.228, prior_loss=0.9814, diff_loss=0.3626, tot_loss=1.572, over 203.00 samples.], tot_loss[dur_loss=0.2287, prior_loss=0.9816, diff_loss=0.334, tot_loss=1.544, over 442.00 samples.], 2024-10-21 07:41:11,638 INFO [train.py:561] (1/4) Epoch 1629, batch 12, global_batch_idx: 26060, batch size: 152, loss[dur_loss=0.2293, prior_loss=0.9813, diff_loss=0.3123, tot_loss=1.523, over 152.00 samples.], tot_loss[dur_loss=0.2276, prior_loss=0.9809, diff_loss=0.3827, tot_loss=1.591, over 1966.00 samples.], 2024-10-21 07:41:16,055 INFO [train.py:682] (1/4) Start epoch 1630 2024-10-21 07:41:32,781 INFO [train.py:561] (1/4) Epoch 1630, batch 6, global_batch_idx: 26070, batch size: 106, loss[dur_loss=0.2333, prior_loss=0.9813, diff_loss=0.326, tot_loss=1.541, over 106.00 samples.], tot_loss[dur_loss=0.2249, prior_loss=0.9806, diff_loss=0.3997, tot_loss=1.605, over 1142.00 samples.], 2024-10-21 07:41:45,704 INFO [train.py:682] (1/4) Start epoch 1631 2024-10-21 07:41:54,458 INFO [train.py:561] (1/4) Epoch 1631, batch 0, global_batch_idx: 26080, batch size: 108, loss[dur_loss=0.2325, prior_loss=0.9819, 
diff_loss=0.3204, tot_loss=1.535, over 108.00 samples.], tot_loss[dur_loss=0.2325, prior_loss=0.9819, diff_loss=0.3204, tot_loss=1.535, over 108.00 samples.], 2024-10-21 07:42:08,504 INFO [train.py:561] (1/4) Epoch 1631, batch 10, global_batch_idx: 26090, batch size: 111, loss[dur_loss=0.2327, prior_loss=0.9822, diff_loss=0.3379, tot_loss=1.553, over 111.00 samples.], tot_loss[dur_loss=0.2255, prior_loss=0.9807, diff_loss=0.3959, tot_loss=1.602, over 1656.00 samples.], 2024-10-21 07:42:15,536 INFO [train.py:682] (1/4) Start epoch 1632 2024-10-21 07:42:29,070 INFO [train.py:561] (1/4) Epoch 1632, batch 4, global_batch_idx: 26100, batch size: 189, loss[dur_loss=0.2274, prior_loss=0.9811, diff_loss=0.3538, tot_loss=1.562, over 189.00 samples.], tot_loss[dur_loss=0.2232, prior_loss=0.9803, diff_loss=0.4469, tot_loss=1.65, over 937.00 samples.], 2024-10-21 07:42:43,763 INFO [train.py:561] (1/4) Epoch 1632, batch 14, global_batch_idx: 26110, batch size: 142, loss[dur_loss=0.2254, prior_loss=0.9806, diff_loss=0.3302, tot_loss=1.536, over 142.00 samples.], tot_loss[dur_loss=0.2271, prior_loss=0.9809, diff_loss=0.3711, tot_loss=1.579, over 2210.00 samples.], 2024-10-21 07:42:45,176 INFO [train.py:682] (1/4) Start epoch 1633 2024-10-21 07:43:04,716 INFO [train.py:561] (1/4) Epoch 1633, batch 8, global_batch_idx: 26120, batch size: 170, loss[dur_loss=0.2333, prior_loss=0.9812, diff_loss=0.3507, tot_loss=1.565, over 170.00 samples.], tot_loss[dur_loss=0.2266, prior_loss=0.9807, diff_loss=0.4106, tot_loss=1.618, over 1432.00 samples.], 2024-10-21 07:43:14,750 INFO [train.py:682] (1/4) Start epoch 1634 2024-10-21 07:43:25,972 INFO [train.py:561] (1/4) Epoch 1634, batch 2, global_batch_idx: 26130, batch size: 203, loss[dur_loss=0.2298, prior_loss=0.9814, diff_loss=0.3415, tot_loss=1.553, over 203.00 samples.], tot_loss[dur_loss=0.229, prior_loss=0.9815, diff_loss=0.3217, tot_loss=1.532, over 442.00 samples.], 2024-10-21 07:43:40,031 INFO [train.py:561] (1/4) Epoch 1634, batch 12, global_batch_idx: 26140, batch size: 152, loss[dur_loss=0.2304, prior_loss=0.9812, diff_loss=0.3357, tot_loss=1.547, over 152.00 samples.], tot_loss[dur_loss=0.2278, prior_loss=0.9809, diff_loss=0.3667, tot_loss=1.575, over 1966.00 samples.], 2024-10-21 07:43:44,410 INFO [train.py:682] (1/4) Start epoch 1635 2024-10-21 07:44:00,988 INFO [train.py:561] (1/4) Epoch 1635, batch 6, global_batch_idx: 26150, batch size: 106, loss[dur_loss=0.2298, prior_loss=0.9813, diff_loss=0.3185, tot_loss=1.53, over 106.00 samples.], tot_loss[dur_loss=0.2236, prior_loss=0.9805, diff_loss=0.4217, tot_loss=1.626, over 1142.00 samples.], 2024-10-21 07:44:13,895 INFO [train.py:682] (1/4) Start epoch 1636 2024-10-21 07:44:22,314 INFO [train.py:561] (1/4) Epoch 1636, batch 0, global_batch_idx: 26160, batch size: 108, loss[dur_loss=0.2329, prior_loss=0.9822, diff_loss=0.3377, tot_loss=1.553, over 108.00 samples.], tot_loss[dur_loss=0.2329, prior_loss=0.9822, diff_loss=0.3377, tot_loss=1.553, over 108.00 samples.], 2024-10-21 07:44:36,351 INFO [train.py:561] (1/4) Epoch 1636, batch 10, global_batch_idx: 26170, batch size: 111, loss[dur_loss=0.2362, prior_loss=0.9824, diff_loss=0.2811, tot_loss=1.5, over 111.00 samples.], tot_loss[dur_loss=0.2271, prior_loss=0.9809, diff_loss=0.3805, tot_loss=1.589, over 1656.00 samples.], 2024-10-21 07:44:43,335 INFO [train.py:682] (1/4) Start epoch 1637 2024-10-21 07:44:56,908 INFO [train.py:561] (1/4) Epoch 1637, batch 4, global_batch_idx: 26180, batch size: 189, loss[dur_loss=0.2288, prior_loss=0.9814, diff_loss=0.3476, 
tot_loss=1.558, over 189.00 samples.], tot_loss[dur_loss=0.2238, prior_loss=0.9805, diff_loss=0.4397, tot_loss=1.644, over 937.00 samples.], 2024-10-21 07:45:11,697 INFO [train.py:561] (1/4) Epoch 1637, batch 14, global_batch_idx: 26190, batch size: 142, loss[dur_loss=0.2256, prior_loss=0.9806, diff_loss=0.304, tot_loss=1.51, over 142.00 samples.], tot_loss[dur_loss=0.2276, prior_loss=0.9809, diff_loss=0.3753, tot_loss=1.584, over 2210.00 samples.], 2024-10-21 07:45:13,117 INFO [train.py:682] (1/4) Start epoch 1638 2024-10-21 07:45:33,234 INFO [train.py:561] (1/4) Epoch 1638, batch 8, global_batch_idx: 26200, batch size: 170, loss[dur_loss=0.2373, prior_loss=0.9812, diff_loss=0.3308, tot_loss=1.549, over 170.00 samples.], tot_loss[dur_loss=0.2272, prior_loss=0.9806, diff_loss=0.4049, tot_loss=1.613, over 1432.00 samples.], 2024-10-21 07:45:43,398 INFO [train.py:682] (1/4) Start epoch 1639 2024-10-21 07:45:54,999 INFO [train.py:561] (1/4) Epoch 1639, batch 2, global_batch_idx: 26210, batch size: 203, loss[dur_loss=0.2251, prior_loss=0.9813, diff_loss=0.3164, tot_loss=1.523, over 203.00 samples.], tot_loss[dur_loss=0.2269, prior_loss=0.9813, diff_loss=0.3297, tot_loss=1.538, over 442.00 samples.], 2024-10-21 07:46:09,528 INFO [train.py:561] (1/4) Epoch 1639, batch 12, global_batch_idx: 26220, batch size: 152, loss[dur_loss=0.2289, prior_loss=0.9813, diff_loss=0.3657, tot_loss=1.576, over 152.00 samples.], tot_loss[dur_loss=0.2253, prior_loss=0.9809, diff_loss=0.3724, tot_loss=1.579, over 1966.00 samples.], 2024-10-21 07:46:13,982 INFO [train.py:682] (1/4) Start epoch 1640 2024-10-21 07:46:31,218 INFO [train.py:561] (1/4) Epoch 1640, batch 6, global_batch_idx: 26230, batch size: 106, loss[dur_loss=0.2296, prior_loss=0.9813, diff_loss=0.3272, tot_loss=1.538, over 106.00 samples.], tot_loss[dur_loss=0.2249, prior_loss=0.9806, diff_loss=0.4182, tot_loss=1.624, over 1142.00 samples.], 2024-10-21 07:46:44,177 INFO [train.py:682] (1/4) Start epoch 1641 2024-10-21 07:46:52,951 INFO [train.py:561] (1/4) Epoch 1641, batch 0, global_batch_idx: 26240, batch size: 108, loss[dur_loss=0.2309, prior_loss=0.9817, diff_loss=0.3671, tot_loss=1.58, over 108.00 samples.], tot_loss[dur_loss=0.2309, prior_loss=0.9817, diff_loss=0.3671, tot_loss=1.58, over 108.00 samples.], 2024-10-21 07:47:07,337 INFO [train.py:561] (1/4) Epoch 1641, batch 10, global_batch_idx: 26250, batch size: 111, loss[dur_loss=0.2306, prior_loss=0.9822, diff_loss=0.3118, tot_loss=1.525, over 111.00 samples.], tot_loss[dur_loss=0.226, prior_loss=0.9808, diff_loss=0.3826, tot_loss=1.589, over 1656.00 samples.], 2024-10-21 07:47:14,368 INFO [train.py:682] (1/4) Start epoch 1642 2024-10-21 07:47:28,002 INFO [train.py:561] (1/4) Epoch 1642, batch 4, global_batch_idx: 26260, batch size: 189, loss[dur_loss=0.2269, prior_loss=0.9812, diff_loss=0.3414, tot_loss=1.549, over 189.00 samples.], tot_loss[dur_loss=0.2225, prior_loss=0.9801, diff_loss=0.4505, tot_loss=1.653, over 937.00 samples.], 2024-10-21 07:47:42,787 INFO [train.py:561] (1/4) Epoch 1642, batch 14, global_batch_idx: 26270, batch size: 142, loss[dur_loss=0.227, prior_loss=0.9809, diff_loss=0.3279, tot_loss=1.536, over 142.00 samples.], tot_loss[dur_loss=0.2267, prior_loss=0.9808, diff_loss=0.3784, tot_loss=1.586, over 2210.00 samples.], 2024-10-21 07:47:44,204 INFO [train.py:682] (1/4) Start epoch 1643 2024-10-21 07:48:03,853 INFO [train.py:561] (1/4) Epoch 1643, batch 8, global_batch_idx: 26280, batch size: 170, loss[dur_loss=0.2329, prior_loss=0.9812, diff_loss=0.3128, tot_loss=1.527, 
over 170.00 samples.], tot_loss[dur_loss=0.2241, prior_loss=0.9805, diff_loss=0.393, tot_loss=1.598, over 1432.00 samples.], 2024-10-21 07:48:13,877 INFO [train.py:682] (1/4) Start epoch 1644 2024-10-21 07:48:24,994 INFO [train.py:561] (1/4) Epoch 1644, batch 2, global_batch_idx: 26290, batch size: 203, loss[dur_loss=0.2272, prior_loss=0.9811, diff_loss=0.3805, tot_loss=1.589, over 203.00 samples.], tot_loss[dur_loss=0.2279, prior_loss=0.9813, diff_loss=0.3547, tot_loss=1.564, over 442.00 samples.], 2024-10-21 07:48:39,265 INFO [train.py:561] (1/4) Epoch 1644, batch 12, global_batch_idx: 26300, batch size: 152, loss[dur_loss=0.231, prior_loss=0.9811, diff_loss=0.3238, tot_loss=1.536, over 152.00 samples.], tot_loss[dur_loss=0.226, prior_loss=0.9807, diff_loss=0.3747, tot_loss=1.581, over 1966.00 samples.], 2024-10-21 07:48:43,683 INFO [train.py:682] (1/4) Start epoch 1645 2024-10-21 07:49:00,382 INFO [train.py:561] (1/4) Epoch 1645, batch 6, global_batch_idx: 26310, batch size: 106, loss[dur_loss=0.2291, prior_loss=0.981, diff_loss=0.3315, tot_loss=1.542, over 106.00 samples.], tot_loss[dur_loss=0.2232, prior_loss=0.9803, diff_loss=0.4173, tot_loss=1.621, over 1142.00 samples.], 2024-10-21 07:49:13,542 INFO [train.py:682] (1/4) Start epoch 1646 2024-10-21 07:49:22,310 INFO [train.py:561] (1/4) Epoch 1646, batch 0, global_batch_idx: 26320, batch size: 108, loss[dur_loss=0.2323, prior_loss=0.9816, diff_loss=0.3288, tot_loss=1.543, over 108.00 samples.], tot_loss[dur_loss=0.2323, prior_loss=0.9816, diff_loss=0.3288, tot_loss=1.543, over 108.00 samples.], 2024-10-21 07:49:36,423 INFO [train.py:561] (1/4) Epoch 1646, batch 10, global_batch_idx: 26330, batch size: 111, loss[dur_loss=0.234, prior_loss=0.9824, diff_loss=0.3004, tot_loss=1.517, over 111.00 samples.], tot_loss[dur_loss=0.227, prior_loss=0.9807, diff_loss=0.3903, tot_loss=1.598, over 1656.00 samples.], 2024-10-21 07:49:43,460 INFO [train.py:682] (1/4) Start epoch 1647 2024-10-21 07:49:56,974 INFO [train.py:561] (1/4) Epoch 1647, batch 4, global_batch_idx: 26340, batch size: 189, loss[dur_loss=0.228, prior_loss=0.9812, diff_loss=0.348, tot_loss=1.557, over 189.00 samples.], tot_loss[dur_loss=0.2236, prior_loss=0.9802, diff_loss=0.4478, tot_loss=1.652, over 937.00 samples.], 2024-10-21 07:50:11,751 INFO [train.py:561] (1/4) Epoch 1647, batch 14, global_batch_idx: 26350, batch size: 142, loss[dur_loss=0.2273, prior_loss=0.9809, diff_loss=0.3143, tot_loss=1.523, over 142.00 samples.], tot_loss[dur_loss=0.2272, prior_loss=0.9808, diff_loss=0.3736, tot_loss=1.582, over 2210.00 samples.], 2024-10-21 07:50:13,179 INFO [train.py:682] (1/4) Start epoch 1648 2024-10-21 07:50:33,190 INFO [train.py:561] (1/4) Epoch 1648, batch 8, global_batch_idx: 26360, batch size: 170, loss[dur_loss=0.2317, prior_loss=0.9812, diff_loss=0.3318, tot_loss=1.545, over 170.00 samples.], tot_loss[dur_loss=0.2247, prior_loss=0.9806, diff_loss=0.3895, tot_loss=1.595, over 1432.00 samples.], 2024-10-21 07:50:43,251 INFO [train.py:682] (1/4) Start epoch 1649 2024-10-21 07:50:54,630 INFO [train.py:561] (1/4) Epoch 1649, batch 2, global_batch_idx: 26370, batch size: 203, loss[dur_loss=0.2266, prior_loss=0.9809, diff_loss=0.3428, tot_loss=1.55, over 203.00 samples.], tot_loss[dur_loss=0.2275, prior_loss=0.981, diff_loss=0.3233, tot_loss=1.532, over 442.00 samples.], 2024-10-21 07:51:08,750 INFO [train.py:561] (1/4) Epoch 1649, batch 12, global_batch_idx: 26380, batch size: 152, loss[dur_loss=0.2297, prior_loss=0.9814, diff_loss=0.337, tot_loss=1.548, over 152.00 samples.], 
tot_loss[dur_loss=0.2275, prior_loss=0.9808, diff_loss=0.379, tot_loss=1.587, over 1966.00 samples.], 2024-10-21 07:51:13,146 INFO [train.py:682] (1/4) Start epoch 1650 2024-10-21 07:51:29,806 INFO [train.py:561] (1/4) Epoch 1650, batch 6, global_batch_idx: 26390, batch size: 106, loss[dur_loss=0.2283, prior_loss=0.9812, diff_loss=0.3177, tot_loss=1.527, over 106.00 samples.], tot_loss[dur_loss=0.2227, prior_loss=0.9805, diff_loss=0.4174, tot_loss=1.621, over 1142.00 samples.], 2024-10-21 07:51:42,875 INFO [train.py:682] (1/4) Start epoch 1651 2024-10-21 07:51:51,682 INFO [train.py:561] (1/4) Epoch 1651, batch 0, global_batch_idx: 26400, batch size: 108, loss[dur_loss=0.2342, prior_loss=0.9822, diff_loss=0.2922, tot_loss=1.509, over 108.00 samples.], tot_loss[dur_loss=0.2342, prior_loss=0.9822, diff_loss=0.2922, tot_loss=1.509, over 108.00 samples.], 2024-10-21 07:52:05,828 INFO [train.py:561] (1/4) Epoch 1651, batch 10, global_batch_idx: 26410, batch size: 111, loss[dur_loss=0.2297, prior_loss=0.9819, diff_loss=0.2905, tot_loss=1.502, over 111.00 samples.], tot_loss[dur_loss=0.2263, prior_loss=0.9807, diff_loss=0.3857, tot_loss=1.593, over 1656.00 samples.], 2024-10-21 07:52:12,876 INFO [train.py:682] (1/4) Start epoch 1652 2024-10-21 07:52:26,454 INFO [train.py:561] (1/4) Epoch 1652, batch 4, global_batch_idx: 26420, batch size: 189, loss[dur_loss=0.2277, prior_loss=0.9816, diff_loss=0.3298, tot_loss=1.539, over 189.00 samples.], tot_loss[dur_loss=0.2223, prior_loss=0.98, diff_loss=0.4393, tot_loss=1.642, over 937.00 samples.], 2024-10-21 07:52:41,224 INFO [train.py:561] (1/4) Epoch 1652, batch 14, global_batch_idx: 26430, batch size: 142, loss[dur_loss=0.2272, prior_loss=0.9805, diff_loss=0.3055, tot_loss=1.513, over 142.00 samples.], tot_loss[dur_loss=0.2264, prior_loss=0.9807, diff_loss=0.3664, tot_loss=1.573, over 2210.00 samples.], 2024-10-21 07:52:42,644 INFO [train.py:682] (1/4) Start epoch 1653 2024-10-21 07:53:02,419 INFO [train.py:561] (1/4) Epoch 1653, batch 8, global_batch_idx: 26440, batch size: 170, loss[dur_loss=0.2322, prior_loss=0.981, diff_loss=0.3426, tot_loss=1.556, over 170.00 samples.], tot_loss[dur_loss=0.2251, prior_loss=0.9804, diff_loss=0.4017, tot_loss=1.607, over 1432.00 samples.], 2024-10-21 07:53:12,526 INFO [train.py:682] (1/4) Start epoch 1654 2024-10-21 07:53:23,824 INFO [train.py:561] (1/4) Epoch 1654, batch 2, global_batch_idx: 26450, batch size: 203, loss[dur_loss=0.2283, prior_loss=0.9808, diff_loss=0.3234, tot_loss=1.533, over 203.00 samples.], tot_loss[dur_loss=0.2281, prior_loss=0.981, diff_loss=0.3138, tot_loss=1.523, over 442.00 samples.], 2024-10-21 07:53:37,993 INFO [train.py:561] (1/4) Epoch 1654, batch 12, global_batch_idx: 26460, batch size: 152, loss[dur_loss=0.2264, prior_loss=0.9808, diff_loss=0.3087, tot_loss=1.516, over 152.00 samples.], tot_loss[dur_loss=0.2263, prior_loss=0.9806, diff_loss=0.3717, tot_loss=1.579, over 1966.00 samples.], 2024-10-21 07:53:42,418 INFO [train.py:682] (1/4) Start epoch 1655 2024-10-21 07:53:59,106 INFO [train.py:561] (1/4) Epoch 1655, batch 6, global_batch_idx: 26470, batch size: 106, loss[dur_loss=0.2281, prior_loss=0.9809, diff_loss=0.2942, tot_loss=1.503, over 106.00 samples.], tot_loss[dur_loss=0.2242, prior_loss=0.9802, diff_loss=0.4217, tot_loss=1.626, over 1142.00 samples.], 2024-10-21 07:54:12,016 INFO [train.py:682] (1/4) Start epoch 1656 2024-10-21 07:54:20,657 INFO [train.py:561] (1/4) Epoch 1656, batch 0, global_batch_idx: 26480, batch size: 108, loss[dur_loss=0.2272, prior_loss=0.9816, 
diff_loss=0.2979, tot_loss=1.507, over 108.00 samples.], tot_loss[dur_loss=0.2272, prior_loss=0.9816, diff_loss=0.2979, tot_loss=1.507, over 108.00 samples.], 2024-10-21 07:54:34,781 INFO [train.py:561] (1/4) Epoch 1656, batch 10, global_batch_idx: 26490, batch size: 111, loss[dur_loss=0.2313, prior_loss=0.9819, diff_loss=0.3343, tot_loss=1.547, over 111.00 samples.], tot_loss[dur_loss=0.2241, prior_loss=0.9805, diff_loss=0.392, tot_loss=1.597, over 1656.00 samples.], 2024-10-21 07:54:41,862 INFO [train.py:682] (1/4) Start epoch 1657 2024-10-21 07:54:55,589 INFO [train.py:561] (1/4) Epoch 1657, batch 4, global_batch_idx: 26500, batch size: 189, loss[dur_loss=0.2278, prior_loss=0.9813, diff_loss=0.3402, tot_loss=1.549, over 189.00 samples.], tot_loss[dur_loss=0.2216, prior_loss=0.98, diff_loss=0.4455, tot_loss=1.647, over 937.00 samples.], 2024-10-21 07:55:10,366 INFO [train.py:561] (1/4) Epoch 1657, batch 14, global_batch_idx: 26510, batch size: 142, loss[dur_loss=0.2269, prior_loss=0.9804, diff_loss=0.3262, tot_loss=1.533, over 142.00 samples.], tot_loss[dur_loss=0.2259, prior_loss=0.9806, diff_loss=0.3695, tot_loss=1.576, over 2210.00 samples.], 2024-10-21 07:55:11,789 INFO [train.py:682] (1/4) Start epoch 1658 2024-10-21 07:55:31,421 INFO [train.py:561] (1/4) Epoch 1658, batch 8, global_batch_idx: 26520, batch size: 170, loss[dur_loss=0.2324, prior_loss=0.9812, diff_loss=0.335, tot_loss=1.549, over 170.00 samples.], tot_loss[dur_loss=0.2247, prior_loss=0.9804, diff_loss=0.4034, tot_loss=1.609, over 1432.00 samples.], 2024-10-21 07:55:41,492 INFO [train.py:682] (1/4) Start epoch 1659 2024-10-21 07:55:52,815 INFO [train.py:561] (1/4) Epoch 1659, batch 2, global_batch_idx: 26530, batch size: 203, loss[dur_loss=0.2239, prior_loss=0.9807, diff_loss=0.3574, tot_loss=1.562, over 203.00 samples.], tot_loss[dur_loss=0.2261, prior_loss=0.9809, diff_loss=0.3189, tot_loss=1.526, over 442.00 samples.], 2024-10-21 07:56:06,917 INFO [train.py:561] (1/4) Epoch 1659, batch 12, global_batch_idx: 26540, batch size: 152, loss[dur_loss=0.2235, prior_loss=0.9806, diff_loss=0.337, tot_loss=1.541, over 152.00 samples.], tot_loss[dur_loss=0.2242, prior_loss=0.9805, diff_loss=0.3714, tot_loss=1.576, over 1966.00 samples.], 2024-10-21 07:56:11,385 INFO [train.py:682] (1/4) Start epoch 1660 2024-10-21 07:56:28,450 INFO [train.py:561] (1/4) Epoch 1660, batch 6, global_batch_idx: 26550, batch size: 106, loss[dur_loss=0.2285, prior_loss=0.981, diff_loss=0.3615, tot_loss=1.571, over 106.00 samples.], tot_loss[dur_loss=0.2229, prior_loss=0.9801, diff_loss=0.418, tot_loss=1.621, over 1142.00 samples.], 2024-10-21 07:56:41,491 INFO [train.py:682] (1/4) Start epoch 1661 2024-10-21 07:56:50,185 INFO [train.py:561] (1/4) Epoch 1661, batch 0, global_batch_idx: 26560, batch size: 108, loss[dur_loss=0.2337, prior_loss=0.9818, diff_loss=0.3537, tot_loss=1.569, over 108.00 samples.], tot_loss[dur_loss=0.2337, prior_loss=0.9818, diff_loss=0.3537, tot_loss=1.569, over 108.00 samples.], 2024-10-21 07:57:04,282 INFO [train.py:561] (1/4) Epoch 1661, batch 10, global_batch_idx: 26570, batch size: 111, loss[dur_loss=0.2288, prior_loss=0.982, diff_loss=0.3448, tot_loss=1.556, over 111.00 samples.], tot_loss[dur_loss=0.2251, prior_loss=0.9805, diff_loss=0.3918, tot_loss=1.597, over 1656.00 samples.], 2024-10-21 07:57:11,320 INFO [train.py:682] (1/4) Start epoch 1662 2024-10-21 07:57:25,016 INFO [train.py:561] (1/4) Epoch 1662, batch 4, global_batch_idx: 26580, batch size: 189, loss[dur_loss=0.2266, prior_loss=0.9811, diff_loss=0.3499, 
tot_loss=1.558, over 189.00 samples.], tot_loss[dur_loss=0.2232, prior_loss=0.98, diff_loss=0.445, tot_loss=1.648, over 937.00 samples.], 2024-10-21 07:57:39,831 INFO [train.py:561] (1/4) Epoch 1662, batch 14, global_batch_idx: 26590, batch size: 142, loss[dur_loss=0.2227, prior_loss=0.9804, diff_loss=0.3435, tot_loss=1.547, over 142.00 samples.], tot_loss[dur_loss=0.2263, prior_loss=0.9806, diff_loss=0.3708, tot_loss=1.578, over 2210.00 samples.], 2024-10-21 07:57:41,264 INFO [train.py:682] (1/4) Start epoch 1663 2024-10-21 07:58:00,834 INFO [train.py:561] (1/4) Epoch 1663, batch 8, global_batch_idx: 26600, batch size: 170, loss[dur_loss=0.2362, prior_loss=0.9812, diff_loss=0.321, tot_loss=1.538, over 170.00 samples.], tot_loss[dur_loss=0.2249, prior_loss=0.9804, diff_loss=0.4034, tot_loss=1.609, over 1432.00 samples.], 2024-10-21 07:58:10,874 INFO [train.py:682] (1/4) Start epoch 1664 2024-10-21 07:58:21,913 INFO [train.py:561] (1/4) Epoch 1664, batch 2, global_batch_idx: 26610, batch size: 203, loss[dur_loss=0.2258, prior_loss=0.9808, diff_loss=0.3135, tot_loss=1.52, over 203.00 samples.], tot_loss[dur_loss=0.2279, prior_loss=0.9809, diff_loss=0.3125, tot_loss=1.521, over 442.00 samples.], 2024-10-21 07:58:36,062 INFO [train.py:561] (1/4) Epoch 1664, batch 12, global_batch_idx: 26620, batch size: 152, loss[dur_loss=0.2289, prior_loss=0.9809, diff_loss=0.3267, tot_loss=1.537, over 152.00 samples.], tot_loss[dur_loss=0.2259, prior_loss=0.9805, diff_loss=0.3728, tot_loss=1.579, over 1966.00 samples.], 2024-10-21 07:58:40,500 INFO [train.py:682] (1/4) Start epoch 1665 2024-10-21 07:58:57,667 INFO [train.py:561] (1/4) Epoch 1665, batch 6, global_batch_idx: 26630, batch size: 106, loss[dur_loss=0.2268, prior_loss=0.9808, diff_loss=0.3331, tot_loss=1.541, over 106.00 samples.], tot_loss[dur_loss=0.223, prior_loss=0.98, diff_loss=0.4289, tot_loss=1.632, over 1142.00 samples.], 2024-10-21 07:59:10,729 INFO [train.py:682] (1/4) Start epoch 1666 2024-10-21 07:59:19,243 INFO [train.py:561] (1/4) Epoch 1666, batch 0, global_batch_idx: 26640, batch size: 108, loss[dur_loss=0.2323, prior_loss=0.9815, diff_loss=0.3337, tot_loss=1.548, over 108.00 samples.], tot_loss[dur_loss=0.2323, prior_loss=0.9815, diff_loss=0.3337, tot_loss=1.548, over 108.00 samples.], 2024-10-21 07:59:33,411 INFO [train.py:561] (1/4) Epoch 1666, batch 10, global_batch_idx: 26650, batch size: 111, loss[dur_loss=0.2316, prior_loss=0.9818, diff_loss=0.3082, tot_loss=1.522, over 111.00 samples.], tot_loss[dur_loss=0.2252, prior_loss=0.9803, diff_loss=0.3872, tot_loss=1.593, over 1656.00 samples.], 2024-10-21 07:59:40,519 INFO [train.py:682] (1/4) Start epoch 1667 2024-10-21 07:59:54,204 INFO [train.py:561] (1/4) Epoch 1667, batch 4, global_batch_idx: 26660, batch size: 189, loss[dur_loss=0.2289, prior_loss=0.9809, diff_loss=0.3275, tot_loss=1.537, over 189.00 samples.], tot_loss[dur_loss=0.222, prior_loss=0.9798, diff_loss=0.4502, tot_loss=1.652, over 937.00 samples.], 2024-10-21 08:00:08,925 INFO [train.py:561] (1/4) Epoch 1667, batch 14, global_batch_idx: 26670, batch size: 142, loss[dur_loss=0.2265, prior_loss=0.9803, diff_loss=0.3127, tot_loss=1.519, over 142.00 samples.], tot_loss[dur_loss=0.2253, prior_loss=0.9805, diff_loss=0.3776, tot_loss=1.583, over 2210.00 samples.], 2024-10-21 08:00:10,339 INFO [train.py:682] (1/4) Start epoch 1668 2024-10-21 08:00:29,965 INFO [train.py:561] (1/4) Epoch 1668, batch 8, global_batch_idx: 26680, batch size: 170, loss[dur_loss=0.229, prior_loss=0.9806, diff_loss=0.3338, tot_loss=1.544, over 
170.00 samples.], tot_loss[dur_loss=0.2236, prior_loss=0.9802, diff_loss=0.4026, tot_loss=1.606, over 1432.00 samples.], 2024-10-21 08:00:40,059 INFO [train.py:682] (1/4) Start epoch 1669 2024-10-21 08:00:51,154 INFO [train.py:561] (1/4) Epoch 1669, batch 2, global_batch_idx: 26690, batch size: 203, loss[dur_loss=0.2249, prior_loss=0.9806, diff_loss=0.3372, tot_loss=1.543, over 203.00 samples.], tot_loss[dur_loss=0.2271, prior_loss=0.981, diff_loss=0.3398, tot_loss=1.548, over 442.00 samples.], 2024-10-21 08:01:05,196 INFO [train.py:561] (1/4) Epoch 1669, batch 12, global_batch_idx: 26700, batch size: 152, loss[dur_loss=0.2273, prior_loss=0.9809, diff_loss=0.353, tot_loss=1.561, over 152.00 samples.], tot_loss[dur_loss=0.2271, prior_loss=0.9806, diff_loss=0.3793, tot_loss=1.587, over 1966.00 samples.], 2024-10-21 08:01:09,626 INFO [train.py:682] (1/4) Start epoch 1670 2024-10-21 08:01:26,501 INFO [train.py:561] (1/4) Epoch 1670, batch 6, global_batch_idx: 26710, batch size: 106, loss[dur_loss=0.2294, prior_loss=0.9811, diff_loss=0.3113, tot_loss=1.522, over 106.00 samples.], tot_loss[dur_loss=0.2222, prior_loss=0.9802, diff_loss=0.4192, tot_loss=1.622, over 1142.00 samples.], 2024-10-21 08:01:39,540 INFO [train.py:682] (1/4) Start epoch 1671 2024-10-21 08:01:48,229 INFO [train.py:561] (1/4) Epoch 1671, batch 0, global_batch_idx: 26720, batch size: 108, loss[dur_loss=0.2284, prior_loss=0.9815, diff_loss=0.3315, tot_loss=1.541, over 108.00 samples.], tot_loss[dur_loss=0.2284, prior_loss=0.9815, diff_loss=0.3315, tot_loss=1.541, over 108.00 samples.], 2024-10-21 08:02:02,251 INFO [train.py:561] (1/4) Epoch 1671, batch 10, global_batch_idx: 26730, batch size: 111, loss[dur_loss=0.2338, prior_loss=0.9821, diff_loss=0.3375, tot_loss=1.553, over 111.00 samples.], tot_loss[dur_loss=0.2266, prior_loss=0.9805, diff_loss=0.3896, tot_loss=1.597, over 1656.00 samples.], 2024-10-21 08:02:09,311 INFO [train.py:682] (1/4) Start epoch 1672 2024-10-21 08:02:22,741 INFO [train.py:561] (1/4) Epoch 1672, batch 4, global_batch_idx: 26740, batch size: 189, loss[dur_loss=0.2235, prior_loss=0.981, diff_loss=0.3797, tot_loss=1.584, over 189.00 samples.], tot_loss[dur_loss=0.2213, prior_loss=0.9799, diff_loss=0.4578, tot_loss=1.659, over 937.00 samples.], 2024-10-21 08:02:37,477 INFO [train.py:561] (1/4) Epoch 1672, batch 14, global_batch_idx: 26750, batch size: 142, loss[dur_loss=0.2265, prior_loss=0.9805, diff_loss=0.3444, tot_loss=1.551, over 142.00 samples.], tot_loss[dur_loss=0.2258, prior_loss=0.9806, diff_loss=0.3793, tot_loss=1.586, over 2210.00 samples.], 2024-10-21 08:02:38,908 INFO [train.py:682] (1/4) Start epoch 1673 2024-10-21 08:02:58,685 INFO [train.py:561] (1/4) Epoch 1673, batch 8, global_batch_idx: 26760, batch size: 170, loss[dur_loss=0.2337, prior_loss=0.9812, diff_loss=0.3505, tot_loss=1.565, over 170.00 samples.], tot_loss[dur_loss=0.2243, prior_loss=0.9803, diff_loss=0.4028, tot_loss=1.607, over 1432.00 samples.], 2024-10-21 08:03:08,768 INFO [train.py:682] (1/4) Start epoch 1674 2024-10-21 08:03:20,010 INFO [train.py:561] (1/4) Epoch 1674, batch 2, global_batch_idx: 26770, batch size: 203, loss[dur_loss=0.2276, prior_loss=0.9809, diff_loss=0.3543, tot_loss=1.563, over 203.00 samples.], tot_loss[dur_loss=0.2287, prior_loss=0.9811, diff_loss=0.3229, tot_loss=1.533, over 442.00 samples.], 2024-10-21 08:03:34,203 INFO [train.py:561] (1/4) Epoch 1674, batch 12, global_batch_idx: 26780, batch size: 152, loss[dur_loss=0.2249, prior_loss=0.9806, diff_loss=0.3216, tot_loss=1.527, over 152.00 
samples.], tot_loss[dur_loss=0.2252, prior_loss=0.9805, diff_loss=0.372, tot_loss=1.578, over 1966.00 samples.], 2024-10-21 08:03:38,650 INFO [train.py:682] (1/4) Start epoch 1675 2024-10-21 08:03:55,405 INFO [train.py:561] (1/4) Epoch 1675, batch 6, global_batch_idx: 26790, batch size: 106, loss[dur_loss=0.2261, prior_loss=0.9809, diff_loss=0.2806, tot_loss=1.488, over 106.00 samples.], tot_loss[dur_loss=0.2212, prior_loss=0.9799, diff_loss=0.4103, tot_loss=1.611, over 1142.00 samples.], 2024-10-21 08:04:08,362 INFO [train.py:682] (1/4) Start epoch 1676 2024-10-21 08:04:17,160 INFO [train.py:561] (1/4) Epoch 1676, batch 0, global_batch_idx: 26800, batch size: 108, loss[dur_loss=0.2311, prior_loss=0.9816, diff_loss=0.3374, tot_loss=1.55, over 108.00 samples.], tot_loss[dur_loss=0.2311, prior_loss=0.9816, diff_loss=0.3374, tot_loss=1.55, over 108.00 samples.], 2024-10-21 08:04:31,240 INFO [train.py:561] (1/4) Epoch 1676, batch 10, global_batch_idx: 26810, batch size: 111, loss[dur_loss=0.2378, prior_loss=0.9822, diff_loss=0.3344, tot_loss=1.554, over 111.00 samples.], tot_loss[dur_loss=0.2248, prior_loss=0.9804, diff_loss=0.3864, tot_loss=1.592, over 1656.00 samples.], 2024-10-21 08:04:38,307 INFO [train.py:682] (1/4) Start epoch 1677 2024-10-21 08:04:51,824 INFO [train.py:561] (1/4) Epoch 1677, batch 4, global_batch_idx: 26820, batch size: 189, loss[dur_loss=0.2268, prior_loss=0.9811, diff_loss=0.3327, tot_loss=1.541, over 189.00 samples.], tot_loss[dur_loss=0.2208, prior_loss=0.9798, diff_loss=0.4284, tot_loss=1.629, over 937.00 samples.], 2024-10-21 08:05:06,549 INFO [train.py:561] (1/4) Epoch 1677, batch 14, global_batch_idx: 26830, batch size: 142, loss[dur_loss=0.2271, prior_loss=0.9804, diff_loss=0.2994, tot_loss=1.507, over 142.00 samples.], tot_loss[dur_loss=0.2257, prior_loss=0.9805, diff_loss=0.3666, tot_loss=1.573, over 2210.00 samples.], 2024-10-21 08:05:07,968 INFO [train.py:682] (1/4) Start epoch 1678 2024-10-21 08:05:27,713 INFO [train.py:561] (1/4) Epoch 1678, batch 8, global_batch_idx: 26840, batch size: 170, loss[dur_loss=0.2361, prior_loss=0.981, diff_loss=0.3339, tot_loss=1.551, over 170.00 samples.], tot_loss[dur_loss=0.2251, prior_loss=0.9803, diff_loss=0.4005, tot_loss=1.606, over 1432.00 samples.], 2024-10-21 08:05:37,733 INFO [train.py:682] (1/4) Start epoch 1679 2024-10-21 08:05:49,008 INFO [train.py:561] (1/4) Epoch 1679, batch 2, global_batch_idx: 26850, batch size: 203, loss[dur_loss=0.2245, prior_loss=0.9805, diff_loss=0.3768, tot_loss=1.582, over 203.00 samples.], tot_loss[dur_loss=0.2258, prior_loss=0.9807, diff_loss=0.3473, tot_loss=1.554, over 442.00 samples.], 2024-10-21 08:06:03,025 INFO [train.py:561] (1/4) Epoch 1679, batch 12, global_batch_idx: 26860, batch size: 152, loss[dur_loss=0.2259, prior_loss=0.9808, diff_loss=0.3337, tot_loss=1.54, over 152.00 samples.], tot_loss[dur_loss=0.2253, prior_loss=0.9804, diff_loss=0.3825, tot_loss=1.588, over 1966.00 samples.], 2024-10-21 08:06:07,441 INFO [train.py:682] (1/4) Start epoch 1680 2024-10-21 08:06:24,117 INFO [train.py:561] (1/4) Epoch 1680, batch 6, global_batch_idx: 26870, batch size: 106, loss[dur_loss=0.228, prior_loss=0.9811, diff_loss=0.3132, tot_loss=1.522, over 106.00 samples.], tot_loss[dur_loss=0.2229, prior_loss=0.9802, diff_loss=0.4212, tot_loss=1.624, over 1142.00 samples.], 2024-10-21 08:06:36,995 INFO [train.py:682] (1/4) Start epoch 1681 2024-10-21 08:06:45,983 INFO [train.py:561] (1/4) Epoch 1681, batch 0, global_batch_idx: 26880, batch size: 108, loss[dur_loss=0.2325, 
prior_loss=0.9818, diff_loss=0.3429, tot_loss=1.557, over 108.00 samples.], tot_loss[dur_loss=0.2325, prior_loss=0.9818, diff_loss=0.3429, tot_loss=1.557, over 108.00 samples.], 2024-10-21 08:07:00,166 INFO [train.py:561] (1/4) Epoch 1681, batch 10, global_batch_idx: 26890, batch size: 111, loss[dur_loss=0.2314, prior_loss=0.9821, diff_loss=0.3313, tot_loss=1.545, over 111.00 samples.], tot_loss[dur_loss=0.2262, prior_loss=0.9806, diff_loss=0.3906, tot_loss=1.597, over 1656.00 samples.], 2024-10-21 08:07:07,207 INFO [train.py:682] (1/4) Start epoch 1682 2024-10-21 08:07:20,920 INFO [train.py:561] (1/4) Epoch 1682, batch 4, global_batch_idx: 26900, batch size: 189, loss[dur_loss=0.2263, prior_loss=0.9809, diff_loss=0.3516, tot_loss=1.559, over 189.00 samples.], tot_loss[dur_loss=0.2227, prior_loss=0.9798, diff_loss=0.4497, tot_loss=1.652, over 937.00 samples.], 2024-10-21 08:07:35,583 INFO [train.py:561] (1/4) Epoch 1682, batch 14, global_batch_idx: 26910, batch size: 142, loss[dur_loss=0.2266, prior_loss=0.9807, diff_loss=0.3154, tot_loss=1.523, over 142.00 samples.], tot_loss[dur_loss=0.2268, prior_loss=0.9805, diff_loss=0.3744, tot_loss=1.582, over 2210.00 samples.], 2024-10-21 08:07:37,016 INFO [train.py:682] (1/4) Start epoch 1683 2024-10-21 08:07:57,057 INFO [train.py:561] (1/4) Epoch 1683, batch 8, global_batch_idx: 26920, batch size: 170, loss[dur_loss=0.2314, prior_loss=0.9811, diff_loss=0.3357, tot_loss=1.548, over 170.00 samples.], tot_loss[dur_loss=0.2245, prior_loss=0.9804, diff_loss=0.402, tot_loss=1.607, over 1432.00 samples.], 2024-10-21 08:08:07,225 INFO [train.py:682] (1/4) Start epoch 1684 2024-10-21 08:08:19,120 INFO [train.py:561] (1/4) Epoch 1684, batch 2, global_batch_idx: 26930, batch size: 203, loss[dur_loss=0.227, prior_loss=0.981, diff_loss=0.3381, tot_loss=1.546, over 203.00 samples.], tot_loss[dur_loss=0.2282, prior_loss=0.981, diff_loss=0.3162, tot_loss=1.525, over 442.00 samples.], 2024-10-21 08:08:33,341 INFO [train.py:561] (1/4) Epoch 1684, batch 12, global_batch_idx: 26940, batch size: 152, loss[dur_loss=0.228, prior_loss=0.9809, diff_loss=0.3669, tot_loss=1.576, over 152.00 samples.], tot_loss[dur_loss=0.2254, prior_loss=0.9805, diff_loss=0.3741, tot_loss=1.58, over 1966.00 samples.], 2024-10-21 08:08:37,809 INFO [train.py:682] (1/4) Start epoch 1685 2024-10-21 08:08:54,751 INFO [train.py:561] (1/4) Epoch 1685, batch 6, global_batch_idx: 26950, batch size: 106, loss[dur_loss=0.2251, prior_loss=0.9809, diff_loss=0.31, tot_loss=1.516, over 106.00 samples.], tot_loss[dur_loss=0.2218, prior_loss=0.98, diff_loss=0.4145, tot_loss=1.616, over 1142.00 samples.], 2024-10-21 08:09:07,680 INFO [train.py:682] (1/4) Start epoch 1686 2024-10-21 08:09:16,341 INFO [train.py:561] (1/4) Epoch 1686, batch 0, global_batch_idx: 26960, batch size: 108, loss[dur_loss=0.2296, prior_loss=0.9814, diff_loss=0.2654, tot_loss=1.476, over 108.00 samples.], tot_loss[dur_loss=0.2296, prior_loss=0.9814, diff_loss=0.2654, tot_loss=1.476, over 108.00 samples.], 2024-10-21 08:09:30,422 INFO [train.py:561] (1/4) Epoch 1686, batch 10, global_batch_idx: 26970, batch size: 111, loss[dur_loss=0.2323, prior_loss=0.9822, diff_loss=0.3115, tot_loss=1.526, over 111.00 samples.], tot_loss[dur_loss=0.2235, prior_loss=0.9803, diff_loss=0.377, tot_loss=1.581, over 1656.00 samples.], 2024-10-21 08:09:37,464 INFO [train.py:682] (1/4) Start epoch 1687 2024-10-21 08:09:51,324 INFO [train.py:561] (1/4) Epoch 1687, batch 4, global_batch_idx: 26980, batch size: 189, loss[dur_loss=0.2246, prior_loss=0.9807, 
diff_loss=0.3426, tot_loss=1.548, over 189.00 samples.], tot_loss[dur_loss=0.221, prior_loss=0.9797, diff_loss=0.4434, tot_loss=1.644, over 937.00 samples.], 2024-10-21 08:10:06,019 INFO [train.py:561] (1/4) Epoch 1687, batch 14, global_batch_idx: 26990, batch size: 142, loss[dur_loss=0.2236, prior_loss=0.9804, diff_loss=0.3091, tot_loss=1.513, over 142.00 samples.], tot_loss[dur_loss=0.2254, prior_loss=0.9804, diff_loss=0.3718, tot_loss=1.578, over 2210.00 samples.], 2024-10-21 08:10:07,439 INFO [train.py:682] (1/4) Start epoch 1688 2024-10-21 08:10:27,174 INFO [train.py:561] (1/4) Epoch 1688, batch 8, global_batch_idx: 27000, batch size: 170, loss[dur_loss=0.2309, prior_loss=0.9808, diff_loss=0.3302, tot_loss=1.542, over 170.00 samples.], tot_loss[dur_loss=0.2227, prior_loss=0.98, diff_loss=0.3968, tot_loss=1.599, over 1432.00 samples.], 2024-10-21 08:10:28,642 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 08:11:24,089 INFO [train.py:589] (1/4) Epoch 1688, validation: dur_loss=0.4564, prior_loss=1.036, diff_loss=0.4117, tot_loss=1.904, over 100.00 samples. 2024-10-21 08:11:24,090 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 08:11:32,732 INFO [train.py:682] (1/4) Start epoch 1689 2024-10-21 08:11:44,873 INFO [train.py:561] (1/4) Epoch 1689, batch 2, global_batch_idx: 27010, batch size: 203, loss[dur_loss=0.229, prior_loss=0.9809, diff_loss=0.363, tot_loss=1.573, over 203.00 samples.], tot_loss[dur_loss=0.2274, prior_loss=0.9808, diff_loss=0.3437, tot_loss=1.552, over 442.00 samples.], 2024-10-21 08:11:58,977 INFO [train.py:561] (1/4) Epoch 1689, batch 12, global_batch_idx: 27020, batch size: 152, loss[dur_loss=0.2297, prior_loss=0.9811, diff_loss=0.3029, tot_loss=1.514, over 152.00 samples.], tot_loss[dur_loss=0.2256, prior_loss=0.9804, diff_loss=0.3827, tot_loss=1.589, over 1966.00 samples.], 2024-10-21 08:12:03,366 INFO [train.py:682] (1/4) Start epoch 1690 2024-10-21 08:12:20,311 INFO [train.py:561] (1/4) Epoch 1690, batch 6, global_batch_idx: 27030, batch size: 106, loss[dur_loss=0.2274, prior_loss=0.9808, diff_loss=0.3085, tot_loss=1.517, over 106.00 samples.], tot_loss[dur_loss=0.2214, prior_loss=0.9802, diff_loss=0.4246, tot_loss=1.626, over 1142.00 samples.], 2024-10-21 08:12:33,157 INFO [train.py:682] (1/4) Start epoch 1691 2024-10-21 08:12:41,860 INFO [train.py:561] (1/4) Epoch 1691, batch 0, global_batch_idx: 27040, batch size: 108, loss[dur_loss=0.2314, prior_loss=0.9815, diff_loss=0.3086, tot_loss=1.522, over 108.00 samples.], tot_loss[dur_loss=0.2314, prior_loss=0.9815, diff_loss=0.3086, tot_loss=1.522, over 108.00 samples.], 2024-10-21 08:12:55,871 INFO [train.py:561] (1/4) Epoch 1691, batch 10, global_batch_idx: 27050, batch size: 111, loss[dur_loss=0.2305, prior_loss=0.9819, diff_loss=0.2934, tot_loss=1.506, over 111.00 samples.], tot_loss[dur_loss=0.2253, prior_loss=0.9803, diff_loss=0.3821, tot_loss=1.588, over 1656.00 samples.], 2024-10-21 08:13:02,854 INFO [train.py:682] (1/4) Start epoch 1692 2024-10-21 08:13:16,463 INFO [train.py:561] (1/4) Epoch 1692, batch 4, global_batch_idx: 27060, batch size: 189, loss[dur_loss=0.2265, prior_loss=0.9807, diff_loss=0.3286, tot_loss=1.536, over 189.00 samples.], tot_loss[dur_loss=0.221, prior_loss=0.9797, diff_loss=0.4335, tot_loss=1.634, over 937.00 samples.], 2024-10-21 08:13:31,132 INFO [train.py:561] (1/4) Epoch 1692, batch 14, global_batch_idx: 27070, batch size: 142, loss[dur_loss=0.2256, prior_loss=0.9802, diff_loss=0.3535, tot_loss=1.559, over 142.00 samples.], 
tot_loss[dur_loss=0.2253, prior_loss=0.9804, diff_loss=0.3765, tot_loss=1.582, over 2210.00 samples.], 2024-10-21 08:13:32,543 INFO [train.py:682] (1/4) Start epoch 1693 2024-10-21 08:13:52,334 INFO [train.py:561] (1/4) Epoch 1693, batch 8, global_batch_idx: 27080, batch size: 170, loss[dur_loss=0.2295, prior_loss=0.9807, diff_loss=0.3447, tot_loss=1.555, over 170.00 samples.], tot_loss[dur_loss=0.2238, prior_loss=0.98, diff_loss=0.4029, tot_loss=1.607, over 1432.00 samples.], 2024-10-21 08:14:02,391 INFO [train.py:682] (1/4) Start epoch 1694 2024-10-21 08:14:13,562 INFO [train.py:561] (1/4) Epoch 1694, batch 2, global_batch_idx: 27090, batch size: 203, loss[dur_loss=0.2241, prior_loss=0.9807, diff_loss=0.3401, tot_loss=1.545, over 203.00 samples.], tot_loss[dur_loss=0.2252, prior_loss=0.9807, diff_loss=0.3146, tot_loss=1.521, over 442.00 samples.], 2024-10-21 08:14:27,622 INFO [train.py:561] (1/4) Epoch 1694, batch 12, global_batch_idx: 27100, batch size: 152, loss[dur_loss=0.2263, prior_loss=0.981, diff_loss=0.3198, tot_loss=1.527, over 152.00 samples.], tot_loss[dur_loss=0.2248, prior_loss=0.9804, diff_loss=0.3777, tot_loss=1.583, over 1966.00 samples.], 2024-10-21 08:14:32,015 INFO [train.py:682] (1/4) Start epoch 1695 2024-10-21 08:14:49,108 INFO [train.py:561] (1/4) Epoch 1695, batch 6, global_batch_idx: 27110, batch size: 106, loss[dur_loss=0.2272, prior_loss=0.9805, diff_loss=0.3124, tot_loss=1.52, over 106.00 samples.], tot_loss[dur_loss=0.2223, prior_loss=0.9798, diff_loss=0.4102, tot_loss=1.612, over 1142.00 samples.], 2024-10-21 08:15:01,992 INFO [train.py:682] (1/4) Start epoch 1696 2024-10-21 08:15:10,573 INFO [train.py:561] (1/4) Epoch 1696, batch 0, global_batch_idx: 27120, batch size: 108, loss[dur_loss=0.23, prior_loss=0.9816, diff_loss=0.2867, tot_loss=1.498, over 108.00 samples.], tot_loss[dur_loss=0.23, prior_loss=0.9816, diff_loss=0.2867, tot_loss=1.498, over 108.00 samples.], 2024-10-21 08:15:24,631 INFO [train.py:561] (1/4) Epoch 1696, batch 10, global_batch_idx: 27130, batch size: 111, loss[dur_loss=0.2322, prior_loss=0.9817, diff_loss=0.2876, tot_loss=1.501, over 111.00 samples.], tot_loss[dur_loss=0.2243, prior_loss=0.9802, diff_loss=0.3827, tot_loss=1.587, over 1656.00 samples.], 2024-10-21 08:15:31,610 INFO [train.py:682] (1/4) Start epoch 1697 2024-10-21 08:15:45,431 INFO [train.py:561] (1/4) Epoch 1697, batch 4, global_batch_idx: 27140, batch size: 189, loss[dur_loss=0.2236, prior_loss=0.9807, diff_loss=0.3485, tot_loss=1.553, over 189.00 samples.], tot_loss[dur_loss=0.2219, prior_loss=0.9796, diff_loss=0.4302, tot_loss=1.632, over 937.00 samples.], 2024-10-21 08:16:00,121 INFO [train.py:561] (1/4) Epoch 1697, batch 14, global_batch_idx: 27150, batch size: 142, loss[dur_loss=0.2284, prior_loss=0.9803, diff_loss=0.3081, tot_loss=1.517, over 142.00 samples.], tot_loss[dur_loss=0.2261, prior_loss=0.9804, diff_loss=0.37, tot_loss=1.577, over 2210.00 samples.], 2024-10-21 08:16:01,529 INFO [train.py:682] (1/4) Start epoch 1698 2024-10-21 08:16:21,407 INFO [train.py:561] (1/4) Epoch 1698, batch 8, global_batch_idx: 27160, batch size: 170, loss[dur_loss=0.2307, prior_loss=0.9807, diff_loss=0.3297, tot_loss=1.541, over 170.00 samples.], tot_loss[dur_loss=0.224, prior_loss=0.9802, diff_loss=0.4075, tot_loss=1.612, over 1432.00 samples.], 2024-10-21 08:16:31,424 INFO [train.py:682] (1/4) Start epoch 1699 2024-10-21 08:16:42,687 INFO [train.py:561] (1/4) Epoch 1699, batch 2, global_batch_idx: 27170, batch size: 203, loss[dur_loss=0.2299, prior_loss=0.9807, 
diff_loss=0.3702, tot_loss=1.581, over 203.00 samples.], tot_loss[dur_loss=0.2289, prior_loss=0.9808, diff_loss=0.338, tot_loss=1.548, over 442.00 samples.], 2024-10-21 08:16:56,819 INFO [train.py:561] (1/4) Epoch 1699, batch 12, global_batch_idx: 27180, batch size: 152, loss[dur_loss=0.2256, prior_loss=0.9806, diff_loss=0.3226, tot_loss=1.529, over 152.00 samples.], tot_loss[dur_loss=0.2254, prior_loss=0.9804, diff_loss=0.3762, tot_loss=1.582, over 1966.00 samples.], 2024-10-21 08:17:01,236 INFO [train.py:682] (1/4) Start epoch 1700 2024-10-21 08:17:17,987 INFO [train.py:561] (1/4) Epoch 1700, batch 6, global_batch_idx: 27190, batch size: 106, loss[dur_loss=0.2238, prior_loss=0.9807, diff_loss=0.3355, tot_loss=1.54, over 106.00 samples.], tot_loss[dur_loss=0.2207, prior_loss=0.9798, diff_loss=0.4216, tot_loss=1.622, over 1142.00 samples.], 2024-10-21 08:17:30,903 INFO [train.py:682] (1/4) Start epoch 1701 2024-10-21 08:17:39,394 INFO [train.py:561] (1/4) Epoch 1701, batch 0, global_batch_idx: 27200, batch size: 108, loss[dur_loss=0.2276, prior_loss=0.9814, diff_loss=0.353, tot_loss=1.562, over 108.00 samples.], tot_loss[dur_loss=0.2276, prior_loss=0.9814, diff_loss=0.353, tot_loss=1.562, over 108.00 samples.], 2024-10-21 08:17:53,455 INFO [train.py:561] (1/4) Epoch 1701, batch 10, global_batch_idx: 27210, batch size: 111, loss[dur_loss=0.2318, prior_loss=0.9821, diff_loss=0.3006, tot_loss=1.514, over 111.00 samples.], tot_loss[dur_loss=0.2245, prior_loss=0.9805, diff_loss=0.3873, tot_loss=1.592, over 1656.00 samples.], 2024-10-21 08:18:00,497 INFO [train.py:682] (1/4) Start epoch 1702 2024-10-21 08:18:14,297 INFO [train.py:561] (1/4) Epoch 1702, batch 4, global_batch_idx: 27220, batch size: 189, loss[dur_loss=0.2231, prior_loss=0.9807, diff_loss=0.3062, tot_loss=1.51, over 189.00 samples.], tot_loss[dur_loss=0.2203, prior_loss=0.9798, diff_loss=0.43, tot_loss=1.63, over 937.00 samples.], 2024-10-21 08:18:29,071 INFO [train.py:561] (1/4) Epoch 1702, batch 14, global_batch_idx: 27230, batch size: 142, loss[dur_loss=0.2262, prior_loss=0.9804, diff_loss=0.2994, tot_loss=1.506, over 142.00 samples.], tot_loss[dur_loss=0.2253, prior_loss=0.9804, diff_loss=0.3598, tot_loss=1.566, over 2210.00 samples.], 2024-10-21 08:18:30,496 INFO [train.py:682] (1/4) Start epoch 1703 2024-10-21 08:18:50,171 INFO [train.py:561] (1/4) Epoch 1703, batch 8, global_batch_idx: 27240, batch size: 170, loss[dur_loss=0.2317, prior_loss=0.9807, diff_loss=0.3065, tot_loss=1.519, over 170.00 samples.], tot_loss[dur_loss=0.2241, prior_loss=0.98, diff_loss=0.3878, tot_loss=1.592, over 1432.00 samples.], 2024-10-21 08:19:00,200 INFO [train.py:682] (1/4) Start epoch 1704 2024-10-21 08:19:11,389 INFO [train.py:561] (1/4) Epoch 1704, batch 2, global_batch_idx: 27250, batch size: 203, loss[dur_loss=0.2275, prior_loss=0.9805, diff_loss=0.3448, tot_loss=1.553, over 203.00 samples.], tot_loss[dur_loss=0.2276, prior_loss=0.9807, diff_loss=0.3305, tot_loss=1.539, over 442.00 samples.], 2024-10-21 08:19:25,544 INFO [train.py:561] (1/4) Epoch 1704, batch 12, global_batch_idx: 27260, batch size: 152, loss[dur_loss=0.2253, prior_loss=0.9807, diff_loss=0.3273, tot_loss=1.533, over 152.00 samples.], tot_loss[dur_loss=0.2244, prior_loss=0.9803, diff_loss=0.3762, tot_loss=1.581, over 1966.00 samples.], 2024-10-21 08:19:29,968 INFO [train.py:682] (1/4) Start epoch 1705 2024-10-21 08:19:47,292 INFO [train.py:561] (1/4) Epoch 1705, batch 6, global_batch_idx: 27270, batch size: 106, loss[dur_loss=0.2275, prior_loss=0.9806, diff_loss=0.3163, 
tot_loss=1.524, over 106.00 samples.], tot_loss[dur_loss=0.2214, prior_loss=0.9797, diff_loss=0.4159, tot_loss=1.617, over 1142.00 samples.], 2024-10-21 08:20:00,277 INFO [train.py:682] (1/4) Start epoch 1706 2024-10-21 08:20:08,940 INFO [train.py:561] (1/4) Epoch 1706, batch 0, global_batch_idx: 27280, batch size: 108, loss[dur_loss=0.2286, prior_loss=0.9811, diff_loss=0.3003, tot_loss=1.51, over 108.00 samples.], tot_loss[dur_loss=0.2286, prior_loss=0.9811, diff_loss=0.3003, tot_loss=1.51, over 108.00 samples.], 2024-10-21 08:20:23,021 INFO [train.py:561] (1/4) Epoch 1706, batch 10, global_batch_idx: 27290, batch size: 111, loss[dur_loss=0.2304, prior_loss=0.9817, diff_loss=0.3146, tot_loss=1.527, over 111.00 samples.], tot_loss[dur_loss=0.2231, prior_loss=0.9802, diff_loss=0.3963, tot_loss=1.6, over 1656.00 samples.], 2024-10-21 08:20:30,067 INFO [train.py:682] (1/4) Start epoch 1707 2024-10-21 08:20:43,524 INFO [train.py:561] (1/4) Epoch 1707, batch 4, global_batch_idx: 27300, batch size: 189, loss[dur_loss=0.227, prior_loss=0.9808, diff_loss=0.36, tot_loss=1.568, over 189.00 samples.], tot_loss[dur_loss=0.2202, prior_loss=0.9797, diff_loss=0.4425, tot_loss=1.642, over 937.00 samples.], 2024-10-21 08:20:58,296 INFO [train.py:561] (1/4) Epoch 1707, batch 14, global_batch_idx: 27310, batch size: 142, loss[dur_loss=0.2234, prior_loss=0.9801, diff_loss=0.3225, tot_loss=1.526, over 142.00 samples.], tot_loss[dur_loss=0.225, prior_loss=0.9803, diff_loss=0.3724, tot_loss=1.578, over 2210.00 samples.], 2024-10-21 08:20:59,718 INFO [train.py:682] (1/4) Start epoch 1708 2024-10-21 08:21:19,439 INFO [train.py:561] (1/4) Epoch 1708, batch 8, global_batch_idx: 27320, batch size: 170, loss[dur_loss=0.2324, prior_loss=0.9807, diff_loss=0.3137, tot_loss=1.527, over 170.00 samples.], tot_loss[dur_loss=0.2236, prior_loss=0.98, diff_loss=0.4007, tot_loss=1.604, over 1432.00 samples.], 2024-10-21 08:21:29,475 INFO [train.py:682] (1/4) Start epoch 1709 2024-10-21 08:21:40,686 INFO [train.py:561] (1/4) Epoch 1709, batch 2, global_batch_idx: 27330, batch size: 203, loss[dur_loss=0.2261, prior_loss=0.9804, diff_loss=0.3323, tot_loss=1.539, over 203.00 samples.], tot_loss[dur_loss=0.2264, prior_loss=0.9805, diff_loss=0.3157, tot_loss=1.523, over 442.00 samples.], 2024-10-21 08:21:54,847 INFO [train.py:561] (1/4) Epoch 1709, batch 12, global_batch_idx: 27340, batch size: 152, loss[dur_loss=0.2284, prior_loss=0.9805, diff_loss=0.3327, tot_loss=1.542, over 152.00 samples.], tot_loss[dur_loss=0.2251, prior_loss=0.9802, diff_loss=0.3678, tot_loss=1.573, over 1966.00 samples.], 2024-10-21 08:21:59,263 INFO [train.py:682] (1/4) Start epoch 1710 2024-10-21 08:22:16,511 INFO [train.py:561] (1/4) Epoch 1710, batch 6, global_batch_idx: 27350, batch size: 106, loss[dur_loss=0.2277, prior_loss=0.9808, diff_loss=0.3056, tot_loss=1.514, over 106.00 samples.], tot_loss[dur_loss=0.2219, prior_loss=0.9798, diff_loss=0.4204, tot_loss=1.622, over 1142.00 samples.], 2024-10-21 08:22:29,614 INFO [train.py:682] (1/4) Start epoch 1711 2024-10-21 08:22:38,138 INFO [train.py:561] (1/4) Epoch 1711, batch 0, global_batch_idx: 27360, batch size: 108, loss[dur_loss=0.2296, prior_loss=0.9812, diff_loss=0.3211, tot_loss=1.532, over 108.00 samples.], tot_loss[dur_loss=0.2296, prior_loss=0.9812, diff_loss=0.3211, tot_loss=1.532, over 108.00 samples.], 2024-10-21 08:22:52,237 INFO [train.py:561] (1/4) Epoch 1711, batch 10, global_batch_idx: 27370, batch size: 111, loss[dur_loss=0.2288, prior_loss=0.9815, diff_loss=0.3235, tot_loss=1.534, over 
111.00 samples.], tot_loss[dur_loss=0.224, prior_loss=0.9801, diff_loss=0.3892, tot_loss=1.593, over 1656.00 samples.], 2024-10-21 08:22:59,339 INFO [train.py:682] (1/4) Start epoch 1712 2024-10-21 08:23:13,012 INFO [train.py:561] (1/4) Epoch 1712, batch 4, global_batch_idx: 27380, batch size: 189, loss[dur_loss=0.224, prior_loss=0.9805, diff_loss=0.3417, tot_loss=1.546, over 189.00 samples.], tot_loss[dur_loss=0.2197, prior_loss=0.9795, diff_loss=0.4365, tot_loss=1.636, over 937.00 samples.], 2024-10-21 08:23:27,760 INFO [train.py:561] (1/4) Epoch 1712, batch 14, global_batch_idx: 27390, batch size: 142, loss[dur_loss=0.224, prior_loss=0.9802, diff_loss=0.3071, tot_loss=1.511, over 142.00 samples.], tot_loss[dur_loss=0.224, prior_loss=0.9802, diff_loss=0.3626, tot_loss=1.567, over 2210.00 samples.], 2024-10-21 08:23:29,177 INFO [train.py:682] (1/4) Start epoch 1713 2024-10-21 08:23:48,901 INFO [train.py:561] (1/4) Epoch 1713, batch 8, global_batch_idx: 27400, batch size: 170, loss[dur_loss=0.2286, prior_loss=0.9807, diff_loss=0.313, tot_loss=1.522, over 170.00 samples.], tot_loss[dur_loss=0.2234, prior_loss=0.9801, diff_loss=0.3961, tot_loss=1.6, over 1432.00 samples.], 2024-10-21 08:23:58,902 INFO [train.py:682] (1/4) Start epoch 1714 2024-10-21 08:24:10,647 INFO [train.py:561] (1/4) Epoch 1714, batch 2, global_batch_idx: 27410, batch size: 203, loss[dur_loss=0.2245, prior_loss=0.9806, diff_loss=0.3311, tot_loss=1.536, over 203.00 samples.], tot_loss[dur_loss=0.226, prior_loss=0.9807, diff_loss=0.3199, tot_loss=1.527, over 442.00 samples.], 2024-10-21 08:24:24,722 INFO [train.py:561] (1/4) Epoch 1714, batch 12, global_batch_idx: 27420, batch size: 152, loss[dur_loss=0.2279, prior_loss=0.9806, diff_loss=0.3189, tot_loss=1.527, over 152.00 samples.], tot_loss[dur_loss=0.2246, prior_loss=0.9803, diff_loss=0.3712, tot_loss=1.576, over 1966.00 samples.], 2024-10-21 08:24:29,110 INFO [train.py:682] (1/4) Start epoch 1715 2024-10-21 08:24:46,027 INFO [train.py:561] (1/4) Epoch 1715, batch 6, global_batch_idx: 27430, batch size: 106, loss[dur_loss=0.2246, prior_loss=0.9807, diff_loss=0.3236, tot_loss=1.529, over 106.00 samples.], tot_loss[dur_loss=0.2227, prior_loss=0.9799, diff_loss=0.4163, tot_loss=1.619, over 1142.00 samples.], 2024-10-21 08:24:58,977 INFO [train.py:682] (1/4) Start epoch 1716 2024-10-21 08:25:07,605 INFO [train.py:561] (1/4) Epoch 1716, batch 0, global_batch_idx: 27440, batch size: 108, loss[dur_loss=0.2326, prior_loss=0.9813, diff_loss=0.2851, tot_loss=1.499, over 108.00 samples.], tot_loss[dur_loss=0.2326, prior_loss=0.9813, diff_loss=0.2851, tot_loss=1.499, over 108.00 samples.], 2024-10-21 08:25:21,736 INFO [train.py:561] (1/4) Epoch 1716, batch 10, global_batch_idx: 27450, batch size: 111, loss[dur_loss=0.23, prior_loss=0.9817, diff_loss=0.3075, tot_loss=1.519, over 111.00 samples.], tot_loss[dur_loss=0.224, prior_loss=0.9803, diff_loss=0.3864, tot_loss=1.591, over 1656.00 samples.], 2024-10-21 08:25:28,769 INFO [train.py:682] (1/4) Start epoch 1717 2024-10-21 08:25:42,467 INFO [train.py:561] (1/4) Epoch 1717, batch 4, global_batch_idx: 27460, batch size: 189, loss[dur_loss=0.2218, prior_loss=0.9805, diff_loss=0.3476, tot_loss=1.55, over 189.00 samples.], tot_loss[dur_loss=0.2203, prior_loss=0.9797, diff_loss=0.4336, tot_loss=1.634, over 937.00 samples.], 2024-10-21 08:25:57,246 INFO [train.py:561] (1/4) Epoch 1717, batch 14, global_batch_idx: 27470, batch size: 142, loss[dur_loss=0.226, prior_loss=0.9802, diff_loss=0.2874, tot_loss=1.494, over 142.00 samples.], 
tot_loss[dur_loss=0.2246, prior_loss=0.9803, diff_loss=0.3642, tot_loss=1.569, over 2210.00 samples.], 2024-10-21 08:25:58,677 INFO [train.py:682] (1/4) Start epoch 1718 2024-10-21 08:26:18,293 INFO [train.py:561] (1/4) Epoch 1718, batch 8, global_batch_idx: 27480, batch size: 170, loss[dur_loss=0.2311, prior_loss=0.9816, diff_loss=0.3735, tot_loss=1.586, over 170.00 samples.], tot_loss[dur_loss=0.2225, prior_loss=0.9801, diff_loss=0.3945, tot_loss=1.597, over 1432.00 samples.], 2024-10-21 08:26:28,359 INFO [train.py:682] (1/4) Start epoch 1719 2024-10-21 08:26:39,948 INFO [train.py:561] (1/4) Epoch 1719, batch 2, global_batch_idx: 27490, batch size: 203, loss[dur_loss=0.2227, prior_loss=0.9807, diff_loss=0.3549, tot_loss=1.558, over 203.00 samples.], tot_loss[dur_loss=0.2244, prior_loss=0.9807, diff_loss=0.3435, tot_loss=1.549, over 442.00 samples.], 2024-10-21 08:26:54,041 INFO [train.py:561] (1/4) Epoch 1719, batch 12, global_batch_idx: 27500, batch size: 152, loss[dur_loss=0.2273, prior_loss=0.9804, diff_loss=0.3037, tot_loss=1.511, over 152.00 samples.], tot_loss[dur_loss=0.224, prior_loss=0.9804, diff_loss=0.3768, tot_loss=1.581, over 1966.00 samples.], 2024-10-21 08:26:58,421 INFO [train.py:682] (1/4) Start epoch 1720 2024-10-21 08:27:15,197 INFO [train.py:561] (1/4) Epoch 1720, batch 6, global_batch_idx: 27510, batch size: 106, loss[dur_loss=0.227, prior_loss=0.9806, diff_loss=0.302, tot_loss=1.51, over 106.00 samples.], tot_loss[dur_loss=0.2221, prior_loss=0.9798, diff_loss=0.4127, tot_loss=1.615, over 1142.00 samples.], 2024-10-21 08:27:28,125 INFO [train.py:682] (1/4) Start epoch 1721 2024-10-21 08:27:36,840 INFO [train.py:561] (1/4) Epoch 1721, batch 0, global_batch_idx: 27520, batch size: 108, loss[dur_loss=0.2296, prior_loss=0.9811, diff_loss=0.3206, tot_loss=1.531, over 108.00 samples.], tot_loss[dur_loss=0.2296, prior_loss=0.9811, diff_loss=0.3206, tot_loss=1.531, over 108.00 samples.], 2024-10-21 08:27:51,143 INFO [train.py:561] (1/4) Epoch 1721, batch 10, global_batch_idx: 27530, batch size: 111, loss[dur_loss=0.2321, prior_loss=0.9817, diff_loss=0.3341, tot_loss=1.548, over 111.00 samples.], tot_loss[dur_loss=0.2229, prior_loss=0.9801, diff_loss=0.3903, tot_loss=1.593, over 1656.00 samples.], 2024-10-21 08:27:58,179 INFO [train.py:682] (1/4) Start epoch 1722 2024-10-21 08:28:11,661 INFO [train.py:561] (1/4) Epoch 1722, batch 4, global_batch_idx: 27540, batch size: 189, loss[dur_loss=0.2255, prior_loss=0.981, diff_loss=0.3612, tot_loss=1.568, over 189.00 samples.], tot_loss[dur_loss=0.2226, prior_loss=0.9795, diff_loss=0.4386, tot_loss=1.641, over 937.00 samples.], 2024-10-21 08:28:26,430 INFO [train.py:561] (1/4) Epoch 1722, batch 14, global_batch_idx: 27550, batch size: 142, loss[dur_loss=0.226, prior_loss=0.9801, diff_loss=0.2681, tot_loss=1.474, over 142.00 samples.], tot_loss[dur_loss=0.2258, prior_loss=0.9802, diff_loss=0.3712, tot_loss=1.577, over 2210.00 samples.], 2024-10-21 08:28:27,842 INFO [train.py:682] (1/4) Start epoch 1723 2024-10-21 08:28:47,633 INFO [train.py:561] (1/4) Epoch 1723, batch 8, global_batch_idx: 27560, batch size: 170, loss[dur_loss=0.2312, prior_loss=0.9807, diff_loss=0.3228, tot_loss=1.535, over 170.00 samples.], tot_loss[dur_loss=0.2229, prior_loss=0.9799, diff_loss=0.3906, tot_loss=1.593, over 1432.00 samples.], 2024-10-21 08:28:57,682 INFO [train.py:682] (1/4) Start epoch 1724 2024-10-21 08:29:09,033 INFO [train.py:561] (1/4) Epoch 1724, batch 2, global_batch_idx: 27570, batch size: 203, loss[dur_loss=0.2256, prior_loss=0.9804, 
diff_loss=0.3367, tot_loss=1.543, over 203.00 samples.], tot_loss[dur_loss=0.2253, prior_loss=0.9806, diff_loss=0.3281, tot_loss=1.534, over 442.00 samples.], 2024-10-21 08:29:23,068 INFO [train.py:561] (1/4) Epoch 1724, batch 12, global_batch_idx: 27580, batch size: 152, loss[dur_loss=0.2278, prior_loss=0.9804, diff_loss=0.3418, tot_loss=1.55, over 152.00 samples.], tot_loss[dur_loss=0.2254, prior_loss=0.9802, diff_loss=0.373, tot_loss=1.579, over 1966.00 samples.], 2024-10-21 08:29:27,472 INFO [train.py:682] (1/4) Start epoch 1725 2024-10-21 08:29:44,306 INFO [train.py:561] (1/4) Epoch 1725, batch 6, global_batch_idx: 27590, batch size: 106, loss[dur_loss=0.2286, prior_loss=0.9807, diff_loss=0.3135, tot_loss=1.523, over 106.00 samples.], tot_loss[dur_loss=0.2211, prior_loss=0.9797, diff_loss=0.4098, tot_loss=1.61, over 1142.00 samples.], 2024-10-21 08:29:57,229 INFO [train.py:682] (1/4) Start epoch 1726 2024-10-21 08:30:05,795 INFO [train.py:561] (1/4) Epoch 1726, batch 0, global_batch_idx: 27600, batch size: 108, loss[dur_loss=0.228, prior_loss=0.9809, diff_loss=0.3466, tot_loss=1.555, over 108.00 samples.], tot_loss[dur_loss=0.228, prior_loss=0.9809, diff_loss=0.3466, tot_loss=1.555, over 108.00 samples.], 2024-10-21 08:30:19,903 INFO [train.py:561] (1/4) Epoch 1726, batch 10, global_batch_idx: 27610, batch size: 111, loss[dur_loss=0.2292, prior_loss=0.9816, diff_loss=0.3287, tot_loss=1.539, over 111.00 samples.], tot_loss[dur_loss=0.2244, prior_loss=0.9801, diff_loss=0.3946, tot_loss=1.599, over 1656.00 samples.], 2024-10-21 08:30:26,993 INFO [train.py:682] (1/4) Start epoch 1727 2024-10-21 08:30:40,711 INFO [train.py:561] (1/4) Epoch 1727, batch 4, global_batch_idx: 27620, batch size: 189, loss[dur_loss=0.2219, prior_loss=0.9805, diff_loss=0.3702, tot_loss=1.573, over 189.00 samples.], tot_loss[dur_loss=0.2204, prior_loss=0.9794, diff_loss=0.443, tot_loss=1.643, over 937.00 samples.], 2024-10-21 08:30:55,442 INFO [train.py:561] (1/4) Epoch 1727, batch 14, global_batch_idx: 27630, batch size: 142, loss[dur_loss=0.2259, prior_loss=0.9801, diff_loss=0.3081, tot_loss=1.514, over 142.00 samples.], tot_loss[dur_loss=0.225, prior_loss=0.9802, diff_loss=0.3756, tot_loss=1.581, over 2210.00 samples.], 2024-10-21 08:30:56,902 INFO [train.py:682] (1/4) Start epoch 1728 2024-10-21 08:31:16,579 INFO [train.py:561] (1/4) Epoch 1728, batch 8, global_batch_idx: 27640, batch size: 170, loss[dur_loss=0.2315, prior_loss=0.9806, diff_loss=0.3176, tot_loss=1.53, over 170.00 samples.], tot_loss[dur_loss=0.2237, prior_loss=0.98, diff_loss=0.394, tot_loss=1.598, over 1432.00 samples.], 2024-10-21 08:31:26,696 INFO [train.py:682] (1/4) Start epoch 1729 2024-10-21 08:31:37,770 INFO [train.py:561] (1/4) Epoch 1729, batch 2, global_batch_idx: 27650, batch size: 203, loss[dur_loss=0.2252, prior_loss=0.9804, diff_loss=0.3563, tot_loss=1.562, over 203.00 samples.], tot_loss[dur_loss=0.2264, prior_loss=0.9804, diff_loss=0.3296, tot_loss=1.536, over 442.00 samples.], 2024-10-21 08:31:51,871 INFO [train.py:561] (1/4) Epoch 1729, batch 12, global_batch_idx: 27660, batch size: 152, loss[dur_loss=0.2336, prior_loss=0.981, diff_loss=0.2745, tot_loss=1.489, over 152.00 samples.], tot_loss[dur_loss=0.2246, prior_loss=0.9801, diff_loss=0.363, tot_loss=1.568, over 1966.00 samples.], 2024-10-21 08:31:56,280 INFO [train.py:682] (1/4) Start epoch 1730 2024-10-21 08:32:13,354 INFO [train.py:561] (1/4) Epoch 1730, batch 6, global_batch_idx: 27670, batch size: 106, loss[dur_loss=0.2233, prior_loss=0.9804, diff_loss=0.3159, 
tot_loss=1.52, over 106.00 samples.], tot_loss[dur_loss=0.2211, prior_loss=0.9798, diff_loss=0.4112, tot_loss=1.612, over 1142.00 samples.], 2024-10-21 08:32:26,301 INFO [train.py:682] (1/4) Start epoch 1731 2024-10-21 08:32:35,027 INFO [train.py:561] (1/4) Epoch 1731, batch 0, global_batch_idx: 27680, batch size: 108, loss[dur_loss=0.2261, prior_loss=0.9809, diff_loss=0.2646, tot_loss=1.472, over 108.00 samples.], tot_loss[dur_loss=0.2261, prior_loss=0.9809, diff_loss=0.2646, tot_loss=1.472, over 108.00 samples.], 2024-10-21 08:32:49,100 INFO [train.py:561] (1/4) Epoch 1731, batch 10, global_batch_idx: 27690, batch size: 111, loss[dur_loss=0.2308, prior_loss=0.9813, diff_loss=0.3564, tot_loss=1.568, over 111.00 samples.], tot_loss[dur_loss=0.222, prior_loss=0.98, diff_loss=0.3867, tot_loss=1.589, over 1656.00 samples.], 2024-10-21 08:32:56,160 INFO [train.py:682] (1/4) Start epoch 1732 2024-10-21 08:33:09,603 INFO [train.py:561] (1/4) Epoch 1732, batch 4, global_batch_idx: 27700, batch size: 189, loss[dur_loss=0.2248, prior_loss=0.98, diff_loss=0.3439, tot_loss=1.549, over 189.00 samples.], tot_loss[dur_loss=0.2215, prior_loss=0.9792, diff_loss=0.4423, tot_loss=1.643, over 937.00 samples.], 2024-10-21 08:33:24,263 INFO [train.py:561] (1/4) Epoch 1732, batch 14, global_batch_idx: 27710, batch size: 142, loss[dur_loss=0.2276, prior_loss=0.98, diff_loss=0.2916, tot_loss=1.499, over 142.00 samples.], tot_loss[dur_loss=0.225, prior_loss=0.98, diff_loss=0.3665, tot_loss=1.571, over 2210.00 samples.], 2024-10-21 08:33:25,678 INFO [train.py:682] (1/4) Start epoch 1733 2024-10-21 08:33:45,420 INFO [train.py:561] (1/4) Epoch 1733, batch 8, global_batch_idx: 27720, batch size: 170, loss[dur_loss=0.229, prior_loss=0.9802, diff_loss=0.3373, tot_loss=1.546, over 170.00 samples.], tot_loss[dur_loss=0.2225, prior_loss=0.9796, diff_loss=0.3977, tot_loss=1.6, over 1432.00 samples.], 2024-10-21 08:33:55,421 INFO [train.py:682] (1/4) Start epoch 1734 2024-10-21 08:34:06,578 INFO [train.py:561] (1/4) Epoch 1734, batch 2, global_batch_idx: 27730, batch size: 203, loss[dur_loss=0.2234, prior_loss=0.9799, diff_loss=0.3399, tot_loss=1.543, over 203.00 samples.], tot_loss[dur_loss=0.2245, prior_loss=0.9802, diff_loss=0.3353, tot_loss=1.54, over 442.00 samples.], 2024-10-21 08:34:20,648 INFO [train.py:561] (1/4) Epoch 1734, batch 12, global_batch_idx: 27740, batch size: 152, loss[dur_loss=0.2261, prior_loss=0.9803, diff_loss=0.3254, tot_loss=1.532, over 152.00 samples.], tot_loss[dur_loss=0.223, prior_loss=0.9799, diff_loss=0.3739, tot_loss=1.577, over 1966.00 samples.], 2024-10-21 08:34:25,024 INFO [train.py:682] (1/4) Start epoch 1735 2024-10-21 08:34:41,815 INFO [train.py:561] (1/4) Epoch 1735, batch 6, global_batch_idx: 27750, batch size: 106, loss[dur_loss=0.2273, prior_loss=0.9806, diff_loss=0.3429, tot_loss=1.551, over 106.00 samples.], tot_loss[dur_loss=0.2201, prior_loss=0.9796, diff_loss=0.4223, tot_loss=1.622, over 1142.00 samples.], 2024-10-21 08:34:54,799 INFO [train.py:682] (1/4) Start epoch 1736 2024-10-21 08:35:03,493 INFO [train.py:561] (1/4) Epoch 1736, batch 0, global_batch_idx: 27760, batch size: 108, loss[dur_loss=0.2314, prior_loss=0.9811, diff_loss=0.3007, tot_loss=1.513, over 108.00 samples.], tot_loss[dur_loss=0.2314, prior_loss=0.9811, diff_loss=0.3007, tot_loss=1.513, over 108.00 samples.], 2024-10-21 08:35:17,594 INFO [train.py:561] (1/4) Epoch 1736, batch 10, global_batch_idx: 27770, batch size: 111, loss[dur_loss=0.2326, prior_loss=0.9817, diff_loss=0.3313, tot_loss=1.546, over 111.00 
samples.], tot_loss[dur_loss=0.2243, prior_loss=0.9801, diff_loss=0.3915, tot_loss=1.596, over 1656.00 samples.], 2024-10-21 08:35:24,659 INFO [train.py:682] (1/4) Start epoch 1737 2024-10-21 08:35:38,274 INFO [train.py:561] (1/4) Epoch 1737, batch 4, global_batch_idx: 27780, batch size: 189, loss[dur_loss=0.222, prior_loss=0.9805, diff_loss=0.3558, tot_loss=1.558, over 189.00 samples.], tot_loss[dur_loss=0.2195, prior_loss=0.9793, diff_loss=0.4404, tot_loss=1.639, over 937.00 samples.], 2024-10-21 08:35:53,088 INFO [train.py:561] (1/4) Epoch 1737, batch 14, global_batch_idx: 27790, batch size: 142, loss[dur_loss=0.2277, prior_loss=0.9801, diff_loss=0.2951, tot_loss=1.503, over 142.00 samples.], tot_loss[dur_loss=0.2241, prior_loss=0.98, diff_loss=0.368, tot_loss=1.572, over 2210.00 samples.], 2024-10-21 08:35:54,510 INFO [train.py:682] (1/4) Start epoch 1738 2024-10-21 08:36:14,097 INFO [train.py:561] (1/4) Epoch 1738, batch 8, global_batch_idx: 27800, batch size: 170, loss[dur_loss=0.23, prior_loss=0.9807, diff_loss=0.3559, tot_loss=1.567, over 170.00 samples.], tot_loss[dur_loss=0.2227, prior_loss=0.9799, diff_loss=0.4073, tot_loss=1.61, over 1432.00 samples.], 2024-10-21 08:36:24,067 INFO [train.py:682] (1/4) Start epoch 1739 2024-10-21 08:36:35,409 INFO [train.py:561] (1/4) Epoch 1739, batch 2, global_batch_idx: 27810, batch size: 203, loss[dur_loss=0.2259, prior_loss=0.9801, diff_loss=0.3773, tot_loss=1.583, over 203.00 samples.], tot_loss[dur_loss=0.2255, prior_loss=0.9802, diff_loss=0.3371, tot_loss=1.543, over 442.00 samples.], 2024-10-21 08:36:49,454 INFO [train.py:561] (1/4) Epoch 1739, batch 12, global_batch_idx: 27820, batch size: 152, loss[dur_loss=0.2234, prior_loss=0.9805, diff_loss=0.294, tot_loss=1.498, over 152.00 samples.], tot_loss[dur_loss=0.2227, prior_loss=0.98, diff_loss=0.3791, tot_loss=1.582, over 1966.00 samples.], 2024-10-21 08:36:53,832 INFO [train.py:682] (1/4) Start epoch 1740 2024-10-21 08:37:10,469 INFO [train.py:561] (1/4) Epoch 1740, batch 6, global_batch_idx: 27830, batch size: 106, loss[dur_loss=0.2274, prior_loss=0.9807, diff_loss=0.3025, tot_loss=1.511, over 106.00 samples.], tot_loss[dur_loss=0.2217, prior_loss=0.9796, diff_loss=0.4187, tot_loss=1.62, over 1142.00 samples.], 2024-10-21 08:37:23,351 INFO [train.py:682] (1/4) Start epoch 1741 2024-10-21 08:37:31,920 INFO [train.py:561] (1/4) Epoch 1741, batch 0, global_batch_idx: 27840, batch size: 108, loss[dur_loss=0.2266, prior_loss=0.9809, diff_loss=0.3167, tot_loss=1.524, over 108.00 samples.], tot_loss[dur_loss=0.2266, prior_loss=0.9809, diff_loss=0.3167, tot_loss=1.524, over 108.00 samples.], 2024-10-21 08:37:46,020 INFO [train.py:561] (1/4) Epoch 1741, batch 10, global_batch_idx: 27850, batch size: 111, loss[dur_loss=0.232, prior_loss=0.9817, diff_loss=0.3109, tot_loss=1.525, over 111.00 samples.], tot_loss[dur_loss=0.2234, prior_loss=0.9799, diff_loss=0.3972, tot_loss=1.6, over 1656.00 samples.], 2024-10-21 08:37:53,065 INFO [train.py:682] (1/4) Start epoch 1742 2024-10-21 08:38:06,797 INFO [train.py:561] (1/4) Epoch 1742, batch 4, global_batch_idx: 27860, batch size: 189, loss[dur_loss=0.218, prior_loss=0.9798, diff_loss=0.3299, tot_loss=1.528, over 189.00 samples.], tot_loss[dur_loss=0.2192, prior_loss=0.9791, diff_loss=0.4372, tot_loss=1.636, over 937.00 samples.], 2024-10-21 08:38:21,743 INFO [train.py:561] (1/4) Epoch 1742, batch 14, global_batch_idx: 27870, batch size: 142, loss[dur_loss=0.2224, prior_loss=0.9797, diff_loss=0.2922, tot_loss=1.494, over 142.00 samples.], 
tot_loss[dur_loss=0.2241, prior_loss=0.9799, diff_loss=0.3676, tot_loss=1.572, over 2210.00 samples.], 2024-10-21 08:38:23,159 INFO [train.py:682] (1/4) Start epoch 1743 2024-10-21 08:38:43,116 INFO [train.py:561] (1/4) Epoch 1743, batch 8, global_batch_idx: 27880, batch size: 170, loss[dur_loss=0.2314, prior_loss=0.9803, diff_loss=0.3281, tot_loss=1.54, over 170.00 samples.], tot_loss[dur_loss=0.2221, prior_loss=0.9798, diff_loss=0.3912, tot_loss=1.593, over 1432.00 samples.], 2024-10-21 08:38:53,143 INFO [train.py:682] (1/4) Start epoch 1744 2024-10-21 08:39:04,309 INFO [train.py:561] (1/4) Epoch 1744, batch 2, global_batch_idx: 27890, batch size: 203, loss[dur_loss=0.2261, prior_loss=0.9801, diff_loss=0.3816, tot_loss=1.588, over 203.00 samples.], tot_loss[dur_loss=0.2262, prior_loss=0.9803, diff_loss=0.3383, tot_loss=1.545, over 442.00 samples.], 2024-10-21 08:39:18,428 INFO [train.py:561] (1/4) Epoch 1744, batch 12, global_batch_idx: 27900, batch size: 152, loss[dur_loss=0.2272, prior_loss=0.981, diff_loss=0.3394, tot_loss=1.548, over 152.00 samples.], tot_loss[dur_loss=0.225, prior_loss=0.98, diff_loss=0.3898, tot_loss=1.595, over 1966.00 samples.], 2024-10-21 08:39:22,853 INFO [train.py:682] (1/4) Start epoch 1745 2024-10-21 08:39:39,652 INFO [train.py:561] (1/4) Epoch 1745, batch 6, global_batch_idx: 27910, batch size: 106, loss[dur_loss=0.2247, prior_loss=0.9803, diff_loss=0.2762, tot_loss=1.481, over 106.00 samples.], tot_loss[dur_loss=0.221, prior_loss=0.9794, diff_loss=0.4065, tot_loss=1.607, over 1142.00 samples.], 2024-10-21 08:39:52,630 INFO [train.py:682] (1/4) Start epoch 1746 2024-10-21 08:40:01,639 INFO [train.py:561] (1/4) Epoch 1746, batch 0, global_batch_idx: 27920, batch size: 108, loss[dur_loss=0.2313, prior_loss=0.9809, diff_loss=0.3236, tot_loss=1.536, over 108.00 samples.], tot_loss[dur_loss=0.2313, prior_loss=0.9809, diff_loss=0.3236, tot_loss=1.536, over 108.00 samples.], 2024-10-21 08:40:15,787 INFO [train.py:561] (1/4) Epoch 1746, batch 10, global_batch_idx: 27930, batch size: 111, loss[dur_loss=0.2311, prior_loss=0.9816, diff_loss=0.2857, tot_loss=1.498, over 111.00 samples.], tot_loss[dur_loss=0.2236, prior_loss=0.9798, diff_loss=0.3917, tot_loss=1.595, over 1656.00 samples.], 2024-10-21 08:40:22,825 INFO [train.py:682] (1/4) Start epoch 1747 2024-10-21 08:40:36,406 INFO [train.py:561] (1/4) Epoch 1747, batch 4, global_batch_idx: 27940, batch size: 189, loss[dur_loss=0.2249, prior_loss=0.9801, diff_loss=0.3394, tot_loss=1.544, over 189.00 samples.], tot_loss[dur_loss=0.2218, prior_loss=0.9793, diff_loss=0.4336, tot_loss=1.635, over 937.00 samples.], 2024-10-21 08:40:51,220 INFO [train.py:561] (1/4) Epoch 1747, batch 14, global_batch_idx: 27950, batch size: 142, loss[dur_loss=0.2225, prior_loss=0.98, diff_loss=0.3008, tot_loss=1.503, over 142.00 samples.], tot_loss[dur_loss=0.2254, prior_loss=0.98, diff_loss=0.3644, tot_loss=1.57, over 2210.00 samples.], 2024-10-21 08:40:52,651 INFO [train.py:682] (1/4) Start epoch 1748 2024-10-21 08:41:12,483 INFO [train.py:561] (1/4) Epoch 1748, batch 8, global_batch_idx: 27960, batch size: 170, loss[dur_loss=0.2293, prior_loss=0.9807, diff_loss=0.3314, tot_loss=1.541, over 170.00 samples.], tot_loss[dur_loss=0.223, prior_loss=0.9797, diff_loss=0.3984, tot_loss=1.601, over 1432.00 samples.], 2024-10-21 08:41:22,674 INFO [train.py:682] (1/4) Start epoch 1749 2024-10-21 08:41:33,949 INFO [train.py:561] (1/4) Epoch 1749, batch 2, global_batch_idx: 27970, batch size: 203, loss[dur_loss=0.2256, prior_loss=0.9802, 
diff_loss=0.3541, tot_loss=1.56, over 203.00 samples.], tot_loss[dur_loss=0.2269, prior_loss=0.9804, diff_loss=0.331, tot_loss=1.538, over 442.00 samples.], 2024-10-21 08:41:48,115 INFO [train.py:561] (1/4) Epoch 1749, batch 12, global_batch_idx: 27980, batch size: 152, loss[dur_loss=0.2277, prior_loss=0.9805, diff_loss=0.3596, tot_loss=1.568, over 152.00 samples.], tot_loss[dur_loss=0.2247, prior_loss=0.9799, diff_loss=0.3841, tot_loss=1.589, over 1966.00 samples.], 2024-10-21 08:41:52,550 INFO [train.py:682] (1/4) Start epoch 1750 2024-10-21 08:42:09,118 INFO [train.py:561] (1/4) Epoch 1750, batch 6, global_batch_idx: 27990, batch size: 106, loss[dur_loss=0.2253, prior_loss=0.9803, diff_loss=0.3093, tot_loss=1.515, over 106.00 samples.], tot_loss[dur_loss=0.2197, prior_loss=0.9795, diff_loss=0.4305, tot_loss=1.63, over 1142.00 samples.], 2024-10-21 08:42:22,086 INFO [train.py:682] (1/4) Start epoch 1751 2024-10-21 08:42:30,988 INFO [train.py:561] (1/4) Epoch 1751, batch 0, global_batch_idx: 28000, batch size: 108, loss[dur_loss=0.2303, prior_loss=0.9809, diff_loss=0.3381, tot_loss=1.549, over 108.00 samples.], tot_loss[dur_loss=0.2303, prior_loss=0.9809, diff_loss=0.3381, tot_loss=1.549, over 108.00 samples.], 2024-10-21 08:42:45,089 INFO [train.py:561] (1/4) Epoch 1751, batch 10, global_batch_idx: 28010, batch size: 111, loss[dur_loss=0.2281, prior_loss=0.9811, diff_loss=0.3359, tot_loss=1.545, over 111.00 samples.], tot_loss[dur_loss=0.2222, prior_loss=0.9798, diff_loss=0.3904, tot_loss=1.592, over 1656.00 samples.], 2024-10-21 08:42:52,138 INFO [train.py:682] (1/4) Start epoch 1752 2024-10-21 08:43:05,675 INFO [train.py:561] (1/4) Epoch 1752, batch 4, global_batch_idx: 28020, batch size: 189, loss[dur_loss=0.2264, prior_loss=0.9815, diff_loss=0.3275, tot_loss=1.535, over 189.00 samples.], tot_loss[dur_loss=0.22, prior_loss=0.9795, diff_loss=0.4391, tot_loss=1.639, over 937.00 samples.], 2024-10-21 08:43:20,381 INFO [train.py:561] (1/4) Epoch 1752, batch 14, global_batch_idx: 28030, batch size: 142, loss[dur_loss=0.2301, prior_loss=0.9803, diff_loss=0.3352, tot_loss=1.546, over 142.00 samples.], tot_loss[dur_loss=0.2246, prior_loss=0.9802, diff_loss=0.3789, tot_loss=1.584, over 2210.00 samples.], 2024-10-21 08:43:21,809 INFO [train.py:682] (1/4) Start epoch 1753 2024-10-21 08:43:41,509 INFO [train.py:561] (1/4) Epoch 1753, batch 8, global_batch_idx: 28040, batch size: 170, loss[dur_loss=0.2314, prior_loss=0.9802, diff_loss=0.281, tot_loss=1.493, over 170.00 samples.], tot_loss[dur_loss=0.2224, prior_loss=0.9797, diff_loss=0.3856, tot_loss=1.588, over 1432.00 samples.], 2024-10-21 08:43:51,558 INFO [train.py:682] (1/4) Start epoch 1754 2024-10-21 08:44:02,897 INFO [train.py:561] (1/4) Epoch 1754, batch 2, global_batch_idx: 28050, batch size: 203, loss[dur_loss=0.2255, prior_loss=0.9802, diff_loss=0.3371, tot_loss=1.543, over 203.00 samples.], tot_loss[dur_loss=0.2263, prior_loss=0.9803, diff_loss=0.3206, tot_loss=1.527, over 442.00 samples.], 2024-10-21 08:44:16,944 INFO [train.py:561] (1/4) Epoch 1754, batch 12, global_batch_idx: 28060, batch size: 152, loss[dur_loss=0.2268, prior_loss=0.9802, diff_loss=0.3395, tot_loss=1.547, over 152.00 samples.], tot_loss[dur_loss=0.2244, prior_loss=0.9798, diff_loss=0.368, tot_loss=1.572, over 1966.00 samples.], 2024-10-21 08:44:21,385 INFO [train.py:682] (1/4) Start epoch 1755 2024-10-21 08:44:38,546 INFO [train.py:561] (1/4) Epoch 1755, batch 6, global_batch_idx: 28070, batch size: 106, loss[dur_loss=0.2234, prior_loss=0.9803, diff_loss=0.3308, 
tot_loss=1.535, over 106.00 samples.], tot_loss[dur_loss=0.2212, prior_loss=0.9794, diff_loss=0.4186, tot_loss=1.619, over 1142.00 samples.], 2024-10-21 08:44:51,461 INFO [train.py:682] (1/4) Start epoch 1756 2024-10-21 08:45:00,030 INFO [train.py:561] (1/4) Epoch 1756, batch 0, global_batch_idx: 28080, batch size: 108, loss[dur_loss=0.2304, prior_loss=0.9808, diff_loss=0.3156, tot_loss=1.527, over 108.00 samples.], tot_loss[dur_loss=0.2304, prior_loss=0.9808, diff_loss=0.3156, tot_loss=1.527, over 108.00 samples.], 2024-10-21 08:45:14,110 INFO [train.py:561] (1/4) Epoch 1756, batch 10, global_batch_idx: 28090, batch size: 111, loss[dur_loss=0.2255, prior_loss=0.9811, diff_loss=0.2933, tot_loss=1.5, over 111.00 samples.], tot_loss[dur_loss=0.2235, prior_loss=0.9797, diff_loss=0.3735, tot_loss=1.577, over 1656.00 samples.], 2024-10-21 08:45:21,139 INFO [train.py:682] (1/4) Start epoch 1757 2024-10-21 08:45:34,751 INFO [train.py:561] (1/4) Epoch 1757, batch 4, global_batch_idx: 28100, batch size: 189, loss[dur_loss=0.2219, prior_loss=0.9805, diff_loss=0.3338, tot_loss=1.536, over 189.00 samples.], tot_loss[dur_loss=0.2182, prior_loss=0.9792, diff_loss=0.4374, tot_loss=1.635, over 937.00 samples.], 2024-10-21 08:45:49,445 INFO [train.py:561] (1/4) Epoch 1757, batch 14, global_batch_idx: 28110, batch size: 142, loss[dur_loss=0.2225, prior_loss=0.9799, diff_loss=0.3377, tot_loss=1.54, over 142.00 samples.], tot_loss[dur_loss=0.2232, prior_loss=0.9799, diff_loss=0.3688, tot_loss=1.572, over 2210.00 samples.], 2024-10-21 08:45:50,865 INFO [train.py:682] (1/4) Start epoch 1758 2024-10-21 08:46:10,451 INFO [train.py:561] (1/4) Epoch 1758, batch 8, global_batch_idx: 28120, batch size: 170, loss[dur_loss=0.2329, prior_loss=0.9807, diff_loss=0.3406, tot_loss=1.554, over 170.00 samples.], tot_loss[dur_loss=0.2232, prior_loss=0.9797, diff_loss=0.4046, tot_loss=1.607, over 1432.00 samples.], 2024-10-21 08:46:20,533 INFO [train.py:682] (1/4) Start epoch 1759 2024-10-21 08:46:31,913 INFO [train.py:561] (1/4) Epoch 1759, batch 2, global_batch_idx: 28130, batch size: 203, loss[dur_loss=0.2233, prior_loss=0.98, diff_loss=0.3321, tot_loss=1.535, over 203.00 samples.], tot_loss[dur_loss=0.2258, prior_loss=0.9801, diff_loss=0.3223, tot_loss=1.528, over 442.00 samples.], 2024-10-21 08:46:45,936 INFO [train.py:561] (1/4) Epoch 1759, batch 12, global_batch_idx: 28140, batch size: 152, loss[dur_loss=0.2259, prior_loss=0.9803, diff_loss=0.3298, tot_loss=1.536, over 152.00 samples.], tot_loss[dur_loss=0.2232, prior_loss=0.9799, diff_loss=0.3668, tot_loss=1.57, over 1966.00 samples.], 2024-10-21 08:46:50,370 INFO [train.py:682] (1/4) Start epoch 1760 2024-10-21 08:47:07,217 INFO [train.py:561] (1/4) Epoch 1760, batch 6, global_batch_idx: 28150, batch size: 106, loss[dur_loss=0.2218, prior_loss=0.9803, diff_loss=0.274, tot_loss=1.476, over 106.00 samples.], tot_loss[dur_loss=0.2198, prior_loss=0.9796, diff_loss=0.407, tot_loss=1.606, over 1142.00 samples.], 2024-10-21 08:47:20,123 INFO [train.py:682] (1/4) Start epoch 1761 2024-10-21 08:47:28,709 INFO [train.py:561] (1/4) Epoch 1761, batch 0, global_batch_idx: 28160, batch size: 108, loss[dur_loss=0.2259, prior_loss=0.981, diff_loss=0.3288, tot_loss=1.536, over 108.00 samples.], tot_loss[dur_loss=0.2259, prior_loss=0.981, diff_loss=0.3288, tot_loss=1.536, over 108.00 samples.], 2024-10-21 08:47:42,808 INFO [train.py:561] (1/4) Epoch 1761, batch 10, global_batch_idx: 28170, batch size: 111, loss[dur_loss=0.2336, prior_loss=0.9815, diff_loss=0.3133, tot_loss=1.528, over 
111.00 samples.], tot_loss[dur_loss=0.2235, prior_loss=0.9798, diff_loss=0.3853, tot_loss=1.589, over 1656.00 samples.], 2024-10-21 08:47:49,821 INFO [train.py:682] (1/4) Start epoch 1762 2024-10-21 08:48:03,389 INFO [train.py:561] (1/4) Epoch 1762, batch 4, global_batch_idx: 28180, batch size: 189, loss[dur_loss=0.2239, prior_loss=0.9804, diff_loss=0.3379, tot_loss=1.542, over 189.00 samples.], tot_loss[dur_loss=0.219, prior_loss=0.9793, diff_loss=0.441, tot_loss=1.639, over 937.00 samples.], 2024-10-21 08:48:18,186 INFO [train.py:561] (1/4) Epoch 1762, batch 14, global_batch_idx: 28190, batch size: 142, loss[dur_loss=0.2258, prior_loss=0.9799, diff_loss=0.2943, tot_loss=1.5, over 142.00 samples.], tot_loss[dur_loss=0.2236, prior_loss=0.98, diff_loss=0.3684, tot_loss=1.572, over 2210.00 samples.], 2024-10-21 08:48:19,612 INFO [train.py:682] (1/4) Start epoch 1763 2024-10-21 08:48:39,851 INFO [train.py:561] (1/4) Epoch 1763, batch 8, global_batch_idx: 28200, batch size: 170, loss[dur_loss=0.23, prior_loss=0.9802, diff_loss=0.3358, tot_loss=1.546, over 170.00 samples.], tot_loss[dur_loss=0.2222, prior_loss=0.9797, diff_loss=0.3927, tot_loss=1.595, over 1432.00 samples.], 2024-10-21 08:48:49,878 INFO [train.py:682] (1/4) Start epoch 1764 2024-10-21 08:49:01,206 INFO [train.py:561] (1/4) Epoch 1764, batch 2, global_batch_idx: 28210, batch size: 203, loss[dur_loss=0.2264, prior_loss=0.9801, diff_loss=0.3312, tot_loss=1.538, over 203.00 samples.], tot_loss[dur_loss=0.2276, prior_loss=0.9802, diff_loss=0.3298, tot_loss=1.538, over 442.00 samples.], 2024-10-21 08:49:15,231 INFO [train.py:561] (1/4) Epoch 1764, batch 12, global_batch_idx: 28220, batch size: 152, loss[dur_loss=0.2239, prior_loss=0.9802, diff_loss=0.3368, tot_loss=1.541, over 152.00 samples.], tot_loss[dur_loss=0.2236, prior_loss=0.9798, diff_loss=0.3743, tot_loss=1.578, over 1966.00 samples.], 2024-10-21 08:49:19,641 INFO [train.py:682] (1/4) Start epoch 1765 2024-10-21 08:49:36,392 INFO [train.py:561] (1/4) Epoch 1765, batch 6, global_batch_idx: 28230, batch size: 106, loss[dur_loss=0.2242, prior_loss=0.9805, diff_loss=0.2973, tot_loss=1.502, over 106.00 samples.], tot_loss[dur_loss=0.2205, prior_loss=0.9793, diff_loss=0.4312, tot_loss=1.631, over 1142.00 samples.], 2024-10-21 08:49:49,296 INFO [train.py:682] (1/4) Start epoch 1766 2024-10-21 08:49:57,853 INFO [train.py:561] (1/4) Epoch 1766, batch 0, global_batch_idx: 28240, batch size: 108, loss[dur_loss=0.2278, prior_loss=0.9809, diff_loss=0.3234, tot_loss=1.532, over 108.00 samples.], tot_loss[dur_loss=0.2278, prior_loss=0.9809, diff_loss=0.3234, tot_loss=1.532, over 108.00 samples.], 2024-10-21 08:50:12,002 INFO [train.py:561] (1/4) Epoch 1766, batch 10, global_batch_idx: 28250, batch size: 111, loss[dur_loss=0.2338, prior_loss=0.9816, diff_loss=0.3223, tot_loss=1.538, over 111.00 samples.], tot_loss[dur_loss=0.2233, prior_loss=0.9798, diff_loss=0.3765, tot_loss=1.58, over 1656.00 samples.], 2024-10-21 08:50:19,055 INFO [train.py:682] (1/4) Start epoch 1767 2024-10-21 08:50:32,741 INFO [train.py:561] (1/4) Epoch 1767, batch 4, global_batch_idx: 28260, batch size: 189, loss[dur_loss=0.2265, prior_loss=0.9802, diff_loss=0.3261, tot_loss=1.533, over 189.00 samples.], tot_loss[dur_loss=0.2216, prior_loss=0.9791, diff_loss=0.4421, tot_loss=1.643, over 937.00 samples.], 2024-10-21 08:50:47,541 INFO [train.py:561] (1/4) Epoch 1767, batch 14, global_batch_idx: 28270, batch size: 142, loss[dur_loss=0.2242, prior_loss=0.9802, diff_loss=0.2979, tot_loss=1.502, over 142.00 samples.], 
tot_loss[dur_loss=0.2254, prior_loss=0.9799, diff_loss=0.3705, tot_loss=1.576, over 2210.00 samples.], 2024-10-21 08:50:48,960 INFO [train.py:682] (1/4) Start epoch 1768 2024-10-21 08:51:08,497 INFO [train.py:561] (1/4) Epoch 1768, batch 8, global_batch_idx: 28280, batch size: 170, loss[dur_loss=0.2269, prior_loss=0.9801, diff_loss=0.3315, tot_loss=1.539, over 170.00 samples.], tot_loss[dur_loss=0.222, prior_loss=0.9796, diff_loss=0.394, tot_loss=1.596, over 1432.00 samples.], 2024-10-21 08:51:18,522 INFO [train.py:682] (1/4) Start epoch 1769 2024-10-21 08:51:29,965 INFO [train.py:561] (1/4) Epoch 1769, batch 2, global_batch_idx: 28290, batch size: 203, loss[dur_loss=0.2274, prior_loss=0.9802, diff_loss=0.373, tot_loss=1.581, over 203.00 samples.], tot_loss[dur_loss=0.2264, prior_loss=0.9803, diff_loss=0.3405, tot_loss=1.547, over 442.00 samples.], 2024-10-21 08:51:44,047 INFO [train.py:561] (1/4) Epoch 1769, batch 12, global_batch_idx: 28300, batch size: 152, loss[dur_loss=0.2277, prior_loss=0.9804, diff_loss=0.3355, tot_loss=1.544, over 152.00 samples.], tot_loss[dur_loss=0.2235, prior_loss=0.9798, diff_loss=0.3716, tot_loss=1.575, over 1966.00 samples.], 2024-10-21 08:51:48,455 INFO [train.py:682] (1/4) Start epoch 1770 2024-10-21 08:52:05,317 INFO [train.py:561] (1/4) Epoch 1770, batch 6, global_batch_idx: 28310, batch size: 106, loss[dur_loss=0.2224, prior_loss=0.9801, diff_loss=0.3272, tot_loss=1.53, over 106.00 samples.], tot_loss[dur_loss=0.2207, prior_loss=0.9794, diff_loss=0.4064, tot_loss=1.606, over 1142.00 samples.], 2024-10-21 08:52:18,310 INFO [train.py:682] (1/4) Start epoch 1771 2024-10-21 08:52:27,138 INFO [train.py:561] (1/4) Epoch 1771, batch 0, global_batch_idx: 28320, batch size: 108, loss[dur_loss=0.2256, prior_loss=0.9806, diff_loss=0.3189, tot_loss=1.525, over 108.00 samples.], tot_loss[dur_loss=0.2256, prior_loss=0.9806, diff_loss=0.3189, tot_loss=1.525, over 108.00 samples.], 2024-10-21 08:52:41,272 INFO [train.py:561] (1/4) Epoch 1771, batch 10, global_batch_idx: 28330, batch size: 111, loss[dur_loss=0.2317, prior_loss=0.9814, diff_loss=0.3241, tot_loss=1.537, over 111.00 samples.], tot_loss[dur_loss=0.2225, prior_loss=0.9796, diff_loss=0.3838, tot_loss=1.586, over 1656.00 samples.], 2024-10-21 08:52:48,325 INFO [train.py:682] (1/4) Start epoch 1772 2024-10-21 08:53:01,841 INFO [train.py:561] (1/4) Epoch 1772, batch 4, global_batch_idx: 28340, batch size: 189, loss[dur_loss=0.2234, prior_loss=0.9803, diff_loss=0.3449, tot_loss=1.549, over 189.00 samples.], tot_loss[dur_loss=0.2182, prior_loss=0.9791, diff_loss=0.4337, tot_loss=1.631, over 937.00 samples.], 2024-10-21 08:53:16,518 INFO [train.py:561] (1/4) Epoch 1772, batch 14, global_batch_idx: 28350, batch size: 142, loss[dur_loss=0.222, prior_loss=0.9799, diff_loss=0.2672, tot_loss=1.469, over 142.00 samples.], tot_loss[dur_loss=0.2233, prior_loss=0.9799, diff_loss=0.3652, tot_loss=1.568, over 2210.00 samples.], 2024-10-21 08:53:17,937 INFO [train.py:682] (1/4) Start epoch 1773 2024-10-21 08:53:37,484 INFO [train.py:561] (1/4) Epoch 1773, batch 8, global_batch_idx: 28360, batch size: 170, loss[dur_loss=0.2294, prior_loss=0.9801, diff_loss=0.3423, tot_loss=1.552, over 170.00 samples.], tot_loss[dur_loss=0.2218, prior_loss=0.9795, diff_loss=0.396, tot_loss=1.597, over 1432.00 samples.], 2024-10-21 08:53:47,538 INFO [train.py:682] (1/4) Start epoch 1774 2024-10-21 08:53:58,859 INFO [train.py:561] (1/4) Epoch 1774, batch 2, global_batch_idx: 28370, batch size: 203, loss[dur_loss=0.2245, prior_loss=0.9799, 
diff_loss=0.3597, tot_loss=1.564, over 203.00 samples.], tot_loss[dur_loss=0.2253, prior_loss=0.9802, diff_loss=0.3304, tot_loss=1.536, over 442.00 samples.], 2024-10-21 08:54:12,986 INFO [train.py:561] (1/4) Epoch 1774, batch 12, global_batch_idx: 28380, batch size: 152, loss[dur_loss=0.2245, prior_loss=0.9801, diff_loss=0.3113, tot_loss=1.516, over 152.00 samples.], tot_loss[dur_loss=0.2238, prior_loss=0.9798, diff_loss=0.3642, tot_loss=1.568, over 1966.00 samples.], 2024-10-21 08:54:17,401 INFO [train.py:682] (1/4) Start epoch 1775 2024-10-21 08:54:34,142 INFO [train.py:561] (1/4) Epoch 1775, batch 6, global_batch_idx: 28390, batch size: 106, loss[dur_loss=0.222, prior_loss=0.98, diff_loss=0.3349, tot_loss=1.537, over 106.00 samples.], tot_loss[dur_loss=0.2201, prior_loss=0.9794, diff_loss=0.4125, tot_loss=1.612, over 1142.00 samples.], 2024-10-21 08:54:47,087 INFO [train.py:682] (1/4) Start epoch 1776 2024-10-21 08:54:55,522 INFO [train.py:561] (1/4) Epoch 1776, batch 0, global_batch_idx: 28400, batch size: 108, loss[dur_loss=0.2281, prior_loss=0.9806, diff_loss=0.2994, tot_loss=1.508, over 108.00 samples.], tot_loss[dur_loss=0.2281, prior_loss=0.9806, diff_loss=0.2994, tot_loss=1.508, over 108.00 samples.], 2024-10-21 08:55:09,631 INFO [train.py:561] (1/4) Epoch 1776, batch 10, global_batch_idx: 28410, batch size: 111, loss[dur_loss=0.2312, prior_loss=0.9812, diff_loss=0.2842, tot_loss=1.497, over 111.00 samples.], tot_loss[dur_loss=0.2221, prior_loss=0.9796, diff_loss=0.3858, tot_loss=1.588, over 1656.00 samples.], 2024-10-21 08:55:16,704 INFO [train.py:682] (1/4) Start epoch 1777 2024-10-21 08:55:30,597 INFO [train.py:561] (1/4) Epoch 1777, batch 4, global_batch_idx: 28420, batch size: 189, loss[dur_loss=0.2247, prior_loss=0.9801, diff_loss=0.3498, tot_loss=1.555, over 189.00 samples.], tot_loss[dur_loss=0.218, prior_loss=0.9789, diff_loss=0.4476, tot_loss=1.645, over 937.00 samples.], 2024-10-21 08:55:45,266 INFO [train.py:561] (1/4) Epoch 1777, batch 14, global_batch_idx: 28430, batch size: 142, loss[dur_loss=0.2246, prior_loss=0.9799, diff_loss=0.3214, tot_loss=1.526, over 142.00 samples.], tot_loss[dur_loss=0.2229, prior_loss=0.9797, diff_loss=0.3771, tot_loss=1.58, over 2210.00 samples.], 2024-10-21 08:55:46,691 INFO [train.py:682] (1/4) Start epoch 1778 2024-10-21 08:56:06,469 INFO [train.py:561] (1/4) Epoch 1778, batch 8, global_batch_idx: 28440, batch size: 170, loss[dur_loss=0.232, prior_loss=0.98, diff_loss=0.3259, tot_loss=1.538, over 170.00 samples.], tot_loss[dur_loss=0.2211, prior_loss=0.9795, diff_loss=0.3909, tot_loss=1.591, over 1432.00 samples.], 2024-10-21 08:56:16,482 INFO [train.py:682] (1/4) Start epoch 1779 2024-10-21 08:56:27,885 INFO [train.py:561] (1/4) Epoch 1779, batch 2, global_batch_idx: 28450, batch size: 203, loss[dur_loss=0.2258, prior_loss=0.9799, diff_loss=0.3727, tot_loss=1.578, over 203.00 samples.], tot_loss[dur_loss=0.2255, prior_loss=0.9801, diff_loss=0.3344, tot_loss=1.54, over 442.00 samples.], 2024-10-21 08:56:42,020 INFO [train.py:561] (1/4) Epoch 1779, batch 12, global_batch_idx: 28460, batch size: 152, loss[dur_loss=0.2275, prior_loss=0.9805, diff_loss=0.3347, tot_loss=1.543, over 152.00 samples.], tot_loss[dur_loss=0.2239, prior_loss=0.9798, diff_loss=0.3786, tot_loss=1.582, over 1966.00 samples.], 2024-10-21 08:56:46,436 INFO [train.py:682] (1/4) Start epoch 1780 2024-10-21 08:57:03,297 INFO [train.py:561] (1/4) Epoch 1780, batch 6, global_batch_idx: 28470, batch size: 106, loss[dur_loss=0.224, prior_loss=0.9803, diff_loss=0.288, 
tot_loss=1.492, over 106.00 samples.], tot_loss[dur_loss=0.221, prior_loss=0.9793, diff_loss=0.4075, tot_loss=1.608, over 1142.00 samples.], 2024-10-21 08:57:16,263 INFO [train.py:682] (1/4) Start epoch 1781 2024-10-21 08:57:24,922 INFO [train.py:561] (1/4) Epoch 1781, batch 0, global_batch_idx: 28480, batch size: 108, loss[dur_loss=0.2256, prior_loss=0.9806, diff_loss=0.3234, tot_loss=1.53, over 108.00 samples.], tot_loss[dur_loss=0.2256, prior_loss=0.9806, diff_loss=0.3234, tot_loss=1.53, over 108.00 samples.], 2024-10-21 08:57:39,017 INFO [train.py:561] (1/4) Epoch 1781, batch 10, global_batch_idx: 28490, batch size: 111, loss[dur_loss=0.2287, prior_loss=0.9812, diff_loss=0.3583, tot_loss=1.568, over 111.00 samples.], tot_loss[dur_loss=0.2219, prior_loss=0.9797, diff_loss=0.3857, tot_loss=1.587, over 1656.00 samples.], 2024-10-21 08:57:46,038 INFO [train.py:682] (1/4) Start epoch 1782 2024-10-21 08:57:59,707 INFO [train.py:561] (1/4) Epoch 1782, batch 4, global_batch_idx: 28500, batch size: 189, loss[dur_loss=0.226, prior_loss=0.9804, diff_loss=0.3683, tot_loss=1.575, over 189.00 samples.], tot_loss[dur_loss=0.2195, prior_loss=0.9791, diff_loss=0.4383, tot_loss=1.637, over 937.00 samples.], 2024-10-21 08:58:01,339 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 08:59:01,847 INFO [train.py:589] (1/4) Epoch 1782, validation: dur_loss=0.4586, prior_loss=1.034, diff_loss=0.3749, tot_loss=1.868, over 100.00 samples. 2024-10-21 08:59:01,849 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 08:59:15,083 INFO [train.py:561] (1/4) Epoch 1782, batch 14, global_batch_idx: 28510, batch size: 142, loss[dur_loss=0.2261, prior_loss=0.98, diff_loss=0.3034, tot_loss=1.51, over 142.00 samples.], tot_loss[dur_loss=0.2238, prior_loss=0.9798, diff_loss=0.3754, tot_loss=1.579, over 2210.00 samples.], 2024-10-21 08:59:16,500 INFO [train.py:682] (1/4) Start epoch 1783 2024-10-21 08:59:36,241 INFO [train.py:561] (1/4) Epoch 1783, batch 8, global_batch_idx: 28520, batch size: 170, loss[dur_loss=0.2252, prior_loss=0.98, diff_loss=0.315, tot_loss=1.52, over 170.00 samples.], tot_loss[dur_loss=0.2207, prior_loss=0.9794, diff_loss=0.398, tot_loss=1.598, over 1432.00 samples.], 2024-10-21 08:59:46,221 INFO [train.py:682] (1/4) Start epoch 1784 2024-10-21 08:59:57,831 INFO [train.py:561] (1/4) Epoch 1784, batch 2, global_batch_idx: 28530, batch size: 203, loss[dur_loss=0.2216, prior_loss=0.9798, diff_loss=0.3378, tot_loss=1.539, over 203.00 samples.], tot_loss[dur_loss=0.2217, prior_loss=0.98, diff_loss=0.3376, tot_loss=1.539, over 442.00 samples.], 2024-10-21 09:00:11,896 INFO [train.py:561] (1/4) Epoch 1784, batch 12, global_batch_idx: 28540, batch size: 152, loss[dur_loss=0.2268, prior_loss=0.9801, diff_loss=0.3468, tot_loss=1.554, over 152.00 samples.], tot_loss[dur_loss=0.2219, prior_loss=0.9797, diff_loss=0.3752, tot_loss=1.577, over 1966.00 samples.], 2024-10-21 09:00:16,288 INFO [train.py:682] (1/4) Start epoch 1785 2024-10-21 09:00:33,303 INFO [train.py:561] (1/4) Epoch 1785, batch 6, global_batch_idx: 28550, batch size: 106, loss[dur_loss=0.2282, prior_loss=0.9804, diff_loss=0.3138, tot_loss=1.522, over 106.00 samples.], tot_loss[dur_loss=0.2203, prior_loss=0.9792, diff_loss=0.4066, tot_loss=1.606, over 1142.00 samples.], 2024-10-21 09:00:46,286 INFO [train.py:682] (1/4) Start epoch 1786 2024-10-21 09:00:54,974 INFO [train.py:561] (1/4) Epoch 1786, batch 0, global_batch_idx: 28560, batch size: 108, loss[dur_loss=0.2272, prior_loss=0.9806, diff_loss=0.3155, 
tot_loss=1.523, over 108.00 samples.], tot_loss[dur_loss=0.2272, prior_loss=0.9806, diff_loss=0.3155, tot_loss=1.523, over 108.00 samples.], 2024-10-21 09:01:09,140 INFO [train.py:561] (1/4) Epoch 1786, batch 10, global_batch_idx: 28570, batch size: 111, loss[dur_loss=0.2272, prior_loss=0.9811, diff_loss=0.3145, tot_loss=1.523, over 111.00 samples.], tot_loss[dur_loss=0.2221, prior_loss=0.9796, diff_loss=0.3862, tot_loss=1.588, over 1656.00 samples.], 2024-10-21 09:01:16,182 INFO [train.py:682] (1/4) Start epoch 1787 2024-10-21 09:01:30,005 INFO [train.py:561] (1/4) Epoch 1787, batch 4, global_batch_idx: 28580, batch size: 189, loss[dur_loss=0.2287, prior_loss=0.9806, diff_loss=0.3577, tot_loss=1.567, over 189.00 samples.], tot_loss[dur_loss=0.2198, prior_loss=0.9791, diff_loss=0.4387, tot_loss=1.638, over 937.00 samples.], 2024-10-21 09:01:44,729 INFO [train.py:561] (1/4) Epoch 1787, batch 14, global_batch_idx: 28590, batch size: 142, loss[dur_loss=0.2247, prior_loss=0.9799, diff_loss=0.2966, tot_loss=1.501, over 142.00 samples.], tot_loss[dur_loss=0.2231, prior_loss=0.9797, diff_loss=0.3657, tot_loss=1.569, over 2210.00 samples.], 2024-10-21 09:01:46,141 INFO [train.py:682] (1/4) Start epoch 1788 2024-10-21 09:02:06,116 INFO [train.py:561] (1/4) Epoch 1788, batch 8, global_batch_idx: 28600, batch size: 170, loss[dur_loss=0.2261, prior_loss=0.9799, diff_loss=0.3475, tot_loss=1.554, over 170.00 samples.], tot_loss[dur_loss=0.2208, prior_loss=0.9794, diff_loss=0.3969, tot_loss=1.597, over 1432.00 samples.], 2024-10-21 09:02:16,094 INFO [train.py:682] (1/4) Start epoch 1789 2024-10-21 09:02:27,422 INFO [train.py:561] (1/4) Epoch 1789, batch 2, global_batch_idx: 28610, batch size: 203, loss[dur_loss=0.2228, prior_loss=0.9801, diff_loss=0.3524, tot_loss=1.555, over 203.00 samples.], tot_loss[dur_loss=0.2234, prior_loss=0.9801, diff_loss=0.3287, tot_loss=1.532, over 442.00 samples.], 2024-10-21 09:02:41,645 INFO [train.py:561] (1/4) Epoch 1789, batch 12, global_batch_idx: 28620, batch size: 152, loss[dur_loss=0.2293, prior_loss=0.9805, diff_loss=0.292, tot_loss=1.502, over 152.00 samples.], tot_loss[dur_loss=0.2227, prior_loss=0.9797, diff_loss=0.3647, tot_loss=1.567, over 1966.00 samples.], 2024-10-21 09:02:46,072 INFO [train.py:682] (1/4) Start epoch 1790 2024-10-21 09:03:03,050 INFO [train.py:561] (1/4) Epoch 1790, batch 6, global_batch_idx: 28630, batch size: 106, loss[dur_loss=0.2258, prior_loss=0.9801, diff_loss=0.2954, tot_loss=1.501, over 106.00 samples.], tot_loss[dur_loss=0.2207, prior_loss=0.9792, diff_loss=0.4139, tot_loss=1.614, over 1142.00 samples.], 2024-10-21 09:03:16,042 INFO [train.py:682] (1/4) Start epoch 1791 2024-10-21 09:03:24,843 INFO [train.py:561] (1/4) Epoch 1791, batch 0, global_batch_idx: 28640, batch size: 108, loss[dur_loss=0.2274, prior_loss=0.9807, diff_loss=0.2897, tot_loss=1.498, over 108.00 samples.], tot_loss[dur_loss=0.2274, prior_loss=0.9807, diff_loss=0.2897, tot_loss=1.498, over 108.00 samples.], 2024-10-21 09:03:39,012 INFO [train.py:561] (1/4) Epoch 1791, batch 10, global_batch_idx: 28650, batch size: 111, loss[dur_loss=0.2297, prior_loss=0.981, diff_loss=0.2852, tot_loss=1.496, over 111.00 samples.], tot_loss[dur_loss=0.221, prior_loss=0.9795, diff_loss=0.372, tot_loss=1.572, over 1656.00 samples.], 2024-10-21 09:03:46,067 INFO [train.py:682] (1/4) Start epoch 1792 2024-10-21 09:03:59,875 INFO [train.py:561] (1/4) Epoch 1792, batch 4, global_batch_idx: 28660, batch size: 189, loss[dur_loss=0.2279, prior_loss=0.9799, diff_loss=0.368, tot_loss=1.576, 
over 189.00 samples.], tot_loss[dur_loss=0.219, prior_loss=0.9789, diff_loss=0.4395, tot_loss=1.637, over 937.00 samples.], 2024-10-21 09:04:14,730 INFO [train.py:561] (1/4) Epoch 1792, batch 14, global_batch_idx: 28670, batch size: 142, loss[dur_loss=0.221, prior_loss=0.9798, diff_loss=0.333, tot_loss=1.534, over 142.00 samples.], tot_loss[dur_loss=0.2228, prior_loss=0.9796, diff_loss=0.3764, tot_loss=1.579, over 2210.00 samples.], 2024-10-21 09:04:16,152 INFO [train.py:682] (1/4) Start epoch 1793 2024-10-21 09:04:36,268 INFO [train.py:561] (1/4) Epoch 1793, batch 8, global_batch_idx: 28680, batch size: 170, loss[dur_loss=0.2229, prior_loss=0.9797, diff_loss=0.3236, tot_loss=1.526, over 170.00 samples.], tot_loss[dur_loss=0.22, prior_loss=0.9792, diff_loss=0.4005, tot_loss=1.6, over 1432.00 samples.], 2024-10-21 09:04:46,374 INFO [train.py:682] (1/4) Start epoch 1794 2024-10-21 09:04:57,649 INFO [train.py:561] (1/4) Epoch 1794, batch 2, global_batch_idx: 28690, batch size: 203, loss[dur_loss=0.2204, prior_loss=0.9796, diff_loss=0.3689, tot_loss=1.569, over 203.00 samples.], tot_loss[dur_loss=0.2227, prior_loss=0.9799, diff_loss=0.3444, tot_loss=1.547, over 442.00 samples.], 2024-10-21 09:05:11,848 INFO [train.py:561] (1/4) Epoch 1794, batch 12, global_batch_idx: 28700, batch size: 152, loss[dur_loss=0.2243, prior_loss=0.98, diff_loss=0.3133, tot_loss=1.518, over 152.00 samples.], tot_loss[dur_loss=0.2214, prior_loss=0.9795, diff_loss=0.3778, tot_loss=1.579, over 1966.00 samples.], 2024-10-21 09:05:16,301 INFO [train.py:682] (1/4) Start epoch 1795 2024-10-21 09:05:33,344 INFO [train.py:561] (1/4) Epoch 1795, batch 6, global_batch_idx: 28710, batch size: 106, loss[dur_loss=0.2275, prior_loss=0.9803, diff_loss=0.366, tot_loss=1.574, over 106.00 samples.], tot_loss[dur_loss=0.2199, prior_loss=0.9792, diff_loss=0.4192, tot_loss=1.618, over 1142.00 samples.], 2024-10-21 09:05:46,346 INFO [train.py:682] (1/4) Start epoch 1796 2024-10-21 09:05:55,204 INFO [train.py:561] (1/4) Epoch 1796, batch 0, global_batch_idx: 28720, batch size: 108, loss[dur_loss=0.2297, prior_loss=0.9807, diff_loss=0.3117, tot_loss=1.522, over 108.00 samples.], tot_loss[dur_loss=0.2297, prior_loss=0.9807, diff_loss=0.3117, tot_loss=1.522, over 108.00 samples.], 2024-10-21 09:06:09,330 INFO [train.py:561] (1/4) Epoch 1796, batch 10, global_batch_idx: 28730, batch size: 111, loss[dur_loss=0.2288, prior_loss=0.9807, diff_loss=0.3161, tot_loss=1.526, over 111.00 samples.], tot_loss[dur_loss=0.2205, prior_loss=0.9795, diff_loss=0.3888, tot_loss=1.589, over 1656.00 samples.], 2024-10-21 09:06:16,445 INFO [train.py:682] (1/4) Start epoch 1797 2024-10-21 09:06:30,114 INFO [train.py:561] (1/4) Epoch 1797, batch 4, global_batch_idx: 28740, batch size: 189, loss[dur_loss=0.2225, prior_loss=0.9802, diff_loss=0.3708, tot_loss=1.574, over 189.00 samples.], tot_loss[dur_loss=0.2176, prior_loss=0.9789, diff_loss=0.4501, tot_loss=1.647, over 937.00 samples.], 2024-10-21 09:06:44,894 INFO [train.py:561] (1/4) Epoch 1797, batch 14, global_batch_idx: 28750, batch size: 142, loss[dur_loss=0.2262, prior_loss=0.9804, diff_loss=0.3485, tot_loss=1.555, over 142.00 samples.], tot_loss[dur_loss=0.2227, prior_loss=0.9797, diff_loss=0.3737, tot_loss=1.576, over 2210.00 samples.], 2024-10-21 09:06:46,311 INFO [train.py:682] (1/4) Start epoch 1798 2024-10-21 09:07:06,108 INFO [train.py:561] (1/4) Epoch 1798, batch 8, global_batch_idx: 28760, batch size: 170, loss[dur_loss=0.2286, prior_loss=0.9803, diff_loss=0.3276, tot_loss=1.537, over 170.00 samples.], 
tot_loss[dur_loss=0.2212, prior_loss=0.9794, diff_loss=0.4056, tot_loss=1.606, over 1432.00 samples.], 2024-10-21 09:07:16,189 INFO [train.py:682] (1/4) Start epoch 1799 2024-10-21 09:07:27,480 INFO [train.py:561] (1/4) Epoch 1799, batch 2, global_batch_idx: 28770, batch size: 203, loss[dur_loss=0.2214, prior_loss=0.9798, diff_loss=0.3471, tot_loss=1.548, over 203.00 samples.], tot_loss[dur_loss=0.2235, prior_loss=0.9799, diff_loss=0.3334, tot_loss=1.537, over 442.00 samples.], 2024-10-21 09:07:41,652 INFO [train.py:561] (1/4) Epoch 1799, batch 12, global_batch_idx: 28780, batch size: 152, loss[dur_loss=0.2269, prior_loss=0.9805, diff_loss=0.3277, tot_loss=1.535, over 152.00 samples.], tot_loss[dur_loss=0.222, prior_loss=0.9796, diff_loss=0.3693, tot_loss=1.571, over 1966.00 samples.], 2024-10-21 09:07:46,061 INFO [train.py:682] (1/4) Start epoch 1800 2024-10-21 09:08:02,749 INFO [train.py:561] (1/4) Epoch 1800, batch 6, global_batch_idx: 28790, batch size: 106, loss[dur_loss=0.2231, prior_loss=0.9801, diff_loss=0.3047, tot_loss=1.508, over 106.00 samples.], tot_loss[dur_loss=0.2201, prior_loss=0.9792, diff_loss=0.4141, tot_loss=1.613, over 1142.00 samples.], 2024-10-21 09:08:15,755 INFO [train.py:682] (1/4) Start epoch 1801 2024-10-21 09:08:24,188 INFO [train.py:561] (1/4) Epoch 1801, batch 0, global_batch_idx: 28800, batch size: 108, loss[dur_loss=0.2268, prior_loss=0.9806, diff_loss=0.3042, tot_loss=1.512, over 108.00 samples.], tot_loss[dur_loss=0.2268, prior_loss=0.9806, diff_loss=0.3042, tot_loss=1.512, over 108.00 samples.], 2024-10-21 09:08:38,426 INFO [train.py:561] (1/4) Epoch 1801, batch 10, global_batch_idx: 28810, batch size: 111, loss[dur_loss=0.2265, prior_loss=0.9809, diff_loss=0.3161, tot_loss=1.524, over 111.00 samples.], tot_loss[dur_loss=0.2204, prior_loss=0.9794, diff_loss=0.3907, tot_loss=1.591, over 1656.00 samples.], 2024-10-21 09:08:45,498 INFO [train.py:682] (1/4) Start epoch 1802 2024-10-21 09:08:59,188 INFO [train.py:561] (1/4) Epoch 1802, batch 4, global_batch_idx: 28820, batch size: 189, loss[dur_loss=0.2242, prior_loss=0.9802, diff_loss=0.3669, tot_loss=1.571, over 189.00 samples.], tot_loss[dur_loss=0.2192, prior_loss=0.979, diff_loss=0.4368, tot_loss=1.635, over 937.00 samples.], 2024-10-21 09:09:13,982 INFO [train.py:561] (1/4) Epoch 1802, batch 14, global_batch_idx: 28830, batch size: 142, loss[dur_loss=0.2254, prior_loss=0.9798, diff_loss=0.2971, tot_loss=1.502, over 142.00 samples.], tot_loss[dur_loss=0.2226, prior_loss=0.9797, diff_loss=0.3678, tot_loss=1.57, over 2210.00 samples.], 2024-10-21 09:09:15,397 INFO [train.py:682] (1/4) Start epoch 1803 2024-10-21 09:09:35,428 INFO [train.py:561] (1/4) Epoch 1803, batch 8, global_batch_idx: 28840, batch size: 170, loss[dur_loss=0.2267, prior_loss=0.9801, diff_loss=0.3126, tot_loss=1.519, over 170.00 samples.], tot_loss[dur_loss=0.2207, prior_loss=0.9793, diff_loss=0.3924, tot_loss=1.592, over 1432.00 samples.], 2024-10-21 09:09:45,540 INFO [train.py:682] (1/4) Start epoch 1804 2024-10-21 09:09:56,942 INFO [train.py:561] (1/4) Epoch 1804, batch 2, global_batch_idx: 28850, batch size: 203, loss[dur_loss=0.2242, prior_loss=0.9798, diff_loss=0.3433, tot_loss=1.547, over 203.00 samples.], tot_loss[dur_loss=0.2259, prior_loss=0.9801, diff_loss=0.3363, tot_loss=1.542, over 442.00 samples.], 2024-10-21 09:10:11,138 INFO [train.py:561] (1/4) Epoch 1804, batch 12, global_batch_idx: 28860, batch size: 152, loss[dur_loss=0.2266, prior_loss=0.9801, diff_loss=0.3082, tot_loss=1.515, over 152.00 samples.], 
tot_loss[dur_loss=0.2232, prior_loss=0.9796, diff_loss=0.3826, tot_loss=1.585, over 1966.00 samples.], 2024-10-21 09:10:15,572 INFO [train.py:682] (1/4) Start epoch 1805 2024-10-21 09:10:32,231 INFO [train.py:561] (1/4) Epoch 1805, batch 6, global_batch_idx: 28870, batch size: 106, loss[dur_loss=0.2216, prior_loss=0.9796, diff_loss=0.3019, tot_loss=1.503, over 106.00 samples.], tot_loss[dur_loss=0.221, prior_loss=0.9791, diff_loss=0.412, tot_loss=1.612, over 1142.00 samples.], 2024-10-21 09:10:45,199 INFO [train.py:682] (1/4) Start epoch 1806 2024-10-21 09:10:53,915 INFO [train.py:561] (1/4) Epoch 1806, batch 0, global_batch_idx: 28880, batch size: 108, loss[dur_loss=0.227, prior_loss=0.9805, diff_loss=0.3203, tot_loss=1.528, over 108.00 samples.], tot_loss[dur_loss=0.227, prior_loss=0.9805, diff_loss=0.3203, tot_loss=1.528, over 108.00 samples.], 2024-10-21 09:11:08,043 INFO [train.py:561] (1/4) Epoch 1806, batch 10, global_batch_idx: 28890, batch size: 111, loss[dur_loss=0.2316, prior_loss=0.9811, diff_loss=0.2957, tot_loss=1.508, over 111.00 samples.], tot_loss[dur_loss=0.2218, prior_loss=0.9795, diff_loss=0.3815, tot_loss=1.583, over 1656.00 samples.], 2024-10-21 09:11:15,067 INFO [train.py:682] (1/4) Start epoch 1807 2024-10-21 09:11:28,716 INFO [train.py:561] (1/4) Epoch 1807, batch 4, global_batch_idx: 28900, batch size: 189, loss[dur_loss=0.221, prior_loss=0.98, diff_loss=0.3266, tot_loss=1.528, over 189.00 samples.], tot_loss[dur_loss=0.217, prior_loss=0.9789, diff_loss=0.4295, tot_loss=1.625, over 937.00 samples.], 2024-10-21 09:11:43,372 INFO [train.py:561] (1/4) Epoch 1807, batch 14, global_batch_idx: 28910, batch size: 142, loss[dur_loss=0.2223, prior_loss=0.98, diff_loss=0.3072, tot_loss=1.509, over 142.00 samples.], tot_loss[dur_loss=0.2225, prior_loss=0.9796, diff_loss=0.3609, tot_loss=1.563, over 2210.00 samples.], 2024-10-21 09:11:44,790 INFO [train.py:682] (1/4) Start epoch 1808 2024-10-21 09:12:04,359 INFO [train.py:561] (1/4) Epoch 1808, batch 8, global_batch_idx: 28920, batch size: 170, loss[dur_loss=0.2209, prior_loss=0.9798, diff_loss=0.3085, tot_loss=1.509, over 170.00 samples.], tot_loss[dur_loss=0.2188, prior_loss=0.9793, diff_loss=0.3891, tot_loss=1.587, over 1432.00 samples.], 2024-10-21 09:12:14,434 INFO [train.py:682] (1/4) Start epoch 1809 2024-10-21 09:12:25,582 INFO [train.py:561] (1/4) Epoch 1809, batch 2, global_batch_idx: 28930, batch size: 203, loss[dur_loss=0.2236, prior_loss=0.9798, diff_loss=0.3449, tot_loss=1.548, over 203.00 samples.], tot_loss[dur_loss=0.2229, prior_loss=0.98, diff_loss=0.3181, tot_loss=1.521, over 442.00 samples.], 2024-10-21 09:12:39,711 INFO [train.py:561] (1/4) Epoch 1809, batch 12, global_batch_idx: 28940, batch size: 152, loss[dur_loss=0.2244, prior_loss=0.9802, diff_loss=0.3059, tot_loss=1.51, over 152.00 samples.], tot_loss[dur_loss=0.2221, prior_loss=0.9795, diff_loss=0.3672, tot_loss=1.569, over 1966.00 samples.], 2024-10-21 09:12:44,102 INFO [train.py:682] (1/4) Start epoch 1810 2024-10-21 09:13:00,734 INFO [train.py:561] (1/4) Epoch 1810, batch 6, global_batch_idx: 28950, batch size: 106, loss[dur_loss=0.2245, prior_loss=0.98, diff_loss=0.2952, tot_loss=1.5, over 106.00 samples.], tot_loss[dur_loss=0.2197, prior_loss=0.9791, diff_loss=0.4026, tot_loss=1.601, over 1142.00 samples.], 2024-10-21 09:13:13,651 INFO [train.py:682] (1/4) Start epoch 1811 2024-10-21 09:13:22,431 INFO [train.py:561] (1/4) Epoch 1811, batch 0, global_batch_idx: 28960, batch size: 108, loss[dur_loss=0.2271, prior_loss=0.9807, diff_loss=0.2978, 
tot_loss=1.506, over 108.00 samples.], tot_loss[dur_loss=0.2271, prior_loss=0.9807, diff_loss=0.2978, tot_loss=1.506, over 108.00 samples.], 2024-10-21 09:13:36,515 INFO [train.py:561] (1/4) Epoch 1811, batch 10, global_batch_idx: 28970, batch size: 111, loss[dur_loss=0.2267, prior_loss=0.9806, diff_loss=0.2993, tot_loss=1.507, over 111.00 samples.], tot_loss[dur_loss=0.221, prior_loss=0.9795, diff_loss=0.3773, tot_loss=1.578, over 1656.00 samples.], 2024-10-21 09:13:43,525 INFO [train.py:682] (1/4) Start epoch 1812 2024-10-21 09:13:57,169 INFO [train.py:561] (1/4) Epoch 1812, batch 4, global_batch_idx: 28980, batch size: 189, loss[dur_loss=0.2233, prior_loss=0.9798, diff_loss=0.3407, tot_loss=1.544, over 189.00 samples.], tot_loss[dur_loss=0.2161, prior_loss=0.9789, diff_loss=0.4378, tot_loss=1.633, over 937.00 samples.], 2024-10-21 09:14:11,971 INFO [train.py:561] (1/4) Epoch 1812, batch 14, global_batch_idx: 28990, batch size: 142, loss[dur_loss=0.2255, prior_loss=0.9799, diff_loss=0.3489, tot_loss=1.554, over 142.00 samples.], tot_loss[dur_loss=0.2208, prior_loss=0.9796, diff_loss=0.3746, tot_loss=1.575, over 2210.00 samples.], 2024-10-21 09:14:13,396 INFO [train.py:682] (1/4) Start epoch 1813 2024-10-21 09:14:33,306 INFO [train.py:561] (1/4) Epoch 1813, batch 8, global_batch_idx: 29000, batch size: 170, loss[dur_loss=0.2279, prior_loss=0.9797, diff_loss=0.3458, tot_loss=1.553, over 170.00 samples.], tot_loss[dur_loss=0.2217, prior_loss=0.9793, diff_loss=0.4049, tot_loss=1.606, over 1432.00 samples.], 2024-10-21 09:14:43,366 INFO [train.py:682] (1/4) Start epoch 1814 2024-10-21 09:14:54,639 INFO [train.py:561] (1/4) Epoch 1814, batch 2, global_batch_idx: 29010, batch size: 203, loss[dur_loss=0.2247, prior_loss=0.9798, diff_loss=0.364, tot_loss=1.569, over 203.00 samples.], tot_loss[dur_loss=0.2248, prior_loss=0.98, diff_loss=0.3545, tot_loss=1.559, over 442.00 samples.], 2024-10-21 09:15:08,798 INFO [train.py:561] (1/4) Epoch 1814, batch 12, global_batch_idx: 29020, batch size: 152, loss[dur_loss=0.2233, prior_loss=0.98, diff_loss=0.3447, tot_loss=1.548, over 152.00 samples.], tot_loss[dur_loss=0.222, prior_loss=0.9796, diff_loss=0.3807, tot_loss=1.582, over 1966.00 samples.], 2024-10-21 09:15:13,232 INFO [train.py:682] (1/4) Start epoch 1815 2024-10-21 09:15:30,180 INFO [train.py:561] (1/4) Epoch 1815, batch 6, global_batch_idx: 29030, batch size: 106, loss[dur_loss=0.2254, prior_loss=0.9803, diff_loss=0.3194, tot_loss=1.525, over 106.00 samples.], tot_loss[dur_loss=0.2203, prior_loss=0.9793, diff_loss=0.4104, tot_loss=1.61, over 1142.00 samples.], 2024-10-21 09:15:43,210 INFO [train.py:682] (1/4) Start epoch 1816 2024-10-21 09:15:52,181 INFO [train.py:561] (1/4) Epoch 1816, batch 0, global_batch_idx: 29040, batch size: 108, loss[dur_loss=0.228, prior_loss=0.9806, diff_loss=0.3071, tot_loss=1.516, over 108.00 samples.], tot_loss[dur_loss=0.228, prior_loss=0.9806, diff_loss=0.3071, tot_loss=1.516, over 108.00 samples.], 2024-10-21 09:16:06,360 INFO [train.py:561] (1/4) Epoch 1816, batch 10, global_batch_idx: 29050, batch size: 111, loss[dur_loss=0.2242, prior_loss=0.9806, diff_loss=0.2986, tot_loss=1.503, over 111.00 samples.], tot_loss[dur_loss=0.2202, prior_loss=0.9794, diff_loss=0.3811, tot_loss=1.581, over 1656.00 samples.], 2024-10-21 09:16:13,393 INFO [train.py:682] (1/4) Start epoch 1817 2024-10-21 09:16:27,102 INFO [train.py:561] (1/4) Epoch 1817, batch 4, global_batch_idx: 29060, batch size: 189, loss[dur_loss=0.2215, prior_loss=0.9797, diff_loss=0.3362, tot_loss=1.537, over 
189.00 samples.], tot_loss[dur_loss=0.2189, prior_loss=0.9789, diff_loss=0.4297, tot_loss=1.627, over 937.00 samples.], 2024-10-21 09:16:41,868 INFO [train.py:561] (1/4) Epoch 1817, batch 14, global_batch_idx: 29070, batch size: 142, loss[dur_loss=0.2291, prior_loss=0.98, diff_loss=0.3175, tot_loss=1.527, over 142.00 samples.], tot_loss[dur_loss=0.2234, prior_loss=0.9796, diff_loss=0.3627, tot_loss=1.566, over 2210.00 samples.], 2024-10-21 09:16:43,300 INFO [train.py:682] (1/4) Start epoch 1818 2024-10-21 09:17:03,092 INFO [train.py:561] (1/4) Epoch 1818, batch 8, global_batch_idx: 29080, batch size: 170, loss[dur_loss=0.2264, prior_loss=0.9799, diff_loss=0.36, tot_loss=1.566, over 170.00 samples.], tot_loss[dur_loss=0.2209, prior_loss=0.9793, diff_loss=0.4005, tot_loss=1.601, over 1432.00 samples.], 2024-10-21 09:17:13,139 INFO [train.py:682] (1/4) Start epoch 1819 2024-10-21 09:17:24,315 INFO [train.py:561] (1/4) Epoch 1819, batch 2, global_batch_idx: 29090, batch size: 203, loss[dur_loss=0.2223, prior_loss=0.9799, diff_loss=0.3733, tot_loss=1.575, over 203.00 samples.], tot_loss[dur_loss=0.2223, prior_loss=0.98, diff_loss=0.3398, tot_loss=1.542, over 442.00 samples.], 2024-10-21 09:17:38,527 INFO [train.py:561] (1/4) Epoch 1819, batch 12, global_batch_idx: 29100, batch size: 152, loss[dur_loss=0.2229, prior_loss=0.9798, diff_loss=0.3208, tot_loss=1.523, over 152.00 samples.], tot_loss[dur_loss=0.2209, prior_loss=0.9796, diff_loss=0.3819, tot_loss=1.582, over 1966.00 samples.], 2024-10-21 09:17:42,957 INFO [train.py:682] (1/4) Start epoch 1820 2024-10-21 09:17:59,843 INFO [train.py:561] (1/4) Epoch 1820, batch 6, global_batch_idx: 29110, batch size: 106, loss[dur_loss=0.2264, prior_loss=0.9801, diff_loss=0.3046, tot_loss=1.511, over 106.00 samples.], tot_loss[dur_loss=0.219, prior_loss=0.9792, diff_loss=0.4089, tot_loss=1.607, over 1142.00 samples.], 2024-10-21 09:18:12,817 INFO [train.py:682] (1/4) Start epoch 1821 2024-10-21 09:18:21,381 INFO [train.py:561] (1/4) Epoch 1821, batch 0, global_batch_idx: 29120, batch size: 108, loss[dur_loss=0.2299, prior_loss=0.9808, diff_loss=0.2958, tot_loss=1.506, over 108.00 samples.], tot_loss[dur_loss=0.2299, prior_loss=0.9808, diff_loss=0.2958, tot_loss=1.506, over 108.00 samples.], 2024-10-21 09:18:35,473 INFO [train.py:561] (1/4) Epoch 1821, batch 10, global_batch_idx: 29130, batch size: 111, loss[dur_loss=0.2301, prior_loss=0.9809, diff_loss=0.2833, tot_loss=1.494, over 111.00 samples.], tot_loss[dur_loss=0.222, prior_loss=0.9795, diff_loss=0.3703, tot_loss=1.572, over 1656.00 samples.], 2024-10-21 09:18:42,505 INFO [train.py:682] (1/4) Start epoch 1822 2024-10-21 09:18:56,326 INFO [train.py:561] (1/4) Epoch 1822, batch 4, global_batch_idx: 29140, batch size: 189, loss[dur_loss=0.2203, prior_loss=0.9798, diff_loss=0.3222, tot_loss=1.522, over 189.00 samples.], tot_loss[dur_loss=0.2186, prior_loss=0.9788, diff_loss=0.4254, tot_loss=1.623, over 937.00 samples.], 2024-10-21 09:19:11,082 INFO [train.py:561] (1/4) Epoch 1822, batch 14, global_batch_idx: 29150, batch size: 142, loss[dur_loss=0.222, prior_loss=0.9794, diff_loss=0.3049, tot_loss=1.506, over 142.00 samples.], tot_loss[dur_loss=0.2228, prior_loss=0.9797, diff_loss=0.3647, tot_loss=1.567, over 2210.00 samples.], 2024-10-21 09:19:12,495 INFO [train.py:682] (1/4) Start epoch 1823 2024-10-21 09:19:32,123 INFO [train.py:561] (1/4) Epoch 1823, batch 8, global_batch_idx: 29160, batch size: 170, loss[dur_loss=0.2274, prior_loss=0.9798, diff_loss=0.3277, tot_loss=1.535, over 170.00 samples.], 
tot_loss[dur_loss=0.2207, prior_loss=0.9793, diff_loss=0.3875, tot_loss=1.587, over 1432.00 samples.], 2024-10-21 09:19:42,141 INFO [train.py:682] (1/4) Start epoch 1824 2024-10-21 09:19:53,558 INFO [train.py:561] (1/4) Epoch 1824, batch 2, global_batch_idx: 29170, batch size: 203, loss[dur_loss=0.2229, prior_loss=0.9798, diff_loss=0.3765, tot_loss=1.579, over 203.00 samples.], tot_loss[dur_loss=0.2247, prior_loss=0.9799, diff_loss=0.3431, tot_loss=1.548, over 442.00 samples.], 2024-10-21 09:20:07,746 INFO [train.py:561] (1/4) Epoch 1824, batch 12, global_batch_idx: 29180, batch size: 152, loss[dur_loss=0.2261, prior_loss=0.9798, diff_loss=0.3336, tot_loss=1.54, over 152.00 samples.], tot_loss[dur_loss=0.2219, prior_loss=0.9795, diff_loss=0.3804, tot_loss=1.582, over 1966.00 samples.], 2024-10-21 09:20:12,158 INFO [train.py:682] (1/4) Start epoch 1825 2024-10-21 09:20:29,100 INFO [train.py:561] (1/4) Epoch 1825, batch 6, global_batch_idx: 29190, batch size: 106, loss[dur_loss=0.2281, prior_loss=0.9804, diff_loss=0.3068, tot_loss=1.515, over 106.00 samples.], tot_loss[dur_loss=0.2193, prior_loss=0.9791, diff_loss=0.4184, tot_loss=1.617, over 1142.00 samples.], 2024-10-21 09:20:42,055 INFO [train.py:682] (1/4) Start epoch 1826 2024-10-21 09:20:50,778 INFO [train.py:561] (1/4) Epoch 1826, batch 0, global_batch_idx: 29200, batch size: 108, loss[dur_loss=0.2304, prior_loss=0.9808, diff_loss=0.32, tot_loss=1.531, over 108.00 samples.], tot_loss[dur_loss=0.2304, prior_loss=0.9808, diff_loss=0.32, tot_loss=1.531, over 108.00 samples.], 2024-10-21 09:21:04,847 INFO [train.py:561] (1/4) Epoch 1826, batch 10, global_batch_idx: 29210, batch size: 111, loss[dur_loss=0.2283, prior_loss=0.9808, diff_loss=0.2995, tot_loss=1.509, over 111.00 samples.], tot_loss[dur_loss=0.2219, prior_loss=0.9793, diff_loss=0.3826, tot_loss=1.584, over 1656.00 samples.], 2024-10-21 09:21:11,867 INFO [train.py:682] (1/4) Start epoch 1827 2024-10-21 09:21:25,404 INFO [train.py:561] (1/4) Epoch 1827, batch 4, global_batch_idx: 29220, batch size: 189, loss[dur_loss=0.2213, prior_loss=0.9794, diff_loss=0.3437, tot_loss=1.544, over 189.00 samples.], tot_loss[dur_loss=0.2184, prior_loss=0.9788, diff_loss=0.446, tot_loss=1.643, over 937.00 samples.], 2024-10-21 09:21:40,157 INFO [train.py:561] (1/4) Epoch 1827, batch 14, global_batch_idx: 29230, batch size: 142, loss[dur_loss=0.226, prior_loss=0.9795, diff_loss=0.3631, tot_loss=1.569, over 142.00 samples.], tot_loss[dur_loss=0.2222, prior_loss=0.9794, diff_loss=0.376, tot_loss=1.578, over 2210.00 samples.], 2024-10-21 09:21:41,576 INFO [train.py:682] (1/4) Start epoch 1828 2024-10-21 09:22:01,369 INFO [train.py:561] (1/4) Epoch 1828, batch 8, global_batch_idx: 29240, batch size: 170, loss[dur_loss=0.2228, prior_loss=0.9798, diff_loss=0.3344, tot_loss=1.537, over 170.00 samples.], tot_loss[dur_loss=0.2194, prior_loss=0.9792, diff_loss=0.4019, tot_loss=1.6, over 1432.00 samples.], 2024-10-21 09:22:11,492 INFO [train.py:682] (1/4) Start epoch 1829 2024-10-21 09:22:22,758 INFO [train.py:561] (1/4) Epoch 1829, batch 2, global_batch_idx: 29250, batch size: 203, loss[dur_loss=0.2241, prior_loss=0.9796, diff_loss=0.3513, tot_loss=1.555, over 203.00 samples.], tot_loss[dur_loss=0.2245, prior_loss=0.9799, diff_loss=0.3343, tot_loss=1.539, over 442.00 samples.], 2024-10-21 09:22:37,012 INFO [train.py:561] (1/4) Epoch 1829, batch 12, global_batch_idx: 29260, batch size: 152, loss[dur_loss=0.2269, prior_loss=0.9798, diff_loss=0.2917, tot_loss=1.498, over 152.00 samples.], 
tot_loss[dur_loss=0.2215, prior_loss=0.9794, diff_loss=0.3757, tot_loss=1.577, over 1966.00 samples.], 2024-10-21 09:22:41,440 INFO [train.py:682] (1/4) Start epoch 1830 2024-10-21 09:22:58,077 INFO [train.py:561] (1/4) Epoch 1830, batch 6, global_batch_idx: 29270, batch size: 106, loss[dur_loss=0.2286, prior_loss=0.98, diff_loss=0.3028, tot_loss=1.511, over 106.00 samples.], tot_loss[dur_loss=0.2206, prior_loss=0.979, diff_loss=0.4098, tot_loss=1.609, over 1142.00 samples.], 2024-10-21 09:23:11,010 INFO [train.py:682] (1/4) Start epoch 1831 2024-10-21 09:23:19,844 INFO [train.py:561] (1/4) Epoch 1831, batch 0, global_batch_idx: 29280, batch size: 108, loss[dur_loss=0.2304, prior_loss=0.9803, diff_loss=0.2967, tot_loss=1.507, over 108.00 samples.], tot_loss[dur_loss=0.2304, prior_loss=0.9803, diff_loss=0.2967, tot_loss=1.507, over 108.00 samples.], 2024-10-21 09:23:33,920 INFO [train.py:561] (1/4) Epoch 1831, batch 10, global_batch_idx: 29290, batch size: 111, loss[dur_loss=0.2252, prior_loss=0.9806, diff_loss=0.3633, tot_loss=1.569, over 111.00 samples.], tot_loss[dur_loss=0.2215, prior_loss=0.9792, diff_loss=0.3957, tot_loss=1.596, over 1656.00 samples.], 2024-10-21 09:23:40,953 INFO [train.py:682] (1/4) Start epoch 1832 2024-10-21 09:23:54,633 INFO [train.py:561] (1/4) Epoch 1832, batch 4, global_batch_idx: 29300, batch size: 189, loss[dur_loss=0.2226, prior_loss=0.9799, diff_loss=0.3499, tot_loss=1.552, over 189.00 samples.], tot_loss[dur_loss=0.2172, prior_loss=0.9787, diff_loss=0.4381, tot_loss=1.634, over 937.00 samples.], 2024-10-21 09:24:09,501 INFO [train.py:561] (1/4) Epoch 1832, batch 14, global_batch_idx: 29310, batch size: 142, loss[dur_loss=0.2245, prior_loss=0.9792, diff_loss=0.3082, tot_loss=1.512, over 142.00 samples.], tot_loss[dur_loss=0.2223, prior_loss=0.9793, diff_loss=0.3722, tot_loss=1.574, over 2210.00 samples.], 2024-10-21 09:24:10,910 INFO [train.py:682] (1/4) Start epoch 1833 2024-10-21 09:24:30,686 INFO [train.py:561] (1/4) Epoch 1833, batch 8, global_batch_idx: 29320, batch size: 170, loss[dur_loss=0.224, prior_loss=0.9795, diff_loss=0.338, tot_loss=1.541, over 170.00 samples.], tot_loss[dur_loss=0.22, prior_loss=0.979, diff_loss=0.403, tot_loss=1.602, over 1432.00 samples.], 2024-10-21 09:24:40,714 INFO [train.py:682] (1/4) Start epoch 1834 2024-10-21 09:24:51,862 INFO [train.py:561] (1/4) Epoch 1834, batch 2, global_batch_idx: 29330, batch size: 203, loss[dur_loss=0.2206, prior_loss=0.9794, diff_loss=0.3419, tot_loss=1.542, over 203.00 samples.], tot_loss[dur_loss=0.2217, prior_loss=0.9794, diff_loss=0.3292, tot_loss=1.53, over 442.00 samples.], 2024-10-21 09:25:05,963 INFO [train.py:561] (1/4) Epoch 1834, batch 12, global_batch_idx: 29340, batch size: 152, loss[dur_loss=0.2234, prior_loss=0.9796, diff_loss=0.3299, tot_loss=1.533, over 152.00 samples.], tot_loss[dur_loss=0.2204, prior_loss=0.9791, diff_loss=0.3815, tot_loss=1.581, over 1966.00 samples.], 2024-10-21 09:25:10,372 INFO [train.py:682] (1/4) Start epoch 1835 2024-10-21 09:25:27,119 INFO [train.py:561] (1/4) Epoch 1835, batch 6, global_batch_idx: 29350, batch size: 106, loss[dur_loss=0.2248, prior_loss=0.9798, diff_loss=0.3125, tot_loss=1.517, over 106.00 samples.], tot_loss[dur_loss=0.2181, prior_loss=0.9788, diff_loss=0.4104, tot_loss=1.607, over 1142.00 samples.], 2024-10-21 09:25:40,164 INFO [train.py:682] (1/4) Start epoch 1836 2024-10-21 09:25:49,229 INFO [train.py:561] (1/4) Epoch 1836, batch 0, global_batch_idx: 29360, batch size: 108, loss[dur_loss=0.2279, prior_loss=0.9805, 
diff_loss=0.3299, tot_loss=1.538, over 108.00 samples.], tot_loss[dur_loss=0.2279, prior_loss=0.9805, diff_loss=0.3299, tot_loss=1.538, over 108.00 samples.], 2024-10-21 09:26:03,396 INFO [train.py:561] (1/4) Epoch 1836, batch 10, global_batch_idx: 29370, batch size: 111, loss[dur_loss=0.226, prior_loss=0.9806, diff_loss=0.3163, tot_loss=1.523, over 111.00 samples.], tot_loss[dur_loss=0.2209, prior_loss=0.9792, diff_loss=0.3872, tot_loss=1.587, over 1656.00 samples.], 2024-10-21 09:26:10,435 INFO [train.py:682] (1/4) Start epoch 1837 2024-10-21 09:26:23,922 INFO [train.py:561] (1/4) Epoch 1837, batch 4, global_batch_idx: 29380, batch size: 189, loss[dur_loss=0.2231, prior_loss=0.9796, diff_loss=0.3203, tot_loss=1.523, over 189.00 samples.], tot_loss[dur_loss=0.2185, prior_loss=0.9787, diff_loss=0.4272, tot_loss=1.624, over 937.00 samples.], 2024-10-21 09:26:38,759 INFO [train.py:561] (1/4) Epoch 1837, batch 14, global_batch_idx: 29390, batch size: 142, loss[dur_loss=0.2274, prior_loss=0.9794, diff_loss=0.2985, tot_loss=1.505, over 142.00 samples.], tot_loss[dur_loss=0.2224, prior_loss=0.9794, diff_loss=0.3646, tot_loss=1.566, over 2210.00 samples.], 2024-10-21 09:26:40,177 INFO [train.py:682] (1/4) Start epoch 1838 2024-10-21 09:26:59,915 INFO [train.py:561] (1/4) Epoch 1838, batch 8, global_batch_idx: 29400, batch size: 170, loss[dur_loss=0.2251, prior_loss=0.9797, diff_loss=0.2947, tot_loss=1.5, over 170.00 samples.], tot_loss[dur_loss=0.2219, prior_loss=0.9792, diff_loss=0.3916, tot_loss=1.593, over 1432.00 samples.], 2024-10-21 09:27:10,017 INFO [train.py:682] (1/4) Start epoch 1839 2024-10-21 09:27:21,393 INFO [train.py:561] (1/4) Epoch 1839, batch 2, global_batch_idx: 29410, batch size: 203, loss[dur_loss=0.2266, prior_loss=0.9794, diff_loss=0.3531, tot_loss=1.559, over 203.00 samples.], tot_loss[dur_loss=0.2256, prior_loss=0.9797, diff_loss=0.333, tot_loss=1.538, over 442.00 samples.], 2024-10-21 09:27:35,520 INFO [train.py:561] (1/4) Epoch 1839, batch 12, global_batch_idx: 29420, batch size: 152, loss[dur_loss=0.2234, prior_loss=0.9799, diff_loss=0.3111, tot_loss=1.514, over 152.00 samples.], tot_loss[dur_loss=0.2227, prior_loss=0.9794, diff_loss=0.3781, tot_loss=1.58, over 1966.00 samples.], 2024-10-21 09:27:39,940 INFO [train.py:682] (1/4) Start epoch 1840 2024-10-21 09:27:56,631 INFO [train.py:561] (1/4) Epoch 1840, batch 6, global_batch_idx: 29430, batch size: 106, loss[dur_loss=0.2253, prior_loss=0.9797, diff_loss=0.3066, tot_loss=1.512, over 106.00 samples.], tot_loss[dur_loss=0.219, prior_loss=0.9789, diff_loss=0.4232, tot_loss=1.621, over 1142.00 samples.], 2024-10-21 09:28:09,613 INFO [train.py:682] (1/4) Start epoch 1841 2024-10-21 09:28:18,075 INFO [train.py:561] (1/4) Epoch 1841, batch 0, global_batch_idx: 29440, batch size: 108, loss[dur_loss=0.232, prior_loss=0.9806, diff_loss=0.3274, tot_loss=1.54, over 108.00 samples.], tot_loss[dur_loss=0.232, prior_loss=0.9806, diff_loss=0.3274, tot_loss=1.54, over 108.00 samples.], 2024-10-21 09:28:32,212 INFO [train.py:561] (1/4) Epoch 1841, batch 10, global_batch_idx: 29450, batch size: 111, loss[dur_loss=0.2284, prior_loss=0.9808, diff_loss=0.3311, tot_loss=1.54, over 111.00 samples.], tot_loss[dur_loss=0.2214, prior_loss=0.9793, diff_loss=0.3801, tot_loss=1.581, over 1656.00 samples.], 2024-10-21 09:28:39,240 INFO [train.py:682] (1/4) Start epoch 1842 2024-10-21 09:28:52,800 INFO [train.py:561] (1/4) Epoch 1842, batch 4, global_batch_idx: 29460, batch size: 189, loss[dur_loss=0.2212, prior_loss=0.9794, diff_loss=0.3546, 
tot_loss=1.555, over 189.00 samples.], tot_loss[dur_loss=0.219, prior_loss=0.9784, diff_loss=0.4344, tot_loss=1.632, over 937.00 samples.], 2024-10-21 09:29:07,597 INFO [train.py:561] (1/4) Epoch 1842, batch 14, global_batch_idx: 29470, batch size: 142, loss[dur_loss=0.2242, prior_loss=0.9795, diff_loss=0.3232, tot_loss=1.527, over 142.00 samples.], tot_loss[dur_loss=0.2222, prior_loss=0.9792, diff_loss=0.366, tot_loss=1.567, over 2210.00 samples.], 2024-10-21 09:29:09,019 INFO [train.py:682] (1/4) Start epoch 1843 2024-10-21 09:29:28,707 INFO [train.py:561] (1/4) Epoch 1843, batch 8, global_batch_idx: 29480, batch size: 170, loss[dur_loss=0.2272, prior_loss=0.9797, diff_loss=0.3212, tot_loss=1.528, over 170.00 samples.], tot_loss[dur_loss=0.22, prior_loss=0.9789, diff_loss=0.4039, tot_loss=1.603, over 1432.00 samples.], 2024-10-21 09:29:38,808 INFO [train.py:682] (1/4) Start epoch 1844 2024-10-21 09:29:50,140 INFO [train.py:561] (1/4) Epoch 1844, batch 2, global_batch_idx: 29490, batch size: 203, loss[dur_loss=0.2209, prior_loss=0.9793, diff_loss=0.3596, tot_loss=1.56, over 203.00 samples.], tot_loss[dur_loss=0.2222, prior_loss=0.9796, diff_loss=0.3257, tot_loss=1.528, over 442.00 samples.], 2024-10-21 09:30:04,267 INFO [train.py:561] (1/4) Epoch 1844, batch 12, global_batch_idx: 29500, batch size: 152, loss[dur_loss=0.2236, prior_loss=0.9796, diff_loss=0.3305, tot_loss=1.534, over 152.00 samples.], tot_loss[dur_loss=0.2212, prior_loss=0.9792, diff_loss=0.3736, tot_loss=1.574, over 1966.00 samples.], 2024-10-21 09:30:08,689 INFO [train.py:682] (1/4) Start epoch 1845 2024-10-21 09:30:25,715 INFO [train.py:561] (1/4) Epoch 1845, batch 6, global_batch_idx: 29510, batch size: 106, loss[dur_loss=0.223, prior_loss=0.9795, diff_loss=0.3066, tot_loss=1.509, over 106.00 samples.], tot_loss[dur_loss=0.2173, prior_loss=0.9786, diff_loss=0.41, tot_loss=1.606, over 1142.00 samples.], 2024-10-21 09:30:38,698 INFO [train.py:682] (1/4) Start epoch 1846 2024-10-21 09:30:47,369 INFO [train.py:561] (1/4) Epoch 1846, batch 0, global_batch_idx: 29520, batch size: 108, loss[dur_loss=0.2263, prior_loss=0.9802, diff_loss=0.337, tot_loss=1.543, over 108.00 samples.], tot_loss[dur_loss=0.2263, prior_loss=0.9802, diff_loss=0.337, tot_loss=1.543, over 108.00 samples.], 2024-10-21 09:31:01,526 INFO [train.py:561] (1/4) Epoch 1846, batch 10, global_batch_idx: 29530, batch size: 111, loss[dur_loss=0.2265, prior_loss=0.9804, diff_loss=0.3049, tot_loss=1.512, over 111.00 samples.], tot_loss[dur_loss=0.2207, prior_loss=0.9791, diff_loss=0.3907, tot_loss=1.591, over 1656.00 samples.], 2024-10-21 09:31:08,676 INFO [train.py:682] (1/4) Start epoch 1847 2024-10-21 09:31:22,300 INFO [train.py:561] (1/4) Epoch 1847, batch 4, global_batch_idx: 29540, batch size: 189, loss[dur_loss=0.2221, prior_loss=0.9796, diff_loss=0.3567, tot_loss=1.558, over 189.00 samples.], tot_loss[dur_loss=0.2166, prior_loss=0.9785, diff_loss=0.4381, tot_loss=1.633, over 937.00 samples.], 2024-10-21 09:31:37,125 INFO [train.py:561] (1/4) Epoch 1847, batch 14, global_batch_idx: 29550, batch size: 142, loss[dur_loss=0.2242, prior_loss=0.9793, diff_loss=0.2929, tot_loss=1.496, over 142.00 samples.], tot_loss[dur_loss=0.2214, prior_loss=0.9793, diff_loss=0.3727, tot_loss=1.573, over 2210.00 samples.], 2024-10-21 09:31:38,573 INFO [train.py:682] (1/4) Start epoch 1848 2024-10-21 09:31:58,483 INFO [train.py:561] (1/4) Epoch 1848, batch 8, global_batch_idx: 29560, batch size: 170, loss[dur_loss=0.2259, prior_loss=0.9795, diff_loss=0.3318, tot_loss=1.537, over 
170.00 samples.], tot_loss[dur_loss=0.2208, prior_loss=0.9792, diff_loss=0.3949, tot_loss=1.595, over 1432.00 samples.], 2024-10-21 09:32:08,647 INFO [train.py:682] (1/4) Start epoch 1849 2024-10-21 09:32:19,758 INFO [train.py:561] (1/4) Epoch 1849, batch 2, global_batch_idx: 29570, batch size: 203, loss[dur_loss=0.2212, prior_loss=0.9792, diff_loss=0.3382, tot_loss=1.539, over 203.00 samples.], tot_loss[dur_loss=0.2237, prior_loss=0.9795, diff_loss=0.3265, tot_loss=1.53, over 442.00 samples.], 2024-10-21 09:32:33,878 INFO [train.py:561] (1/4) Epoch 1849, batch 12, global_batch_idx: 29580, batch size: 152, loss[dur_loss=0.2234, prior_loss=0.9795, diff_loss=0.3445, tot_loss=1.547, over 152.00 samples.], tot_loss[dur_loss=0.2207, prior_loss=0.9792, diff_loss=0.3793, tot_loss=1.579, over 1966.00 samples.], 2024-10-21 09:32:38,289 INFO [train.py:682] (1/4) Start epoch 1850 2024-10-21 09:32:55,143 INFO [train.py:561] (1/4) Epoch 1850, batch 6, global_batch_idx: 29590, batch size: 106, loss[dur_loss=0.2217, prior_loss=0.9799, diff_loss=0.2922, tot_loss=1.494, over 106.00 samples.], tot_loss[dur_loss=0.2183, prior_loss=0.9788, diff_loss=0.4165, tot_loss=1.614, over 1142.00 samples.], 2024-10-21 09:33:08,107 INFO [train.py:682] (1/4) Start epoch 1851 2024-10-21 09:33:16,572 INFO [train.py:561] (1/4) Epoch 1851, batch 0, global_batch_idx: 29600, batch size: 108, loss[dur_loss=0.2282, prior_loss=0.9804, diff_loss=0.2847, tot_loss=1.493, over 108.00 samples.], tot_loss[dur_loss=0.2282, prior_loss=0.9804, diff_loss=0.2847, tot_loss=1.493, over 108.00 samples.], 2024-10-21 09:33:30,633 INFO [train.py:561] (1/4) Epoch 1851, batch 10, global_batch_idx: 29610, batch size: 111, loss[dur_loss=0.2277, prior_loss=0.9807, diff_loss=0.3142, tot_loss=1.523, over 111.00 samples.], tot_loss[dur_loss=0.2217, prior_loss=0.9792, diff_loss=0.3816, tot_loss=1.582, over 1656.00 samples.], 2024-10-21 09:33:37,650 INFO [train.py:682] (1/4) Start epoch 1852 2024-10-21 09:33:51,760 INFO [train.py:561] (1/4) Epoch 1852, batch 4, global_batch_idx: 29620, batch size: 189, loss[dur_loss=0.2207, prior_loss=0.9798, diff_loss=0.3348, tot_loss=1.535, over 189.00 samples.], tot_loss[dur_loss=0.2184, prior_loss=0.9786, diff_loss=0.4341, tot_loss=1.631, over 937.00 samples.], 2024-10-21 09:34:06,531 INFO [train.py:561] (1/4) Epoch 1852, batch 14, global_batch_idx: 29630, batch size: 142, loss[dur_loss=0.2215, prior_loss=0.9791, diff_loss=0.3244, tot_loss=1.525, over 142.00 samples.], tot_loss[dur_loss=0.2216, prior_loss=0.9792, diff_loss=0.3695, tot_loss=1.57, over 2210.00 samples.], 2024-10-21 09:34:07,955 INFO [train.py:682] (1/4) Start epoch 1853 2024-10-21 09:34:27,789 INFO [train.py:561] (1/4) Epoch 1853, batch 8, global_batch_idx: 29640, batch size: 170, loss[dur_loss=0.2263, prior_loss=0.9796, diff_loss=0.3209, tot_loss=1.527, over 170.00 samples.], tot_loss[dur_loss=0.2205, prior_loss=0.979, diff_loss=0.3908, tot_loss=1.59, over 1432.00 samples.], 2024-10-21 09:34:37,893 INFO [train.py:682] (1/4) Start epoch 1854 2024-10-21 09:34:49,087 INFO [train.py:561] (1/4) Epoch 1854, batch 2, global_batch_idx: 29650, batch size: 203, loss[dur_loss=0.2245, prior_loss=0.9794, diff_loss=0.3923, tot_loss=1.596, over 203.00 samples.], tot_loss[dur_loss=0.2266, prior_loss=0.9797, diff_loss=0.3629, tot_loss=1.569, over 442.00 samples.], 2024-10-21 09:35:03,244 INFO [train.py:561] (1/4) Epoch 1854, batch 12, global_batch_idx: 29660, batch size: 152, loss[dur_loss=0.2231, prior_loss=0.9794, diff_loss=0.3178, tot_loss=1.52, over 152.00 samples.], 
tot_loss[dur_loss=0.2216, prior_loss=0.9791, diff_loss=0.3919, tot_loss=1.593, over 1966.00 samples.], 2024-10-21 09:35:07,653 INFO [train.py:682] (1/4) Start epoch 1855 2024-10-21 09:35:24,722 INFO [train.py:561] (1/4) Epoch 1855, batch 6, global_batch_idx: 29670, batch size: 106, loss[dur_loss=0.2258, prior_loss=0.9798, diff_loss=0.3377, tot_loss=1.543, over 106.00 samples.], tot_loss[dur_loss=0.2197, prior_loss=0.9787, diff_loss=0.3983, tot_loss=1.597, over 1142.00 samples.], 2024-10-21 09:35:37,816 INFO [train.py:682] (1/4) Start epoch 1856 2024-10-21 09:35:46,538 INFO [train.py:561] (1/4) Epoch 1856, batch 0, global_batch_idx: 29680, batch size: 108, loss[dur_loss=0.2323, prior_loss=0.9801, diff_loss=0.3506, tot_loss=1.563, over 108.00 samples.], tot_loss[dur_loss=0.2323, prior_loss=0.9801, diff_loss=0.3506, tot_loss=1.563, over 108.00 samples.], 2024-10-21 09:36:00,656 INFO [train.py:561] (1/4) Epoch 1856, batch 10, global_batch_idx: 29690, batch size: 111, loss[dur_loss=0.2244, prior_loss=0.9806, diff_loss=0.3095, tot_loss=1.515, over 111.00 samples.], tot_loss[dur_loss=0.2208, prior_loss=0.9791, diff_loss=0.3824, tot_loss=1.582, over 1656.00 samples.], 2024-10-21 09:36:07,738 INFO [train.py:682] (1/4) Start epoch 1857 2024-10-21 09:36:21,278 INFO [train.py:561] (1/4) Epoch 1857, batch 4, global_batch_idx: 29700, batch size: 189, loss[dur_loss=0.2132, prior_loss=0.9797, diff_loss=0.3533, tot_loss=1.546, over 189.00 samples.], tot_loss[dur_loss=0.2169, prior_loss=0.9784, diff_loss=0.4371, tot_loss=1.632, over 937.00 samples.], 2024-10-21 09:36:36,078 INFO [train.py:561] (1/4) Epoch 1857, batch 14, global_batch_idx: 29710, batch size: 142, loss[dur_loss=0.2254, prior_loss=0.9793, diff_loss=0.2881, tot_loss=1.493, over 142.00 samples.], tot_loss[dur_loss=0.2206, prior_loss=0.9791, diff_loss=0.3619, tot_loss=1.562, over 2210.00 samples.], 2024-10-21 09:36:37,498 INFO [train.py:682] (1/4) Start epoch 1858 2024-10-21 09:36:57,457 INFO [train.py:561] (1/4) Epoch 1858, batch 8, global_batch_idx: 29720, batch size: 170, loss[dur_loss=0.2246, prior_loss=0.9795, diff_loss=0.3171, tot_loss=1.521, over 170.00 samples.], tot_loss[dur_loss=0.2202, prior_loss=0.979, diff_loss=0.4018, tot_loss=1.601, over 1432.00 samples.], 2024-10-21 09:37:07,502 INFO [train.py:682] (1/4) Start epoch 1859 2024-10-21 09:37:18,590 INFO [train.py:561] (1/4) Epoch 1859, batch 2, global_batch_idx: 29730, batch size: 203, loss[dur_loss=0.2217, prior_loss=0.9793, diff_loss=0.3369, tot_loss=1.538, over 203.00 samples.], tot_loss[dur_loss=0.2217, prior_loss=0.9795, diff_loss=0.3328, tot_loss=1.534, over 442.00 samples.], 2024-10-21 09:37:32,705 INFO [train.py:561] (1/4) Epoch 1859, batch 12, global_batch_idx: 29740, batch size: 152, loss[dur_loss=0.2235, prior_loss=0.9793, diff_loss=0.3379, tot_loss=1.541, over 152.00 samples.], tot_loss[dur_loss=0.2207, prior_loss=0.9791, diff_loss=0.3809, tot_loss=1.581, over 1966.00 samples.], 2024-10-21 09:37:37,230 INFO [train.py:682] (1/4) Start epoch 1860 2024-10-21 09:37:54,221 INFO [train.py:561] (1/4) Epoch 1860, batch 6, global_batch_idx: 29750, batch size: 106, loss[dur_loss=0.2209, prior_loss=0.9797, diff_loss=0.3306, tot_loss=1.531, over 106.00 samples.], tot_loss[dur_loss=0.2175, prior_loss=0.9788, diff_loss=0.4215, tot_loss=1.618, over 1142.00 samples.], 2024-10-21 09:38:07,317 INFO [train.py:682] (1/4) Start epoch 1861 2024-10-21 09:38:16,199 INFO [train.py:561] (1/4) Epoch 1861, batch 0, global_batch_idx: 29760, batch size: 108, loss[dur_loss=0.2279, prior_loss=0.9805, 
diff_loss=0.3442, tot_loss=1.553, over 108.00 samples.], tot_loss[dur_loss=0.2279, prior_loss=0.9805, diff_loss=0.3442, tot_loss=1.553, over 108.00 samples.], 2024-10-21 09:38:30,303 INFO [train.py:561] (1/4) Epoch 1861, batch 10, global_batch_idx: 29770, batch size: 111, loss[dur_loss=0.2298, prior_loss=0.9807, diff_loss=0.3397, tot_loss=1.55, over 111.00 samples.], tot_loss[dur_loss=0.2206, prior_loss=0.9792, diff_loss=0.3961, tot_loss=1.596, over 1656.00 samples.], 2024-10-21 09:38:37,378 INFO [train.py:682] (1/4) Start epoch 1862 2024-10-21 09:38:50,940 INFO [train.py:561] (1/4) Epoch 1862, batch 4, global_batch_idx: 29780, batch size: 189, loss[dur_loss=0.2233, prior_loss=0.9795, diff_loss=0.3385, tot_loss=1.541, over 189.00 samples.], tot_loss[dur_loss=0.2178, prior_loss=0.9784, diff_loss=0.4327, tot_loss=1.629, over 937.00 samples.], 2024-10-21 09:39:05,685 INFO [train.py:561] (1/4) Epoch 1862, batch 14, global_batch_idx: 29790, batch size: 142, loss[dur_loss=0.2271, prior_loss=0.9794, diff_loss=0.2804, tot_loss=1.487, over 142.00 samples.], tot_loss[dur_loss=0.2217, prior_loss=0.9791, diff_loss=0.3658, tot_loss=1.567, over 2210.00 samples.], 2024-10-21 09:39:07,112 INFO [train.py:682] (1/4) Start epoch 1863 2024-10-21 09:39:26,716 INFO [train.py:561] (1/4) Epoch 1863, batch 8, global_batch_idx: 29800, batch size: 170, loss[dur_loss=0.2248, prior_loss=0.9798, diff_loss=0.3063, tot_loss=1.511, over 170.00 samples.], tot_loss[dur_loss=0.2212, prior_loss=0.9789, diff_loss=0.3941, tot_loss=1.594, over 1432.00 samples.], 2024-10-21 09:39:36,813 INFO [train.py:682] (1/4) Start epoch 1864 2024-10-21 09:39:48,233 INFO [train.py:561] (1/4) Epoch 1864, batch 2, global_batch_idx: 29810, batch size: 203, loss[dur_loss=0.2182, prior_loss=0.9792, diff_loss=0.3733, tot_loss=1.571, over 203.00 samples.], tot_loss[dur_loss=0.2211, prior_loss=0.9796, diff_loss=0.3286, tot_loss=1.529, over 442.00 samples.], 2024-10-21 09:40:02,388 INFO [train.py:561] (1/4) Epoch 1864, batch 12, global_batch_idx: 29820, batch size: 152, loss[dur_loss=0.2254, prior_loss=0.9797, diff_loss=0.3391, tot_loss=1.544, over 152.00 samples.], tot_loss[dur_loss=0.2205, prior_loss=0.9791, diff_loss=0.3806, tot_loss=1.58, over 1966.00 samples.], 2024-10-21 09:40:06,819 INFO [train.py:682] (1/4) Start epoch 1865 2024-10-21 09:40:23,539 INFO [train.py:561] (1/4) Epoch 1865, batch 6, global_batch_idx: 29830, batch size: 106, loss[dur_loss=0.2212, prior_loss=0.9791, diff_loss=0.3237, tot_loss=1.524, over 106.00 samples.], tot_loss[dur_loss=0.2184, prior_loss=0.9787, diff_loss=0.4173, tot_loss=1.614, over 1142.00 samples.], 2024-10-21 09:40:36,491 INFO [train.py:682] (1/4) Start epoch 1866 2024-10-21 09:40:45,058 INFO [train.py:561] (1/4) Epoch 1866, batch 0, global_batch_idx: 29840, batch size: 108, loss[dur_loss=0.2277, prior_loss=0.9803, diff_loss=0.3271, tot_loss=1.535, over 108.00 samples.], tot_loss[dur_loss=0.2277, prior_loss=0.9803, diff_loss=0.3271, tot_loss=1.535, over 108.00 samples.], 2024-10-21 09:40:59,152 INFO [train.py:561] (1/4) Epoch 1866, batch 10, global_batch_idx: 29850, batch size: 111, loss[dur_loss=0.2281, prior_loss=0.9805, diff_loss=0.3219, tot_loss=1.53, over 111.00 samples.], tot_loss[dur_loss=0.2207, prior_loss=0.9791, diff_loss=0.378, tot_loss=1.578, over 1656.00 samples.], 2024-10-21 09:41:06,235 INFO [train.py:682] (1/4) Start epoch 1867 2024-10-21 09:41:20,053 INFO [train.py:561] (1/4) Epoch 1867, batch 4, global_batch_idx: 29860, batch size: 189, loss[dur_loss=0.2188, prior_loss=0.9793, 
diff_loss=0.3473, tot_loss=1.545, over 189.00 samples.], tot_loss[dur_loss=0.2162, prior_loss=0.9783, diff_loss=0.4404, tot_loss=1.635, over 937.00 samples.], 2024-10-21 09:41:34,892 INFO [train.py:561] (1/4) Epoch 1867, batch 14, global_batch_idx: 29870, batch size: 142, loss[dur_loss=0.2219, prior_loss=0.979, diff_loss=0.3714, tot_loss=1.572, over 142.00 samples.], tot_loss[dur_loss=0.22, prior_loss=0.9791, diff_loss=0.3742, tot_loss=1.573, over 2210.00 samples.], 2024-10-21 09:41:36,314 INFO [train.py:682] (1/4) Start epoch 1868 2024-10-21 09:41:56,122 INFO [train.py:561] (1/4) Epoch 1868, batch 8, global_batch_idx: 29880, batch size: 170, loss[dur_loss=0.225, prior_loss=0.9796, diff_loss=0.3379, tot_loss=1.542, over 170.00 samples.], tot_loss[dur_loss=0.2193, prior_loss=0.9788, diff_loss=0.3784, tot_loss=1.576, over 1432.00 samples.], 2024-10-21 09:42:06,248 INFO [train.py:682] (1/4) Start epoch 1869 2024-10-21 09:42:17,572 INFO [train.py:561] (1/4) Epoch 1869, batch 2, global_batch_idx: 29890, batch size: 203, loss[dur_loss=0.2214, prior_loss=0.979, diff_loss=0.3482, tot_loss=1.549, over 203.00 samples.], tot_loss[dur_loss=0.223, prior_loss=0.9794, diff_loss=0.315, tot_loss=1.517, over 442.00 samples.], 2024-10-21 09:42:31,714 INFO [train.py:561] (1/4) Epoch 1869, batch 12, global_batch_idx: 29900, batch size: 152, loss[dur_loss=0.2209, prior_loss=0.9793, diff_loss=0.3015, tot_loss=1.502, over 152.00 samples.], tot_loss[dur_loss=0.2191, prior_loss=0.9789, diff_loss=0.3707, tot_loss=1.569, over 1966.00 samples.], 2024-10-21 09:42:36,143 INFO [train.py:682] (1/4) Start epoch 1870 2024-10-21 09:42:52,912 INFO [train.py:561] (1/4) Epoch 1870, batch 6, global_batch_idx: 29910, batch size: 106, loss[dur_loss=0.224, prior_loss=0.9795, diff_loss=0.3064, tot_loss=1.51, over 106.00 samples.], tot_loss[dur_loss=0.2173, prior_loss=0.9784, diff_loss=0.4006, tot_loss=1.596, over 1142.00 samples.], 2024-10-21 09:43:05,817 INFO [train.py:682] (1/4) Start epoch 1871 2024-10-21 09:43:14,611 INFO [train.py:561] (1/4) Epoch 1871, batch 0, global_batch_idx: 29920, batch size: 108, loss[dur_loss=0.2311, prior_loss=0.9804, diff_loss=0.2902, tot_loss=1.502, over 108.00 samples.], tot_loss[dur_loss=0.2311, prior_loss=0.9804, diff_loss=0.2902, tot_loss=1.502, over 108.00 samples.], 2024-10-21 09:43:28,619 INFO [train.py:561] (1/4) Epoch 1871, batch 10, global_batch_idx: 29930, batch size: 111, loss[dur_loss=0.2246, prior_loss=0.9802, diff_loss=0.3064, tot_loss=1.511, over 111.00 samples.], tot_loss[dur_loss=0.22, prior_loss=0.9789, diff_loss=0.3859, tot_loss=1.585, over 1656.00 samples.], 2024-10-21 09:43:35,630 INFO [train.py:682] (1/4) Start epoch 1872 2024-10-21 09:43:49,150 INFO [train.py:561] (1/4) Epoch 1872, batch 4, global_batch_idx: 29940, batch size: 189, loss[dur_loss=0.2198, prior_loss=0.9793, diff_loss=0.3589, tot_loss=1.558, over 189.00 samples.], tot_loss[dur_loss=0.2168, prior_loss=0.9784, diff_loss=0.4394, tot_loss=1.635, over 937.00 samples.], 2024-10-21 09:44:03,973 INFO [train.py:561] (1/4) Epoch 1872, batch 14, global_batch_idx: 29950, batch size: 142, loss[dur_loss=0.2265, prior_loss=0.9793, diff_loss=0.3164, tot_loss=1.522, over 142.00 samples.], tot_loss[dur_loss=0.2202, prior_loss=0.9791, diff_loss=0.3732, tot_loss=1.572, over 2210.00 samples.], 2024-10-21 09:44:05,440 INFO [train.py:682] (1/4) Start epoch 1873 2024-10-21 09:44:25,222 INFO [train.py:561] (1/4) Epoch 1873, batch 8, global_batch_idx: 29960, batch size: 170, loss[dur_loss=0.228, prior_loss=0.9793, diff_loss=0.339, 
tot_loss=1.546, over 170.00 samples.], tot_loss[dur_loss=0.2187, prior_loss=0.9787, diff_loss=0.3952, tot_loss=1.593, over 1432.00 samples.], 2024-10-21 09:44:35,334 INFO [train.py:682] (1/4) Start epoch 1874 2024-10-21 09:44:46,784 INFO [train.py:561] (1/4) Epoch 1874, batch 2, global_batch_idx: 29970, batch size: 203, loss[dur_loss=0.2204, prior_loss=0.9792, diff_loss=0.382, tot_loss=1.582, over 203.00 samples.], tot_loss[dur_loss=0.2213, prior_loss=0.9793, diff_loss=0.3412, tot_loss=1.542, over 442.00 samples.], 2024-10-21 09:45:00,910 INFO [train.py:561] (1/4) Epoch 1874, batch 12, global_batch_idx: 29980, batch size: 152, loss[dur_loss=0.2209, prior_loss=0.9792, diff_loss=0.3175, tot_loss=1.518, over 152.00 samples.], tot_loss[dur_loss=0.2204, prior_loss=0.9789, diff_loss=0.3731, tot_loss=1.572, over 1966.00 samples.], 2024-10-21 09:45:05,341 INFO [train.py:682] (1/4) Start epoch 1875 2024-10-21 09:45:22,238 INFO [train.py:561] (1/4) Epoch 1875, batch 6, global_batch_idx: 29990, batch size: 106, loss[dur_loss=0.2236, prior_loss=0.9795, diff_loss=0.2733, tot_loss=1.476, over 106.00 samples.], tot_loss[dur_loss=0.2163, prior_loss=0.9784, diff_loss=0.3969, tot_loss=1.592, over 1142.00 samples.], 2024-10-21 09:45:35,333 INFO [train.py:682] (1/4) Start epoch 1876 2024-10-21 09:45:44,401 INFO [train.py:561] (1/4) Epoch 1876, batch 0, global_batch_idx: 30000, batch size: 108, loss[dur_loss=0.2286, prior_loss=0.9804, diff_loss=0.284, tot_loss=1.493, over 108.00 samples.], tot_loss[dur_loss=0.2286, prior_loss=0.9804, diff_loss=0.284, tot_loss=1.493, over 108.00 samples.], 2024-10-21 09:45:45,826 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 09:46:46,124 INFO [train.py:589] (1/4) Epoch 1876, validation: dur_loss=0.4542, prior_loss=1.035, diff_loss=0.4172, tot_loss=1.906, over 100.00 samples. 
2024-10-21 09:46:46,125 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 09:46:59,063 INFO [train.py:561] (1/4) Epoch 1876, batch 10, global_batch_idx: 30010, batch size: 111, loss[dur_loss=0.224, prior_loss=0.9803, diff_loss=0.3197, tot_loss=1.524, over 111.00 samples.], tot_loss[dur_loss=0.2189, prior_loss=0.9789, diff_loss=0.3878, tot_loss=1.586, over 1656.00 samples.], 2024-10-21 09:47:06,171 INFO [train.py:682] (1/4) Start epoch 1877 2024-10-21 09:47:20,005 INFO [train.py:561] (1/4) Epoch 1877, batch 4, global_batch_idx: 30020, batch size: 189, loss[dur_loss=0.2176, prior_loss=0.9791, diff_loss=0.3098, tot_loss=1.507, over 189.00 samples.], tot_loss[dur_loss=0.2166, prior_loss=0.9783, diff_loss=0.4317, tot_loss=1.627, over 937.00 samples.], 2024-10-21 09:47:34,839 INFO [train.py:561] (1/4) Epoch 1877, batch 14, global_batch_idx: 30030, batch size: 142, loss[dur_loss=0.2249, prior_loss=0.9791, diff_loss=0.3168, tot_loss=1.521, over 142.00 samples.], tot_loss[dur_loss=0.2196, prior_loss=0.979, diff_loss=0.368, tot_loss=1.567, over 2210.00 samples.], 2024-10-21 09:47:36,264 INFO [train.py:682] (1/4) Start epoch 1878 2024-10-21 09:47:56,492 INFO [train.py:561] (1/4) Epoch 1878, batch 8, global_batch_idx: 30040, batch size: 170, loss[dur_loss=0.2232, prior_loss=0.9792, diff_loss=0.3308, tot_loss=1.533, over 170.00 samples.], tot_loss[dur_loss=0.2187, prior_loss=0.9787, diff_loss=0.4001, tot_loss=1.597, over 1432.00 samples.], 2024-10-21 09:48:06,677 INFO [train.py:682] (1/4) Start epoch 1879 2024-10-21 09:48:18,094 INFO [train.py:561] (1/4) Epoch 1879, batch 2, global_batch_idx: 30050, batch size: 203, loss[dur_loss=0.2227, prior_loss=0.9793, diff_loss=0.3446, tot_loss=1.547, over 203.00 samples.], tot_loss[dur_loss=0.2236, prior_loss=0.9796, diff_loss=0.3213, tot_loss=1.525, over 442.00 samples.], 2024-10-21 09:48:32,376 INFO [train.py:561] (1/4) Epoch 1879, batch 12, global_batch_idx: 30060, batch size: 152, loss[dur_loss=0.2201, prior_loss=0.9792, diff_loss=0.2977, tot_loss=1.497, over 152.00 samples.], tot_loss[dur_loss=0.2196, prior_loss=0.979, diff_loss=0.3672, tot_loss=1.566, over 1966.00 samples.], 2024-10-21 09:48:36,833 INFO [train.py:682] (1/4) Start epoch 1880 2024-10-21 09:48:54,285 INFO [train.py:561] (1/4) Epoch 1880, batch 6, global_batch_idx: 30070, batch size: 106, loss[dur_loss=0.2237, prior_loss=0.9793, diff_loss=0.2809, tot_loss=1.484, over 106.00 samples.], tot_loss[dur_loss=0.2178, prior_loss=0.9786, diff_loss=0.4178, tot_loss=1.614, over 1142.00 samples.], 2024-10-21 09:49:07,389 INFO [train.py:682] (1/4) Start epoch 1881 2024-10-21 09:49:16,491 INFO [train.py:561] (1/4) Epoch 1881, batch 0, global_batch_idx: 30080, batch size: 108, loss[dur_loss=0.2272, prior_loss=0.9803, diff_loss=0.364, tot_loss=1.571, over 108.00 samples.], tot_loss[dur_loss=0.2272, prior_loss=0.9803, diff_loss=0.364, tot_loss=1.571, over 108.00 samples.], 2024-10-21 09:49:30,805 INFO [train.py:561] (1/4) Epoch 1881, batch 10, global_batch_idx: 30090, batch size: 111, loss[dur_loss=0.2265, prior_loss=0.9804, diff_loss=0.3291, tot_loss=1.536, over 111.00 samples.], tot_loss[dur_loss=0.2211, prior_loss=0.979, diff_loss=0.3822, tot_loss=1.582, over 1656.00 samples.], 2024-10-21 09:49:37,946 INFO [train.py:682] (1/4) Start epoch 1882 2024-10-21 09:49:51,934 INFO [train.py:561] (1/4) Epoch 1882, batch 4, global_batch_idx: 30100, batch size: 189, loss[dur_loss=0.2223, prior_loss=0.9793, diff_loss=0.3304, tot_loss=1.532, over 189.00 samples.], tot_loss[dur_loss=0.2187, 
prior_loss=0.9783, diff_loss=0.4371, tot_loss=1.634, over 937.00 samples.],
2024-10-21 09:50:06,824 INFO [train.py:561] (1/4) Epoch 1882, batch 14, global_batch_idx: 30110, batch size: 142, loss[dur_loss=0.2219, prior_loss=0.9791, diff_loss=0.2885, tot_loss=1.49, over 142.00 samples.], tot_loss[dur_loss=0.221, prior_loss=0.9789, diff_loss=0.361, tot_loss=1.561, over 2210.00 samples.],
2024-10-21 09:50:08,248 INFO [train.py:682] (1/4) Start epoch 1883
2024-10-21 09:50:28,179 INFO [train.py:561] (1/4) Epoch 1883, batch 8, global_batch_idx: 30120, batch size: 170, loss[dur_loss=0.2262, prior_loss=0.9792, diff_loss=0.3421, tot_loss=1.548, over 170.00 samples.], tot_loss[dur_loss=0.219, prior_loss=0.9787, diff_loss=0.3935, tot_loss=1.591, over 1432.00 samples.],
2024-10-21 09:50:38,301 INFO [train.py:682] (1/4) Start epoch 1884
2024-10-21 09:50:49,757 INFO [train.py:561] (1/4) Epoch 1884, batch 2, global_batch_idx: 30130, batch size: 203, loss[dur_loss=0.224, prior_loss=0.9793, diff_loss=0.3283, tot_loss=1.532, over 203.00 samples.], tot_loss[dur_loss=0.2238, prior_loss=0.9795, diff_loss=0.3215, tot_loss=1.525, over 442.00 samples.],
2024-10-21 09:51:04,008 INFO [train.py:561] (1/4) Epoch 1884, batch 12, global_batch_idx: 30140, batch size: 152, loss[dur_loss=0.223, prior_loss=0.9794, diff_loss=0.3008, tot_loss=1.503, over 152.00 samples.], tot_loss[dur_loss=0.22, prior_loss=0.9789, diff_loss=0.369, tot_loss=1.568, over 1966.00 samples.],
2024-10-21 09:51:08,458 INFO [train.py:682] (1/4) Start epoch 1885
2024-10-21 09:51:25,746 INFO [train.py:561] (1/4) Epoch 1885, batch 6, global_batch_idx: 30150, batch size: 106, loss[dur_loss=0.2216, prior_loss=0.9792, diff_loss=0.3183, tot_loss=1.519, over 106.00 samples.], tot_loss[dur_loss=0.2179, prior_loss=0.9783, diff_loss=0.4161, tot_loss=1.612, over 1142.00 samples.],
2024-10-21 09:51:38,774 INFO [train.py:682] (1/4) Start epoch 1886
2024-10-21 09:51:47,757 INFO [train.py:561] (1/4) Epoch 1886, batch 0, global_batch_idx: 30160, batch size: 108, loss[dur_loss=0.2274, prior_loss=0.9802, diff_loss=0.3036, tot_loss=1.511, over 108.00 samples.], tot_loss[dur_loss=0.2274, prior_loss=0.9802, diff_loss=0.3036, tot_loss=1.511, over 108.00 samples.],
2024-10-21 09:52:02,016 INFO [train.py:561] (1/4) Epoch 1886, batch 10, global_batch_idx: 30170, batch size: 111, loss[dur_loss=0.2217, prior_loss=0.9799, diff_loss=0.3124, tot_loss=1.514, over 111.00 samples.], tot_loss[dur_loss=0.2192, prior_loss=0.9787, diff_loss=0.3843, tot_loss=1.582, over 1656.00 samples.],
2024-10-21 09:52:09,157 INFO [train.py:682] (1/4) Start epoch 1887
2024-10-21 09:52:23,049 INFO [train.py:561] (1/4) Epoch 1887, batch 4, global_batch_idx: 30180, batch size: 189, loss[dur_loss=0.2216, prior_loss=0.9791, diff_loss=0.344, tot_loss=1.545, over 189.00 samples.], tot_loss[dur_loss=0.2173, prior_loss=0.9781, diff_loss=0.4328, tot_loss=1.628, over 937.00 samples.],
2024-10-21 09:52:37,952 INFO [train.py:561] (1/4) Epoch 1887, batch 14, global_batch_idx: 30190, batch size: 142, loss[dur_loss=0.2218, prior_loss=0.9788, diff_loss=0.3001, tot_loss=1.501, over 142.00 samples.], tot_loss[dur_loss=0.2201, prior_loss=0.9788, diff_loss=0.3667, tot_loss=1.566, over 2210.00 samples.],
2024-10-21 09:52:39,381 INFO [train.py:682] (1/4) Start epoch 1888
2024-10-21 09:52:59,352 INFO [train.py:561] (1/4) Epoch 1888, batch 8, global_batch_idx: 30200, batch size: 170, loss[dur_loss=0.2257, prior_loss=0.9794, diff_loss=0.3286, tot_loss=1.534, over 170.00 samples.], tot_loss[dur_loss=0.2197, prior_loss=0.9787, diff_loss=0.397, tot_loss=1.595, over 1432.00 samples.],
2024-10-21 09:53:09,528 INFO [train.py:682] (1/4) Start epoch 1889
2024-10-21 09:53:20,849 INFO [train.py:561] (1/4) Epoch 1889, batch 2, global_batch_idx: 30210, batch size: 203, loss[dur_loss=0.2215, prior_loss=0.9788, diff_loss=0.3438, tot_loss=1.544, over 203.00 samples.], tot_loss[dur_loss=0.2209, prior_loss=0.9791, diff_loss=0.3201, tot_loss=1.52, over 442.00 samples.],
2024-10-21 09:53:35,127 INFO [train.py:561] (1/4) Epoch 1889, batch 12, global_batch_idx: 30220, batch size: 152, loss[dur_loss=0.2207, prior_loss=0.9792, diff_loss=0.3351, tot_loss=1.535, over 152.00 samples.], tot_loss[dur_loss=0.2191, prior_loss=0.9787, diff_loss=0.3675, tot_loss=1.565, over 1966.00 samples.],
2024-10-21 09:53:39,575 INFO [train.py:682] (1/4) Start epoch 1890
2024-10-21 09:53:56,792 INFO [train.py:561] (1/4) Epoch 1890, batch 6, global_batch_idx: 30230, batch size: 106, loss[dur_loss=0.221, prior_loss=0.9793, diff_loss=0.2984, tot_loss=1.499, over 106.00 samples.], tot_loss[dur_loss=0.2175, prior_loss=0.9784, diff_loss=0.409, tot_loss=1.605, over 1142.00 samples.],
2024-10-21 09:54:09,845 INFO [train.py:682] (1/4) Start epoch 1891
2024-10-21 09:54:18,755 INFO [train.py:561] (1/4) Epoch 1891, batch 0, global_batch_idx: 30240, batch size: 108, loss[dur_loss=0.2322, prior_loss=0.9805, diff_loss=0.2651, tot_loss=1.478, over 108.00 samples.], tot_loss[dur_loss=0.2322, prior_loss=0.9805, diff_loss=0.2651, tot_loss=1.478, over 108.00 samples.],
2024-10-21 09:54:33,071 INFO [train.py:561] (1/4) Epoch 1891, batch 10, global_batch_idx: 30250, batch size: 111, loss[dur_loss=0.2264, prior_loss=0.9802, diff_loss=0.2927, tot_loss=1.499, over 111.00 samples.], tot_loss[dur_loss=0.2195, prior_loss=0.9788, diff_loss=0.3736, tot_loss=1.572, over 1656.00 samples.],
2024-10-21 09:54:40,188 INFO [train.py:682] (1/4) Start epoch 1892
2024-10-21 09:54:53,895 INFO [train.py:561] (1/4) Epoch 1892, batch 4, global_batch_idx: 30260, batch size: 189, loss[dur_loss=0.22, prior_loss=0.9792, diff_loss=0.3263, tot_loss=1.526, over 189.00 samples.], tot_loss[dur_loss=0.216, prior_loss=0.9782, diff_loss=0.4228, tot_loss=1.617, over 937.00 samples.],
2024-10-21 09:55:08,809 INFO [train.py:561] (1/4) Epoch 1892, batch 14, global_batch_idx: 30270, batch size: 142, loss[dur_loss=0.2233, prior_loss=0.9788, diff_loss=0.3026, tot_loss=1.505, over 142.00 samples.], tot_loss[dur_loss=0.2199, prior_loss=0.9789, diff_loss=0.3606, tot_loss=1.559, over 2210.00 samples.],
2024-10-21 09:55:10,231 INFO [train.py:682] (1/4) Start epoch 1893
2024-10-21 09:55:30,803 INFO [train.py:561] (1/4) Epoch 1893, batch 8, global_batch_idx: 30280, batch size: 170, loss[dur_loss=0.2264, prior_loss=0.9792, diff_loss=0.3224, tot_loss=1.528, over 170.00 samples.], tot_loss[dur_loss=0.2187, prior_loss=0.9786, diff_loss=0.4015, tot_loss=1.599, over 1432.00 samples.],
2024-10-21 09:55:40,937 INFO [train.py:682] (1/4) Start epoch 1894
2024-10-21 09:55:52,300 INFO [train.py:561] (1/4) Epoch 1894, batch 2, global_batch_idx: 30290, batch size: 203, loss[dur_loss=0.2192, prior_loss=0.9791, diff_loss=0.3387, tot_loss=1.537, over 203.00 samples.], tot_loss[dur_loss=0.2196, prior_loss=0.9793, diff_loss=0.3288, tot_loss=1.528, over 442.00 samples.],
2024-10-21 09:56:06,557 INFO [train.py:561] (1/4) Epoch 1894, batch 12, global_batch_idx: 30300, batch size: 152, loss[dur_loss=0.2236, prior_loss=0.9791, diff_loss=0.2804, tot_loss=1.483, over 152.00 samples.], tot_loss[dur_loss=0.2191, prior_loss=0.9788, diff_loss=0.3745, tot_loss=1.572, over 1966.00 samples.],
2024-10-21 09:56:11,018 INFO [train.py:682] (1/4) Start epoch 1895
2024-10-21 09:56:27,946 INFO [train.py:561] (1/4) Epoch 1895, batch 6, global_batch_idx: 30310, batch size: 106, loss[dur_loss=0.2202, prior_loss=0.9791, diff_loss=0.3082, tot_loss=1.507, over 106.00 samples.], tot_loss[dur_loss=0.2173, prior_loss=0.9783, diff_loss=0.4144, tot_loss=1.61, over 1142.00 samples.],
2024-10-21 09:56:40,916 INFO [train.py:682] (1/4) Start epoch 1896
2024-10-21 09:56:49,520 INFO [train.py:561] (1/4) Epoch 1896, batch 0, global_batch_idx: 30320, batch size: 108, loss[dur_loss=0.2284, prior_loss=0.9803, diff_loss=0.3301, tot_loss=1.539, over 108.00 samples.], tot_loss[dur_loss=0.2284, prior_loss=0.9803, diff_loss=0.3301, tot_loss=1.539, over 108.00 samples.],
2024-10-21 09:57:03,729 INFO [train.py:561] (1/4) Epoch 1896, batch 10, global_batch_idx: 30330, batch size: 111, loss[dur_loss=0.2254, prior_loss=0.98, diff_loss=0.3211, tot_loss=1.527, over 111.00 samples.], tot_loss[dur_loss=0.2188, prior_loss=0.9787, diff_loss=0.3782, tot_loss=1.576, over 1656.00 samples.],
2024-10-21 09:57:10,810 INFO [train.py:682] (1/4) Start epoch 1897
2024-10-21 09:57:24,391 INFO [train.py:561] (1/4) Epoch 1897, batch 4, global_batch_idx: 30340, batch size: 189, loss[dur_loss=0.217, prior_loss=0.9786, diff_loss=0.3735, tot_loss=1.569, over 189.00 samples.], tot_loss[dur_loss=0.215, prior_loss=0.9781, diff_loss=0.4472, tot_loss=1.64, over 937.00 samples.],
2024-10-21 09:57:39,250 INFO [train.py:561] (1/4) Epoch 1897, batch 14, global_batch_idx: 30350, batch size: 142, loss[dur_loss=0.2225, prior_loss=0.9791, diff_loss=0.3065, tot_loss=1.508, over 142.00 samples.], tot_loss[dur_loss=0.2187, prior_loss=0.9789, diff_loss=0.365, tot_loss=1.563, over 2210.00 samples.],
2024-10-21 09:57:40,664 INFO [train.py:682] (1/4) Start epoch 1898
2024-10-21 09:58:00,632 INFO [train.py:561] (1/4) Epoch 1898, batch 8, global_batch_idx: 30360, batch size: 170, loss[dur_loss=0.2217, prior_loss=0.9793, diff_loss=0.3014, tot_loss=1.502, over 170.00 samples.], tot_loss[dur_loss=0.2182, prior_loss=0.9786, diff_loss=0.3883, tot_loss=1.585, over 1432.00 samples.],
2024-10-21 09:58:10,753 INFO [train.py:682] (1/4) Start epoch 1899
2024-10-21 09:58:22,082 INFO [train.py:561] (1/4) Epoch 1899, batch 2, global_batch_idx: 30370, batch size: 203, loss[dur_loss=0.22, prior_loss=0.979, diff_loss=0.3328, tot_loss=1.532, over 203.00 samples.], tot_loss[dur_loss=0.222, prior_loss=0.9792, diff_loss=0.3337, tot_loss=1.535, over 442.00 samples.],
2024-10-21 09:58:36,312 INFO [train.py:561] (1/4) Epoch 1899, batch 12, global_batch_idx: 30380, batch size: 152, loss[dur_loss=0.2199, prior_loss=0.9792, diff_loss=0.3089, tot_loss=1.508, over 152.00 samples.], tot_loss[dur_loss=0.219, prior_loss=0.9788, diff_loss=0.3699, tot_loss=1.568, over 1966.00 samples.],
2024-10-21 09:58:40,747 INFO [train.py:682] (1/4) Start epoch 1900
2024-10-21 09:58:57,666 INFO [train.py:561] (1/4) Epoch 1900, batch 6, global_batch_idx: 30390, batch size: 106, loss[dur_loss=0.2213, prior_loss=0.979, diff_loss=0.2718, tot_loss=1.472, over 106.00 samples.], tot_loss[dur_loss=0.2159, prior_loss=0.9783, diff_loss=0.4049, tot_loss=1.599, over 1142.00 samples.],
2024-10-21 09:59:10,709 INFO [train.py:682] (1/4) Start epoch 1901
2024-10-21 09:59:19,843 INFO [train.py:561] (1/4) Epoch 1901, batch 0, global_batch_idx: 30400, batch size: 108, loss[dur_loss=0.2263, prior_loss=0.9802, diff_loss=0.3117, tot_loss=1.518, over 108.00 samples.], tot_loss[dur_loss=0.2263, prior_loss=0.9802, diff_loss=0.3117, tot_loss=1.518, over 108.00 samples.],
2024-10-21 09:59:34,286 INFO [train.py:561] (1/4) Epoch 1901, batch 10, global_batch_idx: 30410, batch size: 111, loss[dur_loss=0.2225, prior_loss=0.98, diff_loss=0.325, tot_loss=1.528, over 111.00 samples.], tot_loss[dur_loss=0.2182, prior_loss=0.9787, diff_loss=0.3807, tot_loss=1.578, over 1656.00 samples.],
2024-10-21 09:59:41,401 INFO [train.py:682] (1/4) Start epoch 1902
2024-10-21 09:59:55,393 INFO [train.py:561] (1/4) Epoch 1902, batch 4, global_batch_idx: 30420, batch size: 189, loss[dur_loss=0.2204, prior_loss=0.9791, diff_loss=0.3235, tot_loss=1.523, over 189.00 samples.], tot_loss[dur_loss=0.2155, prior_loss=0.9782, diff_loss=0.4256, tot_loss=1.619, over 937.00 samples.],
2024-10-21 10:00:10,247 INFO [train.py:561] (1/4) Epoch 1902, batch 14, global_batch_idx: 30430, batch size: 142, loss[dur_loss=0.2211, prior_loss=0.9786, diff_loss=0.3035, tot_loss=1.503, over 142.00 samples.], tot_loss[dur_loss=0.2187, prior_loss=0.9787, diff_loss=0.3652, tot_loss=1.563, over 2210.00 samples.],
2024-10-21 10:00:11,677 INFO [train.py:682] (1/4) Start epoch 1903
2024-10-21 10:00:31,676 INFO [train.py:561] (1/4) Epoch 1903, batch 8, global_batch_idx: 30440, batch size: 170, loss[dur_loss=0.2222, prior_loss=0.9792, diff_loss=0.3231, tot_loss=1.524, over 170.00 samples.], tot_loss[dur_loss=0.2173, prior_loss=0.9784, diff_loss=0.393, tot_loss=1.589, over 1432.00 samples.],
2024-10-21 10:00:41,764 INFO [train.py:682] (1/4) Start epoch 1904
2024-10-21 10:00:52,973 INFO [train.py:561] (1/4) Epoch 1904, batch 2, global_batch_idx: 30450, batch size: 203, loss[dur_loss=0.22, prior_loss=0.9789, diff_loss=0.3168, tot_loss=1.516, over 203.00 samples.], tot_loss[dur_loss=0.2197, prior_loss=0.9791, diff_loss=0.3102, tot_loss=1.509, over 442.00 samples.],
2024-10-21 10:01:07,202 INFO [train.py:561] (1/4) Epoch 1904, batch 12, global_batch_idx: 30460, batch size: 152, loss[dur_loss=0.2158, prior_loss=0.9789, diff_loss=0.3046, tot_loss=1.499, over 152.00 samples.], tot_loss[dur_loss=0.2172, prior_loss=0.9786, diff_loss=0.3645, tot_loss=1.56, over 1966.00 samples.],
2024-10-21 10:01:11,665 INFO [train.py:682] (1/4) Start epoch 1905
2024-10-21 10:01:28,506 INFO [train.py:561] (1/4) Epoch 1905, batch 6, global_batch_idx: 30470, batch size: 106, loss[dur_loss=0.2167, prior_loss=0.979, diff_loss=0.3024, tot_loss=1.498, over 106.00 samples.], tot_loss[dur_loss=0.2136, prior_loss=0.9782, diff_loss=0.4119, tot_loss=1.604, over 1142.00 samples.],
2024-10-21 10:01:41,557 INFO [train.py:682] (1/4) Start epoch 1906
2024-10-21 10:01:50,496 INFO [train.py:561] (1/4) Epoch 1906, batch 0, global_batch_idx: 30480, batch size: 108, loss[dur_loss=0.2266, prior_loss=0.98, diff_loss=0.3258, tot_loss=1.532, over 108.00 samples.], tot_loss[dur_loss=0.2266, prior_loss=0.98, diff_loss=0.3258, tot_loss=1.532, over 108.00 samples.],
2024-10-21 10:02:04,702 INFO [train.py:561] (1/4) Epoch 1906, batch 10, global_batch_idx: 30490, batch size: 111, loss[dur_loss=0.2225, prior_loss=0.98, diff_loss=0.3151, tot_loss=1.518, over 111.00 samples.], tot_loss[dur_loss=0.2172, prior_loss=0.9785, diff_loss=0.3817, tot_loss=1.577, over 1656.00 samples.],
2024-10-21 10:02:11,766 INFO [train.py:682] (1/4) Start epoch 1907
2024-10-21 10:02:25,468 INFO [train.py:561] (1/4) Epoch 1907, batch 4, global_batch_idx: 30500, batch size: 189, loss[dur_loss=0.2164, prior_loss=0.9792, diff_loss=0.3458, tot_loss=1.541, over 189.00 samples.], tot_loss[dur_loss=0.2138, prior_loss=0.9781, diff_loss=0.4467, tot_loss=1.639, over 937.00 samples.],
2024-10-21 10:02:40,374 INFO [train.py:561] (1/4) Epoch 1907, batch 14, global_batch_idx: 30510, batch size: 142, loss[dur_loss=0.2219, prior_loss=0.979, diff_loss=0.3081, tot_loss=1.509, over 142.00 samples.], tot_loss[dur_loss=0.2176, prior_loss=0.9788, diff_loss=0.3764, tot_loss=1.573, over 2210.00 samples.],
2024-10-21 10:02:41,799 INFO [train.py:682] (1/4) Start epoch 1908
2024-10-21 10:03:01,615 INFO [train.py:561] (1/4) Epoch 1908, batch 8, global_batch_idx: 30520, batch size: 170, loss[dur_loss=0.2257, prior_loss=0.9795, diff_loss=0.3287, tot_loss=1.534, over 170.00 samples.], tot_loss[dur_loss=0.2173, prior_loss=0.9785, diff_loss=0.3858, tot_loss=1.582, over 1432.00 samples.],
2024-10-21 10:03:11,700 INFO [train.py:682] (1/4) Start epoch 1909
2024-10-21 10:03:23,096 INFO [train.py:561] (1/4) Epoch 1909, batch 2, global_batch_idx: 30530, batch size: 203, loss[dur_loss=0.2186, prior_loss=0.9787, diff_loss=0.3301, tot_loss=1.527, over 203.00 samples.], tot_loss[dur_loss=0.2197, prior_loss=0.979, diff_loss=0.3315, tot_loss=1.53, over 442.00 samples.],
2024-10-21 10:03:37,262 INFO [train.py:561] (1/4) Epoch 1909, batch 12, global_batch_idx: 30540, batch size: 152, loss[dur_loss=0.218, prior_loss=0.9787, diff_loss=0.3013, tot_loss=1.498, over 152.00 samples.], tot_loss[dur_loss=0.2169, prior_loss=0.9786, diff_loss=0.369, tot_loss=1.564, over 1966.00 samples.],
2024-10-21 10:03:41,678 INFO [train.py:682] (1/4) Start epoch 1910
2024-10-21 10:03:58,799 INFO [train.py:561] (1/4) Epoch 1910, batch 6, global_batch_idx: 30550, batch size: 106, loss[dur_loss=0.2154, prior_loss=0.9787, diff_loss=0.2874, tot_loss=1.481, over 106.00 samples.], tot_loss[dur_loss=0.2142, prior_loss=0.9782, diff_loss=0.4071, tot_loss=1.599, over 1142.00 samples.],
2024-10-21 10:04:11,805 INFO [train.py:682] (1/4) Start epoch 1911
2024-10-21 10:04:20,249 INFO [train.py:561] (1/4) Epoch 1911, batch 0, global_batch_idx: 30560, batch size: 108, loss[dur_loss=0.2265, prior_loss=0.98, diff_loss=0.3151, tot_loss=1.522, over 108.00 samples.], tot_loss[dur_loss=0.2265, prior_loss=0.98, diff_loss=0.3151, tot_loss=1.522, over 108.00 samples.],
2024-10-21 10:04:34,477 INFO [train.py:561] (1/4) Epoch 1911, batch 10, global_batch_idx: 30570, batch size: 111, loss[dur_loss=0.2198, prior_loss=0.9795, diff_loss=0.343, tot_loss=1.542, over 111.00 samples.], tot_loss[dur_loss=0.2164, prior_loss=0.9785, diff_loss=0.3767, tot_loss=1.572, over 1656.00 samples.],
2024-10-21 10:04:41,606 INFO [train.py:682] (1/4) Start epoch 1912
2024-10-21 10:04:55,327 INFO [train.py:561] (1/4) Epoch 1912, batch 4, global_batch_idx: 30580, batch size: 189, loss[dur_loss=0.2163, prior_loss=0.9788, diff_loss=0.3185, tot_loss=1.514, over 189.00 samples.], tot_loss[dur_loss=0.2125, prior_loss=0.9779, diff_loss=0.4288, tot_loss=1.619, over 937.00 samples.],
2024-10-21 10:05:10,131 INFO [train.py:561] (1/4) Epoch 1912, batch 14, global_batch_idx: 30590, batch size: 142, loss[dur_loss=0.2237, prior_loss=0.9788, diff_loss=0.2999, tot_loss=1.502, over 142.00 samples.], tot_loss[dur_loss=0.2168, prior_loss=0.9785, diff_loss=0.3683, tot_loss=1.564, over 2210.00 samples.],
2024-10-21 10:05:11,550 INFO [train.py:682] (1/4) Start epoch 1913
2024-10-21 10:05:31,395 INFO [train.py:561] (1/4) Epoch 1913, batch 8, global_batch_idx: 30600, batch size: 170, loss[dur_loss=0.2222, prior_loss=0.979, diff_loss=0.3135, tot_loss=1.515, over 170.00 samples.], tot_loss[dur_loss=0.2158, prior_loss=0.9783, diff_loss=0.3871, tot_loss=1.581, over 1432.00 samples.],
2024-10-21 10:05:41,490 INFO [train.py:682] (1/4) Start epoch 1914
2024-10-21 10:05:52,647 INFO [train.py:561] (1/4) Epoch 1914, batch 2, global_batch_idx: 30610, batch size: 203, loss[dur_loss=0.2184, prior_loss=0.9788, diff_loss=0.3634, tot_loss=1.561, over 203.00 samples.], tot_loss[dur_loss=0.2207, prior_loss=0.979, diff_loss=0.3497, tot_loss=1.549, over 442.00 samples.],
2024-10-21 10:06:06,892 INFO [train.py:561] (1/4) Epoch 1914, batch 12, global_batch_idx: 30620, batch size: 152, loss[dur_loss=0.2169, prior_loss=0.9787, diff_loss=0.3297, tot_loss=1.525, over 152.00 samples.], tot_loss[dur_loss=0.2171, prior_loss=0.9785, diff_loss=0.3773, tot_loss=1.573, over 1966.00 samples.],
2024-10-21 10:06:11,341 INFO [train.py:682] (1/4) Start epoch 1915
2024-10-21 10:06:28,396 INFO [train.py:561] (1/4) Epoch 1915, batch 6, global_batch_idx: 30630, batch size: 106, loss[dur_loss=0.2176, prior_loss=0.9788, diff_loss=0.2942, tot_loss=1.491, over 106.00 samples.], tot_loss[dur_loss=0.2143, prior_loss=0.978, diff_loss=0.3947, tot_loss=1.587, over 1142.00 samples.],
2024-10-21 10:06:41,478 INFO [train.py:682] (1/4) Start epoch 1916
2024-10-21 10:06:50,167 INFO [train.py:561] (1/4) Epoch 1916, batch 0, global_batch_idx: 30640, batch size: 108, loss[dur_loss=0.2227, prior_loss=0.9798, diff_loss=0.2721, tot_loss=1.475, over 108.00 samples.], tot_loss[dur_loss=0.2227, prior_loss=0.9798, diff_loss=0.2721, tot_loss=1.475, over 108.00 samples.],
2024-10-21 10:07:04,393 INFO [train.py:561] (1/4) Epoch 1916, batch 10, global_batch_idx: 30650, batch size: 111, loss[dur_loss=0.2192, prior_loss=0.9798, diff_loss=0.3167, tot_loss=1.516, over 111.00 samples.], tot_loss[dur_loss=0.2175, prior_loss=0.9784, diff_loss=0.3726, tot_loss=1.568, over 1656.00 samples.],
2024-10-21 10:07:11,491 INFO [train.py:682] (1/4) Start epoch 1917
2024-10-21 10:07:25,089 INFO [train.py:561] (1/4) Epoch 1917, batch 4, global_batch_idx: 30660, batch size: 189, loss[dur_loss=0.2146, prior_loss=0.9787, diff_loss=0.3361, tot_loss=1.529, over 189.00 samples.], tot_loss[dur_loss=0.2139, prior_loss=0.9779, diff_loss=0.4382, tot_loss=1.63, over 937.00 samples.],
2024-10-21 10:07:39,941 INFO [train.py:561] (1/4) Epoch 1917, batch 14, global_batch_idx: 30670, batch size: 142, loss[dur_loss=0.2204, prior_loss=0.9784, diff_loss=0.2963, tot_loss=1.495, over 142.00 samples.], tot_loss[dur_loss=0.2169, prior_loss=0.9785, diff_loss=0.3628, tot_loss=1.558, over 2210.00 samples.],
2024-10-21 10:07:41,355 INFO [train.py:682] (1/4) Start epoch 1918
2024-10-21 10:08:01,341 INFO [train.py:561] (1/4) Epoch 1918, batch 8, global_batch_idx: 30680, batch size: 170, loss[dur_loss=0.2212, prior_loss=0.9786, diff_loss=0.3152, tot_loss=1.515, over 170.00 samples.], tot_loss[dur_loss=0.216, prior_loss=0.9782, diff_loss=0.3934, tot_loss=1.588, over 1432.00 samples.],
2024-10-21 10:08:11,423 INFO [train.py:682] (1/4) Start epoch 1919
2024-10-21 10:08:22,738 INFO [train.py:561] (1/4) Epoch 1919, batch 2, global_batch_idx: 30690, batch size: 203, loss[dur_loss=0.2197, prior_loss=0.9788, diff_loss=0.3365, tot_loss=1.535, over 203.00 samples.], tot_loss[dur_loss=0.2214, prior_loss=0.9789, diff_loss=0.3174, tot_loss=1.518, over 442.00 samples.],
2024-10-21 10:08:37,018 INFO [train.py:561] (1/4) Epoch 1919, batch 12, global_batch_idx: 30700, batch size: 152, loss[dur_loss=0.217, prior_loss=0.9788, diff_loss=0.3191, tot_loss=1.515, over 152.00 samples.], tot_loss[dur_loss=0.2168, prior_loss=0.9785, diff_loss=0.3674, tot_loss=1.563, over 1966.00 samples.],
2024-10-21 10:08:41,483 INFO [train.py:682] (1/4) Start epoch 1920
2024-10-21 10:08:58,527 INFO [train.py:561] (1/4) Epoch 1920, batch 6, global_batch_idx: 30710, batch size: 106, loss[dur_loss=0.2193, prior_loss=0.9787, diff_loss=0.3178, tot_loss=1.516, over 106.00 samples.], tot_loss[dur_loss=0.2161, prior_loss=0.9782, diff_loss=0.4101, tot_loss=1.604, over 1142.00 samples.],
2024-10-21 10:09:11,714 INFO [train.py:682] (1/4) Start epoch 1921
2024-10-21 10:09:20,309 INFO [train.py:561] (1/4) Epoch 1921, batch 0, global_batch_idx: 30720, batch size: 108, loss[dur_loss=0.2234, prior_loss=0.9797, diff_loss=0.281, tot_loss=1.484, over 108.00 samples.], tot_loss[dur_loss=0.2234, prior_loss=0.9797, diff_loss=0.281, tot_loss=1.484, over 108.00 samples.],
2024-10-21 10:09:34,548 INFO [train.py:561] (1/4) Epoch 1921, batch 10, global_batch_idx: 30730, batch size: 111, loss[dur_loss=0.2212, prior_loss=0.9801, diff_loss=0.3086, tot_loss=1.51, over 111.00 samples.], tot_loss[dur_loss=0.2173, prior_loss=0.9785, diff_loss=0.3744, tot_loss=1.57, over 1656.00 samples.],
2024-10-21 10:09:41,613 INFO [train.py:682] (1/4) Start epoch 1922
2024-10-21 10:09:55,264 INFO [train.py:561] (1/4) Epoch 1922, batch 4, global_batch_idx: 30740, batch size: 189, loss[dur_loss=0.2176, prior_loss=0.9789, diff_loss=0.3433, tot_loss=1.54, over 189.00 samples.], tot_loss[dur_loss=0.215, prior_loss=0.978, diff_loss=0.4377, tot_loss=1.631, over 937.00 samples.],
2024-10-21 10:10:10,143 INFO [train.py:561] (1/4) Epoch 1922, batch 14, global_batch_idx: 30750, batch size: 142, loss[dur_loss=0.2178, prior_loss=0.9787, diff_loss=0.305, tot_loss=1.501, over 142.00 samples.], tot_loss[dur_loss=0.2176, prior_loss=0.9786, diff_loss=0.3646, tot_loss=1.561, over 2210.00 samples.],
2024-10-21 10:10:11,570 INFO [train.py:682] (1/4) Start epoch 1923
2024-10-21 10:10:31,403 INFO [train.py:561] (1/4) Epoch 1923, batch 8, global_batch_idx: 30760, batch size: 170, loss[dur_loss=0.2242, prior_loss=0.979, diff_loss=0.3485, tot_loss=1.552, over 170.00 samples.], tot_loss[dur_loss=0.2162, prior_loss=0.9782, diff_loss=0.3991, tot_loss=1.594, over 1432.00 samples.],
2024-10-21 10:10:41,624 INFO [train.py:682] (1/4) Start epoch 1924
2024-10-21 10:10:53,100 INFO [train.py:561] (1/4) Epoch 1924, batch 2, global_batch_idx: 30770, batch size: 203, loss[dur_loss=0.2196, prior_loss=0.9787, diff_loss=0.352, tot_loss=1.55, over 203.00 samples.], tot_loss[dur_loss=0.2215, prior_loss=0.9789, diff_loss=0.3278, tot_loss=1.528, over 442.00 samples.],
2024-10-21 10:11:07,301 INFO [train.py:561] (1/4) Epoch 1924, batch 12, global_batch_idx: 30780, batch size: 152, loss[dur_loss=0.2155, prior_loss=0.9788, diff_loss=0.3251, tot_loss=1.519, over 152.00 samples.], tot_loss[dur_loss=0.2175, prior_loss=0.9784, diff_loss=0.368, tot_loss=1.564, over 1966.00 samples.],
2024-10-21 10:11:11,730 INFO [train.py:682] (1/4) Start epoch 1925
2024-10-21 10:11:28,537 INFO [train.py:561] (1/4) Epoch 1925, batch 6, global_batch_idx: 30790, batch size: 106, loss[dur_loss=0.2197, prior_loss=0.9788, diff_loss=0.3063, tot_loss=1.505, over 106.00 samples.], tot_loss[dur_loss=0.2145, prior_loss=0.978, diff_loss=0.398, tot_loss=1.591, over 1142.00 samples.],
2024-10-21 10:11:41,547 INFO [train.py:682] (1/4) Start epoch 1926
2024-10-21 10:11:50,294 INFO [train.py:561] (1/4) Epoch 1926, batch 0, global_batch_idx: 30800, batch size: 108, loss[dur_loss=0.2251, prior_loss=0.9797, diff_loss=0.2968, tot_loss=1.501, over 108.00 samples.], tot_loss[dur_loss=0.2251, prior_loss=0.9797, diff_loss=0.2968, tot_loss=1.501, over 108.00 samples.],
2024-10-21 10:12:04,520 INFO [train.py:561] (1/4) Epoch 1926, batch 10, global_batch_idx: 30810, batch size: 111, loss[dur_loss=0.2237, prior_loss=0.9803, diff_loss=0.2683, tot_loss=1.472, over 111.00 samples.], tot_loss[dur_loss=0.2177, prior_loss=0.9786, diff_loss=0.3809, tot_loss=1.577, over 1656.00 samples.],
2024-10-21 10:12:11,621 INFO [train.py:682] (1/4) Start epoch 1927
2024-10-21 10:12:25,280 INFO [train.py:561] (1/4) Epoch 1927, batch 4, global_batch_idx: 30820, batch size: 189, loss[dur_loss=0.2163, prior_loss=0.9786, diff_loss=0.3521, tot_loss=1.547, over 189.00 samples.], tot_loss[dur_loss=0.2153, prior_loss=0.9778, diff_loss=0.4436, tot_loss=1.637, over 937.00 samples.],
2024-10-21 10:12:40,239 INFO [train.py:561] (1/4) Epoch 1927, batch 14, global_batch_idx: 30830, batch size: 142, loss[dur_loss=0.2185, prior_loss=0.9782, diff_loss=0.2935, tot_loss=1.49, over 142.00 samples.], tot_loss[dur_loss=0.2176, prior_loss=0.9784, diff_loss=0.3689, tot_loss=1.565, over 2210.00 samples.],
2024-10-21 10:12:41,661 INFO [train.py:682] (1/4) Start epoch 1928
2024-10-21 10:13:01,293 INFO [train.py:561] (1/4) Epoch 1928, batch 8, global_batch_idx: 30840, batch size: 170, loss[dur_loss=0.2205, prior_loss=0.9788, diff_loss=0.3345, tot_loss=1.534, over 170.00 samples.], tot_loss[dur_loss=0.2162, prior_loss=0.9781, diff_loss=0.3948, tot_loss=1.589, over 1432.00 samples.],
2024-10-21 10:13:11,405 INFO [train.py:682] (1/4) Start epoch 1929
2024-10-21 10:13:22,863 INFO [train.py:561] (1/4) Epoch 1929, batch 2, global_batch_idx: 30850, batch size: 203, loss[dur_loss=0.2209, prior_loss=0.9787, diff_loss=0.3426, tot_loss=1.542, over 203.00 samples.], tot_loss[dur_loss=0.2215, prior_loss=0.9788, diff_loss=0.3316, tot_loss=1.532, over 442.00 samples.],
2024-10-21 10:13:37,089 INFO [train.py:561] (1/4) Epoch 1929, batch 12, global_batch_idx: 30860, batch size: 152, loss[dur_loss=0.2165, prior_loss=0.9788, diff_loss=0.3331, tot_loss=1.528, over 152.00 samples.], tot_loss[dur_loss=0.2176, prior_loss=0.9783, diff_loss=0.3742, tot_loss=1.57, over 1966.00 samples.],
2024-10-21 10:13:41,527 INFO [train.py:682] (1/4) Start epoch 1930
2024-10-21 10:13:58,266 INFO [train.py:561] (1/4) Epoch 1930, batch 6, global_batch_idx: 30870, batch size: 106, loss[dur_loss=0.2203, prior_loss=0.9784, diff_loss=0.3158, tot_loss=1.515, over 106.00 samples.], tot_loss[dur_loss=0.2153, prior_loss=0.9778, diff_loss=0.417, tot_loss=1.61, over 1142.00 samples.],
2024-10-21 10:14:11,320 INFO [train.py:682] (1/4) Start epoch 1931
2024-10-21 10:14:19,821 INFO [train.py:561] (1/4) Epoch 1931, batch 0, global_batch_idx: 30880, batch size: 108, loss[dur_loss=0.2209, prior_loss=0.9796, diff_loss=0.2989, tot_loss=1.499, over 108.00 samples.], tot_loss[dur_loss=0.2209, prior_loss=0.9796, diff_loss=0.2989, tot_loss=1.499, over 108.00 samples.],
2024-10-21 10:14:34,017 INFO [train.py:561] (1/4) Epoch 1931, batch 10, global_batch_idx: 30890, batch size: 111, loss[dur_loss=0.2225, prior_loss=0.9796, diff_loss=0.3277, tot_loss=1.53, over 111.00 samples.], tot_loss[dur_loss=0.217, prior_loss=0.9782, diff_loss=0.3895, tot_loss=1.585, over 1656.00 samples.],
2024-10-21 10:14:41,101 INFO [train.py:682] (1/4) Start epoch 1932
2024-10-21 10:14:54,867 INFO [train.py:561] (1/4) Epoch 1932, batch 4, global_batch_idx: 30900, batch size: 189, loss[dur_loss=0.22, prior_loss=0.9784, diff_loss=0.3486, tot_loss=1.547, over 189.00 samples.], tot_loss[dur_loss=0.2145, prior_loss=0.9777, diff_loss=0.4358, tot_loss=1.628, over 937.00 samples.],
2024-10-21 10:15:09,744 INFO [train.py:561] (1/4) Epoch 1932, batch 14, global_batch_idx: 30910, batch size: 142, loss[dur_loss=0.2186, prior_loss=0.9783, diff_loss=0.3323, tot_loss=1.529, over 142.00 samples.], tot_loss[dur_loss=0.2182, prior_loss=0.9783, diff_loss=0.3646, tot_loss=1.561, over 2210.00 samples.],
2024-10-21 10:15:11,180 INFO [train.py:682] (1/4) Start epoch 1933
2024-10-21 10:15:30,881 INFO [train.py:561] (1/4) Epoch 1933, batch 8, global_batch_idx: 30920, batch size: 170, loss[dur_loss=0.2233, prior_loss=0.9786, diff_loss=0.3343, tot_loss=1.536, over 170.00 samples.], tot_loss[dur_loss=0.2157, prior_loss=0.978, diff_loss=0.3897, tot_loss=1.583, over 1432.00 samples.],
2024-10-21 10:15:41,030 INFO [train.py:682] (1/4) Start epoch 1934
2024-10-21 10:15:52,499 INFO [train.py:561] (1/4) Epoch 1934, batch 2, global_batch_idx: 30930, batch size: 203, loss[dur_loss=0.2186, prior_loss=0.9784, diff_loss=0.3661, tot_loss=1.563, over 203.00 samples.], tot_loss[dur_loss=0.2203, prior_loss=0.9788, diff_loss=0.328, tot_loss=1.527, over 442.00 samples.],
2024-10-21 10:16:06,776 INFO [train.py:561] (1/4) Epoch 1934, batch 12, global_batch_idx: 30940, batch size: 152, loss[dur_loss=0.2175, prior_loss=0.9788, diff_loss=0.3205, tot_loss=1.517, over 152.00 samples.], tot_loss[dur_loss=0.2168, prior_loss=0.9783, diff_loss=0.3751, tot_loss=1.57, over 1966.00 samples.],
2024-10-21 10:16:11,237 INFO [train.py:682] (1/4) Start epoch 1935
2024-10-21 10:16:28,407 INFO [train.py:561] (1/4) Epoch 1935, batch 6, global_batch_idx: 30950, batch size: 106, loss[dur_loss=0.2173, prior_loss=0.9785, diff_loss=0.2942, tot_loss=1.49, over 106.00 samples.], tot_loss[dur_loss=0.2153, prior_loss=0.9778, diff_loss=0.4106, tot_loss=1.604, over 1142.00 samples.],
2024-10-21 10:16:41,491 INFO [train.py:682] (1/4) Start epoch 1936
2024-10-21 10:16:50,135 INFO [train.py:561] (1/4) Epoch 1936, batch 0, global_batch_idx: 30960, batch size: 108, loss[dur_loss=0.2235, prior_loss=0.9796, diff_loss=0.3413, tot_loss=1.544, over 108.00 samples.], tot_loss[dur_loss=0.2235, prior_loss=0.9796, diff_loss=0.3413, tot_loss=1.544, over 108.00 samples.],
2024-10-21 10:17:04,351 INFO [train.py:561] (1/4) Epoch 1936, batch 10, global_batch_idx: 30970, batch size: 111, loss[dur_loss=0.2237, prior_loss=0.9796, diff_loss=0.32, tot_loss=1.523, over 111.00 samples.], tot_loss[dur_loss=0.2166, prior_loss=0.9782, diff_loss=0.3862, tot_loss=1.581, over 1656.00 samples.],
2024-10-21 10:17:11,421 INFO [train.py:682] (1/4) Start epoch 1937
2024-10-21 10:17:25,147 INFO [train.py:561] (1/4) Epoch 1937, batch 4, global_batch_idx: 30980, batch size: 189, loss[dur_loss=0.2155, prior_loss=0.9785, diff_loss=0.3189, tot_loss=1.513, over 189.00 samples.], tot_loss[dur_loss=0.2135, prior_loss=0.9777, diff_loss=0.4174, tot_loss=1.609, over 937.00 samples.],
2024-10-21 10:17:40,121 INFO [train.py:561] (1/4) Epoch 1937, batch 14, global_batch_idx: 30990, batch size: 142, loss[dur_loss=0.217, prior_loss=0.9783, diff_loss=0.3125, tot_loss=1.508, over 142.00 samples.], tot_loss[dur_loss=0.2164, prior_loss=0.9783, diff_loss=0.3615, tot_loss=1.556, over 2210.00 samples.],
2024-10-21 10:17:41,559 INFO [train.py:682] (1/4) Start epoch 1938
2024-10-21 10:18:01,466 INFO [train.py:561] (1/4) Epoch 1938, batch 8, global_batch_idx: 31000, batch size: 170, loss[dur_loss=0.2214, prior_loss=0.9786, diff_loss=0.3319, tot_loss=1.532, over 170.00 samples.], tot_loss[dur_loss=0.2154, prior_loss=0.978, diff_loss=0.399, tot_loss=1.592, over 1432.00 samples.],
2024-10-21 10:18:11,648 INFO [train.py:682] (1/4) Start epoch 1939
2024-10-21 10:18:22,747 INFO [train.py:561] (1/4) Epoch 1939, batch 2, global_batch_idx: 31010, batch size: 203, loss[dur_loss=0.2162, prior_loss=0.9784, diff_loss=0.315, tot_loss=1.51, over 203.00 samples.], tot_loss[dur_loss=0.2172, prior_loss=0.9786, diff_loss=0.3043, tot_loss=1.5, over 442.00 samples.],
2024-10-21 10:18:37,003 INFO [train.py:561] (1/4) Epoch 1939, batch 12, global_batch_idx: 31020, batch size: 152, loss[dur_loss=0.2196, prior_loss=0.9787, diff_loss=0.311, tot_loss=1.509, over 152.00 samples.], tot_loss[dur_loss=0.2158, prior_loss=0.9781, diff_loss=0.3674, tot_loss=1.561, over 1966.00 samples.],
2024-10-21 10:18:41,449 INFO [train.py:682] (1/4) Start epoch 1940
2024-10-21 10:18:58,523 INFO [train.py:561] (1/4) Epoch 1940, batch 6, global_batch_idx: 31030, batch size: 106, loss[dur_loss=0.2204, prior_loss=0.9786, diff_loss=0.3316, tot_loss=1.531, over 106.00 samples.], tot_loss[dur_loss=0.2152, prior_loss=0.9778, diff_loss=0.4006, tot_loss=1.594, over 1142.00 samples.],
2024-10-21 10:19:11,733 INFO [train.py:682] (1/4) Start epoch 1941
2024-10-21 10:19:20,512 INFO [train.py:561] (1/4) Epoch 1941, batch 0, global_batch_idx: 31040, batch size: 108, loss[dur_loss=0.2223, prior_loss=0.9795, diff_loss=0.2878, tot_loss=1.49, over 108.00 samples.], tot_loss[dur_loss=0.2223, prior_loss=0.9795, diff_loss=0.2878, tot_loss=1.49, over 108.00 samples.],
2024-10-21 10:19:34,926 INFO [train.py:561] (1/4) Epoch 1941, batch 10, global_batch_idx: 31050, batch size: 111, loss[dur_loss=0.2194, prior_loss=0.9796, diff_loss=0.3002, tot_loss=1.499, over 111.00 samples.], tot_loss[dur_loss=0.215, prior_loss=0.9781, diff_loss=0.3664, tot_loss=1.559, over 1656.00 samples.],
2024-10-21 10:19:42,065 INFO [train.py:682] (1/4) Start epoch 1942
2024-10-21 10:19:56,314 INFO [train.py:561] (1/4) Epoch 1942, batch 4, global_batch_idx: 31060, batch size: 189, loss[dur_loss=0.2172, prior_loss=0.9786, diff_loss=0.3488, tot_loss=1.545, over 189.00 samples.], tot_loss[dur_loss=0.2129, prior_loss=0.9776, diff_loss=0.4443, tot_loss=1.635, over 937.00 samples.],
2024-10-21 10:20:11,308 INFO [train.py:561] (1/4) Epoch 1942, batch 14, global_batch_idx: 31070, batch size: 142, loss[dur_loss=0.2169, prior_loss=0.9782, diff_loss=0.2953, tot_loss=1.491, over 142.00 samples.], tot_loss[dur_loss=0.2165, prior_loss=0.9782, diff_loss=0.3668, tot_loss=1.561, over 2210.00 samples.],
2024-10-21 10:20:12,771 INFO [train.py:682] (1/4) Start epoch 1943
2024-10-21 10:20:32,772 INFO [train.py:561] (1/4) Epoch 1943, batch 8, global_batch_idx: 31080, batch size: 170, loss[dur_loss=0.2237, prior_loss=0.9785, diff_loss=0.3149, tot_loss=1.517, over 170.00 samples.], tot_loss[dur_loss=0.2169, prior_loss=0.9779, diff_loss=0.3912, tot_loss=1.586, over 1432.00 samples.],
2024-10-21 10:20:42,948 INFO [train.py:682] (1/4) Start epoch 1944
2024-10-21 10:20:54,158 INFO [train.py:561] (1/4) Epoch 1944, batch 2, global_batch_idx: 31090, batch size: 203, loss[dur_loss=0.2184, prior_loss=0.9787, diff_loss=0.3648, tot_loss=1.562, over 203.00 samples.], tot_loss[dur_loss=0.2208, prior_loss=0.9789, diff_loss=0.3242, tot_loss=1.524, over 442.00 samples.],
2024-10-21 10:21:08,380 INFO [train.py:561] (1/4) Epoch 1944, batch 12, global_batch_idx: 31100, batch size: 152, loss[dur_loss=0.2162, prior_loss=0.9785, diff_loss=0.315, tot_loss=1.51, over 152.00 samples.], tot_loss[dur_loss=0.2165, prior_loss=0.9783, diff_loss=0.377, tot_loss=1.572, over 1966.00 samples.],
2024-10-21 10:21:12,964 INFO [train.py:682] (1/4) Start epoch 1945
2024-10-21 10:21:30,251 INFO [train.py:561] (1/4) Epoch 1945, batch 6, global_batch_idx: 31110, batch size: 106, loss[dur_loss=0.2241, prior_loss=0.9786, diff_loss=0.3181, tot_loss=1.521, over 106.00 samples.], tot_loss[dur_loss=0.214, prior_loss=0.9778, diff_loss=0.4124, tot_loss=1.604, over 1142.00 samples.],
2024-10-21 10:21:43,392 INFO [train.py:682] (1/4) Start epoch 1946
2024-10-21 10:21:52,080 INFO [train.py:561] (1/4) Epoch 1946, batch 0, global_batch_idx: 31120, batch size: 108, loss[dur_loss=0.2238, prior_loss=0.9794, diff_loss=0.2873, tot_loss=1.491, over 108.00 samples.], tot_loss[dur_loss=0.2238, prior_loss=0.9794, diff_loss=0.2873, tot_loss=1.491, over 108.00 samples.],
2024-10-21 10:22:06,371 INFO [train.py:561] (1/4) Epoch 1946, batch 10, global_batch_idx: 31130, batch size: 111, loss[dur_loss=0.2199, prior_loss=0.9797, diff_loss=0.3083, tot_loss=1.508, over 111.00 samples.], tot_loss[dur_loss=0.2164, prior_loss=0.9781, diff_loss=0.381, tot_loss=1.575, over 1656.00 samples.],
2024-10-21 10:22:13,523 INFO [train.py:682] (1/4) Start epoch 1947
2024-10-21 10:22:27,100 INFO [train.py:561] (1/4) Epoch 1947, batch 4, global_batch_idx: 31140, batch size: 189, loss[dur_loss=0.2147, prior_loss=0.9786, diff_loss=0.341, tot_loss=1.534, over 189.00 samples.], tot_loss[dur_loss=0.2146, prior_loss=0.9777, diff_loss=0.4354, tot_loss=1.628, over 937.00 samples.],
2024-10-21 10:22:41,982 INFO [train.py:561] (1/4) Epoch 1947, batch 14, global_batch_idx: 31150, batch size: 142, loss[dur_loss=0.2184, prior_loss=0.9783, diff_loss=0.3064, tot_loss=1.503, over 142.00 samples.], tot_loss[dur_loss=0.2166, prior_loss=0.9782, diff_loss=0.3614, tot_loss=1.556, over 2210.00 samples.],
2024-10-21 10:22:43,417 INFO [train.py:682] (1/4) Start epoch 1948
2024-10-21 10:23:03,478 INFO [train.py:561] (1/4) Epoch 1948, batch 8, global_batch_idx: 31160, batch size: 170, loss[dur_loss=0.2203, prior_loss=0.9783, diff_loss=0.2964, tot_loss=1.495, over 170.00 samples.], tot_loss[dur_loss=0.2155, prior_loss=0.9779, diff_loss=0.3928, tot_loss=1.586, over 1432.00 samples.],
2024-10-21 10:23:13,702 INFO [train.py:682] (1/4) Start epoch 1949
2024-10-21 10:23:25,027 INFO [train.py:561] (1/4) Epoch 1949, batch 2, global_batch_idx: 31170, batch size: 203, loss[dur_loss=0.2187, prior_loss=0.9784, diff_loss=0.3494, tot_loss=1.547, over 203.00 samples.], tot_loss[dur_loss=0.2189, prior_loss=0.9786, diff_loss=0.3417, tot_loss=1.539, over 442.00 samples.],
2024-10-21 10:23:39,208 INFO [train.py:561] (1/4) Epoch 1949, batch 12, global_batch_idx: 31180, batch size: 152, loss[dur_loss=0.2179, prior_loss=0.9788, diff_loss=0.3189, tot_loss=1.516, over 152.00 samples.], tot_loss[dur_loss=0.217, prior_loss=0.9782, diff_loss=0.3744, tot_loss=1.57, over 1966.00 samples.],
2024-10-21 10:23:43,819 INFO [train.py:682] (1/4) Start epoch 1950
2024-10-21 10:24:01,087 INFO [train.py:561] (1/4) Epoch 1950, batch 6, global_batch_idx: 31190, batch size: 106, loss[dur_loss=0.2224, prior_loss=0.9787, diff_loss=0.3052, tot_loss=1.506, over 106.00 samples.], tot_loss[dur_loss=0.2139, prior_loss=0.9778, diff_loss=0.4112, tot_loss=1.603, over 1142.00 samples.],
2024-10-21 10:24:14,193 INFO [train.py:682] (1/4) Start epoch 1951
2024-10-21 10:24:22,768 INFO [train.py:561] (1/4) Epoch 1951, batch 0, global_batch_idx: 31200, batch size: 108, loss[dur_loss=0.2236, prior_loss=0.9795, diff_loss=0.3449, tot_loss=1.548, over 108.00 samples.], tot_loss[dur_loss=0.2236, prior_loss=0.9795, diff_loss=0.3449, tot_loss=1.548, over 108.00 samples.],
2024-10-21 10:24:37,017 INFO [train.py:561] (1/4) Epoch 1951, batch 10, global_batch_idx: 31210, batch size: 111, loss[dur_loss=0.2201, prior_loss=0.9793, diff_loss=0.2947, tot_loss=1.494, over 111.00 samples.], tot_loss[dur_loss=0.2159, prior_loss=0.9781, diff_loss=0.3913, tot_loss=1.585, over 1656.00 samples.],
2024-10-21 10:24:44,149 INFO [train.py:682] (1/4) Start epoch 1952
2024-10-21 10:24:58,268 INFO [train.py:561] (1/4) Epoch 1952, batch 4, global_batch_idx: 31220, batch size: 189, loss[dur_loss=0.2164, prior_loss=0.9785, diff_loss=0.328, tot_loss=1.523, over 189.00 samples.], tot_loss[dur_loss=0.2129, prior_loss=0.9775, diff_loss=0.4385, tot_loss=1.629, over 937.00 samples.],
2024-10-21 10:25:13,174 INFO [train.py:561] (1/4) Epoch 1952, batch 14, global_batch_idx: 31230, batch size: 142, loss[dur_loss=0.2167, prior_loss=0.9783, diff_loss=0.3153, tot_loss=1.51, over 142.00 samples.], tot_loss[dur_loss=0.2162, prior_loss=0.9781, diff_loss=0.3725, tot_loss=1.567, over 2210.00 samples.],
2024-10-21 10:25:14,618 INFO [train.py:682] (1/4) Start epoch 1953
2024-10-21 10:25:34,904 INFO [train.py:561] (1/4) Epoch 1953, batch 8, global_batch_idx: 31240, batch size: 170, loss[dur_loss=0.2215, prior_loss=0.9784, diff_loss=0.3341, tot_loss=1.534, over 170.00 samples.], tot_loss[dur_loss=0.2151, prior_loss=0.9778, diff_loss=0.3919, tot_loss=1.585, over 1432.00 samples.],
2024-10-21 10:25:45,059 INFO [train.py:682] (1/4) Start epoch 1954
2024-10-21 10:25:56,395 INFO [train.py:561] (1/4) Epoch 1954, batch 2, global_batch_idx: 31250, batch size: 203, loss[dur_loss=0.2196, prior_loss=0.9785, diff_loss=0.3377, tot_loss=1.536, over 203.00 samples.], tot_loss[dur_loss=0.2207, prior_loss=0.9787, diff_loss=0.3359, tot_loss=1.535, over 442.00 samples.],
2024-10-21 10:26:10,712 INFO [train.py:561] (1/4) Epoch 1954, batch 12, global_batch_idx: 31260, batch size: 152, loss[dur_loss=0.2188, prior_loss=0.9788, diff_loss=0.3388, tot_loss=1.536, over 152.00 samples.], tot_loss[dur_loss=0.2155, prior_loss=0.9782, diff_loss=0.3687, tot_loss=1.562, over 1966.00 samples.],
2024-10-21 10:26:15,188 INFO [train.py:682] (1/4) Start epoch 1955
2024-10-21 10:26:32,368 INFO [train.py:561] (1/4) Epoch 1955, batch 6, global_batch_idx: 31270, batch size: 106, loss[dur_loss=0.2192, prior_loss=0.9785, diff_loss=0.3167, tot_loss=1.514, over 106.00 samples.], tot_loss[dur_loss=0.2143, prior_loss=0.9778, diff_loss=0.4234, tot_loss=1.616, over 1142.00 samples.],
2024-10-21 10:26:45,486 INFO [train.py:682] (1/4) Start epoch 1956
2024-10-21 10:26:54,174 INFO [train.py:561] (1/4) Epoch 1956, batch 0, global_batch_idx: 31280, batch size: 108, loss[dur_loss=0.2283, prior_loss=0.9792, diff_loss=0.3099, tot_loss=1.517, over 108.00 samples.], tot_loss[dur_loss=0.2283, prior_loss=0.9792, diff_loss=0.3099, tot_loss=1.517, over 108.00 samples.],
2024-10-21 10:27:08,406 INFO [train.py:561] (1/4) Epoch 1956, batch 10, global_batch_idx: 31290, batch size: 111, loss[dur_loss=0.2188, prior_loss=0.9795, diff_loss=0.3215, tot_loss=1.52, over 111.00 samples.], tot_loss[dur_loss=0.2161, prior_loss=0.978, diff_loss=0.381, tot_loss=1.575, over 1656.00 samples.],
2024-10-21 10:27:15,490 INFO [train.py:682] (1/4) Start epoch 1957
2024-10-21 10:27:29,220 INFO [train.py:561] (1/4) Epoch 1957, batch 4, global_batch_idx: 31300, batch size: 189, loss[dur_loss=0.2172, prior_loss=0.9784, diff_loss=0.3639, tot_loss=1.56, over 189.00 samples.], tot_loss[dur_loss=0.2146, prior_loss=0.9775, diff_loss=0.4435, tot_loss=1.636, over 937.00 samples.],
2024-10-21 10:27:44,096 INFO [train.py:561] (1/4) Epoch 1957, batch 14, global_batch_idx: 31310, batch size: 142, loss[dur_loss=0.2178, prior_loss=0.9783, diff_loss=0.3129, tot_loss=1.509, over 142.00 samples.], tot_loss[dur_loss=0.2165, prior_loss=0.9781, diff_loss=0.3741, tot_loss=1.569, over 2210.00 samples.],
2024-10-21 10:27:45,531 INFO [train.py:682] (1/4) Start epoch 1958
2024-10-21 10:28:05,366 INFO [train.py:561] (1/4) Epoch 1958, batch 8, global_batch_idx: 31320, batch size: 170, loss[dur_loss=0.2215, prior_loss=0.9785, diff_loss=0.3351, tot_loss=1.535, over 170.00 samples.], tot_loss[dur_loss=0.2153, prior_loss=0.9779, diff_loss=0.3962, tot_loss=1.589, over 1432.00 samples.],
2024-10-21 10:28:15,532 INFO [train.py:682] (1/4) Start epoch 1959
2024-10-21 10:28:27,021 INFO [train.py:561] (1/4) Epoch 1959, batch 2, global_batch_idx: 31330, batch size: 203, loss[dur_loss=0.2196, prior_loss=0.9783, diff_loss=0.3923, tot_loss=1.59, over 203.00 samples.], tot_loss[dur_loss=0.2211, prior_loss=0.9785, diff_loss=0.3518, tot_loss=1.551, over 442.00 samples.],
2024-10-21 10:28:41,290 INFO [train.py:561] (1/4) Epoch 1959, batch 12, global_batch_idx: 31340, batch size: 152, loss[dur_loss=0.216, prior_loss=0.9787, diff_loss=0.3281, tot_loss=1.523, over 152.00 samples.], tot_loss[dur_loss=0.2168, prior_loss=0.9781, diff_loss=0.3805, tot_loss=1.575, over 1966.00 samples.],
2024-10-21 10:28:45,764 INFO [train.py:682] (1/4) Start epoch 1960
2024-10-21 10:29:02,956 INFO [train.py:561] (1/4) Epoch 1960, batch 6, global_batch_idx: 31350, batch size: 106, loss[dur_loss=0.2188, prior_loss=0.9782, diff_loss=0.2938, tot_loss=1.491, over 106.00 samples.], tot_loss[dur_loss=0.2142, prior_loss=0.9777, diff_loss=0.4092, tot_loss=1.601, over 1142.00 samples.],
2024-10-21 10:29:16,044 INFO [train.py:682] (1/4) Start epoch 1961
2024-10-21 10:29:24,672 INFO [train.py:561] (1/4) Epoch 1961, batch 0, global_batch_idx: 31360, batch size: 108, loss[dur_loss=0.2207, prior_loss=0.9792, diff_loss=0.3089, tot_loss=1.509, over 108.00 samples.], tot_loss[dur_loss=0.2207, prior_loss=0.9792, diff_loss=0.3089, tot_loss=1.509, over 108.00 samples.],
2024-10-21 10:29:38,980 INFO [train.py:561] (1/4) Epoch 1961, batch 10, global_batch_idx: 31370, batch size: 111, loss[dur_loss=0.2238, prior_loss=0.9794, diff_loss=0.3422, tot_loss=1.545, over 111.00 samples.], tot_loss[dur_loss=0.215, prior_loss=0.978, diff_loss=0.3821, tot_loss=1.575, over 1656.00 samples.],
2024-10-21 10:29:46,121 INFO [train.py:682] (1/4) Start epoch 1962
2024-10-21 10:29:59,755 INFO [train.py:561] (1/4) Epoch 1962, batch 4, global_batch_idx: 31380, batch size: 189, loss[dur_loss=0.215, prior_loss=0.9782, diff_loss=0.3525, tot_loss=1.546, over 189.00 samples.], tot_loss[dur_loss=0.211, prior_loss=0.9774, diff_loss=0.4336, tot_loss=1.622, over 937.00 samples.],
2024-10-21 10:30:14,584 INFO [train.py:561] (1/4) Epoch 1962, batch 14, global_batch_idx: 31390, batch size: 142, loss[dur_loss=0.2168, prior_loss=0.978, diff_loss=0.3174, tot_loss=1.512, over 142.00 samples.], tot_loss[dur_loss=0.2156, prior_loss=0.978, diff_loss=0.3699, tot_loss=1.564, over 2210.00 samples.],
2024-10-21 10:30:16,015 INFO [train.py:682] (1/4) Start epoch 1963
2024-10-21 10:30:36,002 INFO [train.py:561] (1/4) Epoch 1963, batch 8, global_batch_idx: 31400, batch size: 170, loss[dur_loss=0.2216, prior_loss=0.9783, diff_loss=0.3379, tot_loss=1.538, over 170.00 samples.], tot_loss[dur_loss=0.2154, prior_loss=0.9777, diff_loss=0.402, tot_loss=1.595, over 1432.00 samples.],
2024-10-21 10:30:46,185 INFO [train.py:682] (1/4) Start epoch 1964
2024-10-21 10:30:57,753 INFO [train.py:561] (1/4) Epoch 1964, batch 2, global_batch_idx: 31410, batch size: 203, loss[dur_loss=0.2166, prior_loss=0.9783, diff_loss=0.3627, tot_loss=1.558, over 203.00 samples.], tot_loss[dur_loss=0.2185, prior_loss=0.9785, diff_loss=0.3349, tot_loss=1.532, over 442.00 samples.],
2024-10-21 10:31:11,988 INFO [train.py:561] (1/4) Epoch 1964, batch 12, global_batch_idx: 31420, batch size: 152, loss[dur_loss=0.2202, prior_loss=0.9785, diff_loss=0.3253, tot_loss=1.524, over 152.00 samples.], tot_loss[dur_loss=0.2153, prior_loss=0.9779, diff_loss=0.3793, tot_loss=1.573, over 1966.00 samples.],
2024-10-21 10:31:16,420 INFO [train.py:682] (1/4) Start epoch 1965
2024-10-21 10:31:33,427 INFO [train.py:561] (1/4) Epoch 1965, batch 6, global_batch_idx: 31430, batch size: 106, loss[dur_loss=0.2174, prior_loss=0.9783, diff_loss=0.3109, tot_loss=1.507, over 106.00 samples.], tot_loss[dur_loss=0.2143, prior_loss=0.9777, diff_loss=0.405, tot_loss=1.597, over 1142.00 samples.],
2024-10-21 10:31:46,495 INFO [train.py:682] (1/4) Start epoch 1966
2024-10-21 10:31:55,088 INFO [train.py:561] (1/4) Epoch 1966, batch 0, global_batch_idx: 31440, batch size: 108, loss[dur_loss=0.2216, prior_loss=0.979, diff_loss=0.286, tot_loss=1.487, over 108.00 samples.], tot_loss[dur_loss=0.2216, prior_loss=0.979, diff_loss=0.286, tot_loss=1.487, over 108.00 samples.],
2024-10-21 10:32:09,285 INFO [train.py:561] (1/4) Epoch 1966, batch 10, global_batch_idx: 31450, batch size: 111, loss[dur_loss=0.2184, prior_loss=0.9791, diff_loss=0.2767, tot_loss=1.474, over 111.00 samples.], tot_loss[dur_loss=0.2169, prior_loss=0.9779, diff_loss=0.3766, tot_loss=1.571, over 1656.00 samples.],
2024-10-21 10:32:16,413 INFO [train.py:682] (1/4) Start epoch 1967
2024-10-21 10:32:30,003 INFO [train.py:561] (1/4) Epoch 1967, batch 4, global_batch_idx: 31460, batch size: 189, loss[dur_loss=0.2152, prior_loss=0.9785, diff_loss=0.3342, tot_loss=1.528, over 189.00 samples.], tot_loss[dur_loss=0.2118, prior_loss=0.9775, diff_loss=0.4355, tot_loss=1.625, over 937.00 samples.],
2024-10-21 10:32:44,905 INFO [train.py:561] (1/4) Epoch 1967, batch 14, global_batch_idx: 31470, batch size: 142, loss[dur_loss=0.2205, prior_loss=0.9781, diff_loss=0.3193, tot_loss=1.518, over 142.00 samples.], tot_loss[dur_loss=0.2159, prior_loss=0.9781, diff_loss=0.365, tot_loss=1.559, over 2210.00 samples.],
2024-10-21 10:32:46,341 INFO [train.py:682] (1/4) Start epoch 1968
2024-10-21 10:33:06,271 INFO [train.py:561] (1/4) Epoch 1968, batch 8, global_batch_idx: 31480, batch size: 170, loss[dur_loss=0.2232, prior_loss=0.9786, diff_loss=0.3166, tot_loss=1.518, over 170.00 samples.], tot_loss[dur_loss=0.215, prior_loss=0.9779, diff_loss=0.3978, tot_loss=1.591, over 1432.00 samples.],
2024-10-21 10:33:16,386 INFO [train.py:682] (1/4) Start epoch 1969
2024-10-21 10:33:27,693 INFO [train.py:561] (1/4) Epoch 1969, batch 2, global_batch_idx: 31490, batch size: 203, loss[dur_loss=0.2158, prior_loss=0.978, diff_loss=0.3494, tot_loss=1.543, over 203.00 samples.], tot_loss[dur_loss=0.2173, prior_loss=0.9783, diff_loss=0.3232, tot_loss=1.519, over 442.00 samples.],
2024-10-21 10:33:41,939 INFO [train.py:561] (1/4) Epoch 1969, batch 12, global_batch_idx: 31500, batch size: 152, loss[dur_loss=0.2182, prior_loss=0.9784, diff_loss=0.3432, tot_loss=1.54, over 152.00 samples.], tot_loss[dur_loss=0.2164, prior_loss=0.9779, diff_loss=0.3686, tot_loss=1.563, over 1966.00 samples.],
2024-10-21 10:33:43,573 INFO [train.py:579] (1/4) Computing validation loss
2024-10-21 10:34:22,395 INFO [train.py:589] (1/4) Epoch 1969, validation: dur_loss=0.4576, prior_loss=1.035, diff_loss=0.4056, tot_loss=1.898, over 100.00 samples.
2024-10-21 10:34:22,396 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB
2024-10-21 10:34:25,248 INFO [train.py:682] (1/4) Start epoch 1970
2024-10-21 10:34:42,347 INFO [train.py:561] (1/4) Epoch 1970, batch 6, global_batch_idx: 31510, batch size: 106, loss[dur_loss=0.2141, prior_loss=0.9781, diff_loss=0.3093, tot_loss=1.501, over 106.00 samples.], tot_loss[dur_loss=0.2142, prior_loss=0.9776, diff_loss=0.4181, tot_loss=1.61, over 1142.00 samples.],
2024-10-21 10:34:55,468 INFO [train.py:682] (1/4) Start epoch 1971
2024-10-21 10:35:04,149 INFO [train.py:561] (1/4) Epoch 1971, batch 0, global_batch_idx: 31520, batch size: 108, loss[dur_loss=0.2217, prior_loss=0.9789, diff_loss=0.3072, tot_loss=1.508, over 108.00 samples.], tot_loss[dur_loss=0.2217, prior_loss=0.9789, diff_loss=0.3072, tot_loss=1.508, over 108.00 samples.],
2024-10-21 10:35:18,435 INFO [train.py:561] (1/4) Epoch 1971, batch 10, global_batch_idx: 31530, batch size: 111, loss[dur_loss=0.2203, prior_loss=0.9792, diff_loss=0.2784, tot_loss=1.478, over 111.00 samples.], tot_loss[dur_loss=0.2151, prior_loss=0.9777, diff_loss=0.3861, tot_loss=1.579, over 1656.00 samples.],
2024-10-21 10:35:25,547 INFO [train.py:682] (1/4) Start epoch 1972
2024-10-21 10:35:39,589 INFO [train.py:561] (1/4) Epoch 1972, batch 4, global_batch_idx: 31540, batch size: 189, loss[dur_loss=0.2113, prior_loss=0.9779, diff_loss=0.3217, tot_loss=1.511, over 189.00 samples.], tot_loss[dur_loss=0.211, prior_loss=0.9772, diff_loss=0.4355, tot_loss=1.624, over 937.00 samples.],
2024-10-21 10:35:54,429 INFO [train.py:561] (1/4) Epoch 1972, batch 14, global_batch_idx: 31550, batch size: 142, loss[dur_loss=0.2171, prior_loss=0.9781, diff_loss=0.317, tot_loss=1.512, over 142.00 samples.], tot_loss[dur_loss=0.2151, prior_loss=0.9778, diff_loss=0.3605, tot_loss=1.553, over 2210.00 samples.],
2024-10-21 10:35:55,852 INFO [train.py:682] (1/4) Start epoch 1973
2024-10-21 10:36:15,711 INFO [train.py:561] (1/4) Epoch 1973, batch 8, global_batch_idx: 31560, batch size: 170, loss[dur_loss=0.2205, prior_loss=0.9781, diff_loss=0.3232, tot_loss=1.522, over 170.00 samples.], tot_loss[dur_loss=0.2142, prior_loss=0.9776, diff_loss=0.3815, tot_loss=1.573, over 1432.00 samples.],
2024-10-21 10:36:25,798 INFO [train.py:682] (1/4) Start epoch 1974
2024-10-21 10:36:37,348 INFO [train.py:561] (1/4) Epoch 1974, batch 2, global_batch_idx: 31570, batch size: 203, loss[dur_loss=0.2182, prior_loss=0.9783, diff_loss=0.3519, tot_loss=1.548, over 203.00 samples.], tot_loss[dur_loss=0.2175, prior_loss=0.9783, diff_loss=0.3244, tot_loss=1.52, over 442.00 samples.],
2024-10-21 10:36:51,553 INFO [train.py:561] (1/4) Epoch 1974, batch 12, global_batch_idx: 31580, batch size: 152, loss[dur_loss=0.2174, prior_loss=0.9783, diff_loss=0.3412, tot_loss=1.537, over 152.00 samples.], tot_loss[dur_loss=0.2149, prior_loss=0.9778, diff_loss=0.3711, tot_loss=1.564, over 1966.00 samples.],
2024-10-21 10:36:55,978 INFO [train.py:682] (1/4) Start epoch 1975
2024-10-21 10:37:12,963 INFO [train.py:561] (1/4) Epoch 1975, batch 6, global_batch_idx: 31590, batch size: 106, loss[dur_loss=0.2133, prior_loss=0.9781, diff_loss=0.3033, tot_loss=1.495, over 106.00 samples.], tot_loss[dur_loss=0.2122, prior_loss=0.9774, diff_loss=0.4066, tot_loss=1.596, over 1142.00 samples.],
2024-10-21 10:37:26,019 INFO [train.py:682] (1/4) Start epoch 1976
2024-10-21 10:37:34,879 INFO [train.py:561] (1/4) Epoch 1976, batch 0, global_batch_idx: 31600, batch size: 108, loss[dur_loss=0.2198, prior_loss=0.9789, diff_loss=0.3514, tot_loss=1.55, over 108.00 samples.], tot_loss[dur_loss=0.2198, prior_loss=0.9789, diff_loss=0.3514, tot_loss=1.55, over 108.00 samples.],
2024-10-21 10:37:49,152 INFO [train.py:561] (1/4) Epoch 1976, batch 10, global_batch_idx: 31610, batch size: 111, loss[dur_loss=0.217, prior_loss=0.9791, diff_loss=0.3175, tot_loss=1.514, over 111.00 samples.], tot_loss[dur_loss=0.2136, prior_loss=0.9777, diff_loss=0.3826, tot_loss=1.574, over 1656.00 samples.],
2024-10-21 10:37:56,265 INFO [train.py:682] (1/4) Start epoch 1977
2024-10-21 10:38:09,799 INFO [train.py:561] (1/4) Epoch 1977, batch 4, global_batch_idx: 31620, batch size: 189, loss[dur_loss=0.2139, prior_loss=0.978, diff_loss=0.3317, tot_loss=1.524, over 189.00 samples.], tot_loss[dur_loss=0.2121, prior_loss=0.9773, diff_loss=0.4197, tot_loss=1.609, over 937.00 samples.],
2024-10-21 10:38:24,658 INFO [train.py:561] (1/4) Epoch 1977, batch 14, global_batch_idx: 31630, batch size: 142, loss[dur_loss=0.217, prior_loss=0.9778, diff_loss=0.2848, tot_loss=1.48, over 142.00 samples.], tot_loss[dur_loss=0.2149, prior_loss=0.9778, diff_loss=0.3539, tot_loss=1.547, over 2210.00 samples.],
2024-10-21 10:38:26,088 INFO [train.py:682] (1/4) Start epoch 1978
2024-10-21 10:38:46,180 INFO [train.py:561] (1/4) Epoch 1978, batch 8, global_batch_idx: 31640, batch size: 170, loss[dur_loss=0.2195, prior_loss=0.9781, diff_loss=0.3196, tot_loss=1.517, over 170.00 samples.], tot_loss[dur_loss=0.2136, prior_loss=0.9775, diff_loss=0.3959, tot_loss=1.587, over 1432.00 samples.],
2024-10-21 10:38:56,289 INFO [train.py:682] (1/4) Start epoch 1979
2024-10-21 10:39:07,650 INFO [train.py:561] (1/4) Epoch 1979, batch 2, global_batch_idx: 31650, batch size: 203, loss[dur_loss=0.2177, prior_loss=0.9781, diff_loss=0.3333, tot_loss=1.529, over 203.00 samples.], tot_loss[dur_loss=0.2185, prior_loss=0.9783, diff_loss=0.3294, tot_loss=1.526, over 442.00 samples.],
2024-10-21 10:39:21,903 INFO [train.py:561] (1/4) Epoch 1979, batch 12, global_batch_idx: 31660, batch size: 152, loss[dur_loss=0.217, prior_loss=0.9781, diff_loss=0.3241, tot_loss=1.519, over 152.00 samples.], tot_loss[dur_loss=0.2146, prior_loss=0.9776, diff_loss=0.3764, tot_loss=1.569, over 1966.00 samples.],
2024-10-21 10:39:26,384 INFO [train.py:682] (1/4) Start epoch 1980
2024-10-21 10:39:43,535 INFO [train.py:561] (1/4) Epoch 1980, batch 6, global_batch_idx: 31670, batch size: 106, loss[dur_loss=0.2164, prior_loss=0.978, diff_loss=0.323, tot_loss=1.517, over 106.00 samples.], tot_loss[dur_loss=0.2118, prior_loss=0.9773, diff_loss=0.4126, tot_loss=1.602, over 1142.00 samples.],
2024-10-21 10:39:56,601 INFO [train.py:682] (1/4) Start epoch 1981
2024-10-21 10:40:05,235 INFO [train.py:561] (1/4) Epoch 1981, batch 0, global_batch_idx: 31680, batch size: 108, loss[dur_loss=0.2243, prior_loss=0.979, diff_loss=0.3204, tot_loss=1.524, over 108.00 samples.], tot_loss[dur_loss=0.2243, prior_loss=0.979, diff_loss=0.3204, tot_loss=1.524, over 108.00 samples.],
2024-10-21 10:40:19,493 INFO [train.py:561] (1/4) Epoch 1981, batch 10, global_batch_idx: 31690, batch size: 111, loss[dur_loss=0.221, prior_loss=0.9789, diff_loss=0.2819, tot_loss=1.482, over 111.00 samples.], tot_loss[dur_loss=0.2144, prior_loss=0.9776, diff_loss=0.373, tot_loss=1.565, over 1656.00 samples.],
2024-10-21 10:40:26,593 INFO [train.py:682] (1/4) Start epoch 1982
2024-10-21 10:40:40,174 INFO [train.py:561] (1/4) Epoch 1982, batch 4, global_batch_idx: 31700, batch size: 189, loss[dur_loss=0.2166, prior_loss=0.9785, diff_loss=0.3402, tot_loss=1.535, over 189.00 samples.], tot_loss[dur_loss=0.2122, prior_loss=0.9773, diff_loss=0.4287, tot_loss=1.618, over 937.00 samples.],
2024-10-21 10:40:54,927 INFO [train.py:561] (1/4) Epoch 1982, batch 14, global_batch_idx: 31710, batch size: 142, loss[dur_loss=0.2191, prior_loss=0.9779, diff_loss=0.2894, tot_loss=1.486, over 142.00 samples.], tot_loss[dur_loss=0.2159, prior_loss=0.9778, diff_loss=0.3618, tot_loss=1.556, over 2210.00 samples.],
2024-10-21 10:40:56,346 INFO [train.py:682] (1/4) Start epoch 1983
2024-10-21 10:41:16,448 INFO [train.py:561] (1/4) Epoch 1983, batch 8, global_batch_idx: 31720, batch size: 170, loss[dur_loss=0.2182, prior_loss=0.9779, diff_loss=0.325, tot_loss=1.521, over 170.00 samples.], tot_loss[dur_loss=0.2139, prior_loss=0.9776, diff_loss=0.3938, tot_loss=1.585, over 1432.00 samples.],
2024-10-21 10:41:26,459 INFO [train.py:682] (1/4) Start epoch 1984
2024-10-21 10:41:37,786 INFO [train.py:561] (1/4) Epoch 1984, batch 2, global_batch_idx: 31730, batch size: 203, loss[dur_loss=0.2222, prior_loss=0.9781, diff_loss=0.3657, tot_loss=1.566, over 203.00 samples.], tot_loss[dur_loss=0.2194, prior_loss=0.9783, diff_loss=0.3348, tot_loss=1.533, over 442.00 samples.],
2024-10-21 10:41:51,843 INFO [train.py:561] (1/4) Epoch 1984, batch 12, global_batch_idx: 31740, batch size: 152, loss[dur_loss=0.2141, prior_loss=0.9782, diff_loss=0.3764, tot_loss=1.569, over 152.00 samples.], tot_loss[dur_loss=0.2139, prior_loss=0.9777, diff_loss=0.3829, tot_loss=1.574, over 1966.00 samples.],
2024-10-21 10:41:56,233 INFO [train.py:682] (1/4) Start epoch 1985
2024-10-21 10:42:12,977 INFO [train.py:561] (1/4) Epoch 1985, batch 6, global_batch_idx: 31750, batch size: 106, loss[dur_loss=0.2144, prior_loss=0.9782, diff_loss=0.35, tot_loss=1.543, over 106.00 samples.], tot_loss[dur_loss=0.2101, prior_loss=0.9772, diff_loss=0.4169, tot_loss=1.604, over 1142.00 samples.],
2024-10-21 10:42:26,008 INFO [train.py:682] (1/4) Start epoch 1986
2024-10-21 10:42:34,704 INFO [train.py:561] (1/4) Epoch 1986, batch 0, global_batch_idx: 31760, batch size: 108, loss[dur_loss=0.2207, prior_loss=0.979, diff_loss=0.2919, tot_loss=1.492, over 108.00 samples.], tot_loss[dur_loss=0.2207, prior_loss=0.979, diff_loss=0.2919, tot_loss=1.492, over 108.00 samples.],
2024-10-21 10:42:48,881 INFO [train.py:561] (1/4) Epoch 1986, batch 10, global_batch_idx: 31770, batch size: 111, loss[dur_loss=0.2215, prior_loss=0.9789, diff_loss=0.3201, tot_loss=1.52, over 111.00 samples.], tot_loss[dur_loss=0.2129, prior_loss=0.9776, diff_loss=0.3759, tot_loss=1.566, over 1656.00 samples.],
2024-10-21 10:42:55,937 INFO [train.py:682] (1/4) Start epoch 1987
2024-10-21 10:43:09,416 INFO [train.py:561] (1/4) Epoch 1987, batch 4, global_batch_idx: 31780, batch size: 189, loss[dur_loss=0.215, prior_loss=0.978, diff_loss=0.3327, tot_loss=1.526, over 189.00 samples.], tot_loss[dur_loss=0.2117, prior_loss=0.9771, diff_loss=0.4234, tot_loss=1.612, over 937.00 samples.],
2024-10-21 10:43:24,259 INFO [train.py:561] (1/4) Epoch 1987, batch 14, global_batch_idx: 31790, batch size: 142, loss[dur_loss=0.2149, prior_loss=0.9779, diff_loss=0.3269, tot_loss=1.52, over 142.00 samples.], tot_loss[dur_loss=0.2144, prior_loss=0.9776, diff_loss=0.3614, tot_loss=1.553, over 2210.00 samples.],
2024-10-21 10:43:25,684 INFO [train.py:682] (1/4) Start epoch 1988
2024-10-21 10:43:45,370 INFO [train.py:561] (1/4) Epoch 1988, batch 8, global_batch_idx: 31800, batch size: 170, loss[dur_loss=0.2177, prior_loss=0.9779, diff_loss=0.2905, tot_loss=1.486, over 170.00 samples.], tot_loss[dur_loss=0.214, prior_loss=0.9774, diff_loss=0.3802, tot_loss=1.572, over 1432.00 samples.],
2024-10-21 10:43:55,385 INFO [train.py:682] (1/4) Start epoch 1989
2024-10-21 10:44:06,969 INFO [train.py:561] (1/4) Epoch 1989, batch 2, global_batch_idx: 31810, batch size: 203, loss[dur_loss=0.2159, prior_loss=0.978, diff_loss=0.3423, tot_loss=1.536, over 203.00 samples.], tot_loss[dur_loss=0.2193, prior_loss=0.9782, diff_loss=0.3128, tot_loss=1.51, over 442.00 samples.],
2024-10-21 10:44:21,153 INFO [train.py:561] (1/4) Epoch 1989, batch 12, global_batch_idx: 31820, batch size: 152, loss[dur_loss=0.2175, prior_loss=0.978, diff_loss=0.2778, tot_loss=1.473, over 152.00 samples.], tot_loss[dur_loss=0.2152, prior_loss=0.9776, diff_loss=0.3609, tot_loss=1.554, over 1966.00 samples.],
2024-10-21 10:44:25,599 INFO [train.py:682] (1/4) Start epoch 1990
2024-10-21 10:44:42,555 INFO [train.py:561] (1/4) Epoch 1990, batch 6, global_batch_idx: 31830, batch size: 106, loss[dur_loss=0.2153, prior_loss=0.978, diff_loss=0.3071, tot_loss=1.5, over 106.00 samples.], tot_loss[dur_loss=0.2114, prior_loss=0.9773, diff_loss=0.4058, tot_loss=1.594, over 1142.00 samples.],
2024-10-21 10:44:55,516 INFO [train.py:682] (1/4) Start epoch 1991
2024-10-21 10:45:04,499 INFO [train.py:561] (1/4) Epoch 1991, batch 0, global_batch_idx: 31840, batch size: 108, loss[dur_loss=0.2227, prior_loss=0.9787, diff_loss=0.3071, tot_loss=1.509, over 108.00 samples.], tot_loss[dur_loss=0.2227, prior_loss=0.9787, diff_loss=0.3071, tot_loss=1.509, over 108.00 samples.],
2024-10-21 10:45:18,763 INFO [train.py:561] (1/4) Epoch 1991, batch 10, global_batch_idx: 31850, batch size: 111, loss[dur_loss=0.2184, prior_loss=0.9789, diff_loss=0.3179, tot_loss=1.515, over 111.00 samples.], tot_loss[dur_loss=0.2131, prior_loss=0.9775, diff_loss=0.3808, tot_loss=1.571, over 1656.00 samples.],
2024-10-21 10:45:25,815 INFO [train.py:682] (1/4) Start epoch 1992
2024-10-21 10:45:39,588 INFO [train.py:561] (1/4) Epoch 1992, batch 4, global_batch_idx: 31860, batch size: 189, loss[dur_loss=0.215, prior_loss=0.9778, diff_loss=0.3284, tot_loss=1.521, over 189.00 samples.], tot_loss[dur_loss=0.2108, prior_loss=0.977, diff_loss=0.428, tot_loss=1.616, over 937.00 samples.],
2024-10-21 10:45:54,468 INFO [train.py:561] (1/4) Epoch 1992, batch 14, global_batch_idx: 31870, batch size: 142, loss[dur_loss=0.2204, prior_loss=0.9777, diff_loss=0.3259, tot_loss=1.524, over 142.00 samples.], tot_loss[dur_loss=0.2138, prior_loss=0.9776, diff_loss=0.3621, tot_loss=1.553, over 2210.00 samples.],
2024-10-21 10:45:55,892 INFO [train.py:682] (1/4) Start epoch 1993
2024-10-21 10:46:15,655 INFO [train.py:561] (1/4) Epoch 1993, batch 8, global_batch_idx: 31880, batch size: 170, loss[dur_loss=0.216, prior_loss=0.9781, diff_loss=0.3402, tot_loss=1.534, over 170.00 samples.], tot_loss[dur_loss=0.2133, prior_loss=0.9775, diff_loss=0.3961, tot_loss=1.587, over 1432.00 samples.],
2024-10-21 10:46:25,730 INFO [train.py:682] (1/4) Start epoch 1994
2024-10-21 10:46:37,173 INFO [train.py:561] (1/4) Epoch 1994, batch 2, global_batch_idx: 31890, batch size: 203, loss[dur_loss=0.2164, prior_loss=0.978, diff_loss=0.3465, tot_loss=1.541, over 203.00 samples.], tot_loss[dur_loss=0.2158, prior_loss=0.9781, diff_loss=0.3293, tot_loss=1.523, over 442.00 samples.],
2024-10-21 10:46:51,328 INFO [train.py:561] (1/4) Epoch 1994, batch 12, global_batch_idx: 31900, batch size: 152, loss[dur_loss=0.2153, prior_loss=0.9782, diff_loss=0.324, tot_loss=1.518, over 152.00 samples.], tot_loss[dur_loss=0.2135, prior_loss=0.9777, diff_loss=0.3769, tot_loss=1.568, over 1966.00 samples.],
2024-10-21 10:46:55,725 INFO [train.py:682] (1/4) Start epoch 1995
2024-10-21 10:47:12,938 INFO [train.py:561] (1/4) Epoch 1995, batch 6, global_batch_idx: 31910, batch size: 106, loss[dur_loss=0.2195, prior_loss=0.9781, diff_loss=0.296, tot_loss=1.494, over 106.00 samples.], tot_loss[dur_loss=0.2111, prior_loss=0.9773, diff_loss=0.4178, tot_loss=1.606, over 1142.00 samples.],
2024-10-21 10:47:26,002 INFO [train.py:682] (1/4) Start epoch 1996
2024-10-21 10:47:34,614 INFO [train.py:561] (1/4) Epoch 1996, batch 0, global_batch_idx: 31920, batch size: 108, loss[dur_loss=0.2206, prior_loss=0.9789, diff_loss=0.3259, tot_loss=1.525, over 108.00 samples.], tot_loss[dur_loss=0.2206, prior_loss=0.9789, diff_loss=0.3259, tot_loss=1.525, over 108.00 samples.],
2024-10-21 10:47:48,827 INFO [train.py:561] (1/4) Epoch 1996, batch 10, global_batch_idx: 31930, batch size: 111, loss[dur_loss=0.217, prior_loss=0.9788, diff_loss=0.3057, tot_loss=1.502, over 111.00 samples.], tot_loss[dur_loss=0.2123, prior_loss=0.9775, diff_loss=0.3846, tot_loss=1.574, over 1656.00 samples.],
2024-10-21 10:47:55,847 INFO [train.py:682] (1/4) Start epoch 1997
2024-10-21 10:48:09,558 INFO [train.py:561] (1/4) Epoch 1997, batch 4, global_batch_idx: 31940, batch size: 189, loss[dur_loss=0.2143, prior_loss=0.978, diff_loss=0.3558, tot_loss=1.548, over 189.00 samples.], tot_loss[dur_loss=0.21, prior_loss=0.977, diff_loss=0.4372, tot_loss=1.624, over 937.00 samples.],
2024-10-21 10:48:24,322 INFO [train.py:561] (1/4) Epoch 1997, batch 14, global_batch_idx: 31950, batch size: 142, loss[dur_loss=0.2153, prior_loss=0.9776, diff_loss=0.3286, tot_loss=1.522, over 142.00 samples.], tot_loss[dur_loss=0.2138, prior_loss=0.9776, diff_loss=0.3723, tot_loss=1.564, over 2210.00 samples.],
2024-10-21 10:48:25,746 INFO [train.py:682] (1/4) Start epoch 1998
2024-10-21 10:48:45,556 INFO [train.py:561] (1/4) Epoch 1998, batch 8, global_batch_idx: 31960, batch size: 170, loss[dur_loss=0.2185, prior_loss=0.978, diff_loss=0.3634, tot_loss=1.56, over 170.00 samples.], tot_loss[dur_loss=0.2126, prior_loss=0.9774, diff_loss=0.3898, tot_loss=1.58, over 1432.00 samples.],
2024-10-21 10:48:55,630 INFO [train.py:682] (1/4) Start epoch 1999
2024-10-21 10:49:07,106 INFO [train.py:561] (1/4) Epoch 1999, batch 2, global_batch_idx: 31970, batch size: 203, loss[dur_loss=0.2119, prior_loss=0.9778, diff_loss=0.3563, tot_loss=1.546, over 203.00 samples.], tot_loss[dur_loss=0.215, prior_loss=0.9781, diff_loss=0.3177, tot_loss=1.511, over 442.00 samples.],
2024-10-21 10:49:21,351 INFO [train.py:561] (1/4) Epoch 1999, batch 12, global_batch_idx: 31980, batch size: 152, loss[dur_loss=0.2187, prior_loss=0.9781, diff_loss=0.3255, tot_loss=1.522, over 152.00 samples.], tot_loss[dur_loss=0.2137, prior_loss=0.9776, diff_loss=0.3686, tot_loss=1.56, over 1966.00 samples.],
2024-10-21 10:49:25,766 INFO [train.py:682] (1/4) Start epoch 2000
2024-10-21 10:49:42,924 INFO [train.py:561] (1/4) Epoch 2000, batch 6, global_batch_idx: 31990, batch size: 106, loss[dur_loss=0.2165, prior_loss=0.9782, diff_loss=0.3145, tot_loss=1.509, over 106.00 samples.], tot_loss[dur_loss=0.2107, prior_loss=0.9773, diff_loss=0.4112, tot_loss=1.599, over 1142.00 samples.],
2024-10-21 10:49:55,893 INFO [train.py:682] (1/4) Start epoch 2001
2024-10-21 10:50:04,850 INFO [train.py:561] (1/4) Epoch 2001, batch 0, global_batch_idx: 32000, batch size: 108, loss[dur_loss=0.2194, prior_loss=0.9787, diff_loss=0.2882, tot_loss=1.486, over 108.00 samples.], tot_loss[dur_loss=0.2194, prior_loss=0.9787, diff_loss=0.2882, tot_loss=1.486, over 108.00 samples.],
2024-10-21 10:50:19,044 INFO [train.py:561] (1/4) Epoch 2001, batch 10, global_batch_idx: 32010, batch size: 111, loss[dur_loss=0.2193, prior_loss=0.9789, diff_loss=0.3103, tot_loss=1.509, over 111.00 samples.], tot_loss[dur_loss=0.2127, prior_loss=0.9776, diff_loss=0.3744, tot_loss=1.565, over 1656.00 samples.],
2024-10-21 10:50:26,099 INFO [train.py:682] (1/4) Start epoch 2002
2024-10-21 10:50:39,935 INFO [train.py:561] (1/4) Epoch 2002, batch 4, global_batch_idx: 32020, batch size: 189, loss[dur_loss=0.2163, prior_loss=0.9781, diff_loss=0.3348, tot_loss=1.529, over 189.00 samples.], tot_loss[dur_loss=0.2123, prior_loss=0.977, diff_loss=0.4223, tot_loss=1.612, over 937.00 samples.],
2024-10-21 10:50:54,675 INFO [train.py:561] (1/4) Epoch 2002, batch 14, global_batch_idx: 32030, batch size: 142, loss[dur_loss=0.2165, prior_loss=0.9775, diff_loss=0.3374, tot_loss=1.531, over 142.00 samples.], tot_loss[dur_loss=0.2149, prior_loss=0.9776, diff_loss=0.3648, tot_loss=1.557, over 2210.00 samples.],
2024-10-21 10:50:56,097 INFO [train.py:682] (1/4) Start epoch 2003
2024-10-21 10:51:15,949 INFO [train.py:561] (1/4) Epoch 2003, batch 8, global_batch_idx: 32040, batch size: 170, loss[dur_loss=0.2169, prior_loss=0.9779, diff_loss=0.3103, tot_loss=1.505, over 170.00 samples.], tot_loss[dur_loss=0.2133, prior_loss=0.9773, diff_loss=0.389, tot_loss=1.58, over 1432.00 samples.],
2024-10-21 10:51:26,013 INFO [train.py:682] (1/4) Start epoch 2004
2024-10-21 10:51:37,591 INFO [train.py:561] (1/4) Epoch 2004, batch 2, global_batch_idx: 32050, batch size: 203, loss[dur_loss=0.2157, prior_loss=0.9778, diff_loss=0.3491, tot_loss=1.543, over 203.00 samples.], tot_loss[dur_loss=0.2169, prior_loss=0.9781, diff_loss=0.3397, tot_loss=1.535, over 442.00 samples.],
2024-10-21 10:51:51,681 INFO [train.py:561] (1/4) Epoch 2004, batch 12, global_batch_idx: 32060, batch size: 152, loss[dur_loss=0.2151, prior_loss=0.9779, diff_loss=0.3536, tot_loss=1.547, over 152.00 samples.], tot_loss[dur_loss=0.2129, prior_loss=0.9774, diff_loss=0.3732, tot_loss=1.564, over 1966.00 samples.],
2024-10-21 10:51:56,100 INFO [train.py:682] (1/4) Start epoch 2005
2024-10-21 10:52:13,012 INFO [train.py:561] (1/4) Epoch 2005, batch 6, global_batch_idx: 32070, batch size: 106, loss[dur_loss=0.2142, prior_loss=0.9779, diff_loss=0.322, tot_loss=1.514, over 106.00 samples.], tot_loss[dur_loss=0.2114, prior_loss=0.9771, diff_loss=0.4135, tot_loss=1.602, over 1142.00 samples.],
2024-10-21 10:52:25,939 INFO [train.py:682] (1/4) Start epoch 2006
2024-10-21 10:52:34,488 INFO [train.py:561] (1/4) Epoch 2006, batch 0, global_batch_idx: 32080, batch size: 108, loss[dur_loss=0.2198, prior_loss=0.9786, diff_loss=0.3011, tot_loss=1.5, over 108.00 samples.], tot_loss[dur_loss=0.2198, prior_loss=0.9786, diff_loss=0.3011, tot_loss=1.5, over 108.00 samples.],
2024-10-21 10:52:48,632 INFO [train.py:561] (1/4) Epoch 2006, batch 10, global_batch_idx: 32090, batch size: 111, loss[dur_loss=0.2183, prior_loss=0.9786, diff_loss=0.3074, tot_loss=1.504, over
111.00 samples.], tot_loss[dur_loss=0.212, prior_loss=0.9774, diff_loss=0.3805, tot_loss=1.57, over 1656.00 samples.], 2024-10-21 10:52:55,664 INFO [train.py:682] (1/4) Start epoch 2007 2024-10-21 10:53:09,295 INFO [train.py:561] (1/4) Epoch 2007, batch 4, global_batch_idx: 32100, batch size: 189, loss[dur_loss=0.2161, prior_loss=0.978, diff_loss=0.326, tot_loss=1.52, over 189.00 samples.], tot_loss[dur_loss=0.2101, prior_loss=0.9769, diff_loss=0.423, tot_loss=1.61, over 937.00 samples.], 2024-10-21 10:53:24,110 INFO [train.py:561] (1/4) Epoch 2007, batch 14, global_batch_idx: 32110, batch size: 142, loss[dur_loss=0.2167, prior_loss=0.9775, diff_loss=0.3179, tot_loss=1.512, over 142.00 samples.], tot_loss[dur_loss=0.2135, prior_loss=0.9775, diff_loss=0.3607, tot_loss=1.552, over 2210.00 samples.], 2024-10-21 10:53:25,530 INFO [train.py:682] (1/4) Start epoch 2008 2024-10-21 10:53:45,404 INFO [train.py:561] (1/4) Epoch 2008, batch 8, global_batch_idx: 32120, batch size: 170, loss[dur_loss=0.2173, prior_loss=0.978, diff_loss=0.3322, tot_loss=1.527, over 170.00 samples.], tot_loss[dur_loss=0.2126, prior_loss=0.9774, diff_loss=0.3913, tot_loss=1.581, over 1432.00 samples.], 2024-10-21 10:53:55,439 INFO [train.py:682] (1/4) Start epoch 2009 2024-10-21 10:54:06,811 INFO [train.py:561] (1/4) Epoch 2009, batch 2, global_batch_idx: 32130, batch size: 203, loss[dur_loss=0.2146, prior_loss=0.9778, diff_loss=0.3226, tot_loss=1.515, over 203.00 samples.], tot_loss[dur_loss=0.2164, prior_loss=0.978, diff_loss=0.3123, tot_loss=1.507, over 442.00 samples.], 2024-10-21 10:54:20,889 INFO [train.py:561] (1/4) Epoch 2009, batch 12, global_batch_idx: 32140, batch size: 152, loss[dur_loss=0.2128, prior_loss=0.9779, diff_loss=0.307, tot_loss=1.498, over 152.00 samples.], tot_loss[dur_loss=0.2125, prior_loss=0.9774, diff_loss=0.3679, tot_loss=1.558, over 1966.00 samples.], 2024-10-21 10:54:25,305 INFO [train.py:682] (1/4) Start epoch 2010 2024-10-21 10:54:42,441 INFO [train.py:561] (1/4) Epoch 2010, batch 6, global_batch_idx: 32150, batch size: 106, loss[dur_loss=0.2159, prior_loss=0.9777, diff_loss=0.3031, tot_loss=1.497, over 106.00 samples.], tot_loss[dur_loss=0.2109, prior_loss=0.9771, diff_loss=0.4037, tot_loss=1.592, over 1142.00 samples.], 2024-10-21 10:54:55,527 INFO [train.py:682] (1/4) Start epoch 2011 2024-10-21 10:55:04,355 INFO [train.py:561] (1/4) Epoch 2011, batch 0, global_batch_idx: 32160, batch size: 108, loss[dur_loss=0.2168, prior_loss=0.9785, diff_loss=0.2775, tot_loss=1.473, over 108.00 samples.], tot_loss[dur_loss=0.2168, prior_loss=0.9785, diff_loss=0.2775, tot_loss=1.473, over 108.00 samples.], 2024-10-21 10:55:18,453 INFO [train.py:561] (1/4) Epoch 2011, batch 10, global_batch_idx: 32170, batch size: 111, loss[dur_loss=0.2155, prior_loss=0.9786, diff_loss=0.3223, tot_loss=1.516, over 111.00 samples.], tot_loss[dur_loss=0.2118, prior_loss=0.9773, diff_loss=0.3849, tot_loss=1.574, over 1656.00 samples.], 2024-10-21 10:55:25,489 INFO [train.py:682] (1/4) Start epoch 2012 2024-10-21 10:55:39,167 INFO [train.py:561] (1/4) Epoch 2012, batch 4, global_batch_idx: 32180, batch size: 189, loss[dur_loss=0.2119, prior_loss=0.9779, diff_loss=0.3199, tot_loss=1.51, over 189.00 samples.], tot_loss[dur_loss=0.2083, prior_loss=0.9768, diff_loss=0.4244, tot_loss=1.61, over 937.00 samples.], 2024-10-21 10:55:53,916 INFO [train.py:561] (1/4) Epoch 2012, batch 14, global_batch_idx: 32190, batch size: 142, loss[dur_loss=0.2162, prior_loss=0.9774, diff_loss=0.3211, tot_loss=1.515, over 142.00 samples.], 
tot_loss[dur_loss=0.2122, prior_loss=0.9774, diff_loss=0.3592, tot_loss=1.549, over 2210.00 samples.], 2024-10-21 10:55:55,339 INFO [train.py:682] (1/4) Start epoch 2013 2024-10-21 10:56:15,005 INFO [train.py:561] (1/4) Epoch 2013, batch 8, global_batch_idx: 32200, batch size: 170, loss[dur_loss=0.2142, prior_loss=0.9776, diff_loss=0.3328, tot_loss=1.525, over 170.00 samples.], tot_loss[dur_loss=0.2119, prior_loss=0.9773, diff_loss=0.3908, tot_loss=1.58, over 1432.00 samples.], 2024-10-21 10:56:24,989 INFO [train.py:682] (1/4) Start epoch 2014 2024-10-21 10:56:36,481 INFO [train.py:561] (1/4) Epoch 2014, batch 2, global_batch_idx: 32210, batch size: 203, loss[dur_loss=0.2167, prior_loss=0.9778, diff_loss=0.3388, tot_loss=1.533, over 203.00 samples.], tot_loss[dur_loss=0.2163, prior_loss=0.9779, diff_loss=0.3211, tot_loss=1.515, over 442.00 samples.], 2024-10-21 10:56:50,596 INFO [train.py:561] (1/4) Epoch 2014, batch 12, global_batch_idx: 32220, batch size: 152, loss[dur_loss=0.2168, prior_loss=0.9778, diff_loss=0.3408, tot_loss=1.535, over 152.00 samples.], tot_loss[dur_loss=0.2122, prior_loss=0.9773, diff_loss=0.3782, tot_loss=1.568, over 1966.00 samples.], 2024-10-21 10:56:55,005 INFO [train.py:682] (1/4) Start epoch 2015 2024-10-21 10:57:11,864 INFO [train.py:561] (1/4) Epoch 2015, batch 6, global_batch_idx: 32230, batch size: 106, loss[dur_loss=0.2132, prior_loss=0.9777, diff_loss=0.2724, tot_loss=1.463, over 106.00 samples.], tot_loss[dur_loss=0.2106, prior_loss=0.9771, diff_loss=0.4007, tot_loss=1.588, over 1142.00 samples.], 2024-10-21 10:57:24,812 INFO [train.py:682] (1/4) Start epoch 2016 2024-10-21 10:57:33,367 INFO [train.py:561] (1/4) Epoch 2016, batch 0, global_batch_idx: 32240, batch size: 108, loss[dur_loss=0.2202, prior_loss=0.9786, diff_loss=0.3148, tot_loss=1.514, over 108.00 samples.], tot_loss[dur_loss=0.2202, prior_loss=0.9786, diff_loss=0.3148, tot_loss=1.514, over 108.00 samples.], 2024-10-21 10:57:47,465 INFO [train.py:561] (1/4) Epoch 2016, batch 10, global_batch_idx: 32250, batch size: 111, loss[dur_loss=0.2168, prior_loss=0.9787, diff_loss=0.2859, tot_loss=1.481, over 111.00 samples.], tot_loss[dur_loss=0.2113, prior_loss=0.9774, diff_loss=0.3843, tot_loss=1.573, over 1656.00 samples.], 2024-10-21 10:57:54,594 INFO [train.py:682] (1/4) Start epoch 2017 2024-10-21 10:58:08,390 INFO [train.py:561] (1/4) Epoch 2017, batch 4, global_batch_idx: 32260, batch size: 189, loss[dur_loss=0.2115, prior_loss=0.9777, diff_loss=0.3168, tot_loss=1.506, over 189.00 samples.], tot_loss[dur_loss=0.2098, prior_loss=0.9769, diff_loss=0.4341, tot_loss=1.621, over 937.00 samples.], 2024-10-21 10:58:23,147 INFO [train.py:561] (1/4) Epoch 2017, batch 14, global_batch_idx: 32270, batch size: 142, loss[dur_loss=0.216, prior_loss=0.9776, diff_loss=0.29, tot_loss=1.484, over 142.00 samples.], tot_loss[dur_loss=0.2125, prior_loss=0.9775, diff_loss=0.3581, tot_loss=1.548, over 2210.00 samples.], 2024-10-21 10:58:24,568 INFO [train.py:682] (1/4) Start epoch 2018 2024-10-21 10:58:44,327 INFO [train.py:561] (1/4) Epoch 2018, batch 8, global_batch_idx: 32280, batch size: 170, loss[dur_loss=0.216, prior_loss=0.9777, diff_loss=0.3529, tot_loss=1.547, over 170.00 samples.], tot_loss[dur_loss=0.2114, prior_loss=0.9771, diff_loss=0.3936, tot_loss=1.582, over 1432.00 samples.], 2024-10-21 10:58:54,348 INFO [train.py:682] (1/4) Start epoch 2019 2024-10-21 10:59:05,635 INFO [train.py:561] (1/4) Epoch 2019, batch 2, global_batch_idx: 32290, batch size: 203, loss[dur_loss=0.2173, prior_loss=0.9776, 
diff_loss=0.3493, tot_loss=1.544, over 203.00 samples.], tot_loss[dur_loss=0.2161, prior_loss=0.9778, diff_loss=0.3275, tot_loss=1.521, over 442.00 samples.], 2024-10-21 10:59:19,741 INFO [train.py:561] (1/4) Epoch 2019, batch 12, global_batch_idx: 32300, batch size: 152, loss[dur_loss=0.2122, prior_loss=0.9777, diff_loss=0.313, tot_loss=1.503, over 152.00 samples.], tot_loss[dur_loss=0.2129, prior_loss=0.9773, diff_loss=0.3674, tot_loss=1.558, over 1966.00 samples.], 2024-10-21 10:59:24,153 INFO [train.py:682] (1/4) Start epoch 2020 2024-10-21 10:59:41,105 INFO [train.py:561] (1/4) Epoch 2020, batch 6, global_batch_idx: 32310, batch size: 106, loss[dur_loss=0.2139, prior_loss=0.9778, diff_loss=0.2892, tot_loss=1.481, over 106.00 samples.], tot_loss[dur_loss=0.2102, prior_loss=0.977, diff_loss=0.404, tot_loss=1.591, over 1142.00 samples.], 2024-10-21 10:59:54,033 INFO [train.py:682] (1/4) Start epoch 2021 2024-10-21 11:00:02,518 INFO [train.py:561] (1/4) Epoch 2021, batch 0, global_batch_idx: 32320, batch size: 108, loss[dur_loss=0.2178, prior_loss=0.9785, diff_loss=0.3293, tot_loss=1.526, over 108.00 samples.], tot_loss[dur_loss=0.2178, prior_loss=0.9785, diff_loss=0.3293, tot_loss=1.526, over 108.00 samples.], 2024-10-21 11:00:16,594 INFO [train.py:561] (1/4) Epoch 2021, batch 10, global_batch_idx: 32330, batch size: 111, loss[dur_loss=0.2182, prior_loss=0.9786, diff_loss=0.3349, tot_loss=1.532, over 111.00 samples.], tot_loss[dur_loss=0.2111, prior_loss=0.9772, diff_loss=0.3862, tot_loss=1.575, over 1656.00 samples.], 2024-10-21 11:00:23,621 INFO [train.py:682] (1/4) Start epoch 2022 2024-10-21 11:00:37,319 INFO [train.py:561] (1/4) Epoch 2022, batch 4, global_batch_idx: 32340, batch size: 189, loss[dur_loss=0.2154, prior_loss=0.9776, diff_loss=0.3216, tot_loss=1.515, over 189.00 samples.], tot_loss[dur_loss=0.2094, prior_loss=0.9768, diff_loss=0.4338, tot_loss=1.62, over 937.00 samples.], 2024-10-21 11:00:52,033 INFO [train.py:561] (1/4) Epoch 2022, batch 14, global_batch_idx: 32350, batch size: 142, loss[dur_loss=0.2115, prior_loss=0.9774, diff_loss=0.2953, tot_loss=1.484, over 142.00 samples.], tot_loss[dur_loss=0.212, prior_loss=0.9773, diff_loss=0.3703, tot_loss=1.56, over 2210.00 samples.], 2024-10-21 11:00:53,459 INFO [train.py:682] (1/4) Start epoch 2023 2024-10-21 11:01:13,072 INFO [train.py:561] (1/4) Epoch 2023, batch 8, global_batch_idx: 32360, batch size: 170, loss[dur_loss=0.2157, prior_loss=0.9775, diff_loss=0.3149, tot_loss=1.508, over 170.00 samples.], tot_loss[dur_loss=0.2116, prior_loss=0.9771, diff_loss=0.3877, tot_loss=1.576, over 1432.00 samples.], 2024-10-21 11:01:23,069 INFO [train.py:682] (1/4) Start epoch 2024 2024-10-21 11:01:34,190 INFO [train.py:561] (1/4) Epoch 2024, batch 2, global_batch_idx: 32370, batch size: 203, loss[dur_loss=0.217, prior_loss=0.9777, diff_loss=0.3441, tot_loss=1.539, over 203.00 samples.], tot_loss[dur_loss=0.2154, prior_loss=0.9778, diff_loss=0.3332, tot_loss=1.526, over 442.00 samples.], 2024-10-21 11:01:48,290 INFO [train.py:561] (1/4) Epoch 2024, batch 12, global_batch_idx: 32380, batch size: 152, loss[dur_loss=0.2157, prior_loss=0.978, diff_loss=0.3037, tot_loss=1.497, over 152.00 samples.], tot_loss[dur_loss=0.2121, prior_loss=0.9774, diff_loss=0.3668, tot_loss=1.556, over 1966.00 samples.], 2024-10-21 11:01:52,676 INFO [train.py:682] (1/4) Start epoch 2025 2024-10-21 11:02:09,429 INFO [train.py:561] (1/4) Epoch 2025, batch 6, global_batch_idx: 32390, batch size: 106, loss[dur_loss=0.2147, prior_loss=0.9779, diff_loss=0.2962, 
tot_loss=1.489, over 106.00 samples.], tot_loss[dur_loss=0.2109, prior_loss=0.9771, diff_loss=0.4203, tot_loss=1.608, over 1142.00 samples.], 2024-10-21 11:02:22,349 INFO [train.py:682] (1/4) Start epoch 2026 2024-10-21 11:02:30,947 INFO [train.py:561] (1/4) Epoch 2026, batch 0, global_batch_idx: 32400, batch size: 108, loss[dur_loss=0.2193, prior_loss=0.9786, diff_loss=0.31, tot_loss=1.508, over 108.00 samples.], tot_loss[dur_loss=0.2193, prior_loss=0.9786, diff_loss=0.31, tot_loss=1.508, over 108.00 samples.], 2024-10-21 11:02:45,106 INFO [train.py:561] (1/4) Epoch 2026, batch 10, global_batch_idx: 32410, batch size: 111, loss[dur_loss=0.2148, prior_loss=0.9787, diff_loss=0.3152, tot_loss=1.509, over 111.00 samples.], tot_loss[dur_loss=0.2121, prior_loss=0.9773, diff_loss=0.3828, tot_loss=1.572, over 1656.00 samples.], 2024-10-21 11:02:52,172 INFO [train.py:682] (1/4) Start epoch 2027 2024-10-21 11:03:05,463 INFO [train.py:561] (1/4) Epoch 2027, batch 4, global_batch_idx: 32420, batch size: 189, loss[dur_loss=0.2097, prior_loss=0.9776, diff_loss=0.346, tot_loss=1.533, over 189.00 samples.], tot_loss[dur_loss=0.2103, prior_loss=0.9768, diff_loss=0.4486, tot_loss=1.636, over 937.00 samples.], 2024-10-21 11:03:20,163 INFO [train.py:561] (1/4) Epoch 2027, batch 14, global_batch_idx: 32430, batch size: 142, loss[dur_loss=0.2141, prior_loss=0.9774, diff_loss=0.3281, tot_loss=1.52, over 142.00 samples.], tot_loss[dur_loss=0.2131, prior_loss=0.9774, diff_loss=0.3718, tot_loss=1.562, over 2210.00 samples.], 2024-10-21 11:03:21,575 INFO [train.py:682] (1/4) Start epoch 2028 2024-10-21 11:03:41,490 INFO [train.py:561] (1/4) Epoch 2028, batch 8, global_batch_idx: 32440, batch size: 170, loss[dur_loss=0.2169, prior_loss=0.9775, diff_loss=0.3365, tot_loss=1.531, over 170.00 samples.], tot_loss[dur_loss=0.2123, prior_loss=0.9772, diff_loss=0.3882, tot_loss=1.578, over 1432.00 samples.], 2024-10-21 11:03:51,484 INFO [train.py:682] (1/4) Start epoch 2029 2024-10-21 11:04:02,686 INFO [train.py:561] (1/4) Epoch 2029, batch 2, global_batch_idx: 32450, batch size: 203, loss[dur_loss=0.2161, prior_loss=0.9779, diff_loss=0.346, tot_loss=1.54, over 203.00 samples.], tot_loss[dur_loss=0.216, prior_loss=0.9779, diff_loss=0.3215, tot_loss=1.515, over 442.00 samples.], 2024-10-21 11:04:16,996 INFO [train.py:561] (1/4) Epoch 2029, batch 12, global_batch_idx: 32460, batch size: 152, loss[dur_loss=0.2165, prior_loss=0.978, diff_loss=0.3498, tot_loss=1.544, over 152.00 samples.], tot_loss[dur_loss=0.2127, prior_loss=0.9774, diff_loss=0.3708, tot_loss=1.561, over 1966.00 samples.], 2024-10-21 11:04:21,444 INFO [train.py:682] (1/4) Start epoch 2030 2024-10-21 11:04:38,165 INFO [train.py:561] (1/4) Epoch 2030, batch 6, global_batch_idx: 32470, batch size: 106, loss[dur_loss=0.2138, prior_loss=0.9778, diff_loss=0.3124, tot_loss=1.504, over 106.00 samples.], tot_loss[dur_loss=0.2108, prior_loss=0.9771, diff_loss=0.4134, tot_loss=1.601, over 1142.00 samples.], 2024-10-21 11:04:51,182 INFO [train.py:682] (1/4) Start epoch 2031 2024-10-21 11:04:59,919 INFO [train.py:561] (1/4) Epoch 2031, batch 0, global_batch_idx: 32480, batch size: 108, loss[dur_loss=0.2186, prior_loss=0.9787, diff_loss=0.2627, tot_loss=1.46, over 108.00 samples.], tot_loss[dur_loss=0.2186, prior_loss=0.9787, diff_loss=0.2627, tot_loss=1.46, over 108.00 samples.], 2024-10-21 11:05:14,084 INFO [train.py:561] (1/4) Epoch 2031, batch 10, global_batch_idx: 32490, batch size: 111, loss[dur_loss=0.2165, prior_loss=0.9786, diff_loss=0.2788, tot_loss=1.474, over 
111.00 samples.], tot_loss[dur_loss=0.2124, prior_loss=0.9774, diff_loss=0.3763, tot_loss=1.566, over 1656.00 samples.], 2024-10-21 11:05:21,130 INFO [train.py:682] (1/4) Start epoch 2032 2024-10-21 11:05:34,938 INFO [train.py:561] (1/4) Epoch 2032, batch 4, global_batch_idx: 32500, batch size: 189, loss[dur_loss=0.2111, prior_loss=0.9779, diff_loss=0.3221, tot_loss=1.511, over 189.00 samples.], tot_loss[dur_loss=0.2079, prior_loss=0.9768, diff_loss=0.4378, tot_loss=1.622, over 937.00 samples.], 2024-10-21 11:05:49,709 INFO [train.py:561] (1/4) Epoch 2032, batch 14, global_batch_idx: 32510, batch size: 142, loss[dur_loss=0.213, prior_loss=0.9776, diff_loss=0.3065, tot_loss=1.497, over 142.00 samples.], tot_loss[dur_loss=0.2119, prior_loss=0.9774, diff_loss=0.3685, tot_loss=1.558, over 2210.00 samples.], 2024-10-21 11:05:51,121 INFO [train.py:682] (1/4) Start epoch 2033 2024-10-21 11:06:11,086 INFO [train.py:561] (1/4) Epoch 2033, batch 8, global_batch_idx: 32520, batch size: 170, loss[dur_loss=0.2181, prior_loss=0.9778, diff_loss=0.3359, tot_loss=1.532, over 170.00 samples.], tot_loss[dur_loss=0.2116, prior_loss=0.9772, diff_loss=0.3926, tot_loss=1.581, over 1432.00 samples.], 2024-10-21 11:06:21,143 INFO [train.py:682] (1/4) Start epoch 2034 2024-10-21 11:06:32,416 INFO [train.py:561] (1/4) Epoch 2034, batch 2, global_batch_idx: 32530, batch size: 203, loss[dur_loss=0.2119, prior_loss=0.9776, diff_loss=0.3244, tot_loss=1.514, over 203.00 samples.], tot_loss[dur_loss=0.2135, prior_loss=0.9778, diff_loss=0.3212, tot_loss=1.512, over 442.00 samples.], 2024-10-21 11:06:46,599 INFO [train.py:561] (1/4) Epoch 2034, batch 12, global_batch_idx: 32540, batch size: 152, loss[dur_loss=0.2154, prior_loss=0.9781, diff_loss=0.3455, tot_loss=1.539, over 152.00 samples.], tot_loss[dur_loss=0.2122, prior_loss=0.9774, diff_loss=0.3791, tot_loss=1.569, over 1966.00 samples.], 2024-10-21 11:06:51,015 INFO [train.py:682] (1/4) Start epoch 2035 2024-10-21 11:07:07,631 INFO [train.py:561] (1/4) Epoch 2035, batch 6, global_batch_idx: 32550, batch size: 106, loss[dur_loss=0.216, prior_loss=0.9779, diff_loss=0.3527, tot_loss=1.547, over 106.00 samples.], tot_loss[dur_loss=0.2098, prior_loss=0.977, diff_loss=0.4231, tot_loss=1.61, over 1142.00 samples.], 2024-10-21 11:07:20,635 INFO [train.py:682] (1/4) Start epoch 2036 2024-10-21 11:07:29,306 INFO [train.py:561] (1/4) Epoch 2036, batch 0, global_batch_idx: 32560, batch size: 108, loss[dur_loss=0.2172, prior_loss=0.9789, diff_loss=0.3068, tot_loss=1.503, over 108.00 samples.], tot_loss[dur_loss=0.2172, prior_loss=0.9789, diff_loss=0.3068, tot_loss=1.503, over 108.00 samples.], 2024-10-21 11:07:43,462 INFO [train.py:561] (1/4) Epoch 2036, batch 10, global_batch_idx: 32570, batch size: 111, loss[dur_loss=0.2162, prior_loss=0.9784, diff_loss=0.2898, tot_loss=1.484, over 111.00 samples.], tot_loss[dur_loss=0.2112, prior_loss=0.9773, diff_loss=0.375, tot_loss=1.563, over 1656.00 samples.], 2024-10-21 11:07:50,512 INFO [train.py:682] (1/4) Start epoch 2037 2024-10-21 11:08:03,875 INFO [train.py:561] (1/4) Epoch 2037, batch 4, global_batch_idx: 32580, batch size: 189, loss[dur_loss=0.2129, prior_loss=0.9781, diff_loss=0.3035, tot_loss=1.495, over 189.00 samples.], tot_loss[dur_loss=0.2096, prior_loss=0.977, diff_loss=0.4195, tot_loss=1.606, over 937.00 samples.], 2024-10-21 11:08:18,685 INFO [train.py:561] (1/4) Epoch 2037, batch 14, global_batch_idx: 32590, batch size: 142, loss[dur_loss=0.2146, prior_loss=0.9775, diff_loss=0.2992, tot_loss=1.491, over 142.00 samples.], 
tot_loss[dur_loss=0.2134, prior_loss=0.9777, diff_loss=0.3665, tot_loss=1.558, over 2210.00 samples.], 2024-10-21 11:08:20,106 INFO [train.py:682] (1/4) Start epoch 2038 2024-10-21 11:08:39,687 INFO [train.py:561] (1/4) Epoch 2038, batch 8, global_batch_idx: 32600, batch size: 170, loss[dur_loss=0.2166, prior_loss=0.9777, diff_loss=0.3314, tot_loss=1.526, over 170.00 samples.], tot_loss[dur_loss=0.2112, prior_loss=0.9772, diff_loss=0.3915, tot_loss=1.58, over 1432.00 samples.], 2024-10-21 11:08:49,760 INFO [train.py:682] (1/4) Start epoch 2039 2024-10-21 11:09:00,979 INFO [train.py:561] (1/4) Epoch 2039, batch 2, global_batch_idx: 32610, batch size: 203, loss[dur_loss=0.215, prior_loss=0.9776, diff_loss=0.3509, tot_loss=1.543, over 203.00 samples.], tot_loss[dur_loss=0.2158, prior_loss=0.9778, diff_loss=0.3232, tot_loss=1.517, over 442.00 samples.], 2024-10-21 11:09:15,151 INFO [train.py:561] (1/4) Epoch 2039, batch 12, global_batch_idx: 32620, batch size: 152, loss[dur_loss=0.2131, prior_loss=0.9776, diff_loss=0.3124, tot_loss=1.503, over 152.00 samples.], tot_loss[dur_loss=0.2116, prior_loss=0.9773, diff_loss=0.3755, tot_loss=1.564, over 1966.00 samples.], 2024-10-21 11:09:19,560 INFO [train.py:682] (1/4) Start epoch 2040 2024-10-21 11:09:36,510 INFO [train.py:561] (1/4) Epoch 2040, batch 6, global_batch_idx: 32630, batch size: 106, loss[dur_loss=0.2166, prior_loss=0.9778, diff_loss=0.3258, tot_loss=1.52, over 106.00 samples.], tot_loss[dur_loss=0.2113, prior_loss=0.977, diff_loss=0.4205, tot_loss=1.609, over 1142.00 samples.], 2024-10-21 11:09:49,527 INFO [train.py:682] (1/4) Start epoch 2041 2024-10-21 11:09:57,934 INFO [train.py:561] (1/4) Epoch 2041, batch 0, global_batch_idx: 32640, batch size: 108, loss[dur_loss=0.2173, prior_loss=0.9783, diff_loss=0.3096, tot_loss=1.505, over 108.00 samples.], tot_loss[dur_loss=0.2173, prior_loss=0.9783, diff_loss=0.3096, tot_loss=1.505, over 108.00 samples.], 2024-10-21 11:10:12,041 INFO [train.py:561] (1/4) Epoch 2041, batch 10, global_batch_idx: 32650, batch size: 111, loss[dur_loss=0.2134, prior_loss=0.9785, diff_loss=0.2912, tot_loss=1.483, over 111.00 samples.], tot_loss[dur_loss=0.2097, prior_loss=0.9772, diff_loss=0.3658, tot_loss=1.553, over 1656.00 samples.], 2024-10-21 11:10:19,107 INFO [train.py:682] (1/4) Start epoch 2042 2024-10-21 11:10:33,004 INFO [train.py:561] (1/4) Epoch 2042, batch 4, global_batch_idx: 32660, batch size: 189, loss[dur_loss=0.2108, prior_loss=0.9775, diff_loss=0.3082, tot_loss=1.496, over 189.00 samples.], tot_loss[dur_loss=0.2093, prior_loss=0.9767, diff_loss=0.4297, tot_loss=1.616, over 937.00 samples.], 2024-10-21 11:10:48,021 INFO [train.py:561] (1/4) Epoch 2042, batch 14, global_batch_idx: 32670, batch size: 142, loss[dur_loss=0.2155, prior_loss=0.9771, diff_loss=0.2914, tot_loss=1.484, over 142.00 samples.], tot_loss[dur_loss=0.2123, prior_loss=0.9773, diff_loss=0.3569, tot_loss=1.546, over 2210.00 samples.], 2024-10-21 11:10:49,447 INFO [train.py:682] (1/4) Start epoch 2043 2024-10-21 11:11:09,030 INFO [train.py:561] (1/4) Epoch 2043, batch 8, global_batch_idx: 32680, batch size: 170, loss[dur_loss=0.2159, prior_loss=0.9775, diff_loss=0.3467, tot_loss=1.54, over 170.00 samples.], tot_loss[dur_loss=0.2098, prior_loss=0.977, diff_loss=0.3846, tot_loss=1.571, over 1432.00 samples.], 2024-10-21 11:11:19,206 INFO [train.py:682] (1/4) Start epoch 2044 2024-10-21 11:11:30,580 INFO [train.py:561] (1/4) Epoch 2044, batch 2, global_batch_idx: 32690, batch size: 203, loss[dur_loss=0.2123, prior_loss=0.9777, 
diff_loss=0.3528, tot_loss=1.543, over 203.00 samples.], tot_loss[dur_loss=0.2128, prior_loss=0.9777, diff_loss=0.3253, tot_loss=1.516, over 442.00 samples.], 2024-10-21 11:11:44,953 INFO [train.py:561] (1/4) Epoch 2044, batch 12, global_batch_idx: 32700, batch size: 152, loss[dur_loss=0.2158, prior_loss=0.9777, diff_loss=0.2925, tot_loss=1.486, over 152.00 samples.], tot_loss[dur_loss=0.2111, prior_loss=0.9772, diff_loss=0.3642, tot_loss=1.553, over 1966.00 samples.], 2024-10-21 11:11:49,402 INFO [train.py:682] (1/4) Start epoch 2045 2024-10-21 11:12:06,393 INFO [train.py:561] (1/4) Epoch 2045, batch 6, global_batch_idx: 32710, batch size: 106, loss[dur_loss=0.2125, prior_loss=0.9777, diff_loss=0.318, tot_loss=1.508, over 106.00 samples.], tot_loss[dur_loss=0.2099, prior_loss=0.9769, diff_loss=0.3991, tot_loss=1.586, over 1142.00 samples.], 2024-10-21 11:12:19,555 INFO [train.py:682] (1/4) Start epoch 2046 2024-10-21 11:12:28,297 INFO [train.py:561] (1/4) Epoch 2046, batch 0, global_batch_idx: 32720, batch size: 108, loss[dur_loss=0.2167, prior_loss=0.9785, diff_loss=0.2976, tot_loss=1.493, over 108.00 samples.], tot_loss[dur_loss=0.2167, prior_loss=0.9785, diff_loss=0.2976, tot_loss=1.493, over 108.00 samples.], 2024-10-21 11:12:42,513 INFO [train.py:561] (1/4) Epoch 2046, batch 10, global_batch_idx: 32730, batch size: 111, loss[dur_loss=0.2132, prior_loss=0.9786, diff_loss=0.2836, tot_loss=1.475, over 111.00 samples.], tot_loss[dur_loss=0.2116, prior_loss=0.9772, diff_loss=0.3823, tot_loss=1.571, over 1656.00 samples.], 2024-10-21 11:12:49,639 INFO [train.py:682] (1/4) Start epoch 2047 2024-10-21 11:13:03,290 INFO [train.py:561] (1/4) Epoch 2047, batch 4, global_batch_idx: 32740, batch size: 189, loss[dur_loss=0.2117, prior_loss=0.9777, diff_loss=0.3178, tot_loss=1.507, over 189.00 samples.], tot_loss[dur_loss=0.2098, prior_loss=0.9768, diff_loss=0.4254, tot_loss=1.612, over 937.00 samples.], 2024-10-21 11:13:18,127 INFO [train.py:561] (1/4) Epoch 2047, batch 14, global_batch_idx: 32750, batch size: 142, loss[dur_loss=0.2139, prior_loss=0.9773, diff_loss=0.3015, tot_loss=1.493, over 142.00 samples.], tot_loss[dur_loss=0.2126, prior_loss=0.9773, diff_loss=0.3553, tot_loss=1.545, over 2210.00 samples.], 2024-10-21 11:13:19,556 INFO [train.py:682] (1/4) Start epoch 2048 2024-10-21 11:13:39,606 INFO [train.py:561] (1/4) Epoch 2048, batch 8, global_batch_idx: 32760, batch size: 170, loss[dur_loss=0.2139, prior_loss=0.9776, diff_loss=0.3172, tot_loss=1.509, over 170.00 samples.], tot_loss[dur_loss=0.211, prior_loss=0.9772, diff_loss=0.3881, tot_loss=1.576, over 1432.00 samples.], 2024-10-21 11:13:49,745 INFO [train.py:682] (1/4) Start epoch 2049 2024-10-21 11:14:01,058 INFO [train.py:561] (1/4) Epoch 2049, batch 2, global_batch_idx: 32770, batch size: 203, loss[dur_loss=0.2148, prior_loss=0.9776, diff_loss=0.3305, tot_loss=1.523, over 203.00 samples.], tot_loss[dur_loss=0.2148, prior_loss=0.9778, diff_loss=0.3389, tot_loss=1.531, over 442.00 samples.], 2024-10-21 11:14:15,462 INFO [train.py:561] (1/4) Epoch 2049, batch 12, global_batch_idx: 32780, batch size: 152, loss[dur_loss=0.2135, prior_loss=0.9776, diff_loss=0.3387, tot_loss=1.53, over 152.00 samples.], tot_loss[dur_loss=0.2112, prior_loss=0.9772, diff_loss=0.3733, tot_loss=1.562, over 1966.00 samples.], 2024-10-21 11:14:19,932 INFO [train.py:682] (1/4) Start epoch 2050 2024-10-21 11:14:37,080 INFO [train.py:561] (1/4) Epoch 2050, batch 6, global_batch_idx: 32790, batch size: 106, loss[dur_loss=0.2151, prior_loss=0.9777, 
diff_loss=0.3217, tot_loss=1.515, over 106.00 samples.], tot_loss[dur_loss=0.2108, prior_loss=0.9769, diff_loss=0.407, tot_loss=1.595, over 1142.00 samples.], 2024-10-21 11:14:50,164 INFO [train.py:682] (1/4) Start epoch 2051 2024-10-21 11:14:58,750 INFO [train.py:561] (1/4) Epoch 2051, batch 0, global_batch_idx: 32800, batch size: 108, loss[dur_loss=0.2155, prior_loss=0.9784, diff_loss=0.3441, tot_loss=1.538, over 108.00 samples.], tot_loss[dur_loss=0.2155, prior_loss=0.9784, diff_loss=0.3441, tot_loss=1.538, over 108.00 samples.], 2024-10-21 11:15:13,347 INFO [train.py:561] (1/4) Epoch 2051, batch 10, global_batch_idx: 32810, batch size: 111, loss[dur_loss=0.2162, prior_loss=0.9784, diff_loss=0.2947, tot_loss=1.489, over 111.00 samples.], tot_loss[dur_loss=0.2114, prior_loss=0.9771, diff_loss=0.3839, tot_loss=1.572, over 1656.00 samples.], 2024-10-21 11:15:20,464 INFO [train.py:682] (1/4) Start epoch 2052 2024-10-21 11:15:34,022 INFO [train.py:561] (1/4) Epoch 2052, batch 4, global_batch_idx: 32820, batch size: 189, loss[dur_loss=0.2127, prior_loss=0.9775, diff_loss=0.353, tot_loss=1.543, over 189.00 samples.], tot_loss[dur_loss=0.2089, prior_loss=0.9767, diff_loss=0.4371, tot_loss=1.623, over 937.00 samples.], 2024-10-21 11:15:48,980 INFO [train.py:561] (1/4) Epoch 2052, batch 14, global_batch_idx: 32830, batch size: 142, loss[dur_loss=0.2168, prior_loss=0.9774, diff_loss=0.3245, tot_loss=1.519, over 142.00 samples.], tot_loss[dur_loss=0.2129, prior_loss=0.9773, diff_loss=0.3648, tot_loss=1.555, over 2210.00 samples.], 2024-10-21 11:15:50,412 INFO [train.py:682] (1/4) Start epoch 2053 2024-10-21 11:16:10,455 INFO [train.py:561] (1/4) Epoch 2053, batch 8, global_batch_idx: 32840, batch size: 170, loss[dur_loss=0.2167, prior_loss=0.9776, diff_loss=0.3149, tot_loss=1.509, over 170.00 samples.], tot_loss[dur_loss=0.2115, prior_loss=0.977, diff_loss=0.3772, tot_loss=1.566, over 1432.00 samples.], 2024-10-21 11:16:20,634 INFO [train.py:682] (1/4) Start epoch 2054 2024-10-21 11:16:31,814 INFO [train.py:561] (1/4) Epoch 2054, batch 2, global_batch_idx: 32850, batch size: 203, loss[dur_loss=0.2155, prior_loss=0.9774, diff_loss=0.3317, tot_loss=1.525, over 203.00 samples.], tot_loss[dur_loss=0.2149, prior_loss=0.9776, diff_loss=0.3178, tot_loss=1.51, over 442.00 samples.], 2024-10-21 11:16:46,019 INFO [train.py:561] (1/4) Epoch 2054, batch 12, global_batch_idx: 32860, batch size: 152, loss[dur_loss=0.2135, prior_loss=0.9779, diff_loss=0.3562, tot_loss=1.548, over 152.00 samples.], tot_loss[dur_loss=0.2119, prior_loss=0.9771, diff_loss=0.3694, tot_loss=1.558, over 1966.00 samples.], 2024-10-21 11:16:50,476 INFO [train.py:682] (1/4) Start epoch 2055 2024-10-21 11:17:07,296 INFO [train.py:561] (1/4) Epoch 2055, batch 6, global_batch_idx: 32870, batch size: 106, loss[dur_loss=0.2127, prior_loss=0.9774, diff_loss=0.3039, tot_loss=1.494, over 106.00 samples.], tot_loss[dur_loss=0.2098, prior_loss=0.9767, diff_loss=0.4015, tot_loss=1.588, over 1142.00 samples.], 2024-10-21 11:17:20,371 INFO [train.py:682] (1/4) Start epoch 2056 2024-10-21 11:17:29,018 INFO [train.py:561] (1/4) Epoch 2056, batch 0, global_batch_idx: 32880, batch size: 108, loss[dur_loss=0.2202, prior_loss=0.9783, diff_loss=0.3344, tot_loss=1.533, over 108.00 samples.], tot_loss[dur_loss=0.2202, prior_loss=0.9783, diff_loss=0.3344, tot_loss=1.533, over 108.00 samples.], 2024-10-21 11:17:43,265 INFO [train.py:561] (1/4) Epoch 2056, batch 10, global_batch_idx: 32890, batch size: 111, loss[dur_loss=0.212, prior_loss=0.9781, diff_loss=0.2977, 
tot_loss=1.488, over 111.00 samples.], tot_loss[dur_loss=0.2114, prior_loss=0.9771, diff_loss=0.3827, tot_loss=1.571, over 1656.00 samples.], 2024-10-21 11:17:50,418 INFO [train.py:682] (1/4) Start epoch 2057 2024-10-21 11:18:04,294 INFO [train.py:561] (1/4) Epoch 2057, batch 4, global_batch_idx: 32900, batch size: 189, loss[dur_loss=0.2099, prior_loss=0.9773, diff_loss=0.339, tot_loss=1.526, over 189.00 samples.], tot_loss[dur_loss=0.2077, prior_loss=0.9767, diff_loss=0.4473, tot_loss=1.632, over 937.00 samples.], 2024-10-21 11:18:19,201 INFO [train.py:561] (1/4) Epoch 2057, batch 14, global_batch_idx: 32910, batch size: 142, loss[dur_loss=0.2105, prior_loss=0.977, diff_loss=0.331, tot_loss=1.519, over 142.00 samples.], tot_loss[dur_loss=0.2112, prior_loss=0.9771, diff_loss=0.3712, tot_loss=1.56, over 2210.00 samples.], 2024-10-21 11:18:20,645 INFO [train.py:682] (1/4) Start epoch 2058 2024-10-21 11:18:40,621 INFO [train.py:561] (1/4) Epoch 2058, batch 8, global_batch_idx: 32920, batch size: 170, loss[dur_loss=0.216, prior_loss=0.9773, diff_loss=0.3263, tot_loss=1.52, over 170.00 samples.], tot_loss[dur_loss=0.2109, prior_loss=0.9769, diff_loss=0.3944, tot_loss=1.582, over 1432.00 samples.], 2024-10-21 11:18:50,737 INFO [train.py:682] (1/4) Start epoch 2059 2024-10-21 11:19:01,865 INFO [train.py:561] (1/4) Epoch 2059, batch 2, global_batch_idx: 32930, batch size: 203, loss[dur_loss=0.211, prior_loss=0.9774, diff_loss=0.3304, tot_loss=1.519, over 203.00 samples.], tot_loss[dur_loss=0.2135, prior_loss=0.9776, diff_loss=0.3256, tot_loss=1.517, over 442.00 samples.], 2024-10-21 11:19:16,181 INFO [train.py:561] (1/4) Epoch 2059, batch 12, global_batch_idx: 32940, batch size: 152, loss[dur_loss=0.2124, prior_loss=0.9776, diff_loss=0.2911, tot_loss=1.481, over 152.00 samples.], tot_loss[dur_loss=0.2114, prior_loss=0.9771, diff_loss=0.3703, tot_loss=1.559, over 1966.00 samples.], 2024-10-21 11:19:20,637 INFO [train.py:682] (1/4) Start epoch 2060 2024-10-21 11:19:37,455 INFO [train.py:561] (1/4) Epoch 2060, batch 6, global_batch_idx: 32950, batch size: 106, loss[dur_loss=0.2167, prior_loss=0.9778, diff_loss=0.275, tot_loss=1.469, over 106.00 samples.], tot_loss[dur_loss=0.2105, prior_loss=0.9768, diff_loss=0.4111, tot_loss=1.598, over 1142.00 samples.], 2024-10-21 11:19:50,490 INFO [train.py:682] (1/4) Start epoch 2061 2024-10-21 11:19:59,126 INFO [train.py:561] (1/4) Epoch 2061, batch 0, global_batch_idx: 32960, batch size: 108, loss[dur_loss=0.217, prior_loss=0.9783, diff_loss=0.3217, tot_loss=1.517, over 108.00 samples.], tot_loss[dur_loss=0.217, prior_loss=0.9783, diff_loss=0.3217, tot_loss=1.517, over 108.00 samples.], 2024-10-21 11:20:13,302 INFO [train.py:561] (1/4) Epoch 2061, batch 10, global_batch_idx: 32970, batch size: 111, loss[dur_loss=0.213, prior_loss=0.9783, diff_loss=0.3205, tot_loss=1.512, over 111.00 samples.], tot_loss[dur_loss=0.2104, prior_loss=0.9771, diff_loss=0.3773, tot_loss=1.565, over 1656.00 samples.], 2024-10-21 11:20:20,410 INFO [train.py:682] (1/4) Start epoch 2062 2024-10-21 11:20:34,031 INFO [train.py:561] (1/4) Epoch 2062, batch 4, global_batch_idx: 32980, batch size: 189, loss[dur_loss=0.2136, prior_loss=0.978, diff_loss=0.3201, tot_loss=1.512, over 189.00 samples.], tot_loss[dur_loss=0.21, prior_loss=0.9769, diff_loss=0.4303, tot_loss=1.617, over 937.00 samples.], 2024-10-21 11:20:48,977 INFO [train.py:561] (1/4) Epoch 2062, batch 14, global_batch_idx: 32990, batch size: 142, loss[dur_loss=0.2143, prior_loss=0.9771, diff_loss=0.3523, tot_loss=1.544, over 
142.00 samples.], tot_loss[dur_loss=0.2127, prior_loss=0.9773, diff_loss=0.3751, tot_loss=1.565, over 2210.00 samples.], 2024-10-21 11:20:50,412 INFO [train.py:682] (1/4) Start epoch 2063 2024-10-21 11:21:10,191 INFO [train.py:561] (1/4) Epoch 2063, batch 8, global_batch_idx: 33000, batch size: 170, loss[dur_loss=0.2131, prior_loss=0.9771, diff_loss=0.3193, tot_loss=1.509, over 170.00 samples.], tot_loss[dur_loss=0.2098, prior_loss=0.9769, diff_loss=0.3832, tot_loss=1.57, over 1432.00 samples.], 2024-10-21 11:21:11,708 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 11:22:07,947 INFO [train.py:589] (1/4) Epoch 2063, validation: dur_loss=0.4596, prior_loss=1.036, diff_loss=0.4229, tot_loss=1.919, over 100.00 samples. 2024-10-21 11:22:07,948 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 11:22:16,717 INFO [train.py:682] (1/4) Start epoch 2064 2024-10-21 11:22:28,392 INFO [train.py:561] (1/4) Epoch 2064, batch 2, global_batch_idx: 33010, batch size: 203, loss[dur_loss=0.2142, prior_loss=0.9776, diff_loss=0.3379, tot_loss=1.53, over 203.00 samples.], tot_loss[dur_loss=0.2142, prior_loss=0.9776, diff_loss=0.3284, tot_loss=1.52, over 442.00 samples.], 2024-10-21 11:22:42,671 INFO [train.py:561] (1/4) Epoch 2064, batch 12, global_batch_idx: 33020, batch size: 152, loss[dur_loss=0.2151, prior_loss=0.9773, diff_loss=0.3237, tot_loss=1.516, over 152.00 samples.], tot_loss[dur_loss=0.2117, prior_loss=0.977, diff_loss=0.3769, tot_loss=1.566, over 1966.00 samples.], 2024-10-21 11:22:47,112 INFO [train.py:682] (1/4) Start epoch 2065 2024-10-21 11:23:04,181 INFO [train.py:561] (1/4) Epoch 2065, batch 6, global_batch_idx: 33030, batch size: 106, loss[dur_loss=0.2182, prior_loss=0.9776, diff_loss=0.3116, tot_loss=1.507, over 106.00 samples.], tot_loss[dur_loss=0.2104, prior_loss=0.9767, diff_loss=0.4048, tot_loss=1.592, over 1142.00 samples.], 2024-10-21 11:23:17,207 INFO [train.py:682] (1/4) Start epoch 2066 2024-10-21 11:23:25,994 INFO [train.py:561] (1/4) Epoch 2066, batch 0, global_batch_idx: 33040, batch size: 108, loss[dur_loss=0.2168, prior_loss=0.9782, diff_loss=0.3036, tot_loss=1.499, over 108.00 samples.], tot_loss[dur_loss=0.2168, prior_loss=0.9782, diff_loss=0.3036, tot_loss=1.499, over 108.00 samples.], 2024-10-21 11:23:40,227 INFO [train.py:561] (1/4) Epoch 2066, batch 10, global_batch_idx: 33050, batch size: 111, loss[dur_loss=0.212, prior_loss=0.978, diff_loss=0.3038, tot_loss=1.494, over 111.00 samples.], tot_loss[dur_loss=0.2112, prior_loss=0.9768, diff_loss=0.3754, tot_loss=1.563, over 1656.00 samples.], 2024-10-21 11:23:47,303 INFO [train.py:682] (1/4) Start epoch 2067 2024-10-21 11:24:01,443 INFO [train.py:561] (1/4) Epoch 2067, batch 4, global_batch_idx: 33060, batch size: 189, loss[dur_loss=0.2131, prior_loss=0.9774, diff_loss=0.3605, tot_loss=1.551, over 189.00 samples.], tot_loss[dur_loss=0.2079, prior_loss=0.9765, diff_loss=0.4366, tot_loss=1.621, over 937.00 samples.], 2024-10-21 11:24:16,296 INFO [train.py:561] (1/4) Epoch 2067, batch 14, global_batch_idx: 33070, batch size: 142, loss[dur_loss=0.2152, prior_loss=0.9771, diff_loss=0.291, tot_loss=1.483, over 142.00 samples.], tot_loss[dur_loss=0.2109, prior_loss=0.977, diff_loss=0.368, tot_loss=1.556, over 2210.00 samples.], 2024-10-21 11:24:17,720 INFO [train.py:682] (1/4) Start epoch 2068 2024-10-21 11:24:37,832 INFO [train.py:561] (1/4) Epoch 2068, batch 8, global_batch_idx: 33080, batch size: 170, loss[dur_loss=0.216, prior_loss=0.9773, diff_loss=0.3198, tot_loss=1.513, over 
170.00 samples.], tot_loss[dur_loss=0.2101, prior_loss=0.9768, diff_loss=0.3814, tot_loss=1.568, over 1432.00 samples.], 2024-10-21 11:24:47,999 INFO [train.py:682] (1/4) Start epoch 2069 2024-10-21 11:24:59,431 INFO [train.py:561] (1/4) Epoch 2069, batch 2, global_batch_idx: 33090, batch size: 203, loss[dur_loss=0.2136, prior_loss=0.9774, diff_loss=0.3384, tot_loss=1.529, over 203.00 samples.], tot_loss[dur_loss=0.2133, prior_loss=0.9775, diff_loss=0.3163, tot_loss=1.507, over 442.00 samples.], 2024-10-21 11:25:13,603 INFO [train.py:561] (1/4) Epoch 2069, batch 12, global_batch_idx: 33100, batch size: 152, loss[dur_loss=0.2127, prior_loss=0.9773, diff_loss=0.3193, tot_loss=1.509, over 152.00 samples.], tot_loss[dur_loss=0.2108, prior_loss=0.9769, diff_loss=0.3634, tot_loss=1.551, over 1966.00 samples.], 2024-10-21 11:25:18,028 INFO [train.py:682] (1/4) Start epoch 2070 2024-10-21 11:25:35,298 INFO [train.py:561] (1/4) Epoch 2070, batch 6, global_batch_idx: 33110, batch size: 106, loss[dur_loss=0.2123, prior_loss=0.9775, diff_loss=0.3012, tot_loss=1.491, over 106.00 samples.], tot_loss[dur_loss=0.2082, prior_loss=0.9766, diff_loss=0.4261, tot_loss=1.611, over 1142.00 samples.], 2024-10-21 11:25:48,356 INFO [train.py:682] (1/4) Start epoch 2071 2024-10-21 11:25:57,022 INFO [train.py:561] (1/4) Epoch 2071, batch 0, global_batch_idx: 33120, batch size: 108, loss[dur_loss=0.2168, prior_loss=0.9781, diff_loss=0.286, tot_loss=1.481, over 108.00 samples.], tot_loss[dur_loss=0.2168, prior_loss=0.9781, diff_loss=0.286, tot_loss=1.481, over 108.00 samples.], 2024-10-21 11:26:11,216 INFO [train.py:561] (1/4) Epoch 2071, batch 10, global_batch_idx: 33130, batch size: 111, loss[dur_loss=0.2126, prior_loss=0.9781, diff_loss=0.3328, tot_loss=1.524, over 111.00 samples.], tot_loss[dur_loss=0.211, prior_loss=0.9769, diff_loss=0.3786, tot_loss=1.566, over 1656.00 samples.], 2024-10-21 11:26:18,244 INFO [train.py:682] (1/4) Start epoch 2072 2024-10-21 11:26:32,315 INFO [train.py:561] (1/4) Epoch 2072, batch 4, global_batch_idx: 33140, batch size: 189, loss[dur_loss=0.2121, prior_loss=0.9771, diff_loss=0.3748, tot_loss=1.564, over 189.00 samples.], tot_loss[dur_loss=0.2075, prior_loss=0.9764, diff_loss=0.4256, tot_loss=1.61, over 937.00 samples.], 2024-10-21 11:26:47,172 INFO [train.py:561] (1/4) Epoch 2072, batch 14, global_batch_idx: 33150, batch size: 142, loss[dur_loss=0.2138, prior_loss=0.9769, diff_loss=0.3146, tot_loss=1.505, over 142.00 samples.], tot_loss[dur_loss=0.2109, prior_loss=0.977, diff_loss=0.3566, tot_loss=1.544, over 2210.00 samples.], 2024-10-21 11:26:48,597 INFO [train.py:682] (1/4) Start epoch 2073 2024-10-21 11:27:08,898 INFO [train.py:561] (1/4) Epoch 2073, batch 8, global_batch_idx: 33160, batch size: 170, loss[dur_loss=0.2184, prior_loss=0.9773, diff_loss=0.3363, tot_loss=1.532, over 170.00 samples.], tot_loss[dur_loss=0.2104, prior_loss=0.9768, diff_loss=0.3891, tot_loss=1.576, over 1432.00 samples.], 2024-10-21 11:27:19,070 INFO [train.py:682] (1/4) Start epoch 2074 2024-10-21 11:27:30,440 INFO [train.py:561] (1/4) Epoch 2074, batch 2, global_batch_idx: 33170, batch size: 203, loss[dur_loss=0.2117, prior_loss=0.9772, diff_loss=0.342, tot_loss=1.531, over 203.00 samples.], tot_loss[dur_loss=0.2132, prior_loss=0.9774, diff_loss=0.3161, tot_loss=1.507, over 442.00 samples.], 2024-10-21 11:27:44,669 INFO [train.py:561] (1/4) Epoch 2074, batch 12, global_batch_idx: 33180, batch size: 152, loss[dur_loss=0.2128, prior_loss=0.9774, diff_loss=0.3272, tot_loss=1.517, over 152.00 samples.], 
tot_loss[dur_loss=0.2105, prior_loss=0.977, diff_loss=0.3633, tot_loss=1.551, over 1966.00 samples.], 2024-10-21 11:27:49,108 INFO [train.py:682] (1/4) Start epoch 2075 2024-10-21 11:28:06,076 INFO [train.py:561] (1/4) Epoch 2075, batch 6, global_batch_idx: 33190, batch size: 106, loss[dur_loss=0.2108, prior_loss=0.9773, diff_loss=0.3341, tot_loss=1.522, over 106.00 samples.], tot_loss[dur_loss=0.2076, prior_loss=0.9766, diff_loss=0.4016, tot_loss=1.586, over 1142.00 samples.], 2024-10-21 11:28:19,120 INFO [train.py:682] (1/4) Start epoch 2076 2024-10-21 11:28:27,670 INFO [train.py:561] (1/4) Epoch 2076, batch 0, global_batch_idx: 33200, batch size: 108, loss[dur_loss=0.2172, prior_loss=0.978, diff_loss=0.3041, tot_loss=1.499, over 108.00 samples.], tot_loss[dur_loss=0.2172, prior_loss=0.978, diff_loss=0.3041, tot_loss=1.499, over 108.00 samples.], 2024-10-21 11:28:41,890 INFO [train.py:561] (1/4) Epoch 2076, batch 10, global_batch_idx: 33210, batch size: 111, loss[dur_loss=0.2102, prior_loss=0.9781, diff_loss=0.2892, tot_loss=1.478, over 111.00 samples.], tot_loss[dur_loss=0.2087, prior_loss=0.9769, diff_loss=0.372, tot_loss=1.558, over 1656.00 samples.], 2024-10-21 11:28:48,979 INFO [train.py:682] (1/4) Start epoch 2077 2024-10-21 11:29:02,958 INFO [train.py:561] (1/4) Epoch 2077, batch 4, global_batch_idx: 33220, batch size: 189, loss[dur_loss=0.2101, prior_loss=0.9771, diff_loss=0.3558, tot_loss=1.543, over 189.00 samples.], tot_loss[dur_loss=0.2079, prior_loss=0.9764, diff_loss=0.4322, tot_loss=1.616, over 937.00 samples.], 2024-10-21 11:29:17,816 INFO [train.py:561] (1/4) Epoch 2077, batch 14, global_batch_idx: 33230, batch size: 142, loss[dur_loss=0.2116, prior_loss=0.9767, diff_loss=0.3169, tot_loss=1.505, over 142.00 samples.], tot_loss[dur_loss=0.2112, prior_loss=0.9769, diff_loss=0.3642, tot_loss=1.552, over 2210.00 samples.], 2024-10-21 11:29:19,234 INFO [train.py:682] (1/4) Start epoch 2078 2024-10-21 11:29:39,059 INFO [train.py:561] (1/4) Epoch 2078, batch 8, global_batch_idx: 33240, batch size: 170, loss[dur_loss=0.2171, prior_loss=0.9774, diff_loss=0.3296, tot_loss=1.524, over 170.00 samples.], tot_loss[dur_loss=0.2102, prior_loss=0.9767, diff_loss=0.3827, tot_loss=1.57, over 1432.00 samples.], 2024-10-21 11:29:49,159 INFO [train.py:682] (1/4) Start epoch 2079 2024-10-21 11:30:00,443 INFO [train.py:561] (1/4) Epoch 2079, batch 2, global_batch_idx: 33250, batch size: 203, loss[dur_loss=0.2115, prior_loss=0.9772, diff_loss=0.3439, tot_loss=1.533, over 203.00 samples.], tot_loss[dur_loss=0.2126, prior_loss=0.9774, diff_loss=0.3258, tot_loss=1.516, over 442.00 samples.], 2024-10-21 11:30:14,674 INFO [train.py:561] (1/4) Epoch 2079, batch 12, global_batch_idx: 33260, batch size: 152, loss[dur_loss=0.211, prior_loss=0.9775, diff_loss=0.3046, tot_loss=1.493, over 152.00 samples.], tot_loss[dur_loss=0.2111, prior_loss=0.977, diff_loss=0.3644, tot_loss=1.552, over 1966.00 samples.], 2024-10-21 11:30:19,150 INFO [train.py:682] (1/4) Start epoch 2080 2024-10-21 11:30:35,935 INFO [train.py:561] (1/4) Epoch 2080, batch 6, global_batch_idx: 33270, batch size: 106, loss[dur_loss=0.2141, prior_loss=0.9774, diff_loss=0.3034, tot_loss=1.495, over 106.00 samples.], tot_loss[dur_loss=0.2103, prior_loss=0.9768, diff_loss=0.4084, tot_loss=1.595, over 1142.00 samples.], 2024-10-21 11:30:48,987 INFO [train.py:682] (1/4) Start epoch 2081 2024-10-21 11:30:58,143 INFO [train.py:561] (1/4) Epoch 2081, batch 0, global_batch_idx: 33280, batch size: 108, loss[dur_loss=0.2182, prior_loss=0.9783, 
diff_loss=0.2874, tot_loss=1.484, over 108.00 samples.], tot_loss[dur_loss=0.2182, prior_loss=0.9783, diff_loss=0.2874, tot_loss=1.484, over 108.00 samples.], 2024-10-21 11:31:12,390 INFO [train.py:561] (1/4) Epoch 2081, batch 10, global_batch_idx: 33290, batch size: 111, loss[dur_loss=0.2125, prior_loss=0.9781, diff_loss=0.2788, tot_loss=1.469, over 111.00 samples.], tot_loss[dur_loss=0.21, prior_loss=0.9769, diff_loss=0.3758, tot_loss=1.563, over 1656.00 samples.], 2024-10-21 11:31:19,469 INFO [train.py:682] (1/4) Start epoch 2082 2024-10-21 11:31:33,167 INFO [train.py:561] (1/4) Epoch 2082, batch 4, global_batch_idx: 33300, batch size: 189, loss[dur_loss=0.212, prior_loss=0.9772, diff_loss=0.3374, tot_loss=1.527, over 189.00 samples.], tot_loss[dur_loss=0.2088, prior_loss=0.9765, diff_loss=0.4271, tot_loss=1.612, over 937.00 samples.], 2024-10-21 11:31:48,047 INFO [train.py:561] (1/4) Epoch 2082, batch 14, global_batch_idx: 33310, batch size: 142, loss[dur_loss=0.2149, prior_loss=0.9768, diff_loss=0.3086, tot_loss=1.5, over 142.00 samples.], tot_loss[dur_loss=0.2119, prior_loss=0.977, diff_loss=0.3583, tot_loss=1.547, over 2210.00 samples.], 2024-10-21 11:31:49,469 INFO [train.py:682] (1/4) Start epoch 2083 2024-10-21 11:32:09,185 INFO [train.py:561] (1/4) Epoch 2083, batch 8, global_batch_idx: 33320, batch size: 170, loss[dur_loss=0.2139, prior_loss=0.9772, diff_loss=0.3214, tot_loss=1.512, over 170.00 samples.], tot_loss[dur_loss=0.2098, prior_loss=0.9767, diff_loss=0.3883, tot_loss=1.575, over 1432.00 samples.], 2024-10-21 11:32:19,308 INFO [train.py:682] (1/4) Start epoch 2084 2024-10-21 11:32:30,957 INFO [train.py:561] (1/4) Epoch 2084, batch 2, global_batch_idx: 33330, batch size: 203, loss[dur_loss=0.2117, prior_loss=0.9772, diff_loss=0.3663, tot_loss=1.555, over 203.00 samples.], tot_loss[dur_loss=0.2128, prior_loss=0.9772, diff_loss=0.3402, tot_loss=1.53, over 442.00 samples.], 2024-10-21 11:32:45,152 INFO [train.py:561] (1/4) Epoch 2084, batch 12, global_batch_idx: 33340, batch size: 152, loss[dur_loss=0.214, prior_loss=0.9774, diff_loss=0.317, tot_loss=1.508, over 152.00 samples.], tot_loss[dur_loss=0.211, prior_loss=0.9769, diff_loss=0.3727, tot_loss=1.561, over 1966.00 samples.], 2024-10-21 11:32:49,576 INFO [train.py:682] (1/4) Start epoch 2085 2024-10-21 11:33:06,506 INFO [train.py:561] (1/4) Epoch 2085, batch 6, global_batch_idx: 33350, batch size: 106, loss[dur_loss=0.2136, prior_loss=0.9774, diff_loss=0.3172, tot_loss=1.508, over 106.00 samples.], tot_loss[dur_loss=0.2081, prior_loss=0.9766, diff_loss=0.4096, tot_loss=1.594, over 1142.00 samples.], 2024-10-21 11:33:19,576 INFO [train.py:682] (1/4) Start epoch 2086 2024-10-21 11:33:28,280 INFO [train.py:561] (1/4) Epoch 2086, batch 0, global_batch_idx: 33360, batch size: 108, loss[dur_loss=0.2153, prior_loss=0.9779, diff_loss=0.3061, tot_loss=1.499, over 108.00 samples.], tot_loss[dur_loss=0.2153, prior_loss=0.9779, diff_loss=0.3061, tot_loss=1.499, over 108.00 samples.], 2024-10-21 11:33:42,554 INFO [train.py:561] (1/4) Epoch 2086, batch 10, global_batch_idx: 33370, batch size: 111, loss[dur_loss=0.2152, prior_loss=0.9781, diff_loss=0.2939, tot_loss=1.487, over 111.00 samples.], tot_loss[dur_loss=0.2098, prior_loss=0.9768, diff_loss=0.3705, tot_loss=1.557, over 1656.00 samples.], 2024-10-21 11:33:49,647 INFO [train.py:682] (1/4) Start epoch 2087 2024-10-21 11:34:03,276 INFO [train.py:561] (1/4) Epoch 2087, batch 4, global_batch_idx: 33380, batch size: 189, loss[dur_loss=0.212, prior_loss=0.9772, diff_loss=0.3282, 
tot_loss=1.517, over 189.00 samples.], tot_loss[dur_loss=0.2072, prior_loss=0.9763, diff_loss=0.4204, tot_loss=1.604, over 937.00 samples.], 2024-10-21 11:34:18,253 INFO [train.py:561] (1/4) Epoch 2087, batch 14, global_batch_idx: 33390, batch size: 142, loss[dur_loss=0.2138, prior_loss=0.977, diff_loss=0.2815, tot_loss=1.472, over 142.00 samples.], tot_loss[dur_loss=0.2102, prior_loss=0.9768, diff_loss=0.3554, tot_loss=1.542, over 2210.00 samples.], 2024-10-21 11:34:19,692 INFO [train.py:682] (1/4) Start epoch 2088 2024-10-21 11:34:39,513 INFO [train.py:561] (1/4) Epoch 2088, batch 8, global_batch_idx: 33400, batch size: 170, loss[dur_loss=0.216, prior_loss=0.9772, diff_loss=0.3188, tot_loss=1.512, over 170.00 samples.], tot_loss[dur_loss=0.2102, prior_loss=0.9766, diff_loss=0.3927, tot_loss=1.579, over 1432.00 samples.], 2024-10-21 11:34:49,659 INFO [train.py:682] (1/4) Start epoch 2089 2024-10-21 11:35:01,104 INFO [train.py:561] (1/4) Epoch 2089, batch 2, global_batch_idx: 33410, batch size: 203, loss[dur_loss=0.2106, prior_loss=0.9772, diff_loss=0.3365, tot_loss=1.524, over 203.00 samples.], tot_loss[dur_loss=0.2124, prior_loss=0.9773, diff_loss=0.3286, tot_loss=1.518, over 442.00 samples.], 2024-10-21 11:35:15,375 INFO [train.py:561] (1/4) Epoch 2089, batch 12, global_batch_idx: 33420, batch size: 152, loss[dur_loss=0.2088, prior_loss=0.977, diff_loss=0.3375, tot_loss=1.523, over 152.00 samples.], tot_loss[dur_loss=0.2093, prior_loss=0.9768, diff_loss=0.3691, tot_loss=1.555, over 1966.00 samples.], 2024-10-21 11:35:19,837 INFO [train.py:682] (1/4) Start epoch 2090 2024-10-21 11:35:37,028 INFO [train.py:561] (1/4) Epoch 2090, batch 6, global_batch_idx: 33430, batch size: 106, loss[dur_loss=0.2134, prior_loss=0.9771, diff_loss=0.2858, tot_loss=1.476, over 106.00 samples.], tot_loss[dur_loss=0.2082, prior_loss=0.9765, diff_loss=0.4167, tot_loss=1.601, over 1142.00 samples.], 2024-10-21 11:35:50,078 INFO [train.py:682] (1/4) Start epoch 2091 2024-10-21 11:35:58,702 INFO [train.py:561] (1/4) Epoch 2091, batch 0, global_batch_idx: 33440, batch size: 108, loss[dur_loss=0.2163, prior_loss=0.9779, diff_loss=0.3142, tot_loss=1.508, over 108.00 samples.], tot_loss[dur_loss=0.2163, prior_loss=0.9779, diff_loss=0.3142, tot_loss=1.508, over 108.00 samples.], 2024-10-21 11:36:12,957 INFO [train.py:561] (1/4) Epoch 2091, batch 10, global_batch_idx: 33450, batch size: 111, loss[dur_loss=0.2151, prior_loss=0.978, diff_loss=0.2604, tot_loss=1.453, over 111.00 samples.], tot_loss[dur_loss=0.209, prior_loss=0.9767, diff_loss=0.358, tot_loss=1.544, over 1656.00 samples.], 2024-10-21 11:36:20,080 INFO [train.py:682] (1/4) Start epoch 2092 2024-10-21 11:36:33,759 INFO [train.py:561] (1/4) Epoch 2092, batch 4, global_batch_idx: 33460, batch size: 189, loss[dur_loss=0.2128, prior_loss=0.9772, diff_loss=0.3459, tot_loss=1.536, over 189.00 samples.], tot_loss[dur_loss=0.2084, prior_loss=0.9763, diff_loss=0.4325, tot_loss=1.617, over 937.00 samples.], 2024-10-21 11:36:48,697 INFO [train.py:561] (1/4) Epoch 2092, batch 14, global_batch_idx: 33470, batch size: 142, loss[dur_loss=0.21, prior_loss=0.9768, diff_loss=0.3311, tot_loss=1.518, over 142.00 samples.], tot_loss[dur_loss=0.211, prior_loss=0.9769, diff_loss=0.3599, tot_loss=1.548, over 2210.00 samples.], 2024-10-21 11:36:50,138 INFO [train.py:682] (1/4) Start epoch 2093 2024-10-21 11:37:10,025 INFO [train.py:561] (1/4) Epoch 2093, batch 8, global_batch_idx: 33480, batch size: 170, loss[dur_loss=0.2148, prior_loss=0.9776, diff_loss=0.3117, tot_loss=1.504, over 
170.00 samples.], tot_loss[dur_loss=0.2095, prior_loss=0.9768, diff_loss=0.3854, tot_loss=1.572, over 1432.00 samples.],
2024-10-21 11:37:20,157 INFO [train.py:682] (1/4) Start epoch 2094
2024-10-21 11:37:31,323 INFO [train.py:561] (1/4) Epoch 2094, batch 2, global_batch_idx: 33490, batch size: 203, loss[dur_loss=0.2106, prior_loss=0.9771, diff_loss=0.3239, tot_loss=1.512, over 203.00 samples.], tot_loss[dur_loss=0.2124, prior_loss=0.9773, diff_loss=0.3155, tot_loss=1.505, over 442.00 samples.],
2024-10-21 11:37:45,592 INFO [train.py:561] (1/4) Epoch 2094, batch 12, global_batch_idx: 33500, batch size: 152, loss[dur_loss=0.2102, prior_loss=0.977, diff_loss=0.3049, tot_loss=1.492, over 152.00 samples.], tot_loss[dur_loss=0.2093, prior_loss=0.9768, diff_loss=0.3598, tot_loss=1.546, over 1966.00 samples.],
2024-10-21 11:37:50,062 INFO [train.py:682] (1/4) Start epoch 2095
2024-10-21 11:38:07,142 INFO [train.py:561] (1/4) Epoch 2095, batch 6, global_batch_idx: 33510, batch size: 106, loss[dur_loss=0.2126, prior_loss=0.9772, diff_loss=0.3662, tot_loss=1.556, over 106.00 samples.], tot_loss[dur_loss=0.2089, prior_loss=0.9766, diff_loss=0.3957, tot_loss=1.581, over 1142.00 samples.],
2024-10-21 11:38:20,296 INFO [train.py:682] (1/4) Start epoch 2096
2024-10-21 11:38:28,878 INFO [train.py:561] (1/4) Epoch 2096, batch 0, global_batch_idx: 33520, batch size: 108, loss[dur_loss=0.2119, prior_loss=0.978, diff_loss=0.324, tot_loss=1.514, over 108.00 samples.], tot_loss[dur_loss=0.2119, prior_loss=0.978, diff_loss=0.324, tot_loss=1.514, over 108.00 samples.],
2024-10-21 11:38:43,137 INFO [train.py:561] (1/4) Epoch 2096, batch 10, global_batch_idx: 33530, batch size: 111, loss[dur_loss=0.2127, prior_loss=0.9778, diff_loss=0.2841, tot_loss=1.475, over 111.00 samples.], tot_loss[dur_loss=0.2085, prior_loss=0.9767, diff_loss=0.3721, tot_loss=1.557, over 1656.00 samples.],
2024-10-21 11:38:50,232 INFO [train.py:682] (1/4) Start epoch 2097
2024-10-21 11:39:04,092 INFO [train.py:561] (1/4) Epoch 2097, batch 4, global_batch_idx: 33540, batch size: 189, loss[dur_loss=0.2113, prior_loss=0.9772, diff_loss=0.3373, tot_loss=1.526, over 189.00 samples.], tot_loss[dur_loss=0.2072, prior_loss=0.9762, diff_loss=0.4231, tot_loss=1.607, over 937.00 samples.],
2024-10-21 11:39:18,988 INFO [train.py:561] (1/4) Epoch 2097, batch 14, global_batch_idx: 33550, batch size: 142, loss[dur_loss=0.2125, prior_loss=0.9768, diff_loss=0.3166, tot_loss=1.506, over 142.00 samples.], tot_loss[dur_loss=0.2104, prior_loss=0.9768, diff_loss=0.3593, tot_loss=1.546, over 2210.00 samples.],
2024-10-21 11:39:20,428 INFO [train.py:682] (1/4) Start epoch 2098
2024-10-21 11:39:40,538 INFO [train.py:561] (1/4) Epoch 2098, batch 8, global_batch_idx: 33560, batch size: 170, loss[dur_loss=0.2165, prior_loss=0.9771, diff_loss=0.3062, tot_loss=1.5, over 170.00 samples.], tot_loss[dur_loss=0.2095, prior_loss=0.9767, diff_loss=0.3867, tot_loss=1.573, over 1432.00 samples.],
2024-10-21 11:39:50,723 INFO [train.py:682] (1/4) Start epoch 2099
2024-10-21 11:40:01,897 INFO [train.py:561] (1/4) Epoch 2099, batch 2, global_batch_idx: 33570, batch size: 203, loss[dur_loss=0.213, prior_loss=0.9774, diff_loss=0.3435, tot_loss=1.534, over 203.00 samples.], tot_loss[dur_loss=0.2129, prior_loss=0.9774, diff_loss=0.3318, tot_loss=1.522, over 442.00 samples.],
2024-10-21 11:40:16,198 INFO [train.py:561] (1/4) Epoch 2099, batch 12, global_batch_idx: 33580, batch size: 152, loss[dur_loss=0.2102, prior_loss=0.9771, diff_loss=0.3258, tot_loss=1.513, over 152.00 samples.], tot_loss[dur_loss=0.2091, prior_loss=0.9768, diff_loss=0.3721, tot_loss=1.558, over 1966.00 samples.],
2024-10-21 11:40:20,657 INFO [train.py:682] (1/4) Start epoch 2100
2024-10-21 11:40:37,674 INFO [train.py:561] (1/4) Epoch 2100, batch 6, global_batch_idx: 33590, batch size: 106, loss[dur_loss=0.2132, prior_loss=0.9773, diff_loss=0.3088, tot_loss=1.499, over 106.00 samples.], tot_loss[dur_loss=0.2081, prior_loss=0.9765, diff_loss=0.4051, tot_loss=1.59, over 1142.00 samples.],
2024-10-21 11:40:50,728 INFO [train.py:682] (1/4) Start epoch 2101
2024-10-21 11:40:59,481 INFO [train.py:561] (1/4) Epoch 2101, batch 0, global_batch_idx: 33600, batch size: 108, loss[dur_loss=0.2134, prior_loss=0.9776, diff_loss=0.3092, tot_loss=1.5, over 108.00 samples.], tot_loss[dur_loss=0.2134, prior_loss=0.9776, diff_loss=0.3092, tot_loss=1.5, over 108.00 samples.],
2024-10-21 11:41:13,791 INFO [train.py:561] (1/4) Epoch 2101, batch 10, global_batch_idx: 33610, batch size: 111, loss[dur_loss=0.2119, prior_loss=0.9778, diff_loss=0.3144, tot_loss=1.504, over 111.00 samples.], tot_loss[dur_loss=0.2089, prior_loss=0.9767, diff_loss=0.3737, tot_loss=1.559, over 1656.00 samples.],
2024-10-21 11:41:20,964 INFO [train.py:682] (1/4) Start epoch 2102
2024-10-21 11:41:34,635 INFO [train.py:561] (1/4) Epoch 2102, batch 4, global_batch_idx: 33620, batch size: 189, loss[dur_loss=0.2074, prior_loss=0.9771, diff_loss=0.3207, tot_loss=1.505, over 189.00 samples.], tot_loss[dur_loss=0.2057, prior_loss=0.9762, diff_loss=0.4263, tot_loss=1.608, over 937.00 samples.],
2024-10-21 11:41:49,545 INFO [train.py:561] (1/4) Epoch 2102, batch 14, global_batch_idx: 33630, batch size: 142, loss[dur_loss=0.2094, prior_loss=0.9765, diff_loss=0.302, tot_loss=1.488, over 142.00 samples.], tot_loss[dur_loss=0.2091, prior_loss=0.9767, diff_loss=0.3594, tot_loss=1.545, over 2210.00 samples.],
2024-10-21 11:41:50,993 INFO [train.py:682] (1/4) Start epoch 2103
2024-10-21 11:42:10,805 INFO [train.py:561] (1/4) Epoch 2103, batch 8, global_batch_idx: 33640, batch size: 170, loss[dur_loss=0.2158, prior_loss=0.9769, diff_loss=0.3459, tot_loss=1.539, over 170.00 samples.], tot_loss[dur_loss=0.209, prior_loss=0.9765, diff_loss=0.3981, tot_loss=1.584, over 1432.00 samples.],
2024-10-21 11:42:20,941 INFO [train.py:682] (1/4) Start epoch 2104
2024-10-21 11:42:32,520 INFO [train.py:561] (1/4) Epoch 2104, batch 2, global_batch_idx: 33650, batch size: 203, loss[dur_loss=0.2116, prior_loss=0.977, diff_loss=0.3617, tot_loss=1.55, over 203.00 samples.], tot_loss[dur_loss=0.212, prior_loss=0.977, diff_loss=0.3184, tot_loss=1.507, over 442.00 samples.],
2024-10-21 11:42:46,755 INFO [train.py:561] (1/4) Epoch 2104, batch 12, global_batch_idx: 33660, batch size: 152, loss[dur_loss=0.2085, prior_loss=0.977, diff_loss=0.3137, tot_loss=1.499, over 152.00 samples.], tot_loss[dur_loss=0.2091, prior_loss=0.9766, diff_loss=0.3628, tot_loss=1.549, over 1966.00 samples.],
2024-10-21 11:42:51,219 INFO [train.py:682] (1/4) Start epoch 2105
2024-10-21 11:43:08,080 INFO [train.py:561] (1/4) Epoch 2105, batch 6, global_batch_idx: 33670, batch size: 106, loss[dur_loss=0.2128, prior_loss=0.977, diff_loss=0.3433, tot_loss=1.533, over 106.00 samples.], tot_loss[dur_loss=0.2065, prior_loss=0.9762, diff_loss=0.4047, tot_loss=1.587, over 1142.00 samples.],
2024-10-21 11:43:21,200 INFO [train.py:682] (1/4) Start epoch 2106
2024-10-21 11:43:30,058 INFO [train.py:561] (1/4) Epoch 2106, batch 0, global_batch_idx: 33680, batch size: 108, loss[dur_loss=0.2131, prior_loss=0.9778, diff_loss=0.2464, tot_loss=1.437, over 108.00 samples.], tot_loss[dur_loss=0.2131, prior_loss=0.9778, diff_loss=0.2464, tot_loss=1.437, over 108.00 samples.],
2024-10-21 11:43:44,288 INFO [train.py:561] (1/4) Epoch 2106, batch 10, global_batch_idx: 33690, batch size: 111, loss[dur_loss=0.2134, prior_loss=0.9776, diff_loss=0.2804, tot_loss=1.471, over 111.00 samples.], tot_loss[dur_loss=0.2074, prior_loss=0.9765, diff_loss=0.3759, tot_loss=1.56, over 1656.00 samples.],
2024-10-21 11:43:51,439 INFO [train.py:682] (1/4) Start epoch 2107
2024-10-21 11:44:05,033 INFO [train.py:561] (1/4) Epoch 2107, batch 4, global_batch_idx: 33700, batch size: 189, loss[dur_loss=0.2084, prior_loss=0.9769, diff_loss=0.3682, tot_loss=1.554, over 189.00 samples.], tot_loss[dur_loss=0.2069, prior_loss=0.9761, diff_loss=0.4318, tot_loss=1.615, over 937.00 samples.],
2024-10-21 11:44:19,936 INFO [train.py:561] (1/4) Epoch 2107, batch 14, global_batch_idx: 33710, batch size: 142, loss[dur_loss=0.2146, prior_loss=0.9765, diff_loss=0.3073, tot_loss=1.498, over 142.00 samples.], tot_loss[dur_loss=0.2097, prior_loss=0.9766, diff_loss=0.3562, tot_loss=1.543, over 2210.00 samples.],
2024-10-21 11:44:21,376 INFO [train.py:682] (1/4) Start epoch 2108
2024-10-21 11:44:41,181 INFO [train.py:561] (1/4) Epoch 2108, batch 8, global_batch_idx: 33720, batch size: 170, loss[dur_loss=0.2134, prior_loss=0.9771, diff_loss=0.3038, tot_loss=1.494, over 170.00 samples.], tot_loss[dur_loss=0.207, prior_loss=0.9764, diff_loss=0.3978, tot_loss=1.581, over 1432.00 samples.],
2024-10-21 11:44:51,348 INFO [train.py:682] (1/4) Start epoch 2109
2024-10-21 11:45:02,462 INFO [train.py:561] (1/4) Epoch 2109, batch 2, global_batch_idx: 33730, batch size: 203, loss[dur_loss=0.2111, prior_loss=0.9768, diff_loss=0.323, tot_loss=1.511, over 203.00 samples.], tot_loss[dur_loss=0.2113, prior_loss=0.9771, diff_loss=0.3045, tot_loss=1.493, over 442.00 samples.],
2024-10-21 11:45:16,819 INFO [train.py:561] (1/4) Epoch 2109, batch 12, global_batch_idx: 33740, batch size: 152, loss[dur_loss=0.2132, prior_loss=0.9772, diff_loss=0.3058, tot_loss=1.496, over 152.00 samples.], tot_loss[dur_loss=0.209, prior_loss=0.9766, diff_loss=0.3624, tot_loss=1.548, over 1966.00 samples.],
2024-10-21 11:45:21,300 INFO [train.py:682] (1/4) Start epoch 2110
2024-10-21 11:45:38,402 INFO [train.py:561] (1/4) Epoch 2110, batch 6, global_batch_idx: 33750, batch size: 106, loss[dur_loss=0.2094, prior_loss=0.9772, diff_loss=0.3026, tot_loss=1.489, over 106.00 samples.], tot_loss[dur_loss=0.2071, prior_loss=0.9763, diff_loss=0.4083, tot_loss=1.592, over 1142.00 samples.],
2024-10-21 11:45:51,542 INFO [train.py:682] (1/4) Start epoch 2111
2024-10-21 11:46:00,122 INFO [train.py:561] (1/4) Epoch 2111, batch 0, global_batch_idx: 33760, batch size: 108, loss[dur_loss=0.2142, prior_loss=0.9777, diff_loss=0.2545, tot_loss=1.446, over 108.00 samples.], tot_loss[dur_loss=0.2142, prior_loss=0.9777, diff_loss=0.2545, tot_loss=1.446, over 108.00 samples.],
2024-10-21 11:46:14,337 INFO [train.py:561] (1/4) Epoch 2111, batch 10, global_batch_idx: 33770, batch size: 111, loss[dur_loss=0.213, prior_loss=0.9781, diff_loss=0.335, tot_loss=1.526, over 111.00 samples.], tot_loss[dur_loss=0.2085, prior_loss=0.9766, diff_loss=0.3694, tot_loss=1.555, over 1656.00 samples.],
2024-10-21 11:46:21,478 INFO [train.py:682] (1/4) Start epoch 2112
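A note on reading these records: each per-batch line reports three loss components and their total, and the printed tot_loss agrees with the plain sum dur_loss + prior_loss + diff_loss to display precision (for the Epoch 2094, batch 2 record above: 0.2106 + 0.9771 + 0.3239 = 1.5116, shown as 1.512). A minimal spot-check in Python; the record text is copied from this log, the parsing itself is only illustrative:

    import re

    record = ("loss[dur_loss=0.2106, prior_loss=0.9771, "
              "diff_loss=0.3239, tot_loss=1.512, over 203.00 samples.]")
    parts = {k: float(v) for k, v in re.findall(r"(\w+_loss)=([\d.]+)", record)}
    # 0.2106 + 0.9771 + 0.3239 = 1.5116, which rounds to the printed 1.512
    assert abs(parts["dur_loss"] + parts["prior_loss"] + parts["diff_loss"]
               - parts["tot_loss"]) < 5e-4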
2024-10-21 11:46:35,329 INFO [train.py:561] (1/4) Epoch 2112, batch 4, global_batch_idx: 33780, batch size: 189, loss[dur_loss=0.2113, prior_loss=0.9772, diff_loss=0.3346, tot_loss=1.523, over 189.00 samples.], tot_loss[dur_loss=0.207, prior_loss=0.9761, diff_loss=0.4222, tot_loss=1.605, over 937.00 samples.],
2024-10-21 11:46:50,306 INFO [train.py:561] (1/4) Epoch 2112, batch 14, global_batch_idx: 33790, batch size: 142, loss[dur_loss=0.2132, prior_loss=0.9768, diff_loss=0.2887, tot_loss=1.479, over 142.00 samples.], tot_loss[dur_loss=0.2098, prior_loss=0.9767, diff_loss=0.3559, tot_loss=1.542, over 2210.00 samples.],
2024-10-21 11:46:51,738 INFO [train.py:682] (1/4) Start epoch 2113
2024-10-21 11:47:11,603 INFO [train.py:561] (1/4) Epoch 2113, batch 8, global_batch_idx: 33800, batch size: 170, loss[dur_loss=0.2151, prior_loss=0.977, diff_loss=0.3198, tot_loss=1.512, over 170.00 samples.], tot_loss[dur_loss=0.2082, prior_loss=0.9764, diff_loss=0.3797, tot_loss=1.564, over 1432.00 samples.],
2024-10-21 11:47:21,820 INFO [train.py:682] (1/4) Start epoch 2114
2024-10-21 11:47:33,058 INFO [train.py:561] (1/4) Epoch 2114, batch 2, global_batch_idx: 33810, batch size: 203, loss[dur_loss=0.2126, prior_loss=0.9769, diff_loss=0.348, tot_loss=1.538, over 203.00 samples.], tot_loss[dur_loss=0.2125, prior_loss=0.9771, diff_loss=0.3286, tot_loss=1.518, over 442.00 samples.],
2024-10-21 11:47:47,322 INFO [train.py:561] (1/4) Epoch 2114, batch 12, global_batch_idx: 33820, batch size: 152, loss[dur_loss=0.2116, prior_loss=0.9769, diff_loss=0.2943, tot_loss=1.483, over 152.00 samples.], tot_loss[dur_loss=0.2091, prior_loss=0.9766, diff_loss=0.3662, tot_loss=1.552, over 1966.00 samples.],
2024-10-21 11:47:51,799 INFO [train.py:682] (1/4) Start epoch 2115
2024-10-21 11:48:08,790 INFO [train.py:561] (1/4) Epoch 2115, batch 6, global_batch_idx: 33830, batch size: 106, loss[dur_loss=0.2118, prior_loss=0.9772, diff_loss=0.2893, tot_loss=1.478, over 106.00 samples.], tot_loss[dur_loss=0.2076, prior_loss=0.9764, diff_loss=0.4077, tot_loss=1.592, over 1142.00 samples.],
2024-10-21 11:48:21,932 INFO [train.py:682] (1/4) Start epoch 2116
2024-10-21 11:48:30,598 INFO [train.py:561] (1/4) Epoch 2116, batch 0, global_batch_idx: 33840, batch size: 108, loss[dur_loss=0.2154, prior_loss=0.9775, diff_loss=0.3267, tot_loss=1.52, over 108.00 samples.], tot_loss[dur_loss=0.2154, prior_loss=0.9775, diff_loss=0.3267, tot_loss=1.52, over 108.00 samples.],
2024-10-21 11:48:44,792 INFO [train.py:561] (1/4) Epoch 2116, batch 10, global_batch_idx: 33850, batch size: 111, loss[dur_loss=0.2112, prior_loss=0.9778, diff_loss=0.2764, tot_loss=1.465, over 111.00 samples.], tot_loss[dur_loss=0.2074, prior_loss=0.9765, diff_loss=0.3803, tot_loss=1.564, over 1656.00 samples.],
2024-10-21 11:48:51,909 INFO [train.py:682] (1/4) Start epoch 2117
2024-10-21 11:49:05,512 INFO [train.py:561] (1/4) Epoch 2117, batch 4, global_batch_idx: 33860, batch size: 189, loss[dur_loss=0.2095, prior_loss=0.9769, diff_loss=0.3221, tot_loss=1.508, over 189.00 samples.], tot_loss[dur_loss=0.2062, prior_loss=0.976, diff_loss=0.4149, tot_loss=1.597, over 937.00 samples.],
2024-10-21 11:49:20,444 INFO [train.py:561] (1/4) Epoch 2117, batch 14, global_batch_idx: 33870, batch size: 142, loss[dur_loss=0.213, prior_loss=0.9767, diff_loss=0.3424, tot_loss=1.532, over 142.00 samples.], tot_loss[dur_loss=0.2091, prior_loss=0.9766, diff_loss=0.3616, tot_loss=1.547, over 2210.00 samples.],
2024-10-21 11:49:21,894 INFO [train.py:682] (1/4) Start epoch 2118
2024-10-21 11:49:41,906 INFO [train.py:561] (1/4) Epoch 2118, batch 8, global_batch_idx: 33880, batch size: 170, loss[dur_loss=0.214, prior_loss=0.9771, diff_loss=0.3033, tot_loss=1.494, over 170.00 samples.], tot_loss[dur_loss=0.2077, prior_loss=0.9764, diff_loss=0.393, tot_loss=1.577, over 1432.00 samples.],
2024-10-21 11:49:52,089 INFO [train.py:682] (1/4) Start epoch 2119
2024-10-21 11:50:03,303 INFO [train.py:561] (1/4) Epoch 2119, batch 2, global_batch_idx: 33890, batch size: 203, loss[dur_loss=0.2112, prior_loss=0.9769, diff_loss=0.3422, tot_loss=1.53, over 203.00 samples.], tot_loss[dur_loss=0.212, prior_loss=0.9771, diff_loss=0.3126, tot_loss=1.502, over 442.00 samples.],
2024-10-21 11:50:17,587 INFO [train.py:561] (1/4) Epoch 2119, batch 12, global_batch_idx: 33900, batch size: 152, loss[dur_loss=0.2093, prior_loss=0.9768, diff_loss=0.3373, tot_loss=1.523, over 152.00 samples.], tot_loss[dur_loss=0.2086, prior_loss=0.9765, diff_loss=0.3667, tot_loss=1.552, over 1966.00 samples.],
2024-10-21 11:50:22,089 INFO [train.py:682] (1/4) Start epoch 2120
2024-10-21 11:50:38,903 INFO [train.py:561] (1/4) Epoch 2120, batch 6, global_batch_idx: 33910, batch size: 106, loss[dur_loss=0.2096, prior_loss=0.9771, diff_loss=0.3526, tot_loss=1.539, over 106.00 samples.], tot_loss[dur_loss=0.2064, prior_loss=0.9762, diff_loss=0.4075, tot_loss=1.59, over 1142.00 samples.],
2024-10-21 11:50:52,125 INFO [train.py:682] (1/4) Start epoch 2121
2024-10-21 11:51:00,944 INFO [train.py:561] (1/4) Epoch 2121, batch 0, global_batch_idx: 33920, batch size: 108, loss[dur_loss=0.2164, prior_loss=0.9779, diff_loss=0.3367, tot_loss=1.531, over 108.00 samples.], tot_loss[dur_loss=0.2164, prior_loss=0.9779, diff_loss=0.3367, tot_loss=1.531, over 108.00 samples.],
2024-10-21 11:51:15,180 INFO [train.py:561] (1/4) Epoch 2121, batch 10, global_batch_idx: 33930, batch size: 111, loss[dur_loss=0.2129, prior_loss=0.9777, diff_loss=0.2733, tot_loss=1.464, over 111.00 samples.], tot_loss[dur_loss=0.2088, prior_loss=0.9766, diff_loss=0.3765, tot_loss=1.562, over 1656.00 samples.],
2024-10-21 11:51:22,299 INFO [train.py:682] (1/4) Start epoch 2122
2024-10-21 11:51:36,095 INFO [train.py:561] (1/4) Epoch 2122, batch 4, global_batch_idx: 33940, batch size: 189, loss[dur_loss=0.2097, prior_loss=0.9772, diff_loss=0.3195, tot_loss=1.506, over 189.00 samples.], tot_loss[dur_loss=0.2056, prior_loss=0.9762, diff_loss=0.4243, tot_loss=1.606, over 937.00 samples.],
2024-10-21 11:51:51,017 INFO [train.py:561] (1/4) Epoch 2122, batch 14, global_batch_idx: 33950, batch size: 142, loss[dur_loss=0.2131, prior_loss=0.9766, diff_loss=0.3047, tot_loss=1.494, over 142.00 samples.], tot_loss[dur_loss=0.2091, prior_loss=0.9767, diff_loss=0.3609, tot_loss=1.547, over 2210.00 samples.],
2024-10-21 11:51:52,459 INFO [train.py:682] (1/4) Start epoch 2123
2024-10-21 11:52:12,775 INFO [train.py:561] (1/4) Epoch 2123, batch 8, global_batch_idx: 33960, batch size: 170, loss[dur_loss=0.214, prior_loss=0.977, diff_loss=0.2987, tot_loss=1.49, over 170.00 samples.], tot_loss[dur_loss=0.208, prior_loss=0.9766, diff_loss=0.3792, tot_loss=1.564, over 1432.00 samples.],
2024-10-21 11:52:23,021 INFO [train.py:682] (1/4) Start epoch 2124
2024-10-21 11:52:34,401 INFO [train.py:561] (1/4) Epoch 2124, batch 2, global_batch_idx: 33970, batch size: 203, loss[dur_loss=0.2108, prior_loss=0.9771, diff_loss=0.3495, tot_loss=1.537, over 203.00 samples.], tot_loss[dur_loss=0.2117, prior_loss=0.9771, diff_loss=0.3359, tot_loss=1.525, over 442.00 samples.],
2024-10-21 11:52:48,720 INFO [train.py:561] (1/4) Epoch 2124, batch 12, global_batch_idx: 33980, batch size: 152, loss[dur_loss=0.2111, prior_loss=0.9769, diff_loss=0.3363, tot_loss=1.524, over 152.00 samples.], tot_loss[dur_loss=0.2095, prior_loss=0.9767, diff_loss=0.3686, tot_loss=1.555, over 1966.00 samples.],
2024-10-21 11:52:53,226 INFO [train.py:682] (1/4) Start epoch 2125
2024-10-21 11:53:10,093 INFO [train.py:561] (1/4) Epoch 2125, batch 6, global_batch_idx: 33990, batch size: 106, loss[dur_loss=0.2124, prior_loss=0.9769, diff_loss=0.3752, tot_loss=1.564, over 106.00 samples.], tot_loss[dur_loss=0.2059, prior_loss=0.9761, diff_loss=0.4162, tot_loss=1.598, over 1142.00 samples.],
2024-10-21 11:53:23,148 INFO [train.py:682] (1/4) Start epoch 2126
2024-10-21 11:53:31,963 INFO [train.py:561] (1/4) Epoch 2126, batch 0, global_batch_idx: 34000, batch size: 108, loss[dur_loss=0.2137, prior_loss=0.9775, diff_loss=0.3138, tot_loss=1.505, over 108.00 samples.], tot_loss[dur_loss=0.2137, prior_loss=0.9775, diff_loss=0.3138, tot_loss=1.505, over 108.00 samples.],
2024-10-21 11:53:46,175 INFO [train.py:561] (1/4) Epoch 2126, batch 10, global_batch_idx: 34010, batch size: 111, loss[dur_loss=0.2116, prior_loss=0.9776, diff_loss=0.3177, tot_loss=1.507, over 111.00 samples.], tot_loss[dur_loss=0.2085, prior_loss=0.9764, diff_loss=0.3635, tot_loss=1.548, over 1656.00 samples.],
2024-10-21 11:53:53,346 INFO [train.py:682] (1/4) Start epoch 2127
2024-10-21 11:54:07,035 INFO [train.py:561] (1/4) Epoch 2127, batch 4, global_batch_idx: 34020, batch size: 189, loss[dur_loss=0.2101, prior_loss=0.9769, diff_loss=0.3262, tot_loss=1.513, over 189.00 samples.], tot_loss[dur_loss=0.2061, prior_loss=0.9759, diff_loss=0.429, tot_loss=1.611, over 937.00 samples.],
2024-10-21 11:54:21,963 INFO [train.py:561] (1/4) Epoch 2127, batch 14, global_batch_idx: 34030, batch size: 142, loss[dur_loss=0.2086, prior_loss=0.9765, diff_loss=0.2923, tot_loss=1.477, over 142.00 samples.], tot_loss[dur_loss=0.2091, prior_loss=0.9765, diff_loss=0.355, tot_loss=1.541, over 2210.00 samples.],
2024-10-21 11:54:23,453 INFO [train.py:682] (1/4) Start epoch 2128
2024-10-21 11:54:43,304 INFO [train.py:561] (1/4) Epoch 2128, batch 8, global_batch_idx: 34040, batch size: 170, loss[dur_loss=0.2115, prior_loss=0.977, diff_loss=0.3345, tot_loss=1.523, over 170.00 samples.], tot_loss[dur_loss=0.2079, prior_loss=0.9764, diff_loss=0.3954, tot_loss=1.58, over 1432.00 samples.],
2024-10-21 11:54:53,499 INFO [train.py:682] (1/4) Start epoch 2129
2024-10-21 11:55:04,865 INFO [train.py:561] (1/4) Epoch 2129, batch 2, global_batch_idx: 34050, batch size: 203, loss[dur_loss=0.2088, prior_loss=0.9768, diff_loss=0.3171, tot_loss=1.503, over 203.00 samples.], tot_loss[dur_loss=0.2098, prior_loss=0.9769, diff_loss=0.3078, tot_loss=1.495, over 442.00 samples.],
2024-10-21 11:55:19,194 INFO [train.py:561] (1/4) Epoch 2129, batch 12, global_batch_idx: 34060, batch size: 152, loss[dur_loss=0.2078, prior_loss=0.977, diff_loss=0.3089, tot_loss=1.494, over 152.00 samples.], tot_loss[dur_loss=0.207, prior_loss=0.9764, diff_loss=0.3643, tot_loss=1.548, over 1966.00 samples.],
2024-10-21 11:55:23,706 INFO [train.py:682] (1/4) Start epoch 2130
2024-10-21 11:55:40,871 INFO [train.py:561] (1/4) Epoch 2130, batch 6, global_batch_idx: 34070, batch size: 106, loss[dur_loss=0.2072, prior_loss=0.9768, diff_loss=0.3001, tot_loss=1.484, over 106.00 samples.], tot_loss[dur_loss=0.2056, prior_loss=0.9761, diff_loss=0.4091, tot_loss=1.591, over 1142.00 samples.],
2024-10-21 11:55:54,011 INFO [train.py:682] (1/4) Start epoch 2131
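Each batch record carries two tallies: loss[...] covers the current batch alone, while tot_loss[...] behaves like a sample-weighted running average over the epoch so far; at batch 0 the two coincide, and the sample count climbs with the batch index (108 at batch 0, 442 at batch 2, 937 at batch 4, up to 2210 by batch 14). A minimal sketch of such a tally, with illustrative names rather than the training script's actual classes:

    class RunningLoss:
        # Sample-weighted running average of per-batch mean losses,
        # reset at the start of each epoch (illustrative sketch).
        def __init__(self) -> None:
            self.weighted_sum = 0.0  # sum of batch_mean * batch_size
            self.num_samples = 0

        def update(self, batch_mean: float, batch_size: int) -> None:
            self.weighted_sum += batch_mean * batch_size
            self.num_samples += batch_size

        @property
        def average(self) -> float:
            return self.weighted_sum / max(self.num_samples, 1)

Starting a fresh instance at each "Start epoch" line and calling update once per batch reproduces the cumulative figures printed in these records.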
2024-10-21 11:56:02,555 INFO [train.py:561] (1/4) Epoch 2131, batch 0, global_batch_idx: 34080, batch size: 108, loss[dur_loss=0.2138, prior_loss=0.9773, diff_loss=0.3305, tot_loss=1.522, over 108.00 samples.], tot_loss[dur_loss=0.2138, prior_loss=0.9773, diff_loss=0.3305, tot_loss=1.522, over 108.00 samples.],
2024-10-21 11:56:16,831 INFO [train.py:561] (1/4) Epoch 2131, batch 10, global_batch_idx: 34090, batch size: 111, loss[dur_loss=0.2113, prior_loss=0.9776, diff_loss=0.3118, tot_loss=1.501, over 111.00 samples.], tot_loss[dur_loss=0.2083, prior_loss=0.9764, diff_loss=0.3795, tot_loss=1.564, over 1656.00 samples.],
2024-10-21 11:56:23,961 INFO [train.py:682] (1/4) Start epoch 2132
2024-10-21 11:56:37,822 INFO [train.py:561] (1/4) Epoch 2132, batch 4, global_batch_idx: 34100, batch size: 189, loss[dur_loss=0.2113, prior_loss=0.977, diff_loss=0.3298, tot_loss=1.518, over 189.00 samples.], tot_loss[dur_loss=0.2056, prior_loss=0.9759, diff_loss=0.4312, tot_loss=1.613, over 937.00 samples.],
2024-10-21 11:56:52,712 INFO [train.py:561] (1/4) Epoch 2132, batch 14, global_batch_idx: 34110, batch size: 142, loss[dur_loss=0.2126, prior_loss=0.9764, diff_loss=0.2973, tot_loss=1.486, over 142.00 samples.], tot_loss[dur_loss=0.2087, prior_loss=0.9765, diff_loss=0.3591, tot_loss=1.544, over 2210.00 samples.],
2024-10-21 11:56:54,146 INFO [train.py:682] (1/4) Start epoch 2133
2024-10-21 11:57:13,998 INFO [train.py:561] (1/4) Epoch 2133, batch 8, global_batch_idx: 34120, batch size: 170, loss[dur_loss=0.2134, prior_loss=0.9768, diff_loss=0.3042, tot_loss=1.494, over 170.00 samples.], tot_loss[dur_loss=0.2077, prior_loss=0.9762, diff_loss=0.3837, tot_loss=1.568, over 1432.00 samples.],
2024-10-21 11:57:24,162 INFO [train.py:682] (1/4) Start epoch 2134
2024-10-21 11:57:35,664 INFO [train.py:561] (1/4) Epoch 2134, batch 2, global_batch_idx: 34130, batch size: 203, loss[dur_loss=0.2105, prior_loss=0.9766, diff_loss=0.3627, tot_loss=1.55, over 203.00 samples.], tot_loss[dur_loss=0.2113, prior_loss=0.9769, diff_loss=0.3253, tot_loss=1.513, over 442.00 samples.],
2024-10-21 11:57:49,889 INFO [train.py:561] (1/4) Epoch 2134, batch 12, global_batch_idx: 34140, batch size: 152, loss[dur_loss=0.2082, prior_loss=0.9766, diff_loss=0.3, tot_loss=1.485, over 152.00 samples.], tot_loss[dur_loss=0.2074, prior_loss=0.9763, diff_loss=0.3728, tot_loss=1.557, over 1966.00 samples.],
2024-10-21 11:57:54,374 INFO [train.py:682] (1/4) Start epoch 2135
2024-10-21 11:58:11,200 INFO [train.py:561] (1/4) Epoch 2135, batch 6, global_batch_idx: 34150, batch size: 106, loss[dur_loss=0.2121, prior_loss=0.9769, diff_loss=0.2719, tot_loss=1.461, over 106.00 samples.], tot_loss[dur_loss=0.2063, prior_loss=0.976, diff_loss=0.4134, tot_loss=1.596, over 1142.00 samples.],
2024-10-21 11:58:24,251 INFO [train.py:682] (1/4) Start epoch 2136
2024-10-21 11:58:32,866 INFO [train.py:561] (1/4) Epoch 2136, batch 0, global_batch_idx: 34160, batch size: 108, loss[dur_loss=0.2171, prior_loss=0.9775, diff_loss=0.3083, tot_loss=1.503, over 108.00 samples.], tot_loss[dur_loss=0.2171, prior_loss=0.9775, diff_loss=0.3083, tot_loss=1.503, over 108.00 samples.],
2024-10-21 11:58:47,077 INFO [train.py:561] (1/4) Epoch 2136, batch 10, global_batch_idx: 34170, batch size: 111, loss[dur_loss=0.2109, prior_loss=0.9777, diff_loss=0.3025, tot_loss=1.491, over 111.00 samples.], tot_loss[dur_loss=0.2084, prior_loss=0.9765, diff_loss=0.3929, tot_loss=1.578, over 1656.00 samples.],
2024-10-21 11:58:54,297 INFO [train.py:682] (1/4) Start epoch 2137
2024-10-21 11:59:08,366 INFO [train.py:561] (1/4) Epoch 2137, batch 4, global_batch_idx: 34180, batch size: 189, loss[dur_loss=0.2115, prior_loss=0.9768, diff_loss=0.307, tot_loss=1.495, over 189.00 samples.], tot_loss[dur_loss=0.2064, prior_loss=0.9758, diff_loss=0.4122, tot_loss=1.594, over 937.00 samples.],
2024-10-21 11:59:23,297 INFO [train.py:561] (1/4) Epoch 2137, batch 14, global_batch_idx: 34190, batch size: 142, loss[dur_loss=0.2098, prior_loss=0.9764, diff_loss=0.2653, tot_loss=1.452, over 142.00 samples.], tot_loss[dur_loss=0.2094, prior_loss=0.9765, diff_loss=0.352, tot_loss=1.538, over 2210.00 samples.],
2024-10-21 11:59:24,731 INFO [train.py:682] (1/4) Start epoch 2138
2024-10-21 11:59:44,956 INFO [train.py:561] (1/4) Epoch 2138, batch 8, global_batch_idx: 34200, batch size: 170, loss[dur_loss=0.2133, prior_loss=0.977, diff_loss=0.3336, tot_loss=1.524, over 170.00 samples.], tot_loss[dur_loss=0.2068, prior_loss=0.9764, diff_loss=0.3848, tot_loss=1.568, over 1432.00 samples.],
2024-10-21 11:59:55,091 INFO [train.py:682] (1/4) Start epoch 2139
2024-10-21 12:00:06,279 INFO [train.py:561] (1/4) Epoch 2139, batch 2, global_batch_idx: 34210, batch size: 203, loss[dur_loss=0.2106, prior_loss=0.9771, diff_loss=0.3514, tot_loss=1.539, over 203.00 samples.], tot_loss[dur_loss=0.2117, prior_loss=0.977, diff_loss=0.3173, tot_loss=1.506, over 442.00 samples.],
2024-10-21 12:00:20,558 INFO [train.py:561] (1/4) Epoch 2139, batch 12, global_batch_idx: 34220, batch size: 152, loss[dur_loss=0.2086, prior_loss=0.9767, diff_loss=0.3048, tot_loss=1.49, over 152.00 samples.], tot_loss[dur_loss=0.2082, prior_loss=0.9765, diff_loss=0.3656, tot_loss=1.55, over 1966.00 samples.],
2024-10-21 12:00:25,090 INFO [train.py:682] (1/4) Start epoch 2140
2024-10-21 12:00:42,297 INFO [train.py:561] (1/4) Epoch 2140, batch 6, global_batch_idx: 34230, batch size: 106, loss[dur_loss=0.2103, prior_loss=0.9768, diff_loss=0.3281, tot_loss=1.515, over 106.00 samples.], tot_loss[dur_loss=0.2059, prior_loss=0.9761, diff_loss=0.3903, tot_loss=1.572, over 1142.00 samples.],
2024-10-21 12:00:55,448 INFO [train.py:682] (1/4) Start epoch 2141
2024-10-21 12:01:04,075 INFO [train.py:561] (1/4) Epoch 2141, batch 0, global_batch_idx: 34240, batch size: 108, loss[dur_loss=0.2172, prior_loss=0.9773, diff_loss=0.2943, tot_loss=1.489, over 108.00 samples.], tot_loss[dur_loss=0.2172, prior_loss=0.9773, diff_loss=0.2943, tot_loss=1.489, over 108.00 samples.],
2024-10-21 12:01:18,262 INFO [train.py:561] (1/4) Epoch 2141, batch 10, global_batch_idx: 34250, batch size: 111, loss[dur_loss=0.2089, prior_loss=0.9776, diff_loss=0.2846, tot_loss=1.471, over 111.00 samples.], tot_loss[dur_loss=0.2071, prior_loss=0.9763, diff_loss=0.3687, tot_loss=1.552, over 1656.00 samples.],
2024-10-21 12:01:25,352 INFO [train.py:682] (1/4) Start epoch 2142
2024-10-21 12:01:38,988 INFO [train.py:561] (1/4) Epoch 2142, batch 4, global_batch_idx: 34260, batch size: 189, loss[dur_loss=0.2095, prior_loss=0.9766, diff_loss=0.3135, tot_loss=1.5, over 189.00 samples.], tot_loss[dur_loss=0.2055, prior_loss=0.9757, diff_loss=0.4203, tot_loss=1.602, over 937.00 samples.],
2024-10-21 12:01:53,814 INFO [train.py:561] (1/4) Epoch 2142, batch 14, global_batch_idx: 34270, batch size: 142, loss[dur_loss=0.2073, prior_loss=0.9762, diff_loss=0.2954, tot_loss=1.479, over 142.00 samples.], tot_loss[dur_loss=0.208, prior_loss=0.9764, diff_loss=0.3557, tot_loss=1.54, over 2210.00 samples.],
2024-10-21 12:01:55,315 INFO [train.py:682] (1/4) Start epoch 2143
2024-10-21 12:02:15,282 INFO [train.py:561] (1/4) Epoch 2143, batch 8, global_batch_idx: 34280, batch size: 170, loss[dur_loss=0.2105, prior_loss=0.977, diff_loss=0.3312, tot_loss=1.519, over 170.00 samples.], tot_loss[dur_loss=0.2071, prior_loss=0.9762, diff_loss=0.397, tot_loss=1.58, over 1432.00 samples.],
2024-10-21 12:02:25,389 INFO [train.py:682] (1/4) Start epoch 2144
2024-10-21 12:02:36,929 INFO [train.py:561] (1/4) Epoch 2144, batch 2, global_batch_idx: 34290, batch size: 203, loss[dur_loss=0.2083, prior_loss=0.9767, diff_loss=0.359, tot_loss=1.544, over 203.00 samples.], tot_loss[dur_loss=0.21, prior_loss=0.9769, diff_loss=0.3311, tot_loss=1.518, over 442.00 samples.],
2024-10-21 12:02:51,128 INFO [train.py:561] (1/4) Epoch 2144, batch 12, global_batch_idx: 34300, batch size: 152, loss[dur_loss=0.2092, prior_loss=0.9768, diff_loss=0.3343, tot_loss=1.52, over 152.00 samples.], tot_loss[dur_loss=0.208, prior_loss=0.9765, diff_loss=0.3798, tot_loss=1.564, over 1966.00 samples.],
2024-10-21 12:02:55,597 INFO [train.py:682] (1/4) Start epoch 2145
2024-10-21 12:03:12,712 INFO [train.py:561] (1/4) Epoch 2145, batch 6, global_batch_idx: 34310, batch size: 106, loss[dur_loss=0.2107, prior_loss=0.9767, diff_loss=0.2679, tot_loss=1.455, over 106.00 samples.], tot_loss[dur_loss=0.2055, prior_loss=0.9761, diff_loss=0.4081, tot_loss=1.59, over 1142.00 samples.],
2024-10-21 12:03:25,822 INFO [train.py:682] (1/4) Start epoch 2146
2024-10-21 12:03:34,766 INFO [train.py:561] (1/4) Epoch 2146, batch 0, global_batch_idx: 34320, batch size: 108, loss[dur_loss=0.2135, prior_loss=0.9775, diff_loss=0.2372, tot_loss=1.428, over 108.00 samples.], tot_loss[dur_loss=0.2135, prior_loss=0.9775, diff_loss=0.2372, tot_loss=1.428, over 108.00 samples.],
2024-10-21 12:03:48,995 INFO [train.py:561] (1/4) Epoch 2146, batch 10, global_batch_idx: 34330, batch size: 111, loss[dur_loss=0.2084, prior_loss=0.9775, diff_loss=0.323, tot_loss=1.509, over 111.00 samples.], tot_loss[dur_loss=0.2083, prior_loss=0.9763, diff_loss=0.373, tot_loss=1.558, over 1656.00 samples.],
2024-10-21 12:03:56,126 INFO [train.py:682] (1/4) Start epoch 2147
2024-10-21 12:04:09,628 INFO [train.py:561] (1/4) Epoch 2147, batch 4, global_batch_idx: 34340, batch size: 189, loss[dur_loss=0.2071, prior_loss=0.9766, diff_loss=0.3478, tot_loss=1.531, over 189.00 samples.], tot_loss[dur_loss=0.205, prior_loss=0.9758, diff_loss=0.4299, tot_loss=1.611, over 937.00 samples.],
2024-10-21 12:04:24,514 INFO [train.py:561] (1/4) Epoch 2147, batch 14, global_batch_idx: 34350, batch size: 142, loss[dur_loss=0.2119, prior_loss=0.9763, diff_loss=0.2846, tot_loss=1.473, over 142.00 samples.], tot_loss[dur_loss=0.208, prior_loss=0.9763, diff_loss=0.3594, tot_loss=1.544, over 2210.00 samples.],
2024-10-21 12:04:25,936 INFO [train.py:682] (1/4) Start epoch 2148
2024-10-21 12:04:46,015 INFO [train.py:561] (1/4) Epoch 2148, batch 8, global_batch_idx: 34360, batch size: 170, loss[dur_loss=0.2093, prior_loss=0.9767, diff_loss=0.3242, tot_loss=1.51, over 170.00 samples.], tot_loss[dur_loss=0.2063, prior_loss=0.9761, diff_loss=0.3865, tot_loss=1.569, over 1432.00 samples.],
2024-10-21 12:04:56,154 INFO [train.py:682] (1/4) Start epoch 2149
2024-10-21 12:05:07,478 INFO [train.py:561] (1/4) Epoch 2149, batch 2, global_batch_idx: 34370, batch size: 203, loss[dur_loss=0.2125, prior_loss=0.9766, diff_loss=0.3307, tot_loss=1.52, over 203.00 samples.], tot_loss[dur_loss=0.2114, prior_loss=0.9767, diff_loss=0.3169, tot_loss=1.505, over 442.00 samples.],
2024-10-21 12:05:21,655 INFO [train.py:561] (1/4) Epoch 2149, batch 12, global_batch_idx: 34380, batch size: 152, loss[dur_loss=0.208, prior_loss=0.9766, diff_loss=0.3401, tot_loss=1.525, over 152.00 samples.], tot_loss[dur_loss=0.2077, prior_loss=0.9762, diff_loss=0.3633, tot_loss=1.547, over 1966.00 samples.],
2024-10-21 12:05:26,087 INFO [train.py:682] (1/4) Start epoch 2150
2024-10-21 12:05:43,024 INFO [train.py:561] (1/4) Epoch 2150, batch 6, global_batch_idx: 34390, batch size: 106, loss[dur_loss=0.2098, prior_loss=0.9767, diff_loss=0.3293, tot_loss=1.516, over 106.00 samples.], tot_loss[dur_loss=0.2055, prior_loss=0.9759, diff_loss=0.4024, tot_loss=1.584, over 1142.00 samples.],
2024-10-21 12:05:56,050 INFO [train.py:682] (1/4) Start epoch 2151
2024-10-21 12:06:04,590 INFO [train.py:561] (1/4) Epoch 2151, batch 0, global_batch_idx: 34400, batch size: 108, loss[dur_loss=0.2113, prior_loss=0.9773, diff_loss=0.262, tot_loss=1.451, over 108.00 samples.], tot_loss[dur_loss=0.2113, prior_loss=0.9773, diff_loss=0.262, tot_loss=1.451, over 108.00 samples.],
2024-10-21 12:06:18,694 INFO [train.py:561] (1/4) Epoch 2151, batch 10, global_batch_idx: 34410, batch size: 111, loss[dur_loss=0.2099, prior_loss=0.9774, diff_loss=0.3308, tot_loss=1.518, over 111.00 samples.], tot_loss[dur_loss=0.2065, prior_loss=0.9761, diff_loss=0.3746, tot_loss=1.557, over 1656.00 samples.],
2024-10-21 12:06:25,825 INFO [train.py:682] (1/4) Start epoch 2152
2024-10-21 12:06:39,607 INFO [train.py:561] (1/4) Epoch 2152, batch 4, global_batch_idx: 34420, batch size: 189, loss[dur_loss=0.2082, prior_loss=0.9767, diff_loss=0.3524, tot_loss=1.537, over 189.00 samples.], tot_loss[dur_loss=0.205, prior_loss=0.9758, diff_loss=0.4347, tot_loss=1.615, over 937.00 samples.],
2024-10-21 12:06:54,401 INFO [train.py:561] (1/4) Epoch 2152, batch 14, global_batch_idx: 34430, batch size: 142, loss[dur_loss=0.2087, prior_loss=0.9762, diff_loss=0.3252, tot_loss=1.51, over 142.00 samples.], tot_loss[dur_loss=0.2082, prior_loss=0.9763, diff_loss=0.3601, tot_loss=1.545, over 2210.00 samples.],
2024-10-21 12:06:55,900 INFO [train.py:682] (1/4) Start epoch 2153
2024-10-21 12:07:15,941 INFO [train.py:561] (1/4) Epoch 2153, batch 8, global_batch_idx: 34440, batch size: 170, loss[dur_loss=0.2097, prior_loss=0.9766, diff_loss=0.3476, tot_loss=1.534, over 170.00 samples.], tot_loss[dur_loss=0.206, prior_loss=0.9761, diff_loss=0.3959, tot_loss=1.578, over 1432.00 samples.],
2024-10-21 12:07:26,035 INFO [train.py:682] (1/4) Start epoch 2154
2024-10-21 12:07:37,312 INFO [train.py:561] (1/4) Epoch 2154, batch 2, global_batch_idx: 34450, batch size: 203, loss[dur_loss=0.2098, prior_loss=0.9766, diff_loss=0.3411, tot_loss=1.528, over 203.00 samples.], tot_loss[dur_loss=0.2114, prior_loss=0.9768, diff_loss=0.316, tot_loss=1.504, over 442.00 samples.],
2024-10-21 12:07:51,542 INFO [train.py:561] (1/4) Epoch 2154, batch 12, global_batch_idx: 34460, batch size: 152, loss[dur_loss=0.2079, prior_loss=0.9767, diff_loss=0.3186, tot_loss=1.503, over 152.00 samples.], tot_loss[dur_loss=0.2084, prior_loss=0.9763, diff_loss=0.3635, tot_loss=1.548, over 1966.00 samples.],
2024-10-21 12:07:55,999 INFO [train.py:682] (1/4) Start epoch 2155
2024-10-21 12:08:13,100 INFO [train.py:561] (1/4) Epoch 2155, batch 6, global_batch_idx: 34470, batch size: 106, loss[dur_loss=0.2099, prior_loss=0.9768, diff_loss=0.297, tot_loss=1.484, over 106.00 samples.], tot_loss[dur_loss=0.2046, prior_loss=0.9759, diff_loss=0.4077, tot_loss=1.588, over 1142.00 samples.],
2024-10-21 12:08:26,168 INFO [train.py:682] (1/4) Start epoch 2156
2024-10-21 12:08:34,972 INFO [train.py:561] (1/4) Epoch 2156, batch 0, global_batch_idx: 34480, batch size: 108, loss[dur_loss=0.2135, prior_loss=0.9774, diff_loss=0.3375, tot_loss=1.528, over 108.00 samples.], tot_loss[dur_loss=0.2135, prior_loss=0.9774, diff_loss=0.3375, tot_loss=1.528, over 108.00 samples.],
2024-10-21 12:08:49,117 INFO [train.py:561] (1/4) Epoch 2156, batch 10, global_batch_idx: 34490, batch size: 111, loss[dur_loss=0.2123, prior_loss=0.9777, diff_loss=0.3251, tot_loss=1.515, over 111.00 samples.], tot_loss[dur_loss=0.2085, prior_loss=0.9763, diff_loss=0.3798, tot_loss=1.565, over 1656.00 samples.],
2024-10-21 12:08:56,159 INFO [train.py:682] (1/4) Start epoch 2157
2024-10-21 12:09:09,636 INFO [train.py:561] (1/4) Epoch 2157, batch 4, global_batch_idx: 34500, batch size: 189, loss[dur_loss=0.2084, prior_loss=0.9768, diff_loss=0.3689, tot_loss=1.554, over 189.00 samples.], tot_loss[dur_loss=0.2061, prior_loss=0.9758, diff_loss=0.4368, tot_loss=1.619, over 937.00 samples.],
2024-10-21 12:09:11,272 INFO [train.py:579] (1/4) Computing validation loss
2024-10-21 12:09:43,954 INFO [train.py:589] (1/4) Epoch 2157, validation: dur_loss=0.4579, prior_loss=1.035, diff_loss=0.3858, tot_loss=1.879, over 100.00 samples.
2024-10-21 12:09:43,956 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB
2024-10-21 12:09:57,221 INFO [train.py:561] (1/4) Epoch 2157, batch 14, global_batch_idx: 34510, batch size: 142, loss[dur_loss=0.2107, prior_loss=0.9763, diff_loss=0.3224, tot_loss=1.509, over 142.00 samples.], tot_loss[dur_loss=0.2085, prior_loss=0.9764, diff_loss=0.3638, tot_loss=1.549, over 2210.00 samples.],
2024-10-21 12:09:58,647 INFO [train.py:682] (1/4) Start epoch 2158
2024-10-21 12:10:18,470 INFO [train.py:561] (1/4) Epoch 2158, batch 8, global_batch_idx: 34520, batch size: 170, loss[dur_loss=0.2128, prior_loss=0.9764, diff_loss=0.3179, tot_loss=1.507, over 170.00 samples.], tot_loss[dur_loss=0.2068, prior_loss=0.9761, diff_loss=0.3848, tot_loss=1.568, over 1432.00 samples.],
2024-10-21 12:10:28,537 INFO [train.py:682] (1/4) Start epoch 2159
2024-10-21 12:10:40,100 INFO [train.py:561] (1/4) Epoch 2159, batch 2, global_batch_idx: 34530, batch size: 203, loss[dur_loss=0.2128, prior_loss=0.9765, diff_loss=0.321, tot_loss=1.51, over 203.00 samples.], tot_loss[dur_loss=0.2119, prior_loss=0.9767, diff_loss=0.3147, tot_loss=1.503, over 442.00 samples.],
2024-10-21 12:10:54,247 INFO [train.py:561] (1/4) Epoch 2159, batch 12, global_batch_idx: 34540, batch size: 152, loss[dur_loss=0.2063, prior_loss=0.9765, diff_loss=0.3341, tot_loss=1.517, over 152.00 samples.], tot_loss[dur_loss=0.2074, prior_loss=0.9762, diff_loss=0.3638, tot_loss=1.547, over 1966.00 samples.],
2024-10-21 12:10:58,663 INFO [train.py:682] (1/4) Start epoch 2160
2024-10-21 12:11:15,735 INFO [train.py:561] (1/4) Epoch 2160, batch 6, global_batch_idx: 34550, batch size: 106, loss[dur_loss=0.2104, prior_loss=0.9767, diff_loss=0.2702, tot_loss=1.457, over 106.00 samples.], tot_loss[dur_loss=0.2058, prior_loss=0.976, diff_loss=0.4046, tot_loss=1.586, over 1142.00 samples.],
2024-10-21 12:11:28,685 INFO [train.py:682] (1/4) Start epoch 2161
2024-10-21 12:11:37,327 INFO [train.py:561] (1/4) Epoch 2161, batch 0, global_batch_idx: 34560, batch size: 108, loss[dur_loss=0.2154, prior_loss=0.9776, diff_loss=0.2858, tot_loss=1.479, over 108.00 samples.], tot_loss[dur_loss=0.2154, prior_loss=0.9776, diff_loss=0.2858, tot_loss=1.479, over 108.00 samples.],
2024-10-21 12:11:51,413 INFO [train.py:561] (1/4) Epoch 2161, batch 10, global_batch_idx: 34570, batch size: 111, loss[dur_loss=0.209, prior_loss=0.9776, diff_loss=0.3172, tot_loss=1.504, over 111.00 samples.], tot_loss[dur_loss=0.2074, prior_loss=0.9762, diff_loss=0.3718, tot_loss=1.555, over 1656.00 samples.],
2024-10-21 12:11:58,416 INFO [train.py:682] (1/4) Start epoch 2162
2024-10-21 12:12:12,183 INFO [train.py:561] (1/4) Epoch 2162, batch 4, global_batch_idx: 34580, batch size: 189, loss[dur_loss=0.2102, prior_loss=0.9766, diff_loss=0.317, tot_loss=1.504, over 189.00 samples.], tot_loss[dur_loss=0.205, prior_loss=0.9757, diff_loss=0.4283, tot_loss=1.609, over 937.00 samples.],
2024-10-21 12:12:26,925 INFO [train.py:561] (1/4) Epoch 2162, batch 14, global_batch_idx: 34590, batch size: 142, loss[dur_loss=0.2084, prior_loss=0.9765, diff_loss=0.3191, tot_loss=1.504, over 142.00 samples.], tot_loss[dur_loss=0.207, prior_loss=0.9763, diff_loss=0.3616, tot_loss=1.545, over 2210.00 samples.],
2024-10-21 12:12:28,329 INFO [train.py:682] (1/4) Start epoch 2163
2024-10-21 12:12:48,065 INFO [train.py:561] (1/4) Epoch 2163, batch 8, global_batch_idx: 34600, batch size: 170, loss[dur_loss=0.2107, prior_loss=0.9766, diff_loss=0.3017, tot_loss=1.489, over 170.00 samples.], tot_loss[dur_loss=0.2067, prior_loss=0.976, diff_loss=0.3844, tot_loss=1.567, over 1432.00 samples.],
2024-10-21 12:12:58,103 INFO [train.py:682] (1/4) Start epoch 2164
2024-10-21 12:13:09,228 INFO [train.py:561] (1/4) Epoch 2164, batch 2, global_batch_idx: 34610, batch size: 203, loss[dur_loss=0.2087, prior_loss=0.9768, diff_loss=0.3262, tot_loss=1.512, over 203.00 samples.], tot_loss[dur_loss=0.2089, prior_loss=0.9768, diff_loss=0.309, tot_loss=1.495, over 442.00 samples.],
2024-10-21 12:13:23,338 INFO [train.py:561] (1/4) Epoch 2164, batch 12, global_batch_idx: 34620, batch size: 152, loss[dur_loss=0.2072, prior_loss=0.9766, diff_loss=0.3106, tot_loss=1.495, over 152.00 samples.], tot_loss[dur_loss=0.207, prior_loss=0.9763, diff_loss=0.3686, tot_loss=1.552, over 1966.00 samples.],
2024-10-21 12:13:27,741 INFO [train.py:682] (1/4) Start epoch 2165
2024-10-21 12:13:44,718 INFO [train.py:561] (1/4) Epoch 2165, batch 6, global_batch_idx: 34630, batch size: 106, loss[dur_loss=0.2086, prior_loss=0.9766, diff_loss=0.297, tot_loss=1.482, over 106.00 samples.], tot_loss[dur_loss=0.205, prior_loss=0.9759, diff_loss=0.4257, tot_loss=1.607, over 1142.00 samples.],
2024-10-21 12:13:57,670 INFO [train.py:682] (1/4) Start epoch 2166
2024-10-21 12:14:06,362 INFO [train.py:561] (1/4) Epoch 2166, batch 0, global_batch_idx: 34640, batch size: 108, loss[dur_loss=0.2106, prior_loss=0.977, diff_loss=0.2791, tot_loss=1.467, over 108.00 samples.], tot_loss[dur_loss=0.2106, prior_loss=0.977, diff_loss=0.2791, tot_loss=1.467, over 108.00 samples.],
2024-10-21 12:14:20,452 INFO [train.py:561] (1/4) Epoch 2166, batch 10, global_batch_idx: 34650, batch size: 111, loss[dur_loss=0.2096, prior_loss=0.9775, diff_loss=0.2982, tot_loss=1.485, over 111.00 samples.], tot_loss[dur_loss=0.2061, prior_loss=0.9761, diff_loss=0.3753, tot_loss=1.557, over 1656.00 samples.],
2024-10-21 12:14:27,504 INFO [train.py:682] (1/4) Start epoch 2167
2024-10-21 12:14:41,015 INFO [train.py:561] (1/4) Epoch 2167, batch 4, global_batch_idx: 34660, batch size: 189, loss[dur_loss=0.2073, prior_loss=0.9765, diff_loss=0.319, tot_loss=1.503, over 189.00 samples.], tot_loss[dur_loss=0.2036, prior_loss=0.9757, diff_loss=0.4199, tot_loss=1.599, over 937.00 samples.],
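The epoch 2157 block above also captures one of the periodic validation passes (dur_loss=0.4579, prior_loss=1.035, diff_loss=0.3858, tot_loss=1.879 over 100 samples; the same sum identity holds, 0.4579 + 1.035 + 0.3858 = 1.8787) together with the running peak-memory report. To recover the training curve from a log in this one-record-per-line format, a small parser is enough; the file name and function name below are illustrative, not part of the recipe:

    import re

    # One training record per line is assumed; validation and
    # "Start epoch" lines carry no global_batch_idx and are skipped.
    BATCH_RE = re.compile(
        r"global_batch_idx: (\d+).*?tot_loss\[.*?tot_loss=([\d.]+)"
    )

    def training_curve(path="train.log"):
        points = []
        with open(path) as f:
            for line in f:
                m = BATCH_RE.search(line)
                if m:
                    points.append((int(m.group(1)), float(m.group(2))))
        return points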
2024-10-21 12:14:55,697 INFO [train.py:561] (1/4) Epoch 2167, batch 14, global_batch_idx: 34670, batch size: 142, loss[dur_loss=0.2076, prior_loss=0.9762, diff_loss=0.3041, tot_loss=1.488, over 142.00 samples.], tot_loss[dur_loss=0.2072, prior_loss=0.9762, diff_loss=0.3529, tot_loss=1.536, over 2210.00 samples.],
2024-10-21 12:14:57,101 INFO [train.py:682] (1/4) Start epoch 2168
2024-10-21 12:15:16,726 INFO [train.py:561] (1/4) Epoch 2168, batch 8, global_batch_idx: 34680, batch size: 170, loss[dur_loss=0.2105, prior_loss=0.9765, diff_loss=0.3238, tot_loss=1.511, over 170.00 samples.], tot_loss[dur_loss=0.2054, prior_loss=0.9761, diff_loss=0.3955, tot_loss=1.577, over 1432.00 samples.],
2024-10-21 12:15:26,812 INFO [train.py:682] (1/4) Start epoch 2169
2024-10-21 12:15:38,307 INFO [train.py:561] (1/4) Epoch 2169, batch 2, global_batch_idx: 34690, batch size: 203, loss[dur_loss=0.2079, prior_loss=0.9766, diff_loss=0.347, tot_loss=1.532, over 203.00 samples.], tot_loss[dur_loss=0.2104, prior_loss=0.9767, diff_loss=0.3205, tot_loss=1.508, over 442.00 samples.],
2024-10-21 12:15:52,529 INFO [train.py:561] (1/4) Epoch 2169, batch 12, global_batch_idx: 34700, batch size: 152, loss[dur_loss=0.2092, prior_loss=0.9767, diff_loss=0.3143, tot_loss=1.5, over 152.00 samples.], tot_loss[dur_loss=0.2067, prior_loss=0.9763, diff_loss=0.3745, tot_loss=1.557, over 1966.00 samples.],
2024-10-21 12:15:57,020 INFO [train.py:682] (1/4) Start epoch 2170
2024-10-21 12:16:13,785 INFO [train.py:561] (1/4) Epoch 2170, batch 6, global_batch_idx: 34710, batch size: 106, loss[dur_loss=0.2113, prior_loss=0.9767, diff_loss=0.2868, tot_loss=1.475, over 106.00 samples.], tot_loss[dur_loss=0.2063, prior_loss=0.9758, diff_loss=0.4045, tot_loss=1.587, over 1142.00 samples.],
2024-10-21 12:16:26,798 INFO [train.py:682] (1/4) Start epoch 2171
2024-10-21 12:16:35,631 INFO [train.py:561] (1/4) Epoch 2171, batch 0, global_batch_idx: 34720, batch size: 108, loss[dur_loss=0.215, prior_loss=0.9774, diff_loss=0.3311, tot_loss=1.523, over 108.00 samples.], tot_loss[dur_loss=0.215, prior_loss=0.9774, diff_loss=0.3311, tot_loss=1.523, over 108.00 samples.],
2024-10-21 12:16:49,725 INFO [train.py:561] (1/4) Epoch 2171, batch 10, global_batch_idx: 34730, batch size: 111, loss[dur_loss=0.212, prior_loss=0.9774, diff_loss=0.3193, tot_loss=1.509, over 111.00 samples.], tot_loss[dur_loss=0.2056, prior_loss=0.976, diff_loss=0.3704, tot_loss=1.552, over 1656.00 samples.],
2024-10-21 12:16:56,746 INFO [train.py:682] (1/4) Start epoch 2172
2024-10-21 12:17:10,292 INFO [train.py:561] (1/4) Epoch 2172, batch 4, global_batch_idx: 34740, batch size: 189, loss[dur_loss=0.2072, prior_loss=0.9765, diff_loss=0.3197, tot_loss=1.503, over 189.00 samples.], tot_loss[dur_loss=0.205, prior_loss=0.9756, diff_loss=0.4263, tot_loss=1.607, over 937.00 samples.],
2024-10-21 12:17:25,068 INFO [train.py:561] (1/4) Epoch 2172, batch 14, global_batch_idx: 34750, batch size: 142, loss[dur_loss=0.2092, prior_loss=0.9764, diff_loss=0.339, tot_loss=1.525, over 142.00 samples.], tot_loss[dur_loss=0.2074, prior_loss=0.9761, diff_loss=0.3604, tot_loss=1.544, over 2210.00 samples.],
2024-10-21 12:17:26,484 INFO [train.py:682] (1/4) Start epoch 2173
2024-10-21 12:17:46,443 INFO [train.py:561] (1/4) Epoch 2173, batch 8, global_batch_idx: 34760, batch size: 170, loss[dur_loss=0.2105, prior_loss=0.9766, diff_loss=0.3306, tot_loss=1.518, over 170.00 samples.], tot_loss[dur_loss=0.2067, prior_loss=0.9761, diff_loss=0.3916, tot_loss=1.574, over 1432.00 samples.],
2024-10-21 12:17:56,540 INFO [train.py:682] (1/4) Start epoch 2174
2024-10-21 12:18:07,822 INFO [train.py:561] (1/4) Epoch 2174, batch 2, global_batch_idx: 34770, batch size: 203, loss[dur_loss=0.208, prior_loss=0.9764, diff_loss=0.3506, tot_loss=1.535, over 203.00 samples.], tot_loss[dur_loss=0.2086, prior_loss=0.9765, diff_loss=0.3128, tot_loss=1.498, over 442.00 samples.],
2024-10-21 12:18:21,898 INFO [train.py:561] (1/4) Epoch 2174, batch 12, global_batch_idx: 34780, batch size: 152, loss[dur_loss=0.2061, prior_loss=0.9766, diff_loss=0.3166, tot_loss=1.499, over 152.00 samples.], tot_loss[dur_loss=0.2066, prior_loss=0.9761, diff_loss=0.3673, tot_loss=1.55, over 1966.00 samples.],
2024-10-21 12:18:26,296 INFO [train.py:682] (1/4) Start epoch 2175
2024-10-21 12:18:42,960 INFO [train.py:561] (1/4) Epoch 2175, batch 6, global_batch_idx: 34790, batch size: 106, loss[dur_loss=0.2108, prior_loss=0.9767, diff_loss=0.2984, tot_loss=1.486, over 106.00 samples.], tot_loss[dur_loss=0.2052, prior_loss=0.9758, diff_loss=0.4094, tot_loss=1.59, over 1142.00 samples.],
2024-10-21 12:18:55,840 INFO [train.py:682] (1/4) Start epoch 2176
2024-10-21 12:19:04,618 INFO [train.py:561] (1/4) Epoch 2176, batch 0, global_batch_idx: 34800, batch size: 108, loss[dur_loss=0.2122, prior_loss=0.977, diff_loss=0.2859, tot_loss=1.475, over 108.00 samples.], tot_loss[dur_loss=0.2122, prior_loss=0.977, diff_loss=0.2859, tot_loss=1.475, over 108.00 samples.],
2024-10-21 12:19:18,804 INFO [train.py:561] (1/4) Epoch 2176, batch 10, global_batch_idx: 34810, batch size: 111, loss[dur_loss=0.208, prior_loss=0.9775, diff_loss=0.292, tot_loss=1.477, over 111.00 samples.], tot_loss[dur_loss=0.2062, prior_loss=0.9761, diff_loss=0.3731, tot_loss=1.555, over 1656.00 samples.],
2024-10-21 12:19:25,842 INFO [train.py:682] (1/4) Start epoch 2177
2024-10-21 12:19:39,513 INFO [train.py:561] (1/4) Epoch 2177, batch 4, global_batch_idx: 34820, batch size: 189, loss[dur_loss=0.212, prior_loss=0.9766, diff_loss=0.3178, tot_loss=1.506, over 189.00 samples.], tot_loss[dur_loss=0.2036, prior_loss=0.9755, diff_loss=0.4255, tot_loss=1.605, over 937.00 samples.],
2024-10-21 12:19:54,264 INFO [train.py:561] (1/4) Epoch 2177, batch 14, global_batch_idx: 34830, batch size: 142, loss[dur_loss=0.2094, prior_loss=0.9762, diff_loss=0.3319, tot_loss=1.517, over 142.00 samples.], tot_loss[dur_loss=0.2073, prior_loss=0.9762, diff_loss=0.3624, tot_loss=1.546, over 2210.00 samples.],
2024-10-21 12:19:55,677 INFO [train.py:682] (1/4) Start epoch 2178
2024-10-21 12:20:15,882 INFO [train.py:561] (1/4) Epoch 2178, batch 8, global_batch_idx: 34840, batch size: 170, loss[dur_loss=0.2162, prior_loss=0.9768, diff_loss=0.2886, tot_loss=1.482, over 170.00 samples.], tot_loss[dur_loss=0.2074, prior_loss=0.9761, diff_loss=0.3839, tot_loss=1.567, over 1432.00 samples.],
2024-10-21 12:20:25,965 INFO [train.py:682] (1/4) Start epoch 2179
2024-10-21 12:20:37,698 INFO [train.py:561] (1/4) Epoch 2179, batch 2, global_batch_idx: 34850, batch size: 203, loss[dur_loss=0.2079, prior_loss=0.9762, diff_loss=0.3183, tot_loss=1.502, over 203.00 samples.], tot_loss[dur_loss=0.2097, prior_loss=0.9765, diff_loss=0.3054, tot_loss=1.492, over 442.00 samples.],
2024-10-21 12:20:52,132 INFO [train.py:561] (1/4) Epoch 2179, batch 12, global_batch_idx: 34860, batch size: 152, loss[dur_loss=0.2092, prior_loss=0.9764, diff_loss=0.2725, tot_loss=1.458, over 152.00 samples.], tot_loss[dur_loss=0.2065, prior_loss=0.9761, diff_loss=0.3656, tot_loss=1.548, over 1966.00 samples.],
2024-10-21 12:20:56,587 INFO [train.py:682] (1/4) Start epoch 2180
2024-10-21 12:21:13,483 INFO [train.py:561] (1/4) Epoch 2180, batch 6, global_batch_idx: 34870, batch size: 106, loss[dur_loss=0.2108, prior_loss=0.9765, diff_loss=0.2877, tot_loss=1.475, over 106.00 samples.], tot_loss[dur_loss=0.2048, prior_loss=0.9757, diff_loss=0.4, tot_loss=1.581, over 1142.00 samples.],
2024-10-21 12:21:26,452 INFO [train.py:682] (1/4) Start epoch 2181
2024-10-21 12:21:35,260 INFO [train.py:561] (1/4) Epoch 2181, batch 0, global_batch_idx: 34880, batch size: 108, loss[dur_loss=0.2119, prior_loss=0.9771, diff_loss=0.3462, tot_loss=1.535, over 108.00 samples.], tot_loss[dur_loss=0.2119, prior_loss=0.9771, diff_loss=0.3462, tot_loss=1.535, over 108.00 samples.],
2024-10-21 12:21:49,472 INFO [train.py:561] (1/4) Epoch 2181, batch 10, global_batch_idx: 34890, batch size: 111, loss[dur_loss=0.2095, prior_loss=0.9772, diff_loss=0.2733, tot_loss=1.46, over 111.00 samples.], tot_loss[dur_loss=0.2063, prior_loss=0.976, diff_loss=0.3707, tot_loss=1.553, over 1656.00 samples.],
2024-10-21 12:21:56,598 INFO [train.py:682] (1/4) Start epoch 2182
2024-10-21 12:22:10,284 INFO [train.py:561] (1/4) Epoch 2182, batch 4, global_batch_idx: 34900, batch size: 189, loss[dur_loss=0.2057, prior_loss=0.9765, diff_loss=0.3386, tot_loss=1.521, over 189.00 samples.], tot_loss[dur_loss=0.2031, prior_loss=0.9756, diff_loss=0.4421, tot_loss=1.621, over 937.00 samples.],
2024-10-21 12:22:25,094 INFO [train.py:561] (1/4) Epoch 2182, batch 14, global_batch_idx: 34910, batch size: 142, loss[dur_loss=0.2146, prior_loss=0.9762, diff_loss=0.3283, tot_loss=1.519, over 142.00 samples.], tot_loss[dur_loss=0.2072, prior_loss=0.9762, diff_loss=0.3739, tot_loss=1.557, over 2210.00 samples.],
2024-10-21 12:22:26,511 INFO [train.py:682] (1/4) Start epoch 2183
2024-10-21 12:22:46,233 INFO [train.py:561] (1/4) Epoch 2183, batch 8, global_batch_idx: 34920, batch size: 170, loss[dur_loss=0.212, prior_loss=0.9767, diff_loss=0.3033, tot_loss=1.492, over 170.00 samples.], tot_loss[dur_loss=0.2073, prior_loss=0.976, diff_loss=0.3821, tot_loss=1.565, over 1432.00 samples.],
2024-10-21 12:22:56,299 INFO [train.py:682] (1/4) Start epoch 2184
2024-10-21 12:23:08,061 INFO [train.py:561] (1/4) Epoch 2184, batch 2, global_batch_idx: 34930, batch size: 203, loss[dur_loss=0.2058, prior_loss=0.9763, diff_loss=0.3457, tot_loss=1.528, over 203.00 samples.], tot_loss[dur_loss=0.2084, prior_loss=0.9765, diff_loss=0.3268, tot_loss=1.512, over 442.00 samples.],
2024-10-21 12:23:22,337 INFO [train.py:561] (1/4) Epoch 2184, batch 12, global_batch_idx: 34940, batch size: 152, loss[dur_loss=0.2047, prior_loss=0.9764, diff_loss=0.3154, tot_loss=1.497, over 152.00 samples.], tot_loss[dur_loss=0.2064, prior_loss=0.9761, diff_loss=0.3687, tot_loss=1.551, over 1966.00 samples.],
2024-10-21 12:23:26,780 INFO [train.py:682] (1/4) Start epoch 2185
2024-10-21 12:23:43,626 INFO [train.py:561] (1/4) Epoch 2185, batch 6, global_batch_idx: 34950, batch size: 106, loss[dur_loss=0.2076, prior_loss=0.9764, diff_loss=0.2809, tot_loss=1.465, over 106.00 samples.], tot_loss[dur_loss=0.2044, prior_loss=0.9757, diff_loss=0.4, tot_loss=1.58, over 1142.00 samples.],
2024-10-21 12:23:56,565 INFO [train.py:682] (1/4) Start epoch 2186
2024-10-21 12:24:05,074 INFO [train.py:561] (1/4) Epoch 2186, batch 0, global_batch_idx: 34960, batch size: 108, loss[dur_loss=0.2134, prior_loss=0.9771, diff_loss=0.3042, tot_loss=1.495, over 108.00 samples.], tot_loss[dur_loss=0.2134, prior_loss=0.9771, diff_loss=0.3042, tot_loss=1.495, over 108.00 samples.],
2024-10-21 12:24:19,273 INFO [train.py:561] (1/4) Epoch 2186, batch 10, global_batch_idx: 34970, batch size: 111, loss[dur_loss=0.2097, prior_loss=0.9773, diff_loss=0.299, tot_loss=1.486, over 111.00 samples.], tot_loss[dur_loss=0.2065, prior_loss=0.9759, diff_loss=0.3611, tot_loss=1.544, over 1656.00 samples.],
2024-10-21 12:24:26,365 INFO [train.py:682] (1/4) Start epoch 2187
2024-10-21 12:24:40,503 INFO [train.py:561] (1/4) Epoch 2187, batch 4, global_batch_idx: 34980, batch size: 189, loss[dur_loss=0.2036, prior_loss=0.9764, diff_loss=0.3489, tot_loss=1.529, over 189.00 samples.], tot_loss[dur_loss=0.2021, prior_loss=0.9755, diff_loss=0.4278, tot_loss=1.605, over 937.00 samples.],
2024-10-21 12:24:55,306 INFO [train.py:561] (1/4) Epoch 2187, batch 14, global_batch_idx: 34990, batch size: 142, loss[dur_loss=0.2093, prior_loss=0.9759, diff_loss=0.304, tot_loss=1.489, over 142.00 samples.], tot_loss[dur_loss=0.2065, prior_loss=0.9761, diff_loss=0.3577, tot_loss=1.54, over 2210.00 samples.],
2024-10-21 12:24:56,724 INFO [train.py:682] (1/4) Start epoch 2188
2024-10-21 12:25:16,508 INFO [train.py:561] (1/4) Epoch 2188, batch 8, global_batch_idx: 35000, batch size: 170, loss[dur_loss=0.21, prior_loss=0.9765, diff_loss=0.3286, tot_loss=1.515, over 170.00 samples.], tot_loss[dur_loss=0.2059, prior_loss=0.9759, diff_loss=0.3965, tot_loss=1.578, over 1432.00 samples.],
2024-10-21 12:25:26,553 INFO [train.py:682] (1/4) Start epoch 2189
2024-10-21 12:25:38,261 INFO [train.py:561] (1/4) Epoch 2189, batch 2, global_batch_idx: 35010, batch size: 203, loss[dur_loss=0.208, prior_loss=0.9762, diff_loss=0.3611, tot_loss=1.545, over 203.00 samples.], tot_loss[dur_loss=0.2087, prior_loss=0.9763, diff_loss=0.3316, tot_loss=1.517, over 442.00 samples.],
2024-10-21 12:25:52,447 INFO [train.py:561] (1/4) Epoch 2189, batch 12, global_batch_idx: 35020, batch size: 152, loss[dur_loss=0.2044, prior_loss=0.9762, diff_loss=0.301, tot_loss=1.482, over 152.00 samples.], tot_loss[dur_loss=0.2061, prior_loss=0.976, diff_loss=0.3752, tot_loss=1.557, over 1966.00 samples.],
2024-10-21 12:25:56,854 INFO [train.py:682] (1/4) Start epoch 2190
2024-10-21 12:26:13,691 INFO [train.py:561] (1/4) Epoch 2190, batch 6, global_batch_idx: 35030, batch size: 106, loss[dur_loss=0.2084, prior_loss=0.9764, diff_loss=0.2999, tot_loss=1.485, over 106.00 samples.], tot_loss[dur_loss=0.2047, prior_loss=0.9757, diff_loss=0.4168, tot_loss=1.597, over 1142.00 samples.],
2024-10-21 12:26:26,686 INFO [train.py:682] (1/4) Start epoch 2191
2024-10-21 12:26:35,334 INFO [train.py:561] (1/4) Epoch 2191, batch 0, global_batch_idx: 35040, batch size: 108, loss[dur_loss=0.2127, prior_loss=0.977, diff_loss=0.2988, tot_loss=1.489, over 108.00 samples.], tot_loss[dur_loss=0.2127, prior_loss=0.977, diff_loss=0.2988, tot_loss=1.489, over 108.00 samples.],
2024-10-21 12:26:49,408 INFO [train.py:561] (1/4) Epoch 2191, batch 10, global_batch_idx: 35050, batch size: 111, loss[dur_loss=0.2101, prior_loss=0.9773, diff_loss=0.2826, tot_loss=1.47, over 111.00 samples.], tot_loss[dur_loss=0.2062, prior_loss=0.9761, diff_loss=0.3869, tot_loss=1.569, over 1656.00 samples.],
2024-10-21 12:26:56,429 INFO [train.py:682] (1/4) Start epoch 2192
2024-10-21 12:27:10,155 INFO [train.py:561] (1/4) Epoch 2192, batch 4, global_batch_idx: 35060, batch size: 189, loss[dur_loss=0.207, prior_loss=0.9762, diff_loss=0.3833, tot_loss=1.567, over 189.00 samples.], tot_loss[dur_loss=0.2046, prior_loss=0.9755, diff_loss=0.4412, tot_loss=1.621, over 937.00 samples.],
2024-10-21 12:27:24,908 INFO [train.py:561] (1/4) Epoch 2192, batch 14, global_batch_idx: 35070, batch size: 142, loss[dur_loss=0.2119, prior_loss=0.9761, diff_loss=0.3068, tot_loss=1.495, over 142.00 samples.], tot_loss[dur_loss=0.2073, prior_loss=0.9761, diff_loss=0.3729, tot_loss=1.556, over 2210.00 samples.],
2024-10-21 12:27:26,324 INFO [train.py:682] (1/4) Start epoch 2193
2024-10-21 12:27:46,178 INFO [train.py:561] (1/4) Epoch 2193, batch 8, global_batch_idx: 35080, batch size: 170, loss[dur_loss=0.2094, prior_loss=0.9764, diff_loss=0.329, tot_loss=1.515, over 170.00 samples.], tot_loss[dur_loss=0.2054, prior_loss=0.9759, diff_loss=0.389, tot_loss=1.57, over 1432.00 samples.],
2024-10-21 12:27:56,256 INFO [train.py:682] (1/4) Start epoch 2194
2024-10-21 12:28:07,363 INFO [train.py:561] (1/4) Epoch 2194, batch 2, global_batch_idx: 35090, batch size: 203, loss[dur_loss=0.2084, prior_loss=0.9765, diff_loss=0.3293, tot_loss=1.514, over 203.00 samples.], tot_loss[dur_loss=0.2087, prior_loss=0.9765, diff_loss=0.3191, tot_loss=1.504, over 442.00 samples.],
2024-10-21 12:28:21,477 INFO [train.py:561] (1/4) Epoch 2194, batch 12, global_batch_idx: 35100, batch size: 152, loss[dur_loss=0.2069, prior_loss=0.9762, diff_loss=0.3296, tot_loss=1.513, over 152.00 samples.], tot_loss[dur_loss=0.2059, prior_loss=0.976, diff_loss=0.3683, tot_loss=1.55, over 1966.00 samples.],
2024-10-21 12:28:25,930 INFO [train.py:682] (1/4) Start epoch 2195
2024-10-21 12:28:43,495 INFO [train.py:561] (1/4) Epoch 2195, batch 6, global_batch_idx: 35110, batch size: 106, loss[dur_loss=0.2082, prior_loss=0.9764, diff_loss=0.3252, tot_loss=1.51, over 106.00 samples.], tot_loss[dur_loss=0.2045, prior_loss=0.9757, diff_loss=0.412, tot_loss=1.592, over 1142.00 samples.],
2024-10-21 12:28:56,504 INFO [train.py:682] (1/4) Start epoch 2196
2024-10-21 12:29:05,320 INFO [train.py:561] (1/4) Epoch 2196, batch 0, global_batch_idx: 35120, batch size: 108, loss[dur_loss=0.2113, prior_loss=0.9769, diff_loss=0.2976, tot_loss=1.486, over 108.00 samples.], tot_loss[dur_loss=0.2113, prior_loss=0.9769, diff_loss=0.2976, tot_loss=1.486, over 108.00 samples.],
2024-10-21 12:29:19,611 INFO [train.py:561] (1/4) Epoch 2196, batch 10, global_batch_idx: 35130, batch size: 111, loss[dur_loss=0.2097, prior_loss=0.9771, diff_loss=0.3117, tot_loss=1.499, over 111.00 samples.], tot_loss[dur_loss=0.206, prior_loss=0.9758, diff_loss=0.374, tot_loss=1.556, over 1656.00 samples.],
2024-10-21 12:29:26,762 INFO [train.py:682] (1/4) Start epoch 2197
2024-10-21 12:29:40,593 INFO [train.py:561] (1/4) Epoch 2197, batch 4, global_batch_idx: 35140, batch size: 189, loss[dur_loss=0.2071, prior_loss=0.9759, diff_loss=0.3556, tot_loss=1.539, over 189.00 samples.], tot_loss[dur_loss=0.203, prior_loss=0.9754, diff_loss=0.4332, tot_loss=1.612, over 937.00 samples.],
2024-10-21 12:29:55,477 INFO [train.py:561] (1/4) Epoch 2197, batch 14, global_batch_idx: 35150, batch size: 142, loss[dur_loss=0.2051, prior_loss=0.9759, diff_loss=0.319, tot_loss=1.5, over 142.00 samples.], tot_loss[dur_loss=0.2059, prior_loss=0.9759, diff_loss=0.3554, tot_loss=1.537, over 2210.00 samples.],
2024-10-21 12:29:56,917 INFO [train.py:682] (1/4) Start epoch 2198
2024-10-21 12:30:17,350 INFO [train.py:561] (1/4) Epoch 2198, batch 8, global_batch_idx: 35160, batch size: 170, loss[dur_loss=0.211, prior_loss=0.9761, diff_loss=0.3193, tot_loss=1.506, over 170.00 samples.], tot_loss[dur_loss=0.2056, prior_loss=0.9758, diff_loss=0.3764, tot_loss=1.558, over 1432.00 samples.],
2024-10-21 12:30:27,517 INFO [train.py:682] (1/4) Start epoch 2199
2024-10-21 12:30:39,044 INFO [train.py:561] (1/4) Epoch 2199, batch 2, global_batch_idx: 35170, batch size: 203, loss[dur_loss=0.2079, prior_loss=0.9761, diff_loss=0.3578, tot_loss=1.542, over 203.00 samples.], tot_loss[dur_loss=0.2085, prior_loss=0.9763, diff_loss=0.3167, tot_loss=1.501, over 442.00 samples.],
2024-10-21 12:30:53,272 INFO [train.py:561] (1/4) Epoch 2199, batch 12, global_batch_idx: 35180, batch size: 152, loss[dur_loss=0.204, prior_loss=0.9762, diff_loss=0.3501, tot_loss=1.53, over 152.00 samples.], tot_loss[dur_loss=0.2051, prior_loss=0.9758, diff_loss=0.3815, tot_loss=1.562, over 1966.00 samples.],
2024-10-21 12:30:57,742 INFO [train.py:682] (1/4) Start epoch 2200
2024-10-21 12:31:15,195 INFO [train.py:561] (1/4) Epoch 2200, batch 6, global_batch_idx: 35190, batch size: 106, loss[dur_loss=0.2046, prior_loss=0.9762, diff_loss=0.3337, tot_loss=1.514, over 106.00 samples.], tot_loss[dur_loss=0.2035, prior_loss=0.9755, diff_loss=0.414, tot_loss=1.593, over 1142.00 samples.],
2024-10-21 12:31:28,369 INFO [train.py:682] (1/4) Start epoch 2201
2024-10-21 12:31:37,244 INFO [train.py:561] (1/4) Epoch 2201, batch 0, global_batch_idx: 35200, batch size: 108, loss[dur_loss=0.2142, prior_loss=0.9769, diff_loss=0.2837, tot_loss=1.475, over 108.00 samples.], tot_loss[dur_loss=0.2142, prior_loss=0.9769, diff_loss=0.2837, tot_loss=1.475, over 108.00 samples.],
2024-10-21 12:31:51,731 INFO [train.py:561] (1/4) Epoch 2201, batch 10, global_batch_idx: 35210, batch size: 111, loss[dur_loss=0.2082, prior_loss=0.9773, diff_loss=0.2635, tot_loss=1.449, over 111.00 samples.], tot_loss[dur_loss=0.2065, prior_loss=0.9759, diff_loss=0.3824, tot_loss=1.565, over 1656.00 samples.],
2024-10-21 12:31:58,931 INFO [train.py:682] (1/4) Start epoch 2202
2024-10-21 12:32:12,969 INFO [train.py:561] (1/4) Epoch 2202, batch 4, global_batch_idx: 35220, batch size: 189, loss[dur_loss=0.2074, prior_loss=0.9763, diff_loss=0.3157, tot_loss=1.499, over 189.00 samples.], tot_loss[dur_loss=0.2032, prior_loss=0.9754, diff_loss=0.4285, tot_loss=1.607, over 937.00 samples.],
2024-10-21 12:32:27,971 INFO [train.py:561] (1/4) Epoch 2202, batch 14, global_batch_idx: 35230, batch size: 142, loss[dur_loss=0.2078, prior_loss=0.9757, diff_loss=0.3113, tot_loss=1.495, over 142.00 samples.], tot_loss[dur_loss=0.2063, prior_loss=0.9759, diff_loss=0.3551, tot_loss=1.537, over 2210.00 samples.],
2024-10-21 12:32:29,401 INFO [train.py:682] (1/4) Start epoch 2203
2024-10-21 12:32:49,956 INFO [train.py:561] (1/4) Epoch 2203, batch 8, global_batch_idx: 35240, batch size: 170, loss[dur_loss=0.2108, prior_loss=0.9762, diff_loss=0.3194, tot_loss=1.506, over 170.00 samples.], tot_loss[dur_loss=0.2066, prior_loss=0.9757, diff_loss=0.3878, tot_loss=1.57, over 1432.00 samples.],
2024-10-21 12:33:00,036 INFO [train.py:682] (1/4) Start epoch 2204
2024-10-21 12:33:11,925 INFO [train.py:561] (1/4) Epoch 2204, batch 2, global_batch_idx: 35250, batch size: 203, loss[dur_loss=0.2091, prior_loss=0.9762, diff_loss=0.3409, tot_loss=1.526, over 203.00 samples.], tot_loss[dur_loss=0.2091, prior_loss=0.9763, diff_loss=0.3231, tot_loss=1.508, over 442.00 samples.],
2024-10-21 12:33:26,191 INFO [train.py:561] (1/4) Epoch 2204, batch 12, global_batch_idx: 35260, batch size: 152, loss[dur_loss=0.2071, prior_loss=0.9763, diff_loss=0.3024, tot_loss=1.486, over 152.00 samples.], tot_loss[dur_loss=0.2061, prior_loss=0.9758, diff_loss=0.3618, tot_loss=1.544, over 1966.00 samples.],
2024-10-21 12:33:30,644 INFO [train.py:682] (1/4) Start epoch 2205
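A quick consistency check on the counters: between the batch 0 records of epochs 2196 and 2201 above, global_batch_idx advances from 35120 to 35200, i.e. (35200 - 35120) / (2201 - 2196) = 16 batches per epoch, which agrees with the in-epoch batch indices cycling through 0-14 while a record is emitted every 10 global batches.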
2024-10-21 12:33:47,942 INFO [train.py:561] (1/4) Epoch 2205, batch 6, global_batch_idx: 35270, batch size: 106, loss[dur_loss=0.2112, prior_loss=0.9764, diff_loss=0.3435, tot_loss=1.531, over 106.00 samples.], tot_loss[dur_loss=0.2055, prior_loss=0.9757, diff_loss=0.4081, tot_loss=1.589, over 1142.00 samples.],
2024-10-21 12:34:01,019 INFO [train.py:682] (1/4) Start epoch 2206
2024-10-21 12:34:09,851 INFO [train.py:561] (1/4) Epoch 2206, batch 0, global_batch_idx: 35280, batch size: 108, loss[dur_loss=0.2111, prior_loss=0.9771, diff_loss=0.3312, tot_loss=1.519, over 108.00 samples.], tot_loss[dur_loss=0.2111, prior_loss=0.9771, diff_loss=0.3312, tot_loss=1.519, over 108.00 samples.],
2024-10-21 12:34:24,083 INFO [train.py:561] (1/4) Epoch 2206, batch 10, global_batch_idx: 35290, batch size: 111, loss[dur_loss=0.2138, prior_loss=0.9771, diff_loss=0.3426, tot_loss=1.534, over 111.00 samples.], tot_loss[dur_loss=0.2069, prior_loss=0.9759, diff_loss=0.3807, tot_loss=1.563, over 1656.00 samples.],
2024-10-21 12:34:31,173 INFO [train.py:682] (1/4) Start epoch 2207
2024-10-21 12:34:44,891 INFO [train.py:561] (1/4) Epoch 2207, batch 4, global_batch_idx: 35300, batch size: 189, loss[dur_loss=0.2085, prior_loss=0.9763, diff_loss=0.351, tot_loss=1.536, over 189.00 samples.], tot_loss[dur_loss=0.2018, prior_loss=0.9754, diff_loss=0.4203, tot_loss=1.597, over 937.00 samples.],
2024-10-21 12:34:59,731 INFO [train.py:561] (1/4) Epoch 2207, batch 14, global_batch_idx: 35310, batch size: 142, loss[dur_loss=0.2081, prior_loss=0.9759, diff_loss=0.3468, tot_loss=1.531, over 142.00 samples.], tot_loss[dur_loss=0.2053, prior_loss=0.9759, diff_loss=0.3598, tot_loss=1.541, over 2210.00 samples.],
2024-10-21 12:35:01,172 INFO [train.py:682] (1/4) Start epoch 2208
2024-10-21 12:35:21,383 INFO [train.py:561] (1/4) Epoch 2208, batch 8, global_batch_idx: 35320, batch size: 170, loss[dur_loss=0.2117, prior_loss=0.9762, diff_loss=0.3786, tot_loss=1.566, over 170.00 samples.], tot_loss[dur_loss=0.2056, prior_loss=0.9757, diff_loss=0.3819, tot_loss=1.563, over 1432.00 samples.],
2024-10-21 12:35:31,599 INFO [train.py:682] (1/4) Start epoch 2209
2024-10-21 12:35:42,995 INFO [train.py:561] (1/4) Epoch 2209, batch 2, global_batch_idx: 35330, batch size: 203, loss[dur_loss=0.2064, prior_loss=0.9763, diff_loss=0.3121, tot_loss=1.495, over 203.00 samples.], tot_loss[dur_loss=0.2069, prior_loss=0.9764, diff_loss=0.3035, tot_loss=1.487, over 442.00 samples.],
2024-10-21 12:35:57,203 INFO [train.py:561] (1/4) Epoch 2209, batch 12, global_batch_idx: 35340, batch size: 152, loss[dur_loss=0.2083, prior_loss=0.9763, diff_loss=0.3, tot_loss=1.485, over 152.00 samples.], tot_loss[dur_loss=0.2057, prior_loss=0.976, diff_loss=0.3558, tot_loss=1.538, over 1966.00 samples.],
2024-10-21 12:36:01,664 INFO [train.py:682] (1/4) Start epoch 2210
2024-10-21 12:36:18,853 INFO [train.py:561] (1/4) Epoch 2210, batch 6, global_batch_idx: 35350, batch size: 106, loss[dur_loss=0.2087, prior_loss=0.9763, diff_loss=0.3348, tot_loss=1.52, over 106.00 samples.], tot_loss[dur_loss=0.2037, prior_loss=0.9755, diff_loss=0.408, tot_loss=1.587, over 1142.00 samples.],
2024-10-21 12:36:31,926 INFO [train.py:682] (1/4) Start epoch 2211
2024-10-21 12:36:40,573 INFO [train.py:561] (1/4) Epoch 2211, batch 0, global_batch_idx: 35360, batch size: 108, loss[dur_loss=0.2105, prior_loss=0.9768, diff_loss=0.2993, tot_loss=1.487, over 108.00 samples.], tot_loss[dur_loss=0.2105, prior_loss=0.9768, diff_loss=0.2993, tot_loss=1.487, over 108.00 samples.],
2024-10-21 12:36:54,737 INFO [train.py:561] (1/4) Epoch 2211, batch 10, global_batch_idx: 35370, batch size: 111, loss[dur_loss=0.2088, prior_loss=0.9772, diff_loss=0.3132, tot_loss=1.499, over 111.00 samples.], tot_loss[dur_loss=0.2059, prior_loss=0.9757, diff_loss=0.3731, tot_loss=1.555, over 1656.00 samples.],
2024-10-21 12:37:01,803 INFO [train.py:682] (1/4) Start epoch 2212
2024-10-21 12:37:15,378 INFO [train.py:561] (1/4) Epoch 2212, batch 4, global_batch_idx: 35380, batch size: 189, loss[dur_loss=0.2072, prior_loss=0.9761, diff_loss=0.3314, tot_loss=1.515, over 189.00 samples.], tot_loss[dur_loss=0.2032, prior_loss=0.9753, diff_loss=0.4338, tot_loss=1.612, over 937.00 samples.],
2024-10-21 12:37:30,139 INFO [train.py:561] (1/4) Epoch 2212, batch 14, global_batch_idx: 35390, batch size: 142, loss[dur_loss=0.2115, prior_loss=0.9759, diff_loss=0.29, tot_loss=1.477, over 142.00 samples.], tot_loss[dur_loss=0.2062, prior_loss=0.9759, diff_loss=0.3604, tot_loss=1.542, over 2210.00 samples.],
2024-10-21 12:37:31,559 INFO [train.py:682] (1/4) Start epoch 2213
2024-10-21 12:37:51,587 INFO [train.py:561] (1/4) Epoch 2213, batch 8, global_batch_idx: 35400, batch size: 170, loss[dur_loss=0.2109, prior_loss=0.9761, diff_loss=0.3377, tot_loss=1.525, over 170.00 samples.], tot_loss[dur_loss=0.2048, prior_loss=0.9755, diff_loss=0.3722, tot_loss=1.553, over 1432.00 samples.],
2024-10-21 12:38:01,767 INFO [train.py:682] (1/4) Start epoch 2214
2024-10-21 12:38:13,016 INFO [train.py:561] (1/4) Epoch 2214, batch 2, global_batch_idx: 35410, batch size: 203, loss[dur_loss=0.2069, prior_loss=0.9763, diff_loss=0.3315, tot_loss=1.515, over 203.00 samples.], tot_loss[dur_loss=0.2089, prior_loss=0.9763, diff_loss=0.3236, tot_loss=1.509, over 442.00 samples.],
2024-10-21 12:38:27,332 INFO [train.py:561] (1/4) Epoch 2214, batch 12, global_batch_idx: 35420, batch size: 152, loss[dur_loss=0.2058, prior_loss=0.976, diff_loss=0.3297, tot_loss=1.511, over 152.00 samples.], tot_loss[dur_loss=0.2055, prior_loss=0.9758, diff_loss=0.3697, tot_loss=1.551, over 1966.00 samples.],
2024-10-21 12:38:31,808 INFO [train.py:682] (1/4) Start epoch 2215
2024-10-21 12:38:48,673 INFO [train.py:561] (1/4) Epoch 2215, batch 6, global_batch_idx: 35430, batch size: 106, loss[dur_loss=0.2098, prior_loss=0.9762, diff_loss=0.2786, tot_loss=1.465, over 106.00 samples.], tot_loss[dur_loss=0.2035, prior_loss=0.9754, diff_loss=0.4042, tot_loss=1.583, over 1142.00 samples.],
2024-10-21 12:39:01,777 INFO [train.py:682] (1/4) Start epoch 2216
2024-10-21 12:39:10,448 INFO [train.py:561] (1/4) Epoch 2216, batch 0, global_batch_idx: 35440, batch size: 108, loss[dur_loss=0.2106, prior_loss=0.977, diff_loss=0.2979, tot_loss=1.485, over 108.00 samples.], tot_loss[dur_loss=0.2106, prior_loss=0.977, diff_loss=0.2979, tot_loss=1.485, over 108.00 samples.],
2024-10-21 12:39:24,715 INFO [train.py:561] (1/4) Epoch 2216, batch 10, global_batch_idx: 35450, batch size: 111, loss[dur_loss=0.2088, prior_loss=0.9769, diff_loss=0.3077, tot_loss=1.493, over 111.00 samples.], tot_loss[dur_loss=0.2061, prior_loss=0.9757, diff_loss=0.3799, tot_loss=1.562, over 1656.00 samples.],
2024-10-21 12:39:31,799 INFO [train.py:682] (1/4) Start epoch 2217
2024-10-21 12:39:45,307 INFO [train.py:561] (1/4) Epoch 2217, batch 4, global_batch_idx: 35460, batch size: 189, loss[dur_loss=0.2052, prior_loss=0.976, diff_loss=0.308, tot_loss=1.489, over 189.00 samples.], tot_loss[dur_loss=0.2035, prior_loss=0.9753, diff_loss=0.4294, tot_loss=1.608, over 937.00 samples.],
2024-10-21 12:40:00,193 INFO [train.py:561] (1/4) Epoch 2217, batch 14, global_batch_idx: 35470, batch size: 142, loss[dur_loss=0.2081, prior_loss=0.9758, diff_loss=0.3034, tot_loss=1.487, over 142.00 samples.], tot_loss[dur_loss=0.206, prior_loss=0.9758, diff_loss=0.3567, tot_loss=1.539, over 2210.00 samples.],
2024-10-21 12:40:01,631 INFO [train.py:682] (1/4) Start epoch 2218
2024-10-21 12:40:21,518 INFO [train.py:561] (1/4) Epoch 2218, batch 8, global_batch_idx: 35480, batch size: 170, loss[dur_loss=0.2102, prior_loss=0.9761, diff_loss=0.3337, tot_loss=1.52, over 170.00 samples.], tot_loss[dur_loss=0.2064, prior_loss=0.9756, diff_loss=0.3847, tot_loss=1.567, over 1432.00 samples.],
2024-10-21 12:40:31,642 INFO [train.py:682] (1/4) Start epoch 2219
2024-10-21 12:40:42,910 INFO [train.py:561] (1/4) Epoch 2219, batch 2, global_batch_idx: 35490, batch size: 203, loss[dur_loss=0.2077, prior_loss=0.976, diff_loss=0.3774, tot_loss=1.561, over 203.00 samples.], tot_loss[dur_loss=0.2082, prior_loss=0.9761, diff_loss=0.3378, tot_loss=1.522, over 442.00 samples.],
2024-10-21 12:40:57,176 INFO [train.py:561] (1/4) Epoch 2219, batch 12, global_batch_idx: 35500, batch size: 152, loss[dur_loss=0.2054, prior_loss=0.9763, diff_loss=0.2898, tot_loss=1.472, over 152.00 samples.], tot_loss[dur_loss=0.206, prior_loss=0.9757, diff_loss=0.3654, tot_loss=1.547, over 1966.00 samples.],
2024-10-21 12:41:01,646 INFO [train.py:682] (1/4) Start epoch 2220
2024-10-21 12:41:18,535 INFO [train.py:561] (1/4) Epoch 2220, batch 6, global_batch_idx: 35510, batch size: 106, loss[dur_loss=0.2078, prior_loss=0.9761, diff_loss=0.3128, tot_loss=1.497, over 106.00 samples.], tot_loss[dur_loss=0.2042, prior_loss=0.9755, diff_loss=0.4033, tot_loss=1.583, over 1142.00 samples.],
2024-10-21 12:41:31,581 INFO [train.py:682] (1/4) Start epoch 2221
2024-10-21 12:41:40,239 INFO [train.py:561] (1/4) Epoch 2221, batch 0, global_batch_idx: 35520, batch size: 108, loss[dur_loss=0.2117, prior_loss=0.9768, diff_loss=0.3071, tot_loss=1.496, over 108.00 samples.], tot_loss[dur_loss=0.2117, prior_loss=0.9768, diff_loss=0.3071, tot_loss=1.496, over 108.00 samples.],
2024-10-21 12:41:54,479 INFO [train.py:561] (1/4) Epoch 2221, batch 10, global_batch_idx: 35530, batch size: 111, loss[dur_loss=0.2093, prior_loss=0.977, diff_loss=0.3076, tot_loss=1.494, over 111.00 samples.], tot_loss[dur_loss=0.2051, prior_loss=0.9758, diff_loss=0.3691, tot_loss=1.55, over 1656.00 samples.],
2024-10-21 12:42:01,578 INFO [train.py:682] (1/4) Start epoch 2222
2024-10-21 12:42:15,217 INFO [train.py:561] (1/4) Epoch 2222, batch 4, global_batch_idx: 35540, batch size: 189, loss[dur_loss=0.2038, prior_loss=0.976, diff_loss=0.3343, tot_loss=1.514, over 189.00 samples.], tot_loss[dur_loss=0.2025, prior_loss=0.9754, diff_loss=0.429, tot_loss=1.607, over 937.00 samples.],
2024-10-21 12:42:30,020 INFO [train.py:561] (1/4) Epoch 2222, batch 14, global_batch_idx: 35550, batch size: 142, loss[dur_loss=0.2081, prior_loss=0.9756, diff_loss=0.3224, tot_loss=1.506, over 142.00 samples.], tot_loss[dur_loss=0.2066, prior_loss=0.9758, diff_loss=0.3654, tot_loss=1.548, over 2210.00 samples.],
2024-10-21 12:42:31,437 INFO [train.py:682] (1/4) Start epoch 2223
2024-10-21 12:42:51,020 INFO [train.py:561] (1/4) Epoch 2223, batch 8, global_batch_idx: 35560, batch size: 170, loss[dur_loss=0.2121, prior_loss=0.9763, diff_loss=0.3203, tot_loss=1.509, over 170.00 samples.], tot_loss[dur_loss=0.2063, prior_loss=0.9757, diff_loss=0.3846, tot_loss=1.567, over 1432.00 samples.],
2024-10-21 12:43:01,107 INFO [train.py:682] (1/4) Start epoch 2224
2024-10-21 12:43:12,532 INFO [train.py:561] (1/4) Epoch 2224, batch 2, global_batch_idx: 35570, batch size: 203, loss[dur_loss=0.2086,
prior_loss=0.9762, diff_loss=0.3471, tot_loss=1.532, over 203.00 samples.], tot_loss[dur_loss=0.21, prior_loss=0.9762, diff_loss=0.3133, tot_loss=1.5, over 442.00 samples.], 2024-10-21 12:43:26,641 INFO [train.py:561] (1/4) Epoch 2224, batch 12, global_batch_idx: 35580, batch size: 152, loss[dur_loss=0.2079, prior_loss=0.976, diff_loss=0.3474, tot_loss=1.531, over 152.00 samples.], tot_loss[dur_loss=0.207, prior_loss=0.9758, diff_loss=0.3645, tot_loss=1.547, over 1966.00 samples.], 2024-10-21 12:43:31,073 INFO [train.py:682] (1/4) Start epoch 2225 2024-10-21 12:43:47,818 INFO [train.py:561] (1/4) Epoch 2225, batch 6, global_batch_idx: 35590, batch size: 106, loss[dur_loss=0.2085, prior_loss=0.9763, diff_loss=0.3191, tot_loss=1.504, over 106.00 samples.], tot_loss[dur_loss=0.2045, prior_loss=0.9756, diff_loss=0.4112, tot_loss=1.591, over 1142.00 samples.], 2024-10-21 12:44:00,762 INFO [train.py:682] (1/4) Start epoch 2226 2024-10-21 12:44:09,265 INFO [train.py:561] (1/4) Epoch 2226, batch 0, global_batch_idx: 35600, batch size: 108, loss[dur_loss=0.2097, prior_loss=0.9768, diff_loss=0.2881, tot_loss=1.475, over 108.00 samples.], tot_loss[dur_loss=0.2097, prior_loss=0.9768, diff_loss=0.2881, tot_loss=1.475, over 108.00 samples.], 2024-10-21 12:44:23,450 INFO [train.py:561] (1/4) Epoch 2226, batch 10, global_batch_idx: 35610, batch size: 111, loss[dur_loss=0.2106, prior_loss=0.9772, diff_loss=0.3354, tot_loss=1.523, over 111.00 samples.], tot_loss[dur_loss=0.2054, prior_loss=0.9758, diff_loss=0.3785, tot_loss=1.56, over 1656.00 samples.], 2024-10-21 12:44:30,512 INFO [train.py:682] (1/4) Start epoch 2227 2024-10-21 12:44:44,373 INFO [train.py:561] (1/4) Epoch 2227, batch 4, global_batch_idx: 35620, batch size: 189, loss[dur_loss=0.2065, prior_loss=0.976, diff_loss=0.3358, tot_loss=1.518, over 189.00 samples.], tot_loss[dur_loss=0.203, prior_loss=0.9752, diff_loss=0.4244, tot_loss=1.603, over 937.00 samples.], 2024-10-21 12:44:59,147 INFO [train.py:561] (1/4) Epoch 2227, batch 14, global_batch_idx: 35630, batch size: 142, loss[dur_loss=0.2072, prior_loss=0.9757, diff_loss=0.2941, tot_loss=1.477, over 142.00 samples.], tot_loss[dur_loss=0.2062, prior_loss=0.9758, diff_loss=0.3592, tot_loss=1.541, over 2210.00 samples.], 2024-10-21 12:45:00,577 INFO [train.py:682] (1/4) Start epoch 2228 2024-10-21 12:45:20,611 INFO [train.py:561] (1/4) Epoch 2228, batch 8, global_batch_idx: 35640, batch size: 170, loss[dur_loss=0.2095, prior_loss=0.9761, diff_loss=0.3169, tot_loss=1.503, over 170.00 samples.], tot_loss[dur_loss=0.2057, prior_loss=0.9755, diff_loss=0.389, tot_loss=1.57, over 1432.00 samples.], 2024-10-21 12:45:30,722 INFO [train.py:682] (1/4) Start epoch 2229 2024-10-21 12:45:42,071 INFO [train.py:561] (1/4) Epoch 2229, batch 2, global_batch_idx: 35650, batch size: 203, loss[dur_loss=0.2067, prior_loss=0.976, diff_loss=0.3397, tot_loss=1.522, over 203.00 samples.], tot_loss[dur_loss=0.2071, prior_loss=0.9761, diff_loss=0.3141, tot_loss=1.497, over 442.00 samples.], 2024-10-21 12:45:56,593 INFO [train.py:561] (1/4) Epoch 2229, batch 12, global_batch_idx: 35660, batch size: 152, loss[dur_loss=0.2061, prior_loss=0.976, diff_loss=0.3465, tot_loss=1.529, over 152.00 samples.], tot_loss[dur_loss=0.2067, prior_loss=0.9758, diff_loss=0.3623, tot_loss=1.545, over 1966.00 samples.], 2024-10-21 12:46:01,039 INFO [train.py:682] (1/4) Start epoch 2230 2024-10-21 12:46:18,165 INFO [train.py:561] (1/4) Epoch 2230, batch 6, global_batch_idx: 35670, batch size: 106, loss[dur_loss=0.2109, prior_loss=0.9761, 
diff_loss=0.2981, tot_loss=1.485, over 106.00 samples.], tot_loss[dur_loss=0.2047, prior_loss=0.9754, diff_loss=0.4065, tot_loss=1.587, over 1142.00 samples.], 2024-10-21 12:46:31,187 INFO [train.py:682] (1/4) Start epoch 2231 2024-10-21 12:46:39,669 INFO [train.py:561] (1/4) Epoch 2231, batch 0, global_batch_idx: 35680, batch size: 108, loss[dur_loss=0.2118, prior_loss=0.9766, diff_loss=0.2696, tot_loss=1.458, over 108.00 samples.], tot_loss[dur_loss=0.2118, prior_loss=0.9766, diff_loss=0.2696, tot_loss=1.458, over 108.00 samples.], 2024-10-21 12:46:53,938 INFO [train.py:561] (1/4) Epoch 2231, batch 10, global_batch_idx: 35690, batch size: 111, loss[dur_loss=0.2079, prior_loss=0.9768, diff_loss=0.3267, tot_loss=1.511, over 111.00 samples.], tot_loss[dur_loss=0.2052, prior_loss=0.9755, diff_loss=0.3787, tot_loss=1.559, over 1656.00 samples.], 2024-10-21 12:47:01,044 INFO [train.py:682] (1/4) Start epoch 2232 2024-10-21 12:47:14,999 INFO [train.py:561] (1/4) Epoch 2232, batch 4, global_batch_idx: 35700, batch size: 189, loss[dur_loss=0.205, prior_loss=0.9757, diff_loss=0.3302, tot_loss=1.511, over 189.00 samples.], tot_loss[dur_loss=0.2028, prior_loss=0.9751, diff_loss=0.4204, tot_loss=1.598, over 937.00 samples.], 2024-10-21 12:47:29,804 INFO [train.py:561] (1/4) Epoch 2232, batch 14, global_batch_idx: 35710, batch size: 142, loss[dur_loss=0.2051, prior_loss=0.9755, diff_loss=0.3069, tot_loss=1.488, over 142.00 samples.], tot_loss[dur_loss=0.2055, prior_loss=0.9756, diff_loss=0.3511, tot_loss=1.532, over 2210.00 samples.], 2024-10-21 12:47:31,222 INFO [train.py:682] (1/4) Start epoch 2233 2024-10-21 12:47:51,110 INFO [train.py:561] (1/4) Epoch 2233, batch 8, global_batch_idx: 35720, batch size: 170, loss[dur_loss=0.2085, prior_loss=0.9759, diff_loss=0.3458, tot_loss=1.53, over 170.00 samples.], tot_loss[dur_loss=0.2047, prior_loss=0.9755, diff_loss=0.3954, tot_loss=1.576, over 1432.00 samples.], 2024-10-21 12:48:01,193 INFO [train.py:682] (1/4) Start epoch 2234 2024-10-21 12:48:12,367 INFO [train.py:561] (1/4) Epoch 2234, batch 2, global_batch_idx: 35730, batch size: 203, loss[dur_loss=0.2075, prior_loss=0.976, diff_loss=0.3477, tot_loss=1.531, over 203.00 samples.], tot_loss[dur_loss=0.207, prior_loss=0.9761, diff_loss=0.3182, tot_loss=1.501, over 442.00 samples.], 2024-10-21 12:48:26,624 INFO [train.py:561] (1/4) Epoch 2234, batch 12, global_batch_idx: 35740, batch size: 152, loss[dur_loss=0.2071, prior_loss=0.9761, diff_loss=0.2969, tot_loss=1.48, over 152.00 samples.], tot_loss[dur_loss=0.2051, prior_loss=0.9756, diff_loss=0.3718, tot_loss=1.552, over 1966.00 samples.], 2024-10-21 12:48:31,091 INFO [train.py:682] (1/4) Start epoch 2235 2024-10-21 12:48:48,269 INFO [train.py:561] (1/4) Epoch 2235, batch 6, global_batch_idx: 35750, batch size: 106, loss[dur_loss=0.208, prior_loss=0.9761, diff_loss=0.3063, tot_loss=1.49, over 106.00 samples.], tot_loss[dur_loss=0.2015, prior_loss=0.9753, diff_loss=0.411, tot_loss=1.588, over 1142.00 samples.], 2024-10-21 12:49:01,303 INFO [train.py:682] (1/4) Start epoch 2236 2024-10-21 12:49:10,227 INFO [train.py:561] (1/4) Epoch 2236, batch 0, global_batch_idx: 35760, batch size: 108, loss[dur_loss=0.2124, prior_loss=0.9765, diff_loss=0.2858, tot_loss=1.475, over 108.00 samples.], tot_loss[dur_loss=0.2124, prior_loss=0.9765, diff_loss=0.2858, tot_loss=1.475, over 108.00 samples.], 2024-10-21 12:49:24,511 INFO [train.py:561] (1/4) Epoch 2236, batch 10, global_batch_idx: 35770, batch size: 111, loss[dur_loss=0.2081, prior_loss=0.9771, diff_loss=0.2968, 
tot_loss=1.482, over 111.00 samples.], tot_loss[dur_loss=0.206, prior_loss=0.9756, diff_loss=0.3732, tot_loss=1.555, over 1656.00 samples.], 2024-10-21 12:49:31,626 INFO [train.py:682] (1/4) Start epoch 2237 2024-10-21 12:49:45,534 INFO [train.py:561] (1/4) Epoch 2237, batch 4, global_batch_idx: 35780, batch size: 189, loss[dur_loss=0.2057, prior_loss=0.9761, diff_loss=0.3084, tot_loss=1.49, over 189.00 samples.], tot_loss[dur_loss=0.2028, prior_loss=0.9752, diff_loss=0.441, tot_loss=1.619, over 937.00 samples.], 2024-10-21 12:50:00,358 INFO [train.py:561] (1/4) Epoch 2237, batch 14, global_batch_idx: 35790, batch size: 142, loss[dur_loss=0.2045, prior_loss=0.9758, diff_loss=0.2825, tot_loss=1.463, over 142.00 samples.], tot_loss[dur_loss=0.2062, prior_loss=0.9758, diff_loss=0.3649, tot_loss=1.547, over 2210.00 samples.], 2024-10-21 12:50:01,786 INFO [train.py:682] (1/4) Start epoch 2238 2024-10-21 12:50:21,620 INFO [train.py:561] (1/4) Epoch 2238, batch 8, global_batch_idx: 35800, batch size: 170, loss[dur_loss=0.2089, prior_loss=0.9764, diff_loss=0.2761, tot_loss=1.461, over 170.00 samples.], tot_loss[dur_loss=0.204, prior_loss=0.9756, diff_loss=0.3862, tot_loss=1.566, over 1432.00 samples.], 2024-10-21 12:50:31,619 INFO [train.py:682] (1/4) Start epoch 2239 2024-10-21 12:50:42,605 INFO [train.py:561] (1/4) Epoch 2239, batch 2, global_batch_idx: 35810, batch size: 203, loss[dur_loss=0.2073, prior_loss=0.9762, diff_loss=0.3581, tot_loss=1.542, over 203.00 samples.], tot_loss[dur_loss=0.2077, prior_loss=0.9762, diff_loss=0.333, tot_loss=1.517, over 442.00 samples.], 2024-10-21 12:50:56,782 INFO [train.py:561] (1/4) Epoch 2239, batch 12, global_batch_idx: 35820, batch size: 152, loss[dur_loss=0.2057, prior_loss=0.9759, diff_loss=0.3338, tot_loss=1.515, over 152.00 samples.], tot_loss[dur_loss=0.2054, prior_loss=0.9757, diff_loss=0.375, tot_loss=1.556, over 1966.00 samples.], 2024-10-21 12:51:01,225 INFO [train.py:682] (1/4) Start epoch 2240 2024-10-21 12:51:18,296 INFO [train.py:561] (1/4) Epoch 2240, batch 6, global_batch_idx: 35830, batch size: 106, loss[dur_loss=0.207, prior_loss=0.976, diff_loss=0.254, tot_loss=1.437, over 106.00 samples.], tot_loss[dur_loss=0.2038, prior_loss=0.9754, diff_loss=0.3868, tot_loss=1.566, over 1142.00 samples.], 2024-10-21 12:51:31,289 INFO [train.py:682] (1/4) Start epoch 2241 2024-10-21 12:51:39,793 INFO [train.py:561] (1/4) Epoch 2241, batch 0, global_batch_idx: 35840, batch size: 108, loss[dur_loss=0.2106, prior_loss=0.9767, diff_loss=0.2711, tot_loss=1.458, over 108.00 samples.], tot_loss[dur_loss=0.2106, prior_loss=0.9767, diff_loss=0.2711, tot_loss=1.458, over 108.00 samples.], 2024-10-21 12:51:53,948 INFO [train.py:561] (1/4) Epoch 2241, batch 10, global_batch_idx: 35850, batch size: 111, loss[dur_loss=0.2056, prior_loss=0.9769, diff_loss=0.32, tot_loss=1.502, over 111.00 samples.], tot_loss[dur_loss=0.205, prior_loss=0.9755, diff_loss=0.3823, tot_loss=1.563, over 1656.00 samples.], 2024-10-21 12:52:01,014 INFO [train.py:682] (1/4) Start epoch 2242 2024-10-21 12:52:14,457 INFO [train.py:561] (1/4) Epoch 2242, batch 4, global_batch_idx: 35860, batch size: 189, loss[dur_loss=0.2048, prior_loss=0.9761, diff_loss=0.3151, tot_loss=1.496, over 189.00 samples.], tot_loss[dur_loss=0.2027, prior_loss=0.9751, diff_loss=0.432, tot_loss=1.61, over 937.00 samples.], 2024-10-21 12:52:29,320 INFO [train.py:561] (1/4) Epoch 2242, batch 14, global_batch_idx: 35870, batch size: 142, loss[dur_loss=0.2074, prior_loss=0.9757, diff_loss=0.3004, tot_loss=1.483, over 
142.00 samples.], tot_loss[dur_loss=0.2061, prior_loss=0.9758, diff_loss=0.3621, tot_loss=1.544, over 2210.00 samples.], 2024-10-21 12:52:30,751 INFO [train.py:682] (1/4) Start epoch 2243 2024-10-21 12:52:50,695 INFO [train.py:561] (1/4) Epoch 2243, batch 8, global_batch_idx: 35880, batch size: 170, loss[dur_loss=0.2096, prior_loss=0.9763, diff_loss=0.3025, tot_loss=1.488, over 170.00 samples.], tot_loss[dur_loss=0.2051, prior_loss=0.9755, diff_loss=0.3923, tot_loss=1.573, over 1432.00 samples.], 2024-10-21 12:53:00,794 INFO [train.py:682] (1/4) Start epoch 2244 2024-10-21 12:53:12,270 INFO [train.py:561] (1/4) Epoch 2244, batch 2, global_batch_idx: 35890, batch size: 203, loss[dur_loss=0.2071, prior_loss=0.9763, diff_loss=0.3331, tot_loss=1.517, over 203.00 samples.], tot_loss[dur_loss=0.2065, prior_loss=0.9762, diff_loss=0.3112, tot_loss=1.494, over 442.00 samples.], 2024-10-21 12:53:26,472 INFO [train.py:561] (1/4) Epoch 2244, batch 12, global_batch_idx: 35900, batch size: 152, loss[dur_loss=0.2068, prior_loss=0.9759, diff_loss=0.3164, tot_loss=1.499, over 152.00 samples.], tot_loss[dur_loss=0.2052, prior_loss=0.9756, diff_loss=0.365, tot_loss=1.546, over 1966.00 samples.], 2024-10-21 12:53:30,898 INFO [train.py:682] (1/4) Start epoch 2245 2024-10-21 12:53:47,819 INFO [train.py:561] (1/4) Epoch 2245, batch 6, global_batch_idx: 35910, batch size: 106, loss[dur_loss=0.2069, prior_loss=0.9759, diff_loss=0.3064, tot_loss=1.489, over 106.00 samples.], tot_loss[dur_loss=0.2035, prior_loss=0.9752, diff_loss=0.4067, tot_loss=1.585, over 1142.00 samples.], 2024-10-21 12:54:00,841 INFO [train.py:682] (1/4) Start epoch 2246 2024-10-21 12:54:09,754 INFO [train.py:561] (1/4) Epoch 2246, batch 0, global_batch_idx: 35920, batch size: 108, loss[dur_loss=0.2072, prior_loss=0.9765, diff_loss=0.2797, tot_loss=1.463, over 108.00 samples.], tot_loss[dur_loss=0.2072, prior_loss=0.9765, diff_loss=0.2797, tot_loss=1.463, over 108.00 samples.], 2024-10-21 12:54:23,907 INFO [train.py:561] (1/4) Epoch 2246, batch 10, global_batch_idx: 35930, batch size: 111, loss[dur_loss=0.2102, prior_loss=0.9769, diff_loss=0.3258, tot_loss=1.513, over 111.00 samples.], tot_loss[dur_loss=0.2045, prior_loss=0.9755, diff_loss=0.3692, tot_loss=1.549, over 1656.00 samples.], 2024-10-21 12:54:30,967 INFO [train.py:682] (1/4) Start epoch 2247 2024-10-21 12:54:44,440 INFO [train.py:561] (1/4) Epoch 2247, batch 4, global_batch_idx: 35940, batch size: 189, loss[dur_loss=0.2021, prior_loss=0.9757, diff_loss=0.3189, tot_loss=1.497, over 189.00 samples.], tot_loss[dur_loss=0.2017, prior_loss=0.975, diff_loss=0.4233, tot_loss=1.6, over 937.00 samples.], 2024-10-21 12:54:59,267 INFO [train.py:561] (1/4) Epoch 2247, batch 14, global_batch_idx: 35950, batch size: 142, loss[dur_loss=0.2082, prior_loss=0.9756, diff_loss=0.2843, tot_loss=1.468, over 142.00 samples.], tot_loss[dur_loss=0.2046, prior_loss=0.9756, diff_loss=0.3593, tot_loss=1.539, over 2210.00 samples.], 2024-10-21 12:55:00,695 INFO [train.py:682] (1/4) Start epoch 2248 2024-10-21 12:55:20,279 INFO [train.py:561] (1/4) Epoch 2248, batch 8, global_batch_idx: 35960, batch size: 170, loss[dur_loss=0.2102, prior_loss=0.976, diff_loss=0.3302, tot_loss=1.516, over 170.00 samples.], tot_loss[dur_loss=0.2033, prior_loss=0.9754, diff_loss=0.3781, tot_loss=1.557, over 1432.00 samples.], 2024-10-21 12:55:30,405 INFO [train.py:682] (1/4) Start epoch 2249 2024-10-21 12:55:41,874 INFO [train.py:561] (1/4) Epoch 2249, batch 2, global_batch_idx: 35970, batch size: 203, loss[dur_loss=0.204, 
prior_loss=0.9759, diff_loss=0.3498, tot_loss=1.53, over 203.00 samples.], tot_loss[dur_loss=0.2052, prior_loss=0.976, diff_loss=0.3223, tot_loss=1.503, over 442.00 samples.], 2024-10-21 12:55:56,068 INFO [train.py:561] (1/4) Epoch 2249, batch 12, global_batch_idx: 35980, batch size: 152, loss[dur_loss=0.2059, prior_loss=0.9761, diff_loss=0.3006, tot_loss=1.483, over 152.00 samples.], tot_loss[dur_loss=0.2043, prior_loss=0.9756, diff_loss=0.3688, tot_loss=1.549, over 1966.00 samples.], 2024-10-21 12:56:00,496 INFO [train.py:682] (1/4) Start epoch 2250 2024-10-21 12:56:17,234 INFO [train.py:561] (1/4) Epoch 2250, batch 6, global_batch_idx: 35990, batch size: 106, loss[dur_loss=0.2072, prior_loss=0.9758, diff_loss=0.3401, tot_loss=1.523, over 106.00 samples.], tot_loss[dur_loss=0.2042, prior_loss=0.9751, diff_loss=0.4139, tot_loss=1.593, over 1142.00 samples.], 2024-10-21 12:56:30,245 INFO [train.py:682] (1/4) Start epoch 2251 2024-10-21 12:56:38,849 INFO [train.py:561] (1/4) Epoch 2251, batch 0, global_batch_idx: 36000, batch size: 108, loss[dur_loss=0.2125, prior_loss=0.9767, diff_loss=0.3077, tot_loss=1.497, over 108.00 samples.], tot_loss[dur_loss=0.2125, prior_loss=0.9767, diff_loss=0.3077, tot_loss=1.497, over 108.00 samples.], 2024-10-21 12:56:40,249 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 12:57:20,418 INFO [train.py:589] (1/4) Epoch 2251, validation: dur_loss=0.461, prior_loss=1.035, diff_loss=0.389, tot_loss=1.885, over 100.00 samples. 2024-10-21 12:57:20,419 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 12:57:33,270 INFO [train.py:561] (1/4) Epoch 2251, batch 10, global_batch_idx: 36010, batch size: 111, loss[dur_loss=0.2057, prior_loss=0.9768, diff_loss=0.3171, tot_loss=1.5, over 111.00 samples.], tot_loss[dur_loss=0.205, prior_loss=0.9756, diff_loss=0.3782, tot_loss=1.559, over 1656.00 samples.], 2024-10-21 12:57:40,388 INFO [train.py:682] (1/4) Start epoch 2252 2024-10-21 12:57:53,990 INFO [train.py:561] (1/4) Epoch 2252, batch 4, global_batch_idx: 36020, batch size: 189, loss[dur_loss=0.205, prior_loss=0.976, diff_loss=0.3333, tot_loss=1.514, over 189.00 samples.], tot_loss[dur_loss=0.2022, prior_loss=0.9751, diff_loss=0.4328, tot_loss=1.61, over 937.00 samples.], 2024-10-21 12:58:08,694 INFO [train.py:561] (1/4) Epoch 2252, batch 14, global_batch_idx: 36030, batch size: 142, loss[dur_loss=0.2077, prior_loss=0.9758, diff_loss=0.3046, tot_loss=1.488, over 142.00 samples.], tot_loss[dur_loss=0.2053, prior_loss=0.9757, diff_loss=0.3595, tot_loss=1.54, over 2210.00 samples.], 2024-10-21 12:58:10,111 INFO [train.py:682] (1/4) Start epoch 2253 2024-10-21 12:58:29,947 INFO [train.py:561] (1/4) Epoch 2253, batch 8, global_batch_idx: 36040, batch size: 170, loss[dur_loss=0.2154, prior_loss=0.9765, diff_loss=0.2976, tot_loss=1.489, over 170.00 samples.], tot_loss[dur_loss=0.2055, prior_loss=0.9756, diff_loss=0.3806, tot_loss=1.562, over 1432.00 samples.], 2024-10-21 12:58:39,972 INFO [train.py:682] (1/4) Start epoch 2254 2024-10-21 12:58:51,663 INFO [train.py:561] (1/4) Epoch 2254, batch 2, global_batch_idx: 36050, batch size: 203, loss[dur_loss=0.2072, prior_loss=0.9759, diff_loss=0.3497, tot_loss=1.533, over 203.00 samples.], tot_loss[dur_loss=0.2083, prior_loss=0.9761, diff_loss=0.32, tot_loss=1.504, over 442.00 samples.], 2024-10-21 12:59:05,824 INFO [train.py:561] (1/4) Epoch 2254, batch 12, global_batch_idx: 36060, batch size: 152, loss[dur_loss=0.2069, prior_loss=0.9759, diff_loss=0.3094, tot_loss=1.492, over 152.00 
samples.], tot_loss[dur_loss=0.2049, prior_loss=0.9756, diff_loss=0.3612, tot_loss=1.542, over 1966.00 samples.], 2024-10-21 12:59:10,249 INFO [train.py:682] (1/4) Start epoch 2255 2024-10-21 12:59:27,003 INFO [train.py:561] (1/4) Epoch 2255, batch 6, global_batch_idx: 36070, batch size: 106, loss[dur_loss=0.2084, prior_loss=0.9758, diff_loss=0.3342, tot_loss=1.518, over 106.00 samples.], tot_loss[dur_loss=0.2043, prior_loss=0.9753, diff_loss=0.4026, tot_loss=1.582, over 1142.00 samples.], 2024-10-21 12:59:39,928 INFO [train.py:682] (1/4) Start epoch 2256 2024-10-21 12:59:48,599 INFO [train.py:561] (1/4) Epoch 2256, batch 0, global_batch_idx: 36080, batch size: 108, loss[dur_loss=0.2133, prior_loss=0.9767, diff_loss=0.2625, tot_loss=1.452, over 108.00 samples.], tot_loss[dur_loss=0.2133, prior_loss=0.9767, diff_loss=0.2625, tot_loss=1.452, over 108.00 samples.], 2024-10-21 13:00:02,611 INFO [train.py:561] (1/4) Epoch 2256, batch 10, global_batch_idx: 36090, batch size: 111, loss[dur_loss=0.2083, prior_loss=0.9767, diff_loss=0.3273, tot_loss=1.512, over 111.00 samples.], tot_loss[dur_loss=0.2061, prior_loss=0.9756, diff_loss=0.3733, tot_loss=1.555, over 1656.00 samples.], 2024-10-21 13:00:09,586 INFO [train.py:682] (1/4) Start epoch 2257 2024-10-21 13:00:23,462 INFO [train.py:561] (1/4) Epoch 2257, batch 4, global_batch_idx: 36100, batch size: 189, loss[dur_loss=0.2081, prior_loss=0.9759, diff_loss=0.3391, tot_loss=1.523, over 189.00 samples.], tot_loss[dur_loss=0.2045, prior_loss=0.9752, diff_loss=0.4227, tot_loss=1.602, over 937.00 samples.], 2024-10-21 13:00:38,186 INFO [train.py:561] (1/4) Epoch 2257, batch 14, global_batch_idx: 36110, batch size: 142, loss[dur_loss=0.2043, prior_loss=0.9755, diff_loss=0.271, tot_loss=1.451, over 142.00 samples.], tot_loss[dur_loss=0.206, prior_loss=0.9756, diff_loss=0.3556, tot_loss=1.537, over 2210.00 samples.], 2024-10-21 13:00:39,594 INFO [train.py:682] (1/4) Start epoch 2258 2024-10-21 13:00:59,377 INFO [train.py:561] (1/4) Epoch 2258, batch 8, global_batch_idx: 36120, batch size: 170, loss[dur_loss=0.2087, prior_loss=0.9759, diff_loss=0.3565, tot_loss=1.541, over 170.00 samples.], tot_loss[dur_loss=0.2038, prior_loss=0.9755, diff_loss=0.3953, tot_loss=1.575, over 1432.00 samples.], 2024-10-21 13:01:09,392 INFO [train.py:682] (1/4) Start epoch 2259 2024-10-21 13:01:20,508 INFO [train.py:561] (1/4) Epoch 2259, batch 2, global_batch_idx: 36130, batch size: 203, loss[dur_loss=0.2038, prior_loss=0.9756, diff_loss=0.3208, tot_loss=1.5, over 203.00 samples.], tot_loss[dur_loss=0.2045, prior_loss=0.9757, diff_loss=0.3123, tot_loss=1.493, over 442.00 samples.], 2024-10-21 13:01:34,618 INFO [train.py:561] (1/4) Epoch 2259, batch 12, global_batch_idx: 36140, batch size: 152, loss[dur_loss=0.2052, prior_loss=0.9756, diff_loss=0.2636, tot_loss=1.444, over 152.00 samples.], tot_loss[dur_loss=0.2037, prior_loss=0.9755, diff_loss=0.3613, tot_loss=1.54, over 1966.00 samples.], 2024-10-21 13:01:39,014 INFO [train.py:682] (1/4) Start epoch 2260 2024-10-21 13:01:55,967 INFO [train.py:561] (1/4) Epoch 2260, batch 6, global_batch_idx: 36150, batch size: 106, loss[dur_loss=0.2052, prior_loss=0.9757, diff_loss=0.2976, tot_loss=1.478, over 106.00 samples.], tot_loss[dur_loss=0.2017, prior_loss=0.9751, diff_loss=0.4095, tot_loss=1.586, over 1142.00 samples.], 2024-10-21 13:02:08,955 INFO [train.py:682] (1/4) Start epoch 2261 2024-10-21 13:02:17,846 INFO [train.py:561] (1/4) Epoch 2261, batch 0, global_batch_idx: 36160, batch size: 108, loss[dur_loss=0.2115, 
prior_loss=0.9765, diff_loss=0.3296, tot_loss=1.518, over 108.00 samples.], tot_loss[dur_loss=0.2115, prior_loss=0.9765, diff_loss=0.3296, tot_loss=1.518, over 108.00 samples.], 2024-10-21 13:02:32,037 INFO [train.py:561] (1/4) Epoch 2261, batch 10, global_batch_idx: 36170, batch size: 111, loss[dur_loss=0.2071, prior_loss=0.9766, diff_loss=0.3353, tot_loss=1.519, over 111.00 samples.], tot_loss[dur_loss=0.2032, prior_loss=0.9755, diff_loss=0.3775, tot_loss=1.556, over 1656.00 samples.], 2024-10-21 13:02:39,053 INFO [train.py:682] (1/4) Start epoch 2262 2024-10-21 13:02:52,531 INFO [train.py:561] (1/4) Epoch 2262, batch 4, global_batch_idx: 36180, batch size: 189, loss[dur_loss=0.2022, prior_loss=0.9758, diff_loss=0.3406, tot_loss=1.519, over 189.00 samples.], tot_loss[dur_loss=0.2014, prior_loss=0.975, diff_loss=0.4278, tot_loss=1.604, over 937.00 samples.], 2024-10-21 13:03:07,243 INFO [train.py:561] (1/4) Epoch 2262, batch 14, global_batch_idx: 36190, batch size: 142, loss[dur_loss=0.208, prior_loss=0.9755, diff_loss=0.312, tot_loss=1.495, over 142.00 samples.], tot_loss[dur_loss=0.2043, prior_loss=0.9756, diff_loss=0.3597, tot_loss=1.54, over 2210.00 samples.], 2024-10-21 13:03:08,662 INFO [train.py:682] (1/4) Start epoch 2263 2024-10-21 13:03:28,845 INFO [train.py:561] (1/4) Epoch 2263, batch 8, global_batch_idx: 36200, batch size: 170, loss[dur_loss=0.2091, prior_loss=0.9759, diff_loss=0.3002, tot_loss=1.485, over 170.00 samples.], tot_loss[dur_loss=0.2028, prior_loss=0.9753, diff_loss=0.3847, tot_loss=1.563, over 1432.00 samples.], 2024-10-21 13:03:38,944 INFO [train.py:682] (1/4) Start epoch 2264 2024-10-21 13:03:50,218 INFO [train.py:561] (1/4) Epoch 2264, batch 2, global_batch_idx: 36210, batch size: 203, loss[dur_loss=0.2091, prior_loss=0.9759, diff_loss=0.3188, tot_loss=1.504, over 203.00 samples.], tot_loss[dur_loss=0.2079, prior_loss=0.976, diff_loss=0.3003, tot_loss=1.484, over 442.00 samples.], 2024-10-21 13:04:04,313 INFO [train.py:561] (1/4) Epoch 2264, batch 12, global_batch_idx: 36220, batch size: 152, loss[dur_loss=0.2004, prior_loss=0.9756, diff_loss=0.3584, tot_loss=1.534, over 152.00 samples.], tot_loss[dur_loss=0.2046, prior_loss=0.9755, diff_loss=0.36, tot_loss=1.54, over 1966.00 samples.], 2024-10-21 13:04:08,733 INFO [train.py:682] (1/4) Start epoch 2265 2024-10-21 13:04:25,695 INFO [train.py:561] (1/4) Epoch 2265, batch 6, global_batch_idx: 36230, batch size: 106, loss[dur_loss=0.2046, prior_loss=0.976, diff_loss=0.3321, tot_loss=1.513, over 106.00 samples.], tot_loss[dur_loss=0.202, prior_loss=0.9752, diff_loss=0.414, tot_loss=1.591, over 1142.00 samples.], 2024-10-21 13:04:38,682 INFO [train.py:682] (1/4) Start epoch 2266 2024-10-21 13:04:47,461 INFO [train.py:561] (1/4) Epoch 2266, batch 0, global_batch_idx: 36240, batch size: 108, loss[dur_loss=0.211, prior_loss=0.9765, diff_loss=0.3073, tot_loss=1.495, over 108.00 samples.], tot_loss[dur_loss=0.211, prior_loss=0.9765, diff_loss=0.3073, tot_loss=1.495, over 108.00 samples.], 2024-10-21 13:05:01,791 INFO [train.py:561] (1/4) Epoch 2266, batch 10, global_batch_idx: 36250, batch size: 111, loss[dur_loss=0.2094, prior_loss=0.9768, diff_loss=0.318, tot_loss=1.504, over 111.00 samples.], tot_loss[dur_loss=0.2048, prior_loss=0.9756, diff_loss=0.381, tot_loss=1.561, over 1656.00 samples.], 2024-10-21 13:05:08,841 INFO [train.py:682] (1/4) Start epoch 2267 2024-10-21 13:05:22,799 INFO [train.py:561] (1/4) Epoch 2267, batch 4, global_batch_idx: 36260, batch size: 189, loss[dur_loss=0.2052, prior_loss=0.9759, 
diff_loss=0.3287, tot_loss=1.51, over 189.00 samples.], tot_loss[dur_loss=0.2021, prior_loss=0.9751, diff_loss=0.4273, tot_loss=1.604, over 937.00 samples.], 2024-10-21 13:05:37,675 INFO [train.py:561] (1/4) Epoch 2267, batch 14, global_batch_idx: 36270, batch size: 142, loss[dur_loss=0.2073, prior_loss=0.9754, diff_loss=0.3199, tot_loss=1.503, over 142.00 samples.], tot_loss[dur_loss=0.2044, prior_loss=0.9756, diff_loss=0.3597, tot_loss=1.54, over 2210.00 samples.], 2024-10-21 13:05:39,095 INFO [train.py:682] (1/4) Start epoch 2268 2024-10-21 13:05:59,123 INFO [train.py:561] (1/4) Epoch 2268, batch 8, global_batch_idx: 36280, batch size: 170, loss[dur_loss=0.2101, prior_loss=0.976, diff_loss=0.339, tot_loss=1.525, over 170.00 samples.], tot_loss[dur_loss=0.2043, prior_loss=0.9753, diff_loss=0.3868, tot_loss=1.566, over 1432.00 samples.], 2024-10-21 13:06:09,280 INFO [train.py:682] (1/4) Start epoch 2269 2024-10-21 13:06:20,595 INFO [train.py:561] (1/4) Epoch 2269, batch 2, global_batch_idx: 36290, batch size: 203, loss[dur_loss=0.2083, prior_loss=0.9758, diff_loss=0.3371, tot_loss=1.521, over 203.00 samples.], tot_loss[dur_loss=0.2077, prior_loss=0.9759, diff_loss=0.308, tot_loss=1.492, over 442.00 samples.], 2024-10-21 13:06:34,908 INFO [train.py:561] (1/4) Epoch 2269, batch 12, global_batch_idx: 36300, batch size: 152, loss[dur_loss=0.2044, prior_loss=0.9757, diff_loss=0.3149, tot_loss=1.495, over 152.00 samples.], tot_loss[dur_loss=0.2049, prior_loss=0.9754, diff_loss=0.3675, tot_loss=1.548, over 1966.00 samples.], 2024-10-21 13:06:39,327 INFO [train.py:682] (1/4) Start epoch 2270 2024-10-21 13:06:56,384 INFO [train.py:561] (1/4) Epoch 2270, batch 6, global_batch_idx: 36310, batch size: 106, loss[dur_loss=0.2074, prior_loss=0.9759, diff_loss=0.2953, tot_loss=1.479, over 106.00 samples.], tot_loss[dur_loss=0.2017, prior_loss=0.9753, diff_loss=0.4058, tot_loss=1.583, over 1142.00 samples.], 2024-10-21 13:07:09,433 INFO [train.py:682] (1/4) Start epoch 2271 2024-10-21 13:07:17,850 INFO [train.py:561] (1/4) Epoch 2271, batch 0, global_batch_idx: 36320, batch size: 108, loss[dur_loss=0.2102, prior_loss=0.9765, diff_loss=0.2952, tot_loss=1.482, over 108.00 samples.], tot_loss[dur_loss=0.2102, prior_loss=0.9765, diff_loss=0.2952, tot_loss=1.482, over 108.00 samples.], 2024-10-21 13:07:32,017 INFO [train.py:561] (1/4) Epoch 2271, batch 10, global_batch_idx: 36330, batch size: 111, loss[dur_loss=0.2091, prior_loss=0.9769, diff_loss=0.2837, tot_loss=1.47, over 111.00 samples.], tot_loss[dur_loss=0.2049, prior_loss=0.9755, diff_loss=0.3696, tot_loss=1.55, over 1656.00 samples.], 2024-10-21 13:07:39,068 INFO [train.py:682] (1/4) Start epoch 2272 2024-10-21 13:07:52,834 INFO [train.py:561] (1/4) Epoch 2272, batch 4, global_batch_idx: 36340, batch size: 189, loss[dur_loss=0.2049, prior_loss=0.9758, diff_loss=0.337, tot_loss=1.518, over 189.00 samples.], tot_loss[dur_loss=0.2013, prior_loss=0.975, diff_loss=0.4267, tot_loss=1.603, over 937.00 samples.], 2024-10-21 13:08:07,654 INFO [train.py:561] (1/4) Epoch 2272, batch 14, global_batch_idx: 36350, batch size: 142, loss[dur_loss=0.208, prior_loss=0.9754, diff_loss=0.3333, tot_loss=1.517, over 142.00 samples.], tot_loss[dur_loss=0.205, prior_loss=0.9755, diff_loss=0.3622, tot_loss=1.543, over 2210.00 samples.], 2024-10-21 13:08:09,088 INFO [train.py:682] (1/4) Start epoch 2273 2024-10-21 13:08:28,894 INFO [train.py:561] (1/4) Epoch 2273, batch 8, global_batch_idx: 36360, batch size: 170, loss[dur_loss=0.2109, prior_loss=0.9758, diff_loss=0.316, 
tot_loss=1.503, over 170.00 samples.], tot_loss[dur_loss=0.2057, prior_loss=0.9753, diff_loss=0.3808, tot_loss=1.562, over 1432.00 samples.], 2024-10-21 13:08:39,032 INFO [train.py:682] (1/4) Start epoch 2274 2024-10-21 13:08:50,221 INFO [train.py:561] (1/4) Epoch 2274, batch 2, global_batch_idx: 36370, batch size: 203, loss[dur_loss=0.206, prior_loss=0.9758, diff_loss=0.3293, tot_loss=1.511, over 203.00 samples.], tot_loss[dur_loss=0.2071, prior_loss=0.9759, diff_loss=0.3173, tot_loss=1.5, over 442.00 samples.], 2024-10-21 13:09:04,419 INFO [train.py:561] (1/4) Epoch 2274, batch 12, global_batch_idx: 36380, batch size: 152, loss[dur_loss=0.204, prior_loss=0.9758, diff_loss=0.3576, tot_loss=1.537, over 152.00 samples.], tot_loss[dur_loss=0.204, prior_loss=0.9754, diff_loss=0.3682, tot_loss=1.548, over 1966.00 samples.], 2024-10-21 13:09:08,862 INFO [train.py:682] (1/4) Start epoch 2275 2024-10-21 13:09:25,633 INFO [train.py:561] (1/4) Epoch 2275, batch 6, global_batch_idx: 36390, batch size: 106, loss[dur_loss=0.2086, prior_loss=0.9759, diff_loss=0.2977, tot_loss=1.482, over 106.00 samples.], tot_loss[dur_loss=0.2031, prior_loss=0.9751, diff_loss=0.3934, tot_loss=1.572, over 1142.00 samples.], 2024-10-21 13:09:38,603 INFO [train.py:682] (1/4) Start epoch 2276 2024-10-21 13:09:47,107 INFO [train.py:561] (1/4) Epoch 2276, batch 0, global_batch_idx: 36400, batch size: 108, loss[dur_loss=0.2121, prior_loss=0.9765, diff_loss=0.3022, tot_loss=1.491, over 108.00 samples.], tot_loss[dur_loss=0.2121, prior_loss=0.9765, diff_loss=0.3022, tot_loss=1.491, over 108.00 samples.], 2024-10-21 13:10:01,293 INFO [train.py:561] (1/4) Epoch 2276, batch 10, global_batch_idx: 36410, batch size: 111, loss[dur_loss=0.2104, prior_loss=0.9769, diff_loss=0.296, tot_loss=1.483, over 111.00 samples.], tot_loss[dur_loss=0.2041, prior_loss=0.9754, diff_loss=0.3649, tot_loss=1.544, over 1656.00 samples.], 2024-10-21 13:10:08,363 INFO [train.py:682] (1/4) Start epoch 2277 2024-10-21 13:10:22,478 INFO [train.py:561] (1/4) Epoch 2277, batch 4, global_batch_idx: 36420, batch size: 189, loss[dur_loss=0.2062, prior_loss=0.9758, diff_loss=0.3283, tot_loss=1.51, over 189.00 samples.], tot_loss[dur_loss=0.2015, prior_loss=0.9751, diff_loss=0.4341, tot_loss=1.611, over 937.00 samples.], 2024-10-21 13:10:37,258 INFO [train.py:561] (1/4) Epoch 2277, batch 14, global_batch_idx: 36430, batch size: 142, loss[dur_loss=0.2072, prior_loss=0.9755, diff_loss=0.3034, tot_loss=1.486, over 142.00 samples.], tot_loss[dur_loss=0.2052, prior_loss=0.9756, diff_loss=0.3699, tot_loss=1.551, over 2210.00 samples.], 2024-10-21 13:10:38,682 INFO [train.py:682] (1/4) Start epoch 2278 2024-10-21 13:10:58,283 INFO [train.py:561] (1/4) Epoch 2278, batch 8, global_batch_idx: 36440, batch size: 170, loss[dur_loss=0.2125, prior_loss=0.9761, diff_loss=0.3093, tot_loss=1.498, over 170.00 samples.], tot_loss[dur_loss=0.205, prior_loss=0.9754, diff_loss=0.3884, tot_loss=1.569, over 1432.00 samples.], 2024-10-21 13:11:08,487 INFO [train.py:682] (1/4) Start epoch 2279 2024-10-21 13:11:19,801 INFO [train.py:561] (1/4) Epoch 2279, batch 2, global_batch_idx: 36450, batch size: 203, loss[dur_loss=0.2087, prior_loss=0.9762, diff_loss=0.3225, tot_loss=1.507, over 203.00 samples.], tot_loss[dur_loss=0.2088, prior_loss=0.9761, diff_loss=0.3221, tot_loss=1.507, over 442.00 samples.], 2024-10-21 13:11:34,001 INFO [train.py:561] (1/4) Epoch 2279, batch 12, global_batch_idx: 36460, batch size: 152, loss[dur_loss=0.2046, prior_loss=0.9761, diff_loss=0.3228, tot_loss=1.503, over 
152.00 samples.], tot_loss[dur_loss=0.2059, prior_loss=0.9757, diff_loss=0.3684, tot_loss=1.55, over 1966.00 samples.], 2024-10-21 13:11:38,475 INFO [train.py:682] (1/4) Start epoch 2280 2024-10-21 13:11:55,152 INFO [train.py:561] (1/4) Epoch 2280, batch 6, global_batch_idx: 36470, batch size: 106, loss[dur_loss=0.205, prior_loss=0.9759, diff_loss=0.2655, tot_loss=1.446, over 106.00 samples.], tot_loss[dur_loss=0.2015, prior_loss=0.9751, diff_loss=0.4057, tot_loss=1.582, over 1142.00 samples.], 2024-10-21 13:12:08,130 INFO [train.py:682] (1/4) Start epoch 2281 2024-10-21 13:12:16,939 INFO [train.py:561] (1/4) Epoch 2281, batch 0, global_batch_idx: 36480, batch size: 108, loss[dur_loss=0.2075, prior_loss=0.9763, diff_loss=0.3488, tot_loss=1.533, over 108.00 samples.], tot_loss[dur_loss=0.2075, prior_loss=0.9763, diff_loss=0.3488, tot_loss=1.533, over 108.00 samples.], 2024-10-21 13:12:31,167 INFO [train.py:561] (1/4) Epoch 2281, batch 10, global_batch_idx: 36490, batch size: 111, loss[dur_loss=0.2072, prior_loss=0.9766, diff_loss=0.3036, tot_loss=1.487, over 111.00 samples.], tot_loss[dur_loss=0.204, prior_loss=0.9754, diff_loss=0.3658, tot_loss=1.545, over 1656.00 samples.], 2024-10-21 13:12:38,295 INFO [train.py:682] (1/4) Start epoch 2282 2024-10-21 13:12:51,892 INFO [train.py:561] (1/4) Epoch 2282, batch 4, global_batch_idx: 36500, batch size: 189, loss[dur_loss=0.2033, prior_loss=0.9756, diff_loss=0.356, tot_loss=1.535, over 189.00 samples.], tot_loss[dur_loss=0.2016, prior_loss=0.975, diff_loss=0.423, tot_loss=1.6, over 937.00 samples.], 2024-10-21 13:13:06,596 INFO [train.py:561] (1/4) Epoch 2282, batch 14, global_batch_idx: 36510, batch size: 142, loss[dur_loss=0.2047, prior_loss=0.9754, diff_loss=0.2905, tot_loss=1.471, over 142.00 samples.], tot_loss[dur_loss=0.2042, prior_loss=0.9755, diff_loss=0.3564, tot_loss=1.536, over 2210.00 samples.], 2024-10-21 13:13:08,014 INFO [train.py:682] (1/4) Start epoch 2283 2024-10-21 13:13:27,671 INFO [train.py:561] (1/4) Epoch 2283, batch 8, global_batch_idx: 36520, batch size: 170, loss[dur_loss=0.2104, prior_loss=0.9757, diff_loss=0.2864, tot_loss=1.472, over 170.00 samples.], tot_loss[dur_loss=0.2031, prior_loss=0.9753, diff_loss=0.381, tot_loss=1.559, over 1432.00 samples.], 2024-10-21 13:13:37,706 INFO [train.py:682] (1/4) Start epoch 2284 2024-10-21 13:13:49,097 INFO [train.py:561] (1/4) Epoch 2284, batch 2, global_batch_idx: 36530, batch size: 203, loss[dur_loss=0.2063, prior_loss=0.9757, diff_loss=0.3387, tot_loss=1.521, over 203.00 samples.], tot_loss[dur_loss=0.2064, prior_loss=0.9759, diff_loss=0.3264, tot_loss=1.509, over 442.00 samples.], 2024-10-21 13:14:03,257 INFO [train.py:561] (1/4) Epoch 2284, batch 12, global_batch_idx: 36540, batch size: 152, loss[dur_loss=0.2023, prior_loss=0.9757, diff_loss=0.2871, tot_loss=1.465, over 152.00 samples.], tot_loss[dur_loss=0.2037, prior_loss=0.9755, diff_loss=0.3624, tot_loss=1.542, over 1966.00 samples.], 2024-10-21 13:14:07,680 INFO [train.py:682] (1/4) Start epoch 2285 2024-10-21 13:14:24,776 INFO [train.py:561] (1/4) Epoch 2285, batch 6, global_batch_idx: 36550, batch size: 106, loss[dur_loss=0.2071, prior_loss=0.9757, diff_loss=0.3205, tot_loss=1.503, over 106.00 samples.], tot_loss[dur_loss=0.2029, prior_loss=0.9752, diff_loss=0.4064, tot_loss=1.584, over 1142.00 samples.], 2024-10-21 13:14:37,837 INFO [train.py:682] (1/4) Start epoch 2286 2024-10-21 13:14:46,378 INFO [train.py:561] (1/4) Epoch 2286, batch 0, global_batch_idx: 36560, batch size: 108, loss[dur_loss=0.2105, 
prior_loss=0.9764, diff_loss=0.319, tot_loss=1.506, over 108.00 samples.], tot_loss[dur_loss=0.2105, prior_loss=0.9764, diff_loss=0.319, tot_loss=1.506, over 108.00 samples.], 2024-10-21 13:15:00,518 INFO [train.py:561] (1/4) Epoch 2286, batch 10, global_batch_idx: 36570, batch size: 111, loss[dur_loss=0.2105, prior_loss=0.9769, diff_loss=0.2644, tot_loss=1.452, over 111.00 samples.], tot_loss[dur_loss=0.2039, prior_loss=0.9755, diff_loss=0.3733, tot_loss=1.553, over 1656.00 samples.], 2024-10-21 13:15:07,579 INFO [train.py:682] (1/4) Start epoch 2287 2024-10-21 13:15:21,079 INFO [train.py:561] (1/4) Epoch 2287, batch 4, global_batch_idx: 36580, batch size: 189, loss[dur_loss=0.2044, prior_loss=0.9756, diff_loss=0.3295, tot_loss=1.51, over 189.00 samples.], tot_loss[dur_loss=0.2019, prior_loss=0.9748, diff_loss=0.4252, tot_loss=1.602, over 937.00 samples.], 2024-10-21 13:15:35,814 INFO [train.py:561] (1/4) Epoch 2287, batch 14, global_batch_idx: 36590, batch size: 142, loss[dur_loss=0.2053, prior_loss=0.9753, diff_loss=0.3191, tot_loss=1.5, over 142.00 samples.], tot_loss[dur_loss=0.2047, prior_loss=0.9755, diff_loss=0.3588, tot_loss=1.539, over 2210.00 samples.], 2024-10-21 13:15:37,226 INFO [train.py:682] (1/4) Start epoch 2288 2024-10-21 13:15:57,054 INFO [train.py:561] (1/4) Epoch 2288, batch 8, global_batch_idx: 36600, batch size: 170, loss[dur_loss=0.2107, prior_loss=0.976, diff_loss=0.3244, tot_loss=1.511, over 170.00 samples.], tot_loss[dur_loss=0.2042, prior_loss=0.9753, diff_loss=0.3807, tot_loss=1.56, over 1432.00 samples.], 2024-10-21 13:16:07,141 INFO [train.py:682] (1/4) Start epoch 2289 2024-10-21 13:16:18,296 INFO [train.py:561] (1/4) Epoch 2289, batch 2, global_batch_idx: 36610, batch size: 203, loss[dur_loss=0.2052, prior_loss=0.9756, diff_loss=0.3601, tot_loss=1.541, over 203.00 samples.], tot_loss[dur_loss=0.2057, prior_loss=0.9758, diff_loss=0.3375, tot_loss=1.519, over 442.00 samples.], 2024-10-21 13:16:32,544 INFO [train.py:561] (1/4) Epoch 2289, batch 12, global_batch_idx: 36620, batch size: 152, loss[dur_loss=0.2025, prior_loss=0.9758, diff_loss=0.3208, tot_loss=1.499, over 152.00 samples.], tot_loss[dur_loss=0.2033, prior_loss=0.9754, diff_loss=0.3728, tot_loss=1.552, over 1966.00 samples.], 2024-10-21 13:16:37,016 INFO [train.py:682] (1/4) Start epoch 2290 2024-10-21 13:16:53,979 INFO [train.py:561] (1/4) Epoch 2290, batch 6, global_batch_idx: 36630, batch size: 106, loss[dur_loss=0.2059, prior_loss=0.9756, diff_loss=0.2898, tot_loss=1.471, over 106.00 samples.], tot_loss[dur_loss=0.2013, prior_loss=0.9751, diff_loss=0.4107, tot_loss=1.587, over 1142.00 samples.], 2024-10-21 13:17:07,035 INFO [train.py:682] (1/4) Start epoch 2291 2024-10-21 13:17:15,859 INFO [train.py:561] (1/4) Epoch 2291, batch 0, global_batch_idx: 36640, batch size: 108, loss[dur_loss=0.2099, prior_loss=0.9765, diff_loss=0.318, tot_loss=1.504, over 108.00 samples.], tot_loss[dur_loss=0.2099, prior_loss=0.9765, diff_loss=0.318, tot_loss=1.504, over 108.00 samples.], 2024-10-21 13:17:30,051 INFO [train.py:561] (1/4) Epoch 2291, batch 10, global_batch_idx: 36650, batch size: 111, loss[dur_loss=0.2098, prior_loss=0.9766, diff_loss=0.2909, tot_loss=1.477, over 111.00 samples.], tot_loss[dur_loss=0.2049, prior_loss=0.9755, diff_loss=0.367, tot_loss=1.547, over 1656.00 samples.], 2024-10-21 13:17:37,150 INFO [train.py:682] (1/4) Start epoch 2292 2024-10-21 13:17:50,814 INFO [train.py:561] (1/4) Epoch 2292, batch 4, global_batch_idx: 36660, batch size: 189, loss[dur_loss=0.2061, prior_loss=0.9757, 
diff_loss=0.3333, tot_loss=1.515, over 189.00 samples.], tot_loss[dur_loss=0.2013, prior_loss=0.9748, diff_loss=0.4181, tot_loss=1.594, over 937.00 samples.], 2024-10-21 13:18:05,529 INFO [train.py:561] (1/4) Epoch 2292, batch 14, global_batch_idx: 36670, batch size: 142, loss[dur_loss=0.2067, prior_loss=0.9756, diff_loss=0.3092, tot_loss=1.492, over 142.00 samples.], tot_loss[dur_loss=0.2043, prior_loss=0.9754, diff_loss=0.3586, tot_loss=1.538, over 2210.00 samples.], 2024-10-21 13:18:06,968 INFO [train.py:682] (1/4) Start epoch 2293 2024-10-21 13:18:26,875 INFO [train.py:561] (1/4) Epoch 2293, batch 8, global_batch_idx: 36680, batch size: 170, loss[dur_loss=0.2073, prior_loss=0.9758, diff_loss=0.3504, tot_loss=1.533, over 170.00 samples.], tot_loss[dur_loss=0.2043, prior_loss=0.9752, diff_loss=0.3774, tot_loss=1.557, over 1432.00 samples.], 2024-10-21 13:18:36,935 INFO [train.py:682] (1/4) Start epoch 2294 2024-10-21 13:18:48,072 INFO [train.py:561] (1/4) Epoch 2294, batch 2, global_batch_idx: 36690, batch size: 203, loss[dur_loss=0.2064, prior_loss=0.9755, diff_loss=0.3116, tot_loss=1.493, over 203.00 samples.], tot_loss[dur_loss=0.2076, prior_loss=0.9757, diff_loss=0.3107, tot_loss=1.494, over 442.00 samples.], 2024-10-21 13:19:02,191 INFO [train.py:561] (1/4) Epoch 2294, batch 12, global_batch_idx: 36700, batch size: 152, loss[dur_loss=0.2047, prior_loss=0.9759, diff_loss=0.3131, tot_loss=1.494, over 152.00 samples.], tot_loss[dur_loss=0.2041, prior_loss=0.9753, diff_loss=0.3669, tot_loss=1.546, over 1966.00 samples.], 2024-10-21 13:19:06,597 INFO [train.py:682] (1/4) Start epoch 2295 2024-10-21 13:19:23,394 INFO [train.py:561] (1/4) Epoch 2295, batch 6, global_batch_idx: 36710, batch size: 106, loss[dur_loss=0.2041, prior_loss=0.9758, diff_loss=0.3354, tot_loss=1.515, over 106.00 samples.], tot_loss[dur_loss=0.2017, prior_loss=0.9751, diff_loss=0.4083, tot_loss=1.585, over 1142.00 samples.], 2024-10-21 13:19:36,568 INFO [train.py:682] (1/4) Start epoch 2296 2024-10-21 13:19:45,457 INFO [train.py:561] (1/4) Epoch 2296, batch 0, global_batch_idx: 36720, batch size: 108, loss[dur_loss=0.2086, prior_loss=0.9761, diff_loss=0.3031, tot_loss=1.488, over 108.00 samples.], tot_loss[dur_loss=0.2086, prior_loss=0.9761, diff_loss=0.3031, tot_loss=1.488, over 108.00 samples.], 2024-10-21 13:19:59,702 INFO [train.py:561] (1/4) Epoch 2296, batch 10, global_batch_idx: 36730, batch size: 111, loss[dur_loss=0.2083, prior_loss=0.9767, diff_loss=0.3277, tot_loss=1.513, over 111.00 samples.], tot_loss[dur_loss=0.2027, prior_loss=0.9752, diff_loss=0.3767, tot_loss=1.555, over 1656.00 samples.], 2024-10-21 13:20:06,731 INFO [train.py:682] (1/4) Start epoch 2297 2024-10-21 13:20:20,246 INFO [train.py:561] (1/4) Epoch 2297, batch 4, global_batch_idx: 36740, batch size: 189, loss[dur_loss=0.2027, prior_loss=0.9756, diff_loss=0.3044, tot_loss=1.483, over 189.00 samples.], tot_loss[dur_loss=0.2015, prior_loss=0.9748, diff_loss=0.4187, tot_loss=1.595, over 937.00 samples.], 2024-10-21 13:20:35,007 INFO [train.py:561] (1/4) Epoch 2297, batch 14, global_batch_idx: 36750, batch size: 142, loss[dur_loss=0.2049, prior_loss=0.9753, diff_loss=0.3178, tot_loss=1.498, over 142.00 samples.], tot_loss[dur_loss=0.2038, prior_loss=0.9753, diff_loss=0.3574, tot_loss=1.537, over 2210.00 samples.], 2024-10-21 13:20:36,416 INFO [train.py:682] (1/4) Start epoch 2298 2024-10-21 13:20:56,520 INFO [train.py:561] (1/4) Epoch 2298, batch 8, global_batch_idx: 36760, batch size: 170, loss[dur_loss=0.2069, prior_loss=0.9757, 
diff_loss=0.3207, tot_loss=1.503, over 170.00 samples.], tot_loss[dur_loss=0.203, prior_loss=0.9751, diff_loss=0.3739, tot_loss=1.552, over 1432.00 samples.], 2024-10-21 13:21:06,666 INFO [train.py:682] (1/4) Start epoch 2299 2024-10-21 13:21:17,896 INFO [train.py:561] (1/4) Epoch 2299, batch 2, global_batch_idx: 36770, batch size: 203, loss[dur_loss=0.2049, prior_loss=0.9756, diff_loss=0.3317, tot_loss=1.512, over 203.00 samples.], tot_loss[dur_loss=0.2056, prior_loss=0.9758, diff_loss=0.3192, tot_loss=1.501, over 442.00 samples.], 2024-10-21 13:21:32,307 INFO [train.py:561] (1/4) Epoch 2299, batch 12, global_batch_idx: 36780, batch size: 152, loss[dur_loss=0.2035, prior_loss=0.9758, diff_loss=0.3078, tot_loss=1.487, over 152.00 samples.], tot_loss[dur_loss=0.2036, prior_loss=0.9753, diff_loss=0.3598, tot_loss=1.539, over 1966.00 samples.], 2024-10-21 13:21:36,697 INFO [train.py:682] (1/4) Start epoch 2300 2024-10-21 13:21:53,697 INFO [train.py:561] (1/4) Epoch 2300, batch 6, global_batch_idx: 36790, batch size: 106, loss[dur_loss=0.2068, prior_loss=0.9757, diff_loss=0.2779, tot_loss=1.46, over 106.00 samples.], tot_loss[dur_loss=0.2013, prior_loss=0.9751, diff_loss=0.4028, tot_loss=1.579, over 1142.00 samples.], 2024-10-21 13:22:06,730 INFO [train.py:682] (1/4) Start epoch 2301 2024-10-21 13:22:15,464 INFO [train.py:561] (1/4) Epoch 2301, batch 0, global_batch_idx: 36800, batch size: 108, loss[dur_loss=0.2052, prior_loss=0.9763, diff_loss=0.2905, tot_loss=1.472, over 108.00 samples.], tot_loss[dur_loss=0.2052, prior_loss=0.9763, diff_loss=0.2905, tot_loss=1.472, over 108.00 samples.], 2024-10-21 13:22:29,586 INFO [train.py:561] (1/4) Epoch 2301, batch 10, global_batch_idx: 36810, batch size: 111, loss[dur_loss=0.2048, prior_loss=0.9766, diff_loss=0.2948, tot_loss=1.476, over 111.00 samples.], tot_loss[dur_loss=0.2029, prior_loss=0.9753, diff_loss=0.3564, tot_loss=1.535, over 1656.00 samples.], 2024-10-21 13:22:36,609 INFO [train.py:682] (1/4) Start epoch 2302 2024-10-21 13:22:50,385 INFO [train.py:561] (1/4) Epoch 2302, batch 4, global_batch_idx: 36820, batch size: 189, loss[dur_loss=0.2041, prior_loss=0.9758, diff_loss=0.3538, tot_loss=1.534, over 189.00 samples.], tot_loss[dur_loss=0.2012, prior_loss=0.9748, diff_loss=0.4386, tot_loss=1.615, over 937.00 samples.], 2024-10-21 13:23:05,211 INFO [train.py:561] (1/4) Epoch 2302, batch 14, global_batch_idx: 36830, batch size: 142, loss[dur_loss=0.2066, prior_loss=0.9751, diff_loss=0.3096, tot_loss=1.491, over 142.00 samples.], tot_loss[dur_loss=0.204, prior_loss=0.9753, diff_loss=0.3573, tot_loss=1.537, over 2210.00 samples.], 2024-10-21 13:23:06,638 INFO [train.py:682] (1/4) Start epoch 2303 2024-10-21 13:23:26,121 INFO [train.py:561] (1/4) Epoch 2303, batch 8, global_batch_idx: 36840, batch size: 170, loss[dur_loss=0.2072, prior_loss=0.9757, diff_loss=0.3345, tot_loss=1.517, over 170.00 samples.], tot_loss[dur_loss=0.202, prior_loss=0.9752, diff_loss=0.3902, tot_loss=1.567, over 1432.00 samples.], 2024-10-21 13:23:36,113 INFO [train.py:682] (1/4) Start epoch 2304 2024-10-21 13:23:47,545 INFO [train.py:561] (1/4) Epoch 2304, batch 2, global_batch_idx: 36850, batch size: 203, loss[dur_loss=0.205, prior_loss=0.9754, diff_loss=0.3466, tot_loss=1.527, over 203.00 samples.], tot_loss[dur_loss=0.2056, prior_loss=0.9757, diff_loss=0.3165, tot_loss=1.498, over 442.00 samples.], 2024-10-21 13:24:01,682 INFO [train.py:561] (1/4) Epoch 2304, batch 12, global_batch_idx: 36860, batch size: 152, loss[dur_loss=0.2044, prior_loss=0.9756, diff_loss=0.3509, 
tot_loss=1.531, over 152.00 samples.], tot_loss[dur_loss=0.2032, prior_loss=0.9753, diff_loss=0.3677, tot_loss=1.546, over 1966.00 samples.], 2024-10-21 13:24:06,087 INFO [train.py:682] (1/4) Start epoch 2305 2024-10-21 13:24:22,764 INFO [train.py:561] (1/4) Epoch 2305, batch 6, global_batch_idx: 36870, batch size: 106, loss[dur_loss=0.2049, prior_loss=0.9755, diff_loss=0.3113, tot_loss=1.492, over 106.00 samples.], tot_loss[dur_loss=0.2007, prior_loss=0.975, diff_loss=0.404, tot_loss=1.58, over 1142.00 samples.], 2024-10-21 13:24:35,756 INFO [train.py:682] (1/4) Start epoch 2306 2024-10-21 13:24:44,511 INFO [train.py:561] (1/4) Epoch 2306, batch 0, global_batch_idx: 36880, batch size: 108, loss[dur_loss=0.2054, prior_loss=0.9764, diff_loss=0.2814, tot_loss=1.463, over 108.00 samples.], tot_loss[dur_loss=0.2054, prior_loss=0.9764, diff_loss=0.2814, tot_loss=1.463, over 108.00 samples.], 2024-10-21 13:24:58,615 INFO [train.py:561] (1/4) Epoch 2306, batch 10, global_batch_idx: 36890, batch size: 111, loss[dur_loss=0.2099, prior_loss=0.9766, diff_loss=0.2992, tot_loss=1.486, over 111.00 samples.], tot_loss[dur_loss=0.2033, prior_loss=0.9754, diff_loss=0.3629, tot_loss=1.542, over 1656.00 samples.], 2024-10-21 13:25:05,691 INFO [train.py:682] (1/4) Start epoch 2307 2024-10-21 13:25:19,141 INFO [train.py:561] (1/4) Epoch 2307, batch 4, global_batch_idx: 36900, batch size: 189, loss[dur_loss=0.2059, prior_loss=0.9759, diff_loss=0.32, tot_loss=1.502, over 189.00 samples.], tot_loss[dur_loss=0.2024, prior_loss=0.9749, diff_loss=0.43, tot_loss=1.607, over 937.00 samples.], 2024-10-21 13:25:33,999 INFO [train.py:561] (1/4) Epoch 2307, batch 14, global_batch_idx: 36910, batch size: 142, loss[dur_loss=0.2067, prior_loss=0.9755, diff_loss=0.3031, tot_loss=1.485, over 142.00 samples.], tot_loss[dur_loss=0.2049, prior_loss=0.9754, diff_loss=0.3631, tot_loss=1.543, over 2210.00 samples.], 2024-10-21 13:25:35,414 INFO [train.py:682] (1/4) Start epoch 2308 2024-10-21 13:25:55,182 INFO [train.py:561] (1/4) Epoch 2308, batch 8, global_batch_idx: 36920, batch size: 170, loss[dur_loss=0.2066, prior_loss=0.9756, diff_loss=0.3165, tot_loss=1.499, over 170.00 samples.], tot_loss[dur_loss=0.2025, prior_loss=0.9751, diff_loss=0.3815, tot_loss=1.559, over 1432.00 samples.], 2024-10-21 13:26:05,226 INFO [train.py:682] (1/4) Start epoch 2309 2024-10-21 13:26:16,552 INFO [train.py:561] (1/4) Epoch 2309, batch 2, global_batch_idx: 36930, batch size: 203, loss[dur_loss=0.206, prior_loss=0.9757, diff_loss=0.3285, tot_loss=1.51, over 203.00 samples.], tot_loss[dur_loss=0.2072, prior_loss=0.9757, diff_loss=0.3024, tot_loss=1.485, over 442.00 samples.], 2024-10-21 13:26:30,718 INFO [train.py:561] (1/4) Epoch 2309, batch 12, global_batch_idx: 36940, batch size: 152, loss[dur_loss=0.2027, prior_loss=0.9757, diff_loss=0.3408, tot_loss=1.519, over 152.00 samples.], tot_loss[dur_loss=0.2041, prior_loss=0.9753, diff_loss=0.3587, tot_loss=1.538, over 1966.00 samples.], 2024-10-21 13:26:35,149 INFO [train.py:682] (1/4) Start epoch 2310 2024-10-21 13:26:51,940 INFO [train.py:561] (1/4) Epoch 2310, batch 6, global_batch_idx: 36950, batch size: 106, loss[dur_loss=0.2042, prior_loss=0.9758, diff_loss=0.2971, tot_loss=1.477, over 106.00 samples.], tot_loss[dur_loss=0.1991, prior_loss=0.9749, diff_loss=0.3881, tot_loss=1.562, over 1142.00 samples.], 2024-10-21 13:27:04,981 INFO [train.py:682] (1/4) Start epoch 2311 2024-10-21 13:27:13,441 INFO [train.py:561] (1/4) Epoch 2311, batch 0, global_batch_idx: 36960, batch size: 108, 
loss[dur_loss=0.2093, prior_loss=0.9761, diff_loss=0.2981, tot_loss=1.483, over 108.00 samples.], tot_loss[dur_loss=0.2093, prior_loss=0.9761, diff_loss=0.2981, tot_loss=1.483, over 108.00 samples.], 2024-10-21 13:27:27,633 INFO [train.py:561] (1/4) Epoch 2311, batch 10, global_batch_idx: 36970, batch size: 111, loss[dur_loss=0.2075, prior_loss=0.9763, diff_loss=0.3106, tot_loss=1.494, over 111.00 samples.], tot_loss[dur_loss=0.2029, prior_loss=0.9752, diff_loss=0.3653, tot_loss=1.543, over 1656.00 samples.], 2024-10-21 13:27:34,721 INFO [train.py:682] (1/4) Start epoch 2312 2024-10-21 13:27:48,646 INFO [train.py:561] (1/4) Epoch 2312, batch 4, global_batch_idx: 36980, batch size: 189, loss[dur_loss=0.2044, prior_loss=0.9758, diff_loss=0.3446, tot_loss=1.525, over 189.00 samples.], tot_loss[dur_loss=0.2011, prior_loss=0.9748, diff_loss=0.4214, tot_loss=1.597, over 937.00 samples.], 2024-10-21 13:28:03,512 INFO [train.py:561] (1/4) Epoch 2312, batch 14, global_batch_idx: 36990, batch size: 142, loss[dur_loss=0.204, prior_loss=0.9753, diff_loss=0.3342, tot_loss=1.514, over 142.00 samples.], tot_loss[dur_loss=0.2038, prior_loss=0.9753, diff_loss=0.3562, tot_loss=1.535, over 2210.00 samples.], 2024-10-21 13:28:04,928 INFO [train.py:682] (1/4) Start epoch 2313 2024-10-21 13:28:24,772 INFO [train.py:561] (1/4) Epoch 2313, batch 8, global_batch_idx: 37000, batch size: 170, loss[dur_loss=0.2116, prior_loss=0.9757, diff_loss=0.3098, tot_loss=1.497, over 170.00 samples.], tot_loss[dur_loss=0.2021, prior_loss=0.9751, diff_loss=0.3756, tot_loss=1.553, over 1432.00 samples.], 2024-10-21 13:28:34,893 INFO [train.py:682] (1/4) Start epoch 2314 2024-10-21 13:28:46,325 INFO [train.py:561] (1/4) Epoch 2314, batch 2, global_batch_idx: 37010, batch size: 203, loss[dur_loss=0.2042, prior_loss=0.9756, diff_loss=0.3468, tot_loss=1.527, over 203.00 samples.], tot_loss[dur_loss=0.206, prior_loss=0.9758, diff_loss=0.3196, tot_loss=1.501, over 442.00 samples.], 2024-10-21 13:29:00,473 INFO [train.py:561] (1/4) Epoch 2314, batch 12, global_batch_idx: 37020, batch size: 152, loss[dur_loss=0.2076, prior_loss=0.9758, diff_loss=0.3173, tot_loss=1.501, over 152.00 samples.], tot_loss[dur_loss=0.2041, prior_loss=0.9753, diff_loss=0.3638, tot_loss=1.543, over 1966.00 samples.], 2024-10-21 13:29:04,894 INFO [train.py:682] (1/4) Start epoch 2315 2024-10-21 13:29:21,635 INFO [train.py:561] (1/4) Epoch 2315, batch 6, global_batch_idx: 37030, batch size: 106, loss[dur_loss=0.2072, prior_loss=0.9759, diff_loss=0.2887, tot_loss=1.472, over 106.00 samples.], tot_loss[dur_loss=0.2016, prior_loss=0.975, diff_loss=0.4025, tot_loss=1.579, over 1142.00 samples.], 2024-10-21 13:29:34,708 INFO [train.py:682] (1/4) Start epoch 2316 2024-10-21 13:29:43,313 INFO [train.py:561] (1/4) Epoch 2316, batch 0, global_batch_idx: 37040, batch size: 108, loss[dur_loss=0.2082, prior_loss=0.9764, diff_loss=0.2731, tot_loss=1.458, over 108.00 samples.], tot_loss[dur_loss=0.2082, prior_loss=0.9764, diff_loss=0.2731, tot_loss=1.458, over 108.00 samples.], 2024-10-21 13:29:57,525 INFO [train.py:561] (1/4) Epoch 2316, batch 10, global_batch_idx: 37050, batch size: 111, loss[dur_loss=0.2098, prior_loss=0.9767, diff_loss=0.2935, tot_loss=1.48, over 111.00 samples.], tot_loss[dur_loss=0.2036, prior_loss=0.9753, diff_loss=0.3729, tot_loss=1.552, over 1656.00 samples.], 2024-10-21 13:30:04,620 INFO [train.py:682] (1/4) Start epoch 2317 2024-10-21 13:30:18,393 INFO [train.py:561] (1/4) Epoch 2317, batch 4, global_batch_idx: 37060, batch size: 189, 
loss[dur_loss=0.2043, prior_loss=0.9758, diff_loss=0.3272, tot_loss=1.507, over 189.00 samples.], tot_loss[dur_loss=0.2011, prior_loss=0.9748, diff_loss=0.4323, tot_loss=1.608, over 937.00 samples.], 2024-10-21 13:30:33,164 INFO [train.py:561] (1/4) Epoch 2317, batch 14, global_batch_idx: 37070, batch size: 142, loss[dur_loss=0.2045, prior_loss=0.9752, diff_loss=0.3177, tot_loss=1.497, over 142.00 samples.], tot_loss[dur_loss=0.2037, prior_loss=0.9753, diff_loss=0.3599, tot_loss=1.539, over 2210.00 samples.], 2024-10-21 13:30:34,577 INFO [train.py:682] (1/4) Start epoch 2318 2024-10-21 13:30:54,331 INFO [train.py:561] (1/4) Epoch 2318, batch 8, global_batch_idx: 37080, batch size: 170, loss[dur_loss=0.2101, prior_loss=0.9759, diff_loss=0.3141, tot_loss=1.5, over 170.00 samples.], tot_loss[dur_loss=0.2026, prior_loss=0.9752, diff_loss=0.3868, tot_loss=1.565, over 1432.00 samples.], 2024-10-21 13:31:04,506 INFO [train.py:682] (1/4) Start epoch 2319 2024-10-21 13:31:15,928 INFO [train.py:561] (1/4) Epoch 2319, batch 2, global_batch_idx: 37090, batch size: 203, loss[dur_loss=0.2037, prior_loss=0.9755, diff_loss=0.3487, tot_loss=1.528, over 203.00 samples.], tot_loss[dur_loss=0.2048, prior_loss=0.9756, diff_loss=0.3352, tot_loss=1.516, over 442.00 samples.], 2024-10-21 13:31:30,075 INFO [train.py:561] (1/4) Epoch 2319, batch 12, global_batch_idx: 37100, batch size: 152, loss[dur_loss=0.2047, prior_loss=0.9757, diff_loss=0.3052, tot_loss=1.486, over 152.00 samples.], tot_loss[dur_loss=0.2032, prior_loss=0.9752, diff_loss=0.3622, tot_loss=1.541, over 1966.00 samples.], 2024-10-21 13:31:34,494 INFO [train.py:682] (1/4) Start epoch 2320 2024-10-21 13:31:51,761 INFO [train.py:561] (1/4) Epoch 2320, batch 6, global_batch_idx: 37110, batch size: 106, loss[dur_loss=0.2043, prior_loss=0.9757, diff_loss=0.2636, tot_loss=1.444, over 106.00 samples.], tot_loss[dur_loss=0.2013, prior_loss=0.9749, diff_loss=0.3977, tot_loss=1.574, over 1142.00 samples.], 2024-10-21 13:32:04,775 INFO [train.py:682] (1/4) Start epoch 2321 2024-10-21 13:32:13,442 INFO [train.py:561] (1/4) Epoch 2321, batch 0, global_batch_idx: 37120, batch size: 108, loss[dur_loss=0.206, prior_loss=0.9761, diff_loss=0.2905, tot_loss=1.473, over 108.00 samples.], tot_loss[dur_loss=0.206, prior_loss=0.9761, diff_loss=0.2905, tot_loss=1.473, over 108.00 samples.], 2024-10-21 13:32:27,525 INFO [train.py:561] (1/4) Epoch 2321, batch 10, global_batch_idx: 37130, batch size: 111, loss[dur_loss=0.2075, prior_loss=0.9763, diff_loss=0.3061, tot_loss=1.49, over 111.00 samples.], tot_loss[dur_loss=0.2024, prior_loss=0.9751, diff_loss=0.3795, tot_loss=1.557, over 1656.00 samples.], 2024-10-21 13:32:34,541 INFO [train.py:682] (1/4) Start epoch 2322 2024-10-21 13:32:48,311 INFO [train.py:561] (1/4) Epoch 2322, batch 4, global_batch_idx: 37140, batch size: 189, loss[dur_loss=0.2048, prior_loss=0.9756, diff_loss=0.3452, tot_loss=1.526, over 189.00 samples.], tot_loss[dur_loss=0.2003, prior_loss=0.9746, diff_loss=0.4147, tot_loss=1.59, over 937.00 samples.], 2024-10-21 13:33:03,051 INFO [train.py:561] (1/4) Epoch 2322, batch 14, global_batch_idx: 37150, batch size: 142, loss[dur_loss=0.2049, prior_loss=0.9753, diff_loss=0.2863, tot_loss=1.467, over 142.00 samples.], tot_loss[dur_loss=0.2039, prior_loss=0.9753, diff_loss=0.3531, tot_loss=1.532, over 2210.00 samples.], 2024-10-21 13:33:04,459 INFO [train.py:682] (1/4) Start epoch 2323 2024-10-21 13:33:24,152 INFO [train.py:561] (1/4) Epoch 2323, batch 8, global_batch_idx: 37160, batch size: 170, 
loss[dur_loss=0.2079, prior_loss=0.9755, diff_loss=0.3291, tot_loss=1.512, over 170.00 samples.], tot_loss[dur_loss=0.2029, prior_loss=0.975, diff_loss=0.3941, tot_loss=1.572, over 1432.00 samples.], 2024-10-21 13:33:34,210 INFO [train.py:682] (1/4) Start epoch 2324 2024-10-21 13:33:45,608 INFO [train.py:561] (1/4) Epoch 2324, batch 2, global_batch_idx: 37170, batch size: 203, loss[dur_loss=0.2029, prior_loss=0.9756, diff_loss=0.3535, tot_loss=1.532, over 203.00 samples.], tot_loss[dur_loss=0.2037, prior_loss=0.9756, diff_loss=0.3252, tot_loss=1.504, over 442.00 samples.], 2024-10-21 13:33:59,972 INFO [train.py:561] (1/4) Epoch 2324, batch 12, global_batch_idx: 37180, batch size: 152, loss[dur_loss=0.2016, prior_loss=0.9756, diff_loss=0.292, tot_loss=1.469, over 152.00 samples.], tot_loss[dur_loss=0.2028, prior_loss=0.9752, diff_loss=0.371, tot_loss=1.549, over 1966.00 samples.], 2024-10-21 13:34:04,395 INFO [train.py:682] (1/4) Start epoch 2325 2024-10-21 13:34:21,648 INFO [train.py:561] (1/4) Epoch 2325, batch 6, global_batch_idx: 37190, batch size: 106, loss[dur_loss=0.2052, prior_loss=0.9757, diff_loss=0.2596, tot_loss=1.44, over 106.00 samples.], tot_loss[dur_loss=0.2018, prior_loss=0.9749, diff_loss=0.3985, tot_loss=1.575, over 1142.00 samples.], 2024-10-21 13:34:34,691 INFO [train.py:682] (1/4) Start epoch 2326 2024-10-21 13:34:43,479 INFO [train.py:561] (1/4) Epoch 2326, batch 0, global_batch_idx: 37200, batch size: 108, loss[dur_loss=0.2085, prior_loss=0.976, diff_loss=0.3091, tot_loss=1.494, over 108.00 samples.], tot_loss[dur_loss=0.2085, prior_loss=0.976, diff_loss=0.3091, tot_loss=1.494, over 108.00 samples.], 2024-10-21 13:34:57,555 INFO [train.py:561] (1/4) Epoch 2326, batch 10, global_batch_idx: 37210, batch size: 111, loss[dur_loss=0.206, prior_loss=0.9764, diff_loss=0.3351, tot_loss=1.517, over 111.00 samples.], tot_loss[dur_loss=0.2033, prior_loss=0.9751, diff_loss=0.3704, tot_loss=1.549, over 1656.00 samples.], 2024-10-21 13:35:04,580 INFO [train.py:682] (1/4) Start epoch 2327 2024-10-21 13:35:18,256 INFO [train.py:561] (1/4) Epoch 2327, batch 4, global_batch_idx: 37220, batch size: 189, loss[dur_loss=0.2039, prior_loss=0.9756, diff_loss=0.3424, tot_loss=1.522, over 189.00 samples.], tot_loss[dur_loss=0.2008, prior_loss=0.9747, diff_loss=0.4318, tot_loss=1.607, over 937.00 samples.], 2024-10-21 13:35:33,261 INFO [train.py:561] (1/4) Epoch 2327, batch 14, global_batch_idx: 37230, batch size: 142, loss[dur_loss=0.2042, prior_loss=0.9751, diff_loss=0.3133, tot_loss=1.493, over 142.00 samples.], tot_loss[dur_loss=0.2036, prior_loss=0.9753, diff_loss=0.3618, tot_loss=1.541, over 2210.00 samples.], 2024-10-21 13:35:34,697 INFO [train.py:682] (1/4) Start epoch 2328 2024-10-21 13:35:54,504 INFO [train.py:561] (1/4) Epoch 2328, batch 8, global_batch_idx: 37240, batch size: 170, loss[dur_loss=0.2037, prior_loss=0.9752, diff_loss=0.3188, tot_loss=1.498, over 170.00 samples.], tot_loss[dur_loss=0.2014, prior_loss=0.975, diff_loss=0.3728, tot_loss=1.549, over 1432.00 samples.], 2024-10-21 13:36:04,566 INFO [train.py:682] (1/4) Start epoch 2329 2024-10-21 13:36:15,696 INFO [train.py:561] (1/4) Epoch 2329, batch 2, global_batch_idx: 37250, batch size: 203, loss[dur_loss=0.2033, prior_loss=0.9755, diff_loss=0.3363, tot_loss=1.515, over 203.00 samples.], tot_loss[dur_loss=0.205, prior_loss=0.9757, diff_loss=0.3294, tot_loss=1.51, over 442.00 samples.], 2024-10-21 13:36:29,822 INFO [train.py:561] (1/4) Epoch 2329, batch 12, global_batch_idx: 37260, batch size: 152, loss[dur_loss=0.2037, 
prior_loss=0.9756, diff_loss=0.3157, tot_loss=1.495, over 152.00 samples.], tot_loss[dur_loss=0.2035, prior_loss=0.9753, diff_loss=0.3585, tot_loss=1.537, over 1966.00 samples.], 2024-10-21 13:36:34,258 INFO [train.py:682] (1/4) Start epoch 2330 2024-10-21 13:36:51,191 INFO [train.py:561] (1/4) Epoch 2330, batch 6, global_batch_idx: 37270, batch size: 106, loss[dur_loss=0.2026, prior_loss=0.9757, diff_loss=0.3104, tot_loss=1.489, over 106.00 samples.], tot_loss[dur_loss=0.2001, prior_loss=0.9749, diff_loss=0.4053, tot_loss=1.58, over 1142.00 samples.], 2024-10-21 13:37:04,163 INFO [train.py:682] (1/4) Start epoch 2331 2024-10-21 13:37:12,898 INFO [train.py:561] (1/4) Epoch 2331, batch 0, global_batch_idx: 37280, batch size: 108, loss[dur_loss=0.2082, prior_loss=0.9762, diff_loss=0.3223, tot_loss=1.507, over 108.00 samples.], tot_loss[dur_loss=0.2082, prior_loss=0.9762, diff_loss=0.3223, tot_loss=1.507, over 108.00 samples.], 2024-10-21 13:37:27,043 INFO [train.py:561] (1/4) Epoch 2331, batch 10, global_batch_idx: 37290, batch size: 111, loss[dur_loss=0.2061, prior_loss=0.9765, diff_loss=0.3081, tot_loss=1.491, over 111.00 samples.], tot_loss[dur_loss=0.2027, prior_loss=0.9751, diff_loss=0.3778, tot_loss=1.556, over 1656.00 samples.], 2024-10-21 13:37:34,093 INFO [train.py:682] (1/4) Start epoch 2332 2024-10-21 13:37:47,611 INFO [train.py:561] (1/4) Epoch 2332, batch 4, global_batch_idx: 37300, batch size: 189, loss[dur_loss=0.2036, prior_loss=0.9757, diff_loss=0.3315, tot_loss=1.511, over 189.00 samples.], tot_loss[dur_loss=0.1985, prior_loss=0.9746, diff_loss=0.4251, tot_loss=1.598, over 937.00 samples.], 2024-10-21 13:38:02,382 INFO [train.py:561] (1/4) Epoch 2332, batch 14, global_batch_idx: 37310, batch size: 142, loss[dur_loss=0.2043, prior_loss=0.9753, diff_loss=0.3063, tot_loss=1.486, over 142.00 samples.], tot_loss[dur_loss=0.2015, prior_loss=0.9752, diff_loss=0.3537, tot_loss=1.53, over 2210.00 samples.], 2024-10-21 13:38:03,802 INFO [train.py:682] (1/4) Start epoch 2333 2024-10-21 13:38:23,742 INFO [train.py:561] (1/4) Epoch 2333, batch 8, global_batch_idx: 37320, batch size: 170, loss[dur_loss=0.2083, prior_loss=0.9755, diff_loss=0.308, tot_loss=1.492, over 170.00 samples.], tot_loss[dur_loss=0.2025, prior_loss=0.975, diff_loss=0.3846, tot_loss=1.562, over 1432.00 samples.], 2024-10-21 13:38:33,781 INFO [train.py:682] (1/4) Start epoch 2334 2024-10-21 13:38:45,193 INFO [train.py:561] (1/4) Epoch 2334, batch 2, global_batch_idx: 37330, batch size: 203, loss[dur_loss=0.2037, prior_loss=0.9757, diff_loss=0.3256, tot_loss=1.505, over 203.00 samples.], tot_loss[dur_loss=0.2048, prior_loss=0.9756, diff_loss=0.3064, tot_loss=1.487, over 442.00 samples.], 2024-10-21 13:38:59,323 INFO [train.py:561] (1/4) Epoch 2334, batch 12, global_batch_idx: 37340, batch size: 152, loss[dur_loss=0.204, prior_loss=0.9753, diff_loss=0.2598, tot_loss=1.439, over 152.00 samples.], tot_loss[dur_loss=0.2023, prior_loss=0.9751, diff_loss=0.355, tot_loss=1.532, over 1966.00 samples.], 2024-10-21 13:39:03,734 INFO [train.py:682] (1/4) Start epoch 2335 2024-10-21 13:39:20,470 INFO [train.py:561] (1/4) Epoch 2335, batch 6, global_batch_idx: 37350, batch size: 106, loss[dur_loss=0.2069, prior_loss=0.9757, diff_loss=0.3209, tot_loss=1.504, over 106.00 samples.], tot_loss[dur_loss=0.1995, prior_loss=0.9749, diff_loss=0.4069, tot_loss=1.581, over 1142.00 samples.], 2024-10-21 13:39:33,463 INFO [train.py:682] (1/4) Start epoch 2336 2024-10-21 13:39:41,953 INFO [train.py:561] (1/4) Epoch 2336, batch 0, 
global_batch_idx: 37360, batch size: 108, loss[dur_loss=0.2094, prior_loss=0.9763, diff_loss=0.2723, tot_loss=1.458, over 108.00 samples.], tot_loss[dur_loss=0.2094, prior_loss=0.9763, diff_loss=0.2723, tot_loss=1.458, over 108.00 samples.], 2024-10-21 13:39:56,035 INFO [train.py:561] (1/4) Epoch 2336, batch 10, global_batch_idx: 37370, batch size: 111, loss[dur_loss=0.2067, prior_loss=0.9766, diff_loss=0.3365, tot_loss=1.52, over 111.00 samples.], tot_loss[dur_loss=0.2028, prior_loss=0.9752, diff_loss=0.3682, tot_loss=1.546, over 1656.00 samples.], 2024-10-21 13:40:03,085 INFO [train.py:682] (1/4) Start epoch 2337 2024-10-21 13:40:16,931 INFO [train.py:561] (1/4) Epoch 2337, batch 4, global_batch_idx: 37380, batch size: 189, loss[dur_loss=0.2054, prior_loss=0.9753, diff_loss=0.324, tot_loss=1.505, over 189.00 samples.], tot_loss[dur_loss=0.2011, prior_loss=0.9746, diff_loss=0.4104, tot_loss=1.586, over 937.00 samples.], 2024-10-21 13:40:31,727 INFO [train.py:561] (1/4) Epoch 2337, batch 14, global_batch_idx: 37390, batch size: 142, loss[dur_loss=0.2036, prior_loss=0.9751, diff_loss=0.3202, tot_loss=1.499, over 142.00 samples.], tot_loss[dur_loss=0.2029, prior_loss=0.9752, diff_loss=0.3582, tot_loss=1.536, over 2210.00 samples.], 2024-10-21 13:40:33,142 INFO [train.py:682] (1/4) Start epoch 2338 2024-10-21 13:40:52,810 INFO [train.py:561] (1/4) Epoch 2338, batch 8, global_batch_idx: 37400, batch size: 170, loss[dur_loss=0.2079, prior_loss=0.9754, diff_loss=0.2973, tot_loss=1.481, over 170.00 samples.], tot_loss[dur_loss=0.2013, prior_loss=0.9749, diff_loss=0.3942, tot_loss=1.57, over 1432.00 samples.], 2024-10-21 13:41:02,903 INFO [train.py:682] (1/4) Start epoch 2339 2024-10-21 13:41:13,958 INFO [train.py:561] (1/4) Epoch 2339, batch 2, global_batch_idx: 37410, batch size: 203, loss[dur_loss=0.2059, prior_loss=0.9753, diff_loss=0.3327, tot_loss=1.514, over 203.00 samples.], tot_loss[dur_loss=0.2058, prior_loss=0.9755, diff_loss=0.3254, tot_loss=1.507, over 442.00 samples.], 2024-10-21 13:41:28,075 INFO [train.py:561] (1/4) Epoch 2339, batch 12, global_batch_idx: 37420, batch size: 152, loss[dur_loss=0.2046, prior_loss=0.9753, diff_loss=0.2825, tot_loss=1.462, over 152.00 samples.], tot_loss[dur_loss=0.2026, prior_loss=0.9751, diff_loss=0.3593, tot_loss=1.537, over 1966.00 samples.], 2024-10-21 13:41:32,534 INFO [train.py:682] (1/4) Start epoch 2340 2024-10-21 13:41:49,459 INFO [train.py:561] (1/4) Epoch 2340, batch 6, global_batch_idx: 37430, batch size: 106, loss[dur_loss=0.2042, prior_loss=0.9756, diff_loss=0.2886, tot_loss=1.468, over 106.00 samples.], tot_loss[dur_loss=0.2008, prior_loss=0.9748, diff_loss=0.4005, tot_loss=1.576, over 1142.00 samples.], 2024-10-21 13:42:02,436 INFO [train.py:682] (1/4) Start epoch 2341 2024-10-21 13:42:11,121 INFO [train.py:561] (1/4) Epoch 2341, batch 0, global_batch_idx: 37440, batch size: 108, loss[dur_loss=0.2103, prior_loss=0.9761, diff_loss=0.3454, tot_loss=1.532, over 108.00 samples.], tot_loss[dur_loss=0.2103, prior_loss=0.9761, diff_loss=0.3454, tot_loss=1.532, over 108.00 samples.], 2024-10-21 13:42:25,144 INFO [train.py:561] (1/4) Epoch 2341, batch 10, global_batch_idx: 37450, batch size: 111, loss[dur_loss=0.205, prior_loss=0.9762, diff_loss=0.3144, tot_loss=1.496, over 111.00 samples.], tot_loss[dur_loss=0.2026, prior_loss=0.9751, diff_loss=0.3669, tot_loss=1.545, over 1656.00 samples.], 2024-10-21 13:42:32,197 INFO [train.py:682] (1/4) Start epoch 2342 2024-10-21 13:42:46,030 INFO [train.py:561] (1/4) Epoch 2342, batch 4, 
global_batch_idx: 37460, batch size: 189, loss[dur_loss=0.2049, prior_loss=0.9756, diff_loss=0.3265, tot_loss=1.507, over 189.00 samples.], tot_loss[dur_loss=0.1998, prior_loss=0.9746, diff_loss=0.4331, tot_loss=1.607, over 937.00 samples.], 2024-10-21 13:43:00,693 INFO [train.py:561] (1/4) Epoch 2342, batch 14, global_batch_idx: 37470, batch size: 142, loss[dur_loss=0.2038, prior_loss=0.9751, diff_loss=0.3147, tot_loss=1.494, over 142.00 samples.], tot_loss[dur_loss=0.2032, prior_loss=0.9751, diff_loss=0.364, tot_loss=1.542, over 2210.00 samples.], 2024-10-21 13:43:02,110 INFO [train.py:682] (1/4) Start epoch 2343 2024-10-21 13:43:21,679 INFO [train.py:561] (1/4) Epoch 2343, batch 8, global_batch_idx: 37480, batch size: 170, loss[dur_loss=0.2079, prior_loss=0.9757, diff_loss=0.3214, tot_loss=1.505, over 170.00 samples.], tot_loss[dur_loss=0.203, prior_loss=0.975, diff_loss=0.3868, tot_loss=1.565, over 1432.00 samples.], 2024-10-21 13:43:31,740 INFO [train.py:682] (1/4) Start epoch 2344 2024-10-21 13:43:42,884 INFO [train.py:561] (1/4) Epoch 2344, batch 2, global_batch_idx: 37490, batch size: 203, loss[dur_loss=0.2053, prior_loss=0.9754, diff_loss=0.3306, tot_loss=1.511, over 203.00 samples.], tot_loss[dur_loss=0.2059, prior_loss=0.9755, diff_loss=0.3044, tot_loss=1.486, over 442.00 samples.], 2024-10-21 13:43:57,055 INFO [train.py:561] (1/4) Epoch 2344, batch 12, global_batch_idx: 37500, batch size: 152, loss[dur_loss=0.2024, prior_loss=0.9752, diff_loss=0.2995, tot_loss=1.477, over 152.00 samples.], tot_loss[dur_loss=0.2027, prior_loss=0.975, diff_loss=0.3596, tot_loss=1.537, over 1966.00 samples.], 2024-10-21 13:43:58,684 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 13:44:56,690 INFO [train.py:589] (1/4) Epoch 2344, validation: dur_loss=0.4612, prior_loss=1.037, diff_loss=0.3943, tot_loss=1.892, over 100.00 samples. 
2024-10-21 13:44:56,691 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 13:44:59,536 INFO [train.py:682] (1/4) Start epoch 2345 2024-10-21 13:45:16,768 INFO [train.py:561] (1/4) Epoch 2345, batch 6, global_batch_idx: 37510, batch size: 106, loss[dur_loss=0.209, prior_loss=0.9758, diff_loss=0.2893, tot_loss=1.474, over 106.00 samples.], tot_loss[dur_loss=0.2003, prior_loss=0.9747, diff_loss=0.4041, tot_loss=1.579, over 1142.00 samples.], 2024-10-21 13:45:29,801 INFO [train.py:682] (1/4) Start epoch 2346 2024-10-21 13:45:38,406 INFO [train.py:561] (1/4) Epoch 2346, batch 0, global_batch_idx: 37520, batch size: 108, loss[dur_loss=0.2084, prior_loss=0.976, diff_loss=0.3188, tot_loss=1.503, over 108.00 samples.], tot_loss[dur_loss=0.2084, prior_loss=0.976, diff_loss=0.3188, tot_loss=1.503, over 108.00 samples.], 2024-10-21 13:45:52,590 INFO [train.py:561] (1/4) Epoch 2346, batch 10, global_batch_idx: 37530, batch size: 111, loss[dur_loss=0.2077, prior_loss=0.9763, diff_loss=0.3077, tot_loss=1.492, over 111.00 samples.], tot_loss[dur_loss=0.2028, prior_loss=0.975, diff_loss=0.3625, tot_loss=1.54, over 1656.00 samples.], 2024-10-21 13:45:59,677 INFO [train.py:682] (1/4) Start epoch 2347 2024-10-21 13:46:13,374 INFO [train.py:561] (1/4) Epoch 2347, batch 4, global_batch_idx: 37540, batch size: 189, loss[dur_loss=0.2039, prior_loss=0.9752, diff_loss=0.3106, tot_loss=1.49, over 189.00 samples.], tot_loss[dur_loss=0.1989, prior_loss=0.9745, diff_loss=0.4196, tot_loss=1.593, over 937.00 samples.], 2024-10-21 13:46:28,300 INFO [train.py:561] (1/4) Epoch 2347, batch 14, global_batch_idx: 37550, batch size: 142, loss[dur_loss=0.2048, prior_loss=0.9751, diff_loss=0.3157, tot_loss=1.496, over 142.00 samples.], tot_loss[dur_loss=0.202, prior_loss=0.9751, diff_loss=0.3558, tot_loss=1.533, over 2210.00 samples.], 2024-10-21 13:46:29,724 INFO [train.py:682] (1/4) Start epoch 2348 2024-10-21 13:46:49,663 INFO [train.py:561] (1/4) Epoch 2348, batch 8, global_batch_idx: 37560, batch size: 170, loss[dur_loss=0.2094, prior_loss=0.9758, diff_loss=0.3326, tot_loss=1.518, over 170.00 samples.], tot_loss[dur_loss=0.2026, prior_loss=0.975, diff_loss=0.3793, tot_loss=1.557, over 1432.00 samples.], 2024-10-21 13:46:59,762 INFO [train.py:682] (1/4) Start epoch 2349 2024-10-21 13:47:11,015 INFO [train.py:561] (1/4) Epoch 2349, batch 2, global_batch_idx: 37570, batch size: 203, loss[dur_loss=0.2054, prior_loss=0.9755, diff_loss=0.3466, tot_loss=1.528, over 203.00 samples.], tot_loss[dur_loss=0.2052, prior_loss=0.9755, diff_loss=0.3152, tot_loss=1.496, over 442.00 samples.], 2024-10-21 13:47:25,233 INFO [train.py:561] (1/4) Epoch 2349, batch 12, global_batch_idx: 37580, batch size: 152, loss[dur_loss=0.203, prior_loss=0.9755, diff_loss=0.3079, tot_loss=1.487, over 152.00 samples.], tot_loss[dur_loss=0.2024, prior_loss=0.9751, diff_loss=0.3581, tot_loss=1.536, over 1966.00 samples.], 2024-10-21 13:47:29,668 INFO [train.py:682] (1/4) Start epoch 2350 2024-10-21 13:47:46,403 INFO [train.py:561] (1/4) Epoch 2350, batch 6, global_batch_idx: 37590, batch size: 106, loss[dur_loss=0.2047, prior_loss=0.9757, diff_loss=0.3021, tot_loss=1.482, over 106.00 samples.], tot_loss[dur_loss=0.2012, prior_loss=0.9748, diff_loss=0.4022, tot_loss=1.578, over 1142.00 samples.], 2024-10-21 13:47:59,331 INFO [train.py:682] (1/4) Start epoch 2351 2024-10-21 13:48:08,423 INFO [train.py:561] (1/4) Epoch 2351, batch 0, global_batch_idx: 37600, batch size: 108, loss[dur_loss=0.2082, prior_loss=0.9763, diff_loss=0.3208, 
tot_loss=1.505, over 108.00 samples.], tot_loss[dur_loss=0.2082, prior_loss=0.9763, diff_loss=0.3208, tot_loss=1.505, over 108.00 samples.], 2024-10-21 13:48:22,636 INFO [train.py:561] (1/4) Epoch 2351, batch 10, global_batch_idx: 37610, batch size: 111, loss[dur_loss=0.2048, prior_loss=0.9763, diff_loss=0.3373, tot_loss=1.518, over 111.00 samples.], tot_loss[dur_loss=0.2011, prior_loss=0.975, diff_loss=0.3775, tot_loss=1.554, over 1656.00 samples.], 2024-10-21 13:48:29,712 INFO [train.py:682] (1/4) Start epoch 2352 2024-10-21 13:48:43,442 INFO [train.py:561] (1/4) Epoch 2352, batch 4, global_batch_idx: 37620, batch size: 189, loss[dur_loss=0.2056, prior_loss=0.9755, diff_loss=0.3413, tot_loss=1.522, over 189.00 samples.], tot_loss[dur_loss=0.2004, prior_loss=0.9745, diff_loss=0.4347, tot_loss=1.61, over 937.00 samples.], 2024-10-21 13:48:58,306 INFO [train.py:561] (1/4) Epoch 2352, batch 14, global_batch_idx: 37630, batch size: 142, loss[dur_loss=0.2079, prior_loss=0.975, diff_loss=0.3384, tot_loss=1.521, over 142.00 samples.], tot_loss[dur_loss=0.2033, prior_loss=0.9751, diff_loss=0.3649, tot_loss=1.543, over 2210.00 samples.], 2024-10-21 13:48:59,734 INFO [train.py:682] (1/4) Start epoch 2353 2024-10-21 13:49:19,805 INFO [train.py:561] (1/4) Epoch 2353, batch 8, global_batch_idx: 37640, batch size: 170, loss[dur_loss=0.2067, prior_loss=0.9754, diff_loss=0.2932, tot_loss=1.475, over 170.00 samples.], tot_loss[dur_loss=0.2019, prior_loss=0.9748, diff_loss=0.3782, tot_loss=1.555, over 1432.00 samples.], 2024-10-21 13:49:29,888 INFO [train.py:682] (1/4) Start epoch 2354 2024-10-21 13:49:41,257 INFO [train.py:561] (1/4) Epoch 2354, batch 2, global_batch_idx: 37650, batch size: 203, loss[dur_loss=0.2049, prior_loss=0.9753, diff_loss=0.3664, tot_loss=1.547, over 203.00 samples.], tot_loss[dur_loss=0.2049, prior_loss=0.9754, diff_loss=0.3296, tot_loss=1.51, over 442.00 samples.], 2024-10-21 13:49:55,551 INFO [train.py:561] (1/4) Epoch 2354, batch 12, global_batch_idx: 37660, batch size: 152, loss[dur_loss=0.2, prior_loss=0.9752, diff_loss=0.3082, tot_loss=1.483, over 152.00 samples.], tot_loss[dur_loss=0.2018, prior_loss=0.975, diff_loss=0.3702, tot_loss=1.547, over 1966.00 samples.], 2024-10-21 13:50:00,003 INFO [train.py:682] (1/4) Start epoch 2355 2024-10-21 13:50:17,055 INFO [train.py:561] (1/4) Epoch 2355, batch 6, global_batch_idx: 37670, batch size: 106, loss[dur_loss=0.2052, prior_loss=0.9755, diff_loss=0.2657, tot_loss=1.446, over 106.00 samples.], tot_loss[dur_loss=0.1996, prior_loss=0.9746, diff_loss=0.4037, tot_loss=1.578, over 1142.00 samples.], 2024-10-21 13:50:29,998 INFO [train.py:682] (1/4) Start epoch 2356 2024-10-21 13:50:38,618 INFO [train.py:561] (1/4) Epoch 2356, batch 0, global_batch_idx: 37680, batch size: 108, loss[dur_loss=0.2066, prior_loss=0.976, diff_loss=0.2986, tot_loss=1.481, over 108.00 samples.], tot_loss[dur_loss=0.2066, prior_loss=0.976, diff_loss=0.2986, tot_loss=1.481, over 108.00 samples.], 2024-10-21 13:50:52,702 INFO [train.py:561] (1/4) Epoch 2356, batch 10, global_batch_idx: 37690, batch size: 111, loss[dur_loss=0.2032, prior_loss=0.9761, diff_loss=0.3003, tot_loss=1.48, over 111.00 samples.], tot_loss[dur_loss=0.2015, prior_loss=0.9749, diff_loss=0.3683, tot_loss=1.545, over 1656.00 samples.], 2024-10-21 13:50:59,739 INFO [train.py:682] (1/4) Start epoch 2357 2024-10-21 13:51:13,618 INFO [train.py:561] (1/4) Epoch 2357, batch 4, global_batch_idx: 37700, batch size: 189, loss[dur_loss=0.2036, prior_loss=0.9754, diff_loss=0.3314, tot_loss=1.51, over 
189.00 samples.], tot_loss[dur_loss=0.1997, prior_loss=0.9745, diff_loss=0.4272, tot_loss=1.601, over 937.00 samples.], 2024-10-21 13:51:28,457 INFO [train.py:561] (1/4) Epoch 2357, batch 14, global_batch_idx: 37710, batch size: 142, loss[dur_loss=0.2003, prior_loss=0.9748, diff_loss=0.2484, tot_loss=1.423, over 142.00 samples.], tot_loss[dur_loss=0.2016, prior_loss=0.975, diff_loss=0.365, tot_loss=1.542, over 2210.00 samples.], 2024-10-21 13:51:29,880 INFO [train.py:682] (1/4) Start epoch 2358 2024-10-21 13:51:49,681 INFO [train.py:561] (1/4) Epoch 2358, batch 8, global_batch_idx: 37720, batch size: 170, loss[dur_loss=0.2093, prior_loss=0.9756, diff_loss=0.3246, tot_loss=1.509, over 170.00 samples.], tot_loss[dur_loss=0.202, prior_loss=0.9748, diff_loss=0.3795, tot_loss=1.556, over 1432.00 samples.], 2024-10-21 13:51:59,687 INFO [train.py:682] (1/4) Start epoch 2359 2024-10-21 13:52:10,813 INFO [train.py:561] (1/4) Epoch 2359, batch 2, global_batch_idx: 37730, batch size: 203, loss[dur_loss=0.2038, prior_loss=0.9752, diff_loss=0.3208, tot_loss=1.5, over 203.00 samples.], tot_loss[dur_loss=0.2044, prior_loss=0.9753, diff_loss=0.3109, tot_loss=1.491, over 442.00 samples.], 2024-10-21 13:52:25,033 INFO [train.py:561] (1/4) Epoch 2359, batch 12, global_batch_idx: 37740, batch size: 152, loss[dur_loss=0.2036, prior_loss=0.9755, diff_loss=0.3131, tot_loss=1.492, over 152.00 samples.], tot_loss[dur_loss=0.2024, prior_loss=0.975, diff_loss=0.3634, tot_loss=1.541, over 1966.00 samples.], 2024-10-21 13:52:29,493 INFO [train.py:682] (1/4) Start epoch 2360 2024-10-21 13:52:46,220 INFO [train.py:561] (1/4) Epoch 2360, batch 6, global_batch_idx: 37750, batch size: 106, loss[dur_loss=0.205, prior_loss=0.9754, diff_loss=0.2713, tot_loss=1.452, over 106.00 samples.], tot_loss[dur_loss=0.2003, prior_loss=0.9747, diff_loss=0.3984, tot_loss=1.573, over 1142.00 samples.], 2024-10-21 13:52:59,152 INFO [train.py:682] (1/4) Start epoch 2361 2024-10-21 13:53:07,882 INFO [train.py:561] (1/4) Epoch 2361, batch 0, global_batch_idx: 37760, batch size: 108, loss[dur_loss=0.2108, prior_loss=0.9761, diff_loss=0.2657, tot_loss=1.453, over 108.00 samples.], tot_loss[dur_loss=0.2108, prior_loss=0.9761, diff_loss=0.2657, tot_loss=1.453, over 108.00 samples.], 2024-10-21 13:53:22,060 INFO [train.py:561] (1/4) Epoch 2361, batch 10, global_batch_idx: 37770, batch size: 111, loss[dur_loss=0.2101, prior_loss=0.9763, diff_loss=0.3204, tot_loss=1.507, over 111.00 samples.], tot_loss[dur_loss=0.2032, prior_loss=0.9751, diff_loss=0.3689, tot_loss=1.547, over 1656.00 samples.], 2024-10-21 13:53:29,169 INFO [train.py:682] (1/4) Start epoch 2362 2024-10-21 13:53:42,755 INFO [train.py:561] (1/4) Epoch 2362, batch 4, global_batch_idx: 37780, batch size: 189, loss[dur_loss=0.204, prior_loss=0.9753, diff_loss=0.3546, tot_loss=1.534, over 189.00 samples.], tot_loss[dur_loss=0.2008, prior_loss=0.9745, diff_loss=0.4332, tot_loss=1.608, over 937.00 samples.], 2024-10-21 13:53:57,494 INFO [train.py:561] (1/4) Epoch 2362, batch 14, global_batch_idx: 37790, batch size: 142, loss[dur_loss=0.2028, prior_loss=0.9751, diff_loss=0.3284, tot_loss=1.506, over 142.00 samples.], tot_loss[dur_loss=0.2023, prior_loss=0.9751, diff_loss=0.3667, tot_loss=1.544, over 2210.00 samples.], 2024-10-21 13:53:58,912 INFO [train.py:682] (1/4) Start epoch 2363 2024-10-21 13:54:18,840 INFO [train.py:561] (1/4) Epoch 2363, batch 8, global_batch_idx: 37800, batch size: 170, loss[dur_loss=0.2061, prior_loss=0.9754, diff_loss=0.317, tot_loss=1.498, over 170.00 samples.], 
tot_loss[dur_loss=0.2009, prior_loss=0.9749, diff_loss=0.3811, tot_loss=1.557, over 1432.00 samples.], 2024-10-21 13:54:28,892 INFO [train.py:682] (1/4) Start epoch 2364 2024-10-21 13:54:40,410 INFO [train.py:561] (1/4) Epoch 2364, batch 2, global_batch_idx: 37810, batch size: 203, loss[dur_loss=0.2011, prior_loss=0.9751, diff_loss=0.3359, tot_loss=1.512, over 203.00 samples.], tot_loss[dur_loss=0.2024, prior_loss=0.9752, diff_loss=0.3284, tot_loss=1.506, over 442.00 samples.], 2024-10-21 13:54:54,499 INFO [train.py:561] (1/4) Epoch 2364, batch 12, global_batch_idx: 37820, batch size: 152, loss[dur_loss=0.2018, prior_loss=0.9751, diff_loss=0.2861, tot_loss=1.463, over 152.00 samples.], tot_loss[dur_loss=0.2004, prior_loss=0.9749, diff_loss=0.3597, tot_loss=1.535, over 1966.00 samples.], 2024-10-21 13:54:58,934 INFO [train.py:682] (1/4) Start epoch 2365 2024-10-21 13:55:15,652 INFO [train.py:561] (1/4) Epoch 2365, batch 6, global_batch_idx: 37830, batch size: 106, loss[dur_loss=0.2028, prior_loss=0.9753, diff_loss=0.2826, tot_loss=1.461, over 106.00 samples.], tot_loss[dur_loss=0.1997, prior_loss=0.9746, diff_loss=0.3944, tot_loss=1.569, over 1142.00 samples.], 2024-10-21 13:55:28,672 INFO [train.py:682] (1/4) Start epoch 2366 2024-10-21 13:55:37,295 INFO [train.py:561] (1/4) Epoch 2366, batch 0, global_batch_idx: 37840, batch size: 108, loss[dur_loss=0.2052, prior_loss=0.9758, diff_loss=0.3589, tot_loss=1.54, over 108.00 samples.], tot_loss[dur_loss=0.2052, prior_loss=0.9758, diff_loss=0.3589, tot_loss=1.54, over 108.00 samples.], 2024-10-21 13:55:51,462 INFO [train.py:561] (1/4) Epoch 2366, batch 10, global_batch_idx: 37850, batch size: 111, loss[dur_loss=0.205, prior_loss=0.9761, diff_loss=0.3402, tot_loss=1.521, over 111.00 samples.], tot_loss[dur_loss=0.2012, prior_loss=0.9749, diff_loss=0.3796, tot_loss=1.556, over 1656.00 samples.], 2024-10-21 13:55:58,489 INFO [train.py:682] (1/4) Start epoch 2367 2024-10-21 13:56:12,359 INFO [train.py:561] (1/4) Epoch 2367, batch 4, global_batch_idx: 37860, batch size: 189, loss[dur_loss=0.2022, prior_loss=0.9751, diff_loss=0.3429, tot_loss=1.52, over 189.00 samples.], tot_loss[dur_loss=0.1993, prior_loss=0.9743, diff_loss=0.4229, tot_loss=1.597, over 937.00 samples.], 2024-10-21 13:56:27,025 INFO [train.py:561] (1/4) Epoch 2367, batch 14, global_batch_idx: 37870, batch size: 142, loss[dur_loss=0.203, prior_loss=0.9746, diff_loss=0.3089, tot_loss=1.487, over 142.00 samples.], tot_loss[dur_loss=0.2023, prior_loss=0.9749, diff_loss=0.3547, tot_loss=1.532, over 2210.00 samples.], 2024-10-21 13:56:28,432 INFO [train.py:682] (1/4) Start epoch 2368 2024-10-21 13:56:47,992 INFO [train.py:561] (1/4) Epoch 2368, batch 8, global_batch_idx: 37880, batch size: 170, loss[dur_loss=0.2041, prior_loss=0.9754, diff_loss=0.3148, tot_loss=1.494, over 170.00 samples.], tot_loss[dur_loss=0.2017, prior_loss=0.9747, diff_loss=0.3832, tot_loss=1.56, over 1432.00 samples.], 2024-10-21 13:56:58,048 INFO [train.py:682] (1/4) Start epoch 2369 2024-10-21 13:57:09,141 INFO [train.py:561] (1/4) Epoch 2369, batch 2, global_batch_idx: 37890, batch size: 203, loss[dur_loss=0.2005, prior_loss=0.9748, diff_loss=0.3432, tot_loss=1.519, over 203.00 samples.], tot_loss[dur_loss=0.2027, prior_loss=0.9751, diff_loss=0.2995, tot_loss=1.477, over 442.00 samples.], 2024-10-21 13:57:23,564 INFO [train.py:561] (1/4) Epoch 2369, batch 12, global_batch_idx: 37900, batch size: 152, loss[dur_loss=0.2017, prior_loss=0.9754, diff_loss=0.3306, tot_loss=1.508, over 152.00 samples.], 
tot_loss[dur_loss=0.2014, prior_loss=0.9749, diff_loss=0.3622, tot_loss=1.538, over 1966.00 samples.], 2024-10-21 13:57:27,967 INFO [train.py:682] (1/4) Start epoch 2370 2024-10-21 13:57:44,824 INFO [train.py:561] (1/4) Epoch 2370, batch 6, global_batch_idx: 37910, batch size: 106, loss[dur_loss=0.2028, prior_loss=0.9754, diff_loss=0.3404, tot_loss=1.519, over 106.00 samples.], tot_loss[dur_loss=0.1998, prior_loss=0.9746, diff_loss=0.4031, tot_loss=1.577, over 1142.00 samples.], 2024-10-21 13:57:57,772 INFO [train.py:682] (1/4) Start epoch 2371 2024-10-21 13:58:06,339 INFO [train.py:561] (1/4) Epoch 2371, batch 0, global_batch_idx: 37920, batch size: 108, loss[dur_loss=0.21, prior_loss=0.976, diff_loss=0.2658, tot_loss=1.452, over 108.00 samples.], tot_loss[dur_loss=0.21, prior_loss=0.976, diff_loss=0.2658, tot_loss=1.452, over 108.00 samples.], 2024-10-21 13:58:20,372 INFO [train.py:561] (1/4) Epoch 2371, batch 10, global_batch_idx: 37930, batch size: 111, loss[dur_loss=0.2049, prior_loss=0.9764, diff_loss=0.3154, tot_loss=1.497, over 111.00 samples.], tot_loss[dur_loss=0.2014, prior_loss=0.975, diff_loss=0.377, tot_loss=1.553, over 1656.00 samples.], 2024-10-21 13:58:27,354 INFO [train.py:682] (1/4) Start epoch 2372 2024-10-21 13:58:40,801 INFO [train.py:561] (1/4) Epoch 2372, batch 4, global_batch_idx: 37940, batch size: 189, loss[dur_loss=0.2062, prior_loss=0.9753, diff_loss=0.3415, tot_loss=1.523, over 189.00 samples.], tot_loss[dur_loss=0.2006, prior_loss=0.9743, diff_loss=0.4218, tot_loss=1.597, over 937.00 samples.], 2024-10-21 13:58:55,511 INFO [train.py:561] (1/4) Epoch 2372, batch 14, global_batch_idx: 37950, batch size: 142, loss[dur_loss=0.203, prior_loss=0.9747, diff_loss=0.307, tot_loss=1.485, over 142.00 samples.], tot_loss[dur_loss=0.203, prior_loss=0.9749, diff_loss=0.3526, tot_loss=1.531, over 2210.00 samples.], 2024-10-21 13:58:56,935 INFO [train.py:682] (1/4) Start epoch 2373 2024-10-21 13:59:16,754 INFO [train.py:561] (1/4) Epoch 2373, batch 8, global_batch_idx: 37960, batch size: 170, loss[dur_loss=0.2069, prior_loss=0.9755, diff_loss=0.342, tot_loss=1.524, over 170.00 samples.], tot_loss[dur_loss=0.2011, prior_loss=0.9748, diff_loss=0.3942, tot_loss=1.57, over 1432.00 samples.], 2024-10-21 13:59:26,794 INFO [train.py:682] (1/4) Start epoch 2374 2024-10-21 13:59:37,864 INFO [train.py:561] (1/4) Epoch 2374, batch 2, global_batch_idx: 37970, batch size: 203, loss[dur_loss=0.2071, prior_loss=0.9752, diff_loss=0.3222, tot_loss=1.505, over 203.00 samples.], tot_loss[dur_loss=0.2063, prior_loss=0.9753, diff_loss=0.3228, tot_loss=1.504, over 442.00 samples.], 2024-10-21 13:59:51,894 INFO [train.py:561] (1/4) Epoch 2374, batch 12, global_batch_idx: 37980, batch size: 152, loss[dur_loss=0.2043, prior_loss=0.9753, diff_loss=0.3498, tot_loss=1.529, over 152.00 samples.], tot_loss[dur_loss=0.2033, prior_loss=0.975, diff_loss=0.3692, tot_loss=1.548, over 1966.00 samples.], 2024-10-21 13:59:56,280 INFO [train.py:682] (1/4) Start epoch 2375 2024-10-21 14:00:13,470 INFO [train.py:561] (1/4) Epoch 2375, batch 6, global_batch_idx: 37990, batch size: 106, loss[dur_loss=0.203, prior_loss=0.9752, diff_loss=0.2919, tot_loss=1.47, over 106.00 samples.], tot_loss[dur_loss=0.1998, prior_loss=0.9746, diff_loss=0.401, tot_loss=1.575, over 1142.00 samples.], 2024-10-21 14:00:26,407 INFO [train.py:682] (1/4) Start epoch 2376 2024-10-21 14:00:35,042 INFO [train.py:561] (1/4) Epoch 2376, batch 0, global_batch_idx: 38000, batch size: 108, loss[dur_loss=0.208, prior_loss=0.9757, diff_loss=0.3304, 
tot_loss=1.514, over 108.00 samples.], tot_loss[dur_loss=0.208, prior_loss=0.9757, diff_loss=0.3304, tot_loss=1.514, over 108.00 samples.], 2024-10-21 14:00:49,165 INFO [train.py:561] (1/4) Epoch 2376, batch 10, global_batch_idx: 38010, batch size: 111, loss[dur_loss=0.2053, prior_loss=0.976, diff_loss=0.275, tot_loss=1.456, over 111.00 samples.], tot_loss[dur_loss=0.2015, prior_loss=0.9747, diff_loss=0.3745, tot_loss=1.551, over 1656.00 samples.], 2024-10-21 14:00:56,204 INFO [train.py:682] (1/4) Start epoch 2377 2024-10-21 14:01:09,808 INFO [train.py:561] (1/4) Epoch 2377, batch 4, global_batch_idx: 38020, batch size: 189, loss[dur_loss=0.2023, prior_loss=0.9751, diff_loss=0.3362, tot_loss=1.514, over 189.00 samples.], tot_loss[dur_loss=0.1994, prior_loss=0.9743, diff_loss=0.4176, tot_loss=1.591, over 937.00 samples.], 2024-10-21 14:01:24,726 INFO [train.py:561] (1/4) Epoch 2377, batch 14, global_batch_idx: 38030, batch size: 142, loss[dur_loss=0.205, prior_loss=0.9747, diff_loss=0.2757, tot_loss=1.455, over 142.00 samples.], tot_loss[dur_loss=0.2025, prior_loss=0.9748, diff_loss=0.3531, tot_loss=1.53, over 2210.00 samples.], 2024-10-21 14:01:26,143 INFO [train.py:682] (1/4) Start epoch 2378 2024-10-21 14:01:45,807 INFO [train.py:561] (1/4) Epoch 2378, batch 8, global_batch_idx: 38040, batch size: 170, loss[dur_loss=0.2046, prior_loss=0.9752, diff_loss=0.3159, tot_loss=1.496, over 170.00 samples.], tot_loss[dur_loss=0.2008, prior_loss=0.9746, diff_loss=0.3858, tot_loss=1.561, over 1432.00 samples.], 2024-10-21 14:01:55,833 INFO [train.py:682] (1/4) Start epoch 2379 2024-10-21 14:02:07,276 INFO [train.py:561] (1/4) Epoch 2379, batch 2, global_batch_idx: 38050, batch size: 203, loss[dur_loss=0.2016, prior_loss=0.9751, diff_loss=0.3301, tot_loss=1.507, over 203.00 samples.], tot_loss[dur_loss=0.2029, prior_loss=0.9752, diff_loss=0.3226, tot_loss=1.501, over 442.00 samples.], 2024-10-21 14:02:21,427 INFO [train.py:561] (1/4) Epoch 2379, batch 12, global_batch_idx: 38060, batch size: 152, loss[dur_loss=0.2033, prior_loss=0.9754, diff_loss=0.3013, tot_loss=1.48, over 152.00 samples.], tot_loss[dur_loss=0.2007, prior_loss=0.9749, diff_loss=0.3586, tot_loss=1.534, over 1966.00 samples.], 2024-10-21 14:02:25,853 INFO [train.py:682] (1/4) Start epoch 2380 2024-10-21 14:02:42,614 INFO [train.py:561] (1/4) Epoch 2380, batch 6, global_batch_idx: 38070, batch size: 106, loss[dur_loss=0.2058, prior_loss=0.9752, diff_loss=0.3598, tot_loss=1.541, over 106.00 samples.], tot_loss[dur_loss=0.1998, prior_loss=0.9744, diff_loss=0.4023, tot_loss=1.577, over 1142.00 samples.], 2024-10-21 14:02:55,647 INFO [train.py:682] (1/4) Start epoch 2381 2024-10-21 14:03:04,239 INFO [train.py:561] (1/4) Epoch 2381, batch 0, global_batch_idx: 38080, batch size: 108, loss[dur_loss=0.2059, prior_loss=0.9757, diff_loss=0.2604, tot_loss=1.442, over 108.00 samples.], tot_loss[dur_loss=0.2059, prior_loss=0.9757, diff_loss=0.2604, tot_loss=1.442, over 108.00 samples.], 2024-10-21 14:03:18,716 INFO [train.py:561] (1/4) Epoch 2381, batch 10, global_batch_idx: 38090, batch size: 111, loss[dur_loss=0.2044, prior_loss=0.9761, diff_loss=0.2569, tot_loss=1.437, over 111.00 samples.], tot_loss[dur_loss=0.2005, prior_loss=0.9747, diff_loss=0.3729, tot_loss=1.548, over 1656.00 samples.], 2024-10-21 14:03:25,786 INFO [train.py:682] (1/4) Start epoch 2382 2024-10-21 14:03:39,436 INFO [train.py:561] (1/4) Epoch 2382, batch 4, global_batch_idx: 38100, batch size: 189, loss[dur_loss=0.1998, prior_loss=0.975, diff_loss=0.3266, tot_loss=1.501, 
over 189.00 samples.], tot_loss[dur_loss=0.1977, prior_loss=0.9742, diff_loss=0.4201, tot_loss=1.592, over 937.00 samples.], 2024-10-21 14:03:54,213 INFO [train.py:561] (1/4) Epoch 2382, batch 14, global_batch_idx: 38110, batch size: 142, loss[dur_loss=0.2023, prior_loss=0.9746, diff_loss=0.2821, tot_loss=1.459, over 142.00 samples.], tot_loss[dur_loss=0.2004, prior_loss=0.9747, diff_loss=0.3582, tot_loss=1.533, over 2210.00 samples.], 2024-10-21 14:03:55,627 INFO [train.py:682] (1/4) Start epoch 2383 2024-10-21 14:04:15,642 INFO [train.py:561] (1/4) Epoch 2383, batch 8, global_batch_idx: 38120, batch size: 170, loss[dur_loss=0.206, prior_loss=0.9753, diff_loss=0.3204, tot_loss=1.502, over 170.00 samples.], tot_loss[dur_loss=0.1999, prior_loss=0.9746, diff_loss=0.3745, tot_loss=1.549, over 1432.00 samples.], 2024-10-21 14:04:25,705 INFO [train.py:682] (1/4) Start epoch 2384 2024-10-21 14:04:36,869 INFO [train.py:561] (1/4) Epoch 2384, batch 2, global_batch_idx: 38130, batch size: 203, loss[dur_loss=0.2044, prior_loss=0.9751, diff_loss=0.3354, tot_loss=1.515, over 203.00 samples.], tot_loss[dur_loss=0.2034, prior_loss=0.9752, diff_loss=0.3194, tot_loss=1.498, over 442.00 samples.], 2024-10-21 14:04:50,976 INFO [train.py:561] (1/4) Epoch 2384, batch 12, global_batch_idx: 38140, batch size: 152, loss[dur_loss=0.2042, prior_loss=0.9752, diff_loss=0.3018, tot_loss=1.481, over 152.00 samples.], tot_loss[dur_loss=0.201, prior_loss=0.9748, diff_loss=0.3636, tot_loss=1.539, over 1966.00 samples.], 2024-10-21 14:04:55,377 INFO [train.py:682] (1/4) Start epoch 2385 2024-10-21 14:05:12,660 INFO [train.py:561] (1/4) Epoch 2385, batch 6, global_batch_idx: 38150, batch size: 106, loss[dur_loss=0.203, prior_loss=0.975, diff_loss=0.2848, tot_loss=1.463, over 106.00 samples.], tot_loss[dur_loss=0.1995, prior_loss=0.9743, diff_loss=0.3968, tot_loss=1.571, over 1142.00 samples.], 2024-10-21 14:05:25,694 INFO [train.py:682] (1/4) Start epoch 2386 2024-10-21 14:05:34,456 INFO [train.py:561] (1/4) Epoch 2386, batch 0, global_batch_idx: 38160, batch size: 108, loss[dur_loss=0.2039, prior_loss=0.9757, diff_loss=0.3153, tot_loss=1.495, over 108.00 samples.], tot_loss[dur_loss=0.2039, prior_loss=0.9757, diff_loss=0.3153, tot_loss=1.495, over 108.00 samples.], 2024-10-21 14:05:48,631 INFO [train.py:561] (1/4) Epoch 2386, batch 10, global_batch_idx: 38170, batch size: 111, loss[dur_loss=0.206, prior_loss=0.9759, diff_loss=0.3172, tot_loss=1.499, over 111.00 samples.], tot_loss[dur_loss=0.2005, prior_loss=0.9747, diff_loss=0.3812, tot_loss=1.556, over 1656.00 samples.], 2024-10-21 14:05:55,695 INFO [train.py:682] (1/4) Start epoch 2387 2024-10-21 14:06:09,595 INFO [train.py:561] (1/4) Epoch 2387, batch 4, global_batch_idx: 38180, batch size: 189, loss[dur_loss=0.1985, prior_loss=0.9749, diff_loss=0.3386, tot_loss=1.512, over 189.00 samples.], tot_loss[dur_loss=0.1975, prior_loss=0.9741, diff_loss=0.4221, tot_loss=1.594, over 937.00 samples.], 2024-10-21 14:06:24,413 INFO [train.py:561] (1/4) Epoch 2387, batch 14, global_batch_idx: 38190, batch size: 142, loss[dur_loss=0.2033, prior_loss=0.9748, diff_loss=0.2994, tot_loss=1.477, over 142.00 samples.], tot_loss[dur_loss=0.2012, prior_loss=0.9747, diff_loss=0.3554, tot_loss=1.531, over 2210.00 samples.], 2024-10-21 14:06:25,847 INFO [train.py:682] (1/4) Start epoch 2388 2024-10-21 14:06:45,725 INFO [train.py:561] (1/4) Epoch 2388, batch 8, global_batch_idx: 38200, batch size: 170, loss[dur_loss=0.2061, prior_loss=0.9752, diff_loss=0.332, tot_loss=1.513, over 170.00 
samples.], tot_loss[dur_loss=0.2009, prior_loss=0.9745, diff_loss=0.392, tot_loss=1.567, over 1432.00 samples.], 2024-10-21 14:06:55,821 INFO [train.py:682] (1/4) Start epoch 2389 2024-10-21 14:07:07,070 INFO [train.py:561] (1/4) Epoch 2389, batch 2, global_batch_idx: 38210, batch size: 203, loss[dur_loss=0.2031, prior_loss=0.9751, diff_loss=0.3158, tot_loss=1.494, over 203.00 samples.], tot_loss[dur_loss=0.2028, prior_loss=0.9751, diff_loss=0.3125, tot_loss=1.49, over 442.00 samples.], 2024-10-21 14:07:21,305 INFO [train.py:561] (1/4) Epoch 2389, batch 12, global_batch_idx: 38220, batch size: 152, loss[dur_loss=0.2012, prior_loss=0.975, diff_loss=0.353, tot_loss=1.529, over 152.00 samples.], tot_loss[dur_loss=0.2008, prior_loss=0.9747, diff_loss=0.3721, tot_loss=1.548, over 1966.00 samples.], 2024-10-21 14:07:25,736 INFO [train.py:682] (1/4) Start epoch 2390 2024-10-21 14:07:42,484 INFO [train.py:561] (1/4) Epoch 2390, batch 6, global_batch_idx: 38230, batch size: 106, loss[dur_loss=0.2037, prior_loss=0.9752, diff_loss=0.3292, tot_loss=1.508, over 106.00 samples.], tot_loss[dur_loss=0.2001, prior_loss=0.9744, diff_loss=0.3975, tot_loss=1.572, over 1142.00 samples.], 2024-10-21 14:07:55,424 INFO [train.py:682] (1/4) Start epoch 2391 2024-10-21 14:08:04,495 INFO [train.py:561] (1/4) Epoch 2391, batch 0, global_batch_idx: 38240, batch size: 108, loss[dur_loss=0.2069, prior_loss=0.9756, diff_loss=0.3313, tot_loss=1.514, over 108.00 samples.], tot_loss[dur_loss=0.2069, prior_loss=0.9756, diff_loss=0.3313, tot_loss=1.514, over 108.00 samples.], 2024-10-21 14:08:18,678 INFO [train.py:561] (1/4) Epoch 2391, batch 10, global_batch_idx: 38250, batch size: 111, loss[dur_loss=0.2091, prior_loss=0.9763, diff_loss=0.3055, tot_loss=1.491, over 111.00 samples.], tot_loss[dur_loss=0.2007, prior_loss=0.9747, diff_loss=0.37, tot_loss=1.545, over 1656.00 samples.], 2024-10-21 14:08:25,715 INFO [train.py:682] (1/4) Start epoch 2392 2024-10-21 14:08:39,347 INFO [train.py:561] (1/4) Epoch 2392, batch 4, global_batch_idx: 38260, batch size: 189, loss[dur_loss=0.2003, prior_loss=0.9749, diff_loss=0.3164, tot_loss=1.492, over 189.00 samples.], tot_loss[dur_loss=0.1964, prior_loss=0.9741, diff_loss=0.4097, tot_loss=1.58, over 937.00 samples.], 2024-10-21 14:08:54,171 INFO [train.py:561] (1/4) Epoch 2392, batch 14, global_batch_idx: 38270, batch size: 142, loss[dur_loss=0.2046, prior_loss=0.9746, diff_loss=0.2957, tot_loss=1.475, over 142.00 samples.], tot_loss[dur_loss=0.2012, prior_loss=0.9747, diff_loss=0.3494, tot_loss=1.525, over 2210.00 samples.], 2024-10-21 14:08:55,588 INFO [train.py:682] (1/4) Start epoch 2393 2024-10-21 14:09:15,244 INFO [train.py:561] (1/4) Epoch 2393, batch 8, global_batch_idx: 38280, batch size: 170, loss[dur_loss=0.2078, prior_loss=0.9753, diff_loss=0.3688, tot_loss=1.552, over 170.00 samples.], tot_loss[dur_loss=0.1992, prior_loss=0.9746, diff_loss=0.3815, tot_loss=1.555, over 1432.00 samples.], 2024-10-21 14:09:25,325 INFO [train.py:682] (1/4) Start epoch 2394 2024-10-21 14:09:36,445 INFO [train.py:561] (1/4) Epoch 2394, batch 2, global_batch_idx: 38290, batch size: 203, loss[dur_loss=0.2026, prior_loss=0.9748, diff_loss=0.3372, tot_loss=1.515, over 203.00 samples.], tot_loss[dur_loss=0.2032, prior_loss=0.975, diff_loss=0.309, tot_loss=1.487, over 442.00 samples.], 2024-10-21 14:09:50,589 INFO [train.py:561] (1/4) Epoch 2394, batch 12, global_batch_idx: 38300, batch size: 152, loss[dur_loss=0.2015, prior_loss=0.9751, diff_loss=0.2929, tot_loss=1.469, over 152.00 samples.], 
tot_loss[dur_loss=0.2011, prior_loss=0.9746, diff_loss=0.3568, tot_loss=1.533, over 1966.00 samples.], 2024-10-21 14:09:55,000 INFO [train.py:682] (1/4) Start epoch 2395 2024-10-21 14:10:11,890 INFO [train.py:561] (1/4) Epoch 2395, batch 6, global_batch_idx: 38310, batch size: 106, loss[dur_loss=0.2034, prior_loss=0.975, diff_loss=0.2943, tot_loss=1.473, over 106.00 samples.], tot_loss[dur_loss=0.1984, prior_loss=0.9743, diff_loss=0.3976, tot_loss=1.57, over 1142.00 samples.], 2024-10-21 14:10:24,820 INFO [train.py:682] (1/4) Start epoch 2396 2024-10-21 14:10:33,454 INFO [train.py:561] (1/4) Epoch 2396, batch 0, global_batch_idx: 38320, batch size: 108, loss[dur_loss=0.2044, prior_loss=0.9755, diff_loss=0.2647, tot_loss=1.445, over 108.00 samples.], tot_loss[dur_loss=0.2044, prior_loss=0.9755, diff_loss=0.2647, tot_loss=1.445, over 108.00 samples.], 2024-10-21 14:10:47,532 INFO [train.py:561] (1/4) Epoch 2396, batch 10, global_batch_idx: 38330, batch size: 111, loss[dur_loss=0.2063, prior_loss=0.9759, diff_loss=0.2903, tot_loss=1.473, over 111.00 samples.], tot_loss[dur_loss=0.201, prior_loss=0.9746, diff_loss=0.3657, tot_loss=1.541, over 1656.00 samples.], 2024-10-21 14:10:54,563 INFO [train.py:682] (1/4) Start epoch 2397 2024-10-21 14:11:08,382 INFO [train.py:561] (1/4) Epoch 2397, batch 4, global_batch_idx: 38340, batch size: 189, loss[dur_loss=0.2046, prior_loss=0.9757, diff_loss=0.3392, tot_loss=1.519, over 189.00 samples.], tot_loss[dur_loss=0.1984, prior_loss=0.9742, diff_loss=0.4167, tot_loss=1.589, over 937.00 samples.], 2024-10-21 14:11:23,081 INFO [train.py:561] (1/4) Epoch 2397, batch 14, global_batch_idx: 38350, batch size: 142, loss[dur_loss=0.201, prior_loss=0.9747, diff_loss=0.3633, tot_loss=1.539, over 142.00 samples.], tot_loss[dur_loss=0.2013, prior_loss=0.9748, diff_loss=0.3613, tot_loss=1.537, over 2210.00 samples.], 2024-10-21 14:11:24,507 INFO [train.py:682] (1/4) Start epoch 2398 2024-10-21 14:11:44,299 INFO [train.py:561] (1/4) Epoch 2398, batch 8, global_batch_idx: 38360, batch size: 170, loss[dur_loss=0.2047, prior_loss=0.9751, diff_loss=0.3493, tot_loss=1.529, over 170.00 samples.], tot_loss[dur_loss=0.2006, prior_loss=0.9745, diff_loss=0.3832, tot_loss=1.558, over 1432.00 samples.], 2024-10-21 14:11:54,315 INFO [train.py:682] (1/4) Start epoch 2399 2024-10-21 14:12:05,680 INFO [train.py:561] (1/4) Epoch 2399, batch 2, global_batch_idx: 38370, batch size: 203, loss[dur_loss=0.2065, prior_loss=0.9751, diff_loss=0.3199, tot_loss=1.501, over 203.00 samples.], tot_loss[dur_loss=0.2053, prior_loss=0.975, diff_loss=0.315, tot_loss=1.495, over 442.00 samples.], 2024-10-21 14:12:19,826 INFO [train.py:561] (1/4) Epoch 2399, batch 12, global_batch_idx: 38380, batch size: 152, loss[dur_loss=0.2, prior_loss=0.9749, diff_loss=0.3243, tot_loss=1.499, over 152.00 samples.], tot_loss[dur_loss=0.202, prior_loss=0.9747, diff_loss=0.36, tot_loss=1.537, over 1966.00 samples.], 2024-10-21 14:12:24,247 INFO [train.py:682] (1/4) Start epoch 2400 2024-10-21 14:12:41,376 INFO [train.py:561] (1/4) Epoch 2400, batch 6, global_batch_idx: 38390, batch size: 106, loss[dur_loss=0.2001, prior_loss=0.9751, diff_loss=0.2977, tot_loss=1.473, over 106.00 samples.], tot_loss[dur_loss=0.1996, prior_loss=0.9744, diff_loss=0.4173, tot_loss=1.591, over 1142.00 samples.], 2024-10-21 14:12:54,256 INFO [train.py:682] (1/4) Start epoch 2401 2024-10-21 14:13:02,834 INFO [train.py:561] (1/4) Epoch 2401, batch 0, global_batch_idx: 38400, batch size: 108, loss[dur_loss=0.2038, prior_loss=0.9757, 
diff_loss=0.3145, tot_loss=1.494, over 108.00 samples.], tot_loss[dur_loss=0.2038, prior_loss=0.9757, diff_loss=0.3145, tot_loss=1.494, over 108.00 samples.], 2024-10-21 14:13:16,907 INFO [train.py:561] (1/4) Epoch 2401, batch 10, global_batch_idx: 38410, batch size: 111, loss[dur_loss=0.2077, prior_loss=0.9762, diff_loss=0.351, tot_loss=1.535, over 111.00 samples.], tot_loss[dur_loss=0.2017, prior_loss=0.9747, diff_loss=0.3801, tot_loss=1.556, over 1656.00 samples.], 2024-10-21 14:13:23,934 INFO [train.py:682] (1/4) Start epoch 2402 2024-10-21 14:13:37,445 INFO [train.py:561] (1/4) Epoch 2402, batch 4, global_batch_idx: 38420, batch size: 189, loss[dur_loss=0.2028, prior_loss=0.9752, diff_loss=0.3481, tot_loss=1.526, over 189.00 samples.], tot_loss[dur_loss=0.1997, prior_loss=0.9742, diff_loss=0.4309, tot_loss=1.605, over 937.00 samples.], 2024-10-21 14:13:52,165 INFO [train.py:561] (1/4) Epoch 2402, batch 14, global_batch_idx: 38430, batch size: 142, loss[dur_loss=0.2047, prior_loss=0.9749, diff_loss=0.261, tot_loss=1.441, over 142.00 samples.], tot_loss[dur_loss=0.2023, prior_loss=0.9748, diff_loss=0.3508, tot_loss=1.528, over 2210.00 samples.], 2024-10-21 14:13:53,579 INFO [train.py:682] (1/4) Start epoch 2403 2024-10-21 14:14:13,184 INFO [train.py:561] (1/4) Epoch 2403, batch 8, global_batch_idx: 38440, batch size: 170, loss[dur_loss=0.2047, prior_loss=0.9751, diff_loss=0.3534, tot_loss=1.533, over 170.00 samples.], tot_loss[dur_loss=0.2007, prior_loss=0.9745, diff_loss=0.3874, tot_loss=1.563, over 1432.00 samples.], 2024-10-21 14:14:23,293 INFO [train.py:682] (1/4) Start epoch 2404 2024-10-21 14:14:34,783 INFO [train.py:561] (1/4) Epoch 2404, batch 2, global_batch_idx: 38450, batch size: 203, loss[dur_loss=0.2006, prior_loss=0.9753, diff_loss=0.3446, tot_loss=1.52, over 203.00 samples.], tot_loss[dur_loss=0.2015, prior_loss=0.9752, diff_loss=0.3317, tot_loss=1.508, over 442.00 samples.], 2024-10-21 14:14:48,962 INFO [train.py:561] (1/4) Epoch 2404, batch 12, global_batch_idx: 38460, batch size: 152, loss[dur_loss=0.2015, prior_loss=0.9753, diff_loss=0.3262, tot_loss=1.503, over 152.00 samples.], tot_loss[dur_loss=0.2009, prior_loss=0.9748, diff_loss=0.3591, tot_loss=1.535, over 1966.00 samples.], 2024-10-21 14:14:53,382 INFO [train.py:682] (1/4) Start epoch 2405 2024-10-21 14:15:10,141 INFO [train.py:561] (1/4) Epoch 2405, batch 6, global_batch_idx: 38470, batch size: 106, loss[dur_loss=0.2029, prior_loss=0.9751, diff_loss=0.2723, tot_loss=1.45, over 106.00 samples.], tot_loss[dur_loss=0.1995, prior_loss=0.9745, diff_loss=0.4022, tot_loss=1.576, over 1142.00 samples.], 2024-10-21 14:15:23,425 INFO [train.py:682] (1/4) Start epoch 2406 2024-10-21 14:15:32,337 INFO [train.py:561] (1/4) Epoch 2406, batch 0, global_batch_idx: 38480, batch size: 108, loss[dur_loss=0.2083, prior_loss=0.9758, diff_loss=0.3165, tot_loss=1.501, over 108.00 samples.], tot_loss[dur_loss=0.2083, prior_loss=0.9758, diff_loss=0.3165, tot_loss=1.501, over 108.00 samples.], 2024-10-21 14:15:46,680 INFO [train.py:561] (1/4) Epoch 2406, batch 10, global_batch_idx: 38490, batch size: 111, loss[dur_loss=0.2047, prior_loss=0.9758, diff_loss=0.3139, tot_loss=1.494, over 111.00 samples.], tot_loss[dur_loss=0.1996, prior_loss=0.9747, diff_loss=0.3888, tot_loss=1.563, over 1656.00 samples.], 2024-10-21 14:15:53,768 INFO [train.py:682] (1/4) Start epoch 2407 2024-10-21 14:16:07,540 INFO [train.py:561] (1/4) Epoch 2407, batch 4, global_batch_idx: 38500, batch size: 189, loss[dur_loss=0.2029, prior_loss=0.9752, 
diff_loss=0.3398, tot_loss=1.518, over 189.00 samples.], tot_loss[dur_loss=0.1989, prior_loss=0.9743, diff_loss=0.422, tot_loss=1.595, over 937.00 samples.], 2024-10-21 14:16:22,559 INFO [train.py:561] (1/4) Epoch 2407, batch 14, global_batch_idx: 38510, batch size: 142, loss[dur_loss=0.2001, prior_loss=0.9746, diff_loss=0.2978, tot_loss=1.473, over 142.00 samples.], tot_loss[dur_loss=0.2012, prior_loss=0.9748, diff_loss=0.3532, tot_loss=1.529, over 2210.00 samples.], 2024-10-21 14:16:23,970 INFO [train.py:682] (1/4) Start epoch 2408 2024-10-21 14:16:43,940 INFO [train.py:561] (1/4) Epoch 2408, batch 8, global_batch_idx: 38520, batch size: 170, loss[dur_loss=0.2031, prior_loss=0.9753, diff_loss=0.3175, tot_loss=1.496, over 170.00 samples.], tot_loss[dur_loss=0.1995, prior_loss=0.9746, diff_loss=0.3859, tot_loss=1.56, over 1432.00 samples.], 2024-10-21 14:16:54,064 INFO [train.py:682] (1/4) Start epoch 2409 2024-10-21 14:17:05,479 INFO [train.py:561] (1/4) Epoch 2409, batch 2, global_batch_idx: 38530, batch size: 203, loss[dur_loss=0.2007, prior_loss=0.975, diff_loss=0.3551, tot_loss=1.531, over 203.00 samples.], tot_loss[dur_loss=0.2023, prior_loss=0.9751, diff_loss=0.3344, tot_loss=1.512, over 442.00 samples.], 2024-10-21 14:17:19,777 INFO [train.py:561] (1/4) Epoch 2409, batch 12, global_batch_idx: 38540, batch size: 152, loss[dur_loss=0.2014, prior_loss=0.9751, diff_loss=0.3251, tot_loss=1.502, over 152.00 samples.], tot_loss[dur_loss=0.2007, prior_loss=0.9746, diff_loss=0.3651, tot_loss=1.54, over 1966.00 samples.], 2024-10-21 14:17:24,186 INFO [train.py:682] (1/4) Start epoch 2410 2024-10-21 14:17:40,993 INFO [train.py:561] (1/4) Epoch 2410, batch 6, global_batch_idx: 38550, batch size: 106, loss[dur_loss=0.2016, prior_loss=0.9751, diff_loss=0.2759, tot_loss=1.453, over 106.00 samples.], tot_loss[dur_loss=0.1991, prior_loss=0.9744, diff_loss=0.398, tot_loss=1.571, over 1142.00 samples.], 2024-10-21 14:17:53,985 INFO [train.py:682] (1/4) Start epoch 2411 2024-10-21 14:18:02,437 INFO [train.py:561] (1/4) Epoch 2411, batch 0, global_batch_idx: 38560, batch size: 108, loss[dur_loss=0.2078, prior_loss=0.9756, diff_loss=0.3345, tot_loss=1.518, over 108.00 samples.], tot_loss[dur_loss=0.2078, prior_loss=0.9756, diff_loss=0.3345, tot_loss=1.518, over 108.00 samples.], 2024-10-21 14:18:16,519 INFO [train.py:561] (1/4) Epoch 2411, batch 10, global_batch_idx: 38570, batch size: 111, loss[dur_loss=0.2049, prior_loss=0.9758, diff_loss=0.3051, tot_loss=1.486, over 111.00 samples.], tot_loss[dur_loss=0.2008, prior_loss=0.9746, diff_loss=0.3741, tot_loss=1.55, over 1656.00 samples.], 2024-10-21 14:18:23,535 INFO [train.py:682] (1/4) Start epoch 2412 2024-10-21 14:18:37,396 INFO [train.py:561] (1/4) Epoch 2412, batch 4, global_batch_idx: 38580, batch size: 189, loss[dur_loss=0.2024, prior_loss=0.975, diff_loss=0.3138, tot_loss=1.491, over 189.00 samples.], tot_loss[dur_loss=0.198, prior_loss=0.9741, diff_loss=0.4192, tot_loss=1.591, over 937.00 samples.], 2024-10-21 14:18:52,293 INFO [train.py:561] (1/4) Epoch 2412, batch 14, global_batch_idx: 38590, batch size: 142, loss[dur_loss=0.2001, prior_loss=0.9748, diff_loss=0.3119, tot_loss=1.487, over 142.00 samples.], tot_loss[dur_loss=0.2012, prior_loss=0.9747, diff_loss=0.3519, tot_loss=1.528, over 2210.00 samples.], 2024-10-21 14:18:53,702 INFO [train.py:682] (1/4) Start epoch 2413 2024-10-21 14:19:13,383 INFO [train.py:561] (1/4) Epoch 2413, batch 8, global_batch_idx: 38600, batch size: 170, loss[dur_loss=0.2033, prior_loss=0.9749, diff_loss=0.3035, 
tot_loss=1.482, over 170.00 samples.], tot_loss[dur_loss=0.1995, prior_loss=0.9744, diff_loss=0.374, tot_loss=1.548, over 1432.00 samples.], 2024-10-21 14:19:23,489 INFO [train.py:682] (1/4) Start epoch 2414 2024-10-21 14:19:34,926 INFO [train.py:561] (1/4) Epoch 2414, batch 2, global_batch_idx: 38610, batch size: 203, loss[dur_loss=0.2036, prior_loss=0.975, diff_loss=0.3421, tot_loss=1.521, over 203.00 samples.], tot_loss[dur_loss=0.2037, prior_loss=0.9751, diff_loss=0.3101, tot_loss=1.489, over 442.00 samples.], 2024-10-21 14:19:49,059 INFO [train.py:561] (1/4) Epoch 2414, batch 12, global_batch_idx: 38620, batch size: 152, loss[dur_loss=0.1989, prior_loss=0.9749, diff_loss=0.3336, tot_loss=1.507, over 152.00 samples.], tot_loss[dur_loss=0.2001, prior_loss=0.9746, diff_loss=0.3634, tot_loss=1.538, over 1966.00 samples.], 2024-10-21 14:19:53,512 INFO [train.py:682] (1/4) Start epoch 2415 2024-10-21 14:20:10,377 INFO [train.py:561] (1/4) Epoch 2415, batch 6, global_batch_idx: 38630, batch size: 106, loss[dur_loss=0.2027, prior_loss=0.9748, diff_loss=0.2794, tot_loss=1.457, over 106.00 samples.], tot_loss[dur_loss=0.1991, prior_loss=0.9743, diff_loss=0.3916, tot_loss=1.565, over 1142.00 samples.], 2024-10-21 14:20:23,388 INFO [train.py:682] (1/4) Start epoch 2416 2024-10-21 14:20:31,899 INFO [train.py:561] (1/4) Epoch 2416, batch 0, global_batch_idx: 38640, batch size: 108, loss[dur_loss=0.2053, prior_loss=0.9754, diff_loss=0.3042, tot_loss=1.485, over 108.00 samples.], tot_loss[dur_loss=0.2053, prior_loss=0.9754, diff_loss=0.3042, tot_loss=1.485, over 108.00 samples.], 2024-10-21 14:20:46,186 INFO [train.py:561] (1/4) Epoch 2416, batch 10, global_batch_idx: 38650, batch size: 111, loss[dur_loss=0.2054, prior_loss=0.9761, diff_loss=0.2847, tot_loss=1.466, over 111.00 samples.], tot_loss[dur_loss=0.2002, prior_loss=0.9745, diff_loss=0.3655, tot_loss=1.54, over 1656.00 samples.], 2024-10-21 14:20:53,310 INFO [train.py:682] (1/4) Start epoch 2417 2024-10-21 14:21:07,139 INFO [train.py:561] (1/4) Epoch 2417, batch 4, global_batch_idx: 38660, batch size: 189, loss[dur_loss=0.2, prior_loss=0.975, diff_loss=0.3189, tot_loss=1.494, over 189.00 samples.], tot_loss[dur_loss=0.1968, prior_loss=0.974, diff_loss=0.4232, tot_loss=1.594, over 937.00 samples.], 2024-10-21 14:21:22,081 INFO [train.py:561] (1/4) Epoch 2417, batch 14, global_batch_idx: 38670, batch size: 142, loss[dur_loss=0.2035, prior_loss=0.9744, diff_loss=0.3078, tot_loss=1.486, over 142.00 samples.], tot_loss[dur_loss=0.2004, prior_loss=0.9746, diff_loss=0.3548, tot_loss=1.53, over 2210.00 samples.], 2024-10-21 14:21:23,526 INFO [train.py:682] (1/4) Start epoch 2418 2024-10-21 14:21:43,641 INFO [train.py:561] (1/4) Epoch 2418, batch 8, global_batch_idx: 38680, batch size: 170, loss[dur_loss=0.207, prior_loss=0.9751, diff_loss=0.3204, tot_loss=1.502, over 170.00 samples.], tot_loss[dur_loss=0.1999, prior_loss=0.9744, diff_loss=0.3878, tot_loss=1.562, over 1432.00 samples.], 2024-10-21 14:21:53,802 INFO [train.py:682] (1/4) Start epoch 2419 2024-10-21 14:22:05,198 INFO [train.py:561] (1/4) Epoch 2419, batch 2, global_batch_idx: 38690, batch size: 203, loss[dur_loss=0.2022, prior_loss=0.9751, diff_loss=0.3039, tot_loss=1.481, over 203.00 samples.], tot_loss[dur_loss=0.203, prior_loss=0.9751, diff_loss=0.2949, tot_loss=1.473, over 442.00 samples.], 2024-10-21 14:22:19,501 INFO [train.py:561] (1/4) Epoch 2419, batch 12, global_batch_idx: 38700, batch size: 152, loss[dur_loss=0.2044, prior_loss=0.975, diff_loss=0.328, tot_loss=1.507, over 
152.00 samples.], tot_loss[dur_loss=0.201, prior_loss=0.9746, diff_loss=0.3537, tot_loss=1.529, over 1966.00 samples.], 2024-10-21 14:22:23,987 INFO [train.py:682] (1/4) Start epoch 2420 2024-10-21 14:22:41,621 INFO [train.py:561] (1/4) Epoch 2420, batch 6, global_batch_idx: 38710, batch size: 106, loss[dur_loss=0.1999, prior_loss=0.975, diff_loss=0.3139, tot_loss=1.489, over 106.00 samples.], tot_loss[dur_loss=0.1977, prior_loss=0.9743, diff_loss=0.4016, tot_loss=1.574, over 1142.00 samples.], 2024-10-21 14:22:54,761 INFO [train.py:682] (1/4) Start epoch 2421 2024-10-21 14:23:03,420 INFO [train.py:561] (1/4) Epoch 2421, batch 0, global_batch_idx: 38720, batch size: 108, loss[dur_loss=0.2049, prior_loss=0.9754, diff_loss=0.2833, tot_loss=1.464, over 108.00 samples.], tot_loss[dur_loss=0.2049, prior_loss=0.9754, diff_loss=0.2833, tot_loss=1.464, over 108.00 samples.], 2024-10-21 14:23:17,641 INFO [train.py:561] (1/4) Epoch 2421, batch 10, global_batch_idx: 38730, batch size: 111, loss[dur_loss=0.2053, prior_loss=0.9758, diff_loss=0.2951, tot_loss=1.476, over 111.00 samples.], tot_loss[dur_loss=0.2005, prior_loss=0.9745, diff_loss=0.3766, tot_loss=1.552, over 1656.00 samples.], 2024-10-21 14:23:24,721 INFO [train.py:682] (1/4) Start epoch 2422 2024-10-21 14:23:38,410 INFO [train.py:561] (1/4) Epoch 2422, batch 4, global_batch_idx: 38740, batch size: 189, loss[dur_loss=0.2005, prior_loss=0.975, diff_loss=0.3212, tot_loss=1.497, over 189.00 samples.], tot_loss[dur_loss=0.1976, prior_loss=0.974, diff_loss=0.4181, tot_loss=1.59, over 937.00 samples.], 2024-10-21 14:23:53,301 INFO [train.py:561] (1/4) Epoch 2422, batch 14, global_batch_idx: 38750, batch size: 142, loss[dur_loss=0.2031, prior_loss=0.9744, diff_loss=0.3091, tot_loss=1.487, over 142.00 samples.], tot_loss[dur_loss=0.2008, prior_loss=0.9746, diff_loss=0.3483, tot_loss=1.524, over 2210.00 samples.], 2024-10-21 14:23:54,718 INFO [train.py:682] (1/4) Start epoch 2423 2024-10-21 14:24:14,867 INFO [train.py:561] (1/4) Epoch 2423, batch 8, global_batch_idx: 38760, batch size: 170, loss[dur_loss=0.2062, prior_loss=0.9749, diff_loss=0.344, tot_loss=1.525, over 170.00 samples.], tot_loss[dur_loss=0.201, prior_loss=0.9745, diff_loss=0.3906, tot_loss=1.566, over 1432.00 samples.], 2024-10-21 14:24:25,019 INFO [train.py:682] (1/4) Start epoch 2424 2024-10-21 14:24:36,531 INFO [train.py:561] (1/4) Epoch 2424, batch 2, global_batch_idx: 38770, batch size: 203, loss[dur_loss=0.2028, prior_loss=0.975, diff_loss=0.3481, tot_loss=1.526, over 203.00 samples.], tot_loss[dur_loss=0.2043, prior_loss=0.975, diff_loss=0.3216, tot_loss=1.501, over 442.00 samples.], 2024-10-21 14:24:50,799 INFO [train.py:561] (1/4) Epoch 2424, batch 12, global_batch_idx: 38780, batch size: 152, loss[dur_loss=0.1993, prior_loss=0.975, diff_loss=0.3397, tot_loss=1.514, over 152.00 samples.], tot_loss[dur_loss=0.2013, prior_loss=0.9747, diff_loss=0.3637, tot_loss=1.54, over 1966.00 samples.], 2024-10-21 14:24:55,219 INFO [train.py:682] (1/4) Start epoch 2425 2024-10-21 14:25:12,345 INFO [train.py:561] (1/4) Epoch 2425, batch 6, global_batch_idx: 38790, batch size: 106, loss[dur_loss=0.202, prior_loss=0.9752, diff_loss=0.3052, tot_loss=1.482, over 106.00 samples.], tot_loss[dur_loss=0.1986, prior_loss=0.9743, diff_loss=0.394, tot_loss=1.567, over 1142.00 samples.], 2024-10-21 14:25:25,399 INFO [train.py:682] (1/4) Start epoch 2426 2024-10-21 14:25:34,099 INFO [train.py:561] (1/4) Epoch 2426, batch 0, global_batch_idx: 38800, batch size: 108, loss[dur_loss=0.2042, 
prior_loss=0.9755, diff_loss=0.3363, tot_loss=1.516, over 108.00 samples.], tot_loss[dur_loss=0.2042, prior_loss=0.9755, diff_loss=0.3363, tot_loss=1.516, over 108.00 samples.], 2024-10-21 14:25:48,348 INFO [train.py:561] (1/4) Epoch 2426, batch 10, global_batch_idx: 38810, batch size: 111, loss[dur_loss=0.2075, prior_loss=0.9762, diff_loss=0.273, tot_loss=1.457, over 111.00 samples.], tot_loss[dur_loss=0.2005, prior_loss=0.9746, diff_loss=0.3708, tot_loss=1.546, over 1656.00 samples.], 2024-10-21 14:25:55,429 INFO [train.py:682] (1/4) Start epoch 2427 2024-10-21 14:26:09,177 INFO [train.py:561] (1/4) Epoch 2427, batch 4, global_batch_idx: 38820, batch size: 189, loss[dur_loss=0.2009, prior_loss=0.9752, diff_loss=0.3013, tot_loss=1.477, over 189.00 samples.], tot_loss[dur_loss=0.1996, prior_loss=0.9742, diff_loss=0.422, tot_loss=1.596, over 937.00 samples.], 2024-10-21 14:26:24,123 INFO [train.py:561] (1/4) Epoch 2427, batch 14, global_batch_idx: 38830, batch size: 142, loss[dur_loss=0.2041, prior_loss=0.9747, diff_loss=0.2906, tot_loss=1.469, over 142.00 samples.], tot_loss[dur_loss=0.2023, prior_loss=0.9748, diff_loss=0.3506, tot_loss=1.528, over 2210.00 samples.], 2024-10-21 14:26:25,549 INFO [train.py:682] (1/4) Start epoch 2428 2024-10-21 14:26:45,795 INFO [train.py:561] (1/4) Epoch 2428, batch 8, global_batch_idx: 38840, batch size: 170, loss[dur_loss=0.2054, prior_loss=0.9753, diff_loss=0.3253, tot_loss=1.506, over 170.00 samples.], tot_loss[dur_loss=0.2019, prior_loss=0.9746, diff_loss=0.3798, tot_loss=1.556, over 1432.00 samples.], 2024-10-21 14:26:55,961 INFO [train.py:682] (1/4) Start epoch 2429 2024-10-21 14:27:07,369 INFO [train.py:561] (1/4) Epoch 2429, batch 2, global_batch_idx: 38850, batch size: 203, loss[dur_loss=0.2022, prior_loss=0.9749, diff_loss=0.3422, tot_loss=1.519, over 203.00 samples.], tot_loss[dur_loss=0.203, prior_loss=0.9749, diff_loss=0.3129, tot_loss=1.491, over 442.00 samples.], 2024-10-21 14:27:21,816 INFO [train.py:561] (1/4) Epoch 2429, batch 12, global_batch_idx: 38860, batch size: 152, loss[dur_loss=0.1996, prior_loss=0.9748, diff_loss=0.3223, tot_loss=1.497, over 152.00 samples.], tot_loss[dur_loss=0.2014, prior_loss=0.9746, diff_loss=0.3607, tot_loss=1.537, over 1966.00 samples.], 2024-10-21 14:27:26,341 INFO [train.py:682] (1/4) Start epoch 2430 2024-10-21 14:27:43,560 INFO [train.py:561] (1/4) Epoch 2430, batch 6, global_batch_idx: 38870, batch size: 106, loss[dur_loss=0.2027, prior_loss=0.9751, diff_loss=0.2938, tot_loss=1.472, over 106.00 samples.], tot_loss[dur_loss=0.1976, prior_loss=0.9742, diff_loss=0.3929, tot_loss=1.565, over 1142.00 samples.], 2024-10-21 14:27:56,647 INFO [train.py:682] (1/4) Start epoch 2431 2024-10-21 14:28:05,756 INFO [train.py:561] (1/4) Epoch 2431, batch 0, global_batch_idx: 38880, batch size: 108, loss[dur_loss=0.2044, prior_loss=0.9754, diff_loss=0.2938, tot_loss=1.474, over 108.00 samples.], tot_loss[dur_loss=0.2044, prior_loss=0.9754, diff_loss=0.2938, tot_loss=1.474, over 108.00 samples.], 2024-10-21 14:28:19,989 INFO [train.py:561] (1/4) Epoch 2431, batch 10, global_batch_idx: 38890, batch size: 111, loss[dur_loss=0.2062, prior_loss=0.9758, diff_loss=0.3061, tot_loss=1.488, over 111.00 samples.], tot_loss[dur_loss=0.1992, prior_loss=0.9745, diff_loss=0.3709, tot_loss=1.545, over 1656.00 samples.], 2024-10-21 14:28:27,061 INFO [train.py:682] (1/4) Start epoch 2432 2024-10-21 14:28:41,177 INFO [train.py:561] (1/4) Epoch 2432, batch 4, global_batch_idx: 38900, batch size: 189, loss[dur_loss=0.2033, 
prior_loss=0.9751, diff_loss=0.3363, tot_loss=1.515, over 189.00 samples.], tot_loss[dur_loss=0.1991, prior_loss=0.9741, diff_loss=0.4195, tot_loss=1.593, over 937.00 samples.], 2024-10-21 14:28:56,044 INFO [train.py:561] (1/4) Epoch 2432, batch 14, global_batch_idx: 38910, batch size: 142, loss[dur_loss=0.2014, prior_loss=0.9746, diff_loss=0.3048, tot_loss=1.481, over 142.00 samples.], tot_loss[dur_loss=0.2017, prior_loss=0.9746, diff_loss=0.3525, tot_loss=1.529, over 2210.00 samples.], 2024-10-21 14:28:57,477 INFO [train.py:682] (1/4) Start epoch 2433 2024-10-21 14:29:17,690 INFO [train.py:561] (1/4) Epoch 2433, batch 8, global_batch_idx: 38920, batch size: 170, loss[dur_loss=0.204, prior_loss=0.975, diff_loss=0.3173, tot_loss=1.496, over 170.00 samples.], tot_loss[dur_loss=0.2002, prior_loss=0.9744, diff_loss=0.3825, tot_loss=1.557, over 1432.00 samples.], 2024-10-21 14:29:27,923 INFO [train.py:682] (1/4) Start epoch 2434 2024-10-21 14:29:39,281 INFO [train.py:561] (1/4) Epoch 2434, batch 2, global_batch_idx: 38930, batch size: 203, loss[dur_loss=0.2013, prior_loss=0.9748, diff_loss=0.3324, tot_loss=1.508, over 203.00 samples.], tot_loss[dur_loss=0.2025, prior_loss=0.9749, diff_loss=0.3245, tot_loss=1.502, over 442.00 samples.], 2024-10-21 14:29:53,538 INFO [train.py:561] (1/4) Epoch 2434, batch 12, global_batch_idx: 38940, batch size: 152, loss[dur_loss=0.2028, prior_loss=0.9749, diff_loss=0.282, tot_loss=1.46, over 152.00 samples.], tot_loss[dur_loss=0.2013, prior_loss=0.9746, diff_loss=0.364, tot_loss=1.54, over 1966.00 samples.], 2024-10-21 14:29:58,006 INFO [train.py:682] (1/4) Start epoch 2435 2024-10-21 14:30:14,983 INFO [train.py:561] (1/4) Epoch 2435, batch 6, global_batch_idx: 38950, batch size: 106, loss[dur_loss=0.2042, prior_loss=0.975, diff_loss=0.3058, tot_loss=1.485, over 106.00 samples.], tot_loss[dur_loss=0.199, prior_loss=0.9742, diff_loss=0.4088, tot_loss=1.582, over 1142.00 samples.], 2024-10-21 14:30:28,031 INFO [train.py:682] (1/4) Start epoch 2436 2024-10-21 14:30:36,552 INFO [train.py:561] (1/4) Epoch 2436, batch 0, global_batch_idx: 38960, batch size: 108, loss[dur_loss=0.2061, prior_loss=0.9754, diff_loss=0.3173, tot_loss=1.499, over 108.00 samples.], tot_loss[dur_loss=0.2061, prior_loss=0.9754, diff_loss=0.3173, tot_loss=1.499, over 108.00 samples.], 2024-10-21 14:30:50,776 INFO [train.py:561] (1/4) Epoch 2436, batch 10, global_batch_idx: 38970, batch size: 111, loss[dur_loss=0.2052, prior_loss=0.9759, diff_loss=0.3033, tot_loss=1.484, over 111.00 samples.], tot_loss[dur_loss=0.2003, prior_loss=0.9745, diff_loss=0.3744, tot_loss=1.549, over 1656.00 samples.], 2024-10-21 14:30:57,854 INFO [train.py:682] (1/4) Start epoch 2437 2024-10-21 14:31:11,743 INFO [train.py:561] (1/4) Epoch 2437, batch 4, global_batch_idx: 38980, batch size: 189, loss[dur_loss=0.2012, prior_loss=0.9747, diff_loss=0.3258, tot_loss=1.502, over 189.00 samples.], tot_loss[dur_loss=0.1983, prior_loss=0.974, diff_loss=0.4212, tot_loss=1.594, over 937.00 samples.], 2024-10-21 14:31:26,660 INFO [train.py:561] (1/4) Epoch 2437, batch 14, global_batch_idx: 38990, batch size: 142, loss[dur_loss=0.2001, prior_loss=0.9744, diff_loss=0.3077, tot_loss=1.482, over 142.00 samples.], tot_loss[dur_loss=0.2007, prior_loss=0.9746, diff_loss=0.3571, tot_loss=1.532, over 2210.00 samples.], 2024-10-21 14:31:28,119 INFO [train.py:682] (1/4) Start epoch 2438 2024-10-21 14:31:48,033 INFO [train.py:561] (1/4) Epoch 2438, batch 8, global_batch_idx: 39000, batch size: 170, loss[dur_loss=0.2049, prior_loss=0.975, 
diff_loss=0.3351, tot_loss=1.515, over 170.00 samples.], tot_loss[dur_loss=0.1995, prior_loss=0.9744, diff_loss=0.3839, tot_loss=1.558, over 1432.00 samples.], 2024-10-21 14:31:49,519 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 14:32:19,669 INFO [train.py:589] (1/4) Epoch 2438, validation: dur_loss=0.4559, prior_loss=1.037, diff_loss=0.3724, tot_loss=1.865, over 100.00 samples. 2024-10-21 14:32:19,670 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 14:32:28,501 INFO [train.py:682] (1/4) Start epoch 2439 2024-10-21 14:32:40,278 INFO [train.py:561] (1/4) Epoch 2439, batch 2, global_batch_idx: 39010, batch size: 203, loss[dur_loss=0.2026, prior_loss=0.9747, diff_loss=0.3366, tot_loss=1.514, over 203.00 samples.], tot_loss[dur_loss=0.2028, prior_loss=0.9748, diff_loss=0.3113, tot_loss=1.489, over 442.00 samples.], 2024-10-21 14:32:54,594 INFO [train.py:561] (1/4) Epoch 2439, batch 12, global_batch_idx: 39020, batch size: 152, loss[dur_loss=0.2022, prior_loss=0.9748, diff_loss=0.3049, tot_loss=1.482, over 152.00 samples.], tot_loss[dur_loss=0.2011, prior_loss=0.9745, diff_loss=0.3603, tot_loss=1.536, over 1966.00 samples.], 2024-10-21 14:32:59,059 INFO [train.py:682] (1/4) Start epoch 2440 2024-10-21 14:33:16,567 INFO [train.py:561] (1/4) Epoch 2440, batch 6, global_batch_idx: 39030, batch size: 106, loss[dur_loss=0.203, prior_loss=0.9748, diff_loss=0.276, tot_loss=1.454, over 106.00 samples.], tot_loss[dur_loss=0.1998, prior_loss=0.9742, diff_loss=0.4064, tot_loss=1.58, over 1142.00 samples.], 2024-10-21 14:33:29,859 INFO [train.py:682] (1/4) Start epoch 2441 2024-10-21 14:33:38,700 INFO [train.py:561] (1/4) Epoch 2441, batch 0, global_batch_idx: 39040, batch size: 108, loss[dur_loss=0.2039, prior_loss=0.9753, diff_loss=0.282, tot_loss=1.461, over 108.00 samples.], tot_loss[dur_loss=0.2039, prior_loss=0.9753, diff_loss=0.282, tot_loss=1.461, over 108.00 samples.], 2024-10-21 14:33:53,080 INFO [train.py:561] (1/4) Epoch 2441, batch 10, global_batch_idx: 39050, batch size: 111, loss[dur_loss=0.2045, prior_loss=0.9758, diff_loss=0.2876, tot_loss=1.468, over 111.00 samples.], tot_loss[dur_loss=0.2005, prior_loss=0.9745, diff_loss=0.3583, tot_loss=1.533, over 1656.00 samples.], 2024-10-21 14:34:00,191 INFO [train.py:682] (1/4) Start epoch 2442 2024-10-21 14:34:14,420 INFO [train.py:561] (1/4) Epoch 2442, batch 4, global_batch_idx: 39060, batch size: 189, loss[dur_loss=0.2023, prior_loss=0.9748, diff_loss=0.3198, tot_loss=1.497, over 189.00 samples.], tot_loss[dur_loss=0.1988, prior_loss=0.974, diff_loss=0.4248, tot_loss=1.598, over 937.00 samples.], 2024-10-21 14:34:29,448 INFO [train.py:561] (1/4) Epoch 2442, batch 14, global_batch_idx: 39070, batch size: 142, loss[dur_loss=0.2011, prior_loss=0.9744, diff_loss=0.299, tot_loss=1.475, over 142.00 samples.], tot_loss[dur_loss=0.2012, prior_loss=0.9746, diff_loss=0.3544, tot_loss=1.53, over 2210.00 samples.], 2024-10-21 14:34:30,887 INFO [train.py:682] (1/4) Start epoch 2443 2024-10-21 14:34:50,966 INFO [train.py:561] (1/4) Epoch 2443, batch 8, global_batch_idx: 39080, batch size: 170, loss[dur_loss=0.2071, prior_loss=0.9748, diff_loss=0.2932, tot_loss=1.475, over 170.00 samples.], tot_loss[dur_loss=0.1992, prior_loss=0.9743, diff_loss=0.3794, tot_loss=1.553, over 1432.00 samples.], 2024-10-21 14:35:01,151 INFO [train.py:682] (1/4) Start epoch 2444 2024-10-21 14:35:12,542 INFO [train.py:561] (1/4) Epoch 2444, batch 2, global_batch_idx: 39090, batch size: 203, loss[dur_loss=0.2021, prior_loss=0.9747, 
diff_loss=0.3085, tot_loss=1.485, over 203.00 samples.], tot_loss[dur_loss=0.2033, prior_loss=0.9749, diff_loss=0.2932, tot_loss=1.471, over 442.00 samples.], 2024-10-21 14:35:26,787 INFO [train.py:561] (1/4) Epoch 2444, batch 12, global_batch_idx: 39100, batch size: 152, loss[dur_loss=0.2033, prior_loss=0.9749, diff_loss=0.3055, tot_loss=1.484, over 152.00 samples.], tot_loss[dur_loss=0.2007, prior_loss=0.9745, diff_loss=0.3544, tot_loss=1.53, over 1966.00 samples.], 2024-10-21 14:35:31,258 INFO [train.py:682] (1/4) Start epoch 2445 2024-10-21 14:35:48,263 INFO [train.py:561] (1/4) Epoch 2445, batch 6, global_batch_idx: 39110, batch size: 106, loss[dur_loss=0.2025, prior_loss=0.975, diff_loss=0.2792, tot_loss=1.457, over 106.00 samples.], tot_loss[dur_loss=0.1989, prior_loss=0.9741, diff_loss=0.397, tot_loss=1.57, over 1142.00 samples.], 2024-10-21 14:36:01,276 INFO [train.py:682] (1/4) Start epoch 2446 2024-10-21 14:36:10,085 INFO [train.py:561] (1/4) Epoch 2446, batch 0, global_batch_idx: 39120, batch size: 108, loss[dur_loss=0.2032, prior_loss=0.9754, diff_loss=0.324, tot_loss=1.503, over 108.00 samples.], tot_loss[dur_loss=0.2032, prior_loss=0.9754, diff_loss=0.324, tot_loss=1.503, over 108.00 samples.], 2024-10-21 14:36:24,472 INFO [train.py:561] (1/4) Epoch 2446, batch 10, global_batch_idx: 39130, batch size: 111, loss[dur_loss=0.2061, prior_loss=0.9757, diff_loss=0.286, tot_loss=1.468, over 111.00 samples.], tot_loss[dur_loss=0.1998, prior_loss=0.9743, diff_loss=0.3693, tot_loss=1.543, over 1656.00 samples.], 2024-10-21 14:36:31,647 INFO [train.py:682] (1/4) Start epoch 2447 2024-10-21 14:36:45,506 INFO [train.py:561] (1/4) Epoch 2447, batch 4, global_batch_idx: 39140, batch size: 189, loss[dur_loss=0.2025, prior_loss=0.9748, diff_loss=0.353, tot_loss=1.53, over 189.00 samples.], tot_loss[dur_loss=0.198, prior_loss=0.9739, diff_loss=0.4169, tot_loss=1.589, over 937.00 samples.], 2024-10-21 14:37:00,421 INFO [train.py:561] (1/4) Epoch 2447, batch 14, global_batch_idx: 39150, batch size: 142, loss[dur_loss=0.2035, prior_loss=0.9745, diff_loss=0.2924, tot_loss=1.47, over 142.00 samples.], tot_loss[dur_loss=0.2014, prior_loss=0.9745, diff_loss=0.3565, tot_loss=1.532, over 2210.00 samples.], 2024-10-21 14:37:01,859 INFO [train.py:682] (1/4) Start epoch 2448 2024-10-21 14:37:22,022 INFO [train.py:561] (1/4) Epoch 2448, batch 8, global_batch_idx: 39160, batch size: 170, loss[dur_loss=0.2, prior_loss=0.9749, diff_loss=0.3277, tot_loss=1.503, over 170.00 samples.], tot_loss[dur_loss=0.1995, prior_loss=0.9744, diff_loss=0.3926, tot_loss=1.567, over 1432.00 samples.], 2024-10-21 14:37:32,186 INFO [train.py:682] (1/4) Start epoch 2449 2024-10-21 14:37:43,813 INFO [train.py:561] (1/4) Epoch 2449, batch 2, global_batch_idx: 39170, batch size: 203, loss[dur_loss=0.2017, prior_loss=0.9747, diff_loss=0.3395, tot_loss=1.516, over 203.00 samples.], tot_loss[dur_loss=0.202, prior_loss=0.9749, diff_loss=0.3215, tot_loss=1.498, over 442.00 samples.], 2024-10-21 14:37:58,005 INFO [train.py:561] (1/4) Epoch 2449, batch 12, global_batch_idx: 39180, batch size: 152, loss[dur_loss=0.2005, prior_loss=0.9746, diff_loss=0.2758, tot_loss=1.451, over 152.00 samples.], tot_loss[dur_loss=0.2, prior_loss=0.9745, diff_loss=0.3596, tot_loss=1.534, over 1966.00 samples.], 2024-10-21 14:38:02,460 INFO [train.py:682] (1/4) Start epoch 2450 2024-10-21 14:38:19,520 INFO [train.py:561] (1/4) Epoch 2450, batch 6, global_batch_idx: 39190, batch size: 106, loss[dur_loss=0.2005, prior_loss=0.9749, diff_loss=0.3211, 
tot_loss=1.497, over 106.00 samples.], tot_loss[dur_loss=0.1986, prior_loss=0.9741, diff_loss=0.3986, tot_loss=1.571, over 1142.00 samples.], 2024-10-21 14:38:32,453 INFO [train.py:682] (1/4) Start epoch 2451 2024-10-21 14:38:40,969 INFO [train.py:561] (1/4) Epoch 2451, batch 0, global_batch_idx: 39200, batch size: 108, loss[dur_loss=0.2045, prior_loss=0.9754, diff_loss=0.3054, tot_loss=1.485, over 108.00 samples.], tot_loss[dur_loss=0.2045, prior_loss=0.9754, diff_loss=0.3054, tot_loss=1.485, over 108.00 samples.], 2024-10-21 14:38:55,184 INFO [train.py:561] (1/4) Epoch 2451, batch 10, global_batch_idx: 39210, batch size: 111, loss[dur_loss=0.2051, prior_loss=0.9757, diff_loss=0.2894, tot_loss=1.47, over 111.00 samples.], tot_loss[dur_loss=0.1993, prior_loss=0.9743, diff_loss=0.3723, tot_loss=1.546, over 1656.00 samples.], 2024-10-21 14:39:02,254 INFO [train.py:682] (1/4) Start epoch 2452 2024-10-21 14:39:16,226 INFO [train.py:561] (1/4) Epoch 2452, batch 4, global_batch_idx: 39220, batch size: 189, loss[dur_loss=0.2006, prior_loss=0.9749, diff_loss=0.3096, tot_loss=1.485, over 189.00 samples.], tot_loss[dur_loss=0.1976, prior_loss=0.9739, diff_loss=0.4092, tot_loss=1.581, over 937.00 samples.], 2024-10-21 14:39:30,997 INFO [train.py:561] (1/4) Epoch 2452, batch 14, global_batch_idx: 39230, batch size: 142, loss[dur_loss=0.2009, prior_loss=0.9744, diff_loss=0.2743, tot_loss=1.45, over 142.00 samples.], tot_loss[dur_loss=0.2008, prior_loss=0.9746, diff_loss=0.3515, tot_loss=1.527, over 2210.00 samples.], 2024-10-21 14:39:32,421 INFO [train.py:682] (1/4) Start epoch 2453 2024-10-21 14:39:52,235 INFO [train.py:561] (1/4) Epoch 2453, batch 8, global_batch_idx: 39240, batch size: 170, loss[dur_loss=0.2049, prior_loss=0.975, diff_loss=0.3031, tot_loss=1.483, over 170.00 samples.], tot_loss[dur_loss=0.1987, prior_loss=0.9743, diff_loss=0.3889, tot_loss=1.562, over 1432.00 samples.], 2024-10-21 14:40:02,374 INFO [train.py:682] (1/4) Start epoch 2454 2024-10-21 14:40:13,578 INFO [train.py:561] (1/4) Epoch 2454, batch 2, global_batch_idx: 39250, batch size: 203, loss[dur_loss=0.2046, prior_loss=0.9747, diff_loss=0.3202, tot_loss=1.499, over 203.00 samples.], tot_loss[dur_loss=0.2035, prior_loss=0.9749, diff_loss=0.3077, tot_loss=1.486, over 442.00 samples.], 2024-10-21 14:40:27,767 INFO [train.py:561] (1/4) Epoch 2454, batch 12, global_batch_idx: 39260, batch size: 152, loss[dur_loss=0.2027, prior_loss=0.9748, diff_loss=0.2859, tot_loss=1.463, over 152.00 samples.], tot_loss[dur_loss=0.2017, prior_loss=0.9746, diff_loss=0.3637, tot_loss=1.54, over 1966.00 samples.], 2024-10-21 14:40:32,214 INFO [train.py:682] (1/4) Start epoch 2455 2024-10-21 14:40:49,072 INFO [train.py:561] (1/4) Epoch 2455, batch 6, global_batch_idx: 39270, batch size: 106, loss[dur_loss=0.1987, prior_loss=0.9747, diff_loss=0.2938, tot_loss=1.467, over 106.00 samples.], tot_loss[dur_loss=0.1976, prior_loss=0.974, diff_loss=0.4067, tot_loss=1.578, over 1142.00 samples.], 2024-10-21 14:41:02,095 INFO [train.py:682] (1/4) Start epoch 2456 2024-10-21 14:41:10,571 INFO [train.py:561] (1/4) Epoch 2456, batch 0, global_batch_idx: 39280, batch size: 108, loss[dur_loss=0.2035, prior_loss=0.9752, diff_loss=0.2972, tot_loss=1.476, over 108.00 samples.], tot_loss[dur_loss=0.2035, prior_loss=0.9752, diff_loss=0.2972, tot_loss=1.476, over 108.00 samples.], 2024-10-21 14:41:25,089 INFO [train.py:561] (1/4) Epoch 2456, batch 10, global_batch_idx: 39290, batch size: 111, loss[dur_loss=0.2056, prior_loss=0.9758, diff_loss=0.2463, tot_loss=1.428, 
over 111.00 samples.], tot_loss[dur_loss=0.1992, prior_loss=0.9742, diff_loss=0.367, tot_loss=1.54, over 1656.00 samples.], 2024-10-21 14:41:32,257 INFO [train.py:682] (1/4) Start epoch 2457 2024-10-21 14:41:46,169 INFO [train.py:561] (1/4) Epoch 2457, batch 4, global_batch_idx: 39300, batch size: 189, loss[dur_loss=0.199, prior_loss=0.9745, diff_loss=0.3199, tot_loss=1.493, over 189.00 samples.], tot_loss[dur_loss=0.1961, prior_loss=0.9738, diff_loss=0.4244, tot_loss=1.594, over 937.00 samples.], 2024-10-21 14:42:01,095 INFO [train.py:561] (1/4) Epoch 2457, batch 14, global_batch_idx: 39310, batch size: 142, loss[dur_loss=0.2031, prior_loss=0.9743, diff_loss=0.3202, tot_loss=1.498, over 142.00 samples.], tot_loss[dur_loss=0.1995, prior_loss=0.9744, diff_loss=0.3525, tot_loss=1.526, over 2210.00 samples.], 2024-10-21 14:42:02,535 INFO [train.py:682] (1/4) Start epoch 2458 2024-10-21 14:42:22,655 INFO [train.py:561] (1/4) Epoch 2458, batch 8, global_batch_idx: 39320, batch size: 170, loss[dur_loss=0.2006, prior_loss=0.9745, diff_loss=0.2844, tot_loss=1.46, over 170.00 samples.], tot_loss[dur_loss=0.1982, prior_loss=0.9742, diff_loss=0.376, tot_loss=1.548, over 1432.00 samples.], 2024-10-21 14:42:32,857 INFO [train.py:682] (1/4) Start epoch 2459 2024-10-21 14:42:44,508 INFO [train.py:561] (1/4) Epoch 2459, batch 2, global_batch_idx: 39330, batch size: 203, loss[dur_loss=0.2045, prior_loss=0.975, diff_loss=0.3045, tot_loss=1.484, over 203.00 samples.], tot_loss[dur_loss=0.2035, prior_loss=0.975, diff_loss=0.2952, tot_loss=1.474, over 442.00 samples.], 2024-10-21 14:42:59,004 INFO [train.py:561] (1/4) Epoch 2459, batch 12, global_batch_idx: 39340, batch size: 152, loss[dur_loss=0.2008, prior_loss=0.9747, diff_loss=0.2757, tot_loss=1.451, over 152.00 samples.], tot_loss[dur_loss=0.1998, prior_loss=0.9745, diff_loss=0.3538, tot_loss=1.528, over 1966.00 samples.], 2024-10-21 14:43:03,472 INFO [train.py:682] (1/4) Start epoch 2460 2024-10-21 14:43:20,560 INFO [train.py:561] (1/4) Epoch 2460, batch 6, global_batch_idx: 39350, batch size: 106, loss[dur_loss=0.2005, prior_loss=0.9747, diff_loss=0.2326, tot_loss=1.408, over 106.00 samples.], tot_loss[dur_loss=0.1964, prior_loss=0.974, diff_loss=0.3879, tot_loss=1.558, over 1142.00 samples.], 2024-10-21 14:43:33,695 INFO [train.py:682] (1/4) Start epoch 2461 2024-10-21 14:43:42,625 INFO [train.py:561] (1/4) Epoch 2461, batch 0, global_batch_idx: 39360, batch size: 108, loss[dur_loss=0.2039, prior_loss=0.9754, diff_loss=0.3464, tot_loss=1.526, over 108.00 samples.], tot_loss[dur_loss=0.2039, prior_loss=0.9754, diff_loss=0.3464, tot_loss=1.526, over 108.00 samples.], 2024-10-21 14:43:56,818 INFO [train.py:561] (1/4) Epoch 2461, batch 10, global_batch_idx: 39370, batch size: 111, loss[dur_loss=0.2043, prior_loss=0.9756, diff_loss=0.3313, tot_loss=1.511, over 111.00 samples.], tot_loss[dur_loss=0.1991, prior_loss=0.9743, diff_loss=0.3749, tot_loss=1.548, over 1656.00 samples.], 2024-10-21 14:44:03,898 INFO [train.py:682] (1/4) Start epoch 2462 2024-10-21 14:44:17,710 INFO [train.py:561] (1/4) Epoch 2462, batch 4, global_batch_idx: 39380, batch size: 189, loss[dur_loss=0.2007, prior_loss=0.9749, diff_loss=0.307, tot_loss=1.483, over 189.00 samples.], tot_loss[dur_loss=0.1973, prior_loss=0.9739, diff_loss=0.4168, tot_loss=1.588, over 937.00 samples.], 2024-10-21 14:44:32,540 INFO [train.py:561] (1/4) Epoch 2462, batch 14, global_batch_idx: 39390, batch size: 142, loss[dur_loss=0.2037, prior_loss=0.9744, diff_loss=0.3193, tot_loss=1.497, over 142.00 
samples.], tot_loss[dur_loss=0.1998, prior_loss=0.9745, diff_loss=0.3527, tot_loss=1.527, over 2210.00 samples.], 2024-10-21 14:44:33,967 INFO [train.py:682] (1/4) Start epoch 2463 2024-10-21 14:44:53,999 INFO [train.py:561] (1/4) Epoch 2463, batch 8, global_batch_idx: 39400, batch size: 170, loss[dur_loss=0.2024, prior_loss=0.975, diff_loss=0.3379, tot_loss=1.515, over 170.00 samples.], tot_loss[dur_loss=0.1989, prior_loss=0.9742, diff_loss=0.3883, tot_loss=1.561, over 1432.00 samples.], 2024-10-21 14:45:04,076 INFO [train.py:682] (1/4) Start epoch 2464 2024-10-21 14:45:15,298 INFO [train.py:561] (1/4) Epoch 2464, batch 2, global_batch_idx: 39410, batch size: 203, loss[dur_loss=0.2011, prior_loss=0.9744, diff_loss=0.3453, tot_loss=1.521, over 203.00 samples.], tot_loss[dur_loss=0.2018, prior_loss=0.9747, diff_loss=0.3204, tot_loss=1.497, over 442.00 samples.], 2024-10-21 14:45:29,410 INFO [train.py:561] (1/4) Epoch 2464, batch 12, global_batch_idx: 39420, batch size: 152, loss[dur_loss=0.2006, prior_loss=0.9747, diff_loss=0.3016, tot_loss=1.477, over 152.00 samples.], tot_loss[dur_loss=0.1996, prior_loss=0.9743, diff_loss=0.3568, tot_loss=1.531, over 1966.00 samples.], 2024-10-21 14:45:33,826 INFO [train.py:682] (1/4) Start epoch 2465 2024-10-21 14:45:50,554 INFO [train.py:561] (1/4) Epoch 2465, batch 6, global_batch_idx: 39430, batch size: 106, loss[dur_loss=0.2016, prior_loss=0.9748, diff_loss=0.282, tot_loss=1.458, over 106.00 samples.], tot_loss[dur_loss=0.1973, prior_loss=0.974, diff_loss=0.3979, tot_loss=1.569, over 1142.00 samples.], 2024-10-21 14:46:03,563 INFO [train.py:682] (1/4) Start epoch 2466 2024-10-21 14:46:12,239 INFO [train.py:561] (1/4) Epoch 2466, batch 0, global_batch_idx: 39440, batch size: 108, loss[dur_loss=0.2032, prior_loss=0.9754, diff_loss=0.2958, tot_loss=1.474, over 108.00 samples.], tot_loss[dur_loss=0.2032, prior_loss=0.9754, diff_loss=0.2958, tot_loss=1.474, over 108.00 samples.], 2024-10-21 14:46:26,503 INFO [train.py:561] (1/4) Epoch 2466, batch 10, global_batch_idx: 39450, batch size: 111, loss[dur_loss=0.2016, prior_loss=0.9757, diff_loss=0.2793, tot_loss=1.456, over 111.00 samples.], tot_loss[dur_loss=0.1986, prior_loss=0.9743, diff_loss=0.3733, tot_loss=1.546, over 1656.00 samples.], 2024-10-21 14:46:33,606 INFO [train.py:682] (1/4) Start epoch 2467 2024-10-21 14:46:47,179 INFO [train.py:561] (1/4) Epoch 2467, batch 4, global_batch_idx: 39460, batch size: 189, loss[dur_loss=0.2016, prior_loss=0.9749, diff_loss=0.3417, tot_loss=1.518, over 189.00 samples.], tot_loss[dur_loss=0.1963, prior_loss=0.9738, diff_loss=0.4245, tot_loss=1.595, over 937.00 samples.], 2024-10-21 14:47:02,113 INFO [train.py:561] (1/4) Epoch 2467, batch 14, global_batch_idx: 39470, batch size: 142, loss[dur_loss=0.2031, prior_loss=0.9743, diff_loss=0.3094, tot_loss=1.487, over 142.00 samples.], tot_loss[dur_loss=0.1999, prior_loss=0.9744, diff_loss=0.3542, tot_loss=1.529, over 2210.00 samples.], 2024-10-21 14:47:03,538 INFO [train.py:682] (1/4) Start epoch 2468 2024-10-21 14:47:23,775 INFO [train.py:561] (1/4) Epoch 2468, batch 8, global_batch_idx: 39480, batch size: 170, loss[dur_loss=0.2023, prior_loss=0.9748, diff_loss=0.3114, tot_loss=1.488, over 170.00 samples.], tot_loss[dur_loss=0.1985, prior_loss=0.9742, diff_loss=0.3893, tot_loss=1.562, over 1432.00 samples.], 2024-10-21 14:47:33,854 INFO [train.py:682] (1/4) Start epoch 2469 2024-10-21 14:47:45,186 INFO [train.py:561] (1/4) Epoch 2469, batch 2, global_batch_idx: 39490, batch size: 203, loss[dur_loss=0.2001, 
prior_loss=0.9746, diff_loss=0.3433, tot_loss=1.518, over 203.00 samples.], tot_loss[dur_loss=0.2027, prior_loss=0.9747, diff_loss=0.3088, tot_loss=1.486, over 442.00 samples.], 2024-10-21 14:47:59,392 INFO [train.py:561] (1/4) Epoch 2469, batch 12, global_batch_idx: 39500, batch size: 152, loss[dur_loss=0.205, prior_loss=0.9748, diff_loss=0.307, tot_loss=1.487, over 152.00 samples.], tot_loss[dur_loss=0.2004, prior_loss=0.9744, diff_loss=0.3607, tot_loss=1.536, over 1966.00 samples.], 2024-10-21 14:48:03,836 INFO [train.py:682] (1/4) Start epoch 2470 2024-10-21 14:48:20,568 INFO [train.py:561] (1/4) Epoch 2470, batch 6, global_batch_idx: 39510, batch size: 106, loss[dur_loss=0.1996, prior_loss=0.9749, diff_loss=0.3021, tot_loss=1.477, over 106.00 samples.], tot_loss[dur_loss=0.1967, prior_loss=0.9741, diff_loss=0.4017, tot_loss=1.573, over 1142.00 samples.], 2024-10-21 14:48:33,580 INFO [train.py:682] (1/4) Start epoch 2471 2024-10-21 14:48:42,341 INFO [train.py:561] (1/4) Epoch 2471, batch 0, global_batch_idx: 39520, batch size: 108, loss[dur_loss=0.2045, prior_loss=0.9752, diff_loss=0.311, tot_loss=1.491, over 108.00 samples.], tot_loss[dur_loss=0.2045, prior_loss=0.9752, diff_loss=0.311, tot_loss=1.491, over 108.00 samples.], 2024-10-21 14:48:56,494 INFO [train.py:561] (1/4) Epoch 2471, batch 10, global_batch_idx: 39530, batch size: 111, loss[dur_loss=0.204, prior_loss=0.9757, diff_loss=0.2991, tot_loss=1.479, over 111.00 samples.], tot_loss[dur_loss=0.1985, prior_loss=0.9743, diff_loss=0.3754, tot_loss=1.548, over 1656.00 samples.], 2024-10-21 14:49:03,487 INFO [train.py:682] (1/4) Start epoch 2472 2024-10-21 14:49:17,038 INFO [train.py:561] (1/4) Epoch 2472, batch 4, global_batch_idx: 39540, batch size: 189, loss[dur_loss=0.1981, prior_loss=0.9745, diff_loss=0.3319, tot_loss=1.504, over 189.00 samples.], tot_loss[dur_loss=0.196, prior_loss=0.9737, diff_loss=0.4274, tot_loss=1.597, over 937.00 samples.], 2024-10-21 14:49:31,866 INFO [train.py:561] (1/4) Epoch 2472, batch 14, global_batch_idx: 39550, batch size: 142, loss[dur_loss=0.2036, prior_loss=0.9745, diff_loss=0.3449, tot_loss=1.523, over 142.00 samples.], tot_loss[dur_loss=0.1993, prior_loss=0.9743, diff_loss=0.3611, tot_loss=1.535, over 2210.00 samples.], 2024-10-21 14:49:33,290 INFO [train.py:682] (1/4) Start epoch 2473 2024-10-21 14:49:53,254 INFO [train.py:561] (1/4) Epoch 2473, batch 8, global_batch_idx: 39560, batch size: 170, loss[dur_loss=0.2028, prior_loss=0.9747, diff_loss=0.3018, tot_loss=1.479, over 170.00 samples.], tot_loss[dur_loss=0.1984, prior_loss=0.9741, diff_loss=0.3896, tot_loss=1.562, over 1432.00 samples.], 2024-10-21 14:50:03,293 INFO [train.py:682] (1/4) Start epoch 2474 2024-10-21 14:50:14,725 INFO [train.py:561] (1/4) Epoch 2474, batch 2, global_batch_idx: 39570, batch size: 203, loss[dur_loss=0.2023, prior_loss=0.9743, diff_loss=0.3049, tot_loss=1.482, over 203.00 samples.], tot_loss[dur_loss=0.202, prior_loss=0.9745, diff_loss=0.308, tot_loss=1.484, over 442.00 samples.], 2024-10-21 14:50:28,883 INFO [train.py:561] (1/4) Epoch 2474, batch 12, global_batch_idx: 39580, batch size: 152, loss[dur_loss=0.1999, prior_loss=0.9749, diff_loss=0.325, tot_loss=1.5, over 152.00 samples.], tot_loss[dur_loss=0.199, prior_loss=0.9742, diff_loss=0.3575, tot_loss=1.531, over 1966.00 samples.], 2024-10-21 14:50:33,310 INFO [train.py:682] (1/4) Start epoch 2475 2024-10-21 14:50:50,351 INFO [train.py:561] (1/4) Epoch 2475, batch 6, global_batch_idx: 39590, batch size: 106, loss[dur_loss=0.1976, prior_loss=0.9746, 
diff_loss=0.33, tot_loss=1.502, over 106.00 samples.], tot_loss[dur_loss=0.1962, prior_loss=0.974, diff_loss=0.4049, tot_loss=1.575, over 1142.00 samples.], 2024-10-21 14:51:03,631 INFO [train.py:682] (1/4) Start epoch 2476 2024-10-21 14:51:12,153 INFO [train.py:561] (1/4) Epoch 2476, batch 0, global_batch_idx: 39600, batch size: 108, loss[dur_loss=0.2034, prior_loss=0.9751, diff_loss=0.3076, tot_loss=1.486, over 108.00 samples.], tot_loss[dur_loss=0.2034, prior_loss=0.9751, diff_loss=0.3076, tot_loss=1.486, over 108.00 samples.], 2024-10-21 14:51:26,423 INFO [train.py:561] (1/4) Epoch 2476, batch 10, global_batch_idx: 39610, batch size: 111, loss[dur_loss=0.204, prior_loss=0.9755, diff_loss=0.287, tot_loss=1.466, over 111.00 samples.], tot_loss[dur_loss=0.1997, prior_loss=0.9744, diff_loss=0.3802, tot_loss=1.554, over 1656.00 samples.], 2024-10-21 14:51:33,526 INFO [train.py:682] (1/4) Start epoch 2477 2024-10-21 14:51:47,326 INFO [train.py:561] (1/4) Epoch 2477, batch 4, global_batch_idx: 39620, batch size: 189, loss[dur_loss=0.1985, prior_loss=0.9747, diff_loss=0.3379, tot_loss=1.511, over 189.00 samples.], tot_loss[dur_loss=0.1985, prior_loss=0.9738, diff_loss=0.4127, tot_loss=1.585, over 937.00 samples.], 2024-10-21 14:52:02,159 INFO [train.py:561] (1/4) Epoch 2477, batch 14, global_batch_idx: 39630, batch size: 142, loss[dur_loss=0.2011, prior_loss=0.9743, diff_loss=0.3012, tot_loss=1.477, over 142.00 samples.], tot_loss[dur_loss=0.2, prior_loss=0.9743, diff_loss=0.3547, tot_loss=1.529, over 2210.00 samples.], 2024-10-21 14:52:03,582 INFO [train.py:682] (1/4) Start epoch 2478 2024-10-21 14:52:23,299 INFO [train.py:561] (1/4) Epoch 2478, batch 8, global_batch_idx: 39640, batch size: 170, loss[dur_loss=0.2024, prior_loss=0.9746, diff_loss=0.3095, tot_loss=1.487, over 170.00 samples.], tot_loss[dur_loss=0.1994, prior_loss=0.9741, diff_loss=0.3785, tot_loss=1.552, over 1432.00 samples.], 2024-10-21 14:52:33,442 INFO [train.py:682] (1/4) Start epoch 2479 2024-10-21 14:52:44,955 INFO [train.py:561] (1/4) Epoch 2479, batch 2, global_batch_idx: 39650, batch size: 203, loss[dur_loss=0.2001, prior_loss=0.9745, diff_loss=0.3131, tot_loss=1.488, over 203.00 samples.], tot_loss[dur_loss=0.2013, prior_loss=0.9746, diff_loss=0.3095, tot_loss=1.485, over 442.00 samples.], 2024-10-21 14:52:59,156 INFO [train.py:561] (1/4) Epoch 2479, batch 12, global_batch_idx: 39660, batch size: 152, loss[dur_loss=0.2012, prior_loss=0.9747, diff_loss=0.2971, tot_loss=1.473, over 152.00 samples.], tot_loss[dur_loss=0.1992, prior_loss=0.9742, diff_loss=0.3558, tot_loss=1.529, over 1966.00 samples.], 2024-10-21 14:53:03,617 INFO [train.py:682] (1/4) Start epoch 2480 2024-10-21 14:53:20,710 INFO [train.py:561] (1/4) Epoch 2480, batch 6, global_batch_idx: 39670, batch size: 106, loss[dur_loss=0.201, prior_loss=0.9746, diff_loss=0.3178, tot_loss=1.493, over 106.00 samples.], tot_loss[dur_loss=0.1979, prior_loss=0.9739, diff_loss=0.4007, tot_loss=1.573, over 1142.00 samples.], 2024-10-21 14:53:33,706 INFO [train.py:682] (1/4) Start epoch 2481 2024-10-21 14:53:42,516 INFO [train.py:561] (1/4) Epoch 2481, batch 0, global_batch_idx: 39680, batch size: 108, loss[dur_loss=0.2042, prior_loss=0.9751, diff_loss=0.3266, tot_loss=1.506, over 108.00 samples.], tot_loss[dur_loss=0.2042, prior_loss=0.9751, diff_loss=0.3266, tot_loss=1.506, over 108.00 samples.], 2024-10-21 14:53:56,725 INFO [train.py:561] (1/4) Epoch 2481, batch 10, global_batch_idx: 39690, batch size: 111, loss[dur_loss=0.2018, prior_loss=0.9754, diff_loss=0.3243, 
tot_loss=1.502, over 111.00 samples.], tot_loss[dur_loss=0.1988, prior_loss=0.9742, diff_loss=0.3816, tot_loss=1.555, over 1656.00 samples.], 2024-10-21 14:54:03,813 INFO [train.py:682] (1/4) Start epoch 2482 2024-10-21 14:54:17,507 INFO [train.py:561] (1/4) Epoch 2482, batch 4, global_batch_idx: 39700, batch size: 189, loss[dur_loss=0.2004, prior_loss=0.9744, diff_loss=0.3107, tot_loss=1.485, over 189.00 samples.], tot_loss[dur_loss=0.1962, prior_loss=0.9736, diff_loss=0.4197, tot_loss=1.589, over 937.00 samples.], 2024-10-21 14:54:32,308 INFO [train.py:561] (1/4) Epoch 2482, batch 14, global_batch_idx: 39710, batch size: 142, loss[dur_loss=0.2015, prior_loss=0.9744, diff_loss=0.2938, tot_loss=1.47, over 142.00 samples.], tot_loss[dur_loss=0.1989, prior_loss=0.9742, diff_loss=0.3547, tot_loss=1.528, over 2210.00 samples.], 2024-10-21 14:54:33,742 INFO [train.py:682] (1/4) Start epoch 2483 2024-10-21 14:54:53,555 INFO [train.py:561] (1/4) Epoch 2483, batch 8, global_batch_idx: 39720, batch size: 170, loss[dur_loss=0.2015, prior_loss=0.9745, diff_loss=0.3534, tot_loss=1.529, over 170.00 samples.], tot_loss[dur_loss=0.1972, prior_loss=0.974, diff_loss=0.3868, tot_loss=1.558, over 1432.00 samples.], 2024-10-21 14:55:03,668 INFO [train.py:682] (1/4) Start epoch 2484 2024-10-21 14:55:14,796 INFO [train.py:561] (1/4) Epoch 2484, batch 2, global_batch_idx: 39730, batch size: 203, loss[dur_loss=0.2018, prior_loss=0.9745, diff_loss=0.3041, tot_loss=1.48, over 203.00 samples.], tot_loss[dur_loss=0.2021, prior_loss=0.9746, diff_loss=0.2806, tot_loss=1.457, over 442.00 samples.], 2024-10-21 14:55:28,956 INFO [train.py:561] (1/4) Epoch 2484, batch 12, global_batch_idx: 39740, batch size: 152, loss[dur_loss=0.202, prior_loss=0.9746, diff_loss=0.3093, tot_loss=1.486, over 152.00 samples.], tot_loss[dur_loss=0.1991, prior_loss=0.9742, diff_loss=0.3506, tot_loss=1.524, over 1966.00 samples.], 2024-10-21 14:55:33,380 INFO [train.py:682] (1/4) Start epoch 2485 2024-10-21 14:55:50,287 INFO [train.py:561] (1/4) Epoch 2485, batch 6, global_batch_idx: 39750, batch size: 106, loss[dur_loss=0.1987, prior_loss=0.9746, diff_loss=0.2851, tot_loss=1.458, over 106.00 samples.], tot_loss[dur_loss=0.1974, prior_loss=0.9739, diff_loss=0.4099, tot_loss=1.581, over 1142.00 samples.], 2024-10-21 14:56:03,255 INFO [train.py:682] (1/4) Start epoch 2486 2024-10-21 14:56:11,851 INFO [train.py:561] (1/4) Epoch 2486, batch 0, global_batch_idx: 39760, batch size: 108, loss[dur_loss=0.2057, prior_loss=0.9751, diff_loss=0.3021, tot_loss=1.483, over 108.00 samples.], tot_loss[dur_loss=0.2057, prior_loss=0.9751, diff_loss=0.3021, tot_loss=1.483, over 108.00 samples.], 2024-10-21 14:56:26,019 INFO [train.py:561] (1/4) Epoch 2486, batch 10, global_batch_idx: 39770, batch size: 111, loss[dur_loss=0.2037, prior_loss=0.9754, diff_loss=0.3226, tot_loss=1.502, over 111.00 samples.], tot_loss[dur_loss=0.1988, prior_loss=0.9742, diff_loss=0.3683, tot_loss=1.541, over 1656.00 samples.], 2024-10-21 14:56:33,117 INFO [train.py:682] (1/4) Start epoch 2487 2024-10-21 14:56:46,580 INFO [train.py:561] (1/4) Epoch 2487, batch 4, global_batch_idx: 39780, batch size: 189, loss[dur_loss=0.202, prior_loss=0.9746, diff_loss=0.3162, tot_loss=1.493, over 189.00 samples.], tot_loss[dur_loss=0.1966, prior_loss=0.9737, diff_loss=0.4115, tot_loss=1.582, over 937.00 samples.], 2024-10-21 14:57:01,385 INFO [train.py:561] (1/4) Epoch 2487, batch 14, global_batch_idx: 39790, batch size: 142, loss[dur_loss=0.2018, prior_loss=0.9741, diff_loss=0.3109, tot_loss=1.487, 
over 142.00 samples.], tot_loss[dur_loss=0.2002, prior_loss=0.9743, diff_loss=0.3594, tot_loss=1.534, over 2210.00 samples.], 2024-10-21 14:57:02,813 INFO [train.py:682] (1/4) Start epoch 2488 2024-10-21 14:57:22,715 INFO [train.py:561] (1/4) Epoch 2488, batch 8, global_batch_idx: 39800, batch size: 170, loss[dur_loss=0.201, prior_loss=0.9746, diff_loss=0.3341, tot_loss=1.51, over 170.00 samples.], tot_loss[dur_loss=0.1973, prior_loss=0.9741, diff_loss=0.3844, tot_loss=1.556, over 1432.00 samples.], 2024-10-21 14:57:32,809 INFO [train.py:682] (1/4) Start epoch 2489 2024-10-21 14:57:44,318 INFO [train.py:561] (1/4) Epoch 2489, batch 2, global_batch_idx: 39810, batch size: 203, loss[dur_loss=0.1998, prior_loss=0.9745, diff_loss=0.3301, tot_loss=1.505, over 203.00 samples.], tot_loss[dur_loss=0.2002, prior_loss=0.9746, diff_loss=0.3066, tot_loss=1.481, over 442.00 samples.], 2024-10-21 14:57:58,486 INFO [train.py:561] (1/4) Epoch 2489, batch 12, global_batch_idx: 39820, batch size: 152, loss[dur_loss=0.1983, prior_loss=0.9746, diff_loss=0.3403, tot_loss=1.513, over 152.00 samples.], tot_loss[dur_loss=0.2002, prior_loss=0.9743, diff_loss=0.3603, tot_loss=1.535, over 1966.00 samples.], 2024-10-21 14:58:02,902 INFO [train.py:682] (1/4) Start epoch 2490 2024-10-21 14:58:19,851 INFO [train.py:561] (1/4) Epoch 2490, batch 6, global_batch_idx: 39830, batch size: 106, loss[dur_loss=0.2003, prior_loss=0.9745, diff_loss=0.2977, tot_loss=1.472, over 106.00 samples.], tot_loss[dur_loss=0.1966, prior_loss=0.9738, diff_loss=0.3976, tot_loss=1.568, over 1142.00 samples.], 2024-10-21 14:58:32,884 INFO [train.py:682] (1/4) Start epoch 2491 2024-10-21 14:58:41,513 INFO [train.py:561] (1/4) Epoch 2491, batch 0, global_batch_idx: 39840, batch size: 108, loss[dur_loss=0.1993, prior_loss=0.9748, diff_loss=0.3385, tot_loss=1.513, over 108.00 samples.], tot_loss[dur_loss=0.1993, prior_loss=0.9748, diff_loss=0.3385, tot_loss=1.513, over 108.00 samples.], 2024-10-21 14:58:55,613 INFO [train.py:561] (1/4) Epoch 2491, batch 10, global_batch_idx: 39850, batch size: 111, loss[dur_loss=0.2033, prior_loss=0.9755, diff_loss=0.2785, tot_loss=1.457, over 111.00 samples.], tot_loss[dur_loss=0.1979, prior_loss=0.9741, diff_loss=0.3712, tot_loss=1.543, over 1656.00 samples.], 2024-10-21 14:59:02,676 INFO [train.py:682] (1/4) Start epoch 2492 2024-10-21 14:59:16,108 INFO [train.py:561] (1/4) Epoch 2492, batch 4, global_batch_idx: 39860, batch size: 189, loss[dur_loss=0.2023, prior_loss=0.9747, diff_loss=0.3165, tot_loss=1.494, over 189.00 samples.], tot_loss[dur_loss=0.1977, prior_loss=0.9737, diff_loss=0.4092, tot_loss=1.581, over 937.00 samples.], 2024-10-21 14:59:30,911 INFO [train.py:561] (1/4) Epoch 2492, batch 14, global_batch_idx: 39870, batch size: 142, loss[dur_loss=0.2023, prior_loss=0.9744, diff_loss=0.3225, tot_loss=1.499, over 142.00 samples.], tot_loss[dur_loss=0.2001, prior_loss=0.9743, diff_loss=0.3516, tot_loss=1.526, over 2210.00 samples.], 2024-10-21 14:59:32,328 INFO [train.py:682] (1/4) Start epoch 2493 2024-10-21 14:59:52,133 INFO [train.py:561] (1/4) Epoch 2493, batch 8, global_batch_idx: 39880, batch size: 170, loss[dur_loss=0.2014, prior_loss=0.9744, diff_loss=0.3101, tot_loss=1.486, over 170.00 samples.], tot_loss[dur_loss=0.198, prior_loss=0.974, diff_loss=0.3847, tot_loss=1.557, over 1432.00 samples.], 2024-10-21 15:00:02,229 INFO [train.py:682] (1/4) Start epoch 2494 2024-10-21 15:00:13,425 INFO [train.py:561] (1/4) Epoch 2494, batch 2, global_batch_idx: 39890, batch size: 203, loss[dur_loss=0.2034, 
prior_loss=0.9746, diff_loss=0.3123, tot_loss=1.49, over 203.00 samples.], tot_loss[dur_loss=0.2036, prior_loss=0.9747, diff_loss=0.3163, tot_loss=1.495, over 442.00 samples.], 2024-10-21 15:00:27,546 INFO [train.py:561] (1/4) Epoch 2494, batch 12, global_batch_idx: 39900, batch size: 152, loss[dur_loss=0.203, prior_loss=0.9747, diff_loss=0.2827, tot_loss=1.46, over 152.00 samples.], tot_loss[dur_loss=0.2007, prior_loss=0.9743, diff_loss=0.3677, tot_loss=1.543, over 1966.00 samples.], 2024-10-21 15:00:31,994 INFO [train.py:682] (1/4) Start epoch 2495 2024-10-21 15:00:48,632 INFO [train.py:561] (1/4) Epoch 2495, batch 6, global_batch_idx: 39910, batch size: 106, loss[dur_loss=0.1995, prior_loss=0.9745, diff_loss=0.2656, tot_loss=1.44, over 106.00 samples.], tot_loss[dur_loss=0.1984, prior_loss=0.9739, diff_loss=0.3866, tot_loss=1.559, over 1142.00 samples.], 2024-10-21 15:01:01,794 INFO [train.py:682] (1/4) Start epoch 2496 2024-10-21 15:01:10,760 INFO [train.py:561] (1/4) Epoch 2496, batch 0, global_batch_idx: 39920, batch size: 108, loss[dur_loss=0.2051, prior_loss=0.9753, diff_loss=0.3394, tot_loss=1.52, over 108.00 samples.], tot_loss[dur_loss=0.2051, prior_loss=0.9753, diff_loss=0.3394, tot_loss=1.52, over 108.00 samples.], 2024-10-21 15:01:24,865 INFO [train.py:561] (1/4) Epoch 2496, batch 10, global_batch_idx: 39930, batch size: 111, loss[dur_loss=0.2034, prior_loss=0.9755, diff_loss=0.3069, tot_loss=1.486, over 111.00 samples.], tot_loss[dur_loss=0.1995, prior_loss=0.9743, diff_loss=0.3725, tot_loss=1.546, over 1656.00 samples.], 2024-10-21 15:01:32,027 INFO [train.py:682] (1/4) Start epoch 2497 2024-10-21 15:01:45,869 INFO [train.py:561] (1/4) Epoch 2497, batch 4, global_batch_idx: 39940, batch size: 189, loss[dur_loss=0.1992, prior_loss=0.9745, diff_loss=0.3025, tot_loss=1.476, over 189.00 samples.], tot_loss[dur_loss=0.1949, prior_loss=0.9737, diff_loss=0.4181, tot_loss=1.587, over 937.00 samples.], 2024-10-21 15:02:00,672 INFO [train.py:561] (1/4) Epoch 2497, batch 14, global_batch_idx: 39950, batch size: 142, loss[dur_loss=0.2052, prior_loss=0.9744, diff_loss=0.2899, tot_loss=1.47, over 142.00 samples.], tot_loss[dur_loss=0.1994, prior_loss=0.9743, diff_loss=0.3541, tot_loss=1.528, over 2210.00 samples.], 2024-10-21 15:02:02,095 INFO [train.py:682] (1/4) Start epoch 2498 2024-10-21 15:02:21,993 INFO [train.py:561] (1/4) Epoch 2498, batch 8, global_batch_idx: 39960, batch size: 170, loss[dur_loss=0.2025, prior_loss=0.9747, diff_loss=0.3151, tot_loss=1.492, over 170.00 samples.], tot_loss[dur_loss=0.1978, prior_loss=0.9741, diff_loss=0.3782, tot_loss=1.55, over 1432.00 samples.], 2024-10-21 15:02:32,098 INFO [train.py:682] (1/4) Start epoch 2499 2024-10-21 15:02:43,765 INFO [train.py:561] (1/4) Epoch 2499, batch 2, global_batch_idx: 39970, batch size: 203, loss[dur_loss=0.2024, prior_loss=0.9747, diff_loss=0.3349, tot_loss=1.512, over 203.00 samples.], tot_loss[dur_loss=0.2015, prior_loss=0.9747, diff_loss=0.3057, tot_loss=1.482, over 442.00 samples.], 2024-10-21 15:02:58,256 INFO [train.py:561] (1/4) Epoch 2499, batch 12, global_batch_idx: 39980, batch size: 152, loss[dur_loss=0.2019, prior_loss=0.9746, diff_loss=0.3123, tot_loss=1.489, over 152.00 samples.], tot_loss[dur_loss=0.1988, prior_loss=0.9742, diff_loss=0.3541, tot_loss=1.527, over 1966.00 samples.], 2024-10-21 15:03:02,704 INFO [train.py:682] (1/4) Start epoch 2500 2024-10-21 15:03:19,780 INFO [train.py:561] (1/4) Epoch 2500, batch 6, global_batch_idx: 39990, batch size: 106, loss[dur_loss=0.2001, prior_loss=0.9746, 
diff_loss=0.2948, tot_loss=1.47, over 106.00 samples.], tot_loss[dur_loss=0.1969, prior_loss=0.9739, diff_loss=0.4029, tot_loss=1.574, over 1142.00 samples.], 2024-10-21 15:03:32,816 INFO [train.py:682] (1/4) Start epoch 2501 2024-10-21 15:03:41,635 INFO [train.py:561] (1/4) Epoch 2501, batch 0, global_batch_idx: 40000, batch size: 108, loss[dur_loss=0.2042, prior_loss=0.9751, diff_loss=0.344, tot_loss=1.523, over 108.00 samples.], tot_loss[dur_loss=0.2042, prior_loss=0.9751, diff_loss=0.344, tot_loss=1.523, over 108.00 samples.], 2024-10-21 15:03:55,913 INFO [train.py:561] (1/4) Epoch 2501, batch 10, global_batch_idx: 40010, batch size: 111, loss[dur_loss=0.2029, prior_loss=0.9753, diff_loss=0.2759, tot_loss=1.454, over 111.00 samples.], tot_loss[dur_loss=0.1982, prior_loss=0.9741, diff_loss=0.3805, tot_loss=1.553, over 1656.00 samples.], 2024-10-21 15:04:03,022 INFO [train.py:682] (1/4) Start epoch 2502 2024-10-21 15:04:16,716 INFO [train.py:561] (1/4) Epoch 2502, batch 4, global_batch_idx: 40020, batch size: 189, loss[dur_loss=0.1995, prior_loss=0.9747, diff_loss=0.3336, tot_loss=1.508, over 189.00 samples.], tot_loss[dur_loss=0.1952, prior_loss=0.9736, diff_loss=0.4389, tot_loss=1.608, over 937.00 samples.], 2024-10-21 15:04:31,438 INFO [train.py:561] (1/4) Epoch 2502, batch 14, global_batch_idx: 40030, batch size: 142, loss[dur_loss=0.1998, prior_loss=0.974, diff_loss=0.2622, tot_loss=1.436, over 142.00 samples.], tot_loss[dur_loss=0.1984, prior_loss=0.9741, diff_loss=0.36, tot_loss=1.533, over 2210.00 samples.], 2024-10-21 15:04:32,863 INFO [train.py:682] (1/4) Start epoch 2503 2024-10-21 15:04:52,621 INFO [train.py:561] (1/4) Epoch 2503, batch 8, global_batch_idx: 40040, batch size: 170, loss[dur_loss=0.2027, prior_loss=0.9749, diff_loss=0.3292, tot_loss=1.507, over 170.00 samples.], tot_loss[dur_loss=0.198, prior_loss=0.9741, diff_loss=0.391, tot_loss=1.563, over 1432.00 samples.], 2024-10-21 15:05:02,693 INFO [train.py:682] (1/4) Start epoch 2504 2024-10-21 15:05:14,039 INFO [train.py:561] (1/4) Epoch 2504, batch 2, global_batch_idx: 40050, batch size: 203, loss[dur_loss=0.2001, prior_loss=0.9745, diff_loss=0.3336, tot_loss=1.508, over 203.00 samples.], tot_loss[dur_loss=0.2012, prior_loss=0.9746, diff_loss=0.3228, tot_loss=1.499, over 442.00 samples.], 2024-10-21 15:05:28,133 INFO [train.py:561] (1/4) Epoch 2504, batch 12, global_batch_idx: 40060, batch size: 152, loss[dur_loss=0.1994, prior_loss=0.9746, diff_loss=0.3041, tot_loss=1.478, over 152.00 samples.], tot_loss[dur_loss=0.1987, prior_loss=0.9742, diff_loss=0.362, tot_loss=1.535, over 1966.00 samples.], 2024-10-21 15:05:32,577 INFO [train.py:682] (1/4) Start epoch 2505 2024-10-21 15:05:49,311 INFO [train.py:561] (1/4) Epoch 2505, batch 6, global_batch_idx: 40070, batch size: 106, loss[dur_loss=0.1995, prior_loss=0.9745, diff_loss=0.2924, tot_loss=1.466, over 106.00 samples.], tot_loss[dur_loss=0.1973, prior_loss=0.9739, diff_loss=0.3955, tot_loss=1.567, over 1142.00 samples.], 2024-10-21 15:06:02,210 INFO [train.py:682] (1/4) Start epoch 2506 2024-10-21 15:06:10,620 INFO [train.py:561] (1/4) Epoch 2506, batch 0, global_batch_idx: 40080, batch size: 108, loss[dur_loss=0.2041, prior_loss=0.9755, diff_loss=0.3318, tot_loss=1.511, over 108.00 samples.], tot_loss[dur_loss=0.2041, prior_loss=0.9755, diff_loss=0.3318, tot_loss=1.511, over 108.00 samples.], 2024-10-21 15:06:24,659 INFO [train.py:561] (1/4) Epoch 2506, batch 10, global_batch_idx: 40090, batch size: 111, loss[dur_loss=0.2028, prior_loss=0.9753, diff_loss=0.3135, 
tot_loss=1.492, over 111.00 samples.], tot_loss[dur_loss=0.1983, prior_loss=0.9743, diff_loss=0.3721, tot_loss=1.545, over 1656.00 samples.], 2024-10-21 15:06:31,743 INFO [train.py:682] (1/4) Start epoch 2507 2024-10-21 15:06:45,649 INFO [train.py:561] (1/4) Epoch 2507, batch 4, global_batch_idx: 40100, batch size: 189, loss[dur_loss=0.1977, prior_loss=0.9749, diff_loss=0.3499, tot_loss=1.522, over 189.00 samples.], tot_loss[dur_loss=0.1959, prior_loss=0.9739, diff_loss=0.4069, tot_loss=1.577, over 937.00 samples.], 2024-10-21 15:07:00,386 INFO [train.py:561] (1/4) Epoch 2507, batch 14, global_batch_idx: 40110, batch size: 142, loss[dur_loss=0.2011, prior_loss=0.9741, diff_loss=0.2779, tot_loss=1.453, over 142.00 samples.], tot_loss[dur_loss=0.1992, prior_loss=0.9744, diff_loss=0.3481, tot_loss=1.522, over 2210.00 samples.], 2024-10-21 15:07:01,803 INFO [train.py:682] (1/4) Start epoch 2508 2024-10-21 15:07:21,351 INFO [train.py:561] (1/4) Epoch 2508, batch 8, global_batch_idx: 40120, batch size: 170, loss[dur_loss=0.2059, prior_loss=0.9749, diff_loss=0.2876, tot_loss=1.468, over 170.00 samples.], tot_loss[dur_loss=0.1988, prior_loss=0.9741, diff_loss=0.3799, tot_loss=1.553, over 1432.00 samples.], 2024-10-21 15:07:31,436 INFO [train.py:682] (1/4) Start epoch 2509 2024-10-21 15:07:42,502 INFO [train.py:561] (1/4) Epoch 2509, batch 2, global_batch_idx: 40130, batch size: 203, loss[dur_loss=0.2016, prior_loss=0.9744, diff_loss=0.31, tot_loss=1.486, over 203.00 samples.], tot_loss[dur_loss=0.2016, prior_loss=0.9745, diff_loss=0.3069, tot_loss=1.483, over 442.00 samples.], 2024-10-21 15:07:56,588 INFO [train.py:561] (1/4) Epoch 2509, batch 12, global_batch_idx: 40140, batch size: 152, loss[dur_loss=0.2007, prior_loss=0.9746, diff_loss=0.3238, tot_loss=1.499, over 152.00 samples.], tot_loss[dur_loss=0.1992, prior_loss=0.9742, diff_loss=0.3603, tot_loss=1.534, over 1966.00 samples.], 2024-10-21 15:08:01,015 INFO [train.py:682] (1/4) Start epoch 2510 2024-10-21 15:08:18,080 INFO [train.py:561] (1/4) Epoch 2510, batch 6, global_batch_idx: 40150, batch size: 106, loss[dur_loss=0.1995, prior_loss=0.9747, diff_loss=0.3132, tot_loss=1.487, over 106.00 samples.], tot_loss[dur_loss=0.1969, prior_loss=0.9739, diff_loss=0.403, tot_loss=1.574, over 1142.00 samples.], 2024-10-21 15:08:31,061 INFO [train.py:682] (1/4) Start epoch 2511 2024-10-21 15:08:39,681 INFO [train.py:561] (1/4) Epoch 2511, batch 0, global_batch_idx: 40160, batch size: 108, loss[dur_loss=0.2036, prior_loss=0.9752, diff_loss=0.312, tot_loss=1.491, over 108.00 samples.], tot_loss[dur_loss=0.2036, prior_loss=0.9752, diff_loss=0.312, tot_loss=1.491, over 108.00 samples.], 2024-10-21 15:08:53,827 INFO [train.py:561] (1/4) Epoch 2511, batch 10, global_batch_idx: 40170, batch size: 111, loss[dur_loss=0.2012, prior_loss=0.9754, diff_loss=0.2665, tot_loss=1.443, over 111.00 samples.], tot_loss[dur_loss=0.1992, prior_loss=0.9743, diff_loss=0.3698, tot_loss=1.543, over 1656.00 samples.], 2024-10-21 15:09:00,886 INFO [train.py:682] (1/4) Start epoch 2512 2024-10-21 15:09:14,697 INFO [train.py:561] (1/4) Epoch 2512, batch 4, global_batch_idx: 40180, batch size: 189, loss[dur_loss=0.2017, prior_loss=0.9748, diff_loss=0.3141, tot_loss=1.491, over 189.00 samples.], tot_loss[dur_loss=0.196, prior_loss=0.9738, diff_loss=0.4189, tot_loss=1.589, over 937.00 samples.], 2024-10-21 15:09:29,415 INFO [train.py:561] (1/4) Epoch 2512, batch 14, global_batch_idx: 40190, batch size: 142, loss[dur_loss=0.2031, prior_loss=0.9741, diff_loss=0.3173, tot_loss=1.495, 
over 142.00 samples.], tot_loss[dur_loss=0.2001, prior_loss=0.9743, diff_loss=0.3599, tot_loss=1.534, over 2210.00 samples.], 2024-10-21 15:09:30,842 INFO [train.py:682] (1/4) Start epoch 2513 2024-10-21 15:09:50,506 INFO [train.py:561] (1/4) Epoch 2513, batch 8, global_batch_idx: 40200, batch size: 170, loss[dur_loss=0.2021, prior_loss=0.9747, diff_loss=0.3239, tot_loss=1.501, over 170.00 samples.], tot_loss[dur_loss=0.1987, prior_loss=0.9741, diff_loss=0.3808, tot_loss=1.554, over 1432.00 samples.], 2024-10-21 15:10:00,587 INFO [train.py:682] (1/4) Start epoch 2514 2024-10-21 15:10:11,654 INFO [train.py:561] (1/4) Epoch 2514, batch 2, global_batch_idx: 40210, batch size: 203, loss[dur_loss=0.1967, prior_loss=0.9745, diff_loss=0.324, tot_loss=1.495, over 203.00 samples.], tot_loss[dur_loss=0.1976, prior_loss=0.9745, diff_loss=0.3165, tot_loss=1.489, over 442.00 samples.], 2024-10-21 15:10:25,778 INFO [train.py:561] (1/4) Epoch 2514, batch 12, global_batch_idx: 40220, batch size: 152, loss[dur_loss=0.1999, prior_loss=0.9745, diff_loss=0.3271, tot_loss=1.501, over 152.00 samples.], tot_loss[dur_loss=0.1985, prior_loss=0.9742, diff_loss=0.3727, tot_loss=1.546, over 1966.00 samples.], 2024-10-21 15:10:30,213 INFO [train.py:682] (1/4) Start epoch 2515 2024-10-21 15:10:47,238 INFO [train.py:561] (1/4) Epoch 2515, batch 6, global_batch_idx: 40230, batch size: 106, loss[dur_loss=0.2005, prior_loss=0.9747, diff_loss=0.2781, tot_loss=1.453, over 106.00 samples.], tot_loss[dur_loss=0.1989, prior_loss=0.9739, diff_loss=0.3912, tot_loss=1.564, over 1142.00 samples.], 2024-10-21 15:11:00,154 INFO [train.py:682] (1/4) Start epoch 2516 2024-10-21 15:11:08,853 INFO [train.py:561] (1/4) Epoch 2516, batch 0, global_batch_idx: 40240, batch size: 108, loss[dur_loss=0.2036, prior_loss=0.9752, diff_loss=0.2999, tot_loss=1.479, over 108.00 samples.], tot_loss[dur_loss=0.2036, prior_loss=0.9752, diff_loss=0.2999, tot_loss=1.479, over 108.00 samples.], 2024-10-21 15:11:22,982 INFO [train.py:561] (1/4) Epoch 2516, batch 10, global_batch_idx: 40250, batch size: 111, loss[dur_loss=0.2015, prior_loss=0.9754, diff_loss=0.3142, tot_loss=1.491, over 111.00 samples.], tot_loss[dur_loss=0.1985, prior_loss=0.9742, diff_loss=0.3667, tot_loss=1.539, over 1656.00 samples.], 2024-10-21 15:11:30,187 INFO [train.py:682] (1/4) Start epoch 2517 2024-10-21 15:11:43,967 INFO [train.py:561] (1/4) Epoch 2517, batch 4, global_batch_idx: 40260, batch size: 189, loss[dur_loss=0.2014, prior_loss=0.9747, diff_loss=0.3074, tot_loss=1.483, over 189.00 samples.], tot_loss[dur_loss=0.1964, prior_loss=0.9737, diff_loss=0.4272, tot_loss=1.597, over 937.00 samples.], 2024-10-21 15:11:58,757 INFO [train.py:561] (1/4) Epoch 2517, batch 14, global_batch_idx: 40270, batch size: 142, loss[dur_loss=0.2035, prior_loss=0.9743, diff_loss=0.3108, tot_loss=1.489, over 142.00 samples.], tot_loss[dur_loss=0.2, prior_loss=0.9744, diff_loss=0.357, tot_loss=1.531, over 2210.00 samples.], 2024-10-21 15:12:00,178 INFO [train.py:682] (1/4) Start epoch 2518 2024-10-21 15:12:20,206 INFO [train.py:561] (1/4) Epoch 2518, batch 8, global_batch_idx: 40280, batch size: 170, loss[dur_loss=0.2049, prior_loss=0.9752, diff_loss=0.3152, tot_loss=1.495, over 170.00 samples.], tot_loss[dur_loss=0.1987, prior_loss=0.9742, diff_loss=0.3829, tot_loss=1.556, over 1432.00 samples.], 2024-10-21 15:12:30,297 INFO [train.py:682] (1/4) Start epoch 2519 2024-10-21 15:12:41,610 INFO [train.py:561] (1/4) Epoch 2519, batch 2, global_batch_idx: 40290, batch size: 203, loss[dur_loss=0.2009, 
prior_loss=0.9744, diff_loss=0.3529, tot_loss=1.528, over 203.00 samples.], tot_loss[dur_loss=0.2014, prior_loss=0.9745, diff_loss=0.3195, tot_loss=1.495, over 442.00 samples.], 2024-10-21 15:12:55,706 INFO [train.py:561] (1/4) Epoch 2519, batch 12, global_batch_idx: 40300, batch size: 152, loss[dur_loss=0.2015, prior_loss=0.9747, diff_loss=0.3248, tot_loss=1.501, over 152.00 samples.], tot_loss[dur_loss=0.2003, prior_loss=0.9743, diff_loss=0.3641, tot_loss=1.539, over 1966.00 samples.], 2024-10-21 15:13:00,118 INFO [train.py:682] (1/4) Start epoch 2520 2024-10-21 15:13:16,850 INFO [train.py:561] (1/4) Epoch 2520, batch 6, global_batch_idx: 40310, batch size: 106, loss[dur_loss=0.1989, prior_loss=0.9747, diff_loss=0.305, tot_loss=1.479, over 106.00 samples.], tot_loss[dur_loss=0.198, prior_loss=0.9739, diff_loss=0.3982, tot_loss=1.57, over 1142.00 samples.], 2024-10-21 15:13:29,829 INFO [train.py:682] (1/4) Start epoch 2521 2024-10-21 15:13:38,766 INFO [train.py:561] (1/4) Epoch 2521, batch 0, global_batch_idx: 40320, batch size: 108, loss[dur_loss=0.2018, prior_loss=0.975, diff_loss=0.2975, tot_loss=1.474, over 108.00 samples.], tot_loss[dur_loss=0.2018, prior_loss=0.975, diff_loss=0.2975, tot_loss=1.474, over 108.00 samples.], 2024-10-21 15:13:52,905 INFO [train.py:561] (1/4) Epoch 2521, batch 10, global_batch_idx: 40330, batch size: 111, loss[dur_loss=0.2014, prior_loss=0.9756, diff_loss=0.2949, tot_loss=1.472, over 111.00 samples.], tot_loss[dur_loss=0.1977, prior_loss=0.9741, diff_loss=0.3609, tot_loss=1.533, over 1656.00 samples.], 2024-10-21 15:13:59,931 INFO [train.py:682] (1/4) Start epoch 2522 2024-10-21 15:14:13,773 INFO [train.py:561] (1/4) Epoch 2522, batch 4, global_batch_idx: 40340, batch size: 189, loss[dur_loss=0.199, prior_loss=0.9746, diff_loss=0.3316, tot_loss=1.505, over 189.00 samples.], tot_loss[dur_loss=0.1972, prior_loss=0.9737, diff_loss=0.4213, tot_loss=1.592, over 937.00 samples.], 2024-10-21 15:14:28,584 INFO [train.py:561] (1/4) Epoch 2522, batch 14, global_batch_idx: 40350, batch size: 142, loss[dur_loss=0.202, prior_loss=0.9742, diff_loss=0.2826, tot_loss=1.459, over 142.00 samples.], tot_loss[dur_loss=0.1999, prior_loss=0.9743, diff_loss=0.3556, tot_loss=1.53, over 2210.00 samples.], 2024-10-21 15:14:30,020 INFO [train.py:682] (1/4) Start epoch 2523 2024-10-21 15:14:49,930 INFO [train.py:561] (1/4) Epoch 2523, batch 8, global_batch_idx: 40360, batch size: 170, loss[dur_loss=0.2034, prior_loss=0.9747, diff_loss=0.3289, tot_loss=1.507, over 170.00 samples.], tot_loss[dur_loss=0.1979, prior_loss=0.974, diff_loss=0.3849, tot_loss=1.557, over 1432.00 samples.], 2024-10-21 15:14:59,960 INFO [train.py:682] (1/4) Start epoch 2524 2024-10-21 15:15:11,144 INFO [train.py:561] (1/4) Epoch 2524, batch 2, global_batch_idx: 40370, batch size: 203, loss[dur_loss=0.1984, prior_loss=0.9744, diff_loss=0.3537, tot_loss=1.526, over 203.00 samples.], tot_loss[dur_loss=0.2001, prior_loss=0.9744, diff_loss=0.3136, tot_loss=1.488, over 442.00 samples.], 2024-10-21 15:15:25,277 INFO [train.py:561] (1/4) Epoch 2524, batch 12, global_batch_idx: 40380, batch size: 152, loss[dur_loss=0.2018, prior_loss=0.9744, diff_loss=0.2656, tot_loss=1.442, over 152.00 samples.], tot_loss[dur_loss=0.1989, prior_loss=0.9741, diff_loss=0.3586, tot_loss=1.532, over 1966.00 samples.], 2024-10-21 15:15:29,702 INFO [train.py:682] (1/4) Start epoch 2525 2024-10-21 15:15:46,484 INFO [train.py:561] (1/4) Epoch 2525, batch 6, global_batch_idx: 40390, batch size: 106, loss[dur_loss=0.1995, prior_loss=0.9744, 
diff_loss=0.2983, tot_loss=1.472, over 106.00 samples.], tot_loss[dur_loss=0.1978, prior_loss=0.9738, diff_loss=0.407, tot_loss=1.579, over 1142.00 samples.], 2024-10-21 15:15:59,613 INFO [train.py:682] (1/4) Start epoch 2526 2024-10-21 15:16:08,641 INFO [train.py:561] (1/4) Epoch 2526, batch 0, global_batch_idx: 40400, batch size: 108, loss[dur_loss=0.2007, prior_loss=0.9748, diff_loss=0.2937, tot_loss=1.469, over 108.00 samples.], tot_loss[dur_loss=0.2007, prior_loss=0.9748, diff_loss=0.2937, tot_loss=1.469, over 108.00 samples.], 2024-10-21 15:16:22,783 INFO [train.py:561] (1/4) Epoch 2526, batch 10, global_batch_idx: 40410, batch size: 111, loss[dur_loss=0.2038, prior_loss=0.9754, diff_loss=0.3065, tot_loss=1.486, over 111.00 samples.], tot_loss[dur_loss=0.1987, prior_loss=0.9742, diff_loss=0.3665, tot_loss=1.539, over 1656.00 samples.], 2024-10-21 15:16:29,819 INFO [train.py:682] (1/4) Start epoch 2527 2024-10-21 15:16:43,534 INFO [train.py:561] (1/4) Epoch 2527, batch 4, global_batch_idx: 40420, batch size: 189, loss[dur_loss=0.2031, prior_loss=0.9746, diff_loss=0.3334, tot_loss=1.511, over 189.00 samples.], tot_loss[dur_loss=0.1967, prior_loss=0.9737, diff_loss=0.4116, tot_loss=1.582, over 937.00 samples.], 2024-10-21 15:16:58,172 INFO [train.py:561] (1/4) Epoch 2527, batch 14, global_batch_idx: 40430, batch size: 142, loss[dur_loss=0.2026, prior_loss=0.9743, diff_loss=0.3151, tot_loss=1.492, over 142.00 samples.], tot_loss[dur_loss=0.2, prior_loss=0.9743, diff_loss=0.3494, tot_loss=1.524, over 2210.00 samples.], 2024-10-21 15:16:59,573 INFO [train.py:682] (1/4) Start epoch 2528 2024-10-21 15:17:19,370 INFO [train.py:561] (1/4) Epoch 2528, batch 8, global_batch_idx: 40440, batch size: 170, loss[dur_loss=0.2032, prior_loss=0.9748, diff_loss=0.3462, tot_loss=1.524, over 170.00 samples.], tot_loss[dur_loss=0.199, prior_loss=0.9742, diff_loss=0.3969, tot_loss=1.57, over 1432.00 samples.], 2024-10-21 15:17:29,480 INFO [train.py:682] (1/4) Start epoch 2529 2024-10-21 15:17:40,974 INFO [train.py:561] (1/4) Epoch 2529, batch 2, global_batch_idx: 40450, batch size: 203, loss[dur_loss=0.1971, prior_loss=0.9741, diff_loss=0.3471, tot_loss=1.518, over 203.00 samples.], tot_loss[dur_loss=0.1998, prior_loss=0.9744, diff_loss=0.3211, tot_loss=1.495, over 442.00 samples.], 2024-10-21 15:17:54,981 INFO [train.py:561] (1/4) Epoch 2529, batch 12, global_batch_idx: 40460, batch size: 152, loss[dur_loss=0.2005, prior_loss=0.9745, diff_loss=0.2869, tot_loss=1.462, over 152.00 samples.], tot_loss[dur_loss=0.1986, prior_loss=0.9741, diff_loss=0.3634, tot_loss=1.536, over 1966.00 samples.], 2024-10-21 15:17:59,377 INFO [train.py:682] (1/4) Start epoch 2530 2024-10-21 15:18:16,498 INFO [train.py:561] (1/4) Epoch 2530, batch 6, global_batch_idx: 40470, batch size: 106, loss[dur_loss=0.2004, prior_loss=0.9746, diff_loss=0.2871, tot_loss=1.462, over 106.00 samples.], tot_loss[dur_loss=0.1964, prior_loss=0.9738, diff_loss=0.3859, tot_loss=1.556, over 1142.00 samples.], 2024-10-21 15:18:29,383 INFO [train.py:682] (1/4) Start epoch 2531 2024-10-21 15:18:37,945 INFO [train.py:561] (1/4) Epoch 2531, batch 0, global_batch_idx: 40480, batch size: 108, loss[dur_loss=0.2028, prior_loss=0.9749, diff_loss=0.34, tot_loss=1.518, over 108.00 samples.], tot_loss[dur_loss=0.2028, prior_loss=0.9749, diff_loss=0.34, tot_loss=1.518, over 108.00 samples.], 2024-10-21 15:18:52,038 INFO [train.py:561] (1/4) Epoch 2531, batch 10, global_batch_idx: 40490, batch size: 111, loss[dur_loss=0.2001, prior_loss=0.9753, diff_loss=0.2518, 
tot_loss=1.427, over 111.00 samples.], tot_loss[dur_loss=0.1983, prior_loss=0.9741, diff_loss=0.3721, tot_loss=1.545, over 1656.00 samples.], 2024-10-21 15:18:59,063 INFO [train.py:682] (1/4) Start epoch 2532 2024-10-21 15:19:12,716 INFO [train.py:561] (1/4) Epoch 2532, batch 4, global_batch_idx: 40500, batch size: 189, loss[dur_loss=0.2008, prior_loss=0.9748, diff_loss=0.3433, tot_loss=1.519, over 189.00 samples.], tot_loss[dur_loss=0.1963, prior_loss=0.9736, diff_loss=0.425, tot_loss=1.595, over 937.00 samples.], 2024-10-21 15:19:14,324 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 15:19:47,999 INFO [train.py:589] (1/4) Epoch 2532, validation: dur_loss=0.4531, prior_loss=1.036, diff_loss=0.3716, tot_loss=1.861, over 100.00 samples. 2024-10-21 15:19:48,000 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 15:20:01,103 INFO [train.py:561] (1/4) Epoch 2532, batch 14, global_batch_idx: 40510, batch size: 142, loss[dur_loss=0.2005, prior_loss=0.9739, diff_loss=0.3059, tot_loss=1.48, over 142.00 samples.], tot_loss[dur_loss=0.1991, prior_loss=0.9742, diff_loss=0.3598, tot_loss=1.533, over 2210.00 samples.], 2024-10-21 15:20:02,520 INFO [train.py:682] (1/4) Start epoch 2533 2024-10-21 15:20:22,288 INFO [train.py:561] (1/4) Epoch 2533, batch 8, global_batch_idx: 40520, batch size: 170, loss[dur_loss=0.2029, prior_loss=0.9748, diff_loss=0.2947, tot_loss=1.472, over 170.00 samples.], tot_loss[dur_loss=0.1976, prior_loss=0.9741, diff_loss=0.3709, tot_loss=1.543, over 1432.00 samples.], 2024-10-21 15:20:32,311 INFO [train.py:682] (1/4) Start epoch 2534 2024-10-21 15:20:44,048 INFO [train.py:561] (1/4) Epoch 2534, batch 2, global_batch_idx: 40530, batch size: 203, loss[dur_loss=0.1964, prior_loss=0.9743, diff_loss=0.3477, tot_loss=1.518, over 203.00 samples.], tot_loss[dur_loss=0.199, prior_loss=0.9744, diff_loss=0.3113, tot_loss=1.485, over 442.00 samples.], 2024-10-21 15:20:58,385 INFO [train.py:561] (1/4) Epoch 2534, batch 12, global_batch_idx: 40540, batch size: 152, loss[dur_loss=0.1999, prior_loss=0.9747, diff_loss=0.2951, tot_loss=1.47, over 152.00 samples.], tot_loss[dur_loss=0.1989, prior_loss=0.9742, diff_loss=0.3567, tot_loss=1.53, over 1966.00 samples.], 2024-10-21 15:21:02,805 INFO [train.py:682] (1/4) Start epoch 2535 2024-10-21 15:21:19,819 INFO [train.py:561] (1/4) Epoch 2535, batch 6, global_batch_idx: 40550, batch size: 106, loss[dur_loss=0.2014, prior_loss=0.9744, diff_loss=0.2973, tot_loss=1.473, over 106.00 samples.], tot_loss[dur_loss=0.1962, prior_loss=0.9737, diff_loss=0.3957, tot_loss=1.566, over 1142.00 samples.], 2024-10-21 15:21:32,600 INFO [train.py:682] (1/4) Start epoch 2536 2024-10-21 15:21:42,002 INFO [train.py:561] (1/4) Epoch 2536, batch 0, global_batch_idx: 40560, batch size: 108, loss[dur_loss=0.2, prior_loss=0.9748, diff_loss=0.2963, tot_loss=1.471, over 108.00 samples.], tot_loss[dur_loss=0.2, prior_loss=0.9748, diff_loss=0.2963, tot_loss=1.471, over 108.00 samples.], 2024-10-21 15:21:56,170 INFO [train.py:561] (1/4) Epoch 2536, batch 10, global_batch_idx: 40570, batch size: 111, loss[dur_loss=0.2003, prior_loss=0.9756, diff_loss=0.3038, tot_loss=1.48, over 111.00 samples.], tot_loss[dur_loss=0.1978, prior_loss=0.974, diff_loss=0.377, tot_loss=1.549, over 1656.00 samples.], 2024-10-21 15:22:03,244 INFO [train.py:682] (1/4) Start epoch 2537 2024-10-21 15:22:17,523 INFO [train.py:561] (1/4) Epoch 2537, batch 4, global_batch_idx: 40580, batch size: 189, loss[dur_loss=0.1993, prior_loss=0.9743, diff_loss=0.3274, 
tot_loss=1.501, over 189.00 samples.], tot_loss[dur_loss=0.1949, prior_loss=0.9734, diff_loss=0.4269, tot_loss=1.595, over 937.00 samples.], 2024-10-21 15:22:32,421 INFO [train.py:561] (1/4) Epoch 2537, batch 14, global_batch_idx: 40590, batch size: 142, loss[dur_loss=0.2014, prior_loss=0.974, diff_loss=0.3138, tot_loss=1.489, over 142.00 samples.], tot_loss[dur_loss=0.1984, prior_loss=0.974, diff_loss=0.3572, tot_loss=1.53, over 2210.00 samples.], 2024-10-21 15:22:33,847 INFO [train.py:682] (1/4) Start epoch 2538 2024-10-21 15:22:54,343 INFO [train.py:561] (1/4) Epoch 2538, batch 8, global_batch_idx: 40600, batch size: 170, loss[dur_loss=0.202, prior_loss=0.9743, diff_loss=0.3248, tot_loss=1.501, over 170.00 samples.], tot_loss[dur_loss=0.197, prior_loss=0.9737, diff_loss=0.3773, tot_loss=1.548, over 1432.00 samples.], 2024-10-21 15:23:04,371 INFO [train.py:682] (1/4) Start epoch 2539 2024-10-21 15:23:15,797 INFO [train.py:561] (1/4) Epoch 2539, batch 2, global_batch_idx: 40610, batch size: 203, loss[dur_loss=0.1943, prior_loss=0.974, diff_loss=0.3169, tot_loss=1.485, over 203.00 samples.], tot_loss[dur_loss=0.1983, prior_loss=0.9742, diff_loss=0.3142, tot_loss=1.487, over 442.00 samples.], 2024-10-21 15:23:29,975 INFO [train.py:561] (1/4) Epoch 2539, batch 12, global_batch_idx: 40620, batch size: 152, loss[dur_loss=0.1995, prior_loss=0.9743, diff_loss=0.3017, tot_loss=1.475, over 152.00 samples.], tot_loss[dur_loss=0.1975, prior_loss=0.9739, diff_loss=0.3583, tot_loss=1.53, over 1966.00 samples.], 2024-10-21 15:23:34,399 INFO [train.py:682] (1/4) Start epoch 2540 2024-10-21 15:23:51,851 INFO [train.py:561] (1/4) Epoch 2540, batch 6, global_batch_idx: 40630, batch size: 106, loss[dur_loss=0.1973, prior_loss=0.9743, diff_loss=0.2887, tot_loss=1.46, over 106.00 samples.], tot_loss[dur_loss=0.1961, prior_loss=0.9736, diff_loss=0.3914, tot_loss=1.561, over 1142.00 samples.], 2024-10-21 15:24:04,770 INFO [train.py:682] (1/4) Start epoch 2541 2024-10-21 15:24:13,589 INFO [train.py:561] (1/4) Epoch 2541, batch 0, global_batch_idx: 40640, batch size: 108, loss[dur_loss=0.2024, prior_loss=0.9747, diff_loss=0.3254, tot_loss=1.503, over 108.00 samples.], tot_loss[dur_loss=0.2024, prior_loss=0.9747, diff_loss=0.3254, tot_loss=1.503, over 108.00 samples.], 2024-10-21 15:24:27,731 INFO [train.py:561] (1/4) Epoch 2541, batch 10, global_batch_idx: 40650, batch size: 111, loss[dur_loss=0.2016, prior_loss=0.9754, diff_loss=0.2863, tot_loss=1.463, over 111.00 samples.], tot_loss[dur_loss=0.197, prior_loss=0.974, diff_loss=0.366, tot_loss=1.537, over 1656.00 samples.], 2024-10-21 15:24:34,761 INFO [train.py:682] (1/4) Start epoch 2542 2024-10-21 15:24:48,704 INFO [train.py:561] (1/4) Epoch 2542, batch 4, global_batch_idx: 40660, batch size: 189, loss[dur_loss=0.1982, prior_loss=0.9743, diff_loss=0.3296, tot_loss=1.502, over 189.00 samples.], tot_loss[dur_loss=0.1947, prior_loss=0.9733, diff_loss=0.4274, tot_loss=1.595, over 937.00 samples.], 2024-10-21 15:25:03,430 INFO [train.py:561] (1/4) Epoch 2542, batch 14, global_batch_idx: 40670, batch size: 142, loss[dur_loss=0.2018, prior_loss=0.974, diff_loss=0.3339, tot_loss=1.51, over 142.00 samples.], tot_loss[dur_loss=0.1983, prior_loss=0.974, diff_loss=0.3585, tot_loss=1.531, over 2210.00 samples.], 2024-10-21 15:25:04,862 INFO [train.py:682] (1/4) Start epoch 2543 2024-10-21 15:25:24,937 INFO [train.py:561] (1/4) Epoch 2543, batch 8, global_batch_idx: 40680, batch size: 170, loss[dur_loss=0.2014, prior_loss=0.9743, diff_loss=0.3073, tot_loss=1.483, over 
170.00 samples.], tot_loss[dur_loss=0.1958, prior_loss=0.9737, diff_loss=0.3806, tot_loss=1.55, over 1432.00 samples.], 2024-10-21 15:25:34,980 INFO [train.py:682] (1/4) Start epoch 2544 2024-10-21 15:25:46,602 INFO [train.py:561] (1/4) Epoch 2544, batch 2, global_batch_idx: 40690, batch size: 203, loss[dur_loss=0.1989, prior_loss=0.9741, diff_loss=0.3647, tot_loss=1.538, over 203.00 samples.], tot_loss[dur_loss=0.1984, prior_loss=0.9742, diff_loss=0.3274, tot_loss=1.5, over 442.00 samples.], 2024-10-21 15:26:00,726 INFO [train.py:561] (1/4) Epoch 2544, batch 12, global_batch_idx: 40700, batch size: 152, loss[dur_loss=0.2008, prior_loss=0.9744, diff_loss=0.3057, tot_loss=1.481, over 152.00 samples.], tot_loss[dur_loss=0.1973, prior_loss=0.9739, diff_loss=0.3629, tot_loss=1.534, over 1966.00 samples.], 2024-10-21 15:26:05,133 INFO [train.py:682] (1/4) Start epoch 2545 2024-10-21 15:26:22,463 INFO [train.py:561] (1/4) Epoch 2545, batch 6, global_batch_idx: 40710, batch size: 106, loss[dur_loss=0.1987, prior_loss=0.9742, diff_loss=0.2949, tot_loss=1.468, over 106.00 samples.], tot_loss[dur_loss=0.1947, prior_loss=0.9734, diff_loss=0.3971, tot_loss=1.565, over 1142.00 samples.], 2024-10-21 15:26:35,509 INFO [train.py:682] (1/4) Start epoch 2546 2024-10-21 15:26:44,458 INFO [train.py:561] (1/4) Epoch 2546, batch 0, global_batch_idx: 40720, batch size: 108, loss[dur_loss=0.2028, prior_loss=0.9747, diff_loss=0.3193, tot_loss=1.497, over 108.00 samples.], tot_loss[dur_loss=0.2028, prior_loss=0.9747, diff_loss=0.3193, tot_loss=1.497, over 108.00 samples.], 2024-10-21 15:26:58,628 INFO [train.py:561] (1/4) Epoch 2546, batch 10, global_batch_idx: 40730, batch size: 111, loss[dur_loss=0.2016, prior_loss=0.9752, diff_loss=0.304, tot_loss=1.481, over 111.00 samples.], tot_loss[dur_loss=0.1974, prior_loss=0.9739, diff_loss=0.3712, tot_loss=1.543, over 1656.00 samples.], 2024-10-21 15:27:05,710 INFO [train.py:682] (1/4) Start epoch 2547 2024-10-21 15:27:19,878 INFO [train.py:561] (1/4) Epoch 2547, batch 4, global_batch_idx: 40740, batch size: 189, loss[dur_loss=0.1975, prior_loss=0.9743, diff_loss=0.3113, tot_loss=1.483, over 189.00 samples.], tot_loss[dur_loss=0.1949, prior_loss=0.9733, diff_loss=0.4347, tot_loss=1.603, over 937.00 samples.], 2024-10-21 15:27:34,769 INFO [train.py:561] (1/4) Epoch 2547, batch 14, global_batch_idx: 40750, batch size: 142, loss[dur_loss=0.2004, prior_loss=0.9739, diff_loss=0.3256, tot_loss=1.5, over 142.00 samples.], tot_loss[dur_loss=0.1977, prior_loss=0.9739, diff_loss=0.3656, tot_loss=1.537, over 2210.00 samples.], 2024-10-21 15:27:36,182 INFO [train.py:682] (1/4) Start epoch 2548 2024-10-21 15:27:56,203 INFO [train.py:561] (1/4) Epoch 2548, batch 8, global_batch_idx: 40760, batch size: 170, loss[dur_loss=0.2027, prior_loss=0.9744, diff_loss=0.3159, tot_loss=1.493, over 170.00 samples.], tot_loss[dur_loss=0.1963, prior_loss=0.9738, diff_loss=0.3861, tot_loss=1.556, over 1432.00 samples.], 2024-10-21 15:28:06,241 INFO [train.py:682] (1/4) Start epoch 2549 2024-10-21 15:28:17,904 INFO [train.py:561] (1/4) Epoch 2549, batch 2, global_batch_idx: 40770, batch size: 203, loss[dur_loss=0.2046, prior_loss=0.9743, diff_loss=0.3324, tot_loss=1.511, over 203.00 samples.], tot_loss[dur_loss=0.2022, prior_loss=0.9743, diff_loss=0.3151, tot_loss=1.492, over 442.00 samples.], 2024-10-21 15:28:32,127 INFO [train.py:561] (1/4) Epoch 2549, batch 12, global_batch_idx: 40780, batch size: 152, loss[dur_loss=0.1986, prior_loss=0.9742, diff_loss=0.2949, tot_loss=1.468, over 152.00 samples.], 
tot_loss[dur_loss=0.1974, prior_loss=0.9739, diff_loss=0.3516, tot_loss=1.523, over 1966.00 samples.], 2024-10-21 15:28:36,550 INFO [train.py:682] (1/4) Start epoch 2550 2024-10-21 15:28:53,484 INFO [train.py:561] (1/4) Epoch 2550, batch 6, global_batch_idx: 40790, batch size: 106, loss[dur_loss=0.1963, prior_loss=0.9741, diff_loss=0.2769, tot_loss=1.447, over 106.00 samples.], tot_loss[dur_loss=0.1954, prior_loss=0.9736, diff_loss=0.4004, tot_loss=1.569, over 1142.00 samples.], 2024-10-21 15:29:06,556 INFO [train.py:682] (1/4) Start epoch 2551 2024-10-21 15:29:15,544 INFO [train.py:561] (1/4) Epoch 2551, batch 0, global_batch_idx: 40800, batch size: 108, loss[dur_loss=0.1982, prior_loss=0.9747, diff_loss=0.28, tot_loss=1.453, over 108.00 samples.], tot_loss[dur_loss=0.1982, prior_loss=0.9747, diff_loss=0.28, tot_loss=1.453, over 108.00 samples.], 2024-10-21 15:29:29,801 INFO [train.py:561] (1/4) Epoch 2551, batch 10, global_batch_idx: 40810, batch size: 111, loss[dur_loss=0.2014, prior_loss=0.9752, diff_loss=0.3369, tot_loss=1.513, over 111.00 samples.], tot_loss[dur_loss=0.1955, prior_loss=0.9738, diff_loss=0.3694, tot_loss=1.539, over 1656.00 samples.], 2024-10-21 15:29:36,861 INFO [train.py:682] (1/4) Start epoch 2552 2024-10-21 15:29:50,648 INFO [train.py:561] (1/4) Epoch 2552, batch 4, global_batch_idx: 40820, batch size: 189, loss[dur_loss=0.1964, prior_loss=0.9741, diff_loss=0.3329, tot_loss=1.503, over 189.00 samples.], tot_loss[dur_loss=0.1954, prior_loss=0.9733, diff_loss=0.426, tot_loss=1.595, over 937.00 samples.], 2024-10-21 15:30:05,447 INFO [train.py:561] (1/4) Epoch 2552, batch 14, global_batch_idx: 40830, batch size: 142, loss[dur_loss=0.2021, prior_loss=0.9739, diff_loss=0.3189, tot_loss=1.495, over 142.00 samples.], tot_loss[dur_loss=0.1984, prior_loss=0.9739, diff_loss=0.3564, tot_loss=1.529, over 2210.00 samples.], 2024-10-21 15:30:06,866 INFO [train.py:682] (1/4) Start epoch 2553 2024-10-21 15:30:26,878 INFO [train.py:561] (1/4) Epoch 2553, batch 8, global_batch_idx: 40840, batch size: 170, loss[dur_loss=0.2029, prior_loss=0.9743, diff_loss=0.3178, tot_loss=1.495, over 170.00 samples.], tot_loss[dur_loss=0.1961, prior_loss=0.9736, diff_loss=0.3691, tot_loss=1.539, over 1432.00 samples.], 2024-10-21 15:30:36,906 INFO [train.py:682] (1/4) Start epoch 2554 2024-10-21 15:30:48,572 INFO [train.py:561] (1/4) Epoch 2554, batch 2, global_batch_idx: 40850, batch size: 203, loss[dur_loss=0.1987, prior_loss=0.9738, diff_loss=0.337, tot_loss=1.509, over 203.00 samples.], tot_loss[dur_loss=0.1988, prior_loss=0.974, diff_loss=0.3226, tot_loss=1.495, over 442.00 samples.], 2024-10-21 15:31:02,765 INFO [train.py:561] (1/4) Epoch 2554, batch 12, global_batch_idx: 40860, batch size: 152, loss[dur_loss=0.1988, prior_loss=0.9744, diff_loss=0.3046, tot_loss=1.478, over 152.00 samples.], tot_loss[dur_loss=0.1974, prior_loss=0.9739, diff_loss=0.36, tot_loss=1.531, over 1966.00 samples.], 2024-10-21 15:31:07,171 INFO [train.py:682] (1/4) Start epoch 2555 2024-10-21 15:31:24,024 INFO [train.py:561] (1/4) Epoch 2555, batch 6, global_batch_idx: 40870, batch size: 106, loss[dur_loss=0.1973, prior_loss=0.9742, diff_loss=0.2776, tot_loss=1.449, over 106.00 samples.], tot_loss[dur_loss=0.1954, prior_loss=0.9735, diff_loss=0.4051, tot_loss=1.574, over 1142.00 samples.], 2024-10-21 15:31:37,025 INFO [train.py:682] (1/4) Start epoch 2556 2024-10-21 15:31:45,945 INFO [train.py:561] (1/4) Epoch 2556, batch 0, global_batch_idx: 40880, batch size: 108, loss[dur_loss=0.2003, prior_loss=0.9747, 
diff_loss=0.2881, tot_loss=1.463, over 108.00 samples.], tot_loss[dur_loss=0.2003, prior_loss=0.9747, diff_loss=0.2881, tot_loss=1.463, over 108.00 samples.], 2024-10-21 15:32:00,184 INFO [train.py:561] (1/4) Epoch 2556, batch 10, global_batch_idx: 40890, batch size: 111, loss[dur_loss=0.2004, prior_loss=0.975, diff_loss=0.2997, tot_loss=1.475, over 111.00 samples.], tot_loss[dur_loss=0.1956, prior_loss=0.9737, diff_loss=0.376, tot_loss=1.545, over 1656.00 samples.], 2024-10-21 15:32:07,177 INFO [train.py:682] (1/4) Start epoch 2557 2024-10-21 15:32:21,026 INFO [train.py:561] (1/4) Epoch 2557, batch 4, global_batch_idx: 40900, batch size: 189, loss[dur_loss=0.1987, prior_loss=0.974, diff_loss=0.3523, tot_loss=1.525, over 189.00 samples.], tot_loss[dur_loss=0.1942, prior_loss=0.9733, diff_loss=0.4263, tot_loss=1.594, over 937.00 samples.], 2024-10-21 15:32:35,890 INFO [train.py:561] (1/4) Epoch 2557, batch 14, global_batch_idx: 40910, batch size: 142, loss[dur_loss=0.2003, prior_loss=0.9737, diff_loss=0.2852, tot_loss=1.459, over 142.00 samples.], tot_loss[dur_loss=0.1967, prior_loss=0.9739, diff_loss=0.3458, tot_loss=1.516, over 2210.00 samples.], 2024-10-21 15:32:37,299 INFO [train.py:682] (1/4) Start epoch 2558 2024-10-21 15:32:57,270 INFO [train.py:561] (1/4) Epoch 2558, batch 8, global_batch_idx: 40920, batch size: 170, loss[dur_loss=0.1977, prior_loss=0.9741, diff_loss=0.3347, tot_loss=1.507, over 170.00 samples.], tot_loss[dur_loss=0.1962, prior_loss=0.9737, diff_loss=0.3824, tot_loss=1.552, over 1432.00 samples.], 2024-10-21 15:33:07,298 INFO [train.py:682] (1/4) Start epoch 2559 2024-10-21 15:33:18,807 INFO [train.py:561] (1/4) Epoch 2559, batch 2, global_batch_idx: 40930, batch size: 203, loss[dur_loss=0.1954, prior_loss=0.974, diff_loss=0.3173, tot_loss=1.487, over 203.00 samples.], tot_loss[dur_loss=0.1983, prior_loss=0.9741, diff_loss=0.3049, tot_loss=1.477, over 442.00 samples.], 2024-10-21 15:33:32,957 INFO [train.py:561] (1/4) Epoch 2559, batch 12, global_batch_idx: 40940, batch size: 152, loss[dur_loss=0.1999, prior_loss=0.9742, diff_loss=0.3027, tot_loss=1.477, over 152.00 samples.], tot_loss[dur_loss=0.1963, prior_loss=0.9738, diff_loss=0.3663, tot_loss=1.536, over 1966.00 samples.], 2024-10-21 15:33:37,406 INFO [train.py:682] (1/4) Start epoch 2560 2024-10-21 15:33:54,299 INFO [train.py:561] (1/4) Epoch 2560, batch 6, global_batch_idx: 40950, batch size: 106, loss[dur_loss=0.1979, prior_loss=0.9742, diff_loss=0.3356, tot_loss=1.508, over 106.00 samples.], tot_loss[dur_loss=0.1944, prior_loss=0.9734, diff_loss=0.4082, tot_loss=1.576, over 1142.00 samples.], 2024-10-21 15:34:07,309 INFO [train.py:682] (1/4) Start epoch 2561 2024-10-21 15:34:16,117 INFO [train.py:561] (1/4) Epoch 2561, batch 0, global_batch_idx: 40960, batch size: 108, loss[dur_loss=0.2013, prior_loss=0.9746, diff_loss=0.2665, tot_loss=1.442, over 108.00 samples.], tot_loss[dur_loss=0.2013, prior_loss=0.9746, diff_loss=0.2665, tot_loss=1.442, over 108.00 samples.], 2024-10-21 15:34:30,407 INFO [train.py:561] (1/4) Epoch 2561, batch 10, global_batch_idx: 40970, batch size: 111, loss[dur_loss=0.1975, prior_loss=0.9748, diff_loss=0.3209, tot_loss=1.493, over 111.00 samples.], tot_loss[dur_loss=0.1958, prior_loss=0.9737, diff_loss=0.3707, tot_loss=1.54, over 1656.00 samples.], 2024-10-21 15:34:37,497 INFO [train.py:682] (1/4) Start epoch 2562 2024-10-21 15:34:51,520 INFO [train.py:561] (1/4) Epoch 2562, batch 4, global_batch_idx: 40980, batch size: 189, loss[dur_loss=0.2004, prior_loss=0.9741, diff_loss=0.3521, 
tot_loss=1.527, over 189.00 samples.], tot_loss[dur_loss=0.1947, prior_loss=0.9732, diff_loss=0.4243, tot_loss=1.592, over 937.00 samples.], 2024-10-21 15:35:06,318 INFO [train.py:561] (1/4) Epoch 2562, batch 14, global_batch_idx: 40990, batch size: 142, loss[dur_loss=0.1991, prior_loss=0.9739, diff_loss=0.291, tot_loss=1.464, over 142.00 samples.], tot_loss[dur_loss=0.1973, prior_loss=0.9738, diff_loss=0.3528, tot_loss=1.524, over 2210.00 samples.], 2024-10-21 15:35:07,735 INFO [train.py:682] (1/4) Start epoch 2563 2024-10-21 15:35:27,654 INFO [train.py:561] (1/4) Epoch 2563, batch 8, global_batch_idx: 41000, batch size: 170, loss[dur_loss=0.2011, prior_loss=0.9741, diff_loss=0.3166, tot_loss=1.492, over 170.00 samples.], tot_loss[dur_loss=0.1961, prior_loss=0.9736, diff_loss=0.3851, tot_loss=1.555, over 1432.00 samples.], 2024-10-21 15:35:37,747 INFO [train.py:682] (1/4) Start epoch 2564 2024-10-21 15:35:48,897 INFO [train.py:561] (1/4) Epoch 2564, batch 2, global_batch_idx: 41010, batch size: 203, loss[dur_loss=0.1975, prior_loss=0.9741, diff_loss=0.3378, tot_loss=1.509, over 203.00 samples.], tot_loss[dur_loss=0.1986, prior_loss=0.9741, diff_loss=0.3095, tot_loss=1.482, over 442.00 samples.], 2024-10-21 15:36:03,229 INFO [train.py:561] (1/4) Epoch 2564, batch 12, global_batch_idx: 41020, batch size: 152, loss[dur_loss=0.1988, prior_loss=0.9743, diff_loss=0.3363, tot_loss=1.51, over 152.00 samples.], tot_loss[dur_loss=0.1968, prior_loss=0.9737, diff_loss=0.3666, tot_loss=1.537, over 1966.00 samples.], 2024-10-21 15:36:07,658 INFO [train.py:682] (1/4) Start epoch 2565 2024-10-21 15:36:24,872 INFO [train.py:561] (1/4) Epoch 2565, batch 6, global_batch_idx: 41030, batch size: 106, loss[dur_loss=0.1957, prior_loss=0.9739, diff_loss=0.3089, tot_loss=1.479, over 106.00 samples.], tot_loss[dur_loss=0.1946, prior_loss=0.9735, diff_loss=0.4082, tot_loss=1.576, over 1142.00 samples.], 2024-10-21 15:36:37,880 INFO [train.py:682] (1/4) Start epoch 2566 2024-10-21 15:36:46,551 INFO [train.py:561] (1/4) Epoch 2566, batch 0, global_batch_idx: 41040, batch size: 108, loss[dur_loss=0.204, prior_loss=0.9747, diff_loss=0.3112, tot_loss=1.49, over 108.00 samples.], tot_loss[dur_loss=0.204, prior_loss=0.9747, diff_loss=0.3112, tot_loss=1.49, over 108.00 samples.], 2024-10-21 15:37:00,692 INFO [train.py:561] (1/4) Epoch 2566, batch 10, global_batch_idx: 41050, batch size: 111, loss[dur_loss=0.2003, prior_loss=0.975, diff_loss=0.3227, tot_loss=1.498, over 111.00 samples.], tot_loss[dur_loss=0.1969, prior_loss=0.9737, diff_loss=0.3763, tot_loss=1.547, over 1656.00 samples.], 2024-10-21 15:37:07,759 INFO [train.py:682] (1/4) Start epoch 2567 2024-10-21 15:37:21,670 INFO [train.py:561] (1/4) Epoch 2567, batch 4, global_batch_idx: 41060, batch size: 189, loss[dur_loss=0.1959, prior_loss=0.9739, diff_loss=0.3474, tot_loss=1.517, over 189.00 samples.], tot_loss[dur_loss=0.194, prior_loss=0.9731, diff_loss=0.4109, tot_loss=1.578, over 937.00 samples.], 2024-10-21 15:37:36,558 INFO [train.py:561] (1/4) Epoch 2567, batch 14, global_batch_idx: 41070, batch size: 142, loss[dur_loss=0.1975, prior_loss=0.9737, diff_loss=0.3101, tot_loss=1.481, over 142.00 samples.], tot_loss[dur_loss=0.197, prior_loss=0.9737, diff_loss=0.3491, tot_loss=1.52, over 2210.00 samples.], 2024-10-21 15:37:37,973 INFO [train.py:682] (1/4) Start epoch 2568 2024-10-21 15:37:58,009 INFO [train.py:561] (1/4) Epoch 2568, batch 8, global_batch_idx: 41080, batch size: 170, loss[dur_loss=0.2014, prior_loss=0.9741, diff_loss=0.3326, tot_loss=1.508, over 
170.00 samples.], tot_loss[dur_loss=0.1963, prior_loss=0.9735, diff_loss=0.3734, tot_loss=1.543, over 1432.00 samples.], 2024-10-21 15:38:08,073 INFO [train.py:682] (1/4) Start epoch 2569 2024-10-21 15:38:19,402 INFO [train.py:561] (1/4) Epoch 2569, batch 2, global_batch_idx: 41090, batch size: 203, loss[dur_loss=0.198, prior_loss=0.9738, diff_loss=0.3235, tot_loss=1.495, over 203.00 samples.], tot_loss[dur_loss=0.1973, prior_loss=0.9739, diff_loss=0.3091, tot_loss=1.48, over 442.00 samples.], 2024-10-21 15:38:33,721 INFO [train.py:561] (1/4) Epoch 2569, batch 12, global_batch_idx: 41100, batch size: 152, loss[dur_loss=0.2001, prior_loss=0.9744, diff_loss=0.3312, tot_loss=1.506, over 152.00 samples.], tot_loss[dur_loss=0.1967, prior_loss=0.9737, diff_loss=0.3629, tot_loss=1.533, over 1966.00 samples.], 2024-10-21 15:38:38,132 INFO [train.py:682] (1/4) Start epoch 2570 2024-10-21 15:38:55,293 INFO [train.py:561] (1/4) Epoch 2570, batch 6, global_batch_idx: 41110, batch size: 106, loss[dur_loss=0.1951, prior_loss=0.9742, diff_loss=0.2816, tot_loss=1.451, over 106.00 samples.], tot_loss[dur_loss=0.1942, prior_loss=0.9734, diff_loss=0.3875, tot_loss=1.555, over 1142.00 samples.], 2024-10-21 15:39:08,377 INFO [train.py:682] (1/4) Start epoch 2571 2024-10-21 15:39:17,515 INFO [train.py:561] (1/4) Epoch 2571, batch 0, global_batch_idx: 41120, batch size: 108, loss[dur_loss=0.2032, prior_loss=0.9746, diff_loss=0.3091, tot_loss=1.487, over 108.00 samples.], tot_loss[dur_loss=0.2032, prior_loss=0.9746, diff_loss=0.3091, tot_loss=1.487, over 108.00 samples.], 2024-10-21 15:39:31,672 INFO [train.py:561] (1/4) Epoch 2571, batch 10, global_batch_idx: 41130, batch size: 111, loss[dur_loss=0.2025, prior_loss=0.9753, diff_loss=0.2964, tot_loss=1.474, over 111.00 samples.], tot_loss[dur_loss=0.1964, prior_loss=0.9739, diff_loss=0.3745, tot_loss=1.545, over 1656.00 samples.], 2024-10-21 15:39:38,722 INFO [train.py:682] (1/4) Start epoch 2572 2024-10-21 15:39:52,479 INFO [train.py:561] (1/4) Epoch 2572, batch 4, global_batch_idx: 41140, batch size: 189, loss[dur_loss=0.198, prior_loss=0.9743, diff_loss=0.3158, tot_loss=1.488, over 189.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.9733, diff_loss=0.4329, tot_loss=1.6, over 937.00 samples.], 2024-10-21 15:40:07,294 INFO [train.py:561] (1/4) Epoch 2572, batch 14, global_batch_idx: 41150, batch size: 142, loss[dur_loss=0.2008, prior_loss=0.9739, diff_loss=0.2764, tot_loss=1.451, over 142.00 samples.], tot_loss[dur_loss=0.1971, prior_loss=0.9738, diff_loss=0.3568, tot_loss=1.528, over 2210.00 samples.], 2024-10-21 15:40:08,717 INFO [train.py:682] (1/4) Start epoch 2573 2024-10-21 15:40:28,882 INFO [train.py:561] (1/4) Epoch 2573, batch 8, global_batch_idx: 41160, batch size: 170, loss[dur_loss=0.2005, prior_loss=0.9742, diff_loss=0.3442, tot_loss=1.519, over 170.00 samples.], tot_loss[dur_loss=0.1957, prior_loss=0.9735, diff_loss=0.3759, tot_loss=1.545, over 1432.00 samples.], 2024-10-21 15:40:38,985 INFO [train.py:682] (1/4) Start epoch 2574 2024-10-21 15:40:50,575 INFO [train.py:561] (1/4) Epoch 2574, batch 2, global_batch_idx: 41170, batch size: 203, loss[dur_loss=0.1973, prior_loss=0.9739, diff_loss=0.3577, tot_loss=1.529, over 203.00 samples.], tot_loss[dur_loss=0.1982, prior_loss=0.974, diff_loss=0.3264, tot_loss=1.499, over 442.00 samples.], 2024-10-21 15:41:04,863 INFO [train.py:561] (1/4) Epoch 2574, batch 12, global_batch_idx: 41180, batch size: 152, loss[dur_loss=0.2022, prior_loss=0.9742, diff_loss=0.3036, tot_loss=1.48, over 152.00 samples.], 
tot_loss[dur_loss=0.1973, prior_loss=0.9738, diff_loss=0.3629, tot_loss=1.534, over 1966.00 samples.], 2024-10-21 15:41:09,267 INFO [train.py:682] (1/4) Start epoch 2575 2024-10-21 15:41:26,238 INFO [train.py:561] (1/4) Epoch 2575, batch 6, global_batch_idx: 41190, batch size: 106, loss[dur_loss=0.1992, prior_loss=0.9739, diff_loss=0.2926, tot_loss=1.466, over 106.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.9734, diff_loss=0.4005, tot_loss=1.568, over 1142.00 samples.], 2024-10-21 15:41:39,319 INFO [train.py:682] (1/4) Start epoch 2576 2024-10-21 15:41:48,301 INFO [train.py:561] (1/4) Epoch 2576, batch 0, global_batch_idx: 41200, batch size: 108, loss[dur_loss=0.2007, prior_loss=0.9746, diff_loss=0.3017, tot_loss=1.477, over 108.00 samples.], tot_loss[dur_loss=0.2007, prior_loss=0.9746, diff_loss=0.3017, tot_loss=1.477, over 108.00 samples.], 2024-10-21 15:42:02,432 INFO [train.py:561] (1/4) Epoch 2576, batch 10, global_batch_idx: 41210, batch size: 111, loss[dur_loss=0.1996, prior_loss=0.9749, diff_loss=0.3139, tot_loss=1.488, over 111.00 samples.], tot_loss[dur_loss=0.1968, prior_loss=0.9738, diff_loss=0.374, tot_loss=1.545, over 1656.00 samples.], 2024-10-21 15:42:09,436 INFO [train.py:682] (1/4) Start epoch 2577 2024-10-21 15:42:23,260 INFO [train.py:561] (1/4) Epoch 2577, batch 4, global_batch_idx: 41220, batch size: 189, loss[dur_loss=0.2002, prior_loss=0.974, diff_loss=0.3062, tot_loss=1.48, over 189.00 samples.], tot_loss[dur_loss=0.196, prior_loss=0.9732, diff_loss=0.4164, tot_loss=1.586, over 937.00 samples.], 2024-10-21 15:42:38,077 INFO [train.py:561] (1/4) Epoch 2577, batch 14, global_batch_idx: 41230, batch size: 142, loss[dur_loss=0.2006, prior_loss=0.9737, diff_loss=0.3164, tot_loss=1.491, over 142.00 samples.], tot_loss[dur_loss=0.198, prior_loss=0.9738, diff_loss=0.355, tot_loss=1.527, over 2210.00 samples.], 2024-10-21 15:42:39,499 INFO [train.py:682] (1/4) Start epoch 2578 2024-10-21 15:42:59,406 INFO [train.py:561] (1/4) Epoch 2578, batch 8, global_batch_idx: 41240, batch size: 170, loss[dur_loss=0.1995, prior_loss=0.9741, diff_loss=0.3001, tot_loss=1.474, over 170.00 samples.], tot_loss[dur_loss=0.1954, prior_loss=0.9735, diff_loss=0.3806, tot_loss=1.55, over 1432.00 samples.], 2024-10-21 15:43:09,381 INFO [train.py:682] (1/4) Start epoch 2579 2024-10-21 15:43:21,357 INFO [train.py:561] (1/4) Epoch 2579, batch 2, global_batch_idx: 41250, batch size: 203, loss[dur_loss=0.1996, prior_loss=0.9741, diff_loss=0.3753, tot_loss=1.549, over 203.00 samples.], tot_loss[dur_loss=0.1999, prior_loss=0.9741, diff_loss=0.3391, tot_loss=1.513, over 442.00 samples.], 2024-10-21 15:43:35,682 INFO [train.py:561] (1/4) Epoch 2579, batch 12, global_batch_idx: 41260, batch size: 152, loss[dur_loss=0.1963, prior_loss=0.974, diff_loss=0.3204, tot_loss=1.491, over 152.00 samples.], tot_loss[dur_loss=0.1963, prior_loss=0.9737, diff_loss=0.3717, tot_loss=1.542, over 1966.00 samples.], 2024-10-21 15:43:40,099 INFO [train.py:682] (1/4) Start epoch 2580 2024-10-21 15:43:57,275 INFO [train.py:561] (1/4) Epoch 2580, batch 6, global_batch_idx: 41270, batch size: 106, loss[dur_loss=0.1975, prior_loss=0.9738, diff_loss=0.3379, tot_loss=1.509, over 106.00 samples.], tot_loss[dur_loss=0.1943, prior_loss=0.9733, diff_loss=0.3968, tot_loss=1.564, over 1142.00 samples.], 2024-10-21 15:44:10,424 INFO [train.py:682] (1/4) Start epoch 2581 2024-10-21 15:44:19,338 INFO [train.py:561] (1/4) Epoch 2581, batch 0, global_batch_idx: 41280, batch size: 108, loss[dur_loss=0.2036, prior_loss=0.9747, 
diff_loss=0.2885, tot_loss=1.467, over 108.00 samples.], tot_loss[dur_loss=0.2036, prior_loss=0.9747, diff_loss=0.2885, tot_loss=1.467, over 108.00 samples.], 2024-10-21 15:44:33,545 INFO [train.py:561] (1/4) Epoch 2581, batch 10, global_batch_idx: 41290, batch size: 111, loss[dur_loss=0.1969, prior_loss=0.975, diff_loss=0.3122, tot_loss=1.484, over 111.00 samples.], tot_loss[dur_loss=0.1964, prior_loss=0.9736, diff_loss=0.3754, tot_loss=1.545, over 1656.00 samples.], 2024-10-21 15:44:40,556 INFO [train.py:682] (1/4) Start epoch 2582 2024-10-21 15:44:54,326 INFO [train.py:561] (1/4) Epoch 2582, batch 4, global_batch_idx: 41300, batch size: 189, loss[dur_loss=0.1949, prior_loss=0.9738, diff_loss=0.3209, tot_loss=1.489, over 189.00 samples.], tot_loss[dur_loss=0.1943, prior_loss=0.9731, diff_loss=0.4203, tot_loss=1.588, over 937.00 samples.], 2024-10-21 15:45:09,101 INFO [train.py:561] (1/4) Epoch 2582, batch 14, global_batch_idx: 41310, batch size: 142, loss[dur_loss=0.1984, prior_loss=0.9736, diff_loss=0.3141, tot_loss=1.486, over 142.00 samples.], tot_loss[dur_loss=0.197, prior_loss=0.9737, diff_loss=0.3559, tot_loss=1.527, over 2210.00 samples.], 2024-10-21 15:45:10,524 INFO [train.py:682] (1/4) Start epoch 2583 2024-10-21 15:45:30,969 INFO [train.py:561] (1/4) Epoch 2583, batch 8, global_batch_idx: 41320, batch size: 170, loss[dur_loss=0.1991, prior_loss=0.974, diff_loss=0.3214, tot_loss=1.494, over 170.00 samples.], tot_loss[dur_loss=0.1957, prior_loss=0.9735, diff_loss=0.3724, tot_loss=1.542, over 1432.00 samples.], 2024-10-21 15:45:41,130 INFO [train.py:682] (1/4) Start epoch 2584 2024-10-21 15:45:52,783 INFO [train.py:561] (1/4) Epoch 2584, batch 2, global_batch_idx: 41330, batch size: 203, loss[dur_loss=0.1972, prior_loss=0.9737, diff_loss=0.348, tot_loss=1.519, over 203.00 samples.], tot_loss[dur_loss=0.1974, prior_loss=0.9738, diff_loss=0.3208, tot_loss=1.492, over 442.00 samples.], 2024-10-21 15:46:07,070 INFO [train.py:561] (1/4) Epoch 2584, batch 12, global_batch_idx: 41340, batch size: 152, loss[dur_loss=0.1956, prior_loss=0.9741, diff_loss=0.3085, tot_loss=1.478, over 152.00 samples.], tot_loss[dur_loss=0.1959, prior_loss=0.9736, diff_loss=0.3575, tot_loss=1.527, over 1966.00 samples.], 2024-10-21 15:46:11,531 INFO [train.py:682] (1/4) Start epoch 2585 2024-10-21 15:46:28,831 INFO [train.py:561] (1/4) Epoch 2585, batch 6, global_batch_idx: 41350, batch size: 106, loss[dur_loss=0.199, prior_loss=0.974, diff_loss=0.2868, tot_loss=1.46, over 106.00 samples.], tot_loss[dur_loss=0.194, prior_loss=0.9733, diff_loss=0.4063, tot_loss=1.574, over 1142.00 samples.], 2024-10-21 15:46:41,943 INFO [train.py:682] (1/4) Start epoch 2586 2024-10-21 15:46:51,238 INFO [train.py:561] (1/4) Epoch 2586, batch 0, global_batch_idx: 41360, batch size: 108, loss[dur_loss=0.2022, prior_loss=0.9745, diff_loss=0.306, tot_loss=1.483, over 108.00 samples.], tot_loss[dur_loss=0.2022, prior_loss=0.9745, diff_loss=0.306, tot_loss=1.483, over 108.00 samples.], 2024-10-21 15:47:05,357 INFO [train.py:561] (1/4) Epoch 2586, batch 10, global_batch_idx: 41370, batch size: 111, loss[dur_loss=0.2031, prior_loss=0.9749, diff_loss=0.3088, tot_loss=1.487, over 111.00 samples.], tot_loss[dur_loss=0.1969, prior_loss=0.9736, diff_loss=0.365, tot_loss=1.535, over 1656.00 samples.], 2024-10-21 15:47:12,407 INFO [train.py:682] (1/4) Start epoch 2587 2024-10-21 15:47:26,768 INFO [train.py:561] (1/4) Epoch 2587, batch 4, global_batch_idx: 41380, batch size: 189, loss[dur_loss=0.1956, prior_loss=0.9738, diff_loss=0.2973, 
tot_loss=1.467, over 189.00 samples.], tot_loss[dur_loss=0.1942, prior_loss=0.973, diff_loss=0.4238, tot_loss=1.591, over 937.00 samples.], 2024-10-21 15:47:41,730 INFO [train.py:561] (1/4) Epoch 2587, batch 14, global_batch_idx: 41390, batch size: 142, loss[dur_loss=0.1987, prior_loss=0.9736, diff_loss=0.3015, tot_loss=1.474, over 142.00 samples.], tot_loss[dur_loss=0.1973, prior_loss=0.9736, diff_loss=0.3521, tot_loss=1.523, over 2210.00 samples.], 2024-10-21 15:47:43,158 INFO [train.py:682] (1/4) Start epoch 2588 2024-10-21 15:48:03,097 INFO [train.py:561] (1/4) Epoch 2588, batch 8, global_batch_idx: 41400, batch size: 170, loss[dur_loss=0.1984, prior_loss=0.9739, diff_loss=0.3237, tot_loss=1.496, over 170.00 samples.], tot_loss[dur_loss=0.1956, prior_loss=0.9735, diff_loss=0.3844, tot_loss=1.553, over 1432.00 samples.], 2024-10-21 15:48:13,177 INFO [train.py:682] (1/4) Start epoch 2589 2024-10-21 15:48:24,721 INFO [train.py:561] (1/4) Epoch 2589, batch 2, global_batch_idx: 41410, batch size: 203, loss[dur_loss=0.1955, prior_loss=0.9738, diff_loss=0.3276, tot_loss=1.497, over 203.00 samples.], tot_loss[dur_loss=0.197, prior_loss=0.9739, diff_loss=0.3037, tot_loss=1.475, over 442.00 samples.], 2024-10-21 15:48:38,825 INFO [train.py:561] (1/4) Epoch 2589, batch 12, global_batch_idx: 41420, batch size: 152, loss[dur_loss=0.1968, prior_loss=0.9741, diff_loss=0.327, tot_loss=1.498, over 152.00 samples.], tot_loss[dur_loss=0.1962, prior_loss=0.9736, diff_loss=0.3591, tot_loss=1.529, over 1966.00 samples.], 2024-10-21 15:48:43,257 INFO [train.py:682] (1/4) Start epoch 2590 2024-10-21 15:49:00,153 INFO [train.py:561] (1/4) Epoch 2590, batch 6, global_batch_idx: 41430, batch size: 106, loss[dur_loss=0.1959, prior_loss=0.9737, diff_loss=0.2583, tot_loss=1.428, over 106.00 samples.], tot_loss[dur_loss=0.1954, prior_loss=0.9734, diff_loss=0.3883, tot_loss=1.557, over 1142.00 samples.], 2024-10-21 15:49:13,293 INFO [train.py:682] (1/4) Start epoch 2591 2024-10-21 15:49:22,263 INFO [train.py:561] (1/4) Epoch 2591, batch 0, global_batch_idx: 41440, batch size: 108, loss[dur_loss=0.2041, prior_loss=0.9746, diff_loss=0.2726, tot_loss=1.451, over 108.00 samples.], tot_loss[dur_loss=0.2041, prior_loss=0.9746, diff_loss=0.2726, tot_loss=1.451, over 108.00 samples.], 2024-10-21 15:49:36,543 INFO [train.py:561] (1/4) Epoch 2591, batch 10, global_batch_idx: 41450, batch size: 111, loss[dur_loss=0.2001, prior_loss=0.9748, diff_loss=0.301, tot_loss=1.476, over 111.00 samples.], tot_loss[dur_loss=0.1965, prior_loss=0.9737, diff_loss=0.367, tot_loss=1.537, over 1656.00 samples.], 2024-10-21 15:49:43,625 INFO [train.py:682] (1/4) Start epoch 2592 2024-10-21 15:49:57,365 INFO [train.py:561] (1/4) Epoch 2592, batch 4, global_batch_idx: 41460, batch size: 189, loss[dur_loss=0.1985, prior_loss=0.974, diff_loss=0.3399, tot_loss=1.512, over 189.00 samples.], tot_loss[dur_loss=0.195, prior_loss=0.9732, diff_loss=0.43, tot_loss=1.598, over 937.00 samples.], 2024-10-21 15:50:12,208 INFO [train.py:561] (1/4) Epoch 2592, batch 14, global_batch_idx: 41470, batch size: 142, loss[dur_loss=0.2003, prior_loss=0.9738, diff_loss=0.3362, tot_loss=1.51, over 142.00 samples.], tot_loss[dur_loss=0.1976, prior_loss=0.9737, diff_loss=0.3552, tot_loss=1.527, over 2210.00 samples.], 2024-10-21 15:50:13,659 INFO [train.py:682] (1/4) Start epoch 2593 2024-10-21 15:50:33,476 INFO [train.py:561] (1/4) Epoch 2593, batch 8, global_batch_idx: 41480, batch size: 170, loss[dur_loss=0.2006, prior_loss=0.9739, diff_loss=0.331, tot_loss=1.505, over 
170.00 samples.], tot_loss[dur_loss=0.1962, prior_loss=0.9735, diff_loss=0.3799, tot_loss=1.55, over 1432.00 samples.], 2024-10-21 15:50:43,584 INFO [train.py:682] (1/4) Start epoch 2594 2024-10-21 15:50:54,888 INFO [train.py:561] (1/4) Epoch 2594, batch 2, global_batch_idx: 41490, batch size: 203, loss[dur_loss=0.1968, prior_loss=0.9737, diff_loss=0.3261, tot_loss=1.497, over 203.00 samples.], tot_loss[dur_loss=0.1975, prior_loss=0.9739, diff_loss=0.3089, tot_loss=1.48, over 442.00 samples.], 2024-10-21 15:51:09,033 INFO [train.py:561] (1/4) Epoch 2594, batch 12, global_batch_idx: 41500, batch size: 152, loss[dur_loss=0.1977, prior_loss=0.9742, diff_loss=0.3068, tot_loss=1.479, over 152.00 samples.], tot_loss[dur_loss=0.1969, prior_loss=0.9737, diff_loss=0.3526, tot_loss=1.523, over 1966.00 samples.], 2024-10-21 15:51:13,548 INFO [train.py:682] (1/4) Start epoch 2595 2024-10-21 15:51:30,769 INFO [train.py:561] (1/4) Epoch 2595, batch 6, global_batch_idx: 41510, batch size: 106, loss[dur_loss=0.199, prior_loss=0.9744, diff_loss=0.2667, tot_loss=1.44, over 106.00 samples.], tot_loss[dur_loss=0.1955, prior_loss=0.9734, diff_loss=0.3972, tot_loss=1.566, over 1142.00 samples.], 2024-10-21 15:51:43,733 INFO [train.py:682] (1/4) Start epoch 2596 2024-10-21 15:51:52,729 INFO [train.py:561] (1/4) Epoch 2596, batch 0, global_batch_idx: 41520, batch size: 108, loss[dur_loss=0.201, prior_loss=0.9744, diff_loss=0.3145, tot_loss=1.49, over 108.00 samples.], tot_loss[dur_loss=0.201, prior_loss=0.9744, diff_loss=0.3145, tot_loss=1.49, over 108.00 samples.], 2024-10-21 15:52:06,798 INFO [train.py:561] (1/4) Epoch 2596, batch 10, global_batch_idx: 41530, batch size: 111, loss[dur_loss=0.2013, prior_loss=0.975, diff_loss=0.3127, tot_loss=1.489, over 111.00 samples.], tot_loss[dur_loss=0.1975, prior_loss=0.9736, diff_loss=0.3733, tot_loss=1.544, over 1656.00 samples.], 2024-10-21 15:52:13,825 INFO [train.py:682] (1/4) Start epoch 2597 2024-10-21 15:52:27,503 INFO [train.py:561] (1/4) Epoch 2597, batch 4, global_batch_idx: 41540, batch size: 189, loss[dur_loss=0.1997, prior_loss=0.9741, diff_loss=0.3228, tot_loss=1.497, over 189.00 samples.], tot_loss[dur_loss=0.1964, prior_loss=0.9731, diff_loss=0.422, tot_loss=1.591, over 937.00 samples.], 2024-10-21 15:52:42,252 INFO [train.py:561] (1/4) Epoch 2597, batch 14, global_batch_idx: 41550, batch size: 142, loss[dur_loss=0.1983, prior_loss=0.9736, diff_loss=0.2845, tot_loss=1.456, over 142.00 samples.], tot_loss[dur_loss=0.1985, prior_loss=0.9737, diff_loss=0.3519, tot_loss=1.524, over 2210.00 samples.], 2024-10-21 15:52:43,720 INFO [train.py:682] (1/4) Start epoch 2598 2024-10-21 15:53:03,906 INFO [train.py:561] (1/4) Epoch 2598, batch 8, global_batch_idx: 41560, batch size: 170, loss[dur_loss=0.1997, prior_loss=0.9738, diff_loss=0.3152, tot_loss=1.489, over 170.00 samples.], tot_loss[dur_loss=0.1962, prior_loss=0.9734, diff_loss=0.3747, tot_loss=1.544, over 1432.00 samples.], 2024-10-21 15:53:14,079 INFO [train.py:682] (1/4) Start epoch 2599 2024-10-21 15:53:25,650 INFO [train.py:561] (1/4) Epoch 2599, batch 2, global_batch_idx: 41570, batch size: 203, loss[dur_loss=0.1968, prior_loss=0.9736, diff_loss=0.3511, tot_loss=1.522, over 203.00 samples.], tot_loss[dur_loss=0.1985, prior_loss=0.9738, diff_loss=0.3162, tot_loss=1.488, over 442.00 samples.], 2024-10-21 15:53:39,984 INFO [train.py:561] (1/4) Epoch 2599, batch 12, global_batch_idx: 41580, batch size: 152, loss[dur_loss=0.2015, prior_loss=0.9742, diff_loss=0.3003, tot_loss=1.476, over 152.00 samples.], 
tot_loss[dur_loss=0.197, prior_loss=0.9736, diff_loss=0.3631, tot_loss=1.534, over 1966.00 samples.], 2024-10-21 15:53:44,503 INFO [train.py:682] (1/4) Start epoch 2600 2024-10-21 15:54:01,686 INFO [train.py:561] (1/4) Epoch 2600, batch 6, global_batch_idx: 41590, batch size: 106, loss[dur_loss=0.198, prior_loss=0.9738, diff_loss=0.3174, tot_loss=1.489, over 106.00 samples.], tot_loss[dur_loss=0.1967, prior_loss=0.9732, diff_loss=0.3942, tot_loss=1.564, over 1142.00 samples.], 2024-10-21 15:54:14,794 INFO [train.py:682] (1/4) Start epoch 2601 2024-10-21 15:54:23,705 INFO [train.py:561] (1/4) Epoch 2601, batch 0, global_batch_idx: 41600, batch size: 108, loss[dur_loss=0.1994, prior_loss=0.9745, diff_loss=0.2699, tot_loss=1.444, over 108.00 samples.], tot_loss[dur_loss=0.1994, prior_loss=0.9745, diff_loss=0.2699, tot_loss=1.444, over 108.00 samples.], 2024-10-21 15:54:38,089 INFO [train.py:561] (1/4) Epoch 2601, batch 10, global_batch_idx: 41610, batch size: 111, loss[dur_loss=0.2001, prior_loss=0.9749, diff_loss=0.3175, tot_loss=1.493, over 111.00 samples.], tot_loss[dur_loss=0.1968, prior_loss=0.9736, diff_loss=0.3693, tot_loss=1.54, over 1656.00 samples.], 2024-10-21 15:54:45,278 INFO [train.py:682] (1/4) Start epoch 2602 2024-10-21 15:54:59,682 INFO [train.py:561] (1/4) Epoch 2602, batch 4, global_batch_idx: 41620, batch size: 189, loss[dur_loss=0.2012, prior_loss=0.9742, diff_loss=0.3363, tot_loss=1.512, over 189.00 samples.], tot_loss[dur_loss=0.1951, prior_loss=0.9731, diff_loss=0.4312, tot_loss=1.599, over 937.00 samples.], 2024-10-21 15:55:14,434 INFO [train.py:561] (1/4) Epoch 2602, batch 14, global_batch_idx: 41630, batch size: 142, loss[dur_loss=0.1986, prior_loss=0.9735, diff_loss=0.2845, tot_loss=1.457, over 142.00 samples.], tot_loss[dur_loss=0.1972, prior_loss=0.9737, diff_loss=0.3563, tot_loss=1.527, over 2210.00 samples.], 2024-10-21 15:55:15,896 INFO [train.py:682] (1/4) Start epoch 2603 2024-10-21 15:55:36,262 INFO [train.py:561] (1/4) Epoch 2603, batch 8, global_batch_idx: 41640, batch size: 170, loss[dur_loss=0.1982, prior_loss=0.9739, diff_loss=0.323, tot_loss=1.495, over 170.00 samples.], tot_loss[dur_loss=0.1969, prior_loss=0.9735, diff_loss=0.3841, tot_loss=1.554, over 1432.00 samples.], 2024-10-21 15:55:46,410 INFO [train.py:682] (1/4) Start epoch 2604 2024-10-21 15:55:58,069 INFO [train.py:561] (1/4) Epoch 2604, batch 2, global_batch_idx: 41650, batch size: 203, loss[dur_loss=0.2015, prior_loss=0.9737, diff_loss=0.3425, tot_loss=1.518, over 203.00 samples.], tot_loss[dur_loss=0.2007, prior_loss=0.9738, diff_loss=0.3212, tot_loss=1.496, over 442.00 samples.], 2024-10-21 15:56:12,185 INFO [train.py:561] (1/4) Epoch 2604, batch 12, global_batch_idx: 41660, batch size: 152, loss[dur_loss=0.1974, prior_loss=0.974, diff_loss=0.3146, tot_loss=1.486, over 152.00 samples.], tot_loss[dur_loss=0.1969, prior_loss=0.9736, diff_loss=0.3616, tot_loss=1.532, over 1966.00 samples.], 2024-10-21 15:56:16,637 INFO [train.py:682] (1/4) Start epoch 2605 2024-10-21 15:56:34,029 INFO [train.py:561] (1/4) Epoch 2605, batch 6, global_batch_idx: 41670, batch size: 106, loss[dur_loss=0.1991, prior_loss=0.9739, diff_loss=0.3057, tot_loss=1.479, over 106.00 samples.], tot_loss[dur_loss=0.1961, prior_loss=0.9732, diff_loss=0.3999, tot_loss=1.569, over 1142.00 samples.], 2024-10-21 15:56:47,001 INFO [train.py:682] (1/4) Start epoch 2606 2024-10-21 15:56:55,991 INFO [train.py:561] (1/4) Epoch 2606, batch 0, global_batch_idx: 41680, batch size: 108, loss[dur_loss=0.2015, prior_loss=0.9743, 
diff_loss=0.2862, tot_loss=1.462, over 108.00 samples.], tot_loss[dur_loss=0.2015, prior_loss=0.9743, diff_loss=0.2862, tot_loss=1.462, over 108.00 samples.], 2024-10-21 15:57:10,126 INFO [train.py:561] (1/4) Epoch 2606, batch 10, global_batch_idx: 41690, batch size: 111, loss[dur_loss=0.2015, prior_loss=0.9747, diff_loss=0.3281, tot_loss=1.504, over 111.00 samples.], tot_loss[dur_loss=0.1958, prior_loss=0.9734, diff_loss=0.3717, tot_loss=1.541, over 1656.00 samples.], 2024-10-21 15:57:17,296 INFO [train.py:682] (1/4) Start epoch 2607 2024-10-21 15:57:31,295 INFO [train.py:561] (1/4) Epoch 2607, batch 4, global_batch_idx: 41700, batch size: 189, loss[dur_loss=0.1974, prior_loss=0.9738, diff_loss=0.3478, tot_loss=1.519, over 189.00 samples.], tot_loss[dur_loss=0.1943, prior_loss=0.973, diff_loss=0.417, tot_loss=1.584, over 937.00 samples.], 2024-10-21 15:57:46,134 INFO [train.py:561] (1/4) Epoch 2607, batch 14, global_batch_idx: 41710, batch size: 142, loss[dur_loss=0.197, prior_loss=0.9736, diff_loss=0.2852, tot_loss=1.456, over 142.00 samples.], tot_loss[dur_loss=0.1969, prior_loss=0.9736, diff_loss=0.3528, tot_loss=1.523, over 2210.00 samples.], 2024-10-21 15:57:47,568 INFO [train.py:682] (1/4) Start epoch 2608 2024-10-21 15:58:07,575 INFO [train.py:561] (1/4) Epoch 2608, batch 8, global_batch_idx: 41720, batch size: 170, loss[dur_loss=0.2, prior_loss=0.974, diff_loss=0.2969, tot_loss=1.471, over 170.00 samples.], tot_loss[dur_loss=0.1954, prior_loss=0.9734, diff_loss=0.3848, tot_loss=1.554, over 1432.00 samples.], 2024-10-21 15:58:17,751 INFO [train.py:682] (1/4) Start epoch 2609 2024-10-21 15:58:29,115 INFO [train.py:561] (1/4) Epoch 2609, batch 2, global_batch_idx: 41730, batch size: 203, loss[dur_loss=0.1976, prior_loss=0.9738, diff_loss=0.3588, tot_loss=1.53, over 203.00 samples.], tot_loss[dur_loss=0.1982, prior_loss=0.9739, diff_loss=0.3261, tot_loss=1.498, over 442.00 samples.], 2024-10-21 15:58:43,196 INFO [train.py:561] (1/4) Epoch 2609, batch 12, global_batch_idx: 41740, batch size: 152, loss[dur_loss=0.1972, prior_loss=0.9739, diff_loss=0.3032, tot_loss=1.474, over 152.00 samples.], tot_loss[dur_loss=0.197, prior_loss=0.9735, diff_loss=0.3637, tot_loss=1.534, over 1966.00 samples.], 2024-10-21 15:58:47,635 INFO [train.py:682] (1/4) Start epoch 2610 2024-10-21 15:59:04,744 INFO [train.py:561] (1/4) Epoch 2610, batch 6, global_batch_idx: 41750, batch size: 106, loss[dur_loss=0.195, prior_loss=0.9736, diff_loss=0.2899, tot_loss=1.458, over 106.00 samples.], tot_loss[dur_loss=0.1944, prior_loss=0.9731, diff_loss=0.401, tot_loss=1.569, over 1142.00 samples.], 2024-10-21 15:59:17,780 INFO [train.py:682] (1/4) Start epoch 2611 2024-10-21 15:59:26,485 INFO [train.py:561] (1/4) Epoch 2611, batch 0, global_batch_idx: 41760, batch size: 108, loss[dur_loss=0.2015, prior_loss=0.9746, diff_loss=0.3684, tot_loss=1.544, over 108.00 samples.], tot_loss[dur_loss=0.2015, prior_loss=0.9746, diff_loss=0.3684, tot_loss=1.544, over 108.00 samples.], 2024-10-21 15:59:40,569 INFO [train.py:561] (1/4) Epoch 2611, batch 10, global_batch_idx: 41770, batch size: 111, loss[dur_loss=0.2027, prior_loss=0.9748, diff_loss=0.3292, tot_loss=1.507, over 111.00 samples.], tot_loss[dur_loss=0.1962, prior_loss=0.9735, diff_loss=0.3778, tot_loss=1.548, over 1656.00 samples.], 2024-10-21 15:59:47,607 INFO [train.py:682] (1/4) Start epoch 2612 2024-10-21 16:00:01,582 INFO [train.py:561] (1/4) Epoch 2612, batch 4, global_batch_idx: 41780, batch size: 189, loss[dur_loss=0.1963, prior_loss=0.9739, diff_loss=0.3394, 
tot_loss=1.51, over 189.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.973, diff_loss=0.4227, tot_loss=1.59, over 937.00 samples.], 2024-10-21 16:00:16,308 INFO [train.py:561] (1/4) Epoch 2612, batch 14, global_batch_idx: 41790, batch size: 142, loss[dur_loss=0.2013, prior_loss=0.9736, diff_loss=0.267, tot_loss=1.442, over 142.00 samples.], tot_loss[dur_loss=0.1973, prior_loss=0.9736, diff_loss=0.3535, tot_loss=1.524, over 2210.00 samples.], 2024-10-21 16:00:17,744 INFO [train.py:682] (1/4) Start epoch 2613 2024-10-21 16:00:37,615 INFO [train.py:561] (1/4) Epoch 2613, batch 8, global_batch_idx: 41800, batch size: 170, loss[dur_loss=0.2016, prior_loss=0.9741, diff_loss=0.3236, tot_loss=1.499, over 170.00 samples.], tot_loss[dur_loss=0.1952, prior_loss=0.9735, diff_loss=0.3885, tot_loss=1.557, over 1432.00 samples.], 2024-10-21 16:00:47,639 INFO [train.py:682] (1/4) Start epoch 2614 2024-10-21 16:00:59,403 INFO [train.py:561] (1/4) Epoch 2614, batch 2, global_batch_idx: 41810, batch size: 203, loss[dur_loss=0.1959, prior_loss=0.9738, diff_loss=0.3523, tot_loss=1.522, over 203.00 samples.], tot_loss[dur_loss=0.1973, prior_loss=0.9739, diff_loss=0.3251, tot_loss=1.496, over 442.00 samples.], 2024-10-21 16:01:13,492 INFO [train.py:561] (1/4) Epoch 2614, batch 12, global_batch_idx: 41820, batch size: 152, loss[dur_loss=0.198, prior_loss=0.974, diff_loss=0.3087, tot_loss=1.481, over 152.00 samples.], tot_loss[dur_loss=0.1961, prior_loss=0.9736, diff_loss=0.373, tot_loss=1.543, over 1966.00 samples.], 2024-10-21 16:01:17,970 INFO [train.py:682] (1/4) Start epoch 2615 2024-10-21 16:01:35,040 INFO [train.py:561] (1/4) Epoch 2615, batch 6, global_batch_idx: 41830, batch size: 106, loss[dur_loss=0.1947, prior_loss=0.974, diff_loss=0.3127, tot_loss=1.481, over 106.00 samples.], tot_loss[dur_loss=0.194, prior_loss=0.9733, diff_loss=0.3964, tot_loss=1.564, over 1142.00 samples.], 2024-10-21 16:01:48,027 INFO [train.py:682] (1/4) Start epoch 2616 2024-10-21 16:01:56,951 INFO [train.py:561] (1/4) Epoch 2616, batch 0, global_batch_idx: 41840, batch size: 108, loss[dur_loss=0.2022, prior_loss=0.9744, diff_loss=0.3215, tot_loss=1.498, over 108.00 samples.], tot_loss[dur_loss=0.2022, prior_loss=0.9744, diff_loss=0.3215, tot_loss=1.498, over 108.00 samples.], 2024-10-21 16:02:11,137 INFO [train.py:561] (1/4) Epoch 2616, batch 10, global_batch_idx: 41850, batch size: 111, loss[dur_loss=0.1999, prior_loss=0.9746, diff_loss=0.3139, tot_loss=1.488, over 111.00 samples.], tot_loss[dur_loss=0.1956, prior_loss=0.9734, diff_loss=0.3687, tot_loss=1.538, over 1656.00 samples.], 2024-10-21 16:02:18,195 INFO [train.py:682] (1/4) Start epoch 2617 2024-10-21 16:02:32,110 INFO [train.py:561] (1/4) Epoch 2617, batch 4, global_batch_idx: 41860, batch size: 189, loss[dur_loss=0.1966, prior_loss=0.9736, diff_loss=0.3357, tot_loss=1.506, over 189.00 samples.], tot_loss[dur_loss=0.1944, prior_loss=0.9729, diff_loss=0.421, tot_loss=1.588, over 937.00 samples.], 2024-10-21 16:02:46,826 INFO [train.py:561] (1/4) Epoch 2617, batch 14, global_batch_idx: 41870, batch size: 142, loss[dur_loss=0.2011, prior_loss=0.9736, diff_loss=0.3013, tot_loss=1.476, over 142.00 samples.], tot_loss[dur_loss=0.1971, prior_loss=0.9735, diff_loss=0.3517, tot_loss=1.522, over 2210.00 samples.], 2024-10-21 16:02:48,270 INFO [train.py:682] (1/4) Start epoch 2618 2024-10-21 16:03:08,339 INFO [train.py:561] (1/4) Epoch 2618, batch 8, global_batch_idx: 41880, batch size: 170, loss[dur_loss=0.1997, prior_loss=0.974, diff_loss=0.3404, tot_loss=1.514, over 
170.00 samples.], tot_loss[dur_loss=0.194, prior_loss=0.9732, diff_loss=0.3712, tot_loss=1.538, over 1432.00 samples.], 2024-10-21 16:03:18,387 INFO [train.py:682] (1/4) Start epoch 2619 2024-10-21 16:03:30,157 INFO [train.py:561] (1/4) Epoch 2619, batch 2, global_batch_idx: 41890, batch size: 203, loss[dur_loss=0.1979, prior_loss=0.9735, diff_loss=0.299, tot_loss=1.47, over 203.00 samples.], tot_loss[dur_loss=0.1979, prior_loss=0.9737, diff_loss=0.2861, tot_loss=1.458, over 442.00 samples.], 2024-10-21 16:03:44,319 INFO [train.py:561] (1/4) Epoch 2619, batch 12, global_batch_idx: 41900, batch size: 152, loss[dur_loss=0.1949, prior_loss=0.9737, diff_loss=0.2796, tot_loss=1.448, over 152.00 samples.], tot_loss[dur_loss=0.1964, prior_loss=0.9734, diff_loss=0.3438, tot_loss=1.514, over 1966.00 samples.], 2024-10-21 16:03:48,768 INFO [train.py:682] (1/4) Start epoch 2620 2024-10-21 16:04:06,280 INFO [train.py:561] (1/4) Epoch 2620, batch 6, global_batch_idx: 41910, batch size: 106, loss[dur_loss=0.199, prior_loss=0.9738, diff_loss=0.2895, tot_loss=1.462, over 106.00 samples.], tot_loss[dur_loss=0.194, prior_loss=0.9731, diff_loss=0.3984, tot_loss=1.565, over 1142.00 samples.], 2024-10-21 16:04:19,271 INFO [train.py:682] (1/4) Start epoch 2621 2024-10-21 16:04:28,040 INFO [train.py:561] (1/4) Epoch 2621, batch 0, global_batch_idx: 41920, batch size: 108, loss[dur_loss=0.1986, prior_loss=0.9744, diff_loss=0.301, tot_loss=1.474, over 108.00 samples.], tot_loss[dur_loss=0.1986, prior_loss=0.9744, diff_loss=0.301, tot_loss=1.474, over 108.00 samples.], 2024-10-21 16:04:42,214 INFO [train.py:561] (1/4) Epoch 2621, batch 10, global_batch_idx: 41930, batch size: 111, loss[dur_loss=0.2, prior_loss=0.9746, diff_loss=0.2806, tot_loss=1.455, over 111.00 samples.], tot_loss[dur_loss=0.1955, prior_loss=0.9734, diff_loss=0.3636, tot_loss=1.533, over 1656.00 samples.], 2024-10-21 16:04:49,324 INFO [train.py:682] (1/4) Start epoch 2622 2024-10-21 16:05:03,493 INFO [train.py:561] (1/4) Epoch 2622, batch 4, global_batch_idx: 41940, batch size: 189, loss[dur_loss=0.1958, prior_loss=0.9739, diff_loss=0.3279, tot_loss=1.498, over 189.00 samples.], tot_loss[dur_loss=0.1923, prior_loss=0.9729, diff_loss=0.4119, tot_loss=1.577, over 937.00 samples.], 2024-10-21 16:05:18,212 INFO [train.py:561] (1/4) Epoch 2622, batch 14, global_batch_idx: 41950, batch size: 142, loss[dur_loss=0.2001, prior_loss=0.9738, diff_loss=0.2903, tot_loss=1.464, over 142.00 samples.], tot_loss[dur_loss=0.1956, prior_loss=0.9735, diff_loss=0.3431, tot_loss=1.512, over 2210.00 samples.], 2024-10-21 16:05:19,635 INFO [train.py:682] (1/4) Start epoch 2623 2024-10-21 16:05:39,663 INFO [train.py:561] (1/4) Epoch 2623, batch 8, global_batch_idx: 41960, batch size: 170, loss[dur_loss=0.2004, prior_loss=0.9742, diff_loss=0.3068, tot_loss=1.481, over 170.00 samples.], tot_loss[dur_loss=0.1958, prior_loss=0.9735, diff_loss=0.3735, tot_loss=1.543, over 1432.00 samples.], 2024-10-21 16:05:49,728 INFO [train.py:682] (1/4) Start epoch 2624 2024-10-21 16:06:01,214 INFO [train.py:561] (1/4) Epoch 2624, batch 2, global_batch_idx: 41970, batch size: 203, loss[dur_loss=0.1966, prior_loss=0.9736, diff_loss=0.3392, tot_loss=1.509, over 203.00 samples.], tot_loss[dur_loss=0.1968, prior_loss=0.9738, diff_loss=0.3084, tot_loss=1.479, over 442.00 samples.], 2024-10-21 16:06:15,306 INFO [train.py:561] (1/4) Epoch 2624, batch 12, global_batch_idx: 41980, batch size: 152, loss[dur_loss=0.2033, prior_loss=0.9741, diff_loss=0.2821, tot_loss=1.459, over 152.00 samples.], 
tot_loss[dur_loss=0.1961, prior_loss=0.9735, diff_loss=0.3541, tot_loss=1.524, over 1966.00 samples.], 2024-10-21 16:06:19,715 INFO [train.py:682] (1/4) Start epoch 2625 2024-10-21 16:06:36,819 INFO [train.py:561] (1/4) Epoch 2625, batch 6, global_batch_idx: 41990, batch size: 106, loss[dur_loss=0.1969, prior_loss=0.974, diff_loss=0.3375, tot_loss=1.508, over 106.00 samples.], tot_loss[dur_loss=0.1941, prior_loss=0.9731, diff_loss=0.4041, tot_loss=1.571, over 1142.00 samples.], 2024-10-21 16:06:49,766 INFO [train.py:682] (1/4) Start epoch 2626 2024-10-21 16:06:58,502 INFO [train.py:561] (1/4) Epoch 2626, batch 0, global_batch_idx: 42000, batch size: 108, loss[dur_loss=0.1998, prior_loss=0.9743, diff_loss=0.2594, tot_loss=1.433, over 108.00 samples.], tot_loss[dur_loss=0.1998, prior_loss=0.9743, diff_loss=0.2594, tot_loss=1.433, over 108.00 samples.], 2024-10-21 16:06:59,897 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 16:07:30,233 INFO [train.py:589] (1/4) Epoch 2626, validation: dur_loss=0.4567, prior_loss=1.036, diff_loss=0.3563, tot_loss=1.849, over 100.00 samples. 2024-10-21 16:07:30,234 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 16:07:43,097 INFO [train.py:561] (1/4) Epoch 2626, batch 10, global_batch_idx: 42010, batch size: 111, loss[dur_loss=0.2005, prior_loss=0.9746, diff_loss=0.2879, tot_loss=1.463, over 111.00 samples.], tot_loss[dur_loss=0.1954, prior_loss=0.9734, diff_loss=0.3564, tot_loss=1.525, over 1656.00 samples.], 2024-10-21 16:07:50,095 INFO [train.py:682] (1/4) Start epoch 2627 2024-10-21 16:08:04,257 INFO [train.py:561] (1/4) Epoch 2627, batch 4, global_batch_idx: 42020, batch size: 189, loss[dur_loss=0.1977, prior_loss=0.974, diff_loss=0.3216, tot_loss=1.493, over 189.00 samples.], tot_loss[dur_loss=0.1931, prior_loss=0.973, diff_loss=0.4104, tot_loss=1.576, over 937.00 samples.], 2024-10-21 16:08:18,946 INFO [train.py:561] (1/4) Epoch 2627, batch 14, global_batch_idx: 42030, batch size: 142, loss[dur_loss=0.202, prior_loss=0.9736, diff_loss=0.3113, tot_loss=1.487, over 142.00 samples.], tot_loss[dur_loss=0.1964, prior_loss=0.9736, diff_loss=0.3533, tot_loss=1.523, over 2210.00 samples.], 2024-10-21 16:08:20,357 INFO [train.py:682] (1/4) Start epoch 2628 2024-10-21 16:08:40,955 INFO [train.py:561] (1/4) Epoch 2628, batch 8, global_batch_idx: 42040, batch size: 170, loss[dur_loss=0.2018, prior_loss=0.9741, diff_loss=0.3179, tot_loss=1.494, over 170.00 samples.], tot_loss[dur_loss=0.1966, prior_loss=0.9735, diff_loss=0.3773, tot_loss=1.547, over 1432.00 samples.], 2024-10-21 16:08:50,991 INFO [train.py:682] (1/4) Start epoch 2629 2024-10-21 16:09:02,437 INFO [train.py:561] (1/4) Epoch 2629, batch 2, global_batch_idx: 42050, batch size: 203, loss[dur_loss=0.2006, prior_loss=0.9738, diff_loss=0.3472, tot_loss=1.522, over 203.00 samples.], tot_loss[dur_loss=0.2005, prior_loss=0.9739, diff_loss=0.3414, tot_loss=1.516, over 442.00 samples.], 2024-10-21 16:09:16,533 INFO [train.py:561] (1/4) Epoch 2629, batch 12, global_batch_idx: 42060, batch size: 152, loss[dur_loss=0.198, prior_loss=0.9739, diff_loss=0.2771, tot_loss=1.449, over 152.00 samples.], tot_loss[dur_loss=0.1967, prior_loss=0.9736, diff_loss=0.3659, tot_loss=1.536, over 1966.00 samples.], 2024-10-21 16:09:20,893 INFO [train.py:682] (1/4) Start epoch 2630 2024-10-21 16:09:38,372 INFO [train.py:561] (1/4) Epoch 2630, batch 6, global_batch_idx: 42070, batch size: 106, loss[dur_loss=0.1987, prior_loss=0.9741, diff_loss=0.2863, tot_loss=1.459, over 106.00 
2024-10-21 16:10:00,002 INFO [train.py:561] (1/4) Epoch 2631, batch 0, global_batch_idx: 42080, batch size: 108, loss[dur_loss=0.1995, prior_loss=0.9744, diff_loss=0.2913, tot_loss=1.465, over 108.00 samples.], tot_loss[dur_loss=0.1995, prior_loss=0.9744, diff_loss=0.2913, tot_loss=1.465, over 108.00 samples.], 2024-10-21 16:10:14,219 INFO [train.py:561] (1/4) Epoch 2631, batch 10, global_batch_idx: 42090, batch size: 111, loss[dur_loss=0.1996, prior_loss=0.9748, diff_loss=0.3029, tot_loss=1.477, over 111.00 samples.], tot_loss[dur_loss=0.1952, prior_loss=0.9735, diff_loss=0.3641, tot_loss=1.533, over 1656.00 samples.], 2024-10-21 16:10:21,251 INFO [train.py:682] (1/4) Start epoch 2632 2024-10-21 16:10:35,003 INFO [train.py:561] (1/4) Epoch 2632, batch 4, global_batch_idx: 42100, batch size: 189, loss[dur_loss=0.2007, prior_loss=0.9739, diff_loss=0.3368, tot_loss=1.511, over 189.00 samples.], tot_loss[dur_loss=0.1943, prior_loss=0.9729, diff_loss=0.4186, tot_loss=1.586, over 937.00 samples.], 2024-10-21 16:10:49,748 INFO [train.py:561] (1/4) Epoch 2632, batch 14, global_batch_idx: 42110, batch size: 142, loss[dur_loss=0.1996, prior_loss=0.9736, diff_loss=0.312, tot_loss=1.485, over 142.00 samples.], tot_loss[dur_loss=0.1972, prior_loss=0.9735, diff_loss=0.3557, tot_loss=1.526, over 2210.00 samples.], 2024-10-21 16:10:51,174 INFO [train.py:682] (1/4) Start epoch 2633 2024-10-21 16:11:11,051 INFO [train.py:561] (1/4) Epoch 2633, batch 8, global_batch_idx: 42120, batch size: 170, loss[dur_loss=0.1985, prior_loss=0.974, diff_loss=0.3102, tot_loss=1.483, over 170.00 samples.], tot_loss[dur_loss=0.195, prior_loss=0.9734, diff_loss=0.3827, tot_loss=1.551, over 1432.00 samples.], 2024-10-21 16:11:21,128 INFO [train.py:682] (1/4) Start epoch 2634 2024-10-21 16:11:32,931 INFO [train.py:561] (1/4) Epoch 2634, batch 2, global_batch_idx: 42130, batch size: 203, loss[dur_loss=0.1998, prior_loss=0.9737, diff_loss=0.3255, tot_loss=1.499, over 203.00 samples.], tot_loss[dur_loss=0.199, prior_loss=0.9738, diff_loss=0.3064, tot_loss=1.479, over 442.00 samples.], 2024-10-21 16:11:47,044 INFO [train.py:561] (1/4) Epoch 2634, batch 12, global_batch_idx: 42140, batch size: 152, loss[dur_loss=0.1965, prior_loss=0.9737, diff_loss=0.3142, tot_loss=1.484, over 152.00 samples.], tot_loss[dur_loss=0.1961, prior_loss=0.9734, diff_loss=0.3619, tot_loss=1.531, over 1966.00 samples.], 2024-10-21 16:11:51,475 INFO [train.py:682] (1/4) Start epoch 2635 2024-10-21 16:12:08,441 INFO [train.py:561] (1/4) Epoch 2635, batch 6, global_batch_idx: 42150, batch size: 106, loss[dur_loss=0.1979, prior_loss=0.9739, diff_loss=0.2992, tot_loss=1.471, over 106.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.973, diff_loss=0.3933, tot_loss=1.56, over 1142.00 samples.], 2024-10-21 16:12:21,298 INFO [train.py:682] (1/4) Start epoch 2636 2024-10-21 16:12:30,435 INFO [train.py:561] (1/4) Epoch 2636, batch 0, global_batch_idx: 42160, batch size: 108, loss[dur_loss=0.201, prior_loss=0.9744, diff_loss=0.3083, tot_loss=1.484, over 108.00 samples.], tot_loss[dur_loss=0.201, prior_loss=0.9744, diff_loss=0.3083, tot_loss=1.484, over 108.00 samples.], 2024-10-21 16:12:44,595 INFO [train.py:561] (1/4) Epoch 2636, batch 10, global_batch_idx: 42170, batch size: 111, loss[dur_loss=0.1996, prior_loss=0.9748, diff_loss=0.3107, tot_loss=1.485, over 111.00 samples.],
tot_loss[dur_loss=0.1948, prior_loss=0.9734, diff_loss=0.3704, tot_loss=1.539, over 1656.00 samples.], 2024-10-21 16:12:51,686 INFO [train.py:682] (1/4) Start epoch 2637 2024-10-21 16:13:05,735 INFO [train.py:561] (1/4) Epoch 2637, batch 4, global_batch_idx: 42180, batch size: 189, loss[dur_loss=0.1978, prior_loss=0.9738, diff_loss=0.3279, tot_loss=1.5, over 189.00 samples.], tot_loss[dur_loss=0.1932, prior_loss=0.9729, diff_loss=0.4124, tot_loss=1.578, over 937.00 samples.], 2024-10-21 16:13:20,576 INFO [train.py:561] (1/4) Epoch 2637, batch 14, global_batch_idx: 42190, batch size: 142, loss[dur_loss=0.1997, prior_loss=0.9736, diff_loss=0.2889, tot_loss=1.462, over 142.00 samples.], tot_loss[dur_loss=0.1964, prior_loss=0.9736, diff_loss=0.3506, tot_loss=1.521, over 2210.00 samples.], 2024-10-21 16:13:21,990 INFO [train.py:682] (1/4) Start epoch 2638 2024-10-21 16:13:41,974 INFO [train.py:561] (1/4) Epoch 2638, batch 8, global_batch_idx: 42200, batch size: 170, loss[dur_loss=0.1983, prior_loss=0.974, diff_loss=0.3282, tot_loss=1.501, over 170.00 samples.], tot_loss[dur_loss=0.1951, prior_loss=0.9733, diff_loss=0.3798, tot_loss=1.548, over 1432.00 samples.], 2024-10-21 16:13:51,930 INFO [train.py:682] (1/4) Start epoch 2639 2024-10-21 16:14:03,302 INFO [train.py:561] (1/4) Epoch 2639, batch 2, global_batch_idx: 42210, batch size: 203, loss[dur_loss=0.1974, prior_loss=0.9736, diff_loss=0.3328, tot_loss=1.504, over 203.00 samples.], tot_loss[dur_loss=0.1968, prior_loss=0.9737, diff_loss=0.304, tot_loss=1.474, over 442.00 samples.], 2024-10-21 16:14:17,414 INFO [train.py:561] (1/4) Epoch 2639, batch 12, global_batch_idx: 42220, batch size: 152, loss[dur_loss=0.1973, prior_loss=0.9738, diff_loss=0.3171, tot_loss=1.488, over 152.00 samples.], tot_loss[dur_loss=0.1952, prior_loss=0.9734, diff_loss=0.3552, tot_loss=1.524, over 1966.00 samples.], 2024-10-21 16:14:21,804 INFO [train.py:682] (1/4) Start epoch 2640 2024-10-21 16:14:38,841 INFO [train.py:561] (1/4) Epoch 2640, batch 6, global_batch_idx: 42230, batch size: 106, loss[dur_loss=0.1944, prior_loss=0.9738, diff_loss=0.2668, tot_loss=1.435, over 106.00 samples.], tot_loss[dur_loss=0.1945, prior_loss=0.9731, diff_loss=0.4021, tot_loss=1.57, over 1142.00 samples.], 2024-10-21 16:14:51,806 INFO [train.py:682] (1/4) Start epoch 2641 2024-10-21 16:15:00,344 INFO [train.py:561] (1/4) Epoch 2641, batch 0, global_batch_idx: 42240, batch size: 108, loss[dur_loss=0.2006, prior_loss=0.9743, diff_loss=0.2457, tot_loss=1.421, over 108.00 samples.], tot_loss[dur_loss=0.2006, prior_loss=0.9743, diff_loss=0.2457, tot_loss=1.421, over 108.00 samples.], 2024-10-21 16:15:14,437 INFO [train.py:561] (1/4) Epoch 2641, batch 10, global_batch_idx: 42250, batch size: 111, loss[dur_loss=0.2002, prior_loss=0.9748, diff_loss=0.2945, tot_loss=1.47, over 111.00 samples.], tot_loss[dur_loss=0.1946, prior_loss=0.9735, diff_loss=0.3568, tot_loss=1.525, over 1656.00 samples.], 2024-10-21 16:15:21,454 INFO [train.py:682] (1/4) Start epoch 2642 2024-10-21 16:15:35,590 INFO [train.py:561] (1/4) Epoch 2642, batch 4, global_batch_idx: 42260, batch size: 189, loss[dur_loss=0.1962, prior_loss=0.9739, diff_loss=0.3095, tot_loss=1.48, over 189.00 samples.], tot_loss[dur_loss=0.1931, prior_loss=0.9729, diff_loss=0.4188, tot_loss=1.585, over 937.00 samples.], 2024-10-21 16:15:50,406 INFO [train.py:561] (1/4) Epoch 2642, batch 14, global_batch_idx: 42270, batch size: 142, loss[dur_loss=0.2049, prior_loss=0.9736, diff_loss=0.3001, tot_loss=1.479, over 142.00 samples.], 
tot_loss[dur_loss=0.1966, prior_loss=0.9735, diff_loss=0.3481, tot_loss=1.518, over 2210.00 samples.], 2024-10-21 16:15:51,823 INFO [train.py:682] (1/4) Start epoch 2643 2024-10-21 16:16:11,619 INFO [train.py:561] (1/4) Epoch 2643, batch 8, global_batch_idx: 42280, batch size: 170, loss[dur_loss=0.2021, prior_loss=0.9742, diff_loss=0.3347, tot_loss=1.511, over 170.00 samples.], tot_loss[dur_loss=0.1966, prior_loss=0.9733, diff_loss=0.3763, tot_loss=1.546, over 1432.00 samples.], 2024-10-21 16:16:21,641 INFO [train.py:682] (1/4) Start epoch 2644 2024-10-21 16:16:33,000 INFO [train.py:561] (1/4) Epoch 2644, batch 2, global_batch_idx: 42290, batch size: 203, loss[dur_loss=0.1986, prior_loss=0.9738, diff_loss=0.308, tot_loss=1.48, over 203.00 samples.], tot_loss[dur_loss=0.1977, prior_loss=0.9738, diff_loss=0.3019, tot_loss=1.473, over 442.00 samples.], 2024-10-21 16:16:47,243 INFO [train.py:561] (1/4) Epoch 2644, batch 12, global_batch_idx: 42300, batch size: 152, loss[dur_loss=0.1941, prior_loss=0.9737, diff_loss=0.3073, tot_loss=1.475, over 152.00 samples.], tot_loss[dur_loss=0.1959, prior_loss=0.9736, diff_loss=0.3564, tot_loss=1.526, over 1966.00 samples.], 2024-10-21 16:16:51,683 INFO [train.py:682] (1/4) Start epoch 2645 2024-10-21 16:17:09,240 INFO [train.py:561] (1/4) Epoch 2645, batch 6, global_batch_idx: 42310, batch size: 106, loss[dur_loss=0.1972, prior_loss=0.9738, diff_loss=0.2637, tot_loss=1.435, over 106.00 samples.], tot_loss[dur_loss=0.1927, prior_loss=0.9731, diff_loss=0.4135, tot_loss=1.579, over 1142.00 samples.], 2024-10-21 16:17:22,088 INFO [train.py:682] (1/4) Start epoch 2646 2024-10-21 16:17:30,938 INFO [train.py:561] (1/4) Epoch 2646, batch 0, global_batch_idx: 42320, batch size: 108, loss[dur_loss=0.2012, prior_loss=0.9743, diff_loss=0.2728, tot_loss=1.448, over 108.00 samples.], tot_loss[dur_loss=0.2012, prior_loss=0.9743, diff_loss=0.2728, tot_loss=1.448, over 108.00 samples.], 2024-10-21 16:17:45,055 INFO [train.py:561] (1/4) Epoch 2646, batch 10, global_batch_idx: 42330, batch size: 111, loss[dur_loss=0.1997, prior_loss=0.9752, diff_loss=0.3101, tot_loss=1.485, over 111.00 samples.], tot_loss[dur_loss=0.1955, prior_loss=0.9735, diff_loss=0.3653, tot_loss=1.534, over 1656.00 samples.], 2024-10-21 16:17:52,123 INFO [train.py:682] (1/4) Start epoch 2647 2024-10-21 16:18:06,025 INFO [train.py:561] (1/4) Epoch 2647, batch 4, global_batch_idx: 42340, batch size: 189, loss[dur_loss=0.1943, prior_loss=0.9737, diff_loss=0.3286, tot_loss=1.497, over 189.00 samples.], tot_loss[dur_loss=0.1922, prior_loss=0.9729, diff_loss=0.4253, tot_loss=1.59, over 937.00 samples.], 2024-10-21 16:18:20,807 INFO [train.py:561] (1/4) Epoch 2647, batch 14, global_batch_idx: 42350, batch size: 142, loss[dur_loss=0.2027, prior_loss=0.9735, diff_loss=0.2866, tot_loss=1.463, over 142.00 samples.], tot_loss[dur_loss=0.1961, prior_loss=0.9735, diff_loss=0.3616, tot_loss=1.531, over 2210.00 samples.], 2024-10-21 16:18:22,218 INFO [train.py:682] (1/4) Start epoch 2648 2024-10-21 16:18:42,233 INFO [train.py:561] (1/4) Epoch 2648, batch 8, global_batch_idx: 42360, batch size: 170, loss[dur_loss=0.1978, prior_loss=0.9737, diff_loss=0.3411, tot_loss=1.513, over 170.00 samples.], tot_loss[dur_loss=0.1942, prior_loss=0.9732, diff_loss=0.3796, tot_loss=1.547, over 1432.00 samples.], 2024-10-21 16:18:52,246 INFO [train.py:682] (1/4) Start epoch 2649 2024-10-21 16:19:03,458 INFO [train.py:561] (1/4) Epoch 2649, batch 2, global_batch_idx: 42370, batch size: 203, loss[dur_loss=0.1964, prior_loss=0.9736, 
diff_loss=0.3308, tot_loss=1.501, over 203.00 samples.], tot_loss[dur_loss=0.1972, prior_loss=0.9737, diff_loss=0.3197, tot_loss=1.491, over 442.00 samples.], 2024-10-21 16:19:17,655 INFO [train.py:561] (1/4) Epoch 2649, batch 12, global_batch_idx: 42380, batch size: 152, loss[dur_loss=0.1959, prior_loss=0.9739, diff_loss=0.2938, tot_loss=1.464, over 152.00 samples.], tot_loss[dur_loss=0.1954, prior_loss=0.9735, diff_loss=0.3595, tot_loss=1.528, over 1966.00 samples.], 2024-10-21 16:19:22,065 INFO [train.py:682] (1/4) Start epoch 2650 2024-10-21 16:19:39,423 INFO [train.py:561] (1/4) Epoch 2650, batch 6, global_batch_idx: 42390, batch size: 106, loss[dur_loss=0.1983, prior_loss=0.9738, diff_loss=0.2962, tot_loss=1.468, over 106.00 samples.], tot_loss[dur_loss=0.1929, prior_loss=0.973, diff_loss=0.3989, tot_loss=1.565, over 1142.00 samples.], 2024-10-21 16:19:52,383 INFO [train.py:682] (1/4) Start epoch 2651 2024-10-21 16:20:01,125 INFO [train.py:561] (1/4) Epoch 2651, batch 0, global_batch_idx: 42400, batch size: 108, loss[dur_loss=0.2012, prior_loss=0.9744, diff_loss=0.2971, tot_loss=1.473, over 108.00 samples.], tot_loss[dur_loss=0.2012, prior_loss=0.9744, diff_loss=0.2971, tot_loss=1.473, over 108.00 samples.], 2024-10-21 16:20:15,376 INFO [train.py:561] (1/4) Epoch 2651, batch 10, global_batch_idx: 42410, batch size: 111, loss[dur_loss=0.2001, prior_loss=0.9748, diff_loss=0.3088, tot_loss=1.484, over 111.00 samples.], tot_loss[dur_loss=0.1952, prior_loss=0.9734, diff_loss=0.3689, tot_loss=1.538, over 1656.00 samples.], 2024-10-21 16:20:22,384 INFO [train.py:682] (1/4) Start epoch 2652 2024-10-21 16:20:36,076 INFO [train.py:561] (1/4) Epoch 2652, batch 4, global_batch_idx: 42420, batch size: 189, loss[dur_loss=0.198, prior_loss=0.9738, diff_loss=0.3343, tot_loss=1.506, over 189.00 samples.], tot_loss[dur_loss=0.1917, prior_loss=0.9728, diff_loss=0.4214, tot_loss=1.586, over 937.00 samples.], 2024-10-21 16:20:50,840 INFO [train.py:561] (1/4) Epoch 2652, batch 14, global_batch_idx: 42430, batch size: 142, loss[dur_loss=0.201, prior_loss=0.9736, diff_loss=0.2634, tot_loss=1.438, over 142.00 samples.], tot_loss[dur_loss=0.1955, prior_loss=0.9735, diff_loss=0.3552, tot_loss=1.524, over 2210.00 samples.], 2024-10-21 16:20:52,245 INFO [train.py:682] (1/4) Start epoch 2653 2024-10-21 16:21:12,246 INFO [train.py:561] (1/4) Epoch 2653, batch 8, global_batch_idx: 42440, batch size: 170, loss[dur_loss=0.1969, prior_loss=0.9739, diff_loss=0.3669, tot_loss=1.538, over 170.00 samples.], tot_loss[dur_loss=0.1936, prior_loss=0.9732, diff_loss=0.3839, tot_loss=1.551, over 1432.00 samples.], 2024-10-21 16:21:22,263 INFO [train.py:682] (1/4) Start epoch 2654 2024-10-21 16:21:33,567 INFO [train.py:561] (1/4) Epoch 2654, batch 2, global_batch_idx: 42450, batch size: 203, loss[dur_loss=0.1961, prior_loss=0.9737, diff_loss=0.3067, tot_loss=1.476, over 203.00 samples.], tot_loss[dur_loss=0.1971, prior_loss=0.9737, diff_loss=0.2978, tot_loss=1.469, over 442.00 samples.], 2024-10-21 16:21:47,799 INFO [train.py:561] (1/4) Epoch 2654, batch 12, global_batch_idx: 42460, batch size: 152, loss[dur_loss=0.196, prior_loss=0.9739, diff_loss=0.321, tot_loss=1.491, over 152.00 samples.], tot_loss[dur_loss=0.1949, prior_loss=0.9734, diff_loss=0.3587, tot_loss=1.527, over 1966.00 samples.], 2024-10-21 16:21:52,237 INFO [train.py:682] (1/4) Start epoch 2655 2024-10-21 16:22:09,059 INFO [train.py:561] (1/4) Epoch 2655, batch 6, global_batch_idx: 42470, batch size: 106, loss[dur_loss=0.1959, prior_loss=0.9738, diff_loss=0.2988, 
tot_loss=1.469, over 106.00 samples.], tot_loss[dur_loss=0.1942, prior_loss=0.973, diff_loss=0.4135, tot_loss=1.581, over 1142.00 samples.], 2024-10-21 16:22:22,037 INFO [train.py:682] (1/4) Start epoch 2656 2024-10-21 16:22:31,087 INFO [train.py:561] (1/4) Epoch 2656, batch 0, global_batch_idx: 42480, batch size: 108, loss[dur_loss=0.1996, prior_loss=0.9742, diff_loss=0.3228, tot_loss=1.497, over 108.00 samples.], tot_loss[dur_loss=0.1996, prior_loss=0.9742, diff_loss=0.3228, tot_loss=1.497, over 108.00 samples.], 2024-10-21 16:22:45,165 INFO [train.py:561] (1/4) Epoch 2656, batch 10, global_batch_idx: 42490, batch size: 111, loss[dur_loss=0.1992, prior_loss=0.9747, diff_loss=0.2716, tot_loss=1.445, over 111.00 samples.], tot_loss[dur_loss=0.194, prior_loss=0.9733, diff_loss=0.3619, tot_loss=1.529, over 1656.00 samples.], 2024-10-21 16:22:52,192 INFO [train.py:682] (1/4) Start epoch 2657 2024-10-21 16:23:06,157 INFO [train.py:561] (1/4) Epoch 2657, batch 4, global_batch_idx: 42500, batch size: 189, loss[dur_loss=0.1962, prior_loss=0.9736, diff_loss=0.317, tot_loss=1.487, over 189.00 samples.], tot_loss[dur_loss=0.1934, prior_loss=0.9728, diff_loss=0.4197, tot_loss=1.586, over 937.00 samples.], 2024-10-21 16:23:20,981 INFO [train.py:561] (1/4) Epoch 2657, batch 14, global_batch_idx: 42510, batch size: 142, loss[dur_loss=0.2018, prior_loss=0.9733, diff_loss=0.3496, tot_loss=1.525, over 142.00 samples.], tot_loss[dur_loss=0.1958, prior_loss=0.9734, diff_loss=0.3541, tot_loss=1.523, over 2210.00 samples.], 2024-10-21 16:23:22,398 INFO [train.py:682] (1/4) Start epoch 2658 2024-10-21 16:23:42,240 INFO [train.py:561] (1/4) Epoch 2658, batch 8, global_batch_idx: 42520, batch size: 170, loss[dur_loss=0.2005, prior_loss=0.974, diff_loss=0.3241, tot_loss=1.499, over 170.00 samples.], tot_loss[dur_loss=0.1941, prior_loss=0.9732, diff_loss=0.3794, tot_loss=1.547, over 1432.00 samples.], 2024-10-21 16:23:52,310 INFO [train.py:682] (1/4) Start epoch 2659 2024-10-21 16:24:03,610 INFO [train.py:561] (1/4) Epoch 2659, batch 2, global_batch_idx: 42530, batch size: 203, loss[dur_loss=0.1968, prior_loss=0.9736, diff_loss=0.3207, tot_loss=1.491, over 203.00 samples.], tot_loss[dur_loss=0.1976, prior_loss=0.9738, diff_loss=0.3169, tot_loss=1.488, over 442.00 samples.], 2024-10-21 16:24:17,729 INFO [train.py:561] (1/4) Epoch 2659, batch 12, global_batch_idx: 42540, batch size: 152, loss[dur_loss=0.1985, prior_loss=0.9738, diff_loss=0.2617, tot_loss=1.434, over 152.00 samples.], tot_loss[dur_loss=0.1958, prior_loss=0.9735, diff_loss=0.3498, tot_loss=1.519, over 1966.00 samples.], 2024-10-21 16:24:22,149 INFO [train.py:682] (1/4) Start epoch 2660 2024-10-21 16:24:39,136 INFO [train.py:561] (1/4) Epoch 2660, batch 6, global_batch_idx: 42550, batch size: 106, loss[dur_loss=0.1948, prior_loss=0.9739, diff_loss=0.2674, tot_loss=1.436, over 106.00 samples.], tot_loss[dur_loss=0.1933, prior_loss=0.9731, diff_loss=0.3925, tot_loss=1.559, over 1142.00 samples.], 2024-10-21 16:24:52,110 INFO [train.py:682] (1/4) Start epoch 2661 2024-10-21 16:25:00,797 INFO [train.py:561] (1/4) Epoch 2661, batch 0, global_batch_idx: 42560, batch size: 108, loss[dur_loss=0.2014, prior_loss=0.9741, diff_loss=0.3293, tot_loss=1.505, over 108.00 samples.], tot_loss[dur_loss=0.2014, prior_loss=0.9741, diff_loss=0.3293, tot_loss=1.505, over 108.00 samples.], 2024-10-21 16:25:14,872 INFO [train.py:561] (1/4) Epoch 2661, batch 10, global_batch_idx: 42570, batch size: 111, loss[dur_loss=0.1983, prior_loss=0.9746, diff_loss=0.3139, tot_loss=1.487, 
over 111.00 samples.], tot_loss[dur_loss=0.1948, prior_loss=0.9734, diff_loss=0.3719, tot_loss=1.54, over 1656.00 samples.], 2024-10-21 16:25:21,933 INFO [train.py:682] (1/4) Start epoch 2662 2024-10-21 16:25:35,870 INFO [train.py:561] (1/4) Epoch 2662, batch 4, global_batch_idx: 42580, batch size: 189, loss[dur_loss=0.1975, prior_loss=0.9739, diff_loss=0.3304, tot_loss=1.502, over 189.00 samples.], tot_loss[dur_loss=0.1932, prior_loss=0.9729, diff_loss=0.4312, tot_loss=1.597, over 937.00 samples.], 2024-10-21 16:25:50,751 INFO [train.py:561] (1/4) Epoch 2662, batch 14, global_batch_idx: 42590, batch size: 142, loss[dur_loss=0.1967, prior_loss=0.9735, diff_loss=0.2684, tot_loss=1.439, over 142.00 samples.], tot_loss[dur_loss=0.1957, prior_loss=0.9735, diff_loss=0.3562, tot_loss=1.525, over 2210.00 samples.], 2024-10-21 16:25:52,160 INFO [train.py:682] (1/4) Start epoch 2663 2024-10-21 16:26:11,828 INFO [train.py:561] (1/4) Epoch 2663, batch 8, global_batch_idx: 42600, batch size: 170, loss[dur_loss=0.1981, prior_loss=0.9739, diff_loss=0.2768, tot_loss=1.449, over 170.00 samples.], tot_loss[dur_loss=0.1938, prior_loss=0.9733, diff_loss=0.3772, tot_loss=1.544, over 1432.00 samples.], 2024-10-21 16:26:21,930 INFO [train.py:682] (1/4) Start epoch 2664 2024-10-21 16:26:33,485 INFO [train.py:561] (1/4) Epoch 2664, batch 2, global_batch_idx: 42610, batch size: 203, loss[dur_loss=0.1984, prior_loss=0.9737, diff_loss=0.354, tot_loss=1.526, over 203.00 samples.], tot_loss[dur_loss=0.1968, prior_loss=0.9738, diff_loss=0.3127, tot_loss=1.483, over 442.00 samples.], 2024-10-21 16:26:47,713 INFO [train.py:561] (1/4) Epoch 2664, batch 12, global_batch_idx: 42620, batch size: 152, loss[dur_loss=0.1952, prior_loss=0.9739, diff_loss=0.3221, tot_loss=1.491, over 152.00 samples.], tot_loss[dur_loss=0.1947, prior_loss=0.9735, diff_loss=0.3591, tot_loss=1.527, over 1966.00 samples.], 2024-10-21 16:26:52,167 INFO [train.py:682] (1/4) Start epoch 2665 2024-10-21 16:27:09,236 INFO [train.py:561] (1/4) Epoch 2665, batch 6, global_batch_idx: 42630, batch size: 106, loss[dur_loss=0.1985, prior_loss=0.9739, diff_loss=0.3271, tot_loss=1.499, over 106.00 samples.], tot_loss[dur_loss=0.1934, prior_loss=0.973, diff_loss=0.4116, tot_loss=1.578, over 1142.00 samples.], 2024-10-21 16:27:22,194 INFO [train.py:682] (1/4) Start epoch 2666 2024-10-21 16:27:30,812 INFO [train.py:561] (1/4) Epoch 2666, batch 0, global_batch_idx: 42640, batch size: 108, loss[dur_loss=0.2024, prior_loss=0.9743, diff_loss=0.3199, tot_loss=1.497, over 108.00 samples.], tot_loss[dur_loss=0.2024, prior_loss=0.9743, diff_loss=0.3199, tot_loss=1.497, over 108.00 samples.], 2024-10-21 16:27:45,001 INFO [train.py:561] (1/4) Epoch 2666, batch 10, global_batch_idx: 42650, batch size: 111, loss[dur_loss=0.1945, prior_loss=0.9745, diff_loss=0.2824, tot_loss=1.451, over 111.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.9733, diff_loss=0.3772, tot_loss=1.544, over 1656.00 samples.], 2024-10-21 16:27:52,062 INFO [train.py:682] (1/4) Start epoch 2667 2024-10-21 16:28:06,005 INFO [train.py:561] (1/4) Epoch 2667, batch 4, global_batch_idx: 42660, batch size: 189, loss[dur_loss=0.1937, prior_loss=0.9736, diff_loss=0.3066, tot_loss=1.474, over 189.00 samples.], tot_loss[dur_loss=0.1911, prior_loss=0.9728, diff_loss=0.4042, tot_loss=1.568, over 937.00 samples.], 2024-10-21 16:28:20,832 INFO [train.py:561] (1/4) Epoch 2667, batch 14, global_batch_idx: 42670, batch size: 142, loss[dur_loss=0.1972, prior_loss=0.9734, diff_loss=0.2615, tot_loss=1.432, over 142.00 
samples.], tot_loss[dur_loss=0.1948, prior_loss=0.9734, diff_loss=0.3456, tot_loss=1.514, over 2210.00 samples.], 2024-10-21 16:28:22,261 INFO [train.py:682] (1/4) Start epoch 2668 2024-10-21 16:28:42,061 INFO [train.py:561] (1/4) Epoch 2668, batch 8, global_batch_idx: 42680, batch size: 170, loss[dur_loss=0.1987, prior_loss=0.974, diff_loss=0.3031, tot_loss=1.476, over 170.00 samples.], tot_loss[dur_loss=0.1947, prior_loss=0.9732, diff_loss=0.3747, tot_loss=1.543, over 1432.00 samples.], 2024-10-21 16:28:52,101 INFO [train.py:682] (1/4) Start epoch 2669 2024-10-21 16:29:03,519 INFO [train.py:561] (1/4) Epoch 2669, batch 2, global_batch_idx: 42690, batch size: 203, loss[dur_loss=0.1931, prior_loss=0.9733, diff_loss=0.3247, tot_loss=1.491, over 203.00 samples.], tot_loss[dur_loss=0.1951, prior_loss=0.9735, diff_loss=0.3096, tot_loss=1.478, over 442.00 samples.], 2024-10-21 16:29:17,622 INFO [train.py:561] (1/4) Epoch 2669, batch 12, global_batch_idx: 42700, batch size: 152, loss[dur_loss=0.1972, prior_loss=0.9737, diff_loss=0.3065, tot_loss=1.477, over 152.00 samples.], tot_loss[dur_loss=0.1947, prior_loss=0.9733, diff_loss=0.3612, tot_loss=1.529, over 1966.00 samples.], 2024-10-21 16:29:22,038 INFO [train.py:682] (1/4) Start epoch 2670 2024-10-21 16:29:39,481 INFO [train.py:561] (1/4) Epoch 2670, batch 6, global_batch_idx: 42710, batch size: 106, loss[dur_loss=0.1956, prior_loss=0.9733, diff_loss=0.3022, tot_loss=1.471, over 106.00 samples.], tot_loss[dur_loss=0.1926, prior_loss=0.9729, diff_loss=0.394, tot_loss=1.56, over 1142.00 samples.], 2024-10-21 16:29:52,515 INFO [train.py:682] (1/4) Start epoch 2671 2024-10-21 16:30:01,402 INFO [train.py:561] (1/4) Epoch 2671, batch 0, global_batch_idx: 42720, batch size: 108, loss[dur_loss=0.2033, prior_loss=0.9742, diff_loss=0.3179, tot_loss=1.495, over 108.00 samples.], tot_loss[dur_loss=0.2033, prior_loss=0.9742, diff_loss=0.3179, tot_loss=1.495, over 108.00 samples.], 2024-10-21 16:30:15,554 INFO [train.py:561] (1/4) Epoch 2671, batch 10, global_batch_idx: 42730, batch size: 111, loss[dur_loss=0.1972, prior_loss=0.9746, diff_loss=0.3124, tot_loss=1.484, over 111.00 samples.], tot_loss[dur_loss=0.1943, prior_loss=0.9733, diff_loss=0.3792, tot_loss=1.547, over 1656.00 samples.], 2024-10-21 16:30:22,577 INFO [train.py:682] (1/4) Start epoch 2672 2024-10-21 16:30:36,566 INFO [train.py:561] (1/4) Epoch 2672, batch 4, global_batch_idx: 42740, batch size: 189, loss[dur_loss=0.1964, prior_loss=0.9738, diff_loss=0.3104, tot_loss=1.481, over 189.00 samples.], tot_loss[dur_loss=0.1933, prior_loss=0.9728, diff_loss=0.4274, tot_loss=1.594, over 937.00 samples.], 2024-10-21 16:30:51,466 INFO [train.py:561] (1/4) Epoch 2672, batch 14, global_batch_idx: 42750, batch size: 142, loss[dur_loss=0.1989, prior_loss=0.9733, diff_loss=0.3081, tot_loss=1.48, over 142.00 samples.], tot_loss[dur_loss=0.1953, prior_loss=0.9734, diff_loss=0.3541, tot_loss=1.523, over 2210.00 samples.], 2024-10-21 16:30:52,866 INFO [train.py:682] (1/4) Start epoch 2673 2024-10-21 16:31:13,147 INFO [train.py:561] (1/4) Epoch 2673, batch 8, global_batch_idx: 42760, batch size: 170, loss[dur_loss=0.1989, prior_loss=0.9738, diff_loss=0.3106, tot_loss=1.483, over 170.00 samples.], tot_loss[dur_loss=0.1935, prior_loss=0.9731, diff_loss=0.3782, tot_loss=1.545, over 1432.00 samples.], 2024-10-21 16:31:23,174 INFO [train.py:682] (1/4) Start epoch 2674 2024-10-21 16:31:34,878 INFO [train.py:561] (1/4) Epoch 2674, batch 2, global_batch_idx: 42770, batch size: 203, loss[dur_loss=0.1964, 
prior_loss=0.9734, diff_loss=0.3273, tot_loss=1.497, over 203.00 samples.], tot_loss[dur_loss=0.1964, prior_loss=0.9736, diff_loss=0.3144, tot_loss=1.484, over 442.00 samples.], 2024-10-21 16:31:49,295 INFO [train.py:561] (1/4) Epoch 2674, batch 12, global_batch_idx: 42780, batch size: 152, loss[dur_loss=0.1941, prior_loss=0.9738, diff_loss=0.329, tot_loss=1.497, over 152.00 samples.], tot_loss[dur_loss=0.1943, prior_loss=0.9733, diff_loss=0.3666, tot_loss=1.534, over 1966.00 samples.], 2024-10-21 16:31:53,850 INFO [train.py:682] (1/4) Start epoch 2675 2024-10-21 16:32:11,139 INFO [train.py:561] (1/4) Epoch 2675, batch 6, global_batch_idx: 42790, batch size: 106, loss[dur_loss=0.1942, prior_loss=0.9734, diff_loss=0.2683, tot_loss=1.436, over 106.00 samples.], tot_loss[dur_loss=0.1925, prior_loss=0.9729, diff_loss=0.3995, tot_loss=1.565, over 1142.00 samples.], 2024-10-21 16:32:24,267 INFO [train.py:682] (1/4) Start epoch 2676 2024-10-21 16:32:33,734 INFO [train.py:561] (1/4) Epoch 2676, batch 0, global_batch_idx: 42800, batch size: 108, loss[dur_loss=0.2012, prior_loss=0.9742, diff_loss=0.2983, tot_loss=1.474, over 108.00 samples.], tot_loss[dur_loss=0.2012, prior_loss=0.9742, diff_loss=0.2983, tot_loss=1.474, over 108.00 samples.], 2024-10-21 16:32:47,972 INFO [train.py:561] (1/4) Epoch 2676, batch 10, global_batch_idx: 42810, batch size: 111, loss[dur_loss=0.2015, prior_loss=0.9746, diff_loss=0.3073, tot_loss=1.483, over 111.00 samples.], tot_loss[dur_loss=0.1943, prior_loss=0.9732, diff_loss=0.3688, tot_loss=1.536, over 1656.00 samples.], 2024-10-21 16:32:55,029 INFO [train.py:682] (1/4) Start epoch 2677 2024-10-21 16:33:09,321 INFO [train.py:561] (1/4) Epoch 2677, batch 4, global_batch_idx: 42820, batch size: 189, loss[dur_loss=0.1966, prior_loss=0.9736, diff_loss=0.3313, tot_loss=1.502, over 189.00 samples.], tot_loss[dur_loss=0.1927, prior_loss=0.9728, diff_loss=0.4122, tot_loss=1.578, over 937.00 samples.], 2024-10-21 16:33:24,286 INFO [train.py:561] (1/4) Epoch 2677, batch 14, global_batch_idx: 42830, batch size: 142, loss[dur_loss=0.1988, prior_loss=0.9733, diff_loss=0.2841, tot_loss=1.456, over 142.00 samples.], tot_loss[dur_loss=0.1953, prior_loss=0.9734, diff_loss=0.3529, tot_loss=1.522, over 2210.00 samples.], 2024-10-21 16:33:25,694 INFO [train.py:682] (1/4) Start epoch 2678 2024-10-21 16:33:45,852 INFO [train.py:561] (1/4) Epoch 2678, batch 8, global_batch_idx: 42840, batch size: 170, loss[dur_loss=0.1997, prior_loss=0.974, diff_loss=0.3221, tot_loss=1.496, over 170.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.9732, diff_loss=0.3824, tot_loss=1.55, over 1432.00 samples.], 2024-10-21 16:33:55,993 INFO [train.py:682] (1/4) Start epoch 2679 2024-10-21 16:34:07,552 INFO [train.py:561] (1/4) Epoch 2679, batch 2, global_batch_idx: 42850, batch size: 203, loss[dur_loss=0.1934, prior_loss=0.9735, diff_loss=0.3181, tot_loss=1.485, over 203.00 samples.], tot_loss[dur_loss=0.1953, prior_loss=0.9736, diff_loss=0.3106, tot_loss=1.479, over 442.00 samples.], 2024-10-21 16:34:21,724 INFO [train.py:561] (1/4) Epoch 2679, batch 12, global_batch_idx: 42860, batch size: 152, loss[dur_loss=0.1957, prior_loss=0.9736, diff_loss=0.3192, tot_loss=1.488, over 152.00 samples.], tot_loss[dur_loss=0.1935, prior_loss=0.9733, diff_loss=0.3604, tot_loss=1.527, over 1966.00 samples.], 2024-10-21 16:34:26,263 INFO [train.py:682] (1/4) Start epoch 2680 2024-10-21 16:34:43,330 INFO [train.py:561] (1/4) Epoch 2680, batch 6, global_batch_idx: 42870, batch size: 106, loss[dur_loss=0.1956, 
prior_loss=0.9735, diff_loss=0.2623, tot_loss=1.431, over 106.00 samples.], tot_loss[dur_loss=0.1927, prior_loss=0.973, diff_loss=0.3922, tot_loss=1.558, over 1142.00 samples.], 2024-10-21 16:34:56,373 INFO [train.py:682] (1/4) Start epoch 2681 2024-10-21 16:35:05,726 INFO [train.py:561] (1/4) Epoch 2681, batch 0, global_batch_idx: 42880, batch size: 108, loss[dur_loss=0.1995, prior_loss=0.9742, diff_loss=0.2897, tot_loss=1.463, over 108.00 samples.], tot_loss[dur_loss=0.1995, prior_loss=0.9742, diff_loss=0.2897, tot_loss=1.463, over 108.00 samples.], 2024-10-21 16:35:20,374 INFO [train.py:561] (1/4) Epoch 2681, batch 10, global_batch_idx: 42890, batch size: 111, loss[dur_loss=0.1995, prior_loss=0.9748, diff_loss=0.3136, tot_loss=1.488, over 111.00 samples.], tot_loss[dur_loss=0.1946, prior_loss=0.9733, diff_loss=0.3746, tot_loss=1.543, over 1656.00 samples.], 2024-10-21 16:35:27,735 INFO [train.py:682] (1/4) Start epoch 2682 2024-10-21 16:35:42,133 INFO [train.py:561] (1/4) Epoch 2682, batch 4, global_batch_idx: 42900, batch size: 189, loss[dur_loss=0.1951, prior_loss=0.9734, diff_loss=0.3026, tot_loss=1.471, over 189.00 samples.], tot_loss[dur_loss=0.1918, prior_loss=0.9727, diff_loss=0.4271, tot_loss=1.592, over 937.00 samples.], 2024-10-21 16:35:57,351 INFO [train.py:561] (1/4) Epoch 2682, batch 14, global_batch_idx: 42910, batch size: 142, loss[dur_loss=0.1973, prior_loss=0.9733, diff_loss=0.303, tot_loss=1.474, over 142.00 samples.], tot_loss[dur_loss=0.1953, prior_loss=0.9733, diff_loss=0.3502, tot_loss=1.519, over 2210.00 samples.], 2024-10-21 16:35:58,838 INFO [train.py:682] (1/4) Start epoch 2683 2024-10-21 16:36:19,366 INFO [train.py:561] (1/4) Epoch 2683, batch 8, global_batch_idx: 42920, batch size: 170, loss[dur_loss=0.2006, prior_loss=0.9738, diff_loss=0.322, tot_loss=1.496, over 170.00 samples.], tot_loss[dur_loss=0.1945, prior_loss=0.9731, diff_loss=0.3888, tot_loss=1.556, over 1432.00 samples.], 2024-10-21 16:36:29,817 INFO [train.py:682] (1/4) Start epoch 2684 2024-10-21 16:36:41,884 INFO [train.py:561] (1/4) Epoch 2684, batch 2, global_batch_idx: 42930, batch size: 203, loss[dur_loss=0.1962, prior_loss=0.9735, diff_loss=0.3588, tot_loss=1.529, over 203.00 samples.], tot_loss[dur_loss=0.1968, prior_loss=0.9736, diff_loss=0.3325, tot_loss=1.503, over 442.00 samples.], 2024-10-21 16:36:56,479 INFO [train.py:561] (1/4) Epoch 2684, batch 12, global_batch_idx: 42940, batch size: 152, loss[dur_loss=0.196, prior_loss=0.9737, diff_loss=0.3093, tot_loss=1.479, over 152.00 samples.], tot_loss[dur_loss=0.1946, prior_loss=0.9733, diff_loss=0.3676, tot_loss=1.535, over 1966.00 samples.], 2024-10-21 16:37:01,134 INFO [train.py:682] (1/4) Start epoch 2685 2024-10-21 16:37:19,234 INFO [train.py:561] (1/4) Epoch 2685, batch 6, global_batch_idx: 42950, batch size: 106, loss[dur_loss=0.1951, prior_loss=0.9737, diff_loss=0.3246, tot_loss=1.493, over 106.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.973, diff_loss=0.3992, tot_loss=1.566, over 1142.00 samples.], 2024-10-21 16:37:32,786 INFO [train.py:682] (1/4) Start epoch 2686 2024-10-21 16:37:41,991 INFO [train.py:561] (1/4) Epoch 2686, batch 0, global_batch_idx: 42960, batch size: 108, loss[dur_loss=0.2032, prior_loss=0.9742, diff_loss=0.3141, tot_loss=1.491, over 108.00 samples.], tot_loss[dur_loss=0.2032, prior_loss=0.9742, diff_loss=0.3141, tot_loss=1.491, over 108.00 samples.], 2024-10-21 16:37:56,616 INFO [train.py:561] (1/4) Epoch 2686, batch 10, global_batch_idx: 42970, batch size: 111, loss[dur_loss=0.1975, 
prior_loss=0.9747, diff_loss=0.2811, tot_loss=1.453, over 111.00 samples.], tot_loss[dur_loss=0.1946, prior_loss=0.9732, diff_loss=0.3721, tot_loss=1.54, over 1656.00 samples.], 2024-10-21 16:38:03,808 INFO [train.py:682] (1/4) Start epoch 2687 2024-10-21 16:38:18,383 INFO [train.py:561] (1/4) Epoch 2687, batch 4, global_batch_idx: 42980, batch size: 189, loss[dur_loss=0.194, prior_loss=0.9736, diff_loss=0.3308, tot_loss=1.498, over 189.00 samples.], tot_loss[dur_loss=0.1919, prior_loss=0.9728, diff_loss=0.4212, tot_loss=1.586, over 937.00 samples.], 2024-10-21 16:38:33,539 INFO [train.py:561] (1/4) Epoch 2687, batch 14, global_batch_idx: 42990, batch size: 142, loss[dur_loss=0.1987, prior_loss=0.9736, diff_loss=0.2783, tot_loss=1.451, over 142.00 samples.], tot_loss[dur_loss=0.1948, prior_loss=0.9733, diff_loss=0.3481, tot_loss=1.516, over 2210.00 samples.], 2024-10-21 16:38:34,998 INFO [train.py:682] (1/4) Start epoch 2688 2024-10-21 16:38:55,587 INFO [train.py:561] (1/4) Epoch 2688, batch 8, global_batch_idx: 43000, batch size: 170, loss[dur_loss=0.1994, prior_loss=0.9737, diff_loss=0.3085, tot_loss=1.482, over 170.00 samples.], tot_loss[dur_loss=0.1931, prior_loss=0.973, diff_loss=0.3781, tot_loss=1.544, over 1432.00 samples.], 2024-10-21 16:39:05,900 INFO [train.py:682] (1/4) Start epoch 2689 2024-10-21 16:39:17,746 INFO [train.py:561] (1/4) Epoch 2689, batch 2, global_batch_idx: 43010, batch size: 203, loss[dur_loss=0.1975, prior_loss=0.9733, diff_loss=0.3153, tot_loss=1.486, over 203.00 samples.], tot_loss[dur_loss=0.1975, prior_loss=0.9735, diff_loss=0.3046, tot_loss=1.476, over 442.00 samples.], 2024-10-21 16:39:32,295 INFO [train.py:561] (1/4) Epoch 2689, batch 12, global_batch_idx: 43020, batch size: 152, loss[dur_loss=0.1985, prior_loss=0.9736, diff_loss=0.3003, tot_loss=1.472, over 152.00 samples.], tot_loss[dur_loss=0.1951, prior_loss=0.9732, diff_loss=0.3579, tot_loss=1.526, over 1966.00 samples.], 2024-10-21 16:39:36,747 INFO [train.py:682] (1/4) Start epoch 2690 2024-10-21 16:39:54,453 INFO [train.py:561] (1/4) Epoch 2690, batch 6, global_batch_idx: 43030, batch size: 106, loss[dur_loss=0.1923, prior_loss=0.9734, diff_loss=0.317, tot_loss=1.483, over 106.00 samples.], tot_loss[dur_loss=0.1918, prior_loss=0.9729, diff_loss=0.3983, tot_loss=1.563, over 1142.00 samples.], 2024-10-21 16:40:08,010 INFO [train.py:682] (1/4) Start epoch 2691 2024-10-21 16:40:17,250 INFO [train.py:561] (1/4) Epoch 2691, batch 0, global_batch_idx: 43040, batch size: 108, loss[dur_loss=0.2007, prior_loss=0.974, diff_loss=0.2916, tot_loss=1.466, over 108.00 samples.], tot_loss[dur_loss=0.2007, prior_loss=0.974, diff_loss=0.2916, tot_loss=1.466, over 108.00 samples.], 2024-10-21 16:40:31,716 INFO [train.py:561] (1/4) Epoch 2691, batch 10, global_batch_idx: 43050, batch size: 111, loss[dur_loss=0.202, prior_loss=0.9747, diff_loss=0.293, tot_loss=1.47, over 111.00 samples.], tot_loss[dur_loss=0.1951, prior_loss=0.9732, diff_loss=0.3681, tot_loss=1.536, over 1656.00 samples.], 2024-10-21 16:40:38,875 INFO [train.py:682] (1/4) Start epoch 2692 2024-10-21 16:40:52,972 INFO [train.py:561] (1/4) Epoch 2692, batch 4, global_batch_idx: 43060, batch size: 189, loss[dur_loss=0.1922, prior_loss=0.9733, diff_loss=0.3376, tot_loss=1.503, over 189.00 samples.], tot_loss[dur_loss=0.1909, prior_loss=0.9725, diff_loss=0.4124, tot_loss=1.576, over 937.00 samples.], 2024-10-21 16:41:07,968 INFO [train.py:561] (1/4) Epoch 2692, batch 14, global_batch_idx: 43070, batch size: 142, loss[dur_loss=0.2, prior_loss=0.9734, 
diff_loss=0.2952, tot_loss=1.469, over 142.00 samples.], tot_loss[dur_loss=0.1942, prior_loss=0.9732, diff_loss=0.3522, tot_loss=1.52, over 2210.00 samples.], 2024-10-21 16:41:09,371 INFO [train.py:682] (1/4) Start epoch 2693 2024-10-21 16:41:30,046 INFO [train.py:561] (1/4) Epoch 2693, batch 8, global_batch_idx: 43080, batch size: 170, loss[dur_loss=0.1957, prior_loss=0.9735, diff_loss=0.293, tot_loss=1.462, over 170.00 samples.], tot_loss[dur_loss=0.1938, prior_loss=0.9729, diff_loss=0.3726, tot_loss=1.539, over 1432.00 samples.], 2024-10-21 16:41:40,448 INFO [train.py:682] (1/4) Start epoch 2694 2024-10-21 16:41:52,438 INFO [train.py:561] (1/4) Epoch 2694, batch 2, global_batch_idx: 43090, batch size: 203, loss[dur_loss=0.1978, prior_loss=0.9734, diff_loss=0.356, tot_loss=1.527, over 203.00 samples.], tot_loss[dur_loss=0.1979, prior_loss=0.9736, diff_loss=0.3266, tot_loss=1.498, over 442.00 samples.], 2024-10-21 16:42:06,745 INFO [train.py:561] (1/4) Epoch 2694, batch 12, global_batch_idx: 43100, batch size: 152, loss[dur_loss=0.1951, prior_loss=0.9738, diff_loss=0.3408, tot_loss=1.51, over 152.00 samples.], tot_loss[dur_loss=0.1954, prior_loss=0.9732, diff_loss=0.3609, tot_loss=1.529, over 1966.00 samples.], 2024-10-21 16:42:11,260 INFO [train.py:682] (1/4) Start epoch 2695 2024-10-21 16:42:29,043 INFO [train.py:561] (1/4) Epoch 2695, batch 6, global_batch_idx: 43110, batch size: 106, loss[dur_loss=0.196, prior_loss=0.9736, diff_loss=0.2874, tot_loss=1.457, over 106.00 samples.], tot_loss[dur_loss=0.1932, prior_loss=0.973, diff_loss=0.4019, tot_loss=1.568, over 1142.00 samples.], 2024-10-21 16:42:42,391 INFO [train.py:682] (1/4) Start epoch 2696 2024-10-21 16:42:51,773 INFO [train.py:561] (1/4) Epoch 2696, batch 0, global_batch_idx: 43120, batch size: 108, loss[dur_loss=0.2024, prior_loss=0.974, diff_loss=0.2688, tot_loss=1.445, over 108.00 samples.], tot_loss[dur_loss=0.2024, prior_loss=0.974, diff_loss=0.2688, tot_loss=1.445, over 108.00 samples.], 2024-10-21 16:43:06,406 INFO [train.py:561] (1/4) Epoch 2696, batch 10, global_batch_idx: 43130, batch size: 111, loss[dur_loss=0.1967, prior_loss=0.9745, diff_loss=0.3086, tot_loss=1.48, over 111.00 samples.], tot_loss[dur_loss=0.1943, prior_loss=0.9733, diff_loss=0.3671, tot_loss=1.535, over 1656.00 samples.], 2024-10-21 16:43:13,814 INFO [train.py:682] (1/4) Start epoch 2697 2024-10-21 16:43:28,415 INFO [train.py:561] (1/4) Epoch 2697, batch 4, global_batch_idx: 43140, batch size: 189, loss[dur_loss=0.1955, prior_loss=0.9736, diff_loss=0.319, tot_loss=1.488, over 189.00 samples.], tot_loss[dur_loss=0.1919, prior_loss=0.9727, diff_loss=0.4146, tot_loss=1.579, over 937.00 samples.], 2024-10-21 16:43:44,007 INFO [train.py:561] (1/4) Epoch 2697, batch 14, global_batch_idx: 43150, batch size: 142, loss[dur_loss=0.2006, prior_loss=0.9734, diff_loss=0.2921, tot_loss=1.466, over 142.00 samples.], tot_loss[dur_loss=0.1944, prior_loss=0.9732, diff_loss=0.352, tot_loss=1.52, over 2210.00 samples.], 2024-10-21 16:43:45,456 INFO [train.py:682] (1/4) Start epoch 2698 2024-10-21 16:44:06,077 INFO [train.py:561] (1/4) Epoch 2698, batch 8, global_batch_idx: 43160, batch size: 170, loss[dur_loss=0.1973, prior_loss=0.9735, diff_loss=0.2974, tot_loss=1.468, over 170.00 samples.], tot_loss[dur_loss=0.1934, prior_loss=0.9729, diff_loss=0.3703, tot_loss=1.537, over 1432.00 samples.], 2024-10-21 16:44:16,329 INFO [train.py:682] (1/4) Start epoch 2699 2024-10-21 16:44:28,283 INFO [train.py:561] (1/4) Epoch 2699, batch 2, global_batch_idx: 43170, batch size: 203, loss[dur_loss=0.1946, prior_loss=0.9733, diff_loss=0.3294, tot_loss=1.497, over 203.00 samples.], tot_loss[dur_loss=0.1956, prior_loss=0.9735, diff_loss=0.3111, tot_loss=1.48, over 442.00 samples.], 2024-10-21 16:44:42,878 INFO [train.py:561] (1/4) Epoch 2699, batch 12, global_batch_idx: 43180, batch size: 152, loss[dur_loss=0.1944, prior_loss=0.9735, diff_loss=0.3037, tot_loss=1.472, over 152.00 samples.], tot_loss[dur_loss=0.1938, prior_loss=0.9731, diff_loss=0.3603, tot_loss=1.527, over 1966.00 samples.], 2024-10-21 16:44:47,340 INFO [train.py:682] (1/4) Start epoch 2700
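
Within one epoch the second, bracketed tot_loss[...] aggregate grows with the batches seen so far: across the records above the sample counts climb 108, 442, 937, 1142, 1432, 1656, 1966, 2210 as the batch index rises from 0 to 14, and at batch 0 the aggregate always equals the per-batch loss[...]. That is consistent with a sample-weighted running mean; the accumulator below is a sketch of that assumed bookkeeping, not code taken from train.py:

    class RunningLoss:
        """Sample-weighted running mean, the bookkeeping assumed to be
        behind the bracketed tot_loss[...] aggregates in this log."""

        def __init__(self):
            self.weighted_sum = 0.0
            self.n_samples = 0

        def update(self, batch_loss, batch_size):
            self.weighted_sum += batch_loss * batch_size
            self.n_samples += batch_size

        @property
        def mean(self):
            return self.weighted_sum / max(self.n_samples, 1)

    # At batch 0 the running mean equals the batch loss, as in the
    # Epoch 2696, batch 0 record above (tot_loss=1.445 over 108 samples):
    avg = RunningLoss()
    avg.update(1.445, 108)
    assert abs(avg.mean - 1.445) < 1e-9
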
2024-10-21 16:45:05,116 INFO [train.py:561] (1/4) Epoch 2700, batch 6, global_batch_idx: 43190, batch size: 106, loss[dur_loss=0.195, prior_loss=0.9733, diff_loss=0.2821, tot_loss=1.45, over 106.00 samples.], tot_loss[dur_loss=0.1922, prior_loss=0.9728, diff_loss=0.3927, tot_loss=1.558, over 1142.00 samples.], 2024-10-21 16:45:18,764 INFO [train.py:682] (1/4) Start epoch 2701 2024-10-21 16:45:28,108 INFO [train.py:561] (1/4) Epoch 2701, batch 0, global_batch_idx: 43200, batch size: 108, loss[dur_loss=0.2, prior_loss=0.974, diff_loss=0.278, tot_loss=1.452, over 108.00 samples.], tot_loss[dur_loss=0.2, prior_loss=0.974, diff_loss=0.278, tot_loss=1.452, over 108.00 samples.], 2024-10-21 16:45:42,748 INFO [train.py:561] (1/4) Epoch 2701, batch 10, global_batch_idx: 43210, batch size: 111, loss[dur_loss=0.1977, prior_loss=0.9744, diff_loss=0.2934, tot_loss=1.466, over 111.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.973, diff_loss=0.3734, tot_loss=1.54, over 1656.00 samples.], 2024-10-21 16:45:50,022 INFO [train.py:682] (1/4) Start epoch 2702 2024-10-21 16:46:04,471 INFO [train.py:561] (1/4) Epoch 2702, batch 4, global_batch_idx: 43220, batch size: 189, loss[dur_loss=0.1934, prior_loss=0.9732, diff_loss=0.3115, tot_loss=1.478, over 189.00 samples.], tot_loss[dur_loss=0.1915, prior_loss=0.9725, diff_loss=0.4211, tot_loss=1.585, over 937.00 samples.], 2024-10-21 16:46:19,940 INFO [train.py:561] (1/4) Epoch 2702, batch 14, global_batch_idx: 43230, batch size: 142, loss[dur_loss=0.1979, prior_loss=0.9733, diff_loss=0.3052, tot_loss=1.476, over 142.00 samples.], tot_loss[dur_loss=0.1945, prior_loss=0.9731, diff_loss=0.3482, tot_loss=1.516, over 2210.00 samples.], 2024-10-21 16:46:21,429 INFO [train.py:682] (1/4) Start epoch 2703 2024-10-21 16:46:41,912 INFO [train.py:561] (1/4) Epoch 2703, batch 8, global_batch_idx: 43240, batch size: 170, loss[dur_loss=0.1997, prior_loss=0.9735, diff_loss=0.3127, tot_loss=1.486, over 170.00 samples.], tot_loss[dur_loss=0.1936, prior_loss=0.9729, diff_loss=0.3768, tot_loss=1.543, over 1432.00 samples.], 2024-10-21 16:46:52,231 INFO [train.py:682] (1/4) Start epoch 2704 2024-10-21 16:47:03,735 INFO [train.py:561] (1/4) Epoch 2704, batch 2, global_batch_idx: 43250, batch size: 203, loss[dur_loss=0.1979, prior_loss=0.9733, diff_loss=0.3133, tot_loss=1.484, over 203.00 samples.], tot_loss[dur_loss=0.1962, prior_loss=0.9734, diff_loss=0.2924, tot_loss=1.462, over 442.00 samples.], 2024-10-21 16:47:18,082 INFO [train.py:561] (1/4) Epoch 2704, batch 12, global_batch_idx: 43260, batch size: 152, loss[dur_loss=0.1963, prior_loss=0.9738, diff_loss=0.2886, tot_loss=1.459, over 152.00 samples.], tot_loss[dur_loss=0.194, prior_loss=0.9731, diff_loss=0.3532, tot_loss=1.52, over 1966.00 samples.], 2024-10-21 16:47:22,583 INFO [train.py:682] (1/4) Start epoch 2705 2024-10-21 16:47:40,336 INFO [train.py:561] (1/4) Epoch 2705, batch 6, global_batch_idx: 43270, batch size: 106, loss[dur_loss=0.1929,
prior_loss=0.9733, diff_loss=0.2837, tot_loss=1.45, over 106.00 samples.], tot_loss[dur_loss=0.1926, prior_loss=0.9727, diff_loss=0.4027, tot_loss=1.568, over 1142.00 samples.], 2024-10-21 16:47:53,749 INFO [train.py:682] (1/4) Start epoch 2706 2024-10-21 16:48:02,849 INFO [train.py:561] (1/4) Epoch 2706, batch 0, global_batch_idx: 43280, batch size: 108, loss[dur_loss=0.1997, prior_loss=0.9739, diff_loss=0.2803, tot_loss=1.454, over 108.00 samples.], tot_loss[dur_loss=0.1997, prior_loss=0.9739, diff_loss=0.2803, tot_loss=1.454, over 108.00 samples.], 2024-10-21 16:48:17,257 INFO [train.py:561] (1/4) Epoch 2706, batch 10, global_batch_idx: 43290, batch size: 111, loss[dur_loss=0.1967, prior_loss=0.9742, diff_loss=0.2984, tot_loss=1.469, over 111.00 samples.], tot_loss[dur_loss=0.1943, prior_loss=0.9731, diff_loss=0.3693, tot_loss=1.537, over 1656.00 samples.], 2024-10-21 16:48:24,507 INFO [train.py:682] (1/4) Start epoch 2707 2024-10-21 16:48:38,614 INFO [train.py:561] (1/4) Epoch 2707, batch 4, global_batch_idx: 43300, batch size: 189, loss[dur_loss=0.1983, prior_loss=0.9735, diff_loss=0.3291, tot_loss=1.501, over 189.00 samples.], tot_loss[dur_loss=0.1933, prior_loss=0.9726, diff_loss=0.4006, tot_loss=1.566, over 937.00 samples.], 2024-10-21 16:48:53,784 INFO [train.py:561] (1/4) Epoch 2707, batch 14, global_batch_idx: 43310, batch size: 142, loss[dur_loss=0.1985, prior_loss=0.9731, diff_loss=0.2905, tot_loss=1.462, over 142.00 samples.], tot_loss[dur_loss=0.1953, prior_loss=0.9731, diff_loss=0.3476, tot_loss=1.516, over 2210.00 samples.], 2024-10-21 16:48:55,237 INFO [train.py:682] (1/4) Start epoch 2708 2024-10-21 16:49:16,050 INFO [train.py:561] (1/4) Epoch 2708, batch 8, global_batch_idx: 43320, batch size: 170, loss[dur_loss=0.1974, prior_loss=0.9735, diff_loss=0.3191, tot_loss=1.49, over 170.00 samples.], tot_loss[dur_loss=0.1927, prior_loss=0.9728, diff_loss=0.3795, tot_loss=1.545, over 1432.00 samples.], 2024-10-21 16:49:26,615 INFO [train.py:682] (1/4) Start epoch 2709 2024-10-21 16:49:38,514 INFO [train.py:561] (1/4) Epoch 2709, batch 2, global_batch_idx: 43330, batch size: 203, loss[dur_loss=0.1962, prior_loss=0.9733, diff_loss=0.322, tot_loss=1.491, over 203.00 samples.], tot_loss[dur_loss=0.1968, prior_loss=0.9734, diff_loss=0.3079, tot_loss=1.478, over 442.00 samples.], 2024-10-21 16:49:53,090 INFO [train.py:561] (1/4) Epoch 2709, batch 12, global_batch_idx: 43340, batch size: 152, loss[dur_loss=0.1932, prior_loss=0.9734, diff_loss=0.3389, tot_loss=1.506, over 152.00 samples.], tot_loss[dur_loss=0.1946, prior_loss=0.973, diff_loss=0.3593, tot_loss=1.527, over 1966.00 samples.], 2024-10-21 16:49:57,690 INFO [train.py:682] (1/4) Start epoch 2710 2024-10-21 16:50:15,276 INFO [train.py:561] (1/4) Epoch 2710, batch 6, global_batch_idx: 43350, batch size: 106, loss[dur_loss=0.1919, prior_loss=0.9732, diff_loss=0.2964, tot_loss=1.461, over 106.00 samples.], tot_loss[dur_loss=0.1931, prior_loss=0.9728, diff_loss=0.3989, tot_loss=1.565, over 1142.00 samples.], 2024-10-21 16:50:28,831 INFO [train.py:682] (1/4) Start epoch 2711 2024-10-21 16:50:38,042 INFO [train.py:561] (1/4) Epoch 2711, batch 0, global_batch_idx: 43360, batch size: 108, loss[dur_loss=0.2004, prior_loss=0.974, diff_loss=0.2994, tot_loss=1.474, over 108.00 samples.], tot_loss[dur_loss=0.2004, prior_loss=0.974, diff_loss=0.2994, tot_loss=1.474, over 108.00 samples.], 2024-10-21 16:50:52,535 INFO [train.py:561] (1/4) Epoch 2711, batch 10, global_batch_idx: 43370, batch size: 111, loss[dur_loss=0.1974, prior_loss=0.9743, 
diff_loss=0.3517, tot_loss=1.523, over 111.00 samples.], tot_loss[dur_loss=0.1941, prior_loss=0.973, diff_loss=0.3732, tot_loss=1.54, over 1656.00 samples.], 2024-10-21 16:50:59,875 INFO [train.py:682] (1/4) Start epoch 2712 2024-10-21 16:51:14,456 INFO [train.py:561] (1/4) Epoch 2712, batch 4, global_batch_idx: 43380, batch size: 189, loss[dur_loss=0.1955, prior_loss=0.9734, diff_loss=0.3246, tot_loss=1.493, over 189.00 samples.], tot_loss[dur_loss=0.1913, prior_loss=0.9726, diff_loss=0.4229, tot_loss=1.587, over 937.00 samples.], 2024-10-21 16:51:29,846 INFO [train.py:561] (1/4) Epoch 2712, batch 14, global_batch_idx: 43390, batch size: 142, loss[dur_loss=0.196, prior_loss=0.9733, diff_loss=0.2737, tot_loss=1.443, over 142.00 samples.], tot_loss[dur_loss=0.1938, prior_loss=0.9731, diff_loss=0.3524, tot_loss=1.519, over 2210.00 samples.], 2024-10-21 16:51:31,333 INFO [train.py:682] (1/4) Start epoch 2713 2024-10-21 16:51:52,021 INFO [train.py:561] (1/4) Epoch 2713, batch 8, global_batch_idx: 43400, batch size: 170, loss[dur_loss=0.1962, prior_loss=0.9737, diff_loss=0.3096, tot_loss=1.479, over 170.00 samples.], tot_loss[dur_loss=0.1934, prior_loss=0.973, diff_loss=0.3731, tot_loss=1.539, over 1432.00 samples.], 2024-10-21 16:52:02,293 INFO [train.py:682] (1/4) Start epoch 2714 2024-10-21 16:52:14,094 INFO [train.py:561] (1/4) Epoch 2714, batch 2, global_batch_idx: 43410, batch size: 203, loss[dur_loss=0.1996, prior_loss=0.9735, diff_loss=0.3175, tot_loss=1.491, over 203.00 samples.], tot_loss[dur_loss=0.1987, prior_loss=0.9736, diff_loss=0.3181, tot_loss=1.49, over 442.00 samples.], 2024-10-21 16:52:28,561 INFO [train.py:561] (1/4) Epoch 2714, batch 12, global_batch_idx: 43420, batch size: 152, loss[dur_loss=0.1961, prior_loss=0.9735, diff_loss=0.3149, tot_loss=1.485, over 152.00 samples.], tot_loss[dur_loss=0.1957, prior_loss=0.9733, diff_loss=0.3613, tot_loss=1.53, over 1966.00 samples.], 2024-10-21 16:52:33,232 INFO [train.py:682] (1/4) Start epoch 2715 2024-10-21 16:52:51,145 INFO [train.py:561] (1/4) Epoch 2715, batch 6, global_batch_idx: 43430, batch size: 106, loss[dur_loss=0.1949, prior_loss=0.9733, diff_loss=0.276, tot_loss=1.444, over 106.00 samples.], tot_loss[dur_loss=0.1924, prior_loss=0.9728, diff_loss=0.4018, tot_loss=1.567, over 1142.00 samples.], 2024-10-21 16:53:04,593 INFO [train.py:682] (1/4) Start epoch 2716 2024-10-21 16:53:14,051 INFO [train.py:561] (1/4) Epoch 2716, batch 0, global_batch_idx: 43440, batch size: 108, loss[dur_loss=0.198, prior_loss=0.9739, diff_loss=0.2969, tot_loss=1.469, over 108.00 samples.], tot_loss[dur_loss=0.198, prior_loss=0.9739, diff_loss=0.2969, tot_loss=1.469, over 108.00 samples.], 2024-10-21 16:53:28,748 INFO [train.py:561] (1/4) Epoch 2716, batch 10, global_batch_idx: 43450, batch size: 111, loss[dur_loss=0.1938, prior_loss=0.9743, diff_loss=0.3125, tot_loss=1.481, over 111.00 samples.], tot_loss[dur_loss=0.1927, prior_loss=0.973, diff_loss=0.3794, tot_loss=1.545, over 1656.00 samples.], 2024-10-21 16:53:36,126 INFO [train.py:682] (1/4) Start epoch 2717 2024-10-21 16:53:50,522 INFO [train.py:561] (1/4) Epoch 2717, batch 4, global_batch_idx: 43460, batch size: 189, loss[dur_loss=0.1989, prior_loss=0.9734, diff_loss=0.3017, tot_loss=1.474, over 189.00 samples.], tot_loss[dur_loss=0.1926, prior_loss=0.9725, diff_loss=0.416, tot_loss=1.581, over 937.00 samples.], 2024-10-21 16:54:06,033 INFO [train.py:561] (1/4) Epoch 2717, batch 14, global_batch_idx: 43470, batch size: 142, loss[dur_loss=0.1999, prior_loss=0.9732, diff_loss=0.26, 
tot_loss=1.433, over 142.00 samples.], tot_loss[dur_loss=0.1946, prior_loss=0.9731, diff_loss=0.3461, tot_loss=1.514, over 2210.00 samples.], 2024-10-21 16:54:07,547 INFO [train.py:682] (1/4) Start epoch 2718 2024-10-21 16:54:28,451 INFO [train.py:561] (1/4) Epoch 2718, batch 8, global_batch_idx: 43480, batch size: 170, loss[dur_loss=0.1953, prior_loss=0.9735, diff_loss=0.3119, tot_loss=1.481, over 170.00 samples.], tot_loss[dur_loss=0.1936, prior_loss=0.9729, diff_loss=0.377, tot_loss=1.543, over 1432.00 samples.], 2024-10-21 16:54:38,990 INFO [train.py:682] (1/4) Start epoch 2719 2024-10-21 16:54:50,731 INFO [train.py:561] (1/4) Epoch 2719, batch 2, global_batch_idx: 43490, batch size: 203, loss[dur_loss=0.194, prior_loss=0.9731, diff_loss=0.3339, tot_loss=1.501, over 203.00 samples.], tot_loss[dur_loss=0.1961, prior_loss=0.9733, diff_loss=0.3284, tot_loss=1.498, over 442.00 samples.], 2024-10-21 16:55:05,981 INFO [train.py:561] (1/4) Epoch 2719, batch 12, global_batch_idx: 43500, batch size: 152, loss[dur_loss=0.1978, prior_loss=0.9735, diff_loss=0.333, tot_loss=1.504, over 152.00 samples.], tot_loss[dur_loss=0.1941, prior_loss=0.973, diff_loss=0.3596, tot_loss=1.527, over 1966.00 samples.], 2024-10-21 16:55:07,671 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 16:56:03,823 INFO [train.py:589] (1/4) Epoch 2719, validation: dur_loss=0.4572, prior_loss=1.038, diff_loss=0.3689, tot_loss=1.864, over 100.00 samples. 2024-10-21 16:56:03,824 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 16:56:06,949 INFO [train.py:682] (1/4) Start epoch 2720 2024-10-21 16:56:25,448 INFO [train.py:561] (1/4) Epoch 2720, batch 6, global_batch_idx: 43510, batch size: 106, loss[dur_loss=0.1946, prior_loss=0.9733, diff_loss=0.3105, tot_loss=1.478, over 106.00 samples.], tot_loss[dur_loss=0.1923, prior_loss=0.9727, diff_loss=0.409, tot_loss=1.574, over 1142.00 samples.], 2024-10-21 16:56:38,855 INFO [train.py:682] (1/4) Start epoch 2721 2024-10-21 16:56:48,393 INFO [train.py:561] (1/4) Epoch 2721, batch 0, global_batch_idx: 43520, batch size: 108, loss[dur_loss=0.2004, prior_loss=0.9741, diff_loss=0.3098, tot_loss=1.484, over 108.00 samples.], tot_loss[dur_loss=0.2004, prior_loss=0.9741, diff_loss=0.3098, tot_loss=1.484, over 108.00 samples.], 2024-10-21 16:57:03,116 INFO [train.py:561] (1/4) Epoch 2721, batch 10, global_batch_idx: 43530, batch size: 111, loss[dur_loss=0.1954, prior_loss=0.9744, diff_loss=0.2952, tot_loss=1.465, over 111.00 samples.], tot_loss[dur_loss=0.1938, prior_loss=0.973, diff_loss=0.363, tot_loss=1.53, over 1656.00 samples.], 2024-10-21 16:57:10,361 INFO [train.py:682] (1/4) Start epoch 2722 2024-10-21 16:57:24,930 INFO [train.py:561] (1/4) Epoch 2722, batch 4, global_batch_idx: 43540, batch size: 189, loss[dur_loss=0.1956, prior_loss=0.9732, diff_loss=0.325, tot_loss=1.494, over 189.00 samples.], tot_loss[dur_loss=0.1916, prior_loss=0.9725, diff_loss=0.4136, tot_loss=1.578, over 937.00 samples.], 2024-10-21 16:57:40,585 INFO [train.py:561] (1/4) Epoch 2722, batch 14, global_batch_idx: 43550, batch size: 142, loss[dur_loss=0.1975, prior_loss=0.9731, diff_loss=0.3008, tot_loss=1.471, over 142.00 samples.], tot_loss[dur_loss=0.1948, prior_loss=0.9731, diff_loss=0.3494, tot_loss=1.517, over 2210.00 samples.], 2024-10-21 16:57:42,205 INFO [train.py:682] (1/4) Start epoch 2723 2024-10-21 16:58:03,591 INFO [train.py:561] (1/4) Epoch 2723, batch 8, global_batch_idx: 43560, batch size: 170, loss[dur_loss=0.1948, prior_loss=0.9734, diff_loss=0.3285, tot_loss=1.497, over 170.00 samples.], tot_loss[dur_loss=0.1932, prior_loss=0.9729, diff_loss=0.3789, tot_loss=1.545, over 1432.00 samples.], 2024-10-21 16:58:13,862 INFO [train.py:682] (1/4) Start epoch 2724
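
The "Start epoch" records double as a wall-clock trace: consecutive starts in this stretch are roughly half a minute apart (epoch 2720 at 16:56:06 to epoch 2721 at 16:56:38 is about 32 s), while the 2719-to-2720 gap stretches to about 88 s because the validation pass above ran in between. A short sketch (a hypothetical helper, standard library only) that recovers those per-epoch durations from the timestamps:

    import re
    from datetime import datetime

    # Matches records like:
    #   2024-10-21 16:56:06,949 INFO [train.py:682] (1/4) Start epoch 2720
    START = re.compile(
        r"(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}),\d{3} INFO .*? Start epoch (\d+)"
    )

    def epoch_durations(text):
        """Map each epoch to the seconds elapsed before the next one starts."""
        starts = [(int(epoch), datetime.strptime(ts, "%Y-%m-%d %H:%M:%S"))
                  for ts, epoch in START.findall(text)]
        return {e: (t_next - t).total_seconds()
                for (e, t), (_, t_next) in zip(starts, starts[1:])}

On this excerpt the map reads about 30 s for an ordinary epoch, with the epochs that include a validation pass standing out as the slow ones.
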
2024-10-21 16:58:26,435 INFO [train.py:561] (1/4) Epoch 2724, batch 2, global_batch_idx: 43570, batch size: 203, loss[dur_loss=0.1964, prior_loss=0.9733, diff_loss=0.3608, tot_loss=1.53, over 203.00 samples.], tot_loss[dur_loss=0.1963, prior_loss=0.9734, diff_loss=0.3353, tot_loss=1.505, over 442.00 samples.], 2024-10-21 16:58:41,180 INFO [train.py:561] (1/4) Epoch 2724, batch 12, global_batch_idx: 43580, batch size: 152, loss[dur_loss=0.1988, prior_loss=0.9735, diff_loss=0.306, tot_loss=1.478, over 152.00 samples.], tot_loss[dur_loss=0.1941, prior_loss=0.973, diff_loss=0.3634, tot_loss=1.531, over 1966.00 samples.], 2024-10-21 16:58:45,975 INFO [train.py:682] (1/4) Start epoch 2725 2024-10-21 16:59:04,206 INFO [train.py:561] (1/4) Epoch 2725, batch 6, global_batch_idx: 43590, batch size: 106, loss[dur_loss=0.1956, prior_loss=0.9732, diff_loss=0.326, tot_loss=1.495, over 106.00 samples.], tot_loss[dur_loss=0.1909, prior_loss=0.9727, diff_loss=0.4001, tot_loss=1.564, over 1142.00 samples.], 2024-10-21 16:59:18,476 INFO [train.py:682] (1/4) Start epoch 2726 2024-10-21 16:59:28,591 INFO [train.py:561] (1/4) Epoch 2726, batch 0, global_batch_idx: 43600, batch size: 108, loss[dur_loss=0.2003, prior_loss=0.9741, diff_loss=0.2569, tot_loss=1.431, over 108.00 samples.], tot_loss[dur_loss=0.2003, prior_loss=0.9741, diff_loss=0.2569, tot_loss=1.431, over 108.00 samples.], 2024-10-21 16:59:43,206 INFO [train.py:561] (1/4) Epoch 2726, batch 10, global_batch_idx: 43610, batch size: 111, loss[dur_loss=0.1967, prior_loss=0.9741, diff_loss=0.3023, tot_loss=1.473, over 111.00 samples.], tot_loss[dur_loss=0.193, prior_loss=0.973, diff_loss=0.371, tot_loss=1.537, over 1656.00 samples.], 2024-10-21 16:59:50,416 INFO [train.py:682] (1/4) Start epoch 2727 2024-10-21 17:00:05,277 INFO [train.py:561] (1/4) Epoch 2727, batch 4, global_batch_idx: 43620, batch size: 189, loss[dur_loss=0.1934, prior_loss=0.9733, diff_loss=0.3113, tot_loss=1.478, over 189.00 samples.], tot_loss[dur_loss=0.1924, prior_loss=0.9725, diff_loss=0.4244, tot_loss=1.589, over 937.00 samples.], 2024-10-21 17:00:20,760 INFO [train.py:561] (1/4) Epoch 2727, batch 14, global_batch_idx: 43630, batch size: 142, loss[dur_loss=0.1941, prior_loss=0.9729, diff_loss=0.3203, tot_loss=1.487, over 142.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.973, diff_loss=0.3549, tot_loss=1.522, over 2210.00 samples.], 2024-10-21 17:00:22,201 INFO [train.py:682] (1/4) Start epoch 2728 2024-10-21 17:00:43,648 INFO [train.py:561] (1/4) Epoch 2728, batch 8, global_batch_idx: 43640, batch size: 170, loss[dur_loss=0.1955, prior_loss=0.9736, diff_loss=0.2982, tot_loss=1.467, over 170.00 samples.], tot_loss[dur_loss=0.1925, prior_loss=0.9728, diff_loss=0.3834, tot_loss=1.549, over 1432.00 samples.], 2024-10-21 17:00:54,138 INFO [train.py:682] (1/4) Start epoch 2729 2024-10-21 17:01:06,511 INFO [train.py:561] (1/4) Epoch 2729, batch 2, global_batch_idx: 43650, batch size: 203, loss[dur_loss=0.1964, prior_loss=0.9734, diff_loss=0.3556, tot_loss=1.525, over 203.00 samples.], tot_loss[dur_loss=0.1974, prior_loss=0.9734, diff_loss=0.3205, tot_loss=1.491, over 442.00 samples.], 2024-10-21 17:01:20,908 INFO [train.py:561] (1/4) Epoch 2729, batch 12, global_batch_idx: 43660, batch size: 152, loss[dur_loss=0.1952, prior_loss=0.9731, diff_loss=0.3, tot_loss=1.468, over
152.00 samples.], tot_loss[dur_loss=0.194, prior_loss=0.973, diff_loss=0.3559, tot_loss=1.523, over 1966.00 samples.], 2024-10-21 17:01:25,467 INFO [train.py:682] (1/4) Start epoch 2730 2024-10-21 17:01:43,518 INFO [train.py:561] (1/4) Epoch 2730, batch 6, global_batch_idx: 43670, batch size: 106, loss[dur_loss=0.1929, prior_loss=0.9733, diff_loss=0.2732, tot_loss=1.439, over 106.00 samples.], tot_loss[dur_loss=0.1915, prior_loss=0.9728, diff_loss=0.3897, tot_loss=1.554, over 1142.00 samples.], 2024-10-21 17:01:56,806 INFO [train.py:682] (1/4) Start epoch 2731 2024-10-21 17:02:06,122 INFO [train.py:561] (1/4) Epoch 2731, batch 0, global_batch_idx: 43680, batch size: 108, loss[dur_loss=0.2011, prior_loss=0.9738, diff_loss=0.2952, tot_loss=1.47, over 108.00 samples.], tot_loss[dur_loss=0.2011, prior_loss=0.9738, diff_loss=0.2952, tot_loss=1.47, over 108.00 samples.], 2024-10-21 17:02:20,674 INFO [train.py:561] (1/4) Epoch 2731, batch 10, global_batch_idx: 43690, batch size: 111, loss[dur_loss=0.196, prior_loss=0.9742, diff_loss=0.2799, tot_loss=1.45, over 111.00 samples.], tot_loss[dur_loss=0.1927, prior_loss=0.9729, diff_loss=0.3677, tot_loss=1.533, over 1656.00 samples.], 2024-10-21 17:02:27,944 INFO [train.py:682] (1/4) Start epoch 2732 2024-10-21 17:02:42,652 INFO [train.py:561] (1/4) Epoch 2732, batch 4, global_batch_idx: 43700, batch size: 189, loss[dur_loss=0.194, prior_loss=0.9732, diff_loss=0.3198, tot_loss=1.487, over 189.00 samples.], tot_loss[dur_loss=0.1916, prior_loss=0.9724, diff_loss=0.4167, tot_loss=1.581, over 937.00 samples.], 2024-10-21 17:02:57,744 INFO [train.py:561] (1/4) Epoch 2732, batch 14, global_batch_idx: 43710, batch size: 142, loss[dur_loss=0.1955, prior_loss=0.973, diff_loss=0.3022, tot_loss=1.471, over 142.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.973, diff_loss=0.3553, tot_loss=1.522, over 2210.00 samples.], 2024-10-21 17:02:59,211 INFO [train.py:682] (1/4) Start epoch 2733 2024-10-21 17:03:20,236 INFO [train.py:561] (1/4) Epoch 2733, batch 8, global_batch_idx: 43720, batch size: 170, loss[dur_loss=0.1941, prior_loss=0.9733, diff_loss=0.3409, tot_loss=1.508, over 170.00 samples.], tot_loss[dur_loss=0.1921, prior_loss=0.9727, diff_loss=0.3811, tot_loss=1.546, over 1432.00 samples.], 2024-10-21 17:03:30,526 INFO [train.py:682] (1/4) Start epoch 2734 2024-10-21 17:03:43,036 INFO [train.py:561] (1/4) Epoch 2734, batch 2, global_batch_idx: 43730, batch size: 203, loss[dur_loss=0.1974, prior_loss=0.9733, diff_loss=0.3433, tot_loss=1.514, over 203.00 samples.], tot_loss[dur_loss=0.1982, prior_loss=0.9733, diff_loss=0.3221, tot_loss=1.494, over 442.00 samples.], 2024-10-21 17:03:57,671 INFO [train.py:561] (1/4) Epoch 2734, batch 12, global_batch_idx: 43740, batch size: 152, loss[dur_loss=0.1955, prior_loss=0.9733, diff_loss=0.2902, tot_loss=1.459, over 152.00 samples.], tot_loss[dur_loss=0.194, prior_loss=0.973, diff_loss=0.3573, tot_loss=1.524, over 1966.00 samples.], 2024-10-21 17:04:02,259 INFO [train.py:682] (1/4) Start epoch 2735 2024-10-21 17:04:20,769 INFO [train.py:561] (1/4) Epoch 2735, batch 6, global_batch_idx: 43750, batch size: 106, loss[dur_loss=0.1942, prior_loss=0.973, diff_loss=0.2976, tot_loss=1.465, over 106.00 samples.], tot_loss[dur_loss=0.1926, prior_loss=0.9726, diff_loss=0.4027, tot_loss=1.568, over 1142.00 samples.], 2024-10-21 17:04:34,195 INFO [train.py:682] (1/4) Start epoch 2736 2024-10-21 17:04:43,643 INFO [train.py:561] (1/4) Epoch 2736, batch 0, global_batch_idx: 43760, batch size: 108, loss[dur_loss=0.1991, 
prior_loss=0.9738, diff_loss=0.2949, tot_loss=1.468, over 108.00 samples.], tot_loss[dur_loss=0.1991, prior_loss=0.9738, diff_loss=0.2949, tot_loss=1.468, over 108.00 samples.], 2024-10-21 17:04:58,453 INFO [train.py:561] (1/4) Epoch 2736, batch 10, global_batch_idx: 43770, batch size: 111, loss[dur_loss=0.1972, prior_loss=0.974, diff_loss=0.2475, tot_loss=1.419, over 111.00 samples.], tot_loss[dur_loss=0.1929, prior_loss=0.9729, diff_loss=0.3697, tot_loss=1.535, over 1656.00 samples.], 2024-10-21 17:05:05,709 INFO [train.py:682] (1/4) Start epoch 2737 2024-10-21 17:05:21,477 INFO [train.py:561] (1/4) Epoch 2737, batch 4, global_batch_idx: 43780, batch size: 189, loss[dur_loss=0.1943, prior_loss=0.9731, diff_loss=0.3025, tot_loss=1.47, over 189.00 samples.], tot_loss[dur_loss=0.1914, prior_loss=0.9724, diff_loss=0.4089, tot_loss=1.573, over 937.00 samples.], 2024-10-21 17:05:37,017 INFO [train.py:561] (1/4) Epoch 2737, batch 14, global_batch_idx: 43790, batch size: 142, loss[dur_loss=0.1974, prior_loss=0.9731, diff_loss=0.3133, tot_loss=1.484, over 142.00 samples.], tot_loss[dur_loss=0.1938, prior_loss=0.973, diff_loss=0.3449, tot_loss=1.512, over 2210.00 samples.], 2024-10-21 17:05:38,509 INFO [train.py:682] (1/4) Start epoch 2738 2024-10-21 17:05:59,576 INFO [train.py:561] (1/4) Epoch 2738, batch 8, global_batch_idx: 43800, batch size: 170, loss[dur_loss=0.1957, prior_loss=0.9734, diff_loss=0.3117, tot_loss=1.481, over 170.00 samples.], tot_loss[dur_loss=0.1919, prior_loss=0.9727, diff_loss=0.3708, tot_loss=1.535, over 1432.00 samples.], 2024-10-21 17:06:09,968 INFO [train.py:682] (1/4) Start epoch 2739 2024-10-21 17:06:22,312 INFO [train.py:561] (1/4) Epoch 2739, batch 2, global_batch_idx: 43810, batch size: 203, loss[dur_loss=0.1964, prior_loss=0.9731, diff_loss=0.3212, tot_loss=1.491, over 203.00 samples.], tot_loss[dur_loss=0.1967, prior_loss=0.9733, diff_loss=0.3179, tot_loss=1.488, over 442.00 samples.], 2024-10-21 17:06:37,102 INFO [train.py:561] (1/4) Epoch 2739, batch 12, global_batch_idx: 43820, batch size: 152, loss[dur_loss=0.1944, prior_loss=0.9733, diff_loss=0.276, tot_loss=1.444, over 152.00 samples.], tot_loss[dur_loss=0.1931, prior_loss=0.9729, diff_loss=0.3532, tot_loss=1.519, over 1966.00 samples.], 2024-10-21 17:06:41,657 INFO [train.py:682] (1/4) Start epoch 2740 2024-10-21 17:06:59,550 INFO [train.py:561] (1/4) Epoch 2740, batch 6, global_batch_idx: 43830, batch size: 106, loss[dur_loss=0.1949, prior_loss=0.9734, diff_loss=0.2999, tot_loss=1.468, over 106.00 samples.], tot_loss[dur_loss=0.1908, prior_loss=0.9726, diff_loss=0.3996, tot_loss=1.563, over 1142.00 samples.], 2024-10-21 17:07:12,935 INFO [train.py:682] (1/4) Start epoch 2741 2024-10-21 17:07:23,982 INFO [train.py:561] (1/4) Epoch 2741, batch 0, global_batch_idx: 43840, batch size: 108, loss[dur_loss=0.1963, prior_loss=0.9737, diff_loss=0.2855, tot_loss=1.455, over 108.00 samples.], tot_loss[dur_loss=0.1963, prior_loss=0.9737, diff_loss=0.2855, tot_loss=1.455, over 108.00 samples.], 2024-10-21 17:07:38,633 INFO [train.py:561] (1/4) Epoch 2741, batch 10, global_batch_idx: 43850, batch size: 111, loss[dur_loss=0.1949, prior_loss=0.974, diff_loss=0.2955, tot_loss=1.464, over 111.00 samples.], tot_loss[dur_loss=0.1924, prior_loss=0.9728, diff_loss=0.3675, tot_loss=1.533, over 1656.00 samples.], 2024-10-21 17:07:45,840 INFO [train.py:682] (1/4) Start epoch 2742 2024-10-21 17:08:00,544 INFO [train.py:561] (1/4) Epoch 2742, batch 4, global_batch_idx: 43860, batch size: 189, loss[dur_loss=0.1911, 
prior_loss=0.9729, diff_loss=0.3187, tot_loss=1.483, over 189.00 samples.], tot_loss[dur_loss=0.191, prior_loss=0.9723, diff_loss=0.4282, tot_loss=1.592, over 937.00 samples.], 2024-10-21 17:08:15,902 INFO [train.py:561] (1/4) Epoch 2742, batch 14, global_batch_idx: 43870, batch size: 142, loss[dur_loss=0.1967, prior_loss=0.973, diff_loss=0.2864, tot_loss=1.456, over 142.00 samples.], tot_loss[dur_loss=0.1934, prior_loss=0.9729, diff_loss=0.3542, tot_loss=1.52, over 2210.00 samples.], 2024-10-21 17:08:17,346 INFO [train.py:682] (1/4) Start epoch 2743 2024-10-21 17:08:39,200 INFO [train.py:561] (1/4) Epoch 2743, batch 8, global_batch_idx: 43880, batch size: 170, loss[dur_loss=0.1989, prior_loss=0.9734, diff_loss=0.3067, tot_loss=1.479, over 170.00 samples.], tot_loss[dur_loss=0.1927, prior_loss=0.9728, diff_loss=0.37, tot_loss=1.535, over 1432.00 samples.], 2024-10-21 17:08:49,822 INFO [train.py:682] (1/4) Start epoch 2744 2024-10-21 17:09:02,046 INFO [train.py:561] (1/4) Epoch 2744, batch 2, global_batch_idx: 43890, batch size: 203, loss[dur_loss=0.1939, prior_loss=0.9731, diff_loss=0.3073, tot_loss=1.474, over 203.00 samples.], tot_loss[dur_loss=0.1954, prior_loss=0.9733, diff_loss=0.3148, tot_loss=1.483, over 442.00 samples.], 2024-10-21 17:09:16,686 INFO [train.py:561] (1/4) Epoch 2744, batch 12, global_batch_idx: 43900, batch size: 152, loss[dur_loss=0.191, prior_loss=0.9732, diff_loss=0.293, tot_loss=1.457, over 152.00 samples.], tot_loss[dur_loss=0.1928, prior_loss=0.9729, diff_loss=0.3563, tot_loss=1.522, over 1966.00 samples.], 2024-10-21 17:09:21,264 INFO [train.py:682] (1/4) Start epoch 2745 2024-10-21 17:09:38,981 INFO [train.py:561] (1/4) Epoch 2745, batch 6, global_batch_idx: 43910, batch size: 106, loss[dur_loss=0.1927, prior_loss=0.9733, diff_loss=0.2571, tot_loss=1.423, over 106.00 samples.], tot_loss[dur_loss=0.1918, prior_loss=0.9725, diff_loss=0.3946, tot_loss=1.559, over 1142.00 samples.], 2024-10-21 17:09:52,499 INFO [train.py:682] (1/4) Start epoch 2746 2024-10-21 17:10:02,151 INFO [train.py:561] (1/4) Epoch 2746, batch 0, global_batch_idx: 43920, batch size: 108, loss[dur_loss=0.2003, prior_loss=0.9737, diff_loss=0.3282, tot_loss=1.502, over 108.00 samples.], tot_loss[dur_loss=0.2003, prior_loss=0.9737, diff_loss=0.3282, tot_loss=1.502, over 108.00 samples.], 2024-10-21 17:10:16,804 INFO [train.py:561] (1/4) Epoch 2746, batch 10, global_batch_idx: 43930, batch size: 111, loss[dur_loss=0.1977, prior_loss=0.9744, diff_loss=0.3064, tot_loss=1.478, over 111.00 samples.], tot_loss[dur_loss=0.1921, prior_loss=0.9728, diff_loss=0.375, tot_loss=1.54, over 1656.00 samples.], 2024-10-21 17:10:24,175 INFO [train.py:682] (1/4) Start epoch 2747 2024-10-21 17:10:39,136 INFO [train.py:561] (1/4) Epoch 2747, batch 4, global_batch_idx: 43940, batch size: 189, loss[dur_loss=0.1937, prior_loss=0.9732, diff_loss=0.3198, tot_loss=1.487, over 189.00 samples.], tot_loss[dur_loss=0.1914, prior_loss=0.9723, diff_loss=0.409, tot_loss=1.573, over 937.00 samples.], 2024-10-21 17:10:54,557 INFO [train.py:561] (1/4) Epoch 2747, batch 14, global_batch_idx: 43950, batch size: 142, loss[dur_loss=0.1962, prior_loss=0.9728, diff_loss=0.2916, tot_loss=1.461, over 142.00 samples.], tot_loss[dur_loss=0.1936, prior_loss=0.9729, diff_loss=0.3464, tot_loss=1.513, over 2210.00 samples.], 2024-10-21 17:10:56,064 INFO [train.py:682] (1/4) Start epoch 2748 2024-10-21 17:11:17,187 INFO [train.py:561] (1/4) Epoch 2748, batch 8, global_batch_idx: 43960, batch size: 170, loss[dur_loss=0.1976, prior_loss=0.9733, 
diff_loss=0.3382, tot_loss=1.509, over 170.00 samples.], tot_loss[dur_loss=0.1915, prior_loss=0.9727, diff_loss=0.3826, tot_loss=1.547, over 1432.00 samples.], 2024-10-21 17:11:27,975 INFO [train.py:682] (1/4) Start epoch 2749 2024-10-21 17:11:40,856 INFO [train.py:561] (1/4) Epoch 2749, batch 2, global_batch_idx: 43970, batch size: 203, loss[dur_loss=0.1965, prior_loss=0.9732, diff_loss=0.3483, tot_loss=1.518, over 203.00 samples.], tot_loss[dur_loss=0.197, prior_loss=0.9733, diff_loss=0.3064, tot_loss=1.477, over 442.00 samples.], 2024-10-21 17:11:55,893 INFO [train.py:561] (1/4) Epoch 2749, batch 12, global_batch_idx: 43980, batch size: 152, loss[dur_loss=0.1939, prior_loss=0.9733, diff_loss=0.2773, tot_loss=1.444, over 152.00 samples.], tot_loss[dur_loss=0.1935, prior_loss=0.9729, diff_loss=0.359, tot_loss=1.525, over 1966.00 samples.], 2024-10-21 17:12:00,862 INFO [train.py:682] (1/4) Start epoch 2750 2024-10-21 17:12:18,621 INFO [train.py:561] (1/4) Epoch 2750, batch 6, global_batch_idx: 43990, batch size: 106, loss[dur_loss=0.1955, prior_loss=0.9732, diff_loss=0.2914, tot_loss=1.46, over 106.00 samples.], tot_loss[dur_loss=0.1919, prior_loss=0.9726, diff_loss=0.3865, tot_loss=1.551, over 1142.00 samples.], 2024-10-21 17:12:31,998 INFO [train.py:682] (1/4) Start epoch 2751 2024-10-21 17:12:41,408 INFO [train.py:561] (1/4) Epoch 2751, batch 0, global_batch_idx: 44000, batch size: 108, loss[dur_loss=0.2009, prior_loss=0.9738, diff_loss=0.3285, tot_loss=1.503, over 108.00 samples.], tot_loss[dur_loss=0.2009, prior_loss=0.9738, diff_loss=0.3285, tot_loss=1.503, over 108.00 samples.], 2024-10-21 17:12:56,073 INFO [train.py:561] (1/4) Epoch 2751, batch 10, global_batch_idx: 44010, batch size: 111, loss[dur_loss=0.1953, prior_loss=0.9742, diff_loss=0.2954, tot_loss=1.465, over 111.00 samples.], tot_loss[dur_loss=0.1927, prior_loss=0.9728, diff_loss=0.3623, tot_loss=1.528, over 1656.00 samples.], 2024-10-21 17:13:03,214 INFO [train.py:682] (1/4) Start epoch 2752 2024-10-21 17:13:17,046 INFO [train.py:561] (1/4) Epoch 2752, batch 4, global_batch_idx: 44020, batch size: 189, loss[dur_loss=0.1915, prior_loss=0.973, diff_loss=0.291, tot_loss=1.456, over 189.00 samples.], tot_loss[dur_loss=0.1906, prior_loss=0.9724, diff_loss=0.4024, tot_loss=1.565, over 937.00 samples.], 2024-10-21 17:13:32,059 INFO [train.py:561] (1/4) Epoch 2752, batch 14, global_batch_idx: 44030, batch size: 142, loss[dur_loss=0.1937, prior_loss=0.9728, diff_loss=0.3253, tot_loss=1.492, over 142.00 samples.], tot_loss[dur_loss=0.1932, prior_loss=0.9729, diff_loss=0.3465, tot_loss=1.513, over 2210.00 samples.], 2024-10-21 17:13:33,475 INFO [train.py:682] (1/4) Start epoch 2753 2024-10-21 17:13:54,349 INFO [train.py:561] (1/4) Epoch 2753, batch 8, global_batch_idx: 44040, batch size: 170, loss[dur_loss=0.1973, prior_loss=0.9732, diff_loss=0.323, tot_loss=1.493, over 170.00 samples.], tot_loss[dur_loss=0.193, prior_loss=0.9727, diff_loss=0.3855, tot_loss=1.551, over 1432.00 samples.], 2024-10-21 17:14:05,127 INFO [train.py:682] (1/4) Start epoch 2754 2024-10-21 17:14:16,959 INFO [train.py:561] (1/4) Epoch 2754, batch 2, global_batch_idx: 44050, batch size: 203, loss[dur_loss=0.1974, prior_loss=0.9733, diff_loss=0.3388, tot_loss=1.51, over 203.00 samples.], tot_loss[dur_loss=0.1972, prior_loss=0.9733, diff_loss=0.303, tot_loss=1.474, over 442.00 samples.], 2024-10-21 17:14:31,763 INFO [train.py:561] (1/4) Epoch 2754, batch 12, global_batch_idx: 44060, batch size: 152, loss[dur_loss=0.1934, prior_loss=0.9732, diff_loss=0.2855, 
tot_loss=1.452, over 152.00 samples.], tot_loss[dur_loss=0.1935, prior_loss=0.9729, diff_loss=0.3514, tot_loss=1.518, over 1966.00 samples.], 2024-10-21 17:14:36,278 INFO [train.py:682] (1/4) Start epoch 2755 2024-10-21 17:14:53,884 INFO [train.py:561] (1/4) Epoch 2755, batch 6, global_batch_idx: 44070, batch size: 106, loss[dur_loss=0.1945, prior_loss=0.9731, diff_loss=0.2787, tot_loss=1.446, over 106.00 samples.], tot_loss[dur_loss=0.1925, prior_loss=0.9725, diff_loss=0.3927, tot_loss=1.558, over 1142.00 samples.], 2024-10-21 17:15:07,282 INFO [train.py:682] (1/4) Start epoch 2756 2024-10-21 17:15:16,391 INFO [train.py:561] (1/4) Epoch 2756, batch 0, global_batch_idx: 44080, batch size: 108, loss[dur_loss=0.1957, prior_loss=0.9736, diff_loss=0.3111, tot_loss=1.48, over 108.00 samples.], tot_loss[dur_loss=0.1957, prior_loss=0.9736, diff_loss=0.3111, tot_loss=1.48, over 108.00 samples.], 2024-10-21 17:15:31,079 INFO [train.py:561] (1/4) Epoch 2756, batch 10, global_batch_idx: 44090, batch size: 111, loss[dur_loss=0.1935, prior_loss=0.9741, diff_loss=0.2916, tot_loss=1.459, over 111.00 samples.], tot_loss[dur_loss=0.1921, prior_loss=0.9727, diff_loss=0.3551, tot_loss=1.52, over 1656.00 samples.], 2024-10-21 17:15:38,489 INFO [train.py:682] (1/4) Start epoch 2757 2024-10-21 17:15:53,180 INFO [train.py:561] (1/4) Epoch 2757, batch 4, global_batch_idx: 44100, batch size: 189, loss[dur_loss=0.1932, prior_loss=0.9731, diff_loss=0.3568, tot_loss=1.523, over 189.00 samples.], tot_loss[dur_loss=0.1912, prior_loss=0.9723, diff_loss=0.4246, tot_loss=1.588, over 937.00 samples.], 2024-10-21 17:16:08,640 INFO [train.py:561] (1/4) Epoch 2757, batch 14, global_batch_idx: 44110, batch size: 142, loss[dur_loss=0.199, prior_loss=0.9731, diff_loss=0.2789, tot_loss=1.451, over 142.00 samples.], tot_loss[dur_loss=0.1937, prior_loss=0.973, diff_loss=0.3514, tot_loss=1.518, over 2210.00 samples.], 2024-10-21 17:16:10,120 INFO [train.py:682] (1/4) Start epoch 2758 2024-10-21 17:16:31,045 INFO [train.py:561] (1/4) Epoch 2758, batch 8, global_batch_idx: 44120, batch size: 170, loss[dur_loss=0.1947, prior_loss=0.9735, diff_loss=0.3078, tot_loss=1.476, over 170.00 samples.], tot_loss[dur_loss=0.192, prior_loss=0.9728, diff_loss=0.3665, tot_loss=1.531, over 1432.00 samples.], 2024-10-21 17:16:41,659 INFO [train.py:682] (1/4) Start epoch 2759 2024-10-21 17:16:53,681 INFO [train.py:561] (1/4) Epoch 2759, batch 2, global_batch_idx: 44130, batch size: 203, loss[dur_loss=0.197, prior_loss=0.9731, diff_loss=0.3376, tot_loss=1.508, over 203.00 samples.], tot_loss[dur_loss=0.197, prior_loss=0.9732, diff_loss=0.3062, tot_loss=1.476, over 442.00 samples.], 2024-10-21 17:17:08,560 INFO [train.py:561] (1/4) Epoch 2759, batch 12, global_batch_idx: 44140, batch size: 152, loss[dur_loss=0.1921, prior_loss=0.9733, diff_loss=0.2834, tot_loss=1.449, over 152.00 samples.], tot_loss[dur_loss=0.1935, prior_loss=0.9729, diff_loss=0.3469, tot_loss=1.513, over 1966.00 samples.], 2024-10-21 17:17:13,161 INFO [train.py:682] (1/4) Start epoch 2760 2024-10-21 17:17:31,040 INFO [train.py:561] (1/4) Epoch 2760, batch 6, global_batch_idx: 44150, batch size: 106, loss[dur_loss=0.1927, prior_loss=0.9733, diff_loss=0.3074, tot_loss=1.473, over 106.00 samples.], tot_loss[dur_loss=0.1931, prior_loss=0.9726, diff_loss=0.3902, tot_loss=1.556, over 1142.00 samples.], 2024-10-21 17:17:44,727 INFO [train.py:682] (1/4) Start epoch 2761 2024-10-21 17:17:54,375 INFO [train.py:561] (1/4) Epoch 2761, batch 0, global_batch_idx: 44160, batch size: 108, 
loss[dur_loss=0.2009, prior_loss=0.9738, diff_loss=0.3089, tot_loss=1.484, over 108.00 samples.], tot_loss[dur_loss=0.2009, prior_loss=0.9738, diff_loss=0.3089, tot_loss=1.484, over 108.00 samples.], 2024-10-21 17:18:10,130 INFO [train.py:561] (1/4) Epoch 2761, batch 10, global_batch_idx: 44170, batch size: 111, loss[dur_loss=0.1965, prior_loss=0.9743, diff_loss=0.2846, tot_loss=1.455, over 111.00 samples.], tot_loss[dur_loss=0.1932, prior_loss=0.9729, diff_loss=0.362, tot_loss=1.528, over 1656.00 samples.], 2024-10-21 17:18:17,520 INFO [train.py:682] (1/4) Start epoch 2762 2024-10-21 17:18:32,078 INFO [train.py:561] (1/4) Epoch 2762, batch 4, global_batch_idx: 44180, batch size: 189, loss[dur_loss=0.1937, prior_loss=0.9735, diff_loss=0.3248, tot_loss=1.492, over 189.00 samples.], tot_loss[dur_loss=0.1919, prior_loss=0.9724, diff_loss=0.4131, tot_loss=1.577, over 937.00 samples.], 2024-10-21 17:18:47,591 INFO [train.py:561] (1/4) Epoch 2762, batch 14, global_batch_idx: 44190, batch size: 142, loss[dur_loss=0.1945, prior_loss=0.9727, diff_loss=0.2746, tot_loss=1.442, over 142.00 samples.], tot_loss[dur_loss=0.1934, prior_loss=0.9729, diff_loss=0.3498, tot_loss=1.516, over 2210.00 samples.], 2024-10-21 17:18:49,081 INFO [train.py:682] (1/4) Start epoch 2763 2024-10-21 17:19:11,535 INFO [train.py:561] (1/4) Epoch 2763, batch 8, global_batch_idx: 44200, batch size: 170, loss[dur_loss=0.1953, prior_loss=0.9733, diff_loss=0.2831, tot_loss=1.452, over 170.00 samples.], tot_loss[dur_loss=0.193, prior_loss=0.9726, diff_loss=0.3842, tot_loss=1.55, over 1432.00 samples.], 2024-10-21 17:19:22,210 INFO [train.py:682] (1/4) Start epoch 2764 2024-10-21 17:19:37,952 INFO [train.py:561] (1/4) Epoch 2764, batch 2, global_batch_idx: 44210, batch size: 203, loss[dur_loss=0.1962, prior_loss=0.9731, diff_loss=0.3423, tot_loss=1.512, over 203.00 samples.], tot_loss[dur_loss=0.1952, prior_loss=0.9731, diff_loss=0.3179, tot_loss=1.486, over 442.00 samples.], 2024-10-21 17:19:52,800 INFO [train.py:561] (1/4) Epoch 2764, batch 12, global_batch_idx: 44220, batch size: 152, loss[dur_loss=0.1912, prior_loss=0.973, diff_loss=0.3136, tot_loss=1.478, over 152.00 samples.], tot_loss[dur_loss=0.1925, prior_loss=0.9728, diff_loss=0.3537, tot_loss=1.519, over 1966.00 samples.], 2024-10-21 17:19:57,507 INFO [train.py:682] (1/4) Start epoch 2765 2024-10-21 17:20:17,445 INFO [train.py:561] (1/4) Epoch 2765, batch 6, global_batch_idx: 44230, batch size: 106, loss[dur_loss=0.1936, prior_loss=0.9731, diff_loss=0.3286, tot_loss=1.495, over 106.00 samples.], tot_loss[dur_loss=0.1915, prior_loss=0.9724, diff_loss=0.4078, tot_loss=1.572, over 1142.00 samples.], 2024-10-21 17:20:31,362 INFO [train.py:682] (1/4) Start epoch 2766 2024-10-21 17:20:41,465 INFO [train.py:561] (1/4) Epoch 2766, batch 0, global_batch_idx: 44240, batch size: 108, loss[dur_loss=0.1961, prior_loss=0.9736, diff_loss=0.2776, tot_loss=1.447, over 108.00 samples.], tot_loss[dur_loss=0.1961, prior_loss=0.9736, diff_loss=0.2776, tot_loss=1.447, over 108.00 samples.], 2024-10-21 17:20:56,677 INFO [train.py:561] (1/4) Epoch 2766, batch 10, global_batch_idx: 44250, batch size: 111, loss[dur_loss=0.1941, prior_loss=0.974, diff_loss=0.2716, tot_loss=1.44, over 111.00 samples.], tot_loss[dur_loss=0.1919, prior_loss=0.9728, diff_loss=0.367, tot_loss=1.532, over 1656.00 samples.], 2024-10-21 17:21:04,220 INFO [train.py:682] (1/4) Start epoch 2767 2024-10-21 17:21:19,408 INFO [train.py:561] (1/4) Epoch 2767, batch 4, global_batch_idx: 44260, batch size: 189, 
loss[dur_loss=0.1953, prior_loss=0.9731, diff_loss=0.3489, tot_loss=1.517, over 189.00 samples.], tot_loss[dur_loss=0.1921, prior_loss=0.9723, diff_loss=0.4347, tot_loss=1.599, over 937.00 samples.], 2024-10-21 17:21:35,057 INFO [train.py:561] (1/4) Epoch 2767, batch 14, global_batch_idx: 44270, batch size: 142, loss[dur_loss=0.1961, prior_loss=0.9729, diff_loss=0.2948, tot_loss=1.464, over 142.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.9728, diff_loss=0.3581, tot_loss=1.525, over 2210.00 samples.], 2024-10-21 17:21:36,543 INFO [train.py:682] (1/4) Start epoch 2768 2024-10-21 17:21:57,293 INFO [train.py:561] (1/4) Epoch 2768, batch 8, global_batch_idx: 44280, batch size: 170, loss[dur_loss=0.198, prior_loss=0.9731, diff_loss=0.3429, tot_loss=1.514, over 170.00 samples.], tot_loss[dur_loss=0.194, prior_loss=0.9726, diff_loss=0.3761, tot_loss=1.543, over 1432.00 samples.], 2024-10-21 17:22:07,843 INFO [train.py:682] (1/4) Start epoch 2769 2024-10-21 17:22:20,328 INFO [train.py:561] (1/4) Epoch 2769, batch 2, global_batch_idx: 44290, batch size: 203, loss[dur_loss=0.1952, prior_loss=0.9731, diff_loss=0.3426, tot_loss=1.511, over 203.00 samples.], tot_loss[dur_loss=0.1942, prior_loss=0.9732, diff_loss=0.3292, tot_loss=1.497, over 442.00 samples.], 2024-10-21 17:22:35,147 INFO [train.py:561] (1/4) Epoch 2769, batch 12, global_batch_idx: 44300, batch size: 152, loss[dur_loss=0.1928, prior_loss=0.9735, diff_loss=0.334, tot_loss=1.5, over 152.00 samples.], tot_loss[dur_loss=0.1935, prior_loss=0.9729, diff_loss=0.3599, tot_loss=1.526, over 1966.00 samples.], 2024-10-21 17:22:39,773 INFO [train.py:682] (1/4) Start epoch 2770 2024-10-21 17:22:57,724 INFO [train.py:561] (1/4) Epoch 2770, batch 6, global_batch_idx: 44310, batch size: 106, loss[dur_loss=0.1948, prior_loss=0.9729, diff_loss=0.3258, tot_loss=1.494, over 106.00 samples.], tot_loss[dur_loss=0.193, prior_loss=0.9725, diff_loss=0.3913, tot_loss=1.557, over 1142.00 samples.], 2024-10-21 17:23:11,264 INFO [train.py:682] (1/4) Start epoch 2771 2024-10-21 17:23:20,822 INFO [train.py:561] (1/4) Epoch 2771, batch 0, global_batch_idx: 44320, batch size: 108, loss[dur_loss=0.2007, prior_loss=0.9739, diff_loss=0.3401, tot_loss=1.515, over 108.00 samples.], tot_loss[dur_loss=0.2007, prior_loss=0.9739, diff_loss=0.3401, tot_loss=1.515, over 108.00 samples.], 2024-10-21 17:23:35,494 INFO [train.py:561] (1/4) Epoch 2771, batch 10, global_batch_idx: 44330, batch size: 111, loss[dur_loss=0.1964, prior_loss=0.9741, diff_loss=0.3331, tot_loss=1.504, over 111.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.9729, diff_loss=0.3694, tot_loss=1.536, over 1656.00 samples.], 2024-10-21 17:23:42,950 INFO [train.py:682] (1/4) Start epoch 2772 2024-10-21 17:23:57,247 INFO [train.py:561] (1/4) Epoch 2772, batch 4, global_batch_idx: 44340, batch size: 189, loss[dur_loss=0.1936, prior_loss=0.9733, diff_loss=0.3137, tot_loss=1.481, over 189.00 samples.], tot_loss[dur_loss=0.1918, prior_loss=0.9724, diff_loss=0.416, tot_loss=1.58, over 937.00 samples.], 2024-10-21 17:24:12,684 INFO [train.py:561] (1/4) Epoch 2772, batch 14, global_batch_idx: 44350, batch size: 142, loss[dur_loss=0.196, prior_loss=0.9727, diff_loss=0.2872, tot_loss=1.456, over 142.00 samples.], tot_loss[dur_loss=0.1938, prior_loss=0.973, diff_loss=0.343, tot_loss=1.51, over 2210.00 samples.], 2024-10-21 17:24:14,181 INFO [train.py:682] (1/4) Start epoch 2773 2024-10-21 17:24:35,064 INFO [train.py:561] (1/4) Epoch 2773, batch 8, global_batch_idx: 44360, batch size: 170, loss[dur_loss=0.199, 
prior_loss=0.9738, diff_loss=0.3101, tot_loss=1.483, over 170.00 samples.], tot_loss[dur_loss=0.1931, prior_loss=0.9728, diff_loss=0.382, tot_loss=1.548, over 1432.00 samples.], 2024-10-21 17:24:45,566 INFO [train.py:682] (1/4) Start epoch 2774 2024-10-21 17:24:57,519 INFO [train.py:561] (1/4) Epoch 2774, batch 2, global_batch_idx: 44370, batch size: 203, loss[dur_loss=0.1941, prior_loss=0.9732, diff_loss=0.3101, tot_loss=1.477, over 203.00 samples.], tot_loss[dur_loss=0.1947, prior_loss=0.9732, diff_loss=0.2885, tot_loss=1.456, over 442.00 samples.], 2024-10-21 17:25:12,079 INFO [train.py:561] (1/4) Epoch 2774, batch 12, global_batch_idx: 44380, batch size: 152, loss[dur_loss=0.1939, prior_loss=0.9735, diff_loss=0.2985, tot_loss=1.466, over 152.00 samples.], tot_loss[dur_loss=0.1931, prior_loss=0.9729, diff_loss=0.3458, tot_loss=1.512, over 1966.00 samples.], 2024-10-21 17:25:16,658 INFO [train.py:682] (1/4) Start epoch 2775 2024-10-21 17:25:34,730 INFO [train.py:561] (1/4) Epoch 2775, batch 6, global_batch_idx: 44390, batch size: 106, loss[dur_loss=0.1933, prior_loss=0.9728, diff_loss=0.297, tot_loss=1.463, over 106.00 samples.], tot_loss[dur_loss=0.1923, prior_loss=0.9727, diff_loss=0.3995, tot_loss=1.564, over 1142.00 samples.], 2024-10-21 17:25:48,332 INFO [train.py:682] (1/4) Start epoch 2776 2024-10-21 17:25:57,423 INFO [train.py:561] (1/4) Epoch 2776, batch 0, global_batch_idx: 44400, batch size: 108, loss[dur_loss=0.1985, prior_loss=0.9737, diff_loss=0.2942, tot_loss=1.466, over 108.00 samples.], tot_loss[dur_loss=0.1985, prior_loss=0.9737, diff_loss=0.2942, tot_loss=1.466, over 108.00 samples.], 2024-10-21 17:26:12,014 INFO [train.py:561] (1/4) Epoch 2776, batch 10, global_batch_idx: 44410, batch size: 111, loss[dur_loss=0.1947, prior_loss=0.9742, diff_loss=0.3094, tot_loss=1.478, over 111.00 samples.], tot_loss[dur_loss=0.1932, prior_loss=0.9729, diff_loss=0.3622, tot_loss=1.528, over 1656.00 samples.], 2024-10-21 17:26:19,248 INFO [train.py:682] (1/4) Start epoch 2777 2024-10-21 17:26:33,801 INFO [train.py:561] (1/4) Epoch 2777, batch 4, global_batch_idx: 44420, batch size: 189, loss[dur_loss=0.1928, prior_loss=0.9732, diff_loss=0.3436, tot_loss=1.51, over 189.00 samples.], tot_loss[dur_loss=0.1914, prior_loss=0.9724, diff_loss=0.4342, tot_loss=1.598, over 937.00 samples.], 2024-10-21 17:26:49,136 INFO [train.py:561] (1/4) Epoch 2777, batch 14, global_batch_idx: 44430, batch size: 142, loss[dur_loss=0.1973, prior_loss=0.9728, diff_loss=0.3208, tot_loss=1.491, over 142.00 samples.], tot_loss[dur_loss=0.1933, prior_loss=0.9729, diff_loss=0.3585, tot_loss=1.525, over 2210.00 samples.], 2024-10-21 17:26:50,609 INFO [train.py:682] (1/4) Start epoch 2778 2024-10-21 17:27:11,084 INFO [train.py:561] (1/4) Epoch 2778, batch 8, global_batch_idx: 44440, batch size: 170, loss[dur_loss=0.2001, prior_loss=0.9739, diff_loss=0.3175, tot_loss=1.491, over 170.00 samples.], tot_loss[dur_loss=0.1931, prior_loss=0.9728, diff_loss=0.3624, tot_loss=1.528, over 1432.00 samples.], 2024-10-21 17:27:21,465 INFO [train.py:682] (1/4) Start epoch 2779 2024-10-21 17:27:33,448 INFO [train.py:561] (1/4) Epoch 2779, batch 2, global_batch_idx: 44450, batch size: 203, loss[dur_loss=0.1968, prior_loss=0.9731, diff_loss=0.3133, tot_loss=1.483, over 203.00 samples.], tot_loss[dur_loss=0.1975, prior_loss=0.9732, diff_loss=0.3051, tot_loss=1.476, over 442.00 samples.], 2024-10-21 17:27:47,997 INFO [train.py:561] (1/4) Epoch 2779, batch 12, global_batch_idx: 44460, batch size: 152, loss[dur_loss=0.1944, 
prior_loss=0.9734, diff_loss=0.2891, tot_loss=1.457, over 152.00 samples.], tot_loss[dur_loss=0.1943, prior_loss=0.9729, diff_loss=0.3548, tot_loss=1.522, over 1966.00 samples.], 2024-10-21 17:27:52,559 INFO [train.py:682] (1/4) Start epoch 2780 2024-10-21 17:28:09,941 INFO [train.py:561] (1/4) Epoch 2780, batch 6, global_batch_idx: 44470, batch size: 106, loss[dur_loss=0.1906, prior_loss=0.9731, diff_loss=0.2888, tot_loss=1.452, over 106.00 samples.], tot_loss[dur_loss=0.1901, prior_loss=0.9725, diff_loss=0.3977, tot_loss=1.56, over 1142.00 samples.], 2024-10-21 17:28:23,275 INFO [train.py:682] (1/4) Start epoch 2781 2024-10-21 17:28:32,368 INFO [train.py:561] (1/4) Epoch 2781, batch 0, global_batch_idx: 44480, batch size: 108, loss[dur_loss=0.2005, prior_loss=0.9736, diff_loss=0.2902, tot_loss=1.464, over 108.00 samples.], tot_loss[dur_loss=0.2005, prior_loss=0.9736, diff_loss=0.2902, tot_loss=1.464, over 108.00 samples.], 2024-10-21 17:28:46,698 INFO [train.py:561] (1/4) Epoch 2781, batch 10, global_batch_idx: 44490, batch size: 111, loss[dur_loss=0.1945, prior_loss=0.9746, diff_loss=0.2974, tot_loss=1.466, over 111.00 samples.], tot_loss[dur_loss=0.1941, prior_loss=0.9731, diff_loss=0.3675, tot_loss=1.535, over 1656.00 samples.], 2024-10-21 17:28:53,972 INFO [train.py:682] (1/4) Start epoch 2782 2024-10-21 17:29:08,010 INFO [train.py:561] (1/4) Epoch 2782, batch 4, global_batch_idx: 44500, batch size: 189, loss[dur_loss=0.1946, prior_loss=0.9733, diff_loss=0.3157, tot_loss=1.484, over 189.00 samples.], tot_loss[dur_loss=0.1901, prior_loss=0.9723, diff_loss=0.4277, tot_loss=1.59, over 937.00 samples.], 2024-10-21 17:29:23,230 INFO [train.py:561] (1/4) Epoch 2782, batch 14, global_batch_idx: 44510, batch size: 142, loss[dur_loss=0.1967, prior_loss=0.9731, diff_loss=0.3045, tot_loss=1.474, over 142.00 samples.], tot_loss[dur_loss=0.1933, prior_loss=0.973, diff_loss=0.3564, tot_loss=1.523, over 2210.00 samples.], 2024-10-21 17:29:24,841 INFO [train.py:682] (1/4) Start epoch 2783 2024-10-21 17:29:45,781 INFO [train.py:561] (1/4) Epoch 2783, batch 8, global_batch_idx: 44520, batch size: 170, loss[dur_loss=0.1967, prior_loss=0.9736, diff_loss=0.3216, tot_loss=1.492, over 170.00 samples.], tot_loss[dur_loss=0.1929, prior_loss=0.9729, diff_loss=0.384, tot_loss=1.55, over 1432.00 samples.], 2024-10-21 17:29:57,008 INFO [train.py:682] (1/4) Start epoch 2784 2024-10-21 17:30:09,046 INFO [train.py:561] (1/4) Epoch 2784, batch 2, global_batch_idx: 44530, batch size: 203, loss[dur_loss=0.1971, prior_loss=0.9734, diff_loss=0.3506, tot_loss=1.521, over 203.00 samples.], tot_loss[dur_loss=0.1975, prior_loss=0.9734, diff_loss=0.3308, tot_loss=1.502, over 442.00 samples.], 2024-10-21 17:30:24,014 INFO [train.py:561] (1/4) Epoch 2784, batch 12, global_batch_idx: 44540, batch size: 152, loss[dur_loss=0.1949, prior_loss=0.9735, diff_loss=0.2753, tot_loss=1.444, over 152.00 samples.], tot_loss[dur_loss=0.195, prior_loss=0.9731, diff_loss=0.3563, tot_loss=1.524, over 1966.00 samples.], 2024-10-21 17:30:28,832 INFO [train.py:682] (1/4) Start epoch 2785 2024-10-21 17:30:46,732 INFO [train.py:561] (1/4) Epoch 2785, batch 6, global_batch_idx: 44550, batch size: 106, loss[dur_loss=0.1957, prior_loss=0.9733, diff_loss=0.247, tot_loss=1.416, over 106.00 samples.], tot_loss[dur_loss=0.1923, prior_loss=0.9727, diff_loss=0.4064, tot_loss=1.571, over 1142.00 samples.], 2024-10-21 17:31:00,893 INFO [train.py:682] (1/4) Start epoch 2786 2024-10-21 17:31:10,221 INFO [train.py:561] (1/4) Epoch 2786, batch 0, 
global_batch_idx: 44560, batch size: 108, loss[dur_loss=0.2033, prior_loss=0.9744, diff_loss=0.2798, tot_loss=1.458, over 108.00 samples.], tot_loss[dur_loss=0.2033, prior_loss=0.9744, diff_loss=0.2798, tot_loss=1.458, over 108.00 samples.], 2024-10-21 17:31:24,891 INFO [train.py:561] (1/4) Epoch 2786, batch 10, global_batch_idx: 44570, batch size: 111, loss[dur_loss=0.1969, prior_loss=0.9747, diff_loss=0.3116, tot_loss=1.483, over 111.00 samples.], tot_loss[dur_loss=0.1935, prior_loss=0.9731, diff_loss=0.3715, tot_loss=1.538, over 1656.00 samples.], 2024-10-21 17:31:32,495 INFO [train.py:682] (1/4) Start epoch 2787 2024-10-21 17:31:47,086 INFO [train.py:561] (1/4) Epoch 2787, batch 4, global_batch_idx: 44580, batch size: 189, loss[dur_loss=0.1949, prior_loss=0.9733, diff_loss=0.3202, tot_loss=1.489, over 189.00 samples.], tot_loss[dur_loss=0.1926, prior_loss=0.9727, diff_loss=0.4335, tot_loss=1.599, over 937.00 samples.], 2024-10-21 17:32:06,983 INFO [train.py:561] (1/4) Epoch 2787, batch 14, global_batch_idx: 44590, batch size: 142, loss[dur_loss=0.1993, prior_loss=0.9731, diff_loss=0.3045, tot_loss=1.477, over 142.00 samples.], tot_loss[dur_loss=0.1948, prior_loss=0.9733, diff_loss=0.3577, tot_loss=1.526, over 2210.00 samples.], 2024-10-21 17:32:08,474 INFO [train.py:682] (1/4) Start epoch 2788 2024-10-21 17:32:29,610 INFO [train.py:561] (1/4) Epoch 2788, batch 8, global_batch_idx: 44600, batch size: 170, loss[dur_loss=0.1969, prior_loss=0.9738, diff_loss=0.3046, tot_loss=1.475, over 170.00 samples.], tot_loss[dur_loss=0.1938, prior_loss=0.9728, diff_loss=0.3732, tot_loss=1.54, over 1432.00 samples.], 2024-10-21 17:32:40,192 INFO [train.py:682] (1/4) Start epoch 2789 2024-10-21 17:32:52,161 INFO [train.py:561] (1/4) Epoch 2789, batch 2, global_batch_idx: 44610, batch size: 203, loss[dur_loss=0.1948, prior_loss=0.9731, diff_loss=0.3202, tot_loss=1.488, over 203.00 samples.], tot_loss[dur_loss=0.1971, prior_loss=0.9733, diff_loss=0.2991, tot_loss=1.469, over 442.00 samples.], 2024-10-21 17:33:06,896 INFO [train.py:561] (1/4) Epoch 2789, batch 12, global_batch_idx: 44620, batch size: 152, loss[dur_loss=0.1924, prior_loss=0.9733, diff_loss=0.3309, tot_loss=1.497, over 152.00 samples.], tot_loss[dur_loss=0.1943, prior_loss=0.973, diff_loss=0.3462, tot_loss=1.513, over 1966.00 samples.], 2024-10-21 17:33:11,587 INFO [train.py:682] (1/4) Start epoch 2790 2024-10-21 17:33:29,391 INFO [train.py:561] (1/4) Epoch 2790, batch 6, global_batch_idx: 44630, batch size: 106, loss[dur_loss=0.1947, prior_loss=0.9731, diff_loss=0.3535, tot_loss=1.521, over 106.00 samples.], tot_loss[dur_loss=0.1914, prior_loss=0.9726, diff_loss=0.3986, tot_loss=1.563, over 1142.00 samples.], 2024-10-21 17:33:42,857 INFO [train.py:682] (1/4) Start epoch 2791 2024-10-21 17:33:51,984 INFO [train.py:561] (1/4) Epoch 2791, batch 0, global_batch_idx: 44640, batch size: 108, loss[dur_loss=0.1978, prior_loss=0.9736, diff_loss=0.3322, tot_loss=1.504, over 108.00 samples.], tot_loss[dur_loss=0.1978, prior_loss=0.9736, diff_loss=0.3322, tot_loss=1.504, over 108.00 samples.], 2024-10-21 17:34:06,801 INFO [train.py:561] (1/4) Epoch 2791, batch 10, global_batch_idx: 44650, batch size: 111, loss[dur_loss=0.1964, prior_loss=0.9744, diff_loss=0.3297, tot_loss=1.5, over 111.00 samples.], tot_loss[dur_loss=0.1931, prior_loss=0.9728, diff_loss=0.3704, tot_loss=1.536, over 1656.00 samples.], 2024-10-21 17:34:14,270 INFO [train.py:682] (1/4) Start epoch 2792 2024-10-21 17:34:30,204 INFO [train.py:561] (1/4) Epoch 2792, batch 4, 
global_batch_idx: 44660, batch size: 189, loss[dur_loss=0.1925, prior_loss=0.9733, diff_loss=0.3289, tot_loss=1.495, over 189.00 samples.], tot_loss[dur_loss=0.1914, prior_loss=0.9723, diff_loss=0.4167, tot_loss=1.58, over 937.00 samples.], 2024-10-21 17:34:45,682 INFO [train.py:561] (1/4) Epoch 2792, batch 14, global_batch_idx: 44670, batch size: 142, loss[dur_loss=0.1954, prior_loss=0.9728, diff_loss=0.2933, tot_loss=1.462, over 142.00 samples.], tot_loss[dur_loss=0.193, prior_loss=0.9729, diff_loss=0.3454, tot_loss=1.511, over 2210.00 samples.], 2024-10-21 17:34:47,218 INFO [train.py:682] (1/4) Start epoch 2793 2024-10-21 17:35:08,356 INFO [train.py:561] (1/4) Epoch 2793, batch 8, global_batch_idx: 44680, batch size: 170, loss[dur_loss=0.1979, prior_loss=0.9736, diff_loss=0.3255, tot_loss=1.497, over 170.00 samples.], tot_loss[dur_loss=0.1924, prior_loss=0.9727, diff_loss=0.3863, tot_loss=1.551, over 1432.00 samples.], 2024-10-21 17:35:19,004 INFO [train.py:682] (1/4) Start epoch 2794 2024-10-21 17:35:31,099 INFO [train.py:561] (1/4) Epoch 2794, batch 2, global_batch_idx: 44690, batch size: 203, loss[dur_loss=0.1987, prior_loss=0.9733, diff_loss=0.3198, tot_loss=1.492, over 203.00 samples.], tot_loss[dur_loss=0.1979, prior_loss=0.9734, diff_loss=0.3142, tot_loss=1.486, over 442.00 samples.], 2024-10-21 17:35:45,903 INFO [train.py:561] (1/4) Epoch 2794, batch 12, global_batch_idx: 44700, batch size: 152, loss[dur_loss=0.1946, prior_loss=0.9734, diff_loss=0.3126, tot_loss=1.481, over 152.00 samples.], tot_loss[dur_loss=0.1946, prior_loss=0.9731, diff_loss=0.3564, tot_loss=1.524, over 1966.00 samples.], 2024-10-21 17:35:50,721 INFO [train.py:682] (1/4) Start epoch 2795 2024-10-21 17:36:09,014 INFO [train.py:561] (1/4) Epoch 2795, batch 6, global_batch_idx: 44710, batch size: 106, loss[dur_loss=0.1955, prior_loss=0.9733, diff_loss=0.2904, tot_loss=1.459, over 106.00 samples.], tot_loss[dur_loss=0.1919, prior_loss=0.9726, diff_loss=0.3939, tot_loss=1.558, over 1142.00 samples.], 2024-10-21 17:36:22,826 INFO [train.py:682] (1/4) Start epoch 2796 2024-10-21 17:36:32,376 INFO [train.py:561] (1/4) Epoch 2796, batch 0, global_batch_idx: 44720, batch size: 108, loss[dur_loss=0.2008, prior_loss=0.9739, diff_loss=0.2992, tot_loss=1.474, over 108.00 samples.], tot_loss[dur_loss=0.2008, prior_loss=0.9739, diff_loss=0.2992, tot_loss=1.474, over 108.00 samples.], 2024-10-21 17:36:47,314 INFO [train.py:561] (1/4) Epoch 2796, batch 10, global_batch_idx: 44730, batch size: 111, loss[dur_loss=0.1949, prior_loss=0.9744, diff_loss=0.3158, tot_loss=1.485, over 111.00 samples.], tot_loss[dur_loss=0.1921, prior_loss=0.973, diff_loss=0.3658, tot_loss=1.531, over 1656.00 samples.], 2024-10-21 17:36:54,758 INFO [train.py:682] (1/4) Start epoch 2797 2024-10-21 17:37:09,973 INFO [train.py:561] (1/4) Epoch 2797, batch 4, global_batch_idx: 44740, batch size: 189, loss[dur_loss=0.1931, prior_loss=0.9731, diff_loss=0.341, tot_loss=1.507, over 189.00 samples.], tot_loss[dur_loss=0.1925, prior_loss=0.9724, diff_loss=0.4218, tot_loss=1.587, over 937.00 samples.], 2024-10-21 17:37:25,615 INFO [train.py:561] (1/4) Epoch 2797, batch 14, global_batch_idx: 44750, batch size: 142, loss[dur_loss=0.1956, prior_loss=0.9732, diff_loss=0.296, tot_loss=1.465, over 142.00 samples.], tot_loss[dur_loss=0.1945, prior_loss=0.973, diff_loss=0.3571, tot_loss=1.525, over 2210.00 samples.], 2024-10-21 17:37:27,122 INFO [train.py:682] (1/4) Start epoch 2798 2024-10-21 17:37:48,479 INFO [train.py:561] (1/4) Epoch 2798, batch 8, global_batch_idx: 
44760, batch size: 170, loss[dur_loss=0.1977, prior_loss=0.9742, diff_loss=0.2949, tot_loss=1.467, over 170.00 samples.], tot_loss[dur_loss=0.1922, prior_loss=0.9728, diff_loss=0.3757, tot_loss=1.541, over 1432.00 samples.], 2024-10-21 17:37:59,266 INFO [train.py:682] (1/4) Start epoch 2799 2024-10-21 17:38:11,593 INFO [train.py:561] (1/4) Epoch 2799, batch 2, global_batch_idx: 44770, batch size: 203, loss[dur_loss=0.1962, prior_loss=0.9732, diff_loss=0.3424, tot_loss=1.512, over 203.00 samples.], tot_loss[dur_loss=0.1957, prior_loss=0.9732, diff_loss=0.3022, tot_loss=1.471, over 442.00 samples.], 2024-10-21 17:38:26,666 INFO [train.py:561] (1/4) Epoch 2799, batch 12, global_batch_idx: 44780, batch size: 152, loss[dur_loss=0.1952, prior_loss=0.9733, diff_loss=0.3373, tot_loss=1.506, over 152.00 samples.], tot_loss[dur_loss=0.1938, prior_loss=0.9729, diff_loss=0.3584, tot_loss=1.525, over 1966.00 samples.], 2024-10-21 17:38:31,522 INFO [train.py:682] (1/4) Start epoch 2800 2024-10-21 17:38:49,932 INFO [train.py:561] (1/4) Epoch 2800, batch 6, global_batch_idx: 44790, batch size: 106, loss[dur_loss=0.1924, prior_loss=0.9733, diff_loss=0.3317, tot_loss=1.497, over 106.00 samples.], tot_loss[dur_loss=0.1916, prior_loss=0.9725, diff_loss=0.4057, tot_loss=1.57, over 1142.00 samples.], 2024-10-21 17:39:03,767 INFO [train.py:682] (1/4) Start epoch 2801 2024-10-21 17:39:13,370 INFO [train.py:561] (1/4) Epoch 2801, batch 0, global_batch_idx: 44800, batch size: 108, loss[dur_loss=0.2, prior_loss=0.9737, diff_loss=0.2862, tot_loss=1.46, over 108.00 samples.], tot_loss[dur_loss=0.2, prior_loss=0.9737, diff_loss=0.2862, tot_loss=1.46, over 108.00 samples.], 2024-10-21 17:39:28,555 INFO [train.py:561] (1/4) Epoch 2801, batch 10, global_batch_idx: 44810, batch size: 111, loss[dur_loss=0.1962, prior_loss=0.9741, diff_loss=0.2835, tot_loss=1.454, over 111.00 samples.], tot_loss[dur_loss=0.1929, prior_loss=0.9729, diff_loss=0.3575, tot_loss=1.523, over 1656.00 samples.], 2024-10-21 17:39:36,057 INFO [train.py:682] (1/4) Start epoch 2802 2024-10-21 17:39:50,743 INFO [train.py:561] (1/4) Epoch 2802, batch 4, global_batch_idx: 44820, batch size: 189, loss[dur_loss=0.1933, prior_loss=0.9734, diff_loss=0.3041, tot_loss=1.471, over 189.00 samples.], tot_loss[dur_loss=0.191, prior_loss=0.9723, diff_loss=0.414, tot_loss=1.577, over 937.00 samples.], 2024-10-21 17:40:06,477 INFO [train.py:561] (1/4) Epoch 2802, batch 14, global_batch_idx: 44830, batch size: 142, loss[dur_loss=0.1976, prior_loss=0.9732, diff_loss=0.2818, tot_loss=1.453, over 142.00 samples.], tot_loss[dur_loss=0.1936, prior_loss=0.973, diff_loss=0.3438, tot_loss=1.51, over 2210.00 samples.], 2024-10-21 17:40:08,007 INFO [train.py:682] (1/4) Start epoch 2803 2024-10-21 17:40:29,333 INFO [train.py:561] (1/4) Epoch 2803, batch 8, global_batch_idx: 44840, batch size: 170, loss[dur_loss=0.1971, prior_loss=0.9735, diff_loss=0.3002, tot_loss=1.471, over 170.00 samples.], tot_loss[dur_loss=0.1933, prior_loss=0.9727, diff_loss=0.392, tot_loss=1.558, over 1432.00 samples.], 2024-10-21 17:40:40,021 INFO [train.py:682] (1/4) Start epoch 2804 2024-10-21 17:40:52,467 INFO [train.py:561] (1/4) Epoch 2804, batch 2, global_batch_idx: 44850, batch size: 203, loss[dur_loss=0.196, prior_loss=0.9732, diff_loss=0.3417, tot_loss=1.511, over 203.00 samples.], tot_loss[dur_loss=0.1958, prior_loss=0.9732, diff_loss=0.329, tot_loss=1.498, over 442.00 samples.], 2024-10-21 17:41:07,271 INFO [train.py:561] (1/4) Epoch 2804, batch 12, global_batch_idx: 44860, batch size: 152, 
loss[dur_loss=0.1905, prior_loss=0.9732, diff_loss=0.2938, tot_loss=1.458, over 152.00 samples.], tot_loss[dur_loss=0.1922, prior_loss=0.973, diff_loss=0.356, tot_loss=1.521, over 1966.00 samples.], 2024-10-21 17:41:11,947 INFO [train.py:682] (1/4) Start epoch 2805 2024-10-21 17:41:29,995 INFO [train.py:561] (1/4) Epoch 2805, batch 6, global_batch_idx: 44870, batch size: 106, loss[dur_loss=0.1947, prior_loss=0.9732, diff_loss=0.2771, tot_loss=1.445, over 106.00 samples.], tot_loss[dur_loss=0.1914, prior_loss=0.9725, diff_loss=0.4006, tot_loss=1.565, over 1142.00 samples.], 2024-10-21 17:41:43,761 INFO [train.py:682] (1/4) Start epoch 2806 2024-10-21 17:41:53,142 INFO [train.py:561] (1/4) Epoch 2806, batch 0, global_batch_idx: 44880, batch size: 108, loss[dur_loss=0.1991, prior_loss=0.9735, diff_loss=0.297, tot_loss=1.47, over 108.00 samples.], tot_loss[dur_loss=0.1991, prior_loss=0.9735, diff_loss=0.297, tot_loss=1.47, over 108.00 samples.], 2024-10-21 17:42:07,919 INFO [train.py:561] (1/4) Epoch 2806, batch 10, global_batch_idx: 44890, batch size: 111, loss[dur_loss=0.1939, prior_loss=0.9739, diff_loss=0.2787, tot_loss=1.446, over 111.00 samples.], tot_loss[dur_loss=0.1927, prior_loss=0.9727, diff_loss=0.3678, tot_loss=1.533, over 1656.00 samples.], 2024-10-21 17:42:15,380 INFO [train.py:682] (1/4) Start epoch 2807 2024-10-21 17:42:30,151 INFO [train.py:561] (1/4) Epoch 2807, batch 4, global_batch_idx: 44900, batch size: 189, loss[dur_loss=0.1949, prior_loss=0.973, diff_loss=0.3375, tot_loss=1.505, over 189.00 samples.], tot_loss[dur_loss=0.1915, prior_loss=0.9723, diff_loss=0.4278, tot_loss=1.592, over 937.00 samples.], 2024-10-21 17:42:45,456 INFO [train.py:561] (1/4) Epoch 2807, batch 14, global_batch_idx: 44910, batch size: 142, loss[dur_loss=0.1957, prior_loss=0.973, diff_loss=0.2945, tot_loss=1.463, over 142.00 samples.], tot_loss[dur_loss=0.1932, prior_loss=0.9729, diff_loss=0.3577, tot_loss=1.524, over 2210.00 samples.], 2024-10-21 17:42:46,958 INFO [train.py:682] (1/4) Start epoch 2808 2024-10-21 17:43:08,301 INFO [train.py:561] (1/4) Epoch 2808, batch 8, global_batch_idx: 44920, batch size: 170, loss[dur_loss=0.1962, prior_loss=0.9732, diff_loss=0.3257, tot_loss=1.495, over 170.00 samples.], tot_loss[dur_loss=0.193, prior_loss=0.9726, diff_loss=0.3735, tot_loss=1.539, over 1432.00 samples.], 2024-10-21 17:43:18,841 INFO [train.py:682] (1/4) Start epoch 2809 2024-10-21 17:43:31,170 INFO [train.py:561] (1/4) Epoch 2809, batch 2, global_batch_idx: 44930, batch size: 203, loss[dur_loss=0.1963, prior_loss=0.9729, diff_loss=0.3088, tot_loss=1.478, over 203.00 samples.], tot_loss[dur_loss=0.1964, prior_loss=0.9731, diff_loss=0.3194, tot_loss=1.489, over 442.00 samples.], 2024-10-21 17:43:45,790 INFO [train.py:561] (1/4) Epoch 2809, batch 12, global_batch_idx: 44940, batch size: 152, loss[dur_loss=0.1907, prior_loss=0.9732, diff_loss=0.2708, tot_loss=1.435, over 152.00 samples.], tot_loss[dur_loss=0.193, prior_loss=0.9727, diff_loss=0.3607, tot_loss=1.526, over 1966.00 samples.], 2024-10-21 17:43:50,399 INFO [train.py:682] (1/4) Start epoch 2810 2024-10-21 17:44:08,673 INFO [train.py:561] (1/4) Epoch 2810, batch 6, global_batch_idx: 44950, batch size: 106, loss[dur_loss=0.1903, prior_loss=0.9728, diff_loss=0.2695, tot_loss=1.433, over 106.00 samples.], tot_loss[dur_loss=0.1901, prior_loss=0.9724, diff_loss=0.3887, tot_loss=1.551, over 1142.00 samples.], 2024-10-21 17:44:22,042 INFO [train.py:682] (1/4) Start epoch 2811 2024-10-21 17:44:31,396 INFO [train.py:561] (1/4) Epoch 2811, batch 
0, global_batch_idx: 44960, batch size: 108, loss[dur_loss=0.197, prior_loss=0.9736, diff_loss=0.3168, tot_loss=1.487, over 108.00 samples.], tot_loss[dur_loss=0.197, prior_loss=0.9736, diff_loss=0.3168, tot_loss=1.487, over 108.00 samples.], 2024-10-21 17:44:45,976 INFO [train.py:561] (1/4) Epoch 2811, batch 10, global_batch_idx: 44970, batch size: 111, loss[dur_loss=0.1943, prior_loss=0.9741, diff_loss=0.3009, tot_loss=1.469, over 111.00 samples.], tot_loss[dur_loss=0.1925, prior_loss=0.9727, diff_loss=0.3712, tot_loss=1.536, over 1656.00 samples.], 2024-10-21 17:44:53,258 INFO [train.py:682] (1/4) Start epoch 2812 2024-10-21 17:45:08,198 INFO [train.py:561] (1/4) Epoch 2812, batch 4, global_batch_idx: 44980, batch size: 189, loss[dur_loss=0.1913, prior_loss=0.9726, diff_loss=0.3247, tot_loss=1.489, over 189.00 samples.], tot_loss[dur_loss=0.1904, prior_loss=0.9721, diff_loss=0.4171, tot_loss=1.58, over 937.00 samples.], 2024-10-21 17:45:23,578 INFO [train.py:561] (1/4) Epoch 2812, batch 14, global_batch_idx: 44990, batch size: 142, loss[dur_loss=0.1943, prior_loss=0.9728, diff_loss=0.2756, tot_loss=1.443, over 142.00 samples.], tot_loss[dur_loss=0.1925, prior_loss=0.9727, diff_loss=0.3553, tot_loss=1.521, over 2210.00 samples.], 2024-10-21 17:45:25,064 INFO [train.py:682] (1/4) Start epoch 2813 2024-10-21 17:45:46,180 INFO [train.py:561] (1/4) Epoch 2813, batch 8, global_batch_idx: 45000, batch size: 170, loss[dur_loss=0.1975, prior_loss=0.973, diff_loss=0.3441, tot_loss=1.515, over 170.00 samples.], tot_loss[dur_loss=0.1931, prior_loss=0.9725, diff_loss=0.3829, tot_loss=1.548, over 1432.00 samples.], 2024-10-21 17:45:47,695 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 17:47:19,373 INFO [train.py:589] (1/4) Epoch 2813, validation: dur_loss=0.4527, prior_loss=1.036, diff_loss=0.3811, tot_loss=1.87, over 100.00 samples. 
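[Editor's sketch] The bracketed figures in these records are internally consistent: tot_loss is the sum of dur_loss, prior_loss and diff_loss (for the validation record above, 0.4527 + 1.036 + 0.3811 ≈ 1.87); the leading loss[...] block is the current batch, and the trailing tot_loss[...] block looks like a running average weighted by sample count that resets at each "Start epoch" line, which is why the two blocks coincide at batch 0. A minimal Python sketch of that bookkeeping under those assumptions follows; the RunningLoss name and the mebibyte conversion for the memory line are hypothetical, not the actual train.py internals.

# Hypothetical sketch, not the actual train.py code: reproduces the
# arithmetic visible in the records above, assuming tot_loss[...] is a
# sample-weighted running average that is reset at each "Start epoch".
import torch

class RunningLoss:
    """Sample-weighted running average of the three loss components."""

    def __init__(self) -> None:
        self.sums = {"dur_loss": 0.0, "prior_loss": 0.0, "diff_loss": 0.0}
        self.num_samples = 0

    def update(self, batch_losses: dict, batch_size: int) -> None:
        # A batch contributes its mean loss weighted by its sample count,
        # so "over N samples" grows by the batch size at every step.
        for name, value in batch_losses.items():
            self.sums[name] += value * batch_size
        self.num_samples += batch_size

    def averages(self) -> dict:
        avg = {name: s / self.num_samples for name, s in self.sums.items()}
        avg["tot_loss"] = sum(avg.values())  # tot = dur + prior + diff
        return avg

tracker = RunningLoss()  # would be re-created at every "Start epoch"
tracker.update({"dur_loss": 0.197, "prior_loss": 0.9736, "diff_loss": 0.3168}, 108)
print(tracker.averages())  # tot_loss ~= 1.487 over 108 samples, matching a batch-0 record

# The "Maximum memory allocated so far is ...MB" line is consistent with
# PyTorch's peak-allocation counter converted to mebibytes (an assumption):
if torch.cuda.is_available():
    peak_mb = torch.cuda.max_memory_allocated() // (1024 ** 2)
    print(f"Maximum memory allocated so far is {peak_mb}MB")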
2024-10-21 17:47:19,374 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 17:47:28,298 INFO [train.py:682] (1/4) Start epoch 2814 2024-10-21 17:47:41,291 INFO [train.py:561] (1/4) Epoch 2814, batch 2, global_batch_idx: 45010, batch size: 203, loss[dur_loss=0.1952, prior_loss=0.973, diff_loss=0.3314, tot_loss=1.5, over 203.00 samples.], tot_loss[dur_loss=0.1956, prior_loss=0.973, diff_loss=0.3067, tot_loss=1.475, over 442.00 samples.], 2024-10-21 17:47:56,013 INFO [train.py:561] (1/4) Epoch 2814, batch 12, global_batch_idx: 45020, batch size: 152, loss[dur_loss=0.1936, prior_loss=0.9731, diff_loss=0.3049, tot_loss=1.472, over 152.00 samples.], tot_loss[dur_loss=0.1917, prior_loss=0.9726, diff_loss=0.3528, tot_loss=1.517, over 1966.00 samples.], 2024-10-21 17:48:00,529 INFO [train.py:682] (1/4) Start epoch 2815 2024-10-21 17:48:18,880 INFO [train.py:561] (1/4) Epoch 2815, batch 6, global_batch_idx: 45030, batch size: 106, loss[dur_loss=0.193, prior_loss=0.9729, diff_loss=0.2839, tot_loss=1.45, over 106.00 samples.], tot_loss[dur_loss=0.1919, prior_loss=0.9723, diff_loss=0.407, tot_loss=1.571, over 1142.00 samples.], 2024-10-21 17:48:32,531 INFO [train.py:682] (1/4) Start epoch 2816 2024-10-21 17:48:42,508 INFO [train.py:561] (1/4) Epoch 2816, batch 0, global_batch_idx: 45040, batch size: 108, loss[dur_loss=0.1989, prior_loss=0.9736, diff_loss=0.3361, tot_loss=1.509, over 108.00 samples.], tot_loss[dur_loss=0.1989, prior_loss=0.9736, diff_loss=0.3361, tot_loss=1.509, over 108.00 samples.], 2024-10-21 17:48:57,267 INFO [train.py:561] (1/4) Epoch 2816, batch 10, global_batch_idx: 45050, batch size: 111, loss[dur_loss=0.1953, prior_loss=0.9741, diff_loss=0.2887, tot_loss=1.458, over 111.00 samples.], tot_loss[dur_loss=0.1917, prior_loss=0.9727, diff_loss=0.362, tot_loss=1.526, over 1656.00 samples.], 2024-10-21 17:49:04,546 INFO [train.py:682] (1/4) Start epoch 2817 2024-10-21 17:49:19,209 INFO [train.py:561] (1/4) Epoch 2817, batch 4, global_batch_idx: 45060, batch size: 189, loss[dur_loss=0.1918, prior_loss=0.973, diff_loss=0.3291, tot_loss=1.494, over 189.00 samples.], tot_loss[dur_loss=0.1914, prior_loss=0.9722, diff_loss=0.411, tot_loss=1.575, over 937.00 samples.], 2024-10-21 17:49:34,714 INFO [train.py:561] (1/4) Epoch 2817, batch 14, global_batch_idx: 45070, batch size: 142, loss[dur_loss=0.1938, prior_loss=0.9725, diff_loss=0.2755, tot_loss=1.442, over 142.00 samples.], tot_loss[dur_loss=0.1925, prior_loss=0.9727, diff_loss=0.344, tot_loss=1.509, over 2210.00 samples.], 2024-10-21 17:49:36,136 INFO [train.py:682] (1/4) Start epoch 2818 2024-10-21 17:49:57,356 INFO [train.py:561] (1/4) Epoch 2818, batch 8, global_batch_idx: 45080, batch size: 170, loss[dur_loss=0.1988, prior_loss=0.9733, diff_loss=0.3012, tot_loss=1.473, over 170.00 samples.], tot_loss[dur_loss=0.1914, prior_loss=0.9725, diff_loss=0.3746, tot_loss=1.538, over 1432.00 samples.], 2024-10-21 17:50:07,686 INFO [train.py:682] (1/4) Start epoch 2819 2024-10-21 17:50:19,926 INFO [train.py:561] (1/4) Epoch 2819, batch 2, global_batch_idx: 45090, batch size: 203, loss[dur_loss=0.1942, prior_loss=0.9731, diff_loss=0.3342, tot_loss=1.502, over 203.00 samples.], tot_loss[dur_loss=0.1937, prior_loss=0.9731, diff_loss=0.3124, tot_loss=1.479, over 442.00 samples.], 2024-10-21 17:50:34,919 INFO [train.py:561] (1/4) Epoch 2819, batch 12, global_batch_idx: 45100, batch size: 152, loss[dur_loss=0.1895, prior_loss=0.9729, diff_loss=0.2977, tot_loss=1.46, over 152.00 samples.], tot_loss[dur_loss=0.1917, 
prior_loss=0.9727, diff_loss=0.3575, tot_loss=1.522, over 1966.00 samples.], 2024-10-21 17:50:39,971 INFO [train.py:682] (1/4) Start epoch 2820 2024-10-21 17:50:58,039 INFO [train.py:561] (1/4) Epoch 2820, batch 6, global_batch_idx: 45110, batch size: 106, loss[dur_loss=0.1927, prior_loss=0.9731, diff_loss=0.287, tot_loss=1.453, over 106.00 samples.], tot_loss[dur_loss=0.1911, prior_loss=0.9724, diff_loss=0.3928, tot_loss=1.556, over 1142.00 samples.], 2024-10-21 17:51:11,751 INFO [train.py:682] (1/4) Start epoch 2821 2024-10-21 17:51:21,167 INFO [train.py:561] (1/4) Epoch 2821, batch 0, global_batch_idx: 45120, batch size: 108, loss[dur_loss=0.1992, prior_loss=0.9736, diff_loss=0.3044, tot_loss=1.477, over 108.00 samples.], tot_loss[dur_loss=0.1992, prior_loss=0.9736, diff_loss=0.3044, tot_loss=1.477, over 108.00 samples.], 2024-10-21 17:51:35,795 INFO [train.py:561] (1/4) Epoch 2821, batch 10, global_batch_idx: 45130, batch size: 111, loss[dur_loss=0.1946, prior_loss=0.9742, diff_loss=0.3009, tot_loss=1.47, over 111.00 samples.], tot_loss[dur_loss=0.1932, prior_loss=0.9727, diff_loss=0.3694, tot_loss=1.535, over 1656.00 samples.], 2024-10-21 17:51:43,008 INFO [train.py:682] (1/4) Start epoch 2822 2024-10-21 17:51:57,814 INFO [train.py:561] (1/4) Epoch 2822, batch 4, global_batch_idx: 45140, batch size: 189, loss[dur_loss=0.197, prior_loss=0.9733, diff_loss=0.3122, tot_loss=1.482, over 189.00 samples.], tot_loss[dur_loss=0.1913, prior_loss=0.9722, diff_loss=0.4148, tot_loss=1.578, over 937.00 samples.], 2024-10-21 17:52:13,097 INFO [train.py:561] (1/4) Epoch 2822, batch 14, global_batch_idx: 45150, batch size: 142, loss[dur_loss=0.1981, prior_loss=0.9729, diff_loss=0.2671, tot_loss=1.438, over 142.00 samples.], tot_loss[dur_loss=0.1937, prior_loss=0.9729, diff_loss=0.3523, tot_loss=1.519, over 2210.00 samples.], 2024-10-21 17:52:14,574 INFO [train.py:682] (1/4) Start epoch 2823 2024-10-21 17:52:35,827 INFO [train.py:561] (1/4) Epoch 2823, batch 8, global_batch_idx: 45160, batch size: 170, loss[dur_loss=0.1969, prior_loss=0.9742, diff_loss=0.3229, tot_loss=1.494, over 170.00 samples.], tot_loss[dur_loss=0.1932, prior_loss=0.9728, diff_loss=0.3768, tot_loss=1.543, over 1432.00 samples.], 2024-10-21 17:52:46,314 INFO [train.py:682] (1/4) Start epoch 2824 2024-10-21 17:52:58,868 INFO [train.py:561] (1/4) Epoch 2824, batch 2, global_batch_idx: 45170, batch size: 203, loss[dur_loss=0.1948, prior_loss=0.9731, diff_loss=0.3274, tot_loss=1.495, over 203.00 samples.], tot_loss[dur_loss=0.1954, prior_loss=0.9733, diff_loss=0.3137, tot_loss=1.482, over 442.00 samples.], 2024-10-21 17:53:13,605 INFO [train.py:561] (1/4) Epoch 2824, batch 12, global_batch_idx: 45180, batch size: 152, loss[dur_loss=0.1932, prior_loss=0.9733, diff_loss=0.27, tot_loss=1.436, over 152.00 samples.], tot_loss[dur_loss=0.193, prior_loss=0.9729, diff_loss=0.3533, tot_loss=1.519, over 1966.00 samples.], 2024-10-21 17:53:18,143 INFO [train.py:682] (1/4) Start epoch 2825 2024-10-21 17:53:36,530 INFO [train.py:561] (1/4) Epoch 2825, batch 6, global_batch_idx: 45190, batch size: 106, loss[dur_loss=0.1917, prior_loss=0.973, diff_loss=0.2901, tot_loss=1.455, over 106.00 samples.], tot_loss[dur_loss=0.1914, prior_loss=0.9725, diff_loss=0.4011, tot_loss=1.565, over 1142.00 samples.], 2024-10-21 17:53:49,857 INFO [train.py:682] (1/4) Start epoch 2826 2024-10-21 17:53:59,661 INFO [train.py:561] (1/4) Epoch 2826, batch 0, global_batch_idx: 45200, batch size: 108, loss[dur_loss=0.1983, prior_loss=0.9735, diff_loss=0.2959, tot_loss=1.468, 
over 108.00 samples.], tot_loss[dur_loss=0.1983, prior_loss=0.9735, diff_loss=0.2959, tot_loss=1.468, over 108.00 samples.], 2024-10-21 17:54:14,431 INFO [train.py:561] (1/4) Epoch 2826, batch 10, global_batch_idx: 45210, batch size: 111, loss[dur_loss=0.1938, prior_loss=0.974, diff_loss=0.2822, tot_loss=1.45, over 111.00 samples.], tot_loss[dur_loss=0.1931, prior_loss=0.9727, diff_loss=0.3657, tot_loss=1.532, over 1656.00 samples.], 2024-10-21 17:54:21,790 INFO [train.py:682] (1/4) Start epoch 2827 2024-10-21 17:54:36,767 INFO [train.py:561] (1/4) Epoch 2827, batch 4, global_batch_idx: 45220, batch size: 189, loss[dur_loss=0.1939, prior_loss=0.9728, diff_loss=0.3255, tot_loss=1.492, over 189.00 samples.], tot_loss[dur_loss=0.1908, prior_loss=0.9722, diff_loss=0.4254, tot_loss=1.588, over 937.00 samples.], 2024-10-21 17:54:52,096 INFO [train.py:561] (1/4) Epoch 2827, batch 14, global_batch_idx: 45230, batch size: 142, loss[dur_loss=0.1933, prior_loss=0.9726, diff_loss=0.2996, tot_loss=1.465, over 142.00 samples.], tot_loss[dur_loss=0.1921, prior_loss=0.9727, diff_loss=0.3501, tot_loss=1.515, over 2210.00 samples.], 2024-10-21 17:54:53,550 INFO [train.py:682] (1/4) Start epoch 2828 2024-10-21 17:55:14,781 INFO [train.py:561] (1/4) Epoch 2828, batch 8, global_batch_idx: 45240, batch size: 170, loss[dur_loss=0.1933, prior_loss=0.9731, diff_loss=0.3191, tot_loss=1.485, over 170.00 samples.], tot_loss[dur_loss=0.192, prior_loss=0.9725, diff_loss=0.3816, tot_loss=1.546, over 1432.00 samples.], 2024-10-21 17:55:25,193 INFO [train.py:682] (1/4) Start epoch 2829 2024-10-21 17:55:37,467 INFO [train.py:561] (1/4) Epoch 2829, batch 2, global_batch_idx: 45250, batch size: 203, loss[dur_loss=0.1963, prior_loss=0.9731, diff_loss=0.3299, tot_loss=1.499, over 203.00 samples.], tot_loss[dur_loss=0.1968, prior_loss=0.9732, diff_loss=0.3083, tot_loss=1.478, over 442.00 samples.], 2024-10-21 17:55:52,263 INFO [train.py:561] (1/4) Epoch 2829, batch 12, global_batch_idx: 45260, batch size: 152, loss[dur_loss=0.1936, prior_loss=0.9731, diff_loss=0.2895, tot_loss=1.456, over 152.00 samples.], tot_loss[dur_loss=0.193, prior_loss=0.9728, diff_loss=0.3464, tot_loss=1.512, over 1966.00 samples.], 2024-10-21 17:55:56,844 INFO [train.py:682] (1/4) Start epoch 2830 2024-10-21 17:56:14,510 INFO [train.py:561] (1/4) Epoch 2830, batch 6, global_batch_idx: 45270, batch size: 106, loss[dur_loss=0.1932, prior_loss=0.9727, diff_loss=0.3242, tot_loss=1.49, over 106.00 samples.], tot_loss[dur_loss=0.1909, prior_loss=0.9724, diff_loss=0.4013, tot_loss=1.565, over 1142.00 samples.], 2024-10-21 17:56:27,884 INFO [train.py:682] (1/4) Start epoch 2831 2024-10-21 17:56:37,565 INFO [train.py:561] (1/4) Epoch 2831, batch 0, global_batch_idx: 45280, batch size: 108, loss[dur_loss=0.1974, prior_loss=0.9736, diff_loss=0.3308, tot_loss=1.502, over 108.00 samples.], tot_loss[dur_loss=0.1974, prior_loss=0.9736, diff_loss=0.3308, tot_loss=1.502, over 108.00 samples.], 2024-10-21 17:56:52,168 INFO [train.py:561] (1/4) Epoch 2831, batch 10, global_batch_idx: 45290, batch size: 111, loss[dur_loss=0.1949, prior_loss=0.9741, diff_loss=0.311, tot_loss=1.48, over 111.00 samples.], tot_loss[dur_loss=0.192, prior_loss=0.9728, diff_loss=0.3913, tot_loss=1.556, over 1656.00 samples.], 2024-10-21 17:56:59,526 INFO [train.py:682] (1/4) Start epoch 2832 2024-10-21 17:57:14,369 INFO [train.py:561] (1/4) Epoch 2832, batch 4, global_batch_idx: 45300, batch size: 189, loss[dur_loss=0.1933, prior_loss=0.9729, diff_loss=0.3323, tot_loss=1.499, over 189.00 
samples.], tot_loss[dur_loss=0.1906, prior_loss=0.9722, diff_loss=0.4273, tot_loss=1.59, over 937.00 samples.], 2024-10-21 17:57:29,774 INFO [train.py:561] (1/4) Epoch 2832, batch 14, global_batch_idx: 45310, batch size: 142, loss[dur_loss=0.195, prior_loss=0.9727, diff_loss=0.3112, tot_loss=1.479, over 142.00 samples.], tot_loss[dur_loss=0.1927, prior_loss=0.9728, diff_loss=0.3534, tot_loss=1.519, over 2210.00 samples.], 2024-10-21 17:57:31,251 INFO [train.py:682] (1/4) Start epoch 2833 2024-10-21 17:57:52,371 INFO [train.py:561] (1/4) Epoch 2833, batch 8, global_batch_idx: 45320, batch size: 170, loss[dur_loss=0.1965, prior_loss=0.9734, diff_loss=0.2909, tot_loss=1.461, over 170.00 samples.], tot_loss[dur_loss=0.1922, prior_loss=0.9725, diff_loss=0.3737, tot_loss=1.538, over 1432.00 samples.], 2024-10-21 17:58:02,714 INFO [train.py:682] (1/4) Start epoch 2834 2024-10-21 17:58:14,958 INFO [train.py:561] (1/4) Epoch 2834, batch 2, global_batch_idx: 45330, batch size: 203, loss[dur_loss=0.1951, prior_loss=0.973, diff_loss=0.3226, tot_loss=1.491, over 203.00 samples.], tot_loss[dur_loss=0.1947, prior_loss=0.973, diff_loss=0.3036, tot_loss=1.471, over 442.00 samples.], 2024-10-21 17:58:29,740 INFO [train.py:561] (1/4) Epoch 2834, batch 12, global_batch_idx: 45340, batch size: 152, loss[dur_loss=0.1899, prior_loss=0.973, diff_loss=0.3011, tot_loss=1.464, over 152.00 samples.], tot_loss[dur_loss=0.1919, prior_loss=0.9728, diff_loss=0.3576, tot_loss=1.522, over 1966.00 samples.], 2024-10-21 17:58:34,382 INFO [train.py:682] (1/4) Start epoch 2835 2024-10-21 17:58:51,940 INFO [train.py:561] (1/4) Epoch 2835, batch 6, global_batch_idx: 45350, batch size: 106, loss[dur_loss=0.1923, prior_loss=0.9731, diff_loss=0.2518, tot_loss=1.417, over 106.00 samples.], tot_loss[dur_loss=0.1917, prior_loss=0.9724, diff_loss=0.3883, tot_loss=1.552, over 1142.00 samples.], 2024-10-21 17:59:05,202 INFO [train.py:682] (1/4) Start epoch 2836 2024-10-21 17:59:14,500 INFO [train.py:561] (1/4) Epoch 2836, batch 0, global_batch_idx: 45360, batch size: 108, loss[dur_loss=0.1955, prior_loss=0.9736, diff_loss=0.2726, tot_loss=1.442, over 108.00 samples.], tot_loss[dur_loss=0.1955, prior_loss=0.9736, diff_loss=0.2726, tot_loss=1.442, over 108.00 samples.], 2024-10-21 17:59:29,072 INFO [train.py:561] (1/4) Epoch 2836, batch 10, global_batch_idx: 45370, batch size: 111, loss[dur_loss=0.1942, prior_loss=0.974, diff_loss=0.3148, tot_loss=1.483, over 111.00 samples.], tot_loss[dur_loss=0.1918, prior_loss=0.9726, diff_loss=0.3564, tot_loss=1.521, over 1656.00 samples.], 2024-10-21 17:59:36,232 INFO [train.py:682] (1/4) Start epoch 2837 2024-10-21 17:59:50,502 INFO [train.py:561] (1/4) Epoch 2837, batch 4, global_batch_idx: 45380, batch size: 189, loss[dur_loss=0.1925, prior_loss=0.9727, diff_loss=0.2929, tot_loss=1.458, over 189.00 samples.], tot_loss[dur_loss=0.1898, prior_loss=0.9721, diff_loss=0.4089, tot_loss=1.571, over 937.00 samples.], 2024-10-21 18:00:05,524 INFO [train.py:561] (1/4) Epoch 2837, batch 14, global_batch_idx: 45390, batch size: 142, loss[dur_loss=0.1948, prior_loss=0.9726, diff_loss=0.291, tot_loss=1.458, over 142.00 samples.], tot_loss[dur_loss=0.1917, prior_loss=0.9727, diff_loss=0.3531, tot_loss=1.518, over 2210.00 samples.], 2024-10-21 18:00:06,951 INFO [train.py:682] (1/4) Start epoch 2838 2024-10-21 18:00:27,116 INFO [train.py:561] (1/4) Epoch 2838, batch 8, global_batch_idx: 45400, batch size: 170, loss[dur_loss=0.1945, prior_loss=0.9727, diff_loss=0.2922, tot_loss=1.459, over 170.00 samples.], 
tot_loss[dur_loss=0.1915, prior_loss=0.9724, diff_loss=0.3866, tot_loss=1.55, over 1432.00 samples.], 2024-10-21 18:00:37,318 INFO [train.py:682] (1/4) Start epoch 2839 2024-10-21 18:00:49,055 INFO [train.py:561] (1/4) Epoch 2839, batch 2, global_batch_idx: 45410, batch size: 203, loss[dur_loss=0.1938, prior_loss=0.973, diff_loss=0.3423, tot_loss=1.509, over 203.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.9731, diff_loss=0.3184, tot_loss=1.485, over 442.00 samples.], 2024-10-21 18:01:03,355 INFO [train.py:561] (1/4) Epoch 2839, batch 12, global_batch_idx: 45420, batch size: 152, loss[dur_loss=0.1902, prior_loss=0.9728, diff_loss=0.2848, tot_loss=1.448, over 152.00 samples.], tot_loss[dur_loss=0.1921, prior_loss=0.9725, diff_loss=0.3523, tot_loss=1.517, over 1966.00 samples.], 2024-10-21 18:01:07,800 INFO [train.py:682] (1/4) Start epoch 2840 2024-10-21 18:01:25,005 INFO [train.py:561] (1/4) Epoch 2840, batch 6, global_batch_idx: 45430, batch size: 106, loss[dur_loss=0.1921, prior_loss=0.9728, diff_loss=0.3515, tot_loss=1.516, over 106.00 samples.], tot_loss[dur_loss=0.1901, prior_loss=0.9723, diff_loss=0.4036, tot_loss=1.566, over 1142.00 samples.], 2024-10-21 18:01:38,376 INFO [train.py:682] (1/4) Start epoch 2841 2024-10-21 18:01:47,627 INFO [train.py:561] (1/4) Epoch 2841, batch 0, global_batch_idx: 45440, batch size: 108, loss[dur_loss=0.1973, prior_loss=0.9733, diff_loss=0.2848, tot_loss=1.455, over 108.00 samples.], tot_loss[dur_loss=0.1973, prior_loss=0.9733, diff_loss=0.2848, tot_loss=1.455, over 108.00 samples.], 2024-10-21 18:02:01,960 INFO [train.py:561] (1/4) Epoch 2841, batch 10, global_batch_idx: 45450, batch size: 111, loss[dur_loss=0.1951, prior_loss=0.9738, diff_loss=0.2899, tot_loss=1.459, over 111.00 samples.], tot_loss[dur_loss=0.1915, prior_loss=0.9725, diff_loss=0.3687, tot_loss=1.533, over 1656.00 samples.], 2024-10-21 18:02:09,031 INFO [train.py:682] (1/4) Start epoch 2842 2024-10-21 18:02:22,956 INFO [train.py:561] (1/4) Epoch 2842, batch 4, global_batch_idx: 45460, batch size: 189, loss[dur_loss=0.1932, prior_loss=0.9728, diff_loss=0.3419, tot_loss=1.508, over 189.00 samples.], tot_loss[dur_loss=0.1898, prior_loss=0.972, diff_loss=0.424, tot_loss=1.586, over 937.00 samples.], 2024-10-21 18:02:38,044 INFO [train.py:561] (1/4) Epoch 2842, batch 14, global_batch_idx: 45470, batch size: 142, loss[dur_loss=0.1933, prior_loss=0.9726, diff_loss=0.3278, tot_loss=1.494, over 142.00 samples.], tot_loss[dur_loss=0.1917, prior_loss=0.9726, diff_loss=0.3486, tot_loss=1.513, over 2210.00 samples.], 2024-10-21 18:02:39,475 INFO [train.py:682] (1/4) Start epoch 2843 2024-10-21 18:02:59,597 INFO [train.py:561] (1/4) Epoch 2843, batch 8, global_batch_idx: 45480, batch size: 170, loss[dur_loss=0.1979, prior_loss=0.9731, diff_loss=0.2986, tot_loss=1.47, over 170.00 samples.], tot_loss[dur_loss=0.1927, prior_loss=0.9724, diff_loss=0.3788, tot_loss=1.544, over 1432.00 samples.], 2024-10-21 18:03:09,781 INFO [train.py:682] (1/4) Start epoch 2844 2024-10-21 18:03:21,626 INFO [train.py:561] (1/4) Epoch 2844, batch 2, global_batch_idx: 45490, batch size: 203, loss[dur_loss=0.1958, prior_loss=0.9728, diff_loss=0.3358, tot_loss=1.504, over 203.00 samples.], tot_loss[dur_loss=0.1945, prior_loss=0.9728, diff_loss=0.3015, tot_loss=1.469, over 442.00 samples.], 2024-10-21 18:03:36,006 INFO [train.py:561] (1/4) Epoch 2844, batch 12, global_batch_idx: 45500, batch size: 152, loss[dur_loss=0.1924, prior_loss=0.9729, diff_loss=0.2994, tot_loss=1.465, over 152.00 samples.], 
tot_loss[dur_loss=0.1918, prior_loss=0.9725, diff_loss=0.3589, tot_loss=1.523, over 1966.00 samples.], 2024-10-21 18:03:40,497 INFO [train.py:682] (1/4) Start epoch 2845 2024-10-21 18:03:57,748 INFO [train.py:561] (1/4) Epoch 2845, batch 6, global_batch_idx: 45510, batch size: 106, loss[dur_loss=0.1905, prior_loss=0.9729, diff_loss=0.2789, tot_loss=1.442, over 106.00 samples.], tot_loss[dur_loss=0.1902, prior_loss=0.9722, diff_loss=0.3892, tot_loss=1.552, over 1142.00 samples.], 2024-10-21 18:04:10,978 INFO [train.py:682] (1/4) Start epoch 2846 2024-10-21 18:04:19,818 INFO [train.py:561] (1/4) Epoch 2846, batch 0, global_batch_idx: 45520, batch size: 108, loss[dur_loss=0.1988, prior_loss=0.9734, diff_loss=0.2893, tot_loss=1.461, over 108.00 samples.], tot_loss[dur_loss=0.1988, prior_loss=0.9734, diff_loss=0.2893, tot_loss=1.461, over 108.00 samples.], 2024-10-21 18:04:34,304 INFO [train.py:561] (1/4) Epoch 2846, batch 10, global_batch_idx: 45530, batch size: 111, loss[dur_loss=0.1957, prior_loss=0.974, diff_loss=0.3139, tot_loss=1.484, over 111.00 samples.], tot_loss[dur_loss=0.192, prior_loss=0.9726, diff_loss=0.3732, tot_loss=1.538, over 1656.00 samples.], 2024-10-21 18:04:41,469 INFO [train.py:682] (1/4) Start epoch 2847 2024-10-21 18:04:55,538 INFO [train.py:561] (1/4) Epoch 2847, batch 4, global_batch_idx: 45540, batch size: 189, loss[dur_loss=0.1905, prior_loss=0.9728, diff_loss=0.3164, tot_loss=1.48, over 189.00 samples.], tot_loss[dur_loss=0.1891, prior_loss=0.972, diff_loss=0.4132, tot_loss=1.574, over 937.00 samples.], 2024-10-21 18:05:10,567 INFO [train.py:561] (1/4) Epoch 2847, batch 14, global_batch_idx: 45550, batch size: 142, loss[dur_loss=0.1954, prior_loss=0.9727, diff_loss=0.3436, tot_loss=1.512, over 142.00 samples.], tot_loss[dur_loss=0.1916, prior_loss=0.9726, diff_loss=0.3519, tot_loss=1.516, over 2210.00 samples.], 2024-10-21 18:05:11,991 INFO [train.py:682] (1/4) Start epoch 2848 2024-10-21 18:05:32,260 INFO [train.py:561] (1/4) Epoch 2848, batch 8, global_batch_idx: 45560, batch size: 170, loss[dur_loss=0.1959, prior_loss=0.973, diff_loss=0.3035, tot_loss=1.472, over 170.00 samples.], tot_loss[dur_loss=0.1912, prior_loss=0.9724, diff_loss=0.3713, tot_loss=1.535, over 1432.00 samples.], 2024-10-21 18:05:42,499 INFO [train.py:682] (1/4) Start epoch 2849 2024-10-21 18:05:53,913 INFO [train.py:561] (1/4) Epoch 2849, batch 2, global_batch_idx: 45570, batch size: 203, loss[dur_loss=0.1934, prior_loss=0.9726, diff_loss=0.3317, tot_loss=1.498, over 203.00 samples.], tot_loss[dur_loss=0.1942, prior_loss=0.9727, diff_loss=0.3115, tot_loss=1.478, over 442.00 samples.], 2024-10-21 18:06:08,385 INFO [train.py:561] (1/4) Epoch 2849, batch 12, global_batch_idx: 45580, batch size: 152, loss[dur_loss=0.1939, prior_loss=0.973, diff_loss=0.3305, tot_loss=1.497, over 152.00 samples.], tot_loss[dur_loss=0.1915, prior_loss=0.9725, diff_loss=0.3659, tot_loss=1.53, over 1966.00 samples.], 2024-10-21 18:06:12,920 INFO [train.py:682] (1/4) Start epoch 2850 2024-10-21 18:06:30,363 INFO [train.py:561] (1/4) Epoch 2850, batch 6, global_batch_idx: 45590, batch size: 106, loss[dur_loss=0.1924, prior_loss=0.9728, diff_loss=0.2837, tot_loss=1.449, over 106.00 samples.], tot_loss[dur_loss=0.1894, prior_loss=0.9722, diff_loss=0.3918, tot_loss=1.553, over 1142.00 samples.], 2024-10-21 18:06:43,582 INFO [train.py:682] (1/4) Start epoch 2851 2024-10-21 18:06:52,409 INFO [train.py:561] (1/4) Epoch 2851, batch 0, global_batch_idx: 45600, batch size: 108, loss[dur_loss=0.1971, prior_loss=0.9734, 
diff_loss=0.2711, tot_loss=1.442, over 108.00 samples.], tot_loss[dur_loss=0.1971, prior_loss=0.9734, diff_loss=0.2711, tot_loss=1.442, over 108.00 samples.], 2024-10-21 18:07:06,948 INFO [train.py:561] (1/4) Epoch 2851, batch 10, global_batch_idx: 45610, batch size: 111, loss[dur_loss=0.1949, prior_loss=0.9738, diff_loss=0.3156, tot_loss=1.484, over 111.00 samples.], tot_loss[dur_loss=0.1917, prior_loss=0.9726, diff_loss=0.3799, tot_loss=1.544, over 1656.00 samples.], 2024-10-21 18:07:14,129 INFO [train.py:682] (1/4) Start epoch 2852 2024-10-21 18:07:27,989 INFO [train.py:561] (1/4) Epoch 2852, batch 4, global_batch_idx: 45620, batch size: 189, loss[dur_loss=0.1967, prior_loss=0.9731, diff_loss=0.3334, tot_loss=1.503, over 189.00 samples.], tot_loss[dur_loss=0.191, prior_loss=0.9721, diff_loss=0.4296, tot_loss=1.593, over 937.00 samples.], 2024-10-21 18:07:43,101 INFO [train.py:561] (1/4) Epoch 2852, batch 14, global_batch_idx: 45630, batch size: 142, loss[dur_loss=0.1956, prior_loss=0.9727, diff_loss=0.3079, tot_loss=1.476, over 142.00 samples.], tot_loss[dur_loss=0.1927, prior_loss=0.9727, diff_loss=0.3565, tot_loss=1.522, over 2210.00 samples.], 2024-10-21 18:07:44,551 INFO [train.py:682] (1/4) Start epoch 2853 2024-10-21 18:08:04,951 INFO [train.py:561] (1/4) Epoch 2853, batch 8, global_batch_idx: 45640, batch size: 170, loss[dur_loss=0.1957, prior_loss=0.973, diff_loss=0.3102, tot_loss=1.479, over 170.00 samples.], tot_loss[dur_loss=0.1918, prior_loss=0.9724, diff_loss=0.374, tot_loss=1.538, over 1432.00 samples.], 2024-10-21 18:08:15,257 INFO [train.py:682] (1/4) Start epoch 2854 2024-10-21 18:08:26,443 INFO [train.py:561] (1/4) Epoch 2854, batch 2, global_batch_idx: 45650, batch size: 203, loss[dur_loss=0.1928, prior_loss=0.9728, diff_loss=0.3649, tot_loss=1.531, over 203.00 samples.], tot_loss[dur_loss=0.1938, prior_loss=0.9728, diff_loss=0.3206, tot_loss=1.487, over 442.00 samples.], 2024-10-21 18:08:40,997 INFO [train.py:561] (1/4) Epoch 2854, batch 12, global_batch_idx: 45660, batch size: 152, loss[dur_loss=0.1903, prior_loss=0.973, diff_loss=0.3023, tot_loss=1.466, over 152.00 samples.], tot_loss[dur_loss=0.1916, prior_loss=0.9726, diff_loss=0.3588, tot_loss=1.523, over 1966.00 samples.], 2024-10-21 18:08:45,562 INFO [train.py:682] (1/4) Start epoch 2855 2024-10-21 18:09:02,960 INFO [train.py:561] (1/4) Epoch 2855, batch 6, global_batch_idx: 45670, batch size: 106, loss[dur_loss=0.1926, prior_loss=0.9728, diff_loss=0.2903, tot_loss=1.456, over 106.00 samples.], tot_loss[dur_loss=0.1898, prior_loss=0.9722, diff_loss=0.4008, tot_loss=1.563, over 1142.00 samples.], 2024-10-21 18:09:16,340 INFO [train.py:682] (1/4) Start epoch 2856 2024-10-21 18:09:25,142 INFO [train.py:561] (1/4) Epoch 2856, batch 0, global_batch_idx: 45680, batch size: 108, loss[dur_loss=0.1959, prior_loss=0.9733, diff_loss=0.2665, tot_loss=1.436, over 108.00 samples.], tot_loss[dur_loss=0.1959, prior_loss=0.9733, diff_loss=0.2665, tot_loss=1.436, over 108.00 samples.], 2024-10-21 18:09:39,645 INFO [train.py:561] (1/4) Epoch 2856, batch 10, global_batch_idx: 45690, batch size: 111, loss[dur_loss=0.1921, prior_loss=0.974, diff_loss=0.2842, tot_loss=1.45, over 111.00 samples.], tot_loss[dur_loss=0.1912, prior_loss=0.9725, diff_loss=0.3591, tot_loss=1.523, over 1656.00 samples.], 2024-10-21 18:09:46,877 INFO [train.py:682] (1/4) Start epoch 2857 2024-10-21 18:10:00,877 INFO [train.py:561] (1/4) Epoch 2857, batch 4, global_batch_idx: 45700, batch size: 189, loss[dur_loss=0.1911, prior_loss=0.9728, diff_loss=0.3307, 
tot_loss=1.495, over 189.00 samples.], tot_loss[dur_loss=0.1896, prior_loss=0.9719, diff_loss=0.4226, tot_loss=1.584, over 937.00 samples.], 2024-10-21 18:10:15,951 INFO [train.py:561] (1/4) Epoch 2857, batch 14, global_batch_idx: 45710, batch size: 142, loss[dur_loss=0.192, prior_loss=0.9726, diff_loss=0.2875, tot_loss=1.452, over 142.00 samples.], tot_loss[dur_loss=0.192, prior_loss=0.9726, diff_loss=0.3447, tot_loss=1.509, over 2210.00 samples.], 2024-10-21 18:10:17,401 INFO [train.py:682] (1/4) Start epoch 2858 2024-10-21 18:10:37,797 INFO [train.py:561] (1/4) Epoch 2858, batch 8, global_batch_idx: 45720, batch size: 170, loss[dur_loss=0.1953, prior_loss=0.9729, diff_loss=0.3264, tot_loss=1.495, over 170.00 samples.], tot_loss[dur_loss=0.19, prior_loss=0.9723, diff_loss=0.3839, tot_loss=1.546, over 1432.00 samples.], 2024-10-21 18:10:48,020 INFO [train.py:682] (1/4) Start epoch 2859 2024-10-21 18:10:59,596 INFO [train.py:561] (1/4) Epoch 2859, batch 2, global_batch_idx: 45730, batch size: 203, loss[dur_loss=0.1926, prior_loss=0.9727, diff_loss=0.3158, tot_loss=1.481, over 203.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.9728, diff_loss=0.2829, tot_loss=1.45, over 442.00 samples.], 2024-10-21 18:11:13,975 INFO [train.py:561] (1/4) Epoch 2859, batch 12, global_batch_idx: 45740, batch size: 152, loss[dur_loss=0.1906, prior_loss=0.9728, diff_loss=0.3099, tot_loss=1.473, over 152.00 samples.], tot_loss[dur_loss=0.1912, prior_loss=0.9725, diff_loss=0.3449, tot_loss=1.509, over 1966.00 samples.], 2024-10-21 18:11:18,447 INFO [train.py:682] (1/4) Start epoch 2860 2024-10-21 18:11:35,376 INFO [train.py:561] (1/4) Epoch 2860, batch 6, global_batch_idx: 45750, batch size: 106, loss[dur_loss=0.1923, prior_loss=0.9728, diff_loss=0.285, tot_loss=1.45, over 106.00 samples.], tot_loss[dur_loss=0.1896, prior_loss=0.9723, diff_loss=0.3945, tot_loss=1.556, over 1142.00 samples.], 2024-10-21 18:11:48,685 INFO [train.py:682] (1/4) Start epoch 2861 2024-10-21 18:11:57,605 INFO [train.py:561] (1/4) Epoch 2861, batch 0, global_batch_idx: 45760, batch size: 108, loss[dur_loss=0.1982, prior_loss=0.9735, diff_loss=0.2837, tot_loss=1.455, over 108.00 samples.], tot_loss[dur_loss=0.1982, prior_loss=0.9735, diff_loss=0.2837, tot_loss=1.455, over 108.00 samples.], 2024-10-21 18:12:11,866 INFO [train.py:561] (1/4) Epoch 2861, batch 10, global_batch_idx: 45770, batch size: 111, loss[dur_loss=0.1933, prior_loss=0.9738, diff_loss=0.3091, tot_loss=1.476, over 111.00 samples.], tot_loss[dur_loss=0.1923, prior_loss=0.9725, diff_loss=0.3625, tot_loss=1.527, over 1656.00 samples.], 2024-10-21 18:12:18,987 INFO [train.py:682] (1/4) Start epoch 2862 2024-10-21 18:12:32,894 INFO [train.py:561] (1/4) Epoch 2862, batch 4, global_batch_idx: 45780, batch size: 189, loss[dur_loss=0.1964, prior_loss=0.9728, diff_loss=0.3331, tot_loss=1.502, over 189.00 samples.], tot_loss[dur_loss=0.1916, prior_loss=0.972, diff_loss=0.4219, tot_loss=1.586, over 937.00 samples.], 2024-10-21 18:12:47,821 INFO [train.py:561] (1/4) Epoch 2862, batch 14, global_batch_idx: 45790, batch size: 142, loss[dur_loss=0.1936, prior_loss=0.9726, diff_loss=0.3022, tot_loss=1.468, over 142.00 samples.], tot_loss[dur_loss=0.1926, prior_loss=0.9726, diff_loss=0.3536, tot_loss=1.519, over 2210.00 samples.], 2024-10-21 18:12:49,253 INFO [train.py:682] (1/4) Start epoch 2863 2024-10-21 18:13:09,201 INFO [train.py:561] (1/4) Epoch 2863, batch 8, global_batch_idx: 45800, batch size: 170, loss[dur_loss=0.1966, prior_loss=0.9731, diff_loss=0.3413, tot_loss=1.511, over 
170.00 samples.], tot_loss[dur_loss=0.1914, prior_loss=0.9724, diff_loss=0.3832, tot_loss=1.547, over 1432.00 samples.], 2024-10-21 18:13:19,390 INFO [train.py:682] (1/4) Start epoch 2864 2024-10-21 18:13:31,128 INFO [train.py:561] (1/4) Epoch 2864, batch 2, global_batch_idx: 45810, batch size: 203, loss[dur_loss=0.1909, prior_loss=0.9728, diff_loss=0.3107, tot_loss=1.474, over 203.00 samples.], tot_loss[dur_loss=0.1917, prior_loss=0.9728, diff_loss=0.3002, tot_loss=1.465, over 442.00 samples.], 2024-10-21 18:13:45,449 INFO [train.py:561] (1/4) Epoch 2864, batch 12, global_batch_idx: 45820, batch size: 152, loss[dur_loss=0.1889, prior_loss=0.9729, diff_loss=0.2853, tot_loss=1.447, over 152.00 samples.], tot_loss[dur_loss=0.1906, prior_loss=0.9725, diff_loss=0.3516, tot_loss=1.515, over 1966.00 samples.], 2024-10-21 18:13:49,999 INFO [train.py:682] (1/4) Start epoch 2865 2024-10-21 18:14:07,312 INFO [train.py:561] (1/4) Epoch 2865, batch 6, global_batch_idx: 45830, batch size: 106, loss[dur_loss=0.1929, prior_loss=0.9725, diff_loss=0.2936, tot_loss=1.459, over 106.00 samples.], tot_loss[dur_loss=0.1893, prior_loss=0.9721, diff_loss=0.3836, tot_loss=1.545, over 1142.00 samples.], 2024-10-21 18:14:20,538 INFO [train.py:682] (1/4) Start epoch 2866 2024-10-21 18:14:29,440 INFO [train.py:561] (1/4) Epoch 2866, batch 0, global_batch_idx: 45840, batch size: 108, loss[dur_loss=0.1972, prior_loss=0.9731, diff_loss=0.3322, tot_loss=1.503, over 108.00 samples.], tot_loss[dur_loss=0.1972, prior_loss=0.9731, diff_loss=0.3322, tot_loss=1.503, over 108.00 samples.], 2024-10-21 18:14:43,703 INFO [train.py:561] (1/4) Epoch 2866, batch 10, global_batch_idx: 45850, batch size: 111, loss[dur_loss=0.1927, prior_loss=0.9736, diff_loss=0.2742, tot_loss=1.441, over 111.00 samples.], tot_loss[dur_loss=0.1902, prior_loss=0.9724, diff_loss=0.3664, tot_loss=1.529, over 1656.00 samples.], 2024-10-21 18:14:50,825 INFO [train.py:682] (1/4) Start epoch 2867 2024-10-21 18:15:04,693 INFO [train.py:561] (1/4) Epoch 2867, batch 4, global_batch_idx: 45860, batch size: 189, loss[dur_loss=0.1921, prior_loss=0.9725, diff_loss=0.3268, tot_loss=1.491, over 189.00 samples.], tot_loss[dur_loss=0.1898, prior_loss=0.9718, diff_loss=0.422, tot_loss=1.584, over 937.00 samples.], 2024-10-21 18:15:19,607 INFO [train.py:561] (1/4) Epoch 2867, batch 14, global_batch_idx: 45870, batch size: 142, loss[dur_loss=0.1924, prior_loss=0.9724, diff_loss=0.3013, tot_loss=1.466, over 142.00 samples.], tot_loss[dur_loss=0.1919, prior_loss=0.9724, diff_loss=0.3494, tot_loss=1.514, over 2210.00 samples.], 2024-10-21 18:15:21,018 INFO [train.py:682] (1/4) Start epoch 2868 2024-10-21 18:15:41,048 INFO [train.py:561] (1/4) Epoch 2868, batch 8, global_batch_idx: 45880, batch size: 170, loss[dur_loss=0.1956, prior_loss=0.9728, diff_loss=0.3122, tot_loss=1.481, over 170.00 samples.], tot_loss[dur_loss=0.1917, prior_loss=0.9723, diff_loss=0.3786, tot_loss=1.543, over 1432.00 samples.], 2024-10-21 18:15:51,125 INFO [train.py:682] (1/4) Start epoch 2869 2024-10-21 18:16:02,623 INFO [train.py:561] (1/4) Epoch 2869, batch 2, global_batch_idx: 45890, batch size: 203, loss[dur_loss=0.1924, prior_loss=0.9727, diff_loss=0.3097, tot_loss=1.475, over 203.00 samples.], tot_loss[dur_loss=0.1945, prior_loss=0.9728, diff_loss=0.3001, tot_loss=1.467, over 442.00 samples.], 2024-10-21 18:16:16,841 INFO [train.py:561] (1/4) Epoch 2869, batch 12, global_batch_idx: 45900, batch size: 152, loss[dur_loss=0.1915, prior_loss=0.9728, diff_loss=0.2837, tot_loss=1.448, over 152.00 
samples.], tot_loss[dur_loss=0.1913, prior_loss=0.9724, diff_loss=0.347, tot_loss=1.511, over 1966.00 samples.], 2024-10-21 18:16:21,312 INFO [train.py:682] (1/4) Start epoch 2870 2024-10-21 18:16:38,537 INFO [train.py:561] (1/4) Epoch 2870, batch 6, global_batch_idx: 45910, batch size: 106, loss[dur_loss=0.1907, prior_loss=0.9727, diff_loss=0.2658, tot_loss=1.429, over 106.00 samples.], tot_loss[dur_loss=0.1895, prior_loss=0.972, diff_loss=0.3925, tot_loss=1.554, over 1142.00 samples.], 2024-10-21 18:16:51,674 INFO [train.py:682] (1/4) Start epoch 2871 2024-10-21 18:17:00,423 INFO [train.py:561] (1/4) Epoch 2871, batch 0, global_batch_idx: 45920, batch size: 108, loss[dur_loss=0.1988, prior_loss=0.9732, diff_loss=0.2739, tot_loss=1.446, over 108.00 samples.], tot_loss[dur_loss=0.1988, prior_loss=0.9732, diff_loss=0.2739, tot_loss=1.446, over 108.00 samples.], 2024-10-21 18:17:14,708 INFO [train.py:561] (1/4) Epoch 2871, batch 10, global_batch_idx: 45930, batch size: 111, loss[dur_loss=0.1911, prior_loss=0.9735, diff_loss=0.3194, tot_loss=1.484, over 111.00 samples.], tot_loss[dur_loss=0.1913, prior_loss=0.9723, diff_loss=0.3728, tot_loss=1.536, over 1656.00 samples.], 2024-10-21 18:17:21,864 INFO [train.py:682] (1/4) Start epoch 2872 2024-10-21 18:17:36,005 INFO [train.py:561] (1/4) Epoch 2872, batch 4, global_batch_idx: 45940, batch size: 189, loss[dur_loss=0.1906, prior_loss=0.9726, diff_loss=0.3128, tot_loss=1.476, over 189.00 samples.], tot_loss[dur_loss=0.1874, prior_loss=0.9718, diff_loss=0.3939, tot_loss=1.553, over 937.00 samples.], 2024-10-21 18:17:50,887 INFO [train.py:561] (1/4) Epoch 2872, batch 14, global_batch_idx: 45950, batch size: 142, loss[dur_loss=0.1941, prior_loss=0.9724, diff_loss=0.3025, tot_loss=1.469, over 142.00 samples.], tot_loss[dur_loss=0.1909, prior_loss=0.9724, diff_loss=0.3358, tot_loss=1.499, over 2210.00 samples.], 2024-10-21 18:17:52,335 INFO [train.py:682] (1/4) Start epoch 2873 2024-10-21 18:18:12,672 INFO [train.py:561] (1/4) Epoch 2873, batch 8, global_batch_idx: 45960, batch size: 170, loss[dur_loss=0.1922, prior_loss=0.9728, diff_loss=0.3204, tot_loss=1.485, over 170.00 samples.], tot_loss[dur_loss=0.1904, prior_loss=0.9721, diff_loss=0.3822, tot_loss=1.545, over 1432.00 samples.], 2024-10-21 18:18:22,906 INFO [train.py:682] (1/4) Start epoch 2874 2024-10-21 18:18:34,482 INFO [train.py:561] (1/4) Epoch 2874, batch 2, global_batch_idx: 45970, batch size: 203, loss[dur_loss=0.1903, prior_loss=0.9723, diff_loss=0.3439, tot_loss=1.507, over 203.00 samples.], tot_loss[dur_loss=0.192, prior_loss=0.9724, diff_loss=0.3233, tot_loss=1.488, over 442.00 samples.], 2024-10-21 18:18:48,803 INFO [train.py:561] (1/4) Epoch 2874, batch 12, global_batch_idx: 45980, batch size: 152, loss[dur_loss=0.1902, prior_loss=0.9728, diff_loss=0.2954, tot_loss=1.458, over 152.00 samples.], tot_loss[dur_loss=0.1908, prior_loss=0.9723, diff_loss=0.3584, tot_loss=1.521, over 1966.00 samples.], 2024-10-21 18:18:53,252 INFO [train.py:682] (1/4) Start epoch 2875 2024-10-21 18:19:10,672 INFO [train.py:561] (1/4) Epoch 2875, batch 6, global_batch_idx: 45990, batch size: 106, loss[dur_loss=0.1936, prior_loss=0.9727, diff_loss=0.2763, tot_loss=1.443, over 106.00 samples.], tot_loss[dur_loss=0.1897, prior_loss=0.972, diff_loss=0.3917, tot_loss=1.553, over 1142.00 samples.], 2024-10-21 18:19:23,766 INFO [train.py:682] (1/4) Start epoch 2876 2024-10-21 18:19:32,547 INFO [train.py:561] (1/4) Epoch 2876, batch 0, global_batch_idx: 46000, batch size: 108, loss[dur_loss=0.1963, 
prior_loss=0.973, diff_loss=0.2965, tot_loss=1.466, over 108.00 samples.], tot_loss[dur_loss=0.1963, prior_loss=0.973, diff_loss=0.2965, tot_loss=1.466, over 108.00 samples.], 2024-10-21 18:19:46,747 INFO [train.py:561] (1/4) Epoch 2876, batch 10, global_batch_idx: 46010, batch size: 111, loss[dur_loss=0.1934, prior_loss=0.9735, diff_loss=0.3413, tot_loss=1.508, over 111.00 samples.], tot_loss[dur_loss=0.19, prior_loss=0.9722, diff_loss=0.3756, tot_loss=1.538, over 1656.00 samples.], 2024-10-21 18:19:53,900 INFO [train.py:682] (1/4) Start epoch 2877 2024-10-21 18:20:07,918 INFO [train.py:561] (1/4) Epoch 2877, batch 4, global_batch_idx: 46020, batch size: 189, loss[dur_loss=0.1937, prior_loss=0.9726, diff_loss=0.3195, tot_loss=1.486, over 189.00 samples.], tot_loss[dur_loss=0.1888, prior_loss=0.9718, diff_loss=0.4119, tot_loss=1.573, over 937.00 samples.], 2024-10-21 18:20:22,950 INFO [train.py:561] (1/4) Epoch 2877, batch 14, global_batch_idx: 46030, batch size: 142, loss[dur_loss=0.1953, prior_loss=0.9724, diff_loss=0.2918, tot_loss=1.459, over 142.00 samples.], tot_loss[dur_loss=0.1914, prior_loss=0.9723, diff_loss=0.3479, tot_loss=1.512, over 2210.00 samples.], 2024-10-21 18:20:24,382 INFO [train.py:682] (1/4) Start epoch 2878 2024-10-21 18:20:44,542 INFO [train.py:561] (1/4) Epoch 2878, batch 8, global_batch_idx: 46040, batch size: 170, loss[dur_loss=0.1966, prior_loss=0.9726, diff_loss=0.2935, tot_loss=1.463, over 170.00 samples.], tot_loss[dur_loss=0.1908, prior_loss=0.9721, diff_loss=0.3838, tot_loss=1.547, over 1432.00 samples.], 2024-10-21 18:20:54,706 INFO [train.py:682] (1/4) Start epoch 2879 2024-10-21 18:21:06,111 INFO [train.py:561] (1/4) Epoch 2879, batch 2, global_batch_idx: 46050, batch size: 203, loss[dur_loss=0.1945, prior_loss=0.9726, diff_loss=0.3425, tot_loss=1.51, over 203.00 samples.], tot_loss[dur_loss=0.1946, prior_loss=0.9726, diff_loss=0.3125, tot_loss=1.48, over 442.00 samples.], 2024-10-21 18:21:20,339 INFO [train.py:561] (1/4) Epoch 2879, batch 12, global_batch_idx: 46060, batch size: 152, loss[dur_loss=0.1939, prior_loss=0.9729, diff_loss=0.312, tot_loss=1.479, over 152.00 samples.], tot_loss[dur_loss=0.1915, prior_loss=0.9723, diff_loss=0.3506, tot_loss=1.514, over 1966.00 samples.], 2024-10-21 18:21:24,851 INFO [train.py:682] (1/4) Start epoch 2880 2024-10-21 18:21:42,171 INFO [train.py:561] (1/4) Epoch 2880, batch 6, global_batch_idx: 46070, batch size: 106, loss[dur_loss=0.1908, prior_loss=0.9725, diff_loss=0.3125, tot_loss=1.476, over 106.00 samples.], tot_loss[dur_loss=0.1887, prior_loss=0.972, diff_loss=0.3957, tot_loss=1.556, over 1142.00 samples.], 2024-10-21 18:21:55,277 INFO [train.py:682] (1/4) Start epoch 2881 2024-10-21 18:22:04,357 INFO [train.py:561] (1/4) Epoch 2881, batch 0, global_batch_idx: 46080, batch size: 108, loss[dur_loss=0.1973, prior_loss=0.9731, diff_loss=0.2681, tot_loss=1.438, over 108.00 samples.], tot_loss[dur_loss=0.1973, prior_loss=0.9731, diff_loss=0.2681, tot_loss=1.438, over 108.00 samples.], 2024-10-21 18:22:18,640 INFO [train.py:561] (1/4) Epoch 2881, batch 10, global_batch_idx: 46090, batch size: 111, loss[dur_loss=0.1961, prior_loss=0.9734, diff_loss=0.2613, tot_loss=1.431, over 111.00 samples.], tot_loss[dur_loss=0.1907, prior_loss=0.9722, diff_loss=0.3561, tot_loss=1.519, over 1656.00 samples.], 2024-10-21 18:22:25,801 INFO [train.py:682] (1/4) Start epoch 2882 2024-10-21 18:22:39,795 INFO [train.py:561] (1/4) Epoch 2882, batch 4, global_batch_idx: 46100, batch size: 189, loss[dur_loss=0.1897, prior_loss=0.9725, 
diff_loss=0.3363, tot_loss=1.498, over 189.00 samples.], tot_loss[dur_loss=0.1893, prior_loss=0.9718, diff_loss=0.4165, tot_loss=1.578, over 937.00 samples.], 2024-10-21 18:22:54,746 INFO [train.py:561] (1/4) Epoch 2882, batch 14, global_batch_idx: 46110, batch size: 142, loss[dur_loss=0.1941, prior_loss=0.9723, diff_loss=0.3096, tot_loss=1.476, over 142.00 samples.], tot_loss[dur_loss=0.1915, prior_loss=0.9723, diff_loss=0.3454, tot_loss=1.509, over 2210.00 samples.], 2024-10-21 18:22:56,197 INFO [train.py:682] (1/4) Start epoch 2883 2024-10-21 18:23:16,470 INFO [train.py:561] (1/4) Epoch 2883, batch 8, global_batch_idx: 46120, batch size: 170, loss[dur_loss=0.1974, prior_loss=0.9727, diff_loss=0.3026, tot_loss=1.473, over 170.00 samples.], tot_loss[dur_loss=0.1911, prior_loss=0.9721, diff_loss=0.3709, tot_loss=1.534, over 1432.00 samples.], 2024-10-21 18:23:26,647 INFO [train.py:682] (1/4) Start epoch 2884 2024-10-21 18:23:38,238 INFO [train.py:561] (1/4) Epoch 2884, batch 2, global_batch_idx: 46130, batch size: 203, loss[dur_loss=0.1946, prior_loss=0.9724, diff_loss=0.3546, tot_loss=1.522, over 203.00 samples.], tot_loss[dur_loss=0.195, prior_loss=0.9726, diff_loss=0.323, tot_loss=1.491, over 442.00 samples.], 2024-10-21 18:23:52,470 INFO [train.py:561] (1/4) Epoch 2884, batch 12, global_batch_idx: 46140, batch size: 152, loss[dur_loss=0.1923, prior_loss=0.9726, diff_loss=0.2984, tot_loss=1.463, over 152.00 samples.], tot_loss[dur_loss=0.1915, prior_loss=0.9722, diff_loss=0.3566, tot_loss=1.52, over 1966.00 samples.], 2024-10-21 18:23:56,933 INFO [train.py:682] (1/4) Start epoch 2885 2024-10-21 18:24:14,069 INFO [train.py:561] (1/4) Epoch 2885, batch 6, global_batch_idx: 46150, batch size: 106, loss[dur_loss=0.1929, prior_loss=0.9726, diff_loss=0.2973, tot_loss=1.463, over 106.00 samples.], tot_loss[dur_loss=0.1893, prior_loss=0.9718, diff_loss=0.3897, tot_loss=1.551, over 1142.00 samples.], 2024-10-21 18:24:27,235 INFO [train.py:682] (1/4) Start epoch 2886 2024-10-21 18:24:36,318 INFO [train.py:561] (1/4) Epoch 2886, batch 0, global_batch_idx: 46160, batch size: 108, loss[dur_loss=0.1961, prior_loss=0.9731, diff_loss=0.2596, tot_loss=1.429, over 108.00 samples.], tot_loss[dur_loss=0.1961, prior_loss=0.9731, diff_loss=0.2596, tot_loss=1.429, over 108.00 samples.], 2024-10-21 18:24:50,695 INFO [train.py:561] (1/4) Epoch 2886, batch 10, global_batch_idx: 46170, batch size: 111, loss[dur_loss=0.1923, prior_loss=0.9734, diff_loss=0.3074, tot_loss=1.473, over 111.00 samples.], tot_loss[dur_loss=0.1905, prior_loss=0.9722, diff_loss=0.3601, tot_loss=1.523, over 1656.00 samples.], 2024-10-21 18:24:57,785 INFO [train.py:682] (1/4) Start epoch 2887 2024-10-21 18:25:11,607 INFO [train.py:561] (1/4) Epoch 2887, batch 4, global_batch_idx: 46180, batch size: 189, loss[dur_loss=0.1912, prior_loss=0.9722, diff_loss=0.3251, tot_loss=1.488, over 189.00 samples.], tot_loss[dur_loss=0.1889, prior_loss=0.9717, diff_loss=0.4224, tot_loss=1.583, over 937.00 samples.], 2024-10-21 18:25:26,584 INFO [train.py:561] (1/4) Epoch 2887, batch 14, global_batch_idx: 46190, batch size: 142, loss[dur_loss=0.1922, prior_loss=0.9723, diff_loss=0.3262, tot_loss=1.491, over 142.00 samples.], tot_loss[dur_loss=0.1906, prior_loss=0.9723, diff_loss=0.357, tot_loss=1.52, over 2210.00 samples.], 2024-10-21 18:25:28,011 INFO [train.py:682] (1/4) Start epoch 2888 2024-10-21 18:25:48,652 INFO [train.py:561] (1/4) Epoch 2888, batch 8, global_batch_idx: 46200, batch size: 170, loss[dur_loss=0.1926, prior_loss=0.9725, diff_loss=0.3188, 
tot_loss=1.484, over 170.00 samples.], tot_loss[dur_loss=0.1899, prior_loss=0.972, diff_loss=0.3936, tot_loss=1.555, over 1432.00 samples.], 2024-10-21 18:25:58,794 INFO [train.py:682] (1/4) Start epoch 2889 2024-10-21 18:26:10,536 INFO [train.py:561] (1/4) Epoch 2889, batch 2, global_batch_idx: 46210, batch size: 203, loss[dur_loss=0.1927, prior_loss=0.9724, diff_loss=0.2998, tot_loss=1.465, over 203.00 samples.], tot_loss[dur_loss=0.1926, prior_loss=0.9725, diff_loss=0.2891, tot_loss=1.454, over 442.00 samples.], 2024-10-21 18:26:25,041 INFO [train.py:561] (1/4) Epoch 2889, batch 12, global_batch_idx: 46220, batch size: 152, loss[dur_loss=0.1933, prior_loss=0.9726, diff_loss=0.2767, tot_loss=1.443, over 152.00 samples.], tot_loss[dur_loss=0.1907, prior_loss=0.9722, diff_loss=0.3391, tot_loss=1.502, over 1966.00 samples.], 2024-10-21 18:26:29,553 INFO [train.py:682] (1/4) Start epoch 2890 2024-10-21 18:26:46,987 INFO [train.py:561] (1/4) Epoch 2890, batch 6, global_batch_idx: 46230, batch size: 106, loss[dur_loss=0.1908, prior_loss=0.9724, diff_loss=0.2685, tot_loss=1.432, over 106.00 samples.], tot_loss[dur_loss=0.1878, prior_loss=0.9718, diff_loss=0.4023, tot_loss=1.562, over 1142.00 samples.], 2024-10-21 18:27:00,132 INFO [train.py:682] (1/4) Start epoch 2891 2024-10-21 18:27:09,198 INFO [train.py:561] (1/4) Epoch 2891, batch 0, global_batch_idx: 46240, batch size: 108, loss[dur_loss=0.1946, prior_loss=0.973, diff_loss=0.2848, tot_loss=1.452, over 108.00 samples.], tot_loss[dur_loss=0.1946, prior_loss=0.973, diff_loss=0.2848, tot_loss=1.452, over 108.00 samples.], 2024-10-21 18:27:23,466 INFO [train.py:561] (1/4) Epoch 2891, batch 10, global_batch_idx: 46250, batch size: 111, loss[dur_loss=0.1934, prior_loss=0.9735, diff_loss=0.2857, tot_loss=1.453, over 111.00 samples.], tot_loss[dur_loss=0.19, prior_loss=0.9721, diff_loss=0.371, tot_loss=1.533, over 1656.00 samples.], 2024-10-21 18:27:30,597 INFO [train.py:682] (1/4) Start epoch 2892 2024-10-21 18:27:44,716 INFO [train.py:561] (1/4) Epoch 2892, batch 4, global_batch_idx: 46260, batch size: 189, loss[dur_loss=0.1924, prior_loss=0.9725, diff_loss=0.3324, tot_loss=1.497, over 189.00 samples.], tot_loss[dur_loss=0.1883, prior_loss=0.9717, diff_loss=0.4277, tot_loss=1.588, over 937.00 samples.], 2024-10-21 18:27:59,707 INFO [train.py:561] (1/4) Epoch 2892, batch 14, global_batch_idx: 46270, batch size: 142, loss[dur_loss=0.1924, prior_loss=0.9723, diff_loss=0.2717, tot_loss=1.436, over 142.00 samples.], tot_loss[dur_loss=0.1902, prior_loss=0.9723, diff_loss=0.3509, tot_loss=1.513, over 2210.00 samples.], 2024-10-21 18:28:01,131 INFO [train.py:682] (1/4) Start epoch 2893 2024-10-21 18:28:21,331 INFO [train.py:561] (1/4) Epoch 2893, batch 8, global_batch_idx: 46280, batch size: 170, loss[dur_loss=0.1919, prior_loss=0.9725, diff_loss=0.318, tot_loss=1.482, over 170.00 samples.], tot_loss[dur_loss=0.1885, prior_loss=0.972, diff_loss=0.3828, tot_loss=1.543, over 1432.00 samples.], 2024-10-21 18:28:31,539 INFO [train.py:682] (1/4) Start epoch 2894 2024-10-21 18:28:43,307 INFO [train.py:561] (1/4) Epoch 2894, batch 2, global_batch_idx: 46290, batch size: 203, loss[dur_loss=0.1936, prior_loss=0.9724, diff_loss=0.3215, tot_loss=1.488, over 203.00 samples.], tot_loss[dur_loss=0.1934, prior_loss=0.9725, diff_loss=0.3107, tot_loss=1.477, over 442.00 samples.], 2024-10-21 18:28:57,651 INFO [train.py:561] (1/4) Epoch 2894, batch 12, global_batch_idx: 46300, batch size: 152, loss[dur_loss=0.1935, prior_loss=0.9727, diff_loss=0.3054, tot_loss=1.472, over 
152.00 samples.], tot_loss[dur_loss=0.1901, prior_loss=0.9722, diff_loss=0.3575, tot_loss=1.52, over 1966.00 samples.], 2024-10-21 18:29:02,096 INFO [train.py:682] (1/4) Start epoch 2895 2024-10-21 18:29:19,290 INFO [train.py:561] (1/4) Epoch 2895, batch 6, global_batch_idx: 46310, batch size: 106, loss[dur_loss=0.1897, prior_loss=0.9726, diff_loss=0.314, tot_loss=1.476, over 106.00 samples.], tot_loss[dur_loss=0.1887, prior_loss=0.9719, diff_loss=0.4144, tot_loss=1.575, over 1142.00 samples.], 2024-10-21 18:29:32,497 INFO [train.py:682] (1/4) Start epoch 2896 2024-10-21 18:29:41,298 INFO [train.py:561] (1/4) Epoch 2896, batch 0, global_batch_idx: 46320, batch size: 108, loss[dur_loss=0.1976, prior_loss=0.9731, diff_loss=0.329, tot_loss=1.5, over 108.00 samples.], tot_loss[dur_loss=0.1976, prior_loss=0.9731, diff_loss=0.329, tot_loss=1.5, over 108.00 samples.], 2024-10-21 18:29:55,752 INFO [train.py:561] (1/4) Epoch 2896, batch 10, global_batch_idx: 46330, batch size: 111, loss[dur_loss=0.1925, prior_loss=0.9735, diff_loss=0.2642, tot_loss=1.43, over 111.00 samples.], tot_loss[dur_loss=0.1898, prior_loss=0.9722, diff_loss=0.3701, tot_loss=1.532, over 1656.00 samples.], 2024-10-21 18:30:02,902 INFO [train.py:682] (1/4) Start epoch 2897 2024-10-21 18:30:17,012 INFO [train.py:561] (1/4) Epoch 2897, batch 4, global_batch_idx: 46340, batch size: 189, loss[dur_loss=0.1896, prior_loss=0.9724, diff_loss=0.3357, tot_loss=1.498, over 189.00 samples.], tot_loss[dur_loss=0.1877, prior_loss=0.9717, diff_loss=0.4135, tot_loss=1.573, over 937.00 samples.], 2024-10-21 18:30:32,003 INFO [train.py:561] (1/4) Epoch 2897, batch 14, global_batch_idx: 46350, batch size: 142, loss[dur_loss=0.1915, prior_loss=0.9723, diff_loss=0.2821, tot_loss=1.446, over 142.00 samples.], tot_loss[dur_loss=0.1901, prior_loss=0.9722, diff_loss=0.348, tot_loss=1.51, over 2210.00 samples.], 2024-10-21 18:30:33,435 INFO [train.py:682] (1/4) Start epoch 2898 2024-10-21 18:30:53,282 INFO [train.py:561] (1/4) Epoch 2898, batch 8, global_batch_idx: 46360, batch size: 170, loss[dur_loss=0.1924, prior_loss=0.9725, diff_loss=0.29, tot_loss=1.455, over 170.00 samples.], tot_loss[dur_loss=0.1899, prior_loss=0.972, diff_loss=0.3634, tot_loss=1.525, over 1432.00 samples.], 2024-10-21 18:31:03,417 INFO [train.py:682] (1/4) Start epoch 2899 2024-10-21 18:31:15,096 INFO [train.py:561] (1/4) Epoch 2899, batch 2, global_batch_idx: 46370, batch size: 203, loss[dur_loss=0.1921, prior_loss=0.9725, diff_loss=0.3299, tot_loss=1.495, over 203.00 samples.], tot_loss[dur_loss=0.1929, prior_loss=0.9725, diff_loss=0.3093, tot_loss=1.475, over 442.00 samples.], 2024-10-21 18:31:29,343 INFO [train.py:561] (1/4) Epoch 2899, batch 12, global_batch_idx: 46380, batch size: 152, loss[dur_loss=0.1908, prior_loss=0.9727, diff_loss=0.3077, tot_loss=1.471, over 152.00 samples.], tot_loss[dur_loss=0.1898, prior_loss=0.9722, diff_loss=0.3555, tot_loss=1.518, over 1966.00 samples.], 2024-10-21 18:31:33,787 INFO [train.py:682] (1/4) Start epoch 2900 2024-10-21 18:31:50,962 INFO [train.py:561] (1/4) Epoch 2900, batch 6, global_batch_idx: 46390, batch size: 106, loss[dur_loss=0.1917, prior_loss=0.9725, diff_loss=0.3102, tot_loss=1.474, over 106.00 samples.], tot_loss[dur_loss=0.1897, prior_loss=0.9719, diff_loss=0.3962, tot_loss=1.558, over 1142.00 samples.], 2024-10-21 18:32:04,017 INFO [train.py:682] (1/4) Start epoch 2901 2024-10-21 18:32:12,813 INFO [train.py:561] (1/4) Epoch 2901, batch 0, global_batch_idx: 46400, batch size: 108, loss[dur_loss=0.196, prior_loss=0.973, 
diff_loss=0.2894, tot_loss=1.458, over 108.00 samples.], tot_loss[dur_loss=0.196, prior_loss=0.973, diff_loss=0.2894, tot_loss=1.458, over 108.00 samples.], 2024-10-21 18:32:27,026 INFO [train.py:561] (1/4) Epoch 2901, batch 10, global_batch_idx: 46410, batch size: 111, loss[dur_loss=0.1901, prior_loss=0.9735, diff_loss=0.3103, tot_loss=1.474, over 111.00 samples.], tot_loss[dur_loss=0.1889, prior_loss=0.9721, diff_loss=0.3646, tot_loss=1.526, over 1656.00 samples.], 2024-10-21 18:32:34,085 INFO [train.py:682] (1/4) Start epoch 2902 2024-10-21 18:32:48,172 INFO [train.py:561] (1/4) Epoch 2902, batch 4, global_batch_idx: 46420, batch size: 189, loss[dur_loss=0.1918, prior_loss=0.9725, diff_loss=0.2968, tot_loss=1.461, over 189.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9717, diff_loss=0.4075, tot_loss=1.567, over 937.00 samples.], 2024-10-21 18:33:03,090 INFO [train.py:561] (1/4) Epoch 2902, batch 14, global_batch_idx: 46430, batch size: 142, loss[dur_loss=0.1942, prior_loss=0.9724, diff_loss=0.2716, tot_loss=1.438, over 142.00 samples.], tot_loss[dur_loss=0.1905, prior_loss=0.9722, diff_loss=0.348, tot_loss=1.511, over 2210.00 samples.], 2024-10-21 18:33:04,543 INFO [train.py:682] (1/4) Start epoch 2903 2024-10-21 18:33:24,550 INFO [train.py:561] (1/4) Epoch 2903, batch 8, global_batch_idx: 46440, batch size: 170, loss[dur_loss=0.1925, prior_loss=0.9726, diff_loss=0.2913, tot_loss=1.456, over 170.00 samples.], tot_loss[dur_loss=0.1898, prior_loss=0.972, diff_loss=0.3724, tot_loss=1.534, over 1432.00 samples.], 2024-10-21 18:33:34,752 INFO [train.py:682] (1/4) Start epoch 2904 2024-10-21 18:33:46,206 INFO [train.py:561] (1/4) Epoch 2904, batch 2, global_batch_idx: 46450, batch size: 203, loss[dur_loss=0.1932, prior_loss=0.9725, diff_loss=0.3369, tot_loss=1.503, over 203.00 samples.], tot_loss[dur_loss=0.1941, prior_loss=0.9726, diff_loss=0.3088, tot_loss=1.475, over 442.00 samples.], 2024-10-21 18:34:00,621 INFO [train.py:561] (1/4) Epoch 2904, batch 12, global_batch_idx: 46460, batch size: 152, loss[dur_loss=0.1915, prior_loss=0.9727, diff_loss=0.2792, tot_loss=1.443, over 152.00 samples.], tot_loss[dur_loss=0.1906, prior_loss=0.9723, diff_loss=0.3514, tot_loss=1.514, over 1966.00 samples.], 2024-10-21 18:34:05,129 INFO [train.py:682] (1/4) Start epoch 2905 2024-10-21 18:34:22,458 INFO [train.py:561] (1/4) Epoch 2905, batch 6, global_batch_idx: 46470, batch size: 106, loss[dur_loss=0.1898, prior_loss=0.9725, diff_loss=0.3538, tot_loss=1.516, over 106.00 samples.], tot_loss[dur_loss=0.1887, prior_loss=0.972, diff_loss=0.4049, tot_loss=1.566, over 1142.00 samples.], 2024-10-21 18:34:35,619 INFO [train.py:682] (1/4) Start epoch 2906 2024-10-21 18:34:44,412 INFO [train.py:561] (1/4) Epoch 2906, batch 0, global_batch_idx: 46480, batch size: 108, loss[dur_loss=0.1998, prior_loss=0.9733, diff_loss=0.2595, tot_loss=1.433, over 108.00 samples.], tot_loss[dur_loss=0.1998, prior_loss=0.9733, diff_loss=0.2595, tot_loss=1.433, over 108.00 samples.], 2024-10-21 18:34:58,701 INFO [train.py:561] (1/4) Epoch 2906, batch 10, global_batch_idx: 46490, batch size: 111, loss[dur_loss=0.1916, prior_loss=0.9735, diff_loss=0.2987, tot_loss=1.464, over 111.00 samples.], tot_loss[dur_loss=0.19, prior_loss=0.9722, diff_loss=0.3641, tot_loss=1.526, over 1656.00 samples.], 2024-10-21 18:35:05,862 INFO [train.py:682] (1/4) Start epoch 2907 2024-10-21 18:35:19,616 INFO [train.py:561] (1/4) Epoch 2907, batch 4, global_batch_idx: 46500, batch size: 189, loss[dur_loss=0.1912, prior_loss=0.9725, diff_loss=0.3436, 
tot_loss=1.507, over 189.00 samples.], tot_loss[dur_loss=0.188, prior_loss=0.9718, diff_loss=0.4171, tot_loss=1.577, over 937.00 samples.], 2024-10-21 18:35:21,283 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 18:35:59,039 INFO [train.py:589] (1/4) Epoch 2907, validation: dur_loss=0.4529, prior_loss=1.036, diff_loss=0.4001, tot_loss=1.889, over 100.00 samples. 2024-10-21 18:35:59,041 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 18:36:12,477 INFO [train.py:561] (1/4) Epoch 2907, batch 14, global_batch_idx: 46510, batch size: 142, loss[dur_loss=0.1929, prior_loss=0.9723, diff_loss=0.2989, tot_loss=1.464, over 142.00 samples.], tot_loss[dur_loss=0.19, prior_loss=0.9723, diff_loss=0.3426, tot_loss=1.505, over 2210.00 samples.], 2024-10-21 18:36:13,912 INFO [train.py:682] (1/4) Start epoch 2908 2024-10-21 18:36:34,400 INFO [train.py:561] (1/4) Epoch 2908, batch 8, global_batch_idx: 46520, batch size: 170, loss[dur_loss=0.1929, prior_loss=0.9729, diff_loss=0.3035, tot_loss=1.469, over 170.00 samples.], tot_loss[dur_loss=0.1889, prior_loss=0.9721, diff_loss=0.3709, tot_loss=1.532, over 1432.00 samples.], 2024-10-21 18:36:44,512 INFO [train.py:682] (1/4) Start epoch 2909 2024-10-21 18:36:56,101 INFO [train.py:561] (1/4) Epoch 2909, batch 2, global_batch_idx: 46530, batch size: 203, loss[dur_loss=0.1912, prior_loss=0.9724, diff_loss=0.3298, tot_loss=1.493, over 203.00 samples.], tot_loss[dur_loss=0.1922, prior_loss=0.9726, diff_loss=0.3218, tot_loss=1.487, over 442.00 samples.], 2024-10-21 18:37:10,373 INFO [train.py:561] (1/4) Epoch 2909, batch 12, global_batch_idx: 46540, batch size: 152, loss[dur_loss=0.1904, prior_loss=0.9723, diff_loss=0.2761, tot_loss=1.439, over 152.00 samples.], tot_loss[dur_loss=0.1898, prior_loss=0.9722, diff_loss=0.3569, tot_loss=1.519, over 1966.00 samples.], 2024-10-21 18:37:14,820 INFO [train.py:682] (1/4) Start epoch 2910 2024-10-21 18:37:32,153 INFO [train.py:561] (1/4) Epoch 2910, batch 6, global_batch_idx: 46550, batch size: 106, loss[dur_loss=0.1904, prior_loss=0.9726, diff_loss=0.3203, tot_loss=1.483, over 106.00 samples.], tot_loss[dur_loss=0.1897, prior_loss=0.972, diff_loss=0.3993, tot_loss=1.561, over 1142.00 samples.], 2024-10-21 18:37:45,165 INFO [train.py:682] (1/4) Start epoch 2911 2024-10-21 18:37:54,193 INFO [train.py:561] (1/4) Epoch 2911, batch 0, global_batch_idx: 46560, batch size: 108, loss[dur_loss=0.1971, prior_loss=0.9732, diff_loss=0.2986, tot_loss=1.469, over 108.00 samples.], tot_loss[dur_loss=0.1971, prior_loss=0.9732, diff_loss=0.2986, tot_loss=1.469, over 108.00 samples.], 2024-10-21 18:38:08,422 INFO [train.py:561] (1/4) Epoch 2911, batch 10, global_batch_idx: 46570, batch size: 111, loss[dur_loss=0.193, prior_loss=0.9735, diff_loss=0.3074, tot_loss=1.474, over 111.00 samples.], tot_loss[dur_loss=0.1912, prior_loss=0.9723, diff_loss=0.3717, tot_loss=1.535, over 1656.00 samples.], 2024-10-21 18:38:15,516 INFO [train.py:682] (1/4) Start epoch 2912 2024-10-21 18:38:29,594 INFO [train.py:561] (1/4) Epoch 2912, batch 4, global_batch_idx: 46580, batch size: 189, loss[dur_loss=0.1917, prior_loss=0.9729, diff_loss=0.3257, tot_loss=1.49, over 189.00 samples.], tot_loss[dur_loss=0.189, prior_loss=0.9719, diff_loss=0.4221, tot_loss=1.583, over 937.00 samples.], 2024-10-21 18:38:44,525 INFO [train.py:561] (1/4) Epoch 2912, batch 14, global_batch_idx: 46590, batch size: 142, loss[dur_loss=0.1914, prior_loss=0.9722, diff_loss=0.2967, tot_loss=1.46, over 142.00 samples.], tot_loss[dur_loss=0.1906, 
prior_loss=0.9724, diff_loss=0.3497, tot_loss=1.513, over 2210.00 samples.], 2024-10-21 18:38:45,956 INFO [train.py:682] (1/4) Start epoch 2913 2024-10-21 18:39:06,286 INFO [train.py:561] (1/4) Epoch 2913, batch 8, global_batch_idx: 46600, batch size: 170, loss[dur_loss=0.1942, prior_loss=0.9726, diff_loss=0.3292, tot_loss=1.496, over 170.00 samples.], tot_loss[dur_loss=0.1892, prior_loss=0.9721, diff_loss=0.3855, tot_loss=1.547, over 1432.00 samples.], 2024-10-21 18:39:16,544 INFO [train.py:682] (1/4) Start epoch 2914 2024-10-21 18:39:28,884 INFO [train.py:561] (1/4) Epoch 2914, batch 2, global_batch_idx: 46610, batch size: 203, loss[dur_loss=0.1932, prior_loss=0.9724, diff_loss=0.3039, tot_loss=1.469, over 203.00 samples.], tot_loss[dur_loss=0.1936, prior_loss=0.9725, diff_loss=0.2838, tot_loss=1.45, over 442.00 samples.], 2024-10-21 18:39:43,084 INFO [train.py:561] (1/4) Epoch 2914, batch 12, global_batch_idx: 46620, batch size: 152, loss[dur_loss=0.1886, prior_loss=0.9725, diff_loss=0.2777, tot_loss=1.439, over 152.00 samples.], tot_loss[dur_loss=0.1901, prior_loss=0.9722, diff_loss=0.3493, tot_loss=1.512, over 1966.00 samples.], 2024-10-21 18:39:47,510 INFO [train.py:682] (1/4) Start epoch 2915 2024-10-21 18:40:04,793 INFO [train.py:561] (1/4) Epoch 2915, batch 6, global_batch_idx: 46630, batch size: 106, loss[dur_loss=0.192, prior_loss=0.9724, diff_loss=0.2895, tot_loss=1.454, over 106.00 samples.], tot_loss[dur_loss=0.1878, prior_loss=0.9719, diff_loss=0.3977, tot_loss=1.557, over 1142.00 samples.], 2024-10-21 18:40:17,885 INFO [train.py:682] (1/4) Start epoch 2916 2024-10-21 18:40:26,698 INFO [train.py:561] (1/4) Epoch 2916, batch 0, global_batch_idx: 46640, batch size: 108, loss[dur_loss=0.1951, prior_loss=0.9729, diff_loss=0.329, tot_loss=1.497, over 108.00 samples.], tot_loss[dur_loss=0.1951, prior_loss=0.9729, diff_loss=0.329, tot_loss=1.497, over 108.00 samples.], 2024-10-21 18:40:40,910 INFO [train.py:561] (1/4) Epoch 2916, batch 10, global_batch_idx: 46650, batch size: 111, loss[dur_loss=0.194, prior_loss=0.9737, diff_loss=0.2634, tot_loss=1.431, over 111.00 samples.], tot_loss[dur_loss=0.1901, prior_loss=0.9722, diff_loss=0.36, tot_loss=1.522, over 1656.00 samples.], 2024-10-21 18:40:47,952 INFO [train.py:682] (1/4) Start epoch 2917 2024-10-21 18:41:02,202 INFO [train.py:561] (1/4) Epoch 2917, batch 4, global_batch_idx: 46660, batch size: 189, loss[dur_loss=0.1915, prior_loss=0.9726, diff_loss=0.3228, tot_loss=1.487, over 189.00 samples.], tot_loss[dur_loss=0.1888, prior_loss=0.9718, diff_loss=0.408, tot_loss=1.569, over 937.00 samples.], 2024-10-21 18:41:17,117 INFO [train.py:561] (1/4) Epoch 2917, batch 14, global_batch_idx: 46670, batch size: 142, loss[dur_loss=0.1941, prior_loss=0.9724, diff_loss=0.2807, tot_loss=1.447, over 142.00 samples.], tot_loss[dur_loss=0.1909, prior_loss=0.9723, diff_loss=0.3451, tot_loss=1.508, over 2210.00 samples.], 2024-10-21 18:41:18,538 INFO [train.py:682] (1/4) Start epoch 2918 2024-10-21 18:41:38,907 INFO [train.py:561] (1/4) Epoch 2918, batch 8, global_batch_idx: 46680, batch size: 170, loss[dur_loss=0.1902, prior_loss=0.9727, diff_loss=0.3079, tot_loss=1.471, over 170.00 samples.], tot_loss[dur_loss=0.1892, prior_loss=0.9721, diff_loss=0.3735, tot_loss=1.535, over 1432.00 samples.], 2024-10-21 18:41:49,048 INFO [train.py:682] (1/4) Start epoch 2919 2024-10-21 18:42:00,668 INFO [train.py:561] (1/4) Epoch 2919, batch 2, global_batch_idx: 46690, batch size: 203, loss[dur_loss=0.194, prior_loss=0.9727, diff_loss=0.3251, tot_loss=1.492, 
over 203.00 samples.], tot_loss[dur_loss=0.1931, prior_loss=0.9727, diff_loss=0.3006, tot_loss=1.466, over 442.00 samples.], 2024-10-21 18:42:15,030 INFO [train.py:561] (1/4) Epoch 2919, batch 12, global_batch_idx: 46700, batch size: 152, loss[dur_loss=0.1912, prior_loss=0.9723, diff_loss=0.2907, tot_loss=1.454, over 152.00 samples.], tot_loss[dur_loss=0.19, prior_loss=0.9723, diff_loss=0.3523, tot_loss=1.515, over 1966.00 samples.], 2024-10-21 18:42:19,484 INFO [train.py:682] (1/4) Start epoch 2920 2024-10-21 18:42:36,926 INFO [train.py:561] (1/4) Epoch 2920, batch 6, global_batch_idx: 46710, batch size: 106, loss[dur_loss=0.1909, prior_loss=0.9723, diff_loss=0.3224, tot_loss=1.486, over 106.00 samples.], tot_loss[dur_loss=0.1884, prior_loss=0.9718, diff_loss=0.4014, tot_loss=1.562, over 1142.00 samples.], 2024-10-21 18:42:50,035 INFO [train.py:682] (1/4) Start epoch 2921 2024-10-21 18:42:58,767 INFO [train.py:561] (1/4) Epoch 2921, batch 0, global_batch_idx: 46720, batch size: 108, loss[dur_loss=0.1935, prior_loss=0.973, diff_loss=0.3231, tot_loss=1.49, over 108.00 samples.], tot_loss[dur_loss=0.1935, prior_loss=0.973, diff_loss=0.3231, tot_loss=1.49, over 108.00 samples.], 2024-10-21 18:43:13,092 INFO [train.py:561] (1/4) Epoch 2921, batch 10, global_batch_idx: 46730, batch size: 111, loss[dur_loss=0.1941, prior_loss=0.9735, diff_loss=0.2695, tot_loss=1.437, over 111.00 samples.], tot_loss[dur_loss=0.1897, prior_loss=0.9721, diff_loss=0.3632, tot_loss=1.525, over 1656.00 samples.], 2024-10-21 18:43:20,255 INFO [train.py:682] (1/4) Start epoch 2922 2024-10-21 18:43:34,817 INFO [train.py:561] (1/4) Epoch 2922, batch 4, global_batch_idx: 46740, batch size: 189, loss[dur_loss=0.189, prior_loss=0.9724, diff_loss=0.3118, tot_loss=1.473, over 189.00 samples.], tot_loss[dur_loss=0.1878, prior_loss=0.9716, diff_loss=0.4047, tot_loss=1.564, over 937.00 samples.], 2024-10-21 18:43:49,787 INFO [train.py:561] (1/4) Epoch 2922, batch 14, global_batch_idx: 46750, batch size: 142, loss[dur_loss=0.1901, prior_loss=0.9722, diff_loss=0.2552, tot_loss=1.418, over 142.00 samples.], tot_loss[dur_loss=0.1897, prior_loss=0.9722, diff_loss=0.3415, tot_loss=1.503, over 2210.00 samples.], 2024-10-21 18:43:51,204 INFO [train.py:682] (1/4) Start epoch 2923 2024-10-21 18:44:11,422 INFO [train.py:561] (1/4) Epoch 2923, batch 8, global_batch_idx: 46760, batch size: 170, loss[dur_loss=0.192, prior_loss=0.9726, diff_loss=0.3176, tot_loss=1.482, over 170.00 samples.], tot_loss[dur_loss=0.1898, prior_loss=0.972, diff_loss=0.3755, tot_loss=1.537, over 1432.00 samples.], 2024-10-21 18:44:21,597 INFO [train.py:682] (1/4) Start epoch 2924 2024-10-21 18:44:33,104 INFO [train.py:561] (1/4) Epoch 2924, batch 2, global_batch_idx: 46770, batch size: 203, loss[dur_loss=0.1956, prior_loss=0.9726, diff_loss=0.3187, tot_loss=1.487, over 203.00 samples.], tot_loss[dur_loss=0.1951, prior_loss=0.9726, diff_loss=0.2938, tot_loss=1.461, over 442.00 samples.], 2024-10-21 18:44:47,373 INFO [train.py:561] (1/4) Epoch 2924, batch 12, global_batch_idx: 46780, batch size: 152, loss[dur_loss=0.1881, prior_loss=0.9724, diff_loss=0.3146, tot_loss=1.475, over 152.00 samples.], tot_loss[dur_loss=0.1906, prior_loss=0.9721, diff_loss=0.3509, tot_loss=1.514, over 1966.00 samples.], 2024-10-21 18:44:51,828 INFO [train.py:682] (1/4) Start epoch 2925 2024-10-21 18:45:09,072 INFO [train.py:561] (1/4) Epoch 2925, batch 6, global_batch_idx: 46790, batch size: 106, loss[dur_loss=0.1908, prior_loss=0.9724, diff_loss=0.3207, tot_loss=1.484, over 106.00 
samples.], tot_loss[dur_loss=0.1884, prior_loss=0.9719, diff_loss=0.3998, tot_loss=1.56, over 1142.00 samples.], 2024-10-21 18:45:22,145 INFO [train.py:682] (1/4) Start epoch 2926 2024-10-21 18:45:31,435 INFO [train.py:561] (1/4) Epoch 2926, batch 0, global_batch_idx: 46800, batch size: 108, loss[dur_loss=0.1936, prior_loss=0.973, diff_loss=0.3, tot_loss=1.467, over 108.00 samples.], tot_loss[dur_loss=0.1936, prior_loss=0.973, diff_loss=0.3, tot_loss=1.467, over 108.00 samples.], 2024-10-21 18:45:45,721 INFO [train.py:561] (1/4) Epoch 2926, batch 10, global_batch_idx: 46810, batch size: 111, loss[dur_loss=0.1928, prior_loss=0.9734, diff_loss=0.3129, tot_loss=1.479, over 111.00 samples.], tot_loss[dur_loss=0.1887, prior_loss=0.972, diff_loss=0.3625, tot_loss=1.523, over 1656.00 samples.], 2024-10-21 18:45:52,795 INFO [train.py:682] (1/4) Start epoch 2927 2024-10-21 18:46:06,671 INFO [train.py:561] (1/4) Epoch 2927, batch 4, global_batch_idx: 46820, batch size: 189, loss[dur_loss=0.1885, prior_loss=0.9723, diff_loss=0.3238, tot_loss=1.485, over 189.00 samples.], tot_loss[dur_loss=0.1889, prior_loss=0.9717, diff_loss=0.4215, tot_loss=1.582, over 937.00 samples.], 2024-10-21 18:46:21,537 INFO [train.py:561] (1/4) Epoch 2927, batch 14, global_batch_idx: 46830, batch size: 142, loss[dur_loss=0.1902, prior_loss=0.9723, diff_loss=0.2897, tot_loss=1.452, over 142.00 samples.], tot_loss[dur_loss=0.1898, prior_loss=0.9721, diff_loss=0.3514, tot_loss=1.513, over 2210.00 samples.], 2024-10-21 18:46:22,953 INFO [train.py:682] (1/4) Start epoch 2928 2024-10-21 18:46:42,929 INFO [train.py:561] (1/4) Epoch 2928, batch 8, global_batch_idx: 46840, batch size: 170, loss[dur_loss=0.1922, prior_loss=0.9728, diff_loss=0.3038, tot_loss=1.469, over 170.00 samples.], tot_loss[dur_loss=0.1889, prior_loss=0.972, diff_loss=0.3678, tot_loss=1.529, over 1432.00 samples.], 2024-10-21 18:46:52,973 INFO [train.py:682] (1/4) Start epoch 2929 2024-10-21 18:47:04,680 INFO [train.py:561] (1/4) Epoch 2929, batch 2, global_batch_idx: 46850, batch size: 203, loss[dur_loss=0.1943, prior_loss=0.9724, diff_loss=0.3437, tot_loss=1.51, over 203.00 samples.], tot_loss[dur_loss=0.1933, prior_loss=0.9725, diff_loss=0.315, tot_loss=1.481, over 442.00 samples.], 2024-10-21 18:47:18,855 INFO [train.py:561] (1/4) Epoch 2929, batch 12, global_batch_idx: 46860, batch size: 152, loss[dur_loss=0.1886, prior_loss=0.9724, diff_loss=0.3029, tot_loss=1.464, over 152.00 samples.], tot_loss[dur_loss=0.189, prior_loss=0.9721, diff_loss=0.3521, tot_loss=1.513, over 1966.00 samples.], 2024-10-21 18:47:23,295 INFO [train.py:682] (1/4) Start epoch 2930 2024-10-21 18:47:40,728 INFO [train.py:561] (1/4) Epoch 2930, batch 6, global_batch_idx: 46870, batch size: 106, loss[dur_loss=0.1895, prior_loss=0.9723, diff_loss=0.2923, tot_loss=1.454, over 106.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9717, diff_loss=0.3984, tot_loss=1.557, over 1142.00 samples.], 2024-10-21 18:47:53,841 INFO [train.py:682] (1/4) Start epoch 2931 2024-10-21 18:48:02,674 INFO [train.py:561] (1/4) Epoch 2931, batch 0, global_batch_idx: 46880, batch size: 108, loss[dur_loss=0.1936, prior_loss=0.9729, diff_loss=0.3146, tot_loss=1.481, over 108.00 samples.], tot_loss[dur_loss=0.1936, prior_loss=0.9729, diff_loss=0.3146, tot_loss=1.481, over 108.00 samples.], 2024-10-21 18:48:17,016 INFO [train.py:561] (1/4) Epoch 2931, batch 10, global_batch_idx: 46890, batch size: 111, loss[dur_loss=0.1901, prior_loss=0.9732, diff_loss=0.2535, tot_loss=1.417, over 111.00 samples.], 
tot_loss[dur_loss=0.1887, prior_loss=0.9719, diff_loss=0.3658, tot_loss=1.526, over 1656.00 samples.], 2024-10-21 18:48:24,129 INFO [train.py:682] (1/4) Start epoch 2932 2024-10-21 18:48:37,945 INFO [train.py:561] (1/4) Epoch 2932, batch 4, global_batch_idx: 46900, batch size: 189, loss[dur_loss=0.1904, prior_loss=0.9725, diff_loss=0.3159, tot_loss=1.479, over 189.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9715, diff_loss=0.4024, tot_loss=1.562, over 937.00 samples.], 2024-10-21 18:48:52,913 INFO [train.py:561] (1/4) Epoch 2932, batch 14, global_batch_idx: 46910, batch size: 142, loss[dur_loss=0.1928, prior_loss=0.9722, diff_loss=0.2809, tot_loss=1.446, over 142.00 samples.], tot_loss[dur_loss=0.1894, prior_loss=0.9721, diff_loss=0.3469, tot_loss=1.508, over 2210.00 samples.], 2024-10-21 18:48:54,367 INFO [train.py:682] (1/4) Start epoch 2933 2024-10-21 18:49:14,900 INFO [train.py:561] (1/4) Epoch 2933, batch 8, global_batch_idx: 46920, batch size: 170, loss[dur_loss=0.1927, prior_loss=0.9723, diff_loss=0.3295, tot_loss=1.495, over 170.00 samples.], tot_loss[dur_loss=0.1893, prior_loss=0.9719, diff_loss=0.379, tot_loss=1.54, over 1432.00 samples.], 2024-10-21 18:49:25,131 INFO [train.py:682] (1/4) Start epoch 2934 2024-10-21 18:49:36,708 INFO [train.py:561] (1/4) Epoch 2934, batch 2, global_batch_idx: 46930, batch size: 203, loss[dur_loss=0.191, prior_loss=0.9725, diff_loss=0.3237, tot_loss=1.487, over 203.00 samples.], tot_loss[dur_loss=0.1916, prior_loss=0.9725, diff_loss=0.3029, tot_loss=1.467, over 442.00 samples.], 2024-10-21 18:49:51,142 INFO [train.py:561] (1/4) Epoch 2934, batch 12, global_batch_idx: 46940, batch size: 152, loss[dur_loss=0.1873, prior_loss=0.9725, diff_loss=0.307, tot_loss=1.467, over 152.00 samples.], tot_loss[dur_loss=0.1889, prior_loss=0.972, diff_loss=0.3548, tot_loss=1.516, over 1966.00 samples.], 2024-10-21 18:49:55,697 INFO [train.py:682] (1/4) Start epoch 2935 2024-10-21 18:50:12,852 INFO [train.py:561] (1/4) Epoch 2935, batch 6, global_batch_idx: 46950, batch size: 106, loss[dur_loss=0.1906, prior_loss=0.9724, diff_loss=0.3043, tot_loss=1.467, over 106.00 samples.], tot_loss[dur_loss=0.1894, prior_loss=0.9718, diff_loss=0.3915, tot_loss=1.553, over 1142.00 samples.], 2024-10-21 18:50:25,972 INFO [train.py:682] (1/4) Start epoch 2936 2024-10-21 18:50:35,219 INFO [train.py:561] (1/4) Epoch 2936, batch 0, global_batch_idx: 46960, batch size: 108, loss[dur_loss=0.1937, prior_loss=0.9728, diff_loss=0.2722, tot_loss=1.439, over 108.00 samples.], tot_loss[dur_loss=0.1937, prior_loss=0.9728, diff_loss=0.2722, tot_loss=1.439, over 108.00 samples.], 2024-10-21 18:50:49,542 INFO [train.py:561] (1/4) Epoch 2936, batch 10, global_batch_idx: 46970, batch size: 111, loss[dur_loss=0.1902, prior_loss=0.9733, diff_loss=0.2769, tot_loss=1.44, over 111.00 samples.], tot_loss[dur_loss=0.1894, prior_loss=0.972, diff_loss=0.3511, tot_loss=1.513, over 1656.00 samples.], 2024-10-21 18:50:56,678 INFO [train.py:682] (1/4) Start epoch 2937 2024-10-21 18:51:10,570 INFO [train.py:561] (1/4) Epoch 2937, batch 4, global_batch_idx: 46980, batch size: 189, loss[dur_loss=0.1913, prior_loss=0.9721, diff_loss=0.3093, tot_loss=1.473, over 189.00 samples.], tot_loss[dur_loss=0.1891, prior_loss=0.9715, diff_loss=0.4138, tot_loss=1.574, over 937.00 samples.], 2024-10-21 18:51:25,533 INFO [train.py:561] (1/4) Epoch 2937, batch 14, global_batch_idx: 46990, batch size: 142, loss[dur_loss=0.1941, prior_loss=0.9722, diff_loss=0.3116, tot_loss=1.478, over 142.00 samples.], 
tot_loss[dur_loss=0.1901, prior_loss=0.9721, diff_loss=0.3567, tot_loss=1.519, over 2210.00 samples.], 2024-10-21 18:51:26,967 INFO [train.py:682] (1/4) Start epoch 2938 2024-10-21 18:51:47,015 INFO [train.py:561] (1/4) Epoch 2938, batch 8, global_batch_idx: 47000, batch size: 170, loss[dur_loss=0.1956, prior_loss=0.9726, diff_loss=0.293, tot_loss=1.461, over 170.00 samples.], tot_loss[dur_loss=0.1891, prior_loss=0.9719, diff_loss=0.3666, tot_loss=1.528, over 1432.00 samples.], 2024-10-21 18:51:57,169 INFO [train.py:682] (1/4) Start epoch 2939 2024-10-21 18:52:08,825 INFO [train.py:561] (1/4) Epoch 2939, batch 2, global_batch_idx: 47010, batch size: 203, loss[dur_loss=0.1897, prior_loss=0.9721, diff_loss=0.3147, tot_loss=1.476, over 203.00 samples.], tot_loss[dur_loss=0.1901, prior_loss=0.9722, diff_loss=0.3099, tot_loss=1.472, over 442.00 samples.], 2024-10-21 18:52:23,090 INFO [train.py:561] (1/4) Epoch 2939, batch 12, global_batch_idx: 47020, batch size: 152, loss[dur_loss=0.1914, prior_loss=0.9727, diff_loss=0.2733, tot_loss=1.437, over 152.00 samples.], tot_loss[dur_loss=0.1893, prior_loss=0.9721, diff_loss=0.3539, tot_loss=1.515, over 1966.00 samples.], 2024-10-21 18:52:27,611 INFO [train.py:682] (1/4) Start epoch 2940 2024-10-21 18:52:44,625 INFO [train.py:561] (1/4) Epoch 2940, batch 6, global_batch_idx: 47030, batch size: 106, loss[dur_loss=0.188, prior_loss=0.972, diff_loss=0.2722, tot_loss=1.432, over 106.00 samples.], tot_loss[dur_loss=0.1879, prior_loss=0.9717, diff_loss=0.3865, tot_loss=1.546, over 1142.00 samples.], 2024-10-21 18:52:57,704 INFO [train.py:682] (1/4) Start epoch 2941 2024-10-21 18:53:06,910 INFO [train.py:561] (1/4) Epoch 2941, batch 0, global_batch_idx: 47040, batch size: 108, loss[dur_loss=0.1925, prior_loss=0.9729, diff_loss=0.3184, tot_loss=1.484, over 108.00 samples.], tot_loss[dur_loss=0.1925, prior_loss=0.9729, diff_loss=0.3184, tot_loss=1.484, over 108.00 samples.], 2024-10-21 18:53:21,200 INFO [train.py:561] (1/4) Epoch 2941, batch 10, global_batch_idx: 47050, batch size: 111, loss[dur_loss=0.1904, prior_loss=0.9733, diff_loss=0.2972, tot_loss=1.461, over 111.00 samples.], tot_loss[dur_loss=0.188, prior_loss=0.972, diff_loss=0.3598, tot_loss=1.52, over 1656.00 samples.], 2024-10-21 18:53:28,325 INFO [train.py:682] (1/4) Start epoch 2942 2024-10-21 18:53:42,284 INFO [train.py:561] (1/4) Epoch 2942, batch 4, global_batch_idx: 47060, batch size: 189, loss[dur_loss=0.1928, prior_loss=0.9723, diff_loss=0.3072, tot_loss=1.472, over 189.00 samples.], tot_loss[dur_loss=0.1884, prior_loss=0.9715, diff_loss=0.4107, tot_loss=1.571, over 937.00 samples.], 2024-10-21 18:53:57,146 INFO [train.py:561] (1/4) Epoch 2942, batch 14, global_batch_idx: 47070, batch size: 142, loss[dur_loss=0.1905, prior_loss=0.9721, diff_loss=0.2902, tot_loss=1.453, over 142.00 samples.], tot_loss[dur_loss=0.1896, prior_loss=0.9721, diff_loss=0.3493, tot_loss=1.511, over 2210.00 samples.], 2024-10-21 18:53:58,566 INFO [train.py:682] (1/4) Start epoch 2943 2024-10-21 18:54:18,609 INFO [train.py:561] (1/4) Epoch 2943, batch 8, global_batch_idx: 47080, batch size: 170, loss[dur_loss=0.1929, prior_loss=0.9724, diff_loss=0.3152, tot_loss=1.48, over 170.00 samples.], tot_loss[dur_loss=0.1902, prior_loss=0.9718, diff_loss=0.3755, tot_loss=1.538, over 1432.00 samples.], 2024-10-21 18:54:28,741 INFO [train.py:682] (1/4) Start epoch 2944 2024-10-21 18:54:40,207 INFO [train.py:561] (1/4) Epoch 2944, batch 2, global_batch_idx: 47090, batch size: 203, loss[dur_loss=0.1925, prior_loss=0.9723, 
diff_loss=0.29, tot_loss=1.455, over 203.00 samples.], tot_loss[dur_loss=0.1926, prior_loss=0.9723, diff_loss=0.3098, tot_loss=1.475, over 442.00 samples.], 2024-10-21 18:54:54,508 INFO [train.py:561] (1/4) Epoch 2944, batch 12, global_batch_idx: 47100, batch size: 152, loss[dur_loss=0.19, prior_loss=0.9725, diff_loss=0.3032, tot_loss=1.466, over 152.00 samples.], tot_loss[dur_loss=0.1895, prior_loss=0.972, diff_loss=0.3599, tot_loss=1.521, over 1966.00 samples.], 2024-10-21 18:54:58,976 INFO [train.py:682] (1/4) Start epoch 2945 2024-10-21 18:55:16,197 INFO [train.py:561] (1/4) Epoch 2945, batch 6, global_batch_idx: 47110, batch size: 106, loss[dur_loss=0.1914, prior_loss=0.9723, diff_loss=0.3047, tot_loss=1.468, over 106.00 samples.], tot_loss[dur_loss=0.1873, prior_loss=0.9717, diff_loss=0.405, tot_loss=1.564, over 1142.00 samples.], 2024-10-21 18:55:29,331 INFO [train.py:682] (1/4) Start epoch 2946 2024-10-21 18:55:38,014 INFO [train.py:561] (1/4) Epoch 2946, batch 0, global_batch_idx: 47120, batch size: 108, loss[dur_loss=0.1939, prior_loss=0.9728, diff_loss=0.3008, tot_loss=1.467, over 108.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.9728, diff_loss=0.3008, tot_loss=1.467, over 108.00 samples.], 2024-10-21 18:55:52,289 INFO [train.py:561] (1/4) Epoch 2946, batch 10, global_batch_idx: 47130, batch size: 111, loss[dur_loss=0.1917, prior_loss=0.9731, diff_loss=0.2929, tot_loss=1.458, over 111.00 samples.], tot_loss[dur_loss=0.1879, prior_loss=0.9719, diff_loss=0.3555, tot_loss=1.515, over 1656.00 samples.], 2024-10-21 18:55:59,335 INFO [train.py:682] (1/4) Start epoch 2947 2024-10-21 18:56:13,017 INFO [train.py:561] (1/4) Epoch 2947, batch 4, global_batch_idx: 47140, batch size: 189, loss[dur_loss=0.1912, prior_loss=0.9723, diff_loss=0.345, tot_loss=1.509, over 189.00 samples.], tot_loss[dur_loss=0.1866, prior_loss=0.9715, diff_loss=0.4229, tot_loss=1.581, over 937.00 samples.], 2024-10-21 18:56:27,941 INFO [train.py:561] (1/4) Epoch 2947, batch 14, global_batch_idx: 47150, batch size: 142, loss[dur_loss=0.1893, prior_loss=0.972, diff_loss=0.2893, tot_loss=1.451, over 142.00 samples.], tot_loss[dur_loss=0.189, prior_loss=0.972, diff_loss=0.3481, tot_loss=1.509, over 2210.00 samples.], 2024-10-21 18:56:29,365 INFO [train.py:682] (1/4) Start epoch 2948 2024-10-21 18:56:49,527 INFO [train.py:561] (1/4) Epoch 2948, batch 8, global_batch_idx: 47160, batch size: 170, loss[dur_loss=0.1909, prior_loss=0.9723, diff_loss=0.289, tot_loss=1.452, over 170.00 samples.], tot_loss[dur_loss=0.1884, prior_loss=0.9718, diff_loss=0.3885, tot_loss=1.549, over 1432.00 samples.], 2024-10-21 18:56:59,674 INFO [train.py:682] (1/4) Start epoch 2949 2024-10-21 18:57:11,128 INFO [train.py:561] (1/4) Epoch 2949, batch 2, global_batch_idx: 47170, batch size: 203, loss[dur_loss=0.194, prior_loss=0.9723, diff_loss=0.3107, tot_loss=1.477, over 203.00 samples.], tot_loss[dur_loss=0.1928, prior_loss=0.9723, diff_loss=0.3064, tot_loss=1.472, over 442.00 samples.], 2024-10-21 18:57:25,465 INFO [train.py:561] (1/4) Epoch 2949, batch 12, global_batch_idx: 47180, batch size: 152, loss[dur_loss=0.1911, prior_loss=0.9724, diff_loss=0.2901, tot_loss=1.454, over 152.00 samples.], tot_loss[dur_loss=0.1894, prior_loss=0.9719, diff_loss=0.3504, tot_loss=1.512, over 1966.00 samples.], 2024-10-21 18:57:29,953 INFO [train.py:682] (1/4) Start epoch 2950 2024-10-21 18:57:47,024 INFO [train.py:561] (1/4) Epoch 2950, batch 6, global_batch_idx: 47190, batch size: 106, loss[dur_loss=0.1894, prior_loss=0.9721, diff_loss=0.2683, 
tot_loss=1.43, over 106.00 samples.], tot_loss[dur_loss=0.1883, prior_loss=0.9716, diff_loss=0.392, tot_loss=1.552, over 1142.00 samples.], 2024-10-21 18:58:00,219 INFO [train.py:682] (1/4) Start epoch 2951 2024-10-21 18:58:08,854 INFO [train.py:561] (1/4) Epoch 2951, batch 0, global_batch_idx: 47200, batch size: 108, loss[dur_loss=0.1947, prior_loss=0.9729, diff_loss=0.2471, tot_loss=1.415, over 108.00 samples.], tot_loss[dur_loss=0.1947, prior_loss=0.9729, diff_loss=0.2471, tot_loss=1.415, over 108.00 samples.], 2024-10-21 18:58:23,216 INFO [train.py:561] (1/4) Epoch 2951, batch 10, global_batch_idx: 47210, batch size: 111, loss[dur_loss=0.194, prior_loss=0.9731, diff_loss=0.2882, tot_loss=1.455, over 111.00 samples.], tot_loss[dur_loss=0.1889, prior_loss=0.9719, diff_loss=0.3573, tot_loss=1.518, over 1656.00 samples.], 2024-10-21 18:58:30,374 INFO [train.py:682] (1/4) Start epoch 2952 2024-10-21 18:58:44,589 INFO [train.py:561] (1/4) Epoch 2952, batch 4, global_batch_idx: 47220, batch size: 189, loss[dur_loss=0.1905, prior_loss=0.9724, diff_loss=0.3282, tot_loss=1.491, over 189.00 samples.], tot_loss[dur_loss=0.188, prior_loss=0.9714, diff_loss=0.4218, tot_loss=1.581, over 937.00 samples.], 2024-10-21 18:58:59,630 INFO [train.py:561] (1/4) Epoch 2952, batch 14, global_batch_idx: 47230, batch size: 142, loss[dur_loss=0.1893, prior_loss=0.9719, diff_loss=0.3228, tot_loss=1.484, over 142.00 samples.], tot_loss[dur_loss=0.1895, prior_loss=0.972, diff_loss=0.355, tot_loss=1.517, over 2210.00 samples.], 2024-10-21 18:59:01,073 INFO [train.py:682] (1/4) Start epoch 2953 2024-10-21 18:59:21,167 INFO [train.py:561] (1/4) Epoch 2953, batch 8, global_batch_idx: 47240, batch size: 170, loss[dur_loss=0.1915, prior_loss=0.9722, diff_loss=0.2945, tot_loss=1.458, over 170.00 samples.], tot_loss[dur_loss=0.1888, prior_loss=0.9718, diff_loss=0.3713, tot_loss=1.532, over 1432.00 samples.], 2024-10-21 18:59:31,356 INFO [train.py:682] (1/4) Start epoch 2954 2024-10-21 18:59:42,783 INFO [train.py:561] (1/4) Epoch 2954, batch 2, global_batch_idx: 47250, batch size: 203, loss[dur_loss=0.1922, prior_loss=0.9725, diff_loss=0.3481, tot_loss=1.513, over 203.00 samples.], tot_loss[dur_loss=0.1931, prior_loss=0.9724, diff_loss=0.3206, tot_loss=1.486, over 442.00 samples.], 2024-10-21 18:59:57,000 INFO [train.py:561] (1/4) Epoch 2954, batch 12, global_batch_idx: 47260, batch size: 152, loss[dur_loss=0.1888, prior_loss=0.9725, diff_loss=0.2733, tot_loss=1.435, over 152.00 samples.], tot_loss[dur_loss=0.1886, prior_loss=0.972, diff_loss=0.3547, tot_loss=1.515, over 1966.00 samples.], 2024-10-21 19:00:01,446 INFO [train.py:682] (1/4) Start epoch 2955 2024-10-21 19:00:18,756 INFO [train.py:561] (1/4) Epoch 2955, batch 6, global_batch_idx: 47270, batch size: 106, loss[dur_loss=0.1881, prior_loss=0.9722, diff_loss=0.3177, tot_loss=1.478, over 106.00 samples.], tot_loss[dur_loss=0.1885, prior_loss=0.9717, diff_loss=0.3975, tot_loss=1.558, over 1142.00 samples.], 2024-10-21 19:00:31,873 INFO [train.py:682] (1/4) Start epoch 2956 2024-10-21 19:00:40,696 INFO [train.py:561] (1/4) Epoch 2956, batch 0, global_batch_idx: 47280, batch size: 108, loss[dur_loss=0.1942, prior_loss=0.9728, diff_loss=0.3046, tot_loss=1.472, over 108.00 samples.], tot_loss[dur_loss=0.1942, prior_loss=0.9728, diff_loss=0.3046, tot_loss=1.472, over 108.00 samples.], 2024-10-21 19:00:55,000 INFO [train.py:561] (1/4) Epoch 2956, batch 10, global_batch_idx: 47290, batch size: 111, loss[dur_loss=0.1908, prior_loss=0.9731, diff_loss=0.2623, tot_loss=1.426, 
over 111.00 samples.], tot_loss[dur_loss=0.1878, prior_loss=0.9719, diff_loss=0.3663, tot_loss=1.526, over 1656.00 samples.], 2024-10-21 19:01:02,120 INFO [train.py:682] (1/4) Start epoch 2957 2024-10-21 19:01:15,862 INFO [train.py:561] (1/4) Epoch 2957, batch 4, global_batch_idx: 47300, batch size: 189, loss[dur_loss=0.1906, prior_loss=0.9722, diff_loss=0.3398, tot_loss=1.503, over 189.00 samples.], tot_loss[dur_loss=0.1869, prior_loss=0.9716, diff_loss=0.4064, tot_loss=1.565, over 937.00 samples.], 2024-10-21 19:01:30,912 INFO [train.py:561] (1/4) Epoch 2957, batch 14, global_batch_idx: 47310, batch size: 142, loss[dur_loss=0.1898, prior_loss=0.9721, diff_loss=0.2846, tot_loss=1.446, over 142.00 samples.], tot_loss[dur_loss=0.189, prior_loss=0.9721, diff_loss=0.3423, tot_loss=1.503, over 2210.00 samples.], 2024-10-21 19:01:32,361 INFO [train.py:682] (1/4) Start epoch 2958 2024-10-21 19:01:52,499 INFO [train.py:561] (1/4) Epoch 2958, batch 8, global_batch_idx: 47320, batch size: 170, loss[dur_loss=0.193, prior_loss=0.9725, diff_loss=0.3443, tot_loss=1.51, over 170.00 samples.], tot_loss[dur_loss=0.1881, prior_loss=0.9719, diff_loss=0.381, tot_loss=1.541, over 1432.00 samples.], 2024-10-21 19:02:02,680 INFO [train.py:682] (1/4) Start epoch 2959 2024-10-21 19:02:13,942 INFO [train.py:561] (1/4) Epoch 2959, batch 2, global_batch_idx: 47330, batch size: 203, loss[dur_loss=0.1922, prior_loss=0.9723, diff_loss=0.3523, tot_loss=1.517, over 203.00 samples.], tot_loss[dur_loss=0.1913, prior_loss=0.9723, diff_loss=0.3256, tot_loss=1.489, over 442.00 samples.], 2024-10-21 19:02:28,478 INFO [train.py:561] (1/4) Epoch 2959, batch 12, global_batch_idx: 47340, batch size: 152, loss[dur_loss=0.1896, prior_loss=0.9724, diff_loss=0.2871, tot_loss=1.449, over 152.00 samples.], tot_loss[dur_loss=0.1885, prior_loss=0.972, diff_loss=0.3523, tot_loss=1.513, over 1966.00 samples.], 2024-10-21 19:02:32,946 INFO [train.py:682] (1/4) Start epoch 2960 2024-10-21 19:02:50,165 INFO [train.py:561] (1/4) Epoch 2960, batch 6, global_batch_idx: 47350, batch size: 106, loss[dur_loss=0.1898, prior_loss=0.9722, diff_loss=0.2781, tot_loss=1.44, over 106.00 samples.], tot_loss[dur_loss=0.1875, prior_loss=0.9717, diff_loss=0.3901, tot_loss=1.549, over 1142.00 samples.], 2024-10-21 19:03:03,279 INFO [train.py:682] (1/4) Start epoch 2961 2024-10-21 19:03:11,951 INFO [train.py:561] (1/4) Epoch 2961, batch 0, global_batch_idx: 47360, batch size: 108, loss[dur_loss=0.1925, prior_loss=0.9727, diff_loss=0.3187, tot_loss=1.484, over 108.00 samples.], tot_loss[dur_loss=0.1925, prior_loss=0.9727, diff_loss=0.3187, tot_loss=1.484, over 108.00 samples.], 2024-10-21 19:03:26,138 INFO [train.py:561] (1/4) Epoch 2961, batch 10, global_batch_idx: 47370, batch size: 111, loss[dur_loss=0.1905, prior_loss=0.9733, diff_loss=0.3097, tot_loss=1.473, over 111.00 samples.], tot_loss[dur_loss=0.1882, prior_loss=0.9719, diff_loss=0.3675, tot_loss=1.528, over 1656.00 samples.], 2024-10-21 19:03:33,258 INFO [train.py:682] (1/4) Start epoch 2962 2024-10-21 19:03:46,964 INFO [train.py:561] (1/4) Epoch 2962, batch 4, global_batch_idx: 47380, batch size: 189, loss[dur_loss=0.1914, prior_loss=0.9725, diff_loss=0.2976, tot_loss=1.461, over 189.00 samples.], tot_loss[dur_loss=0.1879, prior_loss=0.9716, diff_loss=0.42, tot_loss=1.58, over 937.00 samples.], 2024-10-21 19:04:01,901 INFO [train.py:561] (1/4) Epoch 2962, batch 14, global_batch_idx: 47390, batch size: 142, loss[dur_loss=0.1916, prior_loss=0.972, diff_loss=0.2942, tot_loss=1.458, over 142.00 samples.], 
tot_loss[dur_loss=0.1899, prior_loss=0.9721, diff_loss=0.359, tot_loss=1.521, over 2210.00 samples.], 2024-10-21 19:04:03,338 INFO [train.py:682] (1/4) Start epoch 2963 2024-10-21 19:04:23,487 INFO [train.py:561] (1/4) Epoch 2963, batch 8, global_batch_idx: 47400, batch size: 170, loss[dur_loss=0.1944, prior_loss=0.9726, diff_loss=0.3166, tot_loss=1.484, over 170.00 samples.], tot_loss[dur_loss=0.188, prior_loss=0.9719, diff_loss=0.373, tot_loss=1.533, over 1432.00 samples.], 2024-10-21 19:04:33,797 INFO [train.py:682] (1/4) Start epoch 2964 2024-10-21 19:04:45,469 INFO [train.py:561] (1/4) Epoch 2964, batch 2, global_batch_idx: 47410, batch size: 203, loss[dur_loss=0.1928, prior_loss=0.9723, diff_loss=0.3254, tot_loss=1.491, over 203.00 samples.], tot_loss[dur_loss=0.1926, prior_loss=0.9724, diff_loss=0.3059, tot_loss=1.471, over 442.00 samples.], 2024-10-21 19:04:59,847 INFO [train.py:561] (1/4) Epoch 2964, batch 12, global_batch_idx: 47420, batch size: 152, loss[dur_loss=0.1894, prior_loss=0.9727, diff_loss=0.3099, tot_loss=1.472, over 152.00 samples.], tot_loss[dur_loss=0.1892, prior_loss=0.9721, diff_loss=0.3525, tot_loss=1.514, over 1966.00 samples.], 2024-10-21 19:05:04,350 INFO [train.py:682] (1/4) Start epoch 2965 2024-10-21 19:05:21,429 INFO [train.py:561] (1/4) Epoch 2965, batch 6, global_batch_idx: 47430, batch size: 106, loss[dur_loss=0.1888, prior_loss=0.9721, diff_loss=0.2486, tot_loss=1.41, over 106.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9718, diff_loss=0.3848, tot_loss=1.544, over 1142.00 samples.], 2024-10-21 19:05:34,633 INFO [train.py:682] (1/4) Start epoch 2966 2024-10-21 19:05:43,319 INFO [train.py:561] (1/4) Epoch 2966, batch 0, global_batch_idx: 47440, batch size: 108, loss[dur_loss=0.1946, prior_loss=0.9728, diff_loss=0.3169, tot_loss=1.484, over 108.00 samples.], tot_loss[dur_loss=0.1946, prior_loss=0.9728, diff_loss=0.3169, tot_loss=1.484, over 108.00 samples.], 2024-10-21 19:05:57,614 INFO [train.py:561] (1/4) Epoch 2966, batch 10, global_batch_idx: 47450, batch size: 111, loss[dur_loss=0.1924, prior_loss=0.9734, diff_loss=0.3092, tot_loss=1.475, over 111.00 samples.], tot_loss[dur_loss=0.1889, prior_loss=0.972, diff_loss=0.3677, tot_loss=1.529, over 1656.00 samples.], 2024-10-21 19:06:04,735 INFO [train.py:682] (1/4) Start epoch 2967 2024-10-21 19:06:18,806 INFO [train.py:561] (1/4) Epoch 2967, batch 4, global_batch_idx: 47460, batch size: 189, loss[dur_loss=0.1896, prior_loss=0.9722, diff_loss=0.3231, tot_loss=1.485, over 189.00 samples.], tot_loss[dur_loss=0.1878, prior_loss=0.9715, diff_loss=0.4033, tot_loss=1.563, over 937.00 samples.], 2024-10-21 19:06:33,827 INFO [train.py:561] (1/4) Epoch 2967, batch 14, global_batch_idx: 47470, batch size: 142, loss[dur_loss=0.1925, prior_loss=0.972, diff_loss=0.2994, tot_loss=1.464, over 142.00 samples.], tot_loss[dur_loss=0.1895, prior_loss=0.972, diff_loss=0.3373, tot_loss=1.499, over 2210.00 samples.], 2024-10-21 19:06:35,254 INFO [train.py:682] (1/4) Start epoch 2968 2024-10-21 19:06:55,231 INFO [train.py:561] (1/4) Epoch 2968, batch 8, global_batch_idx: 47480, batch size: 170, loss[dur_loss=0.1913, prior_loss=0.9724, diff_loss=0.3177, tot_loss=1.481, over 170.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9718, diff_loss=0.3724, tot_loss=1.532, over 1432.00 samples.], 2024-10-21 19:07:05,365 INFO [train.py:682] (1/4) Start epoch 2969 2024-10-21 19:07:16,661 INFO [train.py:561] (1/4) Epoch 2969, batch 2, global_batch_idx: 47490, batch size: 203, loss[dur_loss=0.1942, prior_loss=0.9722, 
diff_loss=0.3162, tot_loss=1.483, over 203.00 samples.], tot_loss[dur_loss=0.1919, prior_loss=0.9724, diff_loss=0.3147, tot_loss=1.479, over 442.00 samples.], 2024-10-21 19:07:30,928 INFO [train.py:561] (1/4) Epoch 2969, batch 12, global_batch_idx: 47500, batch size: 152, loss[dur_loss=0.1891, prior_loss=0.9723, diff_loss=0.2934, tot_loss=1.455, over 152.00 samples.], tot_loss[dur_loss=0.1894, prior_loss=0.972, diff_loss=0.357, tot_loss=1.518, over 1966.00 samples.], 2024-10-21 19:07:35,370 INFO [train.py:682] (1/4) Start epoch 2970 2024-10-21 19:07:52,293 INFO [train.py:561] (1/4) Epoch 2970, batch 6, global_batch_idx: 47510, batch size: 106, loss[dur_loss=0.1885, prior_loss=0.972, diff_loss=0.2733, tot_loss=1.434, over 106.00 samples.], tot_loss[dur_loss=0.1881, prior_loss=0.9716, diff_loss=0.3981, tot_loss=1.558, over 1142.00 samples.], 2024-10-21 19:08:05,361 INFO [train.py:682] (1/4) Start epoch 2971 2024-10-21 19:08:14,532 INFO [train.py:561] (1/4) Epoch 2971, batch 0, global_batch_idx: 47520, batch size: 108, loss[dur_loss=0.1926, prior_loss=0.9727, diff_loss=0.2742, tot_loss=1.439, over 108.00 samples.], tot_loss[dur_loss=0.1926, prior_loss=0.9727, diff_loss=0.2742, tot_loss=1.439, over 108.00 samples.], 2024-10-21 19:08:28,721 INFO [train.py:561] (1/4) Epoch 2971, batch 10, global_batch_idx: 47530, batch size: 111, loss[dur_loss=0.1913, prior_loss=0.9731, diff_loss=0.2725, tot_loss=1.437, over 111.00 samples.], tot_loss[dur_loss=0.1884, prior_loss=0.9718, diff_loss=0.37, tot_loss=1.53, over 1656.00 samples.], 2024-10-21 19:08:35,873 INFO [train.py:682] (1/4) Start epoch 2972 2024-10-21 19:08:49,463 INFO [train.py:561] (1/4) Epoch 2972, batch 4, global_batch_idx: 47540, batch size: 189, loss[dur_loss=0.1889, prior_loss=0.9722, diff_loss=0.3379, tot_loss=1.499, over 189.00 samples.], tot_loss[dur_loss=0.1868, prior_loss=0.9714, diff_loss=0.4184, tot_loss=1.577, over 937.00 samples.], 2024-10-21 19:09:04,391 INFO [train.py:561] (1/4) Epoch 2972, batch 14, global_batch_idx: 47550, batch size: 142, loss[dur_loss=0.1896, prior_loss=0.972, diff_loss=0.3067, tot_loss=1.468, over 142.00 samples.], tot_loss[dur_loss=0.1887, prior_loss=0.972, diff_loss=0.3522, tot_loss=1.513, over 2210.00 samples.], 2024-10-21 19:09:05,816 INFO [train.py:682] (1/4) Start epoch 2973 2024-10-21 19:09:25,882 INFO [train.py:561] (1/4) Epoch 2973, batch 8, global_batch_idx: 47560, batch size: 170, loss[dur_loss=0.1918, prior_loss=0.9721, diff_loss=0.3047, tot_loss=1.469, over 170.00 samples.], tot_loss[dur_loss=0.188, prior_loss=0.9717, diff_loss=0.3871, tot_loss=1.547, over 1432.00 samples.], 2024-10-21 19:09:36,086 INFO [train.py:682] (1/4) Start epoch 2974 2024-10-21 19:09:47,633 INFO [train.py:561] (1/4) Epoch 2974, batch 2, global_batch_idx: 47570, batch size: 203, loss[dur_loss=0.191, prior_loss=0.9721, diff_loss=0.3428, tot_loss=1.506, over 203.00 samples.], tot_loss[dur_loss=0.1899, prior_loss=0.9722, diff_loss=0.3249, tot_loss=1.487, over 442.00 samples.], 2024-10-21 19:10:01,846 INFO [train.py:561] (1/4) Epoch 2974, batch 12, global_batch_idx: 47580, batch size: 152, loss[dur_loss=0.1885, prior_loss=0.9724, diff_loss=0.3152, tot_loss=1.476, over 152.00 samples.], tot_loss[dur_loss=0.188, prior_loss=0.9719, diff_loss=0.3607, tot_loss=1.521, over 1966.00 samples.], 2024-10-21 19:10:06,384 INFO [train.py:682] (1/4) Start epoch 2975 2024-10-21 19:10:23,566 INFO [train.py:561] (1/4) Epoch 2975, batch 6, global_batch_idx: 47590, batch size: 106, loss[dur_loss=0.189, prior_loss=0.9722, diff_loss=0.2803, 
tot_loss=1.442, over 106.00 samples.], tot_loss[dur_loss=0.1878, prior_loss=0.9716, diff_loss=0.3948, tot_loss=1.554, over 1142.00 samples.], 2024-10-21 19:10:36,628 INFO [train.py:682] (1/4) Start epoch 2976 2024-10-21 19:10:45,320 INFO [train.py:561] (1/4) Epoch 2976, batch 0, global_batch_idx: 47600, batch size: 108, loss[dur_loss=0.1906, prior_loss=0.9727, diff_loss=0.2992, tot_loss=1.463, over 108.00 samples.], tot_loss[dur_loss=0.1906, prior_loss=0.9727, diff_loss=0.2992, tot_loss=1.463, over 108.00 samples.], 2024-10-21 19:10:59,471 INFO [train.py:561] (1/4) Epoch 2976, batch 10, global_batch_idx: 47610, batch size: 111, loss[dur_loss=0.1928, prior_loss=0.9735, diff_loss=0.3261, tot_loss=1.492, over 111.00 samples.], tot_loss[dur_loss=0.189, prior_loss=0.9719, diff_loss=0.3707, tot_loss=1.532, over 1656.00 samples.], 2024-10-21 19:11:06,613 INFO [train.py:682] (1/4) Start epoch 2977 2024-10-21 19:11:20,246 INFO [train.py:561] (1/4) Epoch 2977, batch 4, global_batch_idx: 47620, batch size: 189, loss[dur_loss=0.1908, prior_loss=0.9723, diff_loss=0.3009, tot_loss=1.464, over 189.00 samples.], tot_loss[dur_loss=0.188, prior_loss=0.9716, diff_loss=0.4251, tot_loss=1.585, over 937.00 samples.], 2024-10-21 19:11:35,220 INFO [train.py:561] (1/4) Epoch 2977, batch 14, global_batch_idx: 47630, batch size: 142, loss[dur_loss=0.1891, prior_loss=0.9719, diff_loss=0.2849, tot_loss=1.446, over 142.00 samples.], tot_loss[dur_loss=0.1895, prior_loss=0.9721, diff_loss=0.3492, tot_loss=1.511, over 2210.00 samples.], 2024-10-21 19:11:36,662 INFO [train.py:682] (1/4) Start epoch 2978 2024-10-21 19:11:56,501 INFO [train.py:561] (1/4) Epoch 2978, batch 8, global_batch_idx: 47640, batch size: 170, loss[dur_loss=0.191, prior_loss=0.9724, diff_loss=0.2954, tot_loss=1.459, over 170.00 samples.], tot_loss[dur_loss=0.1891, prior_loss=0.9718, diff_loss=0.3664, tot_loss=1.527, over 1432.00 samples.], 2024-10-21 19:12:06,646 INFO [train.py:682] (1/4) Start epoch 2979 2024-10-21 19:12:18,599 INFO [train.py:561] (1/4) Epoch 2979, batch 2, global_batch_idx: 47650, batch size: 203, loss[dur_loss=0.1953, prior_loss=0.9725, diff_loss=0.3369, tot_loss=1.505, over 203.00 samples.], tot_loss[dur_loss=0.1935, prior_loss=0.9724, diff_loss=0.3087, tot_loss=1.475, over 442.00 samples.], 2024-10-21 19:12:32,900 INFO [train.py:561] (1/4) Epoch 2979, batch 12, global_batch_idx: 47660, batch size: 152, loss[dur_loss=0.1891, prior_loss=0.9724, diff_loss=0.2991, tot_loss=1.461, over 152.00 samples.], tot_loss[dur_loss=0.1898, prior_loss=0.9721, diff_loss=0.3517, tot_loss=1.514, over 1966.00 samples.], 2024-10-21 19:12:37,351 INFO [train.py:682] (1/4) Start epoch 2980 2024-10-21 19:12:54,444 INFO [train.py:561] (1/4) Epoch 2980, batch 6, global_batch_idx: 47670, batch size: 106, loss[dur_loss=0.1885, prior_loss=0.972, diff_loss=0.3108, tot_loss=1.471, over 106.00 samples.], tot_loss[dur_loss=0.1869, prior_loss=0.9716, diff_loss=0.3952, tot_loss=1.554, over 1142.00 samples.], 2024-10-21 19:13:07,494 INFO [train.py:682] (1/4) Start epoch 2981 2024-10-21 19:13:16,233 INFO [train.py:561] (1/4) Epoch 2981, batch 0, global_batch_idx: 47680, batch size: 108, loss[dur_loss=0.1938, prior_loss=0.9731, diff_loss=0.278, tot_loss=1.445, over 108.00 samples.], tot_loss[dur_loss=0.1938, prior_loss=0.9731, diff_loss=0.278, tot_loss=1.445, over 108.00 samples.], 2024-10-21 19:13:30,438 INFO [train.py:561] (1/4) Epoch 2981, batch 10, global_batch_idx: 47690, batch size: 111, loss[dur_loss=0.193, prior_loss=0.9733, diff_loss=0.3315, tot_loss=1.498, 
over 111.00 samples.], tot_loss[dur_loss=0.1878, prior_loss=0.9719, diff_loss=0.37, tot_loss=1.53, over 1656.00 samples.], 2024-10-21 19:13:37,577 INFO [train.py:682] (1/4) Start epoch 2982 2024-10-21 19:13:51,109 INFO [train.py:561] (1/4) Epoch 2982, batch 4, global_batch_idx: 47700, batch size: 189, loss[dur_loss=0.1879, prior_loss=0.9721, diff_loss=0.3197, tot_loss=1.48, over 189.00 samples.], tot_loss[dur_loss=0.1846, prior_loss=0.9714, diff_loss=0.4121, tot_loss=1.568, over 937.00 samples.], 2024-10-21 19:14:06,024 INFO [train.py:561] (1/4) Epoch 2982, batch 14, global_batch_idx: 47710, batch size: 142, loss[dur_loss=0.1929, prior_loss=0.9719, diff_loss=0.2904, tot_loss=1.455, over 142.00 samples.], tot_loss[dur_loss=0.1878, prior_loss=0.9719, diff_loss=0.3444, tot_loss=1.504, over 2210.00 samples.], 2024-10-21 19:14:07,448 INFO [train.py:682] (1/4) Start epoch 2983 2024-10-21 19:14:27,628 INFO [train.py:561] (1/4) Epoch 2983, batch 8, global_batch_idx: 47720, batch size: 170, loss[dur_loss=0.1923, prior_loss=0.9721, diff_loss=0.3101, tot_loss=1.474, over 170.00 samples.], tot_loss[dur_loss=0.1886, prior_loss=0.9717, diff_loss=0.3798, tot_loss=1.54, over 1432.00 samples.], 2024-10-21 19:14:37,904 INFO [train.py:682] (1/4) Start epoch 2984 2024-10-21 19:14:49,252 INFO [train.py:561] (1/4) Epoch 2984, batch 2, global_batch_idx: 47730, batch size: 203, loss[dur_loss=0.1915, prior_loss=0.9722, diff_loss=0.3366, tot_loss=1.5, over 203.00 samples.], tot_loss[dur_loss=0.1907, prior_loss=0.9722, diff_loss=0.3159, tot_loss=1.479, over 442.00 samples.], 2024-10-21 19:15:03,546 INFO [train.py:561] (1/4) Epoch 2984, batch 12, global_batch_idx: 47740, batch size: 152, loss[dur_loss=0.1918, prior_loss=0.9725, diff_loss=0.3368, tot_loss=1.501, over 152.00 samples.], tot_loss[dur_loss=0.1882, prior_loss=0.972, diff_loss=0.3563, tot_loss=1.517, over 1966.00 samples.], 2024-10-21 19:15:08,030 INFO [train.py:682] (1/4) Start epoch 2985 2024-10-21 19:15:25,032 INFO [train.py:561] (1/4) Epoch 2985, batch 6, global_batch_idx: 47750, batch size: 106, loss[dur_loss=0.1895, prior_loss=0.9722, diff_loss=0.3104, tot_loss=1.472, over 106.00 samples.], tot_loss[dur_loss=0.1879, prior_loss=0.9716, diff_loss=0.3969, tot_loss=1.556, over 1142.00 samples.], 2024-10-21 19:15:38,157 INFO [train.py:682] (1/4) Start epoch 2986 2024-10-21 19:15:47,167 INFO [train.py:561] (1/4) Epoch 2986, batch 0, global_batch_idx: 47760, batch size: 108, loss[dur_loss=0.193, prior_loss=0.9726, diff_loss=0.2917, tot_loss=1.457, over 108.00 samples.], tot_loss[dur_loss=0.193, prior_loss=0.9726, diff_loss=0.2917, tot_loss=1.457, over 108.00 samples.], 2024-10-21 19:16:01,630 INFO [train.py:561] (1/4) Epoch 2986, batch 10, global_batch_idx: 47770, batch size: 111, loss[dur_loss=0.1939, prior_loss=0.9732, diff_loss=0.3213, tot_loss=1.488, over 111.00 samples.], tot_loss[dur_loss=0.1885, prior_loss=0.9718, diff_loss=0.3578, tot_loss=1.518, over 1656.00 samples.], 2024-10-21 19:16:08,837 INFO [train.py:682] (1/4) Start epoch 2987 2024-10-21 19:16:22,752 INFO [train.py:561] (1/4) Epoch 2987, batch 4, global_batch_idx: 47780, batch size: 189, loss[dur_loss=0.1889, prior_loss=0.9721, diff_loss=0.3228, tot_loss=1.484, over 189.00 samples.], tot_loss[dur_loss=0.1858, prior_loss=0.9714, diff_loss=0.4048, tot_loss=1.562, over 937.00 samples.], 2024-10-21 19:16:37,801 INFO [train.py:561] (1/4) Epoch 2987, batch 14, global_batch_idx: 47790, batch size: 142, loss[dur_loss=0.1923, prior_loss=0.9719, diff_loss=0.2709, tot_loss=1.435, over 142.00 samples.], 
tot_loss[dur_loss=0.1884, prior_loss=0.9719, diff_loss=0.3369, tot_loss=1.497, over 2210.00 samples.], 2024-10-21 19:16:39,225 INFO [train.py:682] (1/4) Start epoch 2988 2024-10-21 19:16:59,485 INFO [train.py:561] (1/4) Epoch 2988, batch 8, global_batch_idx: 47800, batch size: 170, loss[dur_loss=0.1907, prior_loss=0.9723, diff_loss=0.326, tot_loss=1.489, over 170.00 samples.], tot_loss[dur_loss=0.1872, prior_loss=0.9717, diff_loss=0.3739, tot_loss=1.533, over 1432.00 samples.], 2024-10-21 19:17:09,794 INFO [train.py:682] (1/4) Start epoch 2989 2024-10-21 19:17:21,101 INFO [train.py:561] (1/4) Epoch 2989, batch 2, global_batch_idx: 47810, batch size: 203, loss[dur_loss=0.1895, prior_loss=0.9721, diff_loss=0.3285, tot_loss=1.49, over 203.00 samples.], tot_loss[dur_loss=0.1911, prior_loss=0.9722, diff_loss=0.3146, tot_loss=1.478, over 442.00 samples.], 2024-10-21 19:17:35,434 INFO [train.py:561] (1/4) Epoch 2989, batch 12, global_batch_idx: 47820, batch size: 152, loss[dur_loss=0.1874, prior_loss=0.9723, diff_loss=0.3229, tot_loss=1.483, over 152.00 samples.], tot_loss[dur_loss=0.1883, prior_loss=0.9719, diff_loss=0.3602, tot_loss=1.52, over 1966.00 samples.], 2024-10-21 19:17:39,972 INFO [train.py:682] (1/4) Start epoch 2990 2024-10-21 19:17:57,537 INFO [train.py:561] (1/4) Epoch 2990, batch 6, global_batch_idx: 47830, batch size: 106, loss[dur_loss=0.1883, prior_loss=0.9722, diff_loss=0.2736, tot_loss=1.434, over 106.00 samples.], tot_loss[dur_loss=0.1878, prior_loss=0.9715, diff_loss=0.3922, tot_loss=1.551, over 1142.00 samples.], 2024-10-21 19:18:10,852 INFO [train.py:682] (1/4) Start epoch 2991 2024-10-21 19:18:19,760 INFO [train.py:561] (1/4) Epoch 2991, batch 0, global_batch_idx: 47840, batch size: 108, loss[dur_loss=0.1939, prior_loss=0.9728, diff_loss=0.3005, tot_loss=1.467, over 108.00 samples.], tot_loss[dur_loss=0.1939, prior_loss=0.9728, diff_loss=0.3005, tot_loss=1.467, over 108.00 samples.], 2024-10-21 19:18:34,176 INFO [train.py:561] (1/4) Epoch 2991, batch 10, global_batch_idx: 47850, batch size: 111, loss[dur_loss=0.1908, prior_loss=0.973, diff_loss=0.3004, tot_loss=1.464, over 111.00 samples.], tot_loss[dur_loss=0.1881, prior_loss=0.9718, diff_loss=0.3661, tot_loss=1.526, over 1656.00 samples.], 2024-10-21 19:18:41,337 INFO [train.py:682] (1/4) Start epoch 2992 2024-10-21 19:18:55,047 INFO [train.py:561] (1/4) Epoch 2992, batch 4, global_batch_idx: 47860, batch size: 189, loss[dur_loss=0.1872, prior_loss=0.9719, diff_loss=0.2925, tot_loss=1.452, over 189.00 samples.], tot_loss[dur_loss=0.185, prior_loss=0.9713, diff_loss=0.4113, tot_loss=1.568, over 937.00 samples.], 2024-10-21 19:19:10,177 INFO [train.py:561] (1/4) Epoch 2992, batch 14, global_batch_idx: 47870, batch size: 142, loss[dur_loss=0.1903, prior_loss=0.9717, diff_loss=0.2825, tot_loss=1.445, over 142.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9718, diff_loss=0.3476, tot_loss=1.507, over 2210.00 samples.], 2024-10-21 19:19:11,600 INFO [train.py:682] (1/4) Start epoch 2993 2024-10-21 19:19:31,861 INFO [train.py:561] (1/4) Epoch 2993, batch 8, global_batch_idx: 47880, batch size: 170, loss[dur_loss=0.1928, prior_loss=0.9723, diff_loss=0.3252, tot_loss=1.49, over 170.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9718, diff_loss=0.3753, tot_loss=1.535, over 1432.00 samples.], 2024-10-21 19:19:42,105 INFO [train.py:682] (1/4) Start epoch 2994 2024-10-21 19:19:53,650 INFO [train.py:561] (1/4) Epoch 2994, batch 2, global_batch_idx: 47890, batch size: 203, loss[dur_loss=0.1904, prior_loss=0.9721, 
diff_loss=0.3292, tot_loss=1.492, over 203.00 samples.], tot_loss[dur_loss=0.1914, prior_loss=0.9722, diff_loss=0.3125, tot_loss=1.476, over 442.00 samples.], 2024-10-21 19:20:08,042 INFO [train.py:561] (1/4) Epoch 2994, batch 12, global_batch_idx: 47900, batch size: 152, loss[dur_loss=0.1887, prior_loss=0.9723, diff_loss=0.3234, tot_loss=1.484, over 152.00 samples.], tot_loss[dur_loss=0.1886, prior_loss=0.9719, diff_loss=0.3574, tot_loss=1.518, over 1966.00 samples.], 2024-10-21 19:20:12,591 INFO [train.py:682] (1/4) Start epoch 2995 2024-10-21 19:20:29,683 INFO [train.py:561] (1/4) Epoch 2995, batch 6, global_batch_idx: 47910, batch size: 106, loss[dur_loss=0.1886, prior_loss=0.972, diff_loss=0.2757, tot_loss=1.436, over 106.00 samples.], tot_loss[dur_loss=0.1881, prior_loss=0.9715, diff_loss=0.3979, tot_loss=1.558, over 1142.00 samples.], 2024-10-21 19:20:42,851 INFO [train.py:682] (1/4) Start epoch 2996 2024-10-21 19:20:51,523 INFO [train.py:561] (1/4) Epoch 2996, batch 0, global_batch_idx: 47920, batch size: 108, loss[dur_loss=0.1913, prior_loss=0.9725, diff_loss=0.2784, tot_loss=1.442, over 108.00 samples.], tot_loss[dur_loss=0.1913, prior_loss=0.9725, diff_loss=0.2784, tot_loss=1.442, over 108.00 samples.], 2024-10-21 19:21:05,871 INFO [train.py:561] (1/4) Epoch 2996, batch 10, global_batch_idx: 47930, batch size: 111, loss[dur_loss=0.1919, prior_loss=0.9731, diff_loss=0.2863, tot_loss=1.451, over 111.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9717, diff_loss=0.3594, tot_loss=1.519, over 1656.00 samples.], 2024-10-21 19:21:13,002 INFO [train.py:682] (1/4) Start epoch 2997 2024-10-21 19:21:27,013 INFO [train.py:561] (1/4) Epoch 2997, batch 4, global_batch_idx: 47940, batch size: 189, loss[dur_loss=0.188, prior_loss=0.9722, diff_loss=0.3098, tot_loss=1.47, over 189.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9714, diff_loss=0.4304, tot_loss=1.588, over 937.00 samples.], 2024-10-21 19:21:41,889 INFO [train.py:561] (1/4) Epoch 2997, batch 14, global_batch_idx: 47950, batch size: 142, loss[dur_loss=0.1904, prior_loss=0.972, diff_loss=0.2991, tot_loss=1.461, over 142.00 samples.], tot_loss[dur_loss=0.1884, prior_loss=0.9719, diff_loss=0.3548, tot_loss=1.515, over 2210.00 samples.], 2024-10-21 19:21:43,305 INFO [train.py:682] (1/4) Start epoch 2998 2024-10-21 19:22:03,306 INFO [train.py:561] (1/4) Epoch 2998, batch 8, global_batch_idx: 47960, batch size: 170, loss[dur_loss=0.1926, prior_loss=0.9724, diff_loss=0.3245, tot_loss=1.489, over 170.00 samples.], tot_loss[dur_loss=0.1883, prior_loss=0.9717, diff_loss=0.3809, tot_loss=1.541, over 1432.00 samples.], 2024-10-21 19:22:13,322 INFO [train.py:682] (1/4) Start epoch 2999 2024-10-21 19:22:24,844 INFO [train.py:561] (1/4) Epoch 2999, batch 2, global_batch_idx: 47970, batch size: 203, loss[dur_loss=0.1895, prior_loss=0.9721, diff_loss=0.3223, tot_loss=1.484, over 203.00 samples.], tot_loss[dur_loss=0.1907, prior_loss=0.9721, diff_loss=0.3138, tot_loss=1.477, over 442.00 samples.], 2024-10-21 19:22:39,011 INFO [train.py:561] (1/4) Epoch 2999, batch 12, global_batch_idx: 47980, batch size: 152, loss[dur_loss=0.1884, prior_loss=0.9721, diff_loss=0.2857, tot_loss=1.446, over 152.00 samples.], tot_loss[dur_loss=0.1884, prior_loss=0.9718, diff_loss=0.36, tot_loss=1.52, over 1966.00 samples.], 2024-10-21 19:22:43,468 INFO [train.py:682] (1/4) Start epoch 3000 2024-10-21 19:23:00,267 INFO [train.py:561] (1/4) Epoch 3000, batch 6, global_batch_idx: 47990, batch size: 106, loss[dur_loss=0.192, prior_loss=0.9721, diff_loss=0.2724, 
tot_loss=1.437, over 106.00 samples.], tot_loss[dur_loss=0.186, prior_loss=0.9714, diff_loss=0.3896, tot_loss=1.547, over 1142.00 samples.], 2024-10-21 19:23:13,209 INFO [train.py:682] (1/4) Start epoch 3001 2024-10-21 19:23:21,708 INFO [train.py:561] (1/4) Epoch 3001, batch 0, global_batch_idx: 48000, batch size: 108, loss[dur_loss=0.1921, prior_loss=0.9725, diff_loss=0.2958, tot_loss=1.46, over 108.00 samples.], tot_loss[dur_loss=0.1921, prior_loss=0.9725, diff_loss=0.2958, tot_loss=1.46, over 108.00 samples.], 2024-10-21 19:23:23,116 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 19:23:50,686 INFO [train.py:589] (1/4) Epoch 3001, validation: dur_loss=0.4584, prior_loss=1.037, diff_loss=0.3541, tot_loss=1.849, over 100.00 samples. 2024-10-21 19:23:50,687 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 19:24:03,614 INFO [train.py:561] (1/4) Epoch 3001, batch 10, global_batch_idx: 48010, batch size: 111, loss[dur_loss=0.1899, prior_loss=0.9732, diff_loss=0.2969, tot_loss=1.46, over 111.00 samples.], tot_loss[dur_loss=0.1879, prior_loss=0.9718, diff_loss=0.3594, tot_loss=1.519, over 1656.00 samples.], 2024-10-21 19:24:10,715 INFO [train.py:682] (1/4) Start epoch 3002 2024-10-21 19:24:25,033 INFO [train.py:561] (1/4) Epoch 3002, batch 4, global_batch_idx: 48020, batch size: 189, loss[dur_loss=0.1907, prior_loss=0.9723, diff_loss=0.2869, tot_loss=1.45, over 189.00 samples.], tot_loss[dur_loss=0.1867, prior_loss=0.9714, diff_loss=0.4102, tot_loss=1.568, over 937.00 samples.], 2024-10-21 19:24:39,828 INFO [train.py:561] (1/4) Epoch 3002, batch 14, global_batch_idx: 48030, batch size: 142, loss[dur_loss=0.1902, prior_loss=0.9718, diff_loss=0.3029, tot_loss=1.465, over 142.00 samples.], tot_loss[dur_loss=0.1885, prior_loss=0.9719, diff_loss=0.3524, tot_loss=1.513, over 2210.00 samples.], 2024-10-21 19:24:41,239 INFO [train.py:682] (1/4) Start epoch 3003 2024-10-21 19:25:01,127 INFO [train.py:561] (1/4) Epoch 3003, batch 8, global_batch_idx: 48040, batch size: 170, loss[dur_loss=0.1888, prior_loss=0.9722, diff_loss=0.3106, tot_loss=1.472, over 170.00 samples.], tot_loss[dur_loss=0.1872, prior_loss=0.9716, diff_loss=0.3757, tot_loss=1.535, over 1432.00 samples.], 2024-10-21 19:25:11,136 INFO [train.py:682] (1/4) Start epoch 3004 2024-10-21 19:25:22,621 INFO [train.py:561] (1/4) Epoch 3004, batch 2, global_batch_idx: 48050, batch size: 203, loss[dur_loss=0.1934, prior_loss=0.9723, diff_loss=0.3131, tot_loss=1.479, over 203.00 samples.], tot_loss[dur_loss=0.191, prior_loss=0.9722, diff_loss=0.2871, tot_loss=1.45, over 442.00 samples.], 2024-10-21 19:25:36,881 INFO [train.py:561] (1/4) Epoch 3004, batch 12, global_batch_idx: 48060, batch size: 152, loss[dur_loss=0.1898, prior_loss=0.9721, diff_loss=0.318, tot_loss=1.48, over 152.00 samples.], tot_loss[dur_loss=0.1889, prior_loss=0.9719, diff_loss=0.3438, tot_loss=1.505, over 1966.00 samples.], 2024-10-21 19:25:41,311 INFO [train.py:682] (1/4) Start epoch 3005 2024-10-21 19:25:58,554 INFO [train.py:561] (1/4) Epoch 3005, batch 6, global_batch_idx: 48070, batch size: 106, loss[dur_loss=0.1868, prior_loss=0.972, diff_loss=0.2845, tot_loss=1.443, over 106.00 samples.], tot_loss[dur_loss=0.1883, prior_loss=0.9716, diff_loss=0.39, tot_loss=1.55, over 1142.00 samples.], 2024-10-21 19:26:11,495 INFO [train.py:682] (1/4) Start epoch 3006 2024-10-21 19:26:20,145 INFO [train.py:561] (1/4) Epoch 3006, batch 0, global_batch_idx: 48080, batch size: 108, loss[dur_loss=0.1899, prior_loss=0.9726, diff_loss=0.3198, 
tot_loss=1.482, over 108.00 samples.], tot_loss[dur_loss=0.1899, prior_loss=0.9726, diff_loss=0.3198, tot_loss=1.482, over 108.00 samples.], 2024-10-21 19:26:34,375 INFO [train.py:561] (1/4) Epoch 3006, batch 10, global_batch_idx: 48090, batch size: 111, loss[dur_loss=0.191, prior_loss=0.9733, diff_loss=0.3091, tot_loss=1.473, over 111.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9719, diff_loss=0.3678, tot_loss=1.527, over 1656.00 samples.], 2024-10-21 19:26:41,426 INFO [train.py:682] (1/4) Start epoch 3007 2024-10-21 19:26:55,459 INFO [train.py:561] (1/4) Epoch 3007, batch 4, global_batch_idx: 48100, batch size: 189, loss[dur_loss=0.1897, prior_loss=0.9722, diff_loss=0.2989, tot_loss=1.461, over 189.00 samples.], tot_loss[dur_loss=0.1854, prior_loss=0.9714, diff_loss=0.4201, tot_loss=1.577, over 937.00 samples.], 2024-10-21 19:27:10,354 INFO [train.py:561] (1/4) Epoch 3007, batch 14, global_batch_idx: 48110, batch size: 142, loss[dur_loss=0.1876, prior_loss=0.972, diff_loss=0.297, tot_loss=1.457, over 142.00 samples.], tot_loss[dur_loss=0.1875, prior_loss=0.9719, diff_loss=0.3514, tot_loss=1.511, over 2210.00 samples.], 2024-10-21 19:27:11,781 INFO [train.py:682] (1/4) Start epoch 3008 2024-10-21 19:27:31,823 INFO [train.py:561] (1/4) Epoch 3008, batch 8, global_batch_idx: 48120, batch size: 170, loss[dur_loss=0.1919, prior_loss=0.9725, diff_loss=0.3093, tot_loss=1.474, over 170.00 samples.], tot_loss[dur_loss=0.1885, prior_loss=0.9717, diff_loss=0.3747, tot_loss=1.535, over 1432.00 samples.], 2024-10-21 19:27:41,916 INFO [train.py:682] (1/4) Start epoch 3009 2024-10-21 19:27:53,313 INFO [train.py:561] (1/4) Epoch 3009, batch 2, global_batch_idx: 48130, batch size: 203, loss[dur_loss=0.1912, prior_loss=0.9721, diff_loss=0.3266, tot_loss=1.49, over 203.00 samples.], tot_loss[dur_loss=0.1901, prior_loss=0.9722, diff_loss=0.316, tot_loss=1.478, over 442.00 samples.], 2024-10-21 19:28:07,628 INFO [train.py:561] (1/4) Epoch 3009, batch 12, global_batch_idx: 48140, batch size: 152, loss[dur_loss=0.1879, prior_loss=0.9723, diff_loss=0.2952, tot_loss=1.455, over 152.00 samples.], tot_loss[dur_loss=0.1882, prior_loss=0.9719, diff_loss=0.3591, tot_loss=1.519, over 1966.00 samples.], 2024-10-21 19:28:12,045 INFO [train.py:682] (1/4) Start epoch 3010 2024-10-21 19:28:29,078 INFO [train.py:561] (1/4) Epoch 3010, batch 6, global_batch_idx: 48150, batch size: 106, loss[dur_loss=0.1901, prior_loss=0.9722, diff_loss=0.2743, tot_loss=1.437, over 106.00 samples.], tot_loss[dur_loss=0.1864, prior_loss=0.9716, diff_loss=0.3865, tot_loss=1.544, over 1142.00 samples.], 2024-10-21 19:28:42,082 INFO [train.py:682] (1/4) Start epoch 3011 2024-10-21 19:28:51,261 INFO [train.py:561] (1/4) Epoch 3011, batch 0, global_batch_idx: 48160, batch size: 108, loss[dur_loss=0.1911, prior_loss=0.9726, diff_loss=0.3086, tot_loss=1.472, over 108.00 samples.], tot_loss[dur_loss=0.1911, prior_loss=0.9726, diff_loss=0.3086, tot_loss=1.472, over 108.00 samples.], 2024-10-21 19:29:05,637 INFO [train.py:561] (1/4) Epoch 3011, batch 10, global_batch_idx: 48170, batch size: 111, loss[dur_loss=0.1929, prior_loss=0.9733, diff_loss=0.2676, tot_loss=1.434, over 111.00 samples.], tot_loss[dur_loss=0.1895, prior_loss=0.972, diff_loss=0.3649, tot_loss=1.526, over 1656.00 samples.], 2024-10-21 19:29:12,839 INFO [train.py:682] (1/4) Start epoch 3012 2024-10-21 19:29:26,979 INFO [train.py:561] (1/4) Epoch 3012, batch 4, global_batch_idx: 48180, batch size: 189, loss[dur_loss=0.1897, prior_loss=0.9724, diff_loss=0.3044, tot_loss=1.467, 
over 189.00 samples.], tot_loss[dur_loss=0.1869, prior_loss=0.9715, diff_loss=0.4032, tot_loss=1.562, over 937.00 samples.], 2024-10-21 19:29:41,856 INFO [train.py:561] (1/4) Epoch 3012, batch 14, global_batch_idx: 48190, batch size: 142, loss[dur_loss=0.1918, prior_loss=0.9719, diff_loss=0.2854, tot_loss=1.449, over 142.00 samples.], tot_loss[dur_loss=0.1892, prior_loss=0.972, diff_loss=0.3376, tot_loss=1.499, over 2210.00 samples.], 2024-10-21 19:29:43,278 INFO [train.py:682] (1/4) Start epoch 3013 2024-10-21 19:30:03,678 INFO [train.py:561] (1/4) Epoch 3013, batch 8, global_batch_idx: 48200, batch size: 170, loss[dur_loss=0.1926, prior_loss=0.9726, diff_loss=0.3146, tot_loss=1.48, over 170.00 samples.], tot_loss[dur_loss=0.1877, prior_loss=0.9718, diff_loss=0.3767, tot_loss=1.536, over 1432.00 samples.], 2024-10-21 19:30:13,757 INFO [train.py:682] (1/4) Start epoch 3014 2024-10-21 19:30:25,077 INFO [train.py:561] (1/4) Epoch 3014, batch 2, global_batch_idx: 48210, batch size: 203, loss[dur_loss=0.1897, prior_loss=0.9722, diff_loss=0.2804, tot_loss=1.442, over 203.00 samples.], tot_loss[dur_loss=0.1906, prior_loss=0.9722, diff_loss=0.2916, tot_loss=1.454, over 442.00 samples.], 2024-10-21 19:30:39,307 INFO [train.py:561] (1/4) Epoch 3014, batch 12, global_batch_idx: 48220, batch size: 152, loss[dur_loss=0.1877, prior_loss=0.9724, diff_loss=0.2934, tot_loss=1.454, over 152.00 samples.], tot_loss[dur_loss=0.1888, prior_loss=0.9721, diff_loss=0.3498, tot_loss=1.511, over 1966.00 samples.], 2024-10-21 19:30:43,770 INFO [train.py:682] (1/4) Start epoch 3015 2024-10-21 19:31:01,012 INFO [train.py:561] (1/4) Epoch 3015, batch 6, global_batch_idx: 48230, batch size: 106, loss[dur_loss=0.1882, prior_loss=0.9721, diff_loss=0.3079, tot_loss=1.468, over 106.00 samples.], tot_loss[dur_loss=0.1866, prior_loss=0.9716, diff_loss=0.3991, tot_loss=1.557, over 1142.00 samples.], 2024-10-21 19:31:14,183 INFO [train.py:682] (1/4) Start epoch 3016 2024-10-21 19:31:22,938 INFO [train.py:561] (1/4) Epoch 3016, batch 0, global_batch_idx: 48240, batch size: 108, loss[dur_loss=0.1928, prior_loss=0.9728, diff_loss=0.2962, tot_loss=1.462, over 108.00 samples.], tot_loss[dur_loss=0.1928, prior_loss=0.9728, diff_loss=0.2962, tot_loss=1.462, over 108.00 samples.], 2024-10-21 19:31:37,216 INFO [train.py:561] (1/4) Epoch 3016, batch 10, global_batch_idx: 48250, batch size: 111, loss[dur_loss=0.1899, prior_loss=0.9732, diff_loss=0.3066, tot_loss=1.47, over 111.00 samples.], tot_loss[dur_loss=0.1889, prior_loss=0.9719, diff_loss=0.3558, tot_loss=1.517, over 1656.00 samples.], 2024-10-21 19:31:44,300 INFO [train.py:682] (1/4) Start epoch 3017 2024-10-21 19:31:58,372 INFO [train.py:561] (1/4) Epoch 3017, batch 4, global_batch_idx: 48260, batch size: 189, loss[dur_loss=0.1927, prior_loss=0.9723, diff_loss=0.3109, tot_loss=1.476, over 189.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9714, diff_loss=0.4128, tot_loss=1.572, over 937.00 samples.], 2024-10-21 19:32:13,292 INFO [train.py:561] (1/4) Epoch 3017, batch 14, global_batch_idx: 48270, batch size: 142, loss[dur_loss=0.1918, prior_loss=0.9722, diff_loss=0.321, tot_loss=1.485, over 142.00 samples.], tot_loss[dur_loss=0.1893, prior_loss=0.972, diff_loss=0.3497, tot_loss=1.511, over 2210.00 samples.], 2024-10-21 19:32:14,706 INFO [train.py:682] (1/4) Start epoch 3018 2024-10-21 19:32:34,717 INFO [train.py:561] (1/4) Epoch 3018, batch 8, global_batch_idx: 48280, batch size: 170, loss[dur_loss=0.1939, prior_loss=0.9724, diff_loss=0.3089, tot_loss=1.475, over 170.00 
samples.], tot_loss[dur_loss=0.1887, prior_loss=0.9718, diff_loss=0.3791, tot_loss=1.54, over 1432.00 samples.], 2024-10-21 19:32:44,877 INFO [train.py:682] (1/4) Start epoch 3019 2024-10-21 19:32:56,179 INFO [train.py:561] (1/4) Epoch 3019, batch 2, global_batch_idx: 48290, batch size: 203, loss[dur_loss=0.191, prior_loss=0.9722, diff_loss=0.3405, tot_loss=1.504, over 203.00 samples.], tot_loss[dur_loss=0.1897, prior_loss=0.9723, diff_loss=0.3183, tot_loss=1.48, over 442.00 samples.], 2024-10-21 19:33:10,411 INFO [train.py:561] (1/4) Epoch 3019, batch 12, global_batch_idx: 48300, batch size: 152, loss[dur_loss=0.1858, prior_loss=0.9724, diff_loss=0.3073, tot_loss=1.466, over 152.00 samples.], tot_loss[dur_loss=0.1885, prior_loss=0.9719, diff_loss=0.3555, tot_loss=1.516, over 1966.00 samples.], 2024-10-21 19:33:14,849 INFO [train.py:682] (1/4) Start epoch 3020 2024-10-21 19:33:32,378 INFO [train.py:561] (1/4) Epoch 3020, batch 6, global_batch_idx: 48310, batch size: 106, loss[dur_loss=0.1866, prior_loss=0.972, diff_loss=0.3015, tot_loss=1.46, over 106.00 samples.], tot_loss[dur_loss=0.1875, prior_loss=0.9716, diff_loss=0.3928, tot_loss=1.552, over 1142.00 samples.], 2024-10-21 19:33:45,465 INFO [train.py:682] (1/4) Start epoch 3021 2024-10-21 19:33:54,069 INFO [train.py:561] (1/4) Epoch 3021, batch 0, global_batch_idx: 48320, batch size: 108, loss[dur_loss=0.1906, prior_loss=0.9724, diff_loss=0.3164, tot_loss=1.479, over 108.00 samples.], tot_loss[dur_loss=0.1906, prior_loss=0.9724, diff_loss=0.3164, tot_loss=1.479, over 108.00 samples.], 2024-10-21 19:34:08,386 INFO [train.py:561] (1/4) Epoch 3021, batch 10, global_batch_idx: 48330, batch size: 111, loss[dur_loss=0.1901, prior_loss=0.9732, diff_loss=0.2787, tot_loss=1.442, over 111.00 samples.], tot_loss[dur_loss=0.1869, prior_loss=0.9718, diff_loss=0.361, tot_loss=1.52, over 1656.00 samples.], 2024-10-21 19:34:15,482 INFO [train.py:682] (1/4) Start epoch 3022 2024-10-21 19:34:29,450 INFO [train.py:561] (1/4) Epoch 3022, batch 4, global_batch_idx: 48340, batch size: 189, loss[dur_loss=0.1915, prior_loss=0.9722, diff_loss=0.3311, tot_loss=1.495, over 189.00 samples.], tot_loss[dur_loss=0.187, prior_loss=0.9713, diff_loss=0.4241, tot_loss=1.582, over 937.00 samples.], 2024-10-21 19:34:44,317 INFO [train.py:561] (1/4) Epoch 3022, batch 14, global_batch_idx: 48350, batch size: 142, loss[dur_loss=0.1869, prior_loss=0.9718, diff_loss=0.2697, tot_loss=1.428, over 142.00 samples.], tot_loss[dur_loss=0.1888, prior_loss=0.9718, diff_loss=0.3538, tot_loss=1.514, over 2210.00 samples.], 2024-10-21 19:34:45,743 INFO [train.py:682] (1/4) Start epoch 3023 2024-10-21 19:35:05,785 INFO [train.py:561] (1/4) Epoch 3023, batch 8, global_batch_idx: 48360, batch size: 170, loss[dur_loss=0.1939, prior_loss=0.9724, diff_loss=0.3522, tot_loss=1.518, over 170.00 samples.], tot_loss[dur_loss=0.1874, prior_loss=0.9716, diff_loss=0.3808, tot_loss=1.54, over 1432.00 samples.], 2024-10-21 19:35:15,846 INFO [train.py:682] (1/4) Start epoch 3024 2024-10-21 19:35:27,413 INFO [train.py:561] (1/4) Epoch 3024, batch 2, global_batch_idx: 48370, batch size: 203, loss[dur_loss=0.1902, prior_loss=0.972, diff_loss=0.3413, tot_loss=1.503, over 203.00 samples.], tot_loss[dur_loss=0.1901, prior_loss=0.9721, diff_loss=0.3189, tot_loss=1.481, over 442.00 samples.], 2024-10-21 19:35:41,705 INFO [train.py:561] (1/4) Epoch 3024, batch 12, global_batch_idx: 48380, batch size: 152, loss[dur_loss=0.1872, prior_loss=0.9724, diff_loss=0.3684, tot_loss=1.528, over 152.00 samples.], 
tot_loss[dur_loss=0.188, prior_loss=0.9719, diff_loss=0.3648, tot_loss=1.525, over 1966.00 samples.], 2024-10-21 19:35:46,136 INFO [train.py:682] (1/4) Start epoch 3025 2024-10-21 19:36:03,103 INFO [train.py:561] (1/4) Epoch 3025, batch 6, global_batch_idx: 48390, batch size: 106, loss[dur_loss=0.1872, prior_loss=0.9719, diff_loss=0.2964, tot_loss=1.455, over 106.00 samples.], tot_loss[dur_loss=0.1867, prior_loss=0.9715, diff_loss=0.3927, tot_loss=1.551, over 1142.00 samples.], 2024-10-21 19:36:16,169 INFO [train.py:682] (1/4) Start epoch 3026 2024-10-21 19:36:24,745 INFO [train.py:561] (1/4) Epoch 3026, batch 0, global_batch_idx: 48400, batch size: 108, loss[dur_loss=0.1887, prior_loss=0.9726, diff_loss=0.2822, tot_loss=1.443, over 108.00 samples.], tot_loss[dur_loss=0.1887, prior_loss=0.9726, diff_loss=0.2822, tot_loss=1.443, over 108.00 samples.], 2024-10-21 19:36:39,051 INFO [train.py:561] (1/4) Epoch 3026, batch 10, global_batch_idx: 48410, batch size: 111, loss[dur_loss=0.1896, prior_loss=0.9729, diff_loss=0.3403, tot_loss=1.503, over 111.00 samples.], tot_loss[dur_loss=0.1874, prior_loss=0.9718, diff_loss=0.3658, tot_loss=1.525, over 1656.00 samples.], 2024-10-21 19:36:46,168 INFO [train.py:682] (1/4) Start epoch 3027 2024-10-21 19:37:00,087 INFO [train.py:561] (1/4) Epoch 3027, batch 4, global_batch_idx: 48420, batch size: 189, loss[dur_loss=0.1922, prior_loss=0.9724, diff_loss=0.3157, tot_loss=1.48, over 189.00 samples.], tot_loss[dur_loss=0.1856, prior_loss=0.9714, diff_loss=0.4099, tot_loss=1.567, over 937.00 samples.], 2024-10-21 19:37:15,012 INFO [train.py:561] (1/4) Epoch 3027, batch 14, global_batch_idx: 48430, batch size: 142, loss[dur_loss=0.1918, prior_loss=0.9718, diff_loss=0.2971, tot_loss=1.461, over 142.00 samples.], tot_loss[dur_loss=0.1883, prior_loss=0.9718, diff_loss=0.3437, tot_loss=1.504, over 2210.00 samples.], 2024-10-21 19:37:16,431 INFO [train.py:682] (1/4) Start epoch 3028 2024-10-21 19:37:36,709 INFO [train.py:561] (1/4) Epoch 3028, batch 8, global_batch_idx: 48440, batch size: 170, loss[dur_loss=0.1934, prior_loss=0.9724, diff_loss=0.3242, tot_loss=1.49, over 170.00 samples.], tot_loss[dur_loss=0.188, prior_loss=0.9717, diff_loss=0.3703, tot_loss=1.53, over 1432.00 samples.], 2024-10-21 19:37:46,857 INFO [train.py:682] (1/4) Start epoch 3029 2024-10-21 19:37:58,138 INFO [train.py:561] (1/4) Epoch 3029, batch 2, global_batch_idx: 48450, batch size: 203, loss[dur_loss=0.1936, prior_loss=0.9722, diff_loss=0.3281, tot_loss=1.494, over 203.00 samples.], tot_loss[dur_loss=0.191, prior_loss=0.9722, diff_loss=0.3031, tot_loss=1.466, over 442.00 samples.], 2024-10-21 19:38:12,383 INFO [train.py:561] (1/4) Epoch 3029, batch 12, global_batch_idx: 48460, batch size: 152, loss[dur_loss=0.1896, prior_loss=0.9722, diff_loss=0.3073, tot_loss=1.469, over 152.00 samples.], tot_loss[dur_loss=0.1889, prior_loss=0.9719, diff_loss=0.3497, tot_loss=1.511, over 1966.00 samples.], 2024-10-21 19:38:16,829 INFO [train.py:682] (1/4) Start epoch 3030 2024-10-21 19:38:34,042 INFO [train.py:561] (1/4) Epoch 3030, batch 6, global_batch_idx: 48470, batch size: 106, loss[dur_loss=0.1868, prior_loss=0.9718, diff_loss=0.2961, tot_loss=1.455, over 106.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9715, diff_loss=0.393, tot_loss=1.551, over 1142.00 samples.], 2024-10-21 19:38:47,166 INFO [train.py:682] (1/4) Start epoch 3031 2024-10-21 19:38:55,933 INFO [train.py:561] (1/4) Epoch 3031, batch 0, global_batch_idx: 48480, batch size: 108, loss[dur_loss=0.1925, prior_loss=0.9726, 
diff_loss=0.3045, tot_loss=1.47, over 108.00 samples.], tot_loss[dur_loss=0.1925, prior_loss=0.9726, diff_loss=0.3045, tot_loss=1.47, over 108.00 samples.], 2024-10-21 19:39:10,207 INFO [train.py:561] (1/4) Epoch 3031, batch 10, global_batch_idx: 48490, batch size: 111, loss[dur_loss=0.1878, prior_loss=0.9728, diff_loss=0.3176, tot_loss=1.478, over 111.00 samples.], tot_loss[dur_loss=0.1874, prior_loss=0.9717, diff_loss=0.3657, tot_loss=1.525, over 1656.00 samples.], 2024-10-21 19:39:17,307 INFO [train.py:682] (1/4) Start epoch 3032 2024-10-21 19:39:31,032 INFO [train.py:561] (1/4) Epoch 3032, batch 4, global_batch_idx: 48500, batch size: 189, loss[dur_loss=0.1916, prior_loss=0.9719, diff_loss=0.3307, tot_loss=1.494, over 189.00 samples.], tot_loss[dur_loss=0.187, prior_loss=0.9712, diff_loss=0.418, tot_loss=1.576, over 937.00 samples.], 2024-10-21 19:39:45,912 INFO [train.py:561] (1/4) Epoch 3032, batch 14, global_batch_idx: 48510, batch size: 142, loss[dur_loss=0.1897, prior_loss=0.9717, diff_loss=0.2913, tot_loss=1.453, over 142.00 samples.], tot_loss[dur_loss=0.1886, prior_loss=0.9717, diff_loss=0.3496, tot_loss=1.51, over 2210.00 samples.], 2024-10-21 19:39:47,340 INFO [train.py:682] (1/4) Start epoch 3033 2024-10-21 19:40:07,159 INFO [train.py:561] (1/4) Epoch 3033, batch 8, global_batch_idx: 48520, batch size: 170, loss[dur_loss=0.1907, prior_loss=0.9721, diff_loss=0.3083, tot_loss=1.471, over 170.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9715, diff_loss=0.3762, tot_loss=1.534, over 1432.00 samples.], 2024-10-21 19:40:17,336 INFO [train.py:682] (1/4) Start epoch 3034 2024-10-21 19:40:29,086 INFO [train.py:561] (1/4) Epoch 3034, batch 2, global_batch_idx: 48530, batch size: 203, loss[dur_loss=0.1898, prior_loss=0.9718, diff_loss=0.3321, tot_loss=1.494, over 203.00 samples.], tot_loss[dur_loss=0.1895, prior_loss=0.9719, diff_loss=0.313, tot_loss=1.474, over 442.00 samples.], 2024-10-21 19:40:43,371 INFO [train.py:561] (1/4) Epoch 3034, batch 12, global_batch_idx: 48540, batch size: 152, loss[dur_loss=0.1875, prior_loss=0.9722, diff_loss=0.2958, tot_loss=1.456, over 152.00 samples.], tot_loss[dur_loss=0.1881, prior_loss=0.9717, diff_loss=0.3526, tot_loss=1.512, over 1966.00 samples.], 2024-10-21 19:40:47,872 INFO [train.py:682] (1/4) Start epoch 3035 2024-10-21 19:41:05,149 INFO [train.py:561] (1/4) Epoch 3035, batch 6, global_batch_idx: 48550, batch size: 106, loss[dur_loss=0.1882, prior_loss=0.9718, diff_loss=0.3046, tot_loss=1.465, over 106.00 samples.], tot_loss[dur_loss=0.1854, prior_loss=0.9713, diff_loss=0.4, tot_loss=1.557, over 1142.00 samples.], 2024-10-21 19:41:18,201 INFO [train.py:682] (1/4) Start epoch 3036 2024-10-21 19:41:26,916 INFO [train.py:561] (1/4) Epoch 3036, batch 0, global_batch_idx: 48560, batch size: 108, loss[dur_loss=0.1907, prior_loss=0.9724, diff_loss=0.3033, tot_loss=1.466, over 108.00 samples.], tot_loss[dur_loss=0.1907, prior_loss=0.9724, diff_loss=0.3033, tot_loss=1.466, over 108.00 samples.], 2024-10-21 19:41:41,193 INFO [train.py:561] (1/4) Epoch 3036, batch 10, global_batch_idx: 48570, batch size: 111, loss[dur_loss=0.1904, prior_loss=0.9728, diff_loss=0.2476, tot_loss=1.411, over 111.00 samples.], tot_loss[dur_loss=0.1866, prior_loss=0.9715, diff_loss=0.3545, tot_loss=1.513, over 1656.00 samples.], 2024-10-21 19:41:48,287 INFO [train.py:682] (1/4) Start epoch 3037 2024-10-21 19:42:02,501 INFO [train.py:561] (1/4) Epoch 3037, batch 4, global_batch_idx: 48580, batch size: 189, loss[dur_loss=0.1924, prior_loss=0.972, diff_loss=0.3233, 
tot_loss=1.488, over 189.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9712, diff_loss=0.4128, tot_loss=1.572, over 937.00 samples.], 2024-10-21 19:42:17,353 INFO [train.py:561] (1/4) Epoch 3037, batch 14, global_batch_idx: 48590, batch size: 142, loss[dur_loss=0.1915, prior_loss=0.9718, diff_loss=0.3038, tot_loss=1.467, over 142.00 samples.], tot_loss[dur_loss=0.188, prior_loss=0.9717, diff_loss=0.3447, tot_loss=1.504, over 2210.00 samples.], 2024-10-21 19:42:18,780 INFO [train.py:682] (1/4) Start epoch 3038 2024-10-21 19:42:38,805 INFO [train.py:561] (1/4) Epoch 3038, batch 8, global_batch_idx: 48600, batch size: 170, loss[dur_loss=0.191, prior_loss=0.972, diff_loss=0.2759, tot_loss=1.439, over 170.00 samples.], tot_loss[dur_loss=0.1877, prior_loss=0.9715, diff_loss=0.3653, tot_loss=1.524, over 1432.00 samples.], 2024-10-21 19:42:48,974 INFO [train.py:682] (1/4) Start epoch 3039 2024-10-21 19:43:00,435 INFO [train.py:561] (1/4) Epoch 3039, batch 2, global_batch_idx: 48610, batch size: 203, loss[dur_loss=0.1923, prior_loss=0.972, diff_loss=0.3343, tot_loss=1.499, over 203.00 samples.], tot_loss[dur_loss=0.1899, prior_loss=0.972, diff_loss=0.3131, tot_loss=1.475, over 442.00 samples.], 2024-10-21 19:43:14,675 INFO [train.py:561] (1/4) Epoch 3039, batch 12, global_batch_idx: 48620, batch size: 152, loss[dur_loss=0.187, prior_loss=0.9721, diff_loss=0.2856, tot_loss=1.445, over 152.00 samples.], tot_loss[dur_loss=0.1872, prior_loss=0.9716, diff_loss=0.3495, tot_loss=1.508, over 1966.00 samples.], 2024-10-21 19:43:19,121 INFO [train.py:682] (1/4) Start epoch 3040 2024-10-21 19:43:36,153 INFO [train.py:561] (1/4) Epoch 3040, batch 6, global_batch_idx: 48630, batch size: 106, loss[dur_loss=0.1851, prior_loss=0.9717, diff_loss=0.2994, tot_loss=1.456, over 106.00 samples.], tot_loss[dur_loss=0.1859, prior_loss=0.9713, diff_loss=0.3917, tot_loss=1.549, over 1142.00 samples.], 2024-10-21 19:43:49,195 INFO [train.py:682] (1/4) Start epoch 3041 2024-10-21 19:43:57,783 INFO [train.py:561] (1/4) Epoch 3041, batch 0, global_batch_idx: 48640, batch size: 108, loss[dur_loss=0.1915, prior_loss=0.9725, diff_loss=0.2773, tot_loss=1.441, over 108.00 samples.], tot_loss[dur_loss=0.1915, prior_loss=0.9725, diff_loss=0.2773, tot_loss=1.441, over 108.00 samples.], 2024-10-21 19:44:12,028 INFO [train.py:561] (1/4) Epoch 3041, batch 10, global_batch_idx: 48650, batch size: 111, loss[dur_loss=0.1879, prior_loss=0.9728, diff_loss=0.3068, tot_loss=1.467, over 111.00 samples.], tot_loss[dur_loss=0.187, prior_loss=0.9716, diff_loss=0.3759, tot_loss=1.535, over 1656.00 samples.], 2024-10-21 19:44:19,099 INFO [train.py:682] (1/4) Start epoch 3042 2024-10-21 19:44:33,023 INFO [train.py:561] (1/4) Epoch 3042, batch 4, global_batch_idx: 48660, batch size: 189, loss[dur_loss=0.1869, prior_loss=0.9718, diff_loss=0.3153, tot_loss=1.474, over 189.00 samples.], tot_loss[dur_loss=0.1848, prior_loss=0.9711, diff_loss=0.415, tot_loss=1.571, over 937.00 samples.], 2024-10-21 19:44:47,911 INFO [train.py:561] (1/4) Epoch 3042, batch 14, global_batch_idx: 48670, batch size: 142, loss[dur_loss=0.1901, prior_loss=0.9717, diff_loss=0.2809, tot_loss=1.443, over 142.00 samples.], tot_loss[dur_loss=0.1874, prior_loss=0.9717, diff_loss=0.3446, tot_loss=1.504, over 2210.00 samples.], 2024-10-21 19:44:49,329 INFO [train.py:682] (1/4) Start epoch 3043 2024-10-21 19:45:09,263 INFO [train.py:561] (1/4) Epoch 3043, batch 8, global_batch_idx: 48680, batch size: 170, loss[dur_loss=0.1905, prior_loss=0.972, diff_loss=0.3099, tot_loss=1.472, over 
170.00 samples.], tot_loss[dur_loss=0.186, prior_loss=0.9714, diff_loss=0.381, tot_loss=1.538, over 1432.00 samples.], 2024-10-21 19:45:19,374 INFO [train.py:682] (1/4) Start epoch 3044 2024-10-21 19:45:30,915 INFO [train.py:561] (1/4) Epoch 3044, batch 2, global_batch_idx: 48690, batch size: 203, loss[dur_loss=0.1891, prior_loss=0.972, diff_loss=0.3154, tot_loss=1.476, over 203.00 samples.], tot_loss[dur_loss=0.1893, prior_loss=0.972, diff_loss=0.3016, tot_loss=1.463, over 442.00 samples.], 2024-10-21 19:45:45,134 INFO [train.py:561] (1/4) Epoch 3044, batch 12, global_batch_idx: 48700, batch size: 152, loss[dur_loss=0.1861, prior_loss=0.972, diff_loss=0.3437, tot_loss=1.502, over 152.00 samples.], tot_loss[dur_loss=0.1871, prior_loss=0.9716, diff_loss=0.3505, tot_loss=1.509, over 1966.00 samples.], 2024-10-21 19:45:49,590 INFO [train.py:682] (1/4) Start epoch 3045 2024-10-21 19:46:06,645 INFO [train.py:561] (1/4) Epoch 3045, batch 6, global_batch_idx: 48710, batch size: 106, loss[dur_loss=0.1866, prior_loss=0.972, diff_loss=0.3285, tot_loss=1.487, over 106.00 samples.], tot_loss[dur_loss=0.1857, prior_loss=0.9714, diff_loss=0.4044, tot_loss=1.561, over 1142.00 samples.], 2024-10-21 19:46:19,674 INFO [train.py:682] (1/4) Start epoch 3046 2024-10-21 19:46:28,421 INFO [train.py:561] (1/4) Epoch 3046, batch 0, global_batch_idx: 48720, batch size: 108, loss[dur_loss=0.1881, prior_loss=0.9724, diff_loss=0.2592, tot_loss=1.42, over 108.00 samples.], tot_loss[dur_loss=0.1881, prior_loss=0.9724, diff_loss=0.2592, tot_loss=1.42, over 108.00 samples.], 2024-10-21 19:46:42,671 INFO [train.py:561] (1/4) Epoch 3046, batch 10, global_batch_idx: 48730, batch size: 111, loss[dur_loss=0.1892, prior_loss=0.9729, diff_loss=0.2728, tot_loss=1.435, over 111.00 samples.], tot_loss[dur_loss=0.1871, prior_loss=0.9716, diff_loss=0.3667, tot_loss=1.525, over 1656.00 samples.], 2024-10-21 19:46:49,759 INFO [train.py:682] (1/4) Start epoch 3047 2024-10-21 19:47:03,747 INFO [train.py:561] (1/4) Epoch 3047, batch 4, global_batch_idx: 48740, batch size: 189, loss[dur_loss=0.19, prior_loss=0.9721, diff_loss=0.3097, tot_loss=1.472, over 189.00 samples.], tot_loss[dur_loss=0.1855, prior_loss=0.9712, diff_loss=0.4159, tot_loss=1.573, over 937.00 samples.], 2024-10-21 19:47:18,618 INFO [train.py:561] (1/4) Epoch 3047, batch 14, global_batch_idx: 48750, batch size: 142, loss[dur_loss=0.1908, prior_loss=0.9716, diff_loss=0.304, tot_loss=1.466, over 142.00 samples.], tot_loss[dur_loss=0.1881, prior_loss=0.9717, diff_loss=0.3562, tot_loss=1.516, over 2210.00 samples.], 2024-10-21 19:47:20,045 INFO [train.py:682] (1/4) Start epoch 3048 2024-10-21 19:47:39,994 INFO [train.py:561] (1/4) Epoch 3048, batch 8, global_batch_idx: 48760, batch size: 170, loss[dur_loss=0.1902, prior_loss=0.9718, diff_loss=0.3247, tot_loss=1.487, over 170.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9715, diff_loss=0.3695, tot_loss=1.528, over 1432.00 samples.], 2024-10-21 19:47:50,105 INFO [train.py:682] (1/4) Start epoch 3049 2024-10-21 19:48:01,426 INFO [train.py:561] (1/4) Epoch 3049, batch 2, global_batch_idx: 48770, batch size: 203, loss[dur_loss=0.1908, prior_loss=0.9719, diff_loss=0.3157, tot_loss=1.478, over 203.00 samples.], tot_loss[dur_loss=0.1903, prior_loss=0.9719, diff_loss=0.2994, tot_loss=1.462, over 442.00 samples.], 2024-10-21 19:48:15,685 INFO [train.py:561] (1/4) Epoch 3049, batch 12, global_batch_idx: 48780, batch size: 152, loss[dur_loss=0.1874, prior_loss=0.9719, diff_loss=0.3158, tot_loss=1.475, over 152.00 samples.], 
tot_loss[dur_loss=0.187, prior_loss=0.9716, diff_loss=0.3496, tot_loss=1.508, over 1966.00 samples.], 2024-10-21 19:48:20,147 INFO [train.py:682] (1/4) Start epoch 3050 2024-10-21 19:48:37,291 INFO [train.py:561] (1/4) Epoch 3050, batch 6, global_batch_idx: 48790, batch size: 106, loss[dur_loss=0.1886, prior_loss=0.972, diff_loss=0.2929, tot_loss=1.454, over 106.00 samples.], tot_loss[dur_loss=0.1858, prior_loss=0.9713, diff_loss=0.3881, tot_loss=1.545, over 1142.00 samples.], 2024-10-21 19:48:50,365 INFO [train.py:682] (1/4) Start epoch 3051 2024-10-21 19:48:59,029 INFO [train.py:561] (1/4) Epoch 3051, batch 0, global_batch_idx: 48800, batch size: 108, loss[dur_loss=0.1923, prior_loss=0.9724, diff_loss=0.3036, tot_loss=1.468, over 108.00 samples.], tot_loss[dur_loss=0.1923, prior_loss=0.9724, diff_loss=0.3036, tot_loss=1.468, over 108.00 samples.], 2024-10-21 19:49:13,275 INFO [train.py:561] (1/4) Epoch 3051, batch 10, global_batch_idx: 48810, batch size: 111, loss[dur_loss=0.1911, prior_loss=0.9731, diff_loss=0.2973, tot_loss=1.461, over 111.00 samples.], tot_loss[dur_loss=0.187, prior_loss=0.9716, diff_loss=0.3657, tot_loss=1.524, over 1656.00 samples.], 2024-10-21 19:49:20,332 INFO [train.py:682] (1/4) Start epoch 3052 2024-10-21 19:49:33,965 INFO [train.py:561] (1/4) Epoch 3052, batch 4, global_batch_idx: 48820, batch size: 189, loss[dur_loss=0.1877, prior_loss=0.9718, diff_loss=0.3232, tot_loss=1.483, over 189.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.971, diff_loss=0.4161, tot_loss=1.572, over 937.00 samples.], 2024-10-21 19:49:48,726 INFO [train.py:561] (1/4) Epoch 3052, batch 14, global_batch_idx: 48830, batch size: 142, loss[dur_loss=0.1889, prior_loss=0.9717, diff_loss=0.3048, tot_loss=1.465, over 142.00 samples.], tot_loss[dur_loss=0.1866, prior_loss=0.9716, diff_loss=0.3449, tot_loss=1.503, over 2210.00 samples.], 2024-10-21 19:49:50,147 INFO [train.py:682] (1/4) Start epoch 3053 2024-10-21 19:50:10,332 INFO [train.py:561] (1/4) Epoch 3053, batch 8, global_batch_idx: 48840, batch size: 170, loss[dur_loss=0.1901, prior_loss=0.9719, diff_loss=0.2873, tot_loss=1.449, over 170.00 samples.], tot_loss[dur_loss=0.186, prior_loss=0.9714, diff_loss=0.368, tot_loss=1.525, over 1432.00 samples.], 2024-10-21 19:50:20,439 INFO [train.py:682] (1/4) Start epoch 3054 2024-10-21 19:50:31,812 INFO [train.py:561] (1/4) Epoch 3054, batch 2, global_batch_idx: 48850, batch size: 203, loss[dur_loss=0.1866, prior_loss=0.9718, diff_loss=0.3665, tot_loss=1.525, over 203.00 samples.], tot_loss[dur_loss=0.1874, prior_loss=0.9719, diff_loss=0.3352, tot_loss=1.495, over 442.00 samples.], 2024-10-21 19:50:46,016 INFO [train.py:561] (1/4) Epoch 3054, batch 12, global_batch_idx: 48860, batch size: 152, loss[dur_loss=0.1888, prior_loss=0.9719, diff_loss=0.2921, tot_loss=1.453, over 152.00 samples.], tot_loss[dur_loss=0.1867, prior_loss=0.9716, diff_loss=0.3564, tot_loss=1.515, over 1966.00 samples.], 2024-10-21 19:50:50,431 INFO [train.py:682] (1/4) Start epoch 3055 2024-10-21 19:51:07,476 INFO [train.py:561] (1/4) Epoch 3055, batch 6, global_batch_idx: 48870, batch size: 106, loss[dur_loss=0.1841, prior_loss=0.9719, diff_loss=0.2728, tot_loss=1.429, over 106.00 samples.], tot_loss[dur_loss=0.1857, prior_loss=0.9713, diff_loss=0.3808, tot_loss=1.538, over 1142.00 samples.], 2024-10-21 19:51:20,481 INFO [train.py:682] (1/4) Start epoch 3056 2024-10-21 19:51:29,216 INFO [train.py:561] (1/4) Epoch 3056, batch 0, global_batch_idx: 48880, batch size: 108, loss[dur_loss=0.1905, prior_loss=0.9723, 
diff_loss=0.3091, tot_loss=1.472, over 108.00 samples.], tot_loss[dur_loss=0.1905, prior_loss=0.9723, diff_loss=0.3091, tot_loss=1.472, over 108.00 samples.], 2024-10-21 19:51:43,640 INFO [train.py:561] (1/4) Epoch 3056, batch 10, global_batch_idx: 48890, batch size: 111, loss[dur_loss=0.1894, prior_loss=0.9728, diff_loss=0.2883, tot_loss=1.451, over 111.00 samples.], tot_loss[dur_loss=0.1867, prior_loss=0.9715, diff_loss=0.37, tot_loss=1.528, over 1656.00 samples.], 2024-10-21 19:51:50,731 INFO [train.py:682] (1/4) Start epoch 3057 2024-10-21 19:52:04,772 INFO [train.py:561] (1/4) Epoch 3057, batch 4, global_batch_idx: 48900, batch size: 189, loss[dur_loss=0.1892, prior_loss=0.972, diff_loss=0.3031, tot_loss=1.464, over 189.00 samples.], tot_loss[dur_loss=0.1849, prior_loss=0.9711, diff_loss=0.4204, tot_loss=1.577, over 937.00 samples.], 2024-10-21 19:52:19,711 INFO [train.py:561] (1/4) Epoch 3057, batch 14, global_batch_idx: 48910, batch size: 142, loss[dur_loss=0.1898, prior_loss=0.9715, diff_loss=0.2735, tot_loss=1.435, over 142.00 samples.], tot_loss[dur_loss=0.187, prior_loss=0.9716, diff_loss=0.3506, tot_loss=1.509, over 2210.00 samples.], 2024-10-21 19:52:21,142 INFO [train.py:682] (1/4) Start epoch 3058 2024-10-21 19:52:41,262 INFO [train.py:561] (1/4) Epoch 3058, batch 8, global_batch_idx: 48920, batch size: 170, loss[dur_loss=0.1896, prior_loss=0.972, diff_loss=0.2898, tot_loss=1.451, over 170.00 samples.], tot_loss[dur_loss=0.1863, prior_loss=0.9714, diff_loss=0.3743, tot_loss=1.532, over 1432.00 samples.], 2024-10-21 19:52:51,413 INFO [train.py:682] (1/4) Start epoch 3059 2024-10-21 19:53:02,758 INFO [train.py:561] (1/4) Epoch 3059, batch 2, global_batch_idx: 48930, batch size: 203, loss[dur_loss=0.1882, prior_loss=0.9717, diff_loss=0.324, tot_loss=1.484, over 203.00 samples.], tot_loss[dur_loss=0.1881, prior_loss=0.9718, diff_loss=0.3027, tot_loss=1.463, over 442.00 samples.], 2024-10-21 19:53:17,049 INFO [train.py:561] (1/4) Epoch 3059, batch 12, global_batch_idx: 48940, batch size: 152, loss[dur_loss=0.1879, prior_loss=0.9722, diff_loss=0.3467, tot_loss=1.507, over 152.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9716, diff_loss=0.3613, tot_loss=1.519, over 1966.00 samples.], 2024-10-21 19:53:21,510 INFO [train.py:682] (1/4) Start epoch 3060 2024-10-21 19:53:38,449 INFO [train.py:561] (1/4) Epoch 3060, batch 6, global_batch_idx: 48950, batch size: 106, loss[dur_loss=0.1882, prior_loss=0.972, diff_loss=0.3238, tot_loss=1.484, over 106.00 samples.], tot_loss[dur_loss=0.1862, prior_loss=0.9713, diff_loss=0.3914, tot_loss=1.549, over 1142.00 samples.], 2024-10-21 19:53:51,491 INFO [train.py:682] (1/4) Start epoch 3061 2024-10-21 19:54:00,773 INFO [train.py:561] (1/4) Epoch 3061, batch 0, global_batch_idx: 48960, batch size: 108, loss[dur_loss=0.1905, prior_loss=0.9723, diff_loss=0.2971, tot_loss=1.46, over 108.00 samples.], tot_loss[dur_loss=0.1905, prior_loss=0.9723, diff_loss=0.2971, tot_loss=1.46, over 108.00 samples.], 2024-10-21 19:54:15,113 INFO [train.py:561] (1/4) Epoch 3061, batch 10, global_batch_idx: 48970, batch size: 111, loss[dur_loss=0.19, prior_loss=0.9729, diff_loss=0.3249, tot_loss=1.488, over 111.00 samples.], tot_loss[dur_loss=0.187, prior_loss=0.9715, diff_loss=0.3682, tot_loss=1.527, over 1656.00 samples.], 2024-10-21 19:54:22,202 INFO [train.py:682] (1/4) Start epoch 3062 2024-10-21 19:54:36,146 INFO [train.py:561] (1/4) Epoch 3062, batch 4, global_batch_idx: 48980, batch size: 189, loss[dur_loss=0.1863, prior_loss=0.9717, diff_loss=0.3176, 
tot_loss=1.476, over 189.00 samples.], tot_loss[dur_loss=0.1843, prior_loss=0.9711, diff_loss=0.4011, tot_loss=1.557, over 937.00 samples.], 2024-10-21 19:54:51,109 INFO [train.py:561] (1/4) Epoch 3062, batch 14, global_batch_idx: 48990, batch size: 142, loss[dur_loss=0.1893, prior_loss=0.9716, diff_loss=0.2836, tot_loss=1.444, over 142.00 samples.], tot_loss[dur_loss=0.187, prior_loss=0.9716, diff_loss=0.3396, tot_loss=1.498, over 2210.00 samples.], 2024-10-21 19:54:52,534 INFO [train.py:682] (1/4) Start epoch 3063 2024-10-21 19:55:12,417 INFO [train.py:561] (1/4) Epoch 3063, batch 8, global_batch_idx: 49000, batch size: 170, loss[dur_loss=0.188, prior_loss=0.9718, diff_loss=0.3137, tot_loss=1.474, over 170.00 samples.], tot_loss[dur_loss=0.1869, prior_loss=0.9714, diff_loss=0.3882, tot_loss=1.547, over 1432.00 samples.], 2024-10-21 19:55:22,460 INFO [train.py:682] (1/4) Start epoch 3064 2024-10-21 19:55:34,148 INFO [train.py:561] (1/4) Epoch 3064, batch 2, global_batch_idx: 49010, batch size: 203, loss[dur_loss=0.1874, prior_loss=0.9717, diff_loss=0.3125, tot_loss=1.472, over 203.00 samples.], tot_loss[dur_loss=0.1888, prior_loss=0.9719, diff_loss=0.3134, tot_loss=1.474, over 442.00 samples.], 2024-10-21 19:55:48,354 INFO [train.py:561] (1/4) Epoch 3064, batch 12, global_batch_idx: 49020, batch size: 152, loss[dur_loss=0.1851, prior_loss=0.9719, diff_loss=0.3032, tot_loss=1.46, over 152.00 samples.], tot_loss[dur_loss=0.1863, prior_loss=0.9715, diff_loss=0.357, tot_loss=1.515, over 1966.00 samples.], 2024-10-21 19:55:52,823 INFO [train.py:682] (1/4) Start epoch 3065 2024-10-21 19:56:09,807 INFO [train.py:561] (1/4) Epoch 3065, batch 6, global_batch_idx: 49030, batch size: 106, loss[dur_loss=0.1883, prior_loss=0.9717, diff_loss=0.3171, tot_loss=1.477, over 106.00 samples.], tot_loss[dur_loss=0.1845, prior_loss=0.9712, diff_loss=0.3935, tot_loss=1.549, over 1142.00 samples.], 2024-10-21 19:56:22,878 INFO [train.py:682] (1/4) Start epoch 3066 2024-10-21 19:56:31,731 INFO [train.py:561] (1/4) Epoch 3066, batch 0, global_batch_idx: 49040, batch size: 108, loss[dur_loss=0.189, prior_loss=0.9723, diff_loss=0.2866, tot_loss=1.448, over 108.00 samples.], tot_loss[dur_loss=0.189, prior_loss=0.9723, diff_loss=0.2866, tot_loss=1.448, over 108.00 samples.], 2024-10-21 19:56:45,931 INFO [train.py:561] (1/4) Epoch 3066, batch 10, global_batch_idx: 49050, batch size: 111, loss[dur_loss=0.1904, prior_loss=0.9728, diff_loss=0.3079, tot_loss=1.471, over 111.00 samples.], tot_loss[dur_loss=0.1866, prior_loss=0.9715, diff_loss=0.3641, tot_loss=1.522, over 1656.00 samples.], 2024-10-21 19:56:53,030 INFO [train.py:682] (1/4) Start epoch 3067 2024-10-21 19:57:07,005 INFO [train.py:561] (1/4) Epoch 3067, batch 4, global_batch_idx: 49060, batch size: 189, loss[dur_loss=0.1903, prior_loss=0.9719, diff_loss=0.3434, tot_loss=1.506, over 189.00 samples.], tot_loss[dur_loss=0.1858, prior_loss=0.9712, diff_loss=0.4152, tot_loss=1.572, over 937.00 samples.], 2024-10-21 19:57:21,845 INFO [train.py:561] (1/4) Epoch 3067, batch 14, global_batch_idx: 49070, batch size: 142, loss[dur_loss=0.1897, prior_loss=0.9716, diff_loss=0.2531, tot_loss=1.414, over 142.00 samples.], tot_loss[dur_loss=0.1868, prior_loss=0.9716, diff_loss=0.3433, tot_loss=1.502, over 2210.00 samples.], 2024-10-21 19:57:23,274 INFO [train.py:682] (1/4) Start epoch 3068 2024-10-21 19:57:43,421 INFO [train.py:561] (1/4) Epoch 3068, batch 8, global_batch_idx: 49080, batch size: 170, loss[dur_loss=0.1887, prior_loss=0.972, diff_loss=0.3502, tot_loss=1.511, 
over 170.00 samples.], tot_loss[dur_loss=0.1867, prior_loss=0.9715, diff_loss=0.3791, tot_loss=1.537, over 1432.00 samples.], 2024-10-21 19:57:53,594 INFO [train.py:682] (1/4) Start epoch 3069 2024-10-21 19:58:05,176 INFO [train.py:561] (1/4) Epoch 3069, batch 2, global_batch_idx: 49090, batch size: 203, loss[dur_loss=0.1882, prior_loss=0.9719, diff_loss=0.3365, tot_loss=1.497, over 203.00 samples.], tot_loss[dur_loss=0.1894, prior_loss=0.9719, diff_loss=0.3273, tot_loss=1.489, over 442.00 samples.], 2024-10-21 19:58:19,398 INFO [train.py:561] (1/4) Epoch 3069, batch 12, global_batch_idx: 49100, batch size: 152, loss[dur_loss=0.1901, prior_loss=0.972, diff_loss=0.2841, tot_loss=1.446, over 152.00 samples.], tot_loss[dur_loss=0.1875, prior_loss=0.9716, diff_loss=0.348, tot_loss=1.507, over 1966.00 samples.], 2024-10-21 19:58:23,837 INFO [train.py:682] (1/4) Start epoch 3070 2024-10-21 19:58:41,119 INFO [train.py:561] (1/4) Epoch 3070, batch 6, global_batch_idx: 49110, batch size: 106, loss[dur_loss=0.1896, prior_loss=0.9718, diff_loss=0.3212, tot_loss=1.483, over 106.00 samples.], tot_loss[dur_loss=0.1848, prior_loss=0.9712, diff_loss=0.3991, tot_loss=1.555, over 1142.00 samples.], 2024-10-21 19:58:54,161 INFO [train.py:682] (1/4) Start epoch 3071 2024-10-21 19:59:02,844 INFO [train.py:561] (1/4) Epoch 3071, batch 0, global_batch_idx: 49120, batch size: 108, loss[dur_loss=0.1875, prior_loss=0.9723, diff_loss=0.2672, tot_loss=1.427, over 108.00 samples.], tot_loss[dur_loss=0.1875, prior_loss=0.9723, diff_loss=0.2672, tot_loss=1.427, over 108.00 samples.], 2024-10-21 19:59:17,060 INFO [train.py:561] (1/4) Epoch 3071, batch 10, global_batch_idx: 49130, batch size: 111, loss[dur_loss=0.1885, prior_loss=0.9726, diff_loss=0.2957, tot_loss=1.457, over 111.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9716, diff_loss=0.3652, tot_loss=1.524, over 1656.00 samples.], 2024-10-21 19:59:24,134 INFO [train.py:682] (1/4) Start epoch 3072 2024-10-21 19:59:37,964 INFO [train.py:561] (1/4) Epoch 3072, batch 4, global_batch_idx: 49140, batch size: 189, loss[dur_loss=0.1884, prior_loss=0.9717, diff_loss=0.3214, tot_loss=1.481, over 189.00 samples.], tot_loss[dur_loss=0.1853, prior_loss=0.971, diff_loss=0.4052, tot_loss=1.561, over 937.00 samples.], 2024-10-21 19:59:52,971 INFO [train.py:561] (1/4) Epoch 3072, batch 14, global_batch_idx: 49150, batch size: 142, loss[dur_loss=0.1892, prior_loss=0.9718, diff_loss=0.2823, tot_loss=1.443, over 142.00 samples.], tot_loss[dur_loss=0.1874, prior_loss=0.9716, diff_loss=0.3382, tot_loss=1.497, over 2210.00 samples.], 2024-10-21 19:59:54,408 INFO [train.py:682] (1/4) Start epoch 3073 2024-10-21 20:00:14,581 INFO [train.py:561] (1/4) Epoch 3073, batch 8, global_batch_idx: 49160, batch size: 170, loss[dur_loss=0.1907, prior_loss=0.9717, diff_loss=0.3274, tot_loss=1.49, over 170.00 samples.], tot_loss[dur_loss=0.187, prior_loss=0.9714, diff_loss=0.375, tot_loss=1.533, over 1432.00 samples.], 2024-10-21 20:00:24,716 INFO [train.py:682] (1/4) Start epoch 3074 2024-10-21 20:00:36,205 INFO [train.py:561] (1/4) Epoch 3074, batch 2, global_batch_idx: 49170, batch size: 203, loss[dur_loss=0.1868, prior_loss=0.9718, diff_loss=0.2939, tot_loss=1.453, over 203.00 samples.], tot_loss[dur_loss=0.1877, prior_loss=0.9719, diff_loss=0.2991, tot_loss=1.459, over 442.00 samples.], 2024-10-21 20:00:50,395 INFO [train.py:561] (1/4) Epoch 3074, batch 12, global_batch_idx: 49180, batch size: 152, loss[dur_loss=0.1867, prior_loss=0.9719, diff_loss=0.2715, tot_loss=1.43, over 152.00 
samples.], tot_loss[dur_loss=0.187, prior_loss=0.9717, diff_loss=0.3537, tot_loss=1.512, over 1966.00 samples.], 2024-10-21 20:00:54,831 INFO [train.py:682] (1/4) Start epoch 3075 2024-10-21 20:01:11,731 INFO [train.py:561] (1/4) Epoch 3075, batch 6, global_batch_idx: 49190, batch size: 106, loss[dur_loss=0.1864, prior_loss=0.9718, diff_loss=0.2768, tot_loss=1.435, over 106.00 samples.], tot_loss[dur_loss=0.186, prior_loss=0.9713, diff_loss=0.3902, tot_loss=1.547, over 1142.00 samples.], 2024-10-21 20:01:24,806 INFO [train.py:682] (1/4) Start epoch 3076 2024-10-21 20:01:33,769 INFO [train.py:561] (1/4) Epoch 3076, batch 0, global_batch_idx: 49200, batch size: 108, loss[dur_loss=0.1911, prior_loss=0.9727, diff_loss=0.262, tot_loss=1.426, over 108.00 samples.], tot_loss[dur_loss=0.1911, prior_loss=0.9727, diff_loss=0.262, tot_loss=1.426, over 108.00 samples.], 2024-10-21 20:01:48,047 INFO [train.py:561] (1/4) Epoch 3076, batch 10, global_batch_idx: 49210, batch size: 111, loss[dur_loss=0.1898, prior_loss=0.9727, diff_loss=0.2536, tot_loss=1.416, over 111.00 samples.], tot_loss[dur_loss=0.1873, prior_loss=0.9716, diff_loss=0.3648, tot_loss=1.524, over 1656.00 samples.], 2024-10-21 20:01:55,163 INFO [train.py:682] (1/4) Start epoch 3077 2024-10-21 20:02:09,445 INFO [train.py:561] (1/4) Epoch 3077, batch 4, global_batch_idx: 49220, batch size: 189, loss[dur_loss=0.1887, prior_loss=0.9719, diff_loss=0.3185, tot_loss=1.479, over 189.00 samples.], tot_loss[dur_loss=0.1861, prior_loss=0.971, diff_loss=0.4221, tot_loss=1.579, over 937.00 samples.], 2024-10-21 20:02:24,358 INFO [train.py:561] (1/4) Epoch 3077, batch 14, global_batch_idx: 49230, batch size: 142, loss[dur_loss=0.1893, prior_loss=0.9717, diff_loss=0.3111, tot_loss=1.472, over 142.00 samples.], tot_loss[dur_loss=0.1879, prior_loss=0.9717, diff_loss=0.3521, tot_loss=1.512, over 2210.00 samples.], 2024-10-21 20:02:25,795 INFO [train.py:682] (1/4) Start epoch 3078 2024-10-21 20:02:45,772 INFO [train.py:561] (1/4) Epoch 3078, batch 8, global_batch_idx: 49240, batch size: 170, loss[dur_loss=0.1917, prior_loss=0.9721, diff_loss=0.3388, tot_loss=1.503, over 170.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9715, diff_loss=0.3781, tot_loss=1.537, over 1432.00 samples.], 2024-10-21 20:02:55,898 INFO [train.py:682] (1/4) Start epoch 3079 2024-10-21 20:03:07,596 INFO [train.py:561] (1/4) Epoch 3079, batch 2, global_batch_idx: 49250, batch size: 203, loss[dur_loss=0.1874, prior_loss=0.972, diff_loss=0.3671, tot_loss=1.526, over 203.00 samples.], tot_loss[dur_loss=0.1883, prior_loss=0.972, diff_loss=0.3334, tot_loss=1.494, over 442.00 samples.], 2024-10-21 20:03:21,812 INFO [train.py:561] (1/4) Epoch 3079, batch 12, global_batch_idx: 49260, batch size: 152, loss[dur_loss=0.1856, prior_loss=0.9718, diff_loss=0.311, tot_loss=1.468, over 152.00 samples.], tot_loss[dur_loss=0.186, prior_loss=0.9716, diff_loss=0.3716, tot_loss=1.529, over 1966.00 samples.], 2024-10-21 20:03:26,234 INFO [train.py:682] (1/4) Start epoch 3080 2024-10-21 20:03:43,307 INFO [train.py:561] (1/4) Epoch 3080, batch 6, global_batch_idx: 49270, batch size: 106, loss[dur_loss=0.1877, prior_loss=0.9719, diff_loss=0.3074, tot_loss=1.467, over 106.00 samples.], tot_loss[dur_loss=0.1866, prior_loss=0.9713, diff_loss=0.3895, tot_loss=1.547, over 1142.00 samples.], 2024-10-21 20:03:56,347 INFO [train.py:682] (1/4) Start epoch 3081 2024-10-21 20:04:04,995 INFO [train.py:561] (1/4) Epoch 3081, batch 0, global_batch_idx: 49280, batch size: 108, loss[dur_loss=0.1898, prior_loss=0.9724, 
diff_loss=0.2764, tot_loss=1.438, over 108.00 samples.], tot_loss[dur_loss=0.1898, prior_loss=0.9724, diff_loss=0.2764, tot_loss=1.438, over 108.00 samples.], 2024-10-21 20:04:19,147 INFO [train.py:561] (1/4) Epoch 3081, batch 10, global_batch_idx: 49290, batch size: 111, loss[dur_loss=0.1906, prior_loss=0.9729, diff_loss=0.3141, tot_loss=1.478, over 111.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9716, diff_loss=0.3712, tot_loss=1.53, over 1656.00 samples.], 2024-10-21 20:04:26,201 INFO [train.py:682] (1/4) Start epoch 3082 2024-10-21 20:04:40,411 INFO [train.py:561] (1/4) Epoch 3082, batch 4, global_batch_idx: 49300, batch size: 189, loss[dur_loss=0.1865, prior_loss=0.9716, diff_loss=0.3477, tot_loss=1.506, over 189.00 samples.], tot_loss[dur_loss=0.1842, prior_loss=0.971, diff_loss=0.4164, tot_loss=1.572, over 937.00 samples.], 2024-10-21 20:04:55,319 INFO [train.py:561] (1/4) Epoch 3082, batch 14, global_batch_idx: 49310, batch size: 142, loss[dur_loss=0.1889, prior_loss=0.9715, diff_loss=0.2784, tot_loss=1.439, over 142.00 samples.], tot_loss[dur_loss=0.1864, prior_loss=0.9716, diff_loss=0.3478, tot_loss=1.506, over 2210.00 samples.], 2024-10-21 20:04:56,741 INFO [train.py:682] (1/4) Start epoch 3083 2024-10-21 20:05:16,875 INFO [train.py:561] (1/4) Epoch 3083, batch 8, global_batch_idx: 49320, batch size: 170, loss[dur_loss=0.1889, prior_loss=0.972, diff_loss=0.3219, tot_loss=1.483, over 170.00 samples.], tot_loss[dur_loss=0.1861, prior_loss=0.9714, diff_loss=0.3777, tot_loss=1.535, over 1432.00 samples.], 2024-10-21 20:05:27,075 INFO [train.py:682] (1/4) Start epoch 3084 2024-10-21 20:05:38,545 INFO [train.py:561] (1/4) Epoch 3084, batch 2, global_batch_idx: 49330, batch size: 203, loss[dur_loss=0.189, prior_loss=0.9718, diff_loss=0.3449, tot_loss=1.506, over 203.00 samples.], tot_loss[dur_loss=0.1889, prior_loss=0.9719, diff_loss=0.324, tot_loss=1.485, over 442.00 samples.], 2024-10-21 20:05:52,787 INFO [train.py:561] (1/4) Epoch 3084, batch 12, global_batch_idx: 49340, batch size: 152, loss[dur_loss=0.1854, prior_loss=0.9721, diff_loss=0.2819, tot_loss=1.439, over 152.00 samples.], tot_loss[dur_loss=0.1868, prior_loss=0.9716, diff_loss=0.3505, tot_loss=1.509, over 1966.00 samples.], 2024-10-21 20:05:57,251 INFO [train.py:682] (1/4) Start epoch 3085 2024-10-21 20:06:14,103 INFO [train.py:561] (1/4) Epoch 3085, batch 6, global_batch_idx: 49350, batch size: 106, loss[dur_loss=0.1862, prior_loss=0.9717, diff_loss=0.2597, tot_loss=1.418, over 106.00 samples.], tot_loss[dur_loss=0.1853, prior_loss=0.9712, diff_loss=0.386, tot_loss=1.543, over 1142.00 samples.], 2024-10-21 20:06:27,181 INFO [train.py:682] (1/4) Start epoch 3086 2024-10-21 20:06:36,404 INFO [train.py:561] (1/4) Epoch 3086, batch 0, global_batch_idx: 49360, batch size: 108, loss[dur_loss=0.1929, prior_loss=0.9724, diff_loss=0.2822, tot_loss=1.448, over 108.00 samples.], tot_loss[dur_loss=0.1929, prior_loss=0.9724, diff_loss=0.2822, tot_loss=1.448, over 108.00 samples.], 2024-10-21 20:06:50,621 INFO [train.py:561] (1/4) Epoch 3086, batch 10, global_batch_idx: 49370, batch size: 111, loss[dur_loss=0.1893, prior_loss=0.9729, diff_loss=0.3019, tot_loss=1.464, over 111.00 samples.], tot_loss[dur_loss=0.1872, prior_loss=0.9716, diff_loss=0.3623, tot_loss=1.521, over 1656.00 samples.], 2024-10-21 20:06:57,694 INFO [train.py:682] (1/4) Start epoch 3087 2024-10-21 20:07:11,616 INFO [train.py:561] (1/4) Epoch 3087, batch 4, global_batch_idx: 49380, batch size: 189, loss[dur_loss=0.1912, prior_loss=0.9722, diff_loss=0.3458, 
tot_loss=1.509, over 189.00 samples.], tot_loss[dur_loss=0.1855, prior_loss=0.9712, diff_loss=0.422, tot_loss=1.579, over 937.00 samples.], 2024-10-21 20:07:26,471 INFO [train.py:561] (1/4) Epoch 3087, batch 14, global_batch_idx: 49390, batch size: 142, loss[dur_loss=0.1881, prior_loss=0.9717, diff_loss=0.2796, tot_loss=1.439, over 142.00 samples.], tot_loss[dur_loss=0.188, prior_loss=0.9718, diff_loss=0.3488, tot_loss=1.509, over 2210.00 samples.], 2024-10-21 20:07:27,896 INFO [train.py:682] (1/4) Start epoch 3088 2024-10-21 20:07:47,826 INFO [train.py:561] (1/4) Epoch 3088, batch 8, global_batch_idx: 49400, batch size: 170, loss[dur_loss=0.1927, prior_loss=0.9721, diff_loss=0.2856, tot_loss=1.45, over 170.00 samples.], tot_loss[dur_loss=0.1866, prior_loss=0.9714, diff_loss=0.3627, tot_loss=1.521, over 1432.00 samples.], 2024-10-21 20:07:57,990 INFO [train.py:682] (1/4) Start epoch 3089 2024-10-21 20:08:09,558 INFO [train.py:561] (1/4) Epoch 3089, batch 2, global_batch_idx: 49410, batch size: 203, loss[dur_loss=0.1897, prior_loss=0.9719, diff_loss=0.3232, tot_loss=1.485, over 203.00 samples.], tot_loss[dur_loss=0.1891, prior_loss=0.9719, diff_loss=0.289, tot_loss=1.45, over 442.00 samples.], 2024-10-21 20:08:23,856 INFO [train.py:561] (1/4) Epoch 3089, batch 12, global_batch_idx: 49420, batch size: 152, loss[dur_loss=0.1865, prior_loss=0.972, diff_loss=0.3016, tot_loss=1.46, over 152.00 samples.], tot_loss[dur_loss=0.1874, prior_loss=0.9716, diff_loss=0.341, tot_loss=1.5, over 1966.00 samples.], 2024-10-21 20:08:28,341 INFO [train.py:682] (1/4) Start epoch 3090 2024-10-21 20:08:45,555 INFO [train.py:561] (1/4) Epoch 3090, batch 6, global_batch_idx: 49430, batch size: 106, loss[dur_loss=0.1889, prior_loss=0.9718, diff_loss=0.2695, tot_loss=1.43, over 106.00 samples.], tot_loss[dur_loss=0.1845, prior_loss=0.9713, diff_loss=0.3931, tot_loss=1.549, over 1142.00 samples.], 2024-10-21 20:08:58,619 INFO [train.py:682] (1/4) Start epoch 3091 2024-10-21 20:09:07,508 INFO [train.py:561] (1/4) Epoch 3091, batch 0, global_batch_idx: 49440, batch size: 108, loss[dur_loss=0.1899, prior_loss=0.9723, diff_loss=0.3118, tot_loss=1.474, over 108.00 samples.], tot_loss[dur_loss=0.1899, prior_loss=0.9723, diff_loss=0.3118, tot_loss=1.474, over 108.00 samples.], 2024-10-21 20:09:21,696 INFO [train.py:561] (1/4) Epoch 3091, batch 10, global_batch_idx: 49450, batch size: 111, loss[dur_loss=0.1864, prior_loss=0.9729, diff_loss=0.2955, tot_loss=1.455, over 111.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9715, diff_loss=0.3603, tot_loss=1.518, over 1656.00 samples.], 2024-10-21 20:09:28,767 INFO [train.py:682] (1/4) Start epoch 3092 2024-10-21 20:09:42,496 INFO [train.py:561] (1/4) Epoch 3092, batch 4, global_batch_idx: 49460, batch size: 189, loss[dur_loss=0.1862, prior_loss=0.9718, diff_loss=0.3237, tot_loss=1.482, over 189.00 samples.], tot_loss[dur_loss=0.1851, prior_loss=0.9711, diff_loss=0.4217, tot_loss=1.578, over 937.00 samples.], 2024-10-21 20:09:57,379 INFO [train.py:561] (1/4) Epoch 3092, batch 14, global_batch_idx: 49470, batch size: 142, loss[dur_loss=0.1895, prior_loss=0.9718, diff_loss=0.3073, tot_loss=1.469, over 142.00 samples.], tot_loss[dur_loss=0.1871, prior_loss=0.9716, diff_loss=0.3462, tot_loss=1.505, over 2210.00 samples.], 2024-10-21 20:09:58,801 INFO [train.py:682] (1/4) Start epoch 3093 2024-10-21 20:10:18,647 INFO [train.py:561] (1/4) Epoch 3093, batch 8, global_batch_idx: 49480, batch size: 170, loss[dur_loss=0.1907, prior_loss=0.9722, diff_loss=0.2869, tot_loss=1.45, over 
170.00 samples.], tot_loss[dur_loss=0.1859, prior_loss=0.9713, diff_loss=0.3622, tot_loss=1.519, over 1432.00 samples.], 2024-10-21 20:10:28,757 INFO [train.py:682] (1/4) Start epoch 3094 2024-10-21 20:10:40,237 INFO [train.py:561] (1/4) Epoch 3094, batch 2, global_batch_idx: 49490, batch size: 203, loss[dur_loss=0.1891, prior_loss=0.9718, diff_loss=0.3544, tot_loss=1.515, over 203.00 samples.], tot_loss[dur_loss=0.1886, prior_loss=0.9718, diff_loss=0.3261, tot_loss=1.487, over 442.00 samples.], 2024-10-21 20:10:54,454 INFO [train.py:561] (1/4) Epoch 3094, batch 12, global_batch_idx: 49500, batch size: 152, loss[dur_loss=0.1864, prior_loss=0.9721, diff_loss=0.3164, tot_loss=1.475, over 152.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9715, diff_loss=0.3652, tot_loss=1.523, over 1966.00 samples.], 2024-10-21 20:10:56,082 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 20:11:39,565 INFO [train.py:589] (1/4) Epoch 3094, validation: dur_loss=0.4601, prior_loss=1.037, diff_loss=0.3703, tot_loss=1.867, over 100.00 samples. 2024-10-21 20:11:39,567 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 20:11:42,403 INFO [train.py:682] (1/4) Start epoch 3095 2024-10-21 20:11:59,545 INFO [train.py:561] (1/4) Epoch 3095, batch 6, global_batch_idx: 49510, batch size: 106, loss[dur_loss=0.1855, prior_loss=0.9717, diff_loss=0.3065, tot_loss=1.464, over 106.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.9711, diff_loss=0.3966, tot_loss=1.552, over 1142.00 samples.], 2024-10-21 20:12:12,570 INFO [train.py:682] (1/4) Start epoch 3096 2024-10-21 20:12:21,644 INFO [train.py:561] (1/4) Epoch 3096, batch 0, global_batch_idx: 49520, batch size: 108, loss[dur_loss=0.1915, prior_loss=0.9723, diff_loss=0.2967, tot_loss=1.46, over 108.00 samples.], tot_loss[dur_loss=0.1915, prior_loss=0.9723, diff_loss=0.2967, tot_loss=1.46, over 108.00 samples.], 2024-10-21 20:12:35,862 INFO [train.py:561] (1/4) Epoch 3096, batch 10, global_batch_idx: 49530, batch size: 111, loss[dur_loss=0.1901, prior_loss=0.9728, diff_loss=0.3441, tot_loss=1.507, over 111.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9714, diff_loss=0.3607, tot_loss=1.519, over 1656.00 samples.], 2024-10-21 20:12:42,937 INFO [train.py:682] (1/4) Start epoch 3097 2024-10-21 20:12:56,861 INFO [train.py:561] (1/4) Epoch 3097, batch 4, global_batch_idx: 49540, batch size: 189, loss[dur_loss=0.1856, prior_loss=0.9717, diff_loss=0.3138, tot_loss=1.471, over 189.00 samples.], tot_loss[dur_loss=0.1837, prior_loss=0.9709, diff_loss=0.4135, tot_loss=1.568, over 937.00 samples.], 2024-10-21 20:13:11,731 INFO [train.py:561] (1/4) Epoch 3097, batch 14, global_batch_idx: 49550, batch size: 142, loss[dur_loss=0.1891, prior_loss=0.9714, diff_loss=0.2821, tot_loss=1.443, over 142.00 samples.], tot_loss[dur_loss=0.1863, prior_loss=0.9715, diff_loss=0.3464, tot_loss=1.504, over 2210.00 samples.], 2024-10-21 20:13:13,160 INFO [train.py:682] (1/4) Start epoch 3098 2024-10-21 20:13:33,252 INFO [train.py:561] (1/4) Epoch 3098, batch 8, global_batch_idx: 49560, batch size: 170, loss[dur_loss=0.1893, prior_loss=0.9721, diff_loss=0.3125, tot_loss=1.474, over 170.00 samples.], tot_loss[dur_loss=0.1864, prior_loss=0.9713, diff_loss=0.366, tot_loss=1.524, over 1432.00 samples.], 2024-10-21 20:13:43,496 INFO [train.py:682] (1/4) Start epoch 3099 2024-10-21 20:13:55,172 INFO [train.py:561] (1/4) Epoch 3099, batch 2, global_batch_idx: 49570, batch size: 203, loss[dur_loss=0.1908, prior_loss=0.9718, diff_loss=0.3295, tot_loss=1.492, 
over 203.00 samples.], tot_loss[dur_loss=0.1894, prior_loss=0.9718, diff_loss=0.3031, tot_loss=1.464, over 442.00 samples.], 2024-10-21 20:14:09,423 INFO [train.py:561] (1/4) Epoch 3099, batch 12, global_batch_idx: 49580, batch size: 152, loss[dur_loss=0.1868, prior_loss=0.9719, diff_loss=0.2988, tot_loss=1.458, over 152.00 samples.], tot_loss[dur_loss=0.1859, prior_loss=0.9714, diff_loss=0.3493, tot_loss=1.507, over 1966.00 samples.], 2024-10-21 20:14:13,905 INFO [train.py:682] (1/4) Start epoch 3100 2024-10-21 20:14:31,084 INFO [train.py:561] (1/4) Epoch 3100, batch 6, global_batch_idx: 49590, batch size: 106, loss[dur_loss=0.1888, prior_loss=0.9716, diff_loss=0.2347, tot_loss=1.395, over 106.00 samples.], tot_loss[dur_loss=0.1857, prior_loss=0.9711, diff_loss=0.3771, tot_loss=1.534, over 1142.00 samples.], 2024-10-21 20:14:44,220 INFO [train.py:682] (1/4) Start epoch 3101 2024-10-21 20:14:53,524 INFO [train.py:561] (1/4) Epoch 3101, batch 0, global_batch_idx: 49600, batch size: 108, loss[dur_loss=0.19, prior_loss=0.9722, diff_loss=0.2966, tot_loss=1.459, over 108.00 samples.], tot_loss[dur_loss=0.19, prior_loss=0.9722, diff_loss=0.2966, tot_loss=1.459, over 108.00 samples.], 2024-10-21 20:15:07,738 INFO [train.py:561] (1/4) Epoch 3101, batch 10, global_batch_idx: 49610, batch size: 111, loss[dur_loss=0.1882, prior_loss=0.9728, diff_loss=0.2594, tot_loss=1.42, over 111.00 samples.], tot_loss[dur_loss=0.1852, prior_loss=0.9714, diff_loss=0.3687, tot_loss=1.525, over 1656.00 samples.], 2024-10-21 20:15:14,776 INFO [train.py:682] (1/4) Start epoch 3102 2024-10-21 20:15:28,571 INFO [train.py:561] (1/4) Epoch 3102, batch 4, global_batch_idx: 49620, batch size: 189, loss[dur_loss=0.1875, prior_loss=0.9716, diff_loss=0.3315, tot_loss=1.491, over 189.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.9709, diff_loss=0.4205, tot_loss=1.575, over 937.00 samples.], 2024-10-21 20:15:43,394 INFO [train.py:561] (1/4) Epoch 3102, batch 14, global_batch_idx: 49630, batch size: 142, loss[dur_loss=0.1906, prior_loss=0.9715, diff_loss=0.3174, tot_loss=1.48, over 142.00 samples.], tot_loss[dur_loss=0.1861, prior_loss=0.9714, diff_loss=0.3528, tot_loss=1.51, over 2210.00 samples.], 2024-10-21 20:15:44,812 INFO [train.py:682] (1/4) Start epoch 3103 2024-10-21 20:16:04,717 INFO [train.py:561] (1/4) Epoch 3103, batch 8, global_batch_idx: 49640, batch size: 170, loss[dur_loss=0.1898, prior_loss=0.9722, diff_loss=0.3476, tot_loss=1.51, over 170.00 samples.], tot_loss[dur_loss=0.1861, prior_loss=0.9713, diff_loss=0.3842, tot_loss=1.542, over 1432.00 samples.], 2024-10-21 20:16:14,759 INFO [train.py:682] (1/4) Start epoch 3104 2024-10-21 20:16:26,377 INFO [train.py:561] (1/4) Epoch 3104, batch 2, global_batch_idx: 49650, batch size: 203, loss[dur_loss=0.1884, prior_loss=0.9718, diff_loss=0.3134, tot_loss=1.474, over 203.00 samples.], tot_loss[dur_loss=0.1883, prior_loss=0.9718, diff_loss=0.312, tot_loss=1.472, over 442.00 samples.], 2024-10-21 20:16:40,649 INFO [train.py:561] (1/4) Epoch 3104, batch 12, global_batch_idx: 49660, batch size: 152, loss[dur_loss=0.1905, prior_loss=0.9718, diff_loss=0.3197, tot_loss=1.482, over 152.00 samples.], tot_loss[dur_loss=0.1866, prior_loss=0.9714, diff_loss=0.3548, tot_loss=1.513, over 1966.00 samples.], 2024-10-21 20:16:45,107 INFO [train.py:682] (1/4) Start epoch 3105 2024-10-21 20:17:02,123 INFO [train.py:561] (1/4) Epoch 3105, batch 6, global_batch_idx: 49670, batch size: 106, loss[dur_loss=0.1885, prior_loss=0.9716, diff_loss=0.2988, tot_loss=1.459, over 106.00 
samples.], tot_loss[dur_loss=0.1845, prior_loss=0.9711, diff_loss=0.3928, tot_loss=1.548, over 1142.00 samples.], 2024-10-21 20:17:15,043 INFO [train.py:682] (1/4) Start epoch 3106 2024-10-21 20:17:23,978 INFO [train.py:561] (1/4) Epoch 3106, batch 0, global_batch_idx: 49680, batch size: 108, loss[dur_loss=0.1896, prior_loss=0.9722, diff_loss=0.3182, tot_loss=1.48, over 108.00 samples.], tot_loss[dur_loss=0.1896, prior_loss=0.9722, diff_loss=0.3182, tot_loss=1.48, over 108.00 samples.], 2024-10-21 20:17:38,144 INFO [train.py:561] (1/4) Epoch 3106, batch 10, global_batch_idx: 49690, batch size: 111, loss[dur_loss=0.1855, prior_loss=0.9726, diff_loss=0.2562, tot_loss=1.414, over 111.00 samples.], tot_loss[dur_loss=0.185, prior_loss=0.9713, diff_loss=0.3648, tot_loss=1.521, over 1656.00 samples.], 2024-10-21 20:17:45,199 INFO [train.py:682] (1/4) Start epoch 3107 2024-10-21 20:17:58,914 INFO [train.py:561] (1/4) Epoch 3107, batch 4, global_batch_idx: 49700, batch size: 189, loss[dur_loss=0.1868, prior_loss=0.9717, diff_loss=0.3296, tot_loss=1.488, over 189.00 samples.], tot_loss[dur_loss=0.1834, prior_loss=0.9709, diff_loss=0.4004, tot_loss=1.555, over 937.00 samples.], 2024-10-21 20:18:13,623 INFO [train.py:561] (1/4) Epoch 3107, batch 14, global_batch_idx: 49710, batch size: 142, loss[dur_loss=0.1873, prior_loss=0.9714, diff_loss=0.2931, tot_loss=1.452, over 142.00 samples.], tot_loss[dur_loss=0.186, prior_loss=0.9715, diff_loss=0.3401, tot_loss=1.498, over 2210.00 samples.], 2024-10-21 20:18:15,031 INFO [train.py:682] (1/4) Start epoch 3108 2024-10-21 20:18:34,916 INFO [train.py:561] (1/4) Epoch 3108, batch 8, global_batch_idx: 49720, batch size: 170, loss[dur_loss=0.1914, prior_loss=0.9717, diff_loss=0.2849, tot_loss=1.448, over 170.00 samples.], tot_loss[dur_loss=0.1849, prior_loss=0.9712, diff_loss=0.3653, tot_loss=1.521, over 1432.00 samples.], 2024-10-21 20:18:45,062 INFO [train.py:682] (1/4) Start epoch 3109 2024-10-21 20:18:56,733 INFO [train.py:561] (1/4) Epoch 3109, batch 2, global_batch_idx: 49730, batch size: 203, loss[dur_loss=0.1894, prior_loss=0.9718, diff_loss=0.3035, tot_loss=1.465, over 203.00 samples.], tot_loss[dur_loss=0.1881, prior_loss=0.9719, diff_loss=0.2919, tot_loss=1.452, over 442.00 samples.], 2024-10-21 20:19:10,901 INFO [train.py:561] (1/4) Epoch 3109, batch 12, global_batch_idx: 49740, batch size: 152, loss[dur_loss=0.1863, prior_loss=0.9717, diff_loss=0.3006, tot_loss=1.459, over 152.00 samples.], tot_loss[dur_loss=0.1854, prior_loss=0.9714, diff_loss=0.3472, tot_loss=1.504, over 1966.00 samples.], 2024-10-21 20:19:15,331 INFO [train.py:682] (1/4) Start epoch 3110 2024-10-21 20:19:32,605 INFO [train.py:561] (1/4) Epoch 3110, batch 6, global_batch_idx: 49750, batch size: 106, loss[dur_loss=0.1853, prior_loss=0.9717, diff_loss=0.2819, tot_loss=1.439, over 106.00 samples.], tot_loss[dur_loss=0.1855, prior_loss=0.971, diff_loss=0.3916, tot_loss=1.548, over 1142.00 samples.], 2024-10-21 20:19:45,624 INFO [train.py:682] (1/4) Start epoch 3111 2024-10-21 20:19:54,296 INFO [train.py:561] (1/4) Epoch 3111, batch 0, global_batch_idx: 49760, batch size: 108, loss[dur_loss=0.1894, prior_loss=0.9721, diff_loss=0.2833, tot_loss=1.445, over 108.00 samples.], tot_loss[dur_loss=0.1894, prior_loss=0.9721, diff_loss=0.2833, tot_loss=1.445, over 108.00 samples.], 2024-10-21 20:20:08,557 INFO [train.py:561] (1/4) Epoch 3111, batch 10, global_batch_idx: 49770, batch size: 111, loss[dur_loss=0.1894, prior_loss=0.9728, diff_loss=0.3327, tot_loss=1.495, over 111.00 samples.], 
tot_loss[dur_loss=0.1859, prior_loss=0.9714, diff_loss=0.3674, tot_loss=1.525, over 1656.00 samples.], 2024-10-21 20:20:15,677 INFO [train.py:682] (1/4) Start epoch 3112 2024-10-21 20:20:29,956 INFO [train.py:561] (1/4) Epoch 3112, batch 4, global_batch_idx: 49780, batch size: 189, loss[dur_loss=0.1894, prior_loss=0.9718, diff_loss=0.3085, tot_loss=1.47, over 189.00 samples.], tot_loss[dur_loss=0.1836, prior_loss=0.971, diff_loss=0.4274, tot_loss=1.582, over 937.00 samples.], 2024-10-21 20:20:44,979 INFO [train.py:561] (1/4) Epoch 3112, batch 14, global_batch_idx: 49790, batch size: 142, loss[dur_loss=0.1883, prior_loss=0.9713, diff_loss=0.3134, tot_loss=1.473, over 142.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9715, diff_loss=0.3517, tot_loss=1.51, over 2210.00 samples.], 2024-10-21 20:20:46,417 INFO [train.py:682] (1/4) Start epoch 3113 2024-10-21 20:21:06,665 INFO [train.py:561] (1/4) Epoch 3113, batch 8, global_batch_idx: 49800, batch size: 170, loss[dur_loss=0.1911, prior_loss=0.9718, diff_loss=0.3161, tot_loss=1.479, over 170.00 samples.], tot_loss[dur_loss=0.1869, prior_loss=0.9714, diff_loss=0.3703, tot_loss=1.529, over 1432.00 samples.], 2024-10-21 20:21:16,936 INFO [train.py:682] (1/4) Start epoch 3114 2024-10-21 20:21:28,558 INFO [train.py:561] (1/4) Epoch 3114, batch 2, global_batch_idx: 49810, batch size: 203, loss[dur_loss=0.1907, prior_loss=0.9717, diff_loss=0.3323, tot_loss=1.495, over 203.00 samples.], tot_loss[dur_loss=0.1896, prior_loss=0.9718, diff_loss=0.3139, tot_loss=1.475, over 442.00 samples.], 2024-10-21 20:21:42,960 INFO [train.py:561] (1/4) Epoch 3114, batch 12, global_batch_idx: 49820, batch size: 152, loss[dur_loss=0.1846, prior_loss=0.9719, diff_loss=0.3048, tot_loss=1.461, over 152.00 samples.], tot_loss[dur_loss=0.1867, prior_loss=0.9716, diff_loss=0.3562, tot_loss=1.514, over 1966.00 samples.], 2024-10-21 20:21:47,451 INFO [train.py:682] (1/4) Start epoch 3115 2024-10-21 20:22:04,764 INFO [train.py:561] (1/4) Epoch 3115, batch 6, global_batch_idx: 49830, batch size: 106, loss[dur_loss=0.1874, prior_loss=0.9716, diff_loss=0.2872, tot_loss=1.446, over 106.00 samples.], tot_loss[dur_loss=0.1861, prior_loss=0.9712, diff_loss=0.3997, tot_loss=1.557, over 1142.00 samples.], 2024-10-21 20:22:17,923 INFO [train.py:682] (1/4) Start epoch 3116 2024-10-21 20:22:26,709 INFO [train.py:561] (1/4) Epoch 3116, batch 0, global_batch_idx: 49840, batch size: 108, loss[dur_loss=0.1907, prior_loss=0.9723, diff_loss=0.2784, tot_loss=1.441, over 108.00 samples.], tot_loss[dur_loss=0.1907, prior_loss=0.9723, diff_loss=0.2784, tot_loss=1.441, over 108.00 samples.], 2024-10-21 20:22:41,069 INFO [train.py:561] (1/4) Epoch 3116, batch 10, global_batch_idx: 49850, batch size: 111, loss[dur_loss=0.1903, prior_loss=0.9727, diff_loss=0.3092, tot_loss=1.472, over 111.00 samples.], tot_loss[dur_loss=0.1857, prior_loss=0.9714, diff_loss=0.355, tot_loss=1.512, over 1656.00 samples.], 2024-10-21 20:22:48,167 INFO [train.py:682] (1/4) Start epoch 3117 2024-10-21 20:23:02,273 INFO [train.py:561] (1/4) Epoch 3117, batch 4, global_batch_idx: 49860, batch size: 189, loss[dur_loss=0.1907, prior_loss=0.9719, diff_loss=0.3111, tot_loss=1.474, over 189.00 samples.], tot_loss[dur_loss=0.185, prior_loss=0.971, diff_loss=0.4228, tot_loss=1.579, over 937.00 samples.], 2024-10-21 20:23:17,186 INFO [train.py:561] (1/4) Epoch 3117, batch 14, global_batch_idx: 49870, batch size: 142, loss[dur_loss=0.1867, prior_loss=0.9713, diff_loss=0.2849, tot_loss=1.443, over 142.00 samples.], 
tot_loss[dur_loss=0.187, prior_loss=0.9715, diff_loss=0.347, tot_loss=1.506, over 2210.00 samples.], 2024-10-21 20:23:18,620 INFO [train.py:682] (1/4) Start epoch 3118 2024-10-21 20:23:39,071 INFO [train.py:561] (1/4) Epoch 3118, batch 8, global_batch_idx: 49880, batch size: 170, loss[dur_loss=0.1951, prior_loss=0.9722, diff_loss=0.3313, tot_loss=1.499, over 170.00 samples.], tot_loss[dur_loss=0.1868, prior_loss=0.9714, diff_loss=0.3759, tot_loss=1.534, over 1432.00 samples.], 2024-10-21 20:23:49,243 INFO [train.py:682] (1/4) Start epoch 3119 2024-10-21 20:24:00,624 INFO [train.py:561] (1/4) Epoch 3119, batch 2, global_batch_idx: 49890, batch size: 203, loss[dur_loss=0.1891, prior_loss=0.9719, diff_loss=0.3446, tot_loss=1.506, over 203.00 samples.], tot_loss[dur_loss=0.1885, prior_loss=0.9719, diff_loss=0.3194, tot_loss=1.48, over 442.00 samples.], 2024-10-21 20:24:14,991 INFO [train.py:561] (1/4) Epoch 3119, batch 12, global_batch_idx: 49900, batch size: 152, loss[dur_loss=0.1849, prior_loss=0.972, diff_loss=0.2772, tot_loss=1.434, over 152.00 samples.], tot_loss[dur_loss=0.1869, prior_loss=0.9715, diff_loss=0.3563, tot_loss=1.515, over 1966.00 samples.], 2024-10-21 20:24:19,488 INFO [train.py:682] (1/4) Start epoch 3120 2024-10-21 20:24:36,746 INFO [train.py:561] (1/4) Epoch 3120, batch 6, global_batch_idx: 49910, batch size: 106, loss[dur_loss=0.1846, prior_loss=0.9716, diff_loss=0.3081, tot_loss=1.464, over 106.00 samples.], tot_loss[dur_loss=0.1849, prior_loss=0.9711, diff_loss=0.3989, tot_loss=1.555, over 1142.00 samples.], 2024-10-21 20:24:49,965 INFO [train.py:682] (1/4) Start epoch 3121 2024-10-21 20:24:58,728 INFO [train.py:561] (1/4) Epoch 3121, batch 0, global_batch_idx: 49920, batch size: 108, loss[dur_loss=0.1892, prior_loss=0.9719, diff_loss=0.2848, tot_loss=1.446, over 108.00 samples.], tot_loss[dur_loss=0.1892, prior_loss=0.9719, diff_loss=0.2848, tot_loss=1.446, over 108.00 samples.], 2024-10-21 20:25:13,164 INFO [train.py:561] (1/4) Epoch 3121, batch 10, global_batch_idx: 49930, batch size: 111, loss[dur_loss=0.1861, prior_loss=0.9728, diff_loss=0.2694, tot_loss=1.428, over 111.00 samples.], tot_loss[dur_loss=0.1851, prior_loss=0.9714, diff_loss=0.3629, tot_loss=1.519, over 1656.00 samples.], 2024-10-21 20:25:20,298 INFO [train.py:682] (1/4) Start epoch 3122 2024-10-21 20:25:34,074 INFO [train.py:561] (1/4) Epoch 3122, batch 4, global_batch_idx: 49940, batch size: 189, loss[dur_loss=0.1868, prior_loss=0.9718, diff_loss=0.3049, tot_loss=1.463, over 189.00 samples.], tot_loss[dur_loss=0.1843, prior_loss=0.971, diff_loss=0.4133, tot_loss=1.568, over 937.00 samples.], 2024-10-21 20:25:49,092 INFO [train.py:561] (1/4) Epoch 3122, batch 14, global_batch_idx: 49950, batch size: 142, loss[dur_loss=0.1867, prior_loss=0.9717, diff_loss=0.2795, tot_loss=1.438, over 142.00 samples.], tot_loss[dur_loss=0.1868, prior_loss=0.9715, diff_loss=0.3423, tot_loss=1.501, over 2210.00 samples.], 2024-10-21 20:25:50,525 INFO [train.py:682] (1/4) Start epoch 3123 2024-10-21 20:26:10,935 INFO [train.py:561] (1/4) Epoch 3123, batch 8, global_batch_idx: 49960, batch size: 170, loss[dur_loss=0.1908, prior_loss=0.9718, diff_loss=0.3036, tot_loss=1.466, over 170.00 samples.], tot_loss[dur_loss=0.1854, prior_loss=0.9712, diff_loss=0.3701, tot_loss=1.527, over 1432.00 samples.], 2024-10-21 20:26:21,094 INFO [train.py:682] (1/4) Start epoch 3124 2024-10-21 20:26:32,641 INFO [train.py:561] (1/4) Epoch 3124, batch 2, global_batch_idx: 49970, batch size: 203, loss[dur_loss=0.1896, prior_loss=0.9718, 
diff_loss=0.3406, tot_loss=1.502, over 203.00 samples.], tot_loss[dur_loss=0.1882, prior_loss=0.9718, diff_loss=0.3096, tot_loss=1.47, over 442.00 samples.], 2024-10-21 20:26:47,104 INFO [train.py:561] (1/4) Epoch 3124, batch 12, global_batch_idx: 49980, batch size: 152, loss[dur_loss=0.1874, prior_loss=0.9721, diff_loss=0.3412, tot_loss=1.501, over 152.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9716, diff_loss=0.3606, tot_loss=1.519, over 1966.00 samples.], 2024-10-21 20:26:51,583 INFO [train.py:682] (1/4) Start epoch 3125 2024-10-21 20:27:08,724 INFO [train.py:561] (1/4) Epoch 3125, batch 6, global_batch_idx: 49990, batch size: 106, loss[dur_loss=0.1909, prior_loss=0.9718, diff_loss=0.2746, tot_loss=1.437, over 106.00 samples.], tot_loss[dur_loss=0.1841, prior_loss=0.9711, diff_loss=0.3802, tot_loss=1.536, over 1142.00 samples.], 2024-10-21 20:27:21,850 INFO [train.py:682] (1/4) Start epoch 3126 2024-10-21 20:27:31,070 INFO [train.py:561] (1/4) Epoch 3126, batch 0, global_batch_idx: 50000, batch size: 108, loss[dur_loss=0.1909, prior_loss=0.9721, diff_loss=0.3037, tot_loss=1.467, over 108.00 samples.], tot_loss[dur_loss=0.1909, prior_loss=0.9721, diff_loss=0.3037, tot_loss=1.467, over 108.00 samples.], 2024-10-21 20:27:45,457 INFO [train.py:561] (1/4) Epoch 3126, batch 10, global_batch_idx: 50010, batch size: 111, loss[dur_loss=0.1896, prior_loss=0.9727, diff_loss=0.3045, tot_loss=1.467, over 111.00 samples.], tot_loss[dur_loss=0.1866, prior_loss=0.9715, diff_loss=0.3636, tot_loss=1.522, over 1656.00 samples.], 2024-10-21 20:27:52,647 INFO [train.py:682] (1/4) Start epoch 3127 2024-10-21 20:28:06,438 INFO [train.py:561] (1/4) Epoch 3127, batch 4, global_batch_idx: 50020, batch size: 189, loss[dur_loss=0.1863, prior_loss=0.9719, diff_loss=0.3052, tot_loss=1.463, over 189.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.971, diff_loss=0.4115, tot_loss=1.566, over 937.00 samples.], 2024-10-21 20:28:21,447 INFO [train.py:561] (1/4) Epoch 3127, batch 14, global_batch_idx: 50030, batch size: 142, loss[dur_loss=0.1862, prior_loss=0.9714, diff_loss=0.245, tot_loss=1.403, over 142.00 samples.], tot_loss[dur_loss=0.1864, prior_loss=0.9715, diff_loss=0.3365, tot_loss=1.494, over 2210.00 samples.], 2024-10-21 20:28:22,885 INFO [train.py:682] (1/4) Start epoch 3128 2024-10-21 20:28:43,110 INFO [train.py:561] (1/4) Epoch 3128, batch 8, global_batch_idx: 50040, batch size: 170, loss[dur_loss=0.1901, prior_loss=0.9723, diff_loss=0.3028, tot_loss=1.465, over 170.00 samples.], tot_loss[dur_loss=0.1869, prior_loss=0.9714, diff_loss=0.3686, tot_loss=1.527, over 1432.00 samples.], 2024-10-21 20:28:53,355 INFO [train.py:682] (1/4) Start epoch 3129 2024-10-21 20:29:05,188 INFO [train.py:561] (1/4) Epoch 3129, batch 2, global_batch_idx: 50050, batch size: 203, loss[dur_loss=0.1878, prior_loss=0.9719, diff_loss=0.3203, tot_loss=1.48, over 203.00 samples.], tot_loss[dur_loss=0.1879, prior_loss=0.9719, diff_loss=0.3064, tot_loss=1.466, over 442.00 samples.], 2024-10-21 20:29:19,531 INFO [train.py:561] (1/4) Epoch 3129, batch 12, global_batch_idx: 50060, batch size: 152, loss[dur_loss=0.19, prior_loss=0.9721, diff_loss=0.2987, tot_loss=1.461, over 152.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9716, diff_loss=0.3474, tot_loss=1.505, over 1966.00 samples.], 2024-10-21 20:29:23,980 INFO [train.py:682] (1/4) Start epoch 3130 2024-10-21 20:29:41,171 INFO [train.py:561] (1/4) Epoch 3130, batch 6, global_batch_idx: 50070, batch size: 106, loss[dur_loss=0.1835, prior_loss=0.9717, diff_loss=0.3018, 
tot_loss=1.457, over 106.00 samples.], tot_loss[dur_loss=0.185, prior_loss=0.9712, diff_loss=0.385, tot_loss=1.541, over 1142.00 samples.], 2024-10-21 20:29:54,298 INFO [train.py:682] (1/4) Start epoch 3131 2024-10-21 20:30:03,131 INFO [train.py:561] (1/4) Epoch 3131, batch 0, global_batch_idx: 50080, batch size: 108, loss[dur_loss=0.1915, prior_loss=0.9723, diff_loss=0.3424, tot_loss=1.506, over 108.00 samples.], tot_loss[dur_loss=0.1915, prior_loss=0.9723, diff_loss=0.3424, tot_loss=1.506, over 108.00 samples.], 2024-10-21 20:30:17,477 INFO [train.py:561] (1/4) Epoch 3131, batch 10, global_batch_idx: 50090, batch size: 111, loss[dur_loss=0.1908, prior_loss=0.973, diff_loss=0.3147, tot_loss=1.478, over 111.00 samples.], tot_loss[dur_loss=0.1883, prior_loss=0.9716, diff_loss=0.3612, tot_loss=1.521, over 1656.00 samples.], 2024-10-21 20:30:24,604 INFO [train.py:682] (1/4) Start epoch 3132 2024-10-21 20:30:38,496 INFO [train.py:561] (1/4) Epoch 3132, batch 4, global_batch_idx: 50100, batch size: 189, loss[dur_loss=0.1929, prior_loss=0.9718, diff_loss=0.3512, tot_loss=1.516, over 189.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.971, diff_loss=0.4292, tot_loss=1.587, over 937.00 samples.], 2024-10-21 20:30:53,471 INFO [train.py:561] (1/4) Epoch 3132, batch 14, global_batch_idx: 50110, batch size: 142, loss[dur_loss=0.1909, prior_loss=0.9717, diff_loss=0.2852, tot_loss=1.448, over 142.00 samples.], tot_loss[dur_loss=0.1879, prior_loss=0.9716, diff_loss=0.3569, tot_loss=1.516, over 2210.00 samples.], 2024-10-21 20:30:54,902 INFO [train.py:682] (1/4) Start epoch 3133 2024-10-21 20:31:14,930 INFO [train.py:561] (1/4) Epoch 3133, batch 8, global_batch_idx: 50120, batch size: 170, loss[dur_loss=0.189, prior_loss=0.9718, diff_loss=0.3053, tot_loss=1.466, over 170.00 samples.], tot_loss[dur_loss=0.1858, prior_loss=0.9713, diff_loss=0.3728, tot_loss=1.53, over 1432.00 samples.], 2024-10-21 20:31:25,122 INFO [train.py:682] (1/4) Start epoch 3134 2024-10-21 20:31:36,696 INFO [train.py:561] (1/4) Epoch 3134, batch 2, global_batch_idx: 50130, batch size: 203, loss[dur_loss=0.1898, prior_loss=0.9717, diff_loss=0.3065, tot_loss=1.468, over 203.00 samples.], tot_loss[dur_loss=0.1885, prior_loss=0.9717, diff_loss=0.2892, tot_loss=1.449, over 442.00 samples.], 2024-10-21 20:31:51,055 INFO [train.py:561] (1/4) Epoch 3134, batch 12, global_batch_idx: 50140, batch size: 152, loss[dur_loss=0.1866, prior_loss=0.9717, diff_loss=0.2767, tot_loss=1.435, over 152.00 samples.], tot_loss[dur_loss=0.1869, prior_loss=0.9715, diff_loss=0.3443, tot_loss=1.503, over 1966.00 samples.], 2024-10-21 20:31:55,488 INFO [train.py:682] (1/4) Start epoch 3135 2024-10-21 20:32:13,078 INFO [train.py:561] (1/4) Epoch 3135, batch 6, global_batch_idx: 50150, batch size: 106, loss[dur_loss=0.1848, prior_loss=0.9716, diff_loss=0.2535, tot_loss=1.41, over 106.00 samples.], tot_loss[dur_loss=0.1842, prior_loss=0.9712, diff_loss=0.3958, tot_loss=1.551, over 1142.00 samples.], 2024-10-21 20:32:26,269 INFO [train.py:682] (1/4) Start epoch 3136 2024-10-21 20:32:35,045 INFO [train.py:561] (1/4) Epoch 3136, batch 0, global_batch_idx: 50160, batch size: 108, loss[dur_loss=0.1898, prior_loss=0.9722, diff_loss=0.2644, tot_loss=1.426, over 108.00 samples.], tot_loss[dur_loss=0.1898, prior_loss=0.9722, diff_loss=0.2644, tot_loss=1.426, over 108.00 samples.], 2024-10-21 20:32:49,328 INFO [train.py:561] (1/4) Epoch 3136, batch 10, global_batch_idx: 50170, batch size: 111, loss[dur_loss=0.1861, prior_loss=0.9726, diff_loss=0.2955, tot_loss=1.454, 
over 111.00 samples.], tot_loss[dur_loss=0.185, prior_loss=0.9714, diff_loss=0.3624, tot_loss=1.519, over 1656.00 samples.], 2024-10-21 20:32:56,479 INFO [train.py:682] (1/4) Start epoch 3137 2024-10-21 20:33:10,113 INFO [train.py:561] (1/4) Epoch 3137, batch 4, global_batch_idx: 50180, batch size: 189, loss[dur_loss=0.1871, prior_loss=0.9717, diff_loss=0.2852, tot_loss=1.444, over 189.00 samples.], tot_loss[dur_loss=0.1837, prior_loss=0.9709, diff_loss=0.417, tot_loss=1.572, over 937.00 samples.], 2024-10-21 20:33:25,146 INFO [train.py:561] (1/4) Epoch 3137, batch 14, global_batch_idx: 50190, batch size: 142, loss[dur_loss=0.1911, prior_loss=0.9716, diff_loss=0.3279, tot_loss=1.491, over 142.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9715, diff_loss=0.3507, tot_loss=1.509, over 2210.00 samples.], 2024-10-21 20:33:26,590 INFO [train.py:682] (1/4) Start epoch 3138 2024-10-21 20:33:46,889 INFO [train.py:561] (1/4) Epoch 3138, batch 8, global_batch_idx: 50200, batch size: 170, loss[dur_loss=0.1891, prior_loss=0.9719, diff_loss=0.3284, tot_loss=1.489, over 170.00 samples.], tot_loss[dur_loss=0.1853, prior_loss=0.9713, diff_loss=0.3764, tot_loss=1.533, over 1432.00 samples.], 2024-10-21 20:33:57,082 INFO [train.py:682] (1/4) Start epoch 3139 2024-10-21 20:34:08,470 INFO [train.py:561] (1/4) Epoch 3139, batch 2, global_batch_idx: 50210, batch size: 203, loss[dur_loss=0.1872, prior_loss=0.9716, diff_loss=0.3024, tot_loss=1.461, over 203.00 samples.], tot_loss[dur_loss=0.1879, prior_loss=0.9716, diff_loss=0.3058, tot_loss=1.465, over 442.00 samples.], 2024-10-21 20:34:22,899 INFO [train.py:561] (1/4) Epoch 3139, batch 12, global_batch_idx: 50220, batch size: 152, loss[dur_loss=0.1868, prior_loss=0.9721, diff_loss=0.309, tot_loss=1.468, over 152.00 samples.], tot_loss[dur_loss=0.1867, prior_loss=0.9714, diff_loss=0.3502, tot_loss=1.508, over 1966.00 samples.], 2024-10-21 20:34:27,402 INFO [train.py:682] (1/4) Start epoch 3140 2024-10-21 20:34:44,510 INFO [train.py:561] (1/4) Epoch 3140, batch 6, global_batch_idx: 50230, batch size: 106, loss[dur_loss=0.187, prior_loss=0.9717, diff_loss=0.265, tot_loss=1.424, over 106.00 samples.], tot_loss[dur_loss=0.1843, prior_loss=0.9711, diff_loss=0.3994, tot_loss=1.555, over 1142.00 samples.], 2024-10-21 20:34:57,687 INFO [train.py:682] (1/4) Start epoch 3141 2024-10-21 20:35:06,306 INFO [train.py:561] (1/4) Epoch 3141, batch 0, global_batch_idx: 50240, batch size: 108, loss[dur_loss=0.1907, prior_loss=0.9722, diff_loss=0.2835, tot_loss=1.446, over 108.00 samples.], tot_loss[dur_loss=0.1907, prior_loss=0.9722, diff_loss=0.2835, tot_loss=1.446, over 108.00 samples.], 2024-10-21 20:35:20,662 INFO [train.py:561] (1/4) Epoch 3141, batch 10, global_batch_idx: 50250, batch size: 111, loss[dur_loss=0.189, prior_loss=0.9725, diff_loss=0.2895, tot_loss=1.451, over 111.00 samples.], tot_loss[dur_loss=0.1861, prior_loss=0.9713, diff_loss=0.3577, tot_loss=1.515, over 1656.00 samples.], 2024-10-21 20:35:27,819 INFO [train.py:682] (1/4) Start epoch 3142 2024-10-21 20:35:41,996 INFO [train.py:561] (1/4) Epoch 3142, batch 4, global_batch_idx: 50260, batch size: 189, loss[dur_loss=0.185, prior_loss=0.9717, diff_loss=0.3345, tot_loss=1.491, over 189.00 samples.], tot_loss[dur_loss=0.1845, prior_loss=0.9709, diff_loss=0.4074, tot_loss=1.563, over 937.00 samples.], 2024-10-21 20:35:56,970 INFO [train.py:561] (1/4) Epoch 3142, batch 14, global_batch_idx: 50270, batch size: 142, loss[dur_loss=0.1889, prior_loss=0.9715, diff_loss=0.2613, tot_loss=1.422, over 142.00 
samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9714, diff_loss=0.3377, tot_loss=1.496, over 2210.00 samples.], 2024-10-21 20:35:58,408 INFO [train.py:682] (1/4) Start epoch 3143 2024-10-21 20:36:18,514 INFO [train.py:561] (1/4) Epoch 3143, batch 8, global_batch_idx: 50280, batch size: 170, loss[dur_loss=0.1899, prior_loss=0.972, diff_loss=0.3037, tot_loss=1.466, over 170.00 samples.], tot_loss[dur_loss=0.1862, prior_loss=0.9712, diff_loss=0.3796, tot_loss=1.537, over 1432.00 samples.], 2024-10-21 20:36:28,839 INFO [train.py:682] (1/4) Start epoch 3144 2024-10-21 20:36:40,278 INFO [train.py:561] (1/4) Epoch 3144, batch 2, global_batch_idx: 50290, batch size: 203, loss[dur_loss=0.1867, prior_loss=0.9715, diff_loss=0.3286, tot_loss=1.487, over 203.00 samples.], tot_loss[dur_loss=0.1872, prior_loss=0.9716, diff_loss=0.3116, tot_loss=1.47, over 442.00 samples.], 2024-10-21 20:36:54,664 INFO [train.py:561] (1/4) Epoch 3144, batch 12, global_batch_idx: 50300, batch size: 152, loss[dur_loss=0.1854, prior_loss=0.9718, diff_loss=0.286, tot_loss=1.443, over 152.00 samples.], tot_loss[dur_loss=0.1858, prior_loss=0.9713, diff_loss=0.3498, tot_loss=1.507, over 1966.00 samples.], 2024-10-21 20:36:59,175 INFO [train.py:682] (1/4) Start epoch 3145 2024-10-21 20:37:16,663 INFO [train.py:561] (1/4) Epoch 3145, batch 6, global_batch_idx: 50310, batch size: 106, loss[dur_loss=0.1875, prior_loss=0.9716, diff_loss=0.2636, tot_loss=1.423, over 106.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.971, diff_loss=0.3838, tot_loss=1.539, over 1142.00 samples.], 2024-10-21 20:37:29,985 INFO [train.py:682] (1/4) Start epoch 3146 2024-10-21 20:37:38,867 INFO [train.py:561] (1/4) Epoch 3146, batch 0, global_batch_idx: 50320, batch size: 108, loss[dur_loss=0.1875, prior_loss=0.9721, diff_loss=0.3431, tot_loss=1.503, over 108.00 samples.], tot_loss[dur_loss=0.1875, prior_loss=0.9721, diff_loss=0.3431, tot_loss=1.503, over 108.00 samples.], 2024-10-21 20:37:53,274 INFO [train.py:561] (1/4) Epoch 3146, batch 10, global_batch_idx: 50330, batch size: 111, loss[dur_loss=0.1887, prior_loss=0.9725, diff_loss=0.3112, tot_loss=1.472, over 111.00 samples.], tot_loss[dur_loss=0.1859, prior_loss=0.9713, diff_loss=0.3676, tot_loss=1.525, over 1656.00 samples.], 2024-10-21 20:38:00,473 INFO [train.py:682] (1/4) Start epoch 3147 2024-10-21 20:38:14,341 INFO [train.py:561] (1/4) Epoch 3147, batch 4, global_batch_idx: 50340, batch size: 189, loss[dur_loss=0.1879, prior_loss=0.9715, diff_loss=0.3447, tot_loss=1.504, over 189.00 samples.], tot_loss[dur_loss=0.1849, prior_loss=0.9708, diff_loss=0.4198, tot_loss=1.576, over 937.00 samples.], 2024-10-21 20:38:29,381 INFO [train.py:561] (1/4) Epoch 3147, batch 14, global_batch_idx: 50350, batch size: 142, loss[dur_loss=0.1876, prior_loss=0.9714, diff_loss=0.3142, tot_loss=1.473, over 142.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9714, diff_loss=0.3479, tot_loss=1.506, over 2210.00 samples.], 2024-10-21 20:38:30,841 INFO [train.py:682] (1/4) Start epoch 3148 2024-10-21 20:38:50,856 INFO [train.py:561] (1/4) Epoch 3148, batch 8, global_batch_idx: 50360, batch size: 170, loss[dur_loss=0.1881, prior_loss=0.9717, diff_loss=0.28, tot_loss=1.44, over 170.00 samples.], tot_loss[dur_loss=0.1862, prior_loss=0.9712, diff_loss=0.3615, tot_loss=1.519, over 1432.00 samples.], 2024-10-21 20:39:01,050 INFO [train.py:682] (1/4) Start epoch 3149 2024-10-21 20:39:12,503 INFO [train.py:561] (1/4) Epoch 3149, batch 2, global_batch_idx: 50370, batch size: 203, loss[dur_loss=0.1859, 
prior_loss=0.9715, diff_loss=0.3144, tot_loss=1.472, over 203.00 samples.], tot_loss[dur_loss=0.1872, prior_loss=0.9716, diff_loss=0.3046, tot_loss=1.463, over 442.00 samples.], 2024-10-21 20:39:26,762 INFO [train.py:561] (1/4) Epoch 3149, batch 12, global_batch_idx: 50380, batch size: 152, loss[dur_loss=0.1853, prior_loss=0.9717, diff_loss=0.3224, tot_loss=1.479, over 152.00 samples.], tot_loss[dur_loss=0.1862, prior_loss=0.9713, diff_loss=0.3531, tot_loss=1.511, over 1966.00 samples.], 2024-10-21 20:39:31,231 INFO [train.py:682] (1/4) Start epoch 3150 2024-10-21 20:39:48,209 INFO [train.py:561] (1/4) Epoch 3150, batch 6, global_batch_idx: 50390, batch size: 106, loss[dur_loss=0.1884, prior_loss=0.9716, diff_loss=0.285, tot_loss=1.445, over 106.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.971, diff_loss=0.3877, tot_loss=1.543, over 1142.00 samples.], 2024-10-21 20:40:01,276 INFO [train.py:682] (1/4) Start epoch 3151 2024-10-21 20:40:09,942 INFO [train.py:561] (1/4) Epoch 3151, batch 0, global_batch_idx: 50400, batch size: 108, loss[dur_loss=0.1906, prior_loss=0.9721, diff_loss=0.2838, tot_loss=1.447, over 108.00 samples.], tot_loss[dur_loss=0.1906, prior_loss=0.9721, diff_loss=0.2838, tot_loss=1.447, over 108.00 samples.], 2024-10-21 20:40:24,106 INFO [train.py:561] (1/4) Epoch 3151, batch 10, global_batch_idx: 50410, batch size: 111, loss[dur_loss=0.1892, prior_loss=0.9727, diff_loss=0.2675, tot_loss=1.429, over 111.00 samples.], tot_loss[dur_loss=0.1861, prior_loss=0.9714, diff_loss=0.3609, tot_loss=1.518, over 1656.00 samples.], 2024-10-21 20:40:31,189 INFO [train.py:682] (1/4) Start epoch 3152 2024-10-21 20:40:45,056 INFO [train.py:561] (1/4) Epoch 3152, batch 4, global_batch_idx: 50420, batch size: 189, loss[dur_loss=0.1877, prior_loss=0.9718, diff_loss=0.3445, tot_loss=1.504, over 189.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.9708, diff_loss=0.407, tot_loss=1.562, over 937.00 samples.], 2024-10-21 20:41:00,005 INFO [train.py:561] (1/4) Epoch 3152, batch 14, global_batch_idx: 50430, batch size: 142, loss[dur_loss=0.189, prior_loss=0.9713, diff_loss=0.2807, tot_loss=1.441, over 142.00 samples.], tot_loss[dur_loss=0.186, prior_loss=0.9713, diff_loss=0.3494, tot_loss=1.507, over 2210.00 samples.], 2024-10-21 20:41:01,450 INFO [train.py:682] (1/4) Start epoch 3153 2024-10-21 20:41:21,818 INFO [train.py:561] (1/4) Epoch 3153, batch 8, global_batch_idx: 50440, batch size: 170, loss[dur_loss=0.1924, prior_loss=0.9721, diff_loss=0.3097, tot_loss=1.474, over 170.00 samples.], tot_loss[dur_loss=0.1856, prior_loss=0.9712, diff_loss=0.3829, tot_loss=1.54, over 1432.00 samples.], 2024-10-21 20:41:31,978 INFO [train.py:682] (1/4) Start epoch 3154 2024-10-21 20:41:43,400 INFO [train.py:561] (1/4) Epoch 3154, batch 2, global_batch_idx: 50450, batch size: 203, loss[dur_loss=0.188, prior_loss=0.9715, diff_loss=0.3232, tot_loss=1.483, over 203.00 samples.], tot_loss[dur_loss=0.1879, prior_loss=0.9716, diff_loss=0.303, tot_loss=1.462, over 442.00 samples.], 2024-10-21 20:41:57,742 INFO [train.py:561] (1/4) Epoch 3154, batch 12, global_batch_idx: 50460, batch size: 152, loss[dur_loss=0.1844, prior_loss=0.9717, diff_loss=0.3088, tot_loss=1.465, over 152.00 samples.], tot_loss[dur_loss=0.1855, prior_loss=0.9713, diff_loss=0.357, tot_loss=1.514, over 1966.00 samples.], 2024-10-21 20:42:02,430 INFO [train.py:682] (1/4) Start epoch 3155 2024-10-21 20:42:19,606 INFO [train.py:561] (1/4) Epoch 3155, batch 6, global_batch_idx: 50470, batch size: 106, loss[dur_loss=0.189, prior_loss=0.9717, 
diff_loss=0.2861, tot_loss=1.447, over 106.00 samples.], tot_loss[dur_loss=0.1837, prior_loss=0.971, diff_loss=0.3935, tot_loss=1.548, over 1142.00 samples.], 2024-10-21 20:42:32,784 INFO [train.py:682] (1/4) Start epoch 3156 2024-10-21 20:42:41,629 INFO [train.py:561] (1/4) Epoch 3156, batch 0, global_batch_idx: 50480, batch size: 108, loss[dur_loss=0.1896, prior_loss=0.9719, diff_loss=0.2977, tot_loss=1.459, over 108.00 samples.], tot_loss[dur_loss=0.1896, prior_loss=0.9719, diff_loss=0.2977, tot_loss=1.459, over 108.00 samples.], 2024-10-21 20:42:55,884 INFO [train.py:561] (1/4) Epoch 3156, batch 10, global_batch_idx: 50490, batch size: 111, loss[dur_loss=0.1877, prior_loss=0.9727, diff_loss=0.2637, tot_loss=1.424, over 111.00 samples.], tot_loss[dur_loss=0.1846, prior_loss=0.9713, diff_loss=0.3596, tot_loss=1.515, over 1656.00 samples.], 2024-10-21 20:43:02,958 INFO [train.py:682] (1/4) Start epoch 3157 2024-10-21 20:43:16,760 INFO [train.py:561] (1/4) Epoch 3157, batch 4, global_batch_idx: 50500, batch size: 189, loss[dur_loss=0.1857, prior_loss=0.9715, diff_loss=0.291, tot_loss=1.448, over 189.00 samples.], tot_loss[dur_loss=0.1829, prior_loss=0.9707, diff_loss=0.4048, tot_loss=1.558, over 937.00 samples.], 2024-10-21 20:43:31,636 INFO [train.py:561] (1/4) Epoch 3157, batch 14, global_batch_idx: 50510, batch size: 142, loss[dur_loss=0.1877, prior_loss=0.9713, diff_loss=0.2879, tot_loss=1.447, over 142.00 samples.], tot_loss[dur_loss=0.1854, prior_loss=0.9713, diff_loss=0.3473, tot_loss=1.504, over 2210.00 samples.], 2024-10-21 20:43:33,122 INFO [train.py:682] (1/4) Start epoch 3158 2024-10-21 20:43:52,949 INFO [train.py:561] (1/4) Epoch 3158, batch 8, global_batch_idx: 50520, batch size: 170, loss[dur_loss=0.1883, prior_loss=0.9716, diff_loss=0.3062, tot_loss=1.466, over 170.00 samples.], tot_loss[dur_loss=0.185, prior_loss=0.9711, diff_loss=0.3732, tot_loss=1.529, over 1432.00 samples.], 2024-10-21 20:44:03,070 INFO [train.py:682] (1/4) Start epoch 3159 2024-10-21 20:44:14,974 INFO [train.py:561] (1/4) Epoch 3159, batch 2, global_batch_idx: 50530, batch size: 203, loss[dur_loss=0.1908, prior_loss=0.9714, diff_loss=0.3704, tot_loss=1.533, over 203.00 samples.], tot_loss[dur_loss=0.1889, prior_loss=0.9716, diff_loss=0.3139, tot_loss=1.474, over 442.00 samples.], 2024-10-21 20:44:29,253 INFO [train.py:561] (1/4) Epoch 3159, batch 12, global_batch_idx: 50540, batch size: 152, loss[dur_loss=0.1852, prior_loss=0.9715, diff_loss=0.3035, tot_loss=1.46, over 152.00 samples.], tot_loss[dur_loss=0.1859, prior_loss=0.9713, diff_loss=0.3495, tot_loss=1.507, over 1966.00 samples.], 2024-10-21 20:44:33,799 INFO [train.py:682] (1/4) Start epoch 3160 2024-10-21 20:44:50,891 INFO [train.py:561] (1/4) Epoch 3160, batch 6, global_batch_idx: 50550, batch size: 106, loss[dur_loss=0.1865, prior_loss=0.9717, diff_loss=0.2591, tot_loss=1.417, over 106.00 samples.], tot_loss[dur_loss=0.1835, prior_loss=0.9711, diff_loss=0.3889, tot_loss=1.544, over 1142.00 samples.], 2024-10-21 20:45:04,027 INFO [train.py:682] (1/4) Start epoch 3161 2024-10-21 20:45:12,914 INFO [train.py:561] (1/4) Epoch 3161, batch 0, global_batch_idx: 50560, batch size: 108, loss[dur_loss=0.1887, prior_loss=0.972, diff_loss=0.3151, tot_loss=1.476, over 108.00 samples.], tot_loss[dur_loss=0.1887, prior_loss=0.972, diff_loss=0.3151, tot_loss=1.476, over 108.00 samples.], 2024-10-21 20:45:27,184 INFO [train.py:561] (1/4) Epoch 3161, batch 10, global_batch_idx: 50570, batch size: 111, loss[dur_loss=0.1846, prior_loss=0.9726, diff_loss=0.2929, 
tot_loss=1.45, over 111.00 samples.], tot_loss[dur_loss=0.1852, prior_loss=0.9712, diff_loss=0.3688, tot_loss=1.525, over 1656.00 samples.], 2024-10-21 20:45:34,296 INFO [train.py:682] (1/4) Start epoch 3162 2024-10-21 20:45:48,245 INFO [train.py:561] (1/4) Epoch 3162, batch 4, global_batch_idx: 50580, batch size: 189, loss[dur_loss=0.187, prior_loss=0.9715, diff_loss=0.2978, tot_loss=1.456, over 189.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.9707, diff_loss=0.4029, tot_loss=1.558, over 937.00 samples.], 2024-10-21 20:46:03,086 INFO [train.py:561] (1/4) Epoch 3162, batch 14, global_batch_idx: 50590, batch size: 142, loss[dur_loss=0.1893, prior_loss=0.9713, diff_loss=0.2917, tot_loss=1.452, over 142.00 samples.], tot_loss[dur_loss=0.1854, prior_loss=0.9712, diff_loss=0.3377, tot_loss=1.494, over 2210.00 samples.], 2024-10-21 20:46:04,505 INFO [train.py:682] (1/4) Start epoch 3163 2024-10-21 20:46:24,411 INFO [train.py:561] (1/4) Epoch 3163, batch 8, global_batch_idx: 50600, batch size: 170, loss[dur_loss=0.1905, prior_loss=0.9718, diff_loss=0.3314, tot_loss=1.494, over 170.00 samples.], tot_loss[dur_loss=0.184, prior_loss=0.971, diff_loss=0.3824, tot_loss=1.537, over 1432.00 samples.], 2024-10-21 20:46:34,572 INFO [train.py:682] (1/4) Start epoch 3164 2024-10-21 20:46:46,027 INFO [train.py:561] (1/4) Epoch 3164, batch 2, global_batch_idx: 50610, batch size: 203, loss[dur_loss=0.1871, prior_loss=0.9714, diff_loss=0.3439, tot_loss=1.502, over 203.00 samples.], tot_loss[dur_loss=0.1877, prior_loss=0.9715, diff_loss=0.3172, tot_loss=1.476, over 442.00 samples.], 2024-10-21 20:47:00,320 INFO [train.py:561] (1/4) Epoch 3164, batch 12, global_batch_idx: 50620, batch size: 152, loss[dur_loss=0.1859, prior_loss=0.9716, diff_loss=0.2624, tot_loss=1.42, over 152.00 samples.], tot_loss[dur_loss=0.1858, prior_loss=0.9712, diff_loss=0.3415, tot_loss=1.498, over 1966.00 samples.], 2024-10-21 20:47:04,808 INFO [train.py:682] (1/4) Start epoch 3165 2024-10-21 20:47:21,909 INFO [train.py:561] (1/4) Epoch 3165, batch 6, global_batch_idx: 50630, batch size: 106, loss[dur_loss=0.1858, prior_loss=0.9714, diff_loss=0.3298, tot_loss=1.487, over 106.00 samples.], tot_loss[dur_loss=0.1851, prior_loss=0.9709, diff_loss=0.4086, tot_loss=1.565, over 1142.00 samples.], 2024-10-21 20:47:35,114 INFO [train.py:682] (1/4) Start epoch 3166 2024-10-21 20:47:43,937 INFO [train.py:561] (1/4) Epoch 3166, batch 0, global_batch_idx: 50640, batch size: 108, loss[dur_loss=0.1891, prior_loss=0.9721, diff_loss=0.3359, tot_loss=1.497, over 108.00 samples.], tot_loss[dur_loss=0.1891, prior_loss=0.9721, diff_loss=0.3359, tot_loss=1.497, over 108.00 samples.], 2024-10-21 20:47:58,245 INFO [train.py:561] (1/4) Epoch 3166, batch 10, global_batch_idx: 50650, batch size: 111, loss[dur_loss=0.189, prior_loss=0.9725, diff_loss=0.3021, tot_loss=1.464, over 111.00 samples.], tot_loss[dur_loss=0.1851, prior_loss=0.9712, diff_loss=0.3667, tot_loss=1.523, over 1656.00 samples.], 2024-10-21 20:48:05,434 INFO [train.py:682] (1/4) Start epoch 3167 2024-10-21 20:48:19,742 INFO [train.py:561] (1/4) Epoch 3167, batch 4, global_batch_idx: 50660, batch size: 189, loss[dur_loss=0.1844, prior_loss=0.9714, diff_loss=0.3169, tot_loss=1.473, over 189.00 samples.], tot_loss[dur_loss=0.1838, prior_loss=0.9708, diff_loss=0.4108, tot_loss=1.565, over 937.00 samples.], 2024-10-21 20:48:34,736 INFO [train.py:561] (1/4) Epoch 3167, batch 14, global_batch_idx: 50670, batch size: 142, loss[dur_loss=0.1881, prior_loss=0.9712, diff_loss=0.2981, tot_loss=1.457, 
over 142.00 samples.], tot_loss[dur_loss=0.186, prior_loss=0.9713, diff_loss=0.3413, tot_loss=1.499, over 2210.00 samples.], 2024-10-21 20:48:36,179 INFO [train.py:682] (1/4) Start epoch 3168 2024-10-21 20:48:56,278 INFO [train.py:561] (1/4) Epoch 3168, batch 8, global_batch_idx: 50680, batch size: 170, loss[dur_loss=0.1862, prior_loss=0.9715, diff_loss=0.3274, tot_loss=1.485, over 170.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.971, diff_loss=0.3817, tot_loss=1.537, over 1432.00 samples.], 2024-10-21 20:49:06,513 INFO [train.py:682] (1/4) Start epoch 3169 2024-10-21 20:49:17,852 INFO [train.py:561] (1/4) Epoch 3169, batch 2, global_batch_idx: 50690, batch size: 203, loss[dur_loss=0.1878, prior_loss=0.9713, diff_loss=0.3301, tot_loss=1.489, over 203.00 samples.], tot_loss[dur_loss=0.1872, prior_loss=0.9715, diff_loss=0.3218, tot_loss=1.48, over 442.00 samples.], 2024-10-21 20:49:32,123 INFO [train.py:561] (1/4) Epoch 3169, batch 12, global_batch_idx: 50700, batch size: 152, loss[dur_loss=0.1866, prior_loss=0.9716, diff_loss=0.2645, tot_loss=1.423, over 152.00 samples.], tot_loss[dur_loss=0.1853, prior_loss=0.9712, diff_loss=0.352, tot_loss=1.509, over 1966.00 samples.], 2024-10-21 20:49:36,663 INFO [train.py:682] (1/4) Start epoch 3170 2024-10-21 20:49:53,925 INFO [train.py:561] (1/4) Epoch 3170, batch 6, global_batch_idx: 50710, batch size: 106, loss[dur_loss=0.1888, prior_loss=0.9715, diff_loss=0.3109, tot_loss=1.471, over 106.00 samples.], tot_loss[dur_loss=0.184, prior_loss=0.9709, diff_loss=0.3857, tot_loss=1.541, over 1142.00 samples.], 2024-10-21 20:50:07,146 INFO [train.py:682] (1/4) Start epoch 3171 2024-10-21 20:50:16,059 INFO [train.py:561] (1/4) Epoch 3171, batch 0, global_batch_idx: 50720, batch size: 108, loss[dur_loss=0.1875, prior_loss=0.972, diff_loss=0.3425, tot_loss=1.502, over 108.00 samples.], tot_loss[dur_loss=0.1875, prior_loss=0.972, diff_loss=0.3425, tot_loss=1.502, over 108.00 samples.], 2024-10-21 20:50:30,326 INFO [train.py:561] (1/4) Epoch 3171, batch 10, global_batch_idx: 50730, batch size: 111, loss[dur_loss=0.1868, prior_loss=0.9726, diff_loss=0.2808, tot_loss=1.44, over 111.00 samples.], tot_loss[dur_loss=0.1862, prior_loss=0.9712, diff_loss=0.358, tot_loss=1.515, over 1656.00 samples.], 2024-10-21 20:50:37,554 INFO [train.py:682] (1/4) Start epoch 3172 2024-10-21 20:50:51,652 INFO [train.py:561] (1/4) Epoch 3172, batch 4, global_batch_idx: 50740, batch size: 189, loss[dur_loss=0.1885, prior_loss=0.9716, diff_loss=0.321, tot_loss=1.481, over 189.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9707, diff_loss=0.4138, tot_loss=1.568, over 937.00 samples.], 2024-10-21 20:51:06,693 INFO [train.py:561] (1/4) Epoch 3172, batch 14, global_batch_idx: 50750, batch size: 142, loss[dur_loss=0.1888, prior_loss=0.9714, diff_loss=0.3071, tot_loss=1.467, over 142.00 samples.], tot_loss[dur_loss=0.185, prior_loss=0.9713, diff_loss=0.3434, tot_loss=1.5, over 2210.00 samples.], 2024-10-21 20:51:08,124 INFO [train.py:682] (1/4) Start epoch 3173 2024-10-21 20:51:28,145 INFO [train.py:561] (1/4) Epoch 3173, batch 8, global_batch_idx: 50760, batch size: 170, loss[dur_loss=0.1888, prior_loss=0.9716, diff_loss=0.2943, tot_loss=1.455, over 170.00 samples.], tot_loss[dur_loss=0.1846, prior_loss=0.9711, diff_loss=0.3631, tot_loss=1.519, over 1432.00 samples.], 2024-10-21 20:51:38,310 INFO [train.py:682] (1/4) Start epoch 3174 2024-10-21 20:51:49,653 INFO [train.py:561] (1/4) Epoch 3174, batch 2, global_batch_idx: 50770, batch size: 203, loss[dur_loss=0.1857, 
prior_loss=0.9714, diff_loss=0.3247, tot_loss=1.482, over 203.00 samples.], tot_loss[dur_loss=0.1871, prior_loss=0.9715, diff_loss=0.3233, tot_loss=1.482, over 442.00 samples.], 2024-10-21 20:52:03,875 INFO [train.py:561] (1/4) Epoch 3174, batch 12, global_batch_idx: 50780, batch size: 152, loss[dur_loss=0.1853, prior_loss=0.9714, diff_loss=0.2928, tot_loss=1.449, over 152.00 samples.], tot_loss[dur_loss=0.1863, prior_loss=0.9712, diff_loss=0.3606, tot_loss=1.518, over 1966.00 samples.], 2024-10-21 20:52:08,398 INFO [train.py:682] (1/4) Start epoch 3175 2024-10-21 20:52:25,838 INFO [train.py:561] (1/4) Epoch 3175, batch 6, global_batch_idx: 50790, batch size: 106, loss[dur_loss=0.1874, prior_loss=0.9715, diff_loss=0.2986, tot_loss=1.458, over 106.00 samples.], tot_loss[dur_loss=0.1843, prior_loss=0.9708, diff_loss=0.3975, tot_loss=1.553, over 1142.00 samples.], 2024-10-21 20:52:38,878 INFO [train.py:682] (1/4) Start epoch 3176 2024-10-21 20:52:47,645 INFO [train.py:561] (1/4) Epoch 3176, batch 0, global_batch_idx: 50800, batch size: 108, loss[dur_loss=0.1914, prior_loss=0.972, diff_loss=0.2915, tot_loss=1.455, over 108.00 samples.], tot_loss[dur_loss=0.1914, prior_loss=0.972, diff_loss=0.2915, tot_loss=1.455, over 108.00 samples.], 2024-10-21 20:53:01,783 INFO [train.py:561] (1/4) Epoch 3176, batch 10, global_batch_idx: 50810, batch size: 111, loss[dur_loss=0.1887, prior_loss=0.9725, diff_loss=0.2941, tot_loss=1.455, over 111.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.9711, diff_loss=0.3704, tot_loss=1.525, over 1656.00 samples.], 2024-10-21 20:53:08,893 INFO [train.py:682] (1/4) Start epoch 3177 2024-10-21 20:53:22,739 INFO [train.py:561] (1/4) Epoch 3177, batch 4, global_batch_idx: 50820, batch size: 189, loss[dur_loss=0.1877, prior_loss=0.9714, diff_loss=0.2973, tot_loss=1.456, over 189.00 samples.], tot_loss[dur_loss=0.1817, prior_loss=0.9706, diff_loss=0.4031, tot_loss=1.555, over 937.00 samples.], 2024-10-21 20:53:37,514 INFO [train.py:561] (1/4) Epoch 3177, batch 14, global_batch_idx: 50830, batch size: 142, loss[dur_loss=0.1865, prior_loss=0.9712, diff_loss=0.2832, tot_loss=1.441, over 142.00 samples.], tot_loss[dur_loss=0.1846, prior_loss=0.9711, diff_loss=0.3459, tot_loss=1.502, over 2210.00 samples.], 2024-10-21 20:53:38,937 INFO [train.py:682] (1/4) Start epoch 3178 2024-10-21 20:53:58,965 INFO [train.py:561] (1/4) Epoch 3178, batch 8, global_batch_idx: 50840, batch size: 170, loss[dur_loss=0.189, prior_loss=0.9714, diff_loss=0.3139, tot_loss=1.474, over 170.00 samples.], tot_loss[dur_loss=0.1851, prior_loss=0.971, diff_loss=0.3785, tot_loss=1.535, over 1432.00 samples.], 2024-10-21 20:54:09,109 INFO [train.py:682] (1/4) Start epoch 3179 2024-10-21 20:54:20,780 INFO [train.py:561] (1/4) Epoch 3179, batch 2, global_batch_idx: 50850, batch size: 203, loss[dur_loss=0.1876, prior_loss=0.9713, diff_loss=0.32, tot_loss=1.479, over 203.00 samples.], tot_loss[dur_loss=0.1872, prior_loss=0.9715, diff_loss=0.3038, tot_loss=1.462, over 442.00 samples.], 2024-10-21 20:54:34,898 INFO [train.py:561] (1/4) Epoch 3179, batch 12, global_batch_idx: 50860, batch size: 152, loss[dur_loss=0.1846, prior_loss=0.9715, diff_loss=0.3048, tot_loss=1.461, over 152.00 samples.], tot_loss[dur_loss=0.1851, prior_loss=0.9711, diff_loss=0.3458, tot_loss=1.502, over 1966.00 samples.], 2024-10-21 20:54:39,329 INFO [train.py:682] (1/4) Start epoch 3180 2024-10-21 20:54:56,325 INFO [train.py:561] (1/4) Epoch 3180, batch 6, global_batch_idx: 50870, batch size: 106, loss[dur_loss=0.1855, prior_loss=0.9713, 
diff_loss=0.3232, tot_loss=1.48, over 106.00 samples.], tot_loss[dur_loss=0.1847, prior_loss=0.9709, diff_loss=0.3965, tot_loss=1.552, over 1142.00 samples.], 2024-10-21 20:55:09,393 INFO [train.py:682] (1/4) Start epoch 3181 2024-10-21 20:55:18,332 INFO [train.py:561] (1/4) Epoch 3181, batch 0, global_batch_idx: 50880, batch size: 108, loss[dur_loss=0.1895, prior_loss=0.972, diff_loss=0.2972, tot_loss=1.459, over 108.00 samples.], tot_loss[dur_loss=0.1895, prior_loss=0.972, diff_loss=0.2972, tot_loss=1.459, over 108.00 samples.], 2024-10-21 20:55:32,525 INFO [train.py:561] (1/4) Epoch 3181, batch 10, global_batch_idx: 50890, batch size: 111, loss[dur_loss=0.1884, prior_loss=0.9725, diff_loss=0.2532, tot_loss=1.414, over 111.00 samples.], tot_loss[dur_loss=0.1843, prior_loss=0.9711, diff_loss=0.3599, tot_loss=1.515, over 1656.00 samples.], 2024-10-21 20:55:39,703 INFO [train.py:682] (1/4) Start epoch 3182 2024-10-21 20:55:53,632 INFO [train.py:561] (1/4) Epoch 3182, batch 4, global_batch_idx: 50900, batch size: 189, loss[dur_loss=0.1895, prior_loss=0.9717, diff_loss=0.3152, tot_loss=1.476, over 189.00 samples.], tot_loss[dur_loss=0.1832, prior_loss=0.9707, diff_loss=0.4083, tot_loss=1.562, over 937.00 samples.], 2024-10-21 20:56:08,684 INFO [train.py:561] (1/4) Epoch 3182, batch 14, global_batch_idx: 50910, batch size: 142, loss[dur_loss=0.1861, prior_loss=0.9711, diff_loss=0.2585, tot_loss=1.416, over 142.00 samples.], tot_loss[dur_loss=0.1854, prior_loss=0.9712, diff_loss=0.3358, tot_loss=1.492, over 2210.00 samples.], 2024-10-21 20:56:10,110 INFO [train.py:682] (1/4) Start epoch 3183 2024-10-21 20:56:30,404 INFO [train.py:561] (1/4) Epoch 3183, batch 8, global_batch_idx: 50920, batch size: 170, loss[dur_loss=0.1873, prior_loss=0.9716, diff_loss=0.3213, tot_loss=1.48, over 170.00 samples.], tot_loss[dur_loss=0.1842, prior_loss=0.971, diff_loss=0.3696, tot_loss=1.525, over 1432.00 samples.], 2024-10-21 20:56:40,617 INFO [train.py:682] (1/4) Start epoch 3184 2024-10-21 20:56:52,689 INFO [train.py:561] (1/4) Epoch 3184, batch 2, global_batch_idx: 50930, batch size: 203, loss[dur_loss=0.1846, prior_loss=0.9714, diff_loss=0.3056, tot_loss=1.462, over 203.00 samples.], tot_loss[dur_loss=0.1855, prior_loss=0.9715, diff_loss=0.3045, tot_loss=1.461, over 442.00 samples.], 2024-10-21 20:57:06,983 INFO [train.py:561] (1/4) Epoch 3184, batch 12, global_batch_idx: 50940, batch size: 152, loss[dur_loss=0.1835, prior_loss=0.9717, diff_loss=0.2942, tot_loss=1.449, over 152.00 samples.], tot_loss[dur_loss=0.1847, prior_loss=0.9712, diff_loss=0.3577, tot_loss=1.514, over 1966.00 samples.], 2024-10-21 20:57:11,480 INFO [train.py:682] (1/4) Start epoch 3185 2024-10-21 20:57:28,588 INFO [train.py:561] (1/4) Epoch 3185, batch 6, global_batch_idx: 50950, batch size: 106, loss[dur_loss=0.1868, prior_loss=0.9712, diff_loss=0.3213, tot_loss=1.479, over 106.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.9709, diff_loss=0.3948, tot_loss=1.55, over 1142.00 samples.], 2024-10-21 20:57:41,620 INFO [train.py:682] (1/4) Start epoch 3186 2024-10-21 20:57:50,656 INFO [train.py:561] (1/4) Epoch 3186, batch 0, global_batch_idx: 50960, batch size: 108, loss[dur_loss=0.1906, prior_loss=0.9719, diff_loss=0.2617, tot_loss=1.424, over 108.00 samples.], tot_loss[dur_loss=0.1906, prior_loss=0.9719, diff_loss=0.2617, tot_loss=1.424, over 108.00 samples.], 2024-10-21 20:58:04,868 INFO [train.py:561] (1/4) Epoch 3186, batch 10, global_batch_idx: 50970, batch size: 111, loss[dur_loss=0.1903, prior_loss=0.9727, diff_loss=0.309, 
tot_loss=1.472, over 111.00 samples.], tot_loss[dur_loss=0.1852, prior_loss=0.9711, diff_loss=0.3594, tot_loss=1.516, over 1656.00 samples.], 2024-10-21 20:58:12,043 INFO [train.py:682] (1/4) Start epoch 3187 2024-10-21 20:58:26,023 INFO [train.py:561] (1/4) Epoch 3187, batch 4, global_batch_idx: 50980, batch size: 189, loss[dur_loss=0.1865, prior_loss=0.9713, diff_loss=0.3082, tot_loss=1.466, over 189.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9706, diff_loss=0.4159, tot_loss=1.57, over 937.00 samples.], 2024-10-21 20:58:40,891 INFO [train.py:561] (1/4) Epoch 3187, batch 14, global_batch_idx: 50990, batch size: 142, loss[dur_loss=0.1895, prior_loss=0.9714, diff_loss=0.2952, tot_loss=1.456, over 142.00 samples.], tot_loss[dur_loss=0.1858, prior_loss=0.9713, diff_loss=0.3488, tot_loss=1.506, over 2210.00 samples.], 2024-10-21 20:58:42,324 INFO [train.py:682] (1/4) Start epoch 3188 2024-10-21 20:59:02,239 INFO [train.py:561] (1/4) Epoch 3188, batch 8, global_batch_idx: 51000, batch size: 170, loss[dur_loss=0.1896, prior_loss=0.9718, diff_loss=0.3198, tot_loss=1.481, over 170.00 samples.], tot_loss[dur_loss=0.1845, prior_loss=0.971, diff_loss=0.3675, tot_loss=1.523, over 1432.00 samples.], 2024-10-21 20:59:03,726 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 20:59:34,114 INFO [train.py:589] (1/4) Epoch 3188, validation: dur_loss=0.4609, prior_loss=1.037, diff_loss=0.44, tot_loss=1.938, over 100.00 samples. 2024-10-21 20:59:34,116 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 20:59:42,827 INFO [train.py:682] (1/4) Start epoch 3189 2024-10-21 20:59:54,889 INFO [train.py:561] (1/4) Epoch 3189, batch 2, global_batch_idx: 51010, batch size: 203, loss[dur_loss=0.1849, prior_loss=0.9712, diff_loss=0.3445, tot_loss=1.501, over 203.00 samples.], tot_loss[dur_loss=0.1855, prior_loss=0.9714, diff_loss=0.3201, tot_loss=1.477, over 442.00 samples.], 2024-10-21 21:00:09,108 INFO [train.py:561] (1/4) Epoch 3189, batch 12, global_batch_idx: 51020, batch size: 152, loss[dur_loss=0.1834, prior_loss=0.9716, diff_loss=0.2876, tot_loss=1.443, over 152.00 samples.], tot_loss[dur_loss=0.1845, prior_loss=0.9712, diff_loss=0.3486, tot_loss=1.504, over 1966.00 samples.], 2024-10-21 21:00:13,518 INFO [train.py:682] (1/4) Start epoch 3190 2024-10-21 21:00:30,965 INFO [train.py:561] (1/4) Epoch 3190, batch 6, global_batch_idx: 51030, batch size: 106, loss[dur_loss=0.1877, prior_loss=0.9716, diff_loss=0.3092, tot_loss=1.469, over 106.00 samples.], tot_loss[dur_loss=0.1845, prior_loss=0.9709, diff_loss=0.3938, tot_loss=1.549, over 1142.00 samples.], 2024-10-21 21:00:44,040 INFO [train.py:682] (1/4) Start epoch 3191 2024-10-21 21:00:53,161 INFO [train.py:561] (1/4) Epoch 3191, batch 0, global_batch_idx: 51040, batch size: 108, loss[dur_loss=0.1892, prior_loss=0.9719, diff_loss=0.2738, tot_loss=1.435, over 108.00 samples.], tot_loss[dur_loss=0.1892, prior_loss=0.9719, diff_loss=0.2738, tot_loss=1.435, over 108.00 samples.], 2024-10-21 21:01:07,301 INFO [train.py:561] (1/4) Epoch 3191, batch 10, global_batch_idx: 51050, batch size: 111, loss[dur_loss=0.1871, prior_loss=0.9726, diff_loss=0.2817, tot_loss=1.441, over 111.00 samples.], tot_loss[dur_loss=0.1855, prior_loss=0.9712, diff_loss=0.3612, tot_loss=1.518, over 1656.00 samples.], 2024-10-21 21:01:14,325 INFO [train.py:682] (1/4) Start epoch 3192 2024-10-21 21:01:28,338 INFO [train.py:561] (1/4) Epoch 3192, batch 4, global_batch_idx: 51060, batch size: 189, loss[dur_loss=0.187, prior_loss=0.9715, 
diff_loss=0.2905, tot_loss=1.449, over 189.00 samples.], tot_loss[dur_loss=0.1838, prior_loss=0.9708, diff_loss=0.4132, tot_loss=1.568, over 937.00 samples.], 2024-10-21 21:01:43,201 INFO [train.py:561] (1/4) Epoch 3192, batch 14, global_batch_idx: 51070, batch size: 142, loss[dur_loss=0.1906, prior_loss=0.9715, diff_loss=0.2642, tot_loss=1.426, over 142.00 samples.], tot_loss[dur_loss=0.1858, prior_loss=0.9713, diff_loss=0.3448, tot_loss=1.502, over 2210.00 samples.], 2024-10-21 21:01:44,620 INFO [train.py:682] (1/4) Start epoch 3193 2024-10-21 21:02:04,564 INFO [train.py:561] (1/4) Epoch 3193, batch 8, global_batch_idx: 51080, batch size: 170, loss[dur_loss=0.1909, prior_loss=0.9717, diff_loss=0.3224, tot_loss=1.485, over 170.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.971, diff_loss=0.3809, tot_loss=1.536, over 1432.00 samples.], 2024-10-21 21:02:14,572 INFO [train.py:682] (1/4) Start epoch 3194 2024-10-21 21:02:26,157 INFO [train.py:561] (1/4) Epoch 3194, batch 2, global_batch_idx: 51090, batch size: 203, loss[dur_loss=0.1885, prior_loss=0.9717, diff_loss=0.3354, tot_loss=1.496, over 203.00 samples.], tot_loss[dur_loss=0.1878, prior_loss=0.9716, diff_loss=0.3178, tot_loss=1.477, over 442.00 samples.], 2024-10-21 21:02:40,240 INFO [train.py:561] (1/4) Epoch 3194, batch 12, global_batch_idx: 51100, batch size: 152, loss[dur_loss=0.1846, prior_loss=0.9716, diff_loss=0.3077, tot_loss=1.464, over 152.00 samples.], tot_loss[dur_loss=0.1851, prior_loss=0.9713, diff_loss=0.3556, tot_loss=1.512, over 1966.00 samples.], 2024-10-21 21:02:44,621 INFO [train.py:682] (1/4) Start epoch 3195 2024-10-21 21:03:01,770 INFO [train.py:561] (1/4) Epoch 3195, batch 6, global_batch_idx: 51110, batch size: 106, loss[dur_loss=0.1857, prior_loss=0.9714, diff_loss=0.2636, tot_loss=1.421, over 106.00 samples.], tot_loss[dur_loss=0.1836, prior_loss=0.971, diff_loss=0.3848, tot_loss=1.539, over 1142.00 samples.], 2024-10-21 21:03:14,721 INFO [train.py:682] (1/4) Start epoch 3196 2024-10-21 21:03:23,402 INFO [train.py:561] (1/4) Epoch 3196, batch 0, global_batch_idx: 51120, batch size: 108, loss[dur_loss=0.1916, prior_loss=0.9723, diff_loss=0.2934, tot_loss=1.457, over 108.00 samples.], tot_loss[dur_loss=0.1916, prior_loss=0.9723, diff_loss=0.2934, tot_loss=1.457, over 108.00 samples.], 2024-10-21 21:03:37,598 INFO [train.py:561] (1/4) Epoch 3196, batch 10, global_batch_idx: 51130, batch size: 111, loss[dur_loss=0.1881, prior_loss=0.973, diff_loss=0.2878, tot_loss=1.449, over 111.00 samples.], tot_loss[dur_loss=0.1862, prior_loss=0.9713, diff_loss=0.3632, tot_loss=1.521, over 1656.00 samples.], 2024-10-21 21:03:44,588 INFO [train.py:682] (1/4) Start epoch 3197 2024-10-21 21:03:58,612 INFO [train.py:561] (1/4) Epoch 3197, batch 4, global_batch_idx: 51140, batch size: 189, loss[dur_loss=0.1844, prior_loss=0.9715, diff_loss=0.3024, tot_loss=1.458, over 189.00 samples.], tot_loss[dur_loss=0.1835, prior_loss=0.9708, diff_loss=0.4095, tot_loss=1.564, over 937.00 samples.], 2024-10-21 21:04:13,320 INFO [train.py:561] (1/4) Epoch 3197, batch 14, global_batch_idx: 51150, batch size: 142, loss[dur_loss=0.1897, prior_loss=0.9714, diff_loss=0.3086, tot_loss=1.47, over 142.00 samples.], tot_loss[dur_loss=0.1857, prior_loss=0.9713, diff_loss=0.3455, tot_loss=1.502, over 2210.00 samples.], 2024-10-21 21:04:14,723 INFO [train.py:682] (1/4) Start epoch 3198 2024-10-21 21:04:34,697 INFO [train.py:561] (1/4) Epoch 3198, batch 8, global_batch_idx: 51160, batch size: 170, loss[dur_loss=0.191, prior_loss=0.9719, diff_loss=0.2964, 
tot_loss=1.459, over 170.00 samples.], tot_loss[dur_loss=0.1851, prior_loss=0.9711, diff_loss=0.3709, tot_loss=1.527, over 1432.00 samples.], 2024-10-21 21:04:44,702 INFO [train.py:682] (1/4) Start epoch 3199 2024-10-21 21:04:56,118 INFO [train.py:561] (1/4) Epoch 3199, batch 2, global_batch_idx: 51170, batch size: 203, loss[dur_loss=0.188, prior_loss=0.9715, diff_loss=0.3446, tot_loss=1.504, over 203.00 samples.], tot_loss[dur_loss=0.1864, prior_loss=0.9715, diff_loss=0.3109, tot_loss=1.469, over 442.00 samples.], 2024-10-21 21:05:10,333 INFO [train.py:561] (1/4) Epoch 3199, batch 12, global_batch_idx: 51180, batch size: 152, loss[dur_loss=0.1863, prior_loss=0.9718, diff_loss=0.2965, tot_loss=1.455, over 152.00 samples.], tot_loss[dur_loss=0.1857, prior_loss=0.9713, diff_loss=0.3559, tot_loss=1.513, over 1966.00 samples.], 2024-10-21 21:05:14,692 INFO [train.py:682] (1/4) Start epoch 3200 2024-10-21 21:05:32,109 INFO [train.py:561] (1/4) Epoch 3200, batch 6, global_batch_idx: 51190, batch size: 106, loss[dur_loss=0.1846, prior_loss=0.9714, diff_loss=0.3211, tot_loss=1.477, over 106.00 samples.], tot_loss[dur_loss=0.1845, prior_loss=0.971, diff_loss=0.3857, tot_loss=1.541, over 1142.00 samples.], 2024-10-21 21:05:45,009 INFO [train.py:682] (1/4) Start epoch 3201 2024-10-21 21:05:53,747 INFO [train.py:561] (1/4) Epoch 3201, batch 0, global_batch_idx: 51200, batch size: 108, loss[dur_loss=0.1911, prior_loss=0.9721, diff_loss=0.2693, tot_loss=1.432, over 108.00 samples.], tot_loss[dur_loss=0.1911, prior_loss=0.9721, diff_loss=0.2693, tot_loss=1.432, over 108.00 samples.], 2024-10-21 21:06:08,040 INFO [train.py:561] (1/4) Epoch 3201, batch 10, global_batch_idx: 51210, batch size: 111, loss[dur_loss=0.1883, prior_loss=0.9729, diff_loss=0.3026, tot_loss=1.464, over 111.00 samples.], tot_loss[dur_loss=0.1849, prior_loss=0.9713, diff_loss=0.3692, tot_loss=1.525, over 1656.00 samples.], 2024-10-21 21:06:15,092 INFO [train.py:682] (1/4) Start epoch 3202 2024-10-21 21:06:28,901 INFO [train.py:561] (1/4) Epoch 3202, batch 4, global_batch_idx: 51220, batch size: 189, loss[dur_loss=0.1868, prior_loss=0.9715, diff_loss=0.3371, tot_loss=1.495, over 189.00 samples.], tot_loss[dur_loss=0.1846, prior_loss=0.9709, diff_loss=0.411, tot_loss=1.566, over 937.00 samples.], 2024-10-21 21:06:43,783 INFO [train.py:561] (1/4) Epoch 3202, batch 14, global_batch_idx: 51230, batch size: 142, loss[dur_loss=0.1891, prior_loss=0.9718, diff_loss=0.32, tot_loss=1.481, over 142.00 samples.], tot_loss[dur_loss=0.1871, prior_loss=0.9714, diff_loss=0.3466, tot_loss=1.505, over 2210.00 samples.], 2024-10-21 21:06:45,199 INFO [train.py:682] (1/4) Start epoch 3203 2024-10-21 21:07:05,735 INFO [train.py:561] (1/4) Epoch 3203, batch 8, global_batch_idx: 51240, batch size: 170, loss[dur_loss=0.1923, prior_loss=0.9717, diff_loss=0.3472, tot_loss=1.511, over 170.00 samples.], tot_loss[dur_loss=0.1859, prior_loss=0.9712, diff_loss=0.3752, tot_loss=1.532, over 1432.00 samples.], 2024-10-21 21:07:15,879 INFO [train.py:682] (1/4) Start epoch 3204 2024-10-21 21:07:27,458 INFO [train.py:561] (1/4) Epoch 3204, batch 2, global_batch_idx: 51250, batch size: 203, loss[dur_loss=0.1886, prior_loss=0.9716, diff_loss=0.3312, tot_loss=1.491, over 203.00 samples.], tot_loss[dur_loss=0.1883, prior_loss=0.9716, diff_loss=0.3003, tot_loss=1.46, over 442.00 samples.], 2024-10-21 21:07:41,835 INFO [train.py:561] (1/4) Epoch 3204, batch 12, global_batch_idx: 51260, batch size: 152, loss[dur_loss=0.187, prior_loss=0.9716, diff_loss=0.3196, tot_loss=1.478, 
over 152.00 samples.], tot_loss[dur_loss=0.1861, prior_loss=0.9713, diff_loss=0.3568, tot_loss=1.514, over 1966.00 samples.], 2024-10-21 21:07:46,314 INFO [train.py:682] (1/4) Start epoch 3205 2024-10-21 21:08:03,414 INFO [train.py:561] (1/4) Epoch 3205, batch 6, global_batch_idx: 51270, batch size: 106, loss[dur_loss=0.1867, prior_loss=0.9716, diff_loss=0.2485, tot_loss=1.407, over 106.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.9709, diff_loss=0.381, tot_loss=1.536, over 1142.00 samples.], 2024-10-21 21:08:16,520 INFO [train.py:682] (1/4) Start epoch 3206 2024-10-21 21:08:25,526 INFO [train.py:561] (1/4) Epoch 3206, batch 0, global_batch_idx: 51280, batch size: 108, loss[dur_loss=0.1908, prior_loss=0.9722, diff_loss=0.2712, tot_loss=1.434, over 108.00 samples.], tot_loss[dur_loss=0.1908, prior_loss=0.9722, diff_loss=0.2712, tot_loss=1.434, over 108.00 samples.], 2024-10-21 21:08:39,936 INFO [train.py:561] (1/4) Epoch 3206, batch 10, global_batch_idx: 51290, batch size: 111, loss[dur_loss=0.1891, prior_loss=0.9725, diff_loss=0.3171, tot_loss=1.479, over 111.00 samples.], tot_loss[dur_loss=0.1847, prior_loss=0.9712, diff_loss=0.3582, tot_loss=1.514, over 1656.00 samples.], 2024-10-21 21:08:47,049 INFO [train.py:682] (1/4) Start epoch 3207 2024-10-21 21:09:00,959 INFO [train.py:561] (1/4) Epoch 3207, batch 4, global_batch_idx: 51300, batch size: 189, loss[dur_loss=0.1858, prior_loss=0.9715, diff_loss=0.337, tot_loss=1.494, over 189.00 samples.], tot_loss[dur_loss=0.1829, prior_loss=0.9707, diff_loss=0.408, tot_loss=1.562, over 937.00 samples.], 2024-10-21 21:09:16,035 INFO [train.py:561] (1/4) Epoch 3207, batch 14, global_batch_idx: 51310, batch size: 142, loss[dur_loss=0.188, prior_loss=0.9714, diff_loss=0.2803, tot_loss=1.44, over 142.00 samples.], tot_loss[dur_loss=0.1857, prior_loss=0.9713, diff_loss=0.3402, tot_loss=1.497, over 2210.00 samples.], 2024-10-21 21:09:17,470 INFO [train.py:682] (1/4) Start epoch 3208 2024-10-21 21:09:37,958 INFO [train.py:561] (1/4) Epoch 3208, batch 8, global_batch_idx: 51320, batch size: 170, loss[dur_loss=0.1887, prior_loss=0.9718, diff_loss=0.2987, tot_loss=1.459, over 170.00 samples.], tot_loss[dur_loss=0.1845, prior_loss=0.9712, diff_loss=0.3601, tot_loss=1.516, over 1432.00 samples.], 2024-10-21 21:09:48,115 INFO [train.py:682] (1/4) Start epoch 3209 2024-10-21 21:09:59,729 INFO [train.py:561] (1/4) Epoch 3209, batch 2, global_batch_idx: 51330, batch size: 203, loss[dur_loss=0.1866, prior_loss=0.9716, diff_loss=0.3249, tot_loss=1.483, over 203.00 samples.], tot_loss[dur_loss=0.1868, prior_loss=0.9717, diff_loss=0.3106, tot_loss=1.469, over 442.00 samples.], 2024-10-21 21:10:14,204 INFO [train.py:561] (1/4) Epoch 3209, batch 12, global_batch_idx: 51340, batch size: 152, loss[dur_loss=0.1834, prior_loss=0.9715, diff_loss=0.3069, tot_loss=1.462, over 152.00 samples.], tot_loss[dur_loss=0.1851, prior_loss=0.9713, diff_loss=0.3509, tot_loss=1.507, over 1966.00 samples.], 2024-10-21 21:10:18,680 INFO [train.py:682] (1/4) Start epoch 3210 2024-10-21 21:10:36,008 INFO [train.py:561] (1/4) Epoch 3210, batch 6, global_batch_idx: 51350, batch size: 106, loss[dur_loss=0.1857, prior_loss=0.9716, diff_loss=0.2937, tot_loss=1.451, over 106.00 samples.], tot_loss[dur_loss=0.1831, prior_loss=0.9711, diff_loss=0.3911, tot_loss=1.545, over 1142.00 samples.], 2024-10-21 21:10:49,168 INFO [train.py:682] (1/4) Start epoch 3211 2024-10-21 21:10:57,988 INFO [train.py:561] (1/4) Epoch 3211, batch 0, global_batch_idx: 51360, batch size: 108, loss[dur_loss=0.1903, 
prior_loss=0.9718, diff_loss=0.3177, tot_loss=1.48, over 108.00 samples.], tot_loss[dur_loss=0.1903, prior_loss=0.9718, diff_loss=0.3177, tot_loss=1.48, over 108.00 samples.], 2024-10-21 21:11:12,422 INFO [train.py:561] (1/4) Epoch 3211, batch 10, global_batch_idx: 51370, batch size: 111, loss[dur_loss=0.1886, prior_loss=0.9726, diff_loss=0.2898, tot_loss=1.451, over 111.00 samples.], tot_loss[dur_loss=0.1853, prior_loss=0.9712, diff_loss=0.3608, tot_loss=1.517, over 1656.00 samples.], 2024-10-21 21:11:19,536 INFO [train.py:682] (1/4) Start epoch 3212 2024-10-21 21:11:33,653 INFO [train.py:561] (1/4) Epoch 3212, batch 4, global_batch_idx: 51380, batch size: 189, loss[dur_loss=0.1857, prior_loss=0.9717, diff_loss=0.2878, tot_loss=1.445, over 189.00 samples.], tot_loss[dur_loss=0.1834, prior_loss=0.9707, diff_loss=0.4166, tot_loss=1.571, over 937.00 samples.], 2024-10-21 21:11:48,670 INFO [train.py:561] (1/4) Epoch 3212, batch 14, global_batch_idx: 51390, batch size: 142, loss[dur_loss=0.1865, prior_loss=0.9714, diff_loss=0.3029, tot_loss=1.461, over 142.00 samples.], tot_loss[dur_loss=0.1855, prior_loss=0.9713, diff_loss=0.3427, tot_loss=1.5, over 2210.00 samples.], 2024-10-21 21:11:50,102 INFO [train.py:682] (1/4) Start epoch 3213 2024-10-21 21:12:10,305 INFO [train.py:561] (1/4) Epoch 3213, batch 8, global_batch_idx: 51400, batch size: 170, loss[dur_loss=0.1918, prior_loss=0.9719, diff_loss=0.3091, tot_loss=1.473, over 170.00 samples.], tot_loss[dur_loss=0.1853, prior_loss=0.9711, diff_loss=0.3622, tot_loss=1.519, over 1432.00 samples.], 2024-10-21 21:12:20,517 INFO [train.py:682] (1/4) Start epoch 3214 2024-10-21 21:12:31,983 INFO [train.py:561] (1/4) Epoch 3214, batch 2, global_batch_idx: 51410, batch size: 203, loss[dur_loss=0.1888, prior_loss=0.9716, diff_loss=0.3492, tot_loss=1.51, over 203.00 samples.], tot_loss[dur_loss=0.1884, prior_loss=0.9716, diff_loss=0.3104, tot_loss=1.47, over 442.00 samples.], 2024-10-21 21:12:46,372 INFO [train.py:561] (1/4) Epoch 3214, batch 12, global_batch_idx: 51420, batch size: 152, loss[dur_loss=0.1844, prior_loss=0.9715, diff_loss=0.299, tot_loss=1.455, over 152.00 samples.], tot_loss[dur_loss=0.1863, prior_loss=0.9713, diff_loss=0.344, tot_loss=1.502, over 1966.00 samples.], 2024-10-21 21:12:50,856 INFO [train.py:682] (1/4) Start epoch 3215 2024-10-21 21:13:08,341 INFO [train.py:561] (1/4) Epoch 3215, batch 6, global_batch_idx: 51430, batch size: 106, loss[dur_loss=0.1883, prior_loss=0.9718, diff_loss=0.2986, tot_loss=1.459, over 106.00 samples.], tot_loss[dur_loss=0.1841, prior_loss=0.971, diff_loss=0.402, tot_loss=1.557, over 1142.00 samples.], 2024-10-21 21:13:21,467 INFO [train.py:682] (1/4) Start epoch 3216 2024-10-21 21:13:30,457 INFO [train.py:561] (1/4) Epoch 3216, batch 0, global_batch_idx: 51440, batch size: 108, loss[dur_loss=0.1892, prior_loss=0.9721, diff_loss=0.2969, tot_loss=1.458, over 108.00 samples.], tot_loss[dur_loss=0.1892, prior_loss=0.9721, diff_loss=0.2969, tot_loss=1.458, over 108.00 samples.], 2024-10-21 21:13:44,762 INFO [train.py:561] (1/4) Epoch 3216, batch 10, global_batch_idx: 51450, batch size: 111, loss[dur_loss=0.1893, prior_loss=0.9726, diff_loss=0.2819, tot_loss=1.444, over 111.00 samples.], tot_loss[dur_loss=0.184, prior_loss=0.9712, diff_loss=0.3651, tot_loss=1.52, over 1656.00 samples.], 2024-10-21 21:13:51,941 INFO [train.py:682] (1/4) Start epoch 3217 2024-10-21 21:14:05,738 INFO [train.py:561] (1/4) Epoch 3217, batch 4, global_batch_idx: 51460, batch size: 189, loss[dur_loss=0.1901, prior_loss=0.9716, 
diff_loss=0.3224, tot_loss=1.484, over 189.00 samples.], tot_loss[dur_loss=0.1829, prior_loss=0.9708, diff_loss=0.4049, tot_loss=1.559, over 937.00 samples.], 2024-10-21 21:14:20,722 INFO [train.py:561] (1/4) Epoch 3217, batch 14, global_batch_idx: 51470, batch size: 142, loss[dur_loss=0.19, prior_loss=0.9713, diff_loss=0.3374, tot_loss=1.499, over 142.00 samples.], tot_loss[dur_loss=0.1856, prior_loss=0.9712, diff_loss=0.3528, tot_loss=1.51, over 2210.00 samples.], 2024-10-21 21:14:22,169 INFO [train.py:682] (1/4) Start epoch 3218 2024-10-21 21:14:42,566 INFO [train.py:561] (1/4) Epoch 3218, batch 8, global_batch_idx: 51480, batch size: 170, loss[dur_loss=0.1898, prior_loss=0.9717, diff_loss=0.3195, tot_loss=1.481, over 170.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.971, diff_loss=0.376, tot_loss=1.531, over 1432.00 samples.], 2024-10-21 21:14:52,808 INFO [train.py:682] (1/4) Start epoch 3219 2024-10-21 21:15:04,149 INFO [train.py:561] (1/4) Epoch 3219, batch 2, global_batch_idx: 51490, batch size: 203, loss[dur_loss=0.1861, prior_loss=0.9713, diff_loss=0.3252, tot_loss=1.483, over 203.00 samples.], tot_loss[dur_loss=0.186, prior_loss=0.9714, diff_loss=0.2975, tot_loss=1.455, over 442.00 samples.], 2024-10-21 21:15:18,650 INFO [train.py:561] (1/4) Epoch 3219, batch 12, global_batch_idx: 51500, batch size: 152, loss[dur_loss=0.1861, prior_loss=0.9716, diff_loss=0.3188, tot_loss=1.476, over 152.00 samples.], tot_loss[dur_loss=0.1841, prior_loss=0.9711, diff_loss=0.3404, tot_loss=1.496, over 1966.00 samples.], 2024-10-21 21:15:23,150 INFO [train.py:682] (1/4) Start epoch 3220 2024-10-21 21:15:40,508 INFO [train.py:561] (1/4) Epoch 3220, batch 6, global_batch_idx: 51510, batch size: 106, loss[dur_loss=0.1845, prior_loss=0.9714, diff_loss=0.267, tot_loss=1.423, over 106.00 samples.], tot_loss[dur_loss=0.1828, prior_loss=0.9709, diff_loss=0.3805, tot_loss=1.534, over 1142.00 samples.], 2024-10-21 21:15:53,868 INFO [train.py:682] (1/4) Start epoch 3221 2024-10-21 21:16:02,661 INFO [train.py:561] (1/4) Epoch 3221, batch 0, global_batch_idx: 51520, batch size: 108, loss[dur_loss=0.1897, prior_loss=0.9721, diff_loss=0.3187, tot_loss=1.48, over 108.00 samples.], tot_loss[dur_loss=0.1897, prior_loss=0.9721, diff_loss=0.3187, tot_loss=1.48, over 108.00 samples.], 2024-10-21 21:16:17,052 INFO [train.py:561] (1/4) Epoch 3221, batch 10, global_batch_idx: 51530, batch size: 111, loss[dur_loss=0.1869, prior_loss=0.9721, diff_loss=0.2788, tot_loss=1.438, over 111.00 samples.], tot_loss[dur_loss=0.1843, prior_loss=0.9711, diff_loss=0.3662, tot_loss=1.522, over 1656.00 samples.], 2024-10-21 21:16:24,240 INFO [train.py:682] (1/4) Start epoch 3222 2024-10-21 21:16:38,381 INFO [train.py:561] (1/4) Epoch 3222, batch 4, global_batch_idx: 51540, batch size: 189, loss[dur_loss=0.1876, prior_loss=0.9715, diff_loss=0.3108, tot_loss=1.47, over 189.00 samples.], tot_loss[dur_loss=0.1837, prior_loss=0.9707, diff_loss=0.4036, tot_loss=1.558, over 937.00 samples.], 2024-10-21 21:16:53,527 INFO [train.py:561] (1/4) Epoch 3222, batch 14, global_batch_idx: 51550, batch size: 142, loss[dur_loss=0.1907, prior_loss=0.9712, diff_loss=0.3101, tot_loss=1.472, over 142.00 samples.], tot_loss[dur_loss=0.1858, prior_loss=0.9712, diff_loss=0.3474, tot_loss=1.504, over 2210.00 samples.], 2024-10-21 21:16:54,965 INFO [train.py:682] (1/4) Start epoch 3223 2024-10-21 21:17:14,952 INFO [train.py:561] (1/4) Epoch 3223, batch 8, global_batch_idx: 51560, batch size: 170, loss[dur_loss=0.1865, prior_loss=0.9715, diff_loss=0.326, 
tot_loss=1.484, over 170.00 samples.], tot_loss[dur_loss=0.1847, prior_loss=0.971, diff_loss=0.3736, tot_loss=1.529, over 1432.00 samples.], 2024-10-21 21:17:25,158 INFO [train.py:682] (1/4) Start epoch 3224 2024-10-21 21:17:37,044 INFO [train.py:561] (1/4) Epoch 3224, batch 2, global_batch_idx: 51570, batch size: 203, loss[dur_loss=0.1887, prior_loss=0.9715, diff_loss=0.3334, tot_loss=1.494, over 203.00 samples.], tot_loss[dur_loss=0.1867, prior_loss=0.9715, diff_loss=0.3153, tot_loss=1.473, over 442.00 samples.], 2024-10-21 21:17:51,553 INFO [train.py:561] (1/4) Epoch 3224, batch 12, global_batch_idx: 51580, batch size: 152, loss[dur_loss=0.1846, prior_loss=0.9716, diff_loss=0.2829, tot_loss=1.439, over 152.00 samples.], tot_loss[dur_loss=0.1848, prior_loss=0.9711, diff_loss=0.3527, tot_loss=1.509, over 1966.00 samples.], 2024-10-21 21:17:56,077 INFO [train.py:682] (1/4) Start epoch 3225 2024-10-21 21:18:13,276 INFO [train.py:561] (1/4) Epoch 3225, batch 6, global_batch_idx: 51590, batch size: 106, loss[dur_loss=0.1846, prior_loss=0.9712, diff_loss=0.3382, tot_loss=1.494, over 106.00 samples.], tot_loss[dur_loss=0.183, prior_loss=0.9707, diff_loss=0.3853, tot_loss=1.539, over 1142.00 samples.], 2024-10-21 21:18:26,476 INFO [train.py:682] (1/4) Start epoch 3226 2024-10-21 21:18:35,353 INFO [train.py:561] (1/4) Epoch 3226, batch 0, global_batch_idx: 51600, batch size: 108, loss[dur_loss=0.1869, prior_loss=0.9719, diff_loss=0.3419, tot_loss=1.501, over 108.00 samples.], tot_loss[dur_loss=0.1869, prior_loss=0.9719, diff_loss=0.3419, tot_loss=1.501, over 108.00 samples.], 2024-10-21 21:18:49,724 INFO [train.py:561] (1/4) Epoch 3226, batch 10, global_batch_idx: 51610, batch size: 111, loss[dur_loss=0.1845, prior_loss=0.9723, diff_loss=0.3043, tot_loss=1.461, over 111.00 samples.], tot_loss[dur_loss=0.1846, prior_loss=0.9711, diff_loss=0.3659, tot_loss=1.522, over 1656.00 samples.], 2024-10-21 21:18:56,921 INFO [train.py:682] (1/4) Start epoch 3227 2024-10-21 21:19:10,703 INFO [train.py:561] (1/4) Epoch 3227, batch 4, global_batch_idx: 51620, batch size: 189, loss[dur_loss=0.1839, prior_loss=0.9714, diff_loss=0.3251, tot_loss=1.48, over 189.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.9705, diff_loss=0.3958, tot_loss=1.548, over 937.00 samples.], 2024-10-21 21:19:25,908 INFO [train.py:561] (1/4) Epoch 3227, batch 14, global_batch_idx: 51630, batch size: 142, loss[dur_loss=0.1878, prior_loss=0.971, diff_loss=0.316, tot_loss=1.475, over 142.00 samples.], tot_loss[dur_loss=0.1848, prior_loss=0.9711, diff_loss=0.3432, tot_loss=1.499, over 2210.00 samples.], 2024-10-21 21:19:27,359 INFO [train.py:682] (1/4) Start epoch 3228 2024-10-21 21:19:47,629 INFO [train.py:561] (1/4) Epoch 3228, batch 8, global_batch_idx: 51640, batch size: 170, loss[dur_loss=0.1889, prior_loss=0.9715, diff_loss=0.3273, tot_loss=1.488, over 170.00 samples.], tot_loss[dur_loss=0.1829, prior_loss=0.9708, diff_loss=0.3698, tot_loss=1.524, over 1432.00 samples.], 2024-10-21 21:19:57,968 INFO [train.py:682] (1/4) Start epoch 3229 2024-10-21 21:20:09,524 INFO [train.py:561] (1/4) Epoch 3229, batch 2, global_batch_idx: 51650, batch size: 203, loss[dur_loss=0.1869, prior_loss=0.9713, diff_loss=0.3353, tot_loss=1.494, over 203.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9713, diff_loss=0.301, tot_loss=1.459, over 442.00 samples.], 2024-10-21 21:20:24,032 INFO [train.py:561] (1/4) Epoch 3229, batch 12, global_batch_idx: 51660, batch size: 152, loss[dur_loss=0.1868, prior_loss=0.9715, diff_loss=0.2926, tot_loss=1.451, 
over 152.00 samples.], tot_loss[dur_loss=0.185, prior_loss=0.9711, diff_loss=0.353, tot_loss=1.509, over 1966.00 samples.], 2024-10-21 21:20:28,540 INFO [train.py:682] (1/4) Start epoch 3230 2024-10-21 21:20:45,686 INFO [train.py:561] (1/4) Epoch 3230, batch 6, global_batch_idx: 51670, batch size: 106, loss[dur_loss=0.1886, prior_loss=0.9713, diff_loss=0.2648, tot_loss=1.425, over 106.00 samples.], tot_loss[dur_loss=0.1834, prior_loss=0.9708, diff_loss=0.38, tot_loss=1.534, over 1142.00 samples.], 2024-10-21 21:20:58,923 INFO [train.py:682] (1/4) Start epoch 3231 2024-10-21 21:21:07,642 INFO [train.py:561] (1/4) Epoch 3231, batch 0, global_batch_idx: 51680, batch size: 108, loss[dur_loss=0.1882, prior_loss=0.9719, diff_loss=0.2564, tot_loss=1.417, over 108.00 samples.], tot_loss[dur_loss=0.1882, prior_loss=0.9719, diff_loss=0.2564, tot_loss=1.417, over 108.00 samples.], 2024-10-21 21:21:22,059 INFO [train.py:561] (1/4) Epoch 3231, batch 10, global_batch_idx: 51690, batch size: 111, loss[dur_loss=0.187, prior_loss=0.9723, diff_loss=0.2716, tot_loss=1.431, over 111.00 samples.], tot_loss[dur_loss=0.1847, prior_loss=0.9711, diff_loss=0.358, tot_loss=1.514, over 1656.00 samples.], 2024-10-21 21:21:29,281 INFO [train.py:682] (1/4) Start epoch 3232 2024-10-21 21:21:42,837 INFO [train.py:561] (1/4) Epoch 3232, batch 4, global_batch_idx: 51700, batch size: 189, loss[dur_loss=0.1868, prior_loss=0.9714, diff_loss=0.3018, tot_loss=1.46, over 189.00 samples.], tot_loss[dur_loss=0.1836, prior_loss=0.9706, diff_loss=0.397, tot_loss=1.551, over 937.00 samples.], 2024-10-21 21:21:57,908 INFO [train.py:561] (1/4) Epoch 3232, batch 14, global_batch_idx: 51710, batch size: 142, loss[dur_loss=0.1859, prior_loss=0.971, diff_loss=0.3338, tot_loss=1.491, over 142.00 samples.], tot_loss[dur_loss=0.1855, prior_loss=0.9711, diff_loss=0.3428, tot_loss=1.499, over 2210.00 samples.], 2024-10-21 21:21:59,356 INFO [train.py:682] (1/4) Start epoch 3233 2024-10-21 21:22:19,792 INFO [train.py:561] (1/4) Epoch 3233, batch 8, global_batch_idx: 51720, batch size: 170, loss[dur_loss=0.1883, prior_loss=0.9712, diff_loss=0.2998, tot_loss=1.459, over 170.00 samples.], tot_loss[dur_loss=0.184, prior_loss=0.9708, diff_loss=0.379, tot_loss=1.534, over 1432.00 samples.], 2024-10-21 21:22:30,084 INFO [train.py:682] (1/4) Start epoch 3234 2024-10-21 21:22:41,502 INFO [train.py:561] (1/4) Epoch 3234, batch 2, global_batch_idx: 51730, batch size: 203, loss[dur_loss=0.1853, prior_loss=0.971, diff_loss=0.3173, tot_loss=1.474, over 203.00 samples.], tot_loss[dur_loss=0.1856, prior_loss=0.9712, diff_loss=0.3149, tot_loss=1.472, over 442.00 samples.], 2024-10-21 21:22:55,927 INFO [train.py:561] (1/4) Epoch 3234, batch 12, global_batch_idx: 51740, batch size: 152, loss[dur_loss=0.1849, prior_loss=0.9714, diff_loss=0.2977, tot_loss=1.454, over 152.00 samples.], tot_loss[dur_loss=0.184, prior_loss=0.971, diff_loss=0.3425, tot_loss=1.497, over 1966.00 samples.], 2024-10-21 21:23:00,412 INFO [train.py:682] (1/4) Start epoch 3235 2024-10-21 21:23:17,624 INFO [train.py:561] (1/4) Epoch 3235, batch 6, global_batch_idx: 51750, batch size: 106, loss[dur_loss=0.1862, prior_loss=0.9711, diff_loss=0.263, tot_loss=1.42, over 106.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.9707, diff_loss=0.3846, tot_loss=1.539, over 1142.00 samples.], 2024-10-21 21:23:30,904 INFO [train.py:682] (1/4) Start epoch 3236 2024-10-21 21:23:39,555 INFO [train.py:561] (1/4) Epoch 3236, batch 0, global_batch_idx: 51760, batch size: 108, loss[dur_loss=0.1891, 
prior_loss=0.9718, diff_loss=0.2781, tot_loss=1.439, over 108.00 samples.], tot_loss[dur_loss=0.1891, prior_loss=0.9718, diff_loss=0.2781, tot_loss=1.439, over 108.00 samples.], 2024-10-21 21:23:53,826 INFO [train.py:561] (1/4) Epoch 3236, batch 10, global_batch_idx: 51770, batch size: 111, loss[dur_loss=0.1869, prior_loss=0.9721, diff_loss=0.2591, tot_loss=1.418, over 111.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.971, diff_loss=0.3598, tot_loss=1.515, over 1656.00 samples.], 2024-10-21 21:24:00,985 INFO [train.py:682] (1/4) Start epoch 3237 2024-10-21 21:24:14,616 INFO [train.py:561] (1/4) Epoch 3237, batch 4, global_batch_idx: 51780, batch size: 189, loss[dur_loss=0.1874, prior_loss=0.9713, diff_loss=0.3414, tot_loss=1.5, over 189.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9706, diff_loss=0.4297, tot_loss=1.583, over 937.00 samples.], 2024-10-21 21:24:29,675 INFO [train.py:561] (1/4) Epoch 3237, batch 14, global_batch_idx: 51790, batch size: 142, loss[dur_loss=0.1871, prior_loss=0.971, diff_loss=0.2688, tot_loss=1.427, over 142.00 samples.], tot_loss[dur_loss=0.1845, prior_loss=0.971, diff_loss=0.3503, tot_loss=1.506, over 2210.00 samples.], 2024-10-21 21:24:31,115 INFO [train.py:682] (1/4) Start epoch 3238 2024-10-21 21:24:51,208 INFO [train.py:561] (1/4) Epoch 3238, batch 8, global_batch_idx: 51800, batch size: 170, loss[dur_loss=0.1864, prior_loss=0.9714, diff_loss=0.311, tot_loss=1.469, over 170.00 samples.], tot_loss[dur_loss=0.1854, prior_loss=0.9708, diff_loss=0.3739, tot_loss=1.53, over 1432.00 samples.], 2024-10-21 21:25:01,459 INFO [train.py:682] (1/4) Start epoch 3239 2024-10-21 21:25:12,838 INFO [train.py:561] (1/4) Epoch 3239, batch 2, global_batch_idx: 51810, batch size: 203, loss[dur_loss=0.1862, prior_loss=0.9711, diff_loss=0.3534, tot_loss=1.511, over 203.00 samples.], tot_loss[dur_loss=0.1856, prior_loss=0.9712, diff_loss=0.3162, tot_loss=1.473, over 442.00 samples.], 2024-10-21 21:25:27,209 INFO [train.py:561] (1/4) Epoch 3239, batch 12, global_batch_idx: 51820, batch size: 152, loss[dur_loss=0.185, prior_loss=0.9714, diff_loss=0.2944, tot_loss=1.451, over 152.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.971, diff_loss=0.3474, tot_loss=1.502, over 1966.00 samples.], 2024-10-21 21:25:31,705 INFO [train.py:682] (1/4) Start epoch 3240 2024-10-21 21:25:48,688 INFO [train.py:561] (1/4) Epoch 3240, batch 6, global_batch_idx: 51830, batch size: 106, loss[dur_loss=0.1863, prior_loss=0.9712, diff_loss=0.3174, tot_loss=1.475, over 106.00 samples.], tot_loss[dur_loss=0.1828, prior_loss=0.9708, diff_loss=0.4104, tot_loss=1.564, over 1142.00 samples.], 2024-10-21 21:26:01,934 INFO [train.py:682] (1/4) Start epoch 3241 2024-10-21 21:26:10,784 INFO [train.py:561] (1/4) Epoch 3241, batch 0, global_batch_idx: 51840, batch size: 108, loss[dur_loss=0.1887, prior_loss=0.9721, diff_loss=0.3063, tot_loss=1.467, over 108.00 samples.], tot_loss[dur_loss=0.1887, prior_loss=0.9721, diff_loss=0.3063, tot_loss=1.467, over 108.00 samples.], 2024-10-21 21:26:25,184 INFO [train.py:561] (1/4) Epoch 3241, batch 10, global_batch_idx: 51850, batch size: 111, loss[dur_loss=0.1891, prior_loss=0.9723, diff_loss=0.2657, tot_loss=1.427, over 111.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.971, diff_loss=0.3596, tot_loss=1.515, over 1656.00 samples.], 2024-10-21 21:26:32,397 INFO [train.py:682] (1/4) Start epoch 3242 2024-10-21 21:26:46,350 INFO [train.py:561] (1/4) Epoch 3242, batch 4, global_batch_idx: 51860, batch size: 189, loss[dur_loss=0.1834, prior_loss=0.9714, 
diff_loss=0.307, tot_loss=1.462, over 189.00 samples.], tot_loss[dur_loss=0.1811, prior_loss=0.9706, diff_loss=0.4023, tot_loss=1.554, over 937.00 samples.], 2024-10-21 21:27:01,439 INFO [train.py:561] (1/4) Epoch 3242, batch 14, global_batch_idx: 51870, batch size: 142, loss[dur_loss=0.1872, prior_loss=0.971, diff_loss=0.2876, tot_loss=1.446, over 142.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.9711, diff_loss=0.3437, tot_loss=1.499, over 2210.00 samples.], 2024-10-21 21:27:02,875 INFO [train.py:682] (1/4) Start epoch 3243 2024-10-21 21:27:22,929 INFO [train.py:561] (1/4) Epoch 3243, batch 8, global_batch_idx: 51880, batch size: 170, loss[dur_loss=0.186, prior_loss=0.9712, diff_loss=0.2855, tot_loss=1.443, over 170.00 samples.], tot_loss[dur_loss=0.1843, prior_loss=0.9708, diff_loss=0.3636, tot_loss=1.519, over 1432.00 samples.], 2024-10-21 21:27:33,328 INFO [train.py:682] (1/4) Start epoch 3244 2024-10-21 21:27:45,114 INFO [train.py:561] (1/4) Epoch 3244, batch 2, global_batch_idx: 51890, batch size: 203, loss[dur_loss=0.1848, prior_loss=0.9712, diff_loss=0.3145, tot_loss=1.471, over 203.00 samples.], tot_loss[dur_loss=0.1854, prior_loss=0.9712, diff_loss=0.3049, tot_loss=1.462, over 442.00 samples.], 2024-10-21 21:27:59,575 INFO [train.py:561] (1/4) Epoch 3244, batch 12, global_batch_idx: 51900, batch size: 152, loss[dur_loss=0.1848, prior_loss=0.9714, diff_loss=0.2968, tot_loss=1.453, over 152.00 samples.], tot_loss[dur_loss=0.1841, prior_loss=0.971, diff_loss=0.3517, tot_loss=1.507, over 1966.00 samples.], 2024-10-21 21:28:04,092 INFO [train.py:682] (1/4) Start epoch 3245 2024-10-21 21:28:21,154 INFO [train.py:561] (1/4) Epoch 3245, batch 6, global_batch_idx: 51910, batch size: 106, loss[dur_loss=0.186, prior_loss=0.9713, diff_loss=0.2471, tot_loss=1.404, over 106.00 samples.], tot_loss[dur_loss=0.1831, prior_loss=0.9707, diff_loss=0.3812, tot_loss=1.535, over 1142.00 samples.], 2024-10-21 21:28:34,435 INFO [train.py:682] (1/4) Start epoch 3246 2024-10-21 21:28:43,532 INFO [train.py:561] (1/4) Epoch 3246, batch 0, global_batch_idx: 51920, batch size: 108, loss[dur_loss=0.1907, prior_loss=0.9718, diff_loss=0.2845, tot_loss=1.447, over 108.00 samples.], tot_loss[dur_loss=0.1907, prior_loss=0.9718, diff_loss=0.2845, tot_loss=1.447, over 108.00 samples.], 2024-10-21 21:28:57,798 INFO [train.py:561] (1/4) Epoch 3246, batch 10, global_batch_idx: 51930, batch size: 111, loss[dur_loss=0.1882, prior_loss=0.9724, diff_loss=0.3091, tot_loss=1.47, over 111.00 samples.], tot_loss[dur_loss=0.1837, prior_loss=0.9709, diff_loss=0.3653, tot_loss=1.52, over 1656.00 samples.], 2024-10-21 21:29:05,075 INFO [train.py:682] (1/4) Start epoch 3247 2024-10-21 21:29:18,903 INFO [train.py:561] (1/4) Epoch 3247, batch 4, global_batch_idx: 51940, batch size: 189, loss[dur_loss=0.1863, prior_loss=0.9713, diff_loss=0.3076, tot_loss=1.465, over 189.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9705, diff_loss=0.411, tot_loss=1.563, over 937.00 samples.], 2024-10-21 21:29:33,896 INFO [train.py:561] (1/4) Epoch 3247, batch 14, global_batch_idx: 51950, batch size: 142, loss[dur_loss=0.1866, prior_loss=0.9711, diff_loss=0.3045, tot_loss=1.462, over 142.00 samples.], tot_loss[dur_loss=0.1841, prior_loss=0.971, diff_loss=0.3479, tot_loss=1.503, over 2210.00 samples.], 2024-10-21 21:29:35,340 INFO [train.py:682] (1/4) Start epoch 3248 2024-10-21 21:29:55,244 INFO [train.py:561] (1/4) Epoch 3248, batch 8, global_batch_idx: 51960, batch size: 170, loss[dur_loss=0.1876, prior_loss=0.9714, diff_loss=0.3037, 
tot_loss=1.463, over 170.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9709, diff_loss=0.3692, tot_loss=1.523, over 1432.00 samples.], 2024-10-21 21:30:05,461 INFO [train.py:682] (1/4) Start epoch 3249 2024-10-21 21:30:16,882 INFO [train.py:561] (1/4) Epoch 3249, batch 2, global_batch_idx: 51970, batch size: 203, loss[dur_loss=0.185, prior_loss=0.9711, diff_loss=0.3186, tot_loss=1.475, over 203.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.9712, diff_loss=0.3063, tot_loss=1.462, over 442.00 samples.], 2024-10-21 21:30:31,255 INFO [train.py:561] (1/4) Epoch 3249, batch 12, global_batch_idx: 51980, batch size: 152, loss[dur_loss=0.1844, prior_loss=0.9714, diff_loss=0.2874, tot_loss=1.443, over 152.00 samples.], tot_loss[dur_loss=0.1842, prior_loss=0.971, diff_loss=0.3485, tot_loss=1.504, over 1966.00 samples.], 2024-10-21 21:30:35,764 INFO [train.py:682] (1/4) Start epoch 3250 2024-10-21 21:30:52,751 INFO [train.py:561] (1/4) Epoch 3250, batch 6, global_batch_idx: 51990, batch size: 106, loss[dur_loss=0.1866, prior_loss=0.9713, diff_loss=0.3115, tot_loss=1.469, over 106.00 samples.], tot_loss[dur_loss=0.1827, prior_loss=0.9708, diff_loss=0.3881, tot_loss=1.542, over 1142.00 samples.], 2024-10-21 21:31:06,024 INFO [train.py:682] (1/4) Start epoch 3251 2024-10-21 21:31:14,644 INFO [train.py:561] (1/4) Epoch 3251, batch 0, global_batch_idx: 52000, batch size: 108, loss[dur_loss=0.1882, prior_loss=0.9718, diff_loss=0.3086, tot_loss=1.469, over 108.00 samples.], tot_loss[dur_loss=0.1882, prior_loss=0.9718, diff_loss=0.3086, tot_loss=1.469, over 108.00 samples.], 2024-10-21 21:31:28,985 INFO [train.py:561] (1/4) Epoch 3251, batch 10, global_batch_idx: 52010, batch size: 111, loss[dur_loss=0.1866, prior_loss=0.9723, diff_loss=0.3012, tot_loss=1.46, over 111.00 samples.], tot_loss[dur_loss=0.183, prior_loss=0.9709, diff_loss=0.363, tot_loss=1.517, over 1656.00 samples.], 2024-10-21 21:31:36,199 INFO [train.py:682] (1/4) Start epoch 3252 2024-10-21 21:31:50,203 INFO [train.py:561] (1/4) Epoch 3252, batch 4, global_batch_idx: 52020, batch size: 189, loss[dur_loss=0.185, prior_loss=0.9712, diff_loss=0.3246, tot_loss=1.481, over 189.00 samples.], tot_loss[dur_loss=0.1814, prior_loss=0.9704, diff_loss=0.425, tot_loss=1.577, over 937.00 samples.], 2024-10-21 21:32:05,258 INFO [train.py:561] (1/4) Epoch 3252, batch 14, global_batch_idx: 52030, batch size: 142, loss[dur_loss=0.1879, prior_loss=0.9709, diff_loss=0.3102, tot_loss=1.469, over 142.00 samples.], tot_loss[dur_loss=0.1847, prior_loss=0.9709, diff_loss=0.3489, tot_loss=1.505, over 2210.00 samples.], 2024-10-21 21:32:06,720 INFO [train.py:682] (1/4) Start epoch 3253 2024-10-21 21:32:27,009 INFO [train.py:561] (1/4) Epoch 3253, batch 8, global_batch_idx: 52040, batch size: 170, loss[dur_loss=0.1869, prior_loss=0.9713, diff_loss=0.3018, tot_loss=1.46, over 170.00 samples.], tot_loss[dur_loss=0.1822, prior_loss=0.9707, diff_loss=0.3702, tot_loss=1.523, over 1432.00 samples.], 2024-10-21 21:32:37,311 INFO [train.py:682] (1/4) Start epoch 3254 2024-10-21 21:32:48,883 INFO [train.py:561] (1/4) Epoch 3254, batch 2, global_batch_idx: 52050, batch size: 203, loss[dur_loss=0.1855, prior_loss=0.9712, diff_loss=0.3357, tot_loss=1.492, over 203.00 samples.], tot_loss[dur_loss=0.1858, prior_loss=0.9712, diff_loss=0.3069, tot_loss=1.464, over 442.00 samples.], 2024-10-21 21:33:03,257 INFO [train.py:561] (1/4) Epoch 3254, batch 12, global_batch_idx: 52060, batch size: 152, loss[dur_loss=0.1888, prior_loss=0.9713, diff_loss=0.2851, tot_loss=1.445, over 
152.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.971, diff_loss=0.3561, tot_loss=1.511, over 1966.00 samples.], 2024-10-21 21:33:07,788 INFO [train.py:682] (1/4) Start epoch 3255 2024-10-21 21:33:25,044 INFO [train.py:561] (1/4) Epoch 3255, batch 6, global_batch_idx: 52070, batch size: 106, loss[dur_loss=0.1856, prior_loss=0.9712, diff_loss=0.2757, tot_loss=1.432, over 106.00 samples.], tot_loss[dur_loss=0.1826, prior_loss=0.9705, diff_loss=0.4072, tot_loss=1.56, over 1142.00 samples.], 2024-10-21 21:33:38,416 INFO [train.py:682] (1/4) Start epoch 3256 2024-10-21 21:33:47,222 INFO [train.py:561] (1/4) Epoch 3256, batch 0, global_batch_idx: 52080, batch size: 108, loss[dur_loss=0.1869, prior_loss=0.9717, diff_loss=0.3132, tot_loss=1.472, over 108.00 samples.], tot_loss[dur_loss=0.1869, prior_loss=0.9717, diff_loss=0.3132, tot_loss=1.472, over 108.00 samples.], 2024-10-21 21:34:01,625 INFO [train.py:561] (1/4) Epoch 3256, batch 10, global_batch_idx: 52090, batch size: 111, loss[dur_loss=0.1888, prior_loss=0.9721, diff_loss=0.2701, tot_loss=1.431, over 111.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.9709, diff_loss=0.3555, tot_loss=1.51, over 1656.00 samples.], 2024-10-21 21:34:08,829 INFO [train.py:682] (1/4) Start epoch 3257 2024-10-21 21:34:23,239 INFO [train.py:561] (1/4) Epoch 3257, batch 4, global_batch_idx: 52100, batch size: 189, loss[dur_loss=0.1854, prior_loss=0.9713, diff_loss=0.3316, tot_loss=1.488, over 189.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9706, diff_loss=0.4122, tot_loss=1.566, over 937.00 samples.], 2024-10-21 21:34:38,304 INFO [train.py:561] (1/4) Epoch 3257, batch 14, global_batch_idx: 52110, batch size: 142, loss[dur_loss=0.1862, prior_loss=0.9709, diff_loss=0.3092, tot_loss=1.466, over 142.00 samples.], tot_loss[dur_loss=0.1847, prior_loss=0.971, diff_loss=0.3439, tot_loss=1.5, over 2210.00 samples.], 2024-10-21 21:34:39,753 INFO [train.py:682] (1/4) Start epoch 3258 2024-10-21 21:34:59,807 INFO [train.py:561] (1/4) Epoch 3258, batch 8, global_batch_idx: 52120, batch size: 170, loss[dur_loss=0.186, prior_loss=0.9713, diff_loss=0.2985, tot_loss=1.456, over 170.00 samples.], tot_loss[dur_loss=0.184, prior_loss=0.9708, diff_loss=0.3734, tot_loss=1.528, over 1432.00 samples.], 2024-10-21 21:35:10,085 INFO [train.py:682] (1/4) Start epoch 3259 2024-10-21 21:35:21,570 INFO [train.py:561] (1/4) Epoch 3259, batch 2, global_batch_idx: 52130, batch size: 203, loss[dur_loss=0.1865, prior_loss=0.9712, diff_loss=0.3287, tot_loss=1.486, over 203.00 samples.], tot_loss[dur_loss=0.1864, prior_loss=0.9712, diff_loss=0.3082, tot_loss=1.466, over 442.00 samples.], 2024-10-21 21:35:35,972 INFO [train.py:561] (1/4) Epoch 3259, batch 12, global_batch_idx: 52140, batch size: 152, loss[dur_loss=0.1867, prior_loss=0.9714, diff_loss=0.2977, tot_loss=1.456, over 152.00 samples.], tot_loss[dur_loss=0.1836, prior_loss=0.9709, diff_loss=0.3484, tot_loss=1.503, over 1966.00 samples.], 2024-10-21 21:35:40,450 INFO [train.py:682] (1/4) Start epoch 3260 2024-10-21 21:35:57,488 INFO [train.py:561] (1/4) Epoch 3260, batch 6, global_batch_idx: 52150, batch size: 106, loss[dur_loss=0.1833, prior_loss=0.9712, diff_loss=0.2618, tot_loss=1.416, over 106.00 samples.], tot_loss[dur_loss=0.1828, prior_loss=0.9706, diff_loss=0.3727, tot_loss=1.526, over 1142.00 samples.], 2024-10-21 21:36:10,663 INFO [train.py:682] (1/4) Start epoch 3261 2024-10-21 21:36:19,395 INFO [train.py:561] (1/4) Epoch 3261, batch 0, global_batch_idx: 52160, batch size: 108, loss[dur_loss=0.1871, 
prior_loss=0.9716, diff_loss=0.2789, tot_loss=1.438, over 108.00 samples.], tot_loss[dur_loss=0.1871, prior_loss=0.9716, diff_loss=0.2789, tot_loss=1.438, over 108.00 samples.], 2024-10-21 21:36:33,817 INFO [train.py:561] (1/4) Epoch 3261, batch 10, global_batch_idx: 52170, batch size: 111, loss[dur_loss=0.1853, prior_loss=0.9723, diff_loss=0.305, tot_loss=1.463, over 111.00 samples.], tot_loss[dur_loss=0.1829, prior_loss=0.9709, diff_loss=0.356, tot_loss=1.51, over 1656.00 samples.], 2024-10-21 21:36:40,972 INFO [train.py:682] (1/4) Start epoch 3262 2024-10-21 21:36:54,796 INFO [train.py:561] (1/4) Epoch 3262, batch 4, global_batch_idx: 52180, batch size: 189, loss[dur_loss=0.1866, prior_loss=0.9713, diff_loss=0.3044, tot_loss=1.462, over 189.00 samples.], tot_loss[dur_loss=0.1815, prior_loss=0.9705, diff_loss=0.4111, tot_loss=1.563, over 937.00 samples.], 2024-10-21 21:37:09,839 INFO [train.py:561] (1/4) Epoch 3262, batch 14, global_batch_idx: 52190, batch size: 142, loss[dur_loss=0.1869, prior_loss=0.9712, diff_loss=0.3019, tot_loss=1.46, over 142.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.971, diff_loss=0.34, tot_loss=1.495, over 2210.00 samples.], 2024-10-21 21:37:11,263 INFO [train.py:682] (1/4) Start epoch 3263 2024-10-21 21:37:31,592 INFO [train.py:561] (1/4) Epoch 3263, batch 8, global_batch_idx: 52200, batch size: 170, loss[dur_loss=0.1877, prior_loss=0.9714, diff_loss=0.3189, tot_loss=1.478, over 170.00 samples.], tot_loss[dur_loss=0.1832, prior_loss=0.9708, diff_loss=0.3645, tot_loss=1.519, over 1432.00 samples.], 2024-10-21 21:37:41,771 INFO [train.py:682] (1/4) Start epoch 3264 2024-10-21 21:37:53,213 INFO [train.py:561] (1/4) Epoch 3264, batch 2, global_batch_idx: 52210, batch size: 203, loss[dur_loss=0.1859, prior_loss=0.9713, diff_loss=0.3437, tot_loss=1.501, over 203.00 samples.], tot_loss[dur_loss=0.1862, prior_loss=0.9713, diff_loss=0.3074, tot_loss=1.465, over 442.00 samples.], 2024-10-21 21:38:07,553 INFO [train.py:561] (1/4) Epoch 3264, batch 12, global_batch_idx: 52220, batch size: 152, loss[dur_loss=0.1855, prior_loss=0.9714, diff_loss=0.2781, tot_loss=1.435, over 152.00 samples.], tot_loss[dur_loss=0.1842, prior_loss=0.971, diff_loss=0.3469, tot_loss=1.502, over 1966.00 samples.], 2024-10-21 21:38:12,027 INFO [train.py:682] (1/4) Start epoch 3265 2024-10-21 21:38:29,354 INFO [train.py:561] (1/4) Epoch 3265, batch 6, global_batch_idx: 52230, batch size: 106, loss[dur_loss=0.1862, prior_loss=0.9712, diff_loss=0.2868, tot_loss=1.444, over 106.00 samples.], tot_loss[dur_loss=0.1838, prior_loss=0.9707, diff_loss=0.3893, tot_loss=1.544, over 1142.00 samples.], 2024-10-21 21:38:42,419 INFO [train.py:682] (1/4) Start epoch 3266 2024-10-21 21:38:51,718 INFO [train.py:561] (1/4) Epoch 3266, batch 0, global_batch_idx: 52240, batch size: 108, loss[dur_loss=0.1895, prior_loss=0.9716, diff_loss=0.3276, tot_loss=1.489, over 108.00 samples.], tot_loss[dur_loss=0.1895, prior_loss=0.9716, diff_loss=0.3276, tot_loss=1.489, over 108.00 samples.], 2024-10-21 21:39:06,038 INFO [train.py:561] (1/4) Epoch 3266, batch 10, global_batch_idx: 52250, batch size: 111, loss[dur_loss=0.1877, prior_loss=0.9722, diff_loss=0.2826, tot_loss=1.442, over 111.00 samples.], tot_loss[dur_loss=0.1848, prior_loss=0.9709, diff_loss=0.3674, tot_loss=1.523, over 1656.00 samples.], 2024-10-21 21:39:13,209 INFO [train.py:682] (1/4) Start epoch 3267 2024-10-21 21:39:27,453 INFO [train.py:561] (1/4) Epoch 3267, batch 4, global_batch_idx: 52260, batch size: 189, loss[dur_loss=0.1872, prior_loss=0.9713, 
diff_loss=0.2872, tot_loss=1.446, over 189.00 samples.], tot_loss[dur_loss=0.1827, prior_loss=0.9704, diff_loss=0.4015, tot_loss=1.555, over 937.00 samples.], 2024-10-21 21:39:42,427 INFO [train.py:561] (1/4) Epoch 3267, batch 14, global_batch_idx: 52270, batch size: 142, loss[dur_loss=0.1877, prior_loss=0.9712, diff_loss=0.2832, tot_loss=1.442, over 142.00 samples.], tot_loss[dur_loss=0.1847, prior_loss=0.971, diff_loss=0.338, tot_loss=1.494, over 2210.00 samples.], 2024-10-21 21:39:43,874 INFO [train.py:682] (1/4) Start epoch 3268 2024-10-21 21:40:04,074 INFO [train.py:561] (1/4) Epoch 3268, batch 8, global_batch_idx: 52280, batch size: 170, loss[dur_loss=0.1886, prior_loss=0.9711, diff_loss=0.3151, tot_loss=1.475, over 170.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9707, diff_loss=0.3744, tot_loss=1.528, over 1432.00 samples.], 2024-10-21 21:40:14,338 INFO [train.py:682] (1/4) Start epoch 3269 2024-10-21 21:40:25,899 INFO [train.py:561] (1/4) Epoch 3269, batch 2, global_batch_idx: 52290, batch size: 203, loss[dur_loss=0.186, prior_loss=0.9711, diff_loss=0.3045, tot_loss=1.462, over 203.00 samples.], tot_loss[dur_loss=0.1856, prior_loss=0.9712, diff_loss=0.3008, tot_loss=1.458, over 442.00 samples.], 2024-10-21 21:40:40,356 INFO [train.py:561] (1/4) Epoch 3269, batch 12, global_batch_idx: 52300, batch size: 152, loss[dur_loss=0.1828, prior_loss=0.9712, diff_loss=0.3058, tot_loss=1.46, over 152.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9709, diff_loss=0.3564, tot_loss=1.511, over 1966.00 samples.], 2024-10-21 21:40:44,853 INFO [train.py:682] (1/4) Start epoch 3270 2024-10-21 21:41:02,073 INFO [train.py:561] (1/4) Epoch 3270, batch 6, global_batch_idx: 52310, batch size: 106, loss[dur_loss=0.1838, prior_loss=0.971, diff_loss=0.3063, tot_loss=1.461, over 106.00 samples.], tot_loss[dur_loss=0.1822, prior_loss=0.9706, diff_loss=0.3844, tot_loss=1.537, over 1142.00 samples.], 2024-10-21 21:41:15,253 INFO [train.py:682] (1/4) Start epoch 3271 2024-10-21 21:41:24,258 INFO [train.py:561] (1/4) Epoch 3271, batch 0, global_batch_idx: 52320, batch size: 108, loss[dur_loss=0.1871, prior_loss=0.9716, diff_loss=0.2413, tot_loss=1.4, over 108.00 samples.], tot_loss[dur_loss=0.1871, prior_loss=0.9716, diff_loss=0.2413, tot_loss=1.4, over 108.00 samples.], 2024-10-21 21:41:38,718 INFO [train.py:561] (1/4) Epoch 3271, batch 10, global_batch_idx: 52330, batch size: 111, loss[dur_loss=0.1869, prior_loss=0.972, diff_loss=0.2697, tot_loss=1.429, over 111.00 samples.], tot_loss[dur_loss=0.1834, prior_loss=0.9709, diff_loss=0.3494, tot_loss=1.504, over 1656.00 samples.], 2024-10-21 21:41:45,961 INFO [train.py:682] (1/4) Start epoch 3272 2024-10-21 21:41:59,976 INFO [train.py:561] (1/4) Epoch 3272, batch 4, global_batch_idx: 52340, batch size: 189, loss[dur_loss=0.1872, prior_loss=0.9712, diff_loss=0.3033, tot_loss=1.462, over 189.00 samples.], tot_loss[dur_loss=0.1834, prior_loss=0.9704, diff_loss=0.4058, tot_loss=1.56, over 937.00 samples.], 2024-10-21 21:42:15,004 INFO [train.py:561] (1/4) Epoch 3272, batch 14, global_batch_idx: 52350, batch size: 142, loss[dur_loss=0.1893, prior_loss=0.9709, diff_loss=0.3058, tot_loss=1.466, over 142.00 samples.], tot_loss[dur_loss=0.185, prior_loss=0.9709, diff_loss=0.3396, tot_loss=1.495, over 2210.00 samples.], 2024-10-21 21:42:16,453 INFO [train.py:682] (1/4) Start epoch 3273 2024-10-21 21:42:36,906 INFO [train.py:561] (1/4) Epoch 3273, batch 8, global_batch_idx: 52360, batch size: 170, loss[dur_loss=0.188, prior_loss=0.971, diff_loss=0.3246, 
tot_loss=1.484, over 170.00 samples.], tot_loss[dur_loss=0.1842, prior_loss=0.9707, diff_loss=0.3807, tot_loss=1.536, over 1432.00 samples.], 2024-10-21 21:42:47,104 INFO [train.py:682] (1/4) Start epoch 3274 2024-10-21 21:42:58,908 INFO [train.py:561] (1/4) Epoch 3274, batch 2, global_batch_idx: 52370, batch size: 203, loss[dur_loss=0.183, prior_loss=0.9709, diff_loss=0.3356, tot_loss=1.49, over 203.00 samples.], tot_loss[dur_loss=0.1849, prior_loss=0.9711, diff_loss=0.3112, tot_loss=1.467, over 442.00 samples.], 2024-10-21 21:43:13,524 INFO [train.py:561] (1/4) Epoch 3274, batch 12, global_batch_idx: 52380, batch size: 152, loss[dur_loss=0.1816, prior_loss=0.9714, diff_loss=0.2947, tot_loss=1.448, over 152.00 samples.], tot_loss[dur_loss=0.183, prior_loss=0.9708, diff_loss=0.3531, tot_loss=1.507, over 1966.00 samples.], 2024-10-21 21:43:18,028 INFO [train.py:682] (1/4) Start epoch 3275 2024-10-21 21:43:35,349 INFO [train.py:561] (1/4) Epoch 3275, batch 6, global_batch_idx: 52390, batch size: 106, loss[dur_loss=0.1868, prior_loss=0.971, diff_loss=0.2622, tot_loss=1.42, over 106.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.9706, diff_loss=0.3869, tot_loss=1.542, over 1142.00 samples.], 2024-10-21 21:43:48,606 INFO [train.py:682] (1/4) Start epoch 3276 2024-10-21 21:43:57,362 INFO [train.py:561] (1/4) Epoch 3276, batch 0, global_batch_idx: 52400, batch size: 108, loss[dur_loss=0.1894, prior_loss=0.9717, diff_loss=0.2785, tot_loss=1.44, over 108.00 samples.], tot_loss[dur_loss=0.1894, prior_loss=0.9717, diff_loss=0.2785, tot_loss=1.44, over 108.00 samples.], 2024-10-21 21:44:11,640 INFO [train.py:561] (1/4) Epoch 3276, batch 10, global_batch_idx: 52410, batch size: 111, loss[dur_loss=0.1877, prior_loss=0.9721, diff_loss=0.254, tot_loss=1.414, over 111.00 samples.], tot_loss[dur_loss=0.1846, prior_loss=0.9709, diff_loss=0.3534, tot_loss=1.509, over 1656.00 samples.], 2024-10-21 21:44:18,771 INFO [train.py:682] (1/4) Start epoch 3277 2024-10-21 21:44:33,118 INFO [train.py:561] (1/4) Epoch 3277, batch 4, global_batch_idx: 52420, batch size: 189, loss[dur_loss=0.1866, prior_loss=0.9713, diff_loss=0.3207, tot_loss=1.479, over 189.00 samples.], tot_loss[dur_loss=0.1828, prior_loss=0.9705, diff_loss=0.4136, tot_loss=1.567, over 937.00 samples.], 2024-10-21 21:44:48,127 INFO [train.py:561] (1/4) Epoch 3277, batch 14, global_batch_idx: 52430, batch size: 142, loss[dur_loss=0.1872, prior_loss=0.9709, diff_loss=0.2651, tot_loss=1.423, over 142.00 samples.], tot_loss[dur_loss=0.1849, prior_loss=0.971, diff_loss=0.3453, tot_loss=1.501, over 2210.00 samples.], 2024-10-21 21:44:49,550 INFO [train.py:682] (1/4) Start epoch 3278 2024-10-21 21:45:09,679 INFO [train.py:561] (1/4) Epoch 3278, batch 8, global_batch_idx: 52440, batch size: 170, loss[dur_loss=0.1875, prior_loss=0.9714, diff_loss=0.3177, tot_loss=1.477, over 170.00 samples.], tot_loss[dur_loss=0.1843, prior_loss=0.9708, diff_loss=0.3705, tot_loss=1.526, over 1432.00 samples.], 2024-10-21 21:45:19,916 INFO [train.py:682] (1/4) Start epoch 3279 2024-10-21 21:45:31,394 INFO [train.py:561] (1/4) Epoch 3279, batch 2, global_batch_idx: 52450, batch size: 203, loss[dur_loss=0.187, prior_loss=0.9711, diff_loss=0.3169, tot_loss=1.475, over 203.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9712, diff_loss=0.2898, tot_loss=1.447, over 442.00 samples.], 2024-10-21 21:45:45,668 INFO [train.py:561] (1/4) Epoch 3279, batch 12, global_batch_idx: 52460, batch size: 152, loss[dur_loss=0.184, prior_loss=0.9714, diff_loss=0.283, tot_loss=1.438, over 
152.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.9709, diff_loss=0.3457, tot_loss=1.501, over 1966.00 samples.], 2024-10-21 21:45:50,133 INFO [train.py:682] (1/4) Start epoch 3280 2024-10-21 21:46:07,541 INFO [train.py:561] (1/4) Epoch 3280, batch 6, global_batch_idx: 52470, batch size: 106, loss[dur_loss=0.1851, prior_loss=0.9711, diff_loss=0.2534, tot_loss=1.41, over 106.00 samples.], tot_loss[dur_loss=0.1822, prior_loss=0.9705, diff_loss=0.3882, tot_loss=1.541, over 1142.00 samples.], 2024-10-21 21:46:20,672 INFO [train.py:682] (1/4) Start epoch 3281 2024-10-21 21:46:29,749 INFO [train.py:561] (1/4) Epoch 3281, batch 0, global_batch_idx: 52480, batch size: 108, loss[dur_loss=0.1866, prior_loss=0.9716, diff_loss=0.3025, tot_loss=1.461, over 108.00 samples.], tot_loss[dur_loss=0.1866, prior_loss=0.9716, diff_loss=0.3025, tot_loss=1.461, over 108.00 samples.], 2024-10-21 21:46:44,190 INFO [train.py:561] (1/4) Epoch 3281, batch 10, global_batch_idx: 52490, batch size: 111, loss[dur_loss=0.1879, prior_loss=0.9722, diff_loss=0.2808, tot_loss=1.441, over 111.00 samples.], tot_loss[dur_loss=0.1835, prior_loss=0.9708, diff_loss=0.3639, tot_loss=1.518, over 1656.00 samples.], 2024-10-21 21:46:51,393 INFO [train.py:682] (1/4) Start epoch 3282 2024-10-21 21:47:05,439 INFO [train.py:561] (1/4) Epoch 3282, batch 4, global_batch_idx: 52500, batch size: 189, loss[dur_loss=0.1854, prior_loss=0.9711, diff_loss=0.3156, tot_loss=1.472, over 189.00 samples.], tot_loss[dur_loss=0.1812, prior_loss=0.9704, diff_loss=0.4177, tot_loss=1.569, over 937.00 samples.], 2024-10-21 21:47:07,106 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 21:47:34,018 INFO [train.py:589] (1/4) Epoch 3282, validation: dur_loss=0.4583, prior_loss=1.036, diff_loss=0.4367, tot_loss=1.931, over 100.00 samples. 
2024-10-21 21:47:34,019 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 21:47:47,506 INFO [train.py:561] (1/4) Epoch 3282, batch 14, global_batch_idx: 52510, batch size: 142, loss[dur_loss=0.1864, prior_loss=0.971, diff_loss=0.2779, tot_loss=1.435, over 142.00 samples.], tot_loss[dur_loss=0.1836, prior_loss=0.9709, diff_loss=0.3464, tot_loss=1.501, over 2210.00 samples.], 2024-10-21 21:47:48,960 INFO [train.py:682] (1/4) Start epoch 3283 2024-10-21 21:48:09,879 INFO [train.py:561] (1/4) Epoch 3283, batch 8, global_batch_idx: 52520, batch size: 170, loss[dur_loss=0.1855, prior_loss=0.9712, diff_loss=0.2756, tot_loss=1.432, over 170.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9707, diff_loss=0.3782, tot_loss=1.532, over 1432.00 samples.], 2024-10-21 21:48:20,104 INFO [train.py:682] (1/4) Start epoch 3284 2024-10-21 21:48:32,178 INFO [train.py:561] (1/4) Epoch 3284, batch 2, global_batch_idx: 52530, batch size: 203, loss[dur_loss=0.1879, prior_loss=0.9712, diff_loss=0.3174, tot_loss=1.476, over 203.00 samples.], tot_loss[dur_loss=0.1874, prior_loss=0.9712, diff_loss=0.3153, tot_loss=1.474, over 442.00 samples.], 2024-10-21 21:48:46,469 INFO [train.py:561] (1/4) Epoch 3284, batch 12, global_batch_idx: 52540, batch size: 152, loss[dur_loss=0.1822, prior_loss=0.9713, diff_loss=0.2644, tot_loss=1.418, over 152.00 samples.], tot_loss[dur_loss=0.1837, prior_loss=0.9709, diff_loss=0.3512, tot_loss=1.506, over 1966.00 samples.], 2024-10-21 21:48:50,966 INFO [train.py:682] (1/4) Start epoch 3285 2024-10-21 21:49:08,686 INFO [train.py:561] (1/4) Epoch 3285, batch 6, global_batch_idx: 52550, batch size: 106, loss[dur_loss=0.1875, prior_loss=0.971, diff_loss=0.2742, tot_loss=1.433, over 106.00 samples.], tot_loss[dur_loss=0.183, prior_loss=0.9706, diff_loss=0.3885, tot_loss=1.542, over 1142.00 samples.], 2024-10-21 21:49:21,838 INFO [train.py:682] (1/4) Start epoch 3286 2024-10-21 21:49:31,338 INFO [train.py:561] (1/4) Epoch 3286, batch 0, global_batch_idx: 52560, batch size: 108, loss[dur_loss=0.1876, prior_loss=0.9713, diff_loss=0.2897, tot_loss=1.449, over 108.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9713, diff_loss=0.2897, tot_loss=1.449, over 108.00 samples.], 2024-10-21 21:49:45,740 INFO [train.py:561] (1/4) Epoch 3286, batch 10, global_batch_idx: 52570, batch size: 111, loss[dur_loss=0.1861, prior_loss=0.9721, diff_loss=0.3269, tot_loss=1.485, over 111.00 samples.], tot_loss[dur_loss=0.1836, prior_loss=0.9708, diff_loss=0.3571, tot_loss=1.511, over 1656.00 samples.], 2024-10-21 21:49:52,957 INFO [train.py:682] (1/4) Start epoch 3287 2024-10-21 21:50:07,231 INFO [train.py:561] (1/4) Epoch 3287, batch 4, global_batch_idx: 52580, batch size: 189, loss[dur_loss=0.1833, prior_loss=0.9711, diff_loss=0.312, tot_loss=1.466, over 189.00 samples.], tot_loss[dur_loss=0.1837, prior_loss=0.9704, diff_loss=0.4087, tot_loss=1.563, over 937.00 samples.], 2024-10-21 21:50:22,180 INFO [train.py:561] (1/4) Epoch 3287, batch 14, global_batch_idx: 52590, batch size: 142, loss[dur_loss=0.185, prior_loss=0.9709, diff_loss=0.2864, tot_loss=1.442, over 142.00 samples.], tot_loss[dur_loss=0.185, prior_loss=0.9709, diff_loss=0.3417, tot_loss=1.498, over 2210.00 samples.], 2024-10-21 21:50:23,611 INFO [train.py:682] (1/4) Start epoch 3288 2024-10-21 21:50:44,808 INFO [train.py:561] (1/4) Epoch 3288, batch 8, global_batch_idx: 52600, batch size: 170, loss[dur_loss=0.1868, prior_loss=0.9712, diff_loss=0.3331, tot_loss=1.491, over 170.00 samples.], tot_loss[dur_loss=0.1828, 
prior_loss=0.9707, diff_loss=0.373, tot_loss=1.527, over 1432.00 samples.], 2024-10-21 21:50:55,040 INFO [train.py:682] (1/4) Start epoch 3289 2024-10-21 21:51:06,833 INFO [train.py:561] (1/4) Epoch 3289, batch 2, global_batch_idx: 52610, batch size: 203, loss[dur_loss=0.1856, prior_loss=0.9712, diff_loss=0.3142, tot_loss=1.471, over 203.00 samples.], tot_loss[dur_loss=0.1858, prior_loss=0.9712, diff_loss=0.3142, tot_loss=1.471, over 442.00 samples.], 2024-10-21 21:51:21,171 INFO [train.py:561] (1/4) Epoch 3289, batch 12, global_batch_idx: 52620, batch size: 152, loss[dur_loss=0.1829, prior_loss=0.9714, diff_loss=0.2845, tot_loss=1.439, over 152.00 samples.], tot_loss[dur_loss=0.1831, prior_loss=0.9708, diff_loss=0.3538, tot_loss=1.508, over 1966.00 samples.], 2024-10-21 21:51:25,603 INFO [train.py:682] (1/4) Start epoch 3290 2024-10-21 21:51:43,068 INFO [train.py:561] (1/4) Epoch 3290, batch 6, global_batch_idx: 52630, batch size: 106, loss[dur_loss=0.1842, prior_loss=0.9709, diff_loss=0.302, tot_loss=1.457, over 106.00 samples.], tot_loss[dur_loss=0.1819, prior_loss=0.9705, diff_loss=0.3808, tot_loss=1.533, over 1142.00 samples.], 2024-10-21 21:51:56,086 INFO [train.py:682] (1/4) Start epoch 3291 2024-10-21 21:52:05,327 INFO [train.py:561] (1/4) Epoch 3291, batch 0, global_batch_idx: 52640, batch size: 108, loss[dur_loss=0.1881, prior_loss=0.9716, diff_loss=0.2868, tot_loss=1.446, over 108.00 samples.], tot_loss[dur_loss=0.1881, prior_loss=0.9716, diff_loss=0.2868, tot_loss=1.446, over 108.00 samples.], 2024-10-21 21:52:19,511 INFO [train.py:561] (1/4) Epoch 3291, batch 10, global_batch_idx: 52650, batch size: 111, loss[dur_loss=0.1879, prior_loss=0.972, diff_loss=0.3094, tot_loss=1.469, over 111.00 samples.], tot_loss[dur_loss=0.1834, prior_loss=0.9708, diff_loss=0.364, tot_loss=1.518, over 1656.00 samples.], 2024-10-21 21:52:26,590 INFO [train.py:682] (1/4) Start epoch 3292 2024-10-21 21:52:40,783 INFO [train.py:561] (1/4) Epoch 3292, batch 4, global_batch_idx: 52660, batch size: 189, loss[dur_loss=0.1857, prior_loss=0.9713, diff_loss=0.3271, tot_loss=1.484, over 189.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9704, diff_loss=0.4094, tot_loss=1.561, over 937.00 samples.], 2024-10-21 21:52:55,578 INFO [train.py:561] (1/4) Epoch 3292, batch 14, global_batch_idx: 52670, batch size: 142, loss[dur_loss=0.1895, prior_loss=0.9711, diff_loss=0.2522, tot_loss=1.413, over 142.00 samples.], tot_loss[dur_loss=0.1847, prior_loss=0.9709, diff_loss=0.3409, tot_loss=1.497, over 2210.00 samples.], 2024-10-21 21:52:56,999 INFO [train.py:682] (1/4) Start epoch 3293 2024-10-21 21:53:17,372 INFO [train.py:561] (1/4) Epoch 3293, batch 8, global_batch_idx: 52680, batch size: 170, loss[dur_loss=0.1867, prior_loss=0.9714, diff_loss=0.296, tot_loss=1.454, over 170.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.9707, diff_loss=0.3642, tot_loss=1.519, over 1432.00 samples.], 2024-10-21 21:53:27,546 INFO [train.py:682] (1/4) Start epoch 3294 2024-10-21 21:53:39,340 INFO [train.py:561] (1/4) Epoch 3294, batch 2, global_batch_idx: 52690, batch size: 203, loss[dur_loss=0.1853, prior_loss=0.971, diff_loss=0.3254, tot_loss=1.482, over 203.00 samples.], tot_loss[dur_loss=0.1849, prior_loss=0.9711, diff_loss=0.313, tot_loss=1.469, over 442.00 samples.], 2024-10-21 21:53:53,550 INFO [train.py:561] (1/4) Epoch 3294, batch 12, global_batch_idx: 52700, batch size: 152, loss[dur_loss=0.1842, prior_loss=0.9714, diff_loss=0.3319, tot_loss=1.488, over 152.00 samples.], tot_loss[dur_loss=0.1839, prior_loss=0.9709, 
diff_loss=0.3512, tot_loss=1.506, over 1966.00 samples.], 2024-10-21 21:53:57,989 INFO [train.py:682] (1/4) Start epoch 3295 2024-10-21 21:54:15,196 INFO [train.py:561] (1/4) Epoch 3295, batch 6, global_batch_idx: 52710, batch size: 106, loss[dur_loss=0.184, prior_loss=0.9711, diff_loss=0.2724, tot_loss=1.427, over 106.00 samples.], tot_loss[dur_loss=0.1815, prior_loss=0.9705, diff_loss=0.4002, tot_loss=1.552, over 1142.00 samples.], 2024-10-21 21:54:28,298 INFO [train.py:682] (1/4) Start epoch 3296 2024-10-21 21:54:37,111 INFO [train.py:561] (1/4) Epoch 3296, batch 0, global_batch_idx: 52720, batch size: 108, loss[dur_loss=0.19, prior_loss=0.9715, diff_loss=0.32, tot_loss=1.481, over 108.00 samples.], tot_loss[dur_loss=0.19, prior_loss=0.9715, diff_loss=0.32, tot_loss=1.481, over 108.00 samples.], 2024-10-21 21:54:51,499 INFO [train.py:561] (1/4) Epoch 3296, batch 10, global_batch_idx: 52730, batch size: 111, loss[dur_loss=0.1848, prior_loss=0.9721, diff_loss=0.2805, tot_loss=1.437, over 111.00 samples.], tot_loss[dur_loss=0.1838, prior_loss=0.9709, diff_loss=0.3569, tot_loss=1.512, over 1656.00 samples.], 2024-10-21 21:54:58,649 INFO [train.py:682] (1/4) Start epoch 3297 2024-10-21 21:55:12,784 INFO [train.py:561] (1/4) Epoch 3297, batch 4, global_batch_idx: 52740, batch size: 189, loss[dur_loss=0.1839, prior_loss=0.9711, diff_loss=0.3658, tot_loss=1.521, over 189.00 samples.], tot_loss[dur_loss=0.1812, prior_loss=0.9704, diff_loss=0.4121, tot_loss=1.564, over 937.00 samples.], 2024-10-21 21:55:27,814 INFO [train.py:561] (1/4) Epoch 3297, batch 14, global_batch_idx: 52750, batch size: 142, loss[dur_loss=0.1848, prior_loss=0.9709, diff_loss=0.2942, tot_loss=1.45, over 142.00 samples.], tot_loss[dur_loss=0.1834, prior_loss=0.9709, diff_loss=0.3447, tot_loss=1.499, over 2210.00 samples.], 2024-10-21 21:55:29,255 INFO [train.py:682] (1/4) Start epoch 3298 2024-10-21 21:55:49,526 INFO [train.py:561] (1/4) Epoch 3298, batch 8, global_batch_idx: 52760, batch size: 170, loss[dur_loss=0.1901, prior_loss=0.9715, diff_loss=0.3138, tot_loss=1.475, over 170.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9707, diff_loss=0.3668, tot_loss=1.521, over 1432.00 samples.], 2024-10-21 21:55:59,720 INFO [train.py:682] (1/4) Start epoch 3299 2024-10-21 21:56:11,487 INFO [train.py:561] (1/4) Epoch 3299, batch 2, global_batch_idx: 52770, batch size: 203, loss[dur_loss=0.1849, prior_loss=0.9711, diff_loss=0.3372, tot_loss=1.493, over 203.00 samples.], tot_loss[dur_loss=0.1852, prior_loss=0.9712, diff_loss=0.3206, tot_loss=1.477, over 442.00 samples.], 2024-10-21 21:56:25,798 INFO [train.py:561] (1/4) Epoch 3299, batch 12, global_batch_idx: 52780, batch size: 152, loss[dur_loss=0.1847, prior_loss=0.9713, diff_loss=0.2896, tot_loss=1.446, over 152.00 samples.], tot_loss[dur_loss=0.184, prior_loss=0.9709, diff_loss=0.355, tot_loss=1.51, over 1966.00 samples.], 2024-10-21 21:56:30,300 INFO [train.py:682] (1/4) Start epoch 3300 2024-10-21 21:56:47,401 INFO [train.py:561] (1/4) Epoch 3300, batch 6, global_batch_idx: 52790, batch size: 106, loss[dur_loss=0.1841, prior_loss=0.9712, diff_loss=0.2563, tot_loss=1.412, over 106.00 samples.], tot_loss[dur_loss=0.1813, prior_loss=0.9705, diff_loss=0.3742, tot_loss=1.526, over 1142.00 samples.], 2024-10-21 21:57:00,512 INFO [train.py:682] (1/4) Start epoch 3301 2024-10-21 21:57:09,621 INFO [train.py:561] (1/4) Epoch 3301, batch 0, global_batch_idx: 52800, batch size: 108, loss[dur_loss=0.1846, prior_loss=0.9714, diff_loss=0.2928, tot_loss=1.449, over 108.00 samples.], 
tot_loss[dur_loss=0.1846, prior_loss=0.9714, diff_loss=0.2928, tot_loss=1.449, over 108.00 samples.], 2024-10-21 21:57:23,796 INFO [train.py:561] (1/4) Epoch 3301, batch 10, global_batch_idx: 52810, batch size: 111, loss[dur_loss=0.1873, prior_loss=0.9719, diff_loss=0.2924, tot_loss=1.452, over 111.00 samples.], tot_loss[dur_loss=0.184, prior_loss=0.9709, diff_loss=0.3639, tot_loss=1.519, over 1656.00 samples.], 2024-10-21 21:57:30,953 INFO [train.py:682] (1/4) Start epoch 3302 2024-10-21 21:57:44,995 INFO [train.py:561] (1/4) Epoch 3302, batch 4, global_batch_idx: 52820, batch size: 189, loss[dur_loss=0.1837, prior_loss=0.971, diff_loss=0.3301, tot_loss=1.485, over 189.00 samples.], tot_loss[dur_loss=0.1811, prior_loss=0.9703, diff_loss=0.408, tot_loss=1.559, over 937.00 samples.], 2024-10-21 21:57:59,854 INFO [train.py:561] (1/4) Epoch 3302, batch 14, global_batch_idx: 52830, batch size: 142, loss[dur_loss=0.1862, prior_loss=0.9709, diff_loss=0.2955, tot_loss=1.453, over 142.00 samples.], tot_loss[dur_loss=0.1836, prior_loss=0.9708, diff_loss=0.3474, tot_loss=1.502, over 2210.00 samples.], 2024-10-21 21:58:01,277 INFO [train.py:682] (1/4) Start epoch 3303 2024-10-21 21:58:21,419 INFO [train.py:561] (1/4) Epoch 3303, batch 8, global_batch_idx: 52840, batch size: 170, loss[dur_loss=0.1872, prior_loss=0.9713, diff_loss=0.2965, tot_loss=1.455, over 170.00 samples.], tot_loss[dur_loss=0.1829, prior_loss=0.9706, diff_loss=0.3747, tot_loss=1.528, over 1432.00 samples.], 2024-10-21 21:58:31,558 INFO [train.py:682] (1/4) Start epoch 3304 2024-10-21 21:58:42,877 INFO [train.py:561] (1/4) Epoch 3304, batch 2, global_batch_idx: 52850, batch size: 203, loss[dur_loss=0.1867, prior_loss=0.971, diff_loss=0.3072, tot_loss=1.465, over 203.00 samples.], tot_loss[dur_loss=0.1848, prior_loss=0.9711, diff_loss=0.2923, tot_loss=1.448, over 442.00 samples.], 2024-10-21 21:58:57,259 INFO [train.py:561] (1/4) Epoch 3304, batch 12, global_batch_idx: 52860, batch size: 152, loss[dur_loss=0.1831, prior_loss=0.9712, diff_loss=0.3066, tot_loss=1.461, over 152.00 samples.], tot_loss[dur_loss=0.1835, prior_loss=0.9708, diff_loss=0.3429, tot_loss=1.497, over 1966.00 samples.], 2024-10-21 21:59:01,721 INFO [train.py:682] (1/4) Start epoch 3305 2024-10-21 21:59:18,744 INFO [train.py:561] (1/4) Epoch 3305, batch 6, global_batch_idx: 52870, batch size: 106, loss[dur_loss=0.185, prior_loss=0.9711, diff_loss=0.2522, tot_loss=1.408, over 106.00 samples.], tot_loss[dur_loss=0.1812, prior_loss=0.9704, diff_loss=0.3817, tot_loss=1.533, over 1142.00 samples.], 2024-10-21 21:59:31,844 INFO [train.py:682] (1/4) Start epoch 3306 2024-10-21 21:59:41,017 INFO [train.py:561] (1/4) Epoch 3306, batch 0, global_batch_idx: 52880, batch size: 108, loss[dur_loss=0.1875, prior_loss=0.9716, diff_loss=0.3229, tot_loss=1.482, over 108.00 samples.], tot_loss[dur_loss=0.1875, prior_loss=0.9716, diff_loss=0.3229, tot_loss=1.482, over 108.00 samples.], 2024-10-21 21:59:55,290 INFO [train.py:561] (1/4) Epoch 3306, batch 10, global_batch_idx: 52890, batch size: 111, loss[dur_loss=0.186, prior_loss=0.972, diff_loss=0.273, tot_loss=1.431, over 111.00 samples.], tot_loss[dur_loss=0.1829, prior_loss=0.9707, diff_loss=0.3658, tot_loss=1.519, over 1656.00 samples.], 2024-10-21 22:00:02,387 INFO [train.py:682] (1/4) Start epoch 3307 2024-10-21 22:00:16,269 INFO [train.py:561] (1/4) Epoch 3307, batch 4, global_batch_idx: 52900, batch size: 189, loss[dur_loss=0.1835, prior_loss=0.971, diff_loss=0.3211, tot_loss=1.476, over 189.00 samples.], 
tot_loss[dur_loss=0.1813, prior_loss=0.9703, diff_loss=0.4108, tot_loss=1.562, over 937.00 samples.], 2024-10-21 22:00:31,165 INFO [train.py:561] (1/4) Epoch 3307, batch 14, global_batch_idx: 52910, batch size: 142, loss[dur_loss=0.1884, prior_loss=0.971, diff_loss=0.3056, tot_loss=1.465, over 142.00 samples.], tot_loss[dur_loss=0.1846, prior_loss=0.9709, diff_loss=0.3407, tot_loss=1.496, over 2210.00 samples.], 2024-10-21 22:00:32,594 INFO [train.py:682] (1/4) Start epoch 3308 2024-10-21 22:00:53,042 INFO [train.py:561] (1/4) Epoch 3308, batch 8, global_batch_idx: 52920, batch size: 170, loss[dur_loss=0.1857, prior_loss=0.9712, diff_loss=0.291, tot_loss=1.448, over 170.00 samples.], tot_loss[dur_loss=0.1827, prior_loss=0.9706, diff_loss=0.3665, tot_loss=1.52, over 1432.00 samples.], 2024-10-21 22:01:03,100 INFO [train.py:682] (1/4) Start epoch 3309 2024-10-21 22:01:14,586 INFO [train.py:561] (1/4) Epoch 3309, batch 2, global_batch_idx: 52930, batch size: 203, loss[dur_loss=0.1837, prior_loss=0.971, diff_loss=0.3104, tot_loss=1.465, over 203.00 samples.], tot_loss[dur_loss=0.1835, prior_loss=0.9711, diff_loss=0.3026, tot_loss=1.457, over 442.00 samples.], 2024-10-21 22:01:28,773 INFO [train.py:561] (1/4) Epoch 3309, batch 12, global_batch_idx: 52940, batch size: 152, loss[dur_loss=0.1835, prior_loss=0.9711, diff_loss=0.2961, tot_loss=1.451, over 152.00 samples.], tot_loss[dur_loss=0.1834, prior_loss=0.9708, diff_loss=0.3492, tot_loss=1.503, over 1966.00 samples.], 2024-10-21 22:01:33,214 INFO [train.py:682] (1/4) Start epoch 3310 2024-10-21 22:01:50,608 INFO [train.py:561] (1/4) Epoch 3310, batch 6, global_batch_idx: 52950, batch size: 106, loss[dur_loss=0.184, prior_loss=0.9709, diff_loss=0.2771, tot_loss=1.432, over 106.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9705, diff_loss=0.3757, tot_loss=1.528, over 1142.00 samples.], 2024-10-21 22:02:03,711 INFO [train.py:682] (1/4) Start epoch 3311 2024-10-21 22:02:12,683 INFO [train.py:561] (1/4) Epoch 3311, batch 0, global_batch_idx: 52960, batch size: 108, loss[dur_loss=0.188, prior_loss=0.9716, diff_loss=0.2653, tot_loss=1.425, over 108.00 samples.], tot_loss[dur_loss=0.188, prior_loss=0.9716, diff_loss=0.2653, tot_loss=1.425, over 108.00 samples.], 2024-10-21 22:02:26,947 INFO [train.py:561] (1/4) Epoch 3311, batch 10, global_batch_idx: 52970, batch size: 111, loss[dur_loss=0.1872, prior_loss=0.9718, diff_loss=0.2699, tot_loss=1.429, over 111.00 samples.], tot_loss[dur_loss=0.1826, prior_loss=0.9708, diff_loss=0.3452, tot_loss=1.499, over 1656.00 samples.], 2024-10-21 22:02:34,019 INFO [train.py:682] (1/4) Start epoch 3312 2024-10-21 22:02:47,855 INFO [train.py:561] (1/4) Epoch 3312, batch 4, global_batch_idx: 52980, batch size: 189, loss[dur_loss=0.1853, prior_loss=0.9713, diff_loss=0.2803, tot_loss=1.437, over 189.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9704, diff_loss=0.4073, tot_loss=1.559, over 937.00 samples.], 2024-10-21 22:03:02,740 INFO [train.py:561] (1/4) Epoch 3312, batch 14, global_batch_idx: 52990, batch size: 142, loss[dur_loss=0.1848, prior_loss=0.9708, diff_loss=0.3075, tot_loss=1.463, over 142.00 samples.], tot_loss[dur_loss=0.1838, prior_loss=0.9709, diff_loss=0.3503, tot_loss=1.505, over 2210.00 samples.], 2024-10-21 22:03:04,163 INFO [train.py:682] (1/4) Start epoch 3313 2024-10-21 22:03:24,088 INFO [train.py:561] (1/4) Epoch 3313, batch 8, global_batch_idx: 53000, batch size: 170, loss[dur_loss=0.1885, prior_loss=0.9713, diff_loss=0.3517, tot_loss=1.512, over 170.00 samples.], 
tot_loss[dur_loss=0.1828, prior_loss=0.9707, diff_loss=0.3785, tot_loss=1.532, over 1432.00 samples.], 2024-10-21 22:03:34,152 INFO [train.py:682] (1/4) Start epoch 3314 2024-10-21 22:03:45,983 INFO [train.py:561] (1/4) Epoch 3314, batch 2, global_batch_idx: 53010, batch size: 203, loss[dur_loss=0.1835, prior_loss=0.9711, diff_loss=0.3248, tot_loss=1.479, over 203.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.9711, diff_loss=0.3017, tot_loss=1.457, over 442.00 samples.], 2024-10-21 22:04:00,216 INFO [train.py:561] (1/4) Epoch 3314, batch 12, global_batch_idx: 53020, batch size: 152, loss[dur_loss=0.1849, prior_loss=0.9715, diff_loss=0.3096, tot_loss=1.466, over 152.00 samples.], tot_loss[dur_loss=0.1834, prior_loss=0.9709, diff_loss=0.3479, tot_loss=1.502, over 1966.00 samples.], 2024-10-21 22:04:04,654 INFO [train.py:682] (1/4) Start epoch 3315 2024-10-21 22:04:21,625 INFO [train.py:561] (1/4) Epoch 3315, batch 6, global_batch_idx: 53030, batch size: 106, loss[dur_loss=0.186, prior_loss=0.9713, diff_loss=0.2848, tot_loss=1.442, over 106.00 samples.], tot_loss[dur_loss=0.1824, prior_loss=0.9706, diff_loss=0.3812, tot_loss=1.534, over 1142.00 samples.], 2024-10-21 22:04:34,649 INFO [train.py:682] (1/4) Start epoch 3316 2024-10-21 22:04:43,386 INFO [train.py:561] (1/4) Epoch 3316, batch 0, global_batch_idx: 53040, batch size: 108, loss[dur_loss=0.1866, prior_loss=0.9715, diff_loss=0.2893, tot_loss=1.448, over 108.00 samples.], tot_loss[dur_loss=0.1866, prior_loss=0.9715, diff_loss=0.2893, tot_loss=1.448, over 108.00 samples.], 2024-10-21 22:04:57,634 INFO [train.py:561] (1/4) Epoch 3316, batch 10, global_batch_idx: 53050, batch size: 111, loss[dur_loss=0.1874, prior_loss=0.9721, diff_loss=0.2873, tot_loss=1.447, over 111.00 samples.], tot_loss[dur_loss=0.1821, prior_loss=0.9708, diff_loss=0.3627, tot_loss=1.516, over 1656.00 samples.], 2024-10-21 22:05:04,712 INFO [train.py:682] (1/4) Start epoch 3317 2024-10-21 22:05:18,335 INFO [train.py:561] (1/4) Epoch 3317, batch 4, global_batch_idx: 53060, batch size: 189, loss[dur_loss=0.1848, prior_loss=0.9711, diff_loss=0.305, tot_loss=1.461, over 189.00 samples.], tot_loss[dur_loss=0.1815, prior_loss=0.9703, diff_loss=0.3988, tot_loss=1.551, over 937.00 samples.], 2024-10-21 22:05:33,171 INFO [train.py:561] (1/4) Epoch 3317, batch 14, global_batch_idx: 53070, batch size: 142, loss[dur_loss=0.1845, prior_loss=0.9707, diff_loss=0.2379, tot_loss=1.393, over 142.00 samples.], tot_loss[dur_loss=0.1834, prior_loss=0.9708, diff_loss=0.3408, tot_loss=1.495, over 2210.00 samples.], 2024-10-21 22:05:34,595 INFO [train.py:682] (1/4) Start epoch 3318 2024-10-21 22:05:54,625 INFO [train.py:561] (1/4) Epoch 3318, batch 8, global_batch_idx: 53080, batch size: 170, loss[dur_loss=0.1912, prior_loss=0.9715, diff_loss=0.2989, tot_loss=1.462, over 170.00 samples.], tot_loss[dur_loss=0.1829, prior_loss=0.9706, diff_loss=0.3641, tot_loss=1.518, over 1432.00 samples.], 2024-10-21 22:06:04,725 INFO [train.py:682] (1/4) Start epoch 3319 2024-10-21 22:06:16,388 INFO [train.py:561] (1/4) Epoch 3319, batch 2, global_batch_idx: 53090, batch size: 203, loss[dur_loss=0.1859, prior_loss=0.971, diff_loss=0.3278, tot_loss=1.485, over 203.00 samples.], tot_loss[dur_loss=0.1855, prior_loss=0.9711, diff_loss=0.3051, tot_loss=1.462, over 442.00 samples.], 2024-10-21 22:06:30,708 INFO [train.py:561] (1/4) Epoch 3319, batch 12, global_batch_idx: 53100, batch size: 152, loss[dur_loss=0.1865, prior_loss=0.9713, diff_loss=0.3101, tot_loss=1.468, over 152.00 samples.], 
tot_loss[dur_loss=0.1843, prior_loss=0.971, diff_loss=0.3557, tot_loss=1.511, over 1966.00 samples.], 2024-10-21 22:06:35,184 INFO [train.py:682] (1/4) Start epoch 3320 2024-10-21 22:06:52,397 INFO [train.py:561] (1/4) Epoch 3320, batch 6, global_batch_idx: 53110, batch size: 106, loss[dur_loss=0.1806, prior_loss=0.9711, diff_loss=0.2669, tot_loss=1.419, over 106.00 samples.], tot_loss[dur_loss=0.1809, prior_loss=0.9705, diff_loss=0.3961, tot_loss=1.548, over 1142.00 samples.], 2024-10-21 22:07:05,536 INFO [train.py:682] (1/4) Start epoch 3321 2024-10-21 22:07:14,469 INFO [train.py:561] (1/4) Epoch 3321, batch 0, global_batch_idx: 53120, batch size: 108, loss[dur_loss=0.1845, prior_loss=0.9715, diff_loss=0.2949, tot_loss=1.451, over 108.00 samples.], tot_loss[dur_loss=0.1845, prior_loss=0.9715, diff_loss=0.2949, tot_loss=1.451, over 108.00 samples.], 2024-10-21 22:07:28,777 INFO [train.py:561] (1/4) Epoch 3321, batch 10, global_batch_idx: 53130, batch size: 111, loss[dur_loss=0.1873, prior_loss=0.9723, diff_loss=0.2938, tot_loss=1.453, over 111.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9708, diff_loss=0.3608, tot_loss=1.514, over 1656.00 samples.], 2024-10-21 22:07:35,905 INFO [train.py:682] (1/4) Start epoch 3322 2024-10-21 22:07:50,318 INFO [train.py:561] (1/4) Epoch 3322, batch 4, global_batch_idx: 53140, batch size: 189, loss[dur_loss=0.1822, prior_loss=0.9709, diff_loss=0.2927, tot_loss=1.446, over 189.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9703, diff_loss=0.4049, tot_loss=1.557, over 937.00 samples.], 2024-10-21 22:08:05,351 INFO [train.py:561] (1/4) Epoch 3322, batch 14, global_batch_idx: 53150, batch size: 142, loss[dur_loss=0.1852, prior_loss=0.971, diff_loss=0.2866, tot_loss=1.443, over 142.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9708, diff_loss=0.3451, tot_loss=1.499, over 2210.00 samples.], 2024-10-21 22:08:06,784 INFO [train.py:682] (1/4) Start epoch 3323 2024-10-21 22:08:26,980 INFO [train.py:561] (1/4) Epoch 3323, batch 8, global_batch_idx: 53160, batch size: 170, loss[dur_loss=0.1885, prior_loss=0.9712, diff_loss=0.3129, tot_loss=1.473, over 170.00 samples.], tot_loss[dur_loss=0.1831, prior_loss=0.9707, diff_loss=0.3744, tot_loss=1.528, over 1432.00 samples.], 2024-10-21 22:08:37,175 INFO [train.py:682] (1/4) Start epoch 3324 2024-10-21 22:08:48,730 INFO [train.py:561] (1/4) Epoch 3324, batch 2, global_batch_idx: 53170, batch size: 203, loss[dur_loss=0.186, prior_loss=0.9709, diff_loss=0.3037, tot_loss=1.461, over 203.00 samples.], tot_loss[dur_loss=0.1852, prior_loss=0.971, diff_loss=0.291, tot_loss=1.447, over 442.00 samples.], 2024-10-21 22:09:03,018 INFO [train.py:561] (1/4) Epoch 3324, batch 12, global_batch_idx: 53180, batch size: 152, loss[dur_loss=0.1834, prior_loss=0.9711, diff_loss=0.3157, tot_loss=1.47, over 152.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9708, diff_loss=0.3488, tot_loss=1.503, over 1966.00 samples.], 2024-10-21 22:09:07,528 INFO [train.py:682] (1/4) Start epoch 3325 2024-10-21 22:09:24,709 INFO [train.py:561] (1/4) Epoch 3325, batch 6, global_batch_idx: 53190, batch size: 106, loss[dur_loss=0.1835, prior_loss=0.971, diff_loss=0.373, tot_loss=1.527, over 106.00 samples.], tot_loss[dur_loss=0.181, prior_loss=0.9704, diff_loss=0.3992, tot_loss=1.551, over 1142.00 samples.], 2024-10-21 22:09:37,982 INFO [train.py:682] (1/4) Start epoch 3326 2024-10-21 22:09:46,755 INFO [train.py:561] (1/4) Epoch 3326, batch 0, global_batch_idx: 53200, batch size: 108, loss[dur_loss=0.1866, prior_loss=0.9716, 
diff_loss=0.2969, tot_loss=1.455, over 108.00 samples.], tot_loss[dur_loss=0.1866, prior_loss=0.9716, diff_loss=0.2969, tot_loss=1.455, over 108.00 samples.], 2024-10-21 22:10:01,153 INFO [train.py:561] (1/4) Epoch 3326, batch 10, global_batch_idx: 53210, batch size: 111, loss[dur_loss=0.1865, prior_loss=0.9721, diff_loss=0.2883, tot_loss=1.447, over 111.00 samples.], tot_loss[dur_loss=0.1831, prior_loss=0.9708, diff_loss=0.3611, tot_loss=1.515, over 1656.00 samples.], 2024-10-21 22:10:08,287 INFO [train.py:682] (1/4) Start epoch 3327 2024-10-21 22:10:22,070 INFO [train.py:561] (1/4) Epoch 3327, batch 4, global_batch_idx: 53220, batch size: 189, loss[dur_loss=0.1852, prior_loss=0.9709, diff_loss=0.3201, tot_loss=1.476, over 189.00 samples.], tot_loss[dur_loss=0.1827, prior_loss=0.9703, diff_loss=0.4109, tot_loss=1.564, over 937.00 samples.], 2024-10-21 22:10:37,031 INFO [train.py:561] (1/4) Epoch 3327, batch 14, global_batch_idx: 53230, batch size: 142, loss[dur_loss=0.1854, prior_loss=0.9707, diff_loss=0.3026, tot_loss=1.459, over 142.00 samples.], tot_loss[dur_loss=0.184, prior_loss=0.9708, diff_loss=0.3461, tot_loss=1.501, over 2210.00 samples.], 2024-10-21 22:10:38,470 INFO [train.py:682] (1/4) Start epoch 3328 2024-10-21 22:10:58,736 INFO [train.py:561] (1/4) Epoch 3328, batch 8, global_batch_idx: 53240, batch size: 170, loss[dur_loss=0.1879, prior_loss=0.9713, diff_loss=0.3349, tot_loss=1.494, over 170.00 samples.], tot_loss[dur_loss=0.1829, prior_loss=0.9706, diff_loss=0.3833, tot_loss=1.537, over 1432.00 samples.], 2024-10-21 22:11:08,903 INFO [train.py:682] (1/4) Start epoch 3329 2024-10-21 22:11:20,283 INFO [train.py:561] (1/4) Epoch 3329, batch 2, global_batch_idx: 53250, batch size: 203, loss[dur_loss=0.1843, prior_loss=0.9708, diff_loss=0.3344, tot_loss=1.49, over 203.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9709, diff_loss=0.3184, tot_loss=1.473, over 442.00 samples.], 2024-10-21 22:11:34,602 INFO [train.py:561] (1/4) Epoch 3329, batch 12, global_batch_idx: 53260, batch size: 152, loss[dur_loss=0.1827, prior_loss=0.9712, diff_loss=0.3171, tot_loss=1.471, over 152.00 samples.], tot_loss[dur_loss=0.1821, prior_loss=0.9708, diff_loss=0.3538, tot_loss=1.507, over 1966.00 samples.], 2024-10-21 22:11:39,065 INFO [train.py:682] (1/4) Start epoch 3330 2024-10-21 22:11:56,236 INFO [train.py:561] (1/4) Epoch 3330, batch 6, global_batch_idx: 53270, batch size: 106, loss[dur_loss=0.1826, prior_loss=0.9709, diff_loss=0.2653, tot_loss=1.419, over 106.00 samples.], tot_loss[dur_loss=0.1811, prior_loss=0.9704, diff_loss=0.391, tot_loss=1.543, over 1142.00 samples.], 2024-10-21 22:12:09,318 INFO [train.py:682] (1/4) Start epoch 3331 2024-10-21 22:12:18,303 INFO [train.py:561] (1/4) Epoch 3331, batch 0, global_batch_idx: 53280, batch size: 108, loss[dur_loss=0.1887, prior_loss=0.9714, diff_loss=0.2589, tot_loss=1.419, over 108.00 samples.], tot_loss[dur_loss=0.1887, prior_loss=0.9714, diff_loss=0.2589, tot_loss=1.419, over 108.00 samples.], 2024-10-21 22:12:32,719 INFO [train.py:561] (1/4) Epoch 3331, batch 10, global_batch_idx: 53290, batch size: 111, loss[dur_loss=0.186, prior_loss=0.9721, diff_loss=0.2748, tot_loss=1.433, over 111.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9707, diff_loss=0.3619, tot_loss=1.515, over 1656.00 samples.], 2024-10-21 22:12:39,855 INFO [train.py:682] (1/4) Start epoch 3332 2024-10-21 22:12:54,062 INFO [train.py:561] (1/4) Epoch 3332, batch 4, global_batch_idx: 53300, batch size: 189, loss[dur_loss=0.1813, prior_loss=0.9709, 
diff_loss=0.3288, tot_loss=1.481, over 189.00 samples.], tot_loss[dur_loss=0.18, prior_loss=0.9703, diff_loss=0.4173, tot_loss=1.568, over 937.00 samples.], 2024-10-21 22:13:09,036 INFO [train.py:561] (1/4) Epoch 3332, batch 14, global_batch_idx: 53310, batch size: 142, loss[dur_loss=0.1877, prior_loss=0.9709, diff_loss=0.3192, tot_loss=1.478, over 142.00 samples.], tot_loss[dur_loss=0.1831, prior_loss=0.9708, diff_loss=0.3572, tot_loss=1.511, over 2210.00 samples.], 2024-10-21 22:13:10,479 INFO [train.py:682] (1/4) Start epoch 3333 2024-10-21 22:13:30,857 INFO [train.py:561] (1/4) Epoch 3333, batch 8, global_batch_idx: 53320, batch size: 170, loss[dur_loss=0.1857, prior_loss=0.9712, diff_loss=0.3495, tot_loss=1.506, over 170.00 samples.], tot_loss[dur_loss=0.1822, prior_loss=0.9706, diff_loss=0.3731, tot_loss=1.526, over 1432.00 samples.], 2024-10-21 22:13:41,157 INFO [train.py:682] (1/4) Start epoch 3334 2024-10-21 22:13:52,669 INFO [train.py:561] (1/4) Epoch 3334, batch 2, global_batch_idx: 53330, batch size: 203, loss[dur_loss=0.1844, prior_loss=0.9711, diff_loss=0.3214, tot_loss=1.477, over 203.00 samples.], tot_loss[dur_loss=0.185, prior_loss=0.9711, diff_loss=0.3122, tot_loss=1.468, over 442.00 samples.], 2024-10-21 22:14:07,099 INFO [train.py:561] (1/4) Epoch 3334, batch 12, global_batch_idx: 53340, batch size: 152, loss[dur_loss=0.1817, prior_loss=0.9712, diff_loss=0.2803, tot_loss=1.433, over 152.00 samples.], tot_loss[dur_loss=0.1828, prior_loss=0.9708, diff_loss=0.3495, tot_loss=1.503, over 1966.00 samples.], 2024-10-21 22:14:11,613 INFO [train.py:682] (1/4) Start epoch 3335 2024-10-21 22:14:28,607 INFO [train.py:561] (1/4) Epoch 3335, batch 6, global_batch_idx: 53350, batch size: 106, loss[dur_loss=0.1855, prior_loss=0.9711, diff_loss=0.2957, tot_loss=1.452, over 106.00 samples.], tot_loss[dur_loss=0.1822, prior_loss=0.9705, diff_loss=0.3857, tot_loss=1.538, over 1142.00 samples.], 2024-10-21 22:14:41,726 INFO [train.py:682] (1/4) Start epoch 3336 2024-10-21 22:14:50,323 INFO [train.py:561] (1/4) Epoch 3336, batch 0, global_batch_idx: 53360, batch size: 108, loss[dur_loss=0.1878, prior_loss=0.9714, diff_loss=0.278, tot_loss=1.437, over 108.00 samples.], tot_loss[dur_loss=0.1878, prior_loss=0.9714, diff_loss=0.278, tot_loss=1.437, over 108.00 samples.], 2024-10-21 22:15:04,567 INFO [train.py:561] (1/4) Epoch 3336, batch 10, global_batch_idx: 53370, batch size: 111, loss[dur_loss=0.1863, prior_loss=0.972, diff_loss=0.2975, tot_loss=1.456, over 111.00 samples.], tot_loss[dur_loss=0.1835, prior_loss=0.9707, diff_loss=0.359, tot_loss=1.513, over 1656.00 samples.], 2024-10-21 22:15:11,660 INFO [train.py:682] (1/4) Start epoch 3337 2024-10-21 22:15:25,463 INFO [train.py:561] (1/4) Epoch 3337, batch 4, global_batch_idx: 53380, batch size: 189, loss[dur_loss=0.1834, prior_loss=0.9709, diff_loss=0.3154, tot_loss=1.47, over 189.00 samples.], tot_loss[dur_loss=0.1813, prior_loss=0.9701, diff_loss=0.4165, tot_loss=1.568, over 937.00 samples.], 2024-10-21 22:15:40,415 INFO [train.py:561] (1/4) Epoch 3337, batch 14, global_batch_idx: 53390, batch size: 142, loss[dur_loss=0.1859, prior_loss=0.9708, diff_loss=0.2769, tot_loss=1.434, over 142.00 samples.], tot_loss[dur_loss=0.1835, prior_loss=0.9708, diff_loss=0.3415, tot_loss=1.496, over 2210.00 samples.], 2024-10-21 22:15:41,857 INFO [train.py:682] (1/4) Start epoch 3338 2024-10-21 22:16:01,766 INFO [train.py:561] (1/4) Epoch 3338, batch 8, global_batch_idx: 53400, batch size: 170, loss[dur_loss=0.1871, prior_loss=0.9715, diff_loss=0.3033, 
tot_loss=1.462, over 170.00 samples.], tot_loss[dur_loss=0.1822, prior_loss=0.9706, diff_loss=0.3772, tot_loss=1.53, over 1432.00 samples.], 2024-10-21 22:16:11,925 INFO [train.py:682] (1/4) Start epoch 3339 2024-10-21 22:16:23,370 INFO [train.py:561] (1/4) Epoch 3339, batch 2, global_batch_idx: 53410, batch size: 203, loss[dur_loss=0.1835, prior_loss=0.9708, diff_loss=0.3175, tot_loss=1.472, over 203.00 samples.], tot_loss[dur_loss=0.1838, prior_loss=0.9709, diff_loss=0.2942, tot_loss=1.449, over 442.00 samples.], 2024-10-21 22:16:37,762 INFO [train.py:561] (1/4) Epoch 3339, batch 12, global_batch_idx: 53420, batch size: 152, loss[dur_loss=0.1818, prior_loss=0.9712, diff_loss=0.3007, tot_loss=1.454, over 152.00 samples.], tot_loss[dur_loss=0.1822, prior_loss=0.9707, diff_loss=0.3546, tot_loss=1.508, over 1966.00 samples.], 2024-10-21 22:16:42,260 INFO [train.py:682] (1/4) Start epoch 3340 2024-10-21 22:16:59,453 INFO [train.py:561] (1/4) Epoch 3340, batch 6, global_batch_idx: 53430, batch size: 106, loss[dur_loss=0.1828, prior_loss=0.9709, diff_loss=0.3572, tot_loss=1.511, over 106.00 samples.], tot_loss[dur_loss=0.1803, prior_loss=0.9704, diff_loss=0.3909, tot_loss=1.542, over 1142.00 samples.], 2024-10-21 22:17:12,609 INFO [train.py:682] (1/4) Start epoch 3341 2024-10-21 22:17:21,284 INFO [train.py:561] (1/4) Epoch 3341, batch 0, global_batch_idx: 53440, batch size: 108, loss[dur_loss=0.186, prior_loss=0.9715, diff_loss=0.2855, tot_loss=1.443, over 108.00 samples.], tot_loss[dur_loss=0.186, prior_loss=0.9715, diff_loss=0.2855, tot_loss=1.443, over 108.00 samples.], 2024-10-21 22:17:35,652 INFO [train.py:561] (1/4) Epoch 3341, batch 10, global_batch_idx: 53450, batch size: 111, loss[dur_loss=0.1854, prior_loss=0.9721, diff_loss=0.2818, tot_loss=1.439, over 111.00 samples.], tot_loss[dur_loss=0.1821, prior_loss=0.9708, diff_loss=0.3585, tot_loss=1.511, over 1656.00 samples.], 2024-10-21 22:17:42,831 INFO [train.py:682] (1/4) Start epoch 3342 2024-10-21 22:17:56,761 INFO [train.py:561] (1/4) Epoch 3342, batch 4, global_batch_idx: 53460, batch size: 189, loss[dur_loss=0.1847, prior_loss=0.971, diff_loss=0.2844, tot_loss=1.44, over 189.00 samples.], tot_loss[dur_loss=0.1805, prior_loss=0.9702, diff_loss=0.4056, tot_loss=1.556, over 937.00 samples.], 2024-10-21 22:18:11,903 INFO [train.py:561] (1/4) Epoch 3342, batch 14, global_batch_idx: 53470, batch size: 142, loss[dur_loss=0.1831, prior_loss=0.9707, diff_loss=0.2879, tot_loss=1.442, over 142.00 samples.], tot_loss[dur_loss=0.1828, prior_loss=0.9708, diff_loss=0.337, tot_loss=1.491, over 2210.00 samples.], 2024-10-21 22:18:13,347 INFO [train.py:682] (1/4) Start epoch 3343 2024-10-21 22:18:33,545 INFO [train.py:561] (1/4) Epoch 3343, batch 8, global_batch_idx: 53480, batch size: 170, loss[dur_loss=0.1875, prior_loss=0.9714, diff_loss=0.3386, tot_loss=1.497, over 170.00 samples.], tot_loss[dur_loss=0.1826, prior_loss=0.9706, diff_loss=0.3699, tot_loss=1.523, over 1432.00 samples.], 2024-10-21 22:18:43,821 INFO [train.py:682] (1/4) Start epoch 3344 2024-10-21 22:18:55,081 INFO [train.py:561] (1/4) Epoch 3344, batch 2, global_batch_idx: 53490, batch size: 203, loss[dur_loss=0.1843, prior_loss=0.9708, diff_loss=0.2854, tot_loss=1.44, over 203.00 samples.], tot_loss[dur_loss=0.1852, prior_loss=0.9709, diff_loss=0.2836, tot_loss=1.44, over 442.00 samples.], 2024-10-21 22:19:09,443 INFO [train.py:561] (1/4) Epoch 3344, batch 12, global_batch_idx: 53500, batch size: 152, loss[dur_loss=0.1833, prior_loss=0.9711, diff_loss=0.2915, tot_loss=1.446, over 
152.00 samples.], tot_loss[dur_loss=0.1831, prior_loss=0.9707, diff_loss=0.3424, tot_loss=1.496, over 1966.00 samples.], 2024-10-21 22:19:13,917 INFO [train.py:682] (1/4) Start epoch 3345 2024-10-21 22:19:31,279 INFO [train.py:561] (1/4) Epoch 3345, batch 6, global_batch_idx: 53510, batch size: 106, loss[dur_loss=0.184, prior_loss=0.9709, diff_loss=0.2712, tot_loss=1.426, over 106.00 samples.], tot_loss[dur_loss=0.1814, prior_loss=0.9703, diff_loss=0.3865, tot_loss=1.538, over 1142.00 samples.], 2024-10-21 22:19:44,499 INFO [train.py:682] (1/4) Start epoch 3346 2024-10-21 22:19:53,181 INFO [train.py:561] (1/4) Epoch 3346, batch 0, global_batch_idx: 53520, batch size: 108, loss[dur_loss=0.1856, prior_loss=0.9714, diff_loss=0.3033, tot_loss=1.46, over 108.00 samples.], tot_loss[dur_loss=0.1856, prior_loss=0.9714, diff_loss=0.3033, tot_loss=1.46, over 108.00 samples.], 2024-10-21 22:20:07,684 INFO [train.py:561] (1/4) Epoch 3346, batch 10, global_batch_idx: 53530, batch size: 111, loss[dur_loss=0.1857, prior_loss=0.972, diff_loss=0.2828, tot_loss=1.44, over 111.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.9706, diff_loss=0.3513, tot_loss=1.504, over 1656.00 samples.], 2024-10-21 22:20:14,852 INFO [train.py:682] (1/4) Start epoch 3347 2024-10-21 22:20:28,580 INFO [train.py:561] (1/4) Epoch 3347, batch 4, global_batch_idx: 53540, batch size: 189, loss[dur_loss=0.1828, prior_loss=0.9709, diff_loss=0.3227, tot_loss=1.476, over 189.00 samples.], tot_loss[dur_loss=0.1793, prior_loss=0.9701, diff_loss=0.4064, tot_loss=1.556, over 937.00 samples.], 2024-10-21 22:20:43,655 INFO [train.py:561] (1/4) Epoch 3347, batch 14, global_batch_idx: 53550, batch size: 142, loss[dur_loss=0.1859, prior_loss=0.9708, diff_loss=0.2834, tot_loss=1.44, over 142.00 samples.], tot_loss[dur_loss=0.1819, prior_loss=0.9707, diff_loss=0.3438, tot_loss=1.496, over 2210.00 samples.], 2024-10-21 22:20:45,079 INFO [train.py:682] (1/4) Start epoch 3348 2024-10-21 22:21:05,534 INFO [train.py:561] (1/4) Epoch 3348, batch 8, global_batch_idx: 53560, batch size: 170, loss[dur_loss=0.1874, prior_loss=0.9713, diff_loss=0.3142, tot_loss=1.473, over 170.00 samples.], tot_loss[dur_loss=0.1822, prior_loss=0.9705, diff_loss=0.3733, tot_loss=1.526, over 1432.00 samples.], 2024-10-21 22:21:15,784 INFO [train.py:682] (1/4) Start epoch 3349 2024-10-21 22:21:27,140 INFO [train.py:561] (1/4) Epoch 3349, batch 2, global_batch_idx: 53570, batch size: 203, loss[dur_loss=0.1844, prior_loss=0.9709, diff_loss=0.3256, tot_loss=1.481, over 203.00 samples.], tot_loss[dur_loss=0.1846, prior_loss=0.971, diff_loss=0.2995, tot_loss=1.455, over 442.00 samples.], 2024-10-21 22:21:41,605 INFO [train.py:561] (1/4) Epoch 3349, batch 12, global_batch_idx: 53580, batch size: 152, loss[dur_loss=0.1823, prior_loss=0.9712, diff_loss=0.3042, tot_loss=1.458, over 152.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9707, diff_loss=0.3518, tot_loss=1.505, over 1966.00 samples.], 2024-10-21 22:21:46,083 INFO [train.py:682] (1/4) Start epoch 3350 2024-10-21 22:22:03,388 INFO [train.py:561] (1/4) Epoch 3350, batch 6, global_batch_idx: 53590, batch size: 106, loss[dur_loss=0.1827, prior_loss=0.9709, diff_loss=0.2827, tot_loss=1.436, over 106.00 samples.], tot_loss[dur_loss=0.1802, prior_loss=0.9704, diff_loss=0.3853, tot_loss=1.536, over 1142.00 samples.], 2024-10-21 22:22:16,612 INFO [train.py:682] (1/4) Start epoch 3351 2024-10-21 22:22:25,397 INFO [train.py:561] (1/4) Epoch 3351, batch 0, global_batch_idx: 53600, batch size: 108, loss[dur_loss=0.1856, 
prior_loss=0.9713, diff_loss=0.2935, tot_loss=1.45, over 108.00 samples.], tot_loss[dur_loss=0.1856, prior_loss=0.9713, diff_loss=0.2935, tot_loss=1.45, over 108.00 samples.], 2024-10-21 22:22:39,836 INFO [train.py:561] (1/4) Epoch 3351, batch 10, global_batch_idx: 53610, batch size: 111, loss[dur_loss=0.1875, prior_loss=0.9723, diff_loss=0.2489, tot_loss=1.409, over 111.00 samples.], tot_loss[dur_loss=0.1828, prior_loss=0.9707, diff_loss=0.3554, tot_loss=1.509, over 1656.00 samples.], 2024-10-21 22:22:47,015 INFO [train.py:682] (1/4) Start epoch 3352 2024-10-21 22:23:00,771 INFO [train.py:561] (1/4) Epoch 3352, batch 4, global_batch_idx: 53620, batch size: 189, loss[dur_loss=0.1854, prior_loss=0.9711, diff_loss=0.302, tot_loss=1.459, over 189.00 samples.], tot_loss[dur_loss=0.1819, prior_loss=0.9702, diff_loss=0.3996, tot_loss=1.552, over 937.00 samples.], 2024-10-21 22:23:15,878 INFO [train.py:561] (1/4) Epoch 3352, batch 14, global_batch_idx: 53630, batch size: 142, loss[dur_loss=0.1868, prior_loss=0.971, diff_loss=0.2606, tot_loss=1.418, over 142.00 samples.], tot_loss[dur_loss=0.1841, prior_loss=0.9708, diff_loss=0.3347, tot_loss=1.49, over 2210.00 samples.], 2024-10-21 22:23:17,306 INFO [train.py:682] (1/4) Start epoch 3353 2024-10-21 22:23:37,569 INFO [train.py:561] (1/4) Epoch 3353, batch 8, global_batch_idx: 53640, batch size: 170, loss[dur_loss=0.1858, prior_loss=0.9713, diff_loss=0.292, tot_loss=1.449, over 170.00 samples.], tot_loss[dur_loss=0.1825, prior_loss=0.9706, diff_loss=0.377, tot_loss=1.53, over 1432.00 samples.], 2024-10-21 22:23:47,821 INFO [train.py:682] (1/4) Start epoch 3354 2024-10-21 22:23:59,397 INFO [train.py:561] (1/4) Epoch 3354, batch 2, global_batch_idx: 53650, batch size: 203, loss[dur_loss=0.1833, prior_loss=0.9708, diff_loss=0.332, tot_loss=1.486, over 203.00 samples.], tot_loss[dur_loss=0.1837, prior_loss=0.9709, diff_loss=0.3259, tot_loss=1.481, over 442.00 samples.], 2024-10-21 22:24:13,778 INFO [train.py:561] (1/4) Epoch 3354, batch 12, global_batch_idx: 53660, batch size: 152, loss[dur_loss=0.1843, prior_loss=0.9711, diff_loss=0.263, tot_loss=1.418, over 152.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9707, diff_loss=0.3519, tot_loss=1.506, over 1966.00 samples.], 2024-10-21 22:24:18,260 INFO [train.py:682] (1/4) Start epoch 3355 2024-10-21 22:24:35,655 INFO [train.py:561] (1/4) Epoch 3355, batch 6, global_batch_idx: 53670, batch size: 106, loss[dur_loss=0.1842, prior_loss=0.9709, diff_loss=0.2798, tot_loss=1.435, over 106.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9704, diff_loss=0.3858, tot_loss=1.539, over 1142.00 samples.], 2024-10-21 22:24:48,874 INFO [train.py:682] (1/4) Start epoch 3356 2024-10-21 22:24:57,699 INFO [train.py:561] (1/4) Epoch 3356, batch 0, global_batch_idx: 53680, batch size: 108, loss[dur_loss=0.1866, prior_loss=0.9715, diff_loss=0.2767, tot_loss=1.435, over 108.00 samples.], tot_loss[dur_loss=0.1866, prior_loss=0.9715, diff_loss=0.2767, tot_loss=1.435, over 108.00 samples.], 2024-10-21 22:25:12,139 INFO [train.py:561] (1/4) Epoch 3356, batch 10, global_batch_idx: 53690, batch size: 111, loss[dur_loss=0.1855, prior_loss=0.9718, diff_loss=0.2938, tot_loss=1.451, over 111.00 samples.], tot_loss[dur_loss=0.1826, prior_loss=0.9706, diff_loss=0.3478, tot_loss=1.501, over 1656.00 samples.], 2024-10-21 22:25:19,270 INFO [train.py:682] (1/4) Start epoch 3357 2024-10-21 22:25:33,105 INFO [train.py:561] (1/4) Epoch 3357, batch 4, global_batch_idx: 53700, batch size: 189, loss[dur_loss=0.1835, prior_loss=0.9708, 
diff_loss=0.3193, tot_loss=1.474, over 189.00 samples.], tot_loss[dur_loss=0.1804, prior_loss=0.9701, diff_loss=0.4152, tot_loss=1.566, over 937.00 samples.], 2024-10-21 22:25:48,187 INFO [train.py:561] (1/4) Epoch 3357, batch 14, global_batch_idx: 53710, batch size: 142, loss[dur_loss=0.1873, prior_loss=0.9709, diff_loss=0.2788, tot_loss=1.437, over 142.00 samples.], tot_loss[dur_loss=0.183, prior_loss=0.9708, diff_loss=0.339, tot_loss=1.493, over 2210.00 samples.], 2024-10-21 22:25:49,620 INFO [train.py:682] (1/4) Start epoch 3358 2024-10-21 22:26:09,547 INFO [train.py:561] (1/4) Epoch 3358, batch 8, global_batch_idx: 53720, batch size: 170, loss[dur_loss=0.188, prior_loss=0.9715, diff_loss=0.3114, tot_loss=1.471, over 170.00 samples.], tot_loss[dur_loss=0.183, prior_loss=0.9706, diff_loss=0.3757, tot_loss=1.529, over 1432.00 samples.], 2024-10-21 22:26:19,801 INFO [train.py:682] (1/4) Start epoch 3359 2024-10-21 22:26:31,451 INFO [train.py:561] (1/4) Epoch 3359, batch 2, global_batch_idx: 53730, batch size: 203, loss[dur_loss=0.1864, prior_loss=0.9709, diff_loss=0.3122, tot_loss=1.469, over 203.00 samples.], tot_loss[dur_loss=0.185, prior_loss=0.971, diff_loss=0.2934, tot_loss=1.449, over 442.00 samples.], 2024-10-21 22:26:45,928 INFO [train.py:561] (1/4) Epoch 3359, batch 12, global_batch_idx: 53740, batch size: 152, loss[dur_loss=0.1834, prior_loss=0.9713, diff_loss=0.3, tot_loss=1.455, over 152.00 samples.], tot_loss[dur_loss=0.1837, prior_loss=0.9708, diff_loss=0.3462, tot_loss=1.501, over 1966.00 samples.], 2024-10-21 22:26:50,426 INFO [train.py:682] (1/4) Start epoch 3360 2024-10-21 22:27:07,466 INFO [train.py:561] (1/4) Epoch 3360, batch 6, global_batch_idx: 53750, batch size: 106, loss[dur_loss=0.1834, prior_loss=0.9709, diff_loss=0.2791, tot_loss=1.433, over 106.00 samples.], tot_loss[dur_loss=0.1826, prior_loss=0.9704, diff_loss=0.3873, tot_loss=1.54, over 1142.00 samples.], 2024-10-21 22:27:20,630 INFO [train.py:682] (1/4) Start epoch 3361 2024-10-21 22:27:29,578 INFO [train.py:561] (1/4) Epoch 3361, batch 0, global_batch_idx: 53760, batch size: 108, loss[dur_loss=0.187, prior_loss=0.9715, diff_loss=0.2545, tot_loss=1.413, over 108.00 samples.], tot_loss[dur_loss=0.187, prior_loss=0.9715, diff_loss=0.2545, tot_loss=1.413, over 108.00 samples.], 2024-10-21 22:27:43,955 INFO [train.py:561] (1/4) Epoch 3361, batch 10, global_batch_idx: 53770, batch size: 111, loss[dur_loss=0.1857, prior_loss=0.972, diff_loss=0.3075, tot_loss=1.465, over 111.00 samples.], tot_loss[dur_loss=0.1826, prior_loss=0.9707, diff_loss=0.3607, tot_loss=1.514, over 1656.00 samples.], 2024-10-21 22:27:51,085 INFO [train.py:682] (1/4) Start epoch 3362 2024-10-21 22:28:04,944 INFO [train.py:561] (1/4) Epoch 3362, batch 4, global_batch_idx: 53780, batch size: 189, loss[dur_loss=0.1831, prior_loss=0.9707, diff_loss=0.3079, tot_loss=1.462, over 189.00 samples.], tot_loss[dur_loss=0.1806, prior_loss=0.9702, diff_loss=0.4144, tot_loss=1.565, over 937.00 samples.], 2024-10-21 22:28:19,956 INFO [train.py:561] (1/4) Epoch 3362, batch 14, global_batch_idx: 53790, batch size: 142, loss[dur_loss=0.1866, prior_loss=0.9709, diff_loss=0.3042, tot_loss=1.462, over 142.00 samples.], tot_loss[dur_loss=0.1829, prior_loss=0.9707, diff_loss=0.3469, tot_loss=1.5, over 2210.00 samples.], 2024-10-21 22:28:21,388 INFO [train.py:682] (1/4) Start epoch 3363 2024-10-21 22:28:41,670 INFO [train.py:561] (1/4) Epoch 3363, batch 8, global_batch_idx: 53800, batch size: 170, loss[dur_loss=0.1847, prior_loss=0.9713, diff_loss=0.3326, 
tot_loss=1.489, over 170.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9705, diff_loss=0.3839, tot_loss=1.537, over 1432.00 samples.], 2024-10-21 22:28:52,049 INFO [train.py:682] (1/4) Start epoch 3364 2024-10-21 22:29:03,842 INFO [train.py:561] (1/4) Epoch 3364, batch 2, global_batch_idx: 53810, batch size: 203, loss[dur_loss=0.1843, prior_loss=0.9709, diff_loss=0.3009, tot_loss=1.456, over 203.00 samples.], tot_loss[dur_loss=0.1851, prior_loss=0.971, diff_loss=0.3019, tot_loss=1.458, over 442.00 samples.], 2024-10-21 22:29:18,275 INFO [train.py:561] (1/4) Epoch 3364, batch 12, global_batch_idx: 53820, batch size: 152, loss[dur_loss=0.1836, prior_loss=0.9712, diff_loss=0.3086, tot_loss=1.463, over 152.00 samples.], tot_loss[dur_loss=0.1827, prior_loss=0.9708, diff_loss=0.3598, tot_loss=1.513, over 1966.00 samples.], 2024-10-21 22:29:22,786 INFO [train.py:682] (1/4) Start epoch 3365 2024-10-21 22:29:40,027 INFO [train.py:561] (1/4) Epoch 3365, batch 6, global_batch_idx: 53830, batch size: 106, loss[dur_loss=0.1827, prior_loss=0.9709, diff_loss=0.2941, tot_loss=1.448, over 106.00 samples.], tot_loss[dur_loss=0.1826, prior_loss=0.9705, diff_loss=0.3776, tot_loss=1.531, over 1142.00 samples.], 2024-10-21 22:29:53,188 INFO [train.py:682] (1/4) Start epoch 3366 2024-10-21 22:30:02,057 INFO [train.py:561] (1/4) Epoch 3366, batch 0, global_batch_idx: 53840, batch size: 108, loss[dur_loss=0.186, prior_loss=0.9712, diff_loss=0.2652, tot_loss=1.422, over 108.00 samples.], tot_loss[dur_loss=0.186, prior_loss=0.9712, diff_loss=0.2652, tot_loss=1.422, over 108.00 samples.], 2024-10-21 22:30:16,386 INFO [train.py:561] (1/4) Epoch 3366, batch 10, global_batch_idx: 53850, batch size: 111, loss[dur_loss=0.186, prior_loss=0.9724, diff_loss=0.2894, tot_loss=1.448, over 111.00 samples.], tot_loss[dur_loss=0.1822, prior_loss=0.9707, diff_loss=0.3643, tot_loss=1.517, over 1656.00 samples.], 2024-10-21 22:30:23,508 INFO [train.py:682] (1/4) Start epoch 3367 2024-10-21 22:30:37,415 INFO [train.py:561] (1/4) Epoch 3367, batch 4, global_batch_idx: 53860, batch size: 189, loss[dur_loss=0.1854, prior_loss=0.9709, diff_loss=0.3238, tot_loss=1.48, over 189.00 samples.], tot_loss[dur_loss=0.1801, prior_loss=0.9701, diff_loss=0.4199, tot_loss=1.57, over 937.00 samples.], 2024-10-21 22:30:52,370 INFO [train.py:561] (1/4) Epoch 3367, batch 14, global_batch_idx: 53870, batch size: 142, loss[dur_loss=0.1861, prior_loss=0.9707, diff_loss=0.2825, tot_loss=1.439, over 142.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9706, diff_loss=0.3496, tot_loss=1.503, over 2210.00 samples.], 2024-10-21 22:30:53,798 INFO [train.py:682] (1/4) Start epoch 3368 2024-10-21 22:31:13,744 INFO [train.py:561] (1/4) Epoch 3368, batch 8, global_batch_idx: 53880, batch size: 170, loss[dur_loss=0.1844, prior_loss=0.9707, diff_loss=0.33, tot_loss=1.485, over 170.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9703, diff_loss=0.377, tot_loss=1.528, over 1432.00 samples.], 2024-10-21 22:31:23,939 INFO [train.py:682] (1/4) Start epoch 3369 2024-10-21 22:31:35,306 INFO [train.py:561] (1/4) Epoch 3369, batch 2, global_batch_idx: 53890, batch size: 203, loss[dur_loss=0.1838, prior_loss=0.9707, diff_loss=0.335, tot_loss=1.489, over 203.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9708, diff_loss=0.2977, tot_loss=1.452, over 442.00 samples.], 2024-10-21 22:31:49,626 INFO [train.py:561] (1/4) Epoch 3369, batch 12, global_batch_idx: 53900, batch size: 152, loss[dur_loss=0.1834, prior_loss=0.971, diff_loss=0.3036, tot_loss=1.458, over 
152.00 samples.], tot_loss[dur_loss=0.1821, prior_loss=0.9706, diff_loss=0.345, tot_loss=1.498, over 1966.00 samples.], 2024-10-21 22:31:54,117 INFO [train.py:682] (1/4) Start epoch 3370 2024-10-21 22:32:11,311 INFO [train.py:561] (1/4) Epoch 3370, batch 6, global_batch_idx: 53910, batch size: 106, loss[dur_loss=0.1848, prior_loss=0.9708, diff_loss=0.2358, tot_loss=1.391, over 106.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9704, diff_loss=0.3735, tot_loss=1.525, over 1142.00 samples.], 2024-10-21 22:32:24,500 INFO [train.py:682] (1/4) Start epoch 3371 2024-10-21 22:32:33,504 INFO [train.py:561] (1/4) Epoch 3371, batch 0, global_batch_idx: 53920, batch size: 108, loss[dur_loss=0.1856, prior_loss=0.9713, diff_loss=0.2657, tot_loss=1.423, over 108.00 samples.], tot_loss[dur_loss=0.1856, prior_loss=0.9713, diff_loss=0.2657, tot_loss=1.423, over 108.00 samples.], 2024-10-21 22:32:47,848 INFO [train.py:561] (1/4) Epoch 3371, batch 10, global_batch_idx: 53930, batch size: 111, loss[dur_loss=0.1861, prior_loss=0.9721, diff_loss=0.3252, tot_loss=1.483, over 111.00 samples.], tot_loss[dur_loss=0.1827, prior_loss=0.9706, diff_loss=0.3607, tot_loss=1.514, over 1656.00 samples.], 2024-10-21 22:32:55,025 INFO [train.py:682] (1/4) Start epoch 3372 2024-10-21 22:33:09,053 INFO [train.py:561] (1/4) Epoch 3372, batch 4, global_batch_idx: 53940, batch size: 189, loss[dur_loss=0.1868, prior_loss=0.971, diff_loss=0.3055, tot_loss=1.463, over 189.00 samples.], tot_loss[dur_loss=0.1809, prior_loss=0.9702, diff_loss=0.3995, tot_loss=1.551, over 937.00 samples.], 2024-10-21 22:33:24,077 INFO [train.py:561] (1/4) Epoch 3372, batch 14, global_batch_idx: 53950, batch size: 142, loss[dur_loss=0.1857, prior_loss=0.9708, diff_loss=0.3203, tot_loss=1.477, over 142.00 samples.], tot_loss[dur_loss=0.1831, prior_loss=0.9707, diff_loss=0.3448, tot_loss=1.499, over 2210.00 samples.], 2024-10-21 22:33:25,533 INFO [train.py:682] (1/4) Start epoch 3373 2024-10-21 22:33:45,709 INFO [train.py:561] (1/4) Epoch 3373, batch 8, global_batch_idx: 53960, batch size: 170, loss[dur_loss=0.1863, prior_loss=0.9714, diff_loss=0.3457, tot_loss=1.503, over 170.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9705, diff_loss=0.381, tot_loss=1.533, over 1432.00 samples.], 2024-10-21 22:33:55,932 INFO [train.py:682] (1/4) Start epoch 3374 2024-10-21 22:34:07,251 INFO [train.py:561] (1/4) Epoch 3374, batch 2, global_batch_idx: 53970, batch size: 203, loss[dur_loss=0.1824, prior_loss=0.9708, diff_loss=0.2998, tot_loss=1.453, over 203.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9709, diff_loss=0.3084, tot_loss=1.463, over 442.00 samples.], 2024-10-21 22:34:21,681 INFO [train.py:561] (1/4) Epoch 3374, batch 12, global_batch_idx: 53980, batch size: 152, loss[dur_loss=0.1818, prior_loss=0.9709, diff_loss=0.2839, tot_loss=1.437, over 152.00 samples.], tot_loss[dur_loss=0.1824, prior_loss=0.9706, diff_loss=0.3449, tot_loss=1.498, over 1966.00 samples.], 2024-10-21 22:34:26,179 INFO [train.py:682] (1/4) Start epoch 3375 2024-10-21 22:34:43,471 INFO [train.py:561] (1/4) Epoch 3375, batch 6, global_batch_idx: 53990, batch size: 106, loss[dur_loss=0.1799, prior_loss=0.9706, diff_loss=0.2739, tot_loss=1.424, over 106.00 samples.], tot_loss[dur_loss=0.1809, prior_loss=0.9703, diff_loss=0.3848, tot_loss=1.536, over 1142.00 samples.], 2024-10-21 22:34:56,633 INFO [train.py:682] (1/4) Start epoch 3376 2024-10-21 22:35:05,619 INFO [train.py:561] (1/4) Epoch 3376, batch 0, global_batch_idx: 54000, batch size: 108, loss[dur_loss=0.1853, 
prior_loss=0.9714, diff_loss=0.2834, tot_loss=1.44, over 108.00 samples.], tot_loss[dur_loss=0.1853, prior_loss=0.9714, diff_loss=0.2834, tot_loss=1.44, over 108.00 samples.], 2024-10-21 22:35:07,033 INFO [train.py:579] (1/4) Computing validation loss 2024-10-21 22:35:24,741 INFO [train.py:589] (1/4) Epoch 3376, validation: dur_loss=0.462, prior_loss=1.037, diff_loss=0.4017, tot_loss=1.901, over 100.00 samples. 2024-10-21 22:35:24,742 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-21 22:35:37,776 INFO [train.py:561] (1/4) Epoch 3376, batch 10, global_batch_idx: 54010, batch size: 111, loss[dur_loss=0.187, prior_loss=0.9719, diff_loss=0.2864, tot_loss=1.445, over 111.00 samples.], tot_loss[dur_loss=0.1819, prior_loss=0.9705, diff_loss=0.3553, tot_loss=1.508, over 1656.00 samples.], 2024-10-21 22:35:44,904 INFO [train.py:682] (1/4) Start epoch 3377 2024-10-21 22:35:59,128 INFO [train.py:561] (1/4) Epoch 3377, batch 4, global_batch_idx: 54020, batch size: 189, loss[dur_loss=0.1837, prior_loss=0.9708, diff_loss=0.288, tot_loss=1.443, over 189.00 samples.], tot_loss[dur_loss=0.1805, prior_loss=0.9701, diff_loss=0.4039, tot_loss=1.555, over 937.00 samples.], 2024-10-21 22:36:13,997 INFO [train.py:561] (1/4) Epoch 3377, batch 14, global_batch_idx: 54030, batch size: 142, loss[dur_loss=0.1829, prior_loss=0.9708, diff_loss=0.3003, tot_loss=1.454, over 142.00 samples.], tot_loss[dur_loss=0.1824, prior_loss=0.9706, diff_loss=0.3449, tot_loss=1.498, over 2210.00 samples.], 2024-10-21 22:36:15,422 INFO [train.py:682] (1/4) Start epoch 3378 2024-10-21 22:36:36,241 INFO [train.py:561] (1/4) Epoch 3378, batch 8, global_batch_idx: 54040, batch size: 170, loss[dur_loss=0.1845, prior_loss=0.9709, diff_loss=0.3236, tot_loss=1.479, over 170.00 samples.], tot_loss[dur_loss=0.1818, prior_loss=0.9705, diff_loss=0.3615, tot_loss=1.514, over 1432.00 samples.], 2024-10-21 22:36:46,367 INFO [train.py:682] (1/4) Start epoch 3379 2024-10-21 22:36:57,981 INFO [train.py:561] (1/4) Epoch 3379, batch 2, global_batch_idx: 54050, batch size: 203, loss[dur_loss=0.1833, prior_loss=0.9708, diff_loss=0.3169, tot_loss=1.471, over 203.00 samples.], tot_loss[dur_loss=0.1835, prior_loss=0.9709, diff_loss=0.3051, tot_loss=1.459, over 442.00 samples.], 2024-10-21 22:37:12,182 INFO [train.py:561] (1/4) Epoch 3379, batch 12, global_batch_idx: 54060, batch size: 152, loss[dur_loss=0.1813, prior_loss=0.971, diff_loss=0.2883, tot_loss=1.441, over 152.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.9706, diff_loss=0.3566, tot_loss=1.509, over 1966.00 samples.], 2024-10-21 22:37:16,618 INFO [train.py:682] (1/4) Start epoch 3380 2024-10-21 22:37:34,266 INFO [train.py:561] (1/4) Epoch 3380, batch 6, global_batch_idx: 54070, batch size: 106, loss[dur_loss=0.1834, prior_loss=0.9707, diff_loss=0.257, tot_loss=1.411, over 106.00 samples.], tot_loss[dur_loss=0.1798, prior_loss=0.9702, diff_loss=0.3775, tot_loss=1.527, over 1142.00 samples.], 2024-10-21 22:37:47,360 INFO [train.py:682] (1/4) Start epoch 3381 2024-10-21 22:37:56,483 INFO [train.py:561] (1/4) Epoch 3381, batch 0, global_batch_idx: 54080, batch size: 108, loss[dur_loss=0.1881, prior_loss=0.9714, diff_loss=0.2669, tot_loss=1.426, over 108.00 samples.], tot_loss[dur_loss=0.1881, prior_loss=0.9714, diff_loss=0.2669, tot_loss=1.426, over 108.00 samples.], 2024-10-21 22:38:10,839 INFO [train.py:561] (1/4) Epoch 3381, batch 10, global_batch_idx: 54090, batch size: 111, loss[dur_loss=0.1824, prior_loss=0.972, diff_loss=0.2764, tot_loss=1.431, over 111.00 
samples.], tot_loss[dur_loss=0.1822, prior_loss=0.9706, diff_loss=0.3686, tot_loss=1.521, over 1656.00 samples.], 2024-10-21 22:38:17,927 INFO [train.py:682] (1/4) Start epoch 3382 2024-10-21 22:38:32,016 INFO [train.py:561] (1/4) Epoch 3382, batch 4, global_batch_idx: 54100, batch size: 189, loss[dur_loss=0.1862, prior_loss=0.9712, diff_loss=0.3157, tot_loss=1.473, over 189.00 samples.], tot_loss[dur_loss=0.1821, prior_loss=0.9702, diff_loss=0.4209, tot_loss=1.573, over 937.00 samples.], 2024-10-21 22:38:46,879 INFO [train.py:561] (1/4) Epoch 3382, batch 14, global_batch_idx: 54110, batch size: 142, loss[dur_loss=0.1836, prior_loss=0.9708, diff_loss=0.2809, tot_loss=1.435, over 142.00 samples.], tot_loss[dur_loss=0.1826, prior_loss=0.9708, diff_loss=0.3483, tot_loss=1.502, over 2210.00 samples.], 2024-10-21 22:38:48,316 INFO [train.py:682] (1/4) Start epoch 3383 2024-10-21 22:39:09,067 INFO [train.py:561] (1/4) Epoch 3383, batch 8, global_batch_idx: 54120, batch size: 170, loss[dur_loss=0.1906, prior_loss=0.9713, diff_loss=0.2986, tot_loss=1.46, over 170.00 samples.], tot_loss[dur_loss=0.1827, prior_loss=0.9705, diff_loss=0.3713, tot_loss=1.525, over 1432.00 samples.], 2024-10-21 22:39:19,200 INFO [train.py:682] (1/4) Start epoch 3384 2024-10-21 22:39:31,037 INFO [train.py:561] (1/4) Epoch 3384, batch 2, global_batch_idx: 54130, batch size: 203, loss[dur_loss=0.1846, prior_loss=0.9709, diff_loss=0.3022, tot_loss=1.458, over 203.00 samples.], tot_loss[dur_loss=0.1849, prior_loss=0.971, diff_loss=0.2888, tot_loss=1.445, over 442.00 samples.], 2024-10-21 22:39:45,313 INFO [train.py:561] (1/4) Epoch 3384, batch 12, global_batch_idx: 54140, batch size: 152, loss[dur_loss=0.1839, prior_loss=0.9712, diff_loss=0.3135, tot_loss=1.469, over 152.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9706, diff_loss=0.3467, tot_loss=1.5, over 1966.00 samples.], 2024-10-21 22:39:49,757 INFO [train.py:682] (1/4) Start epoch 3385 2024-10-21 22:40:07,174 INFO [train.py:561] (1/4) Epoch 3385, batch 6, global_batch_idx: 54150, batch size: 106, loss[dur_loss=0.182, prior_loss=0.9708, diff_loss=0.2952, tot_loss=1.448, over 106.00 samples.], tot_loss[dur_loss=0.1824, prior_loss=0.9704, diff_loss=0.3902, tot_loss=1.543, over 1142.00 samples.], 2024-10-21 22:40:20,224 INFO [train.py:682] (1/4) Start epoch 3386 2024-10-21 22:40:29,356 INFO [train.py:561] (1/4) Epoch 3386, batch 0, global_batch_idx: 54160, batch size: 108, loss[dur_loss=0.1859, prior_loss=0.9716, diff_loss=0.3032, tot_loss=1.461, over 108.00 samples.], tot_loss[dur_loss=0.1859, prior_loss=0.9716, diff_loss=0.3032, tot_loss=1.461, over 108.00 samples.], 2024-10-21 22:40:43,614 INFO [train.py:561] (1/4) Epoch 3386, batch 10, global_batch_idx: 54170, batch size: 111, loss[dur_loss=0.1821, prior_loss=0.9719, diff_loss=0.2937, tot_loss=1.448, over 111.00 samples.], tot_loss[dur_loss=0.1826, prior_loss=0.9707, diff_loss=0.3601, tot_loss=1.513, over 1656.00 samples.], 2024-10-21 22:40:50,784 INFO [train.py:682] (1/4) Start epoch 3387 2024-10-21 22:41:04,810 INFO [train.py:561] (1/4) Epoch 3387, batch 4, global_batch_idx: 54180, batch size: 189, loss[dur_loss=0.1838, prior_loss=0.971, diff_loss=0.343, tot_loss=1.498, over 189.00 samples.], tot_loss[dur_loss=0.1811, prior_loss=0.9702, diff_loss=0.4229, tot_loss=1.574, over 937.00 samples.], 2024-10-21 22:41:19,701 INFO [train.py:561] (1/4) Epoch 3387, batch 14, global_batch_idx: 54190, batch size: 142, loss[dur_loss=0.1884, prior_loss=0.9707, diff_loss=0.2819, tot_loss=1.441, over 142.00 samples.], 
tot_loss[dur_loss=0.1834, prior_loss=0.9707, diff_loss=0.3473, tot_loss=1.501, over 2210.00 samples.], 2024-10-21 22:41:21,125 INFO [train.py:682] (1/4) Start epoch 3388 2024-10-21 22:41:41,280 INFO [train.py:561] (1/4) Epoch 3388, batch 8, global_batch_idx: 54200, batch size: 170, loss[dur_loss=0.1827, prior_loss=0.9711, diff_loss=0.3219, tot_loss=1.476, over 170.00 samples.], tot_loss[dur_loss=0.1821, prior_loss=0.9705, diff_loss=0.3686, tot_loss=1.521, over 1432.00 samples.], 2024-10-21 22:41:51,411 INFO [train.py:682] (1/4) Start epoch 3389 2024-10-21 22:42:03,076 INFO [train.py:561] (1/4) Epoch 3389, batch 2, global_batch_idx: 54210, batch size: 203, loss[dur_loss=0.1849, prior_loss=0.9709, diff_loss=0.3265, tot_loss=1.482, over 203.00 samples.], tot_loss[dur_loss=0.1836, prior_loss=0.971, diff_loss=0.3111, tot_loss=1.466, over 442.00 samples.], 2024-10-21 22:42:17,354 INFO [train.py:561] (1/4) Epoch 3389, batch 12, global_batch_idx: 54220, batch size: 152, loss[dur_loss=0.1818, prior_loss=0.9712, diff_loss=0.2874, tot_loss=1.44, over 152.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9706, diff_loss=0.3458, tot_loss=1.499, over 1966.00 samples.], 2024-10-21 22:42:21,801 INFO [train.py:682] (1/4) Start epoch 3390 2024-10-21 22:42:39,330 INFO [train.py:561] (1/4) Epoch 3390, batch 6, global_batch_idx: 54230, batch size: 106, loss[dur_loss=0.1823, prior_loss=0.9708, diff_loss=0.269, tot_loss=1.422, over 106.00 samples.], tot_loss[dur_loss=0.1808, prior_loss=0.9703, diff_loss=0.3916, tot_loss=1.543, over 1142.00 samples.], 2024-10-21 22:42:52,425 INFO [train.py:682] (1/4) Start epoch 3391 2024-10-21 22:43:01,376 INFO [train.py:561] (1/4) Epoch 3391, batch 0, global_batch_idx: 54240, batch size: 108, loss[dur_loss=0.1876, prior_loss=0.9715, diff_loss=0.286, tot_loss=1.445, over 108.00 samples.], tot_loss[dur_loss=0.1876, prior_loss=0.9715, diff_loss=0.286, tot_loss=1.445, over 108.00 samples.], 2024-10-21 22:43:15,632 INFO [train.py:561] (1/4) Epoch 3391, batch 10, global_batch_idx: 54250, batch size: 111, loss[dur_loss=0.1848, prior_loss=0.9717, diff_loss=0.2727, tot_loss=1.429, over 111.00 samples.], tot_loss[dur_loss=0.1817, prior_loss=0.9706, diff_loss=0.3538, tot_loss=1.506, over 1656.00 samples.], 2024-10-21 22:43:22,740 INFO [train.py:682] (1/4) Start epoch 3392 2024-10-21 22:43:37,054 INFO [train.py:561] (1/4) Epoch 3392, batch 4, global_batch_idx: 54260, batch size: 189, loss[dur_loss=0.1818, prior_loss=0.9708, diff_loss=0.298, tot_loss=1.451, over 189.00 samples.], tot_loss[dur_loss=0.1795, prior_loss=0.9701, diff_loss=0.4124, tot_loss=1.562, over 937.00 samples.], 2024-10-21 22:43:51,908 INFO [train.py:561] (1/4) Epoch 3392, batch 14, global_batch_idx: 54270, batch size: 142, loss[dur_loss=0.1852, prior_loss=0.9707, diff_loss=0.2928, tot_loss=1.449, over 142.00 samples.], tot_loss[dur_loss=0.1824, prior_loss=0.9706, diff_loss=0.3484, tot_loss=1.501, over 2210.00 samples.], 2024-10-21 22:43:53,338 INFO [train.py:682] (1/4) Start epoch 3393 2024-10-21 22:44:13,547 INFO [train.py:561] (1/4) Epoch 3393, batch 8, global_batch_idx: 54280, batch size: 170, loss[dur_loss=0.185, prior_loss=0.971, diff_loss=0.3217, tot_loss=1.478, over 170.00 samples.], tot_loss[dur_loss=0.1829, prior_loss=0.9705, diff_loss=0.3718, tot_loss=1.525, over 1432.00 samples.], 2024-10-21 22:44:23,758 INFO [train.py:682] (1/4) Start epoch 3394 2024-10-21 22:44:35,429 INFO [train.py:561] (1/4) Epoch 3394, batch 2, global_batch_idx: 54290, batch size: 203, loss[dur_loss=0.184, prior_loss=0.9709, 
diff_loss=0.3413, tot_loss=1.496, over 203.00 samples.], tot_loss[dur_loss=0.1843, prior_loss=0.9709, diff_loss=0.3087, tot_loss=1.464, over 442.00 samples.], 2024-10-21 22:44:49,781 INFO [train.py:561] (1/4) Epoch 3394, batch 12, global_batch_idx: 54300, batch size: 152, loss[dur_loss=0.181, prior_loss=0.971, diff_loss=0.3333, tot_loss=1.485, over 152.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9706, diff_loss=0.3501, tot_loss=1.503, over 1966.00 samples.], 2024-10-21 22:44:54,246 INFO [train.py:682] (1/4) Start epoch 3395 2024-10-21 22:45:11,526 INFO [train.py:561] (1/4) Epoch 3395, batch 6, global_batch_idx: 54310, batch size: 106, loss[dur_loss=0.1859, prior_loss=0.971, diff_loss=0.2828, tot_loss=1.44, over 106.00 samples.], tot_loss[dur_loss=0.1826, prior_loss=0.9704, diff_loss=0.3901, tot_loss=1.543, over 1142.00 samples.], 2024-10-21 22:45:24,627 INFO [train.py:682] (1/4) Start epoch 3396 2024-10-21 22:45:33,608 INFO [train.py:561] (1/4) Epoch 3396, batch 0, global_batch_idx: 54320, batch size: 108, loss[dur_loss=0.1888, prior_loss=0.9715, diff_loss=0.3142, tot_loss=1.475, over 108.00 samples.], tot_loss[dur_loss=0.1888, prior_loss=0.9715, diff_loss=0.3142, tot_loss=1.475, over 108.00 samples.], 2024-10-21 22:45:47,837 INFO [train.py:561] (1/4) Epoch 3396, batch 10, global_batch_idx: 54330, batch size: 111, loss[dur_loss=0.1841, prior_loss=0.9719, diff_loss=0.3069, tot_loss=1.463, over 111.00 samples.], tot_loss[dur_loss=0.1826, prior_loss=0.9706, diff_loss=0.3652, tot_loss=1.518, over 1656.00 samples.], 2024-10-21 22:45:54,980 INFO [train.py:682] (1/4) Start epoch 3397 2024-10-21 22:46:08,794 INFO [train.py:561] (1/4) Epoch 3397, batch 4, global_batch_idx: 54340, batch size: 189, loss[dur_loss=0.1855, prior_loss=0.9711, diff_loss=0.3051, tot_loss=1.462, over 189.00 samples.], tot_loss[dur_loss=0.1803, prior_loss=0.9701, diff_loss=0.4113, tot_loss=1.562, over 937.00 samples.], 2024-10-21 22:46:23,743 INFO [train.py:561] (1/4) Epoch 3397, batch 14, global_batch_idx: 54350, batch size: 142, loss[dur_loss=0.1872, prior_loss=0.9709, diff_loss=0.303, tot_loss=1.461, over 142.00 samples.], tot_loss[dur_loss=0.1827, prior_loss=0.9707, diff_loss=0.3428, tot_loss=1.496, over 2210.00 samples.], 2024-10-21 22:46:25,187 INFO [train.py:682] (1/4) Start epoch 3398 2024-10-21 22:46:45,784 INFO [train.py:561] (1/4) Epoch 3398, batch 8, global_batch_idx: 54360, batch size: 170, loss[dur_loss=0.1838, prior_loss=0.971, diff_loss=0.3285, tot_loss=1.483, over 170.00 samples.], tot_loss[dur_loss=0.1819, prior_loss=0.9705, diff_loss=0.3724, tot_loss=1.525, over 1432.00 samples.], 2024-10-21 22:46:55,932 INFO [train.py:682] (1/4) Start epoch 3399 2024-10-21 22:47:07,377 INFO [train.py:561] (1/4) Epoch 3399, batch 2, global_batch_idx: 54370, batch size: 203, loss[dur_loss=0.1861, prior_loss=0.9712, diff_loss=0.3456, tot_loss=1.503, over 203.00 samples.], tot_loss[dur_loss=0.1847, prior_loss=0.9711, diff_loss=0.3157, tot_loss=1.472, over 442.00 samples.], 2024-10-21 22:47:21,651 INFO [train.py:561] (1/4) Epoch 3399, batch 12, global_batch_idx: 54380, batch size: 152, loss[dur_loss=0.182, prior_loss=0.9711, diff_loss=0.2667, tot_loss=1.42, over 152.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.9706, diff_loss=0.3549, tot_loss=1.508, over 1966.00 samples.], 2024-10-21 22:47:26,140 INFO [train.py:682] (1/4) Start epoch 3400 2024-10-21 22:47:43,571 INFO [train.py:561] (1/4) Epoch 3400, batch 6, global_batch_idx: 54390, batch size: 106, loss[dur_loss=0.1817, prior_loss=0.9708, diff_loss=0.2822, 
tot_loss=1.435, over 106.00 samples.], tot_loss[dur_loss=0.1812, prior_loss=0.9703, diff_loss=0.3845, tot_loss=1.536, over 1142.00 samples.], 2024-10-21 22:47:56,690 INFO [train.py:682] (1/4) Start epoch 3401 2024-10-21 22:48:05,592 INFO [train.py:561] (1/4) Epoch 3401, batch 0, global_batch_idx: 54400, batch size: 108, loss[dur_loss=0.1865, prior_loss=0.9714, diff_loss=0.311, tot_loss=1.469, over 108.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9714, diff_loss=0.311, tot_loss=1.469, over 108.00 samples.], 2024-10-21 22:48:19,920 INFO [train.py:561] (1/4) Epoch 3401, batch 10, global_batch_idx: 54410, batch size: 111, loss[dur_loss=0.1866, prior_loss=0.9717, diff_loss=0.296, tot_loss=1.454, over 111.00 samples.], tot_loss[dur_loss=0.1817, prior_loss=0.9705, diff_loss=0.3588, tot_loss=1.511, over 1656.00 samples.], 2024-10-21 22:48:27,101 INFO [train.py:682] (1/4) Start epoch 3402 2024-10-21 22:48:41,100 INFO [train.py:561] (1/4) Epoch 3402, batch 4, global_batch_idx: 54420, batch size: 189, loss[dur_loss=0.1818, prior_loss=0.9707, diff_loss=0.2945, tot_loss=1.447, over 189.00 samples.], tot_loss[dur_loss=0.1788, prior_loss=0.97, diff_loss=0.3899, tot_loss=1.539, over 937.00 samples.], 2024-10-21 22:48:56,069 INFO [train.py:561] (1/4) Epoch 3402, batch 14, global_batch_idx: 54430, batch size: 142, loss[dur_loss=0.1837, prior_loss=0.9706, diff_loss=0.3118, tot_loss=1.466, over 142.00 samples.], tot_loss[dur_loss=0.1819, prior_loss=0.9706, diff_loss=0.3341, tot_loss=1.487, over 2210.00 samples.], 2024-10-21 22:48:57,506 INFO [train.py:682] (1/4) Start epoch 3403 2024-10-21 22:49:17,778 INFO [train.py:561] (1/4) Epoch 3403, batch 8, global_batch_idx: 54440, batch size: 170, loss[dur_loss=0.1849, prior_loss=0.9712, diff_loss=0.3333, tot_loss=1.489, over 170.00 samples.], tot_loss[dur_loss=0.1821, prior_loss=0.9705, diff_loss=0.3673, tot_loss=1.52, over 1432.00 samples.], 2024-10-21 22:49:27,976 INFO [train.py:682] (1/4) Start epoch 3404 2024-10-21 22:49:39,773 INFO [train.py:561] (1/4) Epoch 3404, batch 2, global_batch_idx: 54450, batch size: 203, loss[dur_loss=0.1864, prior_loss=0.9709, diff_loss=0.2947, tot_loss=1.452, over 203.00 samples.], tot_loss[dur_loss=0.1856, prior_loss=0.971, diff_loss=0.302, tot_loss=1.459, over 442.00 samples.], 2024-10-21 22:49:54,060 INFO [train.py:561] (1/4) Epoch 3404, batch 12, global_batch_idx: 54460, batch size: 152, loss[dur_loss=0.1845, prior_loss=0.971, diff_loss=0.2811, tot_loss=1.437, over 152.00 samples.], tot_loss[dur_loss=0.1831, prior_loss=0.9706, diff_loss=0.3537, tot_loss=1.507, over 1966.00 samples.], 2024-10-21 22:49:58,523 INFO [train.py:682] (1/4) Start epoch 3405 2024-10-21 22:50:15,544 INFO [train.py:561] (1/4) Epoch 3405, batch 6, global_batch_idx: 54470, batch size: 106, loss[dur_loss=0.1843, prior_loss=0.9709, diff_loss=0.3205, tot_loss=1.476, over 106.00 samples.], tot_loss[dur_loss=0.1806, prior_loss=0.9702, diff_loss=0.394, tot_loss=1.545, over 1142.00 samples.], 2024-10-21 22:50:28,619 INFO [train.py:682] (1/4) Start epoch 3406 2024-10-21 22:50:37,770 INFO [train.py:561] (1/4) Epoch 3406, batch 0, global_batch_idx: 54480, batch size: 108, loss[dur_loss=0.1853, prior_loss=0.9714, diff_loss=0.285, tot_loss=1.442, over 108.00 samples.], tot_loss[dur_loss=0.1853, prior_loss=0.9714, diff_loss=0.285, tot_loss=1.442, over 108.00 samples.], 2024-10-21 22:50:52,098 INFO [train.py:561] (1/4) Epoch 3406, batch 10, global_batch_idx: 54490, batch size: 111, loss[dur_loss=0.1868, prior_loss=0.9716, diff_loss=0.2762, tot_loss=1.435, over 
111.00 samples.], tot_loss[dur_loss=0.1824, prior_loss=0.9704, diff_loss=0.3548, tot_loss=1.508, over 1656.00 samples.], 2024-10-21 22:50:59,230 INFO [train.py:682] (1/4) Start epoch 3407 2024-10-21 22:51:13,051 INFO [train.py:561] (1/4) Epoch 3407, batch 4, global_batch_idx: 54500, batch size: 189, loss[dur_loss=0.1812, prior_loss=0.9707, diff_loss=0.3401, tot_loss=1.492, over 189.00 samples.], tot_loss[dur_loss=0.18, prior_loss=0.97, diff_loss=0.4083, tot_loss=1.558, over 937.00 samples.], 2024-10-21 22:51:28,064 INFO [train.py:561] (1/4) Epoch 3407, batch 14, global_batch_idx: 54510, batch size: 142, loss[dur_loss=0.1835, prior_loss=0.9705, diff_loss=0.2808, tot_loss=1.435, over 142.00 samples.], tot_loss[dur_loss=0.1825, prior_loss=0.9705, diff_loss=0.3465, tot_loss=1.499, over 2210.00 samples.], 2024-10-21 22:51:29,510 INFO [train.py:682] (1/4) Start epoch 3408 2024-10-21 22:51:49,586 INFO [train.py:561] (1/4) Epoch 3408, batch 8, global_batch_idx: 54520, batch size: 170, loss[dur_loss=0.1855, prior_loss=0.9706, diff_loss=0.283, tot_loss=1.439, over 170.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.9703, diff_loss=0.368, tot_loss=1.52, over 1432.00 samples.], 2024-10-21 22:51:59,752 INFO [train.py:682] (1/4) Start epoch 3409 2024-10-21 22:52:11,685 INFO [train.py:561] (1/4) Epoch 3409, batch 2, global_batch_idx: 54530, batch size: 203, loss[dur_loss=0.1835, prior_loss=0.9707, diff_loss=0.3381, tot_loss=1.492, over 203.00 samples.], tot_loss[dur_loss=0.183, prior_loss=0.9707, diff_loss=0.3078, tot_loss=1.462, over 442.00 samples.], 2024-10-21 22:52:25,911 INFO [train.py:561] (1/4) Epoch 3409, batch 12, global_batch_idx: 54540, batch size: 152, loss[dur_loss=0.1809, prior_loss=0.9709, diff_loss=0.3148, tot_loss=1.467, over 152.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9704, diff_loss=0.3525, tot_loss=1.505, over 1966.00 samples.], 2024-10-21 22:52:30,392 INFO [train.py:682] (1/4) Start epoch 3410 2024-10-21 22:52:47,374 INFO [train.py:561] (1/4) Epoch 3410, batch 6, global_batch_idx: 54550, batch size: 106, loss[dur_loss=0.1832, prior_loss=0.9708, diff_loss=0.2595, tot_loss=1.414, over 106.00 samples.], tot_loss[dur_loss=0.1796, prior_loss=0.9702, diff_loss=0.3865, tot_loss=1.536, over 1142.00 samples.], 2024-10-21 22:53:00,470 INFO [train.py:682] (1/4) Start epoch 3411 2024-10-21 22:53:09,114 INFO [train.py:561] (1/4) Epoch 3411, batch 0, global_batch_idx: 54560, batch size: 108, loss[dur_loss=0.1869, prior_loss=0.9712, diff_loss=0.2576, tot_loss=1.416, over 108.00 samples.], tot_loss[dur_loss=0.1869, prior_loss=0.9712, diff_loss=0.2576, tot_loss=1.416, over 108.00 samples.], 2024-10-21 22:53:23,322 INFO [train.py:561] (1/4) Epoch 3411, batch 10, global_batch_idx: 54570, batch size: 111, loss[dur_loss=0.185, prior_loss=0.9716, diff_loss=0.2804, tot_loss=1.437, over 111.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9704, diff_loss=0.3632, tot_loss=1.515, over 1656.00 samples.], 2024-10-21 22:53:30,416 INFO [train.py:682] (1/4) Start epoch 3412 2024-10-21 22:53:44,651 INFO [train.py:561] (1/4) Epoch 3412, batch 4, global_batch_idx: 54580, batch size: 189, loss[dur_loss=0.1821, prior_loss=0.9707, diff_loss=0.3185, tot_loss=1.471, over 189.00 samples.], tot_loss[dur_loss=0.1801, prior_loss=0.9699, diff_loss=0.4156, tot_loss=1.566, over 937.00 samples.], 2024-10-21 22:53:59,560 INFO [train.py:561] (1/4) Epoch 3412, batch 14, global_batch_idx: 54590, batch size: 142, loss[dur_loss=0.1853, prior_loss=0.9705, diff_loss=0.2896, tot_loss=1.445, over 142.00 samples.], 
tot_loss[dur_loss=0.1822, prior_loss=0.9705, diff_loss=0.3535, tot_loss=1.506, over 2210.00 samples.], 2024-10-21 22:54:00,986 INFO [train.py:682] (1/4) Start epoch 3413 2024-10-21 22:54:21,118 INFO [train.py:561] (1/4) Epoch 3413, batch 8, global_batch_idx: 54600, batch size: 170, loss[dur_loss=0.1828, prior_loss=0.9707, diff_loss=0.3388, tot_loss=1.492, over 170.00 samples.], tot_loss[dur_loss=0.1812, prior_loss=0.9702, diff_loss=0.3707, tot_loss=1.522, over 1432.00 samples.], 2024-10-21 22:54:31,256 INFO [train.py:682] (1/4) Start epoch 3414 2024-10-21 22:54:42,616 INFO [train.py:561] (1/4) Epoch 3414, batch 2, global_batch_idx: 54610, batch size: 203, loss[dur_loss=0.1832, prior_loss=0.9705, diff_loss=0.3132, tot_loss=1.467, over 203.00 samples.], tot_loss[dur_loss=0.1836, prior_loss=0.9707, diff_loss=0.3011, tot_loss=1.455, over 442.00 samples.], 2024-10-21 22:54:56,974 INFO [train.py:561] (1/4) Epoch 3414, batch 12, global_batch_idx: 54620, batch size: 152, loss[dur_loss=0.1819, prior_loss=0.9709, diff_loss=0.2816, tot_loss=1.434, over 152.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9704, diff_loss=0.3531, tot_loss=1.506, over 1966.00 samples.], 2024-10-21 22:55:01,426 INFO [train.py:682] (1/4) Start epoch 3415 2024-10-21 22:55:18,434 INFO [train.py:561] (1/4) Epoch 3415, batch 6, global_batch_idx: 54630, batch size: 106, loss[dur_loss=0.1833, prior_loss=0.9708, diff_loss=0.3036, tot_loss=1.458, over 106.00 samples.], tot_loss[dur_loss=0.1806, prior_loss=0.9701, diff_loss=0.3908, tot_loss=1.542, over 1142.00 samples.], 2024-10-21 22:55:31,637 INFO [train.py:682] (1/4) Start epoch 3416 2024-10-21 22:55:40,655 INFO [train.py:561] (1/4) Epoch 3416, batch 0, global_batch_idx: 54640, batch size: 108, loss[dur_loss=0.1858, prior_loss=0.9712, diff_loss=0.3033, tot_loss=1.46, over 108.00 samples.], tot_loss[dur_loss=0.1858, prior_loss=0.9712, diff_loss=0.3033, tot_loss=1.46, over 108.00 samples.], 2024-10-21 22:55:55,050 INFO [train.py:561] (1/4) Epoch 3416, batch 10, global_batch_idx: 54650, batch size: 111, loss[dur_loss=0.1826, prior_loss=0.9717, diff_loss=0.2796, tot_loss=1.434, over 111.00 samples.], tot_loss[dur_loss=0.181, prior_loss=0.9704, diff_loss=0.3521, tot_loss=1.503, over 1656.00 samples.], 2024-10-21 22:56:02,157 INFO [train.py:682] (1/4) Start epoch 3417 2024-10-21 22:56:16,149 INFO [train.py:561] (1/4) Epoch 3417, batch 4, global_batch_idx: 54660, batch size: 189, loss[dur_loss=0.1813, prior_loss=0.9708, diff_loss=0.3081, tot_loss=1.46, over 189.00 samples.], tot_loss[dur_loss=0.1797, prior_loss=0.97, diff_loss=0.4019, tot_loss=1.552, over 937.00 samples.], 2024-10-21 22:56:31,184 INFO [train.py:561] (1/4) Epoch 3417, batch 14, global_batch_idx: 54670, batch size: 142, loss[dur_loss=0.183, prior_loss=0.9706, diff_loss=0.31, tot_loss=1.464, over 142.00 samples.], tot_loss[dur_loss=0.1814, prior_loss=0.9705, diff_loss=0.3514, tot_loss=1.503, over 2210.00 samples.], 2024-10-21 22:56:32,619 INFO [train.py:682] (1/4) Start epoch 3418 2024-10-21 22:56:52,878 INFO [train.py:561] (1/4) Epoch 3418, batch 8, global_batch_idx: 54680, batch size: 170, loss[dur_loss=0.1829, prior_loss=0.9708, diff_loss=0.3321, tot_loss=1.486, over 170.00 samples.], tot_loss[dur_loss=0.1808, prior_loss=0.9703, diff_loss=0.3784, tot_loss=1.529, over 1432.00 samples.], 2024-10-21 22:57:03,061 INFO [train.py:682] (1/4) Start epoch 3419 2024-10-21 22:57:14,434 INFO [train.py:561] (1/4) Epoch 3419, batch 2, global_batch_idx: 54690, batch size: 203, loss[dur_loss=0.1814, prior_loss=0.9704, 
diff_loss=0.3228, tot_loss=1.475, over 203.00 samples.], tot_loss[dur_loss=0.1821, prior_loss=0.9706, diff_loss=0.2993, tot_loss=1.452, over 442.00 samples.], 2024-10-21 22:57:28,689 INFO [train.py:561] (1/4) Epoch 3419, batch 12, global_batch_idx: 54700, batch size: 152, loss[dur_loss=0.1834, prior_loss=0.9709, diff_loss=0.2965, tot_loss=1.451, over 152.00 samples.], tot_loss[dur_loss=0.1819, prior_loss=0.9704, diff_loss=0.344, tot_loss=1.496, over 1966.00 samples.], 2024-10-21 22:57:33,142 INFO [train.py:682] (1/4) Start epoch 3420 2024-10-21 22:57:50,369 INFO [train.py:561] (1/4) Epoch 3420, batch 6, global_batch_idx: 54710, batch size: 106, loss[dur_loss=0.1814, prior_loss=0.9704, diff_loss=0.2487, tot_loss=1.401, over 106.00 samples.], tot_loss[dur_loss=0.1817, prior_loss=0.9703, diff_loss=0.3852, tot_loss=1.537, over 1142.00 samples.], 2024-10-21 22:58:03,408 INFO [train.py:682] (1/4) Start epoch 3421 2024-10-21 22:58:12,131 INFO [train.py:561] (1/4) Epoch 3421, batch 0, global_batch_idx: 54720, batch size: 108, loss[dur_loss=0.1845, prior_loss=0.9712, diff_loss=0.2662, tot_loss=1.422, over 108.00 samples.], tot_loss[dur_loss=0.1845, prior_loss=0.9712, diff_loss=0.2662, tot_loss=1.422, over 108.00 samples.], 2024-10-21 22:58:26,440 INFO [train.py:561] (1/4) Epoch 3421, batch 10, global_batch_idx: 54730, batch size: 111, loss[dur_loss=0.186, prior_loss=0.9717, diff_loss=0.2504, tot_loss=1.408, over 111.00 samples.], tot_loss[dur_loss=0.1808, prior_loss=0.9704, diff_loss=0.3537, tot_loss=1.505, over 1656.00 samples.], 2024-10-21 22:58:33,550 INFO [train.py:682] (1/4) Start epoch 3422 2024-10-21 22:58:47,499 INFO [train.py:561] (1/4) Epoch 3422, batch 4, global_batch_idx: 54740, batch size: 189, loss[dur_loss=0.1815, prior_loss=0.9706, diff_loss=0.3153, tot_loss=1.467, over 189.00 samples.], tot_loss[dur_loss=0.1799, prior_loss=0.97, diff_loss=0.4184, tot_loss=1.568, over 937.00 samples.], 2024-10-21 22:59:02,511 INFO [train.py:561] (1/4) Epoch 3422, batch 14, global_batch_idx: 54750, batch size: 142, loss[dur_loss=0.1834, prior_loss=0.9705, diff_loss=0.3011, tot_loss=1.455, over 142.00 samples.], tot_loss[dur_loss=0.1817, prior_loss=0.9704, diff_loss=0.3498, tot_loss=1.502, over 2210.00 samples.], 2024-10-21 22:59:03,949 INFO [train.py:682] (1/4) Start epoch 3423 2024-10-21 22:59:23,875 INFO [train.py:561] (1/4) Epoch 3423, batch 8, global_batch_idx: 54760, batch size: 170, loss[dur_loss=0.1846, prior_loss=0.9709, diff_loss=0.278, tot_loss=1.434, over 170.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9703, diff_loss=0.3593, tot_loss=1.51, over 1432.00 samples.], 2024-10-21 22:59:34,094 INFO [train.py:682] (1/4) Start epoch 3424 2024-10-21 22:59:45,470 INFO [train.py:561] (1/4) Epoch 3424, batch 2, global_batch_idx: 54770, batch size: 203, loss[dur_loss=0.1854, prior_loss=0.9706, diff_loss=0.3469, tot_loss=1.503, over 203.00 samples.], tot_loss[dur_loss=0.1843, prior_loss=0.9707, diff_loss=0.3114, tot_loss=1.466, over 442.00 samples.], 2024-10-21 23:00:00,253 INFO [train.py:561] (1/4) Epoch 3424, batch 12, global_batch_idx: 54780, batch size: 152, loss[dur_loss=0.1806, prior_loss=0.9709, diff_loss=0.3208, tot_loss=1.472, over 152.00 samples.], tot_loss[dur_loss=0.1824, prior_loss=0.9704, diff_loss=0.3492, tot_loss=1.502, over 1966.00 samples.], 2024-10-21 23:00:04,693 INFO [train.py:682] (1/4) Start epoch 3425 2024-10-21 23:00:21,829 INFO [train.py:561] (1/4) Epoch 3425, batch 6, global_batch_idx: 54790, batch size: 106, loss[dur_loss=0.1832, prior_loss=0.971, diff_loss=0.2649, 
tot_loss=1.419, over 106.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.9702, diff_loss=0.3805, tot_loss=1.533, over 1142.00 samples.], 2024-10-21 23:00:35,047 INFO [train.py:682] (1/4) Start epoch 3426 2024-10-21 23:00:43,739 INFO [train.py:561] (1/4) Epoch 3426, batch 0, global_batch_idx: 54800, batch size: 108, loss[dur_loss=0.1848, prior_loss=0.9711, diff_loss=0.2957, tot_loss=1.452, over 108.00 samples.], tot_loss[dur_loss=0.1848, prior_loss=0.9711, diff_loss=0.2957, tot_loss=1.452, over 108.00 samples.], 2024-10-21 23:00:58,118 INFO [train.py:561] (1/4) Epoch 3426, batch 10, global_batch_idx: 54810, batch size: 111, loss[dur_loss=0.1857, prior_loss=0.9719, diff_loss=0.3271, tot_loss=1.485, over 111.00 samples.], tot_loss[dur_loss=0.1813, prior_loss=0.9704, diff_loss=0.36, tot_loss=1.512, over 1656.00 samples.], 2024-10-21 23:01:05,266 INFO [train.py:682] (1/4) Start epoch 3427 2024-10-21 23:01:18,897 INFO [train.py:561] (1/4) Epoch 3427, batch 4, global_batch_idx: 54820, batch size: 189, loss[dur_loss=0.1816, prior_loss=0.9708, diff_loss=0.3014, tot_loss=1.454, over 189.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.97, diff_loss=0.402, tot_loss=1.553, over 937.00 samples.], 2024-10-21 23:01:33,937 INFO [train.py:561] (1/4) Epoch 3427, batch 14, global_batch_idx: 54830, batch size: 142, loss[dur_loss=0.1827, prior_loss=0.9705, diff_loss=0.2854, tot_loss=1.439, over 142.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.9705, diff_loss=0.3408, tot_loss=1.493, over 2210.00 samples.], 2024-10-21 23:01:35,365 INFO [train.py:682] (1/4) Start epoch 3428 2024-10-21 23:01:55,773 INFO [train.py:561] (1/4) Epoch 3428, batch 8, global_batch_idx: 54840, batch size: 170, loss[dur_loss=0.1858, prior_loss=0.9709, diff_loss=0.3112, tot_loss=1.468, over 170.00 samples.], tot_loss[dur_loss=0.1815, prior_loss=0.9703, diff_loss=0.3806, tot_loss=1.532, over 1432.00 samples.], 2024-10-21 23:02:05,993 INFO [train.py:682] (1/4) Start epoch 3429 2024-10-21 23:02:17,657 INFO [train.py:561] (1/4) Epoch 3429, batch 2, global_batch_idx: 54850, batch size: 203, loss[dur_loss=0.1813, prior_loss=0.9707, diff_loss=0.3177, tot_loss=1.47, over 203.00 samples.], tot_loss[dur_loss=0.1828, prior_loss=0.9707, diff_loss=0.29, tot_loss=1.444, over 442.00 samples.], 2024-10-21 23:02:32,151 INFO [train.py:561] (1/4) Epoch 3429, batch 12, global_batch_idx: 54860, batch size: 152, loss[dur_loss=0.1814, prior_loss=0.9709, diff_loss=0.3242, tot_loss=1.476, over 152.00 samples.], tot_loss[dur_loss=0.1821, prior_loss=0.9705, diff_loss=0.3401, tot_loss=1.493, over 1966.00 samples.], 2024-10-21 23:02:36,606 INFO [train.py:682] (1/4) Start epoch 3430 2024-10-21 23:02:54,056 INFO [train.py:561] (1/4) Epoch 3430, batch 6, global_batch_idx: 54870, batch size: 106, loss[dur_loss=0.1834, prior_loss=0.9706, diff_loss=0.2624, tot_loss=1.416, over 106.00 samples.], tot_loss[dur_loss=0.1801, prior_loss=0.9701, diff_loss=0.3932, tot_loss=1.543, over 1142.00 samples.], 2024-10-21 23:03:07,250 INFO [train.py:682] (1/4) Start epoch 3431 2024-10-21 23:03:16,106 INFO [train.py:561] (1/4) Epoch 3431, batch 0, global_batch_idx: 54880, batch size: 108, loss[dur_loss=0.1867, prior_loss=0.9714, diff_loss=0.3035, tot_loss=1.462, over 108.00 samples.], tot_loss[dur_loss=0.1867, prior_loss=0.9714, diff_loss=0.3035, tot_loss=1.462, over 108.00 samples.], 2024-10-21 23:03:30,579 INFO [train.py:561] (1/4) Epoch 3431, batch 10, global_batch_idx: 54890, batch size: 111, loss[dur_loss=0.1854, prior_loss=0.9717, diff_loss=0.3135, tot_loss=1.471, over 
111.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9705, diff_loss=0.3606, tot_loss=1.513, over 1656.00 samples.], 2024-10-21 23:03:37,754 INFO [train.py:682] (1/4) Start epoch 3432 2024-10-21 23:03:51,590 INFO [train.py:561] (1/4) Epoch 3432, batch 4, global_batch_idx: 54900, batch size: 189, loss[dur_loss=0.1838, prior_loss=0.9709, diff_loss=0.3522, tot_loss=1.507, over 189.00 samples.], tot_loss[dur_loss=0.1801, prior_loss=0.97, diff_loss=0.4171, tot_loss=1.567, over 937.00 samples.], 2024-10-21 23:04:06,681 INFO [train.py:561] (1/4) Epoch 3432, batch 14, global_batch_idx: 54910, batch size: 142, loss[dur_loss=0.1826, prior_loss=0.9704, diff_loss=0.2602, tot_loss=1.413, over 142.00 samples.], tot_loss[dur_loss=0.1827, prior_loss=0.9706, diff_loss=0.3436, tot_loss=1.497, over 2210.00 samples.], 2024-10-21 23:04:08,117 INFO [train.py:682] (1/4) Start epoch 3433 2024-10-21 23:04:28,197 INFO [train.py:561] (1/4) Epoch 3433, batch 8, global_batch_idx: 54920, batch size: 170, loss[dur_loss=0.184, prior_loss=0.9712, diff_loss=0.2845, tot_loss=1.44, over 170.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9704, diff_loss=0.3618, tot_loss=1.513, over 1432.00 samples.], 2024-10-21 23:04:38,429 INFO [train.py:682] (1/4) Start epoch 3434 2024-10-21 23:04:49,946 INFO [train.py:561] (1/4) Epoch 3434, batch 2, global_batch_idx: 54930, batch size: 203, loss[dur_loss=0.1822, prior_loss=0.9707, diff_loss=0.3047, tot_loss=1.458, over 203.00 samples.], tot_loss[dur_loss=0.183, prior_loss=0.9708, diff_loss=0.2935, tot_loss=1.447, over 442.00 samples.], 2024-10-21 23:05:04,295 INFO [train.py:561] (1/4) Epoch 3434, batch 12, global_batch_idx: 54940, batch size: 152, loss[dur_loss=0.1826, prior_loss=0.9711, diff_loss=0.3181, tot_loss=1.472, over 152.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.9706, diff_loss=0.3494, tot_loss=1.502, over 1966.00 samples.], 2024-10-21 23:05:08,745 INFO [train.py:682] (1/4) Start epoch 3435 2024-10-21 23:05:25,808 INFO [train.py:561] (1/4) Epoch 3435, batch 6, global_batch_idx: 54950, batch size: 106, loss[dur_loss=0.1819, prior_loss=0.9707, diff_loss=0.2878, tot_loss=1.44, over 106.00 samples.], tot_loss[dur_loss=0.1806, prior_loss=0.9702, diff_loss=0.3731, tot_loss=1.524, over 1142.00 samples.], 2024-10-21 23:05:39,024 INFO [train.py:682] (1/4) Start epoch 3436 2024-10-21 23:05:48,070 INFO [train.py:561] (1/4) Epoch 3436, batch 0, global_batch_idx: 54960, batch size: 108, loss[dur_loss=0.1833, prior_loss=0.9711, diff_loss=0.2749, tot_loss=1.429, over 108.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9711, diff_loss=0.2749, tot_loss=1.429, over 108.00 samples.], 2024-10-21 23:06:02,334 INFO [train.py:561] (1/4) Epoch 3436, batch 10, global_batch_idx: 54970, batch size: 111, loss[dur_loss=0.1844, prior_loss=0.9718, diff_loss=0.243, tot_loss=1.399, over 111.00 samples.], tot_loss[dur_loss=0.1814, prior_loss=0.9704, diff_loss=0.3619, tot_loss=1.514, over 1656.00 samples.], 2024-10-21 23:06:09,447 INFO [train.py:682] (1/4) Start epoch 3437 2024-10-21 23:06:23,308 INFO [train.py:561] (1/4) Epoch 3437, batch 4, global_batch_idx: 54980, batch size: 189, loss[dur_loss=0.1815, prior_loss=0.9707, diff_loss=0.345, tot_loss=1.497, over 189.00 samples.], tot_loss[dur_loss=0.1791, prior_loss=0.97, diff_loss=0.4042, tot_loss=1.553, over 937.00 samples.], 2024-10-21 23:06:38,320 INFO [train.py:561] (1/4) Epoch 3437, batch 14, global_batch_idx: 54990, batch size: 142, loss[dur_loss=0.1844, prior_loss=0.9704, diff_loss=0.2829, tot_loss=1.438, over 142.00 samples.], 
tot_loss[dur_loss=0.1816, prior_loss=0.9705, diff_loss=0.3417, tot_loss=1.494, over 2210.00 samples.], 2024-10-21 23:06:39,751 INFO [train.py:682] (1/4) Start epoch 3438 2024-10-21 23:06:59,742 INFO [train.py:561] (1/4) Epoch 3438, batch 8, global_batch_idx: 55000, batch size: 170, loss[dur_loss=0.1842, prior_loss=0.9707, diff_loss=0.327, tot_loss=1.482, over 170.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9702, diff_loss=0.3724, tot_loss=1.523, over 1432.00 samples.], 2024-10-21 23:07:09,903 INFO [train.py:682] (1/4) Start epoch 3439 2024-10-21 23:07:21,239 INFO [train.py:561] (1/4) Epoch 3439, batch 2, global_batch_idx: 55010, batch size: 203, loss[dur_loss=0.1827, prior_loss=0.9706, diff_loss=0.3029, tot_loss=1.456, over 203.00 samples.], tot_loss[dur_loss=0.1831, prior_loss=0.9706, diff_loss=0.286, tot_loss=1.44, over 442.00 samples.], 2024-10-21 23:07:35,628 INFO [train.py:561] (1/4) Epoch 3439, batch 12, global_batch_idx: 55020, batch size: 152, loss[dur_loss=0.1827, prior_loss=0.971, diff_loss=0.3325, tot_loss=1.486, over 152.00 samples.], tot_loss[dur_loss=0.1814, prior_loss=0.9704, diff_loss=0.349, tot_loss=1.501, over 1966.00 samples.], 2024-10-21 23:07:40,102 INFO [train.py:682] (1/4) Start epoch 3440 2024-10-21 23:07:57,144 INFO [train.py:561] (1/4) Epoch 3440, batch 6, global_batch_idx: 55030, batch size: 106, loss[dur_loss=0.1808, prior_loss=0.9705, diff_loss=0.289, tot_loss=1.44, over 106.00 samples.], tot_loss[dur_loss=0.1793, prior_loss=0.97, diff_loss=0.386, tot_loss=1.535, over 1142.00 samples.], 2024-10-21 23:08:10,345 INFO [train.py:682] (1/4) Start epoch 3441 2024-10-21 23:08:19,366 INFO [train.py:561] (1/4) Epoch 3441, batch 0, global_batch_idx: 55040, batch size: 108, loss[dur_loss=0.1887, prior_loss=0.9711, diff_loss=0.3325, tot_loss=1.492, over 108.00 samples.], tot_loss[dur_loss=0.1887, prior_loss=0.9711, diff_loss=0.3325, tot_loss=1.492, over 108.00 samples.], 2024-10-21 23:08:33,680 INFO [train.py:561] (1/4) Epoch 3441, batch 10, global_batch_idx: 55050, batch size: 111, loss[dur_loss=0.1826, prior_loss=0.9716, diff_loss=0.3099, tot_loss=1.464, over 111.00 samples.], tot_loss[dur_loss=0.181, prior_loss=0.9704, diff_loss=0.3543, tot_loss=1.506, over 1656.00 samples.], 2024-10-21 23:08:40,855 INFO [train.py:682] (1/4) Start epoch 3442 2024-10-21 23:08:54,975 INFO [train.py:561] (1/4) Epoch 3442, batch 4, global_batch_idx: 55060, batch size: 189, loss[dur_loss=0.1825, prior_loss=0.971, diff_loss=0.3266, tot_loss=1.48, over 189.00 samples.], tot_loss[dur_loss=0.1801, prior_loss=0.97, diff_loss=0.4154, tot_loss=1.565, over 937.00 samples.], 2024-10-21 23:09:09,978 INFO [train.py:561] (1/4) Epoch 3442, batch 14, global_batch_idx: 55070, batch size: 142, loss[dur_loss=0.186, prior_loss=0.9705, diff_loss=0.2506, tot_loss=1.407, over 142.00 samples.], tot_loss[dur_loss=0.1825, prior_loss=0.9705, diff_loss=0.3457, tot_loss=1.499, over 2210.00 samples.], 2024-10-21 23:09:11,416 INFO [train.py:682] (1/4) Start epoch 3443 2024-10-21 23:09:31,558 INFO [train.py:561] (1/4) Epoch 3443, batch 8, global_batch_idx: 55080, batch size: 170, loss[dur_loss=0.1842, prior_loss=0.9708, diff_loss=0.3246, tot_loss=1.48, over 170.00 samples.], tot_loss[dur_loss=0.1803, prior_loss=0.9703, diff_loss=0.3806, tot_loss=1.531, over 1432.00 samples.], 2024-10-21 23:09:41,809 INFO [train.py:682] (1/4) Start epoch 3444 2024-10-21 23:09:53,070 INFO [train.py:561] (1/4) Epoch 3444, batch 2, global_batch_idx: 55090, batch size: 203, loss[dur_loss=0.1833, prior_loss=0.9705, diff_loss=0.3105, 
tot_loss=1.464, over 203.00 samples.], tot_loss[dur_loss=0.1838, prior_loss=0.9706, diff_loss=0.3121, tot_loss=1.466, over 442.00 samples.], 2024-10-21 23:10:07,496 INFO [train.py:561] (1/4) Epoch 3444, batch 12, global_batch_idx: 55100, batch size: 152, loss[dur_loss=0.1832, prior_loss=0.9708, diff_loss=0.2823, tot_loss=1.436, over 152.00 samples.], tot_loss[dur_loss=0.1819, prior_loss=0.9704, diff_loss=0.3536, tot_loss=1.506, over 1966.00 samples.], 2024-10-21 23:10:11,975 INFO [train.py:682] (1/4) Start epoch 3445 2024-10-21 23:10:29,295 INFO [train.py:561] (1/4) Epoch 3445, batch 6, global_batch_idx: 55110, batch size: 106, loss[dur_loss=0.1794, prior_loss=0.9703, diff_loss=0.2833, tot_loss=1.433, over 106.00 samples.], tot_loss[dur_loss=0.1792, prior_loss=0.97, diff_loss=0.3845, tot_loss=1.534, over 1142.00 samples.], 2024-10-21 23:10:42,507 INFO [train.py:682] (1/4) Start epoch 3446 2024-10-21 23:10:51,067 INFO [train.py:561] (1/4) Epoch 3446, batch 0, global_batch_idx: 55120, batch size: 108, loss[dur_loss=0.1853, prior_loss=0.9711, diff_loss=0.3224, tot_loss=1.479, over 108.00 samples.], tot_loss[dur_loss=0.1853, prior_loss=0.9711, diff_loss=0.3224, tot_loss=1.479, over 108.00 samples.], 2024-10-21 23:11:05,398 INFO [train.py:561] (1/4) Epoch 3446, batch 10, global_batch_idx: 55130, batch size: 111, loss[dur_loss=0.1841, prior_loss=0.9717, diff_loss=0.3009, tot_loss=1.457, over 111.00 samples.], tot_loss[dur_loss=0.1811, prior_loss=0.9704, diff_loss=0.3557, tot_loss=1.507, over 1656.00 samples.], 2024-10-21 23:11:12,569 INFO [train.py:682] (1/4) Start epoch 3447 2024-10-21 23:11:26,261 INFO [train.py:561] (1/4) Epoch 3447, batch 4, global_batch_idx: 55140, batch size: 189, loss[dur_loss=0.1812, prior_loss=0.9706, diff_loss=0.3345, tot_loss=1.486, over 189.00 samples.], tot_loss[dur_loss=0.18, prior_loss=0.9699, diff_loss=0.4174, tot_loss=1.567, over 937.00 samples.], 2024-10-21 23:11:41,401 INFO [train.py:561] (1/4) Epoch 3447, batch 14, global_batch_idx: 55150, batch size: 142, loss[dur_loss=0.182, prior_loss=0.9704, diff_loss=0.2839, tot_loss=1.436, over 142.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9704, diff_loss=0.3443, tot_loss=1.497, over 2210.00 samples.], 2024-10-21 23:11:42,831 INFO [train.py:682] (1/4) Start epoch 3448 2024-10-21 23:12:03,194 INFO [train.py:561] (1/4) Epoch 3448, batch 8, global_batch_idx: 55160, batch size: 170, loss[dur_loss=0.1837, prior_loss=0.9707, diff_loss=0.3353, tot_loss=1.49, over 170.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9702, diff_loss=0.365, tot_loss=1.516, over 1432.00 samples.], 2024-10-21 23:12:13,470 INFO [train.py:682] (1/4) Start epoch 3449 2024-10-21 23:12:24,936 INFO [train.py:561] (1/4) Epoch 3449, batch 2, global_batch_idx: 55170, batch size: 203, loss[dur_loss=0.1851, prior_loss=0.9708, diff_loss=0.3326, tot_loss=1.488, over 203.00 samples.], tot_loss[dur_loss=0.1846, prior_loss=0.9708, diff_loss=0.3001, tot_loss=1.456, over 442.00 samples.], 2024-10-21 23:12:39,410 INFO [train.py:561] (1/4) Epoch 3449, batch 12, global_batch_idx: 55180, batch size: 152, loss[dur_loss=0.1837, prior_loss=0.9711, diff_loss=0.2723, tot_loss=1.427, over 152.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9706, diff_loss=0.3467, tot_loss=1.5, over 1966.00 samples.], 2024-10-21 23:12:43,941 INFO [train.py:682] (1/4) Start epoch 3450 2024-10-21 23:13:00,911 INFO [train.py:561] (1/4) Epoch 3450, batch 6, global_batch_idx: 55190, batch size: 106, loss[dur_loss=0.1794, prior_loss=0.9704, diff_loss=0.2641, tot_loss=1.414, over 
106.00 samples.], tot_loss[dur_loss=0.1809, prior_loss=0.9702, diff_loss=0.3902, tot_loss=1.541, over 1142.00 samples.], 2024-10-21 23:13:14,065 INFO [train.py:682] (1/4) Start epoch 3451 2024-10-21 23:13:23,138 INFO [train.py:561] (1/4) Epoch 3451, batch 0, global_batch_idx: 55200, batch size: 108, loss[dur_loss=0.1843, prior_loss=0.9713, diff_loss=0.2889, tot_loss=1.444, over 108.00 samples.], tot_loss[dur_loss=0.1843, prior_loss=0.9713, diff_loss=0.2889, tot_loss=1.444, over 108.00 samples.], 2024-10-21 23:13:37,479 INFO [train.py:561] (1/4) Epoch 3451, batch 10, global_batch_idx: 55210, batch size: 111, loss[dur_loss=0.1852, prior_loss=0.9717, diff_loss=0.2866, tot_loss=1.443, over 111.00 samples.], tot_loss[dur_loss=0.1808, prior_loss=0.9703, diff_loss=0.362, tot_loss=1.513, over 1656.00 samples.], 2024-10-21 23:13:44,631 INFO [train.py:682] (1/4) Start epoch 3452 2024-10-21 23:13:58,534 INFO [train.py:561] (1/4) Epoch 3452, batch 4, global_batch_idx: 55220, batch size: 189, loss[dur_loss=0.1834, prior_loss=0.9706, diff_loss=0.2967, tot_loss=1.451, over 189.00 samples.], tot_loss[dur_loss=0.1796, prior_loss=0.9699, diff_loss=0.4096, tot_loss=1.559, over 937.00 samples.], 2024-10-21 23:14:13,591 INFO [train.py:561] (1/4) Epoch 3452, batch 14, global_batch_idx: 55230, batch size: 142, loss[dur_loss=0.1844, prior_loss=0.9707, diff_loss=0.2701, tot_loss=1.425, over 142.00 samples.], tot_loss[dur_loss=0.1814, prior_loss=0.9705, diff_loss=0.339, tot_loss=1.491, over 2210.00 samples.], 2024-10-21 23:14:15,031 INFO [train.py:682] (1/4) Start epoch 3453 2024-10-21 23:14:35,577 INFO [train.py:561] (1/4) Epoch 3453, batch 8, global_batch_idx: 55240, batch size: 170, loss[dur_loss=0.1832, prior_loss=0.9709, diff_loss=0.2746, tot_loss=1.429, over 170.00 samples.], tot_loss[dur_loss=0.1815, prior_loss=0.9704, diff_loss=0.3571, tot_loss=1.509, over 1432.00 samples.], 2024-10-21 23:14:45,826 INFO [train.py:682] (1/4) Start epoch 3454 2024-10-21 23:14:57,432 INFO [train.py:561] (1/4) Epoch 3454, batch 2, global_batch_idx: 55250, batch size: 203, loss[dur_loss=0.1859, prior_loss=0.9708, diff_loss=0.3595, tot_loss=1.516, over 203.00 samples.], tot_loss[dur_loss=0.1836, prior_loss=0.9707, diff_loss=0.3104, tot_loss=1.465, over 442.00 samples.], 2024-10-21 23:15:11,735 INFO [train.py:561] (1/4) Epoch 3454, batch 12, global_batch_idx: 55260, batch size: 152, loss[dur_loss=0.179, prior_loss=0.9709, diff_loss=0.2866, tot_loss=1.437, over 152.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9704, diff_loss=0.3518, tot_loss=1.504, over 1966.00 samples.], 2024-10-21 23:15:16,250 INFO [train.py:682] (1/4) Start epoch 3455 2024-10-21 23:15:33,394 INFO [train.py:561] (1/4) Epoch 3455, batch 6, global_batch_idx: 55270, batch size: 106, loss[dur_loss=0.1811, prior_loss=0.9704, diff_loss=0.289, tot_loss=1.441, over 106.00 samples.], tot_loss[dur_loss=0.1796, prior_loss=0.9701, diff_loss=0.3885, tot_loss=1.538, over 1142.00 samples.], 2024-10-21 23:15:46,663 INFO [train.py:682] (1/4) Start epoch 3456 2024-10-21 23:15:55,449 INFO [train.py:561] (1/4) Epoch 3456, batch 0, global_batch_idx: 55280, batch size: 108, loss[dur_loss=0.1865, prior_loss=0.9711, diff_loss=0.2367, tot_loss=1.394, over 108.00 samples.], tot_loss[dur_loss=0.1865, prior_loss=0.9711, diff_loss=0.2367, tot_loss=1.394, over 108.00 samples.], 2024-10-21 23:16:09,825 INFO [train.py:561] (1/4) Epoch 3456, batch 10, global_batch_idx: 55290, batch size: 111, loss[dur_loss=0.183, prior_loss=0.9716, diff_loss=0.2657, tot_loss=1.42, over 111.00 samples.], 
tot_loss[dur_loss=0.1811, prior_loss=0.9703, diff_loss=0.352, tot_loss=1.503, over 1656.00 samples.],
2024-10-21 23:16:16,984 INFO [train.py:682] (1/4) Start epoch 3457
2024-10-21 23:16:30,719 INFO [train.py:561] (1/4) Epoch 3457, batch 4, global_batch_idx: 55300, batch size: 189, loss[dur_loss=0.1833, prior_loss=0.9708, diff_loss=0.295, tot_loss=1.449, over 189.00 samples.], tot_loss[dur_loss=0.1787, prior_loss=0.9699, diff_loss=0.4188, tot_loss=1.567, over 937.00 samples.],
2024-10-21 23:16:45,758 INFO [train.py:561] (1/4) Epoch 3457, batch 14, global_batch_idx: 55310, batch size: 142, loss[dur_loss=0.1837, prior_loss=0.9707, diff_loss=0.2942, tot_loss=1.449, over 142.00 samples.], tot_loss[dur_loss=0.1812, prior_loss=0.9705, diff_loss=0.3414, tot_loss=1.493, over 2210.00 samples.],
2024-10-21 23:16:47,205 INFO [train.py:682] (1/4) Start epoch 3458
2024-10-21 23:17:07,401 INFO [train.py:561] (1/4) Epoch 3458, batch 8, global_batch_idx: 55320, batch size: 170, loss[dur_loss=0.1807, prior_loss=0.9706, diff_loss=0.3047, tot_loss=1.456, over 170.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9702, diff_loss=0.3653, tot_loss=1.516, over 1432.00 samples.],
2024-10-21 23:17:17,666 INFO [train.py:682] (1/4) Start epoch 3459
2024-10-21 23:17:29,267 INFO [train.py:561] (1/4) Epoch 3459, batch 2, global_batch_idx: 55330, batch size: 203, loss[dur_loss=0.1836, prior_loss=0.9705, diff_loss=0.3215, tot_loss=1.476, over 203.00 samples.], tot_loss[dur_loss=0.1834, prior_loss=0.9707, diff_loss=0.3124, tot_loss=1.466, over 442.00 samples.],
2024-10-21 23:17:43,614 INFO [train.py:561] (1/4) Epoch 3459, batch 12, global_batch_idx: 55340, batch size: 152, loss[dur_loss=0.1844, prior_loss=0.9709, diff_loss=0.301, tot_loss=1.456, over 152.00 samples.], tot_loss[dur_loss=0.1818, prior_loss=0.9703, diff_loss=0.3504, tot_loss=1.503, over 1966.00 samples.],
2024-10-21 23:17:48,097 INFO [train.py:682] (1/4) Start epoch 3460
2024-10-21 23:18:05,010 INFO [train.py:561] (1/4) Epoch 3460, batch 6, global_batch_idx: 55350, batch size: 106, loss[dur_loss=0.179, prior_loss=0.9705, diff_loss=0.2454, tot_loss=1.395, over 106.00 samples.], tot_loss[dur_loss=0.1792, prior_loss=0.9701, diff_loss=0.3731, tot_loss=1.522, over 1142.00 samples.],
2024-10-21 23:18:18,160 INFO [train.py:682] (1/4) Start epoch 3461
2024-10-21 23:18:27,226 INFO [train.py:561] (1/4) Epoch 3461, batch 0, global_batch_idx: 55360, batch size: 108, loss[dur_loss=0.1846, prior_loss=0.971, diff_loss=0.2812, tot_loss=1.437, over 108.00 samples.], tot_loss[dur_loss=0.1846, prior_loss=0.971, diff_loss=0.2812, tot_loss=1.437, over 108.00 samples.],
2024-10-21 23:18:41,432 INFO [train.py:561] (1/4) Epoch 3461, batch 10, global_batch_idx: 55370, batch size: 111, loss[dur_loss=0.1818, prior_loss=0.9716, diff_loss=0.3042, tot_loss=1.458, over 111.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9702, diff_loss=0.3565, tot_loss=1.507, over 1656.00 samples.],
2024-10-21 23:18:48,523 INFO [train.py:682] (1/4) Start epoch 3462
2024-10-21 23:19:02,190 INFO [train.py:561] (1/4) Epoch 3462, batch 4, global_batch_idx: 55380, batch size: 189, loss[dur_loss=0.1823, prior_loss=0.9706, diff_loss=0.3046, tot_loss=1.457, over 189.00 samples.], tot_loss[dur_loss=0.1796, prior_loss=0.9698, diff_loss=0.419, tot_loss=1.568, over 937.00 samples.],
2024-10-21 23:19:17,035 INFO [train.py:561] (1/4) Epoch 3462, batch 14, global_batch_idx: 55390, batch size: 142, loss[dur_loss=0.1836, prior_loss=0.9704, diff_loss=0.2865, tot_loss=1.441, over 142.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9703, diff_loss=0.3502, tot_loss=1.502, over 2210.00 samples.],
2024-10-21 23:19:18,465 INFO [train.py:682] (1/4) Start epoch 3463
2024-10-21 23:19:38,343 INFO [train.py:561] (1/4) Epoch 3463, batch 8, global_batch_idx: 55400, batch size: 170, loss[dur_loss=0.1839, prior_loss=0.9704, diff_loss=0.3049, tot_loss=1.459, over 170.00 samples.], tot_loss[dur_loss=0.1815, prior_loss=0.9701, diff_loss=0.3769, tot_loss=1.528, over 1432.00 samples.],
2024-10-21 23:19:48,549 INFO [train.py:682] (1/4) Start epoch 3464
2024-10-21 23:20:00,108 INFO [train.py:561] (1/4) Epoch 3464, batch 2, global_batch_idx: 55410, batch size: 203, loss[dur_loss=0.1808, prior_loss=0.9705, diff_loss=0.3136, tot_loss=1.465, over 203.00 samples.], tot_loss[dur_loss=0.181, prior_loss=0.9706, diff_loss=0.2878, tot_loss=1.439, over 442.00 samples.],
2024-10-21 23:20:14,265 INFO [train.py:561] (1/4) Epoch 3464, batch 12, global_batch_idx: 55420, batch size: 152, loss[dur_loss=0.1796, prior_loss=0.9707, diff_loss=0.323, tot_loss=1.473, over 152.00 samples.], tot_loss[dur_loss=0.1804, prior_loss=0.9703, diff_loss=0.3519, tot_loss=1.503, over 1966.00 samples.],
2024-10-21 23:20:18,772 INFO [train.py:682] (1/4) Start epoch 3465
2024-10-21 23:20:36,094 INFO [train.py:561] (1/4) Epoch 3465, batch 6, global_batch_idx: 55430, batch size: 106, loss[dur_loss=0.1824, prior_loss=0.9705, diff_loss=0.2841, tot_loss=1.437, over 106.00 samples.], tot_loss[dur_loss=0.1793, prior_loss=0.97, diff_loss=0.3758, tot_loss=1.525, over 1142.00 samples.],
2024-10-21 23:20:49,008 INFO [train.py:682] (1/4) Start epoch 3466
2024-10-21 23:20:57,701 INFO [train.py:561] (1/4) Epoch 3466, batch 0, global_batch_idx: 55440, batch size: 108, loss[dur_loss=0.1864, prior_loss=0.971, diff_loss=0.2555, tot_loss=1.413, over 108.00 samples.], tot_loss[dur_loss=0.1864, prior_loss=0.971, diff_loss=0.2555, tot_loss=1.413, over 108.00 samples.],
2024-10-21 23:21:11,759 INFO [train.py:561] (1/4) Epoch 3466, batch 10, global_batch_idx: 55450, batch size: 111, loss[dur_loss=0.1818, prior_loss=0.9716, diff_loss=0.316, tot_loss=1.469, over 111.00 samples.], tot_loss[dur_loss=0.1806, prior_loss=0.9702, diff_loss=0.3598, tot_loss=1.511, over 1656.00 samples.],
2024-10-21 23:21:18,844 INFO [train.py:682] (1/4) Start epoch 3467
2024-10-21 23:21:32,865 INFO [train.py:561] (1/4) Epoch 3467, batch 4, global_batch_idx: 55460, batch size: 189, loss[dur_loss=0.1815, prior_loss=0.9703, diff_loss=0.2973, tot_loss=1.449, over 189.00 samples.], tot_loss[dur_loss=0.1783, prior_loss=0.9698, diff_loss=0.3946, tot_loss=1.543, over 937.00 samples.],
2024-10-21 23:21:47,762 INFO [train.py:561] (1/4) Epoch 3467, batch 14, global_batch_idx: 55470, batch size: 142, loss[dur_loss=0.1845, prior_loss=0.9705, diff_loss=0.2809, tot_loss=1.436, over 142.00 samples.], tot_loss[dur_loss=0.181, prior_loss=0.9703, diff_loss=0.334, tot_loss=1.485, over 2210.00 samples.],
2024-10-21 23:21:49,189 INFO [train.py:682] (1/4) Start epoch 3468
2024-10-21 23:22:09,026 INFO [train.py:561] (1/4) Epoch 3468, batch 8, global_batch_idx: 55480, batch size: 170, loss[dur_loss=0.1845, prior_loss=0.9706, diff_loss=0.3372, tot_loss=1.492, over 170.00 samples.], tot_loss[dur_loss=0.18, prior_loss=0.9701, diff_loss=0.3758, tot_loss=1.526, over 1432.00 samples.],
2024-10-21 23:22:19,316 INFO [train.py:682] (1/4) Start epoch 3469
2024-10-21 23:22:30,877 INFO [train.py:561] (1/4) Epoch 3469, batch 2, global_batch_idx: 55490, batch size: 203, loss[dur_loss=0.1831, prior_loss=0.9705, diff_loss=0.3332, tot_loss=1.487, over 203.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9706, diff_loss=0.3088, tot_loss=1.462, over 442.00 samples.],
2024-10-21 23:22:45,096 INFO [train.py:561] (1/4) Epoch 3469, batch 12, global_batch_idx: 55500, batch size: 152, loss[dur_loss=0.1814, prior_loss=0.9707, diff_loss=0.3024, tot_loss=1.454, over 152.00 samples.], tot_loss[dur_loss=0.1804, prior_loss=0.9702, diff_loss=0.3458, tot_loss=1.496, over 1966.00 samples.],
2024-10-21 23:22:46,722 INFO [train.py:579] (1/4) Computing validation loss
2024-10-21 23:23:08,424 INFO [train.py:589] (1/4) Epoch 3469, validation: dur_loss=0.4577, prior_loss=1.038, diff_loss=0.3781, tot_loss=1.873, over 100.00 samples.
2024-10-21 23:23:08,425 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB
2024-10-21 23:23:11,323 INFO [train.py:682] (1/4) Start epoch 3470
2024-10-21 23:23:29,085 INFO [train.py:561] (1/4) Epoch 3470, batch 6, global_batch_idx: 55510, batch size: 106, loss[dur_loss=0.1834, prior_loss=0.9705, diff_loss=0.3272, tot_loss=1.481, over 106.00 samples.], tot_loss[dur_loss=0.1799, prior_loss=0.97, diff_loss=0.397, tot_loss=1.547, over 1142.00 samples.],
2024-10-21 23:23:42,199 INFO [train.py:682] (1/4) Start epoch 3471
2024-10-21 23:23:51,184 INFO [train.py:561] (1/4) Epoch 3471, batch 0, global_batch_idx: 55520, batch size: 108, loss[dur_loss=0.1826, prior_loss=0.971, diff_loss=0.2944, tot_loss=1.448, over 108.00 samples.], tot_loss[dur_loss=0.1826, prior_loss=0.971, diff_loss=0.2944, tot_loss=1.448, over 108.00 samples.],
2024-10-21 23:24:05,555 INFO [train.py:561] (1/4) Epoch 3471, batch 10, global_batch_idx: 55530, batch size: 111, loss[dur_loss=0.1828, prior_loss=0.9715, diff_loss=0.2936, tot_loss=1.448, over 111.00 samples.], tot_loss[dur_loss=0.18, prior_loss=0.9702, diff_loss=0.356, tot_loss=1.506, over 1656.00 samples.],
2024-10-21 23:24:12,705 INFO [train.py:682] (1/4) Start epoch 3472
2024-10-21 23:24:26,595 INFO [train.py:561] (1/4) Epoch 3472, batch 4, global_batch_idx: 55540, batch size: 189, loss[dur_loss=0.1833, prior_loss=0.9704, diff_loss=0.3363, tot_loss=1.49, over 189.00 samples.], tot_loss[dur_loss=0.1788, prior_loss=0.9698, diff_loss=0.4157, tot_loss=1.564, over 937.00 samples.],
2024-10-21 23:24:41,444 INFO [train.py:561] (1/4) Epoch 3472, batch 14, global_batch_idx: 55550, batch size: 142, loss[dur_loss=0.1858, prior_loss=0.9705, diff_loss=0.299, tot_loss=1.455, over 142.00 samples.], tot_loss[dur_loss=0.181, prior_loss=0.9703, diff_loss=0.3466, tot_loss=1.498, over 2210.00 samples.],
2024-10-21 23:24:42,884 INFO [train.py:682] (1/4) Start epoch 3473
2024-10-21 23:25:03,260 INFO [train.py:561] (1/4) Epoch 3473, batch 8, global_batch_idx: 55560, batch size: 170, loss[dur_loss=0.1827, prior_loss=0.9707, diff_loss=0.2933, tot_loss=1.447, over 170.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9702, diff_loss=0.3548, tot_loss=1.506, over 1432.00 samples.],
2024-10-21 23:25:13,381 INFO [train.py:682] (1/4) Start epoch 3474
2024-10-21 23:25:24,956 INFO [train.py:561] (1/4) Epoch 3474, batch 2, global_batch_idx: 55570, batch size: 203, loss[dur_loss=0.1836, prior_loss=0.9706, diff_loss=0.3406, tot_loss=1.495, over 203.00 samples.], tot_loss[dur_loss=0.1836, prior_loss=0.9706, diff_loss=0.3204, tot_loss=1.475, over 442.00 samples.],
2024-10-21 23:25:39,338 INFO [train.py:561] (1/4) Epoch 3474, batch 12, global_batch_idx: 55580, batch size: 152, loss[dur_loss=0.1806, prior_loss=0.9707, diff_loss=0.3051, tot_loss=1.456, over 152.00 samples.], tot_loss[dur_loss=0.1808, prior_loss=0.9703, diff_loss=0.3507, tot_loss=1.502, over 1966.00 samples.],
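The three loss components in each record sum to the logged tot_loss, up to the four-digit display rounding. A minimal Python check, with the values copied from the Epoch 3469, batch 12 record above (the variable names simply mirror the log fields):

    # Values from the Epoch 3469, batch 12 record above.
    dur_loss, prior_loss, diff_loss = 0.1814, 0.9707, 0.3024
    # 0.1814 + 0.9707 + 0.3024 = 1.4545, displayed as tot_loss=1.454
    assert abs((dur_loss + prior_loss + diff_loss) - 1.454) < 2e-3

The same decomposition holds for the validation record above: 0.4577 + 1.038 + 0.3781 = 1.8738, displayed as tot_loss=1.873.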
2024-10-21 23:25:43,823 INFO [train.py:682] (1/4) Start epoch 3475
2024-10-21 23:26:01,180 INFO [train.py:561] (1/4) Epoch 3475, batch 6, global_batch_idx: 55590, batch size: 106, loss[dur_loss=0.1792, prior_loss=0.9704, diff_loss=0.3317, tot_loss=1.481, over 106.00 samples.], tot_loss[dur_loss=0.1786, prior_loss=0.9699, diff_loss=0.3946, tot_loss=1.543, over 1142.00 samples.],
2024-10-21 23:26:14,416 INFO [train.py:682] (1/4) Start epoch 3476
2024-10-21 23:26:23,691 INFO [train.py:561] (1/4) Epoch 3476, batch 0, global_batch_idx: 55600, batch size: 108, loss[dur_loss=0.1831, prior_loss=0.9711, diff_loss=0.2875, tot_loss=1.442, over 108.00 samples.], tot_loss[dur_loss=0.1831, prior_loss=0.9711, diff_loss=0.2875, tot_loss=1.442, over 108.00 samples.],
2024-10-21 23:26:38,168 INFO [train.py:561] (1/4) Epoch 3476, batch 10, global_batch_idx: 55610, batch size: 111, loss[dur_loss=0.1809, prior_loss=0.9714, diff_loss=0.3157, tot_loss=1.468, over 111.00 samples.], tot_loss[dur_loss=0.1793, prior_loss=0.9702, diff_loss=0.3734, tot_loss=1.523, over 1656.00 samples.],
2024-10-21 23:26:45,302 INFO [train.py:682] (1/4) Start epoch 3477
2024-10-21 23:26:59,408 INFO [train.py:561] (1/4) Epoch 3477, batch 4, global_batch_idx: 55620, batch size: 189, loss[dur_loss=0.183, prior_loss=0.9705, diff_loss=0.3116, tot_loss=1.465, over 189.00 samples.], tot_loss[dur_loss=0.1778, prior_loss=0.9697, diff_loss=0.4101, tot_loss=1.558, over 937.00 samples.],
2024-10-21 23:27:14,325 INFO [train.py:561] (1/4) Epoch 3477, batch 14, global_batch_idx: 55630, batch size: 142, loss[dur_loss=0.1845, prior_loss=0.9705, diff_loss=0.3107, tot_loss=1.466, over 142.00 samples.], tot_loss[dur_loss=0.1808, prior_loss=0.9703, diff_loss=0.3446, tot_loss=1.496, over 2210.00 samples.],
2024-10-21 23:27:15,753 INFO [train.py:682] (1/4) Start epoch 3478
2024-10-21 23:27:36,348 INFO [train.py:561] (1/4) Epoch 3478, batch 8, global_batch_idx: 55640, batch size: 170, loss[dur_loss=0.1825, prior_loss=0.9706, diff_loss=0.2931, tot_loss=1.446, over 170.00 samples.], tot_loss[dur_loss=0.1803, prior_loss=0.9701, diff_loss=0.3698, tot_loss=1.52, over 1432.00 samples.],
2024-10-21 23:27:46,511 INFO [train.py:682] (1/4) Start epoch 3479
2024-10-21 23:27:58,017 INFO [train.py:561] (1/4) Epoch 3479, batch 2, global_batch_idx: 55650, batch size: 203, loss[dur_loss=0.1845, prior_loss=0.9706, diff_loss=0.3064, tot_loss=1.461, over 203.00 samples.], tot_loss[dur_loss=0.1835, prior_loss=0.9706, diff_loss=0.2866, tot_loss=1.441, over 442.00 samples.],
2024-10-21 23:28:12,340 INFO [train.py:561] (1/4) Epoch 3479, batch 12, global_batch_idx: 55660, batch size: 152, loss[dur_loss=0.1782, prior_loss=0.9708, diff_loss=0.2915, tot_loss=1.44, over 152.00 samples.], tot_loss[dur_loss=0.1813, prior_loss=0.9703, diff_loss=0.3395, tot_loss=1.491, over 1966.00 samples.],
2024-10-21 23:28:16,803 INFO [train.py:682] (1/4) Start epoch 3480
2024-10-21 23:28:34,049 INFO [train.py:561] (1/4) Epoch 3480, batch 6, global_batch_idx: 55670, batch size: 106, loss[dur_loss=0.1814, prior_loss=0.9704, diff_loss=0.2623, tot_loss=1.414, over 106.00 samples.], tot_loss[dur_loss=0.1793, prior_loss=0.9701, diff_loss=0.3917, tot_loss=1.541, over 1142.00 samples.],
2024-10-21 23:28:47,108 INFO [train.py:682] (1/4) Start epoch 3481
2024-10-21 23:28:56,305 INFO [train.py:561] (1/4) Epoch 3481, batch 0, global_batch_idx: 55680, batch size: 108, loss[dur_loss=0.1844, prior_loss=0.971, diff_loss=0.2418, tot_loss=1.397, over 108.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.971, diff_loss=0.2418, tot_loss=1.397, over 108.00 samples.],
2024-10-21 23:29:10,603 INFO [train.py:561] (1/4) Epoch 3481, batch 10, global_batch_idx: 55690, batch size: 111, loss[dur_loss=0.1867, prior_loss=0.9716, diff_loss=0.2869, tot_loss=1.445, over 111.00 samples.], tot_loss[dur_loss=0.1811, prior_loss=0.9703, diff_loss=0.3537, tot_loss=1.505, over 1656.00 samples.],
2024-10-21 23:29:17,726 INFO [train.py:682] (1/4) Start epoch 3482
2024-10-21 23:29:31,795 INFO [train.py:561] (1/4) Epoch 3482, batch 4, global_batch_idx: 55700, batch size: 189, loss[dur_loss=0.1812, prior_loss=0.9704, diff_loss=0.3114, tot_loss=1.463, over 189.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9698, diff_loss=0.4044, tot_loss=1.553, over 937.00 samples.],
2024-10-21 23:29:46,864 INFO [train.py:561] (1/4) Epoch 3482, batch 14, global_batch_idx: 55710, batch size: 142, loss[dur_loss=0.1853, prior_loss=0.9704, diff_loss=0.2666, tot_loss=1.422, over 142.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9703, diff_loss=0.3409, tot_loss=1.493, over 2210.00 samples.],
2024-10-21 23:29:48,324 INFO [train.py:682] (1/4) Start epoch 3483
2024-10-21 23:30:08,503 INFO [train.py:561] (1/4) Epoch 3483, batch 8, global_batch_idx: 55720, batch size: 170, loss[dur_loss=0.1858, prior_loss=0.9707, diff_loss=0.2808, tot_loss=1.437, over 170.00 samples.], tot_loss[dur_loss=0.1794, prior_loss=0.9701, diff_loss=0.3722, tot_loss=1.522, over 1432.00 samples.],
2024-10-21 23:30:18,730 INFO [train.py:682] (1/4) Start epoch 3484
2024-10-21 23:30:30,282 INFO [train.py:561] (1/4) Epoch 3484, batch 2, global_batch_idx: 55730, batch size: 203, loss[dur_loss=0.1811, prior_loss=0.9703, diff_loss=0.3194, tot_loss=1.471, over 203.00 samples.], tot_loss[dur_loss=0.1832, prior_loss=0.9705, diff_loss=0.2939, tot_loss=1.448, over 442.00 samples.],
2024-10-21 23:30:44,604 INFO [train.py:561] (1/4) Epoch 3484, batch 12, global_batch_idx: 55740, batch size: 152, loss[dur_loss=0.1794, prior_loss=0.9706, diff_loss=0.3036, tot_loss=1.454, over 152.00 samples.], tot_loss[dur_loss=0.1804, prior_loss=0.9702, diff_loss=0.3437, tot_loss=1.494, over 1966.00 samples.],
2024-10-21 23:30:49,097 INFO [train.py:682] (1/4) Start epoch 3485
2024-10-21 23:31:06,213 INFO [train.py:561] (1/4) Epoch 3485, batch 6, global_batch_idx: 55750, batch size: 106, loss[dur_loss=0.1813, prior_loss=0.9703, diff_loss=0.2935, tot_loss=1.445, over 106.00 samples.], tot_loss[dur_loss=0.1801, prior_loss=0.9699, diff_loss=0.3914, tot_loss=1.541, over 1142.00 samples.],
2024-10-21 23:31:19,370 INFO [train.py:682] (1/4) Start epoch 3486
2024-10-21 23:31:28,226 INFO [train.py:561] (1/4) Epoch 3486, batch 0, global_batch_idx: 55760, batch size: 108, loss[dur_loss=0.1888, prior_loss=0.9712, diff_loss=0.2972, tot_loss=1.457, over 108.00 samples.], tot_loss[dur_loss=0.1888, prior_loss=0.9712, diff_loss=0.2972, tot_loss=1.457, over 108.00 samples.],
2024-10-21 23:31:42,593 INFO [train.py:561] (1/4) Epoch 3486, batch 10, global_batch_idx: 55770, batch size: 111, loss[dur_loss=0.1839, prior_loss=0.9714, diff_loss=0.2563, tot_loss=1.412, over 111.00 samples.], tot_loss[dur_loss=0.1812, prior_loss=0.9703, diff_loss=0.3549, tot_loss=1.506, over 1656.00 samples.],
2024-10-21 23:31:49,709 INFO [train.py:682] (1/4) Start epoch 3487
2024-10-21 23:32:03,826 INFO [train.py:561] (1/4) Epoch 3487, batch 4, global_batch_idx: 55780, batch size: 189, loss[dur_loss=0.1833, prior_loss=0.9706, diff_loss=0.2998, tot_loss=1.454, over 189.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9698, diff_loss=0.4166, tot_loss=1.565, over 937.00 samples.],
2024-10-21 23:32:18,838 INFO [train.py:561] (1/4) Epoch 3487, batch 14, global_batch_idx: 55790, batch size: 142, loss[dur_loss=0.1801, prior_loss=0.9702, diff_loss=0.2754, tot_loss=1.426, over 142.00 samples.], tot_loss[dur_loss=0.1809, prior_loss=0.9703, diff_loss=0.3484, tot_loss=1.5, over 2210.00 samples.],
2024-10-21 23:32:20,278 INFO [train.py:682] (1/4) Start epoch 3488
2024-10-21 23:32:40,379 INFO [train.py:561] (1/4) Epoch 3488, batch 8, global_batch_idx: 55800, batch size: 170, loss[dur_loss=0.1826, prior_loss=0.9704, diff_loss=0.2978, tot_loss=1.451, over 170.00 samples.], tot_loss[dur_loss=0.1802, prior_loss=0.9701, diff_loss=0.375, tot_loss=1.525, over 1432.00 samples.],
2024-10-21 23:32:50,497 INFO [train.py:682] (1/4) Start epoch 3489
2024-10-21 23:33:01,900 INFO [train.py:561] (1/4) Epoch 3489, batch 2, global_batch_idx: 55810, batch size: 203, loss[dur_loss=0.1822, prior_loss=0.9706, diff_loss=0.3229, tot_loss=1.476, over 203.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9706, diff_loss=0.3144, tot_loss=1.468, over 442.00 samples.],
2024-10-21 23:33:16,213 INFO [train.py:561] (1/4) Epoch 3489, batch 12, global_batch_idx: 55820, batch size: 152, loss[dur_loss=0.1818, prior_loss=0.9707, diff_loss=0.2738, tot_loss=1.426, over 152.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9703, diff_loss=0.349, tot_loss=1.5, over 1966.00 samples.],
2024-10-21 23:33:20,654 INFO [train.py:682] (1/4) Start epoch 3490
2024-10-21 23:33:38,243 INFO [train.py:561] (1/4) Epoch 3490, batch 6, global_batch_idx: 55830, batch size: 106, loss[dur_loss=0.179, prior_loss=0.9705, diff_loss=0.3021, tot_loss=1.452, over 106.00 samples.], tot_loss[dur_loss=0.1799, prior_loss=0.97, diff_loss=0.3904, tot_loss=1.54, over 1142.00 samples.],
2024-10-21 23:33:51,334 INFO [train.py:682] (1/4) Start epoch 3491
2024-10-21 23:34:00,036 INFO [train.py:561] (1/4) Epoch 3491, batch 0, global_batch_idx: 55840, batch size: 108, loss[dur_loss=0.183, prior_loss=0.9709, diff_loss=0.305, tot_loss=1.459, over 108.00 samples.], tot_loss[dur_loss=0.183, prior_loss=0.9709, diff_loss=0.305, tot_loss=1.459, over 108.00 samples.],
2024-10-21 23:34:14,304 INFO [train.py:561] (1/4) Epoch 3491, batch 10, global_batch_idx: 55850, batch size: 111, loss[dur_loss=0.1841, prior_loss=0.9714, diff_loss=0.2957, tot_loss=1.451, over 111.00 samples.], tot_loss[dur_loss=0.1804, prior_loss=0.9702, diff_loss=0.3701, tot_loss=1.521, over 1656.00 samples.],
2024-10-21 23:34:21,387 INFO [train.py:682] (1/4) Start epoch 3492
2024-10-21 23:34:35,220 INFO [train.py:561] (1/4) Epoch 3492, batch 4, global_batch_idx: 55860, batch size: 189, loss[dur_loss=0.1828, prior_loss=0.9703, diff_loss=0.3299, tot_loss=1.483, over 189.00 samples.], tot_loss[dur_loss=0.179, prior_loss=0.9697, diff_loss=0.4183, tot_loss=1.567, over 937.00 samples.],
2024-10-21 23:34:50,223 INFO [train.py:561] (1/4) Epoch 3492, batch 14, global_batch_idx: 55870, batch size: 142, loss[dur_loss=0.1838, prior_loss=0.9703, diff_loss=0.2688, tot_loss=1.423, over 142.00 samples.], tot_loss[dur_loss=0.1808, prior_loss=0.9702, diff_loss=0.3431, tot_loss=1.494, over 2210.00 samples.],
2024-10-21 23:34:51,652 INFO [train.py:682] (1/4) Start epoch 3493
2024-10-21 23:35:11,644 INFO [train.py:561] (1/4) Epoch 3493, batch 8, global_batch_idx: 55880, batch size: 170, loss[dur_loss=0.1843, prior_loss=0.9705, diff_loss=0.2884, tot_loss=1.443, over 170.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9701, diff_loss=0.3738, tot_loss=1.524, over 1432.00 samples.],
2024-10-21 23:35:21,787 INFO [train.py:682] (1/4) Start epoch 3494
2024-10-21 23:35:33,080 INFO [train.py:561] (1/4) Epoch 3494, batch 2, global_batch_idx: 55890, batch size: 203, loss[dur_loss=0.1842, prior_loss=0.9704, diff_loss=0.3317, tot_loss=1.486, over 203.00 samples.], tot_loss[dur_loss=0.1841, prior_loss=0.9706, diff_loss=0.3001, tot_loss=1.455, over 442.00 samples.],
2024-10-21 23:35:47,440 INFO [train.py:561] (1/4) Epoch 3494, batch 12, global_batch_idx: 55900, batch size: 152, loss[dur_loss=0.1776, prior_loss=0.9705, diff_loss=0.3032, tot_loss=1.451, over 152.00 samples.], tot_loss[dur_loss=0.1809, prior_loss=0.9703, diff_loss=0.353, tot_loss=1.504, over 1966.00 samples.],
2024-10-21 23:35:51,911 INFO [train.py:682] (1/4) Start epoch 3495
2024-10-21 23:36:09,239 INFO [train.py:561] (1/4) Epoch 3495, batch 6, global_batch_idx: 55910, batch size: 106, loss[dur_loss=0.1798, prior_loss=0.9705, diff_loss=0.282, tot_loss=1.432, over 106.00 samples.], tot_loss[dur_loss=0.1779, prior_loss=0.97, diff_loss=0.3915, tot_loss=1.539, over 1142.00 samples.],
2024-10-21 23:36:22,309 INFO [train.py:682] (1/4) Start epoch 3496
2024-10-21 23:36:31,191 INFO [train.py:561] (1/4) Epoch 3496, batch 0, global_batch_idx: 55920, batch size: 108, loss[dur_loss=0.1866, prior_loss=0.971, diff_loss=0.2554, tot_loss=1.413, over 108.00 samples.], tot_loss[dur_loss=0.1866, prior_loss=0.971, diff_loss=0.2554, tot_loss=1.413, over 108.00 samples.],
2024-10-21 23:36:45,544 INFO [train.py:561] (1/4) Epoch 3496, batch 10, global_batch_idx: 55930, batch size: 111, loss[dur_loss=0.1832, prior_loss=0.9716, diff_loss=0.2894, tot_loss=1.444, over 111.00 samples.], tot_loss[dur_loss=0.1801, prior_loss=0.9702, diff_loss=0.3612, tot_loss=1.511, over 1656.00 samples.],
2024-10-21 23:36:52,715 INFO [train.py:682] (1/4) Start epoch 3497
2024-10-21 23:37:06,659 INFO [train.py:561] (1/4) Epoch 3497, batch 4, global_batch_idx: 55940, batch size: 189, loss[dur_loss=0.1813, prior_loss=0.9706, diff_loss=0.3011, tot_loss=1.453, over 189.00 samples.], tot_loss[dur_loss=0.1793, prior_loss=0.9699, diff_loss=0.4111, tot_loss=1.56, over 937.00 samples.],
2024-10-21 23:37:21,861 INFO [train.py:561] (1/4) Epoch 3497, batch 14, global_batch_idx: 55950, batch size: 142, loss[dur_loss=0.1796, prior_loss=0.9703, diff_loss=0.3088, tot_loss=1.459, over 142.00 samples.], tot_loss[dur_loss=0.1805, prior_loss=0.9703, diff_loss=0.3421, tot_loss=1.493, over 2210.00 samples.],
2024-10-21 23:37:23,289 INFO [train.py:682] (1/4) Start epoch 3498
2024-10-21 23:37:43,582 INFO [train.py:561] (1/4) Epoch 3498, batch 8, global_batch_idx: 55960, batch size: 170, loss[dur_loss=0.1833, prior_loss=0.9707, diff_loss=0.2935, tot_loss=1.447, over 170.00 samples.], tot_loss[dur_loss=0.1799, prior_loss=0.97, diff_loss=0.3717, tot_loss=1.522, over 1432.00 samples.],
2024-10-21 23:37:53,772 INFO [train.py:682] (1/4) Start epoch 3499
2024-10-21 23:38:05,127 INFO [train.py:561] (1/4) Epoch 3499, batch 2, global_batch_idx: 55970, batch size: 203, loss[dur_loss=0.1814, prior_loss=0.9704, diff_loss=0.2996, tot_loss=1.451, over 203.00 samples.], tot_loss[dur_loss=0.183, prior_loss=0.9705, diff_loss=0.304, tot_loss=1.458, over 442.00 samples.],
2024-10-21 23:38:19,573 INFO [train.py:561] (1/4) Epoch 3499, batch 12, global_batch_idx: 55980, batch size: 152, loss[dur_loss=0.1786, prior_loss=0.9705, diff_loss=0.2846, tot_loss=1.434, over 152.00 samples.], tot_loss[dur_loss=0.1803, prior_loss=0.9702, diff_loss=0.3541, tot_loss=1.505, over 1966.00 samples.],
2024-10-21 23:38:24,056 INFO [train.py:682] (1/4) Start epoch 3500
2024-10-21 23:38:41,171 INFO [train.py:561] (1/4) Epoch 3500, batch 6, global_batch_idx: 55990, batch size: 106, loss[dur_loss=0.1829, prior_loss=0.9703, diff_loss=0.2537, tot_loss=1.407, over 106.00 samples.], tot_loss[dur_loss=0.1794, prior_loss=0.9699, diff_loss=0.3821, tot_loss=1.531, over 1142.00 samples.],
2024-10-21 23:38:54,317 INFO [train.py:682] (1/4) Start epoch 3501
2024-10-21 23:39:03,260 INFO [train.py:561] (1/4) Epoch 3501, batch 0, global_batch_idx: 56000, batch size: 108, loss[dur_loss=0.1856, prior_loss=0.9712, diff_loss=0.2964, tot_loss=1.453, over 108.00 samples.], tot_loss[dur_loss=0.1856, prior_loss=0.9712, diff_loss=0.2964, tot_loss=1.453, over 108.00 samples.],
2024-10-21 23:39:17,708 INFO [train.py:561] (1/4) Epoch 3501, batch 10, global_batch_idx: 56010, batch size: 111, loss[dur_loss=0.1834, prior_loss=0.9714, diff_loss=0.2923, tot_loss=1.447, over 111.00 samples.], tot_loss[dur_loss=0.1806, prior_loss=0.9702, diff_loss=0.3602, tot_loss=1.511, over 1656.00 samples.],
2024-10-21 23:39:24,916 INFO [train.py:682] (1/4) Start epoch 3502
2024-10-21 23:39:39,069 INFO [train.py:561] (1/4) Epoch 3502, batch 4, global_batch_idx: 56020, batch size: 189, loss[dur_loss=0.1805, prior_loss=0.9707, diff_loss=0.311, tot_loss=1.462, over 189.00 samples.], tot_loss[dur_loss=0.1791, prior_loss=0.9698, diff_loss=0.41, tot_loss=1.559, over 937.00 samples.],
2024-10-21 23:39:54,029 INFO [train.py:561] (1/4) Epoch 3502, batch 14, global_batch_idx: 56030, batch size: 142, loss[dur_loss=0.1806, prior_loss=0.9702, diff_loss=0.2938, tot_loss=1.445, over 142.00 samples.], tot_loss[dur_loss=0.1808, prior_loss=0.9703, diff_loss=0.3394, tot_loss=1.49, over 2210.00 samples.],
2024-10-21 23:39:55,477 INFO [train.py:682] (1/4) Start epoch 3503
2024-10-21 23:40:15,571 INFO [train.py:561] (1/4) Epoch 3503, batch 8, global_batch_idx: 56040, batch size: 170, loss[dur_loss=0.1837, prior_loss=0.9705, diff_loss=0.3076, tot_loss=1.462, over 170.00 samples.], tot_loss[dur_loss=0.1794, prior_loss=0.9701, diff_loss=0.3618, tot_loss=1.511, over 1432.00 samples.],
2024-10-21 23:40:25,798 INFO [train.py:682] (1/4) Start epoch 3504
2024-10-21 23:40:37,435 INFO [train.py:561] (1/4) Epoch 3504, batch 2, global_batch_idx: 56050, batch size: 203, loss[dur_loss=0.1827, prior_loss=0.9704, diff_loss=0.3065, tot_loss=1.46, over 203.00 samples.], tot_loss[dur_loss=0.1821, prior_loss=0.9705, diff_loss=0.3106, tot_loss=1.463, over 442.00 samples.],
2024-10-21 23:40:51,761 INFO [train.py:561] (1/4) Epoch 3504, batch 12, global_batch_idx: 56060, batch size: 152, loss[dur_loss=0.1812, prior_loss=0.9707, diff_loss=0.2791, tot_loss=1.431, over 152.00 samples.], tot_loss[dur_loss=0.1808, prior_loss=0.9702, diff_loss=0.3554, tot_loss=1.506, over 1966.00 samples.],
2024-10-21 23:40:56,249 INFO [train.py:682] (1/4) Start epoch 3505
2024-10-21 23:41:13,285 INFO [train.py:561] (1/4) Epoch 3505, batch 6, global_batch_idx: 56070, batch size: 106, loss[dur_loss=0.1793, prior_loss=0.9701, diff_loss=0.2797, tot_loss=1.429, over 106.00 samples.], tot_loss[dur_loss=0.1788, prior_loss=0.9699, diff_loss=0.3856, tot_loss=1.534, over 1142.00 samples.],
2024-10-21 23:41:26,372 INFO [train.py:682] (1/4) Start epoch 3506
2024-10-21 23:41:35,083 INFO [train.py:561] (1/4) Epoch 3506, batch 0, global_batch_idx: 56080, batch size: 108, loss[dur_loss=0.1852, prior_loss=0.971, diff_loss=0.3179, tot_loss=1.474, over 108.00 samples.], tot_loss[dur_loss=0.1852, prior_loss=0.971, diff_loss=0.3179, tot_loss=1.474, over 108.00 samples.],
2024-10-21 23:41:49,397 INFO [train.py:561] (1/4) Epoch 3506, batch 10, global_batch_idx: 56090, batch size: 111, loss[dur_loss=0.1849, prior_loss=0.9714, diff_loss=0.2844, tot_loss=1.441, over 111.00 samples.], tot_loss[dur_loss=0.18, prior_loss=0.9702, diff_loss=0.3581, tot_loss=1.508, over 1656.00 samples.],
2024-10-21 23:41:56,506 INFO [train.py:682] (1/4) Start epoch 3507
2024-10-21 23:42:10,347 INFO [train.py:561] (1/4) Epoch 3507, batch 4, global_batch_idx: 56100, batch size: 189, loss[dur_loss=0.1812, prior_loss=0.9704, diff_loss=0.3156, tot_loss=1.467, over 189.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9697, diff_loss=0.4108, tot_loss=1.559, over 937.00 samples.],
2024-10-21 23:42:25,247 INFO [train.py:561] (1/4) Epoch 3507, batch 14, global_batch_idx: 56110, batch size: 142, loss[dur_loss=0.1814, prior_loss=0.9702, diff_loss=0.2617, tot_loss=1.413, over 142.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9702, diff_loss=0.3422, tot_loss=1.493, over 2210.00 samples.],
2024-10-21 23:42:26,703 INFO [train.py:682] (1/4) Start epoch 3508
2024-10-21 23:42:46,942 INFO [train.py:561] (1/4) Epoch 3508, batch 8, global_batch_idx: 56120, batch size: 170, loss[dur_loss=0.1828, prior_loss=0.9704, diff_loss=0.3103, tot_loss=1.463, over 170.00 samples.], tot_loss[dur_loss=0.1799, prior_loss=0.9701, diff_loss=0.3608, tot_loss=1.511, over 1432.00 samples.],
2024-10-21 23:42:57,082 INFO [train.py:682] (1/4) Start epoch 3509
2024-10-21 23:43:08,389 INFO [train.py:561] (1/4) Epoch 3509, batch 2, global_batch_idx: 56130, batch size: 203, loss[dur_loss=0.1827, prior_loss=0.9707, diff_loss=0.3165, tot_loss=1.47, over 203.00 samples.], tot_loss[dur_loss=0.1825, prior_loss=0.9706, diff_loss=0.3139, tot_loss=1.467, over 442.00 samples.],
2024-10-21 23:43:22,667 INFO [train.py:561] (1/4) Epoch 3509, batch 12, global_batch_idx: 56140, batch size: 152, loss[dur_loss=0.1842, prior_loss=0.9709, diff_loss=0.3121, tot_loss=1.467, over 152.00 samples.], tot_loss[dur_loss=0.1809, prior_loss=0.9703, diff_loss=0.3473, tot_loss=1.499, over 1966.00 samples.],
2024-10-21 23:43:27,128 INFO [train.py:682] (1/4) Start epoch 3510
2024-10-21 23:43:44,597 INFO [train.py:561] (1/4) Epoch 3510, batch 6, global_batch_idx: 56150, batch size: 106, loss[dur_loss=0.1778, prior_loss=0.9703, diff_loss=0.2903, tot_loss=1.438, over 106.00 samples.], tot_loss[dur_loss=0.1782, prior_loss=0.97, diff_loss=0.3931, tot_loss=1.541, over 1142.00 samples.],
2024-10-21 23:43:57,716 INFO [train.py:682] (1/4) Start epoch 3511
2024-10-21 23:44:06,502 INFO [train.py:561] (1/4) Epoch 3511, batch 0, global_batch_idx: 56160, batch size: 108, loss[dur_loss=0.1836, prior_loss=0.971, diff_loss=0.3062, tot_loss=1.461, over 108.00 samples.], tot_loss[dur_loss=0.1836, prior_loss=0.971, diff_loss=0.3062, tot_loss=1.461, over 108.00 samples.],
2024-10-21 23:44:20,949 INFO [train.py:561] (1/4) Epoch 3511, batch 10, global_batch_idx: 56170, batch size: 111, loss[dur_loss=0.1827, prior_loss=0.9713, diff_loss=0.3045, tot_loss=1.459, over 111.00 samples.], tot_loss[dur_loss=0.1805, prior_loss=0.9702, diff_loss=0.3657, tot_loss=1.516, over 1656.00 samples.],
2024-10-21 23:44:28,131 INFO [train.py:682] (1/4) Start epoch 3512
2024-10-21 23:44:42,144 INFO [train.py:561] (1/4) Epoch 3512, batch 4, global_batch_idx: 56180, batch size: 189, loss[dur_loss=0.1808, prior_loss=0.9704, diff_loss=0.3067, tot_loss=1.458, over 189.00 samples.], tot_loss[dur_loss=0.179, prior_loss=0.9698, diff_loss=0.4141, tot_loss=1.563, over 937.00 samples.],
2024-10-21 23:44:57,061 INFO [train.py:561] (1/4) Epoch 3512, batch 14, global_batch_idx: 56190, batch size: 142, loss[dur_loss=0.1816, prior_loss=0.9704, diff_loss=0.3009, tot_loss=1.453, over 142.00 samples.], tot_loss[dur_loss=0.1809, prior_loss=0.9703, diff_loss=0.3477, tot_loss=1.499, over 2210.00 samples.],
2024-10-21 23:44:58,503 INFO [train.py:682] (1/4) Start epoch 3513
2024-10-21 23:45:18,545 INFO [train.py:561] (1/4) Epoch 3513, batch 8, global_batch_idx: 56200, batch size: 170, loss[dur_loss=0.1834, prior_loss=0.9703, diff_loss=0.3014, tot_loss=1.455, over 170.00 samples.], tot_loss[dur_loss=0.1796, prior_loss=0.97, diff_loss=0.3615, tot_loss=1.511, over 1432.00 samples.],
2024-10-21 23:45:28,807 INFO [train.py:682] (1/4) Start epoch 3514
2024-10-21 23:45:40,452 INFO [train.py:561] (1/4) Epoch 3514, batch 2, global_batch_idx: 56210, batch size: 203, loss[dur_loss=0.1843, prior_loss=0.9704, diff_loss=0.3478, tot_loss=1.503, over 203.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9705, diff_loss=0.3168, tot_loss=1.47, over 442.00 samples.],
2024-10-21 23:45:54,757 INFO [train.py:561] (1/4) Epoch 3514, batch 12, global_batch_idx: 56220, batch size: 152, loss[dur_loss=0.1841, prior_loss=0.9708, diff_loss=0.3068, tot_loss=1.462, over 152.00 samples.], tot_loss[dur_loss=0.1806, prior_loss=0.9702, diff_loss=0.3528, tot_loss=1.504, over 1966.00 samples.],
2024-10-21 23:45:59,222 INFO [train.py:682] (1/4) Start epoch 3515
2024-10-21 23:46:16,398 INFO [train.py:561] (1/4) Epoch 3515, batch 6, global_batch_idx: 56230, batch size: 106, loss[dur_loss=0.1795, prior_loss=0.9703, diff_loss=0.3052, tot_loss=1.455, over 106.00 samples.], tot_loss[dur_loss=0.1786, prior_loss=0.9698, diff_loss=0.3797, tot_loss=1.528, over 1142.00 samples.],
2024-10-21 23:46:29,517 INFO [train.py:682] (1/4) Start epoch 3516
2024-10-21 23:46:38,213 INFO [train.py:561] (1/4) Epoch 3516, batch 0, global_batch_idx: 56240, batch size: 108, loss[dur_loss=0.186, prior_loss=0.971, diff_loss=0.2835, tot_loss=1.441, over 108.00 samples.], tot_loss[dur_loss=0.186, prior_loss=0.971, diff_loss=0.2835, tot_loss=1.441, over 108.00 samples.],
2024-10-21 23:46:52,439 INFO [train.py:561] (1/4) Epoch 3516, batch 10, global_batch_idx: 56250, batch size: 111, loss[dur_loss=0.1799, prior_loss=0.9713, diff_loss=0.2851, tot_loss=1.436, over 111.00 samples.], tot_loss[dur_loss=0.1799, prior_loss=0.9701, diff_loss=0.3569, tot_loss=1.507, over 1656.00 samples.],
2024-10-21 23:46:59,579 INFO [train.py:682] (1/4) Start epoch 3517
2024-10-21 23:47:13,396 INFO [train.py:561] (1/4) Epoch 3517, batch 4, global_batch_idx: 56260, batch size: 189, loss[dur_loss=0.1814, prior_loss=0.9705, diff_loss=0.3126, tot_loss=1.465, over 189.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9698, diff_loss=0.4089, tot_loss=1.558, over 937.00 samples.],
2024-10-21 23:47:28,304 INFO [train.py:561] (1/4) Epoch 3517, batch 14, global_batch_idx: 56270, batch size: 142, loss[dur_loss=0.1817, prior_loss=0.9704, diff_loss=0.2795, tot_loss=1.432, over 142.00 samples.], tot_loss[dur_loss=0.1811, prior_loss=0.9703, diff_loss=0.3375, tot_loss=1.489, over 2210.00 samples.],
2024-10-21 23:47:29,733 INFO [train.py:682] (1/4) Start epoch 3518
2024-10-21 23:47:50,077 INFO [train.py:561] (1/4) Epoch 3518, batch 8, global_batch_idx: 56280, batch size: 170, loss[dur_loss=0.1845, prior_loss=0.9706, diff_loss=0.3141, tot_loss=1.469, over 170.00 samples.], tot_loss[dur_loss=0.1797, prior_loss=0.97, diff_loss=0.3626, tot_loss=1.512, over 1432.00 samples.],
2024-10-21 23:48:00,258 INFO [train.py:682] (1/4) Start epoch 3519
2024-10-21 23:48:11,745 INFO [train.py:561] (1/4) Epoch 3519, batch 2, global_batch_idx: 56290, batch size: 203, loss[dur_loss=0.1837, prior_loss=0.9704, diff_loss=0.2924, tot_loss=1.446, over 203.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9705, diff_loss=0.2823, tot_loss=1.436, over 442.00 samples.],
2024-10-21 23:48:26,044 INFO [train.py:561] (1/4) Epoch 3519, batch 12, global_batch_idx: 56300, batch size: 152, loss[dur_loss=0.1801, prior_loss=0.9707, diff_loss=0.292, tot_loss=1.443, over 152.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9702, diff_loss=0.3433, tot_loss=1.494, over 1966.00 samples.],
2024-10-21 23:48:30,518 INFO [train.py:682] (1/4) Start epoch 3520
2024-10-21 23:48:47,690 INFO [train.py:561] (1/4) Epoch 3520, batch 6, global_batch_idx: 56310, batch size: 106, loss[dur_loss=0.1802, prior_loss=0.9702, diff_loss=0.2654, tot_loss=1.416, over 106.00 samples.], tot_loss[dur_loss=0.1785, prior_loss=0.9699, diff_loss=0.383, tot_loss=1.531, over 1142.00 samples.],
2024-10-21 23:49:00,818 INFO [train.py:682] (1/4) Start epoch 3521
2024-10-21 23:49:09,408 INFO [train.py:561] (1/4) Epoch 3521, batch 0, global_batch_idx: 56320, batch size: 108, loss[dur_loss=0.1831, prior_loss=0.971, diff_loss=0.2794, tot_loss=1.433, over 108.00 samples.], tot_loss[dur_loss=0.1831, prior_loss=0.971, diff_loss=0.2794, tot_loss=1.433, over 108.00 samples.],
2024-10-21 23:49:23,680 INFO [train.py:561] (1/4) Epoch 3521, batch 10, global_batch_idx: 56330, batch size: 111, loss[dur_loss=0.1838, prior_loss=0.9715, diff_loss=0.3113, tot_loss=1.467, over 111.00 samples.], tot_loss[dur_loss=0.1812, prior_loss=0.9702, diff_loss=0.366, tot_loss=1.517, over 1656.00 samples.],
2024-10-21 23:49:30,855 INFO [train.py:682] (1/4) Start epoch 3522
2024-10-21 23:49:44,792 INFO [train.py:561] (1/4) Epoch 3522, batch 4, global_batch_idx: 56340, batch size: 189, loss[dur_loss=0.1825, prior_loss=0.9707, diff_loss=0.3118, tot_loss=1.465, over 189.00 samples.], tot_loss[dur_loss=0.1794, prior_loss=0.9698, diff_loss=0.4073, tot_loss=1.557, over 937.00 samples.],
2024-10-21 23:49:59,890 INFO [train.py:561] (1/4) Epoch 3522, batch 14, global_batch_idx: 56350, batch size: 142, loss[dur_loss=0.1841, prior_loss=0.9705, diff_loss=0.303, tot_loss=1.458, over 142.00 samples.], tot_loss[dur_loss=0.1812, prior_loss=0.9703, diff_loss=0.346, tot_loss=1.498, over 2210.00 samples.],
2024-10-21 23:50:01,340 INFO [train.py:682] (1/4) Start epoch 3523
2024-10-21 23:50:21,316 INFO [train.py:561] (1/4) Epoch 3523, batch 8, global_batch_idx: 56360, batch size: 170, loss[dur_loss=0.1837, prior_loss=0.9707, diff_loss=0.3022, tot_loss=1.457, over 170.00 samples.], tot_loss[dur_loss=0.1787, prior_loss=0.9701, diff_loss=0.3688, tot_loss=1.518, over 1432.00 samples.],
2024-10-21 23:50:31,533 INFO [train.py:682] (1/4) Start epoch 3524
2024-10-21 23:50:42,760 INFO [train.py:561] (1/4) Epoch 3524, batch 2, global_batch_idx: 56370, batch size: 203, loss[dur_loss=0.185, prior_loss=0.9707, diff_loss=0.3141, tot_loss=1.47, over 203.00 samples.], tot_loss[dur_loss=0.1832, prior_loss=0.9706, diff_loss=0.2862, tot_loss=1.44, over 442.00 samples.],
2024-10-21 23:50:57,168 INFO [train.py:561] (1/4) Epoch 3524, batch 12, global_batch_idx: 56380, batch size: 152, loss[dur_loss=0.1822, prior_loss=0.9708, diff_loss=0.311, tot_loss=1.464, over 152.00 samples.], tot_loss[dur_loss=0.1803, prior_loss=0.9702, diff_loss=0.3482, tot_loss=1.499, over 1966.00 samples.],
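In each record, loss[...] covers only the current batch, while tot_loss[...] aggregates the epoch so far: at batch 0 the two are identical, and the "over N samples" count inside tot_loss[...] grows through the epoch (442, 1432, 1966, 2210 in the records above). A plausible sketch of such a sample-weighted running mean (an illustrative assumption, not the actual train.py code):

    # Sketch (assumed, for illustration): sample-weighted running mean
    # of the kind reported as tot_loss[... over N samples].
    class RunningLoss:
        def __init__(self) -> None:
            self.weighted_sum = 0.0  # sum of batch_loss * batch_size
            self.num_samples = 0     # the "over N samples" figure

        def update(self, batch_loss: float, batch_size: int) -> None:
            self.weighted_sum += batch_loss * batch_size
            self.num_samples += batch_size

        @property
        def mean(self) -> float:
            return self.weighted_sum / max(self.num_samples, 1)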
2024-10-21 23:51:01,665 INFO [train.py:682] (1/4) Start epoch 3525
2024-10-21 23:51:18,739 INFO [train.py:561] (1/4) Epoch 3525, batch 6, global_batch_idx: 56390, batch size: 106, loss[dur_loss=0.1786, prior_loss=0.9703, diff_loss=0.3092, tot_loss=1.458, over 106.00 samples.], tot_loss[dur_loss=0.1792, prior_loss=0.9699, diff_loss=0.3779, tot_loss=1.527, over 1142.00 samples.],
2024-10-21 23:51:31,912 INFO [train.py:682] (1/4) Start epoch 3526
2024-10-21 23:51:40,921 INFO [train.py:561] (1/4) Epoch 3526, batch 0, global_batch_idx: 56400, batch size: 108, loss[dur_loss=0.1846, prior_loss=0.971, diff_loss=0.2678, tot_loss=1.423, over 108.00 samples.], tot_loss[dur_loss=0.1846, prior_loss=0.971, diff_loss=0.2678, tot_loss=1.423, over 108.00 samples.],
2024-10-21 23:51:55,267 INFO [train.py:561] (1/4) Epoch 3526, batch 10, global_batch_idx: 56410, batch size: 111, loss[dur_loss=0.1841, prior_loss=0.9714, diff_loss=0.2619, tot_loss=1.417, over 111.00 samples.], tot_loss[dur_loss=0.1804, prior_loss=0.9702, diff_loss=0.3486, tot_loss=1.499, over 1656.00 samples.],
2024-10-21 23:52:02,444 INFO [train.py:682] (1/4) Start epoch 3527
2024-10-21 23:52:16,595 INFO [train.py:561] (1/4) Epoch 3527, batch 4, global_batch_idx: 56420, batch size: 189, loss[dur_loss=0.1816, prior_loss=0.9707, diff_loss=0.3034, tot_loss=1.456, over 189.00 samples.], tot_loss[dur_loss=0.1771, prior_loss=0.9698, diff_loss=0.4062, tot_loss=1.553, over 937.00 samples.],
2024-10-21 23:52:31,718 INFO [train.py:561] (1/4) Epoch 3527, batch 14, global_batch_idx: 56430, batch size: 142, loss[dur_loss=0.1829, prior_loss=0.9704, diff_loss=0.3134, tot_loss=1.467, over 142.00 samples.], tot_loss[dur_loss=0.1803, prior_loss=0.9703, diff_loss=0.3486, tot_loss=1.499, over 2210.00 samples.],
2024-10-21 23:52:33,157 INFO [train.py:682] (1/4) Start epoch 3528
2024-10-21 23:52:53,093 INFO [train.py:561] (1/4) Epoch 3528, batch 8, global_batch_idx: 56440, batch size: 170, loss[dur_loss=0.1861, prior_loss=0.9705, diff_loss=0.3044, tot_loss=1.461, over 170.00 samples.], tot_loss[dur_loss=0.1797, prior_loss=0.97, diff_loss=0.3676, tot_loss=1.517, over 1432.00 samples.],
2024-10-21 23:53:03,341 INFO [train.py:682] (1/4) Start epoch 3529
2024-10-21 23:53:14,689 INFO [train.py:561] (1/4) Epoch 3529, batch 2, global_batch_idx: 56450, batch size: 203, loss[dur_loss=0.1839, prior_loss=0.9706, diff_loss=0.2873, tot_loss=1.442, over 203.00 samples.], tot_loss[dur_loss=0.1828, prior_loss=0.9706, diff_loss=0.2973, tot_loss=1.451, over 442.00 samples.],
2024-10-21 23:53:29,380 INFO [train.py:561] (1/4) Epoch 3529, batch 12, global_batch_idx: 56460, batch size: 152, loss[dur_loss=0.1825, prior_loss=0.9706, diff_loss=0.2784, tot_loss=1.431, over 152.00 samples.], tot_loss[dur_loss=0.181, prior_loss=0.9702, diff_loss=0.3386, tot_loss=1.49, over 1966.00 samples.],
2024-10-21 23:53:33,906 INFO [train.py:682] (1/4) Start epoch 3530
2024-10-21 23:53:51,137 INFO [train.py:561] (1/4) Epoch 3530, batch 6, global_batch_idx: 56470, batch size: 106, loss[dur_loss=0.1815, prior_loss=0.9703, diff_loss=0.2537, tot_loss=1.405, over 106.00 samples.], tot_loss[dur_loss=0.1802, prior_loss=0.9699, diff_loss=0.3876, tot_loss=1.538, over 1142.00 samples.],
2024-10-21 23:54:04,250 INFO [train.py:682] (1/4) Start epoch 3531
2024-10-21 23:54:13,055 INFO [train.py:561] (1/4) Epoch 3531, batch 0, global_batch_idx: 56480, batch size: 108, loss[dur_loss=0.1811, prior_loss=0.9711, diff_loss=0.2821, tot_loss=1.434, over 108.00 samples.], tot_loss[dur_loss=0.1811, prior_loss=0.9711, diff_loss=0.2821, tot_loss=1.434, over 108.00 samples.],
2024-10-21 23:54:27,320 INFO [train.py:561] (1/4) Epoch 3531, batch 10, global_batch_idx: 56490, batch size: 111, loss[dur_loss=0.186, prior_loss=0.9714, diff_loss=0.2743, tot_loss=1.432, over 111.00 samples.], tot_loss[dur_loss=0.1796, prior_loss=0.9702, diff_loss=0.3543, tot_loss=1.504, over 1656.00 samples.],
2024-10-21 23:54:34,467 INFO [train.py:682] (1/4) Start epoch 3532
2024-10-21 23:54:48,277 INFO [train.py:561] (1/4) Epoch 3532, batch 4, global_batch_idx: 56500, batch size: 189, loss[dur_loss=0.1821, prior_loss=0.9706, diff_loss=0.2945, tot_loss=1.447, over 189.00 samples.], tot_loss[dur_loss=0.1782, prior_loss=0.9698, diff_loss=0.4257, tot_loss=1.574, over 937.00 samples.],
2024-10-21 23:55:03,420 INFO [train.py:561] (1/4) Epoch 3532, batch 14, global_batch_idx: 56510, batch size: 142, loss[dur_loss=0.1822, prior_loss=0.9703, diff_loss=0.2885, tot_loss=1.441, over 142.00 samples.], tot_loss[dur_loss=0.1805, prior_loss=0.9703, diff_loss=0.3502, tot_loss=1.501, over 2210.00 samples.],
2024-10-21 23:55:04,855 INFO [train.py:682] (1/4) Start epoch 3533
2024-10-21 23:55:25,186 INFO [train.py:561] (1/4) Epoch 3533, batch 8, global_batch_idx: 56520, batch size: 170, loss[dur_loss=0.1841, prior_loss=0.9707, diff_loss=0.3067, tot_loss=1.461, over 170.00 samples.], tot_loss[dur_loss=0.1793, prior_loss=0.97, diff_loss=0.3695, tot_loss=1.519, over 1432.00 samples.],
2024-10-21 23:55:35,369 INFO [train.py:682] (1/4) Start epoch 3534
2024-10-21 23:55:46,646 INFO [train.py:561] (1/4) Epoch 3534, batch 2, global_batch_idx: 56530, batch size: 203, loss[dur_loss=0.1828, prior_loss=0.9706, diff_loss=0.3044, tot_loss=1.458, over 203.00 samples.], tot_loss[dur_loss=0.1833, prior_loss=0.9706, diff_loss=0.3044, tot_loss=1.458, over 442.00 samples.],
2024-10-21 23:56:00,938 INFO [train.py:561] (1/4) Epoch 3534, batch 12, global_batch_idx: 56540, batch size: 152, loss[dur_loss=0.1809, prior_loss=0.9705, diff_loss=0.3315, tot_loss=1.483, over 152.00 samples.], tot_loss[dur_loss=0.1806, prior_loss=0.9702, diff_loss=0.354, tot_loss=1.505, over 1966.00 samples.],
2024-10-21 23:56:05,407 INFO [train.py:682] (1/4) Start epoch 3535
2024-10-21 23:56:22,379 INFO [train.py:561] (1/4) Epoch 3535, batch 6, global_batch_idx: 56550, batch size: 106, loss[dur_loss=0.1784, prior_loss=0.9701, diff_loss=0.3205, tot_loss=1.469, over 106.00 samples.], tot_loss[dur_loss=0.1786, prior_loss=0.9699, diff_loss=0.3795, tot_loss=1.528, over 1142.00 samples.],
2024-10-21 23:56:35,440 INFO [train.py:682] (1/4) Start epoch 3536
2024-10-21 23:56:44,289 INFO [train.py:561] (1/4) Epoch 3536, batch 0, global_batch_idx: 56560, batch size: 108, loss[dur_loss=0.1857, prior_loss=0.971, diff_loss=0.2783, tot_loss=1.435, over 108.00 samples.], tot_loss[dur_loss=0.1857, prior_loss=0.971, diff_loss=0.2783, tot_loss=1.435, over 108.00 samples.],
2024-10-21 23:56:58,403 INFO [train.py:561] (1/4) Epoch 3536, batch 10, global_batch_idx: 56570, batch size: 111, loss[dur_loss=0.1783, prior_loss=0.9713, diff_loss=0.2813, tot_loss=1.431, over 111.00 samples.], tot_loss[dur_loss=0.1804, prior_loss=0.9701, diff_loss=0.3631, tot_loss=1.514, over 1656.00 samples.],
2024-10-21 23:57:05,522 INFO [train.py:682] (1/4) Start epoch 3537
2024-10-21 23:57:19,102 INFO [train.py:561] (1/4) Epoch 3537, batch 4, global_batch_idx: 56580, batch size: 189, loss[dur_loss=0.1811, prior_loss=0.9703, diff_loss=0.326, tot_loss=1.477, over 189.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9696, diff_loss=0.4046, tot_loss=1.551, over 937.00 samples.],
2024-10-21 23:57:33,936 INFO [train.py:561] (1/4) Epoch 3537, batch 14, global_batch_idx: 56590, batch size: 142, loss[dur_loss=0.182, prior_loss=0.9703, diff_loss=0.2841, tot_loss=1.436, over 142.00 samples.], tot_loss[dur_loss=0.1801, prior_loss=0.9702, diff_loss=0.3431, tot_loss=1.493, over 2210.00 samples.],
2024-10-21 23:57:35,361 INFO [train.py:682] (1/4) Start epoch 3538
2024-10-21 23:57:55,485 INFO [train.py:561] (1/4) Epoch 3538, batch 8, global_batch_idx: 56600, batch size: 170, loss[dur_loss=0.1839, prior_loss=0.9705, diff_loss=0.2766, tot_loss=1.431, over 170.00 samples.], tot_loss[dur_loss=0.1799, prior_loss=0.97, diff_loss=0.3676, tot_loss=1.517, over 1432.00 samples.],
2024-10-21 23:58:05,627 INFO [train.py:682] (1/4) Start epoch 3539
2024-10-21 23:58:17,088 INFO [train.py:561] (1/4) Epoch 3539, batch 2, global_batch_idx: 56610, batch size: 203, loss[dur_loss=0.1823, prior_loss=0.9704, diff_loss=0.3334, tot_loss=1.486, over 203.00 samples.], tot_loss[dur_loss=0.1821, prior_loss=0.9705, diff_loss=0.308, tot_loss=1.461, over 442.00 samples.],
2024-10-21 23:58:31,276 INFO [train.py:561] (1/4) Epoch 3539, batch 12, global_batch_idx: 56620, batch size: 152, loss[dur_loss=0.1821, prior_loss=0.9707, diff_loss=0.301, tot_loss=1.454, over 152.00 samples.], tot_loss[dur_loss=0.1803, prior_loss=0.9702, diff_loss=0.3496, tot_loss=1.5, over 1966.00 samples.],
2024-10-21 23:58:35,734 INFO [train.py:682] (1/4) Start epoch 3540
2024-10-21 23:58:52,632 INFO [train.py:561] (1/4) Epoch 3540, batch 6, global_batch_idx: 56630, batch size: 106, loss[dur_loss=0.1813, prior_loss=0.9703, diff_loss=0.2557, tot_loss=1.407, over 106.00 samples.], tot_loss[dur_loss=0.1794, prior_loss=0.9699, diff_loss=0.3864, tot_loss=1.536, over 1142.00 samples.],
2024-10-21 23:59:05,649 INFO [train.py:682] (1/4) Start epoch 3541
2024-10-21 23:59:14,510 INFO [train.py:561] (1/4) Epoch 3541, batch 0, global_batch_idx: 56640, batch size: 108, loss[dur_loss=0.1844, prior_loss=0.9709, diff_loss=0.2915, tot_loss=1.447, over 108.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.9709, diff_loss=0.2915, tot_loss=1.447, over 108.00 samples.],
2024-10-21 23:59:28,584 INFO [train.py:561] (1/4) Epoch 3541, batch 10, global_batch_idx: 56650, batch size: 111, loss[dur_loss=0.182, prior_loss=0.9711, diff_loss=0.3156, tot_loss=1.469, over 111.00 samples.], tot_loss[dur_loss=0.1794, prior_loss=0.9701, diff_loss=0.3554, tot_loss=1.505, over 1656.00 samples.],
2024-10-21 23:59:35,695 INFO [train.py:682] (1/4) Start epoch 3542
2024-10-21 23:59:49,371 INFO [train.py:561] (1/4) Epoch 3542, batch 4, global_batch_idx: 56660, batch size: 189, loss[dur_loss=0.1814, prior_loss=0.9703, diff_loss=0.2886, tot_loss=1.44, over 189.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9697, diff_loss=0.4109, tot_loss=1.559, over 937.00 samples.],
2024-10-22 00:00:04,165 INFO [train.py:561] (1/4) Epoch 3542, batch 14, global_batch_idx: 56670, batch size: 142, loss[dur_loss=0.182, prior_loss=0.9703, diff_loss=0.2926, tot_loss=1.445, over 142.00 samples.], tot_loss[dur_loss=0.1801, prior_loss=0.9702, diff_loss=0.3453, tot_loss=1.496, over 2210.00 samples.],
2024-10-22 00:00:05,598 INFO [train.py:682] (1/4) Start epoch 3543
2024-10-22 00:00:25,600 INFO [train.py:561] (1/4) Epoch 3543, batch 8, global_batch_idx: 56680, batch size: 170, loss[dur_loss=0.1836, prior_loss=0.9706, diff_loss=0.3105, tot_loss=1.465, over 170.00 samples.], tot_loss[dur_loss=0.1795, prior_loss=0.9699, diff_loss=0.3791, tot_loss=1.529, over 1432.00 samples.],
2024-10-22 00:00:35,784 INFO [train.py:682] (1/4) Start epoch 3544
2024-10-22 00:00:47,171 INFO [train.py:561] (1/4) Epoch 3544, batch 2, global_batch_idx: 56690, batch size: 203, loss[dur_loss=0.1834, prior_loss=0.9704, diff_loss=0.31, tot_loss=1.464, over 203.00 samples.], tot_loss[dur_loss=0.1824, prior_loss=0.9705, diff_loss=0.2914, tot_loss=1.444, over 442.00 samples.],
2024-10-22 00:01:01,496 INFO [train.py:561] (1/4) Epoch 3544, batch 12, global_batch_idx: 56700, batch size: 152, loss[dur_loss=0.1787, prior_loss=0.9703, diff_loss=0.3051, tot_loss=1.454, over 152.00 samples.], tot_loss[dur_loss=0.1795, prior_loss=0.97, diff_loss=0.3439, tot_loss=1.493, over 1966.00 samples.],
2024-10-22 00:01:06,005 INFO [train.py:682] (1/4) Start epoch 3545
2024-10-22 00:01:22,906 INFO [train.py:561] (1/4) Epoch 3545, batch 6, global_batch_idx: 56710, batch size: 106, loss[dur_loss=0.1799, prior_loss=0.9702, diff_loss=0.2931, tot_loss=1.443, over 106.00 samples.], tot_loss[dur_loss=0.1788, prior_loss=0.9697, diff_loss=0.3901, tot_loss=1.539, over 1142.00 samples.],
2024-10-22 00:01:36,018 INFO [train.py:682] (1/4) Start epoch 3546
2024-10-22 00:01:44,903 INFO [train.py:561] (1/4) Epoch 3546, batch 0, global_batch_idx: 56720, batch size: 108, loss[dur_loss=0.1822, prior_loss=0.9709, diff_loss=0.2862, tot_loss=1.439, over 108.00 samples.], tot_loss[dur_loss=0.1822, prior_loss=0.9709, diff_loss=0.2862, tot_loss=1.439, over 108.00 samples.],
2024-10-22 00:01:59,061 INFO [train.py:561] (1/4) Epoch 3546, batch 10, global_batch_idx: 56730, batch size: 111, loss[dur_loss=0.1818, prior_loss=0.9712, diff_loss=0.3132, tot_loss=1.466, over 111.00 samples.], tot_loss[dur_loss=0.1798, prior_loss=0.97, diff_loss=0.3602, tot_loss=1.51, over 1656.00 samples.],
2024-10-22 00:02:06,215 INFO [train.py:682] (1/4) Start epoch 3547
2024-10-22 00:02:19,941 INFO [train.py:561] (1/4) Epoch 3547, batch 4, global_batch_idx: 56740, batch size: 189, loss[dur_loss=0.1792, prior_loss=0.9704, diff_loss=0.3219, tot_loss=1.472, over 189.00 samples.], tot_loss[dur_loss=0.1769, prior_loss=0.9697, diff_loss=0.4087, tot_loss=1.555, over 937.00 samples.],
2024-10-22 00:02:34,861 INFO [train.py:561] (1/4) Epoch 3547, batch 14, global_batch_idx: 56750, batch size: 142, loss[dur_loss=0.1823, prior_loss=0.97, diff_loss=0.3169, tot_loss=1.469, over 142.00 samples.], tot_loss[dur_loss=0.1799, prior_loss=0.9701, diff_loss=0.3457, tot_loss=1.496, over 2210.00 samples.],
2024-10-22 00:02:36,285 INFO [train.py:682] (1/4) Start epoch 3548
2024-10-22 00:02:56,127 INFO [train.py:561] (1/4) Epoch 3548, batch 8, global_batch_idx: 56760, batch size: 170, loss[dur_loss=0.1849, prior_loss=0.9706, diff_loss=0.309, tot_loss=1.464, over 170.00 samples.], tot_loss[dur_loss=0.1792, prior_loss=0.9698, diff_loss=0.3721, tot_loss=1.521, over 1432.00 samples.],
2024-10-22 00:03:06,274 INFO [train.py:682] (1/4) Start epoch 3549
2024-10-22 00:03:17,913 INFO [train.py:561] (1/4) Epoch 3549, batch 2, global_batch_idx: 56770, batch size: 203, loss[dur_loss=0.1826, prior_loss=0.97, diff_loss=0.3133, tot_loss=1.466, over 203.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9702, diff_loss=0.3059, tot_loss=1.458, over 442.00 samples.],
2024-10-22 00:03:32,068 INFO [train.py:561] (1/4) Epoch 3549, batch 12, global_batch_idx: 56780, batch size: 152, loss[dur_loss=0.1778, prior_loss=0.9702, diff_loss=0.3185, tot_loss=1.467, over 152.00 samples.], tot_loss[dur_loss=0.1792, prior_loss=0.97, diff_loss=0.3508, tot_loss=1.5, over 1966.00 samples.],
2024-10-22 00:03:36,490 INFO [train.py:682] (1/4) Start epoch 3550
2024-10-22 00:03:53,412 INFO [train.py:561] (1/4) Epoch 3550, batch 6, global_batch_idx: 56790, batch size: 106, loss[dur_loss=0.1806, prior_loss=0.9701, diff_loss=0.2867, tot_loss=1.437, over 106.00 samples.], tot_loss[dur_loss=0.1771, prior_loss=0.9698, diff_loss=0.4018, tot_loss=1.549, over 1142.00 samples.],
2024-10-22 00:04:06,489 INFO [train.py:682] (1/4) Start epoch 3551
2024-10-22 00:04:15,488 INFO [train.py:561] (1/4) Epoch 3551, batch 0, global_batch_idx: 56800, batch size: 108, loss[dur_loss=0.1844, prior_loss=0.9707, diff_loss=0.29, tot_loss=1.445, over 108.00 samples.], tot_loss[dur_loss=0.1844, prior_loss=0.9707, diff_loss=0.29, tot_loss=1.445, over 108.00 samples.],
2024-10-22 00:04:29,600 INFO [train.py:561] (1/4) Epoch 3551, batch 10, global_batch_idx: 56810, batch size: 111, loss[dur_loss=0.183, prior_loss=0.9712, diff_loss=0.3058, tot_loss=1.46, over 111.00 samples.], tot_loss[dur_loss=0.1803, prior_loss=0.9701, diff_loss=0.3594, tot_loss=1.51, over 1656.00 samples.],
2024-10-22 00:04:36,746 INFO [train.py:682] (1/4) Start epoch 3552
2024-10-22 00:04:50,683 INFO [train.py:561] (1/4) Epoch 3552, batch 4, global_batch_idx: 56820, batch size: 189, loss[dur_loss=0.1788, prior_loss=0.9704, diff_loss=0.3002, tot_loss=1.449, over 189.00 samples.], tot_loss[dur_loss=0.1774, prior_loss=0.9696, diff_loss=0.4025, tot_loss=1.549, over 937.00 samples.],
2024-10-22 00:05:05,567 INFO [train.py:561] (1/4) Epoch 3552, batch 14, global_batch_idx: 56830, batch size: 142, loss[dur_loss=0.1815, prior_loss=0.97, diff_loss=0.2985, tot_loss=1.45, over 142.00 samples.], tot_loss[dur_loss=0.1798, prior_loss=0.9701, diff_loss=0.3427, tot_loss=1.493, over 2210.00 samples.],
2024-10-22 00:05:07,008 INFO [train.py:682] (1/4) Start epoch 3553
2024-10-22 00:05:27,403 INFO [train.py:561] (1/4) Epoch 3553, batch 8, global_batch_idx: 56840, batch size: 170, loss[dur_loss=0.1825, prior_loss=0.9705, diff_loss=0.3107, tot_loss=1.464, over 170.00 samples.], tot_loss[dur_loss=0.1783, prior_loss=0.9699, diff_loss=0.3621, tot_loss=1.51, over 1432.00 samples.],
2024-10-22 00:05:37,656 INFO [train.py:682] (1/4) Start epoch 3554
2024-10-22 00:05:49,063 INFO [train.py:561] (1/4) Epoch 3554, batch 2, global_batch_idx: 56850, batch size: 203, loss[dur_loss=0.1831, prior_loss=0.9703, diff_loss=0.2801, tot_loss=1.433, over 203.00 samples.], tot_loss[dur_loss=0.1814, prior_loss=0.9703, diff_loss=0.3018, tot_loss=1.454, over 442.00 samples.],
2024-10-22 00:06:03,285 INFO [train.py:561] (1/4) Epoch 3554, batch 12, global_batch_idx: 56860, batch size: 152, loss[dur_loss=0.1806, prior_loss=0.9703, diff_loss=0.2632, tot_loss=1.414, over 152.00 samples.], tot_loss[dur_loss=0.1798, prior_loss=0.9701, diff_loss=0.3418, tot_loss=1.492, over 1966.00 samples.],
2024-10-22 00:06:07,778 INFO [train.py:682] (1/4) Start epoch 3555
2024-10-22 00:06:25,120 INFO [train.py:561] (1/4) Epoch 3555, batch 6, global_batch_idx: 56870, batch size: 106, loss[dur_loss=0.1803, prior_loss=0.9701, diff_loss=0.2736, tot_loss=1.424, over 106.00 samples.], tot_loss[dur_loss=0.1778, prior_loss=0.9697, diff_loss=0.3881, tot_loss=1.536, over 1142.00 samples.],
2024-10-22 00:06:38,227 INFO [train.py:682] (1/4) Start epoch 3556
2024-10-22 00:06:47,088 INFO [train.py:561] (1/4) Epoch 3556, batch 0, global_batch_idx: 56880, batch size: 108, loss[dur_loss=0.1847, prior_loss=0.971, diff_loss=0.2544, tot_loss=1.41, over 108.00 samples.], tot_loss[dur_loss=0.1847, prior_loss=0.971, diff_loss=0.2544, tot_loss=1.41, over 108.00 samples.],
2024-10-22 00:07:01,262 INFO [train.py:561] (1/4) Epoch 3556, batch 10, global_batch_idx: 56890, batch size: 111, loss[dur_loss=0.1832, prior_loss=0.9714, diff_loss=0.3176, tot_loss=1.472, over 111.00 samples.], tot_loss[dur_loss=0.1797, prior_loss=0.97, diff_loss=0.3567, tot_loss=1.506, over 1656.00 samples.],
2024-10-22 00:07:08,402 INFO [train.py:682] (1/4) Start epoch 3557
2024-10-22 00:07:22,211 INFO [train.py:561] (1/4) Epoch 3557, batch 4, global_batch_idx: 56900, batch size: 189, loss[dur_loss=0.182, prior_loss=0.9703, diff_loss=0.3413, tot_loss=1.494, over 189.00 samples.], tot_loss[dur_loss=0.1777, prior_loss=0.9696, diff_loss=0.4071, tot_loss=1.554, over 937.00 samples.],
2024-10-22 00:07:37,091 INFO [train.py:561] (1/4) Epoch 3557, batch 14, global_batch_idx: 56910, batch size: 142, loss[dur_loss=0.1814, prior_loss=0.9702, diff_loss=0.3301, tot_loss=1.482, over 142.00 samples.], tot_loss[dur_loss=0.1796, prior_loss=0.9701, diff_loss=0.3366, tot_loss=1.486, over 2210.00 samples.],
2024-10-22 00:07:38,544 INFO [train.py:682] (1/4) Start epoch 3558
2024-10-22 00:07:58,638 INFO [train.py:561] (1/4) Epoch 3558, batch 8, global_batch_idx: 56920, batch size: 170, loss[dur_loss=0.183, prior_loss=0.9705, diff_loss=0.2627, tot_loss=1.416, over 170.00 samples.], tot_loss[dur_loss=0.1795, prior_loss=0.97, diff_loss=0.3655, tot_loss=1.515, over 1432.00 samples.],
2024-10-22 00:08:08,731 INFO [train.py:682] (1/4) Start epoch 3559
2024-10-22 00:08:20,366 INFO [train.py:561] (1/4) Epoch 3559, batch 2, global_batch_idx: 56930, batch size: 203, loss[dur_loss=0.1849, prior_loss=0.9703, diff_loss=0.3407, tot_loss=1.496, over 203.00 samples.], tot_loss[dur_loss=0.1826, prior_loss=0.9704, diff_loss=0.3272, tot_loss=1.48, over 442.00 samples.],
2024-10-22 00:08:34,556 INFO [train.py:561] (1/4) Epoch 3559, batch 12, global_batch_idx: 56940, batch size: 152, loss[dur_loss=0.1798, prior_loss=0.9705, diff_loss=0.3128, tot_loss=1.463, over 152.00 samples.], tot_loss[dur_loss=0.1806, prior_loss=0.9701, diff_loss=0.3545, tot_loss=1.505, over 1966.00 samples.],
2024-10-22 00:08:39,045 INFO [train.py:682] (1/4) Start epoch 3560
2024-10-22 00:08:56,078 INFO [train.py:561] (1/4) Epoch 3560, batch 6, global_batch_idx: 56950, batch size: 106, loss[dur_loss=0.1807, prior_loss=0.9702, diff_loss=0.3207, tot_loss=1.472, over 106.00 samples.], tot_loss[dur_loss=0.1779, prior_loss=0.9698, diff_loss=0.3969, tot_loss=1.545, over 1142.00 samples.],
2024-10-22 00:09:09,134 INFO [train.py:682] (1/4) Start epoch 3561
2024-10-22 00:09:18,001 INFO [train.py:561] (1/4) Epoch 3561, batch 0, global_batch_idx: 56960, batch size: 108, loss[dur_loss=0.1805, prior_loss=0.9707, diff_loss=0.2687, tot_loss=1.42, over 108.00 samples.], tot_loss[dur_loss=0.1805, prior_loss=0.9707, diff_loss=0.2687, tot_loss=1.42, over 108.00 samples.],
2024-10-22 00:09:32,211 INFO [train.py:561] (1/4) Epoch 3561, batch 10, global_batch_idx: 56970, batch size: 111, loss[dur_loss=0.1823, prior_loss=0.9711, diff_loss=0.31, tot_loss=1.463, over 111.00 samples.], tot_loss[dur_loss=0.1786, prior_loss=0.9699, diff_loss=0.3529, tot_loss=1.501, over 1656.00 samples.],
2024-10-22 00:09:39,293 INFO [train.py:682] (1/4) Start epoch 3562
2024-10-22 00:09:53,113 INFO [train.py:561] (1/4) Epoch 3562, batch 4, global_batch_idx: 56980, batch size: 189, loss[dur_loss=0.1823, prior_loss=0.9704, diff_loss=0.3465, tot_loss=1.499, over 189.00 samples.], tot_loss[dur_loss=0.1776, prior_loss=0.9696, diff_loss=0.4094, tot_loss=1.557, over 937.00 samples.],
2024-10-22 00:10:07,812 INFO [train.py:561] (1/4) Epoch 3562, batch 14, global_batch_idx: 56990, batch size: 142, loss[dur_loss=0.1826, prior_loss=0.97, diff_loss=0.2615, tot_loss=1.414, over 142.00 samples.], tot_loss[dur_loss=0.1798, prior_loss=0.9701, diff_loss=0.3484, tot_loss=1.498, over 2210.00 samples.],
2024-10-22 00:10:09,259 INFO [train.py:682] (1/4) Start epoch 3563
2024-10-22 00:10:29,243 INFO [train.py:561] (1/4) Epoch 3563, batch 8, global_batch_idx: 57000, batch size: 170, loss[dur_loss=0.1844, prior_loss=0.9707, diff_loss=0.3147, tot_loss=1.47, over 170.00 samples.], tot_loss[dur_loss=0.1791, prior_loss=0.9699, diff_loss=0.3783, tot_loss=1.527, over 1432.00 samples.],
2024-10-22 00:10:30,713 INFO [train.py:579] (1/4) Computing validation loss
2024-10-22 00:10:58,410 INFO [train.py:589] (1/4) Epoch 3563, validation: dur_loss=0.453, prior_loss=1.037, diff_loss=0.3498, tot_loss=1.839, over 100.00 samples.
2024-10-22 00:10:58,411 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB
2024-10-22 00:11:07,034 INFO [train.py:682] (1/4) Start epoch 3564
2024-10-22 00:11:18,982 INFO [train.py:561] (1/4) Epoch 3564, batch 2, global_batch_idx: 57010, batch size: 203, loss[dur_loss=0.1829, prior_loss=0.9702, diff_loss=0.3285, tot_loss=1.482, over 203.00 samples.], tot_loss[dur_loss=0.1819, prior_loss=0.9703, diff_loss=0.3109, tot_loss=1.463, over 442.00 samples.],
2024-10-22 00:11:33,193 INFO [train.py:561] (1/4) Epoch 3564, batch 12, global_batch_idx: 57020, batch size: 152, loss[dur_loss=0.1804, prior_loss=0.9705, diff_loss=0.3351, tot_loss=1.486, over 152.00 samples.], tot_loss[dur_loss=0.1798, prior_loss=0.97, diff_loss=0.3562, tot_loss=1.506, over 1966.00 samples.],
2024-10-22 00:11:37,665 INFO [train.py:682] (1/4) Start epoch 3565
2024-10-22 00:11:54,877 INFO [train.py:561] (1/4) Epoch 3565, batch 6, global_batch_idx: 57030, batch size: 106, loss[dur_loss=0.1789, prior_loss=0.9701, diff_loss=0.3057, tot_loss=1.455, over 106.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9697, diff_loss=0.3858, tot_loss=1.532, over 1142.00 samples.],
2024-10-22 00:12:07,855 INFO [train.py:682] (1/4) Start epoch 3566
2024-10-22 00:12:16,694 INFO [train.py:561] (1/4) Epoch 3566, batch 0, global_batch_idx: 57040, batch size: 108, loss[dur_loss=0.1807, prior_loss=0.9706, diff_loss=0.2833, tot_loss=1.435, over 108.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9706, diff_loss=0.2833, tot_loss=1.435, over 108.00 samples.],
2024-10-22 00:12:30,977 INFO [train.py:561] (1/4) Epoch 3566, batch 10, global_batch_idx: 57050, batch size: 111, loss[dur_loss=0.1819, prior_loss=0.9712, diff_loss=0.2783, tot_loss=1.431, over 111.00 samples.], tot_loss[dur_loss=0.179, prior_loss=0.9699, diff_loss=0.362, tot_loss=1.511, over 1656.00 samples.],
2024-10-22 00:12:38,077 INFO [train.py:682] (1/4) Start epoch 3567
2024-10-22 00:12:52,527 INFO [train.py:561] (1/4) Epoch 3567, batch 4, global_batch_idx: 57060, batch size: 189, loss[dur_loss=0.1802, prior_loss=0.9703, diff_loss=0.2769, tot_loss=1.427, over 189.00 samples.], tot_loss[dur_loss=0.1776, prior_loss=0.9695, diff_loss=0.4009, tot_loss=1.548, over 937.00 samples.],
2024-10-22 00:13:07,341 INFO [train.py:561] (1/4) Epoch 3567, batch 14, global_batch_idx: 57070, batch size: 142, loss[dur_loss=0.179, prior_loss=0.9702, diff_loss=0.273, tot_loss=1.422, over 142.00 samples.], tot_loss[dur_loss=0.1793, prior_loss=0.97, diff_loss=0.3386, tot_loss=1.488, over 2210.00 samples.],
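Since every training record follows the same fixed pattern, the loss curve can be recovered from the log mechanically. A small helper for doing so (hypothetical tooling written for this log format, not part of the training code; it assumes one record per line, as above):

    import re

    # Matches a per-batch training record; "Start epoch" lines and the
    # validation summaries simply fail the match and are skipped.
    RECORD = re.compile(
        r"Epoch (?P<epoch>\d+), batch (?P<batch>\d+), "
        r"global_batch_idx: (?P<step>\d+), batch size: \d+, "
        r"loss\[.*?tot_loss=(?P<tot>[\d.]+), over [\d.]+ samples\.\]"
    )

    def tot_loss_curve(path):
        """Yield (global_batch_idx, per-batch tot_loss) pairs."""
        with open(path) as f:
            for line in f:
                m = RECORD.search(line)
                if m:
                    yield int(m.group("step")), float(m.group("tot"))

The lazy ".*?" keeps the match inside the first loss[...] bracket, so the extracted value is the current-batch tot_loss rather than the running epoch average.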
2024-10-22 00:13:08,764 INFO [train.py:682] (1/4) Start epoch 3568
2024-10-22 00:13:28,867 INFO [train.py:561] (1/4) Epoch 3568, batch 8, global_batch_idx: 57080, batch size: 170, loss[dur_loss=0.186, prior_loss=0.9703, diff_loss=0.31, tot_loss=1.466, over 170.00 samples.], tot_loss[dur_loss=0.1794, prior_loss=0.9698, diff_loss=0.3716, tot_loss=1.521, over 1432.00 samples.],
2024-10-22 00:13:38,898 INFO [train.py:682] (1/4) Start epoch 3569
2024-10-22 00:13:50,411 INFO [train.py:561] (1/4) Epoch 3569, batch 2, global_batch_idx: 57090, batch size: 203, loss[dur_loss=0.1804, prior_loss=0.9701, diff_loss=0.2727, tot_loss=1.423, over 203.00 samples.], tot_loss[dur_loss=0.1798, prior_loss=0.9702, diff_loss=0.2915, tot_loss=1.442, over 442.00 samples.],
2024-10-22 00:14:04,540 INFO [train.py:561] (1/4) Epoch 3569, batch 12, global_batch_idx: 57100, batch size: 152, loss[dur_loss=0.1788, prior_loss=0.9703, diff_loss=0.2972, tot_loss=1.446, over 152.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.97, diff_loss=0.3506, tot_loss=1.499, over 1966.00 samples.],
2024-10-22 00:14:08,917 INFO [train.py:682] (1/4) Start epoch 3570
2024-10-22 00:14:26,017 INFO [train.py:561] (1/4) Epoch 3570, batch 6, global_batch_idx: 57110, batch size: 106, loss[dur_loss=0.1815, prior_loss=0.9701, diff_loss=0.2924, tot_loss=1.444, over 106.00 samples.], tot_loss[dur_loss=0.1783, prior_loss=0.9697, diff_loss=0.3874, tot_loss=1.535, over 1142.00 samples.],
2024-10-22 00:14:38,931 INFO [train.py:682] (1/4) Start epoch 3571
2024-10-22 00:14:47,631 INFO [train.py:561] (1/4) Epoch 3571, batch 0, global_batch_idx: 57120, batch size: 108, loss[dur_loss=0.1831, prior_loss=0.9708, diff_loss=0.2754, tot_loss=1.429, over 108.00 samples.], tot_loss[dur_loss=0.1831, prior_loss=0.9708, diff_loss=0.2754, tot_loss=1.429, over 108.00 samples.],
2024-10-22 00:15:02,046 INFO [train.py:561] (1/4) Epoch 3571, batch 10, global_batch_idx: 57130, batch size: 111, loss[dur_loss=0.1805, prior_loss=0.9713, diff_loss=0.2787, tot_loss=1.43, over 111.00 samples.], tot_loss[dur_loss=0.1799, prior_loss=0.97, diff_loss=0.3622, tot_loss=1.512, over 1656.00 samples.],
2024-10-22 00:15:09,170 INFO [train.py:682] (1/4) Start epoch 3572
2024-10-22 00:15:23,209 INFO [train.py:561] (1/4) Epoch 3572, batch 4, global_batch_idx: 57140, batch size: 189, loss[dur_loss=0.1801, prior_loss=0.9703, diff_loss=0.3364, tot_loss=1.487, over 189.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9696, diff_loss=0.4146, tot_loss=1.562, over 937.00 samples.],
2024-10-22 00:15:38,165 INFO [train.py:561] (1/4) Epoch 3572, batch 14, global_batch_idx: 57150, batch size: 142, loss[dur_loss=0.1835, prior_loss=0.9702, diff_loss=0.3082, tot_loss=1.462, over 142.00 samples.], tot_loss[dur_loss=0.1806, prior_loss=0.9701, diff_loss=0.3457, tot_loss=1.496, over 2210.00 samples.],
2024-10-22 00:15:39,585 INFO [train.py:682] (1/4) Start epoch 3573
2024-10-22 00:15:59,757 INFO [train.py:561] (1/4) Epoch 3573, batch 8, global_batch_idx: 57160, batch size: 170, loss[dur_loss=0.1809, prior_loss=0.9704, diff_loss=0.3238, tot_loss=1.475, over 170.00 samples.], tot_loss[dur_loss=0.1791, prior_loss=0.9699, diff_loss=0.3735, tot_loss=1.523, over 1432.00 samples.],
2024-10-22 00:16:09,846 INFO [train.py:682] (1/4) Start epoch 3574
2024-10-22 00:16:21,345 INFO [train.py:561] (1/4) Epoch 3574, batch 2, global_batch_idx: 57170, batch size: 203, loss[dur_loss=0.1816, prior_loss=0.9703, diff_loss=0.3286, tot_loss=1.48, over 203.00 samples.], tot_loss[dur_loss=0.1814, prior_loss=0.9703, diff_loss=0.2999, tot_loss=1.452, over 442.00 samples.],
2024-10-22 00:16:35,542 INFO [train.py:561] (1/4) Epoch 3574, batch 12, global_batch_idx: 57180, batch size: 152, loss[dur_loss=0.1763, prior_loss=0.9705, diff_loss=0.2989, tot_loss=1.446, over 152.00 samples.], tot_loss[dur_loss=0.1793, prior_loss=0.97, diff_loss=0.3457, tot_loss=1.495, over 1966.00 samples.],
2024-10-22 00:16:39,973 INFO [train.py:682] (1/4) Start epoch 3575
2024-10-22 00:16:57,216 INFO [train.py:561] (1/4) Epoch 3575, batch 6, global_batch_idx: 57190, batch size: 106, loss[dur_loss=0.1807, prior_loss=0.9701, diff_loss=0.3353, tot_loss=1.486, over 106.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9698, diff_loss=0.3827, tot_loss=1.531, over 1142.00 samples.],
2024-10-22 00:17:10,284 INFO [train.py:682] (1/4) Start epoch 3576
2024-10-22 00:17:19,336 INFO [train.py:561] (1/4) Epoch 3576, batch 0, global_batch_idx: 57200, batch size: 108, loss[dur_loss=0.1813, prior_loss=0.9709, diff_loss=0.3246, tot_loss=1.477, over 108.00 samples.], tot_loss[dur_loss=0.1813, prior_loss=0.9709, diff_loss=0.3246, tot_loss=1.477, over 108.00 samples.],
2024-10-22 00:17:33,713 INFO [train.py:561] (1/4) Epoch 3576, batch 10, global_batch_idx: 57210, batch size: 111, loss[dur_loss=0.1833, prior_loss=0.9714, diff_loss=0.3185, tot_loss=1.473, over 111.00 samples.], tot_loss[dur_loss=0.1793, prior_loss=0.97, diff_loss=0.3606, tot_loss=1.51, over 1656.00 samples.],
2024-10-22 00:17:40,779 INFO [train.py:682] (1/4) Start epoch 3577
2024-10-22 00:17:54,681 INFO [train.py:561] (1/4) Epoch 3577, batch 4, global_batch_idx: 57220, batch size: 189, loss[dur_loss=0.1811, prior_loss=0.9703, diff_loss=0.3576, tot_loss=1.509, over 189.00 samples.], tot_loss[dur_loss=0.1774, prior_loss=0.9696, diff_loss=0.423, tot_loss=1.57, over 937.00 samples.],
2024-10-22 00:18:09,700 INFO [train.py:561] (1/4) Epoch 3577, batch 14, global_batch_idx: 57230, batch size: 142, loss[dur_loss=0.1829, prior_loss=0.9702, diff_loss=0.288, tot_loss=1.441, over 142.00 samples.], tot_loss[dur_loss=0.1797, prior_loss=0.9701, diff_loss=0.3464, tot_loss=1.496, over 2210.00 samples.],
2024-10-22 00:18:11,166 INFO [train.py:682] (1/4) Start epoch 3578
2024-10-22 00:18:31,354 INFO [train.py:561] (1/4) Epoch 3578, batch 8, global_batch_idx: 57240, batch size: 170, loss[dur_loss=0.1838, prior_loss=0.9705, diff_loss=0.3248, tot_loss=1.479, over 170.00 samples.], tot_loss[dur_loss=0.1795, prior_loss=0.9699, diff_loss=0.3599, tot_loss=1.509, over 1432.00 samples.],
2024-10-22 00:18:41,442 INFO [train.py:682] (1/4) Start epoch 3579
2024-10-22 00:18:52,901 INFO [train.py:561] (1/4) Epoch 3579, batch 2, global_batch_idx: 57250, batch size: 203, loss[dur_loss=0.1808, prior_loss=0.9702, diff_loss=0.3339, tot_loss=1.485, over 203.00 samples.], tot_loss[dur_loss=0.1798, prior_loss=0.9702, diff_loss=0.315, tot_loss=1.465, over 442.00 samples.],
2024-10-22 00:19:07,153 INFO [train.py:561] (1/4) Epoch 3579, batch 12, global_batch_idx: 57260, batch size: 152, loss[dur_loss=0.1814, prior_loss=0.9705, diff_loss=0.2793, tot_loss=1.431, over 152.00 samples.], tot_loss[dur_loss=0.1796, prior_loss=0.9701, diff_loss=0.3623, tot_loss=1.512, over 1966.00 samples.],
2024-10-22 00:19:11,606 INFO [train.py:682] (1/4) Start epoch 3580
2024-10-22 00:19:28,718 INFO [train.py:561] (1/4) Epoch 3580, batch 6, global_batch_idx: 57270, batch size: 106, loss[dur_loss=0.1813, prior_loss=0.9703, diff_loss=0.2944, tot_loss=1.446, over 106.00 samples.], tot_loss[dur_loss=0.1775, prior_loss=0.9697, diff_loss=0.3834, tot_loss=1.531, over 1142.00 samples.],
2024-10-22 00:19:41,702 INFO [train.py:682] (1/4) Start epoch 3581
2024-10-22 00:19:50,637 INFO [train.py:561] (1/4) Epoch 3581, batch 0, global_batch_idx: 57280, batch size: 108, loss[dur_loss=0.1837, prior_loss=0.9707, diff_loss=0.2956, tot_loss=1.45, over 108.00 samples.], tot_loss[dur_loss=0.1837, prior_loss=0.9707, diff_loss=0.2956, tot_loss=1.45, over 108.00 samples.],
2024-10-22 00:20:04,906 INFO [train.py:561] (1/4) Epoch 3581, batch 10, global_batch_idx: 57290, batch size: 111, loss[dur_loss=0.1841, prior_loss=0.9713, diff_loss=0.2407, tot_loss=1.396, over 111.00 samples.], tot_loss[dur_loss=0.1795, prior_loss=0.97, diff_loss=0.365, tot_loss=1.514, over 1656.00 samples.],
2024-10-22 00:20:12,018 INFO [train.py:682] (1/4) Start epoch 3582
2024-10-22 00:20:25,870 INFO [train.py:561] (1/4) Epoch 3582, batch 4, global_batch_idx: 57300, batch size: 189, loss[dur_loss=0.1804, prior_loss=0.9704, diff_loss=0.3118, tot_loss=1.463, over 189.00 samples.], tot_loss[dur_loss=0.1765, prior_loss=0.9695, diff_loss=0.407, tot_loss=1.553, over 937.00 samples.],
2024-10-22 00:20:40,735 INFO [train.py:561] (1/4) Epoch 3582, batch 14, global_batch_idx: 57310, batch size: 142, loss[dur_loss=0.1821, prior_loss=0.9702, diff_loss=0.2855, tot_loss=1.438, over 142.00 samples.], tot_loss[dur_loss=0.1796, prior_loss=0.9701, diff_loss=0.3424, tot_loss=1.492, over 2210.00 samples.],
2024-10-22 00:20:42,149 INFO [train.py:682] (1/4) Start epoch 3583
2024-10-22 00:21:02,044 INFO [train.py:561] (1/4) Epoch 3583, batch 8, global_batch_idx: 57320, batch size: 170, loss[dur_loss=0.1829, prior_loss=0.9704, diff_loss=0.3023, tot_loss=1.456, over 170.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9698, diff_loss=0.3749, tot_loss=1.523, over 1432.00 samples.],
2024-10-22 00:21:12,182 INFO [train.py:682] (1/4) Start epoch 3584
2024-10-22 00:21:23,854 INFO [train.py:561] (1/4) Epoch 3584, batch 2, global_batch_idx: 57330, batch size: 203, loss[dur_loss=0.1826, prior_loss=0.9704, diff_loss=0.3646, tot_loss=1.518, over 203.00 samples.], tot_loss[dur_loss=0.1828, prior_loss=0.9704, diff_loss=0.3184, tot_loss=1.472, over 442.00 samples.],
2024-10-22 00:21:38,149 INFO [train.py:561] (1/4) Epoch 3584, batch 12, global_batch_idx: 57340, batch size: 152, loss[dur_loss=0.1804, prior_loss=0.9705, diff_loss=0.3124, tot_loss=1.463, over 152.00 samples.], tot_loss[dur_loss=0.181, prior_loss=0.9701, diff_loss=0.355, tot_loss=1.506, over 1966.00 samples.],
2024-10-22 00:21:42,562 INFO [train.py:682] (1/4) Start epoch 3585
2024-10-22 00:21:59,866 INFO [train.py:561] (1/4) Epoch 3585, batch 6, global_batch_idx: 57350, batch size: 106, loss[dur_loss=0.1812, prior_loss=0.9703, diff_loss=0.3132, tot_loss=1.465, over 106.00 samples.], tot_loss[dur_loss=0.1797, prior_loss=0.9698, diff_loss=0.396, tot_loss=1.545, over 1142.00 samples.],
2024-10-22 00:22:13,048 INFO [train.py:682] (1/4) Start epoch 3586
2024-10-22 00:22:21,862 INFO [train.py:561] (1/4) Epoch 3586, batch 0, global_batch_idx: 57360, batch size: 108, loss[dur_loss=0.1851, prior_loss=0.9707, diff_loss=0.271, tot_loss=1.427, over 108.00 samples.], tot_loss[dur_loss=0.1851, prior_loss=0.9707, diff_loss=0.271, tot_loss=1.427, over 108.00 samples.],
2024-10-22 00:22:36,192 INFO [train.py:561] (1/4) Epoch 3586, batch 10, global_batch_idx: 57370, batch size: 111, loss[dur_loss=0.1825, prior_loss=0.9713, diff_loss=0.29, tot_loss=1.444, over 111.00 samples.], tot_loss[dur_loss=0.1796,
prior_loss=0.97, diff_loss=0.3521, tot_loss=1.502, over 1656.00 samples.], 2024-10-22 00:22:43,321 INFO [train.py:682] (1/4) Start epoch 3587 2024-10-22 00:22:57,363 INFO [train.py:561] (1/4) Epoch 3587, batch 4, global_batch_idx: 57380, batch size: 189, loss[dur_loss=0.1797, prior_loss=0.9703, diff_loss=0.3291, tot_loss=1.479, over 189.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9696, diff_loss=0.4186, tot_loss=1.567, over 937.00 samples.], 2024-10-22 00:23:12,342 INFO [train.py:561] (1/4) Epoch 3587, batch 14, global_batch_idx: 57390, batch size: 142, loss[dur_loss=0.182, prior_loss=0.9701, diff_loss=0.2794, tot_loss=1.432, over 142.00 samples.], tot_loss[dur_loss=0.1802, prior_loss=0.9701, diff_loss=0.3422, tot_loss=1.493, over 2210.00 samples.], 2024-10-22 00:23:13,794 INFO [train.py:682] (1/4) Start epoch 3588 2024-10-22 00:23:33,806 INFO [train.py:561] (1/4) Epoch 3588, batch 8, global_batch_idx: 57400, batch size: 170, loss[dur_loss=0.184, prior_loss=0.9705, diff_loss=0.2825, tot_loss=1.437, over 170.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9699, diff_loss=0.3563, tot_loss=1.505, over 1432.00 samples.], 2024-10-22 00:23:44,017 INFO [train.py:682] (1/4) Start epoch 3589 2024-10-22 00:23:55,383 INFO [train.py:561] (1/4) Epoch 3589, batch 2, global_batch_idx: 57410, batch size: 203, loss[dur_loss=0.1816, prior_loss=0.9702, diff_loss=0.3154, tot_loss=1.467, over 203.00 samples.], tot_loss[dur_loss=0.1813, prior_loss=0.9703, diff_loss=0.3059, tot_loss=1.457, over 442.00 samples.], 2024-10-22 00:24:09,728 INFO [train.py:561] (1/4) Epoch 3589, batch 12, global_batch_idx: 57420, batch size: 152, loss[dur_loss=0.1783, prior_loss=0.9704, diff_loss=0.348, tot_loss=1.497, over 152.00 samples.], tot_loss[dur_loss=0.1792, prior_loss=0.9699, diff_loss=0.3514, tot_loss=1.501, over 1966.00 samples.], 2024-10-22 00:24:14,186 INFO [train.py:682] (1/4) Start epoch 3590 2024-10-22 00:24:31,638 INFO [train.py:561] (1/4) Epoch 3590, batch 6, global_batch_idx: 57430, batch size: 106, loss[dur_loss=0.178, prior_loss=0.9698, diff_loss=0.2558, tot_loss=1.404, over 106.00 samples.], tot_loss[dur_loss=0.1771, prior_loss=0.9696, diff_loss=0.3787, tot_loss=1.525, over 1142.00 samples.], 2024-10-22 00:24:44,781 INFO [train.py:682] (1/4) Start epoch 3591 2024-10-22 00:24:53,521 INFO [train.py:561] (1/4) Epoch 3591, batch 0, global_batch_idx: 57440, batch size: 108, loss[dur_loss=0.1797, prior_loss=0.9706, diff_loss=0.2691, tot_loss=1.419, over 108.00 samples.], tot_loss[dur_loss=0.1797, prior_loss=0.9706, diff_loss=0.2691, tot_loss=1.419, over 108.00 samples.], 2024-10-22 00:25:07,875 INFO [train.py:561] (1/4) Epoch 3591, batch 10, global_batch_idx: 57450, batch size: 111, loss[dur_loss=0.1819, prior_loss=0.9713, diff_loss=0.3096, tot_loss=1.463, over 111.00 samples.], tot_loss[dur_loss=0.1788, prior_loss=0.9698, diff_loss=0.3524, tot_loss=1.501, over 1656.00 samples.], 2024-10-22 00:25:15,038 INFO [train.py:682] (1/4) Start epoch 3592 2024-10-22 00:25:28,875 INFO [train.py:561] (1/4) Epoch 3592, batch 4, global_batch_idx: 57460, batch size: 189, loss[dur_loss=0.1785, prior_loss=0.9701, diff_loss=0.2935, tot_loss=1.442, over 189.00 samples.], tot_loss[dur_loss=0.1759, prior_loss=0.9694, diff_loss=0.3959, tot_loss=1.541, over 937.00 samples.], 2024-10-22 00:25:43,910 INFO [train.py:561] (1/4) Epoch 3592, batch 14, global_batch_idx: 57470, batch size: 142, loss[dur_loss=0.1812, prior_loss=0.97, diff_loss=0.291, tot_loss=1.442, over 142.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9699, 
diff_loss=0.3326, tot_loss=1.481, over 2210.00 samples.], 2024-10-22 00:25:45,357 INFO [train.py:682] (1/4) Start epoch 3593 2024-10-22 00:26:05,715 INFO [train.py:561] (1/4) Epoch 3593, batch 8, global_batch_idx: 57480, batch size: 170, loss[dur_loss=0.1842, prior_loss=0.9704, diff_loss=0.3008, tot_loss=1.455, over 170.00 samples.], tot_loss[dur_loss=0.1791, prior_loss=0.9699, diff_loss=0.359, tot_loss=1.508, over 1432.00 samples.], 2024-10-22 00:26:16,028 INFO [train.py:682] (1/4) Start epoch 3594 2024-10-22 00:26:27,553 INFO [train.py:561] (1/4) Epoch 3594, batch 2, global_batch_idx: 57490, batch size: 203, loss[dur_loss=0.1835, prior_loss=0.9704, diff_loss=0.299, tot_loss=1.453, over 203.00 samples.], tot_loss[dur_loss=0.1817, prior_loss=0.9703, diff_loss=0.2993, tot_loss=1.451, over 442.00 samples.], 2024-10-22 00:26:42,004 INFO [train.py:561] (1/4) Epoch 3594, batch 12, global_batch_idx: 57500, batch size: 152, loss[dur_loss=0.1794, prior_loss=0.9704, diff_loss=0.2711, tot_loss=1.421, over 152.00 samples.], tot_loss[dur_loss=0.1795, prior_loss=0.97, diff_loss=0.3495, tot_loss=1.499, over 1966.00 samples.], 2024-10-22 00:26:46,531 INFO [train.py:682] (1/4) Start epoch 3595 2024-10-22 00:27:03,600 INFO [train.py:561] (1/4) Epoch 3595, batch 6, global_batch_idx: 57510, batch size: 106, loss[dur_loss=0.1789, prior_loss=0.97, diff_loss=0.2794, tot_loss=1.428, over 106.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9696, diff_loss=0.3891, tot_loss=1.536, over 1142.00 samples.], 2024-10-22 00:27:16,873 INFO [train.py:682] (1/4) Start epoch 3596 2024-10-22 00:27:26,115 INFO [train.py:561] (1/4) Epoch 3596, batch 0, global_batch_idx: 57520, batch size: 108, loss[dur_loss=0.1828, prior_loss=0.9706, diff_loss=0.2923, tot_loss=1.446, over 108.00 samples.], tot_loss[dur_loss=0.1828, prior_loss=0.9706, diff_loss=0.2923, tot_loss=1.446, over 108.00 samples.], 2024-10-22 00:27:40,495 INFO [train.py:561] (1/4) Epoch 3596, batch 10, global_batch_idx: 57530, batch size: 111, loss[dur_loss=0.1801, prior_loss=0.9711, diff_loss=0.322, tot_loss=1.473, over 111.00 samples.], tot_loss[dur_loss=0.1776, prior_loss=0.9698, diff_loss=0.3642, tot_loss=1.512, over 1656.00 samples.], 2024-10-22 00:27:47,619 INFO [train.py:682] (1/4) Start epoch 3597 2024-10-22 00:28:01,444 INFO [train.py:561] (1/4) Epoch 3597, batch 4, global_batch_idx: 57540, batch size: 189, loss[dur_loss=0.179, prior_loss=0.97, diff_loss=0.3306, tot_loss=1.48, over 189.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9694, diff_loss=0.4107, tot_loss=1.558, over 937.00 samples.], 2024-10-22 00:28:16,482 INFO [train.py:561] (1/4) Epoch 3597, batch 14, global_batch_idx: 57550, batch size: 142, loss[dur_loss=0.1812, prior_loss=0.9701, diff_loss=0.3222, tot_loss=1.473, over 142.00 samples.], tot_loss[dur_loss=0.1797, prior_loss=0.9699, diff_loss=0.3491, tot_loss=1.499, over 2210.00 samples.], 2024-10-22 00:28:17,910 INFO [train.py:682] (1/4) Start epoch 3598 2024-10-22 00:28:38,059 INFO [train.py:561] (1/4) Epoch 3598, batch 8, global_batch_idx: 57560, batch size: 170, loss[dur_loss=0.1836, prior_loss=0.9705, diff_loss=0.3285, tot_loss=1.483, over 170.00 samples.], tot_loss[dur_loss=0.179, prior_loss=0.9697, diff_loss=0.3734, tot_loss=1.522, over 1432.00 samples.], 2024-10-22 00:28:48,205 INFO [train.py:682] (1/4) Start epoch 3599 2024-10-22 00:28:59,811 INFO [train.py:561] (1/4) Epoch 3599, batch 2, global_batch_idx: 57570, batch size: 203, loss[dur_loss=0.1807, prior_loss=0.9701, diff_loss=0.294, tot_loss=1.445, over 203.00 samples.], 
tot_loss[dur_loss=0.1811, prior_loss=0.9702, diff_loss=0.2751, tot_loss=1.426, over 442.00 samples.], 2024-10-22 00:29:14,029 INFO [train.py:561] (1/4) Epoch 3599, batch 12, global_batch_idx: 57580, batch size: 152, loss[dur_loss=0.1792, prior_loss=0.9703, diff_loss=0.2772, tot_loss=1.427, over 152.00 samples.], tot_loss[dur_loss=0.179, prior_loss=0.9699, diff_loss=0.3418, tot_loss=1.491, over 1966.00 samples.], 2024-10-22 00:29:18,511 INFO [train.py:682] (1/4) Start epoch 3600 2024-10-22 00:29:35,796 INFO [train.py:561] (1/4) Epoch 3600, batch 6, global_batch_idx: 57590, batch size: 106, loss[dur_loss=0.1784, prior_loss=0.97, diff_loss=0.3064, tot_loss=1.455, over 106.00 samples.], tot_loss[dur_loss=0.1783, prior_loss=0.9696, diff_loss=0.3904, tot_loss=1.538, over 1142.00 samples.], 2024-10-22 00:29:48,894 INFO [train.py:682] (1/4) Start epoch 3601 2024-10-22 00:29:57,747 INFO [train.py:561] (1/4) Epoch 3601, batch 0, global_batch_idx: 57600, batch size: 108, loss[dur_loss=0.181, prior_loss=0.9707, diff_loss=0.298, tot_loss=1.45, over 108.00 samples.], tot_loss[dur_loss=0.181, prior_loss=0.9707, diff_loss=0.298, tot_loss=1.45, over 108.00 samples.], 2024-10-22 00:30:11,940 INFO [train.py:561] (1/4) Epoch 3601, batch 10, global_batch_idx: 57610, batch size: 111, loss[dur_loss=0.1819, prior_loss=0.971, diff_loss=0.2402, tot_loss=1.393, over 111.00 samples.], tot_loss[dur_loss=0.1782, prior_loss=0.9698, diff_loss=0.3534, tot_loss=1.501, over 1656.00 samples.], 2024-10-22 00:30:19,057 INFO [train.py:682] (1/4) Start epoch 3602 2024-10-22 00:30:32,716 INFO [train.py:561] (1/4) Epoch 3602, batch 4, global_batch_idx: 57620, batch size: 189, loss[dur_loss=0.1793, prior_loss=0.9701, diff_loss=0.3245, tot_loss=1.474, over 189.00 samples.], tot_loss[dur_loss=0.1761, prior_loss=0.9694, diff_loss=0.4145, tot_loss=1.56, over 937.00 samples.], 2024-10-22 00:30:47,622 INFO [train.py:561] (1/4) Epoch 3602, batch 14, global_batch_idx: 57630, batch size: 142, loss[dur_loss=0.1808, prior_loss=0.97, diff_loss=0.2994, tot_loss=1.45, over 142.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9699, diff_loss=0.3416, tot_loss=1.49, over 2210.00 samples.], 2024-10-22 00:30:49,041 INFO [train.py:682] (1/4) Start epoch 3603 2024-10-22 00:31:08,851 INFO [train.py:561] (1/4) Epoch 3603, batch 8, global_batch_idx: 57640, batch size: 170, loss[dur_loss=0.1836, prior_loss=0.9702, diff_loss=0.2997, tot_loss=1.453, over 170.00 samples.], tot_loss[dur_loss=0.1787, prior_loss=0.9697, diff_loss=0.3686, tot_loss=1.517, over 1432.00 samples.], 2024-10-22 00:31:18,968 INFO [train.py:682] (1/4) Start epoch 3604 2024-10-22 00:31:30,553 INFO [train.py:561] (1/4) Epoch 3604, batch 2, global_batch_idx: 57650, batch size: 203, loss[dur_loss=0.1805, prior_loss=0.9701, diff_loss=0.3027, tot_loss=1.453, over 203.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9702, diff_loss=0.2862, tot_loss=1.437, over 442.00 samples.], 2024-10-22 00:31:44,749 INFO [train.py:561] (1/4) Epoch 3604, batch 12, global_batch_idx: 57660, batch size: 152, loss[dur_loss=0.1766, prior_loss=0.9702, diff_loss=0.3032, tot_loss=1.45, over 152.00 samples.], tot_loss[dur_loss=0.1791, prior_loss=0.97, diff_loss=0.3468, tot_loss=1.496, over 1966.00 samples.], 2024-10-22 00:31:49,206 INFO [train.py:682] (1/4) Start epoch 3605 2024-10-22 00:32:06,150 INFO [train.py:561] (1/4) Epoch 3605, batch 6, global_batch_idx: 57670, batch size: 106, loss[dur_loss=0.1802, prior_loss=0.9699, diff_loss=0.2392, tot_loss=1.389, over 106.00 samples.], tot_loss[dur_loss=0.1771, 
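Note: in the train.py:561 records above, tot_loss is the sum of dur_loss, prior_loss and diff_loss; for the epoch 3568, batch 8 entry, 0.186 + 0.9703 + 0.31 = 1.4663, matching the logged tot_loss=1.466 up to rounding. A minimal Python sketch that checks this invariant over a log like this one; the regex and the file name "train.log" are illustrative assumptions, not part of the training code:

import re

# Matches both bracketed groups in a train.py:561 record, e.g.
#   loss[dur_loss=0.186, prior_loss=0.9703, diff_loss=0.31, tot_loss=1.466, over 170.00 samples.]
LOSS_RE = re.compile(
    r"loss\[dur_loss=(?P<dur>[\d.]+), prior_loss=(?P<prior>[\d.]+), "
    r"diff_loss=(?P<diff>[\d.]+), tot_loss=(?P<tot>[\d.]+), over (?P<n>[\d.]+) samples\.\]"
)

def check_line(line: str, tol: float = 5e-3) -> None:
    # Each record carries the current batch ("loss[...]") and the running
    # epoch aggregate ("tot_loss[...]"); the sum invariant holds for both.
    for m in LOSS_RE.finditer(line):
        parts = float(m["dur"]) + float(m["prior"]) + float(m["diff"])
        assert abs(parts - float(m["tot"])) < tol, line

with open("train.log") as f:  # illustrative path
    for line in f:
        if "train.py:561" in line:
            check_line(line)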
2024-10-22 00:32:06,150 INFO [train.py:561] (1/4) Epoch 3605, batch 6, global_batch_idx: 57670, batch size: 106, loss[dur_loss=0.1802, prior_loss=0.9699, diff_loss=0.2392, tot_loss=1.389, over 106.00 samples.], tot_loss[dur_loss=0.1771, prior_loss=0.9696, diff_loss=0.3803, tot_loss=1.527, over 1142.00 samples.],
2024-10-22 00:32:19,240 INFO [train.py:682] (1/4) Start epoch 3606
2024-10-22 00:32:28,086 INFO [train.py:561] (1/4) Epoch 3606, batch 0, global_batch_idx: 57680, batch size: 108, loss[dur_loss=0.1841, prior_loss=0.9706, diff_loss=0.3359, tot_loss=1.491, over 108.00 samples.], tot_loss[dur_loss=0.1841, prior_loss=0.9706, diff_loss=0.3359, tot_loss=1.491, over 108.00 samples.],
2024-10-22 00:32:42,356 INFO [train.py:561] (1/4) Epoch 3606, batch 10, global_batch_idx: 57690, batch size: 111, loss[dur_loss=0.1815, prior_loss=0.9711, diff_loss=0.3019, tot_loss=1.455, over 111.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9699, diff_loss=0.3612, tot_loss=1.509, over 1656.00 samples.],
2024-10-22 00:32:49,458 INFO [train.py:682] (1/4) Start epoch 3607
2024-10-22 00:33:03,234 INFO [train.py:561] (1/4) Epoch 3607, batch 4, global_batch_idx: 57700, batch size: 189, loss[dur_loss=0.1794, prior_loss=0.97, diff_loss=0.3218, tot_loss=1.471, over 189.00 samples.], tot_loss[dur_loss=0.1769, prior_loss=0.9694, diff_loss=0.4102, tot_loss=1.557, over 937.00 samples.],
2024-10-22 00:33:18,099 INFO [train.py:561] (1/4) Epoch 3607, batch 14, global_batch_idx: 57710, batch size: 142, loss[dur_loss=0.1814, prior_loss=0.97, diff_loss=0.2608, tot_loss=1.412, over 142.00 samples.], tot_loss[dur_loss=0.1796, prior_loss=0.9699, diff_loss=0.3451, tot_loss=1.495, over 2210.00 samples.],
2024-10-22 00:33:19,516 INFO [train.py:682] (1/4) Start epoch 3608
2024-10-22 00:33:39,730 INFO [train.py:561] (1/4) Epoch 3608, batch 8, global_batch_idx: 57720, batch size: 170, loss[dur_loss=0.1853, prior_loss=0.9704, diff_loss=0.3005, tot_loss=1.456, over 170.00 samples.], tot_loss[dur_loss=0.179, prior_loss=0.9698, diff_loss=0.3614, tot_loss=1.51, over 1432.00 samples.],
2024-10-22 00:33:49,932 INFO [train.py:682] (1/4) Start epoch 3609
2024-10-22 00:34:01,416 INFO [train.py:561] (1/4) Epoch 3609, batch 2, global_batch_idx: 57730, batch size: 203, loss[dur_loss=0.18, prior_loss=0.9701, diff_loss=0.3135, tot_loss=1.464, over 203.00 samples.], tot_loss[dur_loss=0.1798, prior_loss=0.9703, diff_loss=0.2981, tot_loss=1.448, over 442.00 samples.],
2024-10-22 00:34:15,737 INFO [train.py:561] (1/4) Epoch 3609, batch 12, global_batch_idx: 57740, batch size: 152, loss[dur_loss=0.18, prior_loss=0.9703, diff_loss=0.2957, tot_loss=1.446, over 152.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.97, diff_loss=0.3479, tot_loss=1.497, over 1966.00 samples.],
2024-10-22 00:34:20,254 INFO [train.py:682] (1/4) Start epoch 3610
2024-10-22 00:34:37,378 INFO [train.py:561] (1/4) Epoch 3610, batch 6, global_batch_idx: 57750, batch size: 106, loss[dur_loss=0.1781, prior_loss=0.97, diff_loss=0.2702, tot_loss=1.418, over 106.00 samples.], tot_loss[dur_loss=0.1767, prior_loss=0.9697, diff_loss=0.3821, tot_loss=1.528, over 1142.00 samples.],
2024-10-22 00:34:50,450 INFO [train.py:682] (1/4) Start epoch 3611
2024-10-22 00:34:59,114 INFO [train.py:561] (1/4) Epoch 3611, batch 0, global_batch_idx: 57760, batch size: 108, loss[dur_loss=0.1843, prior_loss=0.9707, diff_loss=0.3045, tot_loss=1.459, over 108.00 samples.], tot_loss[dur_loss=0.1843, prior_loss=0.9707, diff_loss=0.3045, tot_loss=1.459, over 108.00 samples.],
2024-10-22 00:35:13,467 INFO [train.py:561] (1/4) Epoch 3611, batch 10, global_batch_idx: 57770, batch size: 111, loss[dur_loss=0.1833, prior_loss=0.971, diff_loss=0.2761, tot_loss=1.43, over 111.00 samples.], tot_loss[dur_loss=0.1794, prior_loss=0.9698, diff_loss=0.3543, tot_loss=1.504, over 1656.00 samples.],
2024-10-22 00:35:20,684 INFO [train.py:682] (1/4) Start epoch 3612
2024-10-22 00:35:34,724 INFO [train.py:561] (1/4) Epoch 3612, batch 4, global_batch_idx: 57780, batch size: 189, loss[dur_loss=0.1793, prior_loss=0.9702, diff_loss=0.3416, tot_loss=1.491, over 189.00 samples.], tot_loss[dur_loss=0.1778, prior_loss=0.9696, diff_loss=0.408, tot_loss=1.555, over 937.00 samples.],
2024-10-22 00:35:49,781 INFO [train.py:561] (1/4) Epoch 3612, batch 14, global_batch_idx: 57790, batch size: 142, loss[dur_loss=0.1795, prior_loss=0.9701, diff_loss=0.2731, tot_loss=1.423, over 142.00 samples.], tot_loss[dur_loss=0.1791, prior_loss=0.97, diff_loss=0.3433, tot_loss=1.492, over 2210.00 samples.],
2024-10-22 00:35:51,219 INFO [train.py:682] (1/4) Start epoch 3613
2024-10-22 00:36:11,166 INFO [train.py:561] (1/4) Epoch 3613, batch 8, global_batch_idx: 57800, batch size: 170, loss[dur_loss=0.181, prior_loss=0.97, diff_loss=0.295, tot_loss=1.446, over 170.00 samples.], tot_loss[dur_loss=0.1785, prior_loss=0.9697, diff_loss=0.3584, tot_loss=1.507, over 1432.00 samples.],
2024-10-22 00:36:21,410 INFO [train.py:682] (1/4) Start epoch 3614
2024-10-22 00:36:32,945 INFO [train.py:561] (1/4) Epoch 3614, batch 2, global_batch_idx: 57810, batch size: 203, loss[dur_loss=0.1804, prior_loss=0.9703, diff_loss=0.3457, tot_loss=1.496, over 203.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9702, diff_loss=0.3127, tot_loss=1.464, over 442.00 samples.],
2024-10-22 00:36:47,289 INFO [train.py:561] (1/4) Epoch 3614, batch 12, global_batch_idx: 57820, batch size: 152, loss[dur_loss=0.1787, prior_loss=0.9703, diff_loss=0.3304, tot_loss=1.479, over 152.00 samples.], tot_loss[dur_loss=0.1782, prior_loss=0.9698, diff_loss=0.3481, tot_loss=1.496, over 1966.00 samples.],
2024-10-22 00:36:51,822 INFO [train.py:682] (1/4) Start epoch 3615
2024-10-22 00:37:08,843 INFO [train.py:561] (1/4) Epoch 3615, batch 6, global_batch_idx: 57830, batch size: 106, loss[dur_loss=0.177, prior_loss=0.9699, diff_loss=0.2778, tot_loss=1.425, over 106.00 samples.], tot_loss[dur_loss=0.1775, prior_loss=0.9696, diff_loss=0.3748, tot_loss=1.522, over 1142.00 samples.],
2024-10-22 00:37:22,120 INFO [train.py:682] (1/4) Start epoch 3616
2024-10-22 00:37:31,281 INFO [train.py:561] (1/4) Epoch 3616, batch 0, global_batch_idx: 57840, batch size: 108, loss[dur_loss=0.1801, prior_loss=0.9707, diff_loss=0.2914, tot_loss=1.442, over 108.00 samples.], tot_loss[dur_loss=0.1801, prior_loss=0.9707, diff_loss=0.2914, tot_loss=1.442, over 108.00 samples.],
2024-10-22 00:37:45,523 INFO [train.py:561] (1/4) Epoch 3616, batch 10, global_batch_idx: 57850, batch size: 111, loss[dur_loss=0.1826, prior_loss=0.9711, diff_loss=0.2992, tot_loss=1.453, over 111.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9697, diff_loss=0.3528, tot_loss=1.501, over 1656.00 samples.],
2024-10-22 00:37:52,707 INFO [train.py:682] (1/4) Start epoch 3617
2024-10-22 00:38:06,554 INFO [train.py:561] (1/4) Epoch 3617, batch 4, global_batch_idx: 57860, batch size: 189, loss[dur_loss=0.1801, prior_loss=0.97, diff_loss=0.3147, tot_loss=1.465, over 189.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9694, diff_loss=0.411, tot_loss=1.557, over 937.00 samples.],
2024-10-22 00:38:21,490 INFO [train.py:561] (1/4) Epoch 3617, batch 14, global_batch_idx: 57870, batch size: 142, loss[dur_loss=0.1818, prior_loss=0.9701, diff_loss=0.2671, tot_loss=1.419, over 142.00 samples.], tot_loss[dur_loss=0.1788, prior_loss=0.9699, diff_loss=0.3364, tot_loss=1.485, over 2210.00 samples.],
2024-10-22 00:38:22,923 INFO [train.py:682] (1/4) Start epoch 3618
2024-10-22 00:38:42,939 INFO [train.py:561] (1/4) Epoch 3618, batch 8, global_batch_idx: 57880, batch size: 170, loss[dur_loss=0.1818, prior_loss=0.9702, diff_loss=0.2869, tot_loss=1.439, over 170.00 samples.], tot_loss[dur_loss=0.1786, prior_loss=0.9697, diff_loss=0.3599, tot_loss=1.508, over 1432.00 samples.],
2024-10-22 00:38:53,145 INFO [train.py:682] (1/4) Start epoch 3619
2024-10-22 00:39:04,410 INFO [train.py:561] (1/4) Epoch 3619, batch 2, global_batch_idx: 57890, batch size: 203, loss[dur_loss=0.1826, prior_loss=0.97, diff_loss=0.3265, tot_loss=1.479, over 203.00 samples.], tot_loss[dur_loss=0.1818, prior_loss=0.9701, diff_loss=0.3219, tot_loss=1.474, over 442.00 samples.],
2024-10-22 00:39:18,739 INFO [train.py:561] (1/4) Epoch 3619, batch 12, global_batch_idx: 57900, batch size: 152, loss[dur_loss=0.1769, prior_loss=0.9702, diff_loss=0.3036, tot_loss=1.451, over 152.00 samples.], tot_loss[dur_loss=0.1791, prior_loss=0.9699, diff_loss=0.3496, tot_loss=1.499, over 1966.00 samples.],
2024-10-22 00:39:23,249 INFO [train.py:682] (1/4) Start epoch 3620
2024-10-22 00:39:40,662 INFO [train.py:561] (1/4) Epoch 3620, batch 6, global_batch_idx: 57910, batch size: 106, loss[dur_loss=0.1781, prior_loss=0.9701, diff_loss=0.2701, tot_loss=1.418, over 106.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9696, diff_loss=0.368, tot_loss=1.515, over 1142.00 samples.],
2024-10-22 00:39:53,914 INFO [train.py:682] (1/4) Start epoch 3621
2024-10-22 00:40:02,907 INFO [train.py:561] (1/4) Epoch 3621, batch 0, global_batch_idx: 57920, batch size: 108, loss[dur_loss=0.1798, prior_loss=0.9705, diff_loss=0.3112, tot_loss=1.462, over 108.00 samples.], tot_loss[dur_loss=0.1798, prior_loss=0.9705, diff_loss=0.3112, tot_loss=1.462, over 108.00 samples.],
2024-10-22 00:40:16,932 INFO [train.py:561] (1/4) Epoch 3621, batch 10, global_batch_idx: 57930, batch size: 111, loss[dur_loss=0.1832, prior_loss=0.9712, diff_loss=0.244, tot_loss=1.398, over 111.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9699, diff_loss=0.3607, tot_loss=1.509, over 1656.00 samples.],
2024-10-22 00:40:23,988 INFO [train.py:682] (1/4) Start epoch 3622
2024-10-22 00:40:37,526 INFO [train.py:561] (1/4) Epoch 3622, batch 4, global_batch_idx: 57940, batch size: 189, loss[dur_loss=0.1793, prior_loss=0.9701, diff_loss=0.3162, tot_loss=1.466, over 189.00 samples.], tot_loss[dur_loss=0.1769, prior_loss=0.9694, diff_loss=0.4081, tot_loss=1.554, over 937.00 samples.],
2024-10-22 00:40:52,331 INFO [train.py:561] (1/4) Epoch 3622, batch 14, global_batch_idx: 57950, batch size: 142, loss[dur_loss=0.181, prior_loss=0.97, diff_loss=0.2901, tot_loss=1.441, over 142.00 samples.], tot_loss[dur_loss=0.1788, prior_loss=0.9699, diff_loss=0.3429, tot_loss=1.492, over 2210.00 samples.],
2024-10-22 00:40:53,757 INFO [train.py:682] (1/4) Start epoch 3623
2024-10-22 00:41:13,652 INFO [train.py:561] (1/4) Epoch 3623, batch 8, global_batch_idx: 57960, batch size: 170, loss[dur_loss=0.1814, prior_loss=0.9702, diff_loss=0.2894, tot_loss=1.441, over 170.00 samples.], tot_loss[dur_loss=0.1773, prior_loss=0.9696, diff_loss=0.3664, tot_loss=1.513, over 1432.00 samples.],
2024-10-22 00:41:23,905 INFO [train.py:682] (1/4) Start epoch 3624
2024-10-22 00:41:35,629 INFO [train.py:561] (1/4) Epoch 3624, batch 2, global_batch_idx: 57970, batch size: 203, loss[dur_loss=0.1814, prior_loss=0.97, diff_loss=0.3275, tot_loss=1.479, over 203.00 samples.], tot_loss[dur_loss=0.1802, prior_loss=0.9701, diff_loss=0.3103, tot_loss=1.461, over 442.00 samples.],
2024-10-22 00:41:49,888 INFO [train.py:561] (1/4) Epoch 3624, batch 12, global_batch_idx: 57980, batch size: 152, loss[dur_loss=0.1791, prior_loss=0.9704, diff_loss=0.2919, tot_loss=1.441, over 152.00 samples.], tot_loss[dur_loss=0.179, prior_loss=0.9699, diff_loss=0.3446, tot_loss=1.493, over 1966.00 samples.],
2024-10-22 00:41:54,360 INFO [train.py:682] (1/4) Start epoch 3625
2024-10-22 00:42:11,599 INFO [train.py:561] (1/4) Epoch 3625, batch 6, global_batch_idx: 57990, batch size: 106, loss[dur_loss=0.1777, prior_loss=0.97, diff_loss=0.2854, tot_loss=1.433, over 106.00 samples.], tot_loss[dur_loss=0.1776, prior_loss=0.9696, diff_loss=0.3925, tot_loss=1.54, over 1142.00 samples.],
2024-10-22 00:42:24,714 INFO [train.py:682] (1/4) Start epoch 3626
2024-10-22 00:42:33,682 INFO [train.py:561] (1/4) Epoch 3626, batch 0, global_batch_idx: 58000, batch size: 108, loss[dur_loss=0.182, prior_loss=0.9708, diff_loss=0.2794, tot_loss=1.432, over 108.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.9708, diff_loss=0.2794, tot_loss=1.432, over 108.00 samples.],
2024-10-22 00:42:47,958 INFO [train.py:561] (1/4) Epoch 3626, batch 10, global_batch_idx: 58010, batch size: 111, loss[dur_loss=0.1804, prior_loss=0.9708, diff_loss=0.3111, tot_loss=1.462, over 111.00 samples.], tot_loss[dur_loss=0.1794, prior_loss=0.9699, diff_loss=0.3518, tot_loss=1.501, over 1656.00 samples.],
2024-10-22 00:42:55,131 INFO [train.py:682] (1/4) Start epoch 3627
2024-10-22 00:43:09,157 INFO [train.py:561] (1/4) Epoch 3627, batch 4, global_batch_idx: 58020, batch size: 189, loss[dur_loss=0.1804, prior_loss=0.9702, diff_loss=0.268, tot_loss=1.419, over 189.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9694, diff_loss=0.4052, tot_loss=1.552, over 937.00 samples.],
2024-10-22 00:43:24,077 INFO [train.py:561] (1/4) Epoch 3627, batch 14, global_batch_idx: 58030, batch size: 142, loss[dur_loss=0.1802, prior_loss=0.9698, diff_loss=0.2867, tot_loss=1.437, over 142.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9698, diff_loss=0.3374, tot_loss=1.486, over 2210.00 samples.],
2024-10-22 00:43:25,509 INFO [train.py:682] (1/4) Start epoch 3628
2024-10-22 00:43:45,782 INFO [train.py:561] (1/4) Epoch 3628, batch 8, global_batch_idx: 58040, batch size: 170, loss[dur_loss=0.1808, prior_loss=0.97, diff_loss=0.3353, tot_loss=1.486, over 170.00 samples.], tot_loss[dur_loss=0.1783, prior_loss=0.9697, diff_loss=0.3678, tot_loss=1.516, over 1432.00 samples.],
2024-10-22 00:43:55,988 INFO [train.py:682] (1/4) Start epoch 3629
2024-10-22 00:44:07,719 INFO [train.py:561] (1/4) Epoch 3629, batch 2, global_batch_idx: 58050, batch size: 203, loss[dur_loss=0.1794, prior_loss=0.9702, diff_loss=0.3146, tot_loss=1.464, over 203.00 samples.], tot_loss[dur_loss=0.1803, prior_loss=0.9702, diff_loss=0.3009, tot_loss=1.451, over 442.00 samples.],
2024-10-22 00:44:21,956 INFO [train.py:561] (1/4) Epoch 3629, batch 12, global_batch_idx: 58060, batch size: 152, loss[dur_loss=0.18, prior_loss=0.9702, diff_loss=0.2862, tot_loss=1.436, over 152.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9699, diff_loss=0.3414, tot_loss=1.49, over 1966.00 samples.],
2024-10-22 00:44:26,432 INFO [train.py:682] (1/4) Start epoch 3630
2024-10-22 00:44:43,393 INFO [train.py:561] (1/4) Epoch 3630, batch 6, global_batch_idx: 58070, batch size: 106, loss[dur_loss=0.1752, prior_loss=0.97, diff_loss=0.3115, tot_loss=1.457, over 106.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9695, diff_loss=0.3933, tot_loss=1.54, over 1142.00 samples.],
2024-10-22 00:44:56,500 INFO [train.py:682] (1/4) Start epoch 3631
2024-10-22 00:45:05,076 INFO [train.py:561] (1/4) Epoch 3631, batch 0, global_batch_idx: 58080, batch size: 108, loss[dur_loss=0.182, prior_loss=0.9707, diff_loss=0.2744, tot_loss=1.427, over 108.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.9707, diff_loss=0.2744, tot_loss=1.427, over 108.00 samples.],
2024-10-22 00:45:19,283 INFO [train.py:561] (1/4) Epoch 3631, batch 10, global_batch_idx: 58090, batch size: 111, loss[dur_loss=0.1818, prior_loss=0.9711, diff_loss=0.3168, tot_loss=1.47, over 111.00 samples.], tot_loss[dur_loss=0.178, prior_loss=0.9697, diff_loss=0.3549, tot_loss=1.503, over 1656.00 samples.],
2024-10-22 00:45:26,417 INFO [train.py:682] (1/4) Start epoch 3632
2024-10-22 00:45:40,409 INFO [train.py:561] (1/4) Epoch 3632, batch 4, global_batch_idx: 58100, batch size: 189, loss[dur_loss=0.1802, prior_loss=0.97, diff_loss=0.3301, tot_loss=1.48, over 189.00 samples.], tot_loss[dur_loss=0.1767, prior_loss=0.9694, diff_loss=0.4077, tot_loss=1.554, over 937.00 samples.],
2024-10-22 00:45:55,200 INFO [train.py:561] (1/4) Epoch 3632, batch 14, global_batch_idx: 58110, batch size: 142, loss[dur_loss=0.1784, prior_loss=0.9698, diff_loss=0.2934, tot_loss=1.442, over 142.00 samples.], tot_loss[dur_loss=0.1785, prior_loss=0.9698, diff_loss=0.3379, tot_loss=1.486, over 2210.00 samples.],
2024-10-22 00:45:56,612 INFO [train.py:682] (1/4) Start epoch 3633
2024-10-22 00:46:16,365 INFO [train.py:561] (1/4) Epoch 3633, batch 8, global_batch_idx: 58120, batch size: 170, loss[dur_loss=0.1801, prior_loss=0.9701, diff_loss=0.3094, tot_loss=1.46, over 170.00 samples.], tot_loss[dur_loss=0.1776, prior_loss=0.9696, diff_loss=0.3719, tot_loss=1.519, over 1432.00 samples.],
2024-10-22 00:46:26,549 INFO [train.py:682] (1/4) Start epoch 3634
2024-10-22 00:46:38,028 INFO [train.py:561] (1/4) Epoch 3634, batch 2, global_batch_idx: 58130, batch size: 203, loss[dur_loss=0.1822, prior_loss=0.9701, diff_loss=0.3245, tot_loss=1.477, over 203.00 samples.], tot_loss[dur_loss=0.1811, prior_loss=0.9701, diff_loss=0.291, tot_loss=1.442, over 442.00 samples.],
2024-10-22 00:46:52,187 INFO [train.py:561] (1/4) Epoch 3634, batch 12, global_batch_idx: 58140, batch size: 152, loss[dur_loss=0.1782, prior_loss=0.9704, diff_loss=0.327, tot_loss=1.476, over 152.00 samples.], tot_loss[dur_loss=0.1782, prior_loss=0.9698, diff_loss=0.3451, tot_loss=1.493, over 1966.00 samples.],
2024-10-22 00:46:56,703 INFO [train.py:682] (1/4) Start epoch 3635
2024-10-22 00:47:13,514 INFO [train.py:561] (1/4) Epoch 3635, batch 6, global_batch_idx: 58150, batch size: 106, loss[dur_loss=0.1784, prior_loss=0.9699, diff_loss=0.2779, tot_loss=1.426, over 106.00 samples.], tot_loss[dur_loss=0.1778, prior_loss=0.9696, diff_loss=0.3918, tot_loss=1.539, over 1142.00 samples.],
2024-10-22 00:47:26,647 INFO [train.py:682] (1/4) Start epoch 3636
2024-10-22 00:47:35,592 INFO [train.py:561] (1/4) Epoch 3636, batch 0, global_batch_idx: 58160, batch size: 108, loss[dur_loss=0.182, prior_loss=0.9707, diff_loss=0.3379, tot_loss=1.491, over 108.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.9707, diff_loss=0.3379, tot_loss=1.491, over 108.00 samples.],
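Note: the second bracket in each record, tot_loss[... over N samples.], tracks the epoch so far rather than the single batch: N grows across the logged batches (108 samples at batch 0, 1656 by batch 10), consistent with a batch-size-weighted running mean of the per-batch losses. A sketch of that bookkeeping under this assumption (the weighting scheme is inferred from the sample counts, not confirmed against train.py):

from typing import Iterable, Iterator, Tuple

def running_weighted_loss(batches: Iterable[Tuple[float, int]]) -> Iterator[Tuple[float, int]]:
    # (per-batch loss, batch size) -> (size-weighted mean so far, samples so far)
    total, n = 0.0, 0
    for loss, bsz in batches:
        total += loss * bsz
        n += bsz
        yield total / n, n

# Two illustrative batches shaped like the entries above:
for avg, n in running_weighted_loss([(1.466, 170), (1.43, 111)]):
    print(f"tot_loss={avg:.4g}, over {n}.00 samples")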
2024-10-22 00:47:49,722 INFO [train.py:561] (1/4) Epoch 3636, batch 10, global_batch_idx: 58170, batch size: 111, loss[dur_loss=0.1804, prior_loss=0.9709, diff_loss=0.3003, tot_loss=1.452, over 111.00 samples.], tot_loss[dur_loss=0.1786, prior_loss=0.9698, diff_loss=0.3561, tot_loss=1.505, over 1656.00 samples.],
2024-10-22 00:47:56,858 INFO [train.py:682] (1/4) Start epoch 3637
2024-10-22 00:48:10,554 INFO [train.py:561] (1/4) Epoch 3637, batch 4, global_batch_idx: 58180, batch size: 189, loss[dur_loss=0.1796, prior_loss=0.9702, diff_loss=0.2968, tot_loss=1.447, over 189.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9694, diff_loss=0.3991, tot_loss=1.545, over 937.00 samples.],
2024-10-22 00:48:25,370 INFO [train.py:561] (1/4) Epoch 3637, batch 14, global_batch_idx: 58190, batch size: 142, loss[dur_loss=0.1801, prior_loss=0.9701, diff_loss=0.2875, tot_loss=1.438, over 142.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9699, diff_loss=0.3412, tot_loss=1.49, over 2210.00 samples.],
2024-10-22 00:48:26,814 INFO [train.py:682] (1/4) Start epoch 3638
2024-10-22 00:48:46,781 INFO [train.py:561] (1/4) Epoch 3638, batch 8, global_batch_idx: 58200, batch size: 170, loss[dur_loss=0.1813, prior_loss=0.9703, diff_loss=0.2836, tot_loss=1.435, over 170.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9698, diff_loss=0.367, tot_loss=1.515, over 1432.00 samples.],
2024-10-22 00:48:56,944 INFO [train.py:682] (1/4) Start epoch 3639
2024-10-22 00:49:08,064 INFO [train.py:561] (1/4) Epoch 3639, batch 2, global_batch_idx: 58210, batch size: 203, loss[dur_loss=0.1819, prior_loss=0.9699, diff_loss=0.3202, tot_loss=1.472, over 203.00 samples.], tot_loss[dur_loss=0.1805, prior_loss=0.9701, diff_loss=0.3029, tot_loss=1.453, over 442.00 samples.],
2024-10-22 00:49:22,606 INFO [train.py:561] (1/4) Epoch 3639, batch 12, global_batch_idx: 58220, batch size: 152, loss[dur_loss=0.182, prior_loss=0.9703, diff_loss=0.2837, tot_loss=1.436, over 152.00 samples.], tot_loss[dur_loss=0.1785, prior_loss=0.9698, diff_loss=0.3423, tot_loss=1.491, over 1966.00 samples.],
2024-10-22 00:49:27,084 INFO [train.py:682] (1/4) Start epoch 3640
2024-10-22 00:49:44,272 INFO [train.py:561] (1/4) Epoch 3640, batch 6, global_batch_idx: 58230, batch size: 106, loss[dur_loss=0.1785, prior_loss=0.9699, diff_loss=0.2987, tot_loss=1.447, over 106.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9695, diff_loss=0.3801, tot_loss=1.527, over 1142.00 samples.],
2024-10-22 00:49:57,440 INFO [train.py:682] (1/4) Start epoch 3641
2024-10-22 00:50:06,312 INFO [train.py:561] (1/4) Epoch 3641, batch 0, global_batch_idx: 58240, batch size: 108, loss[dur_loss=0.1809, prior_loss=0.9706, diff_loss=0.2743, tot_loss=1.426, over 108.00 samples.], tot_loss[dur_loss=0.1809, prior_loss=0.9706, diff_loss=0.2743, tot_loss=1.426, over 108.00 samples.],
2024-10-22 00:50:20,535 INFO [train.py:561] (1/4) Epoch 3641, batch 10, global_batch_idx: 58250, batch size: 111, loss[dur_loss=0.1828, prior_loss=0.9709, diff_loss=0.3286, tot_loss=1.482, over 111.00 samples.], tot_loss[dur_loss=0.1786, prior_loss=0.9698, diff_loss=0.352, tot_loss=1.5, over 1656.00 samples.],
2024-10-22 00:50:27,714 INFO [train.py:682] (1/4) Start epoch 3642
2024-10-22 00:50:41,461 INFO [train.py:561] (1/4) Epoch 3642, batch 4, global_batch_idx: 58260, batch size: 189, loss[dur_loss=0.1802, prior_loss=0.9701, diff_loss=0.3008, tot_loss=1.451, over 189.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9693, diff_loss=0.4052, tot_loss=1.552, over 937.00 samples.],
2024-10-22 00:50:56,397 INFO [train.py:561] (1/4) Epoch 3642, batch 14, global_batch_idx: 58270, batch size: 142, loss[dur_loss=0.1805, prior_loss=0.97, diff_loss=0.3072, tot_loss=1.458, over 142.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9699, diff_loss=0.3367, tot_loss=1.485, over 2210.00 samples.],
2024-10-22 00:50:57,839 INFO [train.py:682] (1/4) Start epoch 3643
2024-10-22 00:51:17,945 INFO [train.py:561] (1/4) Epoch 3643, batch 8, global_batch_idx: 58280, batch size: 170, loss[dur_loss=0.1815, prior_loss=0.9702, diff_loss=0.3217, tot_loss=1.473, over 170.00 samples.], tot_loss[dur_loss=0.1774, prior_loss=0.9697, diff_loss=0.3709, tot_loss=1.518, over 1432.00 samples.],
2024-10-22 00:51:28,183 INFO [train.py:682] (1/4) Start epoch 3644
2024-10-22 00:51:39,565 INFO [train.py:561] (1/4) Epoch 3644, batch 2, global_batch_idx: 58290, batch size: 203, loss[dur_loss=0.1822, prior_loss=0.97, diff_loss=0.2998, tot_loss=1.452, over 203.00 samples.], tot_loss[dur_loss=0.1809, prior_loss=0.9701, diff_loss=0.285, tot_loss=1.436, over 442.00 samples.],
2024-10-22 00:51:53,878 INFO [train.py:561] (1/4) Epoch 3644, batch 12, global_batch_idx: 58300, batch size: 152, loss[dur_loss=0.1777, prior_loss=0.9702, diff_loss=0.2869, tot_loss=1.435, over 152.00 samples.], tot_loss[dur_loss=0.178, prior_loss=0.9697, diff_loss=0.3374, tot_loss=1.485, over 1966.00 samples.],
2024-10-22 00:51:58,425 INFO [train.py:682] (1/4) Start epoch 3645
2024-10-22 00:52:15,659 INFO [train.py:561] (1/4) Epoch 3645, batch 6, global_batch_idx: 58310, batch size: 106, loss[dur_loss=0.1768, prior_loss=0.9699, diff_loss=0.2873, tot_loss=1.434, over 106.00 samples.], tot_loss[dur_loss=0.1767, prior_loss=0.9695, diff_loss=0.3837, tot_loss=1.53, over 1142.00 samples.],
2024-10-22 00:52:28,862 INFO [train.py:682] (1/4) Start epoch 3646
2024-10-22 00:52:37,579 INFO [train.py:561] (1/4) Epoch 3646, batch 0, global_batch_idx: 58320, batch size: 108, loss[dur_loss=0.1823, prior_loss=0.9705, diff_loss=0.267, tot_loss=1.42, over 108.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9705, diff_loss=0.267, tot_loss=1.42, over 108.00 samples.],
2024-10-22 00:52:51,806 INFO [train.py:561] (1/4) Epoch 3646, batch 10, global_batch_idx: 58330, batch size: 111, loss[dur_loss=0.183, prior_loss=0.9712, diff_loss=0.3084, tot_loss=1.463, over 111.00 samples.], tot_loss[dur_loss=0.1786, prior_loss=0.9698, diff_loss=0.3648, tot_loss=1.513, over 1656.00 samples.],
2024-10-22 00:52:59,059 INFO [train.py:682] (1/4) Start epoch 3647
2024-10-22 00:53:12,863 INFO [train.py:561] (1/4) Epoch 3647, batch 4, global_batch_idx: 58340, batch size: 189, loss[dur_loss=0.1811, prior_loss=0.9703, diff_loss=0.3363, tot_loss=1.488, over 189.00 samples.], tot_loss[dur_loss=0.1761, prior_loss=0.9694, diff_loss=0.4131, tot_loss=1.559, over 937.00 samples.],
2024-10-22 00:53:27,933 INFO [train.py:561] (1/4) Epoch 3647, batch 14, global_batch_idx: 58350, batch size: 142, loss[dur_loss=0.1837, prior_loss=0.97, diff_loss=0.2756, tot_loss=1.429, over 142.00 samples.], tot_loss[dur_loss=0.1787, prior_loss=0.9699, diff_loss=0.3471, tot_loss=1.496, over 2210.00 samples.],
2024-10-22 00:53:29,415 INFO [train.py:682] (1/4) Start epoch 3648
2024-10-22 00:53:49,523 INFO [train.py:561] (1/4) Epoch 3648, batch 8, global_batch_idx: 58360, batch size: 170, loss[dur_loss=0.1797, prior_loss=0.9701, diff_loss=0.3429, tot_loss=1.493, over 170.00 samples.], tot_loss[dur_loss=0.1779, prior_loss=0.9697, diff_loss=0.3714, tot_loss=1.519, over 1432.00 samples.],
2024-10-22 00:53:59,846 INFO [train.py:682] (1/4) Start epoch 3649
2024-10-22 00:54:11,405 INFO [train.py:561] (1/4) Epoch 3649, batch 2, global_batch_idx: 58370, batch size: 203, loss[dur_loss=0.1798, prior_loss=0.9699, diff_loss=0.2959, tot_loss=1.446, over 203.00 samples.], tot_loss[dur_loss=0.1798, prior_loss=0.9701, diff_loss=0.3029, tot_loss=1.453, over 442.00 samples.],
2024-10-22 00:54:25,704 INFO [train.py:561] (1/4) Epoch 3649, batch 12, global_batch_idx: 58380, batch size: 152, loss[dur_loss=0.1782, prior_loss=0.9701, diff_loss=0.3286, tot_loss=1.477, over 152.00 samples.], tot_loss[dur_loss=0.1787, prior_loss=0.9698, diff_loss=0.3468, tot_loss=1.495, over 1966.00 samples.],
2024-10-22 00:54:30,258 INFO [train.py:682] (1/4) Start epoch 3650
2024-10-22 00:54:47,467 INFO [train.py:561] (1/4) Epoch 3650, batch 6, global_batch_idx: 58390, batch size: 106, loss[dur_loss=0.1803, prior_loss=0.9699, diff_loss=0.2845, tot_loss=1.435, over 106.00 samples.], tot_loss[dur_loss=0.1778, prior_loss=0.9695, diff_loss=0.3844, tot_loss=1.532, over 1142.00 samples.],
2024-10-22 00:55:00,663 INFO [train.py:682] (1/4) Start epoch 3651
2024-10-22 00:55:09,545 INFO [train.py:561] (1/4) Epoch 3651, batch 0, global_batch_idx: 58400, batch size: 108, loss[dur_loss=0.1842, prior_loss=0.9705, diff_loss=0.2783, tot_loss=1.433, over 108.00 samples.], tot_loss[dur_loss=0.1842, prior_loss=0.9705, diff_loss=0.2783, tot_loss=1.433, over 108.00 samples.],
2024-10-22 00:55:23,718 INFO [train.py:561] (1/4) Epoch 3651, batch 10, global_batch_idx: 58410, batch size: 111, loss[dur_loss=0.1804, prior_loss=0.9712, diff_loss=0.2873, tot_loss=1.439, over 111.00 samples.], tot_loss[dur_loss=0.1778, prior_loss=0.9698, diff_loss=0.3526, tot_loss=1.5, over 1656.00 samples.],
2024-10-22 00:55:30,951 INFO [train.py:682] (1/4) Start epoch 3652
2024-10-22 00:55:44,747 INFO [train.py:561] (1/4) Epoch 3652, batch 4, global_batch_idx: 58420, batch size: 189, loss[dur_loss=0.1832, prior_loss=0.9704, diff_loss=0.3096, tot_loss=1.463, over 189.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9694, diff_loss=0.4117, tot_loss=1.558, over 937.00 samples.],
2024-10-22 00:55:59,804 INFO [train.py:561] (1/4) Epoch 3652, batch 14, global_batch_idx: 58430, batch size: 142, loss[dur_loss=0.1801, prior_loss=0.9701, diff_loss=0.2793, tot_loss=1.43, over 142.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.97, diff_loss=0.3424, tot_loss=1.491, over 2210.00 samples.],
2024-10-22 00:56:01,241 INFO [train.py:682] (1/4) Start epoch 3653
2024-10-22 00:56:21,164 INFO [train.py:561] (1/4) Epoch 3653, batch 8, global_batch_idx: 58440, batch size: 170, loss[dur_loss=0.1854, prior_loss=0.9702, diff_loss=0.2773, tot_loss=1.433, over 170.00 samples.], tot_loss[dur_loss=0.1795, prior_loss=0.9697, diff_loss=0.3606, tot_loss=1.51, over 1432.00 samples.],
2024-10-22 00:56:31,402 INFO [train.py:682] (1/4) Start epoch 3654
2024-10-22 00:56:43,095 INFO [train.py:561] (1/4) Epoch 3654, batch 2, global_batch_idx: 58450, batch size: 203, loss[dur_loss=0.1827, prior_loss=0.9702, diff_loss=0.346, tot_loss=1.499, over 203.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9703, diff_loss=0.3187, tot_loss=1.471, over 442.00 samples.],
2024-10-22 00:56:57,343 INFO [train.py:561] (1/4) Epoch 3654, batch 12, global_batch_idx: 58460, batch size: 152, loss[dur_loss=0.181, prior_loss=0.9705, diff_loss=0.3209, tot_loss=1.472, over 152.00 samples.], tot_loss[dur_loss=0.1798, prior_loss=0.97, diff_loss=0.3541, tot_loss=1.504, over 1966.00 samples.],
2024-10-22 00:57:01,888 INFO [train.py:682] (1/4) Start epoch 3655
2024-10-22 00:57:19,048 INFO [train.py:561] (1/4) Epoch 3655, batch 6, global_batch_idx: 58470, batch size: 106, loss[dur_loss=0.1777, prior_loss=0.9699, diff_loss=0.3003, tot_loss=1.448, over 106.00 samples.], tot_loss[dur_loss=0.1771, prior_loss=0.9695, diff_loss=0.3914, tot_loss=1.538, over 1142.00 samples.],
2024-10-22 00:57:32,248 INFO [train.py:682] (1/4) Start epoch 3656
2024-10-22 00:57:41,070 INFO [train.py:561] (1/4) Epoch 3656, batch 0, global_batch_idx: 58480, batch size: 108, loss[dur_loss=0.1843, prior_loss=0.9705, diff_loss=0.2994, tot_loss=1.454, over 108.00 samples.], tot_loss[dur_loss=0.1843, prior_loss=0.9705, diff_loss=0.2994, tot_loss=1.454, over 108.00 samples.],
2024-10-22 00:57:55,378 INFO [train.py:561] (1/4) Epoch 3656, batch 10, global_batch_idx: 58490, batch size: 111, loss[dur_loss=0.1845, prior_loss=0.9712, diff_loss=0.2696, tot_loss=1.425, over 111.00 samples.], tot_loss[dur_loss=0.1788, prior_loss=0.9699, diff_loss=0.3499, tot_loss=1.499, over 1656.00 samples.],
2024-10-22 00:58:02,570 INFO [train.py:682] (1/4) Start epoch 3657
2024-10-22 00:58:17,122 INFO [train.py:561] (1/4) Epoch 3657, batch 4, global_batch_idx: 58500, batch size: 189, loss[dur_loss=0.1804, prior_loss=0.9702, diff_loss=0.3608, tot_loss=1.512, over 189.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9695, diff_loss=0.4233, tot_loss=1.57, over 937.00 samples.],
2024-10-22 00:58:18,769 INFO [train.py:579] (1/4) Computing validation loss
2024-10-22 00:58:43,035 INFO [train.py:589] (1/4) Epoch 3657, validation: dur_loss=0.4592, prior_loss=1.038, diff_loss=0.3882, tot_loss=1.885, over 100.00 samples.
2024-10-22 00:58:43,035 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB
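Note: the three train.py:579/589/590 lines above are the periodic validation pass, reporting losses over a 100-sample validation set followed by the peak GPU memory. A hedged sketch for pulling the training and validation curves out of a log like this for plotting; the file name and the regexes are illustrative, not taken from the training code:

import re

# Running training aggregate: (global_batch_idx, tot_loss of the epoch so far).
TRAIN_RE = re.compile(r"global_batch_idx: (\d+).*?tot_loss\[.*?tot_loss=([\d.]+)")
# Periodic validation records: (epoch, tot_loss).
VALID_RE = re.compile(r"Epoch (\d+), validation: .*?tot_loss=([\d.]+)")

train, valid = [], []
with open("train.log") as f:  # illustrative path
    for line in f:
        if (m := TRAIN_RE.search(line)):
            train.append((int(m[1]), float(m[2])))
        elif (m := VALID_RE.search(line)):
            valid.append((int(m[1]), float(m[2])))

print(f"{len(train)} training points, {len(valid)} validation points")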
2024-10-22 00:58:56,529 INFO [train.py:561] (1/4) Epoch 3657, batch 14, global_batch_idx: 58510, batch size: 142, loss[dur_loss=0.1828, prior_loss=0.9701, diff_loss=0.2887, tot_loss=1.442, over 142.00 samples.], tot_loss[dur_loss=0.1791, prior_loss=0.97, diff_loss=0.346, tot_loss=1.495, over 2210.00 samples.],
2024-10-22 00:58:57,976 INFO [train.py:682] (1/4) Start epoch 3658
2024-10-22 00:59:18,315 INFO [train.py:561] (1/4) Epoch 3658, batch 8, global_batch_idx: 58520, batch size: 170, loss[dur_loss=0.1827, prior_loss=0.9702, diff_loss=0.318, tot_loss=1.471, over 170.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9698, diff_loss=0.3758, tot_loss=1.524, over 1432.00 samples.],
2024-10-22 00:59:28,689 INFO [train.py:682] (1/4) Start epoch 3659
2024-10-22 00:59:40,504 INFO [train.py:561] (1/4) Epoch 3659, batch 2, global_batch_idx: 58530, batch size: 203, loss[dur_loss=0.1814, prior_loss=0.9702, diff_loss=0.3211, tot_loss=1.473, over 203.00 samples.], tot_loss[dur_loss=0.1804, prior_loss=0.9702, diff_loss=0.3067, tot_loss=1.457, over 442.00 samples.],
2024-10-22 00:59:54,925 INFO [train.py:561] (1/4) Epoch 3659, batch 12, global_batch_idx: 58540, batch size: 152, loss[dur_loss=0.1801, prior_loss=0.9701, diff_loss=0.2983, tot_loss=1.448, over 152.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9699, diff_loss=0.3451, tot_loss=1.494, over 1966.00 samples.],
2024-10-22 00:59:59,411 INFO [train.py:682] (1/4) Start epoch 3660
2024-10-22 01:00:16,945 INFO [train.py:561] (1/4) Epoch 3660, batch 6, global_batch_idx: 58550, batch size: 106, loss[dur_loss=0.1804, prior_loss=0.9698, diff_loss=0.2559, tot_loss=1.406, over 106.00 samples.], tot_loss[dur_loss=0.1773, prior_loss=0.9695, diff_loss=0.3776, tot_loss=1.524, over 1142.00 samples.],
2024-10-22 01:00:30,067 INFO [train.py:682] (1/4) Start epoch 3661
2024-10-22 01:00:39,122 INFO [train.py:561] (1/4) Epoch 3661, batch 0, global_batch_idx: 58560, batch size: 108, loss[dur_loss=0.1824, prior_loss=0.9706, diff_loss=0.3297, tot_loss=1.483, over 108.00 samples.], tot_loss[dur_loss=0.1824, prior_loss=0.9706, diff_loss=0.3297, tot_loss=1.483, over 108.00 samples.],
2024-10-22 01:00:53,468 INFO [train.py:561] (1/4) Epoch 3661, batch 10, global_batch_idx: 58570, batch size: 111, loss[dur_loss=0.1819, prior_loss=0.9708, diff_loss=0.333, tot_loss=1.486, over 111.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9697, diff_loss=0.3614, tot_loss=1.509, over 1656.00 samples.],
2024-10-22 01:01:00,633 INFO [train.py:682] (1/4) Start epoch 3662
2024-10-22 01:01:14,990 INFO [train.py:561] (1/4) Epoch 3662, batch 4, global_batch_idx: 58580, batch size: 189, loss[dur_loss=0.18, prior_loss=0.9702, diff_loss=0.3299, tot_loss=1.48, over 189.00 samples.], tot_loss[dur_loss=0.1777, prior_loss=0.9694, diff_loss=0.417, tot_loss=1.564, over 937.00 samples.],
2024-10-22 01:01:29,979 INFO [train.py:561] (1/4) Epoch 3662, batch 14, global_batch_idx: 58590, batch size: 142, loss[dur_loss=0.1798, prior_loss=0.9699, diff_loss=0.2816, tot_loss=1.431, over 142.00 samples.], tot_loss[dur_loss=0.1792, prior_loss=0.9698, diff_loss=0.3479, tot_loss=1.497, over 2210.00 samples.],
2024-10-22 01:01:31,414 INFO [train.py:682] (1/4) Start epoch 3663
2024-10-22 01:01:51,750 INFO [train.py:561] (1/4) Epoch 3663, batch 8, global_batch_idx: 58600, batch size: 170, loss[dur_loss=0.1809, prior_loss=0.9701, diff_loss=0.2856, tot_loss=1.437, over 170.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9697, diff_loss=0.3698, tot_loss=1.518, over 1432.00 samples.],
2024-10-22 01:02:01,988 INFO [train.py:682] (1/4) Start epoch 3664
2024-10-22 01:02:13,473 INFO [train.py:561] (1/4) Epoch 3664, batch 2, global_batch_idx: 58610, batch size: 203, loss[dur_loss=0.1803, prior_loss=0.9701, diff_loss=0.3419, tot_loss=1.492, over 203.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9703, diff_loss=0.3101, tot_loss=1.461, over 442.00 samples.],
2024-10-22 01:02:27,852 INFO [train.py:561] (1/4) Epoch 3664, batch 12, global_batch_idx: 58620, batch size: 152, loss[dur_loss=0.1801, prior_loss=0.9702, diff_loss=0.3021, tot_loss=1.452, over 152.00 samples.], tot_loss[dur_loss=0.1793, prior_loss=0.9699, diff_loss=0.3448, tot_loss=1.494, over 1966.00 samples.],
2024-10-22 01:02:32,350 INFO [train.py:682] (1/4) Start epoch 3665
2024-10-22 01:02:50,131 INFO [train.py:561] (1/4) Epoch 3665, batch 6, global_batch_idx: 58630, batch size: 106, loss[dur_loss=0.1761, prior_loss=0.9699, diff_loss=0.2464, tot_loss=1.392, over 106.00 samples.], tot_loss[dur_loss=0.1782, prior_loss=0.9696, diff_loss=0.3777, tot_loss=1.525, over 1142.00 samples.],
2024-10-22 01:03:03,283 INFO [train.py:682] (1/4) Start epoch 3666
2024-10-22 01:03:12,176 INFO [train.py:561] (1/4) Epoch 3666, batch 0, global_batch_idx: 58640, batch size: 108, loss[dur_loss=0.181, prior_loss=0.9707, diff_loss=0.2845, tot_loss=1.436, over 108.00 samples.], tot_loss[dur_loss=0.181, prior_loss=0.9707, diff_loss=0.2845, tot_loss=1.436, over 108.00 samples.],
2024-10-22 01:03:26,591 INFO [train.py:561] (1/4) Epoch 3666, batch 10, global_batch_idx: 58650, batch size: 111, loss[dur_loss=0.1839, prior_loss=0.9711, diff_loss=0.2845, tot_loss=1.44, over 111.00 samples.], tot_loss[dur_loss=0.1776, prior_loss=0.9697, diff_loss=0.3644, tot_loss=1.512, over 1656.00 samples.],
2024-10-22 01:03:33,731 INFO [train.py:682] (1/4) Start epoch 3667
2024-10-22 01:03:47,963 INFO [train.py:561] (1/4) Epoch 3667, batch 4, global_batch_idx: 58660, batch size: 189, loss[dur_loss=0.1778, prior_loss=0.97, diff_loss=0.3163, tot_loss=1.464, over 189.00 samples.], tot_loss[dur_loss=0.1758, prior_loss=0.9694, diff_loss=0.4058, tot_loss=1.551, over 937.00 samples.],
2024-10-22 01:04:02,887 INFO [train.py:561] (1/4) Epoch 3667, batch 14, global_batch_idx: 58670, batch size: 142, loss[dur_loss=0.1807, prior_loss=0.9699, diff_loss=0.2907, tot_loss=1.441, over 142.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9698, diff_loss=0.3372, tot_loss=1.485, over 2210.00 samples.],
2024-10-22 01:04:04,337 INFO [train.py:682] (1/4) Start epoch 3668
2024-10-22 01:04:24,964 INFO [train.py:561] (1/4) Epoch 3668, batch 8, global_batch_idx: 58680, batch size: 170, loss[dur_loss=0.1801, prior_loss=0.97, diff_loss=0.322, tot_loss=1.472, over 170.00 samples.], tot_loss[dur_loss=0.1773, prior_loss=0.9696, diff_loss=0.3646, tot_loss=1.511, over 1432.00 samples.],
2024-10-22 01:04:35,102 INFO [train.py:682] (1/4) Start epoch 3669
2024-10-22 01:04:46,755 INFO [train.py:561] (1/4) Epoch 3669, batch 2, global_batch_idx: 58690, batch size: 203, loss[dur_loss=0.1826, prior_loss=0.9701, diff_loss=0.3206, tot_loss=1.473, over 203.00 samples.], tot_loss[dur_loss=0.1813, prior_loss=0.9701, diff_loss=0.3018, tot_loss=1.453, over 442.00 samples.],
2024-10-22 01:05:01,051 INFO [train.py:561] (1/4) Epoch 3669, batch 12, global_batch_idx: 58700, batch size: 152, loss[dur_loss=0.1789, prior_loss=0.9702, diff_loss=0.2824, tot_loss=1.432, over 152.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9698, diff_loss=0.3425, tot_loss=1.491, over 1966.00 samples.],
2024-10-22 01:05:05,499 INFO [train.py:682] (1/4) Start epoch 3670
2024-10-22 01:05:22,758 INFO [train.py:561] (1/4) Epoch 3670, batch 6, global_batch_idx: 58710, batch size: 106, loss[dur_loss=0.1779, prior_loss=0.9699, diff_loss=0.2628, tot_loss=1.411, over 106.00 samples.], tot_loss[dur_loss=0.1782, prior_loss=0.9695, diff_loss=0.3754, tot_loss=1.523, over 1142.00 samples.],
2024-10-22 01:05:35,886 INFO [train.py:682] (1/4) Start epoch 3671
2024-10-22 01:05:44,771 INFO [train.py:561] (1/4) Epoch 3671, batch 0, global_batch_idx: 58720, batch size: 108, loss[dur_loss=0.1836, prior_loss=0.9708, diff_loss=0.2765, tot_loss=1.431, over 108.00 samples.], tot_loss[dur_loss=0.1836, prior_loss=0.9708, diff_loss=0.2765, tot_loss=1.431, over 108.00 samples.],
2024-10-22 01:05:59,055 INFO [train.py:561] (1/4) Epoch 3671, batch 10, global_batch_idx: 58730, batch size: 111, loss[dur_loss=0.1803, prior_loss=0.971, diff_loss=0.324, tot_loss=1.475, over 111.00 samples.], tot_loss[dur_loss=0.1775, prior_loss=0.9698, diff_loss=0.3593, tot_loss=1.507, over 1656.00 samples.],
2024-10-22 01:06:06,175 INFO [train.py:682] (1/4) Start epoch 3672
2024-10-22 01:06:20,121 INFO [train.py:561] (1/4) Epoch 3672, batch 4, global_batch_idx: 58740, batch size: 189, loss[dur_loss=0.182, prior_loss=0.9701, diff_loss=0.295, tot_loss=1.447, over 189.00 samples.], tot_loss[dur_loss=0.178, prior_loss=0.9694, diff_loss=0.4148, tot_loss=1.562, over 937.00 samples.],
2024-10-22 01:06:35,003 INFO [train.py:561] (1/4) Epoch 3672, batch 14, global_batch_idx: 58750, batch size: 142, loss[dur_loss=0.1808, prior_loss=0.9699, diff_loss=0.2962, tot_loss=1.447, over 142.00 samples.], tot_loss[dur_loss=0.1791, prior_loss=0.9698, diff_loss=0.3464, tot_loss=1.495, over 2210.00 samples.],
2024-10-22 01:06:36,448 INFO [train.py:682] (1/4) Start epoch 3673
2024-10-22 01:06:56,991 INFO [train.py:561] (1/4) Epoch 3673, batch 8, global_batch_idx: 58760, batch size: 170, loss[dur_loss=0.1817, prior_loss=0.9699, diff_loss=0.2886, tot_loss=1.44, over 170.00 samples.], tot_loss[dur_loss=0.1767, prior_loss=0.9695, diff_loss=0.3625, tot_loss=1.509, over 1432.00 samples.],
2024-10-22 01:07:07,128 INFO [train.py:682] (1/4) Start epoch 3674
2024-10-22 01:07:18,941 INFO [train.py:561] (1/4) Epoch 3674, batch 2, global_batch_idx: 58770, batch size: 203, loss[dur_loss=0.1813, prior_loss=0.97, diff_loss=0.3101, tot_loss=1.461, over 203.00 samples.], tot_loss[dur_loss=0.1803, prior_loss=0.9701, diff_loss=0.3069, tot_loss=1.457, over 442.00 samples.],
2024-10-22 01:07:33,220 INFO [train.py:561] (1/4) Epoch 3674, batch 12, global_batch_idx: 58780, batch size: 152, loss[dur_loss=0.1801, prior_loss=0.9703, diff_loss=0.2908, tot_loss=1.441, over 152.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9698, diff_loss=0.3492, tot_loss=1.497, over 1966.00 samples.],
2024-10-22 01:07:37,680 INFO [train.py:682] (1/4) Start epoch 3675
2024-10-22 01:07:54,589 INFO [train.py:561] (1/4) Epoch 3675, batch 6, global_batch_idx: 58790, batch size: 106, loss[dur_loss=0.1804, prior_loss=0.9697, diff_loss=0.2874, tot_loss=1.438, over 106.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9695, diff_loss=0.3913, tot_loss=1.538, over 1142.00 samples.],
2024-10-22 01:08:07,632 INFO [train.py:682] (1/4) Start epoch 3676
2024-10-22 01:08:16,772 INFO [train.py:561] (1/4) Epoch 3676, batch 0, global_batch_idx: 58800, batch size: 108, loss[dur_loss=0.1808, prior_loss=0.9706, diff_loss=0.2817, tot_loss=1.433, over 108.00 samples.], tot_loss[dur_loss=0.1808, prior_loss=0.9706, diff_loss=0.2817, tot_loss=1.433, over 108.00 samples.],
2024-10-22 01:08:31,087 INFO [train.py:561] (1/4) Epoch 3676, batch 10, global_batch_idx: 58810, batch size: 111, loss[dur_loss=0.1788, prior_loss=0.9708, diff_loss=0.2848, tot_loss=1.434, over 111.00 samples.], tot_loss[dur_loss=0.178, prior_loss=0.9697, diff_loss=0.3545, tot_loss=1.502, over 1656.00 samples.],
2024-10-22 01:08:38,170 INFO [train.py:682] (1/4) Start epoch 3677
2024-10-22 01:08:51,907 INFO [train.py:561] (1/4) Epoch 3677, batch 4, global_batch_idx: 58820, batch size: 189, loss[dur_loss=0.1776, prior_loss=0.9698, diff_loss=0.3081, tot_loss=1.456, over 189.00 samples.], tot_loss[dur_loss=0.1762, prior_loss=0.9692, diff_loss=0.4095, tot_loss=1.555, over 937.00 samples.],
2024-10-22 01:09:06,840 INFO [train.py:561] (1/4) Epoch 3677, batch 14, global_batch_idx: 58830, batch size: 142, loss[dur_loss=0.1816, prior_loss=0.9697, diff_loss=0.291, tot_loss=1.442, over 142.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9697, diff_loss=0.3443, tot_loss=1.492, over 2210.00 samples.],
2024-10-22 01:09:08,278 INFO [train.py:682] (1/4) Start epoch 3678
2024-10-22 01:09:28,305 INFO [train.py:561] (1/4) Epoch 3678, batch 8, global_batch_idx: 58840, batch size: 170, loss[dur_loss=0.18, prior_loss=0.97, diff_loss=0.3525, tot_loss=1.503, over 170.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9696, diff_loss=0.3635, tot_loss=1.51, over 1432.00 samples.],
2024-10-22 01:09:38,433 INFO [train.py:682] (1/4) Start epoch 3679
2024-10-22 01:09:50,111 INFO [train.py:561] (1/4) Epoch 3679, batch 2, global_batch_idx: 58850, batch size: 203, loss[dur_loss=0.1793, prior_loss=0.97, diff_loss=0.3164, tot_loss=1.466, over 203.00 samples.], tot_loss[dur_loss=0.1799, prior_loss=0.9701, diff_loss=0.3012, tot_loss=1.451, over 442.00 samples.],
2024-10-22 01:10:04,396 INFO [train.py:561] (1/4) Epoch 3679, batch 12, global_batch_idx: 58860, batch size: 152, loss[dur_loss=0.1764, prior_loss=0.9701, diff_loss=0.3227, tot_loss=1.469, over 152.00 samples.], tot_loss[dur_loss=0.1775, prior_loss=0.9697, diff_loss=0.3537, tot_loss=1.501, over 1966.00 samples.],
2024-10-22 01:10:08,868 INFO [train.py:682] (1/4) Start epoch 3680
2024-10-22 01:10:26,030 INFO [train.py:561] (1/4) Epoch 3680, batch 6, global_batch_idx: 58870, batch size: 106, loss[dur_loss=0.1772, prior_loss=0.9698, diff_loss=0.3044, tot_loss=1.451, over 106.00 samples.], tot_loss[dur_loss=0.1758, prior_loss=0.9694, diff_loss=0.3898, tot_loss=1.535, over 1142.00 samples.],
2024-10-22 01:10:39,248 INFO [train.py:682] (1/4) Start epoch 3681
2024-10-22 01:10:48,001 INFO [train.py:561] (1/4) Epoch 3681, batch 0, global_batch_idx: 58880, batch size: 108, loss[dur_loss=0.1843, prior_loss=0.9705, diff_loss=0.3077, tot_loss=1.462, over 108.00 samples.], tot_loss[dur_loss=0.1843, prior_loss=0.9705, diff_loss=0.3077, tot_loss=1.462, over 108.00 samples.],
2024-10-22 01:11:02,280 INFO [train.py:561] (1/4) Epoch 3681, batch 10, global_batch_idx: 58890, batch size: 111, loss[dur_loss=0.1808, prior_loss=0.9709, diff_loss=0.2997, tot_loss=1.451, over 111.00 samples.], tot_loss[dur_loss=0.1777, prior_loss=0.9697, diff_loss=0.3669, tot_loss=1.514, over 1656.00 samples.],
2024-10-22 01:11:09,358 INFO [train.py:682] (1/4) Start epoch 3682
2024-10-22 01:11:23,208 INFO [train.py:561] (1/4) Epoch 3682, batch 4, global_batch_idx: 58900, batch size: 189, loss[dur_loss=0.1767, prior_loss=0.97, diff_loss=0.3221, tot_loss=1.469, over 189.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9693, diff_loss=0.406, tot_loss=1.552, over 937.00 samples.],
2024-10-22 01:11:38,132 INFO [train.py:561] (1/4) Epoch 3682, batch 14, global_batch_idx: 58910, batch size: 142, loss[dur_loss=0.1824, prior_loss=0.9699, diff_loss=0.2949, tot_loss=1.447, over 142.00 samples.], tot_loss[dur_loss=0.1788, prior_loss=0.9697, diff_loss=0.3418, tot_loss=1.49, over 2210.00 samples.],
2024-10-22 01:11:39,551 INFO [train.py:682] (1/4) Start epoch 3683
2024-10-22 01:11:59,601 INFO [train.py:561] (1/4) Epoch 3683, batch 8, global_batch_idx: 58920, batch size: 170, loss[dur_loss=0.1813, prior_loss=0.97, diff_loss=0.2977, tot_loss=1.449, over 170.00 samples.], tot_loss[dur_loss=0.1777, prior_loss=0.9696, diff_loss=0.3603, tot_loss=1.508, over 1432.00 samples.],
2024-10-22 01:12:09,701 INFO [train.py:682] (1/4) Start epoch 3684
2024-10-22 01:12:21,188 INFO [train.py:561] (1/4) Epoch 3684, batch 2, global_batch_idx: 58930, batch size: 203, loss[dur_loss=0.1801, prior_loss=0.97, diff_loss=0.3056, tot_loss=1.456, over 203.00 samples.], tot_loss[dur_loss=0.1805, prior_loss=0.9701, diff_loss=0.302, tot_loss=1.453, over 442.00 samples.],
2024-10-22 01:12:35,351 INFO [train.py:561] (1/4) Epoch 3684, batch 12, global_batch_idx: 58940, batch size: 152, loss[dur_loss=0.1782, prior_loss=0.9702, diff_loss=0.3298, tot_loss=1.478, over 152.00 samples.], tot_loss[dur_loss=0.178, prior_loss=0.9697, diff_loss=0.344, tot_loss=1.492, over 1966.00 samples.],
2024-10-22 01:12:39,769 INFO [train.py:682] (1/4) Start epoch 3685
2024-10-22 01:12:56,960 INFO [train.py:561] (1/4) Epoch 3685, batch 6, global_batch_idx: 58950, batch size: 106, loss[dur_loss=0.1781, prior_loss=0.97, diff_loss=0.2875, tot_loss=1.436, over 106.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9695, diff_loss=0.3742, tot_loss=1.521, over 1142.00 samples.],
2024-10-22 01:13:09,983 INFO [train.py:682] (1/4) Start epoch 3686
2024-10-22 01:13:18,718 INFO [train.py:561] (1/4) Epoch 3686, batch 0, global_batch_idx: 58960, batch size: 108, loss[dur_loss=0.182, prior_loss=0.9706, diff_loss=0.2835,
tot_loss=1.436, over 108.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.9706, diff_loss=0.2835, tot_loss=1.436, over 108.00 samples.], 2024-10-22 01:13:33,024 INFO [train.py:561] (1/4) Epoch 3686, batch 10, global_batch_idx: 58970, batch size: 111, loss[dur_loss=0.1795, prior_loss=0.9708, diff_loss=0.2794, tot_loss=1.43, over 111.00 samples.], tot_loss[dur_loss=0.1775, prior_loss=0.9697, diff_loss=0.3574, tot_loss=1.504, over 1656.00 samples.], 2024-10-22 01:13:40,106 INFO [train.py:682] (1/4) Start epoch 3687 2024-10-22 01:13:53,977 INFO [train.py:561] (1/4) Epoch 3687, batch 4, global_batch_idx: 58980, batch size: 189, loss[dur_loss=0.1788, prior_loss=0.9701, diff_loss=0.2923, tot_loss=1.441, over 189.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9693, diff_loss=0.4013, tot_loss=1.547, over 937.00 samples.], 2024-10-22 01:14:08,898 INFO [train.py:561] (1/4) Epoch 3687, batch 14, global_batch_idx: 58990, batch size: 142, loss[dur_loss=0.1812, prior_loss=0.9699, diff_loss=0.3128, tot_loss=1.464, over 142.00 samples.], tot_loss[dur_loss=0.1776, prior_loss=0.9697, diff_loss=0.3437, tot_loss=1.491, over 2210.00 samples.], 2024-10-22 01:14:10,323 INFO [train.py:682] (1/4) Start epoch 3688 2024-10-22 01:14:30,591 INFO [train.py:561] (1/4) Epoch 3688, batch 8, global_batch_idx: 59000, batch size: 170, loss[dur_loss=0.1817, prior_loss=0.9701, diff_loss=0.3396, tot_loss=1.491, over 170.00 samples.], tot_loss[dur_loss=0.1774, prior_loss=0.9696, diff_loss=0.3738, tot_loss=1.521, over 1432.00 samples.], 2024-10-22 01:14:40,713 INFO [train.py:682] (1/4) Start epoch 3689 2024-10-22 01:14:52,184 INFO [train.py:561] (1/4) Epoch 3689, batch 2, global_batch_idx: 59010, batch size: 203, loss[dur_loss=0.1816, prior_loss=0.97, diff_loss=0.29, tot_loss=1.441, over 203.00 samples.], tot_loss[dur_loss=0.1817, prior_loss=0.97, diff_loss=0.2974, tot_loss=1.449, over 442.00 samples.], 2024-10-22 01:15:06,535 INFO [train.py:561] (1/4) Epoch 3689, batch 12, global_batch_idx: 59020, batch size: 152, loss[dur_loss=0.1804, prior_loss=0.9702, diff_loss=0.2935, tot_loss=1.444, over 152.00 samples.], tot_loss[dur_loss=0.1785, prior_loss=0.9697, diff_loss=0.3447, tot_loss=1.493, over 1966.00 samples.], 2024-10-22 01:15:11,002 INFO [train.py:682] (1/4) Start epoch 3690 2024-10-22 01:15:28,087 INFO [train.py:561] (1/4) Epoch 3690, batch 6, global_batch_idx: 59030, batch size: 106, loss[dur_loss=0.1789, prior_loss=0.9697, diff_loss=0.2629, tot_loss=1.411, over 106.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9694, diff_loss=0.383, tot_loss=1.528, over 1142.00 samples.], 2024-10-22 01:15:41,150 INFO [train.py:682] (1/4) Start epoch 3691 2024-10-22 01:15:50,057 INFO [train.py:561] (1/4) Epoch 3691, batch 0, global_batch_idx: 59040, batch size: 108, loss[dur_loss=0.1802, prior_loss=0.9705, diff_loss=0.299, tot_loss=1.45, over 108.00 samples.], tot_loss[dur_loss=0.1802, prior_loss=0.9705, diff_loss=0.299, tot_loss=1.45, over 108.00 samples.], 2024-10-22 01:16:04,340 INFO [train.py:561] (1/4) Epoch 3691, batch 10, global_batch_idx: 59050, batch size: 111, loss[dur_loss=0.178, prior_loss=0.9707, diff_loss=0.2941, tot_loss=1.443, over 111.00 samples.], tot_loss[dur_loss=0.1774, prior_loss=0.9696, diff_loss=0.3622, tot_loss=1.509, over 1656.00 samples.], 2024-10-22 01:16:11,410 INFO [train.py:682] (1/4) Start epoch 3692 2024-10-22 01:16:25,235 INFO [train.py:561] (1/4) Epoch 3692, batch 4, global_batch_idx: 59060, batch size: 189, loss[dur_loss=0.1773, prior_loss=0.9697, diff_loss=0.3, tot_loss=1.447, over 189.00 
samples.], tot_loss[dur_loss=0.1762, prior_loss=0.9692, diff_loss=0.4026, tot_loss=1.548, over 937.00 samples.], 2024-10-22 01:16:40,208 INFO [train.py:561] (1/4) Epoch 3692, batch 14, global_batch_idx: 59070, batch size: 142, loss[dur_loss=0.1801, prior_loss=0.9697, diff_loss=0.2872, tot_loss=1.437, over 142.00 samples.], tot_loss[dur_loss=0.1782, prior_loss=0.9697, diff_loss=0.3361, tot_loss=1.484, over 2210.00 samples.], 2024-10-22 01:16:41,623 INFO [train.py:682] (1/4) Start epoch 3693 2024-10-22 01:17:01,585 INFO [train.py:561] (1/4) Epoch 3693, batch 8, global_batch_idx: 59080, batch size: 170, loss[dur_loss=0.1797, prior_loss=0.97, diff_loss=0.3066, tot_loss=1.456, over 170.00 samples.], tot_loss[dur_loss=0.1769, prior_loss=0.9696, diff_loss=0.3592, tot_loss=1.506, over 1432.00 samples.], 2024-10-22 01:17:11,689 INFO [train.py:682] (1/4) Start epoch 3694 2024-10-22 01:17:23,334 INFO [train.py:561] (1/4) Epoch 3694, batch 2, global_batch_idx: 59090, batch size: 203, loss[dur_loss=0.1815, prior_loss=0.97, diff_loss=0.34, tot_loss=1.492, over 203.00 samples.], tot_loss[dur_loss=0.1798, prior_loss=0.97, diff_loss=0.31, tot_loss=1.46, over 442.00 samples.], 2024-10-22 01:17:37,584 INFO [train.py:561] (1/4) Epoch 3694, batch 12, global_batch_idx: 59100, batch size: 152, loss[dur_loss=0.1789, prior_loss=0.9701, diff_loss=0.3106, tot_loss=1.46, over 152.00 samples.], tot_loss[dur_loss=0.1785, prior_loss=0.9697, diff_loss=0.3437, tot_loss=1.492, over 1966.00 samples.], 2024-10-22 01:17:42,047 INFO [train.py:682] (1/4) Start epoch 3695 2024-10-22 01:17:59,467 INFO [train.py:561] (1/4) Epoch 3695, batch 6, global_batch_idx: 59110, batch size: 106, loss[dur_loss=0.1774, prior_loss=0.9699, diff_loss=0.3196, tot_loss=1.467, over 106.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9694, diff_loss=0.3827, tot_loss=1.529, over 1142.00 samples.], 2024-10-22 01:18:12,559 INFO [train.py:682] (1/4) Start epoch 3696 2024-10-22 01:18:21,329 INFO [train.py:561] (1/4) Epoch 3696, batch 0, global_batch_idx: 59120, batch size: 108, loss[dur_loss=0.1814, prior_loss=0.9705, diff_loss=0.2635, tot_loss=1.415, over 108.00 samples.], tot_loss[dur_loss=0.1814, prior_loss=0.9705, diff_loss=0.2635, tot_loss=1.415, over 108.00 samples.], 2024-10-22 01:18:35,584 INFO [train.py:561] (1/4) Epoch 3696, batch 10, global_batch_idx: 59130, batch size: 111, loss[dur_loss=0.1805, prior_loss=0.9708, diff_loss=0.3157, tot_loss=1.467, over 111.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9697, diff_loss=0.3594, tot_loss=1.507, over 1656.00 samples.], 2024-10-22 01:18:42,646 INFO [train.py:682] (1/4) Start epoch 3697 2024-10-22 01:18:56,536 INFO [train.py:561] (1/4) Epoch 3697, batch 4, global_batch_idx: 59140, batch size: 189, loss[dur_loss=0.1761, prior_loss=0.9698, diff_loss=0.289, tot_loss=1.435, over 189.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.9692, diff_loss=0.406, tot_loss=1.55, over 937.00 samples.], 2024-10-22 01:19:11,317 INFO [train.py:561] (1/4) Epoch 3697, batch 14, global_batch_idx: 59150, batch size: 142, loss[dur_loss=0.1798, prior_loss=0.9697, diff_loss=0.2755, tot_loss=1.425, over 142.00 samples.], tot_loss[dur_loss=0.1777, prior_loss=0.9697, diff_loss=0.3394, tot_loss=1.487, over 2210.00 samples.], 2024-10-22 01:19:12,730 INFO [train.py:682] (1/4) Start epoch 3698 2024-10-22 01:19:33,247 INFO [train.py:561] (1/4) Epoch 3698, batch 8, global_batch_idx: 59160, batch size: 170, loss[dur_loss=0.183, prior_loss=0.9701, diff_loss=0.2815, tot_loss=1.435, over 170.00 samples.], 
tot_loss[dur_loss=0.1772, prior_loss=0.9695, diff_loss=0.3581, tot_loss=1.505, over 1432.00 samples.], 2024-10-22 01:19:43,379 INFO [train.py:682] (1/4) Start epoch 3699 2024-10-22 01:19:54,607 INFO [train.py:561] (1/4) Epoch 3699, batch 2, global_batch_idx: 59170, batch size: 203, loss[dur_loss=0.1806, prior_loss=0.9699, diff_loss=0.2858, tot_loss=1.436, over 203.00 samples.], tot_loss[dur_loss=0.18, prior_loss=0.9699, diff_loss=0.27, tot_loss=1.42, over 442.00 samples.], 2024-10-22 01:20:08,899 INFO [train.py:561] (1/4) Epoch 3699, batch 12, global_batch_idx: 59180, batch size: 152, loss[dur_loss=0.1786, prior_loss=0.9701, diff_loss=0.2883, tot_loss=1.437, over 152.00 samples.], tot_loss[dur_loss=0.1777, prior_loss=0.9696, diff_loss=0.3287, tot_loss=1.476, over 1966.00 samples.], 2024-10-22 01:20:13,345 INFO [train.py:682] (1/4) Start epoch 3700 2024-10-22 01:20:30,748 INFO [train.py:561] (1/4) Epoch 3700, batch 6, global_batch_idx: 59190, batch size: 106, loss[dur_loss=0.1749, prior_loss=0.9697, diff_loss=0.3056, tot_loss=1.45, over 106.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9693, diff_loss=0.3881, tot_loss=1.534, over 1142.00 samples.], 2024-10-22 01:20:43,841 INFO [train.py:682] (1/4) Start epoch 3701 2024-10-22 01:20:52,541 INFO [train.py:561] (1/4) Epoch 3701, batch 0, global_batch_idx: 59200, batch size: 108, loss[dur_loss=0.181, prior_loss=0.9704, diff_loss=0.2708, tot_loss=1.422, over 108.00 samples.], tot_loss[dur_loss=0.181, prior_loss=0.9704, diff_loss=0.2708, tot_loss=1.422, over 108.00 samples.], 2024-10-22 01:21:06,795 INFO [train.py:561] (1/4) Epoch 3701, batch 10, global_batch_idx: 59210, batch size: 111, loss[dur_loss=0.1808, prior_loss=0.9712, diff_loss=0.3169, tot_loss=1.469, over 111.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9697, diff_loss=0.3555, tot_loss=1.502, over 1656.00 samples.], 2024-10-22 01:21:13,929 INFO [train.py:682] (1/4) Start epoch 3702 2024-10-22 01:21:27,619 INFO [train.py:561] (1/4) Epoch 3702, batch 4, global_batch_idx: 59220, batch size: 189, loss[dur_loss=0.178, prior_loss=0.97, diff_loss=0.3074, tot_loss=1.455, over 189.00 samples.], tot_loss[dur_loss=0.175, prior_loss=0.9692, diff_loss=0.3976, tot_loss=1.542, over 937.00 samples.], 2024-10-22 01:21:42,392 INFO [train.py:561] (1/4) Epoch 3702, batch 14, global_batch_idx: 59230, batch size: 142, loss[dur_loss=0.1785, prior_loss=0.9697, diff_loss=0.2698, tot_loss=1.418, over 142.00 samples.], tot_loss[dur_loss=0.1774, prior_loss=0.9697, diff_loss=0.3321, tot_loss=1.479, over 2210.00 samples.], 2024-10-22 01:21:43,856 INFO [train.py:682] (1/4) Start epoch 3703 2024-10-22 01:22:03,826 INFO [train.py:561] (1/4) Epoch 3703, batch 8, global_batch_idx: 59240, batch size: 170, loss[dur_loss=0.1799, prior_loss=0.9699, diff_loss=0.2808, tot_loss=1.431, over 170.00 samples.], tot_loss[dur_loss=0.1773, prior_loss=0.9694, diff_loss=0.3654, tot_loss=1.512, over 1432.00 samples.], 2024-10-22 01:22:13,966 INFO [train.py:682] (1/4) Start epoch 3704 2024-10-22 01:22:25,324 INFO [train.py:561] (1/4) Epoch 3704, batch 2, global_batch_idx: 59250, batch size: 203, loss[dur_loss=0.1791, prior_loss=0.9699, diff_loss=0.3097, tot_loss=1.459, over 203.00 samples.], tot_loss[dur_loss=0.1788, prior_loss=0.97, diff_loss=0.2952, tot_loss=1.444, over 442.00 samples.], 2024-10-22 01:22:39,623 INFO [train.py:561] (1/4) Epoch 3704, batch 12, global_batch_idx: 59260, batch size: 152, loss[dur_loss=0.1782, prior_loss=0.97, diff_loss=0.327, tot_loss=1.475, over 152.00 samples.], tot_loss[dur_loss=0.1774, 
prior_loss=0.9697, diff_loss=0.3458, tot_loss=1.493, over 1966.00 samples.], 2024-10-22 01:22:44,092 INFO [train.py:682] (1/4) Start epoch 3705 2024-10-22 01:23:01,184 INFO [train.py:561] (1/4) Epoch 3705, batch 6, global_batch_idx: 59270, batch size: 106, loss[dur_loss=0.1738, prior_loss=0.9697, diff_loss=0.2838, tot_loss=1.427, over 106.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9693, diff_loss=0.3817, tot_loss=1.527, over 1142.00 samples.], 2024-10-22 01:23:14,291 INFO [train.py:682] (1/4) Start epoch 3706 2024-10-22 01:23:23,474 INFO [train.py:561] (1/4) Epoch 3706, batch 0, global_batch_idx: 59280, batch size: 108, loss[dur_loss=0.182, prior_loss=0.9704, diff_loss=0.2802, tot_loss=1.433, over 108.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.9704, diff_loss=0.2802, tot_loss=1.433, over 108.00 samples.], 2024-10-22 01:23:37,758 INFO [train.py:561] (1/4) Epoch 3706, batch 10, global_batch_idx: 59290, batch size: 111, loss[dur_loss=0.1819, prior_loss=0.9708, diff_loss=0.2928, tot_loss=1.445, over 111.00 samples.], tot_loss[dur_loss=0.1775, prior_loss=0.9696, diff_loss=0.3488, tot_loss=1.496, over 1656.00 samples.], 2024-10-22 01:23:44,861 INFO [train.py:682] (1/4) Start epoch 3707 2024-10-22 01:23:58,550 INFO [train.py:561] (1/4) Epoch 3707, batch 4, global_batch_idx: 59300, batch size: 189, loss[dur_loss=0.1771, prior_loss=0.9699, diff_loss=0.3067, tot_loss=1.454, over 189.00 samples.], tot_loss[dur_loss=0.1745, prior_loss=0.9692, diff_loss=0.4082, tot_loss=1.552, over 937.00 samples.], 2024-10-22 01:24:13,487 INFO [train.py:561] (1/4) Epoch 3707, batch 14, global_batch_idx: 59310, batch size: 142, loss[dur_loss=0.1812, prior_loss=0.9698, diff_loss=0.3044, tot_loss=1.455, over 142.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9696, diff_loss=0.3415, tot_loss=1.488, over 2210.00 samples.], 2024-10-22 01:24:14,908 INFO [train.py:682] (1/4) Start epoch 3708 2024-10-22 01:24:34,721 INFO [train.py:561] (1/4) Epoch 3708, batch 8, global_batch_idx: 59320, batch size: 170, loss[dur_loss=0.1801, prior_loss=0.9702, diff_loss=0.3248, tot_loss=1.475, over 170.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9696, diff_loss=0.3686, tot_loss=1.516, over 1432.00 samples.], 2024-10-22 01:24:44,864 INFO [train.py:682] (1/4) Start epoch 3709 2024-10-22 01:24:56,550 INFO [train.py:561] (1/4) Epoch 3709, batch 2, global_batch_idx: 59330, batch size: 203, loss[dur_loss=0.181, prior_loss=0.9699, diff_loss=0.2843, tot_loss=1.435, over 203.00 samples.], tot_loss[dur_loss=0.1805, prior_loss=0.97, diff_loss=0.2778, tot_loss=1.428, over 442.00 samples.], 2024-10-22 01:25:10,888 INFO [train.py:561] (1/4) Epoch 3709, batch 12, global_batch_idx: 59340, batch size: 152, loss[dur_loss=0.1775, prior_loss=0.9701, diff_loss=0.2693, tot_loss=1.417, over 152.00 samples.], tot_loss[dur_loss=0.1775, prior_loss=0.9696, diff_loss=0.3465, tot_loss=1.494, over 1966.00 samples.], 2024-10-22 01:25:15,406 INFO [train.py:682] (1/4) Start epoch 3710 2024-10-22 01:25:32,618 INFO [train.py:561] (1/4) Epoch 3710, batch 6, global_batch_idx: 59350, batch size: 106, loss[dur_loss=0.1744, prior_loss=0.9697, diff_loss=0.2638, tot_loss=1.408, over 106.00 samples.], tot_loss[dur_loss=0.1757, prior_loss=0.9693, diff_loss=0.3851, tot_loss=1.53, over 1142.00 samples.], 2024-10-22 01:25:45,784 INFO [train.py:682] (1/4) Start epoch 3711 2024-10-22 01:25:54,498 INFO [train.py:561] (1/4) Epoch 3711, batch 0, global_batch_idx: 59360, batch size: 108, loss[dur_loss=0.1819, prior_loss=0.9705, diff_loss=0.2765, tot_loss=1.429, 
over 108.00 samples.], tot_loss[dur_loss=0.1819, prior_loss=0.9705, diff_loss=0.2765, tot_loss=1.429, over 108.00 samples.], 2024-10-22 01:26:08,831 INFO [train.py:561] (1/4) Epoch 3711, batch 10, global_batch_idx: 59370, batch size: 111, loss[dur_loss=0.1809, prior_loss=0.971, diff_loss=0.2729, tot_loss=1.425, over 111.00 samples.], tot_loss[dur_loss=0.1787, prior_loss=0.9697, diff_loss=0.3502, tot_loss=1.499, over 1656.00 samples.], 2024-10-22 01:26:15,922 INFO [train.py:682] (1/4) Start epoch 3712 2024-10-22 01:26:29,666 INFO [train.py:561] (1/4) Epoch 3712, batch 4, global_batch_idx: 59380, batch size: 189, loss[dur_loss=0.1807, prior_loss=0.9699, diff_loss=0.3245, tot_loss=1.475, over 189.00 samples.], tot_loss[dur_loss=0.1761, prior_loss=0.9691, diff_loss=0.4092, tot_loss=1.554, over 937.00 samples.], 2024-10-22 01:26:44,712 INFO [train.py:561] (1/4) Epoch 3712, batch 14, global_batch_idx: 59390, batch size: 142, loss[dur_loss=0.1827, prior_loss=0.9697, diff_loss=0.285, tot_loss=1.437, over 142.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9697, diff_loss=0.3462, tot_loss=1.494, over 2210.00 samples.], 2024-10-22 01:26:46,140 INFO [train.py:682] (1/4) Start epoch 3713 2024-10-22 01:27:06,373 INFO [train.py:561] (1/4) Epoch 3713, batch 8, global_batch_idx: 59400, batch size: 170, loss[dur_loss=0.1828, prior_loss=0.97, diff_loss=0.3241, tot_loss=1.477, over 170.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9695, diff_loss=0.3758, tot_loss=1.522, over 1432.00 samples.], 2024-10-22 01:27:16,695 INFO [train.py:682] (1/4) Start epoch 3714 2024-10-22 01:27:28,347 INFO [train.py:561] (1/4) Epoch 3714, batch 2, global_batch_idx: 59410, batch size: 203, loss[dur_loss=0.1785, prior_loss=0.9699, diff_loss=0.3202, tot_loss=1.469, over 203.00 samples.], tot_loss[dur_loss=0.178, prior_loss=0.9699, diff_loss=0.2985, tot_loss=1.446, over 442.00 samples.], 2024-10-22 01:27:42,832 INFO [train.py:561] (1/4) Epoch 3714, batch 12, global_batch_idx: 59420, batch size: 152, loss[dur_loss=0.1798, prior_loss=0.9701, diff_loss=0.2723, tot_loss=1.422, over 152.00 samples.], tot_loss[dur_loss=0.1774, prior_loss=0.9697, diff_loss=0.3372, tot_loss=1.484, over 1966.00 samples.], 2024-10-22 01:27:47,331 INFO [train.py:682] (1/4) Start epoch 3715 2024-10-22 01:28:04,827 INFO [train.py:561] (1/4) Epoch 3715, batch 6, global_batch_idx: 59430, batch size: 106, loss[dur_loss=0.18, prior_loss=0.97, diff_loss=0.2568, tot_loss=1.407, over 106.00 samples.], tot_loss[dur_loss=0.1765, prior_loss=0.9693, diff_loss=0.3734, tot_loss=1.519, over 1142.00 samples.], 2024-10-22 01:28:18,007 INFO [train.py:682] (1/4) Start epoch 3716 2024-10-22 01:28:26,730 INFO [train.py:561] (1/4) Epoch 3716, batch 0, global_batch_idx: 59440, batch size: 108, loss[dur_loss=0.1802, prior_loss=0.9704, diff_loss=0.2396, tot_loss=1.39, over 108.00 samples.], tot_loss[dur_loss=0.1802, prior_loss=0.9704, diff_loss=0.2396, tot_loss=1.39, over 108.00 samples.], 2024-10-22 01:28:41,055 INFO [train.py:561] (1/4) Epoch 3716, batch 10, global_batch_idx: 59450, batch size: 111, loss[dur_loss=0.179, prior_loss=0.9709, diff_loss=0.3287, tot_loss=1.479, over 111.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9697, diff_loss=0.3607, tot_loss=1.508, over 1656.00 samples.], 2024-10-22 01:28:48,241 INFO [train.py:682] (1/4) Start epoch 3717 2024-10-22 01:29:02,275 INFO [train.py:561] (1/4) Epoch 3717, batch 4, global_batch_idx: 59460, batch size: 189, loss[dur_loss=0.1798, prior_loss=0.9701, diff_loss=0.2867, tot_loss=1.437, over 189.00 samples.], 
tot_loss[dur_loss=0.1755, prior_loss=0.9694, diff_loss=0.403, tot_loss=1.548, over 937.00 samples.], 2024-10-22 01:29:17,244 INFO [train.py:561] (1/4) Epoch 3717, batch 14, global_batch_idx: 59470, batch size: 142, loss[dur_loss=0.1836, prior_loss=0.9699, diff_loss=0.3055, tot_loss=1.459, over 142.00 samples.], tot_loss[dur_loss=0.178, prior_loss=0.9699, diff_loss=0.3366, tot_loss=1.484, over 2210.00 samples.], 2024-10-22 01:29:18,698 INFO [train.py:682] (1/4) Start epoch 3718 2024-10-22 01:29:38,768 INFO [train.py:561] (1/4) Epoch 3718, batch 8, global_batch_idx: 59480, batch size: 170, loss[dur_loss=0.1815, prior_loss=0.97, diff_loss=0.3055, tot_loss=1.457, over 170.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9696, diff_loss=0.3606, tot_loss=1.507, over 1432.00 samples.], 2024-10-22 01:29:49,004 INFO [train.py:682] (1/4) Start epoch 3719 2024-10-22 01:30:00,499 INFO [train.py:561] (1/4) Epoch 3719, batch 2, global_batch_idx: 59490, batch size: 203, loss[dur_loss=0.1822, prior_loss=0.9699, diff_loss=0.3408, tot_loss=1.493, over 203.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.97, diff_loss=0.3131, tot_loss=1.465, over 442.00 samples.], 2024-10-22 01:30:14,845 INFO [train.py:561] (1/4) Epoch 3719, batch 12, global_batch_idx: 59500, batch size: 152, loss[dur_loss=0.1779, prior_loss=0.9701, diff_loss=0.2821, tot_loss=1.43, over 152.00 samples.], tot_loss[dur_loss=0.1785, prior_loss=0.9698, diff_loss=0.3427, tot_loss=1.491, over 1966.00 samples.], 2024-10-22 01:30:19,323 INFO [train.py:682] (1/4) Start epoch 3720 2024-10-22 01:30:36,536 INFO [train.py:561] (1/4) Epoch 3720, batch 6, global_batch_idx: 59510, batch size: 106, loss[dur_loss=0.1788, prior_loss=0.9699, diff_loss=0.2516, tot_loss=1.4, over 106.00 samples.], tot_loss[dur_loss=0.1761, prior_loss=0.9696, diff_loss=0.391, tot_loss=1.537, over 1142.00 samples.], 2024-10-22 01:30:49,654 INFO [train.py:682] (1/4) Start epoch 3721 2024-10-22 01:30:58,329 INFO [train.py:561] (1/4) Epoch 3721, batch 0, global_batch_idx: 59520, batch size: 108, loss[dur_loss=0.18, prior_loss=0.9706, diff_loss=0.3359, tot_loss=1.487, over 108.00 samples.], tot_loss[dur_loss=0.18, prior_loss=0.9706, diff_loss=0.3359, tot_loss=1.487, over 108.00 samples.], 2024-10-22 01:31:12,576 INFO [train.py:561] (1/4) Epoch 3721, batch 10, global_batch_idx: 59530, batch size: 111, loss[dur_loss=0.1814, prior_loss=0.971, diff_loss=0.2821, tot_loss=1.435, over 111.00 samples.], tot_loss[dur_loss=0.1783, prior_loss=0.9697, diff_loss=0.3648, tot_loss=1.513, over 1656.00 samples.], 2024-10-22 01:31:19,758 INFO [train.py:682] (1/4) Start epoch 3722 2024-10-22 01:31:33,609 INFO [train.py:561] (1/4) Epoch 3722, batch 4, global_batch_idx: 59540, batch size: 189, loss[dur_loss=0.18, prior_loss=0.9699, diff_loss=0.3033, tot_loss=1.453, over 189.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9693, diff_loss=0.4008, tot_loss=1.546, over 937.00 samples.], 2024-10-22 01:31:48,597 INFO [train.py:561] (1/4) Epoch 3722, batch 14, global_batch_idx: 59550, batch size: 142, loss[dur_loss=0.1814, prior_loss=0.9697, diff_loss=0.2707, tot_loss=1.422, over 142.00 samples.], tot_loss[dur_loss=0.1782, prior_loss=0.9697, diff_loss=0.3362, tot_loss=1.484, over 2210.00 samples.], 2024-10-22 01:31:50,035 INFO [train.py:682] (1/4) Start epoch 3723 2024-10-22 01:32:10,580 INFO [train.py:561] (1/4) Epoch 3723, batch 8, global_batch_idx: 59560, batch size: 170, loss[dur_loss=0.1815, prior_loss=0.9703, diff_loss=0.3001, tot_loss=1.452, over 170.00 samples.], tot_loss[dur_loss=0.177, 
prior_loss=0.9696, diff_loss=0.3744, tot_loss=1.521, over 1432.00 samples.], 2024-10-22 01:32:20,914 INFO [train.py:682] (1/4) Start epoch 3724 2024-10-22 01:32:32,513 INFO [train.py:561] (1/4) Epoch 3724, batch 2, global_batch_idx: 59570, batch size: 203, loss[dur_loss=0.1791, prior_loss=0.9699, diff_loss=0.3339, tot_loss=1.483, over 203.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9701, diff_loss=0.3211, tot_loss=1.47, over 442.00 samples.], 2024-10-22 01:32:46,951 INFO [train.py:561] (1/4) Epoch 3724, batch 12, global_batch_idx: 59580, batch size: 152, loss[dur_loss=0.1738, prior_loss=0.9698, diff_loss=0.2759, tot_loss=1.419, over 152.00 samples.], tot_loss[dur_loss=0.1776, prior_loss=0.9697, diff_loss=0.3474, tot_loss=1.495, over 1966.00 samples.], 2024-10-22 01:32:51,483 INFO [train.py:682] (1/4) Start epoch 3725 2024-10-22 01:33:08,420 INFO [train.py:561] (1/4) Epoch 3725, batch 6, global_batch_idx: 59590, batch size: 106, loss[dur_loss=0.1764, prior_loss=0.9697, diff_loss=0.3097, tot_loss=1.456, over 106.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9695, diff_loss=0.3919, tot_loss=1.538, over 1142.00 samples.], 2024-10-22 01:33:21,635 INFO [train.py:682] (1/4) Start epoch 3726 2024-10-22 01:33:30,724 INFO [train.py:561] (1/4) Epoch 3726, batch 0, global_batch_idx: 59600, batch size: 108, loss[dur_loss=0.182, prior_loss=0.9703, diff_loss=0.2916, tot_loss=1.444, over 108.00 samples.], tot_loss[dur_loss=0.182, prior_loss=0.9703, diff_loss=0.2916, tot_loss=1.444, over 108.00 samples.], 2024-10-22 01:33:45,063 INFO [train.py:561] (1/4) Epoch 3726, batch 10, global_batch_idx: 59610, batch size: 111, loss[dur_loss=0.1803, prior_loss=0.9709, diff_loss=0.2906, tot_loss=1.442, over 111.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9696, diff_loss=0.3656, tot_loss=1.512, over 1656.00 samples.], 2024-10-22 01:33:52,266 INFO [train.py:682] (1/4) Start epoch 3727 2024-10-22 01:34:05,972 INFO [train.py:561] (1/4) Epoch 3727, batch 4, global_batch_idx: 59620, batch size: 189, loss[dur_loss=0.1782, prior_loss=0.97, diff_loss=0.2939, tot_loss=1.442, over 189.00 samples.], tot_loss[dur_loss=0.1771, prior_loss=0.9693, diff_loss=0.3902, tot_loss=1.537, over 937.00 samples.], 2024-10-22 01:34:21,046 INFO [train.py:561] (1/4) Epoch 3727, batch 14, global_batch_idx: 59630, batch size: 142, loss[dur_loss=0.1825, prior_loss=0.9698, diff_loss=0.2779, tot_loss=1.43, over 142.00 samples.], tot_loss[dur_loss=0.1792, prior_loss=0.9697, diff_loss=0.3354, tot_loss=1.484, over 2210.00 samples.], 2024-10-22 01:34:22,501 INFO [train.py:682] (1/4) Start epoch 3728 2024-10-22 01:34:42,854 INFO [train.py:561] (1/4) Epoch 3728, batch 8, global_batch_idx: 59640, batch size: 170, loss[dur_loss=0.1798, prior_loss=0.9701, diff_loss=0.3079, tot_loss=1.458, over 170.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9695, diff_loss=0.3609, tot_loss=1.507, over 1432.00 samples.], 2024-10-22 01:34:53,115 INFO [train.py:682] (1/4) Start epoch 3729 2024-10-22 01:35:04,464 INFO [train.py:561] (1/4) Epoch 3729, batch 2, global_batch_idx: 59650, batch size: 203, loss[dur_loss=0.1794, prior_loss=0.97, diff_loss=0.3547, tot_loss=1.504, over 203.00 samples.], tot_loss[dur_loss=0.1792, prior_loss=0.97, diff_loss=0.3321, tot_loss=1.481, over 442.00 samples.], 2024-10-22 01:35:18,798 INFO [train.py:561] (1/4) Epoch 3729, batch 12, global_batch_idx: 59660, batch size: 152, loss[dur_loss=0.1792, prior_loss=0.9701, diff_loss=0.305, tot_loss=1.454, over 152.00 samples.], tot_loss[dur_loss=0.1775, prior_loss=0.9696, 
diff_loss=0.3547, tot_loss=1.502, over 1966.00 samples.], 2024-10-22 01:35:23,297 INFO [train.py:682] (1/4) Start epoch 3730 2024-10-22 01:35:40,275 INFO [train.py:561] (1/4) Epoch 3730, batch 6, global_batch_idx: 59670, batch size: 106, loss[dur_loss=0.1782, prior_loss=0.9698, diff_loss=0.2917, tot_loss=1.44, over 106.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9694, diff_loss=0.3782, tot_loss=1.523, over 1142.00 samples.], 2024-10-22 01:35:53,449 INFO [train.py:682] (1/4) Start epoch 3731 2024-10-22 01:36:02,348 INFO [train.py:561] (1/4) Epoch 3731, batch 0, global_batch_idx: 59680, batch size: 108, loss[dur_loss=0.1831, prior_loss=0.9703, diff_loss=0.2896, tot_loss=1.443, over 108.00 samples.], tot_loss[dur_loss=0.1831, prior_loss=0.9703, diff_loss=0.2896, tot_loss=1.443, over 108.00 samples.], 2024-10-22 01:36:16,722 INFO [train.py:561] (1/4) Epoch 3731, batch 10, global_batch_idx: 59690, batch size: 111, loss[dur_loss=0.1814, prior_loss=0.971, diff_loss=0.273, tot_loss=1.425, over 111.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9696, diff_loss=0.3477, tot_loss=1.494, over 1656.00 samples.], 2024-10-22 01:36:23,927 INFO [train.py:682] (1/4) Start epoch 3732 2024-10-22 01:36:37,777 INFO [train.py:561] (1/4) Epoch 3732, batch 4, global_batch_idx: 59700, batch size: 189, loss[dur_loss=0.1779, prior_loss=0.9698, diff_loss=0.3134, tot_loss=1.461, over 189.00 samples.], tot_loss[dur_loss=0.1759, prior_loss=0.9692, diff_loss=0.4185, tot_loss=1.564, over 937.00 samples.], 2024-10-22 01:36:52,756 INFO [train.py:561] (1/4) Epoch 3732, batch 14, global_batch_idx: 59710, batch size: 142, loss[dur_loss=0.1808, prior_loss=0.9698, diff_loss=0.2959, tot_loss=1.446, over 142.00 samples.], tot_loss[dur_loss=0.1777, prior_loss=0.9696, diff_loss=0.339, tot_loss=1.486, over 2210.00 samples.], 2024-10-22 01:36:54,197 INFO [train.py:682] (1/4) Start epoch 3733 2024-10-22 01:37:14,032 INFO [train.py:561] (1/4) Epoch 3733, batch 8, global_batch_idx: 59720, batch size: 170, loss[dur_loss=0.1788, prior_loss=0.97, diff_loss=0.3011, tot_loss=1.45, over 170.00 samples.], tot_loss[dur_loss=0.1763, prior_loss=0.9694, diff_loss=0.365, tot_loss=1.511, over 1432.00 samples.], 2024-10-22 01:37:24,343 INFO [train.py:682] (1/4) Start epoch 3734 2024-10-22 01:37:36,036 INFO [train.py:561] (1/4) Epoch 3734, batch 2, global_batch_idx: 59730, batch size: 203, loss[dur_loss=0.179, prior_loss=0.9698, diff_loss=0.3192, tot_loss=1.468, over 203.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9698, diff_loss=0.3026, tot_loss=1.451, over 442.00 samples.], 2024-10-22 01:37:50,391 INFO [train.py:561] (1/4) Epoch 3734, batch 12, global_batch_idx: 59740, batch size: 152, loss[dur_loss=0.177, prior_loss=0.9698, diff_loss=0.2892, tot_loss=1.436, over 152.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9695, diff_loss=0.3414, tot_loss=1.488, over 1966.00 samples.], 2024-10-22 01:37:54,937 INFO [train.py:682] (1/4) Start epoch 3735 2024-10-22 01:38:11,876 INFO [train.py:561] (1/4) Epoch 3735, batch 6, global_batch_idx: 59750, batch size: 106, loss[dur_loss=0.178, prior_loss=0.9698, diff_loss=0.3309, tot_loss=1.479, over 106.00 samples.], tot_loss[dur_loss=0.1761, prior_loss=0.9692, diff_loss=0.3798, tot_loss=1.525, over 1142.00 samples.], 2024-10-22 01:38:25,151 INFO [train.py:682] (1/4) Start epoch 3736 2024-10-22 01:38:34,238 INFO [train.py:561] (1/4) Epoch 3736, batch 0, global_batch_idx: 59760, batch size: 108, loss[dur_loss=0.18, prior_loss=0.9702, diff_loss=0.2874, tot_loss=1.438, over 108.00 samples.], 
tot_loss[dur_loss=0.18, prior_loss=0.9702, diff_loss=0.2874, tot_loss=1.438, over 108.00 samples.], 2024-10-22 01:38:48,651 INFO [train.py:561] (1/4) Epoch 3736, batch 10, global_batch_idx: 59770, batch size: 111, loss[dur_loss=0.1782, prior_loss=0.9707, diff_loss=0.2535, tot_loss=1.402, over 111.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9696, diff_loss=0.3494, tot_loss=1.496, over 1656.00 samples.], 2024-10-22 01:38:55,899 INFO [train.py:682] (1/4) Start epoch 3737 2024-10-22 01:39:09,864 INFO [train.py:561] (1/4) Epoch 3737, batch 4, global_batch_idx: 59780, batch size: 189, loss[dur_loss=0.1784, prior_loss=0.9699, diff_loss=0.3495, tot_loss=1.498, over 189.00 samples.], tot_loss[dur_loss=0.1758, prior_loss=0.9691, diff_loss=0.4098, tot_loss=1.555, over 937.00 samples.], 2024-10-22 01:39:24,901 INFO [train.py:561] (1/4) Epoch 3737, batch 14, global_batch_idx: 59790, batch size: 142, loss[dur_loss=0.1819, prior_loss=0.9696, diff_loss=0.2904, tot_loss=1.442, over 142.00 samples.], tot_loss[dur_loss=0.1778, prior_loss=0.9696, diff_loss=0.3386, tot_loss=1.486, over 2210.00 samples.], 2024-10-22 01:39:26,379 INFO [train.py:682] (1/4) Start epoch 3738 2024-10-22 01:39:46,385 INFO [train.py:561] (1/4) Epoch 3738, batch 8, global_batch_idx: 59800, batch size: 170, loss[dur_loss=0.1805, prior_loss=0.9699, diff_loss=0.3003, tot_loss=1.451, over 170.00 samples.], tot_loss[dur_loss=0.1763, prior_loss=0.9694, diff_loss=0.377, tot_loss=1.523, over 1432.00 samples.], 2024-10-22 01:39:56,728 INFO [train.py:682] (1/4) Start epoch 3739 2024-10-22 01:40:08,442 INFO [train.py:561] (1/4) Epoch 3739, batch 2, global_batch_idx: 59810, batch size: 203, loss[dur_loss=0.1779, prior_loss=0.9697, diff_loss=0.2967, tot_loss=1.444, over 203.00 samples.], tot_loss[dur_loss=0.1778, prior_loss=0.9698, diff_loss=0.3132, tot_loss=1.461, over 442.00 samples.], 2024-10-22 01:40:22,763 INFO [train.py:561] (1/4) Epoch 3739, batch 12, global_batch_idx: 59820, batch size: 152, loss[dur_loss=0.174, prior_loss=0.97, diff_loss=0.3153, tot_loss=1.459, over 152.00 samples.], tot_loss[dur_loss=0.1761, prior_loss=0.9696, diff_loss=0.3481, tot_loss=1.494, over 1966.00 samples.], 2024-10-22 01:40:27,288 INFO [train.py:682] (1/4) Start epoch 3740 2024-10-22 01:40:44,368 INFO [train.py:561] (1/4) Epoch 3740, batch 6, global_batch_idx: 59830, batch size: 106, loss[dur_loss=0.1774, prior_loss=0.9696, diff_loss=0.3196, tot_loss=1.467, over 106.00 samples.], tot_loss[dur_loss=0.1753, prior_loss=0.9692, diff_loss=0.389, tot_loss=1.534, over 1142.00 samples.], 2024-10-22 01:40:57,646 INFO [train.py:682] (1/4) Start epoch 3741 2024-10-22 01:41:06,273 INFO [train.py:561] (1/4) Epoch 3741, batch 0, global_batch_idx: 59840, batch size: 108, loss[dur_loss=0.1847, prior_loss=0.9706, diff_loss=0.3054, tot_loss=1.461, over 108.00 samples.], tot_loss[dur_loss=0.1847, prior_loss=0.9706, diff_loss=0.3054, tot_loss=1.461, over 108.00 samples.], 2024-10-22 01:41:20,544 INFO [train.py:561] (1/4) Epoch 3741, batch 10, global_batch_idx: 59850, batch size: 111, loss[dur_loss=0.1779, prior_loss=0.9708, diff_loss=0.2828, tot_loss=1.431, over 111.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9695, diff_loss=0.3547, tot_loss=1.501, over 1656.00 samples.], 2024-10-22 01:41:27,728 INFO [train.py:682] (1/4) Start epoch 3742 2024-10-22 01:41:41,703 INFO [train.py:561] (1/4) Epoch 3742, batch 4, global_batch_idx: 59860, batch size: 189, loss[dur_loss=0.1797, prior_loss=0.9699, diff_loss=0.3181, tot_loss=1.468, over 189.00 samples.], 
tot_loss[dur_loss=0.1739, prior_loss=0.9691, diff_loss=0.4107, tot_loss=1.554, over 937.00 samples.], 2024-10-22 01:41:56,620 INFO [train.py:561] (1/4) Epoch 3742, batch 14, global_batch_idx: 59870, batch size: 142, loss[dur_loss=0.1783, prior_loss=0.9695, diff_loss=0.2785, tot_loss=1.426, over 142.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9696, diff_loss=0.3437, tot_loss=1.49, over 2210.00 samples.], 2024-10-22 01:41:58,053 INFO [train.py:682] (1/4) Start epoch 3743 2024-10-22 01:42:17,859 INFO [train.py:561] (1/4) Epoch 3743, batch 8, global_batch_idx: 59880, batch size: 170, loss[dur_loss=0.1789, prior_loss=0.97, diff_loss=0.3188, tot_loss=1.468, over 170.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9694, diff_loss=0.3774, tot_loss=1.523, over 1432.00 samples.], 2024-10-22 01:42:28,081 INFO [train.py:682] (1/4) Start epoch 3744 2024-10-22 01:42:39,703 INFO [train.py:561] (1/4) Epoch 3744, batch 2, global_batch_idx: 59890, batch size: 203, loss[dur_loss=0.1778, prior_loss=0.9698, diff_loss=0.342, tot_loss=1.49, over 203.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9699, diff_loss=0.3092, tot_loss=1.457, over 442.00 samples.], 2024-10-22 01:42:53,959 INFO [train.py:561] (1/4) Epoch 3744, batch 12, global_batch_idx: 59900, batch size: 152, loss[dur_loss=0.1763, prior_loss=0.9699, diff_loss=0.3023, tot_loss=1.448, over 152.00 samples.], tot_loss[dur_loss=0.1775, prior_loss=0.9696, diff_loss=0.3484, tot_loss=1.495, over 1966.00 samples.], 2024-10-22 01:42:58,481 INFO [train.py:682] (1/4) Start epoch 3745 2024-10-22 01:43:15,525 INFO [train.py:561] (1/4) Epoch 3745, batch 6, global_batch_idx: 59910, batch size: 106, loss[dur_loss=0.1763, prior_loss=0.9697, diff_loss=0.2308, tot_loss=1.377, over 106.00 samples.], tot_loss[dur_loss=0.1763, prior_loss=0.9693, diff_loss=0.3929, tot_loss=1.539, over 1142.00 samples.], 2024-10-22 01:43:28,694 INFO [train.py:682] (1/4) Start epoch 3746 2024-10-22 01:43:37,507 INFO [train.py:561] (1/4) Epoch 3746, batch 0, global_batch_idx: 59920, batch size: 108, loss[dur_loss=0.1816, prior_loss=0.9704, diff_loss=0.2802, tot_loss=1.432, over 108.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9704, diff_loss=0.2802, tot_loss=1.432, over 108.00 samples.], 2024-10-22 01:43:51,792 INFO [train.py:561] (1/4) Epoch 3746, batch 10, global_batch_idx: 59930, batch size: 111, loss[dur_loss=0.1809, prior_loss=0.9708, diff_loss=0.2704, tot_loss=1.422, over 111.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9696, diff_loss=0.3521, tot_loss=1.499, over 1656.00 samples.], 2024-10-22 01:43:59,095 INFO [train.py:682] (1/4) Start epoch 3747 2024-10-22 01:44:13,378 INFO [train.py:561] (1/4) Epoch 3747, batch 4, global_batch_idx: 59940, batch size: 189, loss[dur_loss=0.1777, prior_loss=0.9697, diff_loss=0.3215, tot_loss=1.469, over 189.00 samples.], tot_loss[dur_loss=0.1757, prior_loss=0.9691, diff_loss=0.4091, tot_loss=1.554, over 937.00 samples.], 2024-10-22 01:44:28,446 INFO [train.py:561] (1/4) Epoch 3747, batch 14, global_batch_idx: 59950, batch size: 142, loss[dur_loss=0.1792, prior_loss=0.9695, diff_loss=0.2986, tot_loss=1.447, over 142.00 samples.], tot_loss[dur_loss=0.1779, prior_loss=0.9696, diff_loss=0.3426, tot_loss=1.49, over 2210.00 samples.], 2024-10-22 01:44:29,922 INFO [train.py:682] (1/4) Start epoch 3748 2024-10-22 01:44:50,021 INFO [train.py:561] (1/4) Epoch 3748, batch 8, global_batch_idx: 59960, batch size: 170, loss[dur_loss=0.1832, prior_loss=0.9702, diff_loss=0.3138, tot_loss=1.467, over 170.00 samples.], 
tot_loss[dur_loss=0.1773, prior_loss=0.9695, diff_loss=0.3714, tot_loss=1.518, over 1432.00 samples.], 2024-10-22 01:45:00,308 INFO [train.py:682] (1/4) Start epoch 3749 2024-10-22 01:45:11,666 INFO [train.py:561] (1/4) Epoch 3749, batch 2, global_batch_idx: 59970, batch size: 203, loss[dur_loss=0.1799, prior_loss=0.9697, diff_loss=0.2948, tot_loss=1.444, over 203.00 samples.], tot_loss[dur_loss=0.1802, prior_loss=0.9699, diff_loss=0.283, tot_loss=1.433, over 442.00 samples.], 2024-10-22 01:45:26,011 INFO [train.py:561] (1/4) Epoch 3749, batch 12, global_batch_idx: 59980, batch size: 152, loss[dur_loss=0.1769, prior_loss=0.9701, diff_loss=0.2628, tot_loss=1.41, over 152.00 samples.], tot_loss[dur_loss=0.178, prior_loss=0.9697, diff_loss=0.3352, tot_loss=1.483, over 1966.00 samples.], 2024-10-22 01:45:30,527 INFO [train.py:682] (1/4) Start epoch 3750 2024-10-22 01:45:47,661 INFO [train.py:561] (1/4) Epoch 3750, batch 6, global_batch_idx: 59990, batch size: 106, loss[dur_loss=0.1788, prior_loss=0.9698, diff_loss=0.2879, tot_loss=1.437, over 106.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9694, diff_loss=0.3939, tot_loss=1.54, over 1142.00 samples.], 2024-10-22 01:46:00,828 INFO [train.py:682] (1/4) Start epoch 3751 2024-10-22 01:46:09,585 INFO [train.py:561] (1/4) Epoch 3751, batch 0, global_batch_idx: 60000, batch size: 108, loss[dur_loss=0.18, prior_loss=0.9704, diff_loss=0.291, tot_loss=1.441, over 108.00 samples.], tot_loss[dur_loss=0.18, prior_loss=0.9704, diff_loss=0.291, tot_loss=1.441, over 108.00 samples.], 2024-10-22 01:46:10,998 INFO [train.py:579] (1/4) Computing validation loss 2024-10-22 01:46:33,442 INFO [train.py:589] (1/4) Epoch 3751, validation: dur_loss=0.4583, prior_loss=1.038, diff_loss=0.3865, tot_loss=1.883, over 100.00 samples. 
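Each per-batch "loss[...]" entry above reports three components whose sum is the logged tot_loss: tot_loss = dur_loss + prior_loss + diff_loss (e.g. for Epoch 3751, batch 0: 0.18 + 0.9704 + 0.291 = 1.441). The trailing "tot_loss[... over N samples.]" entry is the sample-weighted running average of those components over the batches seen so far in the epoch, and the periodic validation records (as at global_batch_idx 60000 just above) report the same components on the held-out set. A minimal Python sketch of that bookkeeping, with illustrative names rather than the actual train.py code:

from dataclasses import dataclass, field
from typing import Dict

@dataclass
class LossTracker:
    # Running sums of component_loss * batch_size, plus the total sample count.
    sums: Dict[str, float] = field(
        default_factory=lambda: {"dur_loss": 0.0, "prior_loss": 0.0, "diff_loss": 0.0}
    )
    num_samples: int = 0

    def update(self, dur_loss: float, prior_loss: float, diff_loss: float, batch_size: int) -> None:
        # One per-batch "loss[...]" record; the logged tot_loss is the component sum.
        for name, value in [("dur_loss", dur_loss), ("prior_loss", prior_loss), ("diff_loss", diff_loss)]:
            self.sums[name] += value * batch_size
        self.num_samples += batch_size

    def averages(self) -> Dict[str, float]:
        # Sample-weighted means, as in the running "tot_loss[... over N samples.]" entries.
        avg = {name: total / self.num_samples for name, total in self.sums.items()}
        avg["tot_loss"] = sum(avg.values())
        return avg

# Example from the records above (Epoch 3751, batch 0, batch size 108):
# 0.18 + 0.9704 + 0.291 == 1.4414, logged as tot_loss=1.441.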
2024-10-22 01:46:33,443 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-22 01:46:46,370 INFO [train.py:561] (1/4) Epoch 3751, batch 10, global_batch_idx: 60010, batch size: 111, loss[dur_loss=0.1815, prior_loss=0.9707, diff_loss=0.2775, tot_loss=1.43, over 111.00 samples.], tot_loss[dur_loss=0.1773, prior_loss=0.9697, diff_loss=0.347, tot_loss=1.494, over 1656.00 samples.], 2024-10-22 01:46:53,504 INFO [train.py:682] (1/4) Start epoch 3752 2024-10-22 01:47:07,486 INFO [train.py:561] (1/4) Epoch 3752, batch 4, global_batch_idx: 60020, batch size: 189, loss[dur_loss=0.1796, prior_loss=0.9698, diff_loss=0.3293, tot_loss=1.479, over 189.00 samples.], tot_loss[dur_loss=0.175, prior_loss=0.9692, diff_loss=0.404, tot_loss=1.548, over 937.00 samples.], 2024-10-22 01:47:22,416 INFO [train.py:561] (1/4) Epoch 3752, batch 14, global_batch_idx: 60030, batch size: 142, loss[dur_loss=0.1801, prior_loss=0.9698, diff_loss=0.3079, tot_loss=1.458, over 142.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9697, diff_loss=0.3427, tot_loss=1.49, over 2210.00 samples.], 2024-10-22 01:47:23,845 INFO [train.py:682] (1/4) Start epoch 3753 2024-10-22 01:47:44,339 INFO [train.py:561] (1/4) Epoch 3753, batch 8, global_batch_idx: 60040, batch size: 170, loss[dur_loss=0.1806, prior_loss=0.9701, diff_loss=0.2898, tot_loss=1.44, over 170.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9695, diff_loss=0.3584, tot_loss=1.505, over 1432.00 samples.], 2024-10-22 01:47:54,477 INFO [train.py:682] (1/4) Start epoch 3754 2024-10-22 01:48:06,067 INFO [train.py:561] (1/4) Epoch 3754, batch 2, global_batch_idx: 60050, batch size: 203, loss[dur_loss=0.1792, prior_loss=0.9698, diff_loss=0.3219, tot_loss=1.471, over 203.00 samples.], tot_loss[dur_loss=0.1803, prior_loss=0.9699, diff_loss=0.3081, tot_loss=1.458, over 442.00 samples.], 2024-10-22 01:48:20,385 INFO [train.py:561] (1/4) Epoch 3754, batch 12, global_batch_idx: 60060, batch size: 152, loss[dur_loss=0.179, prior_loss=0.9701, diff_loss=0.2748, tot_loss=1.424, over 152.00 samples.], tot_loss[dur_loss=0.178, prior_loss=0.9696, diff_loss=0.3461, tot_loss=1.494, over 1966.00 samples.], 2024-10-22 01:48:24,826 INFO [train.py:682] (1/4) Start epoch 3755 2024-10-22 01:48:42,399 INFO [train.py:561] (1/4) Epoch 3755, batch 6, global_batch_idx: 60070, batch size: 106, loss[dur_loss=0.1781, prior_loss=0.9696, diff_loss=0.2784, tot_loss=1.426, over 106.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9694, diff_loss=0.3878, tot_loss=1.534, over 1142.00 samples.], 2024-10-22 01:48:55,488 INFO [train.py:682] (1/4) Start epoch 3756 2024-10-22 01:49:04,298 INFO [train.py:561] (1/4) Epoch 3756, batch 0, global_batch_idx: 60080, batch size: 108, loss[dur_loss=0.1802, prior_loss=0.9703, diff_loss=0.2976, tot_loss=1.448, over 108.00 samples.], tot_loss[dur_loss=0.1802, prior_loss=0.9703, diff_loss=0.2976, tot_loss=1.448, over 108.00 samples.], 2024-10-22 01:49:18,768 INFO [train.py:561] (1/4) Epoch 3756, batch 10, global_batch_idx: 60090, batch size: 111, loss[dur_loss=0.1776, prior_loss=0.9707, diff_loss=0.255, tot_loss=1.403, over 111.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9696, diff_loss=0.3496, tot_loss=1.496, over 1656.00 samples.], 2024-10-22 01:49:25,804 INFO [train.py:682] (1/4) Start epoch 3757 2024-10-22 01:49:39,795 INFO [train.py:561] (1/4) Epoch 3757, batch 4, global_batch_idx: 60100, batch size: 189, loss[dur_loss=0.1796, prior_loss=0.9699, diff_loss=0.3184, tot_loss=1.468, over 189.00 samples.], tot_loss[dur_loss=0.1763, 
prior_loss=0.9691, diff_loss=0.4075, tot_loss=1.553, over 937.00 samples.], 2024-10-22 01:49:54,588 INFO [train.py:561] (1/4) Epoch 3757, batch 14, global_batch_idx: 60110, batch size: 142, loss[dur_loss=0.1774, prior_loss=0.9697, diff_loss=0.3035, tot_loss=1.451, over 142.00 samples.], tot_loss[dur_loss=0.1783, prior_loss=0.9696, diff_loss=0.3344, tot_loss=1.482, over 2210.00 samples.], 2024-10-22 01:49:56,008 INFO [train.py:682] (1/4) Start epoch 3758 2024-10-22 01:50:16,109 INFO [train.py:561] (1/4) Epoch 3758, batch 8, global_batch_idx: 60120, batch size: 170, loss[dur_loss=0.1804, prior_loss=0.97, diff_loss=0.3238, tot_loss=1.474, over 170.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9694, diff_loss=0.3717, tot_loss=1.518, over 1432.00 samples.], 2024-10-22 01:50:26,202 INFO [train.py:682] (1/4) Start epoch 3759 2024-10-22 01:50:37,696 INFO [train.py:561] (1/4) Epoch 3759, batch 2, global_batch_idx: 60130, batch size: 203, loss[dur_loss=0.1792, prior_loss=0.9697, diff_loss=0.3311, tot_loss=1.48, over 203.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9698, diff_loss=0.3153, tot_loss=1.463, over 442.00 samples.], 2024-10-22 01:50:51,942 INFO [train.py:561] (1/4) Epoch 3759, batch 12, global_batch_idx: 60140, batch size: 152, loss[dur_loss=0.1776, prior_loss=0.9698, diff_loss=0.2885, tot_loss=1.436, over 152.00 samples.], tot_loss[dur_loss=0.1769, prior_loss=0.9695, diff_loss=0.35, tot_loss=1.496, over 1966.00 samples.], 2024-10-22 01:50:56,405 INFO [train.py:682] (1/4) Start epoch 3760 2024-10-22 01:51:13,849 INFO [train.py:561] (1/4) Epoch 3760, batch 6, global_batch_idx: 60150, batch size: 106, loss[dur_loss=0.1756, prior_loss=0.9698, diff_loss=0.3101, tot_loss=1.456, over 106.00 samples.], tot_loss[dur_loss=0.1759, prior_loss=0.9693, diff_loss=0.3963, tot_loss=1.541, over 1142.00 samples.], 2024-10-22 01:51:26,904 INFO [train.py:682] (1/4) Start epoch 3761 2024-10-22 01:51:35,913 INFO [train.py:561] (1/4) Epoch 3761, batch 0, global_batch_idx: 60160, batch size: 108, loss[dur_loss=0.1812, prior_loss=0.9702, diff_loss=0.2894, tot_loss=1.441, over 108.00 samples.], tot_loss[dur_loss=0.1812, prior_loss=0.9702, diff_loss=0.2894, tot_loss=1.441, over 108.00 samples.], 2024-10-22 01:51:50,349 INFO [train.py:561] (1/4) Epoch 3761, batch 10, global_batch_idx: 60170, batch size: 111, loss[dur_loss=0.1796, prior_loss=0.9707, diff_loss=0.2781, tot_loss=1.428, over 111.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9695, diff_loss=0.3498, tot_loss=1.495, over 1656.00 samples.], 2024-10-22 01:51:57,493 INFO [train.py:682] (1/4) Start epoch 3762 2024-10-22 01:52:11,650 INFO [train.py:561] (1/4) Epoch 3762, batch 4, global_batch_idx: 60180, batch size: 189, loss[dur_loss=0.1787, prior_loss=0.9698, diff_loss=0.3205, tot_loss=1.469, over 189.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9691, diff_loss=0.3922, tot_loss=1.537, over 937.00 samples.], 2024-10-22 01:52:26,607 INFO [train.py:561] (1/4) Epoch 3762, batch 14, global_batch_idx: 60190, batch size: 142, loss[dur_loss=0.1816, prior_loss=0.9696, diff_loss=0.2988, tot_loss=1.45, over 142.00 samples.], tot_loss[dur_loss=0.1779, prior_loss=0.9696, diff_loss=0.3293, tot_loss=1.477, over 2210.00 samples.], 2024-10-22 01:52:28,051 INFO [train.py:682] (1/4) Start epoch 3763 2024-10-22 01:52:48,570 INFO [train.py:561] (1/4) Epoch 3763, batch 8, global_batch_idx: 60200, batch size: 170, loss[dur_loss=0.1808, prior_loss=0.97, diff_loss=0.3023, tot_loss=1.453, over 170.00 samples.], tot_loss[dur_loss=0.1767, prior_loss=0.9694, 
diff_loss=0.3579, tot_loss=1.504, over 1432.00 samples.], 2024-10-22 01:52:58,708 INFO [train.py:682] (1/4) Start epoch 3764 2024-10-22 01:53:10,190 INFO [train.py:561] (1/4) Epoch 3764, batch 2, global_batch_idx: 60210, batch size: 203, loss[dur_loss=0.1773, prior_loss=0.9697, diff_loss=0.3105, tot_loss=1.458, over 203.00 samples.], tot_loss[dur_loss=0.178, prior_loss=0.9698, diff_loss=0.2939, tot_loss=1.442, over 442.00 samples.], 2024-10-22 01:53:24,569 INFO [train.py:561] (1/4) Epoch 3764, batch 12, global_batch_idx: 60220, batch size: 152, loss[dur_loss=0.1782, prior_loss=0.9699, diff_loss=0.291, tot_loss=1.439, over 152.00 samples.], tot_loss[dur_loss=0.1775, prior_loss=0.9695, diff_loss=0.3433, tot_loss=1.49, over 1966.00 samples.], 2024-10-22 01:53:29,038 INFO [train.py:682] (1/4) Start epoch 3765 2024-10-22 01:53:46,257 INFO [train.py:561] (1/4) Epoch 3765, batch 6, global_batch_idx: 60230, batch size: 106, loss[dur_loss=0.1786, prior_loss=0.9696, diff_loss=0.2893, tot_loss=1.437, over 106.00 samples.], tot_loss[dur_loss=0.1755, prior_loss=0.9693, diff_loss=0.3913, tot_loss=1.536, over 1142.00 samples.], 2024-10-22 01:53:59,349 INFO [train.py:682] (1/4) Start epoch 3766 2024-10-22 01:54:08,575 INFO [train.py:561] (1/4) Epoch 3766, batch 0, global_batch_idx: 60240, batch size: 108, loss[dur_loss=0.1824, prior_loss=0.9703, diff_loss=0.2849, tot_loss=1.438, over 108.00 samples.], tot_loss[dur_loss=0.1824, prior_loss=0.9703, diff_loss=0.2849, tot_loss=1.438, over 108.00 samples.], 2024-10-22 01:54:22,869 INFO [train.py:561] (1/4) Epoch 3766, batch 10, global_batch_idx: 60250, batch size: 111, loss[dur_loss=0.1812, prior_loss=0.9706, diff_loss=0.2946, tot_loss=1.446, over 111.00 samples.], tot_loss[dur_loss=0.1769, prior_loss=0.9695, diff_loss=0.3551, tot_loss=1.501, over 1656.00 samples.], 2024-10-22 01:54:29,969 INFO [train.py:682] (1/4) Start epoch 3767 2024-10-22 01:54:43,861 INFO [train.py:561] (1/4) Epoch 3767, batch 4, global_batch_idx: 60260, batch size: 189, loss[dur_loss=0.1791, prior_loss=0.9698, diff_loss=0.2723, tot_loss=1.421, over 189.00 samples.], tot_loss[dur_loss=0.1761, prior_loss=0.969, diff_loss=0.3941, tot_loss=1.539, over 937.00 samples.], 2024-10-22 01:54:58,798 INFO [train.py:561] (1/4) Epoch 3767, batch 14, global_batch_idx: 60270, batch size: 142, loss[dur_loss=0.1809, prior_loss=0.9696, diff_loss=0.286, tot_loss=1.436, over 142.00 samples.], tot_loss[dur_loss=0.1774, prior_loss=0.9695, diff_loss=0.3336, tot_loss=1.48, over 2210.00 samples.], 2024-10-22 01:55:00,232 INFO [train.py:682] (1/4) Start epoch 3768 2024-10-22 01:55:20,431 INFO [train.py:561] (1/4) Epoch 3768, batch 8, global_batch_idx: 60280, batch size: 170, loss[dur_loss=0.18, prior_loss=0.9698, diff_loss=0.2999, tot_loss=1.45, over 170.00 samples.], tot_loss[dur_loss=0.1761, prior_loss=0.9693, diff_loss=0.3731, tot_loss=1.519, over 1432.00 samples.], 2024-10-22 01:55:30,561 INFO [train.py:682] (1/4) Start epoch 3769 2024-10-22 01:55:42,147 INFO [train.py:561] (1/4) Epoch 3769, batch 2, global_batch_idx: 60290, batch size: 203, loss[dur_loss=0.1799, prior_loss=0.9698, diff_loss=0.3353, tot_loss=1.485, over 203.00 samples.], tot_loss[dur_loss=0.18, prior_loss=0.9699, diff_loss=0.3102, tot_loss=1.46, over 442.00 samples.], 2024-10-22 01:55:56,359 INFO [train.py:561] (1/4) Epoch 3769, batch 12, global_batch_idx: 60300, batch size: 152, loss[dur_loss=0.1771, prior_loss=0.9699, diff_loss=0.3156, tot_loss=1.463, over 152.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9695, diff_loss=0.3501, 
tot_loss=1.496, over 1966.00 samples.],
2024-10-22 01:56:00,797 INFO [train.py:682] (1/4) Start epoch 3770
2024-10-22 01:56:17,859 INFO [train.py:561] (1/4) Epoch 3770, batch 6, global_batch_idx: 60310, batch size: 106, loss[dur_loss=0.1747, prior_loss=0.9696, diff_loss=0.2979, tot_loss=1.442, over 106.00 samples.], tot_loss[dur_loss=0.1754, prior_loss=0.9692, diff_loss=0.3817, tot_loss=1.526, over 1142.00 samples.],
2024-10-22 01:56:30,997 INFO [train.py:682] (1/4) Start epoch 3771
2024-10-22 01:56:39,690 INFO [train.py:561] (1/4) Epoch 3771, batch 0, global_batch_idx: 60320, batch size: 108, loss[dur_loss=0.1797, prior_loss=0.9702, diff_loss=0.2861, tot_loss=1.436, over 108.00 samples.], tot_loss[dur_loss=0.1797, prior_loss=0.9702, diff_loss=0.2861, tot_loss=1.436, over 108.00 samples.],
2024-10-22 01:56:54,050 INFO [train.py:561] (1/4) Epoch 3771, batch 10, global_batch_idx: 60330, batch size: 111, loss[dur_loss=0.1796, prior_loss=0.9707, diff_loss=0.265, tot_loss=1.415, over 111.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9694, diff_loss=0.3513, tot_loss=1.498, over 1656.00 samples.],
2024-10-22 01:57:01,105 INFO [train.py:682] (1/4) Start epoch 3772
2024-10-22 01:57:15,379 INFO [train.py:561] (1/4) Epoch 3772, batch 4, global_batch_idx: 60340, batch size: 189, loss[dur_loss=0.1766, prior_loss=0.9697, diff_loss=0.2837, tot_loss=1.43, over 189.00 samples.], tot_loss[dur_loss=0.1751, prior_loss=0.969, diff_loss=0.3977, tot_loss=1.542, over 937.00 samples.],
2024-10-22 01:57:30,360 INFO [train.py:561] (1/4) Epoch 3772, batch 14, global_batch_idx: 60350, batch size: 142, loss[dur_loss=0.1791, prior_loss=0.9695, diff_loss=0.2765, tot_loss=1.425, over 142.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9695, diff_loss=0.338, tot_loss=1.484, over 2210.00 samples.],
2024-10-22 01:57:31,785 INFO [train.py:682] (1/4) Start epoch 3773
2024-10-22 01:57:51,811 INFO [train.py:561] (1/4) Epoch 3773, batch 8, global_batch_idx: 60360, batch size: 170, loss[dur_loss=0.179, prior_loss=0.97, diff_loss=0.292, tot_loss=1.441, over 170.00 samples.], tot_loss[dur_loss=0.1753, prior_loss=0.9693, diff_loss=0.376, tot_loss=1.521, over 1432.00 samples.],
2024-10-22 01:58:01,996 INFO [train.py:682] (1/4) Start epoch 3774
2024-10-22 01:58:13,370 INFO [train.py:561] (1/4) Epoch 3774, batch 2, global_batch_idx: 60370, batch size: 203, loss[dur_loss=0.1799, prior_loss=0.9699, diff_loss=0.3117, tot_loss=1.462, over 203.00 samples.], tot_loss[dur_loss=0.1794, prior_loss=0.9698, diff_loss=0.2997, tot_loss=1.449, over 442.00 samples.],
2024-10-22 01:58:27,979 INFO [train.py:561] (1/4) Epoch 3774, batch 12, global_batch_idx: 60380, batch size: 152, loss[dur_loss=0.1749, prior_loss=0.9698, diff_loss=0.2842, tot_loss=1.429, over 152.00 samples.], tot_loss[dur_loss=0.1767, prior_loss=0.9695, diff_loss=0.3479, tot_loss=1.494, over 1966.00 samples.],
2024-10-22 01:58:32,443 INFO [train.py:682] (1/4) Start epoch 3775
2024-10-22 01:58:49,732 INFO [train.py:561] (1/4) Epoch 3775, batch 6, global_batch_idx: 60390, batch size: 106, loss[dur_loss=0.1748, prior_loss=0.9696, diff_loss=0.2274, tot_loss=1.372, over 106.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.9692, diff_loss=0.3754, tot_loss=1.52, over 1142.00 samples.],
2024-10-22 01:59:02,853 INFO [train.py:682] (1/4) Start epoch 3776
2024-10-22 01:59:11,619 INFO [train.py:561] (1/4) Epoch 3776, batch 0, global_batch_idx: 60400, batch size: 108, loss[dur_loss=0.1791, prior_loss=0.9703, diff_loss=0.3221, tot_loss=1.472, over 108.00 samples.], tot_loss[dur_loss=0.1791, prior_loss=0.9703, diff_loss=0.3221, tot_loss=1.472, over 108.00 samples.],
2024-10-22 01:59:25,956 INFO [train.py:561] (1/4) Epoch 3776, batch 10, global_batch_idx: 60410, batch size: 111, loss[dur_loss=0.1815, prior_loss=0.9705, diff_loss=0.3098, tot_loss=1.462, over 111.00 samples.], tot_loss[dur_loss=0.1765, prior_loss=0.9694, diff_loss=0.3496, tot_loss=1.496, over 1656.00 samples.],
2024-10-22 01:59:33,176 INFO [train.py:682] (1/4) Start epoch 3777
2024-10-22 01:59:47,188 INFO [train.py:561] (1/4) Epoch 3777, batch 4, global_batch_idx: 60420, batch size: 189, loss[dur_loss=0.1779, prior_loss=0.9696, diff_loss=0.3262, tot_loss=1.474, over 189.00 samples.], tot_loss[dur_loss=0.1755, prior_loss=0.9691, diff_loss=0.3991, tot_loss=1.544, over 937.00 samples.],
2024-10-22 02:00:02,142 INFO [train.py:561] (1/4) Epoch 3777, batch 14, global_batch_idx: 60430, batch size: 142, loss[dur_loss=0.1816, prior_loss=0.9697, diff_loss=0.2678, tot_loss=1.419, over 142.00 samples.], tot_loss[dur_loss=0.1774, prior_loss=0.9695, diff_loss=0.3322, tot_loss=1.479, over 2210.00 samples.],
2024-10-22 02:00:03,565 INFO [train.py:682] (1/4) Start epoch 3778
2024-10-22 02:00:23,582 INFO [train.py:561] (1/4) Epoch 3778, batch 8, global_batch_idx: 60440, batch size: 170, loss[dur_loss=0.179, prior_loss=0.97, diff_loss=0.2822, tot_loss=1.431, over 170.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9694, diff_loss=0.3605, tot_loss=1.507, over 1432.00 samples.],
2024-10-22 02:00:33,903 INFO [train.py:682] (1/4) Start epoch 3779
2024-10-22 02:00:45,136 INFO [train.py:561] (1/4) Epoch 3779, batch 2, global_batch_idx: 60450, batch size: 203, loss[dur_loss=0.1797, prior_loss=0.9698, diff_loss=0.3211, tot_loss=1.471, over 203.00 samples.], tot_loss[dur_loss=0.1796, prior_loss=0.9698, diff_loss=0.3047, tot_loss=1.454, over 442.00 samples.],
2024-10-22 02:00:59,528 INFO [train.py:561] (1/4) Epoch 3779, batch 12, global_batch_idx: 60460, batch size: 152, loss[dur_loss=0.1798, prior_loss=0.9699, diff_loss=0.2967, tot_loss=1.446, over 152.00 samples.], tot_loss[dur_loss=0.1776, prior_loss=0.9696, diff_loss=0.3448, tot_loss=1.492, over 1966.00 samples.],
2024-10-22 02:01:04,017 INFO [train.py:682] (1/4) Start epoch 3780
2024-10-22 02:01:21,232 INFO [train.py:561] (1/4) Epoch 3780, batch 6, global_batch_idx: 60470, batch size: 106, loss[dur_loss=0.1775, prior_loss=0.9696, diff_loss=0.3146, tot_loss=1.462, over 106.00 samples.], tot_loss[dur_loss=0.1755, prior_loss=0.9692, diff_loss=0.3902, tot_loss=1.535, over 1142.00 samples.],
2024-10-22 02:01:34,329 INFO [train.py:682] (1/4) Start epoch 3781
2024-10-22 02:01:42,976 INFO [train.py:561] (1/4) Epoch 3781, batch 0, global_batch_idx: 60480, batch size: 108, loss[dur_loss=0.1811, prior_loss=0.9702, diff_loss=0.237, tot_loss=1.388, over 108.00 samples.], tot_loss[dur_loss=0.1811, prior_loss=0.9702, diff_loss=0.237, tot_loss=1.388, over 108.00 samples.],
2024-10-22 02:01:57,376 INFO [train.py:561] (1/4) Epoch 3781, batch 10, global_batch_idx: 60490, batch size: 111, loss[dur_loss=0.1812, prior_loss=0.9707, diff_loss=0.2932, tot_loss=1.445, over 111.00 samples.], tot_loss[dur_loss=0.1773, prior_loss=0.9695, diff_loss=0.3478, tot_loss=1.495, over 1656.00 samples.],
2024-10-22 02:02:04,502 INFO [train.py:682] (1/4) Start epoch 3782
2024-10-22 02:02:18,199 INFO [train.py:561] (1/4) Epoch 3782, batch 4, global_batch_idx: 60500, batch size: 189, loss[dur_loss=0.1775, prior_loss=0.9699, diff_loss=0.305, tot_loss=1.453, over 189.00 samples.], tot_loss[dur_loss=0.1748, prior_loss=0.9691, diff_loss=0.3867, tot_loss=1.531, over 937.00 samples.],
2024-10-22 02:02:33,114 INFO [train.py:561] (1/4) Epoch 3782, batch 14, global_batch_idx: 60510, batch size: 142, loss[dur_loss=0.1804, prior_loss=0.9695, diff_loss=0.2974, tot_loss=1.447, over 142.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9696, diff_loss=0.3389, tot_loss=1.485, over 2210.00 samples.],
2024-10-22 02:02:34,543 INFO [train.py:682] (1/4) Start epoch 3783
2024-10-22 02:02:54,559 INFO [train.py:561] (1/4) Epoch 3783, batch 8, global_batch_idx: 60520, batch size: 170, loss[dur_loss=0.1791, prior_loss=0.97, diff_loss=0.2812, tot_loss=1.43, over 170.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9695, diff_loss=0.3677, tot_loss=1.514, over 1432.00 samples.],
2024-10-22 02:03:04,761 INFO [train.py:682] (1/4) Start epoch 3784
2024-10-22 02:03:16,604 INFO [train.py:561] (1/4) Epoch 3784, batch 2, global_batch_idx: 60530, batch size: 203, loss[dur_loss=0.1787, prior_loss=0.9698, diff_loss=0.322, tot_loss=1.471, over 203.00 samples.], tot_loss[dur_loss=0.1782, prior_loss=0.9698, diff_loss=0.2986, tot_loss=1.447, over 442.00 samples.],
2024-10-22 02:03:31,094 INFO [train.py:561] (1/4) Epoch 3784, batch 12, global_batch_idx: 60540, batch size: 152, loss[dur_loss=0.1772, prior_loss=0.9698, diff_loss=0.2936, tot_loss=1.441, over 152.00 samples.], tot_loss[dur_loss=0.1774, prior_loss=0.9696, diff_loss=0.3432, tot_loss=1.49, over 1966.00 samples.],
2024-10-22 02:03:35,582 INFO [train.py:682] (1/4) Start epoch 3785
2024-10-22 02:03:52,739 INFO [train.py:561] (1/4) Epoch 3785, batch 6, global_batch_idx: 60550, batch size: 106, loss[dur_loss=0.1765, prior_loss=0.9698, diff_loss=0.2902, tot_loss=1.436, over 106.00 samples.], tot_loss[dur_loss=0.1751, prior_loss=0.9693, diff_loss=0.3811, tot_loss=1.526, over 1142.00 samples.],
2024-10-22 02:04:05,923 INFO [train.py:682] (1/4) Start epoch 3786
2024-10-22 02:04:14,618 INFO [train.py:561] (1/4) Epoch 3786, batch 0, global_batch_idx: 60560, batch size: 108, loss[dur_loss=0.1817, prior_loss=0.9701, diff_loss=0.3277, tot_loss=1.48, over 108.00 samples.], tot_loss[dur_loss=0.1817, prior_loss=0.9701, diff_loss=0.3277, tot_loss=1.48, over 108.00 samples.],
2024-10-22 02:04:29,239 INFO [train.py:561] (1/4) Epoch 3786, batch 10, global_batch_idx: 60570, batch size: 111, loss[dur_loss=0.1774, prior_loss=0.9707, diff_loss=0.2744, tot_loss=1.422, over 111.00 samples.], tot_loss[dur_loss=0.1767, prior_loss=0.9695, diff_loss=0.3599, tot_loss=1.506, over 1656.00 samples.],
2024-10-22 02:04:36,456 INFO [train.py:682] (1/4) Start epoch 3787
2024-10-22 02:04:50,489 INFO [train.py:561] (1/4) Epoch 3787, batch 4, global_batch_idx: 60580, batch size: 189, loss[dur_loss=0.1762, prior_loss=0.9698, diff_loss=0.3403, tot_loss=1.486, over 189.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9691, diff_loss=0.4087, tot_loss=1.553, over 937.00 samples.],
2024-10-22 02:05:05,585 INFO [train.py:561] (1/4) Epoch 3787, batch 14, global_batch_idx: 60590, batch size: 142, loss[dur_loss=0.1808, prior_loss=0.9697, diff_loss=0.2652, tot_loss=1.416, over 142.00 samples.], tot_loss[dur_loss=0.1776, prior_loss=0.9696, diff_loss=0.3418, tot_loss=1.489, over 2210.00 samples.],
2024-10-22 02:05:07,039 INFO [train.py:682] (1/4) Start epoch 3788
2024-10-22 02:05:27,060 INFO [train.py:561] (1/4) Epoch 3788, batch 8, global_batch_idx: 60600, batch size: 170, loss[dur_loss=0.1775, prior_loss=0.97, diff_loss=0.299, tot_loss=1.446, over 170.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9694, diff_loss=0.3681, tot_loss=1.514, over 1432.00 samples.],
2024-10-22 02:05:37,310 INFO [train.py:682] (1/4) Start epoch 3789
2024-10-22 02:05:48,930 INFO [train.py:561] (1/4) Epoch 3789, batch 2, global_batch_idx: 60610, batch size: 203, loss[dur_loss=0.1782, prior_loss=0.9697, diff_loss=0.2933, tot_loss=1.441, over 203.00 samples.], tot_loss[dur_loss=0.1785, prior_loss=0.9698, diff_loss=0.2831, tot_loss=1.431, over 442.00 samples.],
2024-10-22 02:06:03,376 INFO [train.py:561] (1/4) Epoch 3789, batch 12, global_batch_idx: 60620, batch size: 152, loss[dur_loss=0.1764, prior_loss=0.9698, diff_loss=0.2902, tot_loss=1.436, over 152.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9695, diff_loss=0.3458, tot_loss=1.492, over 1966.00 samples.],
2024-10-22 02:06:07,855 INFO [train.py:682] (1/4) Start epoch 3790
2024-10-22 02:06:24,930 INFO [train.py:561] (1/4) Epoch 3790, batch 6, global_batch_idx: 60630, batch size: 106, loss[dur_loss=0.177, prior_loss=0.9697, diff_loss=0.2849, tot_loss=1.432, over 106.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.9692, diff_loss=0.3877, tot_loss=1.532, over 1142.00 samples.],
2024-10-22 02:06:38,090 INFO [train.py:682] (1/4) Start epoch 3791
2024-10-22 02:06:46,885 INFO [train.py:561] (1/4) Epoch 3791, batch 0, global_batch_idx: 60640, batch size: 108, loss[dur_loss=0.1798, prior_loss=0.97, diff_loss=0.281, tot_loss=1.431, over 108.00 samples.], tot_loss[dur_loss=0.1798, prior_loss=0.97, diff_loss=0.281, tot_loss=1.431, over 108.00 samples.],
2024-10-22 02:07:01,181 INFO [train.py:561] (1/4) Epoch 3791, batch 10, global_batch_idx: 60650, batch size: 111, loss[dur_loss=0.1806, prior_loss=0.9709, diff_loss=0.3011, tot_loss=1.453, over 111.00 samples.], tot_loss[dur_loss=0.1769, prior_loss=0.9695, diff_loss=0.3595, tot_loss=1.506, over 1656.00 samples.],
2024-10-22 02:07:08,269 INFO [train.py:682] (1/4) Start epoch 3792
2024-10-22 02:07:21,999 INFO [train.py:561] (1/4) Epoch 3792, batch 4, global_batch_idx: 60660, batch size: 189, loss[dur_loss=0.178, prior_loss=0.9698, diff_loss=0.3483, tot_loss=1.496, over 189.00 samples.], tot_loss[dur_loss=0.175, prior_loss=0.9691, diff_loss=0.4118, tot_loss=1.556, over 937.00 samples.],
2024-10-22 02:07:37,016 INFO [train.py:561] (1/4) Epoch 3792, batch 14, global_batch_idx: 60670, batch size: 142, loss[dur_loss=0.1812, prior_loss=0.9696, diff_loss=0.2801, tot_loss=1.431, over 142.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9695, diff_loss=0.3391, tot_loss=1.486, over 2210.00 samples.],
2024-10-22 02:07:38,459 INFO [train.py:682] (1/4) Start epoch 3793
2024-10-22 02:07:58,553 INFO [train.py:561] (1/4) Epoch 3793, batch 8, global_batch_idx: 60680, batch size: 170, loss[dur_loss=0.1806, prior_loss=0.97, diff_loss=0.3268, tot_loss=1.477, over 170.00 samples.], tot_loss[dur_loss=0.1761, prior_loss=0.9694, diff_loss=0.3677, tot_loss=1.513, over 1432.00 samples.],
2024-10-22 02:08:08,659 INFO [train.py:682] (1/4) Start epoch 3794
2024-10-22 02:08:20,370 INFO [train.py:561] (1/4) Epoch 3794, batch 2, global_batch_idx: 60690, batch size: 203, loss[dur_loss=0.1793, prior_loss=0.9697, diff_loss=0.3129, tot_loss=1.462, over 203.00 samples.], tot_loss[dur_loss=0.1795, prior_loss=0.9698, diff_loss=0.3052, tot_loss=1.455, over 442.00 samples.],
2024-10-22 02:08:34,696 INFO [train.py:561] (1/4) Epoch 3794, batch 12, global_batch_idx: 60700, batch size: 152, loss[dur_loss=0.1766, prior_loss=0.9697, diff_loss=0.297, tot_loss=1.443, over 152.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9695, diff_loss=0.3439, tot_loss=1.49, over 1966.00 samples.],
2024-10-22 02:08:39,125 INFO [train.py:682] (1/4) Start epoch 3795
2024-10-22 02:08:56,203 INFO [train.py:561] (1/4) Epoch 3795, batch 6, global_batch_idx: 60710, batch size: 106, loss[dur_loss=0.1758, prior_loss=0.9696, diff_loss=0.2829, tot_loss=1.428, over 106.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9691, diff_loss=0.3905, tot_loss=1.535, over 1142.00 samples.],
2024-10-22 02:09:09,308 INFO [train.py:682] (1/4) Start epoch 3796
2024-10-22 02:09:18,268 INFO [train.py:561] (1/4) Epoch 3796, batch 0, global_batch_idx: 60720, batch size: 108, loss[dur_loss=0.1776, prior_loss=0.9702, diff_loss=0.2996, tot_loss=1.447, over 108.00 samples.], tot_loss[dur_loss=0.1776, prior_loss=0.9702, diff_loss=0.2996, tot_loss=1.447, over 108.00 samples.],
2024-10-22 02:09:32,605 INFO [train.py:561] (1/4) Epoch 3796, batch 10, global_batch_idx: 60730, batch size: 111, loss[dur_loss=0.1796, prior_loss=0.9705, diff_loss=0.2865, tot_loss=1.437, over 111.00 samples.], tot_loss[dur_loss=0.1762, prior_loss=0.9694, diff_loss=0.3532, tot_loss=1.499, over 1656.00 samples.],
2024-10-22 02:09:39,745 INFO [train.py:682] (1/4) Start epoch 3797
2024-10-22 02:09:53,476 INFO [train.py:561] (1/4) Epoch 3797, batch 4, global_batch_idx: 60740, batch size: 189, loss[dur_loss=0.1795, prior_loss=0.97, diff_loss=0.2967, tot_loss=1.446, over 189.00 samples.], tot_loss[dur_loss=0.1761, prior_loss=0.969, diff_loss=0.394, tot_loss=1.539, over 937.00 samples.],
2024-10-22 02:10:08,468 INFO [train.py:561] (1/4) Epoch 3797, batch 14, global_batch_idx: 60750, batch size: 142, loss[dur_loss=0.177, prior_loss=0.9694, diff_loss=0.2585, tot_loss=1.405, over 142.00 samples.], tot_loss[dur_loss=0.1774, prior_loss=0.9695, diff_loss=0.3349, tot_loss=1.482, over 2210.00 samples.],
2024-10-22 02:10:09,914 INFO [train.py:682] (1/4) Start epoch 3798
2024-10-22 02:10:30,248 INFO [train.py:561] (1/4) Epoch 3798, batch 8, global_batch_idx: 60760, batch size: 170, loss[dur_loss=0.1797, prior_loss=0.9699, diff_loss=0.323, tot_loss=1.473, over 170.00 samples.], tot_loss[dur_loss=0.1761, prior_loss=0.9693, diff_loss=0.368, tot_loss=1.513, over 1432.00 samples.],
2024-10-22 02:10:40,488 INFO [train.py:682] (1/4) Start epoch 3799
2024-10-22 02:10:51,799 INFO [train.py:561] (1/4) Epoch 3799, batch 2, global_batch_idx: 60770, batch size: 203, loss[dur_loss=0.1801, prior_loss=0.9697, diff_loss=0.33, tot_loss=1.48, over 203.00 samples.], tot_loss[dur_loss=0.1796, prior_loss=0.9697, diff_loss=0.2998, tot_loss=1.449, over 442.00 samples.],
2024-10-22 02:11:06,210 INFO [train.py:561] (1/4) Epoch 3799, batch 12, global_batch_idx: 60780, batch size: 152, loss[dur_loss=0.1791, prior_loss=0.9698, diff_loss=0.2669, tot_loss=1.416, over 152.00 samples.], tot_loss[dur_loss=0.1776, prior_loss=0.9695, diff_loss=0.3487, tot_loss=1.496, over 1966.00 samples.],
2024-10-22 02:11:10,689 INFO [train.py:682] (1/4) Start epoch 3800
2024-10-22 02:11:27,681 INFO [train.py:561] (1/4) Epoch 3800, batch 6, global_batch_idx: 60790, batch size: 106, loss[dur_loss=0.1779, prior_loss=0.9697, diff_loss=0.2686, tot_loss=1.416, over 106.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.9692, diff_loss=0.3888, tot_loss=1.533, over 1142.00 samples.],
2024-10-22 02:11:40,879 INFO [train.py:682] (1/4) Start epoch 3801
2024-10-22 02:11:49,884 INFO [train.py:561] (1/4) Epoch 3801, batch 0, global_batch_idx: 60800, batch size: 108, loss[dur_loss=0.1787, prior_loss=0.9701, diff_loss=0.2884, tot_loss=1.437, over 108.00 samples.], tot_loss[dur_loss=0.1787, prior_loss=0.9701, diff_loss=0.2884, tot_loss=1.437, over 108.00 samples.],
2024-10-22 02:12:04,157 INFO [train.py:561] (1/4) Epoch 3801, batch 10, global_batch_idx: 60810, batch size: 111, loss[dur_loss=0.179, prior_loss=0.9708, diff_loss=0.2739, tot_loss=1.424, over 111.00 samples.], tot_loss[dur_loss=0.1763, prior_loss=0.9694, diff_loss=0.3622, tot_loss=1.508, over 1656.00 samples.],
2024-10-22 02:12:11,318 INFO [train.py:682] (1/4) Start epoch 3802
2024-10-22 02:12:25,087 INFO [train.py:561] (1/4) Epoch 3802, batch 4, global_batch_idx: 60820, batch size: 189, loss[dur_loss=0.1796, prior_loss=0.9697, diff_loss=0.3228, tot_loss=1.472, over 189.00 samples.], tot_loss[dur_loss=0.1757, prior_loss=0.969, diff_loss=0.4025, tot_loss=1.547, over 937.00 samples.],
2024-10-22 02:12:40,166 INFO [train.py:561] (1/4) Epoch 3802, batch 14, global_batch_idx: 60830, batch size: 142, loss[dur_loss=0.1784, prior_loss=0.9697, diff_loss=0.3128, tot_loss=1.461, over 142.00 samples.], tot_loss[dur_loss=0.1771, prior_loss=0.9695, diff_loss=0.3361, tot_loss=1.483, over 2210.00 samples.],
2024-10-22 02:12:41,582 INFO [train.py:682] (1/4) Start epoch 3803
2024-10-22 02:13:01,490 INFO [train.py:561] (1/4) Epoch 3803, batch 8, global_batch_idx: 60840, batch size: 170, loss[dur_loss=0.1794, prior_loss=0.9699, diff_loss=0.3075, tot_loss=1.457, over 170.00 samples.], tot_loss[dur_loss=0.1762, prior_loss=0.9693, diff_loss=0.3714, tot_loss=1.517, over 1432.00 samples.],
2024-10-22 02:13:11,603 INFO [train.py:682] (1/4) Start epoch 3804
2024-10-22 02:13:23,135 INFO [train.py:561] (1/4) Epoch 3804, batch 2, global_batch_idx: 60850, batch size: 203, loss[dur_loss=0.1781, prior_loss=0.9697, diff_loss=0.3041, tot_loss=1.452, over 203.00 samples.], tot_loss[dur_loss=0.1785, prior_loss=0.9697, diff_loss=0.3019, tot_loss=1.45, over 442.00 samples.],
2024-10-22 02:13:37,324 INFO [train.py:561] (1/4) Epoch 3804, batch 12, global_batch_idx: 60860, batch size: 152, loss[dur_loss=0.1769, prior_loss=0.9698, diff_loss=0.2694, tot_loss=1.416, over 152.00 samples.], tot_loss[dur_loss=0.1765, prior_loss=0.9694, diff_loss=0.3479, tot_loss=1.494, over 1966.00 samples.],
2024-10-22 02:13:41,803 INFO [train.py:682] (1/4) Start epoch 3805
2024-10-22 02:13:59,404 INFO [train.py:561] (1/4) Epoch 3805, batch 6, global_batch_idx: 60870, batch size: 106, loss[dur_loss=0.1744, prior_loss=0.9698, diff_loss=0.3093, tot_loss=1.453, over 106.00 samples.], tot_loss[dur_loss=0.1765, prior_loss=0.9693, diff_loss=0.3845, tot_loss=1.53, over 1142.00 samples.],
2024-10-22 02:14:12,491 INFO [train.py:682] (1/4) Start epoch 3806
2024-10-22 02:14:21,219 INFO [train.py:561] (1/4) Epoch 3806, batch 0, global_batch_idx: 60880, batch size: 108, loss[dur_loss=0.1816, prior_loss=0.9703, diff_loss=0.2677, tot_loss=1.42, over 108.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9703, diff_loss=0.2677, tot_loss=1.42, over 108.00 samples.],
2024-10-22 02:14:35,535 INFO [train.py:561] (1/4) Epoch 3806, batch 10, global_batch_idx: 60890, batch size: 111, loss[dur_loss=0.1822, prior_loss=0.9709, diff_loss=0.3154, tot_loss=1.468, over 111.00 samples.], tot_loss[dur_loss=0.1761, prior_loss=0.9695, diff_loss=0.3585, tot_loss=1.504, over 1656.00 samples.],
2024-10-22 02:14:42,644 INFO [train.py:682] (1/4) Start epoch 3807
2024-10-22 02:14:56,259 INFO [train.py:561] (1/4) Epoch 3807, batch 4, global_batch_idx: 60900, batch size: 189, loss[dur_loss=0.178, prior_loss=0.9697, diff_loss=0.3305, tot_loss=1.478, over 189.00 samples.], tot_loss[dur_loss=0.1753, prior_loss=0.969, diff_loss=0.4013, tot_loss=1.546, over 937.00 samples.],
2024-10-22 02:15:11,119 INFO [train.py:561] (1/4) Epoch 3807, batch 14, global_batch_idx: 60910, batch size: 142, loss[dur_loss=0.178, prior_loss=0.9696, diff_loss=0.2788, tot_loss=1.426, over 142.00 samples.], tot_loss[dur_loss=0.1771, prior_loss=0.9695, diff_loss=0.3436, tot_loss=1.49, over 2210.00 samples.],
2024-10-22 02:15:12,537 INFO [train.py:682] (1/4) Start epoch 3808
2024-10-22 02:15:32,676 INFO [train.py:561] (1/4) Epoch 3808, batch 8, global_batch_idx: 60920, batch size: 170, loss[dur_loss=0.1818, prior_loss=0.97, diff_loss=0.2638, tot_loss=1.416, over 170.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9694, diff_loss=0.3664, tot_loss=1.513, over 1432.00 samples.],
2024-10-22 02:15:42,879 INFO [train.py:682] (1/4) Start epoch 3809
2024-10-22 02:15:54,412 INFO [train.py:561] (1/4) Epoch 3809, batch 2, global_batch_idx: 60930, batch size: 203, loss[dur_loss=0.1798, prior_loss=0.9697, diff_loss=0.3151, tot_loss=1.465, over 203.00 samples.], tot_loss[dur_loss=0.18, prior_loss=0.9697, diff_loss=0.3012, tot_loss=1.451, over 442.00 samples.],
2024-10-22 02:16:08,671 INFO [train.py:561] (1/4) Epoch 3809, batch 12, global_batch_idx: 60940, batch size: 152, loss[dur_loss=0.1769, prior_loss=0.97, diff_loss=0.301, tot_loss=1.448, over 152.00 samples.], tot_loss[dur_loss=0.1775, prior_loss=0.9695, diff_loss=0.3462, tot_loss=1.493, over 1966.00 samples.],
2024-10-22 02:16:13,146 INFO [train.py:682] (1/4) Start epoch 3810
2024-10-22 02:16:30,201 INFO [train.py:561] (1/4) Epoch 3810, batch 6, global_batch_idx: 60950, batch size: 106, loss[dur_loss=0.1762, prior_loss=0.9697, diff_loss=0.2443, tot_loss=1.39, over 106.00 samples.], tot_loss[dur_loss=0.1763, prior_loss=0.9693, diff_loss=0.3808, tot_loss=1.526, over 1142.00 samples.],
2024-10-22 02:16:43,395 INFO [train.py:682] (1/4) Start epoch 3811
2024-10-22 02:16:52,111 INFO [train.py:561] (1/4) Epoch 3811, batch 0, global_batch_idx: 60960, batch size: 108, loss[dur_loss=0.1811, prior_loss=0.9702, diff_loss=0.2474, tot_loss=1.399, over 108.00 samples.], tot_loss[dur_loss=0.1811, prior_loss=0.9702, diff_loss=0.2474, tot_loss=1.399, over 108.00 samples.],
2024-10-22 02:17:06,434 INFO [train.py:561] (1/4) Epoch 3811, batch 10, global_batch_idx: 60970, batch size: 111, loss[dur_loss=0.1811, prior_loss=0.9708, diff_loss=0.3226, tot_loss=1.475, over 111.00 samples.], tot_loss[dur_loss=0.178, prior_loss=0.9696, diff_loss=0.3577, tot_loss=1.505, over 1656.00 samples.],
2024-10-22 02:17:13,571 INFO [train.py:682] (1/4) Start epoch 3812
2024-10-22 02:17:27,482 INFO [train.py:561] (1/4) Epoch 3812, batch 4, global_batch_idx: 60980, batch size: 189, loss[dur_loss=0.1812, prior_loss=0.9703, diff_loss=0.2837, tot_loss=1.435, over 189.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9693, diff_loss=0.4028, tot_loss=1.549, over 937.00 samples.],
2024-10-22 02:17:42,406 INFO [train.py:561] (1/4) Epoch 3812, batch 14, global_batch_idx: 60990, batch size: 142, loss[dur_loss=0.1813, prior_loss=0.9699, diff_loss=0.2664, tot_loss=1.418, over 142.00 samples.], tot_loss[dur_loss=0.1782, prior_loss=0.9698, diff_loss=0.3429, tot_loss=1.491, over 2210.00 samples.],
2024-10-22 02:17:43,840 INFO [train.py:682] (1/4) Start epoch 3813
2024-10-22 02:18:03,968 INFO [train.py:561] (1/4) Epoch 3813, batch 8, global_batch_idx: 61000, batch size: 170, loss[dur_loss=0.178, prior_loss=0.9699, diff_loss=0.2903, tot_loss=1.438, over 170.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9694, diff_loss=0.3644, tot_loss=1.511, over 1432.00 samples.],
2024-10-22 02:18:14,151 INFO [train.py:682] (1/4) Start epoch 3814
2024-10-22 02:18:25,549 INFO [train.py:561] (1/4) Epoch 3814, batch 2, global_batch_idx: 61010, batch size: 203, loss[dur_loss=0.1793, prior_loss=0.9697, diff_loss=0.3287, tot_loss=1.478, over 203.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9697, diff_loss=0.3062, tot_loss=1.455, over 442.00 samples.],
2024-10-22 02:18:39,839 INFO [train.py:561] (1/4) Epoch 3814, batch 12, global_batch_idx: 61020, batch size: 152, loss[dur_loss=0.1789, prior_loss=0.97, diff_loss=0.2826, tot_loss=1.432, over 152.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9695, diff_loss=0.3379, tot_loss=1.484, over 1966.00 samples.],
2024-10-22 02:18:44,325 INFO [train.py:682] (1/4) Start epoch 3815
2024-10-22 02:19:01,246 INFO [train.py:561] (1/4) Epoch 3815, batch 6, global_batch_idx: 61030, batch size: 106, loss[dur_loss=0.1777, prior_loss=0.9698, diff_loss=0.2369, tot_loss=1.384, over 106.00 samples.], tot_loss[dur_loss=0.1749, prior_loss=0.9693, diff_loss=0.3863, tot_loss=1.531, over 1142.00 samples.],
2024-10-22 02:19:14,432 INFO [train.py:682] (1/4) Start epoch 3816
2024-10-22 02:19:23,532 INFO [train.py:561] (1/4) Epoch 3816, batch 0, global_batch_idx: 61040, batch size: 108, loss[dur_loss=0.1823, prior_loss=0.9702, diff_loss=0.2732, tot_loss=1.426, over 108.00 samples.], tot_loss[dur_loss=0.1823, prior_loss=0.9702, diff_loss=0.2732, tot_loss=1.426, over 108.00 samples.],
2024-10-22 02:19:37,790 INFO [train.py:561] (1/4) Epoch 3816, batch 10, global_batch_idx: 61050, batch size: 111, loss[dur_loss=0.18, prior_loss=0.9708, diff_loss=0.3312, tot_loss=1.482, over 111.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9695, diff_loss=0.3518, tot_loss=1.498, over 1656.00 samples.],
2024-10-22 02:19:44,966 INFO [train.py:682] (1/4) Start epoch 3817
2024-10-22 02:19:59,018 INFO [train.py:561] (1/4) Epoch 3817, batch 4, global_batch_idx: 61060, batch size: 189, loss[dur_loss=0.1814, prior_loss=0.9696, diff_loss=0.3479, tot_loss=1.499, over 189.00 samples.], tot_loss[dur_loss=0.1749, prior_loss=0.969, diff_loss=0.4132, tot_loss=1.557, over 937.00 samples.],
2024-10-22 02:20:13,892 INFO [train.py:561] (1/4) Epoch 3817, batch 14, global_batch_idx: 61070, batch size: 142, loss[dur_loss=0.1787, prior_loss=0.9695, diff_loss=0.2679, tot_loss=1.416, over 142.00 samples.], tot_loss[dur_loss=0.1771, prior_loss=0.9695, diff_loss=0.34, tot_loss=1.487, over 2210.00 samples.],
2024-10-22 02:20:15,337 INFO [train.py:682] (1/4) Start epoch 3818
2024-10-22 02:20:35,149 INFO [train.py:561] (1/4) Epoch 3818, batch 8, global_batch_idx: 61080, batch size: 170, loss[dur_loss=0.1817, prior_loss=0.9698, diff_loss=0.2719, tot_loss=1.423, over 170.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9692, diff_loss=0.374, tot_loss=1.519, over 1432.00 samples.],
2024-10-22 02:20:45,245 INFO [train.py:682] (1/4) Start epoch 3819
2024-10-22 02:20:56,418 INFO [train.py:561] (1/4) Epoch 3819, batch 2, global_batch_idx: 61090, batch size: 203, loss[dur_loss=0.1784, prior_loss=0.9697, diff_loss=0.3101, tot_loss=1.458, over 203.00 samples.], tot_loss[dur_loss=0.1779, prior_loss=0.9697, diff_loss=0.283, tot_loss=1.431, over 442.00 samples.],
2024-10-22 02:21:10,561 INFO [train.py:561] (1/4) Epoch 3819, batch 12, global_batch_idx: 61100, batch size: 152, loss[dur_loss=0.1784, prior_loss=0.9699, diff_loss=0.3175, tot_loss=1.466, over 152.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9694, diff_loss=0.3398, tot_loss=1.486, over 1966.00 samples.],
2024-10-22 02:21:15,039 INFO [train.py:682] (1/4) Start epoch 3820
2024-10-22 02:21:32,373 INFO [train.py:561] (1/4) Epoch 3820, batch 6, global_batch_idx: 61110, batch size: 106, loss[dur_loss=0.1794, prior_loss=0.9696, diff_loss=0.2503, tot_loss=1.399, over 106.00 samples.], tot_loss[dur_loss=0.1747, prior_loss=0.9691, diff_loss=0.3727, tot_loss=1.517, over 1142.00 samples.],
2024-10-22 02:21:45,452 INFO [train.py:682] (1/4) Start epoch 3821
2024-10-22 02:21:54,484 INFO [train.py:561] (1/4) Epoch 3821, batch 0, global_batch_idx: 61120, batch size: 108, loss[dur_loss=0.1808, prior_loss=0.9699, diff_loss=0.2408, tot_loss=1.391, over 108.00 samples.], tot_loss[dur_loss=0.1808, prior_loss=0.9699, diff_loss=0.2408, tot_loss=1.391, over 108.00 samples.],
2024-10-22 02:22:08,729 INFO [train.py:561] (1/4) Epoch 3821, batch 10, global_batch_idx: 61130, batch size: 111, loss[dur_loss=0.1768, prior_loss=0.9706, diff_loss=0.2452, tot_loss=1.393, over 111.00 samples.], tot_loss[dur_loss=0.1762, prior_loss=0.9694, diff_loss=0.3478, tot_loss=1.493, over 1656.00 samples.],
2024-10-22 02:22:15,869 INFO [train.py:682] (1/4) Start epoch 3822
2024-10-22 02:22:29,736 INFO [train.py:561] (1/4) Epoch 3822, batch 4, global_batch_idx: 61140, batch size: 189, loss[dur_loss=0.1784, prior_loss=0.9696, diff_loss=0.3334, tot_loss=1.481, over 189.00 samples.], tot_loss[dur_loss=0.1742, prior_loss=0.969, diff_loss=0.4142, tot_loss=1.557, over 937.00 samples.],
2024-10-22 02:22:44,598 INFO [train.py:561] (1/4) Epoch 3822, batch 14, global_batch_idx: 61150, batch size: 142, loss[dur_loss=0.1792, prior_loss=0.9694, diff_loss=0.2817, tot_loss=1.43, over 142.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9694, diff_loss=0.34, tot_loss=1.486, over 2210.00 samples.],
2024-10-22 02:22:46,039 INFO [train.py:682] (1/4) Start epoch 3823
2024-10-22 02:23:06,076 INFO [train.py:561] (1/4) Epoch 3823, batch 8, global_batch_idx: 61160, batch size: 170, loss[dur_loss=0.1782, prior_loss=0.9697, diff_loss=0.3123, tot_loss=1.46, over 170.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9693, diff_loss=0.376, tot_loss=1.522, over 1432.00 samples.],
2024-10-22 02:23:16,137 INFO [train.py:682] (1/4) Start epoch 3824
2024-10-22 02:23:27,470 INFO [train.py:561] (1/4) Epoch 3824, batch 2, global_batch_idx: 61170, batch size: 203, loss[dur_loss=0.1822, prior_loss=0.9696, diff_loss=0.3069, tot_loss=1.459, over 203.00 samples.], tot_loss[dur_loss=0.1801, prior_loss=0.9696, diff_loss=0.3158, tot_loss=1.466, over 442.00 samples.],
2024-10-22 02:23:41,692 INFO [train.py:561] (1/4) Epoch 3824, batch 12, global_batch_idx: 61180, batch size: 152, loss[dur_loss=0.1768, prior_loss=0.9697, diff_loss=0.2821, tot_loss=1.429, over 152.00 samples.], tot_loss[dur_loss=0.1762, prior_loss=0.9694, diff_loss=0.3424, tot_loss=1.488, over 1966.00 samples.],
2024-10-22 02:23:46,198 INFO [train.py:682] (1/4) Start epoch 3825
2024-10-22 02:24:03,356 INFO [train.py:561] (1/4) Epoch 3825, batch 6, global_batch_idx: 61190, batch size: 106, loss[dur_loss=0.1773, prior_loss=0.9694, diff_loss=0.2964, tot_loss=1.443, over 106.00 samples.], tot_loss[dur_loss=0.1746, prior_loss=0.969, diff_loss=0.3748, tot_loss=1.518, over 1142.00 samples.],
2024-10-22 02:24:16,342 INFO [train.py:682] (1/4) Start epoch 3826
2024-10-22 02:24:25,196 INFO [train.py:561] (1/4) Epoch 3826, batch 0, global_batch_idx: 61200, batch size: 108, loss[dur_loss=0.1794, prior_loss=0.9701, diff_loss=0.2962, tot_loss=1.446, over 108.00 samples.], tot_loss[dur_loss=0.1794, prior_loss=0.9701, diff_loss=0.2962, tot_loss=1.446, over 108.00 samples.],
2024-10-22 02:24:39,280 INFO [train.py:561] (1/4) Epoch 3826, batch 10, global_batch_idx: 61210, batch size: 111, loss[dur_loss=0.1775, prior_loss=0.9705, diff_loss=0.2773, tot_loss=1.425, over 111.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.9693, diff_loss=0.352, tot_loss=1.496, over 1656.00 samples.],
2024-10-22 02:24:46,345 INFO [train.py:682] (1/4) Start epoch 3827
2024-10-22 02:25:00,331 INFO [train.py:561] (1/4) Epoch 3827, batch 4, global_batch_idx: 61220, batch size: 189, loss[dur_loss=0.1774, prior_loss=0.9697, diff_loss=0.2946, tot_loss=1.442, over 189.00 samples.], tot_loss[dur_loss=0.1741, prior_loss=0.9689, diff_loss=0.4013, tot_loss=1.544, over 937.00 samples.],
2024-10-22 02:25:15,166 INFO [train.py:561] (1/4) Epoch 3827, batch 14, global_batch_idx: 61230, batch size: 142, loss[dur_loss=0.1776, prior_loss=0.9693, diff_loss=0.2747, tot_loss=1.422, over 142.00 samples.], tot_loss[dur_loss=0.1762, prior_loss=0.9694, diff_loss=0.3341, tot_loss=1.48, over 2210.00 samples.],
2024-10-22 02:25:16,602 INFO [train.py:682] (1/4) Start epoch 3828
2024-10-22 02:25:36,704 INFO [train.py:561] (1/4) Epoch 3828, batch 8, global_batch_idx: 61240, batch size: 170, loss[dur_loss=0.1786, prior_loss=0.9697, diff_loss=0.3071, tot_loss=1.455, over 170.00 samples.], tot_loss[dur_loss=0.1759, prior_loss=0.9692, diff_loss=0.3622, tot_loss=1.507, over 1432.00 samples.],
2024-10-22 02:25:46,808 INFO [train.py:682] (1/4) Start epoch 3829
2024-10-22 02:25:58,096 INFO [train.py:561] (1/4) Epoch 3829, batch 2, global_batch_idx: 61250, batch size: 203, loss[dur_loss=0.1788, prior_loss=0.9697, diff_loss=0.3325, tot_loss=1.481, over 203.00 samples.], tot_loss[dur_loss=0.1783, prior_loss=0.9697, diff_loss=0.2982, tot_loss=1.446, over 442.00 samples.],
2024-10-22 02:26:12,205 INFO [train.py:561] (1/4) Epoch 3829, batch 12, global_batch_idx: 61260, batch size: 152, loss[dur_loss=0.1758, prior_loss=0.9698, diff_loss=0.2964, tot_loss=1.442, over 152.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9694, diff_loss=0.3349, tot_loss=1.481, over 1966.00 samples.],
2024-10-22 02:26:16,683 INFO [train.py:682] (1/4) Start epoch 3830
2024-10-22 02:26:33,556 INFO [train.py:561] (1/4) Epoch 3830, batch 6, global_batch_idx: 61270, batch size: 106, loss[dur_loss=0.1771, prior_loss=0.9696, diff_loss=0.2898, tot_loss=1.437, over 106.00 samples.], tot_loss[dur_loss=0.1734, prior_loss=0.969, diff_loss=0.3908, tot_loss=1.533, over 1142.00 samples.],
2024-10-22 02:26:46,634 INFO [train.py:682] (1/4) Start epoch 3831
2024-10-22 02:26:55,510 INFO [train.py:561] (1/4) Epoch 3831, batch 0, global_batch_idx: 61280, batch size: 108, loss[dur_loss=0.179, prior_loss=0.97, diff_loss=0.2854, tot_loss=1.434, over 108.00 samples.], tot_loss[dur_loss=0.179, prior_loss=0.97, diff_loss=0.2854, tot_loss=1.434, over 108.00 samples.],
2024-10-22 02:27:09,661 INFO [train.py:561] (1/4) Epoch 3831, batch 10, global_batch_idx: 61290, batch size: 111, loss[dur_loss=0.1782, prior_loss=0.9705, diff_loss=0.3051, tot_loss=1.454, over 111.00 samples.], tot_loss[dur_loss=0.1761, prior_loss=0.9694, diff_loss=0.3517, tot_loss=1.497, over 1656.00 samples.],
2024-10-22 02:27:16,768 INFO [train.py:682] (1/4) Start epoch 3832
2024-10-22 02:27:30,757 INFO [train.py:561] (1/4) Epoch 3832, batch 4, global_batch_idx: 61300, batch size: 189, loss[dur_loss=0.1775, prior_loss=0.9698, diff_loss=0.3123, tot_loss=1.46, over 189.00 samples.], tot_loss[dur_loss=0.1739, prior_loss=0.9689, diff_loss=0.4033, tot_loss=1.546, over 937.00 samples.],
2024-10-22 02:27:45,523 INFO [train.py:561] (1/4) Epoch 3832, batch 14, global_batch_idx: 61310, batch size: 142, loss[dur_loss=0.178, prior_loss=0.9695, diff_loss=0.2624, tot_loss=1.41, over 142.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9694, diff_loss=0.3398, tot_loss=1.486, over 2210.00 samples.],
2024-10-22 02:27:46,930 INFO [train.py:682] (1/4) Start epoch 3833
2024-10-22 02:28:06,913 INFO [train.py:561] (1/4) Epoch 3833, batch 8, global_batch_idx: 61320, batch size: 170, loss[dur_loss=0.181, prior_loss=0.9695, diff_loss=0.2946, tot_loss=1.445, over 170.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9692, diff_loss=0.3735, tot_loss=1.519, over 1432.00 samples.],
2024-10-22 02:28:17,092 INFO [train.py:682] (1/4) Start epoch 3834
2024-10-22 02:28:28,777 INFO [train.py:561] (1/4) Epoch 3834, batch 2, global_batch_idx: 61330, batch size: 203, loss[dur_loss=0.1792, prior_loss=0.9696, diff_loss=0.3218, tot_loss=1.471, over 203.00 samples.], tot_loss[dur_loss=0.1782, prior_loss=0.9697, diff_loss=0.3042, tot_loss=1.452, over 442.00 samples.],
2024-10-22 02:28:43,005 INFO [train.py:561] (1/4) Epoch 3834, batch 12, global_batch_idx: 61340, batch size: 152, loss[dur_loss=0.1772, prior_loss=0.9697, diff_loss=0.2636, tot_loss=1.411, over 152.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9694, diff_loss=0.3419, tot_loss=1.488, over 1966.00 samples.],
2024-10-22 02:28:47,472 INFO [train.py:682] (1/4) Start epoch 3835
2024-10-22 02:29:04,594 INFO [train.py:561] (1/4) Epoch 3835, batch 6, global_batch_idx: 61350, batch size: 106, loss[dur_loss=0.1768, prior_loss=0.9695, diff_loss=0.2718, tot_loss=1.418, over 106.00 samples.], tot_loss[dur_loss=0.1747, prior_loss=0.969, diff_loss=0.3879, tot_loss=1.532, over 1142.00 samples.],
2024-10-22 02:29:17,557 INFO [train.py:682] (1/4) Start epoch 3836
2024-10-22 02:29:26,314 INFO [train.py:561] (1/4) Epoch 3836, batch 0, global_batch_idx: 61360, batch size: 108, loss[dur_loss=0.1816, prior_loss=0.9702, diff_loss=0.3169, tot_loss=1.469, over 108.00 samples.], tot_loss[dur_loss=0.1816, prior_loss=0.9702, diff_loss=0.3169, tot_loss=1.469, over 108.00 samples.],
2024-10-22 02:29:40,495 INFO [train.py:561] (1/4) Epoch 3836, batch 10, global_batch_idx: 61370, batch size: 111, loss[dur_loss=0.1777, prior_loss=0.9703, diff_loss=0.3029, tot_loss=1.451, over 111.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9692, diff_loss=0.3557, tot_loss=1.501, over 1656.00 samples.],
2024-10-22 02:29:47,555 INFO [train.py:682] (1/4) Start epoch 3837
2024-10-22 02:30:01,760 INFO [train.py:561] (1/4) Epoch 3837, batch 4, global_batch_idx: 61380, batch size: 189, loss[dur_loss=0.1775, prior_loss=0.9695, diff_loss=0.3191, tot_loss=1.466, over 189.00 samples.], tot_loss[dur_loss=0.1739, prior_loss=0.9688, diff_loss=0.4082, tot_loss=1.551, over 937.00 samples.],
2024-10-22 02:30:16,615 INFO [train.py:561] (1/4) Epoch 3837, batch 14, global_batch_idx: 61390, batch size: 142, loss[dur_loss=0.176, prior_loss=0.9693, diff_loss=0.2828, tot_loss=1.428, over 142.00 samples.], tot_loss[dur_loss=0.1753, prior_loss=0.9693, diff_loss=0.3377, tot_loss=1.482, over 2210.00 samples.],
2024-10-22 02:30:18,035 INFO [train.py:682] (1/4) Start epoch 3838
2024-10-22 02:30:38,320 INFO [train.py:561] (1/4) Epoch 3838, batch 8, global_batch_idx: 61400, batch size: 170, loss[dur_loss=0.179, prior_loss=0.9696, diff_loss=0.2895, tot_loss=1.438, over 170.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9692, diff_loss=0.3619, tot_loss=1.507, over 1432.00 samples.],
2024-10-22 02:30:48,399 INFO [train.py:682] (1/4) Start epoch 3839
2024-10-22 02:30:59,780 INFO [train.py:561] (1/4) Epoch 3839, batch 2, global_batch_idx: 61410, batch size: 203, loss[dur_loss=0.1786, prior_loss=0.9696, diff_loss=0.3251, tot_loss=1.473, over 203.00 samples.], tot_loss[dur_loss=0.1785, prior_loss=0.9696, diff_loss=0.2927, tot_loss=1.441, over 442.00 samples.],
2024-10-22 02:31:13,974 INFO [train.py:561] (1/4) Epoch 3839, batch 12, global_batch_idx: 61420, batch size: 152, loss[dur_loss=0.1768, prior_loss=0.9697, diff_loss=0.2641, tot_loss=1.411, over 152.00 samples.], tot_loss[dur_loss=0.1769, prior_loss=0.9694, diff_loss=0.3371, tot_loss=1.483, over 1966.00 samples.],
2024-10-22 02:31:18,414 INFO [train.py:682] (1/4) Start epoch 3840
2024-10-22 02:31:35,492 INFO [train.py:561] (1/4) Epoch 3840, batch 6, global_batch_idx: 61430, batch size: 106, loss[dur_loss=0.1748, prior_loss=0.9695, diff_loss=0.3303, tot_loss=1.475, over 106.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9691, diff_loss=0.4035, tot_loss=1.548, over 1142.00 samples.],
2024-10-22 02:31:48,456 INFO [train.py:682] (1/4) Start epoch 3841
2024-10-22 02:31:57,250 INFO [train.py:561] (1/4) Epoch 3841, batch 0, global_batch_idx: 61440, batch size: 108, loss[dur_loss=0.1807, prior_loss=0.9701, diff_loss=0.3045, tot_loss=1.455, over 108.00 samples.], tot_loss[dur_loss=0.1807, prior_loss=0.9701, diff_loss=0.3045, tot_loss=1.455, over 108.00 samples.],
2024-10-22 02:32:11,295 INFO [train.py:561] (1/4) Epoch 3841, batch 10, global_batch_idx: 61450, batch size: 111, loss[dur_loss=0.1809, prior_loss=0.9705, diff_loss=0.2869, tot_loss=1.438, over 111.00 samples.], tot_loss[dur_loss=0.1751, prior_loss=0.9693, diff_loss=0.3583, tot_loss=1.503, over 1656.00 samples.],
2024-10-22 02:32:18,350 INFO [train.py:682] (1/4) Start epoch 3842
2024-10-22 02:32:32,033 INFO [train.py:561] (1/4) Epoch 3842, batch 4, global_batch_idx: 61460, batch size: 189, loss[dur_loss=0.1762, prior_loss=0.9696, diff_loss=0.3499, tot_loss=1.496, over 189.00 samples.], tot_loss[dur_loss=0.1751, prior_loss=0.9689, diff_loss=0.4039, tot_loss=1.548, over 937.00 samples.],
2024-10-22 02:32:46,825 INFO [train.py:561] (1/4) Epoch 3842, batch 14, global_batch_idx: 61470, batch size: 142, loss[dur_loss=0.1781, prior_loss=0.9694, diff_loss=0.264, tot_loss=1.411, over 142.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9694, diff_loss=0.3357, tot_loss=1.482, over 2210.00 samples.],
2024-10-22 02:32:48,234 INFO [train.py:682] (1/4) Start epoch 3843
2024-10-22 02:33:08,315 INFO [train.py:561] (1/4) Epoch 3843, batch 8, global_batch_idx: 61480, batch size: 170, loss[dur_loss=0.1798, prior_loss=0.9695, diff_loss=0.2926, tot_loss=1.442, over 170.00 samples.], tot_loss[dur_loss=0.1761, prior_loss=0.9692, diff_loss=0.3638, tot_loss=1.509, over 1432.00 samples.],
2024-10-22 02:33:18,412 INFO [train.py:682] (1/4) Start epoch 3844
2024-10-22 02:33:30,009 INFO [train.py:561] (1/4) Epoch 3844, batch 2, global_batch_idx: 61490, batch size: 203, loss[dur_loss=0.1784, prior_loss=0.9697, diff_loss=0.3068, tot_loss=1.455, over 203.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9696, diff_loss=0.3026, tot_loss=1.451, over 442.00 samples.],
2024-10-22 02:33:44,189 INFO [train.py:561] (1/4) Epoch 3844, batch 12, global_batch_idx: 61500, batch size: 152, loss[dur_loss=0.1768, prior_loss=0.9697, diff_loss=0.3192, tot_loss=1.466, over 152.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9693, diff_loss=0.3535, tot_loss=1.499, over 1966.00 samples.],
2024-10-22 02:33:45,812 INFO [train.py:579] (1/4) Computing validation loss
2024-10-22 02:34:20,693 INFO [train.py:589] (1/4) Epoch 3844, validation: dur_loss=0.4562, prior_loss=1.037, diff_loss=0.4023, tot_loss=1.895, over 100.00 samples.
2024-10-22 02:34:20,694 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB
2024-10-22 02:34:23,514 INFO [train.py:682] (1/4) Start epoch 3845
2024-10-22 02:34:41,204 INFO [train.py:561] (1/4) Epoch 3845, batch 6, global_batch_idx: 61510, batch size: 106, loss[dur_loss=0.1758, prior_loss=0.9693, diff_loss=0.2882, tot_loss=1.433, over 106.00 samples.], tot_loss[dur_loss=0.1758, prior_loss=0.969, diff_loss=0.3888, tot_loss=1.534, over 1142.00 samples.],
2024-10-22 02:34:54,196 INFO [train.py:682] (1/4) Start epoch 3846
2024-10-22 02:35:03,296 INFO [train.py:561] (1/4) Epoch 3846, batch 0, global_batch_idx: 61520, batch size: 108, loss[dur_loss=0.1783, prior_loss=0.9699, diff_loss=0.2897, tot_loss=1.438, over 108.00 samples.], tot_loss[dur_loss=0.1783, prior_loss=0.9699, diff_loss=0.2897, tot_loss=1.438, over 108.00 samples.],
2024-10-22 02:35:17,590 INFO [train.py:561] (1/4) Epoch 3846, batch 10, global_batch_idx: 61530, batch size: 111, loss[dur_loss=0.1793, prior_loss=0.9705, diff_loss=0.2798, tot_loss=1.43, over 111.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9693, diff_loss=0.3494, tot_loss=1.494, over 1656.00 samples.],
2024-10-22 02:35:24,705 INFO [train.py:682] (1/4) Start epoch 3847
2024-10-22 02:35:38,794 INFO [train.py:561] (1/4) Epoch 3847, batch 4, global_batch_idx: 61540, batch size: 189, loss[dur_loss=0.178, prior_loss=0.9696, diff_loss=0.3242, tot_loss=1.472, over 189.00 samples.], tot_loss[dur_loss=0.1753, prior_loss=0.9689, diff_loss=0.4252, tot_loss=1.569, over 937.00 samples.],
2024-10-22 02:35:53,605 INFO [train.py:561] (1/4) Epoch 3847, batch 14, global_batch_idx: 61550, batch size: 142, loss[dur_loss=0.1781, prior_loss=0.9695, diff_loss=0.251, tot_loss=1.399, over 142.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9694, diff_loss=0.34, tot_loss=1.486, over 2210.00 samples.],
2024-10-22 02:35:55,026 INFO [train.py:682] (1/4) Start epoch 3848
2024-10-22 02:36:15,164 INFO [train.py:561] (1/4) Epoch 3848, batch 8, global_batch_idx: 61560, batch size: 170, loss[dur_loss=0.1807, prior_loss=0.9697, diff_loss=0.3081, tot_loss=1.459, over 170.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9692, diff_loss=0.3753, tot_loss=1.521, over 1432.00 samples.],
2024-10-22 02:36:25,268 INFO [train.py:682] (1/4) Start epoch 3849
2024-10-22 02:36:37,491 INFO [train.py:561] (1/4) Epoch 3849, batch 2, global_batch_idx: 61570, batch size: 203, loss[dur_loss=0.1763, prior_loss=0.9696, diff_loss=0.3357, tot_loss=1.482, over 203.00 samples.], tot_loss[dur_loss=0.1778, prior_loss=0.9696, diff_loss=0.2931, tot_loss=1.44, over 442.00 samples.],
2024-10-22 02:36:51,715 INFO [train.py:561] (1/4) Epoch 3849, batch 12, global_batch_idx: 61580, batch size: 152, loss[dur_loss=0.176, prior_loss=0.9697, diff_loss=0.2779, tot_loss=1.424, over 152.00 samples.], tot_loss[dur_loss=0.1762, prior_loss=0.9693, diff_loss=0.3368, tot_loss=1.482, over 1966.00 samples.],
2024-10-22 02:36:56,134 INFO [train.py:682] (1/4) Start epoch 3850
2024-10-22 02:37:13,338 INFO [train.py:561] (1/4) Epoch 3850, batch 6, global_batch_idx: 61590, batch size: 106, loss[dur_loss=0.1752, prior_loss=0.9694, diff_loss=0.2675, tot_loss=1.412, over 106.00 samples.], tot_loss[dur_loss=0.1748, prior_loss=0.9691, diff_loss=0.3842, tot_loss=1.528, over 1142.00 samples.],
2024-10-22 02:37:26,369 INFO [train.py:682] (1/4) Start epoch 3851
2024-10-22 02:37:35,479 INFO [train.py:561] (1/4) Epoch 3851, batch 0, global_batch_idx: 61600, batch size: 108, loss[dur_loss=0.1785, prior_loss=0.97, diff_loss=0.2795, tot_loss=1.428, over 108.00 samples.], tot_loss[dur_loss=0.1785, prior_loss=0.97, diff_loss=0.2795, tot_loss=1.428, over 108.00 samples.],
2024-10-22 02:37:49,792 INFO [train.py:561] (1/4) Epoch 3851, batch 10, global_batch_idx: 61610, batch size: 111, loss[dur_loss=0.1788, prior_loss=0.9705, diff_loss=0.2707, tot_loss=1.42, over 111.00 samples.], tot_loss[dur_loss=0.1762, prior_loss=0.9693, diff_loss=0.3479, tot_loss=1.493, over 1656.00 samples.],
2024-10-22 02:37:56,869 INFO [train.py:682] (1/4) Start epoch 3852
2024-10-22 02:38:10,775 INFO [train.py:561] (1/4) Epoch 3852, batch 4, global_batch_idx: 61620, batch size: 189, loss[dur_loss=0.1787, prior_loss=0.9696, diff_loss=0.3184, tot_loss=1.467, over 189.00 samples.], tot_loss[dur_loss=0.1732, prior_loss=0.9689, diff_loss=0.4012, tot_loss=1.543, over 937.00 samples.],
2024-10-22 02:38:25,545 INFO [train.py:561] (1/4) Epoch 3852, batch 14, global_batch_idx: 61630, batch size: 142, loss[dur_loss=0.1775, prior_loss=0.9692, diff_loss=0.3, tot_loss=1.447, over 142.00 samples.], tot_loss[dur_loss=0.1757, prior_loss=0.9693, diff_loss=0.3351, tot_loss=1.48, over 2210.00 samples.],
2024-10-22 02:38:26,977 INFO [train.py:682] (1/4) Start epoch 3853
2024-10-22 02:38:47,358 INFO [train.py:561] (1/4) Epoch 3853, batch 8, global_batch_idx: 61640, batch size: 170, loss[dur_loss=0.1796, prior_loss=0.9695, diff_loss=0.303, tot_loss=1.452, over 170.00 samples.], tot_loss[dur_loss=0.1758, prior_loss=0.9691, diff_loss=0.3543, tot_loss=1.499, over 1432.00 samples.],
2024-10-22 02:38:57,376 INFO [train.py:682] (1/4) Start epoch 3854
2024-10-22 02:39:08,767 INFO [train.py:561] (1/4) Epoch 3854, batch 2, global_batch_idx: 61650, batch size: 203, loss[dur_loss=0.1781, prior_loss=0.9696, diff_loss=0.3203, tot_loss=1.468, over 203.00 samples.], tot_loss[dur_loss=0.1779, prior_loss=0.9697, diff_loss=0.3161, tot_loss=1.464, over 442.00 samples.],
2024-10-22 02:39:22,978 INFO [train.py:561] (1/4) Epoch 3854, batch 12, global_batch_idx: 61660, batch size: 152, loss[dur_loss=0.1792, prior_loss=0.9699, diff_loss=0.3246, tot_loss=1.474, over 152.00 samples.], tot_loss[dur_loss=0.1763, prior_loss=0.9694, diff_loss=0.3487, tot_loss=1.494, over 1966.00 samples.],
2024-10-22 02:39:27,395 INFO [train.py:682] (1/4) Start epoch 3855
2024-10-22 02:39:44,319 INFO [train.py:561] (1/4) Epoch 3855, batch 6, global_batch_idx: 61670, batch size: 106, loss[dur_loss=0.177, prior_loss=0.9693, diff_loss=0.3057, tot_loss=1.452, over 106.00 samples.], tot_loss[dur_loss=0.1745, prior_loss=0.9691, diff_loss=0.4031, tot_loss=1.547, over 1142.00 samples.],
2024-10-22 02:39:57,335 INFO [train.py:682] (1/4) Start epoch 3856
2024-10-22 02:40:06,309 INFO [train.py:561] (1/4) Epoch 3856, batch 0, global_batch_idx: 61680, batch size: 108, loss[dur_loss=0.1788, prior_loss=0.97, diff_loss=0.2844, tot_loss=1.433, over 108.00 samples.], tot_loss[dur_loss=0.1788, prior_loss=0.97, diff_loss=0.2844, tot_loss=1.433, over 108.00 samples.],
2024-10-22 02:40:20,618 INFO [train.py:561] (1/4) Epoch 3856, batch 10, global_batch_idx: 61690, batch size: 111, loss[dur_loss=0.1781, prior_loss=0.9704, diff_loss=0.2968, tot_loss=1.445, over 111.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9693, diff_loss=0.369, tot_loss=1.514, over 1656.00 samples.],
2024-10-22 02:40:27,839 INFO [train.py:682] (1/4) Start epoch 3857
2024-10-22 02:40:41,918 INFO [train.py:561] (1/4) Epoch 3857, batch 4, global_batch_idx: 61700, batch size: 189, loss[dur_loss=0.1753, prior_loss=0.9695, diff_loss=0.3109, tot_loss=1.456, over 189.00 samples.], tot_loss[dur_loss=0.1732, prior_loss=0.9689, diff_loss=0.4125, tot_loss=1.555, over 937.00 samples.],
2024-10-22 02:40:57,202 INFO [train.py:561] (1/4) Epoch 3857, batch 14, global_batch_idx: 61710, batch size: 142, loss[dur_loss=0.1772, prior_loss=0.9693, diff_loss=0.2884, tot_loss=1.435, over 142.00 samples.], tot_loss[dur_loss=0.1757, prior_loss=0.9693, diff_loss=0.3397, tot_loss=1.485, over 2210.00 samples.],
2024-10-22 02:40:58,637 INFO [train.py:682] (1/4) Start epoch 3858
2024-10-22 02:41:19,485 INFO [train.py:561] (1/4) Epoch 3858, batch 8, global_batch_idx: 61720, batch size: 170, loss[dur_loss=0.1808, prior_loss=0.9698, diff_loss=0.2952, tot_loss=1.446, over 170.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9691, diff_loss=0.3623, tot_loss=1.507, over 1432.00 samples.],
2024-10-22 02:41:29,771 INFO [train.py:682] (1/4) Start epoch 3859
2024-10-22 02:41:41,703 INFO [train.py:561] (1/4) Epoch 3859, batch 2, global_batch_idx: 61730, batch size: 203, loss[dur_loss=0.1772, prior_loss=0.9697, diff_loss=0.3183, tot_loss=1.465, over 203.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9697, diff_loss=0.3002, tot_loss=1.447, over 442.00 samples.],
2024-10-22 02:41:56,316 INFO [train.py:561] (1/4) Epoch 3859, batch 12, global_batch_idx: 61740, batch size: 152, loss[dur_loss=0.1763, prior_loss=0.9697, diff_loss=0.2867, tot_loss=1.433, over 152.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9693, diff_loss=0.3547, tot_loss=1.5, over 1966.00 samples.],
2024-10-22 02:42:00,848 INFO [train.py:682] (1/4) Start epoch 3860
2024-10-22 02:42:18,586 INFO [train.py:561] (1/4) Epoch 3860, batch 6, global_batch_idx: 61750, batch size: 106, loss[dur_loss=0.1742, prior_loss=0.9694, diff_loss=0.2889, tot_loss=1.432, over 106.00 samples.], tot_loss[dur_loss=0.1751, prior_loss=0.9691, diff_loss=0.3929, tot_loss=1.537, over 1142.00 samples.],
2024-10-22 02:42:31,981 INFO [train.py:682] (1/4) Start epoch 3861
2024-10-22 02:42:41,409 INFO [train.py:561] (1/4) Epoch 3861, batch 0, global_batch_idx: 61760, batch size: 108, loss[dur_loss=0.1775, prior_loss=0.9701, diff_loss=0.2762, tot_loss=1.424, over 108.00 samples.], tot_loss[dur_loss=0.1775, prior_loss=0.9701, diff_loss=0.2762, tot_loss=1.424, over 108.00 samples.],
2024-10-22 02:42:56,077 INFO [train.py:561] (1/4) Epoch 3861, batch 10, global_batch_idx: 61770, batch size: 111, loss[dur_loss=0.1812, prior_loss=0.9706, diff_loss=0.2945, tot_loss=1.446, over 111.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9694, diff_loss=0.3589, tot_loss=1.505, over 1656.00 samples.],
2024-10-22 02:43:03,349 INFO [train.py:682] (1/4) Start epoch 3862
2024-10-22 02:43:17,866 INFO [train.py:561] (1/4) Epoch 3862, batch 4, global_batch_idx: 61780, batch size: 189, loss[dur_loss=0.178, prior_loss=0.9696, diff_loss=0.2958, tot_loss=1.443, over 189.00 samples.], tot_loss[dur_loss=0.1741, prior_loss=0.9688, diff_loss=0.4004, tot_loss=1.543, over 937.00 samples.],
2024-10-22 02:43:33,027 INFO [train.py:561] (1/4) Epoch 3862, batch 14, global_batch_idx: 61790, batch size: 142, loss[dur_loss=0.1782, prior_loss=0.9694, diff_loss=0.2714, tot_loss=1.419, over 142.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9693, diff_loss=0.3368, tot_loss=1.483, over 2210.00 samples.],
2024-10-22 02:43:34,501 INFO [train.py:682] (1/4) Start epoch 3863
2024-10-22 02:43:55,132 INFO [train.py:561] (1/4) Epoch 3863, batch 8, global_batch_idx: 61800, batch size: 170, loss[dur_loss=0.1801, prior_loss=0.9696, diff_loss=0.2536, tot_loss=1.403, over 170.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9692, diff_loss=0.3483, tot_loss=1.493, over 1432.00 samples.],
2024-10-22 02:44:05,429 INFO [train.py:682] (1/4) Start epoch 3864
2024-10-22 02:44:17,141 INFO [train.py:561] (1/4) Epoch 3864, batch 2, global_batch_idx: 61810, batch size: 203, loss[dur_loss=0.1786, prior_loss=0.9698, diff_loss=0.2997, tot_loss=1.448, over 203.00 samples.], tot_loss[dur_loss=0.1791, prior_loss=0.9698, diff_loss=0.2975, tot_loss=1.446, over 442.00 samples.],
2024-10-22 02:44:31,747 INFO [train.py:561] (1/4) Epoch 3864, batch 12, global_batch_idx: 61820, batch size: 152, loss[dur_loss=0.1776, prior_loss=0.97, diff_loss=0.2723, tot_loss=1.42, over 152.00 samples.], tot_loss[dur_loss=0.1763, prior_loss=0.9694, diff_loss=0.3457, tot_loss=1.491, over 1966.00 samples.],
2024-10-22 02:44:36,310 INFO [train.py:682] (1/4) Start epoch 3865
2024-10-22 02:44:53,988 INFO [train.py:561] (1/4) Epoch 3865, batch 6, global_batch_idx: 61830, batch size: 106, loss[dur_loss=0.1758, prior_loss=0.9694, diff_loss=0.3038, tot_loss=1.449, over 106.00 samples.], tot_loss[dur_loss=0.1758, prior_loss=0.9691, diff_loss=0.4034, tot_loss=1.548, over 1142.00 samples.],
2024-10-22 02:45:07,388 INFO [train.py:682] (1/4) Start epoch 3866
2024-10-22 02:45:16,579 INFO [train.py:561] (1/4) Epoch 3866, batch 0, global_batch_idx: 61840, batch size: 108, loss[dur_loss=0.1784, prior_loss=0.9701, diff_loss=0.2733, tot_loss=1.422, over 108.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9701, diff_loss=0.2733, tot_loss=1.422, over 108.00 samples.],
2024-10-22 02:45:31,314 INFO [train.py:561] (1/4) Epoch 3866, batch 10, global_batch_idx: 61850, batch size: 111, loss[dur_loss=0.1787, prior_loss=0.9704, diff_loss=0.2644, tot_loss=1.414, over 111.00 samples.], tot_loss[dur_loss=0.1751, prior_loss=0.9693, diff_loss=0.3549, tot_loss=1.499, over 1656.00 samples.],
2024-10-22 02:45:38,564 INFO [train.py:682] (1/4) Start epoch 3867
2024-10-22 02:45:52,974 INFO [train.py:561] (1/4) Epoch 3867, batch 4, global_batch_idx: 61860, batch size: 189, loss[dur_loss=0.1782, prior_loss=0.9697, diff_loss=0.3395, tot_loss=1.487, over 189.00 samples.], tot_loss[dur_loss=0.1737, prior_loss=0.9689, diff_loss=0.4131, tot_loss=1.556, over 937.00 samples.],
2024-10-22 02:46:08,232 INFO [train.py:561] (1/4) Epoch 3867, batch 14, global_batch_idx: 61870, batch size: 142, loss[dur_loss=0.1813, prior_loss=0.9693, diff_loss=0.2598, tot_loss=1.41, over 142.00 samples.], tot_loss[dur_loss=0.1765, prior_loss=0.9694, diff_loss=0.3432, tot_loss=1.489, over 2210.00 samples.],
2024-10-22 02:46:09,696 INFO [train.py:682] (1/4) Start epoch 3868
2024-10-22 02:46:30,203 INFO [train.py:561] (1/4) Epoch 3868, batch 8, global_batch_idx: 61880, batch size: 170, loss[dur_loss=0.1779, prior_loss=0.9696, diff_loss=0.3334, tot_loss=1.481, over 170.00 samples.], tot_loss[dur_loss=0.1754, prior_loss=0.9692, diff_loss=0.3613, tot_loss=1.506, over 1432.00 samples.],
2024-10-22 02:46:40,518 INFO [train.py:682] (1/4) Start epoch 3869
2024-10-22 02:46:52,298 INFO [train.py:561] (1/4) Epoch 3869, batch 2, global_batch_idx: 61890, batch size: 203, loss[dur_loss=0.1796, prior_loss=0.9698, diff_loss=0.2994, tot_loss=1.449, over 203.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9697, diff_loss=0.2991, tot_loss=1.447, over 442.00 samples.],
2024-10-22 02:47:06,855 INFO [train.py:561] (1/4) Epoch 3869, batch 12, global_batch_idx: 61900, batch size: 152, loss[dur_loss=0.1758, prior_loss=0.9695, diff_loss=0.2954, tot_loss=1.441, over 152.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9693, diff_loss=0.3413, tot_loss=1.487, over 1966.00 samples.],
2024-10-22 02:47:11,414 INFO [train.py:682] (1/4) Start epoch 3870
2024-10-22 02:47:29,191 INFO [train.py:561] (1/4) Epoch 3870, batch 6, global_batch_idx: 61910, batch size: 106, loss[dur_loss=0.1783, prior_loss=0.9694, diff_loss=0.2689, tot_loss=1.417, over 106.00 samples.], tot_loss[dur_loss=0.1754, prior_loss=0.969, diff_loss=0.3725, tot_loss=1.517, over 1142.00 samples.],
2024-10-22 02:47:42,582 INFO [train.py:682] (1/4) Start epoch 3871
2024-10-22 02:47:51,493 INFO [train.py:561] (1/4) Epoch 3871, batch 0, global_batch_idx: 61920, batch size: 108, loss[dur_loss=0.1803, prior_loss=0.9699, diff_loss=0.2979, tot_loss=1.448, over 108.00 samples.], tot_loss[dur_loss=0.1803, prior_loss=0.9699, diff_loss=0.2979, tot_loss=1.448, over 108.00 samples.],
2024-10-22 02:48:06,092 INFO [train.py:561] (1/4) Epoch 3871, batch 10, global_batch_idx: 61930, batch size: 111, loss[dur_loss=0.1783, prior_loss=0.9702, diff_loss=0.3201, tot_loss=1.469, over 111.00 samples.], tot_loss[dur_loss=0.1767, prior_loss=0.9692, diff_loss=0.3478, tot_loss=1.494, over 1656.00 samples.],
2024-10-22 02:48:13,393 INFO [train.py:682] (1/4) Start epoch 3872
2024-10-22 02:48:27,830 INFO [train.py:561] (1/4) Epoch 3872, batch 4, global_batch_idx: 61940, batch size: 189, loss[dur_loss=0.1763, prior_loss=0.9697, diff_loss=0.3179, tot_loss=1.464, over 189.00 samples.], tot_loss[dur_loss=0.1739, prior_loss=0.9688, diff_loss=0.4005, tot_loss=1.543, over 937.00 samples.],
2024-10-22 02:48:43,124 INFO [train.py:561] (1/4) Epoch 3872, batch 14, global_batch_idx: 61950, batch size: 142, loss[dur_loss=0.179, prior_loss=0.9694, diff_loss=0.2452, tot_loss=1.394, over 142.00 samples.], tot_loss[dur_loss=0.1765, prior_loss=0.9693, diff_loss=0.3358, tot_loss=1.482, over 2210.00 samples.],
2024-10-22 02:48:44,570 INFO [train.py:682] (1/4) Start epoch 3873
2024-10-22 02:49:05,259 INFO [train.py:561] (1/4) Epoch 3873, batch 8, global_batch_idx: 61960, batch size: 170, loss[dur_loss=0.1789, prior_loss=0.9697, diff_loss=0.3204, tot_loss=1.469, over 170.00 samples.], tot_loss[dur_loss=0.1759, prior_loss=0.9691, diff_loss=0.3675, tot_loss=1.512, over 1432.00 samples.],
2024-10-22 02:49:15,580 INFO [train.py:682] (1/4) Start epoch 3874
2024-10-22 02:49:27,291 INFO [train.py:561] (1/4) Epoch 3874, batch 2, global_batch_idx: 61970, batch size: 203, loss[dur_loss=0.1799, prior_loss=0.9696, diff_loss=0.3137, tot_loss=1.463, over 203.00 samples.], tot_loss[dur_loss=0.178, prior_loss=0.9695, diff_loss=0.3033, tot_loss=1.451, over 442.00 samples.],
2024-10-22 02:49:41,901 INFO [train.py:561] (1/4) Epoch 3874, batch 12, global_batch_idx: 61980, batch size: 152, loss[dur_loss=0.1755, prior_loss=0.9695, diff_loss=0.2895, tot_loss=1.435, over 152.00 samples.], tot_loss[dur_loss=0.1762, prior_loss=0.9692, diff_loss=0.3386, tot_loss=1.484, over 1966.00 samples.],
2024-10-22 02:49:46,451 INFO [train.py:682] (1/4) Start epoch 3875
2024-10-22 02:50:03,909 INFO [train.py:561] (1/4) Epoch 3875, batch 6, global_batch_idx: 61990, batch size: 106, loss[dur_loss=0.1763, prior_loss=0.9693, diff_loss=0.271, tot_loss=1.417, over 106.00 samples.], tot_loss[dur_loss=0.1749, prior_loss=0.9689, diff_loss=0.3753, tot_loss=1.519, over 1142.00 samples.],
2024-10-22 02:50:17,266 INFO [train.py:682] (1/4) Start epoch 3876
2024-10-22 02:50:26,551 INFO [train.py:561] (1/4) Epoch 3876, batch 0, global_batch_idx: 62000, batch size: 108, loss[dur_loss=0.1789, prior_loss=0.9698, diff_loss=0.2754, tot_loss=1.424, over 108.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9698, diff_loss=0.2754, tot_loss=1.424, over 108.00 samples.],
2024-10-22 02:50:41,162 INFO [train.py:561] (1/4) Epoch 3876, batch 10, global_batch_idx: 62010, batch size: 111, loss[dur_loss=0.1758, prior_loss=0.9703, diff_loss=0.3341, tot_loss=1.48, over 111.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.9692, diff_loss=0.3584, tot_loss=1.503, over 1656.00 samples.],
2024-10-22 02:50:48,382 INFO [train.py:682] (1/4) Start epoch 3877
2024-10-22 02:51:02,629 INFO [train.py:561] (1/4) Epoch 3877, batch 4, global_batch_idx: 62020, batch size: 189, loss[dur_loss=0.1754, prior_loss=0.9697, diff_loss=0.3197, tot_loss=1.465, over 189.00 samples.], tot_loss[dur_loss=0.174, prior_loss=0.9688, diff_loss=0.4027, tot_loss=1.545, over 937.00 samples.],
2024-10-22 02:51:17,876 INFO [train.py:561] (1/4) Epoch 3877, batch 14, global_batch_idx: 62030, batch size: 142, loss[dur_loss=0.1765, prior_loss=0.9692, diff_loss=0.2793, tot_loss=1.425, over 142.00 samples.], tot_loss[dur_loss=0.1759, prior_loss=0.9693, diff_loss=0.3324, tot_loss=1.478, over 2210.00 samples.],
2024-10-22 02:51:19,329 INFO [train.py:682] (1/4) Start epoch 3878
2024-10-22 02:51:39,777 INFO [train.py:561] (1/4) Epoch 3878, batch 8, global_batch_idx: 62040, batch size: 170, loss[dur_loss=0.1788, prior_loss=0.9694, diff_loss=0.2981, tot_loss=1.446, over 170.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.9691, diff_loss=0.3742, tot_loss=1.518, over 1432.00 samples.],
2024-10-22 02:51:50,071 INFO [train.py:682] (1/4) Start epoch 3879
2024-10-22 02:52:01,863 INFO [train.py:561] (1/4) Epoch 3879, batch 2, global_batch_idx: 62050, batch size: 203, loss[dur_loss=0.178, prior_loss=0.9694, diff_loss=0.3086, tot_loss=1.456, over 203.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9695, diff_loss=0.296, tot_loss=1.442, over 442.00 samples.],
2024-10-22 02:52:16,299 INFO [train.py:561] (1/4) Epoch 3879, batch 12, global_batch_idx: 62060, batch size: 152, loss[dur_loss=0.1755, prior_loss=0.9696, diff_loss=0.2762, tot_loss=1.421, over 152.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9693, diff_loss=0.3447, tot_loss=1.49, over 1966.00 samples.],
2024-10-22 02:52:20,828 INFO [train.py:682] (1/4) Start epoch 3880
2024-10-22 02:52:38,223 INFO [train.py:561] (1/4) Epoch 3880, batch 6, global_batch_idx: 62070, batch size: 106, loss[dur_loss=0.1786, prior_loss=0.9694, diff_loss=0.3078, tot_loss=1.456, over 106.00 samples.], tot_loss[dur_loss=0.1741, prior_loss=0.9689, diff_loss=0.3798, tot_loss=1.523, over 1142.00 samples.],
2024-10-22 02:52:51,543 INFO [train.py:682] (1/4) Start epoch 3881
2024-10-22 02:53:00,761 INFO [train.py:561] (1/4) Epoch 3881, batch 0, global_batch_idx: 62080, batch size: 108, loss[dur_loss=0.1794, prior_loss=0.9699, diff_loss=0.2841, tot_loss=1.433, over 108.00 samples.], tot_loss[dur_loss=0.1794, prior_loss=0.9699, diff_loss=0.2841, tot_loss=1.433, over 108.00 samples.],
2024-10-22 02:53:15,246 INFO [train.py:561] (1/4) Epoch 3881, batch 10, global_batch_idx: 62090, batch size: 111, loss[dur_loss=0.1778, prior_loss=0.9703, diff_loss=0.2892, tot_loss=1.437, over 111.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.9692, diff_loss=0.3546, tot_loss=1.499, over 1656.00 samples.],
2024-10-22 02:53:22,551 INFO [train.py:682] (1/4) Start epoch 3882
2024-10-22 02:53:37,073 INFO [train.py:561] (1/4) Epoch 3882, batch 4, global_batch_idx: 62100, batch size: 189, loss[dur_loss=0.1773, prior_loss=0.9695, diff_loss=0.2963, tot_loss=1.443, over 189.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.9687, diff_loss=0.3969, tot_loss=1.541, over 937.00 samples.],
2024-10-22 02:53:52,336 INFO [train.py:561] (1/4) Epoch 3882, batch 14, global_batch_idx: 62110, batch size: 142, loss[dur_loss=0.1768, prior_loss=0.9692, diff_loss=0.3043, tot_loss=1.45, over 142.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9692, diff_loss=0.3379, tot_loss=1.483, over 2210.00 samples.],
2024-10-22 02:53:53,788 INFO [train.py:682] (1/4) Start epoch 3883
2024-10-22 02:54:14,344 INFO [train.py:561] (1/4) Epoch 3883, batch 8, global_batch_idx: 62120, batch size: 170, loss[dur_loss=0.1775, prior_loss=0.9695, diff_loss=0.317, tot_loss=1.464, over 170.00 samples.], tot_loss[dur_loss=0.1755, prior_loss=0.9691, diff_loss=0.3698, tot_loss=1.514, over 1432.00 samples.],
2024-10-22 02:54:24,735 INFO [train.py:682] (1/4) Start epoch 3884
2024-10-22 02:54:36,580 INFO [train.py:561] (1/4) Epoch 3884, batch 2, global_batch_idx: 62130, batch size: 203, loss[dur_loss=0.1773, prior_loss=0.9695, diff_loss=0.3165, tot_loss=1.463, over 203.00 samples.], tot_loss[dur_loss=0.1769, prior_loss=0.9695, diff_loss=0.2991, tot_loss=1.445, over 442.00 samples.],
2024-10-22 02:54:51,222 INFO [train.py:561] (1/4) Epoch 3884, batch 12, global_batch_idx: 62140, batch size: 152, loss[dur_loss=0.1757, prior_loss=0.9695, diff_loss=0.2866, tot_loss=1.432, over 152.00 samples.], tot_loss[dur_loss=0.1748, prior_loss=0.9692, diff_loss=0.3424, tot_loss=1.486, over 1966.00 samples.],
2024-10-22 02:54:55,773 INFO [train.py:682] (1/4) Start epoch 3885
2024-10-22 02:55:13,179 INFO [train.py:561] (1/4) Epoch 3885, batch 6, global_batch_idx: 62150, batch size: 106, loss[dur_loss=0.1758, prior_loss=0.9694, diff_loss=0.2658, tot_loss=1.411, over 106.00 samples.], tot_loss[dur_loss=0.1742, prior_loss=0.9689, diff_loss=0.3771, tot_loss=1.52, over 1142.00 samples.],
2024-10-22 02:55:26,573 INFO [train.py:682] (1/4) Start epoch 3886
2024-10-22 02:55:35,582 INFO [train.py:561] (1/4) Epoch 3886, batch 0, global_batch_idx: 62160, batch size: 108, loss[dur_loss=0.1767, prior_loss=0.9699, diff_loss=0.2906, tot_loss=1.437, over 108.00 samples.], tot_loss[dur_loss=0.1767, prior_loss=0.9699, diff_loss=0.2906, tot_loss=1.437, over 108.00 samples.],
2024-10-22 02:55:50,248 INFO [train.py:561] (1/4) Epoch 3886, batch 10, global_batch_idx: 62170, batch size: 111, loss[dur_loss=0.1775, prior_loss=0.9703, diff_loss=0.2646, tot_loss=1.412, over 111.00 samples.], tot_loss[dur_loss=0.1751, prior_loss=0.9691, diff_loss=0.3556, tot_loss=1.5, over 1656.00 samples.],
2024-10-22 02:55:57,509 INFO [train.py:682] (1/4) Start epoch 3887
2024-10-22 02:56:11,875 INFO [train.py:561] (1/4) Epoch 3887, batch 4, global_batch_idx: 62180, batch size: 189, loss[dur_loss=0.1774, prior_loss=0.9694, diff_loss=0.3234, tot_loss=1.47, over 189.00 samples.], tot_loss[dur_loss=0.1746, prior_loss=0.9689, diff_loss=0.4064, tot_loss=1.55, over 937.00 samples.],
2024-10-22 02:56:27,111 INFO [train.py:561] (1/4) Epoch 3887, batch 14, global_batch_idx: 62190, batch size: 142, loss[dur_loss=0.1763, prior_loss=0.9692, diff_loss=0.2622, tot_loss=1.408, over 142.00 samples.], tot_loss[dur_loss=0.1763, prior_loss=0.9692, diff_loss=0.3372, tot_loss=1.483, over 2210.00 samples.],
2024-10-22 02:56:28,556 INFO [train.py:682] (1/4) Start epoch 3888
2024-10-22 02:56:49,118 INFO [train.py:561] (1/4) Epoch 3888, batch 8, global_batch_idx: 62200, batch size: 170, loss[dur_loss=0.1816, prior_loss=0.9697, diff_loss=0.3068, tot_loss=1.458, over 170.00 samples.], tot_loss[dur_loss=0.1757, prior_loss=0.9691, diff_loss=0.372, tot_loss=1.517, over 1432.00 samples.],
2024-10-22 02:56:59,423 INFO [train.py:682] (1/4) Start epoch 3889
2024-10-22 02:57:11,261 INFO [train.py:561] (1/4) Epoch 3889, batch 2, global_batch_idx: 62210, batch size: 203, loss[dur_loss=0.1784, prior_loss=0.9696, diff_loss=0.2922, tot_loss=1.44, over 203.00 samples.], tot_loss[dur_loss=0.1774, prior_loss=0.9696, diff_loss=0.2915, tot_loss=1.438, over 442.00 samples.],
2024-10-22 02:57:25,780 INFO [train.py:561] (1/4) Epoch 3889, batch 12, global_batch_idx: 62220, batch size: 152, loss[dur_loss=0.1769, prior_loss=0.9697, diff_loss=0.2914, tot_loss=1.438, over 152.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9693, diff_loss=0.3445, tot_loss=1.489, over 1966.00 samples.],
2024-10-22 02:57:30,345 INFO [train.py:682] (1/4) Start epoch 3890
2024-10-22 02:57:48,172 INFO [train.py:561] (1/4) Epoch 3890, batch 6, global_batch_idx: 62230, batch size: 106, loss[dur_loss=0.1741, prior_loss=0.9694, diff_loss=0.2477, tot_loss=1.391, over 106.00 samples.], tot_loss[dur_loss=0.1746, prior_loss=0.9689, diff_loss=0.3812, tot_loss=1.525, over 1142.00 samples.],
2024-10-22 02:58:01,607 INFO [train.py:682] (1/4) Start epoch 3891
2024-10-22 02:58:10,662 INFO [train.py:561] (1/4) Epoch 3891, batch 0, global_batch_idx: 62240, batch size: 108, loss[dur_loss=0.1794, prior_loss=0.97, diff_loss=0.2667, tot_loss=1.416, over 108.00 samples.], tot_loss[dur_loss=0.1794, prior_loss=0.97, diff_loss=0.2667, tot_loss=1.416, over 108.00 samples.],
2024-10-22 02:58:25,322 INFO [train.py:561] (1/4) Epoch 3891, batch 10, global_batch_idx: 62250, batch size: 111, loss[dur_loss=0.1763, prior_loss=0.9703, diff_loss=0.307, tot_loss=1.454, over 111.00 samples.], tot_loss[dur_loss=0.1755, prior_loss=0.9691, diff_loss=0.3616, tot_loss=1.506, over 1656.00 samples.],
2024-10-22 02:58:32,562 INFO [train.py:682] (1/4) Start epoch 3892
2024-10-22 02:58:46,698 INFO [train.py:561] (1/4) Epoch 3892, batch 4, global_batch_idx: 62260, batch size: 189, loss[dur_loss=0.1776, prior_loss=0.9694, diff_loss=0.3309, tot_loss=1.478, over 189.00 samples.], tot_loss[dur_loss=0.1747, prior_loss=0.9688, diff_loss=0.4083, tot_loss=1.552, over 937.00 samples.],
2024-10-22 02:59:01,870 INFO [train.py:561] (1/4) Epoch 3892, batch 14, global_batch_idx: 62270, batch size: 142, loss[dur_loss=0.1793, prior_loss=0.9693, diff_loss=0.2777, tot_loss=1.426, over 142.00 samples.], tot_loss[dur_loss=0.1759, prior_loss=0.9692, diff_loss=0.3354, tot_loss=1.48, over 2210.00 samples.],
2024-10-22 02:59:03,334 INFO [train.py:682] (1/4) Start epoch 3893
2024-10-22 02:59:24,336 INFO [train.py:561] (1/4) Epoch 3893, batch 8, global_batch_idx: 62280, batch size: 170, loss[dur_loss=0.1773, prior_loss=0.9694, diff_loss=0.3192, tot_loss=1.466, over 170.00 samples.], tot_loss[dur_loss=0.1753, prior_loss=0.969, diff_loss=0.3638, tot_loss=1.508, over 1432.00 samples.],
2024-10-22 02:59:34,768 INFO [train.py:682] (1/4) Start epoch 3894
2024-10-22 02:59:46,752 INFO [train.py:561] (1/4) Epoch 3894, batch 2, global_batch_idx: 62290, batch size: 203, loss[dur_loss=0.177, prior_loss=0.9693, diff_loss=0.3167, tot_loss=1.463, over 203.00 samples.], tot_loss[dur_loss=0.1758, prior_loss=0.9694, diff_loss=0.3018, tot_loss=1.447, over 442.00 samples.],
2024-10-22 03:00:01,343 INFO [train.py:561] (1/4) Epoch 3894, batch 12, global_batch_idx: 62300, batch size: 152, loss[dur_loss=0.1756, prior_loss=0.9695, diff_loss=0.2597, tot_loss=1.405, over 152.00 samples.], tot_loss[dur_loss=0.1748, prior_loss=0.9691, diff_loss=0.3495, tot_loss=1.493, over 1966.00 samples.],
2024-10-22 03:00:05,871 INFO [train.py:682] (1/4) Start epoch 3895
2024-10-22 03:00:23,507 INFO [train.py:561] (1/4) Epoch 3895, batch 6, global_batch_idx: 62310, batch size: 106, loss[dur_loss=0.178, prior_loss=0.9693, diff_loss=0.267, tot_loss=1.414, over 106.00 samples.], tot_loss[dur_loss=0.1754, prior_loss=0.9689, diff_loss=0.3873, tot_loss=1.532, over 1142.00 samples.],
2024-10-22 03:00:36,899 INFO [train.py:682] (1/4) Start epoch 3896
2024-10-22 03:00:45,958 INFO [train.py:561] (1/4) Epoch 3896, batch 0, global_batch_idx: 62320, batch size: 108, loss[dur_loss=0.1801, prior_loss=0.9699, diff_loss=0.2303, tot_loss=1.38, over 108.00 samples.], tot_loss[dur_loss=0.1801, prior_loss=0.9699, diff_loss=0.2303, tot_loss=1.38, over 108.00 samples.],
2024-10-22 03:01:00,563 INFO [train.py:561] (1/4) Epoch 3896, batch 10, global_batch_idx: 62330, batch size: 111, loss[dur_loss=0.1755, prior_loss=0.9703, diff_loss=0.2677, tot_loss=1.414, over 111.00 samples.], tot_loss[dur_loss=0.1753, prior_loss=0.9691, diff_loss=0.342, tot_loss=1.486, over 1656.00 samples.],
2024-10-22 03:01:07,831 INFO [train.py:682] (1/4) Start epoch 3897
2024-10-22 03:01:21,888 INFO [train.py:561] (1/4) Epoch 3897, batch 4, global_batch_idx: 62340, batch size: 189, loss[dur_loss=0.1777, prior_loss=0.9694, diff_loss=0.2755, tot_loss=1.423, over 189.00 samples.], tot_loss[dur_loss=0.1743, prior_loss=0.9687, diff_loss=0.394, tot_loss=1.537, over 937.00 samples.],
2024-10-22 03:01:37,210 INFO [train.py:561] (1/4) Epoch 3897, batch 14, global_batch_idx: 62350, batch size: 142, loss[dur_loss=0.1762, prior_loss=0.9691, diff_loss=0.293, tot_loss=1.438, over 142.00 samples.], tot_loss[dur_loss=0.1758, prior_loss=0.9692, diff_loss=0.3387, tot_loss=1.484, over 2210.00 samples.],
2024-10-22 03:01:38,663 INFO [train.py:682] (1/4) Start epoch 3898
2024-10-22 03:01:59,203 INFO [train.py:561] (1/4) Epoch 3898, batch 8, global_batch_idx: 62360, batch size: 170, loss[dur_loss=0.1778, prior_loss=0.9696, diff_loss=0.2835, tot_loss=1.431, over 170.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9691, diff_loss=0.3583, tot_loss=1.503, over 1432.00 samples.],
2024-10-22 03:02:09,515 INFO [train.py:682] (1/4) Start epoch 3899
2024-10-22 03:02:21,166 INFO [train.py:561] (1/4) Epoch 3899, batch 2, global_batch_idx: 62370, batch size: 203, loss[dur_loss=0.1786, prior_loss=0.9695, diff_loss=0.3128, tot_loss=1.461, over 203.00 samples.], tot_loss[dur_loss=0.1782, prior_loss=0.9695, diff_loss=0.2883, tot_loss=1.436, over 442.00 samples.],
2024-10-22 03:02:35,730 INFO [train.py:561] (1/4) Epoch 3899, batch 12, global_batch_idx: 62380, batch size: 152, loss[dur_loss=0.1742, prior_loss=0.9694, diff_loss=0.3131, tot_loss=1.457, over 152.00 samples.], tot_loss[dur_loss=0.1754, prior_loss=0.9692, diff_loss=0.3456, tot_loss=1.49, over 1966.00 samples.],
2024-10-22 03:02:40,242 INFO [train.py:682] (1/4) Start epoch 3900
2024-10-22 03:02:57,659 INFO [train.py:561] (1/4) Epoch 3900, batch 6, global_batch_idx: 62390, batch size: 106, loss[dur_loss=0.175, prior_loss=0.9692, diff_loss=0.2183,
tot_loss=1.363, over 106.00 samples.], tot_loss[dur_loss=0.1736, prior_loss=0.9689, diff_loss=0.3796, tot_loss=1.522, over 1142.00 samples.], 2024-10-22 03:03:11,004 INFO [train.py:682] (1/4) Start epoch 3901 2024-10-22 03:03:20,107 INFO [train.py:561] (1/4) Epoch 3901, batch 0, global_batch_idx: 62400, batch size: 108, loss[dur_loss=0.1784, prior_loss=0.9699, diff_loss=0.3096, tot_loss=1.458, over 108.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9699, diff_loss=0.3096, tot_loss=1.458, over 108.00 samples.], 2024-10-22 03:03:34,715 INFO [train.py:561] (1/4) Epoch 3901, batch 10, global_batch_idx: 62410, batch size: 111, loss[dur_loss=0.1782, prior_loss=0.9704, diff_loss=0.2776, tot_loss=1.426, over 111.00 samples.], tot_loss[dur_loss=0.1753, prior_loss=0.9691, diff_loss=0.3502, tot_loss=1.495, over 1656.00 samples.], 2024-10-22 03:03:41,927 INFO [train.py:682] (1/4) Start epoch 3902 2024-10-22 03:03:56,246 INFO [train.py:561] (1/4) Epoch 3902, batch 4, global_batch_idx: 62420, batch size: 189, loss[dur_loss=0.1751, prior_loss=0.9693, diff_loss=0.3373, tot_loss=1.482, over 189.00 samples.], tot_loss[dur_loss=0.1741, prior_loss=0.9687, diff_loss=0.4047, tot_loss=1.548, over 937.00 samples.], 2024-10-22 03:04:11,502 INFO [train.py:561] (1/4) Epoch 3902, batch 14, global_batch_idx: 62430, batch size: 142, loss[dur_loss=0.1774, prior_loss=0.9693, diff_loss=0.2588, tot_loss=1.405, over 142.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9692, diff_loss=0.3353, tot_loss=1.48, over 2210.00 samples.], 2024-10-22 03:04:12,956 INFO [train.py:682] (1/4) Start epoch 3903 2024-10-22 03:04:33,371 INFO [train.py:561] (1/4) Epoch 3903, batch 8, global_batch_idx: 62440, batch size: 170, loss[dur_loss=0.1762, prior_loss=0.9696, diff_loss=0.2846, tot_loss=1.431, over 170.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.969, diff_loss=0.3547, tot_loss=1.499, over 1432.00 samples.], 2024-10-22 03:04:43,739 INFO [train.py:682] (1/4) Start epoch 3904 2024-10-22 03:04:55,894 INFO [train.py:561] (1/4) Epoch 3904, batch 2, global_batch_idx: 62450, batch size: 203, loss[dur_loss=0.1751, prior_loss=0.9694, diff_loss=0.3221, tot_loss=1.467, over 203.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9695, diff_loss=0.3043, tot_loss=1.45, over 442.00 samples.], 2024-10-22 03:05:10,463 INFO [train.py:561] (1/4) Epoch 3904, batch 12, global_batch_idx: 62460, batch size: 152, loss[dur_loss=0.1747, prior_loss=0.9694, diff_loss=0.2987, tot_loss=1.443, over 152.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.9692, diff_loss=0.349, tot_loss=1.493, over 1966.00 samples.], 2024-10-22 03:05:14,993 INFO [train.py:682] (1/4) Start epoch 3905 2024-10-22 03:05:32,499 INFO [train.py:561] (1/4) Epoch 3905, batch 6, global_batch_idx: 62470, batch size: 106, loss[dur_loss=0.1753, prior_loss=0.9694, diff_loss=0.3064, tot_loss=1.451, over 106.00 samples.], tot_loss[dur_loss=0.1731, prior_loss=0.9689, diff_loss=0.3857, tot_loss=1.528, over 1142.00 samples.], 2024-10-22 03:05:45,992 INFO [train.py:682] (1/4) Start epoch 3906 2024-10-22 03:05:55,304 INFO [train.py:561] (1/4) Epoch 3906, batch 0, global_batch_idx: 62480, batch size: 108, loss[dur_loss=0.176, prior_loss=0.9699, diff_loss=0.2977, tot_loss=1.444, over 108.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9699, diff_loss=0.2977, tot_loss=1.444, over 108.00 samples.], 2024-10-22 03:06:10,030 INFO [train.py:561] (1/4) Epoch 3906, batch 10, global_batch_idx: 62490, batch size: 111, loss[dur_loss=0.1773, prior_loss=0.9704, diff_loss=0.2975, tot_loss=1.445, 
over 111.00 samples.], tot_loss[dur_loss=0.1749, prior_loss=0.9691, diff_loss=0.3623, tot_loss=1.506, over 1656.00 samples.], 2024-10-22 03:06:17,293 INFO [train.py:682] (1/4) Start epoch 3907 2024-10-22 03:06:31,615 INFO [train.py:561] (1/4) Epoch 3907, batch 4, global_batch_idx: 62500, batch size: 189, loss[dur_loss=0.1782, prior_loss=0.9696, diff_loss=0.3104, tot_loss=1.458, over 189.00 samples.], tot_loss[dur_loss=0.1747, prior_loss=0.9688, diff_loss=0.4041, tot_loss=1.548, over 937.00 samples.], 2024-10-22 03:06:46,882 INFO [train.py:561] (1/4) Epoch 3907, batch 14, global_batch_idx: 62510, batch size: 142, loss[dur_loss=0.181, prior_loss=0.9693, diff_loss=0.2873, tot_loss=1.438, over 142.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9693, diff_loss=0.3258, tot_loss=1.471, over 2210.00 samples.], 2024-10-22 03:06:48,353 INFO [train.py:682] (1/4) Start epoch 3908 2024-10-22 03:07:08,798 INFO [train.py:561] (1/4) Epoch 3908, batch 8, global_batch_idx: 62520, batch size: 170, loss[dur_loss=0.1772, prior_loss=0.9696, diff_loss=0.2836, tot_loss=1.43, over 170.00 samples.], tot_loss[dur_loss=0.1744, prior_loss=0.9691, diff_loss=0.368, tot_loss=1.511, over 1432.00 samples.], 2024-10-22 03:07:19,126 INFO [train.py:682] (1/4) Start epoch 3909 2024-10-22 03:07:31,199 INFO [train.py:561] (1/4) Epoch 3909, batch 2, global_batch_idx: 62530, batch size: 203, loss[dur_loss=0.1762, prior_loss=0.9695, diff_loss=0.3394, tot_loss=1.485, over 203.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9695, diff_loss=0.3069, tot_loss=1.452, over 442.00 samples.], 2024-10-22 03:07:45,709 INFO [train.py:561] (1/4) Epoch 3909, batch 12, global_batch_idx: 62540, batch size: 152, loss[dur_loss=0.1745, prior_loss=0.9695, diff_loss=0.2949, tot_loss=1.439, over 152.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.9692, diff_loss=0.3506, tot_loss=1.495, over 1966.00 samples.], 2024-10-22 03:07:50,250 INFO [train.py:682] (1/4) Start epoch 3910 2024-10-22 03:08:08,139 INFO [train.py:561] (1/4) Epoch 3910, batch 6, global_batch_idx: 62550, batch size: 106, loss[dur_loss=0.1756, prior_loss=0.9693, diff_loss=0.2839, tot_loss=1.429, over 106.00 samples.], tot_loss[dur_loss=0.1739, prior_loss=0.9689, diff_loss=0.3829, tot_loss=1.526, over 1142.00 samples.], 2024-10-22 03:08:21,437 INFO [train.py:682] (1/4) Start epoch 3911 2024-10-22 03:08:30,806 INFO [train.py:561] (1/4) Epoch 3911, batch 0, global_batch_idx: 62560, batch size: 108, loss[dur_loss=0.1787, prior_loss=0.9701, diff_loss=0.3207, tot_loss=1.469, over 108.00 samples.], tot_loss[dur_loss=0.1787, prior_loss=0.9701, diff_loss=0.3207, tot_loss=1.469, over 108.00 samples.], 2024-10-22 03:08:45,420 INFO [train.py:561] (1/4) Epoch 3911, batch 10, global_batch_idx: 62570, batch size: 111, loss[dur_loss=0.1757, prior_loss=0.9702, diff_loss=0.3208, tot_loss=1.467, over 111.00 samples.], tot_loss[dur_loss=0.1749, prior_loss=0.9693, diff_loss=0.3538, tot_loss=1.498, over 1656.00 samples.], 2024-10-22 03:08:52,704 INFO [train.py:682] (1/4) Start epoch 3912 2024-10-22 03:09:07,077 INFO [train.py:561] (1/4) Epoch 3912, batch 4, global_batch_idx: 62580, batch size: 189, loss[dur_loss=0.1765, prior_loss=0.9695, diff_loss=0.3295, tot_loss=1.476, over 189.00 samples.], tot_loss[dur_loss=0.175, prior_loss=0.9688, diff_loss=0.4143, tot_loss=1.558, over 937.00 samples.], 2024-10-22 03:09:22,442 INFO [train.py:561] (1/4) Epoch 3912, batch 14, global_batch_idx: 62590, batch size: 142, loss[dur_loss=0.1753, prior_loss=0.9692, diff_loss=0.2862, tot_loss=1.431, over 142.00 
samples.], tot_loss[dur_loss=0.1758, prior_loss=0.9692, diff_loss=0.3476, tot_loss=1.493, over 2210.00 samples.], 2024-10-22 03:09:23,901 INFO [train.py:682] (1/4) Start epoch 3913 2024-10-22 03:09:44,545 INFO [train.py:561] (1/4) Epoch 3913, batch 8, global_batch_idx: 62600, batch size: 170, loss[dur_loss=0.1793, prior_loss=0.9697, diff_loss=0.3014, tot_loss=1.45, over 170.00 samples.], tot_loss[dur_loss=0.1755, prior_loss=0.9691, diff_loss=0.364, tot_loss=1.509, over 1432.00 samples.], 2024-10-22 03:09:55,033 INFO [train.py:682] (1/4) Start epoch 3914 2024-10-22 03:10:07,253 INFO [train.py:561] (1/4) Epoch 3914, batch 2, global_batch_idx: 62610, batch size: 203, loss[dur_loss=0.1784, prior_loss=0.9695, diff_loss=0.3014, tot_loss=1.449, over 203.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9695, diff_loss=0.2762, tot_loss=1.424, over 442.00 samples.], 2024-10-22 03:10:22,014 INFO [train.py:561] (1/4) Epoch 3914, batch 12, global_batch_idx: 62620, batch size: 152, loss[dur_loss=0.1742, prior_loss=0.9698, diff_loss=0.3172, tot_loss=1.461, over 152.00 samples.], tot_loss[dur_loss=0.1757, prior_loss=0.9694, diff_loss=0.3467, tot_loss=1.492, over 1966.00 samples.], 2024-10-22 03:10:26,556 INFO [train.py:682] (1/4) Start epoch 3915 2024-10-22 03:10:44,420 INFO [train.py:561] (1/4) Epoch 3915, batch 6, global_batch_idx: 62630, batch size: 106, loss[dur_loss=0.1765, prior_loss=0.9693, diff_loss=0.2493, tot_loss=1.395, over 106.00 samples.], tot_loss[dur_loss=0.1738, prior_loss=0.969, diff_loss=0.3701, tot_loss=1.513, over 1142.00 samples.], 2024-10-22 03:10:57,649 INFO [train.py:682] (1/4) Start epoch 3916 2024-10-22 03:11:06,647 INFO [train.py:561] (1/4) Epoch 3916, batch 0, global_batch_idx: 62640, batch size: 108, loss[dur_loss=0.1787, prior_loss=0.97, diff_loss=0.2805, tot_loss=1.429, over 108.00 samples.], tot_loss[dur_loss=0.1787, prior_loss=0.97, diff_loss=0.2805, tot_loss=1.429, over 108.00 samples.], 2024-10-22 03:11:21,137 INFO [train.py:561] (1/4) Epoch 3916, batch 10, global_batch_idx: 62650, batch size: 111, loss[dur_loss=0.1809, prior_loss=0.9706, diff_loss=0.3198, tot_loss=1.471, over 111.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9693, diff_loss=0.3569, tot_loss=1.503, over 1656.00 samples.], 2024-10-22 03:11:28,371 INFO [train.py:682] (1/4) Start epoch 3917 2024-10-22 03:11:42,797 INFO [train.py:561] (1/4) Epoch 3917, batch 4, global_batch_idx: 62660, batch size: 189, loss[dur_loss=0.177, prior_loss=0.9694, diff_loss=0.3002, tot_loss=1.447, over 189.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.9688, diff_loss=0.3901, tot_loss=1.534, over 937.00 samples.], 2024-10-22 03:11:58,072 INFO [train.py:561] (1/4) Epoch 3917, batch 14, global_batch_idx: 62670, batch size: 142, loss[dur_loss=0.1781, prior_loss=0.9692, diff_loss=0.2974, tot_loss=1.445, over 142.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9693, diff_loss=0.3312, tot_loss=1.477, over 2210.00 samples.], 2024-10-22 03:11:59,523 INFO [train.py:682] (1/4) Start epoch 3918 2024-10-22 03:12:20,150 INFO [train.py:561] (1/4) Epoch 3918, batch 8, global_batch_idx: 62680, batch size: 170, loss[dur_loss=0.1774, prior_loss=0.9696, diff_loss=0.2936, tot_loss=1.441, over 170.00 samples.], tot_loss[dur_loss=0.1742, prior_loss=0.969, diff_loss=0.3582, tot_loss=1.501, over 1432.00 samples.], 2024-10-22 03:12:30,483 INFO [train.py:682] (1/4) Start epoch 3919 2024-10-22 03:12:42,534 INFO [train.py:561] (1/4) Epoch 3919, batch 2, global_batch_idx: 62690, batch size: 203, loss[dur_loss=0.1772, prior_loss=0.9695, 
diff_loss=0.3086, tot_loss=1.455, over 203.00 samples.], tot_loss[dur_loss=0.1765, prior_loss=0.9695, diff_loss=0.2874, tot_loss=1.433, over 442.00 samples.], 2024-10-22 03:12:57,009 INFO [train.py:561] (1/4) Epoch 3919, batch 12, global_batch_idx: 62700, batch size: 152, loss[dur_loss=0.1788, prior_loss=0.9696, diff_loss=0.2869, tot_loss=1.435, over 152.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9692, diff_loss=0.3501, tot_loss=1.495, over 1966.00 samples.], 2024-10-22 03:13:01,552 INFO [train.py:682] (1/4) Start epoch 3920 2024-10-22 03:13:18,997 INFO [train.py:561] (1/4) Epoch 3920, batch 6, global_batch_idx: 62710, batch size: 106, loss[dur_loss=0.1734, prior_loss=0.9693, diff_loss=0.2815, tot_loss=1.424, over 106.00 samples.], tot_loss[dur_loss=0.1737, prior_loss=0.9689, diff_loss=0.3627, tot_loss=1.505, over 1142.00 samples.], 2024-10-22 03:13:32,386 INFO [train.py:682] (1/4) Start epoch 3921 2024-10-22 03:13:41,830 INFO [train.py:561] (1/4) Epoch 3921, batch 0, global_batch_idx: 62720, batch size: 108, loss[dur_loss=0.1813, prior_loss=0.97, diff_loss=0.3065, tot_loss=1.458, over 108.00 samples.], tot_loss[dur_loss=0.1813, prior_loss=0.97, diff_loss=0.3065, tot_loss=1.458, over 108.00 samples.], 2024-10-22 03:13:56,369 INFO [train.py:561] (1/4) Epoch 3921, batch 10, global_batch_idx: 62730, batch size: 111, loss[dur_loss=0.1799, prior_loss=0.9704, diff_loss=0.2784, tot_loss=1.429, over 111.00 samples.], tot_loss[dur_loss=0.1757, prior_loss=0.9692, diff_loss=0.3475, tot_loss=1.492, over 1656.00 samples.], 2024-10-22 03:14:03,622 INFO [train.py:682] (1/4) Start epoch 3922 2024-10-22 03:14:17,874 INFO [train.py:561] (1/4) Epoch 3922, batch 4, global_batch_idx: 62740, batch size: 189, loss[dur_loss=0.175, prior_loss=0.9697, diff_loss=0.3101, tot_loss=1.455, over 189.00 samples.], tot_loss[dur_loss=0.1732, prior_loss=0.9688, diff_loss=0.4008, tot_loss=1.543, over 937.00 samples.], 2024-10-22 03:14:33,129 INFO [train.py:561] (1/4) Epoch 3922, batch 14, global_batch_idx: 62750, batch size: 142, loss[dur_loss=0.1766, prior_loss=0.9694, diff_loss=0.3132, tot_loss=1.459, over 142.00 samples.], tot_loss[dur_loss=0.1758, prior_loss=0.9693, diff_loss=0.3357, tot_loss=1.481, over 2210.00 samples.], 2024-10-22 03:14:34,615 INFO [train.py:682] (1/4) Start epoch 3923 2024-10-22 03:14:55,216 INFO [train.py:561] (1/4) Epoch 3923, batch 8, global_batch_idx: 62760, batch size: 170, loss[dur_loss=0.1785, prior_loss=0.9698, diff_loss=0.3216, tot_loss=1.47, over 170.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9692, diff_loss=0.3604, tot_loss=1.506, over 1432.00 samples.], 2024-10-22 03:15:05,643 INFO [train.py:682] (1/4) Start epoch 3924 2024-10-22 03:15:17,717 INFO [train.py:561] (1/4) Epoch 3924, batch 2, global_batch_idx: 62770, batch size: 203, loss[dur_loss=0.1794, prior_loss=0.9695, diff_loss=0.3406, tot_loss=1.489, over 203.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9695, diff_loss=0.3065, tot_loss=1.454, over 442.00 samples.], 2024-10-22 03:15:32,383 INFO [train.py:561] (1/4) Epoch 3924, batch 12, global_batch_idx: 62780, batch size: 152, loss[dur_loss=0.1758, prior_loss=0.9696, diff_loss=0.2838, tot_loss=1.429, over 152.00 samples.], tot_loss[dur_loss=0.1762, prior_loss=0.9692, diff_loss=0.3526, tot_loss=1.498, over 1966.00 samples.], 2024-10-22 03:15:36,961 INFO [train.py:682] (1/4) Start epoch 3925 2024-10-22 03:15:54,603 INFO [train.py:561] (1/4) Epoch 3925, batch 6, global_batch_idx: 62790, batch size: 106, loss[dur_loss=0.1764, prior_loss=0.9695, diff_loss=0.2944, 
tot_loss=1.44, over 106.00 samples.], tot_loss[dur_loss=0.1742, prior_loss=0.969, diff_loss=0.3888, tot_loss=1.532, over 1142.00 samples.], 2024-10-22 03:16:08,046 INFO [train.py:682] (1/4) Start epoch 3926 2024-10-22 03:16:17,396 INFO [train.py:561] (1/4) Epoch 3926, batch 0, global_batch_idx: 62800, batch size: 108, loss[dur_loss=0.1771, prior_loss=0.9699, diff_loss=0.2659, tot_loss=1.413, over 108.00 samples.], tot_loss[dur_loss=0.1771, prior_loss=0.9699, diff_loss=0.2659, tot_loss=1.413, over 108.00 samples.], 2024-10-22 03:16:31,977 INFO [train.py:561] (1/4) Epoch 3926, batch 10, global_batch_idx: 62810, batch size: 111, loss[dur_loss=0.179, prior_loss=0.9704, diff_loss=0.3193, tot_loss=1.469, over 111.00 samples.], tot_loss[dur_loss=0.1751, prior_loss=0.9692, diff_loss=0.349, tot_loss=1.493, over 1656.00 samples.], 2024-10-22 03:16:39,249 INFO [train.py:682] (1/4) Start epoch 3927 2024-10-22 03:16:53,347 INFO [train.py:561] (1/4) Epoch 3927, batch 4, global_batch_idx: 62820, batch size: 189, loss[dur_loss=0.1787, prior_loss=0.9697, diff_loss=0.3462, tot_loss=1.495, over 189.00 samples.], tot_loss[dur_loss=0.1742, prior_loss=0.9688, diff_loss=0.4209, tot_loss=1.564, over 937.00 samples.], 2024-10-22 03:17:08,744 INFO [train.py:561] (1/4) Epoch 3927, batch 14, global_batch_idx: 62830, batch size: 142, loss[dur_loss=0.179, prior_loss=0.9695, diff_loss=0.2703, tot_loss=1.419, over 142.00 samples.], tot_loss[dur_loss=0.1763, prior_loss=0.9693, diff_loss=0.342, tot_loss=1.488, over 2210.00 samples.], 2024-10-22 03:17:10,213 INFO [train.py:682] (1/4) Start epoch 3928 2024-10-22 03:17:30,790 INFO [train.py:561] (1/4) Epoch 3928, batch 8, global_batch_idx: 62840, batch size: 170, loss[dur_loss=0.1799, prior_loss=0.9697, diff_loss=0.3185, tot_loss=1.468, over 170.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9691, diff_loss=0.37, tot_loss=1.515, over 1432.00 samples.], 2024-10-22 03:17:41,196 INFO [train.py:682] (1/4) Start epoch 3929 2024-10-22 03:17:53,444 INFO [train.py:561] (1/4) Epoch 3929, batch 2, global_batch_idx: 62850, batch size: 203, loss[dur_loss=0.178, prior_loss=0.9694, diff_loss=0.3536, tot_loss=1.501, over 203.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9696, diff_loss=0.3194, tot_loss=1.467, over 442.00 samples.], 2024-10-22 03:18:07,970 INFO [train.py:561] (1/4) Epoch 3929, batch 12, global_batch_idx: 62860, batch size: 152, loss[dur_loss=0.1785, prior_loss=0.9696, diff_loss=0.2606, tot_loss=1.409, over 152.00 samples.], tot_loss[dur_loss=0.1762, prior_loss=0.9693, diff_loss=0.3425, tot_loss=1.488, over 1966.00 samples.], 2024-10-22 03:18:12,533 INFO [train.py:682] (1/4) Start epoch 3930 2024-10-22 03:18:30,044 INFO [train.py:561] (1/4) Epoch 3930, batch 6, global_batch_idx: 62870, batch size: 106, loss[dur_loss=0.1751, prior_loss=0.9694, diff_loss=0.2832, tot_loss=1.428, over 106.00 samples.], tot_loss[dur_loss=0.1742, prior_loss=0.969, diff_loss=0.3913, tot_loss=1.535, over 1142.00 samples.], 2024-10-22 03:18:43,523 INFO [train.py:682] (1/4) Start epoch 3931 2024-10-22 03:18:52,747 INFO [train.py:561] (1/4) Epoch 3931, batch 0, global_batch_idx: 62880, batch size: 108, loss[dur_loss=0.1782, prior_loss=0.9699, diff_loss=0.245, tot_loss=1.393, over 108.00 samples.], tot_loss[dur_loss=0.1782, prior_loss=0.9699, diff_loss=0.245, tot_loss=1.393, over 108.00 samples.], 2024-10-22 03:19:07,354 INFO [train.py:561] (1/4) Epoch 3931, batch 10, global_batch_idx: 62890, batch size: 111, loss[dur_loss=0.1787, prior_loss=0.9705, diff_loss=0.2865, tot_loss=1.436, over 
111.00 samples.], tot_loss[dur_loss=0.1759, prior_loss=0.9692, diff_loss=0.3599, tot_loss=1.505, over 1656.00 samples.], 2024-10-22 03:19:14,680 INFO [train.py:682] (1/4) Start epoch 3932 2024-10-22 03:19:28,861 INFO [train.py:561] (1/4) Epoch 3932, batch 4, global_batch_idx: 62900, batch size: 189, loss[dur_loss=0.1755, prior_loss=0.9693, diff_loss=0.3268, tot_loss=1.472, over 189.00 samples.], tot_loss[dur_loss=0.1736, prior_loss=0.9687, diff_loss=0.4168, tot_loss=1.559, over 937.00 samples.], 2024-10-22 03:19:44,208 INFO [train.py:561] (1/4) Epoch 3932, batch 14, global_batch_idx: 62910, batch size: 142, loss[dur_loss=0.1764, prior_loss=0.9693, diff_loss=0.2907, tot_loss=1.436, over 142.00 samples.], tot_loss[dur_loss=0.1758, prior_loss=0.9692, diff_loss=0.3445, tot_loss=1.489, over 2210.00 samples.], 2024-10-22 03:19:45,646 INFO [train.py:682] (1/4) Start epoch 3933 2024-10-22 03:20:06,371 INFO [train.py:561] (1/4) Epoch 3933, batch 8, global_batch_idx: 62920, batch size: 170, loss[dur_loss=0.1796, prior_loss=0.9696, diff_loss=0.2673, tot_loss=1.416, over 170.00 samples.], tot_loss[dur_loss=0.1748, prior_loss=0.9691, diff_loss=0.3601, tot_loss=1.504, over 1432.00 samples.], 2024-10-22 03:20:16,722 INFO [train.py:682] (1/4) Start epoch 3934 2024-10-22 03:20:28,299 INFO [train.py:561] (1/4) Epoch 3934, batch 2, global_batch_idx: 62930, batch size: 203, loss[dur_loss=0.1787, prior_loss=0.9696, diff_loss=0.3521, tot_loss=1.5, over 203.00 samples.], tot_loss[dur_loss=0.1777, prior_loss=0.9695, diff_loss=0.3177, tot_loss=1.465, over 442.00 samples.], 2024-10-22 03:20:42,744 INFO [train.py:561] (1/4) Epoch 3934, batch 12, global_batch_idx: 62940, batch size: 152, loss[dur_loss=0.1751, prior_loss=0.9694, diff_loss=0.3177, tot_loss=1.462, over 152.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9692, diff_loss=0.3479, tot_loss=1.493, over 1966.00 samples.], 2024-10-22 03:20:47,316 INFO [train.py:682] (1/4) Start epoch 3935 2024-10-22 03:21:04,884 INFO [train.py:561] (1/4) Epoch 3935, batch 6, global_batch_idx: 62950, batch size: 106, loss[dur_loss=0.1728, prior_loss=0.9693, diff_loss=0.2663, tot_loss=1.408, over 106.00 samples.], tot_loss[dur_loss=0.1735, prior_loss=0.9689, diff_loss=0.3947, tot_loss=1.537, over 1142.00 samples.], 2024-10-22 03:21:18,265 INFO [train.py:682] (1/4) Start epoch 3936 2024-10-22 03:21:27,578 INFO [train.py:561] (1/4) Epoch 3936, batch 0, global_batch_idx: 62960, batch size: 108, loss[dur_loss=0.1784, prior_loss=0.9697, diff_loss=0.3236, tot_loss=1.472, over 108.00 samples.], tot_loss[dur_loss=0.1784, prior_loss=0.9697, diff_loss=0.3236, tot_loss=1.472, over 108.00 samples.], 2024-10-22 03:21:41,958 INFO [train.py:561] (1/4) Epoch 3936, batch 10, global_batch_idx: 62970, batch size: 111, loss[dur_loss=0.1779, prior_loss=0.9704, diff_loss=0.3038, tot_loss=1.452, over 111.00 samples.], tot_loss[dur_loss=0.1758, prior_loss=0.9692, diff_loss=0.3621, tot_loss=1.507, over 1656.00 samples.], 2024-10-22 03:21:49,077 INFO [train.py:682] (1/4) Start epoch 3937 2024-10-22 03:22:03,533 INFO [train.py:561] (1/4) Epoch 3937, batch 4, global_batch_idx: 62980, batch size: 189, loss[dur_loss=0.1771, prior_loss=0.9696, diff_loss=0.3386, tot_loss=1.485, over 189.00 samples.], tot_loss[dur_loss=0.175, prior_loss=0.9688, diff_loss=0.4217, tot_loss=1.566, over 937.00 samples.], 2024-10-22 03:22:18,684 INFO [train.py:561] (1/4) Epoch 3937, batch 14, global_batch_idx: 62990, batch size: 142, loss[dur_loss=0.1791, prior_loss=0.9694, diff_loss=0.258, tot_loss=1.407, over 142.00 samples.], 
tot_loss[dur_loss=0.1765, prior_loss=0.9693, diff_loss=0.3426, tot_loss=1.488, over 2210.00 samples.], 2024-10-22 03:22:20,131 INFO [train.py:682] (1/4) Start epoch 3938 2024-10-22 03:22:40,362 INFO [train.py:561] (1/4) Epoch 3938, batch 8, global_batch_idx: 63000, batch size: 170, loss[dur_loss=0.1767, prior_loss=0.9695, diff_loss=0.2989, tot_loss=1.445, over 170.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9691, diff_loss=0.3586, tot_loss=1.504, over 1432.00 samples.], 2024-10-22 03:22:41,843 INFO [train.py:579] (1/4) Computing validation loss 2024-10-22 03:22:43,945 INFO [train.py:589] (1/4) Epoch 3938, validation: dur_loss=0.4615, prior_loss=1.037, diff_loss=0.375, tot_loss=1.874, over 100.00 samples. 2024-10-22 03:22:43,945 INFO [train.py:590] (1/4) Maximum memory allocated so far is 21049MB 2024-10-22 03:22:52,684 INFO [train.py:682] (1/4) Start epoch 3939 2024-10-22 03:23:04,710 INFO [train.py:561] (1/4) Epoch 3939, batch 2, global_batch_idx: 63010, batch size: 203, loss[dur_loss=0.1798, prior_loss=0.9694, diff_loss=0.3391, tot_loss=1.488, over 203.00 samples.], tot_loss[dur_loss=0.1771, prior_loss=0.9695, diff_loss=0.2949, tot_loss=1.442, over 442.00 samples.], 2024-10-22 03:23:19,090 INFO [train.py:561] (1/4) Epoch 3939, batch 12, global_batch_idx: 63020, batch size: 152, loss[dur_loss=0.1748, prior_loss=0.9696, diff_loss=0.3399, tot_loss=1.484, over 152.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9692, diff_loss=0.3498, tot_loss=1.495, over 1966.00 samples.], 2024-10-22 03:23:23,603 INFO [train.py:682] (1/4) Start epoch 3940 2024-10-22 03:23:41,457 INFO [train.py:561] (1/4) Epoch 3940, batch 6, global_batch_idx: 63030, batch size: 106, loss[dur_loss=0.1768, prior_loss=0.9695, diff_loss=0.274, tot_loss=1.42, over 106.00 samples.], tot_loss[dur_loss=0.1757, prior_loss=0.969, diff_loss=0.3722, tot_loss=1.517, over 1142.00 samples.], 2024-10-22 03:23:54,769 INFO [train.py:682] (1/4) Start epoch 3941 2024-10-22 03:24:04,132 INFO [train.py:561] (1/4) Epoch 3941, batch 0, global_batch_idx: 63040, batch size: 108, loss[dur_loss=0.1773, prior_loss=0.9699, diff_loss=0.3085, tot_loss=1.456, over 108.00 samples.], tot_loss[dur_loss=0.1773, prior_loss=0.9699, diff_loss=0.3085, tot_loss=1.456, over 108.00 samples.], 2024-10-22 03:24:18,737 INFO [train.py:561] (1/4) Epoch 3941, batch 10, global_batch_idx: 63050, batch size: 111, loss[dur_loss=0.1797, prior_loss=0.9706, diff_loss=0.3077, tot_loss=1.458, over 111.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9693, diff_loss=0.3688, tot_loss=1.514, over 1656.00 samples.], 2024-10-22 03:24:25,950 INFO [train.py:682] (1/4) Start epoch 3942 2024-10-22 03:24:40,337 INFO [train.py:561] (1/4) Epoch 3942, batch 4, global_batch_idx: 63060, batch size: 189, loss[dur_loss=0.1786, prior_loss=0.9695, diff_loss=0.3111, tot_loss=1.459, over 189.00 samples.], tot_loss[dur_loss=0.1746, prior_loss=0.9688, diff_loss=0.4044, tot_loss=1.548, over 937.00 samples.], 2024-10-22 03:24:55,424 INFO [train.py:561] (1/4) Epoch 3942, batch 14, global_batch_idx: 63070, batch size: 142, loss[dur_loss=0.177, prior_loss=0.9693, diff_loss=0.3123, tot_loss=1.459, over 142.00 samples.], tot_loss[dur_loss=0.1763, prior_loss=0.9692, diff_loss=0.3409, tot_loss=1.486, over 2210.00 samples.], 2024-10-22 03:24:56,880 INFO [train.py:682] (1/4) Start epoch 3943 2024-10-22 03:25:17,641 INFO [train.py:561] (1/4) Epoch 3943, batch 8, global_batch_idx: 63080, batch size: 170, loss[dur_loss=0.1805, prior_loss=0.9697, diff_loss=0.3091, tot_loss=1.459, over 170.00 samples.], 
tot_loss[dur_loss=0.1754, prior_loss=0.969, diff_loss=0.364, tot_loss=1.508, over 1432.00 samples.], 2024-10-22 03:25:27,828 INFO [train.py:682] (1/4) Start epoch 3944 2024-10-22 03:25:40,195 INFO [train.py:561] (1/4) Epoch 3944, batch 2, global_batch_idx: 63090, batch size: 203, loss[dur_loss=0.1766, prior_loss=0.9694, diff_loss=0.3247, tot_loss=1.471, over 203.00 samples.], tot_loss[dur_loss=0.1773, prior_loss=0.9695, diff_loss=0.2979, tot_loss=1.445, over 442.00 samples.], 2024-10-22 03:25:54,764 INFO [train.py:561] (1/4) Epoch 3944, batch 12, global_batch_idx: 63100, batch size: 152, loss[dur_loss=0.1772, prior_loss=0.9696, diff_loss=0.2978, tot_loss=1.445, over 152.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9692, diff_loss=0.3394, tot_loss=1.485, over 1966.00 samples.], 2024-10-22 03:25:59,292 INFO [train.py:682] (1/4) Start epoch 3945 2024-10-22 03:26:16,831 INFO [train.py:561] (1/4) Epoch 3945, batch 6, global_batch_idx: 63110, batch size: 106, loss[dur_loss=0.1753, prior_loss=0.9694, diff_loss=0.3032, tot_loss=1.448, over 106.00 samples.], tot_loss[dur_loss=0.1751, prior_loss=0.969, diff_loss=0.3904, tot_loss=1.535, over 1142.00 samples.], 2024-10-22 03:26:30,033 INFO [train.py:682] (1/4) Start epoch 3946 2024-10-22 03:26:39,083 INFO [train.py:561] (1/4) Epoch 3946, batch 0, global_batch_idx: 63120, batch size: 108, loss[dur_loss=0.1773, prior_loss=0.9698, diff_loss=0.2659, tot_loss=1.413, over 108.00 samples.], tot_loss[dur_loss=0.1773, prior_loss=0.9698, diff_loss=0.2659, tot_loss=1.413, over 108.00 samples.], 2024-10-22 03:26:53,436 INFO [train.py:561] (1/4) Epoch 3946, batch 10, global_batch_idx: 63130, batch size: 111, loss[dur_loss=0.1761, prior_loss=0.9703, diff_loss=0.2792, tot_loss=1.426, over 111.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9692, diff_loss=0.3407, tot_loss=1.486, over 1656.00 samples.], 2024-10-22 03:27:00,579 INFO [train.py:682] (1/4) Start epoch 3947 2024-10-22 03:27:14,981 INFO [train.py:561] (1/4) Epoch 3947, batch 4, global_batch_idx: 63140, batch size: 189, loss[dur_loss=0.1766, prior_loss=0.9698, diff_loss=0.309, tot_loss=1.455, over 189.00 samples.], tot_loss[dur_loss=0.1731, prior_loss=0.9688, diff_loss=0.4043, tot_loss=1.546, over 937.00 samples.], 2024-10-22 03:27:29,981 INFO [train.py:561] (1/4) Epoch 3947, batch 14, global_batch_idx: 63150, batch size: 142, loss[dur_loss=0.1769, prior_loss=0.9691, diff_loss=0.2841, tot_loss=1.43, over 142.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9693, diff_loss=0.3391, tot_loss=1.484, over 2210.00 samples.], 2024-10-22 03:27:31,431 INFO [train.py:682] (1/4) Start epoch 3948 2024-10-22 03:27:52,114 INFO [train.py:561] (1/4) Epoch 3948, batch 8, global_batch_idx: 63160, batch size: 170, loss[dur_loss=0.176, prior_loss=0.9695, diff_loss=0.3116, tot_loss=1.457, over 170.00 samples.], tot_loss[dur_loss=0.1748, prior_loss=0.9691, diff_loss=0.3665, tot_loss=1.51, over 1432.00 samples.], 2024-10-22 03:28:02,312 INFO [train.py:682] (1/4) Start epoch 3949 2024-10-22 03:28:14,116 INFO [train.py:561] (1/4) Epoch 3949, batch 2, global_batch_idx: 63170, batch size: 203, loss[dur_loss=0.1817, prior_loss=0.9696, diff_loss=0.3321, tot_loss=1.483, over 203.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9695, diff_loss=0.3183, tot_loss=1.467, over 442.00 samples.], 2024-10-22 03:28:28,468 INFO [train.py:561] (1/4) Epoch 3949, batch 12, global_batch_idx: 63180, batch size: 152, loss[dur_loss=0.179, prior_loss=0.9697, diff_loss=0.326, tot_loss=1.475, over 152.00 samples.], 
tot_loss[dur_loss=0.1764, prior_loss=0.9692, diff_loss=0.3518, tot_loss=1.497, over 1966.00 samples.], 2024-10-22 03:28:32,939 INFO [train.py:682] (1/4) Start epoch 3950 2024-10-22 03:28:50,647 INFO [train.py:561] (1/4) Epoch 3950, batch 6, global_batch_idx: 63190, batch size: 106, loss[dur_loss=0.1741, prior_loss=0.9692, diff_loss=0.2576, tot_loss=1.401, over 106.00 samples.], tot_loss[dur_loss=0.1743, prior_loss=0.9689, diff_loss=0.3855, tot_loss=1.529, over 1142.00 samples.], 2024-10-22 03:29:04,004 INFO [train.py:682] (1/4) Start epoch 3951 2024-10-22 03:29:13,603 INFO [train.py:561] (1/4) Epoch 3951, batch 0, global_batch_idx: 63200, batch size: 108, loss[dur_loss=0.1781, prior_loss=0.9697, diff_loss=0.287, tot_loss=1.435, over 108.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9697, diff_loss=0.287, tot_loss=1.435, over 108.00 samples.], 2024-10-22 03:29:28,109 INFO [train.py:561] (1/4) Epoch 3951, batch 10, global_batch_idx: 63210, batch size: 111, loss[dur_loss=0.1773, prior_loss=0.9704, diff_loss=0.3035, tot_loss=1.451, over 111.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9692, diff_loss=0.3504, tot_loss=1.495, over 1656.00 samples.], 2024-10-22 03:29:35,367 INFO [train.py:682] (1/4) Start epoch 3952 2024-10-22 03:29:49,773 INFO [train.py:561] (1/4) Epoch 3952, batch 4, global_batch_idx: 63220, batch size: 189, loss[dur_loss=0.1772, prior_loss=0.9696, diff_loss=0.2936, tot_loss=1.44, over 189.00 samples.], tot_loss[dur_loss=0.1735, prior_loss=0.9687, diff_loss=0.4049, tot_loss=1.547, over 937.00 samples.], 2024-10-22 03:30:04,816 INFO [train.py:561] (1/4) Epoch 3952, batch 14, global_batch_idx: 63230, batch size: 142, loss[dur_loss=0.1761, prior_loss=0.9691, diff_loss=0.2807, tot_loss=1.426, over 142.00 samples.], tot_loss[dur_loss=0.1759, prior_loss=0.9692, diff_loss=0.3391, tot_loss=1.484, over 2210.00 samples.], 2024-10-22 03:30:06,255 INFO [train.py:682] (1/4) Start epoch 3953 2024-10-22 03:30:26,826 INFO [train.py:561] (1/4) Epoch 3953, batch 8, global_batch_idx: 63240, batch size: 170, loss[dur_loss=0.1791, prior_loss=0.9696, diff_loss=0.2796, tot_loss=1.428, over 170.00 samples.], tot_loss[dur_loss=0.1759, prior_loss=0.9691, diff_loss=0.3727, tot_loss=1.518, over 1432.00 samples.], 2024-10-22 03:30:37,051 INFO [train.py:682] (1/4) Start epoch 3954 2024-10-22 03:30:48,646 INFO [train.py:561] (1/4) Epoch 3954, batch 2, global_batch_idx: 63250, batch size: 203, loss[dur_loss=0.1788, prior_loss=0.9693, diff_loss=0.3359, tot_loss=1.484, over 203.00 samples.], tot_loss[dur_loss=0.1781, prior_loss=0.9694, diff_loss=0.3072, tot_loss=1.455, over 442.00 samples.], 2024-10-22 03:31:02,988 INFO [train.py:561] (1/4) Epoch 3954, batch 12, global_batch_idx: 63260, batch size: 152, loss[dur_loss=0.1797, prior_loss=0.9697, diff_loss=0.273, tot_loss=1.422, over 152.00 samples.], tot_loss[dur_loss=0.1769, prior_loss=0.9693, diff_loss=0.3475, tot_loss=1.494, over 1966.00 samples.], 2024-10-22 03:31:07,483 INFO [train.py:682] (1/4) Start epoch 3955 2024-10-22 03:31:24,999 INFO [train.py:561] (1/4) Epoch 3955, batch 6, global_batch_idx: 63270, batch size: 106, loss[dur_loss=0.1745, prior_loss=0.9693, diff_loss=0.2385, tot_loss=1.382, over 106.00 samples.], tot_loss[dur_loss=0.1745, prior_loss=0.9689, diff_loss=0.3797, tot_loss=1.523, over 1142.00 samples.], 2024-10-22 03:31:38,145 INFO [train.py:682] (1/4) Start epoch 3956 2024-10-22 03:31:47,124 INFO [train.py:561] (1/4) Epoch 3956, batch 0, global_batch_idx: 63280, batch size: 108, loss[dur_loss=0.1772, prior_loss=0.9699, 
diff_loss=0.3056, tot_loss=1.453, over 108.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9699, diff_loss=0.3056, tot_loss=1.453, over 108.00 samples.], 2024-10-22 03:32:01,478 INFO [train.py:561] (1/4) Epoch 3956, batch 10, global_batch_idx: 63290, batch size: 111, loss[dur_loss=0.176, prior_loss=0.9703, diff_loss=0.2362, tot_loss=1.383, over 111.00 samples.], tot_loss[dur_loss=0.1759, prior_loss=0.9692, diff_loss=0.3499, tot_loss=1.495, over 1656.00 samples.], 2024-10-22 03:32:08,709 INFO [train.py:682] (1/4) Start epoch 3957 2024-10-22 03:32:23,189 INFO [train.py:561] (1/4) Epoch 3957, batch 4, global_batch_idx: 63300, batch size: 189, loss[dur_loss=0.1773, prior_loss=0.9694, diff_loss=0.3138, tot_loss=1.461, over 189.00 samples.], tot_loss[dur_loss=0.1738, prior_loss=0.9686, diff_loss=0.4029, tot_loss=1.545, over 937.00 samples.], 2024-10-22 03:32:38,393 INFO [train.py:561] (1/4) Epoch 3957, batch 14, global_batch_idx: 63310, batch size: 142, loss[dur_loss=0.1757, prior_loss=0.9692, diff_loss=0.3266, tot_loss=1.471, over 142.00 samples.], tot_loss[dur_loss=0.1763, prior_loss=0.9691, diff_loss=0.3379, tot_loss=1.483, over 2210.00 samples.], 2024-10-22 03:32:39,844 INFO [train.py:682] (1/4) Start epoch 3958 2024-10-22 03:33:00,489 INFO [train.py:561] (1/4) Epoch 3958, batch 8, global_batch_idx: 63320, batch size: 170, loss[dur_loss=0.1789, prior_loss=0.9695, diff_loss=0.2982, tot_loss=1.447, over 170.00 samples.], tot_loss[dur_loss=0.1747, prior_loss=0.969, diff_loss=0.3548, tot_loss=1.499, over 1432.00 samples.], 2024-10-22 03:33:10,856 INFO [train.py:682] (1/4) Start epoch 3959 2024-10-22 03:33:23,268 INFO [train.py:561] (1/4) Epoch 3959, batch 2, global_batch_idx: 63330, batch size: 203, loss[dur_loss=0.1763, prior_loss=0.9691, diff_loss=0.3608, tot_loss=1.506, over 203.00 samples.], tot_loss[dur_loss=0.1763, prior_loss=0.9693, diff_loss=0.3158, tot_loss=1.461, over 442.00 samples.], 2024-10-22 03:33:37,869 INFO [train.py:561] (1/4) Epoch 3959, batch 12, global_batch_idx: 63340, batch size: 152, loss[dur_loss=0.1751, prior_loss=0.9693, diff_loss=0.2754, tot_loss=1.42, over 152.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.9691, diff_loss=0.3429, tot_loss=1.487, over 1966.00 samples.], 2024-10-22 03:33:42,436 INFO [train.py:682] (1/4) Start epoch 3960 2024-10-22 03:33:59,949 INFO [train.py:561] (1/4) Epoch 3960, batch 6, global_batch_idx: 63350, batch size: 106, loss[dur_loss=0.1769, prior_loss=0.9692, diff_loss=0.3014, tot_loss=1.447, over 106.00 samples.], tot_loss[dur_loss=0.1737, prior_loss=0.9688, diff_loss=0.3837, tot_loss=1.526, over 1142.00 samples.], 2024-10-22 03:34:13,290 INFO [train.py:682] (1/4) Start epoch 3961 2024-10-22 03:34:22,360 INFO [train.py:561] (1/4) Epoch 3961, batch 0, global_batch_idx: 63360, batch size: 108, loss[dur_loss=0.1799, prior_loss=0.9698, diff_loss=0.2621, tot_loss=1.412, over 108.00 samples.], tot_loss[dur_loss=0.1799, prior_loss=0.9698, diff_loss=0.2621, tot_loss=1.412, over 108.00 samples.], 2024-10-22 03:34:36,925 INFO [train.py:561] (1/4) Epoch 3961, batch 10, global_batch_idx: 63370, batch size: 111, loss[dur_loss=0.179, prior_loss=0.9703, diff_loss=0.2946, tot_loss=1.444, over 111.00 samples.], tot_loss[dur_loss=0.1749, prior_loss=0.9691, diff_loss=0.3602, tot_loss=1.504, over 1656.00 samples.], 2024-10-22 03:34:44,205 INFO [train.py:682] (1/4) Start epoch 3962 2024-10-22 03:34:58,481 INFO [train.py:561] (1/4) Epoch 3962, batch 4, global_batch_idx: 63380, batch size: 189, loss[dur_loss=0.1773, prior_loss=0.9693, 
diff_loss=0.3003, tot_loss=1.447, over 189.00 samples.], tot_loss[dur_loss=0.1734, prior_loss=0.9687, diff_loss=0.4018, tot_loss=1.544, over 937.00 samples.], 2024-10-22 03:35:13,716 INFO [train.py:561] (1/4) Epoch 3962, batch 14, global_batch_idx: 63390, batch size: 142, loss[dur_loss=0.1813, prior_loss=0.969, diff_loss=0.2822, tot_loss=1.433, over 142.00 samples.], tot_loss[dur_loss=0.1763, prior_loss=0.9691, diff_loss=0.3356, tot_loss=1.481, over 2210.00 samples.], 2024-10-22 03:35:15,173 INFO [train.py:682] (1/4) Start epoch 3963 2024-10-22 03:35:35,718 INFO [train.py:561] (1/4) Epoch 3963, batch 8, global_batch_idx: 63400, batch size: 170, loss[dur_loss=0.1799, prior_loss=0.9693, diff_loss=0.3079, tot_loss=1.457, over 170.00 samples.], tot_loss[dur_loss=0.1746, prior_loss=0.9688, diff_loss=0.3556, tot_loss=1.499, over 1432.00 samples.], 2024-10-22 03:35:46,039 INFO [train.py:682] (1/4) Start epoch 3964 2024-10-22 03:35:58,059 INFO [train.py:561] (1/4) Epoch 3964, batch 2, global_batch_idx: 63410, batch size: 203, loss[dur_loss=0.1779, prior_loss=0.9693, diff_loss=0.3097, tot_loss=1.457, over 203.00 samples.], tot_loss[dur_loss=0.1767, prior_loss=0.9692, diff_loss=0.2932, tot_loss=1.439, over 442.00 samples.], 2024-10-22 03:36:12,552 INFO [train.py:561] (1/4) Epoch 3964, batch 12, global_batch_idx: 63420, batch size: 152, loss[dur_loss=0.1774, prior_loss=0.9693, diff_loss=0.2668, tot_loss=1.413, over 152.00 samples.], tot_loss[dur_loss=0.1751, prior_loss=0.969, diff_loss=0.3435, tot_loss=1.488, over 1966.00 samples.], 2024-10-22 03:36:17,099 INFO [train.py:682] (1/4) Start epoch 3965 2024-10-22 03:36:34,594 INFO [train.py:561] (1/4) Epoch 3965, batch 6, global_batch_idx: 63430, batch size: 106, loss[dur_loss=0.1771, prior_loss=0.9691, diff_loss=0.249, tot_loss=1.395, over 106.00 samples.], tot_loss[dur_loss=0.1728, prior_loss=0.9687, diff_loss=0.3806, tot_loss=1.522, over 1142.00 samples.], 2024-10-22 03:36:47,927 INFO [train.py:682] (1/4) Start epoch 3966 2024-10-22 03:36:57,191 INFO [train.py:561] (1/4) Epoch 3966, batch 0, global_batch_idx: 63440, batch size: 108, loss[dur_loss=0.178, prior_loss=0.9696, diff_loss=0.2701, tot_loss=1.418, over 108.00 samples.], tot_loss[dur_loss=0.178, prior_loss=0.9696, diff_loss=0.2701, tot_loss=1.418, over 108.00 samples.], 2024-10-22 03:37:11,765 INFO [train.py:561] (1/4) Epoch 3966, batch 10, global_batch_idx: 63450, batch size: 111, loss[dur_loss=0.1766, prior_loss=0.9702, diff_loss=0.2832, tot_loss=1.43, over 111.00 samples.], tot_loss[dur_loss=0.1744, prior_loss=0.9689, diff_loss=0.3512, tot_loss=1.495, over 1656.00 samples.], 2024-10-22 03:37:19,004 INFO [train.py:682] (1/4) Start epoch 3967 2024-10-22 03:37:33,264 INFO [train.py:561] (1/4) Epoch 3967, batch 4, global_batch_idx: 63460, batch size: 189, loss[dur_loss=0.1765, prior_loss=0.9692, diff_loss=0.2883, tot_loss=1.434, over 189.00 samples.], tot_loss[dur_loss=0.1729, prior_loss=0.9685, diff_loss=0.4142, tot_loss=1.556, over 937.00 samples.], 2024-10-22 03:37:48,585 INFO [train.py:561] (1/4) Epoch 3967, batch 14, global_batch_idx: 63470, batch size: 142, loss[dur_loss=0.1793, prior_loss=0.9692, diff_loss=0.3245, tot_loss=1.473, over 142.00 samples.], tot_loss[dur_loss=0.1753, prior_loss=0.969, diff_loss=0.3391, tot_loss=1.483, over 2210.00 samples.], 2024-10-22 03:37:50,056 INFO [train.py:682] (1/4) Start epoch 3968 2024-10-22 03:38:10,788 INFO [train.py:561] (1/4) Epoch 3968, batch 8, global_batch_idx: 63480, batch size: 170, loss[dur_loss=0.1793, prior_loss=0.9695, diff_loss=0.2932, 
tot_loss=1.442, over 170.00 samples.], tot_loss[dur_loss=0.174, prior_loss=0.9689, diff_loss=0.3707, tot_loss=1.514, over 1432.00 samples.], 2024-10-22 03:38:21,162 INFO [train.py:682] (1/4) Start epoch 3969 2024-10-22 03:38:33,047 INFO [train.py:561] (1/4) Epoch 3969, batch 2, global_batch_idx: 63490, batch size: 203, loss[dur_loss=0.1732, prior_loss=0.9692, diff_loss=0.3338, tot_loss=1.476, over 203.00 samples.], tot_loss[dur_loss=0.1746, prior_loss=0.9693, diff_loss=0.3019, tot_loss=1.446, over 442.00 samples.], 2024-10-22 03:38:47,426 INFO [train.py:561] (1/4) Epoch 3969, batch 12, global_batch_idx: 63500, batch size: 152, loss[dur_loss=0.1756, prior_loss=0.9693, diff_loss=0.3101, tot_loss=1.455, over 152.00 samples.], tot_loss[dur_loss=0.1748, prior_loss=0.969, diff_loss=0.3418, tot_loss=1.486, over 1966.00 samples.], 2024-10-22 03:38:51,880 INFO [train.py:682] (1/4) Start epoch 3970 2024-10-22 03:39:09,349 INFO [train.py:561] (1/4) Epoch 3970, batch 6, global_batch_idx: 63510, batch size: 106, loss[dur_loss=0.175, prior_loss=0.9692, diff_loss=0.2798, tot_loss=1.424, over 106.00 samples.], tot_loss[dur_loss=0.173, prior_loss=0.9687, diff_loss=0.384, tot_loss=1.526, over 1142.00 samples.], 2024-10-22 03:39:22,482 INFO [train.py:682] (1/4) Start epoch 3971 2024-10-22 03:39:31,606 INFO [train.py:561] (1/4) Epoch 3971, batch 0, global_batch_idx: 63520, batch size: 108, loss[dur_loss=0.1811, prior_loss=0.9697, diff_loss=0.2803, tot_loss=1.431, over 108.00 samples.], tot_loss[dur_loss=0.1811, prior_loss=0.9697, diff_loss=0.2803, tot_loss=1.431, over 108.00 samples.], 2024-10-22 03:39:45,968 INFO [train.py:561] (1/4) Epoch 3971, batch 10, global_batch_idx: 63530, batch size: 111, loss[dur_loss=0.1763, prior_loss=0.9702, diff_loss=0.3209, tot_loss=1.467, over 111.00 samples.], tot_loss[dur_loss=0.1746, prior_loss=0.969, diff_loss=0.3556, tot_loss=1.499, over 1656.00 samples.], 2024-10-22 03:39:53,085 INFO [train.py:682] (1/4) Start epoch 3972 2024-10-22 03:40:07,534 INFO [train.py:561] (1/4) Epoch 3972, batch 4, global_batch_idx: 63540, batch size: 189, loss[dur_loss=0.1746, prior_loss=0.9695, diff_loss=0.3297, tot_loss=1.474, over 189.00 samples.], tot_loss[dur_loss=0.1729, prior_loss=0.9686, diff_loss=0.4004, tot_loss=1.542, over 937.00 samples.], 2024-10-22 03:40:22,630 INFO [train.py:561] (1/4) Epoch 3972, batch 14, global_batch_idx: 63550, batch size: 142, loss[dur_loss=0.1776, prior_loss=0.9691, diff_loss=0.273, tot_loss=1.42, over 142.00 samples.], tot_loss[dur_loss=0.1753, prior_loss=0.9691, diff_loss=0.3371, tot_loss=1.482, over 2210.00 samples.], 2024-10-22 03:40:24,079 INFO [train.py:682] (1/4) Start epoch 3973 2024-10-22 03:40:44,517 INFO [train.py:561] (1/4) Epoch 3973, batch 8, global_batch_idx: 63560, batch size: 170, loss[dur_loss=0.1803, prior_loss=0.9692, diff_loss=0.263, tot_loss=1.412, over 170.00 samples.], tot_loss[dur_loss=0.1743, prior_loss=0.9689, diff_loss=0.3673, tot_loss=1.51, over 1432.00 samples.], 2024-10-22 03:40:54,689 INFO [train.py:682] (1/4) Start epoch 3974 2024-10-22 03:41:06,328 INFO [train.py:561] (1/4) Epoch 3974, batch 2, global_batch_idx: 63570, batch size: 203, loss[dur_loss=0.1766, prior_loss=0.9692, diff_loss=0.3412, tot_loss=1.487, over 203.00 samples.], tot_loss[dur_loss=0.1771, prior_loss=0.9693, diff_loss=0.3289, tot_loss=1.475, over 442.00 samples.], 2024-10-22 03:41:20,716 INFO [train.py:561] (1/4) Epoch 3974, batch 12, global_batch_idx: 63580, batch size: 152, loss[dur_loss=0.1747, prior_loss=0.9693, diff_loss=0.3078, tot_loss=1.452, over 
152.00 samples.], tot_loss[dur_loss=0.1762, prior_loss=0.969, diff_loss=0.3537, tot_loss=1.499, over 1966.00 samples.], 2024-10-22 03:41:25,212 INFO [train.py:682] (1/4) Start epoch 3975 2024-10-22 03:41:42,489 INFO [train.py:561] (1/4) Epoch 3975, batch 6, global_batch_idx: 63590, batch size: 106, loss[dur_loss=0.1752, prior_loss=0.9692, diff_loss=0.2834, tot_loss=1.428, over 106.00 samples.], tot_loss[dur_loss=0.1741, prior_loss=0.9688, diff_loss=0.3915, tot_loss=1.534, over 1142.00 samples.], 2024-10-22 03:41:55,617 INFO [train.py:682] (1/4) Start epoch 3976 2024-10-22 03:42:04,430 INFO [train.py:561] (1/4) Epoch 3976, batch 0, global_batch_idx: 63600, batch size: 108, loss[dur_loss=0.177, prior_loss=0.9698, diff_loss=0.2868, tot_loss=1.434, over 108.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9698, diff_loss=0.2868, tot_loss=1.434, over 108.00 samples.], 2024-10-22 03:42:18,903 INFO [train.py:561] (1/4) Epoch 3976, batch 10, global_batch_idx: 63610, batch size: 111, loss[dur_loss=0.1753, prior_loss=0.9702, diff_loss=0.2597, tot_loss=1.405, over 111.00 samples.], tot_loss[dur_loss=0.1743, prior_loss=0.969, diff_loss=0.3467, tot_loss=1.49, over 1656.00 samples.], 2024-10-22 03:42:26,110 INFO [train.py:682] (1/4) Start epoch 3977 2024-10-22 03:42:40,045 INFO [train.py:561] (1/4) Epoch 3977, batch 4, global_batch_idx: 63620, batch size: 189, loss[dur_loss=0.1794, prior_loss=0.9694, diff_loss=0.3282, tot_loss=1.477, over 189.00 samples.], tot_loss[dur_loss=0.174, prior_loss=0.9686, diff_loss=0.4032, tot_loss=1.546, over 937.00 samples.], 2024-10-22 03:42:55,131 INFO [train.py:561] (1/4) Epoch 3977, batch 14, global_batch_idx: 63630, batch size: 142, loss[dur_loss=0.179, prior_loss=0.9691, diff_loss=0.3024, tot_loss=1.45, over 142.00 samples.], tot_loss[dur_loss=0.1758, prior_loss=0.9691, diff_loss=0.3364, tot_loss=1.481, over 2210.00 samples.], 2024-10-22 03:42:56,571 INFO [train.py:682] (1/4) Start epoch 3978 2024-10-22 03:43:17,052 INFO [train.py:561] (1/4) Epoch 3978, batch 8, global_batch_idx: 63640, batch size: 170, loss[dur_loss=0.1767, prior_loss=0.9691, diff_loss=0.2988, tot_loss=1.445, over 170.00 samples.], tot_loss[dur_loss=0.1741, prior_loss=0.9688, diff_loss=0.3612, tot_loss=1.504, over 1432.00 samples.], 2024-10-22 03:43:27,286 INFO [train.py:682] (1/4) Start epoch 3979 2024-10-22 03:43:38,878 INFO [train.py:561] (1/4) Epoch 3979, batch 2, global_batch_idx: 63650, batch size: 203, loss[dur_loss=0.1778, prior_loss=0.9693, diff_loss=0.3167, tot_loss=1.464, over 203.00 samples.], tot_loss[dur_loss=0.1767, prior_loss=0.9693, diff_loss=0.3011, tot_loss=1.447, over 442.00 samples.], 2024-10-22 03:43:53,314 INFO [train.py:561] (1/4) Epoch 3979, batch 12, global_batch_idx: 63660, batch size: 152, loss[dur_loss=0.1732, prior_loss=0.9692, diff_loss=0.3056, tot_loss=1.448, over 152.00 samples.], tot_loss[dur_loss=0.1751, prior_loss=0.969, diff_loss=0.349, tot_loss=1.493, over 1966.00 samples.], 2024-10-22 03:43:57,907 INFO [train.py:682] (1/4) Start epoch 3980 2024-10-22 03:44:15,409 INFO [train.py:561] (1/4) Epoch 3980, batch 6, global_batch_idx: 63670, batch size: 106, loss[dur_loss=0.1746, prior_loss=0.9692, diff_loss=0.2875, tot_loss=1.431, over 106.00 samples.], tot_loss[dur_loss=0.1731, prior_loss=0.9686, diff_loss=0.3783, tot_loss=1.52, over 1142.00 samples.], 2024-10-22 03:44:28,736 INFO [train.py:682] (1/4) Start epoch 3981 2024-10-22 03:44:37,708 INFO [train.py:561] (1/4) Epoch 3981, batch 0, global_batch_idx: 63680, batch size: 108, loss[dur_loss=0.1772, 
prior_loss=0.9697, diff_loss=0.2779, tot_loss=1.425, over 108.00 samples.], tot_loss[dur_loss=0.1772, prior_loss=0.9697, diff_loss=0.2779, tot_loss=1.425, over 108.00 samples.],
2024-10-22 03:44:52,158 INFO [train.py:561] (1/4) Epoch 3981, batch 10, global_batch_idx: 63690, batch size: 111, loss[dur_loss=0.1736, prior_loss=0.9702, diff_loss=0.2958, tot_loss=1.44, over 111.00 samples.], tot_loss[dur_loss=0.1744, prior_loss=0.9689, diff_loss=0.3561, tot_loss=1.499, over 1656.00 samples.],
2024-10-22 03:44:59,300 INFO [train.py:682] (1/4) Start epoch 3982
2024-10-22 03:45:13,446 INFO [train.py:561] (1/4) Epoch 3982, batch 4, global_batch_idx: 63700, batch size: 189, loss[dur_loss=0.1771, prior_loss=0.9693, diff_loss=0.3062, tot_loss=1.453, over 189.00 samples.], tot_loss[dur_loss=0.1724, prior_loss=0.9686, diff_loss=0.414, tot_loss=1.555, over 937.00 samples.],
2024-10-22 03:45:28,753 INFO [train.py:561] (1/4) Epoch 3982, batch 14, global_batch_idx: 63710, batch size: 142, loss[dur_loss=0.1765, prior_loss=0.969, diff_loss=0.2499, tot_loss=1.395, over 142.00 samples.], tot_loss[dur_loss=0.1755, prior_loss=0.9691, diff_loss=0.3422, tot_loss=1.487, over 2210.00 samples.],
2024-10-22 03:45:30,218 INFO [train.py:682] (1/4) Start epoch 3983
2024-10-22 03:45:50,624 INFO [train.py:561] (1/4) Epoch 3983, batch 8, global_batch_idx: 63720, batch size: 170, loss[dur_loss=0.1777, prior_loss=0.9693, diff_loss=0.3372, tot_loss=1.484, over 170.00 samples.], tot_loss[dur_loss=0.1742, prior_loss=0.9688, diff_loss=0.3607, tot_loss=1.504, over 1432.00 samples.],
2024-10-22 03:46:00,910 INFO [train.py:682] (1/4) Start epoch 3984
2024-10-22 03:46:12,559 INFO [train.py:561] (1/4) Epoch 3984, batch 2, global_batch_idx: 63730, batch size: 203, loss[dur_loss=0.1764, prior_loss=0.9693, diff_loss=0.3439, tot_loss=1.49, over 203.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9693, diff_loss=0.2987, tot_loss=1.445, over 442.00 samples.],
2024-10-22 03:46:26,949 INFO [train.py:561] (1/4) Epoch 3984, batch 12, global_batch_idx: 63740, batch size: 152, loss[dur_loss=0.173, prior_loss=0.9695, diff_loss=0.2626, tot_loss=1.405, over 152.00 samples.], tot_loss[dur_loss=0.1753, prior_loss=0.9691, diff_loss=0.3363, tot_loss=1.481, over 1966.00 samples.],
2024-10-22 03:46:31,439 INFO [train.py:682] (1/4) Start epoch 3985
2024-10-22 03:46:48,666 INFO [train.py:561] (1/4) Epoch 3985, batch 6, global_batch_idx: 63750, batch size: 106, loss[dur_loss=0.1753, prior_loss=0.9691, diff_loss=0.3037, tot_loss=1.448, over 106.00 samples.], tot_loss[dur_loss=0.1738, prior_loss=0.9688, diff_loss=0.3885, tot_loss=1.531, over 1142.00 samples.],
2024-10-22 03:47:01,914 INFO [train.py:682] (1/4) Start epoch 3986
2024-10-22 03:47:10,988 INFO [train.py:561] (1/4) Epoch 3986, batch 0, global_batch_idx: 63760, batch size: 108, loss[dur_loss=0.1795, prior_loss=0.9697, diff_loss=0.3026, tot_loss=1.452, over 108.00 samples.], tot_loss[dur_loss=0.1795, prior_loss=0.9697, diff_loss=0.3026, tot_loss=1.452, over 108.00 samples.],
2024-10-22 03:47:25,420 INFO [train.py:561] (1/4) Epoch 3986, batch 10, global_batch_idx: 63770, batch size: 111, loss[dur_loss=0.1765, prior_loss=0.9703, diff_loss=0.2853, tot_loss=1.432, over 111.00 samples.], tot_loss[dur_loss=0.1753, prior_loss=0.969, diff_loss=0.3509, tot_loss=1.495, over 1656.00 samples.],
2024-10-22 03:47:32,631 INFO [train.py:682] (1/4) Start epoch 3987
2024-10-22 03:47:46,606 INFO [train.py:561] (1/4) Epoch 3987, batch 4, global_batch_idx: 63780, batch size: 189, loss[dur_loss=0.1763, prior_loss=0.9692, diff_loss=0.3174, tot_loss=1.463, over 189.00 samples.], tot_loss[dur_loss=0.1749, prior_loss=0.9686, diff_loss=0.408, tot_loss=1.552, over 937.00 samples.],
2024-10-22 03:48:01,696 INFO [train.py:561] (1/4) Epoch 3987, batch 14, global_batch_idx: 63790, batch size: 142, loss[dur_loss=0.1797, prior_loss=0.9691, diff_loss=0.3014, tot_loss=1.45, over 142.00 samples.], tot_loss[dur_loss=0.1764, prior_loss=0.9691, diff_loss=0.3433, tot_loss=1.489, over 2210.00 samples.],
2024-10-22 03:48:03,139 INFO [train.py:682] (1/4) Start epoch 3988
2024-10-22 03:48:23,539 INFO [train.py:561] (1/4) Epoch 3988, batch 8, global_batch_idx: 63800, batch size: 170, loss[dur_loss=0.177, prior_loss=0.9693, diff_loss=0.294, tot_loss=1.44, over 170.00 samples.], tot_loss[dur_loss=0.1743, prior_loss=0.9689, diff_loss=0.3699, tot_loss=1.513, over 1432.00 samples.],
2024-10-22 03:48:33,855 INFO [train.py:682] (1/4) Start epoch 3989
2024-10-22 03:48:45,623 INFO [train.py:561] (1/4) Epoch 3989, batch 2, global_batch_idx: 63810, batch size: 203, loss[dur_loss=0.179, prior_loss=0.9694, diff_loss=0.3239, tot_loss=1.472, over 203.00 samples.], tot_loss[dur_loss=0.1789, prior_loss=0.9693, diff_loss=0.2964, tot_loss=1.445, over 442.00 samples.],
2024-10-22 03:49:00,149 INFO [train.py:561] (1/4) Epoch 3989, batch 12, global_batch_idx: 63820, batch size: 152, loss[dur_loss=0.1746, prior_loss=0.9695, diff_loss=0.3021, tot_loss=1.446, over 152.00 samples.], tot_loss[dur_loss=0.1766, prior_loss=0.9691, diff_loss=0.3477, tot_loss=1.493, over 1966.00 samples.],
2024-10-22 03:49:04,637 INFO [train.py:682] (1/4) Start epoch 3990
2024-10-22 03:49:21,877 INFO [train.py:561] (1/4) Epoch 3990, batch 6, global_batch_idx: 63830, batch size: 106, loss[dur_loss=0.1756, prior_loss=0.969, diff_loss=0.2815, tot_loss=1.426, over 106.00 samples.], tot_loss[dur_loss=0.1747, prior_loss=0.9688, diff_loss=0.3867, tot_loss=1.53, over 1142.00 samples.],
2024-10-22 03:49:35,167 INFO [train.py:682] (1/4) Start epoch 3991
2024-10-22 03:49:43,926 INFO [train.py:561] (1/4) Epoch 3991, batch 0, global_batch_idx: 63840, batch size: 108, loss[dur_loss=0.1811, prior_loss=0.9697, diff_loss=0.2937, tot_loss=1.444, over 108.00 samples.], tot_loss[dur_loss=0.1811, prior_loss=0.9697, diff_loss=0.2937, tot_loss=1.444, over 108.00 samples.],
2024-10-22 03:49:58,464 INFO [train.py:561] (1/4) Epoch 3991, batch 10, global_batch_idx: 63850, batch size: 111, loss[dur_loss=0.1782, prior_loss=0.9701, diff_loss=0.3162, tot_loss=1.465, over 111.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.969, diff_loss=0.3526, tot_loss=1.497, over 1656.00 samples.],
2024-10-22 03:50:05,700 INFO [train.py:682] (1/4) Start epoch 3992
2024-10-22 03:50:20,220 INFO [train.py:561] (1/4) Epoch 3992, batch 4, global_batch_idx: 63860, batch size: 189, loss[dur_loss=0.1767, prior_loss=0.9695, diff_loss=0.2946, tot_loss=1.441, over 189.00 samples.], tot_loss[dur_loss=0.1733, prior_loss=0.9686, diff_loss=0.4046, tot_loss=1.547, over 937.00 samples.],
2024-10-22 03:50:35,579 INFO [train.py:561] (1/4) Epoch 3992, batch 14, global_batch_idx: 63870, batch size: 142, loss[dur_loss=0.1802, prior_loss=0.9691, diff_loss=0.3215, tot_loss=1.471, over 142.00 samples.], tot_loss[dur_loss=0.1755, prior_loss=0.9691, diff_loss=0.3327, tot_loss=1.477, over 2210.00 samples.],
2024-10-22 03:50:37,023 INFO [train.py:682] (1/4) Start epoch 3993
2024-10-22 03:50:57,709 INFO [train.py:561] (1/4) Epoch 3993, batch 8, global_batch_idx: 63880, batch size: 170, loss[dur_loss=0.1791, prior_loss=0.9693, diff_loss=0.2913, tot_loss=1.44, over 170.00 samples.], tot_loss[dur_loss=0.1756, prior_loss=0.9689, diff_loss=0.3669, tot_loss=1.511, over 1432.00 samples.],
2024-10-22 03:51:08,128 INFO [train.py:682] (1/4) Start epoch 3994
2024-10-22 03:51:19,634 INFO [train.py:561] (1/4) Epoch 3994, batch 2, global_batch_idx: 63890, batch size: 203, loss[dur_loss=0.1761, prior_loss=0.9693, diff_loss=0.3187, tot_loss=1.464, over 203.00 samples.], tot_loss[dur_loss=0.1768, prior_loss=0.9693, diff_loss=0.3111, tot_loss=1.457, over 442.00 samples.],
2024-10-22 03:51:34,320 INFO [train.py:561] (1/4) Epoch 3994, batch 12, global_batch_idx: 63900, batch size: 152, loss[dur_loss=0.1736, prior_loss=0.9693, diff_loss=0.3373, tot_loss=1.48, over 152.00 samples.], tot_loss[dur_loss=0.1754, prior_loss=0.969, diff_loss=0.3476, tot_loss=1.492, over 1966.00 samples.],
2024-10-22 03:51:38,884 INFO [train.py:682] (1/4) Start epoch 3995
2024-10-22 03:51:56,421 INFO [train.py:561] (1/4) Epoch 3995, batch 6, global_batch_idx: 63910, batch size: 106, loss[dur_loss=0.1759, prior_loss=0.9692, diff_loss=0.2927, tot_loss=1.438, over 106.00 samples.], tot_loss[dur_loss=0.1752, prior_loss=0.9688, diff_loss=0.3911, tot_loss=1.535, over 1142.00 samples.],
2024-10-22 03:52:09,823 INFO [train.py:682] (1/4) Start epoch 3996
2024-10-22 03:52:18,747 INFO [train.py:561] (1/4) Epoch 3996, batch 0, global_batch_idx: 63920, batch size: 108, loss[dur_loss=0.1796, prior_loss=0.9697, diff_loss=0.3264, tot_loss=1.476, over 108.00 samples.], tot_loss[dur_loss=0.1796, prior_loss=0.9697, diff_loss=0.3264, tot_loss=1.476, over 108.00 samples.],
2024-10-22 03:52:33,360 INFO [train.py:561] (1/4) Epoch 3996, batch 10, global_batch_idx: 63930, batch size: 111, loss[dur_loss=0.1796, prior_loss=0.9704, diff_loss=0.2566, tot_loss=1.407, over 111.00 samples.], tot_loss[dur_loss=0.1749, prior_loss=0.969, diff_loss=0.3482, tot_loss=1.492, over 1656.00 samples.],
2024-10-22 03:52:40,595 INFO [train.py:682] (1/4) Start epoch 3997
2024-10-22 03:52:54,691 INFO [train.py:561] (1/4) Epoch 3997, batch 4, global_batch_idx: 63940, batch size: 189, loss[dur_loss=0.1765, prior_loss=0.9695, diff_loss=0.3166, tot_loss=1.463, over 189.00 samples.], tot_loss[dur_loss=0.1746, prior_loss=0.9686, diff_loss=0.4224, tot_loss=1.566, over 937.00 samples.],
2024-10-22 03:53:09,904 INFO [train.py:561] (1/4) Epoch 3997, batch 14, global_batch_idx: 63950, batch size: 142, loss[dur_loss=0.1756, prior_loss=0.9691, diff_loss=0.2982, tot_loss=1.443, over 142.00 samples.], tot_loss[dur_loss=0.176, prior_loss=0.9691, diff_loss=0.3496, tot_loss=1.495, over 2210.00 samples.],
2024-10-22 03:53:11,346 INFO [train.py:682] (1/4) Start epoch 3998
2024-10-22 03:53:32,149 INFO [train.py:561] (1/4) Epoch 3998, batch 8, global_batch_idx: 63960, batch size: 170, loss[dur_loss=0.179, prior_loss=0.9695, diff_loss=0.299, tot_loss=1.448, over 170.00 samples.], tot_loss[dur_loss=0.1747, prior_loss=0.969, diff_loss=0.3667, tot_loss=1.51, over 1432.00 samples.],
2024-10-22 03:53:42,445 INFO [train.py:682] (1/4) Start epoch 3999
2024-10-22 03:53:54,012 INFO [train.py:561] (1/4) Epoch 3999, batch 2, global_batch_idx: 63970, batch size: 203, loss[dur_loss=0.1756, prior_loss=0.9693, diff_loss=0.3076, tot_loss=1.453, over 203.00 samples.], tot_loss[dur_loss=0.177, prior_loss=0.9693, diff_loss=0.3072, tot_loss=1.453, over 442.00 samples.],
2024-10-22 03:54:08,609 INFO [train.py:561] (1/4) Epoch 3999, batch 12, global_batch_idx: 63980, batch size: 152, loss[dur_loss=0.1757, prior_loss=0.9693, diff_loss=0.2993, tot_loss=1.444, over 152.00 samples.], tot_loss[dur_loss=0.1754, prior_loss=0.969, diff_loss=0.3501, tot_loss=1.494, over 1966.00 samples.],
2024-10-22 03:54:13,107 INFO [train.py:682] (1/4) Start epoch 4000
2024-10-22 03:54:30,686 INFO [train.py:561] (1/4) Epoch 4000, batch 6, global_batch_idx: 63990, batch size: 106, loss[dur_loss=0.1769, prior_loss=0.9691, diff_loss=0.2901, tot_loss=1.436, over 106.00 samples.], tot_loss[dur_loss=0.174, prior_loss=0.9687, diff_loss=0.3893, tot_loss=1.532, over 1142.00 samples.],
2024-10-22 03:54:43,968 INFO [train.py:724] (1/4) Done!
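Note on reading the loss records above: each loss[...] block reports the current batch, while each tot_loss[...] block appears to be a running, sample-count-weighted average over the epoch so far (the two coincide at batch 0, and the "over N samples" count grows within an epoch). The reported tot_loss also matches the sum of dur_loss, prior_loss, and diff_loss (e.g. 0.1772 + 0.9697 + 0.2779 = 1.4248, logged as 1.425). The following minimal Python sketch reproduces that bookkeeping under these assumptions; LossTracker, update, and average are illustrative names, not the actual helpers in icefall's train.py.

from collections import defaultdict

class LossTracker:
    """Sketch of the assumed bookkeeping behind the running
    tot_loss[... over N samples] entries: accumulate sample-weighted
    sums per loss component, then divide by the total sample count."""

    def __init__(self):
        self.sums = defaultdict(float)  # loss name -> sum(loss * batch_size)
        self.num_samples = 0

    def update(self, losses, batch_size):
        # losses: dict of per-sample-averaged component losses for one batch
        for name, value in losses.items():
            self.sums[name] += value * batch_size
        self.num_samples += batch_size

    def average(self):
        # Sample-count-weighted average of each component over the epoch so far
        return {name: s / self.num_samples for name, s in self.sums.items()}

# Example with the Epoch 3986, batch 0 numbers from the log: after one
# batch the running average equals the batch loss, since only 108
# samples have been seen in the epoch.
tracker = LossTracker()
tracker.update({"dur_loss": 0.1795, "prior_loss": 0.9697, "diff_loss": 0.3026}, 108)
avg = tracker.average()
# Assuming tot_loss is the sum of the three components:
# 0.1795 + 0.9697 + 0.3026 = 1.4518, matching the logged tot_loss=1.452.
print(sum(avg.values()))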