2022-02-03 13:58:44,300 INFO MainThread:42729 [wandb_setup.py:_flush():75] Loading settings from /workspace/.config/wandb/settings
2022-02-03 13:58:44,300 INFO MainThread:42729 [wandb_setup.py:_flush():75] Loading settings from /workspace/xls-r-300m-cv_8-fr/wandb/settings
2022-02-03 13:58:44,300 INFO MainThread:42729 [wandb_setup.py:_flush():75] Loading settings from environment variables: {'project': 'xls-r-300m-cv_8-fr'}
2022-02-03 13:58:44,300 INFO MainThread:42729 [wandb_setup.py:_flush():75] Inferring run settings from compute environment: {'program_relpath': 'run_speech_recognition_ctc.py', 'program': 'run_speech_recognition_ctc.py'}
2022-02-03 13:58:44,301 INFO MainThread:42729 [wandb_init.py:_log_setup():386] Logging user logs to /workspace/xls-r-300m-cv_8-fr/wandb/run-20220203_135844-2tzexn1o/logs/debug.log
2022-02-03 13:58:44,301 INFO MainThread:42729 [wandb_init.py:_log_setup():387] Logging internal logs to /workspace/xls-r-300m-cv_8-fr/wandb/run-20220203_135844-2tzexn1o/logs/debug-internal.log
2022-02-03 13:58:44,301 INFO MainThread:42729 [wandb_init.py:init():420] calling init triggers
2022-02-03 13:58:44,301 INFO MainThread:42729 [wandb_init.py:init():425] wandb.init called with sweep_config: {}
config: {}
2022-02-03 13:58:44,301 INFO MainThread:42729 [wandb_init.py:init():471] starting backend
2022-02-03 13:58:44,301 INFO MainThread:42729 [backend.py:_multiprocessing_setup():99] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
2022-02-03 13:58:44,539 INFO MainThread:42729 [backend.py:ensure_launched():219] starting backend process...
2022-02-03 13:58:44,779 INFO MainThread:42729 [backend.py:ensure_launched():224] started backend process with pid: 3742
2022-02-03 13:58:44,782 INFO MainThread:42729 [wandb_init.py:init():480] backend started and connected
2022-02-03 13:58:44,791 INFO MainThread:42729 [wandb_init.py:init():550] updated telemetry
2022-02-03 13:58:45,331 INFO MainThread:42729 [wandb_init.py:init():581] communicating current version
2022-02-03 13:58:45,892 INFO MainThread:42729 [wandb_init.py:init():586] got version response
2022-02-03 13:58:45,892 INFO MainThread:42729 [wandb_init.py:init():596] communicating run to backend with 30 second timeout
2022-02-03 13:58:46,119 INFO MainThread:42729 [wandb_init.py:init():624] starting run threads in backend
2022-02-03 13:58:46,700 INFO MainThread:42729 [wandb_run.py:_console_start():1827] atexit reg
2022-02-03 13:58:46,701 INFO MainThread:42729 [wandb_run.py:_redirect():1701] redirect: SettingsConsole.REDIRECT
2022-02-03 13:58:46,702 INFO MainThread:42729 [wandb_run.py:_redirect():1706] Redirecting console.
2022-02-03 13:58:46,709 INFO MainThread:42729 [wandb_run.py:_redirect():1762] Redirects installed.
2022-02-03 13:58:46,709 INFO MainThread:42729 [wandb_init.py:init():651] run started, returning control to user process
2022-02-03 13:58:46,712 INFO MainThread:42729 [wandb_run.py:_config_callback():966] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': 'float32', 'use_bfloat16': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'chunk_size_feed_forward': 0, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 45, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-xls-r-300m', 'transformers_version': '4.17.0.dev0', 'feat_extract_dropout': 0.0, 'model_type': 'wav2vec2', 'num_feat_extract_layers': 7, 'hidden_size': 1024, 'feat_extract_norm': 'layer', 'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': True, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 24, 'intermediate_size': 4096, 'hidden_act': 'gelu', 'num_attention_heads': 16, 'hidden_dropout': 0.0, 'attention_dropout': 0.0, 'activation_dropout': 0.1, 'feat_proj_dropout': 0.0, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 46, 'do_stable_layer_norm': True, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.75, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.25, 'mask_feature_length': 64, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 768, 'proj_codevector_dim': 768, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'mean', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 1024, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': './', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 16, 'per_device_eval_batch_size': 16, 'per_gpu_train_batch_size': 'None', 'per_gpu_eval_batch_size': 'None', 'gradient_accumulation_steps': 8, 'eval_accumulation_steps': 'None', 'learning_rate': 7.5e-05, 'weight_decay': 0.0, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 5.0, 'max_steps': -1, 'lr_scheduler_type': 'linear', 'warmup_ratio': 0.0, 'warmup_steps': 2000, 'log_level': -1, 'log_level_replica': -1, 'log_on_each_node': True, 'logging_dir': './runs/Feb03_10-26-21_job-8ff6a64a-ab40-4ef5-b3ec-d0e2f05ceb9f', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 100, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 1000, 'save_total_limit': 3, 'save_on_each_node': False, 'no_cuda': False, 'seed': 42, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'amp', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': 'None', 'local_rank': -1, 'xpu_backend': 'None', 'tpu_num_cores': 'None', 'tpu_metrics_debug': False, 'debug': '[]', 'dataloader_drop_last': False, 'eval_steps': 1000, 'dataloader_num_workers': 0, 'past_index': -1, 'run_name': './', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': 'None', 'load_best_model_at_end': True, 'metric_for_best_model': 'loss', 'greater_is_better': False, 'ignore_data_skip': False, 'sharded_ddp': '[]', 'deepspeed': 'None', 'label_smoothing_factor': 0.0, 'optim': 'adamw_hf', 'adafactor': False, 'group_by_length': True, 'length_column_name': 'input_length', 'report_to': "['wandb']", 'ddp_find_unused_parameters': 'None', 'ddp_bucket_cap_mb': 'None', 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': 'None', 'hub_model_id': 'None', 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'gradient_checkpointing': True, 'fp16_backend': 'auto', 'push_to_hub_model_id': 'None', 'push_to_hub_organization': 'None', 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', '_n_gpu': 1, 'mp_parameters': '', 'train_batch_size': 16, 'eval_batch_size': 16}
2022-02-03 13:58:46,715 INFO MainThread:42729 [wandb_watch.py:watch():43] Watching
2022-02-05 05:58:42,614 INFO MainThread:42729 [wandb_run.py:_atexit_cleanup():1797] got exitcode: 0
2022-02-05 05:58:42,620 INFO MainThread:42729 [wandb_run.py:_restore():1769] restore
2022-02-05 05:58:45,392 INFO MainThread:42729 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
wandb_count: 1
}
pusher_stats {
uploaded_bytes: 2043
total_bytes: 2043
}
2022-02-05 05:58:45,562 INFO MainThread:42729 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
wandb_count: 1
}
pusher_stats {
uploaded_bytes: 2043
total_bytes: 2043
}
2022-02-05 05:58:46,573 INFO MainThread:42729 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
wandb_count: 1
}
pusher_stats {
uploaded_bytes: 2043
total_bytes: 2043
}
2022-02-05 05:58:46,676 INFO MainThread:42729 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
wandb_count: 5
}
pusher_stats {
uploaded_bytes: 2043
total_bytes: 2872686
}
2022-02-05 05:58:46,779 INFO MainThread:42729 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
wandb_count: 5
}
pusher_stats {
uploaded_bytes: 2043
total_bytes: 2872686
}
2022-02-05 05:58:46,883 INFO MainThread:42729 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
wandb_count: 5
}
pusher_stats {
uploaded_bytes: 1590434
total_bytes: 2872686
}
2022-02-05 05:58:46,986 INFO MainThread:42729 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
wandb_count: 5
}
pusher_stats {
uploaded_bytes: 2872686
total_bytes: 2872686
}
2022-02-05 05:58:47,088 INFO MainThread:42729 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
wandb_count: 5
}
pusher_stats {
uploaded_bytes: 2872686
total_bytes: 2872686
}
2022-02-05 05:58:47,191 INFO MainThread:42729 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
wandb_count: 5
}
pusher_stats {
uploaded_bytes: 2872686
total_bytes: 2872686
}
2022-02-05 05:58:47,294 INFO MainThread:42729 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
wandb_count: 5
}
pusher_stats {
uploaded_bytes: 2872686
total_bytes: 2872686
}
2022-02-05 05:58:47,400 INFO MainThread:42729 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
wandb_count: 5
}
pusher_stats {
uploaded_bytes: 2872686
total_bytes: 2872686
}
2022-02-05 05:58:47,503 INFO MainThread:42729 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
wandb_count: 5
}
pusher_stats {
uploaded_bytes: 2872686
total_bytes: 2872686
}
2022-02-05 05:58:48,539 INFO MainThread:42729 [wandb_run.py:_wait_for_finish():1929] got exit ret: file_counts {
wandb_count: 5
}
pusher_stats {
uploaded_bytes: 2872686
total_bytes: 2872686
}
2022-02-05 05:58:48,948 INFO MainThread:42729 [wandb_run.py:_wait_for_finish():1929] got exit ret: done: true
exit_result {
}
file_counts {
wandb_count: 5
}
pusher_stats {
uploaded_bytes: 2872686
total_bytes: 2872686
}
local_info {
}
2022-02-05 05:58:50,098 INFO MainThread:42729 [wandb_run.py:_append_history():2144] rendering history
2022-02-05 05:58:50,100 INFO MainThread:42729 [wandb_run.py:_append_summary():2102] rendering summary
2022-02-05 05:58:50,101 INFO MainThread:42729 [wandb_run.py:_append_files():2194] logging synced files