rlaorrn committed on
Commit
64dba2f
1 Parent(s): b3df4c6

Training in progress, step 500

model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:ffef46ba0d91b26720ade5a0764b8eb00dd4cd8b026b05ad7fdf73b5bba92b4e
+ oid sha256:bbdd2ed8b70919495da7652044d38dca1e3f2b644743b508ee081973992a3427
  size 377611120
runs/May25_17-49-41_8f1fad5fe1d2/events.out.tfevents.1716659410.8f1fad5fe1d2.34.0 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9a69b3e59d047302f1940346158a0ca2a3adbf878520a7b13b0a07e64ecd1dd0
+ size 6848
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:141b8c574fad469c69f1096f214b83f7c7364266f51555c4dd421e97e70dca21
+ oid sha256:16c44ddbf94d57ba0bb11a0f097172952efe63cc2b9e9b68519186850c66a69c
  size 4920
wandb/debug-internal.log CHANGED
The diff for this file is too large to render. See raw diff
 
wandb/debug.log CHANGED
@@ -1,217 +1,32 @@
- 2024-05-24 20:27:37,417 INFO MainThread:34 [wandb_setup.py:_flush():76] Current SDK version is 0.16.6
- 2024-05-24 20:27:37,417 INFO MainThread:34 [wandb_setup.py:_flush():76] Configure stats pid to 34
- 2024-05-24 20:27:37,417 INFO MainThread:34 [wandb_setup.py:_flush():76] Loading settings from /root/.config/wandb/settings
- 2024-05-24 20:27:37,417 INFO MainThread:34 [wandb_setup.py:_flush():76] Loading settings from /kaggle/working/wandb/settings
- 2024-05-24 20:27:37,417 INFO MainThread:34 [wandb_setup.py:_flush():76] Loading settings from environment variables: {}
- 2024-05-24 20:27:37,417 INFO MainThread:34 [wandb_setup.py:_flush():76] Applying setup settings: {'_disable_service': False}
- 2024-05-24 20:27:37,417 INFO MainThread:34 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program': '<python with no main file>'}
- 2024-05-24 20:27:37,417 INFO MainThread:34 [wandb_setup.py:_flush():76] Applying login settings: {}
- 2024-05-24 20:27:37,417 INFO MainThread:34 [wandb_setup.py:_flush():76] Applying login settings: {'api_key': '***REDACTED***'}
- 2024-05-24 20:27:37,417 INFO MainThread:34 [wandb_init.py:_log_setup():521] Logging user logs to /kaggle/working/wandb/run-20240524_202737-n1w0kmmv/logs/debug.log
- 2024-05-24 20:27:37,417 INFO MainThread:34 [wandb_init.py:_log_setup():522] Logging internal logs to /kaggle/working/wandb/run-20240524_202737-n1w0kmmv/logs/debug-internal.log
- 2024-05-24 20:27:37,417 INFO MainThread:34 [wandb_init.py:_jupyter_setup():467] configuring jupyter hooks <wandb.sdk.wandb_init._WandbInit object at 0x785a802b9120>
- 2024-05-24 20:27:37,418 INFO MainThread:34 [wandb_init.py:init():561] calling init triggers
- 2024-05-24 20:27:37,418 INFO MainThread:34 [wandb_init.py:init():568] wandb.init called with sweep_config: {}
  config: {}
- 2024-05-24 20:27:37,418 INFO MainThread:34 [wandb_init.py:init():611] starting backend
- 2024-05-24 20:27:37,418 INFO MainThread:34 [wandb_init.py:init():615] setting up manager
- 2024-05-24 20:27:37,420 INFO MainThread:34 [backend.py:_multiprocessing_setup():105] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
- 2024-05-24 20:27:37,423 INFO MainThread:34 [wandb_init.py:init():623] backend started and connected
- 2024-05-24 20:27:37,434 INFO MainThread:34 [wandb_run.py:_label_probe_notebook():1299] probe notebook
- 2024-05-24 20:27:38,191 INFO MainThread:34 [wandb_init.py:init():715] updated telemetry
- 2024-05-24 20:27:38,195 INFO MainThread:34 [wandb_init.py:init():748] communicating run to backend with 90.0 second timeout
- 2024-05-24 20:27:38,448 INFO MainThread:34 [wandb_run.py:_on_init():2357] communicating current version
- 2024-05-24 20:27:38,512 INFO MainThread:34 [wandb_run.py:_on_init():2366] got version response upgrade_message: "wandb version 0.17.0 is available! To upgrade, please run:\n $ pip install wandb --upgrade"
 
- 2024-05-24 20:27:38,514 INFO MainThread:34 [wandb_init.py:init():799] starting run threads in backend
- 2024-05-24 20:27:54,627 INFO MainThread:34 [wandb_run.py:_console_start():2335] atexit reg
- 2024-05-24 20:27:54,628 INFO MainThread:34 [wandb_run.py:_redirect():2190] redirect: wrap_raw
- 2024-05-24 20:27:54,628 INFO MainThread:34 [wandb_run.py:_redirect():2255] Wrapping output streams.
- 2024-05-24 20:27:54,629 INFO MainThread:34 [wandb_run.py:_redirect():2280] Redirects installed.
- 2024-05-24 20:27:54,630 INFO MainThread:34 [wandb_init.py:init():842] run started, returning control to user process
- 2024-05-24 20:27:54,637 INFO MainThread:34 [wandb_run.py:_config_callback():1347] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': None, 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'chunk_size_feed_forward': 0, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 0, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-base', 'transformers_version': '4.39.3', 'freeze_feat_extract_train': True, 'mask_channel_length': 10, 'mask_channel_min_space': 1, 'mask_channel_other': 0.0, 'mask_channel_prob': 0.0, 'mask_channel_selection': 'static', 'mask_time_min_space': 1, 'mask_time_other': 0.0, 'mask_time_selection': 'static', 'model_type': 'wav2vec2', 'no_mask_channel_overlap': False, 'no_mask_time_overlap': False, 'num_feat_extract_layers': 7, 'hidden_size': 768, 'feat_extract_norm': 'group', 
'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': False, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 12, 'intermediate_size': 3072, 'hidden_act': 'gelu', 'num_attention_heads': 12, 'hidden_dropout': 0.1, 'attention_dropout': 0.1, 'activation_dropout': 0.0, 'feat_proj_dropout': 0.1, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 32, 'do_stable_layer_norm': False, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.05, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.0, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 256, 'proj_codevector_dim': 256, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'sum', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 768, 'adapter_attn_dim': None, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': '/kaggle/working/', 'overwrite_output_dir': False, 'do_train': False, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 32, 'per_device_eval_batch_size': 8, 'per_gpu_train_batch_size': None, 'per_gpu_eval_batch_size': None, 'gradient_accumulation_steps': 1, 'eval_accumulation_steps': None, 'eval_delay': 0, 'learning_rate': 0.0001, 'weight_decay': 0.005, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 30, 'max_steps': -1, 
'lr_scheduler_type': 'linear', 'lr_scheduler_kwargs': {}, 'warmup_ratio': 0.0, 'warmup_steps': 1000, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': '/kaggle/working/runs/May24_20-24-49_890e8b3ca76b', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 500, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 2, 'save_safetensors': True, 'save_on_each_node': False, 'save_only_model': False, 'no_cuda': False, 'use_cpu': False, 'use_mps_device': False, 'seed': 42, 'data_seed': None, 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': None, 'local_rank': 0, 'ddp_backend': None, 'tpu_num_cores': None, 'tpu_metrics_debug': False, 'debug': [], 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'dataloader_prefetch_factor': None, 'past_index': -1, 'run_name': '/kaggle/working/', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': None, 'load_best_model_at_end': False, 'metric_for_best_model': None, 'greater_is_better': None, 'ignore_data_skip': False, 'fsdp': [], 'fsdp_min_num_params': 0, 'fsdp_config': {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, 'fsdp_transformer_layer_cls_to_wrap': None, 'accelerator_config': {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True}, 'deepspeed': None, 'label_smoothing_factor': 0.0, 'optim': 'adamw_torch', 'optim_args': None, 'adafactor': False, 'group_by_length': True, 'length_column_name': 'length', 'report_to': ['tensorboard', 'wandb'], 'ddp_find_unused_parameters': None, 'ddp_bucket_cap_mb': None, 'ddp_broadcast_buffers': None, 'dataloader_pin_memory': True, 'dataloader_persistent_workers': False, 'skip_memory_metrics': True, 
'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': None, 'hub_model_id': None, 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'hub_private_repo': False, 'hub_always_push': False, 'gradient_checkpointing': True, 'gradient_checkpointing_kwargs': None, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': None, 'push_to_hub_organization': None, 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': None, 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': None, 'torch_compile_mode': None, 'dispatch_batches': None, 'split_batches': None, 'include_tokens_per_second': False, 'include_num_input_tokens_seen': False, 'neftune_noise_alpha': None, 'optim_target_modules': None}
- 2024-05-24 20:28:04,489 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:28:04,489 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:28:58,368 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:28:59,625 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:28:59,625 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:29:08,949 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:29:08,951 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:29:08,951 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:29:12,141 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:29:16,856 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:29:16,857 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:29:20,764 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:29:24,478 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:29:24,478 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:29:30,221 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:30:45,072 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:30:45,072 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:30:58,707 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:30:58,709 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:30:58,710 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:31:02,188 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:31:02,613 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:31:02,613 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:31:04,725 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:31:04,727 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:31:04,727 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:31:07,065 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:31:07,298 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:31:07,298 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:31:10,187 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:31:10,193 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:31:10,193 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:31:12,404 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:31:50,143 INFO MainThread:34 [wandb_run.py:_config_callback():1347] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': None, 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'chunk_size_feed_forward': 0, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 0, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-base', 'transformers_version': '4.39.3', 'freeze_feat_extract_train': True, 'mask_channel_length': 10, 'mask_channel_min_space': 1, 'mask_channel_other': 0.0, 'mask_channel_prob': 0.0, 'mask_channel_selection': 'static', 'mask_time_min_space': 1, 'mask_time_other': 0.0, 'mask_time_selection': 'static', 'model_type': 'wav2vec2', 'no_mask_channel_overlap': False, 'no_mask_time_overlap': False, 'num_feat_extract_layers': 7, 'hidden_size': 768, 'feat_extract_norm': 'group', 
'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': False, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 12, 'intermediate_size': 3072, 'hidden_act': 'gelu', 'num_attention_heads': 12, 'hidden_dropout': 0.1, 'attention_dropout': 0.1, 'activation_dropout': 0.0, 'feat_proj_dropout': 0.1, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 32, 'do_stable_layer_norm': False, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.05, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.0, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 256, 'proj_codevector_dim': 256, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'sum', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 768, 'adapter_attn_dim': None, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': '/kaggle/working/', 'overwrite_output_dir': False, 'do_train': False, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 32, 'per_device_eval_batch_size': 8, 'per_gpu_train_batch_size': None, 'per_gpu_eval_batch_size': None, 'gradient_accumulation_steps': 1, 'eval_accumulation_steps': None, 'eval_delay': 0, 'learning_rate': 0.0001, 'weight_decay': 0.005, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 30, 'max_steps': -1, 
'lr_scheduler_type': 'linear', 'lr_scheduler_kwargs': {}, 'warmup_ratio': 0.0, 'warmup_steps': 1000, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': '/kaggle/working/runs/May24_20-31-07_890e8b3ca76b', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 500, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 2, 'save_safetensors': True, 'save_on_each_node': False, 'save_only_model': False, 'no_cuda': False, 'use_cpu': False, 'use_mps_device': False, 'seed': 42, 'data_seed': None, 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': None, 'local_rank': 0, 'ddp_backend': None, 'tpu_num_cores': None, 'tpu_metrics_debug': False, 'debug': [], 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'dataloader_prefetch_factor': None, 'past_index': -1, 'run_name': '/kaggle/working/', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': None, 'load_best_model_at_end': False, 'metric_for_best_model': None, 'greater_is_better': None, 'ignore_data_skip': False, 'fsdp': [], 'fsdp_min_num_params': 0, 'fsdp_config': {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, 'fsdp_transformer_layer_cls_to_wrap': None, 'accelerator_config': {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True}, 'deepspeed': None, 'label_smoothing_factor': 0.0, 'optim': 'adamw_torch', 'optim_args': None, 'adafactor': False, 'group_by_length': True, 'length_column_name': 'length', 'report_to': ['tensorboard', 'wandb'], 'ddp_find_unused_parameters': None, 'ddp_bucket_cap_mb': None, 'ddp_broadcast_buffers': None, 'dataloader_pin_memory': True, 'dataloader_persistent_workers': False, 'skip_memory_metrics': True, 
'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': None, 'hub_model_id': None, 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'hub_private_repo': False, 'hub_always_push': False, 'gradient_checkpointing': True, 'gradient_checkpointing_kwargs': None, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': None, 'push_to_hub_organization': None, 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': None, 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': None, 'torch_compile_mode': None, 'dispatch_batches': None, 'split_batches': None, 'include_tokens_per_second': False, 'include_num_input_tokens_seen': False, 'neftune_noise_alpha': None, 'optim_target_modules': None}
- 2024-05-24 20:31:58,626 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:31:58,626 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:34:06,571 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:34:06,717 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:34:06,717 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:34:10,001 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:34:46,464 INFO MainThread:34 [wandb_run.py:_config_callback():1347] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': None, 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'chunk_size_feed_forward': 0, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 0, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-base', 'transformers_version': '4.39.3', 'freeze_feat_extract_train': True, 'mask_channel_length': 10, 'mask_channel_min_space': 1, 'mask_channel_other': 0.0, 'mask_channel_prob': 0.0, 'mask_channel_selection': 'static', 'mask_time_min_space': 1, 'mask_time_other': 0.0, 'mask_time_selection': 'static', 'model_type': 'wav2vec2', 'no_mask_channel_overlap': False, 'no_mask_time_overlap': False, 'num_feat_extract_layers': 7, 'hidden_size': 768, 'feat_extract_norm': 'group', 
'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': False, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 12, 'intermediate_size': 3072, 'hidden_act': 'gelu', 'num_attention_heads': 12, 'hidden_dropout': 0.1, 'attention_dropout': 0.1, 'activation_dropout': 0.0, 'feat_proj_dropout': 0.1, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 32, 'do_stable_layer_norm': False, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.05, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.0, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 256, 'proj_codevector_dim': 256, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'sum', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 768, 'adapter_attn_dim': None, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': '/kaggle/working/', 'overwrite_output_dir': False, 'do_train': False, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 64, 'per_device_eval_batch_size': 8, 'per_gpu_train_batch_size': None, 'per_gpu_eval_batch_size': None, 'gradient_accumulation_steps': 1, 'eval_accumulation_steps': None, 'eval_delay': 0, 'learning_rate': 0.0001, 'weight_decay': 0.005, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 30, 'max_steps': -1, 
'lr_scheduler_type': 'linear', 'lr_scheduler_kwargs': {}, 'warmup_ratio': 0.0, 'warmup_steps': 1000, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': '/kaggle/working/runs/May24_20-34-06_890e8b3ca76b', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 500, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 2, 'save_safetensors': True, 'save_on_each_node': False, 'save_only_model': False, 'no_cuda': False, 'use_cpu': False, 'use_mps_device': False, 'seed': 42, 'data_seed': None, 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': None, 'local_rank': 0, 'ddp_backend': None, 'tpu_num_cores': None, 'tpu_metrics_debug': False, 'debug': [], 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'dataloader_prefetch_factor': None, 'past_index': -1, 'run_name': '/kaggle/working/', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': None, 'load_best_model_at_end': False, 'metric_for_best_model': None, 'greater_is_better': None, 'ignore_data_skip': False, 'fsdp': [], 'fsdp_min_num_params': 0, 'fsdp_config': {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, 'fsdp_transformer_layer_cls_to_wrap': None, 'accelerator_config': {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True}, 'deepspeed': None, 'label_smoothing_factor': 0.0, 'optim': 'adamw_torch', 'optim_args': None, 'adafactor': False, 'group_by_length': True, 'length_column_name': 'length', 'report_to': ['tensorboard', 'wandb'], 'ddp_find_unused_parameters': None, 'ddp_bucket_cap_mb': None, 'ddp_broadcast_buffers': None, 'dataloader_pin_memory': True, 'dataloader_persistent_workers': False, 'skip_memory_metrics': True, 
'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': None, 'hub_model_id': None, 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'hub_private_repo': False, 'hub_always_push': False, 'gradient_checkpointing': True, 'gradient_checkpointing_kwargs': None, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': None, 'push_to_hub_organization': None, 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': None, 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': None, 'torch_compile_mode': None, 'dispatch_batches': None, 'split_batches': None, 'include_tokens_per_second': False, 'include_num_input_tokens_seen': False, 'neftune_noise_alpha': None, 'optim_target_modules': None}
- 2024-05-24 20:34:55,928 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:34:55,929 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:35:26,131 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:35:26,139 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:35:26,139 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:35:32,729 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:35:32,732 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:35:32,732 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:35:34,729 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:35:34,732 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:35:34,733 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:35:47,483 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:35:47,488 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:35:47,488 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:35:56,249 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:35:56,253 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:35:56,253 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:37:20,366 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:37:20,370 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:37:20,370 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:37:28,481 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:37:28,510 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:37:28,511 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:38:00,946 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:38:00,950 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:38:00,950 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:38:04,524 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:38:04,550 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:38:04,550 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:38:12,369 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:38:12,432 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:38:12,432 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:38:16,938 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:38:16,998 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:38:16,998 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:38:21,248 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:38:21,260 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:38:21,260 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:38:22,930 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:38:22,940 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:38:22,940 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:38:37,851 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:38:55,254 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:38:55,254 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:38:58,788 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:38:58,827 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:38:58,827 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
121
- 2024-05-24 20:39:00,762 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
122
- 2024-05-24 20:40:16,333 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
123
- 2024-05-24 20:40:16,334 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
124
- 2024-05-24 20:40:21,472 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
125
- 2024-05-24 20:40:22,682 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
126
- 2024-05-24 20:40:22,682 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
127
- 2024-05-24 20:40:28,979 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
128
- 2024-05-24 20:40:33,961 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
129
- 2024-05-24 20:40:33,961 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
130
- 2024-05-24 20:40:36,994 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
131
- 2024-05-24 20:40:36,996 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
132
- 2024-05-24 20:40:36,997 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
133
- 2024-05-24 20:40:42,203 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
134
- 2024-05-24 20:40:42,621 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
135
- 2024-05-24 20:40:42,621 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
136
- 2024-05-24 20:40:44,473 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
137
- 2024-05-24 20:40:44,475 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
138
- 2024-05-24 20:40:44,475 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
139
- 2024-05-24 20:40:48,431 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
140
- 2024-05-24 20:40:48,669 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
141
- 2024-05-24 20:40:48,669 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
142
- 2024-05-24 20:40:50,725 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
143
- 2024-05-24 20:40:50,730 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
144
- 2024-05-24 20:40:50,730 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
145
- 2024-05-24 20:40:52,537 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
146
- 2024-05-24 20:41:30,450 INFO MainThread:34 [wandb_run.py:_config_callback():1347] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': None, 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'chunk_size_feed_forward': 0, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 0, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-base', 'transformers_version': '4.39.3', 'freeze_feat_extract_train': True, 'mask_channel_length': 10, 'mask_channel_min_space': 1, 'mask_channel_other': 0.0, 'mask_channel_prob': 0.0, 'mask_channel_selection': 'static', 'mask_time_min_space': 1, 'mask_time_other': 0.0, 'mask_time_selection': 'static', 'model_type': 'wav2vec2', 'no_mask_channel_overlap': False, 'no_mask_time_overlap': False, 'num_feat_extract_layers': 7, 'hidden_size': 768, 'feat_extract_norm': 'group', 
'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': False, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 12, 'intermediate_size': 3072, 'hidden_act': 'gelu', 'num_attention_heads': 12, 'hidden_dropout': 0.1, 'attention_dropout': 0.1, 'activation_dropout': 0.0, 'feat_proj_dropout': 0.1, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 32, 'do_stable_layer_norm': False, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.05, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.0, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 256, 'proj_codevector_dim': 256, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'sum', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 768, 'adapter_attn_dim': None, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': '/kaggle/working/', 'overwrite_output_dir': False, 'do_train': False, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 64, 'per_device_eval_batch_size': 8, 'per_gpu_train_batch_size': None, 'per_gpu_eval_batch_size': None, 'gradient_accumulation_steps': 1, 'eval_accumulation_steps': None, 'eval_delay': 0, 'learning_rate': 0.0001, 'weight_decay': 0.005, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 30, 'max_steps': -1, 
'lr_scheduler_type': 'linear', 'lr_scheduler_kwargs': {}, 'warmup_ratio': 0.0, 'warmup_steps': 1000, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': '/kaggle/working/runs/May24_20-40-48_890e8b3ca76b', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 500, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 2, 'save_safetensors': True, 'save_on_each_node': False, 'save_only_model': False, 'no_cuda': False, 'use_cpu': False, 'use_mps_device': False, 'seed': 42, 'data_seed': None, 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': None, 'local_rank': 0, 'ddp_backend': None, 'tpu_num_cores': None, 'tpu_metrics_debug': False, 'debug': [], 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'dataloader_prefetch_factor': None, 'past_index': -1, 'run_name': '/kaggle/working/', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': None, 'load_best_model_at_end': False, 'metric_for_best_model': None, 'greater_is_better': None, 'ignore_data_skip': False, 'fsdp': [], 'fsdp_min_num_params': 0, 'fsdp_config': {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, 'fsdp_transformer_layer_cls_to_wrap': None, 'accelerator_config': {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True}, 'deepspeed': None, 'label_smoothing_factor': 0.0, 'optim': 'adamw_torch', 'optim_args': None, 'adafactor': False, 'group_by_length': True, 'length_column_name': 'length', 'report_to': ['tensorboard', 'wandb'], 'ddp_find_unused_parameters': None, 'ddp_bucket_cap_mb': None, 'ddp_broadcast_buffers': None, 'dataloader_pin_memory': True, 'dataloader_persistent_workers': False, 'skip_memory_metrics': True, 
'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': None, 'hub_model_id': None, 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'hub_private_repo': False, 'hub_always_push': False, 'gradient_checkpointing': True, 'gradient_checkpointing_kwargs': None, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': None, 'push_to_hub_organization': None, 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': None, 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': None, 'torch_compile_mode': None, 'dispatch_batches': None, 'split_batches': None, 'include_tokens_per_second': False, 'include_num_input_tokens_seen': False, 'neftune_noise_alpha': None, 'optim_target_modules': None}
- 2024-05-24 20:41:39,043 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:41:39,044 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:42:26,950 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:42:27,128 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:42:27,128 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:42:29,871 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:43:07,276 INFO MainThread:34 [wandb_run.py:_config_callback():1347] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': None, 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'chunk_size_feed_forward': 0, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 0, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-base', 'transformers_version': '4.39.3', 'freeze_feat_extract_train': True, 'mask_channel_length': 10, 'mask_channel_min_space': 1, 'mask_channel_other': 0.0, 'mask_channel_prob': 0.0, 'mask_channel_selection': 'static', 'mask_time_min_space': 1, 'mask_time_other': 0.0, 'mask_time_selection': 'static', 'model_type': 'wav2vec2', 'no_mask_channel_overlap': False, 'no_mask_time_overlap': False, 'num_feat_extract_layers': 7, 'hidden_size': 768, 'feat_extract_norm': 'group', 
'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': False, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 12, 'intermediate_size': 3072, 'hidden_act': 'gelu', 'num_attention_heads': 12, 'hidden_dropout': 0.1, 'attention_dropout': 0.1, 'activation_dropout': 0.0, 'feat_proj_dropout': 0.1, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 32, 'do_stable_layer_norm': False, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.05, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.0, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 256, 'proj_codevector_dim': 256, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'sum', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 768, 'adapter_attn_dim': None, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': '/kaggle/working/', 'overwrite_output_dir': False, 'do_train': False, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 16, 'per_device_eval_batch_size': 8, 'per_gpu_train_batch_size': None, 'per_gpu_eval_batch_size': None, 'gradient_accumulation_steps': 1, 'eval_accumulation_steps': None, 'eval_delay': 0, 'learning_rate': 0.0001, 'weight_decay': 0.005, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 30, 'max_steps': -1, 
'lr_scheduler_type': 'linear', 'lr_scheduler_kwargs': {}, 'warmup_ratio': 0.0, 'warmup_steps': 1000, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': '/kaggle/working/runs/May24_20-42-26_890e8b3ca76b', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 500, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 2, 'save_safetensors': True, 'save_on_each_node': False, 'save_only_model': False, 'no_cuda': False, 'use_cpu': False, 'use_mps_device': False, 'seed': 42, 'data_seed': None, 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': None, 'local_rank': 0, 'ddp_backend': None, 'tpu_num_cores': None, 'tpu_metrics_debug': False, 'debug': [], 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'dataloader_prefetch_factor': None, 'past_index': -1, 'run_name': '/kaggle/working/', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': None, 'load_best_model_at_end': False, 'metric_for_best_model': None, 'greater_is_better': None, 'ignore_data_skip': False, 'fsdp': [], 'fsdp_min_num_params': 0, 'fsdp_config': {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, 'fsdp_transformer_layer_cls_to_wrap': None, 'accelerator_config': {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True}, 'deepspeed': None, 'label_smoothing_factor': 0.0, 'optim': 'adamw_torch', 'optim_args': None, 'adafactor': False, 'group_by_length': True, 'length_column_name': 'length', 'report_to': ['tensorboard', 'wandb'], 'ddp_find_unused_parameters': None, 'ddp_bucket_cap_mb': None, 'ddp_broadcast_buffers': None, 'dataloader_pin_memory': True, 'dataloader_persistent_workers': False, 'skip_memory_metrics': True, 
'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': None, 'hub_model_id': None, 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'hub_private_repo': False, 'hub_always_push': False, 'gradient_checkpointing': True, 'gradient_checkpointing_kwargs': None, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': None, 'push_to_hub_organization': None, 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': None, 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': None, 'torch_compile_mode': None, 'dispatch_batches': None, 'split_batches': None, 'include_tokens_per_second': False, 'include_num_input_tokens_seen': False, 'neftune_noise_alpha': None, 'optim_target_modules': None}
- 2024-05-24 20:43:12,247 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:43:12,247 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:44:08,643 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:44:14,294 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:44:14,295 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:44:39,719 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:44:39,770 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:44:39,770 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:44:43,543 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:46:31,143 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:46:31,143 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:46:49,361 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:46:49,391 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:46:49,391 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:46:52,169 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:46:54,391 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:46:54,391 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:46:58,317 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:46:58,322 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:46:58,322 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:47:06,482 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:47:07,830 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:47:07,830 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:47:10,833 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:47:10,835 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:47:10,839 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:47:14,002 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:48:02,460 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:48:02,460 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:48:05,731 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:48:05,733 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:48:05,734 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:48:08,287 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:48:08,714 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:48:08,714 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:48:10,290 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:48:10,293 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:48:10,294 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:48:13,841 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:48:14,091 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:48:14,092 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:48:16,125 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:48:40,373 INFO MainThread:34 [wandb_run.py:_config_callback():1347] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': None, 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'chunk_size_feed_forward': 0, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 0, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-base', 'transformers_version': '4.39.3', 'freeze_feat_extract_train': True, 'mask_channel_length': 10, 'mask_channel_min_space': 1, 'mask_channel_other': 0.0, 'mask_channel_prob': 0.0, 'mask_channel_selection': 'static', 'mask_time_min_space': 1, 'mask_time_other': 0.0, 'mask_time_selection': 'static', 'model_type': 'wav2vec2', 'no_mask_channel_overlap': False, 'no_mask_time_overlap': False, 'num_feat_extract_layers': 7, 'hidden_size': 768, 'feat_extract_norm': 'group', 
'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': False, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 12, 'intermediate_size': 3072, 'hidden_act': 'gelu', 'num_attention_heads': 12, 'hidden_dropout': 0.1, 'attention_dropout': 0.1, 'activation_dropout': 0.0, 'feat_proj_dropout': 0.1, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 32, 'do_stable_layer_norm': False, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.05, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.0, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 256, 'proj_codevector_dim': 256, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'sum', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 768, 'adapter_attn_dim': None, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': '/kaggle/working/', 'overwrite_output_dir': False, 'do_train': False, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 16, 'per_device_eval_batch_size': 8, 'per_gpu_train_batch_size': None, 'per_gpu_eval_batch_size': None, 'gradient_accumulation_steps': 1, 'eval_accumulation_steps': None, 'eval_delay': 0, 'learning_rate': 0.0001, 'weight_decay': 0.005, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 30, 'max_steps': -1, 
'lr_scheduler_type': 'linear', 'lr_scheduler_kwargs': {}, 'warmup_ratio': 0.0, 'warmup_steps': 1000, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': '/kaggle/working/runs/May24_20-48-13_890e8b3ca76b', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 500, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 2, 'save_safetensors': True, 'save_on_each_node': False, 'save_only_model': False, 'no_cuda': False, 'use_cpu': False, 'use_mps_device': False, 'seed': 42, 'data_seed': None, 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': None, 'local_rank': 0, 'ddp_backend': None, 'tpu_num_cores': None, 'tpu_metrics_debug': False, 'debug': [], 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'dataloader_prefetch_factor': None, 'past_index': -1, 'run_name': '/kaggle/working/', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': None, 'load_best_model_at_end': False, 'metric_for_best_model': None, 'greater_is_better': None, 'ignore_data_skip': False, 'fsdp': [], 'fsdp_min_num_params': 0, 'fsdp_config': {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, 'fsdp_transformer_layer_cls_to_wrap': None, 'accelerator_config': {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True}, 'deepspeed': None, 'label_smoothing_factor': 0.0, 'optim': 'adamw_torch', 'optim_args': None, 'adafactor': False, 'group_by_length': True, 'length_column_name': 'length', 'report_to': ['tensorboard', 'wandb'], 'ddp_find_unused_parameters': None, 'ddp_bucket_cap_mb': None, 'ddp_broadcast_buffers': None, 'dataloader_pin_memory': True, 'dataloader_persistent_workers': False, 'skip_memory_metrics': True, 
'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': None, 'hub_model_id': None, 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'hub_private_repo': False, 'hub_always_push': False, 'gradient_checkpointing': True, 'gradient_checkpointing_kwargs': None, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': None, 'push_to_hub_organization': None, 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': None, 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': None, 'torch_compile_mode': None, 'dispatch_batches': None, 'split_batches': None, 'include_tokens_per_second': False, 'include_num_input_tokens_seen': False, 'neftune_noise_alpha': None, 'optim_target_modules': None}
- 2024-05-24 20:48:42,631 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:48:42,632 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:49:00,381 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:49:00,425 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:49:00,425 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:49:03,721 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:49:03,725 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:49:03,726 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:49:08,089 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:49:08,455 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:49:08,456 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:51:52,779 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:51:52,781 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:51:52,781 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:51:56,593 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:52:22,759 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:52:22,759 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:52:49,570 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
- 2024-05-24 20:52:49,572 INFO MainThread:34 [jupyter.py:save_ipynb():373] not saving jupyter notebook
- 2024-05-24 20:52:49,572 INFO MainThread:34 [wandb_init.py:_pause_backend():432] pausing backend
- 2024-05-24 20:52:52,107 INFO MainThread:34 [wandb_init.py:_resume_backend():437] resuming backend
 
1
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Current SDK version is 0.16.6
2
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Configure stats pid to 34
3
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Loading settings from /root/.config/wandb/settings
4
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Loading settings from /kaggle/working/wandb/settings
5
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Loading settings from environment variables: {}
6
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program': '<python with no main file>'}
7
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Applying login settings: {'api_key': '***REDACTED***'}
8
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Applying login settings: {'api_key': '***REDACTED***'}
9
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Applying login settings: {}
10
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_init.py:_log_setup():521] Logging user logs to /kaggle/working/wandb/run-20240525_175010-8ah63pdc/logs/debug.log
11
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_init.py:_log_setup():522] Logging internal logs to /kaggle/working/wandb/run-20240525_175010-8ah63pdc/logs/debug-internal.log
12
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_init.py:_jupyter_setup():467] configuring jupyter hooks <wandb.sdk.wandb_init._WandbInit object at 0x7fbe99ebfe20>
13
+ 2024-05-25 17:50:10,435 INFO MainThread:34 [wandb_init.py:init():561] calling init triggers
14
+ 2024-05-25 17:50:10,435 INFO MainThread:34 [wandb_init.py:init():568] wandb.init called with sweep_config: {}
15
  config: {}
16
+ 2024-05-25 17:50:10,435 INFO MainThread:34 [wandb_init.py:init():611] starting backend
17
+ 2024-05-25 17:50:10,435 INFO MainThread:34 [wandb_init.py:init():615] setting up manager
18
+ 2024-05-25 17:50:10,437 INFO MainThread:34 [backend.py:_multiprocessing_setup():105] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
19
+ 2024-05-25 17:50:10,439 INFO MainThread:34 [wandb_init.py:init():623] backend started and connected
20
+ 2024-05-25 17:50:10,450 INFO MainThread:34 [wandb_run.py:_label_probe_notebook():1299] probe notebook
21
+ 2024-05-25 17:50:11,144 INFO MainThread:34 [wandb_init.py:init():715] updated telemetry
22
+ 2024-05-25 17:50:11,147 INFO MainThread:34 [wandb_init.py:init():748] communicating run to backend with 90.0 second timeout
23
+ 2024-05-25 17:50:11,381 INFO MainThread:34 [wandb_run.py:_on_init():2357] communicating current version
24
+ 2024-05-25 17:50:11,442 INFO MainThread:34 [wandb_run.py:_on_init():2366] got version response upgrade_message: "wandb version 0.17.0 is available! To upgrade, please run:\n $ pip install wandb --upgrade"
25
 
26
+ 2024-05-25 17:50:11,444 INFO MainThread:34 [wandb_init.py:init():799] starting run threads in backend
27
+ 2024-05-25 17:50:27,456 INFO MainThread:34 [wandb_run.py:_console_start():2335] atexit reg
28
+ 2024-05-25 17:50:27,456 INFO MainThread:34 [wandb_run.py:_redirect():2190] redirect: wrap_raw
29
+ 2024-05-25 17:50:27,456 INFO MainThread:34 [wandb_run.py:_redirect():2255] Wrapping output streams.
30
+ 2024-05-25 17:50:27,456 INFO MainThread:34 [wandb_run.py:_redirect():2280] Redirects installed.
31
+ 2024-05-25 17:50:27,457 INFO MainThread:34 [wandb_init.py:init():842] run started, returning control to user process
32
+ 2024-05-25 17:50:27,464 INFO MainThread:34 [wandb_run.py:_config_callback():1347] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': None, 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'chunk_size_feed_forward': 0, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 0, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-base', 'transformers_version': '4.39.3', 'freeze_feat_extract_train': True, 'mask_channel_length': 10, 'mask_channel_min_space': 1, 'mask_channel_other': 0.0, 'mask_channel_prob': 0.0, 'mask_channel_selection': 'static', 'mask_time_min_space': 1, 'mask_time_other': 0.0, 'mask_time_selection': 'static', 'model_type': 'wav2vec2', 'no_mask_channel_overlap': False, 'no_mask_time_overlap': False, 'num_feat_extract_layers': 7, 'hidden_size': 768, 'feat_extract_norm': 'group', 'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': False, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 12, 'intermediate_size': 3072, 'hidden_act': 'gelu', 'num_attention_heads': 12, 'hidden_dropout': 0.1, 'attention_dropout': 0.1, 'activation_dropout': 0.0, 'feat_proj_dropout': 0.1, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 100000000000000, 'do_stable_layer_norm': False, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.05, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.0, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 256, 'proj_codevector_dim': 256, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'sum', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 768, 'adapter_attn_dim': None, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': '/kaggle/working/', 'overwrite_output_dir': False, 'do_train': False, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 8, 'per_device_eval_batch_size': 8, 'per_gpu_train_batch_size': None, 'per_gpu_eval_batch_size': None, 'gradient_accumulation_steps': 1, 'eval_accumulation_steps': None, 'eval_delay': 0, 'learning_rate': 0.0001, 'weight_decay': 0.005, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 30, 'max_steps': -1, 'lr_scheduler_type': 'linear', 'lr_scheduler_kwargs': {}, 'warmup_ratio': 0.0, 'warmup_steps': 1000, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': '/kaggle/working/runs/May25_17-49-41_8f1fad5fe1d2', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 500, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 2, 'save_safetensors': True, 'save_on_each_node': False, 'save_only_model': False, 'no_cuda': False, 'use_cpu': False, 'use_mps_device': False, 'seed': 42, 'data_seed': None, 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': None, 'local_rank': 0, 'ddp_backend': None, 'tpu_num_cores': None, 'tpu_metrics_debug': False, 'debug': [], 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'dataloader_prefetch_factor': None, 'past_index': -1, 'run_name': '/kaggle/working/', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': None, 'load_best_model_at_end': False, 'metric_for_best_model': None, 'greater_is_better': None, 'ignore_data_skip': False, 'fsdp': [], 'fsdp_min_num_params': 0, 'fsdp_config': {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, 'fsdp_transformer_layer_cls_to_wrap': None, 'accelerator_config': {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True}, 'deepspeed': None, 'label_smoothing_factor': 0.0, 'optim': 'adamw_torch', 'optim_args': None, 'adafactor': False, 'group_by_length': True, 'length_column_name': 'length', 'report_to': ['tensorboard', 'wandb'], 'ddp_find_unused_parameters': None, 'ddp_bucket_cap_mb': None, 'ddp_broadcast_buffers': None, 'dataloader_pin_memory': True, 'dataloader_persistent_workers': False, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': None, 'hub_model_id': None, 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'hub_private_repo': False, 'hub_always_push': False, 'gradient_checkpointing': True, 'gradient_checkpointing_kwargs': None, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': None, 'push_to_hub_organization': None, 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': None, 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': None, 'torch_compile_mode': None, 'dispatch_batches': None, 'split_batches': None, 'include_tokens_per_second': False, 'include_num_input_tokens_seen': False, 'neftune_noise_alpha': None, 'optim_target_modules': None}
wandb/run-20240525_175010-8ah63pdc/files/conda-environment.yaml ADDED
File without changes
wandb/run-20240525_175010-8ah63pdc/files/config.yaml ADDED
@@ -0,0 +1,866 @@
1
+ wandb_version: 1
2
+
3
+ _wandb:
4
+ desc: null
5
+ value:
6
+ python_version: 3.10.13
7
+ cli_version: 0.16.6
8
+ framework: huggingface
9
+ huggingface_version: 4.39.3
10
+ is_jupyter_run: true
11
+ is_kaggle_kernel: true
12
+ start_time: 1716659410.0
13
+ t:
14
+ 1:
15
+ - 1
16
+ - 2
17
+ - 3
18
+ - 5
19
+ - 11
20
+ - 12
21
+ - 49
22
+ - 51
23
+ - 53
24
+ - 55
25
+ - 71
26
+ - 105
27
+ 2:
28
+ - 1
29
+ - 2
30
+ - 3
31
+ - 5
32
+ - 11
33
+ - 12
34
+ - 49
35
+ - 51
36
+ - 53
37
+ - 55
38
+ - 71
39
+ - 105
40
+ 3:
41
+ - 7
42
+ - 23
43
+ 4: 3.10.13
44
+ 5: 0.16.6
45
+ 6: 4.39.3
46
+ 8:
47
+ - 1
48
+ - 2
49
+ - 5
50
+ 9:
51
+ 1: transformers_trainer
52
+ 13: linux-x86_64
53
+ m:
54
+ - 1: train/global_step
55
+ 6:
56
+ - 3
57
+ - 1: train/loss
58
+ 5: 1
59
+ 6:
60
+ - 1
61
+ - 1: train/grad_norm
62
+ 5: 1
63
+ 6:
64
+ - 1
65
+ - 1: train/learning_rate
66
+ 5: 1
67
+ 6:
68
+ - 1
69
+ - 1: train/epoch
70
+ 5: 1
71
+ 6:
72
+ - 1
73
+ - 1: eval/loss
74
+ 5: 1
75
+ 6:
76
+ - 1
77
+ - 1: eval/wer
78
+ 5: 1
79
+ 6:
80
+ - 1
81
+ - 1: eval/runtime
82
+ 5: 1
83
+ 6:
84
+ - 1
85
+ - 1: eval/samples_per_second
86
+ 5: 1
87
+ 6:
88
+ - 1
89
+ - 1: eval/steps_per_second
90
+ 5: 1
91
+ 6:
92
+ - 1
93
+ return_dict:
94
+ desc: null
95
+ value: true
96
+ output_hidden_states:
97
+ desc: null
98
+ value: false
99
+ output_attentions:
100
+ desc: null
101
+ value: false
102
+ torchscript:
103
+ desc: null
104
+ value: false
105
+ torch_dtype:
106
+ desc: null
107
+ value: null
108
+ use_bfloat16:
109
+ desc: null
110
+ value: false
111
+ tf_legacy_loss:
112
+ desc: null
113
+ value: false
114
+ pruned_heads:
115
+ desc: null
116
+ value: {}
117
+ tie_word_embeddings:
118
+ desc: null
119
+ value: true
120
+ chunk_size_feed_forward:
121
+ desc: null
122
+ value: 0
123
+ is_encoder_decoder:
124
+ desc: null
125
+ value: false
126
+ is_decoder:
127
+ desc: null
128
+ value: false
129
+ cross_attention_hidden_size:
130
+ desc: null
131
+ value: null
132
+ add_cross_attention:
133
+ desc: null
134
+ value: false
135
+ tie_encoder_decoder:
136
+ desc: null
137
+ value: false
138
+ max_length:
139
+ desc: null
140
+ value: 20
141
+ min_length:
142
+ desc: null
143
+ value: 0
144
+ do_sample:
145
+ desc: null
146
+ value: false
147
+ early_stopping:
148
+ desc: null
149
+ value: false
150
+ num_beams:
151
+ desc: null
152
+ value: 1
153
+ num_beam_groups:
154
+ desc: null
155
+ value: 1
156
+ diversity_penalty:
157
+ desc: null
158
+ value: 0.0
159
+ temperature:
160
+ desc: null
161
+ value: 1.0
162
+ top_k:
163
+ desc: null
164
+ value: 50
165
+ top_p:
166
+ desc: null
167
+ value: 1.0
168
+ typical_p:
169
+ desc: null
170
+ value: 1.0
171
+ repetition_penalty:
172
+ desc: null
173
+ value: 1.0
174
+ length_penalty:
175
+ desc: null
176
+ value: 1.0
177
+ no_repeat_ngram_size:
178
+ desc: null
179
+ value: 0
180
+ encoder_no_repeat_ngram_size:
181
+ desc: null
182
+ value: 0
183
+ bad_words_ids:
184
+ desc: null
185
+ value: null
186
+ num_return_sequences:
187
+ desc: null
188
+ value: 1
189
+ output_scores:
190
+ desc: null
191
+ value: false
192
+ return_dict_in_generate:
193
+ desc: null
194
+ value: false
195
+ forced_bos_token_id:
196
+ desc: null
197
+ value: null
198
+ forced_eos_token_id:
199
+ desc: null
200
+ value: null
201
+ remove_invalid_values:
202
+ desc: null
203
+ value: false
204
+ exponential_decay_length_penalty:
205
+ desc: null
206
+ value: null
207
+ suppress_tokens:
208
+ desc: null
209
+ value: null
210
+ begin_suppress_tokens:
211
+ desc: null
212
+ value: null
213
+ architectures:
214
+ desc: null
215
+ value:
216
+ - Wav2Vec2ForPreTraining
217
+ finetuning_task:
218
+ desc: null
219
+ value: null
220
+ id2label:
221
+ desc: null
222
+ value:
223
+ '0': LABEL_0
224
+ '1': LABEL_1
225
+ label2id:
226
+ desc: null
227
+ value:
228
+ LABEL_0: 0
229
+ LABEL_1: 1
230
+ tokenizer_class:
231
+ desc: null
232
+ value: null
233
+ prefix:
234
+ desc: null
235
+ value: null
236
+ bos_token_id:
237
+ desc: null
238
+ value: 1
239
+ pad_token_id:
240
+ desc: null
241
+ value: 0
242
+ eos_token_id:
243
+ desc: null
244
+ value: 2
245
+ sep_token_id:
246
+ desc: null
247
+ value: null
248
+ decoder_start_token_id:
249
+ desc: null
250
+ value: null
251
+ task_specific_params:
252
+ desc: null
253
+ value: null
254
+ problem_type:
255
+ desc: null
256
+ value: null
257
+ _name_or_path:
258
+ desc: null
259
+ value: facebook/wav2vec2-base
260
+ transformers_version:
261
+ desc: null
262
+ value: 4.39.3
263
+ freeze_feat_extract_train:
264
+ desc: null
265
+ value: true
266
+ mask_channel_length:
267
+ desc: null
268
+ value: 10
269
+ mask_channel_min_space:
270
+ desc: null
271
+ value: 1
272
+ mask_channel_other:
273
+ desc: null
274
+ value: 0.0
275
+ mask_channel_prob:
276
+ desc: null
277
+ value: 0.0
278
+ mask_channel_selection:
279
+ desc: null
280
+ value: static
281
+ mask_time_min_space:
282
+ desc: null
283
+ value: 1
284
+ mask_time_other:
285
+ desc: null
286
+ value: 0.0
287
+ mask_time_selection:
288
+ desc: null
289
+ value: static
290
+ model_type:
291
+ desc: null
292
+ value: wav2vec2
293
+ no_mask_channel_overlap:
294
+ desc: null
295
+ value: false
296
+ no_mask_time_overlap:
297
+ desc: null
298
+ value: false
299
+ num_feat_extract_layers:
300
+ desc: null
301
+ value: 7
302
+ hidden_size:
303
+ desc: null
304
+ value: 768
305
+ feat_extract_norm:
306
+ desc: null
307
+ value: group
308
+ feat_extract_activation:
309
+ desc: null
310
+ value: gelu
311
+ conv_dim:
312
+ desc: null
313
+ value:
314
+ - 512
315
+ - 512
316
+ - 512
317
+ - 512
318
+ - 512
319
+ - 512
320
+ - 512
321
+ conv_stride:
322
+ desc: null
323
+ value:
324
+ - 5
325
+ - 2
326
+ - 2
327
+ - 2
328
+ - 2
329
+ - 2
330
+ - 2
331
+ conv_kernel:
332
+ desc: null
333
+ value:
334
+ - 10
335
+ - 3
336
+ - 3
337
+ - 3
338
+ - 3
339
+ - 2
340
+ - 2
341
+ conv_bias:
342
+ desc: null
343
+ value: false
344
+ num_conv_pos_embeddings:
345
+ desc: null
346
+ value: 128
347
+ num_conv_pos_embedding_groups:
348
+ desc: null
349
+ value: 16
350
+ num_hidden_layers:
351
+ desc: null
352
+ value: 12
353
+ intermediate_size:
354
+ desc: null
355
+ value: 3072
356
+ hidden_act:
357
+ desc: null
358
+ value: gelu
359
+ num_attention_heads:
360
+ desc: null
361
+ value: 12
362
+ hidden_dropout:
363
+ desc: null
364
+ value: 0.1
365
+ attention_dropout:
366
+ desc: null
367
+ value: 0.1
368
+ activation_dropout:
369
+ desc: null
370
+ value: 0.0
371
+ feat_proj_dropout:
372
+ desc: null
373
+ value: 0.1
374
+ final_dropout:
375
+ desc: null
376
+ value: 0.0
377
+ layerdrop:
378
+ desc: null
379
+ value: 0.0
380
+ layer_norm_eps:
381
+ desc: null
382
+ value: 1.0e-05
383
+ initializer_range:
384
+ desc: null
385
+ value: 0.02
386
+ vocab_size:
387
+ desc: null
388
+ value: 100000000000000
389
+ do_stable_layer_norm:
390
+ desc: null
391
+ value: false
392
+ use_weighted_layer_sum:
393
+ desc: null
394
+ value: false
395
+ apply_spec_augment:
396
+ desc: null
397
+ value: true
398
+ mask_time_prob:
399
+ desc: null
400
+ value: 0.05
401
+ mask_time_length:
402
+ desc: null
403
+ value: 10
404
+ mask_time_min_masks:
405
+ desc: null
406
+ value: 2
407
+ mask_feature_prob:
408
+ desc: null
409
+ value: 0.0
410
+ mask_feature_length:
411
+ desc: null
412
+ value: 10
413
+ mask_feature_min_masks:
414
+ desc: null
415
+ value: 0
416
+ num_codevectors_per_group:
417
+ desc: null
418
+ value: 320
419
+ num_codevector_groups:
420
+ desc: null
421
+ value: 2
422
+ contrastive_logits_temperature:
423
+ desc: null
424
+ value: 0.1
425
+ feat_quantizer_dropout:
426
+ desc: null
427
+ value: 0.0
428
+ num_negatives:
429
+ desc: null
430
+ value: 100
431
+ codevector_dim:
432
+ desc: null
433
+ value: 256
434
+ proj_codevector_dim:
435
+ desc: null
436
+ value: 256
437
+ diversity_loss_weight:
438
+ desc: null
439
+ value: 0.1
440
+ ctc_loss_reduction:
441
+ desc: null
442
+ value: sum
443
+ ctc_zero_infinity:
444
+ desc: null
445
+ value: false
446
+ add_adapter:
447
+ desc: null
448
+ value: false
449
+ adapter_kernel_size:
450
+ desc: null
451
+ value: 3
452
+ adapter_stride:
453
+ desc: null
454
+ value: 2
455
+ num_adapter_layers:
456
+ desc: null
457
+ value: 3
458
+ output_hidden_size:
459
+ desc: null
460
+ value: 768
461
+ adapter_attn_dim:
462
+ desc: null
463
+ value: null
464
+ classifier_proj_size:
465
+ desc: null
466
+ value: 256
467
+ tdnn_dim:
468
+ desc: null
469
+ value:
470
+ - 512
471
+ - 512
472
+ - 512
473
+ - 512
474
+ - 1500
475
+ tdnn_kernel:
476
+ desc: null
477
+ value:
478
+ - 5
479
+ - 3
480
+ - 3
481
+ - 1
482
+ - 1
483
+ tdnn_dilation:
484
+ desc: null
485
+ value:
486
+ - 1
487
+ - 2
488
+ - 3
489
+ - 1
490
+ - 1
491
+ xvector_output_dim:
492
+ desc: null
493
+ value: 512
494
+ output_dir:
495
+ desc: null
496
+ value: /kaggle/working/
497
+ overwrite_output_dir:
498
+ desc: null
499
+ value: false
500
+ do_train:
501
+ desc: null
502
+ value: false
503
+ do_eval:
504
+ desc: null
505
+ value: true
506
+ do_predict:
507
+ desc: null
508
+ value: false
509
+ evaluation_strategy:
510
+ desc: null
511
+ value: steps
512
+ prediction_loss_only:
513
+ desc: null
514
+ value: false
515
+ per_device_train_batch_size:
516
+ desc: null
517
+ value: 8
518
+ per_device_eval_batch_size:
519
+ desc: null
520
+ value: 8
521
+ per_gpu_train_batch_size:
522
+ desc: null
523
+ value: null
524
+ per_gpu_eval_batch_size:
525
+ desc: null
526
+ value: null
527
+ gradient_accumulation_steps:
528
+ desc: null
529
+ value: 1
530
+ eval_accumulation_steps:
531
+ desc: null
532
+ value: null
533
+ eval_delay:
534
+ desc: null
535
+ value: 0
536
+ learning_rate:
537
+ desc: null
538
+ value: 0.0001
539
+ weight_decay:
540
+ desc: null
541
+ value: 0.005
542
+ adam_beta1:
543
+ desc: null
544
+ value: 0.9
545
+ adam_beta2:
546
+ desc: null
547
+ value: 0.999
548
+ adam_epsilon:
549
+ desc: null
550
+ value: 1.0e-08
551
+ max_grad_norm:
552
+ desc: null
553
+ value: 1.0
554
+ num_train_epochs:
555
+ desc: null
556
+ value: 30
557
+ max_steps:
558
+ desc: null
559
+ value: -1
560
+ lr_scheduler_type:
561
+ desc: null
562
+ value: linear
563
+ lr_scheduler_kwargs:
564
+ desc: null
565
+ value: {}
566
+ warmup_ratio:
567
+ desc: null
568
+ value: 0.0
569
+ warmup_steps:
570
+ desc: null
571
+ value: 1000
572
+ log_level:
573
+ desc: null
574
+ value: passive
575
+ log_level_replica:
576
+ desc: null
577
+ value: warning
578
+ log_on_each_node:
579
+ desc: null
580
+ value: true
581
+ logging_dir:
582
+ desc: null
583
+ value: /kaggle/working/runs/May25_17-49-41_8f1fad5fe1d2
584
+ logging_strategy:
585
+ desc: null
586
+ value: steps
587
+ logging_first_step:
588
+ desc: null
589
+ value: false
590
+ logging_steps:
591
+ desc: null
592
+ value: 500
593
+ logging_nan_inf_filter:
594
+ desc: null
595
+ value: true
596
+ save_strategy:
597
+ desc: null
598
+ value: steps
599
+ save_steps:
600
+ desc: null
601
+ value: 500
602
+ save_total_limit:
603
+ desc: null
604
+ value: 2
605
+ save_safetensors:
606
+ desc: null
607
+ value: true
608
+ save_on_each_node:
609
+ desc: null
610
+ value: false
611
+ save_only_model:
612
+ desc: null
613
+ value: false
614
+ no_cuda:
615
+ desc: null
616
+ value: false
617
+ use_cpu:
618
+ desc: null
619
+ value: false
620
+ use_mps_device:
621
+ desc: null
622
+ value: false
623
+ seed:
624
+ desc: null
625
+ value: 42
626
+ data_seed:
627
+ desc: null
628
+ value: null
629
+ jit_mode_eval:
630
+ desc: null
631
+ value: false
632
+ use_ipex:
633
+ desc: null
634
+ value: false
635
+ bf16:
636
+ desc: null
637
+ value: false
638
+ fp16:
639
+ desc: null
640
+ value: true
641
+ fp16_opt_level:
642
+ desc: null
643
+ value: O1
644
+ half_precision_backend:
645
+ desc: null
646
+ value: auto
647
+ bf16_full_eval:
648
+ desc: null
649
+ value: false
650
+ fp16_full_eval:
651
+ desc: null
652
+ value: false
653
+ tf32:
654
+ desc: null
655
+ value: null
656
+ local_rank:
657
+ desc: null
658
+ value: 0
659
+ ddp_backend:
660
+ desc: null
661
+ value: null
662
+ tpu_num_cores:
663
+ desc: null
664
+ value: null
665
+ tpu_metrics_debug:
666
+ desc: null
667
+ value: false
668
+ debug:
669
+ desc: null
670
+ value: []
671
+ dataloader_drop_last:
672
+ desc: null
673
+ value: false
674
+ eval_steps:
675
+ desc: null
676
+ value: 500
677
+ dataloader_num_workers:
678
+ desc: null
679
+ value: 0
680
+ dataloader_prefetch_factor:
681
+ desc: null
682
+ value: null
683
+ past_index:
684
+ desc: null
685
+ value: -1
686
+ run_name:
687
+ desc: null
688
+ value: /kaggle/working/
689
+ disable_tqdm:
690
+ desc: null
691
+ value: false
692
+ remove_unused_columns:
693
+ desc: null
694
+ value: true
695
+ label_names:
696
+ desc: null
697
+ value: null
698
+ load_best_model_at_end:
699
+ desc: null
700
+ value: false
701
+ metric_for_best_model:
702
+ desc: null
703
+ value: null
704
+ greater_is_better:
705
+ desc: null
706
+ value: null
707
+ ignore_data_skip:
708
+ desc: null
709
+ value: false
710
+ fsdp:
711
+ desc: null
712
+ value: []
713
+ fsdp_min_num_params:
714
+ desc: null
715
+ value: 0
716
+ fsdp_config:
717
+ desc: null
718
+ value:
719
+ min_num_params: 0
720
+ xla: false
721
+ xla_fsdp_v2: false
722
+ xla_fsdp_grad_ckpt: false
723
+ fsdp_transformer_layer_cls_to_wrap:
724
+ desc: null
725
+ value: null
726
+ accelerator_config:
727
+ desc: null
728
+ value:
729
+ split_batches: false
730
+ dispatch_batches: null
731
+ even_batches: true
732
+ use_seedable_sampler: true
733
+ deepspeed:
734
+ desc: null
735
+ value: null
736
+ label_smoothing_factor:
737
+ desc: null
738
+ value: 0.0
739
+ optim:
740
+ desc: null
741
+ value: adamw_torch
742
+ optim_args:
743
+ desc: null
744
+ value: null
745
+ adafactor:
746
+ desc: null
747
+ value: false
748
+ group_by_length:
749
+ desc: null
750
+ value: true
751
+ length_column_name:
752
+ desc: null
753
+ value: length
754
+ report_to:
755
+ desc: null
756
+ value:
757
+ - tensorboard
758
+ - wandb
759
+ ddp_find_unused_parameters:
760
+ desc: null
761
+ value: null
762
+ ddp_bucket_cap_mb:
763
+ desc: null
764
+ value: null
765
+ ddp_broadcast_buffers:
766
+ desc: null
767
+ value: null
768
+ dataloader_pin_memory:
769
+ desc: null
770
+ value: true
771
+ dataloader_persistent_workers:
772
+ desc: null
773
+ value: false
774
+ skip_memory_metrics:
775
+ desc: null
776
+ value: true
777
+ use_legacy_prediction_loop:
778
+ desc: null
779
+ value: false
780
+ push_to_hub:
781
+ desc: null
782
+ value: true
783
+ resume_from_checkpoint:
784
+ desc: null
785
+ value: null
786
+ hub_model_id:
787
+ desc: null
788
+ value: null
789
+ hub_strategy:
790
+ desc: null
791
+ value: every_save
792
+ hub_token:
793
+ desc: null
794
+ value: <HUB_TOKEN>
795
+ hub_private_repo:
796
+ desc: null
797
+ value: false
798
+ hub_always_push:
799
+ desc: null
800
+ value: false
801
+ gradient_checkpointing:
802
+ desc: null
803
+ value: true
804
+ gradient_checkpointing_kwargs:
805
+ desc: null
806
+ value: null
807
+ include_inputs_for_metrics:
808
+ desc: null
809
+ value: false
810
+ fp16_backend:
811
+ desc: null
812
+ value: auto
813
+ push_to_hub_model_id:
814
+ desc: null
815
+ value: null
816
+ push_to_hub_organization:
817
+ desc: null
818
+ value: null
819
+ push_to_hub_token:
820
+ desc: null
821
+ value: <PUSH_TO_HUB_TOKEN>
822
+ mp_parameters:
823
+ desc: null
824
+ value: ''
825
+ auto_find_batch_size:
826
+ desc: null
827
+ value: false
828
+ full_determinism:
829
+ desc: null
830
+ value: false
831
+ torchdynamo:
832
+ desc: null
833
+ value: null
834
+ ray_scope:
835
+ desc: null
836
+ value: last
837
+ ddp_timeout:
838
+ desc: null
839
+ value: 1800
840
+ torch_compile:
841
+ desc: null
842
+ value: false
843
+ torch_compile_backend:
844
+ desc: null
845
+ value: null
846
+ torch_compile_mode:
847
+ desc: null
848
+ value: null
849
+ dispatch_batches:
850
+ desc: null
851
+ value: null
852
+ split_batches:
853
+ desc: null
854
+ value: null
855
+ include_tokens_per_second:
856
+ desc: null
857
+ value: false
858
+ include_num_input_tokens_seen:
859
+ desc: null
860
+ value: false
861
+ neftune_noise_alpha:
862
+ desc: null
863
+ value: null
864
+ optim_target_modules:
865
+ desc: null
866
+ value: null
wandb/run-20240525_175010-8ah63pdc/files/output.log ADDED
@@ -0,0 +1,7 @@
1
+ /opt/conda/lib/python3.10/site-packages/transformers/models/wav2vec2/processing_wav2vec2.py:156: UserWarning: `as_target_processor` is deprecated and will be removed in v5 of Transformers. You can process your labels by using the argument `text` of the regular `__call__` method (either in the same call as your audio inputs, or in a separate call.
2
+ warnings.warn(
3
+ /opt/conda/lib/python3.10/site-packages/torch/utils/checkpoint.py:429: UserWarning: torch.utils.checkpoint: please pass in use_reentrant=True or use_reentrant=False explicitly. The default value of use_reentrant will be updated to be False in the future. To maintain current behavior, pass use_reentrant=True. It is recommended that you use use_reentrant=False. Refer to docs for more details on the differences between the two variants.
4
+ warnings.warn(
5
+ /opt/conda/lib/python3.10/site-packages/transformers/models/wav2vec2/processing_wav2vec2.py:156: UserWarning: `as_target_processor` is deprecated and will be removed in v5 of Transformers. You can process your labels by using the argument `text` of the regular `__call__` method (either in the same call as your audio inputs, or in a separate call.
6
+ warnings.warn(
7
+ /opt/conda/lib/python3.10/site-packages/torch/utils/checkpoint.py:429: UserWarning: torch.utils.checkpoint: please pass in use_reentrant=True or use_reentrant=False explicitly. The default value of use_reentrant will be updated to be False in the future. To maintain current behavior, pass use_reentrant=True. It is recommended that you use use_reentrant=False. Refer to docs for more details on the differences between the two variants.
wandb/run-20240525_175010-8ah63pdc/files/requirements.txt ADDED
@@ -0,0 +1,862 @@
+ Babel==2.14.0
+ Boruta==0.3
+ Brotli==1.0.9
+ CVXcanon==0.1.2
+ Cartopy==0.23.0
+ Cython==3.0.8
+ Deprecated==1.2.14
+ Farama-Notifications==0.0.4
+ Flask==3.0.3
+ Geohash==1.0
+ GitPython==3.1.41
+ ImageHash==4.3.1
+ Janome==0.5.0
+ Jinja2==3.1.2
+ LunarCalendar==0.0.9
+ Mako==1.3.3
+ Markdown==3.5.2
+ MarkupSafe==2.1.3
+ MarkupSafe==2.1.5
+ Pillow==9.5.0
+ PuLP==2.8.0
+ PyArabic==0.6.15
+ PyJWT==2.8.0
+ PyMeeus==0.5.12
+ PySocks==1.7.1
+ PyUpSet==0.1.1.post7
+ PyWavelets==1.5.0
+ PyYAML==6.0.1
+ Pygments==2.17.2
+ Pympler==1.0.1
+ QtPy==2.4.1
+ Rtree==1.2.0
+ SQLAlchemy==2.0.25
+ SecretStorage==3.3.3
+ Send2Trash==1.8.2
+ Shapely==1.8.5.post1
+ Shimmy==1.3.0
+ SimpleITK==2.3.1
+ TPOT==0.12.1
+ Theano-PyMC==1.1.2
+ Theano==1.0.5
+ Wand==0.6.13
+ Werkzeug==3.0.2
+ absl-py==1.4.0
+ accelerate==0.29.3
+ access==1.1.9
+ affine==2.4.0
+ aiobotocore==2.12.3
+ aiofiles==22.1.0
+ aiohttp-cors==0.7.0
+ aiohttp==3.9.1
+ aioitertools==0.11.0
+ aiorwlock==1.3.0
+ aiosignal==1.3.1
+ aiosqlite==0.19.0
+ albumentations==1.4.0
+ alembic==1.13.1
+ altair==5.3.0
+ annotated-types==0.6.0
+ annoy==1.17.3
+ anyio==4.2.0
+ apache-beam==2.46.0
+ aplus==0.11.0
+ appdirs==1.4.4
+ archspec==0.2.3
+ argon2-cffi-bindings==21.2.0
+ argon2-cffi==23.1.0
+ array-record==0.5.0
+ arrow==1.3.0
+ arviz==0.18.0
+ astroid==3.1.0
+ astropy-iers-data==0.2024.4.15.2.45.49
+ astropy==6.0.1
+ asttokens==2.4.1
+ astunparse==1.6.3
+ async-lru==2.0.4
+ async-timeout==4.0.3
+ attrs==23.2.0
+ audioread==3.0.1
+ autopep8==2.0.4
+ backoff==2.2.1
+ bayesian-optimization==1.4.3
+ beatrix_jupyterlab==2023.128.151533
+ beautifulsoup4==4.12.2
+ blake3==0.2.1
+ bleach==6.1.0
+ blessed==1.20.0
+ blinker==1.7.0
+ blis==0.7.10
+ blosc2==2.6.2
+ bokeh==3.4.1
+ boltons==23.1.1
+ boto3==1.26.100
+ botocore==1.34.69
+ bq_helper==0.4.1
+ bqplot==0.12.43
+ branca==0.7.1
+ brewer2mpl==1.4.1
+ brotlipy==0.7.0
+ cached-property==1.5.2
+ cachetools==4.2.4
+ cachetools==5.3.2
+ catalogue==2.0.10
+ catalyst==22.4
+ catboost==1.2.3
+ category-encoders==2.6.3
+ certifi==2024.2.2
+ cesium==0.12.1
+ cffi==1.16.0
+ charset-normalizer==3.3.2
+ chex==0.1.86
+ cleverhans==4.0.0
+ click-plugins==1.1.1
+ click==8.1.7
+ cligj==0.7.2
+ cloud-tpu-client==0.10
+ cloud-tpu-profiler==2.4.0
+ cloudpathlib==0.16.0
+ cloudpickle==2.2.1
+ cloudpickle==3.0.0
+ cmdstanpy==1.2.2
+ colorama==0.4.6
+ colorcet==3.1.0
+ colorful==0.5.6
+ colorlog==6.8.2
+ colorlover==0.3.0
+ comm==0.2.1
+ conda-libmamba-solver==23.7.0
+ conda-package-handling==2.2.0
+ conda==23.7.4
+ conda_package_streaming==0.9.0
+ confection==0.1.4
+ contextily==1.6.0
+ contourpy==1.2.0
+ contourpy==1.2.1
+ convertdate==2.4.0
+ crcmod==1.7
+ cryptography==41.0.7
+ cuda-python==12.4.0
+ cudf==23.8.0
+ cufflinks==0.17.3
+ cuml==23.8.0
+ cupy==13.0.0
+ cycler==0.12.1
+ cymem==2.0.8
+ cytoolz==0.12.3
+ daal4py==2024.3.0
+ daal==2024.3.0
+ dacite==1.8.1
+ dask-cuda==23.8.0
+ dask-cudf==23.8.0
+ dask-expr==1.0.11
+ dask==2024.4.1
+ dataclasses-json==0.6.4
+ dataproc_jupyter_plugin==0.1.66
+ datasets==2.18.0
+ datashader==0.16.0
+ datatile==1.0.3
+ db-dtypes==1.2.0
+ deap==1.4.1
+ debugpy==1.8.0
+ decorator==5.1.1
+ deepdiff==7.0.1
+ defusedxml==0.7.1
+ deprecation==2.1.0
+ descartes==1.1.0
+ dill==0.3.8
+ dipy==1.9.0
+ distlib==0.3.8
+ distributed==2023.7.1
+ distro==1.9.0
+ dm-tree==0.1.8
+ docker-pycreds==0.4.0
+ docker==7.0.0
+ docopt==0.6.2
+ docstring-parser==0.15
+ docstring-to-markdown==0.15
+ docutils==0.21.1
+ earthengine-api==0.1.399
+ easydict==1.13
+ easyocr==1.7.1
+ ecos==2.0.13
+ eli5==0.13.0
+ emoji==2.11.0
+ en-core-web-lg==3.7.1
+ en-core-web-sm==3.7.1
+ entrypoints==0.4
+ ephem==4.1.5
+ esda==2.5.1
+ essentia==2.1b6.dev1110
+ et-xmlfile==1.1.0
+ etils==1.6.0
+ exceptiongroup==1.2.0
+ executing==2.0.1
+ explainable-ai-sdk==1.3.3
+ fastai==2.7.14
+ fastapi==0.108.0
+ fastavro==1.9.3
+ fastcore==1.5.29
+ fastdownload==0.0.7
+ fasteners==0.19
+ fastjsonschema==2.19.1
+ fastprogress==1.0.3
+ fastrlock==0.8.2
+ fasttext==0.9.2
+ feather-format==0.4.1
+ featuretools==1.30.0
+ filelock==3.13.1
+ fiona==1.9.6
+ fitter==1.7.0
+ flake8==7.0.0
+ flashtext==2.7
+ flatbuffers==23.5.26
+ flax==0.8.2
+ folium==0.16.0
+ fonttools==4.47.0
+ fonttools==4.51.0
+ fqdn==1.5.1
+ frozendict==2.4.2
+ frozenlist==1.4.1
+ fsspec==2024.2.0
+ fsspec==2024.3.1
+ funcy==2.0
+ fury==0.10.0
+ future==1.0.0
+ fuzzywuzzy==0.18.0
+ gast==0.5.4
+ gatspy==0.3
+ gcsfs==2024.2.0
+ gensim==4.3.2
+ geographiclib==2.0
+ geojson==3.1.0
+ geopandas==0.14.3
+ geoplot==0.5.1
+ geopy==2.4.1
+ geoviews==1.12.0
+ ggplot==0.11.5
+ giddy==2.3.5
+ gitdb==4.0.11
+ google-ai-generativelanguage==0.6.2
+ google-api-core==2.11.1
+ google-api-core==2.18.0
+ google-api-python-client==2.126.0
+ google-apitools==0.5.31
+ google-auth-httplib2==0.2.0
+ google-auth-oauthlib==1.2.0
+ google-auth==2.26.1
+ google-cloud-aiplatform==0.6.0a1
+ google-cloud-artifact-registry==1.10.0
+ google-cloud-automl==1.0.1
+ google-cloud-bigquery==2.34.4
+ google-cloud-bigtable==1.7.3
+ google-cloud-core==2.4.1
+ google-cloud-datastore==2.19.0
+ google-cloud-dlp==3.14.0
+ google-cloud-jupyter-config==0.0.5
+ google-cloud-language==2.13.3
+ google-cloud-monitoring==2.18.0
+ google-cloud-pubsub==2.19.0
+ google-cloud-pubsublite==1.9.0
+ google-cloud-recommendations-ai==0.7.1
+ google-cloud-resource-manager==1.11.0
+ google-cloud-spanner==3.40.1
+ google-cloud-storage==1.44.0
+ google-cloud-translate==3.12.1
+ google-cloud-videointelligence==2.13.3
+ google-cloud-vision==2.8.0
+ google-crc32c==1.5.0
+ google-generativeai==0.5.1
+ google-pasta==0.2.0
+ google-resumable-media==2.7.0
+ googleapis-common-protos==1.62.0
+ gplearn==0.4.2
+ gpustat==1.0.0
+ gpxpy==1.6.2
+ graphviz==0.20.3
+ greenlet==3.0.3
+ grpc-google-iam-v1==0.12.7
+ grpcio-status==1.48.1
+ grpcio-status==1.48.2
+ grpcio==1.51.1
+ grpcio==1.60.0
+ gviz-api==1.10.0
+ gym-notices==0.0.8
+ gym==0.26.2
+ gymnasium==0.29.0
+ h11==0.14.0
+ h2o==3.46.0.1
+ h5netcdf==1.3.0
+ h5py==3.10.0
+ haversine==2.8.1
+ hdfs==2.7.3
+ hep-ml==0.7.2
+ hijri-converter==2.3.1
+ hmmlearn==0.3.2
+ holidays==0.24
+ holoviews==1.18.3
+ hpsklearn==0.1.0
+ html5lib==1.1
+ htmlmin==0.1.12
+ httpcore==1.0.5
+ httplib2==0.21.0
+ httptools==0.6.1
+ httpx==0.27.0
+ huggingface-hub==0.22.2
+ hunspell==0.5.5
+ hydra-slayer==0.5.0
+ hyperopt==0.2.7
+ hypertools==0.8.0
+ idna==3.6
+ igraph==0.11.4
+ imagecodecs==2024.1.1
+ imageio==2.33.1
+ imbalanced-learn==0.12.2
+ imgaug==0.4.0
+ importlib-metadata==6.11.0
+ importlib-metadata==7.0.1
+ importlib-resources==6.1.1
+ inequality==1.0.1
+ iniconfig==2.0.0
+ ipydatawidgets==4.3.5
+ ipykernel==6.28.0
+ ipyleaflet==0.18.2
+ ipympl==0.7.0
+ ipython-genutils==0.2.0
+ ipython-genutils==0.2.0
+ ipython-sql==0.5.0
+ ipython==8.20.0
+ ipyvolume==0.6.3
+ ipyvue==1.11.0
+ ipyvuetify==1.9.4
+ ipywebrtc==0.6.0
+ ipywidgets==7.7.1
+ isoduration==20.11.0
+ isort==5.13.2
+ isoweek==1.3.3
+ itsdangerous==2.2.0
+ jaraco.classes==3.3.0
+ jax-jumpy==1.0.0
+ jax==0.4.23
+ jaxlib==0.4.23.dev20240116
+ jedi==0.19.1
+ jeepney==0.8.0
+ jieba==0.42.1
+ jiwer==3.0.4
+ jmespath==1.0.1
+ joblib==1.4.0
+ json5==0.9.14
+ jsonpatch==1.33
+ jsonpointer==2.4
+ jsonschema-specifications==2023.12.1
+ jsonschema==4.20.0
+ jupyter-console==6.6.3
+ jupyter-events==0.9.0
+ jupyter-http-over-ws==0.0.8
+ jupyter-lsp==1.5.1
+ jupyter-server-mathjax==0.2.6
+ jupyter-ydoc==0.2.5
+ jupyter_client==7.4.9
+ jupyter_client==8.6.0
+ jupyter_core==5.7.1
+ jupyter_server==2.12.5
+ jupyter_server_fileid==0.9.1
+ jupyter_server_proxy==4.1.0
+ jupyter_server_terminals==0.5.1
+ jupyter_server_ydoc==0.8.0
+ jupyterlab-lsp==5.1.0
+ jupyterlab-widgets==3.0.9
+ jupyterlab==4.1.6
+ jupyterlab_git==0.44.0
+ jupyterlab_pygments==0.3.0
+ jupyterlab_server==2.25.2
+ jupytext==1.16.0
+ kaggle-environments==1.14.3
+ kaggle==1.6.12
+ kagglehub==0.2.3
+ keras-cv==0.8.2
+ keras-nlp==0.9.3
+ keras-tuner==1.4.6
+ keras==3.2.1
+ kernels-mixer==0.0.7
+ keyring==24.3.0
+ keyrings.google-artifactregistry-auth==1.1.2
+ kfp-pipeline-spec==0.2.2
+ kfp-server-api==2.0.5
+ kfp==2.5.0
+ kiwisolver==1.4.5
+ kmapper==2.0.1
+ kmodes==0.12.2
+ korean-lunar-calendar==0.3.1
+ kornia==0.7.2
+ kornia_rs==0.1.3
+ kt-legacy==1.0.5
+ kubernetes==26.1.0
+ langcodes==3.3.0
+ langid==1.1.6
+ lazy_loader==0.3
+ learntools==0.3.4
+ leven==1.0.4
+ libclang==16.0.6
+ libmambapy==1.5.0
+ libpysal==4.9.2
+ librosa==0.10.1
+ lightgbm==4.2.0
+ lightning-utilities==0.11.2
+ lime==0.2.0.1
+ line-profiler==4.1.2
+ linkify-it-py==2.0.3
+ llvmlite==0.41.1
+ llvmlite==0.42.0
+ lml==0.1.0
+ locket==1.0.0
+ loguru==0.7.2
+ lxml==5.2.1
+ lz4==4.3.3
+ mamba==1.5.0
+ mapclassify==2.6.1
+ markdown-it-py==3.0.0
+ marshmallow==3.21.1
+ matplotlib-inline==0.1.6
+ matplotlib-venn==0.11.10
+ matplotlib==3.7.5
+ matplotlib==3.8.4
+ mccabe==0.7.0
+ mdit-py-plugins==0.4.0
+ mdurl==0.1.2
+ memory-profiler==0.61.0
+ menuinst==2.0.1
+ mercantile==1.2.1
+ mgwr==2.2.1
+ missingno==0.5.2
+ mistune==0.8.4
+ mizani==0.11.1
+ ml-dtypes==0.2.0
+ mlcrate==0.2.0
+ mlens==0.2.3
+ mlxtend==0.23.1
+ mne==1.6.1
+ mnist==0.2.2
+ momepy==0.7.0
+ more-itertools==10.2.0
+ mpld3==0.5.10
+ mpmath==1.3.0
+ msgpack==1.0.7
+ multidict==6.0.4
+ multimethod==1.10
+ multipledispatch==1.0.0
+ multiprocess==0.70.16
+ munkres==1.1.4
+ murmurhash==1.0.10
+ mypy-extensions==1.0.0
+ namex==0.0.8
+ nb-conda-kernels==2.3.1
+ nb_conda==2.2.1
+ nbclassic==1.0.0
+ nbclient==0.5.13
+ nbconvert==6.4.5
+ nbdime==3.2.0
+ nbformat==5.9.2
+ ndindex==1.8
+ nest-asyncio==1.5.8
+ networkx==3.2.1
+ nibabel==5.2.1
+ nilearn==0.10.4
+ ninja==1.11.1.1
+ nltk==3.2.4
+ nose==1.3.7
+ notebook==6.5.4
+ notebook==6.5.6
+ notebook_executor==0.2
+ notebook_shim==0.2.3
+ numba==0.58.1
+ numba==0.59.1
+ numexpr==2.10.0
+ numpy==1.26.4
+ nvidia-ml-py==11.495.46
+ nvtx==0.2.10
+ oauth2client==4.1.3
+ oauthlib==3.2.2
+ objsize==0.6.1
+ odfpy==1.4.1
+ olefile==0.47
+ onnx==1.16.0
+ opencensus-context==0.1.3
+ opencensus==0.11.4
+ opencv-contrib-python==4.9.0.80
+ opencv-python-headless==4.9.0.80
+ opencv-python==4.9.0.80
+ openpyxl==3.1.2
+ openslide-python==1.3.1
+ opentelemetry-api==1.22.0
+ opentelemetry-exporter-otlp-proto-common==1.22.0
+ opentelemetry-exporter-otlp-proto-grpc==1.22.0
+ opentelemetry-exporter-otlp-proto-http==1.22.0
+ opentelemetry-exporter-otlp==1.22.0
+ opentelemetry-proto==1.22.0
+ opentelemetry-sdk==1.22.0
+ opentelemetry-semantic-conventions==0.43b0
+ opt-einsum==3.3.0
+ optax==0.2.2
+ optree==0.11.0
+ optuna==3.6.1
+ orbax-checkpoint==0.5.9
+ ordered-set==4.1.0
+ orjson==3.9.10
+ ortools==9.4.1874
+ osmnx==1.9.2
+ overrides==7.4.0
+ packaging==21.3
+ pandas-datareader==0.10.0
+ pandas-profiling==3.6.6
+ pandas-summary==0.2.0
+ pandas==2.1.4
+ pandas==2.2.2
+ pandasql==0.7.3
+ pandocfilters==1.5.0
+ panel==1.4.1
+ papermill==2.5.0
+ param==2.1.0
+ parso==0.8.3
+ partd==1.4.1
+ path.py==12.5.0
+ path==16.14.0
+ pathos==0.3.2
+ pathy==0.10.3
+ patsy==0.5.6
+ pdf2image==1.17.0
+ pettingzoo==1.24.0
+ pexpect==4.8.0
+ pexpect==4.9.0
+ phik==0.12.4
+ pickleshare==0.7.5
+ pillow==10.3.0
+ pip==23.3.2
+ pkgutil_resolve_name==1.3.10
+ platformdirs==4.2.0
+ plotly-express==0.4.1
+ plotly==5.18.0
+ plotnine==0.13.4
+ pluggy==1.4.0
+ pointpats==2.4.0
+ polars==0.20.21
+ polyglot==16.7.4
+ pooch==1.8.1
+ pox==0.3.4
+ ppca==0.0.4
+ ppft==1.7.6.8
+ preprocessing==0.1.13
+ preshed==3.0.9
+ prettytable==3.9.0
+ progressbar2==4.4.2
+ prometheus-client==0.19.0
+ promise==2.3
+ prompt-toolkit==3.0.42
+ prompt-toolkit==3.0.43
+ prophet==1.1.1
+ proto-plus==1.23.0
+ protobuf==3.20.3
+ protobuf==4.21.12
+ psutil==5.9.3
+ psutil==5.9.7
+ ptyprocess==0.7.0
+ pudb==2024.1
+ pure-eval==0.2.2
+ py-cpuinfo==9.0.0
+ py-spy==0.3.14
+ py4j==0.10.9.7
+ pyLDAvis==3.4.1
+ pyOpenSSL==23.3.0
+ pyaml==23.12.0
+ pyarrow-hotfix==0.6
+ pyarrow==15.0.2
+ pyasn1-modules==0.3.0
+ pyasn1==0.5.1
+ pybind11==2.12.0
+ pyclipper==1.3.0.post5
+ pycodestyle==2.11.1
+ pycosat==0.6.6
+ pycparser==2.21
+ pycryptodome==3.20.0
+ pyct==0.5.0
+ pycuda==2024.1
+ pydantic==2.5.3
+ pydantic==2.7.0
+ pydantic_core==2.14.6
+ pydantic_core==2.18.1
+ pydegensac==0.1.2
+ pydicom==2.4.4
+ pydocstyle==6.3.0
+ pydot==1.4.2
+ pydub==0.25.1
+ pyemd==1.0.0
+ pyerfa==2.0.1.4
+ pyexcel-io==0.6.6
+ pyexcel-ods==0.6.0
+ pyflakes==3.2.0
+ pygltflib==1.16.2
+ pykalman==0.9.7
+ pylibraft==23.8.0
+ pylint==3.1.0
+ pymc3==3.11.4
+ pymongo==3.13.0
+ pynndescent==0.5.12
+ pynvml==11.4.1
+ pynvrtc==9.2
+ pyparsing==3.1.1
+ pyparsing==3.1.2
+ pypdf==4.2.0
+ pyproj==3.6.1
+ pysal==24.1
+ pyshp==2.3.1
+ pytesseract==0.3.10
+ pytest==8.1.1
+ python-bidi==0.4.2
+ python-dateutil==2.9.0.post0
+ python-dotenv==1.0.0
+ python-json-logger==2.0.7
+ python-louvain==0.16
+ python-lsp-jsonrpc==1.1.2
+ python-lsp-server==1.11.0
+ python-slugify==8.0.4
+ python-utils==3.8.2
+ pythreejs==2.4.2
+ pytoolconfig==1.3.1
+ pytools==2024.1.1
+ pytorch-ignite==0.5.0.post2
+ pytorch-lightning==2.2.2
+ pytz==2023.3.post1
+ pytz==2024.1
+ pyu2f==0.1.5
+ pyviz_comms==3.0.2
+ pyzmq==24.0.1
+ pyzmq==25.1.2
+ qgrid==1.3.1
+ qtconsole==5.5.1
+ quantecon==0.7.2
+ qudida==0.0.4
+ raft-dask==23.8.0
+ rapidfuzz==3.9.1
+ rasterio==1.3.10
+ rasterstats==0.19.0
+ ray-cpp==2.9.0
+ ray==2.9.0
+ referencing==0.32.1
+ regex==2023.12.25
+ requests-oauthlib==1.3.1
+ requests-toolbelt==0.10.1
+ requests==2.31.0
+ retrying==1.3.3
+ retrying==1.3.4
+ rfc3339-validator==0.1.4
+ rfc3986-validator==0.1.1
+ rgf-python==3.12.0
+ rich-click==1.7.4
+ rich==13.7.0
+ rich==13.7.1
+ rmm==23.8.0
+ rope==1.13.0
+ rpds-py==0.16.2
+ rsa==4.9
+ ruamel-yaml-conda==0.15.100
+ ruamel.yaml.clib==0.2.7
+ ruamel.yaml==0.17.40
+ s2sphere==0.2.5
+ s3fs==2024.2.0
+ s3transfer==0.6.2
+ safetensors==0.4.3
+ scattertext==0.1.19
+ scikit-image==0.22.0
+ scikit-learn-intelex==2024.3.0
+ scikit-learn==1.2.2
+ scikit-multilearn==0.2.0
+ scikit-optimize==0.10.1
+ scikit-plot==0.3.7
+ scikit-surprise==1.1.3
+ scipy==1.11.4
+ scipy==1.13.0
+ seaborn==0.12.2
+ segment_anything==1.0
+ segregation==2.5
+ semver==3.0.2
+ sentencepiece==0.2.0
+ sentry-sdk==1.45.0
+ setproctitle==1.3.3
+ setuptools-git==1.2
+ setuptools-scm==8.0.4
+ setuptools==69.0.3
+ shap==0.44.1
+ shapely==2.0.4
+ shellingham==1.5.4
+ simpervisor==1.0.0
+ simplejson==3.19.2
+ six==1.16.0
+ sklearn-pandas==2.2.0
+ slicer==0.0.7
+ smart-open==6.4.0
+ smmap==5.0.1
+ sniffio==1.3.0
+ snowballstemmer==2.2.0
+ snuggs==1.4.7
+ sortedcontainers==2.4.0
+ soundfile==0.12.1
+ soupsieve==2.5
+ soxr==0.3.7
+ spacy-legacy==3.0.12
+ spacy-loggers==1.0.5
+ spacy==3.7.3
+ spaghetti==1.7.5.post1
+ spectral==0.23.1
+ spglm==1.1.0
+ sphinx-rtd-theme==0.2.4
+ spint==1.0.7
+ splot==1.1.5.post1
+ spopt==0.6.0
+ spreg==1.4.2
+ spvcm==0.3.0
+ sqlparse==0.4.4
+ squarify==0.4.3
+ srsly==2.4.8
+ stable-baselines3==2.1.0
+ stack-data==0.6.2
+ stack-data==0.6.3
+ stanio==0.5.0
+ starlette==0.32.0.post1
+ statsmodels==0.14.1
+ stemming==1.0.1
+ stop-words==2018.7.23
+ stopit==1.1.2
+ stumpy==1.12.0
+ sympy==1.12
+ tables==3.9.2
+ tabulate==0.9.0
+ tangled-up-in-unicode==0.2.0
+ tbb==2021.12.0
+ tblib==3.0.0
+ tenacity==8.2.3
+ tensorboard-data-server==0.7.2
+ tensorboard-plugin-profile==2.15.0
+ tensorboard==2.15.1
+ tensorboardX==2.6.2.2
+ tensorflow-cloud==0.1.16
+ tensorflow-datasets==4.9.4
+ tensorflow-decision-forests==1.8.1
+ tensorflow-estimator==2.15.0
+ tensorflow-hub==0.16.1
+ tensorflow-io-gcs-filesystem==0.35.0
+ tensorflow-io==0.35.0
+ tensorflow-metadata==0.14.0
+ tensorflow-probability==0.23.0
+ tensorflow-serving-api==2.14.1
+ tensorflow-text==2.15.0
+ tensorflow-transform==0.14.0
+ tensorflow==2.15.0
+ tensorstore==0.1.56
+ termcolor==2.4.0
+ terminado==0.18.0
+ testpath==0.6.0
+ text-unidecode==1.3
+ textblob==0.18.0.post0
+ texttable==1.7.0
+ tf_keras==2.15.1
+ tfp-nightly==0.24.0.dev0
+ thinc==8.2.2
+ threadpoolctl==3.2.0
+ tifffile==2023.12.9
+ timm==0.9.16
+ tinycss2==1.2.1
+ tobler==0.11.2
+ tokenizers==0.15.2
+ toml==0.10.2
+ tomli==2.0.1
+ tomlkit==0.12.4
+ toolz==0.12.1
+ torch==2.1.2
+ torchaudio==2.1.2
+ torchdata==0.7.1
+ torchinfo==1.8.0
+ torchmetrics==1.3.2
+ torchtext==0.16.2
+ torchvision==0.16.2
+ tornado==6.3.3
+ tqdm==4.66.1
+ traceml==1.0.8
+ traitlets==5.9.0
+ traittypes==0.2.1
+ transformers==4.39.3
+ treelite-runtime==3.2.0
+ treelite==3.2.0
+ truststore==0.8.0
+ trx-python==0.2.9
+ tsfresh==0.20.2
+ typeguard==4.1.5
+ typer==0.9.0
+ typer==0.9.4
+ types-python-dateutil==2.8.19.20240106
+ typing-inspect==0.9.0
+ typing-utils==0.1.0
+ typing_extensions==4.9.0
+ tzdata==2023.4
+ uc-micro-py==1.0.3
+ ucx-py==0.33.0
+ ujson==5.9.0
+ umap-learn==0.5.6
+ unicodedata2==15.1.0
+ update-checker==0.18.0
+ uri-template==1.3.0
+ uritemplate==3.0.1
+ urllib3==1.26.18
+ urllib3==2.1.0
+ urwid==2.6.10
+ urwid_readline==0.14
+ uvicorn==0.25.0
+ uvloop==0.19.0
+ vaex-astro==0.9.3
+ vaex-core==4.17.1
+ vaex-hdf5==0.14.1
+ vaex-jupyter==0.8.2
+ vaex-ml==0.18.3
+ vaex-server==0.9.0
+ vaex-viz==0.5.4
+ vaex==4.17.0
+ vec_noise==1.1.4
+ vecstack==0.4.0
+ virtualenv==20.21.0
+ visions==0.7.5
+ vowpalwabbit==9.9.0
+ vtk==9.3.0
+ wandb==0.16.6
+ wasabi==1.1.2
+ watchfiles==0.21.0
+ wavio==0.0.8
+ wcwidth==0.2.13
+ weasel==0.3.4
+ webcolors==1.13
+ webencodings==0.5.1
+ websocket-client==1.7.0
+ websockets==12.0
+ wfdb==4.1.2
+ whatthepatch==1.0.5
+ wheel==0.42.0
+ widgetsnbextension==3.6.6
+ witwidget==1.8.1
+ woodwork==0.30.0
+ wordcloud==1.9.3
+ wordsegment==1.3.1
+ wrapt==1.14.1
+ xarray-einstats==0.7.0
+ xarray==2024.3.0
+ xgboost==2.0.3
+ xvfbwrapper==0.2.9
+ xxhash==3.4.1
+ xyzservices==2024.4.0
+ y-py==0.6.2
+ yapf==0.40.2
+ yarl==1.9.3
+ yarl==1.9.4
+ ydata-profiling==4.6.4
+ yellowbrick==1.5
+ ypy-websocket==0.8.4
+ zict==3.0.0
+ zipp==3.17.0
+ zstandard==0.22.0
wandb/run-20240525_175010-8ah63pdc/files/wandb-metadata.json ADDED
@@ -0,0 +1,62 @@
+ {
+ "os": "Linux-5.15.133+-x86_64-with-glibc2.31",
+ "python": "3.10.13",
+ "heartbeatAt": "2024-05-25T17:50:11.470552",
+ "startedAt": "2024-05-25T17:50:10.432729",
+ "docker": null,
+ "cuda": null,
+ "args": [],
+ "state": "running",
+ "program": "kaggle.ipynb",
+ "codePathLocal": null,
+ "root": "/kaggle/working",
+ "host": "8f1fad5fe1d2",
+ "username": "root",
+ "executable": "/opt/conda/bin/python3.10",
+ "cpu_count": 2,
+ "cpu_count_logical": 4,
+ "cpu_freq": {
+ "current": 2000.142,
+ "min": 0.0,
+ "max": 0.0
+ },
+ "cpu_freq_per_core": [
+ {
+ "current": 2000.142,
+ "min": 0.0,
+ "max": 0.0
+ },
+ {
+ "current": 2000.142,
+ "min": 0.0,
+ "max": 0.0
+ },
+ {
+ "current": 2000.142,
+ "min": 0.0,
+ "max": 0.0
+ },
+ {
+ "current": 2000.142,
+ "min": 0.0,
+ "max": 0.0
+ }
+ ],
+ "disk": {
+ "/": {
+ "total": 8062.387607574463,
+ "used": 5597.954097747803
+ }
+ },
+ "gpu": "Tesla P100-PCIE-16GB",
+ "gpu_count": 1,
+ "gpu_devices": [
+ {
+ "name": "Tesla P100-PCIE-16GB",
+ "memory_total": 17179869184
+ }
+ ],
+ "memory": {
+ "total": 31.357563018798828
+ }
+ }
wandb/run-20240525_175010-8ah63pdc/files/wandb-summary.json ADDED
@@ -0,0 +1 @@
+ {"train/loss": 4481.06, "train/grad_norm": NaN, "train/learning_rate": 1.3e-06, "train/epoch": 5.68, "train/global_step": 500, "_timestamp": 1716659750.4134378, "_runtime": 339.9736979007721, "_step": 1, "eval/loss": NaN, "eval/wer": 1.0, "eval/runtime": 9.9528, "eval/samples_per_second": 20.095, "eval/steps_per_second": 2.512}
wandb/run-20240525_175010-8ah63pdc/logs/debug-internal.log ADDED
@@ -0,0 +1,230 @@
+ 2024-05-25 17:50:10,439 INFO StreamThr :225 [internal.py:wandb_internal():86] W&B internal server running at pid: 225, started at: 2024-05-25 17:50:10.438792
+ 2024-05-25 17:50:10,441 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status
+ 2024-05-25 17:50:11,144 INFO WriterThread:225 [datastore.py:open_for_write():87] open: /kaggle/working/wandb/run-20240525_175010-8ah63pdc/run-8ah63pdc.wandb
+ 2024-05-25 17:50:11,145 DEBUG SenderThread:225 [sender.py:send():379] send: header
+ 2024-05-25 17:50:11,148 DEBUG SenderThread:225 [sender.py:send():379] send: run
+ 2024-05-25 17:50:11,374 INFO SenderThread:225 [dir_watcher.py:__init__():211] watching files in: /kaggle/working/wandb/run-20240525_175010-8ah63pdc/files
+ 2024-05-25 17:50:11,374 INFO SenderThread:225 [sender.py:_start_run_threads():1124] run started: 8ah63pdc with start time 1716659410.43974
+ 2024-05-25 17:50:11,382 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: check_version
+ 2024-05-25 17:50:11,382 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: check_version
+ 2024-05-25 17:50:11,450 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: run_start
+ 2024-05-25 17:50:11,461 DEBUG HandlerThread:225 [system_info.py:__init__():26] System info init
+ 2024-05-25 17:50:11,461 DEBUG HandlerThread:225 [system_info.py:__init__():41] System info init done
+ 2024-05-25 17:50:11,461 INFO HandlerThread:225 [system_monitor.py:start():194] Starting system monitor
+ 2024-05-25 17:50:11,461 INFO SystemMonitor:225 [system_monitor.py:_start():158] Starting system asset monitoring threads
+ 2024-05-25 17:50:11,461 INFO HandlerThread:225 [system_monitor.py:probe():214] Collecting system info
+ 2024-05-25 17:50:11,462 INFO SystemMonitor:225 [interfaces.py:start():190] Started cpu monitoring
+ 2024-05-25 17:50:11,462 INFO SystemMonitor:225 [interfaces.py:start():190] Started disk monitoring
+ 2024-05-25 17:50:11,463 INFO SystemMonitor:225 [interfaces.py:start():190] Started gpu monitoring
+ 2024-05-25 17:50:11,464 INFO SystemMonitor:225 [interfaces.py:start():190] Started memory monitoring
+ 2024-05-25 17:50:11,465 INFO SystemMonitor:225 [interfaces.py:start():190] Started network monitoring
+ 2024-05-25 17:50:11,470 DEBUG HandlerThread:225 [system_info.py:probe():150] Probing system
+ 2024-05-25 17:50:11,473 DEBUG HandlerThread:225 [gitlib.py:_init_repo():56] git repository is invalid
+ 2024-05-25 17:50:11,473 DEBUG HandlerThread:225 [system_info.py:probe():198] Probing system done
+ 2024-05-25 17:50:11,473 DEBUG HandlerThread:225 [system_monitor.py:probe():223] {'os': 'Linux-5.15.133+-x86_64-with-glibc2.31', 'python': '3.10.13', 'heartbeatAt': '2024-05-25T17:50:11.470552', 'startedAt': '2024-05-25T17:50:10.432729', 'docker': None, 'cuda': None, 'args': (), 'state': 'running', 'program': 'kaggle.ipynb', 'codePathLocal': None, 'root': '/kaggle/working', 'host': '8f1fad5fe1d2', 'username': 'root', 'executable': '/opt/conda/bin/python3.10', 'cpu_count': 2, 'cpu_count_logical': 4, 'cpu_freq': {'current': 2000.142, 'min': 0.0, 'max': 0.0}, 'cpu_freq_per_core': [{'current': 2000.142, 'min': 0.0, 'max': 0.0}, {'current': 2000.142, 'min': 0.0, 'max': 0.0}, {'current': 2000.142, 'min': 0.0, 'max': 0.0}, {'current': 2000.142, 'min': 0.0, 'max': 0.0}], 'disk': {'/': {'total': 8062.387607574463, 'used': 5597.954097747803}}, 'gpu': 'Tesla P100-PCIE-16GB', 'gpu_count': 1, 'gpu_devices': [{'name': 'Tesla P100-PCIE-16GB', 'memory_total': 17179869184}], 'memory': {'total': 31.357563018798828}}
+ 2024-05-25 17:50:11,473 INFO HandlerThread:225 [system_monitor.py:probe():224] Finished collecting system info
+ 2024-05-25 17:50:11,473 INFO HandlerThread:225 [system_monitor.py:probe():227] Publishing system info
+ 2024-05-25 17:50:11,473 DEBUG HandlerThread:225 [system_info.py:_save_conda():207] Saving list of conda packages installed into the current environment
+ 2024-05-25 17:50:12,376 INFO Thread-12 :225 [dir_watcher.py:_on_file_created():271] file/dir created: /kaggle/working/wandb/run-20240525_175010-8ah63pdc/files/conda-environment.yaml
+ 2024-05-25 17:50:26,490 ERROR HandlerThread:225 [system_info.py:_save_conda():221] Error saving conda packages: Command '['conda', 'env', 'export']' timed out after 15 seconds
+ Traceback (most recent call last):
+ File "/opt/conda/lib/python3.10/site-packages/wandb/sdk/internal/system/system_info.py", line 214, in _save_conda
+ subprocess.call(
+ File "/opt/conda/lib/python3.10/subprocess.py", line 347, in call
+ return p.wait(timeout=timeout)
+ File "/opt/conda/lib/python3.10/subprocess.py", line 1209, in wait
+ return self._wait(timeout=timeout)
+ File "/opt/conda/lib/python3.10/subprocess.py", line 1951, in _wait
+ raise TimeoutExpired(self.args, timeout)
+ subprocess.TimeoutExpired: Command '['conda', 'env', 'export']' timed out after 15 seconds
+ 2024-05-25 17:50:26,493 DEBUG HandlerThread:225 [system_info.py:_save_conda():222] Saving conda packages done
+ 2024-05-25 17:50:26,493 INFO HandlerThread:225 [system_monitor.py:probe():229] Finished publishing system info
+ 2024-05-25 17:50:26,501 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
+ 2024-05-25 17:50:26,501 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: keepalive
+ 2024-05-25 17:50:26,501 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
+ 2024-05-25 17:50:26,501 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: keepalive
+ 2024-05-25 17:50:26,501 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
+ 2024-05-25 17:50:26,501 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: keepalive
+ 2024-05-25 17:50:26,502 DEBUG SenderThread:225 [sender.py:send():379] send: files
+ 2024-05-25 17:50:26,502 INFO SenderThread:225 [sender.py:_save_file():1390] saving file wandb-metadata.json with policy now
+ 2024-05-25 17:50:26,795 INFO wandb-upload_0:225 [upload_job.py:push():131] Uploaded file /tmp/tmpt3y2775xwandb/r9lafvob-wandb-metadata.json
+ 2024-05-25 17:50:27,378 INFO Thread-12 :225 [dir_watcher.py:_on_file_created():271] file/dir created: /kaggle/working/wandb/run-20240525_175010-8ah63pdc/files/wandb-metadata.json
+ 2024-05-25 17:50:27,455 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: python_packages
+ 2024-05-25 17:50:27,455 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: python_packages
+ 2024-05-25 17:50:27,458 DEBUG SenderThread:225 [sender.py:send():379] send: telemetry
+ 2024-05-25 17:50:27,469 DEBUG SenderThread:225 [sender.py:send():379] send: config
+ 2024-05-25 17:50:27,472 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
+ 2024-05-25 17:50:27,473 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
+ 2024-05-25 17:50:27,473 DEBUG SenderThread:225 [sender.py:send():379] send: metric
+ 2024-05-25 17:50:27,474 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
+ 2024-05-25 17:50:27,623 DEBUG SenderThread:225 [sender.py:send():379] send: telemetry
+ 2024-05-25 17:50:27,624 DEBUG SenderThread:225 [sender.py:send():379] send: metric
+ 2024-05-25 17:50:27,624 WARNING SenderThread:225 [sender.py:send_metric():1341] Seen metric with glob (shouldn't happen)
+ 2024-05-25 17:50:27,624 DEBUG SenderThread:225 [sender.py:send():379] send: telemetry
+ 2024-05-25 17:50:28,379 INFO Thread-12 :225 [dir_watcher.py:_on_file_created():271] file/dir created: /kaggle/working/wandb/run-20240525_175010-8ah63pdc/files/output.log
+ 2024-05-25 17:50:28,379 INFO Thread-12 :225 [dir_watcher.py:_on_file_created():271] file/dir created: /kaggle/working/wandb/run-20240525_175010-8ah63pdc/files/requirements.txt
+ 2024-05-25 17:50:30,380 INFO Thread-12 :225 [dir_watcher.py:_on_file_modified():288] file/dir modified: /kaggle/working/wandb/run-20240525_175010-8ah63pdc/files/output.log
+ 2024-05-25 17:50:31,456 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
+ 2024-05-25 17:50:36,457 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
+ 2024-05-25 17:50:41,462 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
+ 2024-05-25 17:50:42,384 INFO Thread-12 :225 [dir_watcher.py:_on_file_modified():288] file/dir modified: /kaggle/working/wandb/run-20240525_175010-8ah63pdc/files/config.yaml
+ 2024-05-25 17:50:42,817 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
+ 2024-05-25 17:50:42,819 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
+ 2024-05-25 17:50:42,819 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
+ 2024-05-25 17:50:46,947 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
+ 2024-05-25 17:50:51,948 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
+ 2024-05-25 17:50:56,949 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
+ 2024-05-25 17:50:57,766 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
+ 2024-05-25 17:50:57,766 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
+ 2024-05-25 17:50:57,767 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
+ 2024-05-25 17:51:02,881 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
+ 2024-05-25 17:51:07,882 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
+ 2024-05-25 17:51:11,465 DEBUG SystemMonitor:225 [system_monitor.py:_start():172] Starting system metrics aggregation loop
83
+ 2024-05-25 17:51:11,466 DEBUG SenderThread:225 [sender.py:send():379] send: stats
84
+ 2024-05-25 17:51:12,765 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
85
+ 2024-05-25 17:51:12,766 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
86
+ 2024-05-25 17:51:12,767 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
87
+ 2024-05-25 17:51:13,876 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
88
+ 2024-05-25 17:51:18,877 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
89
+ 2024-05-25 17:51:23,878 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
90
+ 2024-05-25 17:51:27,768 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
91
+ 2024-05-25 17:51:27,769 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
92
+ 2024-05-25 17:51:27,770 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
93
+ 2024-05-25 17:51:29,878 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
94
+ 2024-05-25 17:51:34,879 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
95
+ 2024-05-25 17:51:39,880 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
96
+ 2024-05-25 17:51:41,467 DEBUG SenderThread:225 [sender.py:send():379] send: stats
97
+ 2024-05-25 17:51:42,767 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
98
+ 2024-05-25 17:51:42,769 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
99
+ 2024-05-25 17:51:42,770 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
100
+ 2024-05-25 17:51:45,874 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
101
+ 2024-05-25 17:51:50,875 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
102
+ 2024-05-25 17:51:55,876 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
103
+ 2024-05-25 17:51:57,768 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
104
+ 2024-05-25 17:51:57,769 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
105
+ 2024-05-25 17:51:57,770 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
106
+ 2024-05-25 17:52:00,927 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
107
+ 2024-05-25 17:52:05,928 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
108
+ 2024-05-25 17:52:10,929 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
109
+ 2024-05-25 17:52:11,468 DEBUG SenderThread:225 [sender.py:send():379] send: stats
110
+ 2024-05-25 17:52:12,767 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
111
+ 2024-05-25 17:52:12,774 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
112
+ 2024-05-25 17:52:12,775 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
113
+ 2024-05-25 17:52:16,015 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
114
+ 2024-05-25 17:52:21,016 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
115
+ 2024-05-25 17:52:26,017 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
116
+ 2024-05-25 17:52:27,838 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
117
+ 2024-05-25 17:52:27,839 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
118
+ 2024-05-25 17:52:27,839 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
119
+ 2024-05-25 17:52:31,936 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
120
+ 2024-05-25 17:52:36,937 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
121
+ 2024-05-25 17:52:41,469 DEBUG SenderThread:225 [sender.py:send():379] send: stats
122
+ 2024-05-25 17:52:42,470 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
123
+ 2024-05-25 17:52:42,832 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
124
+ 2024-05-25 17:52:42,832 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
125
+ 2024-05-25 17:52:42,833 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
126
+ 2024-05-25 17:52:47,962 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
127
+ 2024-05-25 17:52:52,963 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
128
+ 2024-05-25 17:52:57,808 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
129
+ 2024-05-25 17:52:57,819 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
130
+ 2024-05-25 17:52:57,820 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
131
+ 2024-05-25 17:52:57,983 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
132
+ 2024-05-25 17:53:02,984 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
133
+ 2024-05-25 17:53:07,985 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
134
+ 2024-05-25 17:53:11,470 DEBUG SenderThread:225 [sender.py:send():379] send: stats
135
+ 2024-05-25 17:53:12,808 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
136
+ 2024-05-25 17:53:12,820 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
137
+ 2024-05-25 17:53:12,820 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
138
+ 2024-05-25 17:53:13,958 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
139
+ 2024-05-25 17:53:18,959 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
140
+ 2024-05-25 17:53:23,960 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
141
+ 2024-05-25 17:53:27,808 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
142
+ 2024-05-25 17:53:27,820 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
143
+ 2024-05-25 17:53:27,820 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
144
+ 2024-05-25 17:53:29,928 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
145
+ 2024-05-25 17:53:34,929 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
146
+ 2024-05-25 17:53:39,930 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
147
+ 2024-05-25 17:53:41,471 DEBUG SenderThread:225 [sender.py:send():379] send: stats
148
+ 2024-05-25 17:53:43,123 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
149
+ 2024-05-25 17:53:43,123 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
150
+ 2024-05-25 17:53:43,124 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
151
+ 2024-05-25 17:53:45,202 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
152
+ 2024-05-25 17:53:50,203 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
153
+ 2024-05-25 17:53:55,204 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
154
+ 2024-05-25 17:53:58,084 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
155
+ 2024-05-25 17:53:58,085 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
156
+ 2024-05-25 17:53:58,125 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
157
+ 2024-05-25 17:54:00,254 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
158
+ 2024-05-25 17:54:05,255 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
159
+ 2024-05-25 17:54:10,256 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
160
+ 2024-05-25 17:54:11,472 DEBUG SenderThread:225 [sender.py:send():379] send: stats
161
+ 2024-05-25 17:54:13,149 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
162
+ 2024-05-25 17:54:13,149 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
163
+ 2024-05-25 17:54:13,189 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
164
+ 2024-05-25 17:54:15,324 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
165
+ 2024-05-25 17:54:20,326 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
166
+ 2024-05-25 17:54:25,327 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
167
+ 2024-05-25 17:54:28,119 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
168
+ 2024-05-25 17:54:28,119 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
169
+ 2024-05-25 17:54:28,445 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
170
+ 2024-05-25 17:54:31,278 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
171
+ 2024-05-25 17:54:36,278 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
172
+ 2024-05-25 17:54:41,279 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
173
+ 2024-05-25 17:54:41,472 DEBUG SenderThread:225 [sender.py:send():379] send: stats
174
+ 2024-05-25 17:54:43,119 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
175
+ 2024-05-25 17:54:43,120 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
176
+ 2024-05-25 17:54:43,160 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
177
+ 2024-05-25 17:54:47,268 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
178
+ 2024-05-25 17:54:52,269 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
179
+ 2024-05-25 17:54:57,270 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
180
+ 2024-05-25 17:54:58,119 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
181
+ 2024-05-25 17:54:58,120 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
182
+ 2024-05-25 17:54:58,160 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
183
+ 2024-05-25 17:55:03,228 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
184
+ 2024-05-25 17:55:08,229 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
185
+ 2024-05-25 17:55:11,473 DEBUG SenderThread:225 [sender.py:send():379] send: stats
186
+ 2024-05-25 17:55:13,155 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
187
+ 2024-05-25 17:55:13,155 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
188
+ 2024-05-25 17:55:13,195 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
189
+ 2024-05-25 17:55:13,316 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
190
+ 2024-05-25 17:55:18,317 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
191
+ 2024-05-25 17:55:23,318 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
192
+ 2024-05-25 17:55:28,319 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
193
+ 2024-05-25 17:55:28,576 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
194
+ 2024-05-25 17:55:28,576 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
195
+ 2024-05-25 17:55:28,616 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
196
+ 2024-05-25 17:55:33,671 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
197
+ 2024-05-25 17:55:38,672 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
198
+ 2024-05-25 17:55:40,457 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: partial_history
199
+ 2024-05-25 17:55:40,459 DEBUG SenderThread:225 [sender.py:send():379] send: metric
200
+ 2024-05-25 17:55:40,459 DEBUG SenderThread:225 [sender.py:send():379] send: metric
201
+ 2024-05-25 17:55:40,459 DEBUG SenderThread:225 [sender.py:send():379] send: metric
202
+ 2024-05-25 17:55:40,459 DEBUG SenderThread:225 [sender.py:send():379] send: metric
203
+ 2024-05-25 17:55:40,460 DEBUG SenderThread:225 [sender.py:send():379] send: history
204
+ 2024-05-25 17:55:40,460 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: summary_record
205
+ 2024-05-25 17:55:40,460 INFO SenderThread:225 [sender.py:_save_file():1390] saving file wandb-summary.json with policy end
206
+ 2024-05-25 17:55:40,497 INFO Thread-12 :225 [dir_watcher.py:_on_file_created():271] file/dir created: /kaggle/working/wandb/run-20240525_175010-8ah63pdc/files/wandb-summary.json
207
+ 2024-05-25 17:55:41,474 DEBUG SenderThread:225 [sender.py:send():379] send: stats
208
+ 2024-05-25 17:55:43,544 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
209
+ 2024-05-25 17:55:43,545 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
210
+ 2024-05-25 17:55:43,546 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
211
+ 2024-05-25 17:55:43,690 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
212
+ 2024-05-25 17:55:48,691 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
213
+ 2024-05-25 17:55:50,414 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: partial_history
214
+ 2024-05-25 17:55:50,416 DEBUG SenderThread:225 [sender.py:send():379] send: metric
215
+ 2024-05-25 17:55:50,416 DEBUG SenderThread:225 [sender.py:send():379] send: metric
216
+ 2024-05-25 17:55:50,416 DEBUG SenderThread:225 [sender.py:send():379] send: metric
217
+ 2024-05-25 17:55:50,416 DEBUG SenderThread:225 [sender.py:send():379] send: metric
218
+ 2024-05-25 17:55:50,416 DEBUG SenderThread:225 [sender.py:send():379] send: metric
219
+ 2024-05-25 17:55:50,416 DEBUG SenderThread:225 [sender.py:send():379] send: history
220
+ 2024-05-25 17:55:50,417 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: summary_record
221
+ 2024-05-25 17:55:50,417 INFO SenderThread:225 [sender.py:_save_file():1390] saving file wandb-summary.json with policy end
222
+ 2024-05-25 17:55:50,501 INFO Thread-12 :225 [dir_watcher.py:_on_file_modified():288] file/dir modified: /kaggle/working/wandb/run-20240525_175010-8ah63pdc/files/wandb-summary.json
223
+ 2024-05-25 17:55:54,503 INFO Thread-12 :225 [dir_watcher.py:_on_file_modified():288] file/dir modified: /kaggle/working/wandb/run-20240525_175010-8ah63pdc/files/output.log
224
+ 2024-05-25 17:55:54,691 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
225
+ 2024-05-25 17:55:55,503 INFO Thread-12 :225 [dir_watcher.py:_on_file_modified():288] file/dir modified: /kaggle/working/wandb/run-20240525_175010-8ah63pdc/files/config.yaml
226
+ 2024-05-25 17:55:58,532 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: stop_status
227
+ 2024-05-25 17:55:58,532 DEBUG SenderThread:225 [sender.py:send_request():406] send_request: stop_status
228
+ 2024-05-25 17:55:58,543 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: internal_messages
229
+ 2024-05-25 17:56:00,677 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
230
+ 2024-05-25 17:56:05,678 DEBUG HandlerThread:225 [handler.py:handle_request():146] handle_request: status_report
wandb/run-20240525_175010-8ah63pdc/logs/debug.log ADDED
@@ -0,0 +1,32 @@
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Current SDK version is 0.16.6
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Configure stats pid to 34
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Loading settings from /root/.config/wandb/settings
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Loading settings from /kaggle/working/wandb/settings
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Loading settings from environment variables: {}
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Inferring run settings from compute environment: {'program': '<python with no main file>'}
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Applying login settings: {'api_key': '***REDACTED***'}
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Applying login settings: {'api_key': '***REDACTED***'}
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_setup.py:_flush():76] Applying login settings: {}
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_init.py:_log_setup():521] Logging user logs to /kaggle/working/wandb/run-20240525_175010-8ah63pdc/logs/debug.log
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_init.py:_log_setup():522] Logging internal logs to /kaggle/working/wandb/run-20240525_175010-8ah63pdc/logs/debug-internal.log
+ 2024-05-25 17:50:10,434 INFO MainThread:34 [wandb_init.py:_jupyter_setup():467] configuring jupyter hooks <wandb.sdk.wandb_init._WandbInit object at 0x7fbe99ebfe20>
+ 2024-05-25 17:50:10,435 INFO MainThread:34 [wandb_init.py:init():561] calling init triggers
+ 2024-05-25 17:50:10,435 INFO MainThread:34 [wandb_init.py:init():568] wandb.init called with sweep_config: {}
+ config: {}
+ 2024-05-25 17:50:10,435 INFO MainThread:34 [wandb_init.py:init():611] starting backend
+ 2024-05-25 17:50:10,435 INFO MainThread:34 [wandb_init.py:init():615] setting up manager
+ 2024-05-25 17:50:10,437 INFO MainThread:34 [backend.py:_multiprocessing_setup():105] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
+ 2024-05-25 17:50:10,439 INFO MainThread:34 [wandb_init.py:init():623] backend started and connected
+ 2024-05-25 17:50:10,450 INFO MainThread:34 [wandb_run.py:_label_probe_notebook():1299] probe notebook
+ 2024-05-25 17:50:11,144 INFO MainThread:34 [wandb_init.py:init():715] updated telemetry
+ 2024-05-25 17:50:11,147 INFO MainThread:34 [wandb_init.py:init():748] communicating run to backend with 90.0 second timeout
+ 2024-05-25 17:50:11,381 INFO MainThread:34 [wandb_run.py:_on_init():2357] communicating current version
+ 2024-05-25 17:50:11,442 INFO MainThread:34 [wandb_run.py:_on_init():2366] got version response upgrade_message: "wandb version 0.17.0 is available! To upgrade, please run:\n $ pip install wandb --upgrade"
+
+ 2024-05-25 17:50:11,444 INFO MainThread:34 [wandb_init.py:init():799] starting run threads in backend
+ 2024-05-25 17:50:27,456 INFO MainThread:34 [wandb_run.py:_console_start():2335] atexit reg
+ 2024-05-25 17:50:27,456 INFO MainThread:34 [wandb_run.py:_redirect():2190] redirect: wrap_raw
+ 2024-05-25 17:50:27,456 INFO MainThread:34 [wandb_run.py:_redirect():2255] Wrapping output streams.
+ 2024-05-25 17:50:27,456 INFO MainThread:34 [wandb_run.py:_redirect():2280] Redirects installed.
+ 2024-05-25 17:50:27,457 INFO MainThread:34 [wandb_init.py:init():842] run started, returning control to user process
+ 2024-05-25 17:50:27,464 INFO MainThread:34 [wandb_run.py:_config_callback():1347] config_cb None None {'return_dict': True, 'output_hidden_states': False, 'output_attentions': False, 'torchscript': False, 'torch_dtype': None, 'use_bfloat16': False, 'tf_legacy_loss': False, 'pruned_heads': {}, 'tie_word_embeddings': True, 'chunk_size_feed_forward': 0, 'is_encoder_decoder': False, 'is_decoder': False, 'cross_attention_hidden_size': None, 'add_cross_attention': False, 'tie_encoder_decoder': False, 'max_length': 20, 'min_length': 0, 'do_sample': False, 'early_stopping': False, 'num_beams': 1, 'num_beam_groups': 1, 'diversity_penalty': 0.0, 'temperature': 1.0, 'top_k': 50, 'top_p': 1.0, 'typical_p': 1.0, 'repetition_penalty': 1.0, 'length_penalty': 1.0, 'no_repeat_ngram_size': 0, 'encoder_no_repeat_ngram_size': 0, 'bad_words_ids': None, 'num_return_sequences': 1, 'output_scores': False, 'return_dict_in_generate': False, 'forced_bos_token_id': None, 'forced_eos_token_id': None, 'remove_invalid_values': False, 'exponential_decay_length_penalty': None, 'suppress_tokens': None, 'begin_suppress_tokens': None, 'architectures': ['Wav2Vec2ForPreTraining'], 'finetuning_task': None, 'id2label': {0: 'LABEL_0', 1: 'LABEL_1'}, 'label2id': {'LABEL_0': 0, 'LABEL_1': 1}, 'tokenizer_class': None, 'prefix': None, 'bos_token_id': 1, 'pad_token_id': 0, 'eos_token_id': 2, 'sep_token_id': None, 'decoder_start_token_id': None, 'task_specific_params': None, 'problem_type': None, '_name_or_path': 'facebook/wav2vec2-base', 'transformers_version': '4.39.3', 'freeze_feat_extract_train': True, 'mask_channel_length': 10, 'mask_channel_min_space': 1, 'mask_channel_other': 0.0, 'mask_channel_prob': 0.0, 'mask_channel_selection': 'static', 'mask_time_min_space': 1, 'mask_time_other': 0.0, 'mask_time_selection': 'static', 'model_type': 'wav2vec2', 'no_mask_channel_overlap': False, 'no_mask_time_overlap': False, 'num_feat_extract_layers': 7, 'hidden_size': 768, 'feat_extract_norm': 'group', 
'feat_extract_activation': 'gelu', 'conv_dim': [512, 512, 512, 512, 512, 512, 512], 'conv_stride': [5, 2, 2, 2, 2, 2, 2], 'conv_kernel': [10, 3, 3, 3, 3, 2, 2], 'conv_bias': False, 'num_conv_pos_embeddings': 128, 'num_conv_pos_embedding_groups': 16, 'num_hidden_layers': 12, 'intermediate_size': 3072, 'hidden_act': 'gelu', 'num_attention_heads': 12, 'hidden_dropout': 0.1, 'attention_dropout': 0.1, 'activation_dropout': 0.0, 'feat_proj_dropout': 0.1, 'final_dropout': 0.0, 'layerdrop': 0.0, 'layer_norm_eps': 1e-05, 'initializer_range': 0.02, 'vocab_size': 100000000000000, 'do_stable_layer_norm': False, 'use_weighted_layer_sum': False, 'apply_spec_augment': True, 'mask_time_prob': 0.05, 'mask_time_length': 10, 'mask_time_min_masks': 2, 'mask_feature_prob': 0.0, 'mask_feature_length': 10, 'mask_feature_min_masks': 0, 'num_codevectors_per_group': 320, 'num_codevector_groups': 2, 'contrastive_logits_temperature': 0.1, 'feat_quantizer_dropout': 0.0, 'num_negatives': 100, 'codevector_dim': 256, 'proj_codevector_dim': 256, 'diversity_loss_weight': 0.1, 'ctc_loss_reduction': 'sum', 'ctc_zero_infinity': False, 'add_adapter': False, 'adapter_kernel_size': 3, 'adapter_stride': 2, 'num_adapter_layers': 3, 'output_hidden_size': 768, 'adapter_attn_dim': None, 'classifier_proj_size': 256, 'tdnn_dim': [512, 512, 512, 512, 1500], 'tdnn_kernel': [5, 3, 3, 1, 1], 'tdnn_dilation': [1, 2, 3, 1, 1], 'xvector_output_dim': 512, 'output_dir': '/kaggle/working/', 'overwrite_output_dir': False, 'do_train': False, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'steps', 'prediction_loss_only': False, 'per_device_train_batch_size': 8, 'per_device_eval_batch_size': 8, 'per_gpu_train_batch_size': None, 'per_gpu_eval_batch_size': None, 'gradient_accumulation_steps': 1, 'eval_accumulation_steps': None, 'eval_delay': 0, 'learning_rate': 0.0001, 'weight_decay': 0.005, 'adam_beta1': 0.9, 'adam_beta2': 0.999, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 30, 'max_steps': 
-1, 'lr_scheduler_type': 'linear', 'lr_scheduler_kwargs': {}, 'warmup_ratio': 0.0, 'warmup_steps': 1000, 'log_level': 'passive', 'log_level_replica': 'warning', 'log_on_each_node': True, 'logging_dir': '/kaggle/working/runs/May25_17-49-41_8f1fad5fe1d2', 'logging_strategy': 'steps', 'logging_first_step': False, 'logging_steps': 500, 'logging_nan_inf_filter': True, 'save_strategy': 'steps', 'save_steps': 500, 'save_total_limit': 2, 'save_safetensors': True, 'save_on_each_node': False, 'save_only_model': False, 'no_cuda': False, 'use_cpu': False, 'use_mps_device': False, 'seed': 42, 'data_seed': None, 'jit_mode_eval': False, 'use_ipex': False, 'bf16': False, 'fp16': True, 'fp16_opt_level': 'O1', 'half_precision_backend': 'auto', 'bf16_full_eval': False, 'fp16_full_eval': False, 'tf32': None, 'local_rank': 0, 'ddp_backend': None, 'tpu_num_cores': None, 'tpu_metrics_debug': False, 'debug': [], 'dataloader_drop_last': False, 'eval_steps': 500, 'dataloader_num_workers': 0, 'dataloader_prefetch_factor': None, 'past_index': -1, 'run_name': '/kaggle/working/', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': None, 'load_best_model_at_end': False, 'metric_for_best_model': None, 'greater_is_better': None, 'ignore_data_skip': False, 'fsdp': [], 'fsdp_min_num_params': 0, 'fsdp_config': {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, 'fsdp_transformer_layer_cls_to_wrap': None, 'accelerator_config': {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True}, 'deepspeed': None, 'label_smoothing_factor': 0.0, 'optim': 'adamw_torch', 'optim_args': None, 'adafactor': False, 'group_by_length': True, 'length_column_name': 'length', 'report_to': ['tensorboard', 'wandb'], 'ddp_find_unused_parameters': None, 'ddp_bucket_cap_mb': None, 'ddp_broadcast_buffers': None, 'dataloader_pin_memory': True, 'dataloader_persistent_workers': False, 'skip_memory_metrics': True, 
'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': None, 'hub_model_id': None, 'hub_strategy': 'every_save', 'hub_token': '<HUB_TOKEN>', 'hub_private_repo': False, 'hub_always_push': False, 'gradient_checkpointing': True, 'gradient_checkpointing_kwargs': None, 'include_inputs_for_metrics': False, 'fp16_backend': 'auto', 'push_to_hub_model_id': None, 'push_to_hub_organization': None, 'push_to_hub_token': '<PUSH_TO_HUB_TOKEN>', 'mp_parameters': '', 'auto_find_batch_size': False, 'full_determinism': False, 'torchdynamo': None, 'ray_scope': 'last', 'ddp_timeout': 1800, 'torch_compile': False, 'torch_compile_backend': None, 'torch_compile_mode': None, 'dispatch_batches': None, 'split_batches': None, 'include_tokens_per_second': False, 'include_num_input_tokens_seen': False, 'neftune_noise_alpha': None, 'optim_target_modules': None}
wandb/run-20240525_175010-8ah63pdc/run-8ah63pdc.wandb ADDED
Binary file (15 kB).