2021-07-09 14:14:45,372 INFO MainThread:245719 [wandb_setup.py:_flush():69] Unhandled environment var: WANDB_LOG_MODEL
2021-07-09 14:14:45,372 INFO MainThread:245719 [wandb_setup.py:_flush():69] setting env: {'project': 'hf-flax-gpt2-indonesian', 'entity': 'wandb'}
2021-07-09 14:14:45,372 INFO MainThread:245719 [wandb_setup.py:_flush():69] setting login settings: {}
2021-07-09 14:14:45,372 INFO MainThread:245719 [wandb_init.py:_log_setup():337] Logging user logs to /home/cahya/Work/flax-community/gpt2-medium-indonesian/wandb/run-20210709_141445-2k8cnty2/logs/debug.log
2021-07-09 14:14:45,372 INFO MainThread:245719 [wandb_init.py:_log_setup():338] Logging internal logs to /home/cahya/Work/flax-community/gpt2-medium-indonesian/wandb/run-20210709_141445-2k8cnty2/logs/debug-internal.log
2021-07-09 14:14:45,372 INFO MainThread:245719 [wandb_init.py:init():370] calling init triggers
2021-07-09 14:14:45,372 INFO MainThread:245719 [wandb_init.py:init():375] wandb.init called with sweep_config: {} config: {}
2021-07-09 14:14:45,373 INFO MainThread:245719 [wandb_init.py:init():419] starting backend
2021-07-09 14:14:45,373 INFO MainThread:245719 [backend.py:_multiprocessing_setup():70] multiprocessing start_methods=fork,spawn,forkserver, using: spawn
2021-07-09 14:14:45,413 INFO MainThread:245719 [backend.py:ensure_launched():135] starting backend process...
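(The `wandb_setup.py:_flush()` entries above reflect environment variables read at client startup: `WANDB_PROJECT`/`WANDB_ENTITY` map to the `project`/`entity` settings, while `WANDB_LOG_MODEL` is reported as "Unhandled", i.e. silently ignored by this wandb client version rather than raising an error. A minimal sketch of the environment that would reproduce these entries; the `WANDB_LOG_MODEL` value is a guess, since only the variable name appears in the log:)

```python
import os

# Environment state the wandb_setup._flush() entries reflect.
# WANDB_PROJECT / WANDB_ENTITY are picked up as the 'project' / 'entity'
# settings; WANDB_LOG_MODEL is present but unhandled by this client version.
os.environ["WANDB_PROJECT"] = "hf-flax-gpt2-indonesian"
os.environ["WANDB_ENTITY"] = "wandb"
os.environ["WANDB_LOG_MODEL"] = "true"  # value is hypothetical; only the name is logged
```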
2021-07-09 14:14:45,453 INFO MainThread:245719 [backend.py:ensure_launched():139] started backend process with pid: 246776
2021-07-09 14:14:45,455 INFO MainThread:245719 [wandb_init.py:init():424] backend started and connected
2021-07-09 14:14:45,458 INFO MainThread:245719 [wandb_init.py:init():472] updated telemetry
2021-07-09 14:14:45,458 INFO MainThread:245719 [wandb_init.py:init():491] communicating current version
2021-07-09 14:14:46,105 INFO MainThread:245719 [wandb_init.py:init():496] got version response
2021-07-09 14:14:46,105 INFO MainThread:245719 [wandb_init.py:init():504] communicating run to backend with 30 second timeout
2021-07-09 14:14:46,291 INFO MainThread:245719 [wandb_init.py:init():529] starting run threads in backend
2021-07-09 14:14:49,944 INFO MainThread:245719 [wandb_run.py:_console_start():1623] atexit reg
2021-07-09 14:14:49,944 INFO MainThread:245719 [wandb_run.py:_redirect():1497] redirect: SettingsConsole.REDIRECT
2021-07-09 14:14:49,945 INFO MainThread:245719 [wandb_run.py:_redirect():1502] Redirecting console.
2021-07-09 14:14:49,946 INFO MainThread:245719 [wandb_run.py:_redirect():1558] Redirects installed.
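(The sequence above — backend spawn, version check, run registration, console redirect — is driven by a single user-side `wandb.init` call. A hedged sketch, not the training script itself: `project`/`entity` come from the env captured earlier, and `mode="disabled"` is used here only to keep the sketch network-free; the real run used the default online mode. Guarded in case wandb is not installed:)

```python
# Sketch of the user-side call that produces the init sequence in this log.
try:
    import wandb

    run = wandb.init(
        project="hf-flax-gpt2-indonesian",
        entity="wandb",
        mode="disabled",  # sketch only; the logged run was online
    )
    run.finish()
    initialized = True
except ImportError:
    initialized = False  # wandb not available in this environment
```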
2021-07-09 14:14:49,947 INFO MainThread:245719 [wandb_init.py:init():554] run started, returning control to user process
2021-07-09 14:14:49,956 INFO MainThread:245719 [wandb_run.py:_config_callback():872] config_cb None None {'output_dir': '/home/cahya/Work/flax-community/gpt2-medium-indonesian', 'overwrite_output_dir': True, 'do_train': True, 'do_eval': True, 'do_predict': False, 'evaluation_strategy': 'IntervalStrategy.NO', 'prediction_loss_only': False, 'per_device_train_batch_size': 24, 'per_device_eval_batch_size': 24, 'per_gpu_train_batch_size': None, 'per_gpu_eval_batch_size': None, 'gradient_accumulation_steps': 1, 'eval_accumulation_steps': None, 'learning_rate': 0.0024, 'weight_decay': 0.01, 'adam_beta1': 0.9, 'adam_beta2': 0.98, 'adam_epsilon': 1e-08, 'max_grad_norm': 1.0, 'num_train_epochs': 20.0, 'max_steps': -1, 'lr_scheduler_type': 'SchedulerType.LINEAR', 'warmup_ratio': 0.0, 'warmup_steps': 1000, 'log_level': -1, 'log_level_replica': -1, 'log_on_each_node': True, 'logging_dir': '/home/cahya/Work/flax-community/gpt2-medium-indonesian/runs/Jul09_14-14-49_t1v-n-528d9406-w-0', 'logging_strategy': 'IntervalStrategy.STEPS', 'logging_first_step': False, 'logging_steps': 500, 'save_strategy': 'IntervalStrategy.STEPS', 'save_steps': 10, 'save_total_limit': None, 'save_on_each_node': False, 'no_cuda': False, 'seed': 42, 'fp16': False, 'fp16_opt_level': 'O1', 'fp16_backend': 'auto', 'fp16_full_eval': False, 'local_rank': -1, 'tpu_num_cores': None, 'tpu_metrics_debug': False, 'debug': [], 'dataloader_drop_last': False, 'eval_steps': 10, 'dataloader_num_workers': 64, 'past_index': -1, 'run_name': '/home/cahya/Work/flax-community/gpt2-medium-indonesian', 'disable_tqdm': False, 'remove_unused_columns': True, 'label_names': None, 'load_best_model_at_end': False, 'metric_for_best_model': None, 'greater_is_better': None, 'ignore_data_skip': False, 'sharded_ddp': [], 'deepspeed': None, 'label_smoothing_factor': 0.0, 'adafactor': False, 'group_by_length': False, 'length_column_name': 'length', 'report_to': ['tensorboard', 'wandb'], 'ddp_find_unused_parameters': None, 'dataloader_pin_memory': True, 'skip_memory_metrics': True, 'use_legacy_prediction_loop': False, 'push_to_hub': True, 'resume_from_checkpoint': None, 'push_to_hub_model_id': 'gpt2-medium-indonesian', 'push_to_hub_organization': None, 'push_to_hub_token': None, 'mp_parameters': '', '_n_gpu': 0, '__cached__setup_devices': 'cpu'}
2021-07-09 14:14:49,957 INFO MainThread:245719 [wandb_run.py:_config_callback():872] config_cb None None {'model_name_or_path': None, 'model_type': 'gpt2', 'config_name': '/home/cahya/Work/flax-community/gpt2-medium-indonesian', 'tokenizer_name': '/home/cahya/Work/flax-community/gpt2-medium-indonesian', 'cache_dir': None, 'use_fast_tokenizer': True, 'dtype': 'float32'}
2021-07-09 14:14:49,958 INFO MainThread:245719 [wandb_run.py:_config_callback():872] config_cb None None {'dataset_name': 'oscar', 'dataset_config_name': 'unshuffled_deduplicated_id', 'train_file': None, 'validation_file': None, 'max_train_samples': 10000, 'max_eval_samples': 1000, 'overwrite_cache': False, 'validation_split_percentage': 5, 'block_size': 512, 'preprocessing_num_workers': 64}
2021-07-09 14:14:49,958 INFO MainThread:245719 [wandb_config.py:__setitem__():141] config set test_log = 12345 - >
2021-07-09 14:14:49,958 INFO MainThread:245719 [wandb_run.py:_config_callback():872] config_cb test_log 12345 None
2021-07-09 14:15:27,424 INFO MainThread:245719 [wandb_run.py:_tensorboard_callback():943] tensorboard callback: /home/cahya/Work/flax-community/gpt2-medium-indonesian, None
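(The three `config_cb` entries above record the training, model, and data arguments of the run into `wandb.config`, and the final `config set test_log = 12345` entry is a manual config write. A plain-dict transcription of a representative subset, for reference only; the grouping into a single dict is an editorial sketch, and the attribution of the callbacks to HuggingFace's wandb integration is an assumption based on the `config_cb` call sites:)

```python
# Subset of the hyperparameters recorded by the config_cb entries in this log.
recorded_config = {
    "per_device_train_batch_size": 24,
    "learning_rate": 0.0024,
    "adam_beta1": 0.9,
    "adam_beta2": 0.98,
    "weight_decay": 0.01,
    "warmup_steps": 1000,
    "num_train_epochs": 20.0,
    "block_size": 512,
    "dataset_name": "oscar",
    "dataset_config_name": "unshuffled_deduplicated_id",
    "test_log": 12345,  # written manually via wandb config, per the last two entries
}
```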