/opt/conda/lib/python3.9/site-packages/transformers/tokenization_utils_base.py:1601: FutureWarning: `clean_up_tokenization_spaces` was not set. It will be set to `True` by default. This behavior will be depracted in transformers v4.45, and will be then set to `False` by default. For more details check this issue: https://github.com/huggingface/transformers/issues/31884
  warnings.warn(
Downloading readme:   0%|          | 0.00/271 [00:00<?, ?B/s]
[codecarbon INFO @ 19:58:07] >>> Tracker's metadata:
[codecarbon INFO @ 19:58:07]   Platform system: Linux-5.10.192-183.736.amzn2.x86_64-x86_64-with-glibc2.35
[codecarbon INFO @ 19:58:07]   Python version: 3.9.20
[codecarbon INFO @ 19:58:07]   CodeCarbon version: 2.5.1
[codecarbon INFO @ 19:58:07]   Available RAM : 186.705 GB
[codecarbon INFO @ 19:58:07]   CPU count: 48
[codecarbon INFO @ 19:58:07]   CPU model: AMD EPYC 7R32
[codecarbon INFO @ 19:58:07]   GPU count: 1
[codecarbon INFO @ 19:58:07]   GPU model: 1 x NVIDIA A10G
[codecarbon DEBUG @ 19:58:08] Not running on AWS
[codecarbon DEBUG @ 19:58:09] Not running on Azure
[codecarbon DEBUG @ 19:58:10] Not running on GCP
[codecarbon INFO @ 19:58:10] Saving emissions data to file /runs/codecarbon.csv
[codecarbon DEBUG @ 19:58:10] EmissionsData(timestamp='2024-10-07T19:58:10', project_name='codecarbon', run_id='2d39c1fa-c60c-4a42-ae21-7478b6ecb362', duration=0.0021433930087368935, emissions=0.0, emissions_rate=0.0, cpu_power=0.0, gpu_power=0.0, ram_power=0.0, cpu_energy=0, gpu_energy=0, ram_energy=0, energy_consumed=0, country_name='United States', country_iso_code='USA', region='virginia', cloud_provider='', cloud_region='', os='Linux-5.10.192-183.736.amzn2.x86_64-x86_64-with-glibc2.35', python_version='3.9.20', codecarbon_version='2.5.1', cpu_count=48, cpu_model='AMD EPYC 7R32', gpu_count=1, gpu_model='1 x NVIDIA A10G', longitude=-77.4903, latitude=39.0469, ram_total_size=186.7047882080078, tracking_mode='process', on_cloud='N', pue=1.0)
Filter:   0%|          | 0/1000 [00:00<?, ? examples/s]
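The script that produced this output is not shown here. The sketch below is a minimal, assumed reconstruction of the kind of setup that would emit it: a CodeCarbon `EmissionsTracker` writing to `/runs/codecarbon.csv` with `tracking_mode='process'` and debug logging, wrapped around a 🤗 Datasets `.filter()` pass over 1,000 examples. The model checkpoint, dataset name, and filtering predicate are illustrative placeholders, not values taken from the log.

```python
# Minimal sketch (not the original script): measure emissions of a dataset-filtering
# step with CodeCarbon. Output path, tracking mode, and the 1000-example split are
# taken from the log above; everything else is an assumption.
from codecarbon import EmissionsTracker
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "distilbert-base-uncased",          # hypothetical checkpoint; the log does not name one
    clean_up_tokenization_spaces=True,  # set explicitly to silence the FutureWarning above
)

ds = load_dataset("imdb", split="train[:1000]")  # hypothetical dataset, 1000 examples as in the log

tracker = EmissionsTracker(
    output_dir="/runs",
    output_file="codecarbon.csv",
    tracking_mode="process",
    log_level="debug",  # DEBUG lines such as "Not running on AWS" come from this level
)
tracker.start()
try:
    # The "Filter: 0/1000" progress bar corresponds to a datasets .filter() call;
    # the predicate here is purely illustrative.
    ds = ds.filter(lambda ex: len(tokenizer(ex["text"])["input_ids"]) <= 512)
finally:
    emissions = tracker.stop()  # returns emissions in kg CO2eq and flushes the CSV
```

Stopping the tracker appends one row of the `EmissionsData` fields shown in the log (duration, cpu/gpu/ram energy, location, hardware) to the CSV, so repeated runs can be compared from the same file.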