IlyasMoutawwakil (HF Staff) committed
Commit 68ffcbf · verified · Parent(s): 4cb492e

Upload perf-df-unquantized-1xT4.csv with huggingface_hub

Files changed (1):
  1. perf-df-unquantized-1xT4.csv +12 -12
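The commit message states the file was uploaded with huggingface_hub; a minimal sketch of the equivalent call, with the target repo_id left as a placeholder since it is not shown on this page:

from huggingface_hub import HfApi

# repo_id is a placeholder for illustration; the dataset repo that received
# this commit is not identified in the excerpt above.
api = HfApi()
api.upload_file(
    path_or_fileobj="perf-df-unquantized-1xT4.csv",
    path_in_repo="perf-df-unquantized-1xT4.csv",
    repo_id="<namespace>/<dataset-name>",
    repo_type="dataset",
    commit_message="Upload perf-df-unquantized-1xT4.csv with huggingface_hub",
)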
perf-df-unquantized-1xT4.csv CHANGED
@@ -1113,7 +1113,7 @@ ChildProcessError: Traceback (most recent call last):
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
- torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 21522 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 21809 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  bfloat16-eager,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,bfloat16,True,False,,eager,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
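Every OOM row in this diff ends with the same allocator hint; a minimal sketch of applying it, assuming the variable is set before torch initializes CUDA (layer sizes below are illustrative):

import os

# The CUDA caching allocator reads this option when it initializes, so it
# must be set before any CUDA work happens in the process.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "expandable_segments:True"

import torch

# Illustrative allocation; with expandable segments the allocator can grow
# existing segments instead of fragmenting memory into fixed-size blocks.
layer = torch.nn.Linear(5120, 20480, device="cuda")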
@@ -7210,7 +7210,7 @@ ChildProcessError: Traceback (most recent call last):
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
- torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 560.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 8.12 MiB is free. Process 21875 has 14.73 GiB memory in use. Of the allocated memory 14.62 GiB is allocated by PyTorch, and 1.67 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 560.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 8.12 MiB is free. Process 22155 has 14.73 GiB memory in use. Of the allocated memory 14.62 GiB is allocated by PyTorch, and 1.67 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float32-sdpa,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,float32,True,False,,sdpa,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
@@ -8020,7 +8020,7 @@ ChildProcessError: Traceback (most recent call last):
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
- torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 64.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 30.12 MiB is free. Process 19433 has 14.71 GiB memory in use. Of the allocated memory 14.51 GiB is allocated by PyTorch, and 85.33 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 64.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 30.12 MiB is free. Process 19776 has 14.71 GiB memory in use. Of the allocated memory 14.51 GiB is allocated by PyTorch, and 85.33 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float32-sdpa,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,mistral,TencentARC/Mistral_Pro_8B_v0.1,TencentARC/Mistral_Pro_8B_v0.1,cuda,0,42,,,True,True,,float32,True,False,,sdpa,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
@@ -14545,7 +14545,7 @@ ChildProcessError: Traceback (most recent call last):
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
- torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 23279 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 23522 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  bfloat16-flash_attention_2,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,bfloat16,True,False,,flash_attention_2,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
@@ -19984,7 +19984,7 @@ ChildProcessError: Traceback (most recent call last):
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
- torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 22931 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 23203 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float16-flash_attention_2,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,float16,True,False,,flash_attention_2,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
@@ -23674,7 +23674,7 @@ ChildProcessError: Traceback (most recent call last):
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
- torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 224.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 162.12 MiB is free. Process 23633 has 14.58 GiB memory in use. Of the allocated memory 14.44 GiB is allocated by PyTorch, and 25.46 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 224.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 162.12 MiB is free. Process 23918 has 14.58 GiB memory in use. Of the allocated memory 14.44 GiB is allocated by PyTorch, and 25.46 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float32-eager,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,stablelm,stabilityai/stablelm-3b-4e1t,stabilityai/stablelm-3b-4e1t,cuda,0,42,,,True,True,,float32,True,False,,eager,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.224-212.876.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.4.0,,4.44.2,,0.34.2,,,,1.22.0,,,,0.12.0,,,MB,884.5312,11792.154624,0.0,11389.632512,11388.883968,s,1,7.62391650390625,7.62391650390625,0.0,7.62391650390625,7.62391650390625,7.62391650390625,7.62391650390625,[7.62391650390625],,kWh,7.550980120830293e-06,8.183254871460181e-07,4.161947774000696e-06,1.2531253381977007e-05,,MB,1211.850752,12089.950208,0.0,11676.942336,11620.241408,s,10,3.529409362792969,0.35294093627929685,0.004687535482057751,0.3546914520263672,0.3575145141601562,0.3580895935058594,0.35854965698242186,"[0.3421952209472656, 0.3542493896484375, 0.35546945190429685, 0.3586646728515625, 0.3475618896484375, 0.3554535827636719, 0.35738671875, 0.35513351440429686, 0.35198422241210936, 0.3513106994628906]",tokens/s,725.333826953461,kWh,1.0179109382326568e-05,1.1225649051281122e-06,6.76206479662085e-06,1.806373908407553e-05,tokens/kWh,14172038.181490464,MB,1217.179648,12089.950208,0.0,11676.942336,11620.243968,s,10,31.83818115234375,3.1838181152343745,0.0023803913060039604,3.1841195068359376,3.1869217041015627,3.1870340942382813,3.187124006347656,"[3.182017333984375, 3.184160400390625, 3.182552490234375, 3.183121826171875, 3.17848828125, 3.18476953125, 3.184949462890625, 3.18407861328125, 3.186896728515625, 3.187146484375]",tokens/s,19.78756251764159,kWh,9.316640863058924e-05,1.0276466099167838e-05,6.183607341517975e-05,0.00016527894814493684,tokens/kWh,381173.7714155458,,s,630,31.834508899688718,0.05053096650744242,0.00027557110689108155,0.05054019165039063,0.050829452133178715,0.05090588703155518,0.0515531579208374,"[0.05157068634033203, 0.05063679885864258, 0.05017599868774414, 0.04999980926513672, 0.05020832061767578, 0.0501212158203125, 0.05012678527832031, 0.05012886428833008, 0.0501822395324707, 0.05013913726806641, 0.05027222442626953, 0.0502599983215332, 0.05031321716308594, 0.050114559173583983, 0.050184192657470705, 0.05008902359008789, 0.05028300857543945, 0.05022150421142578, 0.050450401306152345, 0.050608158111572266, 0.050603870391845704, 0.05046031951904297, 0.05046732711791992, 0.05025177764892578, 0.05030246353149414, 0.05031520080566406, 0.05030470275878906, 0.05025471878051758, 0.050288639068603515, 0.0503166389465332, 0.05038966369628906, 0.050348033905029295, 0.05049305725097656, 0.0504323844909668, 0.05052537536621094, 0.05059795379638672, 0.050791168212890626, 0.050683902740478515, 0.050716670989990234, 0.0506341438293457, 0.05071660614013672, 0.05068479919433594, 0.050669345855712894, 0.05062047958374023, 0.050672863006591795, 0.05063958358764648, 0.05063663864135742, 0.05060214233398438, 0.050561023712158204, 0.050677761077880856, 0.050670848846435544, 0.050648929595947266, 0.05065411376953125, 0.05081087875366211, 0.05064089584350586, 0.050722270965576174, 0.05067830276489258, 0.050756607055664066, 0.050918399810791014, 0.050888671875, 0.050864158630371095, 0.05086617660522461, 0.05081292724609375, 0.051525279998779296, 
0.050764320373535156, 0.050282497406005856, 0.05028432083129883, 0.050175617218017575, 0.05016841506958008, 0.050237438201904294, 0.05014323043823242, 0.0505239372253418, 0.05024528121948242, 0.050235969543457035, 0.05016899108886719, 0.05016569519042969, 0.05054054260253906, 0.050278560638427734, 0.05021343994140625, 0.050309310913085936, 0.05040947341918945, 0.05056512069702149, 0.0507064323425293, 0.05078015899658203, 0.05053984069824219, 0.05064265441894531, 0.05047590255737305, 0.050345535278320315, 0.050342430114746095, 0.05032470321655273, 0.050332447052001954, 0.05036646270751953, 0.05034710311889649, 0.05033184051513672, 0.05041430282592774, 0.050388992309570314, 0.050484577178955076, 0.050481822967529295, 0.05046476745605469, 0.05060403060913086, 0.050710529327392576, 0.05065523147583008, 0.0508040657043457, 0.05062518310546875, 0.05088256072998047, 0.050783905029296875, 0.0506412467956543, 0.05057126235961914, 0.05061964797973633, 0.05052617645263672, 0.05053462219238281, 0.05060211181640625, 0.05091987228393555, 0.05053961563110351, 0.05069680023193359, 0.050579776763916014, 0.05063679885864258, 0.05060713577270508, 0.05070742416381836, 0.0507125129699707, 0.05084726333618164, 0.05077660751342773, 0.05074691009521484, 0.050825695037841796, 0.050826366424560544, 0.0507831039428711, 0.05176115036010742, 0.05078742218017578, 0.05025788879394531, 0.050086849212646486, 0.05018009567260742, 0.05006131362915039, 0.05021491241455078, 0.05018009567260742, 0.050098175048828124, 0.05026764678955078, 0.05028915023803711, 0.050214336395263674, 0.05011308670043945, 0.05016323089599609, 0.05026863861083984, 0.05026601409912109, 0.0501712646484375, 0.0502413444519043, 0.050406303405761715, 0.050730945587158204, 0.05065439987182617, 0.05046566390991211, 0.05028400039672851, 0.05036624145507813, 0.050342655181884764, 0.050331649780273435, 0.05030857467651367, 0.05032400131225586, 0.05023539352416992, 0.05045862579345703, 0.050267486572265624, 0.050375328063964844, 0.05041766357421875, 0.05050518417358398, 0.05041404724121094, 0.050505214691162106, 0.050496063232421874, 0.05064908981323242, 0.050601982116699216, 0.050730720520019534, 0.05067190551757812, 0.050792224884033205, 0.05067388916015625, 0.05064838409423828, 0.05060063934326172, 0.05065017700195312, 0.05058246231079101, 0.050572414398193356, 0.05065804672241211, 0.050724990844726564, 0.05075465774536133, 0.050664447784423826, 0.050587169647216795, 0.0507457275390625, 0.0506695671081543, 0.05082278442382813, 0.05076825714111328, 0.05076582336425781, 0.05092313766479492, 0.05083552169799805, 0.0510134391784668, 0.05077862548828125, 0.05077196884155273, 0.05157795333862305, 0.05066435241699219, 0.050216960906982425, 0.05022304153442383, 0.05025388717651367, 0.050260990142822266, 0.05018931198120117, 0.05029619216918945, 0.05014182281494141, 0.05024470520019531, 0.05017436981201172, 0.05022771072387695, 0.05018761444091797, 0.050200672149658204, 0.050237472534179685, 0.05018668746948242, 0.05032515335083008, 0.05035647964477539, 0.05054278564453125, 0.050611743927001955, 0.0505382080078125, 0.05049542236328125, 0.05042067337036133, 0.050366336822509766, 0.050411518096923826, 0.050359809875488284, 0.05037107086181641, 0.05032102584838867, 0.05031472015380859, 0.05029776000976562, 0.050251136779785155, 0.0501798095703125, 0.05077699279785156, 0.05042937469482422, 0.05047558212280273, 0.050522113800048826, 0.0506033935546875, 0.05062838363647461, 0.05064380645751953, 0.050713951110839844, 0.05069635009765625, 0.050661376953125, 
0.05061593627929688, 0.05082815933227539, 0.050627777099609375, 0.050628704071044923, 0.05066416168212891, 0.0506429443359375, 0.050544639587402344, 0.05070025634765625, 0.05059135818481445, 0.050641311645507815, 0.05067571258544922, 0.050735103607177735, 0.05065932846069336, 0.05071638488769531, 0.05073891067504883, 0.050805313110351566, 0.051035999298095706, 0.05081718444824219, 0.05088774490356445, 0.05084572982788086, 0.050708511352539065, 0.0516328010559082, 0.05083071899414063, 0.0502685432434082, 0.050114814758300784, 0.04996432113647461, 0.05011324691772461, 0.0500747184753418, 0.050125247955322264, 0.050012126922607425, 0.050130752563476565, 0.05020947265625, 0.05003878402709961, 0.05019375991821289, 0.0500968017578125, 0.05019180679321289, 0.050002494812011716, 0.05018009567260742, 0.05032755279541016, 0.05048934555053711, 0.050552833557128904, 0.0506363525390625, 0.05055123138427734, 0.05043404769897461, 0.05017734527587891, 0.05028524780273438, 0.05028659057617187, 0.05039436721801758, 0.050311103820800784, 0.05030361557006836, 0.05038083267211914, 0.05047628784179688, 0.05034281539916992, 0.05041907119750977, 0.050518657684326174, 0.050485183715820316, 0.050433406829833986, 0.05061907196044922, 0.05104435348510742, 0.05074716949462891, 0.050743518829345705, 0.05070771026611328, 0.05074764633178711, 0.050567680358886716, 0.05039308929443359, 0.05050483322143555, 0.050514495849609375, 0.050454784393310546, 0.05053635025024414, 0.05047449493408203, 0.05055894470214844, 0.050471614837646485, 0.050339839935302735, 0.05043199920654297, 0.05055487823486328, 0.05052604675292969, 0.050417823791503905, 0.05076377487182617, 0.050577407836914064, 0.05073020935058594, 0.050549537658691406, 0.050799713134765626, 0.05066640090942383, 0.05071257781982422, 0.05145964813232422, 0.05054844665527344, 0.050289375305175785, 0.05014323043823242, 0.05007974243164062, 0.05015961456298828, 0.05014473724365234, 0.050159233093261715, 0.05015644836425781, 0.05013078308105469, 0.05033964920043945, 0.050127201080322266, 0.05020985412597656, 0.05020528030395508, 0.050239742279052736, 0.050208255767822264, 0.0502125129699707, 0.050291648864746095, 0.050527423858642576, 0.05068854522705078, 0.05069990539550781, 0.05063910293579101, 0.05040700912475586, 0.050430656433105465, 0.05036044692993164, 0.050348033905029295, 0.05039616012573242, 0.05031628799438476, 0.0504131851196289, 0.050463104248046876, 0.050522113800048826, 0.0504439697265625, 0.050547008514404294, 0.05039513778686523, 0.05056716918945312, 0.05050294494628906, 0.05060630416870117, 0.05085190582275391, 0.050743743896484374, 0.05082931137084961, 0.05075107192993164, 0.05077648162841797, 0.05074943923950195, 0.050710529327392576, 0.05063683319091797, 0.05081494522094727, 0.05076172637939453, 0.050702335357666016, 0.050560577392578125, 0.05079443359375, 0.05055744171142578, 0.050683902740478515, 0.05067161560058594, 0.05070438385009766, 0.050683902740478515, 0.05082316970825195, 0.05081497573852539, 0.05084934234619141, 0.05088848114013672, 0.051154590606689455, 0.05082371139526367, 0.05086051177978516, 0.05086003112792969, 0.05167411041259766, 0.05076416015625, 0.05025817489624024, 0.050057567596435544, 0.05016697692871094, 0.050033470153808594, 0.0501288948059082, 0.050171585083007814, 0.0503565444946289, 0.05030297470092773, 0.05019343948364258, 0.050315265655517576, 0.050230239868164064, 0.050321407318115234, 0.050130752563476565, 0.050219200134277345, 0.050298881530761716, 0.05032470321655273, 0.05045123291015625, 0.05072895812988281, 
0.050980289459228514, 0.05049196624755859, 0.05040083312988281, 0.05036281585693359, 0.050290687561035156, 0.05039913558959961, 0.05032742309570312, 0.050343391418457034, 0.0503078384399414, 0.05040332794189453, 0.050323455810546876, 0.05034521484375, 0.050372608184814455, 0.05044940948486328, 0.05051932907104492, 0.050546783447265625, 0.05065356826782227, 0.050713951110839844, 0.05072553634643555, 0.050855934143066404, 0.050733055114746094, 0.050826271057128905, 0.05070025634765625, 0.050715648651123046, 0.05075353622436524, 0.05068364715576172, 0.050673473358154295, 0.05064134216308594, 0.050552833557128904, 0.05060748672485352, 0.05072313690185547, 0.050968734741210935, 0.05065887832641602, 0.05067190551757812, 0.05074736022949219, 0.050743072509765626, 0.050821407318115235, 0.05080092620849609, 0.051076225280761715, 0.050979328155517575, 0.05088614273071289, 0.05086819076538086, 0.05080361557006836, 0.05156454467773437, 0.05069004821777344, 0.05017331314086914, 0.05013471984863281, 0.050222015380859374, 0.050135040283203126, 0.050106529235839845, 0.05019007873535156, 0.05018838500976563, 0.050157569885253904, 0.05022719955444336, 0.050249729156494144, 0.05029033660888672, 0.05029513549804687, 0.05026406478881836, 0.050253822326660154, 0.05036236953735351, 0.05036236953735351, 0.05059379196166992, 0.050710529327392576, 0.050644992828369144, 0.05053440093994141, 0.05041916656494141, 0.05029724884033203, 0.050321537017822264, 0.05035523223876953, 0.05025481414794922, 0.05032112121582031, 0.05026230239868164, 0.05034598541259765, 0.050414623260498045, 0.050331615447998045, 0.0504637451171875, 0.050444286346435545, 0.05056460952758789, 0.05061254501342773, 0.0507578239440918, 0.05066342544555664, 0.05086617660522461, 0.0507446403503418, 0.05072272109985351, 0.050756385803222656, 0.05069823837280273, 0.05067712020874023, 0.05060262298583985, 0.050685791015625, 0.05061840057373047, 0.05070415878295898, 0.050590049743652346, 0.05062041473388672, 0.05067497634887695, 0.050662113189697267, 0.050702335357666016, 0.050728225708007814, 0.05058428955078125, 0.05086396789550781, 0.050718753814697266, 0.05084985733032227, 0.05083961486816406, 0.050917377471923826, 0.05102592086791992, 0.050878398895263674, 0.05079849624633789, 0.051515392303466793, 0.0507325439453125, 0.050237953186035154, 0.050148544311523435, 0.05016044616699219, 0.050098175048828124, 0.05015075302124023, 0.050274528503417966, 0.05015596771240234, 0.05026201629638672, 0.050288639068603515, 0.05025958251953125, 0.0502685432434082, 0.0503616943359375, 0.0503548469543457, 0.05034598541259765, 0.05027174377441406, 0.05040560150146484, 0.05047087860107422, 0.05074348831176758, 0.050603809356689455, 0.050571231842041015, 0.05043033599853516, 0.05047091293334961, 0.050423809051513675, 0.050444286346435545, 0.050385982513427734, 0.050419872283935546, 0.0503359375, 0.050477664947509764, 0.05051145553588867, 0.05044675064086914, 0.05045248031616211, 0.05052403259277344, 0.050544769287109374, 0.05060403060913086, 0.0506695671081543, 0.050784255981445314, 0.05088051223754883, 0.05083135986328125, 0.050786304473876956, 0.050759681701660155, 0.0506960334777832, 0.05073321533203125, 0.05059756851196289, 0.050772289276123046, 0.050599872589111326, 0.05106284713745117, 0.05055897521972656, 0.0506668815612793, 0.05067020797729492, 0.050710529327392576, 0.05075353622436524, 0.05072860717773438, 0.05068172836303711, 0.050907615661621095, 0.05076732635498047, 0.05091382217407227, 0.05091328048706055, 0.050958335876464846, 0.05094400024414063, 
0.05111529541015625, 0.05090377426147461, 0.05180992126464844, 0.05079507064819336, 0.05030809783935547, 0.05042822265625, 0.0502545280456543, 0.050229248046875, 0.05015497589111328, 0.05022771072387695, 0.0503337287902832, 0.05023871994018555, 0.050244350433349606, 0.05028160095214844, 0.050328449249267576, 0.05023539352416992, 0.05029478454589844, 0.050300289154052734, 0.05050636672973633, 0.0504189453125, 0.050590465545654294, 0.050694145202636716, 0.0506429443359375, 0.050563072204589846, 0.05057712173461914, 0.0503803825378418, 0.050430656433105465, 0.05040719985961914, 0.05041584014892578, 0.05039427185058594, 0.05044924926757813, 0.05038256072998047, 0.05050601577758789, 0.050493438720703124, 0.05043404769897461, 0.050462718963623046, 0.0505300178527832, 0.05057769775390625, 0.05056668853759766, 0.05073273468017578, 0.05082191848754883, 0.05086207962036133, 0.05089427185058594, 0.0508642578125, 0.05078470230102539, 0.050763454437255856, 0.050624095916748046, 0.050673728942871095, 0.050621086120605466, 0.05074694442749023, 0.050661823272705075, 0.05068755340576172, 0.05058339309692383, 0.05066403198242188, 0.05062793731689453, 0.05070492935180664, 0.05073932647705078, 0.05076377487182617, 0.05081862258911133, 0.050839969635009766, 0.05094403076171875, 0.05083135986328125, 0.05095616149902344, 0.05086220932006836, 0.050818687438964845]",tokens/s,19.78984510127499,,
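The bracketed list above holds the 630 per-call decode latencies that the neighboring columns summarize (count, total, mean, stdev, median and upper percentiles, tokens/s). A small numpy sketch of that aggregation, assuming one generated token per decode call, which matches 1 / 0.05053 s ≈ 19.79 tokens/s:

import numpy as np

# First few decode latencies from the bracketed list above, in seconds.
latencies = np.array([0.05157, 0.05064, 0.05018, 0.05000, 0.05021, 0.05012])

summary = {
    "count": latencies.size,
    "total_s": latencies.sum(),
    "mean_s": latencies.mean(),
    "stdev_s": latencies.std(),
    "p50_s": np.percentile(latencies, 50),
    "p95_s": np.percentile(latencies, 95),
    "p99_s": np.percentile(latencies, 99),
}
# Assuming one token per decode call, throughput is the reciprocal of the mean.
summary["tokens_per_s"] = 1.0 / summary["mean_s"]
print(summary)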
@@ -23764,7 +23764,7 @@ ChildProcessError: Traceback (most recent call last):
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
- torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 256.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 94.12 MiB is free. Process 27766 has 14.65 GiB memory in use. Of the allocated memory 14.53 GiB is allocated by PyTorch, and 2.49 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float32-eager,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,stablelm,stabilityai/stablelm-2-1_6b,stabilityai/stablelm-2-1_6b,cuda,0,42,,,True,True,,float32,True,False,,eager,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.224-212.876.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.4.0,,4.44.2,,0.34.2,,,,1.22.0,,,,0.12.0,,,MB,902.3488,6993.870848,0.0,6591.348736,6590.657536,s,1,7.7316650390625,7.7316650390625,0.0,7.7316650390625,7.7316650390625,7.7316650390625,7.7316650390625,[7.7316650390625],,kWh,6.249157045840547e-06,6.81981213064745e-07,2.031390514009579e-06,8.962528772914872e-06,,MB,1234.321408,7258.112,0.0,6845.104128,6805.125632,s,10,2.2387210540771485,0.22387210540771485,0.006860980982676506,0.22428342437744142,0.230287255859375,0.23189871673583984,0.23318788543701172,"[0.205762939453125, 0.22159222412109375, 0.2255982666015625, 0.22400172424316406, 0.22380921936035156, 0.22599891662597657, 0.22992915344238282, 0.22456512451171876, 0.22395330810546876, 0.2335101776123047]",tokens/s,1143.510039063482,kWh,6.283555607446845e-06,6.924726440920054e-07,4.172225560000033e-06,1.1148253811538883e-05,tokens/kWh,22963237.501377113,MB,1246.425088,7260.209152,0.0,6847.20128,6805.128192,s,10,17.170465332031252,1.7170465332031248,0.0037074144946325858,1.71831298828125,1.7202278564453124,1.7209889892578125,1.7215978955078124,"[1.710521240234375, 1.7104453125, 1.717341552734375, 1.7161055908203124, 1.7200587158203124, 1.7158358154296875, 1.719595703125, 1.719284423828125, 1.7217501220703124, 1.71952685546875]",tokens/s,36.69091010741243,kWh,5.047899915463414e-05,5.568098191081764e-06,3.353133238059929e-05,8.95784297263152e-05,tokens/kWh,703294.3108344382,,s,630,17.16719076919555,0.027249509157453264,0.0003084332683327531,0.027201295852661133,0.027428953170776368,0.02754782371520996,0.029114681129455568,"[0.02920857620239258, 0.027962623596191408, 0.027306751251220705, 0.02713363265991211, 0.02696633529663086, 0.0268569278717041, 0.026939264297485353, 0.0270032958984375, 0.027158784866333007, 0.02697216033935547, 0.02710323143005371, 0.026866912841796875, 0.027019039154052734, 0.02698137664794922, 0.026959871292114256, 0.027086143493652345, 0.027078880310058593, 0.02702998352050781, 0.02715190315246582, 0.0271549129486084, 0.02708684730529785, 0.027023359298706053, 0.026994688034057617, 0.02690457534790039, 0.026973791122436523, 0.026966367721557617, 0.02702547264099121, 0.027062271118164064, 0.027107328414916993, 0.027037696838378908, 0.027104864120483397, 0.02713360023498535, 0.027085567474365236, 0.027183103561401366, 0.02717695999145508, 0.02717695999145508, 0.02726924705505371, 0.02724995231628418, 0.027183712005615233, 0.027262975692749023, 0.027143999099731444, 0.02703968048095703, 0.02712396812438965, 0.027084800720214845, 0.027229471206665037, 0.027157215118408202, 0.02711347198486328, 0.027127487182617187, 0.027089120864868164, 0.0271092472076416, 0.027039968490600585, 0.027068416595458986, 0.027164831161499023, 0.027126655578613282, 0.027124704360961913, 0.027152256011962892, 0.0271648006439209, 0.027328096389770507, 0.027172672271728517, 0.027238079071044922, 0.027163551330566405, 0.027172319412231444, 0.02713654327392578, 
0.029059200286865233, 0.027767072677612303, 0.027430912017822266, 0.027130943298339844, 0.02695382308959961, 0.026888288497924805, 0.026823423385620118, 0.02681999969482422, 0.026834623336791992, 0.026835872650146485, 0.0268287353515625, 0.026856544494628907, 0.02683798408508301, 0.02689571189880371, 0.026927743911743164, 0.026850784301757812, 0.02692153549194336, 0.02693049621582031, 0.02693129539489746, 0.026992639541625976, 0.02686934471130371, 0.02687283134460449, 0.02697420883178711, 0.026998144149780273, 0.02692531204223633, 0.027044063568115236, 0.027005376815795897, 0.026913503646850585, 0.026970687866210936, 0.027146047592163085, 0.027150976181030274, 0.027090368270874025, 0.027050559997558593, 0.02710688018798828, 0.027177215576171875, 0.027262720108032226, 0.027237920761108397, 0.02731820869445801, 0.027265695571899413, 0.02716703987121582, 0.027224063873291016, 0.027195072174072264, 0.027248960494995117, 0.027235519409179686, 0.027212608337402345, 0.027276863098144533, 0.027064159393310548, 0.02706697654724121, 0.027244543075561522, 0.02732758331298828, 0.027256864547729492, 0.027294303894042967, 0.027201055526733398, 0.02720844841003418, 0.02732646369934082, 0.0273768310546875, 0.027267904281616212, 0.027340799331665038, 0.02733875274658203, 0.02734489631652832, 0.027337728500366212, 0.027339775085449217, 0.027310144424438475, 0.02922528076171875, 0.028089088439941408, 0.027412416458129883, 0.027137535095214844, 0.026980863571166993, 0.0270231990814209, 0.027017375946044923, 0.026922752380371093, 0.026970495223999025, 0.026967487335205077, 0.02702761650085449, 0.026882335662841796, 0.0269434871673584, 0.02712348747253418, 0.02698467254638672, 0.027031551361083983, 0.02706790351867676, 0.02699235153198242, 0.027068832397460937, 0.02701468849182129, 0.027100000381469726, 0.027064319610595702, 0.02701923179626465, 0.02710905647277832, 0.027117952346801758, 0.02715385627746582, 0.027113792419433593, 0.027152896881103516, 0.027629280090332033, 0.027254783630371093, 0.02714419174194336, 0.0271297607421875, 0.027088991165161135, 0.027185152053833008, 0.027309120178222655, 0.027360191345214845, 0.027394048690795897, 0.02736332893371582, 0.02733670425415039, 0.027389951705932617, 0.02728550338745117, 0.027299840927124022, 0.027241535186767578, 0.02724959945678711, 0.02727071952819824, 0.027408832550048827, 0.02761897659301758, 0.02734444808959961, 0.027466527938842772, 0.02727084732055664, 0.027217376708984376, 0.02726323127746582, 0.027226720809936523, 0.02728976058959961, 0.02738582420349121, 0.02736729621887207, 0.027267072677612306, 0.027381759643554687, 0.02735308837890625, 0.027297792434692384, 0.02759065628051758, 0.027289600372314454, 0.02734489631652832, 0.029165567398071288, 0.027844608306884764, 0.027445247650146484, 0.02715238380432129, 0.02704979133605957, 0.027089439392089843, 0.027046720504760743, 0.026866527557373048, 0.02694963264465332, 0.027118783950805664, 0.027009408950805665, 0.027068864822387694, 0.027024831771850586, 0.02709766387939453, 0.02703900718688965, 0.026954463958740234, 0.027068288803100585, 0.027117055892944338, 0.02703580856323242, 0.027044319152832032, 0.0269816951751709, 0.02692915153503418, 0.027212480545043945, 0.027043840408325196, 0.027088895797729492, 0.02714182472229004, 0.027171104431152344, 0.02714147186279297, 0.02710780715942383, 0.02711356735229492, 0.02712588882446289, 0.027033599853515625, 0.02709708786010742, 0.027112543106079103, 0.02733353614807129, 0.02736073684692383, 0.027298240661621093, 0.02723961639404297, 
0.027226207733154296, 0.027177791595458984, 0.02716057586669922, 0.0271824951171875, 0.027118335723876952, 0.027200767517089844, 0.027300128936767577, 0.027287263870239258, 0.028184736251831054, 0.027277759552001953, 0.02735103988647461, 0.027225183486938476, 0.027244543075561522, 0.027198368072509766, 0.027189504623413085, 0.027268255233764648, 0.027226720809936523, 0.027285375595092774, 0.027407615661621094, 0.027429759979248045, 0.027482112884521483, 0.02749833679199219, 0.0274303035736084, 0.027312671661376953, 0.027383712768554686, 0.029056671142578126, 0.02809231948852539, 0.0275479679107666, 0.02731430435180664, 0.027148191452026366, 0.027002975463867186, 0.027006912231445312, 0.026961376190185547, 0.027017759323120116, 0.026984512329101564, 0.02714419174194336, 0.027166015625, 0.027130048751831056, 0.027044095993041993, 0.02710758399963379, 0.027058176040649414, 0.027076608657836915, 0.02712291145324707, 0.027484960556030273, 0.02711142349243164, 0.027092992782592775, 0.02723347282409668, 0.027165311813354492, 0.02717305564880371, 0.027094688415527344, 0.027172927856445313, 0.02742448043823242, 0.027142240524291993, 0.027117151260375977, 0.027099039077758787, 0.027160671234130858, 0.027085599899291993, 0.027129247665405275, 0.02733647918701172, 0.0273437442779541, 0.027329759597778322, 0.02749932861328125, 0.02735923194885254, 0.027420543670654298, 0.027272352218627928, 0.02740678405761719, 0.027724096298217774, 0.027195615768432616, 0.027242496490478517, 0.027271360397338868, 0.027303743362426757, 0.027267072677612306, 0.027432159423828126, 0.027980512619018554, 0.027295167922973634, 0.027251039505004883, 0.02727497673034668, 0.02727174377441406, 0.02728691291809082, 0.027336864471435546, 0.027231775283813476, 0.027376415252685547, 0.02741196823120117, 0.027474111557006835, 0.027339231491088866, 0.02738096046447754, 0.027368223190307617, 0.027387903213500975, 0.029067264556884766, 0.027836191177368165, 0.027463712692260743, 0.02726691246032715, 0.02712816047668457, 0.026995967864990235, 0.02694790458679199, 0.027057823181152345, 0.02712451171875, 0.027010368347167968, 0.02707321548461914, 0.026979808807373048, 0.02699078369140625, 0.0270728645324707, 0.027080703735351562, 0.02712166404724121, 0.02707587242126465, 0.02703603172302246, 0.02702351951599121, 0.027017248153686522, 0.02718022346496582, 0.027069408416748045, 0.02711347198486328, 0.027090431213378906, 0.02708531188964844, 0.027107328414916993, 0.027181055068969725, 0.027136032104492187, 0.027150304794311523, 0.02707164764404297, 0.02702012825012207, 0.027121152877807617, 0.027070976257324218, 0.027141759872436524, 0.027224031448364258, 0.027498239517211913, 0.027464351654052734, 0.02736947250366211, 0.027398143768310547, 0.027283231735229493, 0.02734716796875, 0.027299840927124022, 0.027271167755126953, 0.027230207443237304, 0.027242528915405274, 0.0272260799407959, 0.027370943069458007, 0.027158496856689453, 0.027165279388427735, 0.027240447998046875, 0.027232255935668945, 0.027262975692749023, 0.027243808746337892, 0.027284032821655275, 0.027229663848876953, 0.02727801513671875, 0.027303936004638672, 0.027254783630371093, 0.027352895736694336, 0.027277088165283202, 0.02735500717163086, 0.027371616363525392, 0.027373760223388673, 0.029223615646362305, 0.02812313652038574, 0.027559455871582032, 0.027306463241577147, 0.02715443229675293, 0.02715648078918457, 0.027200511932373047, 0.02703984069824219, 0.027116384506225586, 0.027133951187133788, 0.0272609920501709, 0.027222015380859374, 0.027047840118408203, 
0.027021408081054688, 0.027084640502929688, 0.02716454315185547, 0.027090591430664064, 0.027221887588500977, 0.027044639587402344, 0.027142112731933593, 0.027228160858154295, 0.0271231689453125, 0.027109792709350586, 0.027191423416137697, 0.027152448654174804, 0.027123647689819334, 0.027222015380859374, 0.027119264602661133, 0.027210079193115234, 0.027241535186767578, 0.02711244773864746, 0.027217088699340822, 0.02712588882446289, 0.027349472045898438, 0.027331872940063475, 0.027454336166381835, 0.027373567581176757, 0.027447296142578126, 0.027441152572631834, 0.027288639068603515, 0.027214111328125, 0.027445472717285157, 0.02736172866821289, 0.027314048767089844, 0.027230335235595704, 0.027282527923583984, 0.027294624328613282, 0.027215167999267577, 0.027300352096557616, 0.027397439956665038, 0.02737241554260254, 0.027295743942260742, 0.027398048400878908, 0.027261024475097657, 0.0273768310546875, 0.027226943969726563, 0.02719968032836914, 0.02730169677734375, 0.027387903213500975, 0.02735308837890625, 0.02732796859741211, 0.027232799530029297, 0.02730803108215332, 0.02919219207763672, 0.028078079223632812, 0.0275230712890625, 0.027225727081298827, 0.027142528533935548, 0.02709199905395508, 0.027031999588012695, 0.027124256134033204, 0.02713113594055176, 0.027097856521606446, 0.027162431716918945, 0.027024959564208983, 0.027148927688598633, 0.02711689567565918, 0.027144224166870116, 0.027119359970092773, 0.02719833564758301, 0.027053056716918947, 0.027160768508911134, 0.02698303985595703, 0.027101375579833983, 0.02716057586669922, 0.027031551361083983, 0.02697395133972168, 0.027001087188720702, 0.02716057586669922, 0.027082752227783204, 0.027198623657226563, 0.027157344818115235, 0.02715648078918457, 0.02716262435913086, 0.027172864913940428, 0.027243839263916016, 0.027165376663208007, 0.027482112884521483, 0.02749235153198242, 0.027453535079956053, 0.0273767032623291, 0.027410335540771484, 0.027525728225708007, 0.027373952865600584, 0.027262943267822266, 0.027235616683959962, 0.027257280349731447, 0.027285791397094725, 0.027256832122802735, 0.027303936004638672, 0.027299840927124022, 0.0273305606842041, 0.027378847122192383, 0.027343711853027343, 0.027205408096313475, 0.027248287200927736, 0.027284032821655275, 0.02727302360534668, 0.027281600952148436, 0.027434015274047853, 0.02737455940246582, 0.02793471908569336, 0.027201536178588868, 0.027389951705932617, 0.027338815689086915, 0.027356895446777343, 0.029134048461914062, 0.028009471893310548, 0.02750054359436035, 0.027178432464599608, 0.02706489562988281, 0.027191295623779296, 0.027084800720214845, 0.02711961555480957, 0.027043455123901366, 0.02783270454406738, 0.026990591049194337, 0.027131904602050783, 0.02709503936767578, 0.027159616470336913, 0.027173824310302734, 0.02706768035888672, 0.02711174392700195, 0.027095455169677735, 0.02712166404724121, 0.027080095291137696, 0.027193952560424804, 0.02718243217468262, 0.027144031524658205, 0.027226367950439454, 0.027232704162597657, 0.02724051284790039, 0.027268224716186524, 0.02722502326965332, 0.027256832122802735, 0.02731340789794922, 0.027325183868408202, 0.027239583969116212, 0.027213760375976562, 0.027335584640502928, 0.02739596748352051, 0.027383935928344726, 0.02754764747619629, 0.027455488204956056, 0.027505983352661134, 0.02733299255371094, 0.02737388801574707, 0.0273973445892334, 0.027413280487060546, 0.02733465576171875, 0.027258880615234377, 0.027249727249145508, 0.027179967880249022, 0.027264703750610353, 0.027257152557373047, 0.02731007957458496, 0.027215871810913086, 
0.027287551879882813, 0.0273305606842041, 0.02738761520385742, 0.0274803524017334, 0.027322368621826174, 0.027983871459960938, 0.02733875274658203, 0.02737923240661621, 0.027333087921142578, 0.027375328063964845, 0.02732681655883789, 0.02740163230895996, 0.029177152633666992, 0.028002592086791993, 0.027438720703125, 0.02719375991821289, 0.027183488845825197, 0.02707865524291992, 0.027107328414916993, 0.027090944290161133, 0.027064031600952148, 0.02710966491699219, 0.02711043167114258, 0.027038719177246092, 0.027127775192260742, 0.0271278076171875, 0.02713599967956543, 0.02712544059753418, 0.027126079559326173, 0.027236352920532225, 0.027174911499023437, 0.027146240234375, 0.027187391281127928, 0.027223680496215822, 0.0271976318359375, 0.02716374397277832, 0.027150272369384765, 0.02713692855834961, 0.02721526336669922, 0.02718377685546875, 0.02724857521057129, 0.027176095962524415, 0.0271430721282959, 0.027064096450805663, 0.027162431716918945, 0.027302303314208985, 0.02735513687133789, 0.027392127990722655, 0.027575519561767577, 0.027407007217407228, 0.0273756160736084, 0.027402240753173827, 0.02730169677734375, 0.027357376098632813, 0.027441152572631834, 0.027184223175048827, 0.027281375885009767, 0.027277599334716796, 0.02725119972229004, 0.02729145622253418, 0.027255136489868163, 0.027236352920532225, 0.02738492774963379, 0.02725161552429199, 0.027254783630371093, 0.02715769577026367, 0.027251520156860352, 0.027335935592651368, 0.027400960922241212, 0.027428863525390625, 0.027389568328857423, 0.027420223236083983, 0.027402784347534178, 0.02738377571105957, 0.027402240753173827]",tokens/s,36.697908730091044,,
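To split this CSV into OOM failures and completed runs, something like the sketch below works; since the header row lies outside this diff, it avoids guessing column names and instead flags any row whose fields mention the CUDA OOM error:

import pandas as pd

df = pd.read_csv("perf-df-unquantized-1xT4.csv")

# Flag rows whose captured traceback mentions the CUDA OOM error.
is_oom = df.astype(str).apply(
    lambda row: row.str.contains("OutOfMemoryError").any(), axis=1
)
print(f"{is_oom.sum()} OOM rows, {(~is_oom).sum()} other rows")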
@@ -24275,7 +24275,7 @@ ChildProcessError: Traceback (most recent call last):
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
- torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 560.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 8.12 MiB is free. Process 20781 has 14.73 GiB memory in use. Of the allocated memory 14.62 GiB is allocated by PyTorch, and 1.67 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float32-eager,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,float32,True,False,,eager,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
@@ -25123,7 +25123,7 @@ ChildProcessError: Traceback (most recent call last):
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
- torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 64.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 30.12 MiB is free. Process 19060 has 14.71 GiB memory in use. Of the allocated memory 14.51 GiB is allocated by PyTorch, and 85.33 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float32-eager,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,mistral,TencentARC/Mistral_Pro_8B_v0.1,TencentARC/Mistral_Pro_8B_v0.1,cuda,0,42,,,True,True,,float32,True,False,,eager,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
@@ -26613,7 +26613,7 @@ ChildProcessError: Traceback (most recent call last):
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
- torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 21165 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float16-eager,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,float16,True,False,,eager,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
@@ -32546,7 +32546,7 @@ ChildProcessError: Traceback (most recent call last):
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
- torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 22273 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float16-sdpa,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,float16,True,False,,sdpa,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
@@ -34605,7 +34605,7 @@ ChildProcessError: Traceback (most recent call last):
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
- torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 22581 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  bfloat16-sdpa,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,bfloat16,True,False,,sdpa,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
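Each row's leading fields record one optimum_benchmark run configuration (dtype/attention pair, backend, model, device, scenario, launcher). A rough sketch of launching such a run from Python; the class and field names follow the optimum-benchmark README and may differ between the 0.4.x and 0.5.x versions logged in this file, so treat them as assumptions:

from optimum_benchmark import (
    Benchmark, BenchmarkConfig, InferenceConfig, ProcessConfig, PyTorchConfig,
)

# Mirrors a "bfloat16-sdpa" row for EleutherAI/pythia-12b on one CUDA device;
# the input shapes are illustrative, not read from the row above.
config = BenchmarkConfig(
    name="bfloat16-sdpa",
    launcher=ProcessConfig(start_method="spawn"),
    scenario=InferenceConfig(
        memory=True,
        latency=True,
        energy=True,
        input_shapes={"batch_size": 1, "sequence_length": 256},
    ),
    backend=PyTorchConfig(
        model="EleutherAI/pythia-12b",
        device="cuda",
        device_ids="0",
        torch_dtype="bfloat16",
        attn_implementation="sdpa",
    ),
)
report = Benchmark.launch(config)  # runs in a spawned subprocess, as logged above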
 
1113
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
1114
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
1115
  return func(*args, **kwargs)
1116
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 21809 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
1117
 
1118
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
1119
  bfloat16-eager,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,bfloat16,True,False,,eager,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
 
7210
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
7211
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
7212
  return func(*args, **kwargs)
7213
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 560.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 8.12 MiB is free. Process 22155 has 14.73 GiB memory in use. Of the allocated memory 14.62 GiB is allocated by PyTorch, and 1.67 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
7214
 
7215
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
7216
  float32-sdpa,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,float32,True,False,,sdpa,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
 
8020
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
8021
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
8022
  return func(*args, **kwargs)
8023
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 64.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 30.12 MiB is free. Process 19776 has 14.71 GiB memory in use. Of the allocated memory 14.51 GiB is allocated by PyTorch, and 85.33 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
8024
 
8025
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
8026
  float32-sdpa,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,mistral,TencentARC/Mistral_Pro_8B_v0.1,TencentARC/Mistral_Pro_8B_v0.1,cuda,0,42,,,True,True,,float32,True,False,,sdpa,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
 
14545
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
14546
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
14547
  return func(*args, **kwargs)
14548
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 23522 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
14549
 
14550
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
14551
  bfloat16-flash_attention_2,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,bfloat16,True,False,,flash_attention_2,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
 
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 23203 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float16-flash_attention_2,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,float16,True,False,,flash_attention_2,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
 
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 224.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 162.12 MiB is free. Process 23918 has 14.58 GiB memory in use. Of the allocated memory 14.44 GiB is allocated by PyTorch, and 25.46 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float32-eager,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,stablelm,stabilityai/stablelm-3b-4e1t,stabilityai/stablelm-3b-4e1t,cuda,0,42,,,True,True,,float32,True,False,,eager,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.224-212.876.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.4.0,,4.44.2,,0.34.2,,,,1.22.0,,,,0.12.0,,,MB,884.5312,11792.154624,0.0,11389.632512,11388.883968,s,1,7.62391650390625,7.62391650390625,0.0,7.62391650390625,7.62391650390625,7.62391650390625,7.62391650390625,[7.62391650390625],,kWh,7.550980120830293e-06,8.183254871460181e-07,4.161947774000696e-06,1.2531253381977007e-05,,MB,1211.850752,12089.950208,0.0,11676.942336,11620.241408,s,10,3.529409362792969,0.35294093627929685,0.004687535482057751,0.3546914520263672,0.3575145141601562,0.3580895935058594,0.35854965698242186,"[0.3421952209472656, 0.3542493896484375, 0.35546945190429685, 0.3586646728515625, 0.3475618896484375, 0.3554535827636719, 0.35738671875, 0.35513351440429686, 0.35198422241210936, 0.3513106994628906]",tokens/s,725.333826953461,kWh,1.0179109382326568e-05,1.1225649051281122e-06,6.76206479662085e-06,1.806373908407553e-05,tokens/kWh,14172038.181490464,MB,1217.179648,12089.950208,0.0,11676.942336,11620.243968,s,10,31.83818115234375,3.1838181152343745,0.0023803913060039604,3.1841195068359376,3.1869217041015627,3.1870340942382813,3.187124006347656,"[3.182017333984375, 3.184160400390625, 3.182552490234375, 3.183121826171875, 3.17848828125, 3.18476953125, 3.184949462890625, 3.18407861328125, 3.186896728515625, 3.187146484375]",tokens/s,19.78756251764159,kWh,9.316640863058924e-05,1.0276466099167838e-05,6.183607341517975e-05,0.00016527894814493684,tokens/kWh,381173.7714155458,,s,630,31.834508899688718,0.05053096650744242,0.00027557110689108155,0.05054019165039063,0.050829452133178715,0.05090588703155518,0.0515531579208374,"[0.05157068634033203, 0.05063679885864258, 0.05017599868774414, 0.04999980926513672, 0.05020832061767578, 0.0501212158203125, 0.05012678527832031, 0.05012886428833008, 0.0501822395324707, 0.05013913726806641, 0.05027222442626953, 0.0502599983215332, 0.05031321716308594, 0.050114559173583983, 0.050184192657470705, 0.05008902359008789, 0.05028300857543945, 0.05022150421142578, 0.050450401306152345, 0.050608158111572266, 0.050603870391845704, 0.05046031951904297, 0.05046732711791992, 0.05025177764892578, 0.05030246353149414, 0.05031520080566406, 0.05030470275878906, 0.05025471878051758, 0.050288639068603515, 0.0503166389465332, 0.05038966369628906, 0.050348033905029295, 0.05049305725097656, 0.0504323844909668, 0.05052537536621094, 0.05059795379638672, 0.050791168212890626, 0.050683902740478515, 0.050716670989990234, 0.0506341438293457, 0.05071660614013672, 0.05068479919433594, 0.050669345855712894, 0.05062047958374023, 0.050672863006591795, 0.05063958358764648, 0.05063663864135742, 0.05060214233398438, 0.050561023712158204, 0.050677761077880856, 0.050670848846435544, 0.050648929595947266, 0.05065411376953125, 0.05081087875366211, 0.05064089584350586, 0.050722270965576174, 0.05067830276489258, 0.050756607055664066, 0.050918399810791014, 0.050888671875, 0.050864158630371095, 0.05086617660522461, 0.05081292724609375, 0.051525279998779296, 
0.050764320373535156, 0.050282497406005856, 0.05028432083129883, 0.050175617218017575, 0.05016841506958008, 0.050237438201904294, 0.05014323043823242, 0.0505239372253418, 0.05024528121948242, 0.050235969543457035, 0.05016899108886719, 0.05016569519042969, 0.05054054260253906, 0.050278560638427734, 0.05021343994140625, 0.050309310913085936, 0.05040947341918945, 0.05056512069702149, 0.0507064323425293, 0.05078015899658203, 0.05053984069824219, 0.05064265441894531, 0.05047590255737305, 0.050345535278320315, 0.050342430114746095, 0.05032470321655273, 0.050332447052001954, 0.05036646270751953, 0.05034710311889649, 0.05033184051513672, 0.05041430282592774, 0.050388992309570314, 0.050484577178955076, 0.050481822967529295, 0.05046476745605469, 0.05060403060913086, 0.050710529327392576, 0.05065523147583008, 0.0508040657043457, 0.05062518310546875, 0.05088256072998047, 0.050783905029296875, 0.0506412467956543, 0.05057126235961914, 0.05061964797973633, 0.05052617645263672, 0.05053462219238281, 0.05060211181640625, 0.05091987228393555, 0.05053961563110351, 0.05069680023193359, 0.050579776763916014, 0.05063679885864258, 0.05060713577270508, 0.05070742416381836, 0.0507125129699707, 0.05084726333618164, 0.05077660751342773, 0.05074691009521484, 0.050825695037841796, 0.050826366424560544, 0.0507831039428711, 0.05176115036010742, 0.05078742218017578, 0.05025788879394531, 0.050086849212646486, 0.05018009567260742, 0.05006131362915039, 0.05021491241455078, 0.05018009567260742, 0.050098175048828124, 0.05026764678955078, 0.05028915023803711, 0.050214336395263674, 0.05011308670043945, 0.05016323089599609, 0.05026863861083984, 0.05026601409912109, 0.0501712646484375, 0.0502413444519043, 0.050406303405761715, 0.050730945587158204, 0.05065439987182617, 0.05046566390991211, 0.05028400039672851, 0.05036624145507813, 0.050342655181884764, 0.050331649780273435, 0.05030857467651367, 0.05032400131225586, 0.05023539352416992, 0.05045862579345703, 0.050267486572265624, 0.050375328063964844, 0.05041766357421875, 0.05050518417358398, 0.05041404724121094, 0.050505214691162106, 0.050496063232421874, 0.05064908981323242, 0.050601982116699216, 0.050730720520019534, 0.05067190551757812, 0.050792224884033205, 0.05067388916015625, 0.05064838409423828, 0.05060063934326172, 0.05065017700195312, 0.05058246231079101, 0.050572414398193356, 0.05065804672241211, 0.050724990844726564, 0.05075465774536133, 0.050664447784423826, 0.050587169647216795, 0.0507457275390625, 0.0506695671081543, 0.05082278442382813, 0.05076825714111328, 0.05076582336425781, 0.05092313766479492, 0.05083552169799805, 0.0510134391784668, 0.05077862548828125, 0.05077196884155273, 0.05157795333862305, 0.05066435241699219, 0.050216960906982425, 0.05022304153442383, 0.05025388717651367, 0.050260990142822266, 0.05018931198120117, 0.05029619216918945, 0.05014182281494141, 0.05024470520019531, 0.05017436981201172, 0.05022771072387695, 0.05018761444091797, 0.050200672149658204, 0.050237472534179685, 0.05018668746948242, 0.05032515335083008, 0.05035647964477539, 0.05054278564453125, 0.050611743927001955, 0.0505382080078125, 0.05049542236328125, 0.05042067337036133, 0.050366336822509766, 0.050411518096923826, 0.050359809875488284, 0.05037107086181641, 0.05032102584838867, 0.05031472015380859, 0.05029776000976562, 0.050251136779785155, 0.0501798095703125, 0.05077699279785156, 0.05042937469482422, 0.05047558212280273, 0.050522113800048826, 0.0506033935546875, 0.05062838363647461, 0.05064380645751953, 0.050713951110839844, 0.05069635009765625, 0.050661376953125, 
0.05061593627929688, 0.05082815933227539, 0.050627777099609375, 0.050628704071044923, 0.05066416168212891, 0.0506429443359375, 0.050544639587402344, 0.05070025634765625, 0.05059135818481445, 0.050641311645507815, 0.05067571258544922, 0.050735103607177735, 0.05065932846069336, 0.05071638488769531, 0.05073891067504883, 0.050805313110351566, 0.051035999298095706, 0.05081718444824219, 0.05088774490356445, 0.05084572982788086, 0.050708511352539065, 0.0516328010559082, 0.05083071899414063, 0.0502685432434082, 0.050114814758300784, 0.04996432113647461, 0.05011324691772461, 0.0500747184753418, 0.050125247955322264, 0.050012126922607425, 0.050130752563476565, 0.05020947265625, 0.05003878402709961, 0.05019375991821289, 0.0500968017578125, 0.05019180679321289, 0.050002494812011716, 0.05018009567260742, 0.05032755279541016, 0.05048934555053711, 0.050552833557128904, 0.0506363525390625, 0.05055123138427734, 0.05043404769897461, 0.05017734527587891, 0.05028524780273438, 0.05028659057617187, 0.05039436721801758, 0.050311103820800784, 0.05030361557006836, 0.05038083267211914, 0.05047628784179688, 0.05034281539916992, 0.05041907119750977, 0.050518657684326174, 0.050485183715820316, 0.050433406829833986, 0.05061907196044922, 0.05104435348510742, 0.05074716949462891, 0.050743518829345705, 0.05070771026611328, 0.05074764633178711, 0.050567680358886716, 0.05039308929443359, 0.05050483322143555, 0.050514495849609375, 0.050454784393310546, 0.05053635025024414, 0.05047449493408203, 0.05055894470214844, 0.050471614837646485, 0.050339839935302735, 0.05043199920654297, 0.05055487823486328, 0.05052604675292969, 0.050417823791503905, 0.05076377487182617, 0.050577407836914064, 0.05073020935058594, 0.050549537658691406, 0.050799713134765626, 0.05066640090942383, 0.05071257781982422, 0.05145964813232422, 0.05054844665527344, 0.050289375305175785, 0.05014323043823242, 0.05007974243164062, 0.05015961456298828, 0.05014473724365234, 0.050159233093261715, 0.05015644836425781, 0.05013078308105469, 0.05033964920043945, 0.050127201080322266, 0.05020985412597656, 0.05020528030395508, 0.050239742279052736, 0.050208255767822264, 0.0502125129699707, 0.050291648864746095, 0.050527423858642576, 0.05068854522705078, 0.05069990539550781, 0.05063910293579101, 0.05040700912475586, 0.050430656433105465, 0.05036044692993164, 0.050348033905029295, 0.05039616012573242, 0.05031628799438476, 0.0504131851196289, 0.050463104248046876, 0.050522113800048826, 0.0504439697265625, 0.050547008514404294, 0.05039513778686523, 0.05056716918945312, 0.05050294494628906, 0.05060630416870117, 0.05085190582275391, 0.050743743896484374, 0.05082931137084961, 0.05075107192993164, 0.05077648162841797, 0.05074943923950195, 0.050710529327392576, 0.05063683319091797, 0.05081494522094727, 0.05076172637939453, 0.050702335357666016, 0.050560577392578125, 0.05079443359375, 0.05055744171142578, 0.050683902740478515, 0.05067161560058594, 0.05070438385009766, 0.050683902740478515, 0.05082316970825195, 0.05081497573852539, 0.05084934234619141, 0.05088848114013672, 0.051154590606689455, 0.05082371139526367, 0.05086051177978516, 0.05086003112792969, 0.05167411041259766, 0.05076416015625, 0.05025817489624024, 0.050057567596435544, 0.05016697692871094, 0.050033470153808594, 0.0501288948059082, 0.050171585083007814, 0.0503565444946289, 0.05030297470092773, 0.05019343948364258, 0.050315265655517576, 0.050230239868164064, 0.050321407318115234, 0.050130752563476565, 0.050219200134277345, 0.050298881530761716, 0.05032470321655273, 0.05045123291015625, 0.05072895812988281, 
0.050980289459228514, 0.05049196624755859, 0.05040083312988281, 0.05036281585693359, 0.050290687561035156, 0.05039913558959961, 0.05032742309570312, 0.050343391418457034, 0.0503078384399414, 0.05040332794189453, 0.050323455810546876, 0.05034521484375, 0.050372608184814455, 0.05044940948486328, 0.05051932907104492, 0.050546783447265625, 0.05065356826782227, 0.050713951110839844, 0.05072553634643555, 0.050855934143066404, 0.050733055114746094, 0.050826271057128905, 0.05070025634765625, 0.050715648651123046, 0.05075353622436524, 0.05068364715576172, 0.050673473358154295, 0.05064134216308594, 0.050552833557128904, 0.05060748672485352, 0.05072313690185547, 0.050968734741210935, 0.05065887832641602, 0.05067190551757812, 0.05074736022949219, 0.050743072509765626, 0.050821407318115235, 0.05080092620849609, 0.051076225280761715, 0.050979328155517575, 0.05088614273071289, 0.05086819076538086, 0.05080361557006836, 0.05156454467773437, 0.05069004821777344, 0.05017331314086914, 0.05013471984863281, 0.050222015380859374, 0.050135040283203126, 0.050106529235839845, 0.05019007873535156, 0.05018838500976563, 0.050157569885253904, 0.05022719955444336, 0.050249729156494144, 0.05029033660888672, 0.05029513549804687, 0.05026406478881836, 0.050253822326660154, 0.05036236953735351, 0.05036236953735351, 0.05059379196166992, 0.050710529327392576, 0.050644992828369144, 0.05053440093994141, 0.05041916656494141, 0.05029724884033203, 0.050321537017822264, 0.05035523223876953, 0.05025481414794922, 0.05032112121582031, 0.05026230239868164, 0.05034598541259765, 0.050414623260498045, 0.050331615447998045, 0.0504637451171875, 0.050444286346435545, 0.05056460952758789, 0.05061254501342773, 0.0507578239440918, 0.05066342544555664, 0.05086617660522461, 0.0507446403503418, 0.05072272109985351, 0.050756385803222656, 0.05069823837280273, 0.05067712020874023, 0.05060262298583985, 0.050685791015625, 0.05061840057373047, 0.05070415878295898, 0.050590049743652346, 0.05062041473388672, 0.05067497634887695, 0.050662113189697267, 0.050702335357666016, 0.050728225708007814, 0.05058428955078125, 0.05086396789550781, 0.050718753814697266, 0.05084985733032227, 0.05083961486816406, 0.050917377471923826, 0.05102592086791992, 0.050878398895263674, 0.05079849624633789, 0.051515392303466793, 0.0507325439453125, 0.050237953186035154, 0.050148544311523435, 0.05016044616699219, 0.050098175048828124, 0.05015075302124023, 0.050274528503417966, 0.05015596771240234, 0.05026201629638672, 0.050288639068603515, 0.05025958251953125, 0.0502685432434082, 0.0503616943359375, 0.0503548469543457, 0.05034598541259765, 0.05027174377441406, 0.05040560150146484, 0.05047087860107422, 0.05074348831176758, 0.050603809356689455, 0.050571231842041015, 0.05043033599853516, 0.05047091293334961, 0.050423809051513675, 0.050444286346435545, 0.050385982513427734, 0.050419872283935546, 0.0503359375, 0.050477664947509764, 0.05051145553588867, 0.05044675064086914, 0.05045248031616211, 0.05052403259277344, 0.050544769287109374, 0.05060403060913086, 0.0506695671081543, 0.050784255981445314, 0.05088051223754883, 0.05083135986328125, 0.050786304473876956, 0.050759681701660155, 0.0506960334777832, 0.05073321533203125, 0.05059756851196289, 0.050772289276123046, 0.050599872589111326, 0.05106284713745117, 0.05055897521972656, 0.0506668815612793, 0.05067020797729492, 0.050710529327392576, 0.05075353622436524, 0.05072860717773438, 0.05068172836303711, 0.050907615661621095, 0.05076732635498047, 0.05091382217407227, 0.05091328048706055, 0.050958335876464846, 0.05094400024414063, 
0.05111529541015625, 0.05090377426147461, 0.05180992126464844, 0.05079507064819336, 0.05030809783935547, 0.05042822265625, 0.0502545280456543, 0.050229248046875, 0.05015497589111328, 0.05022771072387695, 0.0503337287902832, 0.05023871994018555, 0.050244350433349606, 0.05028160095214844, 0.050328449249267576, 0.05023539352416992, 0.05029478454589844, 0.050300289154052734, 0.05050636672973633, 0.0504189453125, 0.050590465545654294, 0.050694145202636716, 0.0506429443359375, 0.050563072204589846, 0.05057712173461914, 0.0503803825378418, 0.050430656433105465, 0.05040719985961914, 0.05041584014892578, 0.05039427185058594, 0.05044924926757813, 0.05038256072998047, 0.05050601577758789, 0.050493438720703124, 0.05043404769897461, 0.050462718963623046, 0.0505300178527832, 0.05057769775390625, 0.05056668853759766, 0.05073273468017578, 0.05082191848754883, 0.05086207962036133, 0.05089427185058594, 0.0508642578125, 0.05078470230102539, 0.050763454437255856, 0.050624095916748046, 0.050673728942871095, 0.050621086120605466, 0.05074694442749023, 0.050661823272705075, 0.05068755340576172, 0.05058339309692383, 0.05066403198242188, 0.05062793731689453, 0.05070492935180664, 0.05073932647705078, 0.05076377487182617, 0.05081862258911133, 0.050839969635009766, 0.05094403076171875, 0.05083135986328125, 0.05095616149902344, 0.05086220932006836, 0.050818687438964845]",tokens/s,19.78984510127499,,
 
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 256.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 94.12 MiB is free. Process 27999 has 14.65 GiB memory in use. Of the allocated memory 14.53 GiB is allocated by PyTorch, and 2.49 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float32-eager,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,stablelm,stabilityai/stablelm-2-1_6b,stabilityai/stablelm-2-1_6b,cuda,0,42,,,True,True,,float32,True,False,,eager,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.224-212.876.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.4.0,,4.44.2,,0.34.2,,,,1.22.0,,,,0.12.0,,,MB,902.3488,6993.870848,0.0,6591.348736,6590.657536,s,1,7.7316650390625,7.7316650390625,0.0,7.7316650390625,7.7316650390625,7.7316650390625,7.7316650390625,[7.7316650390625],,kWh,6.249157045840547e-06,6.81981213064745e-07,2.031390514009579e-06,8.962528772914872e-06,,MB,1234.321408,7258.112,0.0,6845.104128,6805.125632,s,10,2.2387210540771485,0.22387210540771485,0.006860980982676506,0.22428342437744142,0.230287255859375,0.23189871673583984,0.23318788543701172,"[0.205762939453125, 0.22159222412109375, 0.2255982666015625, 0.22400172424316406, 0.22380921936035156, 0.22599891662597657, 0.22992915344238282, 0.22456512451171876, 0.22395330810546876, 0.2335101776123047]",tokens/s,1143.510039063482,kWh,6.283555607446845e-06,6.924726440920054e-07,4.172225560000033e-06,1.1148253811538883e-05,tokens/kWh,22963237.501377113,MB,1246.425088,7260.209152,0.0,6847.20128,6805.128192,s,10,17.170465332031252,1.7170465332031248,0.0037074144946325858,1.71831298828125,1.7202278564453124,1.7209889892578125,1.7215978955078124,"[1.710521240234375, 1.7104453125, 1.717341552734375, 1.7161055908203124, 1.7200587158203124, 1.7158358154296875, 1.719595703125, 1.719284423828125, 1.7217501220703124, 1.71952685546875]",tokens/s,36.69091010741243,kWh,5.047899915463414e-05,5.568098191081764e-06,3.353133238059929e-05,8.95784297263152e-05,tokens/kWh,703294.3108344382,,s,630,17.16719076919555,0.027249509157453264,0.0003084332683327531,0.027201295852661133,0.027428953170776368,0.02754782371520996,0.029114681129455568,"[0.02920857620239258, 0.027962623596191408, 0.027306751251220705, 0.02713363265991211, 0.02696633529663086, 0.0268569278717041, 0.026939264297485353, 0.0270032958984375, 0.027158784866333007, 0.02697216033935547, 0.02710323143005371, 0.026866912841796875, 0.027019039154052734, 0.02698137664794922, 0.026959871292114256, 0.027086143493652345, 0.027078880310058593, 0.02702998352050781, 0.02715190315246582, 0.0271549129486084, 0.02708684730529785, 0.027023359298706053, 0.026994688034057617, 0.02690457534790039, 0.026973791122436523, 0.026966367721557617, 0.02702547264099121, 0.027062271118164064, 0.027107328414916993, 0.027037696838378908, 0.027104864120483397, 0.02713360023498535, 0.027085567474365236, 0.027183103561401366, 0.02717695999145508, 0.02717695999145508, 0.02726924705505371, 0.02724995231628418, 0.027183712005615233, 0.027262975692749023, 0.027143999099731444, 0.02703968048095703, 0.02712396812438965, 0.027084800720214845, 0.027229471206665037, 0.027157215118408202, 0.02711347198486328, 0.027127487182617187, 0.027089120864868164, 0.0271092472076416, 0.027039968490600585, 0.027068416595458986, 0.027164831161499023, 0.027126655578613282, 0.027124704360961913, 0.027152256011962892, 0.0271648006439209, 0.027328096389770507, 0.027172672271728517, 0.027238079071044922, 0.027163551330566405, 0.027172319412231444, 0.02713654327392578, 
0.029059200286865233, 0.027767072677612303, 0.027430912017822266, 0.027130943298339844, 0.02695382308959961, 0.026888288497924805, 0.026823423385620118, 0.02681999969482422, 0.026834623336791992, 0.026835872650146485, 0.0268287353515625, 0.026856544494628907, 0.02683798408508301, 0.02689571189880371, 0.026927743911743164, 0.026850784301757812, 0.02692153549194336, 0.02693049621582031, 0.02693129539489746, 0.026992639541625976, 0.02686934471130371, 0.02687283134460449, 0.02697420883178711, 0.026998144149780273, 0.02692531204223633, 0.027044063568115236, 0.027005376815795897, 0.026913503646850585, 0.026970687866210936, 0.027146047592163085, 0.027150976181030274, 0.027090368270874025, 0.027050559997558593, 0.02710688018798828, 0.027177215576171875, 0.027262720108032226, 0.027237920761108397, 0.02731820869445801, 0.027265695571899413, 0.02716703987121582, 0.027224063873291016, 0.027195072174072264, 0.027248960494995117, 0.027235519409179686, 0.027212608337402345, 0.027276863098144533, 0.027064159393310548, 0.02706697654724121, 0.027244543075561522, 0.02732758331298828, 0.027256864547729492, 0.027294303894042967, 0.027201055526733398, 0.02720844841003418, 0.02732646369934082, 0.0273768310546875, 0.027267904281616212, 0.027340799331665038, 0.02733875274658203, 0.02734489631652832, 0.027337728500366212, 0.027339775085449217, 0.027310144424438475, 0.02922528076171875, 0.028089088439941408, 0.027412416458129883, 0.027137535095214844, 0.026980863571166993, 0.0270231990814209, 0.027017375946044923, 0.026922752380371093, 0.026970495223999025, 0.026967487335205077, 0.02702761650085449, 0.026882335662841796, 0.0269434871673584, 0.02712348747253418, 0.02698467254638672, 0.027031551361083983, 0.02706790351867676, 0.02699235153198242, 0.027068832397460937, 0.02701468849182129, 0.027100000381469726, 0.027064319610595702, 0.02701923179626465, 0.02710905647277832, 0.027117952346801758, 0.02715385627746582, 0.027113792419433593, 0.027152896881103516, 0.027629280090332033, 0.027254783630371093, 0.02714419174194336, 0.0271297607421875, 0.027088991165161135, 0.027185152053833008, 0.027309120178222655, 0.027360191345214845, 0.027394048690795897, 0.02736332893371582, 0.02733670425415039, 0.027389951705932617, 0.02728550338745117, 0.027299840927124022, 0.027241535186767578, 0.02724959945678711, 0.02727071952819824, 0.027408832550048827, 0.02761897659301758, 0.02734444808959961, 0.027466527938842772, 0.02727084732055664, 0.027217376708984376, 0.02726323127746582, 0.027226720809936523, 0.02728976058959961, 0.02738582420349121, 0.02736729621887207, 0.027267072677612306, 0.027381759643554687, 0.02735308837890625, 0.027297792434692384, 0.02759065628051758, 0.027289600372314454, 0.02734489631652832, 0.029165567398071288, 0.027844608306884764, 0.027445247650146484, 0.02715238380432129, 0.02704979133605957, 0.027089439392089843, 0.027046720504760743, 0.026866527557373048, 0.02694963264465332, 0.027118783950805664, 0.027009408950805665, 0.027068864822387694, 0.027024831771850586, 0.02709766387939453, 0.02703900718688965, 0.026954463958740234, 0.027068288803100585, 0.027117055892944338, 0.02703580856323242, 0.027044319152832032, 0.0269816951751709, 0.02692915153503418, 0.027212480545043945, 0.027043840408325196, 0.027088895797729492, 0.02714182472229004, 0.027171104431152344, 0.02714147186279297, 0.02710780715942383, 0.02711356735229492, 0.02712588882446289, 0.027033599853515625, 0.02709708786010742, 0.027112543106079103, 0.02733353614807129, 0.02736073684692383, 0.027298240661621093, 0.02723961639404297, 
0.027226207733154296, 0.027177791595458984, 0.02716057586669922, 0.0271824951171875, 0.027118335723876952, 0.027200767517089844, 0.027300128936767577, 0.027287263870239258, 0.028184736251831054, 0.027277759552001953, 0.02735103988647461, 0.027225183486938476, 0.027244543075561522, 0.027198368072509766, 0.027189504623413085, 0.027268255233764648, 0.027226720809936523, 0.027285375595092774, 0.027407615661621094, 0.027429759979248045, 0.027482112884521483, 0.02749833679199219, 0.0274303035736084, 0.027312671661376953, 0.027383712768554686, 0.029056671142578126, 0.02809231948852539, 0.0275479679107666, 0.02731430435180664, 0.027148191452026366, 0.027002975463867186, 0.027006912231445312, 0.026961376190185547, 0.027017759323120116, 0.026984512329101564, 0.02714419174194336, 0.027166015625, 0.027130048751831056, 0.027044095993041993, 0.02710758399963379, 0.027058176040649414, 0.027076608657836915, 0.02712291145324707, 0.027484960556030273, 0.02711142349243164, 0.027092992782592775, 0.02723347282409668, 0.027165311813354492, 0.02717305564880371, 0.027094688415527344, 0.027172927856445313, 0.02742448043823242, 0.027142240524291993, 0.027117151260375977, 0.027099039077758787, 0.027160671234130858, 0.027085599899291993, 0.027129247665405275, 0.02733647918701172, 0.0273437442779541, 0.027329759597778322, 0.02749932861328125, 0.02735923194885254, 0.027420543670654298, 0.027272352218627928, 0.02740678405761719, 0.027724096298217774, 0.027195615768432616, 0.027242496490478517, 0.027271360397338868, 0.027303743362426757, 0.027267072677612306, 0.027432159423828126, 0.027980512619018554, 0.027295167922973634, 0.027251039505004883, 0.02727497673034668, 0.02727174377441406, 0.02728691291809082, 0.027336864471435546, 0.027231775283813476, 0.027376415252685547, 0.02741196823120117, 0.027474111557006835, 0.027339231491088866, 0.02738096046447754, 0.027368223190307617, 0.027387903213500975, 0.029067264556884766, 0.027836191177368165, 0.027463712692260743, 0.02726691246032715, 0.02712816047668457, 0.026995967864990235, 0.02694790458679199, 0.027057823181152345, 0.02712451171875, 0.027010368347167968, 0.02707321548461914, 0.026979808807373048, 0.02699078369140625, 0.0270728645324707, 0.027080703735351562, 0.02712166404724121, 0.02707587242126465, 0.02703603172302246, 0.02702351951599121, 0.027017248153686522, 0.02718022346496582, 0.027069408416748045, 0.02711347198486328, 0.027090431213378906, 0.02708531188964844, 0.027107328414916993, 0.027181055068969725, 0.027136032104492187, 0.027150304794311523, 0.02707164764404297, 0.02702012825012207, 0.027121152877807617, 0.027070976257324218, 0.027141759872436524, 0.027224031448364258, 0.027498239517211913, 0.027464351654052734, 0.02736947250366211, 0.027398143768310547, 0.027283231735229493, 0.02734716796875, 0.027299840927124022, 0.027271167755126953, 0.027230207443237304, 0.027242528915405274, 0.0272260799407959, 0.027370943069458007, 0.027158496856689453, 0.027165279388427735, 0.027240447998046875, 0.027232255935668945, 0.027262975692749023, 0.027243808746337892, 0.027284032821655275, 0.027229663848876953, 0.02727801513671875, 0.027303936004638672, 0.027254783630371093, 0.027352895736694336, 0.027277088165283202, 0.02735500717163086, 0.027371616363525392, 0.027373760223388673, 0.029223615646362305, 0.02812313652038574, 0.027559455871582032, 0.027306463241577147, 0.02715443229675293, 0.02715648078918457, 0.027200511932373047, 0.02703984069824219, 0.027116384506225586, 0.027133951187133788, 0.0272609920501709, 0.027222015380859374, 0.027047840118408203, 
0.027021408081054688, 0.027084640502929688, 0.02716454315185547, 0.027090591430664064, 0.027221887588500977, 0.027044639587402344, 0.027142112731933593, 0.027228160858154295, 0.0271231689453125, 0.027109792709350586, 0.027191423416137697, 0.027152448654174804, 0.027123647689819334, 0.027222015380859374, 0.027119264602661133, 0.027210079193115234, 0.027241535186767578, 0.02711244773864746, 0.027217088699340822, 0.02712588882446289, 0.027349472045898438, 0.027331872940063475, 0.027454336166381835, 0.027373567581176757, 0.027447296142578126, 0.027441152572631834, 0.027288639068603515, 0.027214111328125, 0.027445472717285157, 0.02736172866821289, 0.027314048767089844, 0.027230335235595704, 0.027282527923583984, 0.027294624328613282, 0.027215167999267577, 0.027300352096557616, 0.027397439956665038, 0.02737241554260254, 0.027295743942260742, 0.027398048400878908, 0.027261024475097657, 0.0273768310546875, 0.027226943969726563, 0.02719968032836914, 0.02730169677734375, 0.027387903213500975, 0.02735308837890625, 0.02732796859741211, 0.027232799530029297, 0.02730803108215332, 0.02919219207763672, 0.028078079223632812, 0.0275230712890625, 0.027225727081298827, 0.027142528533935548, 0.02709199905395508, 0.027031999588012695, 0.027124256134033204, 0.02713113594055176, 0.027097856521606446, 0.027162431716918945, 0.027024959564208983, 0.027148927688598633, 0.02711689567565918, 0.027144224166870116, 0.027119359970092773, 0.02719833564758301, 0.027053056716918947, 0.027160768508911134, 0.02698303985595703, 0.027101375579833983, 0.02716057586669922, 0.027031551361083983, 0.02697395133972168, 0.027001087188720702, 0.02716057586669922, 0.027082752227783204, 0.027198623657226563, 0.027157344818115235, 0.02715648078918457, 0.02716262435913086, 0.027172864913940428, 0.027243839263916016, 0.027165376663208007, 0.027482112884521483, 0.02749235153198242, 0.027453535079956053, 0.0273767032623291, 0.027410335540771484, 0.027525728225708007, 0.027373952865600584, 0.027262943267822266, 0.027235616683959962, 0.027257280349731447, 0.027285791397094725, 0.027256832122802735, 0.027303936004638672, 0.027299840927124022, 0.0273305606842041, 0.027378847122192383, 0.027343711853027343, 0.027205408096313475, 0.027248287200927736, 0.027284032821655275, 0.02727302360534668, 0.027281600952148436, 0.027434015274047853, 0.02737455940246582, 0.02793471908569336, 0.027201536178588868, 0.027389951705932617, 0.027338815689086915, 0.027356895446777343, 0.029134048461914062, 0.028009471893310548, 0.02750054359436035, 0.027178432464599608, 0.02706489562988281, 0.027191295623779296, 0.027084800720214845, 0.02711961555480957, 0.027043455123901366, 0.02783270454406738, 0.026990591049194337, 0.027131904602050783, 0.02709503936767578, 0.027159616470336913, 0.027173824310302734, 0.02706768035888672, 0.02711174392700195, 0.027095455169677735, 0.02712166404724121, 0.027080095291137696, 0.027193952560424804, 0.02718243217468262, 0.027144031524658205, 0.027226367950439454, 0.027232704162597657, 0.02724051284790039, 0.027268224716186524, 0.02722502326965332, 0.027256832122802735, 0.02731340789794922, 0.027325183868408202, 0.027239583969116212, 0.027213760375976562, 0.027335584640502928, 0.02739596748352051, 0.027383935928344726, 0.02754764747619629, 0.027455488204956056, 0.027505983352661134, 0.02733299255371094, 0.02737388801574707, 0.0273973445892334, 0.027413280487060546, 0.02733465576171875, 0.027258880615234377, 0.027249727249145508, 0.027179967880249022, 0.027264703750610353, 0.027257152557373047, 0.02731007957458496, 0.027215871810913086, 
0.027287551879882813, 0.0273305606842041, 0.02738761520385742, 0.0274803524017334, 0.027322368621826174, 0.027983871459960938, 0.02733875274658203, 0.02737923240661621, 0.027333087921142578, 0.027375328063964845, 0.02732681655883789, 0.02740163230895996, 0.029177152633666992, 0.028002592086791993, 0.027438720703125, 0.02719375991821289, 0.027183488845825197, 0.02707865524291992, 0.027107328414916993, 0.027090944290161133, 0.027064031600952148, 0.02710966491699219, 0.02711043167114258, 0.027038719177246092, 0.027127775192260742, 0.0271278076171875, 0.02713599967956543, 0.02712544059753418, 0.027126079559326173, 0.027236352920532225, 0.027174911499023437, 0.027146240234375, 0.027187391281127928, 0.027223680496215822, 0.0271976318359375, 0.02716374397277832, 0.027150272369384765, 0.02713692855834961, 0.02721526336669922, 0.02718377685546875, 0.02724857521057129, 0.027176095962524415, 0.0271430721282959, 0.027064096450805663, 0.027162431716918945, 0.027302303314208985, 0.02735513687133789, 0.027392127990722655, 0.027575519561767577, 0.027407007217407228, 0.0273756160736084, 0.027402240753173827, 0.02730169677734375, 0.027357376098632813, 0.027441152572631834, 0.027184223175048827, 0.027281375885009767, 0.027277599334716796, 0.02725119972229004, 0.02729145622253418, 0.027255136489868163, 0.027236352920532225, 0.02738492774963379, 0.02725161552429199, 0.027254783630371093, 0.02715769577026367, 0.027251520156860352, 0.027335935592651368, 0.027400960922241212, 0.027428863525390625, 0.027389568328857423, 0.027420223236083983, 0.027402784347534178, 0.02738377571105957, 0.027402240753173827]",tokens/s,36.697908730091044,,
 
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 560.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 8.12 MiB is free. Process 21077 has 14.73 GiB memory in use. Of the allocated memory 14.62 GiB is allocated by PyTorch, and 1.67 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float32-eager,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,float32,True,False,,eager,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
 
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 64.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 30.12 MiB is free. Process 19405 has 14.71 GiB memory in use. Of the allocated memory 14.51 GiB is allocated by PyTorch, and 85.33 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float32-eager,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,mistral,TencentARC/Mistral_Pro_8B_v0.1,TencentARC/Mistral_Pro_8B_v0.1,cuda,0,42,,,True,True,,float32,True,False,,eager,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
 
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 21431 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float16-eager,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,float16,True,False,,eager,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
 
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 22501 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  float16-sdpa,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,float16,True,False,,sdpa,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):
 
  self.weight = Parameter(torch.empty((out_features, in_features), **factory_kwargs))
  File ""/usr/local/lib/python3.10/dist-packages/torch/utils/_device.py"", line 79, in __torch_function__
  return func(*args, **kwargs)
+ torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 280.00 MiB. GPU 0 has a total capacity of 14.74 GiB of which 42.12 MiB is free. Process 22847 has 14.70 GiB memory in use. Of the allocated memory 14.58 GiB is allocated by PyTorch, and 1.64 MiB is reserved by PyTorch but unallocated. If reserved but unallocated memory is large try setting PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True to avoid fragmentation. See documentation for Memory Management (https://pytorch.org/docs/stable/notes/cuda.html#environment-variables)
 
  ",,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,,
  bfloat16-sdpa,pytorch,2.4.1+cu124,optimum_benchmark.backends.pytorch.backend.PyTorchBackend,text-generation,transformers,gpt_neox,EleutherAI/pythia-12b,EleutherAI/pythia-12b,cuda,0,42,,,True,True,,bfloat16,True,False,,sdpa,,False,,False,forward,,False,,inference,optimum_benchmark.scenarios.inference.scenario.InferenceScenario,10,10,10,1,2,256,,True,True,True,64,64,process,optimum_benchmark.launchers.process.launcher.ProcessLauncher,True,kill,False,spawn, Intel(R) Xeon(R) Platinum 8259CL CPU @ 2.50GHz,8,33163.759616,Linux,x86_64,Linux-5.10.225-213.878.amzn2.x86_64-x86_64-with-glibc2.35,x86_64,3.10.12,['Tesla T4'],1,16106127360,0.5.0,,4.45.1,,0.34.2,,,,1.22.0,,,,0.13.0,,"Traceback (most recent call last):