use model without the need for torch-xla

#4
by lovodkin93 - opened

Hello,
I would like to run your model on a Linux server, but I keep getting errors related to torch-xla.
At first, the errors said I needed to install torch-xla separately, which I did.
Now the error says:

ImportError: /lib64/libm.so.6: version `GLIBC_2.23' not found (required by /home/nlp/sloboda1/anaconda3/envs/hug_original/lib/python3.7/site-packages/torch_xla/lib/libxla_computation_client.so)

From what I found online, the problem could apparently be solved by running inside a Docker container, but unfortunately I cannot do that on my server.

So my question is: is there a way to disable the torch-xla requirement and run the model with plain torch?
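One pattern that sometimes helps in this situation is guarding the torch-xla import and falling back to plain PyTorch when it is unavailable. This is only a sketch: whether it works depends on how the model's loading code imports torch_xla (if the import is unconditional inside the model code, that code would need the same guard). The helper name `pick_backend` is mine, not from the model's codebase:

```python
import importlib.util

def pick_backend():
    """Return "xla" if torch_xla can be imported, else fall back to "torch".

    Using find_spec avoids actually importing torch_xla, so the broken
    GLIBC-linked shared library is never loaded on machines without it.
    """
    if importlib.util.find_spec("torch_xla") is not None:
        return "xla"
    return "torch"

backend = pick_backend()
# With backend == "torch", device selection would use torch.device("cpu")
# (or "cuda") instead of torch_xla's xm.xla_device().
```

The key point is that `importlib.util.find_spec` only checks availability without triggering the import, so the `GLIBC_2.23` error from `libxla_computation_client.so` never fires on machines where torch-xla cannot load.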