Recommended infrastructure & text embedding inference (TEI) compatibility

#12
by navaprasannaraj - opened

Hi,

1-) I would like your advice on the recommended infrastructure for running this model:

  • if CPU: RAM needed, number of CPUs, storage disk, inference time per query
  • if GPU: RAM needed, number of GPUs, GPU type (A100, L40S; does an A100 partitioned into MIG instances work?), storage disk, inference time per query

2-) Is the model compatible with Text Embeddings Inference (TEI), i.e. can it be loaded and served with it?
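For what it's worth, a quick way to answer 2) empirically is to point a TEI container at the model and see whether it loads (TEI only supports a fixed set of encoder architectures, so an unsupported model fails at startup). A minimal sketch, assuming Docker with GPU support is available; the model ID below is a placeholder you'd replace with this model's actual repo name, and the image tag should be checked against the TEI repository for the current release:

```shell
# Placeholder model ID -- substitute the repo name of this model.
model=BAAI/bge-large-en-v1.5

# Serve the model with TEI on port 8080; weights are cached in ./tei-data.
# Check ghcr.io/huggingface/text-embeddings-inference for the current tag.
docker run --gpus all -p 8080:80 -v "$PWD/tei-data:/data" \
    ghcr.io/huggingface/text-embeddings-inference:latest \
    --model-id "$model"

# In another shell: request an embedding for a single query.
curl 127.0.0.1:8080/embed \
    -X POST \
    -H 'Content-Type: application/json' \
    -d '{"inputs": "What is deep learning?"}'
```

If the container starts and the `/embed` request returns a vector, the model is TEI-compatible; a CPU-only variant of the image also exists for testing without a GPU.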
