# transformers-CFG-JSON-demo / requirements.txt
# b5b0c27 (Saibo Geng): use gpt-large and the optimum package for faster CPU inference
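# PyTorch backend used by transformers for model inference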
torch
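# Hugging Face Optimum, pulled in for faster CPU inference (see commit message above)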
optimum
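# Hugging Face Transformers, 4.26 or newer, for loading the model and tokenizer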
transformers>=4.26
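# transformers-cfg: grammar-constrained decoding, used here to force valid JSON output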
transformers-cfg==0.2.0
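
# For context, a minimal sketch of how these dependencies fit together, based on the
# transformers-cfg README: load a model with transformers, read a JSON grammar, and
# constrain generation with a grammar logits processor. The model id "gpt2-large",
# the "json.ebnf" path, and the prompt are illustrative assumptions, and class names
# may differ slightly in transformers-cfg 0.2.0; the optimum-based CPU speedup from
# the commit message is omitted here.
#
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#   from transformers_cfg.grammar_utils import IncrementalGrammarConstraint
#   from transformers_cfg.generation.logits_process import GrammarConstrainedLogitsProcessor
#
#   model_id = "gpt2-large"  # assumption: stands in for the "gpt-large" model named in the commit
#   tokenizer = AutoTokenizer.from_pretrained(model_id)
#   model = AutoModelForCausalLM.from_pretrained(model_id)
#
#   # "json.ebnf" is a hypothetical path to a JSON grammar in transformers-cfg's EBNF format
#   with open("json.ebnf") as f:
#       grammar_str = f.read()
#   grammar = IncrementalGrammarConstraint(grammar_str, "root", tokenizer)
#   grammar_processor = GrammarConstrainedLogitsProcessor(grammar)
#
#   prompt = "This is a valid JSON object describing a person:"
#   input_ids = tokenizer(prompt, return_tensors="pt")["input_ids"]
#
#   # The logits processor masks tokens that would violate the grammar,
#   # so the decoded continuation is guaranteed to parse as JSON.
#   output = model.generate(
#       input_ids,
#       max_new_tokens=64,
#       logits_processor=[grammar_processor],
#       repetition_penalty=1.1,
#   )
#   print(tokenizer.decode(output[0], skip_special_tokens=True))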