# alberti / requirements.txt
tokenizers            # Hugging Face fast tokenizers
datasets==1.9.0       # Hugging Face datasets library
transformers==4.8.2   # Hugging Face Transformers (pinned)
torch==1.11.0         # PyTorch (pinned)
streamlit             # Streamlit app framework
icu_tokenizer         # ICU-based multilingual tokenizer
langid                # language identification