{
  "tokenizer_class": "sentence_transformers.models.tokenizer.WhitespaceTokenizer.WhitespaceTokenizer",
  "update_embeddings": false,
  "max_seq_length": 1000000
}
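A minimal sketch of how a config like this is typically consumed. The field names suggest a sentence-transformers word-embeddings-style module config, where `tokenizer_class` is a dotted import path resolved at load time, `update_embeddings` controls whether the embedding weights are trainable, and `max_seq_length` caps the number of tokens per input; these interpretations are inferred, not stated in the file itself.

```python
import json

# The config shown above, embedded verbatim for illustration.
raw = """
{
  "tokenizer_class": "sentence_transformers.models.tokenizer.WhitespaceTokenizer.WhitespaceTokenizer",
  "update_embeddings": false,
  "max_seq_length": 1000000
}
"""

config = json.loads(raw)

# A dotted "tokenizer_class" string is usually resolved by splitting it
# into a module path and a class name, then importing with importlib.
module_path, _, class_name = config["tokenizer_class"].rpartition(".")

print(class_name)                   # WhitespaceTokenizer
print(config["update_embeddings"])  # False
print(config["max_seq_length"])     # 1000000
```

In the real library the class would then be loaded with `importlib.import_module(module_path)` and `getattr(module, class_name)`; the sketch stops at parsing so it stays self-contained.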