{
  "tokenizer_class": "sentence_transformers.models.tokenizer.WhitespaceTokenizer.WhitespaceTokenizer",
  "update_embeddings": false,
  "max_seq_length": 1000000
}
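
For context, a minimal sketch of how a dotted "tokenizer_class" path like the one in this config is typically resolved back into a Python class. Assumptions not stated in the file itself: the config is saved locally as config.json, and sentence-transformers is installed so the import of the referenced module succeeds. This only illustrates the generic dynamic-import pattern, not any particular library's loader.

# Sketch: resolve the dotted class path stored under "tokenizer_class".
# Assumes the JSON above is saved as "config.json" in the working directory.
import importlib
import json

with open("config.json") as f:
    config = json.load(f)

# Split "package.module.ClassName" into its module path and class name.
module_path, class_name = config["tokenizer_class"].rsplit(".", 1)
tokenizer_cls = getattr(importlib.import_module(module_path), class_name)

print(tokenizer_cls)             # the WhitespaceTokenizer class object
print(config["max_seq_length"])  # 1000000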