mGqFiPhu/0_WordEmbeddings/wordembedding_config.json
Craig Schmidt
first checkin
9bc6e90
{
  "tokenizer_class": "sentence_transformers.models.tokenizer.WhitespaceTokenizer.WhitespaceTokenizer",
  "update_embeddings": false,
  "max_seq_length": 1000000
}
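The `tokenizer_class` entry points at sentence-transformers' `WhitespaceTokenizer`, which splits text on whitespace and maps each token to an index in the model's vocabulary. A minimal sketch of that behavior, with a tiny hypothetical vocabulary standing in for the vocabulary file that ships alongside this config:

```python
# Sketch of the whitespace tokenization this config selects:
# split on whitespace, look up each token's vocabulary id.
# The vocab here is a made-up example; the real model provides its own.

from typing import Dict, List


def whitespace_tokenize(text: str, vocab: Dict[str, int]) -> List[int]:
    """Split on whitespace and return ids of in-vocabulary tokens,
    silently dropping out-of-vocabulary tokens."""
    return [vocab[tok] for tok in text.split() if tok in vocab]


vocab = {"hello": 0, "world": 1}
print(whitespace_tokenize("hello brave world", vocab))  # → [0, 1]
```

Since `update_embeddings` is false, the word vectors are frozen during training, and the very large `max_seq_length` (1,000,000 tokens) effectively disables truncation.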