yacht/latte-mc-bert-base-thai-ws

Task: Token Classification (Thai word segmentation) · Transformers · PyTorch · bert · feature-extraction · Inference Endpoints
Datasets: best2010, lst20, tlc, vistec-tp-th-2021, wisesight_sentiment
License: cc-by-sa-4.0
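The model is tagged as token classification for Thai word segmentation, i.e. it predicts a boundary label per input unit and the labels are grouped into words. As a hedged illustration only (the B/I tag scheme and function name below are assumptions, not this model's documented label set; check `id2label` in the repo's `config.json` for the real labels), boundary tags can be decoded like this:

```python
def decode_bi_tags(chars, tags):
    """Group characters into words from parallel B/I tags.

    "B" starts a new word, "I" continues the current one.
    The B/I scheme here is assumed for illustration.
    """
    words = []
    for ch, tag in zip(chars, tags):
        if tag == "B" or not words:
            words.append(ch)       # start a new word
        else:
            words[-1] += ch        # extend the current word
    return words


# "แมวกินปลา" (cat eats fish) has no spaces; segmentation recovers the words.
chars = list("แมวกินปลา")
tags = ["B", "I", "I", "B", "I", "I", "B", "I", "I"]
print(decode_bi_tags(chars, tags))  # ['แมว', 'กิน', 'ปลา']
```

Thai is written without spaces between words, which is why segmentation is framed as per-character classification rather than whitespace splitting.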
latte-mc-bert-base-thai-ws · 1 contributor · History: 5 commits
Latest commit: 23d0466, SFconvertbot, "Adding `safetensors` variant of this model", 10 months ago
| File | Size | Last commit message | Last updated |
| --- | --- | --- | --- |
| .gitattributes | 1.57 kB | update tokenizer with fast version | 10 months ago |
| README.md | 1.86 kB | update latte url in model card | 10 months ago |
| added_tokens.json | 5.16 MB | update tokenizer with fast version | 10 months ago |
| config.json | 867 Bytes | add model, tokenizer files, and model card | 10 months ago |
| model.safetensors | 1.2 GB (LFS) | Adding `safetensors` variant of this model | 10 months ago |
| pytorch_model.bin | 1.2 GB (LFS, pickle) | add model, tokenizer files, and model card | 10 months ago |
| special_tokens_map.json | 336 Bytes | update tokenizer with fast version | 10 months ago |
| tokenizer.json | 33.9 MB (LFS) | update tokenizer with fast version | 10 months ago |
| tokenizer.pkl | 15.1 MB (LFS, pickle) | add model, tokenizer files, and model card | 10 months ago |
| tokenizer_config.json | 396 Bytes | update tokenizer with fast version | 10 months ago |
| vocab.txt | 996 kB | add model, tokenizer files, and model card | 10 months ago |

Pickle scan results:

- `pytorch_model.bin`: detected pickle imports (4): `torch.FloatStorage`, `torch._utils._rebuild_tensor_v2`, `collections.OrderedDict`, `torch.LongStorage`
- `tokenizer.pkl`: detected pickle imports (5): `collections.OrderedDict`, `transformers.models.bert.tokenization_bert.BertTokenizer`, `transformers.tokenization_utils.Trie`, `transformers.models.bert.tokenization_bert.WordpieceTokenizer`, `transformers.models.bert.tokenization_bert.BasicTokenizer`