NHNDQ/content_consumption
Tags: Feature Extraction · Transformers · Safetensors · roberta · text-embeddings-inference · Inference Endpoints
Branch: main · 1 contributor · History: 3 commits
Latest commit: jisukim8873 — Upload tokenizer (f0623fe, verified, 9 months ago)
File                     Size       Last commit       Age
.gitattributes           1.52 kB    initial commit    9 months ago
config.json              881 Bytes  Upload model      9 months ago
model.safetensors        442 MB     Upload model      9 months ago (LFS)
special_tokens_map.json  173 Bytes  Upload tokenizer  9 months ago
tokenizer.json           752 kB     Upload tokenizer  9 months ago
tokenizer_config.json    1.28 kB    Upload tokenizer  9 months ago
vocab.txt                248 kB     Upload tokenizer  9 months ago