bnunticha/both-sent-segment
Tags: Token Classification · Transformers · TensorBoard · Safetensors · camembert · Generated from Trainer · Inference Endpoints
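The Safetensors tag above indicates the checkpoint weights are stored in the safetensors format rather than a pickle. The format's safety comes from its layout: an 8-byte little-endian length prefix, a plain-JSON header describing each tensor, then raw tensor bytes, so nothing in the file is executable. A stdlib-only sketch of that layout (the helper name and the hand-built one-tensor file are illustrative, not part of this repo):

```python
import json
import struct

def read_safetensors_header(data: bytes) -> dict:
    """Parse a .safetensors header: an 8-byte little-endian u64
    length prefix followed by a JSON header. Raw tensor bytes
    follow the header; no part of the file is executable code."""
    (header_len,) = struct.unpack_from("<Q", data, 0)
    return json.loads(data[8:8 + header_len])

# Build a minimal one-tensor file by hand for illustration.
tensor_bytes = struct.pack("<4f", 1.0, 2.0, 3.0, 4.0)  # four float32 values
header = {"w": {"dtype": "F32", "shape": [4],
                "data_offsets": [0, len(tensor_bytes)]}}
header_json = json.dumps(header).encode()
blob = struct.pack("<Q", len(header_json)) + header_json + tensor_bytes

print(read_safetensors_header(blob))
```

Because the header is ordinary JSON, a loader can inspect tensor names, dtypes, and shapes before touching any data, which is why the Hub favors model.safetensors over pickled weight files.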
Files and versions
both-sent-segment (commit 88c2381) · 1 contributor · History: 2 commits
Latest commit: bnunticha · "Training in progress, step 500" · 88c2381 · 10 months ago
File                      Size       Last commit message              Age
runs/                     —          Training in progress, step 500   10 months ago
.gitattributes            1.52 kB    initial commit                   10 months ago
config.json               783 Bytes  Training in progress, step 500   10 months ago
model.safetensors         419 MB     (LFS)  Training in progress, step 500   10 months ago
sentencepiece.bpe.model   905 kB     (LFS)  Training in progress, step 500   10 months ago
special_tokens_map.json   365 Bytes  Training in progress, step 500   10 months ago
tokenizer.json            2.18 MB    Training in progress, step 500   10 months ago
tokenizer_config.json     1.79 kB    Training in progress, step 500   10 months ago
training_args.bin         4.6 kB     (LFS)  Training in progress, step 500   10 months ago

training_args.bin is a pickle file. Detected Pickle imports (8): transformers.training_args.OptimizerNames, accelerate.state.PartialState, torch.device, transformers.trainer_utils.HubStrategy, transformers.training_args.TrainingArguments, accelerate.utils.dataclasses.DistributedType, transformers.trainer_utils.SchedulerType, transformers.trainer_utils.IntervalStrategy.
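The Hub's pickle scanner produces the import list above by reading the pickle's opcode stream without executing it: GLOBAL and STACK_GLOBAL opcodes name the `module.attribute` references the file would resolve when unpickled. A similar scan can be done locally with the stdlib pickletools module; this is a simplified sketch, not the Hub's scanner (the helper name is mine, and the STACK_GLOBAL handling assumes the two names appear as literal strings just before the opcode, which holds for common pickles but not all):

```python
import pickle
import pickletools

def pickle_imports(data: bytes) -> set[str]:
    """List module.attribute references in a pickle byte stream
    by scanning GLOBAL / STACK_GLOBAL opcodes, without unpickling."""
    imports = set()
    string_args = []  # recent string arguments; STACK_GLOBAL consumes the last two
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # Protocol <4: arg is "module attribute" in one string.
            module, _, name = arg.partition(" ")
            imports.add(f"{module}.{name}")
        elif opcode.name == "STACK_GLOBAL" and len(string_args) >= 2:
            # Protocol >=4: module and attribute were pushed as strings.
            imports.add(f"{string_args[-2]}.{string_args[-1]}")
        if isinstance(arg, str):
            string_args.append(arg)
    return imports

# Example: pickling a datetime object references datetime.datetime.
import datetime
payload = pickle.dumps(datetime.datetime(2024, 1, 1), protocol=4)
print(pickle_imports(payload))
```

Scanning like this shows what a file such as training_args.bin would import, but it is not a safety guarantee; untrusted pickles should still never be loaded, which is why weight files on the Hub are preferred in the safetensors format.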