bghira/terminus-xl-gamma-v1
Text-to-Image · Diffusers · Safetensors · StableDiffusionXLPipeline · Inference Endpoints
License: openrail++
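
The tags above identify this repository as a Diffusers StableDiffusionXLPipeline checkpoint stored in Safetensors. A minimal loading sketch, assuming a recent diffusers install and a CUDA device; the dtype and prompt are illustrative assumptions, not from this page:

```python
# Minimal sketch: load the full text-to-image pipeline from this repository.
# torch_dtype, device, and prompt are illustrative assumptions.
import torch
from diffusers import StableDiffusionXLPipeline

pipe = StableDiffusionXLPipeline.from_pretrained(
    "bghira/terminus-xl-gamma-v1", torch_dtype=torch.float16
).to("cuda")

image = pipe("a photograph of a lighthouse at dusk").images[0]
image.save("lighthouse.png")
```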
main · terminus-xl-gamma-v1 / tokenizer_2
3 contributors · History: 1 commit
Bagheera Bghira · 90816e1 · about 1 year ago
v1: from ptx0/coco-xltest, 36000 + 21800 + 14200 steps on a mixed LAION/MJ dataset, with offset noise and input perturbation applied at a probability of 25%, 10% caption dropout, and a cosine LR schedule cycling from 4e-7 to 8e-7 every 3200 steps
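
The commit message names several training-time techniques: offset noise, input perturbation applied with 25% probability, and 10% caption dropout. The sketch below shows how such terms are commonly implemented in a diffusion training loop; it is not this repository's training code, and every value other than the two stated probabilities is an illustrative assumption.

```python
# Sketch of offset noise, input perturbation (25% probability), and caption
# dropout (10%) as commonly implemented; scales are illustrative assumptions.
import random
import torch

def make_noise(latents, offset_scale=0.1, perturbation_scale=0.1, perturb_prob=0.25):
    # Base Gaussian noise for the diffusion forward process.
    noise = torch.randn_like(latents)
    # Offset noise: add a per-channel constant shift so the model can better
    # learn overall brightness/darkness (offset_scale is an assumed value).
    noise = noise + offset_scale * torch.randn(
        latents.shape[0], latents.shape[1], 1, 1,
        device=latents.device, dtype=latents.dtype,
    )
    # Input perturbation: with 25% probability, perturb the noise actually
    # added to the latents while the model still predicts the clean noise.
    if random.random() < perturb_prob:
        noise_to_add = noise + perturbation_scale * torch.randn_like(noise)
    else:
        noise_to_add = noise
    return noise, noise_to_add

def maybe_drop_caption(caption, drop_prob=0.10):
    # Caption dropout: replace 10% of captions with an empty string so the
    # model also learns an unconditional distribution (useful for CFG).
    return "" if random.random() < drop_prob else caption
```

The cosine learning-rate cycling between 4e-7 and 8e-7 every 3200 steps mentioned in the same message would typically be handled by a separate LR scheduler rather than inside these functions.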
merges.txt               Safe   525 kB      about 1 year ago
special_tokens_map.json  Safe   460 Bytes   about 1 year ago
tokenizer_config.json    Safe   725 Bytes   about 1 year ago
vocab.json               Safe   1.06 MB     about 1 year ago

All four files were added in the single v1 commit (90816e1) described above.
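
The merges.txt/vocab.json pair plus the special-token map and tokenizer config are the layout of a byte-pair-encoding tokenizer, and in an SDXL-style repository the tokenizer_2 subfolder normally holds the tokenizer for the second text encoder. A minimal sketch of loading just this folder, assuming it is a standard CLIP-style BPE tokenizer; the prompt is illustrative:

```python
# Minimal sketch: load only this subfolder's tokenizer. Assumes the standard
# CLIP BPE layout implied by merges.txt + vocab.json; the prompt is illustrative.
from transformers import CLIPTokenizer

tokenizer_2 = CLIPTokenizer.from_pretrained(
    "bghira/terminus-xl-gamma-v1", subfolder="tokenizer_2"
)

tokens = tokenizer_2(
    "a photograph of a lighthouse at dusk",
    padding="max_length",
    max_length=tokenizer_2.model_max_length,
    truncation=True,
    return_tensors="pt",
)
print(tokens.input_ids.shape)
```

Loading the full StableDiffusionXLPipeline as shown earlier picks up this tokenizer automatically; loading the subfolder directly is mainly useful for inspecting token IDs or building a custom training or inference loop.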