Model: albert-large

How to use this model directly from the 🤗/transformers library:

from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("albert-large")
model = AutoModel.from_pretrained("albert-large")
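
Once the tokenizer and model are loaded, they can be run on text to obtain contextual hidden states. The following is a minimal sketch, assuming a recent transformers version with a PyTorch backend; the example sentence is arbitrary and the model identifier is the one shown above:

import torch

# Tokenize an example sentence and return PyTorch tensors
inputs = tokenizer("ALBERT shares layer parameters across the network.", return_tensors="pt")

# Run a forward pass without tracking gradients
with torch.no_grad():
    outputs = model(**inputs)

# Per-token hidden states, shape (batch_size, sequence_length, hidden_size=1024)
print(outputs.last_hidden_state.shape)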

Config

attention_probs_dropout_prob: 0.1
hidden_act: "gelu"
hidden_dropout_prob: 0.1
embedding_size: 128
hidden_size: 1024
initializer_range: 0.02
intermediate_size: 4096
max_position_embeddings: 512
num_attention_heads: 16
num_hidden_layers: 24
num_hidden_groups: 1
net_structure_type: 0
gap_size: 0
num_memory_blocks: 0
inner_group_num: 1
down_scale_factor: 1
type_vocab_size: 2
vocab_size: 30000
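
These values can also be inspected programmatically. A minimal sketch, assuming a recent transformers version (AutoConfig is used here for illustration; the model identifier is the one shown above):

from transformers import AutoConfig

config = AutoConfig.from_pretrained("albert-large")

# Factorized embedding parameterization: 128-dimensional token embeddings
# are projected up to the 1024-dimensional hidden size.
print(config.embedding_size, config.hidden_size)            # 128 1024

# Cross-layer parameter sharing: all 24 layers draw from a single hidden group.
print(config.num_hidden_layers, config.num_hidden_groups)   # 24 1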