Transformers
Model: xlm-roberta-large-finetuned-conll03-english


How to use this model directly from the 🤗/transformers library:

from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-large-finetuned-conll03-english")
model = AutoModel.from_pretrained("xlm-roberta-large-finetuned-conll03-english")
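Since this checkpoint is fine-tuned for named-entity recognition, it is typically loaded with a token-classification head (e.g. `AutoModelForTokenClassification`). The sketch below shows, without downloading any weights, how per-token label ids from such a head map to the BIO tags in this model's `id2label` config; the tokens and predicted ids are mock values standing in for real model output, not produced by the model itself.

```python
# Mock per-token predictions, standing in for argmax over real model logits.
# The id2label mapping is copied verbatim from this model's config.
id2label = {
    0: "B-LOC", 1: "B-MISC", 2: "B-ORG", 3: "I-LOC",
    4: "I-MISC", 5: "I-ORG", 6: "I-PER", 7: "O",
}

tokens = ["Angela", "Merkel", "visited", "Paris", "."]
predicted_ids = [6, 6, 7, 3, 7]  # hypothetical predictions, one label id per token

# Pair each token with its BIO tag and keep only entity tokens.
tagged = [(tok, id2label[i]) for tok, i in zip(tokens, predicted_ids)]
entities = [(tok, tag) for tok, tag in tagged if tag != "O"]
print(entities)  # [('Angela', 'I-PER'), ('Merkel', 'I-PER'), ('Paris', 'I-LOC')]
```

With real output, the same decoding step runs on `model(**tokenizer(text, return_tensors="pt")).logits.argmax(-1)`; the `transformers` `pipeline("ner", ...)` helper wraps this mapping for you.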

Config

See raw config file
attention_probs_dropout_prob: 0.1
finetuning_task: null
hidden_act: "gelu"
hidden_dropout_prob: 0.1
hidden_size: 1024
id2label: { "0": "B-LOC", "1": "B-MISC", "2": "B-ORG", "3": "I-LOC", "4": "I-MISC", "5": "I-ORG", "6": "I-PER", "7": "O" }
initializer_range: 0.02
intermediate_size: 4096
is_decoder: false
label2id: { "B-LOC": 0, "B-MISC": 1, "B-ORG": 2, "I-LOC": 3, "I-MISC": 4, "I-ORG": 5, "I-PER": 6, "O": 7 }
layer_norm_eps: 0.00001
max_position_embeddings: 514
num_attention_heads: 16
num_hidden_layers: 24
num_labels: 8
output_attentions: false
output_hidden_states: false
output_past: true
pruned_heads: {}
torchscript: false
type_vocab_size: 1
use_bfloat16: false
vocab_size: 250002
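Several of these config values are interdependent. As a quick sanity check in pure Python (using the values copied from the config above), `label2id` should be the exact inverse of `id2label`, and `num_labels` should match the size of both mappings:

```python
# Values copied from the config above. In the serialized JSON config,
# id2label keys are strings while label2id values are integers.
id2label = {"0": "B-LOC", "1": "B-MISC", "2": "B-ORG", "3": "I-LOC",
            "4": "I-MISC", "5": "I-ORG", "6": "I-PER", "7": "O"}
label2id = {"B-LOC": 0, "B-MISC": 1, "B-ORG": 2, "I-LOC": 3,
            "I-MISC": 4, "I-ORG": 5, "I-PER": 6, "O": 7}
num_labels = 8

# label2id must invert id2label, and num_labels must agree with both.
assert {v: int(k) for k, v in id2label.items()} == label2id
assert num_labels == len(id2label) == len(label2id)
print("label mappings are consistent")
```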