---
language: pt
license: mit
tags:
- bert
- pytorch
datasets:
- Twitter
---
## Introduction
BERTabaporu is a family of BERT models pre-trained on Brazilian Portuguese Twitter data, released in base and large sizes (see the table below).
## Available models
| Model | Arch. | #Layers | #Params |
| ---------------------------------------- | ---------- | ------- | ------- |
| `pablocosta/bertabaporu-base-uncased` | BERT-Base | 12 | 110M |
| `pablocosta/bertabaporu-large-uncased` | BERT-Large | 24 | 335M |
## Usage
```python
from transformers import AutoTokenizer  # or BertTokenizer
from transformers import AutoModelForPreTraining  # or BertForPreTraining, to load the pre-training heads
from transformers import AutoModel  # or BertModel, for BERT without the pre-training heads

model = AutoModelForPreTraining.from_pretrained('pablocosta/bertabaporu-base-uncased')
tokenizer = AutoTokenizer.from_pretrained('pablocosta/bertabaporu-base-uncased')
```
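As a minimal sketch of how the model might be used for feature extraction (the example sentence and variable names are illustrative; the hidden size of 768 is assumed from the BERT-Base architecture listed above):

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Load the base model without the pre-training heads
tokenizer = AutoTokenizer.from_pretrained('pablocosta/bertabaporu-base-uncased')
model = AutoModel.from_pretrained('pablocosta/bertabaporu-base-uncased')

# Encode an example Portuguese tweet and extract contextual embeddings
inputs = tokenizer("bom dia, tudo bem?", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch_size, sequence_length, hidden_size);
# hidden_size is 768 for the base model
embeddings = outputs.last_hidden_state
```

The per-token embeddings can then be pooled (for example, by averaging or taking the `[CLS]` position) to obtain a sentence-level representation for downstream tasks.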