Instructions to use mpalaval/bert-ner-2 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use mpalaval/bert-ner-2 with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("token-classification", model="mpalaval/bert-ner-2")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForTokenClassification

tokenizer = AutoTokenizer.from_pretrained("mpalaval/bert-ner-2")
model = AutoModelForTokenClassification.from_pretrained("mpalaval/bert-ner-2")
```
- Notebooks
- Google Colab
- Kaggle
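The token-classification pipeline above returns one prediction per token, which usually needs to be merged into entity spans. This page does not document the label set of mpalaval/bert-ner-2, so the sketch below assumes standard CoNLL-style BIO tags (B-PER, I-PER, B-LOC, ...) and uses hypothetical sample predictions in place of real pipeline output:

```python
# Sketch: merging token-level BIO predictions into entity spans.
# The sample predictions are illustrative; the actual label set of
# mpalaval/bert-ner-2 is not documented on this page.

def group_entities(predictions, text):
    """Merge consecutive B-/I- tagged tokens into (label, span_text) pairs."""
    entities = []
    current = None  # (label, start_offset, end_offset)
    for p in predictions:
        tag = p["entity"]
        if tag.startswith("B-"):
            if current:
                entities.append((current[0], text[current[1]:current[2]]))
            current = (tag[2:], p["start"], p["end"])
        elif tag.startswith("I-") and current and tag[2:] == current[0]:
            # Extend the running entity to cover this token.
            current = (current[0], current[1], p["end"])
        else:
            if current:
                entities.append((current[0], text[current[1]:current[2]]))
            current = None
    if current:
        entities.append((current[0], text[current[1]:current[2]]))
    return entities

text = "Barack Obama visited Paris."
sample = [  # hypothetical pipeline-style output
    {"entity": "B-PER", "word": "Barack", "start": 0, "end": 6},
    {"entity": "I-PER", "word": "Obama", "start": 7, "end": 12},
    {"entity": "B-LOC", "word": "Paris", "start": 21, "end": 26},
]
print(group_entities(sample, text))  # → [('PER', 'Barack Obama'), ('LOC', 'Paris')]
```

Note that recent Transformers versions can do this grouping for you via the pipeline's `aggregation_strategy` argument (e.g. `aggregation_strategy="simple"`).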
- Xet hash: f35a989ef875d94db46f3c84166aaffd8c19a9871977879251dfd7609599bddf
- Size of remote file: 431 MB
- SHA256: b95cdd3df7404aef8b43a6893600e62fb9106984e3ff6a722a395f4fb9dc3f4b
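The SHA256 listed above can be used to verify a downloaded copy of the weights. A minimal sketch with the standard library, assuming the file has already been downloaded (the filename `model.safetensors` is a hypothetical placeholder, not taken from this page):

```python
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Compute the SHA256 hex digest of a file, reading it in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

EXPECTED = "b95cdd3df7404aef8b43a6893600e62fb9106984e3ff6a722a395f4fb9dc3f4b"
# "model.safetensors" is a placeholder path for the downloaded weights file:
# assert sha256_of("model.safetensors") == EXPECTED
```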
Xet efficiently stores large files inside Git by splitting them into unique chunks, which accelerates uploads and downloads.