---
license: mit
language:
- fr
---

This is an 8-bit quantized version of [distilcamembert-base-ner](https://huggingface.co/cmarkea/distilcamembert-base-ner) obtained with
[Intel® Neural Compressor](https://github.com/intel/neural-compressor) on the [wikiner_fr](https://huggingface.co/datasets/Jean-Baptiste/wikiner_fr)
dataset.

### Get Started

First, install the required libraries:

```bash
pip install --upgrade-strategy eager "optimum[neural-compressor]"
```

Second, use `INCModelForTokenClassification` from `optimum.intel`. It can be used in the same way as
an ordinary `DistilBertForTokenClassification`:

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer
from optimum.intel import INCModelForTokenClassification


model = INCModelForTokenClassification.from_pretrained('konverner/8bit-distilcamembert-base-ner')
tokenizer = AutoTokenizer.from_pretrained('konverner/8bit-distilcamembert-base-ner')

text = "Meta Platforms ou Meta, anciennement connue sous le nom de Facebook, est une multinationale américaine fondée en 2004 par Mark Zuckerberg."

model_input = tokenizer(text, return_tensors='pt')
model_output = model(**model_input)
print(model_output.logits.argmax(2))
# tensor([[0, 4, 4, 4, 4, 4, 0, 4, 4, 0, 0, 0, 0, 0, 0, 0, 0, 4, 0, 0, 0, 0, 0, 0,
#         0, 0, 0, 2, 2, 2, 2, 2, 0, 0]])
```
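The `argmax` output above is a tensor of label ids, one per token. To turn those ids into readable entity tags, apply the model's `id2label` mapping (available as `model.config.id2label`). A minimal sketch, with a hypothetical mapping hard-coded for illustration:

```python
# Decode predicted label ids into entity tags, one per token.
# NOTE: this id2label mapping is an assumption for illustration only;
# in practice, read the real mapping from model.config.id2label.
id2label = {0: "O", 1: "I-LOC", 2: "I-PER", 3: "I-MISC", 4: "I-ORG"}

# A short slice of predicted ids, in the shape of the argmax output above.
predicted_ids = [0, 4, 4, 4, 4, 4, 0, 2, 2, 0]

tags = [id2label[i] for i in predicted_ids]
print(tags)
# ['O', 'I-ORG', 'I-ORG', 'I-ORG', 'I-ORG', 'I-ORG', 'O', 'I-PER', 'I-PER', 'O']
```

Alignment with the input words still requires the tokenizer's offset or word-id information, since CamemBERT's subword tokenization can split one word across several tokens.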