---
library_name: peft
base_model: mistralai/Mistral-7B-v0.1
pipeline_tag: text-generation
---
Description: 1,500 PubMed articles with 4,409 annotated chemicals, 5,818 annotated diseases, and 3,116 chemical-disease interactions.\
Original dataset: https://huggingface.co/datasets/tner/bc5cdr \
---\
Try querying this adapter for free in LoRA Land at https://predibase.com/lora-land! \
Adapter category: Named Entity Recognition\
Adapter name: Chemical and Disease Recognition (bc5cdr)\
---\
Sample input: Your task is a Named Entity Recognition (NER) task. Predict the category of each entity, then place the entity into the list associated with the category in an output JSON payload. Below is an example:

Input: "Naloxone reverses the antihypertensive effect of clonidine ."

Output: {'B-Chemical': ['Naloxone', 'clonidine'], 'B-Disease': [], 'I-Disease': [], 'I-Chemical': []}

Now, complete the task.

Input: "A standardized loading dose of VPA was administered , and venous blood was sampled at 0 , 1 , 2 , 3 , and 4 hours ."

Output: \
---\
Sample output: {'B-Chemical': ['VPA'], 'B-Disease': [], 'I-Disease': [], 'I-Chemical': []}\
---\
Try using this adapter yourself!
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"
peft_model_id = "predibase/bc5cdr"

# Load the base model and its tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Attach the LoRA adapter (requires the `peft` package to be installed)
model.load_adapter(peft_model_id)
```
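
Once the adapter is loaded, you can query it with a prompt in the format shown above. The sketch below reuses the sample input from this card; the decoding settings (`max_new_tokens`, greedy decoding) are illustrative assumptions, not values prescribed by the adapter.

```python
# Build a prompt in the same format as the sample input above.
prompt = (
    "Your task is a Named Entity Recognition (NER) task. Predict the category "
    "of each entity, then place the entity into the list associated with the "
    "category in an output JSON payload. Below is an example:\n\n"
    'Input: "Naloxone reverses the antihypertensive effect of clonidine ."\n\n'
    "Output: {'B-Chemical': ['Naloxone', 'clonidine'], 'B-Disease': [], "
    "'I-Disease': [], 'I-Chemical': []}\n\n"
    "Now, complete the task.\n\n"
    'Input: "A standardized loading dose of VPA was administered , and venous '
    'blood was sampled at 0 , 1 , 2 , 3 , and 4 hours ."\n\n'
    "Output: "
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)

# Strip the prompt tokens and decode only the generated completion.
completion = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
)
print(completion)
```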