# Question Answering NLU
Question Answering NLU (QANLU) is an approach that maps the NLU task to question answering,
leveraging pre-trained question-answering models to perform well in few-shot settings. Instead of
training an intent classifier or a slot tagger, for example, we can ask the model intent- and
slot-related questions in natural language:
```
Context : I'm looking for a cheap flight to Boston.
Question: Is the user looking to book a flight?
Answer : Yes
Question: Is the user asking about departure time?
Answer : No
Question: What price is the user looking for?
Answer : cheap
Question: Where is the user flying from?
Answer : (empty)
```
Thus, by asking questions for each intent and slot in natural language, we can effectively construct an NLU hypothesis. For more details,
please read the paper:
[Language model is all you need: Natural language understanding as question answering](https://assets.amazon.science/33/ea/800419b24a09876601d8ab99bfb9/language-model-is-all-you-need-natural-language-understanding-as-question-answering.pdf).
To see how to train a QANLU model, visit the [Amazon Science repository](https://github.com/amazon-research/question-answering-nlu).
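For instance, the questions above could be generated programmatically from an NLU schema. The sketch below is only illustrative: the `schema` dictionary, intent descriptions, and question templates are assumptions for demonstration, not part of the released code, which may phrase questions differently.
```python
# Illustrative sketch: turn a (hypothetical) NLU schema into QANLU-style questions.
schema = {
    "intents": {
        "book_flight": "looking to book a flight",
        "ask_departure_time": "asking about departure time",
    },
    "slots": {
        "price": "What price is the user looking for?",
        "origin": "Where is the user flying from?",
    },
}

def build_questions(schema):
    """Map each intent and slot to a natural-language question."""
    questions = []
    for intent, description in schema["intents"].items():
        questions.append((f"intent:{intent}", f"Is the user {description}?"))
    for slot, question in schema["slots"].items():
        questions.append((f"slot:{slot}", question))
    return questions

for key, question in build_questions(schema):
    print(key, "->", question)
```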
## Use in transformers
```python
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

tokenizer = AutoTokenizer.from_pretrained("AmazonScience/qanlu", use_auth_token=True)
model = AutoModelForQuestionAnswering.from_pretrained("AmazonScience/qanlu", use_auth_token=True)
```
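Once the model and tokenizer are loaded, inference can go through the standard `question-answering` pipeline. The snippet below is a minimal sketch that continues from the loading code above; the `"Yes. No."` context prefix follows the paper's trick of letting the extractive model answer yes/no intent questions by selecting one of those tokens as a span, and is an assumption about the released model's expected input format.
```python
from transformers import pipeline

# Continues from the loading snippet above. The "Yes. No." prefix is an
# assumption: it gives the extractive QA model spans to select when answering
# yes/no intent questions.
qa = pipeline("question-answering", model=model, tokenizer=tokenizer)

context = "Yes. No. I'm looking for a cheap flight to Boston."
questions = [
    "Is the user looking to book a flight?",   # intent question
    "What price is the user looking for?",     # slot question
]

for question in questions:
    result = qa(question=question, context=context)
    print(f"{question} -> {result['answer']} (score={result['score']:.2f})")
```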
## Citation
If you use this work, please cite:
```
@inproceedings{namazifar2021language,
title={Language model is all you need: Natural language understanding as question answering},
author={Namazifar, Mahdi and Papangelis, Alexandros and Tur, Gokhan and Hakkani-T{\"u}r, Dilek},
booktitle={ICASSP 2021-2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)},
pages={7803--7807},
year={2021},
organization={IEEE}
}
```
## License
This library is licensed under the CC BY NC License.