---
language:
- en

tags:
- text2text-generation

widget:
- text: "Please answer to the following question. Who is going to be the next Ballon d'or?"
  example_title: "Question Answering"
- text: "Q: Can Geoffrey Hinton have a conversation with George Washington? Give the rationale before answering."
  example_title: "Logical reasoning"
- text: "Please answer the following question. What is the boiling point of Nitrogen?"
  example_title: "Scientific knowledge"
- text: "Answer the following yes/no question. Can you write a whole Haiku in a single tweet?"
  example_title: "Yes/no question"
- text: "Answer the following yes/no question by reasoning step-by-step. Can you write a whole Haiku in a single tweet?"
  example_title: "Reasoning task"
- text: "Q: ( False or not False or False ) is? A: Let's think step by step"
  example_title: "Boolean Expressions"
- text: "The square root of x is the cube root of y. What is y to the power of 2, if x = 4?"
  example_title: "Math reasoning"
- text: "Premise: At my age you will probably have learned one lesson. Hypothesis: It's not certain how many lessons you'll learn by your thirties. Does the premise entail the hypothesis?"
  example_title: "Premise and hypothesis"

datasets:
- Muennighoff/flan
- Open-Orca/SlimOrca-Dedup
- garage-bAInd/Open-Platypus
- Weyaxi/HelpSteer-filtered
- GAIR/lima

license: mit
---

# Model Card for the test version of instructionBERT for Bertology

A minimalistic instruction-tuned model built on an already well-analysed, pretrained encoder such as BERT.
This lets us study the [Bertology](https://aclanthology.org/2020.tacl-1.54.pdf) of instruction-tuned models and investigate [what happens to BERT embeddings during fine-tuning](https://aclanthology.org/2020.blackboxnlp-1.4.pdf).

For this purpose, we used the Hugging Face API for [warm-starting](https://huggingface.co/blog/warm-starting-encoder-decoder) [BertGeneration](https://huggingface.co/docs/transformers/model_doc/bert-generation) with [Encoder-Decoder models](https://huggingface.co/docs/transformers/v4.35.2/en/model_doc/encoder-decoder).
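
The sketch below illustrates this kind of warm-starting with the Hugging Face `transformers` API. It is a minimal example assuming `bert-base-uncased` checkpoints for both encoder and decoder; the actual checkpoints, special-token settings, and generation parameters used for this model may differ.

```python
from transformers import BertTokenizer, EncoderDecoderModel

# Warm-start an encoder-decoder model from pretrained BERT checkpoints.
# "bert-base-uncased" is an illustrative assumption, not necessarily the
# checkpoint used for instructionBERT.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased"
)

# Generation needs explicit special-token ids on the shared config.
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.pad_token_id = tokenizer.pad_token_id
model.config.eos_token_id = tokenizer.sep_token_id

# After instruction tuning, the model can be queried like the widget examples above.
inputs = tokenizer(
    "Please answer the following question. What is the boiling point of Nitrogen?",
    return_tensors="pt",
)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```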