Fine-tune and evaluate a transformer model on Facebook's bAbI tasks.

Paper: Towards AI-Complete Question Answering: A Set of Prerequisite Toy Tasks

Training Code: p208p2002/bAbi-tasks-with-transformer-model
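
For reference, the bAbI data is also available on the Hugging Face Hub. The dataset ID and config keys below (`facebook/babi_qa` with `type` and `task_no`) are assumptions based on the public dataset card and may require a `datasets` version that still supports loading scripts; check the training repo above for the exact loading code.

```python
from datasets import load_dataset

# Assumed dataset ID and config keys; verify against the training repo.
babi_qa1 = load_dataset("facebook/babi_qa", type="en-10k", task_no="qa1")
print(babi_qa1["train"][0])
```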

| task_no | task_name | score |
| ------- | --------- | ----- |
| qa1 | single-supporting-fact | 100 |
| qa2 | two-supporting-facts | 99.4 |
| qa3 | three-supporting-facts | 62.0 |
| qa4 | two-arg-relations | 100 |
| qa5 | three-arg-relations | 96.5 |
| qa6 | yes-no-questions | 100 |
| qa7 | counting | 100 |
| qa8 | lists-sets | 99.8 |
| qa9 | simple-negation | 100 |
| qa10 | indefinite-knowledge | 100 |
| qa11 | basic-coreference | 100 |
| qa12 | conjunction | 100 |
| qa13 | compound-coreference | 100 |
| qa14 | time-reasoning | 100 |
| qa15 | basic-deduction | 100 |
| qa16 | basic-induction | 100 |
| qa17 | positional-reasoning | 100 |
| qa18 | size-reasoning | 100 |
| qa19 | path-finding | 100 |
| qa20 | agents-motivations | 100 |
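
The scores are read here as per-task answer accuracy in percent; the card does not define the metric, so the exact-match computation below (and the helper name) is an assumption.

```python
def exact_match_accuracy(predictions, references):
    """Percentage of generated answers that exactly match the gold answers (assumed metric)."""
    correct = sum(
        pred.strip().lower() == gold.strip().lower()
        for pred, gold in zip(predictions, references)
    )
    return 100.0 * correct / len(references)

# exact_match_accuracy(["bathroom"], ["bathroom"]) -> 100.0
```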
```python
# Please use the following input template.
INPUT_TEMPLATE = """
Context:
{context}

Question:
{question}

Answer:
{answer}
"""

input_text = INPUT_TEMPLATE.format_map({
    "context": context,
    "question": question,
    "answer": answer,
}).strip()
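```

Putting the template to work: a minimal generation sketch using the standard `transformers` API and the `INPUT_TEMPLATE` defined above. The sample story and the decoding settings are illustrative assumptions; at inference time the `answer` slot is left empty so the model completes it.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("p208p2002/gpt2-large-babi")
model = AutoModelForCausalLM.from_pretrained("p208p2002/gpt2-large-babi")

# Illustrative qa1-style story; any bAbI context/question pair works.
context = "Mary moved to the bathroom. John went to the hallway."
question = "Where is Mary?"

# Leave the answer slot empty so the model generates it.
prompt = INPUT_TEMPLATE.format_map({
    "context": context,
    "question": question,
    "answer": "",
}).strip()

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=8,                     # bAbI answers are short
    do_sample=False,                      # greedy decoding (an assumption)
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token
)

# Decode only the newly generated tokens after the prompt.
answer = tokenizer.decode(
    outputs[0][inputs["input_ids"].shape[1]:],
    skip_special_tokens=True,
)
print(answer.strip())  # expected: "bathroom"
```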