---
license: mit
datasets:
- squad_v2
- quac
language:
- en
widget:
- text: >-
when: Lionel Andrés Messi[note 1] (Spanish pronunciation: [ljoˈnel anˈdɾes
ˈmesi] (listen); born 24 June 1987), also known as Leo Messi, is an
Argentine professional footballer who plays as a forward for and captains
both Major League Soccer club Inter Miami and the Argentina national team.
Widely regarded as one of the greatest players of all time, Messi has won
a record seven Ballon d'Or awards[note 2] and a record six European Golden
Shoes, and in 2020 he was named to the Ballon d'Or Dream Team. Until
leaving the club in 2021, he had spent his entire professional career with
Barcelona, where he won a club-record 34
- text: >-
where: Lionel Andrés Messi[note 1] (Spanish pronunciation: [ljoˈnel
anˈdɾes ˈmesi] (listen); born 24 June 1987), also known as Leo Messi, is
an Argentine professional footballer who plays as a forward for and
captains both Major League Soccer club Inter Miami and the Argentina
national team. Widely regarded as one of the greatest players of all time,
Messi has won a record seven Ballon d'Or awards[note 2] and a record six
European Golden Shoes, and in 2020 he was named to the Ballon d'Or Dream
Team. Until leaving the club in 2021, he had spent his entire professional
career with Barcelona, where he won a club-record 34
- text: >-
how: Lionel Andrés Messi[note 1] (Spanish pronunciation: [ljoˈnel anˈdɾes
ˈmesi] (listen); born 24 June 1987), also known as Leo Messi, is an
Argentine professional footballer who plays as a forward for and captains
both Major League Soccer club Inter Miami and the Argentina national team.
Widely regarded as one of the greatest players of all time, Messi has won
a record seven Ballon d'Or awards[note 2] and a record six European Golden
Shoes, and in 2020 he was named to the Ballon d'Or Dream Team. Until
leaving the club in 2021, he had spent his entire professional career with
Barcelona, where he won a club-record 34
- text: >-
what: Lionel Andrés Messi[note 1] (Spanish pronunciation: [ljoˈnel anˈdɾes
ˈmesi] (listen); born 24 June 1987), also known as Leo Messi, is an
Argentine professional footballer who plays as a forward for and captains
both Major League Soccer club Inter Miami and the Argentina national team.
Widely regarded as one of the greatest players of all time, Messi has won
a record seven Ballon d'Or awards[note 2] and a record six European Golden
Shoes, and in 2020 he was named to the Ballon d'Or Dream Team. Until
leaving the club in 2021, he had spent his entire professional career with
Barcelona, where he won a club-record 34
- text: >-
where: Egypt (Egyptian Arabic: مصر Maṣr Egyptian Arabic pronunciation:
[mɑsˤr]), officially the Arab Republic of Egypt, is a transcontinental
country spanning the northeast corner of Africa and the Sinai Peninsula in
the southwest corner of Asia. It is bordered by the Mediterranean Sea to
the north, the Gaza Strip of Palestine and Israel to the northeast, the
Red Sea to the east, Sudan to the south, and Libya to the west. The Gulf
of Aqaba in the northeast separates Egypt from Jordan and Saudi Arabia.
Cairo is the capital and largest city of Egypt, while Alexandria, the
second-largest city, is an important industrial and tourist hub at the
Mediterranean coast.[11] At approximately 100 million inhabitants, Egypt
is the 14th-most populated country in the world, and the third-most
populated in Africa, behind Nigeria and Ethiopia.
- text: >-
where: There is evidence of rock carvings along the Nile terraces and in
desert oases. In the 10th millennium BCE, a culture of hunter-gatherers
and fishers was replaced by a grain-grinding culture. Climate changes or
overgrazing around 8000 BCE began to desiccate the pastoral lands of
Egypt, forming the Sahara. Early tribal peoples migrated to the Nile River
where they developed a settled agricultural economy and more centralized
society.
- text: >-
when: By about 6000 BCE, a Neolithic culture took root in the Nile
Valley.[31] During the Neolithic era, several predynastic cultures
developed independently in Upper and Lower Egypt. The Badarian culture and
the successor Naqada series are generally regarded as precursors to
dynastic Egypt. The earliest known Lower Egyptian site, Merimda, predates
the Badarian by about seven hundred years. Contemporaneous Lower Egyptian
communities coexisted with their southern counterparts for more than two
thousand years. The earliest known evidence of Egyptian hieroglyphic
inscriptions appeared during the predynastic period on Naqada III pottery
vessels, dated to about 3200 BCE.[32]
- text: >-
whose: or the next three millennia. Egyptian culture flourished during
this long period and remained distinctively Egyptian in its religion,
arts, language and customs. The first two ruling dynasties of a unified
Egypt set the stage for the Old Kingdom period, c. 2700–2200 BCE, which
constructed many pyramids, most notably the Third Dynasty pyramid of
Djoser and the Fourth Dynasty Giza pyramids.
- text: >-
who: The First Intermediate Period ushered in a time of political upheaval
for about 150 years.[33] Stronger Nile floods and stabilisation of
government, however, brought back renewed prosperity for the country in
the Middle Kingdom c. 2040 BCE, reaching a peak during the reign of
Pharaoh Amenemhat III. A second period of disunity heralded the arrival of
the first foreign ruling dynasty in Egypt, that of the Semitic Hyksos. The
Hyksos invaders took over much of Lower Egypt around 1650 BCE and founded
a new capital at Avaris. They were driven out by an Upper Egyptian force
led by Ahmose I, who founded the Eighteenth Dynasty and relocated the
capital from Memphis to Thebes.
library_name: transformers
tags:
- generate answers
- question generator
- generate text
- nlp
- dataset maker
- flan t5
- t5
- extract questions from context
- extract question
---

# Model Card for QA_GeneraToR
Excited 😄 to share with you my very first model 🤖 for generating question-answering datasets! This model takes articles 📜 or web pages, and all you need to provide is a prompt and a context. It works like magic ✨, generating both the question and the answer. The prompt can be any question word – "what", "who", "where", etc.! 😅

I've harnessed the power of the flan-t5 model 🚀, which has truly elevated the quality of the results. You can find all the code and details in the repository right here: https://lnkd.in/dhE5s_qg

And guess what? I've even deployed the project, so you can experience the magic firsthand: https://lnkd.in/diq-d3bt ❤️

Join me on this exciting journey into #nlp, #textgeneration, #t5, #deeplearning, and #huggingface. Your feedback and collaboration are more than welcome! 🌟
## My fine-tuned model

This model is fine-tuned to generate a question together with its answer from a context. Why is that useful? It lets you build a dataset from a book, an article, or any other text, and then train another model on that dataset. Give the model a context prefixed with the question word you want (prompt + context), and it will extract a question and an answer for you. These are the prompt words I use: ["which", "how", "when", "where", "who", "whom", "whose", "why", "whereas", "can", "could", "may", "might", "will", "would", "shall", "should", "do", "does", "did", "is", "are", "am", "was", "were", "be", "being", "been", "have", "has", "had", "if", "must"]
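As a minimal sketch of the dataset-making workflow described above, here is how you might pair each context with several prompt words to mass-produce model inputs. The helper name `build_inputs` is hypothetical, and the `"prompt: context"` separator is an assumption based on the widget examples above:

```python
# Hypothetical helper: pair each context with several question words,
# using the "<prompt>: <context>" input format shown in the widget examples.
PROMPTS = ["what", "who", "when", "where", "why", "how"]

def build_inputs(contexts, prompts=PROMPTS):
    """Return one model input string per (prompt, context) pair."""
    return [f"{p}: {c}" for c in contexts for p in prompts]

inputs = build_inputs(["The capital of France is Paris."], prompts=["what", "where"])
print(inputs[0])  # what: The capital of France is Paris.
```

Feeding each of these strings to the model yields one question-answer pair per prompt word, so a single context can produce several dataset rows.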
## Original model info

The base model is Google's flan-t5.
## Code
```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

def generate_question_answer(context, prompt, model_name="mohamedemam/Question_generator"):
    """
    Generates a question-answer pair from the provided context and prompt.

    Args:
        context: String containing the source text.
        prompt: Question word the generated question should start with
            (e.g. "what", "who", "where").
        model_name: Optional string specifying the model name
            (default: mohamedemam/Question_generator).

    Returns:
        The generated text, containing the question and its answer.
    """
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    # flan-t5 is a sequence-to-sequence model, so we load it with
    # AutoModelForSeq2SeqLM and generate text rather than extracting spans.
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

    # The model expects the question word followed by the context,
    # matching the widget examples above.
    inputs = tokenizer(f"{prompt}: {context}", return_tensors="pt")
    with torch.no_grad():
        output_ids = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Example usage
context = "The capital of France is Paris."
prompt = "what"
print(generate_question_answer(context, prompt))
```