mohamedemam committed
Commit a50e15c
1 Parent(s): 64482dc

Update README.md

Files changed (1)
  1. README.md +15 -14
README.md CHANGED
@@ -12,6 +12,7 @@ tags:
 - nlp
 - dataset maker
 ---
+
 # Model Card for FLAN-T5 large
 
 <img src="https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/transformers/model_doc/flan2_architecture.jpg"
@@ -44,8 +45,8 @@ As mentioned in the first few lines of the abstract :
 
 
 - **Model type:** Language model
-- **Language(s) (NLP):** English, Spanish, Japanese, Persian, Hindi, French, Chinese, Bengali, Gujarati, German, Telugu, Italian, Arabic, Polish, Tamil, Marathi, Malayalam, Oriya, Panjabi, Portuguese, Urdu, Galician, Hebrew, Korean, Catalan, Thai, Dutch, Indonesian, Vietnamese, Bulgarian, Filipino, Central Khmer, Lao, Turkish, Russian, Croatian, Swedish, Yoruba, Kurdish, Burmese, Malay, Czech, Finnish, Somali, Tagalog, Swahili, Sinhala, Kannada, Zhuang, Igbo, Xhosa, Romanian, Haitian, Estonian, Slovak, Lithuanian, Greek, Nepali, Assamese, Norwegian
-- **License:** Apache 2.0
+- **Language(s) (NLP):** English
+- **License:** mit
 - **Related Models:** [All FLAN-T5 Checkpoints](https://huggingface.co/models?search=flan-t5)
 - **Original Checkpoints:** [All Original FLAN-T5 Checkpoints](https://github.com/google-research/t5x/blob/main/docs/models.md#flan-t5-checkpoints)
 - **Resources for more information:**
@@ -68,10 +69,10 @@ Find below some example scripts on how to use the model in `transformers`:
 
 from transformers import T5Tokenizer, T5ForConditionalGeneration
 
-tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-large")
-model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-large")
+tokenizer = T5Tokenizer.from_pretrained("mohamedemam/QA_GeneraToR")
+model = T5ForConditionalGeneration.from_pretrained("mohamedemam/QA_GeneraToR")
 
-input_text = "translate English to German: How old are you?"
+input_text = r"when: Lionel Andrés Messi[note 1] (Spanish pronunciation: [ljoˈnel anˈdɾes ˈmesi] (listen); born 24 June 1987), also known as Leo Messi, is an Argentine professional footballer who plays as a forward for and captains both Major League Soccer club Inter Miami and the Argentina national team. Widely regarded as one of the greatest players of all time, Messi has won a record seven Ballon d'Or awards[note 2] and a record six European Golden Shoes, and in 2020 he was named to the Ballon d'Or Dream Team. Until leaving the club"
 input_ids = tokenizer(input_text, return_tensors="pt").input_ids
 
 outputs = model.generate(input_ids)
@@ -89,10 +90,10 @@ print(tokenizer.decode(outputs[0]))
 # pip install accelerate
 from transformers import T5Tokenizer, T5ForConditionalGeneration
 
-tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-large")
-model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-large", device_map="auto")
+tokenizer = T5Tokenizer.from_pretrained("mohamedemam/QA_GeneraToR")
+model = T5ForConditionalGeneration.from_pretrained("mohamedemam/QA_GeneraToR", device_map="auto")
 
-input_text = "translate English to German: How old are you?"
+input_text = r"when: Lionel Andrés Messi[note 1] (Spanish pronunciation: [ljoˈnel anˈdɾes ˈmesi] (listen); born 24 June 1987), also known as Leo Messi, is an Argentine professional footballer who plays as a forward for and captains both Major League Soccer club Inter Miami and the Argentina national team. Widely regarded as one of the greatest players of all time, Messi has won a record seven Ballon d'Or awards[note 2] and a record six European Golden Shoes, and in 2020 he was named to the Ballon d'Or Dream Team. Until leaving the club"
 input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
 
 outputs = model.generate(input_ids)
@@ -113,10 +114,10 @@ print(tokenizer.decode(outputs[0]))
 import torch
 from transformers import T5Tokenizer, T5ForConditionalGeneration
 
-tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-large")
-model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-large", device_map="auto", torch_dtype=torch.float16)
+tokenizer = T5Tokenizer.from_pretrained("mohamedemam/QA_GeneraToR")
+model = T5ForConditionalGeneration.from_pretrained("mohamedemam/QA_GeneraToR", device_map="auto", torch_dtype=torch.float16)
 
-input_text = "translate English to German: How old are you?"
+input_text = r"when: Lionel Andrés Messi[note 1] (Spanish pronunciation: [ljoˈnel anˈdɾes ˈmesi] (listen); born 24 June 1987), also known as Leo Messi, is an Argentine professional footballer who plays as a forward for and captains both Major League Soccer club Inter Miami and the Argentina national team. Widely regarded as one of the greatest players of all time, Messi has won a record seven Ballon d'Or awards[note 2] and a record six European Golden Shoes, and in 2020 he was named to the Ballon d'Or Dream Team. Until leaving the club"
 input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
 
 outputs = model.generate(input_ids)
@@ -134,10 +135,10 @@ print(tokenizer.decode(outputs[0]))
 # pip install bitsandbytes accelerate
 from transformers import T5Tokenizer, T5ForConditionalGeneration
 
-tokenizer = T5Tokenizer.from_pretrained("google/flan-t5-large")
-model = T5ForConditionalGeneration.from_pretrained("google/flan-t5-large", device_map="auto", load_in_8bit=True)
+tokenizer = T5Tokenizer.from_pretrained("mohamedemam/QA_GeneraToR")
+model = T5ForConditionalGeneration.from_pretrained("mohamedemam/QA_GeneraToR", device_map="auto", load_in_8bit=True)
 
-input_text = "translate English to German: How old are you?"
+input_text = r"when: Lionel Andrés Messi[note 1] (Spanish pronunciation: [ljoˈnel anˈdɾes ˈmesi] (listen); born 24 June 1987), also known as Leo Messi, is an Argentine professional footballer who plays as a forward for and captains both Major League Soccer club Inter Miami and the Argentina national team. Widely regarded as one of the greatest players of all time, Messi has won a record seven Ballon d'Or awards[note 2] and a record six European Golden Shoes, and in 2020 he was named to the Ballon d'Or Dream Team. Until leaving the club"
 input_ids = tokenizer(input_text, return_tensors="pt").input_ids.to("cuda")
 
 outputs = model.generate(input_ids)
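
The substance of the change is the same across all four snippets: the `google/flan-t5-large` checkpoint is swapped for `mohamedemam/QA_GeneraToR`, and the translation prompt is replaced by a question-word prefix (`when:`) followed by a passage. Below is a minimal end-to-end sketch of the updated CPU example, under the assumption that the checkpoint loads like any standard T5 model and that prompts follow the `<question word>: <passage>` pattern shown in the diff; the shortened passage and generation settings are illustrative, not part of the commit.

```python
# Minimal sketch of the updated CPU example from this commit.
# Assumptions: mohamedemam/QA_GeneraToR loads like a standard T5 checkpoint,
# and prompts follow the "<question word>: <passage>" pattern seen in the diff.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("mohamedemam/QA_GeneraToR")
model = T5ForConditionalGeneration.from_pretrained("mohamedemam/QA_GeneraToR")

# Shortened passage for illustration; the README uses a longer excerpt about Lionel Messi.
passage = (
    "Lionel Andrés Messi, born 24 June 1987, is an Argentine professional footballer "
    "who plays as a forward for and captains both Major League Soccer club Inter Miami "
    "and the Argentina national team."
)
input_text = f"when: {passage}"  # question-word prefix, as in the diff

input_ids = tokenizer(input_text, return_tensors="pt").input_ids
outputs = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The GPU, FP16, and INT8 variants in the diff differ only in how the model is loaded (`device_map="auto"`, `torch_dtype=torch.float16`, `load_in_8bit=True`) and in moving `input_ids` to `"cuda"`.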