---
language:
- en
- hi
- multilingual
tags:
- text2text-generation
widget:
- text: What are you doing?
  example_title: What are you doing
- text: It is raining heavily.
  example_title: It is raining heavily.
- text: How are you?
  example_title: How are you?
datasets:
- rvv-karma/English-Hinglish-TOP
license: apache-2.0
pipeline_tag: text2text-generation
---

# English2Hinglish-Flan-T5-Base

This is a fine-tuned version of [Flan T5 Base](https://huggingface.co/google/flan-t5-base), trained on the [English-Hinglish-TOP](https://huggingface.co/datasets/rvv-karma/English-Hinglish-TOP) dataset.

## Usage

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the tokenizer and model directly from the Hub
model_name = "rvv-karma/English2Hinglish-Flan-T5-Base"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Tokenize the English input and generate its Hinglish translation
input_text = "What are you doing?"
inputs = tokenizer(input_text, return_tensors="pt")
output_ids = model.generate(**inputs)
output_text = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(output_text)
```

## Fine-tuning script

[Google Colaboratory Notebook](https://colab.research.google.com/drive/11fUHem8r8qe_Ildh2_1XjOEY-Zyoy5j8?usp=sharing)

## References

[DataCamp](https://www.datacamp.com/tutorial/flan-t5-tutorial)
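As an alternative to loading the tokenizer and model separately as shown in the usage example, the model can also be run through the `transformers` `pipeline` API. This is a minimal sketch; the `max_new_tokens` and `num_beams` values are illustrative choices, not settings prescribed by this model card:

```python
from transformers import pipeline

# Load the model as a text2text-generation pipeline (matches pipeline_tag above)
translator = pipeline(
    "text2text-generation",
    model="rvv-karma/English2Hinglish-Flan-T5-Base",
)

# Generation parameters here are illustrative; tune them for your inputs
result = translator("It is raining heavily.", max_new_tokens=64, num_beams=4)
print(result[0]["generated_text"])
```

The pipeline returns a list with one dict per input, each containing a `generated_text` key, which makes it convenient for translating batches of sentences in one call.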