
Model Card for CoverGenie

The goal of this project is to build a fine-tuned mini-ChatGPT, named "CoverGenie", designed to generate resumes and cover letters from job descriptions in the tech field. By nature, this is a language generation task: the model takes a job description as an input sequence of text and turns it into a resume or cover letter with a structured, consistent style. The project may involve parameter-efficient fine-tuning, reinforcement learning, and prompt engineering to some extent.

Model Details

Model Description

  • Model type: T5 (Text-to-Text Transfer Transformer)
  • Language(s) (NLP): English
  • License: Apache-2.0
  • Finetuned from model: FLAN-T5 Large (google/flan-t5-large)

Uses

The model generates a cover letter given a job description and a candidate's resume as input.

How to Get Started with the Model

Use the code below to get started with the model.

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, GenerationConfig
import nltk

nltk.download('punkt')  # sentence tokenizer used for post-processing

max_source_length = 512

tokenizer = AutoTokenizer.from_pretrained("Hariharavarshan/Cover_genie")
model = AutoModelForSeq2SeqLM.from_pretrained("Hariharavarshan/Cover_genie")

JD = '''<Job description Text>'''
resume_text = '''<Resume Text>'''

# Build the instruction-style prompt from the job description and resume
final_text = ("give me a cover letter based on a job description and a resume. "
              "Job description: " + JD + " Resume: " + resume_text)

generation_config = GenerationConfig.from_pretrained("google/flan-t5-large", temperature=2.0)

inputs = tokenizer(final_text, max_length=max_source_length, truncation=True,
                   return_tensors="pt")
output = model.generate(**inputs, num_beams=3, do_sample=True, min_length=1000,
                        max_length=10000, generation_config=generation_config,
                        num_return_sequences=3)

# Decode the first of the returned candidates and split it into sentences
decoded_output = tokenizer.batch_decode(output, skip_special_tokens=True)[0]
generated_coverletter = nltk.sent_tokenize(decoded_output.strip())
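The snippet above leaves the generated letter as a list of sentences. A minimal sketch of joining it back into a single letter and saving it to disk (the helper name, file path, and one-blank-line-between-sentences layout are assumptions, not part of this model's API):

```python
# Hypothetical post-processing helper: join the sentence list produced by
# nltk.sent_tokenize into one letter and write it to a text file.
def save_cover_letter(sentences, path="cover_letter.txt"):
    letter = "\n\n".join(sentences)  # blank line between sentences (an assumption)
    with open(path, "w", encoding="utf-8") as f:
        f.write(letter)
    return letter

letter = save_cover_letter(["Dear Hiring Manager,", "I am excited to apply."])
```

From here, the saved file can be reviewed and lightly edited before sending, since sampled generations vary between runs.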

Developed by: Hariharavarshan, Jayathilaga, Sara, Meiyu

Model size: 783M parameters (F32, safetensors)