---
library_name: peft
base_model: mistralai/Mistral-7B-v0.1
pipeline_tag: text-generation
---
Description: Customer support call classification given a call transcript.
Original dataset: https://github.com/cricketclub/gridspace-stanford-harper-valley

---
Try querying this adapter for free in LoRA Land at https://predibase.com/lora-land!
The adapter_category is Topic Identification and the name is Customer Support Automation.

---
Sample input: Consider the case of a customer contacting the support center.\nThe term "task type" refers to the reason for why the customer contacted support.\n\n### The possible task types are: ### \n- replace card\n- transfer money\n- check balance\n- order checks\n- pay bill\n- reset password\n- schedule appointment\n- get branch hours\n- none of the above\n\nSummarize the issue/question/reason that drove the customer to contact support:\n\n### Transcript: <caller> [noise] <agent> [noise] <caller> [noise] <caller> [noise] hello <caller> hello <agent> hi i'm sorry this this call uh hello this is harper valley national bank my name is dawn how can i help you today <caller> hi <caller> oh okay my name is jennifer brown and i need to check my account balance if i could <caller> [noise] <caller> [noise] [noise] [noise] <agent> what account would you like to check <caller> um <caller> [noise] <caller> <unk> <caller> uhm my savings account <caller> please <caller> <unk> <caller> [noise] <caller> [noise] <caller> oh but the way that you're doing <agent> one moment <caller> hello <agent> yeah one moment <caller> uh huh <caller> no problem <caller> [noise] <agent> your account balance is eighty two dollars is there anything else i can help you with <caller> no i don't think so thank you so much you were very helpful <agent> thank you <caller> have a good day bye bye <caller> [noise] <agent> you too \n\n### Task Type:\n\n

---
Sample output: check balance

---
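The prompt follows a fixed template: a short task description, the list of allowed task types, the transcript, and a trailing `### Task Type:` marker. The `build_prompt` helper below is a hypothetical sketch (not part of this repository) that reproduces that template for a new transcript:

```python
# Hypothetical helper (not part of this repo): rebuilds the prompt template shown in the sample input above.
TASK_TYPES = [
    "replace card", "transfer money", "check balance", "order checks",
    "pay bill", "reset password", "schedule appointment", "get branch hours",
    "none of the above",
]

def build_prompt(transcript: str) -> str:
    task_types = "\n".join(f"- {t}" for t in TASK_TYPES)
    return (
        "Consider the case of a customer contacting the support center.\n"
        'The term "task type" refers to the reason for why the customer contacted support.\n\n'
        f"### The possible task types are: ### \n{task_types}\n\n"
        "Summarize the issue/question/reason that drove the customer to contact support:\n\n"
        f"### Transcript: {transcript}\n\n### Task Type:\n\n"
    )
```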
Try using this adapter yourself!
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mistral-7B-v0.1"
peft_model_id = "predibase/customer_support"

# Load the base model and tokenizer, then attach the LoRA adapter (requires `peft` to be installed)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
model.load_adapter(peft_model_id)
```
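
To get a prediction, tokenize a prompt in the format above and decode the model's continuation. The sketch below assumes the `tokenizer` and `model` from the previous block (plus the hypothetical `build_prompt` helper); the generation settings are illustrative, not values published with this adapter:

```python
import torch

# Build the prompt for a transcript (here a shortened stand-in for the sample transcript above).
prompt = build_prompt("<caller> hello ... i need to check my account balance ... <agent> you too")
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=8, do_sample=False)

# Decode only the newly generated tokens; for the sample transcript this should read "check balance".
prediction = tokenizer.decode(outputs[0, inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(prediction.strip())
```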