---
library_name: peft
license: wtfpl
language:
- en
pipeline_tag: text-generation
---
## Model description
The tiiuae/falcon-7b model fine-tuned for paraphrasing, changing the tone of an input sentence (to casual/professional/witty), and generating a summary and topic from a dialogue.
Data for paraphrasing and tone change was generated using gpt-35-turbo, and a sample of roughly 1,000 data points from the
[Dialogsum](https://github.com/cylnlp/dialogsum) dataset was used for summary and topic generation.
See the [llm-toys](https://github.com/kuutsav/llm-toys) repo for usage and other details.
Try it in Colab (you might need the Pro version):
<a target="_blank" href="https://colab.research.google.com/drive/1hhANNzQkxhrPIIrxtvf0WT_Ste8KrFjh#scrollTo=d6-OJJq_q5Qr">
<img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
</a>
## Installation
```bash
pip install llm-toys
```
```python
from llm_toys.tasks import GeneralTaskAssitant
from llm_toys.config import TaskType
gta = GeneralTaskAssitant()
gta.complete(TaskType.PARAPHRASE_TONE, "Hey, can yuo hepl me cancel my last order?")
# "Could you assist me in canceling my previous order?"
gta.complete(TaskType.PARAPHRASE_TONE, "Hey, can yuo hepl me cancel my last order?", tone="casual")
# "Hey, can you help me cancel my last order?"
gta.complete(TaskType.PARAPHRASE_TONE, "Hey, can yuo hepl me cancel my last order?", tone="professional")
# "I would appreciate if you could assist me in canceling my previous order."
gta.complete(TaskType.PARAPHRASE_TONE, "Hey, can yuo hepl me cancel my last order?", tone="witty")
# "Oops! Looks like I got a little carried away with my shopping spree. Can you help me cancel my last order?"
chat = """
#Person1#: I'm so excited for the premiere of the latest Studio Ghibli movie!
#Person2#: What's got you so hyped?
#Person1#: Studio Ghibli movies are pure magic! The animation, storytelling, everything is incredible.
#Person2#: Which movie is it?
#Person1#: It's called "Whisper of the Wind." It's about a girl on a magical journey to save her village.
#Person2#: Sounds amazing! I'm in for the premiere.
#Person1#: Great! We're in for a visual masterpiece and a heartfelt story.
#Person2#: Can't wait to be transported to their world.
#Person1#: It'll be an unforgettable experience, for sure!
""".strip()
gta.complete(TaskType.DIALOGUE_SUMMARY_TOPIC, chat)
# {"summary": "#Person1# tells #Person2# about the upcoming Studio Ghibli movie.
# #Person1# thinks it's magical and #Person2#'s excited to watch it.",
# "topic": "Movie premiere"}
```
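If you prefer not to use `llm-toys`, the adapter can also be loaded directly with `transformers` and `peft`. The snippet below is only a sketch: the adapter repo id is a placeholder, and the prompt must follow the templates used during training (see the llm-toys repo for those).
```python
# Minimal sketch of loading the adapter with transformers + peft (not the llm-toys API).
# The ADAPTER id is a placeholder; replace it with this repo's id on the Hub.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

BASE_MODEL = "tiiuae/falcon-7b"
ADAPTER = "<this-repo-id>"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
base = AutoModelForCausalLM.from_pretrained(
    BASE_MODEL,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,  # falcon needed this with older transformers releases
)
model = PeftModel.from_pretrained(base, ADAPTER)

# Prompt formatting must match the training templates from the llm-toys repo;
# the plain string below is only a shape check.
inputs = tokenizer("Paraphrase: Hey, can yuo hepl me cancel my last order?", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```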
## Sample training data
```json
[
{
"original": "If you have any further questions, feel free to ask.",
"casual": "Got more questions? Feel free to ask away. I'm here to help!",
"professional": "Should you have any additional inquiries, please don't hesitate to ask.",
"witty": "Curiosity is always in style! If you have more mysteries to solve, I'm all ears!",
"paraphrase": "Don't hesitate to ask if you have any more questions."
},
{
"fname": "dev_473",
"dialogue": "#Person1#: Did you enjoy your weekend at the highland hotel? I heard it's and excellent place to stay and has good facilities.\n#Person2#: I had a wonderful time. The rooms are not very big, but they are well furnished. The restaurant is excellent and reasonably priced. There's a sauna and a Jacuzzi.\n#Person1#: Do they have a swimming pool?\n#Person2#: No, they don't. they have a beauty parlor, but I didn't go there.\n#Person1#: What's the service like?\n#Person2#: It's very good. Check in and check out at the reception only took a few minutes. The wait staff is very good. A waiter recommended their baked fish, which tasted wonderful. The hotel was quite full, so I'd suggest making a reservation if you intend to go there. The hotel offers a discount at the weekends.\n#Person1#: It sounds perfect. Did you have any complaints at all?\n#Person2#: There was a problem with the internet access, so I couldn't check my email, but I didn't complain about it to the management.\n#Person1#: I suppose you were happy to forget about the outside world.\n#Person2#: Yes, I was. Here's their business card.\n#Person1#: Thanks. Was there a mina bar in the room?\n#Person2#: No, there wasn't. There is a bar on the ground floor and of course you can buy drinks in the restaurant to go with your meal.\n#Person1#: One of the things I dislike about hotels is that everyone expects tips.\n#Person2#: I know. At the inland hotel, they have an interesting policy. When you check out, you put some money in a special box at reception. Each evening, the money in the box is shared equally by the hotel staff.",
"summary": "#Person2# enjoys #Person2#'s weekend at the highland hotel because of the hotel's excellent and reasonably priced restaurant and good service. #Person2# introduces the hotel's facilities, weekend discount, and its interesting tip policy and suggests #Person1# make a reservation in advance.",
"topic": "Experience in hotel"
}
]
```
## Training params
```json
{
"batch_size": 1,
"eval_ratio": 0.05,
"eval_steps": 100,
"gradient_accumulation_steps": 4,
"learning_rate": 0.0001,
"logging_steps": 100,
"lora_alpha": 32,
"lora_dropout": 0.05,
"lora_r": 16,
"max_length": 1024,
"model_name": "tiiuae/falcon-7b",
"num_train_epochs": 3,
"seed": 10,
"task_type": "paraphrase_tone,dialogue_summary_topic",
"use_aim": True
}
```
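For reference, the `lora_*` values above correspond to a `peft` `LoraConfig` roughly like the sketch below. The `target_modules` choice (falcon's fused `query_key_value` projection) is an assumption; the actual training script lives in the llm-toys repo.
```python
# Sketch: mapping the lora_* params above onto a peft LoraConfig.
# target_modules is an assumption (falcon's fused attention projection).
from peft import LoraConfig

lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
    target_modules=["query_key_value"],
)
```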
## Training curve
![train_eval_loss](falcon-7b-paraphrase-tone-dialogue-summary-topic.jpeg)
## Training procedure
The following `bitsandbytes` quantization config was used during training:
- load_in_8bit: False
- load_in_4bit: True
- llm_int8_threshold: 6.0
- llm_int8_skip_modules: None
- llm_int8_enable_fp32_cpu_offload: False
- llm_int8_has_fp16_weight: False
- bnb_4bit_quant_type: nf4
- bnb_4bit_use_double_quant: True
- bnb_4bit_compute_dtype: bfloat16
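These flags correspond to a `transformers` `BitsAndBytesConfig` along the lines of the sketch below, used when loading the base model in 4-bit before attaching the adapter.
```python
# Sketch: the quantization flags above expressed as a BitsAndBytesConfig
# for loading the base model in 4-bit.
import torch
from transformers import BitsAndBytesConfig

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)
```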
### Framework versions
- PEFT 0.4.0.dev0