Planned.AI (planned day): A Personalized Trip Planner Model for Tunisia

Overview

This repository contains a personalized trip planner tool built on a fine-tuned version of the gemma-2b-it base model, used through the Hugging Face Transformers library. The tool generates tailored trip itineraries based on each user's preferences and specified destinations. The model leverages a dataset of places scraped from across Tunisia to provide comprehensive and personalized recommendations.

Model Description

The personalized trip planner uses a fine-tuned version of the gemma-2b-it base model, loaded with the Hugging Face Transformers library. The fine-tuned model has about 2.5 billion parameters and is published in FP16 safetensors format. It has been trained on a dataset of attractions, landmarks, and destinations from across Tunisia; by combining user preferences with the chosen destination, it generates personalized trip plans that cater to individual interests and requirements.

Usage

To utilize the Personalized Trip Planner tool, follow these steps:

  1. Install the Hugging Face Transformers library:
pip install transformers
  2. Load the fine-tuned model and tokenizer:
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model
model = AutoModelForCausalLM.from_pretrained("SadokBarbouche/planned.AI-gemma-2b-it")

# Load the tokenizer
tokenizer = AutoTokenizer.from_pretrained("SadokBarbouche/planned.AI-gemma-2b-it")
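
  3. Generate a trip plan. The snippet below is a minimal sketch of how the model might be prompted once it is loaded; the prompt wording and generation settings (max_new_tokens, temperature) are illustrative and not prescribed by this repository:

# Describe the user's preferences and destination in the prompt (example values only)
prompt = "Plan a 3-day trip to Tunis and Sidi Bou Said for a traveler who loves history, local food, and the seaside."

# Tokenize the prompt and generate a personalized itinerary
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))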

Data Preparation

The model training data comprises scraped information about various attractions and landmarks from Tunisia. The dataset was carefully curated to encompass a diverse range of destinations, ensuring the model's ability to generate comprehensive trip plans.
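
As an illustration of the kind of record a scraped-places dataset typically contains, the sketch below shows a hypothetical entry; the field names and values are illustrative and do not reflect the actual dataset schema:

# Hypothetical example of a single scraped place record (field names are illustrative)
place = {
    "name": "Medina of Tunis",
    "city": "Tunis",
    "category": "historic site",
    "rating": 4.7,
    "description": "UNESCO-listed old town with souks, mosques, and traditional architecture.",
}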

Evaluation

The personalized trip planner was evaluated on its ability to generate relevant, coherent, and personalized trip plans tailored to user preferences and specified destinations. The evaluation results demonstrate the effectiveness of the fine-tuned model in providing valuable recommendations for travelers.
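
As a rough illustration, a qualitative spot check of this kind can be run by generating plans for a handful of sample requests and reviewing them by hand; the prompts below are illustrative, and this is not the exact evaluation procedure used for this model:

# Generate and inspect plans for a few sample requests (illustrative spot check only)
test_prompts = [
    "Plan a weekend in Djerba for a family with young children.",
    "Plan a 2-day cultural trip to Kairouan for a history enthusiast.",
]
for prompt in test_prompts:
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=400)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
    print("-" * 80)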

Acknowledgements

We would like to express our gratitude to the contributors of the google-maps-scraper tool on GitHub, as well as to the developers of the Hugging Face Transformers library, for their support in model integration and usage.
