Fine-Tuned Meditation Text Generation Model

This model is fine-tuned for generating text related to meditation and mindfulness topics. It is compatible with the Hugging Face Transformers library and is optimized for text generation tasks.

Intended Use

This model is designed to assist users by generating informative or calming text related to meditation, mindfulness, and relaxation practices. It can be used to create content for meditation guides, descriptions, or other wellness-oriented resources.

Example Usage with Hugging Face Transformers

To use this model for text generation, you can load it directly with the Hugging Face pipeline and generate responses based on prompts related to meditation and mindfulness.

Code Example

Install the required libraries if you haven’t already:

pip install transformers torch

from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

# Load the model and tokenizer
model_name = "Phoenix21/fine-tuned-meditation-model"  # Replace with your model path on Hugging Face
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Create a text generation pipeline
generator = pipeline("text-generation", model=model, tokenizer=tokenizer)

# Example prompt
prompt = "Meditation is a powerful tool for managing stress because"
output = generator(prompt, max_length=100, do_sample=True, temperature=0.7)

# Print generated text
print(output[0]["generated_text"])
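
For finer control over decoding than the pipeline exposes, you can call model.generate directly. The sketch below reuses the tokenizer, model, and prompt defined above; the sampling parameters (max_new_tokens, top_p) are illustrative values, not tuned recommendations.

# Tokenize the prompt and generate with explicit decoding parameters
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_new_tokens=80,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,  # avoids a padding warning on GPT-style models
)

# Decode the generated token IDs back into text
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))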
Model Details

Format: Safetensors
Model size: 125M parameters
Tensor type: F32