
HelpingAI-Lite-chat

HelpingAI-Lite-chat is a conversational model with about 1.3 billion parameters (1.31B, BF16 Safetensors). It is fine-tuned from HelpingAI and Falcon.


Subscribe to my YouTube channel


🎯 Purpose

HelpingAI-Lite-chat aims to add conversational capabilities to the HelpingAI-Lite model. This initiative is driven by the need for a smaller, open-source, instruction-finetuned, ready-to-use model suitable for users with limited computational resources, such as lower-end consumer GPUs.
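As a rough estimate (assuming the published 1.31B parameter count and BF16 weights), the weights alone occupy about 1.31e9 × 2 bytes ≈ 2.6 GB, leaving headroom for activations and the KV cache on typical 6–8 GB consumer cards.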

📖 Example Code

from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

model_name = "OEvortex/HelpingAI-Lite-chat"

# Load the tokenizer and the model in bfloat16, letting device_map="auto" place it on the available device(s)
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, device_map="auto", torch_dtype=torch.bfloat16
)

# Multi-turn conversation; the last user turn is the prompt to be answered
chat_history = [
    {"role": "user", "content": "Hello!"},
    {"role": "assistant", "content": "Hello! How can I assist you today?"},
    {"role": "user", "content": "Explain what AI is."},
]

# Render the conversation with the model's chat template and append the generation prompt
input_ids = tokenizer.apply_chat_template(
    chat_history, tokenize=True, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Sample a response
output_tokens = model.generate(
    input_ids,
    do_sample=True,
    temperature=0.7,
    repetition_penalty=1.05,
    max_new_tokens=200,
)

# Decode only the newly generated tokens, skipping the prompt
output_text = tokenizer.decode(
    output_tokens[0][len(input_ids[0]):], skip_special_tokens=True
)
print(output_text)
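
On lower-end GPUs, the model can also be loaded with 4-bit quantization to cut memory use further. The following is a minimal sketch, not part of the original card; it assumes the optional bitsandbytes package is installed and a CUDA GPU is available.

from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig
import torch

model_name = "OEvortex/HelpingAI-Lite-chat"

# 4-bit NF4 quantization with bfloat16 compute (requires bitsandbytes; an assumption, not part of the original card)
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, device_map="auto", quantization_config=bnb_config
)
# Generation then works exactly as in the example above.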

⚠️ Limitations

This model may generate inaccurate or misleading information and is prone to hallucination, producing plausible but false narratives. It cannot reliably distinguish factual content from fiction and may inadvertently produce biased, harmful, or offensive content. Its understanding of complex, nuanced queries is limited. Users should keep this in mind and verify any information obtained from the model.

The model is provided 'as is' without any warranties, and the creators are not liable for any damages arising from its use. Users are responsible for their interactions with the model.
