---
license: apache-2.0
datasets:
- RedHenLabs/qa-news-2016
language:
- en
library_name: transformers
pipeline_tag: text-generation
---

Quantized GGUF version of the News Reporter 3B LLM.

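As a minimal sketch, a GGUF quantization like this one can typically be run locally with `llama-cpp-python`. The file name, chat template, and sampling settings below are illustrative assumptions (the prompt format follows the Phi-3 instruct convention), not values taken from this repository:

```python
from llama_cpp import Llama

# Hypothetical GGUF file name; substitute the actual quantized file from this repo.
llm = Llama(
    model_path="news-reporter-3b.Q4_K_M.gguf",
    n_ctx=4096,  # context window of the Phi-3 Mini-4K base model
)

# Phi-3-style chat prompt (assumed format).
prompt = (
    "<|user|>\n"
    "Summarize the key points of today's top story.<|end|>\n"
    "<|assistant|>\n"
)

output = llm(
    prompt,
    max_tokens=256,
    temperature=0.7,
    stop=["<|end|>"],
)
print(output["choices"][0]["text"])
```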

## Model Description

News Reporter 3B LLM is based on Phi-3 Mini-4K Instruct, a dense decoder-only Transformer model designed to generate high-quality text from user prompts. With 3.8 billion parameters, the model is aligned with human preferences through Supervised Fine-Tuning (SFT) on curated question-answer pairs.

### Key Features:
- Parameter Count: 3.8 billion.
- Architecture: Dense decoder-only Transformer.
- Context Length: Supports up to 4K tokens.
- Training Data: 43.5K+ question-and-answer pairs curated from different news channels.

## Model Benchmarking