Model Card for Dhruv-27b-preview


Model Overview

Dhruv-27B-preview is a 27-billion-parameter language model based on Gemma3 and fine-tuned for advanced reasoning tasks. It is part of the Gemini Reasoning Series and is optimized for strong performance on academic, logical, and factual evaluations. The model supports deep contextual reasoning and chain-of-thought generation, making it suitable for research, enterprise applications, and AI agents that require robust general-knowledge understanding.


Key Metrics

Dhruv-27B-preview achieves strong performance on key reasoning and knowledge benchmarks:

+------------------------+--------+
| Benchmark              | Score  |
+------------------------+--------+
| MMLU (5-shot)          | 80.0%  |
| GPQA (0-shot)          | 52.0%  |
+------------------------+--------+
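
As a rough reference, the sketch below shows one common way such scores are reproduced with the EleutherAI lm-evaluation-harness. The repository id, task name, and batch size are assumptions for illustration; the exact harness version and task variants used for this card are not stated.

    # Hypothetical evaluation sketch using lm-evaluation-harness (pip install lm-eval).
    # Settings below are illustrative assumptions, not this card's official recipe.
    import lm_eval

    results = lm_eval.simple_evaluate(
        model="hf",
        model_args="pretrained=vicharai/Dhruv-27B,dtype=bfloat16",  # assumed repo id
        tasks=["mmlu"],   # MMLU, 5-shot as reported in the table above
        num_fewshot=5,
        batch_size=8,
    )
    print(results["results"]["mmlu"])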

Model Architecture

  • Base Model: Gemma3
  • Model Size: 27B parameters
  • Type: Decoder-only Transformer (causal LM)
  • Precision: bf16, with int8 quantization for inference efficiency (see the loading sketch after this list)
  • Training Objective: Instruction-tuned with emphasis on reasoning, question answering, and factual correctness
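
The precision bullet above mentions int8 quantization for inference. Below is a minimal loading sketch using transformers with bitsandbytes 8-bit quantization; the repository id vicharai/Dhruv-27B and the quantization settings are assumptions for illustration rather than a verified recipe.

    # Minimal loading sketch (requires transformers, accelerate, and bitsandbytes).
    # Repository id and quantization settings are assumptions, not an official recipe.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

    model_id = "vicharai/Dhruv-27B"  # assumed repository id

    # int8 weight quantization for inference efficiency, per the architecture notes
    quant_config = BitsAndBytesConfig(load_in_8bit=True)

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.bfloat16,        # bf16 weights per the model card
        quantization_config=quant_config,  # int8 for inference efficiency
        device_map="auto",
    )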

Intended Use

  • Research and academic QA tasks
  • General-purpose reasoning agents
  • Multilingual factual reasoning
  • Enterprise AI tools requiring high factual accuracy and depth

Usage

  • Use a system template (system prompt) when querying the model; a minimal sketch follows this list.
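
A minimal usage sketch with transformers follows, assuming the tokenizer ships a chat template that accepts a system role; the repository id, system prompt, and generation settings are illustrative assumptions rather than values prescribed by this card.

    # Minimal usage sketch with a system template (illustrative, not an official recipe).
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "vicharai/Dhruv-27B"  # assumed repository id
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )

    messages = [
        # Assumed system prompt; the card only says to use a system template.
        {"role": "system", "content": "You are a careful reasoning assistant."},
        {"role": "user", "content": "Explain why the sky appears blue."},
    ]

    # The chat template bundled with the tokenizer formats the system and user turns.
    inputs = tokenizer.apply_chat_template(
        messages, add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)

    outputs = model.generate(inputs, max_new_tokens=256)
    print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))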

Limitations

  • Not optimized for creative generation or dialog tasks.
  • May hallucinate in areas with limited training data.
  • Requires fine-grained prompt engineering for complex instructions.

Citation

@misc{vicharai_dhruv27b,
  title={Dhruv-27B: Gemma3 Reasoning Model},
  author={vicharai},
  year={2025},
  publisher={vicharai},
  url={https://vichar.io}
}

Developed by vichar ai (https://vichar.io).
