Overview
Phi-4 is a state-of-the-art 14B-parameter Transformer designed for advanced reasoning, conversational AI, and high-quality text generation. Trained on a mix of synthetic datasets, filtered public-domain content, academic books, and Q&A datasets, Phi-4 emphasizes data quality and alignment. It supports a 16K-token context length and was trained on 9.8T tokens over 21 days using 1,920 H100-80GB GPUs. The model underwent rigorous fine-tuning and preference optimization to improve instruction adherence and safety. Released on December 12, 2024, Phi-4 is a static model with a data cutoff of June 2024, suitable for a wide range of research and dialogue applications.
Variants
| No. | Variant | Cortex CLI command |
| --- | ------- | ------------------ |
| 1   | gguf    | `cortex run phi-4` |
Use it with Jan (UI)
- Install Jan using Quickstart
- Use in Jan model Hub:
  ```text
  cortexso/phi-4
  ```
Use it with Cortex (CLI)
- Install Cortex using Quickstart
- Run the model with the command:
  ```bash
  cortex run phi-4
  ```
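Once the model is running, Cortex serves it through an OpenAI-compatible API on your machine, so you can query Phi-4 programmatically. The sketch below is a minimal example, not part of the official docs: it assumes the default local endpoint `http://127.0.0.1:39281/v1` and the model id `phi-4`; adjust both to match your Cortex installation.

```python
# Minimal sketch: chat with a locally running Phi-4 via Cortex's
# OpenAI-compatible endpoint.
# Assumptions (verify against your setup): the Cortex server listens on
# http://127.0.0.1:39281/v1 and the model is registered as "phi-4".
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:39281/v1",  # assumed default Cortex API address
    api_key="not-needed",                  # local server; key is a placeholder
)

response = client.chat.completions.create(
    model="phi-4",  # model id as started with `cortex run phi-4`
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what Phi-4 is in one sentence."},
    ],
)

print(response.choices[0].message.content)
```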
Credits
- Author: Microsoft Research
- Converter: Homebrew
- Original License: License
- Papers: Phi-4 Technical Report