A GGUF-optimized version of the Phi-4 model, designed for efficient text generation and reasoning with a 16K token context length, supporting low-latency inference on various hardware platforms.
Phi-4 GGUF is a version of Microsoft Research's Phi-4 model packaged in the GGUF format for efficient inference. Phi-4 is a 14B-parameter Transformer trained on high-quality synthetic, public-domain, and academic datasets to strengthen advanced reasoning, instruction following, and general-purpose AI capabilities. The model supports a 16K-token context length and is optimized for fast, low-latency inference in memory-constrained environments.
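As an illustration of how a GGUF build like this can be run locally, the sketch below uses the llama-cpp-python bindings, one common way to load GGUF models. The library choice, the file name phi-4-Q4_K_M.gguf, and the quantization variant are assumptions made for illustration; they are not specified by this listing.

```python
# Minimal sketch: running a Phi-4 GGUF file with llama-cpp-python.
# The model path and quantization variant below are assumptions, not part of this listing.
from llama_cpp import Llama

llm = Llama(
    model_path="phi-4-Q4_K_M.gguf",  # hypothetical local path to the GGUF file
    n_ctx=16384,                     # matches the advertised 16K-token context length
    n_threads=8,                     # tune to the host CPU
)

response = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "You are a concise, helpful assistant."},
        {"role": "user", "content": "Summarize why quantized GGUF models reduce memory use."},
    ],
    max_tokens=128,
)
print(response["choices"][0]["message"]["content"])
```

Lower-bit quantizations trade some accuracy for a smaller memory footprint, which is what makes the GGUF format suitable for the memory-constrained environments mentioned above.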
MIT
Microsoft
Text Generation
N.A.
Open
Sector Agnostic
12/03/25 06:35:20