A state-of-the-art Mixture of Experts (MoE) model optimized for efficient text generation, multilingual reasoning, and instruction adherence, designed for both research and commercial AI applications.
Phi-3.5-MoE-Instruct is a large-scale, multilingual language model from Microsoft that uses a Mixture of Experts (MoE) architecture to improve efficiency, reasoning, and performance while reducing computational cost. It is trained on high-quality synthetic and filtered datasets, supports a 128K-token context length, and is optimized for instruction following, problem solving, and generative AI applications.
MIT
Microsoft
Mixture of Experts (MoE) Language Model
N.A.
Open
Sector Agnostic
12/03/25 06:35:22