Government Of India

Phi-3.5-MoE-Instruct - Mixture of Experts AI Model for Reasoning and Multilingual AI

A state-of-the-art Mixture of Experts (MoE) model optimized for efficient text generation, multilingual reasoning, and instruction adherence, designed for both research and commercial AI applications.

About Model

Phi-3.5-MoE-Instruct is a large-scale, multilingual AI model from Microsoft that uses a Mixture of Experts (MoE) architecture to improve efficiency, reasoning, and performance while reducing computational cost. It is trained on high-quality synthetic and filtered datasets, supports a 128K-token context length, and is optimized for instruction following, problem solving, and generative AI applications.
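Instruction-tuned checkpoints of this kind can typically be driven through the Hugging Face transformers chat pipeline. A minimal sketch, assuming the model is published on the Hugging Face Hub under the id microsoft/Phi-3.5-MoE-instruct (the id, and the GPU memory needed to load a checkpoint of this size, are assumptions not stated on this page):

```python
MODEL_ID = "microsoft/Phi-3.5-MoE-instruct"  # assumed Hugging Face Hub id


def build_messages(system_prompt: str, user_prompt: str) -> list:
    """Chat-format message list expected by instruct-tuned models."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]


def generate(user_prompt: str, max_new_tokens: int = 256) -> str:
    """Run one instruction-following generation.

    Heavy imports are kept local so build_messages() stays usable without
    transformers installed; loading the checkpoint itself requires
    substantial GPU memory.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype="auto", device_map="auto", trust_remote_code=True
    )
    pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
    messages = build_messages("You are a helpful assistant.", user_prompt)
    out = pipe(messages, max_new_tokens=max_new_tokens, do_sample=False)
    # For chat-format input, generated_text holds the full message list;
    # the last entry is the assistant's reply.
    return out[0]["generated_text"][-1]["content"]
```

Usage would be a single call such as `generate("Summarize the benefits of MoE architectures.")`; greedy decoding (`do_sample=False`) is chosen here only to keep the sketch deterministic.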


Metadata

  • License: MIT
  • Organisation: Microsoft
  • Model Type: Mixture of Experts (MoE) Language Model
  • Version: N.A.
  • Access: Open
  • Sector: Sector Agnostic
  • Date: 12/03/25 06:35:22
  • Upvotes: 0

Activity Overview

  • Downloads: 0
  • Redirects: 7
  • File Size: 0
  • Views: 221

Tags

  • Mixture of Experts
  • Instruction Following
  • Multilingual
  • Text Generation
  • Reasoning
  • Transformers
  • NLP
  • Microsoft

License Control

MIT

More Models from Microsoft Corporation (India) Pvt. Ltd.

TAPEX: Large SQL Execution Model (Table Pre-training via Learning a Neural SQL Executor)
A large-sized TAPEX model pre-trained to simulate neural SQL execution, enabling the execution of SQL queries on given tables.
Tags: Transformers, SQLExecution, PreTrainedModel, TAPEX, DataRetrieval, NeuralExecutor, BART
  • Upvoters: 0
  • Downloads: 6
  • File Size: 0
  • Views: 187
Updated 9 months ago

MICROSOFT CORPORATION (INDIA) PVT. LTD.

TAPEX: Large Model (Table Pre-training via Learning a Neural SQL Executor)
A large-sized pre-trained model designed to enhance table-based question answering and fact verification tasks.
Tags: BART, TableQuestionAnswering, FactVerification, PreTrainedModel, LargeModel
  • Upvoters: 0
  • Downloads: 5
  • File Size: 0
  • Views: 144
Updated 9 months ago


TAPEX: TabFact Data enabled Large Finetuned (Table Pre-training via Learning a Neural SQL Executor) Model
A large-sized TAPEX model fine-tuned on the TabFact dataset, designed to enhance performance in table-based fact verification tasks.
Tags: FactVerification, NaturalLanguageProcessing, Transformers, BART, DataValidation, FineTunedModel, TabFact, TAPEX
  • Upvoters: 0
  • Downloads: 5
  • File Size: 0
  • Views: 109
Updated 9 months ago


TAPEX (Table Pre-training via Learning a Neural SQL Executor) Large Finetuned Model
A large-sized TAPEX model fine-tuned on the WikiTableQuestions dataset, designed to enhance performance in table-based question answering tasks.
Tags: TAPEX, TableQuestionAnswering, NaturalLanguageProcessing, Transformers, BART, DataExtraction, FineTunedModel, WikiTableQuestions
  • Upvoters: 0
  • Downloads: 2
  • File Size: 0
  • Views: 148
Updated 9 months ago

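The TAPEX checkpoints listed above answer natural-language questions over tables. One plausible way to query such a checkpoint is through the TapexTokenizer in transformers, sketched below with an assumed Hub id of microsoft/tapex-large-finetuned-wtq (the id and its availability are assumptions, not taken from this listing):

```python
def make_table(columns: dict) -> dict:
    """Validate a column-oriented table: every column needs equal length."""
    lengths = {len(v) for v in columns.values()}
    if len(lengths) > 1:
        raise ValueError("all columns must have the same number of rows")
    return columns


def answer(query: str, columns: dict) -> str:
    """Ask a TAPEX model a question about a small table.

    Heavy imports are kept local; downloading the checkpoint is assumed
    to be possible in the caller's environment.
    """
    import pandas as pd
    from transformers import BartForConditionalGeneration, TapexTokenizer

    model_id = "microsoft/tapex-large-finetuned-wtq"  # assumed Hub id
    tokenizer = TapexTokenizer.from_pretrained(model_id)
    model = BartForConditionalGeneration.from_pretrained(model_id)

    table = pd.DataFrame.from_dict(make_table(columns))
    encoding = tokenizer(table=table, query=query, return_tensors="pt")
    outputs = model.generate(**encoding)
    return tokenizer.batch_decode(outputs, skip_special_tokens=True)[0]
```

A call might look like `answer("which city hosted in 2012?", {"year": ["2008", "2012"], "city": ["Beijing", "London"]})`; TAPEX flattens the table into the encoder input, so small tables work best.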

TAPEX: Base Model (Table Pre-training via Learning a Neural SQL Executor)
A base-sized pre-trained model designed to enhance table-based question answering and fact verification tasks.
Tags: BART, TableQuestionAnswering, FactVerification, PreTrainedModel, TabularData
  • Upvoters: 0
  • Downloads: 3
  • File Size: 0
  • Views: 105
Updated 9 months ago


TAPEX: WikiTable Questions Data enabled Base Finetuned (Table Pre-training via Learning a Neural SQL Executor) Model
A base-sized TAPEX model fine-tuned on the WikiTableQuestions dataset, designed to enhance performance in table-based question answering tasks.
Tags: NaturalLanguageProcessing, TableQuestionAnswering, TAPEX, WikiTableQuestions, FineTunedModel, DataExtraction, BART, Transformers
  • Upvoters: 0
  • Downloads: 3
  • File Size: 0
  • Views: 137
Updated 9 months ago


TAPEX: WikiSQL Data enabled Base Finetuned (Table Pre-training via Learning a Neural SQL Executor) Model
A base-sized TAPEX model fine-tuned on the WikiSQL dataset, optimized for translating natural-language questions into SQL queries for effective table-based question answering.
Tags: Transformers, NaturalLanguageProcessing, SQLQueryGeneration, TAPEX, WikiSQL, FineTunedModel, DataRetrieval, BART
  • Upvoters: 0
  • Downloads: 5
  • File Size: 0
  • Views: 118
Updated 9 months ago


TAPEX: TabFact Data enabled Base Finetuned (Table Pre-training via Learning a Neural SQL Executor) Model
A base-sized TAPEX model fine-tuned on the TabFact dataset, tailored for verifying the factual accuracy of textual statements against tabular data.
Tags: FactVerification, TAPEX, TabFact, FineTunedModel, DataValidation, BART, Transformers, NaturalLanguageProcessing
  • Upvoters: 0
  • Downloads: 4
  • File Size: 0
  • Views: 105
Updated 9 months ago


TAPEX: Base Finetuned (Table Pre-training via Learning a Neural SQL Executor) Model
A base-sized TAPEX model fine-tuned on the WikiSQL dataset, designed to enhance performance in table-based question answering tasks.
Tags: DataExtraction, NaturalLanguageProcessing, Transformers, BART, TableQuestionAnswering, FineTunedModel, WikiSQL, TAPEX
  • Upvoters: 0
  • Downloads: 5
  • File Size: 0
  • Views: 148
Updated 9 months ago


BiomedBERT - Domain-Specific Biomedical Language Model
A biomedical NLP model pre-trained from scratch on abstracts and full-text articles from PubMed and PubMed Central, achieving state-of-the-art performance on biomedical language understanding tasks.
Tags: Transformers, inference endpoints, exbert, Bert, English, JAX, PyTorch, Fill-Mask
  • Upvoters: 0
  • Downloads: 88
  • File Size: 0
  • Views: 1,324
Updated 1 year ago

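BiomedBERT is a fill-mask (masked language) model, so a typical probe masks one token of a biomedical sentence and asks the model for candidate completions. A minimal sketch, assuming the checkpoint is hosted on the Hugging Face Hub under Microsoft's BiomedNLP namespace (the exact id is an assumption, not taken from this listing):

```python
def mask_token(sentence: str, word: str, mask: str = "[MASK]") -> str:
    """Replace the first occurrence of `word` with the BERT mask token."""
    if word not in sentence:
        raise ValueError(f"{word!r} not found in sentence")
    return sentence.replace(word, mask, 1)


def top_fills(sentence: str, word: str) -> list:
    """Return the model's top candidates for the masked-out word.

    transformers is imported locally; the checkpoint id below is an
    assumed Hub id for BiomedBERT.
    """
    from transformers import pipeline

    unmasker = pipeline(
        "fill-mask",
        model="microsoft/BiomedNLP-BiomedBERT-base-uncased-abstract-fulltext",
    )
    return [r["token_str"] for r in unmasker(mask_token(sentence, word))]
```

For example, `top_fills("aspirin inhibits platelet aggregation", "aggregation")` would return the model's ranked guesses for the masked word.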