A 1.3B-parameter Transformer model specialized for Python code generation, trained on a mix of StackOverflow posts, competition code, Python textbooks, and synthetic datasets, and achieving over 50% pass@1 accuracy on the HumanEval benchmark.
Phi-1 is a lightweight, code-specialized Transformer model developed by Microsoft and designed for basic Python coding tasks. Despite its small parameter count and training dataset relative to contemporary large language models (LLMs), it generates functional Python code reliably, making it a useful tool for developers, researchers, and education.
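The HumanEval score cited above measures functional correctness: each generated completion is executed against the problem's unit tests, and a sample counts as a pass only if every assertion succeeds. A minimal sketch of that style of check, where the toy problem, tests, and candidate completions are illustrative inventions rather than actual benchmark content:

```python
# Toy illustration of HumanEval-style functional-correctness scoring.
# The task and candidates below are made up; the real benchmark has
# 164 hand-written programming problems with hidden unit tests.

def check_candidate(candidate_src: str, test_src: str) -> bool:
    """Execute a candidate solution, then its unit tests; return pass/fail."""
    env = {}
    try:
        exec(candidate_src, env)   # define the candidate function
        exec(test_src, env)        # run the assertions against it
        return True
    except Exception:
        return False               # any error or failed assert = fail

# One toy problem: implement incr(x) -> x + 1, with its unit tests.
tests = "assert incr(0) == 1\nassert incr(41) == 42"

candidates = [
    "def incr(x):\n    return x + 1",   # functionally correct
    "def incr(x):\n    return x - 1",   # wrong logic
]

results = [check_candidate(c, tests) for c in candidates]
pass_rate = sum(results) / len(results)  # fraction of samples that pass
print(results, pass_rate)  # [True, False] 0.5
```

In the real benchmark, pass@1 aggregates this check over all problems with one sample each, which is the figure the summary reports for Phi-1.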
MIT
Microsoft
Text Generation
N.A.
Open
Sector Agnostic