Param-1-5B is a bilingual (English–Hindi) large language model in the Param-1 family. With 5 billion parameters, it extends the capabilities of Param-1-2.9B by adding enhanced mathematical reasoning and code understanding/generation. The model is pretrained from scratch and is designed to serve as a strong foundation for downstream tasks such as mathematical problem solving and code understanding and generation.
* 5B-parameter dense Transformer model
* Bilingual: English and Hindi
* Enhanced domains: math and code (compared to Param-1-2.9B)
* Updated dataset mixture and percentage ratios
* Designed as a pretrained (PT) base model
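Since the card lists Transformers and PyTorch, a minimal loading sketch may help. The repository id below is a hypothetical placeholder (the card does not state the actual Hugging Face id), and because this is a pretrained base model (not instruction-tuned), the sketch uses a plain completion-style prompt rather than a chat template.

```python
MODEL_ID = "bharatgenai/Param-1-5B"  # assumed repo id; check the actual listing


def build_prompt(question: str) -> str:
    # Plain completion-style prompt: Param-1-5B is a pretrained base
    # model, so no chat template is assumed here.
    return f"Question: {question}\nAnswer:"


def generate_answer(question: str) -> str:
    # Heavy: downloads ~10 GB of weights on first call.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # match the (assumed) bf16 checkpoint
        device_map="auto",
    )
    inputs = tokenizer(build_prompt(question), return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output[0], skip_special_tokens=True)


# generate_answer("What is 17 * 23?")  # uncomment to run (downloads weights)
```

The generation call is left commented out because it fetches the full checkpoint; the prompt-building helper can be exercised on its own.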
Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)
bharatgenai
Transformers
PyTorch
Restricted
Other
07/05/26 09:52:31
10.42 GB