A transformer-based model pre-trained on large-scale corpora, covering 12 major Indian languages.
IndicBERT is a multilingual language representation model pre-trained on large-scale corpora covering 12 major Indian languages. It is based on the ALBERT architecture, a parameter-reduced variant of BERT, and provides contextual embeddings for natural language processing tasks such as text classification, named entity recognition, and question answering. IndicBERT aims to improve the performance of NLP applications in Indian languages by offering rich semantic representations.
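As a minimal sketch of how contextual embeddings from a model like IndicBERT are typically turned into a single sentence vector, the snippet below mean-pools token embeddings while ignoring padding. The toy arrays stand in for encoder output; in practice one would obtain them by loading the model through the Hugging Face `transformers` library (e.g. `AutoModel.from_pretrained("ai4bharat/indic-bert")`), which is omitted here because it requires a network download.

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average the contextual vectors of real (non-padding) tokens.

    token_embeddings: (seq_len, hidden) array of per-token encoder outputs
    attention_mask:   (seq_len,) array with 1 for real tokens, 0 for padding
    """
    mask = attention_mask[:, None].astype(float)          # (seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=0)        # sum over real tokens
    count = np.maximum(mask.sum(), 1e-9)                  # avoid division by zero
    return summed / count

# Toy example: 4 tokens (last one is padding), hidden size 3.
# These numbers are illustrative, not real IndicBERT outputs.
emb = np.array([[1.0, 2.0, 3.0],
                [3.0, 2.0, 1.0],
                [2.0, 2.0, 2.0],
                [9.0, 9.0, 9.0]])   # padding row, excluded by the mask
mask = np.array([1, 1, 1, 0])
sentence_vec = mean_pool(emb, mask)  # → array([2., 2., 2.])
```

The resulting sentence vector can then feed a lightweight classifier head, which is the usual way such embeddings are applied to downstream tasks like text classification.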
MIT
Divyanshu Kakwani, Anoop Kunchukuttan, Satish Golla, Gokul N.C., Avik Bhattacharyya, Mitesh M. Khapra, Pratyush Kumar
Multilingual Language Model
N.A.
Open
Sector Agnostic
21/02/25 13:21:14