Model:
l3cube-pune/telugu-bert
TeluguBERT is a Telugu BERT model trained on publicly available Telugu monolingual datasets.
Preliminary details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2211.11418).
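Below is a minimal usage sketch with the Hugging Face `transformers` library. It assumes the checkpoint is available on the Hub under `l3cube-pune/telugu-bert` and exposes a masked-language-modeling head; the Telugu example sentence is purely illustrative.

```python
# Minimal sketch: load TeluguBERT and run fill-mask inference.
# Assumes the Hub checkpoint "l3cube-pune/telugu-bert" provides an MLM head.
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

model_name = "l3cube-pune/telugu-bert"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Hypothetical Telugu example sentence containing the mask token.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
print(fill_mask(f"హైదరాబాద్ ఒక పెద్ద {tokenizer.mask_token}."))
```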
Citing:
@article{joshi2022l3cubehind,
  title={L3Cube-HindBERT and DevBERT: Pre-Trained BERT Transformer models for Devanagari based Hindi and Marathi Languages},
  author={Joshi, Raviraj},
  journal={arXiv preprint arXiv:2211.11418},
  year={2022}
}
Other monolingual Indic BERT models are listed below:
- Marathi BERT, Marathi RoBERTa, Marathi AlBERT
- Hindi BERT, Hindi RoBERTa, Hindi AlBERT
- Dev BERT, Dev RoBERTa, Dev AlBERT
- Kannada BERT, Telugu BERT, Malayalam BERT, Tamil BERT, Gujarati BERT, Oriya BERT, Bengali BERT, Punjabi BERT, Assamese BERT