Model:
l3cube-pune/marathi-bert
A newer version of this model is available here: https://huggingface.co/l3cube-pune/marathi-bert-v2
MahaBERT is a Marathi BERT model. It is a multilingual BERT (bert-base-multilingual-cased) model fine-tuned on L3Cube-MahaCorpus and other publicly available Marathi monolingual datasets. [dataset link](https://github.com/l3cube-pune/MarathiNLP)
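As a sketch of how the model might be loaded, the snippet below uses the standard Hugging Face Transformers auto classes with this card's model id. The specific class choice (`AutoModelForMaskedLM`) is an assumption for illustrating masked-language-model usage; swap in the auto class matching your downstream task.

```python
# Hypothetical usage sketch: loading MahaBERT via Hugging Face Transformers.
# The model id comes from this card; AutoModelForMaskedLM is assumed here
# for masked-LM inference and is not prescribed by the card itself.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "l3cube-pune/marathi-bert"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)
```

For fine-tuning on a labeled task, `AutoModelForSequenceClassification` (or another task-specific auto class) can be used with the same model id.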
More details on the dataset, models, and baseline results can be found in our [paper](https://arxiv.org/abs/2202.01159).
```bibtex
@InProceedings{joshi:2022:WILDRE6,
  author    = {Joshi, Raviraj},
  title     = {L3Cube-MahaCorpus and MahaBERT: Marathi Monolingual Corpus, Marathi BERT Language Models, and Resources},
  booktitle = {Proceedings of The WILDRE-6 Workshop within the 13th Language Resources and Evaluation Conference},
  month     = {June},
  year      = {2022},
  address   = {Marseille, France},
  publisher = {European Language Resources Association},
  pages     = {97--101}
}
```