Model:
uhhlt/am-roberta
This is a RoBERTa transformer-based language model for Amharic. It is part of an effort to build benchmark datasets and models for Amharic NLP.
If you want to test the model in the Hosted inference API, copy one of the following texts into the input box (on the right side):
Example 1:
አበበ <mask> በላ ።
Example 2:
የአገሪቱ አጠቃላይ የስንዴ አቅርቦት ሶስት አራተኛው የሚመረተው በአገር <mask> ነው።
The examples show possible words for the fill-in-the-blank (mask) task.
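Beyond the hosted widget, the model can also be queried locally. A minimal sketch, assuming the `transformers` library (with a PyTorch backend) is installed; the fill-mask pipeline is standard Hugging Face usage and is not shown in this card, so treat it as an illustration rather than the authors' own code:

```python
# Sketch: querying uhhlt/am-roberta with the Hugging Face fill-mask
# pipeline. Assumes `transformers` and `torch` are installed; the model
# weights are downloaded from the Hub on first use.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="uhhlt/am-roberta")

# Example 1 from above, with one masked token to be predicted.
for prediction in fill_mask("አበበ <mask> በላ ።"):
    # Each prediction is a dict containing the proposed token
    # and the model's confidence score for it.
    print(prediction["token_str"], round(prediction["score"], 3))
```

The pipeline returns the top candidate tokens for the `<mask>` position ranked by score, which is exactly what the hosted inference widget displays.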
More resources on Amharic NLP are available here. If you use the model in your work, please cite the following paper:
@Article{fi13110275,
  AUTHOR = {Yimam, Seid Muhie and Ayele, Abinew Ali and Venkatesh, Gopalakrishnan and Gashaw, Ibrahim and Biemann, Chris},
  TITLE = {Introducing Various Semantic Models for Amharic: Experimentation and Evaluation with Multiple Tasks and Datasets},
  JOURNAL = {Future Internet},
  VOLUME = {13},
  YEAR = {2021},
  NUMBER = {11},
  ARTICLE-NUMBER = {275},
  URL = {https://www.mdpi.com/1999-5903/13/11/275},
  ISSN = {1999-5903},
  DOI = {10.3390/fi13110275}
}