Model:
marcosgg/bert-base-gl-cased
This is a base pre-trained BERT model (12 layers, cased) for Galician (ILG/RAG spelling). It was evaluated on lexical semantics tasks, using a dataset for identifying homonymy and synonymy in context, and was presented at ACL 2021.
There is also a small version (6 layers, cased): marcosgg/bert-small-gl-cased
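As a usage sketch, the model can be loaded with the Hugging Face `transformers` library (assumed installed) for masked-token prediction, the standard inference task for a base BERT model. The example sentence and helper function below are illustrative, not part of the original model card.

```python
MODEL_ID = "marcosgg/bert-base-gl-cased"


def fill_mask(sentence: str, model_id: str = MODEL_ID):
    """Predict candidates for the [MASK] token in a Galician sentence.

    Returns a list of dicts with keys such as `token_str` and `score`.
    """
    # Imported lazily; assumes the `transformers` package is installed.
    from transformers import pipeline

    unmasker = pipeline("fill-mask", model=model_id)
    return unmasker(sentence)


if __name__ == "__main__":
    # BERT-style models use the [MASK] token for masked language modeling.
    for pred in fill_mask("A lingua [MASK] é falada en Galicia."):
        print(pred["token_str"], pred["score"])
```

The small 6-layer variant can be used the same way by passing `marcosgg/bert-small-gl-cased` as `model_id`.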
If you use this model, please cite the following paper:
```bibtex
@inproceedings{garcia-2021-exploring,
    title = "Exploring the Representation of Word Meanings in Context: {A} Case Study on Homonymy and Synonymy",
    author = "Garcia, Marcos",
    booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
    year = "2021",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.acl-long.281",
    doi = "10.18653/v1/2021.acl-long.281",
    pages = "3625--3640"
}
```