Model:
NYTK/named-entity-recognition-nerkor-hubert-hungarian
For further models, scripts and details, see our demo site.
F-score: 90.18%
from transformers import pipeline

# Load the Hungarian NER model into a token-classification pipeline
ner = pipeline(task="ner", model="NYTK/named-entity-recognition-nerkor-hubert-hungarian")

input_text = "A Kovácsné Nagy Erzsébet nagyon jól érzi magát a Nokiánál, azonban a Németországból érkezett Kovács Péter nehezen boldogul a beilleszkedéssel."

# "simple" aggregation merges word-piece tokens into whole-entity spans
print(ner(input_text, aggregation_strategy="simple"))
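With aggregation_strategy="simple", the pipeline returns one dict per entity span, with entity_group, score, word, start and end keys. A minimal sketch of post-processing such output, e.g. dropping low-confidence predictions (the sample entities below are hand-written for illustration, not actual model output; the 0.5 threshold is an arbitrary choice):

```python
# Hand-written sample in the shape returned by
# pipeline(..., aggregation_strategy="simple"); illustrative only.
sample_output = [
    {"entity_group": "PER", "score": 0.998, "word": "Kovácsné Nagy Erzsébet", "start": 2, "end": 24},
    {"entity_group": "ORG", "score": 0.970, "word": "Nokiánál", "start": 49, "end": 57},
    {"entity_group": "LOC", "score": 0.420, "word": "Németországból", "start": 69, "end": 83},
]

def filter_entities(entities, min_score=0.5):
    """Keep only predictions whose confidence meets the threshold."""
    return [e for e in entities if e["score"] >= min_score]

for e in filter_entities(sample_output):
    print(f'{e["entity_group"]}: {e["word"]} ({e["score"]:.3f})')
```

The start/end offsets index into the original input string, so confident spans can also be mapped back to the source text directly.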
If you use this model, please cite the following paper:
@inproceedings{yang-language-models,
    title = {Training language models with low resources: RoBERTa, BART and ELECTRA experimental models for Hungarian},
    booktitle = {Proceedings of 12th IEEE International Conference on Cognitive Infocommunications (CogInfoCom 2021)},
    year = {2021},
    publisher = {IEEE},
    address = {Online},
    author = {Yang, Zijian Győző and Váradi, Tamás},
    pages = {279--285}
}