Model:
dimitriz/greek-media-bert-base-uncased
This model is a domain-adapted version of nlpaueb/bert-base-greek-uncased-v1, further pre-trained on Greek media-centric data.
Details will be updated soon.
The hyperparameters used during training will be updated soon.
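While full usage details are pending, the model can be loaded with the Hugging Face transformers library like any BERT-style checkpoint. The snippet below is a minimal sketch, assuming a standard fill-mask setup; the sample sentence is illustrative and not from the original card.

```python
# Minimal sketch: masked-token prediction with this checkpoint via the
# transformers fill-mask pipeline. The example sentence is an assumption,
# not taken from the model card.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="dimitriz/greek-media-bert-base-uncased")

# BERT-style models use "[MASK]" as the mask token.
predictions = fill_mask("Η Ελλάδα είναι μια [MASK] χώρα.")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Each prediction is a dict containing the filled token (`token_str`) and its probability (`score`), sorted from most to least likely.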
The model was officially released with the article "PIMA: Parameter-Shared Intelligent Media Analytics Framework for Low Resource Languages. Dimitrios Zaikis, Nikolaos Stylianou and Ioannis Vlahavas. In the Special Issue: New Techniques of Machine Learning and Deep Learning in Text Classification, Applied Sciences Journal, 2023" ( https://www.mdpi.com/2174928 ).
If you use this model, please cite:
@Article{app13053265,
AUTHOR = {Zaikis, Dimitrios and Stylianou, Nikolaos and Vlahavas, Ioannis},
TITLE = {PIMA: Parameter-Shared Intelligent Media Analytics Framework for Low Resource Languages},
JOURNAL = {Applied Sciences},
VOLUME = {13},
YEAR = {2023},
NUMBER = {5},
ARTICLE-NUMBER = {3265},
URL = {https://www.mdpi.com/2076-3417/13/5/3265},
ISSN = {2076-3417},
DOI = {10.3390/app13053265}
}