Model:
deepset/gbert-base
Released in October 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our paper, we outline the steps taken to train our model and show that it outperforms its predecessors.
Paper: here
Architecture: BERT base
Language: German
GermEval18 Coarse: 78.17
GermEval18 Fine: 50.90
GermEval14: 87.98
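The checkpoint can be loaded directly through the Hugging Face `transformers` library. Below is a minimal sketch using the `fill-mask` pipeline; the example sentence is our own illustration and not part of the model card.

```python
# Minimal sketch: masked-language-modelling with deepset/gbert-base.
# Assumes the `transformers` library is installed and the model can be
# downloaded from the Hugging Face Hub.
from transformers import pipeline

# Build a fill-mask pipeline backed by the German BERT base model.
fill_mask = pipeline("fill-mask", model="deepset/gbert-base")

# [MASK] is the BERT-style mask token used by this vocabulary.
predictions = fill_mask("Die Hauptstadt von Deutschland ist [MASK].")

# Each prediction is a dict with the filled token and its score.
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Swap the pipeline task (e.g. `feature-extraction`) or load the raw model with `AutoModel.from_pretrained("deepset/gbert-base")` if you want embeddings instead of mask predictions.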
See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator
Branden Chan: branden.chan [at] deepset.ai
Stefan Schweter: stefan [at] schweter.eu
Timo Möller: timo.moeller [at] deepset.ai
We bring NLP to industry via open source! Our focus: industry-specific language models & large-scale QA systems.
Some of our work:
Get in touch: Twitter | LinkedIn | Slack | GitHub Discussions | Website
By the way: we're hiring!