
German BERT large

Released in October 2020, this is a German BERT language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our paper, we outline the steps taken to train our model and show that it outperforms its predecessors.
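As a minimal usage sketch, the model can be loaded for masked-language-model inference with the Hugging Face `transformers` library. The model id `deepset/gbert-large` is an assumption inferred from the "See also" list below; substitute the id of this model card if it differs.

```python
# Hedged sketch: fill-mask inference with a German BERT model via
# the standard transformers pipeline API. Model id is assumed.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="deepset/gbert-large")

# Predict the most likely tokens for the masked position.
sentence = "Die Hauptstadt von Deutschland ist [MASK]."
for prediction in fill_mask(sentence):
    print(prediction["token_str"], round(prediction["score"], 3))
```

Because this is a BERT-style encoder, it is typically fine-tuned on a downstream task (e.g. classification or NER) rather than used for generation.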

Overview

Paper: here
Architecture: BERT large
Language: German

Performance

GermEval18 Coarse: 80.08
GermEval18 Fine:   52.48
GermEval14:        88.16

See also:
deepset/gbert-base
deepset/gbert-large
deepset/gelectra-base
deepset/gelectra-large
deepset/gelectra-base-generator
deepset/gelectra-large-generator

Authors

Branden Chan: branden.chan@deepset.ai
Stefan Schweter: stefan@schweter.eu
Timo Möller: timo.moeller@deepset.ai

About us

deepset is the company behind the open-source NLP framework Haystack, which is designed to help you build production-ready NLP systems for question answering, summarization, ranking, and more.

Get in touch and join the Haystack community

For more info on Haystack, visit our GitHub repo and Documentation.

We also have a Discord community open to everyone!

Twitter | LinkedIn | Discord | GitHub Discussions | Website

By the way: we're hiring!