gelectra-large / README.md
---
language: de
license: mit
datasets:
  - wikipedia
  - OPUS
  - OpenLegalData
  - oscar
---

# German ELECTRA large

Released in October 2020, this is a German ELECTRA language model trained collaboratively by the makers of the original German BERT (aka "bert-base-german-cased") and the dbmdz BERT (aka "bert-base-german-dbmdz-cased"). In our paper, we outline the steps taken to train the model and show that it is the state-of-the-art German language model.

## Overview

- **Paper:** here
- **Architecture:** ELECTRA large (discriminator)
- **Language:** German
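
As a minimal sketch of how the discriminator checkpoint might be used, the snippet below loads `deepset/gelectra-large` with the Hugging Face `transformers` library and attaches a fresh sequence-classification head for fine-tuning. The choice of `num_labels=2` is purely illustrative and would depend on your downstream task.

```python
# Minimal sketch: load the gelectra-large discriminator for downstream
# fine-tuning (assumes `transformers` and `torch` are installed).
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "deepset/gelectra-large"

tokenizer = AutoTokenizer.from_pretrained(model_name)
# A new classification head is randomly initialized on top of the
# pretrained encoder; num_labels=2 is an illustrative choice.
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, num_labels=2
)

# Forward pass on a single German sentence.
inputs = tokenizer("Ein deutscher Beispielsatz.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # one logit vector per input sentence
```

Because the head is untrained, the logits are only meaningful after fine-tuning on a labeled German dataset (e.g. GermEval18 for the scores reported below).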

## Performance

| Task              | Score |
|-------------------|-------|
| GermEval18 Coarse | 80.70 |
| GermEval18 Fine   | 55.16 |
| GermEval14        | 88.95 |

See also:

- deepset/gbert-base
- deepset/gbert-large
- deepset/gelectra-base
- deepset/gelectra-large
- deepset/gelectra-base-generator
- deepset/gelectra-large-generator

## Authors

- Branden Chan: branden.chan [at] deepset.ai
- Stefan Schweter: stefan [at] schweter.eu
- Timo Möller: timo.moeller [at] deepset.ai

## About us

deepset is the company behind the production-ready open-source AI framework Haystack.

## Get in touch and join the Haystack community

For more info on Haystack, visit our GitHub repo and Documentation.

We also have a Discord community open to everyone!

Twitter | LinkedIn | Discord | GitHub Discussions | Website | YouTube

By the way: we're hiring!