---
language: nl
widget:
  - text: In het jaar 2030 zullen we
  - text: Toen ik gisteren volledig in de ban was van
  - text: >-
      Studenten en leraren van de Bogazici Universiteit in de Turkse stad
      Istanbul
  - text: In Israël was een strenge lockdown
tags:
  - gpt2-medium
  - gpt2
pipeline_tag: text-generation
datasets:
  - yhavinga/mc4_nl_cleaned
---

# GPT2-Medium pre-trained on cleaned Dutch mC4 🇳🇱

**Training is not finished!**
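A minimal usage sketch with the Transformers `pipeline` API. The model id `yhavinga/gpt2-medium-dutch` is assumed from the repository name, and since training is still in progress, generations will vary between checkpoints:

```python
# Model id assumed from the repository name; adjust if the repo is renamed.
MODEL_ID = "yhavinga/gpt2-medium-dutch"

def generate(prompt: str, max_new_tokens: int = 40) -> str:
    """Generate a Dutch continuation for `prompt`.

    The import is deferred so this module loads without transformers
    installed; the first call downloads the model from the Hub.
    """
    from transformers import pipeline

    generator = pipeline("text-generation", model=MODEL_ID)
    out = generator(prompt, max_new_tokens=max_new_tokens, do_sample=True, top_p=0.95)
    return out[0]["generated_text"]

# Example (requires a network connection and the transformers package):
# print(generate("In het jaar 2030 zullen we"))
```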

## Dataset

  • yhavinga/mc4_nl_cleaned: the cleaned Dutch part of the mC4 dataset

## Tokenizer

  • Tokenizer trained on mC4 with the scripts from the Hugging Face Transformers Flax examples

## Training details

  • Trained for 320K of 520K steps (Dec 31, 2021)
  • Block size: 512
  • Optimizer: Adam, learning rate 8e-4, β1 0.9, β2 0.98
  • Warmup steps: 5000
  • Weight decay: 0.01
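The learning-rate schedule above can be sketched as follows. The linear warmup over 5000 steps to the 8e-4 peak is stated in the card; the post-warmup decay shape is an assumption (linear decay to zero at 520K steps, the default in the Transformers Flax example scripts):

```python
# Values taken from the training details above.
PEAK_LR = 8e-4
WARMUP_STEPS = 5_000
TOTAL_STEPS = 520_000

def lr_at(step: int) -> float:
    """Learning rate at a given training step.

    Linear warmup to PEAK_LR, then (assumed) linear decay to zero
    over the remaining TOTAL_STEPS - WARMUP_STEPS steps.
    """
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    frac_remaining = (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)
    return PEAK_LR * max(frac_remaining, 0.0)
```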

Work in progress, Dec 2021 - Jan 2022.