---
language:
- nl
datasets:
- yhavinga/mc4_nl_cleaned
tags:
- seq2seq
- lm-head
license: apache-2.0
inference: false
---

# Work in progress. Jan 2022

# A collection of Dutch T5 models

* Many thanks to the [Google TPU Research Cloud](https://sites.research.google/trc/about/) for providing access to a TPU cluster!
* A continuation of the work started during the [Hugging Face community week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organized by [HuggingFace](https://huggingface.co/) with TPU usage sponsored by Google, for the project [Pre-train T5 from scratch in Dutch](https://discuss.huggingface.co/t/pretrain-t5-from-scratch-in-dutch/8109).
* Trained with an improved training script: training no longer raises exceptions, so no restarts are required.
* TensorBoard metrics were logged for all models.
* Thanks to @gsarti for creating the [t5-flax-gcp repository](https://github.com/gsarti/t5-flax-gcp)!
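
Since the hosted inference widget is disabled for this card, a short loading snippet may help. This is a minimal sketch using the `transformers` library; the checkpoint name `yhavinga/t5-base-dutch` is an assumption used for illustration, so substitute the model id of the checkpoint you want from this collection.

```python
# Minimal sketch: load one of the Dutch T5 checkpoints with transformers.
# NOTE: "yhavinga/t5-base-dutch" is an assumed example id; replace it with
# the actual model id from this collection that you want to use.
from transformers import AutoTokenizer, T5ForConditionalGeneration

model_id = "yhavinga/t5-base-dutch"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = T5ForConditionalGeneration.from_pretrained(model_id)

# These checkpoints are pre-trained on yhavinga/mc4_nl_cleaned only
# (no supervised fine-tuning), so they are best used as a starting
# point for fine-tuning on a downstream Dutch seq2seq task rather
# than for direct text generation.
```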