---
library_name: transformers
tags:
- Danish
- BPE Tokenization
- CerebrasGPT
---
This CerebrasGPT-111M model employs a standard Byte-Pair Encoding (BPE) tokenizer for Danish text. It serves as a baseline for comparison with morphology-aware tokenizers.
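
Below is a minimal usage sketch for loading the tokenizer and inspecting how it segments Danish text with `transformers`. The repository id is a placeholder (not the actual Hub path), and the example sentence is illustrative.

```python
from transformers import AutoTokenizer

# Placeholder repository id; substitute the actual model path on the Hub.
tokenizer = AutoTokenizer.from_pretrained("your-org/cerebras-gpt-111m-danish-bpe")

# Tokenize a Danish sentence and inspect the BPE subword segmentation.
text = "Undervisningsministeriet offentliggjorde nye retningslinjer."
tokens = tokenizer.tokenize(text)
ids = tokenizer.encode(text)

print(tokens)  # BPE subword pieces
print(ids)     # corresponding vocabulary ids
```

Because the tokenizer is a standard byte-level BPE (not morphology-aware), long Danish compounds will typically be split on frequency-driven subword boundaries rather than morpheme boundaries, which is the behaviour this baseline is intended to measure.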