---
library_name: transformers
tags:
- Danish
- BPE Tokenization
- CerebrasGPT
---
### DA-BPE-CEREBRAS | |
This CerebrasGPT-111M model employs a standard byte-pair encoding (BPE) tokenizer for Danish text. It serves as a baseline for comparison against morphology-aware tokenizers.
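For readers unfamiliar with how a standard BPE vocabulary is built: training repeatedly merges the most frequent adjacent symbol pair until a vocabulary budget is reached. The sketch below illustrates that loop in pure Python on a toy Danish corpus; it is illustrative only (the model itself ships with a pretrained tokenizer) and the corpus, merge budget, and `</w>` end-of-word marker are assumptions of this example.

```python
from collections import Counter

def pair_counts(words):
    """Count adjacent symbol pairs, weighted by word frequency."""
    counts = Counter()
    for word, freq in words.items():
        syms = word.split()
        for a, b in zip(syms, syms[1:]):
            counts[(a, b)] += freq
    return counts

def apply_merge(pair, words):
    """Replace every adjacent occurrence of `pair` with its merged symbol."""
    a, b = pair
    out = {}
    for word, freq in words.items():
        syms = word.split()
        merged, i = [], 0
        while i < len(syms):
            if i + 1 < len(syms) and syms[i] == a and syms[i + 1] == b:
                merged.append(a + b)
                i += 2
            else:
                merged.append(syms[i])
                i += 1
        out[" ".join(merged)] = freq
    return out

# Toy Danish corpus; real training would use a large text dump.
corpus = "hunden løber hunden sover katten løber".split()
# Start from single characters plus an end-of-word marker.
words = dict(Counter(" ".join(w) + " </w>" for w in corpus))

merges = []
for _ in range(10):  # the merge budget plays the role of vocab size
    counts = pair_counts(words)
    if not counts:
        break
    best = max(counts, key=counts.get)
    words = apply_merge(best, words)
    merges.append(best)

print(merges)
```

Because the merges are purely frequency-driven, subword boundaries need not align with Danish morpheme boundaries, which is exactly the behavior a morphology-aware tokenizer is meant to improve on.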