---
license: apache-2.0
language:
- da
pipeline_tag: fill-mask
tags:
- bert
- danish
widget:
- text: Hvide blodceller beskytter kroppen mod [MASK]
---

# Danish medical BERT

MeDa-BERT was initialized with weights from a pretrained Danish BERT model (https://huggingface.co/Maltehb/danish-bert-botxo) and then fine-tuned for 48 epochs with masked language modelling on a Danish medical corpus of 123M tokens. The development of the corpus and model is described further in the paper:

Here is an example of how to load the model in PyTorch using the [🤗Transformers](https://github.com/huggingface/transformers) library:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("jannikskytt/MeDa-Bert")
model = AutoModelForMaskedLM.from_pretrained("jannikskytt/MeDa-Bert")
```
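
Since the model card's `pipeline_tag` is `fill-mask`, the model can also be queried through the standard 🤗 Transformers `fill-mask` pipeline. Below is a minimal sketch using the widget sentence from above ("White blood cells protect the body against [MASK]"); the pipeline wrapper is generic Transformers usage, not something specific to this model:

```python
from transformers import pipeline

# Load the model into a standard fill-mask pipeline.
fill_mask = pipeline("fill-mask", model="jannikskytt/MeDa-Bert")

# Danish: "White blood cells protect the body against [MASK]"
predictions = fill_mask("Hvide blodceller beskytter kroppen mod [MASK]")

# Each prediction is a dict with the candidate token and its score.
for pred in predictions:
    print(f"{pred['token_str']}\t{pred['score']:.3f}")
```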