---
license: cc-by-nc-3.0
language:
- da
pipeline_tag: fill-mask
tags:
- bert
- danish
widget:
- text: Hvide blodlegemer beskytter kroppen mod [MASK]
---
# Danish medical BERT
MeDa-BERT was initialized with weights from a pretrained Danish BERT model ([danish-bert-botxo](https://huggingface.co/Maltehb/danish-bert-botxo)) and then fine-tuned for 48 epochs with the masked language modeling (MLM) objective on a Danish medical corpus of 123M tokens.
The development of the corpus and model is described further in the accompanying paper.
Here is an example of how to load the model in PyTorch using the [🤗Transformers](https://github.com/huggingface/transformers) library:
```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Download the tokenizer and masked-LM weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("jannikskytt/MeDa-Bert")
model = AutoModelForMaskedLM.from_pretrained("jannikskytt/MeDa-Bert")
```
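Once loaded, the model can be queried through the `fill-mask` pipeline. Below is a minimal sketch using the widget example above ("Hvide blodlegemer beskytter kroppen mod [MASK]", i.e. "White blood cells protect the body against [MASK]"); the output handling is an illustrative choice, not part of the model card:

```python
from transformers import pipeline

# Build a fill-mask pipeline around the MeDa-BERT checkpoint
fill_mask = pipeline("fill-mask", model="jannikskytt/MeDa-Bert")

# "White blood cells protect the body against [MASK]"
predictions = fill_mask("Hvide blodlegemer beskytter kroppen mod [MASK]")

# Each prediction holds the filled-in token and its probability
for pred in predictions:
    print(f"{pred['token_str']}\t{pred['score']:.3f}")
```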
### Citing
If you find our model helpful, please consider citing it :)
```
@article{
}
```