Commit 1050884 by jannikskytt (parent: 405e6ed): Update README.md

README.md (changed):
- split: train
  path: Verb inflection analogy/train*
---

# Danish medical word embeddings

MeDa-We was trained on a Danish medical corpus of 123M tokens. The word embeddings are 300-dimensional and were trained using [FastText](https://fasttext.cc/).

The embeddings were trained for 10 epochs with a window size of 5 and 10 negative samples.
The development of the corpus and word embeddings is described further in our [paper](https://aclanthology.org/2023.nodalida-1.31/).

We also trained a transformer model on the developed corpus, which can be found [here](https://huggingface.co/jannikskytt/MeDa-Bert).
### Citing

```bibtex
@inproceedings{pedersen-etal-2023-meda,
    title = "{M}e{D}a-{BERT}: A medical {D}anish pretrained transformer model",
    author = "Pedersen, Jannik  and
      Laursen, Martin  and
      Vinholt, Pernille  and
      Savarimuthu, Thiusius Rajeeth",
    booktitle = "Proceedings of the 24th Nordic Conference on Computational Linguistics (NoDaLiDa)",
    month = may,
    year = "2023",
    address = "T{\'o}rshavn, Faroe Islands",
    publisher = "University of Tartu Library",
    url = "https://aclanthology.org/2023.nodalida-1.31",
    pages = "301--307",
}
```