# GLiNER-MoE-MultiLingual: A Zero-Shot Multilingual NER Model with MOE Architecture

This repository provides **GLiNER-MoE-MultiLingual**, a zero-shot Named Entity Recognition (NER) model trained for **one epoch** using a **Mixture of Experts (MoE)** architecture from NOMIC-MoE. GLiNER-MoE-MultiLingual is designed to handle zero-shot **multilingual** NER tasks across a variety of domains.

Inspired by my work documented in this [Medium article](https://medium.com/@mayankrakesh1/divide-specialize-and-conquer-my-ideas-on-how-moe-meetscontrastive-learning-in-nlp-part-1-8379803220d0).
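
As a quick illustration of the zero-shot workflow, here is a minimal sketch using the standard `GLiNER.from_pretrained` / `predict_entities` API from the `gliner` Python package; the model ID, example sentence, and label set below are placeholders rather than details taken from this README.

```python
# Minimal zero-shot NER sketch (assumes the `gliner` package is installed).
from gliner import GLiNER

# Placeholder model ID: replace with this repository's actual Hugging Face ID.
model = GLiNER.from_pretrained("your-username/GLiNER-MoE-MultiLingual")

# Multilingual input: entity types are supplied as free-text labels at
# inference time, so no fine-tuning is needed for new label sets or languages.
text = "Angela Merkel visitó París en 2021 para reunirse con la UNESCO."
labels = ["person", "location", "organization", "date"]

# Returns a list of dicts with keys such as "text", "label", and "score".
entities = model.predict_entities(text, labels, threshold=0.5)
for entity in entities:
    print(f'{entity["text"]} => {entity["label"]}')
```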
---