Update README.md
README.md
CHANGED
@@ -14,6 +14,17 @@ To use the model with optillm you can just prepend `router` to the model name. E

Otherwise, refer to the code in [router-plugin](https://github.com/codelion/optillm/blob/main/optillm/plugins/router_plugin.py) to see how to use this model for classification.

+This model is based on `ModernBERT-large` and performs better than the previous [router model](https://huggingface.co/codelion/optillm-bert-uncased)
+that was based on `bert-large-uncased`.
+
+### Router results on AIME 2024 pass@1
+
+| Model | Score |
+|-------|-----:|
+| router-gpt4o-mini with codelion/optillm-modernbert-large | 13.33 |
+| router-gpt4o-mini with codelion/optillm-bert-uncased | 6.67 |
+| gpt4o-mini | 3.33 |
+
# Usage

To use the model directly you will need to use our `OptILMClassifier` class as we added additional layers to the base model. The additional
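For reference, the usage the hunk header alludes to (prepending `router` to the model name when calling through optillm) looks roughly like the sketch below. This is a minimal, hedged example, assuming optillm is running locally as an OpenAI-compatible proxy on its default port; the endpoint, API key, and underlying model name are placeholders to adapt to your setup.

```python
# Minimal sketch: routing a request through optillm by prepending "router" to the model name.
# Assumes an optillm proxy is running locally as an OpenAI-compatible server (port assumed);
# adjust base_url, api_key, and the underlying model to match your deployment.
from openai import OpenAI

client = OpenAI(
    api_key="sk-...",                     # key for your upstream provider (placeholder)
    base_url="http://localhost:8000/v1",  # assumed optillm proxy endpoint
)

response = client.chat.completions.create(
    model="router-gpt-4o-mini",  # "router" prefix -> the classifier picks the approach
    messages=[{"role": "user", "content": "How many r's are there in strawberry?"}],
)
print(response.choices[0].message.content)
```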
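The final context line of the hunk notes that direct use goes through the repo's `OptILMClassifier` class, since extra layers sit on top of the base encoder. The sketch below only illustrates that general pattern; it is not the actual `OptILMClassifier` (see [router_plugin.py](https://github.com/codelion/optillm/blob/main/optillm/plugins/router_plugin.py) for the real class), and the base checkpoint name, label count, pooling, and weight-loading details here are assumptions.

```python
# Illustrative stand-in only: the real OptILMClassifier and its extra layers live in
# optillm's router_plugin.py and may differ. This shows the general pattern of wrapping
# a base encoder with a classification head and running a prompt through it.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

BASE_MODEL = "answerdotai/ModernBERT-large"  # assumed base checkpoint
NUM_APPROACHES = 13                          # placeholder; the real number of routing labels may differ

class RouterClassifier(nn.Module):
    """Stand-in for OptILMClassifier: base encoder plus a linear classification head."""

    def __init__(self, base_model_name: str, num_labels: int):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(base_model_name)
        hidden = self.encoder.config.hidden_size
        self.classifier = nn.Linear(hidden, num_labels)  # the "additional layer"

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        pooled = out.last_hidden_state[:, 0]  # first-token pooling (assumption)
        return self.classifier(pooled)

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = RouterClassifier(BASE_MODEL, NUM_APPROACHES)
# To use the released weights, load this repo's checkpoint on top of the wrapper, e.g.:
# state = torch.load("pytorch_model.bin", map_location="cpu")
# model.load_state_dict(state, strict=False)

inputs = tokenizer("Prove that sqrt(2) is irrational.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(inputs["input_ids"], inputs["attention_mask"])
predicted_index = logits.argmax(dim=-1).item()
print(predicted_index)
```

Mapping the predicted index back to a concrete optillm approach is defined in the router plugin itself, so treat the index here as opaque until you consult that code.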