Update README.md
README.md CHANGED
@@ -10,6 +10,8 @@ tags:
 
 This repo contains the latest version of PMC_LLaMA_7B, which is LLaMA-7b finetuned on the PMC papers in the S2ORC dataset.
 
+Notably, different from `chaoyi-wu/PMC_LLAMA_7B`, this model is further trained for 10 epochs.
+
 The model was trained with the following hyperparameters:
 
 * Epochs: **10**
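For reference, a minimal loading sketch for the checkpoint described in this README, using Hugging Face Transformers, is shown below. The repo id and the example prompt are illustrative assumptions, not taken from the README; substitute the actual model id of this repository.

```python
# Minimal sketch: loading the finetuned LLaMA-7b checkpoint with Transformers.
# NOTE: the repo id below is a placeholder assumption, not confirmed by the README.
import torch
import transformers

model_id = "chaoyi-wu/PMC_LLAMA_7B_10_epoch"  # placeholder; use the actual repo id

tokenizer = transformers.LlamaTokenizer.from_pretrained(model_id)
model = transformers.LlamaForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)

# Hypothetical prompt; this is a plain causal LM, not an instruction-tuned chat model.
prompt = "The mechanism of action of metformin is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```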