Update README.md
README.md CHANGED

@@ -6,7 +6,9 @@ language:
 - en
 ---

-# gpt4all-lora
+# gpt4all-lora-epoch-3
+
+This is an intermediate (epoch 3 / 4) checkpoint from `nomic-ai/gpt4all-lora`.

 An autoregressive transformer trained on [data](https://huggingface.co/datasets/nomic-ai/gpt4all_prompt_generations) curated using [Atlas](https://atlas.nomic.ai/).
 This model is trained for four full epochs, while the related [gpt4all-lora-epoch-2 model](https://huggingface.co/nomic-ai/gpt4all-lora-epoch-2) is trained for three.
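For context on how the renamed checkpoint might be used, here is a minimal sketch that loads the Atlas-curated dataset named in the card and applies this checkpoint, assuming the repository is published as a PEFT LoRA adapter over a LLaMA-style base model; the base model identifier and the `peft` loading path are assumptions for illustration, not part of this commit.

```python
# Minimal sketch (not from the card): load the training data referenced in the
# README and, assuming gpt4all-lora-epoch-3 is a PEFT LoRA adapter, apply it to
# an assumed LLaMA-style base model.
from datasets import load_dataset
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Prompt/response pairs curated with Atlas (dataset named in the card).
data = load_dataset("nomic-ai/gpt4all_prompt_generations", split="train")

base_id = "decapoda-research/llama-7b-hf"  # assumed base model; adjust to your setup
tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id)

# Layer the epoch-3 adapter weights on top of the base model.
model = PeftModel.from_pretrained(base_model, "nomic-ai/gpt4all-lora-epoch-3")
```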