---
license: apache-2.0
---
A collection of LoRAs for int8 LLaMA, trained on an assortment of literature (approximately 16 MB) for 2 epochs.


UPDATE: 2024-04-18
Retrained using Transformers 4.28.1 for two epochs, with slightly more data.


Notes for usage:
```
- These models are not instruct LoRAs; they are designed to supplement existing story data.
- There will likely be some bleed-through of locations and names, which is especially noticeable when used with very little context.
- There is no heavy formatting in the dataset: stories are separated by ###, and chapters by ***.
```
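
Since the dataset uses `###` and `***` as separators, text prepared for (or generated in) the same style can be split back into stories and chapters. A minimal sketch, assuming plain-text input in that format (the function name and sample string are illustrative, not part of the dataset):

```python
def split_corpus(text: str) -> list[list[str]]:
    """Split text into stories (separated by ###), each a list of
    chapters (separated by ***), matching the dataset's conventions."""
    stories = [s.strip() for s in text.split("###") if s.strip()]
    return [
        [c.strip() for c in story.split("***") if c.strip()]
        for story in stories
    ]

sample = "Chapter one.***Chapter two.###Another story."
print(split_corpus(sample))  # [['Chapter one.', 'Chapter two.'], ['Another story.']]
```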