---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
inference: false
datasets:
- the_pile_books3
---
# mpt-7b-storywriter: sharded
This is a version of the [mpt-7b-storywriter](https://huggingface.co/mosaicml/mpt-7b-storywriter) model, sharded into ~2 GB chunks for low-RAM loading (e.g., on Google Colab). The weights are stored in `bfloat16`, so in principle the model can run on CPU, though inference will be very slow.
Please refer to the original repo linked above for details on usage and implementation. This model was downloaded from the original repo under the Apache-2.0 license and is redistributed under the same license.
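
As a convenience, here is a minimal loading sketch. It assumes `transformers` and `torch` are installed, and the repo id below is a placeholder that should be replaced with this repository's path on the Hub. MPT models use custom modeling code, so `trust_remote_code=True` is required.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: replace with this repository's id on the Hub
REPO_ID = "your-username/mpt-7b-storywriter-sharded"

tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
model = AutoModelForCausalLM.from_pretrained(
    REPO_ID,
    torch_dtype=torch.bfloat16,  # weights are stored in bfloat16
    trust_remote_code=True,      # MPT uses custom modeling code
    low_cpu_mem_usage=True,      # load shards incrementally to reduce peak RAM
)

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```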
---
> More details/usage to be added later