---
license: mit
---
This is the 15M-parameter Llama 2 architecture model trained on the TinyStories dataset.
The weights are converted from
[karpathy/tinyllamas](https://huggingface.co/karpathy/tinyllamas).
See the [llama2.c](https://github.com/karpathy/llama2.c) project for more details.
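
A minimal usage sketch, assuming the converted checkpoint is hosted in standard Hugging Face `transformers` format and that the repo id is `nickypro/tinyllama-15M` (the repo id is an assumption inferred from this model card's location, not stated above):

```python
# Sketch: generate a TinyStories-style continuation with Hugging Face transformers.
# The repo id "nickypro/tinyllama-15M" is assumed, not confirmed by this card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nickypro/tinyllama-15M"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# TinyStories models are trained on simple children's-story text,
# so a story-like prompt works best.
prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=False)
story = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(story)
```

At 15M parameters the model runs comfortably on CPU, which is the typical way these llama2.c-derived checkpoints are used.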