Local Models
The TinyLlama project aims to pretrain a 1.1B Llama model on 3 trillion tokens. This is the chat model finetuned on a diverse range of synthetic dialogues generated by ChatGPT.
| No | Variant | Cortex CLI command |
|---|---|---|
| 1 | 1b-gguf | cortex run tinyllama:1b-gguf |
Model repository: cortexhub/tinyllama

Run the default variant with: cortex run tinyllama
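
A minimal local session might look like the following. This is a sketch assuming the Cortex CLI is installed and that a `cortex pull` command is available to pre-download the model before `cortex run` starts an interactive chat; only the `cortex run` commands are taken from the table above.

```sh
# Pre-download TinyLlama from Cortex Hub
# (assumes `cortex pull` exists in your Cortex CLI version).
cortex pull tinyllama

# Start an interactive chat with the default variant.
cortex run tinyllama

# Or run a specific quantized variant listed in the table above.
cortex run tinyllama:1b-gguf
```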