---
model_name: DeepSeek-R1-Distill-Llama-70B
base_model: deepseek-ai/DeepSeek-R1-Distill-Llama-70B
model_creator: deepseek-ai
quantized_by: Second State Inc.
library_name: transformers
---
# DeepSeek-R1-Distill-Llama-70B-GGUF
## Original Model
[deepseek-ai/DeepSeek-R1-Distill-Llama-70B](https://huggingface.co/deepseek-ai/DeepSeek-R1-Distill-Llama-70B)
## Run with GaiaNet
**Prompt template:** `llama-3-chat`

**Context size:** `chat_ctx_size: 128000`
**Run with GaiaNet:**
- Quick start: https://docs.gaianet.ai/node-guide/quick-start
- Customize your node: https://docs.gaianet.ai/node-guide/customize
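
Once a GaiaNet node is set up with this model, it serves an OpenAI-compatible chat API. Below is a minimal sketch of querying such a node from Python; the base URL, port, and model name are assumptions for illustration, so substitute your own node's address and configured model name.

```python
import requests

# Base URL of your GaiaNet node; this local address is an assumption,
# replace it with your node's actual local or public URL.
BASE_URL = "http://localhost:8080"

payload = {
    # Model name as configured on the node (illustrative placeholder).
    "model": "DeepSeek-R1-Distill-Llama-70B",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain model distillation in one paragraph."},
    ],
}

# Query the node's OpenAI-compatible chat completions endpoint.
response = requests.post(f"{BASE_URL}/v1/chat/completions", json=payload, timeout=600)
response.raise_for_status()

print(response.json()["choices"][0]["message"]["content"])
```
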
*Quantized with llama.cpp b4519*