---
license: apache-2.0
model_name: Breeze-7B-Instruct-v1_0
base_model: MediaTek-Research/Breeze-7B-Instruct-v1_0
inference: false
model_creator: MediaTek-Research
pipeline_tag: text-generation
quantized_by: Second State Inc.
language:
- zh
- en
---
![](https://github.com/GaiaNet-AI/.github/assets/45785633/d6976adc-f97d-4f86-a648-0f2f5c8e7eee)
# Breeze-7B-Instruct-v1_0-GGUF
## Original Model
[MediaTek-Research/Breeze-7B-Instruct-v1_0](https://huggingface.co/MediaTek-Research/Breeze-7B-Instruct-v1_0)
## Run with GaiaNet
**Prompt template**
prompt template: `mediatek-breeze`
**Context size**
chat_ctx_size: `8000`
**Run with GaiaNet**
- Quick start: https://docs.gaianet.ai/node-guide/quick-start
- Customize your node: https://docs.gaianet.ai/node-guide/customize
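As a sketch of how the two settings above fit together, a GaiaNet node config fragment might carry them like this. Field names and the model URL here are assumptions based on typical GaiaNet node configs; the linked customize guide is authoritative.

```json
{
  "chat": "https://huggingface.co/gaianet/Breeze-7B-Instruct-v1_0-GGUF",
  "chat_ctx_size": "8000",
  "prompt_template": "mediatek-breeze"
}
```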
*Quantized with llama.cpp b3613*