---
datasets:
- HuggingFaceH4/ultrachat_200k
base_model:
- meta-llama/Meta-Llama-3-8B-Instruct
library_name: transformers
---
## meta-llama/Meta-Llama-3-8B-Instruct - W4A16 Compression
This is a W4A16-compressed (4-bit weights, 16-bit activations) version of meta-llama/Meta-Llama-3-8B-Instruct, produced with [llmcompressor](https://github.com/vllm-project/llm-compressor).
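Checkpoints saved by llmcompressor in the compressed-tensors format can typically be served with vLLM. The snippet below is a minimal sketch, assuming a recent vLLM release with the `LLM.chat` API; the model id is a placeholder for wherever this checkpoint is hosted.
```python
# Minimal inference sketch (assumes a recent vLLM release with LLM.chat).
# Replace the placeholder id with the actual location of this checkpoint.
from vllm import LLM, SamplingParams

llm = LLM(model="<this-repo-id>")  # placeholder: the W4A16 checkpoint
params = SamplingParams(temperature=0.0, max_tokens=128)

outputs = llm.chat(
    [{"role": "user", "content": "Who is Alan Turing?"}],
    sampling_params=params,
)
print(outputs[0].outputs[0].text)
```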
## Compression Configuration
- Base Model: meta-llama/Meta-Llama-3-8B-Instruct
- Compression Scheme: W4A16
- Dataset: HuggingFaceH4/ultrachat_200k
- Dataset Split: train_sft
- Number of Samples: 512
- Preprocessor: chat
- Maximum Sequence Length: 8192
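For reference, the settings above map onto llmcompressor's one-shot workflow. The sketch below follows the library's standard W4A16 example and assumes a GPTQ-based recipe (`GPTQModifier`) together with chat-template preprocessing; the exact recipe used to build this checkpoint may differ.
```python
# Reproduction sketch, not the exact script used for this checkpoint.
# Assumes llmcompressor's GPTQ-based W4A16 recipe and chat-template preprocessing.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

from llmcompressor.modifiers.quantization import GPTQModifier
from llmcompressor.transformers import oneshot

MODEL_ID = "meta-llama/Meta-Llama-3-8B-Instruct"
NUM_CALIBRATION_SAMPLES = 512
MAX_SEQUENCE_LENGTH = 8192

model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID, device_map="auto", torch_dtype="auto"
)
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)

# Calibration data: 512 samples from the train_sft split of ultrachat_200k,
# rendered through the model's chat template.
ds = load_dataset("HuggingFaceH4/ultrachat_200k", split="train_sft")
ds = ds.shuffle(seed=42).select(range(NUM_CALIBRATION_SAMPLES))

def preprocess(example):
    return {"text": tokenizer.apply_chat_template(example["messages"], tokenize=False)}

ds = ds.map(preprocess)

def tokenize(sample):
    return tokenizer(
        sample["text"],
        padding=False,
        max_length=MAX_SEQUENCE_LENGTH,
        truncation=True,
        add_special_tokens=False,
    )

ds = ds.map(tokenize, remove_columns=ds.column_names)

# Quantize linear weights to 4 bits, keep activations at 16 bits, skip lm_head.
recipe = GPTQModifier(targets="Linear", scheme="W4A16", ignore=["lm_head"])

oneshot(
    model=model,
    dataset=ds,
    recipe=recipe,
    max_seq_length=MAX_SEQUENCE_LENGTH,
    num_calibration_samples=NUM_CALIBRATION_SAMPLES,
)

model.save_pretrained("Meta-Llama-3-8B-Instruct-W4A16", save_compressed=True)
tokenizer.save_pretrained("Meta-Llama-3-8B-Instruct-W4A16")
```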
## Sample Output
#### Prompt:
```
<|begin_of_text|><|start_header_id|>user<|end_header_id|>
Who is Alan Turing?<|eot_id|>
```
#### Output:
```
<|begin_of_text|><|begin_of_text|><|start_header_id|>user<|end_header_id|>
Who is Alan Turing?<|eot_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|><|start_header_id|> | |
```
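The special tokens in the prompt above come from the base model's chat template. The following illustrative snippet shows how such a prompt string can be constructed; it is not part of the compression pipeline.
```python
# Illustrative: building a Llama 3 style prompt like the one shown above.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B-Instruct")
messages = [{"role": "user", "content": "Who is Alan Turing?"}]

prompt = tokenizer.apply_chat_template(messages, tokenize=False)
print(prompt)
# Expected to contain <|begin_of_text|>, the user header tokens, and <|eot_id|>,
# matching the prompt block above.
```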
## Evaluation
<TODO>