---
datasets:
- HuggingFaceH4/ultrachat_200k
base_model:
- meta-llama/Llama-3.2-3B
library_name: transformers
tags:
- deltazip
---

## meta-llama/Llama-3.2-3B - 4b_2n4m_128bs Compression

This is a compressed model produced with [deltazip](https://github.com/eth-easl/deltazip).

[Paper](https://arxiv.org/abs/2312.05215), [Compression Tool](https://github.com/eth-easl/deltazip), [Inference Engine (Soon)](https://github.com/eth-easl/deltazip).

## Compression Configuration

- Base Model: meta-llama/Llama-3.2-3B
- Compression Scheme: 4b_2n4m_128bs
- Dataset: HuggingFaceH4/ultrachat_200k
- Dataset Split: train_sft
- Max Sequence Length: 2048
- Number of Samples: 256

## Sample Output

#### Prompt:

```
[{'role': 'user', 'content': 'Who is Alan Turing?'}]
```

#### Output:

```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>

Cutting Knowledge Date: December 2023
Today Date: 05 Dec 2024

<|eot_id|><|start_header_id|>user<|end_header_id|>

Who is Alan Turing?<|eot_id|><|start_header_id|>assistant<|end_header_id|>

Alan Turing (1912-1954) was a British mathematician, computer scientist, logician, and philosopher. He is widely regarded as one of the most influential figures in the development of computer science, artificial intelligence, and cryptography.

Turing was born in London, England, and studied mathematics at King's College, Cambridge. During World War II, he worked at the Government Code and Cypher School (GC&CS) at
```

## Evaluation
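As a side note on the compression scheme listed above: the name `4b_2n4m_128bs` plausibly reads as 4-bit quantization, 2:4 (2 nonzero out of every 4) structured sparsity, and a 128-element block size. This interpretation is an assumption, not documented here. Under that assumption, the 2:4 sparsity pattern can be sketched with NumPy by keeping the two largest-magnitude weights in every group of four:

```python
import numpy as np

def apply_2to4_sparsity(w: np.ndarray) -> np.ndarray:
    """Keep the 2 largest-magnitude entries in each group of 4, zero the rest.

    Illustrative sketch of a 2:4 structured-sparsity mask; not the actual
    deltazip implementation. Assumes w's size is a multiple of 4.
    """
    groups = w.reshape(-1, 4)
    # Indices of the 2 smallest-magnitude entries per group of 4.
    drop = np.argsort(np.abs(groups), axis=1)[:, :2]
    out = groups.copy()
    np.put_along_axis(out, drop, 0.0, axis=1)
    return out.reshape(w.shape)

w = np.array([[0.9, -0.1, 0.05, -0.8],
              [0.2,  0.3, -0.4,  0.1]])
print(apply_2to4_sparsity(w))
# Each row of 4 retains its two largest-magnitude values:
# [[ 0.9  0.   0.  -0.8]
#  [ 0.   0.3 -0.4  0. ]]
```

In hardware that supports 2:4 sparsity (e.g. NVIDIA sparse tensor cores), this fixed pattern is what allows the pruned weights to be skipped at inference time; the 4-bit quantization and 128-element block size would then apply to the surviving values.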