---
language:
- en
license: apache-2.0
tags:
- autoquant
- fp8
---

# Llama-3.3-70B-Instruct-abliterated-FP8-Dynamic

This is a quantized version of [thisnick/Llama-3.3-70B-Instruct-abliterated](https://huggingface.co/thisnick/Llama-3.3-70B-Instruct-abliterated) using dynamic FP8 quantization.