---
language:
  - en
tags:
  - tag1
  - tag2
license: any valid license identifier
base_model: meta-llama/Llama-2-70b-hf
---

4.65bpw exl2 quant of ausboss/SuperCOT-70B. Calibration was done using wikitext; the resulting measurements.json file is included in this repo.
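As a sketch of how the included measurements.json can be reused (assuming exllamav2's `convert.py` script and its usual flags; paths here are placeholders, not from this repo), you can re-quantize at another bitrate without repeating the calibration pass:

```shell
# Sketch: quantize with exllamav2, reusing this repo's measurement file.
# All paths are hypothetical; -m skips the slow calibration step.
python convert.py \
    -i /path/to/SuperCOT-70B \
    -o /tmp/exl2-work \
    -cf /path/to/SuperCOT-70B-4.65bpw-exl2 \
    -m measurements.json \
    -b 4.65
```

Reusing the measurement file only works for the same base model; a different model needs its own calibration pass.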

Original model card below:

Special thanks to Alpin, Tav, and the rest of the Pygmalion peeps involved in training this one. It's trained on the SuperCOT dataset, like my other QLoRAs and models. I'll update the card with more info soon.

Might be a bit overbaked 🧑‍🍳🔥