QAT

#1
by rezzie-rich - opened

First, I want to congratulate you on such a huge accomplishment. This is surely a historic moment for the open source community. Thank you.

I do have a question, though: are the 2.5 series models trained using Quantization-Aware Training (QAT) for FP8, like mistral-nemo?
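For context on what QAT means here: during training, the forward pass "fake-quantizes" the weights (rounding them to the low-precision grid they will be stored in), while gradients still flow through the full-precision copies. This is a minimal illustrative sketch only; the rounding grid and scale below are toy values, not the actual FP8 format or any vendor's implementation.

```python
# Toy sketch of the core QAT idea: quantize weights in the forward pass
# so the model learns to tolerate the precision loss of deployment.
# `scale` is an arbitrary illustrative grid step, not a real FP8 parameter.

def fake_quantize(w, scale=0.1):
    """Round w to the nearest multiple of `scale`, simulating low-precision storage."""
    return round(w / scale) * scale

def forward(weights, x):
    """Linear layer computed with fake-quantized weights, as a QAT forward pass would."""
    qw = [fake_quantize(w) for w in weights]
    return sum(wi * xi for wi, xi in zip(qw, x))

weights = [0.123, -0.456, 0.789]   # full-precision "master" weights
x = [1.0, 1.0, 1.0]
print(forward(weights, x))
```

In real QAT (e.g. with a straight-through estimator), the backward pass treats the rounding as identity, so the optimizer keeps updating the full-precision weights while the loss reflects quantized behavior.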
