
What does "recommended for better performance" mean?

#1
opened by mstachow

The model card for this model says that of all the flan-alpaca-* models, the gpt4-xl is recommended for better performance. Is that referring to computational performance or accuracy or something else?

Deep Cognition and Language Research (DeCLaRe) Lab org

Hi, since it was trained on GPT-4-generated data (as opposed to GPT-3- or ChatGPT-generated data), the model will likely generate higher-quality responses in most cases.

@chiayewken Interesting... compared to which model? I am finding that the 11B XXL version works better for my specific task, but then again it has 8B more parameters than the 3B model. Does that mean "among the 3B models, this one should perform best"?
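For a quick, task-specific comparison between variants, a minimal sketch like the one below can help. It uses the `text2text-generation` pipeline from `transformers`; the repo IDs and the prompt are assumptions based on the `declare-lab/flan-alpaca-*` naming, so adjust them to the exact checkpoints you are testing:

```python
from transformers import pipeline

# Assumed repo IDs following the declare-lab/flan-alpaca-* naming scheme.
MODELS = [
    "declare-lab/flan-alpaca-gpt4-xl",  # 3B, trained on GPT-4-generated data
    "declare-lab/flan-alpaca-xxl",      # 11B, trained on GPT-3/ChatGPT-generated data
]

# Illustrative prompt; swap in examples from your own task.
PROMPT = "Explain the difference between fission and fusion in two sentences."

for model_id in MODELS:
    # These checkpoints are T5-based, so they use the text2text-generation pipeline.
    generator = pipeline("text2text-generation", model=model_id)
    output = generator(PROMPT, max_length=128, do_sample=False)
    print(f"--- {model_id} ---")
    print(output[0]["generated_text"])
```

Greedy decoding (`do_sample=False`) keeps the comparison deterministic, so any difference you see comes from the checkpoints rather than sampling noise.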
