prompt template

#1
by ThiloteE - opened

Prompt template:

I have had success with the following template using GPT4All 3.0:

<|endoftext|>Instruct: %1<|endoftext|>
<|endoftext|>Output: %2<|endoftext|>

Please put stuff like that into the model card if you want people to use your models. You can find info about the prompt template in the tokenizer_config.json of the original model (which is phi-2 from Microsoft). While you are usually nice and respond when I ask you something, I have lost a little bit of hope that you are ever going to provide better readme files along with your models T.T
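
For reference, here is a minimal sketch of how a template like that could be applied outside of GPT4All, using transformers and the plain phi-2 "Instruct:/Output:" format described on the original model card. The model id and generation settings below are placeholders, not the finetune's actual configuration; adjust them to whatever the finetune actually used.

```python
# Minimal sketch: fill the GPT4All-style placeholders (%1 = user prompt) into a
# phi-2 style "Instruct:/Output:" prompt and generate with transformers.
# Assumes the base microsoft/phi-2 repo; swap in the finetuned model's id as needed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-2"  # placeholder; replace with the finetuned model's repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

template = "Instruct: {user}\nOutput:"  # phi-2's QA/instruct format per its model card
prompt = template.format(user="Explain what a prompt template is in one sentence.")

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=128,                      # keeps responses from running too long
    eos_token_id=tokenizer.eos_token_id,     # <|endoftext|> for phi-2
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

You can also print tokenizer.special_tokens_map to confirm which special tokens (such as <|endoftext|>) the tokenizer_config.json actually defines.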

Feedback about the model:

The model is VERBOSE. It produces super long responses, and not of the best quality, I have to say, especially if you do not give an instruction but ask a question instead. Maybe it is possible to play around with the prompt template some more, but ultimately it depends on what you used during the finetune and what was in the dataset, and only you know that.

I'm sorry for the late response! I've been pulling 12-hour shifts at Little Caesars whilst keeping this project going, as I see potential in using it for programming work in the future. The system prompt is indeed a standard phi-2 system prompt.
