Prompting OLMo
#8 opened by herambpatil2004
Somehow, I am not able to get good responses when I run inference using the code given in the model card. Is there anything I can improve?
Hey @herambpatil2004, this is a base model, not a chat-tuned one, so it is not expected to respond well to chat-style prompting.
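In case it helps, here is a minimal completion-style sketch, assuming a transformers version with native OLMo support and the `allenai/OLMo-7B-hf` checkpoint (use whatever model id the model card actually gives). A base model continues text rather than answering instructions, so phrasing the input as a passage to complete usually works better than a chat-style question.

```python
# Minimal completion-style sketch; the model id and generation settings are
# illustrative assumptions, not the model card's exact code.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-7B-hf"  # assumed checkpoint name; check the model card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# A base model is a text completer: give it the start of the text you want.
prompt = "Language models are trained on large corpora of text in order to"
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.95)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```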
soldni changed discussion status to closed
This is a base model, but you could try a prompting approach such as URIAL (https://github.com/Re-Align/URIAL?tab=readme-ov-file#content) to get it to chat with you for now, @herambpatil2004.
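For reference, a URIAL-style prompt has roughly the shape sketched below: a short preamble plus a restyled in-context example, followed by the user's query, all sent to the base model as one plain-text string. The exact prompts are in the linked repo; this example is only illustrative, and the model id is an assumption.

```python
# Illustrative URIAL-style prompt: the real prompts live in the URIAL repo
# linked above; this only shows the overall shape (preamble + restyled
# in-context example + the user's query, fed to the base model as one string).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-7B-hf"  # assumed checkpoint name; check the model card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

urial_style_prompt = """Below is a conversation between a curious user and a helpful, honest AI assistant.

# User:
What is the capital of France?

# Assistant:
The capital of France is Paris. It is the country's largest city and has been its capital since the Middle Ages.

# User:
{query}

# Assistant:
"""

prompt = urial_style_prompt.format(query="Can you explain what a base language model is?")
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.95)
# Print only the newly generated continuation (the model's "assistant" turn).
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```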