Question answering format
Hello,
I want to know if this model is capable of Question Answering, and what format should I give it?
Does it also require context?
Thanks
No, it is not trained for a Q&A format. It is not an instruct- or chat-fine-tuned version; it is a base model. But you can still ask questions.
You can use this format -
What is the capital of Germany - Berlin
What is the capital of India - New Delhi
What is the capital of Egypt -
Then it'll answer Cairo.
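The few-shot format above can be sketched as plain prompt construction. This is a minimal sketch, not an official API: the example questions come from this thread, and the incomplete final line is what cues a base model to continue with the answer.

```python
# Few-shot prompt for a base (non-instruct) model: each line pairs a
# question with its answer, and the last line is left incomplete so
# the model completes it with the answer.
examples = [
    ("What is the capital of Germany", "Berlin"),
    ("What is the capital of India", "New Delhi"),
]
question = "What is the capital of Egypt"

prompt = "\n".join(f"{q} - {a}" for q, a in examples)
prompt += f"\n{question} - "
print(prompt)
```

Feeding `prompt` to the model should make it generate the answer as the next tokens; stopping at the first newline keeps it from inventing further question-answer pairs.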
And it needs context for most questions. Since it was trained on only 30B tokens of data, it doesn't know everything. If you ask "What is the capital of France", it might say Paris, and it can handle other questions at that level, but if you ask complex questions of the kind you would usually ask GPT-4, Google Bard, Bing Chat, or ChatGPT (GPT-3.5), it might not be able to answer them.
Moreover, it has only a 2048-token context length, so you cannot ask long questions and expect lengthy answers.
I'm running it on Google Colab.
Can we get a list of the libraries being used?
What do you mean? If you are just using the model, PyTorch and Transformers should let you run inference.
Hello @codegood! As vriveras previously mentioned, this model has not been instruction fine-tuned or reinforced with human feedback. Thus, it is better suited to generating completions than to chatting, although it can produce some reasonable chat depending on the prompt you use.
The libraries used to load the model are torch, transformers, and einops. As long as you have them, you should be able to load the model and run inference.
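A minimal loading-and-inference sketch using those libraries might look like the following. The model id is a placeholder (the thread does not name the repo), and `trust_remote_code=True` is an assumption for custom architectures that pull in einops; adjust both for the actual model.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer


def answer(prompt: str, model_id: str = "model-id-here") -> str:
    """Generate a completion for `prompt`. `model_id` is a placeholder."""
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,
        trust_remote_code=True,  # custom model code may require einops
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    # The 2048-token context is shared between prompt and output,
    # so keep max_new_tokens small for long prompts.
    output = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

On Colab, a GPU runtime plus `pip install torch transformers einops` should be enough to run this.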