
Quantizations of https://huggingface.co/microsoft/phi-2

From the original readme:

How to Use

Phi-2 has been integrated into transformers version 4.37. If you need to use an earlier version, pass trust_remote_code=True to the from_pretrained() function.
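
For example, a minimal loading sketch (assuming PyTorch 2.x, a recent transformers release, and a CUDA device; the exact arguments are illustrative rather than prescribed by this card):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

torch.set_default_device("cuda")  # optional; remove to run on CPU

model = AutoModelForCausalLM.from_pretrained("microsoft/phi-2", torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained("microsoft/phi-2")

# On transformers < 4.37, add trust_remote_code=True to both from_pretrained() calls.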

Phi-2 is known to have an attention overflow issue when run in FP16. If you are facing this issue, enable/disable autocast on the PhiAttention.forward() function.
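
The exact fix is to toggle autocast inside PhiAttention.forward(); as a rough, illustrative alternative (not the patch the sentence above describes), you can disable autocast around the whole generation call, reusing the `model` and `tokenizer` names from the loading sketch above:

import torch

# Arbitrary example prompt; any prompt works here.
inputs = tokenizer("Instruct: Write a short poem about the sea.\nOutput:", return_tensors="pt").to(model.device)

# Illustrative only: run generation with CUDA autocast disabled for every layer,
# not just PhiAttention.forward() as suggested above.
with torch.autocast("cuda", enabled=False):
    output_ids = model.generate(**inputs, max_new_tokens=100)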

Intended Uses

Given the nature of the training data, the Phi-2 model is best suited for prompts using the QA format, the chat format, and the code format.

QA Format:

You can provide the prompt as a standalone question as follows:

Write a detailed analogy between mathematics and a lighthouse.

where the model generates the text after ".". To encourage the model to write more concise answers, you can also try the following QA format using "Instruct: <prompt>\nOutput:":

Instruct: Write a detailed analogy between mathematics and a lighthouse.
Output: Mathematics is like a lighthouse. Just as a lighthouse guides ships safely to shore, mathematics provides a guiding light in the world of numbers and logic. It helps us navigate through complex problems and find solutions. Just as a lighthouse emits a steady beam of light, mathematics provides a consistent framework for reasoning and problem-solving. It illuminates the path to understanding and helps us make sense of the world around us.

where the model generates the text after "Output:".
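
A minimal sketch of running this QA prompt, reusing the `model` and `tokenizer` names assumed in the loading sketch above:

prompt = "Instruct: Write a detailed analogy between mathematics and a lighthouse.\nOutput:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=200)

# Decode only the newly generated tokens, i.e. the text after "Output:".
completion = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)
print(completion)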

Chat Format:

Alice: I don't know why, I'm struggling to maintain focus while studying. Any suggestions?
Bob: Well, have you tried creating a study schedule and sticking to it?
Alice: Yes, I have, but it doesn't seem to help much.
Bob: Hmm, maybe you should try studying in a quiet environment, like the library.
Alice: ...

where the model generates the text after the first "Bob:".
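
The chat format is run the same way; a sketch (same assumed `model`/`tokenizer` names) that keeps only the first generated "Bob:" turn, since the model may keep writing further turns of the dialogue:

chat_prompt = (
    "Alice: I don't know why, I'm struggling to maintain focus while studying. Any suggestions?\n"
    "Bob:"
)
inputs = tokenizer(chat_prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=100)
completion = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)

# Keep only the first reply, dropping any further "Alice:"/"Bob:" turns the model invents.
print(completion.split("\nAlice:")[0].strip())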

Code Format:

import math  # required for math.sqrt below

def print_prime(n):
    """
    Print all primes between 1 and n
    """
    primes = []
    for num in range(2, n + 1):
        # Trial division: num is prime if no integer in [2, sqrt(num)] divides it.
        is_prime = True
        for i in range(2, int(math.sqrt(num)) + 1):
            if num % i == 0:
                is_prime = False
                break
        if is_prime:
            primes.append(num)
    print(primes)

where the model generates the text after the comments.

GGUF
Model size: 2.78B params
Architecture: phi2
Quantizations provided: 1-bit, 2-bit, 3-bit, 4-bit, 5-bit, 6-bit, 8-bit
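
These GGUF files are intended for llama.cpp-compatible runtimes. A minimal sketch using llama-cpp-python; the filename below is hypothetical, so substitute one of the quantized files actually listed in this repository:

from llama_cpp import Llama

# "phi-2.Q4_K_M.gguf" is a placeholder name; pick a real file from this repository.
llm = Llama(model_path="phi-2.Q4_K_M.gguf", n_ctx=2048)

out = llm(
    "Instruct: Write a detailed analogy between mathematics and a lighthouse.\nOutput:",
    max_tokens=200,
)
print(out["choices"][0]["text"])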
