munish0838 committed
Commit 0d4b5b5
1 Parent(s): 361a68f

Create README.md

Files changed (1)
  1. README.md +67 -0
README.md ADDED
@@ -0,0 +1,67 @@
---
license: other
license_name: deepseek
license_link: LICENSE
base_model: deepseek-ai/deepseek-coder-6.7b-instruct
pipeline_tag: text-generation
---

# QuantFactory/deepseek-coder-6.7b-instruct-GGUF
This is a quantized version of [deepseek-ai/deepseek-coder-6.7b-instruct](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-instruct) created using llama.cpp.
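
A minimal, hedged sketch of loading a GGUF quant such as this one with `llama-cpp-python` (this is not part of the upstream model card; the `.gguf` filename and quantization level below are placeholders for whichever file you download from this repo):

```python
# Sketch: run the GGUF quant with llama-cpp-python (pip install llama-cpp-python).
# The filename below is a placeholder; point it at the actual .gguf file from this repo.
from llama_cpp import Llama

llm = Llama(
    model_path="deepseek-coder-6.7b-instruct.Q4_K_M.gguf",  # hypothetical filename
    n_ctx=4096,        # context length; the base model was trained with a 16K window
    n_gpu_layers=-1,   # offload all layers to GPU when a GPU backend is available
)

# Newer GGUF files carry a chat template in their metadata; older llama-cpp-python
# versions may need an explicit chat_format argument instead.
out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "write a quick sort algorithm in python."}],
    max_tokens=512,
)
print(out["choices"][0]["message"]["content"])
```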

# Model Description

<p align="center">
<img width="1000px" alt="DeepSeek Coder" src="https://github.com/deepseek-ai/DeepSeek-Coder/blob/main/pictures/logo.png?raw=true">
</p>
<p align="center"><a href="https://www.deepseek.com/">[🏠Homepage]</a> | <a href="https://coder.deepseek.com/">[🤖 Chat with DeepSeek Coder]</a> | <a href="https://discord.gg/Tc7c45Zzu5">[Discord]</a> | <a href="https://github.com/guoday/assert/blob/main/QR.png?raw=true">[Wechat(微信)]</a> </p>
<hr>

### 1. Introduction to Deepseek Coder

Deepseek Coder is composed of a series of code language models, each trained from scratch on 2T tokens, with a composition of 87% code and 13% natural language in both English and Chinese. We provide various sizes of the code model, ranging from 1B to 33B versions. Each model is pre-trained on a project-level code corpus with a window size of 16K and an extra fill-in-the-blank task, to support project-level code completion and infilling. For coding capabilities, Deepseek Coder achieves state-of-the-art performance among open-source code models on multiple programming languages and various benchmarks.

- **Massive Training Data**: Trained from scratch on 2T tokens, including 87% code and 13% linguistic data in both English and Chinese.

- **Highly Flexible & Scalable**: Offered in model sizes of 1.3B, 5.7B, 6.7B, and 33B, enabling users to choose the setup most suitable for their requirements.

- **Superior Model Performance**: State-of-the-art performance among publicly available code models on HumanEval, MultiPL-E, MBPP, DS-1000, and APPS benchmarks.

- **Advanced Code Completion Capabilities**: A window size of 16K and a fill-in-the-blank task, supporting project-level code completion and infilling tasks (see the hedged prompt-format sketch after this list).
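
Fill-in-the-middle prompting is a capability of the *base* models rather than this instruct quant; purely as an illustration, the sketch below follows the prompt format documented on the deepseek-coder base model card, assuming the sentinel tokens `<｜fim▁begin｜>`, `<｜fim▁hole｜>`, and `<｜fim▁end｜>` (verify these against the tokenizer's special tokens before relying on them):

```python
# Illustration of code infilling (fill-in-the-middle), a base-model capability.
# The sentinel tokens follow the deepseek-coder base model card; confirm them
# against the tokenizer's special tokens before use.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("deepseek-ai/deepseek-coder-6.7b-base", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("deepseek-ai/deepseek-coder-6.7b-base", trust_remote_code=True, torch_dtype=torch.bfloat16).cuda()

# The code before the hole is the prefix, the code after it is the suffix;
# the model generates the missing middle (here, the partition loop header).
input_text = """<｜fim▁begin｜>def quick_sort(arr):
    if len(arr) <= 1:
        return arr
    pivot = arr[0]
    left = []
    right = []
<｜fim▁hole｜>
        if arr[i] < pivot:
            left.append(arr[i])
        else:
            right.append(arr[i])
    return quick_sort(left) + [pivot] + quick_sort(right)<｜fim▁end｜>"""

inputs = tokenizer(input_text, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
# Print only the generated middle, not the echoed prefix/suffix prompt.
print(tokenizer.decode(outputs[0][len(inputs["input_ids"][0]):], skip_special_tokens=True))
```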

### 2. Model Summary
deepseek-coder-6.7b-instruct is a 6.7B parameter model initialized from deepseek-coder-6.7b-base and fine-tuned on 2B tokens of instruction data.
- **Home Page:** [DeepSeek](https://deepseek.com/)
- **Repository:** [deepseek-ai/deepseek-coder](https://github.com/deepseek-ai/deepseek-coder)
- **Chat With DeepSeek Coder:** [DeepSeek-Coder](https://coder.deepseek.com/)

### 3. How to Use
Here are some examples of how to use our model.
#### Chat Model Inference
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("deepseek-ai/deepseek-coder-6.7b-instruct", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("deepseek-ai/deepseek-coder-6.7b-instruct", trust_remote_code=True, torch_dtype=torch.bfloat16).cuda()

messages = [
    {'role': 'user', 'content': "write a quick sort algorithm in python."}
]
# apply_chat_template formats the conversation with the model's chat template.
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)
# tokenizer.eos_token_id is the id of the <|EOT|> token
outputs = model.generate(inputs, max_new_tokens=512, do_sample=False, top_k=50, top_p=0.95, num_return_sequences=1, eos_token_id=tokenizer.eos_token_id)
# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(outputs[0][len(inputs[0]):], skip_special_tokens=True))
```

### 4. License
This code repository is licensed under the MIT License. The use of DeepSeek Coder models is subject to the Model License. DeepSeek Coder supports commercial use.

See the [LICENSE-MODEL](https://github.com/deepseek-ai/deepseek-coder/blob/main/LICENSE-MODEL) for more details.

### 5. Contact

If you have any questions, please raise an issue or contact us at [[email protected]](mailto:[email protected]).