aashish1904 committed
Commit 5ca29f2
1 Parent(s): 418de7e

Upload README.md with huggingface_hub

---
license: apache-2.0
pipeline_tag: text-generation
tags:
- chat
- mistral
- roleplay
- creative-writing
base_model:
- nbeerbower/mistral-nemo-bophades-12B
- anthracite-org/magnum-v2-12b
- Sao10K/MN-12B-Lyra-v3
- Gryphe/Pantheon-RP-1.6-12b-Nemo
language:
- en
---

![](https://lh7-rt.googleusercontent.com/docsz/AD_4nXeiuCm7c8lEwEJuRey9kiVZsRn2W-b4pWlu3-X534V3YmVuVc2ZL-NXg2RkzSOOS2JXGHutDuyyNAUtdJI65jGTo8jT9Y99tMi4H4MqL44Uc5QKG77B0d6-JfIkZHFaUA71-RtjyYZWVIhqsNZcx8-OMaA?key=xt3VSDoCbmTY7o-cwwOFwQ)

# QuantFactory/StarDust-12b-v2-GGUF
This is a quantized version of [Luni/StarDust-12b-v2](https://huggingface.co/Luni/StarDust-12b-v2), created using llama.cpp.

# Original Model Card

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6303fa71fc783bfc7443e7ae/c3ddWBoz-lINEykUDCoXy.png)

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6303fa71fc783bfc7443e7ae/hOpgDxJS2sDO7HzuC9e18.png)

# StarDust-12b-v2

## Quants

- GGUF: [mradermacher/StarDust-12b-v2-GGUF](https://huggingface.co/mradermacher/StarDust-12b-v2-GGUF)
- weighted/imatrix GGUF: [mradermacher/StarDust-12b-v2-i1-GGUF](https://huggingface.co/mradermacher/StarDust-12b-v2-i1-GGUF/tree/main)
- exl2: [lucyknada/Luni_StarDust-12b-v2-exl2](https://huggingface.co/lucyknada/Luni_StarDust-12b-v2-exl2)

## Description | Usecase

- In my opinion, the result of this merge is more vibrant and less generic sonnet-inspired prose; it can be gentle or harsh where asked.
- v2 uses the non-KTO magnum, which tends to show less "claudeism" (which can make a story feel rather repetitive).
- Note on non-KTO: opinions are sharply split between people who prefer and who dislike the KTO. To make things easier, you can still use [Luni/StarDust-12b-v1](https://huggingface.co/Luni/StarDust-12b-v1), which has the KTO version.
- In early testing, users have reported a much better experience in longer roleplays and an ability to add a creative touch to the stable experience.

Just like with v1:
- This model is intended to be used as a roleplaying model.
- Its direct conversational output is... it's simply not made for it.
- Extension to conversational output: the model is designed for roleplay; direct instruction or general-purpose use is NOT recommended.

## Prompting

### Edit: ChatML has proven to be the BEST choice.

Both Mistral and ChatML should work, though I had better results with ChatML.
ChatML example:
```py
"""<|im_start|>user
Hi there!<|im_end|>
<|im_start|>assistant
Nice to meet you!<|im_end|>
<|im_start|>user
Can I ask a question?<|im_end|>
<|im_start|>assistant
"""
```
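If you are assembling ChatML prompts by hand, the layout above can be produced with a small helper. This is a sketch only, not part of the model card or any library API; `to_chatml` is a hypothetical name:

```python
# Hypothetical helper: builds a ChatML prompt string (as in the example
# above) from a list of (role, content) message pairs.
def to_chatml(messages, add_generation_prompt=True):
    parts = []
    for role, content in messages:
        parts.append(f"<|im_start|>{role}\n{content}<|im_end|>")
    if add_generation_prompt:
        # Leave an open assistant turn for the model to complete.
        parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = to_chatml([
    ("user", "Hi there!"),
    ("assistant", "Nice to meet you!"),
    ("user", "Can I ask a question?"),
])
print(prompt)
```

Most inference frontends (SillyTavern, llama.cpp chat templates) apply this formatting for you when you select the ChatML preset.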

## Merge Details
### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [Sao10K/MN-12B-Lyra-v3](https://huggingface.co/Sao10K/MN-12B-Lyra-v3) as the base.

### Models Merged

The following models were included in the merge:
* [nbeerbower/mistral-nemo-bophades-12B](https://huggingface.co/nbeerbower/mistral-nemo-bophades-12B)
* [anthracite-org/magnum-v2-12b](https://huggingface.co/anthracite-org/magnum-v2-12b)
* [Gryphe/Pantheon-RP-1.6-12b-Nemo](https://huggingface.co/Gryphe/Pantheon-RP-1.6-12b-Nemo)
* [Sao10K/MN-12B-Lyra-v3](https://huggingface.co/Sao10K/MN-12B-Lyra-v3)

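For reference, a DARE-TIES merge of these models would typically be expressed as a mergekit config along these lines. This is a sketch only: the card does not publish the actual merge parameters, so the `density` and `weight` values below are placeholders, not the author's settings.

```yaml
# Hypothetical mergekit config; parameter values are placeholders.
merge_method: dare_ties
base_model: Sao10K/MN-12B-Lyra-v3
models:
  - model: nbeerbower/mistral-nemo-bophades-12B
    parameters:
      density: 0.5
      weight: 0.25
  - model: anthracite-org/magnum-v2-12b
    parameters:
      density: 0.5
      weight: 0.25
  - model: Gryphe/Pantheon-RP-1.6-12b-Nemo
    parameters:
      density: 0.5
      weight: 0.25
dtype: bfloat16
```
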
### Special Thanks

Special thanks to the SillyTilly, and to myself, for helping me find the energy to finish this.