---
license: llama2
datasets:
- digitalpipelines/wizard_vicuna_70k_uncensored
tags:
- uncensored
- wizard
- vicuna
- llama
---


<!-- header start -->
<div style="width: 100%;">
    <img src="https://digitalpipelines.net/assets/images/logos/dp_logo_transparent.png" alt="Digital Pipelines" style="width: 10%; min-width: 200px; display: block; margin: auto;">
</div>
<!-- header end -->

# Overview
A fine-tune of [Llama-2 13B](https://huggingface.co/TheBloke/Llama-2-13B-Chat-fp16) trained on the uncensored/unfiltered Wizard-Vicuna conversation dataset [digitalpipelines/wizard_vicuna_70k_uncensored](https://huggingface.co/datasets/digitalpipelines/wizard_vicuna_70k_uncensored).
Fine-tuning was performed with QLoRA, and the resulting adapter was merged back into the base model. Note that Llama 2 retains some inherited bias even though it was fine-tuned on an uncensored dataset.


## Available versions of this model
* [GPTQ model for GPU usage, with multiple quantisation options available.](https://huggingface.co/digitalpipelines/llama2_13b_chat_uncensored-GPTQ)
* [GGML models in various quantisation sizes for CPU/GPU/Apple M1 usage.](https://huggingface.co/digitalpipelines/llama2_13b_chat_uncensored-GGML)
* [Original unquantised model](https://huggingface.co/digitalpipelines/llama2_13b_chat_uncensored)


## Prompt template: Llama-2-Chat

```
SYSTEM: You are a helpful, respectful and honest assistant. Always answer as helpfully as possible, while being safe.  Your answers should not include any harmful, unethical, racist, sexist, toxic, dangerous, or illegal content. Please ensure that your responses are socially unbiased and positive in nature. If a question does not make any sense, or is not factually coherent, explain why instead of answering something not correct. If you don't know the answer to a question, please don't share false information.
USER: {prompt}
ASSISTANT:
```
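As a minimal sketch, the template above can be assembled programmatically before being passed to your inference code. The `build_prompt` helper below is a hypothetical illustration, not part of this repository; the system prompt text is taken verbatim from the template.

```python
# Hypothetical helper: formats a single-turn prompt in the
# SYSTEM/USER/ASSISTANT layout shown in the template above.

DEFAULT_SYSTEM_PROMPT = (
    "You are a helpful, respectful and honest assistant. Always answer as "
    "helpfully as possible, while being safe. Your answers should not include "
    "any harmful, unethical, racist, sexist, toxic, dangerous, or illegal "
    "content. Please ensure that your responses are socially unbiased and "
    "positive in nature. If a question does not make any sense, or is not "
    "factually coherent, explain why instead of answering something not "
    "correct. If you don't know the answer to a question, please don't share "
    "false information."
)

def build_prompt(user_message: str, system_prompt: str = DEFAULT_SYSTEM_PROMPT) -> str:
    """Return the full prompt string; generation continues after 'ASSISTANT:'."""
    return f"SYSTEM: {system_prompt}\nUSER: {user_message}\nASSISTANT:"

print(build_prompt("What is the capital of France?"))
```

The model's completion is then generated as the continuation of the trailing `ASSISTANT:` marker.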