---
library_name: transformers
license: mit
base_model:
- microsoft/Phi-3-mini-128k-instruct
---

![RocRacoon-3b Banner](https://cdn-uploads.huggingface.co/production/uploads/652c2a63d78452c4742cd3d3/LLeoQZMZ5WDE5iZusC6EB.png)

# RocRacoon-3b 🦝

RocRacoon-3b is a versatile language model designed to excel in creative writing, storytelling, and multi-turn conversations. Built on the Phi-3-mini-128k-instruct model, it has been fine-tuned to enhance its contextual understanding and generate more engaging and coherent responses.

## Model Details πŸ“Š

- **Developed by:** Aixon Lab
- **Model type:** Causal Language Model
- **Language(s):** English (primarily); other languages may be supported to a limited extent
- **License:** MIT
- **Repository:** https://huggingface.co/aixonlab/RocRacoon-3b

## Quantization
- **GGUF:** https://huggingface.co/mradermacher/RocRacoon-3b-GGUF
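
The GGUF files can be run with llama.cpp-compatible tooling. Below is a minimal sketch using llama-cpp-python; the quantization filename is an assumption (Q4_K_M is a common variant), so check the GGUF repository for the files actually published:

```python
from llama_cpp import Llama

# Download a quantized file from the GGUF repo and load it locally.
# The filename glob is a guess - check the repository listing for the
# exact quantization variants available.
llm = Llama.from_pretrained(
    repo_id="mradermacher/RocRacoon-3b-GGUF",
    filename="*Q4_K_M.gguf",
    n_ctx=4096,
)

output = llm("Write a short story about a clever raccoon.", max_tokens=200)
print(output["choices"][0]["text"])
```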

## Model Architecture πŸ—οΈ

- **Base model:** microsoft/Phi-3-mini-128k-instruct
- **Parameter count:** ~3 billion
- **Architecture specifics:** Transformer-based language model
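
To inspect the inherited Phi-3 configuration (depth, width, context length) without downloading the full weights, the config can be loaded on its own. A quick sketch (older transformers releases may require `trust_remote_code=True`):

```python
from transformers import AutoConfig

# Loads only the JSON config, not the model weights.
config = AutoConfig.from_pretrained("aixonlab/RocRacoon-3b")
print(config.model_type)               # expected: "phi3"
print(config.num_hidden_layers)        # transformer depth
print(config.hidden_size)              # embedding width
print(config.max_position_embeddings)  # context window (128k for the base model)
```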

## Intended Use 🎯
RocRacoon-3b is designed for a wide range of natural language processing tasks, with a particular focus on article writing and topic-based multi-turn conversations. It can be used for text generation, dialogue systems, and content creation.
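
For multi-turn conversations, the tokenizer's chat template (inherited from Phi-3) can format the dialogue turns. A minimal sketch; the example messages are illustrative:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("aixonlab/RocRacoon-3b")
model = AutoModelForCausalLM.from_pretrained("aixonlab/RocRacoon-3b")

# Illustrative conversation history; roles alternate user/assistant.
messages = [
    {"role": "user", "content": "Suggest a topic for a short article."},
    {"role": "assistant", "content": "How about urban wildlife adapting to cities?"},
    {"role": "user", "content": "Good. Draft an opening paragraph."},
]

# apply_chat_template renders the turns with the model's chat template
# and appends the assistant prompt so generation continues the dialogue.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=150)

# Decode only the tokens generated after the prompt.
print(tokenizer.decode(output_ids[0, input_ids.shape[-1]:], skip_special_tokens=True))
```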

## Ethical Considerations πŸ€”
As a derivative of the Phi-3-mini model, RocRacoon-3b may inherit some biases and limitations. Users should be aware of potential biases in generated content and use the model responsibly, especially in sensitive contexts.

## Performance and Evaluation
Comprehensive performance metrics for RocRacoon-3b are currently being compiled. Initial tests show improvements in coherence and creativity compared to the base model. Users are encouraged to contribute their findings and benchmarks.

## Limitations and Biases
While efforts have been made to mitigate biases, the model may still exhibit some biases present in its training data. Users should critically evaluate the model's outputs and use them in conjunction with human judgment, particularly for sensitive applications.

## Additional Information
For more details on the base Phi-3-mini-128k-instruct model, please refer to its model card and documentation.

## How to Use
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("aixonlab/RocRacoon-3b")
tokenizer = AutoTokenizer.from_pretrained("aixonlab/RocRacoon-3b")

prompt = "Write a short story about a clever raccoon"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

# generate() returns a batch of sequences; decode the first (and only) one.
generated_ids = model.generate(input_ids, max_new_tokens=200)
generated_text = tokenizer.decode(generated_ids[0], skip_special_tokens=True)
print(generated_text)
```
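
Unless the model's generation_config enables sampling, `generate()` defaults to greedy decoding. For creative writing, enabling sampling (for example `do_sample=True`, `temperature=0.7`, `top_p=0.9`) typically produces more varied output.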