munish0838 committed on
Commit
b888668
1 Parent(s): 0599ad1

Create README.md

Files changed (1)
  1. README.md +108 -0
README.md ADDED
@@ -0,0 +1,108 @@
+ ---
+ library_name: transformers
+ language:
+ - en
+ - fr
+ - de
+ - es
+ - it
+ - pt
+ - ja
+ - ko
+ - zh
+ - ar
+ - el
+ - fa
+ - pl
+ - id
+ - cs
+ - he
+ - hi
+ - nl
+ - ro
+ - ru
+ - tr
+ - uk
+ - vi
+ license: cc-by-nc-4.0
+ pipeline_tag: text-generation
+ tags:
+ - cohere
+ ---
+
+ # Aya-23-8B-GGUF
+ - This is a quantized version of [CohereForAI/aya-23-8B](https://huggingface.co/CohereForAI/aya-23-8B) created using llama.cpp.
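+
+ Since this repository ships GGUF quantizations, you can also run the model without `transformers`, using `llama.cpp` or its Python bindings. Below is a minimal sketch with `llama-cpp-python`; the GGUF filename and quantization level in it are assumptions, so substitute the actual file from this repository.
+
+ ```python
+ # pip install llama-cpp-python
+ from llama_cpp import Llama
+
+ # NOTE: the filename below is an assumption; use the actual GGUF file from this repo
+ llm = Llama(
+     model_path="aya-23-8B.Q4_K_M.gguf",
+     n_ctx=8192,  # Aya 23 supports an 8192-token context
+ )
+
+ # llama.cpp applies the chat template stored in the GGUF metadata
+ out = llm.create_chat_completion(
+     messages=[{"role": "user", "content": "Write a short letter to my mother telling her how much I love her"}],
+     max_tokens=100,
+     temperature=0.3,
+ )
+ print(out["choices"][0]["message"]["content"])
+ ```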
+
+ # Model Description
+
+ Aya 23 is an open-weights research release of an instruction fine-tuned model with highly advanced multilingual capabilities. Aya 23 pairs a highly performant pre-trained model from the [Command family](https://huggingface.co/CohereForAI/c4ai-command-r-plus) with the recently released [Aya Collection](https://huggingface.co/datasets/CohereForAI/aya_collection). The result is a powerful multilingual large language model serving 23 languages.
+
+ This model card corresponds to the 8-billion-parameter version of the Aya 23 model. We also released a 35-billion-parameter version, which you can find [here](https://huggingface.co/CohereForAI/aya-23-35B).
+
+ We cover 23 languages: Arabic, Chinese (simplified & traditional), Czech, Dutch, English, French, German, Greek, Hebrew, Hindi, Indonesian, Italian, Japanese, Korean, Persian, Polish, Portuguese, Romanian, Russian, Spanish, Turkish, Ukrainian, and Vietnamese.
+
+ Developed by: [Cohere For AI](https://cohere.for.ai) and [Cohere](https://cohere.com/)
+
+ - Point of Contact: Cohere For AI: [cohere.for.ai](https://cohere.for.ai/)
+ - License: [CC-BY-NC](https://cohere.com/c4ai-cc-by-nc-license); use also requires adhering to [C4AI's Acceptable Use Policy](https://docs.cohere.com/docs/c4ai-acceptable-use-policy)
+ - Model: aya-23-8B
+ - Model Size: 8 billion parameters
+
+ **Try Aya 23**
+
+ You can try out Aya 23 (35B) in our hosted Hugging Face Space [here](https://huggingface.co/spaces/CohereForAI/aya-23) before downloading the weights.
+
+ ### Usage
+
+ Please install a `transformers` version that includes the necessary changes for this model, such as the release pinned in the snippet below:
+
+ ```python
+ # pip install transformers==4.41.1
+ from transformers import AutoTokenizer, AutoModelForCausalLM
+
+ model_id = "CohereForAI/aya-23-8B"
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ model = AutoModelForCausalLM.from_pretrained(model_id)
+
+ # Format message with the command-r-plus chat template
+ # The Turkish prompt means: "Write a letter to my mother telling her how much I love her"
+ messages = [{"role": "user", "content": "Anneme onu ne kadar sevdiğimi anlatan bir mektup yaz"}]
+ input_ids = tokenizer.apply_chat_template(messages, tokenize=True, add_generation_prompt=True, return_tensors="pt")
+ ## <BOS_TOKEN><|START_OF_TURN_TOKEN|><|USER_TOKEN|>Anneme onu ne kadar sevdiğimi anlatan bir mektup yaz<|END_OF_TURN_TOKEN|><|START_OF_TURN_TOKEN|><|CHATBOT_TOKEN|>
+
+ gen_tokens = model.generate(
+     input_ids,
+     max_new_tokens=100,
+     do_sample=True,
+     temperature=0.3,
+ )
+
+ gen_text = tokenizer.decode(gen_tokens[0])
+ print(gen_text)
+ ```
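+
+ Note that `gen_tokens` includes the prompt, so `gen_text` above contains the chat-template control tokens as well as the reply. A small optional snippet, if you want only the model's reply:
+
+ ```python
+ reply = tokenizer.decode(
+     gen_tokens[0][input_ids.shape[1]:],  # keep only the newly generated tokens
+     skip_special_tokens=True,            # drop the <|...|> control tokens
+ )
+ print(reply)
+ ```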
+
+ ### Example Notebook
+
+ [This notebook](https://huggingface.co/CohereForAI/aya-23-8B/blob/main/Aya_23_notebook.ipynb) showcases a detailed use of Aya 23 (8B) including inference and fine-tuning with [QLoRA](https://huggingface.co/blog/4bit-transformers-bitsandbytes).
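+
+ For readers without the notebook handy, the core of a QLoRA setup looks roughly like the sketch below. This is a minimal, illustrative sketch assuming `bitsandbytes` and `peft` are installed; the LoRA hyperparameters and target modules shown are assumptions, not the notebook's exact settings.
+
+ ```python
+ # pip install bitsandbytes peft
+ import torch
+ from transformers import AutoModelForCausalLM, BitsAndBytesConfig
+ from peft import LoraConfig, get_peft_model
+
+ model_id = "CohereForAI/aya-23-8B"
+
+ # Load the base model in 4-bit NF4 so it fits on a single consumer GPU
+ bnb_config = BitsAndBytesConfig(
+     load_in_4bit=True,
+     bnb_4bit_quant_type="nf4",
+     bnb_4bit_compute_dtype=torch.bfloat16,
+ )
+ model = AutoModelForCausalLM.from_pretrained(
+     model_id, quantization_config=bnb_config, device_map="auto"
+ )
+
+ # Attach trainable low-rank adapters (hyperparameters here are illustrative)
+ lora_config = LoraConfig(
+     r=16,
+     lora_alpha=32,
+     target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
+     task_type="CAUSAL_LM",
+ )
+ model = get_peft_model(model, lora_config)
+ model.print_trainable_parameters()  # only the adapter weights are trainable
+ ```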
+
+ ## Model Details
+
+ **Input**: The model takes text as input only.
+
+ **Output**: The model generates text only.
+
+ **Model Architecture**: Aya-23-8B is an auto-regressive language model that uses an optimized transformer architecture. After pretraining, this model was instruction fine-tuned (IFT) to follow human instructions.
+
+ **Languages covered**: The model is particularly optimized for multilinguality and supports the following languages: Arabic, Chinese (simplified & traditional), Czech, Dutch, English, French, German, Greek, Hebrew, Hindi, Indonesian, Italian, Japanese, Korean, Persian, Polish, Portuguese, Romanian, Russian, Spanish, Turkish, Ukrainian, and Vietnamese.
+
+ **Context length**: 8192 tokens
+
+ ### Evaluation
+
+ <img src="benchmarks.png" alt="multilingual benchmarks" width="650" style="margin-left: auto; margin-right: auto; display: block;"/>
+ <img src="winrates.png" alt="average win rates" width="650" style="margin-left: auto; margin-right: auto; display: block;"/>
+
+ Please refer to the [Aya 23 technical report](https://cohere.com/research/papers/aya-command-23-8b-and-35b-technical-report-2024-05-23) for further details about the base model, data, instruction tuning, and evaluation.
+
+ ### Terms of Use
+
+ We hope that this release will make community-based research efforts more accessible by putting the weights of a highly performant multilingual model in the hands of researchers all over the world. This model is governed by a [CC-BY-NC](https://cohere.com/c4ai-cc-by-nc-license) license with an acceptable use addendum, and also requires adhering to [C4AI's Acceptable Use Policy](https://docs.cohere.com/docs/c4ai-acceptable-use-policy).