---
license: llama2
---
CodeLlama 2 7b

With the Guanaco LoRA (by Tim Dettmers), merged by Varunk29.

Then

With the Mistral AI 7b 0.1 delta weights relative to Llama 2 (extracted by Undi95), merged by me.
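
For the curious, the sketch below illustrates the kind of delta-injection arithmetic described above: subtract the Llama 2 weights from the Mistral weights and add the difference onto the CodeLlama+Guanaco merge. This is an assumption about the mechanics, not the actual script used; the repo ids and the shape check are illustrative, and Undi95's extracted deltas may handle mismatched tensors differently.

```python
# Hypothetical sketch of a delta injection: merged = base + (donor - reference).
# Repo ids are illustrative assumptions, not the real checkpoints of this card.
import torch
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("codellama-7b-guanaco-merge")   # hypothetical
donor = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")
reference = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

base_sd = base.state_dict()
donor_sd = donor.state_dict()
ref_sd = reference.state_dict()

with torch.no_grad():
    for name, tensor in base_sd.items():
        # Add the Mistral-vs-Llama-2 delta onto the base weights. The shape
        # check skips tensors where the architectures differ (e.g. Mistral's
        # grouped-query KV projections are smaller than Llama 2's).
        if name in donor_sd and name in ref_sd and donor_sd[name].shape == tensor.shape:
            tensor += donor_sd[name] - ref_sd[name]

# state_dict() returns references, so the in-place adds already edited the model.
base.save_pretrained("codellama-guanaco-mistral-merge")  # hypothetical output path
```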

---

Base model (CodeLlama) training context: 16k (max context up to ~96k with the base RoPE)

Mistral injection training context: 8k (Sliding Window Attention is likely inoperative on such a merge/injection)
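
As a rough usage sketch, the merge should load like any Llama-family checkpoint; the repo id below is a hypothetical placeholder. CodeLlama's enlarged RoPE base (rope_theta = 1e6) is what lets it extrapolate past its 16k training context toward the ~96k figure above, without extra scaling.

```python
# Minimal loading sketch; the repo id is a hypothetical placeholder.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "codellama-guanaco-mistral-merge"  # hypothetical
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype="auto",   # keep the checkpoint's native dtype
    device_map="auto",    # spread layers across available devices
)
# CodeLlama's base RoPE (rope_theta = 1e6) already extrapolates well past the
# 16k training context, which is where the ~96k figure above comes from.
```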

---

For testing and amusement only.


Prompt format: Alpaca works (see the sketch below).
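
For reference, a minimal Alpaca-style prompt builder; the template text is the standard Alpaca one, and the example instruction is made up.

```python
# Standard Alpaca prompt template; wrap instructions like this before generating.
ALPACA_TEMPLATE = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n{instruction}\n\n"
    "### Response:\n"
)

prompt = ALPACA_TEMPLATE.format(
    instruction="Write a Python function that reverses a string."
)
```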