---
license: apache-2.0
---
This is a low-rank adapter (LoRA) for OpenCALM-7B,
trained on a 134K-example Japanese dataset.

This repository doesn't contain the foundation model itself, so it is licensed under Apache 2.0.

You can try it here.
[colab notebook](https://colab.research.google.com/gist/kam0sika/380db89c90f2a19eca1d28d79fee7c92/amazingca_generate.ipynb)

The QLoRA fine-tuning code is here.  
[colab notebook](https://colab.research.google.com/gist/kam0sika/bacc7019f5f11833531e1520178381a8/finetune_calm_qlora.ipynb)