---
library_name: peft
base_model: unsloth/gemma-1.1-7b-it-bnb-4bit
datasets:
- naklecha/minecraft-question-answer-700k
---
# Gemma 1.1 7B Instruct Minecraft Adapter Model
## Updated Version
This model is fine-tuned from [Unsloth's Gemma 1.1 7B Instruct quantized model](https://huggingface.co/unsloth/gemma-1.1-7b-it-bnb-4bit) on [naklecha's Minecraft Question-Answer dataset](https://huggingface.co/datasets/naklecha/minecraft-question-answer-700k).
It was fine-tuned on the first 100k rows of the dataset for 1 epoch, which took around 2 hours 20 minutes on an NVIDIA RTX 4090.
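For reference, below is a minimal sketch of this kind of fine-tuning run using Unsloth, PEFT/LoRA, and TRL. The hyperparameters (LoRA rank, sequence length, batch size, learning rate) and the prompt template are illustrative assumptions, not the exact settings used for this adapter, and it assumes a TRL version where `SFTTrainer` still accepts `dataset_text_field` directly.

```python
# Sketch of a LoRA fine-tune of the 4-bit Gemma base on the first 100k rows.
# All hyperparameters below are assumptions, not the exact recipe used here.
from unsloth import FastLanguageModel
from datasets import load_dataset
from trl import SFTTrainer
from transformers import TrainingArguments

max_seq_length = 2048  # assumed context length

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/gemma-1.1-7b-it-bnb-4bit",
    max_seq_length=max_seq_length,
    load_in_4bit=True,
)

# Attach LoRA adapters (rank, alpha, and target modules are assumed values).
model = FastLanguageModel.get_peft_model(
    model,
    r=16,
    lora_alpha=16,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
)

# First 100k question-answer pairs from the dataset.
dataset = load_dataset("naklecha/minecraft-question-answer-700k",
                       split="train[:100000]")

def to_text(example):
    # Hypothetical prompt format; the actual template may differ.
    return {"text": f"Question: {example['question']}\nAnswer: {example['answer']}"}

dataset = dataset.map(to_text)

trainer = SFTTrainer(
    model=model,
    tokenizer=tokenizer,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=max_seq_length,
    args=TrainingArguments(
        per_device_train_batch_size=2,
        gradient_accumulation_steps=4,
        num_train_epochs=1,  # one epoch, as described above
        learning_rate=2e-4,
        fp16=True,
        output_dir="outputs",
    ),
)
trainer.train()
```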

The model can now generate some good answers, but it sometimes produces inappropriate ones. I believe this is due to a lack of training data.

## Important Notes
- The model sometimes generates meaningless answers. I am currently investigating this; it may take a while since I am a beginner in this field. If you have any suggestions, feel free to share them on the model's Community page.
- The model uses bitsandbytes, so run it on a CUDA-capable GPU; a minimal loading sketch is shown below.
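
A minimal loading sketch with Transformers and PEFT, assuming bitsandbytes is installed and a CUDA GPU is available. The adapter repository id below is a placeholder; replace it with this repository's id.

```python
# Load the 4-bit base model and apply this LoRA adapter for inference.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "unsloth/gemma-1.1-7b-it-bnb-4bit"
adapter_id = "your-username/gemma-1.1-7b-it-minecraft"  # placeholder: use this repo's id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)

prompt = "How do I craft a torch in Minecraft?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```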