---
base_model: unsloth/gemma-2-27b-it
tags:
- text-generation-inference
- transformers
- unsloth
- gemma2
- trl
- sft
license: apache-2.0
language:
- en
---
Compute sponsored by Arrow Denmark and Nvidia.

- **Developed by:** ThatsGroes
- **License:** apache-2.0
- **Finetuned from model:** unsloth/gemma-2-27b-it
This gemma2 model was trained 2x faster with Unsloth and Hugging Face's TRL library.
Training consumed 2.748 kWh of electricity, as measured by CodeCarbon.