---
license: openrail
datasets:
- oscar-corpus/OSCAR-2201
language:
- de
metrics:
- perplexity
library_name: transformers
---
# OPT-2.7B finetuned on OSCAR with LoRA
This model was finetuned on 80,000 examples from the German subset of the OSCAR corpus. See [the git repo](https://github.com/bjoernpl/llm_finetuning_de)
for more information and the exact hyperparameters.
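
For orientation, a LoRA finetuning setup with PEFT typically looks roughly like the sketch below. The rank, alpha, dropout, and target modules shown here are illustrative assumptions only; the actual values used for this adapter are documented in the linked repo.

```python
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("facebook/opt-2.7b")

# Illustrative LoRA settings; see the linked repo for the real hyperparameters
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections in OPT
    task_type="CAUSAL_LM",
)

model = get_peft_model(base, lora_config)
model.print_trainable_parameters()  # only the adapter weights are trainable
```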

To run this model, instantiate the `facebook/opt-2.7b` base model as usual and load the adapter with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the OPT-2.7B base model
model = AutoModelForCausalLM.from_pretrained("facebook/opt-2.7b")
# Attach the LoRA adapter weights from this repository
model = PeftModel.from_pretrained(model, "bjoernp/opt2.7B-de-lora")
```

Refer to the [OPT documentation](https://huggingface.co/facebook/opt-2.7b) for more information on how to run the model and use the tokenizer.
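
For example, assuming the standard OPT tokenizer, German text can be generated with the adapted model roughly like this (an illustrative sketch, not taken from the original repo):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/opt-2.7b")

prompt = "Die Hauptstadt von Deutschland ist"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a short continuation with the LoRA-adapted model loaded above
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```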