Text Generation
PEFT
Safetensors
German
Bavarian

LLäMmlein 1B

This is a Bavarian adapter for the German Tinyllama 1B language model (LLäMmlein 1B), tuned on a dump of the Bavarian Wikipedia without further optimization. Please don't take it too seriously ;) Find more details on our page and in our preprint!

Run it

import torch
from peft import PeftConfig, PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# script config
base_model_name = "LSX-UniWue/LLaMmlein_1B"
adapter_name = "LSX-UniWue/Betzerl_1B_wiki_preview"
device = "cuda"  # or mps

# load model
config = PeftConfig.from_pretrained(adapter_name)
base_model = AutoModelForCausalLM.from_pretrained(
    base_model_name,
    torch_dtype=torch.bfloat16,
    device_map=device,
)
# resize the embedding matrix to match the adapter's vocabulary (32001 tokens)
base_model.resize_token_embeddings(32001)

# attach the adapter and load its tokenizer
model = PeftModel.from_pretrained(base_model, adapter_name)
tokenizer = AutoTokenizer.from_pretrained(adapter_name)
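
Once the adapter is attached, generation works like with any transformers causal LM. A minimal sketch; the Bavarian prompt and the sampling settings below are only illustrative, not part of the model card:

# generate a short continuation from a sample prompt
prompt = "Servus, "
inputs = tokenizer(prompt, return_tensors="pt").to(device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(output[0], skip_special_tokens=True))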