KARAKURI LM 32B Thinking 2501 Experimental
The following example shows how to run the model with the Hugging Face Transformers library:

from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "karakuri-ai/karakuri-lm-32b-thinking-2501-exp"

# Load the model and tokenizer, letting Transformers choose the dtype
# and spread the layers across the available devices.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    device_map="auto",
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# A single-turn chat; "こんにちは。" means "Hello."
messages = [
    {"role": "user", "content": "こんにちは。"}
]

# Render the chat into the model's prompt format and move it to the model's device.
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt",
).to(model.device)

# Generate a response and decode only the newly generated tokens.
outputs = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:]))
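For open-ended prompts you may prefer sampled decoding over the greedy decoding shown above. Below is a minimal sketch reusing the model, tokenizer, and input_ids objects from the example; the do_sample, temperature, and top_p arguments are standard transformers generate parameters, but the specific values and the larger token budget are illustrative assumptions, not settings recommended by this card.

# A minimal sketch of sampled decoding. The sampling values are
# illustrative assumptions, not recommendations from the model card.
outputs = model.generate(
    input_ids,
    max_new_tokens=2048,  # thinking traces can be long; budget accordingly
    do_sample=True,       # sample instead of greedy decoding
    temperature=0.6,      # assumed value; tune for your task
    top_p=0.95,           # assumed value; nucleus sampling cutoff
)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:]))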
This work was supported by the Ministry of Economy, Trade and Industry (METI) and the New Energy and Industrial Technology Development Organization (NEDO) through the Generative AI Accelerator Challenge (GENIAC).
If you use this model, please cite:

@misc{karakuri_lm_32b_thinking_2501_exp,
  author    = {{KARAKURI Inc.}},
  title     = {{KARAKURI LM 32B Thinking 2501 Experimental}},
  year      = {2025},
  url       = {https://huggingface.co/karakuri-ai/karakuri-lm-32b-thinking-2501-exp},
  publisher = {Hugging Face},
  journal   = {Hugging Face repository}
}