Heron GIT Japanese StableLM Base 7B

Model Details

Heron GIT Japanese StableLM Base 7B is a vision-language model that can converse about input images.
This model was trained using the heron library. Please refer to the code for details.

Usage

Follow the installation guide in the heron repository before running the example below.

import requests
from PIL import Image

import torch
from transformers import AutoProcessor
from heron.models.git_llm.git_japanese_stablelm_alpha import GitJapaneseStableLMAlphaForCausalLM

device_id = 0

# prepare a pretrained model
model = GitJapaneseStableLMAlphaForCausalLM.from_pretrained(
    'turing-motors/heron-chat-git-ja-stablelm-base-7b-v0', torch_dtype=torch.float16
)
model.eval()
model.to(f"cuda:{device_id}")

# prepare a processor
processor = AutoProcessor.from_pretrained('turing-motors/heron-chat-git-ja-stablelm-base-7b-v0')

# prepare inputs
url = "https://www.barnorama.com/wp-content/uploads/2016/12/03-Confusing-Pictures.jpg"
image = Image.open(requests.get(url, stream=True).raw)

text = "##human: これは何の写真ですか?\n##gpt: "  # "What is this a photo of?"

# do preprocessing
inputs = processor(
    text,
    image,
    return_tensors="pt",
    truncation=True,
)
inputs = {k: v.to(f"cuda:{device_id}") for k, v in inputs.items()}

# set eos token
eos_token_id_list = [
    processor.tokenizer.pad_token_id,
    processor.tokenizer.eos_token_id,
]

# do inference
with torch.no_grad():
    out = model.generate(**inputs, max_length=256, do_sample=False, temperature=0., eos_token_id=eos_token_id_list)

# print result
print(processor.tokenizer.batch_decode(out)[0])
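
The decoded output contains the full prompt followed by the model's reply. A minimal post-processing sketch is shown below; splitting on the "##gpt:" tag is an assumption based on the prompt format above, not an official heron utility.

# keep only the text generated after the assistant tag (prompt format assumed from above)
decoded = processor.tokenizer.batch_decode(out, skip_special_tokens=True)[0]
answer = decoded.split("##gpt:")[-1].strip()
print(answer)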

Training

This model was initially trained with the Adaptor using STAIR Captions. In the second phase, it was fine-tuned with LLaVA-Instruct-150K-JA and Japanese Visual Genome using LoRA.
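
For reference, LoRA fine-tuning with the peft library generally follows the pattern sketched below. The rank, alpha, and target module names here are illustrative assumptions; the actual values are defined in the heron training configurations.

from peft import LoraConfig, get_peft_model

# Illustrative settings only; heron's training configs define the real values.
lora_config = LoraConfig(
    r=8,                                 # low-rank dimension (assumed)
    lora_alpha=16,                       # scaling factor (assumed)
    target_modules=["query_key_value"],  # attention projection names (assumed)
    lora_dropout=0.05,
    bias="none",
    task_type="CAUSAL_LM",
)

# wrap the model so that only the LoRA adapter weights are trained
peft_model = get_peft_model(model, lora_config)
peft_model.print_trainable_parameters()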

Training Dataset

- STAIR Captions
- LLaVA-Instruct-150K-JA
- Japanese Visual Genome

Use and Limitations

Intended Use

This model is intended for use in chat-like applications and for research purposes.

Limitations

The model may produce inaccurate or false information, and its accuracy is not guaranteed. It is still in the research and development stage.

How to cite

@misc{GitJapaneseStableLM,
    url    = {https://huggingface.co/turing-motors/heron-chat-git-ja-stablelm-base-7b-v0},
    title  = {Heron GIT Japanese StableLM Base 7B},
    author = {Inoue, Yuichi and Tanahashi, Kotaro and Yamaguchi, Yu}
}

Citations

@misc{JapaneseInstructBLIPAlpha,
    url    = {https://huggingface.co/stabilityai/japanese-instructblip-alpha},
    title  = {Japanese InstructBLIP Alpha},
    author = {Shing, Makoto and Akiba, Takuya}
}

License: cc-by-nc-4.0
