---
language:
  - en
license: mit
---

# Nape-0

Nape is a series of small models that aim to exhibit broad capabilities. The model is still in training; this is a very early preview.

You can load it as follows:

```python
from transformers import LlamaForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("nnpy/Nape-0")
model = LlamaForCausalLM.from_pretrained("nnpy/Nape-0")
```

## Training

It took 1 day to train for 3 epochs on 4x A6000 GPUs using native DeepSpeed.

The model expects the following prompt format:

```
assistant role: You are Semica, a helpful AI assistant.
user: {prompt}
assistant:
```
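As a minimal sketch of using the template above, the user message can be substituted into the `{prompt}` slot before tokenization. The helper name `build_prompt` is an illustration, not part of the model card:

```python
# Prompt template copied from the model card above.
PROMPT_TEMPLATE = (
    "assistant role: You are Semica, a helpful AI assistant.\n"
    "user: {prompt}\n"
    "assistant:"
)

def build_prompt(prompt: str) -> str:
    """Fill the user's message into the Nape-0 prompt template.

    Note: `build_prompt` is a hypothetical helper for illustration;
    the model card only specifies the raw template text.
    """
    return PROMPT_TEMPLATE.format(prompt=prompt)
```

The resulting string can then be passed to `tokenizer(...)` and `model.generate(...)` as with any causal LM in `transformers`.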