Model: DIABLO 354M 🔥
Lang: IT

Model description

This model is a causal language model for the Italian language, based on a GPT-like architecture [1] (more specifically, it was obtained by modifying Meta's XGLM architecture [2] and starting from its 564M checkpoint).

The model has ~354M parameters and a vocabulary of 50,335 tokens. It is a foundation model, pre-trained for causal language modeling, so it is mainly suited to basic natural language generation; you will need to fine-tune it for more specific downstream tasks.
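As a rough illustration, fine-tuning on a custom Italian text corpus could look like the sketch below; the data file, sequence length, and training hyperparameters are placeholders, not recommendations from this card:

from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForCausalLM,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("osiria/diablo-italian-base-354m")
model = AutoModelForCausalLM.from_pretrained("osiria/diablo-italian-base-354m")

# hypothetical plain-text corpus, one document per line
dataset = load_dataset("text", data_files={"train": "train.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# causal LM objective: the collator derives the labels from the input ids (mlm=False)
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

args = TrainingArguments(
    output_dir="diablo-italian-finetuned",
    per_device_train_batch_size=4,
    num_train_epochs=1,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()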

Quick usage

To use the model for inference on a GPU, you can set up the following pipeline:

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

# load the tokenizer and the model in half precision
tokenizer = AutoTokenizer.from_pretrained("osiria/diablo-italian-base-354m")
model = AutoModelForCausalLM.from_pretrained("osiria/diablo-italian-base-354m", torch_dtype=torch.float16)

# move the model to the GPU
device = torch.device("cuda")
model = model.to(device)

# build a text-generation pipeline on the first GPU
pipeline_nlg = pipeline("text-generation", model=model, tokenizer=tokenizer, device=0)
pipeline_nlg("Ciao, mi chiamo Marco Rossi e")

# [{'generated_text': 'Ciao, mi chiamo Marco Rossi e sono un ragazzo di 23 anni.'}]
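
Since the pipeline forwards generation arguments to the underlying model.generate method, you can also control the output length and sampling behavior; the values below are only illustrative and not tuned for this model:

# sampling-based generation with explicit length and randomness controls
# (parameter values are illustrative)
pipeline_nlg(
    "Ciao, mi chiamo Marco Rossi e",
    max_new_tokens=50,
    do_sample=True,
    top_p=0.9,
    temperature=0.8,
)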

Limitations

The model might behave erratically when presented with prompts that are too far from its pre-training data and, because of the probabilistic nature of its generation, it might occasionally produce biased or offensive content with respect to gender, race, ideologies, and political or religious beliefs. These limitations imply that the model and its outputs should be used with caution and should not be relied on in situations that require the generated text to be fair or factually accurate.

References

[1] Brown et al., Language Models are Few-Shot Learners. https://arxiv.org/abs/2005.14165

[2] Lin et al., Few-shot Learning with Multilingual Language Models. https://arxiv.org/abs/2112.10668

License

The model is released under the MIT license.
