
# Model Card for iopwsy/MatGPT-synthesis

## Model Details

### Model Description

This model predicts synthesis paths for inorganic materials. Given a natural-language prompt such as "How to synthesis Li7La3Zr2O12?\n" (see the example below), it generates a predicted synthesis path as text.

## Uses

## How to Get Started with the Model

from transformers import GPT2Tokenizer, GPT2LMHeadModel, StoppingCriteria, StoppingCriteriaList
import torch
model_path = "iopwsy/MatGPT-synthesis"
tokenizer = GPT2Tokenizer.from_pretrained(model_path, pad_token = '<|endoftext|>')
model = GPT2LMHeadModel.from_pretrained(model_path)
model.config.pad_token_id = model.config.eos_token_id
model.eval()
# Stop generation as soon as the model emits the <|endoftext|> token (id 50256 for GPT-2).
class StopforGPT2(StoppingCriteria):
    def __call__(self, input_ids: torch.LongTensor, scores: torch.FloatTensor, **kwargs) -> bool:
        return input_ids[0, -1].item() == tokenizer.eos_token_id

def get_path(text):
    # tokenizer(...) returns a BatchEncoding; pass its tensors to generate() explicitly.
    inputs = tokenizer(text, return_tensors='pt')
    with torch.no_grad():
        res = model.generate(input_ids=inputs.input_ids,
                             attention_mask=inputs.attention_mask,
                             do_sample=True,
                             top_k=10,
                             top_p=0.95,
                             temperature=0.1,
                             max_new_tokens=300,
                             stopping_criteria=StoppingCriteriaList([StopforGPT2()]))
    print(tokenizer.decode(res[0], skip_special_tokens=True))

### Example
get_path("How to synthesis Li7La3Zr2O12?\n")
Model size: 137M parameters (Safetensors; tensor types: F32, BOOL).