Model not responding

#30 opened by Dm124r

Hello, I am trying to run this model locally using this code:

from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = 'microsoft/Phi-3.5-mini-instruct'
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)

# Raw strings so the Windows backslashes are not treated as escape sequences
tokenizer.save_pretrained(rf'C:\Users\tokenizer\{model_name}')
model.save_pretrained(rf'C:\Users\model\{model_name}')

tokenizer = AutoTokenizer.from_pretrained(rf'C:\Users\tokenizer\{model_name}')
model = AutoModelForCausalLM.from_pretrained(rf'C:\Users\model\{model_name}')

prompt = 'Write a short 100 word essay about AI.'

# Tokenize the prompt and generate a completion
prompt_inputs = tokenizer(prompt, return_tensors='pt')
response = model.generate(**prompt_inputs)

print(tokenizer.decode(response[0]))

But the model does not respond; only this appears:

(screenshot: 2025-01-20_13-32-55.jpg)
Can you explain what I'm doing wrong?
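For comparison, here is a variant I would try, since this is an instruct model and plain-text prompts may not trigger a proper reply. It is only a sketch: the chat-message wrapping and the max_new_tokens=200 value are my own guesses, not something from my working setup.

from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = 'microsoft/Phi-3.5-mini-instruct'
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)

# Wrap the prompt in the chat format the instruct model expects
messages = [{'role': 'user', 'content': 'Write a short 100 word essay about AI.'}]
input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors='pt')

# Without max_new_tokens, generate() falls back to a very short default length
output_ids = model.generate(input_ids, max_new_tokens=200)

# Decode only the newly generated tokens, skipping the prompt
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))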
