Model not responding
Hello, I am trying to run this model locally using this code:
from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = 'microsoft/Phi-3.5-mini-instruct'
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)
# Raw strings (rf'...') so the backslashes are not treated as escape sequences
tokenizer.save_pretrained(rf'C:\Users\tokenizer\{model_name}')
model.save_pretrained(rf'C:\Users\model\{model_name}')
tokenizer = AutoTokenizer.from_pretrained(rf'C:\Users\tokenizer\{model_name}')
model = AutoModelForCausalLM.from_pretrained(rf'C:\Users\model\{model_name}')
prompt = 'Write a short 100 word essay about AI.'
prompt_inputs = tokenizer(prompt, return_tensors='pt')
response = model.generate(**prompt_inputs)
print(tokenizer.decode(response[0]))
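As an aside on the Windows paths in the snippet above: in a plain Python string literal, a backslash followed by U starts a Unicode escape, so a path like 'C:\Users\model' does not even parse. A minimal sketch of the difference (no transformers needed):

```python
# A plain string literal treats \U as the start of an 8-digit
# Unicode escape, so this line of source fails to compile at all:
try:
    compile(r"p = 'C:\Users\model'", '<example>', 'exec')
except SyntaxError as e:
    print('SyntaxError:', e.msg)

# A raw string keeps the backslashes literally:
path = r'C:\Users\model'
print(path)  # C:\Users\model
```

Forward slashes ('C:/Users/model') work too, since Windows APIs accept them.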
But the model does not respond; only this appears: