Facing issue with pipeline code
#1
by RaviNaik - opened
Getting the error below when running the pipeline code for TheBloke/Lemur-70B-Chat-v1-AWQ. Could you please help?
```python
# Inference can also be done using transformers' pipeline
from transformers import pipeline

print("*** Pipeline:")
pipe = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=512,
    do_sample=True,
    temperature=0.7,
    top_p=0.95,
    top_k=40,
    repetition_penalty=1.1,
)
print(pipe(prompt_template)[0]["generated_text"])
```
### Error

```
AttributeError: 'LlamaAWQForCausalLM' object has no attribute 'config'
```
Thanks
The resolution is to pass `model=model.model` when constructing the pipeline, so that the underlying transformers model (which does have a `config` attribute) is used instead of the AWQ wrapper.
Refer to https://github.com/casper-hansen/AutoAWQ/issues/107
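The error occurs because `LlamaAWQForCausalLM` is a wrapper that holds the real transformers model in its `.model` attribute, and (in that AutoAWQ version) it does not forward `config`, which `pipeline()` reads. A minimal sketch of the situation, using hypothetical stand-in classes rather than the real AutoAWQ/transformers implementations:

```python
# Hypothetical stand-ins illustrating the wrapper/attribute mismatch;
# these are NOT the real AutoAWQ or transformers classes.

class InnerModel:
    """Stands in for the underlying transformers model."""
    def __init__(self):
        self.config = {"model_type": "llama"}  # pipeline() looks this up

class AWQWrapper:
    """Stands in for LlamaAWQForCausalLM: keeps the real model in .model
    but does not forward its .config attribute."""
    def __init__(self):
        self.model = InnerModel()

wrapper = AWQWrapper()
print(hasattr(wrapper, "config"))        # False -> pipeline(model=wrapper) fails
print(hasattr(wrapper.model, "config"))  # True  -> pipeline(model=wrapper.model) works
```

This is why `model=model.model` resolves the `AttributeError`: the inner object exposes the attributes the pipeline expects.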
RaviNaik changed discussion status to closed