Can Meta-Llama-3-8B-Instruct be run on CPU?

#173
by Tizem - opened

Hi community,
I followed the video instructions prepared by Meta showing how to download and set up the Llama 3 model.
see the link: https://llama.meta.com/docs/llama-everywhere/running-meta-llama-on-windows/#setup

I ran into some issues installing PyTorch, so I installed the CPU version for Python. While running my Python script for the pipeline, I get
the following error message:

OSError: meta-llama/Meta-Llama-3-8B-Instruct does not appear to have a file named config.json. Checkout 'https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct/tree/None' for available files.

Could someone help with this, please? Does anyone know whether this model can run on CPU?
Thanks!


I also run it on CPU.
Follow the official PyTorch installation instructions to install the CPU version:

pip install torch

Then I use the code below to download the model. It saves the model to the cache location:

C:\Users\a\.cache\huggingface\hub\models--meta-llama--Meta-Llama-3.1-8B-Instruct
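As a side note, that folder name follows the Hub's cache convention: "models--" plus the repo id with "/" replaced by "--". A small stand-alone sketch of the naming rule (cache_folder_for is a hypothetical helper name, not a real API):

```python
# Sketch of the Hugging Face Hub cache folder naming convention:
# "models--" + the repo id with every "/" replaced by "--".
# cache_folder_for is a hypothetical helper, not part of any library.
def cache_folder_for(model_id: str) -> str:
    return "models--" + model_id.replace("/", "--")

print(cache_folder_for("meta-llama/Meta-Llama-3.1-8B-Instruct"))
# -> models--meta-llama--Meta-Llama-3.1-8B-Instruct
```

This can help you locate (or delete and re-download) a model under ~/.cache/huggingface/hub.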

import transformers
import torch

model_id = "meta-llama/Meta-Llama-3.1-8B-Instruct"

pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)

messages = [
    {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"},
    {"role": "user", "content": "Who are you?"},
]

outputs = pipeline(
    messages,
    max_new_tokens=256,
)
print(outputs[0]["generated_text"][-1])
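One caveat for CPU-only machines (my own observation, not from the original post): device_map="auto" requires the accelerate package, and bfloat16 kernels on CPU can be slow or unsupported, so it may be safer to pin the pipeline to CPU with float32. A sketch of the adjusted keyword arguments, assuming transformers accepts the dtype as a string (it resolves names like "float32" to torch dtypes):

```python
# Hypothetical CPU-friendly overrides for the transformers.pipeline(...) call above:
# pin the model to CPU instead of letting device_map="auto" decide, and
# fall back to float32 since bfloat16 on CPU is often slow or unsupported.
cpu_kwargs = {
    "device_map": "cpu",
    "model_kwargs": {"torch_dtype": "float32"},
}
print(cpu_kwargs["device_map"])
# -> cpu
```

These would merge into the call above as transformers.pipeline("text-generation", model=model_id, **cpu_kwargs).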

About your error, OSError: meta-llama/Meta-Llama-3-8B-Instruct does not appear to have a file named config.json: try running the code above to download the model, and make sure your internet connection is stable. Note that the Llama models on Hugging Face are gated, so you also need to accept the license on the model page and authenticate (for example with huggingface-cli login) before the files can be downloaded.

By the way, the commands below always fail for me; I always end up with an incomplete download.

# Make sure you have git-lfs installed (https://git-lfs.com)
git lfs install

# When prompted for a password, use an access token with write permissions.
# Generate one from your settings: https://huggingface.co/settings/tokens
git clone https://huggingface.co/meta-llama/Meta-Llama-3.1-8B-Instruct
