Could not locate the configuration_RW.py inside tiiuae/falcon-40b-instruct.
#7 opened by cosmino
I am trying to use this version after using tiiuae/falcon-40b-instruct. This is the code:
import torch
from transformers import AutoConfig, AutoModelForCausalLM, AutoTokenizer

model_config = AutoConfig.from_pretrained('ichitaka/falcon-40b-instruct-8bit', trust_remote_code=True)
model_config.max_seq_len = self.model_max_length
model = AutoModelForCausalLM.from_pretrained(
    'ichitaka/falcon-40b-instruct-8bit',
    trust_remote_code=True,
    config=model_config,
    quantization_config=self.bnb_config,
    torch_dtype=torch.bfloat16,
    device_map='auto'
)
model.eval()  # Set the model to evaluation mode
tokenizer = AutoTokenizer.from_pretrained(
    'ichitaka/falcon-40b-instruct-8bit', padding_side="left", truncation_side="left",
    model_max_length=self.model_max_length
)
# Most LLMs don't have a pad token by default
tokenizer.pad_token = tokenizer.eos_token
And the first line returned -> tiiuae/falcon-40b-instruct does not appear to have a file named configuration_RW.py. Checkout 'https://huggingface.co/tiiuae/falcon-40b-instruct/main' for available files.
I also checked this thread -> https://huggingface.co/tiiuae/falcon-7b/discussions/60, but it didn't work.
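In case it helps narrow this down, here is a minimal sketch of how one could confirm whether configuration_RW.py is actually present in the repo before calling from_pretrained. It uses huggingface_hub's list_repo_files; the has_remote_code_file helper is just a name I made up for this example, not part of any library:

```python
from huggingface_hub import list_repo_files


def has_remote_code_file(repo_id: str, filename: str = "configuration_RW.py") -> bool:
    """Check whether the remote repo serves the given file (makes a network call)."""
    return filename in list_repo_files(repo_id)


# Compare the base repo and the 8-bit repo to see which one is missing the file:
# has_remote_code_file("tiiuae/falcon-40b-instruct")
# has_remote_code_file("ichitaka/falcon-40b-instruct-8bit")
```

If the file only exists in one of the two repos, that would explain why trust_remote_code resolution fails when the config points back at the other repo.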
Thanks for your help