Update config.json
#10
opened by julien31
Hi all,
max_position_embeddings should be 131072 instead of 16384
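For illustration, a minimal sketch of applying this fix to a locally downloaded config.json (the surrounding fields shown are hypothetical; only the max_position_embeddings value of 131072, i.e. the 128k context window, is taken from this PR):

```python
import json

# Hypothetical local path to the quantized model's config file.
CONFIG_PATH = "config.json"

# Illustrative subset of the config as shipped (not the full file).
config = {"model_type": "llama", "max_position_embeddings": 16384}

# Apply the fix proposed here: restore the full 128k context window.
config["max_position_embeddings"] = 131072

with open(CONFIG_PATH, "w") as f:
    json.dump(config, f, indent=2)

# Reload to confirm the corrected value was written.
with open(CONFIG_PATH) as f:
    assert json.load(f)["max_position_embeddings"] == 131072
```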
Hi @julien31, you're indeed right. I'm not sure whether this is something that was done automatically by AutoAWQ, since the rest of the AutoAWQ quants seem to have the right value. Do you have any clue, @casperhansen? Maybe I'm missing something!
Can you please merge this? I also ran into this error, and the context length should still be 128k.
alvarobartt
changed pull request status to
merged