AttributeError: module 'peft.tuners.lora' has no attribute 'layer'

#5
by Borko24 - opened

I have been getting this error since yesterday. It worked like a charm the previous days. It started with "ModuleNotFoundError: No module named 'transformers.models.gemma'", which I seem to have resolved by upgrading "auto-gpt", but then this error showed up. I tried updating peft and transformers, but no combination of the aforementioned libraries seems to work. Does anyone know what is going wrong?

A follow-up: the problem seems to be incompatible CUDA and torch versions. Make sure that your torch build is compatible with your CUDA version, and that you configure the right version when installing auto-gpt.
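For anyone hitting the same thing, a quick sanity check is to print the installed versions of the libraries involved before guessing at combinations. This is just a sketch using the standard library (the helper name is mine, not from any of these packages); if torch is installed, `torch.version.cuda` additionally reports the CUDA version the wheel was built against, which you can compare with `nvidia-smi`.

```python
from importlib import metadata

def installed_version(pkg: str):
    """Return the installed version of *pkg*, or None if it is not installed."""
    try:
        return metadata.version(pkg)
    except metadata.PackageNotFoundError:
        return None

# Versions of the libraries discussed in this thread (None means missing).
for pkg in ("torch", "peft", "transformers"):
    print(pkg, "->", installed_version(pkg))

# With torch installed, this shows the CUDA version torch was built for:
#   import torch; print(torch.version.cuda)
```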
