RTX 4090 using Text-Generation-WebUI / Oobabooga FAILS to load this model with Exllamav2 (or any method?)
I don't think the latest version of Exllama v2 has updated binary wheels yet, unfortunately. You may need to build the Exllama v2 wheel yourself inside of ooba. To do this, you'll need to:
Run ooba's cmd_linux.sh on Linux:
./cmd_linux.sh
Or on Windows:
cmd_windows.bat
Clone exllamav2 into a different directory (outside of ooba)
# Linux
cd $HOME
# Windows
cd C:\
git clone https://github.com/turboderp/exllamav2.git
cd exllamav2
Install exllamav2 from the clone, using the same cmd_* shell as above:
pip install -e .
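A quick sanity check (run in the same cmd_* shell) is to confirm that pip now points at your local clone rather than a pre-built package; the exact version and path shown will depend on your setup:
pip show exllamav2
python -c "import exllamav2; print(exllamav2.__file__)"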
Thank you for the excellent written instructions; unfortunately, it failed on the final step for some reason. I'll try again later and post what it shows when I can. Thank you.
You can also try just updating ooba. They will eventually upgrade exl2 to the latest version (if they have not already done so). Or you can run the cmd shell manually and update with: git pull; pip install -U -r requirements.txt
I've updated ooba and this is the error I get. I'll try requirements next...
error: subprocess-exited-with-error
× python setup.py develop did not run successfully.
│ exit code: 1
╰─> [32 lines of output]
Version: 0.0.7
C:\text-generation-webui\installer_files\env\Lib\site-packages\setuptools\command\develop.py:40: EasyInstallDeprecationWarning: easy_install command is deprecated.
!!
********************************************************************************
Please avoid running ``setup.py`` and ``easy_install``.
Instead, use pypa/build, pypa/installer or other
standards-based tools.
See https://github.com/pypa/setuptools/issues/917 for details.
********************************************************************************
!!
easy_install.initialize_options(self)
C:\text-generation-webui\installer_files\env\Lib\site-packages\setuptools\_distutils\cmd.py:66: SetuptoolsDeprecationWarning: setup.py install is deprecated.
!!
********************************************************************************
Please avoid running ``setup.py`` directly.
Instead, use pypa/build, pypa/installer or other
standards-based tools.
See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details.
********************************************************************************
!!
self.initialize_options()
warning: no previously-included files matching '*.pyc' found anywhere in distribution
warning: no previously-included files matching 'dni_*' found anywhere in distribution
C:\text-generation-webui\installer_files\env\Lib\site-packages\torch\utils\cpp_extension.py:383: UserWarning: Error checking compiler version for cl: [WinError 2] The system cannot find the file specified
warnings.warn(f'Error checking compiler version for {compiler}: {error}')
error: [WinError 2] The system cannot find the file specified
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
Also "get pull; pip install -U -r requirements.txt" did not work, nor did git pull; pip install -U -r requirements.txt
Git pull said I was up to date, so I did run: pip install -U -r requirements.txt
After this, I STILL have the error above... :-(
NOTE: I have no problem loading this one - https://huggingface.co/LoneStriker/Yi-34B-Spicyboros-3.1-4.65bpw-h6-exl2/tree/main
^ Which is considerably worse than all of my 13B and 20B models... random nonsense, spelling errors, etc. What am I doing wrong? Shouldn't these be a better RPG/story experience than those in general? Do I need to use a specific PROFILE to rein this in? I know these like a TEMP under 1.0, but beyond that?
For your ooba errors, it's trying to use a compiler (cl.exe) that does not exist on your system. You're probably better off just moving your installer_files directory aside and doing a fresh installation with .\start_windows.bat.
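If you go the fresh-install route, a minimal sketch from the text-generation-webui folder (assuming the default one-click layout; rename rather than delete so you can roll back):
move installer_files installer_files.bak
.\start_windows.bat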
For the Yi-34b-Spicyboros model, try setting your repetition penalty down to 1.
This model was unfortunately trained with the Yi-34B model and not the Yi-34B-Llama model, so you need the latest exllamav2 installed from git rather than the pre-compiled package. I think version 0.0.8, when released, will fix this problem, but for now you need to build it yourself if you want to run this model. This model also seems to be very flaky, so it may not be worth the effort; you may just want to wait for 0.0.8 to be released and merged into ooba.
I solved this problem by manually installing the 0.0.8 exllamav2 wheel that just came out, from here: https://github.com/turboderp/exllamav2/releases/tag/v0.0.8
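For anyone following along: grab the wheel from that release page that matches your Python version and CUDA/torch build, then install it inside ooba's cmd shell. The filename below is only an example; yours may differ:
cmd_windows.bat
pip install exllamav2-0.0.8+cu121-cp311-cp311-win_amd64.whl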
Yeah I'll wait for the later version. Thanks for the information, and thanks Unicough for the alternative method to get it working if I change my mind....
Also remember to copy the missing *.py files from the original model and enable --trust-remote-code.
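For example, on Windows (the paths are placeholders; the *.py files come from the original base-model repo):
copy <path-to-original-Yi-34B>\*.py <text-generation-webui>\models\<this-model-folder>\
Then add --trust-remote-code to your launch flags (e.g. in CMD_FLAGS.txt for the one-click installer, or passed to server.py directly).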