Regarding CUDA and bitsandbytes (pls HELP)

#49
by Anish-45 - opened

C:\LLaVA\llava_env\lib\site-packages\transformers\utils\generic.py:441: FutureWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
_torch_pytree._register_pytree_node(
False

===================================BUG REPORT===================================
C:\LLaVA\llava_env\lib\site-packages\bitsandbytes\cuda_setup\main.py:166: UserWarning: Welcome to bitsandbytes. For bug reports, please run

python -m bitsandbytes

warn(msg)

CUDA_SETUP: WARNING! libcudart.so not found in any environmental path. Searching in backup paths...
The following directories listed in your path were found to be non-existent: {WindowsPath('/usr/local/cuda/lib64')}
DEBUG: Possible options found for libcudart.so: set()
CUDA SETUP: PyTorch settings found: CUDA_VERSION=118, Highest Compute Capability: 8.9.
CUDA SETUP: To manually override the PyTorch CUDA version please see:https://github.com/TimDettmers/bitsandbytes/blob/main/how_to_use_nonpytorch_cuda.md
CUDA SETUP: Loading binary C:\LLaVA\llava_env\lib\site-packages\bitsandbytes\libbitsandbytes_cuda118.so...
argument of type 'WindowsPath' is not iterable
CUDA SETUP: Problem: The main issue seems to be that the main CUDA runtime library was not detected.
CUDA SETUP: Solution 1: To solve the issue the libcudart.so location needs to be added to the LD_LIBRARY_PATH variable
CUDA SETUP: Solution 1a): Find the cuda runtime library via: find / -name libcudart.so 2>/dev/null
CUDA SETUP: Solution 1b): Once the library is found add it to the LD_LIBRARY_PATH: export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:FOUND_PATH_FROM_1a
CUDA SETUP: Solution 1c): For a permanent solution add the export from 1b into your .bashrc file, located at ~/.bashrc
CUDA SETUP: Solution 2: If no library was found in step 1a) you need to install CUDA.
CUDA SETUP: Solution 2a): Download CUDA install script: wget https://github.com/TimDettmers/bitsandbytes/blob/main/cuda_install.sh
CUDA SETUP: Solution 2b): Install desired CUDA version to desired location. The syntax is bash cuda_install.sh CUDA_VERSION PATH_TO_INSTALL_INTO.
CUDA SETUP: Solution 2b): For example, "bash cuda_install.sh 113 ~/local/" will download CUDA 11.3 and install into the folder ~/local
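
For context: the solution steps above are Linux-oriented (libcudart.so, LD_LIBRARY_PATH, ~/.bashrc), but this environment is Windows. As a sanity check, PyTorch itself can be queried for the CUDA runtime it sees; a minimal sketch run inside the same llava_env (plain PyTorch calls, nothing bitsandbytes-specific; exact values will differ per install):

    # Quick CUDA sanity check inside llava_env (assumes only that torch is installed).
    import torch

    print("torch:", torch.__version__)                   # e.g. a +cu118 build
    print("cuda available:", torch.cuda.is_available())  # True if the GPU is visible to PyTorch
    print("cuda build:", torch.version.cuda)             # CUDA version torch was compiled against
    if torch.cuda.is_available():
        print("device:", torch.cuda.get_device_name(0))
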
Traceback (most recent call last):
  File "C:\LLaVA\llava_env\lib\site-packages\transformers\utils\import_utils.py", line 1382, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python310\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\LLaVA\llava_env\lib\site-packages\transformers\generation\utils.py", line 28, in <module>
    from ..integrations.deepspeed import is_deepspeed_zero3_enabled
  File "C:\LLaVA\llava_env\lib\site-packages\transformers\integrations\deepspeed.py", line 49, in <module>
    from accelerate.utils.deepspeed import HfDeepSpeedConfig as DeepSpeedConfig
  File "C:\LLaVA\llava_env\lib\site-packages\accelerate\__init__.py", line 3, in <module>
    from .accelerator import Accelerator
  File "C:\LLaVA\llava_env\lib\site-packages\accelerate\accelerator.py", line 35, in <module>
    from .checkpointing import load_accelerator_state, load_custom_state, save_accelerator_state, save_custom_state
  File "C:\LLaVA\llava_env\lib\site-packages\accelerate\checkpointing.py", line 24, in <module>
    from .utils import (
  File "C:\LLaVA\llava_env\lib\site-packages\accelerate\utils\__init__.py", line 131, in <module>
    from .bnb import has_4bit_bnb_layers, load_and_quantize_model
  File "C:\LLaVA\llava_env\lib\site-packages\accelerate\utils\bnb.py", line 42, in <module>
    import bitsandbytes as bnb
  File "C:\LLaVA\llava_env\lib\site-packages\bitsandbytes\__init__.py", line 6, in <module>
    from . import cuda_setup, utils, research
  File "C:\LLaVA\llava_env\lib\site-packages\bitsandbytes\research\__init__.py", line 1, in <module>
    from . import nn
  File "C:\LLaVA\llava_env\lib\site-packages\bitsandbytes\research\nn\__init__.py", line 1, in <module>
    from .modules import LinearFP8Mixed, LinearFP8Global
  File "C:\LLaVA\llava_env\lib\site-packages\bitsandbytes\research\nn\modules.py", line 8, in <module>
    from bitsandbytes.optim import GlobalOptimManager
  File "C:\LLaVA\llava_env\lib\site-packages\bitsandbytes\optim\__init__.py", line 6, in <module>
    from bitsandbytes.cextension import COMPILED_WITH_CUDA
  File "C:\LLaVA\llava_env\lib\site-packages\bitsandbytes\cextension.py", line 20, in <module>
    raise RuntimeError('''
RuntimeError:
    CUDA Setup failed despite GPU being available. Please run the following command to get more information:

    python -m bitsandbytes

    Inspect the output of the command and see if you can locate CUDA libraries. You might need to add them
    to your LD_LIBRARY_PATH. If you suspect a bug, please take the information from python -m bitsandbytes
    and open an issue at: https://github.com/TimDettmers/bitsandbytes/issues

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\LLaVA\llava_env\lib\site-packages\transformers\utils\import_utils.py", line 1382, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python310\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\LLaVA\llava_env\lib\site-packages\transformers\modeling_utils.py", line 42, in <module>
    from .generation import GenerationConfig, GenerationMixin
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "C:\LLaVA\llava_env\lib\site-packages\transformers\utils\import_utils.py", line 1372, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "C:\LLaVA\llava_env\lib\site-packages\transformers\utils\import_utils.py", line 1384, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.generation.utils because of the following error (look up to see its traceback):

    CUDA Setup failed despite GPU being available. Please run the following command to get more information:

    python -m bitsandbytes

    Inspect the output of the command and see if you can locate CUDA libraries. You might need to add them
    to your LD_LIBRARY_PATH. If you suspect a bug, please take the information from python -m bitsandbytes
    and open an issue at: https://github.com/TimDettmers/bitsandbytes/issues

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\LLaVA\llava_env\lib\site-packages\transformers\utils\import_utils.py", line 1382, in _get_module
    return importlib.import_module("." + module_name, self.__name__)
  File "C:\Users\Admin\AppData\Local\Programs\Python\Python310\lib\importlib\__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1050, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1027, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1006, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 688, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 883, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File "C:\LLaVA\llava_env\lib\site-packages\transformers\models\llava\modeling_llava.py", line 23, in <module>
    from ... import PreTrainedModel
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "C:\LLaVA\llava_env\lib\site-packages\transformers\utils\import_utils.py", line 1372, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "C:\LLaVA\llava_env\lib\site-packages\transformers\utils\import_utils.py", line 1384, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.modeling_utils because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):

    CUDA Setup failed despite GPU being available. Please run the following command to get more information:

    python -m bitsandbytes

    Inspect the output of the command and see if you can locate CUDA libraries. You might need to add them
    to your LD_LIBRARY_PATH. If you suspect a bug, please take the information from python -m bitsandbytes
    and open an issue at: https://github.com/TimDettmers/bitsandbytes/issues

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "C:\LLaVA\caption_img.py", line 4, in <module>
    from transformers import AutoProcessor, LlavaForConditionalGeneration
  File "<frozen importlib._bootstrap>", line 1075, in _handle_fromlist
  File "C:\LLaVA\llava_env\lib\site-packages\transformers\utils\import_utils.py", line 1373, in __getattr__
    value = getattr(module, name)
  File "C:\LLaVA\llava_env\lib\site-packages\transformers\utils\import_utils.py", line 1372, in __getattr__
    module = self._get_module(self._class_to_module[name])
  File "C:\LLaVA\llava_env\lib\site-packages\transformers\utils\import_utils.py", line 1384, in _get_module
    raise RuntimeError(
RuntimeError: Failed to import transformers.models.llava.modeling_llava because of the following error (look up to see its traceback):
Failed to import transformers.modeling_utils because of the following error (look up to see its traceback):
Failed to import transformers.generation.utils because of the following error (look up to see its traceback):

    CUDA Setup failed despite GPU being available. Please run the following command to get more information:

    python -m bitsandbytes

    Inspect the output of the command and see if you can locate CUDA libraries. You might need to add them
    to your LD_LIBRARY_PATH. If you suspect a bug, please take the information from python -m bitsandbytes
    and open an issue at: https://github.com/TimDettmers/bitsandbytes/issues
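
For what it's worth, the failure happens at import time, before caption_img.py gets to load any model, so it should reproduce with just the bitsandbytes import in the same venv (minimal repro sketch, nothing LLaVA-specific assumed):

    # Run inside the activated llava_env; this import alone is expected to raise the same
    # "CUDA Setup failed despite GPU being available" RuntimeError, independent of transformers.
    import bitsandbytes  # noqa: F401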
