WindowsWhlBuilder_cuda.bat does not run; it never completes the operation.
(base) C:\Users\TARGET STORE\Desktop\1\flash-attention\flash-attention-windows-wheel>pip install flash_attn-2.7.0.post2+cu124torch2.4.1cxx11abiFALSE-cp310-cp310-win_amd64.whl
ERROR: flash_attn-2.7.0.post2+cu124torch2.4.1cxx11abiFALSE-cp310-cp310-win_amd64.whl is not a supported wheel on this platform.
(base) C:\Users\TARGET STORE\Desktop\1\flash-attention\flash-attention-windows-wheel>
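Hedged note on the error above: pip rejects a wheel when none of its filename tags (cp310-cp310-win_amd64 here, i.e. CPython 3.10 on 64-bit Windows) match the running interpreter, and the (base) conda env may well be a different Python version. A short sketch, assuming the packaging library is available, to list the tags the current interpreter accepts (pip debug --verbose prints the same list):

# List the wheel tags this interpreter will accept; the wheel's
# cp310-cp310-win_amd64 tag must appear here for pip to install it.
from packaging.tags import sys_tags

for tag in list(sys_tags())[:5]:  # the first entries are the most specific
    print(tag)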
(myenv) m@DESKTOP-MUB16F8:~$ pip show flash-attn
Name: flash_attn
Version: 2.7.4.post1
Summary: Flash Attention: Fast and Memory-Efficient Exact Attention
Home-page: https://github.com/Dao-AILab/flash-attention
Author: Tri Dao
Author-email: [email protected]
License:
Location: /home/m/myenv/lib/python3.12/site-packages
Requires: einops, torch
Required-by:
(myenv) m@DESKTOP-MUB16F8:~$
(my10) C:\Users\TARGET STORE\Desktop\1\flash-attention\flash-attention-windows-wheel>
(my10) C:\Users\TARGET STORE\Desktop\1\flash-attention\flash-attention-windows-wheel>pip install flash_attn-2.7.0.post2+cu124torch2.4.1cxx11abiFALSE-cp310-cp310-win_amd64.whl
Processing c:\users\target store\desktop\1\flash-attention\flash-attention-windows-wheel\flash_attn-2.7.0.post2+cu124torch2.4.1cxx11abifalse-cp310-cp310-win_amd64.whl
Requirement already satisfied: torch in c:\programdata\anaconda3\envs\my10\lib\site-packages (from flash-attn==2.7.0.post2+cu124torch2.4.1cxx11abiFALSE) (2.1.0+cu121)
Requirement already satisfied: einops in c:\programdata\anaconda3\envs\my10\lib\site-packages (from flash-attn==2.7.0.post2+cu124torch2.4.1cxx11abiFALSE) (0.8.0)
Requirement already satisfied: filelock in c:\programdata\anaconda3\envs\my10\lib\site-packages (from torch->flash-attn==2.7.0.post2+cu124torch2.4.1cxx11abiFALSE) (3.13.1)
Requirement already satisfied: typing-extensions in c:\programdata\anaconda3\envs\my10\lib\site-packages (from torch->flash-attn==2.7.0.post2+cu124torch2.4.1cxx11abiFALSE) (4.12.2)
Requirement already satisfied: sympy in c:\programdata\anaconda3\envs\my10\lib\site-packages (from torch->flash-attn==2.7.0.post2+cu124torch2.4.1cxx11abiFALSE) (1.13.1)
Requirement already satisfied: networkx in c:\programdata\anaconda3\envs\my10\lib\site-packages (from torch->flash-attn==2.7.0.post2+cu124torch2.4.1cxx11abiFALSE) (3.3)
Requirement already satisfied: jinja2 in c:\programdata\anaconda3\envs\my10\lib\site-packages (from torch->flash-attn==2.7.0.post2+cu124torch2.4.1cxx11abiFALSE) (3.1.4)
Requirement already satisfied: fsspec in c:\programdata\anaconda3\envs\my10\lib\site-packages (from torch->flash-attn==2.7.0.post2+cu124torch2.4.1cxx11abiFALSE) (2024.6.1)
Requirement already satisfied: MarkupSafe>=2.0 in c:\programdata\anaconda3\envs\my10\lib\site-packages (from jinja2->torch->flash-attn==2.7.0.post2+cu124torch2.4.1cxx11abiFALSE) (2.1.3)
Requirement already satisfied: mpmath<1.4,>=1.1.0 in c:\programdata\anaconda3\envs\my10\lib\site-packages (from sympy->torch->flash-attn==2.7.0.post2+cu124torch2.4.1cxx11abiFALSE) (1.3.0)
flash-attn is already installed with the same version as the provided wheel. Use --force-reinstall to force an installation of the wheel.
(my10) C:\Users\TARGET STORE\Desktop\1\flash-attention\flash-attention-windows-wheel>
(my10) C:\Users\TARGET STORE\Desktop\1\flash-attention\flash-attention-windows-wheel>python
Python 3.10.16 | packaged by Anaconda, Inc. | (main, Dec 11 2024, 16:19:12) [MSC v.1929 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import flash_attn
Traceback (most recent call last):
File "", line 1, in
File "C:\ProgramData\anaconda3\envs\my10\lib\site-packages\flash_attn_init.py", line 3, in
from flash_attn.flash_attn_interface import (
File "C:\ProgramData\anaconda3\envs\my10\lib\site-packages\flash_attn\flash_attn_interface.py", line 5, in
import torch
File "C:\ProgramData\anaconda3\envs\my10\lib\site-packages\torch_init.py", line 137, in
raise err
OSError: [WinError 127] The specified procedure could not be found. Error loading "C:\ProgramData\anaconda3\envs\my10\lib\site-packages\torch\lib\nvfuser_codegen.dll" or one of its dependencies.
>>>  from flash_attn import flash_attn_func
  File "<stdin>", line 1
    from flash_attn import flash_attn_func
IndentationError: unexpected indent
>>> from flash_attn import flash_attn_func
Traceback (most recent call last):
File "", line 1, in
File "C:\ProgramData\anaconda3\envs\my10\lib\site-packages\flash_attn_init.py", line 3, in
from flash_attn.flash_attn_interface import (
File "C:\ProgramData\anaconda3\envs\my10\lib\site-packages\flash_attn\flash_attn_interface.py", line 5, in
import torch
File "C:\ProgramData\anaconda3\envs\my10\lib\site-packages\torch_init.py", line 137, in
raise err
OSError: [WinError 127] The specified procedure could not be found. Error loading "C:\ProgramData\anaconda3\envs\my10\lib\site-packages\torch\lib\nvfuser_codegen.dll" or one of its dependencies.
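A hedged interpretation of the two tracebacks above: torch itself fails while loading nvfuser_codegen.dll, so the my10 env's torch build (2.1.0+cu121 per the earlier pip output) appears broken or mismatched, and the wheel being installed targets torch 2.4.1+cu124 in any case. Since import torch fails outright, the installed builds can still be read from package metadata; a minimal sketch:

# Read installed versions without importing torch (the import itself fails here).
from importlib.metadata import version

print(version("torch"))       # the log above shows 2.1.0+cu121
print(version("flash_attn"))  # the wheel's local tag encodes cu124 + torch 2.4.1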
(my10) C:\Users\TARGET STORE\Desktop\1\flash-attention\flash-attention-windows-wheel>WindowsWhlBuilder_cuda.bat FORCE_CXX11_ABI=TRUE
MAX_JOBS: 1
FLASH_ATTENTION_FORCE_CXX11_ABI: TRUE
Requirement already satisfied: setuptools>=49.6.0 in c:\programdata\anaconda3\envs\my10\lib\site-packages (75.8.0)
Requirement already satisfied: packaging in c:\programdata\anaconda3\envs\my10\lib\site-packages (24.2)
Requirement already satisfied: wheel in c:\programdata\anaconda3\envs\my10\lib\site-packages (0.45.1)
Requirement already satisfied: psutil in c:\programdata\anaconda3\envs\my10\lib\site-packages (6.1.1)
python: can't open file 'C:\Users\TARGET STORE\Desktop\1\flash-attention\flash-attention-windows-wheel\setup.py': [Errno 2] No such file or directory
Traceback (most recent call last):
File "", line 1, in
File "C:\ProgramData\anaconda3\envs\my10\lib\site-packages\torch_init_.py", line 137, in
raise err
OSError: [WinError 127] The specified procedure could not be found. Error loading "C:\ProgramData\anaconda3\envs\my10\lib\site-packages\torch\lib\nvfuser_codegen.dll" or one of its dependencies.
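A hedged observation on the setup.py error: the builder script was invoked from the flash-attention-windows-wheel subfolder, but setup.py lives in the repository root, so launching the script from C:\Users\TARGET STORE\Desktop\1\flash-attention should at least get past that message (the torch DLL failure is a separate problem). A trivial check, sketched with the paths taken from the log:

# Confirm where setup.py actually lives (paths copied from the log above).
import os

repo_root = r"C:\Users\TARGET STORE\Desktop\1\flash-attention"
wheel_dir = os.path.join(repo_root, "flash-attention-windows-wheel")
print(os.path.exists(os.path.join(repo_root, "setup.py")))   # expected: True
print(os.path.exists(os.path.join(wheel_dir, "setup.py")))   # the log implies: False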
(my10) C:\Users\TARGET STORE\Desktop\1\flash-attention\flash-attention-windows-wheel>
(my10) C:\Users\TARGET STORE\Desktop\1\flash-attention\build\lib.win-amd64-cpython-310\flash_attn\models>python gpt.py
Traceback (most recent call last):
File "C:\Users\TARGET STORE\Desktop\1\flash-attention\build\lib.win-amd64-cpython-310\flash_attn\models\gpt.py", line 11, in
import torch
File "C:\ProgramData\anaconda3\envs\my10\lib\site-packages\torch_init_.py", line 137, in
raise err
OSError: [WinError 127] The specified procedure could not be found. Error loading "C:\ProgramData\anaconda3\envs\my10\lib\site-packages\torch\lib\nvfuser_codegen.dll" or one of its dependencies.
(my10) C:\Users\TARGET STORE\Desktop\1\flash-attention\build\lib.win-amd64-cpython-310\flash_attn\models>
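Hedged takeaway from the repeated nvfuser_codegen.dll failure: nothing that imports torch can run in my10 until torch itself loads, so reinstalling a torch build that matches the wheel's tags looks like the necessary first step. A sketch (the pinned version is an assumption chosen to match the wheel name, not taken from the log):

# Hypothetical repair, run in the shell rather than Python:
#   pip install --force-reinstall torch==2.4.1 --index-url https://download.pytorch.org/whl/cu124
# Then verify the import that was failing:
import torch

print(torch.__version__, torch.version.cuda, torch.cuda.is_available())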
(myenv) m@DESKTOP-MUB16F8:~$ python3
>>> import flash_attn
Traceback (most recent call last):
File "", line 1, in
File "/home/m/myenv/lib/python3.12/site-packages/flash_attn/init.py", line 3, in
from flash_attn.flash_attn_interface import (
File "/home/m/myenv/lib/python3.12/site-packages/flash_attn/flash_attn_interface.py", line 15, in
import flash_attn_2_cuda as flash_attn_gpu
ImportError: /home/m/myenv/lib/python3.12/site-packages/flash_attn_2_cuda.cpython-312-x86_64-linux-gnu.so: undefined symbol: _ZNK3c1011StorageImpl27throw_data_ptr_access_errorEv
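Hedged reading of the undefined symbol: it demangles to c10::StorageImpl::throw_data_ptr_access_error() const, a libtorch symbol, which means the prebuilt flash_attn_2_cuda.so was compiled against a different torch than the one installed; the usual remedy is building flash-attn against the local torch (pip install flash-attn --no-build-isolation, per the project README). A quick pairing check, sketched:

# Compare installed versions; a binary built for a different libtorch is the
# usual cause of an undefined c10::... symbol at import time.
from importlib.metadata import version

print("torch:", version("torch"))
print("flash_attn:", version("flash_attn"))
# If they disagree, rebuild against the local torch:
#   pip install flash-attn --no-build-isolation --force-reinstall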
>>> try:
...     import torch
...     import flash_attn
...     from flash_attn import flash_attn_func
...
...     # Verify version
...     print(f"Flash Attention version: {flash_attn.__version__}")
...
...     # Basic functionality test
...     if torch.cuda.is_available():
...         q = torch.randn(2, 8, 32, 64, device='cuda')
...         k = torch.randn(2, 8, 32, 64, device='cuda')
...         v = torch.randn(2, 8, 32, 64, device='cuda')
...
...         output = flash_attn_func(q, k, v)
...         print("Flash Attention test successful!")
...     else:
...         print("CUDA device not available!")
... except ImportError as e:
...     print(f"Import Error: {e}")
... except RuntimeError as e:
...     print(f"Runtime Error: {e}")
...
Import Error: /home/m/myenv/lib/python3.12/site-packages/flash_attn_2_cuda.cpython-312-x86_64-linux-gnu.so: undefined symbol: _ZNK3c1011StorageImpl27throw_data_ptr_access_errorEv
Please check your torch version.
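For reference, once the versions match, a minimal smoke test; this sketch also avoids a pitfall the session above would have hit next, since flash_attn_func requires fp16 or bf16 inputs while torch.randn defaults to fp32:

import torch
from flash_attn import flash_attn_func

# flash_attn_func expects (batch, seqlen, nheads, headdim) tensors in fp16/bf16.
q = torch.randn(2, 32, 8, 64, device="cuda", dtype=torch.float16)
k = torch.randn(2, 32, 8, 64, device="cuda", dtype=torch.float16)
v = torch.randn(2, 32, 8, 64, device="cuda", dtype=torch.float16)

out = flash_attn_func(q, k, v)
print(out.shape)  # torch.Size([2, 32, 8, 64])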