Runtime error
Exit code: 1. Reason: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(

Downloading shards:   0%|          | 0/7 [00:00<?, ?it/s]
Downloading shards:  14%|█▍        | 1/7 [00:11<01:10, 11.75s/it]
Downloading shards:  29%|██▉       | 2/7 [00:24<01:01, 12.36s/it]
Downloading shards:  43%|████▎     | 3/7 [00:36<00:49, 12.40s/it]
Downloading shards:  57%|█████▋    | 4/7 [00:48<00:36, 12.11s/it]
Downloading shards:  71%|███████▏  | 5/7 [01:01<00:24, 12.39s/it]
Downloading shards:  86%|████████▌ | 6/7 [01:15<00:12, 12.94s/it]
Downloading shards: 100%|██████████| 7/7 [01:27<00:00, 12.43s/it]

Traceback (most recent call last):
  File "/home/user/app/app.py", line 82, in <module>
    "idefics2-8b-chatty": Idefics2ForConditionalGeneration.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 3544, in from_pretrained
    config = cls._autoset_attn_implementation(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/idefics2/modeling_idefics2.py", line 1385, in _autoset_attn_implementation
    config = super()._autoset_attn_implementation(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1454, in _autoset_attn_implementation
    cls._check_and_enable_flash_attn_2(
  File "/usr/local/lib/python3.10/site-packages/transformers/modeling_utils.py", line 1555, in _check_and_enable_flash_attn_2
    raise ImportError(f"{preface} Flash Attention 2 is not available. {install_message}")
ImportError: FlashAttention2 has been toggled on, but it cannot be used due to the following error: Flash Attention 2 is not available. Please refer to the documentation of https://huggingface.co/docs/transformers/perf_infer_gpu_one#flashattention-2 to install Flash Attention 2.
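The traceback shows that app.py line 82 requests `attn_implementation="flash_attention_2"`, but the flash-attn package is not importable in the container, so `from_pretrained` raises ImportError before the model loads. Below is a minimal sketch of a defensive loading pattern that falls back to PyTorch's built-in SDPA backend when flash-attn is missing; the checkpoint id `HuggingFaceM4/idefics2-8b-chatty` and the bfloat16 dtype are illustrative assumptions, not taken from the log (the log only shows the dict key "idefics2-8b-chatty"):

```python
import torch
from transformers import Idefics2ForConditionalGeneration
from transformers.utils import is_flash_attn_2_available

# Illustrative checkpoint id; the actual one used by app.py is not in the log.
MODEL_ID = "HuggingFaceM4/idefics2-8b-chatty"

# Ask for FlashAttention-2 only when the flash-attn package is importable;
# otherwise fall back to PyTorch's scaled-dot-product attention ("sdpa").
attn_impl = "flash_attention_2" if is_flash_attn_2_available() else "sdpa"

model = Idefics2ForConditionalGeneration.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,    # flash-attn kernels require fp16/bf16
    attn_implementation=attn_impl,
)
```

Alternatively, keeping the flash-attention path means installing the kernel in the image (the docs linked in the error message use `pip install flash-attn --no-build-isolation` on a CUDA build); the sdpa fallback only swaps the attention backend and does not change the weights. The FutureWarning at the top of the log is unrelated: huggingface_hub has deprecated the `resume_download` argument, and the warning goes away once callers stop passing it, since downloads now resume automatically and `force_download=True` forces a fresh download.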