cuda out of memory

#11
by shiyu-wangbyte - opened

RuntimeError: CUDA out of memory. Tried to allocate 216.00 MiB (GPU 0; 10.76 GiB total capacity; 10.03 GiB already allocated; 217.44 MiB free; 10.03 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

Can anyone explain how I can split the memory load across multiple GPUs?
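Two things are worth trying here, sketched below under assumptions. First, the error message itself suggests tuning `max_split_size_mb` via `PYTORCH_CUDA_ALLOC_CONF` to reduce allocator fragmentation (this does not add memory, it only reduces waste); the variable must be set before PyTorch initializes CUDA. Second, to actually shard one model across several GPUs, a common approach is `device_map="auto"` from Hugging Face Accelerate (shown in comments; the model id is a placeholder, not from this thread):

```python
import os

# The CUDA caching allocator reads this env var when PyTorch first
# initializes CUDA, so set it before `import torch`. 128 MiB is only
# a starting guess; smaller values trade speed for less fragmentation.
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

# To split one large model across all visible GPUs (assuming the
# `transformers` and `accelerate` packages are installed):
#
#   from transformers import AutoModelForCausalLM
#   model = AutoModelForCausalLM.from_pretrained(
#       "your/model-id",      # placeholder id
#       device_map="auto",    # shards layers over available GPUs
#   )
```

Note that `device_map="auto"` helps only at inference or with offload-aware training setups; for multi-GPU training, data parallelism (e.g. `torch.nn.parallel.DistributedDataParallel`) splits the batch rather than the model.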
