18 Commits

Author SHA1 Message Date
0bfb936ab4 comfy-aimdo 0.2 - Improved pytorch allocator integration (#12557)
Integrate comfy-aimdo 0.2, which takes a different approach to
installing the memory allocator hook. Instead of using the complicated
and buggy pytorch MemPool+CUDAPluggableAllocator, CUDA is hooked
directly, making the process much more transparent to both comfy and
pytorch. As far as pytorch knows, aimdo doesn't exist anymore; it just
operates behind the scenes.

Remove all the mempool setup code for dynamic_vram and bump the
comfy-aimdo version. Remove the allocator object from memory_management
and demote its use as an enablement check to a boolean flag.

Comfy-aimdo 0.2 also supports the pytorch cuda async allocator, so
remove the dynamic_vram based force-disablement of cuda_malloc and
just go back to the old behavior of selecting allocators based on
command line input.
2026-02-21 10:52:57 -08:00
f8acd9c402 Reduce RAM usage, fix VRAM OOMs, and fix Windows shared memory spilling with adaptive model loading (#11845) 2026-02-01 01:01:11 -05:00
d7a0aef650 Set OCL_SET_SVM_SIZE on AMD. (#11139) 2025-12-06 00:15:21 -05:00
5b80addafd Turn off cuda malloc by default when --fast autotune is turned on. (#10393) 2025-10-18 22:35:46 -04:00
e78d230496 Only enable cuda malloc on cuda torch. (#9031) 2025-07-23 19:37:43 -04:00
f1d6cef71c Revert "Disable cuda malloc by default."
This reverts commit 50bf66e5c4.
2024-08-14 08:38:07 -04:00
50bf66e5c4 Disable cuda malloc by default. 2024-08-14 02:49:25 -04:00
2f93b91646 Add Tesla GPUs to cuda malloc blacklist. 2024-03-26 23:09:28 -04:00
caddef8d88 Auto disable cuda malloc on unsupported GPUs on Linux. 2024-03-04 09:03:59 -05:00
192ca0676c Add some more cards to the cuda malloc blacklist. 2023-08-13 16:08:11 -04:00
861fd58819 Add a warning if a card that doesn't support cuda malloc has it enabled. 2023-08-13 12:37:53 -04:00
fc71cf656e Add some 800M gpus to cuda malloc blacklist. 2023-08-05 21:54:52 -04:00
5a90d3cea5 GeForce MX110 + MX130 are maxwell. 2023-08-04 21:44:37 -04:00
7c0a5a3e0e Disable cuda malloc on a bunch of quadro cards. 2023-07-25 00:09:01 -04:00
30de083dd0 Disable cuda malloc on all the 9xx series. 2023-07-23 13:29:14 -04:00
85a8900a14 Disable cuda malloc on regular GTX 960. 2023-07-22 11:05:33 -04:00
39c58b227f Disable cuda malloc on GTX 750 Ti. 2023-07-19 15:14:10 -04:00
799c08a4ce Auto disable cuda malloc on some GPUs on windows. 2023-07-19 14:43:55 -04:00