Commit Graph

24 Commits

SHA1 Message Date
1a4bd9e9a6 Refactor the attention functions.
There's no reason for the whole CrossAttention object to be repeated when
only the operation in the middle changes.
2023-10-11 20:38:48 -04:00
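The refactor described above can be sketched as factoring the interchangeable core operation out of a single shared wrapper, so only the middle step varies between implementations. This is an illustrative sketch with hypothetical names (plain Python lists stand in for tensors), not ComfyUI's actual API:

```python
def attention_core_basic(q, k, v):
    # Stand-in for a plain attention core (toy math for illustration).
    scale = sum(x * x for x in k) or 1.0
    return [qi * sum(ki * vi for ki, vi in zip(k, v)) / scale for qi in q]

def attention_core_sliced(q, k, v, slice_size=2):
    # Same result, computed over query slices to cap peak memory.
    out = []
    for start in range(0, len(q), slice_size):
        out.extend(attention_core_basic(q[start:start + slice_size], k, v))
    return out

def cross_attention(q, k, v, core=attention_core_basic):
    # The projections and reshapes that used to be duplicated across
    # CrossAttention variants live here once; only `core` changes.
    return core(q, k, v)
```

With this shape, adding a new attention backend means writing one core function rather than repeating the whole wrapper.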
1938f5c5fe Add a force argument to soft_empty_cache to force a cache empty. 2023-09-04 00:58:18 -04:00
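A `force` flag on a "soft" cache empty might look like the following minimal sketch. The heuristic and helper names here are hypothetical stand-ins, not the real `model_management` internals:

```python
calls = []

def empty_device_cache():
    # Stand-in for the (potentially expensive) device cache empty.
    calls.append("emptied")

def cache_pressure_high():
    # Hypothetical heuristic: by default, decide the empty isn't worth it.
    return False

def soft_empty_cache(force=False):
    # Skip the cache empty unless the heuristic fires or the caller
    # explicitly forces it.
    if force or cache_pressure_high():
        empty_device_cache()

soft_empty_cache()            # heuristic says no: nothing happens
soft_empty_cache(force=True)  # forced: cache is emptied
```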
bed116a1f9 Remove optimization that caused border. 2023-08-29 11:21:36 -04:00
1c794a2161 Fallback to slice attention if xformers doesn't support the operation. 2023-08-27 22:24:42 -04:00
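The fallback in this commit follows a common try-the-fast-path pattern: attempt the xformers kernel and, if it rejects the operation, redo the work with the sliced implementation. A minimal sketch, with stand-in functions in place of the real kernels:

```python
def xformers_attention(q, k, v):
    # Stand-in: pretend xformers cannot handle this configuration.
    raise NotImplementedError("unsupported attention configuration")

def sliced_attention(q, k, v):
    # Stand-in for the slower but always-available sliced path.
    return "sliced-result"

def attention_with_fallback(q, k, v):
    # Try the optimized backend first; fall back to sliced attention
    # if the operation isn't supported.
    try:
        return xformers_attention(q, k, v)
    except NotImplementedError:
        return sliced_attention(q, k, v)
```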
d935ba50c4 Make --bf16-vae work on torch 2.0 2023-08-27 21:33:53 -04:00
95d796fc85 Faster VAE loading. 2023-07-29 16:28:30 -04:00
fa28d7334b Remove useless code. 2023-06-23 12:35:26 -04:00
b8636a44aa Make scaled_dot_product switch to sliced attention on OOM. 2023-05-20 16:01:02 -04:00
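Switching to sliced attention on out-of-memory is the same shape of fallback, but triggered at runtime by the allocator rather than by a capability check. A self-contained sketch (a fake exception class stands in for the device OOM error):

```python
class FakeOOM(RuntimeError):
    """Stand-in for a device out-of-memory error."""

def full_attention(q):
    # Pretend anything larger than 4 rows exhausts memory.
    if len(q) > 4:
        raise FakeOOM()
    return [x * 2 for x in q]

def attention(q):
    # Attempt the whole computation first; on OOM, redo it in slices
    # small enough to fit.
    try:
        return full_attention(q)
    except FakeOOM:
        out = []
        for i in range(0, len(q), 4):
            out.extend(full_attention(q[i:i + 4]))
        return out
```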
797c4e8d3b Simplify and improve some vae attention code. 2023-05-20 15:07:21 -04:00
bae4fb4a9d Fix imports. 2023-05-04 18:10:29 -04:00
73c3e11e83 Fix model_management import so it doesn't get executed twice. 2023-04-15 19:04:33 -04:00
e46b1c3034 Disable xformers in VAE when xformers == 0.0.18 2023-04-04 22:22:02 -04:00
3ed4a4e4e6 Try again with vae tiled decoding if regular fails because of OOM. 2023-03-22 14:49:00 -04:00
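The retry strategy here can be sketched as: try a one-shot decode, and if it runs out of memory, decode tile by tile and stitch the pieces back together. The sketch below is 1-D and ignores tile overlap/blending that a real tiled VAE decode needs; all names are illustrative:

```python
class FakeOOM(RuntimeError):
    """Stand-in for a device out-of-memory error."""

def decode_full(latent):
    # Stand-in: pretend the one-shot decode always exhausts memory.
    raise FakeOOM("not enough memory for a one-shot decode")

def decode_tile(tile):
    # Stand-in per-tile decode.
    return [x + 1 for x in tile]

def decode(latent, tile_size=2):
    # Regular decode first; fall back to tiled decoding on OOM.
    try:
        return decode_full(latent)
    except FakeOOM:
        out = []
        for i in range(0, len(latent), tile_size):
            out.extend(decode_tile(latent[i:i + tile_size]))
        return out
```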
c692509c2b Try to improve VAEEncode memory usage a bit. 2023-03-22 02:45:18 -04:00
83f23f82b8 Add pytorch attention support to VAE. 2023-03-13 12:45:54 -04:00
a256a2abde --disable-xformers should not even try to import xformers. 2023-03-13 11:36:48 -04:00
0f3ba7482f Xformers is now properly disabled when --cpu used.
Added --windows-standalone-build option; currently it only makes the code open up comfyui in the browser.
2023-03-12 15:44:16 -04:00
1de86851b1 Try to fix memory issue. 2023-03-11 15:15:13 -05:00
cc8baf1080 Make VAE use common function to get free memory. 2023-03-05 14:20:07 -05:00
509c7dfc6d Use real softmax in split op to fix issue with some images. 2023-02-10 03:13:49 -05:00
773cdabfce Same thing but for the other places where it's used. 2023-02-09 12:43:29 -05:00
e8c499ddd4 Split optimization for VAE attention block. 2023-02-08 22:04:20 -05:00
5b4e312749 Use inplace operations for less OOM issues. 2023-02-08 22:04:13 -05:00
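The idea behind in-place operations is that mutating an existing buffer avoids allocating intermediate copies, lowering peak memory at the cost of destroying the input. A toy softmax comparison (plain Python lists standing in for tensors; not the project's actual code):

```python
import math

def softmax_out_of_place(xs):
    exps = [math.exp(x) for x in xs]   # allocates a second buffer
    total = sum(exps)
    return [e / total for e in exps]   # and a third

def softmax_inplace(xs):
    # Reuse the input buffer for every step: lower peak footprint,
    # but the caller's original values are overwritten.
    for i, x in enumerate(xs):
        xs[i] = math.exp(x)
    total = sum(xs)
    for i in range(len(xs)):
        xs[i] /= total
    return xs
```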
220afe3310 Initial commit. 2023-01-16 22:37:14 -05:00