Commit Graph

1664 Commits

Author SHA1 Message Date
72e3feb573 Load API JSON (#1932)
* added loading api json

* revert async change

* reorder
2023-11-09 13:33:43 -05:00
cd6df8b323 Fix sanitize node name removing the "/" character. 2023-11-09 13:10:19 -05:00
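The sanitize fix above can be sketched as follows. This is a hypothetical illustration, not the repository's actual implementation: the function name and the exact character class are assumptions; the point is that "/" is deliberately excluded from the stripped set so namespaced node names survive.

```python
import re

def sanitize_node_name(name: str) -> str:
    # Hypothetical sketch: strip characters usable for markup injection,
    # but deliberately keep "/" so namespaced node names stay intact.
    return re.sub(r'[&<>"\'\\]', '', name)
```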
ec12000136 Add support for full diff lora keys. 2023-11-08 22:05:31 -05:00
064d7583eb Add a CONDConstant for passing non tensor conds to unet. 2023-11-08 01:59:09 -05:00
794dd2064d Fix typo. 2023-11-07 23:41:55 -05:00
0a6fd49a3e Print leftover keys when using the UNETLoader. 2023-11-07 22:15:55 -05:00
fe40109b57 Fix issue with object patches not being copied with patcher. 2023-11-07 22:15:15 -05:00
a527d0c795 Code refactor. 2023-11-07 19:33:40 -05:00
2a23ba0b8c Fix unet ops not entirely on GPU. 2023-11-07 04:30:37 -05:00
844dbf97a7 Add: advanced->model->ModelSamplingDiscrete node.
This allows changing the sampling parameters of the model (eps or vpred)
or setting the model to use zsnr.
2023-11-07 03:28:53 -05:00
d07cd44272 Merge branch 'master' of https://github.com/cubiq/ComfyUI 2023-11-07 01:52:13 -05:00
656c0b5d90 CLIP code refactor and improvements.
More generic clip model class that can be used on more types of text
encoders.

Don't apply the weighting algorithm when the weight is 1.0.

Don't compute an empty token output when it's not needed.
2023-11-06 14:17:41 -05:00
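The weight == 1.0 shortcut in the commit above can be sketched like this. This is an assumed form of a common prompt-weighting scheme (interpolating between the empty-prompt embedding and the token embedding), not necessarily the exact one in the codebase:

```python
def apply_token_weight(embed, empty_embed, weight):
    # Hypothetical sketch: interpolate between the empty-prompt
    # embedding and the token embedding. At weight == 1.0 the result
    # is exactly the raw embedding, so both the interpolation and the
    # computation of empty_embed can be skipped entirely.
    if weight == 1.0:
        return embed
    return empty_embed + (embed - empty_embed) * weight
```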
b3fcd64c6c Make SDTokenizer class work with more types of tokenizers. 2023-11-06 01:09:18 -05:00
4acfc11a80 add difference blend mode 2023-11-05 19:00:23 +01:00
a6c83b3cd0 Merge branch 'fix_unet_wrapper_function_name' of https://github.com/gameltb/ComfyUI 2023-11-05 12:41:38 -05:00
02f062b5b7 Sanitize unknown node types on load to prevent XSS. 2023-11-05 12:29:28 -05:00
7e455adc07 fix unet_wrapper_function name in ModelPatcher 2023-11-05 17:11:44 +08:00
1ffa8858e7 Move model sampling code to comfy/model_sampling.py 2023-11-04 01:32:23 -04:00
ae2acfc21b Don't convert NaN to zero.
Converting NaN to zero is a bad idea because it makes it hard to tell when
something went wrong.
2023-11-03 13:13:15 -04:00
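A tiny illustration of the reasoning in the commit above (generic Python, not the repository's code): silently mapping NaN to zero produces plausible-looking values, while letting NaN propagate keeps the failure detectable.

```python
import math

def nan_to_zero(values):
    # The misleading behaviour: a broken value silently becomes 0.0
    # and the output looks valid.
    return [0.0 if math.isnan(v) else v for v in values]

def has_nan(values):
    # Letting NaN propagate keeps the upstream failure detectable.
    return any(math.isnan(v) for v in values)
```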
ee74ef5c9e Increase maximum batch size in LatentRebatch. 2023-11-02 13:07:41 -04:00
6e84a01ecc Refactor the template manager (#1878)
* add drag-drop to node template manager

* better dnd, save field on change

* actually save templates

---------

Co-authored-by: matt3o <matt3o@gmail.com>
2023-11-02 12:29:57 -04:00
dd116abfc4 Merge branch 'quantize-dither' of https://github.com/tsone/ComfyUI 2023-11-02 00:57:00 -04:00
d2e27b48f1 sampler_cfg_function now gets the noisy output as argument again.
This should make things that use sampler_cfg_function behave like before.

Added an input argument for those that want the denoised output.

This means you can calculate the x0 prediction of the model by doing:
(input - cond) for example.
2023-11-01 21:24:08 -04:00
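The commit above can be sketched as follows. This is a hedged illustration of the arithmetic it describes, with assumed function and argument names rather than the actual sampler_cfg_function signature: guidance is combined on the noisy outputs, and the x0 prediction is recovered as input - cond per the commit message.

```python
def cfg_combine(cond, uncond, cond_scale):
    # Standard classifier-free guidance combine on the model's
    # (noisy) outputs: extrapolate from uncond toward cond.
    return uncond + (cond - uncond) * cond_scale

def x0_prediction(input_x, cond):
    # Per the commit message: the x0 (denoised) prediction can be
    # recovered from the conditioned noisy output as input - cond.
    return input_x - cond
```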
2455aaed8a Allow model or clip to be None in load_lora_for_models. 2023-11-01 20:27:20 -04:00
45a3df1cde Merge branch 'filter-widgets-crash-fix' of https://github.com/Jantolick/ComfyUI 2023-11-01 20:17:25 -04:00
ecb80abb58 Allow ModelSamplingDiscrete to be instantiated without a model config. 2023-11-01 19:13:03 -04:00
88410ace9b fix: handle null case for currentNode widgets to prevent scroll error 2023-11-01 16:52:51 -04:00
e73ec8c4da Not used anymore. 2023-11-01 00:01:30 -04:00
111f1b5255 Fix some issues with sampling precision. 2023-10-31 23:49:29 -04:00
7c0f255de1 Clean up percent start/end and make controlnets work with sigmas. 2023-10-31 22:14:32 -04:00
a268a574fa Remove a bunch of useless code.
DDIM is the same as euler with a small difference in the inpaint code.
DDIM uses randn_like but I set a fixed seed instead.

I'm keeping it in because I'm sure if I remove it people are going to
complain.
2023-10-31 18:11:29 -04:00
1777b54d02 Sampling code changes.
apply_model in model_base now returns the denoised output.

This means that sampling_function now computes things on the denoised
output instead of the model output. This should make things more consistent
across current and future models.
2023-10-31 17:33:43 -04:00
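The change described above can be illustrated with a minimal sketch, assuming the k-diffusion-style parameterization x = x0 + sigma * eps (the function name and signature are hypothetical, not the repository's apply_model): a model that internally predicts noise now returns the denoised sample, so downstream sampling code no longer needs to know the parameterization.

```python
def apply_model(x, sigma, eps_pred):
    # Sketch: convert an internal eps prediction to the denoised
    # sample before returning, so eps-, v-pred-, and other models
    # all look identical to the sampling loop.
    return x - sigma * eps_pred
```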
23c5d17837 Added Bayer dithering to Quantize node. 2023-10-31 22:22:40 +01:00
c837a173fa Fix some memory issues in sub quad attention. 2023-10-30 15:30:49 -04:00
125b03eead Fix some OOM issues with split attention. 2023-10-30 13:14:11 -04:00
41b07ff8d7 Fix TAESD preview to only decode first latent, instead of all 2023-10-29 13:30:23 -05:00
a12cc05323 Add --max-upload-size argument, the default is 100MB. 2023-10-29 03:55:46 -04:00
aac8fc99d6 Cleanup webp import code a bit. 2023-10-28 12:24:50 -04:00
2a134bfab9 Fix checkpoint loader with config. 2023-10-27 22:13:55 -04:00
e60ca6929a SD1 and SD2 clip and tokenizer code is now more similar to the SDXL one. 2023-10-27 15:54:04 -04:00
6ec3f12c6e Support SSD1B model and make it easier to support asymmetric unets. 2023-10-27 14:45:15 -04:00
434ce25ec0 Restrict loading embeddings from embedding folders. 2023-10-27 02:54:13 -04:00
40963b5a16 Apply primitive nodes to graph before serializing workflow. 2023-10-26 19:52:41 -04:00
723847f6b3 Faster clip image processing. 2023-10-26 01:53:01 -04:00
a373367b0c Fix some OOM issues with split and sub quad attention. 2023-10-25 20:17:28 -04:00
7fbb217d3a Fix uni_pc returning noisy image when steps <= 3 2023-10-25 16:08:30 -04:00
3783cb8bfd change 'c_adm' to 'y' in ControlNet.get_control 2023-10-25 08:24:32 -05:00
d1d2fea806 Pass extra conds directly to unet. 2023-10-25 00:07:53 -04:00
036f88c621 Refactor to make it easier to add custom conds to models. 2023-10-24 23:31:12 -04:00
3fce8881ca Sampling code refactor to make it easier to add more conds. 2023-10-24 03:38:41 -04:00