fa34efe3bd
Update frontend to v1.2.47 (#4798)
* Update web content to release v1.2.47
* Update shortcut list
2024-09-05 18:56:01 -04:00
5cbaa9e07c
MistoLine Flux ControlNet support.
2024-09-05 00:05:17 -04:00
c7427375ee
Prioritize freeing partially offloaded models first.
2024-09-04 19:47:32 -04:00
22d1241a50
Add an experimental LoraSave node to extract model loras.
The model_diff input should be connected to the output of a
ModelMergeSubtract node.
2024-09-04 16:38:38 -04:00
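Extracting a LoRA from a model diff, as the LoraSave entry above describes, is conventionally done with a truncated SVD of each weight difference. A minimal sketch under that assumption (the function below is illustrative, not the actual LoraSave implementation):

```python
import numpy as np

def extract_lora(weight_diff: np.ndarray, rank: int):
    """Approximate a weight difference as a low-rank pair: weight_diff ~= up @ down."""
    u, s, vt = np.linalg.svd(weight_diff, full_matrices=False)
    sqrt_s = np.sqrt(s[:rank])          # split singular values between the two factors
    up = u[:, :rank] * sqrt_s           # (out_features, rank)
    down = sqrt_s[:, None] * vt[:rank]  # (rank, in_features)
    return up, down

# A diff that is exactly rank 4 is recovered by a rank-4 extraction.
rng = np.random.default_rng(0)
diff = rng.standard_normal((64, 4)) @ rng.standard_normal((4, 32))
up, down = extract_lora(diff, rank=4)
print(np.allclose(up @ down, diff, atol=1e-6))  # prints True
```

In a workflow, the ModelMergeSubtract node produces exactly such a diff (model A minus model B), which is why its output is what feeds the model_diff input.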
f04229b84d
Add emb_patch support to UNetModel forward (#4779)
2024-09-04 14:35:15 -04:00
f067ad15d1
Make live preview size a configurable launch argument (#4649)
* Make live preview size a configurable launch argument
* Remove import from testing phase
* Update cli_args.py
2024-09-03 19:16:38 -04:00
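Wiring a launch argument like this through argparse can be sketched as follows; the flag name, default, and help text below are assumptions for illustration, not the actual contents of cli_args.py:

```python
import argparse

# Hypothetical sketch of a configurable live-preview size launch argument.
parser = argparse.ArgumentParser()
parser.add_argument("--preview-size", type=int, default=512,
                    help="Maximum edge length, in pixels, of live preview images.")

# Parse an explicit argument list instead of sys.argv for the demo.
args = parser.parse_args(["--preview-size", "768"])
print(args.preview_size)  # prints 768
```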
483004dd1d
Support newer GLoRA format.
v0.2.1
2024-09-03 17:02:19 -04:00
00a5d08103
Lower fp8 LoRA memory usage.
2024-09-03 01:25:05 -04:00
d043997d30
Flux OneTrainer LoRA support.
2024-09-02 08:22:15 -04:00
f1c2301697
Fix typo in stale-issues (#4735)
v0.2.0
2024-09-01 17:44:49 -04:00
8d31a6632f
Speed up inference on NVIDIA 10 series GPUs on Linux.
2024-09-01 17:29:31 -04:00
b643eae08b
Make minimum_inference_memory() depend on --reserve-vram
2024-09-01 01:18:34 -04:00
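The relationship the entry above describes can be sketched as: the inference-memory floor is a fixed base plus whatever the user asked to reserve. A hypothetical sketch (the names and the 1 GiB base are assumptions, not ComfyUI's actual implementation):

```python
# Hypothetical sketch: make the inference-memory floor honor a
# --reserve-vram style setting.
GIB = 1024 ** 3
extra_reserved_vram = 0  # bytes; would be populated from a --reserve-vram launch argument

def minimum_inference_memory() -> int:
    """Smallest amount of VRAM to keep free for inference."""
    return 1 * GIB + extra_reserved_vram

print(minimum_inference_memory())  # prints 1073741824 when nothing extra is reserved
```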
baa6b4dc36
Update manual install instructions.
2024-08-31 04:37:23 -04:00
d4aeefc297
Add GitHub Action to automatically handle stale user support issues (#4683)
* add github action to automatically handle stale user support issues
* improve stale message
* remove token part
2024-08-31 01:57:18 -04:00
587e7ca654
Remove github buttons.
2024-08-31 01:53:10 -04:00
c90459eba0
Update ComfyUI_frontend to 1.2.40 (#4691)
* Update ComfyUI_frontend to 1.2.40
* Add files
2024-08-30 19:32:10 -04:00
04278afb10
feat: return import_failed from init_extra_nodes function (#4694)
2024-08-30 19:26:47 -04:00
935ae153e1
Cleanup.
2024-08-30 12:53:59 -04:00
e91662e784
Get logs endpoint & system_stats additions (#4690)
* Add route for getting output logs
* Include ComfyUI version
* Move to own function
* Changed to memory logger
* Unify logger setup logic
* Fix get version git fallback
---------
Co-authored-by: pythongosssss <125205205+pythongosssss@users.noreply.github.com>
2024-08-30 12:46:37 -04:00
63fafaef45
Fix potential issue with HyDiT controlnets.
2024-08-30 04:58:41 -04:00
ec28cd9136
swap legacy sdv15 link (#4682)
* swap legacy sdv15 link
* swap v15 ckpt examples to safetensors
* link the fp16 copy of the model by default
2024-08-29 19:48:48 -04:00
6eb5d64522
Fix GLoRA lowvram issue.
2024-08-29 19:07:23 -04:00
10a79e9898
Implement model part of flux union controlnet.
2024-08-29 18:41:22 -04:00
ea3f39bd69
InstantX depth flux controlnet.
2024-08-29 02:14:19 -04:00
b33cd61070
InstantX canny controlnet.
2024-08-28 19:02:50 -04:00
34eda0f853
fix: remove redundant loop (#4656)
fix: potential error of undefined variable
https://github.com/comfyanonymous/ComfyUI/discussions/4650
2024-08-28 17:46:30 -04:00
d31e226650
Unify RMSNorm code.
2024-08-28 16:56:38 -04:00
b79fd7d92c
ComfyUI supports more than just Stable Diffusion.
2024-08-28 16:12:24 -04:00
38c22e631a
Fix case where model was not properly unloaded in merging workflows.
2024-08-27 19:03:51 -04:00
6bbdcd28ae
Support weight padding on diff weight patch (#4576)
2024-08-27 13:55:37 -04:00
ab130001a8
Do RMSNorm in native type.
2024-08-27 02:41:56 -04:00
ca4b8f30e0
Clean up empty dir if frontend zip download fails (#4574)
2024-08-27 02:07:25 -04:00
70b84058c1
Add relative file path to the progress report (#4621)
2024-08-27 02:06:12 -04:00
2ca8f6e23d
Make the stochastic fp8 rounding reproducible.
2024-08-26 15:12:06 -04:00
7985ff88b9
Use less memory in float8 LoRA patching by doing calculations in fp16.
2024-08-26 14:45:58 -04:00
c6812947e9
Fix potential memory leak.
v0.1.3
2024-08-26 02:07:32 -04:00
9230f65823
Fix some controlnets OOMing when loading.
2024-08-25 05:54:29 -04:00
6ab1e6fd4a
[Bug #4529] Fix graph partial validation failure (#4588)
Currently, if a graph partially fails validation (i.e. some outputs are
valid while others have links from missing nodes), the execution loop
could get an exception resulting in server lockup.
This isn't actually possible to reproduce via the default UI, but is a
potential issue for people using the API to construct invalid graphs.
2024-08-24 15:34:58 -04:00
07dcbc3a3e
Clarify how to use high quality previews.
2024-08-24 02:31:03 -04:00
8ae23d8e80
Fix onnx export.
2024-08-23 17:52:47 -04:00
7df42b9a23
Fix DoRA.
v0.1.2
2024-08-23 04:58:59 -04:00
5d8bbb7281
Cleanup.
2024-08-23 04:06:27 -04:00
2c1d2375d6
Fix.
2024-08-23 04:04:55 -04:00
64ccb3c7e3
Rework IPEX check for future inclusion of XPU into PyTorch upstream and do a bit more optimization of ipex.optimize(). (#4562)
2024-08-23 03:59:57 -04:00
9465b23432
Added SD15_Inpaint_Diffusers model support for unet_config_from_diffusers_unet function (#4565)
2024-08-23 03:57:08 -04:00
bb4416dd5b
Fix task.status.status_str caused by #2666 (#4551)
* Fix task.status.status_str caused by the #2666 regression
* fix
* fix
v0.1.1
2024-08-22 17:38:30 -04:00
c0b0da264b
Missing imports.
2024-08-22 17:20:51 -04:00
c26ca27207
Move calculate function to comfy.lora
2024-08-22 17:12:00 -04:00
7c6bb84016
Code cleanups.
2024-08-22 17:05:12 -04:00
c54d3ed5e6
Fix issue with models staying loaded in memory.
2024-08-22 15:58:20 -04:00