a84cd0d1ad
Don't needlessly unload/reload the model from the CPU.
2023-02-08 03:40:43 -05:00
e3e65947f2
Add a --help to main.py
2023-02-07 22:13:42 -05:00
1f18221e17
Add --port to set custom port.
2023-02-07 21:57:17 -05:00
6e40393b6b
Fix delete sometimes not properly refreshing queue state.
2023-02-07 00:07:31 -05:00
d71d0c88e5
Add some simple queue management to the GUI.
2023-02-06 23:40:38 -05:00
b1a7c9ebf6
Embeddings/textual inversion support for SD2.x
2023-02-05 15:49:03 -05:00
1de5aa6a59
Add a CLIPLoader node to load standalone clip weights.
...
Put them in models/clip
2023-02-05 15:20:18 -05:00
56d802e1f3
Use transformers CLIP instead of open_clip for SD2.x
...
This should make things a bit cleaner.
2023-02-05 14:36:28 -05:00
bf9ccffb17
Small fix for SD2.x loras.
2023-02-05 11:38:25 -05:00
678105fade
SD2.x CLIP support for Loras.
2023-02-05 01:54:09 -05:00
3f3d77a324
Fix image node always executing instead of only when the image changed.
2023-02-04 16:08:29 -05:00
4225d1cb9f
Add a basic ImageScale node.
...
It's pretty much the same as the LatentUpscale node for now, but for images
in pixel space.
2023-02-04 16:01:01 -05:00
bff0e11941
Add a LatentCrop node.
2023-02-04 15:21:46 -05:00
43c795f462
Add a --listen argument to listen on 0.0.0.0
2023-02-04 12:01:53 -05:00
41a7532c15
A bit bigger.
2023-02-03 13:56:00 -05:00
7bc3f91bd6
Add some instructions on how to use the venv from another SD install.
2023-02-03 13:54:45 -05:00
149a4de3f2
Fix potential issue if exception happens when patching model.
2023-02-03 03:55:50 -05:00
ef90e9c376
Add a LoraLoader node to apply loras to models and clip.
...
The models are modified in place before being used and unpatched after.
I think this is better than monkeypatching since it might make it easier
to use faster non-pytorch UNet inference in the future.
2023-02-03 02:46:24 -05:00
96664f5d5e
Web interface bug fix for multiple inputs from the same node.
2023-02-03 00:39:28 -05:00
1d84a44b08
Fix some small annoyances with the UI.
2023-02-02 14:36:11 -05:00
e65a20e62a
Add a button to queue prompts to the front of the queue.
2023-02-01 22:34:59 -05:00
4b08314257
Add more features to the backend queue code.
...
The queue can now be queried, entries can be deleted and prompts easily
queued to the front of the queue.
Just need to expose it in the UI next.
2023-02-01 22:33:10 -05:00
9d611a90e8
Small web interface fixes.
2023-01-31 03:37:34 -05:00
fef41d0a72
Add LatentComposite node.
...
This can be used to "paste" one latent image on top of the other.
2023-01-31 03:35:03 -05:00
3fa009f4cc
Add a LatentFlip node.
2023-01-31 03:28:38 -05:00
69df7eba94
Add KSamplerAdvanced node.
...
This node exposes more sampling options and makes it possible, for example,
to sample the first few steps on the latent image, do some operations on it,
and then do the remaining sampling steps. This can be achieved using the
start_at_step and end_at_step options.
2023-01-31 03:09:38 -05:00
f8f165e2c3
Add a LatentRotate node.
2023-01-31 02:28:07 -05:00
1daccf3678
Run softmax in place if it OOMs.
2023-01-30 19:55:01 -05:00
0d8ad93852
Add link to examples github page.
2023-01-30 01:09:35 -05:00
f73e57d881
Add support for textual inversion embedding for SD1.x CLIP.
2023-01-29 18:46:44 -05:00
702ac43d0c
Readme formatting.
2023-01-29 13:23:57 -05:00
da6f56235b
Add section to readme explaining how to get better speeds.
2023-01-29 13:15:03 -05:00
3661e10648
Add a command line option to disable upcasting in some cross attention ops.
2023-01-29 13:12:22 -05:00
50db297cf6
Try to fix OOM issues with cards that have less VRAM than mine.
2023-01-29 00:50:46 -05:00
36ec5690a6
Add some more model configs including some to use SD1 models in fp16.
2023-01-28 23:23:49 -05:00
484b957c7a
Quick fix for chrome issue.
2023-01-28 12:43:43 -05:00
2706c0b7a5
Some VAEs come in .pt files.
2023-01-28 12:28:29 -05:00
d133cf4f06
Added some AMD stuff to readme.
2023-01-28 04:06:25 -05:00
73f60740c8
Slightly cleaner code.
2023-01-28 02:14:22 -05:00
0108616b77
Fix issue with some models.
2023-01-28 01:38:42 -05:00
2973ff24c5
Round CLIP position ids to fix float issues in some checkpoints.
2023-01-28 00:19:33 -05:00
e615d40ca1
Fix UI annoyance with multiline textboxes sometimes getting stuck.
2023-01-27 23:33:27 -05:00
e6aa9f0c0a
Modal now actually shows up on colab.
2023-01-27 22:51:08 -05:00
88d0f4b397
Use a modal instead of an alert so the errors show up even on colab.
2023-01-27 22:24:37 -05:00
d8af790fa6
Small colab notebook fix.
2023-01-27 21:54:46 -05:00
266db2066d
Update colab notebook to use an iframe for GUI.
2023-01-27 21:48:48 -05:00
9fa1827906
Added better instructions for nvidia.
2023-01-27 17:23:47 -05:00
21eb33f233
Right category for latent upscale node.
2023-01-27 14:11:57 -05:00
e4528ef949
Add a support/dev matrix chat room link to the readme.
2023-01-27 13:08:15 -05:00
bc475f86c4
Slightly better errors.
2023-01-26 23:30:29 -05:00