Merge pull request #1 from unwdef/dev
merge dev
unwdef authored Apr 15, 2024
2 parents 8e1cd5f + 063ed43 commit 11b796c
Showing 9 changed files with 325 additions and 92 deletions.
12 changes: 12 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,12 @@
## 2.0.0

* Added Stack version of the Randomize LoRAs node;
* Added Trigger words fields for the Randomize LoRAs nodes;
* Fixed Randomize LoRAs outputting duplicate LoRAs when the user selected the same LoRA multiple times;
* Added Random Text from Multiline node;
* Added Text Multiline With Variables node;

## 1.0.0

* Initial launch
* Added Randomize LoRAs node
27 changes: 23 additions & 4 deletions README.md
@@ -1,24 +1,43 @@
# unwdef Custom Nodes for ComfyUI

This is a work-in-progress repository.

## Randomize LoRAs Node
The Randomize LoRAs node randomly loads LoRAs from a predefined selection, with randomized weights as well. This lets you experiment with different artistic effects on your generated images.

![preview](https://github.com/unwdef/unwdef-nodes-comfyui/assets/166751903/686f12e1-ed35-4165-94f7-048c0550c2fc)
![nodes_lora](https://github.com/unwdef/unwdef-nodes-comfyui/assets/166751903/e3ae5179-06ac-4154-94a9-1fb31a47fe35)
Note: The "Show Text" node is part of [pythongosssss/ComfyUI-Custom-Scripts](https://github.com/pythongosssss/ComfyUI-Custom-Scripts)

There is also a "stack" version for use with other LoRA nodes that support stacking, such as [Efficiency Nodes](https://github.com/jags111/efficiency-nodes-comfyui).

### How It Works
Connect the **model** and **clip** outputs from this node to your KSampler or other processing nodes. The output, **chosen loras**, provides a textual representation detailing which LoRAs and corresponding weights were applied during the generation.

You can also provide the **trigger words** for each LoRA. They are output as comma-separated text, which you can concatenate into your prompts.

### Configuration Fields
- **seed**: Ensures reproducibility. Keep the same seed for consistent results across generations. _Note: The selected LoRAs must also stay the same for this to work._
- **max_random**: Limits the maximum number of LoRAs to apply. Even if you select up to 10, fewer can be applied.
- **lora_x**: Specifies the LoRA file to use.
- **min_str_x** and **max_str_x**: Define the minimum and maximum strength for each LoRA, allowing a range of intensities.
- **trigger_words_x**: The trigger words for the selected LoRA.
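The selection logic described above can be sketched roughly as follows. This is a simplified illustration, not the node's actual code; the `pick_loras` function name and the plain-dict config shape are invented for this example:

```python
import random

def pick_loras(configs, seed, max_random):
    """Pick a random subset of LoRA configs with random strengths.

    configs: list of dicts with 'name', 'min_str', 'max_str' keys.
    Seeding makes the choice reproducible for a fixed config list.
    """
    random.seed(seed)
    # max_random cannot exceed the number of selected LoRAs
    max_random = min(max_random, len(configs))
    # Choose between 1 and max_random LoRAs, no duplicates
    chosen = random.sample(configs, random.randint(1, max_random))
    # Each chosen LoRA gets a strength drawn from its own range
    return [(c["name"], round(random.uniform(c["min_str"], c["max_str"]), 2))
            for c in chosen]

picks = pick_loras(
    [{"name": "style_a", "min_str": 0.5, "max_str": 1.0},
     {"name": "style_b", "min_str": 0.3, "max_str": 0.8}],
    seed=42, max_random=2)
```

Because the generator is re-seeded on every call, the same seed and the same selection always produce the same picks, which is why changing the selected LoRAs breaks reproducibility.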

## Random Text from Multiline Node
Outputs one or more lines from a multiline text input.

![node_random_text_from_multiline](https://github.com/unwdef/unwdef-nodes-comfyui/assets/166751903/432196cc-067f-4f84-9ca4-769d3a3c46d7)
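In essence, the node does something like the following. This is a minimal sketch of the idea, assuming seeded selection over non-empty lines; the `random_lines` helper is invented for illustration:

```python
import random

def random_lines(text, count=1, seed=0):
    """Return `count` randomly chosen non-empty lines from multiline text."""
    lines = [line for line in text.splitlines() if line.strip()]
    random.seed(seed)  # same seed -> same picks
    return random.sample(lines, min(count, len(lines)))

picked = random_lines("a cat\n\na dog\na bird\n", count=2, seed=1)
```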

## Text Multiline with Variables

Replaces instances of `!var_x` in your text with the contents of the corresponding `var_x` inputs.

![nodes_text](https://github.com/unwdef/unwdef-nodes-comfyui/assets/166751903/cd9c0724-1dcc-426b-b66e-6e733b3be264)
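The substitution amounts to a straightforward string replacement, sketched below. This is an illustration of the behavior, not the node's actual code; `fill_variables` is a made-up name:

```python
def fill_variables(text, **variables):
    """Replace each !var_x placeholder with the matching input value.

    Note: naive replacement; with more than 9 variables, !var_1 would
    also match the prefix of !var_10, so order would matter.
    """
    for name, value in variables.items():
        text = text.replace(f"!{name}", value)
    return text

result = fill_variables("a photo of !var_1 in !var_2",
                        var_1="a cat", var_2="the park")
# result == "a photo of a cat in the park"
```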


## Installation
You can use the [ComfyUI-Manager](https://github.com/ltdrdata/ComfyUI-Manager). Search for "unwdef" or "unwdef-nodes".

Or you can install it manually:

1. Open your terminal and navigate to your `ComfyUI/custom_nodes` directory.
2. Clone the repository using:
17 changes: 16 additions & 1 deletion __init__.py
@@ -1,3 +1,18 @@
from .unwdef_nodes.nodes_lora import *
from .unwdef_nodes.nodes_text import *

NODE_CLASS_MAPPINGS = {
    "RandomizeLoras": RandomizeLoras,
    "RandomizeLorasStack": RandomizeLorasStack,
    "RandomTextFromMultiline": RandomTextFromMultiline,
    "TextMultilineWithVariables": TextMultilineWithVariables,
}

NODE_DISPLAY_NAME_MAPPINGS = {
    "RandomizeLoras": "Randomize LoRAs",
    "RandomizeLorasStack": "Randomize LoRAs (Stack)",
    "RandomTextFromMultiline": "Random Text From Multiline",
    "TextMultilineWithVariables": "Text Multiline with Variables",
}

__all__ = ["NODE_CLASS_MAPPINGS", "NODE_DISPLAY_NAME_MAPPINGS"]
Binary file removed preview.png
Binary file added previews/nodes_lora.png
Binary file added previews/nodes_text.png
87 changes: 0 additions & 87 deletions unwdef_nodes.py

This file was deleted.

186 changes: 186 additions & 0 deletions unwdef_nodes/nodes_lora.py
@@ -0,0 +1,186 @@
import random
from nodes import LoraLoader
import folder_paths

class RandomizeLoras:
    def __init__(self):
        pass

    @classmethod
    def INPUT_TYPES(cls):
        loras = ["None"] + folder_paths.get_filename_list("loras")
        inputs = {
            "required": {
                "model": ("MODEL",),
                "clip": ("CLIP",),
                "seed": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}),
                "max_random": ("INT", {"default": 10, "min": 1, "max": 10}),
            }
        }
        for i in range(1, 11):
            inputs["required"][f"lora_{i}"] = (loras,)
            inputs["required"][f"min_str_{i}"] = ("FLOAT", {"default": 0.5, "min": -10.0, "max": 10.0, "step": 0.01})
            inputs["required"][f"max_str_{i}"] = ("FLOAT", {"default": 1.0, "min": -10.0, "max": 10.0, "step": 0.01})
            inputs["required"][f"trigger_words_{i}"] = ("STRING", {"multiline": False, "default": ""})

        return inputs

    RETURN_TYPES = ("MODEL", "CLIP", "STRING", "STRING")
    RETURN_NAMES = ("model", "clip", "trigger_words", "chosen_loras")
    FUNCTION = "load_lora"
    CATEGORY = "unwdef/lora"

    def load_lora(self, model, clip, seed, max_random, **kwargs):
        if seed is not None:
            random.seed(seed)  # For reproducibility

        # Dynamically extract lora configurations from kwargs,
        # skipping "None" entries and duplicate selections
        lora_configs = []
        for i in range(1, 11):
            lora_name = kwargs.get(f"lora_{i}")
            min_str = kwargs.get(f"min_str_{i}")
            max_str = kwargs.get(f"max_str_{i}")
            trigger_words = kwargs.get(f"trigger_words_{i}")

            if lora_name != "None" and not any(config['name'] == lora_name for config in lora_configs):
                lora_configs.append({
                    "name": lora_name, "min_str": min_str, "max_str": max_str,
                    "trigger_words": ', '.join([s.strip() for s in trigger_words.strip().split(',') if s.strip()])
                })

        # Strings that accumulate the chosen loras and their trigger words
        chosen_str = ""
        chosen_trigger_words = ""

        # Nothing selected: pass the model and clip through unchanged
        if len(lora_configs) == 0:
            return (model, clip, chosen_trigger_words, chosen_str)

        # max_random cannot exceed the number of selected loras
        max_random = min(max_random, len(lora_configs))

        # Randomly choose some of these loras
        chosen_loras = random.sample(lora_configs, random.randint(1, max_random))

        for lora in chosen_loras:
            # Randomly determine a strength between min_str and max_str
            strength = random.uniform(lora['min_str'], lora['max_str'])

            # Apply changes to model and clip
            model, clip = LoraLoader().load_lora(model, clip, lora['name'], strength, strength)

            # Append the current lora and its strength to the string
            chosen_str += f"<lora:{lora['name'].split('.')[0]}:{strength:.2f}>, "

            # Merge this lora's trigger words, dropping duplicates and empties
            existing_words = set(filter(None, chosen_trigger_words.split(', ')))
            new_words = set(filter(None, lora['trigger_words'].split(', ')))
            chosen_trigger_words = ', '.join(sorted(existing_words | new_words))

        # Remove the trailing ", " from the chosen loras string
        chosen_str = chosen_str.rstrip(', ')

        return (model, clip, chosen_trigger_words, chosen_str)

class RandomizeLorasStack:
    def __init__(self):
        pass

    @classmethod
    def INPUT_TYPES(cls):
        loras = ["None"] + folder_paths.get_filename_list("loras")
        inputs = {
            "required": {
                "seed": ("INT", {"default": 0, "min": 0, "max": 0xffffffffffffffff}),
                "max_random": ("INT", {"default": 10, "min": 1, "max": 10}),
            }
        }
        for i in range(1, 11):
            inputs["required"][f"lora_{i}"] = (loras,)
            inputs["required"][f"min_str_{i}"] = ("FLOAT", {"default": 0.5, "min": -10.0, "max": 10.0, "step": 0.01})
            inputs["required"][f"max_str_{i}"] = ("FLOAT", {"default": 1.0, "min": -10.0, "max": 10.0, "step": 0.01})
            inputs["required"][f"trigger_words_{i}"] = ("STRING", {"multiline": False, "default": ""})

        inputs["optional"] = {
            "lora_stack": ("LORA_STACK",)
        }

        return inputs

    RETURN_TYPES = ("LORA_STACK", "STRING", "STRING")
    RETURN_NAMES = ("LORA_STACK", "trigger_words", "chosen_loras")
    FUNCTION = "load_lora_stack"
    CATEGORY = "unwdef/lora"

    def load_lora_stack(self, seed, max_random, lora_stack=None, **kwargs):
        if seed is not None:
            random.seed(seed)  # For reproducibility

        # Start from the incoming stack, if any, skipping "None" entries
        lora_list = []
        if lora_stack is not None:
            lora_list.extend([l for l in lora_stack if l[0] != "None"])

        # Dynamically extract lora configurations from kwargs,
        # skipping "None" entries and duplicate selections
        lora_configs = []
        for i in range(1, 11):
            lora_name = kwargs.get(f"lora_{i}")
            min_str = kwargs.get(f"min_str_{i}")
            max_str = kwargs.get(f"max_str_{i}")
            trigger_words = kwargs.get(f"trigger_words_{i}")

            if lora_name != "None" and not any(config['name'] == lora_name for config in lora_configs):
                lora_configs.append({
                    "name": lora_name, "min_str": min_str, "max_str": max_str,
                    "trigger_words": ', '.join([s.strip() for s in trigger_words.strip().split(',') if s.strip()])
                })

        # Strings that accumulate the chosen loras and their trigger words
        chosen_str = ""
        chosen_trigger_words = ""

        # Nothing selected: return the incoming stack unchanged
        if len(lora_configs) == 0:
            return (lora_list, chosen_trigger_words, chosen_str)

        # max_random cannot exceed the number of selected loras
        max_random = min(max_random, len(lora_configs))

        # Randomly choose some of these loras
        chosen_loras = random.sample(lora_configs, random.randint(1, max_random))

        for lora in chosen_loras:
            # Randomly determine a strength between min_str and max_str
            strength = random.uniform(lora['min_str'], lora['max_str'])

            # Add to the stack
            lora_list.append((lora['name'], strength, strength))

            # Append the current lora and its strength to the string
            chosen_str += f"<lora:{lora['name'].split('.')[0]}:{strength:.2f}>, "

            # Merge this lora's trigger words, dropping duplicates and empties
            existing_words = set(filter(None, chosen_trigger_words.split(', ')))
            new_words = set(filter(None, lora['trigger_words'].split(', ')))
            chosen_trigger_words = ', '.join(sorted(existing_words | new_words))

        # Remove the trailing ", " from the chosen loras string
        chosen_str = chosen_str.rstrip(', ')

        return (lora_list, chosen_trigger_words, chosen_str)

