
Magic Prompt adding unknown prompts and symbols when using Loras #707

Closed
wiseoldowl-66 opened this issue Jan 15, 2024 · 9 comments

Comments

@wiseoldowl-66
Hello,

Magic Prompt appears to be having some issues when enabled if Loras are present in the main prompt.

With some images, though not all, it adds stray characters (e.g. '<' and '[ ]') to the prompt. Additionally, the word 'unknown' frequently appears in the final prompt; examples below. The problem disappears when the Loras are removed from the main prompt.

I'm using SDXL and have tried two different base models and a number of different Loras.

The problem seems to be exacerbated by very short prompts. Provided a longer original prompt was used, increasing the 'Max magic prompt length' did not seem to worsen it. Using the same short prompts without any Loras also did not reproduce the issue.

I'm running the latest versions of the web UI and the dynamic-prompts extension, with the Gustavosta/MagicPrompt-Stable-Diffusion prompt model.

Examples (main prompt is up to and including the lora):

(screenshots omitted)


akx commented Jan 16, 2024

Thanks for the report!

I suppose the Magic Prompt models (which aren't part of the extension, to be clear) haven't been trained on prompts that contain LoRAs, so they get confused and output nonsense.

We could strip out LoRA syntax before feeding the prompt to the Magic Prompt and then put it back in afterwards, but at present, that isn't happening.
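The strip-and-restore approach described above could be sketched roughly like this (a hypothetical illustration, not the extension's actual implementation; the helper names and the assumption that LoRA tags look like `<lora:name:weight>` are mine):

```python
import re

# Matches LoRA tags of the form "<lora:myLora:0.8>".
LORA_PATTERN = re.compile(r"<lora:[^>]+>")

def split_lora_tags(prompt: str) -> tuple[str, list[str]]:
    """Remove LoRA tags from the prompt, returning the cleaned prompt and the tags."""
    tags = LORA_PATTERN.findall(prompt)
    cleaned = LORA_PATTERN.sub("", prompt).strip()
    return cleaned, tags

def restore_lora_tags(prompt: str, tags: list[str]) -> str:
    """Append the previously removed LoRA tags back onto the generated prompt."""
    return " ".join([prompt, *tags]).strip()
```

The cleaned prompt would be fed to the Magic Prompt model, and the tags re-appended to whatever it generates, so the model never sees syntax it wasn't trained on.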


wiseoldowl-66 commented Jan 16, 2024

Thank you for the reply. It hadn't occurred to me that the prompt model relies on the existing prompt but of course this makes sense.

Your proposed solution sounds like a good idea. I would offer to help but wouldn't know how!

akx added three commits to akx/sd-dynamic-prompts that referenced this issue Jan 16, 2024
@wiseoldowl-66 (Author)

Thank you for the quick solution!

akx added a commit to akx/sd-dynamic-prompts that referenced this issue Jan 16, 2024
akx added a commit that referenced this issue Jan 16, 2024

akx commented Jan 16, 2024

@perspeculum No problem. I merged #708 now, so if you update the extension, this should work better :)

@wiseoldowl-66 (Author)

I'm getting Python errors on generation with the new merge, and no magic prompts being added. This is the traceback:

Traceback (most recent call last):
      File "E:\SD\webui\webui\modules\scripts.py", line 718, in process
        script.process(p, *script_args)
      File "E:\SD\webui\webui\extensions\sd-dynamic-prompts\sd_dynamic_prompts\dynamic_prompting.py", line 481, in process
        all_prompts, all_negative_prompts = generate_prompts(
      File "E:\SD\webui\webui\extensions\sd-dynamic-prompts\sd_dynamic_prompts\helpers.py", line 93, in generate_prompts
        all_prompts = prompt_generator.generate(prompt, num_prompts, seeds=seeds) or [""]
      File "E:\SD\webui\system\python\lib\site-packages\dynamicprompts\generators\magicprompt.py", line 164, in generate
        magic_prompts = self._generate_magic_prompts(prompts)
      File "E:\SD\webui\webui\extensions\sd-dynamic-prompts\sd_dynamic_prompts\magic_prompt.py", line 20, in _generate_magic_prompts
        magic_prompts = super()._generate_magic_prompts(orig_prompts)
      File "E:\SD\webui\system\python\lib\site-packages\dynamicprompts\generators\magicprompt.py", line 210, in _generate_magic_prompts
        prompts = self._generator(
      File "E:\SD\webui\system\python\lib\site-packages\transformers\pipelines\text_generation.py", line 201, in __call__
        return super().__call__(text_inputs, **kwargs)
      File "E:\SD\webui\system\python\lib\site-packages\transformers\pipelines\base.py", line 1120, in __call__
        return self.run_single(inputs, preprocess_params, forward_params, postprocess_params)
      File "E:\SD\webui\system\python\lib\site-packages\transformers\pipelines\base.py", line 1126, in run_single
        model_inputs = self.preprocess(inputs, **preprocess_params)
      File "E:\SD\webui\system\python\lib\site-packages\transformers\pipelines\text_generation.py", line 205, in preprocess
        prefix + prompt_text, padding=False, add_special_tokens=False, return_tensors=self.framework
    TypeError: can only concatenate str (not "tuple") to str
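The final frame of the traceback can be reproduced in isolation: Python raises this exact TypeError when a tuple is used where the transformers text-generation pipeline expects a plain string (the specific tuple value below is a made-up illustration, not what the extension actually passed):

```python
# The pipeline's preprocess step does `prefix + prompt_text`, so if
# prompt_text arrives as a tuple instead of a str, concatenation fails.
prefix = ""
prompt_text = ("a cat", ["<lora:fluffy:0.8>"])  # tuple instead of str

try:
    _ = prefix + prompt_text
except TypeError as exc:
    print(exc)  # the same message seen in the traceback above
```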


akx commented Jan 16, 2024

Oops... I'll check that out.

@akx akx mentioned this issue Jan 16, 2024

akx commented Jan 16, 2024

@perspeculum Okay, fixed. Silly transformers... Can you try again?

@wiseoldowl-66 (Author)

@akx That appears to be working well now, thank you!

There's a minor formatting issue: a double comma ',,' at the beginning and a missing ',' before the tag, but I don't suppose this affects the functionality.

(screenshot omitted)


akx commented Jan 16, 2024

The double commas and lack of spaces shouldn't matter :) (We actually already do try to clean up various cruft from the machine-generated prompts, but evidently not double commas!)
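A cleanup pass along those lines could look like the following (a hypothetical sketch of the idea, not the extension's actual cleanup code):

```python
import re

def tidy_commas(prompt: str) -> str:
    """Collapse repeated commas and normalise spacing around them."""
    prompt = re.sub(r"\s*,\s*(?:,\s*)+", ", ", prompt)  # ",," or ", ," -> ", "
    prompt = re.sub(r"\s*,\s*", ", ", prompt)           # uniform ", " spacing
    return prompt.strip(" ,")                           # drop leading/trailing commas
```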

I'll go ahead and close this as fixed.

@akx akx closed this as completed Jan 16, 2024