
Does it support Stable Diffusion? #9

Open
libai-lab opened this issue Oct 10, 2024 · 3 comments

Comments

@libai-lab

No description provided.

@blepping

@libai-lab

If you use (or can use) ComfyUI, see #11 - I made an extension to use SageAttention there, and it can be used with Stable Diffusion. It doesn't currently work with SD15 models; only some of the SDXL attention layers work (but there is a performance improvement). I didn't test with other models like Flux.

Of course, this could probably also be ported to other frontends like A1111, but someone else would have to do it.

Note: I'm just a random person, with no affiliation with the SageAttention project.
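
For anyone who wants to try the same idea outside ComfyUI, a minimal sketch is below: wrap torch's scaled_dot_product_attention so supported shapes go through SageAttention and everything else falls back to the stock kernel. This is not the extension itself; the `sageattn` import path, its (batch, heads, seq_len, head_dim) layout, and the listed head dims (64/96/128) are assumptions on my part - check the SageAttention README before relying on them.

```python
# Minimal sketch (not the ComfyUI extension): route attention through
# SageAttention when the shape is supported, otherwise fall back to torch SDPA.
# Assumed: `sageattn` is importable from the `sageattention` package and takes
# q/k/v shaped (batch, heads, seq_len, head_dim) like F.scaled_dot_product_attention.
import torch
import torch.nn.functional as F
from sageattention import sageattn

_original_sdpa = F.scaled_dot_product_attention
_SAGE_HEAD_DIMS = {64, 96, 128}  # assumed supported dims; illustrative only

def sdpa_maybe_sage(q, k, v, attn_mask=None, dropout_p=0.0, is_causal=False, **kwargs):
    # Only hand simple, mask-free calls with a supported head dim to SageAttention.
    if attn_mask is None and dropout_p == 0.0 and q.shape[-1] in _SAGE_HEAD_DIMS:
        return sageattn(q, k, v, is_causal=is_causal)
    return _original_sdpa(q, k, v, attn_mask=attn_mask, dropout_p=dropout_p,
                          is_causal=is_causal, **kwargs)

# Patch before the UNet is built so the pipeline's attention code picks it up.
F.scaled_dot_product_attention = sdpa_maybe_sage
```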

@KartavyaBagga

KartavyaBagga commented Oct 15, 2024

Hi @jt-zhang,
Will there be support for SD15, i.e. head_dim 40 for hidden_states?

@jason-huang03
Member

Supporting head_dim 40 is hard because the int8 tensor cores require an MMA inner dimension of 32. For head_dim 40 we recommend using the fp16 baseline attention kernel.
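
To make that constraint concrete, here is a small back-of-the-envelope sketch (illustrative arithmetic only, not code from the SageAttention kernels): with an int8 MMA inner dimension of 32, head_dim 40 would have to be padded to 64, so the QK^T and PV matmuls would do 60% more inner-dimension work than the model needs, which is why the fp16 baseline is the more sensible path for SD15.

```python
# Illustrative arithmetic only (not from the SageAttention codebase): int8 MMA
# tiles need the inner (head) dimension in multiples of 32, so head_dim 40 has
# to be padded up, and the padded portion is pure wasted work.
import math

def int8_mma_padding(head_dim: int, mma_k: int = 32):
    padded = math.ceil(head_dim / mma_k) * mma_k
    extra_work = padded / head_dim - 1.0
    return padded, extra_work

padded, extra = int8_mma_padding(40)   # SD15 hidden_states head_dim
print(padded, f"{extra:.0%}")          # -> 64, 60% extra inner-dimension work
# With that much padding overhead, the fp16 baseline attention kernel is the
# better choice for head_dim 40.
```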
