Is causal attention actually used in CausalDiT? #16

@Dingrui-Wang

Description

Thank you for your amazing work! @seonghyeonye @zchuning

In your code for CausalWanSelfAttention, the relevant implementation is here.

I am confused about the _process_clean_image_only function, which is only called when kv_cache is None. During inference, however, kv_cache is no longer None after the initial timestep, so in that case the normal self.attn is used instead of self.causal_attn. A sketch of the branching I mean is below.
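To make the question concrete, here is the control flow as I read it. This is a paraphrased sketch, not the actual implementation; the forward signature and arguments are simplified placeholders, and only _process_clean_image_only, kv_cache, self.attn, and self.causal_attn come from the real code:

```python
import torch.nn as nn

class CausalWanSelfAttentionSketch(nn.Module):
    # Simplified illustration of the branch I am asking about.
    def forward(self, x, kv_cache=None, **kwargs):
        if kv_cache is None:
            # _process_clean_image_only is only reachable here,
            # and it is the path that uses self.causal_attn.
            return self._process_clean_image_only(x, **kwargs)
        # During inference, kv_cache is populated after the initial
        # timestep, so this ordinary (non-causal) self.attn path runs.
        return self.attn(x, **kwargs)
```

If this reading is correct, self.causal_attn is effectively never exercised once the cache exists.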

Is this intentional? If there is no issue here, then the name CausalWanSelfAttention can be a bit confusing, since no causal attention is actually used during inference.

Thank you very much in advance!
