The UNetTemporalAttentionMultiply node applies multiplication factors to different types of attention mechanisms in a temporal UNet model. It modifies the model by adjusting the weights of self-attention and cross-attention layers, distinguishing between structural and temporal components. This allows fine-tuning of how much influence each attention type has on the model’s output.

## Inputs

| Parameter | Data Type | Required | Range | Description |
|---|---|---|---|---|
| model | MODEL | Yes | - | The input model to modify with attention multipliers |
| self_structural | FLOAT | No | 0.0 - 10.0 | Multiplier for self-attention structural components (default: 1.0) |
| self_temporal | FLOAT | No | 0.0 - 10.0 | Multiplier for self-attention temporal components (default: 1.0) |
| cross_structural | FLOAT | No | 0.0 - 10.0 | Multiplier for cross-attention structural components (default: 1.0) |
| cross_temporal | FLOAT | No | 0.0 - 10.0 | Multiplier for cross-attention temporal components (default: 1.0) |

## Outputs

| Output Name | Data Type | Description |
|---|---|---|
| model | MODEL | The modified model with adjusted attention weights |
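The scaling described above can be sketched in plain Python. This is a minimal illustration, not ComfyUI's actual implementation: the function name `multiply_attention_weights`, the use of `attn1`/`attn2` to mark self- vs. cross-attention, and the presence of `time` in temporal layer names are assumed conventions for the sake of the example.

```python
def multiply_attention_weights(state_dict, self_structural=1.0, self_temporal=1.0,
                               cross_structural=1.0, cross_temporal=1.0):
    """Return a copy of state_dict with attention weights scaled.

    Assumed naming convention: 'attn1' marks self-attention, 'attn2'
    cross-attention, and keys containing 'time' are temporal layers.
    """
    patched = {}
    for key, weight in state_dict.items():
        factor = 1.0
        if "attn1" in key:    # self-attention layer
            factor = self_temporal if "time" in key else self_structural
        elif "attn2" in key:  # cross-attention layer
            factor = cross_temporal if "time" in key else cross_structural
        # Scale each weight value by the chosen multiplier (a factor of
        # 1.0 leaves non-attention layers untouched).
        patched[key] = [w * factor for w in weight]
    return patched
```

With all four multipliers at their default of 1.0 the model passes through unchanged; lowering a multiplier toward 0.0 suppresses that attention type's contribution, while values above 1.0 amplify it.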