【AIGC】 A Deep Dive into the LORA Model

The LORA model is a neural-network model that automatically adjusts the weights between layers of the network through learning in order to improve model performance. This article takes an in-depth look at the LORA model's principles, application scenarios, and strengths and weaknesses.

1. Principle of the LORA Model

The full name of the LORA model is Learnable Re-Weighting, that is, a re-weighting model whose weights can be learned. It improves model performance mainly by learning the weights between the layers of a neural network. Specifically, the LORA model automatically adjusts the weight of the current layer by learning the correlation between the previous layer and the next layer, which in turn improves the model's performance.

The basic idea behind the LORA model is to treat each layer of the neural network as a feature extractor whose contribution can be weighted: the weight of each layer determines its influence on the model's output. By adjusting these weights, the LORA model lets different layers play a larger or smaller role in different tasks.

In the LORA model, the weight of each layer is composed of two parts: the weight of the previous layer and the weight of the next layer. Specifically, suppose the current layer is layer $i$, the previous layer is layer $i-1$, and the next layer is layer $i+1$. The weight of the current layer can then be expressed as:

$$w_i = \alpha_i \cdot W_{i-1} \cdot W_{i+1}$$

where $\alpha_i$ is a learnable parameter, and $W_{i-1}$ and $W_{i+1}$ are the weights of the previous and next layers, respectively. By learning $\alpha_i$, the LORA model can automatically adjust the weight of the current layer and thus improve model performance.

2. Application Scenarios of the LORA Model

The LORA model has a wide range of application scenarios; it is used mainly where complex data must be processed, such as in natural language processing and computer vision. In natural language processing, the LORA model can improve performance on tasks such as text classification and sentiment analysis by learning contextual information. In computer vision, it can improve performance on tasks such as image classification and object detection by learning the correlations between different layers.

3. Strengths and Weaknesses of the LORA Model

The main strength of the LORA model is that it learns the correlations between layers automatically, which improves model performance. Unlike traditional manual weight tuning, the LORA model adjusts the weights automatically by learning from data, avoiding the limitations of hand-tuned weights.

The LORA model also has shortcomings, however. It assumes that each layer is affected only by the layers immediately before and after it. This assumption can cause problems in some cases, although in other applications it usefully simplifies the model's design and implementation.

In the LORA model, each layer has a corresponding upper-layer weight and lower-layer weight, and both can be obtained through learning. During training, the LORA model adjusts these weights automatically, allowing it to learn the features of the data more accurately.

The LORA model is relatively simple to implement: each layer of the model only needs a re-weighting operation, that is, multiplying the upper-layer weight by the lower-layer weight (scaled by the learnable coefficient) to obtain new weights, which are then used to update the model. In the PyTorch framework, this re-weighting operation can be implemented with the torch.mm() function.
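
As a concrete illustration, here is a minimal sketch of this re-weighting step in PyTorch, assuming square weight matrices of matching shape; the names reweight, w_prev, and w_next are illustrative, not part of any library API:

```python
import torch

def reweight(w_prev: torch.Tensor, w_next: torch.Tensor, alpha: torch.Tensor) -> torch.Tensor:
    # w_i = alpha_i * W_{i-1} * W_{i+1}, using torch.mm for the matrix product
    return alpha * torch.mm(w_prev, w_next)

w_prev = torch.randn(4, 4)   # weight of the previous layer (illustrative shape)
w_next = torch.randn(4, 4)   # weight of the next layer
alpha = torch.tensor(0.5)    # learnable scaling parameter
w_i = reweight(w_prev, w_next, alpha)
print(w_i.shape)  # torch.Size([4, 4])
```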

Overall, the LORA model is an easy-to-train, simple, and effective re-weighting model that can significantly improve model performance in certain applications. However, because of the limitations of its assumptions, the LORA model may not be suitable for every dataset or application scenario.

4. Composition of the LORA Model

A LORA model consists of three parts: lora_down.weight, lora_up.weight, and alpha.
Here lora_down.weight and lora_up.weight are the lower- and upper-layer weights in the LORA model, and alpha is the scaling coefficient applied when the weights are updated.
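
To make the roles of the three parts concrete, here is a minimal sketch of how they are commonly combined when the update is applied. The alpha/rank scaling shown follows the convention used by popular Stable Diffusion LoRA tooling, and the tensor shapes and the base_w name are illustrative assumptions:

```python
import torch

rank = 4
lora_down = torch.randn(rank, 768)   # lora_down.weight: (rank, in_features)
lora_up = torch.randn(768, rank)     # lora_up.weight:   (out_features, rank)
alpha = torch.tensor(4.0)            # alpha: scaling coefficient

scale = alpha / rank                      # common alpha/rank convention (assumption)
delta_w = scale * (lora_up @ lora_down)   # low-rank weight update, shape (768, 768)

base_w = torch.randn(768, 768)            # hypothetical base layer weight
merged_w = base_w + delta_w               # weight actually used at inference
```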

5. Naming Convention

These are the names of the weights and biases of the various layers of a PyTorch model. In PyTorch, the weights and biases of each layer are stored in a dictionary called state_dict. The naming rules are usually determined by the type of layer and its position in the model. For example, lora_te_text_model_encoder_layers_9_self_attn_k_proj.lora_up.weight denotes the up-weight of the K projection layer in the 9th self-attention layer of the LORA model.
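
As a rough sketch of how such a key can be pulled apart, using the example above (the parsing logic is illustrative only):

```python
key = "lora_te_text_model_encoder_layers_9_self_attn_k_proj.lora_up.weight"

# The module name and the parameter name are separated by the first dot
module_name, param_name = key.split(".", 1)
print(module_name)  # lora_te_text_model_encoder_layers_9_self_attn_k_proj
print(param_name)   # lora_up.weight

parts = module_name.split("_")
# 'te' -> text encoder, '9' -> layer index, 'k'/'proj' -> K projection
print("network:", parts[1])      # te
print("layer index:", parts[6])  # 9
```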

6. Example Keys of the LORA Model

These keys are used to load and save the LORA model's weight parameters in PyTorch. Each key is associated with a weight tensor in the LORA model.
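
For instance, a minimal sketch of listing these keys and their tensor shapes, assuming the LORA was saved as an ordinary PyTorch state_dict; the path my_lora.pt is hypothetical, and many LORA files are distributed as .safetensors instead, which require the safetensors library:

```python
import torch

# Load the LORA state_dict from a hypothetical file path
state_dict = torch.load("my_lora.pt", map_location="cpu")

# Print each key alongside the shape of its weight tensor
for key, tensor in state_dict.items():
    print(key, tuple(tensor.shape))
```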

# Keys of the weight parameters in the LORA model

- lora_te_text_model_encoder_layers_0_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_0_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_0_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_0_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_0_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_0_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_0_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_0_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_0_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_0_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_0_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_0_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_0_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_0_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_0_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_0_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_0_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_0_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_10_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_10_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_10_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_10_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_10_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_10_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_10_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_10_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_10_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_10_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_10_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_10_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_10_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_10_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_10_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_10_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_10_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_10_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_11_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_11_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_11_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_11_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_11_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_11_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_11_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_11_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_11_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_11_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_11_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_11_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_11_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_11_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_11_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_11_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_11_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_11_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_1_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_1_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_1_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_1_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_1_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_1_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_1_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_1_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_1_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_1_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_1_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_1_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_1_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_1_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_1_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_1_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_1_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_1_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_2_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_2_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_2_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_2_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_2_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_2_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_2_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_2_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_2_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_2_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_2_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_2_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_2_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_2_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_2_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_2_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_2_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_2_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_3_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_3_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_3_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_3_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_3_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_3_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_3_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_3_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_3_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_3_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_3_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_3_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_3_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_3_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_3_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_3_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_3_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_3_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_4_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_4_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_4_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_4_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_4_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_4_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_4_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_4_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_4_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_4_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_4_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_4_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_4_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_4_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_4_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_4_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_4_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_4_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_5_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_5_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_5_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_5_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_5_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_5_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_5_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_5_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_5_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_5_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_5_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_5_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_5_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_5_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_5_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_5_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_5_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_5_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_6_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_6_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_6_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_6_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_6_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_6_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_6_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_6_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_6_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_6_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_6_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_6_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_6_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_6_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_6_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_6_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_6_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_6_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_7_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_7_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_7_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_7_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_7_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_7_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_7_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_7_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_7_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_7_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_7_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_7_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_7_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_7_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_7_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_7_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_7_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_7_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_8_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_8_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_8_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_8_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_8_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_8_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_8_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_8_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_8_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_8_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_8_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_8_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_8_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_8_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_8_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_8_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_8_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_8_self_attn_v_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_9_mlp_fc1.alpha.
- lora_te_text_model_encoder_layers_9_mlp_fc1.lora_down.weight.
- lora_te_text_model_encoder_layers_9_mlp_fc1.lora_up.weight.
- lora_te_text_model_encoder_layers_9_mlp_fc2.alpha.
- lora_te_text_model_encoder_layers_9_mlp_fc2.lora_down.weight.
- lora_te_text_model_encoder_layers_9_mlp_fc2.lora_up.weight.
- lora_te_text_model_encoder_layers_9_self_attn_k_proj.alpha.
- lora_te_text_model_encoder_layers_9_self_attn_k_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_9_self_attn_k_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_9_self_attn_out_proj.alpha.
- lora_te_text_model_encoder_layers_9_self_attn_out_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_9_self_attn_out_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_9_self_attn_q_proj.alpha.
- lora_te_text_model_encoder_layers_9_self_attn_q_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_9_self_attn_q_proj.lora_up.weight.
- lora_te_text_model_encoder_layers_9_self_attn_v_proj.alpha.
- lora_te_text_model_encoder_layers_9_self_attn_v_proj.lora_down.weight.
- lora_te_text_model_encoder_layers_9_self_attn_v_proj.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_proj_in.alpha.
- lora_unet_down_blocks_0_attentions_0_proj_in.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_proj_in.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_proj_out.alpha.
- lora_unet_down_blocks_0_attentions_0_proj_out.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_proj_out.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_down_blocks_0_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_proj_in.alpha.
- lora_unet_down_blocks_0_attentions_1_proj_in.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_proj_in.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_proj_out.alpha.
- lora_unet_down_blocks_0_attentions_1_proj_out.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_proj_out.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_down_blocks_0_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_proj_in.alpha.
- lora_unet_down_blocks_1_attentions_0_proj_in.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_proj_in.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_proj_out.alpha.
- lora_unet_down_blocks_1_attentions_0_proj_out.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_proj_out.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_down_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_proj_in.alpha.
- lora_unet_down_blocks_1_attentions_1_proj_in.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_proj_in.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_proj_out.alpha.
- lora_unet_down_blocks_1_attentions_1_proj_out.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_proj_out.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_down_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_proj_in.alpha.
- lora_unet_down_blocks_2_attentions_0_proj_in.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_proj_in.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_proj_out.alpha.
- lora_unet_down_blocks_2_attentions_0_proj_out.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_proj_out.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_down_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_proj_in.alpha.
- lora_unet_down_blocks_2_attentions_1_proj_in.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_proj_in.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_proj_out.alpha.
- lora_unet_down_blocks_2_attentions_1_proj_out.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_proj_out.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_down_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_mid_block_attentions_0_proj_in.alpha.
- lora_unet_mid_block_attentions_0_proj_in.lora_down.weight.
- lora_unet_mid_block_attentions_0_proj_in.lora_up.weight.
- lora_unet_mid_block_attentions_0_proj_out.alpha.
- lora_unet_mid_block_attentions_0_proj_out.lora_down.weight.
- lora_unet_mid_block_attentions_0_proj_out.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_mid_block_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_proj_in.alpha.
- lora_unet_up_blocks_1_attentions_0_proj_in.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_proj_in.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_proj_out.alpha.
- lora_unet_up_blocks_1_attentions_0_proj_out.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_proj_out.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_up_blocks_1_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_proj_in.alpha.
- lora_unet_up_blocks_1_attentions_1_proj_in.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_proj_in.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_proj_out.alpha.
- lora_unet_up_blocks_1_attentions_1_proj_out.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_proj_out.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_up_blocks_1_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_proj_in.alpha.
- lora_unet_up_blocks_1_attentions_2_proj_in.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_proj_in.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_proj_out.alpha.
- lora_unet_up_blocks_1_attentions_2_proj_out.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_proj_out.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_up_blocks_1_attentions_2_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_proj_in.alpha.
- lora_unet_up_blocks_2_attentions_0_proj_in.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_proj_in.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_proj_out.alpha.
- lora_unet_up_blocks_2_attentions_0_proj_out.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_proj_out.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.alpha.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight.
- lora_unet_up_blocks_2_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_proj_in.alpha.
- lora_unet_up_blocks_2_attentions_1_proj_in.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_proj_in.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_proj_out.alpha.
- lora_unet_up_blocks_2_attentions_1_proj_out.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_proj_out.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha.
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.alpha
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.alpha
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.alpha
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight
- lora_unet_up_blocks_2_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight
- lora_unet_up_blocks_2_attentions_2_proj_in.alpha
- lora_unet_up_blocks_2_attentions_2_proj_in.lora_down.weight
- lora_unet_up_blocks_2_attentions_2_proj_in.lora_up.weight
- lora_unet_up_blocks_2_attentions_2_proj_out.alpha
- lora_unet_up_blocks_2_attentions_2_proj_out.lora_down.weight
- lora_unet_up_blocks_2_attentions_2_proj_out.lora_up.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.alpha
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.lora_down.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_k.lora_up.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_out_0.alpha
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_down.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_up.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_q.alpha
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_q.lora_down.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_q.lora_up.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.alpha
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.lora_down.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn1_to_v.lora_up.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.alpha
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.lora_down.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_k.lora_up.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_out_0.alpha
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_down.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_up.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_q.alpha
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_q.lora_down.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_q.lora_up.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.alpha
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.lora_down.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_attn2_to_v.lora_up.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_0_proj.alpha
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_down.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_up.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.alpha
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.lora_down.weight
- lora_unet_up_blocks_2_attentions_2_transformer_blocks_0_ff_net_2.lora_up.weight
- lora_unet_up_blocks_3_attentions_0_proj_in.alpha
- lora_unet_up_blocks_3_attentions_0_proj_in.lora_down.weight
- lora_unet_up_blocks_3_attentions_0_proj_in.lora_up.weight
- lora_unet_up_blocks_3_attentions_0_proj_out.alpha
- lora_unet_up_blocks_3_attentions_0_proj_out.lora_down.weight
- lora_unet_up_blocks_3_attentions_0_proj_out.lora_up.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.alpha
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.lora_down.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_k.lora_up.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_out_0.alpha
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_down.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_out_0.lora_up.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_q.alpha
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_q.lora_down.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_q.lora_up.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.alpha
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.lora_down.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn1_to_v.lora_up.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.alpha
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.lora_down.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_k.lora_up.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_out_0.alpha
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_down.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_out_0.lora_up.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_q.alpha
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_q.lora_down.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_q.lora_up.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.alpha
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.lora_down.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_attn2_to_v.lora_up.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_0_proj.alpha
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_down.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_0_proj.lora_up.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.alpha
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.lora_down.weight
- lora_unet_up_blocks_3_attentions_0_transformer_blocks_0_ff_net_2.lora_up.weight
- lora_unet_up_blocks_3_attentions_1_proj_in.alpha
- lora_unet_up_blocks_3_attentions_1_proj_in.lora_down.weight
- lora_unet_up_blocks_3_attentions_1_proj_in.lora_up.weight
- lora_unet_up_blocks_3_attentions_1_proj_out.alpha
- lora_unet_up_blocks_3_attentions_1_proj_out.lora_down.weight
- lora_unet_up_blocks_3_attentions_1_proj_out.lora_up.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.alpha
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.lora_down.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_k.lora_up.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_out_0.alpha
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_down.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_out_0.lora_up.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_q.alpha
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_q.lora_down.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_q.lora_up.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.alpha
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.lora_down.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn1_to_v.lora_up.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.alpha
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.lora_down.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_k.lora_up.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_out_0.alpha
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_down.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_out_0.lora_up.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_q.alpha
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_q.lora_down.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_q.lora_up.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.alpha
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.lora_down.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_attn2_to_v.lora_up.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_0_proj.alpha
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_down.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_0_proj.lora_up.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.alpha
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.lora_down.weight
- lora_unet_up_blocks_3_attentions_1_transformer_blocks_0_ff_net_2.lora_up.weight
- lora_unet_up_blocks_3_attentions_2_proj_in.alpha
- lora_unet_up_blocks_3_attentions_2_proj_in.lora_down.weight
- lora_unet_up_blocks_3_attentions_2_proj_in.lora_up.weight
- lora_unet_up_blocks_3_attentions_2_proj_out.alpha
- lora_unet_up_blocks_3_attentions_2_proj_out.lora_down.weight
- lora_unet_up_blocks_3_attentions_2_proj_out.lora_up.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.alpha
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.lora_down.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_k.lora_up.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_out_0.alpha
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_down.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_out_0.lora_up.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_q.alpha
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_q.lora_down.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_q.lora_up.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.alpha
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.lora_down.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn1_to_v.lora_up.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.alpha
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.lora_down.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_k.lora_up.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_out_0.alpha
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_down.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_out_0.lora_up.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_q.alpha
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_q.lora_down.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_q.lora_up.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.alpha
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.lora_down.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_attn2_to_v.lora_up.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_0_proj.alpha
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_down.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_0_proj.lora_up.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.alpha
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.lora_down.weight
- lora_unet_up_blocks_3_attentions_2_transformer_blocks_0_ff_net_2.lora_up.weight
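Each key above follows the kohya-ss LoRA naming convention: a flattened module path inside the Stable Diffusion UNet (here, the self- and cross-attention projections, feed-forward layers, and the `proj_in`/`proj_out` projections of `up_blocks_2` and `up_blocks_3`), with exactly three tensors per adapted module: `lora_down.weight` (the rank-reducing matrix), `lora_up.weight` (the rank-expanding matrix), and a scalar `alpha` that scales the update. At load time the effective weight is typically reconstructed as W' = W + (alpha / r) · lora_up · lora_down, where r is the LoRA rank (the first dimension of `lora_down.weight`).

Below is a minimal sketch for enumerating these keys yourself; it assumes the `safetensors` package is installed, and `my_lora.safetensors` is a hypothetical filename standing in for any kohya-format LoRA checkpoint:

```python
# Minimal sketch: list the tensors stored in a LoRA .safetensors file.
# "my_lora.safetensors" is a hypothetical filename; any kohya-format
# checkpoint should produce a key listing like the one above.
from safetensors import safe_open

with safe_open("my_lora.safetensors", framework="pt", device="cpu") as f:
    for key in sorted(f.keys()):
        tensor = f.get_tensor(key)
        # alpha entries are scalars; lora_down / lora_up are rank-r matrices.
        print(f"{key}: shape={tuple(tensor.shape)}")
```

This also shows where LoRA's storage savings come from: for a rank-16 adapter on a 320×320 linear projection, `lora_down.weight` has shape (16, 320) and `lora_up.weight` has shape (320, 16), so the full 320×320 update is represented with roughly a tenth of the parameters.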

Source: blog.csdn.net/qq_44824148/article/details/130522248