EmmanuelCasarrubias committed on
Commit
cab73cf
1 Parent(s): d885235

Add word transformation function using encoder layer and attention mechanism


This commit introduces a function transform_word that applies an encoder layer with an attention mechanism to transform an input word into a numerical representation. The function iterates over each character in the input word, constructs the Q matrix using construir_matriz_Q, and generates the K-transpose and V matrices using generar_k_transpuesta_y_v. It then computes attention scores using scaled_dot_product_attention_2D and derives the projection weight matrices Wq, Wk, and Wv from that output using calcular_pesos_proyeccion.
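The implementations of the helper modules are not included in this commit, so as a point of reference, scaled dot-product attention over 2D matrices typically looks like the following minimal NumPy sketch (the function name matches the commit; the body is an assumption about what attention.py computes):

```python
import numpy as np

def scaled_dot_product_attention_2D(Q, K_transpose, V):
    # Sketch only: the actual implementation lives in attention.py.
    d_k = Q.shape[-1]
    scores = Q @ K_transpose / np.sqrt(d_k)          # scaled similarity scores
    # Numerically stable softmax over each row of scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                               # attention-weighted values
```

Each output row is a convex combination of the rows of V, weighted by how strongly the corresponding query attends to each key.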

Next, the function applies the encoder layer operation using encoder_layer, passing the Q, K-transpose, and V matrices along with the projection weights Wq, Wk, and Wv. The resulting encoder output is then reduced to a single significant number per character by taking the maximum of each row and summing those maxima. Finally, the significant numbers are accumulated and returned as a list representing the transformed word.
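The per-character reduction is the max-then-sum step from the diff below; on a small made-up matrix it behaves like this:

```python
import numpy as np

# Collapse a 2D encoder output into one scalar:
# take the maximum of each row, then sum the row maxima.
encoder_output = np.array([[0.1, 0.9, 0.3],
                           [0.4, 0.2, 0.8]])
row_maxima = np.max(encoder_output, axis=1)      # [0.9, 0.8]
significant_number = np.sum(row_maxima)          # 0.9 + 0.8 = 1.7
```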

This function enables the conversion of textual inputs into numerical representations by leveraging transformer-based architecture components, facilitating applications such as text classification, sentiment analysis, and sequence modeling.

Files changed (1)
  1. transform_word.py +15 -0
transform_word.py ADDED
@@ -0,0 +1,15 @@
+ import numpy as np
+ from matrices import construir_matriz_Q, generar_k_transpuesta_y_v
+ from attention import encoder_layer, calcular_pesos_proyeccion, scaled_dot_product_attention_2D
+
+ def transform_word(word):
+     significant_numbers = []
+     for char in word:
+         Q = construir_matriz_Q(char)
+         K_transpose, V = generar_k_transpuesta_y_v(Q, char)
+         output = scaled_dot_product_attention_2D(Q, K_transpose, V)
+         Wq, Wk, Wv = calcular_pesos_proyeccion(output)
+         encoder_output = encoder_layer(Q, K_transpose, V, Wq, Wk, Wv)
+         significant_number = np.sum(np.max(encoder_output, axis=1))
+         significant_numbers.append(significant_number)
+     return significant_numbers