$\tfrac{\mathrm{d} }{\mathrm{d} W } (\vec{a}W) \odot (\vec{b}W)$

I am looking for the solution of this derivative of the Hadamard Product.
a and b are vectors of dimension |V|; W is a matrix of dimension |V|×|d|.
• Alessandro Iraci
0

What do you mean with the derivative wrt W?

• Alessandro Iraci
0

I did some research and it seems to me that there is no commonly accepted definition. Please provide one.

• Panda
0

The term is part of the backpropagation of my neural network. I need to calculate this derivative to update my weights in the neural network. Hope this helps.
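
For backpropagation one typically needs the vector–Jacobian product rather than the full derivative. A minimal NumPy sketch under the thread's shape conventions (the concrete sizes |V| = 5, |d| = 3, the upstream gradient g, and the loss L are illustrative assumptions, not from the thread), checked against finite differences:

```python
import numpy as np

# Assumed setup (illustrative, not from the thread):
# a, b are row vectors of length |V|, W is |V| x |d|,
# f(W) = (aW) ⊙ (bW), and g = dL/df is the upstream gradient.
# Then the weight gradient, with the same |V| x |d| shape as W, is
#   dL/dW = a^T (g ⊙ (bW)) + b^T (g ⊙ (aW)).
rng = np.random.default_rng(1)
V, d = 5, 3
a, b = rng.standard_normal(V), rng.standard_normal(V)
W = rng.standard_normal((V, d))
g = rng.standard_normal(d)                 # upstream gradient dL/df

dW = np.outer(a, g * (b @ W)) + np.outer(b, g * (a @ W))

# Finite-difference check on the scalar loss L(W) = g · f(W)
def L(W):
    return g @ ((a @ W) * (b @ W))

eps = 1e-6
dW_num = np.zeros_like(W)
for i in range(V):
    for j in range(d):
        Wp = W.copy()
        Wp[i, j] += eps
        dW_num[i, j] = (L(Wp) - L(W)) / eps

assert np.allclose(dW, dW_num, atol=1e-4)
```

In this form the full (and very sparse) Jacobian never has to be materialized, which is how reverse-mode autodiff frameworks handle this product.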

• Panda
0

The Hadamard product is the element-wise product of two matrices of the same dimensionality: https://en.m.wikipedia.org/wiki/Hadamard_product_(matrices)
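
For instance, in NumPy the `*` operator on two same-shape arrays is exactly this element-wise product (a small illustrative example, not from the thread):

```python
import numpy as np

# Hadamard (element-wise) product of two 2x2 matrices
x = np.array([[1, 2], [3, 4]])
y = np.array([[5, 6], [7, 8]])
print(x * y)   # [[ 5 12]
               #  [21 32]]
```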

• Erdos
0

One can view the product as a function from R^(|V|×|d|) to R^(|d|). So the derivative may mean just the matrix of partial derivatives. Do you expect the result to be a |d| by (|V|×|d|) matrix?
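
Under that reading, the partials work out to ∂f_k/∂W_ij = δ_jk (a_i (bW)_k + b_i (aW)_k), since only column j = k of W enters component k. A sketch verifying this against finite differences (the shapes |V| = 4, |d| = 3 are illustrative assumptions):

```python
import numpy as np

# Illustrative shapes, not from the thread
rng = np.random.default_rng(0)
V, d = 4, 3
a, b = rng.standard_normal(V), rng.standard_normal(V)
W = rng.standard_normal((V, d))

def f(W):
    return (a @ W) * (b @ W)          # Hadamard product, shape (|d|,)

# Analytic partials: df_k/dW_ij = delta_jk * (a_i (bW)_k + b_i (aW)_k)
J = np.zeros((d, V, d))
for k in range(d):
    J[k, :, k] = a * (b @ W)[k] + b * (a @ W)[k]

# Finite-difference check
eps = 1e-6
J_num = np.zeros_like(J)
for i in range(V):
    for j in range(d):
        Wp = W.copy()
        Wp[i, j] += eps
        J_num[:, i, j] = (f(Wp) - f(W)) / eps

assert np.allclose(J, J_num, atol=1e-4)
```

Note the Jacobian is very sparse: component k of the output only depends on column k of W.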

• Mathe
0

a and b must be row vectors for this product to make sense.

• Panda
0

Ideally the result would have dimension |V|.

• Mathe
0

The preferred notation for a vector is as a column vector, so ideally aW would be written as a^T W, meaning a transpose times W. I will keep with the notation you used, so that aW is a row vector times a matrix, but this is not a very common notation.

• Panda
0

I am sorry for any inconvenience with the notation. Of course, a and b would be transposed.
