Derivative of the Hadamard Product

$\tfrac{\mathrm{d} }{\mathrm{d} W } (\vec{a}W) \odot (\vec{b}W)$

I am looking for the solution of this derivative of the Hadamard product.
$\vec{a}$ and $\vec{b}$ are vectors of dimension $|V|$; $W$ is a matrix of dimension $|V| \times |d|$.
  • What do you mean with the derivative wrt W?

  • I did some research and it seems to me that there is no commonly accepted definition. Please provide one.

  • The term is part of the backpropagation of my neural network. I need to calculate this derivative to update the weights. Hope this helps. (See the sketch after these comments.)

  • The Hadamard product is the element-wise product of two matrices of the same dimensionality: https://en.m.wikipedia.org/wiki/Hadamard_product_(matrices)

  • One can view the product as a function from $\mathbb{R}^{|V| \times |d|}$ to $\mathbb{R}^{|d|}$, so the derivative may mean just the matrix of partial derivatives. Do you expect the result to be a $|d| \times (|V| \cdot |d|)$ matrix?

  • a and b must be row vectors for this product to make sense.

  • Ideally the result would be of dimension $|V|$.

  • The preferred notation for a vector is as a column vector, so ideally $aW$ would be written as $a^{\mathsf T} W$, meaning $a$ transpose times $W$. I will keep with the notation you used, so that $aW$ is a row vector times a matrix, but this is not a very common notation.

  • I am sorry for any inconvenience with the notation. Of course, a and b would be transposed.
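Given the conventions settled above ($\vec{a}$, $\vec{b}$ column vectors of dimension $|V|$, so $\vec{a}^{\mathsf T} W$ is a row of dimension $|d|$), here is a minimal sketch of the component-wise derivative, offered as my own working under those assumptions rather than as the accepted answer. Write $u = \vec{a}^{\mathsf T} W$ and $v = \vec{b}^{\mathsf T} W$, so that $f(W) = u \odot v$ has components $f_k = u_k v_k$. Since $\partial u_k / \partial W_{ij} = a_i \delta_{kj}$ and $\partial v_k / \partial W_{ij} = b_i \delta_{kj}$, the product rule gives

$$\frac{\partial f_k}{\partial W_{ij}} = \delta_{kj}\left(a_i v_j + b_i u_j\right),$$

i.e. a sparse $|d| \times (|V| \cdot |d|)$ Jacobian, matching the shape suggested in the comments. For backpropagation the Jacobian is never needed explicitly: contracting with the upstream gradient $g = \partial L / \partial f$ (a row of dimension $|d|$) gives

$$\frac{\partial L}{\partial W} = \vec{a}\,\big(g \odot v\big) + \vec{b}\,\big(g \odot u\big) \;\in\; \mathbb{R}^{|V| \times |d|},$$

a sum of two outer products with exactly the shape of $W$, which is what the weight update requires.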
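As a sanity check, here is a short NumPy sketch comparing this analytic gradient against central finite differences; the sizes, seed, and variable names are hypothetical, chosen only for illustration:

```python
import numpy as np

# Hypothetical sizes for illustration: |V| = 5, |d| = 3.
V, d = 5, 3
rng = np.random.default_rng(0)
a = rng.standard_normal(V)        # vector a, dimension |V|
b = rng.standard_normal(V)        # vector b, dimension |V|
W = rng.standard_normal((V, d))   # weight matrix, |V| x |d|
g = rng.standard_normal(d)        # upstream gradient dL/df, dimension |d|

def f(W):
    # (a^T W) ⊙ (b^T W): element-wise product of two length-|d| rows
    return (a @ W) * (b @ W)

# Analytic gradient of L = g · f(W) with respect to W:
# dL/dW = a (g ⊙ b^T W) + b (g ⊙ a^T W), shape (|V|, |d|)
u, v = a @ W, b @ W
grad_analytic = np.outer(a, g * v) + np.outer(b, g * u)

# Central finite differences, entry by entry
eps = 1e-6
grad_numeric = np.zeros_like(W)
for i in range(V):
    for j in range(d):
        Wp, Wm = W.copy(), W.copy()
        Wp[i, j] += eps
        Wm[i, j] -= eps
        grad_numeric[i, j] = g @ (f(Wp) - f(Wm)) / (2 * eps)

print(np.max(np.abs(grad_analytic - grad_numeric)))  # should be ~1e-9
```

The two gradients agree to floating-point tolerance, which supports the outer-product form of the update.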
