Issue
I am confused by the following example of a matrix-tensor multiplication that returns a vector. At first glance I thought it meant multiplying the first slice of the tensor dydx by the matrix dLdy, but that does not reproduce the result, as shown below. So what does this einsum mean?
>>> import torch
>>> dLdy = torch.randn(2, 2)
>>> dydx = torch.randn(2, 2, 2)
>>> torch.einsum('jk,jki->i', dLdy, dydx)
tensor([0.3115, 3.7255])
>>> dLdy
tensor([[-0.4845,  0.6838],
        [-1.1723,  1.4914]])
>>> dydx
tensor([[[ 1.5496, -1.2722],
         [ 0.1221,  1.0495]],
        [[-1.4882,  0.0307],
         [-0.5134,  1.6276]]])
>>> (dLdy * dydx[0]).sum()
tensor(-0.1985)
Solution
For inputs A (indices jk) and B (indices jki), this is a contraction (sum) over the two shared dimensions j and k, so res(i) = sum_{j,k} A(j,k) * B(j,k,i). Note that i is the last dimension of B, not the first, which is why slicing dydx[0] does not match.
for example:
import torch

dLdy = torch.randn(2, 2)
dydx = torch.randn(2, 2, 2)
print(torch.einsum('jk,jki->i', dLdy, dydx))
print((dLdy * dydx[:, :, 0]).sum())
print((dLdy * dydx[:, :, 1]).sum())
produces
tensor([4.6025, 1.8987])
tensor(4.6025)
tensor(1.8987)
i.e. (dLdy * dydx[:,:,0]).sum() is the first element of the resulting vector, and so on.
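The same contraction can also be cross-checked against torch.tensordot, which sums over explicitly listed dimension pairs, and against a plain Python loop. This is a minimal sketch (the seed and tensor shapes are arbitrary choices, not from the original post):

```python
import torch

torch.manual_seed(0)  # arbitrary seed, just for reproducibility
dLdy = torch.randn(2, 2)
dydx = torch.randn(2, 2, 2)

# einsum: res[i] = sum over j,k of dLdy[j,k] * dydx[j,k,i]
res_einsum = torch.einsum('jk,jki->i', dLdy, dydx)

# tensordot contracting dims (0,1) of dLdy with dims (0,1) of dydx
res_tensordot = torch.tensordot(dLdy, dydx, dims=([0, 1], [0, 1]))

# explicit loop over the free index i, matching the solution's slicing
res_loop = torch.stack([(dLdy * dydx[:, :, i]).sum() for i in range(2)])

print(torch.allclose(res_einsum, res_tensordot))  # True
print(torch.allclose(res_einsum, res_loop))       # True
```

All three give the same vector: the indices j and k appearing in both operands but not in the output are summed over, while i survives as the output dimension.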
Answered By - piterbarg