Issue
I have a list of matrices that I want to stack into a tensor of size (63, 32, 1, 600, 600). When I call torch.stack(matrices).cpu().detach().numpy() it raises the error:
"stack expects each tensor to be equal size, but got [32, 1, 600, 600] at entry 0 and [16, 1, 600, 600] at entry 62". I tried resizing but it did not work. I appreciate any recommendations.
Solution
If I understand correctly, what you're trying to do is stack the output mini-batches together into a single batch. My bet is that your last batch is partially filled (it only has 16 elements instead of 32).
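For example, here is a minimal sketch of how that happens, assuming the mini-batches come from a DataLoader (the 600x600 spatial size is shrunk to 4x4 so it runs quickly); 62 full batches of 32 plus one batch of 16 corresponds to 2000 samples, which matches the shapes in your error:

import torch
from torch.utils.data import DataLoader, TensorDataset

# Hypothetical dataset: 2000 samples, spatial size shrunk from 600x600 to 4x4.
dataset = TensorDataset(torch.randn(2000, 1, 4, 4))
loader = DataLoader(dataset, batch_size=32)  # drop_last=False, so the last batch is partial

batches = [x for (x,) in loader]
print(len(batches))        # 63
print(batches[0].shape)    # torch.Size([32, 1, 4, 4])
print(batches[-1].shape)   # torch.Size([16, 1, 4, 4])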
Instead of using torch.stack (which creates a new axis), I would simply concatenate with torch.cat along the batch axis (axis=0), assuming matrices is a list of torch.Tensors:
torch.cat(matrices).cpu().detach().numpy()
torch.cat concatenates along axis=0 by default, so no dim argument is needed.
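Here is a minimal sketch of the difference, assuming matrices is a list of mini-batch tensors shaped like those in the question (again with the spatial size shrunk so it runs quickly):

import torch

# 62 full batches plus one partial batch, mirroring the shapes in the question
# (spatial size shrunk from 600x600 to 4x4 for the sketch).
matrices = [torch.randn(32, 1, 4, 4) for _ in range(62)]
matrices.append(torch.randn(16, 1, 4, 4))

# torch.stack(matrices) would raise "stack expects each tensor to be equal size"
# because stack adds a new axis and needs identical shapes.
# torch.cat joins along the existing batch axis (dim=0 by default),
# so the unequal last batch is fine.
result = torch.cat(matrices).cpu().detach().numpy()
print(result.shape)  # (2000, 1, 4, 4)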
Answered By - Ivan