Issue
I am working on a semantic segmentation project in PyTorch, and I have class maps of shape [H,W], where each element is an integer between 0 and n (n being the number of classes), H is the image height, and W the image width.
Here is an example:
test_label = torch.zeros([10,10])
test_label[:5,:5] = 1
test_label[5:,:5] = 2
test_label[:5,5:] = 3
test_label
Output:
tensor([[1., 1., 1., 1., 1., 3., 3., 3., 3., 3.],
[1., 1., 1., 1., 1., 3., 3., 3., 3., 3.],
[1., 1., 1., 1., 1., 3., 3., 3., 3., 3.],
[1., 1., 1., 1., 1., 3., 3., 3., 3., 3.],
[1., 1., 1., 1., 1., 3., 3., 3., 3., 3.],
[2., 2., 2., 2., 2., 0., 0., 0., 0., 0.],
[2., 2., 2., 2., 2., 0., 0., 0., 0., 0.],
[2., 2., 2., 2., 2., 0., 0., 0., 0., 0.],
[2., 2., 2., 2., 2., 0., 0., 0., 0., 0.],
[2., 2., 2., 2., 2., 0., 0., 0., 0., 0.]])
Now, what I want is something of shape [n,H,W], where e.g. the slice at index 1 (shape [H,W]) would be:
tensor([[1., 1., 1., 1., 1., 0., 0., 0., 0., 0.],
[1., 1., 1., 1., 1., 0., 0., 0., 0., 0.],
[1., 1., 1., 1., 1., 0., 0., 0., 0., 0.],
[1., 1., 1., 1., 1., 0., 0., 0., 0., 0.],
[1., 1., 1., 1., 1., 0., 0., 0., 0., 0.],
[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]])
And the slice at index 2 would be:
tensor([[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
[0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
[2., 2., 2., 2., 2., 0., 0., 0., 0., 0.],
[2., 2., 2., 2., 2., 0., 0., 0., 0., 0.],
[2., 2., 2., 2., 2., 0., 0., 0., 0., 0.],
[2., 2., 2., 2., 2., 0., 0., 0., 0., 0.],
[2., 2., 2., 2., 2., 0., 0., 0., 0., 0.]])
Is there a PyTorch function which does this? My current approach would be to iteratively mask over each unique element in the original tensor and insert the results into a tensor of shape [n,H,W] initially filled with zeros. But that doesn't seem to be the best way to do it. I tried to look it up, but it seems I am not able to find the right name for this operation.
Thank you very much for your time.
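For reference, the iterative masking approach described in the question might look like the following sketch (assuming class 0 is background and the class map is the `test_label` example above):

```python
import torch

# Reconstruct the example class map from the question
test_label = torch.zeros([10, 10])
test_label[:5, :5] = 1
test_label[5:, :5] = 2
test_label[:5, 5:] = 3

n = int(test_label.max()) + 1  # number of channels, including background class 0
out = torch.zeros(n, *test_label.shape)
for c in range(n):
    mask = test_label == c
    out[c][mask] = test_label[mask]  # channel c keeps only the class-c pixels
```

This works, but the Python loop over classes is what the vectorized solution below avoids.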
Solution
You could apply nn.functional.one_hot to convert the dense format into a one-hot encoding, then multiply by the label values to get the desired result:
>>> import torch.nn.functional as F
>>> x = test_label
>>> C = int(x.max()) + 1
>>> ohe = F.one_hot(x.long(), num_classes=C)
Then multiply by the label values and move the channel dimension first:
>>> res = ohe*torch.arange(C)
>>> res.permute(2,0,1)
tensor([[[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]],
[[1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
[1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
[1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
[1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
[1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]],
[[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[2, 2, 2, 2, 2, 0, 0, 0, 0, 0],
[2, 2, 2, 2, 2, 0, 0, 0, 0, 0],
[2, 2, 2, 2, 2, 0, 0, 0, 0, 0],
[2, 2, 2, 2, 2, 0, 0, 0, 0, 0],
[2, 2, 2, 2, 2, 0, 0, 0, 0, 0]],
[[0, 0, 0, 0, 0, 3, 3, 3, 3, 3],
[0, 0, 0, 0, 0, 3, 3, 3, 3, 3],
[0, 0, 0, 0, 0, 3, 3, 3, 3, 3],
[0, 0, 0, 0, 0, 3, 3, 3, 3, 3],
[0, 0, 0, 0, 0, 3, 3, 3, 3, 3],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]]])
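An alternative worth mentioning is Tensor.scatter_, which writes each pixel's label value directly into its class channel in one call, avoiding the intermediate [H,W,C] one-hot tensor. A minimal sketch, using the same `test_label` example:

```python
import torch

test_label = torch.zeros([10, 10])
test_label[:5, :5] = 1
test_label[5:, :5] = 2
test_label[:5, 5:] = 3

C = int(test_label.max()) + 1
# out[label[h, w], h, w] = label[h, w]; all other entries stay zero
out = torch.zeros(C, *test_label.shape)
out.scatter_(0, test_label.long().unsqueeze(0), test_label.unsqueeze(0))
```

Both approaches produce the same [C,H,W] result; the one-hot version is arguably more readable, while scatter_ skips the permute.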
Answered By - Ivan