Issue
I want to resize my image from 32 × 32 to 16 × 16 (stored as a torch.Tensor), i.e., decrease its resolution. Can anyone help me?
Solution
If you have an image (stored in a tensor) and you want to decrease its resolution, then you are not reshaping it, but rather resizing it.
To that end, you can use PyTorch's interpolate:
import torch
from torch.nn import functional as nnf

x = torch.rand(32, 32)  # example input: a single-channel 32x32 image
# add singleton batch and channel dims, then downsample to 16x16
y = nnf.interpolate(x[None, None, ...], size=(16, 16), mode='bicubic', align_corners=False, antialias=True)
Notes:
- nnf.interpolate operates on batches of multi-channel images; that is, it expects its input x to have 4 dimensions: batch-channels-height-width. So, if your x is a single image with a single channel (e.g., an MNIST digit), you'll have to create a singleton batch dimension and a singleton channel dimension.
- Pay close attention to align_corners and antialias -- make sure you are using the right configuration for your needs.

For more information regarding aliasing and alignment when resizing images, you can look at ResizeRight.
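As a sketch of both notes, the snippet below (using a hypothetical random single-channel image as input) builds the 4-D batch-channels-height-width tensor explicitly with unsqueeze, and runs the same bicubic downsample with antialias on and off so you can see that the two configurations produce different results:

```python
import torch
from torch.nn import functional as nnf

x = torch.rand(32, 32)             # hypothetical single-channel 32x32 image
x4d = x.unsqueeze(0).unsqueeze(0)  # shape (1, 1, 32, 32): batch-channels-height-width

# same resize, with and without antialiasing
smooth = nnf.interpolate(x4d, size=(16, 16), mode='bicubic',
                         align_corners=False, antialias=True)
aliased = nnf.interpolate(x4d, size=(16, 16), mode='bicubic',
                          align_corners=False, antialias=False)

print(smooth.shape)  # torch.Size([1, 1, 16, 16])
# for a random input the two outputs generally differ:
print((smooth - aliased).abs().max().item())
```

The singleton dimensions can be dropped again afterwards with y[0, 0] (or squeeze) to get back a plain 16 × 16 image tensor.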
Answered By - Shai