Issue
My problem is that I have two tensors in one dataset, with headers image and label.
When I iterate over the dataset with a simple loop everything looks fine, but when I create a DataLoader as below
training_loader = torch.utils.data.DataLoader(training_dataset, batch_size=100, shuffle=True)
and run
for i in training_loader:
    print(i)
I'm getting this error:
RuntimeError: stack expects each tensor to be equal size, but got [224, 224] at entry 0 and [224, 224, 3] at entry 4
What can cause it, and how do I fix it? Thank you in advance.
Solution
It seems like one (or more) of your images is not a color image, but a gray-scale image.
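The error in the question can be reproduced directly: `DataLoader` calls `torch.stack` on the samples in a batch, and `stack` requires every tensor to have the same shape. A minimal sketch of the mismatch (shapes taken from the error message):

```python
import torch

# A gray-scale image tensor (H, W) next to a color image tensor (H, W, C),
# as in the reported error: [224, 224] vs [224, 224, 3].
gray = torch.zeros(224, 224)
color = torch.zeros(224, 224, 3)

try:
    torch.stack([gray, color])  # what the default collate_fn does per batch
except RuntimeError as e:
    print(e)  # "stack expects each tensor to be equal size, ..."
```

This is why the plain loop over the dataset works (no batching, so no stacking) while the `DataLoader` fails only when the mismatched image lands in a batch.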
Modify your loading code to force all images to be treated as color images:
img = Image.open(filename).convert('RGB')
See this answer for more details.
Answered By - Shai