Issue
I'm playing around with an image dataset on Kaggle (https://www.kaggle.com/competitions/paddy-disease-classification/data). The dataset contains about 10,000 images at 480×640 resolution.
When I try to load this dataset with the following code,

for (label, file) in dataset_file_img(dataset_path):
    image = load_img_into_tensor(file)
    data.append(image / 255)
    data_label.append(label)

it consumes about 20GB of RAM.
What is the best practice for loading a dataset like this?
Any help would be appreciated!
Solution
Try the following from Keras:
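The answer's code block did not survive, but the standard Keras approach it points at is `tf.keras.utils.image_dataset_from_directory`, which streams batches from disk instead of materialising every image in RAM. The sketch below assumes the competition archive is unpacked into a directory with one subfolder per class label (the function name `load_paddy_dataset` and the default arguments are illustrative, not from the original answer):

```python
import tensorflow as tf

def load_paddy_dataset(data_dir, image_size=(480, 640), batch_size=32):
    """Stream labelled images from data_dir/<class_name>/<file> in batches.

    Unlike appending every image to a Python list, this keeps only one
    batch (here 32 images) in memory at a time.
    """
    ds = tf.keras.utils.image_dataset_from_directory(
        data_dir,
        image_size=image_size,
        batch_size=batch_size,
    )
    # Rescale to [0, 1] lazily inside the pipeline instead of dividing
    # full float arrays up front.
    return ds.map(lambda x, y: (x / 255.0, y))
```

Usage would then be something like `train_ds = load_paddy_dataset("train_images")` (adjust the path to wherever you unpacked the archive), and `train_ds` can be passed directly to `model.fit`.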
Answered By - der Fotik