Issue
I am trying to build a CNN-LSTM model with Keras, and I want to use keras.utils.Sequence to feed data into the model (the data is continuous, and I want every available window in it to be used for training).
But Keras keeps complaining about the shape of my input data.
Error:
Epoch 1/100
WARNING:tensorflow:Model was constructed with shape (None, 20, 30) for input KerasTensor(type_spec=TensorSpec(shape=(None, 20, 30), dtype=tf.float32, name='input_7'), name='input_7', description="created by layer 'input_7'"), but it was called on an input with incompatible shape (None, None).
Cell 5 line 2
15 dataSeq = TrainDataFeedSequence(train, INPUT_SIZE)
17 # for i in range(0, len(dataSeq)):
18 # curInputShape = dataSeq.__getitem__(i)[0].shape
19 # if (curInputShape[0] == None or curInputShape[1] == None or curInputShape == (None, None)):
20 # print(dataSeq.__getitem__(i)[0])
---> 22 model.fit(x=dataSeq, epochs=100, verbose=1)
in filter_traceback.<locals>.error_handler(*args, **kwargs)
67 filtered_tb = _process_traceback_frames(e.__traceback__)
68 # To get the full stack trace, call:
69 # `tf.debugging.disable_traceback_filtering()`
---> 70 raise e.with_traceback(filtered_tb) from None
71 finally:
72 del filtered_tb
in outer_factory.<locals>.inner_factory.<locals>.tf__train_function(iterator)
13 try:
14 do_return = True
---> 15 retval_ = ag__.converted_call(ag__.ld(step_function), (ag__.ld(self), ag__.ld(iterator)), None, fscope)
16 except:
17 do_return = False
ValueError: Exception encountered when calling layer "sequential_6" (type Sequential).
Input 0 of layer "conv1d_3" is incompatible with the layer: expected min_ndim=3, found ndim=2. Full shape received: (None, None)
My code for the Sequence:
class TrainDataFeedSequence(tf.keras.utils.Sequence):
    def __init__(self, data, batch_size):
        self.data = data
        self.batch_size = batch_size

    def __len__(self):
        return len(self.data) - self.batch_size

    def __getitem__(self, idx):
        inputs = self.data.iloc[idx:idx + self.batch_size, :]
        targets = self.data.iloc[idx + self.batch_size, :1]
        return (inputs, targets)

dataSeq = TrainDataFeedSequence(train, 20)

# this prints out nothing
# for i in range(0, len(dataSeq)):
#     curInputShape = dataSeq.__getitem__(i)[0].shape
#     if (curInputShape[0] == None or curInputShape[1] == None or curInputShape == (None, None)):
#         print(dataSeq.__getitem__(i)[0])

model.fit(x=dataSeq, epochs=100)
I'm quite sure that no item in my Sequence has shape (None, None), so what is going wrong? I also tried printing the shapes of all input items, and every one of them is (20, 30).
keras == 2.10.0, tensorflow == 2.10.0, python == 3.9.18
Solution
For whoever wants to know: it's because TrainDataFeedSequence doesn't return inputs and targets as np.array. Converting both inputs and targets to np.array and returning those solved the error.
Working version:
import numpy as np

class TrainDataFeedSequence(tf.keras.utils.Sequence):
    def __init__(self, data, batch_size):
        self.data = data
        self.batch_size = batch_size

    def __len__(self):
        return len(self.data) - self.batch_size

    def __getitem__(self, idx):
        inputs = self.data.iloc[idx:idx + self.batch_size, :]
        targets = self.data.iloc[idx + self.batch_size, :1]
        # Keras expects each batch as arrays, not pandas objects
        return np.array(inputs), np.array(targets)

dataSeq = TrainDataFeedSequence(train, 20)

model.fit(x=dataSeq, epochs=100)
Worth mentioning that this exact hand-made Sequence still ended up screwing up my training process, so better not to use it directly; treat it only as a reference for fixing the shape error.
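Since the answer warns against reusing the hand-made Sequence, one possibly safer alternative is to precompute every sliding window as a single array up front and let model.fit handle batching. A minimal numpy sketch; the 100x30 random data and the window length of 20 are assumptions matching the shapes from the question (on a real DataFrame, call .to_numpy() first):

```python
import numpy as np

# Stand-in for the `train` DataFrame: 100 rows of 30 features.
values = np.random.rand(100, 30).astype("float32")
WINDOW = 20  # named `batch_size` in the Sequence above, but it is really the window length

# Every window of WINDOW consecutive rows; sliding_window_view puts the window
# axis last, so transpose back to (num_windows, WINDOW, num_features).
windows = np.lib.stride_tricks.sliding_window_view(values, WINDOW, axis=0)
windows = windows.transpose(0, 2, 1)

inputs = windows[:-1]            # keep only windows that have a following row
targets = values[WINDOW:, :1]    # first column of the row after each window

# inputs.shape == (80, 20, 30), targets.shape == (80, 1);
# model.fit(inputs, targets, batch_size=..., epochs=...) then batches normally.
```

Unlike the Sequence above, where each __getitem__ returns a single window, this gives Keras one (N, 20, 30) array so each training batch contains several complete windows.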
Answered By - laezZ_boi