Issue
When I use early stopping, the model trains for only one epoch, which is not what it should be doing.
Here is the example without early stopping:
import numpy as np
# imports assuming the tf.keras API
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense
from tensorflow.keras.callbacks import EarlyStopping

# split a univariate sequence into samples
def split_sequence(sequence, n_steps):
    X, y = list(), list()
    for i in range(len(sequence)):
        # find the end of this pattern
        end_ix = i + n_steps
        # check if we are beyond the sequence
        if end_ix > len(sequence) - 1:
            break
        # gather input and output parts of the pattern
        seq_x, seq_y = sequence[i:end_ix], sequence[end_ix]
        X.append(seq_x)
        y.append(seq_y)
    return np.array(X), np.array(y)

sequence = np.arange(10, 1000, 10)
n_steps = 3
X, y = split_sequence(sequence, n_steps)

# reshape from [samples, timesteps] into [samples, timesteps, features]
n_features = 1
X = X.reshape((X.shape[0], X.shape[1], n_features))

model = Sequential()
model.add(LSTM(50, activation='relu', input_shape=(n_steps, n_features)))
model.add(Dense(1))
model.compile(optimizer='adam', loss='mean_absolute_percentage_error')

# early_stopping = EarlyStopping(monitor='val_loss', patience=5)
hist = model.fit(X, y, validation_split=0.2, batch_size=16, epochs=200)
As can be seen in the screenshots, the error declines continuously for the first 15+ epochs.
Now, if I use early stopping, training stops after the first epoch:
hist = model.fit(X, y, validation_split=0.2, callbacks = [EarlyStopping(patience=5)], batch_size = 16)
What am I doing wrong, and how can I correct it?
Solution
You forgot to specify the number of epochs in this call, so it defaults to 1:
hist = model.fit(X, y, validation_split=0.2, callbacks = [EarlyStopping(patience=5)], batch_size = 16)
Change it to:
hist = model.fit(X, y, validation_split=0.2, callbacks=[EarlyStopping(patience=5)], batch_size=16, epochs=200)
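For completeness, EarlyStopping can also be told explicitly to watch the validation loss and to roll back to the best weights once training stops. A minimal sketch, assuming the tf.keras API (the monitor and restore_best_weights settings here are illustrative additions, not part of the original question):

from tensorflow.keras.callbacks import EarlyStopping

# stop when val_loss has not improved for 5 consecutive epochs, then keep the best weights seen
early_stopping = EarlyStopping(monitor='val_loss', patience=5, restore_best_weights=True)

hist = model.fit(X, y, validation_split=0.2, callbacks=[early_stopping], batch_size=16, epochs=200)

# number of epochs that actually ran before stopping
print(len(hist.history['loss']))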
Cheers
Answered By - Daniele Grattarola