Issue
I'm experimenting with variations of a basic architecture:
cnn_model5 = models.Sequential([
    layers.Input(shape=input_shape),
    # Normalize.
    norm_layer,
    layers.Conv2D(32, 3, activation='relu'),
    layers.Conv2D(64, 3, activation='relu'),
    layers.MaxPooling2D(),
    layers.Dropout(0.25),
    layers.Flatten(),
    layers.Dense(128, activation='relu', kernel_regularizer=tf.keras.regularizers.L2(0.01)),
    layers.Dropout(0.6),
    layers.Dense(2, activation='softmax'),
])
However, cnn_model5.summary() only produces this output:
Model: "sequential_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
normalization (Normalizati (None, 124, 129, 1) 3
on)
conv2d_2 (Conv2D) (None, 122, 127, 32) 320
conv2d_3 (Conv2D) (None, 120, 125, 64) 18496
max_pooling2d_1 (MaxPoolin (None, 60, 62, 64) 0
g2D)
dropout_2 (Dropout) (None, 60, 62, 64) 0
flatten_1 (Flatten) (None, 238080) 0
dense_2 (Dense) (None, 128) 30474368
dropout_3 (Dropout) (None, 128) 0
dense_3 (Dense) (None, 2) 258
=================================================================
Total params: 30493445 (116.32 MB)
Trainable params: 30493442 (116.32 MB)
Non-trainable params: 3 (16.00 Byte)
This output contains no information about the regularizers, dropout rates, initializations, or activation functions. So although I can store it (along with results) in a file in an output directory, I am still not automatically recording all the information needed to recreate the net.
The best approach I can think of is to build a string containing the command that creates the architecture, save that string to a file in my output directory, and then use Python's exec() to recreate the net, as sketched below.
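A minimal sketch of that exec-based idea (the toy model string and the file path are just placeholders):

# Save the source string that builds the net to the output directory.
model_def = "cnn_model5 = models.Sequential([layers.Input(shape=input_shape), layers.Dense(2, activation='softmax')])"
with open('output/model_def.py', 'w') as f:
    f.write(model_def)

# Later, recreate the net; this assumes models, layers, and input_shape
# are already in scope when exec() runs.
with open('output/model_def.py') as f:
    exec(f.read())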
Is there a better way?
Solution
You can get the information from model.get_config()['layers']. For example:
cnn_model5 = models.Sequential([
    layers.Input(shape=input_shape),
    layers.Conv2D(32, 3, activation='relu'),
    layers.MaxPooling2D(),
    layers.Dropout(0.25),
    layers.Flatten(),
    layers.Dense(128, activation='relu', kernel_regularizer=tf.keras.regularizers.L2(0.01)),
    layers.Dropout(0.6),
    layers.Dense(2, activation='softmax'),
])
cnn_model5.get_config()['layers'][2]['config'] (the dropout layer):
{'dtype': 'float32', 'name': 'dropout_1', 'noise_shape': None, 'rate': 0.25, 'seed': None, 'trainable': True}
cnn_model5.get_config()['layers'][4]['config'] (the first dense layer):
{'name': 'dense_2', 'trainable': True, 'dtype': 'float32', 'units': 128, 'activation': 'relu', 'use_bias': True,
'kernel_initializer': {'module': 'keras.initializers', 'class_name': 'GlorotUniform', 'config': {'seed': None},
'registered_name': None}, 'bias_initializer': {'module': 'keras.initializers', 'class_name': 'Zeros',
'config': {}, 'registered_name': None}, 'kernel_regularizer': {'module': 'keras.regularizers', 'class_name': 'L2',
'config': {'l2': 0.009999999776482582}, 'registered_name': None}, 'bias_regularizer': None, 'activity_regularizer': None,
'kernel_constraint': None, 'bias_constraint': None}
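Since the config captures everything summary() leaves out (regularizers, initializers, activations), it is also enough to rebuild the architecture. A minimal sketch, assuming only standard layers; note that from_config() recreates the architecture, not any trained weights:

# Recreate an untrained model with the identical architecture.
config = cnn_model5.get_config()
rebuilt_model = models.Sequential.from_config(config)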
You can also get the same information in a slightly different format from cnn_model5.layers.
If you want the information printed out, you can use:
for info_ in cnn_model5.get_config()['layers']:
    print(info_['config'])
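And if the goal is a file in your output directory, Keras can serialize the same configuration itself via to_json()/model_from_json(); a minimal sketch (the path is just an example):

from tensorflow import keras

# Write the full architecture (layers, activations, regularizers,
# initializers) as JSON to the output directory.
with open('output/architecture.json', 'w') as f:
    f.write(cnn_model5.to_json())

# Later: rebuild an untrained model with the same architecture.
with open('output/architecture.json') as f:
    restored = keras.models.model_from_json(f.read())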
Answered By - mhenning