Issue
I want to add another Input layer to my dense network after the inputs have been convolved, but I get a graph disconnected error, and after an intensive Google search I can't figure out why.
I followed an approach similar to other Stack Overflow questions, but it did not work for me.
import tensorflow as tf
from tensorflow.keras.layers import (Embedding, Dense, Input, Conv1D,
                                     MaxPool1D, Activation, Dropout,
                                     Flatten, concatenate)
from tensorflow.keras.models import Model

tf.keras.backend.clear_session()
# max_length (e.g. 1000), vocab_size and embedding_matrix are defined earlier
input_layer = Input(shape=(max_length,))    # tokenised text sequences
input_layer_2 = Input(shape=(7,))           # 7 additional numeric features

# frozen pre-trained embedding
x = Embedding(vocab_size, output_dim=100, weights=[embedding_matrix],
              input_length=max_length, trainable=False)(input_layer)

# first block of three parallel convolutions
c1 = Conv1D(filters=13, kernel_size=2, padding='same', activation='relu',
            kernel_initializer="glorot_normal")(x)
c2 = Conv1D(filters=13, kernel_size=2, padding='same', activation='relu',
            kernel_initializer="glorot_normal")(x)
c3 = Conv1D(filters=13, kernel_size=2, padding='same', activation='relu',
            kernel_initializer="glorot_normal")(x)
x = concatenate([c1, c2, c3], axis=1)
m1 = MaxPool1D(pool_size=2)(x)

# second block of three parallel convolutions
c1 = Conv1D(filters=12, kernel_size=2, padding='same', activation='relu',
            kernel_initializer="glorot_uniform")(m1)
c2 = Conv1D(filters=12, kernel_size=2, padding='same', activation='relu',
            kernel_initializer="he_uniform")(m1)
c3 = Conv1D(filters=12, kernel_size=2, padding='same', activation='relu',
            kernel_initializer="glorot_uniform")(m1)
x = concatenate([c1, c2, c3], axis=1)
m2 = MaxPool1D(pool_size=2)(x)

c4 = Conv1D(filters=15, kernel_size=2, padding='same', activation='relu',
            kernel_initializer="glorot_uniform")(m2)
flat1 = Flatten()(c4)

# merge the convolutional features with the second input
x = concatenate([flat1, input_layer_2], axis=1)
#drop = Dropout(0.50)(x)
d1 = Dense(64, activation="relu", kernel_initializer="glorot_uniform")(x)
d1 = Dense(32, activation="relu", kernel_initializer="glorot_uniform")(d1)
x = concatenate([d1, input_layer_2])  # note: this result is never used below
out = Dense(3, activation="softmax", kernel_initializer="glorot_uniform")(d1)

model = Model(inputs=input_layer, outputs=out)  # only one input is passed here
print(model.summary())
Error:
ValueError: Graph disconnected: cannot obtain value for tensor
KerasTensor(type_spec=TensorSpec(shape=(None, 7), dtype=tf.float32,
name='input_2'), name='input_2', description="created by layer 'input_2'")
at layer "concatenate_2". The following previous layers were accessed
without issue: ['embedding', 'conv1d', 'conv1d_1', 'conv1d_2', 'concatenate',
'max_pooling1d', 'conv1d_3', 'conv1d_4', 'conv1d_5', 'concatenate_1',
'max_pooling1d_1', 'conv1d_6', 'flatten']
Solution
Since you have defined two Input layers and built a branched (multi-input) model, you must pass every input (and output) when you construct the Model:
Model(inputs=[input_layer_1, input_layer_2, ...],
      outputs=[output_layer_1, output_layer_2, ...])
So in this case you should change it to
Model(inputs=[input_layer, input_layer_2], outputs=out)
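For completeness, a minimal sketch of the corrected model definition and a training call; the array names X_text, X_aux and y_train, as well as the optimizer and loss choice, are placeholders/assumptions, not from the original question:
model = Model(inputs=[input_layer, input_layer_2], outputs=out)
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",  # assumes integer class labels
              metrics=["accuracy"])
# supply one array per input, in the same order as `inputs`
model.fit([X_text, X_aux], y_train, epochs=10, batch_size=32)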
Also note that a model with more than one output additionally needs its losses (and metrics) passed to model.compile() as a key:value dictionary keyed by the names of the output layers you defined.
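For example, a hypothetical two-output variant; the layer names class_out and aux_out, and the targets y_class and y_aux, are illustrative, not from the question:
out1 = Dense(3, activation="softmax", name="class_out")(d1)
out2 = Dense(1, activation="sigmoid", name="aux_out")(d1)
model = Model(inputs=[input_layer, input_layer_2], outputs=[out1, out2])
model.compile(optimizer="adam",
              loss={"class_out": "sparse_categorical_crossentropy",
                    "aux_out": "binary_crossentropy"},
              metrics={"class_out": "accuracy", "aux_out": "accuracy"})
# targets are likewise passed per output-layer name
model.fit([X_text, X_aux],
          {"class_out": y_class, "aux_out": y_aux},
          epochs=10, batch_size=32)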
Answered By - Soroush Mirzaei