I am training a GRU neural network. Since I added dropout and recurrent dropout to the GRU layer, I can't get reproducible results from one run to the next, and I can't fix the problem even with:
recurrent_initializer=tf.keras.initializers.Orthogonal(seed=42),
kernel_initializer=tf.keras.initializers.GlorotUniform(seed=42)
in the same layer.
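As far as I can tell, those per-layer seeds only pin down the initial weights, while the dropout masks are drawn from TensorFlow's global RNG on every run. A minimal sketch of what I mean (assuming TF >= 2.7, which is where tf.keras.utils.set_random_seed was added):

import tensorflow as tf

# The dropout mask comes from the global RNG, so only a global reseed
# makes two draws identical; initializer seeds alone do not affect it.
tf.keras.utils.set_random_seed(42)    # seeds Python's random, NumPy and TF at once
a = tf.keras.layers.Dropout(0.5)(tf.ones((1, 8)), training=True)
tf.keras.utils.set_random_seed(42)    # reset the global state...
b = tf.keras.layers.Dropout(0.5)(tf.ones((1, 8)), training=True)
print(tf.reduce_all(a == b).numpy())  # ...and the same mask is drawn: True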
This is my model:
model = tf.keras.models.Sequential()
model.add(tf.keras.layers.GRU(
    20,
    activation='tanh',
    recurrent_activation='sigmoid',
    dropout=0.1,
    recurrent_dropout=0.2,
    return_sequences=False,
    input_shape=(train_XX.shape[1], train_XX.shape[2]),
    recurrent_initializer=tf.keras.initializers.Orthogonal(seed=42),
    kernel_initializer=tf.keras.initializers.GlorotUniform(seed=42)))
model.add(tf.keras.layers.Dense(
    1,
    activation='sigmoid',
    kernel_initializer=tf.keras.initializers.GlorotUniform(seed=42)))
model.compile(
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=False, name="binary_crossentropy"),
    optimizer='adam',
    metrics=[tf.keras.metrics.PrecisionAtRecall(0.75)])
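For context, the training itself is a plain fit call along these lines (train_yy is a stand-in name for my label array, and the epoch/batch values are illustrative, not from the snippet above):

# Hypothetical training call: train_XX is the array from the question,
# train_yy stands in for the matching labels.
history = model.fit(
    train_XX, train_yy,
    epochs=30,      # illustrative values
    batch_size=32,
    shuffle=True,   # shuffling also draws from the RNG, so it is part of what must be seeded
)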
I had already set the seed at the beginning of the program with the usual three lines:
random.seed(42)
np.random.seed(42)
tf.random.set_seed(42)
but the problem was only resolved once I added an extra line before those three seed-fixing rows.
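Since the dropout masks come from TensorFlow's global RNG and, on GPU, from kernels that are nondeterministic by default, the usual candidates for that extra line are the two environment flags below. Take this as a sketch of the standard recipe, not necessarily the exact line from my script; on TF 2.9+ the call tf.config.experimental.enable_op_determinism() plays the same role as the second flag.

import os

# Assumed fix: both flags must be set before TensorFlow is imported and
# before the three seed lines above run.
os.environ['PYTHONHASHSEED'] = '42'        # pin Python's hash randomization
os.environ['TF_DETERMINISTIC_OPS'] = '1'   # request deterministic TF kernels (TF >= 2.1)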