dropout, recurrent_dropout in GRU layer


I am training a GRU neural network. Since adding dropout and recurrent dropout to the GRU layer, I can no longer reproduce results across runs, even with:

recurrent_initializer=tf.keras.initializers.Orthogonal(seed=42),
kernel_initializer=tf.keras.initializers.GlorotUniform(seed=42)

in the same layer.

This is my model:

model = tf.keras.models.Sequential()

# GRU layer: dropout applies to the inputs, recurrent_dropout to the
# recurrent state. The seeded initializers only fix the initial weights.
model.add(tf.keras.layers.GRU(
    20,
    activation='tanh',
    recurrent_activation='sigmoid',
    dropout=0.1,
    recurrent_dropout=0.2,
    return_sequences=False,
    input_shape=(train_XX.shape[1], train_XX.shape[2]),
    recurrent_initializer=tf.keras.initializers.Orthogonal(seed=42),
    kernel_initializer=tf.keras.initializers.GlorotUniform(seed=42)))

model.add(tf.keras.layers.Dense(
    1,
    activation='sigmoid',
    kernel_initializer=tf.keras.initializers.GlorotUniform(seed=42)))

model.compile(
    loss=tf.keras.losses.BinaryCrossentropy(from_logits=False),
    optimizer='adam',
    metrics=[tf.keras.metrics.PrecisionAtRecall(0.75)])
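To check whether a given setup is actually reproducible, one option is to train twice in the same process and compare the final weights. A minimal sketch, where build_model() is a hypothetical factory wrapping the model definition above and train_yy is a hypothetical label array (only train_XX appears in the question):

import numpy as np

def train_once():
    tf.keras.utils.set_random_seed(42)                  # re-seed immediately before each run
    model = build_model()                               # hypothetical: rebuilds the Sequential model above
    model.fit(train_XX, train_yy, epochs=3, verbose=0)  # train_yy: hypothetical labels
    return model.get_weights()

w1, w2 = train_once(), train_once()
print(all(np.array_equal(a, b) for a, b in zip(w1, w2)))  # True if the two runs match exactly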

1 Answer

Answered by Virginie Gautier (the original poster):

I had already set the seeds at the beginning of the program with:

import numpy as np
import tensorflow as tf
import random as rn

np.random.seed(1)      # NumPy RNG
tf.random.set_seed(2)  # TensorFlow global RNG
rn.seed(3)             # Python built-in RNG
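(In TensorFlow 2.7 and later, a single utility covers all three of these seeds; a minimal equivalent sketch:)

import tensorflow as tf

tf.keras.utils.set_random_seed(1)  # seeds Python's random, NumPy and TensorFlow in one call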

but adding the following two lines before those three seed-setting lines (environment variables like these are best set before TensorFlow is imported):

import os
os.environ['PYTHONHASHSEED'] = '0'       # fix Python's hash seed (fully effective only if set before the interpreter starts)
os.environ['CUDA_VISIBLE_DEVICES'] = ''  # hide all GPUs so TensorFlow runs on the CPU

resolved my problem: hiding the GPU forces TensorFlow onto the CPU, avoiding the non-deterministic GPU kernels that were making runs differ.
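If disabling the GPU is too costly, TensorFlow 2.8 and later expose a determinism switch instead; this is a sketch of an alternative, not part of the original fix:

import tensorflow as tf

tf.keras.utils.set_random_seed(42)              # seed Python, NumPy and TensorFlow together
tf.config.experimental.enable_op_determinism()  # request deterministic (often slower) op implementations

With op determinism enabled, training can usually stay on the GPU at some cost in speed.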