Wrong get_best_hyperparameters from KerasTuner


I'm trying to tune the hyperparameters for a model using KerasTuner. I have the following function to build the models:

import keras_tuner
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense
from tensorflow.keras import metrics as km   # km is assumed to be the Keras metrics module

def hyperTuning(hp: keras_tuner.HyperParameters):
    # numLayers controls how many LSTM layers are stacked after the first one
    i = hp.Int("numLayers", 0, 2)

    model = Sequential()
    # first LSTM layer is always present; lstmForecast.train_in holds the training inputs (defined elsewhere)
    model.add(LSTM( units= hp.Int("units", min_value= 32, max_value= 512, step= 32),
                    activation= 'tanh',
                    return_sequences= (i >= 1),
                    input_shape= (lstmForecast.train_in.shape[1], lstmForecast.train_in.shape[2])))

    # second LSTM layer only when numLayers >= 1
    if i >= 1:
        model.add(LSTM( units= hp.Int(f"units{i}", min_value= 32, max_value= 512, step= 32),
                        activation= 'tanh',
                        return_sequences= (i >= 2) ))

        # third LSTM layer only when numLayers >= 2
        if i >= 2:
            model.add(LSTM( units= hp.Int(f"units{i + 1}", min_value= 32, max_value= 512, step= 32),
                            activation= 'tanh'))

    model.add(Dropout(0.2))
    model.add(Dense(units= 1))
    model.compile(optimizer= 'adam', loss= 'mean_squared_error', metrics= [km.RootMeanSquaredError()])
    return model
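
For context, I create and run the tuner along these lines (a rough sketch only: the choice of RandomSearch, the objective, max_trials, and the lstmForecast.train_out name are stand-ins rather than my exact setup):

tuner = keras_tuner.RandomSearch(
    hyperTuning,                 # the model-building function above
    objective= keras_tuner.Objective("val_root_mean_squared_error", direction= "min"),
    max_trials= 10,              # stand-in value
    overwrite= True)

tuner.search(lstmForecast.train_in, lstmForecast.train_out,   # train_out is a stand-in name
             epochs= 10, validation_split= 0.2)               # stand-in fit settings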

After running the tuner and printing get_best_hyperparameters(), however, this is what I get:

    best_hps = tuner.get_best_hyperparameters(1)
    print(best_hps[0].values)
{'numLayers': 1, 'units': 160, 'units1': 32}
{'numLayers': 1, 'units': 448, 'units1': 32}
{'numLayers': 1, 'units': 160, 'units1': 32}
{'numLayers': 0, 'units': 416, 'units1': 448}
{'numLayers': 1, 'units': 256, 'units1': 32}
{'numLayers': 0, 'units': 512}
{'numLayers': 0, 'units': 32, 'units2': 160, 'units3': 96}
{'numLayers': 0, 'units': 96}

I see two problems here, and I don't understand why they're happening:

  1. Why is get_best_hyperparameters() returning that many values?
  2. How is {'numLayers': 0, 'units': 32, 'units2': 160, 'units3': 96} a possible output when my code specifies that the second layer is only added when i >= 1 and the third only when i >= 2?