Difference between ways of specifying an activation function


While studying TensorFlow, I ran into a question.

There are two ways to specify an activation function:

activation = 'relu' and activation = tf.nn.relu

I want to know the difference between them.

(Actually, I think this applies to other activation functions as well.)

I tried both ways.

The first is:

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape = (28, 28)),
    tf.keras.layers.Dense(128, activation = 'relu'),
    tf.keras.layers.Dense(10, activation = tf.nn.softmax)
])

The second is:

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape = (28, 28)),
    tf.keras.layers.Dense(128, activation = tf.nn.relu),
    tf.keras.layers.Dense(10, activation = tf.nn.softmax)
])

They gave me the same result.

So what is the difference between them?


Accepted answer (Minh-Long Luu):

They are the same. Keras does this to give a cleaner UX for users who don't want to customize the inner workings of a function.

This line of code in the Dense layer fetches the activation function from the activations module. In that module, you can see that the relu function is wrapped with @keras_export, which is what lets you refer to it by its string name.
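As a quick sketch of that lookup (assuming the public tf.keras.activations.get helper, which resolves a string name to the actual function):

```python
import tensorflow as tf

# Sketch of what Dense does internally: the string 'relu' is
# resolved to a callable via the activations registry.
fn = tf.keras.activations.get('relu')

x = tf.constant([-2.0, -1.0, 0.0, 1.0, 2.0])
# The resolved callable behaves like tf.nn.relu: max(x, 0) element-wise.
print(fn(x).numpy())          # [0. 0. 0. 1. 2.]
print(tf.nn.relu(x).numpy())  # [0. 0. 0. 1. 2.]
```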

Some users might want to pass additional parameters, e.g. tf.keras.activations.relu(x, alpha=0.5); that's why both ways exist, and you can use either.
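For instance, a string like 'relu' cannot carry an alpha argument, so a callable is needed; a minimal sketch (the leaky helper name is made up for illustration):

```python
import tensorflow as tf

# A plain string can't carry extra arguments, so wrap the activation
# in a callable when you need non-default parameters (alpha adds a
# leaky slope for negative inputs).
def leaky(x):
    return tf.keras.activations.relu(x, alpha=0.5)

model = tf.keras.models.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation=leaky),
    tf.keras.layers.Dense(10, activation='softmax'),
])

# Negative inputs are scaled by alpha, positives pass through.
print(leaky(tf.constant([-2.0, 3.0])).numpy())  # [-1.  3.]
```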

Answer (yxz):

Using activation='relu' is shorthand for activation=tf.keras.activations.relu, so one comes from TensorFlow's core library (tf.nn) and the other from Keras. Which one you should use depends on whether you build the network with low-level TensorFlow ops or with a Keras sequential model. For more detail, see this post.
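Whichever origin you pick, the two callables compute the same function; a quick numeric check (a sketch, not part of the original answer):

```python
import tensorflow as tf

# tf.nn.relu (low-level TF op) and tf.keras.activations.relu (the
# Keras wrapper behind the 'relu' string) agree element-wise.
x = tf.constant([-3.0, -0.5, 0.0, 0.5, 3.0])
same = tf.reduce_all(tf.nn.relu(x) == tf.keras.activations.relu(x))
print(bool(same))  # True
```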