I am trying to add a ReLU activation function layer to my neural network. However, when I try the following code, I get this error:
ValueError: The truth value of an array with more than one element is ambiguous. Use a.any() or a.all()
I tried using:
class Relu(Activation):
    def __init__(self):
        def relu(x):
            return max(x, 0)
        def reluprime(x):
            return 1 if x > 0 else 0
        super().__init__(relu, reluprime)
I am very new to neural networks. Thank you
Your variable `x` is a numpy array. When dealing with numpy arrays, it is recommended to use numpy functions, which tend to act elementwise, rather than builtin Python functions, which don't know what to do with a numpy array.
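A quick sketch of the difference (the sample array is just an illustration):

```python
import numpy as np

x = np.array([-2.0, -1.0, 0.0, 3.0])

# Builtin max() ends up asking for the truth value of a whole
# boolean array, which raises the "truth value of an array with
# more than one element is ambiguous" ValueError:
#   max(x, 0)  # ValueError
#
# np.maximum instead compares each element with 0:
print(np.maximum(x, 0))  # [0. 0. 0. 3.]
```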
For instance, `max(x, 0)` makes sense if `x` is a number, but here `x` is an array, so what does it mean? How do you compare an array with 0? Instead, use `np.maximum`, which will compare each element of the array with 0 and return an array. Likewise, instead of using a `1 if x > 0 else 0` expression, which makes no sense if `x` is an array, use the numpy function `np.heaviside`. Relevant documentation: `np.maximum`, `np.heaviside`.
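Putting the two replacements together, the class from the question could look like this. The `Activation` stub here is an assumption standing in for the base class your framework defines (presumably it just stores the two functions):

```python
import numpy as np

# Minimal stand-in for the Activation base class from the question
# (an assumption: it simply stores the function and its derivative).
class Activation:
    def __init__(self, activation, activation_prime):
        self.activation = activation
        self.activation_prime = activation_prime

class Relu(Activation):
    def __init__(self):
        def relu(x):
            return np.maximum(x, 0)      # elementwise max(x, 0)
        def reluprime(x):
            return np.heaviside(x, 0.0)  # 1 where x > 0, else 0
        super().__init__(relu, reluprime)
```

The second argument of `np.heaviside` is the value returned where `x == 0`; `0.0` matches the `1 if x > 0 else 0` expression in the question.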