ReLU (Rectified Linear Unit): A widely used activation function, defined as f(x) = max(0, x), that returns the input if it is positive and returns 0 if it is negative.
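For illustration, here is a minimal NumPy sketch of ReLU; the function name `relu` is our own, not taken from any particular library:

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    """Element-wise ReLU: returns max(0, x) for each element."""
    return np.maximum(0, x)

# Negative inputs are zeroed out; positive inputs pass through unchanged.
print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))  # [0.  0.  0.  1.5 3. ]
```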