What is the significance of the rectified linear unit (ReLU) activation function in deep learning?
ReLU is a widely used activation function defined as f(x) = max(0, x): it passes positive inputs through unchanged and outputs zero for negative inputs. Its significance is threefold: it introduces the non-linearity that lets networks learn non-linear functions, it is computationally cheap (a simple threshold, with a gradient of 1 for positive inputs and 0 otherwise), and it mitigates the vanishing-gradient problem that saturating activations like sigmoid and tanh suffer from, which in practice leads to faster convergence during training.
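
A minimal NumPy sketch, purely illustrative and not tied to any particular framework, showing ReLU and its (sub)gradient:

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: max(0, x)
    return np.maximum(0, x)

def relu_grad(x):
    # Subgradient of ReLU: 1 where x > 0, else 0
    # (the gradient at exactly x = 0 is conventionally taken as 0)
    return (x > 0).astype(x.dtype)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))       # [0.  0.  0.  1.5 3. ]
print(relu_grad(x))  # [0. 0. 0. 1. 1.]
```

Because the gradient is exactly 1 over the positive range, gradients flow through active units without shrinking, which is what speeds up training relative to saturating activations.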