What is the role of the rectified linear unit (ReLU) activation function in deep learning?
ReLU introduces non-linearity into neural networks by outputting the input directly for positive values and zero for negative values, i.e. f(x) = max(0, x). It is widely used because it is cheap to compute and its gradient does not saturate for positive inputs, which helps networks learn complex relationships.
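As a minimal sketch, ReLU can be written in a couple of lines with NumPy (the function name `relu` and the sample input are illustrative, not from any particular framework):

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): positives pass through, negatives become zero
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]
```

Frameworks such as PyTorch and TensorFlow ship their own ReLU implementations (e.g. `torch.relu`), so this hand-rolled version is only for illustration.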