<p>
  Implement a program that performs the Rectified Linear Unit (ReLU) activation function on a vector of <code>N</code> 32-bit floating point numbers.
  The ReLU function sets all negative values to zero and leaves non-negative values unchanged: \[\text{ReLU}(x) = \max(0, x)\]
</p>

<h2>Implementation Requirements</h2>
<ul>
  <li>External libraries are not permitted</li>
  <li>The <code>solve</code> function signature must remain unchanged</li>
  <li>The final result must be stored in <code>output</code></li>
</ul>

<h2>Example 1:</h2>
<pre>
Input:  input = [-2.0, -1.0, 0.0, 1.0, 2.0]
Output: output = [0.0, 0.0, 0.0, 1.0, 2.0]
</pre>

<h2>Example 2:</h2>
<pre>
Input:  input = [-3.5, 0.0, 4.2]
Output: output = [0.0, 0.0, 4.2]
</pre>

<h2>Constraints</h2>
<ul>
  <li>1 &le; <code>N</code> &le; 100,000,000</li>
</ul>
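<p>
  A minimal sketch in C of one possible <code>solve</code>: it assumes the function receives the input array, an output buffer, and the element count <code>N</code> (the actual harness signature may differ).
</p>
<pre>
#include &lt;stddef.h&gt;

/* Hypothetical signature; adapt to the harness's actual solve prototype. */
void solve(const float *input, float *output, size_t N) {
    for (size_t i = 0; i &lt; N; ++i) {
        /* ReLU: negative values become 0, non-negative values pass through */
        output[i] = input[i] &lt; 0.0f ? 0.0f : input[i];
    }
}
</pre>
<p>
  Applied to Example 1's input <code>[-2.0, -1.0, 0.0, 1.0, 2.0]</code>, this produces <code>[0.0, 0.0, 0.0, 1.0, 2.0]</code>. The branchless comparison form avoids calling a library <code>max</code>, keeping the loop free of external dependencies.
</p>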
