<p>
  Implement the forward pass of the SiLU (Sigmoid Linear Unit) activation function for 1D input vectors.
  Given an input tensor of shape [N], where N is the number of elements, compute the output by applying the elementwise formula below.
</p>

<p>
  SiLU is defined as:
  \[
  \begin{align}
  \sigma(x) &= \frac{1}{1 + e^{-x}} \\
  \text{SiLU}(x) &= x \cdot \sigma(x)
  \end{align}
  \]
</p>
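<p>
  As a reference for the elementwise computation, here is a minimal Python sketch. The <code>solve(input, output)</code> signature and plain-list tensors are assumptions for illustration; the actual harness types may differ. The sigmoid is split into two branches so that <code>exp</code> is only ever called on a non-positive argument, which keeps the computation stable even at the boundary values ±100.0.
</p>
<pre>
import math

def solve(input, output):
    # Assumed signature: input and output are float sequences of length N.
    for i, x in enumerate(input):
        # Branch so exp() always receives a non-positive argument (no overflow).
        if x >= 0:
            s = 1.0 / (1.0 + math.exp(-x))   # sigma(x) for x >= 0
        else:
            e = math.exp(x)                  # e^x is tiny for very negative x
            s = e / (1.0 + e)                # equivalent form of sigma(x)
        output[i] = x * s                    # SiLU(x) = x * sigma(x)
</pre>
<p>
  Running this on Example 1's input <code>[0.5, 1.0, -0.5]</code> reproduces the expected output to within floating-point rounding.
</p>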

<h2>Implementation Requirements</h2>
<ul>
  <li>Use only native language features (external libraries are not permitted)</li>
  <li>The <code>solve</code> function signature must remain unchanged</li>
  <li>The final result must be stored in the <code>output</code> tensor</li>
</ul>

<h2>Example 1:</h2>
<pre>
Input:  input = [0.5, 1.0, -0.5]  (N=3)
Output: output = [0.3112295, 0.731059, -0.1887705]
</pre>

<h2>Example 2:</h2>
<pre>
Input:  input = [-1.0, -2.0, -3.0, -4.0, -5.0]  (N=5)
Output: output = [-0.26894143, -0.23840584, -0.14227763, -0.07194484, -0.03346425]
</pre>

<h2>Constraints</h2>
<ul>
  <li>1 ≤ <code>N</code> ≤ 10,000</li>
  <li>-100.0 ≤ input values ≤ 100.0</li>
</ul>