% Some commands to this chapter
\newcommand{\Individuos}{{\cal I}}
\newcommand{\IndRand}{{\sffamily IndRand}}
\newcommand{\Random}{{\sffamily Random}}
\newcommand{\Crossover}{{\sffamily Crossover}}
\newcommand{\Mutation}{{\sffamily Mutation}}
\newcommand{\Sto}{\left< \,} % string open
\newcommand{\Stc}{\, \right>} % string close
\newcommand{\Ste}{\Sto \, \Stc} % empty
\newcommand{\Sbo}{\left[ \,} % square bracket open
\newcommand{\Sbc}{\, \right]} % square bracket close
\newcommand{\Entry}[2]{#1 \left[ \, #2 \, \right]} % bracketed entry
\newcommand{\Paux}{P_{\rm aux}}
\newcommand{\MyComment}[1]{\Comment{\parbox[t]{11cm}{#1}}}
\newcommand{\MyCommentL}[2]{\rule{0cm}{0.4cm}$\triangleright$
  {\parbox[t]{#1}{#2}}}


\chapter[Proposed PSO and SVM Methodology for Reliability Prediction]{Proposed Particle Swarm Optimization and  Support Vector Machine Methodology for Reliability Prediction}\label{ch:model-prob}


In this work, \gls{pso} is considered to tackle the model selection
problem for \gls{svr} tasks. Since the Gaussian \gls{rbf} kernel is
broadly used in reliability-related \gls{svm} works, it is the one
taken into account. Hence, the parameters $C$, $\varepsilon$ and the
Gaussian \gls{rbf} width $\sigma$ must be defined. These three
\gls{svr} parameters become decision variables of the \gls{pso}
algorithm, forming a 3-dimensional search space. In fact, instead of
$\sigma$ itself, $\gamma = \frac{0.5}{\sigma^2}$ is considered, as can
be seen in the expression of the \gls{rbf} kernel in Table
\ref{tab:kernel}. This is necessary due to {\sf{LIBSVM}} requirements,
since it works with $\gamma$ values and not directly with the width
$\sigma$. Thus, the $i^{th}$ particle is described by the vectors
$\mathbf{x}_i = (x_{i1}, x_{i2}, x_{i3})$, $\mathbf{p}_i = (p_{i1},
p_{i2}, p_{i3})$ and $\mathbf{v}_i = (v_{i1}, v_{i2}, v_{i3})$, whose
first, second and third dimensions are respectively related to $C$,
$\varepsilon$ and $\gamma$.
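The mapping between the kernel width and the {\sf LIBSVM} parameter, together with the particle encoding, can be sketched as follows (a minimal illustration; the helper name is ours, not part of {\sf LIBSVM}):

```python
# Hypothetical helper: converts the Gaussian RBF width sigma into the
# gamma value that LIBSVM expects (gamma = 0.5 / sigma^2).
def sigma_to_gamma(sigma):
    return 0.5 / sigma ** 2

# A particle position encodes (C, epsilon, gamma), in this order.
x_i = (100.0, 0.01, sigma_to_gamma(2.0))  # gamma = 0.125
```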

\citeonline{bratton2007} contends that a standard \gls{pso} algorithm
includes a {\it lbest} model, the use of the constriction factor in
the velocity (and thus position) updates, a swarm of 50 particles, a
non-uniform swarm initialization and the procedure of skipping the
fitness evaluation of particles that exit the feasible search
space.

In this work, similarly to the suggested \gls{pso}, the constriction
factor is used, infeasible particles are allowed but their fitness
values are not assessed, and a {\it lbest} model is
implemented. Differently from the standard algorithm, the traditional
uniform random swarm initialization is performed and the number of
particles is set to 30. Indeed, \citeonline{bratton2007} states that
swarms of 20--100 particles produced quite similar empirical
results.

Additionally, a {\it gbest} \gls{pso} is also implemented in order to
compare its performance to that of the {\it lbest} one in the specific
context of reliability prediction. Moreover, in the \gls{pso} fitness
evaluation phase, instead of merely evaluating an ordinary objective
function, the coupling with {\sf LIBSVM} takes place. The next section
details the steps of the implemented \gls{pso} combined with
{\sf LIBSVM}.

\section{Steps}

\subsection{Reading of Data and Definition of Variables' Bounds}

Before the initialization of the \gls{pso} swarm, it is necessary to
read the available input and output data from a text file. Such a file
is organized as follows: the first column comprises the output values
($y_1, y_2, \dots, y_{\ell + \vartheta + \lambda}$) and each of the
following columns holds one dimension of the input vector
$\mathbf{x}$. This data set may have originated, for example, from a
condition monitoring procedure.

After reading all data, the entire set is subdivided into training,
validation and test sets, whose sizes ($\ell, \vartheta, \lambda$,
respectively) are defined by the user. Usually, the majority of the
entries are reserved for the training step, and the remaining ones are
approximately equally divided to form the validation and test
sets. The implemented \gls{pso}+\gls{svm} adopts the validation set
approach, in which the order of the observations is respected, since
cross-validation and leave-one-out are computationally costly
approaches and problems concerning reliability prediction based on
time series are also considered.

The maximum and minimum values of the variables ($x_j^{min},x_j^{max},
j = 1, 2, 3$) define intervals of different magnitudes. According to
\citeonline{kecman2005}, the ``radius'' of the \gls{svr} tube
$\varepsilon$ can be defined as a percentage of the mean of the
training outputs ($y_i, i = 1,2,\dots,\ell$). Following this idea, in
this work, $\varepsilon$ has its lower and upper bounds defined
respectively as $\frac{0.001}{\ell}\sum_{i=1}^{\ell}y_i$ and
$\frac{0.15}{\ell}\sum_{i=1}^{\ell}y_i$. This way of defining the
boundary values of $\varepsilon$ adapts them to the data under
analysis. The minimum and maximum values of $C$ and $\gamma$, however,
are not defined from the available observations, but rather in an
arbitrary way. As a consequence, their ranges are greater than the one
defined for $\varepsilon$.
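As an illustration, the bounds of $\varepsilon$ can be computed from the training outputs as in the sketch below (the function name is ours):

```python
def epsilon_bounds(y_train):
    # Lower and upper bounds of the SVR tube radius epsilon, set to
    # 0.1% and 15% of the mean of the training outputs, respectively.
    mean_y = sum(y_train) / len(y_train)
    return 0.001 * mean_y, 0.15 * mean_y
```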

\subsection{Particle Swarm Initialization}

In this work, the traditional uniform random swarm initialization is
implemented: the particles' initial positions are randomly selected
from their respective intervals of definition. The best positions
$\mathbf{p}_i$ are initially set equal to $\mathbf{x}_i$ for each
particle $i$.

Given that the variables' ranges are very different, the velocities
are initialized in a special manner. The maximum velocity in each
dimension, $v_j^{max}$, is set to 10\% of the range in which the
corresponding variable is defined:
% --------------------------------------------------------------------------
\begin{equation}\label{vmax}
  v_{j}^{max}  = \frac{1}{10} \, (x_{j}^{max}-x_{j}^{min}), \quad j = 1, 2, 3
\end{equation}
% --------------------------------------------------------------------------
in which $x_{j}^{max}$ and $x_{j}^{min}$ are the maximum and minimum
values the related variable can assume. After that, velocities are
randomly chosen from the interval $[-v_{j}^{max},v_{j}^{max}]$, $j
= 1,2,3$, for all particles. Only 10\% of the range is used so that
velocities are initially small, in an attempt to prevent particles
from leaving the feasible search space in the early stages of the
\gls{pso} algorithm.

The initialization procedure for an arbitrary particle $i$ is
summarized in the following pseudocode. The symbol $n$ denotes the
number of dimensions, or variables, taken into account; for the
\gls{svr} model selection problem with \gls{rbf} kernels, $n =
3$. The notation {\sc{Rand}}$\,(\cdot)$, in turn, is a function that
returns a real number randomly selected, according to a uniform
distribution, from the interval passed as argument.
\newpage
% --------------------------------------------------------------------------
\vspace{0.5cm}

\begin{footnotesize}
\hrule
\begin{algorithmic}[0]
  \Procedure{InitializeParticle\,}{$x_1^{min}, x_1^{max}, x_2^{min},
    x_2^{max}, \dots, x_n^{min}, x_n^{max}$} 

  \For {$j = 1, 2, \dots, n$} 
  \State $x_{ij} \leftarrow$ {\sc{Rand}} $\,(x_j^{min},x_j^{max})$
  \State $p_{ij} \leftarrow x_{ij}$ 
  \State $v_{ij} \leftarrow$ {\sc{Rand}} $\,(-v_j^{max},v_j^{max})$
  \Comment{$v_j^{max}$ defined by Equation \eqref{vmax}} 
  \EndFor  \State {\bf end for}
  \State {\bf return} particle $i$
  \EndProcedure \State {\bf end procedure} 

\end{algorithmic}
\hrule
\end{footnotesize}

\vspace{0.5cm} 
% --------------------------------------------------------------------------
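The pseudocode above can be rendered, for instance, as the following Python sketch (names are illustrative; {\sc Rand} becomes {\tt random.uniform}):

```python
import random

def initialize_particle(bounds):
    # bounds: list of (x_min, x_max) pairs, one per dimension.
    # Returns the initial position x, best position p and velocity v.
    x, v = [], []
    for x_min, x_max in bounds:
        v_max = 0.1 * (x_max - x_min)          # Equation (vmax)
        x.append(random.uniform(x_min, x_max))
        v.append(random.uniform(-v_max, v_max))
    return x, list(x), v                        # p starts equal to x
```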

\subsection{Definition of Particles' Neighborhoods}

This step is required only when the {\it lbest} model is adopted. If
the {\it gbest} one is considered, the neighborhood equals the entire
swarm and there is no need to explicitly define particles' neighbors.

In the {\it lbest} approach, particles' neighbors are arbitrarily
defined by the particles' generation order, without taking into
account any sort of distance metric. For example, in the {\it lbest}
ring model, particle $i$ has particles $i - 1$ and $i + 1$ as
neighbors. If $i = 1$, then the ``left'' neighbor is the last particle
and, conversely, the ``right'' neighbor of the last particle is the
first one. For the sake of illustration, consider a swarm with 10
particles. Table \ref{tab:neigh} presents the neighborhood of each of
them when it is formed by 2 or 4 other particles. The particle indices
follow the order of generation in the initialization step.

% --------------------------------------------------------------------------
\begin{table}[!ht]
\begin{center}
\begin{footnotesize}
\caption{Examples of particles' neighborhoods}
\label{tab:neigh}

\vspace{0.2cm}

\begin{tabular}{lll} \toprule 
  \textbf{Particle} & \textbf{2 neighbors} & \textbf{4 neighbors} \\\midrule
  1  & 10, 2  & 9, 10, 2, 3 \\
  2  & 1, 3   & 10, 1, 3, 4 \\
  3  & 2, 4   & 1,  2, 4, 5 \\
  4  & 3, 5   & 2,  3, 5, 6 \\
  5  & 4, 6   & 3,  4, 6, 7 \\ 
  6  & 5, 7   & 4,  5, 7, 8 \\
  7  & 6, 8   & 5,  6, 8, 9 \\
  8  & 7, 9   & 6,  7, 9, 10\\
  9  & 8, 10  & 7,  8, 10, 1\\
  10 & 9, 1   & 8,  9, 1, 2 \\\bottomrule
\end{tabular}
\end{footnotesize}
\end{center}
\end{table}
% --------------------------------------------------------------------------
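The ring neighborhoods of Table \ref{tab:neigh} can be generated programmatically; a sketch in Python, with 1-based particle indices (the function name is ours):

```python
def ring_neighbors(i, n_part, n_neigh):
    # Indices (1-based) of the n_neigh ring neighbors of particle i,
    # half on each side, wrapping around the swarm of n_part particles.
    half = n_neigh // 2
    offsets = list(range(-half, 0)) + list(range(1, half + 1))
    return [(i - 1 + d) % n_part + 1 for d in offsets]
```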

\subsection{Fitness Evaluation: Coupling of Particle Swarm
  Optimization and Support Vector Machine}

In this work, the objective function denoting the fitness of
particles is the \gls{nrmse} \eqref{nrmse}. At the fitness evaluation
step, the coupling between \gls{pso} and \gls{svm} takes place. The
validation set approach is adopted so as to guide the search for
optimal parameter values by \gls{pso}. In this way, given a specific
particle, whose current position ($\mathbf{x}$) defines a set of
parameters $C, \varepsilon, \gamma$, along with the training and
validation sets at hand, {\sf{LIBSVM}} is able to perform the
\gls{svr}. Firstly, it solves the training problem over the training
set, providing the support vectors (both bounded and free), their
respective Lagrange multiplier values and the value of the linear
coefficient $b_0$. With these results, the regression function can be
calculated. Secondly, these outcomes feed the prediction portion of
{\sf{LIBSVM}}: the trained ``machine'' is used to predict the outputs
from the input values of the validation set. With the predicted and
the available real values, the validation \gls{nrmse} can be
calculated.
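The fitness computation can be sketched as follows. The normalization below, \gls{rmse} divided by the mean of the observed values, is an assumption made for illustration; the exact form is the one given by Equation \eqref{nrmse}:

```python
import math

def nrmse(y_true, y_pred):
    # Root mean square error normalized, here, by the mean of the
    # observed values (assumed normalization; see Equation (nrmse)).
    n = len(y_true)
    rmse = math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / n)
    return rmse / (sum(y_true) / n)
```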

The fitness evaluation phase also entails the update of the particles'
best positions ($\mathbf{p}$). If a particle's current position
$\mathbf{x}$ results in a smaller validation \gls{nrmse}, then its
best position becomes $\mathbf{x}$ and the calculated fitness value is
stored. That is, $\mathbf{p}$ is set equal to $\mathbf{x}$, and a
particle's fitness is always associated with its best position
$\mathbf{p}$. Otherwise, nothing changes.

The fitness evaluation of all particles takes place immediately after
the initialization and the definition of neighborhoods (if {\it lbest}
is adopted), so as to start the \gls{pso} search procedure. It is also
performed after the update of the particles' velocities and positions,
and is followed by the update of the global best particle and of the
particles' best neighbors (when the {\it lbest} model is considered).

\subsection{Update of Particles' Velocities and Positions}

Similarly to the standard algorithm, the developed \gls{pso} uses the
constriction factor in its updating rules and allows infeasible
particles to exist, in which case their fitness is not evaluated. In
fact, the impact of the {\it let particles fly} strategy was
empirically observed in experiments performed at early stages of this
work. When particles exited the feasible search space in terms of one
or more variables, their positions were set to either the lower or the
upper bound of the related dimensions, depending on the limit they had
surpassed. This procedure negatively influenced the \gls{pso}
performance, given that particles tended to occupy the boundary
regions of the feasible search space, which hindered its
exploration. The implementation of {\it let particles fly} noticeably
improved the \gls{pso} performance.

The update of the particles' velocities and positions is accomplished
by computing Equations \eqref{velupdtchi} and \eqref{xupdt},
respectively. Additionally, along with the constriction factor,
velocities are bounded by the definition ranges of the variables so as
to prevent particles from going too far from the feasible search
space. Hence, after the initialization step, $v_{j}^{max} =
x_{j}^{max} - x_{j}^{min}, j = 1, 2, 3$.
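A sketch of this step, using the constriction-factor values commonly reported for the standard \gls{pso} ($\chi \approx 0.7298$, $c_1 = c_2 = 2.05$); the variable names and clamping details are ours:

```python
import random

def update_particle(x, v, p, p_neigh, bounds, chi=0.7298, c1=2.05, c2=2.05):
    # Constriction-factor velocity update (Equation (velupdtchi)) followed
    # by the position update (Equation (xupdt)); velocities are clamped to
    # the full variable range, as done after the initialization step.
    new_x, new_v = [], []
    for j, (x_min, x_max) in enumerate(bounds):
        v_max = x_max - x_min
        vj = chi * (v[j]
                    + c1 * random.random() * (p[j] - x[j])
                    + c2 * random.random() * (p_neigh[j] - x[j]))
        vj = max(-v_max, min(v_max, vj))
        new_v.append(vj)
        new_x.append(x[j] + vj)   # infeasible positions are kept, not evaluated
    return new_x, new_v
```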

% Nevertheless, due to the cross-validation approach, the error
% function (fitness or objective function) may be assessed a great
% number of times and a {\it lbest} topology may render the algorithm
% even slower, which is not interesting. In addition, each of such
% evaluations demands a \gls{svm} training. In this way, the {\it
%   gbest} approach is selected.


\subsection{Update of Global Best and Particles' Best Neighbors}

The update of the global best and of the particles' best neighbors
always takes place after the fitness evaluation step, given that it is
based on the particles' fitness values. The global best update
consists in identifying and storing the particle that has led to the
smallest validation \gls{nrmse} over all particles of the swarm up to
the present iteration. Not only the global best particle, with its
associated positions and fitness, is stored, but also the number of
the iteration in which it was found ($bIter$). This number may be used
in the assessment of the stopping criteria.

If the {\it lbest} model is adopted, then it is necessary to identify
the particles' best neighbors at each \gls{pso} iteration, since they
guide the search along with the particles' best positions. In a
particle's neighborhood, including the particle itself, the one with
the smallest validation \gls{nrmse} becomes the best neighbor in the
current iteration.

\subsection{Stopping Criteria}

These steps are repeated until a stopping criterion is reached. The
proposed \gls{pso} involves three of them:
\begin{enumerate}
\item The maximum number of iterations ($nIter$) is reached.
\item The global best particle (and thus the fitness value) remains
  the same for 10\% of the maximum number of iterations.
\item The global best fitness values in consecutive iterations are
  different, but the difference is smaller than a tolerance $\delta$.
\end{enumerate}
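These three criteria can be checked, for example, as in the sketch below (names are ours; {\tt f\_hist} holds the global best fitness recorded at each iteration):

```python
def should_stop(k, n_iter, b_iter, f_hist, delta):
    # k: current iteration; b_iter: iteration of the global best;
    # f_hist: global best fitness per iteration (last entry is current).
    if k >= n_iter:                          # criterion 1: iteration budget
        return True
    if k - b_iter >= 0.10 * n_iter:          # criterion 2: stagnant global best
        return True
    if len(f_hist) >= 2:
        diff = abs(f_hist[-1] - f_hist[-2])
        if 0 < diff < delta:                 # criterion 3: tiny improvement
            return True
    return False
```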

The update of the particles' velocities and positions, the fitness
evaluation and the update of the global best and of the particles'
best neighbors are repeated until one of the considered stopping
criteria is met. As a result, the \gls{pso}+\gls{svm} procedure
provides the ``machine'' with the most suitable parameter values $C,
\varepsilon, \gamma$. This \gls{pso}-optimized \gls{svm} is then used
to predict the outputs from the input values of the test set so as to
estimate its generalization ability, which is given by the test
\gls{nrmse}. Such evaluation, as in the fitness assessment step, is
made by means of the prediction portion of {\sf{LIBSVM}}.

In order to summarize and to provide the reader with the essence of
what has just been explained, the \gls{pso}+\gls{svm} algorithm is
given in the form of a pseudocode and a flow chart (Figure
\ref{fig:flow}).

\section{Proposed Methodology Pseudocode and Flow Chart}

In the pseudocode, $nPart$ and $nNeigh$ are respectively the number of
particles and the number of particles' neighbors. Besides that, {\sf
  svm} is a ``machine'' returned by {\sf LIBSVM}, $f_i$ is the fitness
value (validation \gls{nrmse}) of particle $i$ and $f_{test}$ is the
test \gls{nrmse} returned by the best particle found in all \gls{pso}
iterations. The symbol $^*$ denotes optimality with respect to the
validation \gls{nrmse}.

% \newcommand{\Input}{
% \parbox[t]{8.7cm}{ 
% \begin{tabular}{rl}
%   $n$&  $\triangleright$ number of variables \\
%   $\mathbf{x}^{min},\mathbf{x}^{max}$, &  $\triangleright$ $n$-dimensional vectors of variables' bounds \\
%   $nPart$  &  $\triangleright$ number of particles \\ 
%   $nNeigh$  &  $\triangleright$ number of particles' neighbors \\
%   $c_1,c_2,\chi$ & $\triangleright$ \gls{pso} constants and constriction factor \\
%   $nIter$  &  $\triangleright$ maximum number of iterations \\
%   $\delta$ &  $\triangleright$ tolerance \\
%   $D$      &  $\triangleright$ data set \\
%   $\ell, \vartheta, \lambda$ &  $\triangleright$ number of training, validation and test points \\
% %  $f$      &  $\triangleright$ fitness \\
% \end{tabular}}
% }
% --------------------------------------------------------------------------
\vspace{0.5cm}

\begin{footnotesize}
\hrule
\begin{algorithmic}[0]
  \Procedure{ParticleSwarmOptimization\,}{$n, \mathbf{x}^{min}, \mathbf{x}^{max}, nPart, nNeigh, c_1,c_2,\chi,nIter, \delta,D,\ell, \vartheta, \lambda$} 
  %\State \hfill $\left(\Input \right)$ \Statex 
   
  \State \MyCommentL{14cm}{read data from a text file} 
  \State \MyCommentL{14cm}{define variables' bounds} 
  \State \MyCommentL{14cm}{particle swarm initialization and first fitness evaluation} 
  \For {$i = 1,\dots, nPart$} 
  \State {\sc InitializeParticle}$(\mathbf{x}^{min},\mathbf{x}^{max})$
  \State {\sf{svm}} $\leftarrow$ {\sf{trainLIBSVM}} $(\mathbf{p}_i,\,D_1,D_2,\dots,D_{\ell})$
  \Comment{train} 
  \State $(\hat{y}_1,\hat{y}_2,\dots,\hat{y}_{\vartheta}) \leftarrow$ {\sf{predLIBSVM}}$(\text{{\sf{svm}}},\,D_{\ell+1},D_{\ell+2},\dots,D_{\ell+\vartheta})$
  \Comment{predict validation outputs} 
  \State $f_i \leftarrow $ \gls{nrmse} $(\hat{y}_1, \hat{y}_2,\dots,\hat{y}_{\vartheta})$ 
  \Comment{particle fitness, Equation \eqref{nrmse}}
  \EndFor
  \State {\bf end for}
  \State \MyCommentL{14cm}{find best global particle}  
  \State $f^* \leftarrow \min_{i}\,(f_i)$; $b \leftarrow \arg\min_{i}\,(f_i)$, $i = 1,2,\dots,nPart$
  \Comment{update best global fitness}
  \State $\mathbf{p}^* \leftarrow \mathbf{p}_b$  \Comment{update best global position}
  \State $bIter \leftarrow 0$ 
  \Comment{update best iteration}
  \State \MyCommentL{14cm}{define particles' neighborhoods and best neighbors}
  \State \MyCommentL{14cm}{perform \gls{pso}}
  \For {$k = 1, 2,\dots, nIter$}
  \For {$i = 1, 2,\dots, nPart$} 
  \State $\mathbf{v}_i \leftarrow \min (\text{Equation \eqref{velupdtchi}}, \mathbf{x}^{max}-\mathbf{x}^{min})$ 
  \Comment{update particle velocity}
  \State $\mathbf{x}_i \leftarrow$ Equation \eqref{xupdt} 
  \Comment{update particle current position}
  \If {$\mathbf{x}_i$ is feasible}
  \State \MyCommentL{14cm}{evaluate fitness}
  \State {\sf{svm}} $\leftarrow$ {\sf{trainLIBSVM}} $(\mathbf{x}_i,\,D_1,D_2,\dots,D_{\ell})$
  \State $(\hat{y}_1,\hat{y}_2,\dots,\hat{y}_{\vartheta}) \leftarrow$ {\sf{predLIBSVM}}$(\text{{\sf{svm}}},\,D_{\ell+1},D_{\ell+2},\dots,D_{\ell+\vartheta})$
  \State $f \leftarrow$ \gls{nrmse} $(\hat{y}_1,\hat{y}_2,\dots,\hat{y}_{\vartheta})$ 
  \If {$f < f_i$}
  \State $f_i \leftarrow f$
  \Comment{update particle fitness}
  \State $\mathbf{p}_i \leftarrow \mathbf{x}_i$
  \Comment{update particle best position}
  \EndIf
  %\State {\bf end if}
  \EndIf
  %\State {\bf end if}
  \EndFor
  \State {\bf end for}
  \State \MyCommentL{14cm}{update best global particle ($f^*, \mathbf{p}^*, bIter$)}
  \State \MyCommentL{14cm}{update best particles' neighbors}
  \State \MyCommentL{14cm}{verify whether stop criterion 2 or 3 is met} 
  \If {$0 < | f_k^* - f_{k-1}^*| < \delta$ {\bf or} ($k - bIter = 10\% \cdot nIter$ {\bf and} $f_k^* = f_{bIter}^*$)}
  \State {\bf break}
  \EndIf
  %\State {\bf end if}
  \EndFor
  \State {\bf end for} 
  \State $(\hat{y}_{\ell+\vartheta+1},\hat{y}_{\ell+\vartheta+2},\dots,\hat{y}_{\ell+\vartheta+\lambda}) \leftarrow$ {\sf{predLIBSVM}} $(\text{{\sf{svm}}}^*,\,D_{\ell+\vartheta+1},D_{\ell+\vartheta+2},\dots,D_{\ell+\vartheta+\lambda})$
  \Comment{predict test outputs} 
  \State $f_{test} \leftarrow$ \gls{nrmse} $(\hat{y}_{\ell+\vartheta+1}, \hat{y}_{\ell+\vartheta+2},\dots,\hat{y}_{\ell+\vartheta+\lambda})$
  \Comment{Equation \eqref{nrmse}} 
  \State {\bf return} $f^*,\mathbf{p}^*, f_{test}$
  \EndProcedure
  \State {\bf end procedure}
\end{algorithmic}
\hrule
\end{footnotesize}

\vspace{0.5cm}
% --------------------------------------------------------------------------

% --------------------------------------------------------------------------
\begin{figure}[!ht]
\centering{
\includegraphics[width=\linewidth]{fig/flowchart.pdf}
\caption{Flow chart of the proposed PSO+SVM methodology}
\label{fig:flow}}
\end{figure}
% --------------------------------------------------------------------------  
