\section{System Design}
\label{sec:SystemDesign}
For the system design, the Design Space Exploration method was used to find the best possible alternatives for the system components. The design space exploration described in this section focuses on the different architectures or hardware devices that can be used to implement each component. For the actual implementation of the components on the chosen architecture, many design decisions have also been made; these are described in their corresponding sections of this document.


\subsection{System components}
From the Requirements and the System Architecture, the required system components can be derived.
\\
First, in order to obtain position information from the JIWY setup, an Encoder read-out component is necessary. After processing the incoming measurement values, the motors need to be actuated using a PWM signal. Therefore, a PWM component is required, as well as a Controller component to process the incoming sensor data.


Beforehand, it is reasonable to expect that both computer hardware platforms (i.e.\ the NIOS and the Gumstix) will be used, which makes it necessary to have a Communication component connecting the two platforms.


The vision in the loop system relies heavily on the read-out of the webcam, which is taken care of by the Webcam readout component, whereas the actual tracking algorithm performed on the webcam data is to be implemented in an Object tracking component.

\subsection{Design Space Exploration}
In this subsection, the Design Space Exploration for the vision in the loop controller is performed. The components defined above can be implemented in several ways, as indicated in table~\ref{tab:alternatives}, where the different component alternatives are given along with an expected score on development time, performance, accuracy and flexibility.

\begin{table}[htbp]
	\centering
	\caption{Alternatives for required subsystems}
	\label{tab:alternatives}%
	\setlength{\tabcolsep}{3pt}
	\begin{tabulary}{\textwidth}{p{0.22\textwidth}LCccc}
		\hline
		\textbf{subsystem} & \textbf{alternative} & \textbf{development time} & \textbf{performance} & \textbf{accuracy} & \textbf{flexibility} \\
		\hline
		\multirow{2}[1]{*}{\textbf{Encoder}} 		& VHDL				& +		& ++	& ++	& - \\
													& Software			& ++	& --	& --	& + \\
		\hline		
		\multirow{2}[1]{*}{\textbf{PWM}}			& VHDL				& +		& ++	& ++	& - \\
													& Software			& ++	& --	& --	& + \\
		\hline		
		\multirow{3}[1]{*}{\textbf{Controller Platform}} & Gumstix	 	& +		& ++	& - 	& +- \\
													& NIOS				& ++	& +		& + 	& + \\
													& FPGA				& -		& +++	& + 	& - \\									
		\hline		
		\multirow{2}[1]{*}{\textbf{Controller Datatype}}		& Integer 	& -		& ++	& - 	& - \\
													& Floating Point			& +		& +		& + 	& - \\
		\hline
		\multirow{2}[1]{*}{\textbf{\begin{minipage}{0.19\textwidth}Controller~Design method\end{minipage}}}		& Model Based Design 	& +		& -	& + 	& + \\
													& Iterative coding on real setup			& -		& +		& + 	& - \\
		\hline		
		\multirow{3}[1]{*}{\textbf{Communication}}	& UART		& +	& +	& ++ 	& + \\
													& I$^2$C	& +-		& -	& ++	& ++ \\		
													& SPI	& +-		& ++	& ++	& - \\		
		\hline		
		\multirow{2}[1]{*}{\textbf{Webcam readout}}	& RAW				& -		& ++	& n/a	& - \\
													& GStreamer			& +		& +		& n/a	& + \\
		\hline		
		\multirow{2}[1]{*}{\textbf{Object tracking}} & Own				& +		& ++	& -		& - \\
													& OpenCV Library \citep{OPENCV}	& -		& -		& ++	& + \\
		\hline
		
		\end{tabulary}%
\end{table}%

It follows from table \ref{tab:alternatives} that both the Encoder and the PWM can be implemented either in VHDL or in software. Although development time is important in this project, performance and accuracy weigh more heavily here, because the Encoder and PWM form the very basis of the vision in the loop controller. As a consequence, the Encoder and PWM are implemented in VHDL.
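To illustrate what the Encoder component computes, the quadrature decoding it performs in hardware can be sketched as the software alternative from the table. The sketch below is in C and is ours, not part of the actual design; all names and the table layout are illustrative assumptions.

```c
#include <stdint.h>

/* Quadrature decoding as performed by the Encoder component.  The two
 * encoder channels A and B form a 2-bit state; a lookup table maps
 * (previous state, new state) to a position change of -1, 0 or +1.
 * Illustrative sketch only -- the real component is written in VHDL. */
static const int8_t qdec_table[4][4] = {
    /* new state:  00  01  10  11        previous state: */
    {  0, +1, -1,  0 },              /* 00 */
    { -1,  0,  0, +1 },              /* 01 */
    { +1,  0,  0, -1 },              /* 10 */
    {  0, -1, +1,  0 },              /* 11 */
};

typedef struct {
    uint8_t prev;   /* previous 2-bit AB state */
    int32_t count;  /* accumulated position in encoder ticks */
} qdec_t;

/* Feed one new sample of the A and B channels (0 or 1 each). */
void qdec_update(qdec_t *q, uint8_t a, uint8_t b)
{
    uint8_t state = (uint8_t)((a << 1) | b);
    q->count += qdec_table[q->prev][state];
    q->prev = state;
}
```

Stepping through the Gray-code sequence 00, 01, 11, 10 increments the count; traversing it in reverse decrements it, which is exactly the behaviour the VHDL implementation must provide at the full encoder rate.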
\\ \\
For the communication between the NIOS and the Gumstix, several communication protocols are possible (e.g.\ UART, I$^2$C and SPI), differing in both implementation complexity and bandwidth. The accuracy column in the table is of less importance here, because all three protocols can be used to transfer messages between components. Accuracy could reflect the error rate, but that depends on the transmission speed, and even then a message protocol with error detection could be used.

Unfortunately, I$^2$C and SPI did not work in this setup, which made the choice considerably easier. Because of a voltage-level difference between the Gumstix and the NIOS, a voltage converter was needed, and one was only available for the UART. Therefore, UART was chosen as the communication method.
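On the Gumstix (Linux) side, the chosen UART link can be configured through the POSIX termios interface. The sketch below is an illustration, not the project's actual code; the 115200 baud rate and the device name mentioned afterwards are assumptions.

```c
#include <string.h>
#include <termios.h>

/* Prepare a termios configuration for the UART link to the NIOS:
 * raw mode, 8 data bits, no parity, one stop bit (8N1).
 * The 115200 baud rate is an assumption for illustration. */
void uart_configure(struct termios *tio)
{
    memset(tio, 0, sizeof(*tio));
    tio->c_iflag = 0;                     /* no input translation */
    tio->c_oflag = 0;                     /* no output translation */
    tio->c_lflag = 0;                     /* raw: no canonical mode, no echo */
    tio->c_cflag = CS8 | CREAD | CLOCAL;  /* 8N1, receiver on, ignore modem lines */
    tio->c_cc[VMIN]  = 1;                 /* block until at least one byte arrives */
    tio->c_cc[VTIME] = 0;
    cfsetispeed(tio, B115200);
    cfsetospeed(tio, B115200);
}
```

On the running system this configuration would be applied with \texttt{tcsetattr()} to a file descriptor obtained from \texttt{open()} on the serial device node (a name such as \texttt{/dev/ttyS0}, which is an assumed example here).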

For the controller controlling the JIWY setup, several choices can be made. First, the platform on which the controller resides must be chosen. Although the Gumstix platform provides more processing power, it cannot directly interface with the hardware (i.e.\ the Encoder and PWM), whereas the NIOS and FPGA platforms can. Of the NIOS and the FPGA, the NIOS was chosen, because development time is more important than excellent performance.\\
The data representation used in the controller can be either integer or floating point. The development time for an integer-based controller is considered large compared with a floating-point solution; therefore, the floating-point data type is chosen.\\
The last interesting choice for the controller is the design method. Two approaches can be used: iterative coding on the actual setup, or code generation as the final step of Model Based Design. Note that an accurate model of the system under design is a prerequisite for Model Based Design. Iterative coding on the actual setup is considered both time consuming and inflexible; therefore, the controller code will be derived using code generation and Model Based Design.
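To make the floating-point choice concrete, a controller step of the kind that code generation would produce can be sketched as follows. This is an illustration only: the actual controller is generated from the model, and the PID form, names and gain values here are assumptions.

```c
/* Discrete PID controller in floating point, one instance per JIWY axis.
 * Gains and sample time are placeholder values; in the actual design
 * this code would be generated from the model (Model Based Design). */
typedef struct {
    float kp, ki, kd;   /* controller gains */
    float dt;           /* sample time in seconds */
    float integral;     /* accumulated error */
    float prev_error;   /* error at the previous sample */
} pid_ctrl_t;

/* One controller step: returns the actuation value (e.g. a PWM duty
 * cycle) from the setpoint and the measured encoder position. */
float pid_step(pid_ctrl_t *c, float setpoint, float measured)
{
    float error = setpoint - measured;
    c->integral += error * c->dt;
    float derivative = (error - c->prev_error) / c->dt;
    c->prev_error = error;
    return c->kp * error + c->ki * c->integral + c->kd * derivative;
}
```

In a floating-point representation such scaling-free code maps directly onto the model; an integer version would additionally require fixed-point scaling of every gain and state, which is what makes its development time larger.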
\\ \\
For Webcam readout, a choice can be made between using the GStreamer library or implementing a readout of the raw input data. The performance of both options is considered adequate, while the development time using GStreamer will be small compared to developing a custom implementation. Because of the latter, GStreamer will be used for Webcam readout.
\\ \\
The Object tracking algorithm can be implemented using the sophisticated OpenCV platform, or with custom code operating on the GStreamer output. Using OpenCV would provide more flexibility and accuracy (i.e.\ OpenCV provides powerful object recognition tools), at the cost of learning and implementing a new library on top of, or possibly instead of, GStreamer. Because of the expected learning time, the development time is expected to be significant; therefore, despite the off-the-shelf object recognition OpenCV offers, it was chosen to implement a simple, custom object recognition algorithm.
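A simple custom tracker of the kind chosen here can be sketched as a threshold-and-centroid computation over a grayscale frame. This sketch is ours, not the project's actual algorithm; the frame layout and threshold are illustrative assumptions.

```c
#include <stddef.h>

/* Minimal custom object tracker: threshold a grayscale frame (row-major,
 * one byte per pixel) and return the centroid of the bright pixels as
 * the object position.  Illustrative sketch under assumed frame layout. */
typedef struct {
    int x, y;    /* centroid in pixel coordinates */
    int found;   /* 1 if any pixel exceeded the threshold */
} track_result_t;

track_result_t track_centroid(const unsigned char *frame,
                              int width, int height,
                              unsigned char threshold)
{
    long sum_x = 0, sum_y = 0, n = 0;
    for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++)
            if (frame[(size_t)y * (size_t)width + (size_t)x] > threshold) {
                sum_x += x;
                sum_y += y;
                n++;
            }
    track_result_t r = { 0, 0, 0 };
    if (n > 0) {
        r.x = (int)(sum_x / n);
        r.y = (int)(sum_y / n);
        r.found = 1;
    }
    return r;
}
```

The resulting pixel coordinates would then be converted to setpoints for the pan and tilt controllers; with a single pass over the frame, such a tracker keeps the development time and processing load far below that of an OpenCV-based solution.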

