% Final Year Project Report

% David Martin
% 20017455
% Applied Computing Year 4

\documentclass{article}
\usepackage{textcomp}
\usepackage{times}
\usepackage{graphicx}
\renewcommand{\figurename}{Fig.}

%\usepackage[greek]{babel}
%\usepackage[T1]{fontenc}
% package needed for euro symbol

\begin{document}


% TITLE

\title{Web Enabled Robot Control}
\author{David Martin\\
        Applied Computing Year 4,\\
		Waterford Institute of Technology,\\
		Waterford,\\
		Ireland\\}
\date{\today}
\maketitle


% PROJECT GOALS

\section{Project Goals}
The main goal of this project was to remotely control a Lego Mindstorms NXT robot from a web browser. The NXT is a 3-wheeled robot capable of moving forwards, backwards, and turning on its own axis \cite{nxt}. The advantage of using a web browser instead of a purpose-built application is that it can be used on most modern computing devices, for example, a PC, Mac, games console, PDA or mobile phone.\\
\indent This project had a number of secondary goals as well. The first was to make the driver part of the application extensible, allowing any robot to be controlled by simply plugging in a different driver. This feature would hopefully lengthen the lifetime of the project after completion. Another goal was to investigate ways to improve robot control and make it more interesting. The starting point for this goal was Macro Mode, a way of controlling the robot automatically by programming basic event-driven actions from the web browser. This mode would be an alternative to manual control, or Interactive Mode. This goal was further developed over the course of the project by researching a way to include streaming video in the web browser.\\
\indent An implicit goal of the project was to research and use some of the latest design and programming tools. This included design methodologies, and industry used tools and practices. By completing this goal, a better understanding of working on a project in a real world environment was achieved. This goal was the most important one because the project wasn't being developed for a real customer. The project was essentially a playground for doing fun things with interesting technologies.\\


% TOOLSET

\section{Toolset}
The project toolset was divided into Hardware, Software and Support Tools. The hardware section gives a list of all the hardware used and a brief description of each item. The software section lists the minimum software technologies required to get the web interface up and running. The support tools section gives an overview of additional software technologies and methodologies that supported the development process, and the reasons for choosing them.\\

\subsection{Hardware}
The project included a main set and a supporting set of hardware. The main set was used extensively throughout the project, while the supporting set was used either for research or for a minor part of the project. Some of the hardware was purchased or obtained specifically for the project, but the majority was already available (either owned or accessible through the Institute).

\subsubsection{Main Hardware}
The main hardware set was enough to get basic web control working. It included:
\begin{itemize}
	\item Lego Mindstorms NXT Robot (\texteuro 250)
	\item 6x AA Batteries (\texteuro 5)
	\item Laptop PC (\texteuro 500 approx.)
	\item USB Bluetooth Dongle (\texteuro 20)
\end{itemize}
\indent The Lego Mindstorms NXT Robot is an embedded device with a small LCD screen. It has 4 ports for connecting sensors. The kit comes with an Ultrasonic Sensor for measuring distances to objects, a Light Sensor for reading brightness levels, a Sound Sensor for reading decibel levels, and a Touch Sensor with a simple button on it. There are also 3 ports for connecting the included servo motors. The servo motors have a high level of accuracy and can be synchronised with each other for movement \cite{nxt}.\\
% IMAGE OF NXT
\begin{figure}[!h]
	\centering
	\includegraphics[height=50mm]{fp_nxt_kit.png}
	\caption[Lego Mindstorms Kit]{Lego Mindstorms Kit}
	\label{fig:nxt_kit}
\end{figure}
\indent The NXT comes with a Bluetooth module and a USB port. Either of these can be used for communication between the PC and NXT. The USB cable is a standard cable similar to those used to connect a printer to a PC, i.e. USB-A Male to USB-B. The Bluetooth module can also be programmed to communicate with other NXTs. The NXT can be programmed using many different languages including C and Java. It is powered by 6 AA size batteries which sit in the back of the device.\\
\indent The Laptop PC used was approximately 3 years old. Its specification was an Intel Pentium M 1.70GHz processor with 2GB of RAM, running Windows XP Professional. The only required features were USB ports for the Bluetooth Dongle and USB Cable, and a Wireless Adapter. The Wireless Adapter was built into the laptop. The Bluetooth Dongle supported the Bluetooth 2.0 standard, i.e. 3 megabits per second \cite{bluetooth}.

\subsubsection{Supporting Hardware}
The supporting set of hardware was mostly used during the latter stages of the project. The set included:
\begin{itemize}
	\item Apple iPhone 2G (\texteuro 300)
	\item FON Fonera 2.0 Wireless Router (\texteuro 50)
	\item 2x 9V Batteries (\texteuro 10)
	\item 5V Regulator (\texteuro 1)
	\item Logitech Quickcam Express (\texteuro 10)
	\item Creative Live! Cam Video IM Pro (\texteuro 25)
	\item Huawei E220 3G Modem with SIM Card (\texteuro 40)
	\item Mikomi 4-port USB 2.0 Hub (\texteuro 15)
	\item Nintendo DSi (\texteuro 170)
\end{itemize}
% IMAGE OF IPHONE
\begin{figure}[!h]
	\centering
	\includegraphics[height=50mm]{fp_iphone.png}
	\caption[iPhone]{Apple iPhone\cite{iphone_image}}
	\label{fig:iphone}
\end{figure}
\indent The Apple iPhone is a PDA-style touch screen device capable of making phone calls. It has a built-in 2 megapixel camera that can be used to take pictures \cite{iphone_phone}. It also comes equipped with Bluetooth capabilities, although these are somewhat limited by the current version of the iPhone firmware, 2.1\cite{iphone_firmware}.\\
% IMAGE OF FON
\begin{figure}[!h]
	\centering
	\includegraphics[height=20mm]{fp_fon.png}
	\caption[FON]{Fonera 2.0 Wireless Router}
	\label{fig:fon}
\end{figure}
\indent The Fonera 2.0 is a small embedded Linux device that can be used as a wireless router. Its networking capabilities include an 802.11g wireless adapter, a local area network ethernet port, and a wide area network ethernet port. The Fonera also has a USB port which supports USB 2.0 devices. As the router runs Linux, existing applications and drivers can be reused by recompiling them for the FON processor. The power requirements of the router are 5 volts with a maximum current of 2 amps. As the router was used in a mobile environment, a battery-powered setup was used instead of the 220 volt AC power adapter. This setup consisted of two 9 volt batteries connected in parallel to a 5 volt regulator to ensure a stable voltage.\\
\indent To provide video streaming, a USB camera was used. The first camera tried was the Logitech Quickcam Express. This camera was roughly 5 years old and wasn't compatible with Linux (non-UVC compliant). Because of this incompatibility a second camera was used, a Creative Live! Cam Video IM Pro. It supported the UVC standard, which allowed it to be used with the Fonera Router. It was capable of streaming video at a resolution of 640 by 480 pixels, and had automatic brightness adjustment. The camera was powered through the USB port.\\
% IMAGE OF E220
\begin{figure}[!h]
	\centering
	\includegraphics[height=20mm]{fp_e220.png}
	\caption[e220]{Huawei E220 3G USB Modem\cite{e220}}
	\label{fig:e220}
\end{figure}
\indent The Huawei E220 is a 3G modem that can be connected to a PC and powered through the USB port. The modem allows mobile devices, such as a laptop, to obtain an internet connection wherever a 3G HSDPA signal is available. The maximum speed of the connection is 3.6Mbps\cite{e220}. As the modem connects via USB, it could also be connected to the FON Router. This setup would provide a completely mobile wireless LAN with internet access. The FON Router only has a single USB port, so a USB 2.0 Hub was used to connect both a camera and a 3G modem. The hub has a single upstream connector for the FON, and 4 downstream ports for connecting other devices. The hub can also be powered by an external source if needed.\\
% IMAGE OF DSi
\begin{figure}[!h]
	\centering
	\includegraphics[height=35mm]{fp_dsi.png}
	\caption[dsi]{Nintendo DSi\cite{dsi}}
	\label{fig:dsi}
\end{figure}
\indent The Nintendo DSi is a handheld games console. It has 2 VGA cameras, one pointing towards the user and one pointing in front of them. There is an SD card slot on the side of the console capable of reading SDHC cards. The DSi has built-in Wireless 802.11g capabilities and can support WPA2 security\cite{dsi}. Through the DSi Shop, a free web browser, developed by Opera Software, can be downloaded and installed on the DSi. This web browser has 2 different viewing modes: overview mode, which looks much like a normal PC browser, and column mode, which removes formatting and styling, reducing all content to a single scrollable fixed-width column. The browser also supports Javascript.

\subsection{Software}
Most of the software used in the project was open source and free to use. The advantage of using open source software is that the source code can be explored to understand it better. Also, updates are more regular than with proprietary software, and if there isn't time to wait for an update, the source code can be modified to fit the exact needs. The main software technologies used were Java, Groovy on Grails, HTML, CSS, Javascript and Maven. Other software technologies were used as well, but only for minor parts of the project.

\subsubsection{Java}
Java is a high-level language that runs on a virtual machine. Java source code compiles to Java bytecode, i.e. class files. These class files can run on a Java Virtual Machine (JVM) that has been implemented for any device. Java is made up of the Runtime Environment and the Development Kit. The Runtime Environment contains the JVM and is enough to just run Java class files; to compile Java source code, the Development Kit is required. Java's most notable feature is automatic memory management. This is handled by a periodic garbage collector that removes any unreferenced objects.\\
\indent The reason for choosing Java as the main language was the iCommand and leJOS libraries. These libraries are written in Java and can be used for communicating with and controlling the NXT robot over Bluetooth or a direct USB connection. iCommand is a remote control library that runs on a PC and can send commands directly to the NXT. It can also retrieve sensor readings from the NXT. leJOS is a JVM that runs on the NXT itself. To use it, a new firmware image has to be installed on the brick. By installing leJOS, Java code can be written for and run directly on the NXT without the need for remote control over Bluetooth or USB\cite{icommand_and_lejos}.

\subsubsection{Groovy on Grails}
The remote control element of this project was done entirely through a web browser. To provide this feature, a web application had to be written that could interact easily with the iCommand Java library. In addition, the web application had to provide a user-friendly interface. For these reasons Groovy on Grails was chosen.\\
\indent Grails is a modern web application framework that runs on top of the Java based Spring framework. It follows the Model-View-Controller pattern\cite{grails}. The views can be written using Groovy Server Pages (GSP), Java Server Pages (JSP) or plain HTML. The models and controllers are written in Groovy.\\
\indent Groovy is a dynamically typed scripting language that compiles into Java bytecode. This means that Groovy code can interact with any other Java class file, making the entire JDK and other Java based libraries easily available to it. The biggest advantage of this is that the iCommand remote control library can be integrated into the web application.

\subsubsection{HTML, CSS \& Javascript}
The web interface had to be designed and implemented using a combination of 3 technologies. HTML was used in conjunction with Cascading Style Sheets (CSS) to create the layout of the web interface. HTML was used to specify elements on the page and their contents. CSS was used to specify the position, colouring and overall style of these elements. In general, HTML and CSS are rendered consistently across browsers. However, some problems encountered during the project made the web page render incorrectly in Internet Explorer. For this reason all acceptance tests were performed in Mozilla Firefox only.\\
\indent Javascript is a client side scripting language that can be used to animate web page elements, silently retrieve content in the background and assist a user when filling in a form. It allows web page elements to be dynamically manipulated. The reason for using Javascript in this project was to send remote control commands as XML HTTP Requests to the server. This is known as Asynchronous Javascript and XML (AJAX). These AJAX calls were triggered whenever the user pressed a command button without the need to refresh the page.

\subsubsection{Maven}
Maven is a project build tool as well as a repository system for Java libraries\cite{maven}. It is similar to Ant in that build tasks such as compiling source code and running tests can be automated. Instead of a build.xml file, one or more pom.xml files are used. A project can contain more than one pom file; these files have a parent/child relationship. Each pom represents an independent part of the project that can be packaged and reused in another project if required. These packages can be stored in a Maven repository.\\
\indent The Maven repository can be either local or on a remote server. Currently, thousands of Java libraries and projects are hosted in the central Maven repository on the internet\cite{maven_repo}. Throughout this project a local repository was used. Whenever a dependency wasn't available locally, it was automatically downloaded from the central repository and placed in the local repository. The advantage of having dependencies in a repository is that the overall size of the source code stays small.\\
\indent Another feature of Maven is archetypes. Archetypes are predefined project skeletons for different technologies\cite{maven_archetypes}. For example, there is an archetype for a Java Servlet Web Application and another for an Enterprise Java Beans Application. A project prototype can be up and running very quickly by using archetypes. A Groovy on Grails archetype was used for the main part of this project, and simple Java Application archetypes were used for most of the other parts. A dependency tree was set up between the different poms.
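As an illustrative sketch of such a layout (the group, artifact and module names below are placeholders, not the project's actual ones), a parent pom can declare each part of the project as a child module:

```xml
<!-- Parent pom.xml: aggregates the sub-modules of a multi-module build.
     Names are illustrative only. -->
<project xmlns="http://maven.apache.org/POM/4.0.0">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example.robot</groupId>
  <artifactId>robot-parent</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging> <!-- parent poms use "pom" packaging -->
  <modules>
    <module>driver-core</module>   <!-- generic driver specification -->
    <module>driver-nxt</module>    <!-- NXT driver implementation -->
    <module>web-interface</module> <!-- Grails web application -->
  </modules>
</project>
```

Each child pom names the parent via a \emph{parent} element, and Maven builds the modules in dependency order.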

\subsection{Support Tools}
The previous set of software tools was essential to the project. The following set wasn't essential but helped development move along a lot quicker. These tools consisted of development software and methodologies. This toolset is not complete but does contain the main tools that were used consistently throughout the project.

\subsubsection{Google Documents}
Google Documents is a web based tool for creating and editing spreadsheets, documents and other files. Documents are automatically saved at regular intervals. The user can revert back to a previous save of the document if required. Documents can be exported to various formats or printed from the web browser\cite{google_docs}.\\
% SCREENSHOT OF GOOGLE DOCS
\begin{figure}[!h]
	\centering
	\includegraphics[height=55mm]{fp_google_docs.png}
	\caption[google_docs]{Google Documents Tasks Spreadsheet}
	\label{fig:google_docs}
\end{figure}
\indent A spreadsheet was maintained for the duration of this project. It was used for the assignment of tasks and tracking of development hours. The spreadsheet had a 2 sheet layout. The first sheet was used to assign tasks to each iteration number. An iteration was defined as a 2 week period consisting of 24 hours of project development. Any currently active tasks were highlighted in orange, while completed tasks were highlighted in green. Inactive tasks or tasks that hadn't been started weren't highlighted.\\
\indent The second sheet was used for keeping a record of important project dates. This included the start and end date for each iteration as well as any report or demonstration dates. Supervisor meetings were scheduled on a weekly basis. These were held on the last day of each iteration for an iteration review, and half way through each iteration for a mid-iteration review. The same colouring scheme as the first sheet was used for the project dates sheet.

\subsubsection{Eclipse}
Eclipse is an Integrated Development Environment (IDE) predominantly used for Java development. It provides a graphical tool for editing and compiling source code, along with code completion, formatting, highlighting and code suggestions. The version of Eclipse used was 3.3\cite{eclipse}. Eclipse has a plugin architecture for adding extra helper tools and functions. Installed plugins included one for Maven so that pom.xml files could be parsed, and another for Subversion which allowed checking out, committing etc. from within the IDE. Eclipse was used for development of the driver part of the project.

\subsubsection{Netbeans}
Netbeans is also an Integrated Development Environment that can be used for Java development. Like Eclipse it has a plugin architecture for adding extra helpers and features. The version of Netbeans used was 6.5\cite{netbeans}. The reason for using Netbeans as well as Eclipse was that the Groovy on Grails integration with Eclipse was less advanced than the Groovy on Grails plugin set for Netbeans. The advantage of having the Grails integration was mainly syntax highlighting, but also code formatting and the ability to run applications and perform limited debugging from within the IDE.

\subsubsection{Subversion}
Subversion is a code repository that keeps ever-increasing revisions of code as it is modified\cite{subversion}. Code is committed to the repository whenever a reasonably sized change is made to a file or a new file is created. This committing process is done manually by the user. Subversion allows concurrent access to the repository, meaning more than one user can be working on the same project. If 2 users have modified the same file but in different places, Subversion will attempt to merge the changes together when they both commit their copies.\\
\indent There are many advantages to using a revision based code repository. All changes to the code can be tracked right back to the day a file was created. Subversion keeps a log of commit messages and the modifying user. Any revision of the code can be checked out from the repository. This can come in useful if code is overwritten or new functionality affects some older stable code. If the \emph{head} revision of the repository is unstable or completely broken, files can be reverted back to older working versions.\\
\indent Subversion can be installed locally or on a remote machine. For this project, a remote machine was used. The Subversion repository is located at the Google Code project page\cite{googlecode}. This project page provides a wiki, download section, issue tracking system and a 100 megabyte Subversion repository. The page is accessible entirely from a web browser and setting up an account was free. Of all these features, the repository was used the most. The other features would be more useful if this project had multiple members.

\subsubsection{Extreme Programming}
Extreme Programming is a design methodology based around iterative development and test driven software\cite{xp}. It relies on user stories to drive release planning. A release is a milestone event that consists of a fixed number of iterations. An iteration is defined as 2 weeks of development on the project. In Extreme Programming, releases are planned based on what individual pieces of functionality can be included. This is decided by estimating the time taken to develop each user story. If there are uncertain estimates about a user story, a spike is undertaken.\\
\indent A spike is a quick prototype of a feature. In this project a spike was developed to get a better understanding of the communication requirements for the NXT. By developing a spike, the developer can get a better idea of how much time it will take to develop the user story. All of the above information is fed into the release plan. The release plan has a list of each user story, how much time it will take to develop, and the iteration number it will be done in.\\
\indent An important part of Extreme Programming is testing. As user stories are being fleshed out, tests can be written for the unimplemented features. The features are then deemed to be complete if they pass all of the tests written for them. The testing performed is typically unit and functional testing, carried out on a regular basis. Acceptance tests are also performed regularly as a feature nears completion. As much as possible, testing should be automated, which was the case for this project.

\subsubsection{Hudson}
Hudson is a continuous integration tool. It performs regular builds of a project as changes are made to it. A web interface is provided for configuration and analysis of builds\cite{hudson}. In this project Hudson was configured to poll the Subversion repository at regular intervals for any changes. Whenever a new revision was detected, Hudson would check out the head revision. The Maven pom.xml files of each part of the project were then parsed and a build was triggered. This build process consisted of the following:
\begin{enumerate}
	\item resolve any maven dependencies
	\item compile all the java and groovy source code
	\item run any automated tests
	\item package all the classes and web files in an archive e.g. JAR or WAR
	\item install the resulting packages in the local maven repository
\end{enumerate}
% SCREENSHOT OF HUDSON
\begin{figure}[!h]
	\centering
	\includegraphics[height=30mm]{fp_hudson.png}
	\caption[hudson]{Hudson building the individual project components}
	\label{fig:hudson}
\end{figure}
\indent A summary of the latest builds was shown on the Hudson project home page. The image beside each project would change colour and display a different icon depending on the current build status. If the most recent build was successful, the image would be blue; if the build failed, the image would be red. The build icon showed a summary report that indicated the reliability of the project over the last 5 builds. If the icon showed a sun, the last 5 builds had all passed. If it showed dark clouds with rain, at least 3 of the last 5 builds had failed. Hudson was an important support tool in this project, as any recent commits to the Subversion repository could be tested quickly and any problems could be discovered before moving on.


% DEVELOPMENT

\section{Development}
Development of the project took place over a total of 24 weeks. The first 4 weeks were used to come up with the initial project proposal. Once this proposal was approved, 8 weeks were used to get familiar with the technologies and toolset. During this 8 week period an architectural spike was developed. As defined in the Extreme Programming methodology, an architectural spike is a prototype for the project or a system metaphor. It doesn't represent the finished product, and only parts of its code may be reused in the actual project. The architectural spike allowed sufficient time to become familiar with the toolset so that a release plan could be put together for the remaining 12 weeks.\\
\indent To summarise, some preliminary development was done during the first 12 weeks and the majority was done during the second 12 weeks. Of the second 12 weeks, the first 6 were divided into three 2 week iterations for Engineering Release 1, and the second 6 weeks were divided into three 2 week iterations for Engineering Release 2. 

\subsection{NXT Robot}
As mentioned earlier, the Lego Mindstorms NXT robot is an embedded device with 4 sensor ports, 3 motor ports and a USB port. The NXT kit came with instructions for building the Tribot robot. The Tribot is a 3-wheeled mobile robot with a pair of grabbing arms at the front. Two of the servo motors are connected to a rubber wheel each, one on either side of the robot. The third servo motor sits in between these 2 motors and is connected to the grabbing arms at the front by a series of cogs. The NXT brick, or main device, sits at an angle on top of the motors. The small wheel at the back is unpowered but capable of rotating through $360\,^{\circ}$, similar to a shopping cart wheel.\\
% IMAGE OF TRIBOT
\begin{figure}[!h]
	\centering
	\includegraphics[height=50mm]{fp_nxt_tribot.png}
	\caption[nxt_tribot]{NXT Tribot}
	\label{fig:nxt_tribot}
\end{figure}
\indent The Tribot was a good design that was used initially in the project. However, it had a few flaws that led to altering the design slightly. The first problem was the pair of grabbing arms. The length of these arms meant that when they were closed, they projected out in front to almost double the total length of the robot. When they were open, the total arm span was around 300mm. This would cause the robot to get stuck when navigating through a tight space or around objects like a chair leg. To solve this problem without affecting the grabbing capabilities, two things were done. The length of the arms was significantly reduced so that they sat much closer to the robot instead of far out in front. In addition, the angle of opening was limited to roughly $90\,^{\circ}$.\\
\indent The second problem with the Tribot design was the position of the brick. The brick was standing up at an angle of around $30\,^{\circ}$ on top of the motors. The center of gravity was raised as a result of this. This led to increased strain on the motors when suddenly stopping or changing direction. Also, the height of the robot would restrict the locations it could drive under. The solution to this was to mount the brick horizontally. By doing this the overall height of the robot was reduced by a quarter without affecting the total length.\\
% CARGO ROBOT IMAGE
\begin{figure}[!h]
	\centering
	\includegraphics[height=40mm]{fp_nxt_cargo.png}
	\caption[nxt_cargo]{Modified NXT with cargo area on the back}
	\label{fig:nxt_cargo}
\end{figure}
\indent Later on in the project the design was altered slightly again. Instead of mounting the brick directly onto the robot, a small open cargo area was constructed. This area was big enough to still carry the brick horizontally, but also to carry other devices such as the FON Wireless Router. Making this change meant that any future hardware that needed to be tested on the NXT could be dropped into the cargo area instead of spending time mounting it with Lego pieces.\\
\indent The Logitech USB camera had to be mounted on the robot as well. Its position was too important for it to simply be thrown in the cargo area, so some Lego pieces were used. By dismantling the small plastic camera support, the Lego pieces could be attached to the camera. This made placing the camera on the robot very easy. A spot at the front of the robot was used to click the camera into place. This location gave a good view of where the robot was going and still allowed the camera to be rotated if needed.

\subsection{Driver Specification}
The driver component was specified and implemented in a generic way to facilitate writing drivers for any robot. Any additional driver, including the NXT driver, could then be plugged into the application and controlled using the web interface.\\
% UML FOR DRIVER
\begin{figure}[!h]
	\centering
	\includegraphics[height=45mm]{fp_uml.png}
	\caption[uml]{Class diagram showing generic driver package and NXT driver package}
	\label{fig:uml}
\end{figure}
\indent The driver specification consisted of a Controller, an Event and an Action. An additional class, Condition, was also specified but was only used in Macro Mode. The Controller is an abstract class that must be implemented by the driver suite. It contains two abstract methods, \emph{init()} and \emph{destroy()}. The init method should contain all the necessary logic to set up a persistent connection to the robot. The destroy method must close the connection when the session is finished and clean up any other used objects or open streams.\\
\indent In the NXT driver suite the NXTController extends the Controller class. The init method establishes a Bluetooth connection to the NXT using the iCommand library. In addition, the servo motors are initialised and the default speed of the motors is set. The NXTController acts as a proxy for all commands that will be sent to the NXT. It maintains the status of the NXT and has a list of all possible actions and events. When the destroy method is called, the Bluetooth connection is closed.\\
\indent The Action abstract class has one abstract method, \emph{execute()}. Depending on the implementing class, the execute method will perform the requested action. For example, a class called Stop that extends Action must perform the necessary procedures in the execute method to make the robot stop. A driver suite will typically have more than one Action. In fact, the driver specification defines 8 specific actions that must be implemented in order to use the web interface to its full potential. They are:
\begin{itemize}
	\item MoveForward
	\item MoveBackward
	\item TurnLeft
	\item TurnRight
	\item TurnLeftOnSpot
	\item TurnRightOnSpot
	\item CenterTurning
	\item Stop
\end{itemize}
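The Controller and Action contracts described above can be sketched in Java roughly as follows. The class names Controller, Action, Stop and the methods \emph{init()}, \emph{destroy()} and \emph{execute()} come from the specification; ExampleRobot and the body of Stop are illustrative assumptions, not the project's actual code.

```java
// Sketch of the generic driver contract, assuming a minimal design.
abstract class Controller {
    public abstract void init();    // set up a persistent connection to the robot
    public abstract void destroy(); // close the connection and clean up resources
}

abstract class Action {
    public abstract void execute(); // perform this action on the robot
}

// Stand-in robot whose drive motors are modelled as plain speed fields
// (illustrative only; a real driver would talk to hardware).
class ExampleRobot {
    int leftSpeed, rightSpeed;
}

// Stop is one of the 8 required actions: it halts both drive motors.
class Stop extends Action {
    private final ExampleRobot robot;

    Stop(ExampleRobot robot) { this.robot = robot; }

    @Override
    public void execute() {
        robot.leftSpeed = 0;
        robot.rightSpeed = 0;
    }
}
```

Because every concrete action shares the single \emph{execute()} entry point, the web layer can dispatch any button press or key press without knowing which robot it is driving.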
\indent By implementing these actions, the robot can be moved around by using the arrow keys on the keyboard. All of these actions and any other implemented actions in the driver suite can also be used by pressing a button on the web interface. There is no limit to the number of actions that can be implemented.\\
\indent The NXT driver suite implements the 8 actions above in addition to 2 others. These objects perform the relevant actions by instructing the left and right motors to move simultaneously. To move forward or backward, the motors move in the same direction as each other. To turn on the spot, the motors rotate in the opposite direction to each other at the same speed. To turn left or right while also moving forward or backward, the two motors rotate in the same direction but at different speeds e.g. if the right motor moves forward faster than the left motor, the robot starts turning left. The CenterTurning action resets the difference in speed between the 2 motors so that it will travel in a straight line. The Stop action simply stops the 2 motors.\\
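The speed arithmetic described above can be illustrated with a small helper. The method names and the way the offset is split across the two motors are assumptions for illustration; only the convention that the offset is the right motor speed minus the left motor speed is taken from the driver suite.

```java
// Differential-drive arithmetic: a zero offset drives both motors at the
// same speed (straight line); a positive offset makes the right motor
// faster than the left, so the robot turns left. Illustrative sketch only.
class DriveMath {

    // Speeds for the {left, right} motors given a base speed and turn offset.
    static int[] motorSpeeds(int baseSpeed, int turnOffset) {
        int left = baseSpeed - turnOffset / 2;
        int right = baseSpeed + turnOffset / 2;
        return new int[] { left, right };
    }

    // The TurnOffset reading: right motor speed minus left motor speed.
    static int turnOffset(int leftSpeed, int rightSpeed) {
        return rightSpeed - leftSpeed;
    }
}
```

The CenterTurning action then amounts to recomputing the speeds with an offset of zero, and Stop to setting both speeds to zero.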
\indent The other 2 actions that the NXT driver suite implements are for the grabber. OpenGrabber instructs the third motor to rotate clockwise for 1 second, thereby opening the grabbing arms. CloseGrabber rotates the motor anti-clockwise for 1 second, closing the grabbing arms. If the arms don't completely open or close within the time, they will stay wherever they finish up.\\
\indent The Event abstract class encapsulates any sensors on the robot. These sensors could include current speed, battery level or grabber position in the case of the NXT. The class defines 2 abstract methods, \emph{readFromSensor()} and \emph{getMeasurementUnits()}. The readFromSensor method communicates with the robot to get back a reading from the relevant sensor. The sensor reading is returned as a String. The abstract class always stores the very last sensor reading locally. This reduces multiple calls to the robot within a short space of time, where the sensor reading is unlikely to change. This functionality is built in with \emph{getCurrentReading()}.\\
\indent Every sensor has a different measurement unit, e.g. centimetres for distance, decibels for sound levels or degrees for motor positions. A string representation of the measurement units should be provided by the driver suite in the getMeasurementUnits method. Only one concrete event, BatteryEvent, is defined by the generic driver suite. By implementing this abstract event in the relevant driver suite, a graphical representation of the battery level can be shown on the web interface.\\
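A minimal sketch of this abstraction, with the caching behaviour of \emph{getCurrentReading()} made explicit. The 500\,ms cache lifetime and the stubbed battery value are assumptions; the real suite reads the value from the robot over Bluetooth.

```java
// Sketch of the Event abstraction described above. The 500 ms cache
// lifetime and the stubbed battery value are assumptions for illustration.
abstract class Event {
    private String lastReading;     // most recent value from the sensor
    private long lastReadTime;      // when that value was taken (ms)
    private static final long MAX_AGE_MS = 500;

    /** Talks to the robot and returns the raw sensor value. */
    protected abstract String readFromSensor();

    /** e.g. "cm", "dB", "%" or "mV". */
    public abstract String getMeasurementUnits();

    /** Returns the stored reading, refreshing it only when it is stale. */
    public String getCurrentReading() {
        long now = System.currentTimeMillis();
        if (lastReading == null || now - lastReadTime > MAX_AGE_MS) {
            lastReading = readFromSensor();
            lastReadTime = now;
        }
        return lastReading;
    }
}

// A BatteryEvent returning a fixed value in place of a Bluetooth call.
class BatteryEvent extends Event {
    protected String readFromSensor() { return "7200"; }  // millivolts, stubbed
    public String getMeasurementUnits() { return "mV"; }

    public static void main(String[] args) {
        BatteryEvent battery = new BatteryEvent();
        System.out.println(battery.getCurrentReading() + " " + battery.getMeasurementUnits());
    }
}
```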
\indent There are 7 events implemented for the NXT driver suite. They are:
\begin{itemize}
	\item Battery
	\item TurnOffset
	\item GrabberPosition
	\item Echo
	\item Touch
	\item Sound
	\item Light
\end{itemize}
\indent The Battery event is implemented as specified above. It gets the current battery level, in millivolts, which is a value between 0 and 9000. The TurnOffset event doesn't communicate with the NXT to return its reading. Instead, it subtracts the current speed of the left motor from the speed of the right motor (both stored in the NXTController) and returns the result. If the value is zero, the robot is travelling in a straight line either forwards or backwards. As the robot turns left or right at sharper angles, the turn offset value will either increase above zero or decrease below zero. This offset can be reset at any time by performing the CenterTurning action.\\
\indent The GrabberPosition event returns the current rotation angle of the grabber motor. When the NXTController is initialised this value will be set to zero. The value will increase or decrease by at most 90$^\circ$ whenever the grabber opens or closes.\\
\indent The last 4 events in the NXT driver suite correspond to the four physical sensors. The Echo event returns a distance in the range 0-255 cm from the Ultrasonic Sensor. The Touch event returns 1 or 0 depending on whether or not the Touch Sensor is pushed in. The Sound event reads the current sound level from the Sound Sensor, in the range 0-160 dB. The Light event reads a brightness level from the Light Sensor, expressed as a percentage, where 0\% is completely dark and 100\% is very bright.\\
\indent The generic driver specification above provides all the necessary features to interact with a robot by sending instructions to it and getting feedback. The driver suite can be used by any application that can integrate with Java classes. By implementing the Controller, and some Actions and Events for a robot, it can also be controlled through the web interface, detailed below.

\subsection{Web Design}
The web interface was confined to a single page to make it user-friendly: all robot control and feedback was visible without changing the URL. The interface consisted of 4 main panels and a battery reading panel. The battery panel displayed a small graphic of a green fill on a grey background, where the level of the green fill indicated the battery level of the robot. The percentage battery reading was also placed inside the graphic. The battery level was obtained from the BatteryEvent object at regular intervals by sending an AJAX request to the server.\\
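The mapping from the raw reading to the fill percentage can be sketched as below. The 0-9000 millivolt range comes from the BatteryEvent description in the driver section; the class and method names are hypothetical.

```java
// Mapping the raw battery reading (0-9000 mV, per the BatteryEvent
// description) to the percentage shown by the green fill.
// Class and method names are hypothetical.
public class BatteryGauge {
    static final int MAX_MILLIVOLTS = 9000;

    /** Clamps the reading into range and maps it to 0-100%. */
    public static int percent(int millivolts) {
        int mv = Math.max(0, Math.min(MAX_MILLIVOLTS, millivolts));
        return Math.round(100f * mv / MAX_MILLIVOLTS);
    }

    public static void main(String[] args) {
        System.out.println(percent(7200) + "%"); // prints "80%"
    }
}
```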
\indent The 4 main panels were:
\begin{itemize}
	\item Image Panel
	\item Status Panel
	\item Sensor Panel
	\item Action Panel
\end{itemize}
% SCREENSHOT OF WEB INTERFACE
\begin{figure}[!h]
	\centering
	\includegraphics[height=55mm]{fp_web.png}
	\caption[web]{Web Interface showing the various panels}
	\label{fig:web}
\end{figure}
\indent The image panel was initially used for displaying a static image of the NXT robot. A possible variation on this would be to automatically display an image of whichever robot is being controlled, implemented by including a fixed-size image with each driver suite. For example, a 320 by 240 pixel image named 'robot.png' would be located in the driver suite images folder.\\
\indent After much research into video streaming and cameras, it was decided to place a Java applet in the image panel instead. The applet used, Cambozola, allowed an HTTP video stream to be displayed in the web page\cite{cambozola}. The stream was taken from the server, which in turn took the stream from a USB webcam. The camera was pointed down at the NXT robot area, giving any web user a third person view of the robot. A further improvement was made by mounting the camera directly on the NXT. The camera was then connected to the on-board battery-powered wireless router, which sent the video stream back to the server, providing the web user with a first person view of the NXT.\\
\indent The status panel allows the user to connect and disconnect from the robot. A message area located under these links was used to display any relevant information or errors. For example, whenever the robot was instructed to move, a confirmation message would be displayed to show that the server received the message. Other information such as connection status would also be displayed here.\\
\indent The sensor panel initially shows no information. As soon as the robot is connected, the panel is populated with all of the Event objects; the name, value and measurement units are extracted from each and displayed in a readable way. These objects are sent from the server in JavaScript Object Notation (JSON) format. This was relatively easy in Grails, as it just required adding \texttt{'as JSON'} to the end of the line that returned the list of events. The web page was configured to send an AJAX request to the server in the background at regular intervals, constantly fetching the latest sensor readings and updating the web interface. As soon as the user presses 'Disconnect', the periodic AJAX call is cancelled and the sensor panel is emptied.\\
\indent Like the sensor panel, the action panel doesn't show any information until the robot is connected. After it connects, an AJAX request gets a list of all the Action objects from the robot driver suite. For each action in the list, a button is created in the action panel with the name of the action on it. When clicked, the button triggers another AJAX request in the background to perform the selected action. If a response is received from the server, the message area in the status panel is updated by printing \emph{'action\_name executed ok'}. Again, like the sensor panel, whenever the user presses 'Disconnect', the list of actions is removed from the panel.

\subsection{Web Control}
The robot can be controlled entirely by using the action panel. However, as mentioned in the driver specification section, if 8 particular Actions were implemented, the arrow keys on the keyboard could be used to control the robot as well. To reiterate, these actions were MoveForward, MoveBackward, TurnLeft, TurnRight, TurnLeftOnSpot, TurnRightOnSpot, CenterTurning and Stop. All 8 of these were implemented by the NXT driver suite.\\
\indent The four arrow keys (up, down, left and right) were programmed using JavaScript to call these 8 actions. In general, whenever a key was pressed, a movement action was triggered; whenever that key was released, the stop action was called. By using a combination of 4 booleans, \emph{connected, moving, turning \& turningOnSpot}, the arrow keys were programmed to control the robot in an intuitive way. No action could be performed unless \emph{connected} was set to \emph{true}. When not moving, the following outcomes were possible:
\begin{itemize}
	\item pressing up moved the robot forward until the key was released
	\item pressing down moved the robot backward until the key was released
	\item pressing left turned the robot anti-clockwise until the key was released
	\item pressing right turned the robot clockwise until the key was released
\end{itemize}
\indent When moving forward or backward, the following outcomes were possible:
\begin{itemize}
	\item pressing left steered the robot gently left until the key was released
	\item pressing right steered the robot gently right until the key was released
\end{itemize}
\indent Whenever the robot is steering left or right and the left or right arrow key is released, the CenterTurning action will be triggered, causing the robot to straighten back up again.
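The real key-handling logic lives in the page's JavaScript; the following Java sketch reproduces the same state machine. The method names are assumptions, and the \emph{turning} flag from the text is implied by the return values rather than stored.

```java
// Java sketch of the arrow-key state machine implemented in the page's
// JavaScript. Method names are hypothetical; the 'turning' flag from the
// text is implied by the return values rather than stored.
class ArrowKeyControl {
    boolean connected, moving, turningOnSpot;

    /** Returns the action to send to the server for a key press, or null. */
    String keyPressed(String key) {
        if (!connected) return null;
        if (!moving) {
            switch (key) {
                case "up":    moving = true;        return "MoveForward";
                case "down":  moving = true;        return "MoveBackward";
                case "left":  turningOnSpot = true; return "TurnLeftOnSpot";
                case "right": turningOnSpot = true; return "TurnRightOnSpot";
            }
        } else {
            // already moving: left/right steer gently instead of spinning
            if (key.equals("left"))  return "TurnLeft";
            if (key.equals("right")) return "TurnRight";
        }
        return null;
    }

    /** Returns the action to send for a key release, or null. */
    String keyReleased(String key) {
        if (!connected) return null;
        if (key.equals("up") || key.equals("down")) {
            moving = false;
            return "Stop";
        }
        if (moving) return "CenterTurning";   // straighten back up
        if (turningOnSpot) {
            turningOnSpot = false;
            return "Stop";
        }
        return null;
    }

    public static void main(String[] args) {
        ArrowKeyControl keys = new ArrowKeyControl();
        keys.connected = true;
        System.out.println(keys.keyPressed("up"));     // MoveForward
        System.out.println(keys.keyPressed("left"));   // TurnLeft while moving
        System.out.println(keys.keyReleased("left"));  // CenterTurning
        System.out.println(keys.keyReleased("up"));    // Stop
    }
}
```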

\subsection{Video Streaming}
The client side of the video stream was handled by the Cambozola Java applet. This was done by placing the following extract inside the image panel element:\\
{\footnotesize{\texttt{\textless applet codebase="\$\{request.contextPath\}/applets"\\
\indent code="com.charliemouse.cambozola.Viewer" archive="cambozola.jar"\\
\indent width="320" height="240" alt="Webcam"\textgreater\\
\textless param name="url" value="http://webrobot.test:8082/?action=stream"/\textgreater\\
\indent You need to install Java to use the webcam.\\
\textless /applet\textgreater\\}}}
\indent The \emph{request.contextPath} variable ensures the proper URL for the applet location on the server is used. The applet archive name is specified as \emph{cambozola.jar}. The stream URL is \emph{http://webrobot.test:8082/?action=stream}. The domain name used here, \emph{webrobot.test}, is an alias for localhost that was used for testing purposes. If the user doesn't have Java installed or has an incompatible web browser, a helpful message is displayed instead of the applet.\\
\indent As the camera was used in two different modes, third person and first person, two different streaming methods were used. For the third person view the camera was connected to a USB port on the server, and a batch file was written to start up VideoLAN Client (VLC) in streaming mode. VLC is an open source video playback, capture and streaming application\cite{vlc}. The batch file for starting VLC contained the command below.\\
{\footnotesize{\texttt{"D:\textbackslash Program Files\textbackslash VideoLAN\textbackslash VLC\textbackslash vlc.exe" -vvv dshow:// :dshow-vdev="USB Video Device" :dshow-adev="" :sout=\#transcode\{vcodec=mjpg, vb=512, scale=1, fps=5, width=320, height=240\}:duplicate\{dst=display, dst=std\{access=http, mux=mpjpeg, dst=:8082/stream.mjpeg\}\}\\}}}
\indent This command set up the stream in MJPEG format at a size of 320 by 240 pixels at 5 frames per second. MJPEG is essentially a stream of constantly changing JPEG images. The \emph{dst} parameter specifies the port number to listen on and the resource name used to access the stream, \emph{stream.mjpeg} in this case. The \emph{duplicate} command allows the stream to be redirected to multiple locations. By specifying \emph{dst=display}, the stream is output locally on the server as well as being streamed over HTTP.\\
\indent The setup for first person mode was a little different. The camera was plugged into the USB port on the FON Wireless Router, which was placed in the cargo area of the NXT. The Fonera router runs an embedded version of Linux that comes with drivers for cameras supporting the UVC standard. A plugin was used on the FON to stream the video back to the server over the wireless network. This plugin used a command line tool called mjpg\_streamer\cite{mjpg_streamer}. As with VLC, mjpg\_streamer can be started with parameters to set the resolution, frame rate and port number. The startup command was stored in a script file in /etc/hotplug.d/webcam/10-webcam. The command was as follows:\\
{\footnotesize{\texttt{mjpg\_streamer -b -i "input\_uvc.so -d /dev/video0" -o "output\_http.so\\
\indent -w /etc/fon/webcam/ -p 8082 \$\{auth:+-c \$\{user\}:\$\{pass\}\}"\\}}}
\indent On the server, port forwarding was set up to forward any TCP traffic on port 8082 to the FON on the same port. This was done using a small open source tool for Windows called port\_forward.exe, version 1.0. Now, whenever the applet tried to connect to the server looking for the stream, the request and all subsequent traffic would be forwarded to the FON.
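A tool like port\_forward.exe essentially just copies bytes between two sockets. The same idea can be sketched in Java as below; the class name is invented, the sketch handles one connection at a time, and the host and port values are illustrative.

```java
// Sketch of what a TCP port forwarder does: accept a connection, open a
// second one to the destination, and copy bytes in both directions.
// Class name and host/port values are illustrative.
import java.io.*;
import java.net.*;

class TinyPortForward {

    public static void forward(int listenPort, String destHost, int destPort) throws IOException {
        try (ServerSocket server = new ServerSocket(listenPort)) {
            while (true) {
                Socket client = server.accept();
                Socket dest = new Socket(destHost, destPort);
                pump(client, dest);   // client -> destination
                pump(dest, client);   // destination -> client
            }
        }
    }

    /** Copies bytes from one socket to the other on a background thread. */
    static void pump(Socket from, Socket to) {
        Thread t = new Thread(() -> {
            try (InputStream in = from.getInputStream();
                 OutputStream out = to.getOutputStream()) {
                byte[] buf = new byte[4096];
                int n;
                while ((n = in.read(buf)) != -1) {
                    out.write(buf, 0, n);
                    out.flush();
                }
            } catch (IOException ignored) {
                // one side closed; fall through and shut both sockets
            } finally {
                try { from.close(); } catch (IOException ignored) {}
                try { to.close(); } catch (IOException ignored) {}
            }
        });
        t.setDaemon(true);
        t.start();
    }

    public static void main(String[] args) throws IOException {
        // Forward local port 8082 to the FON (illustrative address).
        forward(8082, "192.168.10.1", 8082);
    }
}
```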


% FUTURE WORK

\section{Future Work}
The majority of research into possible future work was done in Engineering Release 2, i.e. the final 6 weeks. The research was divided into 3 groups: multiple user agent support, iPhone integration and FON integration. Some of the research resulted in plausible solutions, while other parts couldn't be followed through due to hardware or software limitations. This situation might change in the coming months, as the main technologies researched were relatively new and still in active development, i.e. the Fonera router firmware and plugins, and the iPhone firmware.

\subsection{Multiple User Agent Support}
The web interface was proven to work perfectly with Mozilla Firefox. In Microsoft Internet Explorer 7 (IE7) some of the CSS styling was out of place and the arrow keys didn't perform correctly, although all of the other panels, including the streaming applet, did work. The application wasn't tested with other browsers such as Opera or Safari. A possible future goal would be to test the web interface in a group of popular browsers on various operating systems. The only operating systems tried so far were Windows XP Professional and Mac OS X. Ubuntu Desktop 8.10 would be a good Linux based desktop system on which to try both the Firefox and Konqueror web browsers.\\
\indent User agent support shouldn't be limited to desktop based systems. By changing the layout of the web interface and modifying or removing elements, the interface can be tailored to display correctly on, and be compatible with, various devices. Possible devices with a web browser include mobile phones such as the iPhone, PDAs and the Nintendo DSi. A small test was developed for the Nintendo DSi web browser. This browser was developed by Opera and supports JavaScript, so it could handle AJAX calls. It doesn't support Java applets, so the streaming element had to be removed. In the Grails view file (index.gsp) an if-statement was used to check the current User-Agent of the client. If the agent name contained the keyword 'Nintendo', slightly different CSS and JavaScript files were returned, i.e. \emph{core\_ds.css} and \emph{index\_ds.js}.\\
\indent The modified CSS file removed a lot of the formatting and styling, as well as hiding the image and battery panels. The modified JavaScript file had to change the way the arrow keys were used, removing the ability to hold a key down: whenever a key was pressed the robot would move continuously, and whenever the up key was pressed thereafter the robot would stop. This compensated for the lack of \emph{onKeyReleased} support. Also, the DSi browser only recognised the directional pad as arrow keys when one of the shoulder buttons (L or R) was held down as well.\\
\indent An interesting variant on web based control for the DSi would be a small 'homebrew' application. This would run directly on the DSi, take input from the buttons and translate it into HTTP calls sent back to the server. The application could be made available via the \emph{Download Play} option, which would allow any Nintendo DS owner to easily download the application and control the robot without needing the web browser installed. Making applications available via \emph{Download Play} requires a specific wireless chipset made by Ralink to be used on the download server.

\subsection{FON Wireless Router}
When researching the FON router initially, 3 ideas were proposed. All 3 involved mounting the FON on the NXT with the first person camera attached to the USB port. They also required the FON and NXT to be connected by a USB cable, the reason being to eliminate Bluetooth. By allowing the NXT to be controlled through the USB connection via the FON, 2 advantages would be gained. Battery power on the NXT would be extended, as the Bluetooth module could be switched off. Also, the range of the NXT would no longer be limited by Bluetooth, but by the wireless connection on the FON.\\
\indent The first hurdle with this solution is getting the NXT to be recognised by the FON router. This requires obtaining or writing a Linux driver that can communicate with the NXT over USB. A driver already exists for Linux on the x86 architecture, but it would have to be modified and recompiled to work with the FON's Atheros chipset. The second hurdle is establishing communication: an application would have to be written for both the FON and NXT, in either C or Java, so they could pass messages between each other. All of this development is beyond the scope of this project.\\
\indent A slightly easier but temporary alternative would be to connect the USB bluetooth dongle to the FON and use this to communicate with the NXT. To do this, a Linux driver would have to be obtained again. Once working, a serial connection would be established to the NXT. Instead of writing a custom application to talk between the FON and NXT, the serial connection could be forwarded back to the server over WiFi. Forwarding of a serial port can be done using tools such as hub4com on Windows, and ser2net on Linux. The current application on the server would have to be configured to send the NXT commands over the virtual serial port instead of to a local Bluetooth dongle.\\
\indent If either Bluetooth or USB connectivity is established between the FON and NXT, another interesting project would be to port the server to run entirely on the FON. This would make the NXT robot independent of any additional hardware or software apart from what's housed in the cargo area. Clients could connect via the wireless access point on the FON and load up the web interface. This feature would allow the NXT to be deployed anywhere and controlled by any WiFi enabled device. The wireless adapter on the FON could be configured to act as both an access point and a client of other networks by using multiple virtual SSIDs.\\
\indent An extra step to extend the reach of the NXT would be to connect a USB 3G modem, such as the Huawei E220, to the FON. By doing this, the NXT could be controlled by virtually any device capable of accessing the internet. If that device has support for video streaming, the stream could be incorporated into the controlling application, web based or otherwise.

\subsection{iPhone Camera and Bluetooth}
The Apple iPhone could be a complete replacement for the FON router. It has a built-in camera, Bluetooth capabilities, WiFi support and 2G or 3G internet connectivity options. However, with the current firmware, version 2.1, there are a few stumbling blocks to overcome.\\
\indent The first problem is setting up the video stream. There are no applications built into the iPhone to facilitate streaming. Some custom applications, such as Qik, can set up video and audio streaming, but Qik currently only allows viewing of streams from a predefined URL on their website\cite{qik}. This means the stream cannot be directed over the local WiFi interface, and the resulting latency is a problem: there can be anything upwards of a 30 second gap between the camera recording the video and the stream being displayed on the website. A more scalable solution is needed so that the stream can be sent out over any interface on the iPhone to any requesting device. This would require writing a custom application using the iPhone Software Development Kit (SDK).\\
\indent The second problem with using the iPhone is communicating with the NXT over Bluetooth. The built-in Bluetooth support currently only works with headsets and hands free kits. In order to work with the NXT, Bluetooth serial support would have to be added. This isn't an impossible feat; many other Bluetooth enabled phones have this support.\\
\indent There are two options here. The first is to wait for the official iPhone firmware to be updated with this support. Unfortunately there is no indication of when or if this will be added. The second option is to wait until the iBluetooth project has serial support. The iBluetooth project aims to extend the bluetooth functionality of the iPhone by implementing more of the Bluetooth standards such as File Transfer Protocol (FTP), Basic Imaging Profile (BIP) or Serial Port Profile (SPP)\cite{ibluetooth}.\\
\indent If communication can be established between the iPhone and NXT, the next step would be to port the server to the iPhone using the SDK. Alternatively, an unofficial Java Virtual Machine could be installed, allowing the current server to be ported a little more easily than rewriting it from scratch. Either way, there are many problems to overcome before integrating the iPhone and NXT. For these reasons the FON is a much more viable way of continuing this project.

\section{Summary}
In this project a web interface for controlling, in theory, any robot was developed. The generic driver specification allowed a driver suite to be written for any robot and plugged into the web interface. A driver suite was written and developed for the Lego Mindstorms NXT as a demonstration of the project. To further enhance the web control aspect, a streaming video feed was set up at the server and on the robot. This feed was made accessible from the web interface by embedding a Java applet. A spin-off of this project was the hardware setup on the NXT. This proved to be more than just a demo robot and had many more possibilities. Only some of these possibilities were explored and there is much scope for bringing the project forward in different directions.

{\small{
\begin{thebibliography}{99}
	\bibitem{nxt}Lego.com MINDSTORMS NXT Home (2009)
		[online] 
		\emph{available: http://mindstorms.lego.com}, 
		last accessed [20, April, 2009]
		
	\bibitem{bluetooth}Bluetooth.org (2009)
		[online] 
		\emph{available: https://www.bluetooth.org/apps/content/}, 
		last accessed [20, April, 2009]
		
	\bibitem{iphone_phone}Apple - iPhone - Features (2009)
		[online] 
		\emph{available: http://www.apple.com/iphone/features/}, 
		last accessed [20, April, 2009]
	\bibitem{iphone_image}Apple - iPhone - Gallery (2009)
		[online] 
		\emph{available: http://www.apple.com/iphone/gallery/\#image3}, 
		last accessed [20, April, 2009]
	\bibitem{iphone_firmware}Apple - iPhone - Software Update (2009)
		[online] 
		\emph{available: http://www.apple.com/iphone/softwareupdate/}, 
		last accessed [20, April, 2009]
	\bibitem{ibluetooth}iBluetooth Project (2009) 
		[online] 
		\emph{available: http://www.ibluetoothproject.com/}, 
		last accessed [20, April, 2009]
	\bibitem{qik}Qik (2009) 
		[online] 
		\emph{available: http://qik.com/}, 
		last accessed [20, April, 2009]
		
	\bibitem{fon}La Fonera 2.0 - Fonera Wiki Beta (2009)
		[online] 
		\emph{available: http://wiki.fon.com/wiki/La\_Fonera\_2.0}, 
		last accessed [20, April, 2009]
		
	\bibitem{e220}Huawei E220 (2009)
		[online] 
		\emph{available: http://www.huawei.com/mobileweb/en/products/view.do?id=282}, 
		last accessed [20, April, 2009]
		
	\bibitem{dsi}Nintendo DSi - Nintendo (2009)
		[online] 
		\emph{available: http://www.nintendo.co.uk/NOE/en\_GB/systems/nintendo\_dsi\_11513.html}, 
		last accessed [20, April, 2009]
		
	\bibitem{icommand_and_lejos}iCommand and Lejos (2009)
		[online] 
		\emph{available: http://lejos.sourceforge.net}, 
		last updated [11, December, 2008]
		
	\bibitem{grails}Grails (2009)
		[online] 
		\emph{available: http://grails.org}, 
		last updated [1, January, 2009]
		
	\bibitem{maven}Maven (2009) 
		[online] 
		\emph{available: http://maven.apache.org}, 
		last updated [5, January, 2009]
	\bibitem{maven_repo}Maven Central Repository (2009) 
		[online] 
		\emph{available: http://repo1.maven.org/maven2/}, 
		last updated [20, April, 2009]
	\bibitem{maven_archetypes}Maven - Introduction to Archetypes (2009) 
		[online] 
		\emph{available: http://maven.apache.org/guides/introduction/introduction-to-archetypes.html}, 
		last accessed [20, April, 2009]
		
	\bibitem{google_docs}Google Docs (2009) 
		[online] 
		\emph{available: http://docs.google.com}, 
		last accessed [20, April, 2009]
		
	\bibitem{eclipse}Eclipse.org Home (2009)
		[online] 
		\emph{available: http://www.eclipse.org}, 
		last accessed [20, April, 2009]
		
	\bibitem{netbeans}Netbeans IDE Download (2009)
		[online] 
		\emph{available: http://www.netbeans.org/downloads/}, 
		last accessed [20, April, 2009]
		
	\bibitem{subversion}Subversion (2009) 
		[online] 
		\emph{available: http://subversion.tigris.org/}, 
		last updated [20, April, 2009]
	\bibitem{subversion_project}Project Code Repository (2009) 
		[online] 
		\emph{available: https://webenabledrobotcontrol.googlecode.com/svn}, 
		last updated [20, April, 2009]
		
	\bibitem{googlecode}Google Code (2009) 
		[online] 
		\emph{available: http://code.google.com/hosting/}, 
		last accessed [20, April, 2009]
		
	\bibitem{xp}
		Extreme Programming: A Gentle Introduction (2006)
		[online] 
		\emph{available: http://www.extremeprogramming.org}, 
		last updated [17, February, 2006]
		
	\bibitem{hudson}Hudson (2009) 
		[online] 
		\emph{available: https://hudson.dev.java.net}, 
		last accessed [20, April, 2009]
		
	\bibitem{cambozola}Cambozola Streaming Image Viewer (2009) 
		[online] 
		\emph{available: http://www.charliemouse.com/code/cambozola/}, 
		last accessed [20, April, 2009]
		
	\bibitem{vlc}VLC media player (2009) 
		[online] 
		\emph{available: http://www.videolan.org/vlc/}, 
		last accessed [20, April, 2009]
		
	\bibitem{mjpg_streamer}Mjpg Streamer (2009) 
		[online] 
		\emph{available: http://www.zoneminder.com/wiki/index.php/Uvc}, 
		last accessed [20, April, 2009]
		
\end{thebibliography}
}
\end{document}
