\documentclass[letterpaper, 10 pt, conference]{ieeeconf}

\usepackage{amssymb, amsmath}
\usepackage{hyperref}
\usepackage{color}
\usepackage{graphicx}
\usepackage[all]{xy}
\usepackage{subfig}

\begin{document}
\title{CSCI5561 Class Project: Moving Object Following With An Aerial Robotic Platform Using Its Intrinsic Sensors}

\author{Dario Canelon, Jiannan ``Eric'' Zhang \\
dario@cs.umn.edu, zhan2001@umn.edu}


\maketitle

\begin{abstract}
Object tracking and following are important in many applications, and different methods have been developed to track specific objects based on their distinguishing features. This proposal gives an overview of our project goals and of several methods we may try in order to achieve them. We also briefly summarize current tracking and recognition algorithms, and we list our timeline for the project.
\end{abstract}
%----------------------------------------------------------------------------------------------------------------------------------------------------------
		

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
\section{Introduction}
Tracking and following objects is one of the most important topics in computer vision, and achieving accuracy and robustness requires combining several technologies. Different methods suit different applications: early systems relied on mechanical and magnetic trackers, while in recent years sensor- and image-based vision systems have become increasingly powerful thanks to the growth of computing power. Among these, vision systems have a great advantage because they are non-invasive and less costly [1]. Based on the surveys [1] [2], the tracking methods we can use in our project fall into three categories: edge-based tracking, template matching, and interest-point-based approaches. Most of these methods exploit natural features of the target object, and they differ in computational load. We plan to deploy our tracking system on an AR Drone and track objects from the air; this task decomposes into object recognition, tracking, and following.

\section{Problem}
Our scenario is simple but useful: on a battlefield, a tank on the ground needs help (perhaps it needs transport because a mountain blocks its path, or it is surrounded by enemies and needs air support), so it sends an SOS signal. An AR Drone immediately becomes aware of the signal and flies to the tank. This requires three steps: take off and find the object; fly toward it while tracking its position; and finally stop at a fixed offset from the object. In our application, we want the AR Drone to stop and hover directly above it.
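The three-phase mission above can be sketched as a small state machine. This is only an illustration: the state names and the two boolean predicates are our own assumptions, not AR Drone API calls.

```python
# Assumed phase names for the mission described above.
MISSION_PHASES = ["SEARCH", "APPROACH", "HOVER_ABOVE"]

def next_state(state, object_visible, centered_below):
    """Advance the mission: rotate and search until the object is found,
    fly toward it while tracking, then hold position once the object is
    centered in the downward-facing camera's view."""
    if state == "SEARCH" and object_visible:
        return "APPROACH"
    if state == "APPROACH" and centered_below:
        return "HOVER_ABOVE"
    return state
```

A real controller would run this transition function once per video frame, feeding it the outputs of the recognition and tracking modules described below.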


\subsection{Object recognition}
After the AR Drone takes off, it must first find the object and turn to face it. Previously prepared images of the target are given to the controlling computer as input; as the AR Drone rotates, the computer processes the images transmitted back from the front camera. Within a 360-degree rotation, the object can be recognized by a matching algorithm. The AR Drone should then adjust its heading accurately so that it faces the object.
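As a concrete illustration of the matching step, a brute-force template match with zero-mean normalized cross-correlation can be written directly in NumPy. This is a minimal sketch of our own (the function name and exhaustive scan are illustrative); a practical system would use an optimized or feature-based matcher.

```python
import numpy as np

def ncc_match(frame, template):
    """Return ((row, col), score) of the best template placement,
    scored by zero-mean normalized cross-correlation in [-1, 1]."""
    th, tw = template.shape
    fh, fw = frame.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            patch = frame[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum()) * t_norm
            if denom == 0:          # flat patch: correlation undefined
                continue
            score = (p * t).sum() / denom
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos, best_score
```

A score near 1.0 at some placement during the rotation would indicate the target is in view, giving the heading correction for the drone.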

\subsection{Movement towards the object}
The second step is moving the AR Drone toward the object. Since we assume the object is far from the AR Drone at the start, the front camera is used. As the AR Drone moves, the object's appearance in the image changes: precisely, it grows larger (scaling) and gradually drifts out of the frame (translation). The key point is that the AR Drone must account for this and keep recognizing and tracking the object. At a certain point the object will leave the front camera's view entirely; at that moment the downward-facing camera should be activated to continue tracking. One problem is that a blind area may exist between the two views; if so, one solution is to make the AR Drone fly higher after switching to the downward camera.
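The camera hand-off described above reduces to a simple rule on the tracked object's image position. The sketch below is our own illustration; the margin threshold is an assumed tuning parameter, not a measured value.

```python
def choose_camera(box_center_y, frame_height, margin=0.9):
    """Decide which camera feed to track with: keep the front camera
    while the object sits inside its view, and switch to the
    downward-facing camera once the object's image position drifts
    into the bottom margin (i.e., it is about to leave the frame)."""
    if box_center_y >= margin * frame_height:
        return "bottom"
    return "front"
```

Hysteresis (a second, lower threshold for switching back) would avoid rapid toggling near the boundary, but is omitted here for brevity.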

\subsection{Following}
The final step sounds less challenging once the second step is realized. The AR Drone moves until the object appears at the center of its image, and then keeps following it indefinitely (the potential applications are substantial).
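Centering the object can be done with a simple proportional controller on the pixel error. The sketch below is our own illustration: the axis conventions, gain, and output ranges are assumptions, not AR Drone API values.

```python
def follow_command(cx, cy, width, height, gain=0.002):
    """Map the tracked object's pixel offset from the image center to
    clamped velocity commands in [-1, 1].  Returns (lateral, forward):
    move right when the object is right of center, and forward when it
    is above center (toward the top of the downward camera's frame)."""
    ex = cx - width / 2.0    # positive: object is right of center
    ey = cy - height / 2.0   # positive: object is below center
    lateral = max(-1.0, min(1.0, gain * ex))
    forward = max(-1.0, min(1.0, -gain * ey))
    return lateral, forward
```

Running this every frame drives both errors toward zero, which keeps the object centered while it moves.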

%\subsection{Pick up or Loading}
%
%\section{Proposed Solution}

\subsection{Recognition and Tracking}

If we treat the object as static, this becomes a frame-transformation problem, so we must consider algorithms that are robust to affine transformations. There are several algorithms we would like to try:

\begin{enumerate}
	\item RAPiD tracker [2]: edge-based tracking with low computational load.
	\item Lucas-Kanade algorithm [3]: global region-based tracking.
	\item Interest-point-based local approaches [6].
	\item SIFT, SURF, and BRIEF [4] [5]: feature detection and matching.
\end{enumerate}
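As a sanity check for option 2, a translation-only Lucas-Kanade estimate fits in a few lines of NumPy. This is a sketch of a single Gauss-Newton step of [3]; a usable tracker would iterate this step, warp between iterations, and work over an image pyramid.

```python
import numpy as np

def lk_translation(I, J):
    """One step of the translation-only Lucas-Kanade estimate.
    Assuming J(x) = I(x + d), the temporal difference J - I is
    approximately Ix*dx + Iy*dy, giving 2x2 normal equations A d = b."""
    I = I.astype(float)
    J = J.astype(float)
    Iy, Ix = np.gradient(I)          # spatial gradients (rows, cols)
    It = J - I                       # temporal difference
    A = np.array([[(Ix * Ix).sum(), (Ix * Iy).sum()],
                  [(Ix * Iy).sum(), (Iy * Iy).sum()]])
    b = np.array([(Ix * It).sum(), (Iy * It).sum()])
    return np.linalg.solve(A, b)     # estimated (dx, dy) in pixels
```

On a smooth image shifted by one pixel, a single step already recovers the displacement to within a few percent, which is why [3] is attractive for frame-to-frame tracking on a moving platform.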

%\section{Related Work}
%
%stuff
	
\section{Tasks/Timelines/Milestones}

An up-to-date version can be seen at \url{https://sites.google.com/site/2012scsci5561drones/progress/workflow}

\begin{itemize}
	\item 2012.02.26
		\begin{itemize}
			\item Create project page
			\item Brainstorm architecture
			\item Install Robot Operating System and project-specific packages (Brown AR Drone package)
			\item Achieve AR drone flight using Robot Operating System (ROS)	
		\end{itemize}
	\item 2012.03.10
		\begin{itemize}
			\item Acquire video stream from AR Drone
			\item Research feature detectors
			\item Implement feature detectors on acquired video
		\end{itemize}
	
	\item 2012.03.15
		\begin{itemize}
			\item Set up a feature-detector testing methodology
		\end{itemize}
	
	\item 2012.03.20
	\begin{itemize}
		\item Choose different robots to track
		\item Choose feature detectors based on speed and per-robot performance metrics
	\end{itemize}
	\end{itemize}

\section{Bibliography}
\begin{enumerate}
\item Monocular Model-Based 3D Tracking of Rigid Objects: A Survey, Vincent Lepetit and Pascal Fua
\item Tracking with Rigid Objects, C. Harris
\item An Iterative Image Registration Technique with an Application to Stereo Vision, B. Lucas and T. Kanade
\item Distinctive Image Features from Scale-Invariant Keypoints, David G. Lowe
\item BRIEF: Binary Robust Independent Elementary Features, Michael Calonder, Vincent Lepetit, Christoph Strecha, and Pascal Fua
\item A Feature-Based Correspondence Algorithm for Image Matching, W. F.
\item Project website: \url{https://sites.google.com/site/2012scsci5561drones/progress/workflow}
\end{enumerate}




%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

%----------------------------------------------------------------------------------------------------------------------------------------------------------

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%\bibliographystyle{IEEEconf}	
%\bibliographystyle{plain}	
%\bibliography{bib}	
%----------------------------------------------------------------------------------------------------------------------------------------------------------

\end{document}


%
%	\begin{figure}[tb]
%		\centering
%		\includegraphics[width=0.30\textwidth]{images/100_1800.jpg}
%		\caption{Aquapod}
%		\label{fig:aquapod_pool}
%	\end{figure}
%	