\documentclass[a4paper,12pt]{article}
\usepackage[utf8]{inputenc}
\usepackage[bookmarks=true,bookmarksopen=true,bookmarksnumbered=true,
           pdfborder={0 0 0},colorlinks=true,linkcolor=blue]
           {hyperref}
\usepackage{graphicx} % for images

\title{Pattern Recognition}
\author{Aurelian R\u{a}doac\u{a}}
\date{} % no date

\begin{document}

 \maketitle

 \begin{abstract}
  Pattern recognition means examining input data and recognizing patterns
  based on \textit{a priori} knowledge or on statistical information
  extracted from the patterns \cite{PatternRecognition}.

  Patterns are assigned to a class using a classifier (e.g., a rule based
  on the location of a graphical representation of the given sample with
  respect to other samples of the known class) \cite{WolframPattern}.

  Pattern recognition is used in diverse applications, such as handwriting
  and speech recognition, marketing and financial analysis, gene expression
  analysis, and biometrics.
 \end{abstract}

 \tableofcontents

 \section{Introduction}

 A pattern recognition system consists of a sensor that gathers the
 observations to be classified or described, a feature extraction
 mechanism that computes numeric or symbolic information from the
 observations, and a classification or description scheme that does the
 actual job of classifying or describing observations, relying on the
 extracted features \cite{PatternRecognition}.

 A sensor is a device that measures a physical quantity and converts it
 into a signal that can be read by an observer or by an instrument
 \cite{Sensor}.

 In pattern recognition and in image processing, feature extraction is
 a special form of dimensionality reduction.
 When the input data to an algorithm is too large to be processed and
 is suspected to be redundant (much data, but not much information),
 it is transformed into a reduced representation set of features
 (also called a feature vector). Transforming the input data into this
 set of features is called feature extraction. If the features are
 chosen carefully, the feature set is expected to capture the relevant
 information from the input data, so that the desired task can be
 performed using this reduced representation instead of the full-size
 input \cite{FeatureExtraction}.
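As a small sketch of this idea (the particular features here are hypothetical, not taken from the source), the following reduces a raw signal to a three-element feature vector:

```python
import statistics

def extract_features(signal):
    """Reduce a raw signal to a small feature vector:
    mean, standard deviation, and number of zero crossings."""
    mean = statistics.fmean(signal)
    std = statistics.pstdev(signal)
    # A zero crossing occurs where consecutive samples change sign.
    zero_crossings = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0)
    return [mean, std, zero_crossings]

signal = [0.1, 0.4, -0.2, -0.5, 0.3, 0.2, -0.1, 0.6]
print(extract_features(signal))  # [mean, std, zero-crossing count]
```

A classifier would then operate on these three numbers instead of the full sample sequence.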

 Statistical classification is a supervised machine learning procedure
 in which individual items are placed into groups based on quantitative
 information on one or more characteristics inherent in the items
 (referred to as traits, variables, characters, etc.) and based on a
 training set \cite{TrainingSet} of previously labeled items
 \cite{Classification}.

 The classification or description scheme is usually based on the
 availability of a set of patterns that have already been classified
 or described. This set of patterns is termed the training set (matching
 input to desired output), and the resulting learning strategy is
 characterized as supervised learning (deduce a function connecting input
 to observed or desired output). Learning can also be unsupervised, in
 the sense that the system is not given an a priori labeling of
 patterns; instead, it establishes the classes itself, based on the
 statistical regularities of the patterns \cite{PatternRecognition}.

 The function's values can be continuous or discrete: they can define
 classes (using supervised learning \cite{SupervisedLearning}) or
 delimit clusters of similar values (using unsupervised learning
 \cite{UnsupervisedLearning}).

 The classification or description scheme usually uses one of the
 following approaches: statistical (or decision theoretic) and
 syntactic (or structural). Statistical pattern recognition is based
 on statistical characterizations of patterns, assuming that the
 patterns are generated by a probabilistic system. Syntactical
 (or structural) pattern recognition is based on the structural
 interrelationships of features \cite{PatternRecognition}. 

 \section{Algorithms}

 \subsection{k-nearest neighbors}
  In pattern recognition, the k-nearest neighbors algorithm (k-NN) \cite{knn}
  is a method for classifying objects based on closest training examples
  in the feature space.

  In pattern recognition a feature space is an abstract space where each
  pattern sample is represented as a point in n-dimensional space. Its
  dimension is determined by the number of features used to describe
  the patterns. Similar samples are grouped together, which allows the
  use of density estimation for finding patterns \cite{FeatureSpace}.

  k-NN is a type of instance-based learning (compare new instances with
  those from the training set), or
  lazy learning \cite{LazyLearning} where the function is only approximated locally and all
  computation is deferred until classification. The k-nearest neighbor
  algorithm is amongst the simplest of all machine learning algorithms:
  an object is classified by a majority vote of its neighbors, with the
  object being assigned to the class most common amongst its k nearest
  neighbors (k is a positive integer, typically small). If k = 1, then
  the object is simply assigned to the class of its nearest neighbor
  \cite{knn}.

  \begin{center}
   \includegraphics{220px-KnnClassification.png}
  \end{center}

  \underline{Source:}
  \href{http://en.wikipedia.org/wiki/File:KnnClassification.svg}
  {http://en.wikipedia.org/wiki/File:KnnClassification.svg}

  Example of k-NN classification. The test sample (green circle)
  should be classified either to the first class of blue squares
  or to the second class of red triangles. If k = 3 it is classified
  to the second class because there are 2 triangles and only 1 square
  inside the inner circle. If k = 5 it is classified to the first class
  (3 squares vs. 2 triangles inside the outer circle).
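The figure's voting flip can be reproduced in a few lines (the coordinates below are invented to mirror the figure, not taken from it):

```python
from collections import Counter
import math

def knn_classify(train, query, k):
    """Classify `query` by a majority vote among its k nearest
    training points. `train` is a list of ((x, y), label) pairs."""
    nearest = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Layout mimicking the figure: two triangles close to the query,
# three squares slightly farther away.
train = [((0.1, 0.0), "triangle"), ((0.0, 0.1), "triangle"),
         ((0.2, 0.0), "square"), ((0.5, 0.0), "square"),
         ((0.0, 0.5), "square")]
query = (0.0, 0.0)
print(knn_classify(train, query, 3))  # 'triangle' (2 triangles vs. 1 square)
print(knn_classify(train, query, 5))  # 'square' (3 squares vs. 2 triangles)
```

All computation happens at classification time, which is exactly the "lazy learning" behavior described above.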


 \section{Fuzzy Logic}
  Fuzzy Logic is used for solving problems that deal with approximations,
  rather than exact solutions. Instead of truth values it uses degrees of
  truth \cite[Degrees of truth]{FuzzyLogic}, which are values between
  0 and 1, just like probabilities.  Unlike probabilities, however,
  fuzzy logic assigns values to sets based on subjective measures.

  Consider, for example, the problem of determining whether (and how
  much) a glass is full of water. If the glass contains 30\% water, how
  full or empty it is depends on the observer. There could be a rule
  saying the glass is full down to 50\%. Fuzzy Logic can decide whether
  the glass is full or empty based on defined rules and observed values
  (input data). This problem is not easily solvable, or even
  representable, in terms of probabilities; at best, we could define
  the probability that an observer would judge the glass to be full
  at a certain water level.

  As a further distinction, probability works with ordinary (crisp)
  sets, while in fuzzy logic even set membership is fuzzy. Output
  values of a certain event might be assigned to different, closely
  related classes. With crisp sets, an element either belongs or does
  not belong to the set. With fuzzy sets \cite{FuzzySet}, elements have
  degrees of membership. Thus, we can define a membership function
  taking any real value in the interval $[0,1]$.
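For the glass example, such a membership function could look like this (the 0.3 and 0.8 thresholds are hypothetical, chosen only to illustrate the shape):

```python
def membership_full(level):
    """Degree to which a glass is 'full', given a water level in [0, 1].
    Below 0.3 the glass is not full at all; above 0.8 it is fully 'full';
    in between, membership rises linearly. Thresholds are illustrative."""
    if level <= 0.3:
        return 0.0
    if level >= 0.8:
        return 1.0
    return (level - 0.3) / (0.8 - 0.3)

print(membership_full(0.3))   # 0.0
print(membership_full(0.55))  # ~0.5
print(membership_full(0.9))   # 1.0
```

Unlike a probability, the value 0.5 does not mean "a 50\% chance the glass is full"; it means the glass belongs to the set \textit{full} to degree 0.5.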

  Fuzzy Logic focuses on what the system should do rather than trying
  to understand how it works. One can concentrate on solving the problem
  rather than trying to model the system mathematically, if that is even
  possible. This almost invariably leads to quicker, cheaper solutions
  \cite{FuzzyTutorial}.

  Fuzzy logic facilitates design of systems that mimic human reasoning.
  A fuzzy system accepts data input from sensors, then makes decisions
  based on that input. In most cases, these decisions are the basis for
  a control system. However, a fuzzy rule-driven system can simply be
  a classification engine that draws distinctions between and labels
  differing types of input data \cite[OVERVIEW]{CharacterRecognition}.

  The three elements required to realize a fuzzy system are fuzzification,
  rule application, and defuzzification. Fuzzification is the scaling of
  input data to the universe of discourse (the range of possible values
  assigned to fuzzy sets). Rule application is the evaluation of
  fuzzified input data against the fuzzy rules written specifically for
  the system. Defuzzification is the generation of a specific output
  value based on the rule strengths (significant values in the fuzzy sets
  reflecting degrees of membership to these sets) that emerge from the rule
  application \cite[THE FUZZY LOGIC DESIGN PROCESS]{CharacterRecognition}.
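A minimal sketch of these three steps (a toy fan controller with made-up fuzzy sets and rules; none of the numbers come from the source):

```python
def fuzzify(temp):
    """Fuzzification: degrees of membership of a temperature (deg C)
    in the overlapping sets 'cold' and 'hot'."""
    cold = max(0.0, min(1.0, (30.0 - temp) / 20.0))
    hot = max(0.0, min(1.0, (temp - 10.0) / 20.0))
    return {"cold": cold, "hot": hot}

def infer(temp):
    """Rule application and defuzzification.
    Rules: IF hot THEN fan = 100; IF cold THEN fan = 0.
    Defuzzify by a weighted average of the rule outputs, weighted
    by rule strength (the membership degrees)."""
    m = fuzzify(temp)
    strengths = {100.0: m["hot"], 0.0: m["cold"]}
    total = sum(strengths.values())
    if total == 0:
        return 50.0  # neutral output if no rule fires
    return sum(out * s for out, s in strengths.items()) / total

print(infer(20))  # cold=0.5, hot=0.5 -> fan = 50.0
print(infer(25))  # cold=0.25, hot=0.75 -> fan = 75.0
```

The weighted average used here is one common defuzzification scheme; centroid-based methods are another.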

  \subsection{Sensor Networks}

   To implement an application that models a real-world situation,
   we have to work with discrete values. In a sensor application,
   for instance, we must record only significant data at certain time
   frames, without too much loss of information (or generality). We must
   also apply discrete values when establishing the fuzzy sets.

   In modelling sensor networks, there might be some problems regarding
   stream segmentation \cite[p.2, Subsection 2.1 Modelling sensor networks]
   {StreamSegmentation}:
   \begin{enumerate}
    \item In sensor networks, data are collected from the real world where
          the sensors are embedded. They reflect certain properties of the
          world, but may not be directly understandable by users. For
          example, acoustic and infrared sensor signals from moving
          vehicles are meaningless to database users; instead, the type,
          location and trajectories of those vehicles are what matter.
    \item The data stream is continuous. The real world produces a
          continuous time sequence of multi-modality spatiotemporal data.
          To realize our goal of monitoring the dynamics, sensor networks
          must sample the data stream without much loss of information.
    \item Multi-modality data are collected using multiple types of sensors,
          which represent various aspects of the targets.
    \item There may be uncertainty due to the reliability of sensors and
          communication networks, and to the dynamic nature of the
          environment.
   \end{enumerate}
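One simple way to record only significant data from a continuous stream (a hypothetical threshold-based scheme, not the cited paper's method) is to keep a reading only when it moves far enough from the last recorded value:

```python
def segment_stream(readings, threshold):
    """Keep an (index, value) sample only when the value deviates by more
    than `threshold` from the last recorded value; drop the rest."""
    recorded = []
    last = None
    for i, value in enumerate(readings):
        if last is None or abs(value - last) > threshold:
            recorded.append((i, value))
            last = value
    return recorded

# A slowly drifting temperature stream with two significant jumps.
stream = [20.0, 20.1, 20.05, 22.3, 22.4, 25.0, 24.9, 25.1]
print(segment_stream(stream, 1.0))  # [(0, 20.0), (3, 22.3), (5, 25.0)]
```

The choice of threshold trades storage and bandwidth against loss of information, which is exactly the tension described in item 2 above.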

   A sensor network is best represented by a graph $G = (V, E)$,
   with $V$ denoting the vertex set and $E$ the edge set.

   All sensor networks are meant to accomplish certain tasks. A group of
   sensors collaboratively working together to fulfill a task is one of
   the major characteristics of sensor networks. A task can be
   location-invariant, such as monitoring a junction of a bridge, or
   dynamic, such as tracking a moving target
   \cite[p.3, Subsection 2.2 Neighborhood system and collaboration region]
   {StreamSegmentation}.

   Vertices are related to each other via a neighborhood system. A
   neighborhood consists of one or a group of sensor nodes, performing
   a common task or forming a homogeneous region.
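To make the graph representation and the neighborhood system concrete (a hypothetical five-node network; the node names are made up):

```python
# Hypothetical sensor network: vertices are sensor nodes, edges are
# communication links; G = (V, E) stored as an adjacency mapping.
graph = {
    "s1": {"s2", "s3"},
    "s2": {"s1", "s3"},
    "s3": {"s1", "s2", "s4"},
    "s4": {"s3", "s5"},
    "s5": {"s4"},
}

def neighborhood(graph, node):
    """The node plus the nodes that can collaborate with it directly,
    i.e. its graph neighbors."""
    return graph[node] | {node}

print(sorted(neighborhood(graph, "s3")))  # ['s1', 's2', 's3', 's4']
```

A dynamic neighborhood would simply be a recomputation of this set as the task location moves across the network.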

   A neighborhood can be dynamic. The site of the neighborhood can
   change as the task location changes. In this case, vertices in
   the neighborhood change accordingly.

   Sensors are scheduled to detect any objects or events, which may
   occur anytime and anywhere. As soon as an object or event is
   detected, an active region is formed. Sensors inside the active
   region will work together to classify the object
   \cite[p.4, Subsection 2.3 Detection and classification]
   {StreamSegmentation}.

   Different sensors can provide different images of an object in the
   same active region (from different angles or distances). These images
   must later be fused: first by aligning them to the same coordinates,
   then by recognizing the context, the objects, their characteristics
   and relevance, and their movement in a dynamic environment
   \cite[p.1-2, Section 1 Introduction]{StreamSegmentation},
   establishing the corresponding fuzzy set and membership degree, and
   taking the proper action.

   In recognizing images or patterns within them, it is important to
   know the context, or to have a map of the region. The patterns
   recognized in images and maps are matched (like applying the rules
   of fuzzy logic), and the result is defuzzified (turned into an
   understandable format) \cite[p.3, Figure 2]{ChangeDetection}.

   The fuzzy sets can be determined using a probability distribution.
   For example, to detect the shadow movements over a known terrain
   during daylight, the surface and the objects on it determine a
   certain probability distribution for the shadowed regions (like
   contiguous regions behind a structure) \cite{RegionDetection}.

 \begin{thebibliography}{20}
  \bibitem[1]{PatternRecognition}
   \textquotedblleft Pattern recognition\textquotedblright, online
   article at 

   \href{http://en.wikipedia.org/wiki/Pattern\_recognition}
   {http://en.wikipedia.org/wiki/Pattern\_recognition}
  \bibitem[2]{WolframPattern}
   \textquotedblleft Pattern Recognition Primer\textquotedblright,
   online article at

   \href{http://demonstrations.wolfram.com/PatternRecognitionPrimer/}
   {http://demonstrations.wolfram.com/PatternRecognitionPrimer/}
  \bibitem[3]{Sensor}
   \textquotedblleft Sensor\textquotedblright, online article at 

   \href{http://en.wikipedia.org/wiki/Sensor}
   {http://en.wikipedia.org/wiki/Sensor}
  \bibitem[4]{FeatureExtraction}
   \textquotedblleft Feature extraction\textquotedblright,
   online article at 

   \href{http://en.wikipedia.org/wiki/Feature\_extraction}
   {http://en.wikipedia.org/wiki/Feature\_extraction}
  \bibitem[5]{TrainingSet}
   \textquotedblleft Training set\textquotedblright, online article at 

   \href{http://en.wikipedia.org/wiki/Training\_set}
   {http://en.wikipedia.org/wiki/Training\_set}
  \bibitem[6]{Classification}
   \textquotedblleft Statistical classification\textquotedblright,
   online article at 

   \href{http://en.wikipedia.org/wiki/Statistical\_classification}
   {http://en.wikipedia.org/wiki/Statistical\_classification}
  \bibitem[7]{SupervisedLearning}
   \textquotedblleft Supervised learning\textquotedblright,
   online article at 

   \href{http://en.wikipedia.org/wiki/Supervised\_learning}
   {http://en.wikipedia.org/wiki/Supervised\_learning}
  \bibitem[8]{UnsupervisedLearning}
   \textquotedblleft Unsupervised learning\textquotedblright,
   online article at 

   \href{http://en.wikipedia.org/wiki/Unsupervised\_learning}
   {http://en.wikipedia.org/wiki/Unsupervised\_learning}
  \bibitem[9]{knn}
   \textquotedblleft k-nearest neighbor algorithm\textquotedblright,
   online article at 

   \href{http://en.wikipedia.org/wiki/K-nearest\_neighbor\_algorithm}
   {http://en.wikipedia.org/wiki/K-nearest\_neighbor\_algorithm}
  \bibitem[10]{FeatureSpace}
   \textquotedblleft Feature space\textquotedblright,
   online article at 

   \href{http://en.wikipedia.org/wiki/Feature\_space}
   {http://en.wikipedia.org/wiki/Feature\_space}
  \bibitem[11]{LazyLearning}
   \textquotedblleft Lazy learning\textquotedblright, online article at 

   \href{http://en.wikipedia.org/wiki/Lazy\_learning}
   {http://en.wikipedia.org/wiki/Lazy\_learning}
  \bibitem[12]{FuzzyLogic}
   \textquotedblleft Fuzzy logic\textquotedblright, online article at 

   \href{http://en.wikipedia.org/wiki/Fuzzy\_logic}
   {http://en.wikipedia.org/wiki/Fuzzy\_logic}
  \bibitem[13]{FuzzySet}
   \textquotedblleft Fuzzy set\textquotedblright, online article at 

   \href{http://en.wikipedia.org/wiki/Fuzzy\_set}
   {http://en.wikipedia.org/wiki/Fuzzy\_set}
  \bibitem[14]{FuzzyTutorial}
   \textquotedblleft Fuzzy Logic Tutorial\textquotedblright,
   online article at 

   \href{http://www.seattlerobotics.org/Encoder/mar98/fuz/flindex.html}
   {http://www.seattlerobotics.org/Encoder/mar98/fuz/flindex.html}
  \bibitem[15]{CharacterRecognition}
   \textquotedblleft Optical Character Recognition Using Fuzzy
   Logic\textquotedblright, online article at 

   \vspace{-5mm}
   \begin{flushleft}
    \href{http://www.freescale.com/files/microcontrollers/doc/app\_note/AN1220\_D.pdf}
    {http://www.freescale.com/files/microcontrollers/doc/app\_note/ AN1220\_D.pdf}
   \end{flushleft}
   \vspace{-5mm}
  \bibitem[16]{StreamSegmentation}
   \textquotedblleft Stream Segmentation\textquotedblright,
   online article at 

   \href{http://www.aas.net.cn/qikan/manage/wenzhang/060603.pdf}
   {http://www.aas.net.cn/qikan/manage/wenzhang/060603.pdf}
  \bibitem[17]{ChangeDetection}
   \textquotedblleft AUTOMATED PROCEDURES FOR INTEGRATION OF SATELLITE
   IMAGES AND MAP DATA FOR CHANGE DETECTION\textquotedblright,
   online article at 

   \vspace{-5mm}
   \begin{flushleft}
    \href{http://www.ifp.uni-stuttgart.de/publications/commIV/dowman73n.pdf}
    {http://www.ifp.uni-stuttgart.de/publications/commIV/ dowman73n.pdf}
   \end{flushleft}
   \vspace{-5mm}
  \bibitem[18]{RegionDetection}
   \textquotedblleft Data Segmentation for Region Detection in a Sensor
   Network\textquotedblright, online article at 

   \vspace{-5mm}
   \begin{flushleft}
    \href{http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.100.8148&rep=rep1&type=pdf}
    {http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.100.8148
    \&rep=rep1\&type=pdf}
   \end{flushleft}
   \vspace{-5mm}

 \end{thebibliography}
 \addcontentsline{toc}{section}{References}

\end{document}

