\section{Background}

Before describing our approach, we introduce the artificial intelligence
techniques that it employs.

\subsection{PageRank}

Google's PageRank is a technique for determining the relative importance of web
pages; it was first described in \cite{Page1999}. The primary use of the
PageRank algorithm is to sort a given set of web pages, such as search results,
in order of importance. For the benefit of the reader, the basic formula of the
PageRank algorithm is

\begin{eqnarray}
R'(u) &=& c \displaystyle\sum_{v \in B_u} \frac{R'(v)}{N_v} + cE(u)
\end{eqnarray}

Here $B_u$ is the set of pages that link to page $u$, $N_v$ is the number of
outgoing links of page $v$, $c$ is a normalization factor, and $E$ is the rank
source (source sink) vector.

The key feature of the formula that is useful in our approach is the source
sink vector, which allows us to tailor the rank assigned to each page toward a
specific goal. In our case the goal is the same as that of a world wide web
search engine like Google: to rank a list of popular web pages for a given
niche.
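The iteration behind the formula above can be sketched as follows. This is a
minimal illustration, not Google's implementation: the function name, the
adjacency-list representation, and the choice of $c = 0.85$ are our own
assumptions, and dangling pages (pages with no outgoing links) are ignored for
brevity.

```python
import numpy as np

def pagerank(links, E, c=0.85, iters=100):
    """Iterative PageRank with a tailorable rank source vector E.

    links[v] lists the pages that page v links to (so N_v = len(links[v]));
    E is the source sink vector that biases ranks toward a chosen niche.
    Assumes every page has at least one outgoing link.
    """
    n = len(links)
    E = np.asarray(E, dtype=float)
    R = np.full(n, 1.0 / n)          # start from a uniform rank vector
    for _ in range(iters):
        contrib = np.zeros(n)
        for v, outs in enumerate(links):
            for u in outs:           # v is a member of B_u for each u it links to
                contrib[u] += R[v] / len(outs)
        R = c * contrib + c * E      # R'(u) = c * sum_{v in B_u} R'(v)/N_v + c*E(u)
        R /= R.sum()                 # renormalize so the ranks sum to one
    return R
```

Choosing a non-uniform `E` (e.g. concentrated on a handful of seed pages from
the target niche) is what tailors the resulting ranks toward that niche.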

At the time of writing, we were unable to obtain access to Google's
implementation of the PageRank algorithm. We could not get specific PageRank
values for our search results, nor could we tweak the rank source vector to
target our search results toward a given niche of websites. In the absence of
such access, we resorted to Google's publicly available web API for retrieving
search results. However, the resulting search list should be considered biased,
because it is subject to Google's interpretation of how to convert a given
search term into the corresponding rank source vector.

\subsection{Naive Bayes Model}

Naive Bayes models \cite{Norvig2008} are a probabilistic method of
classification based on Bayes' theorem \cite{Stigler1982}. A naive Bayes model
is first trained on the individual attributes of a set of entities belonging to
a class. Once trained, it can classify a new entity by combining the
probabilities of occurrence of the entity's individual attributes among
previously encountered entities of each class; this combination of
probabilities is done using Bayes' theorem.

The classification formula, obtained by applying Bayes' theorem, is as follows:
\begin{eqnarray}
P(C|x_1,...,x_n) &=& \alpha P(C) \displaystyle\prod_i P(x_i|C)
\end{eqnarray}

The important thing to note in this formula is that the $P(x_i|C)$ terms are
multiplied, which models the naive Bayes assumption: the probability of
occurrence of each feature $x_i$ given a class $C$ is independent of the
probabilities of the other features. Here $\alpha$ is a normalization constant
that makes the class probabilities sum to one.
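The training and classification steps described above can be sketched as
follows. This is a minimal illustration under our own assumptions: the class
and method names are hypothetical, counts are smoothed with a Laplace constant
to avoid zero probabilities, and the product of $P(x_i|C)$ terms is computed in
log space to avoid numerical underflow.

```python
from collections import defaultdict
import math

class NaiveBayes:
    """Train per-class attribute counts, then classify a new entity by
    maximizing P(C) * prod_i P(x_i | C), per the naive Bayes formula."""

    def __init__(self, alpha=1.0):
        self.alpha = alpha                                   # Laplace smoothing
        self.class_counts = defaultdict(int)                 # entities per class
        self.feat_counts = defaultdict(lambda: defaultdict(int))
        self.vocab = set()                                   # all attributes seen

    def train(self, features, label):
        """Record the attributes of one entity of the given class."""
        self.class_counts[label] += 1
        for f in features:
            self.feat_counts[label][f] += 1
            self.vocab.add(f)

    def classify(self, features):
        """Return the class maximizing log P(C) + sum_i log P(x_i | C)."""
        total = sum(self.class_counts.values())
        best, best_lp = None, float("-inf")
        for c, count in self.class_counts.items():
            lp = math.log(count / total)                     # log P(C)
            denom = sum(self.feat_counts[c].values()) + self.alpha * len(self.vocab)
            for f in features:
                num = self.feat_counts[c][f] + self.alpha    # smoothed count
                lp += math.log(num / denom)                  # log P(x_i | C)
            if lp > best_lp:
                best, best_lp = c, lp
        return best
```

Note that the normalization constant $\alpha$ from the formula is unnecessary
here: it is the same for every class, so it does not affect which class
maximizes the score.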
