\documentclass{article}

\usepackage{fullpage}

\title{CS 510 Project Proposal}
\author{Daniel Moyer, Doug Traher, Archit Baweja}

\begin{document}
\maketitle

\section{Problem Statement and Motivation}
Over the years the World Wide Web has grown from a mere set of interconnected
web pages into a rich media delivery platform for all kinds of organizations. In
today's world, it is not enough for a website to simply deliver content to the
user; it must do so in a beautiful manner. Appearances matter, and websites are
no exception.

Appearances have become especially important with the advent of the Web 2.0
revolution, which, although focused primarily on the usability of websites, has
also started a trend toward fashionable-looking sites. It is not enough to just
get the content across; it needs to be delivered with flair.

It is thus vitally important to keep the appearance of our websites up to date.
To help website designers and developers keep abreast of the latest fashions in
website design, we propose a novel approach that combines several artificial
intelligence methods, including PageRank \cite{Page1999} and Bayesian filters
\cite{1051714}. We hope that the results of our experiments will give website
designers and developers a better understanding of which elements of website
design are common and up to date.

\section{Approach}
The approach we propose has several components. The first involves using the
PageRank algorithm \cite{Page1999} to find the most popular websites for a
given subject. For the purposes of our research we plan to experiment with
various niches of websites. Restricting the initial shortlist of websites to a
particular niche will be accomplished through the rank source vector used in
the PageRank algorithm \cite{Page1999}.
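As a rough illustration of how the rank source vector can confine the ranking
to a niche, the following Python sketch runs the standard PageRank power
iteration with a personalized teleport distribution. The link graph and the
seed page are made-up examples, not data from our experiments.

```python
# Minimal PageRank power iteration with a personalized "rank source"
# vector: teleportation goes only to the niche's seed pages, so the
# resulting ranks are biased toward that niche.

def pagerank(links, source, damping=0.85, iters=50):
    """links: dict page -> list of outgoing pages;
    source: dict page -> teleport probability (entries sum to 1)."""
    pages = list(links)
    rank = {p: source.get(p, 0.0) for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) * source.get(p, 0.0) for p in pages}
        for p, outs in links.items():
            if not outs:
                # Dangling page: redistribute its mass via the source vector.
                for q in pages:
                    new[q] += damping * rank[p] * source.get(q, 0.0)
            else:
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
        rank = new
    return rank

# Toy niche of three sites; "a" is the niche's seed page.
links = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
source = {"a": 1.0}
ranks = pagerank(links, source)
```

Because teleportation always returns to the seed, pages close to the niche
seed retain most of the rank mass.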

After obtaining this list we will apply various techniques for identifying
and extracting elements of good website design (see the section on Evaluation
Approach). Elements such as the load speed of a web page are straightforward to
extract. Elements concerning the design of the website will need to be
extracted from the page source using techniques discussed in \cite{956961},
\cite{1183654}, and \cite{1367749}. We will report on the difficulty or ease of
extracting values for the various elements in the final report.
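To show why an element like load speed is straightforward to extract, here is
a minimal Python sketch that times how long a page's HTML takes to fetch. A
realistic measurement would also cover images, stylesheets, and scripts; the
URL in the comment is only an example.

```python
# Sketch: time the fetch of a page's HTML as a crude load-speed measure.
import time
import urllib.request

def load_time(url, timeout=10):
    """Return (seconds elapsed, bytes received) for fetching url."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        body = resp.read()
    return time.monotonic() - start, len(body)

# Example: elapsed, size = load_time("https://example.com/")
```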

Finally, to interpret the data we obtain, we will aggregate information on how
frequently the various elements of good website design are present. We also
plan to investigate simple statistical analysis techniques such as Bayesian
filtering to determine which elements of good website design are most commonly
used together. The idea of using Bayesian filters comes from their extensive
research and application as estimation techniques in various fields, including
e-mail spam filtering and detecting and browsing events in unstructured
text \cite{564391}.
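As a first approximation of "which elements are commonly used together", the
following Python sketch estimates conditional probabilities of co-occurrence
from raw counts; the Bayesian-filter machinery we cite would replace these
simple frequency estimates. The observation data below is entirely made up
for illustration.

```python
# Sketch: estimate P(element b present | element a present) from a list of
# observed sites, each represented as the set of design elements it exhibits.
from collections import Counter
from itertools import combinations

observations = [
    {"fast_load", "consistent", "scales"},
    {"fast_load", "consistent"},
    {"fast_load", "scales", "cross_browser"},
    {"consistent", "cross_browser"},
]

single = Counter()
pair = Counter()
for site in observations:
    single.update(site)
    pair.update(frozenset(p) for p in combinations(sorted(site), 2))

def cond_prob(b, a):
    """P(b present | a present), estimated from the counts above."""
    return pair[frozenset((a, b))] / single[a]
```

For instance, with the toy data above, `cond_prob("consistent", "fast_load")`
is 2/3, since two of the three fast-loading sites are also consistent.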

Another benefit of this data would be to provide recommendations to a web
designer in a number of ways.
\begin{enumerate}
\item A web designer could use our results to check whether a website he or she
designs adheres to accepted values for the elements of good website design.
\item The results could be used as a plugin for a web design application, or
 by a novice web developer, to automatically generate templates for website
 design.
\end{enumerate}

\section{Related Works}

The most closely related article is a blog post about using PageRank to
determine the most influential people on LinkedIn \cite{Puffelens2008}. The
idea is that the person with the strongest networking skills is very
influential: people want to link with someone they look up to. Similarly, we
would apply PageRank to find the most popular web templates. A better-documented
journal article related to our topic uses PageRank to determine where people's
attention lies in the blogosphere \cite{Kirchhoff2007}. Sites with higher
PageRank are probably more popular, and in turn their content is most likely
popular. A novel use of PageRank in another domain is a project called
CodeRank, which determines popular code modules in large software systems
\cite{Neate2006}. There is also a review of PageRank-style algorithms for trust
computation \cite{Hussain2007}.

For background on web page templates, Gibson explains their evolution
\cite{1062763}. To gauge the impact of a web page based on its looks, Reed
analyzes political websites \cite{1358925}. For ideas on how to extract
template information, Wang's work on incremental web page template detection
will be useful \cite{1367749}, and Vieira's fast and robust method for web page
template detection and removal can be studied to help extract the actual
templates \cite{1183654}.

\section{Evaluation Approach}
To evaluate the results of our research we consulted various sources on what
are considered elements of good website design. If the results of our algorithm
are rich in websites that exhibit these elements, then we can say the algorithm
can determine whether a site possesses a decent design; obviously, the more of
these elements a site exhibits, the ``better'' the design. From our initial
research, we have identified the following elements of good website design that
we plan to investigate:

\begin{enumerate}
\item Site Navigation - By examining a website's internal links (that is,
links that lead to another part of the same site) to ensure a tight circular
path, we can confirm that each page can route the user back to the beginning
of the site (i.e., the home page).
\item Load Speed - How long the website (or the pages therein) takes to load.
By conducting small tests, we will determine whether the site is designed to
load within an optimal timeframe, a very useful aspect of web design in a world
where users are used to having everything they need in the blink of an eye.
\item Screen Resolution - A ``good'' web page scales properly to different
screen resolution settings, and we should test that the pages our project deems
well designed have this property.
\item Consistency - Pages throughout a website
should have a uniform look and feel in order to fit together properly.
\item Cross-Browser Compatibility - A website should have the same look across
all browsers, or at the very least the most popular ones. If its JavaScript is
broken in IE7 but works in Firefox, the website is still a failure; this is an
important point as well.
\end{enumerate}
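The Site Navigation criterion above can be checked mechanically: every page
reachable from the home page should be able to route the user back to it
through internal links. The Python sketch below does this with a breadth-first
search over a tiny, made-up internal-link graph.

```python
# Sketch of the Site Navigation check: a site's navigation is "tight" if
# every page reachable from the home page can also reach the home page.
from collections import deque

def reachable(links, start):
    """Set of pages reachable from start via internal links (BFS)."""
    seen, queue = {start}, deque([start])
    while queue:
        page = queue.popleft()
        for nxt in links.get(page, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def tight_navigation(links, home):
    """True if every page reachable from home can route back to home."""
    return all(home in reachable(links, page)
               for page in reachable(links, home))

site = {"home": ["about", "blog"], "about": ["home"], "blog": ["about"]}
```

With the toy graph above, `tight_navigation(site, "home")` holds, since both
``about`` and ``blog`` have a path back to the home page.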

From our experiments we hope to obtain values for all these elements that will
serve as good guidelines for web design.

\newpage
\section{Project Timeline}
As an initial goal, we plan to implement the following minimum features:
\begin{enumerate}
\item Since our last proposal, we have found no sign of a publicly available
API for accessing PageRank values from Google. This goal has thus been changed
to finalizing the use of one of the many freely available PageRank
implementations. (Nov 9)
\item Get acquainted with using Bayesian filters for classification.
(Nov 9)
\item Research ways to identify and extract various elements of website
design. (Nov 16)
\item Extract data on various elements of good website design. (Nov 16th)
\item Implement a tool that integrates implementations of these two algorithms
 to serve user requests for popular website designs and/or color schemes.
(Nov 26)
\item Aggregation of results for the final report. (Nov 30th) 
\item Final paper submission. (Dec 2)
\end{enumerate}

Additional work that can be researched if the results prove promising and time
permits:
\begin{enumerate}
\item Additional elements involved in the design of websites that can be
 identified and extracted.
\item Implementing a utility that generates new templates for web pages based
 on the user's starting preferences and results obtained from the process.
\end{enumerate}

\newpage
\bibliographystyle{plain}
\bibliography{refs}
\end{document}
