% !TEX root = ipdps-main.tex
\subsection{Related Work}\label{sec:related}

The problem of finding sparse cuts on graphs has been studied extensively
\cite{BhattL84,BenczurK96,Karger00,AroraRV04,SpielmanT04,ManokaranNRS08,DasSarmaGP09}. 
Sparse cuts form an important tool for analyzing large-scale distributed networks such as the Internet and peer-to-peer networks, as well as large-scale graphs such as the web graph, online social communities, click graphs from search engine query logs, and VLSI circuits. Sparse cuts are useful in graph clustering and partitioning, among numerous other applications \cite{SpielmanT04,AndersenCL06}.


The second eigenvector of the transition matrix is an important quantity for analyzing many properties of a graph. A simple way to partition a graph is to order the nodes by increasing coordinate value in this eigenvector and take a prefix of the ordering; the resulting partition can be used to compute a sparse cut. This is a well-known approach, studied in \cite{LovaszS90,LovaszS93,SpielmanT04,AndersenCL06,DasSarmaGP09}, and it is the one we use in this paper. The second eigenvector technique has also been analyzed in \cite{Alon86,Boppana87,JerrumS88}.
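The eigenvector-sweep idea can be illustrated with the following sketch. It is not the algorithm of any cited paper; the dense-matrix setup and all names are illustrative assumptions. It computes the second eigenvector of the normalized adjacency matrix (whose eigenvectors correspond to those of the transition matrix), orders the vertices by their coordinates, and returns the prefix cut of minimum conductance:

```python
import numpy as np

def sweep_cut(adj):
    """Sweep-cut sketch (illustrative, not from the cited papers):
    order vertices by the second eigenvector and return the prefix
    cut of minimum conductance.  `adj` is a dense symmetric 0/1
    adjacency matrix of a connected graph."""
    deg = adj.sum(axis=1)
    # Normalized adjacency D^{-1/2} A D^{-1/2}; its spectrum matches
    # that of the random-walk transition matrix D^{-1} A.
    d_inv_sqrt = 1.0 / np.sqrt(deg)
    norm_adj = adj * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    _, eigvecs = np.linalg.eigh(norm_adj)          # eigenvalues ascending
    # Map the second eigenvector of N back to one of P = D^{-1} A.
    v2 = d_inv_sqrt * eigvecs[:, -2]
    order = np.argsort(v2)
    vol_total = deg.sum()
    best_phi, best_cut = np.inf, None
    in_S = np.zeros(len(deg), dtype=bool)
    vol_S = cut_edges = 0.0
    for i in order[:-1]:
        # Adding vertex i to S: its edges into S stop being cut edges,
        # its remaining edges become cut edges.
        cut_edges += deg[i] - 2 * adj[i, in_S].sum()
        in_S[i] = True
        vol_S += deg[i]
        phi = cut_edges / min(vol_S, vol_total - vol_S)
        if phi < best_phi:
            best_phi, best_cut = phi, in_S.copy()
    return best_phi, best_cut
```

On a "barbell" of two triangles joined by a single edge, the sweep recovers one triangle as the sparse side, with conductance $1/7$.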

Lov{\'a}sz and Simonovits \cite{LovaszS90,LovaszS93} were the first to show how random walks can be used to find sparse cuts. Specifically,
they show that random walks of length $O(1/\phi)$ can be used to compute a cut of sparsity at most $\tilde O(\sqrt{\phi})$ when the sparsest cut has conductance $\phi$.
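A minimal sketch of this random-walk approach, under the assumptions of a dense adjacency matrix and a lazy walk (all names are illustrative, not from \cite{LovaszS90,LovaszS93}): run a short walk from a seed vertex, order vertices by degree-normalized probability mass, and sweep the resulting ordering for the best prefix cut:

```python
import numpy as np

def random_walk_cut(adj, start, t):
    """Random-walk sweep sketch in the spirit of Lovasz-Simonovits
    (illustrative assumptions throughout).  `adj` is a dense symmetric
    0/1 adjacency matrix of a connected graph."""
    n = len(adj)
    deg = adj.sum(axis=1)
    # Lazy walk: stay put with probability 1/2, else move to a
    # uniformly random neighbour.
    P = 0.5 * np.eye(n) + 0.5 * adj / deg[:, None]
    p = np.zeros(n)
    p[start] = 1.0
    for _ in range(t):
        p = p @ P                       # one step of the walk
    order = np.argsort(-p / deg)        # descending normalized mass
    vol_total = deg.sum()
    best_phi, best_cut = np.inf, None
    in_S = np.zeros(n, dtype=bool)
    vol_S = cut_edges = 0.0
    for i in order[:-1]:
        # Incrementally maintain the number of edges crossing (S, V\S).
        cut_edges += deg[i] - 2 * adj[i, in_S].sum()
        in_S[i] = True
        vol_S += deg[i]
        phi = cut_edges / min(vol_S, vol_total - vol_S)
        if phi < best_phi:
            best_phi, best_cut = phi, in_S.copy()
    return best_phi, best_cut
```

Intuitively, a short walk started inside a low-conductance set leaves most of its probability mass there, so the normalized-mass ordering places that set at the front of the sweep.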
Spielman and Teng \cite{SpielmanT04} largely follow the approach of Lov{\'a}sz and Simonovits, but implement it more efficiently by sparsifying the graph. They propose a nearly linear-time algorithm for finding an approximately sparsest cut with approximate balance.
Andersen, Chung, and Lang \cite{AndersenCL06} proposed a local partitioning algorithm that uses the PageRank vector (instead of the second eigenvector) to find cuts
near a specified vertex as well as global cuts. The running time of their algorithm is proportional
to the size of the smaller side of the cut. Das Sarma, Gollapudi, and Panigrahy \cite{DasSarmaGP09} present an algorithm for finding sparse cuts in graph streams. Their algorithm requires sub-linear
space for a certain range of parameters, but provides a much weaker approximation to the sparsest cut than \cite{AndersenCL06,SpielmanT04}.
Arora, Rao, and Vazirani \cite{AroraRV04} provide an $O(\sqrt{\log n})$-approximation algorithm
using semi-definite programming techniques. Their algorithm achieves a good approximation ratio, but it is slower than algorithms based on spectral methods and
random walks. Kannan, Vempala, and Vetta \cite{KannanVV04} studied variants of the spectral algorithm for clustering and partitioning graphs.

Graph partitioning, or more generally clustering, is a well-studied optimization problem. Given an undirected graph and a conductance parameter $\phi$, the problem of finding a partition $(S, \bar{S})$ such that $\phi(S) \leq \phi$, or concluding that no such partition exists, is NP-complete (see \cite{LeightonR99,SimaS06}). Consequently, several approximation algorithms exist in the literature. Leighton and Rao \cite{LeightonR99} present an $O(\log n)$-approximation algorithm for the sparsest cut based on linear programming. Arora, Rao, and Vazirani \cite{AroraRV04} later improved this to $O(\sqrt{\log n})$ using semi-definite programming techniques; this remains the best known approximation ratio for the sparsest cut problem.
Further, several subsequent works obtain algorithms with similar approximation guarantees but better running times \cite{AroraHK04,KhandekarRV06,AroraK07,OrecchiaSVV08}. However, none of these results apply to the distributed computing model; to the best of our knowledge, our paper is the first to address sparse cut computation in the distributed setting.

The work of \cite{mihail} discusses spectral algorithms for enhancing the {\em topology awareness} of a network, e.g., by identifying and assigning weights to {\em critical} edges, namely those that cross sparse cuts.
They discuss centralized
algorithms with provable performance, and introduce
decentralized heuristics with no provable guarantees. These heuristics are
based on distributed solutions of convex programs and assign special weights to links
crossing, or directed towards, small cuts by minimizing
the second eigenvalue.
They note that obtaining a provably efficient decentralized algorithm is an important open problem.
Our algorithm is fully
decentralized and based on performing random walks, and is therefore more
amenable to dynamic and self-organizing networks.



