\section{Introduction}\label{Sec:Intro}

Finding maximum matchings in undirected graphs has long been of
interest. Consider a graph $G(V,E)$ with edge weights $w:E\rightarrow
\mathbf{R}^+$, without self-loops for simplicity. A set of edges
$M\subseteq E(G)$ is said to be a matching if no two edges in $M$
share an endpoint.
%
%
For any set of edges $E'$, let $w(E')$ denote the sum of the weights
of the edges in $E'$.
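For concreteness, the two definitions above can be sketched in a few lines of code (a minimal illustration with our own naming, not part of any algorithm in this paper; edges are vertex pairs and weights are stored in a dictionary):

```python
def is_matching(edges):
    """True iff no two edges in `edges` share an endpoint."""
    seen = set()
    for u, v in edges:
        if u in seen or v in seen:
            return False
        seen.update((u, v))
    return True

def total_weight(edges, w):
    """w(E'): the sum of the weights of the edges in E'."""
    return sum(w[e] for e in edges)
```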
%
%
A matching $M$ in $G$ is said to be a {\em maximum weight matching}
if $w(M)\geq w(M')$ for every matching $M'$ in $G$.

The problem of finding a maximum weight matching is a fundamental
problem in computer science. It has been studied extensively for
decades, and exact polynomial-time algorithms are known.
The fastest known algorithm, due to Gabow~\cite{gabow}, runs
in time $O(nm + n^2\log n)$. When dealing with massive graphs, even
this proves to be inefficient. Several faster approximation
algorithms have been considered. Preis~\cite{preis} gave a
2-approximation algorithm that runs in $O(m)$ time. This was
improved to a $(3/2 + \epsilon)$-approximation algorithm in two
papers: one by Drake and Hougardy~\cite{DH05} that runs in time
$O(m/\epsilon)$ and another by Pettie and Sanders~\cite{PS} that
runs in time $O(m\log {\frac{1}{\epsilon}})$.
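For intuition, the simplest member of this family is the classical greedy algorithm: scan the edges in non-increasing order of weight and keep each edge whose endpoints are still free. It is folklore that this yields a 2-approximation for maximum weight matching, although it spends $O(m\log m)$ time on sorting, whereas Preis's algorithm attains the same factor in linear time. A minimal sketch of the greedy (our own illustration, not Preis's algorithm):

```python
def greedy_matching(edges):
    """Greedy 2-approximation for maximum weight matching:
    consider edges (u, v, w) by non-increasing weight and keep
    an edge iff both of its endpoints are still unmatched."""
    matched = set()   # vertices covered by the matching so far
    matching = []
    for u, v, w in sorted(edges, key=lambda e: e[2], reverse=True):
        if u not in matched and v not in matched:
            matching.append((u, v, w))
            matched.update((u, v))
    return matching
```

On a triangle with weights 3, 2, 1 the greedy keeps only the heaviest edge, which here happens to be optimal; in general it is within a factor 2 of optimal.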

\subsection*{Streaming Model}

When dealing with massive graphs, such as online social communities
or other web-scale graphs, it is no longer reasonable to assume that
the input fits in main memory (RAM). The graph is normally stored on
disks or tapes, and in such a setting the read/write head seek times
dominate the running time.

In such settings, the natural assumption is that the input data is
presented as a sequential stream (in no particular order). In this
paper, we denote the first edge arriving in the stream by $e_1$, the
second by $e_2$, and so on up to $e_m$. Any algorithm processing this
stream is allowed a small working memory, typically logarithmic in
the size of the stream. This is called the streaming model, and it
has been studied extensively for the past several years.
Polylogarithmic space, however, proves to be highly inadequate for
graph problems.

To deal with graph problems in the streaming context,
Muthukrishnan~\cite{M05} proposed the semi-streaming model. The
algorithm has access to the edges of $G$ as a stream arriving in
arbitrary order, and its memory is restricted to $O(n\polylog{n})$
bits. This much space turns out to be necessary even for verifying
simple graph properties such as connectivity.

In this paper, we focus on the problem of finding maximum matchings
in graphs. This problem arises in many natural settings dealing with
large graphs (see, e.g., refinement of FEM nets~\cite{MM00},
multilevel partitioning of graphs~\cite{MPD00}, virtual screening of
protein databases, and~\cite{DH05} for further examples).
Further, with the advent of online markets leading to entire fields
of research such as ad auctions, matchings have recently attracted
even stronger interest. Commercial search engines with an
advertising component solve bipartite matching problems on a daily
basis, allocating bids to bidders in an online fashion (see, for
example,~\cite{GM08} and the references therein). The problem of
matchings in the streaming setting has been studied for the past
several years~\cite{FKMSZ05,McGregor05,Zelke08}. In the one-pass
streaming setting, using $\tilde{O}(n)$ space, several approximation
algorithms have been developed, improving the approximation
guarantee from 6 to 5.58 over the sequence of
papers~\cite{FKMSZ05,McGregor05,Zelke08}. McGregor~\cite{McGregor05}
also gives a multiple-pass $(2+\epsilon)$-approximation algorithm,
where the number of passes depends on $\epsilon$. However, in some
applications the input data must be processed immediately and cannot
be stored, so only a single pass is allowed. In the one-pass
semi-streaming setting, it is impossible to find an optimal
matching~\cite{FKMSZ05}. In this paper, we present a single-pass,
$\tilde{O}(n)$-space streaming algorithm that guarantees a
5.24-approximation to the maximum matching; this improves upon the
best previously known 5.58-approximation of Zelke~\cite{Zelke08}.
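To make the one-pass setting concrete, the basic rule underlying the 6-approximation of~\cite{FKMSZ05} can be sketched as follows (a simplified illustration with our own naming, and emphatically not the algorithm of this paper): an arriving edge evicts the matched edges it conflicts with only if it is sufficiently heavier than they are, so the stored state is at most $n/2$ edges.

```python
def one_pass_matching(stream, gamma=1.0):
    """One-pass semi-streaming rule in the spirit of FKMSZ05:
    maintain a matching M; when edge e = (u, v, w) arrives, let C
    be the (at most two) edges of M sharing an endpoint with e,
    and replace C by e only if w > (1 + gamma) * w(C)."""
    match_of = {}  # vertex -> the matching edge covering it
    for u, v, w in stream:
        conflicts = {match_of.get(u), match_of.get(v)} - {None}
        if w > (1 + gamma) * sum(e[2] for e in conflicts):
            for a, b, _ in conflicts:   # evict conflicting edges
                del match_of[a]
                del match_of[b]
            match_of[u] = match_of[v] = (u, v, w)
    return set(match_of.values())
```

With $\gamma = 1$ (keep a new edge only if it weighs more than twice its conflicting edges), this rule is the one analyzed in~\cite{FKMSZ05}.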

\textbf{Our Contribution and Techniques.} We present the first
algorithm that considers {\em global} improvement to the stored
matching, as the stream is received. In particular, our algorithm is
a natural globalization of Zelke's algorithm~\cite{Zelke08} with
several modifications built on top of it.
%
The main challenge of this approach is that the analysis becomes
more involved: the charging argument can no longer be carried out
locally. We deal with this difficulty in two ways. First, we adjust
the algorithm to ensure that it makes global improvements while
maintaining desired local properties. Second, we substantially
simplify our analysis by showing that some properties of the input
graph and the structure of the optimal solution can be assumed
without loss of generality.
%
Another technique that we use to analyze the algorithm's performance
is to consider what will happen in the future and decide how to
transfer the charge according to this information.
%
This analysis adds more insight to the problem and might lead to
further improvement.


