


\section{Main Results - TO BE REMOVED}\label{sec:main}

%\begin{theorem}\label{thm:main1}
%Under a model where the adversary is allowed to delete/add edges such that the diameter is always at most $D$, there is a distributed algorithm that requires only $O(1 + D/\log n)$ rounds per edge-failure such that it maintains a $c$-approximate densest subgraph for constant $c=$, .
%\end{theorem}
%
%\begin{theorem}\label{thm:main1}
%Under a model where the adversary is allowed to delete/add edges such that the diameter is always at most $D$, there is a distributed algorithm that requires only $O(1 + kD/\log n)$ rounds per edge-failure such that it maintains a $c$-approximate subgraph for the densest at least $k$ problem, for a constant $c=$.
%\end{theorem}
%
%\begin{corollary}
%\end{corollary}

\subsection{Algorithms}

- Keep multiple potential densest subgraphs and switch to another when the current one falls below the approximation threshold.
-- This requires that the density of the densest subgraph be at least $D\log n$, so that there is enough time to redo the computation.
-- Works for both deletions and additions (since the density of the densest subgraph is at least $D\log n$).
-- Redo the computation in the background (with lag).
-- Also works for the densest at least $k$ problem when $k \geq D\log n$.
-- State the condition as $kd \geq D\log n$.
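The slack argument in the notes above can be checked with a tiny sequential experiment (plain Python, not the distributed algorithm; the variable names are mine): deleting one edge from an $n$-node subgraph lowers its density by exactly $1/n$, so halving a subgraph of density $d$ takes $nd/2$ single-edge deletions.

```python
# Sanity check: deleting one edge from an n-node graph lowers its
# density m/n by exactly 1/n, so an adversary needs n*d/2 deletions
# to halve density d. With d >= D log n this leaves the algorithm
# Omega(D log n) rounds to recompute a replacement in the background.
n = 20
edges = {(i, j) for i in range(n) for j in range(i + 1, n)}  # clique K_20
d0 = len(edges) / n                # initial density: 190/20 = 9.5
deletions = 0
while len(edges) / n > d0 / 2:     # until the density has halved
    edges.pop()                    # adversary removes one edge per round
    deletions += 1
print(deletions)                   # prints 95, i.e. n * d0 / 2
```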

\subsection{Algorithms (Danupon's version)}


\paragraph{High level idea:} Danupon: There are two changes here. (1) In our previous discussion, we always deleted half of the nodes; that algorithm needs $O(D_{\max}\log^2 n)$ time in the static case. Here I will explain an algorithm that uses $O(D_{\max}\log n)$ time, based on the idea of deleting all nodes whose degrees are much smaller than the average. (2) We will deal with the dynamic network directly. This is because there are many things to handle in the dynamic network, and thus the idea of computing, at time $t$, the densest subgraph of the network $G_{t-D_{\max}\log n}$ does not really work; e.g., some edge may already be deleted when we are about to delete the nodes, and then a deleted node cannot notify some of its neighbors.
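Change (1) can be sketched sequentially as follows (a sketch under my own naming; `densest_subgraph_peel` and the parameter `eps` are not from the notes): in each round, delete every node whose degree is at most $2(1+\epsilon)$ times the current density. A counting argument shows each round removes at least an $\epsilon/(1+\epsilon)$ fraction of the nodes, giving $O(\log n)$ rounds, and the best intermediate graph is a $2(1+\epsilon)$-approximate densest subgraph.

```python
def densest_subgraph_peel(adj, eps=0.1):
    """Peel-below-average sketch: repeatedly delete all nodes whose
    degree is at most 2*(1+eps) times the current density m/|V|.
    Since the average degree is 2*m/|V|, each round removes at least
    an eps/(1+eps) fraction of the nodes, so there are O(log n) rounds.
    Returns the densest intermediate graph (a 2(1+eps)-approximation).
    adj: dict mapping each node to its set of neighbors."""
    nodes = set(adj)
    best, best_density = set(nodes), 0.0
    while nodes:
        deg = {v: len(adj[v] & nodes) for v in nodes}
        density = sum(deg.values()) / 2 / len(nodes)
        if density >= best_density:
            best_density, best = density, set(nodes)
        # Delete every node whose degree is below (1+eps) times average.
        nodes -= {v for v in nodes if deg[v] <= 2 * (1 + eps) * density}
    return best, best_density

# K4 on {0,1,2,3} with a pendant path 3-4-5: the K4 (density 1.5)
# beats the whole graph (density 8/6).
adj = {0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 3},
       3: {0, 1, 2, 4}, 4: {3, 5}, 5: {4}}
H, d = densest_subgraph_peel(adj)
assert H == {0, 1, 2, 3} and d == 1.5
```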


\begin{algorithm}[h!]
\caption{High level pseudocode}
\begin{algorithmic}[1]
\STATE Given a graph $G(V,E)$ with $|V| = n$.
\REQUIRE Each node of $G$ has a unique ID and a random number generator.
\STATE $G_0 = G$ \COMMENT{maybe unnecessary}
\FOR{each node $v\in G$}
\STATE \textsc{Init($v$)}
\ENDFOR
\STATE \textsc{ComputeDenseSubGraph($G_0$)}
\STATE Let $T$ be its running time.
\STATE Let $H^*$ be the densest subgraph computed by \textsc{ComputeDenseSubGraph($G_0$)}.
\STATE \textsc{MaintainDenseSubGraph($G_T, H^*$)}
\end{algorithmic}
\end{algorithm}

\begin{algorithm}[h!]
\caption{\textsc{Init($v$)}}
\begin{algorithmic}[1]
\STATE Set state variables, e.g., $v.inGraph = \mathrm{TRUE}$.
\end{algorithmic}
\end{algorithm}

\begin{algorithm}[h!]
\caption{ComputeDenseSubGraph($G$)}
\begin{algorithmic}[1]
\STATE Let $H^*=H_0 = G_0 = G$
\FOR{each node $v\in G$}
\STATE Pick u.a.r.\ random values $rv_0, rv_1, \ldots, rv_{deg(v,G)}$, each between $0$ and $1$, where $deg(v,G)$ is the degree of $v$ in $G$.
\STATE Set $rvnodes = rv_0$ and $rvedges = \min(rv_1, \ldots, rv_{deg(v,G)})$.
\ENDFOR
\FOR{$t = 0$ to $D\log n$}
\STATE Broadcast $rvnodes, rvedges$ on all edges.
\STATE Receive $w.rvnodes, w.rvedges$ from every neighbor $w$.
\STATE Set $rvnodes, rvedges$ to the minimum of its own values and the received values.
\IF{$t = kD$ for some integer $1 \le k \le \log n$}
\IF{$v.degree < m_v/2n$}
\STATE Delete $v$.
\ENDIF
\ENDIF
\ENDFOR
\end{algorithmic}
\end{algorithm}
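The broadcast/minimum loop in \textsc{ComputeDenseSubGraph} can be simulated round by round; after diameter-many rounds every node's value equals the global minimum (a sequential sketch; the name `flood_min` is mine):

```python
import random

def flood_min(adj, rounds):
    """Simulate the synchronous min-flooding loop: each node draws a
    uniform random value, then in every round replaces its value with
    the minimum over itself and its neighbors. After diameter-many
    rounds every node holds the global minimum."""
    cur = {v: random.random() for v in adj}
    initial = dict(cur)
    for _ in range(rounds):
        cur = {v: min([cur[v]] + [cur[w] for w in adj[v]]) for v in adj}
    return initial, cur

# Path on 6 nodes, diameter 5: five rounds suffice for convergence.
path = {v: {w for w in (v - 1, v + 1) if 0 <= w < 6} for v in range(6)}
initial, final = flood_min(path, rounds=5)
assert set(final.values()) == {min(initial.values())}
```

The minima are what make the estimates useful: the expected minimum of $m$ independent uniform values is $1/(m+1)$, so (with enough independent repetitions for concentration) the flooded minima give every node rough estimates of the node and edge counts.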

\begin{algorithm}[h!]
\caption{MaintainDenseSubGraph($G, H^*$)}
\begin{algorithmic}[1]
\FOR{$t = \log n$ to $T$}
\IF{$t$ is a multiple of $c \log n$}
\STATE \textsc{ComputeDenseSubGraph($G_t$)}
\ENDIF
\ENDFOR
\end{algorithmic}
\end{algorithm}


\begin{algorithm}
\begin{algorithmic}[1]
\STATE Let $H_0=G_t$ where $t$ is the current time step. Let $H^*$ be the currently maintained densest subgraph.
\STATE At any time step, every node $v$ maintains $m_v$, an estimate of $|E(G_t)|$, the actual number of edges currently in the network. We will guarantee that $|E(G_t)|-2D_{\max}\leq m_v$ for all $v$.
\STATE Every node whose degree is less than $m_v/2n$ deletes itself from $G$; i.e., (1) it notifies its neighbors of this, and (2) it sends some information to the central node so that the central node can notify everyone of the number of edges in the network. This takes $2D_{\max}$ steps, so when the process is finished every node still maintains $|E(G_{t+2D_{\max}})|-2D_{\max}\leq m_v$ at time $t+2D_{\max}$. This yields $H_1$.
\STATE The central node keeps track of the densities of $H^*$ and $H_1, H_2, \ldots$. When the density of some $H_i$ exceeds the density of $H^*$, it broadcasts to everyone that $H^*=H_i$.
\end{algorithmic}
\end{algorithm}


The main claim is that we will find a new $H_i$ to replace $H^*$ before the density of $H^*$ drops by more than half.


\subsection{Proof of Main Theorem}

\subsection{Proof for the densest at least $k$ problem}

{\bf Danupon:} This is what I wrote in the email; I will refine it later. We also have to adapt this to the new algorithm (where we delete nodes whose degree is less than the average) ...

Here is another attempt to prove that the algorithm I mentioned is an $O(1/(1-\alpha))$ approximation. I will just show that when you delete one node at a time you can guarantee a ratio of $4$. The $O(1/(1-\alpha))$ approximation then follows, since deleting an $\alpha$ fraction loses another $O(1/(1-\alpha))$ factor. This proof is a modification of Khuller and Saha's proof for their 2-approximation algorithm that Atish mentioned.

Let $G^*$ be an optimal solution and $d^*$ be the density of $G^*$. We keep extracting $H_1, H_2, \ldots$ (as Atish defined) until either (1) $H_1 \cup H_2 \cup \cdots \cup H_\ell$ contains half of the edges of $G^*$, or (2) $H_1 \cup H_2 \cup \cdots \cup H_\ell$ has more than $K$ nodes. This observation is in KS. We use KS's definitions: $D_i=H_1 \cup H_2 \cup \cdots \cup H_i$, and $G_i$ is $G$ with the edges of $D_i$ removed. (Attention: this differs from Atish's definition.)

\begin{observation} $density(H_i)\geq d^*/2$ for all $i=1, \ldots, \ell$.
\end{observation}
\begin{proof}
(Details in Khuller--Saha.) Since at least half of the edges of $G^*$ are in $G_{i-1}$ (that is why we did not stop), $E(G^*)\cap E(G_{i-1})$ is a candidate for $H_i$ with density at least $d^*/2$. Thus, $density(H_i)\geq d^*/2$.
\end{proof}

This observation implies that if we stop by the second rule, then we get a graph with more than $K$ nodes having density $\geq d^*/2$. If we stop by the first rule, then we clearly have half of the edges of $G^*$, and we are done once we pad some nodes to make the size equal to $K$. Thus, $D_\ell$ (possibly with some nodes padded) is a $2$-approximation.
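The extraction process can be illustrated on a toy instance (a sketch only: `densest_exact` brute-forces the densest subgraph, which is exponential and stands in for whatever exact or approximate subroutine computes $H_i$; both function names are mine):

```python
from itertools import combinations

def densest_exact(adj, nodes):
    """Brute-force densest subgraph over a small node set (exponential
    time; illustration only)."""
    best, best_d = None, -1.0
    nodes = list(nodes)
    for r in range(1, len(nodes) + 1):
        for S in map(set, combinations(nodes, r)):
            e = sum(1 for v in S for w in adj[v] if w in S) / 2
            if e / len(S) > best_d:
                best_d, best = e / len(S), S
    return best, best_d

def densest_at_least_k(adj, k):
    """Extract H_1, H_2, ... as above: after each step remove the edges
    inside the union D_i, and stop once the union has at least k nodes
    (padding with arbitrary nodes if the edges run out first)."""
    residual = {v: set(adj[v]) for v in adj}
    union = set()
    while len(union) < k and any(residual.values()):
        H, _ = densest_exact(residual, residual.keys())
        union |= H
        for v in union:
            residual[v] -= union   # drop edges with both endpoints in D_i
    for v in adj:                  # pad if we stopped early
        if len(union) >= k:
            break
        union.add(v)
    return union

# K4 on {0,1,2,3} bridged to a triangle {4,5,6}: H_1 is the K4,
# H_2 the triangle, and their union covers at least k = 6 nodes.
adj = {0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {0, 1, 2, 4},
       4: {3, 5, 6}, 5: {4, 6}, 6: {4, 5}}
D = densest_at_least_k(adj, 6)
assert {0, 1, 2, 3} <= D and len(D) >= 6
```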

Now, consider the algorithm I mentioned. Look at the first time we delete some node from $D_\ell$. (If this never happens, then we are done, since our subgraph of size $K$ contains $D_\ell$.) The following observation is similar to the one before, but crucial.

\begin{observation} For any $i$ and any node $v$ in $H_i$, the degree of $v$ in $H_i$ satisfies $degree(v)\geq density(H_i)$.
\end{observation}
\begin{proof} If $degree(v)<density(H_i)$ then $density(H_i-v) > density(H_i)$, contradicting the fact that $H_i$ is an optimal solution in $G_{i-1}$.
\end{proof}
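This observation is easy to verify numerically for the exact densest subgraph of a small random graph (brute force; illustration only):

```python
import itertools
import random

random.seed(1)
n = 8
# Random graph G(8, 0.4), stored as a symmetric adjacency dict.
edges = {(i, j) for i in range(n) for j in range(i + 1, n)
         if random.random() < 0.4}
adj = {v: {w for w in range(n) if tuple(sorted((v, w))) in edges}
       for v in range(n)}

def density(S):
    return sum(1 for v in S for w in adj[v] if w in S) / (2 * len(S))

# Exact densest subgraph by brute force over all non-empty node subsets.
best = max((set(S) for r in range(1, n + 1)
            for S in itertools.combinations(range(n), r)), key=density)

# Every node of an exact densest subgraph has internal degree >= its
# density: otherwise deleting that node would strictly raise the density.
assert all(sum(1 for w in adj[v] if w in best) >= density(best)
           for v in best)
```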


So, when we first delete some node from $D_\ell$, every remaining node has degree at least $density(H_i)\geq d^*/2$ (by the first observation). Thus, the density of the subgraph at that point is at least $d^*/4$, and we are done.

Note: The analysis is actually not tight (the factor can be improved from $4$ to $3$), but I guess it does not matter.




\subsection{Lower Bound}

-- An $\Omega(D)$ lower bound should be possible.
-- Think of two similarly dense graphs connected by a long path, with changes happening at both ends.

\subsection{Extensions}

- Possible directions:
-- Make the lower and upper bounds match.
-- Show that node failures make the problem very hard.
-- The guarantees we have so far depend on the density of the optimal solution and on the diameter $D$, both of which change over time; can we get dependence on more stable quantities?
---- Need to define $D_{\max}$ (the worst diameter over time).
-- What happens when multiple failures happen at once?
-- Multiple deletions/additions per round.
