\subsection{Sufficient condition: Theorem~\ref{thm:sufficient.symmetric}}
\label{sec:sym_sufficient.tex}

Let us consider a probabilistic construction of a bipartite graph
$G=(V,W,E)$ where, given $n_1, n_2, \ldots, n_r$ such that $\alpha_i =
\frac{n_i}{n/k} \in [n^{-1/100}, n^{1/100}]$, we place, for each $i$, an
independently drawn random copy $G_i$ of $K_{n_i,n_i}$ between $V$ and
$W$. In other words, $G$ is the union of $G_1,G_2,\ldots,G_r$, where
$G_i = (V_i,W_i, E_i=V_i \times W_i)$ and $V_i$, $W_i$ are uniformly
chosen random $n_i$-element subsets of $V$ and $W$ respectively. Fix a
potential independent set $(S,T)$ with $S \subseteq V$, $T \subseteq W$
and $|S| = |T| = k$. Then, as shown below,
\begin{equation}
\label{eq:basic}
\Pr[E(G_i) \cap S \times T = \emptyset] \leq 1 - (1 - \exp(-\alpha_i))^2. 
\end{equation}

\noindent Thus, since the graphs $G_1, G_2, \ldots, G_r$ are chosen
independently, the probability that $(S,T)$ is independent in $G$ is
\[ p=\Pr[(S,T) \mbox{ is independent in } G] \leq \prod_{i=1}^r (1- (1 - \exp(-\alpha_i))^2).\]
By the union bound, if $p {n \choose k}^2 < 1$, then there is a
bipartite graph, built by putting together one copy each of
$K_{n_i,n_i}$, that avoids all independent sets of size $k \times
k$. The interesting
aspect of this calculation is in the form of the expression $p_i=1- (1 -
\exp(-\alpha_i))^2$.  We will show below that
\begin{equation}
\label{eq:pi}
p_i \leq \left\{ \begin{array}{l l}
                        \exp\left(-\frac{\alpha_i^2}{3}\right) & \mbox{if $\alpha_i \leq 1$} \\
                        \exp(- (1-\ln 2) \alpha_i) & \mbox{if $\alpha_i > 1$}
                       \end{array} \right. 
\end{equation}

\noindent This immediately gives us our first result, the sufficient
condition stated as Theorem~\ref{thm:sufficient.symmetric} in the
introduction.
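As a numerical illustration of the union-bound calculation above, the
condition $p {n \choose k}^2 < 1$ can be evaluated directly for a given
choice of $\alpha_1, \ldots, \alpha_r$. The following Python sketch (the
helper name and the parameter values are illustrative, not taken from
this paper) works in log space to avoid underflow.

```python
from math import comb, exp, log

def construction_succeeds(n, k, alphas):
    """Check the union-bound condition p * C(n, k)^2 < 1 in log space,
    where p = prod_i (1 - (1 - exp(-alpha_i))^2)."""
    log_p = sum(log(1 - (1 - exp(-a)) ** 2) for a in alphas)
    return log_p + 2 * log(comb(n, k)) < 0

# Illustrative parameters: with n = 100 and k = 30, twenty parts with
# alpha_i = 2 are not enough to satisfy the bound, but a hundred are.
print(construction_succeeds(100, 30, [2.0] * 20))   # False
print(construction_succeeds(100, 30, [2.0] * 100))  # True
```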

\paragraph{Proof of (\ref{eq:basic}):} Recall that $G_i=(V_i, W_i, V_i \times W_i)$, 
where $V_i$ and $W_i$ are uniformly chosen random subsets of $V$ and
$W$ of size $n_i$ each.
\[ \Pr[E(G_i) \cap S \times T = \emptyset] = \Pr[V_i \cap S = \emptyset \vee W_i \cap T = \emptyset]
            = 1 - (1-\Pr[V_i \cap S = \emptyset])(1-\Pr[W_i \cap T = \emptyset]),\]
where the last equality uses the fact that $V_i$ and $W_i$ are drawn
independently.
Then, (\ref{eq:basic}) follows from this because
\begin{eqnarray*}
\Pr[V_i \cap S = \emptyset],  \Pr[W_i \cap T = \emptyset] 
                            & = & {n-k \choose n_i}{n \choose n_i}^{-1} \\
                            & \leq & \left(\frac{n-k}{n}\right)^{n_i} \\
                            & \leq & \exp\left(-\frac{kn_i}{n}\right) \\
                            & = & \exp(-\alpha_i).
\end{eqnarray*}
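The chain of inequalities above can also be checked numerically; the
following Python sketch (with illustrative values of $n$, $k$, $n_i$,
not taken from this paper) verifies that ${n-k \choose n_i}{n \choose
n_i}^{-1} \leq \left(\frac{n-k}{n}\right)^{n_i} \leq \exp(-\alpha_i)$.

```python
from math import comb, exp

def miss_probability(n, k, n_i):
    """Exact probability that a uniform n_i-subset of an n-set
    avoids a fixed k-element subset: C(n-k, n_i) / C(n, n_i)."""
    return comb(n - k, n_i) / comb(n, n_i)

# Sanity check of C(n-k,n_i)/C(n,n_i) <= ((n-k)/n)^n_i <= exp(-k*n_i/n)
# on a few illustrative parameter triples.
for n, k, n_i in [(1000, 30, 50), (500, 10, 100), (200, 5, 5)]:
    alpha = k * n_i / n
    assert miss_probability(n, k, n_i) <= ((n - k) / n) ** n_i <= exp(-alpha)
```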

\paragraph{Proof of (\ref{eq:pi}):} We have
\[ p_i = 1 - (1 - \exp(-\alpha_i))^2 = \exp(-\alpha_i)(2 - \exp(-\alpha_i)).\]
If $\alpha_i \leq 1$, then, using $\exp(-\alpha_i) \geq 1 - \alpha_i +
\frac{\alpha_i^2}{2} - \frac{\alpha_i^3}{6}$, we have
\[ 2 - \exp(-\alpha_i) \leq 1 + \alpha_i -\frac{\alpha_i^2}{2} + \frac{\alpha_i^3}{6}
                  \leq \exp\left(\alpha_i - \frac{\alpha_i^2}{3}\right),\]
where the second inequality follows from $1 + x \leq \exp(x)$ and
$\frac{\alpha_i^3}{6} \leq \frac{\alpha_i^2}{6}$ for $\alpha_i \leq 1$.
Thus,
\[ p_i = \exp(-\alpha_i)(2 - \exp(-\alpha_i)) \leq \exp\left(-\frac{\alpha_i^2}{3}\right).\]
If $\alpha_i > 1$, then we have
\[ p_i = \exp(-\alpha_i)(2 - \exp(-\alpha_i)) \leq 2 \exp(-\alpha_i) = \exp(-\alpha_i + \ln 2)
\leq \exp(- (1 -\ln 2)\alpha_i),\]
where the last inequality holds because $\ln 2 \leq \alpha_i \ln 2$ for
$\alpha_i > 1$.
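The two-case bound (\ref{eq:pi}) can be verified numerically as well;
the following Python sketch (illustrative, not part of the proof)
checks it on a fine grid of values of $\alpha_i$.

```python
from math import exp, log

def p_i(alpha):
    """p_i = 1 - (1 - exp(-alpha))^2 = exp(-alpha) * (2 - exp(-alpha))."""
    return exp(-alpha) * (2 - exp(-alpha))

def bound(alpha):
    """The two-case upper bound on p_i from the proof above."""
    if alpha <= 1:
        return exp(-alpha ** 2 / 3)
    return exp(-(1 - log(2)) * alpha)

# Check p_i(alpha) <= bound(alpha) on a grid of alpha in (0, 10].
for j in range(1, 1001):
    a = j / 100
    assert p_i(a) <= bound(a)
```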

We might ask whether this sufficient condition is also necessary. As
noted in the introduction, the K\H{o}v\'{a}ri, S\'{o}s and Tur\'{a}n
bound (Inequality~\ref{eq:kst}) explains the first term in the above
sufficient condition, and the Hansel bound
(Inequality~\ref{eq:hansel}) explains the second term. We thus have
explanations for both terms in the LHS of the sufficient condition,
using two classical theorems of graph theory. However, neither of them
implies in full generality that the sufficient condition derived above
is necessary.
