\documentclass{article}
\input{macro.tex}
\title{B matrix and mutual information}
\begin{document}
\maketitle
There are many ways to measure the dependence between two random variables. Mutual information is one commonly used measure; another is the Frobenius norm of the B matrix (the B-matrix F-norm).
Consider two discrete random variables $X$ and $Y$ with alphabets $\X$ and $\Y$. Their joint distribution can be described by an $\abs{\X} \times \abs{\Y}$ matrix with entries $P_{XY}(i,j)$, from which the marginal distributions $P_X(i)$ and $P_Y(j)$ are obtained by summing over rows and columns. We may assume $P_X(i) > 0$ and $P_Y(j) > 0$, since zero-probability symbols can be removed from the alphabets.
The centered B matrix is defined entrywise as
\[
B_{ij} = \frac{P_{XY}(i,j) - P_X(i)\,P_Y(j)}{\sqrt{P_X(i)\,P_Y(j)}},
\]
and the B-matrix F-norm is the Frobenius norm of $B$, denoted $\norm{B}_F$.
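For comparison, recall the standard definition of mutual information in this notation,
\[
I(X;Y) = \sum_{i,j} P_{XY}(i,j) \log \frac{P_{XY}(i,j)}{P_X(i)\,P_Y(j)},
\]
measured in nats when the logarithm is natural. The two measures are closely related near independence: $\norm{B}_F^2$ is exactly the $\chi^2$-divergence between $P_{XY}$ and the product distribution $P_X P_Y$, and a second-order expansion of the KL divergence gives $I(X;Y) \approx \frac{1}{2}\norm{B}_F^2$ whenever $P_{XY}$ is close to $P_X P_Y$.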

We give an example to illustrate the relationship between the two measures.
Suppose $X$ and $Y$ are two Bernoulli random variables whose joint distribution is
$\begin{bmatrix}
p & 0.6-p \\
0.55-p & p - 0.15
\end{bmatrix}$, where $0.15 \leq p \leq 0.55$ so that all four entries are nonnegative. Fig.~\ref{re} plots the two measures as functions of $p$.
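This family is simple enough that $\norm{B}_F$ admits a closed form; we include the short derivation to make the comparison concrete. The marginals are $P_X = (0.6,\, 0.4)$ and $P_Y = (0.55,\, 0.45)$ for every $p$, so each entry of $P_{XY}(i,j) - P_X(i)P_Y(j)$ equals $\pm(p - 0.33)$, and
\[
\norm{B}_F^2 = (p - 0.33)^2 \sum_{i,j} \frac{1}{P_X(i)\,P_Y(j)}
= (p - 0.33)^2 \left(\frac{1}{0.6} + \frac{1}{0.4}\right)\left(\frac{1}{0.55} + \frac{1}{0.45}\right).
\]
Hence $\norm{B}_F \approx 4.10\,\abs{p - 0.33}$ is exactly linear in $\abs{p - 0.33}$, and it vanishes at the independence point $p = 0.6 \times 0.55 = 0.33$, where $I(X;Y)$ vanishes as well.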
\begin{figure}
\centering
\includegraphics[width=6cm]{relation.eps}
\caption{Mutual information $I(X;Y)$ and B-matrix F-norm $\norm{B}_F$ as functions of $p$.}\label{re}
\end{figure}
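For reproducibility, the following is a minimal Python sketch of how a figure like Fig.~\ref{re} can be generated from the definitions above. It assumes mutual information is measured in nats and that the output is saved as \texttt{relation.eps}; the actual script behind the figure is not part of this note.
\begin{verbatim}
import numpy as np
import matplotlib.pyplot as plt

def measures(p):
    # Joint distribution of the 2x2 family in the text.
    P = np.array([[p, 0.6 - p],
                  [0.55 - p, p - 0.15]])
    Px = P.sum(axis=1)      # marginal of X: (0.6, 0.4)
    Py = P.sum(axis=0)      # marginal of Y: (0.55, 0.45)
    Q = np.outer(Px, Py)    # product of the marginals
    # Mutual information in nats; 0 log 0 terms are dropped.
    mask = P > 0
    mi = np.sum(P[mask] * np.log(P[mask] / Q[mask]))
    # Centered B matrix and its Frobenius norm.
    B = (P - Q) / np.sqrt(Q)
    return mi, np.linalg.norm(B, 'fro')

ps = np.linspace(0.15, 0.55, 401)
vals = np.array([measures(p) for p in ps])
plt.plot(ps, vals[:, 0], label='I(X;Y) (nats)')
plt.plot(ps, vals[:, 1], label='||B||_F')
plt.xlabel('p')
plt.legend()
plt.savefig('relation.eps')
\end{verbatim}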
\end{document}