\chapter{Statistical Analysis Methods}
\label{app:stat_methods}

\section{Entropy and Mutual Information Estimation}

\subsection{Plug-in estimators}
Empirical distributions $\hat{p}(x)$ yield the plug-in entropy $\hat{H}(X) = -\sum_x \hat{p}(x) \log \hat{p}(x)$. Bias corrections such as Miller--Madow, which adds $(K-1)/(2n)$ for $K$ observed symbols and $n$ samples, improve finite-sample performance.
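As a minimal sketch (assuming discrete samples and natural-log entropy; the function name is illustrative), the plug-in estimate with the Miller--Madow correction might look like:

```python
import math
from collections import Counter

def plugin_entropy(samples, miller_madow=True):
    """Plug-in entropy (nats) with optional Miller--Madow bias correction."""
    n = len(samples)
    counts = Counter(samples)
    h = -sum((c / n) * math.log(c / n) for c in counts.values())
    if miller_madow:
        # Miller--Madow: add (K - 1) / (2n), K = number of observed symbols
        h += (len(counts) - 1) / (2 * n)
    return h
```

For a perfectly balanced binary sample the uncorrected estimate is exactly $\log 2$; the correction adds a small positive term that compensates the downward bias of the plug-in estimator.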

\subsection{Kozachenko--Leonenko estimator}
For continuous variables, use the distance $\rho_i$ from each sample $x_i$ to its $k$-th nearest neighbor:
\begin{equation}
\hat{H}(X) = \psi(n) - \psi(k) + \log c_d + \frac{d}{n} \sum_{i=1}^n \log \rho_i,
\end{equation}
where $\psi$ is the digamma function and $c_d$ is the volume of the $d$-dimensional unit ball.
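A sketch for the one-dimensional case ($d = 1$, $c_1 = 2$), using a small hand-rolled digamma approximation to stay dependency-free; `kl_entropy` and its sorted-window shortcut for nearest-neighbor search are illustrative, not a reference implementation:

```python
import math

def digamma(x):
    """Digamma via recurrence plus an asymptotic series (valid for x > 0)."""
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    f = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - f * (1.0 / 12 - f * (1.0 / 120 - f / 252))

def kl_entropy(xs, k=3):
    """Kozachenko--Leonenko entropy estimate (nats) for 1-D samples."""
    n = len(xs)
    s = sorted(xs)
    h = digamma(n) - digamma(k) + math.log(2.0)  # c_1 = 2 (unit ball in 1-D)
    for i, x in enumerate(s):
        # in a sorted list, the k nearest neighbours lie within k positions
        window = sorted(abs(x - s[j])
                        for j in range(max(0, i - k), min(n, i + k + 1)) if j != i)
        h += math.log(window[k - 1]) / n  # distance to the k-th nearest neighbour
    return h
```

On samples drawn uniformly from $[0, 1]$ (true entropy $0$ nats) the estimate should be close to zero for moderate $n$; higher dimensions require a genuine $k$-NN search structure.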

\subsection{Mutual information}
Compute via entropy estimates, $\hat{I}(X; Y) = \hat{H}(X) + \hat{H}(Y) - \hat{H}(X, Y)$, or directly with the Kraskov--St\"ogbauer--Grassberger (KSG) $k$-NN estimator.
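The entropy-combination route can be sketched for paired discrete samples (the name `plugin_mi` is illustrative):

```python
import math
from collections import Counter

def plugin_mi(xs, ys):
    """Plug-in mutual information (nats) for paired discrete samples."""
    n = len(xs)
    def ent(samples):
        counts = Counter(samples)
        return -sum((c / n) * math.log(c / n) for c in counts.values())
    # I(X; Y) = H(X) + H(Y) - H(X, Y)
    return ent(xs) + ent(ys) - ent(list(zip(xs, ys)))
```

Identical binary sequences give $\hat{I} = \log 2$; independent ones give approximately zero (exactly zero only when the empirical joint factorizes).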

\section{Transfer Entropy}

Transfer entropy is a conditional mutual information, $T_{Y \to X} = I(X_{t+1}; Y_t \mid X_t)$ for unit history length. For discrete processes, employ plug-in counts with Bayesian smoothing; for continuous processes, adopt ensemble estimators (e.g., the JIDT toolkit).
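For the discrete case with unit history length, a plug-in sketch (without the Bayesian smoothing mentioned above; `transfer_entropy` is an assumed name):

```python
import math
from collections import Counter

def transfer_entropy(x, y):
    """Plug-in transfer entropy TE_{Y->X} (nats), history length 1:
    sum_{x', x, y} p(x', x, y) log[ p(x' | x, y) / p(x' | x) ]."""
    triples = list(zip(x[1:], x[:-1], y[:-1]))   # (x_next, x_prev, y_prev)
    n = len(triples)
    c_xxy = Counter(triples)
    c_xy = Counter(zip(x[:-1], y[:-1]))          # (x_prev, y_prev)
    c_xx = Counter(zip(x[1:], x[:-1]))           # (x_next, x_prev)
    c_x = Counter(x[:-1])
    te = 0.0
    for (xn, xp, yp), c in c_xxy.items():
        p_cond_xy = c / c_xy[(xp, yp)]           # p(x_next | x_prev, y_prev)
        p_cond_x = c_xx[(xn, xp)] / c_x[xp]      # p(x_next | x_prev)
        te += (c / n) * math.log(p_cond_xy / p_cond_x)
    return te
```

When $X$ simply copies $Y$ with a one-step lag, the estimate approaches $H(Y)$ ($\log 2$ for a fair binary source); for uncoupled series it stays near zero, apart from the small positive plug-in bias.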

\section{Effective Information}

Approximate interventions by resampling the source variable from its maximum-entropy (uniform) distribution. Evaluate the response distributions empirically and compute mutual information using the estimators above.
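Assuming a discrete system whose interventional behavior is summarized by a transition matrix `tpm[x][y]` $= p(y \mid \mathrm{do}(x))$ (an assumed representation for this sketch), effective information reduces to the mutual information between a uniformly resampled source and its effect:

```python
import math

def effective_information(tpm):
    """EI (nats): mutual information across tpm[x][y] = p(y | do(x))
    when the source x is resampled from the uniform distribution."""
    n = len(tpm)
    # effect distribution under the uniform intervention
    p_y = [sum(tpm[x][y] for x in range(n)) / n for y in range(n)]
    ei = 0.0
    for x in range(n):
        for y in range(n):
            if tpm[x][y] > 0.0:
                ei += (tpm[x][y] / n) * math.log(tpm[x][y] / p_y[y])
    return ei
```

A deterministic permutation over $n$ states attains the maximum $\log n$; a transition matrix with identical rows carries no effective information.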

\section{Hypothesis Testing}

\begin{itemize}
    \item \textbf{$t$-tests and ANOVA}: For normally distributed metrics; apply Welch corrections for unequal variances.
    \item \textbf{Non-parametric tests}: Mann--Whitney U, Kruskal--Wallis when distributional assumptions fail.
    \item \textbf{Multiple testing control}: Benjamini--Hochberg procedure for false discovery rate control.
\end{itemize}
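The Benjamini--Hochberg step-up procedure, at least, is simple enough to sketch directly (function name assumed):

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini--Hochberg FDR control: one reject/accept flag per p-value."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k_max = 0
    for rank, i in enumerate(order, start=1):
        # step-up: find the largest rank k with p_(k) <= k * alpha / m
        if pvals[i] <= rank * alpha / m:
            k_max = rank
    reject = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            reject[i] = True
    return reject
```

Note that all hypotheses ranked at or below $k_{\max}$ are rejected, even if an individual one misses its own threshold; that step-up behavior is what distinguishes BH from a per-test cutoff.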

\section{Bootstrap Procedures}

Resample runs or time blocks to estimate confidence intervals. Use block bootstrapping when the series exhibits temporal autocorrelation, so that within-block dependence is preserved in each resample.
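A moving-block bootstrap sketch for a percentile confidence interval (block length and replicate count are illustrative defaults, not recommendations):

```python
import random

def block_bootstrap_ci(series, stat, block_len=20, n_boot=1000, alpha=0.05, seed=0):
    """Moving-block bootstrap percentile CI for stat(series)."""
    rng = random.Random(seed)
    n = len(series)
    starts = range(n - block_len + 1)
    reps = []
    for _ in range(n_boot):
        resampled = []
        while len(resampled) < n:
            s = rng.choice(starts)               # pick a random block start
            resampled.extend(series[s:s + block_len])
        reps.append(stat(resampled[:n]))         # trim to original length
    reps.sort()
    return reps[int(alpha / 2 * n_boot)], reps[int((1 - alpha / 2) * n_boot) - 1]
```

The block length should exceed the autocorrelation time of the series; too-short blocks underestimate the variance of the statistic.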

\section{Finite-Size Scaling}

To collapse order-parameter curves:
\begin{enumerate}
    \item Estimate critical temperature $T_c(L)$ for each lattice size $L$.
    \item Rescale axes: $m L^{\beta/\nu}$ vs. $(T - T_c) L^{1/\nu}$.
    \item Adjust the exponents $\beta$ and $\nu$ until the curves collapse onto a single master curve; agreement with known exponent values supports the expected universality class.
\end{enumerate}
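The rescaling step is a pure coordinate transform; the check below uses synthetic data generated from an exact scaling form $m = L^{-\beta/\nu} f\!\left((T - T_c) L^{1/\nu}\right)$, with exponent values chosen for illustration:

```python
def rescale(T, m, L, Tc, beta, nu):
    """Scaling axes: x = (T - Tc) * L^(1/nu), y = m * L^(beta/nu)."""
    x = [(t - Tc) * L ** (1.0 / nu) for t in T]
    y = [mi * L ** (beta / nu) for mi in m]
    return x, y
```

For real data the collapse is only approximate, and the quality of overlap (e.g., summed squared deviation between interpolated curves) is what the exponent adjustment minimizes.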

\section{Sensitivity Analysis}

Compute Sobol sensitivity indices using quasi-random sampling. For each parameter $\theta_i$:
\begin{equation}
S_i = \frac{\Var_{\theta_i}(\E_{\theta_{-i}}[Y | \theta_i])}{\Var(Y)}.
\end{equation}
Stratified designs (Latin hypercube) and adaptive strategies (Bayesian optimization) concentrate samples where the output variance is largest.
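A pick-freeze Monte Carlo sketch of the first-order index (a Saltelli-style estimator, assuming i.i.d.\ parameters uniform on $[0,1]$; quasi-random sequences would replace `random` in practice):

```python
import random

def sobol_first_order(model, n_params, n_samples=20000, seed=0):
    """Pick-freeze Monte Carlo estimate of the first-order Sobol indices S_i."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(n_params)] for _ in range(n_samples)]
    B = [[rng.random() for _ in range(n_params)] for _ in range(n_samples)]
    yA = [model(a) for a in A]
    yB = [model(b) for b in B]
    mean = sum(yA) / n_samples
    var = sum((y - mean) ** 2 for y in yA) / n_samples
    S = []
    for i in range(n_params):
        # A with column i replaced by B's column i ("pick-freeze")
        yAB = [model(a[:i] + [b[i]] + a[i + 1:]) for a, b in zip(A, B)]
        num = sum(yb * (yab - ya) for yb, yab, ya in zip(yB, yAB, yA)) / n_samples
        S.append(num / var)
    return S
```

For the linear test model $Y = \theta_1 + 2\theta_2$ the analytic indices are $S_1 = 0.2$ and $S_2 = 0.8$, which the Monte Carlo estimate should recover to within sampling error.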

