\chapter{Conclusion} \label{chap:5}
\section{Summary of the Contributions of this Thesis}
In this thesis, a new approach to SS-PARSE reconstruction has been presented. Its major components are:
\begin{compactenum}[1.]
    \item An interpolated reconstruction method is proposed. It reconstructs higher-resolution images that capture higher-frequency features while keeping the reconstruction well constrained. A phantom experiment showed that the interpolated reconstruction clearly outperforms the non-interpolated reconstruction under large field inhomogeneity. This should also be useful for high-field MRI, where field inhomogeneity is more severe.
    \item The cubic convolution interpolator yields significantly less blurred $R_2^*$ maps than the cubic spline interpolator because its kernel passes through zero at neighboring sample locations.
    \item A polynomial approximation of the exponential time function was used to linearize the reconstruction. This approximation is convenient for the dynamic estimation of $R_2^*$ and the field map because the optimized polynomial coefficients need not be changed when $R_2^*$ and the field map are updated. With this approximation, the reconstruction is expressed as a set of Fourier transforms so that the FFT can be used.
    \item A quadratic approximation of the line search was implemented, greatly reducing its computational cost.
\end{compactenum}
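The zero-passing property behind item 2 can be illustrated directly. The sketch below compares the Keys cubic convolution kernel with the cubic B-spline basis function (assuming the thesis's ``cubic spline'' refers to the B-spline basis applied without prefiltering); the kernel formulas are the standard ones, not restated from this thesis.

```python
import numpy as np

def keys_cubic(x, a=-0.5):
    """Keys cubic convolution kernel (zero at all nonzero integers)."""
    x = np.abs(np.asarray(x, dtype=float))
    out = np.zeros_like(x)
    m1 = x <= 1
    m2 = (x > 1) & (x < 2)
    out[m1] = (a + 2) * x[m1]**3 - (a + 3) * x[m1]**2 + 1
    out[m2] = a * (x[m2]**3 - 5 * x[m2]**2 + 8 * x[m2] - 4)
    return out

def bspline3(x):
    """Cubic B-spline basis function (nonzero at +/-1)."""
    x = np.abs(np.asarray(x, dtype=float))
    out = np.zeros_like(x)
    m1 = x < 1
    m2 = (x >= 1) & (x < 2)
    out[m1] = 2/3 - x[m1]**2 + x[m1]**3 / 2
    out[m2] = (2 - x[m2])**3 / 6
    return out

pts = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
print(keys_cubic(pts))  # [0, 0, 1, 0, 0]: interpolating, so samples pass through
print(bspline3(pts))    # 1/6 at +/-1: neighboring samples are blended (blurring)
```

Because the Keys kernel is 1 at the origin and 0 at every other sample location, the interpolant reproduces the samples exactly, whereas the unfiltered B-spline basis mixes neighbors, which is consistent with the blurring difference reported for $R_2^*$.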
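The convenience claimed in item 3 can be sketched with a toy example. The thesis's exact basis and polynomial order are not restated here; this is a minimal illustration assuming a least-squares polynomial fit of $e^{-u}$ over the expected range of $u = R_2^*\,t$, showing that the fitted coefficients stay fixed when $R_2^*$ is updated and that each term separates into a power of $R_2^*$ times a power of $t$.

```python
import numpy as np

# Least-squares polynomial fit of exp(-u) over the assumed range of
# u = R2* * t. The coefficients depend only on this range, not on any
# particular R2*, so they need not be re-optimized when R2* is updated.
deg = 4
u = np.linspace(0.0, 1.0, 200)        # hypothetical range of R2* * t
a = np.polyfit(u, np.exp(-u), deg)    # fixed polynomial coefficients

# exp(-R2* t) ~ sum_k a_k (R2* t)^k = sum_k (a_k R2*^k) t^k:
# each separable t^k factor can be folded into FFT-based evaluation.
r, t = 2.0, 0.4                       # example R2* and time point (u = 0.8)
approx = np.polyval(a, r * t)
print(abs(approx - np.exp(-r * t)))   # small approximation error
```

The key point is the separability: updating $R_2^*$ only rescales the per-term weights $a_k R_2^{*\,k}$, while the $t^k$ factors, and hence the Fourier-transform structure, are unchanged.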
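The idea behind item 4 can be sketched as follows (a generic quadratic line search, not necessarily the thesis's exact variant): fit a parabola to the objective at a few trial steps along the search direction and jump directly to its vertex, instead of evaluating the objective many times in an iterative search.

```python
import numpy as np

def quadratic_line_search(f, x, d, alpha=1.0):
    """Fit a parabola to f at three trial steps along direction d and
    return the step at its vertex; falls back to the best trial step
    if the fit has no interior minimum."""
    s = np.array([0.0, alpha, 2.0 * alpha])
    y = np.array([f(x), f(x + alpha * d), f(x + 2.0 * alpha * d)])
    c2, c1, _ = np.polyfit(s, y, 2)     # y ~ c2 s^2 + c1 s + c0
    if c2 <= 0:
        return s[np.argmin(y)]
    return -c1 / (2.0 * c2)             # vertex of the fitted parabola

# For a quadratic objective the fit is exact, so one fit replaces the
# whole search.
f = lambda x: np.sum((x - 1.0) ** 2)
x0 = np.zeros(2)
d = np.ones(2)                          # descent direction toward the minimum
step = quadratic_line_search(f, x0, d)
print(step)                             # 1.0: lands exactly on the minimizer
```

Only three objective evaluations are needed per search, which is the source of the computational savings; for a nearly quadratic objective near the solution the vertex step is close to optimal.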

Motivated by the non-Cartesian trajectories used by SS-PARSE, we also improved the nonuniform FFT (NUFFT). The pre-weighting scaling factors and the interpolation coefficients are the two critical components of the NUFFT. Existing methods first optimize the interpolation and then compute the scaling factors from the optimized interpolation. We linearized this problem so that optimal scaling factors and interpolation coefficients are found simultaneously, which improved the accuracy of the NUFFT.
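The structure being improved here can be illustrated for a single off-grid frequency. The sketch below performs the conventional step described above, solving least squares for the interpolation coefficients with the scaling factors held fixed (uniform scaling is assumed for illustration); it shows where the two sets of unknowns enter the approximation, not the thesis's joint linearized solution.

```python
import numpy as np

# NUFFT approximation structure for one off-grid frequency w: the exact
# DFT column a_n = exp(-i w n) is approximated by scaling factors s_n
# times a short interpolation over J neighboring oversampled-grid
# exponentials. Here s is fixed and only the taps c are optimized, which
# is the two-step scheme the proposed joint method improves on.
N, K, J = 16, 32, 6                       # signal length, 2x grid, taps
n = np.arange(N)
w = 2 * np.pi * 3.3 / N                   # an off-grid target frequency
a = np.exp(-1j * w * n)                   # exact DFT column

s = np.ones(N)                            # assumed (uniform) scaling factors
k0 = int(round(w * K / (2 * np.pi)))      # nearest oversampled grid index
ks = k0 + np.arange(-(J // 2), J - J // 2)
B = s[:, None] * np.exp(-1j * 2 * np.pi * np.outer(n, ks) / K)

c, *_ = np.linalg.lstsq(B, a, rcond=None) # optimal taps for this fixed s
err = np.max(np.abs(B @ c - a))
print(err)                                # small residual with J = 6 taps
```

Because the error depends on both $s$ and $c$, fixing $s$ first (as above) is generally suboptimal; treating them as joint unknowns, as the linearized formulation does, can only reduce this residual.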

This framework was extended to parallel imaging. We validated the method with simulated data and also applied it to a human brain experiment.

\section{Future Work}
In Chapter \ref{chap:4}, we observed that regularization can further improve reconstruction performance. We chose the regularization parameter and kernel empirically; a mathematical method for optimizing them is still needed. Generalized cross-validation \cite{Reeves1992} is a possible candidate. We tentatively tried this method but did not obtain satisfactory results, so this work needs to be refined or a different method chosen.
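For reference, generalized cross-validation selects the regularization parameter $\lambda$ by minimizing a ratio of the residual to the effective degrees of freedom. In the standard formulation (not specific to this thesis), for a linear reconstruction $\hat{\mathbf{y}} = \mathbf{A}(\lambda)\mathbf{y}$ with $m$ data samples,
\begin{equation}
\mathrm{GCV}(\lambda) = \frac{\tfrac{1}{m}\left\| \left( \mathbf{I} - \mathbf{A}(\lambda) \right) \mathbf{y} \right\|^2}{\left[ \tfrac{1}{m} \operatorname{tr}\!\left( \mathbf{I} - \mathbf{A}(\lambda) \right) \right]^2},
\end{equation}
and $\lambda$ is chosen to minimize $\mathrm{GCV}(\lambda)$. Evaluating the trace term efficiently for a large, implicitly defined $\mathbf{A}(\lambda)$ is one practical difficulty with applying GCV to this reconstruction.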

The proposed methods were mathematically validated with simulated data; validation by human and animal experiments is still needed. The methods suggested in this dissertation reconstructed realistic images from such experiments, but we do not have a quantitative error analysis because of the lack of a ``gold standard''. With a gold standard from a human experiment, we would be able to refine some reconstruction parameters.

We tested the cubic convolution and cubic spline interpolations; other interpolators may be better suited to this problem. With interpolation, we have a continuous model instead of a discrete one, but because of the prohibitive computational cost we can only solve the problem at a higher discrete resolution. A possible direction for future work is a continuous reconstruction algorithm that is within the capability of today's computing technology; its core challenge remains how to deal with the problematic exponential time function.