{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# GPflow manual"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "This document can be used to get familiarised with GPflow. We've split up the material in four different categories: basics, understanding, advanced needs and tailored models.  We have also provided a [flow diagram](GPflows.png) to guide users to the relevant parts of GPflow for their specific problem.\n",
    "\n",
    "## Basics\n",
    "\n",
    "The basics notebooks cover the elementary uses of GPflow, where we show how to use GPflow for your basic datasets with existing models.\n",
    "\n",
    "  - In [regression.ipynb](basics/regression.ipynb) and [classification.ipynb](basics/classification.ipynb) we show how to use GPflow to fit simple regression and classification models (Rasmussen and Williams, 2006)\n",
    "  - In [gplvm.ipynb](basics/GPLVM.ipynb) we cover the unsupervised case, and showcase GPflow's Bayesian GPLVM implementation (Titsias and Lawrence, 2010).\n",
    "\n",
    "In each of these notebooks we go over the data format, model setup, model optimisation and prediction options.\n",
    "\n",
    "## Understanding\n",
    "\n",
    "This section covers the building blocks of GPflow from an implementation perspective, and shows how the different modules interact as a whole.\n",
    "<!-- - [Architecture](understanding/architecture.ipynb)  **[TODO]** -->\n",
    "<!--  - [Utilities](understanding/utilities.ipynb): expectations, multi-output, conditionals, Kullback-Leibler divergences (KL), log-densities, features and quadrature  **[TODO]** -->\n",
    "  - [Manipulating models](understanding/models.ipynb)\n",
    "  - [Handling TensorFlow graph and sessions](understanding/tf_graphs_and_sessions.ipynb).\n",
    "  - [Tips and Tricks](tips_and_tricks.ipynb): a more elaborate list of how GPflow works together with TensorFlow.\n",
    "\n",
    "    \n",
    "## Advanced Needs\n",
    "\n",
    "GPflow also allows for more complex features and models:\n",
    "\n",
    "*Models:*\n",
    "  - [MCMC](advanced/mcmc.ipynb) instead of variational inference: Hamiltonian Monte Carlo to sample the posterior GP and hyperparameters.\n",
    "  - [Ordinal regression](advanced/ordinal_regression.ipynb) shows how to use GPflow for dealing with ordinal variables.\n",
    "  - [Varying noise for different data points](advanced/varying_noise.ipynb) using a custom likelihood and using the SwitchedLikelihood.\n",
    "  - [Multi-class classification](advanced/multiclass_classification.ipynb): classification in the non-binary case.\n",
    "  - [Multi-output with coregionalisation](advanced/coregionalisation.ipynb), when not all outputs are observed at every data point.\n",
    "  - [Multi-outputs with SVGPs](advanced/multioutput.ipynb), more efficient when all outputs are observed at all data points.\n",
    "  - When you're dealing with a large datasets (over 1k), you want to resort to Sparse methods. In [GPs for big data](advanced/gps_for_big_data.ipynb) we show how to use GPflow's Sparse Variational GP (SVGP) model (Hensman et al., 2013; 2015)\n",
    "  - [Manipulating kernels](advanced/kernels.ipynb) shows what covariances are included in the library, and how they can be combined to create new ones.\n",
    "\n",
    "*Features:*\n",
    "  - [Natural gradients](advanced/natural_gradients.ipynb) for optimising the variational approximate posterior's parameters.\n",
    "<!--  - [optimisers](advanced/optimisation.ipynb)  **[TODO]** -->\n",
    "  - [Settings](advanced/settings.ipynb): Adjust jitter (for inversion or Cholesky errors), floating point precision, parallelism, and more.\n",
    "  - [Monitoring parameter optimisations](advanced/monitoring.ipynb): Sending things to TensorBoard, (re)storing checkpoints, and more.\n",
    "\n",
    "## Tailored models\n",
    "\n",
    "In this section, we show how GPflow's utilities and codebase can be used to build new probabilistic models.\n",
    "These can be seen as complete examples.\n",
    "  - [Kernel design](tailor/kernel_design.ipynb) shows how to implement a covariance function that is not available by default in GPflow. For this example, we look at the Brownian motion covariance.\n",
    "<!--  - [likelihood design](tailor/likelihood_design.ipynb) **[TODO]** -->\n",
    "<!--  - [Latent variable models](tailor/models_with_latent_variables.ipynb) **[TODO]** -->\n",
    " <!-- - [Updating models with new data](tailor/updating_models_with_new_data.ipynb) **[TODO]** -->\n",
    "  - Two different ways of how to [combine tensorflow neural networks with GPflow models](tailor/gp_nn.ipynb).\n",
    "  - [External mean functions](tailor/external-mean-function.ipynb): here is how to use a neural network as mean function.\n",
    "  - [Mixture density network](tailor/mixture_density_network.ipynb): we show how GPflow's utilities make it easy to build other, non-GP probabilistic models.\n",
    "    \n",
    "\n",
    "## Theoretical notes\n",
    "\n",
    "The following notebooks relate to the theory of Gaussian processes and approximations. These are not required reading for using GPflow, but are included for those interested in theoretical underpinning and technical details.\n",
    "  - [Derivation of VGP equations](theory/vgp_notes.ipynb).\n",
    "  - [Derivation of SGPR equations](theory/SGPR_notes.ipynb).\n",
    "  - [FITC vs VFE](theory/FITCvsVFE.ipynb): why we like the Variational Free Energy objective for our sparse approximations.\n",
    "  - A ['Sanity check' notebook](theory/Sanity_check.ipynb) that demonstrates the overlapping behaviour of many of the GPflow model classes in special cases (specifically, with a Gaussian likelihood and, for sparse approximations, inducing points fixed to the data points).\n",
    "\n",
    "### References\n",
    "Carl E Rasmussen and Christopher KI Williams. Gaussian Processes for Machine Learning. MIT Press, 2006.\n",
    "\n",
    "James Hensman, Nicolo Fusi, and Neil D Lawrence. Gaussian Processes for Big Data. Uncertainty in Artificial Intelligence, 2013.\n",
    "\n",
    "James Hensman, Alexander G de G Matthews, and Zoubin Ghahramani. Scalable variational Gaussian process classification. Proceedings of the Eighteenth International Conference on Artificial Intelligence and Statistics, 2015.\n",
    "\n",
    "Michalis Titsias and Neil D Lawrence. Bayesian Gaussian process latent variable model. Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, 2010.\n"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.7.3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
