<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
                "http://www.w3.org/TR/REC-html40/loose.dtd">
<html>
<head>
  <title>Description of demev1</title>
  <meta name="keywords" content="demev1">
  <meta name="description" content="DEMEV1	Demonstrate Bayesian regression for the MLP.">
  <meta http-equiv="Content-Type" content="text/html; charset=iso-8859-1">
  <meta name="generator" content="m2html &copy; 2003 Guillaume Flandin">
  <meta name="robots" content="index, follow">
  <link type="text/css" rel="stylesheet" href="../../m2html.css">
</head>
<body>
<a name="_top"></a>
<div><a href="../../menu.html">Home</a> &gt;  <a href="#">ReBEL-0.2.7</a> &gt; <a href="#">netlab</a> &gt; demev1.m</div>


<h1>demev1
</h1>

<h2><a name="_name"></a>PURPOSE <a href="#_top"><img alt="^" border="0" src="../../up.png"></a></h2>
<div class="box"><strong>DEMEV1	Demonstrate Bayesian regression for the MLP.</strong></div>

<h2><a name="_synopsis"></a>SYNOPSIS <a href="#_top"><img alt="^" border="0" src="../../up.png"></a></h2>
<div class="box"><strong>This is a script file. </strong></div>

<h2><a name="_description"></a>DESCRIPTION <a href="#_top"><img alt="^" border="0" src="../../up.png"></a></h2>
<div class="fragment"><pre class="comment">DEMEV1    Demonstrate Bayesian regression for the MLP.

    Description
    The problem consists of an input variable X sampled from a
    Gaussian distribution, and a target variable T generated by computing
    SIN(2*PI*X) and adding Gaussian noise. A 2-layer network with linear
    outputs is trained by minimizing a sum-of-squares error function with
    an isotropic Gaussian regularizer, using the scaled conjugate gradient
    optimizer. The hyperparameters ALPHA and BETA are re-estimated using
    the function EVIDENCE. A graph is plotted of the original function,
    the training data, the trained network function, and the error bars.

    See also
    <a href="evidence.html" class="code" title="function [net, gamma, logev] = evidence(net, x, t, num)">EVIDENCE</a>, <a href="mlp.html" class="code" title="function net = mlp(nin, nhidden, nout, outfunc, prior, beta)">MLP</a>, <a href="scg.html" class="code" title="function [x, options, flog, pointlog, scalelog] = scg(f, x, options, gradf, varargin)">SCG</a>, <a href="demard.html" class="code" title="">DEMARD</a>, <a href="demmlp1.html" class="code" title="">DEMMLP1</a></pre></div>
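The synthetic data set described above can be sketched outside MATLAB as follows. This is a minimal NumPy illustration using the same settings as the script (16 points, noise standard deviation 0.1); it is not Netlab's code, and `rng` seeding is an arbitrary choice for reproducibility.

```python
import numpy as np

# Sketch of the data generation described above: the input x is drawn
# from a Gaussian, and the target t is sin(2*pi*x) plus Gaussian noise.
rng = np.random.default_rng(0)
ndata = 16    # number of data points, as in the script
noise = 0.1   # standard deviation of the target noise

x = 0.25 + 0.07 * rng.standard_normal((ndata, 1))
t = np.sin(2 * np.pi * x) + noise * rng.standard_normal((ndata, 1))

# The "true" noise hyperparameter BETA is the inverse noise variance,
# which the evidence re-estimation should approximately recover.
true_beta = 1.0 / noise**2
```

With noise = 0.1 the true BETA is 100, which is the value the script prints at the end for comparison with the re-estimated `net.beta`.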

<!-- crossreference -->
<h2><a name="_cross"></a>CROSS-REFERENCE INFORMATION <a href="#_top"><img alt="^" border="0" src="../../up.png"></a></h2>
This function calls:
<ul style="list-style-image:url(../../matlabicon.gif)">
<li><a href="evidence.html" class="code" title="function [net, gamma, logev] = evidence(net, x, t, num)">evidence</a>	EVIDENCE Re-estimate hyperparameters using evidence approximation.</li>
<li><a href="mlp.html" class="code" title="function net = mlp(nin, nhidden, nout, outfunc, prior, beta)">mlp</a>	MLP	Create a 2-layer feedforward network.</li>
<li><a href="mlpfwd.html" class="code" title="function [y, z, a] = mlpfwd(net, x)">mlpfwd</a>	MLPFWD	Forward propagation through 2-layer network.</li>
<li><a href="mlppak.html" class="code" title="function w = mlppak(net)">mlppak</a>	MLPPAK	Combines weights and biases into one weights vector.</li>
<li><a href="netevfwd.html" class="code" title="function [y, extra, invhess] = netevfwd(w, net, x, t, x_test, invhess)">netevfwd</a>	NETEVFWD Generic forward propagation with evidence for network</li>
<li><a href="netopt.html" class="code" title="function [net, options, varargout] = netopt(net, options, x, t, alg);">netopt</a>	NETOPT	Optimize the weights in a network model.</li>
</ul>
This function is called by:
<ul style="list-style-image:url(../../matlabicon.gif)">
<li><a href="demnlab.html" class="code" title="function demnlab(action);">demnlab</a>	DEMNLAB A front-end Graphical User Interface to the demos</li></ul>
<!-- crossreference -->


<h2><a name="_source"></a>SOURCE CODE <a href="#_top"><img alt="^" border="0" src="../../up.png"></a></h2>
<div class="fragment"><pre>0001 <span class="comment">%DEMEV1    Demonstrate Bayesian regression for the MLP.</span>
0002 <span class="comment">%</span>
0003 <span class="comment">%    Description</span>
0004 <span class="comment">%    The problem consists of an input variable X sampled from a</span>
0005 <span class="comment">%    Gaussian distribution, and a target variable T generated by computing</span>
0006 <span class="comment">%    SIN(2*PI*X) and adding Gaussian noise. A 2-layer network with linear</span>
0007 <span class="comment">%    outputs is trained by minimizing a sum-of-squares error function with</span>
0008 <span class="comment">%    an isotropic Gaussian regularizer, using the scaled conjugate gradient</span>
0009 <span class="comment">%    optimizer. The hyperparameters ALPHA and BETA are re-estimated using</span>
0010 <span class="comment">%    the function EVIDENCE. A graph is plotted of the original function,</span>
0011 <span class="comment">%    the training data, the trained network function, and the error bars.</span>
0012 <span class="comment">%</span>
0013 <span class="comment">%    See also</span>
0014 <span class="comment">%    EVIDENCE, MLP, SCG, DEMARD, DEMMLP1</span>
0015 <span class="comment">%</span>
0016 
0017 <span class="comment">%    Copyright (c) Ian T Nabney (1996-2001)</span>
0018 
0019 clc;
0020 disp(<span class="string">'This demonstration illustrates the application of Bayesian'</span>)
0021 disp(<span class="string">'re-estimation to determine the hyperparameters in a simple regression'</span>)
0022 disp(<span class="string">'problem. It is based on a local quadratic approximation to a mode of'</span>)
0023 disp(<span class="string">'the posterior distribution and the evidence maximization framework of'</span>)
0024 disp(<span class="string">'MacKay.'</span>)
0025 disp(<span class="string">' '</span>)
0026 disp(<span class="string">'First, we generate a synthetic data set consisting of a single input'</span>)
0027 disp(<span class="string">'variable x sampled from a Gaussian distribution, and a target variable'</span>)
0028 disp(<span class="string">'t obtained by evaluating sin(2*pi*x) and adding Gaussian noise.'</span>)
0029 disp(<span class="string">' '</span>)
0030 disp(<span class="string">'Press any key to see a plot of the data together with the sine function.'</span>)
0031 pause;
0032 
0033 <span class="comment">% Generate the matrix of inputs x and targets t.</span>
0034 
0035 ndata = 16;            <span class="comment">% Number of data points.</span>
0036 noise = 0.1;            <span class="comment">% Standard deviation of noise distribution.</span>
0037 randn(<span class="string">'state'</span>, 0);
0038 x = 0.25 + 0.07*randn(ndata, 1);
0039 t = sin(2*pi*x) + noise*randn(size(x));
0040 
0041 <span class="comment">% Plot the data and the original sine function.</span>
0042 h = figure;
0043 nplot = 200;
0044 plotvals = linspace(0, 1, nplot)';
0045 plot(x, t, <span class="string">'ok'</span>)
0046 xlabel(<span class="string">'Input'</span>)
0047 ylabel(<span class="string">'Target'</span>)
0048 hold on
0049 axis([0 1 -1.5 1.5])
0050 fplot(<span class="string">'sin(2*pi*x)'</span>, [0 1], <span class="string">'-g'</span>)
0051 legend(<span class="string">'data'</span>, <span class="string">'function'</span>);
0052 
0053 disp(<span class="string">' '</span>)
0054 disp(<span class="string">'Press any key to continue'</span>)
0055 pause; clc;
0056 
0057 disp(<span class="string">'Next we create a two-layer MLP network having 3 hidden units and one'</span>)
0058 disp(<span class="string">'linear output. The model assumes Gaussian target noise governed by an'</span>)
0059 disp(<span class="string">'inverse variance hyperparameter beta, and uses a simple Gaussian prior'</span>)
0060 disp(<span class="string">'distribution governed by an inverse variance hyperparameter alpha.'</span>)
0061 disp(<span class="string">' '</span>);
0062 disp(<span class="string">'The network weights and the hyperparameters are initialised and then'</span>)
0063 disp(<span class="string">'the weights are optimized with the scaled conjugate gradient'</span>)
0064 disp(<span class="string">'algorithm using the SCG function, with the hyperparameters kept'</span>)
0065 disp(<span class="string">'fixed. After a maximum of 500 iterations, the hyperparameters are'</span>)
0066 disp(<span class="string">'re-estimated using the EVIDENCE function. The process of optimizing'</span>)
0067 disp(<span class="string">'the weights with fixed hyperparameters and then re-estimating the'</span>)
0068 disp(<span class="string">'hyperparameters is repeated for a total of 3 cycles.'</span>)
0069 disp(<span class="string">' '</span>)
0070 disp(<span class="string">'Press any key to train the network and determine the hyperparameters.'</span>)
0071 pause;
0072 
0073 <span class="comment">% Set up network parameters.</span>
0074 nin = 1;        <span class="comment">% Number of inputs.</span>
0075 nhidden = 3;        <span class="comment">% Number of hidden units.</span>
0076 nout = 1;        <span class="comment">% Number of outputs.</span>
0077 alpha = 0.01;        <span class="comment">% Initial prior hyperparameter.</span>
0078 beta_init = 50.0;    <span class="comment">% Initial noise hyperparameter.</span>
0079 
0080 <span class="comment">% Create and initialize network weight vector.</span>
0081 net = <a href="mlp.html" class="code" title="function net = mlp(nin, nhidden, nout, outfunc, prior, beta)">mlp</a>(nin, nhidden, nout, <span class="string">'linear'</span>, alpha, beta_init);
0082 
0083 <span class="comment">% Set up vector of options for the optimiser.</span>
0084 nouter = 3;            <span class="comment">% Number of outer loops.</span>
0085 ninner = 1;            <span class="comment">% Number of inner loops.</span>
0086 options = zeros(1,18);        <span class="comment">% Default options vector.</span>
0087 options(1) = 1;            <span class="comment">% This provides display of error values.</span>
0088 options(2) = 1.0e-7;        <span class="comment">% Absolute precision for weights.</span>
0089 options(3) = 1.0e-7;        <span class="comment">% Precision for objective function.</span>
0090 options(14) = 500;        <span class="comment">% Number of training cycles in inner loop.</span>
0091 
0092 <span class="comment">% Train using scaled conjugate gradients, re-estimating alpha and beta.</span>
0093 <span class="keyword">for</span> k = 1:nouter
0094   net = <a href="netopt.html" class="code" title="function [net, options, varargout] = netopt(net, options, x, t, alg);">netopt</a>(net, options, x, t, <span class="string">'scg'</span>);
0095   [net, gamma] = <a href="evidence.html" class="code" title="function [net, gamma, logev] = evidence(net, x, t, num)">evidence</a>(net, x, t, ninner);
0096   fprintf(1, <span class="string">'\nRe-estimation cycle %d:\n'</span>, k);
0097   fprintf(1, <span class="string">'  alpha =  %8.5f\n'</span>, net.alpha);
0098   fprintf(1, <span class="string">'  beta  =  %8.5f\n'</span>, net.beta);
0099   fprintf(1, <span class="string">'  gamma =  %8.5f\n\n'</span>, gamma);
0100   disp(<span class="string">' '</span>)
0101   disp(<span class="string">'Press any key to continue.'</span>)
0102   pause;
0103 <span class="keyword">end</span>
0104 
0105 fprintf(1, <span class="string">'true beta: %f\n'</span>, 1/(noise*noise));
0106 
0107 disp(<span class="string">' '</span>)
0108 disp(<span class="string">'Network training and hyperparameter re-estimation are now complete.'</span>) 
0109 disp(<span class="string">'Compare the final value for the hyperparameter beta with the true'</span>) 
0110 disp(<span class="string">'value.'</span>)
0111 disp(<span class="string">' '</span>)
0112 disp(<span class="string">'Notice that the final error value is close to the number of data'</span>)
0113 disp([<span class="string">'points ('</span>, num2str(ndata),<span class="string">') divided by two.'</span>])
0114 disp(<span class="string">' '</span>)
0115 disp(<span class="string">'Press any key to continue.'</span>)
0116 pause; clc;
0117 disp(<span class="string">'We can now plot the function represented by the trained network. This'</span>)
0118 disp(<span class="string">'corresponds to the mean of the predictive distribution. We can also'</span>)
0119 disp(<span class="string">'plot ''error bars'' representing one standard deviation of the'</span>)
0120 disp(<span class="string">'predictive distribution around the mean.'</span>)
0121 disp(<span class="string">' '</span>)
0122 disp(<span class="string">'Press any key to add the network function and error bars to the plot.'</span>)
0123 pause;
0124 
0125 <span class="comment">% Evaluate error bars.</span>
0126 [y, sig2] = <a href="netevfwd.html" class="code" title="function [y, extra, invhess] = netevfwd(w, net, x, t, x_test, invhess)">netevfwd</a>(<a href="mlppak.html" class="code" title="function w = mlppak(net)">mlppak</a>(net), net, x, t, plotvals);
0127 sig = sqrt(sig2);
0128 
0129 <span class="comment">% Plot the data, the original function, and the trained network function.</span>
0130 [y, z] = <a href="mlpfwd.html" class="code" title="function [y, z, a] = mlpfwd(net, x)">mlpfwd</a>(net, plotvals);
0131 figure(h); hold on;
0132 plot(plotvals, y, <span class="string">'-r'</span>)
0133 xlabel(<span class="string">'Input'</span>)
0134 ylabel(<span class="string">'Target'</span>)
0135 plot(plotvals, y + sig, <span class="string">'-b'</span>);
0136 plot(plotvals, y - sig, <span class="string">'-b'</span>);
0137 legend(<span class="string">'data'</span>, <span class="string">'function'</span>, <span class="string">'network'</span>, <span class="string">'error bars'</span>);
0138 
0139 disp(<span class="string">' '</span>)
0140 disp(<span class="string">'Notice how the confidence interval spanned by the ''error bars'' is'</span>)
0141 disp(<span class="string">'smaller in the region of input space where the data density is high,'</span>)
0142 disp(<span class="string">'and becomes larger in regions away from the data.'</span>)
0143 disp(<span class="string">' '</span>)
0144 disp(<span class="string">'Press any key to end.'</span>)
0145 pause; clc; close(h); 
0146 <span class="comment">%clear all</span></pre></div>
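The re-estimation formulas that EVIDENCE applies inside the outer loop can be seen in isolation on a simpler model. The hedged NumPy sketch below applies MacKay's updates to a linear-in-parameters model, where the Hessian of the data error is exactly `Phi' * Phi` rather than the quadratic approximation used for the MLP; the design matrix `Phi`, the seed, and all sizes are illustrative assumptions, not part of the demo.

```python
import numpy as np

# MacKay's evidence updates (illustrative sketch, not Netlab's EVIDENCE):
#   gamma     = sum_i lam_i / (lam_i + alpha),  lam_i = eigenvalues of beta*H
#   alpha_new = gamma / (w' * w)
#   beta_new  = (N - gamma) / (2 * E_D)
rng = np.random.default_rng(1)
N, d = 50, 6
Phi = rng.standard_normal((N, d))          # hypothetical design matrix
w_true = rng.standard_normal(d)
t = Phi @ w_true + 0.1 * rng.standard_normal(N)   # noise std 0.1 -> beta ~ 100

alpha, beta = 0.01, 50.0                   # same initial values as the script
for _ in range(3):                         # three outer cycles, as in the demo
    # Posterior mode of the weights (penalized least squares).
    A = beta * Phi.T @ Phi + alpha * np.eye(d)
    w = beta * np.linalg.solve(A, Phi.T @ t)
    # Effective number of well-determined parameters.
    lam = beta * np.linalg.eigvalsh(Phi.T @ Phi)
    gamma = np.sum(lam / (lam + alpha))
    # Re-estimate the hyperparameters.
    alpha = gamma / (w @ w)
    e_d = 0.5 * np.sum((t - Phi @ w) ** 2)
    beta = (N - gamma) / (2 * e_d)
```

This also makes the script's closing remark concrete: at the evidence solution the total error is roughly N/2, since BETA is re-estimated so that 2*BETA*E_D is approximately N - GAMMA.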
<hr><address>Generated on Tue 26-Sep-2006 10:36:21 by <strong><a href="http://www.artefact.tk/software/matlab/m2html/">m2html</a></strong> &copy; 2003</address>
</body>
</html>