\lstset{frame=single,numbers=left,numbersep=4pt,language=Synapse,keywordstyle=\color{blue},commentstyle=\color{DarkGreen}}

\section{Introduction}

The Synapse language is currently implemented as an interpreter; future versions may compile to C/C++ with OpenCL or CUDA, or to MATLAB.  Synapse was designed for programs that simulate biological neural networks and run in a massively concurrent manner.

\section{Installing}

The code for the interpreter can be downloaded from \url{http://synapse-lang.googlecode.com}.  You will also need to download and install OCaml (\url{http://caml.inria.fr/ocaml/}).  Once you have done this, go into the ``src'' directory and type ``make''.  To test the installation, type ``make test''.  The output is very verbose, but a summary of the results is shown at the end.

\section{Writing your first Synapse programs}

The simplest program is the following:

\begin{lstlisting}
input $1[5]; /* Define the input parameter as a vector with 5 elements */

$2 << $1; /* Copy input to output */
\end{lstlisting}

Before running the program, you first need to create an input file.  One of the formats that Synapse accepts is space-delimited text, where each line in the file corresponds to a single time step.  Here is an example input that works with the program above:

\begin{verbatim}
1 2 3 4 5
0 1 0 1.2 0
7 .8 3 2 2
2 8 .9 3 0
\end{verbatim}
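For illustration only (Synapse reads this format natively, so no extra code is needed to use the language), a file in this format can be parsed with a few lines of Python, yielding one input vector per time step:

```python
# Parse a space-delimited Synapse input file: one line per time step,
# one floating-point value per vector element.
def read_input(text):
    return [[float(tok) for tok in line.split()]
            for line in text.strip().splitlines()]

sample = """1 2 3 4 5
0 1 0 1.2 0
7 .8 3 2 2
2 8 .9 3 0"""

steps = read_input(sample)
print(len(steps))   # 4 time steps
print(steps[1])     # [0.0, 1.0, 0.0, 1.2, 0.0]
```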

You can run the example with:

\begin{verbatim}
./synap input.txt output.txt < program.syn
\end{verbatim}


Synapse also lets you rearrange input data concisely.  For example, the following mirrors an image horizontally:

\begin{lstlisting}
input $1[480,640,3]; /* Height, width, and number of channels for an image */
$2[y,size($1,2)-x+1,c] << $1[y,x,c] for x=[1:640] y=[1:480] c=[1:3];
\end{lstlisting}
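For comparison, the same mirroring can be written in NumPy (a sketch, not Synapse code); the Synapse index expression size($1,2)-x+1 corresponds to reversing the width axis:

```python
import numpy as np

# A dummy 480x640 RGB image (height, width, channels), matching the
# Synapse declaration: input $1[480,640,3]
img = np.arange(480 * 640 * 3, dtype=np.float32).reshape(480, 640, 3)

# Mirror horizontally: in 1-based Synapse indexing, element (y, x, c)
# moves to (y, width - x + 1, c), i.e. the width axis is reversed.
mirrored = img[:, ::-1, :]

print(mirrored.shape)  # (480, 640, 3)
```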

\section{Activation Functions}

Functions are helpful for more complex tasks.  There are two types of functions allowed in Synapse.
An activation function takes a single float and returns a single float; optional parameters can also be passed.
The following example performs gamma correction on 128x128 images.

\ImportExample{ex-gammacorrection}
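For reference, the gamma-correction step itself is the per-pixel mapping $v \mapsto v^{1/\gamma}$ on intensities normalized to $[0,1]$.  The following Python sketch is illustrative only; the gamma value here is an arbitrary choice, not taken from the Synapse example:

```python
# Gamma correction as a per-pixel activation: out = in ** (1 / gamma),
# applied to intensities normalized to the range [0, 1].
def gamma_correct(pixels, gamma=2.2):
    return [v ** (1.0 / gamma) for v in pixels]

row = [0.0, 0.25, 0.5, 1.0]
print(gamma_correct(row))  # brightens mid-range values, fixes 0 and 1
```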

\begin{figure}[ht]
\centering
\subfigure[Original image]{
\includegraphics[scale=1]{images/barb-128x128.png}
\label{fig:subfig1}
}
\hspace{1em}
\subfigure[Resulting image]{
\includegraphics[scale=1]{images/barb-gamma.png}
\label{fig:subfig2}
}
\caption[]{An example input and output image resulting from Example ex-gammacorrection.}
\label{fig:subfigureExample}
\end{figure}

Please note that the ability to read in sequences of images (in PPM format) is very experimental and known bugs exist;
for example, the requested file format for the output images is currently ignored.
This example can be found in the tests directory of the Subversion repository.  It can be run from the src directory with
 ``./run\_examples.sh ../tests/ex-gammacorrection.syn''.
 
 
\section{Kernel Functions}

The other type of function allowed is the kernel function.  Kernel functions can be applied to (or convolved with)
matrices (or expressions that result in matrices).  They are defined with the keyword \lit{kernel}.  A kernel takes one
required parameter for each dimension of the expression with which it is convolved.

\ImportExample{test-kernel1d}
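For readers unfamiliar with convolution, the following Python sketch (illustrative only; it is not Synapse code, and `convolve1d` is a hypothetical helper) shows the 1-D case: each output element is a weighted sum of the input values under the kernel window, with zero padding at the boundaries.

```python
# 1-D convolution (correlation form): each output element is the
# weighted sum of the input values under the kernel window, treating
# positions outside the signal as zero.
def convolve1d(signal, kernel):
    n, k = len(signal), len(kernel)
    half = k // 2
    out = []
    for i in range(n):
        acc = 0.0
        for j in range(k):
            idx = i + j - half
            if 0 <= idx < n:          # zero-pad outside the signal
                acc += signal[idx] * kernel[j]
        out.append(acc)
    return out

# A smoothing kernel: each output is a weighted average of neighbors.
print(convolve1d([1, 2, 3, 4], [0.25, 0.5, 0.25]))  # [1.0, 2.0, 3.0, 2.75]
```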


 
\section{Modules and Neurons}

In order to be able to model biological neural networks, the network connections need to be represented.
This can be done with the use of modules and neurons.  Every neuron must be in a module, which could be
equated to a hypercolumn or micronetwork.  A module specifies input neurons, output neurons, and inter-neurons.
The input and output neurons are the only neurons that can have external connections.  The example below
uses only input and output neurons.

\ImportExample{test-module1}

Each synapse connection is evaluated concurrently.  This means that values can take several time steps to propagate through the network, but it also means the interpreter can easily utilize multi-core architectures.
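The effect of this time-stepped evaluation can be pictured with a small Python sketch (a hypothetical model, not the interpreter's actual implementation): in a chain of neurons where each neuron reads its predecessor's value from the previous step, an input needs one step per synapse to reach the end of the chain.

```python
# Simulate synchronous time steps in a chain of neurons.  At each step
# every neuron reads its predecessor's value from the *previous* step,
# so a value injected at neuron 0 needs one step per synapse to arrive
# at the last neuron.
def propagate(chain_len, value, steps):
    state = [0.0] * chain_len
    state[0] = value
    for _ in range(steps):
        # All updates use the old state, mimicking concurrent evaluation.
        state = [state[0]] + [state[i - 1] for i in range(1, chain_len)]
    return state

print(propagate(4, 1.0, 3))  # [1.0, 1.0, 1.0, 1.0]: reaches the end after 3 steps
```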

See Appendix B for many more examples, and the next chapter for the rules of the Synapse language.