\documentclass[twoside,a4paper]{article}
\usepackage{geometry}
\geometry{margin=1.5cm, vmargin={0pt,1cm}}
\setlength{\topmargin}{-1cm}
\setlength{\paperheight}{29.7cm}
\setlength{\textheight}{25.3cm}

% useful packages.
\usepackage{amsfonts}
\usepackage{amsmath}
\usepackage{amssymb}
\usepackage{amsthm}
\usepackage{enumerate}
\usepackage{graphicx}
\usepackage{multicol}
\usepackage{fancyhdr}
\usepackage{layout}
\usepackage{tabularx}

\begin{document}
	
	\pagestyle{fancy}
	\fancyhead{}
	\lhead{Haolong Li}
	\chead{DMAA Homework \#1}
	\rhead{\today}
	
\section*{Brief}
Page 47, Exercises 2.2 and 2.3.

\section*{2.2 \quad
		Referring to Example 2.1, construct an example of learning a perceptron model from a training data set.}

Solution:
	
Take the training data set $T = \{(x_{1},y_{1}),(x_{2},y_{2}),(x_{3},y_{3})\}$, where $x_{1}=(1,2)^{\mathbf{T}}$ is a positive sample point ($y_{1}=+1$) and $x_{2}=(2,3)^{\mathbf{T}}$, $x_{3}=(4,4)^{\mathbf{T}}$ are negative sample points ($y_{2}=y_{3}=-1$).
	
Build the optimization problem:
		\[
			\min\limits_{\omega,b} L(\omega,b)=-\sum_{x_{i} \in M} y_{i}(\omega \cdot x_{i} + b),
		\]
where $M$ is the set of misclassified points.
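The objective can be transcribed directly into a short function; a minimal sketch (the function name and the trial values of $\omega, b$ are my own):

```python
# Empirical loss L(w, b) = -sum over misclassified points of y_i (w . x_i + b).
# A point is misclassified when y_i (w . x_i + b) <= 0, so each misclassified
# point contributes a nonnegative amount to the loss.

def perceptron_loss(samples, w, b):
    loss = 0.0
    for x, y in samples:
        margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
        if margin <= 0:          # misclassified (or on the boundary)
            loss -= margin
    return loss

# The example's data: x1 positive, x2 and x3 negative.
data = [((1, 2), +1), ((2, 3), -1), ((4, 4), -1)]
print(perceptron_loss(data, (1, 0), 0))   # at this (w, b), x2 and x3 are misclassified
```

The loss is zero exactly when no point is misclassified, which is the stopping condition of the algorithm below.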
Use Algorithm 2.1 to find $\omega$ and $b$, with learning rate $\eta = 1$.
	
(1) Take the initial values $\omega_{0}=0$, $b_{0}=0$.

(2) Apply Algorithm 2.1, updating $\omega$ and $b$ at one misclassified point per iteration.

\begin{center}
\begin{tabular}{cccc}
		\hline
		iteration & misclassified point & $\omega$ & $b$ \\
		\hline
		0 & & $(0,0)^{\mathbf{T}}$ & $0$ \\
		1 & $x_{1}$ & $(1,2)^{\mathbf{T}}$ & $1$ \\
		2 & $x_{2}$ & $(-1,-1)^{\mathbf{T}}$ & $0$ \\
		3 & $x_{1}$ & $(0,1)^{\mathbf{T}}$ & $1$ \\
		4 & $x_{2}$ & $(-2,-2)^{\mathbf{T}}$ & $0$ \\
		5 & $x_{1}$ & $(-1,0)^{\mathbf{T}}$ & $1$ \\
		6 & $x_{1}$ & $(0,2)^{\mathbf{T}}$ & $2$ \\
		7 & $x_{2}$ & $(-2,-1)^{\mathbf{T}}$ & $1$ \\
		8 & $x_{1}$ & $(-1,1)^{\mathbf{T}}$ & $2$ \\
		9 & $x_{2}$ & $(-3,-2)^{\mathbf{T}}$ & $1$ \\
		10 & $x_{1}$ & $(-2,0)^{\mathbf{T}}$ & $2$ \\
		11 & $x_{1}$ & $(-1,2)^{\mathbf{T}}$ & $3$ \\
		12 & $x_{2}$ & $(-3,-1)^{\mathbf{T}}$ & $2$ \\
		13 & $x_{1}$ & $(-2,1)^{\mathbf{T}}$ & $3$ \\
		14 & $x_{2}$ & $(-4,-2)^{\mathbf{T}}$ & $2$ \\
		15 & $x_{1}$ & $(-3,0)^{\mathbf{T}}$ & $3$ \\
		16 & $x_{1}$ & $(-2,2)^{\mathbf{T}}$ & $4$ \\
		17 & $x_{2}$ & $(-4,-1)^{\mathbf{T}}$ & $3$ \\
		18 & $x_{1}$ & $(-3,1)^{\mathbf{T}}$ & $4$ \\
		19 & $x_{2}$ & $(-5,-2)^{\mathbf{T}}$ & $3$ \\
		20 & $x_{1}$ & $(-4,0)^{\mathbf{T}}$ & $4$ \\
		21 & $x_{1}$ & $(-3,2)^{\mathbf{T}}$ & $5$ \\
		22 & $x_{2}$ & $(-5,-1)^{\mathbf{T}}$ & $4$ \\
		23 & $x_{1}$ & $(-4,1)^{\mathbf{T}}$ & $5$ \\
		24 & $x_{2}$ & $(-6,-2)^{\mathbf{T}}$ & $4$ \\
		25 & $x_{1}$ & $(-5,0)^{\mathbf{T}}$ & $5$ \\
		26 & $x_{1}$ & $(-4,2)^{\mathbf{T}}$ & $6$ \\
		27 & $x_{2}$ & $(-6,-1)^{\mathbf{T}}$ & $5$ \\
		28 & $x_{1}$ & $(-5,1)^{\mathbf{T}}$ & $6$ \\
		29 & & $(-5,1)^{\mathbf{T}}$ & $6$ \\
		\hline
\end{tabular}
\end{center}

At iteration 29 no point is misclassified, so the algorithm terminates with $\omega=(-5,1)^{\mathbf{T}}$, $b=6$.

The separating hyperplane is $-5x^{(1)}+x^{(2)}+6=0$,

and the perceptron model is $f(x)=\mathrm{sign}\bigl(-5x^{(1)}+x^{(2)}+6\bigr)$. (Which misclassified point is chosen at each step is arbitrary, so the solution is not unique.)
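The training run can be reproduced programmatically; a minimal sketch of Algorithm 2.1 (helper names are my own), scanning the points in order $x_{1}, x_{2}, x_{3}$ and updating at the first misclassified one:

```python
# Perceptron learning with eta = 1, w0 = 0, b0 = 0.
# samples: list of (x, y) pairs with x a tuple and y in {+1, -1}.

def train_perceptron(samples, eta=1.0):
    w = [0.0] * len(samples[0][0])
    b = 0.0
    updates = 0
    while True:
        for x, y in samples:
            # Misclassified when y * (w . x + b) <= 0.
            if y * (sum(wi * xi for wi, xi in zip(w, x)) + b) <= 0:
                w = [wi + eta * y * xi for wi, xi in zip(w, x)]
                b += eta * y
                updates += 1
                break            # restart the scan after each update
        else:
            return w, b, updates  # no misclassified point remains

data = [((1, 2), +1), ((2, 3), -1), ((4, 4), -1)]
w, b, n = train_perceptron(data)
print(w, b, n)
```

Because any misclassified point may be chosen at each step, a different scan order yields a different, equally valid separating hyperplane.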

\section*{2.3 \quad
	 Prove the following theorem: a sample set is linearly separable if and only if the convex hull of the positive sample points and the convex hull of the negative sample points are disjoint.}

Solution:

$\Rightarrow$:

Since the sample set is linearly separable, there exists a separating hyperplane $S$, i.e. $\omega, b$ such that $\omega \cdot x_{i}+b>0$ whenever $y_{i}=+1$ and $\omega \cdot x_{i}+b<0$ whenever $y_{i}=-1$.

For clarity, write $x'_{1},x'_{2},\dots,x'_{n}$ for the positive sample points and $x''_{1},x''_{2},\dots,x''_{m}$ for the negative sample points.

Suppose, for contradiction, that the two convex hulls are not disjoint. Then some point lies in both hulls, i.e. there exist coefficients $\lambda'_{1},\dots,\lambda'_{n}\ge 0$ with $\sum_{i=1}^{n}\lambda'_{i}=1$ and $\lambda''_{1},\dots,\lambda''_{m}\ge 0$ with $\sum_{j=1}^{m}\lambda''_{j}=1$ such that
\begin{equation}
\lambda'_{1}x'_{1}+\lambda'_{2}x'_{2}+\dots+\lambda'_{n}x'_{n}=\lambda''_{1}x''_{1}+\lambda''_{2}x''_{2}+\dots+\lambda''_{m}x''_{m}. \label{1}
\end{equation}
Taking the inner product of both sides of \eqref{1} with $\omega$ and adding $b$, the left side becomes $\sum_{i=1}^{n}\lambda'_{i}(\omega \cdot x'_{i}+b)>0$, a convex combination of strictly positive numbers, while the right side becomes $\sum_{j=1}^{m}\lambda''_{j}(\omega \cdot x''_{j}+b)<0$, a convex combination of strictly negative numbers. This is a contradiction, so the two convex hulls are disjoint.
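The key step, that a convex combination (nonnegative weights summing to 1) preserves a strict sign of $\omega \cdot x + b$, can be spot-checked numerically; a small sketch using the data of Exercise 2.2, with randomly drawn weights:

```python
import random

# For w, b with w . x + b > 0 on a set of points, every convex combination
# sum(l_i x_i) with l_i >= 0 and sum(l_i) = 1 satisfies
# w . (sum l_i x_i) + b = sum l_i (w . x_i + b) > 0, and symmetrically for < 0.

pos = [(1, 2)]              # positive points of the example
neg = [(2, 3), (4, 4)]      # negative points
w, b = (-2, 0), 3           # a hyperplane separating pos from neg

def convex_combo(points):
    ls = [random.random() for _ in points]
    s = sum(ls)
    ls = [l / s for l in ls]            # normalise: l_i >= 0, sum = 1
    return tuple(sum(l * p[k] for l, p in zip(ls, points))
                 for k in range(len(points[0])))

def side(x):
    return sum(wi * xi for wi, xi in zip(w, x)) + b

for _ in range(1000):
    assert side(convex_combo(pos)) > 0  # positive hull stays strictly positive
    assert side(convex_combo(neg)) < 0  # negative hull stays strictly negative
print("strict signs preserved on both hulls")
```

This is only an illustration on one data set, of course; the proof above covers the general case.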

$\Leftarrow$:

Suppose the two convex hulls are disjoint. Each is the convex hull of finitely many points and hence a compact convex set. By the separating hyperplane theorem, two disjoint compact convex sets can be strictly separated: there exist $\omega \neq 0$ and $b$ such that $\omega \cdot x + b > 0$ for every $x$ in the convex hull of the positive sample points and $\omega \cdot x + b < 0$ for every $x$ in the convex hull of the negative sample points. In particular this holds at the sample points themselves, so $y_{i}(\omega \cdot x_{i} + b) > 0$ for all $i$, i.e. the sample set is linearly separable. $\qed$


\end{document}

%%% Local Variables: 
%%% mode: latex
%%% TeX-master: t
%%% End: 
