

\documentclass[xcolor=dvipsnames,svgnames]{article}

% The following \documentclass options may be useful:
%
% 10pt          To set in 10-point type instead of 9-point.
% 11pt          To set in 11-point type instead of 9-point.
% authoryear    To obtain author/year citation style instead of numeric.
\usepackage{xcolor} 
%\usepackage{diagrams}
\usepackage{amsmath}
\usepackage{amssymb}
\usepackage{fullpage}
\usepackage{amsthm}
\usepackage{proof}
\usepackage{hyperref}
\usepackage{listings}



\definecolor{light-gray}{gray}{0.86}
\lstdefinelanguage{MyHaskell}
{alsoletter={+,|,->, =, ::}, morekeywords={+,|,->, =, ::, data, Int, max, case, of}
}
\lstdefinelanguage{MyLambda}
{alsoletter={\., |, ->, = }, morekeywords={=,|, -> ,\., lam , fix, case, of}}

\definecolor{syellow}{RGB}{181,137,0}
\definecolor{scyan}{RGB}{42,161,152}
\newcommand{\gray}[1]{\colorbox{light-gray}{#1}}
\newcommand{\nil}[0]{\mathsf{nil}} 
\newcommand{\cons}[0]{\mathsf{cons}} 
\newcommand{\vecc}[0]{\mathsf{vec}} 
\newcommand{\suc}[0]{\mathsf{S}} 
\newcommand{\nat}[0]{\mathsf{Nat}} 
\newcommand{\ind}[0]{\mathsf{Ind}} 
\newcommand{\app}[0]{\mathsf{app}} 
\newcommand{\rec}[0]{\mathsf{rec}} 
\newcommand{\case}[0]{\mathsf{case}} 
\newcommand{\eq}[0]{\mathsf{Eq}} 
\newcommand{\add}[0]{\mathsf{add}} 

\newcommand{\selfstar}[0]{\mathsf{Selfstar}} 
\newcommand{\self}[0]{\mathbf{S}} 
\newcommand{\cc}[0]{\mathbf{CC}} 
\newcommand{\systemt}[0]{\mathbf{T}} 
\newcommand{\M}[3]{
\{#1_i \mapsto #2_i\}_{i \in #3}} 
\newcommand{\bm}[4]{
\{(#1_i:#2_i) \mapsto #3_i\}_{i \in #4}} 
\newcommand{\frank}[1]{\textcolor{blue}{\textbf{[#1 ---Frank]}}}
\newtheorem{theorem}{Theorem}
\newtheorem{definition}[theorem]{Definition}
\newtheorem{lemma}[theorem]{Lemma}
\newtheorem{proposition}[theorem]{Proposition}

%%\newarrowfiller{dasheq} {==}{==}{==}{==}
%% \newarrow {Mapsto} |--->
%% \newarrow {Line} -----
%% \newarrow {Implies} ===={=>}
%% \newarrow {EImplies} {}{dasheq}{}{dasheq}{=>}
%% \newarrow {Onto} ----{>>}
%% \newarrow {DashInto} C{dash}{}{dash}{>}
%% \newarrow {Dashto}{}{dash}{}{dash}{>}
%% \newarrow {Dashtoo}{}{dash}{}{dash}{>>}

\begin{document}

%% \conferenceinfo{WXYZ '05}{date, City.} 
%% \copyrightyear{2005} 
%% \copyrightdata{[to be supplied]} 

%% \titlebanner{banner above paper title}        % These are ignored unless
%% \preprintfooter{short description of paper}   % 'preprint' option specified.

\title{Dependently-Typed Programming with Scott Encoding}
%%\subtitle{Extended Abstract}

\author{Peng Fu, Aaron Stump \\
   Computer Science, The University of Iowa
}
          
%\author{Peng Fu  Aaron Stump}

\maketitle

\begin{abstract}
We introduce $\selfstar$, a Curry-style dependent type
system featuring the \textit{self} type $\iota x.t$, 
together with mutually recursive definitions and $*:*$. We show how to obtain
 Scott-encoded datatypes and the corresponding elimination schemes in $\selfstar$. Examples such as numerals and vectors are given to demonstrate the power of $\selfstar$ as a dependently-typed programming language. Standard metatheorems such as type preservation and progress are proved. 
\end{abstract}

%% \category{CR-number}{subcategory}{third-level}

%% \terms
%% term1, term2

%% \keywords
%%  Lambda Encoding, Dependent Type, Type preservaiton, Progress, Confluence

\section{Introduction} 
\label{sec:intro}
In the practice of designing dependently-typed languages, it is often desirable to design a system that provides the notions of inductive datatypes and pattern matching, as well as the ability to write programs that compute types. Languages with primitive notions of inductive datatypes and pattern matching often have complicated designs, meaning that it is not obvious that the language is type safe, i.e., that it satisfies the type preservation and progress properties.  

In this paper, we study a system called $\selfstar$, which combines a type construct $\iota x.t$, called self type \cite{Pfu:2013}, with $*:*$ and mutually recursive definitions. Self types and mutually recursive definitions provide the ability to type Scott-encoded inductive data as lambda terms. We want to emphasize that operations on Scott-encoded data are as efficient as operations defined on a primitive built-in notion of inductive data. $*:*$ gives us the ability to write programs that compute types. We are able to show that $\selfstar$ is type safe. $\selfstar$ is an example of a simple and reliable design for a dependently-typed functional programming language, and is thus suitable to serve as a core language. 

In $\selfstar$, every type is inhabited, so $\selfstar$ is inconsistent as a logic. The only logical feature in $\selfstar$ is Leibniz \textit{convertibility}, i.e. we define $t_1 =_A t_2$ to be $\Pi C:A \to *. C t_1 \to C t_2$. Note that we use ``convertibility'' instead of ``equality'' to indicate that one cannot interpret $t_1 =_{A} t_2$ as a formula. If we know the inhabitant of $t_1 =_A t_2$ normalizes to the term $\lambda C.\lambda x.x$, then we can use $t_1 =_A t_2$ to cast the type $P t_1$ to $P t_2$ by applying the term $(\lambda C.\lambda x.x) P$ to the inhabitant of $P t_1$. Note that $(\lambda C.\lambda x.x) P \to_{\beta} \lambda x.x$, so the cast does not affect the inhabitant of $P t_1$. %% In $\selfstar$, $t_1 =_A t_2$ can only be interpreted as type-level convertability, not the equality between programs $t_1$ and $t_2$. 
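The cast just described can be replayed, as a sanity check, in a conventional typed language. The following Haskell sketch (with our own names \texttt{Leibniz}, \texttt{refl}, and \texttt{castList}; it is not part of $\selfstar$) presents Leibniz convertibility as a rank-2 polymorphic cast whose only closed inhabitant is the identity:

```haskell
{-# LANGUAGE RankNTypes #-}

-- A proof that a converts to b is a cast from c a to c b, for every c.
newtype Leibniz a b = Leibniz { subst :: forall c. c a -> c b }

-- The only closed inhabitant: the identity cast, mirroring \C.\x.x.
refl :: Leibniz a a
refl = Leibniz id

-- Casting under a type constructor (here, lists) leaves the data untouched.
castList :: Leibniz a b -> [a] -> [b]
castList p = subst p
```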

Scott encoding (reported in \cite{CHS:72}) does not suffer from the inefficiency problems associated with Church encoding. For functional programming, Scott encoding seems to be a better fit than Church encoding \cite{Jansen:2011}. From the typing perspective, Scott-encoded data contain subdata, so one needs recursive definitions in order to define a type for Scott-encoded data. Elimination schemes for Scott-encoded data are derivable in $\selfstar$. This means the programmer can write down programs that have types like $\Pi x:\mathsf{Nat}. \mathsf{add}\ x \ 0 =_{\mathsf{Nat}} x$, which increases the flexibility of type-level casting. 
 
 The main contributions of this paper are:

\begin{itemize}
\item  We present $\selfstar$, which allows us to type Scott-encoded data and derive elimination schemes for that data. $\selfstar$ simplifies the design of a functional programming language, since primitive notions of inductive data and pattern matching are not needed in $\selfstar$. 
\item  We prove type preservation and progress for $\selfstar$ by applying the method we developed in the study of System $\self$.  
\end{itemize}

\subsection{Preliminaries}
\label{motivation}
In a Curry-style variant of System $\systemt$ \cite{Girard:1989} equipped with polymorphic and dependent types, one has a primitive notion of \textit{recursor}, namely, $\rec:  \Pi x:\nat.\forall U. (\nat \to U \to U) \to U \to U$, and two reduction rules: $\rec\ 0\ f\ v \to v$ and $\rec\ (\suc n)\ f \ v \to f\ n\ (\rec \ n\ f\ v)$. 

The recursor can be emulated with lambda terms. For example, let $\rec := \lambda n.\lambda f.\lambda v.n\ f\ v$, with the numerals $\bar{0} := \lambda s.\lambda z.z$ and $\bar{n} := \lambda s.\lambda z.s\ \overline{n-1}\ (\overline{n-1}\ s\ z)$. One can verify that this definition of $\rec$ in the lambda calculus behaves the same as the one in System $\systemt$. With a recursive definition, we can define $\nat := \forall U.(\nat \to U \to U) \to U \to U$. Note that the type of $\bar{n}$ is the same as the type of $\rec\ \bar{n}$.
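This emulation can be checked in Haskell, with a recursive \texttt{newtype} standing in for the recursive definition of $\nat$ and rank-2 polymorphism for $\forall U$ (a sketch under our own names \texttt{NatR}, \texttt{recNat}, \texttt{toInt}):

```haskell
{-# LANGUAGE RankNTypes #-}

-- Nat = forall u. (Nat -> u -> u) -> u -> u, as a recursive newtype.
newtype NatR = NatR { runNat :: forall u. (NatR -> u -> u) -> u -> u }

zeroR :: NatR                      -- \s.\z.z
zeroR = NatR (\_ z -> z)

sucR :: NatR -> NatR               -- \s.\z.s (n-1) ((n-1) s z)
sucR n = NatR (\s z -> s n (runNat n s z))

-- rec := \n.\f.\v. n f v; it satisfies rec 0 f v = v and
-- rec (S n) f v = f n (rec n f v), just like the primitive recursor.
recNat :: NatR -> (NatR -> u -> u) -> u -> u
recNat = runNat

-- Fold a numeral down to an Int, to observe its value.
toInt :: NatR -> Int
toInt n = recNat n (\_ r -> 1 + r) 0
```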

So far the type of the recursor is \textit{elementary}, i.e., it does not involve dependency. To make real use of 
dependent types, we ask if it is possible to obtain a type like $\Pi x:\nat.\forall U:\nat \to *.  ( \Pi y:\nat. U\ y \to U\ (\suc y)) \to U\ \bar{0} \to U\ x$. Note that $\suc := \lambda n.\lambda s.\lambda z.s\ n \ (n \ s\ z)$. We want to emphasize that the underlying
computational behavior of this type should be the same as that of $\rec$, so we want the typing relation $\rec : \forall U:\nat \to *. \Pi x:\nat. ( \Pi y:\nat. U\ y \to U\ (\suc y)) \to U\ \bar{0} \to U\ x$. We also want the type of $\rec\ \bar{n}$, namely $( \Pi y:\nat. U\ y \to U\ (\suc y)) \to U\ \bar{0} \to U\ \bar{n}$, to be the same as the type of $\bar{n}$. So we want the following typing relation: $\bar{n}: (\Pi y:\nat. U\ y \to U\ (\suc y)) \to U\ \bar{0} \to U\ \bar{n}$ for any $\bar{n}$. This is where we need the self-type mechanism: 

\[
\begin{array}{cc}
  \infer[\textit{selfGen}]{\Gamma \vdash t: \iota x.T}{\Gamma \vdash t: [t/x]T}
&
 \infer[\textit{selfInst}]{\Gamma \vdash t: [t/x]T}{\Gamma \vdash t: \iota x.T}
\end{array}
\]

\noindent The two rules above allow a self type $\iota x.T$ to refer to its subject. So it is not surprising that we define $\nat := \iota x.(\Pi y:\nat. U\ y \to U\ (\suc y)) \to U\ \bar{0} \to U \ x$. With \textit{selfGen}, \textit{selfInst} and mutually recursive definitions, one can verify that indeed the type of $\bar{n}$ is the same as the type of $\rec \ \bar{n}$, and the type of $\rec$ is indeed $\forall U:\nat \to *. \Pi x:\nat. ( \Pi y:\nat. U\ y \to U\ (\suc y)) \to U\ \bar{0} \to U\ x$. It is tempting to claim that $\rec$ represents
the induction principle, but it does not, for the following two reasons: 1. With mutually recursive definitions, the types cannot be interpreted as formulas. 2. The dependent product $\Pi$ is not exactly the first-order quantifier $\forall$. 

System $\systemt$ is close to a functional programming language, but still some distance from
a functional programmer's usual experience. In a modern functional programming language, one
would write a plus-two function in the following style:

\begin{lstlisting}[language=MyHaskell, keywordstyle=\color{blue},basicstyle=\ttfamily]
data Nat = Zero
          | Succ Nat

plusTwo :: Nat -> Nat
plusTwo n = case n of
           Succ p -> Succ (plusTwo p) 
         | Zero -> Succ (Succ Zero)


\end{lstlisting}

\noindent With Scott numerals, we can achieve the same effect. Assuming Scott numerals and mutually recursive definitions, we define:

\begin{lstlisting}[language=MyLambda, keywordstyle=\color{blue},basicstyle=\ttfamily]
 Zero = lam s . lam z . z
 Succ n =  lam s . lam z . s n
 plusTwo n = case n  
                lam p. Succ (plusTwo p)
                Succ (Succ Zero)
\end{lstlisting}

\noindent Of course, $\mathsf{case} := \lambda n.\lambda f.\lambda a.n\ f\ a$ and $\mathsf{lam}$ denotes the usual $\lambda$. One can see that the differences between the two programs above are mostly superficial. Let us first give an elementary type for $\case$, namely $(\nat \to U) \to U \to U$. The dependent version is $\Pi x: \nat. (\Pi y:\nat. U\ (\suc y)) \to U \ \bar{0} \to U\ x$. And we define $\nat := \iota x. (\Pi y:\nat. U\ (\suc y)) \to U \ \bar{0} \to U\ x$, so one can again check, using the \textit{selfInst} rule, that $\case\ \bar{n}$ and $\bar{n}$ have the same type for any Scott numeral $\bar{n}$. 
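For comparison, the Scott-numeral version of \texttt{plusTwo} can also be typed directly in Haskell, with a recursive \texttt{newtype} playing the role of the recursive definition of $\nat$ (a sketch; the names are ours):

```haskell
{-# LANGUAGE RankNTypes #-}

-- Scott numerals: a number is its own case-analysis function.
newtype Nat = Nat { caseNat :: forall u. (Nat -> u) -> u -> u }

zero :: Nat               -- lam s. lam z. z
zero = Nat (\_ z -> z)

suc :: Nat -> Nat         -- lam s. lam z. s n
suc n = Nat (\s _ -> s n)

-- plusTwo by case analysis, as in the listing above.
plusTwo :: Nat -> Nat
plusTwo n = caseNat n (\p -> suc (plusTwo p)) (suc (suc zero))

-- Fold a numeral down to an Int, to observe its value.
toInt :: Nat -> Int
toInt n = caseNat n (\p -> 1 + toInt p) 0
```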

Observe that the term for $\rec$ and for $\case$ (and even for the \textit{iterator} in System \textbf{F}) is the same term $\lambda n.\lambda f.\lambda a. n\ f\ a$. We call the type of this term an \textit{elimination scheme}. When we say that an elimination scheme is derivable in $\selfstar$, we mean that the elimination scheme is inhabited by the term $\lambda n.\lambda f.\lambda a. n\ f\ a$ in $\selfstar$ (modulo type annotations).  

\subsection{Related Work on Self Type}

Self types were proposed and incorporated in System $\self$ in \cite{Pfu:2013}. System $\self$
is a consistent logical system that uses Church encodings to handle inductive datatypes. There are
many differences between $\selfstar$ and $\self$. First and foremost, $\selfstar$ is designed as a Turing-complete dependently-typed functional programming language, while $\self$ is intended as a consistent logical system. Second, $\selfstar$ supports unrestricted recursive definitions and Scott-encoded data; these two features cannot be included in $\self$, since they would introduce non-terminating terms. Third, $\selfstar$ features $*:*$, which enables us to write programs that compute types; this is also a logically inconsistent feature, and hence cannot be added to $\self$. 



\subsection{Overview}

In section \ref{self}, we present $\selfstar$. We show how to type Scott-encoded data and derived operations on them. The corresponding elimination schemes for Scott-encoded data are derived. We also provide several examples (numerals and vectors) to demonstrate the power of $\selfstar$. In section \ref{s}, metatheorems such as type preservation and progress are proved for $\selfstar$ by applying the method we developed for $\self$ \cite{Pfu:2013}. 

%% Definitions of abstract reduction system, lambda calculus, simple types are given in section \ref{Pre}. We present Scott and Church numerals in both untype and typed forms in section \ref{Types}. Dependent type system and the related problem with Church encoding are discussed in detail in section \ref{Dep}. Seciton \ref{Conf}, several methods to show \textit{confluence} are given. We give an outline of proving confluence for the term system of $\mathsf{Selfstar}$ (Section \ref{Local}). Relation of confluence to type preservation is discussed in Section \ref{Conf:Presv}.  We present system $\mathsf{Selfstar}$ (Section \ref{Self}), which not only enable us to type Scott encoding and Church encoding data, but also allow us to derive corresponding induction principle and case analysis principle. 
\
\section{Dependently-typed programming with Selfstar}
\label{self}
$\selfstar$ uses the self type mechanism to obtain inductive data, resulting in a design that is simpler than most dependently-typed core languages. Intuitively, it is hard to imagine how to emulate inductive datatypes and pattern matching without any built-in mechanisms. But as we observed in section \ref{motivation}, Scott encoding together with mutually recursive definitions is enough to perform pattern matching on inductive data. The real difficulty lies in the typing. We want to make sure that both Scott-encoded data and the definable operations on these data are typable in $\selfstar$. The self type allows us to type Scott data and to derive the corresponding elimination schemes. Thus operations on Scott data are typable using the elimination schemes.  

\subsection{System Selfstar}
We give the full specification of $\selfstar$ in this section. We use gray boxes to highlight
certain important terms and rules. 

\begin{definition}[Syntax]

\

\noindent \textit{Terms} $t \ :: = * \ | \ x \ | \ \lambda x.t \ |
\ t t' \ | \ \mu t \ | \ \Pi x:t_1.t_2 \ | \ \gray{$\iota x.t$}$

\noindent \textit{Closure} $\mu \ ::= \M{x}{t}{N}$

\noindent \textit{Value} $v \ ::= * \ | \ \lambda x.t \ | \ \Pi x:t_1.t_2 \ | \ \iota x.t \ | \ \vec{\mu}(\Pi x:t_1.t_2) \ | \ \vec{\mu}(\iota x.t)$

\noindent \textit{Context} $\Gamma \ :: = \ \cdot \ | \ \Gamma, x:t
\ | \ \Gamma, \tilde{\mu}$

\end{definition} 
\noindent \textbf{Remarks} :

\begin{itemize}
  
  \item If $\mu$ is $\M{x}{t}{N}$, then $\tilde{\mu}$ is $\bm{x}{a}{t}{N}$ for some term $a_i$.

    \item  For $\M{x}{t}{N}$, we
require for any $ 1 \leq i \leq n $, the free variable set $\mathrm{FV}(t_i) \subseteq \mathrm{dom}(\mu) = \{x_1,..., x_n\}$. We also do not allow any reductions and substitutions inside the closure. We call this the \textit{locality} restriction. %% and each $t_i$ is \textit{pure}(i.e. does not contain any closure),  we call this requirement \textit{local property}. The motivation of purity requirement come from the following intuition: when ever $\mu_1, \Gamma_1, \{ x \mapsto \mu_1 y\}$, then we can reformulate it
%% as $\Gamma_1, \mu_1 \cup \{ x \mapsto y\}$. Purity is used in the proof of lemma \ref{erase:eq}.
 Without the locality requirement, it is hard to establish confluence of reductions (see \cite{Ariola:1997}). 
\item $\mathrm{FV}(\mu t) = \mathrm{FV}(t) - \mathrm{dom}(\mu)$. 
  
%% \item $\vec{\mu}t$ denotes $\mu_1...\mu_n t$. $\vec{\mu}$ here and through out this article is not allowed to be an empty closure. We do have a notation for allowing possibly empty many closure, namely, $\dot{\vec{\mu}}$. 

%% \item $[t'/x](\mu t )\ \equiv \mu([t'/x]t)$ and
%% $[t'/x](\iota y.t )\ \equiv \iota y. [t'/x]t$.
\end{itemize}

\begin{figure}
  \begin{tabular}{ll}

\infer{\cdot \vdash \mathsf{wf}}{}

&
\infer{\Gamma, \tilde{\mu} \vdash \mathsf{wf}}{\Gamma \vdash \mathsf{wf} & \{\Gamma,\tilde{\mu} \vdash t_j: a_j\}_{(t_j:a_j) \in\tilde{\mu}}  }    
\\
\\

\infer{\Gamma, x:t\vdash \mathsf{wf}}{\Gamma\vdash \mathsf{wf} & \Gamma \vdash t: * }    

  \end{tabular}
  \caption{Well-formed Context \fbox{$\Gamma \vdash \mathsf{wf}$}}
  \label{wf}
\end{figure}


\begin{figure}

\begin{center}
\begin{tabular}{ll}
    
  \infer[\textit{Star}]{\Gamma \vdash *:*}{}

&
  
\infer[\textit{Var}]{\Gamma \vdash x:t}{(x:t) \in \Gamma}
\\
\\
\gray{
\infer[\textit{Self}]{\Gamma \vdash \iota x.t : *}{\Gamma,
x:\iota x.t \vdash t : * }
}
&
\gray{
\infer[\textit{SelfInst}]{\Gamma \vdash t: [t/x]t'}{\Gamma
\vdash t : \iota x.t'}
}
\end{tabular}
\end{center}
%\vspace{-0.4cm}
\begin{center}
\begin{tabular}{c}
\infer[\textit{Pi}]{\Gamma \vdash \Pi x:t_1.t_2 : *}{\Gamma,
x: t_1 \vdash t_2 : * & \Gamma \vdash t_1 : * }
\\
\\
\gray{
\infer[\textit{SelfGen}]{\Gamma \vdash t : \iota x.t'}{\Gamma
\vdash t: [t/x]t' & \Gamma \vdash \iota x.t': *}
}
\\
\\
\infer[\textit{Conv}]{\Gamma \vdash t : t_2}{\Gamma \vdash t:
t_1 & \Gamma \vdash t_1 \cong t_2 & \Gamma \vdash t_2:*}

\\
\\
\infer[\textit{Lam}]{\Gamma \vdash \lambda x.t :\Pi x:t_1.
t_2}{\Gamma, x:t_1 \vdash t: t_2 & \Gamma \vdash t_1:*}
\\
\\
\infer[\textit{App}]{\Gamma \vdash t t':[t'/x] t_2}{\Gamma
\vdash t:\Pi x:t_1. t_2 & \Gamma \vdash t': t_1}

\\
\\
\infer[\textit{Mu}]{\Gamma \vdash \mu t: \mu t'}{\Gamma, \tilde{\mu}
\vdash t:t' &  \{\Gamma, \tilde{\mu} \vdash t_j: a_j\}_{(t_j:a_j) \in \tilde{\mu}} }

\end{tabular}
\end{center}
\caption{Typing \fbox{$\Gamma \vdash t:t'$}}
\label{typerule}
\end{figure}

The definition of well-formed contexts is in figure \ref{wf}; the typing rules are in figure \ref{typerule}. The operational semantics, equality, and type-level reduction rules are in figures \ref{cbv}, \ref{eq} and \ref{clored}, respectively.

\

\noindent \textbf{Remarks}:
\begin{itemize}
  
\item $(t_i : a_i) \in \tilde{\mu}$ means $(x_i:a_i) \mapsto t_i \in \tilde{\mu}$. $\vec{\mu}t$ denotes $\mu_1...\mu_n t$.  %% $x_i \mapsto t_i \in \tilde{\mu}$ means $(x_i:a_i) \mapsto t_i \in \tilde{\mu}$.

\item Typing does not depend on well-formedness of the context, so the self type formation rule \textit{Self} is not circular in this sense. We will show: if $\Gamma \vdash \mathsf{wf}$ and $\Gamma \vdash t:t'$, then $\Gamma \vdash t':*$ (Appendix \ref{wftype}). 
\item We use a call-by-value strategy for execution. 
\item $\cong$ denotes $=_o \cup =$, where $=_o$ denotes the reflexive, transitive, and symmetric
closure of $\to_o$. 
\item The equality rules incorporate execution to automate a portion of equational reasoning. 

\item At the type level, we want the ability to open a closure when it appears in the context. Closure reduction allows us to do this; without this type-level reduction, we cannot prove type preservation. 
\end{itemize}


\begin{figure}
\begin{center}
  \begin{tabular}{ll}

 \infer{\Gamma \vdash \vec{\mu} * \leadsto *}{}

& 

 \infer{\Gamma \vdash x_i \leadsto  t_i}{(x_i \mapsto t_i) \in \Gamma}

\\
\\

 \infer{\Gamma \vdash \vec{\mu} x_i \leadsto  \vec{\mu} t_i}{(x_i \mapsto t_i) \in
\mu \in \vec{\mu}}

&
\infer{\Gamma \vdash\vec{\mu} x \leadsto  x}{x \notin
\mathrm{dom}(\vec{\mu}) }


\\
\\
\infer{\Gamma \vdash\vec{\mu}(t t') \leadsto
(\vec{\mu}t)(\vec{\mu}t')}{ }

&
\infer{\Gamma \vdash\vec{\mu}(\lambda x.t) \leadsto \lambda
x.(\vec{\mu}t)}{ }

\\
\\
\infer{\Gamma \vdash(\lambda x.t)v \leadsto [v/x]t}{}

&
\infer{\Gamma \vdash t t' \leadsto t'' t'}{\Gamma \vdash t
\leadsto t''}
\\
\\
\infer{\Gamma \vdash(\lambda x.t) t' \leadsto (\lambda x.t)
t''}{\Gamma \vdash t'\leadsto t''}

\end{tabular}
\end{center}
\caption{Call-by-value Reductions}
\label{cbv}
\end{figure}


\begin{figure}

\begin{center}
\begin{tabular}{ll}

\infer{\Gamma \vdash t_1= t_2}{\Gamma \vdash t_1
{\leadsto^*} t_2}

&
\infer{\Gamma \vdash(\lambda x.t)t' = [t'/x]t}{}

\\
\\

\infer{\Gamma \vdash \mu t = t }{ \mathrm{FV}(t)\#
\mathrm{dom}(\mu)}

&

\infer{\Gamma \vdash\mu (\iota x.t) =\iota x.(\mu t)}{}

\\
\\
\infer{\Gamma \vdash\mu (\Pi x:t_1.t_2) =\Pi x:\mu t_1. \mu
t_2}{}


\\
\\

\infer{\Gamma \vdash t t' = t'' t'''}{\Gamma \vdash t=t''&\Gamma \vdash t' = t'''}
    &


\infer{\Gamma \vdash\mu t =\mu t'}{\Gamma,\tilde{\mu} \vdash t=t' }

\\
\\

\infer{\Gamma \vdash\iota x.t = \iota x.t'}{\Gamma \vdash t =t'
}

&

\infer{\Gamma \vdash\lambda x.t = \lambda x.t'}{\Gamma \vdash t
=t' }
\\
\\

\infer{\Gamma \vdash t_1 =  t_3}{\Gamma \vdash t_2 =  t_3
&\Gamma \vdash t_1 =  t_2}

&
\infer{\Gamma \vdash t_1 =  t_2}{\Gamma \vdash t_2 = t_1 }

\end{tabular}
\end{center}
  \caption{Equality}
  \label{eq}
\end{figure}

\begin{figure}

\begin{center}
\begin{tabular}{ll}
\infer{\Gamma \vdash \mu t \to_{o} t}{\tilde{\mu} \in \Gamma}
&
\infer{\Gamma \vdash\lambda x.t \to_{o} \lambda x.t'}{\Gamma
\vdash t \to_{o}t' }

\\
\\
\infer{\Gamma \vdash t t' \to_{o} t'' t'}{\Gamma \vdash t
\to_{o}t''}

&

\infer{\Gamma \vdash t t' \to_{o} t t''}{\Gamma \vdash
t'\to_{o}t''}

\\
\\
\infer{\Gamma \vdash \Pi x:t_1.t_2 \to_{o} \Pi
x:t_1'.t_2'}{\Gamma \vdash t_1 \to_{o}t_1' } 


&
\infer{\Gamma \vdash \mu t \to_{o} \mu t'}{\Gamma,\tilde{\mu}\vdash t
\to_{o}t' }

\\
\\

\infer{\Gamma \vdash \Pi x:t_1.t_2 \to_{o} \Pi
x:t_1'.t_2'}{\Gamma \vdash t_2 \to_{o}t_2' } 

&
\infer{\Gamma \vdash \iota x.t \to_{o} \iota x.t'}{\Gamma
\vdash t \to_{o}t' }
  
\end{tabular}
\end{center}
\caption{Closure Reductions}
\label{clored}
\end{figure}

\subsection{Scott Encodings in Selfstar}
Now let us see some concrete examples of Scott encodings in $\selfstar$. For convenience, we write $a \to b$ for $\Pi x:a.b$ with $x \notin \mathrm{FV}(b)$.



\begin{definition}[Scott Numerals]
Let $\tilde{\mu}_s$ be the following recursive definitions:

\noindent $(\mathsf{Nat}:* ) \mapsto$

\noindent $ \gray{$\iota x$}. \Pi C: \mathsf{Nat} \to *. \gray{ $(\Pi n : \mathsf{Nat}.C\ (\mathsf{S}\ n))$} \to (C\ 0) \to (C\ x)$

\noindent $(\mathsf{S}: \mathsf{Nat} \to \mathsf{Nat} )\mapsto \lambda n.\lambda C. \lambda s.\lambda z. s \ n$

\noindent $(0:\mathsf{Nat})  \mapsto \lambda C. \lambda s. \lambda z.z$

\noindent with $s: \Pi n : \mathsf{Nat}.C\ (\mathsf{S}\ n), z: C\ 0, n: \mathsf{Nat}$, we have
 $\tilde{\mu}_s \vdash \mathsf{wf}$ (using the \textit{SelfGen} and \textit{SelfInst} rules).

\end{definition}



\begin{definition}[Elimination Scheme for Scott Numerals]
\

\noindent $\tilde{\mu}_s \vdash \mathsf{Case} : \Pi C: \mathsf{Nat} \to *. (\Pi n:\mathsf{Nat}. C\ (\mathsf{S}\ n)) \to C\ 0 \to \Pi n:\mathsf{Nat}. C\ n$  

\noindent $\mathsf{Case}\ := \lambda C. \lambda s.\lambda z. \lambda n. n\ C\ s\ z$

\noindent with $s: \Pi n:\mathsf{Nat}. C\ (\mathsf{S}\ n), z: C\ 0, n : \mathsf{Nat}$ (using the \textit{SelfInst} rule).
\end{definition}

\noindent \textbf{Typing}: Let $\Gamma = \tilde{\mu}_s, C: \mathsf{Nat}\to *, s:\Pi n : \mathsf{Nat}. C\ (\mathsf{S}\ n), z: C\ 0, n : \mathsf{Nat}$. Since $n : \mathsf{Nat}$, by \textit{selfInst}, $n : \Pi C: \mathsf{Nat}\to *.  (\Pi y : \mathsf{Nat}. C\ (\mathsf{S}\ y)) \to (C\ 0) \to (C\ n)$. Thus $n \ C \ s\ z : C\ n$.


\begin{definition}[Addition] We define $\mu_+$: 
\
  
\noindent $(\mathsf{add}: \mathsf{Nat} \to \mathsf{Nat} \to \mathsf{Nat} ) \mapsto$
  
  $ \lambda n.\lambda m. \mathsf{Case}\ (\lambda n.\mathsf{Nat})\ (\lambda p . (\mathsf{S}\ (\mathsf{add}\ p\ m)) )\ m \ n$

\end{definition}
\noindent One can check that $\tilde{\mu}_s, \mu_+ \vdash \mathsf{wf}$.
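Unfolding $\mathsf{Case}$, the computational content of $\mathsf{add}$ is ordinary case analysis on the first argument. A Haskell sketch (repeating the Scott-numeral setup so the fragment is self-contained; the names are ours):

```haskell
{-# LANGUAGE RankNTypes #-}

-- Scott numerals, as before.
newtype Nat = Nat { caseNat :: forall u. (Nat -> u) -> u -> u }

zero :: Nat
zero = Nat (\_ z -> z)

suc :: Nat -> Nat
suc n = Nat (\s _ -> s n)

-- add n m = case n of { S p -> S (add p m); 0 -> m }
add :: Nat -> Nat -> Nat
add n m = caseNat n (\p -> suc (add p m)) m

-- Fold a numeral down to an Int, to observe its value.
toInt :: Nat -> Int
toInt n = caseNat n (\p -> 1 + toInt p) 0
```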

\begin{definition}[Leibniz Convertibility]
  \
  
\noindent $\mathsf{Eq} :=\lambda A. \lambda x.\lambda y. \Pi C: A \to *. C\ x \to C\ y $.
\end{definition}

\begin{definition}
  $\tilde{\mu}_s  \vdash \mathsf{addZ} : \Pi x:\nat. \eq \ \nat \ (\add \ x \ 0) \ x$.
\end{definition}

\noindent \textbf{Typing}: We are trying to show 

\noindent $\Pi x:\nat. \Pi C: \nat\to *. C\ (\add \ x \ 0) \to C\ x$ 

\noindent is inhabited. We know that the type of $\case\ (\lambda z.(\eq \ \nat \ (\add \ z \ 0) \ z)) $ is $(\Pi n:\mathsf{Nat}. \eq \ \nat \ (\add \ (\suc n) \ 0) \ (\suc n)) \to (\eq \ \nat \ (\add \ 0 \ 0) \ 0) \to \Pi n:\mathsf{Nat}.\eq \ \nat \ (\add \ n \ 0) \ n$. 

\noindent So $\case\ (\lambda z.(\eq \ \nat \ (\add \ z \ 0) \ z)) \ p_1 \ p_2 : \Pi n:\mathsf{Nat}.\eq \ \nat \ (\add \ n \ 0) \ n$, with $p_1: \Pi n:\mathsf{Nat}. \eq \ \nat \ (\add \ (\suc n) \ 0) \ (\suc n)$ and $p_2 : \eq \ \nat \ (\add \ 0 \ 0) \ 0$. 

\noindent It is easy to see that $p_2 := \lambda C[: \nat \to *]. \lambda x[:C (\add \ 0 \ 0)].x$. 

\noindent We know $\mathsf{addZ}\ n\ (\lambda q[:\nat]. C (\suc q)) : C (\suc (\add\ n \ 0)) \to C (\suc n)$. 

\noindent Thus $p_1 := $

\noindent $\lambda n[:\nat].\lambda C[:\nat \to *].\lambda z[:C (\add \ (\suc n) \ 0)].$

$(\mathsf{addZ}\ n\ (\lambda q[:\nat]. C (\suc q))) \ z$. 

\noindent So we arrive at the following definition:

  
  \noindent $\mathsf{addZ} := \case\ (\lambda z.(\eq \ \nat \ (\add \ z \ 0) \ z))$
  
  $ (\lambda n. \lambda C. \lambda z.  (\mathsf{addZ}\ n\ (\lambda q. C (\suc q))) \ z) $
  
  $(\lambda C. \lambda x.x) =_{\beta} $
  
  $\lambda y. y\ (\lambda z.(\eq \ \nat \ (\add \ z \ 0) \ z))$
  
  $ (\lambda n. \lambda C. \lambda z.  (\mathsf{addZ}\ n\ (\lambda q. C (\suc q))) \ z) \ (\lambda C. \lambda x.x)$

  \
  
 Observe that $\mathsf{addZ}$ is a recursive function that is equivalent to $\lambda C.\lambda z.z$ for every Scott numeral input. So it is safe to use $\mathsf{addZ}\ \bar{n}$ to convert $\add\ \bar{n}\ 0$ to $\bar{n}$. 

\begin{definition}[Vector]
  \
  
Let $\tilde{\mu}_v$ be the following recursive definitions:

\noindent $(\vecc(U, n):* ) \mapsto$

\noindent $ \gray{$\iota x$}. \Pi C: \Pi p:\nat. \vecc(U, p) \to *. $

$\gray{ $(\Pi m : \mathsf{Nat}. \Pi u:U. \Pi y:\vecc(U, m). C\ (\mathsf{S}\ m)\ (\cons\ m \ u\ y))$} \to$

$(C\ 0 \ \nil) \to (C\ n\ x)$

\noindent $(\cons: \Pi n: \nat. U \to \vecc(U, n) \to \vecc(U, \suc n) )\mapsto $

$\lambda n.\lambda v. \lambda l. \lambda C.\lambda y. \lambda x.y \ n\ v\ l$

\noindent $(\nil:\vecc(U, 0))  \mapsto \lambda C. \lambda y. \lambda x.x$

\noindent where $n: \mathsf{Nat}, v: U, l: \vecc (U, n), C:\Pi p:\nat. \vecc(U, p) \to * , y:\Pi m: \mathsf{Nat}.\Pi u:U.\Pi y: \vecc(U,m).(C \ (\mathsf{S}m)\ (\mathsf{cons}\ m\ u\ y)) , x:  C \ 0 \ \nil $. 
\end{definition}

\noindent \textbf{Typing}: It is easy to see that $\nil$ has type $\vecc (U, 0)$. Now we show that $\cons$ has type $\Pi n: \mathsf{Nat}.U \to \vecc (U, n) \to \vecc (U, \suc n)$. The type of $y \ n\ v\ l$ is $C\ (\mathsf{S}\ n)\ (\mathsf{cons}\ n\ v\ l)$. 

\noindent So $\gray{$\lambda C.\lambda y. \lambda x. y\ n\ v \ l$} :$

$\Pi C: (\Pi p:\nat.\vecc(U, p)\to *).$

$(\Pi m: \mathsf{Nat}.\Pi u:U.\Pi y: \vecc(U,m).(C \ (\mathsf{S}m)\ (\mathsf{cons}\ m\ u\ y))) \to $

$C\ 0\ \nil \to C\ (\suc n) \ \gray{$(\lambda C.\lambda y. \lambda x. y\ n\ v \ l)$} $. 

\noindent So by \textit{selfGen}, we have $\lambda C.\lambda y. \lambda x. y\ n\ v \ l : \vecc (U, \suc n)$. Thus $ \cons : \Pi n:\mathsf{Nat}. U \to \vecc(U, n) \to \vecc(U, \suc n)$.

\begin{definition}[Elimination Scheme for Vector]
\

  \noindent  $\tilde{\mu}_v \vdash \mathsf{Case}(U, n) : $
  
  $ \Pi C: (\Pi p:\nat.\vecc(U, p) \to *).$
  
 $(\Pi m: \mathsf{Nat}.\Pi u:U.\Pi y: \vecc(U,m).(C \ (\mathsf{S}m)\ (\mathsf{cons}\ m\ u\ y))) $
  
  $\to C\ 0\ \nil \to \Pi x: \vecc(U, n).(C\ n\ x)$

\noindent where $\mathsf{Case}(U,n) := \lambda C. \lambda s. \lambda z. \lambda x. x\ C\ s\ z$

\noindent $C: (\Pi p:\nat.\vecc(U, p) \to *), s :\Pi m: \mathsf{Nat}.\Pi u:U.\Pi y: \vecc(U,m).(C \ (\mathsf{S}m)\ (\mathsf{cons}\ m\ u\ y)), z: C\ 0\ \nil, x:\vecc(U,n)$. 

\end{definition}

\begin{definition}[Append]
\

\noindent $\tilde{\mu}_v \vdash \app :$

\noindent $\Pi n_1:\mathsf{Nat}. \Pi n_2:\mathsf{Nat}.\vecc(U, n_1) \to \vecc(U, n_2) \to \vecc(U, n_1+n_2)$

\noindent where $\app := \lambda n_1. \lambda n_2.\lambda l_1.\lambda l_2. $

$\mathsf{Case}(U, n_1) ( \lambda z.\lambda q.\vecc(U, z + n_2))$

$(\lambda m. \lambda h. \lambda t. \cons\ (m+n_2) \ h \ (\app \ m \ n_2\ t\ l_2)) $

$l_2 \ l_1$ 

\end{definition}


\noindent \textbf{Typing}: We want to show $\app : \Pi n_1:\mathsf{Nat}. \Pi n_2:\mathsf{Nat}. \vecc(U, n_1) \to \vecc(U, n_2) \to \vecc(U, n_1+n_2)$. We instantiate $C :=  \lambda z.\lambda q.\vecc(U, z + n_2)$, where $q$ does not occur in $\vecc(U, z + n_2)$, in $\mathsf{Case}(U, n_1)$. By beta reductions, we get $\mathsf{Case}(U, n_1)\ (\lambda z.\lambda q.\vecc(U, z + n_2)) : (\Pi m: \mathsf{Nat}.\Pi u:U.\Pi y: \vecc(U,m).\vecc(U, (\mathsf{S}\ m) + n_2)) \to \vecc(U, 0+n_2) \to \Pi x: \vecc(U, n_1). \vecc(U, n_1+n_2)$. 

\noindent Also, $\lambda m. \lambda h. \lambda t. \cons\ (m+n_2) \ h \ (\app \ m \ n_2\ t\ l_2) : \Pi m: \mathsf{Nat}.\Pi h:U.\Pi t: \vecc(U,m).\vecc(U, \mathsf{S}\ (m+n_2))$. 

\noindent With $l_1: \vecc(U, n_1)$ and $l_2:\vecc(U, n_2)$, we can see that the claimed typing holds. 
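The computational content of $\app$ is the usual append on Scott-encoded lists; only the length index disappears when we drop dependency. A Haskell sketch with the index erased (the names are ours):

```haskell
{-# LANGUAGE RankNTypes #-}

-- Scott-encoded lists; the length index of vec(U, n) is erased here.
newtype ListS a = ListS { caseList :: forall u. (a -> ListS a -> u) -> u -> u }

nilS :: ListS a                       -- lam y. lam x. x
nilS = ListS (\_ z -> z)

consS :: a -> ListS a -> ListS a      -- lam y. lam x. y h t
consS h t = ListS (\s _ -> s h t)

-- app l1 l2 = case l1 of { cons h t -> cons h (app t l2); nil -> l2 }
appS :: ListS a -> ListS a -> ListS a
appS l1 l2 = caseList l1 (\h t -> consS h (appS t l2)) l2

-- Convert to an ordinary list, to observe the result.
toList :: ListS a -> [a]
toList l = caseList l (\h t -> h : toList t) []
```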


\begin{definition}[Associativity]
\

\noindent $\tilde{\mu}_v \vdash \mathsf{assoc}:\Pi (n_1, n_2, n_3: \nat).$

$ \Pi(v_1: \vecc(U, n_1), v_2:\vecc(U, n_2), v_3:\vecc(U, n_3)).$

$ \eq \ \vecc(U, n_1 + n_2 + n_3) \ (\app \ n_1\ (n_2+n_3)\ v_1 \ (\app\ n_2\ n_3 \ v_2 \ v_3))$

$(\app \ (n_1 + n_2) \ n_3\ (\app \ n_1 \ n_2 \ v_1 \ v_2) \ v_3)$
\end{definition}

\noindent The full term for $\mathsf{assoc}$ is lengthy, so we elide
it here. 

The following definition of numerals is the one we mentioned in section \ref{motivation}. 

\begin{definition}[A Variant of Scott Numerals]
Let $\tilde{\mu}_d$ be the following recursive definitions:

\noindent $(\mathsf{Nat}:* ) \mapsto $

\noindent \small{$\gray{$\iota x$}. \Pi C: \mathsf{Nat} \to *.  (\Pi n : \mathsf{Nat}. (C\ n) \to (C\ (\mathsf{S}\ n))) \to (C\ 0) \to C\ x$}

\noindent $(\mathsf{S}: \mathsf{Nat} \to \mathsf{Nat} )\mapsto \lambda n.\lambda C. \lambda s.\lambda z. s \ n\ (n\ C\ s\ z)$

\noindent $(0:\mathsf{Nat})  \mapsto \lambda C. \lambda s. \lambda z.z$

\noindent with $s: \Pi n : \mathsf{Nat}. (C\ n) \to (C\ (\mathsf{S}\ n)), z: C 0, n: \mathsf{Nat}$, we have $\tilde{\mu}_d \vdash \mathsf{wf}$ (using \textit{selfGen} and \textit{selfInst} rules).

\end{definition}

\begin{definition}[Elimination Scheme]
\

\noindent $\tilde{\mu}_d \vdash \mathsf{Rec} : \Pi C: \mathsf{Nat} \to *. (\Pi n : \mathsf{Nat}. (C\ n) \to C\ (\mathsf{S}\ n)) \to C\ 0 \to \Pi n:\mathsf{Nat}. C\ n$  

\noindent $\mathsf{Rec}\ := \lambda C. \lambda s.\lambda z. \lambda n. n\ C\ s\ z$

\noindent with $s:\Pi n : \mathsf{Nat}. (C\ n) \to (C\ (\mathsf{S}\ n)), z: C 0, n : \mathsf{Nat}$.
\end{definition}

\noindent \textbf{Typing}: Let $\Gamma = \tilde{\mu}_d, C: \mathsf{Nat}\to *, s:\Pi n : \mathsf{Nat}. (C\ n) \to (C\ (\mathsf{S}\ n)), z: C 0, n : \mathsf{Nat}$. Since $n : \mathsf{Nat}$, by \textit{selfInst}, $n : \Pi C: ( \mathsf{Nat}\to *).  (\Pi y : \mathsf{Nat}. (C\ y) \to (C\ (\mathsf{S}\ y))) \to (C\ 0) \to (C\ n)$. Thus $n \ C \ s\ z : C\ n$.


\section{Metatheory}
\label{s}

We apply the method that we developed in \cite{Pfu:2013} to prove type preservation. Again, 
we want to emphasize that $\selfstar$ is a different system compared to $\self$ in \cite{Pfu:2013}; the only similar features are self types and mutually recursive definitions, which result in a
similar type preservation proof. In fact, we present the whole proof of type preservation for
$\selfstar$ in order to show that the method we developed in \cite{Pfu:2013} for proving type preservation is indeed applicable to other similar systems. 

The proof of type preservation for $\selfstar$ is simpler than the one in \cite{Pfu:2013} in the following senses: 1. With $*:*$, we no longer have separate syntactic categories for
types and kinds. 2. Polymorphism is annotated in $\selfstar$, so we do not need the \textit{morph analysis} of \cite{Pfu:2013}. These simplifications lead to a simpler proof of type preservation. 


In order to prove type preservation for $\selfstar$, we need a \textit{confluence analysis} for the type-level transformation: we must show that the transformation is confluent, so that a transformation from $\Pi x:t_1.t_2$ to $\Pi x:t_1'.t_2'$ implies that $t_1$ can be transformed to $t_1'$ and $t_2$ to $t_2'$. This establishes the \textit{compatibility} property for $\selfstar$, which is the major result needed to prove type preservation. The proofs for sections \ref{analytic}, \ref{confanalysis} and \ref{preservation} are in Appendices \ref{a1}, \ref{a2} and \ref{a3}. Once type preservation is proved, the progress theorem is easy to prove.


\subsection{The Analytical System}
\label{analytic}
It is cumbersome to prove directly that the equality in $\selfstar$ is Church-Rosser.
 We therefore develop an \textit{analytical system} and prove that it is equivalent (theorem \ref{SandC}) to the equality system of $\selfstar$. We then prove that the analytical system is confluent, which implies that the equality in $\selfstar$ is Church-Rosser. 

The beta-reductions (figure \ref{betared}) include definition substitution and the ordinary beta-reduction of the lambda calculus. The mu-reductions (figure \ref{mured}) move the closure inside the term structure. 

\begin{figure}

\

\centering{
\begin{tabular}{ll}


\infer{\Gamma \vdash x \to_{\beta} t}{(x\mapsto t) \in \Gamma}

&
\infer{\Gamma \vdash(\lambda x.t)t' \to_{\beta} [t'/x]t}{}

\\
\\

\infer{\Gamma \vdash\mu x_i \to_{\beta} \mu t_i}{(x_i \mapsto
t_i) \in \mu}

&

\infer{\Gamma \vdash\lambda x.t \to_{\beta} \lambda x.t'}{\Gamma
\vdash t \to_{\beta}t' }

\\

\\
\infer{\Gamma \vdash t t' \to_{\beta} t'' t'}{\Gamma \vdash t
\to_{\beta}t''}

&

\infer{\Gamma \vdash t t' \to_{\beta} t t''}{\Gamma \vdash
t'\to_{\beta}t''}

\\
\\

\infer{\Gamma \vdash \mu t \to_{\beta} \mu t'}{\Gamma,\tilde{\mu}\vdash t
\to_{\beta}t' }

&

\infer{\Gamma \vdash \iota x.t \to_{\beta} \iota x.t'}{\Gamma
\vdash t \to_{\beta}t' }
\\
\\

\infer{\Gamma \vdash \Pi x:t_1.t_2 \to_{\beta} \Pi
x:t_1.t_2'}{\Gamma \vdash t_2 \to_{\beta}t_2' } 

&
\infer{\Gamma \vdash \Pi x:t_1.t_2 \to_{\beta} \Pi
x:t_1'.t_2}{\Gamma \vdash t_1 \to_{\beta}t_1' } 
\end{tabular}
}
\caption{Beta Reductions}
  \label{betared}
\end{figure}

\begin{figure}

\

\centering{
\begin{tabular}{ll}


\infer{ \Gamma \vdash \mu t \to_{\mu} t}{\mathrm{dom}(\mu) \#
\mathrm{FV}(t)}

&
\infer{ \Gamma \vdash \mu(\lambda x.t) \to_{\mu} \lambda x.\mu
t}{}

\\
\\

\infer{ \Gamma \vdash \mu(t_1 t_2)  \to_{\mu} (\mu t_1 ) (\mu
t_2)}{}

&

\infer{ \Gamma \vdash \mu(\iota x.t) \to_{\mu} \iota x.\mu t}{}

\\

\\
\infer{ \Gamma \vdash \mu(\Pi x:t_1.t_2) \to_{\mu} \Pi x:\mu
t_1.\mu t_2}{}

&
\infer{ \Gamma \vdash \lambda x.t \to_{\mu} \lambda x.t'}{\Gamma
\vdash t \to_{\mu} t'}

\\
\\
\infer{ \Gamma \vdash t t' \to_{\mu} t t''}{ 
\Gamma \vdash t'\to_{\mu} t''}

&
\infer{ \Gamma \vdash t t' \to_{\mu} t'' t'}{ \Gamma \vdash t
\to_{\mu} t''}

\\

\\
\infer{\Gamma \vdash  \Pi x:t_1.t_2 \to_{\mu} \Pi x:t_1'.
t_2}{ \Gamma \vdash t_1 \to_{\mu} t_1'}

&
\infer{\Gamma \vdash  \iota x.t \to_{\mu} \iota x.t'}{\Gamma
\vdash t \to_{\mu} t'}

\\
\\
\infer{ \Gamma \vdash \Pi x:t_1.t_2 \to_{\mu} \Pi x:t_1.
t_2'}{ \Gamma \vdash t_2 \to_{\mu} t_2'}

&
\infer{ \Gamma \vdash \mu t \to_{\mu} \mu t'}{\Gamma,\tilde{\mu}\vdash t
\to_{\mu}t' }

\end{tabular}
}  
\caption{Mu Reductions}
  \label{mured}
\end{figure}

Let $\to$ denote $\to_{\beta} \cup \to_{\mu}$. Let $\leftrightarrow^*$ denote $(\to \cup \to^{-1})^*$. The following lemmas show the relation between $\to$ and $=$. 

\begin{lemma} \label{optoan} If $\Gamma \vdash t_1 \leadsto t_2 $, then
$\Gamma \vdash  t_1 \to t_2$.  \end{lemma}
\begin{proof} \noindent By induction on the derivation of $\Gamma \vdash t_1
    \leadsto t_2 $. 
\end{proof}


\begin{lemma}
\label{join} 
If $\Gamma \vdash t_1 = t_2 $, then  $\Gamma \vdash t_1 \leftrightarrow^* t_2$.  
\end{lemma} 
\begin{proof}
  By induction on the derivation of $\Gamma \vdash t_1 = t_2$. 
\end{proof}
\begin{lemma} 
  \label{toeq}
If $\Gamma \vdash  t_1 \to t_2 $, then $\Gamma \vdash t_1 = t_2$.  
\end{lemma} 
\begin{proof}
  By induction on the derivation of $\Gamma \vdash  t_1 \to t_2$.
\end{proof}

The following theorem shows that the analytical system is equivalent to the equality system. 
\begin{theorem} 
\label{SandC} 
$\Gamma \vdash t_1 = t_2$ iff $\Gamma \vdash t_1 \leftrightarrow^* t_2$.
\end{theorem}
\begin{proof}
  By lemma \ref{join}, lemma \ref{toeq}.
\end{proof}

Suppose $\to$ is confluent. By theorem \ref{SandC}, we know that $\Gamma \vdash \Pi x:t_1.t_2 = \Pi x:t_1'.t_2'$ implies $\Gamma \vdash \Pi x:t_1.t_2 \leftrightarrow^* \Pi x:t_1'.t_2'$. The confluence of $\to$ implies the Church-Rosser property of $\leftrightarrow^*$, namely, there exists a $t$ such that $\Gamma \vdash \Pi x:t_1.t_2 \to^* t$ and $\Gamma \vdash \Pi x:t_1'.t_2' \to^* t$. By the definition of $\to$, we know $t$ must be of the form $\Pi x:t_3.t_4$, with $\Gamma \vdash t_1 \to^* t_3$, $\Gamma \vdash t_1' \to^* t_3$, $\Gamma \vdash t_2 \to^* t_4$ and $\Gamma \vdash t_2' \to^* t_4$. So by lemma \ref{toeq}, we have $\Gamma \vdash t_1 = t_1'$ and $\Gamma \vdash t_2 = t_2'$.

Now let us focus on the proof of the confluence of $\to$. The confluence argument is similar to the one described in \cite{CurienHL96}. We are going to use the following lemma to conclude the confluence of $\to_{\beta} \cup \to_{\mu}$.
\begin{lemma}[Hardin's interpretation lemma \cite{Hardin:1989}]
\label{interp}
Let $\to $ be $ \to_1 \cup \to_2$, 
$\to_1$ being confluent and strongly normalizing. We denote by $\nu(a)$ the $\to_1$-normal form of $a$. Suppose that there is some relation $\to_i$ on the $\to_1$-normal forms satisfying:

\

$\to_i \subseteq \twoheadrightarrow$, \footnote{$\twoheadrightarrow$ is the reflexive transitive closure of $\to$.} and $a \to_2 b $ implies $ \nu(a)   {\twoheadrightarrow_i}    \nu(b)$ $(\dagger)$

\

\noindent Then the confluence of $\to_i$ implies the confluence of $\to$.
\end{lemma}

\begin{proof}
 Suppose $\to_i$ is confluent. Assume $a  {\twoheadrightarrow}  a'$ and $a  {\twoheadrightarrow}  a''$. So by ($\dagger$), $\nu(a)  {\twoheadrightarrow_i}  \nu(a')$ and $\nu(a)  {\twoheadrightarrow_i}  \nu(a'')$. Note that $t  {\to_1^*}  t'$ implies $\nu(t) = \nu(t')$ (by the confluence and strong normalization of $\to_1$). By the confluence of $\to_i$, there exists a $b$ such that $\nu(a')  {\twoheadrightarrow_i}  b$ and $\nu(a'')  {\twoheadrightarrow_i}  b$. Since $\to_i, \to_1 \subseteq \twoheadrightarrow$, we get $a' {\twoheadrightarrow}   \nu(a')  {\twoheadrightarrow}  b$ and $a'' {\twoheadrightarrow}   \nu(a'')  {\twoheadrightarrow}  b$. Hence $\to$ is confluent.
\end{proof}


The idea behind the interpretation method is that it allows us to work modulo the $\to_1$-reduction: we only need to prove the confluence of $\to_i$. This is
essential since, in our case, $\to_{\beta, \mu}$ cannot be directly parallelized; one cannot use the Tait--Martin-L\"of method (reported in \cite{Barendregt85}) directly to prove the confluence of $\to_{\beta, \mu}$, because the parallelized version does not enjoy the diamond property. With the interpretation method, after working modulo the $\to_{\mu}$-reduction, we introduce a new reduction $\to_{\beta\mu}$ (corresponding to $\to_i$), and we can then use the parallel reduction method to prove the confluence of $\to_{\beta\mu}$. 
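To illustrate the parallel reduction method on familiar ground, the following Haskell sketch works on plain $\lambda$-terms (our names; we assume all bound variables are distinct, so naive substitution is capture-free). It implements the maximal parallel step, i.e.\ a complete development; in Takahashi's formulation the diamond closes at \texttt{dev t}: if $a \Rightarrow b$ then $b \Rightarrow \mathtt{dev}\ a$.

```haskell
-- Plain lambda-terms with named variables.
data T = V String | L String T | A T T deriving (Eq, Show)

-- Naive substitution: sub x s t = [s/x]t (assumes distinct bound names).
sub :: String -> T -> T -> T
sub x s (V y)   = if y == x then s else V y
sub x s (L y b) = if y == x then L y b else L y (sub x s b)
sub x s (A f a) = A (sub x s f) (sub x s a)

-- Maximal parallel step: contract every redex simultaneously.
dev :: T -> T
dev (V x)         = V x
dev (L x b)       = L x (dev b)
dev (A (L x b) a) = sub x (dev a) (dev b)
dev (A f a)       = A (dev f) (dev a)
```

For $t = (\lambda x.\,x\,x)\,((\lambda y.\,y)\,z)$, contracting only the outer redex and contracting only the inner redex give different one-step reducts, yet both develop to $z\,z$, closing the diamond.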

\begin{lemma}
  $\to_{\mu}$ is confluent and terminating.
\end{lemma}

So $\to_{\mu}$ corresponds to $\to_1$ in the interpretation lemma. Since $\to_{\mu}$ is strongly normalizing and confluent, we can define a normalization function that effectively computes the $\mu$-normal form. 

\begin{definition}[$\mu$-Normal Forms]
  \label{mu-normal}
\

\noindent $n \ :: = * \ | \ x \ | \   \mu x_i \ | \ \lambda x.n \ | \ n n'\ | \ \Pi x:n.n' \ | \ \iota x.n$

\end{definition}

\noindent  \textbf{Note}: for the $\mu x_i$ in definition \ref{mu-normal}, we assume $x_i \in \mathrm{dom}(\mu)$. 

\begin{definition}[$\mu$ normalization function]
    \
    
    \begin{tabular}{ll}

  $ m(*) \ := \ * $
\\
  $ m(x) \ := \ x$
\\
  $m(\lambda y.t)\ := \ \lambda y.m(t)$
\\
  $m(t_1 t_2)\ := \ m(t_1)\ m(t_2)$
\\
  $m(\iota x.t)\ := \ \iota x. m(t)$
\\
  $m(\Pi x:t.t') \ := \ \Pi x:  m(t). m(t')$
\\
  $ m(\vec{\mu}y) \ := \ y$ if $y \notin \mathrm{dom}(\vec{\mu})$
\\
  $ m(\vec{\mu}y) \ := \ \mu_i y$ if $y \in \mathrm{dom}(\mu_i)$
\\
  $m(\vec{\mu}(t t')) \ := \ m(\vec{\mu} t)\ m( \vec{\mu}t')$
\\
  $m(\vec{\mu}( \lambda x.t)) \ := \ \lambda x.  m(\vec{\mu}t)$
\\
  $m(\vec{\mu}( \iota x.t)) \ := \ \iota x.  m(\vec{\mu}t)$
\\
  $m(\vec{\mu}( \Pi x:t.t')) \ := \ \Pi x:  m(\vec{\mu}t). m(\vec{\mu}t')$

\end{tabular}

\end{definition}

We shall devise a new notion of reduction on $\mu$-normal forms and then show that this reduction is confluent (it corresponds to $\to_i$ in the interpretation lemma and satisfies the $\dagger$ property); by the interpretation lemma, this shows that $\to_{\beta, \mu}$ is confluent\footnote{$\to_{\beta,\mu}$ denotes $\to_{\beta} \cup \to_{\mu}$; we use this convention throughout the paper.}. A natural way to define such a reduction is to $\mu$-normalize the contractum immediately after each beta-reduction.  

\begin{definition}[$\beta$ Reduction on $\mu$-normal Forms]
\

\begin{center}
  
\infer{\Gamma \vdash n \to_{\beta \mu} m(t)}{\Gamma \vdash n \to_{\beta}t}
\end{center}
\end{definition}
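Definitionally, $\to_{\beta\mu}$ is one $\beta$-step followed by the normalization function $m$. The following first-order Haskell sketch makes this composition concrete (hypothetical names; substitution is naive and assumes distinct bound names, the definition-substitution rule for $\Gamma$ is omitted, and nested closures are left untreated):

```haskell
-- Terms of Selfstar, first order; TMu ds t is a closure mu applied to t.
data Tm
  = TStar | TVar String | TLam String Tm | TApp Tm Tm
  | TPi String Tm Tm | TIota String Tm | TMu [(String, Tm)] Tm
  deriving (Eq, Show)

-- Naive substitution: subst x s t = [s/x]t.
subst :: String -> Tm -> Tm -> Tm
subst _ _ TStar       = TStar
subst x s (TVar y)    = if y == x then s else TVar y
subst x s (TLam y b)  = if y == x then TLam y b else TLam y (subst x s b)
subst x s (TApp f a)  = TApp (subst x s f) (subst x s a)
subst x s (TPi y a b) = TPi y (subst x s a) (if y == x then b else subst x s b)
subst x s (TIota y b) = if y == x then TIota y b else TIota y (subst x s b)
subst x s (TMu ds b)  = TMu ds (subst x s b)      -- naive: ignores dom(mu)

-- Leftmost beta step (Nothing: no redex).
beta :: Tm -> Maybe Tm
beta (TApp (TLam x b) a) = Just (subst x a b)     -- (\x.b) a ->_beta [a/x]b
beta (TMu ds (TVar y))   = TMu ds <$> lookup y ds -- mu x_i ->_beta mu t_i
beta (TApp f a) = case beta f of
  Just f' -> Just (TApp f' a)
  Nothing -> TApp f <$> beta a
beta (TLam x b)  = TLam x <$> beta b
beta (TPi x a b) = case beta a of
  Just a' -> Just (TPi x a' b)
  Nothing -> TPi x a <$> beta b
beta (TIota x b) = TIota x <$> beta b
beta _ = Nothing

-- The normalization function m: push each closure inward.
normMu :: Tm -> Tm
normMu (TLam x b)  = TLam x (normMu b)
normMu (TApp f a)  = TApp (normMu f) (normMu a)
normMu (TPi x a b) = TPi x (normMu a) (normMu b)
normMu (TIota x b) = TIota x (normMu b)
normMu (TMu ds t)  = case t of
  TVar y | y `elem` map fst ds -> TMu ds (TVar y) -- mu x_i is mu-normal
         | otherwise           -> TVar y          -- dom(mu) # FV(y)
  TStar      -> TStar
  TLam x b   -> TLam x (normMu (TMu ds b))
  TApp f a   -> TApp (normMu (TMu ds f)) (normMu (TMu ds a))
  TPi x a b  -> TPi x (normMu (TMu ds a)) (normMu (TMu ds b))
  TIota x b  -> TIota x (normMu (TMu ds b))
  other      -> TMu ds other                      -- nested closures: untreated
normMu t = t

-- n ->_{beta mu} normMu t  whenever  n ->_beta t.
betaMu :: Tm -> Maybe Tm
betaMu n = normMu <$> beta n
```

Here `betaMu` implements the rule literally: it contracts the leftmost $\beta$-redex (including the unfolding $\mu x_i \to_{\beta} \mu t_i$) and then $\mu$-normalizes the contractum.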

The following lemma shows that $\to_{\beta\mu}$ corresponds to the $\to_i$ in the interpretation 
lemma. 
\begin{lemma}
\label{fp}
If $\Gamma \vdash t \to_{\beta} t'$, then $\Gamma \vdash m(t)\to_{\beta\mu} m(t')$.
\end{lemma}

\begin{lemma}
  $\to_{\beta\mu}$ is confluent. 
\end{lemma}

\begin{theorem}
  $\to_{\beta, \mu}$ is confluent. 
\end{theorem}
\begin{proof}
  We know $\to_{\beta\mu}$ is confluent. Since
$\to_{\mu}$ is strongly normalizing and confluent, by lemma \ref{fp} and Hardin's
interpretation lemma (lemma \ref{interp}) we conclude that $\to_{\beta, \mu}$ is confluent. 
\end{proof}

\subsection{Confluence Analysis}
\label{confanalysis}

\begin{definition}
\

\noindent  $\Gamma \vdash t_1 \to_{\iota} t_2 $ if $t_1 \equiv \iota x.t' $ and $t_2 \equiv [t/x]t' $ for some fixed term $t$. 

\end{definition}

Note that $\to_{\iota}$ models the \textit{selfInst} rule and $\to_{\iota}^{-1}$ models the \textit{selfGen} rule. The notion of $\iota$-reduction does not build in structural congruence; namely, we do not allow reduction rules such as: if $T \to_{\iota}T'$, then $\lambda x.T \to_{\iota} \lambda x.T'$. The purpose of $\iota$-reduction is to emulate the typing rules \textit{selfInst} and \textit{selfGen}. This rewriting point of view on typing is inspired by Kuan et al. \cite{kuan2007} and Stump et al. \cite{stump2011}.

\begin{lemma}[Confluence]
  $\to_{\iota}$ is confluent.
\end{lemma}
\begin{proof}
  This is obvious since $\to_{\iota}$ is deterministic. 
\end{proof}

The goal of this section is to show $\to_{o, \iota, \beta, \mu}$ is confluent. We make extensive
use of the notion of \textit{commutativity}, which provides a simple way to prove the confluence of a reduction system that has several confluent subreductions.

\begin{definition}[Commutativity]
  Let $\to_1, \to_2$ be two notions of reduction. $\to_1$ (strongly) commutes with $\to_2$ if $a \to_1 b_1$ and $a \to_2 b_2$ imply that there exists a $c$ such that $b_1 \to_2 c$ and $b_2 \to_1 c$.
\end{definition}

\begin{proposition}[Hindley-Rosen \cite{hindley1964} \cite{Rosen:1973}]
  Let $\to_1, \to_2$ be two notions of reduction. Suppose both $\to_1$ and $\to_2$ are
confluent, and $\to_1^*$ commutes with $\to_2^*$. Then $\to_1 \cup \to_2$ is confluent.
\end{proposition}

\begin{proposition}[Weak Commutativity \cite{Barendregt85}]
  Let $\hookrightarrow$ denote the reflexive closure of $\to$. Let $\to_1, \to_2$ be two notions of reduction. $\to_1$ weakly commutes with $\to_2$ if $a \to_1 b_1$ and $a \to_2 b_2$ imply that there exists a $c$ such that $b_1 \hookrightarrow_2 c$ and $b_2 \twoheadrightarrow_1 c$.

\

If $\to_1$ weakly commutes with $\to_2$, then $\to_1^*$ and $\to_2^*$ commute.
\end{proposition}


\begin{lemma}
  $\to_{\beta, \mu}$ commutes with $\to_{\iota}$. Thus $\to_{\beta,\mu, \iota}$ is confluent. 
\end{lemma}

\begin{lemma}
  $\to_o$ has the diamond property, and is thus confluent.
\end{lemma}

\begin{lemma}
  $\to_o$ commutes with $\to_{\iota}$, and weakly commutes with $\to_{\beta}$ and $\to_{\mu}$. 
\end{lemma}

\begin{theorem}
  $\to_{o, \iota, \beta, \mu}$ is confluent.
\end{theorem}

\noindent Let $=_{\beta,\mu,\iota,o}$ denote the reflexive transitive symmetric closure of $\to_{o} \cup \to_{\iota} \cup \to_{\beta} \cup \to_{\mu}$. The goal of the confluence analysis is to establish the following theorem. 

\begin{theorem}[$\iota$-elimination, Compatibility]
\label{invsc}
\

If $\Gamma \vdash \Pi x:t_1.t_2 =_{\beta,\mu,\iota,o} \Pi x:t_1'.t_2'$, then $\Gamma \vdash t_1 =_{\beta,\mu,o} t_1'$ and $\Gamma \vdash t_2 =_{\beta,\mu,o} t_2'$. 
\end{theorem}

\begin{proof}
  If $\Gamma \vdash \Pi x:t_1.t_2 =_{\beta,\mu,\iota,o} \Pi x:t_1'.t_2'$, then by the confluence of $\to_{\beta,\mu,\iota,o}$, there exists a $t$ such that $\Gamma \vdash \Pi x:t_1.t_2 (\to_{o, \iota,\beta,\mu})^* t$ and $\Gamma \vdash \Pi x:t_1'.t_2' (\to_{o, \iota,\beta,\mu})^* t$. Since all the reductions on $\Pi x:t_1.t_2$ preserve the structure of the dependent type, the $\to_{\iota}$-reduction never gets a chance to apply; thus $\Gamma \vdash \Pi x:t_1.t_2 (\to_{o,\beta,\mu})^* t$ and $\Gamma \vdash \Pi x:t_1'.t_2' (\to_{o,\beta,\mu})^* t$. So $t$ must be of the form $\Pi x:t_3.t_4$, with $\Gamma \vdash t_1  (\to_{o,\beta,\mu})^* t_3$, $\Gamma \vdash t_1' (\to_{o,\beta,\mu})^* t_3$, $\Gamma \vdash t_2  (\to_{o,\beta,\mu})^* t_4$ and $\Gamma \vdash t_2' (\to_{o,\beta,\mu})^* t_4$. Finally, we have $\Gamma \vdash t_1 =_{\beta,\mu,o} t_1'$ and $\Gamma \vdash t_2 =_{\beta,\mu,o} t_2'$. 
  
\end{proof}

\subsection{Type Preservation}
\label{preservation}
The proof of type preservation proceeds as usual. The inversion and substitution lemmas
are standard. Note that the final preservation proof uses the compatibility theorem.

\begin{lemma}[Inversion]
  \
  
  \begin{itemize}
  \item     If $\Gamma \vdash \lambda x.t : t'$, then $\Gamma, x: t_1 \vdash t : t_2$ 
    and $\Gamma \vdash  \Pi  x : t_1 . t_2 {=}_{\beta,\mu,\iota,o} t' $ for some $t_1, t_2$.
    \item     If $\Gamma \vdash t_1 t_2 : t'$, then
    $\Gamma \vdash t_1 : \Pi x : t_1'. t_2'$ and $\Gamma \vdash t_2 :t_1'$,  $\Gamma \vdash [t_2/x] t_2' {=}_{\beta,\mu,\iota,o}  t'$ for some $t_1', t_2'$.

        \item If $\Gamma \vdash x : t'$, then  $x:t \in \Gamma$ and $\Gamma \vdash t {=}_{\beta,\mu,\iota,o} t' $ for some $t$.

            %% \item     If $\Gamma  \vdash \vec{\mu}t : t'$ and $t$ does not have a closure at head position, then $\Gamma, \tilde{\vec{\mu}} \vdash t:t'' $ and $\Gamma  \vdash \vec{\mu}t'' \stackrel{\vec{\mu}t} {=}_{\beta,\mu,\iota,o} t' $.

  \end{itemize}

\end{lemma}

\begin{lemma}[Substitution] 
    \label{subst} If $ \Gamma_1, x:t_1, \Gamma_2\vdash  t: t_2 $
and $ \Gamma_1 \vdash  t': t_1 $, then $  \Gamma_1, [t'/x]\Gamma_2 \vdash  [t'/x] t:[t'/x] t_2 $.
\end{lemma}
\begin{theorem}[Type Preservation] If $\Gamma \vdash \mathsf{wf}$ and $ \Gamma \vdash  t \leadsto t' $ and $ \Gamma \vdash  t:
t'' $, then $ \Gamma \vdash  t' : t'' $.  
\end{theorem}

\begin{proof}
  We list one interesting case here.

  \

  \infer{\Gamma \vdash t_1' t_2' :[t_2'/x]t_2''  }{\Gamma \vdash
    t_1' : \Pi x:t_1''.t_2'' & \Gamma \vdash t_2':t_1''}

\

\noindent Suppose $\Gamma \vdash (\lambda  x.t_1)v \leadsto [v/x]t_1$.
Then we know $\Gamma \vdash (\lambda  x.t_1)v : [v/x]t_2''
$ and $ \Gamma \vdash  \lambda x.t_1 : \Pi x : t_1''.t_2''$ and $ \Gamma \vdash  v :
t_1'' $. By inversion on $ \Gamma \vdash  \lambda x.t_1 : \Pi  x : t_1''. t_2'' $, we
have $ \Gamma , x:a \vdash  t_1 : b  $ and $ \Gamma \vdash  \Pi x : a.b {=}_{\beta,\mu,\iota,o} \Pi x: t_1''.t_2''$. By theorem \ref{invsc}, we have $\Gamma \vdash a  =_{\beta,\mu,o}  t_1'' $
and $\Gamma \vdash b =_{\beta,\mu,o} t_2''$. So we have $\Gamma, x:a \vdash t_1 :  t_2''$
and $\Gamma \vdash v :  a$. So by lemma \ref{subst}, we have $ \Gamma
\vdash  [v/x]t_1 : [v/x]t_2''$, as required. 

\end{proof}

\begin{theorem}[Progress]
  If $\cdot \vdash t:t''$, then either $\cdot \vdash t \leadsto t'$ or $t$ is a value.
\end{theorem}

\begin{proof}
  See Appendix \ref{prog}.
\end{proof}

\section{Conclusion}
We introduce $\selfstar$, which incorporates the self type construct together with $*:*$ and unrestricted mutual recursion. Scott-encoded datatypes and the corresponding elimination schemes are derivable within $\selfstar$. We also demonstrate the process of proving the type preservation theorem. 

$\selfstar$ is inconsistent as a logic and only admits a notion of type convertibility. Reasoning about potentially diverging programs poses substantial difficulties for designing a logically consistent language. We intend to investigate this topic further and to develop a consistent logical system that can reason about general programs (e.g., operations defined on Scott-encoded data).   


%% \acks

%% Acknowledgments, if needed.

% We recommend abbrvnat bibliography style.

\bibliographystyle{plain}

% The bibliography should be embedded for final submission.
\bibliography{paper}
%% \begin{thebibliography}{}
%% \softraggedright

%% \bibitem[Smith et~al.(2009)Smith, Jones]{smith02}
%% P. Q. Smith, and X. Y. Jones. ...reference text...

%% \end{thebibliography}

\appendix


\section{Progress}
\label{prog}
\begin{lemma}
\label{val}
  If $\cdot \vdash v:\Pi x:t_1.t_2$, then $v \equiv \lambda x.t$. 
\end{lemma}

\begin{proof}
  Case analysis on $v$. Suppose $v \equiv *$. By inversion, $\cdot \vdash * : *$ 
and $\cdot \vdash * =_{\beta,\mu,\iota,o} \Pi x:t_1.t_2$, which contradicts 
Church-Rosser of $=_{\beta,\mu,\iota,o}$. Suppose $v \equiv \vec{\mu}(\Pi x:t_3.t_4)$. 
By inversion, we have $\tilde{\vec{\mu}} \vdash  \Pi x:t_3.t_4 : t_a$ and $\cdot \vdash \vec{\mu} t_a \stackrel{\vec{\mu}(\Pi x:t_3.t_4)}{=}_{\beta,\mu,\iota,o} \Pi x:t_1.t_2$. By inversion on 
$\tilde{\vec{\mu}} \vdash  \Pi x:t_3.t_4 : t_a$, we have $\tilde{\vec{\mu}} \vdash  * \stackrel{\Pi x:t_3.t_4}{=}_{\beta,\mu,\iota,o} t_a$. So we have $\cdot \vdash \vec{\mu} * \stackrel{\vec{\mu}(\Pi x:t_3.t_4)}{=}_{\beta,\mu,\iota,o} \vec{\mu} t_a \stackrel{\vec{\mu}(\Pi x:t_3.t_4)}{=}_{\beta,\mu,\iota,o} \Pi x:t_1.t_2$. Again, this contradicts Church-Rosser of $=_{\beta,\mu,\iota,o}$. For other cases like: $v \equiv \Pi x:t.t', \iota x.t, \vec{\mu}(\iota x.t)$, we argue similarly. 
\end{proof}

\begin{theorem}[Progress]
  If $\cdot \vdash t:t''$, then either $\cdot \vdash t \leadsto t'$ or $t$ is a value.
\end{theorem}

\begin{proof}
  By induction on the derivation of $\cdot \vdash t:t''$, we list a few cases.

\

%% \noindent \textbf{Case}:

%% \

%% \infer[\textit{Star}]{\cdot \vdash *:*}{}

%% \

%% \noindent Obvious.

%% \

%% \noindent \textbf{Case}:

%% \

%% \infer[\textit{Var}]{\Gamma \vdash x:T}{(x:T) \in \Gamma}

%% \

%% \noindent This case will not arise.

%% \

%% \noindent \textbf{Case}:

%% \

%% \infer[\textit{Pi}]{\cdot \vdash \Pi x:t_1.t_2 : *}{x: t_1 \vdash t_2 : * & \cdot \vdash t_1 : * }

%% \

%% \noindent Obvious.

%% \

%% \noindent \textbf{Case}:

%% \

%% \infer[\textit{Self}]{\cdot \vdash \iota x.t : *}{x:\iota x.t \vdash t : * }

%% \

%% \noindent Obvious.

%% \


%% \infer[\textit{SelfInst}]{\cdot \vdash t: [t/x]t'}{\cdot \vdash t : \iota x.t'}

%% \

%% \noindent By IH. 

%% \

%% \noindent \textbf{Case}:

%% \

%% \infer[\textit{Lam}]{\cdot \vdash \lambda x.t :\Pi x:t_1.
%% t_2}{x:t_1 \vdash t: t_2 & \cdot \vdash t_1:*}

%% \

%% \noindent Obvious.

%% \

\noindent \textbf{Case}:

\

\infer[\textit{Mu}]{\cdot \vdash \mu t: \mu t'}{ \tilde{\mu}
\vdash t:t' &  \{\tilde{\mu} \vdash t_j: a_j\}_{(t_j:a_j) \in \tilde{\mu}} }

\

\noindent Identify $t$ as $\dot{\vec{\mu}} t''$, where $t''$ does not contain any closure
at head position. Case analysis on $t''$: if it is $*, x, \lambda x.t_a$ or $t_a t_b$, then there
exists a $t'$ such that $\cdot \vdash t \leadsto t'$. If $t'' \equiv \Pi x:t_a.t_b$ or $\iota x.t_a$,
then $t$ is already a value. 

\

\noindent \textbf{Case}:

\

\infer[\textit{App}]{\cdot \vdash t t':[t'/x] t_2}{\cdot
\vdash t:\Pi x:t_1. t_2 & \cdot \vdash t': t_1}

\

\noindent Since $\cdot \vdash t:\Pi x:t_1. t_2 $ and $ \cdot \vdash t': t_1$, by the IH, $t$ either
steps or is a value, and likewise for $t'$. If $t$ can take a step, then $t t'$ can also take a step. If $t$ is a value, by lemma \ref{val}, $t$ must be of the form $\lambda x.t_a$. So if $t'$ can
take a step, then $t t'$ can also take a step. If $t'$ is also a value, then $t t'$ can take a (beta) step.  

\end{proof}


\section{Proofs of Section \ref{analytic}}
\label{a1}
Let $\dot{\vec{\mu}}$ denote zero or more closures. 
\begin{lemma}
\label{norm:fun}
 Let $\Phi$ denote the set of $\mu$-normal forms. For any term $t$, $m(t)\in \Phi$.
\end{lemma}
\begin{proof}
  One way to prove this is to first identify $t$ as $\dot{\overrightarrow{\mu_1}}t'$, where $\dot{\overrightarrow{\mu_1}}$ denotes
zero or more closures and $t'$ does not contain any closure at head position.
 We then proceed by induction on the structure of $t'$:

\

\noindent \textbf{Base Cases}: $t' = x$, $t' = *$, obvious.

\

\noindent \textbf{Step Cases}: If $t' = \lambda x.t''$, 
then $m(\dot{\overrightarrow{\mu_1}}(\lambda x.t'')) \equiv \lambda x.m(\dot{\overrightarrow{\mu_1}} t'')$. Now we can
again identify $t''$ as $\dot{\overrightarrow{\mu_2}} t'''$, where $t'''$ does not have any closure at head position. Since $t'''$ is structurally smaller than $\lambda x.t''$, by IH, $m(\dot{\overrightarrow{\mu_1}}\dot{\overrightarrow{\mu_2}} t''') \in \Phi$, thus $m(\dot{\overrightarrow{\mu_1}}(\lambda x.t'')) \equiv \lambda x.m(\dot{\overrightarrow{\mu_1}} t'') \in \Phi$.

For $t' = t_a t_b$, $t' = \iota x.t''$ and $t' = \Pi x:t_a.t_b$, we argue similarly.
\end{proof}

In order to prove lemma \ref{fp}, we prove the following more general lemma instead.

\begin{lemma}
\label{stump}
 If $\Gamma, \dot{\vec{\mu}} \vdash a \to_{\beta}b$, then $\Gamma \vdash m(\dot{\vec{\mu}}a) \to_{\beta\mu} m(\dot{\vec{\mu}}b)$. 
\end{lemma}

\begin{proof}
  By induction on derivation of $\Gamma, \dot{\vec{\mu}} \vdash a \to_{\beta}b$. 
\

\noindent \textbf{Base Case:}

\

\noindent \infer{\Gamma, \dot{\vec{\mu}} \vdash x \rightarrow_{\beta}  t}{(x \mapsto t) \in \Gamma, \dot{\vec{\mu}}}

\

\noindent If $x \mapsto t \in \dot{\vec{\mu}}$, then $\Gamma \vdash m(\dot{\vec{\mu}}x) \equiv  \mu x \to_{\beta\mu} m(\mu t) \equiv m(\dot{\vec{\mu}} t)$. Technically, the last equivalence needs
to be justified; informally, it follows from the locality of $\mu$. If $x \mapsto t \in \Gamma$, then $\Gamma \vdash m(\dot{\vec{\mu}}x) \equiv x \to_{\beta\mu} m(t) \equiv m(\dot{\vec{\mu}} t)$. 

\

\noindent \textbf{Base Case:}

\

\noindent \infer{\Gamma,\dot{\vec{\mu}} \vdash \mu x_i \to_{\beta} \mu t_i}{(x_i \mapsto t_i) \in \mu}

\

\noindent We have $\Gamma \vdash m(\dot{\vec{\mu}}\mu x_i) \equiv \mu x_i \to_{\beta\mu} m(\mu  t_i) \equiv m(\dot{\vec{\mu}} \mu t_i)$.

\

\noindent \textbf{Base Case:}

\

\noindent \infer{\Gamma, \dot{\vec{\mu}} \vdash (\lambda x.t)t' \to_{\beta} [t'/x]t}{}

\

\noindent We have $\Gamma \vdash m(\dot{\vec{\mu}}((\lambda x.t)t')) \equiv (\lambda x.m(\dot{\vec{\mu}}t))m(\dot{\vec{\mu}}t') \to_{\beta\mu}$

\noindent $ m([m(\dot{\vec{\mu}}t')/x]m(\dot{\vec{\mu}}t)) \equiv m([\dot{\vec{\mu}}t'/x]\dot{\vec{\mu}}t) \equiv m(\dot{\vec{\mu}}([t'/x]t))$. The last two equivalences are by lemma \ref{norm:iden} and lemma \ref{norm:sub}. 

\

\noindent \textbf{Step Case:}

\

\noindent \infer{\Gamma, \dot{\vec{\mu}} \vdash \lambda x.t \to_{\beta} \lambda x.t'}{\Gamma, \dot{\vec{\mu}} \vdash t \to_{\beta}t' }

\

\noindent $\Gamma \vdash m(\dot{\vec{\mu}}(\lambda x.t))  \equiv  \lambda x.m(\dot{\vec{\mu}}t)  \stackrel{IH}{\to_{\beta\mu}} \lambda x.m(\dot{\vec{\mu}} t') \equiv m(\dot{\vec{\mu}}(\lambda x.t')) $. 

\

\noindent \textbf{Step Case:}

\

\noindent \infer{\Gamma, \dot{\vec{\mu}} \vdash \mu t \to_{\beta} \mu t'}{\Gamma,\dot{\vec{\mu}} ,\tilde{\mu} \vdash t \to_{\beta}t' }

\

\noindent We want to show $\Gamma \vdash  m(\dot{\vec{\mu}} \mu t) \to_{\beta\mu} m(\dot{\vec{\mu}} \mu t')$. This is directly by IH. 

\

\noindent All the other cases are similar.
\end{proof}



\begin{lemma}
    \label{norm:sub}
    \
    
$m(\vec{\mu}\vec{\mu}t) \equiv m(\vec{\mu}t)$ and $m(\vec{\mu} ([t_2/x]t_1)) \equiv m( [\vec{\mu} t_2/x]\vec{\mu} t_1)$
\end{lemma}

\begin{proof}
We can prove this using the same method as lemma \ref{norm:fun}, namely, by identifying $t$ and then proceeding by induction. 
\end{proof}

\begin{lemma}
\label{norm:iden}
 $m(m(t)) \equiv m(t)$ and $m([m(t_1)/y] m(t_2)) \equiv m([t_1/y]t_2)$. 
\end{lemma}
\begin{proof}
The first equality is by lemma \ref{norm:id} and lemma \ref{norm:fun}. For the second equality, we 
prove it using a similar method as for lemma \ref{norm:fun}: we identify $t_2$ as $\dot{\overrightarrow{\mu_1}}t_2'$, where $t_2'$ does not contain any closure at head position. We proceed by induction on the structure of $t_2'$:

\

\noindent \textbf{Base Cases}: $t_2' = *$, obvious. For $t_2' = x$, we use $m(m(t)) \equiv m(t)$. 

\

\noindent \textbf{Step Cases}: If $t_2' = \lambda x.t_2''$, 
then $m(\dot{\overrightarrow{\mu_1}}(\lambda x.[t_1/y]t_2'')) \equiv $

\noindent $\lambda x.m(\dot{\overrightarrow{\mu_1}}([t_1/y]t_2'')) \equiv \lambda x.m(\dot{\overrightarrow{\mu_1}}\dot{\overrightarrow{\mu_2}}([t_1/y]t_2'''))$, where $t_2''$ is identified as $\dot{\overrightarrow{\mu_2}} t_2'''$, and $t_2'''$ does not have any closure at head position. Since $t_2'''$ is structurally smaller than $\lambda x.t_2''$, by IH, 

\noindent $m(\dot{\overrightarrow{\mu_1}}\dot{\overrightarrow{\mu_2}}([t_1/y]t_2''')) \equiv m([t_1/y](\dot{\overrightarrow{\mu_1}}\dot{\overrightarrow{\mu_2}}t_2''')) \equiv $

\noindent $ m([m(t_1)/y] m(\dot{\overrightarrow{\mu_1}}\dot{\overrightarrow{\mu_2}}t_2'''))$. 

\noindent Thus $\lambda x.m(\dot{\overrightarrow{\mu_1}}\dot{\overrightarrow{\mu_2}}([t_1/y]t_2''')) \equiv \lambda x. m([m(t_1)/y] m(\dot{\overrightarrow{\mu_1}}\dot{\overrightarrow{\mu_2}}t_2'''))$, implying $m([t_1/y]\dot{\overrightarrow{\mu_1}}(\lambda x.t_2'')) \equiv m( [m(t_1)/y] m(\lambda x.\dot{\overrightarrow{\mu_1}}t_2''))$. 

\noindent Since $m( [m(t_1)/y] m(\lambda x.\dot{\overrightarrow{\mu_1}}t_2'')) \equiv m( [m(t_1)/y] m(\dot{\overrightarrow{\mu_1}}(\lambda x.t_2'')))$, we conclude $m( [m(t_1)/y] m(\dot{\overrightarrow{\mu_1}}(\lambda x.t_2''))) \equiv m([t_1/y]\dot{\overrightarrow{\mu_1}}(\lambda x.t_2'')) $.

For $t_2' = t_a t_b$, $t_2' = \iota x.t_2''$ and $t_2' = \Pi x:t_a.t_b$, we argue similarly.

\end{proof}
\begin{lemma}
\label{norm:id}
  If $n \in \Phi$, then $m(n) \equiv n$. 
\end{lemma}
\begin{proof}
  By induction on the structure of $n$. 
\end{proof}

\begin{definition}[$\beta$ Reduction on $\mu$-normal Forms]

\
  
\infer{\Gamma \vdash n \to_{\beta \mu} m(t)}{\Gamma \vdash n \to_{\beta}t}

\end{definition}

\noindent \textbf{Note}: From this definition we can conclude: 

\

\begin{tabular}{ll}

\infer{\Gamma \vdash \lambda x.n \to_{\beta \mu} \lambda x.n'}{\Gamma \vdash n \to_{\beta \mu} n' }

&

\infer{\Gamma \vdash n n' \to_{\beta \mu} n n''}{\Gamma \vdash n' \to_{\beta \mu} n'' }
\\
\\

\infer{\Gamma \vdash \Pi x:n.n' \to_{\beta \mu} \Pi x:n.n''}{\Gamma \vdash n' \to_{\beta \mu} n'' }
&
\infer{\Gamma \vdash n n' \to_{\beta \mu} n'' n'}{\Gamma \vdash n \to_{\beta \mu} n'' }

\\
\\
\infer{\Gamma \vdash \Pi x:n.n' \to_{\beta \mu} \Pi x: n''.n'}{\Gamma \vdash n \to_{\beta \mu} n'' }

&

\infer{\Gamma \vdash \iota x.n \to_{\beta \mu} \iota x.n'}{\Gamma \vdash n \to_{\beta \mu} n' }
\end{tabular}

\

\noindent The first rule follows because: Assume $\Gamma \vdash n \to_{\beta\mu} n'$, say $m(t) \equiv n'$ and $\Gamma \vdash n \to_{\beta} t$. Then $\Gamma \vdash \lambda x.n \to_{\beta} \lambda x.t$ and $m(\lambda x.t) \equiv \lambda x.m(t) \equiv \lambda x.n'$. The others follow similarly. 


\begin{lemma}
\label{subj}
If $\Gamma \vdash   n_1\to_{\beta\mu} n_1'$, then $\Gamma \vdash m([n_2/x]n_1) \to_{\beta\mu} m([n_2/x]n_1')$.
\end{lemma}
\begin{proof}
  By induction on the derivation of $\Gamma \vdash   n_1\to_{\beta} t_1$, where $m(t_1) \equiv n_1'$.
We list a few nontrivial cases. Note that we use lemma \ref{norm:iden} implicitly. 

\

\noindent \textbf{Base Case}:

\

\infer{\Gamma \vdash y \to_{\beta} t_1}{(y\mapsto t_1) \in \Gamma}

\

\noindent In this case $n_1 = y$. By locality, we have $\Gamma \vdash m([n_2/x]y) \equiv y \to_{\beta\mu} m(t_1) \equiv m([n_2/x]t_1)$.


\

\noindent \textbf{Base Case}: 

\

\infer{\Gamma \vdash(\lambda y.n)n' \to_{\beta} [n'/y]n}{}

\

\noindent $n_1 = (\lambda y.n)n'$. So $\Gamma \vdash m([n_2/x]((\lambda y.n)n')) $

\noindent $\equiv m((\lambda y.[n_2/x]n) [n_2/x]n') \equiv (\lambda y.m([n_2/x]n)) m([n_2/x]n')  \to_{\beta\mu}m([m([n_2/x]n')/y] m([n_2/x]n)) \equiv m([[n_2/x]n'/y] ([n_2/x]n)) \equiv m([n_2/x]([n'/y]n))$. 

\

\noindent \textbf{Base Case}: 

\

\infer{\Gamma \vdash\mu x_i \to_{\beta} \mu t_i}{(x_i \mapsto
t_i) \in \mu}

\

\noindent $n_1 = \mu x_i$. By locality, $\Gamma \vdash m([n_2/x]\mu x_i) \equiv \mu x_i \to_{\beta\mu} m(\mu t_i) \equiv m([n_2/x](\mu t_i))$.


\

\noindent \textbf{Step Case}: 

\

\infer{\Gamma \vdash\lambda y.n \to_{\beta} \lambda y.t'}{\Gamma
\vdash n \to_{\beta}t' }

\

\noindent $n_1 = \lambda y.n$. By IH, we have $\Gamma\vdash m([n_2/x]n)  \to_{\beta\mu} m([n_2/x]t')$. So $\Gamma\vdash m(\lambda y.[n_2/x]n)  \to_{\beta\mu} m(\lambda y.[n_2/x]t')$.

\

\noindent \textbf{Step Case}: 

\

\infer{\Gamma \vdash \mu t \to_{\beta} \mu t'}{\Gamma,\tilde{\mu}\vdash t
\to_{\beta}t' }

\

\noindent This case will not arise since $n_1$ is already in $\mu$ normal form.

\

\noindent The other cases are similar.
 
\end{proof}
\begin{lemma}
\label{subp}
If $\Gamma \vdash n_2\to_{\beta\mu} n_2'$, then $\Gamma \vdash m([n_2/x]n_1) \stackrel{*}{\to_{\beta\mu}} m([n_2'/x]n_1)$.
\end{lemma}
\begin{proof}
  By induction on $n_1$.
\end{proof}

%% \begin{definition}
%%   We say $x$ is unsubstitutable w.r.t. $\Gamma$ if there is some $t, a$ such that $(x:a) \mapsto t \in \Gamma$. 
%% \end{definition}

%% \noindent \textbf{Note}: If $x$ is not unsubstitutable, we say it is substitutable, in this draft we assume for any substitution $[t/x]$, $x$ is substitutable w.r.t. $\Gamma$.


\begin{definition}[Parallel Reductions]

\

\footnotesize{
  \begin{tabular}{ll}


\infer{\Gamma \vdash  n \Rightarrow_{\beta \mu} n}{}

&

\infer{\Gamma \vdash  x \Rightarrow_{\beta\mu} m( t)}{(x \mapsto t) \in \Gamma}
\\
\\
\infer{\Gamma \vdash \mu x_i \Rightarrow_{\beta\mu} m(\mu t_i)}{(x_i \mapsto t_i) \in \mu}

&

\infer{\Gamma \vdash (\lambda x.n_1) n_2 \Rightarrow_{\beta\mu} m([n_2'/x]n_1')}{\Gamma \vdash   n_1\Rightarrow_{\beta\mu} n_1' &\Gamma \vdash  n_2\Rightarrow_{\beta\mu} n_2'}
\\
\\

\infer{\Gamma \vdash \lambda x.n \Rightarrow_{\beta\mu} \lambda x.n'}{\Gamma \vdash n \Rightarrow_{\beta\mu}n' }


&
\infer{\Gamma \vdash n n' \Rightarrow_{\beta\mu} n'' n'''}{ \Gamma \vdash  n \Rightarrow_{\beta\mu}n''& \Gamma \vdash n' \Rightarrow_{\beta\mu} n'''}

\\
\\

\infer{\Gamma \vdash \iota x.n \Rightarrow_{\beta\mu} \iota x.n'}{\Gamma \vdash n \Rightarrow_{\beta\mu}n' }
&
\infer{\Gamma \vdash \Pi x:n.n' \Rightarrow_{\beta\mu} \Pi x:n''.n'''}
{\Gamma \vdash n' \Rightarrow_{\beta\mu} n''' &\Gamma \vdash  n \Rightarrow_{\beta\mu}n'' }
\end{tabular}
}
\end{definition}
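\noindent \textbf{Example}: As a small illustration of how the rules combine, assume $(x \mapsto t) \in \Gamma$. The variable rule gives $\Gamma \vdash x \Rightarrow_{\beta\mu} m(t)$, the reflexivity rule gives $\Gamma \vdash y \Rightarrow_{\beta\mu} y$, and so the $\beta$ rule yields, in a single parallel step,
\[
\Gamma \vdash (\lambda y.y)\, x \;\Rightarrow_{\beta\mu}\; m([m(t)/y]y) \equiv m(m(t)) \equiv m(t),
\]
where the last equivalence is by lemma \ref{norm:iden}. Note that the contraction of the redex and the lookup of $x$ in $\Gamma$ happen within one parallel step, which the ordinary reduction $\to_{\beta\mu}$ cannot do.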

\begin{lemma}
  $\to_{\beta\mu} \subseteq \Rightarrow_{\beta\mu} \subseteq \to_{\beta\mu}^*$.
\end{lemma}
\begin{proof}
  %% Many proofs involving parallelization typically state this lemma as obvious result. Obviousness arises in many situations, sadly this is not one of them. Thus I include a proof sketch here.

For $\to_{\beta\mu} \subseteq \Rightarrow_{\beta\mu}$, by induction on the derivation of
$\Gamma \vdash n \to_{\beta} t$, where $\Gamma \vdash n \to_{\beta\mu} m(t)$.

For $\Rightarrow_{\beta\mu} \subseteq \to_{\beta\mu}^*$, by induction on the derivation of
$\Gamma \vdash n \Rightarrow_{\beta\mu} n'$. We show the key case (the other cases are straightforward): 

\

\infer{\Gamma \vdash (\lambda x.n_1) n_2 \Rightarrow_{\beta\mu} m([n_2'/x]n_1')}{\Gamma \vdash   n_1\Rightarrow_{\beta\mu} n_1' &\Gamma \vdash  n_2\Rightarrow_{\beta\mu} n_2'}

\

\noindent By lemma \ref{key}, given $\Gamma \vdash   n_1\Rightarrow_{\beta\mu} n_1'$ and $\Gamma \vdash  n_2\Rightarrow_{\beta\mu} n_2'$, we know that $\Gamma \vdash m([n_2/x]n_1) \Rightarrow_{\beta\mu} m([n_2'/x]n_1')$. Since $\to_{\beta\mu} \subseteq \Rightarrow_{\beta\mu}$, we have: if $\Gamma \vdash   n_1\to_{\beta\mu} n_1'$ and $\Gamma \vdash  n_2\to_{\beta\mu} n_2'$, then $\Gamma \vdash m([n_2/x]n_1) \to_{\beta\mu} m([n_2'/x]n_1')$ ($\dagger$). By IH, we have $\Gamma \vdash   n_1\stackrel{*}{\to_{\beta\mu}} n_1'$ and $\Gamma \vdash  n_2 \stackrel{*}{\to_{\beta\mu}}n_2'$. By lemma \ref{subj}, lemma \ref{subp}, and ($\dagger$), we have $\Gamma \vdash (\lambda x.n_1)n_2 \to_{\beta\mu} m([n_2/x]n_1) \stackrel{*}{\to_{\beta\mu}} m([n_2'/x]n_1')$.

\end{proof}
\begin{lemma}
\label{lemma7}
If $\Gamma \vdash n_2 \Rightarrow_{\beta\mu} n_2'$, then $\Gamma \vdash m([n_2/x]n_1) \Rightarrow_{\beta\mu} m([n_2'/x]n_1)$.
\end{lemma}

\begin{proof}
\noindent  By induction on the structure of $n_1$. 

\

\noindent \textbf{Base Cases}: $n_1= x$, $n_1 = \mu x_i$, $n_1 = *$. Obvious. 

\

\noindent \textbf{Step Case}: $n_1= \lambda y.n$. We have $\Gamma \vdash m(\lambda y.[n_2/x]n) \equiv \lambda y.m([n_2/x]n) \stackrel{IH}{\Rightarrow_{\beta\mu}} \lambda y.m([n_2'/x]n) \equiv m(\lambda y.[n_2'/x]n)$.

\

\noindent \textbf{Step Case}: $n_1= n n'$. We have $\Gamma \vdash m([n_2/x]n\ [n_2/x]n') \equiv m([n_2/x]n)\ m([n_2/x]n')\stackrel{IH}{\Rightarrow_{\beta\mu}} m([n_2'/x]n)\ m([n_2'/x]n')\equiv m([n_2'/x]n\ [n_2'/x]n')$.

\

\noindent \textbf{Step Case}: $n_1 = \iota x.n$ or $n_1 = \Pi x:n.n'$. Similar to the above.

\end{proof}

\begin{lemma}
\label{key}
If $\Gamma \vdash n_1 \Rightarrow_{\beta\mu} n_1'$ and $\Gamma \vdash n_2 \Rightarrow_{\beta\mu} n_2'$, then $\Gamma \vdash m([n_2/y]n_1) \Rightarrow_{\beta\mu} m([n_2'/y]n_1')$.
\end{lemma}

\begin{proof}

\noindent We prove this by induction on the derivation of $\Gamma \vdash n_1 \Rightarrow_{\beta\mu} n_1'$.
  
\

\noindent \textbf{Base Case:}

\

\noindent \infer{\Gamma \vdash n \Rightarrow_{\beta \mu} n}{}

\

\noindent By lemma \ref{lemma7}.

\

\noindent \textbf{Base Case:}

\

\noindent \infer{\Gamma \vdash \mu x_i\Rightarrow_{\beta\mu} m(\mu t_i)}{x_i \mapsto t_i \in \mu}

\

\noindent Since $y \notin \mathrm{FV}(\mu x_i)$ and $\mu$ is local, $m([n_2/y]\mu x_i) \equiv m(\mu x_i)$; then $m(\mu x_i) \equiv \mu x_i \Rightarrow_{\beta\mu} m(\mu t_i) \equiv m(m(\mu t_i))$ (lemma \ref{norm:iden}). 

\

\noindent \textbf{Base Case:}

\

\infer{\Gamma \vdash  x \Rightarrow_{\beta\mu} m( t)}{(x \mapsto t) \in \Gamma}

\

\noindent In this case, we assume $x \not \equiv y$, then we have $m([n_2/y]x) \equiv m(x) \equiv x \Rightarrow_{\beta\mu} m(t) \equiv m(m(t)) \equiv m([n_2/y]m(t))$. 

\

\noindent \textbf{Step Case:}

\

\noindent \infer{\Gamma \vdash (\lambda x.n_a) n_b \Rightarrow_{\beta\mu} m([n_b'/x]n_a')}{\Gamma \vdash n_a\Rightarrow_{\beta\mu} n_a' & \Gamma \vdash n_b\Rightarrow_{\beta\mu} n_b'}

\

\noindent We have $\Gamma \vdash m((\lambda x.[n_2/y]n_a) [n_2/y] n_b) \equiv $

\noindent $ (\lambda x.m([n_2/y]n_a)) m([n_2/y] n_b)\stackrel{IH}{\Rightarrow_{\beta\mu}}$

\noindent $m([m([n_2'/y] n_b')/x]m([n_2'/y] n_a')) \equiv m([n_2'/y]([n_b'/x]n_a'))$. The last equivalence is by lemma \ref{norm:iden}. Here we first apply the induction hypothesis to the premises, and then apply the $\beta$ rule of ${\Rightarrow_{\beta\mu}}$.

\

\noindent \textbf{Step Case:}

\

\noindent \infer{\Gamma \vdash \lambda x.n \Rightarrow_{\beta\mu} \lambda x.n'}{\Gamma \vdash n \Rightarrow_{\beta\mu}n' }

\

\noindent We have $\Gamma \vdash m(\lambda x.[n_2/y]n) \equiv \lambda x.m([n_2/y]n) \stackrel{IH}{\Rightarrow_{\beta\mu}} $

\noindent $\lambda x.m([n_2'/y]n') \equiv m(\lambda x.[n_2'/y]n') $

\

\noindent \textbf{Step Case:}

\

\noindent \infer{\Gamma \vdash n_a n_b \Rightarrow_{\beta\mu} n_a'n_b'}{ \Gamma \vdash n_a\Rightarrow_{\beta\mu} n_a' & \Gamma \vdash n_b\Rightarrow_{\beta\mu} n_b'}

\

\noindent We have $\Gamma \vdash m([n_2/y]n_a [n_2/y] n_b) \equiv m([n_2/y]n_a) m([n_2/y] n_b)$

$ \stackrel{IH}{\Rightarrow_{\beta\mu}} m([n_2'/y] n_a') m([n_2'/y] n_b')\equiv m([n_2'/y](n_a'n_b'))$.

\

\noindent The other cases are similar to the above. 

\end{proof}
\begin{lemma}[Diamond Property] 
  If $ \Gamma \vdash n \Rightarrow_{\beta\mu} n'$ and $\Gamma \vdash n \Rightarrow_{\beta\mu} n''$, then there exists an $n'''$ such that $ \Gamma \vdash n'' \Rightarrow_{\beta\mu} n'''$ and $ \Gamma \vdash n' \Rightarrow_{\beta\mu} n'''$.
\end{lemma}

\begin{proof}
  \noindent By induction on the derivation of $\Gamma \vdash n \Rightarrow_{\beta\mu} n'$. 

\noindent \textbf{Base Case:}

\

\noindent \infer{\Gamma \vdash n \Rightarrow_{\beta \mu} n}{}

\

\noindent Obvious.

\

\noindent \textbf{Base Case:}

\

\noindent \infer{\Gamma \vdash x \Rightarrow_{\beta\mu} m( t)}{(x \mapsto t) \in \Gamma}

\

\noindent Obvious.

\

\noindent \textbf{Base Case:}

\

\noindent \infer{\Gamma \vdash \mu x_i\Rightarrow_{\beta\mu} m(\mu t_i)}{(x_i \mapsto t_i) \in \mu}

\

\noindent Obvious. 

\

\noindent \textbf{Step Case:}

\

\noindent \infer{\Gamma \vdash (\lambda x.n_1) n_2 \Rightarrow_{\beta\mu} m([n_2'/x]n_1')}{ \Gamma \vdash n_1\Rightarrow_{\beta\mu} n_1' &\Gamma \vdash n_2\Rightarrow_{\beta\mu} n_2'}

\

\noindent Suppose $\Gamma \vdash (\lambda x.n_1) n_2 \Rightarrow_{\beta\mu}(\lambda x.n_1'') n_2''$, where $\Gamma \vdash n_1 \Rightarrow_{\beta\mu}n_1''$ and $\Gamma \vdash n_2 \Rightarrow_{\beta\mu} n_2''$. By IH, there exist $n_1''', n_2'''$ such that $\Gamma \vdash n_1'' \Rightarrow_{\beta\mu}n_1'''$, $\Gamma \vdash n_1' \Rightarrow_{\beta\mu}n_1'''$, $\Gamma \vdash n_2'' \Rightarrow_{\beta\mu} n_2'''$, and $\Gamma \vdash n_2' \Rightarrow_{\beta\mu}n_2'''$. By lemma \ref{key}, $\Gamma \vdash m([n_2'/x]n_1') \Rightarrow_{\beta\mu} m([n_2'''/x]n_1''')$; also $\Gamma \vdash (\lambda x.n_1'') n_2''\Rightarrow_{\beta\mu} m([n_2'''/x]n_1''')$.  

\

\noindent Suppose $\Gamma \vdash (\lambda x.n_1) n_2 \Rightarrow_{\beta\mu}m([n_2''/x]n_1'') $, where $\Gamma \vdash n_1 \Rightarrow_{\beta\mu}n_1''$ and $\Gamma \vdash n_2 \Rightarrow_{\beta\mu} n_2''$. By IH, there exist $n_1''', n_2'''$ such that $\Gamma \vdash n_1'' \Rightarrow_{\beta\mu}n_1'''$, $\Gamma \vdash n_1' \Rightarrow_{\beta\mu}n_1'''$, $\Gamma \vdash n_2'' \Rightarrow_{\beta\mu} n_2'''$, and $\Gamma \vdash n_2' \Rightarrow_{\beta\mu}n_2'''$. By lemma \ref{key}, $\Gamma \vdash m([n_2'/x]n_1') \Rightarrow_{\beta\mu} m([n_2'''/x]n_1''')$ and $\Gamma \vdash m([n_2''/x]n_1'') \Rightarrow_{\beta\mu} m([n_2'''/x]n_1''')$.

\

\noindent The other cases are either similar to the one above or easy.

\end{proof}
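\noindent \textbf{Remark}: The diamond property yields confluence of $\to_{\beta\mu}$ by the standard Tait--Martin-L\"{o}f argument: a straightforward induction lifts the diamond property from $\Rightarrow_{\beta\mu}$ to its reflexive-transitive closure $\Rightarrow_{\beta\mu}^*$, and from $\to_{\beta\mu} \subseteq \Rightarrow_{\beta\mu} \subseteq \to_{\beta\mu}^*$ we get $\Rightarrow_{\beta\mu}^* \;=\; \to_{\beta\mu}^*$, so confluence of $\Rightarrow_{\beta\mu}$ is exactly confluence of $\to_{\beta\mu}$.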


\begin{theorem}
  $\to_{\beta} \cup \to_{\mu}$ is confluent. 
\end{theorem}
\begin{proof}
  By the diamond property of $\Rightarrow_{\beta\mu}$, $\to_{\beta\mu}$ is confluent. Since
$\to_{\mu}$ is strongly normalizing and confluent, by lemma \ref{fp} and Hardin's
interpretation lemma (lemma \ref{interp}), we conclude that $\to_{\beta} \cup \to_{\mu}$ is confluent. 
\end{proof}

\section{Proofs of Section \ref{confanalysis}}
\label{a2}
\begin{lemma}
\label{cus}
    Let $\to$ denote $\to_{\beta}\cup \to_{\mu}$. If $\Gamma \vdash t \to t'$, then $\Gamma \vdash [t_1/x]t \to [t_1/x]t'$ for any $t_1$. 
\end{lemma}
\begin{proof}
  Obvious.
\end{proof}
\begin{lemma}
  Let $\to$ denote $\to_{\beta}\cup \to_{\mu}$; then $\to$ commutes with $\to_{\iota}$, i.e., if $\Gamma \vdash t_1 \to t_2$ and $\Gamma \vdash t_1 \to_{\iota} t_3$, then there exists a $t_4$ such that $\Gamma \vdash t_2 \to_{\iota} t_4$ and $\Gamma \vdash t_3 \to t_4$. 
\end{lemma}
\begin{proof}
  Since $\Gamma \vdash t_1 \to_{\iota} t_3$, we know that $t_1 \equiv \iota x.t'$ and $t_3 \equiv [t/x]t'$. We also have $\Gamma \vdash t_1\equiv \iota x.t' \to t_2$. By inversion,
we know that $t_2 \equiv \iota x.t''$ with $\Gamma \vdash t' \to t''$. By lemma \ref{cus}, we know that 
$\Gamma \vdash [t/x]t' \to [t/x]t''$. Thus $t_4 \equiv [t/x]t''$ and $\Gamma \vdash \iota x.t'' \to_{\iota} [t/x]t''$.
\end{proof}

%% \noindent \textbf{Note}: We are lucky to get communtativity here, which I didn't realize at
%% first. But I am fine with this since luck is only for the one who prepare. 
\begin{theorem}
  $\to \cup \to_{\iota}$ is confluent. 
\end{theorem}
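\noindent \textbf{Remark}: A standard route to this theorem is the Hindley--Rosen lemma: the union of two confluent relations that commute with each other is confluent. By the previous theorem $\to$ is confluent, and the commutation lemma above supplies the second hypothesis, so it only remains to check confluence of $\to_{\iota}$ itself.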
\begin{lemma}
\label{subo}
  If $\Gamma \vdash t_1 \to_o t_2$, then $\Gamma \vdash [t/x]t_1 \to_o [t/x]t_2$.
\end{lemma}
\begin{proof}
By induction on the derivation of $\Gamma \vdash t_1 \to_o t_2$. 
\end{proof}

\begin{lemma}
\label{subo2}
  If $\Gamma \vdash t_1 \to_o t_2$, then $\Gamma \vdash [t_1/x]t \hookrightarrow_o [t_2/x]t$.
\end{lemma}
\begin{proof}
  By induction on the structure of $t$. 
\end{proof}
\begin{lemma}
  $\to_o$ has the diamond property, and is thus confluent.
\end{lemma}
\begin{proof}
  Straightforward induction.
\end{proof}

\begin{lemma}
  $\to_o$ commutes with $\to_{\iota}$. 
\end{lemma}
\begin{proof}
  Suppose $\Gamma \vdash \iota x.t'\to_{\iota} [t/x]t'$ and $\Gamma \vdash \iota x.t' \to_o \iota x.t''$ with $\Gamma \vdash t' \to_o t''$. Then by lemma \ref{subo}, we have $\Gamma \vdash [t/x]t' \to_o [t/x]t''$. We also have $\Gamma \vdash \iota x.t'' \to_{\iota} [t/x]t''$. 
\end{proof}

\begin{lemma}
  $\to_o$ weakly commutes with $\to_{\beta}$. 
\end{lemma}
\begin{proof}
\noindent By induction on $\to_o$.

\

\noindent \textbf{Case}: $\Gamma \vdash \mu t \to_o t$, where $\mu \in \Gamma$.

\

\noindent If $\Gamma \vdash \mu x_i \to_{\beta} \mu t_i$, where $x_i \mapsto t_i \in \mu$, then
$\Gamma \vdash \mu x_i \to_o x_i$. So we have $\Gamma \vdash \mu t_i \to_o t_i$ and $\Gamma \vdash x_i \to_{\beta} t_i$ since $\mu \in \Gamma$. 

\

\noindent If $\Gamma \vdash \mu t \to_{\beta} \mu t'$ with $\Gamma \vdash t \to_{\beta} t'$, then we have $\Gamma \vdash t \to_{\beta} t'$ and $\Gamma \vdash \mu t' \to_o t'$. 

\

\noindent \textbf{Case}: $\Gamma \vdash (\lambda x.t_1)t_2 \to_o (\lambda x.t_1')t_2$, where 
$\Gamma \vdash t_1 \to_o t_1'$. 

\

\noindent Suppose $\Gamma \vdash (\lambda x.t_1)t_2 \to_{\beta} [t_2/x]t_1$. By lemma \ref{subo}, we know that $\Gamma \vdash [t_2/x]t_1 \to_o [t_2/x]t_1'$. And we also have $\Gamma \vdash (\lambda x.t_1')t_2 \to_{\beta} [t_2/x]t_1'$. 

\

\noindent \textbf{Case}: $\Gamma \vdash (\lambda x.t_1)t_2 \to_o (\lambda x.t_1)t_2'$, where 
$\Gamma \vdash t_2 \to_o t_2'$. 

\

\noindent Suppose $\Gamma \vdash (\lambda x.t_1)t_2 \to_{\beta} [t_2/x]t_1$. By lemma \ref{subo2}, we know that $\Gamma \vdash [t_2/x]t_1 \hookrightarrow_o [t_2'/x]t_1$. And we also have $\Gamma \vdash (\lambda x.t_1)t_2' \to_{\beta} [t_2'/x]t_1$. 

\

\noindent The other cases are by induction.
\end{proof}

\begin{lemma}
$\to_o$ weakly commutes with $\to_{\mu}$, i.e., if $\Gamma \vdash t \to_{o} t'$ and $\Gamma \vdash t \to_{\mu} t''$, then there exists a $t_1$ such that $\Gamma \vdash t'' \to_{o}^* t_1$ and $\Gamma \vdash t' \hookrightarrow_{\mu} t_1$.

\end{lemma}

\begin{proof}
\noindent  By induction on $\Gamma \vdash t \to_{o} t'$. 

\

\noindent \textbf{Case}: $\Gamma \vdash \mu t \to_o t$, where $\mu \in \Gamma$.

\

\noindent Suppose $\Gamma \vdash \mu t \to_{\mu} t$ with $dom(\mu) \# \mathrm{FV}(t)$. This case
is obvious. 

\

\noindent Suppose $t \equiv \lambda x.t_2$ and $\Gamma \vdash \mu (\lambda x.t_2) \to_{\mu} \lambda x.\mu t_2$. Then $\Gamma \vdash \lambda x.t_2 \hookrightarrow_{\mu} \lambda x.t_2$ and $\Gamma \vdash \lambda x.\mu t_2 \to_o \lambda x.t_2$.   

\

\noindent Suppose $t \equiv t_2 t_3$ and $\Gamma \vdash \mu (t_2 t_3) \to_{\mu} (\mu t_2)(\mu t_3)$. Then $\Gamma \vdash t_2 t_3 \hookrightarrow_{\mu} t_2 t_3$ and $\Gamma \vdash (\mu t_2)(\mu t_3) \to_o^* t_2 t_3$.

\

\noindent For $t \equiv \iota x.t_2, \Pi x:t_2.t_3$, we can argue similarly. 

\

\noindent The other cases are by induction.   
\end{proof}
\begin{theorem}
  $\to_{o} \cup \to_{\iota} \cup \to_{\beta} \cup \to_{\mu}$ is confluent.
\end{theorem}

\section{Proofs of Section \ref{preservation}}
\label{a3}
\noindent \textbf{Note}: In this section we use $\stackrel{t}{=}_{\beta,\mu,\iota,o}$ to mean 
the same thing as $=_{\beta,\mu,\iota,o}$, but with an emphasis on the subject $t$.

\begin{lemma}
    \label{type} If $\Gamma \vdash t_1 \stackrel{t}{=}_{\beta,\mu,\iota,o} t_2$ and $\Gamma \vdash t : t_1$ and $\Gamma \vdash t_2:*$, then $\Gamma \vdash t : t_2$.
\end{lemma}

\begin{proof} 
By induction on the length of $\Gamma \vdash t_1 \stackrel{t}{=}_{\beta,\mu,\iota,o} t_2$.

\end{proof}

\begin{lemma} 
\label{conv}
 If $\Gamma \vdash t_1 \stackrel{t}{=}_{\beta,\mu,\iota,o} t_2$ and $\Gamma \vdash t = t'$, then $\Gamma \vdash t_1
\stackrel{t'}{=}_{\beta,\mu,\iota,o} t_2$.  
\end{lemma} 
\begin{proof} By induction on the length of $\Gamma \vdash t_1 \stackrel{t}{=}_{\beta,\mu,\iota,o} t_2$.
\end{proof}

\begin{lemma}
    $m(\mu_1 \mu_2 t) \equiv m(\mu_2 \mu_1 t)$, thus $\Gamma \vdash \mu_1 \mu_2 t = \mu_2 \mu_1 t$.
\end{lemma}

\begin{proof}
  Identify $t$ as $\dot{\vec{\mu}} t'$, where $t'$ does not have any closure
at head position, and proceed by induction on the structure of such $t'$. Then
 $\Gamma \vdash \mu_1 \mu_2 t = m(\mu_1 \mu_2 t) = m(\mu_2 \mu_1 t) = \mu_2 \mu_1 t$. 
\end{proof}
\begin{lemma}
\label{submu}
    $\Gamma \vdash \mu ([t/x]t') = [\mu t/x] \mu t'$.
\end{lemma}
\begin{proof}
$\Gamma \vdash \mu ([t/x]t') = m(\mu ([t/x]t')) = m([\mu t/x] \mu t') = [\mu t/x] \mu t'$.
\end{proof}

\begin{lemma} 
    \label{metacong} 
    If $\Gamma, \tilde{\mu} \vdash
t' \stackrel{t}{=}_{\beta,\mu,\iota,o} t''$, then $\Gamma\vdash \mu 
 t' \stackrel{\mu t }{=}_{\beta,\mu,\iota,o} \mu 
 t''$.
 \end{lemma} 
 \begin{proof} By induction on the length of
$\Gamma, \tilde{\mu}  \vdash t' \stackrel{t}{=}_{\beta,\mu,\iota,o} t''$. We list a few cases.

\noindent\textbf{Case}: $\Gamma, \tilde{\mu}  \vdash t' {=} t''$.

\

\noindent We have $\Gamma  \vdash \mu t'   = \mu t''$. 

\

\noindent\textbf{Case}: $\Gamma, \tilde{\mu}  \vdash \iota x.t' {\to_{\iota}} [t/x]t'$.

\

\noindent We know $\Gamma \vdash \mu \iota x.t' {=} \iota x.\mu t'  {\to_{\iota}} [\mu t/x] \mu t' {=} \mu [t/x]t'$ (the last equality is by lemma \ref{submu}). 

\end{proof}


\begin{lemma}[Inversion I] 
    If $\Gamma \vdash \lambda x.t : t'$, then for some $t_1, t_2$ we have $\Gamma, x: t_1 \vdash t : t_2$ 
    and $\Gamma \vdash  \Pi  x : t_1 . t_2 \stackrel{\lambda x.t}{=}_{\beta,\mu,\iota,o} t' $.

\end{lemma}

\begin{proof} By induction on the derivation of $\Gamma \vdash \lambda x.t : t'$. 

\end{proof} 

\begin{lemma}[Inversion II] 
    If $\Gamma \vdash t_1 t_2 : t'$, then for some $t_1', t_2'$ we have
    $\Gamma \vdash t_1 : \Pi x : t_1'. t_2'$, $\Gamma \vdash t_2 :t_1'$, and $\Gamma \vdash [t_2/x] t_2'\stackrel{ t_1
t_2} {=}_{\beta,\mu,\iota,o}  t'$.

\end{lemma}

\begin{lemma}[Inversion III] 
    If $\Gamma \vdash * : t$, then $\Gamma \vdash * \stackrel{*}{=}_{\beta,\mu,\iota,o}  t $.
\end{lemma}

\begin{lemma}[Inversion IV] 
  If $\Gamma \vdash x : t'$, then  $x:t \in \Gamma$ and $\Gamma \vdash t \stackrel{x}{=}_{\beta,\mu,\iota,o} t' $.
\end{lemma}

\begin{lemma}[Inversion V] 
 If $\Gamma, \tilde{\mu} \vdash x_j : t'$ and $x_j \in dom(\mu)$, then  $x_j:a_j \in \mu$ and $\Gamma, \tilde{\mu} \vdash a_j \stackrel{x_j}{=}_{\beta,\mu,\iota,o} t' $. 
\end{lemma}

\begin{lemma}[Inversion VI] 
    If $\Gamma  \vdash \vec{\mu}t : t'$ and $t$ does not have a closure at head position, then 
$\Gamma, \tilde{\vec{\mu}} \vdash t:t'' $ and $\Gamma  \vdash \vec{\mu}t'' \stackrel{\vec{\mu}t} {=}_{\beta,\mu,\iota,o} t' $.
\end{lemma}

\begin{lemma}[Inversion VII]
If $\Gamma \vdash \iota x.t:t'$, then $\Gamma, x:\iota x.t \vdash t:*$ and $\Gamma \vdash * \stackrel{\iota x.t}{=}_{\beta,\mu,\iota,o} t'$.
  
\end{lemma}

\begin{lemma}[Inversion VIII]
If $\Gamma \vdash \Pi x:t_1.t_2:t'$, then $\Gamma, x:t_1 \vdash t_2:*$ and $\Gamma \vdash t_1:*$ and $\Gamma \vdash * \stackrel{\Pi x:t_1.t_2}{=}_{\beta,\mu,\iota,o} t'$.
  
\end{lemma}

\begin{lemma} 
    \label{perm} If $\Gamma, \tilde{\mu}, y:b
\vdash t:a$ , then $\Gamma, y: \mu  b,\tilde{\mu} 
 \vdash t :a$.  
 \end{lemma}
\begin{proof} By induction on the
derivation of $\Gamma, \tilde{\mu}, y:b \vdash t:a$.
\end{proof}

\begin{lemma}[Substitution] 
    \label{subst} If $ \Gamma_1, x:t_1, \Gamma_2\vdash  t: t_2 $
and $ \Gamma \vdash  t': t_1 $, then $  \Gamma_1, [t'/x]\Gamma_2 \vdash  [t'/x] t:[t'/x] t_2 $.
\end{lemma}

\begin{proof} By induction on the derivation of $ \Gamma_1, x:t_1, \Gamma_2\vdash  t: t_2 $
.  We will show a few nontrivial cases. 

\

\noindent \textbf{Case}: 

\

\infer{\Gamma  \vdash \iota  y. t:* }{\Gamma , y: \iota 
y .t\vdash t : *}
 
\

\noindent Let $\Gamma = \Gamma_1, x:t_1, \Gamma_2$. We want to show
$\Gamma_1,[t'/x] \Gamma_2 \vdash \iota  y. [t'/x]t : *$. By IH,
we have $\Gamma_1,[t'/x] \Gamma_2, y: \iota  y. [t'/x]t \vdash
[t'/x]t : *$, from which the conclusion follows.

\

\noindent \textbf{Case}: 

\

\infer{\Gamma  \vdash t : \iota  y. t'' }{\Gamma \vdash t :
[t/y]t'' & \Gamma  \vdash \iota  y. t'':*}

\

\noindent Let $\Gamma = \Gamma_1, x:t_1, \Gamma_2$. We want to show
$\Gamma_1,[t'/x] \Gamma_2 \vdash [t'/x]t :\iota  y. [t'/x]t'' $. By
IH, we have $\Gamma_1,[t'/x] \Gamma_2 \vdash [t'/x]t :  [
[t'/x]t/y]([t'/x]t'')$, from which the conclusion follows.

\

\noindent \textbf{Case}: 

\

\infer{\Gamma \vdash t : [t/y]t''}{\Gamma  \vdash t : \iota  y. t''
}

\

\noindent Let $\Gamma = \Gamma_1, x:t_1, \Gamma_2$. We want to show
$\Gamma_1,[t'/x] \Gamma_2 \vdash [t'/x]t :  [ [t'/x]t/y]([t'/x]t'')$. By
IH, we have $\Gamma_1,[t'/x] \Gamma_2 \vdash [t'/x]t :\iota  y.
[t'/x]t'' $, from which the conclusion follows.


%% \noindent \textbf{Case}: 

%% \

%% \infer[\textit{Open}]{\Gamma, \tilde{\mu}, \Gamma' \vdash t:  t''}{
%% \Gamma, \tilde{\mu}, \Gamma' \vdash t:\mu t'' }

%% \

%% \noindent Let $\Gamma = \Gamma_1, x:t_1, \Gamma_2$. We want to show
%% $\Gamma_1,[t'/x] \Gamma_2, [t'/x] \tilde{\mu}, [t'/x]\Gamma' \vdash [t'/x]t : [t'/x]t''$. By
%% IH, we have $\Gamma_1,[t'/x] \Gamma_2 ,[t'/x]\tilde{\mu}, [t'/x]\Gamma' \vdash [t'/x]t :\mu [t'/x]t'' $. Note we make use of the locality of $\mu$. For the case where $\Gamma' = \Gamma_1, x:t_1, \Gamma_2$, we can argue similarly.

\

\noindent \textbf{Case}: 

\

\infer[\textit{Mu}]{\Gamma \vdash \mu t: \mu t''}{\Gamma, \tilde{\mu}
\vdash t:t'' &  \{\Gamma, \tilde{\mu} \vdash t_j: a_j\}_{(t_j:a_j) \in \tilde{\mu}} }

\

\noindent Let $\Gamma = \Gamma_1, x:t_1, \Gamma_2$. We want to show
$\Gamma_1,[t'/x] \Gamma_2  \vdash \mu [t'/x]t : \mu [t'/x]t''$. By
IH, we have $\Gamma_1,[t'/x] \Gamma_2 ,[t'/x] \tilde{\mu}  \vdash [t'/x]t : [t'/x]t'' $ and
$\{\Gamma_1,[t'/x] \Gamma_2 , [t'/x]\tilde{\mu} \vdash t_j: [t'/x]a_j\}_{(t_j:[t'/x]a_j) \in [t'/x]\tilde{\mu}}$. 

\end{proof}

\begin{theorem}[Type Preservation] If $\Gamma \vdash \mathsf{wf}$ and $ \Gamma \vdash  t \leadsto t' $ and $ \Gamma \vdash  t:
a $, then $ \Gamma \vdash  t' : a $.  
\end{theorem}

\begin{proof} \noindent By induction on the derivation of $ \Gamma \vdash  t: a $.
We list a few nontrivial cases.

\

\noindent \textbf{Case:}

\

\infer{\Gamma \vdash * : *}{}

\

\noindent This case will not arise.

\

\noindent \textbf{Case:}

\

\infer{\Gamma \vdash x:a}{ x:a \in \Gamma}

\

\noindent If $\Gamma \vdash x \leadsto t'$, this means $ (x:a) \mapsto t'
\in \Gamma$ and $\Gamma \vdash t':a$ since $\Gamma \vdash \mathsf{wf}$.

\

\noindent \textbf{Case:}

\

\infer{\Gamma \vdash t:t_2}{\Gamma \vdash t : t_1 & \Gamma \vdash t_1 \cong
t_2  & \Gamma \vdash t_2:*}

\

\noindent In this case  $\Gamma \vdash t \leadsto t'$. By IH, $\Gamma
\vdash t': t_1$.  Since $\Gamma \vdash t_1 \cong t_2$, we have $\Gamma \vdash
t':t_2$. 

\

\noindent \textbf{Case:}

\

\infer{\Gamma \vdash t:[t/x]t''}{\Gamma \vdash t : \iota x.t'' }

\

\noindent In this case  $\Gamma \vdash t \leadsto t'$. By IH, $\Gamma
\vdash t': \iota  x.t''$. Thus we have $\Gamma \vdash t':
[t'/x]t''$. Since $\Gamma \vdash t' = t$, we have $\Gamma \vdash t':
[t/x]t''$ by Conv rule.

\

\noindent \textbf{Case:}

\

\infer{\Gamma \vdash t : \iota x.t'' }{\Gamma \vdash t:[t/x]t'' & \Gamma \vdash \iota x.t'':*}

\

\noindent In this case  $\Gamma \vdash t \leadsto t'$. By IH, $\Gamma
\vdash t': [t/x]t''$. Since $\Gamma \vdash [t/x]t'' = [t'/x]t''$, we have
$\Gamma \vdash t': [t'/x]t''$. Thus we have $\Gamma \vdash t' :
\iota x.t''$. 

\

\noindent \textbf{Case:}

\

\infer{\Gamma \vdash t_1' t_2' :[t_2'/x]t_2''  }{\Gamma \vdash
t_1' : \Pi x:t_1''.t_2'' & \Gamma \vdash t_2':t_1''}

\

\noindent Suppose $\Gamma \vdash (\lambda  x.t_1)v \leadsto [v/x]t_1$.
Then we know $\Gamma \vdash (\lambda  x.t_1)v : [v/x]t_2''
$ and $ \Gamma \vdash  \lambda x.t_1 : \Pi x : t_1''.t_2''$ and $ \Gamma \vdash  v :
t_1'' $. By inversion on $ \Gamma \vdash  \lambda x.t_1 : \Pi  x : t_1''. t_2'' $, we
have $ \Gamma , x:a \vdash  t_1 : b  $ and $ \Gamma \vdash  \Pi x : a.b \stackrel{\lambda  x.t_1 }{=}_{\beta,\mu,\iota,o} \Pi x: t_1''.t_2''$. By theorem \ref{invsc}, we have $\Gamma \vdash a  =_{\beta,\mu,o}  t_1'' $
and $\Gamma \vdash b =_{\beta,\mu,o} t_2''$. So we have $\Gamma, x:a \vdash t_1 :  t_2''$
and $\Gamma \vdash v :  a$. So by lemma \ref{subst}, we have $ \Gamma
\vdash  [v/x]t_1 : [v/x]t_2''$, as required. 


\

\noindent Suppose $\Gamma \vdash t_1 t_2 \leadsto t_1' t_2$, where $\Gamma
\vdash t_1 \leadsto t_1'$. We know $\Gamma \vdash t_1 t_2 : [t_2/x]t_2''$ 
and $ \Gamma \vdash  t_1 : \Pi x : t_1''.t_2''$ and $ \Gamma \vdash  t_2 :
t_1'' $. By IH, we know $ \Gamma \vdash  t_1' :  \Pi x : t_1''.t_2''$. So $\Gamma
\vdash t_1' t_2 : [t_2/x]t_2''$.

\

\noindent Suppose $\Gamma \vdash (\lambda  x.t_1) t_2 \leadsto (\lambda 
x.t_1) t_2'$, where $\Gamma \vdash t_2 \leadsto t_2'$. We know $\Gamma
\vdash (\lambda  x.t_1)t_2 : [t_2/x]t_2''$ and $ \Gamma \vdash \lambda 
x.t_1 : \Pi x : t_1''.t_2''  $ and $ \Gamma \vdash  t_2 :  t_1'' $. By IH, we know
$ \Gamma \vdash  t_2' :  t_1''  $. So $\Gamma \vdash (\lambda  x.t_1) t_2' :
[t_2'/x]t_2''$. And we know $\Gamma \vdash [t_2/x]t_2'' = [t_2'/x]t_2''$, so by the Conv rule $\Gamma \vdash (\lambda  x.t_1) t_2' : [t_2/x]t_2''$.

\

%% (\textbf{Warning}: the following cases are tedious, you could
%% potentially felt asleep if you haven't so far!)
\noindent \textbf{Case:}

\

\infer{\Gamma \vdash \mu t: \mu t'}{\Gamma, \tilde{\mu}
\vdash t:t' &  \{\Gamma, \tilde{\mu} \vdash t_j: a_j\}_{(t_j:a_j) \in \tilde{\mu} } }

\

\noindent Suppose  $\Gamma \vdash \mu x_j \leadsto \mu
t_j$, where $x_j \mapsto t_j \in \mu$. We have $\Gamma, \tilde{\mu} \vdash x_j:t'$. By
inversion, $\Gamma, \tilde{\mu} \vdash x_j:a_j$ and $\Gamma, \tilde{\mu} \vdash a_j \stackrel{x_j}{=}_{\beta,\mu,\iota,o}  t'$. 
 Since $\Gamma, \tilde{\mu} \vdash x_j = t_j$, by lemma \ref{conv} we
get $\Gamma, \tilde{\mu} \vdash a_j \stackrel{t_j}{=}_{\beta,\mu,\iota,o} t'$. 
Since $\Gamma, \tilde{\mu} \vdash  t_j : a_j$, by lemma \ref{type}, $\Gamma, \tilde{\mu} \vdash t_j:t'$. Thus we have $\Gamma \vdash \mu t_j : \mu t'$.

\

\noindent Suppose  $\Gamma \vdash \mu \vec{\mu} x_j \leadsto   \mu \vec{\mu} t_j$, where $x_j \mapsto t_j  \in \mu_j$. By inversion on  $\Gamma, \tilde{\mu} \vdash \vec{\mu}x_j:t'$, we have $\Gamma, \tilde{\mu}, \tilde{\vec{\mu}} \vdash x_j:t_a$ and $\Gamma, \tilde{\mu} \vdash \vec{\mu}t_a \stackrel{\vec{\mu}x_j}{=}_{\beta,\mu,\iota,o}  t'$. By inversion on  $\Gamma, \tilde{\mu}, \tilde{\vec{\mu}} \vdash x_j:t_a$, we have $\Gamma, \tilde{\mu}, \tilde{\vec{\mu}} \vdash x_j:b$, where $(x_j:b) \in \tilde{\mu} \cup \tilde{\vec{\mu}}$ and $\Gamma, \tilde{\mu}, \tilde{\vec{\mu}} \vdash b \stackrel{x_j}{=}_{\beta,\mu,\iota,o}  t_a$. So $\Gamma \vdash \mu \vec{\mu} b \stackrel{\mu \vec{\mu} x_j}{=}_{\beta,\mu,\iota,o} \mu \vec{\mu} t_a  \stackrel{\mu \vec{\mu} x_j}{=}_{\beta,\mu,\iota,o}\mu t'$. Since $\Gamma \vdash \mu \vec{\mu} t_j:\mu \vec{\mu} b$ and $\Gamma \vdash \mu \vec{\mu} x_j = \mu \vec{\mu} t_j$, so $\Gamma \vdash \mu\vec{\mu} t_j : \mu t'$.

\

\noindent Suppose  $\Gamma\vdash \mu * \leadsto *$.
We have $\Gamma, \tilde{\mu} \vdash
* : t''$. We have $\Gamma, \tilde{\mu} \vdash * \stackrel{*}{=}_{\beta,\mu,\iota,o} t''$ (by inversion). Thus we have $\Gamma \vdash  \mu *  \stackrel{\mu *}{=}_{\beta,\mu,\iota,o} \mu t''$ (lemma
\ref{metacong}). We also know that $\Gamma\vdash * : *$ and
$\Gamma \vdash \mu * = *$. So we have $\Gamma
\vdash * \stackrel{\mu *}{=}_{\beta,\mu,\iota,o} \mu
 t''$. Thus $\Gamma \vdash  * \stackrel{*}{=}_{\beta,\mu,\iota,o} \mu t''$.  So $\Gamma
\vdash * : \mu t''$ (lemma \ref{type}).

\

\noindent Suppose  $\Gamma\vdash \mu\vec{\mu} * \leadsto *$. We argue similarly.

\

\noindent Suppose  $\Gamma\vdash \mu x \leadsto x$, where $x \notin dom(\mu)$.
We have $\Gamma, \tilde{\mu} \vdash
x : t''$. We have $\Gamma, \tilde{\mu} \vdash a \stackrel{x}{=}_{\beta,\mu,\iota,o} t''$, where $x:a \in \Gamma$ (by inversion). Thus
we have $\Gamma \vdash  \mu a  \stackrel{\mu x}{=}_{\beta,\mu,\iota,o} \mu t''$ (lemma
\ref{metacong}). We also know that $\Gamma\vdash x : a$ and
$\Gamma \vdash \mu x = x$ and $\Gamma \vdash \mu a = a$. Thus $\Gamma \vdash  a
\stackrel{x}{=}_{\beta,\mu,\iota,o} \mu t''$.  So $\Gamma
\vdash x : \mu t''$ (lemma \ref{type}).

\

\noindent Suppose  $\Gamma\vdash \mu \vec{\mu} x \leadsto x$, where $x \notin dom(\mu)\cup dom(\vec{\mu})$. By inversion on $\Gamma, \tilde{\mu} \vdash
\vec{\mu}x : t'$, we have $\Gamma, \tilde{\mu}, \tilde{\vec{\mu}} \vdash
x : t_a$, where $\Gamma, \tilde{\mu} \vdash \vec{\mu} t_a \stackrel{\vec{\mu} x}{=}_{\beta,\mu,\iota,o}  t'$. By inversion on $\Gamma, \tilde{\mu}, \tilde{\vec{\mu}} \vdash
x : t_a$, we have $x:b \in \Gamma$ and $\Gamma, \tilde{\mu}, \tilde{\vec{\mu}} \vdash b \stackrel{ x}{=}_{\beta,\mu,\iota,o}  t_a$. So $\Gamma, \tilde{\mu} \vdash  \vec{\mu} b \stackrel{  \vec{\mu} x}{=}_{\beta,\mu,\iota,o} \vec{\mu} t_a \stackrel{  \vec{\mu} x}{=}_{\beta,\mu,\iota,o} t'$. So $\Gamma \vdash  b \stackrel{  x}{=}_{\beta,\mu,\iota,o} \mu t'$. Thus $\Gamma \vdash x:\mu t'$.

\

\noindent Suppose $\Gamma\vdash \mu \lambda x.t \leadsto
\lambda  x. \mu  t$. We have $\Gamma, \tilde{\mu} \vdash \lambda  x.t : t''$ and 
$\Gamma, \tilde{\mu}, x:t_1'' \vdash t : t_2''$ and $\Gamma, \tilde{\mu} \vdash
\Pi  x:t_1''.t_2'' \stackrel{\lambda  x.t}{=}_{\beta,\mu,\iota,o}  t''$ (by
inversion). Thus we have $\Gamma,x:\mu   t_1'' \vdash \mu
  t: \mu   t_2''$ (lemma \ref{perm}) and
$\Gamma \vdash \mu  (\Pi  x:t_1''.t_2'') \stackrel{\mu
 \lambda  x.t}{=}_{\beta,\mu,\iota,o} \mu  t''$ (lemma \ref{metacong}). By lemma \ref{conv}, $\Gamma \vdash
(\Pi  x:\mu  t_1''.\mu  t_2'')
\stackrel{\lambda  x.\mu  t}{=}_{\beta,\mu,\iota,o} \mu  t''$. Also, $\Gamma\vdash \lambda  x.\mu 
t : \Pi  x: (\mu  t_1'').(\mu 
t_2'')$.  So by lemma \ref{type}, $\Gamma\vdash \lambda 
x.\mu   t :  \mu  t''$.

\

\noindent Suppose $\Gamma\vdash \mu \vec{\mu}\lambda x.t \leadsto
\lambda  x. \mu  \vec{\mu} t$. By inversion on $\Gamma, \tilde{\mu}\vdash  \vec{\mu}(\lambda x.t): t'$, we have $\Gamma, \tilde{\mu}, \tilde{\vec{\mu}} \vdash \lambda x.t:t_a$, where $\Gamma, \tilde{\mu} \vdash \vec{\mu} t_a \stackrel{\vec{\mu} (\lambda x.t)}{=}_{\beta,\mu,\iota,o}  t'$. By inversion on $\Gamma, \tilde{\mu}, \tilde{\vec{\mu}} \vdash \lambda x.t:t_a$, then we have $\Gamma, \tilde{\mu}, \tilde{\vec{\mu}}, x:t_1'' \vdash t : t_2''$ and $\Gamma, \tilde{\mu}, \tilde{\vec{\mu}} \vdash
\Pi  x:t_1''.t_2'' \stackrel{\lambda  x.t}{=}_{\beta,\mu,\iota,o} t_a$.
So $\Gamma, \tilde{\mu}  \vdash \Pi  x:\vec{\mu}t_1''.\vec{\mu}t_2'' \stackrel{\vec{\mu}(\lambda  x.t)}{=}_{\beta,\mu,\iota,o}\vec{\mu}t_a \stackrel{\vec{\mu}(\lambda  x.t)}{=}_{\beta,\mu,\iota,o} t' $. Thus $\Gamma  \vdash \Pi  x:\mu \vec{\mu}t_1''.\mu \vec{\mu}t_2'' \stackrel{\mu\vec{\mu}(\lambda  x.t)}{=}_{\beta,\mu,\iota,o} \mu t' $. Since $\Gamma \vdash \lambda  x.\mu\vec{\mu} t : \Pi x: \mu\vec{\mu} t_1''.\mu\vec{\mu} t_2''$, we have $\Gamma \vdash \lambda  x.\mu\vec{\mu} t : \mu t'$. 

\

\noindent Suppose $\Gamma\vdash \mu  (t_1' t_2') \leadsto (
\mu  t_1')( \mu  t_2')$. We have
$\Gamma, \tilde{\mu} \vdash t_1't_2' : t''$. We have $\Gamma,
\tilde{\mu} \vdash t_1' : \Pi  x:t_1''. t_2''$ and $\Gamma,
\tilde{\mu} \vdash t_2' : t_1''$ and $\Gamma, \tilde{\mu} \vdash  [t_2'/x]t_2''
\stackrel{t_1't_2'}{=}_{\beta,\mu,\iota,o}  t''$ (by inversion). Thus we have $\Gamma
\vdash \mu   t_1':  \mu  (\Pi 
x:t_1''.t_2'')$ and $\Gamma \vdash \mu   t_2':  \mu
 t_1''$ and $\Gamma \vdash \mu 
 ([t_2'/x]t_2'') \stackrel{\mu  (t_1'
t_2')}{=}_{\beta,\mu,\iota,o}\mu  t'' $ (lemma \ref{metacong}). By
lemma \ref{conv}, we have $\Gamma \vdash   
[\mu t_2'/x]\mu t_2'' \stackrel{(\mu  t_1')(\mu t_2')}{=}_{\beta,\mu,\iota,o} \mu t''$. So $\Gamma\vdash
(\mu   t_1')(\mu   t_2') :[\mu t_2'/x]\mu  t_2''$ and then $\Gamma \vdash (\mu   t_1')(\mu   t_2') : \mu t''$ (lemma \ref{type}).

\

\noindent Suppose $\Gamma\vdash \mu \vec{\mu}  (t_1' t_2') \leadsto (\mu  \vec{\mu} t_1')( \mu  \vec{\mu} t_2')$. We argue similarly to the case for $\Gamma\vdash \mu \vec{\mu}\lambda x.t \leadsto \lambda  x. \mu  \vec{\mu} t$. 

\end{proof}
\section{Well-Formed Types}
\label{wftype}
\begin{lemma}
  If $\Gamma \vdash \mathsf{wf}$ and $\Gamma \vdash t:t'$, then $\Gamma \vdash t':*$.
\end{lemma}
\begin{proof}
\noindent  By induction on the derivation of $\Gamma \vdash t:t'$. We list a few nontrivial cases.

\

\noindent \textbf{Case}: 

\

\infer[\textit{SelfInst}]{\Gamma \vdash t: [t/x]t'}{\Gamma
\vdash t : \iota x.t'}

\

\noindent By IH, we have $\Gamma \vdash \iota x.t':*$. So by inversion, 
we have $\Gamma , x:\iota x.t' \vdash t':*$. %% and $\Gamma \vdash * \stackrel{\iota x.t'}{=}_{\beta,\mu, \iota,o} *$. 
So by lemma \ref{subst}, we know $\Gamma \vdash [t/x]t':*$. 


\

\noindent \textbf{Case}: 

\

\infer[\textit{Lam}]{\Gamma \vdash \lambda x.t :\Pi x:t_1.
t_2}{\Gamma, x:t_1 \vdash t: t_2 & \Gamma \vdash t_1:*}

\

\noindent By IH, we know $\Gamma, x:t_1 \vdash t_2 : *$. Since $\Gamma \vdash t_1:*$, by 
\textit{Pi} rule, we have $\Gamma \vdash \Pi x:t_1.t_2:*$. 

\

\noindent \textbf{Case}: 

\

\infer[\textit{App}]{\Gamma \vdash t t':[t'/x] t_2}{\Gamma
\vdash t:\Pi x:t_1. t_2 & \Gamma \vdash t': t_1}

\

\noindent By IH, we have $\Gamma \vdash \Pi x:t_1. t_2:*$. 
By inversion on $\Gamma \vdash \Pi x:t_1. t_2:*$, we have $\Gamma, x:t_1 \vdash t_2:*$. 
So by lemma \ref{subst}, we have $\Gamma \vdash [t'/x]t_2:*$. 

\

\noindent \textbf{Case}: 

\

\infer[\textit{Mu}]{\Gamma \vdash \mu t: \mu t'}{\Gamma, \tilde{\mu}
\vdash t:t' &  \{\Gamma, \tilde{\mu} \vdash t_j: a_j\}_{(t_j:a_j) \in \tilde{\mu}} }

\

\noindent By IH, we have $\Gamma, \tilde{\mu}\vdash t':*$. So $\Gamma \vdash \mu t':\mu *$, thus
$\Gamma \vdash \mu t':*$. 

\end{proof}


\end{document}
