A matrix grammar, introduced by the Siromoneys and K. Krithivasan in \cite{giftsironmoneyranisironmoney1972abstract}, is a pair consisting of a horizontal string grammar and a set of vertical string grammars. The horizontal grammar generates a string whose symbols serve as start symbols for the vertical grammars. These vertical grammars then generate a picture downwards, in parallel: in every derivation step, each vertical grammar applies exactly one rule, and a vertical grammar may apply a terminal rule only if all other vertical grammars apply a terminal rule in the same step. Consequently, only rectangular pictures can be generated. Formally, matrix grammars are defined as follows:

\begin{definition}
\label{grammars_definition_xmg}
	$G = (G_H, G_V)$ is called a \emph{phrase-structure matrix grammar} (PSMG) (\emph{context-sensitive matrix grammar} (CSMG), \emph{context-free matrix grammar} (CFMG), \emph{right-linear matrix grammar} (RLMG)), where 
	\begin{compactitem}
		\item $G_H = (N_H, I, P_H, S)$ is a phrase-structure grammar (PSG) (context-sensitive grammar (CSG), context-free grammar (CFG), right-linear grammar (RLG)) where
		\begin{compactitem}
			\item $N_H$ is a finite set of horizontal non-terminals,
			\item $I = \{S_1, \dots, S_k\}$ is a finite set of intermediates,
			\item $P_H$ is a finite set of phrase-structure (context-sensitive, context-free, right-linear) production rules and
			\item S is the start symbol. 
		\end{compactitem}
		\item $G_V = \bigcup_{i = 1}^k G_{i}$, where the $G_{i} = (N_{i}, T, P_{i}, S_i)$ are right-linear grammars with 
		\begin{compactitem}
			\item $T$ is a finite set of terminals,
			\item $N_{i}$ is a finite set of non-terminals ($N_{i} \cap N_{j} = \emptyset$ for $i \neq j$),
			\item $P_{i}$ is a finite set of right-linear production rules and
			\item $S_i$ is the start symbol. 
		\end{compactitem}
	\end{compactitem}
\end{definition}

A right-linear grammar only contains rules of the form $(A \rightarrow aB)$ or $(A \rightarrow a)$. We call these rules non-terminal and terminal rules, respectively. 
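To make this rule format concrete, a right-linear grammar can be sketched in Python. The encoding and the rule set below are our own illustrative example, not taken from \cite{giftsironmoneyranisironmoney1972abstract}: a non-terminal rule $(A \rightarrow aB)$ becomes a pair \texttt{("a", "B")} and a terminal rule $(A \rightarrow a)$ becomes \texttt{("a", None)}.

```python
import random

# Hypothetical encoding (our own): a rule set maps each non-terminal to its
# alternatives, where (A -> aB) becomes ("a", "B") and (A -> a) becomes ("a", None).
rules = {
    "A": [("a", "B"), ("a", None)],  # A -> aB | a
    "B": [("b", None), ("b", "A")],  # B -> b  | bA
}

def derive(start, choose=random.choice):
    """Derive one terminal string; a right-linear sentential form contains
    exactly one non-terminal, always at its right end, so a loop suffices."""
    word, nt = "", start
    while nt is not None:
        terminal, nt = choose(rules[nt])
        word += terminal
    return word
```

For instance, always taking the first alternative derives $A \Rightarrow aB \Rightarrow ab$.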

The generation process of a matrix grammar begins with the start symbol $S$. The horizontal grammar $G_H$ then generates a string $S_{i_1}S_{i_2} \dots S_{i_n} \in I^*$ with $i_j \in \{1, \dots, k\}$ for ${j \in \{1, \dots, n\}}$. As usual in the theory of formal languages, we write ${S \overset{*}{\underset{G_H}{\Rightarrow}} S_{i_1}S_{i_2} \dots S_{i_n}}$. Every intermediate $S_{i_j} \in I$ is the start symbol of a grammar $G_{i_j}$, and from the intermediate string the vertical derivation starts. Assuming that every vertical grammar applies a non-terminal rule in the first step, we get

\begin{center}
\boxed{
\begin{aligned}
\begin{matrix}
S_{i_1} & \dots & S_{i_n}
\end{matrix}
\end{aligned}
}

$\Downarrow$

\boxed{
\begin{aligned}
\begin{matrix}
a_{11} & \dots & a_{1n} \\
A_{1} & \dots & A_{n}
\end{matrix}
\end{aligned}
}

\end{center}
where $(S_{i_j} \rightarrow a_{1j}A_j)$ are rules in $G_{i_j}$ for $j \in \{1, \dots, n\}$. This derivation shows that all vertical grammars apply one non-terminal rule each at the same time. Since the non-terminal sets of the vertical grammars were required to be pairwise disjoint, every further step in a column can only apply rules of the grammar that column started with. The derivation can be continued as follows:

\begin{center}
\boxed{
\begin{aligned}
\begin{matrix}
a_{11} & \dots & a_{1n} \\[-1ex]
\vdots & \vdots & \vdots \\[-0.5ex]
a_{(r-1)1} & \dots & a_{(r-1)n} \\[-0.5ex]
A_1 & \dots & A_n
\end{matrix}
\end{aligned}
}

$\Downarrow$

\boxed{
\begin{aligned}
\begin{matrix}
a_{11} & \dots & a_{1n} \\[-1ex]
\vdots & \vdots & \vdots \\[-0.5ex]
a_{(r-1)1} & \dots & a_{(r-1)n} \\[-0.5ex]
a_{r1} & \dots & a_{rn} \\[-0.5ex]
B_1 & \dots & B_n
\end{matrix}
\end{aligned}
}
\end{center}
where $(A_j \rightarrow a_{rj} B_j)$ are rules in $G_{i_j}$ for $j \in \{1, \dots, n\}$. In this derivation step, each vertical grammar applies a non-terminal rule. To finish the derivation, each vertical grammar must apply a terminal rule in the same step:

\begin{center}
\boxed{
\begin{aligned}
\begin{matrix}
a_{11} & \dots & a_{1n} \\[-1ex]
\vdots & \vdots & \vdots \\[-1ex]
a_{(m-1)1} & \dots & a_{(m-1)n} \\[-0.5ex]
A_1 & \dots & A_n
\end{matrix}
\end{aligned}
}

$\Downarrow$

\boxed{
\begin{aligned}
\begin{matrix}
a_{11} & \dots & a_{1n} \\[-1ex]
\vdots & \vdots & \vdots \\[-1ex]
a_{m1} & \dots & a_{mn}
\end{matrix}
\end{aligned}
}

\end{center}
where $(A_j \rightarrow a_{mj})$ are terminal rules in $G_{i_j}$ for $j \in \{1, \dots, n\}$. The reflexive transitive closure of $\Downarrow$ is denoted by $\overset{*}{\Downarrow}$. 
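The vertical derivation described above can be sketched as a small simulator. This is our own illustration; the function name and encoding are hypothetical, not from \cite{giftsironmoneyranisironmoney1972abstract}. In each step, every column rewrites its current non-terminal with one rule, and the derivation may only end with a step in which all columns apply a terminal rule:

```python
def vertical_derive(intermediates, grammars, pick):
    """Derive a rectangular picture from a string of intermediates.

    intermediates: list of start symbols, one per column
    grammars: maps each start symbol to its rule set; a rule set maps a
              non-terminal to alternatives (terminal, next_non_terminal_or_None)
    pick: chooses one alternative from a list of rules
    """
    rules = [grammars[s] for s in intermediates]  # rule set of each column
    state = list(intermediates)                   # current non-terminal per column
    rows = []
    while any(nt is not None for nt in state):
        step = [pick(rules[j][state[j]]) for j in range(len(state))]
        finished = [nxt is None for _, nxt in step]
        # all vertical grammars must apply a terminal rule in the same step
        assert all(finished) or not any(finished), "columns must terminate simultaneously"
        rows.append([t for t, _ in step])
        state = [nxt for _, nxt in step]
    return rows
```

With the deterministic rule set $S_1 \rightarrow xA$, $A \rightarrow x$, two columns started from $S_1$ yield the $2 \times 2$ picture of $x$'s.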

The set of pictures generated by a PSMG (CSMG, CFMG, RLMG) is called a phrase-structure matrix language (PSML) (context-sensitive matrix language (CSML), context-free matrix language (CFML), regular matrix language (RML)). 

\begin{definition}
	Let $G = (G_H, G_V)$ be a matrix grammar. Then 	
	
	\[L(G) = \{p \in T^{*, *} \mid S \overset{*}{\underset{G_H}{\Rightarrow}} S_{i_1} \dots S_{i_{l_2(p)}} \overset{*}{\underset{G_V}{\Downarrow}} p\}\]
	
	is the language corresponding to grammar $G$. 
\end{definition}

With a given string language $L = L(G_H)$ and a list of regular sets $R_1, \dots, R_k$ corresponding to the vertical grammars $G_1, \dots, G_k$, the authors of \cite{giftsironmoneyranisironmoney1972abstract} propose the following notation for the generated picture language: 
	
	\[L(G) = (L)::(R_1, \dots, R_k).\]
	
The set of all languages that can be generated with a PSMG (CSMG, CFMG, RLMG) is the family of phrase-structure matrix languages and is denoted by $\familyOf{PSML}$ ($\familyOf{CSML}$, $\familyOf{CFML}$, $\familyOf{RML}$). 

An example from \cite{giftsironmoneyranisironmoney1972abstract} illustrates which languages can be generated with matrix grammars:

\begin{example}
	Let $G = (G_H, G_V)$ be a matrix grammar with $G_H = (N_H, I, P_H, S)$, where
	\begin{compactitem}
		\item $N_H = \{S\}$ is the set of horizontal non-terminals,
		\item $I = \{S_1, S_2\}$ is the set of intermediate symbols and 
		\item $P_H = \{S \rightarrow S_1SS_1, S \rightarrow S_2\}$.
	\end{compactitem}
	It is obvious that $P_H$ only contains context-free production rules. Therefore, $G$ is a context-free matrix grammar. It can easily be seen that the language generated by $G_H$ is the language $L = L(G_H) = \{S_1^nS_2S_1^n \mid n \geq 0\}$. 
	
	The set of vertical grammars $G_V$ consists of two right-linear grammars $G_1$ and $G_2$ where
	\begin{compactitem}
		\item $G_{1} = (\{S_1, A\}, \{., x\}, \{S_1 \rightarrow xA, A \rightarrow .A, A \rightarrow x\}, S_1)$ and
		\item $G_{2} = (\{S_2\}, \{., x\}, \{S_2 \rightarrow xS_2, S_2 \rightarrow x\}, S_2)$. 
	\end{compactitem}
	The languages generated by these two right-linear grammars are $R_1 = \{x.^nx \mid n \geq 0\}$ and $R_2 = \{x^n \mid n \geq 1\}$. 
\end{example}
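The two vertical grammars of the example can be checked mechanically. The following sketch is our own; the enumeration helper \texttt{right\_linear\_language} is a hypothetical name. It lists all terminal strings up to a bounded length:

```python
def right_linear_language(rules, start, max_len):
    """Enumerate all terminal strings of length <= max_len derivable from start."""
    out, frontier = set(), {("", start)}
    while frontier:
        word, nt = frontier.pop()
        for terminal, nxt in rules[nt]:
            w = word + terminal
            if len(w) > max_len:
                continue
            if nxt is None:      # terminal rule: derivation ends
                out.add(w)
            else:                # non-terminal rule: keep deriving
                frontier.add((w, nxt))
    return out

# G_1: S_1 -> xA, A -> .A | x   generates  x .^n x  (n >= 0)
g1 = {"S_1": [("x", "A")], "A": [(".", "A"), ("x", None)]}
# G_2: S_2 -> xS_2 | x          generates  x^n      (n >= 1)
g2 = {"S_2": [("x", "S_2"), ("x", None)]}
```

Up to length four, $G_1$ yields exactly $\{xx, x.x, x..x\}$ and $G_2$ yields $\{x, xx, xxx, xxxx\}$, matching $R_1$ and $R_2$.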

The language generated by $G$ is the language $L(G) = (L)::(R_1, R_2)$. For $n \geq 1$, its pictures contain a token shaped like an H lying on its side; for $n = 0$, they degenerate into single columns of $x$'s. We illustrate the generation of a picture of size $(5, 5)$. 

\begin{center}
	\begin{longtable}{ccc}
		$S \Rightarrow $ & $S_1SS_1 \overset{*}{\Rightarrow}$ & \boxed{
			\begin{aligned}
				\begin{matrix}
					S_1 & S_1 & S_2 & S_1 & S_1 
				\end{matrix}
			\end{aligned}
		}\\ %new line of longtable
		&& $\Downarrow$ \\ %new line of longtable
		&& \boxed{
			\begin{aligned}
				\begin{matrix}
					x & x & x & x & x \\[-0.5ex]
					A & A & S_2 & A & A 
				\end{matrix}
			\end{aligned}
		}\\ %new line of longtable
		&& $\Downarrow$ \\ %new line of longtable
		&& \boxed{
			\begin{aligned}
				\begin{matrix}
					x & x & x & x & x \\[-0.5ex]
					. & . & x & . & . \\[-0.5ex]
					A & A & S_2 & A & A 
				\end{matrix}
			\end{aligned}
		}\\ %new line of longtable
		&& $\overset{*}{\Downarrow}$ \\ %new line of longtable
		&& \boxed{
			\begin{aligned}
				\begin{matrix}
					x & x & x & x & x \\[-0.5ex]
					. & . & x & . & . \\[-0.5ex]
					. & . & x & . & . \\[-0.5ex]
					. & . & x & . & . \\[-0.5ex]
					A & A & S_2 & A & A 
				\end{matrix}
			\end{aligned}
		} \\ %new line of longtable
		&& $\Downarrow$ \\ %new line of longtable
		&& \boxed{
			\begin{aligned}
				\begin{matrix}
					x & x & x & x & x \\[-0.5ex]
					. & . & x & . & . \\[-0.5ex]
					. & . & x & . & . \\[-0.5ex]
					. & . & x & . & . \\[-0.5ex]
					x & x & x & x & x
				\end{matrix}
			\end{aligned}
		}
	\end{longtable}
\end{center}
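The derivation above can be replayed deterministically. The following sketch is our own illustration, not part of the original example: it applies the non-terminal rules of $G_1$ and $G_2$ for four parallel steps and then the terminal rules in one final parallel step, reproducing the picture of size $(5, 5)$.

```python
# Start symbols per column, from the horizontal derivation S =>* S_1 S_1 S_2 S_1 S_1
columns = ["S_1", "S_1", "S_2", "S_1", "S_1"]

# Deterministic rule choices: non-terminal rules of G_1 (S_1 -> xA, A -> .A)
# and G_2 (S_2 -> xS_2) for the middle steps, terminal rules (A -> x, S_2 -> x) last.
nonterminal_rule = {"S_1": ("x", "A"), "A": (".", "A"), "S_2": ("x", "S_2")}
terminal_rule = {"A": "x", "S_2": "x"}

rows, state = [], list(columns)
for _ in range(4):                                   # four parallel non-terminal steps
    step = [nonterminal_rule[nt] for nt in state]
    rows.append("".join(t for t, _ in step))
    state = [nxt for _, nxt in step]
rows.append("".join(terminal_rule[nt] for nt in state))  # one parallel terminal step
```

The resulting rows are exactly the five rows of the final picture in the derivation above.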

The Chomsky hierarchy extends to matrix grammars: since the different types of matrix grammars are induced by the grammar types of the Chomsky hierarchy, we get $\familyOf{RML} \subset \familyOf{CFML} \subset \familyOf{CSML} \subset \familyOf{PSML}$. 

Regarding closure properties, \cite{giftsironmoneyranisironmoney1972abstract} claims that each family of matrix languages is closed under union, concatenation, Kleene closure, $\epsilon$-free homomorphism, inverse homomorphism and intersection with a regular matrix language. This leads to the conclusion that each family of matrix languages is an abstract family of matrix languages. 