<!DOCTYPE html>
<!--

	Modified template for STM32CubeMX.AI purpose

	d0.1: 	jean-michel.delorme@st.com
			add ST logo and ST footer

	d2.0: 	jean-michel.delorme@st.com
			add sidenav support

	d2.1: 	jean-michel.delorme@st.com
			clean-up + optional ai_logo/ai meta data
			
==============================================================================
           "GitHub HTML5 Pandoc Template" v2.1 — by Tristano Ajmone           
==============================================================================
Copyright © Tristano Ajmone, 2017, MIT License (MIT). Project's home:

- https://github.com/tajmone/pandoc-goodies

The CSS in this template reuses source code taken from the following projects:

- GitHub Markdown CSS: Copyright © Sindre Sorhus, MIT License (MIT):
  https://github.com/sindresorhus/github-markdown-css

- Primer CSS: Copyright © 2016-2017 GitHub Inc., MIT License (MIT):
  http://primercss.io/

~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
The MIT License 

Copyright (c) Tristano Ajmone, 2017 (github.com/tajmone/pandoc-goodies)
Copyright (c) Sindre Sorhus <sindresorhus@gmail.com> (sindresorhus.com)
Copyright (c) 2017 GitHub Inc.

"GitHub Pandoc HTML5 Template" is Copyright (c) Tristano Ajmone, 2017, released
under the MIT License (MIT); it contains readaptations of substantial portions
of the following third party softwares:

(1) "GitHub Markdown CSS", Copyright (c) Sindre Sorhus, MIT License (MIT).
(2) "Primer CSS", Copyright (c) 2016 GitHub Inc., MIT License (MIT).

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
==============================================================================-->
<html>
<head>
  <meta charset="utf-8" />
  <meta name="generator" content="pandoc" />
  <meta name="viewport" content="width=device-width, initial-scale=1.0, user-scalable=yes" />
  <title>TensorFlow Lite toolbox support</title>
  <style type="text/css">
.markdown-body{
	-ms-text-size-adjust:100%;
	-webkit-text-size-adjust:100%;
	color:#24292e;
	font-family:-apple-system,system-ui,BlinkMacSystemFont,"Segoe UI",Helvetica,Arial,sans-serif,"Apple Color Emoji","Segoe UI Emoji","Segoe UI Symbol";
	font-size:16px;
	line-height:1.5;
	word-wrap:break-word;
	box-sizing:border-box;
	min-width:200px;
	max-width:980px;
	margin:0 auto;
	padding:45px;
	}
.markdown-body a{
	color:#0366d6;
	background-color:transparent;
	text-decoration:none;
	-webkit-text-decoration-skip:objects}
.markdown-body a:active,.markdown-body a:hover{
	outline-width:0}
.markdown-body a:hover{
	text-decoration:underline}
.markdown-body a:not([href]){
	color:inherit;text-decoration:none}
.markdown-body strong{font-weight:600}
.markdown-body h1,.markdown-body h2,.markdown-body h3,.markdown-body h4,.markdown-body h5,.markdown-body h6{
	margin-top:24px;
	margin-bottom:16px;
	font-weight:600;
	line-height:1.25}
.markdown-body h1{
	font-size:2em;
	margin:.67em 0;
	padding-bottom:.3em;
	border-bottom:1px solid #eaecef}
.markdown-body h2{
	padding-bottom:.3em;
	font-size:1.5em;
	border-bottom:1px solid #eaecef}
.markdown-body h3{font-size:1.25em}
.markdown-body h4{font-size:1em}
.markdown-body h5{font-size:.875em}
.markdown-body h6{font-size:.85em;color:#6a737d}
.markdown-body img{border-style:none}
.markdown-body svg:not(:root){
	overflow:hidden}
.markdown-body hr{
	box-sizing:content-box;
	height:.25em;
	margin:24px 0;
	padding:0;
	overflow:hidden;
	background-color:#e1e4e8;
	border:0}
.markdown-body hr::before{display:table;content:""}
.markdown-body hr::after{display:table;clear:both;content:""}
.markdown-body input{margin:0;overflow:visible;font:inherit;font-family:inherit;font-size:inherit;line-height:inherit}
.markdown-body [type=checkbox]{box-sizing:border-box;padding:0}
.markdown-body *{box-sizing:border-box}.markdown-body blockquote{margin:0}
.markdown-body ol,.markdown-body ul{padding-left:2em}
.markdown-body ol ol,.markdown-body ul ol{list-style-type:lower-roman}
.markdown-body ol ol,.markdown-body ol ul,.markdown-body ul ol,.markdown-body ul ul{margin-top:0;margin-bottom:0}
.markdown-body ol ol ol,.markdown-body ol ul ol,.markdown-body ul ol ol,.markdown-body ul ul ol{list-style-type:lower-alpha}
.markdown-body li>p{margin-top:16px}
.markdown-body li+li{margin-top:.25em}
.markdown-body dd{margin-left:0}
.markdown-body dl{padding:0}
.markdown-body dl dt{padding:0;margin-top:16px;font-size:1em;font-style:italic;font-weight:600}
.markdown-body dl dd{padding:0 16px;margin-bottom:16px}
.markdown-body code{font-family:SFMono-Regular,Consolas,"Liberation Mono",Menlo,Courier,monospace}
.markdown-body pre{font:12px SFMono-Regular,Consolas,"Liberation Mono",Menlo,Courier,monospace;word-wrap:normal}
.markdown-body blockquote,.markdown-body dl,.markdown-body ol,.markdown-body p,.markdown-body pre,.markdown-body table,.markdown-body ul{margin-top:0;margin-bottom:16px}
.markdown-body blockquote{padding:0 1em;color:#6a737d;border-left:.25em solid #dfe2e5}
.markdown-body blockquote>:first-child{margin-top:0}
.markdown-body blockquote>:last-child{margin-bottom:0}
.markdown-body table{display:block;width:100%;overflow:auto;border-spacing:0;border-collapse:collapse}
.markdown-body table th{font-weight:600}
.markdown-body table td,.markdown-body table th{padding:6px 13px;border:1px solid #dfe2e5}
.markdown-body table tr{background-color:#fff;border-top:1px solid #c6cbd1}
.markdown-body table tr:nth-child(2n){background-color:#f6f8fa}
.markdown-body img{max-width:100%;box-sizing:content-box;background-color:#fff}
.markdown-body code{padding:.2em 0;margin:0;font-size:85%;background-color:rgba(27,31,35,.05);border-radius:3px}
.markdown-body code::after,.markdown-body code::before{letter-spacing:-.2em;content:"\00a0"}
.markdown-body pre>code{padding:0;margin:0;font-size:100%;word-break:normal;white-space:pre;background:0 0;border:0}
.markdown-body .highlight{margin-bottom:16px}
.markdown-body .highlight pre{margin-bottom:0;word-break:normal}
.markdown-body .highlight pre,.markdown-body pre{padding:16px;overflow:auto;font-size:85%;line-height:1.45;background-color:#f6f8fa;border-radius:3px}
.markdown-body pre code{display:inline;max-width:auto;padding:0;margin:0;overflow:visible;line-height:inherit;word-wrap:normal;background-color:transparent;border:0}
.markdown-body pre code::after,.markdown-body pre code::before{content:normal}
.markdown-body .full-commit .btn-outline:not(:disabled):hover{color:#005cc5;border-color:#005cc5}
.markdown-body kbd{box-shadow:inset 0 -1px 0 #959da5;display:inline-block;padding:3px 5px;font:11px/10px SFMono-Regular,Consolas,"Liberation Mono",Menlo,Courier,monospace;color:#444d56;vertical-align:middle;background-color:#fcfcfc;border:1px solid #c6cbd1;border-bottom-color:#959da5;border-radius:3px;box-shadow:inset 0 -1px 0 #959da5}
.markdown-body :checked+.radio-label{position:relative;z-index:1;border-color:#0366d6}
.markdown-body .task-list-item{list-style-type:none}
.markdown-body .task-list-item+.task-list-item{margin-top:3px}
.markdown-body .task-list-item input{margin:0 .2em .25em -1.6em;vertical-align:middle}
.markdown-body::before{display:table;content:""}
.markdown-body::after{display:table;clear:both;content:""}
.markdown-body>:first-child{margin-top:0!important}
.markdown-body>:last-child{margin-bottom:0!important}
.Alert,.Error,.Note,.Success,.Warning,.Tips,.HTips{padding:11px;margin-bottom:24px;border-style:solid;border-width:1px;border-radius:4px}
.Alert p,.Error p,.Note p,.Success p,.Warning p,.Tips p,.HTips p{margin-top:0}
.Alert p:last-child,.Error p:last-child,.Note p:last-child,.Success p:last-child,.Warning p:last-child,.Tips p:last-child,.HTips p:last-child{margin-bottom:0}
.Alert{color:#246;background-color:#e2eef9;border-color:#bac6d3}
.Warning{color:#4c4a42;background-color:#fff9ea;border-color:#dfd8c2}
.Error{color:#911;background-color:#fcdede;border-color:#d2b2b2}
.Success{color:#22662c;background-color:#e2f9e5;border-color:#bad3be}
.Note{color:#2f363d;background-color:#f6f8fa;border-color:#d5d8da}
.Alert h1,.Alert h2,.Alert h3,.Alert h4,.Alert h5,.Alert h6{color:#246;margin-bottom:0}
.Warning h1,.Warning h2,.Warning h3,.Warning h4,.Warning h5,.Warning h6{color:#4c4a42;margin-bottom:0}
.Error h1,.Error h2,.Error h3,.Error h4,.Error h5,.Error h6{color:#911;margin-bottom:0}
.Success h1,.Success h2,.Success h3,.Success h4,.Success h5,.Success h6{color:#22662c;margin-bottom:0}
.Note h1,.Note h2,.Note h3,.Note h4,.Note h5,.Note h6{color:#2f363d;margin-bottom:0}
.Tips h1,.Tips h2,.Tips h3,.Tips h4,.Tips h5,.Tips h6{color:#2f363d;margin-bottom:0}
.HTips h1,.HTips h2,.HTips h3,.HTips h4,.HTips h5,.HTips h6{color:#2f363d;margin-bottom:0}
.Tips h1:first-child,.Tips h2:first-child,.Tips h3:first-child,.Tips h4:first-child,.Tips h5:first-child,.Tips h6:first-child,.Alert h1:first-child,.Alert h2:first-child,.Alert h3:first-child,.Alert h4:first-child,.Alert h5:first-child,.Alert h6:first-child,.Error h1:first-child,.Error h2:first-child,.Error h3:first-child,.Error h4:first-child,.Error h5:first-child,.Error h6:first-child,.Note h1:first-child,.Note h2:first-child,.Note h3:first-child,.Note h4:first-child,.Note h5:first-child,.Note h6:first-child,.Success h1:first-child,.Success h2:first-child,.Success h3:first-child,.Success h4:first-child,.Success h5:first-child,.Success h6:first-child,.Warning h1:first-child,.Warning h2:first-child,.Warning h3:first-child,.Warning h4:first-child,.Warning h5:first-child,.Warning h6:first-child{margin-top:0}
h1.title,p.subtitle{text-align:center}
h1.title.followed-by-subtitle{margin-bottom:0}
p.subtitle{font-size:1.5em;font-weight:600;line-height:1.25;margin-top:0;margin-bottom:16px;padding-bottom:.3em}
div.line-block{white-space:pre-line}
  </style>
  <style type="text/css">code{white-space: pre;}</style>
  <link rel="stylesheet" href="data:text/css,%3Aroot%20%7B%2D%2Dmain%2Ddarkblue%2Dcolor%3A%20rgb%283%2C35%2C75%29%3B%20%2D%2Dmain%2Dlightblue%2Dcolor%3A%20rgb%2860%2C180%2C230%29%3B%20%2D%2Dmain%2Dpink%2Dcolor%3A%20rgb%28230%2C0%2C126%29%3B%20%2D%2Dmain%2Dyellow%2Dcolor%3A%20rgb%28255%2C210%2C0%29%3B%20%2D%2Dsecondary%2Dgrey%2Dcolor%3A%20rgb%2870%2C70%2C80%29%3B%20%2D%2Dsecondary%2Dgrey%2Dcolor%2D25%3A%20rgb%28209%2C209%2C211%29%3B%20%2D%2Dsecondary%2Dgrey%2Dcolor%2D12%3A%20rgb%28233%2C233%2C234%29%3B%20%2D%2Dsecondary%2Dlightgreen%2Dcolor%3A%20rgb%2873%2C177%2C112%29%3B%20%2D%2Dsecondary%2Dpurple%2Dcolor%3A%20rgb%28140%2C0%2C120%29%3B%20%2D%2Dsecondary%2Ddarkgreen%2Dcolor%3A%20rgb%284%2C87%2C47%29%3B%20%2D%2Dsidenav%2Dfont%2Dsize%3A%2090%25%3B%7Dhtml%20%7Bfont%2Dfamily%3A%20%22Arial%22%2C%20sans%2Dserif%3B%7D%2A%20%7Bxbox%2Dsizing%3A%20border%2Dbox%3B%7D%2Est%5Fheader%20h1%2Etitle%2C%2Est%5Fheader%20p%2Esubtitle%20%7Btext%2Dalign%3A%20left%3B%7D%2Est%5Fheader%20h1%2Etitle%20%7Bborder%2Dcolor%3A%20var%28%2D%2Dmain%2Ddarkblue%2Dcolor%29%3Bmargin%2Dbottom%3A5px%3B%7D%2Est%5Fheader%20p%2Esubtitle%20%7Bcolor%3A%20var%28%2D%2Dmain%2Ddarkblue%2Dcolor%29%3Bfont%2Dsize%3A90%25%3B%7D%2Est%5Fheader%20h1%2Etitle%2Efollowed%2Dby%2Dsubtitle%20%7Bborder%2Dbottom%3A2px%20solid%3Bborder%2Dcolor%3A%20var%28%2D%2Dmain%2Ddarkblue%2Dcolor%29%3Bmargin%2Dbottom%3A5px%3B%7D%2Est%5Fheader%20p%2Erevision%20%7Bdisplay%3A%20inline%2Dblock%3Bwidth%3A70%25%3B%7D%2Est%5Fheader%20div%2Eauthor%20%7Bfont%2Dstyle%3A%20italic%3B%7D%2Est%5Fheader%20div%2Esummary%20%7Bborder%2Dtop%3A%20solid%201px%20%23C0C0C0%3Bbackground%3A%20%23ECECEC%3Bpadding%3A%205px%3B%7D%2Est%5Ffooter%20%7Bfont%2Dsize%3A80%25%3B%7D%2Est%5Ffooter%20img%20%7Bfloat%3A%20right%3B%7D%2Est%5Ffooter%20%2Est%5Fnotice%20%7Bwidth%3A80%25%3B%7D%2Emarkdown%2Dbody%20%23header%2Dsection%2Dnumber%20%7Bfont%2Dsize%3A120%25%3B%7D%2Emarkdown%2Dbody%20h1%20%7Bborder%2Dbottom%3A1px%20solid%3Bborder%2Dcolor%3A%20var%28%2D%2Dmain%2Ddarkblue%2Dcolor%2
9%3Bpadding%2Dbottom%3A%202px%3Bpadding%2Dtop%3A%2010px%3B%7D%2Emarkdown%2Dbody%20h2%20%7Bpadding%2Dbottom%3A%205px%3Bpadding%2Dtop%3A%2010px%3B%7D%2Emarkdown%2Dbody%20h2%20code%20%7Bbackground%2Dcolor%3A%20rgb%28255%2C%20255%2C%20255%29%3B%7D%23func%2EsourceCode%20%7Bborder%2Dleft%2Dstyle%3A%20solid%3Bborder%2Dcolor%3A%20rgb%280%2C%2032%2C%2082%29%3Bborder%2Dcolor%3A%20rgb%28255%2C%20244%2C%20191%29%3Bborder%2Dwidth%3A%208px%3Bpadding%3A0px%3B%7Dpre%20%3E%20code%20%7Bborder%3A%20solid%201px%20blue%3Bfont%2Dsize%3A60%25%3B%7DcodeXX%20%7Bborder%3A%20solid%201px%20blue%3Bfont%2Dsize%3A60%25%3B%7D%23func%2EsourceXXCode%3A%3Abefore%20%7Bcontent%3A%20%22Synopsis%22%3Bpadding%2Dleft%3A10px%3Bfont%2Dweight%3A%20bold%3B%7Dfigure%20%7Bpadding%3A0px%3Bmargin%2Dleft%3A5px%3Bmargin%2Dright%3A5px%3Bmargin%2Dleft%3A%20auto%3Bmargin%2Dright%3A%20auto%3B%7Dimg%5Bdata%2Dproperty%3D%22center%22%5D%20%7Bdisplay%3A%20block%3Bmargin%2Dtop%3A%2010px%3Bmargin%2Dleft%3A%20auto%3Bmargin%2Dright%3A%20auto%3Bpadding%3A%2010px%3B%7Dfigcaption%20%7Btext%2Dalign%3Aleft%3B%20%20border%2Dtop%3A%201px%20dotted%20%23888%3Bpadding%2Dbottom%3A%2020px%3Bmargin%2Dtop%3A%2010px%3B%7Dh1%20code%2C%20h2%20code%20%7Bfont%2Dsize%3A120%25%3B%7D%09%2Emarkdown%2Dbody%20table%20%7Bwidth%3A%20100%25%3Bmargin%2Dleft%3Aauto%3Bmargin%2Dright%3Aauto%3B%7D%2Emarkdown%2Dbody%20img%20%7Bborder%2Dradius%3A%204px%3Bpadding%3A%205px%3Bdisplay%3A%20block%3Bmargin%2Dleft%3A%20auto%3Bmargin%2Dright%3A%20auto%3Bwidth%3A%20auto%3B%7D%2Emarkdown%2Dbody%20%2Est%5Fheader%20img%2C%20%2Emarkdown%2Dbody%20%7Bborder%3A%20none%3Bborder%2Dradius%3A%20none%3Bpadding%3A%205px%3Bdisplay%3A%20block%3Bmargin%2Dleft%3A%20auto%3Bmargin%2Dright%3A%20auto%3Bwidth%3A%20auto%3Bbox%2Dshadow%3A%20none%3B%7D%2Emarkdown%2Dbody%20%7Bmargin%3A%2010px%3Bpadding%3A%2010px%3Bwidth%3A%20auto%3Bfont%2Dfamily%3A%20%22Arial%22%2C%20sans%2Dserif%3Bcolor%3A%20%2303234B%3Bcolor%3A%20var%28%2D%2Dmain%2Ddarkblue%2Dcolor%29%7D%2Emarkdown%2Dbody%20h1%2C%20%2Emarkdown%
2Dbody%20h2%2C%20%2Emarkdown%2Dbody%20h3%20%7B%20%20%20color%3A%20var%28%2D%2Dmain%2Ddarkblue%2Dcolor%29%7D%2Emarkdown%2Dbody%3Ahover%20%7B%7D%2Emarkdown%2Dbody%20%2Econtents%20%7B%7D%2Emarkdown%2Dbody%20%2Etoc%2Dtitle%20%7B%7D%2Emarkdown%2Dbody%20%2Econtents%20li%20%7Blist%2Dstyle%2Dtype%3A%20none%3B%7D%2Emarkdown%2Dbody%20%2Econtents%20ul%20%7Bpadding%2Dleft%3A%2010px%3B%7D%2Emarkdown%2Dbody%20%2Econtents%20a%20%7Bcolor%3A%20%233CB4E6%3B%20%7D%2Emarkdown%2Dbody%20table%20%2Eheader%20%7Bbackground%2Dcolor%3A%20var%28%2D%2Dsecondary%2Dgrey%2Dcolor%2D12%29%3Bborder%2Dbottom%3A1px%20solid%3Bborder%2Dtop%3A1px%20solid%3Bfont%2Dsize%3A%2090%25%3B%7D%2Emarkdown%2Dbody%20table%20th%20%7Bfont%2Dweight%3A%20bolder%3B%20%7D%2Emarkdown%2Dbody%20table%20td%20%7Bfont%2Dsize%3A%2090%25%3B%7D%2Emarkdown%2Dbody%20code%7Bpadding%3A%200%3Bmargin%3A0%3Bfont%2Dsize%3A95%25%3Bbackground%2Dcolor%3Argba%2827%2C31%2C35%2C%2E05%29%3Bborder%2Dradius%3A1px%3B%7D%2Et01%20%7Bwidth%3A%20100%25%3Bborder%3A%20None%3Btext%2Dalign%3A%20left%3B%7D%2ETips%20%7Bpadding%3A11px%3Bmargin%2Dbottom%3A24px%3Bborder%2Dstyle%3Asolid%3Bborder%2Dwidth%3A1px%3Bborder%2Dradius%3A1px%7D%2ETips%20%7Bcolor%3A%232f363d%3B%20background%2Dcolor%3A%20%23f6f8fa%3Bborder%2Dcolor%3A%23d5d8da%3Bborder%2Dtop%3A1px%20solid%3Bborder%2Dbottom%3A1px%20solid%3B%7D%2EHTips%20%7Bpadding%3A11px%3Bmargin%2Dbottom%3A24px%3Bborder%2Dstyle%3Asolid%3Bborder%2Dwidth%3A1px%3Bborder%2Dradius%3A1px%7D%2EHTips%20%7Bcolor%3A%232f363d%3B%20background%2Dcolor%3A%23fff9ea%3Bborder%2Dcolor%3A%23d5d8da%3Bborder%2Dtop%3A1px%20solid%3Bborder%2Dbottom%3A1px%20solid%3B%7D%2EHTips%20h1%2C%2EHTips%20h2%2C%2EHTips%20h3%2C%2EHTips%20h4%2C%2EHTips%20h5%2C%2EHTips%20h6%20%7Bcolor%3A%232f363d%3Bmargin%2Dbottom%3A0%7D%2Esidenav%20%7Bfont%2Dfamily%3A%20%22Arial%22%2C%20sans%2Dserif%3B%20%20color%3A%20var%28%2D%2Dmain%2Ddarkblue%2Dcolor%29%3Bheight%3A%20100%25%3Bposition%3A%20fixed%3Bz%2Dindex%3A%201%3Btop%3A%200%3Bleft%3A%200%3Bmargin%2Dright%3A%2010px%3Bmargin
%2Dleft%3A%2010px%3B%20overflow%2Dx%3A%20hidden%3B%7D%2Esidenav%20hr%2Enew1%20%7Bborder%2Dwidth%3A%20thin%3Bborder%2Dcolor%3A%20var%28%2D%2Dmain%2Dlightblue%2Dcolor%29%3Bmargin%2Dright%3A%2010px%3Bmargin%2Dtop%3A%20%2D10px%3B%7D%2Esidenav%20%23sidenav%5Fheader%20%7Bmargin%2Dtop%3A%2010px%3Bborder%3A%201px%3Bcolor%3A%20var%28%2D%2Dmain%2Ddarkblue%2Dcolor%29%3Bborder%2Dcolor%3A%20var%28%2D%2Dmain%2Dlightblue%2Dcolor%29%3B%7D%2Esidenav%20%23sidenav%5Fheader%20img%20%7Bfloat%3A%20left%3B%7D%2Esidenav%20%23sidenav%5Fheader%20a%20%7Bmargin%2Dleft%3A%200px%3Bmargin%2Dright%3A%200px%3Bpadding%2Dleft%3A%200px%3B%7D%2Esidenav%20%23sidenav%5Fheader%20a%3Ahover%20%7Bbackground%2Dsize%3A%20auto%3Bcolor%3A%20%23FFD200%3B%20%7D%2Esidenav%20%23sidenav%5Fheader%20a%3Aactive%20%7B%20%20%7D%2Esidenav%20%3E%20ul%20%7Bbackground%2Dcolor%3A%20rgba%2857%2C%20169%2C%20220%2C%200%2E05%29%3B%20color%3A%20var%28%2D%2Dmain%2Ddarkblue%2Dcolor%29%3Bborder%2Dradius%3A%2010px%3Bpadding%2Dbottom%3A%2010px%3Bpadding%2Dtop%3A%2010px%3Bpadding%2Dright%3A%2010px%3Bmargin%2Dright%3A%2010px%3B%7D%2Esidenav%20a%20%7Bpadding%3A%202px%202px%3Btext%2Ddecoration%3A%20none%3Bfont%2Dsize%3A%20var%28%2D%2Dsidenav%2Dfont%2Dsize%29%3Bdisplay%3Atable%3B%7D%2Esidenav%20%3E%20ul%20%3E%20li%2C%2Esidenav%20%3E%20ul%20%3E%20li%20%3E%20ul%20%3E%20li%20%7B%20padding%2Dright%3A%205px%3Bpadding%2Dleft%3A%205px%3B%7D%2Esidenav%20%3E%20ul%20%3E%20li%20%3E%20a%20%7B%20color%3A%20var%28%2D%2Dmain%2Ddarkblue%2Dcolor%29%3Bfont%2Dweight%3A%20lighter%3B%7D%2Esidenav%20%3E%20ul%20%3E%20li%20%3E%20ul%20%3E%20li%20%3E%20a%20%7B%20color%3A%20var%28%2D%2Dmain%2Ddarkblue%2Dcolor%29%3Bfont%2Dsize%3A%2080%25%3Bpadding%2Dleft%3A%2010px%3Btext%2Dalign%2Dlast%3A%20left%3B%7D%2Esidenav%20%3E%20ul%20%3E%20li%20%3E%20ul%20%3E%20li%20%3E%20ul%20%3E%20li%20%3E%20a%20%7B%20display%3A%20None%3B%7D%2Esidenav%20li%20%7Blist%2Dstyle%2Dtype%3A%20none%3B%7D%2Esidenav%20ul%20%7Bpadding%2Dleft%3A%200px%3B%7D%2Esidenav%20%3E%20ul%20%3E%20li%20%3E%20a%3Ahove
r%2C%2Esidenav%20%3E%20ul%20%3E%20li%20%3E%20ul%20%3E%20li%20%3E%20a%3Ahover%20%7Bbackground%2Dcolor%3A%20var%28%2D%2Dsecondary%2Dgrey%2Dcolor%2D12%29%3Bbackground%2Dclip%3A%20border%2Dbox%3Bmargin%2Dleft%3A%20%2D10px%3Bpadding%2Dleft%3A%2010px%3B%7D%2Esidenav%20%3E%20ul%20%3E%20li%20%3E%20a%3Ahover%20%7Bpadding%2Dright%3A%2015px%3Bwidth%3A%20230px%3B%09%7D%2Esidenav%20%3E%20ul%20%3E%20li%20%3E%20ul%20%3E%20li%20%3E%20a%3Ahover%20%7Bpadding%2Dright%3A%2010px%3Bwidth%3A%20230px%3B%09%7D%2Esidenav%20%3E%20ul%20%3E%20li%20%3E%20a%3Aactive%20%7B%20color%3A%20%23FFD200%3B%20%7D%2Esidenav%20%3E%20ul%20%3E%20li%20%3E%20ul%20%3E%20li%20%3E%20a%3Aactive%20%7B%20color%3A%20%23FFD200%3B%20%7D%2Esidenav%20code%20%7B%7D%2Esidenav%20%7Bwidth%3A%20280px%3B%7D%23sidenav%20%7Bmargin%2Dleft%3A%20300px%3Bdisplay%3Ablock%3B%7D%2Emarkdown%2Dbody%20%2Eprint%2Dcontents%20%7Bvisibility%3Ahidden%3B%7D%2Emarkdown%2Dbody%20%2Eprint%2Dtoc%2Dtitle%20%7Bvisibility%3Ahidden%3B%7D%2Emarkdown%2Dbody%20%7Bmax%2Dwidth%3A%20980px%3Bmin%2Dwidth%3A%20200px%3Bpadding%3A%2040px%3Bborder%2Dstyle%3A%20solid%3Bborder%2Dstyle%3A%20outset%3Bborder%2Dcolor%3A%20rgba%28104%2C%20167%2C%20238%2C%200%2E089%29%3Bborder%2Dradius%3A%205px%3B%7D%40media%20screen%20and%20%28max%2Dheight%3A%20450px%29%20%7B%2Esidenav%20%7Bpadding%2Dtop%3A%2015px%3B%7D%2Esidenav%20a%20%7Bfont%2Dsize%3A%2018px%3B%7D%23sidenav%20%7Bmargin%2Dleft%3A%2010px%3B%20%7D%2Esidenav%20%7Bvisibility%3Ahidden%3B%7D%2Emarkdown%2Dbody%20%7Bmargin%3A%2010px%3Bpadding%3A%2040px%3Bwidth%3A%20auto%3Bborder%3A%200px%3B%7D%7D%40media%20screen%20and%20%28max%2Dwidth%3A%201024px%29%20%7B%2Esidenav%20%7Bvisibility%3Ahidden%3B%7D%2Emarkdown%2Dbody%20%7Bmargin%3A%2010px%3Bpadding%3A%2040px%3Bwidth%3A%20auto%3Bborder%3A%200px%3B%7D%23sidenav%20%7Bmargin%2Dleft%3A%2010px%3B%7D%7D%40media%20print%20%7B%2Esidenav%20%7Bvisibility%3Ahidden%3B%7D%23sidenav%20%7Bmargin%2Dleft%3A%2010px%3B%7D%2Emarkdown%2Dbody%20%7Bmargin%3A%2010px%3Bpadding%3A%2010px%3Bwidth%3Aauto%3Bbord
er%3A%200px%3B%7D%40page%20%7Bsize%3A%20A4%3B%20%20margin%3A2cm%3Bpadding%3A2cm%3Bmargin%2Dtop%3A%201cm%3Bpadding%2Dbottom%3A%201cm%3B%7D%2A%20%7Bxbox%2Dsizing%3A%20border%2Dbox%3Bfont%2Dsize%3A90%25%3B%7Da%20%7Bfont%2Dsize%3A%20100%25%3Bcolor%3A%20yellow%3B%7D%2Emarkdown%2Dbody%20article%20%7Bxbox%2Dsizing%3A%20border%2Dbox%3Bfont%2Dsize%3A100%25%3B%7D%2Emarkdown%2Dbody%20p%20%7Bwindows%3A%202%3Borphans%3A%202%3B%7D%2Epagebreakerafter%20%7Bpage%2Dbreak%2Dafter%3A%20always%3Bpadding%2Dtop%3A10mm%3B%7D%2Epagebreakbefore%20%7Bpage%2Dbreak%2Dbefore%3A%20always%3B%7Dh1%2C%20h2%2C%20h3%2C%20h4%20%7Bpage%2Dbreak%2Dafter%3A%20avoid%3B%7Ddiv%2C%20code%2C%20blockquote%2C%20li%2C%20span%2C%20table%2C%20figure%20%7Bpage%2Dbreak%2Dinside%3A%20avoid%3B%7D%7D">
  <!--[if lt IE 9]>
    <script src="//cdnjs.cloudflare.com/ajax/libs/html5shiv/3.7.3/html5shiv-printshiv.min.js"></script>
  <![endif]-->





<link rel="shortcut icon" href="">

</head>



<body>

		<div class="sidenav">
		<div id="sidenav_header">
							<img src="" title="STM32CubeMX.AI logo" align="left" height="70" />
										<br />7.0.0<br />
										<a href="#doc_title"> TensorFlow Lite toolbox support </a>
					</div>
		<div id="sidenav_header_button">
			 
							<ul>
					<li><p><a id="index" href="index.html">[ Index ]</a></p></li>
				</ul>
						<hr class="new1">
		</div>	

		<ul>
  <li><a href="#overview">Overview</a>
  <ul>
  <li><a href="#summary-table">Summary table</a></li>
  <li><a href="#common-constraints">Common constraints</a></li>
  </ul></li>
  <li><a href="#operators">Operators</a>
  <ul>
  <li><a href="#abs">ABS</a></li>
  <li><a href="#add">ADD</a></li>
  <li><a href="#arg_max">ARG_MAX</a></li>
  <li><a href="#arg_min">ARG_MIN</a></li>
  <li><a href="#average_pool_2d">AVERAGE_POOL_2D</a></li>
  <li><a href="#batch_to_space_nd">BATCH_TO_SPACE_ND</a></li>
  <li><a href="#cast">CAST</a></li>
  <li><a href="#ceil">CEIL</a></li>
  <li><a href="#concatenation">CONCATENATION</a></li>
  <li><a href="#conv_2d">CONV_2D</a></li>
  <li><a href="#cos">COS</a></li>
  <li><a href="#depthwise_conv_2d">DEPTHWISE_CONV_2D</a></li>
  <li><a href="#dequantize">DEQUANTIZE</a></li>
  <li><a href="#div">DIV</a></li>
  <li><a href="#elu">ELU</a></li>
  <li><a href="#equal">EQUAL</a></li>
  <li><a href="#exp">EXP</a></li>
  <li><a href="#expand_dims">EXPAND_DIMS</a></li>
  <li><a href="#fill">FILL</a></li>
  <li><a href="#floor">FLOOR</a></li>
  <li><a href="#floor_div">FLOOR_DIV</a></li>
  <li><a href="#floor_mod">FLOOR_MOD</a></li>
  <li><a href="#fully_connected">FULLY_CONNECTED</a></li>
  <li><a href="#gather">GATHER</a></li>
  <li><a href="#greater">GREATER</a></li>
  <li><a href="#greater_equal">GREATER_EQUAL</a></li>
  <li><a href="#hard_swish">HARD_SWISH</a></li>
  <li><a href="#l2_normalization">L2_NORMALIZATION</a></li>
  <li><a href="#leaky_relu">LEAKY_RELU</a></li>
  <li><a href="#less">LESS</a></li>
  <li><a href="#less_equal">LESS_EQUAL</a></li>
  <li><a href="#local_response_normalization">LOCAL_RESPONSE_NORMALIZATION</a></li>
  <li><a href="#log">LOG</a></li>
  <li><a href="#log_softmax">LOG_SOFTMAX</a></li>
  <li><a href="#logical_and">LOGICAL_AND</a></li>
  <li><a href="#logical_not">LOGICAL_NOT</a></li>
  <li><a href="#logical_or">LOGICAL_OR</a></li>
  <li><a href="#logistic">LOGISTIC</a></li>
  <li><a href="#max_pool_2d">MAX_POOL_2D</a></li>
  <li><a href="#maximum">MAXIMUM</a></li>
  <li><a href="#mean">MEAN</a></li>
  <li><a href="#minimum">MINIMUM</a></li>
  <li><a href="#mirror_pad">MIRROR_PAD</a></li>
  <li><a href="#mul">MUL</a></li>
  <li><a href="#neg">NEG</a></li>
  <li><a href="#pack">PACK</a></li>
  <li><a href="#pad">PAD</a></li>
  <li><a href="#padv2">PADV2</a></li>
  <li><a href="#pow">POW</a></li>
  <li><a href="#prelu">PRELU</a></li>
  <li><a href="#quantize">QUANTIZE</a></li>
  <li><a href="#reduce_any">REDUCE_ANY</a></li>
  <li><a href="#reduce_max">REDUCE_MAX</a></li>
  <li><a href="#reduce_min">REDUCE_MIN</a></li>
  <li><a href="#reduce_prod">REDUCE_PROD</a></li>
  <li><a href="#relu">RELU</a></li>
  <li><a href="#relu6">RELU6</a></li>
  <li><a href="#relu_n1_to_1">RELU_N1_TO_1</a></li>
  <li><a href="#reshape">RESHAPE</a></li>
  <li><a href="#resize_bilinear">RESIZE_BILINEAR</a></li>
  <li><a href="#resize_nearest_neighbor">RESIZE_NEAREST_NEIGHBOR</a></li>
  <li><a href="#round">ROUND</a></li>
  <li><a href="#rsqrt">RSQRT</a></li>
  <li><a href="#shape">SHAPE</a></li>
  <li><a href="#sin">SIN</a></li>
  <li><a href="#slice">SLICE</a></li>
  <li><a href="#softmax">SOFTMAX</a></li>
  <li><a href="#space_to_batch_nd">SPACE_TO_BATCH_ND</a></li>
  <li><a href="#split">SPLIT</a></li>
  <li><a href="#sqrt">SQRT</a></li>
  <li><a href="#square">SQUARE</a></li>
  <li><a href="#squeeze">SQUEEZE</a></li>
  <li><a href="#strided_slice">STRIDED_SLICE</a></li>
  <li><a href="#sub">SUB</a></li>
  <li><a href="#sum">SUM</a></li>
  <li><a href="#tanh">TANH</a></li>
  <li><a href="#tile">TILE</a></li>
  <li><a href="#transpose">TRANSPOSE</a></li>
  <li><a href="#transpose_conv">TRANSPOSE_CONV</a></li>
  <li><a href="#unidirectional_sequence_lstm">UNIDIRECTIONAL_SEQUENCE_LSTM</a></li>
  <li><a href="#unpack">UNPACK</a></li>
  </ul></li>
  <li><a href="#references">References</a></li>
  </ul>
	</div>
	<article id="sidenav" class="markdown-body">
		



<header>
<section class="st_header" id="doc_title">

<div class="himage">
	<img src="" title="STM32CubeMX.AI" align="right" height="70" />
	<img src="" title="STM32" align="right" height="90" />
</div>

<h1 class="title followed-by-subtitle">TensorFlow Lite toolbox support</h1>

	<p class="subtitle">X-CUBE-AI Expansion Package</p>


	<div class="ai_platform">
		AI PLATFORM r7.0.0
					(Embedded Inference Client API 1.1.0)
			</div>
			Command Line Interface r1.5.1
	




</section>
</header>
 




<section id="overview" class="level1">
<h1>Overview</h1>
<p>This document lists the layers (or operators) that can be imported and converted. The supported operators make it possible to address a wide range of classical topologies targeting a mobile or IoT resource-constrained runtime environment: SqueezeNet, MobileNet V1 or V2, Inception, SSD MobileNet V1, and others.</p>
<blockquote>
<p>The purpose of this document is to list the operators and their associated constraints or limitations; please refer to the original documentation for details on a given layer.</p>
</blockquote>
<p><a href="https://www.tensorflow.org/lite/">TensorFlow Lite</a> is the format used to deploy a neural network model on mobile platforms. STM.ai imports and converts <code>.tflite</code> files, which are based on the <a href="https://github.com/google/flatbuffers">FlatBuffers</a> technology. The official ‘<code>schema.fbs</code>’ definition (<a href="https://github.com/tensorflow/tensorflow">tag <code>v2.5.0</code></a>) is used to import the models. A number of operators from the <a href="https://www.tensorflow.org/lite/guide/ops_compatibility">supported operators</a> list are handled, including the quantized models and/or operators generated by the quantization-aware training and/or post-training quantization processes.</p>
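<p>Quantized <code>.tflite</code> tensors use an affine mapping between integer codes and real values. The sketch below is illustrative only (the function names are not part of any STM32Cube.AI or TensorFlow API); it shows the <code>real_value = scale * (quantized_value - zero_point)</code> scheme assumed throughout this document for <code>int8</code> tensors.</p>

```python
# Illustrative sketch of the affine quantization scheme used by
# TFLite int8 tensors:
#   real_value = scale * (quantized_value - zero_point)
# Function names are hypothetical, chosen for readability only.

def quantize(real_value, scale, zero_point, qmin=-128, qmax=127):
    """Map a float to an int8 code: round to nearest, then clamp."""
    q = round(real_value / scale) + zero_point
    return max(qmin, min(qmax, q))

def dequantize(q, scale, zero_point):
    """Recover the approximate float value from an int8 code."""
    return scale * (q - zero_point)

# Example with scale 0.05 and zero_point 0:
q = quantize(1.0, 0.05, 0)    # -> 20
x = dequantize(q, 0.05, 0)    # -> 1.0 (approximately)
```

<p>Values outside the representable range are saturated at the clamp step, which is why the choice of <code>scale</code> and <code>zero_point</code> during quantization directly bounds the accuracy of the deployed model.</p>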
<p><em>This file was automatically generated.</em></p>
<ul>
<li>X-CUBE-AI version: 7.0<br />
</li>
<li>81 operators found</li>
</ul>
<section id="summary-table" class="level2">
<h2>Summary table</h2>
<p>The following table lists the operators that can be imported, provided the constraints or limitations are met.</p>
<ul>
<li>supported optional fused activations (or non-linearities): linear, relu, relu_n1_to_1, leaky_relu, relu6, elu, selu, sigmoid, hard_sigmoid, hard_swish, exponential, tanh, softmax, softplus, softsign, abs, acos, acosh, asin, asinh, atan, atanh, ceil, clip, cos, cosh, erf, exp, floor, identity, log, neg, prelu, reciprocal, relu_generic, relu_thresholded, round, rsqrt, sign, sin, sinh, sqrt, swish, tan<br />
</li>
<li>supported optional fused <strong>integer</strong> activations (or non-linearities): prelu, relu, clip, lut, swish, identity, relu6<br />
</li>
<li>if an operator is not supported in integer precision, the floating-point version is used; converter nodes are automatically added by the code generator.</li>
</ul>
<table>
<thead>
<tr class="header">
<th style="text-align: left;">operator</th>
<th style="text-align: left;">data types</th>
<th style="text-align: left;">constraints/limitations</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td style="text-align: left;"><a href="#abs">ABS</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#add">ADD</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#arg_max">ARG_MAX</a></td>
<td style="text-align: left;">float32, int32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#arg_min">ARG_MIN</a></td>
<td style="text-align: left;">float32, int32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#average_pool_2d">AVERAGE_POOL_2D</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#average_pool_2d">specific</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#batch_to_space_nd">BATCH_TO_SPACE_ND</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#cast">CAST</a></td>
<td style="text-align: left;">bool, int8, uint8, float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#ceil">CEIL</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#concatenation">CONCATENATION</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#concatenation">specific</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#conv_2d">CONV_2D</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#conv_2d">specific</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#cos">COS</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#depthwise_conv_2d">DEPTHWISE_CONV_2D</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#depthwise_conv_2d">specific</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#dequantize">DEQUANTIZE</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#div">DIV</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#elu">ELU</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#equal">EQUAL</a></td>
<td style="text-align: left;">float32, bool</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#exp">EXP</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#expand_dims">EXPAND_DIMS</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#fill">FILL</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#floor">FLOOR</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#floor_div">FLOOR_DIV</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#floor_mod">FLOOR_MOD</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#fully_connected">FULLY_CONNECTED</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#fully_connected">specific</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#gather">GATHER</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#gather">specific</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#greater">GREATER</a></td>
<td style="text-align: left;">float32, bool</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#greater_equal">GREATER_EQUAL</a></td>
<td style="text-align: left;">float32, bool</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#hard_swish">HARD_SWISH</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#l2_normalization">L2_NORMALIZATION</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#leaky_relu">LEAKY_RELU</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#less">LESS</a></td>
<td style="text-align: left;">float32, bool</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#less_equal">LESS_EQUAL</a></td>
<td style="text-align: left;">float32, bool</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#local_response_normalization">LOCAL_RESPONSE_NORMALIZATION</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#log">LOG</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#log_softmax">LOG_SOFTMAX</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#log_softmax">specific</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#logical_and">LOGICAL_AND</a></td>
<td style="text-align: left;">bool</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#logical_not">LOGICAL_NOT</a></td>
<td style="text-align: left;">bool</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#logical_or">LOGICAL_OR</a></td>
<td style="text-align: left;">bool</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#logistic">LOGISTIC</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#max_pool_2d">MAX_POOL_2D</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#max_pool_2d">specific</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#maximum">MAXIMUM</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#mean">MEAN</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#minimum">MINIMUM</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#mirror_pad">MIRROR_PAD</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#mul">MUL</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#neg">NEG</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#pack">PACK</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#pack">specific</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#pad">PAD</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#padv2">PADV2</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#pow">POW</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#prelu">PRELU</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#quantize">QUANTIZE</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#reduce_any">REDUCE_ANY</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#reduce_max">REDUCE_MAX</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#reduce_min">REDUCE_MIN</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#reduce_prod">REDUCE_PROD</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#relu">RELU</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#relu6">RELU6</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#relu_n1_to_1">RELU_N1_TO_1</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#reshape">RESHAPE</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#resize_bilinear">RESIZE_BILINEAR</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#resize_nearest_neighbor">RESIZE_NEAREST_NEIGHBOR</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#round">ROUND</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#rsqrt">RSQRT</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#shape">SHAPE</a></td>
<td style="text-align: left;">float32, int8, uint8, int32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#sin">SIN</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#slice">SLICE</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#softmax">SOFTMAX</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#softmax">specific</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#space_to_batch_nd">SPACE_TO_BATCH_ND</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#split">SPLIT</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#split">specific</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#sqrt">SQRT</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#square">SQUARE</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#squeeze">SQUEEZE</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#strided_slice">STRIDED_SLICE</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#strided_slice">specific</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#sub">SUB</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#sum">SUM</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#tanh">TANH</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#tile">TILE</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#tile">specific</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#transpose">TRANSPOSE</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#transpose">specific</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#transpose_conv">TRANSPOSE_CONV</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#transpose_conv">specific</a></td>
</tr>
<tr class="even">
<td style="text-align: left;"><a href="#unidirectional_sequence_lstm">UNIDIRECTIONAL_SEQUENCE_LSTM</a></td>
<td style="text-align: left;">float32</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#unidirectional_sequence_lstm">specific</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;"><a href="#unpack">UNPACK</a></td>
<td style="text-align: left;">float32, int8, uint8</td>
<td style="text-align: left;"><a href="#common-constraints">common</a>, <a href="#unpack">specific</a></td>
</tr>
</tbody>
</table>
</section>
<section id="common-constraints" class="level2">
<h2>Common constraints</h2>
<ul>
<li>input and output tensors must be <strong>not dynamic</strong>.
<ul>
<li>variable-length batch dimension (i.e. <code>(None,)</code>) is considered as equal to 1<br />
</li>
<li>must not be greater than 4D<br />
</li>
<li>dimension must be in the range [0, 65536[<br />
</li>
<li>batch dimension is not supported for the axis parameter<br />
</li>
<li>data type for the weights/activations tensors must be:
<ul>
<li>float32, int8, uint8<br />
</li>
<li>only int32 for the bias tensor is considered<br />
</li>
</ul></li>
<li>for some operators, bool type is also supported<br />
</li>
</ul></li>
<li>mixed data-type operations (that is, hybrid operators) are not supported; activations and weights must both be quantized<br />
</li>
<li>the generated C model is always <strong>channel-last</strong> (<code>NHWC</code> format)<br />
</li>
<li>a 1D operator is mapped onto the corresponding 2D operator by inserting a singleton dimension in the input: (12, 3) -&gt; (12, 1, 3)</li>
</ul>
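<p>The common constraints above can be sketched as a pre-flight check. This is an illustrative Python sketch only; <code>check_tensor</code> is a hypothetical helper, not part of the STM32Cube.AI tool.</p>

```python
import numpy as np

def check_tensor(shape, dtype, is_bias=False):
    """Hypothetical pre-flight check mirroring the common constraints."""
    # A variable-length batch dimension (None) is treated as 1.
    shape = tuple(1 if d is None else d for d in shape)
    if len(shape) > 4:
        raise ValueError("tensors must not be greater than 4D")
    if any(not (0 <= d < 65536) for d in shape):
        raise ValueError("each dimension must be in the range [0, 65536[")
    allowed = {"int32"} if is_bias else {"float32", "int8", "uint8"}
    if dtype not in allowed:
        raise ValueError(f"unsupported data type: {dtype}")
    return shape

# 1D-to-2D mapping: a (12, 3) input gains a singleton dimension,
# (12, 3) -> (12, 1, 3), so the 2D operator can be reused.
x = np.zeros((12, 3), dtype=np.float32)
x2d = np.expand_dims(x, axis=1)
```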
</section>
</section>
<section id="operators" class="level1">
<h1>Operators</h1>
<section id="abs" class="level2">
<h2>ABS</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="add" class="level2">
<h2>ADD</h2>
<p>Performs element-wise operation</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="arg_max" class="level2">
<h2>ARG_MAX</h2>
<p>Computes the indices of the maximum elements of the input tensor along the provided axis.</p>
<ul>
<li>category: generic layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: int32</li>
</ul>
</section>
<section id="arg_min" class="level2">
<h2>ARG_MIN</h2>
<p>Computes the indices of the minimum elements of the input tensor along the provided axis.</p>
<ul>
<li>category: generic layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: int32</li>
</ul>
</section>
<section id="average_pool_2d" class="level2">
<h2>AVERAGE_POOL_2D</h2>
<p>Downsamples the input</p>
<ul>
<li>category: pooling layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>arbitrary strides, provided that they are smaller than the input size<br />
</li>
<li>arbitrary pool sizes, provided that they are smaller than the input size</li>
</ul>
</section>
<section id="batch_to_space_nd" class="level2">
<h2>BATCH_TO_SPACE_ND</h2>
<p>Reshapes the batch dimension of a tensor</p>
<ul>
<li>category: Reshaping operation<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="cast" class="level2">
<h2>CAST</h2>
<p>Casts elements of the input tensor to the data type of the output tensor</p>
<ul>
<li>category: conversion layer<br />
</li>
<li>input data types: bool, int8, uint8, float32<br />
</li>
<li>output data types: bool, int8, uint8, float32</li>
</ul>
</section>
<section id="ceil" class="level2">
<h2>CEIL</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="concatenation" class="level2">
<h2>CONCATENATION</h2>
<p>Performs concatenation of a list of inputs</p>
<ul>
<li>category: merge operator<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8<br />
</li>
<li>fused activations (if present): linear, relu, relu_n1_to_1, leaky_relu, relu6, elu, selu, sigmoid, hard_sigmoid, hard_swish, exponential, tanh, softmax, softplus, softsign</li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>concatenating on the batch dimension is not supported</li>
</ul>
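<p>Because the generated C model is channel-last (<code>NHWC</code>), concatenation is typically performed on the trailing channel axis; the batch axis is the one that is not supported. A small numpy sketch of the supported case:</p>

```python
import numpy as np

# Two feature maps with identical batch/spatial dims, NHWC layout.
a = np.ones((1, 4, 4, 8), dtype=np.float32)
b = np.ones((1, 4, 4, 16), dtype=np.float32)

# Concatenating on the last (channel) axis is supported: -> (1, 4, 4, 24).
merged = np.concatenate([a, b], axis=-1)

# axis=0 would concatenate on the batch dimension, which is not supported.
```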
</section>
<section id="conv_2d" class="level2">
<h2>CONV_2D</h2>
<p>Performs convolution operation</p>
<ul>
<li>category: convolutional layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8<br />
</li>
<li>fused activations (if present): linear, relu, relu_n1_to_1, leaky_relu, relu6, elu, selu, sigmoid, hard_sigmoid, hard_swish, exponential, tanh, softmax, softplus, softsign<br />
</li>
<li>integer schemes: weights / activations
<ul>
<li>Signed Symmetric / Signed Asymmetric (SSSA)<br />
</li>
<li>Signed Symmetric per channel (or per-axis) / Signed Asymmetric (SSSA_CH)<br />
</li>
<li>Signed Symmetric / Unsigned Asymmetric (SSUA)<br />
</li>
<li>Signed Symmetric per channel (or per-axis) / Unsigned Asymmetric (SSUA_CH)</li>
</ul></li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>arbitrary strides, provided that they are smaller than the input size<br />
</li>
<li>arbitrary filter kernel sizes, provided that they are smaller than the input size<br />
</li>
<li>for quantized model, dilation values different from 1 are not supported</li>
</ul>
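<p>The per-tensor versus per-channel (per-axis) weight schemes listed above can be illustrated with signed symmetric int8 quantization in numpy. This is a toy sketch of the general idea, not the tool's internal implementation:</p>

```python
import numpy as np

# Toy weights; the last axis plays the role of the output channels.
w = np.array([[ 0.5, -2.0],
              [-1.0,  4.0]], dtype=np.float32)  # 2 output channels

# Per-tensor symmetric scale (SSSA-style): one scale for all weights.
scale_t = np.abs(w).max() / 127.0
q_t = np.clip(np.round(w / scale_t), -127, 127).astype(np.int8)

# Per-channel symmetric scales (SSSA_CH-style): one scale per channel,
# so small-magnitude channels keep more of the int8 range.
scale_c = np.abs(w).max(axis=0) / 127.0
q_c = np.clip(np.round(w / scale_c), -127, 127).astype(np.int8)
```

With per-tensor scaling, the first channel only spans a fraction of the int8 range; with per-channel scaling, both channels use the full range, which is why per-axis weight quantization usually preserves accuracy better.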
</section>
<section id="cos" class="level2">
<h2>COS</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="depthwise_conv_2d" class="level2">
<h2>DEPTHWISE_CONV_2D</h2>
<p>Performs convolution operation</p>
<ul>
<li>category: convolutional layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8<br />
</li>
<li>fused activations (if present): linear, relu, relu_n1_to_1, leaky_relu, relu6, elu, selu, sigmoid, hard_sigmoid, hard_swish, exponential, tanh, softmax, softplus, softsign<br />
</li>
<li>integer schemes: weights / activations
<ul>
<li>Signed Symmetric / Signed Asymmetric (SSSA)<br />
</li>
<li>Signed Symmetric per channel (or per-axis) / Signed Asymmetric (SSSA_CH)<br />
</li>
<li>Signed Symmetric / Unsigned Asymmetric (SSUA)<br />
</li>
<li>Signed Symmetric per channel (or per-axis) / Unsigned Asymmetric (SSUA_CH)</li>
</ul></li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>arbitrary strides, provided that they are smaller than the input size<br />
</li>
<li>arbitrary filter kernel sizes, provided that they are smaller than the input size<br />
</li>
<li>for quantized model, dilation values different from 1 are not supported</li>
</ul>
</section>
<section id="dequantize" class="level2">
<h2>DEQUANTIZE</h2>
<p>Performs element-wise conversion from low precision to full precision, based on the scale/zero-point parameters</p>
<ul>
<li>category: conversion layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="div" class="level2">
<h2>DIV</h2>
<p>Performs element-wise operation</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="elu" class="level2">
<h2>ELU</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="equal" class="level2">
<h2>EQUAL</h2>
<p>Performs logical element-wise operation</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: bool</li>
</ul>
</section>
<section id="exp" class="level2">
<h2>EXP</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="expand_dims" class="level2">
<h2>EXPAND_DIMS</h2>
<p>Reshapes a tensor</p>
<ul>
<li>category: Reshaping operation<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="fill" class="level2">
<h2>FILL</h2>
<p>Generates a tensor with a given value and shape</p>
<ul>
<li>category: constant layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="floor" class="level2">
<h2>FLOOR</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="floor_div" class="level2">
<h2>FLOOR_DIV</h2>
<p>Performs element-wise operation</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="floor_mod" class="level2">
<h2>FLOOR_MOD</h2>
<p>Performs element-wise operation</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="fully_connected" class="level2">
<h2>FULLY_CONNECTED</h2>
<p>Fully Connected operation</p>
<ul>
<li>category: core layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8<br />
</li>
<li>fused activations (if present): linear, relu, relu_n1_to_1, leaky_relu, relu6, elu, selu, sigmoid, hard_sigmoid, hard_swish, exponential, tanh, softmax, softplus, softsign<br />
</li>
<li>integer schemes: weights / activations
<ul>
<li>Signed Symmetric / Signed Asymmetric (SSSA)<br />
</li>
<li>Signed Symmetric per channel (or per-axis) / Signed Asymmetric (SSSA_CH)<br />
</li>
<li>Signed Symmetric / Unsigned Asymmetric (SSUA)<br />
</li>
<li>Signed Symmetric per channel (or per-axis) / Unsigned Asymmetric (SSUA_CH)</li>
</ul></li>
</ul>
</section>
<section id="gather" class="level2">
<h2>GATHER</h2>
<p>Gathers values along a specified axis</p>
<ul>
<li>category: activation function<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>gather along the batch dimension is not supported</li>
</ul>
</section>
<section id="greater" class="level2">
<h2>GREATER</h2>
<p>Performs logical element-wise operation</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: bool</li>
</ul>
</section>
<section id="greater_equal" class="level2">
<h2>GREATER_EQUAL</h2>
<p>Performs logical element-wise operation</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: bool</li>
</ul>
</section>
<section id="hard_swish" class="level2">
<h2>HARD_SWISH</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="l2_normalization" class="level2">
<h2>L2_NORMALIZATION</h2>
<p>Applies L2 normalization along the provided axis</p>
<ul>
<li>category: normalization function<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="leaky_relu" class="level2">
<h2>LEAKY_RELU</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="less" class="level2">
<h2>LESS</h2>
<p>Performs logical element-wise operation</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: bool</li>
</ul>
</section>
<section id="less_equal" class="level2">
<h2>LESS_EQUAL</h2>
<p>Performs logical element-wise operation</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: bool</li>
</ul>
</section>
<section id="local_response_normalization" class="level2">
<h2>LOCAL_RESPONSE_NORMALIZATION</h2>
<p>Applies Local Response Normalization over local input regions</p>
<ul>
<li>category: normalization function<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="log" class="level2">
<h2>LOG</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="log_softmax" class="level2">
<h2>LOG_SOFTMAX</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>it is supported only for 1D tensors and only along the channel dimension</li>
</ul>
</section>
<section id="logical_and" class="level2">
<h2>LOGICAL_AND</h2>
<p>Performs boolean element-wise operation</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: bool<br />
</li>
<li>output data types: bool</li>
</ul>
</section>
<section id="logical_not" class="level2">
<h2>LOGICAL_NOT</h2>
<p>Performs boolean element-wise operation</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: bool<br />
</li>
<li>output data types: bool</li>
</ul>
</section>
<section id="logical_or" class="level2">
<h2>LOGICAL_OR</h2>
<p>Performs boolean element-wise operation</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: bool<br />
</li>
<li>output data types: bool</li>
</ul>
</section>
<section id="logistic" class="level2">
<h2>LOGISTIC</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="max_pool_2d" class="level2">
<h2>MAX_POOL_2D</h2>
<p>Downsamples the input</p>
<ul>
<li>category: pooling layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>arbitrary strides, provided that they are smaller than the input size<br />
</li>
<li>arbitrary pool sizes, provided that they are smaller than the input size</li>
</ul>
</section>
<section id="maximum" class="level2">
<h2>MAXIMUM</h2>
<p>Computes the element-wise maximum of a list of inputs</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="mean" class="level2">
<h2>MEAN</h2>
<p>Computes the mean of the input tensor’s elements along the provided axes</p>
<ul>
<li>category: reduction operation<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="minimum" class="level2">
<h2>MINIMUM</h2>
<p>Computes the element-wise minimum of a list of inputs</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="mirror_pad" class="level2">
<h2>MIRROR_PAD</h2>
<p>Pads an input tensor</p>
<ul>
<li>category: Reshaping layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="mul" class="level2">
<h2>MUL</h2>
<p>Performs element-wise operation</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="neg" class="level2">
<h2>NEG</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="pack" class="level2">
<h2>PACK</h2>
<p>Packs a list of tensors into a tensor along a specified axis</p>
<ul>
<li>category: merge operator<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>related TF operator: tf.stack</li>
</ul>
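<p>Like <code>tf.stack</code>, PACK adds a new axis at the chosen position. The equivalent numpy behavior, shown here as an illustration:</p>

```python
import numpy as np

# Three tensors of identical shape (4, 2), filled with 0, 1, 2.
tensors = [np.full((4, 2), i, dtype=np.float32) for i in range(3)]

# Packing along axis=0 inserts a new leading axis: result shape (3, 4, 2).
packed = np.stack(tensors, axis=0)
```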
</section>
<section id="pad" class="level2">
<h2>PAD</h2>
<p>Pads an input tensor</p>
<ul>
<li>category: Reshaping layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="padv2" class="level2">
<h2>PADV2</h2>
<p>Pads an input tensor</p>
<ul>
<li>category: Reshaping layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="pow" class="level2">
<h2>POW</h2>
<p>Performs element-wise exponentiation</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="prelu" class="level2">
<h2>PRELU</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="quantize" class="level2">
<h2>QUANTIZE</h2>
<p>Converts data element-wise from full precision to low precision, based on the scale/zero-point parameters</p>
<ul>
<li>category: conversion layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="reduce_any" class="level2">
<h2>REDUCE_ANY</h2>
<p>Computes the logical ‘or’ of elements across dimensions of a tensor</p>
<ul>
<li>category: reduction operation<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="reduce_max" class="level2">
<h2>REDUCE_MAX</h2>
<p>Computes the maximum of the input tensor’s elements along the provided axes</p>
<ul>
<li>category: reduction operation<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="reduce_min" class="level2">
<h2>REDUCE_MIN</h2>
<p>Computes the minimum of the input tensor’s elements along the provided axes</p>
<ul>
<li>category: reduction operation<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="reduce_prod" class="level2">
<h2>REDUCE_PROD</h2>
<p>Computes the product of the input tensor’s elements along the provided axes</p>
<ul>
<li>category: reduction operation<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="relu" class="level2">
<h2>RELU</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="relu6" class="level2">
<h2>RELU6</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="relu_n1_to_1" class="level2">
<h2>RELU_N1_TO_1</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="reshape" class="level2">
<h2>RESHAPE</h2>
<p>Reshapes a tensor</p>
<ul>
<li>category: Reshaping operation<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="resize_bilinear" class="level2">
<h2>RESIZE_BILINEAR</h2>
<p>Resizes the input tensor using bilinear interpolation</p>
<ul>
<li>category: resizing operation<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="resize_nearest_neighbor" class="level2">
<h2>RESIZE_NEAREST_NEIGHBOR</h2>
<p>Resizes the input tensor using nearest-neighbor interpolation</p>
<ul>
<li>category: resizing operation<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="round" class="level2">
<h2>ROUND</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="rsqrt" class="level2">
<h2>RSQRT</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="shape" class="level2">
<h2>SHAPE</h2>
<p>Returns a tensor containing the shape of the input tensor</p>
<ul>
<li>category: Reshaping operation<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: int32</li>
</ul>
</section>
<section id="sin" class="level2">
<h2>SIN</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="slice" class="level2">
<h2>SLICE</h2>
<p>Crops the input tensor</p>
<ul>
<li>category: reshaping layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="softmax" class="level2">
<h2>SOFTMAX</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>It is supported only for 1D tensors and only along the channel dimension</li>
</ul>
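<p>A numerically stable softmax over a 1D channel vector, as constrained above (a NumPy sketch of the math, not the generated C code):</p>

```python
import numpy as np

def softmax(x):
    # Subtracting the max avoids overflow in exp() without changing the result
    e = np.exp(x - np.max(x))
    return e / np.sum(e)

# 1D tensor, applied along the channel dimension
p = softmax(np.array([1.0, 2.0, 3.0], dtype=np.float32))
# p sums to 1.0 and is largest where the input is largest
```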
</section>
<section id="space_to_batch_nd" class="level2">
<h2>SPACE_TO_BATCH_ND</h2>
<p>Divides the spatial dimensions of the input into a grid of blocks and interleaves them with the batch dimension</p>
<ul>
<li>category: Reshaping operation<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="split" class="level2">
<h2>SPLIT</h2>
<p>Splits a tensor into a list of sub tensors</p>
<ul>
<li>category: split operator<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>Only supported if the number of splits is equal to the size of the splitting dimension</li>
</ul>
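<p>The constraint above means every output slice has size 1 along the split axis, as in this NumPy sketch:</p>

```python
import numpy as np

x = np.arange(6, dtype=np.float32).reshape(2, 3)

# Supported case: 3 splits along an axis of size 3
parts = np.split(x, 3, axis=1)
# len(parts) == 3, each part has shape (2, 1)
```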
</section>
<section id="sqrt" class="level2">
<h2>SQRT</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="square" class="level2">
<h2>SQUARE</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="squeeze" class="level2">
<h2>SQUEEZE</h2>
<p>Reshapes a tensor</p>
<ul>
<li>category: Reshaping operation<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="strided_slice" class="level2">
<h2>STRIDED_SLICE</h2>
<p>Returns a strided slice of the input tensor</p>
<ul>
<li>category: resize operator<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>only unit strides are supported, and <code>shrink_axis_mask</code> is not handled</li>
</ul>
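<p>The unit-stride constraint in NumPy slicing terms (illustrative only):</p>

```python
import numpy as np

x = np.arange(12, dtype=np.float32).reshape(3, 4)

# Supported: begin/end ranges with unit strides (step == 1)
y = x[0:2, 1:3]   # shape (2, 2)

# Not supported by the runtime: non-unit strides such as x[::2, ::2]
```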
</section>
<section id="sub" class="level2">
<h2>SUB</h2>
<p>Performs element-wise subtraction</p>
<ul>
<li>category: eltwise operator<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
</section>
<section id="sum" class="level2">
<h2>SUM</h2>
<p>Computes the sum of the input tensor’s elements along the provided axes</p>
<ul>
<li>category: reduction operation<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="tanh" class="level2">
<h2>TANH</h2>
<p>Applies an activation function to the input tensor</p>
<ul>
<li>category: activation layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
</section>
<section id="tile" class="level2">
<h2>TILE</h2>
<p>Constructs a tensor by tiling the input tensor</p>
<ul>
<li>category: reshaping layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>tiling on batch-dimension is not supported</li>
</ul>
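<p>The batch-dimension constraint in terms of <code>np.tile</code> multipliers (a sketch, not the generated C code):</p>

```python
import numpy as np

x = np.array([[1.0, 2.0]], dtype=np.float32)  # shape (1, 2): (batch, channels)

# Supported: multiplier 1 on the batch axis, tiling only the other dimensions
y = np.tile(x, (1, 3))                        # shape (1, 6)

# Not supported: a multiplier > 1 on the batch axis, e.g. np.tile(x, (2, 1))
```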
</section>
<section id="transpose" class="level2">
<h2>TRANSPOSE</h2>
<p>Permutes the dimensions of the input according to a given pattern</p>
<ul>
<li>category: reshaping layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>transposing the batch dimension is not supported</li>
</ul>
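<p>The batch-dimension constraint in terms of a <code>np.transpose</code> permutation (illustrative only):</p>

```python
import numpy as np

x = np.zeros((1, 4, 3), dtype=np.float32)  # (batch, steps, channels)

# Supported: permuting non-batch dimensions (the batch index stays first)
y = np.transpose(x, (0, 2, 1))             # shape (1, 3, 4)

# Not supported: a permutation that moves the batch axis, e.g. (1, 0, 2)
```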
</section>
<section id="transpose_conv" class="level2">
<h2>TRANSPOSE_CONV</h2>
<p>Transposed convolutional layer</p>
<ul>
<li>category: convolutional layer<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>arbitrary strides, provided that they are smaller than the input size<br />
</li>
<li>arbitrary filter kernel sizes, provided that they are smaller than the input size</li>
</ul>
</section>
<section id="unidirectional_sequence_lstm" class="level2">
<h2>UNIDIRECTIONAL_SEQUENCE_LSTM</h2>
<p>Applies a multi-layer long short-term memory (LSTM) RNN to an input sequence (batch=1, timesteps, features)</p>
<ul>
<li>category: recurrent layer<br />
</li>
<li>input data types: float32<br />
</li>
<li>output data types: float32</li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>stateless mode support only<br />
</li>
<li>fused activation: sigmoid<br />
</li>
<li>fused recurrent activation: sigmoid<br />
</li>
<li><code>return_state</code> not supported<br />
</li>
<li><code>time_major</code> not supported</li>
</ul>
</section>
<section id="unpack" class="level2">
<h2>UNPACK</h2>
<p>Unpacks <code>num</code> tensors from the input along the specified axis</p>
<ul>
<li>category: split operator<br />
</li>
<li>input data types: float32, int8, uint8<br />
</li>
<li>output data types: float32, int8, uint8</li>
</ul>
<p>Specific constraints/recommendations:</p>
<ul>
<li>related TF operator: <code>tf.unstack</code></li>
</ul>
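<p>UNPACK along an axis behaves like <code>tf.unstack</code>, sketched here with equivalent NumPy calls:</p>

```python
import numpy as np

x = np.array([[1.0, 2.0], [3.0, 4.0]], dtype=np.float32)

# Unpacking along axis 0 yields one (2,) tensor per row
parts = [np.squeeze(p, axis=0) for p in np.split(x, x.shape[0], axis=0)]
# len(parts) == 2, each part has shape (2,)
```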
<!-- External ST resources/links -->
<!-- Internal resources/links -->
<!-- External resources/links -->
<!-- Cross references -->
</section>
</section>
<section id="references" class="level1">
<h1>References</h1>
<table>
<colgroup>
<col style="width: 18%" />
<col style="width: 81%" />
</colgroup>
<thead>
<tr class="header">
<th style="text-align: left;">ref</th>
<th style="text-align: left;">description</th>
</tr>
</thead>
<tbody>
<tr class="odd">
<td style="text-align: left;">[DS]</td>
<td style="text-align: left;">X-CUBE-AI - AI expansion pack for STM32CubeMX <a href="https://www.st.com/en/embedded-software/x-cube-ai.html">https://www.st.com/en/embedded-software/x-cube-ai.html</a></td>
</tr>
<tr class="even">
<td style="text-align: left;">[UM]</td>
<td style="text-align: left;">User manual - Getting started with X-CUBE-AI Expansion Package for Artificial Intelligence (AI) <a href="https://www.st.com/resource/en/user_manual/dm00570145.pdf">(pdf)</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;">[CLI]</td>
<td style="text-align: left;">stm32ai - Command Line Interface <a href="command_line_interface.html">(link)</a></td>
</tr>
<tr class="even">
<td style="text-align: left;">[API]</td>
<td style="text-align: left;">Embedded inference client API <a href="embedded_client_api.html">(link)</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;">[METRIC]</td>
<td style="text-align: left;">Evaluation report and metrics <a href="evaluation_metrics.html">(link)</a></td>
</tr>
<tr class="even">
<td style="text-align: left;">[TFL]</td>
<td style="text-align: left;">TensorFlow Lite toolbox <a href="supported_ops_tflite.html">(link)</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;">[KERAS]</td>
<td style="text-align: left;">Keras toolbox <a href="supported_ops_keras.html">(link)</a></td>
</tr>
<tr class="even">
<td style="text-align: left;">[ONNX]</td>
<td style="text-align: left;">ONNX toolbox <a href="supported_ops_onnx.html">(link)</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;">[FAQS]</td>
<td style="text-align: left;">FAQ <a href="faq_generic.html">generic</a>, <a href="faq_validation.html">validation</a>, <a href="faq_quantization.html">quantization</a></td>
</tr>
<tr class="even">
<td style="text-align: left;">[QUANT]</td>
<td style="text-align: left;">Quantization and quantize command <a href="quantization.html">(link)</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;">[RELOC]</td>
<td style="text-align: left;">Relocatable binary network support <a href="relocatable.html">(link)</a></td>
</tr>
<tr class="even">
<td style="text-align: left;">[CUST]</td>
<td style="text-align: left;">Support of the Keras Lambda/custom layers <a href="keras_lambda_custom.html">(link)</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;">[TFLM]</td>
<td style="text-align: left;">TensorFlow Lite for Microcontroller support <a href="tflite_micro_support.html">(link)</a></td>
</tr>
<tr class="even">
<td style="text-align: left;">[INST]</td>
<td style="text-align: left;">Setting the environment <a href="setting_env.html">(link)</a></td>
</tr>
<tr class="odd">
<td style="text-align: left;">[OBS]</td>
<td style="text-align: left;">Platform Observer API <a href="api_platform_observer.html">(link)</a></td>
</tr>
<tr class="even">
<td style="text-align: left;">[C-RUN]</td>
<td style="text-align: left;">Executing locally a generated c-model <a href="how_to_run_a_model_locally.html">(link)</a></td>
</tr>
</tbody>
</table>
</section>



<section class="st_footer">

<h1> <br> </h1>

<p style="font-family:verdana; text-align:left;">
 Embedded Documentation 

	- <b> TensorFlow Lite toolbox support </b>
			<br> X-CUBE-AI Expansion Package
	 
	
</p>

<img src="" title="ST logo" align="right" height="100" />

<div class="st_notice">
Information in this document is provided solely in connection with ST products.
The contents of this document are subject to change without prior notice.
<br>
© Copyright STMicroelectronics 2020. All rights reserved. <a href="http://www.st.com">www.st.com</a>
</div>

<hr size="1" />
</section>


</article>
</body>

</html>
