\chapter{Sample curricula}

This book is designed to cover two full semesters at the undergraduate level, CSCI 3302 and CSCI 4302 at CU Boulder, or a single-semester ``crash course'' at the graduate level. There are multiple avenues an instructor could take, each with its own theme and varying prerequisites for the students. Content within the book is deliberately agnostic to any particular robotic platform, programming language, or simulation environment, leaving it to the instructor to choose an appropriate platform.


\section{An introduction to autonomous \emph{mobile} robots}
This section describes a possible one-semester curriculum that takes students from the kinematics of a differential-wheel platform to a basic understanding of SLAM. This curriculum is involved and requires a firm background in trigonometry, probability theory, and linear algebra. It might be too ambitious for third-year Computer Science students, but works well for Aerospace and Electrical Engineering students, who often have a stronger, more applied mathematical background. The curriculum is therefore also well suited as an ``advanced'' class, e.g., in the fourth year of a CS curriculum.

\subsection{Overview}
The curriculum is motivated by a maze-solving competition that is described in Section \ref{sec:ratslife}. The game can be solved using a variety of algorithms, ranging from wall following (which requires simple proportional control), to depth-first search on the maze, to full SLAM. Here, the rules are designed such that creating a map of the environment leads to a competitive advantage in the long run.

\subsection{Content}\label{sec:curr1content}
After introducing the field and the curriculum using Chapter \ref{chap:introduction} ``\nameref{chap:introduction}'', another week can be spent on basic concepts from Chapter \ref{chap:locomotion} ``\nameref{chap:locomotion}'', which includes concepts like ``\nameref{sec:stability}'' and ``\nameref{sec:dof}''. The lab portions of the class can at this time be used to introduce the software and hardware used in the competition. For example, students can experiment with the programming environment of the real robot or set up a simple world in the simulator themselves.

The lecture can then pick up pace with Chapter \ref{chap:kinematics}. Here, the topics ``\nameref{sec:coordsystems}'', ``\nameref{sec:kinematics:fwk:mobile}'', and ``\nameref{sec:kinematics:ik:mobile}'' are on the critical path, whereas other sections in Chapter \ref{chap:kinematics} are optional. It is worth mentioning that the forward kinematics of non-holonomic platforms, and in particular the motivation for treating them in velocity rather than position space, are not straightforward, so at least some treatment of arm kinematics is recommended. These concepts can easily be turned into practical experience during the lab session.
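To make the velocity-space treatment concrete, the following minimal sketch (in Python, with hypothetical parameter names) integrates the standard differential-drive forward kinematics for one time step; the wheel radius, axle track, and time step values are illustrative assumptions, not taken from the book:

```python
import math

def diff_drive_step(x, y, theta, wl, wr, r, d, dt):
    """Integrate differential-drive forward kinematics for one time step.

    wl, wr: left/right wheel speeds (rad/s); r: wheel radius; d: axle track.
    Velocity-space model: v = r*(wr + wl)/2, omega = r*(wr - wl)/d.
    """
    v = r * (wr + wl) / 2.0        # forward speed of the robot center
    omega = r * (wr - wl) / d      # rotation rate around the robot center
    x += v * math.cos(theta) * dt  # Euler integration in the world frame
    y += v * math.sin(theta) * dt
    theta += omega * dt
    return x, y, theta

# Equal wheel speeds: the robot drives straight, heading unchanged.
x, y, theta = diff_drive_step(0.0, 0.0, 0.0, 2.0, 2.0, 0.05, 0.1, 1.0)
```

Students can verify in lab that equal wheel speeds produce straight-line motion and unequal speeds produce arcs, matching the model above.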

The ability to implement point-to-point motions in configuration space, enabled by knowledge of inverse kinematics, lends itself directly to ``\nameref{sec:maps}'' and ``\nameref{chap:pathplanning}'', treated in Chapter \ref{chap:pathplanning}. For the purpose of maze solving, simple algorithms like Dijkstra's and A* are sufficient, and sampling-based approaches can be skipped. Implementing a path-planning algorithm both in simulation and on the real robot will provide first-hand experience of uncertainty.
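As an illustration of the level of algorithm that suffices here, the sketch below is a hypothetical, minimal A* search on a 4-connected occupancy grid, using Manhattan distance as an admissible heuristic; the toy maze and cell encoding are illustrative assumptions:

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected grid; grid[r][c] == 1 marks a wall."""
    def h(p):  # Manhattan distance, admissible on a 4-connected grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    frontier = [(h(start), 0, start, [start])]  # (f, g, node, path)
    seen = set()
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and not grid[nr][nc]:
                heapq.heappush(frontier,
                               (g + 1 + h((nr, nc)), g + 1, (nr, nc), path + [(nr, nc)]))
    return None  # goal unreachable

maze = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(maze, (0, 0), (2, 0))  # must detour around the wall row
```

Setting the heuristic to zero recovers Dijkstra's algorithm, which makes the relationship between the two a natural discussion point in lab.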

The lecture can then proceed to ``\nameref{chap:sensors}'' (Chapter \ref{chap:sensors}), which should be used to motivate uncertainty using concepts like accuracy and precision. These concepts can be formalized using materials in Chapter \ref{chap:statistics} ``\nameref{chap:statistics}'', and quantified during lab. Here, having students record the histogram of sensor noise distributions is a valuable exercise.
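A lab exercise along these lines can be prototyped even without hardware; the sketch below simulates a biased, noisy range sensor (all numerical values are illustrative assumptions), separates accuracy (offset of the mean from ground truth) from precision (spread), and bins the readings into a simple histogram:

```python
import random
import statistics

random.seed(0)

# Simulated range sensor: true distance 1.0 m, constant bias, Gaussian noise.
true_dist, bias, sigma = 1.0, 0.03, 0.01
readings = [random.gauss(true_dist + bias, sigma) for _ in range(1000)]

# Accuracy: systematic offset of the mean. Precision: spread of the readings.
accuracy_error = statistics.mean(readings) - true_dist
precision = statistics.stdev(readings)

# Coarse histogram of the noise distribution, as students would record in lab.
lo, hi, nbins = min(readings), max(readings), 10
width = (hi - lo) / nbins
counts = [0] * nbins
for x in readings:
    counts[min(int((x - lo) / width), nbins - 1)] += 1
```

Comparing the recovered bias and standard deviation against the simulated values mirrors the exercise of fitting a distribution to real sensor data.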

Chapters \ref{chap:vision} and \ref{chap:feature_extraction}, on ``\nameref{chap:vision}'' and ``\nameref{chap:feature_extraction}'', do not need to go further than necessary to understand and implement simple algorithms for detecting the unique features in the maze environment. In practice, these can usually be detected using basic convolution-based filters from Chapter \ref{chap:vision} and simple post-processing, introducing the notion of a ``feature'' without reviewing more complex image feature detectors. The lab portion of the class should be aimed at identifying markers in the environment, and can be scaffolded as much as necessary.
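As a minimal illustration of such a convolution-based filter, the sketch below (pure Python, with an illustrative toy image) slides a Sobel-style vertical-edge kernel over a binary image; the response peaks at the marker boundary and vanishes in uniform regions:

```python
def convolve2d(img, kernel):
    """'Valid' 2-D convolution (strictly, correlation, as is common in vision)."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for r in range(len(img) - kh + 1):
        row = []
        for c in range(len(img[0]) - kw + 1):
            row.append(sum(img[r + i][c + j] * kernel[i][j]
                           for i in range(kh) for j in range(kw)))
        out.append(row)
    return out

# Toy image: dark region (0) on the left, bright marker (1) on the right.
image = [[0, 0, 0, 1, 1, 1],
         [0, 0, 0, 1, 1, 1],
         [0, 0, 0, 1, 1, 1]]
sobel_x = [[-1, 0, 1],   # Sobel-style kernel responding to vertical edges
           [-2, 0, 2],
           [-1, 0, 1]]
response = convolve2d(image, sobel_x)  # strong only where the edge lies
```

The filter response is zero in the uniform regions and large across the boundary, which is exactly the behavior students exploit when thresholding for marker edges.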

In-depth experimentation with sensors, including vision, serves as a foundation for a more formal treatment of uncertainty in Chapter \ref{chap:uncertainty} ``\nameref{chap:uncertainty}''. Depending on whether the ``\nameref{sec:linefitting}'' example has been treated in Chapter \ref{chap:feature_extraction}, it can be used here to demonstrate error propagation from sensor uncertainty, and should be simplified otherwise. In lab, students can actually measure the distribution of robot position over hundreds of individual trials (an exercise that can be done collectively if enough hardware is available) and verify their math against these observations. Alternatively, code to perform these experiments can be provided, giving the students more time to catch up.

The localization problem in Chapter \ref{chap:localization} is best introduced using Markov localization, from which more advanced concepts such as the particle filter and the Kalman filter can be derived. Performing these experiments in the lab is involved and is best done in simulation, which allows neat ways to visualize how the probability distributions change.
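A minimal Markov localization example that lends itself to such visualization is sketched below (a hypothetical one-dimensional ring world with door/wall landmarks; all probabilities are illustrative assumptions). It shows the predict step, which blurs the belief with motion uncertainty, and the update step, which sharpens it with a measurement:

```python
world = [1, 0, 0, 1, 0]          # ring of cells: 1 = door (landmark), 0 = wall
belief = [1.0 / 5] * 5           # uniform prior over the robot's cell

def predict(belief, p_move=0.8):
    """Commanded to move one cell right; stays put with probability 1 - p_move."""
    n = len(belief)
    return [p_move * belief[(i - 1) % n] + (1 - p_move) * belief[i]
            for i in range(n)]

def update(belief, z, p_hit=0.9):
    """Weight each cell by how well measurement z matches the map, then normalize."""
    w = [p_hit if world[i] == z else 1 - p_hit for i in range(len(belief))]
    post = [b * wi for b, wi in zip(belief, w)]
    s = sum(post)
    return [p / s for p in post]

belief = update(belief, z=1)     # robot senses a door
belief = predict(belief)         # robot moves one cell right
belief = update(belief, z=0)     # robot now senses a wall
```

After this sequence the belief concentrates on the cells directly to the right of a door, which is exactly the kind of intermediate distribution worth plotting in the simulator.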

The lecture can be concluded with ``\nameref{sec:ekfslam}'' in Chapter \ref{chap:slam}. Actually implementing EKF SLAM is beyond the scope of an undergraduate robotics class and is achieved only by very few students who go beyond the call of duty. Instead, students should be able to experience the workings of the algorithm in simulation or scaffolded in the experimental platform by the instructor.

The lab portion of the class can be concluded by a competition in which student teams compete against each other. In practice, winning teams differentiate themselves by the most rigorous implementation, often using one of the less complex algorithms, e.g., wall following or simple exploration. Here, it is up to the instructor to incentivize a desired approach.

Depending on the pace of the class in lecture as well as the time that the instructor wishes to reserve for implementation of the final project, lectures can be offset by debates, as described in Section \ref{sec:debates}.

\subsection{Implementation suggestions}
An interesting competition environment can easily be re-created using cardboard or LEGO bricks and any miniature differential-wheel platform that is equipped with a camera to recognize simple markers in the environment (which serve as landmarks for SLAM). The setup can also easily be simulated in a physics-based simulation environment, which allows scaling this curriculum to a large number of participants. The setup used at CU Boulder, using the e-Puck robot and the free, open-source Webots simulator, is shown in Figure \ref{fig:ratslifereal}.

\begin{figure}[htb]
\includegraphics[width=0.48\textwidth]{figs/ratslife_real}
\includegraphics[width=0.48\textwidth]{figs/ratslife_webots}
\caption{\label{fig:ratslifereal}The ``Ratslife'' maze competition created from LEGO bricks and e-Puck robots (left). The same environment simulated in \emph{Webots} (right).}
\end{figure}

Variations of the above curriculum can be implemented using a basic Arduino-based platform such as ``Sparki''. Sparki is equipped with a swiveling ultrasound scanner, which can be used to simulate a laser range finder and allows the students to extract simple features such as cones, corners, or gates in the environment and use them for localization. A Bluetooth module allows this robot to be remote-controlled, allowing the instructor to move from the Arduino language (C) and its computational limitations to a fully fledged desktop computer.

The class can also be taught using a ``Raspberry Pi''-based platform that can be equipped with a webcam, runs Linux, and allows the students to perform basic computer vision using OpenCV and other toolboxes. Here, the Python language and JupyterLab provide a low floor for accessing the programming environment, and a number of educational robots using this architecture have become available recently, some even with GPU support.

The class can likewise be taught using modified RC cars equipped with scanning lasers, stereo cameras, and powerful onboard computation. Competitions among the students can involve decision making around avoiding obstacles or following a previously unknown course by recognizing landmarks.
Descriptions and parts for such vehicles, e.g., the ``MIT Racecar'', are available online. Here, the emphasis will need to shift from differential-wheel kinematics to Ackermann kinematics (Section \ref{sec:carsteering}), both for odometry and planning.
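The kinematic difference can be illustrated with a bicycle-model approximation of Ackermann steering, sketched below (parameter values are illustrative assumptions); unlike the differential-drive model, the turning rate here is coupled to forward speed through the steering angle and the wheelbase:

```python
import math

def ackermann_step(x, y, theta, v, steer, L, dt):
    """Kinematic bicycle approximation of an Ackermann-steered vehicle.

    v: forward speed; steer: front-wheel steering angle (rad); L: wheelbase.
    Turning rate is (v / L) * tan(steer): the car cannot turn in place.
    """
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += (v / L) * math.tan(steer) * dt
    return x, y, theta

# Zero steering angle keeps the heading constant, regardless of speed.
x, y, theta = ackermann_step(0.0, 0.0, 0.0, 1.0, 0.0, 0.3, 0.1)
```

Note that with $v = 0$ the heading cannot change at all, which is the planning constraint that distinguishes car-like from differential-wheel platforms.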

Finally, a variation of this curriculum can also be taught using drones, such as the Parrot drone, which are equipped with a camera as well as a wireless device that allows executing control algorithms on a desktop computer. In this case, landmarks can be deployed throughout the environment, shifting the focus from kinematics to computer vision.


\section{An introduction to robotic manipulation}
A class on robotic manipulation can be taught equally at the introductory or advanced level, following an introductory course on mobile robots. While teaching autonomous manipulation sets a high bar in linear algebra, vision, and feature detection, a manipulation curriculum can also be very practical, shifting the learning experience from the computational toward a better understanding of the role of embodiment.

\subsection{Overview}
A class on robotic manipulation can be motivated by the overview shown in \cref{fig:graspalgorithm}, taking students from basic arm inverse kinematics to point cloud processing and integrated task and motion planning. By focusing on 3D perception and inverse kinematics, it is possible to implement the majority of the class in simulation, making the use of a shared hardware resource optional. Alternatively, the class can be taught without any computers, requiring the students to build their own hardware.

\subsection{Content}
Following the outline of the book, the class can start with mechanisms. Here, the critical role of embodiment should be stressed early on. In Chapter \ref{chap:kinematics}, the focus is instead on manipulating arms, including the Denavit-Hartenberg scheme and numerical methods for inverse kinematics. In this case, the topics ``\nameref{sec:kinematics:fwk:mobile}'' and ``\nameref{sec:kinematics:ik:mobile}'' do not necessarily need to be included. Forward and inverse kinematics can easily be turned into lab sessions using a simple abstraction (Matlab/Mathematica/Python) or a simulated robot arm (Webots). If the class uses a more complex or industrial robot arm, an alternative path is to record joint trajectories in a ROS bag and let the students explore this data, e.g., drawing the trajectories recorded from the robot to guess what it has been doing, before moving on to inverse kinematics.
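One possible lab-scale illustration of such a numerical method is sketched below: a hypothetical planar two-link arm with unit link lengths, solved by Jacobian-transpose gradient descent (the step size, iteration count, and target are illustrative assumptions, not a method prescribed by the book):

```python
import math

def fk(q, l1=1.0, l2=1.0):
    """Forward kinematics of a planar two-link arm."""
    x = l1 * math.cos(q[0]) + l2 * math.cos(q[0] + q[1])
    y = l1 * math.sin(q[0]) + l2 * math.sin(q[0] + q[1])
    return x, y

def ik(target, q=(0.3, 0.3), alpha=0.1, iters=2000, l1=1.0, l2=1.0):
    """Numerical inverse kinematics via Jacobian-transpose gradient descent."""
    q = list(q)
    for _ in range(iters):
        x, y = fk(q, l1, l2)
        ex, ey = target[0] - x, target[1] - y   # end-effector error
        # Analytic Jacobian of the planar arm.
        j11 = -l1 * math.sin(q[0]) - l2 * math.sin(q[0] + q[1])
        j12 = -l2 * math.sin(q[0] + q[1])
        j21 = l1 * math.cos(q[0]) + l2 * math.cos(q[0] + q[1])
        j22 = l2 * math.cos(q[0] + q[1])
        # Gradient step: q += alpha * J^T * e
        q[0] += alpha * (j11 * ex + j21 * ey)
        q[1] += alpha * (j12 * ex + j22 * ey)
    return q

q = ik((1.0, 1.0))
x, y = fk(q)  # should closely approach the target
```

Swapping the transpose for a (damped) pseudoinverse is a natural follow-up exercise that connects this numerical method back to the lecture material on differential kinematics.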

After introducing forces in \cref{ch:forces}, the theory and practice of grasping can be introduced following the outline in \cref{chap:grasping}.

If the goal of the class is an autonomous solution, the class can then proceed to suitable sensors including basic proprioception, distance sensing, and finally extracting structure from vision. Actuators can be treated as needed, with brushless DC motors and servo motors being standard components of high-performance manipulating systems. If desired, the instructor can also discuss pneumatics and ``soft'' robotics, which are attractive for manipulating some objects. 

As manipulation is a so-called integrated task and motion planning problem, \cref{chap:taskexecution} will be an important part of an autonomous manipulation class.

The class can then move on to vision and feature extraction. Topics such as uncertainty and error propagation can be skipped in a class focusing on manipulation. If desired, Bayes' rule can be introduced in the context of ``false positives'' in object recognition and task execution, allowing the instructor to introduce concepts such as inference in a task planning framework.

\subsection{Implementation suggestions}

Unless a gripper is provided, designing and modeling a gripper in a robotic simulator can be a worthwhile exercise. Alternatively, the students can design their own hardware, 3D print an end-effector, and test the versatility of their solution by manually actuating their mechanism to solve a set of manipulation challenges such as those described in \cite{patel2016manipulation}. The sky is the limit here, in particular when ``soft'' actuators have been introduced and the students are encouraged to compare conventional mechanisms with suction and jamming grippers.

How to teach feature extraction and aspects of mapping will depend on the overarching manipulation goals that are used throughout the class. When focusing on simple bin picking, line recognition and RANSAC can be introduced in the context of identifying the bin, and objects therein. In such a scenario, path planning can be substituted by simple inverse kinematics. When focusing on pick-and-place, path planning can be motivated by planning around simple obstacles, focusing on rapidly exploring random trees. Labs and experiments can easily be implemented in simulation, initially focusing on perception and introducing grasp planning only later.
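A minimal RANSAC line-fitting sketch of the kind that could anchor such a bin-identification exercise is given below (pure Python; the point set, iteration count, and inlier threshold are illustrative assumptions):

```python
import random

def ransac_line(points, iters=200, tol=0.05):
    """RANSAC line fit: sample two points, count inliers within tol of the line."""
    random.seed(1)  # fixed seed for reproducibility of the demo
    best_inliers, best_line = [], None
    for _ in range(iters):
        (x1, y1), (x2, y2) = random.sample(points, 2)
        # Line as ax + by + c = 0, normalized so |ax + by + c| is the distance.
        a, b = y2 - y1, x1 - x2
        norm = (a * a + b * b) ** 0.5
        if norm == 0:
            continue  # degenerate sample: identical points
        a, b = a / norm, b / norm
        c = -(a * x1 + b * y1)
        inliers = [p for p in points if abs(a * p[0] + b * p[1] + c) < tol]
        if len(inliers) > len(best_inliers):
            best_inliers, best_line = inliers, (a, b, c)
    return best_line, best_inliers

# Points sampled along a bin edge y = 1, plus a few spurious outliers.
pts = [(x / 10.0, 1.0) for x in range(10)] + [(0.5, 0.2), (0.3, 1.7), (0.8, 0.6)]
line, inliers = ransac_line(pts)  # recovers the edge, rejects the outliers
```

A least-squares refit over the returned inliers is the usual final step, and makes a good extension once students see why the raw two-point model is noisy.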

Object recognition and segmentation are good motivations for introducing convolutional neural networks (\cref{chap:ann}) as well as appropriate open-source tools that the students can use as a black box. Simulators such as Webots also provide object detection and segmentation, allowing the instructor to focus only on the kinematic aspects of autonomy.

Simulations reach their limits in tasks that are rich in contact, such as assembly and construction. While a class that is more oriented toward perception might skip this experience, simulation can be complemented by simple experiments in which students create their own hardware. Optimally, a shared resource, such as an assembly task board, is provided that the students can get time on after demonstrating certain basic capabilities in a simulation environment.

\section{An introduction to robotic systems}
A robotic systems class can be offered either as an advanced class that allows students to put theoretical concepts into practice, or as a stand-alone class in which advanced concepts are abstracted behind libraries that are presented as a ``black box''.

\subsection{Overview}
A robotic systems curriculum can be motivated by a ``grand challenge'' task such as robotic agriculture, robotic construction, or assisted living, all of which require solving both mobility and manipulation problems. Although a class project is likely to be limited to a toy example, taking advantage of modern motion-planning frameworks and visualization tools, e.g., ROS/MoveIt! \cite{coleman2014reducing}, makes it easy to put the class into an industry-relevant framework and expose the students to state-of-the-art platforms in simulation. Possible class projects range from ``robot gardening'' to ``robots building robots'', for which setups can easily be created. These include real or plastic cherry tomato or strawberry plants and robotic construction kits such as Modular Robotics' ``Cubelets'', which easily snap together and have the advantage of forming structures that are robots themselves, adding additional motivation.

\subsection{Content}
The first two weeks of this curriculum can be mostly identical to that described in Section \ref{sec:curr1content}. If a message passing system such as ROS is used, a good exercise is to record a histogram of message passing times in order to get familiar with the software.

It is now up to the instructor whether to focus more on the kinematics of arms or the differential kinematics of mobile platforms. If the systems class is used in an introductory format, it might be sufficient to introduce basic forward kinematics of robot arms. In an advanced setting, the instructor might instead cover differential kinematics in the force domain.

If more advanced platforms are available, a depth camera can be mounted above or on the end-effector, allowing the instructor to introduce topics like vision (Chapter \ref{chap:vision}), feature extraction (Chapter \ref{chap:feature_extraction}), and grasping (Chapter \ref{chap:grasping}).

\subsection{Implementation suggestions}
A simple servo-based arm can be mounted on a portable structure that contains a set of fixed (3D) cameras. In order to allow a large number of students to get familiar with the necessary software and hardware, the instructor can provide a virtual machine with a preinstalled Linux environment and simulation tools. In particular, using the ``Robot Operating System'' (ROS) allows recording so-called ``bag'' files of sensor values, including entire sequences of joint recordings and RGB-D video. This allows the students to work on a large part of the homework and project preparation from a computer lab or from home, maximizing the availability of real hardware.

If hardware such as a Kinova arm with an integrated Intel RealSense or a Universal Robots arm is available, students can prepare for using the shared resource by working with pre-recorded data and a simulation environment. This is not ideal for educating students about grasping, which is not only difficult to simulate, but also difficult to understand in a non-experiential setting. While not explicitly covered in this book, the instructor could bridge this gap by letting the students design their own end-effectors using 3D-printing techniques or augmenting simple two-bar linkage grippers with padding. Experimenting with such devices in a remote-controlled setting --- as simple as the students manually actuating the gripper --- will provide some insight into the challenges of grasping and manipulation. The students can then test their designs with the shared resource, allowing the instructor to demonstrate the importance of mechanism and sensing co-design.


\section{Class debates}\label{sec:debates}
Class debates are a good way to decompress at the end of class, create a buffer for students to apply their knowledge by preparing for a final project, and require the students to put the materials they learned into a broader context. Student teams prepare pro and contra arguments for a statement of current technical or societal concern, exercising presentation and research skills. Sample topics include \emph{Robots putting humans out of work is a risk that needs to be mitigated}; \emph{Robots should not have the capability to autonomously discharge weapons / drive around in cities (autonomous cars)}; or \emph{Robots need to be made from components other than links, joints, and gears in order to reach the agility of people}.

The students are instructed to make as much use as possible of technical arguments that are grounded in the course materials and in additional literature. For example, students can use the inherent uncertainty of sensors to argue for or against enabling robots to use deadly weapons. Similarly, students can relate the importance and impact of current developments in robotics to earlier inventions that led to industrialization when considering the risk of robots putting humans out of work.

Although suspicious at first, students usually receive this format very well. While there is agreement that debates help prepare them for the engineering profession by improving presentation skills, preparing engineers to think about questions posed by society, and reflecting on up-to-date topics, the debates seem to have little effect on changing the students' actual opinions on a topic. For example, in a questionnaire administered after class, only two students responded positively. Students are also undecided about whether the debates helped them to better understand the technical content of the class. Yet students find the debate concept important enough that they prefer it over a more in-depth treatment of the technical content of the class, and disagree that debates should be given less time in class. However, students are undecided whether debates are important enough to merit early inclusion in the curriculum or to be part of every class in engineering.

Concerning the overall format, students find that discussion time was too short when allotting 10 minutes per position and 15 minutes for discussion and rebuttal. Also, students tend to agree that debates are an opportunity to decompress (``relaxing''), which is desirable as this period of class coincides with wrapping up the course project.
