{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "26d2f7a4",
   "metadata": {
    "origin_pos": 0
   },
   "source": [
    "# Recurrent Neural Networks\n",
    ":label:`chap_rnn`\n",
    "\n",
    "Up until now, we have focused primarily on fixed-length data.\n",
    "When introducing linear and logistic regression\n",
    "in :numref:`chap_regression` and :numref:`chap_classification`\n",
    "and multilayer perceptrons in :numref:`chap_perceptrons`,\n",
    "we were happy to assume that each feature vector $\\mathbf{x}_i$\n",
    "consisted of a fixed number of components $x_1, \\dots, x_d$,\n",
    "where each numerical feature $x_j$\n",
    "corresponded to a particular attribute.\n",
    "These datasets are sometimes called *tabular*,\n",
    "because they can be arranged in tables,\n",
    "where each example $i$ gets its own row,\n",
    "and each attribute gets its own column.\n",
    "Crucially, with tabular data, we seldom\n",
    "assume any particular structure over the columns.\n",
    "\n",
    "Subsequently, in :numref:`chap_cnn`,\n",
    "we moved on to image data, where inputs consist\n",
    "of the raw pixel values at each coordinate in an image.\n",
    "Image data hardly fitted the bill\n",
     "of a prototypical tabular dataset.\n",
    "There, we needed to call upon convolutional neural networks (CNNs)\n",
    "to handle the hierarchical structure and invariances.\n",
    "However, our data were still of fixed length.\n",
    "Every Fashion-MNIST image is represented\n",
    "as a $28 \\times 28$ grid of pixel values.\n",
    "Moreover, our goal was to develop a model\n",
    "that looked at just one image and then\n",
     "output a single prediction.\n",
    "But what should we do when faced with a\n",
    "sequence of images, as in a video,\n",
    "or when tasked with producing\n",
    "a sequentially structured prediction,\n",
    "as in the case of image captioning?\n",
    "\n",
    "A great many learning tasks require dealing with sequential data.\n",
    "Image captioning, speech synthesis, and music generation\n",
    "all require that models produce outputs consisting of sequences.\n",
    "In other domains, such as time series prediction,\n",
    "video analysis, and musical information retrieval,\n",
    "a model must learn from inputs that are sequences.\n",
    "These demands often arise simultaneously:\n",
    "tasks such as translating passages of text\n",
    "from one natural language to another,\n",
    "engaging in dialogue, or controlling a robot,\n",
    "demand that models both ingest and output\n",
    "sequentially structured data.\n",
    "\n",
    "\n",
    "Recurrent neural networks (RNNs) are deep learning models\n",
    "that capture the dynamics of sequences via\n",
    "*recurrent* connections, which can be thought of\n",
    "as cycles in the network of nodes.\n",
    "This might seem counterintuitive at first.\n",
    "After all, it is the feedforward nature of neural networks\n",
    "that makes the order of computation unambiguous.\n",
    "However, recurrent edges are defined in a precise way\n",
    "that ensures that no such ambiguity can arise.\n",
    "Recurrent neural networks are *unrolled* across time steps (or sequence steps),\n",
    "with the *same* underlying parameters applied at each step.\n",
    "While the standard connections are applied *synchronously*\n",
    "to propagate each layer's activations\n",
    "to the subsequent layer *at the same time step*,\n",
    "the recurrent connections are *dynamic*,\n",
    "passing information across adjacent time steps.\n",
    "As the unfolded view in :numref:`fig_unfolded-rnn` reveals,\n",
    "RNNs can be thought of as feedforward neural networks\n",
    "where each layer's parameters (both conventional and recurrent)\n",
    "are shared across time steps.\n",
    "\n",
    "\n",
    "![On the left recurrent connections are depicted via cyclic edges. On the right, we unfold the RNN over time steps. Here, recurrent edges span adjacent time steps, while conventional connections are computed synchronously.](../img/unfolded-rnn.svg)\n",
    ":label:`fig_unfolded-rnn`\n",
    "\n",
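     "To make the unrolled view concrete, here is a minimal NumPy sketch of the forward computation, with the same input-to-hidden and hidden-to-hidden parameters applied at every step. The sizes, random initialization, and tanh activation are illustrative choices, not this book's later implementation:\n",
     "\n",
     "```python\n",
     "import numpy as np\n",
     "\n",
     "# Illustrative sizes: 5 time steps, 3 inputs, 4 hidden units.\n",
     "num_steps, num_inputs, num_hiddens = 5, 3, 4\n",
     "rng = np.random.default_rng(0)\n",
     "\n",
     "# One set of parameters, shared across all time steps.\n",
     "W_xh = rng.normal(scale=0.1, size=(num_inputs, num_hiddens))\n",
     "W_hh = rng.normal(scale=0.1, size=(num_hiddens, num_hiddens))\n",
     "b_h = np.zeros(num_hiddens)\n",
     "\n",
     "X = rng.normal(size=(num_steps, num_inputs))  # one input vector per step\n",
     "H = np.zeros(num_hiddens)  # initial hidden state\n",
     "states = []\n",
     "for X_t in X:  # the recurrent edge: H at step t feeds step t + 1\n",
     "    H = np.tanh(X_t @ W_xh + H @ W_hh + b_h)\n",
     "    states.append(H)\n",
     "```\n",
     "\n",
     "Reading the loop top to bottom recovers the unfolded diagram: the conventional connection is the term involving the current input, while the recurrent connection carries the previous hidden state forward to the next step.\n",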
    "\n",
    "Like neural networks more broadly,\n",
     "RNNs have a long, discipline-spanning history,\n",
    "originating as models of the brain popularized\n",
    "by cognitive scientists and subsequently adopted\n",
    "as practical modeling tools employed\n",
    "by the machine learning community.\n",
    "As we do for deep learning more broadly,\n",
    "in this book we adopt the machine learning perspective,\n",
    "focusing on RNNs as practical tools that rose\n",
    "to popularity in the 2010s owing to\n",
    "breakthrough results on such diverse tasks\n",
    "as handwriting recognition :cite:`graves2008novel`,\n",
    "machine translation :cite:`Sutskever.Vinyals.Le.2014`,\n",
    "and recognizing medical diagnoses :cite:`Lipton.Kale.2016`.\n",
    "We point the reader interested in more\n",
    "background material to a publicly available\n",
    "comprehensive review :cite:`Lipton.Berkowitz.Elkan.2015`.\n",
    "We also note that sequentiality is not unique to RNNs.\n",
    "For example, the CNNs that we already introduced\n",
    "can be adapted to handle data of varying length,\n",
    "e.g., images of varying resolution.\n",
    "Moreover, RNNs have recently ceded considerable\n",
    "market share to Transformer models,\n",
    "which will be covered in :numref:`chap_attention-and-transformers`.\n",
    "However, RNNs rose to prominence as the default models\n",
    "for handling complex sequential structure in deep learning,\n",
    "and remain staple models for sequential modeling to this day.\n",
    "The stories of RNNs and of sequence modeling\n",
    "are inextricably linked, and this is as much\n",
    "a chapter about the ABCs of sequence modeling problems\n",
    "as it is a chapter about RNNs.\n",
    "\n",
    "\n",
    "One key insight paved the way for a revolution in sequence modeling.\n",
    "While the inputs and targets for many fundamental tasks in machine learning\n",
    "cannot easily be represented as fixed-length vectors,\n",
    "they can often nevertheless be represented as\n",
    "varying-length sequences of fixed-length vectors.\n",
    "For example, documents can be represented as sequences of words;\n",
    "medical records can often be represented as sequences of events\n",
    "(encounters, medications, procedures, lab tests, diagnoses);\n",
    "videos can be represented as varying-length sequences of still images.\n",
    "\n",
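     "For instance, with a toy word-to-vector mapping (the words and two-dimensional vectors below are made up purely for illustration), two documents of different lengths become two sequences of equally sized vectors:\n",
     "\n",
     "```python\n",
     "# Hypothetical toy embedding: every word maps to a fixed-length vector.\n",
     "embedding = {'the': [1.0, 0.0], 'cat': [0.0, 1.0], 'sat': [0.5, 0.5]}\n",
     "\n",
     "# Documents of different lengths, each a sequence of fixed-length vectors.\n",
     "doc1 = [embedding[w] for w in ['the', 'cat', 'sat']]\n",
     "doc2 = [embedding[w] for w in ['the', 'cat']]\n",
     "\n",
     "assert len(doc1) != len(doc2)  # sequence lengths vary\n",
     "assert all(len(v) == 2 for v in doc1 + doc2)  # vector length is fixed\n",
     "```\n",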
    "\n",
    "While sequence models have popped up in numerous application areas,\n",
    "basic research in the area has been driven predominantly\n",
    "by advances on core tasks in natural language processing.\n",
    "Thus, throughout this chapter, we will focus\n",
    "our exposition and examples on text data.\n",
    "If you get the hang of these examples,\n",
    "then applying the models to other data modalities\n",
    "should be relatively straightforward.\n",
    "In the next few sections, we introduce basic\n",
    "notation for sequences and some evaluation measures\n",
    "for assessing the quality of sequentially structured model outputs.\n",
    "After that, we discuss basic concepts of a language model\n",
    "and use this discussion to motivate our first RNN models.\n",
    "Finally, we describe the method for calculating gradients\n",
    "when backpropagating through RNNs and explore some challenges\n",
    "that are often encountered when training such networks,\n",
    "motivating the modern RNN architectures that will follow\n",
    "in :numref:`chap_modern_rnn`.\n",
    "\n",
    ":begin_tab:toc\n",
    " - [sequence](sequence.ipynb)\n",
    " - [text-sequence](text-sequence.ipynb)\n",
    " - [language-model](language-model.ipynb)\n",
    " - [rnn](rnn.ipynb)\n",
    " - [rnn-scratch](rnn-scratch.ipynb)\n",
    " - [rnn-concise](rnn-concise.ipynb)\n",
    " - [bptt](bptt.ipynb)\n",
    ":end_tab:\n"
   ]
  }
 ],
 "metadata": {
  "language_info": {
   "name": "python"
  },
  "required_libs": []
 },
 "nbformat": 4,
 "nbformat_minor": 5
}