{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# Tutorial for Generative Models of Graphs\n",
    "\n",
     "In earlier tutorials, we have seen how learned embeddings of a graph and its nodes enable applications such as semi-supervised node classification and sentiment analysis. Wouldn’t it be interesting to predict the future evolution of a graph and perform such analyses iteratively?\n",
    "\n",
     "To do so, we need to generate a variety of graph samples; in other words, we need generative models of graphs. Instead of, or in addition to, learning node and edge features, we want to model the distribution of arbitrary graphs. General generative models can model the density function either explicitly or implicitly, and can generate samples either all at once or sequentially; here we focus only on explicit generative models with sequential generation. Typical applications include drug and material discovery, chemical process modeling, and proteomics.\n",
    "\n",
    "## Introduction\n",
    "\n",
     "The primitive actions for mutating a graph in DGL are nothing more than `add_nodes` and `add_edges`. For example, to draw a cycle of three nodes:\n",
    "\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
    "source": [
     "import dgl\n",
     "\n",
     "# Build the 3-node cycle with the two primitive actions described above.\n",
     "# Note: this sketch assumes the classic mutable DGLGraph API; newer DGL\n",
     "# versions construct graphs with dgl.graph((src, dst)) instead.\n",
     "g = dgl.DGLGraph()\n",
     "g.add_nodes(3)                      # add nodes 0, 1, 2\n",
     "g.add_edges([0, 1, 2], [1, 2, 0])   # add edges 0->1, 1->2, 2->0\n",
     "print(g.number_of_nodes(), g.number_of_edges())"
    ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.6.3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
