{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "0a9b1e18-f4f9-45f9-a3d4-66a2813283db",
   "metadata": {},
   "source": [
    "# How KANs Learn?\n",
    "\n",
    "This notebook extends the [\"How different machine learning models generalize?\"](https://ydf.readthedocs.io/en/latest/blog/2024/10/04/how-different-machine-learning-models-generalize/) blog post [4] to KANs (**Kolmogorov-Arnold Networks**) [1], demonstrating how they \"learn\": we create a 2-D image with a few figures and train KAN models (with various univariate functions) to reproduce it from the input $(x, y)$ coordinates.\n",
    "\n",
    "The following KAN univariate functions are evaluated:\n",
    "\n",
    "1. B-Spline KAN: the univariate function used in the original KAN paper [1].\n",
    "2. GR-KAN (Group Rational KAN): introduced in the KAT (Kolmogorov-Arnold Transformer) paper [2].\n",
    "3. PWL-KAN (Piecewise Linear KAN): A KAN that uses piecewise linear functions.\n",
    "4. PWC-KAN (Piecewise Constant KAN) or Discrete-KAN: A KAN that uses piecewise-constant functions (staircase functions), see demo in [3].\n",
    "\n",
    "* [1] [\"KAN: Kolmogorov-Arnold Networks\" (arxiv.org/abs/2404.19756)](https://arxiv.org/abs/2404.19756)\n",
    "* [2] [\"Kolmogorov-Arnold Transformer\" (arxiv.org/abs/2409.10594)](https://arxiv.org/abs/2409.10594)\n",
    "* [3] [Discrete-KAN](https://gomlx.github.io/gomlx/notebooks/discrete-kan.html)\n",
    "* [4] [\"How different machine learning models generalize?\"](https://ydf.readthedocs.io/en/latest/blog/2024/10/04/how-different-machine-learning-models-generalize/)\n",
    "\n",
    "This notebook was created using [GoNB](https://github.com/janpfeifer/gonb), a Go kernel for Jupyter notebooks, and [GoMLX](https://github.com/gomlx/gomlx), a machine learning framework for Go that uses XLA as a backend."
   ]
  },
  {
   "cell_type": "markdown",
   "id": "a284b9a8-abf3-4a1f-97d7-0b53b0bb018f",
   "metadata": {},
   "source": [
    "## Imports and setup code\n",
    "\n",
    "Setup details, not needed for understanding, so kept hidden."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 1,
   "id": "6abdb185-67b2-4d91-aa6e-e9c32de73ad3",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/markdown": [
       "## GoNB version: `v0.11.1`\n",
       "\n",
       "### Build Info\n",
       "- Go version: go1.25.2 (OS: linux, Arch: amd64)\n"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "\t- Added replace rule for module \"github.com/gomlx/gomlx\" to local directory \"/home/janpf/Projects/gomlx\".\n",
      "\t- Added replace rule for module \"github.com/janpfeifer/gonb\" to local directory \"/home/janpf/Projects/gonb\".\n",
      "\t- Added replace rule for module \"github.com/gomlx/gopjrt\" to local directory \"/home/janpf/Projects/gopjrt\".\n"
     ]
    }
   ],
   "source": [
    "%version\n",
    "!*rm -f go.work && go work init\n",
    "!*go work use . \"${HOME}/Projects/gonb\" \"${HOME}/Projects/gomlx\" \"${HOME}/Projects/gopjrt\"\n",
    "%goworkfix"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 2,
   "id": "03aef856-d032-4cd5-a9e0-481f9ee845cf",
   "metadata": {},
   "outputs": [],
   "source": [
    "import (\n",
    "    \"math\"\n",
    "    \n",
    "    \"github.com/gomlx/gomlx/backends\"\n",
    "    . \"github.com/gomlx/gomlx/pkg/core/graph\"\n",
    "    \"github.com/gomlx/gomlx/pkg/core/shapes\"\n",
    "    \"github.com/gomlx/gomlx/pkg/core/tensors\"\n",
    "    \"github.com/gomlx/gomlx/pkg/ml/context\"\n",
    "    \"github.com/gomlx/gomlx/pkg/ml/datasets\"\n",
    "    \"github.com/gomlx/gomlx/pkg/ml/train\"\n",
    "    \"github.com/gomlx/gomlx/pkg/ml/train/losses\"\n",
    "    \"github.com/gomlx/gomlx/pkg/ml/train/optimizers\"\n",
    "    \"github.com/gomlx/gopjrt/dtypes\"\n",
    "    \"github.com/janpfeifer/gonb/gonbui\"\n",
    "    \"github.com/janpfeifer/gonb/cache\"\n",
    "\n",
    "    // Include default backends\n",
    "    _ \"github.com/gomlx/gomlx/backends/default\"\n",
    ")\n",
    "\n",
    "// Backend used everywhere. Default will use GPU if available, otherwise CPU.\n",
    "var Backend = backends.MustNew()\n",
    "\n",
    "// Makes sure there is a valid reference to `graph` package imported inline.\n",
    "var _ = Add\n",
    "\n",
    "// Define how we want to fail in case of errors.\n",
    "func init_must() {\n",
    "    must.M = func(err error) {\n",
    "        if err != nil {\n",
    "            log.Fatalf(\"Error:\\n%+v\\n\", err)\n",
    "        }\n",
    "    }\n",
    "}\n"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "77998688-ded3-4d59-a133-2c404c7c1340",
   "metadata": {},
   "source": [
    "## Plotting Code\n",
    "\n",
    "We are plotting functions $f(x,y)$ where $x, y \\in [-1, 1]$. The output $f(x,y) \\in [0, 1]$ represents a degree of grayness: 0.0 represents white and 1.0 represents black.\n",
    "\n",
    "The functions are given as GoMLX functions that take as input one array shaped `[batch_size, 2]` (`batch_size` pairs of $(x, y)$).\n"
   ]
  },
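  {
   "cell_type": "markdown",
   "id": "3f2c1a10-0a01-4b5c-9d6e-1f2a3b4c5d6e",
   "metadata": {},
   "source": [
    "The coordinate grid can be sketched in plain Go: the column/row of a pixel in a `size x size` image maps linearly to $(x, y) \\in [-1, 1)$, and a flat index `i` decomposes as `i % size` (column, hence $x$) and `i / size` (row, hence $y$). This is a hedged sketch of the same arithmetic `imgXYs` below performs with `Iota`; the function name is illustrative.\n",
    "\n",
    "```go\n",
    "package main\n",
    "\n",
    "import \"fmt\"\n",
    "\n",
    "// pixelToXY maps the flat pixel index i of a size x size image to\n",
    "// (x, y) in [-1, 1), mirroring imgXYs: value = index/(size/2) - 1.\n",
    "func pixelToXY(i, size int) (x, y float32) {\n",
    "    col, row := i%size, i/size\n",
    "    x = float32(col)/(float32(size)/2) - 1\n",
    "    y = float32(row)/(float32(size)/2) - 1\n",
    "    return\n",
    "}\n",
    "\n",
    "func main() {\n",
    "    const size = 4\n",
    "    for i := 0; i < size*size; i += 5 {\n",
    "        x, y := pixelToXY(i, size)\n",
    "        fmt.Printf(\"i=%2d -> (x=%+.2f, y=%+.2f)\\n\", i, x, y)\n",
    "    }\n",
    "}\n",
    "```"
   ]
  },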
  {
   "cell_type": "code",
   "execution_count": 3,
   "id": "31b2b658-858d-42b1-b6a3-0e5196e78dad",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<img src=\"\"/>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "// PlotFn is the function accepted by the plot functions.\n",
    "//\n",
     "// The function takes an input shaped [batch_size, 2], with batch_size tuples of (x, y), and should return `[batch_size, 1]`\n",
     "// values in the range 0.0 to 1.0.\n",
    "//\n",
    "// The context holds the variables of the model (for learned functions), but can be ignored if the function doesn't use any variables.\n",
    "type PlotFn func (ctx *context.Context, xys *Node) *Node \n",
    "\n",
    "// PlotToImg returns the HTML `<img src=...>` of size \"size x size\", with the plotted function, that can be displayed or composed.\n",
    "//\n",
    "// The context can be nil, if the function doesn't use any variables.\n",
    "func PlotToImg(ctx *context.Context, fn PlotFn, size int) string {\n",
    "    if ctx == nil {\n",
    "        ctx = context.New()\n",
    "    }\n",
    "    ctx = ctx.Reuse()  // When plotting, we shouldn't be creating new variables. This way it will fail if function is not yet trained.\n",
    "    // Generate values for each pixel, shaped [size * size, 1]\n",
    "    values := context.MustExecOnce(Backend, ctx, func (ctx *context.Context, g *Graph) *Node {\n",
    "        xys := imgXYs(g, size)\n",
    "        output := fn(ctx, xys)\n",
    "        return ClipScalar(output, 0.0, 1.0)  // Clip to the values allowed.\n",
    "    })\n",
    "\n",
    "    \n",
    "\timg := image.NewGray(image.Rect(0, 0, size, size))\n",
    "    tensors.ConstFlatData[float32](values, func (data []float32) {\n",
    "    \tfor i, v := range data {\n",
    "    \t\t// Convert float32 (0.0 - 1.0) to uint8 (0 - 255)\n",
    "    \t\tgrayValue := uint8((1.0 - v) * 255)  // Colors reversed, so black is 1.0, white is 0.0 (better for white backgrounds).\n",
    "    \t\ty := (i / size)\n",
    "    \t\tx := i % size\n",
    "    \t\timg.SetGray(x, y, color.Gray{Y: grayValue})\n",
    "    \t}\n",
    "    })\n",
    "    \n",
    "    imgSrc := must.M1(gonbui.EmbedImageAsPNGSrc(img))\n",
    "    return fmt.Sprintf(\"<img src=%q/>\", imgSrc)\n",
    "}\n",
    "\n",
     "// imgXYs creates all combinations of (x, y) for an image of size (size x size), where $x, y \\in [-1, 1]$.\n",
    "// The output is shaped [batchSize, 2], where batchSize = size * size.\n",
    "func imgXYs(g *Graph, size int) *Node {\n",
    "    xs := Iota(g, shapes.Make(dtypes.Float32, size, size, 1), 1) // -> 0, size-1\n",
    "    xs = AddScalar(DivScalar(xs, float64(size)/2.0), -1)  // -> -1.0, 1.0\n",
    "    ys := Iota(g, shapes.Make(dtypes.Float32, size, size, 1), 0)\n",
    "    ys = AddScalar(DivScalar(ys, float64(size)/2.0), -1)  // -> -1.0, 1.0\n",
    "    xys := Concatenate([]*Node{xs, ys}, -1)\n",
    "    xys = Reshape(xys, size*size, 2)  // Contract width and height axes into a \"batch_size\" axis.\n",
    "    return xys\n",
    "}\n",
    "\n",
    "// Plot displays an image generated from the given function.\n",
    "func Plot(ctx *context.Context, fn PlotFn, size int) {\n",
    "    gonbui.DisplayHTML(PlotToImg(ctx, fn, size))\n",
    "}\n",
    "\n",
    "%%\n",
    "// Small test of a gradient function on the x,y diagonal.\n",
    "Plot(nil, func (_ *context.Context, xys *Node) *Node {\n",
    "    value := ReduceAndKeep(xys, ReduceSum, -1)  // x+y -> -2.0 to 2.0\n",
    "    // -> -0.5 to 1.5, but values will be clipped.\n",
    "    return AddScalar(DivScalar(value, 2.0), 0.5)\n",
    "}, 64)\n"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "c93ac9c3-7087-444e-bd43-6075e1819e68",
   "metadata": {},
   "source": [
    "## Ground-Truth Image\n",
    "\n",
    "This is the image we want the various machine learning functions to learn, based on [4].\n"
   ]
  },
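  {
   "cell_type": "markdown",
   "id": "8d41b0a2-0b02-4c6d-8e7f-2a3b4c5d6e7f",
   "metadata": {},
   "source": [
    "The same geometry can be written as a scalar function in plain Go, which may make the graph code in the next cell easier to follow. It takes $(x, y)$ already rescaled to $[0, 1]$ (the graph version does this with `OnePlus`/`DivScalar`): a point is \"inside\" if it falls in the ellipse or in either square, the second square being the first rotated 45° via the $(x+y, y-x)$ change of coordinates. A hedged sketch with the thresholds copied from the graph code:\n",
    "\n",
    "```go\n",
    "package main\n",
    "\n",
    "import (\n",
    "    \"fmt\"\n",
    "    \"math\"\n",
    ")\n",
    "\n",
    "// groundTruth returns 1 if (x, y) in [0,1]^2 is inside the ellipse\n",
    "// (centered at (0.4, 0.3), semi-axes 0.35 and 0.25) or one of the two\n",
    "// squares, and 0 otherwise.\n",
    "func groundTruth(x, y float64) float64 {\n",
    "    ellipse := math.Sqrt(math.Pow((x-0.4)/0.7, 2)+math.Pow((y-0.3)/0.5, 2)) < 0.5\n",
    "\n",
    "    const w = 0.25\n",
    "    // Axis-aligned square with lower corner at (0.10, 0.65).\n",
    "    x1, y1 := x-0.10, y-0.65\n",
    "    square1 := x1 > 0 && x1 < w && y1 > 0 && y1 < w\n",
    "\n",
    "    // 45°-rotated square: rotate the coordinates, then test axis-aligned.\n",
    "    x2, y2 := x-0.74, y-0.55\n",
    "    x2, y2 = x2+y2, y2-x2\n",
    "    square2 := x2 > 0 && x2 < w*math.Sqrt2 && y2 > 0 && y2 < w*math.Sqrt2\n",
    "\n",
    "    if ellipse || square1 || square2 {\n",
    "        return 1\n",
    "    }\n",
    "    return 0\n",
    "}\n",
    "\n",
    "func main() {\n",
    "    fmt.Println(groundTruth(0.4, 0.3)) // center of the ellipse -> 1\n",
    "    fmt.Println(groundTruth(0.0, 0.0)) // background -> 0\n",
    "}\n",
    "```"
   ]
  },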
  {
   "cell_type": "code",
   "execution_count": 4,
   "id": "ab2c65cf-c567-4fde-b362-b679f752e88d",
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<img src=\"\"/>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "func GroundTruth(_ *context.Context, xys *Node) *Node {\n",
    "    g := xys.Graph()\n",
    "    dtype := xys.DType()\n",
    "    \n",
    "    xs := Slice(xys, /*batch_size*/ AxisRange(), /* (x,y) */ AxisElem(0)) // xs -> [-1, 1]\n",
    "    xs = DivScalar(OnePlus(xs), 2)  // xs -> [0, 1]\n",
    "    ys := Slice(xys, /*batch_size*/ AxisRange(), /* (x,y) */ AxisElem(1)) // ys -> [-1, 1]\n",
    "    ys = DivScalar(OnePlus(ys), 2)  // ys -> [0, 1]\n",
    "    \n",
     "    ellipse := Sqrt(Add(\n",
     "        Square(DivScalar(AddScalar(xs, -0.4), 0.7)),\n",
     "        Square(DivScalar(AddScalar(ys, -0.3), 0.5)),\n",
     "    ))\n",
     "    ellipse = LessThan(ellipse, Scalar(g, dtype, 0.5))\n",
     "\n",
     "    w := 0.25\n",
     "    squareLength := Scalar(g, dtype, w)\n",
     "    squareLength2 := Scalar(g, dtype, w*math.Sqrt2)\n",
     "\n",
     "    xs1 := AddScalar(xs, -0.10)\n",
     "    ys1 := AddScalar(ys, -0.65)\n",
     "    zero := ScalarZero(g, dtype)\n",
     "    square1 := And(\n",
     "        And(GreaterThan(xs1, zero), LessThan(xs1, squareLength)),\n",
     "        And(GreaterThan(ys1, zero), LessThan(ys1, squareLength)),\n",
     "    )\n",
     "\n",
     "    xs2, ys2 := AddScalar(xs, -0.74), AddScalar(ys, -0.55)\n",
     "    xs2, ys2 = Add(xs2, ys2), Sub(ys2, xs2)\n",
     "    square2 := And(\n",
     "        And(GreaterThan(xs2, zero), LessThan(xs2, squareLength2)),\n",
     "        And(GreaterThan(ys2, zero), LessThan(ys2, squareLength2)),\n",
     "    )\n",
     "\n",
     "    image := Or(Or(ellipse, square1), square2)\n",
    "    return ConvertDType(image, xys.DType())\n",
    "}\n",
    "\n",
    "%%\n",
    "Plot(nil, GroundTruth, 400)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "f41c6cab-3dbd-4d7d-ba91-e585a46a945a",
   "metadata": {},
   "source": [
    "## Machine Learning a function to mimic `GroundTruth`\n",
    "\n",
    "### Training and Validation Datasets\n",
    "\n",
    "To analyse how a machine learning model \"learns\" (or \"generalizes\") the ground-truth image, we create a fixed dataset with a limited number of examples and then train the model on it.\n",
    "\n",
    "Notice we generate the dataset as a couple of fixed tensors, whose shapes are:\n",
    "\n",
    "* Input shape `[num_batches, batch_size, 2]`, with the $(x, y)$ of the data points. Notice `NumTrainingExamples = num_batches * batch_size`.\n",
    "* Labels shape `[num_batches, batch_size, 1]`. While the labels are defined in the range `[0.0, 1.0]`, our ground-truth only uses 0s and 1s, so they can be considered boolean.\n"
   ]
  },
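  {
   "cell_type": "markdown",
   "id": "5e6f7a8b-0c03-4d7e-9f80-3b4c5d6e7f80",
   "metadata": {},
   "source": [
    "The split into batches is plain row-major index arithmetic: flat example `k` lands in batch `k / batch_size` at row `k % batch_size`, with `batch_size = NumTrainingExamples / num_batches`. A small sketch (the function name is illustrative):\n",
    "\n",
    "```go\n",
    "package main\n",
    "\n",
    "import \"fmt\"\n",
    "\n",
    "// batchOf returns where flat example k lands after reshaping\n",
    "// [num_examples, 2] into [num_batches, batch_size, 2] (row-major order).\n",
    "func batchOf(k, batchSize int) (batch, row int) {\n",
    "    return k / batchSize, k % batchSize\n",
    "}\n",
    "\n",
    "func main() {\n",
    "    const numExamples, numBatches = 32_000, 10\n",
    "    batchSize := numExamples / numBatches\n",
    "    fmt.Println(\"batchSize:\", batchSize) // 3200\n",
    "    b, r := batchOf(7_001, batchSize)\n",
    "    fmt.Printf(\"example 7001 -> batch %d, row %d\\n\", b, r)\n",
    "}\n",
    "```"
   ]
  },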
  {
   "cell_type": "code",
   "execution_count": 5,
   "id": "34741a64-0106-410b-bc53-eea8d4656d72",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "/trainXYs: shape=(Float32)[1 32000 2]\n",
      "\tSample values: [0.9253949 -0.7290534 0.40012693 0.011769533 0.24289417 -0.20978129 0.15292251 -0.09316999 0.96000326 -0.43578953]\n",
      "/trainLabels: shape=(Float32)[1 32000 1]\n",
      "\tSample values: [0 0 1 1 0 1 0 0 1 0]\n",
      "/validXYs: shape=(Float32)[32000 2]\n",
      "\tSample values: [-0.6587877 -0.9495033 0.28734863 0.8810971 0.63639426 0.5474998 0.8441943 -0.5901769 0.46686828 0.9806895]\n",
      "/validLabels: shape=(Float32)[32000 1]\n",
      "\tSample values: [0 0 1 0 0 1 1 0 1 0]\n"
     ]
    }
   ],
   "source": [
    "const (\n",
    "    TrainingDataSeed = 42\n",
    "    NumTrainingExamples = 32_000\n",
    "    NumValidationExamples = 16_000\n",
    ")\n",
    "\n",
    "// Upload training and validation data as variables.\n",
    "func UploadTrainingData(ctx *context.Context) {\n",
    "    _ = context.MustExecOnceN(Backend, ctx, func (ctx *context.Context, g *Graph) {\n",
    "        rng := Const(g, RngStateFromSeed(TrainingDataSeed))\n",
    "        var trainXYs, validXYs *Node\n",
    "        rng, trainXYs = RandomUniform(rng, shapes.Make(dtypes.Float32, NumTrainingExamples, 2))  // -> [0, 1]\n",
    "        rng, validXYs = RandomUniform(rng, shapes.Make(dtypes.Float32, NumValidationExamples, 2))  // -> [0, 1]\n",
    "        trainXYs = AddScalar(MulScalar(trainXYs, 2), -1)  // -> [-1, 1]\n",
    "        validXYs = AddScalar(MulScalar(validXYs, 2), -1)  // -> [-1, 1]\n",
    "        trainLabels := GroundTruth(ctx, trainXYs)\n",
    "        validLabels := GroundTruth(ctx, validXYs)\n",
    "\n",
    "        // Break up into batches.\n",
    "        numBatches := context.GetParamOr(ctx, \"num_batches\", int(1))  // default to 1.\n",
    "        batchSize := NumTrainingExamples / numBatches\n",
    "        trainXYs = Reshape(trainXYs, numBatches, batchSize, 2)\n",
    "        trainLabels = Reshape(trainLabels, numBatches, batchSize, 1)\n",
    "\n",
    "        // Store data into (non-trainable) variables.\n",
    "        _ = ctx.VariableWithValueGraph(\"trainXYs\", trainXYs).SetTrainable(false)\n",
    "        _ = ctx.VariableWithValueGraph(\"trainLabels\", trainLabels).SetTrainable(false)\n",
    "        _ = ctx.VariableWithValueGraph(\"validXYs\", validXYs).SetTrainable(false)\n",
    "        _ = ctx.VariableWithValueGraph(\"validLabels\", validLabels).SetTrainable(false)\n",
    "    })\n",
    "}\n",
    "\n",
    "%%\n",
    "// Create a context, and check that variables with the training data were uploaded with the right shape.\n",
    "ctx := context.New()\n",
    "UploadTrainingData(ctx)\n",
    "for v := range ctx.IterVariables() {\n",
    "    fmt.Printf(\"%s: shape=%s\\n\", v.ScopeAndName(), v.Shape())\n",
    "    tensors.ConstFlatData(v.Value(), func (flat []float32) {\n",
    "        fmt.Printf(\"\\tSample values: %v\\n\", flat[:10])\n",
    "    })\n",
    "}"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "e3839fbf-8580-487d-8e67-ba6224c5dba3",
   "metadata": {},
   "source": [
    "### Trivial Training Loop\n",
    "\n",
    "Below we create a training loop and a loss-function adapter that use the training data we just created.\n",
    "\n",
    "It can be used by any model we may want to train."
   ]
  },
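  {
   "cell_type": "markdown",
   "id": "9a0b1c2d-0d04-4e8f-a091-4c5d6e7f8091",
   "metadata": {},
   "source": [
    "The only non-obvious part is how a batch is picked: the optimizer keeps a global step counter, and the batch index is the step modulo `num_batches`, so training cycles through the batches in order. The `ModScalar`/`DynamicSlice` pair in the model function does the equivalent on the tensor; a scalar sketch:\n",
    "\n",
    "```go\n",
    "package main\n",
    "\n",
    "import \"fmt\"\n",
    "\n",
    "// batchForStep mirrors the ModScalar indexing used by the model function:\n",
    "// training cycles through the batches in order, once per epoch.\n",
    "func batchForStep(globalStep, numBatches int) (batch, epoch int) {\n",
    "    return globalStep % numBatches, globalStep / numBatches\n",
    "}\n",
    "\n",
    "func main() {\n",
    "    for _, step := range []int{0, 9, 10, 25} {\n",
    "        b, e := batchForStep(step, 10)\n",
    "        fmt.Printf(\"step=%2d -> batch %d (epoch %d)\\n\", step, b, e)\n",
    "    }\n",
    "}\n",
    "```"
   ]
  },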
  {
   "cell_type": "code",
   "execution_count": 6,
   "id": "3042a2bc-6d99-40eb-ab7c-8726cbe7484b",
   "metadata": {},
   "outputs": [],
   "source": [
    "// Train the given learnable fn for numSteps. The context (ctx) will store the learned weights.\n",
    "//\n",
    "// Use one of the losses defined here, since we take the labels as generated by the model function (as opposed to given by the dataset).\n",
    "func Train(ctx *context.Context, fn PlotFn, numSteps int, lossFn losses.LossFn) error {\n",
    "    // Upload training data, if not yet set.\n",
    "    v := ctx.InspectVariable(\"/\", \"trainXYs\")\n",
    "    if v == nil {\n",
    "        UploadTrainingData(ctx)\n",
    "    }\n",
    "    ds := datasets.NewConstantDataset()\n",
    "                       \n",
     "    // Our model function: it gets the batch from the fixed inputs and calls the learnable fn provided.\n",
    "    // It also returns the labels for the batch.\n",
    "    modelFn := func(ctx *context.Context, spec any, inputs []*Node) []*Node {\n",
    "        g := inputs[0].Graph()\n",
    "        \n",
    "        // Get training data, which must exist already in the context.\n",
    "        trainXYs := ctx.InspectVariable(\"/\", \"trainXYs\").ValueGraph(g)\n",
    "        trainLabels := ctx.InspectVariable(\"/\", \"trainLabels\").ValueGraph(g)\n",
    "    \n",
    "        // Get the batch for this global step.\n",
    "        numBatches := trainXYs.Shape().Dim(0)\n",
    "        batchSize := trainXYs.Shape().Dim(1)\n",
    "        globalStep := optimizers.GetGlobalStepVar(ctx).ValueGraph(g)\n",
    "        batchIdx := ModScalar(globalStep, numBatches)\n",
    "        zeroIdx := ScalarZero(g, globalStep.DType())\n",
    "        batchXYs := DynamicSlice(trainXYs, []*Node{batchIdx, zeroIdx, zeroIdx}, []int{1, batchSize, 2})\n",
    "        batchXYs = Reshape(batchXYs, batchSize, 2)\n",
    "        batchLabels := DynamicSlice(trainLabels, []*Node{batchIdx, zeroIdx, zeroIdx}, []int{1, batchSize, 1})\n",
    "        batchLabels = Reshape(batchLabels, batchSize, 1)\n",
    "\n",
    "        // Prediction\n",
    "        logits := fn(ctx, batchXYs)\n",
    "        return []*Node{logits, batchLabels}\n",
    "    }\n",
    "\n",
    "    // The labels are actually provided by the modelFn, in logits[1].\n",
    "    fixedLossFn := func(labels, logits []*Node) *Node {\n",
    "        return lossFn(/* labels= */ []*Node{logits[1]}, /* logits= */ []*Node{logits[0]})\n",
    "    }\n",
    "    trainer := train.NewTrainer(Backend, ctx, modelFn, fixedLossFn, optimizers.FromContext(ctx), nil, nil)\n",
    "    loop := train.NewLoop(trainer)\n",
    "\tcommandline.AttachProgressBar(loop) // Attaches a progress bar to the loop.\n",
    "    metrics, err := loop.RunSteps(ds, numSteps)\n",
    "    if err != nil { \n",
    "        return err\n",
    "    }\n",
    "    fmt.Printf(\"Train: loss=%v\\n\", metrics[0])\n",
    "    fmt.Printf(\"\\tMedian training time: %s\\n\", loop.MedianTrainStepDuration())\n",
    "    return nil\n",
    "}\n",
    "\n",
     "// PlotLogitsFn converts a logits function -- a function whose outputs range from -inf to +inf -- to a 0 to 1 function used for plotting.\n",
    "func PlotLogitsFn(logitsFn PlotFn) PlotFn {\n",
    "    return func(ctx *context.Context, xys *Node) *Node {\n",
    "        logits := logitsFn(ctx, xys)\n",
    "        return Sigmoid(logits)\n",
    "    }\n",
    "}"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "7bb54509-4733-42f2-8087-f64babf8ace7",
   "metadata": {},
   "source": [
    "### Accuracy of a Function\n",
    "\n",
    "Measured uniformly over the whole image at the given resolution."
   ]
  },
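  {
   "cell_type": "markdown",
   "id": "1b2c3d4e-0e05-4f90-a1b2-5d6e7f809102",
   "metadata": {},
   "source": [
    "The accuracy computation reduces to: threshold the prediction at 0.5, compare with the boolean ground truth, and average the matches. A scalar sketch of the `GreaterThan`/`Equal`/`ReduceAllMean` pipeline used in the next cell:\n",
    "\n",
    "```go\n",
    "package main\n",
    "\n",
    "import \"fmt\"\n",
    "\n",
    "// accuracy thresholds predictions at 0.5, compares them against the\n",
    "// boolean ground truth, and returns the fraction of matches.\n",
    "func accuracy(pred []float32, truth []bool) float32 {\n",
    "    var hits int\n",
    "    for i, p := range pred {\n",
    "        if (p > 0.5) == truth[i] {\n",
    "            hits++\n",
    "        }\n",
    "    }\n",
    "    return float32(hits) / float32(len(pred))\n",
    "}\n",
    "\n",
    "func main() {\n",
    "    pred := []float32{0.9, 0.2, 0.6, 0.4}\n",
    "    truth := []bool{true, false, false, false}\n",
    "    fmt.Println(accuracy(pred, truth)) // 3 of 4 match -> 0.75\n",
    "}\n",
    "```"
   ]
  },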
  {
   "cell_type": "code",
   "execution_count": 7,
   "id": "189b680a-72a2-4b3a-bc4f-d0a0a977ecaf",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "GroundTruth accuracy is 100.0%\n"
     ]
    }
   ],
   "source": [
    "// Accuracy returns the accuracy of the model over the whole image, measured at the given resolution.\n",
    "// It takes the model (fn) prediction as true if $fn(x,y) > 0.5$ or false otherwise.\n",
    "func Accuracy(ctx *context.Context, fn PlotFn, resolution int) float32 {\n",
    "    if ctx == nil {\n",
    "        ctx = context.New()\n",
    "    }\n",
     "    ctx = ctx.Reuse()  // When evaluating, we shouldn't be creating new variables. This way it will fail if the function is not yet trained.\n",
    "    \n",
    "    // Generate values for each pixel, shaped [size * size, 1]\n",
    "    accuracy := context.MustExecOnce(Backend, ctx, func (ctx *context.Context, g *Graph) *Node {\n",
    "        xys := imgXYs(g, resolution)\n",
    "        dtype := xys.DType()\n",
    "        truth := GroundTruth(ctx, xys)\n",
    "        predicted := fn(ctx, xys)\n",
    "        predicted = GreaterThan(predicted, Scalar(g, dtype, 0.5))\n",
    "        predicted = ConvertDType(predicted, dtype)\n",
    "        accuracy := Equal(truth, predicted)\n",
    "        accuracy = ConvertDType(accuracy, dtype)\n",
    "        accuracy = ReduceAllMean(accuracy)\n",
    "        return accuracy\n",
    "    })\n",
    "    return tensors.ToScalar[float32](accuracy)\n",
    "}\n",
    "\n",
    "%%\n",
    "fmt.Printf(\"GroundTruth accuracy is %.1f%%\\n\", Accuracy(nil, GroundTruth, 400)*100.0)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "62771e58-e27a-4ec2-af13-411088e34345",
   "metadata": {},
   "source": [
    "### Demo Function\n",
    "\n",
    "This is a helper function that trains a given function, reports its accuracy and number of trainable parameters, and plots the result."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "id": "29ea8a2d-0777-4e8a-99ed-1e68cf5d3d71",
   "metadata": {},
   "outputs": [],
   "source": [
    "var (\n",
    "    BaseNumSteps = 2500\n",
    ")\n",
    "\n",
    "func DemoFn(fn PlotFn, numBatches int, useCrossentropy bool) {\n",
    "    ctx := context.New()\n",
    "    ctx.SetParam(\"num_batches\", numBatches)\n",
    "    var loss losses.LossFn\n",
    "    if useCrossentropy {\n",
    "        loss = losses.BinaryCrossentropyLogits\n",
    "    } else {\n",
    "        loss = losses.MeanSquaredError\n",
    "    }\n",
    "    must.M(Train(ctx, fn, /*numSteps*/ BaseNumSteps * numBatches, loss))\n",
    "    fmt.Printf(\"Accuracy:\\t%.1f%%\\n\", Accuracy(ctx, fn, 400)*100.0)\n",
    "    \n",
    "    var numParams int\n",
    "    for v := range ctx.IterVariables() {\n",
    "        if v.Trainable {\n",
    "            numParams += v.Shape().Size()\n",
    "        }\n",
    "    }\n",
    "    fmt.Printf(\"# params:\\t%d\\n\", numParams)\n",
    "\n",
    "    if useCrossentropy {\n",
    "        Plot(ctx, PlotLogitsFn(fn), 400)\n",
    "    } else {\n",
    "        Plot(ctx, PlotFn(fn), 400)\n",
    "    }\n",
    "}"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "d28aa036-3770-4fc0-950d-f424241958d7",
   "metadata": {},
   "source": [
    "### Feedforward Neural Network Baseline"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 9,
   "id": "37d93945-2c30-4d35-b90b-31feff960e9e",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "      \u001b[1m 100% [========================================] (5041 steps/s)\u001b[0m [step=24999] [loss+=0.00543] [~loss+=0.00543] [~loss=0.00543]                \n",
      "Train: loss=float32(0.00543)\n",
      "\tMedian training time: 185.371µs\n",
      "Accuracy:\t99.6%\n",
      "# params:\t921\n"
     ]
    },
    {
     "data": {
      "text/html": [
       "<img src=\"\"/>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "func FNN(ctx *context.Context, xys *Node) *Node {\n",
     "    return fnn.New(ctx.In(\"fnn\"), xys, /* outputDim */ 1).NumHiddenLayers(3, 20).Done()\n",
    "}\n",
    "\n",
    "%%\n",
    "DemoFn(FNN, /* numBatches */ 10, /* useCrossentropy */ true)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "e2679dfa-295b-472c-b5b9-f80f3eed09ee",
   "metadata": {},
   "source": [
    "### BSpline-KAN"
   ]
  },
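  {
   "cell_type": "markdown",
   "id": "2c3d4e5f-0f06-4a01-b2c3-6e7f80910213",
   "metadata": {},
   "source": [
    "A B-spline edge function is a weighted sum of local polynomial bases; the weights at the control points are the trained parameters. The sketch below evaluates the Cox-de Boor recursion on uniform knots and checks that the degree-2 bases form a partition of unity in the interior -- the property that keeps the spline well-behaved between control points. Illustrative code, not the `kan` package's implementation:\n",
    "\n",
    "```go\n",
    "package main\n",
    "\n",
    "import \"fmt\"\n",
    "\n",
    "// bsplineBasis evaluates basis i of degree p over knot vector t at x,\n",
    "// using the Cox-de Boor recursion.\n",
    "func bsplineBasis(i, p int, t []float64, x float64) float64 {\n",
    "    if p == 0 {\n",
    "        if t[i] <= x && x < t[i+1] {\n",
    "            return 1\n",
    "        }\n",
    "        return 0\n",
    "    }\n",
    "    var left, right float64\n",
    "    if d := t[i+p] - t[i]; d != 0 {\n",
    "        left = (x - t[i]) / d * bsplineBasis(i, p-1, t, x)\n",
    "    }\n",
    "    if d := t[i+p+1] - t[i+1]; d != 0 {\n",
    "        right = (t[i+p+1] - x) / d * bsplineBasis(i+1, p-1, t, x)\n",
    "    }\n",
    "    return left + right\n",
    "}\n",
    "\n",
    "func main() {\n",
    "    // Uniform knots (in units of the grid spacing): on [0, 1] the three\n",
    "    // overlapping degree-2 bases sum to 1, so a KAN edge function is a\n",
    "    // smooth, local, weighted sum of them.\n",
    "    t := []float64{-2, -1, 0, 1, 2, 3}\n",
    "    var sum float64\n",
    "    for i := 0; i < 3; i++ {\n",
    "        sum += bsplineBasis(i, 2, t, 0.5)\n",
    "    }\n",
    "    fmt.Printf(\"sum of degree-2 bases at x=0.5: %.3f\\n\", sum)\n",
    "}\n",
    "```"
   ]
  },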
  {
   "cell_type": "code",
   "execution_count": 10,
   "id": "440e4a5a-2baa-418f-997d-d17d88a26c88",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "      \u001b[1m 100% [========================================] (3243 steps/s)\u001b[0m [step=24999] [loss+=0.00111] [~loss+=0.00121] [~loss=0.00121]        ]        \n",
      "Train: loss=float32(0.001113)\n",
      "\tMedian training time: 296.262µs\n",
      "Accuracy:\t99.7%\n",
      "# params:\t1040\n"
     ]
    },
    {
     "data": {
      "text/html": [
       "<img src=\"\"/>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "func BSplineKAN(ctx *context.Context, xys *Node) *Node {\n",
    "    return kan.New(ctx.In(\"BSplineKan\"), xys, 1).\n",
    "        NumHiddenLayers(2, 10).\n",
    "        NumControlPoints(6).\n",
    "        UseResidual(true).\n",
    "        Done()\n",
    "}\n",
    "\n",
    "%%\n",
    "DemoFn(BSplineKAN, /* numBatches */ 10, /* useCrossentropy */ true)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "11a70643-603e-4cee-9dcc-bba97e394beb",
   "metadata": {},
   "source": [
    "### Rational Functions KAN (GR-KAN)"
   ]
  },
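  {
   "cell_type": "markdown",
   "id": "3d4e5f60-1007-4b12-c3d4-7f8091021324",
   "metadata": {},
   "source": [
    "GR-KAN replaces splines with rational functions $P(x)/Q(x)$ shared by groups of channels. A common \"safe\" parameterization from the rational-activation literature keeps the denominator as $|1 + \\sum_m b_m x^m|$, so the function has no poles. A hedged sketch with illustrative coefficients (the `kan` package's exact parameterization may differ):\n",
    "\n",
    "```go\n",
    "package main\n",
    "\n",
    "import (\n",
    "    \"fmt\"\n",
    "    \"math\"\n",
    ")\n",
    "\n",
    "// rational evaluates a \"safe\" rational unit P(x)/Q(x) with\n",
    "// Q(x) = |1 + b1*x + b2*x^2 + ...|, which keeps the denominator\n",
    "// away from zero.\n",
    "func rational(p, q []float64, x float64) float64 {\n",
    "    num, xp := 0.0, 1.0\n",
    "    for _, c := range p {\n",
    "        num += c * xp\n",
    "        xp *= x\n",
    "    }\n",
    "    den, xq := 1.0, x\n",
    "    for _, c := range q {\n",
    "        den += c * xq\n",
    "        xq *= x\n",
    "    }\n",
    "    return num / math.Abs(den)\n",
    "}\n",
    "\n",
    "func main() {\n",
    "    p := []float64{0, 1, 0.5} // numerator: x + 0.5x^2\n",
    "    q := []float64{0.5}       // denominator: |1 + 0.5x|\n",
    "    for _, x := range []float64{-1, 0, 1} {\n",
    "        fmt.Printf(\"f(%+.0f) = %.3f\\n\", x, rational(p, q, x))\n",
    "    }\n",
    "}\n",
    "```"
   ]
  },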
  {
   "cell_type": "code",
   "execution_count": 11,
   "id": "152063d2-1dca-4307-8ae1-fda373bee2c5",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "      \u001b[1m 100% [========================================] (3795 steps/s)\u001b[0m [step=24999] [loss+=0.00616] [~loss+=0.00678] [~loss=0.00678]                \n",
      "Train: loss=float32(0.006161)\n",
      "\tMedian training time: 254.802µs\n",
      "Accuracy:\t99.7%\n",
      "# params:\t1430\n"
     ]
    },
    {
     "data": {
      "text/html": [
       "<img src=\"\"/>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "func GRKAN(ctx *context.Context, xys *Node) *Node {\n",
    "    return kan.New(ctx.In(\"GRKAN\"), xys, 1).\n",
    "        Rational().\n",
    "        NumHiddenLayers(2, 10).\n",
    "        UseResidual(true).\n",
    "        Done()\n",
    "}\n",
    "\n",
    "%%\n",
    "DemoFn(GRKAN, /* numBatches */ 10, /* useCrossentropy */ true)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "68880e1a-1140-4b9c-aa28-c1a15af30246",
   "metadata": {},
   "source": [
    "### Piecewise Linear KAN (PWL-KAN)"
   ]
  },
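  {
   "cell_type": "markdown",
   "id": "4e5f6071-1108-4c23-d4e5-809102132435",
   "metadata": {},
   "source": [
    "A PWL-KAN edge function linearly interpolates between trainable control points; with `kan.ParamPWLSplitPointsTrainable` set, the split positions are learned too. A sketch of the forward evaluation (clamping outside the covered range is one common choice; the package may handle extrapolation differently):\n",
    "\n",
    "```go\n",
    "package main\n",
    "\n",
    "import \"fmt\"\n",
    "\n",
    "// pwl evaluates a piecewise-linear function defined by sorted split\n",
    "// points xs and control values ys, clamping outside the covered range.\n",
    "func pwl(xs, ys []float64, x float64) float64 {\n",
    "    if x <= xs[0] {\n",
    "        return ys[0]\n",
    "    }\n",
    "    for i := 1; i < len(xs); i++ {\n",
    "        if x <= xs[i] {\n",
    "            t := (x - xs[i-1]) / (xs[i] - xs[i-1])\n",
    "            return ys[i-1] + t*(ys[i]-ys[i-1])\n",
    "        }\n",
    "    }\n",
    "    return ys[len(ys)-1]\n",
    "}\n",
    "\n",
    "func main() {\n",
    "    xs := []float64{-1, 0, 1}\n",
    "    ys := []float64{0, 1, 0} // a \"tent\" function\n",
    "    fmt.Println(pwl(xs, ys, -0.5)) // 0.5\n",
    "    fmt.Println(pwl(xs, ys, 0.25)) // 0.75\n",
    "}\n",
    "```"
   ]
  },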
  {
   "cell_type": "code",
   "execution_count": 12,
   "id": "f4126556-7476-452c-92c9-0c8489ff7b55",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "      \u001b[1m 100% [========================================] (3509 steps/s)\u001b[0m [step=24999] [loss+=0.00636] [~loss+=0.00701] [~loss=0.00701]                \n",
      "Train: loss=float32(0.006357)\n",
      "\tMedian training time: 272.862µs\n",
      "Accuracy:\t99.7%\n",
      "# params:\t3170\n"
     ]
    },
    {
     "data": {
      "text/html": [
       "<img src=\"\"/>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "func PWLKAN(ctx *context.Context, xys *Node) *Node {\n",
    "    ctx.SetParam(kan.ParamPWLSplitPointsTrainable, true)\n",
    "    return kan.New(ctx.In(\"PWL-KAN\"), xys, 1).\n",
    "        PiecewiseLinear().\n",
    "        NumHiddenLayers(2, 10).\n",
    "        UseResidual(true).\n",
    "        NumControlPoints(20).        \n",
    "        Done()\n",
    "}\n",
    "\n",
    "%%\n",
    "DemoFn(PWLKAN, /* numBatches */ 10, /* useCrossentropy */ true)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "1ad6d060-d41d-45f5-968c-b7f0dc3bc62d",
   "metadata": {},
   "source": [
    "### Piecewise Constant KAN (Discrete-KAN)"
   ]
  },
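  {
   "cell_type": "markdown",
   "id": "5f607182-1209-4d34-e5f6-910213243546",
   "metadata": {},
   "source": [
    "A piecewise-constant function has zero gradient almost everywhere, so Discrete-KAN trains with \"softened\" steps: each step is a scaled sigmoid whose sharpness is controlled by a softness parameter, and the schedule anneals softness toward zero (here down to `1e-5`) until the function is effectively a staircase. An illustrative sketch, not the `kan` package's exact parameterization:\n",
    "\n",
    "```go\n",
    "package main\n",
    "\n",
    "import (\n",
    "    \"fmt\"\n",
    "    \"math\"\n",
    ")\n",
    "\n",
    "// softStaircase sums scaled sigmoids: with large softness the steps are\n",
    "// smooth (gradients flow); as softness -> 0 it approaches a hard\n",
    "// staircase, i.e. a piecewise-constant function. splits are the step\n",
    "// positions, levels the constant values (len(levels) == len(splits)+1).\n",
    "func softStaircase(splits, levels []float64, softness, x float64) float64 {\n",
    "    y := levels[0]\n",
    "    for i, s := range splits {\n",
    "        y += (levels[i+1] - levels[i]) / (1 + math.Exp(-(x-s)/softness))\n",
    "    }\n",
    "    return y\n",
    "}\n",
    "\n",
    "func main() {\n",
    "    splits := []float64{0}\n",
    "    levels := []float64{0, 1}\n",
    "    for _, soft := range []float64{0.5, 0.01} {\n",
    "        fmt.Printf(\"softness=%.2f: f(-0.2)=%.3f  f(+0.2)=%.3f\\n\",\n",
    "            soft, softStaircase(splits, levels, soft, -0.2),\n",
    "            softStaircase(splits, levels, soft, 0.2))\n",
    "    }\n",
    "}\n",
    "```"
   ]
  },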
  {
   "cell_type": "code",
   "execution_count": 13,
   "id": "c8a6e6c7-3758-4bdc-9f71-ac2b416c5738",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "      \u001b[1m 100% [========================================] (3552 steps/s)\u001b[0m [step=99999] [loss+=0.00266] [~loss+=0.0024] [~loss=0.0024]        93]            \n",
      "Train: loss=float32(0.002656)\n",
      "\tMedian training time: 269.352µs\n",
      "Accuracy:\t97.4%\n",
      "# params:\t2443\n"
     ]
    },
    {
     "data": {
      "text/html": [
       "<img src=\"\"/>"
      ]
     },
     "metadata": {},
     "output_type": "display_data"
    }
   ],
   "source": [
    "func DiscreteKAN(ctx *context.Context, xys *Node) *Node {\n",
    "    ctx.SetParam(kan.ParamDiscreteSoftnessScheduleMin, 1e-5)\n",
    "    return kan.New(ctx.In(\"DiscreteKAN\"), xys, 1).\n",
    "        Discrete().\n",
    "        NumHiddenLayers(3, 5).\n",
    "        UseResidual(true).\n",
    "        NumControlPoints(30).\n",
    "        DiscreteSoftness(0.2).\n",
    "        DiscreteSplitsTrainable(true).\n",
    "        DiscreteSoftnessScheduleType(kan.SoftnessScheduleExponential).\n",
    "        Done()\n",
    "}\n",
    "\n",
    "%%\n",
    "BaseNumSteps = 10_000\n",
    "DemoFn(DiscreteKAN, /* numBatches */ 10, /* useCrossentropy */ false)"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Go (gonb)",
   "language": "go",
   "name": "gonb"
  },
  "language_info": {
   "codemirror_mode": "",
   "file_extension": ".go",
   "mimetype": "text/x-go",
   "name": "go",
   "nbconvert_exporter": "",
   "pygments_lexer": "",
   "version": "go1.25.2"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
