{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# How to Save and Load Models\n",
    "\n",
    "This guide shows you how to save and load BrainPy models for checkpointing, resuming training, and deployment."
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Quick Start\n",
    "\n",
    "**Save a trained model:**"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import brainpy\n",
    "import brainstate\n",
    "import pickle\n",
    "\n",
    "# After training...\n",
    "state_dict = {\n",
    "    'params': net.states(brainstate.ParamState),\n",
    "    'epoch': current_epoch,\n",
    "}\n",
    "\n",
    "with open('model.pkl', 'wb') as f:\n",
    "    pickle.dump(state_dict, f)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "**Load a model:**"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Create model with same architecture\n",
    "net = MyNetwork()\n",
    "brainstate.nn.init_all_states(net)\n",
    "\n",
    "# Load saved state\n",
    "with open('model.pkl', 'rb') as f:\n",
    "    state_dict = pickle.load(f)\n",
    "\n",
    "# Restore parameters\n",
    "params = net.states(brainstate.ParamState)\n",
    "for name, state in state_dict['params'].items():\n",
    "    params[name].value = state.value"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Understanding What to Save\n",
    "\n",
    "### State Types\n",
    "\n",
    "BrainPy has three state types with different persistence requirements:\n",
    "\n",
    "**ParamState (always save)**\n",
    "\n",
    "- Learnable weights and biases\n",
    "- Required to restore the trained model\n",
    "- Examples: synaptic weights, neural biases\n",
    "\n",
    "**LongTermState (usually save)**\n",
    "\n",
    "- Persistent statistics and counters\n",
    "- Not updated by gradients\n",
    "- Examples: running averages, spike counts\n",
    "\n",
    "**ShortTermState (never save)**\n",
    "\n",
    "- Temporary dynamics that reset each trial\n",
    "- Re-initialized on load anyway\n",
    "- Examples: membrane potentials, synaptic conductances"
   ]
  },
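  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The save-or-skip policy above can be sketched with plain dictionaries and `pickle`. This is an illustrative, framework-free sketch: the hand-built `params`, `long_term`, and `short_term` dicts stand in for what `net.states(brainstate.ParamState)` and its siblings would return on a real model."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import pickle\n",
    "import numpy as np\n",
    "\n",
    "# Stand-ins for the three state groups of a trained model\n",
    "params = {'fc.weight': np.ones((4, 2)), 'fc.bias': np.zeros(2)}  # ParamState: save\n",
    "long_term = {'spike_count': np.array(120)}                       # LongTermState: usually save\n",
    "short_term = {'V': np.full(4, -65.0)}                            # ShortTermState: skip\n",
    "\n",
    "# Persist only the first two groups\n",
    "with open('policy_demo.pkl', 'wb') as f:\n",
    "    pickle.dump({'params': params, 'long_term': long_term}, f)\n",
    "\n",
    "with open('policy_demo.pkl', 'rb') as f:\n",
    "    restored = pickle.load(f)\n",
    "\n",
    "print(sorted(restored.keys()))  # short_term never touches disk"
   ]
  },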
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Recommended Approach"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def save_checkpoint(net, optimizer, epoch, filepath):\n",
    "    \"\"\"Save model checkpoint.\"\"\"\n",
    "    state_dict = {\n",
    "        # Required: model parameters\n",
    "        'params': net.states(brainstate.ParamState),\n",
    "\n",
    "        # Optional but recommended: long-term states\n",
    "        'long_term': net.states(brainstate.LongTermState),\n",
    "\n",
    "        # Training metadata\n",
    "        'epoch': epoch,\n",
    "        'optimizer_state': optimizer.state_dict() if optimizer is not None else None,\n",
    "\n",
    "        # Model configuration (helpful for loading)\n",
    "        'config': {\n",
    "            'n_input': net.n_input,\n",
    "            'n_hidden': net.n_hidden,\n",
    "            'n_output': net.n_output,\n",
    "            # ... other hyperparameters\n",
    "        }\n",
    "    }\n",
    "\n",
    "    with open(filepath, 'wb') as f:\n",
    "        pickle.dump(state_dict, f)\n",
    "\n",
    "    print(f\"✅ Saved checkpoint to {filepath}\")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Basic Save/Load\n",
    "\n",
    "### Using Pickle (Simple)\n",
    "\n",
    "**Advantages:**\n",
    "- Simple and straightforward\n",
    "- Works with any Python object\n",
    "- Good for quick prototyping\n",
    "\n",
    "**Disadvantages:**\n",
    "- Python-specific format\n",
    "- Version compatibility issues\n",
    "- Not human-readable"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import pickle\n",
    "import brainpy\n",
    "import brainstate\n",
    "import brainunit as u\n",
    "\n",
    "# Define your model\n",
    "class SimpleNet(brainstate.nn.Module):\n",
    "    def __init__(self, n_neurons=100):\n",
    "        super().__init__()\n",
    "        self.lif = brainpy.state.LIF(n_neurons, V_rest=-65*u.mV, V_th=-50*u.mV, tau=10*u.ms)\n",
    "        self.fc = brainstate.nn.Linear(n_neurons, 10)\n",
    "\n",
    "    def update(self, x):\n",
    "        self.lif(x)\n",
    "        return self.fc(self.lif.get_spike())\n",
    "\n",
    "# Train model\n",
    "net = SimpleNet()\n",
    "brainstate.nn.init_all_states(net)\n",
    "# ... training code ...\n",
    "\n",
    "# Save\n",
    "params = net.states(brainstate.ParamState)\n",
    "with open('simple_net.pkl', 'wb') as f:\n",
    "    pickle.dump(params, f)\n",
    "\n",
    "# Load\n",
    "net_new = SimpleNet()\n",
    "brainstate.nn.init_all_states(net_new)\n",
    "\n",
    "with open('simple_net.pkl', 'rb') as f:\n",
    "    loaded_params = pickle.load(f)\n",
    "\n",
    "# Restore parameters\n",
    "params_new = net_new.states(brainstate.ParamState)\n",
    "for name, state in loaded_params.items():\n",
    "    params_new[name].value = state.value"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Using NumPy (Arrays Only)\n",
    "\n",
    "**Advantages:**\n",
    "- Language-agnostic\n",
    "- Efficient storage\n",
    "- Widely supported\n",
    "\n",
    "**Disadvantages:**\n",
    "- Only saves arrays (not structure)\n",
    "- Need to manually track parameter names"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import numpy as np\n",
    "import jax.numpy as jnp\n",
    "\n",
    "# Save parameters as .npz\n",
    "params = net.states(brainstate.ParamState)\n",
    "param_dict = {name: np.array(state.value) for name, state in params.items()}\n",
    "np.savez('model_params.npz', **param_dict)\n",
    "\n",
    "# Load parameters\n",
    "loaded = np.load('model_params.npz')\n",
    "params = net.states(brainstate.ParamState)\n",
    "for name, array in loaded.items():\n",
    "    params[name].value = jnp.array(array)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Checkpointing During Training\n",
    "\n",
    "### Periodic Checkpoints\n",
    "\n",
    "Save at regular intervals during training."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import braintools\n",
    "\n",
    "# Training setup\n",
    "net = MyNetwork()\n",
    "optimizer = braintools.optim.Adam(lr=1e-3)\n",
    "optimizer.register_trainable_weights(net.states(brainstate.ParamState))\n",
    "\n",
    "save_interval = 5  # Save every 5 epochs\n",
    "checkpoint_dir = './checkpoints'\n",
    "import os\n",
    "os.makedirs(checkpoint_dir, exist_ok=True)\n",
    "\n",
    "# Training loop\n",
    "for epoch in range(num_epochs):\n",
    "    # Training step\n",
    "    for batch in train_loader:\n",
    "        loss = train_step(net, optimizer, batch)\n",
    "\n",
    "    # Periodic save\n",
    "    if (epoch + 1) % save_interval == 0:\n",
    "        checkpoint_path = f'{checkpoint_dir}/epoch_{epoch+1}.pkl'\n",
    "        save_checkpoint(net, optimizer, epoch, checkpoint_path)\n",
    "\n",
    "        print(f\"Epoch {epoch+1}: Loss={loss:.4f}, Checkpoint saved\")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Best Model Checkpoint\n",
    "\n",
    "Save only when validation performance improves."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "best_val_loss = float('inf')\n",
    "best_model_path = 'best_model.pkl'\n",
    "\n",
    "for epoch in range(num_epochs):\n",
    "    # Training\n",
    "    train_loss = train_epoch(net, optimizer, train_loader)\n",
    "\n",
    "    # Validation\n",
    "    val_loss = validate(net, val_loader)\n",
    "\n",
    "    # Save if best\n",
    "    if val_loss < best_val_loss:\n",
    "        best_val_loss = val_loss\n",
    "        save_checkpoint(net, optimizer, epoch, best_model_path)\n",
    "        print(f\"✅ New best model! Val loss: {val_loss:.4f}\")\n",
    "\n",
    "    print(f\"Epoch {epoch+1}: Train={train_loss:.4f}, Val={val_loss:.4f}\")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Resuming Training\n",
    "\n",
    "Continue training from a checkpoint."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def load_checkpoint(filepath, net, optimizer=None):\n",
    "    \"\"\"Load checkpoint and restore state.\"\"\"\n",
    "    with open(filepath, 'rb') as f:\n",
    "        state_dict = pickle.load(f)\n",
    "\n",
    "    # Restore model parameters\n",
    "    params = net.states(brainstate.ParamState)\n",
    "    for name, state in state_dict['params'].items():\n",
    "        if name in params:\n",
    "            params[name].value = state.value\n",
    "\n",
    "    # Restore long-term states\n",
    "    if 'long_term' in state_dict:\n",
    "        long_term = net.states(brainstate.LongTermState)\n",
    "        for name, state in state_dict['long_term'].items():\n",
    "            if name in long_term:\n",
    "                long_term[name].value = state.value\n",
    "\n",
    "    # Restore optimizer state\n",
    "    if optimizer is not None and state_dict.get('optimizer_state') is not None:\n",
    "        optimizer.load_state_dict(state_dict['optimizer_state'])\n",
    "\n",
    "    start_epoch = state_dict.get('epoch', 0) + 1\n",
    "    return start_epoch\n",
    "\n",
    "# Resume training\n",
    "net = MyNetwork()\n",
    "brainstate.nn.init_all_states(net)\n",
    "optimizer = braintools.optim.Adam(lr=1e-3)\n",
    "optimizer.register_trainable_weights(net.states(brainstate.ParamState))\n",
    "\n",
    "# Load checkpoint\n",
    "start_epoch = load_checkpoint('checkpoint_epoch_50.pkl', net, optimizer)\n",
    "\n",
    "# Continue training from where we left off\n",
    "for epoch in range(start_epoch, num_epochs):\n",
    "    for batch in train_loader:\n",
    "        train_step(net, optimizer, batch)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Advanced Saving Strategies\n",
    "\n",
    "### Versioned Checkpoints\n",
    "\n",
    "Keep multiple checkpoints without overwriting."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from datetime import datetime\n",
    "\n",
    "def save_versioned_checkpoint(net, epoch, base_dir='checkpoints'):\n",
    "    \"\"\"Save checkpoint with timestamp.\"\"\"\n",
    "    timestamp = datetime.now().strftime('%Y%m%d_%H%M%S')\n",
    "    filename = f'model_epoch{epoch}_{timestamp}.pkl'\n",
    "    filepath = os.path.join(base_dir, filename)\n",
    "\n",
    "    state_dict = {\n",
    "        'params': net.states(brainstate.ParamState),\n",
    "        'epoch': epoch,\n",
    "        'timestamp': timestamp,\n",
    "    }\n",
    "\n",
    "    with open(filepath, 'wb') as f:\n",
    "        pickle.dump(state_dict, f)\n",
    "\n",
    "    return filepath"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Keep Last N Checkpoints\n",
    "\n",
    "Automatically delete old checkpoints to save disk space."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import glob\n",
    "\n",
    "def save_with_cleanup(net, epoch, checkpoint_dir='checkpoints', keep_last=5):\n",
    "    \"\"\"Save checkpoint and keep only last N.\"\"\"\n",
    "\n",
    "    # Save new checkpoint\n",
    "    filepath = f'{checkpoint_dir}/epoch_{epoch:04d}.pkl'\n",
    "    save_checkpoint(net, None, epoch, filepath)\n",
    "\n",
    "    # Get all checkpoints\n",
    "    checkpoints = sorted(glob.glob(f'{checkpoint_dir}/epoch_*.pkl'))\n",
    "\n",
    "    # Delete old ones\n",
    "    if len(checkpoints) > keep_last:\n",
    "        for old_checkpoint in checkpoints[:-keep_last]:\n",
    "            os.remove(old_checkpoint)\n",
    "            print(f\"Removed old checkpoint: {old_checkpoint}\")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Model Export for Deployment\n",
    "\n",
    "### Minimal Model File\n",
    "\n",
    "Save only what's needed for inference."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def export_for_inference(net, filepath, metadata=None):\n",
    "    \"\"\"Export minimal model for inference.\"\"\"\n",
    "\n",
    "    export_dict = {\n",
    "        'params': net.states(brainstate.ParamState),\n",
    "        'config': {\n",
    "            # Only architecture info, no training state\n",
    "            'model_type': net.__class__.__name__,\n",
    "            # ... architecture hyperparameters\n",
    "        }\n",
    "    }\n",
    "\n",
    "    if metadata:\n",
    "        export_dict['metadata'] = metadata\n",
    "\n",
    "    with open(filepath, 'wb') as f:\n",
    "        pickle.dump(export_dict, f)\n",
    "\n",
    "    # Report size\n",
    "    size_mb = os.path.getsize(filepath) / (1024 * 1024)\n",
    "    print(f\"📦 Exported model: {size_mb:.2f} MB\")\n",
    "\n",
    "# Export trained model\n",
    "export_for_inference(\n",
    "    net,\n",
    "    'deployed_model.pkl',\n",
    "    metadata={\n",
    "        'description': 'LIF network for digit classification',\n",
    "        'accuracy': 0.95,\n",
    "        'date': datetime.now().isoformat()\n",
    "    }\n",
    ")"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "### Loading for Inference"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "def load_for_inference(filepath, model_class):\n",
    "    \"\"\"Load model for inference only.\"\"\"\n",
    "\n",
    "    with open(filepath, 'rb') as f:\n",
    "        export_dict = pickle.load(f)\n",
    "\n",
    "    # Create model from config (drop bookkeeping keys that are not constructor args)\n",
    "    config = {k: v for k, v in export_dict['config'].items() if k != 'model_type'}\n",
    "    net = model_class(**config)  # Must match the saved architecture\n",
    "    brainstate.nn.init_all_states(net)\n",
    "\n",
    "    # Load parameters\n",
    "    params = net.states(brainstate.ParamState)\n",
    "    for name, state in export_dict['params'].items():\n",
    "        params[name].value = state.value\n",
    "\n",
    "    return net, export_dict.get('metadata')\n",
    "\n",
    "# Load and use\n",
    "net, metadata = load_for_inference('deployed_model.pkl', MyNetwork)\n",
    "print(f\"Loaded model: {metadata['description']}\")\n",
    "\n",
    "# Run inference (the loader already initialized states and restored parameters;\n",
    "# re-running init_all_states here would overwrite the loaded weights)\n",
    "output = net(input_data)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Best Practices\n",
    "\n",
    "✅ **Always save configuration** - Include hyperparameters for reproducibility\n",
    "\n",
    "✅ **Version your checkpoints** - Track model version for compatibility\n",
    "\n",
    "✅ **Save metadata** - Include training metrics, date, description\n",
    "\n",
    "✅ **Regular backups** - Save periodically during long training\n",
    "\n",
    "✅ **Keep best model** - Separate best and latest checkpoints\n",
    "\n",
    "✅ **Test loading** - Verify checkpoint can be loaded before continuing\n",
    "\n",
    "✅ **Use relative paths** - Make checkpoints portable\n",
    "\n",
    "✅ **Document format** - Comment what's in your checkpoint files\n",
    "\n",
    "❌ **Don't save ShortTermState** - It resets anyway\n",
    "\n",
    "❌ **Don't save everything** - Minimize checkpoint size\n",
    "\n",
    "❌ **Don't overwrite** - Keep multiple checkpoints for safety"
   ]
  },
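  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The \"test loading\" practice above is easy to automate: reload every checkpoint right after writing it and compare the arrays against the in-memory values. A minimal sketch using plain numpy arrays in place of BrainPy states:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "import pickle\n",
    "import numpy as np\n",
    "\n",
    "def save_and_verify(params, filepath):\n",
    "    \"\"\"Save params, then reload and compare, catching corrupt checkpoints early.\"\"\"\n",
    "    with open(filepath, 'wb') as f:\n",
    "        pickle.dump(params, f)\n",
    "    with open(filepath, 'rb') as f:\n",
    "        reloaded = pickle.load(f)\n",
    "    for name, value in params.items():\n",
    "        if not np.array_equal(reloaded[name], value):\n",
    "            raise RuntimeError(f'Checkpoint mismatch for {name!r}')\n",
    "    return filepath\n",
    "\n",
    "demo_params = {'w': np.random.rand(3, 3), 'b': np.zeros(3)}\n",
    "save_and_verify(demo_params, 'verified.pkl')\n",
    "print('checkpoint verified')"
   ]
  },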
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Summary\n",
    "\n",
    "**Quick reference:**"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Save\n",
    "checkpoint = {\n",
    "    'params': net.states(brainstate.ParamState),\n",
    "    'epoch': epoch,\n",
    "    'config': net.get_config()  # assumes your model implements get_config()\n",
    "}\n",
    "with open('checkpoint.pkl', 'wb') as f:\n",
    "    pickle.dump(checkpoint, f)\n",
    "\n",
    "# Load\n",
    "with open('checkpoint.pkl', 'rb') as f:\n",
    "    checkpoint = pickle.load(f)\n",
    "\n",
    "net = MyNetwork.from_config(checkpoint['config'])  # assumes a from_config classmethod\n",
    "brainstate.nn.init_all_states(net)\n",
    "\n",
    "params = net.states(brainstate.ParamState)\n",
    "for name, state in checkpoint['params'].items():\n",
    "    params[name].value = state.value"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## See Also\n",
    "\n",
    "- Core Concepts: State Management\n",
    "- Tutorials: SNN Training\n",
    "- GPU/TPU Usage"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.8.0"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 4
}
