{
 "cells": [
  {
   "cell_type": "markdown",
   "id": "b9ffab25",
   "metadata": {},
   "source": [
    "# Downloading via Transformers\n",
    "\n",
    "This notebook demonstrates:\n",
    "- Downloading a repo snapshot to a chosen directory with huggingface_hub\n",
    "- Loading the tokenizer and model from the local directory with Transformers (usable offline)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "4c3b5cf4",
   "metadata": {},
   "source": [
    "## 1. Environment Check and Login Notes"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 5,
   "id": "b8ef8fcf",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "huggingface_hub: 0.36.0\n",
      "transformers: 4.57.1\n",
      "HF token configured: True\n"
     ]
    }
   ],
   "source": [
    "import os\n",
    "from pathlib import Path\n",
    "\n",
    "try:\n",
    "    import huggingface_hub as hfh\n",
    "    from huggingface_hub import snapshot_download\n",
    "    print('huggingface_hub:', hfh.__version__)\n",
    "except ImportError:\n",
    "    print('Please install huggingface_hub first: pip install huggingface_hub')\n",
    "    raise\n",
    "\n",
    "try:\n",
    "    import transformers\n",
    "    from transformers import AutoTokenizer, AutoModelForCausalLM\n",
    "    print('transformers:', transformers.__version__)\n",
    "except ImportError:\n",
    "    print('Please install transformers first: pip install transformers')\n",
    "    raise\n",
    "\n",
    "print('HF token configured:', bool(os.getenv('HUGGINGFACE_HUB_TOKEN') or os.getenv('HF_TOKEN')))\n",
    "# To log in: run  huggingface-cli login  in a terminal, or set the HUGGINGFACE_HUB_TOKEN (or HF_TOKEN) environment variable"
   ]
  },
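  {
   "cell_type": "markdown",
   "id": "a1e0c9d2",
   "metadata": {},
   "source": [
    "As a quick sanity check, the configured token can be validated against the Hub. A minimal sketch (needs network access): `whoami` raises when no valid token is available, so the call is wrapped in a try/except."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b2f1d0e3",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional sanity check: validate the token against the Hub\n",
    "from huggingface_hub import whoami\n",
    "\n",
    "try:\n",
    "    info = whoami()  # uses HUGGINGFACE_HUB_TOKEN / cached login\n",
    "    print('Logged in as:', info.get('name'))\n",
    "except Exception as e:\n",
    "    print('Not logged in (anonymous downloads only):', type(e).__name__)"
   ]
  },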
  {
   "cell_type": "markdown",
   "id": "6b5ae7f7",
   "metadata": {},
   "source": [
    "## 2. Set the Repo and Local Save Path"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 7,
   "id": "5cf53f6e",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Repo: Qwen/Qwen3-1.7B\n",
      "Local dir: E:\\huggingface_models\\qwen\\Qwen3-1.7B\n"
     ]
    }
   ],
   "source": [
    "# Transformers repo id for the Qwen3 model (defaults to Qwen/Qwen3-1.7B; if you hit a 404, search the HF Hub for the current repo name)\n",
    "REPO_ID = os.getenv('QWEN3_8B_REPO', 'Qwen/Qwen3-1.7B')\n",
    "# Target path: E:/huggingface_models/qwen/Qwen3-1.7B (root directory can be overridden via an environment variable)\n",
    "ROOT_DIR = Path(os.getenv('HF_LOCAL_ROOT', 'E:/huggingface_models'))\n",
    "LOCAL_DIR = (ROOT_DIR / 'qwen' / 'Qwen3-1.7B').resolve()\n",
    "LOCAL_DIR.mkdir(parents=True, exist_ok=True)\n",
    "print('Repo:', REPO_ID)\n",
    "print('Local dir:', LOCAL_DIR)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "ac83d9f0",
   "metadata": {},
   "source": [
    "## 2.5 Proxy Setup (Optional)\n",
    "If your network needs a proxy to reach Hugging Face, configure it here.\n",
    "Priority order: `HF_PROXY_OVERRIDE` (manual) > `HF_PROXY` (custom) > existing `HTTPS_PROXY/HTTP_PROXY/ALL_PROXY` environment variables.\n",
    "You can also fill in a proxy address directly below, e.g. `http://127.0.0.1:7890`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 8,
   "id": "fe61a8ab",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Proxy enabled -> http://127.0.0.1:7897\n",
      "huggingface_hub HTTP backend configured (env proxies will be used).\n"
     ]
    }
   ],
   "source": [
    "# Optional: set a proxy (environment variables are recommended, but a manual override is possible here)\n",
    "import os\n",
    "from urllib.parse import urlparse\n",
    "\n",
    "# 1) Read possible proxy sources\n",
    "proxy_env = os.getenv('HF_PROXY') or os.getenv('HTTPS_PROXY') or os.getenv('HTTP_PROXY') or os.getenv('ALL_PROXY')\n",
    "\n",
    "# 2) Manual override: set HF_PROXY_OVERRIDE or edit the default below (use '' to disable the override)\n",
    "manual_proxy = os.getenv('HF_PROXY_OVERRIDE', 'http://127.0.0.1:7897')\n",
    "\n",
    "# 3) Final proxy address (the manual override takes precedence)\n",
    "PROXY = (manual_proxy or proxy_env or '').strip()\n",
    "\n",
    "def _norm_proxy(p):\n",
    "    if not p:\n",
    "        return ''\n",
    "    parsed = urlparse(p)\n",
    "    # If the scheme is missing, default to http://\n",
    "    if not parsed.scheme:\n",
    "        return f'http://{p}'\n",
    "    return p\n",
    "\n",
    "if PROXY:\n",
    "    PROXY = _norm_proxy(PROXY)\n",
    "    # Set both upper- and lowercase variables so requests/httpx and similar libraries pick them up\n",
    "    os.environ['HTTP_PROXY'] = PROXY\n",
    "    os.environ['HTTPS_PROXY'] = PROXY\n",
    "    os.environ['ALL_PROXY'] = PROXY\n",
    "    os.environ['http_proxy'] = PROXY\n",
    "    os.environ['https_proxy'] = PROXY\n",
    "    os.environ['all_proxy'] = PROXY\n",
    "    print('Proxy enabled ->', PROXY)\n",
    "else:\n",
    "    print('Proxy not set; downloading without proxy.')\n",
    "\n",
    "# 4) Reconfigure huggingface_hub's HTTP backend (if the API is available)\n",
    "try:\n",
    "    from huggingface_hub.utils import configure_http_backend  # public API in recent releases\n",
    "    # The requests-based backend respects environment proxies; calling this ensures\n",
    "    # the backend is (re)initialized after the env vars above were set.\n",
    "    configure_http_backend()\n",
    "    print('huggingface_hub HTTP backend configured (env proxies will be used).')\n",
    "except Exception:\n",
    "    # Ignore on older versions without this API\n",
    "    pass"
   ]
  },
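  {
   "cell_type": "markdown",
   "id": "c3a2e1f4",
   "metadata": {},
   "source": [
    "A cheap way to confirm the proxy settings actually reach the Hub before starting the large download is a lightweight metadata call. A minimal sketch, assuming `REPO_ID` from section 2:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "d4b3f2a5",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional: verify Hub connectivity through the configured proxy with a metadata request\n",
    "from huggingface_hub import HfApi\n",
    "\n",
    "try:\n",
    "    info = HfApi().model_info(REPO_ID)\n",
    "    print('Hub reachable; repo revision:', info.sha)\n",
    "except Exception as e:\n",
    "    print('Hub not reachable:', type(e).__name__, e)"
   ]
  },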
  {
   "cell_type": "markdown",
   "id": "bbe827cc",
   "metadata": {},
   "source": [
    "## 3. Download the Repo Snapshot Locally (Resumable)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 10,
   "id": "0ef9c44e",
   "metadata": {},
   "outputs": [
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "Fetching 12 files:   0%|          | 0/12 [00:00<?, ?it/s]Xet Storage is enabled for this repo, but the 'hf_xet' package is not installed. Falling back to regular HTTP download. For better performance, install the package with: `pip install huggingface_hub[hf_xet]` or `pip install hf_xet`\n",
      "Fetching 12 files: 100%|██████████| 12/12 [00:31<00:00,  2.59s/it]"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Snapshot saved to: E:\\huggingface_models\\qwen\\Qwen3-1.7B\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "\n"
     ]
    }
   ],
   "source": [
    "# Full download with no file-type filtering; if disk space is limited, use allow_patterns to filter\n",
    "HF_TOKEN = os.getenv('HUGGINGFACE_HUB_TOKEN') or os.getenv('HF_TOKEN')\n",
    "if not HF_TOKEN:\n",
    "    print('Warning: no HF token detected; attempting an anonymous download (may fail on gated repos).')\n",
    "\n",
    "# Note: resume_download and local_dir_use_symlinks are deprecated in recent huggingface_hub\n",
    "# releases; downloads into local_dir resume automatically and no symlinks are used.\n",
    "snapshot_path = snapshot_download(\n",
    "    repo_id=REPO_ID,\n",
    "    repo_type=\"model\",\n",
    "    local_dir=str(LOCAL_DIR),\n",
    "    token=HF_TOKEN  # pass the access token explicitly (None = anonymous)\n",
    ")\n",
    "print('Snapshot saved to:', snapshot_path)"
   ]
  },
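  {
   "cell_type": "markdown",
   "id": "e5c4a3b6",
   "metadata": {},
   "source": [
    "If disk space is limited, the same download can be restricted to essential files with `allow_patterns`. A minimal sketch (the pattern list is illustrative; `REPO_ID`, `LOCAL_DIR`, and `HF_TOKEN` come from the cells above):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "f6d5b4c7",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional space-saving variant: fetch only config, tokenizer, and safetensors files\n",
    "filtered_path = snapshot_download(\n",
    "    repo_id=REPO_ID,\n",
    "    repo_type='model',\n",
    "    local_dir=str(LOCAL_DIR),\n",
    "    allow_patterns=['*.json', '*.safetensors', 'tokenizer.*', '*.txt'],\n",
    "    token=HF_TOKEN,\n",
    ")\n",
    "print('Filtered snapshot saved to:', filtered_path)"
   ]
  },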
  {
   "cell_type": "markdown",
   "id": "fa0637d4",
   "metadata": {},
   "source": [
    "## 4. Verify Downloaded Files and Sizes"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 11,
   "id": "5bb1b0b6",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "config.json\t0.00 MB\n",
      "generation_config.json\t0.00 MB\n",
      "merges.txt\t1.59 MB\n",
      "model-00001-of-00002.safetensors\t3281.77 MB\n",
      "model-00002-of-00002.safetensors\t593.50 MB\n",
      "model.safetensors.index.json\t0.03 MB\n",
      "tokenizer.json\t10.90 MB\n",
      "tokenizer_config.json\t0.01 MB\n",
      "vocab.json\t2.65 MB\n",
      "\n",
      "Total size (approx): 3890.49 MB\n"
     ]
    }
   ],
   "source": [
    "from pathlib import Path\n",
    "\n",
    "total_mb = 0.0\n",
    "for p in sorted(Path(LOCAL_DIR).rglob('*')):\n",
    "    if p.is_file():\n",
    "        size_mb = p.stat().st_size/1024/1024\n",
    "        total_mb += size_mb\n",
    "        if p.suffix in {'.bin', '.safetensors', '.json', '.model', '.txt'}:\n",
    "            print(f\"{p.relative_to(LOCAL_DIR)}\\t{size_mb:.2f} MB\")\n",
    "print(f\"\\nTotal size (approx): {total_mb:.2f} MB\")"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "64dd865c",
   "metadata": {},
   "source": [
    "## 5. Load from the Local Directory (Offline-Capable)"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 12,
   "id": "6652865a",
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Loading from: E:\\huggingface_models\\qwen\\Qwen3-1.7B\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "Loading checkpoint shards: 100%|██████████| 2/2 [00:02<00:00,  1.28s/it]"
     ]
    },
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Loaded tokenizer and model.\n"
     ]
    },
    {
     "name": "stderr",
     "output_type": "stream",
     "text": [
      "\n"
     ]
    }
   ],
   "source": [
    "# Note: the first load can use substantial memory; run in an environment with enough RAM/GPU memory\n",
    "from transformers import AutoTokenizer, AutoModelForCausalLM\n",
    "\n",
    "local_path = str(LOCAL_DIR)\n",
    "print('Loading from:', local_path)\n",
    "\n",
    "tokenizer = AutoTokenizer.from_pretrained(local_path, use_fast=True)\n",
    "model = AutoModelForCausalLM.from_pretrained(local_path)\n",
    "print('Loaded tokenizer and model.')"
   ]
  },
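  {
   "cell_type": "markdown",
   "id": "a7e6c5d8",
   "metadata": {},
   "source": [
    "To confirm the local weights work end to end, a short generation smoke test can be run. A minimal sketch; the prompt and decoding settings are illustrative only:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b8f7d6e9",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional smoke test: short greedy generation with the locally loaded model\n",
    "prompt = 'Hello, my name is'\n",
    "inputs = tokenizer(prompt, return_tensors='pt')\n",
    "outputs = model.generate(**inputs, max_new_tokens=20, do_sample=False)\n",
    "print(tokenizer.decode(outputs[0], skip_special_tokens=True))"
   ]
  },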
  {
   "cell_type": "markdown",
   "id": "10105b4c",
   "metadata": {},
   "source": [
    "## 6. Usage Notes and Options\n",
    "- If the repo name changes, update `REPO_ID` in section 2 (or set the `QWEN3_8B_REPO` environment variable).\n",
    "- Override the root directory by pointing `HF_LOCAL_ROOT` at a custom disk path.\n",
    "- If the repo requires accepting a license or logging in, run `huggingface-cli login` first or set `HUGGINGFACE_HUB_TOKEN`.\n",
    "- If disk space is limited, pass `allow_patterns` to `snapshot_download` in section 3 to fetch only the necessary files (e.g. `*.json`, `*.safetensors`, `tokenizer.*`).\n",
    "- Load speed/VRAM: combine with `device_map='auto'`, `torch_dtype='auto'`, or low-bit loading via `bitsandbytes` (extra dependencies required)."
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "base",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.13.5"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}
