{
 "cells": [
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "# TensorRT-LLM Quick Start Guide\n",
    "\n",
    "## LLM API\n",
    "\n",
    "The LLM API is a Python API designed to streamline setting up and running TensorRT-LLM inference directly from Python. It lets you optimize a model simply by specifying a HuggingFace repository name or a local model checkpoint. The LLM API manages the entire workflow, including checkpoint conversion, engine building, engine loading, and model inference, through a single Python object."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "from tensorrt_llm import LLM, SamplingParams\n",
    "\n",
    "\n",
    "def main():\n",
    "    prompts = [\n",
    "        \"Hello, my name is\",\n",
    "        \"The president of the United States is\",\n",
    "        \"The capital of France is\",\n",
    "        \"The future of AI is\",\n",
    "    ]\n",
    "    # Configure sampling: temperature controls randomness, top_p enables nucleus sampling.\n",
    "    sampling_params = SamplingParams(temperature=0.8, top_p=0.95)\n",
    "\n",
    "    # Load the model; it is downloaded from the HuggingFace Hub if not cached locally.\n",
    "    llm = LLM(model=\"TinyLlama/TinyLlama-1.1B-Chat-v1.0\")\n",
    "\n",
    "    # Run batched generation over all prompts.\n",
    "    outputs = llm.generate(prompts, sampling_params)\n",
    "\n",
    "    # Print the outputs.\n",
    "    for output in outputs:\n",
    "        prompt = output.prompt\n",
    "        generated_text = output.outputs[0].text\n",
    "        print(f\"Prompt: {prompt!r}, Generated text: {generated_text!r}\")\n",
    "\n",
    "\n",
    "# The entry point of the program needs to be guarded so that worker processes can be spawned safely.\n",
    "if __name__ == '__main__':\n",
    "    main()"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "ai",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "name": "python",
   "version": "3.12.2"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 2
}
