{"guide": {"name": "batch-functions", "category": "additional-features", "pretty_category": "Additional Features", "guide_index": 6, "absolute_index": 19, "pretty_name": "Batch Functions", "content": "# Batch functions\n\nGradio supports the ability to pass _batch_ functions. Batch functions are just\nfunctions which take in a list of inputs and return a list of predictions.\n\nFor example, here is a batched function that takes in two lists of inputs (a list of\nwords and a list of ints), and returns a list of trimmed words as output:\n\n```py\nimport time\n\ndef trim_words(words, lens):\n trimmed_words = []\n time.sleep(5)\n for w, l in zip(words, lens):\n trimmed_words.append(w[:int(l)])\n return [trimmed_words]\n```\n\nThe advantage of using batched functions is that if you enable queuing, the Gradio server can automatically _batch_ incoming requests and process them in parallel,\npotentially speeding up your demo. Here's what the Gradio code looks like (notice the `batch=True` and `max_batch_size=16`)\n\nWith the `gr.Interface` class:\n\n```python\ndemo = gr.Interface(\n fn=trim_words, \n inputs=[\"textbox\", \"number\"], \n outputs=[\"output\"],\n batch=True, \n max_batch_size=16\n)\n\ndemo.launch()\n```\n\nWith the `gr.Blocks` class:\n\n```py\nimport gradio as gr\n\nwith gr.Blocks() as demo:\n with gr.Row():\n word = gr.Textbox(label=\"word\")\n leng = gr.Number(label=\"leng\")\n output = gr.Textbox(label=\"Output\")\n with gr.Row():\n run = gr.Button()\n\n event = run.click(trim_words, [word, leng], output, batch=True, max_batch_size=16)\n\ndemo.launch()\n```\n\nIn the example above, 16 requests could be processed in parallel (for a total inference time of 5 seconds), instead of each request being processed separately (for a total\ninference time of 80 seconds). Many Hugging Face `transformers` and `diffusers` models work very naturally with Gradio's batch mode: here's [an example demo using diffusers to\ngenerate images in batches](https://github.com/gradio-app/gradio/blob/main/demo/diffusers_with_batching/run.py)\n\n\n", "tags": [], "spaces": [], "url": "/guides/batch-functions/", "contributor": null}}