Webhooks are now publicly available!
Here’s a short guide on how to use Hugging Face Webhooks to build a bot that replies to Discussion comments on the Hub with a response generated by BLOOM, a multilingual language model, using the free Inference API.
First, let’s create a Webhook from your settings.
Your Webhook will look like this:
In this guide, we create a separate user account to host a Space and to post comments:
The next step is to listen to the Webhook events.
An easy way is to use a Space for this. We use the separate user account we just created, but you could do it from your main user account if you wanted to.
The Space’s code is here.
We used NodeJS and TypeScript to implement it, but any language or framework would work equally well. Read more about Docker Spaces here.
The main server.ts file is here.
Let’s walk through what happens in this file:
app.post("/", async (req, res) => {
if (req.header("X-Webhook-Secret") !== process.env.WEBHOOK_SECRET) {
console.error("incorrect secret");
return res.status(400).json({ error: "incorrect secret" });
}
...
Here, we listen to POST requests made to /, and then check that the X-Webhook-Secret header is equal to the secret we previously defined (you also need to set the WEBHOOK_SECRET secret in your Space's settings to be able to verify it).
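As a side note, a plain string comparison like the one above can in principle leak information through timing. A hardened version of the same check could use Node's crypto.timingSafeEqual; this is a sketch, and the helper name isAuthorized is ours, not part of the Space's code:

```typescript
import { timingSafeEqual } from "crypto";

// Hypothetical helper: constant-time comparison of the received header
// against the expected secret. timingSafeEqual throws on buffers of
// different lengths, so we guard against that case first.
function isAuthorized(received: string | undefined, secret: string): boolean {
  if (!received || received.length !== secret.length) {
    return false;
  }
  return timingSafeEqual(Buffer.from(received), Buffer.from(secret));
}

// Inside the handler, the check would then read:
// if (!isAuthorized(req.header("X-Webhook-Secret"), process.env.WEBHOOK_SECRET)) { ... }
```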
const event = req.body.event;
if (
event.action === "create" &&
event.scope === "discussion.comment" &&
req.body.comment.content.includes(BOT_USERNAME)
) {
...
The event’s payload is encoded as JSON. Here, we specify that we will run our Webhook only when:

- the action is "create"
- the scope is "discussion.comment"
- the comment's content contains @discussion-bot, i.e. our bot was just mentioned in a comment.

In that case, we will continue to the next step:
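The filter above can be sketched as a standalone function, which makes it easy to test without a running server. The interface and function names here are ours, and the payload shape is simplified to just the fields we inspect:

```typescript
const BOT_USERNAME = "@discussion-bot";

// Minimal, assumed shape of the parts of the Webhook payload we inspect.
interface WebhookPayload {
  event: { action: string; scope: string };
  comment?: { content: string };
}

// Returns true only for newly created discussion comments that mention the bot.
function shouldReply(payload: WebhookPayload): boolean {
  return (
    payload.event.action === "create" &&
    payload.event.scope === "discussion.comment" &&
    (payload.comment?.content.includes(BOT_USERNAME) ?? false)
  );
}
```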
const INFERENCE_URL =
"https://api-inference.huggingface.co/models/bigscience/bloom";
const PROMPT = `Pretend that you are a bot that replies to discussions about machine learning, and reply to the following comment:\n`;
const response = await fetch(INFERENCE_URL, {
method: "POST",
body: JSON.stringify({ inputs: PROMPT + req.body.comment.content }),
});
if (response.ok) {
const output = await response.json();
const continuationText = output[0].generated_text.replace(
PROMPT + req.body.comment.content,
""
);
...
This is the coolest part: we call the Inference API for the BLOOM model, prompting it with PROMPT, and we get the continuation text, i.e., the part generated by the model.
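Since the model echoes the full input back at the start of generated_text, the replace call strips it off to keep only the continuation. Factored out as a standalone sketch (the helper name extractContinuation is ours):

```typescript
// Hypothetical helper: BLOOM's generated_text begins with the full input
// (prompt + comment), so removing that prefix leaves only the model's
// continuation, i.e. the text we want to post as a reply.
function extractContinuation(
  generatedText: string,
  prompt: string,
  comment: string
): string {
  return generatedText.replace(prompt + comment, "");
}
```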
Finally, we will post it as a reply in the same discussion thread:
const commentUrl = req.body.discussion.url.api + "/comment";
const commentApiResponse = await fetch(commentUrl, {
method: "POST",
headers: {
Authorization: `Bearer ${process.env.HF_TOKEN}`,
"Content-Type": "application/json",
},
body: JSON.stringify({ comment: continuationText }),
});
const apiOutput = await commentApiResponse.json();
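The request above can also be factored into a small pure helper that builds the URL and fetch options, which makes the Authorization header and JSON body easy to test in isolation. This is a sketch; the helper name buildCommentRequest and its return shape are ours:

```typescript
// Hypothetical helper: builds the URL and fetch options for posting a reply
// to a discussion, given its API URL, an HF access token, and the reply text.
function buildCommentRequest(
  discussionApiUrl: string,
  token: string,
  text: string
): { url: string; init: { method: string; headers: Record<string, string>; body: string } } {
  return {
    url: discussionApiUrl + "/comment",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ comment: text }),
    },
  };
}
```

The handler would then call fetch(r.url, r.init) with the result.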
Last but not least, you’ll need to configure your Webhook to send POST requests to your Space.
Let’s first grab our Space’s “direct URL” from the contextual menu. Click on “Embed this Space” and copy the “Direct URL”.
Update your webhook to send requests to that URL: