Usage (Transformers.js)

If you haven't already, you can install the Transformers.js JavaScript library from NPM using:

```bash
npm i @huggingface/transformers
```

Example: selfie segmentation with `onnx-community/mediapipe_selfie_segmentation`.

```js
import { AutoModel, AutoProcessor, RawImage } from '@huggingface/transformers';

// Load model and processor
const model_id = 'onnx-community/mediapipe_selfie_segmentation';
const model = await AutoModel.from_pretrained(model_id, { dtype: 'fp32' });
const processor = await AutoProcessor.from_pretrained(model_id);

// Load image from URL
const url = 'https://huggingface.co/datasets/Xenova/transformers.js-docs/resolve/main/selfie_segmentation.png';
const image = await RawImage.read(url);

// Pre-process image
const inputs = await processor(image);

// Predict alpha matte
const { alphas } = await model(inputs);

// Save output mask
const mask = await RawImage.fromTensor(alphas[0].mul(255).to('uint8')).resize(image.width, image.height);
mask.save('mask.png');

// (Optional) Apply mask to original image
const result = image.clone().putAlpha(mask);
result.save('result.png');
```
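The final `putAlpha` step writes the predicted mask into the image's alpha channel, so background pixels become transparent. As a rough per-pixel illustration of what that compositing amounts to (a standalone sketch with a hypothetical helper, not Transformers.js internals):

```javascript
// Sketch: copy a grayscale mask (one 0-255 value per pixel) into the
// alpha byte of RGBA pixel data. Hypothetical helper for illustration.
function applyAlphaMask(rgba, mask) {
  if (rgba.length !== mask.length * 4) {
    throw new Error('mask must have exactly one value per RGBA pixel');
  }
  const out = Uint8ClampedArray.from(rgba);
  for (let i = 0; i < mask.length; i++) {
    out[i * 4 + 3] = mask[i]; // overwrite the alpha byte only
  }
  return out;
}

// One fully opaque red pixel, masked to roughly half transparency:
const pixels = new Uint8ClampedArray([255, 0, 0, 255]);
const masked = applyAlphaMask(pixels, [128]);
console.log(Array.from(masked)); // [255, 0, 0, 128]
```

Pixels where the model predicts an alpha near 255 (the person) stay opaque, while pixels near 0 (the background) become transparent in `result.png`.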
[Images: input image, predicted mask, output image]