Uncaught Error: Unauthorized access to file: "https://huggingface.co/briaai/RMBG-1.4/resolve/main/onnx/model_quantized.onnx"

#5
by anmolbansal02 - opened

Getting this error in the console of this demo, I guess because briaai/RMBG-1.4 is a gated model. We can't access it without an HF_TOKEN, but the docs mention that HF_TOKEN is not supported in the browser (https://huggingface.co/docs/transformers.js/en/guides/private). So I'm kind of stuck here. Any help is appreciated.

error:

Uncaught Error: Unauthorized access to file: "https://huggingface.co/briaai/RMBG-1.4/resolve/main/onnx/model_quantized.onnx".
    at handleError (index-E_M5nW8h.js:1790:16683)
    at getModelFile (index-E_M5nW8h.js:1790:18843)
    at async constructSession (index-E_M5nW8h.js:1790:48926)
    at async Promise.all (xenova-remove-background-web.static.hf.space/index 1)
    at async PreTrainedModel.from_pretrained (index-E_M5nW8h.js:1790:57065)
    at async AutoModel.from_pretrained (index-E_M5nW8h.js:1790:100744)
    at async index-E_M5nW8h.js:1790:152852
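
For reference, the call that seems to trigger the download (a minimal sketch based on the stack trace above; the demo's actual loading options may differ):

import { AutoModel } from '@xenova/transformers';

// from_pretrained tries to fetch onnx/model_quantized.onnx from the Hub,
// which now responds with an authorization error because the repo is gated.
const model = await AutoModel.from_pretrained('briaai/RMBG-1.4');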

Thanks for reporting! It looks like they recently made their models gated for some reason. I would like to upload the ONNX weights separately in the meantime, but I'm awaiting a response from the model authors (@mokady).

Thanks for getting back so quickly. Please share the code solution too, as I'm also integrating this demo into my side project :)

Btw, I still think there should be a way to supply the HF_TOKEN from the browser so that we can use gated models on the client side as well.

Thanks for getting back so quickly. Please share the code solution too, as I'm also integrating this demo into my side project :)

You can find the source code here: https://github.com/xenova/transformers.js/tree/main/examples/remove-background-client

You just need to clone the repo (https://huggingface.co/briaai/RMBG-1.4) and then update the references in the code to point to your new copy of the model (most likely hosted locally).

Btw, I still think there should be a way to supply the HF_TOKEN from the browser so that we can use gated models on the client side as well.

I don't see how this can be done safely, unfortunately, since it means exposing your token to users of the web-app.

You just need to clone the repo (https://huggingface.co/briaai/RMBG-1.4) and then update the references in the code to point to your new copy of the model (most likely hosted locally).

What do you mean by updating references? Sorry, I'm very new to using AI models; it would really help if you could point me to any resource I can refer to.

I don't see how this can be done safely, unfortunately, since it means exposing your token to users of the web-app.

Right, but let's suppose I want to make an application in which I ask the user to input their HF_TOKEN so that they can use the model locally in their browser; right now I don't have a way to do so. We could mention a warning in the docs about the consequences of exposing the token, but there should at least be a way to use gated models directly from the client side.
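
Roughly what I have in mind, just as a sketch (transformers.js doesn't expose a token option today, and this assumes the user-supplied token has been granted access to the gated repo; the token is of course visible to anyone inspecting the page or its network requests):

// Sketch only: supplying a user-provided token from the browser.
const token = window.prompt('Enter your Hugging Face access token'); // hypothetical input flow

const response = await fetch(
  'https://huggingface.co/briaai/RMBG-1.4/resolve/main/onnx/model_quantized.onnx',
  { headers: { Authorization: `Bearer ${token}` } }
);
const weights = await response.arrayBuffer(); // the raw ONNX weights, if the token has access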

What do you mean by updating references?

I just mean update the code to use the local models. This would involve:

  1. Removing this line
  2. Cloning their repo: https://huggingface.co/briaai/RMBG-1.4 into a publicly-accessible folder. The folder structure should remain the same as the repo, with config.json at the root and a subfolder called onnx with all the weights. I would recommend putting it in something like /public/models/briaai/RMBG-1.4.
  3. Updating the local model path so that it points to the new models:
import { env } from '@xenova/transformers';

// Specify a custom location for models (defaults to '/models/').
env.localModelPath = '/path/to/models/';
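
Putting it together, a rough sketch of what loading the locally hosted copy could look like (the /models/ path and the allowRemoteModels toggle are just one way to set this up; the demo's actual loading options may differ):

import { env, AutoModel, AutoProcessor } from '@xenova/transformers';

// Assumed layout: /public/models/briaai/RMBG-1.4/ containing config.json,
// preprocessor_config.json, and an onnx/ subfolder with the weights (same structure as the repo).
env.localModelPath = '/models/';   // served from the public folder at the site root
env.allowRemoteModels = false;     // don't fall back to the (now gated) Hub repo

// The model id now resolves against /models/ instead of the Hub.
const model = await AutoModel.from_pretrained('briaai/RMBG-1.4');
const processor = await AutoProcessor.from_pretrained('briaai/RMBG-1.4');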

Hope that helps!

Right, but let's suppose I want to make an application in which I ask the user to input their HF_TOKEN so that they can use the model locally in their browser; right now I don't have a way to do so.

This is a use-case I actually hadn't thought of. Feel free to open a feature request here, and we can continue the discussion there (with others' involvement too). I'm still a bit hesitant, but it could be something to look into.

Thanks for sharing the code snippet.

Yes, I'll go ahead and open a feature request.

Opened here
