Availability of BioMedLM: Model not loading

#4
by Astrix - opened

I am attempting to test the availability of this model with the provided sample code on Hugging Face through:

  1. the hosted Inference API
  2. Deploy → Inference API

In either case, the model does not load.
Is the model still available?
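
For reference, this is roughly the kind of request the hosted widget issues under the hood; a minimal sketch in Python, where the repo ID stanford-crfm/BioMedLM and the HF_TOKEN environment variable are assumptions for illustration:

    # Direct call to the hosted Inference API (sketch; repo ID assumed).
    import os
    import requests

    API_URL = "https://api-inference.huggingface.co/models/stanford-crfm/BioMedLM"
    headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}

    response = requests.post(
        API_URL,
        headers=headers,
        json={"inputs": "Photosynthesis is"},
    )
    # A 503 "model is currently loading" response that never resolves is the
    # symptom described above.
    print(response.status_code, response.text)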

[Screenshot attached: Screenshot 2023-05-31 at 12.01.10 PM.png]

Were you able to get it to work? I'm still struggling with the same issue.

Stanford CRFM org

The model's still there. It's never really worked well with HF's hosted inference stuff; I think it's slightly too big?
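
If the goal is just to confirm the checkpoint itself is still downloadable, loading it locally is one way to check. A minimal sketch, assuming the checkpoint at stanford-crfm/BioMedLM loads with the standard transformers causal-LM classes and that enough memory is available (the model has roughly 2.7B parameters):

    # Local-loading sketch, as an alternative to the hosted widget.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "stanford-crfm/BioMedLM"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)  # ~11 GB in fp32

    inputs = tokenizer("Photosynthesis is", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=20)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))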

I tried the hosted inference API too and got the following errors:

  1. Validation Error:
    The first part of the error is an HFValidationError raised by the huggingface_hub library, meaning the repository ID (i.e. the model ID) it was given fails validation. Specifically, it says:

    Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name, max length is 96: '/repository'.

    In other words, a repo ID may contain only alphanumeric characters plus '-', '_', and '.'; '--' and '..' are forbidden; '-' and '.' cannot start or end the name; and the length is capped at 96 characters. The value '/repository' is a local filesystem path rather than a Hub repo ID, so it fails these checks.

  2. Missing Config File:
    Next, the logs say ValueError: Can't find 'adapter_config.json' at '/repository'. adapter_config.json is the config file for a PEFT adapter, so this suggests the inference backend is trying to treat the contents of '/repository' as an adapter; BioMedLM is a full model, so no such file exists at that path.
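
Both errors point to the same thing: the string '/repository' (apparently the local path where the inference container stores the downloaded model) is being passed where a Hub repo ID is expected. A minimal sketch that reproduces the first error locally, assuming a recent version of huggingface_hub that exposes validate_repo_id and HFValidationError under huggingface_hub.utils:

    # Sketch only: shows why '/repository' trips repo-ID validation.
    from huggingface_hub.utils import HFValidationError, validate_repo_id

    for candidate in ["/repository", "stanford-crfm/BioMedLM"]:
        try:
            validate_repo_id(candidate)
            print(f"{candidate!r}: valid repo ID")
        except HFValidationError as err:
            print(f"{candidate!r}: rejected -> {err}")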
