Availability of BioMedLM: Model not loading
Were you able to get it to work? Still struggling with the same issue.
The model's still there. It's never really worked well with HF's hosted inference stuff. I think it's slightly too big?
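If the hosted widget is the blocker, loading the model locally sidesteps that path entirely. A minimal sketch, assuming the repo ID is `stanford-crfm/BioMedLM` and roughly 6 GB of free GPU memory for fp16 weights of a ~2.7B-parameter model:

```python
# Minimal local-loading sketch. Assumptions: the repo ID below is right,
# and there is enough memory for the fp16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stanford-crfm/BioMedLM"  # assumed repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # halves memory vs. float32
    device_map="auto",          # requires `accelerate`; spreads layers across devices
)

prompt = "Photosynthesis is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```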
I tried the hosted inference API too and got the following errors:
Validation Error:

The initial part of the error message is an `HFValidationError` thrown by the `huggingface_hub` library. This indicates that the repository ID (which is the model ID) does not adhere to the required validation rules. Specifically, it says: `Repo id must use alphanumeric chars or '-', '_', '.', '--' and '..' are forbidden, '-' and '.' cannot start or end the name, max length is 96: '/repository'`. This suggests that the repository ID should only contain alphanumeric characters, with `-`, `_`, and `.` allowed as long as they do not start or end the name, and that the maximum length of 96 characters must not be exceeded. The provided value `'/repository'` does not meet these criteria.
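You can reproduce that first error outside the endpoint, which confirms the problem is the repo ID string itself rather than the model. A small sketch, assuming a recent `huggingface_hub` where `validate_repo_id` is the helper that raises this error:

```python
# Reproduce the HFValidationError locally to confirm it's the repo ID string,
# not the model, that fails validation.
from huggingface_hub.utils import HFValidationError, validate_repo_id

for repo_id in ["stanford-crfm/BioMedLM", "/repository"]:
    try:
        validate_repo_id(repo_id)
        print(f"{repo_id!r}: valid")
    except HFValidationError as err:
        print(f"{repo_id!r}: {err}")
```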
Missing Config File:

Next, the logs say `ValueError: Can't find 'adapter_config.json' at '/repository'`. This indicates that the expected `adapter_config.json` file is not found at the given path. The path `'/repository'` seems to be incorrect or not the actual location of the config file.
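That second error reads like the backend is treating the repo as a PEFT-style adapter (adapter repos are what ship an `adapter_config.json`) rather than a full model. You can check which config files the repo actually contains with the public `huggingface_hub` API; the repo ID is again my assumption:

```python
# List the repo's files to see whether it is a full model (config.json)
# or a PEFT adapter (adapter_config.json).
from huggingface_hub import list_repo_files

files = list_repo_files("stanford-crfm/BioMedLM")  # assumed repo ID
print("adapter_config.json" in files)  # expect False: not an adapter repo
print("config.json" in files)          # expect True: full transformers model
```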