Error in the config.json

#3
by grg - opened

Hello,
I believe there is an error in the config.json files, as they cannot be parsed on my side (also, after XOR-ing I don't get the correct hash).

Here is the output of cat config.json:
(screenshot of the cat output attached)

I believe you have the same error in other llama based models.

OpenAssistant org

They are XORs of other JSON files, so they are not meant to work as standalone JSON. Some users (especially, but not exclusively, on Windows) have had issues with the JSONs due to line endings being mangled; could this be your problem? I have added more information to the README, so let me know if it helps.
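For clarity, a minimal sketch of what reconstructing a file from an XOR delta looks like, assuming a plain byte-wise XOR of two equal-length files (the function name and chunked-read approach are illustrative, not the project's actual script):

```python
def xor_files(xor_path: str, base_path: str, out_path: str) -> None:
    """Reconstruct a file by XOR-ing an XOR delta against a base file.

    Assumes both inputs have the same length; reads in chunks so large
    model files do not need to fit in memory.
    """
    with open(xor_path, "rb") as fa, open(base_path, "rb") as fb, \
            open(out_path, "wb") as out:
        while True:
            a = fa.read(1 << 20)  # 1 MiB chunks
            b = fb.read(1 << 20)
            if not a and not b:
                break
            # XOR corresponding bytes of the two chunks
            out.write(bytes(x ^ y for x, y in zip(a, b)))
```

Because XOR is its own inverse, running the same operation on the output and the base file yields the original delta again.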

OllieStanley changed discussion status to closed

Hello, thanks for your reply. I don't think that is related to my issue, as I am using Linux. Also, the hash does not match, so it is possible there is an issue with the file you have on Git LFS.

I fixed this issue by manually editing the config file to be as follows:
"""
{
  "_name_or_path": "OpenAssistant/oasst-sft-7e2-llama-30b",
  "architectures": [
    "LlamaForCausalLM"
  ],
  "bos_token_id": 1,
  "eos_token_id": 2,
  "hidden_act": "silu",
  "hidden_size": 6656,
  "initializer_range": 0.02,
  "intermediate_size": 17920,
  "max_position_embeddings": 2048,
  "model_type": "llama",
  "num_attention_heads": 52,
  "num_hidden_layers": 60,
  "pad_token_id": 0,
  "rms_norm_eps": 1e-06,
  "tie_word_embeddings": false,
  "torch_dtype": "float16",
  "transformers_version": "4.28.1",
  "use_cache": true,
  "vocab_size": 32006
}
"""

Now the hash checks out - 9a4d2468ecf85bf07420b200faefb4af
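The 32-hex-digit checksum above has the length of an MD5 digest; assuming that is the algorithm used, a quick way to verify a file's hash is (a sketch, not the project's verification script):

```python
import hashlib

def md5_of_file(path: str) -> str:
    """Return the hex MD5 digest of a file, reading it in chunks."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        # iterate over fixed-size chunks until read() returns b""
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()
```

The reported digest can then be compared directly against `md5_of_file("config.json")`.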

grg changed discussion status to open
