XOR conversion fails: missing params.json

#19
by ccsum - opened

Where can I locate this params.json file?

python src/transformers/models/llama/convert_llama_weights_to_hf.py \
--input_dir  /home/ec2-user/llm/oasst-sft-6-llama-30b-xor/oasst-sft-6-llama-30b-xor/oasst-sft-6-llama-30b-xor \
--output_dir /home/ec2-user/llm/oasst-sft-6-llama-30b \
--model_size 30B
Traceback (most recent call last):
  File "/home/ec2-user/llm-endpoints/xor-transformation/transformers/src/transformers/models/llama/convert_llama_weights_to_hf.py", line 278, in <module>
    main()
  File "/home/ec2-user/llm-endpoints/xor-transformation/transformers/src/transformers/models/llama/convert_llama_weights_to_hf.py", line 268, in main
    write_model(
  File "/home/ec2-user/llm-endpoints/xor-transformation/transformers/src/transformers/models/llama/convert_llama_weights_to_hf.py", line 90, in write_model
    params = read_json(os.path.join(input_base_path, "params.json"))
  File "/home/ec2-user/llm-endpoints/xor-transformation/transformers/src/transformers/models/llama/convert_llama_weights_to_hf.py", line 76, in read_json
    with open(path, "r") as f:
FileNotFoundError: [Errno 2] No such file or directory: '/home/ec2-user/llm/oasst-sft-6-llama-30b-xor/oasst-sft-6-llama-30b-xor/oasst-sft-6-llama-30b-xor/30B/params.json'

From what I can understand, this conversion needs the original LLaMA weights, which include params.json.
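For reference, the traceback shows the converter simply joins --input_dir, --model_size, and "params.json", so a quick sanity check before running it might look like this (a minimal sketch; the directory path below is a placeholder for whatever you pass as --input_dir):

import os

# Placeholders: use the same values you pass to convert_llama_weights_to_hf.py.
input_dir = "/home/ec2-user/llm/oasst-sft-6-llama-30b-xor"  # --input_dir
model_size = "30B"                                          # --model_size

# The script reads <input_dir>/<model_size>/params.json, which is exactly the
# path that fails in the traceback above.
expected = os.path.join(input_dir, model_size, "params.json")
if not os.path.isfile(expected):
    raise FileNotFoundError(
        f"{expected} not found: --input_dir should point at the original "
        "LLaMA weights (which ship with params.json), not at the XOR files."
    )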

Each LLaMA model size has a params.json in its folder, so you can get the file from wherever you downloaded LLaMA... or find another source that has it. Google is quite good at finding that.
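If it helps, here is a quick way to check whether a directory looks like a usable --input_dir. This is only a sketch, assuming the standard layout of the original Meta release (tokenizer.model at the top level, and params.json plus consolidated.*.pth shards inside each size folder); the llama_dir path is a hypothetical placeholder:

import glob
import os

llama_dir = "/home/ec2-user/llm/llama"  # hypothetical path to the original LLaMA download
model_size = "30B"

print("tokenizer.model:", os.path.isfile(os.path.join(llama_dir, "tokenizer.model")))
print("params.json:    ", os.path.isfile(os.path.join(llama_dir, model_size, "params.json")))
print("weight shards:  ", sorted(glob.glob(os.path.join(llama_dir, model_size, "consolidated.*.pth"))))

If all three checks come back non-empty, that directory should work as the --input_dir for the conversion command above.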
