aismlv committed
Commit d23149f
1 Parent(s): f18e63b

Update config.json


Fixes the warning below when using multiple GPUs. The parameter is not utilised by Llava (and is set to `false` for Llama).

```
2023-12-14 19:03:27,491 - accelerate.utils.modeling - WARNING - The model weights are not tied. Please use the `tie_weights` method before using the `infer_auto_device` function.
```
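For context, a minimal multi-GPU loading sketch (assuming the `llava-hf/llava-1.5-7b-hf` checkpoint; substitute the repo this commit targets) that goes through accelerate's automatic device mapping and previously surfaced the warning:

```python
# Minimal sketch: sharding LLaVA across available GPUs with device_map="auto",
# which calls into accelerate's device-map inference and emitted the warning
# above before `tie_word_embeddings` was declared explicitly in config.json.
# The model id below is an assumption, not necessarily this repository.
import torch
from transformers import AutoProcessor, LlavaForConditionalGeneration

model_id = "llava-hf/llava-1.5-7b-hf"  # assumed checkpoint

processor = AutoProcessor.from_pretrained(model_id)
model = LlavaForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # spreads layers across all visible GPUs via accelerate
)

print(model.config.tie_word_embeddings)  # False once this config change is applied
```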

Files changed (1)
  1. config.json +1 -0
config.json CHANGED
```diff
@@ -23,6 +23,7 @@
     "torch_dtype": "float16",
     "vocab_size": 32064
   },
+  "tie_word_embeddings": false,
   "torch_dtype": "float16",
   "transformers_version": "4.36.0.dev0",
   "vision_config": {
```