Commit 506f9f2 by BoyaWu10 (1 parent: 6eb095d)

Update README.md

Files changed (1): README.md +2 -0
README.md CHANGED
@@ -32,6 +32,8 @@ pip install torch transformers accelerate pillow
 
 If CUDA memory is sufficient, it is faster to execute this snippet by setting `CUDA_VISIBLE_DEVICES=0`.
 
+Users, especially those in mainland China, may want to refer to a HuggingFace [mirror site](https://hf-mirror.com).
+
 
 ```python
 import torch
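
For readers applying this change, here is a minimal sketch (not part of the commit) of how the two environment variables mentioned in the hunk above might be combined before running the README snippet. It assumes that `HF_ENDPOINT` is the variable hf-mirror.com documents for redirecting `huggingface_hub` downloads; that variable is not named in this diff.

```python
# Illustrative sketch, not part of this commit.
# Set both variables before transformers/huggingface_hub are imported and
# before any CUDA call, so they take effect for the README snippet.
import os

# Run the snippet on a single GPU, as the README suggests.
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

# Assumption: HF_ENDPOINT redirects huggingface_hub downloads to the mirror.
os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"

# The README snippet (starting with `import torch`) would follow here.
```

Equivalently, both variables can be set on the command line when launching a script, e.g. `CUDA_VISIBLE_DEVICES=0 HF_ENDPOINT=https://hf-mirror.com python demo.py`, where `demo.py` is a hypothetical file containing the snippet.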