davanstrien (HF Staff) committed 2960cc5 (verified) · Parent(s): b27caef

Fix: use cu129 nightly variant (has x86_64 wheels)

Files changed (1):
  1. glm-ocr.py +4 -5
glm-ocr.py CHANGED
@@ -11,7 +11,7 @@
 # ]
 #
 # [[tool.uv.index]]
-# url = "https://wheels.vllm.ai/nightly"
+# url = "https://wheels.vllm.ai/nightly/cu129"
 #
 # [tool.uv]
 # prerelease = "allow"
@@ -25,10 +25,9 @@ GLM-OCR is a compact 0.9B parameter OCR model achieving 94.62% on OmniDocBench V
 Uses CogViT visual encoder with GLM-0.5B language decoder and Multi-Token Prediction
 (MTP) loss for fast, accurate document parsing.
 
-NOTE: Requires vLLM nightly wheels (GLM-OCR added in v0.16.0, PR #33005) and
-transformers>=5.1.0 (GLM-OCR support landed in stable release).
-As of 2026-02-13, vLLM nightly only has ARM wheels — x86_64 builds broken.
-Check https://wheels.vllm.ai/nightly for updated builds before running.
+NOTE: Requires vLLM nightly wheels from cu129 variant (GLM-OCR added in v0.16.0,
+PR #33005) and transformers>=5.1.0 (GLM-OCR support landed in stable release).
+Uses https://wheels.vllm.ai/nightly/cu129 which has x86_64 wheels.
 
 First run may take a few minutes to download and install dependencies.
 
 Features:
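For context, the `[[tool.uv.index]]` table being patched lives inside the script's PEP 723 inline metadata header, which `uv run glm-ocr.py` reads to resolve dependencies. A minimal sketch of what that header plausibly looks like after this commit follows; the exact dependency list beyond `transformers>=5.1.0` is an assumption, since the diff only shows the closing `# ]` of the `dependencies` array:

```python
# /// script
# requires-python = ">=3.11"
# dependencies = [
#   "vllm",                  # assumed entry; resolved from the nightly index below
#   "transformers>=5.1.0",   # GLM-OCR support landed in this stable release
# ]
#
# [[tool.uv.index]]
# url = "https://wheels.vllm.ai/nightly/cu129"
#
# [tool.uv]
# prerelease = "allow"
# ///
```

The cu129 path points uv at the CUDA 12.9 nightly wheel index, which (per the commit message) publishes x86_64 builds, while `prerelease = "allow"` lets uv accept the nightly's pre-release version numbers.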