Commit dace8e2
Parent(s): b2a28f3
Pin transformers version for DeepSeek-OCR compatibility
Fix ImportError with LlamaFlashAttention2 by pinning to
transformers==4.46.3 and tokenizers==0.20.3, as required
by the model's custom code.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
deepseek-ocr.py (+2 −1)
@@ -6,7 +6,8 @@
 # "pillow",
 # "torch",
 # "torchvision",
-# "transformers",
+# "transformers==4.46.3",
+# "tokenizers==0.20.3",
 # "tqdm",
 # "addict",
 # "matplotlib",
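The changed lines appear to be part of an inline script metadata block (PEP 723), the comment-style dependency list that tools such as uv read when running a single-file script. As a hedged sketch, a script pinned this way could also fail fast at runtime if the environment drifts from the pins, rather than surfacing a later ImportError for `LlamaFlashAttention2`. The helpers below are illustrative assumptions, not part of the commit:

```python
# Sketch (assumption, not from the commit): verify that the installed
# transformers/tokenizers versions match the pins before loading the
# model's custom code, and report any mismatch up front.
from importlib.metadata import PackageNotFoundError, version


def matches_pin(installed: str, pinned: str) -> bool:
    """Exact-match check for a ==-style version pin."""
    return installed == pinned


def check_pins(pins: dict[str, str]) -> list[str]:
    """Return human-readable problems; an empty list means all pins hold."""
    problems = []
    for pkg, pinned in pins.items():
        try:
            installed = version(pkg)
        except PackageNotFoundError:
            problems.append(f"{pkg} is not installed (need =={pinned})")
            continue
        if not matches_pin(installed, pinned):
            problems.append(f"{pkg} {installed} installed, =={pinned} required")
    return problems


if __name__ == "__main__":
    for problem in check_pins({"transformers": "4.46.3", "tokenizers": "0.20.3"}):
        print("warning:", problem)
```

With exact `==` pins, a simple string comparison suffices; range specifiers (`>=`, `~=`) would need a real version parser such as `packaging.version`.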