Instructions to use mlx-community/GLM-OCR-4bit with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use mlx-community/GLM-OCR-4bit with Transformers:

```python
# Use a pipeline as a high-level helper
# Warning: the "image-to-text" pipeline type is no longer supported in transformers v5.
# Load the model directly (see below) or downgrade to v4.x with:
#   pip install "transformers<5.0.0"
from transformers import pipeline

pipe = pipeline("image-to-text", model="mlx-community/GLM-OCR-4bit")

# Load model directly
from transformers import AutoTokenizer, AutoModelForImageTextToText

tokenizer = AutoTokenizer.from_pretrained("mlx-community/GLM-OCR-4bit")
model = AutoModelForImageTextToText.from_pretrained("mlx-community/GLM-OCR-4bit")
```

- MLX
How to use mlx-community/GLM-OCR-4bit with MLX:
```shell
# Download the model from the Hub
pip install "huggingface_hub[hf_xet]"
huggingface-cli download --local-dir GLM-OCR-4bit mlx-community/GLM-OCR-4bit
```
- Notebooks
- Google Colab
- Kaggle
- Local Apps
- LM Studio
Tokenizer configuration:

```json
{
  "backend": "tokenizers",
  "clean_up_tokenization_spaces": false,
  "eos_token": "<|endoftext|>",
  "extra_special_tokens": [
    "<|endoftext|>",
    "[MASK]",
    "[gMASK]",
    "[sMASK]",
    "<sop>",
    "<eop>",
    "<|system|>",
    "<|user|>",
    "<|assistant|>",
    "<|observation|>",
    "<|begin_of_image|>",
    "<|end_of_image|>",
    "<|begin_of_video|>",
    "<|end_of_video|>",
    "<|begin_of_audio|>",
    "<|end_of_audio|>",
    "<|begin_of_transcription|>",
    "<|end_of_transcription|>",
    "<|code_prefix|>",
    "<|code_middle|>",
    "<|code_suffix|>",
    "<think>",
    "</think>",
    "<tool_call>",
    "</tool_call>",
    "<tool_response>",
    "</tool_response>",
    "<arg_key>",
    "</arg_key>",
    "<arg_value>",
    "</arg_value>",
    "/nothink",
    "<|begin_of_box|>",
    "<|end_of_box|>",
    "<|image|>",
    "<|video|>"
  ],
  "is_local": true,
  "model_max_length": 655380,
  "pad_token": "<|endoftext|>",
  "padding_side": "left",
  "processor_class": "GlmOcrProcessor",
  "tokenizer_class": "TokenizersBackend"
}
```
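The config sets `"padding_side": "left"` with `"pad_token": "<|endoftext|>"`: decoder-only models are typically padded on the left so the real tokens sit at the end of the sequence, directly adjacent to the positions where generation continues. A minimal sketch of what left-padding means (plain Python, independent of any library; the token ids and `pad_id=0` are illustrative, not GLM-OCR's real vocabulary):

```python
# Minimal sketch of left-padding, as implied by "padding_side": "left".
# Padding goes on the LEFT so that every sequence ends with its real
# tokens, which is what a decoder continues generating from.

def pad_left(batch, pad_id):
    """Pad each id sequence on the left to the length of the longest one."""
    width = max(len(seq) for seq in batch)
    return [[pad_id] * (width - len(seq)) + seq for seq in batch]

batch = [[5, 6, 7], [9]]
padded = pad_left(batch, pad_id=0)
# padded == [[5, 6, 7], [0, 0, 9]] -- the short row is padded on the left
```

In practice the tokenizer handles this automatically when called with `padding=True`; the sketch only illustrates why the configured side matters for generation.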