Instructions for using tue-mps/coco_panoptic_eomt_base_640_2x with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use tue-mps/coco_panoptic_eomt_base_640_2x with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("image-segmentation", model="tue-mps/coco_panoptic_eomt_base_640_2x")
```

```python
# Load model directly
from transformers import AutoImageProcessor, EomtForUniversalSegmentation

processor = AutoImageProcessor.from_pretrained("tue-mps/coco_panoptic_eomt_base_640_2x")
model = EomtForUniversalSegmentation.from_pretrained("tue-mps/coco_panoptic_eomt_base_640_2x")
```

- Notebooks
- Google Colab
- Kaggle
The model's image processor configuration (`EomtImageProcessorFast`):

```json
{
  "crop_size": null,
  "data_format": "channels_first",
  "default_to_square": false,
  "device": null,
  "disable_grouping": null,
  "do_center_crop": null,
  "do_convert_rgb": null,
  "do_normalize": true,
  "do_pad": true,
  "do_rescale": true,
  "do_resize": true,
  "do_split_image": false,
  "ignore_index": null,
  "image_mean": [
    0.485,
    0.456,
    0.406
  ],
  "image_processor_type": "EomtImageProcessorFast",
  "image_std": [
    0.229,
    0.224,
    0.225
  ],
  "input_data_format": null,
  "resample": 2,
  "rescale_factor": 0.00392156862745098,
  "return_tensors": null,
  "size": {
    "longest_edge": 640,
    "shortest_edge": 640
  }
}
```
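As a minimal sketch of what this configuration does to pixel values: `do_rescale` multiplies 8-bit pixels by `rescale_factor` (1/255) to map them into [0, 1], and `do_normalize` then standardizes each channel with the ImageNet `image_mean` and `image_std` listed above. The `normalize_pixels` helper below is illustrative, not part of the library's API:

```python
import numpy as np

# Values taken from the image processor config above.
IMAGE_MEAN = np.array([0.485, 0.456, 0.406])
IMAGE_STD = np.array([0.229, 0.224, 0.225])
RESCALE_FACTOR = 1 / 255  # "rescale_factor": 0.00392156862745098

def normalize_pixels(image: np.ndarray) -> np.ndarray:
    """Rescale uint8 HWC pixels to [0, 1], then standardize per channel."""
    scaled = image.astype(np.float64) * RESCALE_FACTOR
    return (scaled - IMAGE_MEAN) / IMAGE_STD

# A single white pixel maps to (1.0 - mean) / std per channel.
pixel = np.array([[[255, 255, 255]]], dtype=np.uint8)
out = normalize_pixels(pixel)
```

The actual processor also resizes the longest edge to 640 (bilinear, `"resample": 2`) and pads to a square before this normalization step.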