Memory requirements #6
by sskorol · opened
Hi, I'm getting OOM on the 3B and 7B models while processing video on my RTX 3090. What are the minimum requirements to make it work?
Hello! I had the same problem. Limiting the `min_pixels` and `max_pixels` values worked for me, as per this paragraph
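For anyone wondering why capping those values helps: the processor resizes each frame so its pixel count stays within `[min_pixels, max_pixels]`, snapped to multiples of the patch size, so a lower `max_pixels` directly caps the number of vision tokens (and thus VRAM). Below is a rough, simplified sketch of that resizing logic; the exact function lives in `qwen_vl_utils`, and the constants here (patch factor 28, default budgets) are assumptions for illustration:

```python
import math

def smart_resize_sketch(height, width, factor=28,
                        min_pixels=256 * 28 * 28,
                        max_pixels=1280 * 28 * 28):
    """Approximate the resize rule: round dims to multiples of `factor`,
    then scale down/up so height*width stays within the pixel budget."""
    h_bar = round(height / factor) * factor
    w_bar = round(width / factor) * factor
    if h_bar * w_bar > max_pixels:
        # Too many pixels: shrink both sides by the same ratio.
        beta = math.sqrt((height * width) / max_pixels)
        h_bar = math.floor(height / beta / factor) * factor
        w_bar = math.floor(width / beta / factor) * factor
    elif h_bar * w_bar < min_pixels:
        # Too few pixels: grow both sides by the same ratio.
        beta = math.sqrt(min_pixels / (height * width))
        h_bar = math.ceil(height * beta / factor) * factor
        w_bar = math.ceil(width * beta / factor) * factor
    return h_bar, w_bar

# A 1080p frame with a small max_pixels budget gets shrunk well below 1080p.
h, w = smart_resize_sketch(1080, 1920, max_pixels=512 * 28 * 28)
```

So passing a smaller `max_pixels` (per image, or `total_pixels` across video frames) is effectively a resolution cap, which is why it trades quality for memory.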
@fortminors thanks, I ended up using these samples: https://github.com/QwenLM/Qwen2.5-VL/blob/main/cookbooks/video_understanding.ipynb. I used a local video and replaced the min/total pixels there with a fixed small resolution: `{"video": video_path, "resized_height": 280, "resized_width": 420}`. That seems to be the maximum I can get out of the 3090, as it sits at 100% load during inference.
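For reference, here is a minimal sketch of what that message structure looks like with the fixed resolution swapped in, following the cookbook's format. `video_path` is a placeholder, and note that both dimensions should be multiples of 28 (the patch grid size), which 280×420 satisfies; actually running it still requires `transformers` and `qwen_vl_utils` as in the notebook:

```python
# Hypothetical local path; replace with your own video file.
video_path = "file:///path/to/video.mp4"

# Instead of min_pixels/total_pixels, pin the frame resolution directly.
messages = [
    {
        "role": "user",
        "content": [
            {
                "type": "video",
                "video": video_path,
                "resized_height": 280,   # 10 * 28
                "resized_width": 420,    # 15 * 28
            },
            {"type": "text", "text": "Describe this video."},
        ],
    }
]
```

With the resolution pinned this way, every frame contributes the same small, predictable number of vision tokens, which is what keeps the 3090 from running out of memory.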