Looking for Small VLM/MLLM Alternatives to Qwen Series Models

Posted by CatSweaty4883@reddit | LocalLLaMA | View on Reddit | 9 comments

I have tried the Qwen 3 VL family of models on my RTX 3060; the largest I can load is the 8B at Q8. The task is visual reasoning / instruction following. What other models could I explore? My system RAM is 16 GB, VRAM 12 GB.
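For anyone suggesting alternatives, here's the rough weight-only math I'm using to judge what fits in 12 GB (a sketch with a hypothetical helper; GGUF Q8_0 is about 8.5 effective bits per weight, and this ignores KV cache, activations, and the vision tower, which add a couple of GB on top):

```python
def quantized_weight_gb(n_params_billion: float, bits_per_weight: float) -> float:
    # Weight-only footprint in GB: params * bits / 8 bits-per-byte.
    # Treats 1 GB as 1e9 bytes against 1e9 params, so units cancel.
    return n_params_billion * bits_per_weight / 8

# 8B model at GGUF Q8_0 (~8.5 bits/weight with block scales):
print(round(quantized_weight_gb(8, 8.5), 1))  # -> 8.5 (GB), close to the 12 GB cap
```

So an 8B Q8 is already near the ceiling once the vision encoder and context are loaded, which is why I'm looking at smaller models rather than lower quants of bigger ones.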