I built a 24h TPS + Intelligence Index table for Ollama Cloud models
Posted by antonusaca@reddit | LocalLLaMA
I recently made ollamatps.com for my own model-selection workflow and thought it might be useful here too.
It shows 39 Ollama cloud models sorted by average TPS over the last 24 hours, and I added the Artificial Analysis Intelligence Index so speed and capability are visible in one place.
My current takeaway is that GLM-4.7 offers the best speed/intelligence balance, averaging 93 TPS. Kimi K2.6 is still my personal favorite, but in my tests it only reaches about 32 TPS, so it's not the speed pick.
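For anyone who wants to sanity-check these numbers themselves: one way to compute per-request TPS is from the `eval_count` (tokens generated) and `eval_duration` (generation time in nanoseconds) fields that Ollama's `/api/generate` endpoint returns. This is just a sketch of that calculation, not necessarily how the site does it:

```python
def tokens_per_second(eval_count: int, eval_duration_ns: int) -> float:
    """Derive TPS from Ollama's /api/generate response metadata.

    eval_count: number of tokens generated in the response.
    eval_duration_ns: generation time in nanoseconds.
    """
    return eval_count / (eval_duration_ns / 1e9)

# Example: 465 tokens generated over 5 seconds -> 93.0 TPS
print(tokens_per_second(465, 5_000_000_000))
```

Averaging that over many requests across 24 hours gives a figure comparable to the table's column.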
A few model names don’t map 1:1 across both sites, so some Intelligence Index values are nearest-family matches rather than exact aliases.
Link: https://architects-movies-termination-agreed.trycloudflare.com/ollama-tps-aa-comparison.html
If anyone has a model they want added or a better way to compare throughput vs capability, I’d love feedback.