A Quick Look At The AMD Instinct MI355X With ROCm 7.0
Posted by Balance-@reddit | LocalLLaMA | 4 comments
Instinct MI355X is coming to market. 288GB HBM3E memory, 8TB/s bandwidth, and expanded FP6 and FP4 datatype support. Phoronix had a limited hands-on:
Yesterday I was invited along with a small group of others to try out the AMD Instinct MI355X accelerator down in Austin, Texas. The AMD Instinct MI355X is fully supported with the newly-released AMD ROCm 7.0.
The AMD Instinct MI355X "hands on" yesterday to celebrate ROCm 7.0 and the MI350X/MI355X hardware ended up being just following a guided Jupyter Notebook for an AI demo... And one that wasn't even performance-related or anything unique to the AMD Instinct MI350 series capabilities. Not quite the hands-on time expected: the original hope was for enough time to tap some MI355X accelerators unconstrained and run some AI/LLM benchmarks, at least with Llama.cpp and vLLM. Nevertheless, the Jupyter Notebook's terminal did allow for poking at the MI355X on ROCm 7.0 during this demo session.
grannyte@reddit
Some day those will come to the used market, but it'll be quite a while, because right now even the MI100 is barely available.
DistanceSolar1449@reddit
Well, yeah. The MI100 came out nov 2020. Datacenters usually do 3-5 years of ownership and then resell the cards, because that’s how long the service contract lasts. Expect a flood of MI100s in ~6-12 months.
grannyte@reddit
With a little luck. Remains to be seen if that will be a worthwhile upgrade for the setup I'm currently building.
Pro-editor-1105@reddit
What dreams are made of