Huawei 300i Pro Duo AI Inference Card with 96 GB VRAM - anyone bought it and tested it?

Posted by Darlanio@reddit | LocalLLaMA | View on Reddit | 9 comments

It has been over a year since I first heard about the Huawei Atlas 300i Pro Duo (rumors before the release).

What support does the LLM community currently have for the Huawei Atlas 300i Duo?

Has anyone bought the card, and did shipping go smoothly?

What tokens/second have _you_ gotten on models that need more than 24 GB of memory? Not just links to other people's reviews, but your own tests...

Please, enlighten us...

2 months ago:

https://www.reddit.com/r/LocalLLaMA/comments/1r04r2w/huawei_atlas_300i_duogpu/

7 months ago:
https://www.reddit.com/r/LocalLLM/comments/1n4f1gs/huawei_96gb_gpu_cardatlas_300i_duo/

https://www.reddit.com/r/MachineLearning/comments/1n4y2y3/d_huaweis_96gb_gpu_under_2k_what_does_this_mean/

12+ months ago:

https://www.reddit.com/r/LocalLLaMA/comments/1j78xnk/huawei_gpu/

https://www.reddit.com/r/LocalLLaMA/comments/1kgltqs/huawei_atlas_300i_32gb/

https://www.reddit.com/r/LocalLLaMA/comments/1j78xnk/huawei_gpu/