Tesla Tapes Out AI5 Chip in Partnership With TSMC and Samsung
Posted by martincerven@reddit | hardware | View on Reddit | 6 comments
If my math is correct, this could have 12 × 16 GB modules = 192 GB of memory and a 64 × 12 = 768-bit wide bus, therefore 820-920 GB/s of bandwidth (pretty cool!)
I wonder if they want to also use it in their TeslaBots, or just in cars.
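The OP's arithmetic can be sketched as a quick back-of-envelope check. The ~820-920 GB/s range works out if one assumes LPDDR5X-class memory at roughly 8.5-9.6 Gbps per pin; the per-pin data rate is an assumption here, since the thread only states the totals:

```python
# Back-of-envelope check of the OP's memory math.
# ASSUMPTION: LPDDR5X-class modules at ~8.5-9.6 Gbps per pin
# (the per-pin rate is not stated in the thread).

modules = 12
capacity_per_module_gb = 16
bus_width_per_module_bits = 64

total_capacity_gb = modules * capacity_per_module_gb  # 192 GB
bus_width_bits = modules * bus_width_per_module_bits  # 768-bit bus

def bandwidth_gb_s(bus_bits: int, gbps_per_pin: float) -> float:
    """Peak bandwidth in GB/s: bus width in bytes times per-pin data rate."""
    return bus_bits / 8 * gbps_per_pin

low = bandwidth_gb_s(bus_width_bits, 8.533)  # ~819 GB/s
high = bandwidth_gb_s(bus_width_bits, 9.6)   # ~922 GB/s

print(total_capacity_gb, bus_width_bits)  # 192 768
print(f"{low:.0f}-{high:.0f} GB/s")       # 819-922 GB/s
```

For comparison, an RTX 3090 reaches its ~936 GB/s with a 384-bit bus because GDDR6X runs at a much higher ~19.5 Gbps per pin.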
PM_ME_YOUR_HAGGIS_@reddit
That bandwidth is less than my 3090
CatalyticDragon@reddit
Barely. But a car doesn't drive on bandwidth alone.
martincerven@reddit (OP)
Sure, but your 3090 has 24GB of memory and consumes 400W
OsgoodSlaughters@reddit
I hope it’s a massive failure for all involved!! 🤞🏻
venfare64@reddit
At least spare Samsung foundry though.
Bderken@reddit
They said they will use it for autonomous robots; to Tesla, that means both their cars and their TeslaBots.