If money and time weren’t issues, what would your dream local AI setup look like?
Posted by Lyceum_Tech@reddit | LocalLLaMA | View on Reddit | 62 comments
If money and time weren’t issues at all, what would your dream local AI / GPU setup actually look like?
I’m talking full no-limits mode... how many GPUs, what kind of cluster, cooling setup, networking, software stack, storage, anything you want.
Would you go all out with a massive rack in your basement, liquid cooling everywhere, custom orchestration, or something completely different?
Curious to see what people would build if they could go crazy with it.
teachersecret@reddit
Two LLMs at the same time, man.
temperature_5@reddit
Since time is not an issue, I'd bring a data center GPU server back from the distant future.
If time were an issue, enough racks of Cerebras to run the copy of Opus I bribed an Anthropic engineer for.
o0genesis0o@reddit
I don't know much about enterprise-class hardware, so I guess a giant PC case with 4 RTX 6000 Pros inside and whatever CPU would pair well with that much GPU power. I would put the whole thing in one of those "edge data center" cabinets with built-in cooling, and maybe place the whole cabinet in an isolated shed.
Maybe I'll invest heavily in solar too, just to run this thing.
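For the curious, a quick back-of-envelope on what that solar investment would need to cover. All wattages and sun-hours here are rough assumptions, not measured figures:

```python
# Rough power budget for a hypothetical 4x RTX 6000 Pro shed build.
# Board power and system overhead are ballpark assumptions.
GPU_WATTS = 600        # assumed per-card draw for RTX 6000 Pro
NUM_GPUS = 4
SYSTEM_WATTS = 500     # CPU, RAM, fans, storage, PSU losses (assumed)

peak_watts = GPU_WATTS * NUM_GPUS + SYSTEM_WATTS   # 2900 W at full load
daily_kwh = peak_watts * 24 / 1000                 # ~69.6 kWh/day if run 24/7
# Sizing a solar array at ~4 effective sun-hours/day (assumed location):
panel_kw = daily_kwh / 4                           # ~17.4 kW of panels
```

So running it around the clock on solar alone would mean a seriously large residential array, before even counting battery storage for overnight inference.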
VoiceApprehensive893@reddit
All of the Nvidia cards ever made, in one monster datacenter.
Money is not an issue = energy is not an issue.
Economy_Cabinet_7719@reddit
Probably want to buy out every engineer and researcher currently living too, so that they could advance your setup.
ea_man@reddit
You mean like a datacenter in space ala Elon Musk?
Well realistically if prices were ok I'd get a unified memory device for MoE low power + one gpu for dense top int models.
LeRobber@reddit
https://youtu.be/bFgTxr5yst0?si=0P62VOJrG7HMGWMw <= this setup
amunozo1@reddit
This is so cool and aesthetically pleasing for a server.
LeRobber@reddit
"How much VRam do you have"
"2TB"
"I don't mean flash, VRAM"
"2TB"
😄
wgaca2@reddit
I'd buy a datacenter if money and time are not an issue
Disposable110@reddit
A whole bunch of datacenters, on Titan.
Lorian0x7@reddit
Titan? You sure? Too much latency for my taste.
Lyceum_Tech@reddit (OP)
Titan as in Saturn’s moon? 🔥
We’re not just building a datacenter, we’re building a civilization on another celestial body. I respect the vision king 👽
Disposable110@reddit
Yeah, and then collapse Saturn into a black hole and move Titan near the event horizon to cheese on time dilation shenanigans.
Lyceum_Tech@reddit (OP)
straight to buying a whole datacenter 😂 respect. what would you put in it first?
wgaca2@reddit
Money is not an issue? Buy all the latest and greatest hardware, hire the best engineers I can find, and have them build AGI, or keep trying as long as I'm alive.
I mean, what is this question, "money is not an issue"..
"If you had 100-200k, what would you build" is probably more like what your question really is.
Lyceum_Tech@reddit (OP)
😂 i feel you.. the original question was truly 'no limits' mode, but the 100-200k version is also fun.
for the unlimited version tho.. im with you on going full datacenter + hiring engineers to chase AGI
pollysunders@reddit
Altman, you're not even hiding
Funny_Address_412@reddit
A terabyte of VRAM at least
Lyceum_Tech@reddit (OP)
😂 How many GPUs are you thinking to hit that?
KURD_1_STAN@reddit
1 at 300W. OP said money is not an issue.
Intrepid_Dare6377@reddit
At least
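For anyone counting on today's hardware instead of the money-is-no-object single card, here's the quick arithmetic for reaching 1 TB of VRAM with current cards (capacities per public specs, treated as approximate):

```python
import math

# VRAM per card in GB (approximate, per public specs).
cards = {"RTX 5090": 32, "RTX PRO 6000": 96, "H200": 141, "B200": 192}
target_gb = 1024  # "a terabyte of VRAM"

# Cards needed to reach the target, rounding up.
counts = {name: math.ceil(target_gb / gb) for name, gb in cards.items()}
```

That works out to 32 RTX 5090s, 11 RTX PRO 6000s, 8 H200s, or 6 B200s, before any interconnect considerations.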
Silver-Champion-4846@reddit
As a beginner, I would love to get a Framework 13 Pro (Core Ultra 5 250K+ AI CPU), a Tiiny AI Pocket Lab, and an RTX GPU. Unless the physics of that (cooling requirements, latency) is worse than just getting a pair of RTX 5090s.
Lyceum_Tech@reddit (OP)
You think the cooling/latency issues are that bad compared to dual 5090s?
Silver-Champion-4846@reddit
I do not know; I'm just stating it as a possibility by virtue of my ignorance. I wouldn't run them simultaneously on the same inference if the latency is high, but would I be safe if I just ran 3 different inference loops, each on one of those machines? Idk, the Pocket Lab isn't even out yet.
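Running independent loops like that sidesteps the latency question entirely, since no request ever spans two machines. A minimal sketch of the idea, assuming each box serves its own local endpoint (the machine names and addresses below are placeholders, not real hosts):

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-machine endpoints; names/IPs are placeholders.
MACHINES = {
    "framework13": "http://192.168.1.10:8080/v1",
    "pocketlab":   "http://192.168.1.11:8080/v1",
    "rtx_box":     "http://192.168.1.12:8080/v1",
}

def run_job(name, base_url, prompt):
    # In a real setup this would POST the prompt to the machine's own
    # inference server; here we just describe the dispatched job.
    return {"machine": name, "url": base_url, "prompt": prompt}

def fan_out(prompt):
    # Each machine handles a whole request by itself, so inter-machine
    # latency never enters the inference path.
    with ThreadPoolExecutor(max_workers=len(MACHINES)) as pool:
        futures = [pool.submit(run_job, n, u, prompt)
                   for n, u in MACHINES.items()]
        return [f.result() for f in futures]
```

Only splitting a single model's layers or tensors across the boxes would make link latency matter; three parallel loops like this are safe.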
sleepingsysadmin@reddit
Kind of a silly question, because it just means "go build OpenAI's Stargate", make your own models, etc.
Which isn't what this subreddit is about.
To me the reasonable answer to this: go get a 240V 4U GPU server with 4x H200 running Minimax 2.7 at Q8.
It's not consumer or even prosumer tier, but it's obviously attainable datacenter tier, probably in that $80,000 range. It'd use about $300/month in electricity, but nothing more is needed.
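That $300/month figure checks out roughly, assuming ballpark wattages and a typical utility rate (all numbers below are assumptions, not quotes):

```python
# Sanity check on electricity cost for a 4x H200 server running 24/7.
H200_WATTS = 700          # assumed per-GPU draw near full load
NUM_GPUS = 4
HOST_WATTS = 600          # CPU, fans, PSU overhead (assumed)
RATE_USD_PER_KWH = 0.12   # assumed utility rate

total_kw = (H200_WATTS * NUM_GPUS + HOST_WATTS) / 1000  # 3.4 kW
monthly_kwh = total_kw * 24 * 30                        # 2448 kWh/month
monthly_cost = monthly_kwh * RATE_USD_PER_KWH           # ~$294/month
```

At pricier electricity markets the same box would land closer to $500-700/month, but the order of magnitude holds.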
Lyceum_Tech@reddit (OP)
$300k/month on power is crazy tho 😂
sleepingsysadmin@reddit
well, $300/month, but that's a ridiculous amount of tokens.
Plus, you said money is no issue.
__JockY__@reddit
8-pack of B300 SXM. Next question.
ttkciar@reddit
A datacenter full of Helios racks, loaded up with MI455X, next to a nuclear power facility which desalinates ocean water with its waste heat.
Dthen_@reddit
Idk, maybe 60% of the planet's DRAM production for the next year or so?
Lyceum_Tech@reddit (OP)
60% of global DRAM? That's actually insane lmao. Unlimited budget is wild 🔥
Dthen_@reddit
Yeah, I think 60% might be enough. Not sure what I'd do with more. Something to do with goblins, I guess.
Kal-LZ@reddit
I’d settle for a ThinkStation P8 workstation with a Threadripper 9975WX, 512GB DDR5 RDIMM, and triple RTX PRO 6000 96GB GPUs. I’m a person of simple tastes
More-Curious816@reddit
Why not the DGX Station? It has unified memory, superior support, and lower power draw.
NVIDIA DGX Station Blackwell Edition:
- GB300 Grace Blackwell Ultra Superchip
- 72-core Arm Grace Neoverse V2 CPU (496 GB LPDDR5X)
- Blackwell Ultra GPU (252-288 GB HBM3e, 20 PFLOPS AI)
- 900 GB/s NVLink-C2C, ConnectX-8 800 Gb/s NIC
- Tower workstation, 1600W PSU, DGX OS
Lyceum_Tech@reddit (OP)
Simple tastes approved 🔥
xilvar@reddit
Absolutely no limits? As many B300s as fit in a five-story parallelogram tower of a datacenter, angled about 30 degrees off vertical away from the south, with all of the south, east, and west walls being solar panels above a steel and cement structure. Then on top, a rooftop deck where my house sits, and my garden where I like to grow tomatoes and collard greens.
ImportancePitiful795@reddit
Money no issue? Nice.
An H14 8-GPU MI355X server, and to house it, an 80 Sunreef Power Eco Next. 😊
javiers@reddit
A whole bunch of datacenters orbiting the sun for free energy and multiprocessing. I'd hire the best AI minds ever available and have them breed an AGI. Then I'd make the AGI improve itself until it becomes an ASI. Then I'd achieve immortality (provided that the ASI obeys me) to finally be able to take the proper time to understand the last and biggest mystery of the universe, aka women.
Intrepid_Dare6377@reddit
The Death Star, but renovated as a solar-powered self-annealing datacenter, with an apartment I can hang out in or Airbnb. 🤣 If money is no object, I'd hoover up the Karpathys of the world to make software gud n stuff.
1998marcom@reddit
Assuming by "local" you mean a single machine on commercial hardware, I'd go for Tenstorrent's Blackhole p150, in a Galaxy config of 32 cards on a single machine.
sunflowerapp@reddit
Claude Code + DeepSeek V4 Pro; this is what I'm using, but it would be great to host it locally.
Lyceum_Tech@reddit (OP)
Hosting them locally would be nice tho. You planning to run them on consumer cards or go bigger?
sunflowerapp@reddit
No... I plan to use their API lol, they're cheap as dirt and I don't have anything super sensitive.
Bohdanowicz@reddit
Forget datacenters. I'd love to play with a Taalas TPU running Qwen 3.6 36B-A3B or Qwen 3.6 27B at 10-20k tok/s.
The work you could do... the stuff you could try. No waiting 45 minutes between builds. Keep the flow going.
jd52wtf@reddit
Hundreds of giant data centers full of Nvidia's new Vera units. Also the infrastructure to support them. Not just for me though. I'd donate a third of the compute to non profit research organizations. A third would be distributed to the general public. The last third would be sold at massive markups to giant blood sucking corporations.
jd52wtf@reddit
Here's what phase 2 looks like.
Lyceum_Tech@reddit (OP)
hundreds of giant datacenters with Vera units and still donating a third to non-profits? mad respect.
Lyceum_Tech@reddit (OP)
Damn this is next level.
AdamDhahabi@reddit
BIZON ZX9000 – Water-cooled 8 GPU server
(choose your flavor: H100, H200, A100, RTX PRO 6000 Blackwell)
Lyceum_Tech@reddit (OP)
That the one you'd actually buy?
AdamDhahabi@reddit
I've not looked deeply into it, but a water-cooled 8-GPU server it would be :)
DramaKlng@reddit
2x Strix Halo with DeepSeek 4 Flash :) I know it's not much, but it's sufficient for my case.
Lyceum_Tech@reddit (OP)
Clean setup. What kind of work are you mainly running on it?
whodoneit1@reddit
100 H100’s
Lyceum_Tech@reddit (OP)
100 H100s is crazy. Full send
ipcoffeepot@reddit
8x B200
Lyceum_Tech@reddit (OP)
8x B200 straight up. Respect.
DeltaSqueezer@reddit
Racks of liquid-cooled B300s with an SMR to power it.
Lyceum_Tech@reddit (OP)
liquid-cooled B300 racks with an SMR for power? that's next level. how are you handling the noise and maintenance on something like that?
DeltaSqueezer@reddit
I'll buy Anthropic and have them set it all up for me.
Lyceum_Tech@reddit (OP)
Lmao buying Anthropic just to set it up for you is wild😂