Researchers developed a memory device that kept working at 700°C, opening a path to electronics for Venus, drilling, and AI
Posted by sr_local@reddit | hardware | View on Reddit | 66 comments
IshTheFace@reddit
Just had to squeeze AI in there somewhere
vemundveien@reddit
I am all for sending AI to Venus. Ideally all AI.
superSmitty9999@reddit
I know your comment is sarcastic, but this is IMO one of the best use cases for AI. For example, if we had a Mars rover with an in-space data center capable of intelligently navigating, we could get much more use out of our rovers on the planet, since we could talk to it in real time.
crshbndct@reddit
Why would it need to talk in real time? What would the rover be doing? We send rovers to do scientific experimentation, digging up rocks, analysing the planet, etc. How would inserting chatbots help that process?
AI has no place in science because it is not repeatable. Machine learning has a place in science, but only for prediction, not experimentation.
You can ask its opinion on something three times and get three different answers. It can't do statistical analysis, it can't adjust for errors in methodology, etc.
inevitabledeath3@reddit
Machine learning is AI by definition. You should stop telling other people what science is and go and study some computer science and data science yourself lmao.
crshbndct@reddit
That is what I said?
Also, I don't want to study computer science or data science. I would rather be homeless, honestly.
inevitabledeath3@reddit
You said this, which implies machine learning is not a type of AI. For the record, machine learning and AI in general are used in many scientific fields for various purposes, including data analysis, which is something you said can't be done.
crshbndct@reddit
You are wrong on what I meant there. How do I know? Because I am me, and you are not me.
However, I concede the point. You are correct, I am incorrect, and you won this battle of intellect. Not a great achievement, as I am about the stupidest person to use this website, but nevertheless you have beaten me. Well done to you!
inevitabledeath3@reddit
That's a r/suicidebywords if I have ever seen one. Are you like okay? Do you need a hug?
You shouldn't put yourself down that much. I am sure there are much dumber people on here. This place is a state.
crshbndct@reddit
Ehh.
I’m not really okay, but there’s nothing that can fix my life anyway. I like to talk shit on the internet about things that I am interested in, but then when people sometimes misunderstand me or whatever, I just give up and concede the point because I don’t have the energy to explain myself.
Thank you for your kind words. It is nice when people are nice.
Have a great day.
username_taken0001@reddit
By talking in realtime he means an AI navigation system, not a chat agent. A rover on Mars is hard to drive because of a delay measured in minutes. Between moving the rover and seeing a rock in front of it, up to 40 minutes can pass in the worst case, making it impossible to react and limiting the safe speed of movement. With AI driving, the decision could be made on board, removing the latency issue.
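The "40 minutes in the worst case" figure checks out as a round-trip light delay. A quick sketch (distances are approximate textbook values, not from the article):

```python
# Round-trip signal delay Earth <-> Mars at the speed of light.
C_KM_S = 299_792  # speed of light in km/s

def round_trip_minutes(distance_km: float) -> float:
    """One command-and-response cycle: signal out plus signal back."""
    return 2 * distance_km / C_KM_S / 60

# Closest approach ~54.6 million km; near solar conjunction ~401 million km.
print(round(round_trip_minutes(54.6e6), 1))  # ~6.1 minutes
print(round(round_trip_minutes(401e6), 1))   # ~44.6 minutes
```

So even in the best case you see the consequences of a command over six minutes after issuing it, which is why onboard autonomy matters.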
crshbndct@reddit
Yes, but these things aren't just driving around at 30 km/h on the surface. Each and every metre of travel is approved by like three mission specialists before it is executed. Curiosity has been there for like 14 years and travelled 35 km, which is 2,500 metres a year.
username_taken0001@reddit
That's the point. What if we could drive it at 30 km/h?
crshbndct@reddit
There would be no point, because we drive it specific places to do specific things. Having some AI controlled rover doing unscientific analysis doesn't actually provide any benefit.
Never mind that AI still makes mistakes, and this is one of those cases where mistakes cannot be abided.
superSmitty9999@reddit
Right, but imagine you could do those things 50x faster and do 50x more things, because you have a self-driving rover with some valuable but limited decision-making abilities and basic reaction times.
SabreSeb@reddit
Imagine if it mistakes a gap for a shadow within the first kilometre, and you just wasted a billion tax dollars.
AI has no place in space exploration, where you have exactly one shot at a mission and a single mistake can mean the failure of the entire thing.
superSmitty9999@reddit
"Imagine if {a poorly tested system failed}." I feel like this is a strawman argument; your point is irrefutable because its failing is the premise.
Okay, now imagine it didn't mistake a gap for a shadow, because the billion dollars the scientists spent on the vehicle included basic testing, and scientists can now set a waypoint much farther away and trust the robot to get there much faster.
Your concern about reliability is valid, but it isn't a fundamental limitation of AI; it's more of an engineering challenge to overcome. Waymo cars are navigating fully autonomously today.
Jeoshua@reddit
I would have expected better of the notoriously anti-AI crowd than to let you get downvoted for making such good points. Science gathering doesn't benefit from high-speed rovers and LLMs deciding things on their own. It benefits from mission planning and careful analysis.
LLMs might be useful in Science somewhere, but rover piloting, outside some very basic stuff completely achievable without LLMs, is not it.
superSmitty9999@reddit
Lol, when they don't see the value of it being able to go 30 km/h when it's gone 30 km in 10 years currently.
F9-0021@reddit
Rovers already do that. Someone isn't just driving Perseverance remotely with a 15-30 minute ping. They tell it where to go and the computer finds a path. If it gets stuck, it might need manual assistance, but it's not all manual.
superSmitty9999@reddit
imagine trying to do science, but it takes your head 7 minutes to talk to your hands
mattgif@reddit
it would keep it from getting lonely
spiral6@reddit
This is how the Vex get created.
megakaos888@reddit
Is there room on the spaceship for AI CEOs?
Paed0philic_Jyu@reddit
Just send them on a submersible - like that famous one built by an amateur that imploded - to the bottom of the ocean and splash sulphuric acid in the crew cabin.
The effect will be the same.
g0ld-f1sh@reddit
Think of the shareholders
Deep90@reddit
The article didn't mention it, but apparently multiple companies are seriously researching whether an orbital datacenter could be made possible.
One of the (MANY) blockers for this is that you'd need an insane amount of radiators. However, a chip that can run at higher temperatures would reduce this dramatically, since radiated power scales with the fourth power of temperature.
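The radiator argument follows from the Stefan–Boltzmann law: radiated power goes as T⁴, so running the radiators hotter shrinks them enormously. A rough sketch (the 100 kW load, emissivity, and temperatures are illustrative assumptions, and the cold-sky background is ignored):

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def radiator_area_m2(heat_watts: float, temp_kelvin: float,
                     emissivity: float = 0.9) -> float:
    """Radiator area needed to reject a heat load at a given temperature."""
    return heat_watts / (emissivity * SIGMA * temp_kelvin ** 4)

# Rejecting 100 kW at 350 K (conventional electronics) vs 973 K (~700 degC):
print(round(radiator_area_m2(100_000, 350), 1))  # ~130.6 m^2
print(round(radiator_area_m2(100_000, 973), 2))  # ~2.19 m^2
```

That's roughly a 60x reduction in radiator area for the same heat load, which is why high-temperature electronics change the economics here.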
porcinechoirmaster@reddit
Why the flying fuck would anyone put a data center in orbit? Like, there's literally no advantage to doing so.
I just... don't get it. The only time I'd put a data center in orbit is if it was attached to a space station that was doing research there, and even then only as a cache point before sending it down to earth.
jkurratt@reddit
They need a place to hold all the minor footage they acquired through "face scan" verifications, and no jurisdiction on Earth is going to host those unholy files.
Exist50@reddit
But why though? What's the point?
Deep90@reddit
I have no idea.
All I know is that this research makes it more scientifically viable if at some point there was actually a good reason to do it.
pdp10@reddit
To be close to the customers, obviously.
windowpuncher@reddit
AI as a purpose built tool is incredibly useful. They're not putting fucking chatgpt on the thing.
Not-the-best-name@reddit
I mean yeah, god damn, landing working electronics on Venus is harder than AI in so many ways.
TRKlausss@reddit
What? Would you click on an article that doesn’t mention AI? How boomer of you…
(/s)
Splash_Attack@reddit
It's not really squeezing it in in this case. AI has been one of the main touted applications of memristors for as long as there has been serious research on them.
The only difference between this and stuff being published a decade ago in that regard is that the de rigueur term has changed. Now they say "AI" instead of "neuromorphic computing" or "machine learning".
If you don't believe me go on google scholar, set the date range for something like 2005-2015 and search "memristor neuromorphic" and "memristor AI". You'll get loads of results.
GettCouped@reddit
It's so annoying. Too bad you spent trillions on FOMO for AI.
tavirabon@reddit
For anyone laughing at the thought of AI reaching 700C, the advantage would be lower energy consumption and higher throughput for edge devices, not large-scale compute. Plus "tungsten-hafnium oxide-graphene" doesn't sound cheap to manufacture.
Jeoshua@reddit
Anyone who has done any thinking about the problem would realize that, without massive cooling apparatus, the concept of computer chips reaching such blistering temperatures isn't just a possibility, it's an inevitability.
Just take the cooler off any consumer grade GPU, turn it on, and watch how quickly it either turns itself off from overheating or catches fire.
tavirabon@reddit
Consumer chips throttle in the vicinity of 100°C, degrade rapidly past ~115°C, and die completely so far below 700°C that I have no clue what the hell you're trying to say. VRMs are hardier than any silicon, yet even they can die instantly at 130°C. But this was about memory, so just... what?
Jeoshua@reddit
They throttle at that temperature precisely because they degrade at hotter temperatures, not because they can't generate those kind of temperatures. I'm saying that even outside the lower atmosphere of Venus, extremely high temperatures for electronics are very easy to achieve. We put a lot of tech behind cooling these devices, after all.
So my point was that since these devices can operate at extremely high temperatures, and computers, particularly data-center servers are known for their extremely high temperature output, this is something beneficial for far more than just extraterrestrial environments.
Wait_for_BM@reddit
Being able to survive and run =/= running efficiently at 700°C, as the leakage would be high. That's basic physics.
tavirabon@reddit
The 2 things are not related, one use case is working in extreme environments, the other is energy efficiency at room temperature.
kat0r_oni@reddit
Resistance increases with temperature.
novae_ampholyt@reddit
Would be bigger news if it didn't
eivittunyt@reddit
Running at higher resistance can lose a lot less energy than cooling the chip to room temperature and then running it at slightly higher efficiency.
tavirabon@reddit
I know?
caribbean_caramel@reddit
We will finally get the chance to beat the Soviet record on the Venusian surface.
Jeoshua@reddit
I don't know why anyone would care about the Venusian surface. Ever since it was proven there isn't some verdant rainforest down there, only the closest real-world analogue to literal Hell, what good does going to the surface of Venus actually do?
I'd be far more interested in the possibility of balloon "satellites", maybe even floating platforms. The upper atmosphere, just above the worst of the clouds, is actually a remarkably hospitable area. Done right, we could even have manned stations there: the temperature is actually tolerable, the pressure is nearish to Earth-normal, and other than there being no "ground", it's probably the most habitable place in the Solar System outside Earth itself.
lasserith@reddit
It will inevitably be cheaper to just use conventional devices and insulate and cool accordingly. You need transistors and interconnects too, and no way in hell are those going to work at 700°C.
Thunderjohn@reddit
Finally something I can do some serious overclocking with.
OpenSourcePenguin@reddit
The AI is just there for retention.
Why exactly does an AI chip need to operate at 700°C?
Olobnion@reddit
Venusians are a huge untapped market.
KillerWebDesigner@reddit
"Electronics for Venus" ah journalists 😂😂
exscape@reddit
Are you aware of the Venera missions? Look them up if not. The Soviet Union had multiple successful landers on Venus. They didn't last long, but long enough to send photos.
moofunk@reddit
Yes, the 700°C is nice, but what about the 90-bar atmospheric pressure and the highly acidic atmosphere?
Ferrum-56@reddit
Pressure isn't bad; only pressure differences are bad. Even humans can survive 70 bar, and ICs don't care about pressure.
Pressure combined with the corrosive, hot atmosphere gets worse, but specialized alloys can still last. You would always have to prevent contact between the ICs and the atmosphere, of course.
JesusWantsYouToKnow@reddit
Well... Oxygen becomes toxic at 1.4 bar, so regular air with 21% oxygen becomes toxic at around 6.7 bar. If humans want a chance of surviving 70 bar they're going to have to get there carefully managing the partial pressure of oxygen along the way.
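The 6.7 bar figure is just partial-pressure arithmetic: oxygen toxicity depends on the partial pressure of O2, not the total pressure. A quick check using the numbers from the comment:

```python
O2_FRACTION = 0.21   # oxygen fraction in regular air
PO2_TOXIC_BAR = 1.4  # rough oxygen toxicity threshold (partial pressure)

# Total pressure at which air's oxygen partial pressure hits the threshold.
total_pressure = PO2_TOXIC_BAR / O2_FRACTION
print(round(total_pressure, 1))  # ~6.7 bar
```

Above that, you'd have to dilute the oxygen with an inert gas like helium to keep pO2 in the safe range.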
Ferrum-56@reddit
True, you need a specialized mix of helium with a small percentage of oxygen. But the pressure itself is not the problem. It's just to illustrate that something as soft and fragile as a human can survive extremely high pressure.
Only_Statistician_21@reddit
I agree that ICs don't care too much about pressure as a whole, but transistor performance is heavily influenced by selective strain (pressure differences) engineered in to enhance electron and hole mobility.
Cadet_BNSF@reddit
I mean, those can probably be sorted with an enclosure. Granted, you have to figure out how to get motors to work in those conditions, but this is at least a step in the right direction.
hyoumah83@reddit
Being able to buy a memory module that works at 700 degrees Celsius has been my childhood dream.
GalvenMin@reddit
Instead, our children will just dream about buying a memory module.
netrunui@reddit
I guess I'm excited about one of those applications...
AlCappuccino9000@reddit
Maybe hardware suitable for RTX 6090
IshTheFace@reddit
69 with a 420 radiator.