AI will possibly be devastating to the world, but not because of a Terminator takeover
Posted by AccordingChocolate12@reddit | collapse | 23 comments
Before diving into my exact concerns regarding AI, I would like to emphasize that I truly believe mankind can solve many problems with this new technology. There are already spectacular examples in medicine and other fields that have made things possible which were unimaginable just a few years ago.
https://cns.utexas.edu/news/features/turbocharging-protein-engineering-ai
The potential of this technology is impossible to comprehend, especially with the new quantum techniques that are emerging and today's possibilities in chip design. It is really freaky, to be honest, and everything is happening so fast that it is hard to grasp the development of all of this. I can barely remember what the world was like five years ago, and that is… scary… especially since it is evolving faster and faster. But, like I said: I truly believe this technology could make the world better if used thoughtfully and aligned with global goals.
But: the world is the way it is, and my concerns about AI are huge. Not because of Terminator scenarios: totally different ones. Here is a list of my top concerns regarding AI.
1 Energy
90% of all the data mankind has ever produced was created in the last few years. Imagine that. It is insane to think about. With the appearance of AI image generators, and now video generators as well, content production has exploded and will keep exploding to an unseen and unpredictable extent. Setting aside the question of how much of this content is utter trash: how much energy does all of it need? The data centers, the devices, the compute behind AI. How will our global climate crisis be affected by the increasing power demand of this exploding technology?
2 AI arms race
Obviously AI holds devastating potential for creating deadly machines. China released footage of a robo-dog-like machine with a machine gun on its back being dropped by a drone onto a roof and then walking off autonomously. So yeah… how about: let's not create those things? But let's be real: there are probably some super-advanced weapons already that are classified top secret or something. The US and China put so much money into this. It is so scary, because I imagine the use of nukes could become attractive once you have systems that can easily intercept the enemy's, or once you can mass-produce killer robots without a problem; that is a use case some old madman will consider. Where is this leading?? We need to work on this cooperatively, but the world seems further apart than at any time since I was born in the late 90s.
https://hms.harvard.edu/news/risks-artificial-intelligence-weapons-design
https://diplomatmagazine.eu/2024/09/01/ai-arms-race/?amp
https://www.nature.com/articles/d41586-024-01029-0
3 AI will automate production, but the money won't be distributed fairly; instead, wealth will become hyper-centralized, leading to social conflict and the breakdown of society.
How will politics react to this? Will some companies basically rule the world?
https://youtu.be/F_7IPm7f1vI?si=EHhPbkEjlIJdz19W
https://amp.cnn.com/cnn/2024/06/20/business/ai-jobs-workers-replacing
Last_Jury5098@reddit
It's a bit sad this got so few upvotes and responses, because it's far more relevant and urgent than climate change or the AMOC.
Honestly this forum should ditch all climate change posts and instead focus on collapse risks that can still be prevented or altered.
Climate change is a massive waste for collapse awareness to focus on; it's the wrong angle.
I mean, dystopia is taking shape right in front of our eyes, and people worry about the AMOC, which might or might not collapse, which might or might not have certain effects, and which might happen in 50 years, or 100 years, or maybe never.
People have their priorities wrong and don't understand certain things. Let the corporate world worry about climate change; trust me, they will eventually. They will even spend money to make you worry about it, just give it time.
For the average population there are far more dangerous and urgent concerns.
Orzhov666@reddit
I don't understand why people are so concerned about AI with the incoming climate catastrophe
Hey_Look_80085@reddit
AI will take the jobs before climate does.
thesilverbandit@reddit
Climate catastrophe is guaranteed, AI apocalypse is likely, and the only deus ex machina for the former is the latter.
HeadAd369@reddit
AI is bullshit. Not even as smart as humans, and that’s a low bar
Ok_Oil_201@reddit
You really overestimate average humans ...
HomoExtinctisus@reddit
I agree with everything in your title sentence except the word possibly. Nearly assured is closer to accurate.
hogfl@reddit
AI will end up being a flash in the pan. It will exacerbate many of the world's problems, resulting in collapse sooner than expected. The reason that AI is a short-term problem is that complex, resource-hungry systems are fragile. Remember, the more complicated a technology, the more resources and systems it requires. So, as the systems that support advanced technology fail, so will AI.
PaleInitiative772@reddit
I think the biggest issue is it's going to turn the internet as we know it into a useless steaming pile of dogshit. It scrapes the internet for info, good and bad alike, and then churns out more questionable garbage at an alarming speed. AI is going to cannibalize itself at an exponential rate, and it will be almost impossible to separate the real from the AI-created bullshit. I think it's going to occur far faster than anyone realizes. Look at Google Search. It's getting less reliable practically by the day.
BetImaginary4945@reddit
We have a million LLMs that are all mediocre and parroting each other's results. After the first pig, every other one is just another pig with different lipstick. Now they're feeding pig meat (synthetic data) to the new generation of pigs. Jensen is a grifter and will eventually be proven as such when these bicycle spinners don't get to the moon.
gaz_w@reddit
Interesting take by Iain McGilchrist on AI.
verdasuno@reddit
Re: 2, the AI arms race is on, and we only have a few years (months, really) to get what are essentially the new weapons of mass destruction under control.
The Chinese AI killer robo-dog is just the beginning; there are autonomous flying drones and more under research and in production. It will be a very, very dystopian future we leave to our children (heck, one that many of us reading this will die from) unless we act NOW.
Slaughterbots: https://youtu.be/HipTO_7mUOw?si=tSiPv6jTPIGdYnpM
There is hope: it is widely known how to make and use chemical, biological and nuclear weapons, yet we don’t really see their use because society has deemed these to be fundamentally immoral weapons and any party that uses them becomes a pariah. There are international treaties in place that make their use illegal… but more importantly, explicitly immoral.
An international treaty banning lethal autonomous weapons systems is possible, and because of the leadership of a few countries it is under discussion at the UN.
We don’t need to ban AI, or even AI use in weapons; what we need is to maintain active human control and decision-making over every targeting or killing of a human being.
AI alone should never have the power or ability to kill a human on its own. Otherwise, you will get a very very bad future for everyone.
Join the campaign if you want to live: https://www.stopkillerrobots.org/
Dracoia7631@reddit
It's terrifying to think about, but I have a counterargument: finite resources. The planet cannot supply everything we would need to build an army of robots unless we find a way to recycle a massive amount of used materials. But that takes money, and a lot of it. Big losses aren't really acceptable to the elite, even if they could be considered a short-term loss in the grand scheme.
Consistent-Big-522@reddit
I think it's less interesting than that, which makes it more depressing tbh. AI is a rebranding of neural network tech (extra-spicy statistical modelling) that has been around since the 80s. The only marked difference with the latest rendition is the hardware that has since been developed and the seemingly endless resources corporations are willing to spaff into it.
I strongly suspect it will reach standard enshittification status when the costs of energy consumption and hardware infrastructure are passed on to the end user. Microsoft, Apple, and Nvidia already had to bail out OpenAI to the tune of several billion. I don't think this is likely to improve, given the increasingly derivative slop generated by such a costly process.
mmps1@reddit
It’s a scam and can’t do a fraction of the shit these grifting pricks pushing it pretend it can.
ObssesesWithSquares@reddit
It's better than most therapists already...
Veganees@reddit
Is that because of the high quality of AI or because of the low quality and availability of therapy?...
Longjumping-Path3811@reddit
Yes.
ObssesesWithSquares@reddit
Low quality and availability especially. At least I can talk to it about whatever, and it will actually try, instead of latching onto one diagnosis and giving me an irrelevant drug that does nothing while my life crumbles.
SnAIL_0ut@reddit
Honestly, I wish AI was a Terminator takeover, because it would be a better timeline than the one we are currently in.
derpman86@reddit
Butlerian Jihad ftw.
ContextualBargain@reddit
I’m with you, OP. I don’t really have anything to contribute, but I agree with everything you said. This AI stuff is the most anti-human technology that’s ever been invented. It’s going to take all our jobs and make everyone even stupider af. I guess I do have one thing to add that’s slightly different from the theme of this post: AI is a Nazi’s wet dream, because the propaganda they can create with it dwarfs what Der Stürmer was able to do.
Head-Calligrapher877@reddit
I dunno why I'm the first to comment on this but this is the worst AI shit I've ever seen.