15 years in and I'm struggling with change (AI). Vibe-check for other middle-aged people feeling alienated by the industry?
Posted by maclargehuge@reddit | sysadmin | View on Reddit | 174 comments
I work in government. I work for a very small organization that partners with larger departments but we set our own agenda.
Currently, I'm the sole AWS admin and run a few websites and internal applications out of it. The bulk of my job is security compliance for our AWS environment to gov standards as well as devops to get code to the web servers from the web team.
In the last year or so we've gone full-tilt on AI-fever at the top levels. The junior IT staff have taken this to heart and are blasting out code that I don't have the time to review. I brought this up to senior management and I was told about all the wonderful tools that exist to automate code review as well and we can automate from all sides. Our answer to any problem lately is "more AI, faster".
I went to school for EE and learned IT by sheer force of will. I want to deeply understand what I'm working with and typically think bottom-up, not top-down. Trying, failing, getting stuck then breaking through... this all took many, many years before I felt confident in understanding what I'm working on. It feels like the brave new world is just to skip all that? Are other organizations running full steam into Wall-e land where everything is either SaaS or just vibe-coded, vibe-reviewed, vibe-documented and vibe-maintained? Do people who do this have any knowledge of their systems anymore? If not, is that okay?
I can't adapt to this world and I really feel like I'm getting left behind, but at the same time, I feel like this is going to be disastrous if we continue on this path. I don't want to become a middle-manager who doesn't understand what he's creating or maintaining. I don't want to sign control over to a series of corporations with their own interests. I want to make things. I want to own things. I want to host things.
The best parts of my job, the reason I got into the industry, are rapidly being outsourced and I'm left with feeling ignorant and useless.
I swore it would never happen to me 15 years ago, but I didn't think the industry would turn this way.
Fellow seniors, how are you adapting?
Sprucecaboose2@reddit
Like cloud means "someone else's computer", AI is rapidly coming to mean "someone else's thoughts". It might be here to stay, but I think once people realize it isn't an infallible "answer machine", there will be a correction in the landscape.
ZaradimLako@reddit
I mean, on one hand that's true; on the other hand, the advancements in the last 2 years have been astronomical and only continue to be. It's easy to point out the flaws, but the number of things it can do has increased so much in such a short time it's ridiculous.
I doubt a correction will happen anytime soon unless AI progress hits a wall, which people have claimed multiple times in the past 3 years, only for it to age like milk.
Jethro_Tell@reddit
One of the biggest things I've seen is that the amount of code pushed per feature is huge. I'm not sure we've actually shortened the development timeline; there's just more surface area in everything I see. Our codebase size looks like a hockey stick, and I'm not sticking around to debug that shit in the middle of the night.
It is a lot better than it was, but I'm not sure the value proposition is there.
OmenVi@reddit
Remember when Windows ran at a pretty good pace on 512MB of RAM?
Remember when a 2GB hard drive was more than you were likely to ever need in the lifetime of the computer?
We've already seen the effects of the massive influx of coders and development where efficiency is the last thing on the list of importance: bloated, terribly inefficient, often riddled with vulnerabilities or, in some cases, bugs tucked away to be found years later, all requiring more and more power to run. And at the end of the day, things don't really seem faster than they were 20 yrs ago (yes, I know, we're doing more stuff that we just don't see).
AI is going to turn that up to 11. Code efficiency will drop through the floor, and almost all of the developer efforts will go into debugging mashed potato code, trying to sort out why when Jenny in accounting is just trying to run a month end report it takes 300 hrs to complete all of a sudden.
I'm NOT a developer, and I'm already seeing it. People just running shit with no idea how it works, or whether it should work the way it does.
Unfortunately, just like the first wave of the efficiency-last mentality, it will largely go unaddressed, and we'll just make better machines to churn through the garbage code faster.
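A toy sketch of that failure mode (hypothetical code, not from anyone's actual report job): an innocent-looking quadratic loop that works fine on the dev's test file and becomes Jenny's marathon at month-end volume.

```python
import time

def slow_dedupe(records):
    # O(n^2): the membership test scans a growing list on every insert.
    # Fine on a 50-row test file; a wall at month-end volume.
    seen = []
    for r in records:
        if r not in seen:
            seen.append(r)
    return seen

def fast_dedupe(records):
    # O(n): identical result, but membership checks hit a set.
    seen, out = set(), []
    for r in records:
        if r not in seen:
            seen.add(r)
            out.append(r)
    return out

data = list(range(5000)) * 2  # 10k records, half duplicates
t0 = time.perf_counter(); a = slow_dedupe(data); t_slow = time.perf_counter() - t0
t0 = time.perf_counter(); b = fast_dedupe(data); t_fast = time.perf_counter() - t0
assert a == b
print(f"quadratic: {t_slow:.3f}s, linear: {t_fast:.3f}s")
```

Both functions return the same output, so nothing in a quick review of the results would catch the difference; only the scaling does, and only once the data grows.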
MedicatedDeveloper@reddit
My favorite is when the AI creates something with the 'correct' user facing output but the actual application doesn't DO anything besides lie to the user. Extra points if it also makes a fake test that 'passes'. I've had this happen several times now across models with vibe coding cli tools.
narcissisadmin@reddit
Yeah, but that storage space and memory had to be increased with the advent of multimedia and advanced web content. The reason machines from just 10 years ago are so obsolete is purely artificial, as evidenced by spinning up a Linux desktop on them.
Speeddymon@reddit
Machines from today will still be functional in Linux in 40 years.
Jethro_Tell@reddit
Maybe, probably not; i386 was already dropped from the kernel in 2012, just 20 years of support. Some might, but some might not.
ZaradimLako@reddit
Tech-debt environments were very common even before AI; it's just that for the lazy and incompetent ones it's even easier to stay lazy and incompetent, I guess.
Jethro_Tell@reddit
Well, now add business people who were confined to spreadsheets vibe coding a dashboard and getting access to your data set.
A. They are gonna chuck it over the wall and make it your problem the day it breaks, and that day will be soon, because who's gonna be doing library updates for that?
B. The CEO is gonna ask for R/W on the production data sets.
I'm not gate keeping, I'm happy for anyone to code up whatever dumb app they feel adds value to their organization. But my years of experience here tell me that maintenance of code is already considerably harder than producing code. The issue has never been the speed to produce code or features, it has always been the trailing maintenance and support burden that crushes even functional organizations.
That's why devops was the hotness for so long and, to be frank, even after 10 years of devops push, people still usually only get 70-80% there, and the remaining 20-30% can still crush an organization.
And all this is before we see the true price tag. If you were paying what it was worth, and these companies weren't losing money in the billions per quarter, it would likely still cost less to hire a competent non-AI dev.
When they start charging for this shit, you'll have a bunch of hamstrung devs and piles of unreadable code, and the old guys will probably have to come out of retirement, like the COBOL crowd, to clean up the mess.
In the meantime, I'm a goat herder.
Sprucecaboose2@reddit
The first time code written by AI costs a company millions, the resulting lawsuits will stifle and roll back a lot of this stuff, I have a feeling.
ZaradimLako@reddit
It already did though, didn't it? Didn't Amazon get downtime because of a vibe code session? Maybe something a bit bigger needs to happen, who knows. Perhaps something where human lives are involved.
Sprucecaboose2@reddit
I'm thinking doing something like a stock company making a hugely negative trade or something like that. It needs to hurt rich stockholders for it "to matter".
Bright_Arm8782@reddit
You don't need an AI for that, in the UK we had a hedge fund use the wrong algorithm and have to close down.
Centimane@reddit
There's really nothing AI could do that would change the culture toward it. There are tons of examples already of AI code costing money.
It would pretty much take an AI company breaking their data agreement and using their licensed AI to exfiltrate data. Maybe a hack that does that without the company being complicit.
Sprucecaboose2@reddit
Give greed a bit of time to work. It's not like the ones running the AI companies are a bastion of virtue.
WaldoOU812@reddit
Not just devs. I'm a systems engineer and AI is a huge help. I can do things with it in a few minutes/hours that would have taken me weeks/months to do previously.
machine_fart@reddit
I wish these goddamn sales people would realize that sooner lol
SummonMonsterIX@reddit
It's been my experience that the natural state of a sales person in tech is lying.
tarvijron@reddit
Lying or golfing.
CoffeePieAndHobbits@reddit
Often at the same time!
CharcoalGreyWolf@reddit
Sales people aren’t about realizing anything other than realizing the next big sale. They don’t have to support it, just proselytize it. They have no ownership of the product, just promotion of it.
GuyWhoSaysYouManiac@reddit
It's really just ignorance in my opinion... They genuinely have no clue. (Almost) no self-respecting decent tech person would willingly go into Sales.
ValeoAnt@reddit
They would if they want to be paid properly lol
Beautiful_Leader_501@reddit
Not just in tech
rb3po@reddit
Realizing that sooner is the antithesis to their function lol
craywolf@reddit
It is difficult to get a man to understand something, when his salary depends on his not understanding it. (Upton Sinclair)
MaelstromFL@reddit
Never happen...
lineskicat14@reddit
Agreed. I think, like automation, it will wipe out some jobs and trim some fat, but it's not gonna be doomsday, at least not yet. I kind of see it going like self-checkouts. Everyone freaked out that they would eliminate all store workers. In reality, you still have grocery store checkout lines, folks bagging, folks helping with the self-checkout machines, etc. Instead of there being 8 lines open, there's 5.
Wild-Plankton595@reddit
I frequently shop at any one of 4 stores. I only see 5 open when there were 20 before. At stores where there were 10-15, I see 1-2, 3 at peak times. Considering many are part timers, that’s a loss of significant number of positions for people who look for supplemental income.
I once brought it up in this sub and was downvoted to hell as a Luddite. My single mother raised us on multiple jobs, including cashier at the local grocery store, so it hits home. Their argument was that it's not supposed to be your sole means of income, and that it replaces those jobs with techs to maintain the self-checkout machines, just like landlines were replaced with cellphones; you need to get with the times, otherwise ditch your cell phone and go back to a landline. Expected in a tech sub, I suppose. However... yup, the teenager making some money to help at home or buy their first car, the parent trying to keep afloat, the retired senior supplementing retirement income, anyone that doesn't need it in normal times but needs additional cash to ride out an emergency, is going to stop their life to train in some tech capacity to maintain these machines? Can it be done? Yeah, it's just not feasible for everyone. These are also the types of jobs someone is not going to commute 45 minutes for, so these are people in our communities. I try to use the cashiers as much as possible, though I know it's not gonna make a difference besides making me feel better.
Anyway, end tangent, hopefully not rant. Back to AI replacing people.
Confident-Corner3987@reddit
Like this “someone else’s thoughts”
J-TEE@reddit
The anti AI sentiment from IT people is not productive at all. It just makes the executives hate IT people more.
OkChildhood1706@reddit
I guess that will happen once companies need to pay the real price and suddenly 90% correct is not enough anymore…
BrainWaveCC@reddit
Some THING else's thought approximations...
paleologus@reddit
Much like the demon in “The Exorcist” it mixes truth with lies so you don’t know what to believe
Texas_Sysadmin@reddit
I feel you. I have been in the IT industry for 31 years now, and frankly, AI scares the crap out of me. Over the years, movies like Wargames, Terminator, and 2001: A Space Odyssey, and TV like Battlestar Galactica, have conditioned me to think AI is going to kill me. So AI alienates me every time I see it.
My skills and experience are still needed, but I feel disconnected from the cloud apps and AI advocates. When I say things like "the cloud is just other people's servers that you're renting," they look at me like I have 2 heads. But I can remember all the "hottest things" in IT that were supposed to change everything but flopped badly.
Most of the current trends have me mentally yelling "GET OFF MY LAN!!!"
Sigh.. I only have 2.5 years to retirement. The day I retire, I am dismantling my home lab and becoming a user, so I can bitch at the current crop of admins.
BrechtMo@reddit
You need to see AI (LLMs) as a companion which can speed up or execute the tasks that you aren't as good at and don't like. Log file analysis, script writing and documenting, and getting up to speed with new technologies are all areas where LLMs can be a real timesaver already.
If people are allowed to blast code that they don't understand and have no responsibility over out to production, AI is not the issue.
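For the log-analysis case, even a hand-rolled pre-filter keeps you in control of what the model actually sees. A sketch (the timestamp/level format is an assumption for illustration) that collapses a noisy log into unique message counts before, or instead of, pasting thousands of lines into a prompt:

```python
import re
from collections import Counter

LOG = """\
2024-05-01 12:00:01 INFO  service started
2024-05-01 12:00:02 ERROR db connection refused (host=db1)
2024-05-01 12:00:03 ERROR db connection refused (host=db1)
2024-05-01 12:00:04 WARN  retrying in 5s
2024-05-01 12:00:09 ERROR db connection refused (host=db1)
"""

def digest(log_text):
    # Count each unique WARN/ERROR message; drop timestamps and INFO noise.
    counts = Counter()
    for line in log_text.splitlines():
        m = re.match(r"\S+ \S+ (ERROR|WARN)\s+(.*)", line)
        if m:
            counts[f"{m.group(1)}: {m.group(2)}"] += 1
    return counts.most_common()

for message, n in digest(LOG):
    print(f"{n}x {message}")
# 3x ERROR: db connection refused (host=db1)
# 1x WARN: retrying in 5s
```

A digest like this is also exactly the kind of summary worth sanity-checking against the raw log before trusting any tool's version of it, LLM or otherwise.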
narcissisadmin@reddit
What if you're already proficient in all of those things?
H3rbert_K0rnfeld@reddit
Did we not learn anything from George Jetson and RUDI?
ipreferanothername@reddit
i get it - im a windows server guy. active directory, GPO, SCCM, lots of powershell automating through various apps, a little citrix and vcenter, and now azure. and its super common for me to try finding technical answers i need via AI and get results that just dont work or arent accurate. that puts me off implementing it for other use cases. so all these apps that i work in just fine - generally speaking, if i learned the app - i dont really want to relearn something that is changing all the time if i can use my regular features for now.
i do like to keep up to date on IT stuff a little bit - probably not as much as i should, but still - and right now i still have to learn and work in azure [ballache], spend countless hours a week on a cyber vault project, tinker with secure boot, figure out where and how to audit for rc4 changes in this awful environment, etc etc etc.
i have adhd and im in my 40s, im a caretaker to my disabled wife [which is not...exactly full time, but its a thing], and its hard to keep up with all the stuff i want to keep up on. im not the type who enjoys doing labs outside of work hours or who wants to read lots of documentation and watch youtube either. i want to tend to my house, my wife, and some video games or social circles.
i also find all the tools and things that have to get chained together harder and harder to follow. a buddy of mine has been a sharepoint/office/365 trainer for almost 20 years. several years ago he started his own consulting business to work with customers who were asking about it.
last weekend, on a lark and probably a lot of caffeine and nicotine, he....started to mess with claude code and ended up having it build an outlook plugin to work around a weird limit he found between 365 mail and some azure products or something. and he was explaining it all to me, how it all fit together, and i just...couldnt entirely follow it. he doesnt even script one-liners, never mind develop anything, and boom....he has a plugin.
i gave up on git after finding it to be such a headache every time i tinkered years ago, im good at powershell and other technical products and aspects of my job but trying to maintain our traditional stuff while all the cool new toys are rolling out feels like its impossible to keep up. i was hoping i wouldnt feel that way for another decade :-/
narcissisadmin@reddit
I was mostly on board and then you lost me.
MedicatedDeveloper@reddit
You need to adapt. If I were to be on an interview panel and someone said 'I gave up on git, it's too hard.' they'd immediately be sent to the back of the pile.
Some things are not optional.
Your job is trying to get you into a more relevant skill set with the Azure work. Embrace it because if you went out and interviewed now it'd be very hard to land a role. Windows server and M365 administration are some of the most outsourced admin roles ATM.
ipreferanothername@reddit
oh, im with you - its frustrating to me but i keep having to just tell myself 'nerds make shitty nerd tools' because so many annoying things have become the norm. i dont need git at work right now, i want to move into some things that would require it and make myself get it. i want to learn vscode better, im ok with it, but i know i can get more out of it.
but then also - git is the easiest bit, and vscode is more about can i find time to tinker more [because i also find some of it fairly annoying , though i do use it regularly].
figuring all the annoying ways things in azure will or will not work together is another level of bad tools by nerds, for nerds. i want to replace one product: jams scheduler. azure wont do that with one product. its a kinda nice perk in azure that you can granularly just use what you want or need - but also it means i have to try and figure out how to replace my functionality from one tool into many azure tools and probably a couple of 365 tools. then i have to have time to learn and build that out...and then the migration.
im still willing and trying to learn things, i have learned a fair bit about azure but getting the experience at work this minute is a real challenge.
Pineapple-Due@reddit
The best way to learn git and vscode is the same as the best way to learn powershell. Just start doing it. Stop using the ISE and start using vscode instead. Watch a 5 minute video on git and just start playing around with it.
MedicatedDeveloper@reddit
You may have more success if you shift your thinking. A tool you do not understand is not a 'shitty nerd tool'. Shitty nerd tools aren't lynchpins of doing our jobs well.
NighTborn3@reddit
I hate this mindset
jaydizzleforshizzle@reddit
Correct, the people who were able to put some powershell together are going to have to step up their game.
ipreferanothername@reddit
i mean i dont just put powershell together - i write modules and automate workflows in lots of products, do a fair bit of sql, and had to dabble with many other random bits of languages on the fly. had to lean fairly well into javascript a while back on top of it.
but also - now ive been around the org for a minute and have a big pile of things on my list and the pile isnt getting smaller you know?
Centimane@reddit
The idea that none of that is in git is a nightmare
Break2FixIT@reddit
Your (our) days are numbered.. if you can lift a ladder and install equipment, you're safe.. for now
ErrorID10T@reddit
Just like every SDWAN solution I've used has been trash that's ultimately less efficient and more expensive than a competent network engineer, AI, at least for the foreseeable future, is going to be the same. I have yet to find any kind of AI that remotely threatens a competent sysadmin. It's just another shiny new thing executives are convinced we MUST have, but ultimately it's a tool that does more harm than good if you don't know what you're doing.
Flat-Classroom4230@reddit
Fortunately I haven't seen too much of it in my place of work so far (aerospace), but it's coming. What's worrying me more is all these warnings about AGI causing an extinction-level event.
thewunderbar@reddit
If I had a nickel for every fad that came and went I'd be retired already. And to be clear, the "AI everything" is a fad. But the toolset itself is not.
What we currently call "AI" is a tool. Like all new tools, it's seeing extensive use while people figure out what it is and isn't good for. I use Copilot every single day in my job, but that doesn't mean my job has become "Copilot prompt engineer." It's one tool among many. Just yesterday it helped me troubleshoot an issue much more efficiently; it got me where I needed to go faster than clicking random links in a Google search would have. I still verified it, but it provided me a much better starting point.
And I also find the tools very good at script creation. It's not 100% perfect, and I do end up modifying a couple of things. But a script that used to take me 4+ hours to look at is ready to test in an hour or so, because the tools provide me a starting point that's much closer to the finish line than whatever random thing I start with on stack exchange.
It's just another tool. Lots of companies are in the "this will change everything so lets use it for everything." In 3 years, that pendulum will swing backwards, somewhere into the middle, and we'll be using these tools every day without really talking about it.
simon-g@reddit
100% this. Copilot is doing a great job of taking my crappy powershell scripts and adding all the batching, error handling and logging that I didn’t do. It turns my blunt email replies into polite phrases. It takes my end user documentation and points out the things that won’t make sense to non-IT folks (and how to rewrite). It tears through a million line log file and gives me a nice summary of what is normal and what is wrong. It gives me nice digestible explanations of things when I need one for my boss or someone higher up. I could go on.
It frees up my brain for the more important things. And yeah, it doesn’t always do a perfect job first time, there’s no substitute for judgement and experience. But in the sweet spot of what it does well it’s pretty amazing.
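A sketch of the scaffolding being described, in Python rather than PowerShell and with all names hypothetical: batching, per-item error handling, and logging wrapped around work that a blunt one-off script would do in a bare loop.

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(levelname)s %(message)s")
log = logging.getLogger("bulk-update")

def process_user(user):
    # Placeholder for the real per-item work (e.g. a mailbox or AD update).
    if user == "bad-record":
        raise ValueError("malformed user entry")
    return f"updated {user}"

def run_in_batches(users, batch_size=2):
    # The scaffolding a quick script usually omits: process in batches,
    # log progress, and keep going when one item fails.
    failures = []
    for i in range(0, len(users), batch_size):
        batch = users[i:i + batch_size]
        log.info("batch %d: %d item(s)", i // batch_size + 1, len(batch))
        for user in batch:
            try:
                log.info(process_user(user))
            except Exception as exc:
                failures.append(user)
                log.error("skipping %s: %s", user, exc)
    return failures

failed = run_in_batches(["alice", "bob", "bad-record", "carol"])
print("failed:", failed)  # failed: ['bad-record']
```

The judgment part stays human: deciding that a bad record should be skipped and reported, rather than abort the whole run, is the kind of call the tool can't make for you.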
Unique-Path4099@reddit
copilot here too, i use it the same exact way and its been pretty amazing but I will add that I've taken time and 'trained' it (if thats even possible). I've found its most productive use to be via email/teams, its wording to mgmt gets actual responses. I hope those starting out in IT take the time to learn things for themselves before trusting these tools because they are not infallible...what they are to me is excellent time savers and mgmt translators.
dab70@reddit
This is exactly how I feel
doubleopinter@reddit
This is correct
tarvijron@reddit
^^^ Listen to this dude. Straight up, this is gonna be spellcheck, or advanced autocomplete, or the magic wand selection tool one day. Just another tool that helps keep you honest and helps you move a little faster through the same old tasks.
OmenVi@reddit
The problem isn't this approach, though.
The problem is the number of people who do NOT have this skillset, thinking they're smart enough to implement code because AI said it would work, putting it into production, and causing issues that don't get discovered until disaster has struck.
That and the deletion of Jr. Dev/Admin jobs that remove the opportunity for people to develop these skills to begin with.
If I thought that everything that AI was churning out was landing in front of a competent person before it was getting used, I'd have less concern.
q123459@reddit
All the friction you encountered can be summarized in one sentence: system-glue management tools are made by mentally disabled people, for mentally disabled people, because a human cannot write any decently resilient heuristic around black-box infrastructure.
So if your style of problem solving does not align with their way of thinking, you will struggle to accomplish your task.
The problem with AI is that its API does not follow a conventional tools-style interface: one command might do very little and require you to supply very few implementation details, while another might create a whole infrastructure, and the default settings are not sanitized by anyone to be sane and safe.
Traditional IaC solves specific tasks in a few well-known ways; AI creates its own non-standard systems because it does not know what will work and what won't, and it does not know to be wary.
You already are, if you're juggling system spin-up.
bagomojo@reddit
I think there will be a correction. CEOs are only seeing savings and not the risks. I think we're on the cusp of some major incidents due to AI. Once that happens, we will start to see a correction.
Centimane@reddit
Savings are immediate.
Risks realize into problems in the future.
CEOs of the last 2 decades have proved again and again - if they can save a buck now, at the risk of many more later - they will.
meatballwrangler@reddit
I honestly can't wait for the first few major security incidents that can be directly attributed to AI.
223454@reddit
They will never admit it was due to AI. We'll know. They'll know. We'll know they know. They'll know we know. But they won't admit it. In a few years they'll quietly and slowly begin to rehire. By then there will be fewer people who can fix the problems, so their skills will be worth more.
dotcomGamingReddit@reddit
Like the recent Claude CLI source code leak, which was apparently human error, at a company that claims coding is solved and all their employees use AI?
bagomojo@reddit
The layers of AI slop that will need to be fixed are mind-boggling.
danielfrances@reddit
Yeah, but what if it doesn't pan out like that?
Just last summer I couldn't get these agentic tools to solve a simple bug in our codebase and now we can rewrite entire swaths of the app if we wanted without manually writing any of it. People have been doing just that through various POCs for refreshing our UI and other explorations we have been doing.
Yes, LLMs still need lots of guardrails, and then you have to review to make sure they didn't just ignore those guardrails, but these tools are improving rapidly and need less and less handholding each new release.
Maybe there will be a plateau and then nothing changes for awhile, but people are learning how to use these things more effectively every day. The progress will continue even if it slows.
And the likelihood of more big breakthroughs seems greater than things plateauing - in which case, how long before the models are good enough that they consistently don't produce slop and can rewrite older AI gen stuff to match? That doesn't seem too far off to me.
I can't predict the future, but anyone assuming we are all safe because "lol AI sucks right now and we gotta be here to clean it up" is taking a big risk and is probably operating on dated experiences as well.
A final note: does the business world even care about quality anymore anyway? I'm sitting here staring at the Claude status page with its wild 98.8% uptime, and their leaked source code fiasco, thinking maybe this is a price businesses will gladly pay for the promise of slashing their workforce.
bagomojo@reddit
The issue is that it sucks from a security standpoint. Do you really think a company will proactively go back and rewrite the entire codebase? I am sure the details of the LiteLLM attack will be interesting.
223454@reddit
It's called technical debt, and there will be a shit ton of it in a few years.
BatemansChainsaw@reddit
I've had some great contracts fixing other people's mistakes, and more often than we expected I had to basically start over from the ground up, since that was still the less expensive (but still very expensive) way to do it.
Frankly, I can't wait for the hammer to fall on this disaster, and I hope it's so painful we don't hear the word "AI" without disgust dripping from the statements about it.
UserFrienlyName@reddit
Waiting for a term like "AI legacy code" to appear.
Kuipyr@reddit
Force developers to use AI and then throw the book at them when it screws up. ez pz
vectravl400@reddit
This cracked me up because of the Friends reference. It's likely to be true, though, and in about 5 years' time there'll be a demand for people who can actually parse the problems and solve them with their own thinking minds. Kinda like the COBOL crowd.
Quietwulf@reddit
Just wait till they kill a bunch of people.
Remember the THERAC-25?
bagomojo@reddit
I am a CEO, and man, it is like a cult amongst other CEOs. I own a cybersecurity firm and will tell them there are many issues: hallucinations, lack of data segmentation, etc. And I have had more than a few tell me I don't know what I'm talking about. OK buddy, I'll be here to charge you for the incident response.
ErikTheEngineer@reddit
If McKinsey came to you as a CEO and said "Just follow our playbook and you can have an all-executive company once one of the big 3 gets AGI; just write them a check for 'work' that's 50 or 60% of your payroll" -- wouldn't you be in the cult too?
I think that dream of no more employees, kicking back and watching the river of profit flow in is too hard to shake for most CEOs!
GolemancerVekk@reddit
No more employees would mean they've outsourced all their business to one of the Code as a Service platforms and they're a glorified supervisor. Why anybody would pay them specifically gobs of money at that point is anybody's guess.
tnoy@reddit
If they're at the same frequency as, or lower than, the security incidents created by humans, it's going to be seen as a win.
protogenxl@reddit
AI is a better search engine with stupid levels of investor hype and CEOs overselling the capabilities left, right and center.
Language models date back to the 1950s; it is only now that implementing them at scale has become feasible. Feasible, not cost-effective.
d00ber@reddit
I do consulting on the side, and I reached out to a smallish business after I found all their switch configs in a public git repo, along with a contact @ company.com. Turns out they tried to use AI to automate switch change management and just didn't check in on how it was being done lol. I was obviously trying to get a customer, but they honestly didn't even seem to care.
NoYouAreWrong_@reddit
I'm adapting but leaning into AI. Expressed my interest to management, now I head the AI committee. I'll develop AI governance, security, and enterprise AI tools for staff. I'm trying to become an expert on LLM use, and now more broadly, machine learning and data. It's IT, we were clearly told to expect this throughout our careers.
Vegetable-Ad-1817@reddit
Remember when everyone went to cloud, and the failure to plan ahead for the massive bills and exploitation hit businesses? Pepperidge Farm IT remembers.
DeebsTundra@reddit
All we do is change. It's our job to change and innovate and solve problems. So go out, learn it, figure out what it does well and what it sucks at. Then utilize it to whatever degree works for you. Don't turn into the cranky "back in my day" admin who still uses First Choice because he doesn't like Wordpad.
I'll be 43 this year, doing this for 15 years and I love the challenge of change. Grab it by the hojos.
dnz007@reddit
It's not "use AI, job done" it's use AI, review the response, make it clarify when needed, make it correct when needed, etc.
maclargehuge@reddit (OP)
I think that can work if you already have the skills. My worry is that we will have a generation that doesn't have those skills.
atrawog@reddit
It's going to go both ways. There will be myriads of people who have no clue about anything. But you shouldn't overlook all the 14yr olds that use AI to turn their PCs into Linux Gaming stations. Simply because for them Windows is just boring as hell.
dnz007@reddit
That problem should solve itself. AI helped me study for and pass my continuing education where I have to test in a proctor center and get patted down and waved with a metal detector.
tekalon@reddit
How did AI help you study vs using textbook or other traditional study materials?
dnz007@reddit
I asked it for trick questions and used it to generate practice tests of trick questions.
DontDoIt2121@reddit
You are in it, you've adapted to everything else that's come your way, adapt to this
ultimatrev666@reddit
I'm a long-time prod support engineer / middleware support engineer who was laid off. I've used Python in academia but never C/C++, so I've been learning C/C++ in my spare time. I do ask AI for assistance here and there (as well as Stack Overflow and GeeksforGeeks). I would never take the code AI generates and accept it as gospel; I always use an IDE to sanity-check it.
thewunderbar@reddit
This is where we need to get to. I treat these tools like I treat Wikipedia. Is it correct most of the time? Yes. But if I *really* need to know it for 100%, I'm going to the sources to verify.
Unnamed-3891@reddit
And this is why I am not scared of getting old in this industry. At all.
thewunderbar@reddit
If I had a nickel for every time someone said "the new generation doesn't have the skills" I'd be retired already. (Yes, I'm aware that I used this line twice in this thread.)
People *always* think that. It's not 100% wrong, but it just represents a change in how things are done. We don't do things the way they were done in the 90's or early 2000's because the toolsets mature and change.
koki_li@reddit
I am a sysadmin and I use AI for troubleshooting only from time to time because the results are mostly useless.
I get much better results when I use AI for coding, but if an AI can't get results, it starts making things up.
For all the effort, resources, and energy poured in, the results I have seen so far make me call AI bullshit. We don't live in a world that was waiting for AI; we live in a world where rich people with AIs are desperately searching for a use case for their bullshit.
dnz007@reddit
> the results are mostly useless.
My experience is the opposite, but I have GPT-pro.
atrawog@reddit
I've been working in IT for 30 years now, and from my point of view the issue isn't AI, it's the kind of hyper-specialized monoculture where all your knowledge is about a single tool or company.
I went from C64 BASIC, MS-DOS, Windows 3.11 + Novell, Windows NT, Linux, AIX, VMware, and cloud computing to the Linux OpenTofu/Kubernetes stack I'm using today.
And moving everything to a Claude Code + dedicated skill based system is the next obvious step in a long list of changes.
noisyboy@reddit
Like everything else there is a balance. It's not bad with discussing ways to structure a setup or design/refactor the API signatures. The stress is on the "discuss" part, not blanket copy/paste. Use plan mode, don't yolo. It has suggested patterns that I found to be useful. It has also missed approaches that could have been cleaner.
Writing fresh code, not so great. Massive amounts of copy/paste, total disregard to DRY principles. Obvious improvements to code structure totally missed. So what do you do? Don't make it write tons of totally new code.
Boilerplate refactor e.g. due to schema change etc? Saves me a ton of time doing bog standard modifications across many classes. Also caught stuff I may have missed.
This is a non-deterministic prediction engine. There is no point expecting stuff that requires deep original thought. Use it for what it's not bad at, and even then, keep your eyes open.
justinsst@reddit
AI is definitely overhyped by executives, but ultimately it is a tool and everyone should learn how to use it effectively. Linus Torvalds of all people isn't fully against it (he just views it as a tool); that alone is a big statement.
AI works best when you use it to work on things you have a deep understanding of. I rarely write scripts or terraform code from scratch because why would I? If you know your shit you can describe in detail to AI what you want and make minor changes as required.
You mentioned "vibe-documented", what's wrong with that? I've found AI makes great documentation, in fact sometimes it's too detailed lol. It certainly beats the random scripts and whatever else lying around that has zero documentation in the first place.
As for code reviews, I don’t agree with AI being the sole code reviewer either. See if you can find a compromise by asking for smaller PRs, that way you can review them better. AI will be better at reviewing small changes anyway so you can use that angle when talking to leadership. You should also ask that a summary of the changes is included in each PR, which shouldn’t be an issue since AI can do that pretty accurately.
OtisB@reddit
Shit I've been struggling for 10 years. You want to know what helped me? Take an active part in learning something. When I was doing my degree online a few years ago I discovered that I was MUCH more flexible and adaptable to new circumstances when my mind was in that mode and so now I make a point to do some serious learning once a month or so just to keep myself in the learning mindset. It helps me approach tech that I kind of wish wasn't contaminating my nice static job otherwise.
Craptcha@reddit
New automation tech comes in, businesses get excited and go all cowboy until some people start making costly public mistakes and governments and regulatory bodies start adding guardrails.
The hype on AI (which is a powerful, disruptive tech in its own right) is so high that everyone is fomo'ing. Meanwhile, most organizations haven't even started true organized digital transformation. AI isn't going to manage organizational change on your behalf.
Play along, back yourself with written authorizations when they want to take risks and make sure they own the consequences of their mistakes, then step back and let things play out.
Don’t let them make you the human guardrail of their AI experiments.
tarvijron@reddit
Twelve years ago, I was told being an infrastructure admin was pointless because infrastructure as code was going to allow those MeGABRaiNs over in software development to build their own architecture and I better start working on my resume.
Turns out software developers are generally speaking: morons. Their bosses are also morons. They’ve got great ability (the good ones anyways) to break business logic down into software components. They get bored by persistent slow moving problems or projects and go find new jobs, and then I take over the garbage they built and fix it.
n00lp00dle@reddit
id caveat that with tech ceos. the devs are just code monkeys. the monkeys with ideas become tech ceos.
in any other industry there is some kind of barrier to having an idiotic idea become a reality. in healthcare you have ethical guidelines. in construction you have regulations. but in software theres next to nothing.
if someone went to get vc money to build a dyson sphere they would be laughed at. but for some reason the world economy is handed on a silver platter to men who want to make the digital equivalent.
MeatPiston@reddit
Real.
I’ve met web developers that can’t turn their laptop on (they literally could not locate the power button) let alone understand what a server was. Like they literally did not know all those noisy racked boxes were where their sites lived.
tarvijron@reddit
Ask yourself if the AI rah rah crowd (who piss their pants and tear everything down to redo it every time a new model comes out) have the intestinal fortitude to be running a ten year old infrastructure with an eight year old business plan and a let’s do it next quarter management risk appetite.
discosoc@reddit
Just because something is "vibe coded" or whatever doesn't mean the person doesn't know what they are building. It's such a meaningless catch-all term that basically serves as nothing more than a dogwhistle.
You really need to decide if your issues are actually related to ai, or if that's just a convenient placeholder for deeper concerns. Because the ai tools being utilized right now are absolutely a paradigm shift that can't really be ignored, and even if the overall process is still in the early messy phase, the end destination is legit.
anfotero@reddit
I studied to be a sociologist and am a sysadmin: I learned everything by myself, so this resonates with me.
Yes, and I'm in Italy. There's a demented push. A friend of mine, a Microsoft employee, is desperately trying to facilitate Copilot adoption at a huge phone operator and NOBODY USES IT BECAUSE IT'S USELESS. He himself is realizing that. Italian businesses tend to be small, so they're slow to adopt new tools, but everyone over 500 employees is trying to use it and mostly failing.
Fortunately, I'm a sysadmin so I don't use LLMs for anything. My boss bought me a ChatGPT subscription and I log in from time to time to make him happy, but it's terrible. At first I tried it for troubleshooting and I had to stop after a month of wasted time, aimless bullshit, hallucinations, and commands that, had I given them in the terminal, would have destroyed our environment.
koki_li@reddit
I had slightly better results with Gemini. But in the end, all the AI stuff feels like talking to an idiot who comes up with an OK idea from time to time.
I can't understand the hype.
n00lp00dle@reddit
it works sort of acceptably as code generation. but we've had effectively seamless code completion for years already.
if you offload your critical thinking to it then you are wilfully walking deeper into platos cave.
suncontrolspecies@reddit
this 100%
gbfm@reddit
It's kicking the can down the road
Bad programmers using AI are like the people who used to churn out 100-slide PowerPoint presentations in the past. Lots of fluff, but no substance, wasting the time of anyone who has to sit through those 100 slides.
No one has mentioned that it used to be easy to tell that a programmer's code was bad. The bad formatting usually gave it away. Now a bad programmer and his new BFF the AI can churn out code salad that is perfectly formatted but ill suited for the environment. It now takes forever to read through the bad AI-generated slop.
Unsurprisingly, the worker who churns out 100-slide PowerPoint presentations and the worker who churns out AI slop code don't feel there is anything wrong with making someone down the line waste hours reading through useless fluff.
WaldoOU812@reddit
My very first thought, when my director mentioned in 2024 that we were going to be using AI was, "great, it's just a glorified search engine." I made the smartass comment about, "maybe I'll use it to make a Dungeons & Dragons game." He was 100% on board with that. My director is *seriously* cool, btw. One of his primary bits of advice was, "find something that you really like on a personal level and find a way to use AI for that."
It took me a few months, but eventually I did. I'm a huge tabletop RPG gamer, so one of the first things I did after getting past the, "okay, it's not just a glorified search engine" was to use ChatGPT as a dungeon master (well, "Marshal," technically, for the Classic Deadlands RPG). It wasn't great, but it did a hell of a lot better than I ever expected it would and it was actually a lot better than at least a couple human game masters I've played with.
Point being - that's what got me over the hump and got me really interested in it. Suddenly I'm wondering about how I keep the rules straight - upload documents in projects/gems. How do I keep consistency in between sessions - state file. As I got further in, I'm starting to think it'd be nice to have an html front end that shows party status, quest status, pictures of the NPCs we're talking to, pictures of the places we're visiting, etc. None of that is *directly* related to work, but it's kinda like the wax on / wax off thing from Karate Kid; that knowledge transfers *amazingly* well.
My director uses it to optimize his Diablo IV character builds. My manager uses it to create a grocery shopping / menu app for his family. One of our help desk guys uses it to organize his music collection (which he'd previously been managing with monstrously insane Excel spreadsheets). A friend on another team uses it to build an HTML front end for his Dungeons & Dragons 5e campaign. And I use it for testing (playing) RPGs that I want to play in person, suggesting dinner options (I'm a ridiculously picky eater), certification exam help (ChatGPT was *critical* in helping me pass the AZ-104), scripting, and even tips on other AIs. I've noticed, btw, that ChatGPT is way better at answering questions about Claude than Claude is.
It's not the magic bullet that a lot of people are talking about and it definitely has a lot of flaws. It can hallucinate, big-time, more so if you're not skilled at prompting (as a former computer science teacher told me once, "garbage in, garbage out," and holy crap was he on the nose about that with AI). You have to have a pretty decent chunk of knowledge to fact-check it and know when it's completely off the rails, but if you know what you're doing with it, it's definitely a force multiplier. There are things I can do with AI, as a systems engineer (NOT a dev, though I do some scripting on occasion), that would either take me significantly longer or that I simply couldn't do at all. It can find that one gold nugget of good information on the 4th page of a Google search that I never would, and once I learned to tell it, "look at these documents I've uploaded as your tier 1, look at this website as your tier 2, and then come back and tell me you can't find the answer if you don't see it in either," it got a LOT better.
Whether we like it or not, AI is absolutely the way things are going, and yeah, maybe the bubble will burst at some point, but so long as there's any kind of AI in place, the admin who knows how to use it is going to run circles around the admin who doesn't, all else being equal.
And btw, I say this as a 58-year-old Windows senior systems engineer with 26 years in IT. The understanding isn't limited by age/generation.
WaldoOU812@reddit
Oh, and btw, I'm right there with you on understanding the why behind everything. I'm the guy who historically has always driven everyone nuts because it's never enough for me to just understand "do X to fix Y." I always want to understand why X fixes it, why Y broke in the first place, and everything else I can about the situation as a whole. I LOVE AI for that very reason. Yeah, you can just ask, "fix this for me," and it will (sometimes), but you can also ask 101 questions about the why, to the smallest level of detail you want, and it'll never lose patience with you. It might run out of tokens, but it won't run out of patience.
And yeah, the whole, "just fix it without understanding" approach is definitely made a lot easier with AI, but that's a personal preference and related to the individual admin. IMO, a smart admin who legitimately cares is going to learn the why behind everything and be much smarter/more capable overall and will use AI to help in that growth process.
immortalsteve@reddit
The amount of vibe-coded apps leading to massive security breaches is on the rise and I view that as job security.
Aless-dc@reddit
I’ve been a day 1 ai hater, but my business has never been good at supporting staff on day to day things so I find myself bouncing ideas off it like it’s a coworker. And legit it’s actually pretty good for that. I catch it out on things but in the same sense where you just need a second set of eyes to break you out of your tunnel vision, it’s good. And also reading debug logs.
ErikTheEngineer@reddit
What I've been finding is that junior folks will crank out massive piles of code that don't exactly work right but are close, and it takes someone knowledgeable about how things work in practice to nibble around the edges and fix it. Or they'll get "write-only" code that needs to be reformulated to be understood by normals.
I do understand how you feel though. I got into this because I like solving complex problems and it feels like I'm just writing essay-format Google searches. I see why the bubble exists too; this is the first time we've come even close to natural-language queries giving even good responses.
Complete-Cricket-351@reddit
probably what's likely to happen is that the AI economic bubble will burst just like the dot bomb and then things will slow down a bit on the adoption side. But like the internet the technology won't go away so you do need to learn how to orchestrate and cross-check AI.
That will buy you a couple of years of relevancy and after that who knows.
Disclaimer: I'm not a real tech, I'm a tech PM and tech BA who vibe codes. I built a website and a job-finder alerter with AI. I'm up to the point where I've got a couple of repos because my code bases got too big, but that's about it. I'm not a real developer.
EstablishmentTop2610@reddit
AI will continue until morale improves
Pristine_Curve@reddit
I predict an operations crunch, where suddenly operations and security is the critical bottleneck.
AI tools are driving the cost of 'producing code' to zero. Meaning we will see new systems, and new code creation going vertical on the graph, but the number of people who can debug, integrate, and validate those systems stays the same.
"Won't AI just do that as well?"
Possibly, but I would bet on reckless code growing faster than the automated capabilities to deal with it.
jmeg8r@reddit
I'm in my mid 50s and have been a VMware Admin most of that time and some cloud deployments. Similar feelings as you, but I resolved that I have to take a leap and learn yet another technology paradigm shift. AI is by far the most extreme shift I have seen in 30 years. After a year of learning on my own, I am fully into AI and learning as much as I can. Luckily I work in healthcare, and they will not be on the very bleeding edge of technology. I hope you can manage this transition. I have seen many that feel like you just fade away in the background. Good luck!
AndyGates2268@reddit
Leaning in with the most important data there is to get right. That sound you do not hear is my pucker factor squeaking too high even for dogs.
Skullpuck@reddit
Somewhat in the same boat. Been doing IT for 25 years. I work for state government and thankfully they denied all AI tools on the network for a very long time. They are slowly allowing some in, but are being very cautious. As much as I like to rail on my executive management, this is one of their better decisions. Thankfully, we have not had to deal with a lot of AI nonsense... yet.
I'm sure it will happen. But, the idea that System Administrators are not going to be needed in the future is legit. Intune and others are pointing the way to "end user management" instead of "IT management". I truly believe we are a species that will die out.
My teenage boys asked me what they should put as their elective for Freshman year at high school. I told them anything except coding and computer management. They told me they wanted to be like me and work for "Microsoft IT". I told them that I was flattered, but, they would be better off studying something that humans will always need like food preparation or plumbing/engineering. It's sad. I was curious if my kids would follow in my footsteps and here I am telling them not to. It sucks.
the-illogical-logic@reddit
I would download and give codex and antigravity a try. Although they are meant for coding you can use them for other things. As you can give it access to files in a folder that it can read, create and modify files directly, it makes it much more useful compared to using something like the standard chatgpt type of interface and having to copy and paste all the time. I prefer codex as you get a decent amount of free use.
As you did EE I recommend getting something like a xiao esp32s3 or S3 plus for the extra bits and use it with the Arduino ide. Not directly work related, but you will probably find it fun and using something like codex makes it quick to do things. I think once you get into it you will find it useful in other ways that will be of benefit work wise.
I use it to control things like LEDs and motors connected to the esp32 using a tablet, which connects to the esp32 via WiFi. The esp32 hosts a web server which I connect to show the dashboard. What I was able to do in a couple of weeks would have taken me years to work out and get working manually.
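To make that pattern concrete, here's roughly the same routing idea sketched with Python's stdlib on a desktop instead of the esp32 toolchain (this is a stand-in, not the actual Arduino code; the `/led/on` route and the state dict are just illustrative, with the dict standing in for a real GPIO pin):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import threading
import urllib.request

led_state = {"on": False}  # stands in for the GPIO pin on the real board

class Dashboard(BaseHTTPRequestHandler):
    def do_GET(self):
        # Route the request path to a state change, like the esp32 sketch does
        if self.path == "/led/on":
            led_state["on"] = True
        elif self.path == "/led/off":
            led_state["on"] = False
        body = ("LED is ON" if led_state["on"] else "LED is OFF").encode()
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the console quiet

# Bind to an ephemeral port and serve in the background
server = HTTPServer(("127.0.0.1", 0), Dashboard)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Hitting the route flips the "pin", exactly like tapping the tablet dashboard
print(urllib.request.urlopen(f"http://127.0.0.1:{port}/led/on").read().decode())
server.shutdown()
```

On the actual board the handler body writes to a pin instead of a dict, but the request-routing structure is the same, which is why the skill transfers.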
Zatetics@reddit
If you dont adapt you're earlier on the chopping block. It is inevitable that we will all be replaced by this technology. CC has been a huge multiplier for my workload.
Given that I still have a mortgage, it seems to me much better to be an enthusiastic adopter and power user of the tool, rather than a foot-dragger who isn't worth the token cost. The company will 100% replace me at some point with this technology, but I'd rather that be as late into my life as possible.
Its a fucking security nightmare, but its not my company, i just work there.
WWGHIAFTC@reddit
you may have to scream it out loud, and people still won't hear you, but:
AI is a tool. Anyone using any tool is still held responsible for their turned in work product. period.
I even spelled out 'period' to make it more impactful. lol.
I_cut_the_brakes@reddit
What if I use one AI to review the other AI's work. Does that count?
naked-and-famous@reddit
Jokes aside, this is a solid strategy. If you tell them a competing bot did the work, they get even more picky.
WWGHIAFTC@reddit
sure, if you're willing to stand behind the work you do with the tools you use. There is no 'gotcha'.
If your job is to review code, and you take the accountability of signing off on reviewed code, go for it.
When code you reviewed and signed off as 'good to go' causes a catastrophic issue, that's on you - AI or no AI.
I_cut_the_brakes@reddit
It was just a joke, you guys gotta relax a little.
Dont-take-seriously@reddit
Recently a customer's assistant emailed me about the customer's problem. 'I don't want to tell you how to do your job, but ChatGPT gave me this answer to your problem' and I was subjected to a page of blather, nicely formatted.
Her answer was completely irrelevant, since the problem was that the speakers were hit by a surge and were no longer powering on. But for some reason, the client's request had been translated into a question about Outlook formatting, and I was forced to spend three times as long explaining that I was competent and was in communication with the customer, and that she was happy with the solution (new speakers).
I have decades of experience and training on troubleshooting computer errors. By now I can often make an educated guess in seconds for some error that takes others hours to resolve. I use ChatGPT for help with powershell scripts or event viewer logs and am resistant to using it as a general web search or summarizer. Maybe I am old, but I want to verify the chat. I can tell you that CoPilot usually provides terrible answers that a simple kagi search or reddit search provides much faster without several corrections.
Sharp_Animal_2708@reddit
15 years in government IT here too. the thing I keep telling myself is that every wave felt like this. virtualization was going to eliminate ops, cloud was going to eliminate infra, and now AI is going to eliminate... everyone apparently.
what actually happened each time is the job shifted, not disappeared. the people who understood the fundamentals adapted faster than the ones chasing certs in whatever was trending.
your instinct to understand things bottom-up is the right one. that's what separates someone who can debug when the AI-generated code breaks vs someone who just keeps reprompting. are you getting pressure to adopt specific tools or is it more the general vibe shift?
maclargehuge@reddit (OP)
The latter, for now. I would actually prefer more explicit direction. Since we're so small, I usually adapt best practices from other gov departments. The shift is that we now want to be trendsetters on AI in particular, so it's up to us to figure it out, I guess.
Rocknbob69@reddit
Don't let C-Levels go to seminars....solved
maclargehuge@reddit (OP)
The only way I can do that is to become c-level! 😭
meatwad75892@reddit
38 here. I don't feel alienated, I feel annoyed.
Tech will always be tech, change always happens, that's just how this industry is. But at a time when my team is underpaid & understaffed and our workload is ever-increasing... A lot of time has been ripped away from core infrastructure projects in favor of facilitating AI implementations. We're slaughtering our security posture for the sake of a bubble.
odysseusnz@reddit
Twenty-five-year vet here, and thankfully now risen to a level where I get to make some of the big decisions. I'm fighting the slop as hard as I can, but the AI bros have deep pockets and a siren's call to CEOs about cost savings, and it's bloody hard to keep them at bay. My plan is to hold out until the inevitable crash when eyes get opened and people will be valued for their worth again. I just wish it would hurry up!
OfTheGiantMoths@reddit
Not a sysadmin myself, but I do wonder if some of these projects may be growing Technical Debt exponentially. For example, each AI bug fix introducing 2 more bugs, as it's not capable of understanding a large project.
donttouchmyfries@reddit
did you know the sidewalks in chicago right before the fire were largely made of wood? crazy right?
KingStannisForever@reddit
Ask management who's gonna sign off on the code that AI spews out and another AI checks. Who's gonna take the responsibility? Because I have a feeling you won't be able to sue whatever company gave you the LLM if it deletes half your database... "accidentally".
jhdefy@reddit
Why are you responsible for code reviews as a sysadmin?
maclargehuge@reddit (OP)
Small org, ill-defined roles. I do way too much
jhdefy@reddit
I feel your pain. I'm in a similar situation with a small org and broad roles. Best of luck to you.
Kemaro@reddit
41, been in the industry for 13 years now. My entire life has been technology changing at a rapid pace so I welcome and embrace it. Keeps things new and fresh.
machtendo@reddit
Oh man, I can relate. I'm still not even sold on cloud hosting and SaaS, and I see everyone jumping onboard these last few years, then the AI conversation heats up.
Meanwhile, cloud hosting and SaaS is standard while the big tech companies own and manage all the infrastructure we run on top of, we become more and more dependent, and vendor lock-in creeps in more and more while we're ooh-ing and ahh-ing at what amounts to "super clippy" and building more reliance on those services too.
They make it really cheap to jump into, and make it difficult and expensive to get away from. They aren't even doing it well. How many major outages have we seen from AWS, Cloudflare and Microsoft in just the last 6 months?
I'm really struggling with questions like "is this what we've really trained for" or "is this just IT now" - if I give in, I turn into someone managing subscriptions, tokens, compute cycles and iops. If I hold out, I'm an idealist and/or I become a dinosaur.
All while Bezos and Altman talk about providing "compute as a utility" and we don't even own hardware. They make building systems more and more expensive cause they're the ones driving up demand, and hoarding resources and production lines building these data centers. I remember 2008, and I remember bailing out banks that were too big to fail. If these massive gambles on AI go sideways, is Amazon too big to fail?
Maybe I am just an idealist, or an old man literally yelling at a cloud, I just don't care for the direction they're trying to push us in.
AggravatingAmount438@reddit
There's a really good article I read arguing that the longer AI stays, the more problems we're going to see across the board, and the more hands-on work we're going to need to compensate for it. And right now, suits think they can just replace everything with AI. I Audited Three Vibe Coded Products in a Single Day - From The Prism
Except AI cannot sustain itself right now. They're making absolutely 0 profit, run wholly on investment, and spending trillions in infrastructure.
AI doesn't build a framework in its head when it designs a program. It just spits out code and keeps regurgitating it when you point out problems until it works. These vibe coders cannot tell you what that code does. They have no idea. At best, you get an experienced programmer who uses it to spit out boilerplate code to save time, but they understand what the code does, and they audit it themselves to make sure it's doing what it's supposed to.
But that programmer is still going to have to be employed and high paid, which these suits think they can just swap them out with more AI.
This is another boom. And AI will always be around now, but its current model is not sustainable.
1z1z2x2x3c3c4v4v@reddit
A correction will come once enough people have died, and/or enough money has been wasted or lost.
Some companies just need to learn this lesson the hard way.
renegaderelish@reddit
I feel as if one of the bedrock principles that I am having to instill in my non-technical leadership is the need for support for a product/tool/system and the robustness of it. AI flies in the face of all of it.
It feels like AI is hitting sysadmin work the way business in general is doing anything to make the stock price go up for investors. They want that quarterly report to show green, and AI is enabling some of the sloppiest, most irresponsible work to "just get done". Task completed? Toss that shit out and whip through the next task with abandon. Meanwhile, security holes are left open, shit isn't patched, and nobody really knows how it works.
Just brute force this task and get to the next task. It goes against every principle I've learned. IT costs money because you need expert support. IT costs money because even if it's a little cumbersome, it never goes down.
Now it's all "fuck it. Crank it out and move on." It just feels wrong.
davidm2232@reddit
I left IT about 4 years ago when my company started going to everything cloud based. I had zero control over anything and whenever our internet would go down, we were totally dead in the water and people were screaming at me. I wanted a career where I spin up a few local servers, configure the services, plug the network all together, and build a few PCs to give to users. That's how it was when I started. No cloud, no AI. Heck, half the machines didn't even have internet access. We didn't even think about security. IT has changed so much in the last 10 years it is basically a totally different career.
OkBaconBurger@reddit
I think that a lot of the industry has just geared itself toward maximizing revenue streams and not really providing cost effective services.
So we feed off of buzzwords. I’m on an AI project and the clanker we put in on 100k nvidia chipsets is dumb as fuck. This is our great innovation? I guess it helps justify removing a couple of jobs which makes the C suite happy but it really feels like an expensive toy at this point.
I miss the days when I was racking servers, running cabling, and getting my hands on gear.
Solkre@reddit
Im 43. To me it’s a (usually) more useful Google search, that’s all. I still have to know what I’m doing to not take shit advice.
The Internet meme game has exploded though!
techdog19@reddit
Only constant in IT is change. Adapt or die. Don't mean to sound mean but tech is constantly moving
KrakenOfLakeZurich@reddit
I'm a developer and currently getting pressure from above to adopt AI. 6 months ago, I was very skeptical. The models back then produced so much garbage that fixing all the nonsense took more time than just hand-coding it.
The latest models are quite capable and can produce decent code. I don't "vibe". To produce quality code, careful planning and diligent reviews are required. It's still the old "bullshit in, bullshit out". AI is just an amplifier in between. It can make you more productive, but if you feed it "bullshit", it will output "bullshit x 10".
(As of now) I wouldn't let AI touch a prod system directly. Only through carefully reviewed and properly staged GitOps.
My biggest issue with AI today isn't the AI itself. I see it as a powerful tool that can help at several stages in the development cycle - and probably with sysadmin tasks too, given proper guard rails.
The problem is the unreasonable expectations from manglement associated with it.
cdoublejj@reddit
AI is bullshit, MOSTLY: https://www.youtube.com/watch?v=h3JfOxx6Hh4
LITERALLY ENRON ALL OVER AGAIN.
wabi-sabi411@reddit
Bro all the kids are vibe coding as fast as they can while not understanding what they are looking at. It’ll be fine.
wise0wl@reddit
I’m half way between doom and tepid optimism. Most companies don’t care about understanding as long as their goal is accomplished. Do customers get served, are KPIs met, are features shipped; that’s what matters.
Best case scenario is qualified engineers become the new ultra technical project managers that translate business requirements into detailed technical specs while orchestrating LLM agents to write the actual code and checking for correctness.
Business people aren’t going to take on the responsibility and time of doing all this. It’s still a full time job. But it’s one or two people instead of twenty. That’s the reality.
We’ve kept our Platform Engineering team small on purpose. It has six very qualified engineers with specific domains of responsibility. We are utilizing LLMs to write a lot of our automation and assist with verification of correctness. I could see the team shrinking slightly but not much. Too much context switching would be required and not enough time to verify.
I think a lot of companies are hoping that LLMs will be able to write the code, write the automation, write the monitoring, verify everything, deploy everything, watch for errors, fix the bugs, and all in less time and for less money and with few or no humans in the loop except for the Product and Executive leadership. Maybe. If the tooling gets much better then maybe, but I feel like it’s just too much context for an LLM to handle, even with sub-agents and limiting context windows and compaction and all the tricks.
wabi-sabi411@reddit
I think it's gonna hit a compute wall for a while. Adding more compute is already starting to max out on gains. But the number of jobs and micro/small apps being produced daily is astounding. I have kids trying to replicate corporate IT functions in VBA, and because of management siloing it just happens. I spent 5 years as a dev, and many people in IT start with VBA to help a business group. The difference is that now they can produce a full-on management system if they're left alone.
jhdefy@reddit
When they tell me about "all the wonderful tools that exist to automate code review as well and we can automate from all sides," I ask for a demonstration. I start asking questions about how it works, which specific tools they mean, and for a demonstration of industry-proven patterns (which you, as the admin, can likely provide from your own current work).
Genuinely try to engage in the conversation on a deep level with the person to see how shallow their comment truly is.
BrainWaveCC@reddit
On the one hand, technology has always been about automation, so this direction is not entirely a surprise.
Also, you work in security compliance, so you need to consider looking at the problem structurally, holistically, and not just in terms of the specific outcomes from vibe coded apps.
Reclaim some time by automating as much of the assessments as possible, and use that time to look at the underlying risks and threats, and begin to document that.
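As a rough sketch of what automating one of those assessments could look like: the check below flags AWS security group rules that expose sensitive ports to the world. The rule dicts mirror the shape of boto3's `describe_security_groups` output, but the logic is pure Python, so it can run against exported JSON without touching AWS. The port list and function names here are just illustrative, not any standard.

```python
# Illustrative compliance check: find security group rules that open
# sensitive ports to 0.0.0.0/0. Input dicts follow the general shape
# of boto3's describe_security_groups response.

SENSITIVE_PORTS = {22, 3389, 3306, 5432}  # SSH, RDP, MySQL, Postgres

def open_to_world(rule):
    """True if any IPv4 range in the rule is 0.0.0.0/0."""
    return any(r.get("CidrIp") == "0.0.0.0/0" for r in rule.get("IpRanges", []))

def flag_risky_rules(security_groups):
    """Return (group_id, port) pairs where a sensitive port is world-open."""
    findings = []
    for sg in security_groups:
        for rule in sg.get("IpPermissions", []):
            lo, hi = rule.get("FromPort"), rule.get("ToPort")
            if lo is None or hi is None:  # e.g. all-traffic rules
                if open_to_world(rule):
                    findings.append((sg["GroupId"], "ALL"))
                continue
            for port in sorted(SENSITIVE_PORTS):
                if lo <= port <= hi and open_to_world(rule):
                    findings.append((sg["GroupId"], port))
    return findings
```

Run on a schedule, dump the findings into a report, and you've reclaimed review time for the structural risk work.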
There's going to be quite a bit of AI-related pain over the next 12-18 months, at a minimum. Lots of vulnerabilities will manifest relating to the technology and its implications. Best to start targeting that now.
For now, they can maintain some sense of the WHAT at a high level. And the overall WHY.
But the HOW? Not for long. And the WHY of the HOW? Nope.
Silver_BackYWG@reddit
I fuckin hate babysitting these not ready for primetime AI apps.
Int-Merc805@reddit
I talked to a friend about this recently and he said he left IT and got into airplane mechanic/maintenance because it’s heavily regulated and often enough it’s not a boss dictating anything, it’s manuals, maintenance schedules, and there’s hefty regulation and fines when steps are skipped.
IT is the Wild West. I am transitioning myself, going into facilities and maintenance as I see the writing on the walls and honestly want nothing to do with the politics of AI stuff anymore.
Facilities and maintenance is EASY and better money. Once you understand systems like we do, it’s dead simple and really fun.
Jawshee_pdx@reddit
Adapting is a cornerstone of our industry. We have to familiarize ourselves with modern technology whether we want to or not.
AI is here and isn't going away. Learn to work with/around it or be left behind.
Code review in particular I am happy to hand over to AI. Computers talking to computers seems like a perfect use case.
AlkalineGallery@reddit
I am loving it so far. Clippy (AI) enabled infrastructure is stupid as hell: it lets me move fast, but it's chock full of logical errors. We're replacing the "one and done" culture that enterprise relies on with a workflow that can build super fast... but the error-checking and testing phase is nearly never-ending. So in the end the time frame is the same, and the possibility of errors is exponential.
But! For homelab, Clippy-driven workflows are golden... They break in really cool ways and force me to learn.
KiefKommando@reddit
I am of the mind that what we are seeing currently is a bubble, something akin to the dot com bubble and subprime mortgage scandal combined. Once it pops and people lose billions there will be course correction. It won’t go away completely but I think we will see “AI” used in more specific scenarios instead of being shoehorned into everything like we do now.
GullibleDetective@reddit
It's trumped up; your job is safe.
If anything look into specialization and how you could help make it work for you
kramit@reddit
You are looking at it the wrong way. You are in IT; you exist to put out fires, fix stuff, and patch computer software and hardware together. What you are describing is a massive dumpster fire. Awesome!! When this implodes and it all goes to shit, who cares? Job security for years to come. Let it burn, just make sure you have a fire extinguisher in one hand and a request for a pay rise in the other when the time comes. IT always moves too fast; nothing ever sticks.
SageAudits@reddit
Do you have opportunities for CPE training? Some of this is just going over what risks are being introduced by these tools. All AI really is, is a tool, and a lot of it is just built on patterns.
For your developers, what systems are they really accessing? Touching sensitive data? DB calls? S3 bucket calls?
Take risk-based approaches to this: if they're spitting out way more code, maybe you need more frequent pen testing via outside firms, for example. Does your pen testing involve internal and authenticated-user testing, or some grey-box test? Maybe more frequent vulnerability scanning; and what tools do you have in your stack? I'm sure seeing a ton of static code review tools but not a ton of dynamic code review tools out there, and those may not work with your setup, or you really need to configure them, otherwise it's a server scan and not a web-app scan!
It goes back to how often are things changing and if your monitoring capabilities are keeping up.
Are they doing CI/CD? How do you review SCA? What visibility is there into packages? Container security?
Some of these things just apply regardless of AI, but how fast things might be changing that frequency is important to have more aggressive monitoring.
I also get that you're in government, so you probably have a strictly defined budget. If you can figure out where the gaps are, and get input from your application developers as well as the security team, maybe you can come up with a game plan to bring to management where you can really show the return on investment for any new tooling, FWIW.
Desnowshaite@reddit
AI can already present high-quality results in a way that its logic and sources go well over the heads of (some of) the readers, so to them AI is extremely smart, and you will never be able to convince them it has fundamental issues with the material it outputs.
Basically, a large portion of the decision makers have been convinced, or rather defeated, by AI simply by not understanding what AI does and how it generates its output.
Artificial Intelligence is winning over Natural Stupidity, and there is very little that can be done to prevent it in the long term. As long as there is money to be made on AI, there will be directors and CEOs and managers who will want to use it regardless of issues they probably can't even understand anyway.
suncontrolspecies@reddit
unfortunately, true
Nonservium@reddit
I’m 27 years in and have stepped out. Don’t know if I want to go back. The Corp and MSP spaces both suck. I have no desire to manage subscriptions.
transer42@reddit
I also feel a bit tired of the "next new fad" syndrome. There's already so much to keep on top of. But also... if we ignore it, we probably will be left behind. I've got too many working years left to ignore it.
So, what I've been doing is trying to get a little hands on. I've been using a chatbot almost like another engineer - I can go over issues I've been having and go back and forth with results. It basically functions as a faster, smarter Google for me. It can parse logs WAY faster, and can usually spit out chunks of scripts that are pretty good. Of course, ALWAYS test and verify. It's definitely sped up my ability to deal with tough problems or update old automation.
Since you're on AWS, it might be worth looking at the AI Practitioner cert. I went through it less for the cert itself (which isn't worth much, imo) and more to get some foundational knowledge as to how LLMs work under the hood (as much as anyone knows, at least), and what tools are available. The cert looked good to my bosses, and gave me some knowledge to speak a little more knowledgeably on AI, particularly within the AWS system.
Also, shout out to the self-learning. I went to school for archeology, and have been learning IT as I go the entire time.
robotbeatrally@reddit
I'm surprised anything even govt-adjacent can use AI in any kind of safe way and still meet security requirements. It was too hard to manage even with GCCH/secluded AI stuff; we ended up just disabling it and adopting a complete no-use policy. At the end of the day there was always some kind of issue.
belinadoseujorge@reddit
same feeling here
lucky644@reddit
I mean, yeah, you are going to be left behind if you ignore it entirely.
I use AI for time consuming, repetitive tasks, like parsing log files or creating spreadsheets of mind-numbing information. Basic stuff. But I have a good idea of how it works now.
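A trivial example of the kind of parsing I mean: counting error lines per component so you can see at a glance where a log file is noisiest. The "LEVEL component: message" log format here is made up for illustration, not any real tool's output.

```python
# Illustrative log triage: tally ERROR lines by component.
# Assumes a hypothetical "LEVEL component: message" line format.
import re
from collections import Counter

LINE_RE = re.compile(r"^(?P<level>[A-Z]+)\s+(?P<component>\S+):")

def error_counts(lines):
    """Return a Counter mapping component name -> number of ERROR lines."""
    counts = Counter()
    for line in lines:
        m = LINE_RE.match(line)
        if m and m.group("level") == "ERROR":
            counts[m.group("component")] += 1
    return counts
```

Boring to write by hand for every one-off log format, which is exactly why it's a decent thing to hand to an AI, as long as you read the result before trusting it.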
That_Lemon9463@reddit
your frustration about the juniors isn't really about AI, it's the same problem that existed with stack overflow copypasta and blindly following blog tutorials. AI just makes it faster to produce code you don't understand. the difference is that you built the mental models to know when something is wrong. that skill doesn't go away and actually becomes more valuable when everyone else is shipping code they can't debug. in a government compliance environment especially, someone has to actually understand the infrastructure when an auditor asks why a security group is configured a certain way. "the AI did it" is not going to fly there.