Sysadmin horror stories
Posted by IloveSpicyTacosz@reddit | sysadmin | View on Reddit | 329 comments
I love a good sysadmin horror story/experience.
What has been your most interesting experience while working as a sysadmin?
MightySarlacc@reddit
The Flood Part 1: (10k character limit)
In the summer of about 2004 or so, life was good. I had moved up through the ranks from helpdesk to junior local admin and was now part of the global email team. We did Notes/Domino, GroupWise, MS Mail and Exchange. Of course Exchange. The US team was gathered from across the country in the Eastern US datacenter for planning meetings and good times, but the official purpose was to migrate the final Exchange 5.5 mailboxes to 2000 and fully remove all the 5.5 infrastructure. I was both proud that my hard work was recognized and a bit scared, as I was more of the Notes/Domino guy and was not as up to speed on Exchange as others on the team. Among other things, this was also to help extend my training/indoctrination in the wonderland that was, and forever will be, from now and until the heat death of the universe, the only email system you shall ever desire: Microsoft Exchange.
On a Saturday afternoon, after going out for lunch, we sat in the 'data center' watching the wonderful progress bar for mailbox moves that was the Exchange 2000 move wizard. Think of the old Microsoft move status bar. Little blue boxes on a line that sometimes move, an animated envelope moving from one side to the other, and a time estimate that has zero basis in any reality that humans can comprehend. So we sit there, kicking off a handful of moves at a time and watching for the moves that crash and burn. It's explained to me that you don't want to do too many in one go, as if a batch fails, it's a pain to sift through it all and rebuild it. Conversely, you don't want to do too few, as then you are tied up just queuing mailboxes up. So you put a batch through, sit back, watch the blue bar do its thing and shoot the breeze. It got boring, pretty damned boring. This is also our second day of doing this. We are really, really bored. It's getting close to dinner time on Saturday evening.
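The batching tradeoff described here — batches big enough to stay ahead of the wizard, small enough that a failed one is easy to sift through and rebuild — is just chunking. A minimal sketch of the idea in Python (function and parameter names are illustrative, not the actual Exchange tooling):

```python
from itertools import islice

def move_batches(mailboxes, size=10):
    """Yield fixed-size batches of mailbox names to queue for migration.

    Small batches keep a failed move easy to re-queue; large batches
    keep you from spending the whole day feeding the wizard.
    """
    it = iter(mailboxes)
    while batch := list(islice(it, size)):
        yield batch
```

Each batch can then be kicked off and watched independently, so one crash-and-burn only forces a re-queue of its own members.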
So we are there in the Eastern US 'data center' (EDC), which really was a converted strip mall subunit, for one of the world's top firms in our field. The setup of the EDC is such that when you walk into the front door, on your left are restrooms, the breakroom and about 5 manager's offices. On the right are 5 or 6 bays of cubicles each in a 4 desk setup. An aisle runs between the offices and the bays, the bays sided with low rise filing cabinets that have printers perched on top of them. At the rear of all this is the data center. Elevated by the raised floor with full glass walls, you can see exactly what this place is when walking in. The 25 or so racks are full of equipment that all have wonderful blinking lights to let you know that important computer stuff is going on inside of them. In the room you can see its large air conditioner/dehumidifier just past the entry door. We are holed up in the second bay of cubes, right next to our boss's office.
So, as mentioned above, it was boring. To combat a bit of this, we start tossing around some of the foam toys that were all the rage at tech conferences in the late 90s and early 00s. We had foam cubes from some Unix vendor, a little globe from WorldCom, a brain from Novell, a foam nose from who knows, and many others. In a totally shocking turn, this turned rowdy, less just tossing stuff around and more let's bean the dude in the cube across from me. The boss pops out of his office and tells us to calm the hell down, he's doing work. We calm down.
One guy, let's call him Stan, decides to take it down a notch and goes to his car and grabs a pitching wedge, or maybe a sand wedge (doesn't really matter I guess), from his golf bag. He grabs a few foam toys and goes about halfway down the aisle. He starts chipping the noses, globes, cubes, footballs and the rest toward the break room and restrooms near the front of the office. Some of the other guys start tossing them back to him and he continues to chip foam toys while we watch animated email envelopes fly from left to right over some blue boxes. Stan, for some reason, decides he wants to really hit the nose. So he takes a full swing, a great swing (Stan is a pretty decent golfer), nails the nose, and it goes flying straight down the aisle. Stan follows through, club arcing over his head, and perfectly clips off the head of a fire sprinkler. A bit higher and it would have tangled in the suspended ceiling tiles, a bit lower and it would have bent up the little fan blades. But this swing was perfectly on target to hit the portion designed to let out the pressure built up in those pipes, as well as removing the bit that makes the water fan out.
MightySarlacc@reddit
The Flood Part 2:
We all stood momentarily in awe and shock. Stan is gaping up directly next to the torrent of water blackened by years of sitting in pipes, just waiting for the day when it is called to meet the urgent need to save lives. The other three of us were stunned in disbelief for what still feels like minutes but what empirical evidence shows was only a few seconds. And now the fire alarm sounds and emergency lights begin to flash. Someone then reasonably suggests we find the valve to turn off the water, so we all bolt out the back door to find what would surely be the obvious water main. Out the back door and down a short stoop of a few steps, we fan out looking for a valve. We pop up a few covers but nothing looks big enough to handle the volume of the water blasting out of the clipped sprinkler.
(This part was later related to me). Our boss, go with Al, hears more commotion outside his office and waits for a few moments for us to settle down. Hearing that we don't and now distracted from high stakes spreadsheeting, mail replying, ERPing or whatever boss stuff he was doing, Al pops out of his office to give us an earful. But he finds not a one of us there and instead finds a reverse geyser of water spouting from the ceiling, hitting the floor and splashing the nearby equipment and file cabinets. So he goes into action moving laser printers, document trays and other items away from the splash zone, all the while wondering what the hell just happened.
After a short period of failing to find the cutoff, we all return inside to see what we can do to mitigate the spread of the damage. We divide up. Most of the guys begin to go to each cube and remove any items on the floor and relocate the PCs, surge protectors, subwoofers, and other items as far away from the spray zone and spreading pooling water as possible. In the nearby breakroom, there is a large trash can that is conveniently on heavy duty casters. I remove the liner and push it under the gushing water. The water is crystal clear at this point, the latent brackish content of years of sitting finally purged from the system. It is also cold. I wait a few moments, let the can mostly fill and then push it outside the front door and dump it on the walkway into the parking lot. After doing this once or twice, I am soaked through. I take off my sopping wet shirt as it is getting heavy and binding me, pulling on me as I muscle a very heavy garbage can of water down a carpeted floor and over two thresholds to tip over the curb. Somewhere around 10 trips of this, the local fire department shows up.
Al greets the fire department, assures them there is, in fact, not a fire, and urges them to turn off the water. Out the back door they all go and finally the water is off. They shut off the alarms and the only noise is the dripping of the water from the filing cabinets. We all take a breather while the fire department sorts out what happened. The story above is told, more or less. When we get to the part about trying to turn off the water, we all ask where the shutoff is located. Going out the back stoop, there is a handrail on the right. On the left is a large red pipe, rising about 3 feet out of the ground. Hanging from it is a detachable handle. This is the shutoff valve we all missed. The fireman says it is a good thing we all missed it, as there is a several-hundred-dollar fine if anyone but the fire department uses it. We look around at the waterlogged printers, cabinets, carpets and other items we can clearly see and all come to the same conclusion that the fine would have been worth the price. But as I would learn, water gets everywhere and it likes to stay there.
Sometime in all this we begin to look around more of the disaster zone. The nearby walls, carpet and filing cabinets are splatter painted with black gunk from the initial blasts from the pipes. The carpets are soaked and the water is running slowly all around the floor. And it is running toward the data center. We badge into the data center and see that the Liebert is reporting that the humidity is climbing rapidly. We pop a few of the flooring tiles from the raised floor and there is a stream of water, slowly draining from the dividing wall toward the racks. It is a few feet away from the water sensing switch on the air conditioning unit. We know that if it trips, the aircon to the room goes off. If the aircon to the room goes off, all the racks will be powered down in the US East data center where many critical applications and services are solely hosted (real redundancy was harder and more expensive back then). So we scramble again to find something to prevent this. One guy is popping flooring tiles off to see exactly where the water intrusion is located. I run off trying to find anything to stop the water. In the restroom area I find the janitor closet and there is a large box full of paper restroom towels. I bring the entire box to the data center and we begin to stuff paper towels under the raised floor in an effort to dam the water from progressing any more. After using about 3/4 of the box, we overcome the water's advance and block its access to the water sensing device. The humidity stabilizes just a few percent below the cutoff threshold.
In the chaos and melee of all this, the VP (or AVP, or whatever) who oversees the site, arrives. He had been on travel and just returned home that weekend. We were scheduled to meet with him over a nice lunch or something on Monday. Being somewhat junior and from a smaller office, I had never met him in person, but I was in his chain of command. So I finally met the guy ultimately responsible for signing off on my recent promotion, the guy one step below our CIO, missing his Saturday family time, and here I am shirtless, shoeless, soaked in an office devastated by water from a bored dude with a golf club. Winning, right? Other local managers and service leads start trickling in to check out the damage and ensure their services are running okay. The office manager shows up and she knows all the firemen (it's a smallish town) and makes comments in jest to me about dress code violations.
Plans are now formed about cleaning up what we can. The office manager asks how the basement is faring. We have a basement here? Sure enough, we go outside, down around the hill and in through the back to the 'basement' which has been converted to engineering and design offices. Water has rolled down the walls from upstairs. Documents, electronics, equipment, furnishings are all soaked. More damage. That evening a few guys drive out to Home Depot or wherever to rent some wet/dry vacs. I go to the hotel for dry clothes and return to help with cleanup. The crew returns with a handful of rental carpet cleaners and we get to work sucking what water we can out of the carpeting. As we are running the vacuums, tired, stunned and wondering about the consequences, one of my friends (call him Pete) says:
Pete: "Hey Sarlacc, look on the bright side!"
Me: "Oh yeah, what's that?"
Pete: "We're getting great training for our next job!" I burst out laughing, the tension somewhat released for me.
Stan the golfer: "That's not funny, man, that's not funny at all"
Me: "Disagree bro, that is funny because it's true"
We jokingly then made plans about opening our own Stanley Steemer franchise and similar type work. Stan did not appreciate our gallows humor.
In the end ServPro or a similar recovery company came in to remediate the damage. Dehumidifiers and fans were running in the place for weeks. Some equipment had to be replaced, one piece being a fairly expensive color laser printer. The data center never went down, but it did light the fire to add more redundancy to some key services and applications. I heard the total remediation bill was on the order of $100,000, but I can't be sure. It surely was not cheap. No one was ever officially fired over this, but I'm sure it did not bode well for the future prospects of some individuals at the firm. By the time I left that company, that data center was still in operation but more workloads were being moved to Azure and other cloud providers.
It has been a long time since this happened and some parts were left out and may be fuzzy. For instance I know we were driving in a crazy thunderstorm, maybe to pick up the industrial carpet cleaner from the remediation contractors. I'm pretty sure it was a Saturday when it happened as the office was empty, but I don’t think it was yet night time. Could have been a Sunday.
But there is one thing that remains abundantly clear to me and that is the Exchange 5.5 to 2000 move dialog was terrible!
TLDR: Golf club almost takes out data center. Exchange is all you will ever need.
UnderstandingFun337@reddit
hahaha....man I was going to skip past your story as it was so long but glad I didn't lol. I watched an NDE testimony a few months ago. Basically the man in question said before incarnating on Earth you choose for yourself 3 tsunami events as part of the experience.....this one was definitely one of Stan's lmfao :-p
I can only imagine how he was feeling that day during the cleanup operation, the firemen and VP arriving at the scene all because he had one of those "I've a great idea" moments lol
Public-Big-8722@reddit
What a great story! Thank you for sharing. Poor Stan.. can't even imagine the panic that guy was feeling.
liposwine@reddit
Oh I have one. (Please forgive any errors, I'm having to dictate this voice-to-text and my memory has faded since then.) Back in the early '90s, Novell NetWare was supposed to have this ability to connect two servers together for a failover. I think it was called SFT Level III or something. We had this consultant rock up to the office in this unbelievably expensive car, ready to do this. So we have these two Compaq ProLiant servers connected by a fiber cable. And thus begins the nightmare. Now for context, we were running a FoxPro database with close to a million records. Our CRM software was custom written at the office and to be honest was pretty badass. In any case, we started the testing and the failover did not happen at all, corrupting the database. The sales staff began to be frustrated. Over the next week the two servers would just randomly disconnect for no reason we could ever find. The sales staff began to be much more than just frustrated. During this, at 2:00 a.m., we had the manager of the IT department show up drunk and crying about how this project had destroyed his marriage. The consultant cut his losses and took off running. At this point the salespeople were literally giving us death threats. We reverted everything back to the way it was and continued happily ever after.
UnderstandingFun337@reddit
holy moly....this actually sounds like one of the darkest IT horror stories on this thread :-(
omfgbrb@reddit
SFT III. OMG. Talk about your IT horror stories.... That's some shit I haven't even thought about in 20 years. Who's going to rock me to sleep tonight?
liposwine@reddit
NetWare floppy install was all peachy until you hit that Unicode floppy ;)
omfgbrb@reddit
You can't call yourself a sysadmin until you've done a NetGen on 360K floppy disks. OMG did I think I'd died and gone to heaven when Novell shipped 2.15 and later on 1.2MB floppies...
CantankerousBusBoy@reddit
I don't know what NetGen is and have never worked on Novell. (Or floppies for that matter.)
I still consider myself a sysadmin. Fight me.
omfgbrb@reddit
Oh you precious child! NetGen was the tedious process of generating an installation of NetWare by swapping between 10 and 40 (depending on density of disks) disks while choosing NICs and disks and performing a CompSurf (Comprehensive Surface Analysis) on your hard disk. CompSurf alone could take from hours to days to complete.
We won't even go into ELSGen or non-dedicated servers...
UnderstandingFun337@reddit
I worked with a girl who accidentally reversed the SAN data sync process (source to remote) for a large global hotel firm. This tragic accident destroyed 24 hours of data for the entire multichain hotel :-(
fun_crush@reddit
We had a guy set up a dead man's switch in the event he got fired for not taking the vax. The day came and he was escorted out. 30 days later we had a complete enterprise meltdown… admin accounts got disabled… VMs were being deleted, backups and snapshots got deleted. Years and years' worth of work… deleted. After doing a forensic audit they found that all this was executed using a backdoor service account with god permissions. Everyone in IT management got fired and it took almost a year and millions of dollars to rebuild.
anomalous_cowherd@reddit
That's got to be jail time for the ex admin too?
fun_crush@reddit
Yeah if he had left a trace. It was one of those things everyone knew he did it but there was no way to prove it. In the end the blame got put on our managers and that was really it.
anomalous_cowherd@reddit
Ah well, if he was an antivaxxer he has his own problems...
fun_crush@reddit
Probably the longest weeks/months of my life. 80-hour weeks with only Sunday off. It was so bad 90% of our team just jumped ship and found a new gig.
TaiGlobal@reddit
Damn, it's fucked up what he did, but he really inflicted pain.
_haha_oh_wow_@reddit
That's just what idiotic anti-vaxxers do.
BinaryTriggered@reddit
you realize people are dying left and right from the experimental clot shot, right? it's not even a vaccine.
_haha_oh_wow_@reddit
wtf are you talking about?
Bogus1989@reddit
Not that id do some shit like that…but the first thing thru my head, would be that itd hurt my fellow coworkers lives more than anything else.
rumandbass@reddit
He refused to vax, so clearly he doesn't care about hurting others.
Sarin10@reddit
didn't know this sub was infested with antivaxxers lol
rumandbass@reddit
Predictable. Our industry is filled with libertarians who have extreme superiority complexes.
LUHG_HANI@reddit
They should have paid for outside IT support to help. That's not fair to do to the staff and proves the ex IT guy partly right.
ForTheInterwebz@reddit
There is a big difference between being anti vaccination and not wanting to take a rushed vaccine that your job is trying to force you to take. Not excusing the actions they took with the kill switch.
FluidGate9972@reddit
Nothing rushed about the vaccine.
https://www.healthychildren.org/English/tips-tools/ask-the-pediatrician/Pages/Was-the-COVID-19-vaccine-rushed.aspx
RandomPhaseNoise@reddit
mRNA technology was rushed big time. We know nothing about the long term effects of it. And there was/is a chance that it would/will become the next Contergan (thalidomide) fiasco.
horus-heresy@reddit
Stick to sysadmin stuff. Let the epidemiologists stick to their stuff. Every bozo has now become a PhD with the Joe Rogan podcast. Dumb
Leinheart@reddit
Imagine being this stupid in 2024.
FluidGate9972@reddit
Again, no, it was not rushed: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8940982/
I still cannot fathom that in this day and age, with information freely available everywhere from any device possible, people are still THAT misinformed. Good Lord.
19610taw3@reddit
Something that took 20+ years to develop is definitely not rushed.
Unless you work at a very, very, very, very, very, very slow pace.
_haha_oh_wow_@reddit
If you are working at a very, very, very, very, very, very slow pace, then it is by definition, not rushed.
neosharkey@reddit
Funny how they got 18 years of long term testing done in 20 years.
Testing the delivery method is not the same as testing the payload.
anomalous_cowherd@reddit
The only thing that was skipped is the usual bureaucratic delays.
Appropriate-Border-8@reddit
He would have been able to contract the virus and also to transmit it, unlike the people like us who took the jab(s). Right? 😉
https://www.imdb.com/title/tt12745644/?ref_=ext_shr
cmaurand@reddit
That’s a class c felony in my state.
Isabad@reddit
If they can prove you did it. If they aren't running vulnerability scans or proper audits/accounting (sounds like they weren't), then good luck.
DontStopNowBaby@reddit
When you remove the logs, the SIEM, and the backups, who you gonna call?
punkwalrus@reddit
Not to mention expensive. There are ways to trace all sorts of things, but it will cost you, and rebuilding might be the best way to move forward. Prosecuting a former admin with proof also will take time and money. But if it "took almost a year and millions of dollars to rebuild," I am surprised that they didn't do this.
I have come across a few "time bombs" in my career. Only one had more than a 90 day lead (knowing backups would take 90 days to roll off), but the rest were "within 30 days", which seems stupid. One of them took place mere hours after he was fired. In almost every case, they were served in court, got jail time, and they proved that in MOST cases, if you're dumb enough to try and pull something like this off, you're dumb enough to do something that will trace it 100% back to you.
It's career suicide. But if you think you have nothing to lose anyway, I guess?
WaldoOU812@reddit
You didn't happen to work with Terry Childs, did you?
https://www.computerworld.com/article/2754370/network-admin-terry-childs-gets-4-year-sentence.html
fun_crush@reddit
Wow that’s wild. No, lol I didn’t work with him. It blows my mind that people would even consider sabotage. Especially a government or bank. He’s lucky those weren’t federal charges because that could get you a lot more time.
WaldoOU812@reddit
Right? I've been in plenty of situations where I was pretty ticked off at my bosses as an IT guy, but I always figured if I did anything stupid, that'd blacklist me forever. Not to mention, kind of a sacred trust being given the keys to the kingdom, and I figure you gotta be a special kind of asshole to betray that, no matter how bad the company is.
LucidZane@reddit
Well at least you do airgapped backups now, right... right?
TheStixXx@reddit
I would also like to know if the forensics were able to trace back to former admin and if any legal actions were taken.
Way to push the whole team under the bus.
fun_crush@reddit
Right? Unfortunately the blame for the whole meltdown got placed on management. The audit uncovered tons of things that were either not good practice or blatantly breaking policy. Also, when it happened no one had any idea what was going on. Did we get hacked? Did someone make a huge mistake? Once it was found that it was all tied to a service account, that's where it ended.
inshead@reddit
Yeah I can imagine the fact that he waited XX amount of days until after he was gone meant that he didn’t cross anyone’s minds for awhile.
fun_crush@reddit
Yup. Day 0 was the craziest 12 hours I’ve ever experienced in my career. Amazing experience gained though. I even use it as a resume bullet.
twistacatz@reddit
High level how did you guys recover?
fun_crush@reddit
That's actually the best question that's been asked. So what we experienced was comparable to a fire sale. We experienced a 100% loss server side. We contracted a team from Microsoft to assist with user recovery of data with the clients. We also contacted a team that specializes in cybersecurity at the DoD level, and after their audit we implemented changes that ranged from separation of duties to privilege escalation. I could go on and on with all the changes that happened, but I don't want to type it all and you probably don't want to read it all.
In a way it was kind of a blessing, considering we migrated all our platforms to a cloud-based provider and got off VMware. It took us a year to rebuild everything with new employees, new management and new policies.
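The separation-of-duties and audit changes described above usually start by hunting down exactly the kind of god-mode service account that caused this mess. A minimal sketch of that audit step, assuming a hypothetical export of accounts with privilege flags, last-logon timestamps, and owners (all field names are made up for illustration, not any real directory API):

```python
from datetime import datetime, timedelta

def suspicious_service_accounts(accounts, now, max_idle_days=90):
    """Flag privileged accounts that are long-idle (forgotten backdoors)
    or have no named owner (nobody left to vouch for them)."""
    cutoff = now - timedelta(days=max_idle_days)
    flagged = []
    for acct in accounts:
        if not acct["privileged"]:
            continue  # only high-privilege accounts matter here
        idle = acct["last_logon"] < cutoff
        orphaned = not acct.get("owner")
        if idle or orphaned:
            flagged.append(acct["name"])
    return flagged
```

Running a sweep like this on a schedule, and requiring a named owner for every privileged account, is the boring version of "separation of duties" that makes a 30-day dead man's switch much harder to hide.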
nhaines@reddit
The real victory here.
35andAlive@reddit
Why is that? (non sys admin lurker here)
nhaines@reddit
Welcome! Enjoy the jaded vibes, soak up all the knowledge you want, don't be afraid to ask questions! :)
VMware's been an expensive hardware virtualization platform. It was really cutting edge early on, scriptable, and so on. It was one of the first enterprise-focused brands that really caught on. Before, it was that and Virtual PC, which was less focused on running servers, per se.
Microsoft developed Hyper-V to give Windows a native capability to run virtual machines, and Linux developed KVM to do the same. KVM is entirely free and is incredibly scriptable. And Amazon used it (and/or the tools built around it) to build Amazon Web Services, and DigitalOcean, Linode, and countless other cloud providers use it too. You could use it on your computer right now to spin up a couple virtual machines if you wanted. But VMware, expensive though it was, had the brand name recognition and support and so on.
Well, Broadcom bought VMware late last year and immediately no one could renew their license subscriptions. Then, after about two months, new prices were rolled out and they're all about 10 times the old pricing, with no new updates or changes.
And so, a lot of people are really angry about it, and looking for other solutions. There are plenty, but it's a hassle to migrate 2 or 5 virtual servers. Migrating 100s is a problem.
Serendipitously, fun_crush's company got to migrate to something else as part of their recovery effort, and now doesn't have to worry about VMware at all.
It was just a snarky joke that was fun to say, but that's the "why" behind it.
fun_crush@reddit
Underrated comment… in 20/20 hindsight it was the best thing that could've happened.
TriggernometryPhD@reddit
Breaking News: Company makes bare minimum effort to follow Best Practices; suffers the consequences of doing so.
There's zero excuse as to why / how a sysadmin should be able to cause this much damage to begin with, outside of failing to implement Zero Trust / Least Privilege, audit, and offboarding policies. In a sense, he did the company a favor by forcing them to overhaul it all and rebuild it with security and compliance in mind.
Sounds like a nightmare for those who had to deal with it.
Hebrewhammer8d8@reddit
I definitely would like to read it all on a long trip. Maybe it will be on an episode of Darknet Diaries?
Mrmastermax@reddit
You could write a post with your experience in detail not exposing the company so we can learn from it.
Dookie_boy@reddit
How did you word this in your resume ???
fresh-dork@reddit
it was 30 days after his account disappeared; had I been similarly minded, I'd pick something less obvious, like 38
Impossible_Corgi_239@reddit
This absolutely made my day. Companies that fired people for a personal health choice absolutely deserve this and worse.
EchoPhi@reddit
Actually caught an employee that was on the road to termination attempting something similar. So glad I caught it, it too is a bullet point on my resume.
caillouistheworst@reddit
What happened to the guy who got canned? Did he get in any trouble? I could be wrong, but I’m not a lawyer.
fun_crush@reddit
No nothing. It was a situation where everyone knew he did it but we didn't have enough to prove he did it. The blame was shifted to management for poor policies and bad practices. All management was fired.
caillouistheworst@reddit
Harsh, but it does make sense. No one should have that much unchecked power.
Ok_Analysis_3454@reddit
So did homey catch a raft of criminal charges?
AustinGroovy@reddit
You win.
fun_crush@reddit
Hell no I didn’t lol. 80 hour work weeks for months on end.
horus-heresy@reddit
Were you paid 80 hours worth of salary a week?
caveboat@reddit
Paid?
fun_crush@reddit
yes of course
Appropriate-Border-8@reddit
🤣
Appropriate-Border-8@reddit
Slaves don't get paid, sir! 😠
nsa-cooporator@reddit
Pretty sure he means your story was the most horrific, ergo, you win the implicit competition of this thread.
Appropriate-Border-8@reddit
Thank you, Capt. Obvious.... 😂
highonpsi@reddit
What a hero. Absolute legend.
homepup@reddit
Surely he was hunted down and sentenced to working help desk phones for 20 years or some other type of torture‽‽‽
Appropriate-Border-8@reddit
No offense to the hard working people on the service desk, right? 😉
happyaveragesloth@reddit
He's a chief keyboard dusting officer now.
_haha_oh_wow_@reddit
Did the guy who did it face charges? That sounds pretty crimey...
Hebrewhammer8d8@reddit
I guess the disaster recovery plan was not efficient at recovering the core things.
neosharkey@reddit
Guess it would have been better for manglement if they had just had un-jabbed people work from home.
Lordgandalf@reddit
Oof this is the biggest one I guess.
rj666x2@reddit
This means the company had a lousy access audit or compliance process. But that ex-sysadmin had guts to do this.
Bendy_ch@reddit
I honestly thought you were talking about a VAX system that he didn’t want to take ownership of and got fired for it
Yep, i‘m that old.
fprof@reddit
What is this shithole of a company?
emanuele232@reddit
Lol, I work in a multi-billion corporation and if someone wanted to do something like that they surely could. A quote from my manager: "but how are the devs supposed to work without admin DB access??"
fun_crush@reddit
Right? If I told you the company you might know it. We had around 3K employees so we were mid sized. It’s truly unbelievable how much destruction can be caused by one person on a systems engineering team. It shows you how underpaid we really are.
mallet17@reddit
Geez... what could you even do to prevent this 😵. It definitely pays to have an offsite, isolated backup.
He must have injected every single superpower and loaded up as many modules as he could onto a utility host... then ran them off a schedule.
manvscar@reddit
That's a hard lesson for a company to learn about forcing untested vaccines and immoral mandates.
fun_crush@reddit
Most companies like mine were forced to mandate it by the hand of stakeholders and customers we supported. Good people lost their jobs because of one disgruntled employee. These were people I cared about, and they did the same for me. Life was absolute hell when this happened and I don’t wish it on anyone.
manvscar@reddit
I'm sorry you had to deal with that massive amount of stress. For the record, I don't agree with the actions that rogue employee took.
recent-convert@reddit
Go away, professionals are talking here
manvscar@reddit
How many worthless boosters did it take to come up with that response?
sagewah@reddit
I'll say it because everybody else is too polite: Just fuck off you stupid shit.
esisenore@reddit
How many hours a day do you throw toddler tantrums because other people make different choices than you ?
frayala87@reddit
Gtfo
esisenore@reddit
Someone watch this guy as well
Not_your_guy_buddy42@reddit
sanest antivaxxer
iRyan23@reddit
Well what happened to the guy that set up this shit storm? Was he sentenced?
youfrickinguy@reddit
Fisher Plaza?
Hefty-Amoeba5707@reddit
That's epic
viper233@reddit
RAID 5, two disks failed within a week. We were a small shop and didn't have a spare on hand, had to get it shipped.
RAID 1+0 for the rest of my storage life. I learned a lot about LVM with Linux too after that; pity it's inefficient in the cloud, and adding an attached disk is cheaper and easier. You never trust an attached volume anyway and use managed storage to store application state and files.
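The lesson generalizes: once one disk is down, RAID 5 dies on any second failure, while RAID 1+0 only dies if the second failure happens to hit the dead disk's mirror partner. A back-of-the-envelope sketch (assuming independent, uniformly random failures, which disks from the same batch famously aren't):

```python
def second_failure_fatal_prob(level, disks):
    """Chance that a second random disk failure, arriving before the
    first failed disk is rebuilt, destroys the array."""
    if level == "raid5":
        return 1.0  # parity was already spent absorbing failure #1
    if level == "raid10":
        # Fatal only if the second failure is the failed disk's mirror
        # partner: 1 specific disk out of the (disks - 1) survivors.
        return 1 / (disks - 1)
    raise ValueError(f"unknown level: {level}")
```

Even with RAID 1+0 the odds on a small array aren't great, which is why hot spares and short rebuild windows matter either way.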
Schrojo18@reddit
I wish my NAS could do 10e to make use of the 5 drives I've got
LucidZane@reddit
I feel like half of these happen to me every week :(
Newbosterone@reddit
New contractor thought the Big Red Button was the exit door opener.
Morgan Fairchild voice: “It was not. The Emergency Power Off button immediately shuts down all power to the datacenter except for lighting, alarms, and access control.”
There was a disturbance in the Force as 3500 servers cried out and were silenced. Thousands of virtual machines leaked into the Aether.
The emergency backup power kicked in, but couldn’t be connected to the datacenter until the reason for the shutdown was understood. The contractor fessed up immediately despite being in shock. It still took the electricians an hour to check everything out and approve the transfer.
We spent forty-eight hours recovering. Discovered fun things, like router power supplies that self-immolated on power up, the database controlling which VMs went where being sequenced after the VMs, and EPOs that should really have a safety cover.
To this day, tens of thousands of square feet of completely silent datacenter is the eeriest thing I’ve encountered at work.
LucidZane@reddit
A sign might be a good investment... something like "PRESS IF YOU INTEND TO SHUT DOWN THIS ENTIRE DATACENTER"
UMustBeNooHere@reddit
So, the emergency shutdown took down the UPSes as well? Or were there none in this data center? That's one hell of a single point of failure.
Tatermen@reddit
An EPO is meant to cut the power to the datacentre for safety - in case of fire, water leak etc. If the UPSs were able to bypass it, that would defeat the purpose and you'd still have a potentially life-threatening electrical hazard to deal with.
It's a deliberate single point of failure that's only supposed to be used to avoid an even bigger disaster.
Most issues come from poor implementation - eg. using a door exit switch as the button instead of a break-glass button with proper warning signage - and/or poor training - eg. unaccompanied and un-inducted contractor being allowed to work solo in the datacentre.
adamixa1@reddit
maybe to cut and avoid any electricity flowing in case of flood, leakage or etc etc
Newbosterone@reddit
You are right about that, nowadays.
An accident waiting to happen.
Ticket closed, operating as designed.
The UPS systems (3 Diesel engines and about 1000 deep cycle batteries) were fine, as were the redundant power feeds (2 substations).
The EPO is on the datacenter side of the power transfer system. It has one job - cutting the power NOW in an emergency.
The CTO
AustinGroovy@reddit
Wait, I am pondering if this has happened in other places or was this my data center? Exact thing happened to us early 2000's.
Different time - same data center. Someone cooking popcorn in the employee lounge set off the smoke detector. Power out.
Another time, same data center. Hard rains causing roof leak and water dripping down the wall precisely where two power mains were attached to the wall. The rain didn't CAUSE the outage, but they disconnected power in an abundance of caution.
Suffice to say - no longer hosting equipment in THAT datacenter.
Newbosterone@reddit
Midwest USA, maybe 2007 or so? Really big company?
NebraskaCoder@reddit
Not Nebraska, is it? The incident (very similar) was not during 2007 though. Maybe 2015 ish.
Newbosterone@reddit
Nope, not Nebraska.
Apparently, this happens a lot:
NebraskaCoder@reddit
Yikes. Well, the one here didn't do power, just opened all the doors in a secure corporate setting with sensitive data in the building. 😂
AustinGroovy@reddit
Yep
spaceman_sloth@reddit
In what scenario would you actually want to press that button? And why was it so easy to press accidentally?
Newbosterone@reddit
EchoPhi@reddit
Why is said button not shielded, holee fook.
Bijorak@reddit
The VP of IT did this at my first IT job ever.
No one knew who had hit the emergency power shutdown button one day. All the servers and the entire company just went offline. The sysadmins were trying to get everything back up and eventually did, over 6 hours, making sure all the Oracle DBs and applications were functioning. Then came the time to figure out who pushed the button. The VP of IT, we will call him Edwin, was asking around the service desk and sysadmins. No one fessed up. So he singled out the most senior sysadmin at the company, we will call him Mark.
Edwin - Just admit you did it Mark and you won't get fired. We will call it an accident
Mark - I didn't push it. I haven't been down to the data center in weeks
E - just tell us Mark. It'll be ok.
M - it wasn't me. Security will have their tapes and access control back up in about 30 minutes. We will see who did it then
Fast forward 37 minutes
We are all standing in the IT conference room watching the video from outside the data center, the button was outside in the man trap room.
We see Edwin walk in and take a look around the room. He walks over to the big red button, pushes it, then walks out of the room
Edwin then turned and walked out of the conference room and he never spoke of it again
goshin2568@reddit
I'm confused did he just legitimately forget that he did it or what? How did he possibly think he'd get away with it while he's standing there about to review the tapes?
Bijorak@reddit
He was a very smart guy but had some weird tendencies. We think he didn't know what the button was, but we honestly have no idea what he was thinking.
_haha_oh_wow_@reddit
Yeah, so I know only that one sentence was Morgan Fairchild, but now I'm reading everything in his voice, including what I am currently writing.
What have you done to me? Is it permanent?
Substantial_Top_7616@reddit
That sounds like what happened at the place i worked at. The silence of a datacenter is terrifying. It's not a good idea to put the emergency shutdown button next to the door release button.
Bogus1989@reddit
How bout the facilities maintenance crew and electricians locking themselves out of the machine that controls/monitors all the power in a hospital🤣. I'd given them the local admin password (how do I know it and they didn't, I guess). Had to drive in and go up in front of a group of 30 men…just to watch me type it in…and it worked 🤷♂️
RandomPhaseNoise@reddit
Similar happened in a small power plant control room. There was an EPO button every 2 or 3 meters.
A subcontractor arrived to fix something and brought a ladder; it tipped over and hit the EPO! Restarting the power plant took one day, and a lot of paperwork.
mikolajekj@reddit
I’ve had this happen where our “Safety Officer” thought it was the “panic” button. Well, when you saw the look on the Sys Admins face as he came running… you can kinda call that a “successful test…” ;)
Newbosterone@reddit
We started calling it the Pizza button. If someone hit it, 20-30 pizzas showed up 4 hours later.
Mrmastermax@reddit
Wohhh, hol'up, that type of conspiracy might not be allowed here.
spaetzelspiff@reddit
So the contractor single-handedly led a successful DR exercise?
Mrmastermax@reddit
Contractor saved the datacenter many hours and thousands of dollars of DR testing.
Electrician is a hero.
DontStopNowBaby@reddit
The original CHAOS engineer.
mcast76@reddit
This is why you make sure the big red button is always labeled
4rd_Prefect@reddit
Yep, I was working in an unfamiliar datacenter late at night and studiously avoided the red switch on the wall, it was dark and I had to use my phone to see.
I found out later that the red switch was a light switch, and it was red because it was on the UPS.
I labelled it (correctly) though thinking about it, labelling it "Halon release" would have been funny 🤣
Boone74@reddit
We put a cage around ours at a previous job for a similar reason.
Ben_Sisko69@reddit
Read Fairchild still heard Freeman's voice in my head
hieronymous-cowherd@reddit
I read Fairchild and heard and saw her, circa 1999.
Newbosterone@reddit
Shit, brain fart. I’ll leave it so everyone can make the mental edit.
fitz2234@reddit
Same. Must've been an autocorrect or mental slip. I tried reading it in her voice like in those Old Navy commercials but just didn't vibe like a good old Shawshank narration.
TheGreatNico@reddit
Our primary on-site DC has " the big red button" as an unlabeled door exit button. For about the first month when I was working there I was so afraid I was going to shut the entire DC down by pressing the only button in the place. Still makes me a little nervous having to hit it
smoike@reddit
Thanks for reminding me. I wrote something above and misremembered the button as being on a wall by itself. It wasn't, it initially was only a couple of metres from the door exit on the opposite side to the door exit button, though it WAS a different colour. After a security guard pressed it, they proceeded to move the button 10 metres further away and totally leaned in on making it clear that it was not to be messed with.
fcewen00@reddit
One of our VPs was doing a dog and pony show for some elementary kids. I can still see it all in slow motion. All I can say is thank god he didn't push the halon switch next to it.
youfrickinguy@reddit
Fisher Plaza?
Newbosterone@reddit
Nope.
youfrickinguy@reddit
Well, a very similar thing happened at Fisher :)
mmpre@reddit
There's a local business I go to that has a sign that says "Ring the bell before entering". The door bell button looks just like the emergency stop buttons. I have never been able to get myself to ring the bell. I just walk in.
pertexted@reddit
This is my favorite!
1RedOne@reddit
Maybe once a month I would get a request to make a new user in a department. The format was always "make user A just like user B with all of the same groups she has".
I was just learning PowerShell and hated having to do this by hand, and also not having any trail of my work. So I wrote a nice script that added a log of the group adds, and I tested it over and over in -WhatIf mode.
Eventually I get a new ticket request and tell my coworker, "hey, check This out!" And run the script and it outputs
Copying all groups from user A
Setting user A as member of Exchange Users
Setting user A as member of Building Physical Access
Setting user A as member of Parking deck access
Setting user A as member of Internet access enabled
Etc etc.
I was so proud of that log! And then the phones started to ring like crazy. File shares couldn't be accessed. People couldn't badge in, people couldn't enter the parking deck.
It was hard to conceive of the true problem, until I opened up Active Directory Users and Computers and opened one of the groups from the log.
They all had only User B.
It turns out the Set-AdGroupMember command sets the group's sole member. If you want to add this user to the group, you want to use the -append flag.
We had no records of membership or any sort of backup available other than a remote side which was very out of date so it took an authoritative domain restore to fix the problem.
All to save about a minute of work, approximately once a month
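For anyone wanting the additive version of this, a minimal sketch, assuming the RSAT ActiveDirectory module (user names are placeholders). Note that in the current module it is `Add-ADGroupMember` that appends a member without touching the rest, as a later reply points out:

```powershell
# Sketch only, assuming the RSAT ActiveDirectory module is installed.
# Copies group memberships from $source to $target ADDITIVELY.
$source = 'userB'   # placeholder: user to copy from
$target = 'userA'   # placeholder: user to copy to

Import-Module ActiveDirectory

# Enumerate every group the source user belongs to.
$groups = Get-ADPrincipalGroupMembership -Identity $source

foreach ($group in $groups) {
    # Add-ADGroupMember appends the member and leaves existing members alone.
    # Keep -WhatIf until the logged output looks right, then remove it.
    Add-ADGroupMember -Identity $group -Members $target -WhatIf
    Write-Output "Adding $target to $($group.Name)"
}
```

The key difference from the story: this never rewrites a group's member list, it only appends to it.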
smallbluetext@reddit
Holy shit I did this same thing but copied half the script from someone else because they had a nice GUI, which obviously had the -append tag because it didn't end in disaster. I always wondered what could have gone wrong there and now I know.
tech2but1@reddit
This is supposedly why I have a lab, but it has over the years become more prod and now this sort of thing is "done in prod" rather than in the test environment. One day I'll cadge another server and get a new test env.
1RedOne@reddit
I used the -WhatIf flag! And it said
"Setting membership to user X". It never specifically said that it was deleting every other user! I couldn't believe it when I realized what happened.
Appropriate-Border-8@reddit
A co-worker that I used to work with, who was responsible for our AD domain, showed me how to use CSV files in Excel: put a list of usernames or IP addresses, etc., in a column, with placeholder text in the columns on either side of the main data column (e.g. hhh and iii). Then use Replace in Notepad to swap "hhh," and ",iii" for the command text.
Then copy and paste all the lines into a PowerShell window. 😉
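The Notepad placeholder trick works, but the same thing can be done directly in PowerShell, skipping the manual find-and-replace. A sketch, assuming a CSV with a header row; the file name, column name, and emitted command are all placeholders:

```powershell
# Sketch: generate one command line per CSV row, instead of wrapping each
# value in 'hhh'/'iii' placeholders and doing Replace in Notepad.
# 'users.csv' and its 'Username' column are placeholders for illustration.
Import-Csv -Path 'users.csv' | ForEach-Object {
    # Emit (or run) whatever command the placeholders used to be swapped for.
    "Get-ADUser -Identity '$($_.Username)'"
}
```

Piping through `ForEach-Object` keeps the data and the command template in one place, so there's no risk of a stray placeholder surviving the replace.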
zaphod777@reddit
Technically it did what it said "Setting membership to user X". Also be really careful when using a group policy to set local PC group memberships. If you don't do it right you will replace the local group memberships with what you set rather than add to what's there.
You can also just right click on a user and choose copy to get all the same group memberships.
1RedOne@reddit
Fortunately this was years ago, now I actually work at one of those giant companies, so I have the opportunity to make tools other people use to screw up their environments!
Mrmastermax@reddit
When I write a script, I write a function for a dry run. Learned it from HP storage.
BeanBagKing@reddit
Everyone has a test environment, a few of us are lucky enough to also have a prod environment.
DenyCasio@reddit
Perhaps that is an older revision of the command but it's not how that works today.
https://learn.microsoft.com/en-us/powershell/module/activedirectory/add-adgroupmember?view=windowsserver2022-ps
hoodie_man23@reddit
Had a VMware person destroy a VM accidentally. Funny way to lose a server in the infrastructure that way. Just kinda like “Welp everyone makes mistakes”
Newbosterone@reddit
We had someone shutdown and start disassembling a production Sun server. 20 or so disk trays, say 200+ disks. In order for the RAID manager to recognize the trays, they had to be cabled exactly as before. We learned to label servers front and back.
dagamore12@reddit
There is a myth out there that newer RAID controllers will use the drive SN, not its place in the chain, for mapping, and thus moving drives around while powered off should not cause any problems.
I don't know if that is a real thing or not, as I don't want to test it. I just put them back into the same slot and have never had to rebuild an array after a serial number inventory or backplane replacement. It might be slower, but damn it is safer.
Newbosterone@reddit
This was Solaris. From Sol8 to 10 Sun put a lot of effort into reliability, availability, and maintainability. Things like rebootless patching, redundancies, and oops-proofing. Disk labeling and redundant Raid signatures were part of that initiative.
iDam81@reddit
You old!
Lower_Fan@reddit
I had a support call for an on-prem application, and at the time I only had a ~~prod~~ test server. While he was showing me something, he was gonna reboot it in the middle of the day; I had to yank the mouse from him, all virtual tho.
no-surprise-here@reddit
A few years ago I had a new manager, whom I was expected to train, come to work drunk every day of his first week, and I couldn't say anything because I was scared they would think I was trying to get him fired to take his job. He just wandered around smelling like a 12 pack.
sccmskin@reddit
I worked in consulting for five years. Where do I even start? Probably the most frustrating one was getting an engagement to deploy Windows 10, but they wouldn't let me script anything at all because they didn't understand PowerShell or VB. They kept accusing me of writing "secret code" and/or trying to exfiltrate data from their company. Result: they got the most vanilla task sequence ever, with very little automation, that required extensive hands-on touch to get ready for the user. You don't want automation? Fine. Here you go. Have fun.
Tsunpl@reddit
Heard from a buddy: He got laid off from a Fortune 500 company in "security guard will take you to your desk so you can pack up your things" style. He tried to warn his manager that the project he was working on didn't have log cleanup functionality implemented yet, so it would soon clog up the server it was running on. Was told "don't worry, it's no longer your problem, [Indian outsourcing company brought in to replace him] will take care of it". They didn't. A few months later it blew up and caused a cascading failure in their service, resulting in damages well into 8 figures that you've probably seen in the newspapers.
ArchStanton67@reddit
Cognizant?
IAmJustNobodyAtAll@reddit
In 2016 my sysadmin job (formerly IT Manager) was outsourced to Cognizant. There's some quality staff there, alright. I was told later that a user password reset took 3 days, etc.
TheAuldMan76@reddit
That's just nuts!
IAmJustNobodyAtAll@reddit
There was also the Cognizant tech who came in during the changeover "Information Transfer" who plugged a ransomware infected PC back into the network and didn't know how to boot from a USB drive.
TheAuldMan76@reddit
Pardon??????? Christ, that just takes the biscuit - I assume there was no data loss, and the tech got a boot up the arse!!!
IAmJustNobodyAtAll@reddit
The staff member with the infected PC was no dummy. He called and I immediately pulled it out again and threatened to eject the Tech but officially, I wasn't there anymore and had no authority to remove him. I hung around instead and tried to help the Cognizant tech out but he was completely clueless. In the end I walked and left him to reimage the PC. We had standard images to roll out, don't know if he used one, didn't care.
creamy--goodness@reddit
At&t? 😁
Individual_Fun8263@reddit
Waaay back in my junior admin days, boss asked me to decom a test Exchange server. So I logged on to the console and was going to start when a user knocked on the door and I had to go help them... ... Meanwhile, boss needed to do something with the email system, so he switched the KVM over to the production Exchange server .... ... I come back and you probably guessed what happened next. I uninstalled production Exchange. When you do that, it wipes the EDB file that stores the email (of course, why not the worst case scenario!) At least somebody came and said email was down before I got too far. Restoring the backup from the night before worked (whew). Lessons learned... Background on each server showing the server name (apps to do this came along later). Log off the server, don't just lock it (looking back, having individual admin accounts would have been better).
Brufar_308@reddit
Saw a little trick: add a new toolbar next to the system tray, and when the browse dialog opens to select the folder, type %computername% and click Select. Bam, server name right on the taskbar where I can see it, not hidden behind other open windows.
Your exchange scenario would have been a rough day. Yikes.
rednib@reddit
I just had to purchase 3 AIO desktop computer systems rather than laptops with docking stations because the admin thinks the VPN on the laptops isn't stable. Spent double the money, since F5 VPN doesn't work any differently whether it's on a laptop or a desktop, but wtf do I know, smh.
RedFive1976@reddit
AiO machines in the office are a nightmare on their own.
bachus_PL@reddit
Broadcom got VMware. End of story.
RedFive1976@reddit
You win.
Angelos_yu@reddit
Bunch of them over the last 24 years in IT.
From an HP partner that killed the storage controllers in a bank's main storage, to being on a customer site (production of kitchen appliances) for 36 hours when they tried to update storage controllers, not sure if it was MSA or EVA. They took down the whole factory's production.
I have had one instance where bricklayers took out the back side of a server room in a DC and left it covered with nylon sheets, and later in that same DC, they were scraping paint off the ceiling and concrete was falling on servers and racks. We were cleaning servers for one month with blowers and hazmat masks and suits, server by server....
But the worst thing while being an admin was maintaining USB shared printers and scanners. That was voodoo witchcraft, never working, print spoolers crashing on XP. Oh yeah, when I moved from HP to VMware and Huawei, I met two idiots from Sa*a (an HP partner) that took down a bank server: took out the FC card, put it in a workstation and patched it from Windows, instead of using the SmartStart CD. I was in charge of monitoring that company (partner) for some time, and their engineers, while I was at HP, for bringing down one other bank.
Brather_Brothersome@reddit
Working for a multinational cable company that, let's say, wanted everything free: our mail system ran out of space, 20,000+ users. I pleaded with them for months to migrate online or add more space, but no, they wouldn't spend the money. The day I knew it would crash, I resigned. I still get calls asking for help from the users because management sucks.
super_ken_masters@reddit
Did you resign on the very day of the crash? Good for you, otherwise you would have spent weeks rebuilding the whole setup! I have a vague memory from a long time ago. Colleagues from another state/city were running low on disk space for Microsoft Exchange. I do not know exactly what happened there, but they tried something with the RAID setup and the server was dead on a Sunday. They spent the next two weeks fixing everything up :/
Brather_Brothersome@reddit
Yes, I walked into HR with my immediate resignation notice already signed. They asked me why; I just gave them a name, the CEO, and said tonight everything will crash and no one cared to listen to me. They were shocked but accepted the note (the phone I had was theirs, so I left that too).
Affectionate-Cat-975@reddit
Early 00s, remote salesperson pitching a fit to the CEO about wifi not working on a laptop. I'm forced to support by playing operator through the CEO (not allowed to actually talk to the user). CEO is all in my azz about this not working. Finally get them on the phone….. Me - when did this problem occur and did anything else change? User - It stopped working when they moved 40 miles away.
Me - Who’s your new ISP? User - Never had one Me - How did you connect before User - I used my neighbors unsecured wifi……. Me - welp, You have 3 options…. 1) pay for internet access 2) get a 40 mile network cable 3) move back and keep stealing your neighbors wifi. CEO laughed it off ignoring the pressure and shit he put me through.
550c@reddit
Had a developer call me on a Saturday because he said he couldn't connect to the company WiFi. Quickly found out he wasn't at the office and was at home many miles away from the office. He didn't understand how WiFi worked and he was a developer.
whetu@reddit
I've worked with savant-tier developers who could algorithm solutions to all the world's problems if they so chose. Yet adding a printer to Windows? They just couldn't wrap their heads around it.
tankerkiller125real@reddit
I mean to be fair, half the time Windows can't figure out how to add printers to itself.
Appropriate-Border-8@reddit
As a longtime administrator of thousands of shared Windows print queues on multiple enterprise Windows print servers, I can tell you that the main reason people, even ones who do know how to add a printer, often cannot is that they do not have the correct domain credentials. The most common way to add a queue is through the directory but, it will show you every published queue, even ones that you do not have the rights to add to your local profile. Another way to add a queue is the UNC path method, in which you specify the UNC path of a print server followed by a backslash. At that point, you will see a list of shared queues created on that print server appear but, only the ones that you actually have the rights to use.
Sarin10@reddit
so... what's the secret to not going batshit insane? daily therapy sessions? medication?
Appropriate-Border-8@reddit
Keeping to a strict naming convention and only having me and the four other guys on my team able to modify the queue settings.
The queue names are made up of a building code, an internal location code, and the last octet of the printer's static IP address. Looking at the queue name, you know its IP address, which building it's in, and where inside the building it is.
I am the only one that changes the drivers, whenever that becomes necessary, and a lot of our printers are from the same manufacturer so I only use two global drivers for them: one PCL6 and the other PostScript.
Our purchasing department gets a contract with one printer vendor who supplies specific models that we allow. Users are not permitted to connect any old fucking thing to the network and they are not allowed to setup their own WiFi networks.
Affectionate-Cat-975@reddit
+1 for Papercut and ID release
sagewah@reddit
I've long suspected it knows, but really doesn't want to.
lemon_tea@reddit
Nobody wants to deal with printers. Windows is just one of us.
topknottington@reddit
I think Windows just forgot what name Microsoft changed printers to
KnoxvilleBuckeye@reddit
Printers and scanners are like witchcraft, just without the fun stuff like dancing sky clad under the moon or the Beltane fires.
gargravarr2112@reddit
I worked with AI programmers who were doing code analysis across miles of whiteboard space.
They were convinced the floor socket at their desk was faulty and had me get the office manager involved, when in fact the plug was loose in the socket.
Appropriate-Border-8@reddit
I want a new class to be added to the secondary curriculum of every school board and school district in Canada and the US.
Googling 101
Lesson #1 - Less is more (using too many search terms, such as full sentences, may tend to limit the number of hits in the results).
zeus204013@reddit
I actually "fixed" a printer in a place. All I did was set an IP on the printer (because it was an easy, fast solution) and configure some Windows PCs to work with the printer via IP. And it worked...
The issue was driver problems (apparently) and a need to share it to more PCs. Turns out the printer had a NIC all along...
I was a young student, and the rest were office engineers/CS people... laziness.
reni-chan@reddit
My very good friend that I met doing my Computer Science degree got his first ever computer 2 months before the start of the course.
He couldn't understand why the .exe file he downloaded onto his Ubuntu Desktop VM wouldn't open.
He now makes more money writing software for the NHS than I do as a network admin, and I am being paid very well for the role I'm in.
oznobz@reddit
Something I've learned is that there is a cube in tech with the vertices of dev, sysadmin, and security. Most people land somewhere in the middle of the cube, but there are several people who are squarely on the edges.
KarlDag@reddit
A cube, 3 vertices.
Go on.
oznobz@reddit
I'm not a math guy. I meant something other than vertices, lol
Sarin10@reddit
maybe you meant a triangle or triad (instead of cube)?
CrnaTica@reddit
Worked with a developer who can do all kinds of algorithms and optimizations (true witchcraft and wizardry)... but remove the Visual Studio icon from his desktop.... and he's lost
South_Candle7652@reddit
WHY???? is that story still SOOO common???
550c@reddit
To be fair this was about 7 years ago.
duke78@reddit
As far as I know, nothing has changed about WiFi in the last seven years that makes that any better.
550c@reddit
I meant that this story didn't just pop up recently.
TrainingOrchid516@reddit
"Early 00s" = nuff said
Affectionate-Cat-975@reddit
Sadly we worked at a software company. You'd have hoped someone knew enough… nope
wanderinggoat@reddit
You're lucky. In my experience the CEO would never actually let you talk to the user or get more details, then would get a third party in to fix the issue at a huge cost, then mumble something about support being slack.
Bogus1989@reddit
Lmao, how bout a boss entering a ticket for someone too lazy to… leave a comment on the ticket with the contact info…. leave that shit open,
Close it 2 weeks later cuz it never happened.
Neurotripsticks@reddit
This reminds me of a time I was working as a tech support in an ISP call center. This lady was vacationing, and couldn't figure out the internet for her tablet. I kept trying to tell her she needed help from hotel staff with that and the following convo looped around something like:
"But all my internet at home is with you guys"
"Yes, but you are in Spain."
/repeats request for assistance
Beefcrustycurtains@reddit
People can be so fucking dumb sometimes.
bigjohnman@reddit
I worked with a guy who was fairly smart when it came to computers, but absolutely awful with people. He was on the autistic spectrum. He would create these insane rules and complain to management that I was able to fix systems outside of the rules of standard procedures. Granted, I am a hacker and wouldn't follow the rules often, but this guy made doing my job awful. Like we got hit by an encryption virus demanding bitcoins. I found a way to stop the spread by pushing out fake files with write protection, and the keys were posted in a blog by another company. I had us back up and running in no time. Oh... this guy was furious that I fixed everything.
CantankerousBusBoy@reddit
I need more information on how you got around this virus!
bigjohnman@reddit
I found a blog where someone discusses how you can open a text editor, write something in it like a period, save it as the specific virus file.dll, right click >>> properties >>> permissions >>> set as write protected with a specific owner removing everyone. It went on to explain how to push these files out using SCCM. I found another blog with the keys. This was 15 years ago and was specific to a method that was happening during this time. Things have changed since then.
Stokehall@reddit
Same, this seems revolutionary
tehgent@reddit
2010 - Was the only IT for a non profit that had 27 locations in 11 localities.
Was on the phone with one about an hour away and we were trying to troubleshoot their internet connection. In the process of rebooting the modem:
me: You see that black box in the corner of room 123, the one with a big white sticker on it?
them: yes i see it
me: what color wires do you see out the back of it?
them: a grey, an orange, and a black.
Me: black cable is the tiniest cable right?
them: yes
me: ok unplug the black cable then please.
them: which one?
me: the black cable we were just looking at
them: ill just unplug all of em
me: NO DO NOT DO THAT JUST THE BLACK ONE
them: oh too late... how do I get this stuff back together
me: Im going to leave this pretty little paper here that shows, with pictures, how to reboot this thing. If yall unplug everything again, Im making you call the ISP and get charged for the visit to hook it back up.
adamixa1@reddit
I once made the same mistake. It was the weekend, and we decided to perform some server maintenance from home. However, some of the servers were not booting up, and no one was willing to drive the 30-60km to the office. So, we decided to seek help from security.
IT: "Please press the power button. It's the round button."
Security: "No, I can't see it."
IT: :: Repeat
Security: "Okay, got it."
So, we trusted him and waited. As time passed, we wondered why it was taking so long, fearing that something might have gone wrong. I decided to drive to the office to investigate. It turned out that he had pressed the CD tray instead of the power button.
ColdCryptographer318@reddit
A long time ago, I was an intern at a company. We had an HPC cluster in Frankfurt while we were located 500 km away; it was convenient to have a good interconnection with many European countries because our users were spread across several R&D sites abroad.
Anyway, we had two management servers for 80 compute nodes, and a company on site was supposed to handle the hardware maintenance. When replacing one hard disk in a management server, they removed the wrong drive and failed the RAID 5. Then, while trying to silently fix their mistake, they worked on the wrong servers and mixed up the drives of the two management servers.
These servers were installed by an external company and nobody knew how to restore them; it caused two weeks of service interruption for more than 100 engineers.
tehgent@reddit
I remember a call center call I had many moons ago at an ISP
Me: Please click the start button in the bottom left corner of the screen
them: what start button? It's not there.
Me: Bottom left, please. It's a big green button that says Start on it.
them: I just see a clock.
me: Your other left.
ExcitingTabletop@reddit
I managed 50 locations across probably a dozen states. Some of these were couple-dozen-acre production facilities.
Each location I physically got to received picture instructions, along with each cable being labeled. The picture instructions had large red arrows.
Oddly, everyone loved it instead of finding it insulting.
demexe@reddit
I broke a power station once. I was putting some new servers in a cab and switching them on as I went; I looked to my right and noticed the power station control cabs were dark. I had tripped the ring. The worst part was that it was a gas power station that provided power for another type of power station close by. The power station engineer was a very good guy (understatement); his only question was, 'Is anyone hurt?' and we went from there. I had several stiff drinks when I got home. Lots of lessons learned that day.
CeC-P@reddit
We were erasing about 40 desktops from a medical environment and we knew not to plug in more than that. Someone wasn't listening and ran 52 and blew the circuit. It was end of day and then the morning crew thought they were done and stacked them in the "done" pile without marking serial #'s. So we had to start all 800 over.
punkwalrus@reddit
I worked for a company where the server room was flooded by a burst pipe, due to lack of insulation on an outdoor hose bib (faucet). It flooded the entire building. 100% on-site loss. I've posted it on Reddit before but can't find it now; I covered it on Ars Technica, step-by-step, as it unfolded.
punklinux@reddit
I remember that on Ars! I still talk to people about it to this day in various scenarios and share that link.
Bamtoman@reddit
Not sure if this counts, but I was asked by HR to clear out a former employee's desk, since the person specifically told HR that I was the only "trusted" one to pack up the items & ship it off.
There was dirty laundry ..... Thank God I had the foresight to wear disposable gloves before doing anything.
AustinGroovy@reddit
First month at a new job, I got a request from an executive: a manager in another office had ~~shot himself~~ accidentally discharged a firearm into his head while sleeping.
I was sent to his apartment to retrieve "company-owned" equipment, but instructed that if it was covered in brain matter (absolute true quote) they did not want it back.
Most awkward was rummaging through his office looking for computers and phones while family and friends were in the apartment helping clean up and prepare for the funeral.
ExcitingTabletop@reddit
I had to clean up after one of those. It was not fun.
Thankfully no sensitive electronics to recover.
Three round burst is messy. Worst part was cleaning the ceiling because drippy contaminated water everywhere.
Newbosterone@reddit
This is why you should never clean your firearms in your sleep.
Another_Night_Person@reddit
Fire recovery. Worked at a place that overhauled aircraft parts, and we had a fire in one of the workshops. Luckily the fire department responded quickly and saved the building, but one of the workshops was completely totaled.
The heat from the fire was conducted down the ethernet cables to the point we found melted wiring nearly 100 feet from the location of the fire. The servers had survived, but the fans had pulled in all the soot from the fire, so we had to quickly disassemble them and clean them as they had a thick layer of soot inside, along with anything else that had fans.
tentends1@reddit
one day i did major updates on software used by the upper mgmt team. i left for another site, and started to get phone calls and emails from extremely angry upper mgmt folks. i thought i made a mistake, tried to log in to correct anything, not working. i made the mistake of telling them i made a big update so they're shitting on me. drove back to main site, tried to log in, doesn't work. go to server room, everything looked ok. i searched for a while and eventually discovered that numerous cables to switches/servers were disconnected.
turns out a janitor got fired and wouldn't leave without a fight and wreaked havoc in the server room. took me a while to plug everything back in with management breathing down my neck. thank god some security cameras filmed the guy or else mgmt would have eaten me alive.
19610taw3@reddit
I have had a few users pull that stunt before.
With laptops, monitors ...
fresh-dork@reddit
i never did understand all these places allowing janitors in server rooms. sure, let's allow the guy making minimum wage, possibly on contract and with no particular education or interest in the functioning of the equipment to go in the room full of temperamental gear, maybe unplug something to get the vacuum working
jargonburn@reddit
You say server room; but, in too many cases, I've seen that room pulling double duty as the cleaning supplies room / janitorial closet.
Veranis@reddit
Fun story,
Worked at a hospital and they were doing an addition. They made a big upstairs penthouse area for plant services with a closeted room for IT to put gear in. Turns out they didn't distance things properly so fully half of the addition was too far away.
Their solution was to put a wall mounted rack into a janitorial/storage closet that is unlocked and propped open 24/7.
Don't worry though, the rack is locked with the power plug on the outside.
jargonburn@reddit
Well... Hopefully nobody ever needs to charge their phone while they have a vacuum plugged in... ☠️
fresh-dork@reddit
sure, but i've seen it when they're using a proper room with a keypad for entry. one place did exclude the janitors and simply made the people with access leave the trash outside and not eat in there
tentends1@reddit
yah in that case it was the older, smaller site and server room also had electricity, music, fire alarms, tvs and electrical pieces storage.
ImTheRealSpoon@reddit
Ahh man working in a hospital with Drs... Some of the dumbest people out there computer wise... I've had computers thrown at me when walking in because it required a password to use. Emergency calls at 3am in the morning because they can't figure out how to change toner on the printer... The printer has a screen with a video showing the three step process as well as printed on the toner itself... Good stuff
tentends1@reddit
Heck Yeah! The feels!
From a previous post:
I know! I was always teaching doctors and nurses how to fix basic things not for them to do my job, but for them to not have to wait for my next visit to do their medical things. But I was always met with blank stares and open mouths. And some of them are less than 35 years of age.
They once went a full two weeks without "going to the internet" to fill out their reports because the Internet Explorer icon was missing from the taskbar. There was a Chrome one on the taskbar, and an IE shortcut on the desktop, not to mention it's available in a thousand other ways.
For a certain app, they were paying big money for the 24/7 immediate helpdesk service, and it took me 2 years to condition nurses and doctors to call the helpdesk for their password resets. Instead they waited for my next visit (every week or two) to tell me to reset the password. But the helpdesk required personal info and secret phrases, so I was calling for them, and they had to sit beside me while I handed them the phone for the security questions.
Bogus1989@reddit
Yep. Been there my friend. We had to have a united front on our team to get them that way too.
Nowadays if you are busy with higher priorities…wait a little, someone in the department will solve it. Its the younger generation…
Bogus1989@reddit
THANK GOD we have ricoh on site.
Yeah, the worst places are the ones that doctors are in charge. A good healthcare company is one that makes sure that the doctors have no say in the actual business part of things. I truly appreciate the nurses and others that treat them like children when they get out of line…..ive seen it the opposite way too where everyone worships the doctors, and its all young women fresh out of school… marriages ruined and everyone stabbing each other in the back.
my favorite thing to watch was last week them being assholes to a guy whos retiring next week. He told the doctor, that if hes still there when he comes back in 15 mins, hes going to have to admit himself to the ER 🤣🤣🤣🤣.
Just a Tuesday for my coworker Rick. Gonna miss him.
apatrol@reddit
Mid 90s, working for Compaq right after they acquired Digital, who made a new mainframe system called Alpha. Consultants had spent months moving assembly line production from HP 3000 mainframes.
A co-worker who worked on the mainframe desk had to go to the bathroom and asked me to watch the console for alerts and answer the phone. I was a simple tape librarian at the time.
I get a call to completely power off the Alpha system called Kirk. I lift the power button cover and hit the red button. All the lights go off and I hear the guy on the phone say "what the fuck". Turns out I had a brief dyslexic moment and turned off Spaq.
It takes a few hours to start up a mainframe. Production was down for around 8 hours. Over a million in lost or delayed production.
CONFIGdotSYS@reddit
Similarly, I had to vary a bank of drives offline and messed up the addresses. I can't remember the actual addresses, but it was along the lines of 1A-1F; instead I typed 1A-FF and was stunned as hundreds of devices went offline. Fortunately, we were able to dup the command and change ,offline to ,online and fix it. I was instructed to vary one drive at a time from then on. That sucked.
Random_dg@reddit
When you say vary, you mean the MVS vary command, or does it have the same meaning on other platforms?
CONFIGdotSYS@reddit
Yes, the MVS command: v xxx-yyy,offline
dRaidon@reddit
Vendor reassured me that updating the raid card firmware was risk free and quick process.
It was indeed not a risk free process, but it wiped the raid config real fast, I'll give them that.
What's more, that's how we discovered that:
A) One of the servers had been missed in the backup.
B) The vendor's response to the RAID config being wiped was, "Oh well, is there anything else I can help you with?"
I have never trusted a word out of any single vendor since.
Appropriate-Border-8@reddit
I've had vendors suggest uninstalling the complete server application (running since 2015) and re-installing to fix an issue introduced by an upgrade to that application. I told him to go get his head examined. LOL
50YearsofFailure@reddit
Dell support did this to one of my former coworkers. The RAID card was reporting a fault and the Dell warranty support tech insisted on updating firmware on it before proceeding with the case.
After protesting that this was not safe (and thoroughly reviewing the affected servers' backups), he proceeded with the update and it bricked the RAID card. Dell dispatched a tech the next day to replace it.
_haha_oh_wow_@reddit
Classic Dell support!
fresh-dork@reddit
GG coworker
50YearsofFailure@reddit
This is the way.
19610taw3@reddit
I hate touching firmware. I had a coworker who loved playing with firmware during production hours. I'm surprised, but relieved, nothing major ever happened.
Gsxing@reddit
The vendor after nuking the raid: “You may receive a 1 time survey. Please let us know how we are doing.”.
Mrmastermax@reddit
That’s the time they don’t send any survey… I have noticed this.
tankerkiller125real@reddit
Luckily, I learned this lesson via Reddit and not via actually having something a vendor recommended backfire on me. There is a reason I record all calls I have with vendors.
El_Chupra_Nibre@reddit
I got one….. Company records your screen, captures camera feed and listens via the laptop mic. Constant camera watching and surveillance.
jonblackgg@reddit
Alright, here we go, this is when I was lead IT for a neobank.
Mind you, I felt this was probably the best option given I had documented most of our processes and put a ton of automatic remediations into our self service portal should a user need it.
Here comes the fun bit.
In that two-week span Charles had achieved the following:
He had taken down the company twice:
- once by issuing an MDM command that wiped a ton of data on user MacBooks;
- once by locking ALL of our users out of their Google accounts.
I came back to work post-surgery, and within 24 hours of me coming I noticed that about 80% of our macs were missing from Apple Business Manager (ABM). When I called out to chat asking what happened, he lodged his resignation.
An allow/block app profile via MDM which only allowed Rectangle.app to run; so he essentially bricked half of the company MacBooks until I realized and stopped it from being sent out to any more devices. This fucking dude. I told our head of tech and our head of risk that if he's still here for the next three weeks, he's barred from touching any of our stack and is relegated to writing docs + support via video calls.
This was a neobank striving for ISO27001, and our key rule was that any configuration was tested extensively and checked by multiple people before being sent out. Dude absolutely tarnished the reputation and good will the team had built since the start of that year.
rumandbass@reddit
A client we inherited was writing their only backups to a QNAP that was plugged directly into a wall socket in a warehouse. The QNAP also hosted their NFS shares and snapshots. They had a brownout situation where the QNAP was power-cycled hundreds of times and it corrupted the RAID. Cost them 70k to get about 6 months of data back, because they didn't want to buy additional storage and battery backup.
R4LRetro@reddit
Before I knew anything about domain controllers, I once tried to do an offline defrag of the NTDS database. This caused the DC to stop booting, even in DSRM. To make it worse, I restored the DC from a backup. Naturally, my redundant DC flipped the fuck out and downed the entire domain. I panicked. NETLOGON and SYSVOL shares completely gone, domain was completely gone. I fixed it and saved my ass by using the BURFLAGS registry key and setting SYSVOL state to 1. NETLOGON and SYSVOL shares came back and I breathed the biggest sigh of relief ever.
Learned my lesson that day... gracefully demote problem DCs, stand up new ones.
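For anyone curious, the BurFlags recovery described above is Microsoft's documented procedure for FRS-replicated SYSVOL. A rough sketch of the steps (registry paths per Microsoft's FRS restore guidance; run on the DC whose SYSVOL copy you trust, and verify the exact values against current documentation before touching a real domain):

```bat
rem Stop the File Replication Service before touching BurFlags.
net stop ntfrs

rem D4 marks this DC's SYSVOL copy as authoritative
rem (D2 = non-authoritative, used on the other DCs so they re-sync from this one).
reg add "HKLM\SYSTEM\CurrentControlSet\Services\NtFrs\Parameters\Backup/Restore\Process at Startup" /v BurFlags /t REG_DWORD /d 0xD4 /f

net start ntfrs

rem Once SYSVOL is consistent again, SysvolReady=1 tells Netlogon to
rem publish the SYSVOL and NETLOGON shares (the "state to 1" in the story).
reg add "HKLM\SYSTEM\CurrentControlSet\Services\Netlogon\Parameters" /v SysvolReady /t REG_DWORD /d 1 /f
net stop netlogon
net start netlogon
```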
Redcloak12@reddit
Back in '95ish I had two HP-UX servers, 15 miles apart. 50 users on production #1 (Banyan services). Just getting #2 into production (document scanning server). So I am at the #2 console updating the kernel and getting ready to eat lunch. Apparently I was in a telnet session into #1 when I entered "sudo shutdown -H"! Oh crap!!! File, email and printing were now offline for my 50 users and I am 15 miles away. What to do?
Well I call my friend there on site and I ask him to look at the server box and tell me the color of the power switch. Of course it was dark. I said to open the cover and press the button. He did. After a minute the light went green and I was able to log back in. He then asked me what happened. I said "OMG, you just rebooted the server!!" (in a joking way). I then told him the truth and found out nobody even noticed since it was lunch time.
Several lessons learned from that one.
joshuamarius@reddit
WOL has saved me with similar problems several times!
swuxil@reddit
These days there is the package "molly-guard", one of the first to be installed on new systems.
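The idea behind molly-guard can be sketched in a few lines of shell (a toy illustration of the concept, not the package's actual code): intercept shutdown/reboot and make the operator type the machine's hostname, which catches exactly the wrong-telnet-window mistake above.

```shell
#!/bin/sh
# Toy version of the molly-guard idea: before a destructive command runs,
# make the operator type this machine's hostname. A shutdown typed into
# the wrong SSH/telnet window then fails the check instead of halting prod.

confirm_host() {
    # Read the operator's answer from stdin and compare to the real hostname.
    read -r answer
    [ "$answer" = "$(hostname)" ]
}

# Simulate an operator who believes they are on a box named "prod-web-01":
if echo "prod-web-01" | confirm_host; then
    verdict="proceeding"   # here the real tool would run: shutdown -h now
else
    verdict="blocked"      # wrong box: refuse to shut anything down
fi
echo "shutdown $verdict"
```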
pspahn@reddit
I've done similar things when a shop I used to work for insisted we all use FTP for updating production applications.
Sometimes your dev and prod environments just have a similar name and it's only a matter of time before you accidentally upload dev files to prod.
I was let go for "toxicity" when I had suggested we start using source control. I took that sweet 3 month severance and happily left. Best career move ever.
Plantatious@reddit
I had a school where one of the redundant DCs had lost trust with the domain, funnily enough it also held all the FSMO roles.
Another one had their AD Connect running on an expired Evaluation copy which shut down every hour as a result. I didn't know about the renewing command and the school refused to pay for it, so I put together a PowerShell script that checked if the VM was off every 10 minutes and restarted it if so. It also had a check if the Server Manager had crashed as that was a thing. Worst functional script I've ever written.
A colleague of mine supported another school and found that their "backup server" was running a cracked version of Veeam, and the disks were commercial grade and in RAID 0.
Education can be a right mess sometimes. I don't even know how on-site techs manage to bring their networks to such a state.
s_schadenfreude@reddit
I used to do IT solo for an all-boys Catholic high school 20+ years ago. Many of the faculty and staff were Lasallian Christian Brothers that lived in a residence adjacent to the school. My first week on the job I got called into the principal's office. He wanted me to "check out" the computer used by one of the brothers in the special-ed office in the school. I did, and found TONS of gay porn, lots of which depicted subjects of questionable age (I'm not calling them models). This was before we had a proper client-server setup for file sharing (pre-cloud), so at least the material was confined to just the one PC. They pulled him out of his position in the school, but he continued to live in the residence next door and was often in the school for various reasons. I'd see him occasionally over the course of my subsequent 6 years there, and it was always SO FUCKING AWKWARD. Doubtful it was ever looked into by any authorities. I was young and dumb at the time (and new to both the IT field and the city I'm in) and probably should have reported it myself. We implemented web filtering after this and never had the issue with faculty again, to my knowledge. The kids, though? Different story. Aside from the normal teenage fascination with porn that I was also trying to thwart, some years later, after I'd moved on, they had an infamous issue that made the news with "teabagging" in the locker room. This is a school that supposedly puts out "fine Catholic men" and indeed they have lots of graduates in local, state, and federal government and at Fortune 100 companies. It's a weird cult-like atmosphere, and it always gave me the creeps.
Moontoya@reddit
That's why IT is/should be considered a mandatory reporter
That should have the police on his ass immediately
Have taken those steps several times in my career, gotten screamed at for doing it, but when it comes to kiddie porn, they're lucky I'm not taking a sledgehammer to them.....
2cats2hats@reddit
Ha!
Around 2001 I worked at a newspaper. A journalist has a problem with his Mac. I investigate and decide to nuke and pave. While backing up the data, the filenames were getting suspect. Not CP, but teen porn.
I report it, and the editorial department gives my boss shit about it.
Yeah, it could be for a story, but give us a heads up, fuckin' asshole…..
Newbosterone@reddit
If IT were mandatory reporters, the IT department would start reporting to legal instead of the CTO, engineering, or Finance. If you think accountants are hard to work for try lawyers.
ReallTrolll@reddit
I'm a mandatory reporter, I work in a school district. IT just reports to the same old people
fresh-dork@reddit
hang on, what? same reporting structure, you just have mandated reporting. in my state, it's set up as a bunch of regional 800 lines that you call
Bogus1989@reddit
Dealt with this a lot at a college. Guy brought us a laptop to fix (students do helpdesk and support service for the IT program).
I'd let them hack the local SAM file and change the password, but only after informing me, and I or another instructor would be present. Red flags were already going off… a pink laptop? It wasn't his. A woman's name was the only login ever… I got her number from the student directory from the school, and she was still a student… guy was stalking her.
JBLoTRO@reddit
There are places where IT isn't? Where I live, at least, it's drilled into us if we are working in a school, regardless of role, we are mandated reporters.
Aggressive-Carpet918@reddit
I had a client that was a big ethanol plant. They complained about their $30k mash testing computer was running slow. The vendor connected remotely and ran their tests but couldn't find the problem. I went onsite and hopped on. One of their 3rd shift techs figured out how to get to the desktop (the main program basically just ran in full screen mode on W10) and had about 60 Chrome tabs open with all sorts of porn sites left open, along with his Facebook. He didn't get past the door when he came in a few hours later.
sagewah@reddit
Many years ago I encountered a similar situation. A client of ours had some industrial stuff - not under our purview - that was just not working; apparently there were timing problems. They'd been working with the overseas vendor a lot and not getting any progress, so a team of engineers was dispatched to see if they could figure it out. While they were in flight (they were flown from the other side of the planet), the client casually asked if we'd have a quick look to see if anything jumped out.
Now, it's important to know that this was a 24/7 process and overseeing it was dull at best, and very dull on the night shift. Which is why we suspect it was the night shift who used their spare time to use the process control PC to browse porn. A Lot of porn. This was back when that activity not only meant a shitload of browser plugins and extensions and stupid little apps and addons, but a very high chance of malware.
So much, in fact, that the outbound traffic was saturating the control network, which created all the timing issues. We cleaned the PC and had it back in prod, this time under our watch, before the engineers had even passed customs.
The company had until then never even thought about porn on work PCs, and the few that knew just wrote it off as a cure for boredom. We were not particularly popular for a while, especially when people who ignored the new rules and warnings and circumvented (or tried..) the monitoring started getting fired.. but those are other stories.
Bogus1989@reddit
LMAO, my high school's infamous for our head basketball coach, after a loss, saying to the team: "yall played like a bunch of N words". Someone recorded it 🤣 this was years and years after my class graduated. He immediately got a job coaching college ball.
Then a few years later, the school made headlines again for hazing a freshman on the basketball team; they shoved a pool cue up his ass and made him bleed… absolutely crazy. 🤣 wasnt crazy at all back when i went.
Mrmastermax@reddit
Hello there fellow La Sallian
DiscardStu@reddit
About 25 years ago I had a part time job in college as a computer operator where my job was to man the help desk in the evenings, print and sort reports and start the evening backups.
Right after I started there I asked my supervisor about emergency procedures, what to do in the event of a power failure, things like that. He proceeds to walk me through the shutdown procedure and then goes to show me the power switch for the server cabinets. As he points to the switch, from a distance of 3 or 4 feet away, the switch suddenly flips. We hear a ka-chunk sound followed by all the drives suddenly spinning down all while we're peering around the back edge of the server cabinet watching all the disk activity lights reflecting in the computer room window suddenly go dark. We're standing there in total silence, wondering what in the world just happened when the phones all started ringing at once. It was an interesting afternoon to say the least.
me_groovy@reddit
Clearly he used The Force.
DiscardStu@reddit
He certainly did have a sort of Qui-Gon Jinn air about him...
cayosonia@reddit
SQL - Update large table, forgot the where clause.
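The classic save for that one is running the statement inside a transaction and sanity-checking the row count before committing. A small sketch using Python's sqlite3 (the table and numbers are made up for illustration):

```python
# Sketch: catch a missing-WHERE UPDATE by checking rowcount inside a
# transaction and rolling back instead of committing the damage.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                 [(i, 100) for i in range(1000)])
conn.commit()

cur = conn.cursor()
# The fat-fingered statement: the WHERE clause was forgotten.
cur.execute("UPDATE accounts SET balance = 0")
if cur.rowcount > 1:       # expected to touch one row, hit all 1000
    conn.rollback()        # still inside the implicit transaction: undo it
else:
    conn.commit()

untouched = conn.execute(
    "SELECT COUNT(*) FROM accounts WHERE balance = 100").fetchone()[0]
print(untouched)  # all 1000 balances survive the rollback
```

The same habit works in any interactive SQL shell: `BEGIN`, run the statement, eyeball the affected-rows count, then `COMMIT` or `ROLLBACK`.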
gangaskan@reddit
We had a guy that was hired to admin a specific department. At the time I kept an eye on this dick bag.
He was a piece of work; scavenged shit from other departments, was very shady. At the time my boss was ignorant of the fact that he could be malicious - he keylogged our domain admin account.
I despised this guy. He did his own consulting off site and said boss let him do that stuff while working.
I hated this guy so bad that I bid him up on an auction I was watching on eBay. It was a lot of Cisco 4k Cat equipment. At the time it was a few hundred, but I was getting worried when it hit 800 bucks. Yea, I'm that kind of dick; he won it.
He uses some of it for production, and very shitty configs at that.
Well, one day he finally gets fired.
We are in the middle of trying to pick up this disaster he made when we have a drive failure on our LeftHand SAN... No biggie, right? Threw a new drive in and we watch the rebuild.
Except then a second drive fails. And it's in the same device pool (it was a striped dual RAID 5).
So now we are in recovery mode. And guess who decided to not do any backups because he didn't get to colocate his shit at the place he wanted to? Oh, and not only that, but to not have any backup software at all? Yep.
Vendor flies out a guy to assist, and then a week later the vendor literally goes tits up.
I spent countless hours with a data recovery place to try to get this data back, and even then they couldn't guarantee that it was accurate.
Oh, did I mention this was for a municipal court, and the dick bag blamed everyone but himself? We were still in the process of figuring it all out. Backups were next on the list; the SAN unfortunately beat us to it.
Turbojelly@reddit
Old classic that reads like a made up story.
Server catches fire and burns down. IT turns on supercharged mode to recover from backups. Shitty manager from a different department spends their time screaming for IT to be fired.
Email server is one of the first to be recovered. First company wide email comes through, sent a few hours before the fire. It's Shitty Manager berating IT for leaving all these AC units on for the weekend and congratulating themselves for turning them all off....
Low_Highway_8919@reddit
One day, we were cleaning out a server room because we were moving most of the hardware to a second, bigger one. Some floor tiles were removed to trace the cables. While taking a short break, I was leaning on an iSeries machine, a big clunky mainframe. I didn't realize that the machine had its wheels unblocked. While leaning, talking and laughing with colleagues, I managed to roll the thing into one of the open floor tiles. I nearly destroyed a million-euro mainframe by tipping it into the floor...
As if that wasn't enough, in the same room a few days later one of my crew members was a bit too energetic and cut a wrong patch panel. The panel provided cabling for 48 phones. 48 ports, cut within a second. Had to call our cabling partner in an emergency and paid double the price to have it recabled within four hours.
sovalente@reddit
Every time I answer the phone and someone asked "Are you busy?".
I know there's some horror heading my way... 👀
Kurgan_IT@reddit
Every time the phone rings I know there is some horror heading my way even before answering.
StrangeCaptain@reddit
I used to never answer my phone, now I don’t even launch the app
Jellovator@reddit
spooky voice "... and the hack was coming from INSIDE THE HOUSE! Muaaahahahaha!"
super_ken_masters@reddit
?
BalderVerdandi@reddit
The "Paper MCSE".
This was back in 1996. Had a guy who was literally a "paper MCSE" back in the days of NT 3.1 and NT 3.51, who had his PDC blue screen and the rest of his servers fighting to be promoted, which caused a massive outage; then he passed out from a panic attack - clipping a telecom rack on his way to meet the floor. He ended up with a nice cut on his forehead from a screw on the telecom rack and a lump on the back of his head around the 4 o'clock position from smacking it on the floor. He hit hard enough that we heard it above the fans and A/C in the server room, so we knew it was serious.
I hadn't even been out of the Marines for a year and had to deal with his injuries, direct the junior sysadmin on how to gracefully shutdown the other servers and restart the blue screened server which needed to be restarted a second time, and then have a few others bring EMS in to get our boy. I had to request a neck brace and backboard since we didn't know how badly hurt he was. That ended up being a good call since EMS thought he could have had a concussion.
Come to find out he hadn't scheduled any restarts of his servers for over two months, hence the reason why the PDC blue screened.
burdalane@reddit
Trying to put in a replacement drive and being unable to close the latch. Getting paged for software or networking problems and having no clue what to do, because I didn't know how the software worked or what it was supposed to do, or just didn't know anything more than very basic network commands.
idgarad@reddit
Built a Squid proxy system for a company with a coworker, DB. Two parts: a blacklist for the corporate back office, a whitelist for the field\franchises. (By the way, I worked with the real Stifler's mom on occasion.) When we built the system it was the typical Squid\Snort\Webalizer stack you used back then, with some simple magic to have the two types of configs.
In conjunction with that, I built a system to silently record a workstation, back in the days of the Shockwave and TightVNC combo.
So if we saw a problematic entry in the proxy log, we could record and archive a targeted workstation in real time (I made the argument that simply having traffic logged was insufficient, and HR agreed).
Well I had a problematic manager who wanted control of that system so they could spy on everyone's activities. I side stepped that by handing over the system to HR directly so the manager in question couldn't use the tool.
Let's just say the subsequent retribution wasn't terribly good for my mental health. But apparently the system, to a degree, is still in use 20 years later at more than a few locations.
How do I know? because there was a tiny email server I had used to send error email directly to my, and this should date it, yahoo mail account. Which as of last Wednesday (2/21/24) I still get occasional email!
Who still even uses Squid\Webalizer anymore? But worse yet... here is the horror story... a single self-contained local command-line mail server .exe from 20 years ago, using stock ports, no SSL, is making it out from some idiot's server, through their firewalls, and emailing my yahoo account. Well, that, and at least 4 different companies are somehow using a shitty Squid stack from 20 years ago. Crazier yet: if they were copying our setup, that means they are dumping the Squid logs directly into SQL via Linux-formatted FIFO files, which means they had to have more or less cloned the whole system unless they are dumping into flat files... bonkers...
Even-Face4622@reddit
I stuck my elbow into the emergency power-off switch of an IBM System/36 and immediately killed a computer that ran on 8-inch floppies and was about the size of 3 chest freezers. As it made the brrrrrr shutdown sound, there was this horrible smell like burning dust. One fairly large hotel immediately stopped. Made a phone call, flicked the switch back on, and it came back to life. Amazing machine. 2KB of memory, iirc.
Even-Face4622@reddit
The guy sitting next to me accidentally did the old Capital Bank (AU) issue and released an SCCM OS build to the whole organization, servers and all. Amazingly it hit the PC next to him first, and because it was broken and just wiped the drive, he managed to take down the DP before the whole site went (2000 seats, 60 countries). He went white, then went straight to the pub.
Necessary_Tip_5295@reddit
Back in 1999, I was a young and very tired U.S. Marine, conducting a routine nightly system cleanup. Unfortunately, I failed to realize that I had entered the dreaded command "rm -rf *" under / (root) thinking I was in a different directory without double-checking, and inadvertently hit enter. As I scrambled to cancel the command, a wave of panic washed over me as I watched my life and crucial system files vanish before my eyes.
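One habit that guards against exactly this mistake is making the cleanup refuse to run unless the working directory is the one you expect. A sketch, with a throwaway temp directory standing in for the real cleanup path:

```shell
#!/bin/sh
# Sketch: never "rm -rf *" blind; verify pwd against the intended target
# first, so running the cleanup from / (or anywhere unexpected) is a no-op.

workdir=$(mktemp -d)        # stand-in for the real directory to be cleaned
cd "$workdir" || exit 1
touch old.log older.log     # junk files the cleanup is supposed to remove

safe_clean() {
    case "$(pwd)" in
        "$workdir") rm -rf -- ./* ;;                     # right place: clean
        *) echo "refusing: in $(pwd), not $workdir" >&2  # wrong place: stop
           return 1 ;;
    esac
}

safe_clean                  # succeeds: we really are in $workdir
cd / && safe_clean          # refuses: this is the 1999 mistake, caught
remaining=$(ls -A "$workdir" | wc -l)
echo "files left: $remaining"
```

GNU rm's `--preserve-root` only protects `/` itself; a pwd check like this (or spelling out the absolute path in the rm) covers every other wrong directory too.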
Gsxing@reddit
We got a high sev ticket for a storage array being offline. We could not figure out why it wasn’t connecting because everything was plugged in.
So in the middle of all of this we are also replacing the existing hardware and removing some other decomm’d equipment.
Same day, a technician came onsite to remove an old Cisco server that was ready to be pulled. So, after a while I asked our local contact to take a picture.
Sure enough, the technician pulled out our DELL storage array instead of the Cisco server. I had to mute myself on the zoom call as I was laughing so hard for a good five minutes. Had the vendor come back and fix this entire mess.
mallet17@reddit
Lol. "Hmmm this black and grey looks pretty mint for Cisco... R740xd eh... new model!" yoink
Sorcerious@reddit
How did you determine everything was plugged in if nobody went to take a look at the machine?
Gsxing@reddit
Yeah I was trying to explain this a bit better. We had a local contact we were working with. She told us that all of our connections were plugged in (fiber, Ethernet, power). She was actually looking at a newer storage array that was not in prod yet. Aka: she was looking at the new storage array and didn’t notice the old one was gone. We should have just asked for pictures from the beginning, would have saved some time.
ShakataGaNai@reddit
Not "interesting" so much as painful.
Working for a small-ish company circa 2010ish, it's actually like 3 companies in one - but one IT/Server team. That would be me and my Jr Assistant. We were doing some network upgrades for the companies' servers. All three companies' servers were colocated together and powered every aspect (production, dev, customers, you name it) of all three companies. So the upgrades needed to be handled very, very delicately.
The changes required a LOT of work on a pair of Cisco ASAs running in HA, and I was by no means a Cisco expert. So we had remote techs help us get everything setup and prep'd and ready for the migration. We're at the datacenter at 11pm on Friday, getting ready for the migration. Everything is looking good, so exactly at midnight we begin the migration process. Except the new setup is not working correctly for some reason.
Well, what do you do when things aren't working correctly? You reboot. We rebooted the ASAs... except what I didn't realize is that the remote technicians hadn't done a `write mem`. When we rebooted the ASAs, weeks of config prep were blown away. AND, to make matters even better, *we* hadn't backed up the running config either in any way. AND the remote techs hadn't documented what they'd done. AND we couldn't roll back the other network changes. So... we had to call the remote tech hotline and get some uber Cisco expert on the phone at some extremely extortionate rate (IIRC it was something like $500/hr) to rebuild the ASAs for the new setup. It took "only" another 3 or so hours, most of which my Jr and I had nothing to do but wait for the expert to finish... sometime around 4am.
It sucked, do not recommend.
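For anyone not yet bitten by this one: on Cisco ASAs (and IOS gear generally) config changes live only in the running config until explicitly saved, and a reload discards them. A pre-reboot checklist might look like this (the TFTP host and filename are placeholders; verify the syntax against your platform's docs):

```
! Persist the running config so it survives a reload
write memory
! equivalently:
copy running-config startup-config
! Belt and suspenders: keep an off-box copy too
copy running-config tftp://192.0.2.10/asa-backup.cfg
```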
JBLoTRO@reddit
Took over as a temp contractor after the former sysadmin walked. I missed something during discovery. The Dell MD3000 all of $client's backups lived on had a hot spare - check. Found out the hard way, when a disk died a bit later, that those could be configured as RAID 0 with a hot spare.
virtualadept@reddit
"Define interesting."
Late 2004, I was working for a large entity with a terrible mishmash of technologies in place, from an IBM S/390 to a whole bunch of ancient Dell beige desktops running Windows 3.11. One of the things they had was a mother-huge (for the time) Citrix Metaframe server that was in heavy use by a couple of departments that were trying to go all-thin client.
Word from the security team came down. Said Shitrix server had to get locked down because it hadn't been hardened since it was originally set up. I'd never worked with Shitrix before so I did some research on how to harden them. Thing and a thing, blah blah blah, and the one thing that every howto I came across said was do not, under any circumstances, try to harden Citrix Metaframe after it had been put in service because it would blow the whole thing up.
I was just a contractor there and smelled a setup, so I wrote everything up (including the multiple warnings that said this was a bad idea) and took it to the stakeholder of the Citrix box and my boss (who was on the security team). Stakeholder said oh, no, this is bad, we can't do it. My boss took the writeup, did his own research (which confirmed the dire warnings), said the same thing, and took it to his bosses.
His bosses visited everybody in the food chain involved personally (including me) to tell us individually that the Shitrix box was going to be hardened to DISA Enclave STIG v1, shut the fuck up, you're going to make this happen, do not pass go, do not collect $200us. My boss, the stakeholder, and I pressured them into putting in writing that they'd read multiple warnings (including from Citrix) that doing this was not going to work, it was going to result in a full rebuild, and the loss of productivity would be extensive. Not only did they put it in writing but they signed it and gave each of us a copy of it, because they knew best and us nerds were full of shit.
This is a sysadmin horror story thread, so you know exactly what happened: About halfway through applying the OS modifications the server tanked. We weren't able to bring it back up. The Citrix team's admins were called in (why they weren't on site I don't know) and they spent two days trying to fix it (yup, 48 hours straight over a weekend). They also found out the hard way that their backups didn't work. They could be restored but were corrupt in some way nobody ever filled me in on so I don't know the story there, but no dice. I think the rebuild process took about three weeks of work.
The only thing that saved our butts was my boss' bosses being arrogant enough to write and sign personally their orders.
Falling-through@reddit
I’ve seen some quite dumb examples of the Emergency Power Off switch located at the same height, just inches away from light switches, just asking for a bop.
In another, a pal I worked with went to squeeze round the back of a cab and his arse pressed the power off button, quality job.
nhaines@reddit
That's why they need a molly-guard!
tech2but1@reddit
I've yanked plenty of power leads out of things while clambering over piles of spaghetti in server rooms. Just shove it back in and hope no-one notices!
sys_overlord@reddit
Late 2010s. Christmas Day at 4AM. Call from the Director of IT. Director says, "I need you to meet me at the office, we've got a big problem." I leave the warmth of my bed and my spouse and I drive to office. Ransomware. All servers including domain controllers and exchange servers across multiple data centers; client machines across several geographic locations too. They didn't get to the backups but the damage was done and the next 54 hours were restoration efforts with 15 engineers involved. Business is restored mostly by the next business day; huge win, everyone is exhausted and we aren't even sure how the perp got into the network. CEO sends bonuses and boxes of frozen meat shortly after. Everybody laugh. Roll on snare drums. Curtains.
Eviscerated_Banana@reddit
I have nothing out of the ordinary other than some of the usual shite, like when you fix a problem without interrupting the user and just pop them an email to say it'll clear after the next restart, and being written up by that same user for not having the courtesy to call and tell them I was working on their PC, like it was their personal terminal and not a corp-owned workstation.
The worst of it was the grilling I got from the big boss (not a tech), contrasting with every tech in the room shaking their heads at the pettiness of it. Was a common thing too in this org; lots of healthcare pros who were in later years utterly roasted for endemic bullying of people.
A month later I was gone.
aleques-itj@reddit
I reported a vulnerability and tried multiple times to get it closed. Even had logs proving it was getting probed. At one point the security guy wound up in a shouting match regarding it - he didn't get anywhere either.
It was the cause of getting ransomed. Felt great having the whole paper trail and getting to wave it in the face of the person who made it impossible for us to act on it. Didn't feel great working the recovery until I was so sleep starved that I momentarily fell asleep in my car at a stoplight on the way in.
BeenisHat@reddit
A client of the MSP I used to work for had a 3Com NBX system as their phone system, which wasn't terrible except that it was starting to have problems. The power supply was getting wonky, I later figured out, but one of the immediate problems was the IDE hard drive starting to fail. So I grab a backup of the database where it stored its settings and do an install on a different hard drive. Stupid thing wouldn't boot because it checked the hard drive size and knew it was a replacement drive. So I had to partition the drive down and try again. Still no dice. Finally I figured out that if I left the remainder of the drive as empty space with no partition, it would work.
But then the database wouldn't go back in because it needed to be the correct incremental OS version. So update the OS (which is vxworks, which I had never even seen before) and then the database works.
For some reason, the phones don't connect. I can't figure out why. Never actually do figure out why, it just worked after the reboot of desperation.
I finally realize the power supply is getting shitty after the thing reboots in the middle of the day, taking everyone down. I bring it up to the owner of the company and he says just order the PSU. Well, at this point, the only place to get one is eBay. He doesn't care.
It's one of those things where every little thing that can go wrong, does. I freaking hated that thing and cringed every time they called about that piece of shit. Eventually they suck it up and buy a new Cisco UCS system. But I nursed that ancient 3Com thing along for far too long.
zombieblackbird@reddit
I was a summer intern assigned to help the IT guy for an engineering firm. They had moved to a new backup system 6 years previous and never once tested it. When the file server became infected with ransomware, they opted to restore from tape rather than pay the hackers. I built new file servers, a new domain, all new user accounts, and MFA. It was like a fresh start for them. When it came time to recover data, we found that the tapes were corrupt. Not just one. All of them. Whoever fucked them over played the long game. They had been backing up junk for so long that anything that might be usable had long since been overwritten. They fired the sole IT guy, leaving me as their only resource.
I spent my summer re-creating every last document from hard copies. My IT internship turned into a data entry nightmare. Drawings, docs, spreadsheets, even the corporate website. I tackled IT tasks on the side while trying to train engineers to manage their own network, since my time there was limited by the internship agreement.
They got bought out by a larger firm that year. I moved on to a bigger university to finish my degree.
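Six untested years of tapes is the classic version of a rule worth automating: run a restore test, however small, on a schedule. A minimal sketch under stated assumptions (the function name and the tar-based backup are illustrative placeholders; a real tape setup would restore from the drive rather than a local tarball):

```shell
#!/bin/sh
# Hedged sketch: a backup you have never restored is not a backup.
# backup_and_verify takes a backup of a directory, restores it to a
# scratch location, and diffs the two trees, failing loudly on mismatch.
backup_and_verify() {
    src=$1
    work=$(mktemp -d)
    tar -C "$src" -czf "$work/backup.tgz" .         # take the backup
    mkdir "$work/restore"
    tar -C "$work/restore" -xzf "$work/backup.tgz"  # restore it elsewhere
    if diff -r "$src" "$work/restore" >/dev/null; then
        echo "restore verified"
    else
        echo "RESTORE FAILED VERIFICATION" >&2
        return 1
    fi
}
```

Run from cron against a representative sample and alert on failure; had anything like this existed at that firm, the corrupt tapes would have surfaced years before the ransomware did.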
versello@reddit
15 years ago, prior to virtualization, our server room, which was located in an old aging building, had redundant power but no redundant AC. The AC crapped out overnight and the room got over 120 degrees. Servers crapped out left and right. We got VMware instantly approved after that incident.
liposwine@reddit
Isn't it amazing that after a major event, all of a sudden they're able to find the money :)
Wheeljack7799@reddit
Former manager of mine had requested funds to purchase new network equipment. Got denied. (multibillion company btw)
Few months later, the entire network went down for two days. My location also functioned as a major hub site for many services, so this outage also affected offices internationally.
When called up to the big boss' office to explain why this happened, my manager referred to the previously denied request to upgrade network equipment.
liposwine@reddit
This is the way.
Hate_Feight@reddit
It's a coincidence, I promise.
The_Wkwied@reddit
Other team rebooted rds server and gateway, middle of the day, kicking off 300 users to run an update.
It wasn't a critical update.
Their justification? They were going to a game this weekend and didn't want to work at the ball game.
Nothing happened because the tech was buddy buddy with the director with the VP.
Bogus1989@reddit
Interesting? Guess when we got hacked. Countrywide, we have near 500 hospitals or something like that. They got in via a user clicking a phishing email, in Ohio. After that they exploited a vulnerability in VMware. Lots of servers below v6.7 out there back then; I personally have seen some on v5.x.
Typical HCL. They were in charge of quite a bit of those hosts.
It was a vacation for me. Literally ALL of it was down. I'm not even sure how many PCs countrywide we have. I know just my sites have over 20k, including the clinics.
I guess, to not be too specific, I'll round up a tiny bit.
150 hospitals, 800 care sites
Won't state what was lost. It was announced, though. There are two pending lawsuits.
mro21@reddit
https://www.reddit.com/r/iiiiiiitttttttttttt/s/jO3PhkNow3
jcpham@reddit
About 2005 or so I showed up at a real estate office to swap a RAID5 drive in a Small Business Server and since I was the MSP I was just following Dell’s instructions to CYA.
To this day I don’t remember what they told me to do because I was young, but I’m pretty sure they walked me through *reinitializing* the array versus *rebuilding* the array. I know the difference but I don’t think whoever I spoke to that day did. I did it anyways.
I spent the next week restoring backups
homepup@reddit
I posted this story a long time ago on /r/talesfromtechsupport and still wake up in cold sweats remembering it over 20 years later.
HMRRMMMMMMMRPH!!! (Shhhhh, just let it happen. Go to sleep now)
At a previous job, I was out in the mornings taking college classes. Well, one day I show up around noon and as I'm getting closer to my office (which doubled as the main server room), I could hear loud squeals and whistles that made the hair on my arms stand up. Like a case of tinnitus being pumped through a strong amp. It kept getting louder and louder.
I'm met by my supervisor who is glad to see me. He tells me that the annoying bells and whistles have been going on for hours and no one knows why but is hoping I can make it stop.
Finally make it to my door and find the problem.
Apparently there was some construction going on in the adjacent room and to help out, my supervisor (who was the manager of the graphics department and not very techy-minded (I know, I know, why was the manager of Graphics over the IT Manager, but that's another story for another day)) was attempting to be helpful and thoughtful.
Since he noticed there was a decent amount of dust being kicked up by the construction, he didn't want it to harm the servers or networking equipment so he took it upon himself to wrap it all in plastic and seal it up with duct tape so the dust wouldn't get sucked in by the fans.
0_0
I'm a patient person, but it took all my strength to explain to him calmly (as I was screaming inside my brain and quickly ripping all of the plastic off of the racks) that he was smothering everything and it might have just cost us a few hundred-thousand dollars in lost equipment.
Thankfully nothing was greatly harmed, but color me surprised. We both learned to confirm with someone before making any kind of changes to equipment, no matter how helpful your intentions.
KiefKommando@reddit
Day before Thanksgiving of 2020 I am sick with COVID and stuck at the house, I receive a notice about server room temps and ask one of the techs at the office to please check that the A/C was running as we had several instances of the redundancy not kicking over right as dictated by the Liebert panel (eventually construction fixed this). Tech assures me that everything was checked and the A/C was turned on, we are all good. I relax and try to rest and sleep. I’m awoken the next morning by a flood of very ominous NAGIOS texts alerting for critical temps on our NETAPPs, I remote in to see what’s happening and suddenly everything just freezes. Whole NetApp cluster is down. Hard down, and stuck half brained with failed control boards. I rush into the office and then spend the next 12 hours troubleshooting this with TAC support, it takes until almost 4 AM the day after Thanksgiving to get the emergency parts from NetApp delivered and the cluster is back up and servers are getting brought back online. I missed Thanksgiving with my family and had to go into the office (no one was there) feeling like I’m inches from deaths door to fix millions of dollars worth of equipment. To this day I get a pit in my stomach if I ever see a temp alert.
madclarinet@reddit
Old one from the time of serial terminals
User wanted his terminal to be faster, somehow found the key-presses to get to the terminal config screen, so he set it from 9600 to 19200 and complained that it didn't work. Even though a faster speed wouldn't make the terminal faster.
The mux they were connected to only worked at 9600, and even if it could go faster, it was all manual adjustments. He didn't like the response and complained to his boss. His boss ordered us to make it work, so I (with my boss's approval) costed out 2 replacement muxes (one for each side) and my boss sent his boss the quote with a request for a cost center.
Problem magically disappeared - when we moved to ethernet connections a year later, his computer always seemed to be stuck at 10Mb half duplex for some reason......
Feeling_Inspector_13@reddit
do yourself a favor and make a deep dive into some german sysadmin subs like r/EDV or r/de_EDV
chiefsfan69@reddit
Biggest personal horror story. Early in my career, back on physical hardware, we did an upgrade, and 8 hours later the system started to crash. The vendor says the OS is corrupt and has me put in the CD they supplied with the hardware. I verified multiple times that it would just reimage the OS and there was no chance of data loss, and begrudgingly hit enter. My mind melted as I saw all the data drives light up. Turns out they had labeled the disks wrong and sent us the wrong one. Only time I've lost data in my career. Luckily, we still had a lot of paper back in those days, so we could manually reenter almost all of it.
Brufar_308@reddit
I’m not a SQL person. Contracted a consultant to convert some old SQL DTS jobs to SSIS on the new server. Tried to limit their access, but not being a SQL person I apparently wasn’t giving them enough permissions, and they couldn’t explain exactly what permissions I needed to grant. That was probably the first red flag. After pressure from multiple directions I granted admin rights to the SQL servers.
Told them to do all the work and testing on the old system and a test SQL server I built so they could test this conversion, and not to touch the new SQL server until we verified the SSIS jobs were working; then we would move to production.
I’m home sick feeling like I want to die and I get a call that the production erp system is down. The oracle ERP migration that we had been working on for months that finally went live the previous week.
Turns out the SQL consultant mixed up the server and DB names in the SSIS packages and truncated about a dozen tables in the production ERP database. Had to restore the database from backup, right after revoking all of the SQL expert's access rights.