Feeling Defeated - Deleted Something Important Today
Posted by AuPo_2@reddit | sysadmin | 178 comments
Sup, I deleted something important. Pretty much my fault for not asking questions, but it was part of a bulk cleanup. I can most likely get the data back, but it’s going to be a process. Just feeling defeated and dumb. That’s it, thanks for reading.
angrydeuce@reddit
My first month on the job I blew away a production database that was, unbeknownst to me, being hosted from a random sales manager's desktop of all things.
Luckily the fact that this dbase was the result of Shadow IT saved my ass; my boss immediately jumped in and asked some pointed questions about who thought hosting a mission-critical database on a desktop was a good idea when there were literally a dozen regularly backed-up servers sitting there to pick from, and why wasn't IT informed... they backed off right quick after that lol, but man oh man were they pissed, and I genuinely thought I had an RGE (resume-generating event) on my hands, a mere two months after I finally got into this business.
Was a valuable lesson regardless... not only never to take what I'm being told at face value without verifying it first in some way ("Is there anything important on here you need saved besides this data here? Nope? Alright then!") but also how insidious Shadow IT really is and why it must be eradicated at every turn.
forkinthemud@reddit
Ugh and AI is only making Shadow IT situations worse...
pjtexas1@reddit
Happens... restore and document as a test of your restoration process.
Jetboy01@reddit
You didn't delete something important, you effectively highlighted significant gaps and weaknesses in the accepted BDR strategy and provided the company with a strategic opportunity to reassess its resilience, improve recovery processes, and implement stronger safeguards to prevent future risk.
Finn_Storm@reddit
If LinkedIn was a language you would be fluent in it, bravo
OrgIQOfficial@reddit
Reframe of the year award goes to 🏆
TannerNTanner@reddit
You should write resumes for a living, Jetboy01
sxspiria@reddit
This is the right mindset
stuckinPA@reddit
That’s beautiful!
ramdomvariableX@reddit
OP - Here's your resume / Linkedin snippet.
SignalCoyote137@reddit
Welcome to the club!
Disastrous_Syrup687@reddit
Sad
oldmuttsysadmin@reddit
This one time, in a maintenance window, I dropped an important database table before I unloaded it. One of us.
stephenc01@reddit
One of us. Learning PowerShell, I sent a command to delete all virtual machines at an org while trying to do a cleanup.
I logged out and put my head on the keyboard; when I still had a job an hour later, I learned that it does not let you delete running VMs.
Moved on, and now I tell it as a cautionary tale to my new people.
Citizen_Null5@reddit
We have all been there
joerice1979@reddit
You're no sysadmin if you haven't done something like this.
Fix and learn, it's what we do.
Hot_Ambassador_1815@reddit
For real. I accidentally deleted a DHCP reservation during some housekeeping that, long story short, took the entire HVAC system down for 2 days.
It happens. You just do your best to try to keep your brainfarts at a minimum.
SuccessfulGrape4045@reddit
That sounds like a....heated...situation.
Hot_Ambassador_1815@reddit
Yea. I just needed everyone to chill out ... but they couldn't
xman323@reddit
https://www.reddit.com/r/sysadmin/s/Zsh9w0i55m
See this :)
Hot_Ambassador_1815@reddit
Right on par. As long as we own our mistakes and fix them, people usually understand you're only human.
radraze2kx@reddit
Windows updated on a non-managed machine for an important long-time customer... That update triggered BitLocker recovery on reboot. The key wasn't in his MS account or our software. We lost everything on his system because of an automatic update. He didn't have backups (non-managed). Thanks, Microsoft.
rared1rt@reddit
Accidents happen.
I once deleted multiple mail stores that I thought were empty, only to find out I had deleted mailboxes for multiple presidents and senior leaders of our sister companies. We hosted their mail. We found out (after filing a bug report) that it was a known flaw in Exchange 2007: if you narrowed the scope in the console and it saw no mailboxes, it would let you disconnect and delete the store.
Canceled part of my Christmas vacation. We also found out then that the disk backup was not working as designed. Between OST files and journaling of the last 30 days, we were able to get most of it back.
This too shall pass.
Lesson learned, grow from it and keep moving forward.
BitsNBytes10101@reddit
I asked a user about 4 times if they backed up their OneDrive before I removed their duplicate account to clear an Azure sync issue.
Plot twist: they didn't. Then they lost their mind AFTER the 30 days during which I could have restored the data.
Shit happens. Own it and move on.
zaphod777@reddit
Users lie, back it up anyways.
imnotaero@reddit
You want people who don't listen to IT or care about consequences? Because this is how you get people who don't listen to IT or care about consequences.
If the wrath you're trying to avoid is somebody's boss coming after IT over data loss, after the user lied to IT despite being warned of the risks, then you're backing it up anyways not because users lie, but because management has lost control and views IT as unworthy of basic human dignity.
In which case, ok fine I endorse, but I hope for your sake you find a better situation.
zaphod777@reddit
It depends on the user and how important the data might be. But given the opportunity I'll always make a backup first, no matter who it is if I might cause damage.
Pupusas_Man@reddit
Agreed. I still recall when this saved me: the user was insistent they didn't need a backup. Lo and behold they did; no worries got the backup right here!
Siphyre@reddit
IT's job is not to teach people to be proper human beings and deal out consequences.
Take the backup.
Disorderly_Chaos@reddit
I had a woman who wrote a rule in outlook to delete everything after 30 days. It deleted everything older than 30 days. It took her 9 months to tell us her (substantial) archive was missing. Gah.
purplemonkeymad@reddit
Did it take her 9 months to notice or was it one of those, "I noticed 8 months ago, but just didn't get around to asking you to fix it."
Disorderly_Chaos@reddit
9 months to notice. She needed to look back and get some info or something.
FunKaleidoscope3055@reddit
Learned our lesson with returning employees and how SharePoint works. Fucking hell, it was a nightmare to get everything working again. Everything is connected to an old SharePoint ID.
winmace@reddit
How can you own something that wasn't your fault in the least? Luckily where I work if the user fails to listen to us we're not responsible and they have to lump it.
virtikle_two@reddit
I once mistyped a password on a device that is suspended 250 feet in the air on a tower with no way to reset it remotely
That was today
It is going to cost us $3000 or my job cause I ain't climbin that
SuccessfulGrape4045@reddit
Just create an elaborate pulley system to suspend yourself up next to it.
peoplepersonmanguy@reddit
If you still have a job, then it wasn't important enough, go bigger!
SuccessfulGrape4045@reddit
Delete an entire share and somehow flip it into a policy gap!
Stonewalled9999@reddit
Yeah. Screw up bigger maybe they will promote ya!
siedenburg2@reddit
I just deleted over 2M files and 3 TB; going to sleep now and will delete as much again tomorrow. I bet there are strays where the person says "You haven't told me" (4 mails, weeks apart) or "Forgot to save". We already prepared last week's backups for the problems we will have with the file deletes today and tomorrow.
You can't get them all, but you can try to prepare as best as possible.
1z1z2x2x3c3c4v4v@reddit
Which is why you "quarantine" things before you make a final backup, then delete them forever.
Quarantine means you remove all access, but leave it alone for 30 days or so.
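The quarantine step can be sketched in a few lines of shell — the paths here are purely illustrative, and on the Windows side you'd pull access via NTFS ACLs rather than `chmod`, but the shape of the idea is the same:

```shell
# Quarantine instead of delete: move the target aside, strip access,
# and only purge after a waiting period. Paths are illustrative.
quarantine() {
    target="$1"
    qdir="$2"
    mkdir -p "$qdir"
    dest="$qdir/$(basename "$target").$(date +%Y%m%d)"
    mv -- "$target" "$dest"      # nothing is destroyed yet
    chmod 000 "$dest"            # pull access (recursively, in real life)
    echo "$dest"                 # record where it went for easy restore
}

# demo on throwaway data
work="$(mktemp -d)"
mkdir -p "$work/share/old-project"
echo "payroll.xlsx" > "$work/share/old-project/files.txt"
moved="$(quarantine "$work/share/old-project" "$work/quarantine")"
echo "quarantined to: $moved"
```

If nobody screams inside the window, delete the quarantine directory for real; if they do, one `mv` and one `chmod` undo everything.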
The_Original_Conman@reddit
We call that "scream testing". No screaming? It must be safe(r).
doubleUsee@reddit
I've run into frustratingly little things that are a pain in the ass to quarantine or disable or whatever. Even something as simple as an AD group: the best I've come up with is exporting the users in it to a CSV and then removing them all.
...and then forget to remove the empty group for half a year and get rid of it next time I feel like cleaning stuff up.
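The export-before-empty habit translates to any platform (the AD version would pipe `Get-ADGroupMember` to `Export-Csv`). A rough Unix-flavored sketch of the same idea — the output path is made up:

```shell
# Snapshot a group's membership to CSV before emptying it, so the group
# can be rebuilt later. Unix analog of the AD export-to-CSV step.
snapshot_group() {
    # getent prints: name:x:gid:member1,member2,...
    getent group "$1" | awk -F: -v OFS=, \
        '{ n = split($4, m, ","); for (i = 1; i <= n; i++) if (m[i] != "") print $1, m[i] }'
}

snapshot_group root > /tmp/group_members.csv
echo "saved $(wc -l < /tmp/group_members.csv) membership rows"
# only once this snapshot exists would you remove the members,
# then scream-test the empty group before deleting it for good
```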
siedenburg2@reddit
For that it would be possible, but other times I had 500M+ files; changing the permissions on those would take way longer than a restore in case of a few missing files (we also save everything to tape and store it for at least a year).
1z1z2x2x3c3c4v4v@reddit
I hear you.
But... you are also not setting up your access permissions correctly if this is the case. In MS land, this is why you want to create local groups with the required access defined (2 local groups: read-only and modify), then nest global groups into those local groups, and put your users in the correct global groups.
That way, it doesn't require any modifications to the source files (local groups) when changing access permissions for the users (global groups).
Understand?
siedenburg2@reddit
That's done for new stuff, but there's a lot of "(pre-)historic" stuff where it can only be changed slowly.
1z1z2x2x3c3c4v4v@reddit
I know all about that, too. I once inherited a file share where all the users had direct access to 1M files... no groups at all! Took over a year to get that resolved.
Stonewalled9999@reddit
I became a real sysadmin the day I let a vendor blow both redundant controllers on our SAN. Naturally it was a month after our company got bought by another company, so I thought for sure they'd fire me. My boss looked at me and said, "Stone, we just spent a quarter of a million dollars training you not to do something stupid in the future. We're gonna keep you."
manvscar@reddit
I once deployed a wipe-and-reinstall task sequence to about 100 PCs by accident.
Caught it pretty quick but still lost a few. Life goes on.
Rouxls__Kaard@reddit
My junior admin deleted a year’s worth of important test data. Recovered by a specialist. Shit happens man.
DevDude2025@reddit
Well, if it was part of a properly managed change, no issue, as long as it was documented and approved by management. ALWAYS USE CHANGE MANAGEMENT as a butt cover!
xman323@reddit
Last month I was cleaning out our old print server failover cluster, which contained print servers and DHCP servers for some OpenShift clusters. A month later, after the lease time expired and the OpenShift cluster went haywire, I discovered I'd deleted both roles. Restored from backup and everyone is happy again. Move on and learn from your mistakes; most of us sysadmins make them eventually.
Nexzus_@reddit
https://i.redd.it/nfmwvt6qqeyg1.gif
Oh yeah, been there.
False_Ad5119@reddit
Had my own share of Data rescue nightshifts. Its All good.
Hebrewhammer8d8@reddit
I don't think OP wants that rope yet?
Affectionate-Cat-975@reddit
We’ve ALLL been there
mkinstl1@reddit
I've felt lately that how senior we get in our roles correlates directly with how many big-decision deletes we make.
Merdrak@reddit
I broke something two weeks into getting my permissions to stuff. The senior engineer joked that we don't really work for the place until we've accidentally deleted something. 🤣
Spagman_Aus@reddit
Oh yeahhhhhhhhh. Absolutely.
But this is why backups exist. Never waste a crisis.
sir_mrej@reddit
one of us! one of us!
sysadminbj@reddit
This has to be the most appropriate usage of that GIF I can think of. Well chosen.
itishowitisanditbad@reddit
That or the 'i've been chewed out before' inglorious one.
eaglebtc@reddit
Happy Cake Day!
More-Ad2642@reddit
Done that. You are not alone! You learn.
Feran_Toc@reddit
I know someone who deleted an entire OU from AD which affected an entire Hospital.
Somehow she was made a manager. So there's still hope for you.
AuPo_2@reddit (OP)
I’m feeling much better after reading everyone’s experiences! Haha.
BigLilUziVert@reddit
If you’re not making mistakes you’re not working!
Junior_Resource_608@reddit
I don’t even know if this qualifies for r/shittysysadmin; as others have said, you’re just never going to make that mistake again.
imnotatworkxD@reddit
Welcome to the club. It happens to the best of us honestly. You learn and adjust so that it doesn't happen again.
Nandulal@reddit
I accidentally hit myself in the head with a large tree branch and gave myself a concussion once. I still don't remember how I did it. I was nowhere near any trees.
1RedOne@reddit
I accidentally deleted the membership of every group in active directory one time
Had to drive three hours to a remote site where we had a Reed only domain controller, and then we had a complicated call with Microsoft to learn how to do an authoritative restore from this region
Also, we found out that all of our tape backups were down and that this hadn’t replicated in 45 days
therankin@reddit
Holy smokes!
(side note: isn't it fun how voice-to-text just randomly does the most idiotic things? Like putting an uncommon male name "Reed" into a sentence where you said 'read only'. It frustrates me so much sometimes.)
Tr1pline@reddit
Happens to all of us mate. I feel stupid when I delete the wrong account!
1z1z2x2x3c3c4v4v@reddit
Where are the Backups? If the data was not backed up, this is not 100% your fault. Accidents happen, which is why there are backups...
OptimalCynic@reddit
A guy I knew in the old days once wiped a floppy disk with
Yes, with a space between A: and \. Oops.
MikeyRidesABikey@reddit
Guy I used to work with did:
rm -rf . /*
(note the stray space between . and /*)
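That stray space turns `rm -rf ./*` into `rm -rf . /*`, which also eats the filesystem root. One defensive habit is a wrapper that resolves each argument and refuses anything that lands on `/`, a top-level directory, or `$HOME`. `safe_rm` is not a standard tool, just an illustrative sketch:

```shell
# Refuse to rm anything that resolves to /, a top-level dir, or $HOME.
safe_rm() {
    for t in "$@"; do
        # resolve to an absolute path so "." and "/" can't sneak through
        abs=$(cd -- "$t" 2>/dev/null && pwd || echo "$PWD/$t")
        case "$abs" in
            "$HOME") echo "refusing: $abs" >&2; return 1 ;;
            /*/*)    : ;;   # at least two levels deep: allowed
            *)       echo "refusing: $abs" >&2; return 1 ;;
        esac
    done
    rm -rf -- "$@"
}

# demo: deletes a scratch dir, refuses the root filesystem
d="$(mktemp -d)"; mkdir "$d/junk"; echo x > "$d/junk/a"
safe_rm "$d/junk" && echo "deleted junk"
safe_rm / 2>/dev/null || echo "blocked /"
```

GNU `rm` also ships real guards worth knowing: `--preserve-root` (the default) and `-I`, which prompts once before recursive or many-file deletes.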
Demache@reddit
I accidentally blew away a client's root CA within a month of starting to work there. It made me learn their daily-backup recovery process very, very quickly, and fortunately it worked.
Sometimes you goof or something goes wrong. Part of what makes you a good sysadmin is how you respond to failures like that. And what can you do to prevent it from happening again.
CompletelyUnrelated1@reddit
ha, I just got written up for something similar. apparently fell under "gross negligence", stupid shit, really. happens to the best of us.
MikeyRidesABikey@reddit
Early in my career (early 1990s), I was adding a new drive to a customer's system. The system already had two drives, so we were going to remove the smaller of the two (currently the system drive), reformat the 2nd drive as the new system drive, and add a new drive for data.
After I reformatted the 2nd drive to be the new system drive and added the new data drive I restored the backup, only to find that whoever added the old data drive (now reformatted as the system drive) had not added it to the backup.
Luckily, only history data was on the drive that I reformatted, but that was one heck of an adrenaline jolt, and not a fun conversation with the customer (who actually took the news way better than I expected!)
Lesson learned - check the integrity (and contents!) of your backups first!
StratoLens@reddit
Hey man - we’ve all been there. This is basically a right of passage in IT. Welcome to the club. Be thankful the data is recoverable. We’ve all deleted stuff that wasn’t.
It’s gonna sting. Learn from it. Get involved in fixing it as much as you can, do it yourself if possible or ask to help in the recovery. Recognize where you went wrong, then stop beating yourself up and move on.
It’s not gonna be your last mistake. We’re only human.
mnvoronin@reddit
It's rite of passage
^(hides)
StratoLens@reddit
These days I make mistakes like this on purpose to prove I’m not an AI bot :).
Or I just didn’t realize that.
OptimalCynic@reddit
That's exactly what an AI bot could be trained to say suspicious eyes
StratoLens@reddit
You’re absolutely right!
Einherjar07@reddit
*unhides hidden files*
Gottem
fraghead5@reddit
Back in the early 2000s when I worked in a datacenter, I on more than one occasion swapped the wrong drive on a failed RAID and overwrote the good disk with a new blank disk.
Secure_Cyber@reddit
You aren't truly a professional if you haven't accidentally shut something down or accidentally deleted something. It sucks, it's embarrassing, but life will go on. Just know that you care enough to be extra careful in the future. We've all done something we kicked ourselves for, big or small, but now we are better and can use that experience to help teach others.
FrankNicklin@reddit
Baptism of fire. Welcome to the world of sysadmin.
hkusp45css@reddit
You ain't a cowboy, till you been bucked off.
brunogadaleta@reddit
Did you learn something? Well, that's what it takes to be experienced.
DarthJarJar242@reddit
This is the point of backups. We all fuck up every once in a while; what matters is learning from it. Learn from this and don't let it get you down.
Zaiakusin@reddit
EVERY IT professional has done this at least once. Its why we use backups. Shit happens.
Odd-Feedback8338@reddit
I once deleted the configs for all MacBook users in JumpCloud, meaning they lost accounts, data, etc. until I was able to restore them.
Everything got restored but one user's — that was a shitty, stressful day where I learned some valuable lessons.
Civil_Inspection579@reddit
been there, that feeling sucks but it happens to pretty much everyone at some point. the fact that you can likely recover it is already a good sign. take it step by step, you’ll get through it.
thatguyyoudontget@reddit
been there, done that.
own it and move on..shit happens, especially for us!
Proic13@reddit
I once locked our entire company out of email because I decided to cross-reference our Azure security/named locations with who was on vacation, not realizing I'd forgotten to include the US (geo-location lockout).
I VPN'd into Serbia (a user had vacationed there) and logged in with admin credentials to correct the mistake.
I then had the SOC team alerting my boss to an unusual login from Serbia with admin credentials. They were going to alert the incident response team before I had to fess up to my fuck-up.
It happens, my friend. IT folks make mistakes; it's a learning experience ~~when you try to cover it up and then it blows up in your face so now you get laughed at whenever someone needs to reconfigure the geo-location~~ we all learn to be a better person from it ~~totally gonna screenshot the desktops and hide the icons of those who laughed.~~
FunKaleidoscope3055@reddit
Lmao just last month our SOC locked my account out because I idiotically left my VPN connected to a node in Serbia and signed into Azure. Dumb mistake. Got resolved quickly.
But holy shit, I was sitting there locked out of everything after hours, rummaging for my break-glass cred, thinking... SHIT, we just got fucked, for about 5 minutes. Also my boss at the time was on vacation, so that helped.
eaglebtc@reddit
You must be in /r/YUROP, because an American would not even think to check Serbia.
networkn@reddit
Hang in there sport. The sun will 100 percent come up tomorrow. If you haven't done it, it's not a matter of if, it's a matter of when. Be happy you have backups and remember why you always ensure you have them before making changes.
Accurate_Ice7461@reddit
Pro tip: add .old in front; that way you can wait for the screams and instantly resolve.
EvandeReyer@reddit
I see we have a seasoned professional here.
I prefix with Zz (sends them to the bottom where I can easily see them).
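Both tricks are the same move — rename, wait for screams, delete much later — and undoing is a single rename because the data never moves. A tiny sketch (the "share" here is just a scratch directory):

```shell
# Stage-for-deletion by renaming: data never moves, so undo is instant.
stage() { mv -- "$1" "$1.old" && echo "$1.old"; }   # or prefix with Zz instead
undo()  { mv -- "$1.old" "$1"; }

share="$(mktemp -d)"     # stand-in for a real file share
stage "$share"           # now wait for the screams...
undo  "$share"           # someone screamed: instant resolve
```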
WaldoOU812@reddit
We just promoted our help desk lead engineer onto our team, and the three of us all told her, "you're not a senior engineer until you've brought production down at least once."
EvandeReyer@reddit
“However don’t take that as a challenge…”
some_string_@reddit
And?
Suolara@reddit
I accidentally deleted a space once and caused an admin database to crash. One space, in a file that recorded usernames.
DeCnySnI@reddit
Man, I remember I deleted the production mail server, only to be reminded how important it is to check your backups... Of course they were corrupt... I had to spend 80 hours with a recovery party to get back the deleted files. Those were the longest 3 days of my life.
When nothing is going on it's chill, but days like that, pohhh.
WaterOwl9@reddit
5 whys
Kaballis@reddit
Have you even worked in IT if you haven’t done this at least once? You know how most users have loads of crap on their desktops? I deleted everything for a user assuming it was redirected. It wasn’t, I should have checked. I was so lucky that I did this to a user who didn’t give a flying fuck but I still beat myself up over it. This moment will pass.
ExcellentPlace4608@reddit
I deleted an entire day's worth of work one time. I was tasked with moving all of the virtual machines to a new host and neglected to take a backup of the day so had to restore from the previous night.
Much_Cardiologist645@reddit
It’s okay. Just blame everyone and everything else in the company. Usual MO of everyone here anyway.
h8mac4life@reddit
Dumb, sure. No backup? Really fucking dumb.
AuPo_2@reddit (OP)
Oh there’s a snapshot, it’s just going to be a fun, really FUN, process to get it mounted tomorrow…
h8mac4life@reddit
Your boss must be a dumb ass.
woodrowbill@reddit
Chill man.
SageAudits@reddit
Congratulations, remember this and wear it like a badge of honor. If you weren’t a sysadmin before, today you are one for sure. It happens to us all eventually.
ncc74656m@reddit
My second job I blew away the CEO's wife's personal data because I thought I knew better. I recovered most of it. I lied about the cause. I saved my job that I probably could've saved just by being honest. But I learned important lessons about not making assumptions and exercising due diligence, and also how important it is to take all of 2 minutes to verify an assumption.
Turbulent-Tie7280@reddit
Wiped out server while updating OS. Luckily used dd beforehand.
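For anyone curious, the `dd` safety net is roughly this. On a real box it would be something like `dd if=/dev/sda of=/backup/sda.img bs=4M` (device and path hypothetical); the sketch below images a scratch file instead so it's safe to run anywhere:

```shell
# Raw-image a "disk" before risky OS work, then verify the copy.
# A scratch file stands in for the block device here.
disk="$(mktemp)"
img="$(mktemp)"
dd if=/dev/urandom of="$disk" bs=1024 count=64 2>/dev/null  # fake disk contents
dd if="$disk" of="$img" bs=1024 2>/dev/null                 # the backup image
cmp -s "$disk" "$img" && echo "image verified"
```

The `cmp` at the end is the part people skip; an unverified image is a backup you merely hope you have.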
Impossible_IT@reddit
Recently migrated our Macs from Jamf to Intune. The process was to wipe and reload macOS. This was the last Mac, as the user had been on extended leave. I asked if they had backed up their data, since the laptop needed to be wiped for the enrollment. They said they had just completed a Time Machine backup. Took the laptop, wiped the drive, installed macOS using a bootable USB. Once finished, time for the user to log in. User logs in, then asks if I wiped the drive, as their password manager database wasn't backed up in their OneDrive. The user couldn't access their external Time Machine backup because they'd used a long password generated by the password manager. Luckily I hadn't wiped their Intel MacBook Pro from a year prior, and they were able to get some of the missing software/scripts from that.
eaglebtc@reddit
Ow fuck. I'm a Jamf admin that works in shops with Windows. You are going to hate Intune after a while when they realize all the stuff it can't do, and all of its dumb quirks.
Also... did you do this migration before updating the Macs to Tahoe? You're supposed to be able to "migrate" MDMs without wiping the whole device.
Impossible_IT@reddit
I’m just a grunt, so I don’t touch Intune for configuring and such. They only gave us lowly grunts enough access to get the encryption keys, admin password, and device properties, and to generate reports. I’ve yet to generate any reports. The migration was way before Tahoe, and the directive from higher up the chain was to wipe, reinstall macOS, and have the user log in.
eaglebtc@reddit
Yeah, that's too bad. Tahoe makes it possible to switch MDMs now, even with supervision.
Texkonc@reddit
Welcome! New recruits have to bring the cookies! The one I won’t forget: I tested TFS upgrades like 4 times, then when I did prod and followed the checklist I built, my dumbass forgot step 2: take a new backup after shutting off services. Devs lost a week of work! Luckily they pieced it all together from the various people who had checked out code. Welcome to the lost data club!
OrganizationNew9063@reddit
It happens. Sucks right now, but most everyone has done it, if it’s any consolation.
tuxnine@reddit
At least the data can be restored. It could always be worse. Remember, no matter how badly you mess up in IT, someone has almost certainly done something worse, probably even within your company.
TechMaster212@reddit
In my college days I was interning with the university's server team, and they had given me the task of running down a list of servers about to run out of space and cleaning up obvious areas like Temp and update files. One day I was working an afternoon, like 12-5; I did my task, and the last server I hit had like ~75 GB in Temp, so I was like, cool, this is all set after I emptied it. I logged everything, emailed my supervisor, and went home.
The next day I came in at 8 AM and there was major panic because an important web app was down and they couldn't figure out how. This app was developed by another IT team exclusively for the university, and when they mentioned the server, I told them it was on my report yesterday and I had dumped the Temp folder. The dev team freaked out because that's where their app ran from.
One restore later it was back up, and they got scolded for running things from Temp. Lesson here for you: these things happen, backups exist for a reason, and you'll do better next time.
wintermute023@reddit
That’s OK. Back in the day I deleted a couple of Exchange databases by mistake (I was running a cleanup script) and went home. I got a call about midnight that it had propagated to multiple connected Exchange servers across the country. 9,000 people woke up to no email, and we pulled 36 hours straight restoring and fixing. Didn’t get fired; it was a valuable lesson. My boss was brilliant. I learned about dry-running scripts that day.
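The dry-run lesson generalizes to any destructive script: route every dangerous command through a gate that defaults to printing instead of doing (PowerShell bakes this in as `-WhatIf`). A minimal shell sketch, with illustrative names:

```shell
# Dry-run pattern: destructive commands go through run(), and the
# default mode only prints what *would* happen.
DRY_RUN=1
run() {
    if [ "$DRY_RUN" = "1" ]; then echo "DRY-RUN: $*"; else "$@"; fi
}

tmp="$(mktemp -d)"; touch "$tmp/mailbox.db"   # stand-in for real data

run rm -f "$tmp/mailbox.db"          # prints the command, touches nothing
[ -f "$tmp/mailbox.db" ] && echo "dry run: file untouched"

DRY_RUN=0                            # flip only after reviewing the output
run rm -f "$tmp/mailbox.db"          # now it actually runs
```

Defaulting to the dry run means a script pasted or scheduled by mistake narrates instead of deletes.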
BIGt0eknee@reddit
Congrats on getting a promotion soon.
Tx_Drewdad@reddit
Nah, man.
Pyrostasis@reddit
Hey at least you can get it back!
When I first started I had a boss's boss who was a complete psycho. The end user was right even when they were wrong, and you white-gloved everyone all the time no matter what. You needed to be FAST all the time because otherwise the prickly site managers would go right to the director (boss's boss), bypassing 3 layers of command to make sure they got their way.
We had a site manager who was an ultra-special snowflake. Her PC had a problem with an installed patch that didn't install clean and wouldn't uninstall. If you tried, it would blue screen, reboot, reinstall, and get into Windows, but it was unstable as hell. Told the user she needed a replacement as hers was out of warranty anyway, and figured she'd like a new machine.
She wanted a ten-key. At the time, ten-key models were no longer being delivered for our two models; it was non-ten-key only. So she wanted to keep HER laptop. We explained it would need to be wiped and she said fine, do it. We instructed her to back her shit up and let us know when she was good.
She lived with that for 3 weeks because she didn't want to deal with it, all the while her machine kept getting more and more unstable.
Finally she gave in and said do it, but she wanted her machine back ASAP. Which meant if I didn't get it done fast, she was calling your boss's boss and you'd hear shit for it.
Confirmed with her that she'd backed up anything she needed in OneDrive, and she said she did. Told her anything not backed up would be gone. She said she wasn't an idiot and it was "do it and do it now"... so I gave her a loaner and went back to the IT department to wipe her machine. Got it up and running, her desktop icons popped over when OneDrive synced, and gave it back to her.
Normally our practice was to keep all site managers' and other VPs' laptops for 90 days before wipe and decommission, but she threw a fit so we did as she demanded.
Well, 3 days later she calls me, fucking livid, because the project she's been working on for six months is gone. We go digging in her OneDrive and it's there... but it's an older version. Somehow she'd logged out of OneDrive, never logged back in, and so her stuff had stopped syncing months ago. She'd been working on that project file on her local machine instead of in the cloud apps like every other one of her projects, and since her machine was wiped and had an SSD, it was fucking gone.
I was sure I was cooked. Called my boss and explained the situation. He wanted to know why the fuck I'd wiped the machine instead of using a loaner; I explained she'd demanded I do it. He said it wasn't our policy, and I explained that every time we'd pushed back before, they went to his boss and we ended up doing it anyway and getting yelled at. He laughed and was like, yeah... that's true, but what do you think is gonna happen now?
Sure enough she went to the big boss, and he raised hell, and I kept expecting to be fired.
I should have verified her OneDrive worked. I should have forced her to sit with me and verify her files. I should have gotten the wipe-in-place cleared by management. And my boss's boss shouldn't have routinely violated policy for complaining end users, which led to his staff being terrified to cross them.
Ended up keeping my job. Never ever fucked with deleting data again. It's been 8 years and I am still fucking OCD as hell anytime I'm wiping or purging something. Backups are like a holy mantra to me.
Wild thing is, that boss's boss ended up poaching me a year later because he liked my work ethic, and ended up massively boosting my career.
Anyway, point is: you fucked up, but you've got a backup, and it's happened to all of us in some way, shape, or form. Sounds like you fucked up in a way that at least won't cost you your job, or the company money, or the end user their data. Don't sweat it. Own it, learn from it, and get better.
FormerLaugh3780@reddit
I've been there, try and not beat yourself up too bad.
lastplaceisgoodforme@reddit
Jesus saves, so should you.
Hefty-Prize5713@reddit
I’m pretty sure everyone has had that moment.
NindieNation@reddit
It's alright. When I was new, I decided to run a quick update on the server.
On Saturday. During the summer. At a 20-store car dealership in Texas that sells the most pickup trucks of any location.
Apparently I made them restart over $4M of sales, of which about half were lost that day.
It happens, tomorrow is a new day and 99% of your company has no idea what you did or how to avoid it, so get in good with your boss and just make shit up if anyone else asks.
sir_mrej@reddit
So first - It's ok.
And second - Welcome to the club. We've all taken down prod or done something similar.
It's not fun. But it's how it goes.
TxJprs@reddit
it happens. tell your boss immediately and tell em you plan to fix it.
drewskie_drewskie@reddit
I've done this twice. One was SharePoint, which was easily fixable but embarrassing. The other was installing Microsoft PC Manager, which was way too aggressive for our environment.
PDQ_Brockstar@reddit
I once deleted all my CIO's local data during a reimage. It was not recoverable. It happens man.
TechnicalWaffles@reddit
Happens to most of us at some point. I once accidentally purged the full config of our CI/CD tool. Thankfully the backup was less than 24 hours old
Dekklin@reddit
It's a rite of passage
BlazeReborn@reddit
And now you're one of us.
AmiDeplorabilis@reddit
Welcome to the real world... we ALL (probably) have felt your pain!
Routine-Jam-48@reddit
An old Dr Fun webcomic from 1995...
hankhillnsfw@reddit
Yesterday I accidentally deleted an in use api key that our production application uses to allow access for our critical business partners.
NerdWhoLikesTrees@reddit
My friend works in a hospital. When he fucks up someone dies.
It’s important to keep perspective
baconjerky@reddit
It’s OK, I pushed a firm-wide Intune config EOD today without notifying anyone, and it’s my 3rd week on the job.
darkrhyes@reddit
I was told several years ago that I could go ahead and delete about 45 user accounts. They were almost all active accounts, but they had flags in our identity system showing they had never changed their password. I created a script to restore them, just in case. I deleted them and got a phone call about it within two minutes. I ran the restore and it was, thankfully, just a blip in the users' day. We had a meeting about it and I helped develop a new policy.
SketchyTone@reddit
I agree that it's a rite of passage into IT; it happens to the best of us.
A few years ago I left my computer unlocked at lunch during WFH. What's the worst that could happen? Cat fell asleep on the keyboard and deleted like 100K files... whoops. That was hard to explain until I got a photo of her sleeping on it after I turned in for the day.
Now I always lock my PC even at home.
Advanced-Ad-1544@reddit
You're not the first and you won't be the last
SomniumMundus@reddit
Ahhh, it’s alright. I once replaced the RAM for this small business and it did not come back up after. Learned there was a hardware issue from way before I even worked on it; the client had said no regarding replacing it, etc. Can’t have experience without a few broken eggs at some point. Best of luck in your career!
Fliandin@reddit
25… sigh, almost 30 years ago I deleted an entire multimillion dollar project off a server. The IT guy was out of town sorting out his retirement, his replacement was not fully onboarded, and nobody had a clue where backups were, how often they were created, etc.
It was a mess and I thought for sure I was getting fired.
In the end we did recover much, but not all, of it. I recreated what needed recreating, the project got done, life went on.
Fast forward a couple firms and a few decades. I’m now the head of IT in a similar firm and backups are better documented lol.
You aren’t defeated at all you just had a little excitement.
djgizmo@reddit
Won’t be your last mistake. I’ve taken down 2 companies’ production networks, during the day… and networking is my specialty. I’m still working (and growing) within my field.
gamayogi@reddit
I may have inadvertently unleashed the Conficker worm on my work network about 17 years ago. Took a whole weekend of overtime for the entire IT department to clean that up. Shit happens. You learn, and sometimes you're the better for it afterwards.
Duck_Diddler@reddit
ONE OF US
firesyde424@reddit
I think most "seasoned" IT pros have been there, myself included. You are now officially in the club.
My "welcome to IT" moment was when I was was deleting old unused datasets from network storage. I went too fast, wasn't paying attention, and I accidentally deleted 56TB of company data. And since the backups were a mess, most of it had to be restored from originals over the course of two weeks.
An important part of this particular rite of passage is learning from it. We all make mistakes. I hope you have an employer who is understanding.
bbushky90@reddit
One of us… one of us…
ryche24@reddit
Happens. Own it and learn from your mistakes.
Big-Narwhal-G@reddit
Ever check that you're about to pull the correct cable twice and then still pull the wrong one? This guy's done that. Things happen; you now have a learning experience to take forward with you!
HairGrowsTooFast@reddit
We’ve all been there. You got it.
secret_ninja2@reddit
My old boss used to tell me it's only a mistake if you fail to learn from it. You've got to break a few eggs to make an omelette.
SevaraB@reddit
Did a little mini-version of that to myself today- copied a local git repo up to a git server, left a dot folder in it. No big, delete it from the git server, add it to gitignore, sync it with the remote repo.
Realize I deleted the folder with my launch.json config. Oops.
Luckily, it didn’t take me much time to build a new debugging profile in the IDE, but still- that was a quality pucker for a moment.
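For what it's worth, the less painful route in that situation is to drop the folder from git's index only, so the remote forgets it while the working copy keeps it. A throwaway-repo sketch (the `.vscode` folder and commit details are made up for the demo):

```shell
set -e
# Reproduce the scenario in a throwaway repo (all names hypothetical)
git init -q demo && cd demo
mkdir .vscode && echo '{"version": "0.2.0"}' > .vscode/launch.json
git add -A
git -c user.email=me@example.com -c user.name=me commit -qm "initial"

# Untrack the folder WITHOUT touching the working copy
git rm -r -q --cached .vscode
echo '.vscode/' >> .gitignore
git add .gitignore
git -c user.email=me@example.com -c user.name=me commit -qm "stop tracking .vscode"

# launch.json is still on disk; it is just no longer tracked
ls .vscode/launch.json
```

The key difference from a plain `git rm` is `--cached`: it removes the path from the index (and therefore from the next push) but leaves the file on disk.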
rybosomiczny@reddit
Been there, done that. Get over it. In my case, after a few weeks it turned out no one was missing any files from the corrupt NFS share I broke.
zAuspiciousApricot@reddit
NetApp?
gorramfrakker@reddit
Mistakes happen. If you have accepted the responsibility of your mistake and are fixing it then you’re all good. Just be more careful moving forward.
bgdz2020@reddit
It's okay. My lead talks to us like we are dumb even without making mistakes.
dark_frog@reddit
Coworker of yours? https://www.reddit.com/r/sysadmin/s/gjMO9dFmcO
AuPo_2@reddit (OP)
Ha, different situation. But that’s funny
EnigmaFilms@reddit
What's worse? That or cutting a cable
mindsunwound@reddit
Next time use your boss's login from your coworker's PC, no longer your problem.
NickMalo@reddit
Ah you’re alright. I decommissioned an active app server because i was told to and never verified it was defunct.
Substantial-Fruit447@reddit
Should have had backups...
Wooden-Can-5688@reddit
I've been to that rodeo on more than one occasion. You're feeling as I felt, but shake it off and move forward. Easier said than done, I know. The last incident where I deleted data stuck with me a LOOOOONNGGG time and cost me hundreds in liquor.
ArcaneTraceRoute@reddit
“It’s a process,” is that code for Iron Mountain? I hate IM and we’ve all been there. I’ve robocopied huge data sets after testing the script numerous times and bunked that up pretty good. On-prem SAN to Azure file share.
Title_in_progress@reddit
Dude, everyone fucked something up at least once in their career. I surely did; everyone else in here did. So welcome on board! See it as a lesson, grow with it and move on.
Turbulent-Tie7280@reddit
Wiped a server while updating. Luckily I'd used the dd command beforehand, but still.
Fuzilumpkinz@reddit
First time?
Great reminder to have backups, and a great time to push for them if your company has been too cheap in the past!
8bit_dr1fter@reddit
Are you my CISO? He’s done that numerous times because there were “vulnerabilities in the files”. 🙄
anonpf@reddit
Shit happens all the time. Take a moment to look at your process and where the failure points are. Ask yourself if you could have backed up the data, copied a config, etc. There's always room for improvement.
PrettyAdagio4210@reddit
Welcome to the family!