“I Got Pwned”: npm maintainer of Chalk & Debug speaks on the massive supply-chain attack
Posted by Advocatemack@reddit | programming | View on Reddit | 33 comments
Hey Everyone,
This week I posted our discovery that popular open-source projects, including debug and chalk, had been breached. I'm happy to say that Josh (Qix), the maintainer who was compromised, agreed to sit down with me and discuss his experience. It was a very candid conversation, but one I think was important to have.
Below are some of the highlights and takeaways from the conversation, since the “how could this happen?” question is still circulating.
Phishing + MFA is not a silver bullet
The attackers used a fake NPM login flow and captured his TOTP, allowing them to fully impersonate him. Josh called out not enabling phishing-resistant MFA (FIDO2/U2F) as his biggest technical mistake.
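For reference, npm's write-protecting 2FA can be switched on from the CLI (registering an actual hardware key is done in your account settings on npmjs.com, so treat this as the minimal step rather than the full fix):

```sh
# Require a second factor both for login and for publishes/ownership changes.
npm profile enable-2fa auth-and-writes
```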
The scale of the blast radius
Charlie (our researcher) spotted the issue while triaging suspicious packages:
Wiz later reported that 99% of cloud environments used at least one affected package. Josh reflected:
There were some 'hot takes' that this wasn't actually a big deal because the impact was so limited (only $900 of crypto was actually stolen). However, 10% of all cloud environments analysed had the malware within them. Had the attackers been smarter (or more malicious), the impact of this could have been huge. This wasn't a win for security; it was a lucky break.
Ecosystem-level shortcomings
Josh was frank about registry response times and missing safeguards:
yksvaan@reddit
It's more the community's fault for installing and accepting dependencies so easily. A lot of these packages are small utilities that you could write yourself, replace with newer JavaScript features that cover the same functionality, or just vet and copy into your own source.

npm could show a full list of direct and indirect dependencies.
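The CLI can at least dump the full tree of what's already installed (a quick sketch, run from a project root):

```sh
# List direct dependencies plus every transitive dependency beneath them.
npm ls --all
```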
Bobby_Bonsaimind@reddit
The website will happily lie to you, though, to cover up that the dependency is a fraud:
Kwantuum@reddit
Dependencies are inevitable. I think a bigger problem is the lack of version pinning by default in the node world. A dependency update is something serious, but by default dependencies are added as "version x.y.z and up", which downloads the most recent compatible version (according to semver) rather than the exact version on a fresh install. That shouldn't be what's used in CI or during deployment, but unfortunately it far too often is. This is the real reason an update that was only live for 2 hours could affect millions; a vanishingly small proportion of those installs were caused by manual package updates.
But yes, there needs to be a community effort to start removing dependencies from projects when they add little value, and to pin package versions everywhere.
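Roughly what pinning looks like from the CLI, for anyone unfamiliar (package name and version are just an example):

```sh
# Default: records a caret range like "chalk": "^5.3.0" in package.json, so a
# fresh install without a lock file may resolve to any newer 5.x release.
npm install chalk

# Exact pin: records "chalk": "5.3.0" instead, so a fresh install gets
# exactly that version.
npm install --save-exact chalk@5.3.0
```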
General_Session_4450@reddit
Dependencies have been pinned in the package lock file by default for many years now.
Kwantuum@reddit
And the lock file is ignored by npm install. You should use npm ci to install based on the lock file, but many places don't. And many people will run npm install on their machines when first setting up the project, which can update the lock file; they promptly commit it, it gets waved through review, and you have now bumped every dependency by accident.
General_Session_4450@reddit
No, `npm install` will not ignore the lock file. It will only ignore the lock file if you have manually edited `package.json` by bumping a version to something that is incompatible with `package-lock.json`. If you don't touch `package.json` and do a clean install with `npm install`, then it will not bump any dependencies. There are also some differences if you already have an existing `node_modules` directory, where `npm install` can add those packages to your lock file, etc., but these edge cases do not affect clean installs.
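A rough sketch of how the two commands differ in practice (not exhaustive):

```sh
# npm install: resolves from package-lock.json when it is consistent with
# package.json, and only rewrites the lock file if the two disagree or new
# packages are being added.
npm install

# npm ci: removes node_modules and installs exactly what package-lock.json
# specifies; it errors out instead of updating the lock file if it is out of
# sync with package.json.
npm ci
```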
grauenwolf@reddit
That's not gonna help when all of the major frameworks do the same thing.
yksvaan@reddit
Exactly why the community needs to step up and change its culture. It can be done, since that's definitely not the case in other languages. It just seems that pretty much no one cares.

Instead of getting hyped about something every week, the JS community needs to go back and learn basic programming principles, architecture, project management, etc.
vlakreeh@reddit
Controversial take, but the pattern of many tiny dependencies instead of a few big ones is genuinely really nice as a developer.

Countless times, in other ecosystems that operate differently, I've had some issue with a bigger library I'm already dependent on, and I'm totally at the mercy of its authors to change something about their library (which they are often rightly hesitant to do!) unless I'm willing to fork it and add even more maintenance burden on myself. In the JS world, where everything is so modular with tiny dependencies, it's a lot easier to swap out a library for a similar one if it isn't exactly what I'm looking for, and if an alternative doesn't exist there's a much smaller scope for me to reimplement.
NPM and package managers with similar principles (cargo, pip, go) really embrace a modern interpretation of the Unix philosophy of building small, modular, and extensible parts that can be composed to solve non-trivial tasks. The actual issue is that NPM's default behavior is to implicitly update patch versions when you run `npm install`, unless you explicitly pin dependencies.
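That default can be flipped so exact versions get written instead of caret ranges; a small sketch (it only affects how newly added dependencies are recorded):

```sh
# Per project: commit an .npmrc so every "npm install <pkg>" records an
# exact version in package.json.
echo "save-exact=true" >> .npmrc

# Or set it globally for your user profile.
npm config set save-exact true
```

bhison@reddit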
Phishing is a solvable problem, so why is this still happening?

Any important service should habitually use a cryptographic signature to prove a message is really from them. You could easily maintain a keychain of at least 100 critical service providers to prevent this. This could be built into all email clients as standard, with the UX automated and tucked away from the user.
Does anyone know of a reason why this isn’t workable? Considering the risks and costs of phishing why hasn’t there been a push for this to become the norm?
Illustrious_Dark9449@reddit
I imagine that, while solvable, the road to migration is long.
Mail is just so old, and the backwards compatibility between mail servers quickly becomes a problem.
ArdiMaster@reddit
Neither S/MIME nor PGP requires the server to be aware of them. S/MIME even has widespread client support. All that's really missing is an initiative akin to Let's Encrypt that makes it easy for anyone to get certificates.
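As a rough illustration that signing and verification happen entirely at the endpoints, this is what the openssl plumbing looks like (file names and certificates are placeholders):

```sh
# Sender signs the outgoing message with their certificate and private key.
openssl smime -sign -text -in message.txt \
  -signer sender-cert.pem -inkey sender-key.pem -out signed.eml

# Recipient verifies the signature against a trusted CA bundle.
openssl smime -verify -in signed.eml -CAfile ca-bundle.pem
```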
bhison@reddit
But cryptographic signatures are shared in plain text; the only thing you would need to develop is client support to smooth out the UX.

It doesn't need to be a requirement. I see a direct parallel with the migration to 2FA: those who need security and have the capacity to use the tools on offer can improve their security.

This example is one of many which illustrate that the inconvenience of doing this is entirely justified.
BibianaAudris@reddit
I think the best solution to phishing is on the client side: just ignore every notification the first time it arrives. If it's really important, they'll send it again. Phishers usually don't send it again, due to cost.

Cryptographic signing isn't exactly a silver bullet either. Big parties like npm send so many different notifications that the system can eventually become a signing oracle for attackers. It's not that far-fetched that someone could craft a creative support ticket to elicit a signed reply suitable for phishing someone else.
DorphinPack@reddit
Nobody wants to pay to maintain public services that aren’t the top of a sales funnel.
ptoki@reddit
Find me a site which says that this particular SSL cert (for example, for my bank's website) has this particular hash.

There was a time when I suspected my computer/browser had been hacked.

I could not find a decent page which publishes the cert info. The whole web assumes the info is there, that no man-in-the-middle exists, or that there are ways for the end user (meaning a computer-literate person) to verify the certs.
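For what it's worth, you can at least dump the fingerprint of the cert a server is actually presenting to you and compare it against one obtained over a different network or device (the hostname below is just an example):

```sh
# Print the SHA-256 fingerprint of the certificate the server presents.
openssl s_client -connect example.com:443 -servername example.com </dev/null 2>/dev/null \
  | openssl x509 -noout -fingerprint -sha256
```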
Certs would not solve the phishing source here, either.

You have no easy way to know the link you clicked is valid if the phishing attack is done right.

Email is a problem on its own. There are pages which allow you to send any email with almost any From field.

Trusted content would have to be confined to a very specific form and location, and there are very few standards for this in the industry.
piesou@reddit
Because you can verify all you want, if the sender is typo-squatting the target domain it won't do jack shit.
bhison@reddit
It’s more that you can easily verify and alert with a big banner: “this message is signed/unsigned”, “this sender is/is not in your trusted list”, etc. It’s a case of automatically and loudly invalidating the message.
piesou@reddit
My bad, I was thinking of DKIM.
Old_Pomegranate_822@reddit
99% of node-based cloud environments, maybe. Not sure how this would affect servers not written in node. You might be able to attack the frontend, I guess, but even then 99% seems a lie.
Advocatemack@reddit (OP)
"Our data shows that prior to this campaign, 99% of cloud environments had at least one instance of one of the packages targeted by this threat actor "
https://www.wiz.io/blog/widespread-npm-supply-chain-attack-breaking-down-impact-scope-across-debug-chalk
SanityInAnarchy@reddit
99% of Wiz' own cloud environments?
Otherwise, how did they run that survey? Most "cloud environments" don't publish an inventory of what NPM packages they have deployed.
grauenwolf@reddit
Is there any reason to believe that their customers are not typical?
SanityInAnarchy@reddit
I have a hard time believing any one company's customers are typical, unless maybe that company is Amazon. But no, I don't have any actual reason to believe that Wiz' customer base is especially unusual.
_maggus@reddit
Wiz is a security and analytics solution. I assume the 99% figure means across all their clients' cloud environments.
My company uses Wiz too, pretty nifty tool.
Advocatemack@reddit (OP)
I suspect it was a study of their customers, yes, but I don't know. I'm trying to reach out to the researchers. Wiz is generally pretty solid in research, so I don't think it's BS.
Halkcyon@reddit
I think you're talking past each other.
chipstastegood@reddit
Plenty of CI/CD and cloud environments use npm and Node for something - not necessarily for production application code, but even Java apps will often use npm/node in some way in the build/deployment process.
FuckOnion@reddit
Not really a fan of how he discredits the Node, npm, and React ecosystems @ 17:30.
A lot of important services have web interfaces built on these technologies these days. Node is massive. Not respecting security as you otherwise would "just because it's JavaScript" is disappointing and reckless.
That said, npm is a minefield and I think it's just a matter of time before we get hit even worse. Supply-chain attacks need to be solved sooner rather than later or we're in for a world of hurt.
ptoki@reddit
I would bash the current web/JS/node world and all their derivatives and siblings even more if I could.

This is crap, and it is a shame that with so many people involved the code is that crappy and the ecosystem so fragile.

This needs to change. Really. Flash was touted as a cancer. Modern JS is cancer → cancer → cancer...
AnnoyedVelociraptor@reddit
I find it insane that NPM doesn't have something like the trusted publishers that crates.io has. I cannot publish my crates from my local machine; it has to go through a PR in a controlled environment.
Second, I find it insane that a maintainer of a code base this size does not use a password manager.
mareek@reddit
Really great interview, Qix seems like a very nice guy
He has some great pieces of advice too:
yojimbo_beta@reddit
I saw the phishing email. I suspect I would have clicked too, if it had caught me unawares. The main clue that it's bogus is the npm.help domain, which honestly looks a lot like a legit TLD.