‘No Way To Prevent This,’ Says Only Package Manager Where This Regularly Happens | Kevin Patel
Posted by lelanthran@reddit | programming | View on Reddit | 161 comments
pdpi@reddit
For anybody unaware, it’s a reference to these articles by The Onion
Kissaki0@reddit
Iggyhopper@reddit
It's a mass shooting every 40 days, statistically.
StickiStickman@reddit
Where did you get that from?
In the US there is more than one every single day, statistically. In 2025 there were 425 mass shootings (shootings with 4 or more victims).
Jonathan_the_Nerd@reddit
It depends very heavily on your definition of "mass shooting". From Wikipedia,
"Under the substantially narrower 2022 National Institute of Justice/The Violence Project dataset definition, there were 167 mass shootings (4 or more killed with firearms in public, not connected to "underlying criminal activity or commonplace circumstance") in the U.S. from 1966 to 2019, and 30.8% of the shootings occurred at the workplace."
Source: https://nij.ojp.gov/topics/articles/public-mass-shootings-database-amasses-details-half-century-us-mass-shootings
According to the FBI, there were 333 active shooter incidents in the US from 2000 to 2019.
"The FBI defines an “active shooter” as one or more individuals actively engaged in killing or attempting to kill people in a populated area. Implicit in this definition is the shooter’s use of one or more firearms. The “active” aspect of the definition inherently implies the ongoing nature of the incidents, and thus the potential for the response to affect the outcome."
Source: https://www.fbi.gov/file-repository/reports-and-publications/active-shooter-incidents-20-year-review-2000-2019-060121.pdf/view
BadMoonRosin@reddit
It's kinda like how some sources define "binge drinking" as real alcoholic blackout level consumption. While other sources define it as, well... pretty much a normal day. 🤷♂️
alrightcommadude@reddit
Okay tell me the definition of a mass shooting without looking it up.
pants6000@reddit
And not-mass-shootings about every 5 minutes.
sweetnsourgrapes@reddit
And even fewer mass shootings if you count in seconds. True genius.
siromega37@reddit
… someone gets shot in the US every 5 minutes. That's the not-mass-shooting statistic they're referencing.
lizardhistorian@reddit
Before Democrats passed the Gun Free Zone law nearly every high-school had an after-school rifle club and there were no mass-shootings.
Bullies bully the correct people about 50% of the time. Then these kids viscerally learn there are repercussions for acting oddly and against the grain.
The combination of making our schools defenseless and enabling unchecked depravity maximizes pathological behavior.
Note that this was done because "it would have been racist" to only ban guns in Chicago schools which was the only place having issues with shootings when this all started and that was mostly accidental shootings because kids were bringing loose guns into class.
Democrat policies have successfully turned this firearm training issue into a psychotic murder machine.
Lastly, Obama had the CDC study gun-violence in the nation and their results were firearm ownership in America prevents about 300,000 violent crimes per year, a fraction of which would have resulted in a homicide, which means they prevent more deaths than they cause, when you subtract out suicides.
It is not possible to be a Democrat and a good person.
YaBoyMax@reddit
Only a Sith deals in absolutes.
garnet420@reddit
The correct person to bully is you.
prophet001@reddit
Over-under on how many bloated vibe-coded repos this guy's sitting on?
StickiStickman@reddit
Weird how the US is the only place on the planet where there's a mass shooting every single day.
Or how the US has more school shootings in a year than the rest of the world combined has in an entire decade.
Tired8281@reddit
Who are the correct people to be bullying?
siromega37@reddit
lol, just pulling shit out of your ass here to try and make the age-old, tired gun-lobby arguments. The CDC Wants To Study The True Toll Of Guns In America
https://www.npr.org/2021/09/29/1039907305/cdc-study-toll-guns-america
The CDC can't study mass shootings, or really any cause and effect for gun violence, because it's illegal. The gun-free-school laws came about at the state and local levels first, because of Columbine. I was in HS in rural NC when that happened; they banned guns and trench coats the week after. The school resource cop ended up arresting kids who refused to take down the guns mounted in the back windows of their trucks. Go peddle your bullshit somewhere else.
beep_potato@reddit
There's a reason the Wikipedia article has two sections, "United States" and "Other Countries".
GimmeSomeSugar@reddit
Haha! Ha. Ha...
I made myself sad.
Nimelrian@reddit
Also, another npm-specific version from 2019: https://itnext.io/no-way-to-prevent-this-says-only-development-community-where-this-regularly-happens-8ef59e6836de
RLutz@reddit
I feel personally attacked
j4bbi@reddit
While humorous, this does happen in many other package managers.
Squalphin@reddit
Rust is at risk as well. If you have developed any application in Rust using even one or two third-party libraries, check what dependencies it has pulled in; it can get insane.
lelanthran@reddit (OP)
It's a larger risk for Rust, actually. Compromise NPM and you get access to the build, the server process, etc.
Compromise Cargo and you get everything above, plus a large number of scripts that run with elevated privileges.
HighRelevancy@reddit
Huh? Why are you running cargo with elevated privileges?
lelanthran@reddit (OP)
Where did I say I do that?
Are you under the impression that Rust command-line tools aren't executed as root or similar sometimes?
Compromised Cargo == compromised rust programs.
HighRelevancy@reddit
Um. Yeah. Yeah I'm absolutely under that impression. I've never said "sudo" once while working on a Rust project. Cargo never needs to touch anything outside your project directory.
If you accidentally did a sudo once you might have files owned by root in your project, which you now can't update without sudoing again. That's a user error, not how Cargo works.
lelanthran@reddit (OP)
What does that have to do with your output being compromised?
Maybe you're not new to Rust (I can't tell), but if you are, the problem is that the built artifact (the executable) can be tainted by a compromised dependency, and that artifact is then distributed to a different runtime environment.
For example, compromised cargo == compromised sudo-rs; that executable, when run on a different system, can now compromise that system.
Your proposed threat model is not a threat model.
HighRelevancy@reddit
OOOHHHH, you mean tools written in rust.
Yeah alright. But are you under the impression that things written in Nodejs/built with NPM packages aren't also sometimes run with root privileges? I don't get what you think is different about rust here.
lelanthran@reddit (OP)
It's a matter of frequency: I've not come across many systems (can't remember any off the top of my head) that have JS tools running from a root shell script, but almost every system I've seen has one or two Rust tools installed that eventually get called from a root script.
It's not Rust specifically that produces the large attack surface; any systems programming language has it. It's just that the more common ones (C, for example) don't have a dependency manager to target for supply-chain attacks.
Go is susceptible to this as well, it's just that the transitive network on any Go project is tiny compared to Rust, which itself is tiny compared to JS.
Maybe Python can be used for systems programs? You don't need to install a package manager just to run a Python script, while you do need one for a Node.js script.
HighRelevancy@reddit
This says quite a lot about how much you aren't inspecting the inner workings of your systems. Every significant Linux distro is built on Python. If you've missed that, what makes you think you've got a handle on Node?
That said, Node probably is less used for it, though lots of ad-hoc automations are written in it.
lelanthran@reddit (OP)
What makes you think I missed that? "used for systems programs" is not the same as "used for Linux systems".
I mean, you're not even considering that systems programs are targeted to multiple systems (like Windows).
HighRelevancy@reddit
I didn't say it was just used on Linux, I said distros were built on it. You were talking about what languages are used for things that run as root, Python's gotta be top five (after C, C++, and maybe shell scripts).
3inthecorner@reddit
sudo-rs would be built from sources installed by the distro rather than pulling from crates.io (at least in Fedora)
Dumlefudge@reddit
Maybe I'm missing something, but is the issue of a compromised executable not also applicable to npm (or any other package manager)?
Ofc, a compromise in sudo-rs is right up at the top of the "Am I fucked" scale, but if you build and distribute any application with a compromised dependency (be it direct or indirect), do you not have the same type of problem?
BayLeaf-@reddit
I think their point is basically that node projects are generally just run in place on systems after pulling down dependencies, vs Rust possibly being built in one place, transferred, and executed somewhere else? I've definitely worked on node apps that are distributed the same way, but I can kinda see where they're coming from, I guess?
chucker23n@reddit
But that’s not really true. See Electron apps. Slack and Teams are basically a node project run on hundreds of millions of devices.
And then you get to VS Code.
I think they have a point insofar as: Rust is trusted more as a language for tools that frequently run as root. But these days, “run without a sandbox” is arguably an almost equally large threat as “run as root”. You get access to all kinds of private data, not to mention networking.
lelanthran@reddit (OP)
Generally not in practice: all I've ever seen for JS programs is "pull this repo, then run npm install". JS programs are typically not distributed in their final form; the user requires NPM on their local machine in order to run the target application.
In theory, sure, you could perform a full build and redistribute only the result of the build, but I don't see that happening in practice.
Dumlefudge@reddit
Thanks for the clarification. I thought that might have been where you were going with it, but then I started thinking "What about something like Electron apps?" (which the other poster mentioned) and wasn't so sure anymore
ElusiveGuy@reddit
You might be missing the giant chunk of client-side applications. Obviously anything running on web (so, most websites/webapps) is built and bundled. But then you also need to consider all the Electron-based apps - Discord, MSTeams, VSCode, ...
Granted these don't usually run at elevated privilege levels, so your point of Rust builds being compromised being worse is still valid.
apadin1@reddit
Does npm also have tools where you can audit the dependencies for suspicious and deprecated packages, like cargo does? Genuinely curious
LufyCZ@reddit
Yes, it's built into at least some package managers.
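For reference, on the npm side the built-in command is npm audit; a hedged sketch of wiring it into a project's scripts (the script name is made up):

```json
{
  "scripts": {
    "check:deps": "npm audit --audit-level=high && npm outdated"
  }
}
```

npm outdated exits non-zero when anything is stale, so chaining it like this also turns freshness into a CI gate.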
afl_ext@reddit
Try telling this in the Rust subreddit; I tried.
Anonymous0435643242@reddit
It's a known and acknowledged issue but crates.io isn't backed by big tech companies while npm is owned by Microsoft
Kuinox@reddit
It isn't a backing issue.
This is a dev-community problem; it's not Microsoft that makes popular packages on npm with hundreds of deps.
syklemil@reddit
Yeah, and they are looking at what's happening in other ecosystems to learn from it. Like there's an RFC for cooldowns (see also), and there are various other ongoing projects for supply-chain security.
And as for cost, yeah. There's the Open Infrastructure is Not Free: A Joint Statement on Sustainable Stewardship open letter.
lightmatter501@reddit
If you look at the number of people/groups you're trusting, that number goes back down. Just because it's multiple lines in your dependency list doesn't automatically make it worse than Boost or similar "mega libraries" in the C++ world.
tortridge@reddit
Yes, PyPI was a big target a few years back. Rust can be a big target too, with proc macros not running in sandboxes. Debian was a target of the xz hack.
Everything is at risk. Part of that is that CI runners are built for automation more than artifact production (as opposed to something like the Nix daemon, for example).
neuronexmachina@reddit
Heck, there was one just a few days ago: https://safedep.io/mass-npm-supply-chain-attack-tanstack-mistral/
hu6Bi5To@reddit
Sandboxing macros will solve nothing. Any Rust crate can execute code at compile time regardless of whether they contain macros or not (this is also true of most other programming languages). And unless you plan on never running your code on anything that contains any sensitive data, you'll still remain vulnerable at runtime.
Sandboxing only works as a place to verify your dependencies in isolation, to see if they're trustworthy. After that you're committed.
hgwxx7_@reddit
build.rs is a code smell. It used to be used a lot, but it has fallen off. There was a deliberate effort to wean the ecosystem off build.rs by providing that functionality in macros and such.
Procedural macros aren't a code smell and are the recommended way to implement a lot of functionality.
So yes, in theory any Rust crate can run code at compile time. But you can probably come close to a build.rs free dependency graph, whereas there are always going to be procedural macros. Those really need to be executed in a sandbox.
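For readers outside the Rust world, the build.rs under discussion is just an ordinary Rust program that cargo compiles and runs, with the building user's full privileges, before the crate itself builds. A minimal illustrative sketch (the snooping here is simulated, not real malware):

```rust
// build.rs runs on every `cargo build` of a crate that ships it.
use std::env;

// Stand-in for what a malicious build script could do silently:
// real malware would read ~/.ssh, cargo/npm tokens, CI secrets, etc.
fn simulated_snoop() -> usize {
    env::vars().count() // the whole environment is visible here
}

fn main() {
    let visible = simulated_snoop();
    // Emit a legitimate-looking cargo directive alongside the snooping:
    println!("cargo:rerun-if-changed=build.rs");
    println!("build script could see {} env vars", visible);
}
```

Nothing in the build output distinguishes this from an innocent build script, which is the thread's point about compile-time execution.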
hu6Bi5To@reddit
I don't know if it can ever be got rid of entirely, e.g. crates that wrap C libraries or other dependencies outside of "pure" Rust will generally need some ability to execute arbitrary code in one way or another (the Makefile of the C code will have that ability even if the Rust build system didn't allow it directly).
This same loophole exists in Ruby, Python, etc. for the vast number of their dependencies that are wrappers around C code.
I don't think it's automatically bad that this happens either, I just think it's a false hope that we can make compiling software safe by sandboxing. We need to make sure that everything upstream is trusted before we build it, insofar as that's possible.
tortridge@reddit
You are right, I forgot the build.rs part lol.
That said, compile-time execution is still funky as hell, because it leaves zero trace in the final binary.
That said, if your dependency is malware at runtime, you're fucked no matter what.
CondiMesmer@reddit
No it can definitely affect (and does) other packages as well.
ObservantNickle@reddit
As a dev, what actions can we take to help reduce our risk of supply chain attacks?
Altruistic-Spend-896@reddit
Not use npm, migrate to solidjs, build everything by hand.
nemec@reddit
Delay package updates by 7 days from release. Until this solution receives widespread adoption, somebody else can be the canary in the coal mine. And if it does (unlikely), you're no worse off than before.
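pnpm has since shipped a setting for roughly this; a sketch assuming a recent pnpm with the minimumReleaseAge setting (value in minutes), so verify the field name against your version:

```yaml
# pnpm-workspace.yaml: refuse to resolve versions published
# less than 7 days (10080 minutes) ago
minimumReleaseAge: 10080
```

npm itself has a `before` config, but it takes an absolute date rather than a rolling window, so it needs periodic refreshing.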
cdb_11@reddit
A dependency is always a risk, so reduce the number of dependencies. Assess the risk of relying on a dependency vs reimplementing or vendoring it, and just make a decision. For example, depending on a cryptographic library is most likely worth the risk vs your own broken implementation, when you don't know what you're doing. But a library like axios? The browser already does almost everything for you out of the box, and there isn't much opportunity for you to screw up anything important. You can write it yourself, assuming you even need all that extra code in the first place. What probably doesn't help is that even otherwise valuable libraries might not always care about this issue.
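To make the axios point concrete, here's a hedged sketch of the typical axios use case using only the built-in fetch API (global in modern browsers and Node 18+); getJson is an invented name:

```javascript
// Minimal JSON GET with a timeout and an error on non-2xx, no dependencies.
async function getJson(url, { timeoutMs = 5000 } = {}) {
  const res = await fetch(url, { signal: AbortSignal.timeout(timeoutMs) });
  if (!res.ok) throw new Error(`HTTP ${res.status} for ${url}`);
  return res.json();
}
```

Interceptors, retries, etc. are a few more lines each, if you even need them.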
Loves_Poetry@reddit
For JS specifically, you can use PNPM instead of NPM. It has several features to stop supply-chain attacks
QliXeD@reddit
All of the practices of reproducible builds help a lot:
https://reproducible-builds.org/docs/which-problems-do-reproducible-builds-solve/
FlipperoniPepperoni@reddit
It's easy to hide behind sarcasm. It's a lot more difficult to put forward ideas that don't reduce the openness of the npm ecosystem.
Lachee@reddit
Don't run scripts by default, like pnpm... Huh, that was easy.
FlipperoniPepperoni@reddit
You say that like people audit post install scripts. They don't.
Lachee@reddit
Which is the main issue. pnpm forces you to acknowledge that packages are trying to run scripts.
No, it's not a fix, but it's a preventative measure.
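Concretely, recent pnpm versions refuse to run dependency lifecycle scripts unless you allowlist them in package.json (esbuild here is just an example of a package with a legitimate build step):

```json
{
  "pnpm": {
    "onlyBuiltDependencies": ["esbuild"]
  }
}
```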
FlipperoniPepperoni@reddit
I feel this is kind of a logical fallacy, no?
If it's a package you trust that has been infected by a dependency attack -> you get pwned.
If it's a package that's new to you that you're just trying out, you're probably still just going to run the post-install script, because 'surely if it's malicious, someone else has worked it out' -> you get pwned.
Get what I mean?
DanLynch@reddit
I'm not a JavaScript developer, so I guess I don't really understand the nuance, but why do your dependencies need to run any scripts at all? In the ecosystem I program in (Java), a newly added dependency is just a special kind of ZIP file that contains some code, plus an XML file that describes the transitive dependencies, if any. It could be malicious, and I need to be cautious before incorporating it into my product and shipping it, but I don't need to run any "post install script" for it on my development workstation.
FlipperoniPepperoni@reddit
A good example is popular database ORM Prisma. You describe your schema in a special file, then run a command to generate a bespoke "Prisma client" for your database. This means you get E2E type safety, i.e. Prisma.(Table).findMany() to do a SELECT query, etc.
Prisma's typical post-install script automatically generates the Prisma client for the codebase you're working in, so you don't need to explicitly run the command separately from "npm install".
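Mechanism-wise, that's all a lifecycle script is: any package can declare one in its package.json, and npm runs it automatically on install (names below are invented):

```json
{
  "name": "some-dependency",
  "version": "1.0.0",
  "scripts": {
    "postinstall": "node ./scripts/generate-client.js"
  }
}
```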
edzorg@reddit
...and this is ridiculous and it should just not do this. Flyway and Liquibase hook into the existing build tools, e.g. during npm run build, and define their own commands like npm run flyway:generate or whatever you need.
CherryLongjump1989@reddit
There are always DX problems with systems that are heavily reliant on code generation. It's not atypical to get the code generator to run in both the install and build steps.
_NullScope@reddit
Prisma no longer does this in v7; you have to run the command manually, which I guess is because of the types of issues cropping up.
It still wouldn’t help you if you have to run the command manually and the package is compromised though
CherryLongjump1989@reddit
Short answer is you have the same kind of issue but just didn't know it. Java more or less just pushes the attack surface into the runtime. The Log4Shell vulnerability in Log4J was pretty much the same kind of thing that happens to NPM during development time, but now it was happening on your production servers.
The long answer -- where things are not the same -- is native code. Java's virtual machine abstracts the whole machine, so you don't actually need quite as much native code. Also because JNI is terrible enough to "encourage" everyone to implement everything in Java -- and to prefer certain architectures such as client/server designs instead of embedded designs when interacting with systems that use native code.
JavaScript's VM, on the other hand, is far more restrictive and heavily sandboxed -- designed to run safely in a browser, which effectively cuts it off from the OS. That means that for everything else you might ever want to do on a Node.js server, you're calling into native code via FFI. And Node's FFI is quite nice and user-friendly. So the traditional reason you have post install scripts in NPM is because you often have to compile C/C++ code and set up the bindings for the javascript code to be able to call in to native libraries.
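This is the conventional shape of such a native-addon package: an install script that invokes node-gyp to compile the C/C++ at install time (the package name is invented):

```json
{
  "name": "some-native-binding",
  "version": "1.0.0",
  "scripts": {
    "install": "node-gyp rebuild"
  },
  "gypfile": true
}
```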
equeim@reddit
Java world has the same thing with annotation processors, Gradle/Maven plugins, etc. All that runs arbitrary code on your developer machine.
Glasgesicht@reddit
If a package I've been using for years suddenly runs a post-install script, I'd be hella suspicious.
hennell@reddit
If it's a package that's new to me, I'd either not run the script or go look what it does, check why it needs it.
But the real security is in the package you used for years that has never needed a script but is compromised and one is added. npm just runs it; pnpm would error until you confirm you give it permission.
Magneon@reddit
Many package managers run scripts though. Debs, for example, do, and the scripts are often modified by distro maintainers on top of being written by the package's developers. The reason they're more secure (or at least seem that way) is that apt repos are much more centralized than npm, but the underlying mechanism is the same concept.
stumblinbear@reddit
Dog, I didn't even know post-install scripts were even a thing until the other person's comment. At least pnpm tells you it's going to happen
Nullberri@reddit
I don't need to audit them per se. If a hack isn't ongoing, I don't get asked to OK a post-install; if one suddenly wants a post-install, I'm going to be a lot more skeptical and double-check whether a hack is ongoing.
danielcw189@reddit
Why does npm even need those?
Somepotato@reddit
Good job, you answered a single attack vector that was already getting deprioritized by attackers.
lelanthran@reddit (OP)
The question is, do you actually want it so open that malware can fall into it so easily[1]?
What's wrong with "you can only publish using keys"? That cuts out more than half the exploits.
[1] I'm paraphrasing "Have an open mind, but not so open that your brain falls out!"
FlipperoniPepperoni@reddit
I'm not having a go at you, but the article :)
You're right, but what does a "more closed" version of npm realistically look like?
ID checks? It'll be abandoned en masse.
Per package auditing? npm isn't Apple, they're not doing that.
It's a great opportunity for a company like OpenAI or Anthropic to earn some brownie points with the OSS community and partner with npm to audit submissions. Beyond that, fuck knows.
TonyPace@reddit
Things that were once Apple resources level are now available at reasonable cost through an API.
lelanthran@reddit (OP)
Doesn't have to be ID checks, but tying packages to keys means that when a bad actor is identified you can mass-revoke their packages just by key alone, and even if the repo owner doesn't do it, you can set your project up to mass-revoke bad actors.
It won't solve everything, but it's a good start.
FlipperoniPepperoni@reddit
Ok, but how do you tie a key back to a common actor without compromising privacy?
lelanthran@reddit (OP)
Maintaining pseudo-anonymity doesn't compromise someone's privacy. It's not like the only options are "everyone's ID must be validated against their state-issued ID documents" and "every package has a different anonymous account tied to it".
It's enough to know that "FlipperoniPepperoni" uploaded their public key and signs all their submissions. We don't need to know that "FlipperoniPepperoni" is actually Matt Damon.
FlipperoniPepperoni@reddit
How is that any different from a GitHub account?
lelanthran@reddit (OP)
I don't understand the question: what does having an account on a specific platform (other than the repo one you want to submit to) have to do with it?
You upload your public key to the repo, and they decide whether or not to trust you. If they do trust you, then you can publish packages signed with the private key.
I see no reason that you need any account on github, etc. You may not even need an email address if the repo owner trusts you. In practice, you would need an email address at least to communicate with the repo.
No violation of your actual privacy, no doxing of your identity, etc.
FlipperoniPepperoni@reddit
How is a public key identifier any greater a trust indicator than a GitHub account username?
lelanthran@reddit (OP)
Because you can mass-revoke based on key, regardless of whether the account is on github, or on gitlab, or on sourceforge, etc.
If a key is compromised, you mass-revoke that key so people who already have the package on their system will have a build failure.
Can't really do that with revoking a bad github account.
BTW: Are you really Matt Damon? I feel like I have a 1 in 7b chance of being correct on that :-)
FlipperoniPepperoni@reddit
If it's all going through npm, the difference between revoking a key and blocking a (git repository) account is academic, no?
Sorry if I'm being daft, I just don't really understand how it's any different. There's no offloaded trust layer in your suggestion as far as I can tell.
lelanthran@reddit (OP)
I don't think so; revocation lists are a thing for keys, but not for accounts.
If the account is compromised (and each package is stored with metainfo holding the account name), the repo owner has to be informed to block the account, and the contributor has to create a new account.
If a key is compromised, the contributor themselves can revoke the key and generate a new one. If the contributors themselves are compromised, then the repo owner will have to revoke the key.
But there are other advantages to using keys: trust can be extended via chained signatures; the contributor's local machine has to be compromised (not some server account that can be phished); keys can expire, and short expiry times mean that even in the event of a compromise the attacker has only a limited timeframe to do damage; refresh tokens can be generated so there's a smaller chance of the root key getting compromised (it's only ever used to generate submission tokens); and replay attacks can't be performed, because the token used to generate the signature can be different each time (short-lived token); etc.
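A toy sketch of the mass-revocation idea (all names and fingerprints invented): every artifact carries its signer's key fingerprint, so one revocation invalidates everything that key ever signed, regardless of which account or platform published it.

```javascript
// Each published artifact records the fingerprint of its signing key.
const packages = [
  { name: "left-pad", version: "1.3.0", signer: "ab:cd:ef" },
  { name: "is-odd",   version: "3.0.1", signer: "ab:cd:ef" },
  { name: "chalk",    version: "5.3.0", signer: "12:34:56" },
];
const revoked = new Set();

// Revoking a key invalidates every artifact it signed, in one operation.
function revokeKey(fingerprint) {
  revoked.add(fingerprint);
}

function trusted(pkg) {
  return !revoked.has(pkg.signer);
}

revokeKey("ab:cd:ef"); // signer found to be compromised
const survivors = packages.filter(trusted).map((p) => p.name);
console.log(survivors); // ["chalk"]
```

With account-based blocking, by contrast, each registry has to be told separately, and nothing ties the same actor's packages together across registries.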
If I were designing a package repo today, I'd simply go ahead and use X.509 certs. Contributors who cannot afford $10/year to maintain one are already a supply-chain risk, as their account can be bought off them.
chucker23n@reddit
Well, you can
None of these options are great. All of them are an extra hurdle compared to “anyone can publish”.
lelanthran@reddit (OP)
Yes, that's the point: to remove "anyone can publish" and replace it with "these people are now found to be untrustworthy, so we are revoking all their packages".
In the end, it doesn't really matter what mechanism is in place; the policy is the problem: untrusted and unverified actors are running their code on your machine and in your production environment.
We want to run only the non-malicious code, but we can't tell malicious code from non-malicious.
The solution of "this code can be trusted and that cannot" is not working; the better solution, "this actor can be trusted and that cannot", will work, but requires things like scoped namespaces, keys, verification, etc.
msx@reddit
Wtf? There's no need to propose solutions, they're well known. Take a look at any other sane package manager and just do the freaking same thing
Bwob@reddit
I dunno. I think it's still worthwhile to call attention to problems, even if you don't have a solution.
FlipperoniPepperoni@reddit
There's a world of difference between 'hey guys npm has an implicit trust issue' and 'hahaha these idiots are getting everyone pwned, why don't they just implement the obvious solution'.
syklemil@reddit
Kinda funny given that Go and Rust have different attitudes towards stdlib size, and Rust is on the rather small side, leading to plenty of crate imports.
Go practically runs off github links. I can't name any supply chain attacks on Go off the top of my head, but github actions, which uses the same … "package repository", certainly has. So fun pinning SHAs because the human-readable version strings are all mutable.
And in the big-stdlib camp there's Python, whose PyPI also was impacted by one of the recent attacks, even if most of the attention went to NPM.
NPM also serves what's the most widely used language these days. No matter our feelings on JS and node and whatever, being the most common language also means it's the biggest target.
equeim@reddit
Pinning SHAs is also dangerous, because if you get a pull request updating that SHA you have no idea whether it belongs to the original repository or a malicious fork. (Yes, you can fork a GitHub repository, push new commits to your fork, and then construct a URL that combines the original repository URL with a commit SHA from your fork, and that will work: your fork's commit will be downloaded instead of the original repo's.)
sopunny@reddit
Pinning a dependency version would imply also checking any PR that changes the pin. SHAs are hashes so you can't change the content without changing the SHA
equeim@reddit
You can't change commits, but you can change the hash to another commit that belongs to a fork (as part of an "update dependencies" PR) without changing the URL of the repository, and it will work, because GitHub allows this. See https://www.vaines.org/posts/2026-03-24-the-comforting-lie-of-sha-pinning/
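In GitHub Actions terms (the SHA below is illustrative, not a real commit), the mitigation is to pin the full SHA and additionally verify it is reachable from the upstream repo, since GitHub serves commits from the whole fork network:

```yaml
steps:
  # Pin the full 40-char commit SHA, not a mutable tag.
  # Before accepting a "bump" PR, check the SHA is actually in upstream:
  #   git merge-base --is-ancestor <sha> origin/main
  - uses: actions/checkout@0123456789abcdef0123456789abcdef01234567 # v4 (illustrative)
```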
yawara25@reddit
Unless they force a collision, at which point it then becomes a matter of what your particular threat model is (and what resources your adversary has).
lizardhistorian@reddit
If you did that you would end up with a chunk of dead text whose only purpose was to craft the collision, and a cursory AI review would flag it as suspicious.
nemec@reddit
Why doesn't NPM offer protection from maintainers getting kidnapped and forced to upload a malicious package at gunpoint? /s
_predator_@reddit
Go has the "advantage" of modules being cached by its module proxy when they're being requested for the first time. That kinda achieves pinning without you knowing it (outside of inspecting your go.sum file). It's effectively "trust on first use". It also means even if projects move off of GitHub, their historic code remains available in the module proxy via its old coordinates.
Just gotta pray the module proxy will stay around, and Google doesn't eventually get tired of running it for free.
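That trust-on-first-use ends up recorded in go.sum: one hash for the module tree and one for its go.mod per version (the hashes below are illustrative):

```text
golang.org/x/text v0.14.0 h1:ScX5w1eTa3QqT8oi6+ziP7dTV1S2+ALU0bI+0zXKWiQ=
golang.org/x/text v0.14.0/go.mod h1:18ZOQIKpY8NJVqYksKHtTdi31H5itFRjB5/qKTNYzSU=
```

Once a line is in go.sum, a changed upstream artifact fails verification, which is the "pinning without you knowing it" described above.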
TomWithTime@reddit
I didn't know Google ran their own. My business recently created its own module proxy, probably because of the increased supply-chain attacks going around.
Houndie@reddit
Enterprise module proxies are also helpful for private repositories.
_predator_@reddit
Google still owns large chunks, if not all, of the infrastructure of the Go project. Go is not under the umbrella of a foundation, it's just Google.
That aside, every business should run its own module proxy or repository mirror. Depending on the goodwill of megacorps is just as much of a supply chain risk as downloading malicious packages.
KrazyKirby99999@reddit
That "advantage" has also caused a number of security issues where the proxy caches a malicious version, then the repository is switched for innocent code.
_predator_@reddit
Yep. IIRC it's called out as not optimal yet better than what they had before in the initial proposal.
I'd be curious if the Go team has since re-evaluated their original analysis and whether they still stand behind it. Centralized repositories have a lot of problems but being able to revoke malicious packages feels like a basic requirement these days.
headinthesky@reddit
We check our vendor directory in.
Maybe-monad@reddit
As long as they need it.
_predator_@reddit
Maybe they'll try to dump it on the CNCF or the OpenSSF, like they're apparently trying to do with the OSV database.
CherryLongjump1989@reddit
Go has the advantage of "works on my machine". People actually say this with a straight face...
Seref15@reddit
Python is big-stdlib, but they've been resistant to expanding the stdlib for a long time now, citing operational overhead on the Python team. For example, in the intervening years YAML has become almost a de facto standard configuration language in the industry, and yet YAML isn't in the stdlib, etc.
Captain-Barracuda@reddit
That's part of why I wish that every popular language had a rich standard library like Java's. It really cuts down on the number of imports.
RiPont@reddit
I think it comes down to "micro dependency" philosophy.
The richer the stdlib features, the more dependencies collapse down into the stdlib at the core.
The poster child for this is the "leftpad" debacle. Why does a 3rd party package for leftpad even exist? Because the capabilities in the stdlib or stdlibex don't suffice for too many people.
le_bravery@reddit
I have a good idea. When someone types a few letters into their terminal or auto-upgrades a package in a large project, we can run arbitrary code on their system! That's clearly the best way to distribute dependencies!
According_Taste_1598@reddit
Selling my account dm me
crusoe@reddit
Worked at a startup that was preventing this stuff several years ago. Worked with NPM closely. Even offered free tools and paid tools such as private verified repos.
No one wanted to buy it so it went under. But we stopped several supply chain attacks in npm.
Oh well.
n3phtys@reddit
Anyone who uses both NPM and Maven in their stack knows that there is a big difference.
You either go with container isolation like wasm wants, you create a cultural change in your ecosystem so that custom auto-install scripts on package download stop being considered a 'feature', or you get hacked enough that governments will step in at some point.
The tooling itself is good, yes. Much like France, the problem is not NPM as a platform, it's the people in it.
captain_obvious_here@reddit
This came from absolutely nowhere.
TheSurprisingFire@reddit
Probably safe to assume it came from across the channel.
captain_obvious_here@reddit
Secretly he loves us.
gopher_space@reddit
France is everyone's cool older sister who smokes cigarettes.
cesarbiods@reddit
Nice jab there at the end 😂
itgforlife@reddit
/r/france catching strays
r/FUCKYOUINPARTICULAR/
Worth_Trust_3825@reddit
The problem absolutely is NPM. After the first supply chain attack, they needed to stop supporting version ranges. Full stop. Drag the users screeching into a safe environment whether they like it or not. Same with 2FA - stop pretending that it doesn't help.
edzorg@reddit
Absolute violation
funkie@reddit
such an old, tired, stupid, obsolete trope
Connect_Ad3557@reddit
You mean old, tired, stupid, obsolete French?
ScriptingInJava@reddit
Always nice to meet a fellow brit on /r/programming
MRxShoody123@reddit
why tf do we take a stray bullet 😔
hu6Bi5To@reddit
This is satire, I get it. But the headline is dangerously complacent; this same thing can easily happen in practically all package managers in use today.
NPM is just the most common target because it's the most widely used.
2bdb2@reddit
It can happen in all package managers, sure. But NPM is particularly vulnerable due to a mix of technical and cultural reasons. And borderline criminal negligence.
In the Maven ecosystem. At a technical level:
Packages cannot run scripts on install or import.
Packages must be signed to be published, adding a second factor that needs to be compromised as well.
Packages are typically set to a specific version, not a range. Thus, updating to a new version is always an explicit choice.
Packages published to Maven Central by default need a user to login and physically click publish before it goes live.
None of these make it impossible, but greatly reduce the surface area.
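(To make the exact-version point concrete — coordinates below are just an example — a Maven dependency names one specific version; range syntax technically exists but is essentially never used in practice:)

```xml
<dependency>
  <groupId>com.fasterxml.jackson.core</groupId>
  <artifactId>jackson-databind</artifactId>
  <!-- one exact version; nothing changes until a human edits this line -->
  <version>2.17.1</version>
</dependency>
```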
At a cultural level
The Maven ecosystem favours using a small number of big libraries from trusted vendors, not lots of small libraries from random individuals.
A typical Java project is almost entirely made of libraries published by actual organisations with controlled release processes, not random individuals.
Common release processes include things like
Signing keys being owned by a team separate from engineering, and no single machine is allowed to have both a publishing and signing key on it.
Releases being built twice - on a CI/CD server and a separate trusted workstation, checking that the hashes are identical.
Final approval gate requiring a pedantic greybeard to login to Maven Central and hit the publish button manually.
Nothing about Maven forces this, that's just cultural in the Java ecosystem.
NPM is the most common target because they leave the barn door wide fucking open, with no attempt to implement even the most rudimentary fixes that have proven successful at greatly mitigating the threat in other package managers.
Just removing postinstall scripts alone would dramatically reduce the surface area, as would enforcing package signing.
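(For what it's worth, individual users don't have to wait for npm to enforce either of these; both mitigations are real npm config flags you can opt into today via an .npmrc:)

```
# .npmrc
ignore-scripts=true   # never run preinstall/postinstall lifecycle scripts
                      # (note: also skips scripts your own project defines)
save-exact=true       # `npm install foo` records "1.2.3", not "^1.2.3"
```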
madisp@reddit
There's not that much special about the Java ecosystem, compared to npm. Mostly it boils down to npm having lots more users so it's a juicier target. Central isn't perfect either, there's no publishing flow with OIDC auth and I'm not sure whether they finally have 2FA support for logging in.
"Packages cannot run scripts on install or import."
Unless they include an annotation processor, which javac invokes at compile time if you're running an older JDK.
"Packages must be signed to be published."
Central requires signing, yes, but afaik they don't actually verify the signatures in any way; you could sign every release with a new key and only the few users who actually validate signatures would fail to get the package.
"Packages are typically set to a specific version, not a range."
Same applies to npm; typically you have a package-lock.json checked in to the repository.
"Packages published to Maven Central by default need a user to login and physically click publish."
They need it by default, but the publish button can also be "pressed" with an API call, using the same token you used for uploading.
RiPont@reddit
...and don't forget the cultural issue of having to use 3rd party dependencies for basic things that should be in the standard library.
gellis12@reddit
The headline is a reference to the onion article that gets reposted every time there's a mass shooting in America
Ouaouaron@reddit
Making a reference to a famous parody article doesn't make your own parody less stupid.
vplatt@reddit
#nottheonion
riv3rtrip@reddit
Nerd sniping ass post.
Cargo has good and safe defaults, which are features you need to proactively enable in package managers like pip and npm. If npm had safe and sane defaults, then sure, 95%+ of the impact of these attacks would go away.
But there is nothing fundamental about the Rust ecosystem or Go ecosystem that fully prevents supply chain attacks. "cryptographic verification" is just checksums but doesn't prevent you from getting pwned on updating a dependency. Also, "drastically reduce reliance on third-party code" is hilarious to say in the same breath as Rust.
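(To be concrete about what that checksum verification is — crate name and hash below are illustrative — Cargo.lock pins every dependency, direct and transitive, to an exact version plus a hash of the published crate file. That protects an existing version against tampering, but says nothing about whether the next version you update to is malicious:)

```toml
# Cargo.lock entry (illustrative)
[[package]]
name = "some-crate"
version = "1.2.3"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "…illustrative — real entries hold a hash of the .crate file…"
```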
FastHotEmu@reddit
This is fucking amazing. I haven't laughed this hard in a long while.
Competitive-Aspect46@reddit
Careful now. You clearly hurt a few js folks' feelings.
magnetronpoffertje@reddit
Another C# win
chucker23n@reddit
There's nothing magical that C# (NuGet) figured out that JS/TS (npm) hasn't. There's just different context, including:
magnetronpoffertje@reddit
So I'm right. C# wins. That's a significant difference.
Vectorial1024@reddit
At this point, other than frontend web dev, there is just no reason to use NodeJS. Even frontend web dev may be done with WASM (C#). Just let the NPM/NodeJS ecosystem rot away into irrelevance.
wasdninja@reddit
WASM code can't touch the DOM so that's an instant showstopper.
SirFireball@reddit
I like typescript. It's a nice enough language to work in for me. If I have to make a script, and it's a bit too complex to do in Bash in a single line, I'll probably default to typescript to write it up in. What would you recommend as a replacement for nodejs here?
lelanthran@reddit (OP)
More than a single line of bash? That's a tough bar!
In any case, if it is that simple, the system's default Python with no deps is probably sufficient.
Not sure you can even use NodeJS without npm.
Vectorial1024@reddit
You are correct, but I am more focused on the NPM ecosystem of packages.
ABotheredMind@reddit
Go, Go, Go, every single time
lelanthran@reddit (OP)
And Bun is doing their best to accelerate that process.
afl_ext@reddit
Bun is now very much vibe coded; it's going to go downhill in quality, or already is.
franklindstallone@reddit
The fair take would be that JS is the only language available on the backend and front end so it’s the best one to attack.
Yet another reason the monoculture of front end languages is bad.
CrankBot@reddit
Unfortunately I knew this would be about NPM before I opened it..
yksvaan@reddit
Well, it's mainly cultural; JS devs basically don't seem to care about anything, so this is the result. They just npm i whatever instead of writing 10 lines of code, and the worst thing is 99.999% don't even consider indirect dependencies.
Every library should aim for 0 dependencies, naturally it's not always viable but then handpick and audit the ones to use. A lot of utility packages can be vendored directly. Secondly all indirect dependencies should be listed when looking at the package and installing.
The JS standard library has sucked, but it has improved, and a lot of functionality that "required" a package can now be done using standard APIs. But again, no one cares about removing dependencies.
CrankBot@reddit
A lot of libraries attempt to maintain backwards compatibility, so I may be using the latest release of a package, but it contains whatever polyfills so it can be compatible with older browsers or Node versions. I still see warnings about how somewhere in my dependency tree an ancient version of glob is being installed... Wtf
yojimbo_beta@reddit
It's not that nobody cares about removing dependencies. It's that the key OSS developers who benefit from the status quo, resist changing it.
e.g. Jordan Harband.
jack-of-some@reddit
Please leave the Onion reference alone. The impact of the thing the article is criticizing is so much worse, and copying it like this trivializes it.
chucker23n@reddit
I think most people can extrapolate that people dying is worse than security threats.
tumes@reddit
Hey, come on, be fair and try to look at the bright side: without these supply chain attack boogeymen to leverage, Shopify and DHH could not have executed a hostile takeover of rubygems under the hand-waving excuse of security concerns (which pretty much exclusively affect Shopify). As a now, unfortunately, pretty-much-former Ruby dev (voluntarily), we should be so lucky to have gotten that kind of attention after 2015.