Malware found on NPM infecting local package with reverse shell
145 comments
· March 26, 2025 · love2read
jrmann100
I believe the Deno permission system[0] does what you're asking, and more.
(Deno is a JavaScript runtime co-created by Ryan Dahl, who created Node.js - see his talk "10 Things I Regret About Node.js"[1] for more of his motivations in designing it.)
DimmieMan
Yes, explicitly asking you if you want to run the install script is the first warning (which pnpm can do too)
Then it would halt on file access or network permissions.
Could still get you if you lazily allow all everywhere though and this is why you shouldn’t do that.
simlevesque
Yes, and you can run almost every npm package:
deno run npm:@angular/cli --help
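For illustration, the same run can be scoped explicitly (a sketch using standard Deno permission flags; the specific values are just an example):

```sh
# Grant only read access to the current directory;
# any other access (writes, network, env vars, subprocesses) triggers a permission prompt.
deno run --allow-read=. npm:@angular/cli --help
```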
bilalq
pnpm skips all `postInstall` runs by default now. You can explicitly allow-list specific ones.
If you use that, I'd highly recommend configuring it to throw an error instead of just silently skipping the postInstall though: https://github.com/karlhorky/pnpm-tricks#fail-pnpm-install-o...
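For reference, a minimal sketch of that allow-list, assuming a pnpm version that supports onlyBuiltDependencies (the "esbuild" entry is only an illustrative example):

```json
{
  "pnpm": {
    "onlyBuiltDependencies": ["esbuild"]
  }
}
```

Only packages on that list get to run their build/postinstall scripts; everything else is skipped, or, with the linked trick, turned into a hard failure.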
spiffytech
Bun does the same.
bilalq
Sure, but switching from node to bun is a much more invasive change than switching from npm to pnpm. And not always possible.
no_wizard
I will say, for all the (sometimes valid) complaints about NPM and its ecosystem, I rarely hear the same about Go.
Go encourages package authors to simply link to their git repository. It quite literally clones source files onto your computer without much thought.
skydhash
No code execution during dependency fetching. And the dependency tree is very shallow for most projects, making it easier to audit.
no_wizard
Still no guard rails, just raw source code. It would be easy for anything to hide within. Given observed behavior, I doubt most people are auditing the source either.
It’s ripe for an exploit
teknopaul
Just package node_modules subdirectories as tar files.
I stopped using npm a while back and push and pull tar files instead.
Naturally I get js modules from npm in the first place, but I never run code with it after initial install and testing of a library for my own use.
simpaticoder
This is a valid choice, but you must accept some serious trade-offs. For one thing, anyone wanting to trust you must now scrutinize all of your dependencies for modification. Anyone wanting to contribute must learn whatever ad hoc method you used to fetch and package deps, and never be sure of fully reproducing your build.
The de facto compromise is to use package.json for deps, but your distributable blob is a docker image, which serializes a concrete node_modules. Something similar (and perhaps more elegant) is Java's "fat jar" approach where all dependencies are put into a single jar file (and a jar file is just a renamed zip so it's much like a tarball).
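As a sketch of that compromise (file names and the entry point are placeholders, not a prescribed setup), the image freezes a concrete node_modules at build time:

```dockerfile
# Minimal illustration of "serialize node_modules into the distributable blob"
FROM node:20-slim
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci --ignore-scripts   # dependencies resolved and frozen inside the image
COPY . .
CMD ["node", "server.js"]
```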
no_wizard
It may not be a well-known feature, but npm can unpack tarballs as part of the install process; that's how packages are served from the CDN.
If you vendor and tar your dependencies correctly, you could build a system around trust layers, for instance by inspecting hashes before allowing unpacking.
It’s a thought exercise certainly but there might be legs to this idea
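A minimal sketch of that trust layer, assuming vendored tarballs and a pinned digest in the same sha512-base64 format npm lockfiles use (the file path and expected value are placeholders):

```js
// verify-tarball.js: refuse to unpack a vendored tarball whose digest doesn't match the pin.
const { createHash } = require('node:crypto');
const { readFileSync } = require('node:fs');

const expected = 'sha512-REPLACE_WITH_PINNED_DIGEST'; // e.g. copied from a lockfile "integrity" field
const tarball = readFileSync('vendor/some-dep-1.2.3.tgz'); // hypothetical vendored dependency

const actual = 'sha512-' + createHash('sha512').update(tarball).digest('base64');

if (actual !== expected) {
  throw new Error(`Integrity mismatch for vendored tarball: ${actual}`);
}
console.log('OK to unpack / install from this tarball');
```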
CBLT
I think Yarn zero install is now the default, and does the same thing you're advocating? I'm not really a JS person, but it looks like it's done reasonably competently (validating checksums etc).
deepsun
Same as Python (setup.py). It's even worse in Go, as they encourage just linking GitHub repos at whatever the latest version happens to be.
Only Java, .Net and R just download files, at a declared (reproducible) version.
delusional
You're already including arbitrary code in your application. Presumably you're intending to run that application at some point.
nextts
What is the answer to that? Learn x86 and bootstrap?
CGamesPlay
Capability-based security within Node. The main module gets limited access to the system (restricted by the command-line, with secure defaults), all dependencies have to be explicitly provided with capabilities they need (e.g. instead of a module being able to import "fs", it receives only an open directory handle to the directory that the library consumer dictates). Deno already does the first half of this.
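Node doesn't enforce this today, but the shape of the idea can be sketched: the application opens the one directory it wants to expose and passes that handle in, rather than the dependency importing "fs" itself (the library name here is hypothetical):

```js
const fs = require('node:fs/promises');
const renderTemplates = require('template-lib'); // hypothetical library that accepts a capability

async function main() {
  // The only filesystem "capability" the library ever receives:
  const dir = await fs.opendir('./templates');
  await renderTemplates(dir); // it can iterate this directory; nothing else is handed to it
}

main();
```

The enforcement half (stopping the library from just importing "fs" anyway) is what runtime-level permissions like Deno's would have to provide.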
davidmurdoch
The lavamoat npm package does something similar. It's maintained by the security team at MetaMask (crypto wallet extension and app). It's used in the extension runtime as well as wraps the build process.
feross
We built “safe npm”, a CLI tool that transparently wraps the npm command and protects developers from malware, typosquats, install scripts, protestware, telemetry, and more.
You can set a custom security policy to block or warn on file system, network, shell, or environment variable access.
geenat
Downloads used in infrastructure... VSCode Extensions, Github repos, PyPI, NPM, etc. all need to be scrutinized.
Open source at least has the option to audit; closed source (or "closed build" stuff like 7zip) is at far higher risk: the main recourse is VirusTotal, which mostly will not catch backdoors of this type.
Mainland China, Russia, and North Korea use these vectors for corporate and government espionage: https://www.youtube.com/watch?v=y27B-sKIUHA ...XZ and Swoole are two examples off the top of my head.
phito
Malware in a crypto-related JavaScript package. Surprised Pikachu face
deanc
I'd like to see a world where the JS community focused more on improving the stdlib across the browser and in nodejs - much like bun is doing. Common packages for node such as mysql2, axios etc. are so widely used and are huge attack vectors should they ever be compromised.
DimmieMan
Deno is perhaps a better example, with browser APIs, membership in the WinterTC committee, and a growing set of std packages[1].
Possibly more importantly, it has a security model that defends against this kind of exploit.
I will agree with the sentiment though, I get not wanting to jump on new shiny things but for some reason I keep getting the vibe that the community is closer to crab mentality than healthy skepticism, downright hostile towards any project making a genuine effort to improve things.
azemetre
My issue with deno and jsr is that it only exists at the discretion of VC funding.
That does not feel like a sustainable ecosystem; the incentives are misaligned (wanting a return on capital versus proper open engineering).
sieabahlpark
[dead]
BrouteMinou
I am reading this while downloading half the internet doing a `cargo build` for my hello world program.
At least it's not the cursed javascript, right...
theteapot
Put `ignore-scripts=true` in your .npmrc
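For reference, the same setting per project or per invocation (both are standard npm options):

```sh
# project-local .npmrc
echo "ignore-scripts=true" >> .npmrc

# or one-off
npm install --ignore-scripts
```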
gruez
that just delays the exploit. The exploit can still run next time you import the file.
theteapot
This mostly defends against name squatting and other malicious dependencies that never get imported.
I haven't reviewed the code, but the article says its entry point is the install script. Install scripts don't run on import. I guess you're saying it triggers on import too.
megadata
Could we start a community review pool?
dingi
Why does NPM always seem to have these kinds of issues? Why do we rarely hear about similar problems with Maven Central, for example?
dlachausse
I think a lot of it comes down to attack surface. JavaScript famously has a very limited standard library, so it is very common to pull in massive dependency chains of modules containing trivial functionality.
Contrast this to Java, C#, and Python, where you can write robust applications with just the standard libraries.
thewebguyd
Yep. See left-pad from 2016. Something so trivial it was basically part of every other major language's standard library (and it is now in JS, but wasn't at the time).
Other gems exist in the NPM world like is-odd, isarray, and is-negative-zero.
The whole ecosystem developed this absurd culture of micro-packages that all do something insanely trivial that's built-in to pretty much every other language. All of it is a result of trying to force the web into doing things it was never really designed for or meant to do.
Browsers were never supposed to be full application runtimes, and yet we (as an industry) keep doubling down on it
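For a sense of scale, such a package is roughly a one-liner (paraphrased here, not the actual published source):

```js
// Approximately what a micro-package like is-odd boils down to:
const isOdd = (n) => Math.abs(n % 2) === 1;
```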
yupyupyups
>Browsers were never supposed to be full application runtimes, and yet we (as an industry) keep doubling down on it
Web technologies are multi-platform, widely accessible, have an unmatched UI API, and come with great tooling and a large ecosystem. It's the path of least resistance.
Contrast this with something like Qt, which is an absolute pain to compile and is riddled with bugs. Its GUI library is mature, but nowhere near the web.
Time is money, and developers are many times more expensive than the extra cpu cycles the web consumes.
whstl
The culture of micro-packages is mostly pushed by people with financial interests.
The modus operandi is to add a semi-useful package to a popular project, and then divide said project into other sub-projects just to bump the numbers.
Then those individuals start claiming that they have "25 packages that are foundational to the web infrastructure", while in fact it's just a spinner that has 24 dependencies, some of those being one-liner packages.
Parallel to that we also have things like Babel, which has hundreds of packages, a huge chunk of them being almost empty and only triggering flags in the core package.
pjc50
> Browsers were never supposed to be full application runtimes
Unfortunately all the proprietary OS vendors want to prevent the development of fully portable runtimes, so we've ended up where the browser is the only fully portable application runtime.
marginalia_nu
To be fair Java has a pretty large package ecosystem as well, including a significant number of dependencies. Though the culture is very different, and you generally tend to avoid dependencies if possible. The idea of pulling in a package for testing whether a number is even or odd is incredibly foreign.
feross
Great question. A few reasons:
– The JavaScript ecosystem moves faster — way more packages, more frequent updates, and more transitive dependencies (avg 79 per package).
– npm has lower barriers to publishing, so it’s easier for malicious actors to get in.
– Java developers often use internal mirrors and have stricter review processes, while npm devs tend to install straight from the registry.
– But to be clear, supply chain attacks do happen in other ecosystems — they’re just underreported. We’ve seen similar issues in PyPI, RubyGems, and even Maven.
JavaScript just happens to be the canary in the coal mine.
robinsonb5
Perhaps because nowhere else has one package for every 2,972 humans on planet Earth?
carlmr
Biggest target had biggest number of issues?
rvz
The Javascript ecosystem is the problem and is completely immature by design.
johnny22
Not sure about Java land, but we have seen this on PyPI and I think even on RubyGems.
tedd4u
I think the industry is going to soon look back on building with Wild West open-source repos like we looked back on not having absolutely everything running on HTTPS in the Snowden era. I know Google has "assured" open source repos for Python and Java [1]. Are there other similar providers for those and other languages?
[1] https://cloud.google.com/assured-open-source-software/docs/o...
giantg2
Any reasonable company already knows this and sets up a proxy repo of scanned/approved versions (this is important for licensing too).
swatcoder
You're absolutely right, but you've just asserted that almost all companies making software are unreasonable.
Distressingly, doing what you suggest remains the exception by orders of magnitude. Very few people have internalized why it's necessary and few of those have the political influence in their organizations to make it happen.
nukem222
[flagged]
pletnes
Not from what I’ve seen. What are the relevant products in this space? Can’t expect every random company to set up package scanning from scratch.
chrisweekly
JFrog / Artifactory is one very common provider of private npm registries. There are a ton of security-scan vendors out there (mend/whitesource, socket, black duck...)
tsm
I worked for an IBM acquiree 13 years ago and as part of the "Blue-washing" process to get our software up to IBM spec we had to use their proprietary tools for verifying our dependencies were okay.
giantg2
Well then I wouldn't expect to do business with every random company. TPRM is a big issue today, so I wouldn't expect any company not performing basic due diligence to service.
0cf8612b2e1e
How much is that automated scanning worth? Sure, we have mirrored repos, but I assume the malware authors pre-test their code on a suite of detectors in CI. So infected packages will happily be mirrored internally for consumption.
feross
Totally agree. Most companies using mirrors or proxies like Artifactory aren’t getting much real protection.
- They cache packages but don’t analyze what’s inside.
- They scan or review the first version, then auto-approve every update after that.
- They skip transitive deps — and in npm, that’s 79 on average per package.
- They rely on scanners that claim to detect supply chain attacks but just check for known CVEs. The CVE system doesn’t track malware or supply chain attacks (except rarely), so it misses 99%+ of real threats.
Almost everything on the market today gives a false sense of security.
One exception is Socket — we analyze the actual package behavior to detect risks in real time, even in transitive deps. https://socket.dev (Disclosure: I’m the founder.)
delusional
Not much. As you say, static scanning is pretty much a dead-end strategy. Exploiters have long since realized that you can just run the scan yourself and jiggle the bytes around to evade the signature detection.
giantg2
At least at my company, I think someone at least has to approve/verify the scan results. Of course it's still a risk, but so are external emails, vendor files, and everything else.
thibaut_barrere
It is worth a fair bit. If you control the mirroring you can ensure the malware is flagged but not deleted, so forensics can assess how much damage has been done or would have been done, for instance.
poincaredisk
>I assume the malware authors pre test their code on a suite of detectors in CI.
Maybe some do, but you give the average malware developer way too much credit.
ohgr
Bugger all. We had something go straight through.
theteapot
> npm is a package manager for the JavaScript programming language maintained by npm, Inc., a subsidiary of GitHub. -- [1]
and Microsoft owns GitHub, so Microsoft is the provider? Pretty sure they're running malware scanners over NPM constantly, at the least. NPM also has (optional) provenance [2] tied to a GitHub build workflow, which is as strong as being "assured" by Google IMO. Only problem is it's optional.
[1]: https://en.wikipedia.org/wiki/Npm [2]: https://github.blog/security/supply-chain-security/introduci...
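For reference, that provenance attestation is an opt-in flag at publish time (supported in recent npm versions when publishing from a supported CI workflow):

```sh
npm publish --provenance
```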
the8472
This is a coordination failure. We have ways to distribute the source, but not the reviews. Every time someone does any level of reviewing that should be publishable too.
karlding
Things like cargo-crev [0] or cargo vet [1] aim to tackle a subset of that problem.
There’s also alternate implementations of crev [2] for other languages, but I’m not sure about the maturity of those integrations and their ecosystems.
[0] https://github.com/crev-dev/cargo-crev
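For a flavor of the cargo-vet workflow, a rough sketch of its main subcommands (assuming the tool is installed; see its docs for the exact semantics):

```sh
cargo vet init      # set up the audits/config for the current project
cargo vet           # check whether every dependency is covered by an audit or exemption
cargo vet certify   # record an audit you have just performed
```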
donnachangstein
> like we looked back on not having absolutely everything running on HTTPS in the Snowden era.
Apples and oranges and this is far, far worse.
You can absolutely ship signed, trusted code over standard HTTP. Microsoft did this for years and Debian and OpenBSD to name a few still do.
HTTPS does not assure provenance of code.
Anyone who doesn't understand this is very misinformed about what HTTPS does and doesn't do.
tedd4u
Sorry, I wasn't clear. I meant it only in the general sense that, in the not-too-distant past, the industry was content with a huge hole like running only the login page under HTTPS and none of the site traffic, which in hindsight seems insane. What I mean is the situation (explored in the rest of this thread) where many in the industry seem content to consume code extensively from public repos without many obstacles to prevent a supply-chain attack. What I'm saying is that soon the industry will probably look back on this in the same way: "what were we thinking!?"
genewitch
I think this depends on one's definition of "code"
fc417fc802
> Wild West open-source repos
There's a deeper issue though. I frequently have difficulty getting things to build from source in a network-isolated environment. That's after I manually wrangle all the dependencies (and sub-deps, and sub-sub-deps, and ...).
Even worse is something like emscripten where you are fully expected to run `npm install`.
Any build process that depends on network access is fundamentally broken as far as I'm concerned.
no_wizard
Which is nearly all of them that I can think of, except perhaps C/C++, among broadly adopted languages.
You can cache and/or emulate the network to go offline but fundamentally a fresh build in most languages will want to hit a network at least by default
robinsonb5
In my world (VHDL/Verilog and some C/C++) there's a difference between the "fetch" and "build" steps. It's perfectly reasonable for the fetch step to require network access; the build step should not.
The real problem is that some language ecosystems conflate those two steps.
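For comparison, ecosystems that do separate the steps make the split explicit; a Rust example (both are standard Cargo commands):

```sh
cargo fetch            # network allowed: populate the local dependency cache
cargo build --offline  # build with network access disallowed
```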
fc417fc802
> at least by default
Even in C/C++ after changing the relevant parameters to non-default values things often break. It seems those configurations often go untested.
Google managed repos are a nice exception to this. Clearly documented commit hashes for all dependencies for a given release.
BrenBarn
So then instead of knowing nothing, we'll know that Google wants us to use it, which . . . is a different problem. :-)
nextts
How does HTTPS help with the problems Snowden uncovered? You don't "run on" HTTPS; HTTPS just does in-transit encryption between two points of the service architecture. That is why you can (could?) slap Cloudflare atop your HTTP-only site and get a padlock!
reactordev
Because one of the methods reported was scanning HTTP packets, easily read without SSL from any hop in the chain. More importantly, he blew the lid off the fact that governments had access to this via the very ISPs everyone relies on for telecom. By making everything TLS, they can look all they want but they can't read it.
You could do TLS offloading at your load balancer, but then you have to secure your entire network starting with your ISP. For some workloads this is fine, as you aren't dealing with super sensitive data. For others, you'd be violating compliance.
tedd4u
I'm referring to programs like MUSCULAR [1] and PRISM [2] where NSA was tapping inter- and intra-datacenter traffic of major internet communications platforms like Gmail, Yahoo Mail, Facebook etc. At the time, that kind of traffic was not encrypted. It was added in a hurry after these revelations.
nextts
Oh yeah where I work we run encryption between ec2s. But I don't think it is https. Probably more low level (todo: read up on how it works!)
feross
Totally agree — we’re going to look back and wonder how we ever shipped code without knowing what was in our dependencies. Socket is working on exactly this: we analyze the actual code of open source packages to detect supply chain risks, not just known CVEs. We support npm, PyPI, Maven, .NET, Rubygems, and Go. Would love to hear which ecosystems you care about most.
(Disclosure: I’m the founder. https://socket.dev)
xyst
People wonder why I run their shitty apps in VMs and nuke the VM afterwards.
This is why, lol.
fsflover
Sounds like Qubes OS, which is my daily driver.
submeta
What‘s the advice? Only develop in a sandbox environment? Otherwise chances are our main machines get compromised?
rcxdude
Vet your dependencies, at least to the level of making sure your direct ones are actually real projects done by known people and reasonably widely used. Note that for all the buzz about these kinds of attacks, there's relatively little evidence they are actually successful at being downloaded and installed by anything but automated scanning/archiving systems.
null
The fact that HTTP fetches and fs reads don't prompt the user is continually the craziest part of `npx` and `package.json`'s `postinstall`.
Does anyone have a solution to wrap binary execution (or npm execution) and require explicit user authorization for network or fs calls?
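(One rough approximation available today, sketched here with firejail, assuming it is installed; Deno's interactive permission prompts, mentioned upthread, are the other common answer. This is deny-by-default rather than prompt-on-access:)

```sh
# Run an untrusted script with no network access and only the project directory visible:
firejail --net=none --private=$(pwd) node ./some-script.js
```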