Modern Node.js Patterns
117 comments
August 3, 2025 · farkin88
tanduv
I never really liked the syntax of fetch, the extra await on response.json(), and having to implement the additional error handling yourself:
const axios = require('axios'); // axios has to be installed; fetch is built into Node 18+

async function fetchDataWithAxios() {
  try {
    const response = await axios.get('https://jsonplaceholder.typicode.com/posts/1');
    console.log('Axios Data:', response.data);
  } catch (error) {
    console.error('Axios Error:', error);
  }
}

async function fetchDataWithFetch() {
  try {
    const response = await fetch('https://jsonplaceholder.typicode.com/posts/1');
    if (!response.ok) { // Check if the HTTP status is in the 200-299 range
      throw new Error(`HTTP error! status: ${response.status}`);
    }
    const data = await response.json(); // Parse the JSON response
    console.log('Fetch Data:', data);
  } catch (error) {
    console.error('Fetch Error:', error);
  }
}
farkin88
Yeah, that's the classic bundle size vs DX trade-off. Fetch definitely requires more boilerplate; the manual response.ok check and double await are annoying. For Lambda, where I'm optimizing for cold starts, I'll deal with it, but for regular app dev where bundle size matters less, axios's cleaner API probably wins for me.
franciscop
As a library author it's the opposite: while fetch() is amazing, ESM has been a painful but definitely worthwhile upgrade. It has all the things the author describes.
farkin88
Interesting to get a library author's perspective. To be fair, you guys had to deal with the whole ecosystem shift: dual package hazards, CJS/ESM compatibility hell, tooling changes, etc., so I can see how ESM would be the bigger story from your perspective.
franciscop
I'm a small-ish time author, but it was really painful for a while since we were all dual-publishing in CJS and ESM, which was a mess. At some point some prominent authors decided to go full-ESM, and basically many of us followed suit.
The fetch() change has been big only for the libraries that did need HTTP requests, otherwise it hasn't been such a huge change. Even in those it's been mostly removing some dependencies, which in a couple of cases resulted in me reducing the library size by 90%, but this is still Node.js where that isn't such a huge deal as it'd have been on the frontend.
Now there's an unresolved one, which is the Node.js streams vs WebStreams, and that is currently a HUGE mess. It's a complex topic on its own, but it's made a lot more complex by having two different streaming standards that are hard to match.
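For anyone hitting that mismatch: Node does ship adapters between the two stream standards. A minimal sketch (in an ESM module), assuming a recent Node where `Readable.toWeb()` and `Readable.fromWeb()` are available; the URL and file names are just placeholders:

import { Readable } from 'node:stream';
import { createReadStream, createWriteStream } from 'node:fs';
import { pipeline } from 'node:stream/promises';

// WHATWG -> Node: a fetch() body is a web ReadableStream; convert it to use pipeline()/fs
const response = await fetch('https://jsonplaceholder.typicode.com/posts/1');
await pipeline(Readable.fromWeb(response.body), createWriteStream('./post.json'));

// Node -> WHATWG: wrap a Node stream so web APIs (e.g. new Response()) can consume it
const webStream = Readable.toWeb(createReadStream('./post.json'));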
exhaze
Tangential, but thought I'd share since validation and API calls go hand-in-hand: I'm personally a fan of using `ts-rest` for the entire stack since it's the leanest of all the compile + runtime zod/json schema-based validation sets of libraries out there. It lets you plug in whatever HTTP client you want (personally, I use bun, or fastify in a node env). The added overhead is totally worth it (for me, anyway) for shifting basically all type safety correctness to compile time.
Curious what other folks think and if there are any other options? I feel like I've searched pretty exhaustively, and it's the only one I found that was both lightweight and had robust enough type safety.
jbryu
Just last week I was about to integrate `ts-rest` into a project for the same reasons you mentioned above... before I realized they don't have express v5 support yet: https://github.com/ts-rest/ts-rest/issues/715
I think `ts-rest` is a great library, but the lack of maintenance didn't make me feel confident to invest in them. So instead I tried building an in-house solution and actually landed on something I'm quite happy with! If you're feeling up for it, I would recommend spending a day or 2 trying to create your own abstraction while referencing `ts-rest` as a source of inspiration. My solution isn't perfect, but it works well enough and having full control/understanding of the internals feels worth it.
I want to strongly emphasize though that I never would have attempted something like this had LLMs not existed. Claude Sonnet was, as expected, not able to one-shot this problem, but it helped fill a loooot of knowledge gaps, namely with complicated TypeScript types. I would also be wary of treeshaking and accidental client zod imports if bundle size is a concern.
Honestly, I'm still blown away I was able to do this. Or rather that Claude was able to do this (albeit with some heavy hand holding). AI is crazy.
farkin88
Type safety for API calls is huge. I haven't used ts-rest but the compile-time validation approach sounds solid. Way better than runtime surprises. How's the experience in practice? Do you find the schema definition overhead worth it or does it feel heavy for simpler endpoints?
yawnxyz
node fetch is WAY better than axios (easier to use/understand, simpler); didn't really know people were still using axios
mcv
This is all very good news. I just got an alert about a vulnerability in a dependency of axios (it's an older project). Getting rid of these dependencies is a much more attractive solution than merely upgrading them.
reactordev
You still see axios used in amateur tutorials and stuff on dev.to and similar sites. There’s also a lot of legacy out there.
bravesoul2
AI is going to bring that back like an 80s disco playing Wham. If you gonna do it do it wrong...
Raed667
I do miss the axios extensions though; it was very easy to add rate limits, throttling, retry strategies, caching, logging...
You can obviously do that with fetch, but it's more fragmented and takes more boilerplate.
farkin88
Totally get that! I think it depends on your context. For Lambda where every KB and millisecond counts, native fetch wins, but for a full app where you need robust HTTP handling, the axios plugin ecosystem was honestly pretty nice. The fragmentation with fetch libraries is real. You end up evaluating 5 different retry packages instead of just grabbing axios-retry.
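For what it's worth, the fetch-side boilerplate being discussed can stay fairly small. A minimal sketch of a hypothetical retry helper (exponential backoff on 5xx responses and network errors); this is not a published package, just an illustration:

async function fetchWithRetry(url, options = {}, retries = 3, backoffMs = 250) {
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      const response = await fetch(url, options);
      // retry server errors, hand everything else straight back to the caller
      if (response.status >= 500 && attempt < retries) {
        await new Promise((resolve) => setTimeout(resolve, backoffMs * 2 ** attempt));
        continue;
      }
      return response;
    } catch (err) {
      // network failure: rethrow once retries are exhausted
      if (attempt === retries) throw err;
      await new Promise((resolve) => setTimeout(resolve, backoffMs * 2 ** attempt));
    }
  }
}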
hiccuphippo
Sounds like there's space for an axios-like library built on top of fetch.
benoau
axios got discontinued years ago I thought, nobody should still be using it!
farkin88
Right?! I think a lot of devs got stuck in the axios habit from before Node 18 when fetch wasn't built-in. Plus axios has that batteries included feel with interceptors, auto-JSON parsing, etc. But for most use cases, native fetch + a few lines of wrapper code beats dragging in a whole dependency.
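As an illustration of that "few lines of wrapper code", a sketch of a hypothetical getJSON helper that folds the ok-check and JSON parsing into one place (ESM, top-level await):

async function getJSON(url, options = {}) {
  const response = await fetch(url, options);
  if (!response.ok) {
    throw new Error(`HTTP ${response.status} for ${url}`);
  }
  return response.json();
}

const post = await getJSON('https://jsonplaceholder.typicode.com/posts/1');
console.log(post.title);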
vinnymac
Undici in particular is very exciting as a built-in request library, https://undici.nodejs.org
farkin88
Undici is solid. Being the engine behind Node's fetch is huge. The performance gains are real and having it baked into core means no more dependency debates. Plus, it's got some great advanced features (connection pooling, streams) if you need to drop down from the fetch API. Best of both worlds.
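To illustrate the "drop down from fetch" path: the standalone undici package (installed via npm; as far as I know the copy bundled behind Node's fetch isn't importable directly) exposes Pool for explicit connection pooling. A sketch:

import { Pool } from 'undici';

// one pool of keep-alive connections per origin
const pool = new Pool('https://jsonplaceholder.typicode.com', { connections: 10 });

const { statusCode, body } = await pool.request({ path: '/posts/1', method: 'GET' });
console.log(statusCode, await body.json());

await pool.close();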
pbreit
It has always astonished me that platforms did not have first class, native "http client" support. Pretty much every project in the past 20 years has needed such a thing.
Also, "fetch" is lousy naming considering most API calls are POST.
synergy20
axios works in both Node and the browser in production code; not sure fetch can do as much as axios in the browser though.
simonw
Whoa, I didn't know about this:
# Run with restricted file system access
node --experimental-permission \
  --allow-fs-read=./data --allow-fs-write=./logs app.js

# Network restrictions
node --experimental-permission \
  --allow-net=api.example.com app.js
Looks like they were inspired by Deno. That's an excellent feature. https://docs.deno.com/runtime/fundamentals/security/#permiss...
tyleo
This is great. I learned several things reading this that I can immediately apply to my small personal projects.
1. Node has built-in test support now: looks like I can drop jest!
2. Node has built-in watch support now: looks like I can drop nodemon!
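For anyone making the same switch, a minimal sketch of the built-in runner; run it with node --test, or node --test --watch in place of nodemon for tests. The file name and assertions are just examples:

// math.test.mjs
import { test } from 'node:test';
import assert from 'node:assert/strict';

test('adds numbers', () => {
  assert.equal(1 + 2, 3);
});

test('rejects on bad input', async () => {
  await assert.rejects(async () => { throw new Error('boom'); });
});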
pavel_lishin
I still like jest, if only because I can use `jest-extended`.
vinnymac
If you haven't tried vitest I highly recommend giving it a go. It is compatible with `jest-extended` and most of the jest matcher libraries out there.
pavel_lishin
I've heard it recommended; other than speed, what does it have to offer? I'm not too worried about shaving half a second off my personal projects' 5-second test run :P
hungryhobbit
Eh, the Node test stuff is pretty crappy, and the Node people aren't interested in improving it. Try it for a few weeks before diving in headfirst, and you'll see what I mean (and if you then go file issues about those problems, you'll see the Node team doesn't care).
NackerHughes
Be honest. How much of this article did you write, and how much did ChatGPT write?
Our_Benefactors
What, surely you’re not implying that bangers like the following are GPT artifacts!? “The changes aren’t just cosmetic; they represent a fundamental shift in how we approach server-side JavaScript development.”
jameshart
And now we need to throw the entire article out because we have no idea whether any of these features are just hallucinations.
gabrielpoca118
Don't forget the native TypeScript transpiler, which reduces the complexity a lot for those using TS.
sroussey
It strips TS, it does not transpile.
Things like TS enums will not work.
mmcnl
Exactly. You don't even need --experimental-strip-types anymore.
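A small sketch of the difference; this assumes a recent Node where .ts files run directly (older releases needed --experimental-strip-types), and the file contents are just an illustration:

// app.ts: plain type annotations are simply erased, so this runs with `node app.ts`
interface User { name: string }
const greet = (user: User): string => `hello ${user.name}`;
console.log(greet({ name: 'node' }));

// By contrast, constructs with runtime semantics (enums, namespaces, parameter
// properties) are not erasable: a file containing `enum Role { Admin, User }`
// needs --experimental-transform-types or a separate build step.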
bravesoul2
Anyone else find they discover these sorts of things by accident? I never know when a feature was added, just have a vague idea of "that's modern". Feels different from when I only did C# and you'd read the new language features and get all excited. In a polyglot world, with the rate even individual languages evolve, it's hard to keep up! I usually learn through osmosis or a blog post like this (but that is random learning).
austin-cheney
I see two classes of emerging features, just like in the browser:
1. new technologies
2. vanity layers for capabilities already present
It's interesting to watch where people place their priorities given those two segments.
ale
Why bother with node when bun is a much better alternative for new projects?
tonypapousek
Why bother with bun when deno 2 is a much better alternative for new projects?
0x073
Why bother with deno 2 when node 22 is a much better alternative for new projects?
(closing the circle)
rco8786
I've been away from the node ecosystem for quite some time. A lot of really neat stuff in here.
Hard to imagine that this wasn't due to competition in the space. With Deno and Bun trying to eat up some of the Node market in the past several years, it seems like Node development got kicked into high gear.
prmph
I think Node is slowly shaping up to offer strong competition to Bun.js, Deno, etc., such that there is little reason to switch. The mutual competition is good for the continued development of JS runtimes.
gear54rus
Slowly, yes, but these are definitely welcome changes. I'm still missing Bun's `$` shell functions though. It's very convenient to use JS as a scripting language, and I don't really want to run two runtimes on my server.
adriancooney
You might find your answer with `zx`: https://google.github.io/zx/
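A tiny sketch of what that looks like, assuming zx is installed (e.g. npm i -g zx) and following the tagged-template `$` pattern from its docs:

#!/usr/bin/env zx
// zx's tagged-template $ runs shell commands and escapes interpolated values for you
const branch = await $`git branch --show-current`;
await $`echo deploying ${branch}`;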
azangru
Matteo Collina says that Node's fetch under the hood is the fetch from the undici client [0], and also that, because it needs to generate WHATWG web streams, it is inherently slower than the alternative, undici's request [1].
[0] - https://www.youtube.com/watch?v=cIyiDDts0lo
[1] - https://blog.platformatic.dev/http-fundamentals-understandin...
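For reference, the faster path described there is undici's request(), whose body you consume without the WHATWG stream wrapper. A short sketch, assuming the standalone undici package is installed:

import { request } from 'undici';

const { statusCode, body } = await request('https://jsonplaceholder.typicode.com/posts/1');
console.log(statusCode, await body.json());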
vinnymac
If anyone is curious how they are measuring, these are the benchmarks: https://github.com/nodejs/undici/blob/main/benchmarks/benchm...
I did some testing on an M3 Max Macbook Pro a couple of weeks ago. I compared the local server benchmark they have against a benchmark over the network. Undici appeared to perform best for local purposes, but Axios had better performance over the network.
I am not sure why that was exactly, but I have been using Undici with great success for the last year and a half regardless. It is certainly production ready, but often requires some thought about your use case if you're trying to squeeze out every drop of performance, as is usual.
vinnymac
You no longer need to install chalk or picocolors either; you can now style text yourself:
`const { styleText } = require('node:util');`
Docs: https://nodejs.org/api/util.html#utilstyletextformat-text-op...
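A quick usage sketch based on the linked docs (the messages are placeholders; the require form above works the same):

import { styleText } from 'node:util';

// single format, or an array of formats applied together
console.log(styleText('green', 'build passed'));
console.log(styleText(['bold', 'red'], 'build failed'));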
The killer upgrade here isn't ESM. It's Node baking fetch + AbortController into core. Dropping axios/node-fetch trimmed my Lambda bundle and shaved about 100 ms off cold-start latency. If you're still running `npm i axios` out of habit, 2025 Node is your cue to drop the training wheels.
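Since fetch + AbortController in core is the closing point, a minimal timeout sketch using only the built-ins; AbortSignal.timeout() is available in recent Node, and the URL is just the thread's example endpoint:

// abort the request if it takes longer than 5 seconds
const response = await fetch('https://jsonplaceholder.typicode.com/posts/1', {
  signal: AbortSignal.timeout(5000),
});

if (!response.ok) throw new Error(`HTTP ${response.status}`);
console.log(await response.json());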