Scripts I wrote that I use all the time
October 22, 2025
oceanplexian
It's weird how the circle of life progresses for a developer or whatever.
- When I was a fresh engineer I used a pretty vanilla shell environment
- When I got a year or two of experience, I wrote tons of scripts and bash aliases and had a 1k+ line .bashrc, same as OP
- Now, as a more tenured engineer (15 years of experience), I basically just want a vanilla shell with zero distractions, aliases or scripts and use native UNIX implementations. If it's more complicated than that, I'll code it in Python or Go.
trenchpilgrim
When I had one *nix computer, I wanted to customize it heavily.
Now I have many *nix computers and I want them consistent, with only the most necessary packages installed.
ozim
Besides many *nix computers I also have a wife, a dog, children, chores, and shopping to be done. Unlike when I was a young engineer, I can no longer stay up all night fiddling with bash scripts and environments.
chis
I think it's more accurate to say that this comes from a place of laziness than some enlightened peak. (I say this as someone who does the same, and is lazy.)
When I watch the work of coworkers or friends who have gone down these rabbit holes of customization, I always learn some interesting new tools - lately I've added atuin, fzf, and a few others to my Linux install.
heyitsguay
I went through a similar cycle. Going back to simplicity wasn't about laziness for me; it was because I started working across a bunch more systems and didn't want to redo my whole custom setup on all of them, especially ephemeral stuff like containers allocated on a cluster for a single job. So rather than using my fancy setup sometimes and fumbling through the defaults at other times, I just got used to operating more efficiently with the defaults.
planb
Yeah - been there, done that, too. I feel like the time I gain from having a shortcut is often less than what I would need to maintain it, or to remember the real syntax when I'm on a machine where it's not available (which happens quite often in my case). I try to go with system defaults as much as possible nowadays.
eikenberry
Prepare to swing back again. With nearly 30 years experience I find the shell to be the best integration point for so many things due to its ability to adapt to whatever is needed and its universal availability. My use of a vanilla shell has been reduced to scripting cases only.
D13Fd
I would still call my Python scripts “scripts.” I don’t think the term “scripts” is limited to shell scripts.
jamesbelchamber
I am going through a phase of working with younger engineers who have many dotfiles, and I just think "Oh, yeah, I remember having lots of dotfiles. What a hassle that was."
Nowadays I just try to be quite selective with my tooling and learn to change with it - "like water", so to speak.
(I say this with no shade to those who like maintaining their dotfiles - it takes all sorts :))
nonethewiser
On the other hand, the author seems to have a lot of experience as well.
Personally I tend to agree... there is a very small subset of things I find worth aliasing. I have a very small number and probably only use half of them regularly. Frankly I wonder how my use case is so different.
edit: In the case of the author, I guess he probably wants to live in the terminal full time, and perhaps offline. There is a lot of static data he's stored, like HTTP status codes: https://codeberg.org/EvanHahn/dotfiles/src/commit/843b9ee13d...
In my case I'd start typing it in my browser and just click something I've visited 100 times before. There is something to be said for reducing that redundant network call, but I don't think it makes much practical difference, and the mental mapping/discoverability cost of aliases isn't nothing.
javier123454321
This is one area where I've found success in vibe coding: making scripts for repetitive tasks that sit just above the complexity threshold where the math between automating and doing it manually is unclear. I have Copilot generate the code for me, and honestly I don't care too much about its quality or extensibility; the scripts are easy enough to read through that I don't feel like my job is being an AI PR reviewer.
southwindcg
Regarding the `line` script, just a note that sed can print an arbitrary line from a file, no need to invoke a pipeline of cat, head, and tail:
sed -n 2p file
prints the second line of file. The advantage sed has over this line script is that it can also print more than one line, should you need to:
sed -n 2,4p file
prints lines 2 through 4, inclusive.
tonmoy
It is often useful to chain multiple sed commands and sometimes shuffle them around. In those cases I would need to keep changing the first sed. Sometimes I need to grep before I sed. Using cat, tail, and head makes things more modular in the long run, I feel. It's the ethos of each command doing one small thing.
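For example, a modular pipeline in that spirit might look something like this (a sketch with a hypothetical log file, not the author's actual line script):
cat app.log | grep ERROR | sed 's/ *$//' | head -n 20 | tail -n 5
Any stage can be dropped, reordered, or swapped without rewriting the others.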
1-more
yeah I almost always start with `cat` but I still pipe it into `sed -n 1,4p`
southwindcg
True, everything depends on what one is trying to do at the time.
soiltype
This is exactly the kind of stuff I'm most interested in finding on HN. How do other developers work, and how can I get better at my work from it?
What's always interesting to me is how many of these I'll see and initially think, "I don't really need that." Because I'm well aware of the effect (which I'm sure has a name - I suppose it's similar to induced demand) of "make $uncommon_task much cheaper" -> "$uncommon_task becomes the basis of an entirely new workflow/skill". So I'm going to try out most of them and see what sticks!
Also: really love the style of the post. It's very clear but also includes super valuable information about how often the author actually uses each script, to get a sense ahead of time for which ones are more likely to trigger the effect described above.
A final aside about my own workflows, which betrays my origins... for some of these operations, and for others I occasionally need, I'll just open a browser dev tools window and use JS to do it, for example lowercasing a string :)
chipsrafferty
I'd love to see a cost-benefit analysis of the author's approach vs. yours - one that includes the time it took the author to create the scripts, to remember and learn them (or look them up after forgetting the syntax), plus the time spent migrating them whenever changing systems.
te_cima
Why is this interesting to you? The whole point of doing all of this is to be more efficient in the long run. Of course there is an initial setup cost and a learning curve, after which you will hopefully feel quite efficient with your development environment. You are making it sound like it is not worth the effort because you have to potentially spend time learning it. I do not believe that takes long, though of course it can differ a lot from person to person. Your remarks seem like non-issues to me.
akersten
It's interesting because there's a significant chance one wastes more time tinkering around with custom scripts than saving in the long run. See https://xkcd.com/1205/
For example: the "saves 5 seconds" task from the blog post that the author does once a month. Over five years that's 5 s × 12 × 5 = 300 s, i.e. about 5 minutes, so hopefully the author did not spend more than 5 minutes writing and maintaining said script, or they're losing time in the long run.
o11c
I keep meaning to generalize this (directory target, multiple sources, flags), but I get quite a bit of mileage out of this `unmv` script even as it is:
#!/bin/sh
# Undo `mv A B` by running `mv B A`: retype the original mv
# command with unmv in front.
if test "$#" != 2
then
    echo 'Error: unmv must have exactly 2 arguments' >&2
    exit 1
fi
exec mv "$2" "$1"
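The generalized version might look something like this (an untested sketch: you retype the original `mv SRC... DIR` command with unmv in front; flag handling is still omitted):
#!/bin/sh
# Sketch: undo `mv SRC... DIR` by moving each source's basename
# back out of the directory target. Untested; no flag handling.
if test "$#" -lt 2
then
    echo 'usage: unmv SRC... DIR' >&2
    exit 1
fi
eval "dir=\${$#}"   # the last argument is the directory target
while test "$#" -gt 1
do
    mv "$dir/$(basename "$1")" "$1"
    shift
done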
yipbub
I have mkcd exactly (I wonder how many of us do, it's so obvious - the usual one-liner is sketched at the end of this comment).
I have almost the same set, just differently named: scratch (day), copy (xc), markdown quote (blockquote), murder, waitfor, tryna, etc.
I used to use telegram-send with a custom notification sound a lot, for notifications from long-running scripts when I walked away from the laptop.
I used to have one called timespeak that would speak the time to me every hour or half hour.
I have go_clone, which clones a repo into GOPATH and which I use for organising even non-Go projects, long after putting Go projects in GOPATH stopped being needed.
I liked writing one-offs, and I don't think it's premature optimization because I kept getting faster at it.
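For anyone who doesn't have it, mkcd is typically just a one-liner along these lines (a sketch; exact versions vary):
mkcd() { mkdir -p "$1" && cd "$1"; }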
justusthane
Obviously that script is more convenient, but if you’re on a system where you don’t have it, you can do the following instead:
mkdir /some/dir
cd !$
(or cd <Alt+.>; both !$ and Alt+. expand to the last argument of the previous command)
revicon
I have a bunch of little scripts and aliases I've written over the years, but none are used more than these...
alias ..='cd ..'
alias ...='cd ../..'
alias ....='cd ../../..'
alias .....='cd ../../../..'
alias ......='cd ../../../../..'
alias .......='cd ../../../../../..'
tacone
I have set up a shortcut, Alt+., to run cd ..; it's pretty cool.
I also aliased - to run cd -
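In zsh, that setup could look something like this (a sketch; note it takes over Alt+., which normally inserts the last argument of the previous command):
bindkey -s '\e.' 'cd ..\n'
alias -- -='cd -'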
vunderba
Does zsh support this out-of-the-box? Because I definitely never had to setup any of these kinds of aliases but have been using this shorthand dot notation for years.
Bishonen88
up() {
    local d=""
    for ((i = 1; i <= $1; i++)); do
        d="../$d"
    done
    cd "$d"
}
up 2, up 3 etc.
SoftTalker
Some cool things here but in general I like to learn and use the standard utilities for most of this. Main reason is I hop in and out of a lot of different systems and my personal aliases and scripts are not on most of them.
sed, awk, grep, and xargs along with standard utilities get you a long long way.
pinkmuffinere
I totally agree with this, I end up working on many systems, and very few of them have all my creature comforts. At the same time, really good tools can stick around and become impactful enough to ship by default, or to be easily apt-get-able. I don't think a personal collection of scripts is the way, but maybe a well maintained package.
alentred
> alphabet just prints the English alphabet in upper and lowercase. I use this surprisingly often (probably about once a month)
I genuinely wonder, why would anyone want to use this, often?
sedatk
> `rn` prints the current time and date using date and cal.
And you can type `rn -rf *` to see all timezones recursively. :)
WhyNotHugo
> cpwd copies the current directory to the clipboard. Basically pwd | copy. I often use this when I’m in a directory and I want use that directory in another terminal tab; I copy it in one tab and cd to it in another. I use this once a day or so.
You can configure your shell to notify the terminal of directory changes, and then use your terminal's "open new window" function (e.g. Ctrl+Shift+N) to open a new window retaining the current directory.
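In zsh, that notification is typically an OSC 7 escape emitted whenever the directory changes - a sketch, assuming a terminal emulator that understands OSC 7:
# Report the cwd to the terminal so new windows/tabs can inherit it.
_report_cwd() { printf '\e]7;file://%s%s\e\\' "$HOST" "$PWD"; }
autoload -Uz add-zsh-hook
add-zsh-hook chpwd _report_cwd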
> trash a.txt b.png moves `a.txt` and `b.png` to the trash. Supports macOS and Linux.
The way you’re doing it trashes files sequentially, meaning you hear the trashing sound once per file and ⌘Z in the Finder will only restore the last one. You can improve that (I did it for years), but consider just using the `trash` command that ships with macOS. It doesn’t use the Finder, so no sound and no ⌘Z, but it’s fast, official, and still allows “Put Back”.
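The single-Finder-call improvement is roughly this shape (a sketch with example paths; Finder wants absolute paths):
osascript -e 'tell application "Finder" to delete {POSIX file "/Users/me/a.txt", POSIX file "/Users/me/b.png"}'
One invocation trashes everything together, so the sound plays once and ⌘Z in the Finder restores both files.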
> jsonformat takes JSON at stdin and pretty-prints it to stdout.
Why prioritise node over jq? The latter is considerably less code and even comes preinstalled on macOS now.
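With jq, the whole script collapses to something like this (a sketch):
jsonformat() { jq . "$@"; }
# e.g.: curl -s https://example.com/api | jsonformat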
> uuid prints a v4 UUID. I use this about once a month.
Any reason to not simply use `uuidgen`, which ships with macOS and likely your Linux distro?
https://www.man7.org/linux/man-pages/man1/uuidgen.1.html
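One wrinkle: macOS's uuidgen prints uppercase, so if you want lowercase output a thin wrapper is common (a sketch):
uuid() { uuidgen | tr '[:upper:]' '[:lower:]'; }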