
I'm Archiving Picocrypt

110 comments

August 6, 2025

uludag

I've felt similarly to the author: a sort of despair that the only point of writing software now is to prop up the valuation of AI companies, that quality no longer matters, etc.

Then I realized that nothing's stopping me from writing software how I want and how I feel is best. I've stopped using LLMs completely and couldn't be happier. I'm not even struggling at work or feeling like I'm behind. I work on a number of personal projects too, all without LLMs, and I couldn't feel better.

immibis

This is also a good opportunity to remember that MIT is not a strong enough open source license, and if you want to prevent corporations making money off your work, make it AGPL or even SSPL, plus a statement that AI training creates a derivative work (the latter may or may not have any legal effect).

MIT is a donation of your labour to corporations. With a stronger license, at least they're more likely to contribute back or to pay you for a looser license.

kannanvijayan

Tangentially, I wonder if logins and click-throughs can help address this on the legal front.

Suppose you set up a login flow with a click-through that explicitly sets the terms of access: no cost for access by a person, and some large cost for access by an AI.

Stepping past this prompt into the content would require an AI to either lie, committing both fraud and unauthorized access of content, or behave truthfully, opting its proprietor in to the associated costs.

In either case, the site operator can then go after the company doing the scraping to collect the fees specified in the contract (and perhaps some additional punitive damages if the content was accessed fraudulently).
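
A minimal sketch of how such a gate might look, assuming a plain Go net/http server. Everything here (routes, cookie name, terms text) is illustrative, not taken from Picocrypt or any real site: the terms are shown on first contact, and content is only served after the client has explicitly affirmed them.

    // Hypothetical sketch of a terms click-through gate; names are illustrative.
    package main

    import (
    	"fmt"
    	"log"
    	"net/http"
    )

    const terms = `Access terms: free for individual human readers;
    automated access for AI training is billed at the posted rate.
    POST /accept to affirm you agree to these terms.`

    // requireAcceptance serves content only to clients that have affirmed the terms.
    func requireAcceptance(next http.Handler) http.Handler {
    	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
    		if c, err := r.Cookie("terms_accepted"); err != nil || c.Value != "yes" {
    			// No recorded acceptance: show the terms instead of the content.
    			http.Error(w, terms, http.StatusForbidden)
    			return
    		}
    		next.ServeHTTP(w, r)
    	})
    }

    func main() {
    	mux := http.NewServeMux()

    	// Accepting records the affirmation; a scraper that does this is the one
    	// the parent comment argues has opted itself into the stated costs.
    	mux.HandleFunc("/accept", func(w http.ResponseWriter, r *http.Request) {
    		http.SetCookie(w, &http.Cookie{Name: "terms_accepted", Value: "yes"})
    		fmt.Fprintln(w, "terms accepted")
    	})

    	mux.Handle("/content", requireAcceptance(http.HandlerFunc(
    		func(w http.ResponseWriter, r *http.Request) {
    			fmt.Fprintln(w, "the gated content")
    		})))

    	log.Fatal(http.ListenAndServe(":8080", mux))
    }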

fouronnes3

When are we getting a GPLv4 that's AGPL + no LLM training? This is overdue.

Octoth0rpe

Given Meta's history of torrenting every book it could get its hands on for training, I'm not convinced that the majority of AI companies would respect that license. Maybe if we also had a better way to prove that such code was part of the training set and saw a couple of solid legal victories with compensation awarded.

ramses0

"Adversarial Internet" => if it touches the internet it's no longer yours. See a previous comment chain: https://news.ycombinator.com/item?id=44616163

pjerem

As if LLM training cared about respecting licenses. :(

jvanderbot

My boss has taken this approach, and it took a load off the "LLM pressure".

iaiuse

MIT isn’t “weak” because it allows LLM training; it’s weak because it puts zero obligations on the recipient.

Blocking “LLM training” in a license feels satisfying, but I’ve run into three practical issues while benchmarking models for clients:

1. Auditability — You can grep for GPL strings; you can’t grep a trillion-token corpus to prove your repo wasn’t in it. Enforcement ends up resting on whistle-blowers, not license text.

2. Community hard-forks — "No-AI" clauses split the ecosystem. Half the modern Python stack depends on MIT/BSD; if even 5% flips to an LLM-ban variant, reproducible builds become a nightmare.

3. Misaligned incentives — Training is no longer the expensive part. At today's prices a single 70B checkpoint costs about $60k to fine-tune, but running inference at scale can exceed that each day. A license that focuses on training ignores the bigger capture point.

A model company that actually wants to give back can do so via attribution, upstream fixes, and funding small maintainers (things AGPL/SSPL rarely compel). Until we can fingerprint data provenance, social pressure—or carrot contracts like RAIL terms—may move the needle more than another GPL fork.

Happy to be proven wrong; I’d love to see a case where a “no-LLM” clause was enforced and led to meaningful contributions rather than a silent ignore.

stoneyhrm1

I understand the author's sentiment, but industries don't exist solely because somebody wants them to. I mean, sure, hobbies can exist, but you won't be paid well (or even at all) to pursue them.

Software engineering pays because companies want people to develop software. It pays so well because it's hard, but the coding portion is becoming easier. Vibe coding and AI are here to stay; the author can choose to ignore it and go preach to a dying field (specifically, writing code, not CS), or embrace it. We should be happy we no longer need to type out if and for loops 20 times and can instead focus on high-level architecture.

zamadatix

In regards to "As long as you can run the code, archiving this project means nothing, really." I think this section misses the main concern with archived software: what happens when one of those bugs is hit (either something not yet noticed or something caused by external changes down the road) and there is no actively maintained version (which could include one you're willing to hack at yourself) to just update to?

The simpler the software the less urgent the concern but "I haven't had a problem in the last 2 years" is something I could say of most software which I end up needing to update, and it makes sense to make myself able to do so at a convenient time rather than the moment the problem occurs.

This project seems popular enough that I'm sure waiting a bit and seeing who the successor project is would be a safe bet as well, though.

Tepix

The (current) last comment by hakavlad (the same as hakavlad on HN, perhaps?):

    @HACKERALERT Your decision may be somewhat irresponsible towards those who donated to the audit.
That audit was one year ago. The money didn't go to the author. The source continues to be available. The author owes you zilch.

hakavlad

>The money didn't go towards the author.

Perhaps many would have refused to donate if they knew that the project would be archived in a year. Collecting for audit and then archiving the project is, in a way, a violation of expectations.

rowanG077

Yes, I found this a profoundly weird comment. The audited code will be forever available and audited.

jvanderbot

Honestly I feel the same.

I'm holding out hope that there will always be boutique/edge software to be written, which requires enough design and care to be mentally challenging and engaging - the craftsmanship kind.

When AlphaGo was announced, I had to keep telling people that "It's not like computers win at Go, it's just that we now have a tool that makes us way better at Go". If an alien race showed up and challenged us to Go to save the species, would we use a Go player or AlphaGo, if we had to choose?

The problem is LLMs aren't like that, because software isn't like Go. And they really are annoying to use, frustrating to redirect all the time, and generally cannot do what you want precisely, without putting in more mental energy to provide context and decompose the problem than you'd need to do the damn thing yourself. And then you lose an hour or two of flow, which is the reward for the whole process.

But at certain times they are a godsend and they have completely replaced some of the more boring parts of my job. I wouldn't want to lose them, not at all.

Like the author, I don't think we're heading to a healthy balance where LLMs help us be better at our job. I do think the hype is going in the wrong direction, and I do worry for the state of our field (Or at least the _ideals_ of our field). Call me naive, but I also thought it mattered what the code was.

raincole

I don't get it.

(I'm not trying to throw shade at the author. I know they have no obligation to maintain an open source project. I'm simply having a hard time grasping what's happening.)

epolanski

Seems like the author is abandoning software because, in his opinion, due to the AI explosion employers no longer care about code quality, only quantity.

I don't get it either, because that has always been the case, thus most of his post is borderline nonsense.

Imo, what happened is that he took the opportunity of entering academia to vent some stuff and quit maintaining his project.

worldsavior

He doesn't have interest in the project anymore. He hasn't had it in a long time, and now that he has stopped with software development and gone into academia, he certainly doesn't. Is that hard to get?

He explained the reason he went into academia, which is AI, and AI is not the reason he stopped development.

shiomiru

> I don't get it either, because that has always been the case, thus most of his post is borderline nonsense.

Yes, making software development cheaper has been the main priority of the industry for a long time. The new development is that there's now a magic "do what I want" button that obviously won't quite do what you want but it's so cheap (for corporations, not humanity...) that you might as well pretend it does. (Compared to paying professionals who might even care about doing a good job, that is.)

sunshine-o

I believe you need to separate two things:

- The author enjoyed writing quality open source code

- The author needs to make strategic decisions for his own career and livelihood and he doesn't have enough bandwidth for both

I don't feel he is happy about the decision he needs to make, and he is pointing to something dark happening to software development and open source.

Now this is not new, and it didn't start with LLMs. I am sure if we ask the OpenBSD devs what they think about the modern mainstream open source ecosystem, docker, full stack development, etc., they see it the way we might look at LLM-generated code. This is just a question of how much of a purist you are.

karteum

I was thinking exactly the same: I also don't get it. (I totally get that someone may lose motivation to work on a project, and certainly has no obligation to continue, but this justification sounds a bit weird to me.)

Could this mean that he has been approached by some "men in black" asking him to insert a backdoor in the code or to stop working on it, together with a gag order? (I also wondered the same a long time ago with TrueCrypt, even though to my knowledge no backdoor has ever been found in further audits...)

000ooo000

Did you not read the comment he wrote? It's straightforward.

blenderob

I'm starting to feel kinda old and out of the loop. Could someone please explain the conversational style of this post?

It begins with a prompt directed at Gemini, followed by what appears to be an AI-generated response. Are these actual AI responses or is the developer writing their parting message in a whimsical way? I'm genuinely confused. Help much appreciated!

yifanl

This is a post framed in medias res, from the perspective of the developer, as portrayed by themselves, asking an LLM to construct the post that they then publish immediately afterwards.

I'm unsure whether the post was actually created by Gemini or is the developer's imitation. I suspect the latter.

MrGilbert

I like the creativity behind this. And I feel sorry for them that the current wave of AI has led to them abandoning their pet project. Maybe they will pick up the project again once the dust has settled. In the end, at least for me, they are pet projects for exactly that reason: an escape route from the day-to-day business. One should still be proud of what they achieved in their spare time. I don't care if my job requires me to use K8s, Claude, or Docker. Or if that's considered "industry standard".

My projects, my rules.

leonewton253

I forked it and named it NanoCrypt. Time to rip out the GUI code. muha ha ha!

zikduruqe

There's a CLI version anyway: https://github.com/Picocrypt/CLI

skylurk

Wouldn't that make it femtocrypt?

miroljub

> muha ha ha!

Feeling ok? Do you need some support?

bravesoul2

I can recommend an excellent system prompt, if times are rough

tromp

How does "a very small (hence Pico), very simple, yet very secure encryption tool" come to depend on OpenGL, threatening its future on MacOS?

Timshel

> It's not easy to fix in the code either because it'll require major changes to the GUI library which can get messy, especially since GUIs were never a strength of Go.

Cthulhu_

There just doesn't seem to be a lot of viable competition to web-based UIs these days.

kevingadd

If you're using a portable library that needs to render graphics on mac, it's probably using OpenGL to do it unless it has a platform-specific backend.

nicoburns

Historically, yes. These days it might well be using wgpu.

neom

https://github.com/Picocrypt/Picocrypt/issues/134#issuecomme...

This to me is the crux of the whole thing.

Almost like a knitter throwing away their needles because they saw a loom.

latexr

Considering the author is explicitly going into AI research, has an AI-generated profile picture, and claims front-and-centre on their website they are excited about LLMs, I don’t think that analogy works. Or rather, it is like a knitter throwing away their needles to eagerly go work in the loom manufacturing industry.

rikafurude21

I don't think many people would be excited at the thought of going from handcrafted artisan knitting to babying machines in the knitting factory. You need a certain type of autism to be into the latter.

dewey

The author is a student at a university. There's many paths to take that early in the career, I don't think people have to read too much into it.

stoneyhrm1

I'd think it would be more autistic to continue to use and have interest in something that's been superseded by something far easier and more efficient.

Who would you think is weirder, the person still obsessed with horse & buggies, or the person obsessed with cars?

roenxi

Fortunately this is the software industry. We've got a lot of those autists, and that automating urge is the best part about software. If someone doesn't like the idea of sitting around babysitting factories of machines, they certainly shouldn't go into DevOps. It would be safest to just avoid programming in general, given how much of the industry centres on figuring out how to deploy huge amounts of computing power in an autonomous fashion.

fxtentacle

So basically, he’s leaving software development because the job market is bad. Instead, he’s joining AI research which (currently) has a more healthy job market. That seems pretty reasonable to me, given that even widely used open source projects are only barely financially viable. Many open source projects end when the author finally gets a girlfriend, this one ends for a new job. Seems like a good outcome to me. Plus truly fascinating presentation.

KaiserPro

I mean your analogy is almost there. The loom is pretty old.

What you're grasping for is https://en.wikipedia.org/wiki/Power-loom_riots

Where a precipitous drop in earning power, combined with longer working hours, high inflation, and large companies making people unemployed, caused large social unrest.

And yeah, I can see why they rioted.

FergusArgyll

Yeah, life has just been on a steady decline since 1826. Who wants all this food and medicine anyway?

KaiserPro

I mean, if you want to make that argument then sure, but those riots were one step on a long path to workers' rights. The lesson here is that if we avoid exploiting workers and/or throwing them out on their arses, we can sidestep a load of social upheaval.

Or we can not, and just end up having a bloodbath.

uyzstvqs

TLDR: Author of an open-source project has a crashout over other people using LLMs for coding, believes that AI will replace all developers, and decides to preemptively give up on software engineering entirely because of that.

IMO anyone who understands AI at a technical level will understand that this won't happen. No matter how many parameters, how much training, and how much compute you throw at it, putting AI in direct charge of anything that's critical and not entirely predictable is going to backfire. Though, based on the response from this author, it should be apparent that it comes from a place of emotion, misunderstanding, and likely conformism to dogmatic anti-AI rhetoric of the same nature, rather than actual reason and logic.

progx

It is a matter of time. In 5 years, no problem. In 10 years, some devs will be replaced. In 15 years, I don't think "pure" developer jobs will exist in most companies.