The Storm Hits the Art Market
news.artnet.com
Chat Control Must Be Stopped
privacyguides.org
Is OOXML Artificially Complex?
hsu.cy
NPM debug and chalk packages compromised
aikido.dev
Experimenting with Local LLMs on macOS
blog.6nok.org
Liquid Glass in the Browser: Refraction with CSS and SVG
kube.io
Immich – High performance self-hosted photo and video management
github.com
Ex-WhatsApp cybersecurity head says Meta endangered billions of users
theguardian.com
Will Amazon S3 Vectors kill vector databases or save them?
zilliz.com
World Nuclear Association Welcomes Microsoft Corporation as Newest Member
world-nuclear.org
Contracts for C (Early Stages)
gustedt.wordpress.com
The key points of "Working Effectively with Legacy Code"
understandlegacycode.com
Tesla market share in US drops to lowest since 2017
reuters.com
David Walker's Paper Clip Collection
presentandcorrect.com
Learning the soroban rapid mental calculation as an adult
github.com
Seedship [Text-Based Game]
philome.la
AMD claims Arm ISA doesn't offer efficiency advantage over x86
techpowerup.com
Job mismatch and early career success
nber.org
So Long [Nova Launcher's FOSS release blocked by its owners, despite obligations]
teslacoilapps.com
A clickable visual guide to the Rust type system
rustcurious.com
Red Hat back-office team to be Big and Blue whether they like it or not
theregister.com
What are the next steps after the install? Discover that local LLMs are inadequate or freeze the machine (as stated in the article)?
If you are at NIH, perhaps wait until you are fired?
It is very sad that the whole scientific ecosystem is jumping on the hype train. There are no interesting articles any longer, no real scientific discoveries. Just article after article about how to feed the bureaucratic LLM machinery and become a good apparatchik within it.