

Reminder that Dawkins is in the Epstein files


Goodhart’s Law: When a measure becomes a target, it ceases to be a good measure.
Only some of the new code does. Unfortunately, most of the code base is 10 years old, written in the good ol’ Anarchy Driven Programming paradigm.
Another movie pitch:
Due to a management decision, a code freeze has been mandated on the main branch. 50 feature branches have accumulated, waiting to be merged.
Management has now finally approved lifting the code freeze - but only for 24 hours. Will the poor engineering team manage to merge all the feature branches in time?


I’m lucky to be at a company that hasn’t gone too far down the AI rabbit hole. They have set up Claude Code for us and encourage us to use it if we want, but that’s about it.


The answer is to say “We will try our best, but this is very ambitious.”
If there’s some urgent feature request coming from sales, that means the Thursday deadline is in the contract and it’s already signed.


It’s usually because the engineers were rushed to deliver completely new features because the sales department overpromised again.
Adding to this, transformer models started out in machine translation (the Attention Is All You Need paper is about machine translation). Conceptually it’s the same as previous iterations of machine translation models.
What do you consider an ”AI free” translator? Most, if not all, machine translators rely on some machine learning model.


The most maintainable code is built to be replaced with minimal impact.
How much of the program must be replaced if you remove one module? If you need to replace the entire program, then your program is not maintainable: too much of it depends heavily on that one module.
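A minimal sketch of the idea, with hypothetical names (`Storage`, `greet_user` are made up for illustration): hide a module behind a small interface, and the rest of the program never needs to change when the module is swapped out.

```python
from typing import Protocol

class Storage(Protocol):
    """The interface the rest of the program depends on."""
    def save(self, key: str, value: str) -> None: ...
    def load(self, key: str) -> str: ...

class InMemoryStorage:
    """One concrete module; could be replaced by a database-backed one."""
    def __init__(self) -> None:
        self._data: dict[str, str] = {}

    def save(self, key: str, value: str) -> None:
        self._data[key] = value

    def load(self, key: str) -> str:
        return self._data[key]

def greet_user(storage: Storage, user_id: str) -> str:
    # Depends only on the Storage interface, not the concrete class,
    # so removing InMemoryStorage forces no change here.
    return f"Hello, {storage.load(user_id)}!"

store = InMemoryStorage()
store.save("u1", "Ada")
print(greet_user(store, "u1"))  # Hello, Ada!
```

Here the blast radius of replacing `InMemoryStorage` is one class; if `greet_user` had reached into `_data` directly, replacing the storage module would mean rewriting every caller too.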


That’s the joke


Great to use DuckDuckGo. Don’t want to rely on Google for anything.


Legal or not, using code generation to bypass GPL is just shitty behavior. Personally I think this counts as ”derived work” and should remain GPL.
I bought new RAM last year and it has already gone up 4.5x in price. It’s beyond crazy.


I’ve tested Claude Code at my work. I think it’s impressive what it can do. I give it some vague instruction, and it still manages to locate where to make the code change.
However, I don’t think it’s making me more productive, and I probably won’t continue using it. It often gives subpar results, and the time saved is often minimal. Writing the code isn’t the bottleneck for me either, so there’s little time to save in the first place.
It also detaches me from the code in ways I don’t like. My job as a programmer isn’t only to write code. More importantly, it’s also about being an ambassador for the code I write. My role as a ”code ambassador” is going to be more difficult if I didn’t write the code myself.
LLMs are only good if you value quantity over quality. Kind of like how some executives think the number of lines of code is a reliable productivity metric.


It actually has 100% accuracy


Seize the means of prod!


Enhance!
And not once have I regretted removing inheritance.
I’m betting that if I ever need to catch up with this nonsense, everything will be so streamlined that no catching up is needed. All the ”prompt engineering” and manually crafted SKILLS md files or whatever will just be a thing of the past.
Until then, I’m happy to avoid AI.