• 2 Posts
  • 11 Comments
Joined 9 months ago
Cake day: June 18th, 2025


  • The pressure isn’t really coming from clients anyway. It’s coming from the web itself, from a decade of bloated pages, dark patterns, and feature arms races that quietly redefined what a “real” website looks like. Clients are just reading the room. The room is wrong, but they’re not imagining it.

    The shift might come from users, not decision-makers. It might come when enough people notice that the fast, calm site was easier to use. That they actually found what they came for. That they didn’t have to close three things before reading a single line.

    Everyone is to blame here:

    clients want flashy websites without considering user experience

    managers don’t translate wants into real needs and just pass the problem on to devs

    devs want to have less work, so they will gladly pull in a random external dependency to satisfy the growing number of wants

    users just accept shitty websites without complaining, even letting themselves take the blame: if X is slow, it must be time to buy a new PC








  • I agree that LLMs are made to be exploratory; this is good because it lets them range over different topics instead of always producing the same output. However, I do not agree that this is a feature for code generation, where the output has to follow a strict ruleset (code syntax, the specification, tests). Whatever errors it generates and people accept are small mistakes that fall within that person’s threshold of acceptance, a tradeoff against the cost of fixing the problem. In some contexts we see people focusing almost exclusively on the short term, which leads to a lot of errors being allowed.
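    That strict ruleset is mechanically checkable, which is the point: a minimal sketch of an acceptance gate for generated code, assuming Python as the target language and using only a syntax check (a real gate would also run the spec’s test suite). `passes_ruleset` is a hypothetical name, not any real tool’s API.

```python
import ast

def passes_ruleset(source: str) -> bool:
    """Hypothetical acceptance gate: generated code must at least parse.

    A real pipeline would additionally run the specification's tests;
    syntax is just the cheapest rule an LLM's output can violate.
    """
    try:
        ast.parse(source)
    except SyntaxError:
        return False
    return True

# A well-formed candidate passes; a near-miss with a dropped colon does not.
print(passes_ruleset("def add(a, b):\n    return a + b"))  # True
print(passes_ruleset("def add(a, b) return a + b"))        # False
```

    The gate is deterministic even though the generator is not, which is exactly the asymmetry the comment is pointing at: exploration on one side, hard rules on the other.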

    Moreover, you cannot say compilers are deterministic. There are situations where they are not (at least from the user’s point of view).

    https://krystalgamer.github.io/high-level-game-patches/

    GCC’s unwarranted behaviour

    In order to keep the code as small as possible I was compiling the code with -Os. Everything was working fine until I started to remove some printfs and started to get some crashes. Moving function calls around also seemed to randomly fix the problem, this was an indication that somehow memory/stack corruption was happening. After a lot of testing, I figured out that if -O2/-O3/-Os were used then the problem would appear. The issue was caused by Interprocedural analysis or IPA. One of its functions is to determine whether registers are polluted across function calls and if not then re-use them.


  • a relative time formatting library that contains no code

    The library is two text files (the code) that are processed by an LLM (the interpreter) to generate code in another language. As a workflow, this is not that new.
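    A minimal sketch of that workflow, with the LLM call replaced by a trivial stand-in so the example stays deterministic: everything here (`SPEC`, `generate_code`, `yell`) is hypothetical and only illustrates the spec-in, code-out shape, not the actual library.

```python
# "Source": a prose spec, the kind of text file the library ships.
SPEC = "function name: yell; behavior: return the input string uppercased"

def generate_code(spec: str) -> str:
    """Stand-in for the LLM step that turns a prose spec into source code.

    A real pipeline would send `spec` to a model; we fake one known case
    so the example is reproducible.
    """
    return "def yell(s):\n    return s.upper()\n"

source = generate_code(SPEC)
namespace = {}
exec(source, namespace)          # "run" the generated code, interpreter-style
print(namespace["yell"]("hi"))   # HI
```

    Seen this way it is just a compiler pipeline with a nondeterministic front end, which is where the problems below come from.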

    I think what makes this worst of all is that the author admits you can’t be sure the library works until you generate the code and test it. Even then you cannot guarantee the security of the generated code, and since you do not understand the code, you also cannot support or patch it.

    When Performance Matters

    If the performance of a datetime processor is not relevant, what is? The author mentions they would like a browser implementation to be fast, documentable, and fixable. However, operating systems, browsers, and other complex systems are made of little utilities like this, with very well documented functionality and side effects.
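    To show how small and fully specifiable such a utility is, here is a sketch of a deterministic relative-time formatter. The thresholds and wording are illustrative assumptions, not any particular library’s behavior, but every case is testable up front, which is precisely what the LLM version cannot promise.

```python
from datetime import datetime, timezone

# Largest unit first; each entry is (name, size in seconds).
_UNITS = [("year", 365 * 86400), ("day", 86400), ("hour", 3600), ("minute", 60)]

def relative_time(then: datetime, now: datetime) -> str:
    """Format how long ago `then` was, relative to `now`."""
    seconds = int((now - then).total_seconds())
    if seconds < 60:
        return "just now"
    for name, size in _UNITS:
        if seconds >= size:
            n = seconds // size
            return f"{n} {name}{'s' if n != 1 else ''} ago"
    return "just now"  # unreachable: the minute branch covers seconds >= 60

print(relative_time(datetime(2025, 6, 18, tzinfo=timezone.utc),
                    datetime(2025, 6, 18, 2, tzinfo=timezone.utc)))  # 2 hours ago
```

    Twenty lines, no dependencies, documented side effects (none), and its whole behavior fits in a test file.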

    But the above isn’t fully baked. Our models will get better, our agents more capable.

    The whole assumption is that instead of creating a good, stable base that anyone can use, we should just be shitting out code until it works.

    Eventually the hardware will be good enough to run a shitty bloated browser, so we don’t need to optimize it.

    Eventually people will harden their PCs enough that we shouldn’t have to care about security.