• ZDL@lazysoci.al · 2 months ago

    To be fair to the clankfucker, even Windows, one of the worst piles of dog fæces to have ever been foisted on the public for money, knows enough to ask “are you sure?” on mass deletions, etc.

    Doesn’t stop him from being a moron for not backing up, but there you go.

    • Thorry@feddit.org · 2 months ago

      I’m pretty sure it isn’t that easy to do what that dude did; it’s a multi-step process. It doesn’t say, “This will delete your data, are you sure you want to continue?”, but it also isn’t like he clicked the X in the top right and all his data was gone. The wording of the function is pretty clear, and there are plenty of ways to find out what it does. The dude even admits he wanted to know whether he could toggle it and still have access to his data, but instead of asking the chatbot beforehand he just tried it, then cried foul when it actually locked him out.

      • ZDL@lazysoci.al · 2 months ago

        Software made for people to use really should point out the consequences of an action, though, especially when those consequences aren’t immediately obvious from the action requested. There’s a sizable divide between “don’t share my data” and “OK, we’ll delete everything”.

        Don’t get me wrong. ChatGPT is a festering pile of shit even without this. This “professor” should be stripped of his teaching credentials and be thrown into an LLMbecile detox centre, only allowed to exit when he learns to think for himself. These are both true.

        But it is also true that if an action has drastic implications that aren’t an obvious outcome of the requested function, it should warn you.