People are far more likely to lie and cheat when they use AI for tasks, according to an eyebrow-raising new study in the journal Nature.

“Using AI creates a convenient moral distance between people and their actions — it can induce them to request behaviors they wouldn’t necessarily engage in themselves, nor potentially request from other humans,” said behavioral scientist and study co-author Zoe Rahwan, of the Max Planck Institute for Human Development in Berlin, Germany, in a statement about the research.

  • ExtremeDullard@piefed.social · 29 points · 4 hours ago

    Who’s surprised by this?

    If you use AI to do things for you that you could do yourself, fundamentally you cheat.

    I’m not talking about a doctor asking AI to interpret some difficult medical data. In that case, AI is a tool.

    But when you ask it to write a summary of the boring-ass meeting you just slept through, or write a piece of code you can’t be assed to understand properly yourself, or do your homework, you shortcut your responsibilities.

    And at the core of the request lies a profound desire from the requester to get a result without effort, which is kind of morally bankrupt to begin with.

    • eatCasserole@lemmy.world · 1 point · 9 minutes ago

      It’s not surprising in the slightest, but it’s good to have our suspicions validated. Unlike chatbots, we can base our actions on reality, rather than just vibes.

    • Catoblepas@piefed.blahaj.zone · 16 points · 3 hours ago

      AI pushers know this on a fundamental level, and it pisses them off that you won’t go along with it. Same as not letting someone copy your homework in grade school.

      More important than the cheating, I think, is the outright offloading of thinking: the acceptance of no longer having a thought without running it past the computer first, which itself I think is tied to a deep fear of being wrong about something.

    • 13igTyme@piefed.social · 15 points · 3 hours ago

      My wife’s friend is having marriage issues. She wants to go to marriage counseling. He wants to use an AI marriage counselor.

      Guy’s a dumbass who quit his job to start a cryptocurrency business.