• qyron@sopuli.xyz
    5 days ago

    AI models to “aid” in court, listening to witnesses to assess whether a person is telling the truth or lying, are being proposed in my country.

    The argument: it will speed up trials, unclogging the justice system by extension.

    Most lawyers are horrified, as are some judges.

    Meanwhile, a judge has been suspended and reprimanded for using AI tools to write his decisions for him.

    Yes, the bot did hallucinate arguments, and it argued in common-law style, while my country follows the civil-law model.

    • LeninsOvaries@lemmy.cafe
      5 days ago

      You know, an AI designed to tell if a witness is lying would be really useful…

      For autism diagnosis. If the AI thinks the patient is lying no matter what they say, the patient has autism.

        • Blemgo@lemmy.world
          5 days ago

          Uncommon speech patterns and behaviours. People with ASD are more likely to be suspected of lying when they are telling the truth, due to avoidance of eye contact, a lower stress threshold, going off on tangents that seem unrelated to the topic, and uncommon stress reactions like fawning.

          • qyron@sopuli.xyz
            4 days ago

            You’re describing me, but I am not autistic. Can we just agree it is a bad idea altogether?

    • kameecoding@lemmy.world
      5 days ago

      There is also that guy in the US who submitted arguments to the judge that cited hallucinated case law.