• cyrano@lemmy.dbzer0.comOP
      6 days ago

Must be because AI 🤔 The study itself suggests this could be an improvement, not a replacement of a therapist by AI:

      Using different measures, we then confirmed that responses written by ChatGPT were rated higher than the therapist’s responses suggesting these differences may be explained by part-of-speech and response sentiment. This may be an early indication that ChatGPT has the potential to improve psychotherapeutic processes. We anticipate that this work may lead to the development of different methods of testing and creating psychotherapeutic interventions. Further, we discuss limitations (including the lack of the therapeutic context), and how continued research in this area may lead to improved efficacy of psychotherapeutic interventions allowing such interventions to be placed in the hands of individuals who need them the most.

      • haui@lemmy.giftedmc.com
        5 days ago

It's possible that AI was the reason.

But the far bigger reason is that ChatGPT does not mix with mental health. Not because it is bad, but because it is unreliable, potentially hazardous, controlled by a corporation (and soon elonazi iirc), and has the potential to manipulate a person into even worse things than advertising currently does.

Science should take an ABSOLUTELY NOT stance towards ChatGPT (and all corporate AI) in therapy.

Please, everyone, watch the mandatory viewing exercise called Idiocracy as many times as needed until it is understood that we are on the totally wrong path.

    • Delta_V@lemmy.world
      5 days ago

      I also think the anti-AI bandwagon is sus.

      But this time I downvoted because the linked website appears to be broken:

There’s no article, it’s just a headline and a stock photo.

    • Ephera@lemmy.ml
      5 days ago

It’s the usual ridiculous premise of “AI is better, if you remove all the aspects that are actually involved in doing the thing”.
A real therapist looks at emotions, at your dynamic when you talk about your conflicts. They throw in suggestions to gauge your reaction, and they can get to know you and push you towards a positive outcome over many sessions. In particular, they also serve as a form of mediation: having a person who holds you accountable for disrespectful behavior and ridiculous expectations can do a lot to bring a couple back on track. A chatbot just does not do these things.

      But on top of that, even the methodology seems to be flawed. A person liking a response is not an indication of it being good. Sometimes, a therapist needs to tell you that you’re being an ass.

      Like, yeah, the research itself has some amount of merit. A chatbot therapist might be better than having no therapist at all (so long as it doesn’t encourage self-harm). But the headline and the premise of the article is so far removed from any truth that it doesn’t belong in a science community.