> I think the broader issue here is people using ChatGPT as their own personal therapist.
It's easy to blame the user, but we can think of some trivial cases where we wouldn't blame the user at all.*
In this, like all things, context is king.
* One example passed around a lot was a person who was hearing voices and left their family, believing the family was torturing them with the voices. If that's too concrete and/or possibly fake, consider instead some age group under N years old: we'd be sympathetic to anyone that young who got bad advice.