People choosing to use an LLM for therapeutic support probably do so believing that, while perhaps not as beneficial as a human therapist, it is better than nothing. Early RCT results[0] do seem promising.
It's not equivalent to choosing to drink poison, which would very clearly be far worse than nothing.
[0]: https://ai.nejm.org/doi/full/10.1056/AIoa2400802