The rising popularity of generative artificial intelligence and chatbot features in systems like ChatGPT for supporting mental health requires research into their long-term risks and benefits. Mental health concerns are on the rise around the world, and this heightened demand has exacerbated wait times, limited access, and generally made it harder for those seeking services to be referred to and connected with mental health support. ChatGPT has the potential to become a simple, cost-effective, and easy-to-access solution to this increased demand, eventually filling a gap in services for many people who are suffering. Empathy is the cornerstone of the therapeutic relationship, and understanding how users perceive ChatGPT’s empathy is imperative to building safe and effective mental health support.
Eight participants engaged in an interactive mental health scenario with ChatGPT and then took part in a virtual semi-structured interview to share their reflections on how empathy was experienced during the exchange. To support and prompt further reflection, they also completed the Perceived Empathy of Technology Scale (PETS), a brief survey focused on their perceptions. Interview narratives formed the primary source of insight, with survey responses used to enrich and triangulate emerging themes. Inductive thematic analysis was used to identify key patterns in participants’ perceptions of empathy, and basic descriptive statistics were used to triangulate and corroborate the qualitative findings. Results from this qualitatively driven study indicate that the perception of empathy is nuanced; overall, participants reported experiencing cognitive empathy from ChatGPT when its responses mirrored their feelings and used validating language. These findings suggest that ChatGPT can be perceived as empathic by users seeking mental health support, a subjective experience reflected in both the post-activity interviews and the survey responses.