The Illusion of Understanding: Digital Empathy and Emotional Mimicry in AI-Driven Education

Tamara Vučenović1[0000-0003-4152-4372] and Marija Maksimović2[0009-0006-6013-079X],
1,2 Belgrade Metropolitan University, Serbia
tamara.vucenovic@metropolitan.ac.rs
marija.maksimovic@metropolitan.ac.rs
DOI: 10.46793/eLearning2025.348V

Abstract. The phenomenon of increasing algorithmic comfort can be described as a process in which apparent security turns into a serpent-embrace-like pressure: growing automation produces a tension between comfort and suffocation, and a dynamic of resistance, adaptation, and release. In an age when education is shaped by code, one rule has prevailed: “no more endless hours creating something.” Efficiency has become the new holy grail, but it is increasingly forgotten that this convenience comes at a price – the loss of creative effort and human authenticity. Within the Education 4.0 paradigm, contemporary education strives to develop the skills of the future – creativity, critical thinking, and digital literacy – while at the same time relying ever more heavily on large language models (LLMs) such as ChatGPT. We cannot help but wonder whether this approach, while opening up possibilities for new forms of learning, actually leads to a superficial automation of the process that ignores a key component of learning – the emotional dimension. The emotional mimicry that large language models achieve by imitating empathy and using apparently empathic phrases remains on the surface, as confirmed by research showing that students who use LLM tools exhibit less cognitive effort and lower-quality reasoning – indicators of emotional shallowness and intellectual passivity. The findings of a survey conducted on a sample of 60 students show that young users recognize and support the practical benefits of AI tutors, yet at the same time express concerns about cognitive addiction and illusory empathy. This study examines the consequences of LLM-tutor-based educational practices for intellectual identity, emotional depth, and the development of confidence.
The findings, including the survey results, indicate that interaction with algorithmic tools reduces creative and semantic brain activity, encouraging superficial automatism and “digital dementia,” i.e., a decline in memory, concentration, and critical reasoning. In response, the paper proposes digital emotional literacy as a key pedagogical strategy and a safeguard against the manipulation of empathy, addictive reliance on tutors, and the substitution of algorithmic comfort for emotional reality.

Keywords: Artificial Intelligence in Education, Emotional Intelligence, Simulation of Empathy.

References

  1. Prensky, M. (2001). Digital natives, digital immigrants. On the Horizon, 9(5), 1–6. https://doi.org/10.1108/10748120110424816
  2. Kurian, N. (2023). AI’s empathy gap: The risks of conversational Artificial Intelligence for young children’s well-being and key ethical considerations for early childhood education and care. Contemporary Issues in Early Childhood, 26(1), 1–8. https://doi.org/10.1177/14639491231206004
  3. Ta, V., Holländer, A., & Krämer, N. C. (2021). Too much information? Exploring the relationship between information disclosure and the development of parasocial relationships with virtual agents. Computers in Human Behavior, 120, 106756. https://doi.org/10.1016/j.chb.2021.106756
  4. Scheutz, M. (2012). The inherent dangers of unidirectional emotional bonds between humans and social robots. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot ethics: The ethical and social implications of robotics (pp. xx–xx). MIT Press.
  5. Chartrand, T. L., & Bargh, J. A. (1999). The chameleon effect: The perception–behavior link and social interaction. Journal of Personality and Social Psychology, 76(6), 893–910. https://doi.org/10.1037/0022-3514.76.6.893
  6. Nass, C., & Moon, Y. (2000). Machines and mindlessness: Social responses to computers. Journal of Social Issues, 56(1), 81–103. https://doi.org/10.1111/0022-4537.00153
  7. Rutledge, P. B. (2025, June 24). Kids and chatbots: When AI feels like a friend. Psychology Today. https://www.psychologytoday.com/intl/blog/positively-media/202506/kids-and-chatbots-when-ai-feels-like-a-friend
  8. Storm, B. C., Stone, S. M., & Benjamin, A. S. (2017). Using the internet to access information inflates future use of the internet to access other information. Memory, 25(6), 717–723. https://doi.org/10.1080/09658211.2016.1210171
  9. Holmes, W., Bialik, M., & Fadel, C. (2019). Artificial intelligence in education: Promises and implications for teaching and learning. Boston, MA: Center for Curriculum Redesign.
  10. Roll, I., & Wylie, R. (2016). Evolution and revolution in artificial intelligence in education. International Journal of Artificial Intelligence in Education, 26(2), 582–599.
  11. Spitzer, M. (2012). Digitale Demenz: Wie wir uns und unsere Kinder um den Verstand bringen [Digital dementia: How we are driving ourselves and our children out of our minds]. München, Germany: Droemer Knaur.
  12. Kanbaj, A., Demir, H., & Yilmaz, F. (2025). Digital dementia: Early exposure to digital media and cognitive risks in adolescents. Journal of Neurological Sciences, 42(1), 55–67.
  13. Paul, K. (2023, May 3). Snapchat’s My AI chatbot offers unsafe advice to teens, investigation finds. The Guardian. https://www.theguardian.com/
  14. Vučenović, T., & Stojanović, M. (2024). Digital Communications: Management, Marketing Strategies and Practical Examples. Novi Sad: Academic book.
  15. Ayers, J. W., Poliak, A., Dredze, M., Leas, E. C., Zhu, Z., Kelley, J. B., Faix, D. J., Goodman, A. M., Longhurst, C. A., & Hogarth, M. (2023). Comparing physician and artificial intelligence chatbot responses to patient questions posted to a public social media forum. JAMA Internal Medicine, 183(6), 589–596. https://doi.org/10.1001/jamainternmed.2023.1838
  16. Chen, D., Leung, C., Chen, B., Miao, C., & Barnes, B. (2024). Physician and artificial intelligence chatbot responses to cancer patient questions posted on a social media forum: A comparison of empathy and quality. JAMA Oncology. Advance online publication. https://doi.org/10.1001/jamaoncol.2024.0836
  17. Survey questionnaire. https://docs.google.com/forms/d/e/1FAIpQLSc6SDCNyuF5xGg3o3eXcqNI7EtiAn-jVoopWvBMGvBXk7-S6w/viewform
  18. Sparrow, B., Liu, J., & Wegner, D. M. (2011). Google effects on memory: Cognitive consequences of having information at our fingertips. Science, 333(6043), 776–778. https://doi.org/10.1126/science.1207745

Source: Proceedings of the 16th International Conference on e-Learning (ELEARNING2025)