Artificial intelligence (AI) has already transformed sectors from healthcare to finance, but could it soon redefine how humans learn, grow, and perform at work? According to a new study led by Jonathan Passmore of the University of Reading’s Henley Business School, the answer may be yes.
The paper, published in the Journal of Work-Applied Management and titled “A systematic literature review of artificial intelligence (AI) in coaching: insights for future research and product development,” offers one of the most comprehensive reviews to date of how AI is being integrated into professional coaching.
Drawing on 16 empirical studies published between 2019 and 2024, the research team, which also included Bergsveinn Olafsson from Claremont Graduate University and David Tee from the University of Cambridge, investigated whether AI coaching can match the effectiveness, usefulness, and ethical integrity of human-led approaches.
The rise of AI in coaching
In just a few years, large language models (LLMs) such as GPT-4, Google’s BERT, and Meta’s LLaMA have evolved from experimental tools into widely used conversational systems. These models can interpret human language, analyze emotions, and generate realistic dialogue, allowing users to hold conversations that feel authentic.
In coaching, such systems have opened a new frontier. AI can simulate a human coach by asking reflective questions, tracking progress toward goals, and providing feedback in real time. It can also support professional coaches by automating administrative tasks, such as note-taking, progress tracking, and matching clients with coaches.
The review found that AI coaching systems, from text-based chatbots to voice-driven avatars, have already been deployed in diverse contexts. Participants ranged from students and healthcare patients to employees in the banking sector. What unites these applications is the potential to make coaching more accessible, affordable, and data-driven.
Passmore and his colleagues define AI coaching as a “synchronous coaching experience, where the machine replaces the role of the human coach, facilitating the client in goal setting, issue exploration, personal reflection, and developing insights and actions.” In essence, it represents a shift from purely human interaction to a blended, technology-assisted model of personal development.
The effectiveness of AI coaching
The review identified four recurring themes in the empirical literature: the integration of AI into coaching, its usefulness, the impact on outcomes, and the ethical concerns surrounding its use.
The analysis revealed that AI systems excel when they are designed with a narrow and specific focus, such as helping users increase self-awareness, manage anxiety, or improve accountability. In such contexts, algorithms that follow structured frameworks, such as the GROW model or solution-focused coaching, can deliver reliable and replicable results.
However, AI still struggles with emotional nuance and adaptive thinking. Unlike human coaches, AI cannot yet respond to subtle cues such as tone of voice, facial expression, or complex cultural references. This limitation highlights that while AI may match human coaches in structure, it remains inferior in terms of empathy and intuition.
A question of trust and acceptance
One of the most fascinating findings in Passmore’s review is that participants often describe AI coaches as non-judgmental, accessible, and psychologically safe. For many users, interacting with a chatbot reduces the fear of embarrassment or bias that might arise with a human coach.
In fact, several studies cited in the review suggest that trust and technology acceptance may play a greater role in AI coaching effectiveness than traditional “working alliance” measures, which are crucial in human coaching. According to Terblanche and Cilliers (2020), individuals are more likely to engage with AI coaches when they perceive the technology as competent, easy to use, and socially validated by their peers. These findings, however, are questioned by a 2025 quasi-experimental study by Passmore and colleagues, who found that levels of trust were lower for AI coachbots than for human coaches.
User perceptions will shape how people approach self-improvement going forward. As AI systems become more advanced and multimodal, incorporating voice, virtual avatars, and even digital likenesses of trusted and respected individuals, the connection between human users and artificial coaches may deepen, raising both opportunities and ethical questions.
The ethics of digital empathy
While the study celebrates AI’s potential, it also issues a clear warning: ethical frameworks in AI coaching remain underdeveloped.
Concerns about data privacy, bias, and emotional safety were highlighted in multiple studies reviewed by Passmore and colleagues. In one example, participants expressed reluctance to share personal information, such as location data, with AI coaches, citing a limited understanding of how their data might be stored or used.
Another critical issue is AI’s inability to make ethical judgments. Systems such as GPT-4, while capable of simulating empathy and compassion through language, do not truly understand emotional suffering. As Passmore notes, empathy in AI is “cognitive” rather than “affective”, meaning machines can understand emotions but not feel them.
The authors argue that professional bodies such as the International Coaching Federation (ICF) and the European Mentoring and Coaching Council (EMCC) must take a proactive stance in regulating AI coaching practices. Both organisations have begun issuing ethical guidelines on digital coaching, yet enforceable standards comparable to human coaching codes are still lacking.
Future legislation, such as the EU Artificial Intelligence Act (2024), could introduce more stringent oversight, requiring developers to adopt transparent data handling, fairness, and safety protocols.
What AI means for human coaches
For many professionals, the prospect of AI replacing coaches may seem threatening. Yet, the review paints a more nuanced picture. Rather than replacing humans, AI may serve as a complementary partner that enhances human effectiveness.
AI can handle repetitive and administrative tasks, freeing coaches to focus on the interpersonal and emotional dimensions of their work. It can also assist in analysing client data, monitoring progress between sessions, and generating evidence-based recommendations.
Passmore and his co-authors suggest that coaching schools should now include AI literacy in their training curricula. Understanding how AI systems operate and how to integrate them ethically could become a key competency for future practitioners.
From research to real-world innovation
The paper also identifies promising directions for future product development. One major trend involves embodied conversational agents (ECAs): AI avatars that can engage users through voice, text, and visual cues. These systems could eventually take on diverse appearances or even replicate historical and cultural figures, allowing users to “learn” from digital versions of renowned personalities such as Albert Einstein or Marshall Goldsmith.
The authors predict that personalisation and multimodal design will be key to the next generation of AI coaches. This includes allowing users to choose whether to interact via text, voice, or video, and offering diverse avatars representing different genders, ethnicities, and communication styles.
Wearable technology integration could further enhance AI coaching, enabling systems to monitor physiological data, such as sleep patterns or heart rate, to provide context-aware feedback. However, the paper warns that such innovation must not come at the cost of privacy, fairness, or autonomy.
The road ahead
Despite the promising evidence, Passmore’s review underscores that AI in coaching remains a nascent field. Many of the studies analysed involved small sample sizes or specific populations such as students. Future research should employ larger randomised controlled trials to validate findings across diverse demographic and professional groups.
As AI continues to evolve, the balance between technological efficiency and human empathy will define its success in coaching and beyond. For now, the evidence suggests that AI can enhance, but not yet replace, the uniquely human aspects of personal development.
By 2035, the coaching industry, like many others, will look radically different as a result of AI. Researchers should focus on how AI coaching can contribute to human development and wellbeing, while developers must design ethical, transparent systems that serve the needs of users.
- Jonathan Passmore
References
Passmore, J., Olafsson, B., & Tee, D. (2025). A systematic literature review of artificial intelligence (AI) in coaching: insights for future research and product development. Journal of Work-Applied Management. https://doi.org/10.1108/JWAM-11-2024-0164
Passmore, J., Tee, D., Palermo, G., & Rutschmann, R. (2025). Human coaches and AI coaching agents: an exploratory quasi-experimental study of workplace client attitudes. Journal of Work-Applied Management, 1-14. https://doi.org/10.1108/JWAM-02-2025-0032
