Presence: Supporting Individual Practice and Reflection
Coaching depends on trust, attention, and a deep sense of presence. Rather than interrupting this, AI can offer support behind the scenes, providing session summaries, transcripts, and feedback that help coaches reflect without becoming sidetracked during live interactions. Tools like Otter.ai and emerging coaching platforms can capture key themes or track changes in tone and speech patterns, supporting reflection aligned with Schön’s (1983) concept of “reflection-on-action” and Kolb’s (1984) Experiential Learning Cycle.
Passmore, Olafsson, and Tee (2025) highlight how AI can help identify patterns over time, offering coaches insight into their questioning style or pacing. These insights allow a deeper review of practice, potentially strengthening a coach’s confidence and sharpening judgement.
However, while AI can assist in capturing and organising what was said, it cannot interpret what was meant in the way a human coach can. Goleman’s (1995) work underscores why empathy and presence cannot be replaced by algorithms, and Boyatzis, Smith, and Beveridge (2013) highlight the irreplaceable power of emotional attunement. Used wisely, AI should ease cognitive load and free coaches to engage fully in the human connection.
As AI becomes better at handling structured coaching processes, it falls to human coaches to lean into what it cannot do: emotional intelligence, intuition, connection. When we rely too heavily on structure, coaching can start to feel mechanical or transactional, a concern many experienced supervisors and trainers share.
Progress: Scaling Coaching and Expanding Strategic Insight
AI’s value becomes more apparent at scale. Organisations are increasingly using digital platforms to deliver personalised coaching. Tools like BetterUp apply algorithmic matching, goal tracking, and dashboards to streamline access and measure outcomes.
Terblanche (2024) argues that generative AI can better align coaching with organisational goals. Anonymised data can highlight common capability gaps, wellbeing risks, or cultural patterns.
Some findings support this approach. Barger (2025) found no significant difference in working alliance between human coaches and AI avatars; still, concerns remain. Plotkina and Sri Ramalu (2024) suggest AI is most effective for structured tasks like prompts or feedback, not complex dialogue.
Bachkirova and Kemp (2024) warn against mistaking task-based assistance for true coaching depth, especially in leadership, change, and ethical contexts.
As coaching becomes more standardised, it risks becoming a checklist: the more structured the process, the easier it is to replicate. Coaches who embrace discomfort, trust their instincts, and allow space for exploration preserve what makes coaching distinct.

Data privacy, algorithmic bias, and transparency also matter. Coaching depends on trust, and AI tools can raise concerns around consent and confidentiality. Jones, Woods, and Zhou (2024) stress the need for clear roles between coaches, clients, and platform providers, while Floridi et al. (2018) offer an ethical framework grounded in beneficence, justice, autonomy, and explicability, principles that align closely with coaching values.
Rather than viewing AI as a threat, coaches may find it more useful to treat it as a mirror: it can reveal blind spots and patterns, but only humans can decide what to do with that reflection.
Possibility: Where Practice Might Go Next
AI tools used in adjacent fields, such as therapy, offer ideas for coaching. For example, Eleos Health has shown improved outcomes through post-session analysis and theme detection: in a recent trial, clients of therapists using Eleos saw a 34% drop in depression symptoms and a 29% drop in anxiety (JMIR, 2023). While not directly applicable to coaching, this model of digital support alongside human insight may be worth exploring.
Still, human coaches do something AI cannot. They can stay with discomfort, navigate complexity, and hold space for contradiction. As Blakey and Day (2012) argue, the ‘zone of uncomfortable debate’ is often where real change begins. Coaching that provokes thought and emotion in a safe way is deeply human work.
AI may not offer a final answer, but it introduces a new set of questions. It can reduce access barriers and provide new insight, but it also brings trade-offs. Heron’s (1999) Six Categories of Intervention (prescriptive, informative, confronting, cathartic, catalytic, and supportive) remind us that coaching requires judgement, sensitivity, and presence. If coaches lean too heavily on AI outputs, these qualities risk being diluted.
Final Thought
This is not the end. AI will not replace coaching, but it may reshape how we approach it. Reflective coaches, ethical organisations, and well-designed tools will shape its future.
In a world of intelligent systems, the most impactful coaches will be those who invest in self-awareness, recognise their blind spots, and lead with authenticity. No algorithm can replace that.
Action Point
Choose an area of your coaching practice or organisational coaching strategy where AI is, or could be, introduced. Rather than asking “Can AI do this?”, explore “How can AI support this without replacing the human elements that coaching requires?” Reflect on how presence, progress, and possibility show up in your work. Use tools like reflective journaling or peer supervision to explore how technology might enhance rather than diminish your impact. Stay curious. Stay human. But don’t stand still.