Why Most Online Training Only Changes Confidence, Not Behaviour

Most organisations treat online training as a success if completion, satisfaction, and confidence scores look good – but those metrics say little about real behaviour change. A recent study comparing slide-based, video-based, and practice-based online training found that only the practice group significantly improved how people actually gave feedback.


When did your last piece of online training actually change how people behaved on a Monday morning?

Completion rates looked fine. Feedback forms were glowing. People said they “felt more confident”. And yet, in the next team meeting, the same old patterns showed up: difficult conversations avoided, quiet people staying quiet, and thorny issues kicked further down the road.


If that sounds familiar, you’re not alone. And uncomfortably, the data suggests this isn’t a “your organisation” problem – it’s a problem with how most online training is designed.

This first article looks at one detailed study of feedback training delivered fully online. In part two, we’ll explore what to do differently if you actually want behaviour change.

The illusion of progress in online training

Online training has given L&D teams superpowers: scale, convenience, and the ability to roll out content to hundreds or thousands of people without leaving a desk.

The catch is that most of our default metrics were built for content, not behaviour:

  • Completion: who clicked through all the modules


  • Satisfaction: who liked the course, the slides, the facilitator


  • Self-reported confidence: who feels more able afterwards


If we’re honest, a lot of online training is designed to move those numbers. We make the slides cleaner, the videos more polished, the interfaces smoother. And people do enjoy it more.


But enjoyment and confidence aren’t the same as doing something differently in a real conversation with a real human.

To see this gap in action, it helps to look at one piece of research that tried to simulate what many organisations are trying to do: teach people how to give better feedback, fully online.

Inside a real-world experiment in online training

The study took a very recognisable scenario: a tense video-conference team meeting during the 2020 lockdown, where one team member keeps interrupting and shutting down a colleague’s ideas. Participants watched this fictional meeting between “Ben” and “Chris” and were then introduced to the SBI feedback model (Situation–Behaviour–Impact): describe the specific situation, the observable behaviour, and the impact it had. Everyone started with the same foundation.

From there, they were randomly assigned to one of three types of online training on how to use SBI:

  1. Slideshow
    A PowerPoint-style instructional deck walking through the SBI model.


  2. Video
    An instructional video covering the same content as the slides.


  3. Practice (online role-play coaching)
    A live video-call session with a role-play coach.
    Participants practised giving SBI feedback several times and received brief, focused feedback from the coach.



All three formats lasted 20 minutes and covered exactly the same content. The only real difference was the modality – more passive (slides/video) versus active practice with a coach.

After that, everyone – regardless of group – had to do the same thing:

  • Join a simulated feedback conversation online with a professional role-play actor playing “Chris”.


  • Give feedback on his behaviour in the team meeting, using whatever they’d learned.


  • Have the conversation recorded and rated by two experienced L&D assessors using a detailed rubric based on the SBI model (“situation”, “behaviour”, “impact”).


So this wasn’t “how did you feel about the training?”
It was: “What did you actually say when you were face-to-face with the difficult colleague?”

What changed: content vs practice

When the researchers looked at the conversations, they found a clear pattern.

Participants in the practice (role-play) group delivered significantly more effective feedback than those who had only seen the slideshow or the video.

  • On a seven-point scale of feedback effectiveness, the practice group scored notably higher than both slideshow and video groups.

  • The differences weren’t trivial: the effect sizes for practice versus video and practice versus slideshow were substantial (Cohen’s d of roughly 1.08 and 0.75 respectively).

  • There was no meaningful difference between the slideshow and video groups – swapping slides for video, or video for slides, didn’t change behaviour.

In other words:

Simply changing the format of passive online training – from deck to video – did very little.
Changing the nature of the training – from consuming to practising – changed how people behaved.

Confidence: everyone feels better. That’s the problem

The study didn’t just look at performance. It also tracked self-reported confidence – how confident people felt about giving feedback – at three points: before the intervention, after the intervention, and after the final simulation.

Two things stood out:

  1. Confidence went up across the board.
    Regardless of whether people had slides, video, or practice, their confidence scores increased significantly after the intervention, and again after the final simulation.

  2. But that boost in confidence did not correlate with how effective they actually were.
    When researchers compared post-simulation self-reported confidence with the assessors’ ratings of feedback effectiveness, the relationship was essentially non-existent.

Participants felt better. Their confidence graphs went up.
Their behaviour? Not necessarily.

To complicate things further, when assessors were asked to rate how confident participants appeared in the conversation, those ratings did strongly correlate with actual performance – especially in the practice group.

So:

  • How confident people say they are after online training → poor indicator of genuine skill.

  • How confident they look in a real conversation → much better indicator.

  • The format that most improved performance (practice) is also the one where observable confidence and ability aligned best.


Where this leaves us (and what comes next)

So far, the picture isn’t flattering:

  • Most online training still leans heavily on passive formats like slides and video.

  • Those formats can make people feel more confident, but don’t reliably change what they do in real conversations.

  • Practice-based formats – like online role-play with a coach – take more effort, but deliver better behavioural outcomes.

In part two of this series, we’ll zoom out and look at:

  • Why authentic, practice-based learning consistently outperforms passive online training

  • Three common traps organisations fall into when designing online training

  • Practical shifts you can make if you want your programmes to change behaviour, not just confidence

Because if your training doesn’t show up in the next difficult conversation, it’s worth asking: does it really count?

Talk to us about your online training