I help organisations move beyond tick-box compliance and one-off training events. Engaging, evidence-based learning experiences designed to change behaviour.
Outdated compliance training was hurting the Learning & Development Team's reputation. Here's how smart design, data, and strategic thinking helped me build influence and trust, and change perceptions of digital learning at a top UK insurer.
Date: 2023-25
Audience size: 10,000
Tools used:
Articulate Storyline, Rise, 7Taps
Adobe Creative Cloud: Illustrator, After Effects, Photoshop, Premiere Pro
Power Automate, Power BI
Workday Learn, SumTotal

How I went from Excel pivot tables to machine learning – and why the real achievement was changing how my team makes decisions.
Date: 2022-23
Tools used:
Data Analysis, Automation, Data Storytelling
Excel, Power BI, Python
Power Automate
Workday Learn, SumTotal

How I joined nine enterprise datasets, analysed 1.2 million records, and connected learning interventions to real contact centre performance for the first time.
Date: 2023
Tools used:
Data Engineering, Data Analysis, Regression Modelling, Machine Learning
Stakeholder Strategy
Python, Power BI
How I put professional video production capability in the hands of colleagues across the business – no technical expertise required.
Date: 2024-25
Audience: Hundreds of internal collaborators from across the organisation.
Technology:
RapidMooc Pro+
Learning Glass
Video by Blackmagic
Audio by Audient, AKG, and Boya
I combine creativity, analytics, and instructional expertise to deliver impactful digital learning experiences.
Core strengths include:
Instructional design: Designing innovative, learner-focused experiences grounded in adult learning principles, cognitive science, and modern instructional theory.
Multimedia production: Creating professional digital media, including video, motion, graphic design, audio production and sound design, narration, and interactive content using authoring tools or HTML/CSS/JS.
Data analytics, statistics and machine learning: Leveraging Python, Power BI, Tableau and Excel to analyse learner behaviour, predict training effectiveness, and drive continuous improvement.
Leadership & coaching: Experience mentoring, coaching, and embedding industry-leading practices to elevate instructional quality and strengthen team culture.
Fluent across tools and technology: Articulate Storyline & Rise, Adobe Creative Cloud (Illustrator, Photoshop, After Effects, Premiere Pro, InDesign), HTML/CSS/JS and Python, MS 365, Power BI, Tableau, Learning Management Systems (Cornerstone, Workday, Success Factors, SumTotal).
Jan 2021 – present
Digital Design Consultant
Major UK Insurer
End-to-end learning consultancy for a major UK insurer, specialising in digital and data-driven solutions. Accountabilities include stakeholder engagement, project management, needs analysis, instructional design, creative development and evaluation.
Managed a mandated regulatory learning programme for a company-wide audience of 10,000 colleagues regulated by the FCA, PRA, ICO, and CMA, among others. Generated over £45k in annual savings. Enhanced the learner experience; average ratings increased from 3.8 to 4.7/5.
Established a multi-functional in-house media production studio, enabling high-quality video creation and user-generated content through professional-grade audio/video, lighting and green screen. Powered by RapidMooc and Learning Glass.
Developed data-driven learning strategies and internal products for customer-facing teams, driving improved performance outcomes through learning and development.
Email: [email protected]
LinkedIn: Follow me on LinkedIn
YouTube: @danfosterUK
Transforming regulatory learning at a major UK insurer
Outdated compliance training was hurting the Learning & Development Team's reputation. Here's how smart design, data, and strategic thinking helped me build influence and trust, and change perceptions of digital learning at a top UK insurer.
Date: 2023-25
Audience size: 10,000
Why was this project a success?
I used data to create a mandate for change: Nobody wants to do a bad job. But user data told us that our colleagues demanded more. I created an interactive dashboard to visualise user feedback and began to build the case for change – first with my leadership team, then with each of the 12 compliance topic owners.
I brought a compelling vision: “I want to build something we can be proud of. Something that feels modern and impactful. I want to present your topic in a way that delivers value for our business and our people. No more eye-rolling every quarter. I have the skills and vision to get us there, all I need is your trust and expertise.”
I made decision-making the focus of compliance learning: Reorienting my stakeholders away from what people need to know, to what people need to do was crucial. Following this change, training centres on action rather than theory – identifying triggers and choosing the appropriate response.
What was the impact?
Improved user response: I increased average ratings from 3.8/5 to a sustained 4.7/5 while increasing the volume of responses – over 380,000 user reviews were submitted over three years.
Reduced development and project management costs: I brought all work in-house, slashing annual supplier costs by £45k. I also managed project planning, co-ordination, and implementation.
Increased demand for regulatory & compliance learning: I worked with new stakeholders to add topics like Data Ethics and Artificial Intelligence, Diversity, Equity and Inclusion, Anti-Bullying and Harassment, and Economic Crime to the traditional compliance curriculum.
Credibility for the Digital Learning Team: The sustained success of this initiative created a platform for my team to collaborate with business functions across the organisation – moving beyond compliance into behaviour change and strategic enablement.
Tools and technology
Articulate Storyline, Rise, 7Taps
Adobe Creative Cloud: Illustrator, After Effects, Photoshop, Premiere Pro
Power Automate, Power BI
Workday Learn, SumTotal
How I went from Excel pivot tables to machine learning – and why the real achievement was changing how my team makes decisions.
Date: 2022-23

Most L&D teams measure learning through completion rates and post-course surveys. I wanted to know more. Over the course of a Level 4 Data Analyst apprenticeship, I designed and delivered three analysis projects that gave my team something it had never had before: a systematic, evidence-based view of how learning was performing – and the automated tools to keep it updated.
The technical progression was real, moving from Excel dashboards through to Python and machine learning. But the bigger shift was cultural. By the end, my team was making decisions differently.
The Problem
Learning & Development teams generate enormous amounts of data – LMS interactions, user feedback, training records, survey responses. At Direct Line Group, almost none of it was being used. Reports sat in spreadsheets. Feedback was buried in a web interface nobody checked. Decisions about which content to commission, renew, or retire were based on intuition and anecdote.
I set out to change that – project by project, tool by tool.
Overview: Three Projects, One Direction
Project 1: MindGym Usage Dashboard
Who is using our learning content, and when?
Built in Excel with Power Pivot. Revealed that promotional campaigns drove short-term spikes but not sustained engagement — directly informing a commercial contract decision.
Project 2: Learning Management System Ratings
What do learners actually think?
Built in Power BI with full end-to-end automation via Power Automate. Turned a feedback system nobody was reading into a weekly, self-updating intelligence tool.
Project 3: ILT Ratings – Does quality of materials predict learner confidence?
What actually drives learning effectiveness?
Built in Python with Jupyter Notebooks and scikit-learn. Used linear regression and statistical hypothesis testing to find a measurable link between training material quality and learner confidence.
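The shape of the Project 3 analysis can be sketched in a few lines. This is a minimal illustration only, not the actual analysis: the data below is synthetic, and the variable names (quality ratings, confidence scores) are stand-ins for the real survey fields.

```python
# Minimal sketch of the Project 3 approach: a simple linear regression
# testing whether material-quality ratings predict learner confidence.
# All data here is synthetic; column names are illustrative.
import numpy as np
from scipy import stats
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(42)

# Stand-in data: quality ratings (1-5) and confidence scores
quality = rng.uniform(1, 5, size=200).reshape(-1, 1)
confidence = 2.0 + 1.5 * quality.ravel() + rng.normal(0, 1, size=200)

# Fit the regression and measure goodness of fit
model = LinearRegression().fit(quality, confidence)
r_squared = model.score(quality, confidence)

# Hypothesis test: is the association significantly different from zero?
r, p_value = stats.pearsonr(quality.ravel(), confidence)

print(f"slope={model.coef_[0]:.2f}, R²={r_squared:.2f}, p={p_value:.4g}")
```

In the real project the same pattern – fit, check fit quality, then formally test the relationship – is what turns a scatter plot into an evidence-based claim.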
The Outcome
By the end of these three projects, my L&D team had:
A live, self-updating dashboard surfacing learner feedback every Monday morning — without manual intervention
Evidence that one supplier’s content wasn’t performing, leading to a contract not being renewed
A process for identifying and acting on poor-performing courses within days, not months
A statistical baseline for measuring the impact of training material quality on learner outcomes
A leadership team aligned around investing in performance data for the first time
And a proof of concept: that end-to-end data automation was possible within our existing tech stack.
Tools and technology
Data Analysis, Automation, Data Storytelling
Excel, Power BI, Python
Power Automate
Workday Learn, SumTotal
Finally, an answer.
How I joined nine enterprise datasets, analysed 1.2 million records, and connected learning interventions to real contact centre performance for the first time.
Date: 2023

It’s the question every L&D professional should be able to answer – and almost none can. For years, DLG’s Learning Experience team evaluated training through engagement rates and user feedback. Both are useful. Neither tells you whether the training changed anything.
For my Level 4 Data Analyst EPA project, I set out to answer it properly. That meant accessing data that L&D had never touched, building relationships across HR, operations, and a data warehouse team, and engineering a dataset complex enough to track individual employees from their first week on the phones through to twelve months in role.
The answer – and what to do with it – is what this case study is about.
The Problem
My organisation’s Motor Insurance contact centre runs a structured induction programme for all new Retention handlers. These employees are critical to the business: their role is to prevent customers from cancelling their policies. Marginal improvements in their performance translate directly to revenue.
Despite this, nobody had ever looked at whether the induction programme was working. Not because the data didn’t exist – it did, in abundance – but because it lived across three separate enterprise systems that L&D had no access to, no relationships with, and no framework for joining.
The hypothesis was simple: if we can benchmark new inductee performance against experienced staff, and track how that gap closes over time, we can make targeted decisions about training content, timing, and sequencing.
What I Did
Data sourcing and access:
I worked with the Motor Performance team to gain access to trading reports held on DLG’s AWS cloud data warehouse — data that L&D had never accessed before. Nine source files were identified across three systems (Workday, iLearn, and the iMI trading platform), totalling over 1.2 million records and requiring a secure Citrix connection for export.
The analysis, in brief:
After an extensive data engineering phase, I conducted an exploratory analysis of Retention KPIs — call volumes, call-to-save rates, renewals, customer feedback (MyCustomer), and quality assurance scores — across four employee populations: new inductees, existing staff, offshore contractors, and Sensee third-party contractors. I then built four linear regression models (one per tenure band) to predict the number of policies saved per 100 calls, and investigated the relationship between cross-skill training and performance uplift.
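The per-tenure-band modelling described above follows a standard pattern: band employees by tenure, then fit one model per band. The sketch below illustrates that pattern only – the column names, tenure bands, and figures are invented stand-ins, not the real Workday/iLearn/iMI schema.

```python
# Illustrative sketch of the per-tenure-band regression approach.
# All data and column names are placeholders, not the real schema.
import pandas as pd
from sklearn.linear_model import LinearRegression

# Stand-in for the joined dataset: one row per handler per period
data = pd.DataFrame({
    "employee_id": [1, 1, 2, 2, 3, 3, 4, 4],
    "tenure_months": [2, 8, 3, 14, 5, 11, 1, 7],
    "calls": [400, 450, 380, 500, 420, 470, 300, 410],
    "saves": [48, 90, 50, 115, 63, 99, 30, 78],
})
# The target metric: policies saved per 100 calls
data["saves_per_100_calls"] = data["saves"] / data["calls"] * 100

# Band tenure, then fit one regression per band (mirroring the
# "four models, one per tenure band" structure described above)
data["tenure_band"] = pd.cut(
    data["tenure_months"], bins=[0, 3, 6, 12, 24],
    labels=["0-3m", "3-6m", "6-12m", "12m+"],
)

models = {}
for band, group in data.groupby("tenure_band", observed=True):
    if len(group) < 2:
        continue  # not enough rows in this toy data to fit a line
    X = group[["tenure_months"]]
    y = group["saves_per_100_calls"]
    models[band] = LinearRegression().fit(X, y)
```

Splitting by band rather than fitting one pooled model lets each tenure stage have its own baseline and trend, which is what makes a "when do inductees catch up?" comparison possible.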
The recommendation:
New inductees become broadly comparable to experienced staff on most KPIs at around four to six months. Call-to-save — the most commercially significant metric — takes longer, and the data suggests the turning point coincides with a cross-skill training programme typically delivered at six months. My recommendation: pilot bringing elements of that training forward, measure the results against the benchmarks established here, and use A/B testing to validate impact.
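In outline, the proposed A/B validation is a simple two-sample comparison of call-to-save rates between a pilot cohort (earlier cross-skill training) and a control cohort (standard timing). The sketch below shows the shape of that test; the cohort sizes and figures are invented for illustration.

```python
# Sketch of the proposed A/B validation: compare call-to-save rates
# between a pilot cohort and a control cohort. Figures are illustrative.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Saves per 100 calls for each handler in each cohort (stand-in data)
control = rng.normal(loc=18.0, scale=3.0, size=40)
pilot = rng.normal(loc=20.0, scale=3.0, size=40)

# Welch's t-test: is the difference in means statistically significant?
t_stat, p_value = stats.ttest_ind(pilot, control, equal_var=False)

print(f"mean uplift = {pilot.mean() - control.mean():.2f} "
      f"saves/100 calls, p = {p_value:.3f}")
```

The benchmarks established in this project supply the expected control-group values, so the pilot only needs to demonstrate a significant uplift against them.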
The Outcome
First ever data-driven benchmark of new inductee performance at DLG
Evidence that most contact centre KPIs are comparable between inductees and experienced staff at 4–6 months
Identification of call-to-save as a lagging metric — and a training-based hypothesis for why
A specific, testable recommendation: pilot earlier delivery of cross-skill training
A flagged gap in Rescue Upsell performance that doesn’t improve with tenure — suggesting a training design issue rather than an experience effect
A leadership team newly committed to A/B testing as an evaluation methodology
Tools and technology
Data Engineering, Data Analysis, Regression Modelling, Machine Learning
Stakeholder Strategy
Python, Power BI
Enabling knowledge sharing and amplifying colleague voices through user-generated content.
How I put professional video production capability in the hands of colleagues across the business – no technical expertise required.
Date: 2024-25
Audience: Hundreds of internal collaborators from across the organisation.
Why was this project a success?
The studio was designed for everyone, not just creatives: Most colleagues don’t think of themselves as video producers – and they shouldn’t have to. The studio’s user-friendly system meant anyone could walk in and start recording without technical knowledge. To make the experience even smoother, I created digital training resources accessible via QR code, so users could quickly launch step-by-step guides. The result was a space that felt exciting and empowering rather than intimidating.
It broke a bottleneck that was slowing the business down: Before the studio, video production meant joining a long queue for the company’s London-based Design Team, or settling for low-quality webcam recordings. The studio gave teams across the business a fast, professional alternative – putting high-quality video capability in their hands without the wait, cost, or compromise.
It opened up new ways of learning across the organisation: The impact stretched well beyond the L&D team. Teams who had always relied on face-to-face delivery, like those responsible for health and safety training across our vehicle repair network, were able to create professional video-based training for the first time, reducing costs and enabling on-demand access for a hard-to-reach workforce.
What was the impact?
Lower bar to entry for high-quality User-Generated Content: The combination of an intuitive touchscreen system and QR-accessible guides meant colleagues needed no technical background to produce professional-quality video. Teams who had never considered creating their own content were recording, editing, and publishing independently – shifting the L&D team’s role from bottleneck to enabler.
Amplifying diverse voices: Partnering with the Diversity Trust, the studio became a space for colleagues to share stories that really mattered – LGBTQ+ experiences, faith, family, and caring responsibilities. These weren’t polished corporate messages; they were authentic, personal, and human. Content that would have been difficult or costly to produce any other way became some of the most meaningful material we created.
Reducing costs and unlocking on-demand training for hard-to-reach teams: For geographically dispersed teams who were reliant on face-to-face delivery, the studio was transformative. Health and safety training that once required an instructor to travel to site could now be delivered on-demand via video, reducing delivery costs and making it easier for colleagues to access training when and where they needed it.
Tools and technology
RapidMooc Pro+
Learning Glass
Video by Blackmagic
Audio by Audient, AKG, and Boya
Adobe Creative Cloud
Articulate Rise