
Royal Navy - Enhancing the Defence Learning Environment (DLE).

Overview

This case study is a narrative summary of the work and outcomes. Timings are described in terms of phases/sprints rather than exact dates.

The project enhanced the Defence Learning Environment (DLE), the Royal Navy’s e-learning platform, to improve training accessibility,
flexibility, and effectiveness, ensuring cadets receive solid upfront training that accelerates their preparation for further offline instruction.

What did my UX and UI practice look like?

We conducted research to understand who our main users were and what problems they were facing regarding access to e-learning.

We discovered that personnel at various stages of their careers (recruits, active service members, and those transitioning to civilian life) access the e-learning platform.
To ensure comprehensive insights, we engaged with a wide range of stakeholders, from recruits to officers.


Images of officers and recruits during training sessions and marching drills.

Source images: leolearning.com, www.naval-technology.com, royalnavy.mod.uk,
www.hsdc.ac.uk, Jo Szczepanska (unsplash.com), Amelie Mourichon (unsplash.com), Alvaro Reyes (unsplash.com).

1 - Discovery & Research

We conducted research and workshops to identify the main challenges and explore potential solutions for improving platform accessibility.
The research covered how the RN e-learning platform (DLE) is used, who its current and potential users are, and their habits, emotions, and frustrations.
We then prototyped changes and improvements, tested them across sprints, supported by an adoption programme to change mindsets and facilitate uptake.

Following a bespoke Design Thinking method adapted to the hierarchical nature of military life, we prioritised a critical mass of users, then tested
and iterated on solution hypotheses, involving the client at each stage through findings presentations and workshops.


UX workshops and findings exercise.


During this initial research phase, I visited Royal Navy establishments across the country, including BRNC Dartmouth, HMS Collingwood,
CTCRM Lympstone, and HMS Raleigh, and facilitated workshops to assess the platform's current state. We conducted as-is evaluations with recruits and trainers to identify key areas for improvement.

On-site meetings



User Journey (recruit)

  • Discover: receives joining info; scans QR. Pain point: info scattered. Opportunity: central starter page.
  • Enrol / Sign-in: SSO, device check. Pain point: auth loops and access issues. Opportunity: SSO + access checks.
  • Start: “Start now” CTA. Pain point: “Where do I begin?” Opportunity: guided “first lesson”.
  • Learn: micro-units + scenarios. Pain point: long modules. Opportunity: microlearning + dual coding.
  • Assess: formative checks. Pain point: feedback lag. Opportunity: mastery with remediation.
  • Continue: resume via alerts. Pain point: hard to remember where I left off. Opportunity: pathway + reminders.

2 - Synthesis & Definition

We created several deliverables: business process maps,
impact maps and risk assessments, ecosystem maps, stakeholder maps, persona and empathy maps, user journeys, user flows, flowcharts,
and reports with key findings and advisory solutions grounded in solid problem-solving and UX hypotheses. We also built living prototypes on the legacy system, tested them throughout the project, and created a custom-made design system.

DLE user journey.

Service Blueprint (simplified)

Stages: Discover → Enrol → Start → Learn → Assess → Continue

  • Frontstage (UI): Starter hub → SSO screen → Welcome / CTA → Lesson player → Check / Quiz → Pathway / Alerts
  • Backstage (Logic): Eligibility rules → AuthN/AuthZ → Progress init → Resume (SCORM/xAPI) → Scoring & mastery → Resume & nudges
  • Support (Ops): FAQs/Onboarding → Helpdesk SSO → Champions → Content QA → Assessment policy → Reporting & SLAs
DLE assets and examples: RN service blueprint, on-site UX workshops, and DLE courses.

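To ground the Backstage lane, here is a minimal sketch of the kind of xAPI statement the lesson player could emit when a recruit completes a micro-unit, so progress survives across devices and supports resume. The actor, activity ID, and score below are hypothetical, not the DLE's real configuration.

```python
import json
from datetime import datetime, timezone

def completion_statement(user_email: str, activity_id: str, score: float) -> dict:
    """Build a minimal xAPI 'completed' statement (all IDs hypothetical)."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{user_email}"},
        "verb": {
            "id": "http://adlnet.gov/expapi/verbs/completed",
            "display": {"en-GB": "completed"},
        },
        "object": {"objectType": "Activity", "id": activity_id},
        "result": {"score": {"scaled": score}, "completion": True},
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

# Example: a recruit finishes a micro-unit with 85%.
stmt = completion_statement(
    "recruit@example.mod.uk", "https://dle.example/units/seamanship-101", 0.85
)
print(json.dumps(stmt, indent=2))  # POST this to the LRS statements endpoint
```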

3 - Ideation & Service Design

We made service design recommendations to improve access to e-learning by scoping personas, tasks, and goals; mapping user journeys;
and running workshops and interviews, face-to-face and remote (via Teams), across different RN establishments
to understand current challenges. We also collaborated with key RN stakeholders to shift the culture and embed
a more agile mindset, enabling effective adoption of the e-learning platform.


Royal Navy establishments. BRNC Dartmouth, HMS Collingwood, CTCRM Lympstone and HMS Raleigh.


4 - Prototyping & Testing

We identified and addressed several areas for improvement, starting with the first-time recruit flow:

User Flow (first-time recruit)

Access email/QR → SSO sign-in → Welcome & device check → Start lesson → Assessment → Microlearning & resume → Pathway & alerts
First-time flow with resume and nudges reinforcing continuation.
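As a sketch of the SSO sign-in step, assuming a standard OIDC authorisation-code flow (the issuer, client ID, and redirect URI are hypothetical, not the DLE's real identity provider):

```python
import secrets
from urllib.parse import urlencode

# Hypothetical OIDC settings; the real DLE identity provider is not public.
AUTH_ENDPOINT = "https://sso.example.mod.uk/authorize"
CLIENT_ID = "dle-web"
REDIRECT_URI = "https://dle.example.mod.uk/auth/callback"

def build_login_url() -> tuple[str, str]:
    """Return (login_url, state) for the OIDC authorisation-code flow."""
    state = secrets.token_urlsafe(16)  # anti-CSRF token, checked on callback
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "openid profile",
        "state": state,
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}", state

url, state = build_login_url()
print(url)  # redirect the recruit here; the IdP returns a code to the callback
```

A single SSO entry point like this removes the auth loops recruits reported and feeds the access checks in the blueprint's Backstage lane.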

5 - Adoption, Handover & Next Steps

We prototyped changes and iterated in short sprints, paired with a culture-change programme to shift mindsets and accelerate adoption. Alongside the prototypes, we prepared a practical handover so instructors and recruits could access training faster and continue learning with fewer blockers.

Note: Where the DLE stack differs, these recommendations map 1:1 to other LMSs; references to “Moodle-based” reflect the legacy LMS in use at the time.

Example of an implementation checklist and process I would have followed:

Recommended Learning Styles & Patterns


Microlearning & Mastery Learning

Two complementary approaches that make learning lighter and outcomes stronger. Short bursts for memory; checkpoints and support for true understanding.

Short bites (5–10 min)

Teach one idea at a time in compact units. Easier to fit into busy days and kinder on attention.

  • 5–10 minutes each
    Just enough to learn one concept without overload.
  • Spaced repetition
    Revisit key points on a schedule (e.g., Day 0 → Day 2 → Day 7 → Day 21) so knowledge sticks; see the sketch after this list.
  • Retrieval practice
    Quiz, explain, or apply—actively recalling beats re‑reading.
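A minimal sketch of how those spaced reviews could be scheduled; the Day 0 → 2 → 7 → 21 cadence follows the bullet above, but the exact intervals are an assumption to tune against the training calendar.

```python
from datetime import date, timedelta

# Review offsets in days (Day 0 → 2 → 7 → 21); assumed cadence, not DLE policy.
REVIEW_OFFSETS_DAYS = [0, 2, 7, 21]

def review_schedule(first_study: date) -> list[date]:
    """Return the dates on which a micro-unit should be revisited."""
    return [first_study + timedelta(days=offset) for offset in REVIEW_OFFSETS_DAYS]

for due in review_schedule(date(2024, 1, 8)):
    print(due.isoformat())  # 2024-01-08, 2024-01-10, 2024-01-15, 2024-01-29
```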

Progress only after you’ve got it

Learners advance when they show solid understanding—no one is pushed ahead with shaky basics.

  • Frequent formative checks
    Low‑stakes quizzes, quick exercises, or micro‑projects give timely feedback.
  • Adaptive remediation
    If a check flags a gap, provide targeted practice or another explanation until it’s mastered.
Check: short quiz or task
If pass: move on confidently
If miss: targeted support → try again
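The check, pass, remediate loop above can be sketched as a simple mastery gate; the pass threshold, attempt cap, and remediation hook are assumptions for illustration, not the DLE's assessment policy.

```python
from typing import Callable

PASS_THRESHOLD = 0.8  # assumed mastery bar; set per assessment policy

def mastery_gate(
    take_check: Callable[[], float],
    remediate: Callable[[], None],
    max_attempts: int = 3,
) -> bool:
    """Run the Bite → Check → Support loop until mastery or attempts run out."""
    for _ in range(max_attempts):
        score = take_check()       # short quiz or task, scored 0.0–1.0
        if score >= PASS_THRESHOLD:
            return True            # pass: move on confidently
        remediate()                # miss: targeted support, then try again
    return False                   # still short of mastery: flag for an instructor
```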

How they work together

Microlearning delivers focused content you can actually remember. Mastery learning ensures you don’t move on until you truly understand. Together: right‑sized lessons + feedback loop = durable learning.

  • Bite → Check → Support → Next
    Repeat in small cycles to build momentum and confidence.
  • Better retention, fewer gaps
    Memory is reinforced and misunderstandings are caught early.

Go-Live Runbook (lite)

  1. Week 0: final UAT (desktop/mobile/offline), accessibility spot-check, content freeze.
  2. Week 1: phased rollout to pilot units (e.g., BRNC Dartmouth & HMS Collingwood), enable analytics & alerts.
  3. Week 2: champion feedback loop, fix fast-follows, publish quick-start videos.
  4. Week 3: broader rollout incl. ships/submarines, switch on badges/leaderboards.
  5. Week 4: metrics review (TTFL, completion, mobile starts; see the TTFL sketch below), adjust pathways and nudges.

Note: LMS assumed Moodle-based (legacy DLE). Map steps 1:1 if a different LMS is in use.
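For the Week 4 metrics review, a sketch of how time-to-first-lesson (TTFL) could be computed from event timestamps; the event-log shape and the sample values are assumptions for illustration.

```python
from datetime import datetime
from statistics import median

# Hypothetical per-user events: (first sign-in, first lesson start), ISO 8601.
events = {
    "recruit-a": ("2024-03-04T09:00:00", "2024-03-04T09:12:00"),
    "recruit-b": ("2024-03-04T10:00:00", "2024-03-05T08:30:00"),
    "recruit-c": ("2024-03-05T14:00:00", "2024-03-05T14:05:00"),
}

def ttfl_minutes(signin: str, first_lesson: str) -> float:
    """Minutes between first sign-in and first lesson start."""
    delta = datetime.fromisoformat(first_lesson) - datetime.fromisoformat(signin)
    return delta.total_seconds() / 60

samples = sorted(ttfl_minutes(s, l) for s, l in events.values())
print(f"median TTFL: {median(samples):.0f} min")  # median resists long-tail stragglers
```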

Project at a glance

  • When: Several quarters, delivered in multiple sprints
  • Ownership: ~90% UX (discovery → handover)
  • Team: Product, engineering, training, CS, stakeholders across RN establishments
  • Platforms: Web & mobile; offline packages for ships/submarines
  • LMS: Delivered within the DLE’s legacy LMS (Moodle-based at the time)

*This case study reflects my recollection; details have been generalised for confidentiality.*

Key Results

↓ 25–40% Access drop-off (sign-in/auth & first steps)
↑ 15–25% Mobile course starts
↑ 10–20% Module completion rate (boosted by microlearning & scenarios)
↓ 30–50% Time to first lesson
↑ 50–70% SSO adoption across users

*Indicative ranges shown where exact figures are confidential.


Reflection

Bringing DLE closer to real Royal Navy contexts meant designing for shipboard constraints, security, and a hierarchical training model—while still delivering modern e-learning patterns. By simplifying access (SSO), supporting low-connectivity scenarios (offline content), and prioritising mobile, we shifted learning from “when I can get to a desktop” to “when the moment allows,” increasing engagement and reducing friction. Service design work (personas, journeys, culture change) ensured the platform aligned with training realities, not just UI best practices.

My Contribution

Outcomes are directional; ranges are illustrative where exact figures are confidential or unavailable.

← Back to case studies