From Classroom to Clinic: Making Evidence Travel
Author: Dr Eimear Morrissey
Lecturer in Evidence-Based Healthcare, Centre for Health Research Methodology, School of Nursing and Midwifery, University of Galway, Ireland
If you work in healthcare, you’ve probably felt the gap between excellent research and useful knowledge. Papers are long, time is short, and meetings move quickly. Meanwhile, misinformation spreads online faster than good evidence. The result is that good evidence often stalls at publication rather than shaping the choices people make in the clinic on a Tuesday morning.
Our MSc in Evidence-Based Future Healthcare at the University of Galway was designed to tackle exactly that. It’s a fully online, interprofessional program built for healthcare professionals with real lives and rotas.
We designed the program with collaborative knowledge translation in mind, and we aim to make evidence travel in two ways:
1. By growing confident evidence champions – healthcare professionals who can interpret results, spot traps and communicate them plainly to colleagues and patients.
2. By focusing assessment on real-world, meeting-ready products: briefs, visuals and implementation plans designed for use in actual services.

The challenge
A lot of effort across our healthcare system produces knowledge about care. Far less produces knowledge for care. The outputs many researchers were trained to value – a long report, a dense slide deck, an academic paper – don’t meet the needs of a busy clinic, ward, pharmacy or community service.
We saw two patterns that kept repeating. First, far too often we met skilled healthcare professionals who felt ‘rusty’ around methods: uneasy reading a statistical test, unsure what a systematic review really involved in practice, or intimidated by research jargon. Second, our well-intentioned traditional course assignments didn’t translate into something a team could use the following week. If we wanted evidence to travel, we had to tackle both.
What we tried (and why)
We built the spine of the program so that the learning builds confidence and the assessments produce usable artefacts.
Students take seven modules, plus a year-long capstone project. Every week is anchored by a workbook that sets the aim, explains why it matters and walks through the ‘how’ with one or two small, practical tasks. The workbook links to key readings, templates and examples, and flags anything to bring to the discussion board or (recorded) live tutorial. The schedule is predictable so people can plan around work rotas.
Lower the threshold and build confidence
Each workbook opens with a short, plain-language overview of the method or idea, then moves straight into quick knowledge checks. Step-by-step prompts ask learners to try a small, concrete task in their own context (e.g. frame a clear question, outline a simple search, summarise a key finding in one sentence or note one limitation). A clear weekly checklist keeps things doable. Discussion boards with interdisciplinary peers and live sessions with international experts bring in different professional angles – useful for pressure-testing how an idea might land in different settings.
Design for real use from day one
Two assignments show the idea in action – one service-facing, one public-facing – each ending with a product someone else can actually use.
1) The capstone project: A plan you can take to your manager
Instead of a traditional primary-research thesis, each student chooses a challenge from their own context and develops an evidence-informed intervention with an implementation plan. Practicability is key: the plan includes who does what; timelines and milestones; risks and enablers; fit with existing systems; a realistic 2-year budget (we use a hypothetical €90,000 cap); and a proportionate evaluation plan. The finished report is something students can bring to a service lead or a small funding call and have a real conversation.
2) Person-centred care, translated for the public
In the Person-Centred Care and Shared Decision-Making module, students produce a lay-facing resource (blog, infographic, poster or brochure) on a concrete communication problem in their setting: what’s not working, why it matters, what could improve and what patients or families should know or ask. A short reflection explains their choices on tone, visuals and language for that specific audience. It’s a small assignment that reliably turns hesitant readers into people who can explain evidence clearly to others.
What happened
We’re early in the journey (first cohort 2024–26), so rather than headline numbers we’re concentrating on the communication work itself. This year, our attention is on how people frame questions, summarise findings in plain language and respond to peers from different professional angles on the discussion boards. Over 2025–26 we’ll document, for a sample of outputs, where they were shared beyond the course (if at all), what was adapted for local use and what learners would change next time.
Key take-aways for fellow educators in evidence-based healthcare
1. Assess the thing you want to see in practice. If you want usable knowledge, make the assignment a usable artefact.
2. Design for real lives. Asynchronous core, recordings available and a transparent weekly schedule help learners continue to engage.
3. Build confidence first. Short explainers and immediate application tasks can help students build capability.
4. Make interdisciplinarity structural. Don’t just invite multiple perspectives – require them in the task.
Conclusion
In an interdisciplinary postgraduate setting, collaborative knowledge communication is a shared practice. Our approach is twofold – grow evidence fluency and centre assessment on usable artefacts for real services – and we’ll keep sharing what helps evidence move from classroom to clinic.
To link to this article - DOI: https://doi.org/10.70253/XZJV4680
Disclaimer
The views expressed in this World EBHC Day Blog, as well as any errors or omissions, are the sole responsibility of the author and do not represent the views of the World EBHC Day Steering Committee, Official Partners or Sponsors; nor does it imply endorsement by the aforementioned parties.