3 – More is not necessarily better
In this Chapter: Introduction (this page); Intensive treatments for breast cancer; Mutilating surgery; Bone marrow transplantation; Dare to think about […]
8 – Assessing all the relevant, reliable evidence
In this Chapter: Introduction (this page); Is one study ever enough? Systematic reviews of all the relevant, reliable evidence; Reducing […]
7 – Taking account of the play of chance
In this Chapter: Introduction (this page); Assessing the role that chance may have played in fair tests; What does a […]
6 – Fair Tests of Treatments
In this Chapter: Why are fair tests of treatments needed? The beneficial effects of optimism and wishful thinking; The need […]
5 – Dealing with uncertainty about the effects of treatments
In this Chapter: Introduction (this page); Dramatic treatment effects: rare and readily recognizable; Laser treatment of port-wine stains; Imatinib for […]
4 – Earlier is not necessarily better
In this Chapter: Introduction (this page); Lessons from neuroblastoma screening; Weighing benefits and harms; Phenylketonuria screening: clearly beneficial; Abdominal aortic […]
2 – Hoped-for effects that don’t materialize
In this Chapter: Introduction (this page); Advice on babies’ sleeping position; Drugs to correct heart rhythm abnormalities in patients having […]
Linguistic strategies for improving informed consent in clinical trials among low health literacy patients
Evidence-based guidance on how to improve informed consent processes for patients being invited to participate in clinical research.
Informed Health Choices Podcasts
Each episode includes a short story with an example of a treatment claim and a simple explanation of a Key Concept used to assess that claim.
Informed Health Choices Primary School Resources
A textbook and a teachers’ guide for 10 to 12-year-olds. The textbook includes a comic, exercises and classroom activities.
Ebm@school – a curriculum of critical health literacy for secondary school students
A curriculum of six modules based on the concept of evidence-based medicine.
Confidence Intervals – CASP
The p-value gives no direct indication of how large or important the estimated effect size is. So, confidence intervals are often preferred.
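To make that point concrete, here is a minimal Python sketch (with invented event counts, not figures from the CASP resource) showing how a 95% confidence interval for a risk difference conveys the size and precision of an effect, while the p-value alone does not:
    import math

    def risk_difference_ci(events_a, n_a, events_b, n_b, z=1.96):
        # Normal-approximation 95% CI and two-sided p-value for a difference in risks.
        p_a, p_b = events_a / n_a, events_b / n_b
        diff = p_a - p_b
        se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
        lower, upper = diff - z * se, diff + z * se
        p_value = math.erfc(abs(diff / se) / math.sqrt(2))
        return diff, lower, upper, p_value

    # Hypothetical trial: 30/200 events with treatment vs 45/200 with control.
    diff, lower, upper, p = risk_difference_ci(30, 200, 45, 200)
    print(f"risk difference = {diff:.3f}, 95% CI {lower:.3f} to {upper:.3f}, p = {p:.3f}")
    # The p-value (about 0.05) says little on its own; the CI shows the plausible range
    # of the effect, from roughly a 15 percentage-point reduction to roughly no difference.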
Know Your Chances
This book has been shown in two randomized trials to improve people’s understanding of risk in the context of health care choices.
Philosophy for Children (P4C)
P4C promotes high-quality classroom dialogue in response to children’s own questions about shared stories, films and other stimuli.
Thinking, talking, doing science
An experimental educational intervention for teaching science in primary schools.
McMaster Evidence-Based Clinical Practice Workshop Resources – Therapy module
The therapy module resources provided to attendees at the McMaster Evidence-Based Clinical Practice Workshop.
McMaster Evidence-Based Clinical Practice Workshop Resources – Systematic review module
The systematic review module resources provided to attendees at the McMaster Evidence-Based Clinical Practice Workshop.
How well is the clinical importance of study results reported?
How well is the clinical importance of study results reported?
What is meant by intention to treat analysis? Survey of published randomised controlled trials
Results of a survey to document the meaning of ‘intention to treat’ analysis.
Blinding in clinical trials and other studies
Simon Day and Doug Altman discuss blinding in clinical trials.
Distinguishing between “no evidence of effect” and “evidence of no effect” in randomised controlled trials and other comparisons
Distinguishing between “no evidence of effect” and “evidence of no effect” in randomised controlled trials and other comparisons.
Tips for learners of evidence-based medicine: 1. relative risk reduction, absolute risk reduction and number needed to treat
Relative risk reduction, absolute risk reduction and number needed to treat.
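As a small illustration of these three measures, the Python sketch below uses invented risks (4% with control, 3% with treatment) rather than figures from the article:
    # Invented example: the outcome occurs in 4% of control patients and 3% of treated patients.
    control_risk = 0.04
    treatment_risk = 0.03

    arr = control_risk - treatment_risk   # absolute risk reduction: 0.01 (1 percentage point)
    rrr = arr / control_risk              # relative risk reduction: 0.25 ("25% lower risk")
    nnt = 1 / arr                         # number needed to treat: about 100

    print(f"ARR = {arr:.2%}, RRR = {rrr:.0%}, NNT = {nnt:.0f}")
    # The same result sounds impressive as a 25% relative reduction, but it means
    # treating about 100 people to prevent one outcome.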
Basic statistics for clinicians: 1. Hypothesis testing
The statistical concepts of hypothesis testing and p-values.
Basic statistics for clinicians: 2. Interpreting study results: confidence intervals
Interpreting study results: confidence intervals.
Basic statistics for clinicians: 3. Assessing the effects of treatment: measures of association
Assessing the effects of treatment: measures of association.
Tips for teachers of evidence-based medicine: Relative risk reduction, absolute risk reduction and number needed to treat
Tips for teachers of evidence-based medicine: 1. Relative risk reduction, absolute risk reduction and number needed to treat.
The 2011 Oxford CEBM Levels of Evidence: Introductory Document
The 2011 Oxford Centre for Evidence-Based Medicine’s Levels of Evidence.
Tips and tricks in performing a systematic review
Why to do a systematic review, and what to do when starting one.
Meta-analysis: Its strengths and limitations
The strengths and limitations of meta-analysis.
Meta-analysis, collaborative overview, systematic review: what does it all mean?
Mike Clarke’s 9-minute read on meta-analysis, collaborative overview, systematic review.
The interpretation of clinical trials
Peter Greenberg’s 9-minute read on the interpretation of clinical trials.
Evidence Based Drug Therapy: What Do the Numbers Mean?
Strengths and limitations of different measures of the effects of treatments.
Harm
A University of Massachusetts Medical School text on adverse effects of treatments.
Therapy
A University of Massachusetts Medical School text discussing the strengths and limitations of different measures of the effects of treatment.
What Evidence in Evidence-Based Medicine?
Philosopher John Worrall’s reflections on the evidence used in Evidence-Based Medicine.
You Can’t Trust What You Read About Nutrition
Beware of misleading correlations: apparent links between foods and health outcomes may be chance associations with other factors.
Association is not the same as causation. Let’s say that again: association is not the same as causation!
This article explains how to tell when correlation or association has been confused with causation.
Evidence for the frontline: A report for the Alliance for Useful Evidence
Jonathan Sharples’ introduction to evaluation in education, policing and other public services.
The DIY evaluation guide
The Education Endowment Foundation’s DIY Evaluation Guide for teachers introduces the key principles of educational evaluation.
Using research evidence: a practice guide
NESTA’s guide to using research evidence to inform decisions in policy and practice.
Learning from research: systematic reviews for informing policy decisions
The EPPI Centre’s guide to using systematic reviews to inform policy decisions.
Patients as Consumers: Physician’s conflicts of interest
James Rickert talks with Helen Osborne about looking at healthcare from the perspectives of both a patient and provider.
Critical Appraisal of Research Evidence 101
Ontario Public Health Libraries Association guide to critical appraisal of research evidence.
Policy: twenty tips for interpreting scientific claims
This list will help non-scientists to interrogate advisers and to grasp the limitations of evidence.
What makes a good systematic review?
Oxford University’s Centre for Evidence-Based Intervention explains what makes a good systematic review.
Understanding Health Research: evidence-based medicine, practice and policy
Evidence-based medicine, practice and policy are terms used to describe making decisions using scientific evidence.
Understanding Health Research, a tool for making sense of health studies: use of statistics
In health research, researchers typically use statistics to determine statistical significance and effect size.
Understanding Health Research, a tool for making sense of health studies: Confounders
A confounder (or 'confounding factor') is something, other than the thing being studied, that could be causing the results seen in a study.
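A minimal simulation (my own illustration, not part of the Understanding Health Research tool) can show how a confounder such as age may create an apparent association between an exposure and an outcome even when the exposure has no effect at all:
    import random

    random.seed(1)
    exposed, unexposed = [], []
    for _ in range(100_000):
        older = random.random() < 0.5                              # the confounder
        drinks_coffee = random.random() < (0.7 if older else 0.3)  # exposure depends on age
        outcome = random.random() < (0.20 if older else 0.05)      # outcome depends only on age
        (exposed if drinks_coffee else unexposed).append(outcome)

    def rate(group):
        return sum(group) / len(group)

    print(f"outcome rate, exposed:   {rate(exposed):.3f}")
    print(f"outcome rate, unexposed: {rate(unexposed):.3f}")
    # The two rates differ even though the outcome was generated without any
    # reference to the exposure: age is doing all the work.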
Understanding Health Research: are some types of evidence better than others?
Understanding Health Research, a tool for making sense of health studies: are some types of evidence better than others?
Understanding Health Research: how science media stories work
Understanding Health Research, a tool for making sense of health studies: how science media stories work.
Understanding Health Research: Correlation and Causation
A discussion of the difference between correlation and causation.
Understanding Health Research: A tool for making sense of health studies
An interactive online tool designed to help anybody to understand scientific health research evidence.
Ice bucket challenge “breakthrough”? Experts pour cold water on superficial reporting
Beware claims of treatment “breakthroughs”: they probably aren’t.
The Slippery Slope: Is a Surrogate Endpoint Evidence of Efficacy?
A discussion of the dangers of relying on surrogate outcome measures.
Assessing Risk of Bias in Included Studies
An introduction to assessing risk of bias using the Cochrane ‘Risk of Bias Tool’.
Systematic Review X Narrative Review
Describing the distinct characteristics and goals of systematic and narrative reviews of the literature.
Reading the Medical Literature
American College of Obstetricians and Gynecologists (ACOG) introduction to critical appraisal and evidence-based medicine.
University of Western Australia: Bias Minimisation, were the right patients included?
University of Western Australia’s explanation of the importance of involving the right people in treatment comparisons.
University of Western Australia: Bias Minimisation, randomisation and blinding
University of Western Australia’s explanation of why random allocation to comparison groups and blinding (if possible) are important.
SUNY Downstate: The Double Blind Method
SUNY Downstate’s explanation of why blinding is important in assessing the effects of treatments.
SUNY Downstate: Randomized Controlled Studies
SUNY Downstate’s explanation of why random allocation to treatment comparison groups is important.
SUNY Downstate: Systematic Reviews and Meta-analysis
SUNY Downstate’s explanation of why it is important to consider all studies addressing a specific question.
What is a meta-analysis? How to use a systematic review
Oxford University’s Centre for Evidence-Based Intervention guide on how to use evidence from systematic reviews.
What is a meta-analysis?
An explanation of meta-analysis from Oxford University’s Centre for Evidence-Based Intervention.
Do the statistics back up the claim?
‘Ask for Evidence’ introduction to the interpretation and assessment of statistics.
Is the therapy clinically useful?
An article from the PEDro database on whether a treatment is useful.
Is the trial valid?
An article from the PEDro database on assessing the validity of a study.
Evidence-Based Medicine in Pharmacy Practice
An article by Suzanne Albrecht on Evidence-Based Medicine in Pharmacy Practice.
Goals and tools in Meta-analysis
Meta-analysis in Michigan State University’s Evidence-Based Medicine Course.
Goals and tools in Prognosis evaluation
How to assess prognosis in Michigan State University’s Evidence-Based Medicine Course.
Evaluating relevance
How to evaluate relevance of research in Michigan State University’s Evidence-Based Medicine Course.
Limitations of current clinical practice
Discussion of the need to recognise the limitations of current clinical practice in Michigan State University’s Evidence-Based Medicine Course.
Evidence-based medicine
The European Patients’ Academy web-based introductory course on Evidence-Based Medicine.
Who funded the study?
‘Ask for Evidence’ warning about the way that vested interests can distort research.
Anecdotes, testimonials and personal studies
‘Ask for Evidence’ warning that anecdotes are not a trustworthy basis for inferring treatment effects.
Common pitfalls with studies and things to look out for
‘Ask for Evidence’ introduction to the need for critical appraisal of research studies.
Randomised controlled trials (RCTs)
‘Ask for Evidence’ introduction to the concept of a randomised comparison.
Animal Studies
‘Ask for Evidence’ information about the relevance and limitations of animal studies for promoting human health.
‘In vitro’ (e.g. test tube) studies
‘Ask for Evidence’ explanation of what ‘in vitro’ research means.
Apply the results to your patients
A Duke University tutorial explaining how to address the question: how relevant is the research evidence to the needs of my patient?
What are the results?
A Duke University tutorial explaining how to address the questions: How large was the treatment effect? What was the absolute risk reduction?
Evaluating the validity of a therapy study
A web-based Duke University tutorial explaining how to address the question: are the results of the study valid?
Delfini: Critical appraisal matters
A 20-minute slide cast discussing how reliable evidence and critical appraisal can help to improve health outcomes.
Randomisation explained in 1 minute
A 1-minute animation produced by Cancer Research UK, explaining the term ‘randomised trial’.
Evidence-Based and Shared-Informed Decision-Making According to Homer (Simpson)
With help from Homer Simpson, James McCormack uses a 17-minute slide cast to explain the principles of thoughtful treatment.
Teaching Tips: randomisation for trials
Chris Del Mar describes a group exercise that enables students to appreciate how trials work, and how they can go wrong.
Teaching Tip: Understanding regression to the mean in preparation for teaching EBM
Chris Del Mar uses dice to simulate the natural fluctuations in pain, and to illustrate regression to the mean by re-testing the outliers.
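A rough Python version of that dice idea (a sketch of the general principle, not Del Mar’s actual classroom exercise) looks like this:
    import random

    random.seed(0)
    first_roll = [random.randint(1, 6) for _ in range(1000)]            # everyone "scores their pain"
    outliers = [i for i, score in enumerate(first_roll) if score == 6]  # select the worst scorers
    second_roll = [random.randint(1, 6) for _ in outliers]              # re-test them, no treatment given

    def mean(xs):
        return sum(xs) / len(xs)

    print(f"all first rolls:       mean {mean(first_roll):.2f}")
    print(f"outliers, first roll:  mean {mean([first_roll[i] for i in outliers]):.2f}")
    print(f"outliers, second roll: mean {mean(second_roll):.2f}")
    # The selected group "improves" from 6 towards about 3.5 on re-testing,
    # even though nothing was done to them: regression to the mean.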
Evidence for everyday health choices
A 17-minute slide cast by Lynda Ware on the history of EBM, what Cochrane is, and how to understand the real evidence behind the headlines.
Sunn Skepsis
This portal (in Norwegian) is intended to give patients advice on quality criteria for health information and access to research-based information.
Dancing statistics: Explaining variance
A 5-minute film demonstrating the statistical concept of variance through dance.
Dancing statistics: sampling & standard error
A 5-minute film demonstrating the statistical concepts of sampling and standard error through dance.
Dancing statistics: correlation
A 4-minute film demonstrating the statistical concept of correlation through dance.
Julia Belluz – Lessons from the trenches of evidence-based health journalism at Vox.com
20-minute talk by Julia Belluz on the need to bring the cultures of health journalism and EBM together.
Don’t jump to conclusions, #Ask for Evidence
An introduction to the ‘Ask for Evidence’ initiative launched by ‘Sense about Science’ in 2016.
How can you know if the spoon works?
Short, small group exercise on how to design a fair comparison using the "claim" that a spoon helps retain the bubbles in champagne.
English National Curriculum vs Key Concepts – Key Stage 3
A linked spreadsheet showing how the Key Concepts map to the Science National Curriculum in England at Key Stage 3 (ages 11-14).
DRUG TOO
James McCormack with another parody/spoof of the Cee Lo Green song ‘Forget You’ to prompt scepticism about many drug treatments.
Calling Bullshit Syllabus
Carl Bergstrom and Jevin West’s nice syllabus for ‘Calling Bullshit’.
The surrogate battle – is lower always better?
James McCormack recruits a furious Führer to point out that taking drugs to lower surrogate measures of ill health is a confidence trick.
Tom Hanks and Type 2 Diabetes
A 50-minute illustrated talk by James McCormack prompted by Tom Hanks’ announcement that he had been diagnosed with Type 2 diabetes.
Bohemian Polypharmacy
James McCormack recruits help from Queen to warn of the dangers of ‘Bohemian Polypharmacy’ in music.
Choosing Wisely
James McCormack uses song and dance to warn about the negative effects of overtreatment.
Like a bridge overdiagnosis
James McCormack with another of his brilliant parodies, warning about the dangers of becoming inappropriately labelled as ill.
Tricks to help you get the result you want from your study (S4BE)
Inspired by a chapter in Ben Goldacre’s ‘Bad Science’, medical student Sam Marks shows you how to fiddle research results.
Reporting the findings: Absolute vs relative risk
Absolute differences between the effects of two treatments matter more to most people than relative differences.
It’s just a phase
A resource explaining the differences between different trial phases.
Strictly Cochrane: a quickstep around research and systematic reviews
An interactive resource explaining how systematic and non-systematic reviews differ, and the importance of keeping reviews up to date.
The Princess and the p-value
An interactive resource introducing the reporting and interpretation of statistics in controlled trials.
Teach Yourself Cochrane
Tells the story behind Cochrane and the challenges of finding good-quality evidence to produce reliable systematic reviews.
Explaining the mission of the AllTrials Campaign (TED talk)
Half the clinical trials of medicines we use haven’t been published. Síle Lane shows how the AllTrials Campaign is addressing this scandal.
Fish oil in the Observer: the return of a $2bn friend
Ben Goldacre draws attention to people’s wish to believe that a pill can be the solution to a complicated problem.
Building evidence into education
Ben Goldacre explains why appropriate infrastructure is needed to do trials of sufficient rigour and size to yield reliable results.
Anecdotes are great – if they convey data accurately
Ben Goldacre gives examples of how conclusions based on anecdotes and biased research can be damagingly misleading.
Studies of studies show that we get things wrong
Ben Goldacre gives examples of how conclusions based on anecdotes and biased research can be damagingly misleading.
Dodgy academic PR
Ben Goldacre: 58% of all press releases by academic institutions lacked relevant cautions and caveats about the methods and results reported.
All bow before the mighty power of the nocebo effect
Ben Goldacre discusses nocebo effects, through which unpleasant symptoms are induced by negative expectations rather than by any physical cause.
How do you regulate Wu?
Ben Goldacre finds that students of Chinese medicine are taught (on a science degree) that the spleen is “the root of post-heaven essence”.
Science is about embracing your knockers
Ben Goldacre: “I don’t trust claims without evidence, especially not unlikely ones about a magic cream that makes your breasts expand.”