Research is a form of resistance. My research exposes the assumptions that govern health professions education and reimagines what competence, fairness, and expertise could look like. By tracing how systems produce knowledge, regulate participation, and distribute opportunity, my work challenges the belief that assessment is neutral, and that opportunity is meritocratic.
I apply expertise in cognition and behaviour to reveal how inequities are sustained. I conduct research to interrupt complacency, refuse inherited truths, and expand what our field imagines as possible. Resistance, in this sense, is not just critique. It is intervention: a commitment to evidence-informed change that confronts flawed logics, narrow definitions of competence, and structures that enable exclusion while claiming fairness.
Surveys are ASSUMED to be an efficient method for capturing important data
DISPUTED: RARELY IS THIS TRUE
Actually, in health professions education research, there are often better ways to answer a research question than a survey, and many of those approaches do not require a sample size calculation. Indeed, an anonymized survey is a poor research tool for studying many of life's complex issues, such as feelings.
Many ASSUME we are susceptible to irrational biases, so we must train ourselves to act rationally
DISPUTED: THIS IS NOT POSSIBLE
The discipline of psychology, a social science, comprises many epistemologies and research paradigms, producing evidence that may or may not apply to real life. Most claims about the irrationality and inaccuracy of human thinking are based on findings from artificial scenarios. Indeed, the intuitive and emotional strengths of human thinking have long supported expert clinical decisions.
Norman GR, Monteiro SD, Sherbino J, Ilgen JS, Schmidt HG, Mamede S. The causes of errors in clinical reasoning: cognitive biases, knowledge deficits, and dual process thinking. Academic Medicine. 2017 Jan 1;92(1):23-30.
Monteiro S, Sherbino J, Sibbald M, Norman G. Critical thinking, biases and dual processing: The enduring myth of generalisable skills. Medical Education. 2020 Jan;54(1):66-73.
Monteiro S, Norman G, Sherbino J. The 3 faces of clinical reasoning: Epistemological explorations of disparate error reduction strategies. Journal of Evaluation in Clinical Practice. 2018 Jun;24(3):666-73.
Many ASSUME that discussing unconscious biases brings them up to consciousness so we can change them
DISPUTED: THIS IS HARMFUL TO COMMUNITY AND BELONGING
It is not that simple; if it were, we would have eliminated discrimination long ago. Harmful implicit attitudes develop through implicit social and cognitive processes. Harmful behaviours, however, are explicit and develop through explicit social and cognitive processes, which means we can design educational interventions that teach better behaviours.
Tong XC, Chopra S, Jordan H, Sibbald M, Geekie-Sousa A, Monteiro S. The Creating Brave Spaces workshop: a report on simulation-based faculty development to disarm microaggressions. BMJ Leader. 2023 Aug 1;7(Suppl 2).
Monteiro S, Chan TM, Kahlke R. His opportunity, her burden: A narrative critical review of why women decline academic opportunities. Medical Education. 2023 Oct;57(10):958-70.
Last N, Sheth U, Keuhl A, Geekie-Sousa A, Yilmaz DU, Monteiro S, Sibbald M. Engaging and supporting standardized patients involved in equity-seeking healthcare training: a qualitative study. International Journal of Medical Education. 2025 Mar 20;16:64-74.
Many ASSUME that effective simulation-based education must feel real
DISPUTED: THIS CAN BE HARMFUL FOR LEARNING
Research shows that high realism does not reliably improve learning outcomes and can increase cognitive load and distraction. What matters more for learning is purposeful design, clear objectives, and structured reflection, not how lifelike the scenario feels. Realism is neither necessary nor sufficient for effective simulation-based education; focusing on it can divert attention from deeper learning.
Monteiro S, Sibbald M, Beecroft J, Bhanji F, Caners K, Chen R, Dhir J, Kahlke R, Keuhl A, LeBlanc V, Nagji A. Choosing Wisely for Simulation-Based Learning in Health Professions Education. Medical Science Educator. 2025 Jul 31:1-4.
Monteiro S, Sibbald M. Aha! Taking on the myth that simulation‐derived surprise enhances learning. Medical education. 2020 Jun;54(6):510-6.
Many ASSUME that defining professional competencies will ensure standardized education and values - which also ASSUMES that education can be standardized
DISPUTED: STANDARDIZED EDUCATION IS HARMFUL
Standardization narrows the curriculum toward what can be specified, tested, and scored, privileging measurement over meaning and performance over judgment. It also obscures diversity by treating all learners as the same, reproducing advantage for some while marginalizing others. Under the guise of fairness and rigour, standardized education can produce inequity and diminish belonging. Learning is contextual, and competence cannot be made uniform without flattening context, values, and identity.
LoGiudice AB, Sibbald M, Monteiro S, Sherbino J, Keuhl A, Norman GR, Chan TM. Intrinsic or invisible? An audit of CanMEDS roles in entrustable professional activities. Academic Medicine. 2022 Aug 1;97(8):1213-8.
Many ASSUME that learners must ONLY know the "do not miss diagnoses"
DISPUTED: THIS ERODES PATIENT CARE
Focusing exclusively on "do not miss" diagnoses trains learners to search for pathology, not to recognize the typical, expected, and normal. Yet expert reasoning depends on calibrating what is normal in order to recognize what deviates from it. Normal cases anchor diagnostic reasoning, shape base rates, and support calibration; without a model of normal, everything can appear abnormal. Perceptual learning research shows that novices benefit from broad exposure to typical cases because it trains discrimination and improves pattern recognition.
Assessment practices that focus exclusively on rare or dramatic illness distort competence by privileging the exceptional over the representative. Understanding normal supports safer, more calibrated, and more ecologically valid decision-making.
LoGiudice A, Sibbald M, Monteiro S. The Unexplored Value of "Normal": A Commentary on the Lack of Normal Cases in High-Stakes Assessment. Archives of Medicine and Health Sciences. 2021 Jan-Jun;9(1):136-9. DOI: 10.4103/amhs.amhs_106_21
Monteiro S, Sherbino J, LoGiudice A, Lee M, Norman G, Sibbald M. The influence of viewing time on visual diagnostic accuracy: less is more. Medical Education. 2024 Jul;58(7):858-68.
Many ASSUME that assessment is objective and value-free
DISPUTED: NEUTRALITY IS A CLAIM, NOT A CONDITION
This assumption persists because objectivity is socially reassuring and administratively efficient. Yet assessment relies on social judgments, values, and cultural norms.
Assessment is never purely objective because it relies on human interpretation at every step. Decisions about what to measure, which tools to use, how much evidence is “enough,” and what counts as acceptable performance all reflect implicit values and beliefs. Even structured tools cannot eliminate the influence of context, risk perception, and professional judgment.
Assessment is therefore value-laden, not value-free; acknowledging this allows us to design systems that make these values explicit rather than pretending they do not shape the outcomes.
Our perception of competence is shaped not only by observable performance but also by the values and assumptions we bring to assessment. Factors such as context, institutional priorities, rater expectations, and systemic inequities all influence judgments of who is seen as competent. Thus, measurement is never neutral: it reflects the power dynamics and biases of the systems in which it is embedded. My research applies theories of measurement and human behaviour to clarify these limits, expose inequities in assessment practices, and design approaches that support fairer and more just decisions in health professions education.
Monteiro SD. Everything alone: Is medical education chasing a harmful myth in its effort to embrace societal need?. Medical Education. 2024 Apr 1;58(4).
Many ASSUME that more data leads to more accurate decisions
DISPUTED: RARELY IS THIS TRUE
More data does not guarantee more accurate judgments. Excessive or fragmented information can create ambiguity, overwhelm decision makers, and amplify noise rather than signal. Accuracy depends less on volume and more on shared interpretive frameworks, coherent narratives, and the alignment of values guiding judgment. Without these anchoring structures, additional data often obscures rather than sharpens understanding. High-quality assessment requires meaningful data, not simply more data.
We have not designed fully accurate measures of clinical competence, because competence is not a single, fixed construct—it is complex, subjective, and context-dependent. Each health profession, regulator, and training program defines “minimal competence” differently, reflecting local standards, values, and priorities. Standardized assessments such as written or performance exams are useful for supporting high-stakes decisions, but they inevitably simplify and constrain what competence means. Reliability and validity frameworks help us evaluate these measures, yet they cannot fully capture the nuanced, situated ways that competence is enacted in practice.
Chan TM, Sebok-Syer SS, Sampson C, Monteiro S. The Quality of Assessment of Learning (Qual) Score: Validity Evidence for a Scoring System Aimed at Rating Short, Workplace-Based Comments on Trainee Performance. Teaching & Learning in Medicine. 2020 Jun 1;32(3).
Many ASSUME that competence can be directly observed and quantified
DISPUTED: RARELY IS THIS TRUE
Competence is not an observable property of a learner; it is an inference constructed from imperfect observations of behaviour in specific contexts. Observable actions are only fragments of a broader capability that includes knowledge, judgment, adaptability, and professional identity. Quantification systems—scores, checklists, numbers—translate these complex judgments into simplified outputs, but they do not capture the construct itself. Competence must therefore be interpreted, not merely observed, and its evaluation is inherently uncertain and contextual.
Olson A, Kämmer JE, Taher A, Johnston R, Yang Q, Mondoux S, Monteiro S. The inseparability of context and clinical reasoning. Journal of Evaluation in Clinical Practice. 2024 Jun;30(4):533-8.
Monteiro S, McConnell MM. Evaluating the Construct Validity of Competencies: A Retrospective Analysis. Medical Science Educator. 2023;33:729-36. https://doi.org/10.1007/s40670-023-01794-z
Many ASSUME that learners must be self-directed to acquire knowledge efficiently
DISPUTED: RARELY IS THIS TRUE
Learning complex skills relies on the same fundamental cognitive processes across the lifespan. From infancy through adulthood, we interpret the world through our senses, retain information that connects to prior knowledge, and struggle with what feels unfamiliar. This means that even highly motivated adults in clinical training face the same slow, effortful pathway of adaptation as children learning new skills.
In complex environments such as healthcare, this universal process can make the transition into new clinical spaces particularly difficult. Evidence from developmental neuroscience shows that our brains remain experience-dependent and plastic well into later life, meaning that clinical expertise develops gradually through repeated, meaningful exposure to varied cases. Efficiency can be supported not by shortcuts, but by carefully structured experiences—such as guided simulations or staged introductions to common tasks—that help learners link new information to what they already know.
By recognizing that learning remains an effortful, developmental process at all ages, we can design educational environments that reduce unnecessary barriers, challenge assumptions about adult learners, and create more equitable opportunities for developing expertise.
Monteiro S, Cavanagh A, Biswabandan B, Sibbald M. Separating the Noise from the Signal: The Role of Familiarity and Pattern Recognition in the Development of Clinical Expertise. In: Fundamentals and Frontiers of Medical Education and Decision-Making. Routledge; 2024 Jul 22. p. 169-85.
de Bruin AB, Sibbald M, Monteiro S. The science of learning. In: Understanding Medical Education: Evidence, Theory, and Practice. 2018 Dec 3. p. 23-36.