King's College London

Research portal

When assessment validation neglects any strand of validity evidence: An instructive example from PISA

Research output: Contribution to journal › Article

Original language: English
Journal: Educational Measurement: Issues and Practice
Publication status: Accepted/In press - 15 Jul 2020

Abstract

The Standards for Educational and Psychological Testing identify several strands of validity evidence potentially needed to support particular interpretations and uses of assessments. Yet assessment validation often does not seem guided by these Standards, with validations lacking a strand even when it appears relevant to an assessment. Consequently, the degree to which validity evidence supports the proposed interpretation and use of the assessment may be compromised. Guided by the Standards, this article presents an independent validation of OECD's PISA assessment of mathematical self-efficacy (MSE) as an instructive example of this issue. OECD identifies MSE as one of a number of 'factors' explaining student performance in mathematics, thereby serving the 'policy orientation' of PISA. However, this independent validation identifies significant shortcomings in the strands of validity evidence available to support this interpretation and use of the assessment. The article therefore demonstrates how the Standards can guide the planning of a validation to ensure it generates the validity evidence relevant to an interpretive argument, particularly for an international large-scale assessment such as PISA. The implication is that assessment validation could yet benefit from the Standards as "a global force for testing" (Zumbo, 2014, p. 33).
