King's College London

Research portal

When assessment validation neglects any strand of validity evidence: An instructive example from PISA

Research output: Contribution to journal › Article

Standard

When assessment validation neglects any strand of validity evidence: An instructive example from PISA. / Pepper, David.

In: Educational Measurement: Issues and Practice, 15.07.2020.

Research output: Contribution to journal › Article

Harvard

Pepper, D 2020, 'When assessment validation neglects any strand of validity evidence: An instructive example from PISA', Educational Measurement: Issues and Practice.

APA

Pepper, D. (Accepted/In press). When assessment validation neglects any strand of validity evidence: An instructive example from PISA. Educational Measurement: Issues and Practice.

Vancouver

Pepper D. When assessment validation neglects any strand of validity evidence: An instructive example from PISA. Educational Measurement: Issues and Practice. 2020 Jul 15.

Author

Pepper, David. / When assessment validation neglects any strand of validity evidence: An instructive example from PISA. In: Educational Measurement: Issues and Practice. 2020.

BibTeX

@article{b07a908dcbf94db58b3514f124dd60f4,
title = "When assessment validation neglects any strand of validity evidence: An instructive example from PISA",
abstract = "The Standards for Educational and Psychological Testing identify several strands of validity evidence potentially needed as support for particular interpretations and uses of assessments. Yet assessment validation often does not seem guided by these Standards, with validations lacking a strand even when it appears relevant to an assessment. Consequently, the degree to which validity evidence supports the proposed interpretation and use of the assessment may be compromised. Guided by the Standards, this article presents an independent validation of OECD{\textquoteright}s PISA assessment of mathematical self-efficacy (MSE) as an instructive example of this issue. OECD identifies MSE as one of a number of {\textquoteleft}factors{\textquoteright} explaining student performance in mathematics, thereby serving the {\textquoteleft}policy orientation{\textquoteright} of PISA. However, this independent validation identifies significant shortcomings in the strands of validity evidence available to support this interpretation and use of the assessment. The article therefore demonstrates how the Standards can guide the planning of a validation to ensure it generates the validity evidence relevant to an interpretive argument, particularly for an international large-scale assessment such as PISA. The implication is that assessment validation could yet benefit from the Standards as “a global force for testing” (Zumbo, 2014, p. 33).",
keywords = "validation, large-scale assessment, response processes, self-efficacy, PISA, mathematics",
author = "David Pepper",
year = "2020",
month = jul,
day = "15",
language = "English",
journal = "Educational Measurement: Issues and Practice",
issn = "1745-3992",
publisher = "Wiley",

}

RIS (suitable for import to EndNote)

TY - JOUR

T1 - When assessment validation neglects any strand of validity evidence: An instructive example from PISA

AU - Pepper, David

PY - 2020/7/15

Y1 - 2020/7/15

N2 - The Standards for Educational and Psychological Testing identify several strands of validity evidence potentially needed as support for particular interpretations and uses of assessments. Yet assessment validation often does not seem guided by these Standards, with validations lacking a strand even when it appears relevant to an assessment. Consequently, the degree to which validity evidence supports the proposed interpretation and use of the assessment may be compromised. Guided by the Standards, this article presents an independent validation of OECD’s PISA assessment of mathematical self-efficacy (MSE) as an instructive example of this issue. OECD identifies MSE as one of a number of ‘factors’ explaining student performance in mathematics, thereby serving the ‘policy orientation’ of PISA. However, this independent validation identifies significant shortcomings in the strands of validity evidence available to support this interpretation and use of the assessment. The article therefore demonstrates how the Standards can guide the planning of a validation to ensure it generates the validity evidence relevant to an interpretive argument, particularly for an international large-scale assessment such as PISA. The implication is that assessment validation could yet benefit from the Standards as “a global force for testing” (Zumbo, 2014, p. 33).

AB - The Standards for Educational and Psychological Testing identify several strands of validity evidence potentially needed as support for particular interpretations and uses of assessments. Yet assessment validation often does not seem guided by these Standards, with validations lacking a strand even when it appears relevant to an assessment. Consequently, the degree to which validity evidence supports the proposed interpretation and use of the assessment may be compromised. Guided by the Standards, this article presents an independent validation of OECD’s PISA assessment of mathematical self-efficacy (MSE) as an instructive example of this issue. OECD identifies MSE as one of a number of ‘factors’ explaining student performance in mathematics, thereby serving the ‘policy orientation’ of PISA. However, this independent validation identifies significant shortcomings in the strands of validity evidence available to support this interpretation and use of the assessment. The article therefore demonstrates how the Standards can guide the planning of a validation to ensure it generates the validity evidence relevant to an interpretive argument, particularly for an international large-scale assessment such as PISA. The implication is that assessment validation could yet benefit from the Standards as “a global force for testing” (Zumbo, 2014, p. 33).

KW - validation

KW - large-scale assessment

KW - response processes

KW - self-efficacy

KW - PISA

KW - mathematics

M3 - Article

JO - Educational Measurement: Issues and Practice

JF - Educational Measurement: Issues and Practice

SN - 1745-3992

ER -

