TY - JOUR
T1 - Exploring the multimedia effect in testing
T2 - the role of coherence and item-level analysis
AU - Arts, Jorik
AU - Emons, Wilco
AU - Dirkx, Kim
AU - Joosten-ten Brinke, Desirée
AU - Jarodzka, Halszka
N1 - Publisher Copyright:
Copyright © 2024 Arts, Emons, Dirkx, Joosten-ten Brinke and Jarodzka.
PY - 2024/4/2
Y1 - 2024/4/2
N2 - Educational tests often combine text and images in items. Research shows that including images in test items can influence response accuracy, a phenomenon termed the Multimedia Effect in Testing. This effect suggests that using pictures in tests can enhance student performance and reduce perceived item difficulty. As such, the Multimedia Effect in Testing could influence test validity. However, research in this area has produced varied and conflicting results, which may be partly attributed to the functionality of the images used. Moreover, many studies report only test-level data, making it difficult to determine whether the outcomes reflect a generic phenomenon or result from averaging mixed outcomes across individual test items. The present study examined whether the coherence of pictures in tests influences response accuracy, mental effort, and time-on-task at both the test level and the item level. Item-level analysis showed that the Multimedia Effect in Testing is not universal; only a small subset of items showed significant differences between text-only and text-picture items. The degree of coherence also did not yield unambiguous results. In summary, the study highlights the complexity of the Multimedia Effect in Testing, suggesting it is context-dependent, with not all test items benefiting equally from multimedia elements. The findings emphasize the need for a nuanced understanding of how multimedia affects educational testing.
AB - Educational tests often combine text and images in items. Research shows that including images in test items can influence response accuracy, a phenomenon termed the Multimedia Effect in Testing. This effect suggests that using pictures in tests can enhance student performance and reduce perceived item difficulty. As such, the Multimedia Effect in Testing could influence test validity. However, research in this area has produced varied and conflicting results, which may be partly attributed to the functionality of the images used. Moreover, many studies report only test-level data, making it difficult to determine whether the outcomes reflect a generic phenomenon or result from averaging mixed outcomes across individual test items. The present study examined whether the coherence of pictures in tests influences response accuracy, mental effort, and time-on-task at both the test level and the item level. Item-level analysis showed that the Multimedia Effect in Testing is not universal; only a small subset of items showed significant differences between text-only and text-picture items. The degree of coherence also did not yield unambiguous results. In summary, the study highlights the complexity of the Multimedia Effect in Testing, suggesting it is context-dependent, with not all test items benefiting equally from multimedia elements. The findings emphasize the need for a nuanced understanding of how multimedia affects educational testing.
KW - computer-based testing
KW - item construction
KW - multimedia assessment
KW - multimedia effect
KW - multimedia testing
KW - representational pictures
KW - test design
U2 - 10.3389/feduc.2024.1344012
DO - 10.3389/feduc.2024.1344012
M3 - Article
AN - SCOPUS:85190372090
SN - 2504-284X
VL - 9
JO - Frontiers in Education
JF - Frontiers in Education
M1 - 1344012
ER -