Publications & Resources
On Science Achievement From the Perspective of Different Types of Tests: A Multidimensional Approach to Achievement Validation
Carlos Cuauhtemoc Ayala, Yue Yin, Susan Schultz and Richard Shavelson
Students bring to achievement tests a complex mix of cognitive, motivational, and situational resources to address the tasks at hand. Previous research (Hamilton, Nussbaum, Kupermintz, Kerkhoven, & Snow, 1995; Hamilton, Nussbaum, & Snow, 1997; Nussbaum, Hamilton, & Snow, 1997) has demonstrated the usefulness of a multidimensional representation of science achievement and, in particular, of three reasoning dimensions: basic knowledge and reasoning, spatial-mechanical reasoning, and quantitative science reasoning. Whereas other authors in this set of reports examine how different patterns of student cognitive, motivational, and situational responses predict science achievement, our focus is on the science achievement measures themselves and on their relationships with the three reasoning dimensions. Thirty multiple-choice items, 8 constructed-response items, and 3 performance assessments, each nominally assigned to one of the reasoning dimensions, were administered to 35 students—a representative subsample of the whole study (N = 341). We found that the different measures of science achievement were moderately correlated with one another, suggesting that, as expected, these measures tap somewhat different aspects of science achievement. We also found that student scores on items assigned to the same reasoning dimension did not correlate as expected, and that a student's knowledge and experience, rather than the problem alone, seemed to determine how that student solved a problem. We therefore concluded that the nominal assignment of our items to the three reasoning dimensions was problematic.
Ayala, C. C., Yin, Y., Schultz, S., & Shavelson, R. (2002). On science achievement from the perspective of different types of tests: A multidimensional approach to achievement validation (CSE Report 572). Los Angeles: University of California, Los Angeles, National Center for Research on Evaluation, Standards, and Student Testing (CRESST).