Assessment of Content Understanding Through Science Explanation Tasks

Nov 2008

Christy Kim Boscardin, Barbara Jones, Claire Nishimura, Shannon Madsen, and Jae-Eun Park

Our recent review of content assessments revealed that language expectations and proficiencies are often implicitly embedded in the assessment criteria. Based on a review of performance assessments used in high school biology settings, we found a recurring discrepancy between assessment scoring criteria and performance expectations. Without explicit scoring criteria for evaluating language performance, it is difficult to determine how much of the overall performance quality can be attributed to language skills versus content knowledge. This is an especially important validity question for English Learners (ELs) under current state assessment mandates. To date, studies of the validity and consequences of standards-based assessments for ELs have been limited. In the current study, we examined the cognitive demands, including language skills, associated with successful performance on content assessments. As part of the validity investigation, we also developed a performance-based assessment, constructed to be a more proximal measure of student understanding and more sensitive to instructional differences, and examined its relative sensitivity.

Boscardin, C. K., Jones, B., Nishimura, C., Madsen, S., & Park, J.-E. (2008). Assessment of content understanding through science explanation tasks (CRESST Report 745). Los Angeles: University of California, Los Angeles, National Center for Research on Evaluation, Standards, and Student Testing (CRESST).