


Different Explanations on Students' Performances in the Class

Exploring and generating alternative explanations for performance can be done only in the context of clear statements about the content and skills the task is intended to assess. To demonstrate how this type of information can be used in developing a validation argument, the same example can be tied to NAEP's documentation of the content and skills targeted by its fourth grade reading assessment. The NAEP Reading Framework explicitly recognizes that background knowledge is a factor in test performance but describes this factor as one that contributes to item difficulty. The framework document notes that "Item difficulty is a function of the amount of background knowledge required to respond correctly" (National Assessment Governing Board, 2002a, p. 21).

This statement leaves unanswered the question of whether background knowledge is crucial to the claim, or intended inference; in other words, it does not specify whether background knowledge is part of what is being assessed. If the test-taker's degree of background knowledge is not part of the construct or claim, then it constitutes a potential source of construct-irrelevant variance. Furthermore, since item difficulty is essentially an artifact of the interaction between a test-taker and an assessment task (Bachman, 2002b), the critical source of score variance to be investigated in this case is not the task characteristic itself (the content of the reading passage) but rather the interaction between the test-taker's background knowledge and that content.

A second alternative explanation for Tina's performance in the example can be found in the nature of the expected response and how it was scored. Consider an example: "After reading this article, would you like to have lived during colonial times? What information in the article makes you think this?" (National Assessment Governing Board, 2002a, p. 41). This question is intended to assess "reader text connections," and test-takers are expected to respond with a "short constructed response" (National Center for Education Statistics, 2003a). This particular question is scored using a rubric that defines three levels: "evidence of complete comprehension," "evidence of surface or partial comprehension," and "evidence of little or no comprehension" (National Assessment Governing Board, 2002a, p. 41).

Tina's answer was scored as showing "evidence of little or no comprehension." The descriptor for this score level reads as follows (National Assessment Governing Board, 2002a, p. 41): "These responses contain inappropriate information from the article or personal opinions about the article but do not demonstrate an understanding of what it was like to live during colonial times as described in the article. They may answer the question, but provide no substantive explanation."



