Different Explanations of Students' Performance in the Class
Exploring and generating alternative explanations for performance can be done only in the context of clear statements about the content and skills the task is intended to assess. To demonstrate how this type of information can be used in developing a validation argument, the same example can be tied to NAEP's documentation of the content and skills targeted by its fourth grade reading assessment. The NAEP Reading Framework explicitly recognizes that background knowledge is a factor in test performance but describes this factor as one that contributes to item difficulty. The framework document notes that "Item difficulty is a function of the amount of background knowledge required to respond correctly" (National Assessment Governing Board, 2002a, p. 21).
This statement leaves unanswered the question of whether background knowledge is crucial to the claim, or intended inference; in other words, it does not specify whether background knowledge is part of what is being assessed. If the test-taker's degree of background knowledge is not part of the construct or claim, then it constitutes a potential source of construct-irrelevant variance. Furthermore, since item difficulty is essentially an artifact of the interaction between a test-taker and an assessment task (Bachman, 2002b), it would seem that the critical source of score variance that needs to be investigated in this case is not the task characteristic (the content of the reading passage) but rather the interaction between the test-taker's background knowledge and that content. A second alternative explanation for Tina's performance in the example can be found in the nature of the expected response, and how this was scored. Consider an example: "After reading this article, would you like to have lived during colonial times? What information in the article makes you think this?" (National Assessment Governing Board, 2002a, p. 41). This question is intended to assess "reader text connections," and test-takers are expected to respond with a "short constructed response" (National Center for Education Statistics, 2003a). This particular question is scored using a rubric that defines three levels: "evidence of complete comprehension," "evidence of surface or partial comprehension," and "evidence of little or no comprehension" (National Assessment Governing Board, 2002a, p. 41).
Tina's answer was scored as showing "evidence of little or no comprehension." The descriptor for this score level is as follows (National Assessment Governing Board, 2002a, p. 41): These responses contain inappropriate information from the article or personal opinions about the article but do not demonstrate an understanding of what it was like to live during colonial times as described in the article. They may answer the question, but provide no substantive explanation.