Testing Duration: How Long is Too Long to Spend on the MAP Growth Assessment?

As fall is in full swing, and most schools and districts are either testing or starting to view results, we thought it was timely to share some findings from NWEA Research on average test duration. The MAP Growth assessments are untimed, and as such, there is latitude during administration of the tests for students to work at their own pace. When a challenging question is presented, the student can consider the response, or with some items, use a manipulative like a calculator to work out the answer without the pressure of being on the clock.

By using an untimed test, the proctor in the room has some level of obligation to monitor the students' progress and make a determination if there's a need to pause a test and pick it back up at another time.

My colleague, Dr. Steven Wise, has written about the work NWEA Research has done to unpack the relationship between student engagement and the validity of MAP Growth scores. Last fall, we introduced notifications in the MAP Growth assessment that help proctors monitor for students who may be "rapid guessing," or demonstrating the behavior of quickly choosing an answer faster than they could have had time to fully read and understand the challenge posed by the item. Last week, I wrote about how our research on rapid guessing informed our policy on invalidating MAP Growth tests when a student rapid-guesses on 30% or more of the items.
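The invalidation policy described above is a simple threshold rule. As a minimal sketch (the function and names here are illustrative, not NWEA's actual implementation or API), it amounts to:

```python
# Hypothetical sketch of the stated invalidation rule: a test is flagged
# invalid when 30% or more of a student's item responses were rapid guesses.
# All names here are illustrative, not NWEA's actual implementation.

RAPID_GUESS_THRESHOLD = 0.30

def is_test_invalid(rapid_guess_flags):
    """rapid_guess_flags: one boolean per item, True if the response
    was classified as a rapid guess."""
    if not rapid_guess_flags:
        return False
    rapid_rate = sum(rapid_guess_flags) / len(rapid_guess_flags)
    return rapid_rate >= RAPID_GUESS_THRESHOLD

# For example, 12 rapid guesses on a 40-item test is exactly 30%,
# so under this rule the test would be invalidated.
print(is_test_invalid([True] * 12 + [False] * 28))
```

Note that the rule uses "30% or more," so a test sitting exactly at the threshold is invalidated.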

But what if the issue is not a question of student effort, but rather a lapse of good testing practices? To answer this, NWEA Research wanted to dig in a bit on the relationship between administration practices and test integrity.

Now, when I refer to test integrity, I'm referring to the question of whether a test score is a legitimate estimate of a student's ability, or if it was produced under conditions that could over- or under-represent that ability. Examples of conditions like this would be allowing students to take multiple hours to complete the assessment in a single sitting, frequent interruptions or pauses initiated by the proctor, or frequent retesting of students. Educators also need to attend to the consistency of their practices across different test administrations. For example, completing fall tests in a single sitting, and then giving the spring test over four or five sittings, compromises the integrity of a growth score because conditions were not consistent. Thankfully, these examples are not common, but they do happen.

One of the areas we focused on is test duration. An efficient measure of student learning shouldn't take the student away from the classroom for several hours at a time. In general, NWEA expects that students will complete a MAP Growth test in about 45 to 75 minutes, with high-performing students taking longer in some cases. There is, of course, variability depending upon testing season, student grade level, and the subject of the assessment.
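For proctors who want to screen session logs against this guidance, the typical range above can be expressed as a simple check. This is only a sketch assuming the 45-to-75-minute figure quoted here; actual expectations vary by season, grade level, and subject, and the names below are hypothetical:

```python
# Illustrative screen against the typical MAP Growth duration guidance
# (about 45-75 minutes, per the post). Real expectations vary by season,
# grade level, and subject; these names are hypothetical.

TYPICAL_MIN_MINUTES = 45
TYPICAL_MAX_MINUTES = 75

def duration_flag(minutes):
    """Classify a completed test's duration relative to the typical range."""
    if minutes < TYPICAL_MIN_MINUTES:
        return "shorter than typical"
    if minutes > TYPICAL_MAX_MINUTES:
        return "longer than typical"
    return "within typical range"

print(duration_flag(60))   # a session near the middle of the range
print(duration_flag(150))  # a multi-hour sitting worth a closer look
```

A flag like "shorter than typical" is not itself evidence of a problem; it is simply a prompt to look at the session alongside engagement indicators such as rapid guessing.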

The NWEA Research team studied the test durations and changes in test durations between terms for grades K-10 and has documented the averages and benchmarks for more unusual times to help inform our partners. The results are now available to MAP Growth users both as a report and a data visualization. While the average durations are not strict requirements, they do offer some ranges and comparative information to consider.

At NWEA, our mission is "Partnering to help all kids learn." The goal of our work and our assessments is to help educators help students improve their learning, not their scores. In other words, improved scores should follow improvements in learning, and not be an end in themselves. One of the great joys I have in working at NWEA is knowing that we are committed to that mission and improving learning. When used properly by dedicated and smart educators, our assessments show what students know and the progress in learning that's happening.