Education Week - September 6, 2017 - 7
New Tool Alerts Teachers When Students Give Up on Tests
By Sarah D. Sparks
How much can a test really tell
you if a student gives up while taking it?
Quite a lot, as it turns out, if
teachers know exactly when and
how a student disengages. That's
what some schools nationwide
started to do earlier this summer,
using real-time alerts during a computer-based adaptive test to spot
students going off task, and salvage
meaning from assessments that students think they are writing off.
"It's really a shift from seeing
[testing] simply as a way to improve achievement scores, which it
does, to seeing it as a source of data
that teachers can use to understand
their students," said Jim Soland, a
research scientist with the research
and testing firm Northwest Evaluation Association.
The alerts are part of an ongoing
research project between NWEA
and the 50,000-student Santa Ana
school district, a member of the California Office to Reform Education, or
CORE, district consortium.
Teachers in the mostly poor and
Hispanic Santa Ana district have
always spotted the signs of students
shutting down on a test, but they
couldn't tell how badly it affected
their performance until the test was over.
"They play with the keyboard, they
may be fidgeting," said Emily Wolk,
the assistant director of research
and evaluation for Santa Ana, who
has watched over her share of tests.
"Of course, we are very concerned
about that; we know there is a connection between their engagement
on a test and how they do in other
areas as well."
Gauging the Speed of Thought
Schools generally use two ways to
measure student engagement on a
test. Teachers can look at the results
of the test and compare it to what
they know of the student to decide
how hard he worked, or they can ask
a student directly if she gave the test
her best effort. Both measures are
subjective, and aside from noticing a
student physically putting her head
down, it can be difficult for teachers
to realize a student has checked out
during a test until it's already been
turned in. Even then, it's hard to tell
the difference between a student
who is distracted because he had a
fight with his mom over breakfast
and one who despairs because a
question is too hard.
Instead, NWEA researchers Stephen Wise, Soland, and Nate Jensen
measure student-response times to
hundreds of questions on the NWEA's
MAP Growth test, an adaptive, computer-based assessment taken by
about 20 percent of U.S. students. It
matches the difficulty of the questions
it presents to students' previous test performance.
Wise and his colleagues found
that under normal circumstances,
students take 40 to 50 seconds to
read and answer each question. If
a student starts to disengage from
a test, "they start answering items
very quickly (2 to 3 seconds), more
quickly than it would take to even
read the question," Wise said.
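The detection logic described above can be sketched in a few lines. This is a minimal illustration built only on the response-time figures reported in the article; the 3-second cutoff, function names, and alert window are assumptions, not NWEA's actual algorithm.

```python
# Illustrative sketch: flag "rapid guesses" by per-item response time.
# The 3-second cutoff follows the 2-to-3-second range reported in the
# article; NWEA's real detection method is not described in detail here.

RAPID_GUESS_CUTOFF = 3.0  # seconds; assumed threshold

def flag_rapid_guesses(response_times):
    """Return indices of items answered too quickly to have been read."""
    return [i for i, t in enumerate(response_times) if t < RAPID_GUESS_CUTOFF]

def alert_needed(response_times, window=3):
    """Alert the teacher if the last `window` items were all rapid guesses."""
    recent = response_times[-window:]
    return len(recent) == window and all(t < RAPID_GUESS_CUTOFF for t in recent)

# A student answers normally (40-50 seconds), then starts guessing.
times = [45.2, 38.7, 51.0, 2.1, 2.8, 1.9]
print(flag_rapid_guesses(times))  # -> [3, 4, 5]
print(alert_needed(times))        # -> True
```

A real system would also weigh item length and difficulty, since a short item can legitimately be answered quickly; the point here is only the basic threshold idea.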
By tracking this "rapid guessing,"
Wise and his colleagues can monitor
how much effort students put into
each question in real time and compare it with how hard students and
teachers think the students worked.
"People had thought that you are
either engaged or disengaged on a
test, and if you disengage, you stay
disengaged," Wise said, "but we
found there's no set pattern. It's not
like people shift into not trying and
stay there. It's more that they size
up an item when they see it, and
if it looks like more effort than I'm
prepared to give, I'm just guessing."
On average, the researchers found
that after students first start rapid-guessing, they still legitimately try
to answer 80 percent of the remaining questions on a test. Only 1 percent
to 2 percent of those who disengage from a test rapid-guess on all
or most of their questions.
Elementary students are less likely
to disengage on a test than those in
middle and high school, where as
many as half of students rapid-guess
at least once in a given test, Soland
said. While there are relatively small
racial differences in disengagement,
boys are significantly more likely
than girls to start rapid-guessing.
The subject matters, too. Though
students often report math as more
difficult than reading, the researchers
found that students on average are
nearly twice as likely to rapid-guess
on a reading test as on a math test,
and they are more likely to guess at
individual questions on any test that
requires more reading.
Soland and Jensen now are working
with Santa Ana to connect frequent
test disengagement to other problems
in school. They found that students
who are less skilled at communicating with adults and classmates were
more likely to disengage, as were
those with more of a fixed mindset
about academic skills: the belief that
such skills are innate, rather than
built with practice. But most of all, a
student's likelihood of disengaging on
a test was associated with his or her
self-management and self-regulation
skills, the ability, for example, to show
up for class prepared and on time. "As
they disengage from tests and the
course material, a whole host of other
things come up ... attendance, suspensions, course failure ... that have been
connected to risk of dropping out of
school," Soland said.
"What we're really showing is lack
of test engagement is a symptom
around a lot of deep-rooted problems," Soland said. "In my mind,
there's this chain from, if a kid has
low motivation, a lack of self-belief
in academic subjects, that can manifest itself in a lot of different ways."
Soland and Jensen are now using
the tests to build practical measures
of students' social-emotional development and connection to school under
the first national Social-Emotional
Assessment Design Challenge award
by the Collaborative for Academic,
Social, and Emotional Learning.
NWEA is also changing its test reporting to show a student's engagement levels as well as a performance
score. It would show, based on the student's performance before she started
guessing, how much better she could
have performed by trying harder.
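One simple way such an adjusted estimate could be computed is to score only the items the student answered before disengaging. The sketch below is a deliberate simplification under that assumption; MAP Growth actually uses adaptive, item-response-theory scoring, not a raw percent-correct, and the cutoff and function name here are hypothetical.

```python
# Illustrative sketch: estimate performance using only items the student
# genuinely engaged with, i.e., those not answered as rapid guesses.
# This is a simplification of the engagement-aware reporting the article
# describes; the real test uses item-response-theory scoring.

RAPID_GUESS_CUTOFF = 3.0  # seconds; assumed threshold

def engaged_score(correct, response_times):
    """Percent correct on items the student spent real time on."""
    engaged = [c for c, t in zip(correct, response_times)
               if t >= RAPID_GUESS_CUTOFF]
    if not engaged:
        return None  # no engaged items: no meaningful estimate
    return 100.0 * sum(engaged) / len(engaged)

# Four engaged items (three correct), then three rapid guesses.
correct = [1, 1, 0, 1, 0, 0, 0]
times = [44.0, 39.5, 52.0, 47.0, 2.3, 1.8, 2.6]
print(engaged_score(correct, times))  # -> 75.0
```

Comparing this engaged-only estimate with the raw score gives a rough sense of how much a student's disengagement cost them, which is the gap the new reporting aims to surface.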
"Eventually, people are going to
start viewing test data differently: not
just, here is your score, but here is how
engaged people were when they gave
you these scores; here's how well does
this reflect what students know and
can do," Wise said.
In the pilots so far, alerting the
teacher to disengagement during
the test "had a profound effect on
[students'] engagement," Wise said,
though there are no formal evaluations of the intervention yet.
"This is a tool," said Santa Ana's
Wolk. "A teacher knows the students, knows how to quietly go over
and say, 'Hey, how's it going? I care
about you, and I'm concerned you
are moving too quickly here.' "
Coverage of learning mindsets and
skills is supported in part by a grant
from the Raikes Foundation, at www.
raikesfoundation.org. Education Week
retains sole editorial control over the
content of this coverage.
TRACKING WHEN STUDENTS DISENGAGE
[Chart: Research by the Northwest Evaluation Association explores how students disengage on tests. The chart plots response time in seconds per question. A student spends the first third of the test putting out the average effort, or a bit more, for each question, and thinks a long time on a few items, before disengaging and answering most questions toward the end of the test faster than they can be read.]