Education Week - February 17, 2016 - (Page 7)

U.S. Manages to Reduce Share of Low PISA Scores in Science

By Sarah D. Sparks

After more than a decade of heavy investment in closing achievement gaps and bringing all students to proficiency in reading and mathematics, the United States has fewer low-performing students on the Program for International Student Assessment, but only in science. In math and reading, by contrast, the share of low-performing students on PISA did not change at all between 2003 and 2012, according to a new analysis by the Organization for Economic Cooperation and Development. The United States was mostly flat during that period, remaining slightly worse than the international average in the share of students who performed below minimum proficiency in all three subjects.

The three core PISA subjects are administered together every three years to 15-year-olds in more than three dozen countries. The assessment tends to focus on critical thinking and on how students apply what they have learned. Among U.S. students in that age group, 26 percent were low-performing in math, 17 percent in reading, and 18 percent in science. More than 1 in 10, some 95,000 students, scored low in all three subjects.

"These are big numbers," Andreas Schleicher, OECD's director for education and skills, said in a briefing with reporters. "You translate that into the future, these are people who will be underemployed, unemployed. ... This is a very significant liability for our society."

Nine other countries, including Brazil, Mexico, and Russia, did significantly reduce their numbers of low-performing students during the same time frame.

The OECD considers students "low performing" if they score below Level 2, which corresponds, for example, to less than 420 points on a 1,000-point scale in math. And American students didn't always do well even on Level 1 questions: Only 54 percent of U.S.
students correctly answered a math question that required calculating an exchange between two currencies, a question set at a difficulty well below Level 2 and answered correctly by 80 percent of students across the OECD. In fact, out of 41 OECD countries, only Brazil had fewer students get the question right.

Science a 'Puzzle'

In contrast to math and reading, the proportion of low-performing students in science decreased by 6 percentage points between 2003 and 2012. "I think the science result in the U.S. deserves some further analysis," Schleicher said. "It's a puzzle to us, a puzzle to me."

OECD's analysis, like many other studies, found that a student's risk of being a low performer creeps up steadily from a host of disadvantages that vary in importance from country to country. For example, 80 percent of girls in poverty with other challenges performed below minimum proficiency in math.

Poverty was a factor everywhere, but its effect differed widely. In the United States, a student in poverty was seven times as likely to be a poor performer as a wealthy student, while across the OECD generally, poor students were four times as likely to be low performers. Moreover, in the United States and 24 other countries with similar demographic and educational profiles, a student's poverty compounded the risk posed by other characteristics, such as being an immigrant or a girl, speaking a language different from that of the home country, or having had little or no preschool. By contrast, in 21 countries, including Brazil, Mexico, Tunisia, and Turkey, students in poverty with other risk factors had a lower likelihood of being low-performing, suggesting those countries had more supports for such students.

The OECD also found that while educational resources were needed to reduce a country's pool of low-performing students, the amount of per-pupil spending in each country was not as closely linked with performance as how equitably countries spent the money they had.

[Chart: How did low performers in mathematics do on PISA in 2003 and 2012? Percentage of students who score below minimum proficiency in a sample of PISA countries: Sweden, Finland, Canada, OECD average, United States, South Korea, Japan, Russian Federation, Brazil, Mexico. SOURCE: OECD, PISA 2012 Database]

Students' own dedication and confidence in their abilities played a big role, too. For example, the OECD found that students who completed six to seven hours of homework each week were 70 percent less likely to be low-performing in math, and those who participated in such extracurricular activities as art or music were even more likely to be proficient. But the OECD also found that low-performing math students, wealthy or poor, were significantly more likely to believe that their efforts were meaningless and that nothing could help them get better.

"Low performers look alike in attitudes toward school, attendance, belonging, and math self-efficacy, regardless of whether they are from disadvantaged backgrounds," Schleicher said. "Many students say, that's all about talent, that's all about things beyond my control."

Visit the Inside School Research blog, which tracks news and trends on this issue.

BLOGS OF THE WEEK

Educators Talk About How They Know They're in the Right School | TEACHING NOW |

Every school is different: different administrators, different colleagues, different families, different buildings, and, very possibly, different funding and curricula, too. Finding the right school might be a big challenge, but how hard is it really? We recently held a Twitter chat to dive into those topics. Our #ewedchat featured Eric Cooper, the president of the teacher-training-focused National Urban Alliance, and Danielle Brown, a national-board-certified teacher from Arizona. While the discussion covered a lot of ground, it's worth highlighting two questions that dealt explicitly with how teachers enter and exit schools.
Q: As a new teacher, how much did you know about the school you ended up at?

Carly Lutzmann: "For me, it was easy to pick up the routines of my school, but harder to understand and apply system-level expectations."

Diana Maskell: "It's hard to 'know' about a school until you experience it firsthand. Often, what you think you know is merely another's opinion."

Kim Dunnagan: "When I walked in the door, the two secretaries greeted me warmly and carried on a pleasant conversation. That is all I needed!"

National Urban Alliance: "Had to ask the students who they thought was the best teacher. I went to that teacher and learned from that teacher."

Q: What factors do teachers consider when deciding whether or not to leave a school?

Cody Norton: "Are kids and families treated with respect? Would I ever let my future children attend this school? I said no & had to leave."

Mr. K!: "Was I happy in the school? The community? Was there room to grow, or had I reached my full potential here?"

Danielle Brown: "Is the district about change & thought based on student needs? Are school & community focused on what students need?"

Owl Mt. Coach: "Decision to leave: poor administrative support and incompetence."

Christine Dahnke: "Feeling empowered, room for growth, feedback, and visionary leadership."

National Urban Alliance: "Retention plans that ignore the uniqueness of each teacher have an inherent weakness, which can also lead to departure."

If none of that sounds new, it's because it really isn't. Studies have shown that many of the factors mentioned influence teacher attrition. In a sense, this chat served to tie anecdotes from working educators to a wealth of data and research on the topic. But as former educator Rosa Nam wrote in an Education Week Teacher commentary last year, there's another reason for teachers' leaving. Sometimes, it's the right thing to do: "My students deserve a better teacher.
In an ideal world, the folks in charge of educating the youth of America would be the most passionate, level-headed, mentally stable, and educated scholars amongst us, but that's like finding a unicorn. All I know for sure is that teaching can be depressing." -ROSS BRENNEMAN

Tennessee Online Test Crashes, Causing Return to Paper and Pencil | DIGITAL EDUCATION |

Tennessee officials have halted the online administration of the state's assessment after widespread failures that they attributed to a "procedural problem with the vendor." Commissioner of Education Candice McQueen said the breakdowns occurred on the morning of the first day of testing, in the first year that the exams were implemented by a new testing vendor for the state, Measurement Inc. As the scope of the breakdowns became evident, state officials directed districts to postpone the exams until paper-and-pencil versions could be printed and distributed.

According to McQueen, state education department officials had been working with the Durham, N.C.-based company since an October "stress test" of the platform to increase server capacity and fix problems. State officials said that shortcomings discovered during that stress test, along with ongoing concerns about the stability of the platform, called Measurement Incorporated Secure Testing, or MIST, led them to alert districts two weeks ago that they had the option of testing with paper and pencil. Then, last week, a new batch of network failures, unrelated to the issues that officials thought had been fixed, forced the state to require all schools to administer hard copies of the exam.

In a conference call with reporters, McQueen blamed the vendor for the failures. But she also said, "When you are talking about the vendor, you are talking about the state," and that districts "are absolutely not to blame."
Cliff Lloyd, the department's chief information officer, said the breakdowns "occurred because of processes kicked off by the vendor," which led to flooded servers and system failures.

In a statement, Measurement Inc.'s president, Henry Scherich, voiced confidence in the platform, saying that Tennessee students took more than 1.1 million practice assessments in January to get ready for this week's exams. He added that the company is convinced that the server-overload problem has been fixed, and he attributed the problems some students had logging in to "improper network utilization, not [platform] functionality."

Disruptions of online assessments have become common across the country, enraging district leaders, teachers, and parents, and fueling anti-testing sentiment. Causes of the breakdowns have varied; recent testing failures in Kentucky and Florida, for instance, were later linked to cyberattacks. Tennessee is one of many states that have embarked in recent years on the transition to online platforms for administering their state assessments. Recent analyses, however, have shown that the format in which students take tests can affect their scores. -LEO DORAN
