Larry Cuban on School Reform and Classroom Practice: OECD Report: Puzzles To Solve (Part 2)
In this post, I will sketch out two puzzles that emerge from the OECD report, “Students, Computers, and Learning.” The first arises from the gap between high PISA test scores and low in-school computer use in particular countries. The second is explaining how little attention media, both mainstream (newspapers, magazines, network news) and side-stream (opinion and curated blogs, Twitter), have paid to this report.
Puzzle 1: Students from countries that scored high on the 2012 PISA spend less time using computers in school than European and North American students.
International test comparisons have driven the past thirty years of school reform in the U.S. Doing poorly on international rankings has prodded reformers to call for U.S. students to copy Asian and Scandinavian countries in their language, math, and science lessons. The OECD report on computers in 60-plus countries’ schools, however, offers empirical data that raise serious questions about one sturdy pillar of U.S. school reform: more access to and use of high-tech devices and software will improve teaching and learning.
Consider that 15- and 16-year-old students in Singapore, Korea, Japan, and China (Hong Kong and Shanghai) have scored higher on PISA (first, second, third, fourth, and sixth) than the U.S. (twelfth), yet–and this is one big “yet”–they have less access to computers in their schools and spend less time in school on the Internet (pp. 18-22). Thus, the report concludes: “PISA results show no appreciable improvements in student achievement in reading, mathematics or science in the countries that had invested heavily in ICT for education” (p. 15).
How come? Why the disparity in these countries between access to and use of computers in schools (all of them have very high rates of computers in homes) and scores on PISA? I suggest no cause and effect here. It is a puzzling correlation, though, and one that goes against the non-stop championing by school reformers who tout the virtues of getting more and more devices and software into U.S. classrooms. The OECD report does suggest one tantalizing (and possible) reason, however. Maybe, just maybe, the thinking and writing skills necessary to navigate the Internet and read web articles and documents with understanding can, as the OECD report says, be taught just as well in conventional lessons without tablets, laptops, and top-of-the-line software (pp. 15-16). The puzzle remains.
Puzzle 2: Media attention to the OECD report has been minimal, especially in high-tech-rich areas.
The report appeared on September 13, 2015. “Warp speed” news in the 24/7 media cycle guaranteed immediate reference to the report. And a flurry of articles in U.S., European, and Asian news outlets appeared (see here, here, here, and here). Within days, the report had been picked up by bloggers and occasional tweeters. Many of the articles and news briefs leaned heavily on OECD press releases and statements in the document by Andreas Schleicher, Director of Education and Skills for OECD. In the U.S., national and regional newspapers and network TV stations ran pieces on the report (see here, here, and here).
In those areas of the U.S. where high-tech businesses are crucial parts of the economy (e.g., California’s Silicon Valley; Austin, Texas; Boston, Massachusetts), there was barely a passing reference to the OECD report. None at all (as of 9/22) appeared in news organizations in the San Jose-to-San Francisco corridor. Of course, it may be a matter of time–I scoured Google’s references to the OECD report for only 10 days after it appeared. In the face of the ever-hungry news cycle, however, if the OECD report went unnoticed when it appeared, the chances that its findings on computer access, use, and academic performance will turn up later are slim, given the media imperative to produce fresh news hourly. There may well be analyses in magazines, journals, and the blogosphere that appear weeks or months later, but after 10 days the report will be stale and forgettable news.
Here’s what’s puzzling me: national coverage in the U.S. of the OECD report was spotty. While the Wall Street Journal, the Los Angeles Times, and the Washington Post ran pieces on the report, The New York Times has made no reference to it. And in the nation’s hot spots for birthing hardware, software, and apps in northern California, Texas, and Boston, there has been barely a mention. How come?
I can only speculate about why this eye-catching report on the connections between computer access, use, and performance has attracted so little attention at a moment when U.S. entrepreneurs and vendors are promising efficient and effective management of resources and student improvement in reading, math, and science. Across the nation, more and more school districts are spending scarce dollars on tablets, laptops, and software. My hunch is that the mindsets of high-tech entrepreneurs, vendors, media executives, foundation officials, and school district policymakers contain deep-set beliefs in the power of technology to make fundamental changes in every sector of society, including schools. When an occasional report like the OECD’s appears and challenges those beliefs, it is noted but not taken seriously, or simply ignored. Academics call this inability to absorb information that runs counter to one’s beliefs “confirmation bias.” My hunch is that the OECD report has been largely dismissed by ever-scanning mainstream and side-stream media editors, journalists, and bloggers precisely because of this bias toward the power of computers and technology to whip schools into academic shape.