The 'Decline' in reading in America: Another case of the "Shock Doctrine"?

The recent report from the National Endowment for the Arts (NEA), To Read or Not To Read, announced that Americans are reading less and reading worse. This resulted in a flurry of articles and reports in the media declaring that we were in a genuine state of crisis, e.g. there is a “remarkable decline” in reading (National Public Radio, November 19, 2007), “the young turn backs on books” (Dallas Morning News, November 20), and “the death of reading” (National Center for Policy Analysis, November 21). The head of the NEA, in fact, has stated that the “decline” in reading was the most important problem facing American society today.

There is indeed a crisis in reading. Few people, it seems, have read the NEA report, and the authors of the report did not do their reading homework. A close examination of the report reveals very little cause for concern. In fact, some of the data suggests that things are just fine.

Are Americans Reading Less?

17-year-olds

The argument that American youngsters are reading less comes from the NEA's Table 18, in which young people of different ages (9, 13, and 17) were asked how often they read "for fun" in the years 1984, 1999, and 2004. The nine-year-olds show no change at all since 1984. The 13-year-olds show some decline (70 percent said they read "almost every day" or "once or twice a week" in 1984, but only 64 percent did so in 2004), and the 17-year-olds appear to have declined even more (64 percent in 1984 and 52 percent in 2004 said they read almost every day or once or twice a week). Also, the older the group, the less reading seems to be taking place.

One issue is how the youngsters interpreted the question: Respondents sometimes don't think some kinds of reading are worth reporting. In one poll of teenagers, of 66 respondents who said they did "no reading," 49 checked several categories of leisure reading when asked what they liked to read (Mellon, School Library Journal, 1987).

This is a very likely factor when considering differences between older and younger readers and changes over time. The NEA cited a study by Kaiser (the Generation M study) in which young people were asked how much reading they did "yesterday." The NEA reports the results for book reading in the main text of its report: 63 percent of 8 to 10 year olds, 44 percent of 11 to 14 year olds, and only 34 percent of 15 to 18 year olds read at least five minutes "yesterday." It looks like those lazy 17-year-olds lose again. But in a footnote, the NEA mentions that if we include magazine and newspaper reading, there is no difference among the groups. I read the actual Kaiser report and added their data on time spent looking at websites on the internet to the data on book, magazine, and newspaper reading. If we total up all reading, the 17-year-olds read the most, 60 minutes, and the other groups do quite a lot as well (8 to 10 year olds, 51 minutes; 11 to 14 year olds, 57 minutes).

Since 1984 there has, of course, been increased use of the internet, as well as other forms of reading (e.g. graphic novels) and other forms of input of literate texts (audiobooks). We need to know whether these kinds of reading were considered worth mentioning by respondents in 1984 and 2004 before we conclude that young people are reading less today.

College Students

The NEA presents data showing that college students read less than they did in high school. Not mentioned, however, is one study showing that college students read quite a bit, and that this has not changed over three decades. Hendel and Harrold (College Student Journal, 2004) surveyed the leisure activities reported by undergraduates attending an urban university from 1971 to 2001. Among the questions asked were those related to leisure reading. In agreement with other studies, Hendel and Harrold reported a decline in reading newspapers and news magazines, but there was no decline in reported book reading. On a scale of 1-3 (1 = never, 2 = occasionally, 3 = frequently), the mean for book reading in 1971 was 2.35; in 2001 it was 2.26, with only small fluctuations in the years between.

Moreover, the ranking for reading books was higher than that reported for attending parties (2.14 in 2001), going to the movies (2.16) and for all categories of watching TV (sports = 2.07). Book reading held its own despite a clear enthusiasm for surfing the internet (2.78) and e-mail (2.84), both newcomers.

Adults

To Read or Not to Read also tells us that 38 percent of adults said they read something yesterday, citing a 2006 Pew report. But it does not mention that according to a previous Pew study, published in 2002, 34 percent said they read something yesterday. In 1991, this figure was 31 percent (see "Public's news habits little changed by September 11," Pew Research Center, 2002). Also, a major study of reading published in 1945 found that only 21 percent of those ages 15 and older said they read something yesterday, with the most reading done by those lazy teenagers, ages 15-19, at 34 percent (Link and Hopf, People and Books, 1945).

Are We Reading Worse?

The NEA used what many consider to be the gold standard for reading: results from the National Assessment of Educational Progress (NAEP), a national test given to samples of fourth, eighth, and 12th graders every few years. Once again, the problem is those lazy 17-year-olds, the 12th graders.

There are two kinds of NAEP tests: the long-term trend assessment, which allows comparisons of performance years apart, and the main assessments, given every two years.

For the long-term trend scores, there has been no decline for fourth or eighth graders, but 12th graders scored four points less in 2004 than 12th graders did in 1984, which the NEA called a “downward trend.”

There are several problems with concluding that this represents anything real. First, whether there has been a decline depends on what year you use for the initial comparison: The 2004 national reading scores for 12th graders are identical to those made by 12th graders in 1971. This is mentioned only in passing by the NEA. Here are the scores:

NAEP reading scores for 12th graders:

1971: 285
1975: 286
1980: 285
1984: 289
1988: 290
1990: 290
1992: 290
1994: 288
1996: 288
1999: 288
2004: 285

Second, the "downward trend" since 1984 is quite small: four points on a test in which the highest 10 percent and lowest 10 percent differed by nearly 100 points. The NEA's chart 5B makes the "decline" look a lot larger than it is, charting only the changes, not the total scores, and using a y-axis ranging from -6 to +10. If the y-axis had included the entire range of scores, the differences would look quite small, which they are. Viewed in terms of the possible range of scores, NAEP results for 12th graders are remarkably consistent over the years.

The most outrageous misreporting in the NEA report is in its table 5F, where we are told that on main NAEP reading assessments, test scores for the lowest-scoring 10 percent of 12th graders dropped 14 points between 1992 and 2005. A look at the actual NAEP report (The Nation's Report Card: 12th Grade Reading and Mathematics, 2005, page 2, figure 5) reveals that most of this happened between 1992 and 1994, a ten-point drop. Similarly, seven points of the nine-point drop between 1992 and 2005 for the lowest-scoring 25 percent occurred between 1992 and 1994. Clearly, something was wrong with one of those tests.

It is hard to see how anyone can look at the figure in the NAEP report and conclude that the drop occurred gradually between 1992 and 2005. A look at the figure also shows that this one-time unusual drop is the only real change in NAEP scores since 1992.

In other words, much of the fuss about declines in reading scores is really about scores for a subgroup of 12th graders between 1992 and 1994. It is not clear whether the authors of the NEA report deliberately constructed table 5F in a misleading way. If it was deliberate, they are dishonest. If it was not deliberate, they are incompetent.

The NEA also faults young readers at all three levels for "how poorly" they read, citing the percentages who read below the "proficient" level or the "basic" level; e.g. 36 percent of 4th graders read below the basic level in 2005, and only 31 percent were "proficient" or better. Not mentioned in the report is the fact that there is no empirical basis for determining what score should be considered "basic" or "proficient."

Gerald Bracey has published several penetrating critiques of the NAEP performance levels (see, e.g., Reading Educational Research: How to Avoid Getting Statistically Snookered), pointing out that the "proficient" level is set very high and that other countries that consistently rank near the top of the world in reading would not do well on our NAEP. For example, only one-third of Swedish children would be considered "proficient" on the NAEP, nearly identical to the percentage of US fourth graders (31 percent in 2005).

The suspicion is that the definition of "proficient" is deliberately set too high in order to create the illusion that there is, in fact, a crisis in American education, an application of what Naomi Klein has called "The Shock Doctrine": the deliberate manufacture of a crisis in order to create an environment in which policies that would normally be unacceptable can be instituted.

The NEA report itself is, of course, a candidate for an application of The Shock Doctrine, possibly motivated by federal policy on education. Federal policy is based on the assumption that the path to higher literacy is direct instruction in phonics, reading strategies, and vocabulary, not just for the early grades, but for middle school and even higher levels. The problem is that the NEA report contains evidence that another policy, improving access to books, is more appropriate, but avoids embracing it, or even explicitly mentioning it.

“No Single Barrier” to Raising Reading Rates?

The NEA report presents an impressive set of data showing that reading is good for you: those who read more do better on NAEP tests of reading and writing, and those with more books in the home do better on NAEP tests of math, science, history, and civics. The number of books in the home is, in fact, a better predictor of scores on these tests than parental education, indicating that it is access to books that is crucial. These findings are consistent with those reported elsewhere (e.g. McQuillan, The Literacy Crisis: False Claims and Real Solutions, 1998, and Krashen, The Power of Reading, 2004).

Yet, the NEA mysteriously insists that “there is no single barrier, which, if removed, would raise reading rates for young Americans” (p. 41). Of course there is: Increasing access to reading materials by improving libraries.

The research is overwhelming. It tells us that those with more access to books read more, and that children of poverty have very little access to books, at home, in their communities, or in school (reviewed in Krashen, 2004, Power of Reading). And of course, as noted earlier, the NEA report confirms that more reading leads to better literacy development and more knowledge.

Research done by Keith Curry Lance, Jeff McQuillan, and others also shows that students in schools with higher quality libraries staffed by credentialed librarians do better on tests of reading, and some of this research specifically shows that library quality (public and school) has a strong relationship with scores on the fourth grade NAEP reading examination: McQuillan (The Literacy Crisis: False Claims and Real Solutions, 1998) reported that children in states with better school and public libraries do better on the NAEP, even when the effect of poverty is controlled.

No single factor? How about improving school and public libraries, especially in high poverty areas? The real problem is not a decline but the fact that children of poverty have less access to books and read more poorly than others. This is something we can do something about. 

This article was originally published in the print edition of Substance (January 2008).
