Data-Driven Analysis: How Illinois school officials rigged the ISAT Reading Test for political purposes

About The Reporter: I am a twenty-plus-year elementary school teacher in the Chicago Public Schools (CPS). I am a proud and long-time subscriber to and presently a reporter for Substance, a member of CORE (Caucus of Rank and File Educators), and of PURE (Parents United for Responsible Education).

NARRATIVE

I’ve been teaching one elementary grade level for three years now, but prior to that I taught a different grade level for five years, all in my present prekindergarten-through-eighth-grade school. Little did I realize how much my teaching abilities in reading would diminish from the grade-level switch, that is, “according to the Illinois Standards Achievement Test (ISAT) data.”

Over the past two years, I witnessed with concern some of my students’ previous ISAT reading scores deteriorate with me at their next grade level as I apparently made them “less intelligent.” And I appeared to be such a great success when I taught my previous grade level!! All this from “the data” — even though it seemed to me that the students just kept progressing and advancing along, working as hard as I was working.

So, as a teacher, I began to wonder: What is going on?

Thanks to Mr. “Data-Driven” Huberman, the latest in a too-long line of non-educators in charge of education in the City of Chicago, I have put together data that I believe might shed some light on what might be happening to me/my students, as well as other teachers/students in CPS within the Illinois State Board of Education (ISBE) system.

Two things triggered my research project. One, I, along with the faculty of my school, received information via one of the many sheets of deep-dive, data-driven, diggin’ analysis given to us by the “experts in charge of CPS” that the “value-added” calculations of certain grade levels in our school amounted to negatives. I thought: Whew! What does one do in a case like that? Crawl in a hole?

Two, as part of an assignment for professional development at the beginning of the school year, the intermediate and upper grade teachers were required to compile our own data, tracking student ISAT test scores over time.

Two grade levels (third and fourth) did not have any data to track for purposes of comparison, so they actually did not have any assignment due. (Thankfully, no one asked these teachers to compare their students’ ISATs to any previous DIBELS data from earlier grades; this would have been an exercise in futility, given that all the time spent on DIBELS as part of CPS’s primary grade reading program, still following George Bush’s national model, has nothing to do with students actually reading anything for comprehension.)

I decided (after crawling out of the psychological hole hearing from the “experts” that hard-working teachers are a negative to the education of their students threw me into) to do a further data-driven research project of my own. I was curious.

As is my habit anyway, I had already taken my students' scores from the previous grade(s) and compared them to their newer scores at the end of their (unfortunate?) sojourn with me as their teacher. I do this every year, but for the last two years, since my grade-level switch, I have seen the same overall pattern – the scores in reading generally rise, but not at “the expected levels.” I recalled that this seemed to happen at my new grade level as a whole, not just in my classroom; I also vaguely recalled it happening for other grade levels as well.

So, I decided to go backward then forward to see where my new students in this new school year just might already stand at this stage in the ISAT game. Lo and behold! The students were already greatly losing ground! Among other things, their incoming ISAT reading scores indicated that XX% met/exceeded the standards upon entry to my classroom, yet the fall CPS Reading Benchmark Assessment scores (not that I have any faith in that particular assessment instrument) now predicted a very worrisome percentage of students who would no longer meet/exceed the ISAT standard for this year. All this in just the first 5-6 weeks of the new school year!

Of course, all this made me wonder… maybe, just maybe, were the kids and I just plain normal and their previous year’s ISAT scores INFLATED??? My reluctance to consider this possibility for the past two years was due to two things:

One, I never questioned the legitimacy of the scores from these seemingly always high-scoring grade levels (whereas others always seemed to score lower) because I knew that I (and the teams I either worked with or socialized with) did not cheat; therefore, an otherwise obvious reason that scores might be inflated was never considered;

Two, I did not like the idea that I/we might not have been the “great” teachers the scores appeared to indicate. (In a nutshell: naiveté and vanity.)

But now, armed with my suspicions, and stuck with these new and not-improved scores at my different grade level, I had to see if I was alone in the CPS/ISBE education universe or in good company.

METHOD

I used a rare copy of the CHICAGO PUBLIC SCHOOLS 'Proposed BUDGET 2009-2010.' The document lists all elementary schools. I selected every 10th school in the “Neighborhood Schools” section. Since they are listed in ABC order by the first name of the school, I thought – how much more random could I get? I chose only schools with student populations of grades PreK or Kindergarten through grade 8, both for comparison to my own school and for a sense of school continuity. If the 10th school in the selection was a high school or grade 4-8 school, etc., I simply took the very next school that fit the criteria but continued counting by 10 from the actual 10th school listed. I ended up with a list of 45 schools for the first of what became five research quests.
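
For anyone who wants to replicate my selection, here is a rough sketch, in Python, of the every-10th-school rule I followed. It is only an illustration: the school list and the grade-span field names are hypothetical stand-ins for however one copies the budget document's “Neighborhood Schools” section into a file, not the budget's actual format.

```python
# A rough sketch of the every-10th-school selection rule described above.
# `schools` stands in for the alphabetized "Neighborhood Schools" list in the
# CPS Proposed Budget 2009-2010; the field names here are hypothetical.

def serves_prek_or_k_through_8(school):
    """Keep only schools serving PreK (or K) through grade 8."""
    return school["low_grade"] in ("PreK", "K") and school["high_grade"] == "8"

def select_every_nth(schools, n=10):
    """Take every nth school from the list; if that school does not fit the
    criteria, substitute the next qualifying school, but keep counting by n
    from the original nth position."""
    selected = []
    i = n - 1  # 0-based index of the nth school
    while i < len(schools):
        j = i
        while j < len(schools) and not serves_prek_or_k_through_8(schools[j]):
            j += 1  # walk forward to the next school that fits the criteria
        if j < len(schools):
            selected.append(schools[j])
        i += n  # resume counting by n from the original slot
    return selected
```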

QUEST/DATA SET #1

The first thing I did was simply look on the ISBE website and note whether the percentage of students who met/exceeded the standards in reading increased or decreased in the randomly selected schools from grade 3 (2008 scores) to grade 4 (2009 scores) and from grade 7 (2008) to grade 8 (2009). The basic question was: What happened to the scores of the same students when they moved from one grade level (3 and 7) to the next (4 and 8)? Yes, students transfer in and out of schools, but the comparison was good enough for my purposes.
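
For anyone who wants to repeat this tally, the minimal sketch below shows the increase/decrease count. The score dictionaries are placeholders for figures copied by hand from the ISBE report cards, not an actual ISBE data feed.

```python
# Counts how many sampled schools rose or fell in the percentage of students
# meeting/exceeding the ISAT reading standard from one grade/year to the next.
# The dictionaries are placeholders for figures copied by hand from the ISBE
# website (e.g., grade 3 in 2008 versus grade 4 in 2009 for the same cohort).

def count_changes(pct_before, pct_after):
    """pct_before/pct_after map school name -> % meets/exceeds."""
    increases = decreases = 0
    for school, before in pct_before.items():
        after = pct_after.get(school)
        if after is None:
            continue  # school missing from one year's data
        if after > before:
            increases += 1
        elif after < before:
            decreases += 1
    return increases, decreases

# Example with made-up numbers:
# count_changes({"School A": 62.0, "School B": 71.5},
#               {"School A": 65.3, "School B": 70.0})  # -> (1, 1)
```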

My results were as follows: For the same third graders taking the test in fourth grade, 24 of the schools showed increases in the percentage of students meeting/exceeding the standards, while 18 showed decreases. That’s 57% for the increases to 43% for the decreases. For the same seventh graders taking the test in eighth grade, 35 of the schools showed increases in the percentage of students meeting/exceeding the standards, while 7 showed decreases. That’s 83% for the increases to 17% for the decreases.

Note: Though I started with 45 schools, the number dropped to 42 for this first pass because I was still figuring out how to navigate the ISBE website/data; I simply couldn’t find two of the schools, and one turned out to be a new school with no data available. Once all 45 schools were accounted for, the percentages come out as follows: grade 3 → 4, +58%/-42%; grade 7 → 8, +84%/-16%.

My suspicion/hypothesis was that the 3rd → 4th grade numbers would drop more than they did, but the 7th → 8th grade results had me curious for more. Quite frankly, they were (and are) unbelievable.

QUEST/DATA SET #2

I took the 45 schools selected by the method described above. I eventually found the two missing schools, selected the next school in the listing to replace the one with no data, and decided to look at what happened to all the students’ ISAT reading scores from one year (2008) to the next (2009): 3rd to 4th, 4th to 5th, 5th to 6th, 6th to 7th, and 7th to 8th.

After further checking and rechecking, my results are as follows, showing for each transition the average change in the percentage of the same students meeting/exceeding the ISAT reading standards from one year to the next, along with the share of schools that increased or decreased:

Grade 3 → 4: average change +1.4%; 58% of the schools increased, 42% decreased

Grade 4 → 5: average change -0.8%; 53% increased, 47% decreased

Grade 5 → 6: average change +12.1%; 91% increased, 9% decreased

Grade 6 → 7: average change -0.8%; 47% increased, 49% decreased, 2 schools unchanged

Grade 7 → 8: average change +7.7%; 84% increased, 16% decreased
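
For anyone who wants to reproduce these averages, here is a minimal sketch of the grade-to-grade summary, again assuming the ISBE figures have already been copied into a simple nested table; the data layout is my own illustration, not an ISBE format.

```python
# For each grade-to-grade transition, computes the average change in the
# percentage of students meeting/exceeding and the share of schools that rose
# or fell. `data` is a placeholder layout: data[school][grade][year] gives the
# % meets/exceeds for that school, grade, and test year.

def transition_summary(data, from_grade, to_grade, from_year=2008, to_year=2009):
    changes = []
    for school, grades in data.items():
        try:
            before = grades[from_grade][from_year]
            after = grades[to_grade][to_year]
        except KeyError:
            continue  # incomplete data for this school
        changes.append(after - before)
    n = len(changes)
    if n == 0:
        return None
    avg_change = sum(changes) / n
    pct_up = 100.0 * sum(c > 0 for c in changes) / n
    pct_down = 100.0 * sum(c < 0 for c in changes) / n
    return avg_change, pct_up, pct_down

# e.g. transition_summary(data, 5, 6) would give the "Grade 5 -> 6" row above,
# given the underlying per-school ISBE figures.
```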

I certainly acknowledge(d) the possibility (if not certainty) of human error! But, at the time, I felt I was on to something.

QUEST/DATA SET #3

I went back to the Proposed BUDGET and selected every 7th school by the same method described above to see if the pattern would be replicated. I ended up with 65 schools this time. I have not had time to thoroughly double-check my own calculations; they may be published (as the ones above have been) when I or others have time to do so. However, the results were very similar and clearly showed the same pattern as demonstrated above for the previous 45 schools.

QUEST/DATA SET #4

I was on a roll. Then it occurred to me to look at the data already compiled for the Subregions, District, and State – right in line next to the data by grade levels for the individual schools on the ISBE Report Cards. (Duh.) I calculated the year-to-year data (2008-2009) from the totals given for the “Subregions” (Chicago neighborhood schools), District (all Chicago Public Schools, including selective enrollment, charters, etc.), and State. Triple-checked by me (but see the human error statement above at the end of Set #2), my results are as follows:

Grade     Subregion   District   State

3 → 4     +3.3        +4.2       +2.1

4 → 5     -0.6        -0.2       +0.3

5 → 6     +12.5       +12.8      +6.4

6 → 7     +1.8        +2.2       -1.5

7 → 8     +8.1        +8.1       +5.9
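
To make the arithmetic behind each entry explicit: every cell above is simply the difference between two report-card percentages for the same cohort. The numbers in this tiny sketch are made up purely for illustration, not the actual ISBE figures.

```python
# Each entry above is the difference between two report-card percentages for
# the same cohort. The numbers below are made up purely for illustration:
grade5_2008 = 58.0   # % of 5th graders meeting/exceeding in spring 2008
grade6_2009 = 70.0   # same cohort tested as 6th graders in spring 2009
cohort_change = grade6_2009 - grade5_2008   # +12.0 percentage points
```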

At this point, the pattern was/is indisputable. The percentage of third graders meeting/exceeding “the ISAT standards in reading” rises slightly when they take the test as fourth graders, though close to half of the schools register a decrease. The percentage of fourth graders meeting/exceeding the standards drops when those same students take the test in their fifth grade year. The percentage of fifth graders meeting/exceeding the standards skyrockets in sixth grade, particularly in Chicago, where close to all of the schools register increases. However, when those sixth graders take the test in their seventh grade year, the percentage who meet/exceed the standards nosedives. And finally, the percentage of seventh graders meeting/exceeding the standards, now in eighth grade, increases nicely for a grand elementary school finish.

In other words, the composite percentages of students “meeting/exceeding the ISAT standards in reading” are nice and high for Chicago’s “Bridge” grades (3rd, 6th, and 8th). If they were lower, CPS would have to spend more money to enforce its policy of requiring all students who do not meet a certain (very low) standard to attend summer school, test again, and either pass or be retained based on the scores of the retests. (Conveniently, summer school is not funded by CPS for any other grades.) However, lest anyone draw the very wrong conclusion from this data that such high-stakes testing works, the composite percentages for the high-stakes 7th grade test are comparatively low – a test that has traditionally been used to determine which students (without political clout) are allowed to enter Chicago’s selective enrollment high schools, i.e., Whitney Young & Co.

Note: The (non-double-checked by me/see human error statement above) data for the Math ISAT showed a similar pattern, just much less dramatic.

I am not alone in the CPS/ISBE education universe after all. In fact, I think my students and I do damn well on a RIGGED TEST.

QUEST/DATA SET #5

The last data I crunched was time-lapsed. I calculated, then averaged, the change from one grade level to the next for the District (CPS) and the State over time, starting with 2006. In 2006, for the first time, the ISAT reading test was given to all students in grades three through eight; prior to that, the same grade-to-grade comparisons cannot be made because not every grade level took the reading test each year. The (double-checked by me/see human error statement above) results are as follows, suggesting that these politically expedient test results have been going on since, I fuzzily recall, about the time Arne Duncan became CEO of the CPS:

Grade     DISTRICT   STATE

3 → 4     +4.9       +1.8

4 → 5     -2.2       -1.0

5 → 6     +12.2      +6.9

6 → 7     +5.4       +1.1

7 → 8     +11.8      +7.9
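
A minimal sketch of this multi-year averaging follows, assuming the District or State percentages have been copied into a simple year-by-grade table; the table layout here is my own illustration, not ISBE's.

```python
# Averages the year-over-year cohort change for each grade transition across
# all test years since 2006. `pct` is a placeholder table of District or State
# figures: pct[year][grade] = % meets/exceeds in ISAT reading for that year.

def average_cohort_change(pct, from_grade, to_grade, first_year=2006, last_year=2009):
    changes = []
    for year in range(first_year, last_year):
        before = pct.get(year, {}).get(from_grade)
        after = pct.get(year + 1, {}).get(to_grade)
        if before is None or after is None:
            continue  # that transition is not available for this pair of years
        changes.append(after - before)
    return sum(changes) / len(changes) if changes else None

# e.g. average_cohort_change(district_pct, 5, 6) would reproduce the
# District "5 -> 6" figure above, given the underlying ISBE percentages.
```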

True experts, not self- and media-proclaimed experts who have no experience in the given field, need to verify the information/data presented above. (Substance has someone working on this.) In the meantime, the following are the concerns, questions, and comments of the reporter:

• An independent investigation should take place. Who might have done this to the students and teachers in the state of Illinois, aka the city of Chicago?

• Until an independent investigation is completed, a moratorium on any “standardized” testing, along with any related “high stakes” uses of these tests (failing students, merit pay, school closings/turnarounds, etc.), from ISBE or CPS should take place. And the nation should take heed.

• Transparency is demanded. Let the public see these “standardized tests.” Doesn’t the term imply valid and reliable movement by students through their grade levels in even chunks? Can you spell n-o c-o-n-f-i-d-e-n-c-e?

• The test-takers, aka students, along with their teachers/principals/school communities, have been held accountable long enough. It is now high time to hold the test-makers and their political cheerleaders accountable.

• Did someone in CPS/ISBE confuse “No Child Left Behind” with “Leave the Non-Bridge Grade-Level Children in Chicago Behind to Make Mayoral Control of the Public Schools Look Good”?

• If the politicians of Illinois had legislated merit pay (read: Race to the Top) starting in 2006 based on students’ test scores, it appears that in Chicago/Illinois elementary school teachers of grades 3, 6, and 8 would have suspiciously hit the jackpot, leaving the teachers of the non-bridge grades behind.

• Huberman’s super-dooper-goober-calculating-pooper-scoopers don’t know about or suspect this? Really? Every elementary school in CPS has been presenting them – via local school performance-management and data-crunching teams – with data-driven “concerns” and data-driven “celebrations.” No patterns have arisen?

• Why did CPS/ISBE feel the need to give a “pilot ISAT Test” to 3rd, 6th, and 8th graders recently in December? What is the real what and why of CPS’s new policy for selective enrollment schools? In other words, are the above-demonstrated patterns of test score calculations now being rigged to change with this year’s round of tests? Or next year’s possible “new” test (see Chicago Tribune 01-29-10)?

• What is the real what and why of ISBE wanting to “scrap” (a term used for junk) the ISAT Test? In a cash-starved, supposedly flat broke system in a cash-starved, supposedly flat broke state? Different politics now demands different tests (with different results?) whatever the costs? The 3rd through 8th grade tests the children of Illinois have taken for the past four years now somehow aren’t good enough? Explain, please. Schools have been closed, children have been uprooted, employees have lost their jobs – due to tests now determined to not be somehow appropriate for Barack Obama’s Flunkin’ Duncan national demands?

• Where was/is Senator Meeks and his… gang on the Illinois Senate Education Committee as this was allegedly happening? Participating? Or sleeping at the watchdog helm?

• Where was/is the Chicago “mainstream” media, that is, besides cheerleading all things related to “education reform” especially via “test scores,” and swallowing whole and regurgitating all numbers presented to them by all the CEOs of the CPS over these many years?

• What might this mean for the education policies of the President of the United States and his non-educator education secretary, whose previous incarnation was the politically-appointed CEO of the CPS underlying this alleged data scandal? Such leadership for our nation’s most precious resource, its children?!

I believe our country, following Chicago’s example, is in a Race to the Top of SHAME. 



Comments:

February 4, 2010 at 8:33 AM

By: Susan Ohanian

activist

Thank you for this compelling look at what's going on with standardized tests. Let the rest of the country be warned about the Chicago Model.

February 4, 2010 at 9:06 AM

By: truth seeker

maybe it's time for action now.

When will students be encouraged to systematically organize and boycott this ridiculous sham? Perhaps the union could organize teachers to systematically not administer the tests.

February 4, 2010 at 3:10 PM

By: Jim Horn

blogger professor

Nice work. Berliner and Nichols write about this corrupting influence of high-stakes testing in their book, "Collateral Damage..."

It's called Campbell's Law, and it owes its origin to the work of Donald Campbell.

As if CPS needed another corrupting influence.

February 4, 2010 at 4:25 PM

By: le1212

Wow

Susan, thanks for finding time to expose the absurd and hostile conditions teachers throughout the U.S. face daily.

Yes, high stakes testing should be stopped--everywhere, today! The NCLB accountability model is fraudulent, designed to fail schools, not assist them. This awareness is spreading.

The attack on public education is best summarized by managerial guru Tom Peters, "It's easier to kill an organization than to change it. Big idea: DEATH!"

The pure market/privatization movement is vicious. Reform, in the current context, is a form of destruction, not assistance. Educators, especially those who "know better," need to find ways to survive in this contest.

Oh, and find ways to support the March 4th protests!

In Solidarity

Marc

February 4, 2010 at 5:32 PM

By: Rod Estvan

An easier way to look at ISAT and PSAE cohorts

Susan there is a far easier way to look at cohort ISAT scores than your approach. Go to the ISBE website http://www.isbe.state.il.us/

Now go to the pull-down menu for the interactive report card. Search by district, type in Chicago, and click Chicago SD 299. Click Trends and then select “by cohorts.” Now you can see clear cohort data going from grade to grade for selected years for the whole of CPS. You can even create PDF files of your analysis.

You can also do this for individual schools. For example, I pulled down the data for Uno Charter and did a cohort analysis of students with disabilities who were enrolled at Uno at the sixth grade level in 2006, when 36% were reading at or above state standards; by 2009, only 22% of this same cohort could read at or above state standards for grade 8.

This data is controlled by Northern Illinois University, and I have found it to be invaluable in my research when I need limited information. For larger analyses you need to download files from ISBE and run them through a statistical package. But for most of us this site gives us the information we need.

Rod Estvan

Access Living

February 4, 2010 at 6:54 PM

By: Pat C.

teacher

I understand all of the conspiracy theories. But my understanding of the value-added metric is that it accounts for differences in achievement among the various grade projections. It also factors in about a half dozen other student-specific elements. So, if you received a negative score, that means that, compared to other similar students within CPS at the same grade and with these similar characteristics, you didn't do as well teaching them.

February 4, 2010 at 9:20 PM

By: ChicagoTeacher

Value-Added

Before the value-added measuring, you would prefer to teach in either 6th or 8th grade because it meant more of your kids would meet or exceed.

However, with value-added comparison, your kids' scores are compared to kids in the same grade and from similar socio-economic status. If your 6th graders' value-added score is negative, it means your kids are underperforming relative to other kids in the same grade and of similar socio-economic status.

I don't understand why you aren't embracing value-added measurement. By using the value-added contribution, even a school with a very tough student demographic could easily outshine gifted schools.

February 4, 2010 at 9:24 PM

By: Wendy Swanson

Thanks

Thank you for speaking up. I hope that others will follow up and continue to explore these issues.

February 4, 2010 at 10:13 PM

By: ppr

This happens a lot with state tests

How do you think cut scores are set? Basically by a bunch of politicians sitting around making decisions about how many kids should flunk. It's all a political game. Even the cut scores on the National Assessment of Educational Progress (NAEP) -- the so-called "gold standard" of tests -- are fixed politically. Read the chapter about how that went down in Richard Rothstein's book "Grading Education." The politicos (led by Chester Finn) repeatedly pressed the scientists and statisticians for cut scores, and the scientists repeatedly pushed back, telling them there was no legitimate way to do it. Finally the politicos had their way and set the cut scores high to justify their propaganda about how bad the education system is.

February 4, 2010 at 10:27 PM

By: teacher at a large school

Mental health issues

At our school we have had 10 students (maybe more) this year transferred to various mental health facilities and hospitals and another 11 to juvenile prison--missing many days of school and getting poor grades and poor scores. Too many other students have shared observations of domestic violence within the home, had DCFS called about their own personal abuse, etc. Show me where Value Added takes these numbers and stats into its formula.

February 4, 2010 at 10:46 PM

By: Pat C.

teacher

10:27, if those 10 students have IEPs, that is factored into value-added. I know there are other factors, but I don't have the list here.

February 4, 2010 at 10:47 PM

By: Karen Lewis

BRAVA Susan!!!!

Thank you for taking the time to blow the lid off this madness. It reminds me of the time test scores started flattening out during the Vallas administration (which catapulted him out of Chicago into the unsuspecting arms of Philadelphia). Because education policy is now driven by politicians who have starved public schools, test scores are the fad du jour. They're easy to exploit and interpret and have been used for the past 35 years to denigrate public school teachers and their students. Thanks again for your hard work. This is groundbreaking.

February 5, 2010 at 8:29 AM

By: don perl

What an investigator you are!

Dear Susan,

Thank you so much for investigating this corruption, this malpractice, this sleaze, that so debases us all. Let the word go forth.

February 5, 2010 at 12:35 PM

By: Sarah H.

teacher

Susan, your research is adding credence to the value-added approach because it takes into account all of the variables that could otherwise distort the data.

February 8, 2010 at 10:45 PM

By: Wade Tillett

Fire the 5th grade teachers!

Your results are clear. Statewide, all 5th grade teachers must be fired and replaced by various privately-managed educational service providers. The RFP is open now...

February 10, 2010 at 10:52 AM

By: kugler

Rethinking National Certification For Teachers

There is discussion now about the effectiveness of National Board certification and its costs.

Here is a link to the story:

http://www.chicagonow.com/blogs/district-299/2010/02/rethinking-national-certification-for-teachers.html

October 19, 2010 at 10:51 AM

By: Megan

State Assessments

I understand the purpose of having state assessments, but enough is enough. I have known teachers who have anxiety attacks over state testing. Put National Standards in place, allow districts to manage their own data systems, and put an end to some of the stress great teachers are being faced with.

February 11, 2011 at 8:36 PM

By: Susan Zupan

Verification of "data pattern"

The "pattern" discussed in my report was verified by a Substance statistician prior to its publication here and in the print edition. This "pattern" was also independently verified (with one different finding) after it was printed; the independent party found that the fourth grade scores for the state dropped slightly instead of rising slightly. For me, two basic questions still remain: Has the ISAT Test ever been a valid assessment? ("Reliably," 3rd, 6th, and 8th graders basically "outperform" 4th, 5th, and 7th graders year after year.) And if not, how can anyone (read: school boards and politicians) continue to base students' promotions and now consider basing teachers' salaries on such tests?
