Why iReady is Dangerous

This school year Fairfax County Public Schools, the 10th largest school division in the United States, adopted the iReady assessment as a universal screener across all of its elementary schools. Students in grades K-6 take these assessments individually on the computer three times per year, and the results are made available to both teachers and parents.

According to Curriculum Associates, the company that makes iReady, these assessments are an “adaptive Diagnostic for reading and mathematics [that] pinpoints student need down to the sub-skill level, and [provides] ongoing progress monitoring [to] show whether students are on track to achieve end-of-year targets.”

The Fairfax County Public Schools website further asserts that iReady is a “tool that has the potential to streamline Responsive Instruction processes, promote early identification and remediation of difficulties and improve student achievement.”

While I have found this assessment deeply troubling all year, it has taken me a while to articulate exactly why I think it is so dangerous, and why I think we need to use our voices as teachers, administrators, and parents to speak out against it.*

So, let’s get back to the claim in the title of this blog post. iReady is dangerous. This might sound like hyperbole. After all, this is just a test, right? In this era of public schooling, children take many assessments, some more useful than others, so what’s the big deal with iReady?

I could spend this blog post writing about how these tests are designed to be short yet often stretch on for hours, but I won’t.

I could spend this blog post writing about how this assessment includes test items that are mathematically inaccurate or misleading, but I won’t. (Ok, I actually will just a little at the end of this blog post.)

These things make the test a bad test, but what makes it truly dangerous is the way iReady reports data and makes suggestions for instruction. Stick with me for a moment as I explain how this works.

After a student takes the iReady screener in the area of mathematics, the teacher can download individual and class reports. Each child receives an overall scale score for the math assessment as a whole, as well as a scale score in each of the four domains of 1) number and operations, 2) algebra and algebraic thinking, 3) measurement and data, and 4) geometry.

Based on the scores, iReady generates a report for each student in each of the domains. The report offers a bulleted list of what the student can do and next steps for instruction. However, if you take a look at the fine print, you’ll learn that these reports are not generated from the specific questions the child answered correctly or incorrectly, but rather are a generic list of what iReady thinks students who score in the same range in that domain likely need.

[Screenshot: part of FCPS training for school leaders on understanding iReady data.]
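
To make that mechanism concrete, here is a minimal sketch of how a band-based report works. To be clear, the scale-score ranges, the domain, and the “next steps” text below are entirely my own invention for illustration; they are not Curriculum Associates’ actual numbers or wording. The point is simply that the recommendation text is looked up from the score band a child lands in, never from the particular items she answered.

```python
# Hypothetical sketch of band-based reporting (invented ranges and text).
# The "next steps" come from whichever band the scale score falls in;
# the child's actual item-by-item responses are never consulted.
NEXT_STEPS_BY_BAND = [
    (range(300, 370), ["Count within 100", "Add and subtract within 20"]),
    (range(370, 440), ["Understand place value", "Add and subtract within 100"]),
    (range(440, 500), ["Multiply within 100", "Use properties of operations"]),
]

def report_for(scale_score, item_responses=None):
    """Return the generic recommendations for a Number and Operations score.

    Note that item_responses is accepted but ignored -- that is the point.
    """
    for band, next_steps in NEXT_STEPS_BY_BAND:
        if scale_score in band:
            return next_steps
    return []

# Two students who missed completely different questions but landed on the
# same scale score receive word-for-word identical "next steps."
print(report_for(412, item_responses={"Q7": "wrong", "Q12": "right"}))
print(report_for(412, item_responses={"Q7": "right", "Q12": "wrong"}))
```

That, in miniature, is what the fine print is admitting: the report describes the band, not the child.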

The teacher can never see the questions the child answered correctly or incorrectly, nor can she even access a description of the kinds of questions the child answered correctly or incorrectly. The most a teacher will ever know is that a child scored poorly, for example, in number and operations. Folks, that is a giant category, and far too broad to be actionable.

But above all else, the iReady Universal Screener is a dangerous assessment because it is a dehumanizing assessment. The test strips away all evidence of the student’s thinking, of her mathematical identity, and instead assigns broad and largely meaningless labels. The test boils down a student’s entire mathematical identity to a generic list of skills that “students like her” generally need, according to iReady. And yet despite its lumping of students into broad categories, iReady certainly doesn’t hesitate to offer very specific information about what a child likely can do and what the next instructional steps should be.

On paper, one of the goals of iReady is to increase equity, to make sure everyone has access to understanding in mathematics. I’ve learned from thoughtful folks who are dedicated to equity in mathematics education (including Professors Rochelle Gutiérrez, Danny Martin, and Christopher Emdin) that we must question practices that purport to increase equity but actually serve to reinforce the status quo. iReady, and assessments of this nature, overwhelmingly identify poor students and students of color as most in need of intervention.

So, what does that mean for these students’ instruction? What does it mean for how we position teachers to view their students? Will it mean that instead of rich mathematical experiences these students are relegated to computer-based or scripted intervention (conveniently sold by the same company that makes the assessment)? Will it mean that these students are denied access to the bigger picture of what it means to do mathematics and be a mathematician, and are instead fed a steady diet of, in the words of the makers of iReady, tiny discrete skills “down to the sub-skill level”?

FCPS defends against critiques of the iReady assessment by asserting that teachers should use iReady as a screener to identify students “at risk,” not as a diagnostic assessment. I think this defense wears thin when schools begin to use iReady assessment data as a measure of growth on their School Improvement Plans. I think this defense wears thin when schools print out the reports and use them to sort and label children for intervention in data dialogue meetings.

Here’s where I return to my earlier point about the iReady universal screener containing inaccurate and misleading questions. I was given access to an unused student account at the beginning of the year to “see what the iReady test is like.” In my first ten minutes of clicking around, I came across a myriad of troubling items that made me seriously question whether the makers of this assessment have any business writing a math test, let alone making instructional recommendations. Here’s one example. I’ve changed the context (it wasn’t originally cats and dogs) and the numbers (it wasn’t originally 7 and 11), but otherwise the question and answer choices are identical.

[Screenshot of the reworded iReady question and its answer choices.]
Can you figure out which equation is the correct answer? I can’t, because the first three equations are all correct! Strong math instruction encourages students to understand the inverse relationship between addition and subtraction and to solve problems in flexible ways; a student might think about 7+4=11, 11-7=4, or 11-4=7 to solve this problem. And perhaps most importantly, strong math instruction encourages students to develop a deep understanding of the equal sign as a symbol that describes a relationship between the two sides of the equation, not a symbol that means “here comes the answer!”
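
If you want to check that for yourself, here is a trivial snippet (mine, not iReady’s; the numbers match my reworded version of the item) confirming that all three of those relationships hold:

```python
# The same fact family (4, 7, 11) written three ways; a student who reads the
# equal sign as a relationship can use any of these to solve the problem.
statements = {
    "7 + 4 = 11": 7 + 4 == 11,
    "11 - 7 = 4": 11 - 7 == 4,
    "11 - 4 = 7": 11 - 4 == 7,
}
for statement, is_true in statements.items():
    print(statement, "->", is_true)  # all three print True
```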

But as a teacher whose students take the iReady test, you’ll never see this question. You’ll never have the opportunity to decide whether the test or the question is worth using to assess student thinking.

The good news is that we can do better. Fairfax County Public Schools, you can do better! We can all do better. We can make it our top priority to assess in ways that allow us to collect data on student thinking and strategies. We can only do that by investing in teachers who learn to listen to students’ thinking about math, analyze student work and choose assessments that allow us to see how students engage with the math. This is what helps us choose next steps for instruction.

When I started working for Fairfax County Public Schools twelve years ago I knew very little about math or how children learn math. But I was lucky to end up in a district that invests in teachers. I had amazing math coaches (who inspired me to become a math coach!) and support from the Title I office. I took courses in Cognitively Guided Instruction (CGI) and Developing Mathematical Ideas (DMI), learned how to use the Investigations curriculum well, and wrote a book about nurturing young mathematicians through small group instruction. I say this to point out that tremendous resources were poured into me (and many others!) as a classroom teacher and a coach to help me learn to listen to students and teach and assess responsively.

The best “screener” is a knowledgeable teacher and our first question of any potential assessment should be, “Does it provide a window into student thinking or is student thinking hidden behind scale scores and graphs?”

So, Fairfax County and all the other districts directing millions and millions of dollars towards iReady, I know we can do better. We have to.

*Note: It’s important for me to acknowledge that writing publicly about iReady carries much less risk for me than for my colleagues. I am about to finish my last day of work for FCPS after serving as a classroom teacher and math coach for twelve years in the district. Many of the colleagues I’ve spoken to about iReady feel similarly to me, but feel powerless, or even scared, to speak out against it in a public way. Others who have spoken out feel their voices have gone unheard.

34 thoughts on “Why iReady is Dangerous”

    1. AMEN. We are required to use iReady and their workbooks. I should have bought stock in iReady… I feel sad for our kids.

  1. Here in the UK, across the pond, the issues are no different. That a child’s educational achievements are described by levels or grades via compulsory national testing at various ages is more than nonsensical; it’s a complete anathema to what an education means and what education should be for. Tests show what students have understood or achieved in terms of being able to conjecture, to reason, to generalise or to prove. Tests become what teachers feel they have to teach to; thus their professionalism is stripped away and is replaced not by the joy of having done a ‘good’ job but by the grades their students are awarded. Billy Bragg has an excellent way of describing such an appalling state of affairs on his new album, Bridges Not Walls. The lines are: “Not everything that counts can be counted; not everything that can be counted counts.”

  2. Amen! We use MAP, not iReady, but same difference. One of the dangers I see all over my district is that these tests and the “suggested next steps” lead to teaching skills in isolation. The program spits out that a kid is ready to multiply two-digit numbers in 2nd grade, and teachers start teaching it with zero context and without laying a strong foundation. Dangerous! What’s worse: when this is done during “intervention blocks,” when kids in a certain test score band are grouped together and travel to another teacher (who doesn’t know them) so she can drill them for 3 sessions and then move on to something different, just in the hope of clicking a few more multiple choice answers correctly and getting that “growth!” It’s so upsetting to me!

    1. What concerns me a great deal about adopting published schemes is the failure to recognise the importance of the emotional aspects of learning and the sets of relationships between:
      students & mathematics (S/M), teachers & mathematics (T/M), and teachers & students (T/S). Classrooms are emotional places, and whilst the S/M relationship is central, the teacher can only impact on this through a combination of T/M with T/S. After all, the teacher cannot DO the learning for the students!

  3. Good for you for speaking up! And good for you for being very specific about your concerns. You know who to tell, and I hope that you have the opportunity to do so.

  4. Amen! Thank you for taking the time to post this. I have the same feelings around the NWEA MAP test, only there aren’t purchases of additional resources hinging on or waiting to be sold following the test. I was concerned about your write-up, wondering what would happen to you in your job, until I read to the end. Isn’t it sad that we feel valued in many ways by our districts, yet paralyzed when it comes to having any true impact on chosen resources. Best of luck in your new district!

  5. Thank you for putting into words (so eloquently!) my objections to iReady. I would add that iReady also sucks up valuable instructional time.

    Sorry to hear that you’re leaving FCPS. What a loss for the district! What’s next for you?

    1. Agreed, Gail! It does funnel our limited instructional time into an assessment that doesn’t provide any information of real value to me in terms of what a student understands mathematically. I’ve also seen students who I KNOW are above grade level score very low (I’m talking the first percentile) on this thing. It does not seem to be a very reliable source of data.

      Also, I agree that Kassia leaving us is a HUGE loss for our county! But, she is so gifted and has so much to teach not just students from our county, but nationally and internationally through her writing and research. You will be missed, Kassia. 🙂

  6. I’ve also felt the same about iReady and NWEA for quite some time, but it has also been difficult for me to speak up. Thank you for articulating this in such a professional way. Our students need a voice. I will be sharing.

    1. Thank you for being the facilitator of strategies and knowledge of math and best practices. I am a more open-minded, confident teacher, because of you.

  7. Very well written post. I appreciate your passion for thinking and learning in the true context. I am curious: where are you going next? Who will see your post that needs to? What will the school district do to keep the child in focus instead of the product that looks good on paper?

  8. My kids despise it! I have heard directly from teachers that it is of no use to them and only causes more anxiety in the children. They already take 2-3 SOLs! My son was finishing 2nd grade and came home in tears and hysterical because he was stuck on a question and all of his friends were done while he was on #30 (he was convinced there were 50). I asked him to write down the question and quickly realized it was not a 2nd grade math problem. When I approached the school, they agreed that it was 6th grade level math, a concept my son had not started in school, and that the program keeps going if he gets a problem right…without cutting off when he has exceeded a certain level. So, my takeaway? iReady is inaccurate, stress inducing, and yet another “measure” that does more harm than good. Why should my child be pushed to tears when the program is so poorly designed?!

    1. I had students (high school) who quit answering questions accurately just to get the test to stop, both with the MAP and with a reading program. An administrator who was neither a special education teacher nor a language arts teacher chastised me for not using small group recommendations for skill instruction generated by computer algorithms. Of course there was no consideration of abilities and disabilities, not to mention group dynamics. Not following the reading program “with fidelity” became the excuse for terminating me. Once control of special ed was ceded to management types with no professional expertise, I was all of a sudden no longer a competent special education teacher.

  9. Bravo. You articulated this so wonderfully. Same holds true for English Language Arts. Thanks for writing this and sharing.

  10. Thank you, Kassia, for your thoughts. It has really caused me to think more about these types of assessments. I think it is important that they occupy their proper place. Maybe they can give us preliminary information about a kid’s math knowledge, but using them to recommend specific interventions is not helpful. I have really never had a kid who had a great result on the STAR (the assessment we use) and was not a pretty strong math student. Likewise, a student who performs quite poorly generally is in need of some assistance, although the recommendations given have to be taken with a large chunk of salt. I am not sure that dehumanizing is the right word to use, however. That seems a bit strong and maybe should be reserved for much more severe situations than a math screener.

    1. Thanks for your comment, Josh.

      I’m not super familiar with the STAR, but I have done quite a bit of comparison between iReady tests and how students do on other assessments at my particular school. At our school it is not uncommon to have a kid score poorly on iReady and fine on our CGI assessment, or not so great on our assessments and well on iReady.

      I think I might have agreed with you a year ago that dehumanizing was too strong a word. However, I now strongly believe it to be an appropriate word. I’ve learned a lot from Professor Rochelle Gutiérrez on this topic. She has a talk where she discusses dehumanizing practices in math.

      I also think we can use the word dehumanizing and not have to mean the MOST horrific dehumanizing experiences possible, such as genocide or separating children from their families. Of course, those are the most horrific things.

      1. I wonder if there are assessments out there that don’t necessarily require interviewing the students (which is important, for sure) but that can give us some basic information about how kids are doing. As with reading, I do think that ultimately we need to sit and talk to students and listen to how they solve problems in order to have a deep sense of what they are doing. Is there a way to get at least some information without the logistical issues that this might create?

  11. Kassia, are there alternative web-based math intervention programs (that include assessment and instruction) that you feel are better? Are you familiar with DreamBox?

  12. Good question “GR” – ARE there other web-based intervention programs that you feel are better such as iStation, Let’s Go Learn, Dreambox, etc…

    1. This is such a hard question. I think we’re asking a computer program to do something that only a skilled teacher can do.
      That being said, I’m familiar with Dreambox and it’s the best of its type of program I’ve seen. I think there’s a lot of good in the math they present, although it isn’t perfect. However, I wouldn’t allow Dreambox to replace a teacher intervention.

  13. Our district used iReady and it was awful. We sat in a computer lab for 2 PDs while the “trainer” basically did a sales pitch, both times. They never answered our questions, just repeated what was probably a scripted bit of nonsense. After all that money, we stopped using it after 2 years… such a waste of time for the kids and teachers, and a huge waste of money that could have gone to real instruction.

  14. “The test strips away all evidence of the student’s thinking, of her mathematical identity, and instead assigns broad and largely meaningless labels. The test boils down a student’s entire mathematical identity to a generic list of skills that “students like her” generally need, according to iReady.”

    And I bet they refer to this as Personalized Learning!
