This school year Fairfax County Public Schools, the 10th largest school division in the United States, adopted the iReady assessment as a universal screener across all of its elementary schools. Students in grades K-6 take these assessments individually on the computer three times per year, and the results are made available to both teachers and parents.

According to Curriculum Associates, the company that makes iReady, these assessments are an “adaptive Diagnostic for reading and mathematics [that] pinpoints student need down to the sub-skill level, and [provides] ongoing progress monitoring [to] show whether students are on track to achieve end-of-year targets.”

The Fairfax County Public Schools website further asserts that iReady is a “tool that has the potential to streamline Responsive Instruction processes, promote early identification and remediation of difficulties and improve student achievement.”

While I have found this assessment deeply troubling all year, it has taken me a while to be able to articulate exactly why I think this assessment is so dangerous, and why I think we need to use our voices as teachers, administrators and parents to speak out against it.*

So, let’s get back to the claim in the title of this blog post. iReady is dangerous. This might sound like hyperbole. After all, this is just a test, right? In this era of public schooling, children take many assessments, some more useful than others, so what’s the big deal with iReady?

I *could* spend this blog post writing about how these tests are designed to be short, but often stretch on for hours, but I won’t.

I *could* spend this blog post writing about how this assessment includes test items that are mathematically inaccurate or misleading, but I won’t. (Ok, I actually will just a little at the end of this blog post.)

These things make the test a bad test, but what makes it truly dangerous is the way iReady reports data and makes suggestions for instruction. Stick with me for a moment as I explain how this works.

After a student takes the iReady screener in the area of mathematics, the teacher can download individual and class reports. Each child receives an overall scale score for the math assessment as a whole, as well as a scale score in each of the four domains of 1) number and operations, 2) algebra and algebraic thinking, 3) measurement and data, and 4) geometry.

Based on the scores, iReady generates a report for each student for each of the domains. The report offers a bulleted list of what the student can do and next steps for instruction. However, if you take a look at the finer print you’ll learn that these reports are not generated from the specific questions that the child answered correctly or incorrectly, but rather are a generic list based on what iReady thinks that students who score in this same range in this domain *likely* need.

(Part of FCPS training for school leaders on understanding iReady data.)

The teacher can never see the questions the child answered correctly or incorrectly, nor can she even access a description of the kinds of questions the child answered correctly or incorrectly. The most a teacher will ever know is that a child scored poorly, for example, in number and operations. Folks, that is a giant category, and far too broad to be actionable.

But above all else, the iReady Universal Screener is a dangerous assessment because it is a dehumanizing assessment. The test strips away all evidence of the student’s thinking, of her mathematical identity, and instead assigns broad and largely meaningless labels. The test boils down a student’s entire mathematical identity to a generic list of skills that “students like her” generally need, according to iReady. And yet, despite lumping students into broad categories, iReady certainly doesn’t hesitate to offer very specific information about what a child *likely* can do and what next instructional steps should be.

On paper, one of the goals of iReady is to increase equity, to make sure everyone has access to understanding in mathematics. I’ve learned from thoughtful folks who are dedicated to equity in mathematics education (including Professors Rochelle Gutiérrez, Danny Martin, and Christopher Emdin) that we must question practices that purport to increase equity but actually serve to reinforce the status quo. iReady, and assessments of this nature, overwhelmingly identify poor students and students of color as most in need of intervention.

So, what does that mean for these students’ instruction? What does it mean for how we position teachers to view their students? Will it mean that instead of rich mathematical experiences these students are relegated to computer-based or scripted intervention (conveniently sold by the same company that makes the assessment)? Will it mean that these students are denied access to the bigger picture of what it means to do mathematics and be a mathematician and are instead fed a steady diet of, in the words of the makers of iReady, tiny discrete skills “down to the sub-skill level”?

FCPS responds to critiques of the iReady assessment by asserting that teachers should use iReady as a screener to identify students “at risk,” not as a diagnostic assessment. I think this defense wears thin when schools begin to use iReady assessment data as a measure of growth on their School Improvement Plans. I think this defense wears thin when schools print out the reports and use them to sort and label children for intervention in data dialogue meetings.

Here’s where I return to my earlier point about the iReady universal screener containing inaccurate and misleading questions. I was given access to an unused student account at the beginning of the year to “see what the iReady test is like.” In my first ten minutes of clicking around, I came across a myriad of troubling questions that made me seriously question whether the makers of this assessment have any business writing a math test, let alone making instructional recommendations. Here’s one example. I’ve changed the context (it wasn’t originally cats and dogs) and numbers (it wasn’t originally 7 and 11), but otherwise the question and answer choices are identical.

Can you figure out which equation is the correct answer? I can’t, because the first three equations are all correct! Strong math instruction encourages students to understand the inverse relationship between addition and subtraction and to solve problems in flexible ways that might involve thinking about 7+4=11, 11-7=4 or 11-4=7 to solve this problem. And perhaps most importantly, strong math instruction encourages students to develop a deep understanding of the equal sign as a symbol that describes a relationship between the two sides of the equation, not a symbol that means “here comes the answer!”

But as a teacher whose students take the iReady test, you’ll never see this question. You’ll never have the opportunity to decide whether the test or the question is worthy of use in assessing student thinking.

The good news is that we can do better. Fairfax County Public Schools, you can do better! We can all do better. We can make it our top priority to assess in ways that allow us to collect data on student thinking and strategies. We can only do that by investing in teachers who learn to listen to students’ thinking about math, analyze student work and choose assessments that allow us to see how students engage with the math. This is what helps us choose next steps for instruction.

When I started working for Fairfax County Public Schools twelve years ago I knew very little about math or how children learn math. But I was lucky to end up in a district that invests in teachers. I had amazing math coaches (who inspired me to become a math coach!) and support from the Title I office. I took courses in Cognitively Guided Instruction (CGI) and Developing Mathematical Ideas (DMI), learned how to use the Investigations curriculum well, and wrote a book about nurturing young mathematicians through small group instruction. I say this to point out that tremendous resources were poured into me (and many others!) as a classroom teacher and a coach to help me learn to listen to students and teach and assess responsively.

The best “screener” is a knowledgeable teacher and our first question of any potential assessment should be, “Does it provide a window into student thinking or is student thinking hidden behind scale scores and graphs?”

So, Fairfax County and all the other districts directing millions and millions of dollars towards iReady, I know we can do better. We have to.

*Note: It’s important for me to acknowledge that writing publicly about iReady carries much less risk for me than for my colleagues. I am about to finish my last day of work for FCPS after serving as a classroom teacher and math coach for twelve years in the district. Many of the colleagues I’ve spoken to about iReady feel similarly to me, but feel powerless, or even scared, to speak out against it in a public way. Others who have spoken out feel their voices have gone unheard.

You go girl, well thought out!

AMEN. We are required to use IReady and their workbooks. I should have bought stock in I Ready….. I feel sad for our kids.

God bless you for caring enough about the students who are falling through the cracks and suffering from the repetitive, constant regurgitation iReady imposes, forcing these kids to review lessons they may not even need. As a result, these students are missing the review time they actually need more of, because of misguidance from iReady!!

Here in the UK, across the pond, the issues are no different. That a child’s educational achievements are described by levels or grades via compulsory national testing at various ages is more than nonsensical; it’s a complete anathema to what an education means and what education should be for. Tests show what students have understood or achieved in terms of being able to conjecture, to reason, to generalise or to prove. Tests become what teachers feel they have to teach to; thus their professionalism is stripped away and is replaced not by the joy of having done a ‘good’ job but by the grades their students are awarded. Billy Bragg has an excellent way of describing such an appalling state of affairs on his new album, Bridges Not Walls. The lines are: “Not everything that counts can be counted; not everything that can be counted counts”.

Oops I meant to write “Tests do not show…”

Amen! We use MAP, not iReady but same difference. One of the dangers I see all over my district is that these tests and the “suggested next steps” lead to teaching skills in isolation. The program spits out that a kid is ready to multiply two digit numbers in 2nd grade and teachers start teaching it with zero context and without laying a strong foundation. Dangerous! What’s worse- when this is done during “intervention blocks,” when kids in a certain test score band are grouped together and travel to another teacher (who doesn’t know them) so she can drill them for 3 sessions and then move on to something different. Just in the hope of clicking a few more multiple choice answers correctly and getting that “growth!” It’s so upsetting to me!

What concerns me a great deal about adopting published schemes is the failure to recognise the importance of the emotional aspects of learning and the sets of relationships between:

students & mathematics (S/M), teachers & mathematics (T/M), and teachers & students (T/S). Classrooms are emotional places, and whilst the S/M relationship is central, the teacher can only impact it through a combination of T/M with T/S. After all, the teacher cannot DO the learning for the students!

Good for you for speaking up! And good for you for being very specific about your concerns. You know who to tell, and I hope that you have the opportunity to do so.

Brava!!! It’s like you read my mind but articulated my thoughts much better than I could myself.

Amen! Thank you for taking the time to post this. I have the same feelings around the NWEA MAP test, only there aren’t purchases of additional resources hinging on or waiting to be sold following the test. I was concerned about your write-up, wondering what would happen to you in your job, until I read the end. Isn’t it sad that we feel valued in many ways by our districts, yet powerless to have any true impact on chosen resources. Best of luck in your new district!

Thank you for putting into words (so eloquently!) my objections to iReady. I would add that iReady also sucks up valuable instructional time.

Sorry to hear that you’re leaving FCPS. What a loss for the district! What’s next for you?

Agreed, Gail! It does funnel our limited instructional time into an assessment that doesn’t provide any information of real value to me in terms of what a student understands mathematically. I’ve also seen students who I KNOW are above grade level score very low (I’m talking the first percentile) on this thing. It does not seem to be a very reliable source of data.

Also, I agree that Kassia leaving us is a HUGE loss for our county! But, she is so gifted and has so much to teach not just students from our county, but nationally and internationally through her writing and research. You will be missed, Kassia. 🙂

I’ve also felt the same about iReady and NWEA for quite some time, but it has also been difficult for me to speak up. Thank you for articulating this in such a professional way. Our students need a voice. I will be sharing.

Thank you for being the facilitator of strategies and knowledge of math and best practices. I am a more open-minded, confident teacher, because of you.

Thank you, Tracey.

I hope you get some traction. This is a miserable situation. (For students and teachers)

In my county in Florida, i-Ready is also used to evaluate teachers…scary!

Us too in Tennessee!

Very well written post. I appreciate your passion for thinking and learning in the true context. I am curious: where are you going next? Will your post be seen by those who need to see it? And what will the school district do to keep the child in focus, instead of the product that looks good on paper?

My kids despise it! I have heard directly from teachers that it is of no use to them and only causes more anxiety in the children. They already take 2-3 SOLs! My son was finishing 2nd grade and came home in tears, hysterical because he was stuck on a question while all of his friends were done; he was on #30 (and he was convinced there were 50). I asked him to write down the question and quickly realized it was not a 2nd grade math problem. When I approached the school, they agreed that it was 6th grade level math, a concept my son had not started in school, and that the program keeps going if he gets a problem right…without cutting off when they have exceeded a certain level. So, my takeaway? iReady is inaccurate, stress inducing, and yet another ”measure” that does more harm than good. Why should my child be pushed to tears when the program is so poorly designed?!

I had students (high school) who quit answering questions accurately just to get the test to stop both with the MAP and with a reading program. An administrator who was neither a special education teacher nor taught language arts chastised me for not using small group recommendations for skill instruction generated by computer algorithms. Of course there was no consideration of abilities and disabilities not to mention group dynamics. Not following the reading program “with fidelity” became the excuse for terminating me. Once control of special ed was ceded to management types with no professional expertise, I was all of a sudden no longer a competent special education teacher.

I am going through a similar situation. My daughter is in 3rd grade gifted. She is practically starting fractions, so I know her ability to do higher levels of math is there, BUT she is scoring at a 2nd grade level in iReady, and the material they teach is very different from what is taught in class. This is INSANITY!!!!! The stress alone should be indicative of how poorly this program is designed. It should encourage kids, NOT deflect them.

Bravo. You articulated this so wonderfully. Same holds true for English Language Arts. Thanks for writing this and sharing.

Thank you Kassia for your thoughts. It has really caused me to think more about these types of assessments. I think that it is important that they occupy their proper place. I think that maybe they can give us preliminary information about a kid’s math knowledge, but to use them to recommend specific interventions is not helpful. I have really never had a kid who had a great result on the STAR (the assessment we use) and was not a pretty strong math student. Likewise, a student who performs quite poorly generally is in need of some assistance, although the recommendations given have to be taken with a large chunk of salt. I am not sure that dehumanizing is the right word to use, however. That seems a bit strong and maybe should be reserved for much more severe situations than a math screener.

Thanks for your comment, Josh.

I’m not super familiar with the STAR, but I have done quite a bit of comparison between iReady tests and how students do on other assessments at my particular school. At our school, it’s not uncommon for a kid to score poorly on iReady and fine on our CGI assessment, or not so great on our assessments and well on iReady.

I think I might have agreed with you a year ago that dehumanizing was too strong a word. However, I now strongly believe that it is an appropriate word. I’ve learned a lot from Professor Rochelle Gutierrez on this topic, particularly from a talk she has given about dehumanizing practices in math.

I also think we can use the word dehumanizing and not have to mean the MOST horrific dehumanizing experiences possible, such as genocide or separating children from their families. Of course, those are the most horrific things.

Here’s the link to a talk by Professor Gutierrez that I think is really interesting on the topic of both DEhumanizing mathematics and REhumanizing mathematics. https://www.youtube.com/watch?v=D266LYIigS0

I wonder if there are assessments out there that don’t necessarily require interviewing the students (which is important for sure) that can give us some basic information about how kids are doing. As with reading, I do think that ultimately we need to sit and talk to students and listen to how they solve problems in order to have a deep sense of what they are doing. Is there a way to get at least some information without the logistical issues that this might create?

I will definitely watch the video-Thank you.

Kassia are there alternative web-based math intervention programs (that include assessment and instruction) that you feel are better? Are you familiar with DreamBox?

Good question “GR” – ARE there other web-based intervention programs that you feel are better such as iStation, Let’s Go Learn, Dreambox, etc…

This is such a hard question. I think we’re asking a computer program to do something that only a skilled teacher can do.

That being said, I’m familiar with DreamBox, and it’s the best program of its type that I’ve seen. I think there’s a lot of good in the math they present, although it isn’t perfect. However, I wouldn’t allow DreamBox to replace a teacher-led intervention.

Our district used iReady and it was awful. We sat in a computer lab for 2 PDs while the “trainer” basically did a sales pitch – both times. He never answered our questions, just repeated what was probably a scripted bit of nonsense. After all that money, we stopped using it after 2 years… such a waste of time for the kids and teachers, and a huge waste of money that could have gone to real instruction.

Thank you !

Reblogged this on The Seminary of Praying Mantis.

“The test strips away all evidence of the student’s thinking, of her mathematical identity, and instead assigns broad and largely meaningless labels. The test boils down a student’s entire mathematical identity to a generic list of skills that “students like her” generally need, according to iReady.”

And I bet they refer to this as Personalized Learning!

This is sooooo true! I totally dislike these tests, and I dislike the Ready Common Core workbooks and assessments. The stories in the workbook are good, for the most part, but the assessments are very difficult and over the students’ heads. But, it is our “curriculum”. Now Iowa State is using these diagnostics from Curriculum Associates, too. Really??

Yes! Someone who understands how horrible i ready is!

I appreciated your thoughtful analysis of this program. I’m a third grade teacher struggling to understand the rankings i-Ready gives my students (they never seem to match their in-class/formative performance). Today, during a growth diagnostic, a student asked me a very similar question to your dog/cat dilemma. There were two acceptable answers, and I was appalled that one of those choices was going to be marked incorrect.

I feel validated reading your comment. My daughter is in 3rd grade gifted with a 3.78 in her math grades and is scoring at a 2nd grade level on iReady. It is appalling. She says it’s confusing and irrelevant to her school work. An 8-year-old’s EXACT words, just to show you how eloquently she speaks. I came to tears when I saw her diagnostic, and then I was immediately upset at the system because I know my kid’s ability. I refuse to punish her for this. All I can do is tell her to try her best and keep trying. This saddens me deeply for our kids.

I’ll put it another way…iReady is junk! My district is whole hog into iReady. We even require our kindergarteners to take the iReady diagnostic in the first month of the school year. Why? Many of our kinders can’t even read! And then we (teachers) are required to examine the data. What data? It’s junk. Our students dislike iReady because they are saturated by it. Sadly, many parents use iReady as punishment, meaning they require their kids to go on iReady for hours because they believe that it will make them smarter. The iReady diagnostic tests have no time limits, and it takes days to complete them. Valuable instruction time is lost because we are required to administer the iReady diagnostic test four times during the school year. We are supposed to be a profession of highly educated people, and I know that we can do much better.

People jump very quickly to blame a test instead of being critical thinkers and asking how they can use the test to give them information. There is no substitute for good teaching, for being responsive to students in real time.

The real danger is teachers’ overreliance on any one assessment or curriculum. What happened to the craft of teaching? No matter what tools or technology come our way, we really have a responsibility to uphold human judgement.

I agree that learning to analyze student work and thinking is a critical practice for teachers. However, for the reasons I argue in the blog post, I don’t think iReady offers us data that is usable for anything besides predicting how students will do on other standardized tests. We need assessment that allows us to see students’ thinking, study what they know, and think about how we can best respond.

Kassie,

I’m not sure I agree with your comments about the test question you provided at the end of your article. Yes, I agree there is more than one correct answer. However, there is no way for you to know that the question was counted incorrect if a student chose something other than 11 – 7 = 4. I say this because I just came across a similar case with one of the interactive videos. It had the students choosing an equation based on a picture. There were actually two equations that were correct, but the student could only select one. The one they selected was counted correct. Then it told us that there was actually another correct equation, and allowed us to select another one. I would say from that experience that it is possible that any of the three would have been counted as correct.

Thanks for your comment, Sue. I think you’ve really identified one of the core problems with this kind of assessment: teachers can’t access the questions, nor can they access any evidence of student thinking, not even which questions an individual student answered accurately or inaccurately. These are really crucial for making any kind of instructional decision.

Thank you for sharing. I wish I knew how to get my district to stop using iready. My 7 year-old gets sooo bored with it and it is not at all a reflection of his abilities. His IQ is in the 99.9th%, his MAPs scores were in the 99.9th% since he started kindergarten. Then all of a sudden the district switches to iready and his reports start showing him at or below grade level. WTF?! I know he didn’t lose knowledge that quickly. He is super bored and just clicking through! I sat down to do it with him and saw that he had to do two digit addition for weeks on end even when he received perfect scores. I finally asked the teacher about this and she admitted it was an issue and then had to get special permission to move him up a level in math. I suspect it may be helpful to identify those kids at risk but not at all helpful to monitor progress in kids who weren’t at risk in the first place. And our school district DEFINITELY believes it to be a valuable measure of all students, which is clearly not true.

Thank you for this post. I am a parent of an I-ready kindergartner and I am constantly taken aback at the vagueness of the questions. Many have multiple answers and as a college educated adult I was completely dumbfounded that even I got a few answers wrong (on a kindergarten test!)

Thank you so much for this post. I have encountered the same issues with my child and the i-Ready assessment. It is shameful that decision-level educators are not acknowledging the drawbacks, which affect all age levels and provide false or very misleading information across the board.

Kudos to you and all who have shown the courage to speak out about the disservice of an assessment tool that has hijacked our kids’ enthusiasm for school and desire to learn.