AASD Fall i-Ready Results For Reading

On 11/09/2020 the Board of Education held a work session to review the Fall 2020 i-Ready student assessment results. These meetings are pretty much always a slog, and this one was no exception. They covered both the Math and Reading assessment results in this presentation, but, due to time constraints, I will only be covering the overview of i-Ready and the Reading results now.

This is the first year i-Ready is being implemented across all schools. Last year it was piloted by 5 elementary schools, 2 middle schools, and 1 charter school. Since it's so new, Steve Harrison, the Assistant Superintendent for Assessment, Curriculum and Instruction, gave a presentation on what it is and how it works.

i-Ready is a computer-based assessment for reading and math that is replacing the MAPP assessment. Like MAPP, it will be given in fall, winter, and spring. It's an adaptive program: a series of correctly answered questions results in the student being asked slightly harder questions, while a series of incorrect answers results in slightly easier ones. The end goal is to reach a point where the student is answering about half the questions correctly. The purpose of this is NOT to give a score or grade but instead to determine how best to support student learning in math and reading.
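To make the mechanics concrete, here is a toy sketch of the kind of adaptive loop Steve described. The step size, question count, and logistic answer model are my own illustrative assumptions; i-Ready's actual algorithm is proprietary and was not detailed in the presentation.

```python
import math
import random

# Toy adaptive-test loop. The step size, question count, and the
# logistic answer model are illustrative guesses, not i-Ready's
# actual (proprietary) algorithm.
def run_adaptive_test(student_skill: float, num_questions: int = 30) -> float:
    difficulty = 0.0  # start near grade-level difficulty
    for _ in range(num_questions):
        # A student answers correctly more often when their skill
        # exceeds the question's difficulty (logistic model).
        p_correct = 1.0 / (1.0 + math.exp(difficulty - student_skill))
        correct = random.random() < p_correct
        # Harder question after a correct answer, easier after a miss.
        difficulty += 0.2 if correct else -0.2
    # The difficulty settles around the level where the student answers
    # about half the questions correctly, i.e. the 50/50 point the
    # presentation described.
    return difficulty

print(f"Estimated level: {run_adaptive_test(student_skill=1.5):.2f}")
```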

After completing the assessment, the program creates a personalized learning path for each student that supposedly matches their learning needs. [It should be noted that it is very easy to find many negative i-Ready reviews, and one wonders if the personalized learning path is genuinely useful. For a couple of examples, see here: https://www.commonsense.org/website/i-ready/teacher-reviews and here: https://www.sitejabber.com/reviews/iready.com]

The math diagnostic measures 4 areas of mathematical knowledge: 1. Number and Operations, 2. Algebra and Algebraic Thinking, 3. Measurement and Data, 4. Geometry.

The reading diagnostic measures 6 areas of language knowledge: 1. Phonological Awareness, 2. Phonics, 3. High-Frequency Words, 4. Vocabulary, 5. Comprehension: Literature, 6. Comprehension: Informational Text.

The assessment does not rank students but is intended to show who possesses grade-level knowledge, who has achieved above grade-level knowledge, and who does not possess grade-level knowledge. It also makes it possible to compare the performance of groups of students both against themselves, from fall to spring or from one year to the next, and against the performance of other groups of students on a nationwide scale. They aren't comparing students to students but comparing each student's current level of progress against their expected level of progress. [It seems to me that one could easily fiddle with the results simply by changing the expected level of progress. There was no explanation as to how the standards of knowledge are determined or whether it's possible to change the assessment's expectations in order to make it look like students are doing better than they actually are.]

Last year they piloted i-Ready with 8 schools overall. They'll be able to share how those students did last year compared to this year, as well as how they compare nationally.

The fall assessment was administered online between 9/29/2020 and 10/21/2020 to grades 5K-8. 90% of AASD students took the i-Ready assessment. The bulk of the 10% who did not were from charter schools, which had the flexibility to not give the assessment.

For the fall assessment, the students are differentiated by color based on performance. Per the PowerPoint slide:

“Green = Early on, Mid-, or Above Grade Level. Students testing at this level in the fall demonstrated at least partial understanding of grade level standards in reading or math for their grade level. Students within the mid or above grade level range may also benefit from additional enrichment. [As an aside, Steve didn’t clearly explain if what qualifies a child for each color changes throughout the year. While it would be appropriate for a child in the fall to be categorized as green if they demonstrate early partial understanding of the things that will be taught later that year, it would be highly inappropriate for a child to be categorized as green in winter or spring if they still only possess early or mid-level knowledge of grade level standards. Also, it seems like students who are excelling and are above grade level may not be well served by being lumped into the category of students who only possess grade level knowledge.]

Yellow = One Grade Level Below. This is typically where we expect most students to be entering the fall as students testing at the yellow level during the fall window have not yet received most instruction for the year relating to their grade level. Students with a yellow status are ready for grade level instruction but may need additional supports for grade level instruction. Yellow phase during the fall assessment window does NOT mean students are entering the school year already one grade level below expectations.

Red = Two or More Grade Levels Below. Students testing at this grade level have unfinished learning needs relating to standards that serve as prerequisites for grade level success.”

So essentially a student entering 2nd grade would be categorized as green if they had any knowledge of 2nd grade material. They would be categorized as yellow if they had a full understanding of 1st grade material but no knowledge of 2nd grade material. And they would be categorized as red if they did not properly understand 1st grade material.
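For the programming-minded, that color rule boils down to something very simple. The sketch below is my own simplification, encoding a student's standing as whole "grade levels behind"; the actual diagnostic of course reports finer-grained placements.

```python
# Fall color-band rule as described on the slide. Encoding standing as
# whole "grade levels behind" is my own simplification; the real
# diagnostic reports finer-grained placements.
def fall_color_band(grade_levels_behind: int) -> str:
    if grade_levels_behind <= 0:
        return "green"   # at least partial grasp of current-grade material
    elif grade_levels_behind == 1:
        return "yellow"  # solid on last year's material; this year's not yet taught
    else:
        return "red"     # two or more grade levels behind

# The 2nd grader example from above:
print(fall_color_band(0))  # green: some knowledge of 2nd grade material
print(fall_color_band(1))  # yellow: knows 1st grade material only
print(fall_color_band(2))  # red: gaps in 1st grade material
```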

They compared the 2020-21 results to three benchmarks: the 2019-20 results from the AASD pilot schools; national historical norms from 2018-19 (they used that year because it has a full year of data that was not disrupted by coronavirus the way this last year was); and the year-to-date national tested population for fall 2020-21. Steve also asked i-Ready to provide a breakdown of students nationwide who took the test virtually, like AASD students did, vs students who took it in person.

Now, on to the assessment results:

“National Norms and Placement Distribution As of 10-18”: This slide shows how things compare nationally between students who took the assessment in person and those who took it remotely. The breakdown of kids two or more grade levels below is fairly similar, but there’s a little bit of a difference between the yellow and green. There’s a higher rate of kids demonstrating early on-grade-level learning among those taking the test virtually (both nationally and at AASD) than among those who took the test in school.

“How Do the District’s Placement Compare To The Benchmarks?”

AASD had 8,665 students in grades 5K-8 take the test, compared to the over 3.4 million students nationally who took it remotely and the 1.9 million students who took it in school. Our results closely parallel the nationally normed data. AASD has slightly more kids in the green and slightly fewer kids in the red as compared to the national numbers.

This slide compares scores across grade levels to national historical norms and to the national testing scores of schools who did the assessment virtually. Of note: there is no red at the kindergarten level because there is no previous grade level to compare their score to. 

“Fall Placement Distribution By Grade” K-6

Taken at face value, this slide would seem to indicate that students learn more when school is virtual than when they are attending in person, but Steve did caution that the assessment was not given in a controlled manner and that there could be some performance inflation due to parents helping their children.

Kay Eggert was curious about the kindergarten score. The national in-person scores showed mostly yellow with little green. AASD’s scores and the national virtual scores both seemed disproportionately green. Was that the result of parental help creating grade inflation?

Steve thought that likely, although he said it was only an assumption on his part and not something he could substantiate with data. Even last spring, when districts were closing all over the country, some districts chose to use i-Ready and delivered it remotely, and they indicated that they saw more parental help at the early grade levels.

“Fall Placement Distribution By Grade” 6-8

Steve stated he put 6th grade on both the elementary and middle school slides because it's a pivotal year.

“How Does Domain Level Performance Compare to National Norms?”

They broke out performance within each domain. Only “Vocabulary”, “Comprehension: Literature”, and “Comprehension: Informational Text” are reported for all students and all grades. He put a red box around the areas where AASD is performing at or below the national average instead of above it. It looks like the 4th and 5th grade reading comprehension skills could use some work.

“How Does the Overall Fall 2020 Pilot School Data Compare to Fall 2019 Data?”

At this point, things became interesting. 8 schools participated in the i-Ready pilot last year, so how did they compare to this year?

Unfortunately, it turns out this slide is completely worthless because the data presented on it is clearly not correct. The percentages of the 2019 scores add up to 106%. The percentages of the 2020 scores add up to 94%. Clearly, data somewhere was entered incorrectly. Gary Jahnke pointed this discrepancy out. Steve suggested it could be a rounding error [unless he's using "new math" there is no way that could be a rounding error; see the quick check below], and he could not say what the actual numbers were. If the numbers are obviously off, it is not at all clear to me that even the proportions displayed in the visuals are correct.
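For anyone who wants to verify the "new math" point, here is a quick back-of-the-envelope script. It assumes only that the slide showed somewhere between three and five color bands, each rounded to the nearest whole percent.

```python
# How far from 100 can a total of rounded percentages drift? Rounding
# to the nearest whole percent moves each category by at most 0.5
# points, so a chart with n categories can sum to no more than
# 100 + n/2 and no less than 100 - n/2.
def max_rounding_drift(num_categories: int) -> float:
    return num_categories * 0.5

for n in (3, 4, 5):
    print(f"{n} bands: total stays within 100 +/- {max_rounding_drift(n)}")

# Even with 5 bands the total must land between 97.5 and 102.5, so
# sums of 106% and 94% cannot be rounding artifacts.
```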

The next slide compared the Fall 2019 K-2nd grade vs Fall 2020 1st-3rd grade cohorts against themselves. Essentially, it was comparing each grade's performance in 2019 against its performance in 2020: the kindergarten class of 2019 against those same kids in 1st grade in 2020, and so on. As you can see, each class had a surprising increase in the number of kids in the green category from last year to this year. During the question and answer section, Barry O'Connell remarked on this and said it looks like kids are actually doing better in virtual school than in in-person school. To him, this data seemed to contradict the notion that kids have lost a lot through remote learning. Leah Olson also said that they had received a lot of letters from parents expressing concern that children would be struggling academically due to not being in school, but she didn't see that concern borne out by the data.

[Personally, there are two interpretations of this data that immediately spring to my mind, neither of which necessarily reflects well on AASD. The first is that parents are probably just helping their kids out, and their kids aren't actually learning how to do things on their own. I think this article is relevant. The second is, if we take this data at face value and believe it accurately reflects students' abilities, does that not indicate that AASD is not as competent at teaching children as their parents and grandparents are? Maybe we shouldn't be paying AASD as much money when they can't even do as good a job as run-of-the-mill non-educators.]

The next slide compared the Fall 2019 3rd-5th grade vs Fall 2020 4th-6th grade pilot school cohorts.

[The 4th grade to 5th grade results look somewhat odd to me. The kids in the green category jumped up by 12 percentage points while at the same time the red increased by 11, which would mean the yellow band shrank by roughly 23 points. That warrants some further digging, in my opinion.]

The next slide compared the Fall 2019 6th – 7th grade vs Fall 2020 7th – 8th grade pilot school cohorts.

Steve noted they don't administer i-Ready at the high school level, so their data stops at 8th grade.

[Personally, I find this slide alarming. It looks like last year, even prior to the pandemic when everything was still normal, 44% of 6th graders and 43% of 7th graders who took this test were at least 2 grade levels behind. How is that acceptable? In some respects, the slides tell a grim tale of children starting out in kindergarten at or above grade level and then, as the grades progress, some of them falling further and further behind until they hit middle school, where over 40% are two or more grade levels behind.]

To the Board's credit, the issue of the red students was at least remarked upon during the question and answer session. Jim Bowman was really bothered by the reds. He said that they've looked at this before, but he thinks they need to put a more intensive effort into supporting the reds. He suspects that much of that effort lies outside the classroom. He thought there were two areas that should be focused on: (1) students should be helped to see job opportunities earlier in life, and (2) there should be efforts to increase academic family engagement through parent-teacher teams such as Lincoln Elementary School is doing.

Per Kay Eggert, the red band increasing over the years is a phenomenon that is supported by research and is not something unique to AASD. She thought it was not a topic for right now, but she did wonder about it.

Leah Olson was concerned that although students seemed to be doing decently overall, certain subpopulations might be more vulnerable during virtual learning, for example special education students or students with less support at home because their families are working. She wanted to see the data broken out into subgroups. Steve had mentioned earlier in the presentation that he could get that data. She didn't want to give him more work unless the entire Board agreed they wanted to see it, but she would be interested in seeing it. [The idea that a Board of Education member would feel hesitant asking an administrative staff member to get them some information strikes me as rather ridiculous. It's his job; he's being paid to do it. And it's your job as a board member to review information so you can guide the administration.]

Jim Bowman liked the adaptive nature of the program. He wondered if the students understand how the assessment works. If a student does really well on the diagnostic, do they know they did well? Or is this intentionally kept from kids so they don't rank themselves relative to each other? [As an aside, preventing kids from ranking themselves and judging each other's abilities seems like an insane proposition which flies in the face of millions of years of evolutionary development.]

Per Steve, they did communicate with both teachers and families (via letters to parents and presentations to teachers) about how the assessment works so that they could help students approach it with the right mindset. But he acknowledged that it was a mindset and culture shift after so many years of giving the MAPP assessment.

Kristine Sauter wondered how the information was being shared with parents, particularly in light of the mindset shift in how the assessment works.

Per Steve, many of the schools have used parent-teacher conferences to go over the results. Other sites sent out letters or emails with the test results. The schools were using "multiple modalities" [he used that term unironically] to share info with parents.

Gary Jahnke didn't have much to add during the discussion session because he was called on last. He was the only Board member who noticed that the graphs in the one slide didn't add up to 100%. He also remarked on the jump in the number of students tested from 2019 (3,026) to 2020 (3,410). Steve said he could check the numbers with their Educlimber program, which can interact with i-Ready and also connects to their enrollment data, but he had no answer right then for Gary.

At that point, the presentation moved on to the results of the math assessment, which I will have to save for a later day.

You can watch the entire meeting here: https://www.youtube.com/watch?v=G0-jMU3PX_s&t=2285s
