To Multiple Choice and Beyond!

Case Studies in English Language Arts: Part 1

As a teacher, how do you know if your students are on track to meet the expectations for year-end performance in the state standards? During the year, we may see many students spend long stretches in the same stage of learning while other students grow. When you see one of these three scenarios – (a) students’ scores increase along the district’s interim assessment scale, but their predicted year-end achievement level does not change, (b) students’ scores decrease, or (c) students’ scores stay the same across time – it is time to dig in. In such a situation, you have spent three to four months teaching new topics without a corresponding, substantive change in student cognition in the content area. Thus, we need to investigate where students are functioning in their thinking.

Students who are growing toward proficiency are demonstrating sufficient mastery of a state’s academic content standards at particular levels of difficulty, integrated with related standards, at particular levels of cognition (higher-level thinking skills). In reading, where the goal across years is to transition students from learning to read, to literal comprehension, to text analysis, it is possible to take students through new content at lower levels of cognition. When test data suggest students are not growing much, if at all, across time, we have to ask, “Why is the child stuck?” Once we have that figured out, we can determine the next instructional steps.

How are you having students show their thinking? For students who remain in the same stage of learning in the standards, we need to move quickly and shake things up to support their growth. One fast approach is to use multiple-choice items less and use questioning, constructed response, and extended response items more! It is often expedient to ask students, “How do you know?” when you are on a fact-finding mission. Allow me to elaborate and demonstrate this idea more concretely.

Text Complexity

Perhaps you have run across the book The Emerald Atlas by John Stephens. This text is coded at a Lexile level of 720. States often use the ubiquitous Lexile Framework® for Reading offered by MetaMetrics, or other metrics, to help teachers track whether they are using grade-appropriate texts. They also use it to help ensure ELA assessments differentiate text complexity across grade levels. This novel falls in the moderate text complexity range for Grade 3. A partner tool for the qualitative evaluation of text complexity is shown here. The qualitative evaluation of texts helps teachers, curriculum designers, and assessment developers think about the characteristics of text sections that make interpreting text more or less complex for students.
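To make the quantitative side of this concrete, here is a minimal sketch of how a formula-based readability estimate works. The Lexile measure itself is proprietary to MetaMetrics, so this sketch uses the public Flesch-Kincaid grade-level formula as a rough, illustrative analogue; the count_syllables helper and the sample text are my own additions, and its vowel-group heuristic is only an approximation, not a dictionary lookup.

```python
# A minimal sketch of a quantitative text-complexity estimate.
# NOTE: This is NOT the Lexile algorithm (which is proprietary); it uses the
# public Flesch-Kincaid grade-level formula as a rough, illustrative analogue.
import re


def count_syllables(word: str) -> int:
    """Approximate syllables by counting vowel groups (minimum of 1) - a crude heuristic."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))


def flesch_kincaid_grade(text: str) -> float:
    """Flesch-Kincaid grade = 0.39*(words/sentences) + 11.8*(syllables/words) - 15.59."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / len(sentences)) + 11.8 * (syllables / len(words)) - 15.59


# Sample passage from The Emerald Atlas, used only to demonstrate the calculation.
sample = ("Gabriel's band had entered through the dark northern end of the city. "
          "Two Screechers standing sentry had been felled by arrows, "
          "another by Gabriel's falchion.")
print(f"Approximate grade level: {flesch_kincaid_grade(sample):.1f}")
```

The point of the sketch is simply that a quantitative measure reduces a passage to counts (sentence length, word length or frequency); it cannot see the qualitative features, such as where the evidence for an inference sits, that the rest of this post is about.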

Foundational reading skills are needed to support growth in reading comprehension. However, with state assessments, by Grade 3 the expectation is that students have transitioned from learning to read to reading to learn. While it would not be unusual to ask a Grade 3 student to read aloud to check foundational reading skills, it is also a good idea, regardless of grade, to listen to students who are not growing as expected read aloud (privately, as students increase in age). It is a way to investigate decoding skills, and, most critically, it gives you an opportunity to probe what students are thinking on the fly as they read.

Cognition

Here, a child who is reading The Emerald Atlas comes to the following two sentences of a paragraph.

“Gabriel’s band had entered through the dark northern end of the city. Two Screechers standing sentry had been felled by arrows, another by Gabriel’s falchion.”

There are two interesting words in these sentences a teacher could ask the student about to determine the student’s literal understanding of the text: sentry and falchion. Asking students about vocabulary words in context is a component of almost every state’s standards. For example, in the Common Core State Standards we see

CCSS.ELA-LITERACY.L.3.4.A
Use sentence-level context as a clue to the meaning of a word or phrase.

CCSS.ELA-LITERACY.L.4.4.A
Use context (e.g., definitions, examples, or restatements in text) as a clue to the meaning of a word or phrase.

Given that we have two word choices, which path should a teacher choose?

Valencia, Wixson, and Pearson (2014) posited that reading is found at the intersection of the text and the task. They noted in their research that we understand reading comprehension development by focusing on the task the student is asked to perform while also attending to the amount and location of the textual evidence, the degree of abstractness of the textual evidence needed to perform the task, and the reader. That is, we cannot look at these elements in isolation.

A student in earlier stages of reading comprehension development might be asked, “Why are the bad guys standing in the dark?” to determine if the student can infer the meaning of sentry. The student who makes the inference is doing so using evidence located between two consecutive sentences within a single paragraph, which makes the inference easier. A student who is in a more advanced stage of reading comprehension might be asked, “What is a falchion?” Thus, in the same paragraph and in nearly the same reading moment, we have opportunities that lead us down different evidence trails within the text and help us understand what and how a student is thinking while reading. This is why Valencia et al. (2014) do not support interpreting the text complexity of the entire passage as the main driver of when a student can demonstrate mastery of a standard. In this real-life example, the latter path was chosen.

As it turns out, the Grade 3 child was not really sure what a falchion was, though there are hints throughout the paragraph. The Screechers were felled (the child must infer killed). The men were armored but moving silently. If the men were moving silently, they likely needed to kill the guards standing sentry (lookout) quietly. What besides arrows could be used to quietly kill guards on lookout? Perhaps a child who could answer the question correctly at this point did so using other connections. We don’t know how to interpret what a student understands until we investigate the chains of evidence that are available and that the student actually used to respond to the task correctly.

Two pages later the word appears again!

“There was a soft twang beside her as Gabriel released his arrow…” “There was a volley of rifle fire, the thick swoof of a dozen arrows taking flight, the broken thudding as they found their targets, and all was chaos and shouting. Dropping his bow, Gabriel pulled the falchion off his back, gave a great, bellowing cry, and leapt through the gap in the wall.”

The child looks up and says, “A falchion is a broad sword kind of a thing.”

“How do you know?”  

“He pulls it off his back.”

“Why not a rifle? It could have been on his back, too. How did you know?” the teacher probes.

“Well, first the name; it sounds weird. Also, when they [the main characters] were going through the maze [9 chapters earlier] he [Gabriel] unwrapped that bag, and it had that weapon that looked like a big machete.”

In this response, the student is citing evidence from Chapter 10 of the book, and he is connecting it to evidence much later in the text, where the character pulls the weapon off his back. It is how the student came to the correct answer, and where the evidence is located within the text, that shows us the sophistication of the student’s thinking. This aligns with the “Integrate/Interpret” level of the NAEP Reading Framework complexity model because the student is making an inference by making connections across situations. From a multiple-choice item, it is very hard to glean where the evidence a student used to make an inference is located. For this student, the teacher decided to begin moving him into the text-dependent analysis process.

[Image: Deeper learning prepares students to work collaboratively and master challenging academic content.]

Are you over-relying on multiple-choice items?

While multiple-choice questions are often used on large-scale assessments because they are fast and efficient, many states have moved to incorporating a variety of item types, including technology-enhanced items and short constructed response items, in addition to multiple-choice items, to better measure more complex thinking skills. Constructed response items have been found to be significantly more difficult than multiple-choice items, and technology-enhanced items have been found to elicit similar levels of cognition as short constructed response items. Large-scale assessments play an important role in our assessment ecosystem, but remember that, as a teacher, you have other assessment types in your toolbox.

If you spend much of your instructional and assessment cycles measuring student learning through a single format in the classroom, there is a limit to the flexibility of thinking you elicit from students. Students may not get the sufficient practice they need with higher-level thinking interactions with the texts they read. In the classroom, it becomes almost impossible to give students these critical growth opportunities without using performance tasks. It also makes it quite difficult for you to understand where students are in their ability to gather evidence from texts.

In the next installment of this blog, I will show how to adapt assignments and share a sample of student work from such an adaptation, so you can see how easy and fast it can be to collect more robust types of evidence that support older students. These tasks are meant for formative processes, where the focus is supporting students first in analysis and then in documenting their work in an essay. Just like Buzz Lightyear, students will hit bumps and snags along their learning journey, but with additional supports and scaffolding, we can continue their growth!



2 responses to “To Multiple Choice and Beyond!”

  1. karinhessvtgmailcom: Several excellent examples of how to “uncover thinking” in order to frame actionable feedback, part of the Actionable Assessment Cycle https://www.karin-hess.com/_files/ugd/5e86bd_002cffef2d5b4ef2b481e008b4ec2289.pdf

  2. To Multiple Choice and Beyond! – Actionable Assessments in the Classroom: “[…] the first blog of this series, I argued that when teachers use multiple-choice items as the predominate way of […]”
