See the Error, See the Individual — and See the Intervention

Teaching | January 30, 2017

One size does not fit all in dealing with educational assessment.

The following article synthesizes scholarship in the latest issue of the Journal of Psychoeducational Assessment. The special issue was guest edited by Kristina C. Breaux of Pearson Clinical Assessment; Melissa A. Bray and Melissa M. Root, both at the University of Connecticut; and Alan S. Kaufman of Yale University.

***

Too often an assessment of academic achievement focuses only on standard scores and eligibility determinations; however, a standard score does not provide information about level of mastery for specific skills nor does it reflect the specific processes that mediated performance at the item level. Students with learning disabilities and those who do not benefit as expected from evidence-based instruction need comprehensive evaluations that provide data to inform a change in instruction that will better meet each student’s learning needs. Drilling down to the item level via error analysis is critical to this process. Unfortunately, no research has been conducted to date on the error analysis data obtained from diagnostic achievement tests.

The primary focus of this special issue of Journal of Psychoeducational Assessment is to understand what students’ errors on standardized tests of academic achievement tell us about teaching and learning, and to use this knowledge to inform the assessment process and development of educational interventions. The goal is to provide a first step in remedying that gap in the literature, based on data obtained during the standardization and validation of the Kaufman Test of Educational Achievement—Third Edition, known as KTEA-3.

There has been a movement in school psychology and special education to focus on Response to Intervention, or RTI, which usually means eliminating cognitive tests from the diagnostic process. Many who favor RTI methodology disdain identifying students’ cognitive strengths and weaknesses, believing that an individual’s particular pattern has little to do with selecting appropriate interventions. The contributors to this special issue believe the opposite: that understanding a person’s strong and weak areas of functioning will help psychologists and educators identify the best interventions for each person rather than relying on the notion that one size fits all. In this special issue, the authors of the diverse empirical studies focus on patterns of errors on tests of academic achievement to help clarify the link to intervention.

The exploratory factor analysis of errors conducted by Rebecca O’Brien and colleagues revealed the error factor structure upon which most other articles in this special issue were based. The resulting factor structure has implications for understanding the relationships between students’ errors in word recognition, decoding, spelling, and math. For example, the composition of error factors differed for the basic reading and spelling subtests, even though they shared most of the same error categories. Hence, students with a skill weakness in a particular area may make different patterns of errors in word reading, pseudoword decoding, and spelling.

Similarly, the article by Ryan C. Hatcher and colleagues revealed a trend of dissimilar errors across parallel subtests (listening-reading and speaking-writing), suggesting that errors in one language system may not necessarily translate into more errors on a similar task that originates in a different modality. These results highlight the separability of the listening, speaking, reading and writing language systems. Errors must be analyzed within and across parallel language systems to determine the extent of a student’s skill weaknesses. In addition, the instructional approach must respond to each language system’s unique developmental and skill trajectory.

Some of the articles in this issue examined whether students with specific cognitive profiles associated with learning problems differ in the kinds of errors they make on tests of reading, writing, math, spelling, and oral language. For example, Taylor Koriakin, Erica White, and colleagues found that students with high crystallized ability (knowledge) but low processing speed and memory were stronger on basic and complex math problem solving than students with the opposite cognitive patterns. Additionally, Xiaochen Liu and colleagues found that the high knowledge–low speed/memory group outperformed other cognitive profile groups on almost all error factors in the areas of phonological processing, word reading, decoding, reading comprehension, and written expression. These findings reinforce the notion that cognitive patterns of strengths and weaknesses have clear-cut implications for targeted intervention in specific skill areas.

A subset of articles in this issue investigated whether special populations differ in their patterns of errors on achievement tests. Matthew S. Pagirsky and colleagues, for example, found that children with ADHD and comorbid reading difficulties demonstrated more errors across reading, math, and writing subtests than those with ADHD alone or a control group, especially on error factors requiring phoneme awareness, language skills, and inhibition. In addition, Maria Avitia, Emily DeBiase, and colleagues found more similarities than differences between the error patterns of students with reading versus math disorders. With a focus on students with giftedness and learning disabilities, Karen L. Ottone-Cross and colleagues found that gifted students with learning disabilities (gifted LD) scored similarly to the gifted group on many higher-level skills, whereas they scored similarly to students with learning disabilities (SLD) on several lower-level skills. For example, the gifted LD sample demonstrated strengths similar to the gifted sample on Written Expression and complex Math Concepts & Applications skills, which have a higher-level processing demand. However, the gifted LD sample was statistically different from the SLD group on all subtests and factors, with the exception of two tasks with lower-level processing demands: decoding nonsense words with silent letters and computing addition math problems.

All of the articles with clinical groups demonstrate the importance of looking beyond diagnostic categories when planning instruction, focusing instead on the types of errors students make and their specific instructional needs.

The research described within this special issue is the first of its kind to look within individual profiles and consider the kinds of errors students make within and across academic domains to reveal valuable insights for improving assessment and instruction. These articles stem from a belief that assessment must lead to instruction, and error analysis is at the heart of understanding how students learn and need to be taught.

The key feature of the special issue tying all the separate articles together is the set of invited commentaries. We asked luminaries in the field to integrate the array of articles and to apply the results of the studies to diagnosis and treatment. They did not disappoint. These leaders in school psychology, neuropsychology, and special education helped to integrate the diverse studies and make their findings come alive. Nancy Mather and Barbara Wendling explain how a careful analysis of errors is key to planning the most appropriate instructional interventions. They noted that, “A careful error analysis goes beyond the quantitative results, and focuses upon the qualitative information that may be derived from reviewing an individual’s performance on the specific items of the various subtests.”

Reflecting on the past, present, and future applications of error analysis, George McCloskey explains how to use process error analysis to shed light on the most frequently overlooked mental capacity: executive functions. Alyssa Montgomery, Ron Dumont, and John Willis reinforce the importance of examiners looking beyond standard scores when analyzing results and considering the evidential validity as well as the predictive and practical utility of the research findings. Dawn Flanagan, Jennifer Mascolo, and Vincent Alfonso explain the utility of KTEA-3 error analysis for the diagnosis of specific learning disabilities based, in part, on a consideration of the findings from various studies in this special issue. To demonstrate that process, Flanagan, Mascolo, and Alfonso discuss error analysis within the context of a pattern of strengths and weaknesses approach and provide an illustrative case example.


Kristina C. Breaux is a senior research director with Pearson Clinical Assessment, where she led the development of the WIAT®-III and worked in close collaboration with Alan and Nadeen Kaufman to develop the KTEA™-3 and KTEA™-3 Brief. She is coauthor of Essentials of WIAT®-III and KTEA™-II Assessment.
