
Stuck on New Stupid


Those we trust to nourish our children’s minds and teach them to think are falling victim to “the new stupid,” according to Frederick M. Hess of the American Enterprise Institute (AEI). Data-driven decision making has become so commonplace in the classroom that it is blindly embraced by our nation’s teachers and education employees, leading to some embarrassing and appalling approaches to educational reform.

Hess notes “three elements of the new stupid,” the name he has given to the “reflexive and unsophisticated reliance on a few simple metrics.”

The first of these elements is using data in half-baked ways. This problem stems from teachers’ eagerness to apply data in the classroom, even in situations where the data is not necessarily applicable. With quantifiable data like test scores and grades, educators must ask basic questions about the reliability and validity of the data they hope to apply. For example, if a test shows that teachers in higher-income areas produce better results, educators cannot expect simply to transplant those teachers to struggling schools and see achievement improve.

“The key is not to retreat from data,” Hess explained, “but to embrace it by asking hard questions, considering organizational realities, and contemplating unintended consequences.”

Another element of the new stupid is translating research simplistically. Data and its analysis can be complicated and hard to interpret, especially for those who did not help gather it. Educators often make the mistake of applying research to classroom settings without fully understanding its results or implications.

Hess offers as an example the Student Teacher Achievement Ratio (STAR) experiment, which seemed to indicate that students performed better when class sizes were smaller. As a result, California spent billions of dollars trying to reduce class sizes across the state. Unfortunately, these efforts failed, “with the only major evaluation…finding no effect on student achievement.”

“What happened? Policymakers ignored nuance and context. California encouraged districts to place students in classrooms of no more than twenty—but that class size was substantially larger than those for which STAR had found benefits…The moral is that even policies or practices informed by rigorous research can prove ineffective if the translation is clumsy or ill-considered.”

The final element of the new stupid is giving short shrift to management data. Hess argues that the results of tests evaluating reading and math skills can only tell us so much, and yet these types of results are being used to shape our education system. “State tests tend to provide results that are too coarse to offer more than a snapshot of student and school performance,” he explains.

Hess goes on to describe how many school district employees cannot be assessed by looking at the data that is most often collected. “Data-driven management should not simply identify effective teachers or struggling students but should also help render schools and school systems more supportive of effective teaching and learning. Doing so requires tracking an array of indicators, such as how long it takes books and material to be shipped to classrooms, whether schools provide students with accurate and appropriate schedules in a timely fashion, how quickly assessment data are returned to schools, and how often the data are used.”

Hess offers four keys to avoiding data pitfalls:

1. “Educators should be wary of allowing data research to substitute for good judgment.”

2. “Schools must actively seek out the kind of data they need, as well as the achievement data external stakeholders need.”

3. “We must understand the limitations of research as well as its uses.”

4. “School systems should reward education leaders and administrators for pursuing more efficient ways to deliver services.”

Data can and should be used in the classroom. When it is used properly, data can reveal harmful trends and patterns that we can address, as well as successes that we can reward. However, as data-gathering systems become more advanced and data-driven education becomes more commonplace, “let us take care that hubris, faddism, and untamed enthusiasm do not render these gifts more hindrance than help.”

Daniel Allen is an intern at the American Journalism Center, a training program run by Accuracy in Media and Accuracy in Academia.

