Wednesday 11 September 2013

Letters: Examining the Studies

TO THE EDITOR:

Re “Guesses and Hype Give Way to Data” (Sept. 3): The quality of conclusions drawn from data depends on the quality of the study that produces them. The study of Upward Bound conducted by Mathematica is a case in point.

Its conclusion that Upward Bound has “no effect” has been repudiated by much of the educational opportunity community, including a statistician at the Department of Education who argued against its publication. Yet this article presents it as the first major achievement in the use of randomized studies in the field of education.

I have directed an Upward Bound program since 1990 and have seen it do wonders. Randomized studies make sense when it is feasible to limit the impact of confounding variables. When assessing programs that affect a wide range of intellectual and social abilities over time, the method invariably falls short.

Dan Gordon

Eliot, Me.

TO THE EDITOR:

Educational research uses all sorts of approaches to get at the complex issues involved in education and schooling, and “crunching numbers to evaluate impact” is but one of them. A cursory glance at the materials produced by the American Educational Research Association (AERA) makes this plain.

It is true that much of educational research is not as rigorous as it needs to be, but rigor is not the same thing as crunching numbers. AERA has been working to integrate all approaches to educational research by both honoring their validity and demanding their integrity.

Meanwhile, we have a real-life story in the transformation of Finnish schools, based on increasing teachers’ knowledge and understanding of the subjects they teach. In this case, history and historical method reveal something that the studies in the article did not.

Timothy Leonard

Cincinnati

