Once in a while, I take the time to sniff around and find an education study worth talking about in this blog. I wish I had the time to do it more often, but judging from my quick look-see this afternoon, the research terrain isn't overflowing with milk and honey these days anyway.
First, there's this Education Week news story about a technology study conducted by Central Connecticut State researchers. We're told that, when college students respond to instant messages while they are reading, they take longer to read. Alrighty then. The supposed shocker of the research is that students still understand what they read... probably because they re-read. Now, I'm all for learning more about how new technologies affect learning. It's one of the reasons I'm pretty excited that we have a new, federally funded research center on education technology. To be sure, we need to better understand how to harness new technologies and the learning forums built around such media. But I'm also of the opinion that some research questions can be answered by common sense. And whether kids take more time to read and understand while they are instant messaging falls into that category. (Granted, I didn't read the study and cringe at not doing so, but Ed Week didn't provide the link--which also makes me cringe.)
On to the next study out this week. It's a high school graduation report based on Common Core data from 2005-2006. It's always good to keep up with these types of data since they've been largely ignored in the past and energy is now rightly going into making them more reliable. Anyway, analysts here use what they call an "averaged freshman graduation rate," or AFGR, to calculate the share of students receiving high school diplomas. (This is basically an estimate of the percentage of kids who come in as freshmen and graduate four years later.) Across 48 states in 2005-2006, our AFGR is 73.4 percent (part of the reason we do poorly in the "high school graduation" event in the Education Olympics). Rates were particularly low in Alabama, Alaska, and California, among other states. They were high in Connecticut, New Jersey, and Iowa, among others. Then there are the states that have markedly increased their graduation rates in recent years, like Hawaii, Tennessee, and Kentucky. Kudos to them.
Finally, there's this report out from the Center on Education Policy on a similar topic--it tallies state progress on high school exit exams. We learn that, during 2007-2008, 23 states "withheld diplomas based on students' performance on state-mandated high school exit exams." Hmmm... that withholding-diplomas part sounds like it might actually mean something--exactly what, no one can be sure. Between "alternative paths to graduation," allowing kids to re-take exams multiple times, and questions about test quality, among other areas, we have little handle on the true impact of these tests. Moreover, we have no national data regarding success and failure on the exams--and the state data, as usual, are all over the map. As with any accountability policy, we often like the "A" word more in theory than in reality. Accountability's consequences are fine when it's not your kid, your school, or your job on the line. If you were to ask 10 people why we should have high school exit exams, you'd likely get very different answers. One of them, if we're truthful, is to point fingers at someone or something falling down on the job. Apparently we're not ready to hear that. And until we get real about reporting high school exit data fully, accurately, and meaningfully, what they say will remain a mystery.