There's been a lot of discussion recently about NCLB/waivers, teacher accountability, performance-based pay, etc., and a constant theme throughout all of those discussions is the increased use of data to hold teachers and schools accountable for what they do. Often over the past 10 years, we've seen poor examples of how data have been used in these processes. I thought the article below from Education Week was interesting because it seemed like there were actually some constructive outcomes from this particular use of data. In short, the article describes how data on student achievement were used to compare the effectiveness of teachers coming from different teacher prep programs, holding variables like poverty constant. Rather than emphasizing consequences for the programs, though, the data were used to identify areas of improvement and related strategies. One program, for example, was identified as weak in reading instruction. As a result, it beefed up the direct instruction component of its program and then rose to being the top program in its category. Of course, the system isn't without problems or debate, but the point is that there is a positive side. Thoughts? http://www.edweek.org/ew/articles/2012/02/17/21louisiana_ep.h31.html?tkn=ZQVFFfO0MAFXMLvuI2wcE1jYTTMjdGwfRyG1&cmp=ENL-EU-NEWS1
It sounds like there are positives to using value-added approaches to evaluate these college programs, but I don't think those positives would translate to evaluating individual teachers based on student test scores. First, I worry about small sample sizes skewing the results; the sample sizes for the teachers' colleges were probably much larger than a single classroom. Second, the graduates of these programs had reasons to perform as well as they could - their salary, their job, etc. Too often, our students don't have any reason to actually try on the standardized tests. Until someone finds a way around these issues, I would not be in favor of using a value-added model as part of my yearly evaluation.
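The small-sample worry above can be made concrete with a toy simulation (all numbers are made up for illustration; this is not any state's or district's actual value-added model). Even "teachers" with identical true effectiveness get widely scattered value-added estimates when the estimate is a mean over a small class, while program-sized samples average the noise away:

```python
import random
import statistics

random.seed(42)

def simulated_value_added(n_students, true_effect=0.0, noise_sd=1.0):
    """Estimate one teacher's 'value added' as the mean of noisy
    per-student growth scores. true_effect=0 means every simulated
    teacher is equally effective; any spread in estimates is pure noise."""
    growth = [true_effect + random.gauss(0, noise_sd) for _ in range(n_students)]
    return statistics.mean(growth)

# Compare the spread of estimates for a classroom-sized sample (20 students)
# versus a program-sized sample (500 graduates' students), using 2000
# simulated "teachers" at each size.
for sample_size in (20, 500):
    estimates = [simulated_value_added(sample_size) for _ in range(2000)]
    spread = statistics.stdev(estimates)
    print(f"n={sample_size:4d}  spread of value-added estimates = {spread:.3f}")
```

With identical underlying effectiveness, the classroom-sized estimates scatter roughly five times as widely as the program-sized ones (the spread shrinks like one over the square root of the sample size), which is exactly the mechanism by which a small class can hand a good teacher a bad score.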
Good point, and an important distinction - the difference between using student achievement data to evaluate teachers, as opposed to evaluating teacher prep programs!
We have value added in the UK. It worked well for a while, until someone high up noticed that it was the schools in poorer areas that got the highest value added! My school (a pretty below-average one) came in the top 5% for the whole country for value added, despite our actual final exam grades being well below the 'best' schools. The 'best' schools (remember we have selective education over here in many areas) for the most part came in pretty low in the value added stakes. This obviously upset all the politicians and movers and shakers who either ran or sent their kids to the 'best' schools! So the concept of 'value added', whilst not dropped, was sidelined. This was illustrated when our school was last inspected. We were rated as 'satisfactory' (3rd position on a 4-point grading), which in eduspeak over here means 'unsatisfactory'. This was despite having the best value added in the city! The grading was made purely on our final exam results! So whilst value added looks like a useful tool, beware of politicians who will just find a use for it as a stick to beat schools with.
I applaud the efforts to try to improve the college programs - I often wonder if they even track whether their teachers completed their first year of teaching (we had a new graduate from a very reputable program quit this year two weeks in). So I think examining their practices is a good thing, and the data are already there. I still don't agree with value-added for individual teachers, and I shudder to think how far it's being taken (next year it's 50% of our evaluations). When our district switched to it, I attended the board meeting and saw teacher after teacher get up to make the argument for why it's ineffective. One had 100% of her kids passing - but they all passed the year before, so she showed negative growth. Another was actually voted Teacher of the Year for the entire district and had negative growth. So I don't think scores from that one day really show the "value" of a teacher.
This has made headlines in teaching circles in the UK: http://www.nytimes.com/schoolbook/2...chers-denounces-public-shaming/?ref=education
Data-mongering is the lowest mode of truth-seeking, comparable in the history of economics to Tulipomania.