"'Value Added' Concept Proves Beneficial to Teacher Colleges"

Discussion in 'General Education' started by EdEd, Feb 17, 2012.

  1. EdEd

    EdEd Aficionado

    Joined:
    Jan 12, 2011
    Messages:
    3,750
    Likes Received:
    217

    Feb 17, 2012

    There's been a lot of discussion recently about NCLB/waivers, teacher accountability, performance-based pay, etc., and a constant theme throughout all of those discussions is the increased use of data to hold teachers and schools accountable for what they do. Over the past 10 years, we've often seen poor examples of how data have been used in these processes. I thought the article below from Education Week was interesting because it seemed like there were actually some constructive outcomes from this particular use of data.

    In short, this article describes how data on student achievement were used to compare the effectiveness of teachers coming from different teacher prep programs, holding variables like poverty constant. Rather than emphasizing consequences for the programs, though, the data were used to identify areas of improvement and related strategies.

    One particular program, for example, was identified as weak in reading instruction. As a result, they beefed up the direct instruction component of their program. They then improved to being the top program in their category.

    Of course, the system isn't without problems or debate, but the point is that there is a positive side.

    Thoughts?

    http://www.edweek.org/ew/articles/2012/02/17/21louisiana_ep.h31.html?tkn=ZQVFFfO0MAFXMLvuI2wcE1jYTTMjdGwfRyG1&cmp=ENL-EU-NEWS1
     
  3. KateL

    KateL Habitué

    Joined:
    Aug 3, 2007
    Messages:
    810
    Likes Received:
    2

    Feb 17, 2012

    It sounds like there are positives to using value-added approaches to evaluate these college programs, but I don't think those positives would translate to evaluating teachers based on student test scores. First, I worry about small sample sizes skewing the results. The sample sizes for the teachers' colleges were probably much larger. Second, the graduates of these programs had reasons to perform as well as they could - their salary, their job, etc. Too often, our students don't have any reason to actually try on the standardized tests.
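
    The small-sample worry can be made concrete with a quick simulation (purely hypothetical numbers): even if every teacher and every program had the exact same true effect, the measured "value added" for a single classroom bounces around far more than the measured value for a program's thousands of graduates.

```python
import random
import statistics

random.seed(0)

# Assumed for illustration: everyone has the SAME true effect, so any spread
# in measured "value added" below is pure sampling noise.
TRUE_EFFECT = 5.0   # true average test-score gain
NOISE_SD = 15.0     # student-level noise in score gains

def measured_gain(n_students):
    """Average measured gain across n students for one teacher or program."""
    return statistics.mean(random.gauss(TRUE_EFFECT, NOISE_SD)
                           for _ in range(n_students))

# One teacher might be judged on ~25 students; a prep program's
# graduates might collectively cover ~2,500.
teacher_estimates = [measured_gain(25) for _ in range(200)]
program_estimates = [measured_gain(2500) for _ in range(200)]

print(f"spread across teachers:  {statistics.stdev(teacher_estimates):.2f}")
print(f"spread across programs:  {statistics.stdev(program_estimates):.2f}")
```

    The classroom-sized estimates spread out roughly ten times as much as the program-sized ones, so identical teachers can land far apart in a ranking by chance alone.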

    Until they find a way around these issues, I would not be in favor of using a value-added model as part of my yearly evaluation.
     
  4. EdEd

    EdEd Aficionado

    Joined:
    Jan 12, 2011
    Messages:
    3,750
    Likes Received:
    217

    Feb 17, 2012

    Good point, and an important distinction - the difference between using student achievement data to evaluate teachers, as opposed to evaluating teacher prep programs!
     
  5. waterfall

    waterfall Maven

    Joined:
    Feb 5, 2011
    Messages:
    5,849
    Likes Received:
    715

    Feb 17, 2012

    This was my thinking as well.
     
  6. blazer

    blazer Connoisseur

    Joined:
    Feb 18, 2009
    Messages:
    1,616
    Likes Received:
    321

    Feb 18, 2012

    We have value added in the UK. It worked well for a while, until someone high up noticed that it was the schools in poorer areas that got the highest value added! My school (a pretty below-average one) came in the top 5% for the whole country for value added, despite our actual final exam grades being well below the 'best' schools. The 'best' schools (remember, we have selective education over here in many areas) for the most part came in pretty low in the value-added stakes. This obviously upset all the politicians and movers and shakers who either ran or sent their kids to the 'best' schools! So the concept of 'value added', whilst not dropped, was sidelined.

    This was illustrated when our school was last inspected. We were rated as 'satisfactory' (3rd position on a 4-point grading), which in eduspeak over here means 'unsatisfactory'. This was despite having the best value added in the city! The grading was made purely on our final exam results!

    So whilst value added looks like a useful tool, beware of politicians who will just find a use for it as a stick to beat schools with.
     
  7. KinderCowgirl

    KinderCowgirl Phenom

    Joined:
    Apr 1, 2006
    Messages:
    4,858
    Likes Received:
    0

    Feb 18, 2012

    I applaud the efforts to try to improve the college programs - I often wonder if they even track whether their teachers completed their first year of teaching (we had a new graduate from a very reputable program quit this year, two weeks in). So I think examining their practices is a good thing, and the data is already there.

    I still don't agree with value-added and shudder to think how far it's being taken (next year it's 50% of our evaluations). When our district switched to it, I attended the board meeting and saw teacher after teacher get up to make the argument for why it's ineffective. One had 100% of her kids passing - they all passed the year before, so she had negative growth. Another was actually voted Teacher of the Year for the entire district and had negative growth. So I don't think those scores from that one day really show the "value" of a teacher.
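
    That "100% passing but negative growth" situation is a known ceiling effect, and it's easy to reproduce with a naive pre/post growth calculation (all scores below are made up):

```python
# Hypothetical illustration of a test-score ceiling: naive "growth" is this
# year's score minus last year's, but a capped test cannot register growth
# for students who already scored at or near the maximum.
MAX_SCORE = 100  # the test's ceiling

last_year = [100, 100, 98, 100, 100]  # class already at/near the ceiling
this_year = [100, 98, 99, 100, 99]    # still excellent, everyone passing

growth = [t - y for t, y in zip(this_year, last_year)]
avg_growth = sum(growth) / len(growth)
print(avg_growth)  # negative "growth" despite every student passing both years
```

    Students who scored 100 last year can only stay flat or drop, so random wobble in the scores shows up as negative average growth even when the teaching is excellent.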
     
  9. TeachOn

    TeachOn Habitué

    Joined:
    Jan 28, 2012
    Messages:
    804
    Likes Received:
    0

    Feb 27, 2012

    Data-mongering is the lowest mode of truth-seeking, comparable in the history of economics to Tulipomania.
     
  10. EdEd

    EdEd Aficionado

    Joined:
    Jan 12, 2011
    Messages:
    3,750
    Likes Received:
    217

    Feb 28, 2012

    Could you elaborate?
     
