Fluency ideas

Discussion in 'Elementary Education' started by myKroom, Apr 10, 2015.

  1. myKroom

    myKroom Habitué


    Apr 10, 2015

    Our state requires that students meet a fluency benchmark during the three testing periods or be labeled "substantially deficient." I'm an At-Risk teacher, so I am one of the teachers working with these "substantially deficient" kids. My team is looking for more ideas on how to improve our students' fluency. We've done repeated reading, poetry, reader's theater, sight word work, etc. Some of our students are getting bored with it and, while we are seeing improvement, it's not as much as we would like. What else could we be doing?
     
  3. Missy

    Missy Aficionado


    Apr 11, 2015

    Our district uses a program called Read Naturally. I personally think it is boring as heck, but it works for some students.
     
  4. a2z

    a2z Maven


    Apr 11, 2015

    Read Naturally is a good program for students.

    The thing you have to be very cognizant of when working with a student on fluency is that you aren't trying to develop fluency while they are reading at or above their frustration level. Fluency work must be done at the independent level.

    Some kids need more work on blending so that it becomes automatic, because not all words will be memorized and they need to be able to fluently decode a word. Because the kids are young, some still need something to help them track the line while they are reading. Their visual system is not developed enough to keep their eyes stable while reading a line of text. That may be a non-educational intervention that helps fluency.
     
  5. Missy

    Missy Aficionado


    Apr 11, 2015

    Do you have Fundations? Our K-2 students get that and many of the teachers think it has helped a lot.
     
  6. czacza

    czacza Multitudinous


    Apr 11, 2015

    How are you assessing fluency?
     
  7. myKroom

    myKroom Habitué


    Apr 12, 2015

    Missy, we do use Fundations. I think it has really helped the students, but they seem to get the phonics skills in isolation within the program; applying them to any text beyond the program is not happening.

    czacza, the state has us using the FAST assessment. The fluency screener/progress monitor is called CBM, and students are assessed on grade-level text. It essentially assesses words per minute and accuracy.
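    For anyone unfamiliar with scores like this, a CBM-style fluency number is just simple arithmetic over a timed read. A minimal sketch (function and variable names are illustrative, not taken from FAST or any CBM manual):

    ```python
    # Sketch of how a CBM-style fluency score is typically derived:
    # words correct per minute (WCPM) plus accuracy. Names here are
    # illustrative, not from the FAST assessment itself.

    def cbm_score(words_attempted: int, errors: int, seconds: float):
        """Return (words correct per minute, accuracy %) for one timed read."""
        words_correct = words_attempted - errors
        wcpm = words_correct / (seconds / 60.0)
        accuracy = 100.0 * words_correct / words_attempted
        return wcpm, accuracy

    # A student who attempts 62 words in one minute with 4 errors:
    wcpm, accuracy = cbm_score(words_attempted=62, errors=4, seconds=60)
    print(round(wcpm), round(accuracy, 1))  # 58 and 93.5
    ```

    The benchmark cut scores are then applied to that single WCPM number, which is what the later posts in this thread push back on.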
     
  8. czacza

    czacza Multitudinous


    Apr 12, 2015

    I'm assuming student reading level isn't considered? Also, this tool doesn't account for prosody? I'm not a big fan of these kinds of assessments... I do think teachers are better judges of students' growth as readers. :2cents:
     
  9. myKroom

    myKroom Habitué


    Apr 13, 2015

    You are correct. I don't like it either; it's not best practice, it tells me nothing, and it shouldn't be the final say. But it's the state mandate, so I have to use it. Any ideas to help them improve?
     
  10. a2z

    a2z Maven


    Apr 13, 2015

    I disagree that the opinion of the teacher is all that is needed. It may be fine for czacza or for you, OP, but not all teachers are created equal.

    Also, specific skill testing does indicate areas that will impact the student in the future. A student relying on language knowledge to compensate for poor decoding or fluency skills will struggle later when the text becomes more complex and goes beyond those language skills. It will also show up when the text stops being predictable based on common language or vocabulary pre-loaded in a lesson.

    I think knowing there are deficit areas that may not be as noticeable in overall comprehension at the time is important. It signals problems for the future, most likely when the student has gone on to higher levels.
     
  11. myKroom

    myKroom Habitué


    Apr 14, 2015

    My 'you are correct' statement was answering her questions. I feel the teacher's opinion is important, but it definitely is not the ONLY source needed. I feel multiple sources are needed to truly define a student's deficit areas. There is definitely much more involved in a student's reading ability than just fluency. However, since fluency is what the state wants us to use, my original question is: what else can I do to help build their fluency?

    We spent the first half of the year working almost exclusively on phonics and phonemic awareness. Now we need to carry those skills into their reading level to increase their fluency and boost their reading level. The activities are just so monotonous, and the kids are getting bored. What else is out there for me to try?
     
  12. czacza

    czacza Multitudinous


    Apr 14, 2015

    I didn't say a teacher's assessment was all that is needed. I said a teacher knows more about a student as a reader than this kind of test will show.

    I can read a law or medical textbook fairly quickly. But I would merely be calling words. I'd have little idea what I read meant, and I wouldn't be reading with expression, but I'd score well on a words-per-minute test.

    I can tell you the level, fluency, and comprehension of any student in my room because of my one-on-one conferring, expert instruction, and anecdotal note taking.
     
  13. SCTeacher23

    SCTeacher23 Comrade


    Apr 19, 2015

    I teach K. My students are assessed on fluency too, and we have a "Beat That Time" challenge where students are timed reading every Friday. There is a target time they need to be within, and they enjoy it because it's a weekly challenge to see if they can beat their time. We also send home Reading A-Z books with the target time to beat and have parents listen to them read at home too.
     
  14. gr3teacher

    gr3teacher Phenom


    Apr 19, 2015

    The problem with a test like this is that it doesn't necessarily assess what it claims to assess, though. Take, for example, two kids who each read 30 words per minute with two errors. One kiddo reads word for word, choppily, etc., and missed two words because he saw two he didn't know and just skipped them. The second kiddo read a lot more smoothly and sounded more like a reader. When he got to two words he didn't know, he used various word attack skills, but finally at the five-second mark the teacher told him the word and counted it as an error. Per the test, these two students have identical fluency, but clearly they don't.

    My biggest pet peeve with reading assessments is those that impose a firm cut-off based on fluency. Those types of assessments reward the kids who just give up on tricky, unknown words and punish the kids who actually use good reading strategies (going back, using word attack skills, using text and picture clues, etc.).
     
  15. a2z

    a2z Maven


    Apr 19, 2015

    You could make the same argument about any form of assessment, even a teacher-made assessment. Regardless of whether the student has a 30 wpm rate with 2 errors because of an inability to read words efficiently and accurately or because he skips them, both point to a problem. The final number isn't supposed to be the end-all of the assessment. It is supposed to indicate problems if any exist.
     
  16. EdEd

    EdEd Aficionado


    Apr 20, 2015

    Not to be argumentative, but there's actually research out there suggesting they aren't. That being said, I definitely think teachers know a great deal about their students, but there are things reading assessments can tell teachers that their informal impressions can't.

    In terms of prosody vs. fluency (not that they're completely separate), I'd first say there's a difference between "fluency the skill" (or collection of skills) and "fluency the indicator." For "fluency the indicator," including prosody isn't really helpful, because at that point you aren't assessing the skill of reading fluency; you're using that number as a general outcome measure for other reading skills, and prosody isn't part of that equation.

    However, if you're assessing "fluency the skill," which is more specifically fluency with reading whole passages, then I totally agree - prosody becomes much more relevant.
     
  17. EdEd

    EdEd Aficionado


    Apr 20, 2015

    myKroom, part of the answer in terms of what to do goes back to some of the discussion already being had, and what I mentioned in response to czacza: that "fluency" is really both an outcome indicator and a cluster of skills. So, the repeated readings thing is awesome, but it really only targets fluency with reading whole passages. There are so many skills that go into getting a student to the point of just being able to read a passage accurately (which is subsumed under fluency), even before they can read fluently, that there isn't one set of interventions that will magically work.

    My suggestion would be to go back to your most recent assessment data (if you don't have anything from the past month or two, I'd redo the assessments as a first step). Base your instructional activities on the areas of deficit lowest in the sequence of reading skills. If there are gaps, remediate the gaps first. Build fluency with those individual skills, then move forward. This may seem obvious, but what isn't necessarily obvious is that reading fluency can't really be seen as a singular skill; it's a culmination of all those others. So, working on decoding actually helps with reading fluency. That's the main point.

    So, what I might suggest if you're stuck with particular students is to create a new post with your assessment data and get intervention ideas for those particular kids with their particular skill deficits.

    Sorry to obfuscate! :)
     
  18. EdEd

    EdEd Aficionado


    Apr 20, 2015

    So first, CBM isn't claiming to assess prosody or those other reading areas. By definition it's a general outcome measure that uses one raw number, words correct per minute (WCM), to statistically measure overall reading growth. Yes, it leaves out certain variables, and yes, it's not perfect, but the data strongly suggest CBM is a pretty good general outcome measure.

    Contrast this concept of a general outcome measure (GOM) with a diagnostic test, which is designed to thoroughly assess each skill area for the purpose of planning intervention. You CAN use CBM for a bit of this, but you really have to go "off the books": making observations about prosody, doing an error analysis of decoding skills, and following up with additional assessments.

    So, I'd say CBM isn't the problem, but the expectation of what CBM will do and an understanding of what it is.

    In terms of "rewarding and punishing kids," I get what you are trying to say, but I'd also suggest that assessments don't reward or punish; they just assess. Getting a higher score on a CBM probe isn't beneficial to a child if it doesn't reflect their actual level. You're right that a CBM score isn't sensitive to the type of error, but it can be if you do an error analysis afterward and include that data in your report/assessment.
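    To make that concrete, here's a toy sketch of what an error analysis adds on top of the raw score, using gr3teacher's two hypothetical students. The error-category labels are made up for illustration, not standard CBM terminology:

    ```python
    # Toy illustration: two students can have identical raw CBM-style scores
    # while an error analysis tells very different stories. Category labels
    # are hypothetical.
    from collections import Counter

    def raw_wcm(words_read_per_min: int, errors: list) -> int:
        """Raw score: words read per minute minus errors."""
        return words_read_per_min - len(errors)

    student_a = ["skipped_unknown_word", "skipped_unknown_word"]  # gave up on hard words
    student_b = ["told_word_after_5s", "told_word_after_5s"]      # attacked the word, timed out

    # Both read 30 words in a minute with 2 errors: identical raw scores.
    print(raw_wcm(30, student_a) == raw_wcm(30, student_b))  # True
    # The error breakdown is what distinguishes them:
    print(Counter(student_a))
    print(Counter(student_b))
    ```

    The raw numbers match, so only the categorized errors recorded during the probe reveal which child is actually using word attack skills.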

    All in all, here's a good bottom line to keep in mind: know what an assessment can and can't do, what it can and can't say, and use it for that. CBM does some things really well, but it isn't purporting to do others. Understand what WCM translates into, what it can predict, etc., but don't expect it to distinguish between a child who is using word attack skills and one who isn't.
     
