Wednesday, March 18, 2015

Post for the Week of March 16-20

In planning my article, I am most definitely going to write the article that "makes the most sense" once I have the results.  My study has certainly taken on a life of its own and has not at all looked like how I imagined it.  Most of this is due to unforeseen circumstances when planning my study, which include our school's new A/B schedule, nine snow days, and the almighty ACT.  Our time working in our reading journals has been slim (at best), and I see now that I am definitely going to have to make the best of what I have - I am praying that it all makes sense.

I am somewhat relieved to learn that the data may not be "strong enough to justify new insights formally" and that this may set me up for a future study. I feel like if I could do it all over again during a fall term, it would be so much better. The spring semester is always so sporadic with snow days, and having to interrupt my study with intense ACT preparation really caused a lot of problems in my data collection.  In addition, the fall term would be fresh – I could really see students' changes from the beginning to the end of my study. Starting mid-year just seems to skew things.  I find that with high school students, apathy grows as the year progresses, particularly as spring approaches and after so many snow-day interruptions.

I am really glad to have this article as a resource as I begin crafting my paper, particularly the examples and strategies to use within each section.  I am getting a bit anxious about all of it and hoping, as I begin the daunting task of "putting it all together," that I can use effective organization that will help my study be useful to others.  :)



Finally Back in the Groove!

The ACT, A/B schedule, and snow days have absolutely wreaked havoc on my study, and I am really freaking out that I don't have many journal entries at this point.  Spring break is only 7 days away, and I will only see these students 4 times during those days.  In addition, last week, I missed Monday due to a family situation, and on Wednesday, we had to STAR test, which took up nearly the whole block.  Our writing journals have suffered, and I am growing more and more concerned and upset about how my eventual findings will turn out. Ugh. So frustrating.

Yesterday, Tuesday, March 17th, we finally were able to spend time with our reading journals.  I decided to allow students to read a book or magazine of their choice and have them practice the skill of summarizing.  This is a skill we have been working on all year to help with comprehension, so they are familiar with it and know what is expected.  I thought it would be useful to know whether students would read more thoughtfully if they knew they would be expected to retain what they'd read for the purpose of writing a summary.

Before reading time began, I explained what they would be doing once it was finished: "Read with the intention of summarizing when we are finished."  They then read for 15 minutes.

When reading time was complete, they were asked to summarize.  I gave them about 7 minutes to do so.  Some students finished very quickly and some took the entire time. I observed three students look back at their reading material for details.

Once the summaries were complete I posed the following question:  "Do you think you read more effectively having known that you were going to be asked to write a summary at the end of the reading session?"  They simply had to respond with a YES or NO.


  • NO - 53%
  • YES - 47%
I find this very interesting because I assumed that most students would read more intentionally, and I am going to ask a follow-up question on Friday about why they did not read with more intention.  I am so curious as to whether this is apathy, lack of interest, or something that I am not thinking of...
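
For anyone curious how a quick poll like this can be tallied, here is a minimal Python sketch. The response list is a hypothetical stand-in (10 NO and 9 YES simply rounds to the 53%/47% split above); only the percentages reported in the post are from the actual class.

```python
# Minimal sketch of tallying a yes/no poll.
# The responses below are hypothetical stand-ins, not the real class data;
# 10 NO and 9 YES happens to round to the 53% / 47% split reported above.
responses = ["NO"] * 10 + ["YES"] * 9

total = len(responses)
for answer in ("NO", "YES"):
    share = responses.count(answer) / total
    print(f"{answer}: {share:.0%}")
```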




Friday, February 27, 2015

Concerns at this Point...


On February 26th, I was able to give the SDQA to the students in the study, and the results are interesting.

I followed the instructions on the SDQA and had each student begin reading the word list that was two grade levels below the Lexile level indicated on the last STAR test.  For instance, if the STAR indicated that a student read on a fifth grade level, that student began reading the 3rd grade level list. 

The test indicates three levels of reading: Independent, Instructional, & Frustration.  


INDEPENDENT LEVEL OF READING

This first chart reveals a comparison between the student's Lexile reading level and the student's level of reading INDEPENDENCE, according to the SDQA.  

  • 84% of students are reading independently ON or BELOW their Lexile.
  • 68% of students are reading independently BELOW their Lexile.
  • This is very interesting to me, and I am so surprised that so many students are reading INDEPENDENTLY below their Lexile.  I know that this is not the "be-all/end-all" of reading-level assessment, but I think such a close correlation between the two assessments does indicate some truth.  

INSTRUCTIONAL LEVEL OF READING

This second chart reveals a comparison between the student's Lexile reading level and the student's INSTRUCTIONAL level of reading, according to the SDQA.  


This is definitely the most interesting finding! How weird is it that 73% of students' assessments did not indicate an instructional level at all?  They were cruising right along, and then they were stumped completely.  I have included a couple of kids' assessments to give a visual of what was going on on their sheets.  





  • 22% of students are reading on  an instructional level that is ON or BELOW their Lexile.

FRUSTRATION LEVEL OF READING

This third chart reveals a comparison between the student's Lexile reading level and the student's level of reading FRUSTRATION, according to the SDQA.  

  • 58% of students were found to be at the frustration level on or below their Lexile level.  
  • 41% of students were found to be at the frustration level above their Lexile level.  
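
To make the ON/BELOW/ABOVE comparisons concrete, here is a minimal Python sketch of how each student's SDQA frustration level can be compared against their STAR Lexile grade level. The three sample students and their grade levels are made up for illustration; only the percentages above come from the actual data.

```python
# Sketch of the SDQA-vs-Lexile comparison (the sample students are hypothetical).
# Each entry pairs a student's STAR Lexile grade level with the grade level at
# which the SDQA placed them at the frustration level.
students = [
    {"lexile_grade": 5, "frustration_grade": 5},  # frustration ON the Lexile level
    {"lexile_grade": 6, "frustration_grade": 4},  # frustration BELOW the Lexile level
    {"lexile_grade": 4, "frustration_grade": 6},  # frustration ABOVE the Lexile level
]

total = len(students)
on_or_below = sum(s["frustration_grade"] <= s["lexile_grade"] for s in students)
above = total - on_or_below

print(f"Frustration level ON or BELOW Lexile: {on_or_below / total:.0%}")
print(f"Frustration level ABOVE Lexile: {above / total:.0%}")
```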

Who we are...

As this is a collaborative class, it is very important when planning and examining the data to consider 504 Plans, IEPs, and those students with neither type of plan.  Below is a chart that indicates this information:




As this reveals, well over half of the students involved in this study (63%) have some type of learning disability that hinders their educational growth in some form or fashion.  With that said, even the 37% of students without a plan are still struggling readers.  As a reminder, from the STAR Lexile levels revealed in my last post, no student in this study reads above an 8th grade level.  In regard to my study, I am encouraged that I am examining new ways to help these struggling students - at this point in their high school careers, anything is sure to help them.  I know the reading journals are a new concept to them, and I feel confident that they will at least look at how they read and process a little bit differently once we really get going.

Wednesday, February 25, 2015

Student Reading Levels - As of January 2015

Part of my baseline information is the Lexile reading levels of my students.  This year, our English department offers four levels to our juniors: Advanced Placement, Advanced, Regular, and EMA (Emerging Achievers).  As mentioned in an earlier blog, the class I have chosen for this project is an EMA class.  Throughout the course of the year, my students have been given the STAR test four times (three in the fall and once in January).  The STAR test is also new to our school this year, as it replaced the MAP test.  I have taken the January test results to determine the Lexile reading level of each student in this study and included this information in the chart below:


As you can clearly see, according to this assessment, the majority of this class reads on a 6th grade level, and not one of the students is reading on a high school level.

My next step is to complete the San Diego Quick Assessment (SDQA) for all students in the study so that I can compare the two.  I can't wait to see what the comparison reveals.

The SDQA measures a student's ability to read and recognize grade-level words viewed out of context.  The test consists of 13 lists of equally difficult words that correspond to designated grade levels.  When conducting this test, the proctor is to begin with a list that is two or three levels below the student's reading level.  As the student reads each list, the examiner notes which words in the list the student mispronounces and how many.  Once errors begin, they can be interpreted as follows:

1 error  = Independent Reading Level
2 errors = Instructional Reading Level
3 errors = Frustration Reading Level
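
As a rough illustration of these scoring rules, here is a minimal Python sketch that walks through one student's error counts list by list. The error counts are hypothetical, and the logic is a simplified reading of the rules above (it also shows how a student can jump straight to the frustration level without ever registering an instructional level).

```python
# Simplified sketch of the SDQA scoring rules (error counts are hypothetical).
# Keys are the grade level of each word list; values are the number of
# mispronunciations the student made on that list.
errors_by_grade = {3: 0, 4: 1, 5: 2, 6: 3}

independent = instructional = frustration = None
for grade, errors in sorted(errors_by_grade.items()):
    if errors <= 1:
        independent = grade          # highest list read with 0-1 errors so far
    elif errors == 2 and instructional is None:
        instructional = grade
    elif errors >= 3:
        frustration = grade
        break                        # testing stops at the frustration level

print(f"Independent level: grade {independent}")
print(f"Instructional level: grade {instructional}")
print(f"Frustration level: grade {frustration}")
```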

Today, I was able to give the SDQA to student #4 and found that there was certainly a correlation between the two assessments with this student.  Her Lexile score on the STAR test indicates that she reads on a 5th grade level, and when reading the words below, she read independently with the 4th grade words but shot straight to the frustration level with the 5th grade words.  I feel like these are so close that, when looking at the two, I have a fairly solid understanding of where she is.  It's nice to know this and makes me really excited to see what this reveals with the other 20 students in my study!



Wednesday, February 4, 2015

What's the problem? What's so difficult?


Back to the survey data!  The more I look at the Baseline Reading Survey, the more valuable I find this information.  It is so nice to really stop and examine what my students think about reading.  I think sometimes as teachers we are SO bogged down, we think we are tapping into every need of our students and that we really know them, but we may not be as linked in as we think.  Today, I am examining the following question: 


WHAT DO YOU FIND MOST DIFFICULT ABOUT READING? 

Prior to the survey, I would have put money on the response being that they "just don't like it."  Well, well, well...what a surprise! I don't know everything after all!  I was completely blown away to learn that 50% of them find unknown and difficult words to be the most difficult part of reading! What!?! Vocabulary!?!  Never in a million years...so now I see that vocabulary work is much needed, and I am seeing a need to incorporate this skill into our work EVERY DAY!  In fact, the journals need to incorporate a new "word of the day" - selected individually by each student from whatever they read that day in class.  We shall see...this may help!

In addition to struggles with vocabulary, my students shared a number of other difficulties:





Tuesday, February 3, 2015

How do you rate?

Today, I distributed a Baseline Reading Survey and have begun to crunch some of the data.  Responses to these open-ended questions were handwritten.  


1. On a scale of 1 to 10 (ten is the best), how would you rate your feelings about reading?
2. What do you like about reading?
3. Why is this?
4. What do you dislike about reading?
5. Why do you think this is?
6. What do you find most difficult about reading?
7. What are some things that help you with reading?
8. Have you noticed your reading improving as you get older?
9. If you answered yes to #8, why do you think it has?
10. If you answered no to #8, why do you think it hasn’t?

I am finding this sooooooooo interesting, and I am loving seeing my students' thoughts on reading - it is amazing how much I am learning just by looking at their thoughts differently than I have before!  I thought I totally knew my kids, but just from this little ten-question survey, I feel so much more in touch.  The first question asked them to "rank" themselves as readers - a simple scale of 1 to 10 with ten being the best.  When I first looked at it, I tallied the percentages of 1's, 2's, etc.  They were all so similar that it really didn't tell me anything - I have included that graph below: 







The next chart presents the same data in a different way.  As I mentioned, the first chart didn't really give an at-a-glance picture of my students' opinions of their reading abilities.  As a result, I decided to group the 1-to-10 scale into letter-grade bands and look at those as percentages: 


A = 9-10
B = 8
C = 7
D = 6
F = 5 and below
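
Here is a minimal Python sketch of this letter-grade binning. The ratings list is made up for illustration; the bands match the scale above, and only the chart in the post reflects the real survey data.

```python
from collections import Counter

# Sketch of binning 1-10 self-ratings into the letter bands above
# (the ratings list is hypothetical, not the real survey data).
ratings = [7, 5, 8, 6, 9, 4, 7, 6, 10, 5, 3, 8, 6, 7, 5]

def letter_band(rating):
    if rating >= 9:
        return "A"   # 9-10
    if rating == 8:
        return "B"
    if rating == 7:
        return "C"
    if rating == 6:
        return "D"
    return "F"       # 5 and below

counts = Counter(letter_band(r) for r in ratings)
total = len(ratings)
for band in "ABCDF":
    print(f"{band}: {counts[band] / total:.0%}")
```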