Sunday, November 24, 2013

Data Tracking Using "Dot Charts" and Consensograms

I find myself learning a lot this year about what data analysis and data-informed instruction look like at different grade levels, especially formative assessment.  As someone who worked almost exclusively with small groups and test preparation groups for five years, I was seeing Excel spreadsheets in my sleep.  All the number crunching and analyzing growth from one week to the next had my head spinning most of the time; when you focus so much on numbers and not on students, the task of engaging students in activities that will actually grow them seems impossibly overwhelming.  What I have started to realize over the past two to three years is that your formative assessments (those quick checks you do all along the way) help students not only to communicate with you about how they are understanding a concept, but also to feel actively involved and empowered to change their achievement level.  Take a look at the scientifically named  :)  "dot chart" above.  


As the data dots show us, in the fall of 2011 I had one student score "Accelerated" on the Reading Ohio Achievement Test, and all the rest of my class scored "Limited".  Needless to say, I wanted to see a lot of growth before the next round of testing in May.  So I devised a schedule for my class to take bi-weekly practice assessments beginning in January and track their progress on this graph after each test.  The dot color is meaningless; the number on each dot represents a student.  Each week in small groups the students would practice all the different aspects of test taking and work through practice questions with a teacher.  We would mark up the practice questions with highlighters, rewrite our answers--you name it.  We would also look at our chart, discuss which box each student was in that week, and talk about how their goal should be to move up one box by the next test.  

Here is why this was effective:  
1.  The students knew that although the data was "anonymous," it would still be posted for all to see.  
2.  It gave them a visual of where they were scoring relative to where the other kids in the room were scoring.
3.  Moving up one box every two weeks seemed like a tangible, realistic goal when it was presented this way.
4.  Students felt, and appreciated, that they had an active role in and control over where their dot would be placed the next time.  They knew it was directly related to their performance and hard work in between tests.  

As you can see, the dots started moving up week by week until we had about 8 students who were consistently passing the test.  Not a lot, but a vast improvement over just one!  The data on the chart from April was pretty consistent with the scores I got back from the state after the actual tests were graded, so I knew the practice tests were a reliable measure.  

Imagine if I had started this method in say, October, rather than waiting until January!  How powerful would the dots have been then in motivating even more students to set and surpass their goals?  
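
For anyone who would rather track this digitally than cut out paper dots, here is a rough sketch of what a dot chart could look like in Python with matplotlib.  Fair warning: every student number, score, and cut score below is made up purely for illustration--you would plug in your own bi-weekly practice data and your state's actual performance-level cutoffs.

```python
# A digital "dot chart" sketch -- all data below is hypothetical.
import matplotlib.pyplot as plt

# Hypothetical bi-weekly practice test scores, keyed by student number.
scores = {
    1: [372, 381, 394, 410],
    2: [355, 368, 387, 401],
    3: [340, 351, 363, 380],
}
test_dates = ["Jan 15", "Feb 1", "Feb 15", "Mar 1"]

# Hypothetical band cutoffs (stand-ins for real state cut scores);
# anything below the "Basic" line would be "Limited".
bands = {"Basic": 385, "Proficient": 400, "Accelerated": 415}

fig, ax = plt.subplots()
for student, vals in scores.items():
    # One line of dots per student, moving left to right across the practice tests.
    ax.plot(range(len(vals)), vals, "o-")
    # Label each dot with the student's number, just like the paper chart.
    for x, y in enumerate(vals):
        ax.annotate(str(student), (x, y), textcoords="offset points", xytext=(0, 6))

# Draw the band boundaries so students can see which "box" they are in.
for name, cutoff in bands.items():
    ax.axhline(cutoff, linestyle="--", linewidth=0.8)
    ax.text(0, cutoff, name, va="bottom", fontsize=8)

ax.set_xticks(range(len(test_dates)))
ax.set_xticklabels(test_dates)
ax.set_ylabel("Practice test score")
ax.set_title("Bi-weekly practice test dot chart")
plt.show()
```

Nothing fancy--the whole point is that the chart is quick to regenerate after every practice test, and you can still print it out and hang it right next to the paper version.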

The other thing I wanted to mention was Consensograms.  These are a relatively new concept to me this year as I explore within the profession and get to know some of the lower grade levels.  I adore them!  A Consensogram is simply a class chart where each student places a sticky note or dot to show their level of understanding of (or agreement with) a statement.  Not only are they cute additions to your classroom that serve a real purpose, they are a way to engage those young kiddos and turn them into active participants in their learning as well!  And (as someone who loves technology, I can't believe I am blogging this statement) what a great alternative to those darn clickers that take forever to set up and/or malfunction once the kids are finally signed in anyway!  Phew, what a relief to just have a permanent display of your best practice on the board for the "I Can" statement or learning standard of the week!  Below are a couple of examples from my friends' rooms (since I am more transient as a sub this year):  

1st and 2nd grade Consensogram

3rd and 4th grade Consensogram

Thoughts?  :)
