Session Overview/Purpose:

Understanding Rater-to-Rater Reliability helps observers take the time to view teacher practice together and then discuss what each person observed, ensuring that all observers are on the same page in their ability to identify and rate observations consistently. Two of the techniques used to develop rater-to-rater reliability are video observation and providing evidence for the marks each observer makes.

Session Description:

During the facilitated training, groups of observers will use a video to view and rate teachers on chosen elements and then participate in discussions about the evidence they collected and the rationale for rating teachers at a certain level on the scale. As a practice for continued improvement, teams of observers will use collaborative discussion to grow their rater-to-rater reliability. To do this, teams of observers will view a teacher's lesson with the intent of looking for evidence of a specific element. The observers mark the evidence and then discuss what they saw together, again talking through the evidence they collected and the rationale they used to assign the score. The common lesson is used for rater-to-rater reliability only and is not intended to provide feedback to the teacher.

Learning Outcomes:

By the end of Understanding Rater-to-Rater Reliability, participants will be able to:

  • Identify and rate observations with consistency through the use of a video observation

  • Rate teachers on chosen elements

  • Participate in discussions about the evidence they collected during the observation and the rationale they used to assign a score

Session Objectives:

Participants Will:

  • Explore the North Carolina Educator Evaluation System wiki (http://ncees.ncdpi.wikispaces.net/ncees+wiki) to gain three new resources
  • Identify and locate the standards and elements from the North Carolina Teaching Standards
  • Watch a lesson and complete the NCEES rubric for an observation
  • Share the marks and evidence collected during the observation
  • Review the summary data collected from the observers
  • Compare and defend the evidence that supports the marks with their colleagues to determine rater-to-rater reliability and consistency
  • Access, review, and discuss the sample summary report created for this observation


Pre-Work: There is no pre-work for this session.

Team Leaders:

East: Dianne Meiggs, Educator Effectiveness
West: Amy-Blake Lewis, Educator Effectiveness

Meeting Dates:
3/11/2012 10:30 am

Important Timeline Events

  • Mar. 8-15: 1st round supply request and necessary materials due (to Kristin)
  • Week of Mar. 21-25: Webinar Series for LEA Teams (prior to registration) – just be aware
  • Apr. 8-12: Content for Sessions complete
  • Apr. 15-May 3: Session Review (Vetting)
  • Apr. 15: LEA Pre-work due
  • Apr. 22: Pre-work posted to wiki
  • By Vetting Sessions: Submit final list of trainers
  • By Apr. 30: Design Teams sharing meeting (whole group)
  • By May 10: Submit final supply request list for ordering
  • Week of June 28: Trainers Workshop, Dress Rehearsal
  • By June 30: Dates for follow-up PD ready for each RESA Director

  • July 8-9: Region 2 – Greenville Convention Center, Greenville
  • July 8-9: Region 6 – Hilton Charlotte University Hotel, Charlotte
  • July 10-11: Region 1
  • July 10-11: Region 8
  • July 15-16: Region 4 – Sheraton Imperial Convention Center, Durham
  • July 15-16: Region 5 – Koury Convention Center, Greensboro
  • July 17-18: Region 3
  • July 17-18: Region 7