- Make the case. Carefully script the introduction of the assessment as a diagnostic, as before, but frame it in the context of lost time at school: a trip to the doctor is always important, but especially when you haven't been in a while. In the script, be transparent with students about the purpose of the assessment, who will see the data, how it will be used and what helpful instructional adjustments might follow from it. Train all teachers on the script and reduce variability in assessment conditions as much as possible.
- Watch closely. Create assessment note catchers for proctoring teachers to document student behaviors during the testing period. Students have spent an inordinate amount of time receiving instruction through screens over the past eighteen months, and every teacher has encountered screen fatigue in online instruction; students know the medium has become the message. Careful documentation of student affect (i.e., instances of engagement and/or disengagement) will provide critical context for teachers as they begin studying the data in PLCs.
- Mind the metadata. All online assessments generate data about how each student took the assessment. In addition to the standard inquiries a PLC would conduct (i.e., examining multiple-choice distractor patterns and rubric scoring distributions), teachers should spend time exploring whatever metadata the tool makes available. Questions like which students finished quickly, which labored, and which completed the fewest items will add valuable context to bolster or temper the validity of conclusions drawn from the data.
Finally, schools should recognize the limitations of these tools and regard them as an important but supplemental source of information about student levels of understanding. Two limitations deserve mention:
- They are not culturally responsive. Instead, they are bland by design, intended to elicit reliable data across broad swaths of a very diverse population. Because the topics and contexts are deliberately neutral (i.e., calibrated to interest but not necessarily to engage the test taker), it is reasonable to question students' investment in the diagnostic and whether the results represent their best effort. This is especially true for struggling readers or mathematicians, who may need more reason, relevance or context to fully engage in the work.
- They typically produce little to no visible student thinking. Driven by algorithms, these assessments leave teachers to infer the how and why of student responses. The test formats preclude many of the active reading and problem-solving strategies students learn, such as gisting, chunking and annotation. On the math side, even when students are instructed to use scratch paper to show their work during an online assessment, our experience indicates most do not, or the work shown is piecemeal and difficult to decipher.
In addition to coaching, Teaching Matters has a wide array of resources for schools to address assessment structures and practices.
- If your school is seeking to inventory current assessment practices, our Assessment Competency Self-Study can be used to gather data about teacher and student practice.
- Schools needing to develop local measures using a culturally responsive framework can look to the Assessment Strategies Mini-Courses available through @SchoolAnytime, our powerful online professional development platform. These mini-courses include a module on diagnostic assessment with suggestions for planning, administering and responding to assessments to meet the needs of diverse learners. Learn more about how @SchoolAnytime can provide teachers with the support they need today!
As students re-enter the classroom, teachers and staff are the people best positioned to create assessments that will help students truly show what they know. Only with a rich and well-founded data set that represents students' best efforts and makes their thinking visible can schools truly chart a path forward to accelerate learning for all.