Assessments that overvalue and undervalue – what are we doing about them?

I’ve changed my routine for watching TED talks: instead of viewing one a day, I now watch a few on the weekends. As I was “catching up” this morning, Susan Etlinger’s talk, “What do we do with all this big data?,” stopped me in my tracks.

Etlinger, speaking of her son Isaac: “He was teaching himself to communicate, but we were looking in the wrong place, and this is what happens when assessments and analytics overvalue one metric — in this case, verbal communication — and undervalue others, such as creative problem-solving. Communication was hard for Isaac, and so he found a workaround to find out what he needed to know. And when you think about it, it makes a lot of sense, because forming a question is a really complex process, but he could get himself a lot of the way there by putting a word in a search box.”

I am thinking a lot recently about assessment, progress reports, and how we communicate about what we most deeply value in schools. Etlinger poses such a powerful challenge: “This is what happens when assessments and analytics overvalue one metric and undervalue others.”

What are we overvaluing and undervaluing on our school progress reports? If we look at our students’ report cards, do they express what we most deeply value? Across the city, state, country, and world, we should be deeply engaged in wrestling with that question.

My main blog’s 2013 in review – and a brief spur of reflection on “report cards”

While I am fascinated, in some ways, with analytics like those in the report below, emailed to me by WordPress, I can’t help but think that “the numbers” tell only part of the story for me – a minor fraction. They certainly don’t tell the most compelling parts of the story. Not by any stretch.

So much more than the number of views or the most-read posts, I care about HOW the writing-as-thinking represented here has changed me – and, hopefully, how it has helped change others.

In this regard, such analytics and report cards make me think about what our school report cards lack, and how they fail, as storytellers. If I look at my collection of report cards, I see mostly quantitative analytics – proxies for some measurement of my learning and development. Inadequate dashboards claiming to summarize me as a learner of math, English, science, economics, etc.

Perhaps these quantitative measures must play a part in telling my learning story. I’m not convinced. But surely we in education can devise better proxies for telling the stories of human development, deep learning, and awe-inspiring growth.

+ + +

The stats helper monkeys prepared a 2013 annual report for this blog.

Here’s an excerpt:

The concert hall at the Sydney Opera House holds 2,700 people. This blog was viewed about 26,000 times in 2013. If it were a concert at Sydney Opera House, it would take about 10 sold-out performances for that many people to see it.


Are traditional report cards “vanity metrics?”

In the business world, we talk about the difference between vanity metrics and meaningful metrics. Vanity metrics are like dandelions – they might look pretty, but to most of us, they’re weeds, using up resources, and doing nothing for your property value.

from “Know the Difference Between Your Data and Your Metrics” by Jeff Bladt and Bob Filbin, Harvard Business Review, March 4, 2013

English – 92
Math – 89
Science – 91
History – 88
PE – 93
Art – 90

Such a report card might make the refrigerator. But does it really say very much about the student’s growing capacities in writing (ideas, organization, word choice, sentence fluency, conventions, voice)? And did you assume from that last sentence that I was referring to English, or did you assume that such metrics could cut across the departmentalized curricular landscape and inform all the subject grades? Do we know if the student asks probing questions and demonstrates curiosity for understanding more deeply? What do those dandelions tell us about the student’s application of such thinking skills as divergent, emergent, and convergent explorations? What do we know of the student’s perseverance, resilience, risk-taking, and grit? Can we make any deductions about the student’s observation and experimentation capabilities? Do we know how the student has demonstrated integrity and empathy from those pretty flowers on the refrigerator door?

From that data set, the one that will live indefinitely in the fireproof cabinets and ethernets of schools, what do we know about the student’s growth and emergence in the 4, 5, or 7Cs of 21st century skills? What do we see on that report card about mindset?

What are we grading? What are we measuring? What are we commenting on? What are we collecting and recording and archiving data on? What do we say matters most about our children? Do our report cards shed light on what we say we value most?

Is your school asking these questions? Why or why not? Are these questions among more systemic considerations that you are examining?