The least favorite part of my job is grading students, so this semester I decided to outsource some of it.
In my Social Media for Reporters class at UNC, 20 percent of each student’s grade will be based on the number of points his or her Klout score goes up over the course of the semester. The best thing about doing so probably isn’t that it’s easy, but that Klout’s measure is flawed.
Boiling a semester’s worth of effort and accomplishment down into a single number has always struck me as carrying a false sense of precision. More than once I’ve looked down at the end of the semester and wondered how one student or another ended up with a grade that was so much worse — or better — than I would have handed out based on gut instinct.
That’s the problem many folks seem to have with Klout and other similar social media metrics tools. Boiling continuous interaction across a variety of social networks down into a single number opens lots of room for argument — sort of like debating whether Angels outfielder Mike Trout’s 129 runs and 49 stolen bases are more deserving of the MVP award than Detroit third baseman Miguel Cabrera’s .330 batting average, 40 home runs and 139 RBI.
Boiling multiple data points from disparate contexts down to one final judgment is often overly simplistic. But we do it.
Room for improvement
The question for me is whether we’re doing it the best way possible, and how we might do it better in the future.
Klout doesn’t release the algorithm it uses for calculating scores, and it doesn’t disclose the distribution of those scores. The most precise piece of data it shares is that the average Klout score is 40.
How is that possibly fair to students who are struggling to raise this arbitrary number that’s contrived inside a black box? It’s fair because it transforms the class from a workshop on button-pushing to an exercise in hypothesis testing, strategy and critical thinking. Students — who often approach grades with calculating economy of effort — don’t know what they have to do to boost their Klout scores, so they are forced to design simple experiments, isolate variables, and generalize their findings.
We aren’t totally shooting blind. Here’s what we know about how Klout creates its scores (a rough sketch of how these factors might fit together follows the list):
- There are more than 400 variables in its calculation.
- New variables were added and scores were redistributed in August, just before the semester started.
- It only counts networks it can see — so either public posts, or private posts that you’ve connected to Klout.
- Your Klout score is a reflection of your activity within the last 90 days.
- New Klout scores are released each morning. Older data is decayed in favor of newer data. (But Klout doesn’t say at what rate data is decayed.)
- The score factors in how much content you create compared to how much engagement you receive.
- Klout says it attempts to measure engagement equally across all the networks it monitors, so that it doesn’t favor activity on one network over another.
- On Twitter, Klout looks at retweets and mentions. And it is better to be retweeted or liked by people who do those things rarely than by people who do those things often.
- On Facebook personal profiles, Klout measures comments, wall posts and likes. Since late last month, you can have Klout measure your Facebook page instead of your personal profile. For Facebook pages, Klout measures the number of fans and how many people are talking about your page. Having a Facebook page increases scores by an average of seven points. A Klout user with a score between 70-80 has an average of 13,000 users talking about their Facebook page. (Although we don’t know which is the cause and which is the effect.)
- On Google+, it measures comments, reshares, and “+1“s.
- On LinkedIn, it measures comments and likes.
- On Foursquare, it measures to-dos and tips.
- Since August, if there is a Wikipedia page about you, Klout measures the page’s rank, number of inlinks and outlinks.
- Since late last month, you get extra Klout credit if people search for your Wikipedia page on Bing, and if you appear as an expert in the “People Who Know” section of Bing’s sidebar.
- Some networks — YouTube, Instagram, Tumblr, Blogger, WordPress.com, Last.fm and Flickr — can be connected to Klout, but don’t affect your score.
- There is no reward for simply adding networks that you do not participate in, nor is there any punishment.
- Adding a new account is reflected in your Klout score within 24-48 hours.
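Klout doesn’t publish its formula, but the factors above suggest the general shape of one. Below is a minimal Python sketch, purely my own guess, of how recency-weighted engagement might be rolled into a single number. The 90-day window and the idea of decaying older data come straight from the list; the event weights, the 30-day half-life and the log scaling are invented for illustration and are not Klout’s.

```python
from datetime import datetime, timedelta
import math

# Hypothetical per-event weights. These values are invented for illustration;
# Klout's real weights (more than 400 variables) are unpublished.
EVENT_WEIGHTS = {
    "retweet": 2.0, "mention": 1.0,                  # Twitter
    "comment": 1.5, "like": 0.5, "wall_post": 1.0,   # Facebook
    "reshare": 2.0, "plus_one": 0.5,                 # Google+
}

def toy_influence_score(events, today=None, window_days=90, half_life_days=30):
    """Roll recent engagement into one number, decaying older data.

    `events` is a list of (timestamp, kind) tuples. Only events in the
    last `window_days` count, and each event's weight is halved every
    `half_life_days` (Klout says older data decays, but not at what rate).
    """
    today = today or datetime.utcnow()
    cutoff = today - timedelta(days=window_days)
    total = 0.0
    for ts, kind in events:
        if ts < cutoff:
            continue  # outside the 90-day window
        age_days = (today - ts).days
        decay = 0.5 ** (age_days / half_life_days)
        total += EVENT_WEIGHTS.get(kind, 0.0) * decay
    # Log scaling keeps the result roughly on a 1-100 scale.
    return min(100, round(10 * math.log1p(total)))
```

Students probing the real thing are essentially running experiments against a much bigger version of a function like this, one variable at a time.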
What’s missing
But there’s also a lot we don’t know. Perhaps the most important piece of missing transparency is the “difficulty rating” that students should receive for each additional point increase in their Klout score.
Two students who had almost no social media activity when they started the semester registered initial Klout scores of 12 and 18 within the first week, but have had little movement since. But the two students who started at 55 have also seen little growth.
The most rapid growth in Klout scores during the first four weeks I’ve been tracking them has come from the students who had scores in the 30s and 40s. One student jumped from 33 to 52 and another from 42 to 58. But another moved only from 43 to 46.
I wanted to measure only growth that happens during the semester, so as not to punish students who started out with little or no social media experience. But what I may have ended up with is a system that punishes students who began with extensive social media engagement.
In an effort to prevent sandbagging, I’m distributing the Klout portion of their grades on a curve relative to the class. But that means all of my students could end up with a number that’s in the top 10 percent of all Klout scores and still get an average grade. The only safety valve for that is my promise that I will give an “A” Klout grade to any of my students who end the semester with a score higher than mine, regardless of where they started. Right now, that bar is set at 62 — third among my UNC colleagues, below Chris Roush and Paul Jones.
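To make the mechanics concrete, here is a rough Python sketch of that grading rule as I’ve described it: growth in Klout score over the semester is curved against the class, and anyone who finishes above my own score gets an automatic “A” on this portion. The function name and the percentile cutoffs are hypothetical placeholders, not the actual rubric.

```python
def klout_grades(start_scores, end_scores, instructor_score=62):
    """Curve the Klout portion of the grade on in-semester growth.

    `start_scores` and `end_scores` map student -> Klout score at the
    beginning and end of the semester. Growth is curved against the
    class; finishing above the instructor's score is an automatic "A".
    The percentile-to-letter cutoffs below are illustrative only.
    """
    growth = {s: end_scores[s] - start_scores[s] for s in start_scores}
    ranked = sorted(growth, key=growth.get)  # lowest growth first
    grades = {}
    for rank, student in enumerate(ranked):
        if end_scores[student] > instructor_score:
            grades[student] = "A"  # safety valve for students who started high
            continue
        percentile = rank / max(len(ranked) - 1, 1)
        if percentile >= 0.8:
            grades[student] = "A"
        elif percentile >= 0.5:
            grades[student] = "B"
        elif percentile >= 0.2:
            grades[student] = "C"
        else:
            grades[student] = "D"
    return grades
```

Curving on growth rather than on absolute scores is what creates the tension described above: it protects the newcomers but can squeeze the students who started high, which is exactly what the safety valve is meant to offset.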
In the end, I’ll add my own judgment about my students’ effort and ability to use social media as reporters. I’ll consider qualitative measures such as how trusted they are on their beat, whether they used it to give voice to the voiceless, hold powerful people accountable, shine light in dark places, explain our increasingly complex and interconnected world, and get the right information to the right people at the right time.
A high Klout score is something I’d expect from a solidly average student. A B student will be able to pick apart and critique Klout’s system. And an A student? Someone who will one day build a better Klout.
Ryan Thornburg researches and teaches online news writing, editing, producing and reporting as an assistant professor in the School of Journalism and Mass Communication at the University of North Carolina at Chapel Hill. He has helped news organizations on four continents develop digital editorial products and use new media to hold powerful people accountable, shine light in dark places and explain a complex world. Previously, Thornburg was managing editor of USNews.com, managing editor for Congressional Quarterly’s website and national/international editor for washingtonpost.com. He has a master’s degree from George Washington University’s Graduate School of Political Management and a bachelor’s from the University of North Carolina at Chapel Hill.
Photo by Tyler Ingram on Flickr and used here with Creative Commons license.
Comments
Hi Ryan, I love this post!
I'm currently running a course at Princeton called Human Nature and Technology, for the Center for Talented Youth through Johns Hopkins University. I've been using Klout as an educational tool for the last two summers, with mixed results that reflect a lot of your experiences (especially concerning the transparency of the algorithm).
However, the assignment structure is a bit different, since we don't operate on a grading system that requires a letter grade. My students are also minors, so we follow guidelines to protect students' identities that prevent individual students from operating their own social media accounts for class purposes. My solution to these complications might be of interest to you.
For my class there is one collective class Twitter account (@htecb) that everyone has the password to and that is connected to our Klout account. The students are collectively responsible for raising the Klout score over the duration of the three-week camp. Since we only meet over the summer, our score had bottomed out at 16 when we started. Today is day three, and we've already revived the score to 23. Not bad!
We discuss how the service is used in class, and strategies for media engagement and developing influence. Students are encouraged to follow, mention, and retweet using the account with the goal of beating last year's peak of 43. There are some content restrictions we impose (for instance, they can't post images of themselves or each other), but otherwise they are free to use Twitter as their playground.
I've found that Klout is an excellent educational resource; it gives the students a chance to poke at real, living digital networks and get feedback on how they respond. Doing the assignment collectively means that any disparity in experience among the students only contributes to the educational process, and it removes competitiveness between the students or shame about their existing social networks. It also shields the students' identities and makes it easier for me to monitor as an instructor. Perhaps more importantly, it provides a baseline for measuring the performance of the entire class.
I'd strongly recommend the strategy of a collective internet presence, and I think it could be pretty easily adapted to a college context. Instead of using existing accounts, register new Facebook and Twitter accounts for the class and distribute the passwords to the students. Klout gives an overall score, but it also rates individual posts with between one and four green dots. If you require students to identify their posts from the collective stream, those ratings could be used to evaluate the influence each student had on the overall score, and thus allow a uniform way of measuring each student's performance. It also normalizes for students who have established networks, making the grading process fairer throughout. I hope the suggestion makes sense!
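To make the bookkeeping concrete, here's a tiny Python sketch of what I mean. Students sign their posts with initials, and the instructor copies each post's green-dot rating (one to four) from Klout's Score Activity page by hand; I'm not aware of an API that exposes those ratings, so the data below is purely illustrative.

```python
from collections import defaultdict

# Posts from the shared class account, hand-copied from Klout's Score
# Activity page as (student initials from the post's sign-off, green dots 1-4).
# The initials and ratings here are made up for illustration.
rated_posts = [
    ("AB", 3), ("AB", 1), ("CD", 4), ("CD", 2), ("EF", 1),
]

def per_student_influence(posts):
    """Return each student's total and average green-dot rating."""
    by_student = defaultdict(list)
    for initials, dots in posts:
        by_student[initials].append(dots)
    return {s: (sum(d), sum(d) / len(d)) for s, d in by_student.items()}

print(per_student_influence(rated_posts))
# {'AB': (4, 2.0), 'CD': (6, 3.0), 'EF': (1, 1.0)}
```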
One great thing about @htecb is that we've been using the Twitter account for several years, and so we have several years worth of previous students who have gone on to university but continue to follow and engage with @htecb every summer, often with fond memories of years past. It provides a ready-made audience to encourage initial engagement and exploration, and it gives evidence of the trails blazed in the network by those who have come before.
I left a long post here a few days ago and it was deleted! I had specific recommendations from my own experiences using Klout in the classroom, and was hoping to engage with fellow educators on the topic :/
If you'd like to see my comment, I archived it on my G+ profile: https://plus.google.com/u/0/117828903900236363024/posts/F6AUWjZgx7C
We're in transition right now so comments are temporarily in migration! Your post isn't lost, and will be back here soon :)
Sorry for the trouble!
Ah, it showed up! Thanks =)
Ryan, I'm thinking of adopting this method in my course on startups for the fall. Would you consider writing a "lessons learned" post? I'd like to "stand on the shoulders of giants" on this one if I could.
This is a thoughtful approach. I'm struggling to figure out how I might use Klout's ranking of individual posts. I see that Klout has a Score Activity section, but can you view all of your historical activity? And do you know what the difference is between the "scores" given in the Recency sub tab and the Score Impact sub tab?