
Website Review: Criticker – Crunching the Numbers


Both Anthony Kaufman and Anne Thompson have written about this new movie review site called Criticker. Their descriptions intrigued me, and today I finally popped over to their site to check it out.

The idea is pretty straightforward. You rank movies on a scale from 1 to 100, they plug your rankings into their "Taste Compatibility Index" (TCI), and then you get a list of movie recommendations and film critics whose taste matches yours. It's a fun toy and a good way for movie buffs to while away a few minutes or hours.

But being obsessive-compulsive (and stuck at work and bored), I decided to really run the site through its paces. Specifically I wanted to test this claim: "The more films you rank, the more accurate your TCIs will be!"

Methodology:

For my experiment I paused and recorded data after ranking 10, 25, 50, 100, and 250 films. I kept track of the critics that Criticker told me were my kindred spirits and the movies that Criticker recommended to me. As the number of films I ranked increased, I gradually followed more recommendations, but I tracked the same base group of films. For each film I noted Criticker's Probable Score Indicator, my own score, and the difference between the two.

Predicted Outcome:

As I rank more and more films, Criticker's recommendations should become more accurate and a stable group of kindred spirit critics should emerge.

Data:

  • 10 Films Ranked
  • 25 Films Ranked
  • 50 Films Ranked
  • 100 Films Ranked
  • 250 Films Ranked
Results:

Critics:

I'm not as concerned here with the second predicted outcome (the emergence of a stable group of kindred spirit critics), because the TCI depends on the number of films you've seen in common with the critic in question. Even after ranking 250 films, that number is typically still very low. John Hartl is number one on my list, but that ranking is based on only 10 films; number two, Mark Caro, is based on 30.

This number also depends greatly on which films you've ranked: rank different films and you get a different list. Variety's David Rooney ranks 86th on this list, but after ranking 250 different films he was number one. Still, some critics do appear quite often in the upper echelons of all of my lists: Mark Caro, Scott Foundas, Marjorie Baumgarten, and Rick Groen. All of these are critics I read regularly and whose opinions I respect. And there are few surprises at the top of the list: the only critics in my top 25 with whom I frequently disagree are V.A. Musetto (11), Peter Travers (17), and James Berardinelli (24).

It's a fun tool, a different way to find new critics, and somewhat reliable. But at 10, 25, and 50 films ranked (which is as far as most people will take the site), it doesn't mean much. I'll weigh in again after ranking 500 and 1,000 films, and we'll see whether it becomes more reliable.

Recommendations:

Now this is interesting. At 10 films ranked, Criticker's predicted scores deviated from my actual scores by an average of 16.2, which is pretty good on a 100-point scale. There were a few big misses, though: the difference of 30 between Criticker's predicted score of 85 and my actual score of 55 for V for Vendetta is the difference between "loving" it and just "liking" it (with "enjoying" it in between).

The average deviation does shrink as more films are ranked, but it levels off after 50 films at around 10, which seems reasonable enough. To use V for Vendetta again: after 250 films the predicted score is now 65 (I will "enjoy" it). Not too bad.

But even more impressive is how the biggest deviations shrink. After 10 films, the largest difference is 30. At 25 this falls a bit to 27, but at 50 it's only 18, and at 100 it's just 17. At 250 films ranked, throw out the aberrant difference of 34 for Silent Hill (predicted 26, actual 60) and only 5 of 24 predictions are off by more than 15. And 4 are either exactly right (65 for Cinderella Man, 80 for In the Company of Men) or off by only 1 (78 predicted versus 79 actual for Donnie Darko, 64 versus 63 for Citizen Ruth). That's impressive.
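For the curious, here's a sketch of the bookkeeping behind those numbers. The (predicted, actual) pairs below are the handful of examples mentioned above, not my full 250-film dataset, and the variable names are my own:

```python
# A few of the (predicted, actual) score pairs discussed above.
# The full experiment tracked these for every recommended film.
pairs = [
    ("V for Vendetta", 65, 55),
    ("Silent Hill", 26, 60),
    ("Cinderella Man", 65, 65),
    ("In the Company of Men", 80, 80),
    ("Donnie Darko", 78, 79),
    ("Citizen Ruth", 64, 63),
]

# Absolute deviation between Criticker's prediction and my score.
deviations = {title: abs(pred - actual) for title, pred, actual in pairs}

average_deviation = sum(deviations.values()) / len(deviations)
biggest = max(deviations, key=deviations.get)
off_by_more_than_15 = [t for t, d in deviations.items() if d > 15]

print(f"average deviation: {average_deviation:.1f}")
print(f"biggest miss: {biggest} ({deviations[biggest]})")
print(f"off by more than 15: {off_by_more_than_15}")
```

On this toy sample, Silent Hill is the lone outlier, which is exactly why I threw it out before counting the misses above.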

Conclusions:

Thus far I've collected insufficient data to say whether Criticker can succeed in its stated mission: "Criticker aims to match you with the people who share your taste in film most exactly." But I suspect most users won't rank more than 250 films, so it's fair to say that for most people, Criticker won't really fulfill this function.

But what it will do is deliver uncannily accurate predictions, based on the critics in your TCI, of what movies you will like. So, as a recommendation service, I heartily endorse it. I also endorse it as a delightful way for movie-lovers to amuse themselves when they're bored.

Notes:

Criticker will also match you with fellow users, but I only tested the critic-matching option.


About A. Horbal