Comparing Movie Ratings Sites

The other day, somebody sent me a link to fflick (myself, I’d have capitalized the first ‘f’). It’s a site that (I assume) uses language analysis to aggregate movie reviews from Twitter. It presents this data as a rating out of 100 for any movie, and lets you browse just the reviews written by your Twitter friends.

I wondered how accurate these ratings were. And, of course, I saw a chance to make a chart.

I compared the fflick ratings for this week’s top ten box office films to those of another crowd-sourced site, IMDB, as well as two professional review aggregation sites, Metacritic and Rotten Tomatoes. Here’s what I came up with (apologies for the goofy X-axis labeling). As usual, click to enlarge:

I know that’s a pretty small data set, but it’s a start. It’s actually interesting how close the four data sets are. I’d guess that the disparity in Charlie St. Cloud can be explained by two factors: uncritical teenage lust for Zac Efron, and the movie’s newness. It’s also not surprising that the critics are generally less enthusiastic about a movie than the general public. Still, fflick seems to do a pretty decent job of distilling Twitter’s cinematic zeitgeist.
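One wrinkle in a comparison like this is that the four sites don’t use the same scale: IMDB scores movies out of 10, while fflick, Metacritic, and Rotten Tomatoes all report something out of 100. A minimal sketch of the normalization step, with made-up example scores (not the actual chart data), might look like this:

```python
# Put ratings from different sites on a common 0-100 scale before charting.
# Scale maxima reflect each site's scoring system; the scores themselves
# are hypothetical placeholders, not real ratings for any movie.

def to_percent(score, scale_max):
    """Convert a rating on a 0..scale_max scale to 0..100."""
    return 100.0 * score / scale_max

ratings = {                      # (score, scale_max)
    "fflick":          (82,  100),
    "IMDB":            (7.4, 10),
    "Metacritic":      (65,  100),
    "Rotten Tomatoes": (71,  100),
}

normalized = {site: to_percent(score, mx) for site, (score, mx) in ratings.items()}
spread = max(normalized.values()) - min(normalized.values())
```

With everything on one scale, the spread between the most and least generous site for a given movie is directly comparable across the top ten.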

This kind of language analysis is a rich vein to mine in social media channels like Twitter. Marketers, researchers, and hackers across the globe will be keen to explore what people love and hate, whether it’s movies, music, or recliners.
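To make that concrete, here’s a toy sketch of the sort of thing a site like fflick might do under the hood: score each tweet by counting positive and negative opinion words, then average the scores into a 0-100 rating. The word lists and tweets are invented for illustration; fflick’s actual method is surely more sophisticated.

```python
import re

# Toy sentiment lexicon -- a real system would use a much larger vocabulary
# and handle negation, sarcasm, and so on.
POSITIVE = {"love", "loved", "great", "amazing", "fun"}
NEGATIVE = {"hate", "hated", "boring", "awful", "terrible"}

def tweet_score(text):
    """Fraction of opinion words that are positive, or None if there are none."""
    words = re.findall(r"[a-z']+", text.lower())
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    if pos + neg == 0:
        return None  # no opinion words found; skip this tweet
    return pos / (pos + neg)

def movie_rating(tweets):
    """Average the per-tweet scores into a rating out of 100."""
    scores = [s for s in (tweet_score(t) for t in tweets) if s is not None]
    return round(100 * sum(scores) / len(scores)) if scores else None

tweets = [
    "Loved Inception, amazing movie",  # fully positive -> 1.0
    "so boring, hated it",             # fully negative -> 0.0
    "saw it yesterday",                # no opinion words -> ignored
]
```

Here the one positive and one negative tweet average out to a rating of 50, with the neutral tweet excluded, which is the basic shape of any crowd-sentiment score.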

Speaking of movies and charts, I love the vote distribution for Eclipse on IMDB.
