Sight & Sound has released “The Greatest Films of All Time 2012” on its website. First place went to Hitchcock’s “Vertigo”, while “Citizen Kane” slipped to second. Ozu’s “Tokyo Story” took third. A separate poll of film directors placed Ozu’s film first.
Sight & Sound Top 10
1. Vertigo (Hitchcock)
2. Citizen Kane (Welles)
3. Tokyo Story (Ozu)
4. La Règle du jeu (Renoir)
5. Sunrise: A Song of Two Humans (Murnau)
6. 2001: A Space Odyssey (Kubrick)
7. The Searchers (Ford)
8. Man with a Movie Camera (Vertov)
9. The Passion of Joan of Arc (Dreyer)
10. 8½ (Fellini)
Sight & Sound Directors’ Top 10
1. Tokyo Story (Ozu)
2. 2001: A Space Odyssey (Kubrick)
3. Citizen Kane (Welles)
4. 8½ (Fellini)
5. Taxi Driver (Scorsese)
6. Apocalypse Now (Coppola)
7. The Godfather (Coppola)
8. Vertigo (Hitchcock)
9. Mirror (Tarkovsky)
10. Bicycle Thieves (De Sica)
I have always restrained myself from commenting on these lists, “all-time top ten” or “the greatest film ever made” or whatever. We film lovers can argue over such a list for hours on end without contributing anything to our understanding of the art, let alone of humankind: “‘Vertigo’? I don’t think it’s Hitch’s best.” “‘Man with a Movie Camera’ over ‘Battleship Potemkin’? Capitalists!” “Where’s Michael Bay’s ‘Transformers’?” I could have watched two or three films in the time I have wasted repeating the same arguments over and over. And the discussion usually ends with “Yeah, yeah, I know, that’s not my top ten list,” or “Film experience is personal, so what’s the point of making such a list, anyway?”
So what’s the point of making such a list, anyway? Especially when these lists and polls are prepared every five or ten years by institutions like BFI/Sight & Sound, AFI, or national and public entities. When a private blogger posts his or her own top ten films, it is just that: private thoughts. But this list is the result of a poll of 846 “critics, programmers, academics, distributors, writers and other cinephiles”, conducted by a professional organization and publisher. What is interesting is the invitation letter explaining what “the greatest” means: “We leave that open to your interpretation. You might choose the ten films you feel are most important to film history, or the ten that represent the aesthetic pinnacles of achievement, or indeed the ten films that have had the biggest impact on your own view of cinema.” This really bothered me. Not only does this statement conveniently conceal an underlying arrogance, it also abandons any journalistic commitment to bringing something meaningful to the general public.
Saying “great” without defining it leaves voters free to make impression-based selections. Some critics and professionals are very cautious about this process, weighing their personal assessment against the public reception of a particular work in the context of broader cinematic culture as a whole. But there are also many industry insiders, critics, and distributors with very different perspectives. Without an agreed metric, the result is a collection of impressions by individuals, no matter how professional they are. That would be fine, if they published all 846 top ten lists without a summary ranking.
If they really want to rank films, they should at least have rephrased the question so that everyone agreed on the scale films were measured against. Just as suggested in the letter itself: “Which films are the most important to film history?” or “Which ten films have had the biggest impact on your own view of cinema?” Or perhaps “Which ten films have had a lasting impact on today’s film industry?”, “Which are the top ten films you have discovered since 2002?”, or “Which ten films should we preserve to represent the 20th century?” They are professionals, and the BFI is a guardian of cinematic culture; they should be above impressionistic ranking of this kind. I believe such organizations should use the pool of films listed by the professionals (in this case, 2,045 titles) to advance cultural recognition of past works, for instance to efficiently allocate funds to preserve them and screen them for the public. As it stands, however, we don’t know why “Vertigo” is ranked higher than “Citizen Kane”.
Wait, maybe that’s the whole point. This list exists so that amateur bloggers like me can debate why “Vertigo” is ranked higher than “Citizen Kane”, whether Ozu is worthy of third place while Kurosawa sits at 17th, or why “Sunrise” is still on the list, on and on. And maybe some of us will be frustrated enough to make our own top ten lists. A list to stimulate discussion among the patrons, so to speak.
Or maybe it’s just me. I can’t rank films, even my favorites. I can’t rate films with stars. To me, each film is a separate entity, and rating them is like comparing apples with oranges with pumpkins with cabbages. I don’t think Ozu’s “Tokyo Story” is better than Kurosawa’s “Seven Samurai”. When I watch “Tokyo Story”, I am simply awed by the world Ozu constructs. And when I watch “Seven Samurai”, I am overwhelmed by Kurosawa’s vision. Even when I watch so-called B-movies, I am just happy to have the occasion to encounter that particular film. When I see something I don’t agree with, I simply go silent, because I am not a professional critic. I am not equipped to judge films from a broader perspective, with vast knowledge of cinematic culture. I come to love certain films based on my own private impressions.
And that brings this discussion full circle. Didn’t those professionals rate these films based on their own private impressions? In an age of hundreds of thousands of movie blogs, Rotten Tomatoes, and IMDb, what is the point of a professional organization making this kind of list?