Tough choice this weekend: You could go to the multiplex and catch such acclaimed new features as superhero epic "The Avengers," romantic comedy "Think Like a Man," or stop-motion animation gem "The Pirates! Band of Misfits." Or, you could stay home and watch new episodes of brainy period drama "Mad Men" or the sprawling fantasy "Game of Thrones."
But many people will probably wait until those movies come out on Netflix and stream them at home, or DVR those series and then watch them in blocks. At that point, with the viewing experience digitized, flattened, squeezed, shrunk, re-shaped, and rescheduled for our convenience, whether the story began as a movie or a TV show won't matter anymore.
That's why it's quaint to read an article like James Wolcott's in the recent "television issue" of Vanity Fair magazine, arguing for TV's supremacy over movies, or a rebuttal like Mark Olsen's, in the Los Angeles Times, insisting that movies still rule. The truth is, while each has its advantages, the digital tsunami threatens to wash those differences away, to the point where all that remains are our endless demand for visual storytelling and an industry rushing to meet it.
Wolcott's argument isn't new, though his restatement of it is elegantly written. To him, TV is the superior medium for its writing, its novelistic ability to let characters develop and grow, and most of all, for its watercooler urgency -- its power to gather large groups of people the next morning to talk about last night's episode. Movies used to have these traits, but no longer.
Olsen's argument seems to concede most of Wolcott's points but still insists that films offer the compact efficiency of narrative closure (two hours is all you need for a complete cathartic ride), the unique auteurist vision of great directors (something that gets suppressed when those same directors shoot individual episodes of a TV series) and the immersive experience and overwhelming visual spectacle of the movie theater. And there are still movies with cultural urgency (say, anything by James Cameron, Christopher Nolan, or Quentin Tarantino) that everyone will be talking about, not just the next day, but for months and years.
But these arguments are incomplete. Both ignore Sturgeon's Law, the maxim that 90 percent of everything is crap. When Wolcott says TV has better storylines and more flattering showcases for women, he's not talking about "Keeping Up With the Kardashians" or "American Idol" or anything broadly popular; he's talking about niche cable shows that have critical and ardent cult followings but not mass appeal. Same with Olsen, who cites a lot of art-house films that Wolcott and most other moviegoers haven't seen; he's not making a case for "The Lucky One," "Think Like a Man," or "Wrath of the Titans." Both writers also ignore the commercial imperatives behind most of what's released in each medium (the need for advertiser or subscriber support on TV, the need to have as big an opening weekend as possible in movies) that result in creative timidity and mediocrity in all but a few cases.
One thing both essays do touch on is the changing media landscape wrought by the shift from analog to digital entertainment. It has made the movies more like TV and TV more like the movies. Celluloid film is all but gone from theaters now, so what you're watching on the big screen looks very much like the array of pixels you'd see at home. 3D was supposed to be the theatrical innovation that got us off our couches and back into theaters, but home 3D screens will be commonplace in a few years. Just last week, Peter Jackson previewed for theater owners the latest digital innovation: 3D movies like his first "Hobbit" installment (due at the end of 2012) that are shot at 48 frames per second (twice the frame rate that has defined movies for roughly 85 years), packing twice as much visual information into each second of footage and all but eliminating strobing and flickering. Viewers of the demonstration complained that the visuals looked too real, like videotape. So the latest advance in cinemas will be movies that look even more like TV.
Of course, people seem to be going to the movies less and less, which means that a movie is less about the immersive experience of a giant screen in a darkened auditorium with hundreds of strangers and more about the content alone, watched on a smaller screen (maybe much smaller, like a couple of inches wide), by yourself. The specialness of the movie theater experience is something we don't value much anymore; it has been devalued by the increasing arduousness and cost of a night at the movies, by home viewing options that offer ease, comfort, convenience, and economy, and by the mediocre quality of much of the film release slate, something we don't mind so much when we're watching the same mediocre films at home for a fraction of the cost. We used to watch movies at home on DVDs, but the industry wants discs to go the way of celluloid, replaced by cheaper streaming files that we'll keep not on our shelves but in virtual lockers floating in the Internet cloud. So we're soon to be completely divorced from movies even as physical objects -- no longer the result of light reflected from real objects and captured in silver on a strip of celluloid, or even those same images digitized and burned onto a disc, but merely a string of 1s and 0s generated inside a computer and streamed to our screens, without ever having touched the real world.
We can also make TV more like movies, watching shows on a living-room entertainment setup whose large screen and surround-sound audio approach a cinema-quality experience. We can watch series in marathon blocks, turning a few episodes into an epic-length, movie-like session. And if we miss a few episodes, we can stream them later, so that TV programming becomes a disembodied digital file, just like a film. Freed not from physical concreteness but from the temporal concreteness of the weekly schedule, TV loses its specialness as well, just like the movies.
One area where movies still have an edge over TV is glamour and prestige. After all, which would you rather win, an Oscar or an Emmy? Would you rather be known as a movie star or a TV star? Then again, stars are becoming increasingly irrelevant in movies, where the concept itself (a familiar title, an established franchise) is the star and actors are interchangeable or easily replaced by digitally generated characters. Hollywood still needs stars to promote the movies at premieres, but that need for glitz will diminish as theaters vanish and premieres take place in the theater in your living room -- or in the tablet on your lap, or the smartphone in your pocket.
When that happens, no one will be arguing anymore about which kind of visual story has more cultural primacy or watercooler urgency because there won't be a collective experience for us to talk about. Movies and TV will continue to be the repositories of our dreams and fears, but they won't be the way we talk to each other about those dreams and fears, not when they're all just digital snippets stored on third-party servers, just streaming clips that each of us watches on his or her own schedule, on his or her own screen.