Monday, November 30, 2009

How I Built My Father (And Where I Went Wrong)


Jonathan Cape's illustrated short "How I Built My Father (And Where I Went Wrong)" is a beautiful and sad bit of magical realism, set in a world where children build their parents from scratch, but still can't always fix them.

Indie Film "INK" pirated; Filmmakers Pleased


Filmmaker Magazine has a great interview with the team behind the indie film "INK". The full text and a link to the article appear below.

When I attended the Future of Music Conference this year I heard a lot of talk about all of the opportunities that exist today for indie musicians to create and distribute their products via digital media on the web. Later, at the Flyway Film Festival, I heard former Tribeca CEO Brian Newman speak on similar topics in relation to indie filmmakers. The central theme of all of it is that indie artists can be successful without a major label contract or major studio distribution.

In the end, though, talk is cheap, and what looks good on paper doesn't always translate easily into the real world. I wanted to test the waters firsthand, so I created a video podcast featuring live performances by indie musicians. The show runs roughly a half hour, and I have been shooting a new episode every week for the past three weeks. In that time I have arranged distribution of the show via all the major video sites on the web. It is also available on TV via iTunes and Roku, as well as on mobile devices and game systems.

So far the show has answered all my questions. It is indeed possible to create content of reasonable quality and achieve worldwide distribution using commonly available digital means. In addition it is possible, using these same resources, to cross the divide between computers and other systems such as cell phones, PDAs, game systems, and even TV via Roku and TiVo or AppleTV/Xbox/PS3. It is also possible to do it on a shoestring budget. The experiment, called The Indie Music Show, has to date cost me around $2,500.

This week, when I saw Jamin and Kiowa Winans of Double Edge Films sending out excited messages on Twitter to the effect that their movie Ink had been ripped and uploaded to Pirate Bay I was intrigued. I guess I was just sort of programmed by negative publicity to see sites like Pirate Bay as a bad thing. On the other hand, once I thought about it, I could certainly see the exposure potential of putting a project in front of the 140 million users of bittorrent sites worldwide. So I put my show up on Pirate Bay to see what would happen. In two days views of the show on its home page tripled.

It is trickier for a movie, though, since a movie is much more of a one-shot deal than a weekly TV show. Most people will see a film once, maybe twice if they really like it, and maybe buy the DVD if they really, really like it, whereas a TV show needs to attract and hold repeat viewers. From that perspective, the major studios and probably most indie filmmakers see a pirated film as lost revenue, and so bittorrent remains pretty much unexplored territory in relation to positive outcomes.

Kiowa and Jamin on the other hand seem to be approaching the issue from a different perspective. I wanted to get their views on what is happening with their film and spoke to Jamin about it.

Filmmaker: Why are you guys having such a positive reaction to your film being pirated?

Winans: The last eight months have been a brutal struggle for Ink. We premiered the film at Santa Barbara Int'l Film Festival, signed with the agency UTA, and opened in Denver for a very successful eight-week run. However, indie film distribution in general has imploded. All the indie branches of the big studios have shut down and no one is buying films. So we took Ink out one theater at a time for the last several months ourselves trying to gain some momentum. The little money we made on each screen we used to push to the next screen. Theater after theater we had amazing crowds, reactions, and new fans, yet every decent distributor wouldn't touch the film. We knew we had an audience, but no way to get the film out wider to them. We were getting hundreds of emails, Facebook, and Twitter notes from people wanting to see Ink all over the world, but all we could tell them was "we're trying".

We finally decided to walk away from theatrical and make the film available on DVD, Blu-ray, and download as soon as possible. We figured the only way Ink was going to find its way was to hand it over to the fans and hope they would run with it. Our hope was that Ink would slowly travel by word-of-mouth over the next year and ideally find its way.

We knew Ink would likely get bittorrented eventually and accepted that it was unavoidable. However, we never imagined it would happen immediately, blow up overnight, and spread all over the world. We were shocked by what was happening and spent the next several hours thinking there was some sort of mistake. But as it turns out, our one-year strategy of word-of-mouth was instead moving instantaneously. I've never seen a Hollywood campaign as effective and as instant as this has been.

Sure we could be upset that the film is getting downloaded for free, but that would make us jackasses wouldn't it? Ink was a $250,000 film with previously unknown actors. Hollywood distributors made it more than clear they saw no future for it. It was too bizarre, a mixed genre, unknown actors, low-budget. They wanted nothing to do with it. To pretend that we're really upset about the torrent would be acting as if we had all kinds of other options. No, we're thrilled Ink is exploding so much faster than we ever hoped.

Filmmaker: What is the actual number of downloads that Ink has seen since being made available on the bittorrent channels?

Winans: It's hard for us to equate, but last I heard from the experts Ink's been downloaded over a half million times in about five days.

Filmmaker: I remember hearing you talk about making Ink at the Flyway Film Festival and you were saying that you raised the $250,000 budget for the film in part by mortgaging your house. So you obviously have a huge personal stake in the financial success of the film. On the one hand every download on Pirate Bay can be viewed as lost revenue which could be used to offset the cost of producing the film. On the other hand such a large number of downloads can be seen as a form of advertising that exposes the film to a much wider audience. I realize that it is much too soon to calculate the actual impact from having the film made available via pirate channels but what are the best and worst case scenarios from your viewpoint?

Winans: Kiowa and I don't see it as lost revenue, but fans gained. In fact, our revenue on the film has quadrupled in the last few days as a result of the exposure. It's still a fraction of what we need to be making to make it work, but it's a big step in the right direction. People are coming back to our website and buying disks, the soundtrack, posters, shirts, and making small donations. If that continues we'll be in good shape. However, most downloaders are not spending money, and it's certainly possible that they never will. If that's the case, we could be hurting.

Here's the irony. We got completely screwed by the people distributing our first feature film, 11:59. We didn't get paid at all from one distributor, and barely from another. In the last five days, we've made more money from donations from "pirates" than we've ever made from a distributor. You tell me who the crooks are. Everyone is concerned piracy is going to destroy the indie film world, but I can say unequivocally that the distribution world is already destroyed because it's primarily made up of scam artists and thieves. If someone's going to rip off our film, I'd rather it be our fans than some sleaze bag feeding on struggling indie artists.

Filmmaker: I have been thinking about why major label bands or big studio films have the success they do. I mean, as often as not, products by the majors are no better than products by unknown artists, and yet the majors totally control the traditional market. The obvious answer is the star power of the people in the film and the enormous amount of money that is spent on marketing. This seems to be why, even with two products of essentially equal quality (one indie, one major studio) side by side on the same "shelf," whether in a store or on the web, the studio film is always the one that makes money.

This is true even if consumers have not yet seen either film.

I wanted a term that would express this advantage in simple terms. I came up with "implied value" as a distillation of all of the ingredients that make an unseen film attractive enough to consumers so that they will invest their money in a movie ticket or DVD. Word of mouth from consumer peers is an important example of how a media project can gain this type of value and one which indie artists can best capitalize on since it doesn’t necessarily require a huge advertising budget to achieve.

From this perspective, do you think that having Ink pirated and exposed to the huge audience represented by bittorrent users will increase the Implied Value of your film with the world audience? What is the biggest benefit you see — increasing general awareness of the film or sparking a larger base of word of mouth recommendations? I know from reading your tweets this week that this exposure has already caused Ink to rise to the level of a top 20 movie on the IMDb (Internet Movie Database) chart, which is certainly encouraging, but do you see this translating into actual income via theater placement/attendance or DVD sales?

Winans: I don't think word-of-mouth has ever been as powerful as it is right now. Social networks and online communities have changed everything. From the beginning our principle has always been to establish fans and care for them. We're far more interested in creating a family-like fan base than we are in making general films that the studios can distribute. Rising through the ranks on IMDb is cool because it's quantifiable in some way and it's nice to see Ink and the actors getting exposure, but we're much more interested in the individual notes that we get from fans telling us how much they love the film. These people are all we really care about because they'll likely be with us for a very long time. When all the hype dies down, they're still going to be our fans. And if we have our fans we don't need anyone else. I think your Implied Value theory is exactly right. Yes, I do think the recent explosion of the film has created new value for Ink. Paranormal Activity's implied value obviously skyrocketed even though it was made for $11k. In the end value really is perception. Each of us wants to see the thing the rest of the world is seeing.

As far as translation into sales, the growing implied value is certainly helping. Because Ink is blowing up, a lot of people see it as a bigger film, more of a brand, and thus they're more willing to pay for it.

All this said, it's a scary time. We look at the file sharing of Ink as a great thing, however it works for us because we're a small film. The fact is, most people downloading it are not supporting it financially in any way. From everything I can tell this is not a sustainable model for bigger films. By bigger, I mean anything above $1 million, which isn't much. If fans aren't paying for the films, who is? Hopefully it will all work out, but the concern is that the illegal downloading will destroy movies simply because producers have no way to fund them anymore. The only other tested and working alternative that I'm aware of is advertising and product placement, and an enormous amount of it. So in the near future our film could be titled Ink: Brought to You by McDonald's. - written by Mike Johnston (link: http://filmmakermagazine.com/webexclusives/2009/11/indie-film-ink-pirated-filmmakers.php)

Sunday, November 29, 2009

Outfoxed: How Roald Dahl's stories for children eclipsed his fiction for adults.

"I could feel him smiling," said Felicity Dahl, widow of the great Roald, of her experience of viewing Wes Anderson's Fantastic Mr. Fox. "I was thinking, he'd love this." Well, she would know, I suppose. But what am I to do then with my conviction that her late husband would have loathed this? That Wes Anderson, with his glockenspiels and drolleries and minutely faceted interiors, has travestied the raucous spirit of Dahl? And that the ideal Fantastic Mr. Fox movie would be a work of slapdash animation, soundtrack by Mötorhead, directed by Bobcat Goldthwait? I'll just have to sit on it, I suppose.

Rarely can the movements of the muse be charted with any precision, but it appears that around 1959 the tutelary presence that handled Roald Dahl Inc. decided, with very little warning and no consultation, upon a major shift in direction. Ideas for the short stories with which he had made his name in the pages of The New Yorker and the Atlantic Monthly dried up, and Dahl found himself temporarily at a loss. It was not a position to which he was accustomed. Long-bodied, dented, worldly, impatient, Dahl came from enterprising Norwegian stock and had been educated in the heart of the British establishment. He was a former WWII flying ace (he fought with the Royal Air Force in Greece and North Africa), a former spy (as an attaché to the British Embassy in Washington, D.C., he had funneled political tidbits back to London), and the husband of screen goddess Patricia Neal. No literary career is easy, but his had gone pretty smoothly, relatively speaking: His first short-story collection, 1953's Someone Like You, had garnered him comparisons with Saki, Somerset Maugham, and O. Henry, and his second, Kiss Kiss, was selling nicely.

But a limit seemed to have been reached. Those grisly, sting-in-the-tail plotlets of his, each with the economy of a black joke—they weren't coming anymore, as he admitted to his publisher, Alfred Knopf. The one about the woman who beats her husband to death with a frozen leg of lamb, then defrosts the murder weapon and serves it to the investigating police officers ("Lamb to the Slaughter") or the sickly baby dosed by her beekeeping father with the healthful secretions of the hive until she acquires "a powdering of silky yellowy-brown hairs" on her stomach ("Royal Jelly") ... now, for some reason, Dahl was writing page after page about a small boy, a group of talking insects, and an enormous airborne peach.

Knopf didn't blink, and James and the Giant Peach was published in 1961. The opening—"Until he was four years old, James Henry Trotter had had a happy life"—could have come from one of the short stories, but within a few lines little James' parents had been dispatched (day out in London, escaped rhinoceros) with a cruelty that was part folktale gruffness, part Nabokovian élan. Dahl had magically fused his New Yorker voice with one that seemed to issue from the blackest Norwegian forest: brisk, practical, unsparing, mildly atavistic, and quite at home in the bizarre. This was Dahl 2.0. Charlie and the Chocolate Factory came next, and then, in 1970, Fantastic Mr. Fox.

Life, meanwhile, had missed few opportunities to pulverize Roald Dahl. In 1961 his 4-month-old son Theo was critically injured when his baby carriage was hit by a taxi. Olivia, Dahl's first daughter, caught the measles in 1962, slipped into a coma, and died. In 1965 Patricia Neal, pregnant, suffered a massive stroke: Much of Dahl's energy went into her subsequent years-long rehabilitation.

Fantastic Mr. Fox, coming at the end of this decade of punishment, was understandably not the tightest or most elaborate of his works for children. But then that's the foxy thing about it—the book gets by on a scrape of a plot, some top-notch Anglo-Saxon alliteration (Boggis, Bunce, and Bean: You can't beat that), and the charm of its leading man. Ted Hughes had come out with his classic The Iron Man a couple of years before, and there were elements in common: vengeful mechanized digging, for one, as both Mr. Fox and the Iron Man came up against the terrible tractors of postindustrial English farming. Dahl's tale, however, unlike Hughes', was free of mystical overtone. Mr. Fox is simply a dashing paterfamilias under siege, struggling to protect his brood and sustaining a fearful wound, a castration almost, in the form of his shot-off brush—that bleeding tail stump, "tenderly licked" by Mrs. Fox, providing one of the most shocking images in all of Dahl's work.

Can we separate Dahl the Pied Piper, the battered figure at the heart of 20th-century children's writing, from Dahl the littérateur? The light thrown retrospectively on his early stories is revealing: Their tone of sinuous expertise now seems rather obviously that of an adult spinning naughty tales for an audience of juniors. (Adolescent readers, for example, have always particularly enjoyed them.) Post-Peach attempts to recapture this tone, to go grown-up again, would be unsuccessful: Once the muse had made her move, that was that. Switch Bitch, a collection of creakily pornographic stories that had appeared in Playboy, seemed a relic even in 1974. "She laid a lovely long white arm upon the top of the bar and she leaned forward so that her bosom rested on the bar-rail, squashing upward." (You can catch there a debauched echo of his early hero Hemingway—until the word "squashing," that is, which is pure Dahl.) A 1979 novel, the dreadful My Uncle Oswald, was low-intensity ho-ho smut of the sort that might have tickled his old friend and fellow roué, the world's laziest writer: Ian Fleming.

In the general economy of Dahl's art, however, these books perhaps served their purpose, burning off a spurious sophistication and allowing him to perfect his true style, which was scruff-of-the-neck storytelling. ("Listen very carefully," urges the narrator of The Witches. "Never forget what is coming next.") The slightly ponderous precision with which he had set up his punch lines in Someone Like You became a secret weapon when he wrote for children—an exhilarated, second-by-second focus on the matter at hand. No one who has read it, or had it read to them, forgets the moment in Danny, the Champion of the World when the 7-year-old hero drives a car down a dark country lane, exquisitely slow to begin with but picking up speed, going from first to second gear, and second to third, in a mounting mechanical ecstasy ...

Dahl was not religious by temperament or philosophy, and this seems important. Compare his bristling, stinking, unmetaphorical characters with the watery allegories of the Harry Potter cycle—and his prose with J.K. Rowling's—and you begin to see that a supernatural frame of reference might not always be such a wonderful thing. A good Roald Dahl sentence is a physical event: It can leave a child literally writhing with glee. "The hailstones came whizzing through the air like bullets from a machine gun, and James could hear them smashing against the sides of the peach and burying themselves with horrible squelching noises—plop! plop! plop! plop!" You don't need to know anything about Dahl's dogfights over wartime Greece to enjoy that. He was better at beginnings than endings—"The Wonderful Story of Henry Sugar" begins three times—but then aren't we all.

There will be kids, no doubt, who writhe with glee at Wes Anderson's Fox, and more power to them: It has plenty of marvelous qualities. But something grizzled, abrupt, and rough-humored is missing. Something warty. "You can smell the danger, watch your step/ See the friendly stranger, stretch your neck ..." One of Dahl's Revolting Rhymes? Not quite. It's Lemmy, from Motörhead's "Die You Bastard." I think the two of them would have got along very well.

British Books Offer A "Cosy" Antidote To Apocalyptic Horror. Let's Be Civilized, Shall We?


Looking for an alternative to the horrific scenarios of 2012 and The Road? Try the "cosy catastrophe" genre, the Guardian suggests: Stories like Day Of The Triffids and The World In Winter feature a less violent version of the end.

In the "cosy catastrophe" genre, the end of civilization happens more gently, or is passed over altogether, and there's often some hope for the rebuilding of the world. The Guardian explains:
The phrase is attributed to the British author Brian Aldiss, who mentions it in his fascinating history of science fiction, Billion Year Spree, while talking about the author of Day of the Triffids, John Wyndham. While Triffids, with its blinded populace and sinister, stalking plants, could hardly be described as "cosy", it is an example of a largely non-violent, non-destructive doom. Wyndham also wrote The Kraken Wakes, in which an alien invasion gradually destroys civilisation by way of melting the ice caps rather than with death rays and war machines. The book chronicles the rebuilding of a massively de-populated world once the aliens have been despatched.
John Christopher is another British author who embraced the idea of a cosy catastrophe. While his novel, The Death of Grass – which so worried Sam Jordison when he was younger – does feature an ecological disaster that causes often violent social breakdown, Christopher (real name Sam Youd) also wrote The World in Winter, a very much more British version of Emmerich's movie The Day After Tomorrow, in which increasingly harsh winters drive the population of western Europe towards the suddenly more temperate African regions. And then there's JG Ballard, who employed ecological apocalypse in his debut novel The Wind from Nowhere, as well as in his more famous works The Drowned World, The Burning World, and many of his short stories.
Of course, there may be a bit of wish-fulfillment on the part of these authors, as author Jo Walton suggests in the Guardian piece. The survivors of these catastrophes are often very middle class, and they get to wander around a suddenly depopulated world, with the working class wiped out in a guilt-free way. And then they get to rebuild the world along more civilised lines.

But leaving aside the classist undertones of the genre, who's to say that a collapse of civilization wouldn't be slow and relatively non-violent? And that we wouldn't pull together to rebuild afterwards?

Fox Chairman has "No Doubt" that AVATAR will turn a profit: Therefore it must be true.

Fox Filmed Entertainment chairman Jim Gianopulos says claims that "Avatar" cost $500 million (though the NYTimes initially said it "could cost up to" that figure) were "ridiculous."

"That's a ridiculous number," Gianopulos told Reuters, however refusing to divulge the actual cost of the film. "It has actually no relationship to the actual cost of the movie. The movie was quite expensive, there is no question about that. But viewed now, from the perspective of its completion and having seen it, it's a formidable work and money well spent."

Will the movie make its money back or even turn a profit? "I have no doubt about that," he said, which makes it sort of amusing to watch fanboys around the web go, "Oh good! It's going to make money!"

We're not saying it won't, and we would never count James Cameron out (we sort of did that back in the day with "Titanic," when the press made it seem like an expensive disaster waiting in the wings), but it's still kind of funny to see a dubious quote like this taken at face value by the geeks and thrown into headlines as truth. What the hell was the Fox guy going to say? "Actually, it cost more than that. We're really scared we won't make our money back and we'll be fucked"???

Meanwhile, James Cameron was on 60 Minutes this weekend, but the appearance was somewhat dry and boring. However, 60 Minutes did say that the film cost "roughly $400 million-plus for production and promotion." Also, Cameron gave us pearls of genius like, "Hell yeah, [I gave the Na'vi character tails]. Tails are cool."

One technical achievement the director is proud of is actually quite sad. "Even when we were doing 'Titanic' twelve years ago, the shot at the bow where [Leonardo DiCaprio and Kate Winslet] kiss, we waited two weeks for the right sunset to get that shot. Now we just shoot it in front of a green screen and choose the right sunset later."

Geez, if that's not a sad pullquote for the state of soulless cinema, we're not sure what is. He's also quite modest. He tells crew members that for each project of his they work on they are "going to the Super Bowl" and warns them, "don't get on the boat if you're not ready to go all the way." His reputation as a bona fide asshole seems calcified.

Cameron did use and abuse his painful "crowning" metaphor once again. Ugh. "You don't ask a woman if she wants to have more kids at the exact moment when she is having a baby, you know what I mean?" he said, apparently pleased with himself that he's used this analogy about 12 times in different interviews now.

Evidently Cameron is working down to the wire to get "Avatar" complete for its December 18 release date. Though he actually has even less time than that: the world premiere of "Avatar" will take place December 10 in London and its U.S. premiere will happen in L.A. on December 16, but he's tinkering down to the last second. Here's the 60 Minutes piece and a chase sequence that was part of the 20 minutes of footage we saw earlier this year.

Saturday, November 28, 2009

The Orange/Blue Contrast in Movie Posters


I’m sure you’re aware of Hollywood’s overuse of floating heads on movie posters… but have you noticed the excessive use of orange/blue contrast on theatrical one-sheets? David Chen happened to come across this comic illustrating the blue/orange contrast, although I’m not sure where it originated or who created it. After the jump you will see a ton of examples of orange/blue contrast, however I must warn you — as the comic says, once you see it, you’ll notice it everywhere.

Thursday, November 26, 2009

Change in Copyright Law: A possible solution to news content crisis?

Copyright law reform as one remedy for plummeting profits at traditional news organizations was proposed at a media affairs panel organized by the non-profit Center for Communication and hosted by Fordham University earlier this month.

Former public television executive and current Fordham Professor William F. Baker moderated the event which was called “The Audience: How America Uses its Media.” On the panel: Nielsen executive Gerry Byrne and media lawyer Dean Ringel, a partner at the New York law firm of Cahill Gordon & Reindel.

Ringel advocated introducing compulsory licensing fees for Web-based aggregators or re-distributors of news content. Under Ringel’s system, sites like Google would be required to share profits with or pay a fee to any news organization whose content they post, in a system similar to the compulsory licensing system that currently manages rights for cable television and music.

He noted that current copyright law protects the specific expression of information but does not protect the work necessary to obtain that information. Ringel argued that papers like the New York Times, which spend prodigious sums on reporter security in dangerous places around the globe, should get some of the revenue made by third-parties who distribute their content.

Ringel’s proposed system would apply only to sites obtaining revenue from re-posting news content. Sites which did not charge fees or seek advertising revenue, and perhaps even some commercial sites whose readership is below a certain threshold, would be exempt from the requirements.

Denying profits to newspapers and magazines is to “risk depriving our society as a whole of neutral, professional, prepared and analytic information” said Ringel. Dr. Baker echoed Ringel’s statement by quoting Thomas Jefferson: “If a nation expects to be both ignorant and free, in a state of civilization, it expects what never was and never shall be.” All three participants agreed a reliably informed citizenry underpins any successful democracy.

Legal solutions that prohibit the re-posting of headlines or web links entirely risk inhibiting the free flow of information, said Ringel. A content licensing and revenue sharing plan, he argued, would preserve the marketplace of ideas while also ensuring that the activity of reporting is properly rewarded.

Byrne, senior vice-president at Nielsen Business Media -- he oversees Hollywood Reporter, Adweek and Editor & Publisher, among many other titles -- spoke frankly about the prospects for economic stability in the news business.

“The TV advertising market in Los Angeles used to be worth $1.2 billion but now the same number of stations are fighting over $500 million,” said Byrne, “and that means that reporters all across L.A. are getting wiped out of stations because there isn’t anybody who’s willing to pay for them anymore.”

Byrne, who is a Marine Corps veteran, joked that “Sometimes I feel like it would be easier to put on an army uniform and go fight in Afghanistan or Iraq than it would be to stay in the news business.”

The panel members also speculated on what the media industry may look like in the future.

Agreeing with Dr. Baker’s statement that mobile devices are sure to become more important as media outlets, Byrne said, “Mobile realities are going to explode. Outside the U.S. mobile platforms are way more advanced than we are here.”

Byrne was hopeful about the ability of local media to perform a watchdog function. “Journalists on a local basis are the cleansing system for what goes on in small communities. Things like whether local politicians or real estate developers are doing what they’re supposed to,” said Byrne. - Evan Leatherwood

Sunday, November 22, 2009

Are Zombies America's Godzilla?


Zombies have been enjoying a heyday of late, but why are Americans so obsessed with the walking dead? One theory is that Westerners love zombies for the same reason Japan loves giant monsters: they represent technology gone awry.

James Turner, an editor for O'Reilly Media, claims that zombies share a kinship with Godzilla. His theory is that, just as Godzilla was inspired by the dropping of the atomic bomb, Western filmmakers (Romero aside) latched onto zombies in the wake of Three Mile Island, the recognition of AIDS, the Ebola outbreak, and similar medical and technological disasters. He goes on to posit that the increasing popularity of zombie movies involving a biological outbreak suggests a Western ambivalence toward biotechnology.

It's an interesting thought, though perhaps a bit reductive. Certainly zombies have been used to comment on biotechnology, but they've also been used to comment on a number of social issues, including consumerism, corporate greed, and the objectification of women. And what causes the zombie outbreak is often less important than what comes afterward. Still, Turner makes an interesting case that biotechnology-based zombies could evolve to more acutely reflect our biological and technological fears:
"Blackberry-spawned abominations, anyone? Dawn of the Single-Payer Healthcare Undead? What about, They Came From H1N1?"
He's far more convincing when he talks about the important differences between giant monsters and zombies, namely that it's the military and scientists who fight Godzilla, whereas zombies fall to resourceful and self-reliant survivors.

Americans must like the idea that, as out of control as our hubristic science might become, a good machete and a 12 gauge in the hands of a competent man or woman can always save the day. The 2003 bestselling title, The Zombie Survival Guide, offers the same message of self-reliance. (I'm not sure what lesson we can take from the success of Pride and Prejudice and Zombies.)

Here's the actual Article:
-----------------------------
A Brief History Of Zombies

The atomic bombs that dropped on Japan in 1945 inspired movie director Ishiro Honda to give the world the big, bad, grey monster, born of irresponsible nuclear weapons tests that we know to this day as Godzilla. Godzilla was, quite literally, the personification of humanity's science and technology gone bad. The message was simple: With atomic weapons, we had unleashed a monster that was beyond our ability to control.

In the West, Godzilla's cautionary tale (and tail) never really took hold. To Americans, Godzilla was just a guy in a rubber suit stepping on model houses. But that's not to say that the West hasn't had its own cinematic symbol of science run amuck. Instead of giant irradiated monsters, our preferred poison has been flesh-eating zombies.

Until George Romero's landmark 1968 film, Night of the Living Dead, zombies in movies usually were created from voodoo or magic (or aliens, as featured in Ed Wood's groundbreakingly awful Plan 9 From Outer Space.) Romero gave us brain-munching corpses produced from a space probe blowing up in the atmosphere. Once again, the monsters were created by our out-of-control technology.

Night of the Living Dead didn't spawn an immediate clutch of imitators, possibly because it came in the midst of America's race to the moon and most people were hopeful about advances in science. But when Romero returned with Dawn of the Dead in 1978, that optimism had already begun to fade. By 1984's C.H.U.D., disasters such as Three Mile Island had primed the movie-going public for the idea of a horde of killer zombies created by nuclear waste.

Along with nuclear waste and mysterious space-borne radiation, pandemic plagues have also spawned zombies. This zombie type has become the dominant movie form over the last few decades, no doubt a reaction to AIDS, Ebola, cloning, genetically modified foods and the remainder of the brave new world of biotechnology.

It seems you can't throw a half-eaten cerebrum these days without hitting a posse of zombies brought to life by some kind of biological mishap (28 Days Later, Resident Evil, Planet Terror, Quarantine). Like Godzilla, zombies keep up with the times, always ready to mirror whatever aspect of science and technology people feel most uncertain about at the moment.

But there's one major difference between Godzilla and the attack of the zombies: Godzilla fought scientists and the military (and maybe the occasional band of adorable children), but zombie battles usually are a person-to-ex-person struggle. While Godzilla swatted at planes and crushed tanks underfoot, zombies are done in by weapons such as shotguns, hand grenades and the ever-handy chainsaw.

Americans must like the idea that, as out of control as our hubristic science might become, a good machete and a 12 gauge in the hands of a competent man or woman can always save the day. The 2003 bestselling title, The Zombie Survival Guide, offers the same message of self-reliance. (I'm not sure what lesson we can take from the success of Pride and Prejudice and Zombies.)

To be sure, it's easy to read more into the popularity of zombies than might actually be there. Film-goers have always loved a good scare, and a shambling collection of neuron-challenged corpses makes a pretty terrifying story. And if my zombie-obsessed 14-year-old son is a representative sample, blowing the undead away with heavy weaponry has a solid adolescent demographic appeal. But there's no question, at least in my mind, that zombies (and Godzilla) are an allegorical representation of our fear that science and the technologies it spawns will lead to our destruction.

Who knows what the future may hold for zombie evolution? But it's a pretty good bet that whatever we're uncomfortable with at the moment stands a good chance of turning into the next zombie-generator. Blackberry-spawned abominations, anyone? Dawn of the Single-Payer Healthcare Undead? What about, They Came From H1N1?

IFC & Netflix Team To Stream 53 Indies

IFC Entertainment and Netflix have announced a partnership that gives Netflix the U.S. rights to 53 unique titles from IFC Entertainment. Through this agreement select titles from IFC Entertainment’s library of independent films will become available to be streamed instantly to televisions and computers via the Netflix service. The deal was announced jointly by Lisa Schwartz, executive vice president for IFC Entertainment, and Robert Kyncl, vice president of content acquisition for Netflix.

“Netflix has always championed independent cinema and has creatively built audiences for films in this genre, and we’re excited to give their customers instant access to this wide-ranging collection of independent film,” said Lisa Schwartz, executive vice president for IFC Entertainment, in a statement. “Our top priority is to make independent film available to a wider audience and this partnership further underscores that commitment.”

The partnership allows Netflix members on an unlimited plan to instantly watch the newly acquired films on their computers or TVs through a range of Netflix-ready devices. The films will be available beginning Friday, November 20th.

“Partnering with IFC Films gives us the opportunity to expand the number of quality films that our subscribers can watch instantly,” said Robert Kyncl, vice president of content acquisition for Netflix, in a statement. “This deal reinforces our commitment to bringing diversity to the library and properties like this collection of titles bring us closer to that goal.”

The deal’s 53 titles include English-language independents such as John Sayles’ “The Brother From Another Planet” and “Return of the Secaucus Seven,” Christopher Nolan’s debut “Following,” Joe Swanberg’s “Nights and Weekends,” James Toback’s “When Will I Be Loved,” and Rebecca Miller’s first film “Angela.”

The library will also feature documentaries by filmmaker Errol Morris, including “The Thin Blue Line” and “Gates of Heaven”. Joe Berlinger and Bruce Sinofsky’s award-winning “Brother’s Keeper,” and Jim Stern and Adam Del Deo’s political documentary “So Goes The Nation” are also featured.

In addition, the deal includes some recent foreign-language titles: Susanne Bier’s “Brothers,” Patrice Chereau’s “Gabrielle,” Hirokazu Kore-Eda’s Cannes prize winner “Nobody Knows,” Lukas Moodysson’s “Together,” Christophe Honore’s “Dans Paris,” Catherine Breillat’s “Sex Is Comedy,” Alfonso Cuaron’s “Solo con tu Pareja,” Kristian Levring’s “The Intended,” and Hou Hsiao-Hsien’s “Three Times.”

Friday, November 20, 2009

Black or White: Making moral choices in Video Games





You and your three companions step out of the elevator doors to a peaceful scene: an airport lounge milling with unsuspecting bystanders. No one has seen you, or the machine guns. In an instant it’s all over: a shower of bullets, screams, falling bodies, and blood-stained carpet. Your companions have moved on, executing those left alive. What do you do?
 
The emergence of morality in video games is arguably one of the most important innovations of the medium to date. As in the above example from Call of Duty: Modern Warfare 2, giving players moral choice adds weight and substance to their decisions, leading to a more immersive and satisfying experience. Whether it’s abstaining from shooting civilians while infiltrating a terrorist cell, saving or harvesting Little Sisters, or holding the fate of the Capital Wasteland’s people in your hands, moral decision making is becoming an increasingly popular aspect of game development.

But is it all an illusion?

Morality is not a black-and-white concept. Reality is very seldom as simple as a choice between good and evil; the spectrum of moral behaviours is as complicated and consequential as our emotions. Instead of mirroring this complexity and including moral choices that lead to genuine in-game consequences, video games often do the opposite--they present a watered-down version of moral choice that ultimately results in players having to choose between good or evil: to harvest or not to harvest (BioShock), to be “paragon” or “renegade” (Mass Effect), to kill innocents or to save them (inFamous), to have a halo or devil horns (Fable II).


In this feature we will look at the problems arising from morality systems in video games, and seek to answer why morality is needed in games, why moral choice is so often just black and white, and what developers can do to change this. In Part One of the feature we’ll speak to philosophers and game theorists, and in Part Two we’ll speak to developers, to find out whether complex moral choices are needed--or wanted--in games and how morality systems can be improved.

Morality 101

In a nutshell, morality refers to the codes of conduct that form the backbone of a society. Generally, morality is concerned with how people should behave rather than how they do behave. Morality can change over time and take on new meaning as people and environments evolve--for example, slavery was once accepted as morally permissible, whereas now it is accepted that enslaving another human being is immoral. In philosophy, morality and ethics go hand in hand: morality pertains to certain rules and codes of conduct while ethics pertains to the application of these rules in society. 



Morality as it applies to video games can be thought of in much the same way. Players are most often asked to decide on a morally correct or incorrect course of action. This pertains to in-game behaviour and, in most games, is intended to shift the outcome of the game in one way or another depending on what the player has chosen to do. However, as we will see later, it is most often the case that these in-game choices have little or no bearing whatsoever on the end outcome, resulting in an insincere portrayal of morality. But why should we care? Why do we need morality in games at all, when some games obviously function perfectly well without it?


Emil Pagliarulo, lead designer for Fallout 3, knows that morality does not play a role in every game. He does, however, believe that if the scope for a moral system is there, it’s up to the developers to make it work. 



“If it makes sense to include moral choices, if that’s something central to a game’s themes or gameplay, and it makes the game a more enjoyable experience overall, then morality certainly has a role,” Pagliarulo said. “In Fallout 3, the struggle of people in a post-apocalyptic wasteland lends itself perfectly to a morality component, so for us, it was a must," he said.
“It’s the job of the developers to define their experience for players, and determine exactly where each system fits in. Is the game fun in a hack-everyone’s-limbs-off sort of way, or is it fun in a wow-this-game-made-me-think-and-did-stuff-I-never-expected sort of way?” 


For Pagliarulo, the appeal of a morality system is to break the monotony of experiencing the same thing over and over again. He says gamers have come to realise that there isn’t a lot of experimentation or thinking outside the box in the games industry at the moment--for every LittleBigPlanet there are five first-person shooters with the same mechanics, structure, and story. But morality systems shake things up; gamers have to think about their actions and choices and, more importantly, the reasons behind them.

 “I think players simply get tired of experiencing the same things over and over and over in games. Frankly, it gets boring. When morality’s involved, the simple act of shooting a bad guy isn’t so simple anymore. You’ve got to ask yourself, 'Well, is he really the bad guy? Was he maybe just trying to defend himself? Should I really be doing this?' So just the act of questioning what you’ve done a thousand times before instantly makes it different, and more interesting, and therefore, in a lot of cases, more fun," he said.

BioWare writer and designer Mike Laidlaw agrees that morality adds depth to games. He says that even when morality has no long-term impact in the game world, a game with a morality system is better than one without it.

“The role of a morality system is a means by which a game can be aware of the way a player is interacting with the in-game world; in some ways it’s a way for players to measure their own progress in a certain way. It’s also a mechanic that lets us realise that these choices have some weight. It helps players understand that the things they’re doing and the choices they’re making have an impact beyond the moment," he said.

“Even if it doesn’t have a long-term effect, it still forces players to think about those moments. I’m not saying that every morality system ever made is the best thing ever. I think in general, anything that makes a game more interactive, whether it’s successful or not, is good. It plays to the strength of the developer and the medium. I can’t defend it in all cases, but when it’s done with a good intent and done as well as it can be I think it makes the player engage with the game on a deeper level.”

Most games portray a dualistic morality system: regardless of context, players end up playing as either ‘good’ or ‘evil’ characters. Some games employ a ‘morality meter’ that promises to keep track of players’ in-game actions and change their experience accordingly. Sadly, this very rarely happens--most games that promise a tailor-made experience according to player choices end up disappointingly consistent and devoid of any real consequences for a player's actions. This results in an experience that feels like it has more depth but very often just has the illusion of depth. So why is morality in games so black and white?
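To make the complaint concrete, here is a minimal sketch of the dualistic "morality meter" described above. It is a hypothetical illustration in Python; the class, values, and labels are invented here, not taken from any game named in this piece. Notice how every decision, whatever its narrative framing, collapses into a single signed number and one of two labels:

class MoralityMeter:
    """A hypothetical dualistic morality meter, for illustration only."""

    def __init__(self):
        self.score = 0  # positive = "good", negative = "evil"

    def record_choice(self, delta):
        # Every in-game decision, however nuanced, is reduced to a
        # plus-or-minus adjustment of one number.
        self.score += delta

    def alignment(self):
        # The whole moral history of a playthrough is flattened into a
        # binary label -- the "halo or devil horns" problem.
        return "paragon" if self.score >= 0 else "renegade"

meter = MoralityMeter()
meter.record_choice(+10)   # e.g., spare a civilian
meter.record_choice(-25)   # e.g., harvest a Little Sister
print(meter.alignment())   # prints "renegade"; the reasons behind each choice are lost

The context of each act, and the player's reasons for it, never enter the system at all -- which is exactly the "illusion of depth" this section goes on to criticise.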

Peter Rauch, a Comparative Media Studies graduate from the Massachusetts Institute of Technology (MIT) in Cambridge, MA, is a veteran gamer: in his own words, he’s been gaming since he was “old enough to stand on a milk crate to reach the joystick at arcades”, and he’s been studying games ever since. His last years at MIT were spent researching morality in games, looking at how moral arguments could be used in games to encourage players to pay attention to moral issues and to provide new ways of thinking about them in the real world.

“What I’ve found is that video games are a great medium for provoking discussion of moral issues among players who already think a great deal about such things,” Rauch said. “However, most games use a hodge-podge of different moral systems and when these conflict, the result can be bizarre.”


Rauch believes a lot of games make use of very simplistic moral ideas, which at times can take players out of the game, though it’s all about how well the morality works in the context of the gameplay.

“Seeing certain options open up and others close off was one of my favourite things about the single-player mode in Star Wars Jedi Knight: Dark Forces II, and the difficulty of maintaining a consistent ethic in The Suffering made my whole experience more intense,” Rauch said. “Fable’s morality system is a train wreck, but even that made it a more memorable experience. Laughing at the fact that eating tofu helped me prepare for cold-blooded murder was probably the one saving grace of that game.”

According to Rauch, people like a little villainy with their heroism, which is why morality in games is becoming so popular. Besides adding an extra layer to the gameplay, morality systems are supposed to allow players to better identify with their characters and to some extent, begin to better understand their choices and actions in the game. But does this actually happen when players are presented with black-and-white moral choices? The problem, according to Rauch, is the limitations of the medium itself.

“In a game, actions only have moral meaning when they're attached to a symbol that plays a role in the storyline. What actions can be performed from that is largely determined by the genre’s conventions. There are certain moral ideas that just aren't going to make sense in certain genres without substantive changes to the game rules, and you're going to have some limitations in any game in which there's one win condition and one loss condition, especially if that loss condition is usually the player's death. Martyrdom is a tough thing to reward in most genres," he said.

"There are creative challenges for game developers to overcome, but this is always risky because video game production is a capital-intensive business. Games are expensive and slow to produce, and the big name titles are expected to subsidise the losses. Investors would much prefer another Halo clone over something new that might fail.”


But that’s not all. According to Rauch, while moral conflicts appear interesting in dramatic situations, the simple fact is that day-to-day moral choices are usually very simple and intuitive in normal circumstances. The trouble is, video games don’t involve normal circumstances, which is partly what makes them so fun and what makes the idea of a moral system so intriguing. So perhaps one of the reasons why in-game morality tends to be so simple is that most people, including game developers and players, think about it in simple terms when presented with the abnormal circumstances of most games.

The question of whether developers should try and mirror real-life moral choices in games is a complex one to answer. This would certainly break the illusion and give players agency, but would it be a successful game? While Rauch is not entirely convinced it would, he still believes developers should experiment with the possibility.

“I think any new gameplay concept, or any new game genre, is a good thing in itself,” Rauch said. “I like games, and I like seeing them change over time. I think developers should make games that mirror real-life moral choices, and games that mirror highly unlikely, super-heroic choices, and games that imagine entirely hypothetical, otherworldly choices. These games might be boring, but I think that games like The Sims and Diner Dash have pretty conclusively shown us that any activity can be fun with the right design.”

The way to do this, according to Rauch, is to start a conversation.

“Designers, players, and especially critics would benefit from having a few long conversations about how people act in certain situations, and whether they ought to act differently. Players need to be allowed to fail once in a while; it would be nice to have some unambiguously bad choices available. Games right now seem to be stuck in a place where the consequences of player actions are entirely predictable, and take effect either immediately or at the very end of the game. Some kind of partial randomisation, or delayed effect, might help to deepen the kind of experiences we could have with games.”
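A sketch may help make Rauch's suggestion concrete. The following Python fragment is hypothetical (the event, delay, and probability are invented for illustration, not drawn from any shipping game), but it shows how a delayed, partially randomised consequence could work: a morally loaded act schedules an uncertain outcome instead of an immediate, guaranteed one.

import random

class ConsequenceQueue:
    """Hypothetical sketch of delayed, partially randomised consequences."""

    def __init__(self):
        self.pending = []  # each entry: [turns_remaining, probability, effect]

    def register(self, delay_turns, probability, effect):
        # A morally loaded act schedules a deferred, uncertain outcome
        # rather than an instant, predictable one.
        self.pending.append([delay_turns, probability, effect])

    def tick(self):
        # Call once per game turn; consequences fire late, and only sometimes.
        still_pending = []
        for entry in self.pending:
            entry[0] -= 1
            if entry[0] <= 0:
                if random.random() < entry[1]:
                    entry[2]()
            else:
                still_pending.append(entry)
        self.pending = still_pending

queue = ConsequenceQueue()
# Betraying an ally *might* come back to haunt the player ten turns later.
queue.register(delay_turns=10, probability=0.6,
               effect=lambda: print("A former ally ambushes your caravan."))
for _ in range(10):
    queue.tick()

Because the effect is neither immediate nor certain, the player can no longer compute the "optimal" moral path in advance -- which is precisely the predictability problem Rauch identifies.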

Variety is the Spice of Life

Games like BioShock and inFamous have attracted criticism from gamers who have discussed the morality element of the gameplay for a number of reasons. The most important flaw cited in these discussions is that the morality systems used in these games give the illusion of meaning to a player's actions. For example, killing or saving the Little Sisters in BioShock is promoted as a very weighty and important player decision, when in reality it has little bearing on your character: both paths give you roughly the same amount of ADAM. Similarly, the "morality moments" in inFamous present a very crude and simplistic idea of morality: participating in either the ‘right’ or ‘wrong’ choice doesn’t advance or impact gameplay in any way, other than adding points to a meter and producing differently-coloured lightning attacks. In order for morality to function properly within a game environment, developers need to pay attention to the consequences of in-game actions, making them lead somewhere instead of nowhere and using them to shape and affect narrative and gameplay.

Assistant professor of philosophy at the University of Central Oklahoma Mark Silcox and associate professor of philosophy at Louisiana State University Jon Cogburn have partnered up to write Philosophy Through Video Games (Routledge, 2009), a text that discusses the relationship between philosophy and video games, and looks at how morality systems work within the game environment. The first thing Silcox and Cogburn do in the text is strongly encourage game developers and designers to study different philosophical systems of morals instead of using inchoate intuitions like ‘good’ and ‘bad’.

“If we could do one thing here, we would first require all video game designers to read one of the excellent short introductions to the western philosophical tradition in ethics,” Silcox said. “There are so many fundamental issues on which smart, informed people disagree that it would not be difficult to design games around the kinds of conundrums and tragic choices that lead thoughtful people away from thinking in such black-and-white terms. One thing we can’t stress strongly enough is that great art can’t just always show ‘bad’ acts, which leads to ‘bad’ consequences. Real life isn’t always that way, nor is great art.”

According to the two philosophers, the function of a morality system in a video game is, like all art, to allow people to imaginatively play with compelling possibilities in a safe manner, experiencing what it is like to be a very different person in a very different situation. This works by the portrayal of certain kinds of actions in games that have typical consequences, which players must react to in various ways. However, how to translate this successfully to video games is still being worked out; according to Silcox and Cogburn, players are beginning to demand something more than higher scores and extra loot for good behaviour.

“Some of Peter Molyneux’s more recent games were going to be something like this, where the behaviour of the character affects the interface in interesting ways, but neither of us has been very moved by the games themselves. Mechanisms like the one in Fable where you can reverse the polarity of a character’s morality just by mooching off to a temple and making a sacrifice to some deity have a horribly trite and arbitrary feel to them," Silcox said.


“We think this might be done better in games like Empire: Total War and the Civilization series. We’re also encouraged by the fascinatingly complicated systems of etiquette that have sprung up around highly social MMORPGs like Second Life and EVE Online. The more that people’s lives as gamers start to blend together with their social interactions in games like these, the more closely what happens in them can be expected to reflect the ethics of our everyday interactions.”

Something that video game developers should steer away from in trying to achieve a more realistic and textured morality system is the temptation to include real-life moral choices. Silcox and Cogburn agree with Rauch that games based on real moral dilemmas just wouldn’t work.

“It is admittedly very difficult to imagine a genuinely fun video game that mirrored the sort of everyday moral choices that most people end up being preoccupied with, e.g. whether to tell off one’s boss, how much to spend on grandma’s birthday gift, or whether to be faithful to your spouse, as opposed to whether or not to nuke eastern Europe or to spray machine gun fire into a crowd of zombies.”

One solution would be to have unpleasant consequences stem from committing morally questionable acts in games, which would heighten the experience of playing and make game narratives seem more vivid and realistic. Silcox and Cogburn believe that the representation of morality in games is more than just a passing trend and that, as games take on a more and more central position in our culture, designers will find a way to fix the existing problems.

“Game designers would be well served by immersing themselves in the debates between adherents of different moral theories in the Western traditions," Silcox said. "Might we dare to imagine a future in which video game designers actually had a place on the very short list of people whom we routinely expect to provide us with real moral wisdom?”

Monday, November 16, 2009

Ten things that Anne Thompson learned at the Governors Awards

1. The Governors Awards will not be televised. At the orange Grand Ballroom at Hollywood and Highland Saturday night, the Academy of Motion Picture Arts and Sciences gave out four honorary Oscars at a new annual event on the awards calendar. Academy executive director Bruce Davis, president Tom Sherak, Oscar show producers Bill Mechanic and Adam Shankman, and Oscar-host-to-be Alec Baldwin all attended this relaxed, celebratory black-tie cocktail and dinner party. The ceremony awarding four honorary Oscars to actress Lauren Bacall, producers John Calley and Roger Corman and cinematographer Gordon Willis, punctuated by repeated standing ovations, lasted three hours and 18 minutes, to be exact.

2. The awards circuit always draws would-be Oscar contenders. Glad-handing were Morgan Freeman (Invictus), Jeff Bridges (Crazy Heart), Vera Farmiga (Up in the Air), Gabourey Sidibe (Precious: Based on the Novel ‘Push’ by Sapphire), Abbie Cornish in lavender Dior (Bright Star) and Christoph Waltz and Quentin Tarantino (Inglourious Basterds), who met director Julie Taymor for the first time. New World alumnus James Cameron (Piranha II), who was supposed to attend, was stuck in the Avatar editing room, said Sherak.

3. Avatar production and marketing costs will not reach $500 million. Producer Jon Landau insists the final costs of the movie will come nowhere near that. (The LAT reports a $310 million budget tally without P & A.) Cameron and Vince Pace’s special 3-D camera rigs, for example, are rented like any other cameras. Weta Digital made a bid for how much the visual effects would cost. The actors were not expensive. There isn’t all that much live-action shooting in the movie, which filmed in soundstages in Playa Vista and in New Zealand. OK…

4. If director Clint Eastwood delivers yet again on Invictus, the movie could be the one to beat for Best Picture. When I told Freeman how much I admired the Invictus script, adapted from John Carlin’s book by Anthony Peckham, Freeman said, “I guarantee you, we did not mess it up.” Freeman plays Nelson Mandela opposite Matt Damon as rugby captain Francois Pienaar. In order to unite South Africa, the two men push to win the 1995 World Cup. 

5. Christoph Waltz is now a working Hollywood actor. Of his good fortune, the multi-lingual German, who is currently shooting The Green Hornet with director Michel Gondry and Seth Rogen, said, “It’s unbelievable!”

6. Lauren Bacall is still beautiful at 85. Introduced by Kirk Douglas, a fellow student of hers at the American Academy of Dramatic Arts, Bacall said that throughout her career, and again backstage Saturday night, he has reassured her by saying, “Never fear, Kirk is here.”

“People said Bacall was tough,” said Douglas. “She’s a pussycat with a heart of gold. I’m sure Lauren Bacall will teach the Oscars how to whistle.” When Douglas had a threadbare coat, she got him a thicker one. He admitted that he tried to seduce her, but they became friends instead. Anjelica Huston, who was born while her father John, Humphrey Bogart and Bacall were on faraway location in Africa filming The African Queen, was visibly moved as she thanked Bacall for being her “mother, friend, guide, teacher.” Here’s the video clip.

Bacall waved her Oscar overhead, crowing, “I can’t believe it, a man at last!” She went on, “I’ve been very lucky in my life, luckier than I deserve. At the age of 19 to have been chosen by Howard Hawks to work with a man like Humphrey Bogart… It ended up being Bogart that was my great luck. He was not only a wonderful actor but an extraordinary man. He gave me a life and changed my life.” Her children with Bogey, Stephen and Leslie, attended the ceremony.

7. Roger Corman is cool. “This is the cool table!” Tarantino told me as I cruised the long New World table in the middle of the room. Sure enough, hovering there were Jack Nicholson (in shades), Curtis Hanson and Rebecca Yeldham, Jonathan Demme, Allan Arkush, Joe Dante, Gale Anne Hurd and Jonathan Hensleigh, Lewis Teague, Peter Bogdanovich, and Jon Davison. Along with Ron Howard, who got his start directing on Corman’s Grand Theft Auto, Tarantino delivered an entertaining, heartfelt intro to 83-year-old King of Independents Corman, producer of 550 and director of 50 indie films, complete with clips of Man with the X-Ray Eyes, The Intruder, The Wild Angels, The Trip, St. Valentine’s Day Massacre, Bloody Mama, Martin Scorsese’s Boxcar Bertha, Francis Ford Coppola’s Dementia 13, and Corman’s series of Edgar Allan Poe films. “For all the wild, weird, cool, crazy moments you have put on the drive-in screens, the movie lovers of the planet earth thank you,” said Tarantino, to rousing applause.

8. New Disney chairman Rich Ross is not cool. Ramrod straight and not knowing many people in the room, Ross kept his counsel at a chilly Disney table while surviving production chief Oren Aviv greeted well-wishers. Ousted studio head Dick Cook would have been cheerily hanging with his old cronies. Ross represents a threatening new studio order in more ways than one.

9. Director of photography Gordon Willis, 78, probably has not received an Oscar until now because he was a “crusty curmudgeon,” as described by Francis Coppola via videotape. The Prince of Darkness was overlooked for such films as The Godfather 1 and 2 (a sequel), All the President’s Men, Presumed Innocent, Klute and a series of Woody Allen films including Annie Hall (a comedy) and Manhattan (black-and-white), explained cinematographer Caleb Deschanel, because the New York-based d.p. (who had to be coaxed into coming to the awards ceremony) refused to suck up to the establishment. In fact, he could be quite blunt: “Some Hollywood cameramen are like a bunch of flame throwers,” Deschanel quoted Willis as saying. “Some directors are dump truck directors. They fill a dump truck with shots and dump it in editors’ laps.” Said Woody Allen in a video clip, “I think he’s the best cinematographer America has ever seen, really.” I concur.

10. Producer/executive John Calley, 79, couldn’t make the Governors Awards because he was too frail. A line of past Irving G. Thalberg recipients gathered to honor Calley, including Steven Spielberg, Norman Jewison, Warren Beatty, Dino De Laurentiis, Walter Mirisch, George Lucas and Saul Zaentz. Here’s video of Spielberg accepting the Irving G. Thalberg Award for Calley, whose clip reel included such classics as Catch-22, The Loved One, The Shining, The Exorcist, All the President’s Men, and 10. I enjoyed spending time with Calley over the years: charming and canny, he liked to recount his experiences with Stanley Kubrick. Jewison, who first worked with Calley on the troubled set of The Cincinnati Kid in 1965, said: “He was a sophisticated man with an eye for talent. He protected me. John, you were a mensch.”

Is the Hollywood Movie Star dead?


A Reuters piece that’s been making the rounds this weekend speculates that Hollywood may be thinking twice about banking on A-list celebrities in the future. The piece points to recent low-budget, star-free fare like The Hangover, District 9, and Paranormal Activity, each of which went on to be wildly successful, and contrasts them with big-budget, star-studded flops like A Christmas Carol, Land of the Lost, and Funny People. The overall lesson seems to be that star power doesn’t have nearly the draw it used to, and that budgets aren’t much of a factor for audiences either.

But of course, I don’t really think this is news to most of us. While some may bemoan the tastes of general audiences when they overwhelmingly support movies like Transformers: Revenge of the Fallen, I don’t think they did so for hunky Shia LaBeouf. Instead, they were probably looking to revisit the magic from the first film—which, let’s face it, was far better than it had any right to be. (Or the simpler answer: they just wanted to see things blow up.)

In any case, it was the quality of the concept of Transformers 2 (magic revisited and/or ’splosions) that most likely led to audiences showing up in droves, not the stars. You could apply a similar logic to The Hangover and its ilk mentioned above. I’d like to believe that audiences are smarter than we give them credit for—or at the very least, that most can tell when studios are pushing crap on them. And sometimes they completely surprise us; just look at how well Inglourious Basterds performed.

Pronouncements of the movie star’s demise are nothing new—they simply tend to come up after every wave of high-profile flops. I think there will always be room for stars; the lesson we need to learn is how to use them. As the Reuters piece mentions, studios are looking to scale back on large up-front salaries for big stars, and instead ask them to bank on potentially greater rewards if the film breaks even. And if some stars want to remain big-salary hogs who care more about a paycheck than their work, then perhaps it is time for them to step down.

Ultimately, the success of these lower budget features is a good thing for cinema. It makes studios less uneasy about moving forward with low budget features, and opens the doors for innovative new projects down the line. And after all, releasing several smaller features instead of relying on returns from a few big-budget films is a much safer bet for them as well.

Here's the actual article:

Hollywood rethinks use of A-list actors

Fri Nov 13, 2009 4:51pm EST
By Alex Dobuzinskis

LOS ANGELES (Reuters) - Hollywood studios are now thinking twice about splurging on A-list movie stars and costly productions in reaction to the poor economy, but also because of the surprising success of recent films with unknown actors.

After buddy comedy "The Hangover," a movie with a little known cast, made $459 million at global box offices this past summer, several films have shown that a great concept or story can trump star appeal when it comes to luring fans.

"District 9," a low budget movie in which the biggest stars were space aliens treated like refugees and the lead actor was South African Sharlto Copley, made $200 million. Thriller "Paranormal Activity," starring Katie Featherston and Micah Sloat, has cash registers ringing to the tune of $100 million.

Next up, on November 20, comes Summit Entertainment's relatively low-budget ($50 million) franchise movie "The Twilight Saga: New Moon," a sequel to 2008 hit vampire romance "Twilight" which made global stars of Robert Pattinson and Kristen Stewart. Online ticket sellers report "New Moon" is one of their highest pre-sale movies of all time, and box office watchers expect the film to have a smash opening.

"Nobody says that a big wonderful movie needs to be expensive, it's just that that's been the trend, and perhaps the trend is misguided," said University of Southern California cinema professor Jason E. Squire.

Last weekend, comic actor Jim Carrey's "A Christmas Carol" became the latest celebrity-driven movie to stumble at box offices, opening to a lower-than-expected $30 million. Aside from Jim Carrey and "Carol," which cost at least $175 million, A-listers who suffered box office flops recently have included Bruce Willis ("Surrogates"), Adam Sandler ("Funny People"), Will Ferrell ("Land of the Lost"), Eddie Murphy ("Imagine That") and Julia Roberts ("Duplicity").

"The (major movie) machine didn't fly last summer, if you look at the movies and the names, they were not star-driven movies, they really weren't," said Peter Guber, chairman of Mandalay Entertainment and former head of Sony Pictures. Hollywood insiders say A-listers currently are having trouble with salary demands in the $15 million range or participation approaching 20 percent of gross profits -- deals that were once somewhat common for top talent. Instead, they are being asked to take less money upfront and greater compensation only if a film breaks even.

FRANCHISE ON THE CHEAP

In "New Moon," actors Robert Pattinson and Kristen Stewart rekindle their romance between an immortal vampire and a high school girl that they brought to silver screens in last year's adaptation from Stephenie Meyer's "Twilight" books.

At the time, Pattinson and Stewart were unknowns, but that did not hurt "Twilight," which made $384 million at global box offices and gave Summit a bona fide franchise. It's not unusual for franchises like the "Harry Potter" movies to begin with unknown actors, but as the films' popularity takes root, production budgets relax and actor, producer and other salaries soar.

But in recent years, Hollywood has been racked by the recession, competition from videogames and the Web, declining DVD sales and fewer licensing deals with television networks. This week, Disney chief Bob Iger said in a conference call that the sluggish DVD market is one reason the major studio has altered its moviemaking. "It causes us to really reconsider not only what we're investing in our films, but how we market them and how we distribute them," he said.

For its part, fledgling Summit has positioned "Twilight" as a franchise for the recession era by keeping the pressure on the costs for "New Moon," and Hollywood producers are praising them for it.

"Good for them, they are really keeping the costs down. It is unusual," said Lauren Shuler Donner, a producer on the "X-Men" films and 2008's "The Secret Life of Bees."

Summit, whose executives declined to be interviewed, took a page from the playbook of "The Lord of the Rings" by shooting the second and third films back-to-back this summer.

When director Peter Jackson made his three "Lord of the Rings" films simultaneously 10 years ago, it was a novel idea that reduced costs because actors, sets, costumes, locations and other items only had to be assembled and paid for once.

Similarly, by shooting the next two "Twilight" movies together, Summit kept the cost of the third film, "Eclipse," due out June 30, around $60 million, one source said.

"What I like is they didn't have a long window (between films), they went in to make a franchise, they didn't go in to see if they had a franchise," said Warren Zide, producer on the "American Pie" and "Final Destination" movies.