Wednesday, September 30, 2009

Indie Future: Mechanic Offers Answers

Always one of the smarter players in Hollywood, ex-Fox studio chairman Bill Mechanic (now owner of the production company Pandemonium LLC) gave a keynote speech Tuesday morning on the future of the indies at the Independent Film & Television Alliance Production Conference. Is he looking for a job (there’s a big opening at Disney, where he used to run home video)? Maybe.

In any case, his analysis of what’s going on is strong. As always, Mechanic fights for the interesting, challenging and original over standard-issue formula fare:

It’s disrespectful if not downright dumb to think audiences can’t tell the difference between the original, which occasionally might even have some fresh faces, and the copy, which almost always is populated with retreads. It’s like thinking you can sell yesterday’s news under a different banner.

AND

While use of the internet and video games has dominated leisure-time activities, movie consumption is down or flat over the same period. And, more to the point, you can see that there is a 21% drop in filmgoing amongst the core target audience and a 24% drop in the next key category, 25-39 year olds.

And yes, these charts beg another question: if the audiences are shifting, why isn’t the product shifting as well? Name 5 mainstream films this year that successfully targeted an over-30 audience. In that way, Hollywood in the broadest sense of the word is much like Detroit. It’s a manufacturer’s mentality that reigns, seemingly indifferent to the consumers it serves. Ignore whether the consumer likes our product as long as they buy it.

Market it and they will come.

and MY personal favorite (I've been saying it for years): "We have too many insignificant movies clogging our distribution channels."

Read the rest on Thompson's blog here: http://blogs.indiewire.com/thompsononhollywood/2009/09/29/indie_future_mechanic_offers_some_answers/

Monday, September 28, 2009

"Kill More Kids": Strong statement. Let’s see how offensive this gets.

Kids suck. Everyone knows this. They’re noisy, they smell, they ruin sex lives and moviegoing experiences. But that’s in the real world. Kids in the movie world are far, far worse. They’re sickeningly cute, they say the darnedest things, and they somehow survive every situation no matter how fucking implausible. I mean seriously, kids are easy targets. In the middle of the African jungle, kids don’t accidentally stumble out of the way of the lion. The lion eats the kid. Or kills it for sport. Don’t even get me started on Velociraptors. Kids are slow, stupid, and delicious. Any kid entering Jurassic Park ever is nothing more than a snack.

So why don’t movies reflect this reality? Why aren’t more kids biting the big one? I’m not advocating dropping a house on some fat kid in a Pixar movie and having his brain squirt out of his ankles. But in big boy adult movies, why are kids given a free pass? Why is Michael Myers slashing babysitters but not babies? You want to show true evil, kill a kid. You want to show reality, kill a couple. Anything less just takes you out of the movie. You seriously want us to believe that the ultimate evil isn’t evil enough to kill a kid? You think I’m going to believe this assassin can shoot the wings off a fly but misses a kid running down a hallway? Am I supposed to buy that some maniacal killing machine with an ax to grind and cut heads off with can’t catch a rugrat and smash his head in? Let’s get real, Hollywood!

Some movies get it right though, and they deserve some props. Who Can Kill a Child? is a good place to start. You want to know who can kill a child? Tom, the protagonist. Dude is down with machine gunning kids and beating them with planks of wood. Because they deserve it, the evil little bastards. Tales from the Crypt: Demon Knight also deserves a little love. There is a young kid in it, maybe 13 or 14 or whatever, and he doesn’t make it. You know why? Because kids are shit when compared to demons. Demons are immortal hell warriors, not little bitches like kids. Sleepaway Camp was all about youngsters getting the ax; even a whole bunch of like 5-year-olds got hatcheted. Hell yeah! Tim Burton’s Sleepy Hollow featured a young boy who didn’t get away – because the Headless Horseman doesn’t fuck around. Rambo featured some very effective massacre scenes where, true to life, no one gets away. Not women, children, mothers, daughters, sons. Because the real world sucks and everyone gets it eventually. Now that last one is certainly a bit more disturbing than the other ones, which are less serious fare. Back on the lighter side of things, in a way, is perhaps the ultimate in child killing – Beware: Children at Play, from masters of messed up horror, Troma Entertainment. It’s said that when the trailer for the film was shown, half the theater got up and walked out. The climax of the film is a 10-minute sequence of adults killing kids. Lots of kids. Like 50 kids. Do the kids survive or get away? No. They had it coming. Also, they’re dumb and slow.

If you can’t go into an R-rated movie and watch some kids get killed, you probably shouldn’t go into a movie. If you’re one of those morons who thinks watching something in a movie or on TV means it’s going to happen, you’re an idiot – in addition to being a moron. Movies are movies, not reality. Wanting to see something happen on film is in no way indicative of someone’s mental state. Millions of people watch horror movies without killing people. We watch movies about Nazis without joining up. We watch movies about infidelity yet an alarming number of us keep it in our pants. Fuck, even the new remake of A Nightmare on Elm Street seems to have removed the child-killing aspect. What the fuck, man? Can we not show bad people doing bad things? It’s not the writer or director killing people. It’s a character in a movie. No children actually died.

So wake up, Hollywood, and grow some nuts. It’s time to kill more kids. They’re asking for it. Begging for it. Give your bad guys more oomph, your villains more evil, and your serial killers more fodder. It’s just being realistic. Show it how it is. You have no trouble killing off the fat guy because he’s slow, but even the fatty can outrun a child. It bugs me when some little brat who has no real shot of evading anything somehow manages to come out unscathed. In fact, seeing a kid survive some implausible situation that an adult wouldn’t pushes me right past my boiling point. - Robert Fure

Thursday, September 24, 2009

The Seasons of Festival Buzz


We’re now well past the two major events that kick off the film festival season, with both the Telluride Film Festival and Toronto Film Festival behind us. As expected by most, the news isn’t great at all, with many highly touted films that debuted at those festivals still lacking distribution deals. The failure of some – most – movies to come away with sales under their belts is just the latest nail to be driven into the coffin of the independent film market, one that’s fallen victim to a combination of over-saturation in theaters, too much production money being thrown around and an economic downturn that has bigger studios turning away from unknown properties in favor of comic adaptations, sequels and remakes.

Despite this gloomy picture, the festivals have been more heavily covered by the entertainment press in the last three years than ever before, with almost every high-profile movie blog sending one or more representatives and plenty of journalists there from what remains of the trade publications. In a funny way, coverage of the festivals has increased almost exactly as much as the actual sales market at those festivals has declined. More stories are filed but fewer deals are done.

One of my gripes in the last couple of years has been that despite the saturation of movie industry writers buzzing about all those movies, very little of that buzz carries over to when those movies, once sold to a distributor, are eventually released. If a filmmaker and his team sell their movie to one of the remaining buyers, that studio seems to discard whatever positive word-of-mouth and goodwill has been established through the festival appearance in favor of a marketing campaign that follows what for them is a more traditional model.

That’s why I’ve been thinking lately that studios have in their very own hands the key to turning the independent film market around and making their festival acquisitions a more profitable investment. And that key looks very much like a strategy involving embracing the word-of-mouth a movie already has in its favor.

The reality is that, with very few exceptions (the upcoming Up in the Air comes to mind), it’s really hard to market a festival film to a mainstream audience. Running a traditional marketing campaign for a movie like Moon, one of the most buzzed-about films from the most recent Sundance Film Festival, felt very much like the studio was trying to fit a square peg into a round hole. More effective were director Duncan Jones’ interactions with fans and bloggers on Twitter, where he could talk to people, tell them about the movie’s theatrical expansion and other developments, and thank those who had championed the movie for their efforts. Even Up in the Air, which features one of the biggest current movie stars and is more…polished…than most arthouse fare, is benefiting from director Jason Reitman’s Twitter presence, which has a similarly conversational and behind-the-scenes feel.

Both of these efforts have many things in common, but the biggest one is that they use social media tools to embrace, communicate and empower their fans. Which is kind of the point.

One of the problems with the studio model is that it’s built to define success by a single yardstick. More accurately, there’s one set of tactics through which a movie has to be promoted, because that’s how the infrastructure within studios has been established. If the movie is one that could benefit from something different, well, that’s just too bad. It needs a mass-appeal poster, a mass-appeal trailer and then a distribution pattern that is meant to minimize financial risk but which has the side effect of minimizing audience exposure, thus dooming the movie. I’ve referred to this previously as a kind of self-fulfilling prophecy: a studio buys a movie for the prestige value but then expresses doubts about its market viability and executes such a limited distribution program that it winds up proving its own skepticism right.

So instead of going down this road again and again, what if studios bought a movie and then:

-made sure all the positive reviews that resulted from the festival screening were linked to from the film’s official website

-executed an ad campaign promoting screenings on those blogs that were among the film’s initial champions

-used those blog writers as hosts for some screenings, letting them further express their passion for the movie

-engaged in outreach to other online communities that were relevant to it (special interest groups, fans of the movie’s genre, etc.)

-used social media to allow people to meet others in their geographic area that were interested in the movie and plan a real life meeting, with the studio popping for coffee, pizza or whatever


Basically, find ways to harness some of the enthusiasm people have coming out of the festival and use it to expand the movie to more audiences. Don’t try to go wide at first and don’t even try a traditional platform release schedule. Let the movie work organically from point A to point B, with growth coming from the audience itself and some sort of online ambassador – the director, a studio publicist or whoever is best and most passionate about it – making sure that the online buzz is extended and broadcast in the easiest and most engaging way possible.

These tactics aren’t going to work for every movie. Hell, they may not even work for one. But the only people who are going to turn around the film festival sales market are those with the money to spend, and that’s the studios. If they start innovating and experimenting with new ways to promote and publicize the movies that emerge from those festivals as favorites, I firmly believe we’ll see a market currently suffering a massive downturn right itself and regain its vibrancy. Best of all, this sort of experimentation will prove out the legitimacy of the “true” independents that are already doing some of these things, possibly putting them on more people’s radars and bringing them more success.

The primary point, though, is that there’s all this positive word-of-mouth that results from film festival appearances, something that’s still sought after by filmmakers of all shapes and sizes. Letting that go and not building upon it is a missed opportunity to let fans of a movie contribute to the success everyone is striving for. - Chris Thilk

Tuesday, September 22, 2009

Do You Give Any Filmmakers the Benefit of the Doubt?

When the closing credits began to roll at the end of Joel and Ethan Coen's new film A Serious Man, nobody at the press screening moved. The end comes at a rather surprising moment, when a lot of things are happening, and we all found it necessary to sit there for a moment and process everything. One colleague and I talked about the movie over lunch -- and more specifically, we talked about what it means when you see a movie and don't understand what it means.

Not that A Serious Man (no spoilers here) is mystifying or hard to follow or anything like that. But it has elements that may not make sense at first glance. It has a prologue, set several decades ago in Eastern Europe, that has no obvious connection to the main story, set in 1967 in Minnesota. There are a few characters and plot threads that don't seem to fit with the others. Overall, it's a very satisfying and engaging film. It just might not all add up at first.

And that's what my friend and I were talking about. My contention is that since this is the Coen Brothers -- a pair of experienced filmmakers with a proven track record -- I'll give them the benefit of the doubt. If some significant aspect of the film seems puzzling, I'll assume it's because I've failed to grasp its meaning and not because the Coens have screwed up. I mean, that prologue: it's not there by accident. The Coens put it there for a reason, to support a theme or to enhance an idea. Now it falls to me to figure out what that reason was.


Now, we can talk about whether the filmmakers ought to have done a better job of making their points. That's a valid concern. You don't want a movie that's obvious and spoon-feeds everything to you, but you don't want something obtuse and impenetrable, either. My rule of thumb has always been that if a second viewing is required to even understand what's going on in a movie, then the filmmaker hasn't told his story very well. But if you can understand it on a single viewing and the second viewing merely expands on that understanding, then that's OK. In fact, that's terrific. We love movies that are deep enough to reward multiple viewings. And that's generally been my experience with the Coens, particularly with No Country for Old Men, which I liked the first time but which didn't fully "click" until I saw it a second time.

But there are plenty of filmmakers that I wouldn't give the benefit of the doubt to. I've seen plenty of movies where I thought, "Well, that doesn't make sense. These people are idiots." Sometimes you can tell what they were trying to do, and they simply failed at it; sometimes you're not even sure what they had in mind. It's fair to think, "OK, why is this in the movie? What were they thinking?" -- but you can't always assume it's your fault for not getting it. Sometimes you have to accept that yeah, these people really just didn't know what they were doing (or there was studio interference that forced something incongruous into the story, or someone's contract stipulated that this scene needed to stay even though it was wrong, etc., etc.).

So when you see a movie with elements that don't seem to add up, what do you do? Does it depend on who made the movie? Do you give it any additional thought? Do you figure that if it didn't work for you, that's it, no need to waste any more time with it? Do you blame the film? Do you ever think: Wow. This movie is smarter than I am? - Eric D. Snider

Canada Picks “Mother” For Oscars


Telefilm Canada has announced that Xavier Dolan’s “I Killed My Mother” (J’ai tué ma mère) has been submitted for nomination as Best Foreign Language Film at the 82nd annual Academy Awards. It was selected from among 18 eligible films by a committee of 24 voting members representing major film industry associations and government agencies.

“A Canadian film in the race for the Oscars provides outstanding exposure for Canadian productions among domestic as well as international audiences,” said Sheila de La Varende, Telefilm Canada’s Director of National and International Business Development, in a statement. “We applaud the precious work accomplished over the past few weeks by the selection committee, which gathers together industry representatives from all regions of Canada.”

“Mother” - written, directed and produced by, and starring, the 20-year-old Dolan - details the intensely volatile relationship between a gay 16-year-old, Hubert (Dolan), and his mother, Chantale (Anne Dorval, noted by indieWIRE’s poll of critics and bloggers as having given one of the best performances at TIFF). The film builds through a series of richly hysterical conflicts that find these two characters exceedingly incapable of living with or without one another. - Peter Knegt

Monday, September 21, 2009

"Inglourious Basterds" is Tarantino's Top Earner - Because of Twitter?

In what could be read as a big "nyah, told you so" press release, The Weinstein Company would like you all to know that Inglourious Basterds has not only grossed over $108M* in North America but has now out-earned Pulp Fiction, which was previously Tarantino's biggest money-maker to date.

But what's strange is that TWC is giving some of the credit to "an innovative marketing plan. The film was the first to make use of Twitter and other social networking sites in such a direct fashion, even involving Twitter in the film's LA premiere," according to the press release.

Harvey Weinstein is even quoted as saying, "It was great working with Biz Stone at Twitter on Inglourious. It took the campaign to another level."

Okay, what have I missed? How was the Inglourious campaign different from any other of the studios' use of Twitter or Facebook to promote movies through links, contests, and meet-ups? I don't even recall seeing anything on Twitter about it, other than the normal studios using Twitter to cross-pollinate coverage.


Advertising and marketing execs are still speaking of "The Twitter Effect" in hushed tones – word of mouth, which used to take at least a whole weekend to damn a movie, is now zooming across the Internet at the speed of text messaging, according to some analysts and pundits at Advertising Age and The Guardian.

But I still find it really hard to believe the claim that Inglourious was the first to make use of this marketing strategy, or that it used it at all, except perhaps in the sense that Tweeters saw it and gave it a yay or nay. Smaller, more niche movies like District 9 and Moon have benefited from social networking or even good old real-life networking far more -- Moon director Duncan Jones' tireless schedule of Q&As, festivals, and interviews, in between Tweeting with fans, is especially impressive.

Personally, I think Inglourious Basterds benefited far more from its ubiquitous ad campaign and the lure of Tarantino, not to mention the promise of a Nazi bloodbath led by Brad Pitt's marble-mouthed Lt. Aldo Raine. The fact that it also offered excellent performances from Christoph Waltz and Mélanie Laurent was just a bonus.

What do you think? Is this Twitter effect hooey? Does it ever give you cause to pause before spending your hard-earned bucks on an opening-weekend film?

* Box Office Mojo puts Pulp at $107.9M for a domestic total gross to date, although its numbers show that Inglourious Basterds has earned about $109.9M domestically. - Jenni Miller

Saturday, September 19, 2009

FCC to Propose Net Neutrality

The U.S. government plans to propose broad new rules Monday that would force Internet providers to treat all Web traffic equally, seeking to give consumers greater freedom to use their computers or cellphones to enjoy videos, music and other legal services that hog bandwidth.

The move would make good on a campaign promise to Silicon Valley supporters like Google Inc. from President Barack Obama, but will trigger a battle with phone and cable companies like AT&T Inc. and Comcast Corp., which don't want the government telling them how to run their networks.

The proposed rules could change how operators manage their networks and profit from them, and the everyday online experience of individual users. Treating Web traffic equally means carriers couldn't block or slow access to legal services or sites that are a drain on their networks or offered by rivals.

The rules will escalate a fight over how much control the government should have over Internet commerce. The Obama administration is taking the side of Google, Amazon.com Inc. and an array of smaller businesses that want to profit from offering consumers streaming video, graphics-rich games, movie and music downloads and other services.

Julius Genachowski, head of the Federal Communications Commission, is also expected to propose in a speech Monday, for the first time, that rules against blocking or slowing Web traffic would apply to wireless-phone companies, according to people familiar with the plan.

Wireless carriers, which have been among the fiercest opponents of such regulation, continue to restrict what kind of data travels over the airwaves they control. For example, earlier this year, AT&T restricted an Internet-phone service from Skype so iPhone users couldn't place calls on AT&T's cellular network. At the time, AT&T cited network congestion concerns.

"We believe that this kind of regulation is unnecessary in the competitive wireless space as it would prevent carriers from managing their networks -- such as curtailing viruses and other harmful content -- to the benefit of their consumers," said Chris Guttman-McCabe, vice president of regulatory affairs for CTIA, the wireless industry's trade group.

If the FCC does force U.S. wireless carriers to open their networks to data-heavy applications like streaming video, it could push them beyond the limited capacity they have. Already, in areas like New York and San Francisco, a high concentration of iPhones has caused many AT&T customers to complain about degrading service.

In such a scenario, wireless carriers may have to rethink how much they charge for data plans or even cap how much bandwidth individuals get, said Julie Ask, a wireless analyst at Jupiter Research.

The FCC's proposal will take into account the bandwidth limitations faced by wireless carriers, according to people familiar with the plan, and would ask how such rules should apply to current networks.

The rules could encourage big Internet companies to launch new data-intensive services by establishing that their traffic can't be slowed or blocked. In the business market, companies that make Internet-phone services or video-conferencing software may invest more heavily in those services, some analysts say.

The rules are likely to be a big boon to smaller tech companies, like Silicon Valley start-ups and small makers of mobile software for Apple Inc.'s iPhone and other devices, that wouldn't be able to afford paying Internet providers for special access.

"Any company or piece of software that becomes popular, generating a lot of traffic, would tend to benefit," said Jonathan Zittrain, the co-founder of the Berkman Center for Internet & Society at Harvard University.

The FCC has four "net neutrality" principles, which call on Internet providers to avoid restricting or delaying access to legal Internet sites and services. Carriers are permitted to block access to illegal services and sites.

Mr. Genachowski is expected to propose that the agency clarify its current principles and turn them into formal rules. He will also tack on a new one, which would require carriers to practice "reasonable" network management. The agency will ask for guidance on how to define "reasonable."

Most Internet providers have resisted "net neutrality" rules in the past, saying they have a right to control traffic on networks they own and that it's not a good idea for the government to micromanage Internet traffic.

Phone companies including AT&T have argued that they can live with the FCC's existing principles, but that there's no reason to put more formal rules into place.

Representatives from AT&T, Verizon Wireless, Comcast and Sprint Nextel Corp. declined to comment ahead of the FCC's anticipated announcement.

The proposals come as the FCC faces a federal appeals court case over its authority to regulate Web traffic. Comcast is fighting an FCC decision last year to ding it for violating the agency's "net neutrality" principles when it slowed traffic for some subscribers who were downloading big files. Comcast said it didn't violate any rules because the FCC had never formally adopted any, but it did change how it manages its network.

Republicans are likely to oppose the FCC's new proposal -- both at the FCC and in Congress -- arguing that the FCC is trying to fix problems that don't exist and that the agency should take a more hands-off approach to the fast-changing industry.

"With only a few isolated instances of complaints alleging net neutrality-like abuses ever having been filed, it is a mistake," said Randolph May, president of Free State Foundation, a free-market oriented think tank.

The concept of network neutrality originated with the nation's longtime telephone monopoly. AT&T and its successors were prohibited from giving any phone call preference in how quickly it was connected. Since the Internet was born on phone wires, the concept survived into the Internet age largely by default.

That notion was challenged toward the end of the 1990s, as cable companies began offering Internet service. Cable companies argued that since they were content companies, not phone companies, the principle of network neutrality didn't apply to them.

Phone companies responded by getting into the content business as well, with television service. As a result, both the cable companies and phone companies had incentives to create conditions on the Internet -- either through pricing or slowing or speeding up certain sites -- to favor their own content.

In 2005, the FCC deregulated the Internet business, by ruling that Internet providers were communications companies and not phone companies and, importantly, were therefore no longer subject to the old phone rules such as network neutrality.

The FCC instead created its four "guiding principles" for protecting network neutrality. They were vague enough to embolden those looking for ways around it. Major phone companies like AT&T subsequently said they were considering creating "fast lanes" on the Internet, available at a higher price -- plans they put on hold amid an outcry.

Now, by codifying the principle, the FCC is seeking to limit erosion of network neutrality.

Mr. Genachowski is expected to set plans to open a formal rule-making process on the issue at the FCC's October meeting. The rules would have to be approved by a majority of the FCC's five-member commission, whose three Democrats support net neutrality.

Thursday, September 17, 2009

UPDATE - Social Media: Fad or Revolution?

The Numbers Don’t Lie: Welcome to the Revolution
by Christine McNabb

For those of you who didn’t have time to watch the full video from OTD’s previous post this week, I wanted to highlight some of the amazing stats brought to you by Socialnomics:

96% of Gen Y has joined a social network

Social media has overtaken porn as the #1 activity on the Web.

1 out of 8 couples married in the US last year met via social media.

If Facebook were a country, it would be the world’s 4th largest, after China, India and the United States.

Ashton Kutcher and Ellen DeGeneres have more Twitter followers than the entire populations of Ireland, Norway and Panama.

YouTube is the 2nd largest search engine in the world.

80% of companies are using LinkedIn as their primary tool to find employees.

If you were paid $1 for every time an article was posted on Wikipedia, you would earn $156.23 per hour.

78% of consumers trust peer recommendations, only 14% trust advertisements.

Hulu has grown from 63 million total streams in April 2008 to 373 million in April 2009.

25% of Americans said they watched a short video on their phone in the past month, and 35% of book sales on Amazon are for the Kindle.

More than 1.5 million pieces of content (web links, news stories, blog posts, notes, photos, etc.) are shared on Facebook, DAILY.


Clearly, social media is not the fad or fun pastime that many once thought it was. It has become a deeply embedded part of our lives, more than many of us might even realize, and the numbers don’t lie.

Saturday, September 12, 2009

Pussies and Dicks: Hollywood's lack of Balls. Exhibit A-12,092,323

Charles Darwin film "too controversial for religious America"

A British film about Charles Darwin has failed to find a US distributor because his theory of evolution is too controversial for American audiences, according to its producer.

Creation, starring Paul Bettany, details Darwin's "struggle between faith and reason" as he wrote On The Origin of Species. It depicts him as a man who loses faith in God following the death of his beloved 10-year-old daughter, Annie. The film was chosen to open the Toronto Film Festival and has its British premiere on Sunday. It has been sold in almost every territory around the world, from Australia to Scandinavia.

However, US distributors have resolutely passed on a film which will prove hugely divisive in a country where, according to a Gallup poll conducted in February, only 39 per cent of Americans believe in the theory of evolution. Movieguide.org, an influential site which reviews films from a Christian perspective, described Darwin as the father of eugenics and denounced him as "a racist, a bigot and an 1800s naturalist whose legacy is mass murder". His "half-baked theory" directly influenced Adolf Hitler and led to "atrocities, crimes against humanity, cloning and genetic engineering", the site stated.

The film has sparked fierce debate on US Christian websites, with a typical comment dismissing evolution as "a silly theory with a serious lack of evidence to support it despite over a century of trying". Jeremy Thomas, the Oscar-winning producer of Creation, said he was astonished that such attitudes exist 150 years after On The Origin of Species was published. "That's what we're up against. In 2009. It's amazing," he said.

"The film has no distributor in America. It has got a deal everywhere else in the world but in the US, and it's because of what the film is about. People have been saying this is the best film they've seen all year, yet nobody in the US has picked it up.

"It is unbelievable to us that this is still a really hot potato in America. There's still a great belief that He made the world in six days. It's quite difficult for us in the UK to imagine religion in America. We live in a country which is no longer so religious. But in the US, outside of New York and LA, religion rules."

"Charles Darwin is, I suppose, the hero of the film. But we tried to make the film in a very even-handed way. Darwin wasn't saying 'kill all religion', he never said such a thing, but he is a totem for people." Creation was developed by BBC Films and the UK Film Council, and stars Bettany's real-life wife Jennifer Connelly as Darwin's deeply religious wife, Emma. It is based on the book, Annie's Box, by Darwin's great-great-grandson, Randal Keynes, and portrays the naturalist as a family man tormented by the death in 1851 of Annie, his favourite child. She is played in the film by 10-year-old newcomer Martha West, the daughter of The Wire star Dominic West.

Early reviews have raved about the film. The Hollywood Reporter said: "It would be a great shame if those with religious convictions spurned the film out of hand as they will find it even-handed and wise." Mr Thomas, whose previous films include The Last Emperor and Merry Christmas Mr Lawrence, said he hoped the reviews would help to secure a distributor. In the UK, special screenings have been set up for Christian groups.

IS ANYONE ELSE ANGRY AND EMBARRASSED BY THIS?

It's about time: Apple brings bonus content to iTunes movies

If you were on the internet at all on Wednesday, you probably ran into load-time issues, since everyone was F5ing a bunch of pages to get the latest from that day's Apple event. There wasn't the anticipated news that Beatles albums would be available on iTunes, but there was plenty of hoopla around new features for various versions of Apple's iPod.

There was also the announcement that select movies within iTunes would be bundled with bonus content (Video Business 9/9/09), akin to what's available on DVD, under the iTunes Extras banner.

Bonus content will include what you’d expect from DVDs, meaning documentaries and trailers. The movies with Extra material will be priced the same as they otherwise would have, a sign that distribution costs are falling since there’s no need to bump up the price to send more information to the consumer.

This is the kind of thing Scott Kirsner was asking about last month and it’s going to be interesting to see what the consumer adoption of digital extras actually winds up being. Are they a holdover of the DVD age or is there a place for them in digital distribution models? I’m not sure there’s an answer about that yet but I’m sure one will emerge.

Personally, I'd be interested to see how these things evolve. I have a hunch that the same sort of pre-packaged featurettes won't be as popular when they're *meant* to be watched on a computer screen, so they will probably have to begin incorporating more interactive features, something that's been part of the promise of next-gen disc formats like Blu-ray. See recent stories about up-to-date IMDb information on the cast and crew being available (Video Business 9/10/09) to buyers of X-Men Origins: Wolverine as an example of what I'm talking about. When people watch something online, I think there's an implied promise that it will be interactive to some extent, and efforts like Apple's will eventually have to cater to that.

Thursday, September 10, 2009

John Brockman - 1991: The Third Culture


The third culture consists of those scientists and other thinkers in the empirical world who, through their work and expository writing, are taking the place of the traditional intellectual in rendering visible the deeper meanings of our lives, redefining who and what we are.

In the past few years, the playing field of American intellectual life has shifted, and the traditional intellectual has become increasingly marginalized. A 1950s education in Freud, Marx, and modernism is not a sufficient qualification for a thinking person in the 1990s. Indeed, the traditional American intellectuals are, in a sense, increasingly reactionary, and quite often proudly (and perversely) ignorant of many of the truly significant intellectual accomplishments of our time. Their culture, which dismisses science, is often nonempirical. It uses its own jargon and washes its own laundry. It is chiefly characterized by comment on comments, the swelling spiral of commentary eventually reaching the point where the real world gets lost.

In 1959 C.P. Snow published a book titled The Two Cultures. On the one hand, there were the literary intellectuals; on the other, the scientists. He noted with incredulity that during the 1930s the literary intellectuals, while no one was looking, took to referring to themselves as "the intellectuals," as though there were no others. This new definition by the "men of letters" excluded scientists such as the astronomer Edwin Hubble, the mathematician John von Neumann, the cyberneticist Norbert Wiener, and the physicists Albert Einstein, Niels Bohr, and Werner Heisenberg.

How did the literary intellectuals get away with it? First, people in the sciences did not make an effective case for the implications of their work. Second, while many eminent scientists, notably Arthur Eddington and James Jeans, also wrote books for a general audience, their works were ignored by the self-proclaimed intellectuals, and the value and importance of the ideas presented remained invisible as an intellectual activity, because science was not a subject for the reigning journals and magazines.

In a second edition of The Two Cultures, published in 1963, Snow added a new essay, "The Two Cultures: A Second Look," in which he optimistically suggested that a new culture, a "third culture," would emerge and close the communications gap between the literary intellectuals and the scientists. In Snow's third culture, the literary intellectuals would be on speaking terms with the scientists. Although I borrow Snow's phrase, it does not describe the third culture he predicted. Literary intellectuals are not communicating with scientists. Scientists are communicating directly with the general public. Traditional intellectual media played a vertical game: journalists wrote up and professors wrote down. Today, third-culture thinkers tend to avoid the middleman and endeavor to express their deepest thoughts in a manner accessible to the intelligent reading public.

The recent publishing successes of serious science books have surprised only the old-style intellectuals. Their view is that these books are anomalies--that they are bought but not read. I disagree. The emergence of this third-culture activity is evidence that many people have a great intellectual hunger for new and important ideas and are willing to make the effort to educate themselves.

The wide appeal of the third-culture thinkers is not due solely to their writing ability; what traditionally has been called "science" has today become "public culture." Stewart Brand writes that "Science is the only news. When you scan through a newspaper or magazine, all the human interest stuff is the same old he-said-she-said, the politics and economics the same sorry cyclic dramas, the fashions a pathetic illusion of newness, and even the technology is predictable if you know the science. Human nature doesn't change much; science does, and the change accrues, altering the world irreversibly." We now live in a world in which the rate of change is the biggest change. Science has thus become a big story.

Scientific topics receiving prominent play in newspapers and magazines over the past several years include molecular biology, artificial intelligence, artificial life, chaos theory, massive parallelism, neural nets, the inflationary universe, fractals, complex adaptive systems, superstrings, biodiversity, nanotechnology, the human genome, expert systems, punctuated equilibrium, cellular automata, fuzzy logic, space biospheres, the Gaia hypothesis, virtual reality, cyberspace, and teraflop machines. Among others. There is no canon or accredited list of acceptable ideas. The strength of the third culture is precisely that it can tolerate disagreements about which ideas are to be taken seriously. Unlike previous intellectual pursuits, the achievements of the third culture are not the marginal disputes of a quarrelsome mandarin class: they will affect the lives of everybody on the planet.

The role of the intellectual includes communicating. Intellectuals are not just people who know things but people who shape the thoughts of their generation. An intellectual is a synthesizer, a publicist, a communicator. In his 1987 book The Last Intellectuals, the cultural historian Russell Jacoby bemoaned the passing of a generation of public thinkers and their replacement by bloodless academicians. He was right, but also wrong. The third-culture thinkers are the new public intellectuals.

America now is the intellectual seedbed for Europe and Asia. This trend started with the prewar emigration of Albert Einstein and other European scientists and was further fueled by the post-Sputnik boom in scientific education in our universities. The emergence of the third culture introduces new modes of intellectual discourse and reaffirms the preeminence of America in the realm of important ideas. Throughout history, intellectual life has been marked by the fact that only a small number of people have done the serious thinking for everybody else. What we are witnessing is a passing of the torch from one group of thinkers, the traditional literary intellectuals, to a new group, the intellectuals of the emerging third culture.

Monday, September 7, 2009

Is Social Media flattening the Complexities that make us Human?

Chris Brogan's recent post "All the Hats and Faces" reminds me of Erving Goffman's influential essay on interpersonal communication, "On Face-Work," written more than five decades ago. Chris is definitely onto something, and it drives a point I want to bring up: is social media flattening the complexities that make us human?

Goffman's work discusses how, as individuals, we negotiate "face" in our daily social interactions. What is face? Face is basically our self-image within the bounds of what is appropriate in a given situation. Meaning, when we are at work we put on a "face" that fits the work environment, which is different from the one we may put on with friends and family.

Chris goes on to list what he does to maintain these various faces. Goffman calls this “face-work”, behavior that helps maintain the “face” you put on. In Chris’ case, when he says he is a blogger, his face-work involves writing about what intrigues him.

This was only possible when these different environments could be kept separate. When Goffman wrote his piece, there were no BlackBerries or iPhones, no YouTube, Twitter, Facebook, and on and on. That separation allowed us to be "messy and complex," and at times contradictory. But in today's connected world, barriers that once existed have melted away. What you say or write in a "personal" capacity on your "personal" blog or Twitter profile may have repercussions at work and in various other situations. An off-hand remark by an executive, completely unrelated to the business at hand, can affect how business is done offline. Is that a failure of the person, or a failure of the reacting organization to recognize or understand how face plays into it?

Social media is creating enormous stress in how we reconcile these different faces in a medium that does not afford the nuances that exist in real life. Are we raising a generation on Facebook and Twitter that does not understand such differences? Currently, as Chris suggests, our professional face is the dominant face online, which risks crowding out the whole host of other faces we put on in other social circumstances. But we are more than our professional selves, as Chris points out too. Is corporate culture ready to take this into account going forward, knowing that future generations will continue to spill everything onto the medium unfiltered? Or will social media give users greater control over how, when and by whom their communication can be viewed online?

All the Hats and Faces We Wear

From Chris Brogan: I’m asked often how I can keep the various roles in my world straight. The question baffles me, to be honest. But, for the sake of education, I’ll share what I believe are all my roles, and then I’ll tell you a bit about what this means to me, to the online space, and to my life in business so far.

First, Who I Am and Who I Am Not

-I am human. I most certainly conduct myself like one.
-I am a blogger. I write about what captures my attention.
-I am a business man. I run a company of marketers and business communicators.
-I am a marketer. I promote my clients, and help them gain more business.
-I am an author. I write books about things I feel are helpful.
-I am a father. I have two loving children I enjoy playing with.
-I am a husband. I have a supportive wife (who made the collage in this post).
-I am a speculator/future-thinker. I love thinking about what’s next.
-I am a community guy. I love people. I live in that world. I love community.
-I am a friend. I have lots of friends, and often wish I could give each of them more time.
-I am a fan. I love lots of things: hotels, media making, reading, scotch, liquor.
-I am a consumer. I buy and use products all the time. I have opinions.
-I am a publisher. I write a successful and well-ranked blog.
-I am NOT a journalist. Wish I had that training. I’m a reporter at times.
-I am NOT a PR or Marketing person by education. I’m a hack.
-I am NOT ever going to sell out my community for a dollar. (Though I would for $10 million.)

I could go on for a while. You get the picture. For some reason, people seem to think this is hard to keep straight. When I wrote the list, I didn’t have to think too hard about it. I just wrote down some things that I am and some things that I am not. I believe we know what we are and what we aren’t. I tend to like for my actions to prove what I am and what I am not.

Sometimes, that can get murky. I understand that. In my case, I disclose and report, and comment, and explore when those kinds of moments happen. I jump right in. If I feel I should, I apologize (though I receive many emails telling me that I apologize too much, so I can’t seem to win on that one).

The Future of Human-Shaped Business

Humans are messy and complex. They are non-linear. They don’t think in straight lines. They don’t act in clear-cut ways. They don’t fit into simple slots very easily. Thank the sweet *.deity for all that.

Business, however, tried really hard for many decades to push us into those slots. It still does for most of the world. Here's the thing: factories aren't here any more (for the most part). Industrial-age workers aren't necessary. The entire US school system is built to educate children to take jobs as servants to the factory culture, when what the world needs most right now are entrepreneurial-minded individuals, who understand the importance of execution, and who can form and dissolve businesses to match their goals and objectives instead of as a means of sustaining the corporate construct.

Social media and social networks are just one signal that things are shifting. If you’re a publisher online, you are often in a shop of one, or two, or less than a dozen. That makes you sales and editorial. (Oh, there went that journalist’s wince.) That’s just one example. Business never was neat. Now, it’s downright frayed and smudgy and organic.

Ahhh, organic. Human.

I am fully and utterly embracing my facets. I make no apologies. Instead, I aim to educate, to build bridges and interfaces, and I intend to inject this human-shaped-business DNA back into the corporate culture until we all come back into alignment between our passions and our vocations, and have companies to fit those shapes.

And My Point?

Call me by whatever label suits you best. I’m making my own game. I’m bringing it to my networks. I’m spreading it with idea handles.

Sound familiar to any of you?

Cherry Picking the Social Media Books

Malcolm Gladwell's book "The Tipping Point" [of Social Media] puts new language on familiar social media marketing ideas. Some highlights:

The Law of the Few - The Law of the Few is surprisingly simple. Through studies and testimonials, Gladwell found that most people can usually be connected through a handful of key individuals. Like a "6 Degrees of Kevin Bacon" game on steroids, Gladwell identified three types of people who tend to connect others: Connectors, Mavens, and Salesmen. Connectors are the people who can quickly disseminate a message to many. Mavens are those who study certain things very carefully and can always help you out. And Salesmen are persuasive, knowledgeable and want to help you. By knowing your trusted mavens, your ultra-connected connectors and your salesmen, your message can easily be shared. Will it go viral? Perhaps not, but at least you are giving it a chance.

The Stickiness Factor - When developing the show Sesame Street, its creators saw that when a child was confused or bored, the lesson didn't stick. In the same way, your application, message, iPhone app or whatever Web 2.0 tool you have needs to be sticky. If it isn't, it won't travel anywhere.

The Power of Context - In any situation where a message is being spread, context plays a huge role. In The Tipping Point, Gladwell examines the context of large-scale events, like crime on the New York City subway. You don't need to go to that scale, but if you want to spread a message, you do need to know the context. For example, is Twitter the best place for your brand, or are you just trying to jump on the hottest technology?