Monday, February 22, 2010

An Alternative View of Change - 30 Years Later

As a 55-year-old watching nearly everything change before my eyes - dramatic changes in public/private partnerships across the board, techniques of capital formation, restructuring of healthcare and banking, how people make and buy music, new global views and expectations, online connectivity, and on and on -- I've been casting around a bit to see whether anyone truly has a reliable forecasting model.
To that end, I recently plowed through Clayton Christensen's highly worthwhile tome Seeing What's Next, a fascinating read on how innovation models can predict industry change (if you haven't read it, proceed immediately to your nearest bookstore -- unless you're a MEK competitor, in which case forget all of this). I also took a fresh look at the Gartner Technology Hype Cycle, which mercilessly lampoons the claims of new "world-changing" tech like the iPad.
What struck me is how reactive these models are. They really don't predict or pontificate on potential change until after the change agent is visible and has entered the marketplace or social sphere. So the question becomes: how does one truly anticipate change -- the appearance of the actual change agent and the outcome (preferred or otherwise)?
Now to be fair (and before your eyes glaze over), finding a truly predictive model has been an obsession of humans for millennia - from Nostradamus to ancient biblical prophecies to the "infallible" prognostications of every economist and social scientist known to humanity -- all with varying degrees of failure, misinterpretation or outright fallacious conclusions.
Do I have the answer? No. Obviously.
What did strike me recently is this: in the Feb. 22 issue of Newsweek, editor Jon Meacham pens a remarkable column about change: "The System's Not to Blame. We Are." In it he speaks of gradualism, the capacity to "leap backwards," and the fact that it is "dangerously self-important" to believe that our current problems are unique. Read the whole thing by following the link.
Meanwhile, what's the point of this blog's title and how does this all fit together?
If we are in an era of "dangerously self-important thinking" (my take on Meacham's column), are we failing to see what is presently hiding in plain view? If we over-analyze variables upon variables, are we missing the possibility of forecasting human, business and institutional behavior?
Let me leave you with this to think about. Some 30 years ago, as a twenty-something, I remember being fascinated by a British television series titled Connections. James Burke, a science historian, produced 10 episodes, traveling all over the world for location shots to demonstrate -- with highly effective dry humor -- just how seemingly unrelated events lead up to major moments in human scientific and social history. In my current quest to better understand where we are as a community, a nation, and a cosmos, I found that Connections was available on Netflix. I thought, "I wonder how all that turned out," since three decades had lapsed since I last viewed it. So I ordered it online.
After it came in the mail, I slipped it into my DVD player (a technology that didn't exist in 1978) and proceeded to be stunned.
In the first episode, titled "The Trigger Effect," Burke begins his series trek atop Tower One of the World Trade Center in New York City, now of course destroyed in the infamous 9/11 terrorist attacks. After warning of humanity's looming capacity to fall into a "technology trap," he chronicles the dangerously high vulnerability of the fragile electrical grid (sound familiar these days?), using real events in New York City as the example. The episode ends in Kuwait, where Burke asks how the Kuwaiti people will respond to their new-found oil wealth and the possibility of integrating, after a fashion, into Western society.
Of course, a little over a decade later, Kuwait was invaded by Iraq, setting off a whole new global reaction. What would Burke -- or his audience, for that matter -- have said if someone had walked up to him in 1977 and told him, "The building where you open your series will be annihilated in a decade or two by global terrorist forces from halfway around the world; the electric grid problems will be far more serious in 30 years, with new solutions only now coming into view; and, by the way, the country where you filmed your last bit will be the site of a near-total global military reaction to the vulnerability of energy supplies"?
Order the episode for yourself and see if you don't experience an eerie feeling or two. Then ask yourself: what are we missing today that is right in front of us? Are we in fact in an era of "dangerously self-important thinking" when it comes to achieving real and strategic change?
If you have any thoughts on this (and don't want to log into Google to comment), e-mail me at msnyder@themekgroup.com and we'll continue the conversation about "an alternative view of change."

Fortuna favet fortibus.


Friday, February 12, 2010

Strategic Content + Earned Media = Success


The phrase "earned media" has taken on a largely different meaning in the second decade of the 21st century. "Earned media" (sometimes previously called "free media" or "unpaid media" in the advertising world) used to refer almost exclusively to media placements by PR professionals in newspapers, magazines and broadcast TV/radio.

Today, the phrase includes a dramatic shift toward content generated by consumers and users. With hundreds of thousands of opportunities available for customers and activists to blog, post and share, the Voice of the Consumer has become nearly all powerful.

How is this important? A company can make a claim in paid traditional or online advertising, but if the brand promise doesn't hold up, the same company can be subjected to a bloody digital onslaught fomented by disgruntled consumers or stakeholders. Want proof? Browse through brand names (particularly one pizza company) on YouTube and see what you come up with. As the last 20+ months have taught us, consumer and stakeholder trust is all-important for real and lasting success.

The positive side -- as Seth Godin advocates -- is that if you have a product or service that meets, or better, exceeds the perceived brand promise, then give your consumers and stakeholders a digital megaphone. They will reward you with highly credible, believable and widespread commentary that can increase sales, buy-in and lead generation.

As eMarketer recently pointed out in its brief "What You Need to Know about Earned Media," good brand content will be at the heart of any successful strategy. The above chart provides details.

So the question becomes: how do you generate strategic content?

B2B magazine recently pointed out that content/PR pros are hiring researchers who can write compelling and highly relevant material drawing on a variety of primary and secondary sources -- not flamboyant copywriters and creatives who produce "art."

That's not meant to be a slam against traditional ad copywriters or creative professionals, but the point is that truth, honesty and highly relevant content again rule. People want actionable information that is relevant to their lives.

David Ogilvy, the king of content, wherever he is, must be smiling.


Monday, February 01, 2010

Goodbye to the Web's Golden Age - Hello SplinterNet

With the myriad announcements of tablet PCs galore - the iPad, HP's entry, and the like at CES and beyond - a great deal of buzz has hit the blogosphere about the demise of the Golden Age of the "free" Internet. Most interesting, this trend represents vertical divergence at the very moment Web-based technologies are defying typical trends by converging (Web and TV, mobile telephony and PCs, etc.).
Forrester -- notably analyst Josh Bernoff -- has served up real proof of this new phenomenon, dubbed "The SplinterNet." He recently published a table that saliently makes his point.
This, of course, comes on the heels of the much-ballyhooed iPad, which now seems to be following the archetypal Apple product introduction:
1) Ship a beta version minus functionality to come (e.g., Flash capability, still camera, video camera, phone capability);
2) Follow the Microsoft model of letting consumers find and report the bugs;
3) Deploy overpriced initial models as a penetration-pricing penalty for early adopters;
4) Drop the price after a few months and roll out new versions with the previously withheld functionality.

But I digress. Bernoff counsels us thusly: "As we all gird for the launch of the Apple Tablet, take a moment to step back and realize what all these new devices are doing." A la Web 3.0 thinking, everything is supposed to be transparent and compatible. Yet as these new devices emerge, compatibility and open source are thrown under the bus. Anyone whose market researches Web sites on mobile devices knows exactly what this means: to one extent or another, companies practically have to maintain an entirely separate version of a Web site to be compatible with various phones.

Bernoff continues: "Meanwhile, more and more of the interesting stuff on the Web is hidden behind a login and password. Take Facebook for example. Not only do its applications not work anywhere else, Google can't see most of it."
For more than a decade, lots of people have asked, "How do we monetize content?" With the iPhone (and now the iPad/iPod), that discussion has expanded to include formerly free apps.
What astonishes me is the current near-complete lack of outrage that many news and content groups (read: The New York Times) are openly erecting paywalls and cutting off free content.
Platforms and hardware are quietly taking over (again) as the Golden Age of the Internet starts its long tail into legend.
And by the way, the FCC is worried that insufficient bandwidth exists to accommodate true tablet growth and usage.
Thoughts? E-mail me here.
