Friday, December 15, 2006

INVEST OR FAIL

There are all sorts of things we don't know about the business future of online media, but one thing is already quite predictable: the eventual profit-margins in 21st century media are likely to be far less generous than the fat and complacent margins to which we grew addicted in the 20th century.

I'm confident about this because we already know that the Web is inherently competitive, and in any type of competition, small margins of quality and success produce outcomes that are wildly disproportionate. This is why baseball franchises will spend millions of dollars on a free-agent pitcher who offers only the prospect of a slight improvement in the rotation's overall ERA. On such tiny margins are championships won and windfall profits earned. One might think that this is a rather obvious observation, but one would be wrong.

Here's why our industry doesn't get it: Modern media executives -- not to mention investors in traditional media companies -- earned their fortunes by wringing every possible dollar out of their properties. The rise of corporate ownership in the 1980s and 1990s certainly cut a lot of non-productive fat from newsroom budgets, but the last of the fat disappeared a decade ago. Every cut since has been to quality, and the cumulative effects are now apparent. Mention this in certain carpeted hallways and you'll get funny looks, because "quality" is a subjective concept to these people -- not nearly as tangible or meaningful as the highly objective concept of profit.

So while media executives will always talk about public service and editorial quality and civic duty, their aversion to investing in their products is a much more meaningful impulse. And why shouldn't it be? Newspapers have returned double-digit profits for decades with almost no visible correlation to editorial quality. To put it bluntly, the return-on-investment on "quality" has been so minimal for so long that it's now barely worth consideration. So long as newspapers remain local monopolies, this isn't likely to change.

Consequently, media companies of late have been far more interested in adding new publications and products (without adding staff) than they've been in improving the quality of their core enterprises. If newspapers make 10 to 15 percent profit no matter what you put on their front pages, why worry about the various erosions now besetting the industry? Squeeze your staff and production capabilities harder and get your growth out of new products.

This works up to a point -- specifically, to the point at which these local monopolies break apart into multiple competitions for suddenly elusive market share. I'm convinced that the truly significant shakeout in our industry over the next few years will be this shift to a genuinely competitive media economy.

The current economy rewards shallow-minded journalistic shoddiness. The coming economy will punish it mercilessly.

Consider: If city hall coverage comes down to a choice between two clickable links, one that belongs to your paper and another that belongs to a start-up competitor that treats city hall coverage seriously, how long will readers continue to click the link that finishes second in terms of quality? What will it take to get that click back once you've lost it?

In that kind of competition, suddenly quality isn't an afterthought -- it's the entire game.

When you see things this way, you wonder: Why aren't media companies investing more money to secure those clicks today and into the next decade? If media Goliaths enjoying 15 percent profits in 2006 routinely invested half that money in quality improvements, they'd be well positioned to head off the inevitable future challenge from an army of Davids. Wouldn't they?

But investing in quality content just isn't an option in the current financial climate, and even when corporations think ahead to emerging technologies and media, they almost invariably wind up fixating on the cost-cutting potential of new media tools.

Consider this Frank Ahrens piece from earlier this month in the WaPo ("A newspaper chain sees its future, and it's online and hyper-local") about Gannett's decision to push the Web-first concept out to its subsidiaries.
The chain's papers are redirecting their newsrooms to focus on the Web first, paper second. Papers are slashing national and foreign coverage and beefing up "hyper-local," street-by-street news. They are creating reader-searchable databases on traffic flows and school class sizes. Web sites are fed with reader-generated content, such as pictures of their kids with Santa. In short, Gannett -- at its 90 papers, including USA Today -- is trying everything it can think of to create Web sites that will attract more readers.
Some of those ideas are excellent, and the shift to a Web-first publishing model is inevitable. But the star of the Ahrens article isn't a good idea but a truly bad one:
Darkness falls on a chilly Winn-Dixie parking lot in a dodgy part of North Fort Myers just before Thanksgiving. Chuck Myron sits in his little gray Nissan and types on an IBM ThinkPad laptop plugged into the car's cigarette lighter. The glow of the screen illuminates his face.

Chuck Myron is one of more than a dozen "mobile journalists" -- mojos -- for the Fort Myers News-Press. He doesn't have an office or even a cubicle, so his car is his newsroom. The paper's parent company, Gannett, hopes the mojos' local focus will drive readers to its community-specific Web sites.

Myron, 27, is a reporter for the Fort Myers News-Press and one of its fleet of mobile journalists, or "mojos." The mojos have high-tech tools -- ThinkPads, digital audio recorders, digital still and video cameras -- but no desk, no chair, no nameplate, no land line, no office. They spend their time on the road looking for stories, filing several a day for the newspaper's Web site, and often for the print edition, too. Their guiding principle: A constantly updated stream of intensely local, fresh Web content -- regardless of its traditional news value -- is key to building online and newspaper readership.
Let me clarify: Training talented journalists to use laptops, recorders, cameras, camcorders and all manner of associated hardware and software isn't a bad idea. Pushing reporters out the door with these tools in order to produce more crap more cheaply is a bad idea.

Because that's what the corporate idea of a "mojo" is: a combination print writer, still photographer, radio host and TV production crew, all wrapped into one isolated, overwhelmed package, rolling endlessly from one meaningless chicken-dinner story to another. Not an evolution in journalistic capability, but a way to have one person do the work of three while pocketing the difference. Quality be damned.

The "mojo" concept isn't something that Gannett just invented, either. In March while attending the Western Knight Center multimedia reporting seminar at UC Berkeley, we were introduced to a television reporter whose station changed hands and was placed under new management that delivered the following ultimatum: Only those employees who could operate both in front of the camera and behind the viewfinder would continue to receive paychecks.

Gone were the days of broadcast teams of two (reporter/videographer) and sometimes three (reporter/videographer/producer) professionals. Now this reporter (who also had to take a pay cut) is her own producer, her own camera operator. When she needs a "two-shot" at an interview, she has to set up the tripod and film herself pretending to listen to the subject. She even transmits from remote locations and must solve any technical problems that arise. So she's her own engineer, too.

The great irony is that these low-cost tools now enable freelance journalists to compete in media arenas that were once the sole province of well-funded professionals. Instead of three local news channels, the Web offers the opportunity for dozens. One would think that media companies would understand that skimping on quality actually encourages competition from motivated amateurs, but again, one would be wrong.

In the week after the Ahrens piece ran, links to it were sent to me by no fewer than three other journalists. You could practically hear the gears in their heads cranking through the possibilities, wondering, "Is this good? Is this bad?" The answer is neither: tools are tools. How we choose to use them determines whether they are good or bad.

The good news for those of us in the business is this: media isn't going away, and people with talent, ethics and commitment will be more valuable -- not less -- after the transition. On the other hand, the survival of publicly traded media mega-chains is another matter entirely.

If you're wondering how your company will stack up, here's a good test: Is your management investing in quality content or looking for vapid new ways to cut corners? Are they innovating or copying? Are they acting like scrappy competitors or vaguely annoyed sleepwalkers?

Answers, please, on a postcard...

Friday, November 10, 2006

DEATH TO UNSIGNED EDITORIAL ENDORSEMENTS!

Attention, newspaper executives: In case you were wondering what to cut next, the dumbest things you put in print every damned day are those unsigned editorials written by ... well, who exactly? Even the people in the newsroom don't really know, and the people outside are pretty sure Satan is involved somehow.

You've been told this before, but you're creatures of habit, and apparently one of your habits is trying every truly stupid idea at least twice before you do the obvious thing that's right in front of your face. Newspapers are in decline, and your response has been to chase around after every snake-oil solution that promised to stop the bleeding and not cost you any more money. As Andy Cline put it:
Shortening stories didn't work. More graphics didn't work. Putting fluff above the flag didn't work. Targeting free publications to young people didn't work. Shrinking the news hole didn't work. Cutting editorial staff didn't work. Cutting foreign news didn't work. Running wire fluff didn't work. Ignoring the poor and working class in favor of the middle class didn't work. Partnering with the advertising department didn't work. Speciality publications aimed at the rich didn't work. Re-design after re-design after re-design didn't work.
But how many newspapers looked around at the totality of their product and asked "Why are we devoting all this space and payroll to the opinions of an elite cabal who hide behind anonymity while demanding that everyone else be identified down to a street address?" I don't have a number, but I can tell you this: They're holding their annual convention in a utility closet at the Holiday Inn Express.

Editorial writers tend to be an untouchable class in the culture of metro newspapers. One of journalism's dirty little secrets is that the editorial page is the playground of publishers and board presidents, many of whom have never even covered a city council meeting. The thinking goes like this: Don't mess with the editorial pages, or "those people" will start trying to make their rich and powerful friends happy by messing around with news coverage. Consequently, we respond to critiques of the editorial page by talking about "firewalls" between opinion and news reporting.

But enough with the carefully parsed media-speak bullshit. Traditional editorial pages are simply indefensible in the early 21st century, and the hypocrisy they represent is killing what standing we still have with the public. No, you won't listen to me. I'm just a blogger. But hey, you can listen to Jeff Jarvis. He's a blogger and a media consultant:
The irony is that the editorialists have long been guilty of the sins most often attributed to bloggers: They rarely report and mostly just leach off the work of other journalists. And they work anonymously. Worse, they attempt to speak as the voices of institutions, issuing opinions as if from the mountaintop. But today, we do not trust institutions. We are impatient with lectures. We demand to speak eye-to-eye as humans. We require conversation. The form of the editorial is as outmoded as its medium. News organizations should no longer define themselves by the ink on their paper. And publishers may no longer assume the prerogative of telling us what to think just because they buy that ink by the barrel. Now we all have our barrels of bits.
Still not ready to confront the editorial page and do what must be done? All right then, let's take this in easy steps. Even if you're not ready to kill your editorial page entirely, stop right now, raise your right hand and repeat after me: "I will never again publish a slate of anonymous editorial endorsements of candidates."

Instead, try using your power for something good: Book a public venue, schedule the politicians to come in for their usual "endorsement interviews" and then invite all sorts of groups -- civic groups, political groups, interest groups, commerce groups, student groups, volunteer groups, you name it -- to come in and join you. Ask your questions to the candidates in public, and let other groups ask questions, too.

Then publish a summary of the other groups' endorsements. Serve the people's power, not your own.

Yes, it makes us feel Very Important Indeed to see endorsement-seeking candidates and their entourages waiting uncomfortably in our lobbies, but it really shouldn't be about how things make us feel, should it? Because we have the attention of the public, we have the clout to bring candidates to the table. Why not turn around and use that clout on behalf of the citizens to whom it rightly belongs?

Just promise to do that, and we'll let you keep publishing your quaint little editorials. For a while.

Wednesday, October 18, 2006

Hey, let's hire this Winer kid...

Wanted: News organization with vision, guts, asbestos underwear and enormous brass balls, willing to hire free-thinking, outspoken tech-pioneer to serve as chief technology officer. Apply to Dave Winer, somewhere on the road to the future...
Today on Scripting News, Dave runs an excellent list of suggestions for news organizations that want to master the transition to the new news. But the last graph made me sit up in my chair:
Disclaimer: I am looking for a job as CTO or Chief Scientist at a professional publisher that wants to make a strong transition to the new environment. So here I practice what I preach, I'm floating ideas in advance of using them.
Dave has mentioned some interest in trying his hand at "professional" media before, but this really puts his goal right out there for the world. If I were the CEO of a media group, the availability of Dave Winer would be what football coaches like to call "a gut check." Do I really want to be the best, or am I more comfortable with the known, predictable and non-threatening?

To put this in context, this is the equivalent of having Vincent Van Gogh apply for a vacancy in your graphic arts department. Sure, you could say that Vincent lacks Photoshop experience and move on to the next applicant. That would be the "safe" move. But "safe" seldom equals "great."

From a typical corporate perspective, Dave would be a "difficult" employee. He wouldn't need your management team's validation to make him feel confident in his beliefs about technology and media. He wouldn't "stick to computers" when it comes to discussions of ethics and responsibility, so the newsroom had better be ready for some knockdowns. And fire him? The guy's a millionaire who does what interests him, with a blog audience that reaches the most influential readers in the tech world. So go ahead and fire him. Make his day.

But screw the typical corporate perspective. Even if you set aside the things on Dave's resume (including blogs, RSS, OPML, unconferences, podcasts, etc.) and just look at him as a modern technologist who "gets" media, the man is a compelling candidate with an obvious passion about improving the quality of news in society. He makes a point of speaking to groups of journalists not because it profits him, but because he's passionate about the subject and willing to engage with reporters and editors in an effort to spread the new gospel.

His reputation is brash, and there's no doubt that Dave sometimes chooses to confront differences publicly at moments when public confrontation is purely optional. He's a burly bear of a man who can be an intimidating presence, and to judge by comments around the Web, he is a polarizing figure, particularly in tech circles. Is he a bully? I don't believe so, and my personal experience is that he's always been more than fair -- helpful, even -- to me. But he doesn't make nice if he thinks you're wrong.

In other words, you don't hire Dave Winer if you doubt your courage, or if your commitment to being great and innovative is less than his. You hire Dave because you see in him an opportunity to take great risks and reap great rewards and make your mark on this Golden Age Moment in which we find ourselves. Because make no mistake: This IS the Golden Age of the Internet, the particular instant in which the tools have emerged and the path is clear and the answer has not yet been delivered, the moment when people with clear vision can invent and build and leave their mark. We are -- all of us -- making history.

Dave doesn't need to be someone's CTO. He doesn't need the money, the headaches, the meetings or the constraints. I suspect he wants this career change for one simple reason: he sees an opportunity to change the world in ways he thinks will make it better.

If there's a media company that wants the same thing, then let me introduce you to your next Chief Technology Officer...

Thursday, October 05, 2006

It's the tools, stupid

Some things are so fundamental that you just have to keep repeating them until people start internalizing the concept, and this is one of them: The future of news media and its effects on society isn't going to be shaped by updated versions of the obsolete institutions we already have, but by the invention of technological tools that radically change the rules.

Today's mediascape is superhuman in scale, allowing the manipulation of public opinion and consent via familiar means: FUD, "The Memory Hole," and basic obfuscation. Individuals -- even well-funded, organized groups of professionals -- struggle to cope with the torrent of information that now flows across the mass-media infrastructure.

To put it bluntly: In an age of unprecedented information flow, we're all still basically guessing. Better policies won't change that: only tools that scale to the size of the problem will.

In other words, don't tell me how the media needs to change: Tell me what tool you can build that will give people the power to bring order to data in credible, meaningful, real-time ways.

And when I say this, people think I'm nuts. Discovery Informatics? Sounds weird. Egg-heady. Besides, they're all too busy trying to figure out how to sell traditional print and broadcast ads on Web content.

So don't listen to me. Listen to Google Inc. boss Eric Schmidt, who spoke in England on Tuesday:

LONDON (Reuters) - Imagine being able to check instantly whether or not statements made by politicians were correct. That is the sort of service Google Inc. boss Eric Schmidt believes the Internet will offer within five years.

Politicians have yet to appreciate the impact of the online world, which will also affect the outcome of elections, Schmidt said in an interview with the Financial Times published on Wednesday.

He predicted that "truth predictor" software would, within five years, "hold politicians to account." People would be able to use programs to check seemingly factual statements against historical data to see if they were correct.

"One of my messages to them (politicians) is to think about having every one of your voters online all the time, then inputting 'is this true or false.' We (at Google) are not in charge of truth but we might be able to give a probability," he told the newspaper.

None of this is new, really. Google signaled this move back in May of 2005. But dammit, sooner or later, people are going to need to grasp that this is happening. Five years equals Right Now.
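
To make the "probability, not truth" idea a little more concrete, here's a toy sketch of the kind of check Schmidt is describing. Nothing here reflects how Google would actually build such a thing -- the data source, the matching and the scoring are all placeholders I've invented for illustration.

```python
# Toy sketch only: the data, the matching and the scoring are assumptions
# for illustration, not anything Google has described building.

HISTORICAL_RECORD = {
    # (topic, year) -> officially reported value (hypothetical numbers)
    ("national_unemployment_rate", 2005): 5.1,
    ("national_unemployment_rate", 2004): 5.5,
}

def truth_probability(topic, year, claimed_value):
    """Return a crude 0-to-1 score for how well a claim matches the record."""
    actual = HISTORICAL_RECORD.get((topic, year))
    if actual is None:
        return None  # no data, so the honest answer is "can't say"
    relative_error = abs(claimed_value - actual) / max(abs(actual), 1e-9)
    return max(0.0, 1.0 - relative_error)

# "Unemployment was about 5 percent in 2005" scores near 1.0 ...
print(truth_probability("national_unemployment_rate", 2005, 5.0))
# ... while "unemployment was 9 percent in 2005" scores much lower.
print(truth_probability("national_unemployment_rate", 2005, 9.0))
```

Scale that lookup table up to every public record and transcript available, refreshed in real time, and you start to see why this is a media story and not just a search story.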

Those of us in the news business need to think about these big science-fiction media/informatics tools, because what Schmidt is talking about is a neutron bomb for traditional media. Once it drops, the buildings where we now gather to produce "the daily miracle" will be empty, yet valuable, pieces of real estate. But they won't be media companies anymore. Not the way we think about media companies today, anyway.

These tools can be used for or against the public interest, so those of us in the public need to be paying attention as well. For all our interest in the blogging revolution, in citizen media, in networked journalism -- all these wonderful new self-organizing communities we're creating -- these are just shifts in the way we deal with the basic problem of trying to stay informed and connected. The real revolution comes when structured informatics principles are applied to real-time information in ways that individual users can direct and apply. When "system" trumps "news judgment" once and for all.

The obvious applications are big, corporate and funded, with enormous server farms dispersed around the planet sucking river water through cooling towers to keep the world's electronic brain humming. But if people are creative at the grassroots, they can build new structures around these new human-sized tools, turning them into agents that actually serve the needs of real people, not just governments and investors.

Ask yourself: How could we apply these ideas? What do you want to do with them?

Start thinking about scenarios. Imagine media structures that have never existed. Chew on some wild ideas. And then hang on: Because your wildest ideas really aren't all that wild, after all.

Technology, like the civilization that spawns it, is biological evolution by means other than genetics, and our tools -- while not biological -- are as human as the hands that made them. We are like a colony of bacteria on the verge of becoming self-aware. And that's not just a different business model, folks.

Wednesday, September 20, 2006

Our motto at work



Commerce hubs and the future of advertising

I've learned three things of consequence since I set aside reporting to work on new-media development last November: first, if your news site sucks, upgrade your content management system; second, stop thinking about documents and start thinking about databases; and third, online advertising is primarily a form of information, and its most profitable future relies on packaging and delivering that information in increasingly useful ways.

I think we'll see some big changes in those first two items over the next couple years. Even smaller papers are starting to get more serious about their websites, and Adrian Holovaty recently drew attention to the value of databases in journalism, sparking a conversation that is long overdue. So there's movement on the editorial side, thank goodness.
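
For anyone who hasn't followed the Holovaty conversation, here's a bare-bones sketch of the difference between "documents" and "databases." The beat, the field names and the numbers are all hypothetical; the point is only the shift from prose you can read to records you can query.

```python
# Document thinking: one undifferentiated blob of prose.
story = ("Firefighters responded to a two-alarm blaze at 412 King St. "
         "late Tuesday night. No injuries were reported...")

# Database thinking: the same facts as structured, queryable records.
# (The beat, field names and values here are hypothetical.)
fire_calls = [
    {"date": "2006-09-19", "address": "412 King St.", "alarms": 2, "injuries": 0},
    {"date": "2006-09-12", "address": "88 Meeting St.", "alarms": 1, "injuries": 1},
    {"date": "2006-08-30", "address": "412 King St.", "alarms": 1, "injuries": 0},
]

# Once the structure exists, readers (and reporters) can ask their own questions.
repeat_address_calls = [c for c in fire_calls if c["address"] == "412 King St."]
injuries_this_quarter = sum(c["injuries"] for c in fire_calls)
print(len(repeat_address_calls), injuries_this_quarter)   # -> 2 1
```

Everything a newsroom publishes about fires, crimes, restaurant inspections or school scores could live in both forms at once; the second form is the one readers can actually do something with.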

Yet the advertising side of the industry still seems mired in a fatal illusion: That the online realm is, and will continue to be, an analog to print publishing. I don't blame advertising professionals for failing to spot the coming shift -- they've got their hands full with the here and now, and there is apparently no shortage of paid consultants to feed them status-quo projections. But the culture is shifting around us, and anticipating where it's headed isn't just marketing. It's the only way media companies are going to survive.

Earlier this summer, new-media pioneer Dave Winer summarized the issue and prescribed a course of action:
There's a big trend here, imho it's the difference between the 20th and 21st centuries. In the past the flow of ideas for products was heavily centralized, and based on advertising to build demand. In the future, the flow of ideas for products will happen everywhere, all the time, and products with small markets will be worth making because we'll be able to find the users, or more accurately, they'll be able to find us. "Targeting" customers is the wrong metaphor for the future. Instead make it easy for the people who lust for what you have to find you. How? 1. Find out what they want, and 2. Make it for them and 3. Go back to where you found out about it, and tell them it's available.
But that's a model for people who have something to sell, not a model for those of us who are in the business of helping people sell things. Since the thing that I do (journalism) is (in this country, at this moment) funded by that "help people sell things" role, imagining its future seemed like a natural fit for a combination journalist/science fiction writer. By mid-2005 I was beginning to sense a series of developing patterns, concepts that seemed to embody the spirit of the new medium, ideas that I thought might have a future.

I began writing drafts of these concepts last fall with the intent of turning them into a series of posts on this blog, but in December, after accepting a temporary assignment working for our executive editor, I decided to rewrite those posts into one document for my bosses. I've been told that the ideas "sparked some discussions" at higher levels, and here's hoping that those discussions someday lead to something profitable.

Last fall I thought my concepts were extremely obvious, perhaps (as Ed Cone likes to say) "just 30 seconds ahead of the curve." But ten months later, they still read pretty fresh -- particularly on the subject of what the new online media business model is going to look like. So, in the interest of "sparking some discussions" on a wider basis, here's my December 2005 take on the new business model for media, lightly edited back toward the original blog-oriented version:
Readers, buyers and users
The print-newspaper business model is contradictory and more than a little confusing: We attract readers by covering news, which they pay to read. But the heart of our business lies with sellers, for whom we deliver the attention of buyers via advertising. This is a conflict. For our brand to be valuable, readers must see it as independent of advertiser interests. To keep that reader trust, traditional newspapers have long erected “firewalls” between their news and advertising departments.

But we have a secret: Those firewalls come with built-in doors and windows. For all our talk about independence, no newspaper wants its editorial department to go around casually angering advertisers. Sure, we do it, but we don’t do it lightly. Not for long. And when we do, even for the best of reasons, we hear about it from those advertisers.

There’s another flaw in this model: Because the firewall isn’t really what we say it is, we’ve developed this odd tradition about covering commerce. Instead of acknowledging that buying and selling and consuming are among the most important activities in our readers’ lives, we pretend that these topics are really of interest only to businessmen. Why? Because writing about products and businesses from the reader’s perspective is a great way to irritate advertisers. This absurd-but-inevitable position creates the market for Consumer Reports, a relatively expensive magazine that actually covers these subjects with authority. Consumer Reports, of course, accepts no advertising, which means it costs the reader more.

So why do they buy it? Because Consumer Reports saves people money. It is, in the purest sense, what all news media aspire to be: Something so valuable that a subscription is considered an investment. Consumer Reports has no firewalls, because everybody at Consumer Reports is working for the same person: The reader. The buyer.

Which brings us to Amazon.com.

I used to go to Amazon to buy books. Now I go there to buy books, movies and music, even gardening tools and college-logo stadium jackets for my parents. It’s where I’ll do the bulk of my Christmas shopping, because most of my family lives someplace else.

I’m a loyal Amazon shopper, because Amazon works for me. In fact, I’m its business plan.

It learns what I like. It remembers me. It suggests things, based on what I’ve looked at previously.

Amazon respects my time. It collects, organizes and displays information for me in ways that help me make decisions, and when I’m ready to buy, I don’t have to jump through hoops. I can do research, comparison shop, try a free sample of the book I’m thinking about buying. Even its suggestions are useful, tailored to my interests. It never wastes my time with an annoying pop-up ad for something I would never consider buying.

But there’s one other thing Amazon lets me do: It lets me talk about what I’ve bought and read what other buyers have written. If you think about that from a newspaper perspective, that’s a pretty radical idea. We won’t even let reporters write about the relative merits of one furniture brand over another, or the friendliness and honesty of the staff at a particular used-car lot. Would we just open up a forum for readers to pop in and say critical things about our advertisers?

Of course not. That would be stupid.

But then there’s Amazon, making an awful lot of money by being stupid that way.

And when I really sit down and think about it, I could use a lot more Amazon in my life. I need it saving me time, helping me to make smart choices, connecting my needs to products and services in intelligent, adaptive ways.

I need this concept at work in my community. Not just an Amazon for the same-here-as-there world of books, but a similar concept that connects me to the businesses in easy walking distance or a short drive.

When I’m going out to dinner, I’d like to know what’s on each restaurant’s specials board. I’d like to see their price for fresh fish, read our restaurant critic's most recent review and check to see what other diners had to say about their experience. Sure would be nice if I could make a reservation online, too.

When I want to go to the movies? Same thing. I don’t just want to buy a ticket online – I want to interact with other movie fans. I want to read what others think, but maybe my tastes are special – maybe I only want to read reviews from people with similar viewpoints. I want a site that helps me filter out the clutter and find the things that are relevant to me.

I need other things, too – like, say, a mechanic I can trust. Actually, I need two – one for a domestic sedan, the other for an import van. I need a plumber and a guy who can repair the plaster in my ceiling. I need a lawyer and a built-in dishwasher. Ads give me a clue about who to call, but it’s not generally high-quality information. Sure would be nice if I could do my research, price my job, schedule my service, check references from previous users and conduct my business all in one place.

I think the future of our online business lies within that example. Why not use our unique position in the local media market to build online marketplaces that integrate all the research, interaction, information and commerce functions of smart, 21st century commerce in one, easy-to-use site? Not an advertising section with stories in it. Not a story section with ads on it. An integrated commerce hub where readers will research, locate, compare, price, discuss and buy – all in one place.

It’s a product that generates revenue for us before we sell the first ad, but I have no doubt that ad sales will follow. Why? Because by bringing interested, motivated buyers to a single virtual marketplace, we will have created a location where sellers want to be.

It’s hard to even imagine such a thing today, but it’s worth doing. Imagine, for instance, a site that combines all our golf coverage with all the stories we’ve ever written about local courses, all the relevant external links (no pun intended, but hey…), columns and blogs by golf writers (both professional and amateur) … and then, at the end, a service that lets the user find available tee times, compare costs and features, and then book a round for his foursome, all without having to leave our site. Everyone benefits, and we get our cut as the middleman.

Now imagine you’re an ad salesman. Why would a local golf course want to advertise online anywhere else?

We need to get in the habit of imagining such relationships.

Could you apply such thinking to automobile maintenance? Getting your house painted? Buying stock online? Could you build a hub for tourists, integrating all the things tourists need to plan and enjoy a trip to Charleston?

So where’s our money in this model?

Everywhere – depending on the type of transaction we’re assisting. Sometimes we’ll sell upgrades from our standard free listings – a detailed restaurant menu, maybe photos or maps. Sometimes we’ll sell services: Perhaps a blog-like “blackboard” where business owners can talk about their specials or chat about whatever’s on their minds. We might charge some vendors a flat fee for providing e-commerce services that let shoppers buy their merchandise through our site. Others might list their products and services for free, but pay us a commission when we connect buyer and seller. A car dealer might pay us a finder’s fee. We could even put a PayPal button on some of our sites and ask for tips!

Other types of companies could do this – and, if news organizations don’t do it, these competitors likely will. But nobody else starts with the local advantages we enjoy. They would have to create the content that would give such hubs their depth and value. We already own it. It’s called our electronic archive. They would have to hire writers, set up business, I.T. and advertising staffs. They would have to spend money on marketing just to introduce their brand. But everybody knows about the local paper, and every day thousands of people wander through our website. How might that number change if we made our online brands and their affiliated products significantly more useful?

Coordinating our efforts
The key concept here – the thing that makes the whole idea work – is that because a commerce hub doesn’t require advertising to generate revenue, the traditional firewall between advertising and news becomes purely elective.

Sure, a restaurant might protest a negative review by threatening to pull its ad from our online dining hub – but if that hub is the single spot where the majority of local diners go before going out, that ad won’t stay gone for long. What’s more, the interactive features of a truly integrated hub change the dynamics of a bad review. In print, publication is the end of the story (except for the angry phone calls and occasional letter to the editor). Online, publication is just the beginning of the conversation. Some readers will come to the restaurant’s defense. Others may take the reviewer’s side. Restaurants that participate in the debate – or at least follow it – will learn valuable information about how to improve their business, and would-be diners will get a much more detailed picture of whether or not a restaurant suits their tastes.

As an advertising market, a hub also offers the advantage of being far less susceptible to competition from our traditional media rivals. Smaller publications have been able to take some of our ads by promising downtown bars and restaurants better bang for their buck. While these generic-content competitors won’t go away, the self-selecting nature of a commerce hub audience makes the hub a naturally focused and efficient advertising market, something far more similar to the Yellow Pages than a typical newspaper section or website home page.

Some businesses will be happy to be part of our free listings but may pass on the functional upgrades we’ll offer them for a price. Others may choose to buy traditional online ads and ignore the new products we have to offer. Whatever. Hubs offer businesses a greater ability to customize their presence, and that’s a very Web 2.0 concept. This new medium rewards customization and punishes one-size-fits-all inflexibility.

To offer that flexibility, our approach to online business must be equally fluid. Gone are the days of the all-encompassing revenue source, and that means we must learn where we can add value in each transaction. That observation will determine where, and how, we get our cut.

This means changing the way we think about our web operations. It is no longer enough to bundle our multiple products under a single brand, with built-in firewalls between departments. We must begin thinking about every meaningful subcategory of reader/user/buyer interest as the focal point of a new product. We must organize our new-media efforts via interdisciplinary teams rather than by traditional departments.

This last step is only possible when we align our focus to serve one goal: The needs of each individual reader.

Almost a year later, those ideas remain left-field visions, easily derided and ignored. We'll see how it all looks five years from now, and compare notes. See ya in the future, campers!

Friday, August 25, 2006

Hope for the indies

Progress toward the new information economy continues to surge and sputter in uneven ways, sometimes confusing us but other times pointing ahead to tantalizing possibilities. I found an example in September's issue of WIRED, in an article about Netflix's new role in creating distribution deals for independent filmmakers.

The article (not available online until Sept. 1) showcased the power of Netflix to connect unknown movies to actual movie theaters and explored the possibility that the company will expand its reach into actually producing independent films. But that's not what caught my eye -- rather, it was the story of how the company did a profit-sharing deal with the makers of a romantic comedy called Nice Guys Sleep Alone. Enter Netflix Chief Content Officer Ted Sarandos, who told the film's director, "Send me 500 DVDs. Every time it rents, we'll pay you something."
"An awful lot of people started renting this no-name title with zero marketing budget," (director Stu) Pollard says. "As a result, it was picked up by HBO."
The article doesn't really linger on this point, but to me it said something of fundamental importance about the future of all sorts of independent media. Because what Netflix has done for independent film is what the Web needs to do for independent comment: apply informatics techniques to connect content to its natural audience.

We've already taken the first step, providing all sorts of ways for creative people to express themselves in public. We've got blogs and free websites that let us speak our minds, indulge our passions and share knowledge. We've got Flickr and Zooomr and others for sharing photos, plus YouTube and sites like Lawrence.com to help us "publish" our videos and films.

But if a blogger rants and there's no one there to read it, did she make any sound? For all of Chris Anderson's talk about the future of niche media and The Long Tail, the volume of new content is so large that, even with the best modern search engines, I have no reliable way of finding the portion of that content that most interests me. Rather than finding new writers, I wind up wearing RSS grooves to a stock list of A-List bloggers. Because we lack the proper 21st century tools, 20th century mass-media business models continue to thrive on the Web and The Long Tail remains more of an abstract concept than a solid business plan.

As I wrote last month on Xark, the expansion of the read-write Web has created an unmet demand for real-time information tools that scale to the size of the modern mediascape. In other words, what I really need is a tool that spots content that will interest me, personally, as it is published, no matter who wrote it. Right now, when it matters.
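
I don't know what that tool ultimately looks like, but here's a toy sketch of its shape, just to make the idea concrete: a standing profile of my interests scores every new post as it arrives, regardless of who wrote it. The profile, the posts and the bare keyword-overlap scoring are all stand-ins I made up; a real system would have to be enormously smarter.

```python
# Toy sketch of a personal, real-time content filter. The profile, the posts
# and the keyword-overlap scoring are all placeholders for illustration.
READER_PROFILE = {"journalism", "informatics", "charleston", "rss", "media"}

def interest_score(post_text):
    """Fraction of the reader's interest terms that appear in a new post."""
    words = set(post_text.lower().split())
    return len(READER_PROFILE & words) / len(READER_PROFILE)

def filter_new_posts(posts, threshold=0.4):
    """Keep posts above the interest threshold, highest-scoring first."""
    scored = [(interest_score(text), url) for url, text in posts]
    return [(url, score) for score, url in sorted(scored, reverse=True)
            if score >= threshold]

incoming = [
    ("http://some-unknown-blog.example/post-1",
     "notes on journalism informatics and rss experiments in charleston"),
    ("http://another-blog.example/post-2",
     "pictures of my cat doing nothing in particular"),
]
print(filter_new_posts(incoming))   # only the first post clears the bar
```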

Which brings us back to Netflix.

If you're a Netflix user, and you have been for any length of time, the service you get now is likely far better than the service you got when you first signed up. At sign-up, you picked movies for your queue the 20th century way: You browsed through lists of movies by genre, release date and critical ratings. There were lots of movies to choose from, but few of your choices were all that surprising.

But Netflix is successful because it asks you to rate the movies you watch, and it learns from your preferences. Other sites -- most famously, Amazon -- do something similar, but nobody does this as well as Netflix. This is in part a function of volume: the company has collected more than a billion movie ratings, with a per-person average of 200.
With rich data like that, the company can develop sophisticated profiles to anticipate preferences and tastes. "It can tell you that you like The Godfather because you love family immigrant pics, and I liked it because I enjoy gangster flicks," Sarandos says. "So the next film suggested to you will be Avalon, and the next one for me will be Scarface."
In other words, had Blockbuster agreed to let Pollard put a copy of Nice Guys Sleep Alone on its vast video store shelves, the odds are slim that the film would have ever found its audience. Nobody had ever heard of the movie or Pollard, and unless its box randomly caught someone's eye, it would be doomed to gathering dust. But by being included in Netflix's recommendation system, anyone with tastes that matched the movie's profile found out about it.
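
Netflix doesn't publish its algorithm, so treat the following as nothing more than a minimal sketch of the general technique the article is gesturing at: find the renters whose past ratings agree with yours, then surface the titles they loved that you've never heard of. The ratings below are invented.

```python
from math import sqrt

# Invented ratings on a 1-to-5 scale. Netflix's real system and data are
# proprietary; this only illustrates the "people who rate like you" idea.
ratings = {
    "you":  {"The Godfather": 5, "Avalon": 4, "In America": 5},
    "me":   {"The Godfather": 5, "Avalon": 1, "Scarface": 5},
    "them": {"Nice Guys Sleep Alone": 5, "In America": 4, "Avalon": 5},
}

def similarity(a, b):
    """Cosine similarity over the movies two people have both rated."""
    common = set(ratings[a]) & set(ratings[b])
    if not common:
        return 0.0
    dot = sum(ratings[a][m] * ratings[b][m] for m in common)
    norm_a = sqrt(sum(ratings[a][m] ** 2 for m in common))
    norm_b = sqrt(sum(ratings[b][m] ** 2 for m in common))
    return dot / (norm_a * norm_b)

def recommend(user):
    """Suggest titles the user hasn't rated, drawn from the closest neighbor."""
    others = [u for u in ratings if u != user]
    nearest = max(others, key=lambda u: similarity(user, u))
    return [m for m in ratings[nearest] if m not in ratings[user]]

print(recommend("you"))   # -> ['Nice Guys Sleep Alone']
```

The no-name title surfaces not because anyone marketed it, but because the ratings data connected it to the people most likely to want it.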

Not blanket, expensive, annoyance marketing in which money drives audience, but marketing as a service to the user.

Everyone wins.

This is the spirit of the web as it could be, but there are enormous hurdles involving privacy, technology, intellectual property, investment and pure, unreasoning human stubbornness between here and there. Yet there's almost no doubt in my mind that this is one of the fundamentals of the new information economy, a concept that cannot be avoided.

Not surprisingly, Dave Winer gets it, although he's writing about it in the context of advertising:
There's a big trend here, imho it's the difference between the 20th and 21st centuries. In the past the flow of ideas for products was heavily centralized, and based on advertising to build demand. In the future, the flow of ideas for products will happen everywhere, all the time, and products with small markets will be worth making because we'll be able to find the users, or more accurately, they'll be able to find us. "Targeting" customers is the wrong metaphor for the future. Instead make it easy for the people who lust for what you have to find you. How? 1. Find out what they want, and 2. Make it for them and 3. Go back to where you found out about it, and tell them it's available.
Guys like Winer made all this possible, but it's now time for the Informatics guys to step in and start building the tools that will make sense of the info-torrent Winer and others created. We can help them by thinking about the legal, ethical and economic issues that will surround these tools. We can pave the way.

There's a popular New Age axiom that says "We are our choices." In the future, on the Web, this will be literally true. And that's a good thing.

Wednesday, May 31, 2006

Corrections: The error of our ways, & vice-versa

A letter to Romenesko from David Cay Johnston, written in response to Ted Vaden's ombudsman column in the N&O, bears amplification, because Johnston flat-out gets it (although I do disagree with one of his formulaic solutions). He writes (bolded emphasis is mine):
The analysis of errors and corrections in Ted Vaden's Sunday column in the Raleigh News & Observer is troubling on many levels -- and raises issues that ought to prompt deeper thinking by his fellow ombudsmen, as well as the rest of us.
The number of corrections run annually is a lousy measure of actual errors, arguably worse than crime statistics are of actual crime. Indeed, the correction process itself reeks of bias that favors softball journalism.

Mr. Vaden's numbers measure only errors that drew complaints sufficient to prompt a correction. How many errors did not draw complaints? (This is why careful reporters qualify their articles on crime statistics by referring to "reported crimes" and why ombudsmen ought to take note of requests for corrections.)

The corrections data counts all errors as equal, just as the FBI crime index gives the same weight to a murder and a tricycle theft -- one. But of course all errors are not equal. Analysis is required to understand the nature of the problem. What share of corrections were spelling errors? Math errors? Editing errors? Typos? Erroneous official reports that were accurately cited? How many were about messed up facts, or stereotypes, that created a false impression? What is the ratio of complaints made to corrections run?

Perhaps most important, what portion of complaints are determined to be malicious?

We rarely tell readers, listeners and viewers about complaints from those who twist, distort and lie to shut down hard-hitting reporting.

At many news organizations, just complaining can produce benefits, especially if it results in clear facts being muddied with extraneous details. Ombudsmen could do a lot of good by describing such dishonest complaints so readers get a fully balanced view of journalism.

And what of substance? Murders are much more likely than petty thefts to turn up in crime statistics. But in journalism it is the easy to verify errors, such as misspelled names, that tend to result in corrections, while unchallenged journalistic felonies lie in the published record.

The ease or difficulty of making a complaint is another factor. Just as the police can make it appear crime is down by creating obstacles (e.g., requiring one to come to a police station rather than sending a car to the scene), editors can restrict the volume of complaints by how they require them to be handled.

One way to narrow the gap between actual and corrected errors would be to require that all complaints, regardless of merit, be referred in writing to a designated editor high in the organization. A rigid enforcement mechanism -- firing on the second omission, perhaps -- would encourage compliance.

The correction process is also biased against tough reporting.

Hardly anyone complains about errors that make them look good. All sorts of errors can be found in stories with heroic themes (rescues, crimes solved, etc.) and in stories about politicians, actors and athletes without any complaints.

Lack of corrections should never be taken as an indication that a journalist does quality work. One can write pap and never get a complaint even though the work is riddled with errors of fact, omission and distortion.

Do errors that distort reality by polishing an image differ from those that tarnish? A case can be made that fawning errors do more harm, especially when they advance the careers of politicians, cops, prosecutors, judges, surgeons, scientists and executives who use their power for venal purposes or prove incompetent.

The volume of corrections may speak more of readers than to the publication’s relative accuracy. I read far more corrections in The New York Times than in the New York Post, but then do Post readers have the same expectations of fidelity to fact as Times readers?

Perhaps we should think about corrections as a measure of integrity -- and running many may signify commitment to fact, openness to complaints and high reader expectations.

A quarter century ago I suggested to David Shaw that he undertake a project to verify every fact in one day's Los Angeles Times. The conversation was prompted by my volunteering a correction (which, as I recall, did not run) that we had the age of a woman in a brief item wrong because the official police report was in error, which I learned while doing a follow-up.

David said he could imagine a year traveling the globe and he was certain that he could find some error in almost every article in that day’s paper. We talked about people who get facts about themselves wrong and reporters suspected of piping quotes and of important stories ignored because they were beyond the skill, or interest, of the beat reporter. David observed that most errors would turn out to be second hand, as with the age of the woman, and many others trivial, so that at the end of the day it would be a wasted exercise. I agreed.

One last thought in the hopes it will prompt some deeper thinking about the flaws and biases in correction policies:

There are reporters who spot mistakes in their own work that no one complained about, and submit corrections, a point no reader would imagine based on Mr. Vaden's unqualified assertions at the top of his column. What does it say about our craft that this is just the kind of stereotypical false impression that is likely to stand uncorrected?
In parting, let me add: The ethical failings of newspapers often have nothing to do with the accuracy of the limited set of facts on which they report, but rather with the decisions that determine which facts will be revealed to the public. Which is worse? Calling a 36-year-old woman a 37-year-old in the story of her murder, or leaving out information that would indicate the killing was the result of a drug dispute?

If the murder occurs in the ghetto, the drug reference is likely used without much thought. But what if the murder occurs in suburbia, and the victim is a member of a well-connected family? Publish the police's suspicions and you hurt the family's reputation and provoke a bunch of angry phone calls. Omit those suspicions and you give the community the impression that there is a random killer at large, targeting women for unknown reasons. Does the suppression of credible information count as an error when it gives the public a significant misconception?

Counting corrections doesn't tell you as much about quality as the press fundamentalists would have you believe. Kudos to Johnston.

Friday, May 05, 2006

Media message? Kill the court jester

So the family downloaded the video of Stephen Colbert at the White House Correspondents Association dinner and gathered around the old monitor to watch.

And being a journalism-based household, we all had the same reaction:

What in the HELL were these "journalists" doing at a gala event with the people they're supposed to be covering?

Most of the reaction to Colbert's performance has focused on how inappropriate it was for him to so badly embarrass the POTUS in public. And, not surprisingly, the Usual Suspects on the Right have crawled out of their spider holes to slime Colbert. What's surprising is how willing the "mainstream liberal media" has been to join the attack.

Actually, let me restate that: It would be surprising if you still bought the old canard about how the press -- especially the Washington press -- is just a partisan unit lying in wait for a chance to snipe at anything Republican.

Colbert is a comedian (from Charleston) who is funny because he finds ways to talk about the truths those of us in the media can't find the words to discuss. And one of those truths is that today's corporate media and "power-player press elites" are often as compromised as the political institutions they cover. Because if you think Colbert's humor targets only conservatives, you don't get the joke. Colbert's character on his Comedy Central show is a clown's mask on a media monster, Bill O'Reilly slipping on a banana peel, FOX News without the pretense. His subject isn't just politics, but media, and he's got our number.

So when Colbert makes fun of the White House's inability to give straight answers to hard questions, he's doing classic political humor. But his method of delivery -- his "newsman" character could be the evil love spawn of Sean Hannity and Ann Coulter -- expertly mocks my profession.

When Colbert tells the assembled White House press corps to stop questioning what the official spokesmen say, he advises his audience to go home and write that book they've always dreamed of, the one about the brave Washington reporter who isn't afraid to tell the truth, no matter the cost.

"You know," he says, "fiction!"

So that was a hard shot, and I'm glad it made some people squirm.

But let's talk about the other, unintended effect of Colbert's performance. Because of the viral publicity it has engendered -- concentric waves of awareness spreading out over a matter of days, unconstrained by mass-media news cycles -- people across the country are witnessing something that probably appears unfamiliar to most: The image of the political establishment and the media establishment in their true, cheek-to-jowl guise.

Listen: When the fox and the watchdog get dressed up and sit down for a collegial banquet of rubber chicken, that doesn't bode well for the henhouse. And because of Colbert's ballsy, truth-telling stunt, that's what Americans saw.

Here's a suggestion: Rather than carving up Colbert, why don't America's top Washington reporters swear off events like the WH Correspondents Dinner? If you don't want to be seen as a lapdog, try getting off of the goddamn lap first.

Wednesday, April 26, 2006

A blog blog

To get some idea of what we've been up to recently, take a peek at a new blog-about-local-blogs that Janet and I started a week ago. We're doing it for the newspaper as part of the new Postscripts project (Janet came up with the brand, logo and slogan), and here's the skinny: We "launched" it on the night of April 18th by quietly sending out the link to two local bloggers.

The next morning we had more than 50 hits.

Since then we've been covering local blogs in a manner that would be familiar to anyone who reads The Hotline's Blogometer. By the end of its first week, our stealth blog had already generated more than 1,000 hits and 20 comments. Local bloggers have used it to announce plans for a blogger party next month. People are excited, and The Big Blogroll keeps growing. We're indexing more than 50 local, active blogs, and that's already more than half the number listed at Greensboro 101, a pioneering local blog site created by Roch Smith.

Could it be that there's a more active local blogosphere in the Lowcountry than any of us imagined? What happens to a community of writers and readers when that community becomes aware of itself? Can a local metro daily connect to its local blogosphere in a constructive way?

I dunno, but we're about to find out. Today the webmaster for our Charleston.net web site is working out the domain redirects that will give our local blog-blog a shorter, easier-to-remember URL, and once that's set, we'll be linkable off our own site for the first time.

It's early, but the early signs are encouraging. Particularly this one: In the post in which local blogger Walter (of Baxter Sez) announced his discovery of Lowcountry Blogs, he wrote:
since i now see that we may have just a leetle bit of local attention i want to take advantage and offer up my first bit of critique for something charleston. here it is:

Crosstown. the road that truncates the peninsula. man, that is one ugly sumbitch.
... and went on from there to talk about the congested local freeway that carves up neighborhoods on the Charleston peninsula.

Which leads to my point: if connecting local bloggers to each other encourages them to write more about the place where they live, why would a local news organization not want to help with that?

Beats me.

Thursday, March 23, 2006

Competition and its alternatives

(Cross-posted from Xark!)

I offer this as proof that even really cool ideas can take a while to find their audiences, particularly when that audience is saturated by media in the first place. Anyway, thanks to the few Charleston bloggers I have found and added to my reading list, I've finally encountered something I should have been tracking for weeks: a Charleston City Paper guy named Jay Stecher. Apparently, Stecher has been doing a thing since Feb. 1 called "The Weekly Geekly" -- great name, eh? -- in which he rounds up local blogs and podcasts, etc.

The Weekly Geekly isn't a blog -- though it would make a natural one, hint-hint -- but it's already doing one of the things I hope to convince my bosses to do: draw attention to the work of people who are making their own media in the city we share. This is hardly a cutting-edge idea, but it's alien to the competitive-media-mind until you wrap your brain around some of the related concepts. In case you haven't noticed, newspapers traditionally hate citing the work of their competitors.

And this is one of those related concepts: in blogging, you don't deal with competition by trying to freeze out your competitors.

One of my favorite stories from ConvergeSouth was someone (I forget who) talking about how The (Greensboro) News and Record had improved its relationship with the local blogosphere by changing the way it dealt with staff-written stories that were developed from independent blog posts. The first N&R stories that were "inspired" by information originally reported on blogs routinely made zero mention of that fact. Bloggers cried foul.

What those bloggers probably didn't know was this: reinventing someone else's wheel so that you can say "Look! I made a wheel!" is a longstanding and truly stupid tradition in traditional media. Every paper I've ever worked for has done it, and as city editor, I used to tell people to do it.

Anyway, here's the thinking (or rationalization, take your pick): Readers want local news. Readers will define "local" as "staff written." So if the Associated Press or one of our wire services moves a story that is local to our market or touches on a local issue, we will "localize" the story. For you. The reader.

At a minimum, localization means that I'll take a story that deals with a national or regional trend or issue and make sure that there is information in that story that places the local situation in that larger context. The AP moves a story on freedom of information compliance, so I add some paragraphs that tell local readers what the rules are here. I don't have any problem with that.

Where I go off the reservation is when it comes to the second level of localization, which I like to call "theft." A wire service writes a story about something in our state, and we don't have it. So we assign a reporter to call all the same sources from the original story, then write the same thing using different words. When that story appears in print, there's no mention of the original article.

And it's even worse if the source for the story is a direct competitor instead of a mutual wire service. Yikes! We've been scooped! We need to do this story right now!

We rationalize this by saying "Well, we need to verify the story independently," but this just isn't a meaningful response. Yes, independent verification is a good thing. But we run wire stories that we don't verify every day. We couldn't possibly meet such a standard.

The truth is, news organizations recreate existing news stories when they're caught flatfooted by other news organizations and don't want to admit it. I think it's dishonest, wasteful of precious reporting resources and disrespectful of our readers -- not to mention the originator of the story.

So when you hear that the Greensboro reporters weren't citing the work of local bloggers, you can understand why. The N&R reporters and editors were doing what we've always done as an industry. They were applying "traditional newspaper values" to their relationship with the blogosphere. And those values just didn't work in the new medium.

The N&R made peace with its local bloggers by changing the way it looked at them. Competitors? Maybe. But nature suggests that there are viable relationships between species that are something other than competitive. A better word is symbiosis. And a symbiosis between Big Local Media and independent local bloggers is a much more productive relationship for everyone concerned.

Greensboro solution? When an N&R reporter finds a legitimate story in a local blog, the reporter does all the necessary reporting as if he or she was starting from scratch. But when the story appears, the blog is cited. The blogger gets traffic, credibility and juice. The honoring of that work then reflects back on the N&R, building its brand within the local blogosphere. Everyone benefits.

As frequent Xark commenter Pam Morris (a brilliant and witty microbiologist) described it to me once, not every bacterial colony succeeds by out-competing its rivals for available resources. Some get along by handicapping their rivals' ability to compete. "They're really kind of evil. It's like a really bad corporate environment," she said (I'm quoting from memory).

I don't think that attitude has a long-term future with online media. If you base your product on claims of fairness, openness, transparency, whatever, then you just can't go around acting like evil bacteria. You can't see bloggers who write about your local market as enemies to be crushed. You can't even view your critics this way.

Read this carefully: I'm not talking about a future in which there is no competition. I'm suggesting instead that we start developing new attitudes and relationships. It is possible to be both competitive and cooperative. Think I'm just an unrealistic hippie selling pseudo-socialist Kool-Aid? Allow me to introduce you to The NFL.

My bosses crossed the first of these bridges in May 2005, when my posts at our Spoletoblog routinely cited the work of bloggers for The Charleston City Paper. Some people in the building were disturbed, but my boss wasn't.

"I told 'em that's just the way this new world works," he said.

Exactly.

Tuesday, March 07, 2006

The campaign against Wikipedia

(Editor's note: This post began as a news item at Xark!, but grew into a stand-alone essay.)

I first noticed this back in February while speaking about Web trends to a Public Relations/Business Communications class at a local college. When I asked about Wikipedia, everyone who spoke expressed a clear message: Wikipedia, to them, was not so much a resource as it was a threat.

Multiple students reported they had been told by their instructors not to use it -- ever*. Some spoke of professors who routinely threatened to punish anyone caught using it. And even my host allowed that her attitude toward the online encyclopedia was less than charitable.

Last night, speaking to a group of high-school journalists, I got similar responses. In these instances, I detect not only scholarly skepticism, but something more. Something bordering on scorn.

Like the college students from last month, these high schoolers knew that anybody could edit Wikipedia, though none of them expressed any understanding about how the system functioned, what a wiki is, or how a community of editors becomes a self-correcting entity, etc.

It's a disconnect. I see Wikipedia as a way of thinking about information and virtual community. They see it as a free-for-all. Their teachers see it as anarchy.

I call this a backlash. Wikipedia came out of nowhere, fast, to become the largest encyclopedia in history. Some academics, who as a group are used to controlling such things, were horrified by the wiki concept -- and financially threatened by the open-source, free-info, non-profit model that keeps Wikipedia a living, growing document. Rather than checking it out further, a segment of academia appears to have united against it.

From a PR standpoint, the big blow came in December when John Seigenthaler wrote a widely publicized piece citing inaccuracies in a Wikipedia article on his life. The founding editorial director of USA Today called Wikipedia "a flawed and irresponsible research tool" in a column that reflected the attitude I've noticed among some academics -- that whatever else Wikipedia may be, it also is a sandbox for malcontents, anarchists and children who run around with scissors. In other words: Not For Serious Adults.

In the wake of the Seigenthaler column, big-name bloggers and technorati, including Dave Winer and Adam Curry, came out with their own criticisms. Making matters worse, Wikipedia founder (or, as the case may be, co-founder) Jimmy Wales got "caught" editing his Wikipedia bio and taking out references to his early collaborators.

Wikipedians got their say in the ensuing coverage, but from an outsider's perspective, it seemed like the Wikimedia Foundation -- a concept I dearly love -- was suddenly in public-relations damage control mode.

Here's my take:

The idea that Wikipedia is less accurate because it doesn't have top-down editorial control is, itself, inaccurate. A better question would be, When do we know it to be accurate? The whole concept of a collaborative information project is based on the idea that community collaboration will identify and correct errors -- in public. Traditional media also involves editing and fact-checking, but it does so before publication and without transparency. Yet traditional media routinely stumble when it comes to correcting the errors that slip past those all-too-human pre-pub controls.

"Does Wikipedia have errors?" isn't a meaningful question, but "what errors will an individual Wikipedia entry contain in the snapshot of time that I see when I call up the entry?" is a question that actually takes us somewhere.

Wikipedia asks that we correct the errors we see -- and, unlike the popular stereotype of Wikipedia as an irresponsible Wild West of disinformation, Wikipedia as a process includes multiple feedback loops that address vandalism, inaccuracies, biased writing, etc. It assumes that people are adults.

The question, then, is not whether Wikipedia has editorial quality controls (it does), but whether those controls work fast enough.

That's an open-ended question (fast enough compared to what?), but I contend that a wiki-model encyclopedia will probably correct its errors far faster than a proprietary encyclopedia. My reasoning? Top-down, for-profit editorial control pays a few people to ride herd on a large range. It cannot mobilize as many corrective resources, as quickly, as the Wikipedia community can.

Plus, is Wikipedia really inaccurate? Again, accurate compared to what?

Nature decided to compare Wikipedia to Britannica, considered the Gold Standard of traditional encyclopedias. Its finding? On a survey of 42 science articles, Britannica was more accurate.

But how much more accurate? Not much. The Nature study found an average of four errors in its Wikipedia entries... compared to three errors, on average, in a Britannica entry.

The debate goes back and forth, with some Wikipedians contending that the average Wikipedia entry is 2.6 times longer than the average Britannica entry, then doing the math to produce a lower error rate. Yada yada yada. I don't care. Framing this as a competition between Wikipedia and Britannica misses the point.

The more telling comparison is between Wikipedia and Google, because when you consider how I've come to use Wikipedia, it's as an alternative to general web search engines. Wikipedia is just as fast, far more relevant and much more accurate in the information it returns. Viewed as a form of curated search, Wikipedia looks a lot less threatening.

In this sense, Wikipedia is a through-point, not a destination. And, ironically, this was exactly how my teachers told me I was supposed to use an encyclopedia Back In The Day.

Not only are we comparing Wikipedia to the wrong standard and failing to understand it as a process and a community, we're also missing the most valuable points of the accuracy debate by taking Wikipedia out of its natural context: the larger Web. Dave Winer has been cited by Wiki-haters for his criticisms, but that's far from the complete picture. Consider this Scripting News post from December, in which Winer addresses the Seigenthaler case and the larger ethical question of who-should-edit-what (emphasis added):
Ross Mayfield sees the pros and cons of editing your bio page on Wikipedia. Here's my take on it. No, you must not edit your bio page, or any page about a topic in which you have an interest. It's impossible to disclose that interest, so the poor reader has no idea how to credit what's on the page. This is the weakness of Wikipedia, in fact of all wiki. But his point about the knowledge you have about yourself is an important one. Imho, the obvious answer is that your page, on your site, edited only by you, should be linked to from the equivalent Wikipedia page, in a consistent and prominent way. Your review of a page about something you're involved in is important, but it must be clear to the reader that they are reading something that's interested. Ultimately, this combination of wiki and blogging is going to be the answer. It's how Jimmy Wales will be able to tell us he doesn't think the stuff on his Bomis site was porn and how his Ferrari cost less than most SUVs, and how Adam Curry can tell you all about himself and edit everyone else out. Now the question is, who is qualified to edit the Wikipedia page?
That's a great question, with multiple possible "correct" answers. But Dave Winer's perspective demonstrates how holistic thinking trumps simplistic, out-of-context analysis. Let's see Wikipedia for what it is, what it can be, how it fits into its environment, and encourage people to use it properly.

Exactly. So what if I can't cite Wikipedia the same way I would a static source? It's still immensely valuable to me.

Should we take what we find at Wikipedia at face value? No. Duh. But let's restate the question: Should we take ANY information we find, online or otherwise, at face value? Answers, please, on a post card.

Ultimately, the Wikipedia controversy, if it can be called that, is about how we feel about control. I know where I come down on that subject, and it's right where Jimmy Wales was when he spoke to USA Today in December: "'Any place where the general public is allowed to freely express their opinion without having any sort of prior approval from authority — it is dangerous,' Wales says. 'Free speech is dangerous. But it's also incredibly powerful and useful.'"

Amen.

(* March 8 editor's note: I've been thinking about the sentence where I wrote that some students said they had been told not to use Wikipedia "ever," and I feel I should clarify this statement. I wasn't taking notes, but the more I think about that sentence the more I worry that it is misleading. They were certainly told not to use Wikipedia in the limited sense of citing it in a footnote. And at least one student mentioned that a teacher had told him that it was OK to go to Wikipedia so long as he didn't use any information he found there. My interpretation of their comments doesn't change, but I think the wording in my original sentence overstates the level of explicit prohibition. Restated, it would be this: they're free to read Wikipedia if they choose to do so, but they are not to use it in their assignments. -- dc)

Thursday, March 02, 2006

The Katrina Tapes

What's really all that new in the AP Katrina video story?

In a word: Video.

The failure of the Katrina relief effort isn't news. Americans learned back in August and September that the government response to the Katrina disaster was inadequate. And though bias-warrior conservatives tend to blame the media for all negative perceptions of their champions, Katrina swept those arguments away like so many shacks in the 9th Ward.

Why?

In a word: Video.

It wasn't subtle liberal framing by CNN or CBS News or the NYT that sank Bush in September: It was video of the President saying "You're doing a heckuva job, Brownie." Plus video of the President trying to look Presidential by hugging two black storm "victims" at a fictional "aid station" on the Mississippi coast. Plus video of a passerby shouting "Go fuck yourself!" to Dick Cheney during a live news photo-op.

For all the media sturm und drang, the post-Katrina days were a period when the images the White House engineered to deliver its message just looked... phony. People might not have been able to put their finger on what was wrong, exactly, but it didn't take a rocket surgeon to figure out that the reality of the unfolding tragedy just didn't jibe with the official response.

But that was then. What's the big deal now about the video of these pre-landfall FEMA briefings?

It's not that we didn't have solid evidence that the federal response had been bungled. And we've had plenty of evidence that the White House had gone into bunker mode, refusing for months to cooperate with Capitol Hill investigators. We knew last week that the White House report on the Katrina response managed to point no fingers at the Oval Office. It's all there if you want to read it. But few do.

The facts in those stories have a fatal flaw: they're just words. Written words. And in the war of the written word, there is no end to the parsing and the framing and the sense that the real truth lies somewhere else, beyond some media curtain, obscured by partisan interests and secretive agendas.

Informed media consumers are aware that video is at least as easy to manipulate as words, and that pictures can, in fact, lie. But the power of the image is undeniable. Why else would the question of whether Bush did a photo op with Jack Abramoff take on such high-stakes importance? Even if such a photo recorded nothing more than a meaningless "Thanks for your support" moment, in political terms such an image represents a tremendous weapon for the President's opponents.

So this is the meaningful part of the Katrina briefing-video story: non-partisan people who see it just won't walk away with the impression that the President of the United States was all that involved or concerned. There's a hollowness to his promises of federal support. There's a visual difference in the urgency expressed by the emergency officials seated cheek-to-jowl around a conference table and the president, seated beside an advisor and a cameraman, alone in a room at the ranch where he was spending his vacation.

No amount of journalistic balancing can undo the impression that such a video presents.

Such impressions can be misleading, and so far, hammering on this point and blaming the media -- again -- seems to be the best the Right can do.

"WE'RE BACK TO HEARING ABOUT KATRINA, which is a pretty good sign the media is trying to gin up an other anti-Bush swarm," Glenn Reynolds wrote at Instapundit. "Katrina taught the media that if they all swarmed Bush at once they could do harm even if -- as turned out to be the case -- much of what they reported was outright false. I've noticed a lot more of that since. The Bush Administration is quite capable of making its own trouble with PR -- see the ports issue, for example -- but it's also quite clear that the media is doing this sort of thing for entirely partisan reasons."

Entirely partisan reasons? I think that entirely misses the point. The show we're watching could be titled "The Bureaucracy Strikes Back." The White House strategy has been to scapegoat its underlings. It just didn't figure that the underlings would be smart enough to tape the proceedings -- and keep copies.

John Hinderaker at Powerline plays lawyer tricks. We haven't seen the videos in their entirety. The clips were edited "in a way obviously intended to make President Bush and the administration look bad." Do the clips show the President misled the country? Hinderaker's answer sounds an awful lot like "It depends on what your definition of the word 'is' is."

Hinderaker parses with excruciating care the sourcing of AP phrases like "and Bush was worried too" while focusing enormous attention on the difference between "breaching" and "overtopping." Apparently, to Bush loyalists, the difference between one and the other proves that the media is bad and that Bush is blameless, although I can honestly say that after reading his arguments carefully I reached this conclusion: If one of my kids rationalized a failure of that magnitude with such threadbare word-play, I'd laugh while I whupped his sorry butt.

Anyway, none of this amounts to anything more than a temporary rhetorical fallback position for the partisan Right. When it's just words in play, more words can usually blunt their effect. But words can't undo the effect of images, as the Rodney King riots illustrated vividly in 1992.

There's still a lot of Bush administration tenure ahead of us, and for close observers, this may be little more than a footnote.

But to casual TV consumers, this looks an awful lot like that last, heavy straw.

Tuesday, February 28, 2006

Forget all that other stuff

Talk media with people who aren't in the media, and you'll figure out pretty quickly that the motives outsiders ascribe to us generally fail to connect with reality because their assumptions have one basic, fundamental flaw: They figure the process of newsgathering is somehow rational and deliberate.

Here's a much better picture, from the great Lenslinger: Mad Skills of a Veteran Photog.

(Crossposted @ Xark!)

Friday, February 17, 2006

Journalism from a software perspective

On Feb. 9, while reading up on the web framework Django, my eye gravitated toward an unfamiliar acronym in this sentence: “Django focuses on automating as much as possible and adhering to the DRY principle.”

So what’s DRY? To programmers, DRY means “Don’t Repeat Yourself,” and the link explaining the principle led eventually to this rather elegant statement: ''Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.''
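(For the non-programmers, here's a toy sketch of what DRY looks like in practice. The example is mine, not something from the Django documentation or the page linked above; the point is only that the tax rate lives in exactly one place, and everything else derives from it.)

```python
# A toy illustration of DRY: the sales-tax rate is a piece of "knowledge"
# that lives in exactly one place. Everything else is derived from it.
SALES_TAX_RATE = 0.06  # the single, unambiguous, authoritative representation

def price_with_tax(price):
    """Derive a taxed price from the one authoritative rate."""
    return round(price * (1 + SALES_TAX_RATE), 2)

def tax_owed(price):
    """Another derivation -- still no second copy of the rate anywhere."""
    return round(price * SALES_TAX_RATE, 2)

print(price_with_tax(10.00), tax_owed(10.00))  # 10.6 0.6
```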

My study of content management systems returned me to the Django page, but within minutes I found myself drifting back to this simple-but-powerful concept: Express every individual bit of knowledge in a clear and authoritative manner. It whispered in my ear, tugged at my sleeve, told me there was more in play here than geek esoterica.

But duty called, so to move on I scribbled the principle on a piece of sketchpad paper and pinned it to the corkboard behind my desk with this question: How Could We Apply This to 21st Century Journalism?

Because to be blunt, modern journalism – not to mention the larger culture – is in desperate need of some clear and authoritative factual statements.

Where we are
One of the great ironies of modern life is how post-modern conservatives become when the topic turns to the media. The Bible, Adam Smith, warped timber – these are articles of faith, received wisdom. Conservatives don’t generally challenge such statements with much fervor, nor should we expect them to. The goal of conservatism isn’t the questioning of authority but the bolstering of it, typically against the critiques of the offensive, silly or malicious elements in society.

Yet when it comes to discussions of the media, even the most rock-ribbed of John Birchers turn downright existential.

Objectivity? Impossible! The very terms of the discussion render it so.

To 21st century American conservatives, any media claim of objectivity represents an overtly political act. What’s more, they say, the claim of journalistic objectivity is actually a partisan political act because – as any rational person can plainly see – the media is biased in favor of the Democratic Party.

Which leads us, ipso facto, to a surprisingly radical conclusion: Since the post-modern conservative critique has now eliminated even the possibility of journalistic objectivity, and since – as any rational person can plainly see – liberal media bias serves the Democratic Party, then the most essential media reform of the 21st century is the creation of a separate, subjective, distinctly partisan, pro-GOP press to compete with the old-line “mainstream media.”

This idea – that the antidote to a subjective-but-dishonest liberal media is an overtly subjective conservative media – isn’t new. FOX News and The Washington Times predate the current administration, as do numerous conservative press critics. What’s new are concepts that NYU journalism professor Jay Rosen calls “de-certification” and “rollback.” Both are now operative principles in both the Bush White House and the larger conservative movement.

De-certification rejects the notion that journalists have any unique standing to critically examine or publicly challenge the statements of political leaders. It identifies Big Journalism as just another special interest, and treats reporters as special pleaders. De-certification identifies press “spin” (coverage) as just another message in competition with the conservative message. It rejects the idea that reporters can be objective, or that critical coverage can be anything other than partisan.

Rollback is the implementation of de-certification. It is Scott McClellan repeating the same plainly non-responsive statement no matter the question. It is the release of the Dick Cheney hunting accident story to the local paper rather than the Washington press corps. It is President Bush pointing out that nobody elected the person asking him questions.

Though elements of this conservative critique may well be worth a larger discussion, it is their net effect that concerns me. Together, they portend a future in which the mass media will present Americans not just with competing viewpoints, but with competing facts. In the worst-case scenario, these polarized partisan presses will present factual claims that are mutually exclusive.

Which raises the question: Is America benefiting from its now-Balkanized mass media? Would subjectivity in mass media be helpful or harmful? And with critics on both sides of the political spectrum united in the belief that human objectivity is not possible, is there any way that those of us in the journalism business can steer our profession back toward something resembling a common frame of reference?

The stakes are visible in this study: In 2004, FOX News viewers were far more likely to vote for President Bush. These viewers were also far more likely to believe statements about Iraq that were factually untrue, and each of these inaccuracies negated Democratic critiques of the administration’s foreign policies.

It is now clear to me that simply appealing to the good faith of media consumers will never allow us to address this status quo. Reporters, editors and producers will never be able to regain objective credibility across partisan lines by making reforms in the way we report or package the news. Professionalism is good, but it won’t change the basic equation.

Two types of objectivity
Which is why in 2005 I began proposing that an optimistic vision of our future requires that journalists stop thinking about news as a craft and start thinking about news as an informational system.

I was covering science at the time, and you can’t do that very long without recognizing that objectivity wasn’t an impossibility for the biologists I covered – it was just another factor in their experiments. They controlled for it, and then they documented those controls for all to see. Not even Heisenberg’s Uncertainty Principle, the ultimate statement of observer-subjectivity, derails the scientific concept of objectivity.

Why? Because unlike journalistic objectivity, which proposes itself to be an artificial perspective, scientific objectivity is a documented process. A requirement of that process is that it be recorded clearly enough that findings are repeatable for all observers (in the case of laboratory experiments) or clearly controlled for the observer’s subjective perspective (field observation of a single event or series of events). When viewed from a distance, this process of objectivity varies for each individual discipline, but its philosophy is constant: Always be aware of the subjectivity of the observer, use agreed-upon standards, and show your work.

In other words, scientists have created a system of objectivity, and by abiding within its rules, civilization has flourished. Scientific objectivity allows a physicist in Oslo to derive a bit of knowledge that a physicist in Kyoto can apply to a larger experiment. While scientists do test each other’s findings, science does not re-invent wheels. This is why there is only one Uncertainty Principle – Heisenberg’s.

Compare this to modern journalism.

By our standards, if Al Gore took up physics and claimed he had derived an Uncertainty Principle, journalists leaving his press conference would be expected to call the White House for a response. The story announcing the Gore Uncertainty Principle (GUP) would likely point out that the Heritage Foundation has a competing Uncertainty Principle (HFUP), then note in passing that someone named Heisenberg had done similar work in the 1920s. Being journalistically objective, most versions of this story would report each of these claims as limited facts (the fact being that individuals had stated the claims) without attempting to evaluate those claims.

Along the way, we’d quote Gore saying why his GUP reaffirms the principles of participatory democracy, while a Heritage Foundation spokesman would opine about how the GUP gets it entirely backwards: the HFUP clearly proves that President Bush won both Florida in 2000 and Ohio in 2004.

A week later, a major media outlet might attempt to write a follow-up piece critically examining the claims, and if the reporter had any scientific expertise, this new story would likely conclude that Heisenberg’s Uncertainty Principle is the only one that matters, and that the partisan versions of this essential theory of quantum physics are, at best, irrelevant.

This story would be immediately assailed as biased, of course. Conservative viewers, watching their network, would reach one conclusion. Liberals another. And while this echo-chamber effect might be comforting for both groups, it’s hardly the prescription for creating an informed, constructive national debate on any subject.

Rethinking journalism
On December 9, 2005, I left this comment on a particularly contentious PressThink thread:

We need to create some kind of new information tool that helps us manage these situations, so that basic facts can be established and stipulated. If we don't trust the government and we don't trust the media and we don't trust each other, how can we get anywhere? We know how to build websites and blogs and news wires ... but how do (we) build trust in the 21st century?

Five days later I wrote a lengthy post (“21st century trust … the techno-geek way!”) at Xark! trying to answer my own question. And in early January, I actually proposed in another PressThink thread that journalists publicly evaluate their confidence in the factual content they were publishing.

The tricky part is that being explicit about confidence means editors would have to accept greater accountability. If I've overrated my 12-miners-alive story a 7 and it reverses, I look pretty damned stupid. Then again, if I'm systematically underbidding my confidence to prevent being revealed as wrong later, I'm not doing much to build my credibility. You want an incentive for people to be candid and thorough, and I think this might provide it.

To be truly useful, such a system would need to be keyed to something, whether it's a number system or a color code or a bar graph or a slider. Whatever. A 5 rating should mean the same thing to the reader as to the editor. The beauty of the web is that editors don't have to redundantly explain this stuff in print -- rather, they can post the rating and know that anybody who isn't sure what it means can click and find out exactly what it means. And the more specific the better.
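(To make the "keyed to something" part concrete, here is a rough sketch of what a published confidence key might look like. The wording of each level is mine and purely illustrative -- the point is only that the same number means the same thing to editor and reader.)

```python
# A sketch of a published confidence key. The levels and their wording are
# hypothetical; what matters is that a "7" is defined once, for everyone.
CONFIDENCE_KEY = {
    9: "Confirmed by multiple independent primary sources; documents in hand.",
    7: "Confirmed by a primary source plus supporting records.",
    5: "Single credible source; not yet independently confirmed.",
    3: "Secondhand account; confirmation still being sought.",
    1: "Unverified claim, published only because it is already circulating.",
}

def label(headline, rating):
    """Attach the shared, clickable definition to a story's rating."""
    return f"{headline} [confidence {rating}: {CONFIDENCE_KEY[rating]}]"

print(label("12 miners found alive", 7))
```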

Some found the idea interesting. Kinda. Sorta. Most didn’t. But even though some people I respect – namely Paul Lukasik and Steve Lovelady – have rather graciously tried to tell me that I’ve gone quite overboard with such thinking, I have the sneaking suspicion that the problem with most proposed solutions for our current media malaise is that they don’t go far enough.

The other problem is that they’ve thrown out the baby with the bathwater when it comes to objectivity. Fine: Let’s junk journalistic objectivity and its Halfling brethren “news judgment” and “fairness.” But let’s not concede the intellectual ground to competing subjective visions without first exploring the possibilities of a more scientific form of objectivity. Not a particularly enlightened perspective or state of being, but a transparent process.

How it might work
Imagine for a moment that your next word processor came with an annoying “intelligent agent” feature that recognized any declarative statement of fact you ever wrote and then asked you to cite its definitive source. An incredible pain in the ass, yes.

But now imagine that, as a reader, every document you were ever asked to evaluate came to you as rich hypertext, with each summary fact transparently sourced all the way back to its original, definitive expression. Would you treat its claims differently than you would a document that arrived without that kind of depth behind it?

I’d wager you would. Sure, most writers cite sources, even if they don’t expressly name them. But are the sources definitive, or are they just dueling “facts” – on-the-public-record but never actually challenged or verified?

But back to our imagining. Anyone, given unlimited time and resources, could produce dry, boring, factual articles that are nevertheless elaborately festooned with hypertext-footnotes. Someone with zero understanding of how the modern mediascape works might even prescribe this as a solution for what ails us.

Realistically, though, most reporters and editors will never have the time or resources to produce such exhaustive fact-check formatting on deadline. Even with modern Web search engines, checking a relatively simple statement back to its “single, unambiguous, authoritative representation” is an exceedingly time-consuming and tedious task. Allow me to demonstrate:

“North Charleston, despite being one of the youngest cities in the state, is also among the largest.”

Now. Time me.

Two minutes: “As a means of bringing government closer to the people, an incorporation referendum was held on April 27, 1971. On June 12, 1972, after a series of legal battles, the South Carolina Supreme Court upheld the referendum results and North Charleston became a city.” (http://www.northcharleston.org/AboutUs/History.aspx)

Four minutes: “Incorporated in 1972, it is South Carolina's youngest city of any size.” (http://www.northcharleston.org/AboutUs/LocationMap.aspx).

Seven minutes: 2000 US Census population (via http://factfinder.census.gov/servlet/GCTTable?_bm=n&_lang=en&mt_name=DEC_2000_PL_U_GCTPL_ST7&format=ST-7&_box_head_nbr=GCT-PL&ds_name=DEC_2000_PL_U&geo_id=04000US45): 79,641

Eight minutes: Charleston, 96,650; Columbia, 116,278.

Nine minutes. None other found.

So there we are: Almost 10 minutes of searching for a basically benign statement. The sources look pretty good, too – but they still aren’t anywhere close to the single, unambiguous, authoritative representations that the DRY principle calls for.

For instance, when the North Charleston city website calls itself “South Carolina’s youngest city of any size,” is that independent of the term “town?” It certainly doesn’t take into account the municipal soap opera that has been the recent history of the Town of James Island, which has been incorporated and disbanded twice in the last decade (James Island is currently unincorporated, which wouldn’t precisely invalidate this statement of fact). Beyond that, can the city of North Charleston be trusted to provide authoritative statements about itself?

Neither is the information up-to-date. There’s a 2003 census estimate that I found that shows North Charleston with roughly 81,500 residents… but that’s at least three years old now, and it’s an estimate. It doesn’t change the statement I made, but now I’m foundering. Which one would I pick as the authoritative representation of the original bit of knowledge?

Given this quick searching, perhaps I would edit my statement: “North Charleston, despite being the youngest city in the state, is also its third-largest.” The sentence is actually three factual statements: 1. North Charleston is the most recently incorporated municipality in South Carolina; 2. North Charleston’s population is estimated at roughly 81,500 people; 3. Only two other municipalities (Columbia and Charleston) in SC have larger populations. So my searching has marginally strengthened my statement and the hypertext footnoting may have improved your willingness to believe its veracity.
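(If you wanted to make that decomposition machine-readable, the three claims might look something like this. A sketch only: the field names are my own invention, and the "sources" are simply the pages and figures I turned up above.)

```python
# The same sentence, broken into three individually sourced records.
claims = [
    {"statement": "North Charleston is the most recently incorporated "
                  "municipality of any size in South Carolina.",
     "source": "http://www.northcharleston.org/AboutUs/LocationMap.aspx",
     "as_of": "incorporated 1972"},
    {"statement": "North Charleston's population is roughly 81,500.",
     "source": "US Census Bureau population estimate",
     "as_of": "2003"},
    {"statement": "Only Columbia and Charleston have larger populations "
                  "among South Carolina municipalities.",
     "source": "US Census 2000 (Columbia 116,278; Charleston 96,650)",
     "as_of": "2000"},
]

for claim in claims:
    print(f"{claim['statement']} [{claim['source']}; {claim['as_of']}]")
```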

And yet in no way have I met the standards of the DRY Principle. I’ve wasted valuable time bolstering a sentence that – even when upgraded – makes the same point specifically that it originally made generally. And the items to which I point as my proof lack truly authoritative status. No doubt I’ll be fielding pointless phone calls from miffed James Islanders, who interpret the statement differently and want to argue.

Even under the most cursory examination, my DRY experiment is a tremendous time-wasting flop.

All of which demonstrates why a real DRY fact-base would be tremendously valuable.

The trouble with search
Google is far from the definitive source most people imagine it to be. Just try updating your website and Googling the changes for proof. In fact, no web search engine can meet this standard, because the people writing the search algorithms aren’t the same people managing the data. So while web search points us toward facts, it cannot, as a system, create truly authoritative factual statements.

We need another tool. In fact, we need several of them.

  1. We need a curated fact-base. From raw data like census reports to statements contained in magazine articles, we need a database of primary factual statements that have been sourced and verified according to transparent and universally recognized standards.
  2. We need a system by which new primary factual statement may be reviewed and added to the factbase.
  3. We need a system by which all facts within the database can be reviewed and updated automatically. Such a system would also connect changes of primary fact to secondary statements such as “North Charleston is the state’s third-largest city.” (A rough sketch of what this factbase might look like follows.)
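Here is that sketch, in Python. Every class, field and method name is hypothetical -- this is not a design, just an argument that submission, review and dependency-tracking are ordinary, buildable machinery.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Fact:
    statement: str                      # the single, authoritative wording
    source: str                         # where it was verified
    verified_on: date                   # when it last passed review
    depends_on: list = field(default_factory=list)  # primary facts it rests on

class FactBase:
    def __init__(self):
        self.facts = {}                 # point 1: the curated store
        self.pending = []               # point 2: submissions awaiting review

    def submit(self, fact):
        """Anyone can propose a fact; it waits for human review."""
        self.pending.append(fact)

    def approve(self, fact, fact_id):
        """A reviewer verifies a pending fact and adds it to the base."""
        self.pending.remove(fact)
        self.facts[fact_id] = fact

    def stale_dependents(self, changed_id):
        """Point 3: when a primary fact changes, flag every secondary
        statement built on top of it for re-review."""
        return [fid for fid, f in self.facts.items()
                if changed_id in f.depends_on]
```

The dependency list is the part that matters: it is what lets a change to a census figure ripple outward to every secondary statement that rests on it.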

And then there’s No. 4:

  4. We need an intelligent word processing tool that automatically relates each factual claim to its original, unambiguous, authoritative statement.

No. 4 is the idea that transports DRY Principle Journalism from the impractical to the sublime. Why? Because relevant factual statements tend to become pyramids over time. Down at the bottom? Census figures. Incorporation records. Later comes a statement, like mine, that combines census figures and incorporation records. Eventually, you reach statements like this one: “Along with its relative youth and rapid growth comes crime. North Charleston’s violent crime rate was among the highest in the United States in 2005 (ranked No. 79 for US municipalities).” Facts correlate, interrelate, expand and contrast.

If I write using DRY-principle facts, then each level of complexity I ascend becomes its own DRY-principle statement.

With the right tools linking the DRY factbase to my word processor, I’d know if my statement was generally correct, generally incorrect, or questionable. As I write, the built-in analyzer would search the factbase for relevant facts, perhaps listing them in a scrolling window beside my word processing field. At the end of the article, I’d probably edit by scanning back over the cited links generated by my intelligent agent, check to see if there were any obvious ways to improve the factual rigor of my article, and then press save.
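What might that built-in analyzer look like? Here is a deliberately crude sketch -- keyword overlap standing in for real relevance ranking, and every function name my own invention:

```python
import re

# The "analyzer" loop in miniature: scan a draft for sentences, use their
# content words as search terms, and list any stored facts that look related.
STOPWORDS = {"the", "a", "an", "is", "its", "also", "of", "in", "and",
             "despite", "being"}

def content_words(text):
    return {w for w in re.findall(r"[a-z']+", text.lower())
            if w not in STOPWORDS}

def related_facts(sentence, factbase):
    """Return stored facts sharing at least two content words with the sentence."""
    terms = content_words(sentence)
    return [f for f in factbase
            if len(terms & content_words(f["statement"])) >= 2]

factbase = [
    {"statement": "North Charleston was incorporated in 1972.",
     "source": "northcharleston.org"},
    {"statement": "Columbia's population was 116,278 in the 2000 census.",
     "source": "US Census 2000"},
]

draft = ("North Charleston, despite being the youngest city in the state, "
         "is also its third-largest.")
for sentence in re.split(r"(?<=[.!?])\s+", draft):
    for hit in related_facts(sentence, factbase):
        print(f"RELATED: {hit['statement']} [{hit['source']}]")
```

A real tool would need far smarter matching than shared keywords, but the shape of the loop -- sentence in, related facts out, listed beside the text -- is the whole idea.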

Of course, if I’m reporting, my job is to generate new facts. How might DRY help me there?

Well, for starters it might let me know whether my subject is actually news – or just news to me. It would guard against me making factual and context errors. Perhaps we could even train it to recognize and challenge certain types of logical fallacies or misleading rhetorical devices.

But the most important role such an agent might play for a reporter is that it would recognize new, unsupported factual statements, note their cited sources, and apply them to the factbase process.

Memory and power
On Sept. 1, 2005, with his administration beginning to come under fire for its response to the Katrina disaster, President Bush told reporters “I don’t think anyone anticipated the breach of the levees.”

With an intelligent agent dynamically connecting the DRY factbase to their word processors, reporters would have known this statement to be factually incorrect before they had finished typing the closing quotation mark. Why? Because multiple previous articles and disaster exercises had done exactly that – predicting with great accuracy the impact of a Katrina-like storm.

Yes, we all need an ever-expanding database of original-source facts, stated clearly and authoritatively. But the trick to making such a thing useful would be to embed in our writing tools the kinds of pattern-seeking software that first recognizes declarative grammar and then applies the words as search terms.

Positive correlations might stream into the word processor’s “hits” window as green supporting citations. But contradictory facts – like FEMA’s previous disaster exercises – would flash red.
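(A toy version of the green/red idea. The contradiction relation is stored explicitly here because real automated contradiction detection is a much harder problem than anything sketched in this post; the entries and key phrases are illustrative only.)

```python
# Toy green/red flagging: each factbase entry carries an explicit relation
# to a known public claim, identified by a key phrase.
factbase = [
    {"statement": "Pre-Katrina FEMA disaster exercises modeled catastrophic "
                  "flooding of New Orleans from a major hurricane.",
     "relation": "contradicts",
     "key_phrase": "anticipated the breach of the levees"},
    {"statement": "New Orleans levees failed during Hurricane Katrina.",
     "relation": "supports",
     "key_phrase": "the levees failed"},
]

def flag(claim_text):
    """Print GREEN for supporting facts, RED for contradicting ones."""
    lowered = claim_text.lower()
    for entry in factbase:
        if entry["key_phrase"] in lowered:
            color = "RED" if entry["relation"] == "contradicts" else "GREEN"
            print(f"{color}: {entry['statement']}")

flag("I don't think anyone anticipated the breach of the levees.")
# -> RED: Pre-Katrina FEMA disaster exercises modeled catastrophic flooding ...
```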

At the very least, a reporter following these protocols would know that the President’s statement was less than rigorously true. So too would anyone following the story at home. What’s the point of building the world’s greatest factual reference and not making it public to the world?

The President should have access to the factbase as well – if only to get his story straight before he goes out to meet the press.

And if the President wishes to contend that the factbase is wrong, well – we should be able to build feedback mechanisms that allow that, as well.

Administration
So, imagine for a moment that our discovery informatics wizards could develop the right interface. And imagine that our systems geniuses could invent the right storage, cross-referencing and retrieval processes. Imagine that our best archivists and data specialists could create a transparent system for batch-converting the huge volume of new data that would soon flood into the factbase. Imagine that the wise among us could create fair and practical ways of making sure the factbase stays accurate.

OK then: How would we pay for it?

One answer might be that the nation’s media outlets could work cooperatively on such a system, much the same way that competitors work together to make the Associated Press. The project is too large for any single participant, but if they worked together, each would benefit.

Colleges and universities? Sure. Research and development labs? You bet. Anyone with an interest in the expansion and vetting of information could benefit.

Governments?

Well, that’s another question.

Regardless of who would pay and how much such a system would cost, I see nothing in what I propose here that exceeds the theoretical capabilities of existing or developing technologies. And if science is any guide, then the value of having solid factual information at the world’s disposal – without having to independently verify each individual bit of knowledge – would be a tremendous economic multiplier.

I believe a system like this will be within our reach within a decade.

Would it be a magic bullet? No. So much of what passes for fact is actually only “facty.” How much of our political reality is based on guesses, attitudes, opinions? A DRY factbase and standards-based journalism wouldn’t change that.

But creating a standard repository of “single, unambiguous, authoritative representations” of knowledge would be a transformative technology both for journalism and society. Not because it would expand knowledge – but because it would allow the creation of a system of mutually agreed-upon, standards-based journalism and communication.

Some people would choose to stay outside such a system. They would challenge its validity, appeal to fear, appeal to divine authority. They could appeal to “truthiness” just as they do today.

But by making such a system open-source, and by inviting everyone to participate in monitoring it, you would move truthiness from the heart of the culture to its periphery.

People like me will still write to persuade. We will still argue over which facts are relevant.

But no longer will you have to trust me to see the relative value in what I have to say.

And that would be the biggest improvement in communication I could ever imagine.