Wednesday, April 26, 2006

A blog blog

To get some idea of what we've been up to recently, take a peek at a new blog-about-local-blogs that Janet and I started a week ago. We're doing it for the newspaper as part of the new Postscripts project (Janet came up with the brand, logo and slogan), and here's the skinny: We "launched" it on the night of April 18th by quietly sending out the link to two local bloggers.

The next morning we had more than 50 hits.

Since then we've been covering local blogs in a manner that would be familiar to anyone who reads The Hotline's Blogometer. By the end of its first week, our stealth blog had already generated more than 1,000 hits and 20 comments. Local bloggers have used it to announce plans for a blogger party next month. People are excited, and The Big Blogroll keeps growing. We're indexing more than 50 local, active blogs, and that's already more than half the number listed at Greensboro 101, a pioneering local blog site created by Roch Smith.

Could it be that there's a more active local blogosphere in the Lowcountry than any of us imagined? What happens to a community of writers and readers when that community becomes aware of itself? Can a local metro daily connect to its local blogosphere in a constructive way?

I dunno, but we're about to find out. Today the webmaster for our Charleston.net web site is working out the domain redirects that will give our local blog-blog a shorter, easier-to-remember URL, and once that's set, we'll be linkable off our own site for the first time.

It's early, but the early signs are encouraging. Particularly this one: In the post in which local blogger Walter (of Baxter Sez) announced his discovery of Lowcountry Blogs, he wrote:
since i now see that we may have just a leetle bit of local attention i want to take advantage and offer up my first bit of critique for something charleston. here it is:

Crosstown. the road that truncates the peninsula. man, that is one ugly sumbitch.
... and went on from there to talk about the congested local freeway that carves up neighborhoods on the Charleston peninsula.

Which leads to my point: if connecting local bloggers to each other encourages them to write more about the place where they live, why would a local news organization not want to help with that?

Beats me.

Thursday, March 23, 2006

Competition and its alternatives

(Cross-posted from Xark!)

I offer this as proof that even really cool ideas can take a while to find their audiences, particularly when that audience is saturated by media in the first place. Anyway, thanks to the few Charleston bloggers I have found and added to my reading list, I've finally encountered something I should have been tracking for weeks: a Charleston City Paper guy named Jay Stecher. Apparently, Stecher has been doing a thing since Feb. 1 called "The Weekly Geekly" -- great name, eh? -- in which he rounds up local blogs and podcasts, etc.

The Weekly Geekly isn't a blog -- though it would make a natural one, hint-hint -- but it's already doing one of the things I hope to convince my bosses to do: draw attention to the work of people who are making their own media in the city we share. This is hardly a cutting-edge idea, but it's alien to the competitive-media-mind until you wrap your brain around some of the related concepts. In case you haven't noticed, newspapers traditionally hate citing the work of their competitors.

And this is one of those related concepts: in blogging, you don't deal with competition by trying to freeze out your competitors.

One of my favorite stories from ConvergeSouth was someone (I forget who) talking about how The (Greensboro) News and Record had improved its relationship with the local blogosphere by changing the way it dealt with staff-written stories that were developed from independent blog posts. The first N&R stories that were "inspired" by information originally reported on blogs routinely made zero mention of that fact. Bloggers cried foul.

What those bloggers probably didn't know was this: reinventing someone else's wheel so that you can say "Look! I made a wheel!" is a longstanding and truly stupid tradition in traditional media. Every paper I've ever worked for has done it, and as city editor, I used to tell people to do it.

Anyway, here's the thinking (or rationalization, take your pick): Readers want local news. Readers will define "local" as "staff written." So if the Associated Press or one of our wire services moves a story that is local to our market or touches on a local issue, we will "localize" the story. For you. The reader.

At a minimum, localization means that I'll take a story that deals with a national or regional trend or issue and make sure that there is information in that story that places the local situation in that larger context. The AP moves a story on freedom of information compliance, so I add some paragraphs that tell local readers what the rules are here. I don't have any problem with that.

Where I go off the reservation is when it comes to the second level of localization, which I like to call "theft." A wire service writes a story about something in our state, and we don't have it. So we assign a reporter to call all the same sources from the original story, then write the same thing using different words. When that story appears in print, there's no mention of the original article.

And it's even worse if the source for the story is a direct competitor instead of a mutual wire service. Yikes! We've been scooped! We need to do this story right now!

We rationalize this by saying "Well, we need to verify the story independently," but this just isn't a meaningful response. Yes, independent verification is a good thing. But we run wire stories that we don't verify every day. We couldn't possibly meet such a standard.

The truth is, news organizations recreate existing news stories when they're caught flatfooted by other news organizations and don't want to admit it. I think it's dishonest, wasteful of precious reporting resources and disrespectful of our readers -- not to mention the originator of the story.

So when you hear that the Greensboro reporters weren't citing the work of local bloggers, you can understand why. The N&R reporters and editors were doing what we've always done as an industry. They were applying "traditional newspaper values" to their relationship with the blogosphere. And those values just didn't work in the new medium.

The N&R made peace with its local bloggers by changing the way it looked at them. Competitors? Maybe. But nature suggests that there are viable relationships between species that are something other than competitive. A better word is symbiosis. And a symbiosis between Big Local Media and independent local bloggers is a much more productive relationship for everyone concerned.

Greensboro solution? When an N&R reporter finds a legitimate story in a local blog, the reporter does all the necessary reporting as if he or she was starting from scratch. But when the story appears, the blog is cited. The blogger gets traffic, credibility and juice. The honoring of that work then reflects back on the N&R, building its brand within the local blogosphere. Everyone benefits.

As frequent Xark commenter Pam Morris (a brilliant and witty microbiologist) described it to me once, not every bacteria colony succeeds by out-competing its rivals for available resources. Some get along by handicapping their rivals' ability to compete. "They're really kind of evil. It's like a really bad corporate environment," she said (I'm quoting from memory).

I don't think that attitude has a long-term future with online media. If you base your product on claims of fairness, openness, transparency, whatever, then you just can't go around acting like evil bacteria. You can't see bloggers who write about your local market as enemies to be crushed. You can't even view your critics this way.

Read this carefully: I'm not talking about a future in which there is no competition. I'm suggesting instead that we start developing new attitudes and relationships. It is possible to be both competitive and cooperative. Think I'm just an unrealistic hippie selling pseudo-socialist Kool-Aid? Allow me to introduce you to The NFL.

My bosses crossed the first of these bridges in May 2005, when my posts at our Spoletoblog routinely cited the work of bloggers for The Charleston City Paper. Some people in the building were disturbed, but my boss wasn't.

"I told 'em that's just the way this new world works," he said.

Exactly.

Tuesday, March 07, 2006

The campaign against Wikipedia

(Editor's note: This post began as a news item at Xark!, but grew into a stand-alone essay.)

I first noticed this back in February while speaking about Web trends to a Public Relations/Business Communications class at a local college. When I asked about Wikipedia, everyone who spoke expressed a clear message: Wikipedia, to them, was not so much a resource as it was a threat.

Multiple students reported they had been told by their instructors not to use it -- ever*. Some spoke of professors who routinely threatened to punish anyone caught using it. And even my host allowed that her attitude toward the online encyclopedia was less than charitable.

Last night, speaking to a group of high-school journalists, I got similar responses. In these instances, I detect not only scholarly skepticism, but something more. Something bordering on scorn.

Like the college students from last month, these high schoolers knew that anybody could edit Wikipedia, though none of them expressed any understanding about how the system functioned, what a wiki is, or how a community of editors becomes a self-correcting entity, etc.

It's a disconnect. I see Wikipedia as a way of thinking about information and virtual community. They see it as a free-for-all. Their teachers see it as anarchy.

I call this a backlash. Wikipedia came out of nowhere, fast, to become the largest encyclopedia in history. Some academics, who as a group are used to controlling such things, were horrified by the wiki concept -- and financially threatened by the open-source, free-info, non-profit model that keeps Wikipedia a living, growing document. Rather than checking it out further, a segment of academia appears to have united against it.

From a PR standpoint, the big blow came in December when John Seigenthaler wrote a widely publicized piece citing inaccuracies in a Wikipedia article on his life. The founding editorial director of USA Today called Wikipedia "a flawed and irresponsible research tool" in a column that reflected the attitude I've noticed among some academics -- that whatever else Wikipedia may be, it also is a sandbox for malcontents, anarchists and children who run around with scissors. In other words: Not For Serious Adults.

In the wake of the Seigenthaler column, big-name bloggers and technorati, including Dave Winer and Adam Curry, came out with their own criticisms. Making matters worse, Wikipedia founder (or, as the case may be, co-founder) Jimmy Wales got "caught" editing his Wikipedia bio and taking out references to his early collaborators.

Wikipedians got their say in the ensuing coverage, but from an outsider's perspective, it seemed like the Wikimedia Foundation -- a concept I dearly love -- was suddenly in public-relations damage control mode.

Here's my take:

The idea that Wikipedia is less accurate because it doesn't have top-down editorial control is, itself, inaccurate. A better question would be, When do we know it to be accurate? The whole concept of a collaborative information project is based on the idea that community collaboration will identify and correct errors -- in public. Traditional media also involves editing and fact-checking, but it does so before publication and without transparency. Yet traditional media routinely stumbles when it comes to correcting the errors that slip past those all-too-human pre-pub controls.

"Does Wikipedia have errors?" isn't a meaningful question, but "what errors will an individual Wikipedia entry contain in the snapshot of time that I see when I call up the entry?" is a question that actually takes us somewhere.

Wikipedia asks that we correct the errors we see -- and, unlike the popular stereotype of Wikipedia as an irresponsible Wild West of disinformation, Wikipedia as a process includes multiple feedback loops that address vandalism, inaccuracies, biased writing, etc. It assumes that people are adults.

The question, then, is not whether Wikipedia has editorial quality controls (it does), but whether those controls work fast enough.

That's an open-ended question (fast enough compared to what?), but I contend that a wiki-model encyclopedia will probably correct its errors far faster than a proprietary encyclopedia. My reasoning? Top-down, for-profit editorial control pays a few people to ride herd on a large range. It cannot mobilize as many corrective resources, as quickly, as the Wikipedia community can.

Plus, is Wikipedia really inaccurate? Again, accurate compared to what?

Nature decided to compare Wikipedia to Britannica, considered the Gold Standard of traditional encyclopedias. Its finding? In a survey of 42 science articles, Britannica was more accurate.

But how much more accurate? Not much. The Nature study found an average of four errors in its Wikipedia entries... compared to three errors, on average, in a Britannica entry.

The debate goes back and forth, with some Wikipedians contending that the average Wikipedia entry is 2.6 times longer than the average Britannica entry, then doing the math to produce a lower error rate. Yada yada yada. I don't care. Framing this as a competition between Wikipedia and Britannica misses the point.

The more telling comparison is between Wikipedia and Google, because when you consider how I've come to use Wikipedia, it's as an alternative to general web search engines. Wikipedia is just as fast, far more relevant and much more accurate in the information it returns. Viewed as a form of curated search, Wikipedia looks a lot less threatening.

In this sense, Wikipedia is a through-point, not a destination. And, ironically, this was exactly how my teachers told me I was supposed to use an encyclopedia Back In The Day.

Not only are we comparing Wikipedia to the wrong standard and failing to understand it as a process and a community, we're also missing the most valuable points of the accuracy debate by taking Wikipedia out of its natural context: the larger Web. Dave Winer has been cited by Wiki-haters for his criticisms, but that's far from the complete picture. Consider this Scripting News post from December, in which Winer addresses the Seigenthaler case and the larger ethical question of who-should-edit-what (emphasis added):
Ross Mayfield sees the pros and cons of editing your bio page on Wikipedia. Here's my take on it. No, you must not edit your bio page, or any page about a topic in which you have an interest. It's impossible to disclose that interest, so the poor reader has no idea how to credit what's on the page. This is the weakness of Wikipedia, in fact of all wiki. But his point about the knowledge you have about yourself is an important one. Imho, the obvious answer is that your page, on your site, edited only by you, should be linked to from the equivalent Wikipedia page, in a consistent and prominent way. Your review of a page about something you're involved in is important, but it must be clear to the reader that they are reading something that's interested. Ultimately, this combination of wiki and blogging is going to be the answer. It's how Jimmy Wales will be able to tell us he doesn't think the stuff on his Bomis site was porn and how his Ferrari cost less than most SUVs, and how Adam Curry can tell you all about himself and edit everyone else out. Now the question is, who is qualified to edit the Wikipedia page?
That's a great question, with multiple possible "correct" answers. But Dave Winer's perspective demonstrates how holistic thinking trumps simplistic, out-of-context analysis. Let's see Wikipedia for what it is, what it can be, how it fits into its environment, and encourage people to use it properly.

Exactly. So what if I can't cite Wikipedia the same way I would a static source? It's still immensely valuable to me.

Should we take what we find at Wikipedia at face value? No. Duh. But let's restate the question: Should we take ANY information we find, online or otherwise, at face value? Answers, please, on a post card.

Ultimately, the Wikipedia controversy, if it can be called that, is about how we feel about control. I know where I come down on that subject, and it's right where Jimmy Wales was when he spoke to USA Today in December: "'Any place where the general public is allowed to freely express their opinion without having any sort of prior approval from authority — it is dangerous,' Wales says. 'Free speech is dangerous. But it's also incredibly powerful and useful.'"

Amen.

(* March 8 editor's note: I've been thinking about the sentence where I wrote that some students said they had been told not to use Wikipedia "ever," and I feel I should clarify this statement. I wasn't taking notes, but the more I think about that sentence the more I worry that it is misleading. They were certainly told not to use Wikipedia in the limited sense of citing it in a footnote. And at least one student mentioned that a teacher had told him that it was OK to go to Wikipedia so long as he didn't use any information he found there. My interpretation of their comments doesn't change, but I think the wording in my original sentence overstates the level of explicit prohibition. Restated, it would be this: they're free to read Wikipedia if they choose to do so, but they are not to use it in their assignments. -- dc)

Thursday, March 02, 2006

The Katrina Tapes

What's really all that new in the AP Katrina video story?

In a word: Video.

The failure of the Katrina relief effort isn't news. Americans learned back in August and September that the government response to the Katrina disaster was inadequate. And though bias-warrior conservatives tend to blame the media for all negative perceptions of their champions, Katrina swept those arguments away like so many shacks in the 9th Ward.

Why?

In a word: Video.

It wasn't subtle liberal framing by CNN or CBS News or the NYT that sank Bush in September: It was video of the President saying "Brownie, you're doing a heckuva job." Plus video of the President trying to look Presidential by hugging two black storm "victims" at a fictional "aid station" on the Mississippi coast. Plus video of a passerby shouting "Go fuck yourself!" to Dick Cheney during a live news photo-op.

For all the media sturm und drang, the post-Katrina days were a period when the images the White House engineered to deliver its message just looked... phony. People might not have been able to put their finger on what was wrong, exactly, but it didn't take a rocket surgeon to figure out that the reality of the unfolding tragedy just didn't jibe with the official response.

But that was then. What's the big deal now about the video of these pre-landfall FEMA briefings?

It's not that we didn't have solid evidence that the federal response had been bungled. And we've had plenty of evidence that the White House had gone into bunker mode, refusing for months to cooperate with Capitol Hill investigators. We knew last week that the White House report on the Katrina response managed to point no fingers at the Oval Office. It's all there if you want to read it. But few do.

The facts in those stories have a fatal flaw: they're just words. Written words. And in the war of the written word, there is no end to the parsing and the framing and the sense that the real truth lies somewhere else, beyond some media curtain, obscured by partisan interests and secretive agendas.

Informed media consumers are aware that video is at least as easy to manipulate as words, and that pictures can, in fact, lie. But the power of the image is undeniable. Why else would the question of whether Bush did a photo op with Jack Abramoff take on such high-stakes importance? Even if such a photo recorded nothing more than a meaningless "Thanks for your support" moment, in political terms such an image represents a tremendous weapon for the President's opponents.

So this is the meaningful part of the Katrina briefing-video story: non-partisan people who see it just won't walk away with the impression that the President of the United States was all that involved or concerned. There's a hollowness to his promises of federal support. There's a visual difference in the urgency expressed by the emergency officials seated cheek-to-jowl around a conference table and the president, seated beside an advisor and a cameraman, alone in a room at the ranch where he was spending his vacation.

No amount of journalistic balancing can undo the impression that such a video presents.

Such impressions can be misleading, and so far, hammering on this point and blaming the media -- again -- seems to be the best the Right can do.

"WE'RE BACK TO HEARING ABOUT KATRINA, which is a pretty good sign the media is trying to gin up an other anti-Bush swarm," Glenn Reynolds wrote at Instapundit. "Katrina taught the media that if they all swarmed Bush at once they could do harm even if -- as turned out to be the case -- much of what they reported was outright false. I've noticed a lot more of that since. The Bush Administration is quite capable of making its own trouble with PR -- see the ports issue, for example -- but it's also quite clear that the media is doing this sort of thing for entirely partisan reasons."

Entirely partisan reasons? I think that entirely misses the point. The show we're watching could be titled "The Bureaucracy Strikes Back." The White House strategy has been to scapegoat its underlings. It just didn't figure that the underlings would be smart enough to tape the proceedings -- and keep copies.

John Hinderaker at Power Line plays lawyer tricks. We haven't seen the videos in their entirety. The clips were edited "in a way obviously intended to make President Bush and the administration look bad." Do the clips show the President misled the country? Hinderaker's answer sounds an awful lot like "It depends on what your definition of the word 'is' is."

Hinderaker parses with excruciating care the sourcing of AP phrases like "and Bush was worried too" while focusing enormous attention on the difference between "breaching" and "overtopping." Apparently, to Bush loyalists, the difference between one and the other proves that the media is bad and that Bush is blameless, although I can honestly say that after reading his arguments carefully I reached this conclusion: If one of my kids rationalized a failure of that magnitude with such threadbare word-play, I'd laugh while I whupped his sorry butt.

Anyway, none of this amounts to anything more than a temporary rhetorical fallback position for the partisan Right. When it's just words in play, more words can usually blunt their effect. But words can't undo the effect of images, as the Rodney King riots illustrated vividly in 1992.

There's still a lot of Bush administration tenure ahead of us, and for close observers, this may be little more than a footnote.

But to casual TV consumers, this looks an awful lot like that last, heavy straw.

Tuesday, February 28, 2006

Forget all that other stuff

Talk media with people who aren't in the media, and you'll figure out pretty quickly that the motives outsiders ascribe to us generally fail to connect with reality because their assumptions have one basic, fundamental flaw: They figure the process of newsgathering is somehow rational and deliberate.

Here's a much better picture, from the great Lenslinger: Mad Skills of a Veteran Photog.

(Crossposted @ Xark!)

Friday, February 17, 2006

Journalism from a software perspective

On Feb. 9, while reading up on the web framework Django, my eye gravitated toward an unfamiliar acronym in this sentence: “Django focuses on automating as much as possible and adhering to the DRY principle.”

So what’s DRY? To programmers, DRY means “Don’t Repeat Yourself,” and the link explaining the principle led eventually to this rather elegant statement: “Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.”
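If you don’t read code, here’s a toy illustration of the principle at work – a sketch of my own, with nothing borrowed from Django. The same “fact” (a sales-tax rate) is written out twice in the first version, then given a single authoritative home in the second:

    # Non-DRY: the sales-tax rate is hard-coded in two places. Update one,
    # forget the other, and the two copies of this "fact" silently disagree.
    def total_with_tax(subtotal):
        return subtotal * 1.065

    def tax_line(subtotal):
        return "Tax: $%.2f" % (subtotal * 0.065)

    # DRY: one authoritative representation of the knowledge, referenced
    # everywhere it's needed. Change it once and every use stays correct.
    SALES_TAX_RATE = 0.065

    def total_with_tax_dry(subtotal):
        return subtotal * (1 + SALES_TAX_RATE)

    def tax_line_dry(subtotal):
        return "Tax: $%.2f" % (subtotal * SALES_TAX_RATE)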

My study of content management systems returned me to the Django page, but within minutes I found myself drifting back to this simple-but-powerful concept: Express every individual bit of knowledge in a clear and authoritative manner. It whispered in my ear, tugged at my sleeve, told me there was more in play here than geek esoterica.

But duty called, so to move on I scribbled the principle on a piece of sketchpad paper and pinned it to the corkboard behind my desk with this question: How Could We Apply This to 21st Century Journalism?

Because to be blunt, modern journalism – not to mention the larger culture – is in desperate need of some clear and authoritative factual statements.

Where we are
One of the great ironies of modern life is how post-modern conservatives become when the topic turns to the media. The Bible, Adam Smith, warped timber – these are articles of faith, received wisdom. Conservatives don’t generally challenge such statements with much fervor, nor should we expect them to. The goal of conservatism isn’t the questioning of authority but the bolstering of it, typically against the critiques of the offensive, silly or malicious elements in society.

Yet when it comes to discussions of the media, even the most rock-ribbed of John Birchers turn downright existential.

Objectivity? Impossible! The very terms of the discussion render it so.

To 21st century American conservatives, any media claim of objectivity represents an overtly political act. What’s more, they say, the claim of journalistic objectivity is actually a partisan political act because – as any rational person can plainly see – the media is biased in favor of the Democratic Party.

Which leads us, ipso facto, to a surprisingly radical conclusion: Since the post-modern conservative critique has now eliminated even the possibility of journalistic objectivity, and since – as any rational person can plainly see – liberal media bias serves the Democratic Party, then the most essential media reform of the 21st century is the creation of a separate, subjective, distinctly partisan, pro-GOP press to compete with the old-line “mainstream media.”

This idea – that the antidote to a subjective-but-dishonest liberal media is an overtly subjective conservative media – isn’t new. FOX News and The Washington Times predate the current administration, as do numerous conservative press critics. What’s new are concepts that NYU journalism professor Jay Rosen calls “de-certification” and “rollback.” Both are now operative principles in both the Bush White House and the larger conservative movement.

De-certification rejects the notion that journalists have any unique standing to critically examine or publicly challenge the statements of political leaders. It identifies Big Journalism as just another special interest, and treats reporters as special pleaders. De-certification identifies press “spin” (coverage) as just another message in competition with the conservative message. It rejects the idea that reporters can be objective, or that critical coverage can be anything other than partisan.

Rollback is the implementation of de-certification. It is Scott McClellan repeating the same plainly non-responsive statement no matter the question. It is the release of the Dick Cheney hunting accident story to the local paper rather than the Washington press corps. It is President Bush pointing out that nobody elected the person asking him questions.

Though elements of this conservative critique may well be worth a larger discussion, it is their net effect that concerns me. Together, they portend a future in which the mass media will present Americans not just with competing viewpoints, but with competing facts. In the worst-case scenario, these polarized partisan presses will present factual claims that are mutually exclusive.

Which raises the question: Is America benefiting from its now-Balkanized mass media? Would subjectivity in mass media be helpful or harmful? And with critics on both sides of the political spectrum united in the belief that human objectivity is not possible, is there any way that those of us in the journalism business can steer our profession back toward something resembling a common frame of reference?

The stakes are visible in this study: In 2004, FOX News viewers were far more likely to vote for President Bush. These viewers were also far more likely to believe statements about Iraq that were factually untrue, and each of these inaccuracies negated Democratic critiques of the administration’s foreign policies.

It is now clear to me that simply appealing to the good faith of media consumers will never allow us to address this status quo. Reporters, editors and producers will never be able to regain objective credibility across partisan lines by making reforms in the way we report or package the news. Professionalism is good, but it won’t change the basic equation.

Two types of objectivity
Which is why in 2005 I began proposing that an optimistic vision of our future requires that journalists stop thinking about news as a craft and start thinking about news as an informational system.

I was covering science at the time, and you can’t do that very long without recognizing that objectivity wasn’t an impossibility for the biologists I covered – it was just another factor in their experiments. They controlled for it, and then they documented those controls for all to see. Not even Heisenberg’s Uncertainty Principle, the ultimate statement of observer-subjectivity, derails the scientific concept of objectivity.

Why? Because unlike journalistic objectivity, which proposes itself to be an artificial perspective, scientific objectivity is a documented process. A requirement of that process is that it be recorded clearly enough that findings are repeatable for all observers (in the case of laboratory experiments) or clearly controlled for the observer’s subjective perspective (field observation of a single event or series of events). When viewed from a distance, this process of objectivity varies for each individual discipline, but its philosophy is constant: Always be aware of the subjectivity of the observer, use agreed-upon standards, and show your work.

In other words, scientists have created a system of objectivity, and by abiding within its rules, civilization has flourished. Scientific objectivity allows a physicist in Oslo to derive a bit of knowledge that a physicist in Kyoto can apply to a larger experiment. While scientists do test each other’s findings, science does not re-invent wheels. This is why there is only one Uncertainty Principle – Heisenberg’s.

Compare this to modern journalism.

By our standards, if Al Gore took up physics and claimed he had derived an Uncertainty Principle, journalists leaving his press conference would be expected to call the White House for a response. The story announcing the Gore Uncertainty Principle (GUP) would likely point out that the Heritage Foundation has a competing Uncertainty Principle (HFUP), noting in passing that someone named Heisenberg had done similar work in the 1920s. Being journalistically objective, most versions of this story would report each of these claims as limited facts (the fact being that individuals had stated the claims) without attempting to evaluate those claims.

Along the way, we’d quote Gore saying why his GUP reaffirms the principles of participatory democracy, while a Heritage Foundation spokesman would opine about how the GUP gets it entirely backwards: the HFUP clearly proves that President Bush won both Florida in 2000 and Ohio in 2004.

A week later, a major media outlet might attempt to write a follow-up piece critically examining the claims, and if the reporter had any scientific expertise, this new story would likely conclude that Heisenberg’s Uncertainty Principle is the only one that matters, and that the partisan versions of this essential theory of quantum physics are, at best, irrelevant.

This story would be immediately assailed as biased, of course. Conservative viewers, watching their network, would reach one conclusion. Liberals another. And while this echo-chamber effect might be comforting for both groups, it’s hardly the prescription for creating an informed, constructive national debate on any subject.

Rethinking journalism
On December 9, 2005, I left this comment on a particularly contentious PressThink thread:

We need to create some kind of new information tool that helps us manage these situations, so that basic facts can be established and stipulated. If we don't trust the government and we don't trust the media and we don't trust each other, how can we get anywhere? We know how to build websites and blogs and news wires ... but how do (we) build trust in the 21st century?

Five days later I wrote a lengthy post (“21st century trust … the techno-geek way!”) at Xark! trying to answer my own question. And in early January, I actually proposed in another PressThink thread that journalists publicly evaluate their confidence in the factual content they were publishing.

The tricky part is that being explicit about confidence means editors would have to accept greater accountability. If I've overrated my 12-miners-alive story at a 7 and it reverses, I look pretty damned stupid. Then again, if I'm systematically underbidding my confidence to prevent being revealed as wrong later, I'm not doing much to build my credibility. You want an incentive for people to be candid and thorough, and I think this might provide it.

To be truly useful, such a system would need to be keyed to something, whether it's a number system or a color code or a bar graph or a slider. Whatever. A 5 rating should mean the same thing to the reader as to the editor. The beauty of the web is that editors don't have to redundantly explain this stuff in print -- rather, they can post the rating and know that anybody who isn't sure what it means can click and find out exactly what it means. And the more specific the better.
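To make that concrete, a published ratings key might look something like this – the numbers and wording below are strictly my own invention, a sketch of the kind of shared standard I mean:

    # A hypothetical shared confidence scale. The point is that a "5"
    # means exactly the same thing to the editor who assigns it and to
    # the reader who clicks through to this key.
    CONFIDENCE_SCALE = {
        9: "Multiple independent primary sources; documents in hand",
        7: "Two independent sources; core facts confirmed on the record",
        5: "Single credible source; not yet independently confirmed",
        3: "Secondhand account; confirmation still pending",
        1: "Unverified report; published because of urgent public interest",
    }

    def rating_label(rating):
        """Return the published definition behind a story's rating."""
        return CONFIDENCE_SCALE.get(rating, "Undefined rating")

    print(rating_label(7))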

Some found the idea interesting. Kinda. Sorta. Most didn’t. But even though some people I respect – namely Paul Lukasiak and Steve Lovelady – have rather graciously tried to tell me that I’ve gone quite overboard with such thinking, I have the sneaking suspicion that the problem with most proposed solutions for our current media malaise is that they don’t go far enough.

The other problem is that they’ve thrown out the baby with the bathwater when it comes to objectivity. Fine: Let’s junk journalistic objectivity and its Halfling brethren “news judgment” and “fairness.” But let’s not concede the intellectual ground to competing subjective visions without first exploring the possibilities of a more scientific form of objectivity. Not a particularly enlightened perspective or state of being, but a transparent process.

How it might work
Imagine for a moment that your next word processor came with an annoying “intelligent agent” feature that recognized any declarative statement of fact you ever wrote and then asked you to cite its definitive source. An incredible pain in the ass, yes.

But now imagine that, as a reader, every document you were ever asked to evaluate came to you as rich hypertext, with each summary fact transparently sourced all the way back to its original, definitive expression. Would you treat its claims differently than you would a document that arrived without that kind of depth behind it?

I’d wager you would. Sure, most writers cite sources, even if they don’t expressly name them. But are the sources definitive, or are they just dueling “facts” – on-the-public-record but never actually challenged or verified?

But back to our imagining. Anyone, given unlimited time and resources, could produce dry, boring, factual articles that are nevertheless elaborately festooned with hypertext-footnotes. Someone with zero understanding of how the modern mediascape works might even prescribe this as a solution for what ails us.

Realistically though, most reporters and editors will never have the time or resources to produce such exhaustive fact-check formatting on deadline. Even with modern Web search engines, checking a relatively simple statement back to its “single, unambiguous, authoritative representation” is an exceedingly time-consuming and tedious task. Allow me to demonstrate:

“North Charleston, despite being one of the youngest cities in the state, is also among the largest.”

Now. Time me.

Two minutes: “As a means of bringing government closer to the people, an incorporation referendum was held on April 27, 1971. On June 12, 1972, after a series of legal battles, the South Carolina Supreme Court upheld the referendum results and North Charleston became a city.” (http://www.northcharleston.org/AboutUs/History.aspx)

Four minutes: “Incorporated in 1972, it is South Carolina's youngest city of any size.” (http://www.northcharleston.org/AboutUs/LocationMap.aspx).

Seven minutes: 2000 US Census population (via http://factfinder.census.gov/servlet/GCTTable?_bm=n&_lang=en&mt_name=DEC_2000_PL_U_GCTPL_ST7&format=ST-7&_box_head_nbr=GCT-PL&ds_name=DEC_2000_PL_U&geo_id=04000US45): 79,641

Eight minutes: Charleston, 96,650; Columbia, 116,278.

Nine minutes. None other found.

So there we are: Almost 10 minutes of searching for a basically benign statement. The sources look pretty good, too – but they still aren’t anywhere close to the single, unambiguous, authoritative representations that the DRY principle calls for.

For instance, when the North Charleston city website calls itself “South Carolina’s youngest city of any size,” is that independent of the term “town”? It certainly doesn’t take into account the municipal soap opera that has been the recent history of the Town of James Island, which has been incorporated and disbanded twice in the last decade (James Island is currently unincorporated, which wouldn’t precisely invalidate this statement of fact). Beyond that, can the city of North Charleston be trusted to provide authoritative statements about itself?

Neither is the information up-to-date. There’s a 2003 census estimate that I found that shows North Charleston with roughly 81,500 residents… but that’s at least three years old now, and it’s an estimate. It doesn’t change the statement I made, but now I’m foundering. Which one would I pick as the authoritative representation of the original bit of knowledge?

Given this quick searching, perhaps I would edit my statement: “North Charleston, despite being the youngest city in the state, is also its third-largest.” The sentence is actually three factual statements: 1. North Charleston is the most recently incorporated municipality in South Carolina; 2. North Charleston’s population is estimated at roughly 81,500 people; 3. Only two other municipalities (Columbia and Charleston) in SC have larger populations. So my searching has marginally strengthened my statement and the hypertext footnoting may have improved your willingness to believe its veracity.

And yet in no way have I met the standards of the DRY Principle. I’ve wasted valuable time bolstering a sentence that – even when upgraded – makes the same point specifically that it originally made generally. And the items to which I point as my proof lack truly authoritative status. No doubt I’ll be fielding pointless phone calls from miffed James Islanders, who interpret the statement differently and want to argue.

Even under the most cursory examination, my DRY experiment is a tremendous timewasting flop.

All of which demonstrates why a real DRY factbase would be tremendously valuable.

The trouble with search
Google is far from the definitive source most people imagine it to be. Just try updating your website and Googling the changes for proof. In fact, no web search engine can meet this standard, because the people writing the search algorithms aren’t the same people managing the data. So while web search points us toward facts, it cannot, as a system, create truly authoritative factual statements.

We need another tool. In fact, we need several of them.

  1. We need a curated factbase. From raw data like census reports to statements contained in magazine articles, we need a database of primary factual statements that have been sourced and verified according to transparent and universally recognized standards.
  2. We need a system by which new primary factual statements may be reviewed and added to the factbase.
  3. We need a system by which all facts within the database can be reviewed and updated automatically. Such a system would also connect changes of primary fact to secondary statements such as “North Charleston is the state’s third-largest city.”

And then there’s No. 4:

  4. We need an intelligent word processing tool that automatically relates each factual claim to its original, unambiguous, authoritative statement.

No. 4 is the idea that transports DRY Principle Journalism from the impractical to the sublime. Why? Because relevant factual statements tend to become pyramids over time. Down at the bottom? Census figures. Incorporation records. Later comes a statement, like mine, that combines census figures and incorporation records. Eventually, you reach statements like this one: “Along with its relative youth and rapid growth comes crime. North Charleston’s violent crime rate was among the highest in the United States in 2005 (ranked No. 79 for US municipalities).” Facts correlate, interrelate, expand and contrast.

If I write using DRY-principle facts, then each level of complexity I ascend becomes its own DRY-principle statement.
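In data terms, that pyramid might look something like this – a minimal sketch, with field names and record IDs I’ve made up for illustration (the facts themselves come from my North Charleston exercise above):

    from dataclasses import dataclass, field

    @dataclass
    class Fact:
        """One entry in a hypothetical DRY factbase."""
        statement: str     # the single, authoritative representation
        source: str        # where and how it was verified
        as_of: str         # facts like population counts expire
        derived_from: list = field(default_factory=list)  # supporting fact IDs

    factbase = {
        # Bottom of the pyramid: primary records.
        "nchas-incorporated": Fact(
            "North Charleston became a city on June 12, 1972.",
            "S.C. Supreme Court ruling; city records", "1972-06-12"),
        "nchas-pop-2000": Fact(
            "North Charleston's 2000 Census population was 79,641.",
            "U.S. Census Bureau", "2000-04-01"),
        # One level up: a secondary statement resting on the record below it.
        "nchas-third-largest": Fact(
            "North Charleston is South Carolina's third-largest city.",
            "derived", "2000-04-01",
            derived_from=["nchas-pop-2000"]),
    }

When a primary record changes, the system walks the derived_from links upward and flags every secondary statement built on top of it – which is exactly the automatic-update requirement in No. 3 above.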

With the right tools linking the DRY factbase to my word processor, I’d know if my statement was generally correct, generally incorrect, or questionable. As I write, the built-in analyzer would search the factbase for relevant facts, perhaps listing them in a scrolling window beside my word processing field. At the end of the article, I’d probably edit by scanning back over the cited links generated by my intelligent agent, check to see if there were any obvious ways to improve the factual rigor of my article, and then press save.

Of course, if I’m reporting, my job is to generate new facts. How might DRY help me there?

Well, for starters it might let me know whether my subject is actually news – or just news to me. It would guard against me making factual and context errors. Perhaps we could even train it to recognize and challenge certain types of logical fallacies or misleading rhetorical devices.

But the most important role such an agent might play for a reporter is that it would recognize new, unsupported factual statements, note their cited sources, and submit them to the factbase review process.

Memory and power
On Sept. 1, 2005, with his administration beginning to come under fire for its response to the Katrina disaster, President Bush told reporters “I don’t think anybody anticipated the breach of the levees.”

With an intelligent agent dynamically connecting the DRY factbase to their word processors, reporters would have known this statement to be factually incorrect before they had finished typing the closing quotation mark. Why? Because multiple previous articles and disaster exercises had done exactly that – predicting with great accuracy the impact of a Katrina-like storm.

Yes, we all need an ever-expanding database of original-source facts, stated clearly and authoritatively. But the trick to making such a thing useful would be to embed in our writing tools the kinds of pattern-seeking software that first recognizes declarative grammar and then applies the words as search terms.

Positive correlations might stream into the word processor’s “hits” window as green supporting citations. But contradictory facts – like FEMA’s previous disaster exercises – would flash red.
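Stripped down to a cartoon, the loop might look like this – the keyword matching is laughably naive and every name in it is my own invention, but it shows the shape of the thing: recognize a declarative sentence, query the factbase, flag the result:

    import re

    # Toy factbase: keyword patterns mapped to a status and a citation.
    # A real system would need semantic matching, not keyword lookup.
    FACTBASE = [
        ("levee breach anticipated", "contradicted",
         "Pre-Katrina disaster exercises modeled catastrophic flooding "
         "in New Orleans."),
    ]

    def declarative_sentences(text):
        """Crude stand-in for grammar recognition: keep the sentences
        that end in a period rather than a question or exclamation."""
        return [s.strip() for s in re.split(r"(?<=[.?!])\s+", text)
                if s.strip().endswith(".")]

    def check(sentence):
        """Return (status, citation) pairs for every factbase entry
        whose keywords all appear in the sentence."""
        lowered = sentence.lower()
        return [(status, cite) for keywords, status, cite in FACTBASE
                if all(word in lowered for word in keywords.split())]

    copy = "The president said nobody anticipated the levee breach."
    for s in declarative_sentences(copy):
        for status, cite in check(s):
            # "contradicted" is the red flash; "supported" would be green.
            print("[%s] %s -> %s" % (status.upper(), s, cite))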

At the very least, a reporter following these protocols would know that the President’s statement was less than rigorously true. So too would anyone following the story at home. What’s the point of building the world’s greatest factual reference and not making it public to the world?

The President should have access to the factbase as well – if only to get his story straight before he goes out to meet the press.

And if the President wishes to contend that the factbase is wrong, well – we should be able to build feedback mechanisms that allow that, as well.

Administration
So, imagine for a moment that our discovery informatics wizards could develop the right interface. And imagine that our systems geniuses could invent the right storage, cross-referencing and retrieval processes. Imagine that our best archivists and data specialists could create a transparent system for batch-converting the huge volume of new data that would soon flood into the factbase. Imagine that the wise among us could create fair and practical ways of making sure the factbase stays accurate.

OK then: How would we pay for it?

One answer might be that the nation’s media outlets could work cooperatively on such a system, much the same way that competitors work together to make the Associated Press. The project is too large for any single participant, but if each worked together, each would benefit.

Colleges and universities? Sure. Research and development labs? You bet. Anyone with an interest in the expansion and vetting of information could benefit.

Governments?

Well, that’s another question.

Regardless of who would pay and how much such a system would cost, I see nothing in what I propose here that exceeds the theoretical capabilities of existing or developing technologies. And if science is any guide, then the value of having solid factual information at the world’s disposal – without having to independently verify each individual bit of knowledge – would be a tremendous economic multiplier.

I believe a system like this will be within our reach within a decade.

Would it be a magic bullet? No. So much of what passes for fact is actually only “facty.” How much of our political reality is based on guesses, attitudes, opinions? A DRY factbase and standards-based journalism wouldn’t change that.

But creating a standard repository of “single, unambiguous, authoritative representations” of knowledge would be a transformative technology both for journalism and society. Not because it would expand knowledge – but because it would allow the creation of a system of mutually agreed-upon, standards-based journalism and communication.

Some people would choose to stay outside such a system. They would challenge its validity, appeal to fear, appeal to divine authority. They could appeal to “truthiness” just as they do today.

But by making such a system open-source, and by inviting everyone to participate in monitoring it, you would move truthiness from the heart of the culture to its periphery.

People like me will still write to persuade. We will still argue over which facts are relevant.

But no longer will you have to trust me to see the relative value in what I have to say.

And that would be the biggest improvement in communication I could ever imagine.

Wednesday, December 21, 2005

NYT gots some 'splainin' to do...

I've been waiting, but it's clearly not going to happen. Not on its own accord, anyway. Not out of a sense of transparency or ethics or the public's right to know.

The brass at The New York Times thinks its decision to hold a story about the White House's warrantless domestic spying program for more than a year is none of our business. So they're not talking.

And, not to put too fine a point on things, that's bullshit.

Gabriel Sherman at the New York Observer had a piece about it today, along with this quote from NYT Executive Editor Bill Keller:

“I’m not going to talk about the back story to the story,” Mr. Keller said by phone on Dec. 20. “Maybe another time and another subject.”


Say what?

Read about it. Read about how Keller and Publisher Arthur O. Sulzberger Jr. and Washington Bureau Chief Phil Taubman were summoned to the Oval Office on Dec. 6 by a president who didn't want them to publish the story. They published it anyway -- bully for them -- but this raises a serious question: If the story was so significant that three executives from the Times were willing to buck the most powerful man on the planet to print it this month, then why didn't they publish it much, much earlier?

As in before the 2004 election. That much earlier.

According to the NYO piece, reporter James Risen was prepared to write the story 14 months ago. When Risen's attempts to get the story approved were unsuccessful, he went on book leave "and his piece was shelved and regarded as dead, according to a Times source," Sherman said.

Was the Times forced to publish the story by the upcoming January release of Risen's book? Was it motivated by partisan bias (a popular charge) to publish right before a key congressional vote on renewing The Patriot Act? Were there substantial holes in Risen's original reporting? Was it delayed by lawyers? Or by government influence? What issues were in play on this story behind closed doors at the Times?

But Keller, Sulzberger, Taubman, Risen, a second reporter assigned to the story and Managing Editor Jill Abramson have all since declined to comment. The wagons at the Times are circled.

It's all so very Old School. The adults got together behind closed doors and decided what they were going to do and say, and now all we get is the party line. To both the White House and the Times, it seems, the rest of us are children. Their message to us? You can't handle the truth.

The Times owes us an explanation, today. Not some other day, not some other subject, not whenever it suits Keller. Now. Come out and write a candid description of what went into this agonizingly slow decision, let us see behind the curtain and judge the credibility and motivation of all parties on our own. That's New School.

I've been a part of these newsroom battles, and they're ugly. Candor about who said what and why is painful, awkward and potentially career-threatening. But what's good for the goose is good for the gander.

Did these people learn nothing from the Judith Miller disaster? Apparently not.

18:43 Update: A far better account of the story-behind-the-story at the NYT, written by reporter James Rainey, was published yesterday in The Los Angeles Times. Excerpt:
In a statement over the weekend, Keller said the paper printed the story after more reporting, which uncovered additional "concerns and misgivings" about the surveillance and also persuaded Times editors that they could proceed and "not expose any intelligence-gathering methods or capabilities that are not already on the public record."
The initial Times statements did not say that the paper's internal debate began before the Nov. 2, 2004, presidential election — in which Iraq and national security questions loomed large — or make any reference to Risen's book, due out Jan. 16.
But two journalists, who declined to be identified, said that editors at the paper were actively considering running the story about the wiretaps before Bush's November showdown with Democratic Sen. John F. Kerry of Massachusetts.

Top editors at the paper eventually decided to hold the story. But the discussion was renewed after the election, with Risen and the story's coauthor, reporter Eric Lichtblau, joining some of the paper's editors in pushing for publication, according to the sources, who said they did not want to be identified because the Times had designated only Keller and a spokeswoman to address the matter.
Dec. 22 Update: Here's Evan Derkacz blogging the topic at AlterNet.

Tuesday, November 15, 2005

An online reading list for old and new media

I want to offer the folks in my newsroom a useful reading list on topics related to journalism, emerging media, etc. But rather than build it as a text document, I'm going to build it here.

So this is the goal: Instead of compiling an exhaustive, show-off list of web resources related to journalism, media, new media, convergence, blah-blah-blah, I'm going to put together an edited list: Not everybody. Not even all the big somebodies. Just sites I think might be useful to journalists who want to join the conversation, plus notes.

Here's what I came up with this afternoon (Ed note: updated on Nov. 30):

Big ideas, discussion and criticism
The Mac Daddy of all media/new media sites: Jay Rosen's PressThink. Check it at least once a day, even if there's not a new post, because the comments and discussions can be great. Rosen is currently off writing a book, so his readers are keeping up the site ... even though Jay is still guiding things from behind the scenes. (RSS)

Jeff Jarvis' Buzzmachine is considered by many to be the perfect companion/counterpoint to PressThink. I disagree. I think you read Jarvis for Jarvis and Rosen for Rosen. Rosen has a better comments section.

Professor Andrew Cline has a blog and a podcast and calls it the Rhetorica Network. Cline is great at cutting through the crap on bias claims. Does bias exist? You betcha. Only it's more complex than most people think.

First Draft by Tim Porter is a very Rosen-esque site, in that the concepts are big and the context runs deep. The big difference is that Porter is a newsroom veteran, while Rosen is an academic. Porter isn't trying to bury newspapers -- he wants to save them. But he also understands that the real goal is improving journalism, no matter where it appears.

Dan Gillmor used to be the main man when it came to blogging about grassroots journalism. He even wrote the book on the subject: We The Media. These days he's involved in a local San Francisco project, so his blog isn't quite as useful to people outside the Bay Area as it used to be.

CJR Daily: Real-time media analysis from the Columbia Journalism Review. CJR Daily is the lair of Steve Lovelady, one of the great personalities and thinkers from PressThink.

MediaChannel is a broad, deep resource. The MediaChannel's Danny Schechter blogs as The News Dissector, and you can get him every morning in your e-mail if you subscribe.

Steve Yelvington works for Morris in Augusta and was the brains behind the Bluffton Today model. He's awfully damned smart. Don't-miss link: Ten years in new media: Looking back, looking forward. (Special recommendation from Andy Rhinehart)

Morph is a collection of writings, with comments, at The MediaCenter. (RSS)

Why should we limit our discussions of new media to old media forms? EPIC 2014 is an eight-minute Flash animation by Robin Sloan and Matt Thompson that will orient you as well as any ponderous feed you might add to your browser.

Technology, culture, geekery, etc.
Slashdot: News for Nerds. Stuff that Matters is one of the best-read sites on the net, but if you're a journalist, you've probably never heard of it. (RSS)

Dave Winer's essential proto-blog, The Scripting News (RSS available, but not recommended, and isn't that ironic considering that Dave is the RSS Daddy?) probably has more daily readers than your newspaper does. Chew on that for a while.

The business
Digital Deliverance is a media business blog by Vin Crosbie. Crosbie also blogs over at the Corante group collaboration Rebuilding Media. He thinks big thoughts, but they're generally industry thoughts, not journalism-practitioner thoughts. (Special recommendation from Andy Rhinehart)

Lost Remote TV blog is useful to print journalists trying to get the new media world, in which you have to grasp multiple media, not just newsprint. (Special recommendation from Andy Rhinehart)

Romenesko is the place to get the buzz about the journalism business. It's more a gossip column/newsletter for journalists than a serious discussion of Big-J Journalism (although this is where you get the news that fuels those discussions). It's part of the Poynter Institute, which also offers the useful E-Media Tidbits, which is nice to have as an e-mail subscription.

Zeitgeist
Memeorandum: The Newfangled News Tangle (RSS) is a way of tracking what stories are being discussed. I keep it as both a static bookmark and as an RSS bookmark, and if you check it out you'll understand why. The static bookmark gives me more info about the items. The RSS bookmark makes for faster scanning.

For more meme tracking with strange little graphics that mean something, try The Daypop Top 40 (RSS available, but not recommended).

Blogdex calls itself The Weblog Diffusion Index. (Keep the RSS and static bookmarks side by side in your favorites)

The Hotline's Blogometer might just be the one thing I'd read every day if I could only read one thing. This is the best digest of blogosphere comment I've ever found. New one posted every business day at noon.

Blogpulse has multiple Zeitgeist/meme-tracking tools. I don't use them, but some people will love 'em because they're so easily customized.

Technorati is the standard for blog searching and its basic search page gives you the Top 10 search terms for the past hour. So if you wanna write about something that people are talking about, ya go to Technorati...

Newsmap gives you a real-time visual grok of the global info stream from a country-by-country perspective. Size, color, x-and-y coordinates -- everything means something at Newsmap.

Media watchdogs
Liberal watchdog: Media Matters for America. (RSS)

Conservative watchdog: Media Research Center.

Newspaper blogs/sites
The Times-Picayune became a virtual newspaper the day Katrina hit, because so much of its readership was literally scattered to the winds. Today, its NOLA.com website has elements that might make it the world's biggest blog.

John Robinson's The Editor's Log. Here's the editor of a mid-size metro daily walking the walk for transparent, user-focused 21st-century journalism.

The (Greensboro, N.C.) News and Record's staff blog and reader-writer blog index page is called Town Square. Notice how the reporters who blog here are blogging as an extension of their beats, not as opinion columnists. Unless, of course, they happen to be opinion columnists.

rtptv.com is a different kind of site. It's the online component of the newsprint RTP Tech Journal, but rtptv.com manages to be innovative in all sorts of ways. I like the idea behind its frontpage design. I like the way they use video. I like the way they structure and organize content.

GoUpstate is the website for The (Spartanburg, SC) Herald-Journal. It's run by Andy Rhinehart, a former print reporter who taught himself HTML back in the 1990s. GoUpstate works in ways many larger newspaper sites don't, and despite being the website's only employee, Andy still finds time to innovate. Consider: GoUpstate has been streaming live audio coverage of local high school football games for years, while other papers in the state struggle to post high school scores before Saturday morning. Note how the homepage design is optimized to fit your computer screen.

Bluffton Today is a significant experiment in combining online and print products. Lots of web-savvy thinking here, and bold uses of user-generated content. It would belong under the hyperlocal header, but it's corporate, not independent.

The Knoxville News Sentinel site (KnoxNews.com) is one of the better ones out there, at least as far as design goes.

Will Bunch is a newspaper guy who runs the full-service blog Attytood in Philadelphia. It wasn't on my list at the beginning, but after I went back to it on a recommendation, it's high on the list now. (Special recommendation from Paul Lukasiak.)

The Cincinnati Post has a blog for its photo staff that offers a simple and attractive concept: Put up a shot, write about it, then let the users comment. Good photos provoke strong emotions, so this is a natural fit. (Special recommendation from Grace Beahm.)

Metablogging
Ed Cone, who could write the book on metablogging, is best known for his Word Up blog, which everybody just calls "Ed's blog." He also blogs about ACC basketball, among other topics. (RSS) I could point out other metablogs, but in the spirit of the original idea, I'll stop here. Just read Ed.

Hyperlocal, independent news blogs/sites
The New Haven Independent was founded by a long-time reporter who spotted something I had written at PressThink and decided to give journalism another try -- on his terms. I think it's a great site, naturally.

Hypergene is a participatory journalism blog, with plenty of how-to stuff. (Special recommendation from Andy Rhinehart)

If you live in Watertown, Mass., and you care about local events and have a computer and a sense of humor, you read H2OTown. It's written by Lisa Williams.

I seldom get around to reading this, but check out Essex County, New Jersey's The Barista of Bloomfield Avenue by Debbie Galant and Liz George. This is hyperlocal news with style and pop and personality. It's a nice bookend for H2OTown, too.

Greensboro101 is Roch Smith's community blog portal. Between Roch and Ed and The News and Record, Greensboro, NC, had everything it needed to become the capital city of Blog Nation.

Craig's List could have been filed in all sorts of places on this list, but I'm putting it here because it functions as a useful connection to your community. Those of you in the newsprint business should be paying particular attention, because Craig's List and its unborn cousins are far bigger threats to your traditional business model than flashy start-ups like Pajamas Media will ever be. Craig's List is where you go to find a job, a rug, a roommate or a ride, but you can also rant about what bugs you, write a love poem to the woman who smiled at you as her train left the station, or book a multi-partner neighborhood sex party for the weekend.

Usability, web design and site architecture
Jay Small works for Scripps-Howard and blogs at Small Initiatives. (Special recommendation from Andy Rhinehart)

Jeffrey Zeldman is a web designer and author who blogs at Zeldman.com. (Special recommendation from Andy Rhinehart)

Steve Krug wrote a great book about website usability (Don't Make Me Think). There's more at his site.

Misc.
Chris Nolan is a recovering reporter who morphed into a techie and now blogs with a sharp, cold eye.

Wikipedia belongs on any new media list for a couple of reasons. In the first place, if you need a reference for new media developments and their related technical terms, or if you're looking for information on the innovators in the field, Wikipedia is probably the best place on Earth to find quick answers. Secondly, it's the largest encyclopedia ever created, a feat accomplished with only two paid employees. But I think it belongs here for a third reason: Wikipedia is a model of the voluntary cooperation that is an emerging theme in Internet culture. This tends to freak out business types, who can't imagine such behavior on such a broad scale. Like it or not, wiki-esque cooperation and community are going to be a feature of future commerce, so you'd better adapt your thinking to include them.

(Nov. 21 update: Wrote a more descriptive header for the first category; added the link to the EPIC 2014 Flash animation; added Craig's List and Wikipedia; fixed the spelling of Jay Small's name, which has only one "S" in it; fixed the link to CJR; after going back and reading it more thoroughly than I have in the past, I added Attytood to the list on the recommendation of investigative reporter Paul Lukasiak. Thanks for the other recommendations so far -- I'll have to take some time to check them out before deciding whether or not to add them.)

(Nov. 22: Took the non-existent "e" off Roch Smith's name. Thank you, Anna H.)

(Nov. 30th: Fixed style and bad-writing glitches, fixed fixes I missed, added the interesting photoblog suggested by Grace Beahm. I've gotten myself an in-house website for the newsroom, so it's about time to post this stuff for my brethren. Thanks to everyone for your feedback and help, both on- and off-blog. --dc)

Monday, October 17, 2005

The New Media Food Chain

Anybody who has followed Jay Rosen's recent PressThink coverage of the Judith Miller debacle at The New York Times has probably noticed a change in the color of the sky this week. Some future historian will likely declare the Miller case a milestone in the development of global networked media, concluding with 20-20 hindsight that this was the week when we entered a new world.

In the Old World, the press and its superset, The Media, covered our institutions. When The Media became part of the story, some subset of The Media would examine that role and report on it.

In the New World, The Media still covers our institutions, but it no longer covers itself. That function has now been assumed by The Blogosphere. Permanently.

This is a natural phenomenon, because coverage of our shape-shifting, hydra-headed Media practically demands limitless perspective. No single observer can see the whole of it. But The Blogosphere is the totality of many observers. And while The Media is far better equipped to cover the world than individual citizens are, anyone with a TV and access to Google can cover The Media. Consequently, The Media covers tsunami recovery efforts, while The Blogosphere covers that coverage -- sometimes including unfiltered reports from bloggers on the scene. Is it accurate? Misleading? Does it offer the proper context?

While we have witnessed this phenomenon previously, the Miller story is the best example so far. Print-only readers simply do not have the same grasp of this complex tale as do those who read the comments and threads at places like PressThink, CJR, BTC News, Joho, etc.

Not only is The New York Times unable to cover itself in this instance, but other Old World publications seem to be struggling as well. They are bound by rules and conventions, friendships and rivalries, by "professional courtesy," and -- in some cases, no doubt -- by complicity.

In this limited sense, The Blogosphere has now transcended The Media. This is not to say that bloggers are more powerful than the TV news networks and big dailies (yet), but there is a comparison to be made here to a relationship that we understand far better: Media does not control government, but because it has the power to establish the narrative for government actions, media influences government.

To understand the New World, move one link up the new media food chain and look back. The Blogosphere does not control The Media, but because it has the power to establish the narrative for Media actions, the Blogosphere influences Media.

A scientist, looking at that event, would say that The Blogosphere is able to do this because it is larger and more complex than The Media. In every sense of the word, The Media is now the subject of The Blogosphere. It has "gone meta."

Let us pause and recognize the historic significance of this moment. We are democratizing power and changing the culture in ways few people have even imagined. Next step: Let's help those people imagine it.

Tuesday, October 04, 2005

The Media "singularity"

Editor's note: When one of my posts over at Xark! (a criticism of Anderson Cooper called Enough with the posing) initiated an interesting back-and-forth about media credibility and objectivity, it prompted me to write a long comment. I'm cross-posting it here because, as I read over it on the page, it occurred to me that I had inadvertently described a state of media singularity -- an evolutionary step in human consciousness.
The discussion about credibility/objectivity/etc. is a worthy one, but my point here was more basic: I don't like the acting, the dramatis personae, the fake cinéma vérité. I think Geraldo has done it for years, and it's laughable, but when I watched Anderson Cooper do it, I found it disturbing.

One of the best things I read every day is a MediaChannel.org e-mail called "Media Savvy: A daily update on media and political matters," which has the effect of making me a better receiver. From an informed position, everything has value -- Hannity, Limbaugh, Franken, PrisonPlanet, Stewart.

But what I notice about myself is that I assume I'm capable of watching all this and sorting it out in meaningful ways, while people who tend to get all their news from one source or another lack that perspective. So an important question becomes: "Am I right about that?" And if I am, what (if anything) should we do about it? Does it require any action more specific than identification and discussion?

These days I write a great deal about biology, and here's a lesson from the life sciences: diversity is the sign of a healthy ecosystem. Taken as a whole, our mediasphere is more diverse than ever, but the real issue is, what about people who self-select a media monoculture? How do we re-engage them?

And this is where I think Janet is headed in the right direction: The spirit of the new media age is niche. The spirit of the old was One-Size-Fits-All. I think that when we fight over MSM coverage today, the unspoken goal is actually control over the normative power that Big Media represented in the One-Size-Fits-All Era. Conservatives aren't generally angry at bloggers who write opinionated pieces favoring homosexual marriage, but an AP story that takes no stand yet has the effect of making gay unions look normal drives them nuts.

Janet says that a new media will emerge, and I agree. I think we're actually making it, right now, right here, at this moment. The old model is top-down, normative, restrictive, authoritarian. The new model is sideways-distributed, group-forming and based (in the loose sense) on merit rather than authority.

It's hard to imagine this now, but it will become easier once we build the tools that give individual users more direct control over information. By tools I mean the tools of discovery informatics, neural networks, intelligent agents: thinking tools, pattern-recognition tools, aggregators, quantifiers, connectors.
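(Another geek aside, to make "aggregators, quantifiers, connectors" a little less abstract. Here's a toy Python sketch of the simplest possible quantifier: treat every outbound link in every post you subscribe to as a vote, and let the vote count do a primitive editor's job. This is nobody's product -- the feed addresses are placeholders, and the third-party feedparser library is an assumed dependency -- just an illustration of bottom-up, machine-shaped editing.)

    # Toy "machine editor": rank stories by how many subscribed feeds cite them.
    # Illustration only -- feed URLs are placeholders; feedparser is assumed.
    import re
    from collections import Counter
    import feedparser

    FEEDS = [
        "http://example.com/blog-a/rss.xml",   # placeholder addresses
        "http://example.com/blog-b/rss.xml",
    ]

    votes = Counter()
    for url in FEEDS:
        for entry in feedparser.parse(url).entries:
            html = entry.get("summary", "")
            # every outbound link in a post is a vote for that story
            for target in re.findall(r'href="([^"]+)"', html):
                votes[target] += 1

    # The stories the community converged on, most-cited first:
    for target, count in votes.most_common(10):
        print(count, target)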

Today a blog is an individual neuron in a holographic consciousness that isn't yet fully self-conscious, something that allows us access to the greatest problem-solving technology ever invented (community).

In the future, a "blog" will be part of an aggregate, measured, fluctuating vox populi, and the back-and-forth flow of information will be ordered and shaped not by editors and producers, but by machines.

The human factor doesn't disappear in such a system -- it just moves to doing the things that humans do best: Asking questions, sharing experiences, considering options, etc.

That's the optimistic view. The pessimistic future is FOX Populi, the manipulation of small media by Big Media in a monocultural hierarchy. You can choose left or right or "phony centrism" (left and right will both claim that the "objective" journalists are all secretly working for the other side), and the culture will continue to polarize.

But we're not playing on that level right now. Today, blogs and VODcasts are just "cool," particularly with the demographic that forms the core audience for Anderson Cooper 360. When Cooper steps out of his news character and steps into his romantic citizen-journalist character, he is self-consciously trying to be two contradictory things at once. Maybe that's a sign of genius. And maybe it's just opportunistic and shallow.

Friday, September 23, 2005

The Intelligence Briefing model of journalism

Posted today at PressThink in reference to a discussion of the NYT's TimesSelect paywall:


What's valuable today? Information that comes with a high degree of confidence and carries predictive power.

What's parsley? Politicized opinion, infotainment, stenographic reporting and "analysis" of the obvious.

I think we are in the middle of a paradigm shift that will divide information and commentary into two basic categories: 1. Basic, "unwarranteed" communication, which will continue to be too cheap to meter; 2. Value-added information, which will abandon our Old School value of "fairness" for a model based on the daily intelligence briefing.

When we talk about "objectivity," we tend to talk about its limits. We don't tend to talk about its value. When we talk about commentary, we talk about its slant. We don't tend to talk about its perceptiveness. Our current frame of reference is a newspaper/broadcast model that is based on certain assumptions about "gatekeeper functions," "credibility," "balance," and the mass audience.

When you adopt an intelligence agency perspective, the information gatherer and the information analyst are working for a specific end user, not a general, passive audience.

This is a radically different relationship. Your loyalty is to your subscriber, not to your sources, not to your political friends. Your value -- your continued employment, for that matter -- is attached to the quality and utility of your information and your insights.

People will pay for such content, and the networked media makes it possible for more people to access such services. These are, ultimately, the "editors" described in the EPIC 2014 animation.

Will people pay for TimesSelect? Not unless it has this function.

Thursday, September 01, 2005

Getting ahead of disaster

The problem with a disaster like Katrina is that it is literally too large and too profound for the average person to wrap their brain around. Consequently, the media now resembles a bunch of blind men describing an elephant, only we're doing it around the clock.

Making matters worse, in our rush to catch up with events, in our need to provide "hurricane porn" 24/7, we've overlooked another important part of our job as journalists: Probing for meaning.

The unplugging of New Orleans from the American economy -- and the absorption of at least half a million long-term refugees -- promises to be one of the most transformative events of the early 21st century. The implications of this mind-boggling task will affect every American in hundreds of ways, large and small.

If there was ever a time for a disaster wiki ... if there was ever a call for the smartest people in the country to get in communication and start comparing notes ... this might be it.

When considered in the light of an American moment that was already feeling rather precarious, the ongoing disaster in the Gulf represents an enormous threat to our way of life. Traditions, institutions, relationships and expectations that made perfect sense on Sunday are now either history or utterly uncertain. And we're not thinking about them.

A year from now, we will look back at these days and say "If we had only thought of X."

It's our job as journalists to start thinking about X today. And one way to do that is to start asking everyone we know -- and many people we don't -- "What might X be?"

P.S.: Here's the e-mail I just sent out to about 90 people, many of them local, many of them spread out across the country:
Dear all:

I am casting as wide a net as I can today, trying to get as many thoughtful responses as I can to this question:

"Regarding the long-term flooding of New Orleans, what so-far unpublicized secondary effects are likely to have the most profound, transformative and currently unanticipated effect on the nation as a whole?"

Some secondary effects, such as the rising cost of gasoline, are getting lots of attention. Others, like the destruction of the Gulf Coast shrimp fishery, have yet to be examined. I am interested in what people with different perspectives and insights would foresee as important issues affecting us all that we have yet to consider in the wake of this disaster.

I hope to combine the best and most thought-provoking responses into a piece to run in (my newspaper) in Charleston, S.C.

If you have a thought that you wish to share, I will be greatly appreciative. If you have any friends or colleagues who you think might offer an interesting response to this, please consider passing it on to them.

Thank you,

Daniel Conover

Sunday, August 14, 2005

Guerilla media

Whilst cruising around the Chihuahuan Desert one afternoon in 1988 with my cavalry troop's executive officer, I listened as he waxed philosophical over an MRE.

"A tank costs $2.6 million," he said. "But what if you took that $2.6 million and bought a bunch of dune buggies, mounted guided missiles and machine guns on 'em, and offered each dune buggy crew some kind of bonus for harrying the hell out of the enemy?"

The lieutenant's idea was radical, the kind of thing you talk about with a curious young buck sergeant but never with fellow officers. Restated from a 21st-century perspective, the XO was suggesting that a swarm of lightly armed, highly mobile, independently commanded guerillas might be more effective than a single expensive tank at denying a modern enemy the ability to execute its plans.

In military jargon, we might say that 1st Lt. Kontos was stealing the principles of "asymmetric warfare" from the underdog and applying them to the dominant force. Not pirates: Privateers.

I was reminded of this while reading a comment on Jay Rosen's post about "things (journalism professors) used to believe but don't believe anymore." Journalism instructor David Crisp despaired of the current state of the business, examined the ideals that now look silly and concluded that "Maybe I'll just try to teach them to write punchy ledes and forget the rest of it."

Better yet: Teach them those ideals, David, but point them away from newspapers and network television. Tell them where to find the resistance instead.

Whatever you want to call it -- mainstream media, legacy media, The Media -- the dominant media in our culture is stuck. It moves at predictable speeds, in predictable ways. It lacks verve and brilliance, but it is well supplied and armored.

You can't confront it and win. You can't "change it from within." Creating a mirror-image "alternative media" that could go head-to-head with such a force is simply not a logistical possibility.

So instead, maybe you take Crisp's journalism students and you teach them the way of the guerilla. Teach them big ideals and little survival tricks. Teach them wisdom and initiative and character. Keep them out of the halls of corporate human resources, where mediocrity prowls in jealous vigilance.

Give them blogs, and set them loose.

Some enemies cannot be defeated directly. But if you deny those enemies the ability to act as they wish, if you harry them and give them no rest, if you show the people in the countryside that there is an option, then maybe...

Just maybe.