Tuesday, April 15, 2008

5 Lessons from newspaper contests

Here's a humbling series of events, and the still unpopular lessons I learned as a result:

In February 2005, my co-workers put a crown on my head and carried me to the front of a banquet hall to receive a plaque that said I was the state's Journalist of the Year. I'd secretly dreamed of winning that award for years, but it took the end of my career as an editor to make me eligible.

I got my paper's nomination because, as our former executive editor put it, 2004 had been "kind of a down year" for the staff. Most years we nominated a reporter for a specific article or series, but my nomination was just for my "body of work" as a features writer. Ouch.

That was the fall of 2004. Fast-forward to early 2005: The South Carolina Press Association posted the winners list for most of the award categories to its website, and I eagerly called it up. I'd entered lots of categories, and after spending years polishing other people's work I was hungry for some glory in my name.

And here's what I won: Nothing. In category after category, nothing.

To say that I was humiliated is to miss the most obvious point: The staff knew that I was the paper's Journalist of the Year nominee, and to be nominated for a "body of work" that didn't collect a single individual award made me a pathetic figure. No one spoke to me that day. People literally averted their eyes in passing.

So how did I win the Journalist of the Year title without winning a single award for a single individual story? Simple: Different judges.

Here's how the journalism-contest racket works: State press associations take turns judging each other's contests. Hence, every other year the newsroom administrator would drop a foot-tall stack of contest entries from, say, Arkansas, on the city desk with instructions for me to pick my top three. And that evening I would take an extra-long smoke break and plow through all the Enterprise Reporting entries from Arkansas dailies in the 80,000+ circulation category.

No check. No balance. Just one hurried opinion.

Journalist of the Year, on the other hand, was selected by in-state judges, through some secret process of I-don't-know-what, and the results aren't announced until the awards banquet. I'll never know how I won, why I won, or what the win really meant.

The lesson
So here's what I've learned from 20 years of journalism contests:

  1. That 2005 Journalist of the Year title was nice, but the valuable lesson was the humiliation that came first. I still don't have an opinion on whether I deserved the J.O.Y. title, but I'm confident that my individual entries were just better than most of what that year's judges liked. At some point we all have to grow up, stop seeking the approval of others and trust our own judgment. That's it. That's all. That's life.
  2. That thing you always wanted because you believed it would validate your efforts? I've had a few, and each quickly proved profoundly hollow. Their value? Winning eventually frees you from the need for external validation, because you no longer have to worry that you're rejecting it because of sour grapes.
  3. Because there's just no logic to most awards, the important thing to remember is that winning first place for Investigative Reporting is probably just an indication that you're basically competent at some portion of your job. Maybe. Just don't bet on it.
  4. The best way to win writing contests? Write to win contests. Stick to the tried and true. Play the game. Self-promote. Read past winners and follow their formula. Have a plan, then stick to it.

Which brings us to Lesson No. 5, and with the newspaper industry crashing down around us, it's the one I wish I could convince everyone to heed: The best approach to newspaper contests is to stop entering them.

I got a second-place plaque for something at the 2006 banquet, but I felt like a hypocrite for entering and didn't bother to pick it up. That led to my "stop entering" epiphany and later that year I pitched the idea to our executive editor: Opt out. Don't enter. Make a statement: We're turning our focus 100 percent toward our readers, and the only contest we want to win is the daily contest for your attention and trust. Make it a one-year moratorium, I told him. Hell, I figured the PR value of such a move would be worth whatever it cost us in contest prestige.

I didn't expect he'd take me seriously. He didn't.

So why rock the boat? Well, consider this: I've been producing a Friday features section (that I developed with Janet Edens and my former boss, Judy Watts) since March 2007 and it hasn't featured a byline yet. Imagine that: a features section without feature stories. I have no idea how successful it is -- there's been no readership study in the past year to tell me -- but I can tell you that the readership for traditional "lifestyles" features writing across the industry isn't exactly robust.

If you knew that, wouldn't you want to experiment with new approaches? Sure you would -- if you're focused on your readers and your business. But it's not what you'd do if you're focused on awards.

I'm not the only person who feels this way, apparently. Janet and I got to eat catfish with "Mister Magazine" Samir Husni outside Oxford, Miss., earlier this year, and Husni told us about a major magazine editor who informed his staff that he would fire anyone who entered a contest of any kind.

True innovation will never win awards, because there's no contest category for something that nobody else is doing. And if you foster a culture in which awards lead to promotions and financial rewards, then asking your talented people to invest themselves in anything new is going to look like a bad deal.

Well, I'm in. I'll never enter another press contest. Period. I'll be hosting the national convention of the Press Contest Conscientious Objectors of America in a utility closet at the Charleston Comfort Inn sometime next winter. You're all invited.

Wednesday, March 19, 2008

BONUS POST: "Ad hocracy"

I'm breaking up my series on newspapers to float an idea that I'm hoping to present at a workshop next month.

It comes from this basic observation: We've already got the free tools to construct ad hoc breaking news networks around discrete events: Blogs act as a reverse-order chronology of evolving events; Twitter gives us two-way communication across platforms, plus a means to capture text via SMS from "dumb" phones and then push those messages out to the Web. Utterz lets us send audio clips from our cell phones straight to the blog. And Flickr gives us a way to organize and "theme" photos coming in live from multiple users.

The fun part? Figuring out how to configure the parts and administer the results so that you find the Goldilocks Zone between immediacy and chaos.

For the record, I experimented with something similar to this in 2006. The idea of creating ad hoc news networks arose from my thought that I'd have set up Storm Watch differently if I'd been a Twitter user.

I'm posting the original concept here so that everyone can bang on it, come up with better solutions. Have at it.

PART 1: SET UP AND OUTREACH
STEP 1: Administrator sets up a breaking news blog on a standard blogging platform that can handle multiple authors.

STEP 2: Administrator publishes the URL of the new blog and encourages members of the network to bookmark it. The blog will remain absolutely dark until it is needed.

STEP 3: Administrator creates a Twitter account for the new blog and encourages people within the network to Follow it.

STEP 4: People who are known and trusted by the administrator (reporters, bloggers, Twitterers, photographers, videographers, frequent commenters on other sites, etc.) are invited to create author accounts/Utterz accounts that will feed into the blog when it goes live.

STEP 5: Administrator adds a Twitter widget to the blog to display recent Tweets. (?)

PART 2: WAKING UP THE NETWORK
STEP 1: The administrator/network wrangler hears about a breaking news event (a plane crash, a storm, a riot, a blogger conference, etc.) and wakes up the network via a variety of means: Tweets, cell phone alerts, e-mail, blog posts, etc.

STEP 2: Eyewitnesses begin filing reports in a variety of ways: Tweets to the @breakingnewsnet account; Utterz calls that go straight to the blog; photos uploaded to individual Flickr accounts but tagged to create an event-specific Flickrstream.

STEP 3: Eyewitnesses who are live-blogging to their own sites can share their posts with the breaking news blog by sending links in multiple ways.

PART 3: MANAGING INFORMATION
STEP 1: The administrator should focus on trying to manage the flow of information across platforms in real time. In particular, the administrator should look for reports from non-trusted sources that could be misleading, erroneous or threatening.

STEP 2: If the interface between Twitter and the blog is manual (i.e., the administrator is copying and pasting Tweets as new posts), then information flows will be managed by deciding which reports to publish elsewhere. If the connection is scripted (@breakingnewsblog Tweets appear automatically on the blog, or in the blog publishing queue), then the administrator has to decide whether some reports should be removed awaiting verification.

STEP 3: The administrator(s) should attempt to develop as clear a picture of events as possible, posting updates that write-through the various incremental reports from the field.

STEP 4: Witnesses get feedback, questions and requests for clarification from the administrators and other users. Follow-up reports attempt to address those questions and requests.

PART 4: POWERING DOWN
STEP 1: As the situation moves from real-time development and into the more traditional news cycles, the administrator notifies the network that the breaking news blog is shutting back down.

STEP 2: Administrators capture and save the Twitterstream, tag photos and video, and formally conclude the blog coverage. Commenting continues as long as needed.

STEP 3: Administrators review inputs and extend "trusted source" accounts to network participants who performed as reliable observers. Sources that perform poorly could have their "trusted source" status rescinded.

STEP 4: New account requests and Twitter follows are processed and prepared in anticipation of the next breaking news event, which will wake up the blog and the network all over again.
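The promote/demote review in Part 4 could be sketched as a simple reputation ledger. This is purely a hypothetical illustration; the class name, scoring scheme and thresholds are all invented here, not features of any real platform:

```python
# Hypothetical sketch of the "trusted source" review step in Part 4.
# Scores and thresholds are invented for illustration only.

class SourceLedger:
    PROMOTE_AT = 3   # net reliable reports needed to earn trusted status
    DEMOTE_AT = -2   # net unreliable reports that revoke trusted status

    def __init__(self):
        self.scores = {}     # source id -> net reliability score
        self.trusted = set()

    def review(self, source, was_reliable):
        """Record one post-event judgment about a source's reporting."""
        self.scores[source] = self.scores.get(source, 0) + (1 if was_reliable else -1)
        score = self.scores[source]
        if score >= self.PROMOTE_AT:
            self.trusted.add(source)
        elif score <= self.DEMOTE_AT:
            self.trusted.discard(source)

    def is_trusted(self, source):
        return source in self.trusted
```

After each event, the administrator would run one review() call per participant, and the next wake-up of the network would start from the updated trusted list.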

DISCUSSION:
SCRIPTED VS. HUMAN: First, is there a way to capture specific Tweets (my example: @breakingnewsblog) and script their posting to the blog? Second: If there is, should we? My gut tells me that the Twitterstream would be the raw material: Anyone could Follow it and see the incoming reports in real time. The blog, on the other hand, could be a step up in terms of making sense of the reports. What's the best way to moderate this?
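One way to split the scripted-vs-human difference: auto-publish reports from trusted accounts and hold everything else for a human moderator. A rough sketch of that routing logic, where the incoming mentions list and the trusted roster are placeholders, not a real Twitter API:

```python
# Sketch of a scripted Twitter -> blog bridge with a human fallback.
# The mentions list stands in for whatever feed delivers
# @breakingnewsblog mentions; it is a placeholder, not a real API call.

TRUSTED = {"@reporter_jane", "@stormchaser"}  # invented example accounts

def route_mentions(mentions, trusted=TRUSTED):
    """Split incoming (author, text) reports into an auto-publish pile
    and a hold-for-review pile awaiting the administrator."""
    publish, review_queue = [], []
    for author, text in mentions:
        if author in trusted:
            publish.append((author, text))       # straight to the blog
        else:
            review_queue.append((author, text))  # pending verification
    return publish, review_queue
```

The raw Twitterstream stays public either way; the blog just becomes the curated layer on top of it.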

BEST USES: Should this be done using the system I described or something different? Would hashtags work better? Should Facebook be used instead of, or in addition to, these tools?

WHICH BLOGGING PLATFORM? Each blogging platform has its quirks. Which one would work best for an ad hoc news network?

IS MORE BETTER? This blog could handle everything from organizing live responses to a scheduled political debate to receiving reports from a surprise earthquake, but obviously each event would have a different network of contributors. My thought is that you want to make it as easy as possible for people to contribute (and cell phone to Twitter is pretty easy). But am I right? And will the administrator be able to keep up if the size of the reporting audience gets too big too fast?

IS THIS ONE IDEA OR SEVERAL? A version of this created for covering tech conferences might look different than a version built to cover breaking news in a geographic area. Or maybe you'd do this within a very defined community (graphic novel fans, infectious disease experts) to handle particular types of information.

WHAT ABOUT ETHICS? Your average professional organization would be scared to try something like this because of fears that people might use bad language or make libelous claims against individuals or post hoaxes or upload photos that invade privacy, etc. Well, aren't those legitimate concerns? How could we address them in an ad hoc, cooperative breaking news network?

That's what I'm thinking about. I'd love to hear your thoughts...

Sunday, March 16, 2008

MEDIA REVOLUTION: Why The Tower Must Fall

From a generic perspective, it’s probably fair to say that the public history of new media began five years ago this month with the invasion of Iraq. Technorati was tracking fewer than half a million blogs in March 2003, but 24/7 coverage of the war meant cable news needed things to talk about, and there were the bloggers – this strange new species of pundit – always talking, talking, talking.

I became a blogger in March 2003, part of a surge in self-expression that exceeded 4 million participants on the eve of the November 2004 elections. To mainstream media in that campaign year, “new media” mostly meant blogs, and their reaction was generally dismissive. Sure, bloggers might find a niche in the chattering-classes ecosystem, the reasoning went, but they were bottom-feeders sucking up leftover information produced by the reporting professionals.

In the public mind, therefore, blogging more or less began with the war, then became significant – for better or worse – in September 2004, when bloggers played a leading role in casting doubt on a CBS News story about President George W. Bush’s service record in the Texas Air National Guard.

So why talk about the nascent blogosphere of almost four years ago?

Because at this moment of decision and tumult in mass-media news, far too many of our business leaders and newsroom decision-makers appear to be stuck there. And since most failed to explore even those outdated possibilities (part of a general tendency to reject as irrelevant anything they do not control or understand) far too many of our industry's executives continue to misread the dramatic cultural and economic changes now reshaping the markets for our products.

The consequences? Executives sink hefty budgets into e-mail marketing schemes – even as young people abandon e-mail in favor of communication via social sites. Consultants skim big fees pitching “Web 2.0” business concepts that look promising based on analysis of traffic and function, yet fail instantly because they begin with no understanding of the culture of online communities.

Is it any surprise that so many of our flagship institutions are foundering? From tagging to RSS to social filtering, mainstream media has failed to keep up with the ever-leapfrogging development of new tools because its leaders remain not only ignorant of the Web, but suspicious of its very legitimacy.

For a simple example, look at the average newspaper website video section. Can you find the embed code for any of the videos they host?

I went through a list of the Top 100 newspaper websites tonight, checking every eighth one to see if it offered an embed code. Most required me to watch a commercial before letting me see the video I’d chosen (also a mistake… to see the RIGHT way to put an ad on a video, watch TPM.com’s VERACIFIER videos), but only one offered me an embed code -- and that was the San Jose Mercury News, the home paper for Silicon Valley.

Why? YouTube has been putting embed codes on its videos since 2005, and it’s not exactly advanced technology. Allowing people to place your video on their site is a great way to increase the number of plays you’ll get on a potentially popular video, and it doesn’t cost you a thing.

I suspect there are four reasons:

  1. Top editors have never heard of embed codes;
  2. They don’t understand how they work;
  3. They MISUNDERSTAND how they work (“You mean other people will get to take our video and we wouldn’t have any control over how they use it? What’s in it for us, other than a lawsuit?”); and
  4. They just don’t think it matters, since the only people who care about embed codes are bloggers, and bloggers are a bunch of whack-job wannabes.

The next post in this series is going to talk about the necessary next steps for newspapers – not online, but in print. Before we move on to that subject, I wanted to cast this thought out to the world:

The biggest problem faced by the newspaper industry isn’t a competition problem, or a revenue problem, or a technology problem, or even a quality problem. It’s a culture problem. The average metro newspaper is a monopolized commodity, and after decades of bottom-line corporate control, all vestiges of independent thought have been selectively bred out of its cultural DNA.

Newspaper leaders are willing to make almost any change… so long as certain things remain untouched. And those untouchable items -- double-digit profits, secretive editorial boards, black-box news judgment -- are the very first things that healthy companies will have to address.

Newspaper monopolies are not worth saving. They are The Tower, and they must fall before something new can begin to grow in its place.

Tuesday, March 11, 2008

An old idea has a new future

Back in the late 1990s, at roughly the same time as the advent of the ill-fated :CueCat, my employer invested in a failed technology called "GoCode." Both of these devices had the same goal: to connect print readers easily to the Web. :CueCat was a bigger flop because it had a bigger footprint, but I'm sticking with the GoCode scanner system because I got up close and personal with it.

Here's how it was supposed to work: Newspapers would take the URLs for content related to individual stories and enter them into an encoder. The encoder was supposed to produce a tiny bar code that could be attached to the end of printed newspaper stories. Newspaper readers, in turn, were supposed to be able to take their free "GoCode wands," drag them over the tiny bar codes, and delight in being transported to related content on the Web.

Got it? Four items: 1. Printed content; 2. Web content; 3. A printed bar code for the Web content URL; 4. A bar code scanner that was supposed to connect print to the Web via a personal computer.

GoCode, like :CueCat, was a dismal failure. I could see it was doomed the first time we received training in how to use the encoder: Instead of giving editors the ability to link to valuable related content (full texts of speeches, budget databases, etc.), what they delivered came with a menu of useless content that we really couldn't edit or expand. Then the 100 "test" wands never got distributed. Our company grabbed some industry headlines for our innovative attempt at modernity, but GoCode was DOA. We never even bothered to announce its demise.

Why did GoCode/:CueCat fail?
  1. Failed to understand how people use the Web;
  2. Required that you read sitting at your computer;
  3. Required special hardware;
  4. Required special generating hardware/software;
  5. Required user-end software;
  6. Cheap device never achieved scale;
  7. Device was pre-USB 2.0 and required special pin-porting;
  8. Media companies that didn't understand the Web were sold on these technologies by huckster companies that were long on promises and short on delivery;
  9. The timing was all wrong (Dot.com bust);
  10. Didn't emphasize content quality.
After these disasters, the print publishing industry has been uninterested in anything similar. Consequently, even the act of printing meaningful URLs has been daunting. I taught the people at my newspaper to use www.tinyurl.com, and it's also true that some Web content providers have gotten somewhat better about generating logical (and shorter) URLs. But let's face it: once you get past the root domain name, people start turning you off if they've got to type in the address.

Which is why it's time for a bright technologist and a smart media company to start developing the next generation of URL scanner. Because the technology that made this idea workable has been on the market since June.

The iPhone.

Connecting print to Web -- mobile
The cellphone -- not the PC -- is the proper device for taking print readers from page to pixel, and the development of a cell phone with an imaging device AND a functional Web display is the next-to-last hardware development task required to make this idea work.

The final step: Add a laser/infrared scanner capability to all smart phones.

So here's how it would work:
  1. All print publications automate the creation of URL bar codes, so that any URL mentioned in the pub is accompanied by a tiny bar-code icon;
  2. Smart phone users who want to see Web content mentioned in print wave their phone over the icon;
  3. Scanner on the back of the phone decodes the URL, opens the phone's browser and displays the content.
Other uses should be obvious: All display and classified advertisements would come with an appropriate icon; printed maps and directions would have icons for related attractions, restaurants, etc.; business cards would display the bar code. And so on.
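Under the hood, the encode/decode loop in steps 1-3 is conceptually just a lookup: the publisher registers a URL and prints a compact code, and the phone resolves the code back to the URL. A toy sketch of that round trip; the registry class and the eight-character code format are invented for illustration (a real system would use an actual barcode symbology):

```python
# Toy model of the print-to-Web round trip: publisher encodes a URL
# into a short printable code; the phone's scanner decodes it back.
# The registry and the code format are invented for illustration.

import hashlib

class CodeRegistry:
    def __init__(self):
        self.codes = {}  # short code -> full URL

    def encode(self, url):
        """Publisher side: register a URL, get a short printable code."""
        code = hashlib.sha1(url.encode()).hexdigest()[:8]
        self.codes[code] = url
        return code

    def decode(self, code):
        """Phone side: scanned code -> URL for the browser to open,
        or None if the code is unknown."""
        return self.codes.get(code)
```

The point of the sketch is that the hard part was never the lookup; it's getting the scanner into everyone's pocket, which is where the smart phone comes in.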

Could you add a scanner to your computer? Sure. It's a simple USB device. But that's not the killer app: People who are mobile need simple, quick access to the Web. And in 10 years we're ALL going to be mobile. All the time.

But what's the business model?
I was never clear how we were supposed to make money off GoCode. But I know how to make money off what I'm describing.

Advertising and licensing.

The advertising stream: So maybe before this system displays your page it flashes up a targeted advertisement -- it's topical, because you could code keywords into the icon, and it's local, because the cell phone carrier can read where you're transmitting based on which repeater has picked up your outgoing signal.

So that's a premium ad spot: Local and targeted, with algorithms that serve up well-suited commercial info.
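That targeting logic, picking an ad that matches both the keywords coded into the icon and the caller's location, can be sketched in a few lines. Every name and data shape here is invented for the example:

```python
# Invented sketch of keyword + locality ad matching for scanned icons.
# Ad records and field names are hypothetical.

def pick_ad(ads, icon_keywords, user_city):
    """Return the local ad sharing the most keywords with the icon,
    or None if no ad serves the user's city."""
    local = [ad for ad in ads if ad["city"] == user_city]
    if not local:
        return None
    return max(local, key=lambda ad: len(set(ad["keywords"]) & set(icon_keywords)))
```

A real system would layer bidding and frequency rules on top, but the core match is that simple.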

Could the user get a version of this with extra features and no ad support? Absolutely: It's a freemium feature in the making.

And finally, licensing: If I develop this tech and offer it for free to users, how hard will it be for me to charge all print publishers a small licensing fee for using the software and/or hardware we give them to handle the encoding?

The Web is based on clicking. If we had to type every URL, the Web wouldn't be the Web. A cellphone that scans and displays URLs extends that power from the traditional PC to the extended network.

It makes sense for cell users. It makes sense for publishers. And it has an easy business model.

Anybody want to pick this up?

Thursday, February 21, 2008

Foundations of 21st century journalism

Up to this point, I've been talking about some of the concepts that shape my thinking about media and journalism: Quality; epistemology; the cultural flaws that warp these discussions; the ideas that I think will lead us around those obstacles to answers. But I haven't really offered any answers of my own.

One reason: What use are they? They're not scholarly. They're not researched and footnoted. By the standards that I respect, my answers are little more than jackleg bullshit.

So to put the proper value on these answers, understand that each comes with endless caveats and a deep sense of humility. The future isn't known, everything can (and will) change in unexpected ways, and the odds of me being right on more than half of these statements are absurdly low. I accept that.

But if you'll take these ideas in that spirit -- not as confident proclamations, but as working insights and imagination gained through hard experience, reading and conversation -- then perhaps you'll find their intrinsic value. If they turn out to be predictively true -- great. If they wind up having no value of their own, but spark your thinking toward something great, then I'm OK with that, too.

Human intelligence is based on prediction. We shouldn't be afraid to predict, just as we shouldn't perpetually tie our egos to being correct.

Here goes:

MONOCULTURE TO ECOSYSTEM
Modern media is in transition from the monoculture monopolies of the 20th century to the diverse explosion of expression that represents the 21st century future. Think of the past as a wood-pulp tree farm; think of the future as a rainforest. To outsiders, a tree farm appears orderly and logical while a rainforest appears impassable, chaotic and dangerous. To residents, however, a rainforest is a vibrant, living ecosystem, and a tree farm is artificial and sterile.

We are moving into a media ecosystem of multiple niches and processes. No single niche dominates, and despite constant competition within and between niches, the resulting chaos is externally stable because of its healthy inner dynamism. All the ideas expressed here are to be conceived as existing within the new media ecosystem. To those who fear the thought of a news media that is not controlled by trained elites, here's your lesson for the day: Control doesn't scale.

STRUCTURED AND SEMI-STRUCTURED DATA
The No. 1 functional shift, from a future media historian's perspective, will be the change from the current "document" mindset to the future "database" mindset. Modern journalists see themselves as people who report facts and write stories and cannot grasp that the act of turning information into narrative instantly limits its usefulness and accuracy. News writers will continue to create narratives, because the human brain sorts and stores data by narrative mnemonics. In fact, the bulk of the information we consume will continue to be structured into narrative.

But the primary function of newsgathering organizations will be to create and curate semi-structured databases of interesting/significant information. This, by the way, is the reason that the expansion of The Semantic Web matters right now.

SCALABILITY
Narrative doesn't scale to a global information economy. Personal insight about a candidate doesn't scale to a national campaign. A fair and balanced examination of a two-sided story doesn't scale to a topic like global climate disruption. News media will shift from an artificial one-size-fits-all system (based on front pages, production schedules, newshole, broadcast formats, standard server/bandwidth configuration, etc.) to one that expands and contracts depending on situations. This will require new tools and conventions. Many of these solutions will be social.

OPEN SOURCE
Proprietary, compiled information tools and repositories will fail to keep pace with their open source competitors. Ultimately, all first-class news platforms will be based on open-source principles, and all commonly held information will be structurally compatible.

INFORMATICS
Informatics is the study of the structure of information. "Discovery Informatics" uses sophisticated software agents to detect and explore patterns in enormous streams and vast pools of data. Since all major news organizations will have comparable news databases, much 21st century newsmedia competition will consist of duels over user-tools. The news company with the "best" informatics tools stands a good chance of being the commercial winner.

THE BLUR: NEWS, INFORMATION AND ADVERTISING
Artificial distinctions between information types will be blurred and then forgotten. The new challenge will be getting the right information to the right user at the proper time, rather than maintaining firewalls or winnowing out things that "aren't news anymore." In the future, it's all about the end-user's needs and experience. This means that something other than artificial firewalls will have to stand as a credibility marker between types of content.

NEWSBOTS AND INTELLIGENT AGENTS
Human intelligence doesn't scale to the flow of global information. Informatics tools that represent the interests and intentions of individual human beings will serve as the adaptation that scales human intent to the scope and pace of the new information economy. The ultimate result of a system that incorporates multiple intelligent agents acting on behalf of each individual and organization will be something I've called The Construct, and understanding what is being expressed within The Construct in real time will replace polling, focus groups and market research.

MULTIPLE REVENUE STREAMS AND BUSINESS MODELS
Modern media profits are based on paid content and -- to a far greater extent -- advertising, with an emphasis on display advertising. Future media operations will collect revenue in multiple ways, often receiving a percentage of a transaction whenever their "free" user tools connect buyer and seller. Most advertising will be performance-based and connected to the expressed intent of the user (whether by search or some other function). Traditional display advertising will be a high-end niche for major brands and a low-end function of small-scaled media.

Other news operations will operate as non-profits, receiving no traditional advertising. Some may be run as foundations, or even as informational utilities, governed by boards and bylaws. Much media will be created by individuals and groups that depend on pledge drives to cover their costs, but some of it may be produced by "for-profit social ventures" that blend the power of supply and demand with the intentions of non-profit organizations.

INTELLIGENCE BRIEFING MODEL
One news organization model will rely solely on paid premium subscriptions: News agencies that base their credibility on predictive accuracy (outcome) rather than on "fair and balanced" coverage (process). These services, which treat users as executives to be briefed and prepared, will cost more and will appeal to users who work in highly competitive industries or individuals who make informational awareness a lifestyle choice.

MAINSTREAM RETRENCHMENT
"Mainstream media" today are in decline, with "the people formerly known as the audience" fragmented. Future media will separate into market-driven grades of information. The "mainstream" will become a smaller subset of the total media flow, generally associated with less-sophisticated technology and users who: 1. Produce little content; 2. Profit only marginally from higher grades of information; and 3. Choose a passive lifestyle. Mainstream media will not dominate, but will represent the most significant media plurality.

NICHESTREAMING
Higher-end information users will largely reject 20th century-style mass media, which will remind future users of Stalinesque architecture from the old USSR. These higher-end users will select and manage their personal mediascapes, and media companies will work to connect to these users by identifying and serving many individual niches.

E PLURIBUS UNUM
The current mediascape is built around a recognizable media "voice" that was established during a period of information scarcity. It is animated by a sense of lingua franca continuity that stretches across newspapers, magazines, TV channels and the academy. The new mediascape will arise from the spirit of unlimited bandwidth and will be fundamentally infused with a limitless diversity of voice, tone and topic. While it will seem a cacophony to older users, new tools and conventions will allow us to experience it on a human scale. From the many will flow a single "media gestalt" -- that we'll experience in many ways.

WATCHMEN WATCHERS
Bias warriors have reduced media criticism to an endless ferreting-out of journalistic hostility toward victimized partisans. In the future, subjective, analog bias hunting will be replaced by a variety of data-driven credibility grades, "best-practices" quality assurances (think ISO 9000 for professional news media organizations), and outside observers. Who is watching The Watchmen? We are. With computers.

CREDIBILITY GRADING
Not all corrections are created equal. Not all lies are as damaging. Stupidity in one sphere doesn't prove stupidity in general. These are the common arguments against the notion that a database approach to credibility grading is a practical application. Yet our current analog system -- based entirely on "the human factor" -- has failed to make useful distinctions on these questions. The reason? Without a system of standards, a human-mediated system cannot respond quickly enough to counter its manipulation by outside parties (Swiftboating). Again: Control doesn't scale.

For credibility to scale to a global information glut, future news media must develop:

1. Systems of publicly grading the "confidence level" of developing information;
2. "Sticky" credibility grades on factual outcomes, both for news organizations AND for news sources;
3. A reputation economy for multiple levels of information users;
4. Transparent processes;
5. Standards-based archiving.
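None of these systems exists yet, so here's a toy sketch of what a "sticky" credibility grade might look like in practice. Every name, threshold and label below is invented for illustration; the point is only that factual outcomes can accumulate into a machine-readable grade.

```python
from dataclasses import dataclass, field

@dataclass
class SourceRecord:
    """Hypothetical 'sticky' credibility ledger for one news source."""
    name: str
    outcomes: list = field(default_factory=list)  # True = claim held up, False = it didn't

    def record(self, claim_held_up: bool) -> None:
        self.outcomes.append(claim_held_up)

    def grade(self) -> float:
        """Share of past claims that proved accurate; 0.5 if no history yet."""
        if not self.outcomes:
            return 0.5
        return sum(self.outcomes) / len(self.outcomes)

def confidence_label(grade: float) -> str:
    """Map a numeric grade to a public-facing confidence level (cutoffs arbitrary)."""
    if grade >= 0.9:
        return "high confidence"
    if grade >= 0.6:
        return "moderate confidence"
    return "low confidence"

tipster = SourceRecord("anonymous tipster")
tipster.record(True)
tipster.record(False)
tipster.record(False)
print(confidence_label(tipster.grade()))  # low confidence (1 of 3 claims held up)
```

The grade is "sticky" because it follows the source from story to story, which is exactly what a human-mediated system can't do at scale.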

The human element is not endangered; humans will be responsible for applying these evolving systems. But future generations will find our fondness for seat-of-the-pants epistemology quaint... if more than a little disturbing.

DEATH OF MONOPOLY PRICING AND PROFITS
While metro newspaper publishers tend to frame the industry's current financial situation as a crisis brought on by declining circulation, the loss of old revenue streams (classifieds) and structural changes in the economy (Big Box retailers vs. locally owned businesses), these statements -- while true -- obscure the obvious. News markets that we once ruled are fracturing into numerous competitors, making it impossible for us to dictate monopoly pricing to advertisers. Newspapers remain profitable, but their profit margins are declining.

The current crisis is a crisis of expectations more than it is a crisis of fundamentals (which are, nonetheless, shaky). Shareholders have come to expect 20 to 30 percent profits from their media holdings, and that simply cannot continue in a diverse 21st century mediascape.

The future of our industry will be based on companies that return profits similar to those experienced in the retail sector. Smaller companies may return higher percentages, but big media will have to learn to get along on single-digit margins.

In a monopoly environment, falling profits mean quality cutbacks. In a competitive environment, companies that choose not to compete for quality choose to die. The winners in the 21st century will be those media companies that make this transition gracefully, fund quality journalism, and learn to be pleased with 8 percent returns.

GAME THEORY
Before video games, electronic entertainment was passive and learning was something that we delivered to young people in ways we determined to be good for them. Not anymore. Today's information users were weaned on games that allowed them to explore their environments, and nobody under the age of 60 reads the entire user's manual before diving into a new game. Game concepts -- from user interfaces on news sites to reputation economies on comment threads -- will drive the development of 21st century media. All significant information will be interactive and two-way.

SOCIAL TECHNOLOGY -- VIRTUAL AND OTHERWISE
Social technology extends from the passive (LinkedIn) to the transitory (Twitter), from the networked (Facebook) to the experiential (Second Life). Social technology can be the delivery platform for news (BREAKING NEWS ALERTS on Twitter), the organizing force behind original reporting (NewAssignment.Net) or the system that sorts and shapes the information stream (Digg). Social technology that incorporates each of these functions will play significant roles in news media within the coming decade for one reason: Social technologies are scalable.

THE WEB IS LOCAL
Earlier this decade I annoyed people by asking this question during high-level discussions of news strategy: Is the Internet local? No one ever said yes.

I meant this question as a challenge to the "local-local-local" insanity that has gripped the newspaper industry, but also as a challenge to think about community in non-geographic terms. And I think I've been proven right: the Internet (and more specifically, the Web) is local, at least in terms of the way people experience their lives.

I've lived in North Central Charleston, S.C., since 2001 and have never attended a neighborhood association meeting. On the other hand, last night I spent hours helping promote the Draft Lessig movement, then donated $50 to a potential candidate for the California 12th congressional district. Why? Because I am more fundamentally a member of the virtual community of values represented by Lessig than I am a member of the North Central neighborhood. I could move to another neighborhood tomorrow, but I would still be the same person once I unpacked.

And why should we disparage online identities relative to meatspace identities? There are more Americans who play World of Warcraft than there are American farmers. If you're trying to be relevant to people's lives, why aren't you covering the kid who made Level 70 in The Burning Crusade with the same degree of interest that you apply to reporting on the kid who won a 4H Award?

Is the Web local? Yes, if by local we mean "of highest personal priority."

Geographically local community coverage will continue to be -- as it is now -- an expensive, high-priority product with a market value capped by geography. Virtual community coverage is also expensive and high-priority -- but its value is limited only by the size and interest of each virtual community. Where would you rather put your money?

TRUE CONVERGENCE
The old idea of "media convergence" meant that newspapers and TV stations would -- in one way or another -- become similar entities on the Web, and news organizations spent plenty of money in the 1990s trying to figure out how to collect 30 percent profits off that idea. The new idea of converged media begins with open-source, structured/semi-structured data streams and flows out to every imaginable form of media, from newspaper to "news games" to virtual worlds to cell phones, via every established 20th century medium (text, still image, audio, video, tabular data, game).

The art of 21st century journalism editing will come in understanding how individual ideas or events are best communicated to target audiences. True convergence isn't about capturing the online video market: It's about learning to surf the wave of constantly churning social and technological change.

Example: YouTube revolutionized Web video in 2005 by offering free hosting for user-created content, plus an essential yet counter-intuitive feature: HTML embed codes. Three years later, most mainstream media have yet to catch up to that advancement because they can't figure out how to think about video within the context of their news operations. That's because they see video as a "thing," just as they see a news story as a "thing." Meanwhile, the video-sharing market is rapidly fracturing into dozens of competing platforms because video is many things. It can be raw, uneven and viral (YouTube), it can be immediate and highly personal (Qik), it can be deliberate and repeatable (BlipTV).

Hence, True Convergence isn't about adding video to your news website. It's about understanding that a breaking news clip of a robbery shot from a bystander's cell phone and a three-minute video story on crime statistics are fundamentally different things. They are only lumped together as "video" in the same way that a limerick and a technical manual may be categorized as "text."

True Convergence begins with recognizing the similarities and differences between pieces of content REGARDLESS OF THE MEDIUM THAT TRANSMITS THEM.

CURATING INFORMATION
Wikipedia gets a bad rap in traditional media, typically on the grounds that it is uncontrolled and unfiltered by traditional top-down editorial methods (Reason? Control doesn't scale). While this fundamentally misunderstands the wiki concept, it also ignores entirely the beneficial ways in which people have come to use Wikipedia: As a curated form of search, and as a non-news based method of keeping up with developing information.

Consider: If I'm suddenly interested in what's happening in Kosovo, I can read the news reports on the independence movement. These reports place the emphasis on what's new, since news reporting values novelty. But for me -- since I haven't been paying attention to Kosovo -- that news is an isolated, context-free dataset, and there's only so much context I can get from a 15-inch wire story.

I can go to a traditional, top-down information source (CIA Factbook) to find out more about Kosovo, but I'll have to guess at how current the information is. Or I can go to Wikipedia and read an article about Kosovo that has been edited to include the latest information.

Typical news organizations shun this kind of thinking as "not news." They will soon retire that attitude. Since zapping in and out of topics is the way most informed people acquire information, creating and curating not only databases but high-quality topic articles will be one of the most significant journalism jobs of the future. Again, this will not be instead of news writing, but in addition to news writing. The best news sources (BBC) already perform this function, often in real time.

NEW ELITES
The Old Elites were economic and institutional, with the occasional popular artist thrown into the mix. While such elites will continue to influence culture, they will be forced to compete with unmediated communities that produce their own elites. Example: Boing Boing's contributors represent an informational elite that is driving culture in ways that elude the control of traditional elites. These new elites will likely appear transitory by traditional standards, but their influence will be profound.

THE CREATIVE MIDDLE CLASS
In the current system, the two options for creative people are rock star or starving artist. In a networked era approaching The Singularity, the creation of content, knowledge and technology will be the primary work of the American economy. For this to function properly, we'll have to develop some kind of stable basis for a creative middle class. Journalists will be members of this class, and would benefit from structures that enable it (health care, new revenue relationships, etc.).

SURPLUS PEOPLE
As development of our knowledge and technology accelerates toward The Singularity, it is likely that the bulk of the human species will become -- in economic terms -- surplus labor. Affordable robots are moving off the assembly line and into our homes and offices, a trend that will accelerate as nanotechnology, green energy and environmentally friendly materials replace the fundamentals of our Peak Oil economy. Media will have an enormous role in dealing with the crisis of unemployable humans, a shift so enormous that we should not fail to include it in our thinking.

YES, NEWSPAPERS ARE GOING AWAY
Not immediately, and not because of TV and the Internet. Newspapers (as we know them) are going away because, as physical products, they are wasteful, create an enormous carbon footprint and pollute our cities. A Peak Oil economy of waste, consumerism and profitable inefficiency is rapidly giving way to an economy based on life-cycle costs. As regulations begin to require that publishers account for the disposal, cleanup and carbon-mitigation costs of their products, most newspapers will instantly lose real profitability. TMCs will shut down their print editions and switch to Web-only distribution overnight.

The future of newspapers is niche. In 20 years we'll see print newspapers as expensive, "boutique" products for the select few (and the terminally stubborn).

OK, that's what I've got. Think we'll find something to talk about?

Monday, February 18, 2008

"Blackboxing" news judgment

Orson Scott Card's "Ender" series wouldn't be much of a story without a device known as "the ansible," a sort of sub-space radio that allows people to communicate instantaneously from planet to planet across light years of empty space. The story doesn't work without it... but how does it work?

Card's answer? It doesn't matter. The ansible is a black box: In science fiction terms, that means that it has rules, that it abides by those rules, and that so long as those rules are followed, the reader doesn't need to know how it works.

If you get right down to it, that sounds an awful lot like news judgment.

You'll hear a lot of people explain a lot of mysteries as "news judgment" as you go through your career. At its most basic, news judgment determines the difference between a strip "false-lead," a two-column "news lead" and a bottom-of-the-page "reader." News judgment sorts the daily budget into pages and categories, but it also illuminates decisions about photo placement, cropping, and the tone of headlines.

That's plenty, but we're still not done. Because our profession uses the "news judgment" black box to determine more than story placement and layout. News judgment informs our decisions about what stories to cover and what resources they merit. News judgment tells us which voices to trust and which voices to ignore. Information goes in, processed news comes out. What happens in between is... well, as we say: Trust us.

Pause to consider this for a moment. We work in a profession that demands disclosure of interests. We discipline ourselves as reporters and editors to think in rational, restrained ways about competing versions of "truth." We demand that all information have a source, and that we know the source. We set aside what we think and suspect for what we can prove, and if anything in that reporting winds up being even trivially inaccurate, we will fall on our collective swords "because our credibility is all we have."

And at the end of all that work, we turn over all that we do... to a black box.

Is it really a black box? Of course not.

News judgment is one of the arts in what we do. It's supposed to represent the wisdom of our tribal elders, passed down through the generations. It's the pause before taking the bait, the long view in the heat of the moment, the experienced eye that sees through the surface spin. News judgment is achieved in part by observing, in part by remembering, in part by reasoning... but it is also largely a function of sitting around talking and worrying. Even when traditional news judgment is done well, it's a messy, fretful process, much more sausage than steak.

But when news judgment is done poorly -- and it often is -- it makes a mockery of those noble intentions. Work in the business long enough and you'll encounter it: Sunday night editors who downplay a big story because they don't want to remake a page on deadline; top editors with hidden agendas they will never voluntarily reveal. Egos and office politics and fear and vengeance. "News judgment" is our vague rationalization for all sorts of failures.

One of the cultural shifts that lies ahead of us can be compared to the shift from analog to digital recording. Old LP records -- the kind we listened to when I was a kid -- warped and hissed and popped, but they had a warm sound that digital music doesn't, and there were (and are) still those people who prefer it. But switching to digital music -- each note and quality assigned a digital descriptor instead of an analog wave -- opened doors to new possibilities: CDs, MP3s, downloads, etc.

We stand at that crossroads. There are things I love about our current way of doing things, but then again, I know that the way we do things isn't really a system. Moving ahead into the 21st century is going to require information systems that allow people to make multiple uses of the same data. The past is analog and opaque, the future is digital and transparent, yet we can't budge from this intersection. Why?

Three reasons:

Newsroom culture loves its analog myths: the hard-nosed reporter, the tough city editor, the cynical poet who captures the beautiful ugliness of life in a 20-word news lede, then heads off to the nearest bar to drink himself out of that terrible clarity. We are, so many of us, romantics at heart. Don't turn journalism into a digital representation of data, we say. You'll kill off the human factor -- and that's what really matters.

Then there's another, less romantic reality: We know that, despite all our claims to the contrary, we produce a low-grade product. Call it the first draft of history, call it whatever you want, but ask yourself this question: Would you put a guarantee on it? What's the shelf life of what we write? How informed are our decisions? Demanding a systematic accounting for the messy daily miracle that is a newspaper will only reveal how non-systematic we are.

Finally, there's power. Senior editors and the people who influence them have the unchecked power to "make the news" in their own image, and for all their talk, they simply don't want to give that power away.

Twenty-first century journalism will differ from journalism practiced in the 20th century in numerous ways, but I predict the most significant change will be in the way we structure information and account for our decisions. Narrative was our primary tool in the past, but narrative doesn't scale. News judgment worked pretty well when the economics of information were based on scarcity, but it falls apart in an information market based on glut.

Newspaper journalists tell me you can't present the news by formula, that you can't grade information on its confidence. My answer to them? Horseshit. Google News is run by algorithms. Digg is propelled by user input. Intelligence agencies -- one of the models for our 21st century journalistic descendants -- routinely grade the "confidence level" of the information they process.

We're going to need all sorts of digital news products and systems, but none of those developments will matter if they're presented based on some black box called "news judgment." In a world where there's too much information and not enough time, trust will demand transparency and repeatability.

Does that mean we'll no longer have courageous editors who buck the system and do what's right? Or that reporters who tell insightful, moving stories will lose their value? Absolutely not.

Here's what I believe: Once we do journalism in the open, with open-source principles and ethics, we'll have a shot at regaining the credibility we lost over the last 30 years. And once the people learn to trust us because they can test us, they'll be able to see the value of that courageous editor and that insightful reporter.

All they see right now is a black box.

Sunday, February 17, 2008

Gloom and Doom

I hear this quite a bit: Journalists (particularly print journalists) are tired of hearing all the "doom and gloom" about what lies ahead for the industry.

The statement is usually followed by a call for "solutions" and bolstered by high-ranking reassurances that "newspapers aren't going away."

Which probably explains why I'd rather talk about doom and gloom than sweetness and light.

Some points worth remembering:

New communications technologies don't typically exterminate their predecessors (this isn't exactly true: try sending a telegram). Newspapers survived radio, radio survived TV, TV has survived Teh Interwebs, and so on.

Even dinosaurs didn't exactly go extinct. They just stopped being big old dominant lizards and survived as birds -- light, maneuverable, adaptable birds.

Change only looks like gloom and doom when you're living through it.

Hence: The issue isn't newspapers. It's journalism.

If we insist on determining the success or failure of ideas based on the relative successes and failures of newspapers, then we're using the wrong metric. If we insist on valuing only those ideas that return 20 percent profits to investors, then short-sighted greed will blind us to the obvious.

We've got some heavy lifting to do. We need new business models (note that I made this plural) that can fund quality journalism. We need to switch from "document" thinking to "database" thinking. We must re-imagine a relationship with "the people formerly known as the audience" that fundamentally accounts for the fact that the transmission of information is no longer a one-way street.

Will there be newspapers in the future? Sure. Not that most of us will care.

Radio was still around in the 1950s, but Edward R. Murrow -- whose "This is London" Blitz radio reports were the stuff of radio-journalism legend -- had moved on to TV.

So let's stop worrying about preserving the status of the current newsroom management class and the wealth of our media ownership elites.

Let's start scheming up new ways to make great journalism.

Nothing gloomy there.

Wednesday, February 06, 2008

Why quality is a moving target

I got my first regular job at a professional newspaper in my final semester of college. It was a job that no longer exists: Paste-up guy for The Chapel Hill Newspaper.

In those days, editors used pencils to draw page layouts on pieces of paper and stories came out of typesetting machines in long single columns. My job was to take those typeset columns, run them through a machine that coated the back of the sheet with hot wax, and then use an Exacto knife to assemble the stories according to the layout diagrams.

Photos came in pretty much the same way: Editors sized them, the imaging department sent them to a printing machine, and then our crew would cut them out, wax them, crop them, put them on the page and outline them with line tape.

Every once in a while an editor would call for a cut-out photo, and since I had a background as a commercial artist and did a decent job of cutting subjects out of photographs, my skills encouraged more editors to design more pages using more cutouts.

In those days, the definition of a "quality" cutout was a photo that had been cut by someone with a knack for it.

Then Photoshop came along, and suddenly it was possible to do digital cutouts.

FAST FORWARD: PHOTOSHOP
And here's the message: On the day that newspapers gained that ability, the BEST Exacto-knife cutout artist became obsolete. It's simply not possible to do with a knife and a waxing machine what Photoshop users can do with the Magic Wand and various lassos.

New tools -- new technologies -- changed the definition of quality for that specific journalistic convention. The reason was obvious: What used to look good to our pre-Photoshop eyes looked ham-handed once we saw what the new software could do. We understood, without needing to argue over it, that we had to change our standards.

So why is it that when we talk about informational tools, Old School Print Journalists categorically reject the idea that new technologies make our old standards obsolete?

EXAMPLE: ELECTION NIGHT
I spent more than a decade running election coverage at two daily newspapers. Having a successful election night is really about logistical planning, which meant that I would typically spend a lot of time thinking about how to process poll results into reliable numbers on deadline. The hard numbers went to "Winner's Boxes," the percentages went to reporters and editors, and if everything ran smoothly, they'd all match up by final edition.

That was the state of the art in the late 1990s: Dead data in stories and image files, collected over the phone and handwritten by a staff of clerks onto Xeroxed forms that we copied and distributed to be keyed into the system by more clerks. Valuable today, useless tomorrow.

Did we know about spreadsheets and databases back then? Sure. But we had no practical way of integrating those programs into our proprietary content management systems. We couldn't even type the numbers into that system and have them flow into the program that we used to create the winners' boxes.

Then XML came along, and suddenly everyone had the opportunity to build systems that could talk to each other across platforms.

XML allows us to mark up structured or semi-structured data for multiple uses and re-uses. Type it in once, edit it as you would an important story, and then archive when you're done. Write a simple script and the numbers in your winners boxes will update automatically every time you enter new results. Collect enough of it and you can do information magic: Comparisons, charts, searchable products. Combine that data with other tools (the Google Maps API, Flash, Action Script, etc.) and you can create products that were unimaginable when you collected the data in the first place.
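The "simple script" idea can be sketched with Python's standard library: mark the results up once as structured data, and regenerate the winners box from it every time new numbers come in. The race, the candidates and the field names below are all invented for illustration.

```python
import xml.etree.ElementTree as ET

# Hypothetical structured results file -- keyed in once, reused everywhere.
RESULTS_XML = """
<race name="County Council, District 3" precincts_reporting="41" precincts_total="44">
  <candidate name="J. Smith" votes="5214"/>
  <candidate name="R. Jones" votes="4980"/>
</race>
"""

def winners_box(xml_text: str) -> str:
    """Render one 'winners box' from the structured data; re-run after each update."""
    race = ET.fromstring(xml_text)
    candidates = race.findall("candidate")
    total = sum(int(c.get("votes")) for c in candidates)
    lines = [race.get("name"),
             f"{race.get('precincts_reporting')} of {race.get('precincts_total')} precincts"]
    # Sort by vote count, descending, so the leader tops the box.
    for c in sorted(candidates, key=lambda c: -int(c.get("votes"))):
        votes = int(c.get("votes"))
        lines.append(f"{c.get('name')}: {votes:,} ({100 * votes / total:.1f}%)")
    return "\n".join(lines)

print(winners_box(RESULTS_XML))
```

Change a vote count in the XML and every product generated from it -- box, chart, map -- reflects the correction on the next render. That's the whole argument against dead data.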

That should have made the old "dead-data" system obsolete overnight, just like Photoshop killed all the Exacto cutout artists. But XML has been around since 1998, and most newspapers still aren't using it to manage the data that they process.

Take a look at the cable news network websites: They get it. Cable news spends more on technology because they're working in public in real time. Newspapers don't care, because we work in private and we only show you our final revision. To cable news, technology is a mandatory investment; to newspapers, it's an expensive luxury.

So most cable news sites put up election numbers that are actually being served from databases, rather than typing results from one document into another document. The difference? Change the database and you change all the instances. Without a databased system, you must manually edit each instance that appears on your site.
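Here's a minimal, assumption-laden illustration of that difference using an in-memory SQLite database: every "instance" renders from the same row, so one UPDATE corrects them all at once. The table, county and function names are invented.

```python
import sqlite3

# Toy single-source-of-truth setup; all names are hypothetical.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE results (county TEXT, pct_reporting INTEGER)")
db.execute("INSERT INTO results VALUES ('Richland', 99)")

def render_map_label(db: sqlite3.Connection) -> str:
    """Any page element that needs this number queries the database, not a document."""
    pct, = db.execute(
        "SELECT pct_reporting FROM results WHERE county = 'Richland'").fetchone()
    return f"Richland County: {pct}% of precincts reporting"

print(render_map_label(db))  # the stale snapshot
db.execute("UPDATE results SET pct_reporting = 100 WHERE county = 'Richland'")
print(render_map_label(db))  # every instance now shows the corrected number
```

In the document model, that correction means hunting down and hand-editing each page where the stale number appears.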

Because newspapers don't tend to think in terms of databases, they create documents. Even the cool "Interactive Map" that Charleston.net posted after the South Carolina primaries isn't really a true database product. Yes, it nicely mashes up election data to a Flash map, and it looks great. But the data is an isolated capture.

That is to say: because we didn't design the guts of the map around a central database, the data in the map wasn't automatically updated when the state parties provided their final tallies. That's why it's Feb. 6th and our Jan. 26th map still says that 1 percent of Richland County's precincts have yet to report.

To a newspaper editor, that's a quibble. We published. We had a map. It was cool. But I look at that 1 percent and see an Exacto knife cutout. It's a good Exacto knife cutout, but I know that technology has changed the audience's expectations. I know that this change is continuous and accelerating. I know that the data matters.

Why should a newspaper editor care? Because if he'd bitten the bullet on creating an integrated election-results database in 2007, he'd have all sorts of cool things that he could do with that data in 2012. Or November. Maybe his reporters would play with it and find fascinating stories. Maybe he could open it up to users from the website and they would find fascinating stories. We'll never know.

If you cannot immediately grasp why this concept is fundamental to understanding the immediate future of 21st century journalism and culture, well... I'll be posting on that later.

Are you thinking, or "quorum sensing?"

In the fall of 2005 I wrote one of my final science-beat articles on research into a biological phenomenon called "quorum sensing." Specifically, quorum sensing represents a form of chemical communication between bacteria. That's vaguely interesting, but the exact moment at which quorum sensing transformed my understanding of the world took place when a microbiologist described bacterial behavior as being "kinda like a really bad corporate environment."

Because I realized it wasn't "kinda" like a bad corporate environment -- it was EXACTLY like a bad corporate environment.

QUORUM SENSING IN BIOLOGY
To summarize: Bacteria send out chemical signals into their environment. As they wander about they encounter chemical signals sent out by other bacteria. At low levels of population density, this chemical signaling has zero effect on bacterial behavior, but as a population begins to expand, the background noise of chemical signaling passes a threshold. The result is like flipping a switch: Bacteria that acted one way below the threshold suddenly behave in radically different ways. And they do so like a light turned on by a switch rather than like a light turned on by a rheostat.

Hence: When an individual bacterium receives only a few chemical signals, it acts as an individual. But once it perceives that there are sufficient members of its species in the area, it cooperates in complex ways to create a colony. Once that colony is established, quorum-sensing bacteria will serve the colony even if that means committing suicide.
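The switch-versus-rheostat behavior described above can be sketched in a few lines: behavior is a step function of signal concentration, not a gradual ramp. The threshold and signal units here are arbitrary stand-ins, not biology.

```python
THRESHOLD = 50  # signal units at which the "switch" flips; arbitrary for illustration

def behavior(population: int, signal_per_cell: float = 1.0) -> str:
    """Each cell adds to the background signal; behavior flips at the quorum threshold."""
    signal = population * signal_per_cell
    return "colony (cooperate)" if signal >= THRESHOLD else "individual (lie low)"

for n in (10, 49, 50, 500):
    print(n, behavior(n))  # identical behavior on each side of the line, abrupt flip at 50
```

Note that 49 cells and 10 cells behave identically, as do 50 and 500: the output is a switch, not a rheostat.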

This is why having one or two E. coli on your hamburger won't make you sick. Individual E. coli are waiting for a signal to start acting aggressively. A few E. coli acting up would immediately be overwhelmed by your immune system; an army of E. coli, conducting a coordinated surprise attack, will put you right on your ass.

(Interestingly, this also takes place in the animal kingdom. Ever notice how fire ants tend to bite all at once? That's because they're using chemical signals. If the first fire ant to reach your ankle bit you right away, you'd kill it and brush off the rest. Instead, the ants wait for the signal that says a bunch of them are in position, and then they bite simultaneously. It's an evolved form of communication that enables ants to inflict maximum damage on their enemies.)

And how do bacteria colonies compete for resources? Our classical, Western, market-based idea of competition holds that whichever competitor is the most productive wins. But quorum-sensing bacteria often win by cheating, secreting chemicals that poison or confuse their competitors.

Does the best bacterial colony win? Not usually. The most organized and established bacteria win.

And since our subject is media change, this point is significant, too: In a mature bacterial ecosystem, there is little actual competition. All the organisms in the ecosystem have evolved to exploit their own niches. What sets off bacterial war is anything that upsets that equilibrium. Competition -- often violent -- occurs whenever nature encounters a vacuum.

QUORUM SENSING IN THE MEDIA
I believe it helps to think of mass-media as an ecosystem. Newspapers and TV stations competed for scoops, eyeballs and ad revenue, but TV didn't threaten to put newspapers out of business, and vice versa. That particular media ecosystem remained more or less stable from the early 1960s to the early 2000s. The World Wide Web began threatening it in 1994-95, but it took the mass distribution of broadband access and a series of subtle technological advancements to actually disrupt the equilibrium that pays journalists' salaries.

We practice journalism today in the transitional period between an old equilibrium that has ended and a new equilibrium that has yet to take shape. The outcome cannot yet be reliably predicted, and the notion that the best, most productive ideas will naturally rise to the top is far from proven.

As I look around, I see a lot of companies acting like bacteria colonies. They send out signals and try small initiatives, but few are moving in any bold, wise directions. Many executives are just sitting around, receiving signals from their environment, waiting for the signal that a "quorum" has coalesced around a new direction.

In other words, their actions will not be determined by an independent, forward-thinking assessment of individual ideas, but by their perceptions of where everyone else is going.

On the one hand, this isn't a bad approach. It's certainly traditional, and it certainly offers at least the illusion of safety. But this corporate quorum sensing has also been the cause of some amazingly foolish industry fads -- like the "pay-to-read-our-website" push and the "heavy registration" mandate. It's not like the experts didn't tell the executives these were bad ideas at the time -- it's just that everyone understood from the signals in their boardroom environments that the CEOs and shareholders were tired of websites that didn't make profits.

QUORUM SENSING WITHIN THE PROFESSION
Journalists like to believe that we are -- as a profession -- a tribe of free-thinking individualists. It turns out we're not that different from other professions, with the same incentives toward group-think, quorum-sensing and anti-intellectualism.

It's also time to call bullshit on the newsroom tendency to imagine that all these conformist, bottom-line tendencies arise with the beancounters and are then forced upon our noble First Amendment enterprise. Newsrooms are, by their very nature, conservative institutions that abhor anything that disrupts the production cycle. We're actually hostile to innovation.

You'd think that an entity that produces something new every day would be adapted to rapid change. The opposite is true. A newspaper is a physical object that is printed and delivered at the same time, every day. Only what goes into it changes. And so long as what goes in doesn't upset the production process or change the physical object, content barely matters.

But try to change any of those variables -- deadlines, workflows, meeting schedules, the relationships between our print products and our electronic products -- and watch all hell break loose. A newspaper is a machine honed to perfection by time. It can't adapt easily to new things because it's become so efficient at doing the same thing, over and over.

Why bother to talk about this? Because as we discuss creating or improving quality in the digital environment, what we're really talking about is re-engineering the entire environment. Print journalists want to talk about saving newspapers, and that's the wrong topic. The real issue is how we'll change journalism to function with 21st century tools.

Why talk about this in the context of bacteria? Because print journalists by and large still discuss new-media journalism based not on experience or study, but on their quorum-sensing perception of their peers' attitudes. New media tools like database/map mashups represent wonderful new opportunities for advancing the original goals of journalism, yet print journalists still tend to frame their internal discussions of new media as a narrative about civilized people besieged by barbarians. These men and women are not thinking about the future, they are conforming to their perception of the present.

And if you are outside of that perceived consensus, you're likely to be very lonely.

PUTTING QUORUM SENSING TO WORK FOR YOU
A shorter version of this post might be: "Being right is less important than being normal." Because something about human psychology tells us that the more often we hear or see something in a non-threatening way, the more normal it becomes.

Two things will have to happen before we'll succeed in creating quality journalism for the new media ecosystem. Thing No. 1 is we'll need a functional business model; Thing No. 2 is we'll need to convince everyone in the business that our ideas fit within the mainstream.

For instance, I predicted two years ago that the future of news reporting will include geo-tags for every location we mention in stories. And the technology exists today to integrate this into the newswriting workflow. The costs of including this in our reporting are minimal, the benefits are significant, and you could make money by doing it.

So why aren't newspapers doing this?

Because it seems weird. Because it's new. Because the people who make these decisions can't quite imagine the new products that would make use of this data or the process that would create a geo-tagging interface for newswriters and editors.
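To give a sense of how small the technical lift actually is, here's a minimal sketch of what a geo-tagging step in a newswriting workflow might look like. The placenames, coordinates, and the dictionary-based gazetteer are all hypothetical stand-ins; a real newsroom system would call out to a geocoding service instead.

```python
# Hypothetical gazetteer: placename -> (latitude, longitude).
# In practice this lookup would be a geocoding service, not a dict.
GAZETTEER = {
    "Columbia": (34.0007, -81.0348),
    "Charleston": (32.7765, -79.9311),
}

def geo_tag(story_text):
    """Return a geo-tag for every known location mentioned in the story."""
    tags = []
    for place, (lat, lon) in GAZETTEER.items():
        if place in story_text:
            tags.append({"place": place, "lat": lat, "lon": lon})
    return tags

story = "City officials in Columbia announced the new budget Tuesday."
print(geo_tag(story))
```

The point of the sketch is that the tagging itself is a trivial lookup; the hard part is institutional, not technical -- deciding that location metadata belongs in the story object at all.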

But here's what will happen: As executives are exposed to more geo-data mashups, they'll begin to perceive that geo-tagging is OK. Geo-tagging will start to seem like a normal thing to do with bits of information. They'll start to see the profit potential in it, because companies like Google will be making money off of it.

And on that day, some newspaper executive will ask his assembled subordinates, "Why aren't we geo-tagging all our stories? Why are we behind the curve again!?"

In other words, seeing the future isn't enough. We have to communicate that future over and over, spreading our good ideas and winnowing out our bad ideas. We have to understand that our decision-makers will have to see our ideas crop up in many places before they'll see them as valid.

News veterans -- and I'm one of them -- are not generally enamored of the newest generation of entry-level reporters. We question your work ethic, your willingness to learn, your willingness to pay your dues. I think there are lots of things young journalists should learn from veterans -- but let's be frank: You're also in a position to help us advance new ideas about journalism and media.

You can help by leading, but you can also help by treating new ideas as normal evolutions of old values. You can become quorum-sensing transmitters of normalcy to anxious news executives. You can think about novel technologies without feeling threatened by them. You can see change as an opportunity rather than a threat, and you can communicate that with your attitudes.

Whatever the fate of your generation, you can play an important role in the current transformation from the old equilibrium to the new. It might even make some of you stars.

"Charlie, here comes the deuce. And when you speak of me, speak well." -- Crash Davis, Bull Durham (1988)

Tuesday, February 05, 2008

Quality and other essential bullshit

"The place to improve the world is first in one's own heart and head and hands, and then to work outward from there. Other people can talk about how to expand the destiny of mankind. I just want to talk about how to fix a motorcycle. I think that what I have to say has more lasting value." -- Robert Pirsig, Zen and the Art of Motorcycle Maintenance, 1974
Our topic for Feb. 25th (absent a spiffy name-change) is "Quality in the digital environment." And obviously, with professional journalists coming in to talk to journalism students and faculty, the form of quality we're going to be discussing is ...

Musical quality.

Or we might as well be. Because "quality" turns out to be one of those philosophical topics that professional journalists simply hate. So to get things started, let me encourage you to begin your thinking about journalistic quality in the digital environment by forgetting about the journalism and the vaguely defined digital environment and focusing on that elusive word: quality.

So: Back to music.

You know good music when you hear it, right? And your tastes in music have improved with age, as you've listened to more music? Most likely.

But if I asked you to define -- in terms that apply to all music and to all listeners, in all meaningful situations -- what makes one piece of music quality and another pure crap, would you consider that a valid request? Where would you begin?

And if it's not possible to define a universal standard of quality for something as universal as music... well, how are we going to do that trick for journalism?

We all think we know quality when we see it (or hear it), but when we attempt to define what quality is, we can't get there. We can't pin quality down, measure it in absolute terms, cite its sources. In touching upon this word, we are entering into hostile territory: the realm of the philosopher.

Quality is, at its heart, a matter of metaphysics, and journalism -- as a set of tools -- is poorly equipped to deal with metaphysics. Our profession is a mental discipline in which we are asked to weigh multiple bits of information and perspective and then responsibly reason our way through conflicting possibilities by asking: How do you know? Where is your proof? In our business, the material always trumps the intangible: You've got an anonymous quote? My on-the-record quote trumps it (regardless of the quality of the two quotes, by the way). You've got three people who'll say something on the record? My official, FOIA-obtained document kicks their collective asses. And so on.

So when you try to talk philosophy with journalists, be prepared. Our brains short-circuit. What journalists really believe is that things that cannot be sourced in a materialistic way are -- in essence -- irrelevant. Which means that quality is just a word, and people who say things like "quality cannot be defined" are simply eggheaded bullshit artists.

OK. Got it.

But understand this: To be great, you must be willing to tackle the intangible. To build better journalism, you simply must struggle with the abstract dilemmas of quality, because the answers you find will point you to new possibilities.

Are all opinions of quality equal? Should decisions about quality be trusted to the people or to informed elites? What expertise improves our understanding of quality? Can quality be measured by popularity? Or profitability? Should we measure journalistic quality by how closely we hew to abstract notions like "truth," or by tangible metrics, like correction counts? Is quality found in a process or discipline, or does it reside solely in the end product, however created?

Why should anyone care about this? Because brand new tools are changing the context in which we commit journalism. My charge to you: Do not judge new tools by old assumptions -- particularly those that reject the validity of abstract criticism.

Because we do not look to improve the quality of journalism for journalists. Or to make more money for stockholders. And we certainly don't do it to validate the grumpy opinions of our high priests.

The only reason to improve the quality of journalism is to serve people. And we should get about it.

"Any effort that has self-glorification as its final endpoint is bound to end in disaster." -- Robert Pirsig, Zen and the Art of Motorcycle Maintenance.

Wake up!

I mothballed this blog in 2007 for many reasons. Some of them were conceptual (I felt that discussing media absent the larger context of culture was pointless); others were personal (I'd stepped down from a "management" level job and returned to reporting).

But it occurred to me last week that it was time to bring it back in a new, and very specific, incarnation: From now until Feb. 24, I'll be using this site to prepare students at the University of Mississippi for my visit on Feb. 25th. Janet and I will be appearing on a panel with the tentative title "Quality in the Digital Environment," then sticking around to talk with the staff of the student newspaper there.

There's a great story about a radio talk show host and fan of Noam Chomsky who once ambushed Jeff Greenfield about why Nightline wouldn't invite Chomsky on as a guest. Greenfield's answer -- oft-derided but certainly candid -- was to ask a question: Could Chomsky condense his ideas down to 30-second answers?

And obviously, the answer is no. You can't introduce an idea in 30 seconds. All you can really do in 30 seconds is reinforce existing ideas, scoring shallow debating points against "the opposition."

I have no interest in boring people who don't care about my topic with lengthy explanations. In fact, I really have no interest in people who don't care about my topic -- which, based on experience, will likely be most of the people in the audience. But I'm intensely interested in the self-selecting few who will find something of value in the ideas I represent.

To those people, I dedicate this new incarnation of this old blog. I'll try to post a steady stream of ideas and concepts here, then leave it around for reference for anyone who comes along later.