
There are scores of news sources covering the elections, but which ones give us a clear picture of how things are progressing, especially when so many sources these days rely on sensationalism or political bias to build an audience? One potentially under-valued way to understand what's happening in our world is to look at online prediction markets.

Online prediction markets are sites where individuals can place bets (usually non-monetary) on particular outcomes of world events. These "bets" take the form of either/or outcomes, such as whether a particular movie will be number one at the box office or whether a political candidate will win or lose an election. Bets are then traded much like stocks on a stock market. If you bet wisely on a particular outcome, you stand to "profit" on your bet. As individuals buy and sell on potential outcomes, the collective prediction fluctuates in real time. As bystanders, we might gain some insight into predicted outcomes and their underlying drivers by watching these fluctuations as they unfold.

But are these predictions accurate? Markets such as the Iowa Electronic Market have been recognized as fairly accurate and effective predictors of past elections. While past performance is no guarantee of future results, it can be enlightening to watch the outcome percentages fluctuate over time. Behind shifts in predicted outcomes lies the aggregated knowledge of all the market's traders. In other words, when a percentage shifts, there may be new information driving that shift. In the case of an election, this might be the punch of a campaign ad or the release of a potentially damaging piece of candidate history. Because traders stand to benefit if they move quickly on new information, outcomes in exchange markets often reflect the very latest information and can serve as a bellwether of sorts around particular events as they occur. For example, perceived performance during a live debate might drive real-time market predictions. Additionally, as bystanders we can gain a better understanding of important events without first having to track down and analyze the underlying facts. The underlying information is still critical, of course, but instead of finding it and determining what it all means, we might do the opposite: look for impacts, then find the information behind them that might prove meaningful.

Monitoring outcome percentages in prediction markets might be useful for the casual news consumer. To explore whether this theory holds, I have pulled prediction numbers around the upcoming presidential election from three online prediction markets. Specifically, I include real-time predictions from Intrade, Foresight Exchange, and the Iowa Electronic Market on whether the incumbent president will secure a second term¹.

Once per day, I will automatically pull the prediction numbers from the three respective markets and publish them to my Twitter feed. My hope is that these percentages and their fluctuations over time might help the casual news consumer gain a slightly better understanding not only of the election outcome, but of meaningful news events surrounding the election, without having to wade through a sea of potentially conflicting reports. I'm curious whether this experiment will be useful or the predictions accurate. I have no idea; perhaps you can help me understand. I'll be watching responses on Twitter and reporting back if interesting stuff arises.
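For the technically curious, here is a minimal sketch of what such a once-a-day script might look like. Everything in it is a hypothetical stand-in: the URLs, JSON field names, and output format are assumptions, since each market exposes its numbers differently, and the actual posting step is only noted in a comment.

```python
# A minimal sketch of the daily script, assuming hypothetical endpoints.
# None of these URLs or JSON field names are real; each market exposes
# its numbers differently (web pages, CSV dumps, or APIs), so the fetch
# step would need adapting per source.
import requests

MARKETS = [
    # (market name, hypothetical data URL, hypothetical JSON field)
    ("Intrade", "https://example.com/intrade/obama-2012.json", "last_price"),
    ("Foresight Exchange", "https://example.com/fx/dem-2012.json", "probability"),
    ("IEM", "https://example.com/iem/pres12.json", "price"),
]

def fetch_predictions():
    """Return {market name: percent chance of re-election}."""
    results = {}
    for name, url, field in MARKETS:
        data = requests.get(url, timeout=10).json()
        results[name] = round(float(data[field]) * 100, 1)
    return results

def format_tweet(predictions):
    """Build the one-line daily summary for posting."""
    parts = ["%s: %.1f%%" % (name, pct) for name, pct in predictions.items()]
    return "Re-election odds | " + " | ".join(parts)

if __name__ == "__main__":
    # Scheduled once per day (e.g. via cron); actually posting the line
    # to Twitter would use a client library and account credentials.
    print(format_tweet(fetch_predictions()))
```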

Latest Results:


¹ To be clear, I could just as easily have tracked the chance of Obama losing the election (% chance of losing = 100 - % chance of winning, for those keeping score at home). The decision to track the chance of re-election, and to track the presidential election more generally, does not represent my views on the outcome (i.e. whether I prefer either result); it reflects my desire to do the least amount of math possible, since all three prediction markets provide data in the affirmative (win) rather than the negative (lose). It's also worth noting that these three markets track slightly different potential outcomes; for example, it is technically possible for Obama to not secure a second term without losing the election. I feel these outcomes are similar enough, for the type of monitoring I propose, to treat as equally representative of a predicted re-election outcome.

Intrade tracks Obama's re-election; Foresight Exchange and the Iowa Electronic Market track any Democratic candidate winning the presidency.

Most of the blogging work I've done over the last several months has been around NPR's local news efforts.

In Beyond the Blog, I point out some interesting places where Gawker Media's newly proposed redesign may teach public media a thing or two, perhaps suggesting that we're on good footing.

In another post, Top 10 Challenges Stations Face in Adopting Local Continuous News, I go into detail about what we've learned while introducing news blogging to our pilot public radio stations and teaching them how to do it. There are certainly lessons here around new technology adoption and organizational change management (pro tip: it's hard).

And even earlier, I looked at the Pew Internet & American Life Project's findings from their Neighbors Online report on individuals' use of online tools. The big takeaway: people don't care any less about local news; they're just shifting their attention to a wider array of sources and finding content via their social connections online.

My NPRbackstory experiment got some press this week when Josh Benton from Harvard's Nieman Journalism Lab published an in-depth piece on the utility. Josh and I had discussed the project last fall, right before I started working for NPR (the utility was cooked up as a homegrown effort to play with the API and is not officially endorsed by NPR). More recently, he saw an interesting backstory piece pop up on the Kentucky Derby and plumbed his own archives. I'm particularly excited by his focus on how the tool extracts value from existing news archives.
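For those curious about the mechanics, the utility boils down to a single lookup: take a term that is suddenly popular and search the archive for older coverage of it. Here is a rough sketch of that idea; the search endpoint, parameters, and response shape are hypothetical placeholders rather than the real NPR Story API.

```python
# A rough sketch of the NPRbackstory idea, not NPR's actual code: pair a
# currently trending term with older archive coverage of the same topic.
# The search URL, parameters, and response fields are hypothetical
# placeholders, not the real NPR Story API.
import requests

ARCHIVE_SEARCH_URL = "https://example.org/story-api/search"  # placeholder

def backstory(trending_term, api_key, before_year=2008):
    """Return archive stories about the term published before a cutoff year."""
    resp = requests.get(
        ARCHIVE_SEARCH_URL,
        params={"q": trending_term, "apiKey": api_key},
        timeout=10,
    )
    stories = resp.json().get("stories", [])
    # Keep only the older pieces: the "backstory" behind today's trend.
    return [s for s in stories if int(s.get("year", 0)) < before_year]

# Example: surface archive coverage of a suddenly popular topic.
# for story in backstory("Kentucky Derby", api_key="YOUR_KEY"):
#     print(story["title"], story["url"])
```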

The piece ended up getting attention from Techmeme, Waxy.org, Christian Science Monitor, Journalism.co.uk, Poynter Online, and others.

And of course I'm grateful for all the positive mentions on Twitter... and for my employer not pulling my API key when they found out what I had done ;-)

Update (June 7, 2009): Great coverage of NPR's forward-thinking digital strategy highlighting NPRbackstory from Mashable and CBS News (Monday Note).

Internet solutions appear wherever finding, connecting, and sharing information with others is expensive or difficult. This is especially noticeable when individuals with similar interests but insufficient proximity are finally able to connect. Unsurprisingly, there are now sites bringing together global interest in speaking Klingon, knitting food, and collecting cookie fortunes.

But what about deploying internet technologies for people who are near one another? Certainly this technology isn’t just about bringing together far-flung hobbyists – there should be unresolved information needs that exist at a local level, as suggested by the buzz around hyperlocal news.

In determining these information needs, we must resist the temptation to focus on what media organizations prescribe or on what is currently vanishing from existing news outlets. Instead, we should look at routine communication barriers that can be dismantled by internet-based solutions. This is surprisingly difficult to do, since we often don't see the barriers we face or recognize them as unnecessary. To determine where technology might best be deployed to address local needs, we must find situations where members of local communities are actively trying to find, connect, and share information with one another. Then we can look more closely at the difficulties, delays, and expenses that might be eliminated or reduced through more tailored use of online technology.

Looked at this way, it becomes clear that tools for finding and connecting with others nearby to exchange our stuff (craigslist.org), meet around shared interests (meetup.com), and initiate relationships (match.com) have all been remarkably successful. But what about sharing local news? Success there has been less pervasive and less straightforward. Arguably, this is because existing solutions have not yet fully uncovered the true needs and barriers around sharing local news.

Another method for determining what these needs and barriers might be is to monitor online tools that excel at supporting a breadth of communications. Within these tools, we might find clusters of people who share geographic proximity and are actively communicating. Identifying patterns in communications or locations here will reveal which local needs may be benefiting most from the reduced friction of online communication.

Interestingly, most social networking tools provide little of this local communication. Both LinkedIn and Facebook, for example, seem to excel at connecting out-of-touch and geographically disparate individuals. Things have started to shift, however, with the introduction of the short-messaging service Twitter. With Twitter, people are starting to connect with one another simply because they are nearby. Twitter seems different in this regard, and understanding how might just be the key to understanding where frictionless local communication holds the most promise.

Twitter saw its first big explosion in usage during the 2007 SXSW festival in Austin, TX, in large part due to attendees' unresolved need to connect with others at the conference. Ironic as this may seem, as you move around an event like a conference you become a mostly passive recipient of information, cut off from explicitly sharing the experience with others. Communication needs at large events range from broadcast heckles to simple queries: where are my friends, which sessions are attendance-worthy, whom should I get to know? In my own experience, this proximity effect of Twitter carries over into day-to-day situations as well; it becomes valuable to follow someone simply because they live near you. But why?

I believe one answer lies in the immediacy of the information being shared. Specifically, it is surprisingly difficult to share information about what's going on right now among people near one another. As with SXSW, local Twitter messages (tweets) are most valuable when they contain information about what is happening right now, often something that might affect me because of our relative proximity. For example, I might monitor the tweets of those I follow locally to know where they are or where they're going so that I can (presumably) join them. It's valuable to find out about something as it happens. I can always visit a traditional news source if I need to seek out a specific piece of information or learn of important happenings after the fact, but who's going to let me know about something important going on right now? It's this active nature of Twitter, filtered by real people and providing immediate, proximal information, that makes it so valuable. Nothing seems to match Twitter for a real-time assessment of what I need to know about what's going on near me.

Perhaps Twitter points to only one unresolved need, the need for immediate, proximal information, but I believe this need will blossom into a more significant source of local news and take different forms as tools more seamlessly encourage useful sharing.

WBUR Hyperlocal Discussion

Following a recent post and discussion on hyperlocal news, WBUR was kind enough to let me initiate an open discussion on the topic during their monthly meetup at the station.

Around 15 people participated in this discussion, including Lisa Williams from Placeblogger, Ben Terris from Boston.com's Your Town, Adam Weiss of Boston Behind the Scenes, Persephone Miel from Internews Network, and Doc Searls from Harvard's Berkman Center. You can hear the conversation here:

The conversation covers a wide range of topics, including:

  • Trends and directions of hyperlocal news, and where the emerging opportunities might be
  • What user demand might be around hyperlocal news, and where the current gaps are in addressing user needs
  • The rising importance of immediacy and speed in deploying hyperlocal solutions
  • The problem of scale and searchability around hyperlocal sites
  • How hyperlocal sites and the online-offline proximity connection might address the human need for social cohesion

WBUR Tweetup

On the evening of Thursday, February 5th, WBUR in Boston will be hosting their sixth (seventh?) monthly informal gathering at the station. WBUR regularly convenes the Boston social media community to facilitate discussion around social technology and its growing role and impact on local community, news, and public media. All are invited to attend this free and open event. Details here.

At this event, WBUR has agreed to let me lead a discussion on hyperlocal news - in part due to the good discussion that's stemmed from this hyperlocal blog post and my interest in doing a follow-up on hyperlocal's future potential. Won't you join us?

Keep an eye on this blog for a follow-up from the event.

everyblock.com crime map

The term "Hyperlocal" generally refers to community-oriented news content typically not found in mainstream media outlets and covering a geographic region too small for a print or broadcast market to address profitably. The information is often produced or aggregated by online, non-traditional (amateur) sources.

Hyperlocal news is conceptually attractive because of its perceived potential to rescue struggling traditional media organizations, even though most attempts at hyperlocal news websites have not proven entirely successful. Hyperlocal appears attractive to traditional media organizations for the following reasons:

  1. There is a perceived demand for news at the neighborhood/community level. The costs of print production and distribution have historically made providing this unprofitable, but the lower cost of web distribution could be used to serve this need.
  2. In an online world, regional media outlets are no longer the gatekeepers of news content and therefore must rely on their geographic relevance to provide unique value. Hyperlocal news leverages geographic relevance.
  3. The rise of citizen journalism and Web 2.0 seems to suggest that users could provide the majority of local content, thereby reducing or eliminating staffing costs.
  4. Local online advertising seems like a promising and not yet fully tapped revenue source.

History & Approaches
Hyperlocal seems to have emerged as a popular concept in 2005, even as regional news websites and blogs were already becoming common¹. In 2006-2007, the first significantly funded hyperlocal sites and platforms were launched. There were high-profile failures, most notably Backfence.com (2007) and LoudounExtra.com (from the Washington Post, in 2008). Many early efforts took the form of online newspaper websites, employing local reporters (or sometimes bloggers) and attempting to source user-generated content by inviting individual submissions or incorporating user discussion functionality. There was much speculation about why this approach often failed. Regardless of the specifics, the universal unprofitability suggests that producing a local newspaper-like presence simply doesn't create enough demand (online readership) to justify the costs (local staff). Of note are a few surviving examples, like the Chicago Tribune's Triblocal project, which creates and distributes hyperlocal print editions from its online content, and many hyperlocal blogs that operate on far more modest budgets.

Around the same time, a slightly more promising wave of information-heavy regional news sites (such as pegasusnews.com) emerged. These sites were inspired by the success of regional review sites such as yelp.com and Yahoo! Local, and built in response to the high costs of local content production. They focused on incorporating dynamic regional data, such as crime stats, permit applications, real estate listings, and business directories, in lieu of an emphasis on hand-crafted local reporting.

A third and perhaps most promising wave of local news sites emphasizes the aggregation of third-party content. These include platforms such as outside.in, topix.com, and everyblock.com, all of which take a framework approach: aggregating content, mostly through RSS feeds, for many geographic locations (in some cases thousands) in order to build enough accumulated traffic to make a local business model work. Slightly different takes on this model have individuals in specific locations acting as editors and republishing aggregated content (universalhub.com), or aggregator sites focusing on particular types of content (Placeblogger.com).
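To make the framework approach concrete, here is a minimal sketch of feed aggregation keyed by place, using the feedparser library; the neighborhood mapping and feed URLs are invented for illustration.

```python
# A minimal sketch of feed aggregation by place, assuming the feedparser
# library (pip install feedparser). The neighborhood-to-feed mapping and
# feed URLs are hypothetical; a real aggregator maintains thousands.
import feedparser

LOCAL_FEEDS = {
    "allston": [
        "https://example.com/allston-blog/rss",
        "https://example.com/city-crime-log/allston.rss",
    ],
}

def aggregate(location):
    """Merge every feed entry mapped to a location, newest first."""
    entries = []
    for url in LOCAL_FEEDS.get(location, []):
        entries.extend(feedparser.parse(url).entries)
    # published_parsed is a struct_time (a tuple), so it sorts directly.
    entries.sort(key=lambda e: e.get("published_parsed") or (), reverse=True)
    return [(e.get("title", ""), e.get("link", "")) for e in entries]

# One merged, reverse-chronological stream for the neighborhood:
for title, link in aggregate("allston"):
    print(title, link)
```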

Lessons Learned
You can't serve online users the same way newspapers or broadcasters serve regional audiences; the news and information demands are wildly different. It is not enough to reduce printing and distribution costs or break content into "bite-sized" pieces. Online users are trying to solve radically different problems, and they approach their information needs from a different perspective.

Giving participatory tools to users does not make them publishers. Users do not produce material that looks anything like mass media content. They do expect to be involved, and their efforts (such as sharing) can be helpful or even necessary in some contexts. However, assumptions about traditional publishers shifting effort "to the crowd" are misguided. Users are also notoriously fickle in their socially driven motivations, and our understanding of what motivates people to participate online, and in what contexts, is limited.

Manually producing local content is expensive. This isn't a surprise. What shocked people is that there is not enough consumer demand online to justify the cost.

Aggregation is cheap, and if done effectively it can create enough demand to be profitable, particularly across many locations. As more sources make their content available through RSS feeds and APIs, this will only get better.


¹ To be clear, the hyperlocal hype among traditional media organizations caught fire in 2005, but local sites like Craigslist and H2Otown were long-standing successes by that point, thereby playing their part in fueling the excitement.

In the early part of the 20th century, radio programs reached national audiences through newly constructed radio networks. For the first time, mass media news had a human voice, and later, with television networks, a face. This drove networks to develop trust as a human asset, and news anchors cultivated personalities that you welcomed into your home and returned to again and again. Over the ensuing decades, we stopped relying primarily on our friends and neighbors to learn what was going on in the world and instead looked to a few critical human voices that were trusted without question.

This trust began to unravel in the late 20th century. News media fragmented into biased channels, public opinion of reporting eroded amid clashes with the federal government, and shifting advertising revenue and downsized newsrooms led to highly visible gaps and gaffes in a previously trustworthy and consistent news reporting environment.

Meanwhile, the Internet is helping consumers become increasingly savvy about media, and new expectations around participation and transparency in information delivery are emerging. In this new environment, a singular voice of the news ceases to make sense, except perhaps when Jon Stewart mocks the system as a whole. Online tools are enabling collaborative and person-to-person communications as potentially more reliable and trustworthy mechanisms for getting news. Individuals now capture the news on their cell phones, deliver the news on their blogs, and share the news through social networks.

Perhaps news is no longer presented as a single story, cobbled together by a single agency and delivered through the mass media in a single voice. What was once a single story is now interpreted and conveyed by a range of voices through different formats, channels, and modes. As humans, we still build trust through human interaction and engage with stories delivered with emotion and conviction. Some stories are made more meaningful by our connection to the individuals telling them, others because a fresh, authentic human voice speaks to us. I believe we yearn for this raw communication as a method for getting our news and making sense of the information within it, something that historically has not been present in mass media.

If the future of news communication is more humanistic and distributed, delivered by an array of authentic storytellers, where does that leave the traditional journalistic reporter? Their importance doesn't suddenly evaporate. What is their place among this array of voices? Are we now all journalists, or do we expect the ones with credentials to develop their own authentic voices? Both are currently happening in this new environment, as some bloggers are vaulted into mainstream public attention and some existing journalists now craft their own blogs.

However, I think there's a third, less explored role for journalists, one arising from an array of unknown, emotion-fueled storytellers who do not necessarily engender trust. The very nature of these raw voices will cast doubt on the validity of the underlying information. Journalists must help ensure that the information in these stories is valid and that the stories themselves can be trusted. I believe the secret lies in weaving these new voices into a more cohesive whole. The time-tested role of editor re-emerges to perform this critical function. Perhaps the contemporary journalist wields new media editing tools the way the traditional journalist wielded a typewriter. The news is not delivered through a single human voice but by collecting the voices of others. The editor's red pen ensures that facts are preserved, underlying truths are revealed, and opinions are exposed. In this way, we get original voices, rich with information and authenticity, without being led astray by their subtle biases or gaps in reporting. A new voice of the journalist emerges, crafting the news out of independent tellings and turning the traditional piece on its head. Here, truth can be served in a compelling new way, and a variety of voices can reveal insights only possible through the stories of regular people.
