‘Extracting key facts from the big news stories and publishing them in machine-readable format so talented creative people can make beautiful displays.’ Lisa Evans, The Guardian
We are in the midst of data journalism’s golden era. I know this may be impossible to judge when the profession is arguably still only forming, but look at the big stories of the last few months and you will find data journalism at their core:
- Wikileaks (Iraq, Afghanistan and US embassy cables)
If there is data, there is probably a story in it. The size of the story may vary, but the essence is that if a mass of data is effectively filtered and presented, then what was not apparent at first can quickly become so.
The question for me is what comes first: the story then the data, or the data then the story? The answer, excuse me for sitting on the fence, is that both are right in certain circumstances. For example, the US embassy cables data came first and the stories started to trickle out; for other stories, such as the recent flu outbreaks, the data follows the initial news line.
The most glaringly obvious answer to this post’s title is: to generate new stories. Beyond this, however, there are finer points.
The main ways data journalism can be used:
1) Mapping national events to pinpoint localities
2) Demonstrating trends over long periods of time to support a new story
3) Centralising information that is spread across many platforms
4) Acting as a gateway to data the public may not have known about
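To make the first and third points concrete, here is a minimal sketch of the kind of filtering involved: taking a national dataset and aggregating it down to per-locality figures so a local angle becomes visible. The dataset, column names, and figures below are entirely hypothetical, invented for illustration.

```python
import csv
import io
from collections import defaultdict

# Hypothetical sample: national grant spending, one row per award.
SAMPLE = """region,recipient,amount
Manchester,Library Trust,12000
Leeds,Arts Fund,8000
Manchester,Youth Club,5000
Leeds,Sports Hall,15000
"""

def totals_by_region(raw_csv):
    """Aggregate a national dataset down to per-locality totals."""
    totals = defaultdict(int)
    for row in csv.DictReader(io.StringIO(raw_csv)):
        totals[row["region"]] += int(row["amount"])
    return dict(totals)

print(totals_by_region(SAMPLE))
# → {'Manchester': 17000, 'Leeds': 23000}
```

A reporter in either city now has a locally relevant figure that was not apparent in the raw national file, which is the essence of the filtering described above.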
Data journalism promotes transparency in public bodies. The thirst for data, and the ability to obtain it through the Freedom of Information Act, means that openness is a requirement rather than an option. The focus is on the raw facts, from which opinions can then be formed.
Or in other words, data journalism goes hand in hand with open government.
Michael Greenfield (mgreenfield13)