Reconfiguring journalism in the digital era
UBC professor Alfred Hermida is charting the evolution of data journalism in newsrooms.
Posted by GRAND NCE, November 6, 2014

Newsroom of Maison Radio-Canada in Montréal, QC. Photo by Jason Paris.

In a time of declining resources, a new form of data-driven journalism is on the rise in beleaguered newsrooms. Integrating data analysis, visualization, and computer algorithms into journalistic practice, the emerging field is pioneering ways to turn our glut of public data into engaging stories for media-saturated audiences.

“What we’re seeing is an evolution in the way journalists approach data,” explains University of British Columbia professor Alfred Hermida. “The opportunity here is that there is a lot of data available. If the role of the journalist is to help you make sense of the world, one way you can do this with big data is to present it in a way that helps citizens then make sense of that data.”

Though reporters have long exploited data and computer power to do their work (computer-assisted reporting dates back to the mid-twentieth century), many are now using data to drive storytelling and investigations. From algorithms that write basic news stories, to visualizations and search tools that allow the public to interrogate datasets on their own, data technologies are changing the norms and practices of traditional journalism.

Hermida, a veteran BBC journalist and a founding news editor of the BBC News website, and UBC colleague Mary Lynn Young have been leading GRAND research to understand the challenges and opportunities of using data among practicing journalists and those who train them.

One of their key studies (soon to be published in the scholarly journal Digital Journalism) focuses on the creation of the Los Angeles Times Data Desk, an informal team of reporters and computer programmers who together pioneered new approaches to data-driven reporting.

The Data Desk’s Homicide Report is a leading example. Started in 2007, the project has built a map and a growing database of all of the homicides in L.A. County over the past decade. In addition to an interactive bubble chart plotting the county's grim murder statistics across the region, algorithms developed by Data Desk programmers pull facts from the databases to write a basic factual story about each and every homicide – an unprecedented scope of reporting.

“To these robotic stories, which are very formulaic, designated reporters then add more detail – the additional human side of the story. So essentially every homicide gets a story. It positions the newspaper not just as a way of reporting the facts, but also as a public square for people to come together and express their emotions, their grief around what happened in their neighbourhood.”

As Hermida points out, this form of automated reporting addresses the deficit in traditional journalism of only reporting on a small fraction of what is happening in the world. Mainstream crime reporting, for example, mainly pays attention to outlier homicides – those marked by unusual circumstances deemed newsworthy. Having the facts covered also frees the journalist to focus on aspects of the story that require a human perspective.

“It’s a much better use of your brainpower. A lot of what happens in journalism is fairly rote. You have stories now written by algorithms, which you’d be hard pressed to distinguish from what a journalist wrote. If you can spend less time writing basic stories that say ‘this happened here to this person in this location’ you can spend more time explaining why it happened, how did it happen, what does it mean.”
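The formulaic, algorithm-written briefs Hermida describes – "this happened here to this person in this location" – can be sketched in a few lines of Python. This is a hypothetical illustration of template-based story generation, not the Data Desk's actual code; the record fields and wording are invented.

```python
# Hypothetical sketch of template-based story generation, in the spirit of
# automated briefs like the Homicide Report's (not the L.A. Times's real code).

def basic_story(record):
    """Turn one structured homicide record into a formulaic news brief."""
    template = (
        "{name}, a {age}-year-old {gender}, died on {date} "
        "in {neighborhood}. The cause of death was {cause}."
    )
    # Fill each placeholder from the database record.
    return template.format(**record)

# An invented record standing in for one row of a homicide database.
record = {
    "name": "John Doe",
    "age": 34,
    "gender": "man",
    "date": "March 3, 2014",
    "neighborhood": "Westlake",
    "cause": "gunshot",
}

print(basic_story(record))
```

Because the template covers the bare facts automatically, a reporter's time is freed for the follow-up questions – why it happened, what it means – exactly as Hermida argues.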

Working with big datasets also gives reporters a broader social and political context than traditional journalism affords. Data visualizations and analytics give journalists tools to see historical trends, generate research questions, identify stories and test hypotheses, and challenge their own impressionistic views.

“Often the way journalists approach a story is through conventional wisdom. One of the advantages of data journalism is they can take the conventional wisdom and see if it holds up to the data. In many cases, it doesn’t.”
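Testing conventional wisdom against the data can be as simple as tallying the records rather than trusting impressions. The sketch below is a hypothetical illustration only: the records, field names, and the assumption being checked are all invented.

```python
# Hypothetical sketch: checking a newsroom assumption against the data itself.
# Suppose conventional wisdom holds that most homicides involve strangers;
# a data journalist can count the records instead of relying on impressions.
from collections import Counter

# Toy records standing in for a real homicide database.
records = [
    {"neighborhood": "Westlake", "victim_knew_suspect": True},
    {"neighborhood": "Vermont Square", "victim_knew_suspect": True},
    {"neighborhood": "Westlake", "victim_knew_suspect": False},
    {"neighborhood": "Boyle Heights", "victim_knew_suspect": True},
]

# Which area sees the most cases, and how often victim and suspect knew
# each other -- two questions impressions alone cannot answer reliably.
by_neighborhood = Counter(r["neighborhood"] for r in records)
knew = sum(r["victim_knew_suspect"] for r in records)

print(by_neighborhood.most_common(1))
print(f"{knew}/{len(records)} victims knew the suspect")
```

On real data, the same few lines either confirm the conventional wisdom or, as Hermida notes is often the case, show that it doesn't hold up.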

Data Journalism in Canada

By going “behind the curtain” at the L.A. Times, Hermida and Young sought to discover how data journalism is adopted or promoted by news organizations, and how journalists see their role in dealing with data. What they found was that the advent of the Data Desk was more an evolution of existing practice than a revolution, more continuous than discontinuous.

“The biggest challenge among institutions is not technology, it’s culture – the cultural norms and practices within that news organization,” says Hermida. “The way to effect change in these news institutions often depends a lot on personal relationships, and identifying key individuals willing to take on new initiatives. [It’s about] certain key people in the right place, with the right connections at the right time.”

"Part of what we have learned from our research is that we should not fall into that trap where we will let our own personal biases inform our decisions. We need to take our own personal views out of the equation and look at the data."

Hermida and Young are now preparing to replicate their research in Canada by identifying champions and early adopters of data journalism. David Skok, a GRAND partner, is one of those key people. Now at The Boston Globe, Skok was previously director of digital at Global News and co-founded a national network of news websites that has gone on to become Canada’s fastest-growing news and information site. In 2013, Global launched Canada's first data desk, comprising a dedicated team of data journalists.

“You don’t look at it as immediate returns,” Skok said in a 2013 interview. “But when you think about how to distinguish yourself from all the news organizations out there doing the same stories all day, you have to have something journalistically different or fresh or more in-depth.”

Though data journalism is making its way into Canadian newsrooms, Hermida's findings point to a significant lag behind American and European counterparts. He argues that Canadian news organizations, largely spared the fallout of the 2008 global economic crisis, face less competition and hence have less incentive to innovate.

Journalism schools have also been slow to adapt to the changing times: the skills needed to work effectively with data – those traditionally found in information schools and computer science departments – are still not seen as significant for journalists going forward. “We’re looking at the future through the rearview mirror,” says Hermida, referencing Marshall McLuhan.

At UBC’s Social Media Advanced Research, Teaching and Training Lab (SMARTTLab), Hermida is taking the lead in training a new breed of journalists. The interdisciplinary centre is dedicated to understanding the interplay between social networks, the media, and public discourse. Students also learn to use social media analytics tools.

Ultimately, Hermida sees news organizations that adopt these best practices using data journalism to provide a form of public service, making raw data accessible and useful to consumers.

“The role for the data journalist becomes not only to provide the story but to provide tools and means for the consumer to explore the story and figure things out for themselves. The question is: How do we have to change the way we think about our journalism in order to provide this service to the public?”

Alfred Hermida’s new book, Tell Everyone: How the Stories We Share Shape What We Know and Why It Matters, was recently published by Doubleday Canada.



Spencer Rose