
Gatekeepers need better regulation in computational news

Author  :  YANG FAN     Source  :    Chinese Social Sciences Today     2022-03-02

Since the concept of computational journalism was put forward at the beginning of this century, its theory and meaning have been continually expanded and enriched with the advent of new technologies. Today, it covers a range of information technologies applied to journalism, from chatbots and recommendation systems to artificial intelligence (AI) and automated journalism. These technologies have been widely used in news gathering, production, and distribution. Data, algorithms, and software (collectively referred to as machines) are becoming an important part of contemporary journalism. There is no doubt that machines are replacing human beings, at least in part. At the same time, the gatekeepers who select and control the spread of news are also changing.

Gatekeeper theory

Gatekeeping refers broadly to the process of controlling and filtering information as it moves through society. The decision matrix that influences news control includes internal factors such as individual cognitive differences, the news production process, organizational characteristics, and external social institutional actors such as advertisers and governments, as well as social system factors such as culture and ideology.

Gatekeepers have control over how news content is disseminated and how social reality is constructed. Traditionally, journalism served as the primary gatekeeper. Today, however, the rise of social media has greatly weakened that role. The massive participation of users has generalized the definition of gatekeeper. As Pamela Shoemaker, originator of gatekeeping theory, pointed out, we are all gatekeepers in the new media era.

In light of this, some scholars contend that gatekeeping theory is outdated; some define users as “secondary gatekeepers”; and others believe that the role of journalism is changing from “constructors of social reality” to “managers of a secondary release of online content,” so that journalists have shifted from gatekeepers to “gatewatchers.”

Contemporary gatekeeping theory sees human roles, such as journalists, professional opinion leaders, and online influencers, as well as technological roles such as algorithms, and their interactions, as part of a broader practice of socio-technical gatekeeping. In the digital era, some journalism and communication scholars have noticed that the algorithms, aggregators, scoring systems, and regulatory platforms built into search engines have been playing the role of gatekeepers, serving as digital gatekeepers. In the beginning, digital gatekeepers were mainly in charge of the distribution of news. With the development of AI, digital gatekeepers have penetrated every link of news gathering, production, and distribution.

Computational news discovery

Within computational journalism, Computational News Discovery (CND) is a particular application area related to the use of algorithms to orient editorial attention to potentially newsworthy events or information prior to publication.

There are three common applications of CND. The first is to monitor the massive amount of content on social media platforms, detect sudden hot spots or recurring activities, identify and track them, and then recommend useful information sources or witnesses to journalists. The second is to monitor data sources in numerical or textual form and identify items that may interest journalists by setting rules or detecting outliers. The third uses machine learning to find patterns in historical reporting data, expanding the scope of news investigation and helping reporters find more news leads.
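As a minimal illustration of the second application, rule-based outlier detection on a monitored data source can be sketched in a few lines of Python. The data, the z-score rule, and the 2.5 threshold below are all hypothetical choices for the sake of the example, not part of any particular CND system:

```python
from statistics import mean, stdev

def flag_outliers(series, threshold=2.5):
    """Flag data points whose z-score exceeds the threshold.

    Items far from the historical mean may signal an anomaly
    worth surfacing to a journalist for further scrutiny.
    A z-score above 2.5 is treated as unusual here.
    """
    mu = mean(series)
    sigma = stdev(series)
    if sigma == 0:
        return []  # no variation, nothing stands out
    return [
        (i, value)
        for i, value in enumerate(series)
        if abs(value - mu) / sigma > threshold
    ]

# Hypothetical daily permit filings in a city district;
# the spike on day 8 is flagged for a reporter to check.
filings = [12, 14, 11, 13, 12, 15, 13, 12, 95, 14]
print(flag_outliers(filings))  # → [(8, 95)]
```

In a real system, the flagged items would be pushed to the newsroom as notices, in line with the workflow described below.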

A CND system can push notices and summaries to journalists, who then evaluate their news value and decide whether to carry out further investigation and follow-up reporting, thereby ensuring the quality of news.

As it is, CND works with journalists through attention orientation to complete socio-technical gatekeeping. CND does not increase the total attention available in a newsroom, but it does affect how that attention is distributed. Some studies have found that CND competes for attention with human monitoring channels, potentially increasing journalists’ workload.

In some cases, CND can save time by accurately filtering out content that would otherwise attract attention, whereas in other scenarios, it’s more of a distraction. The key to improving the internal attention economy is that human-machine interaction design should support human behavior. For example, journalists have different expectations of the effort put into different types of news leads, so CND systems should be flexible enough to include a wide range of scenarios. Also, the CND system should preliminarily assess the importance of push content in order to effectively direct users’ attention.

Factors that affect gatekeeping decisions are complex and prone to individual bias. Though a machine can apply a unified value standard, its limited prior knowledge also leads to algorithmic deviation. The difference is that, as technology advances, the machine’s deviation can be driven ever closer to zero.

Therefore, a human-centered socio-technical gatekeeping process should enhance the configurability of CND systems so that journalists take the lead in news selection. However, a contradiction may arise: the industrial nature of modern media drives capital to continuously raise the AI level of CND systems in order to replace human labor to a greater extent.

Automated journalism

Automated journalism, the algorithm-driven transformation of structured data into news text, involves little human intervention beyond initial programming. It is already widely used in sports and finance reporting: it parses a steady stream of data, embeds the information in customized templates, and then produces a press release suitable for various platforms. In such domains, automated journalism is far more effective and efficient than human reporters.
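The parse-then-template mechanism can be sketched as follows. The match record, field names, and templates are invented for illustration; production systems use far richer data feeds and template libraries:

```python
# A hypothetical match record as it might arrive from a sports data feed.
match = {
    "home": "Eagles", "away": "Rovers",
    "home_score": 3, "away_score": 1,
    "venue": "City Stadium", "scorer": "L. Chen",
}

def render_brief(m):
    """Select a template based on the result and fill in the structured data."""
    if m["home_score"] > m["away_score"]:
        template = ("{home} beat {away} {home_score}-{away_score} at {venue}, "
                    "with {scorer} leading the scoring.")
    elif m["home_score"] < m["away_score"]:
        template = ("{away} won {away_score}-{home_score} away to {home} "
                    "at {venue}.")
    else:
        template = "{home} and {away} drew {home_score}-{home_score} at {venue}."
    return template.format(**m)

print(render_brief(match))
# → Eagles beat Rovers 3-1 at City Stadium, with L. Chen leading the scoring.
```

The templates encode editorial choices (what counts as the lead, which details matter), which is precisely why, as discussed next, automated journalism cannot be value-neutral.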

In addition, automated journalism is a techno-utopian vision of news value neutrality. Studies have shown that pure value neutrality cannot be achieved, and that automated journalism cannot be divorced from specific social, cultural, and political contexts, because it complies with the goals and techniques of its creators. In other words, automated journalism entails machines that produce news, but humans are still the gatekeepers.

In addition to “machines writing for people,” we have seen the emergence of “people writing for machines”—atomized journalism. The technology comes amid the growing use of structured data in automated journalism. It requires human editors to transform unstructured information into structured data for automatic reorganization and future reuse of content. Atomized news is beneficial for individualized news production. It can generate stories that are easy to read and understand based on the characteristics of the audience, catering to different reading devices, age groups, and levels of education.

Atomized journalism frees editors from the increasingly demanding and complex process of news production, allowing them to focus on a distilled segment of news with time and resources to create better content. However, the trade-off is that editors have to give up some of their gatekeeping authority and delegate responsibility to algorithms or audiences.

Though AI has been widely used in structured journalism, its application in investigative journalism is still limited. The first reason is that some newsworthy stories involve deep sociopolitical elements that are hard to calculate and encode. Second, investigative stories are unique, so AI training data are difficult to obtain and the trained models do not generalize, meaning AI cannot amortize development costs through repeated reuse.

However, this doesn’t mean AI has no place in investigative journalism. At the very least, AI can assist investigative journalists with data cleansing and collation, as well as large-scale structured data analysis. Back in 1988, The Atlanta Journal-Constitution reporter Bill Dedman trawled through mountains of home-mortgage data with the help of computers to write a Pulitzer Prize-winning series exposing hidden racism in mortgage lending. We believe that AI will play a bigger role in investigative journalism as technology advances.

Algorithmic news recommender

An Algorithmic News Recommender (ANR) distributes news by recommending personalized content, which influences users’ online news consumption habits. ANR can predict what users are interested in based on their characteristics and preferences, and even help them achieve cognitive goals.
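The core preference-matching logic of such a recommender can be sketched in a few lines. The tag-weight profile, the candidate articles, and the scoring rule below are hypothetical simplifications; real ANR systems combine many more signals:

```python
def score(article_tags, user_profile):
    """Score an article by summing the user's weights for its topic tags."""
    return sum(user_profile.get(tag, 0.0) for tag in article_tags)

def recommend(articles, user_profile, k=2):
    """Return the titles of the k highest-scoring articles for this user."""
    ranked = sorted(articles,
                    key=lambda a: score(a["tags"], user_profile),
                    reverse=True)
    return [a["title"] for a in ranked[:k]]

# Hypothetical weights derived from a user's reading history,
# and a small pool of candidate stories.
profile = {"finance": 0.9, "sports": 0.1, "politics": 0.4}
pool = [
    {"title": "Markets rally", "tags": ["finance"]},
    {"title": "Cup final recap", "tags": ["sports"]},
    {"title": "Budget vote", "tags": ["politics", "finance"]},
]
print(recommend(pool, profile))  # → ['Budget vote', 'Markets rally']
```

Note that nothing in this scoring rule encodes diversity or public-interest value; ranking purely by predicted preference is exactly the mechanism behind the information silos discussed below.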

At present, machines dominate the ANR gatekeeping process, which leads to a series of problems such as information silos on a personal level and political polarization on a social level. Some scholars blame this on the lack of news values. The problem is, in the context of ANR, journalistic values are changing as journalistic practices and relationships between journalists and audiences evolve.

Studies have shown that most media workers regard ANR as an extension of the newsroom and believe that traditional news values should be adhered to in accordance with news conventions. However, when asked how to design a successful ANR, respondents first emphasized the core values of transparency, diversity, editorial autonomy, breadth of information, personal relevance, and pleasure to use, while they felt that traditional news values such as objectivity, neutrality, and public service were less important. The design and implementation of ANR involves many stakeholders, including data scientists, product managers, and user experience engineers, in addition to journalists and audiences. Considering the complexity of technology and its functionality, it is not easy to adhere to traditional news values in ANR.

Journalists, audiences, media, social platforms, and the government should actively participate in the multi-party strategy of ANR algorithm design. In the face of massive and complicated network information, we would be overwhelmed by the flood of information without an algorithm’s filtering mechanism. The question is, what kind of news gatekeepers do we want machines to be?

Machines have no instincts, but they are built upon certain cultural, political, and economic interests. Just as there is no universally accepted standard for evaluating the effectiveness of human gatekeeping, the evaluation of machines in this role is undoubtedly complicated. Only by bringing stakeholders into the decision-making matrix can the overall value be maximized.

The development of new media not only brings about decentralization of the gatekeeping process, but also leads to an excessive concentration of gatekeeping power in the hands of a few roles or platforms. Contemporary gatekeeping theory puts forward an interpretation framework, the core of which is to bring more stakeholders into the gatekeeper decision matrix.

To analyze and structure media gatekeepers, it is necessary to understand the cultural, political, and economic factors behind socio-technical practice. We should also clearly recognize the transformation of communication modes brought about by new technology. Computational journalism is still developing rapidly, as innovations such as news chatbots and the metaverse continue to expand the boundaries of journalism. Future research on the gatekeepers of computational news needs to shift its focus from human-machine collaboration to human-machine communication.

 

Yang Fan is an associate professor from the School of Journalism and New Media at Xi’an Jiaotong University.


Editor: Yu Hui
