AI-generated works competing for prestigious journalism award spark debate
Maintaining human-centered values remains crucial as AI reshapes journalism. Image generated by AI
Recently, news that AI-generated content is competing for the 35th China Journalism Award stirred intense debate across industry and academia: Should such content count as journalism? Is it eligible for news awards? If AI-generated work wins, what becomes of human journalists? Will journalism as a profession still retain its value? To clarify these questions, CSST consulted Huang Chuxin, a research fellow at the Institute of Journalism and Communication, Chinese Academy of Social Sciences (CASS), who took part in both the preliminary and review rounds of this year’s competition. Asked whether AI-generated works had indeed entered the competition, he confirmed: “Yes, and there’s more than one entrant.”
As AI works formally enter the arena of China’s premier journalism prize, this discussion—sparked by technological advances—has extended from technical concerns to the heart of journalistic ethics. The industry now faces an unprecedented identity test: In an age of increasingly sophisticated algorithms, where should journalism’s professional boundaries be drawn? To gauge views from both academia and practice, CSST launched a survey entitled “Should AI-Generated Content Be Eligible for the China Journalism Award?” and received 120 valid responses. Results show that 57.5% conditionally support AI entries, provided the degree of AI involvement is clearly disclosed; in stark contrast, 95% express deep concern about accuracy. This seemingly contradictory stance reveals the industry’s dilemma amid the tide of technological transformation: It cannot resist innovation, yet it must uphold core professional values. At the center of debate is how to balance efficiency with authenticity—a critical test of the industry’s collective wisdom.
Upholding human-centered values
The survey found that 69.17% of respondents accept that AI-generated content “can partly qualify as journalism,” but firmly reject its complete replacement of human work. So what should be the standard for AI entries in journalism awards? And what role should AI play in news production?
Huang, who has participated in six preliminary and four review rounds of the China Journalism Award, explained that works containing AI elements are currently eligible to compete, but AI may serve only as an auxiliary tool rather than the principal creator, with its contribution capped at 30%. This year, he noted, submissions where AI was used effectively and added value to the reporting will advance to the next stage, while those with weak or superficial application will be eliminated.
Journalism, Huang argued, is more than the simple reproduction of recent facts; it reflects the active choice and expression of professional journalists, grounded in ethics, values, and public interest. Whether AI-generated content qualifies as journalism depends above all on whether it is human-led. He recommended establishing a transparent human-machine collaboration mechanism within existing awards, requiring applicants to disclose and specify AI’s involvement and highlight human input. At the same time, a special technical award could be considered to encourage innovation while preventing AI from overshadowing journalists. “Technology’s contribution must not outweigh journalism’s core values,” he stressed, reiterating that the awards system should remain “human-centered,” with AI kept in a supporting role.
Chen Kaihe, vice dean of the School of Journalism and Communication at Peking University, proposed a more operational principle of “full-process human reconfirmation.” From initial prompt setting to mid-stage content generation to final quality checks, human journalists should lead every step, with AI limited to reference and productivity support. In commentary or investigative reporting, human journalists must retain full responsibility—an essential baseline of professionalism. He further suggested that the China Journalism Award criteria should be dynamically adjusted as technology evolves, but with human creativity at the core. The award process, he stressed, should remain open and transparent, specifying both AI’s contribution and who bears accountability.
Survey findings support these views: 82.5% of professionals demand “explicit disclosure of AI involvement,” showing the industry’s strong commitment to transparency; 52.5% believe the award criteria should be “appropriately updated,” but never at the expense of human-centered values.
As technological enthusiasm collides with professional principle, Chen Changfeng, former executive vice dean of the School of Journalism and Communication at Tsinghua University, stated: “The China Journalism Award recognizes journalists’ professional insight and dedication, with its essence rooted in human-centered values. If AI becomes the main creative subject, whether credited alone or jointly, it would run counter to the award’s original purpose.” She also argued that debates over “fully AI-generated” work are largely hypothetical, since human subjectivity is present at every stage—from algorithm design to prompt engineering.
Hu Zhengrong, director of the Institute of Journalism and Communication Studies at CASS, cautioned against hastily “greenlighting” AI entries while the technology remains immature. He identified three structural flaws in AI-generated content: “hallucinations” leading to factual errors, weak verification mechanisms undermining credibility, and unclear copyright ownership creating legal risks. These uncertainties, he warned, threaten the regulatory foundation of journalism, and robust ethical review and fact-checking systems must therefore be built. Survey data supports this careful approach: While 90% acknowledge AI’s productivity advantages, an equal proportion worry it could introduce factual errors or bias, underscoring the tension between efficiency and quality.
Institution building in practice
While academia debates, the industry has already been experimenting with models of human-AI collaboration. In 2018, Xinhua News Agency launched full-process intelligent news production through its self-developed “Media Brain” platform. Using technologies such as facial recognition and speech synthesis, the system automatically generated data-visualized news. A video completed in just 15 seconds, “Media Brain Talks About the Past Five Years of the Supreme People’s Court and Supreme People’s Procuratorate,” won second prize at the 29th China Journalism Award. “Because AI’s contribution was made transparent, judges could clearly see who the primary creator was,” said Liu Gang, president of the Xinhua Institute. He explained that while AI’s role in auxiliary tasks like translation or video production could be evaluated as technological innovation, core reporting such as investigative journalism must still be judged by traditional standards like authenticity and public interest.
At the institutional level, the People’s Daily has adopted a more systematic approach. Yang Yang, an editor in its Research Department, revealed that the newspaper has established a cross-departmental “AI task panel” to uniformly allocate technical and editorial resources. Through top-level planning and stricter fact-checking processes, the team seeks to address hallucinations, bias, and other risks early in production.
To ease the tension between the rapid pace of technological iteration and the slower process of institutional reform, many survey respondents recommend creating separate categories such as an “AI News Innovation Award” or an “Outstanding Human-AI Collaborative Reporting Award” to better evaluate AI’s contribution. Huang proposed a “dual-track” solution: requiring disclosure of human-machine collaboration details in existing awards to ensure transparency, while also establishing an independent AI-focused prize to encourage breakthroughs in data visualization and multimedia storytelling.
Zhi Tingrong, dean of the School of Journalism and Communication at Jinan University in Guangdong Province, called for closer collaboration between academia and industry to jointly formulate unified AIGC standards for journalism. These would define basic rules and quality benchmarks, while an AI ethics review mechanism could conduct dynamic assessments of risks in areas such as algorithmic bias, platform governance, and public opinion.
Reshaping journalistic landscape
Among those opposing AI-generated entries, 67.5% argue that “human value in journalism is irreplaceable,” believing this safeguards not only professional dignity but also the essential function of news as a public good.
Yet the advance of AI in journalism is irreversible. The key question is how to uphold professional values amid this shift. “From lead and fire to light and electricity, every technological revolution triggered anxiety about human subjectivity, but each ultimately settled into industry infrastructure through standardization,” said Li Jieqiong, an associate professor at the School of Journalism and Communication, Minzu University of China. Printing gave birth to modern journalism, photography reshaped visual reporting, and AI today represents a new point in technological evolution. Framing AI as a form of “new quality productive forces” rather than a replacement tool will unleash its potential—restructuring newsroom workflows to boost efficiency, enabling precise content distribution, and unlocking the value of data assets.
This historical perspective illuminates the essence of the relationship between technology and humanity. Photography did not replace painters but instead gave rise to Impressionism and other new art forms. The spread of computers did not erase accountants but reshaped financial management paradigms. Likewise, AI should be seen as an opportunity to restructure journalism rather than a threat to it.
The deeper transformation lies in redefining journalists’ skillsets. As AI takes over basic writing tasks, journalists’ “Four Abilities” (to travel, observe, think, and write) are endowed with new meaning. Several universities have already introduced three new modules into journalism curricula: prompt engineering to guide AI content generation, algorithm ethics assessment to cultivate critical technical thinking, and human-AI collaboration management to strengthen leadership. The message is clear—future journalists must be both creators of content and gatekeepers of technological value.
Editor: Yu Hui