
Human-centered algorithmic governance needed in new forms of employment

Source: Chinese Social Sciences Today, 2026-03-31

For a food delivery worker, starting a shift means confronting a task meticulously calibrated by algorithms: Orders must be picked up within strict time limits, with constant races against traffic lights, residential compound security checkpoints, and elevators, all to ensure meals arrive on time. The job is no longer simply about delivering food; it increasingly resembles a competitive game unfolding within a space structured by data and code. This offers a micro-level glimpse into new forms of employment.

As algorithms evolve from auxiliary tools into a foundational “mediated environment” for organizing labor, what emerges is not merely a change in work forms, but a deeper transformation involving power, human experience, and the future. From the perspective of communication studies, this transformation centers on the rise of algorithmic power, the resulting phenomenon of communicative alienation, and the need to construct a new, human-centered governance paradigm.

Operation of algorithmic power

At its core, algorithmic technology addresses the mismatch between users’ limited attention and vast volumes of information. In new forms of employment, however, it has shifted from an auxiliary instrument to foundational infrastructure for organizing labor, making algorithmic power a central force embedded in labor processes. Algorithms no longer merely assist—they increasingly function as “rule-makers.”

The monitoring function of algorithms is equally significant. Through smart devices, digital platforms can carry out comprehensive, real-time surveillance of workers: The driving behavior of ride-hailing drivers, the locations of delivery workers, and the online activity of content creators can all be continuously tracked and analyzed.

At the same time, algorithms shape workers’ cognitive frameworks through filtering mechanisms. By determining what information is presented and how, they subtly influence how workers understand their tasks. Systems can prioritize certain workers for higher-value orders or steer decision-making through targeted information exposure.

Communicative alienation: workers’ predicament

As algorithmic power exerts deeper control, communication processes in new forms of employment become increasingly alienated. This not only affects workers’ efficiency but also reshapes relationships—between individuals, and between people and their work—ultimately leading to a gradual erosion of workers’ sense of self.

A key manifestation of this dynamic is the inversion of the human–machine relationship. In traditional employment, workers, as users of tools, occupy a dominant position; in new forms of employment, algorithms exert control over workers, reducing them to execution units within a system.

The instrumentalization of social interaction is also growing increasingly evident. Driven by rating systems, genuine interpersonal engagement is often replaced by utilitarian exchanges. To secure positive reviews, workers must maintain constant friendliness and patience. Interaction ceases to be an end in itself and becomes a means to accumulate points and avoid negative feedback. Likewise, coworkers are transformed from potential collaborators into competitors on ranking lists. This form of “performative service,” driven by rating pressures, is steadily reshaping interpersonal dynamics in the service sector.

Algorithmic governance based on rational communication

In response to the communicative alienation produced by algorithmic power, a human-centered governance system must be established. Such a system should prioritize value reconstruction, supported by institutional checks and balances and multi-actor collaboration, in order to strike a more sustainable balance between technological efficiency and human well-being.

Value reconstruction entails abandoning the singular emphasis on “efficiency-first” logic in favor of a genuinely human-centered orientation. The value of algorithms should be assessed not only by the efficiency gains they deliver, but also by whether they respect workers’ rights to rest and safeguard their dignity. Platforms, in this sense, must adopt a broader conception of social responsibility that transcends mere profit. For instance, the Chinese ride-hailing company Didi has introduced a driver honor system and mandatory rest mechanisms, reflecting concern for drivers while maintaining operational efficiency, and showing that technological optimization and human concern need not be mutually exclusive.

Checks and balances must be realized through institutional innovation, beginning with algorithmic transparency. The Guidelines for Public Disclosure of Labor Rules for Workers in New Forms of Employment stipulate that platforms must publicly disclose and explain algorithmic rules that directly affect workers’ interests. In practice, platforms have implemented these requirements in different ways: One food delivery platform has introduced a “monthly workers’ rights forum,” in which core systems governing order assignment, pricing, route planning, and time management are disclosed, alongside explanations of how these decisions are made. Although still at a preliminary stage, such initiatives have already received constructive feedback.


Zhang Mingliang is from Ningbo University of Finance and Economics.

Editor: Yu Hui

Copyright©2023 CSSN All Rights Reserved