Robots pose challenge to social governance
Human beings are at the core of social governance, serving both as its subjects and objects and as participants in its processes. However, the widespread use of robots is raising a number of new questions for the practice of social governance.
Integrating robots into social governance
Social governance pertains to public affairs, particularly those related to people, with the ultimate goal of enhancing people’s sense of gain, happiness and security. From a pragmatic perspective, if robots can contribute to achieving this goal, they should be integrated into social governance. However, their impact can be double-edged.
On the positive side, robots play an essential role in many sectors and can improve the effectiveness of social governance. For instance, robots can offer legal assistance or help with urban management, processing and delivering public information more efficiently. On the negative side, excessive trust in and reliance on robots may cause physical and mental harm to humans.
Robots can enter the realm of social governance in different ways. First, robots are catalysts for public affairs. For example, the adoption of robots may result in practical problems such as workplace transformation and the technology gap. Second, since robots are objects of social governance, governance bodies need to properly understand and define their role, scope of application, and rights and interests. Third, robots are involved in the process of social governance, including the collection and processing of information, decision-making and public communication. If we accept the reality that robots have in fact become part of social governance, there is room for discussion on how to understand the properties of robots and their relationship with humans, and on how stakeholders in social governance can better respond to this reality.
Proper understanding of the dual nature of robots
The name “robot” itself implies the dual nature, or dual role, robots may have in human society: the instrumentality of tools and the affectivity of companions. Scholars hold different views on this duality. Some explicitly reject the analogy between interpersonal interaction and “human-robot” interaction because “robots have no morality.” Previous research comparing people’s emotional responses in interpersonal games and human-robot games found no difference between the two settings in non-social emotions, such as satisfaction and frustration, while social emotions, like pride and guilt, were rare in human-robot games. This supports the argument that robots are “tools” or “machines.”
However, another study showed that brain activation was stronger when participants cooperated with robots that had human-like faces than when they interacted with conventional machines. It is therefore reasonable to predict that the affective nature of robots will be increasingly enhanced as their appearance and modes of “human-robot” interaction evolve.
The dual nature of robots may create tension in several aspects of social governance. First, robots challenge social functions and social values once exclusive to humans, which is an important reason for the general concern about robots. Second, users’ emotional projection onto robots could weaken the boundary between humans and robots. These issues require social governance bodies to rethink robots’ affective value for users and to properly coordinate the relationship between humans and robots. Third, advances in robot intelligence may give rise to ethical dilemmas. If robots play a purely instrumental role, their intellectual advancement should mainly contribute to improvements in human productivity and efficiency. However, if robots also have an affective component and engage in decision-making and practices related to social governance, highly advanced robot intelligence could bring about a variety of ethical problems. These may include conflicts between the rights and interests of humans and those of robots, as well as the reshaping of moral and ethical norms.
The extensive use of robots has implications for social governance and has inspired research in psychology and other fields. However, existing psychological studies lag far behind the application of robotics. The field of psychology needs to focus on the social governance challenges posed by robots and urgently respond to them in the following areas.
First, as previously mentioned, humans hold contradictory attitudes towards robots. According to the Uncanny Valley hypothesis, people tend to feel unsettled and worried when robots closely but imperfectly resemble human beings. In contrast, people are more willing to cooperate with robots when they perceive them as companions and believe robots can advance human intelligence. Scholars have also found that humans’ identification with robots and artificial intelligence could lead to immoral behavior. Researchers and stakeholders should therefore pay close attention to the various changes in social mindsets triggered by robots and assess potential impacts in a timely manner.
Second, humans invest different cognitive and emotional resources in two modes of human-robot interaction: instrumental interaction and social interaction. Social governance bodies need to accurately identify the mode and psychology of human-robot interaction in different contexts or fields and adopt governance logic and strategies accordingly.
Third, interpersonal interaction and human-robot interaction differ in many ways. For instance, robots do not have the mind or morality of human beings. As a result, it is difficult to directly transfer the theoretical frameworks of interpersonal interaction to human-robot interaction. As the use of robots expands the content of interpersonal interaction and increases its complexity, researchers may need to modify existing theories of social governance and adapt them to new contexts in order to establish effective theoretical frameworks.
Fourth, robots can be both subjects and objects of social governance; they can also serve as an effective approach to, and a practical medium for, social governance. Hence the creation of a new model of social governance that incorporates or integrates robots may become necessary in the future, guiding theoretical research, social governance practice and the development of the robotics industry.
Liu Guofang is an associate professor in the School of Economics and Management at Shanghai Maritime University.