Out of the ivory tower: How can academic research serve the public?
Barry Bozeman believes that colleges and universities shoulder different functions and address social issues in different ways. Photo: Courtesy of Barry Bozeman
When Steven D. Levitt, renowned economist from the University of Chicago, announced his retirement from academia at the age of 57, it sent ripples through the economics community and beyond. A recipient of the prestigious John Bates Clark Medal and co-author of the bestseller Freakonomics, Levitt was celebrated for his unconventional research lens—using data to decode the hidden logic behind social phenomena and bringing economics out of the ivory tower into everyday life. Ultimately, this unorthodox scholar chose to step off the traditional academic track, pivoting instead to business and podcasting. His departure acts like a prism, refracting a broader dilemma faced by contemporary scholars: Amid the mounting pressures of performance metrics, funding competition, and publication anxiety, how can the public value of academic research be fully realized? To explore these questions, CSST recently spoke with Barry Bozeman, Regents’ Professor Emeritus of Technology Policy and Public Administration at Arizona State University in the United States.
Contributing to social issues in different ways
CSST: Academic research is often seen as an activity confined to the ivory tower, disconnected from public life. How do you think academic research can respond more directly to social issues?
Bozeman: The term “ivory tower” is popular and misguided. If one prefers metaphors, a better choice would be plural—“ivory towers.” In the United States, universities are quite different from one another. People often distinguish among, for example, state universities (e.g., University of Michigan), private elite universities (e.g., Princeton), small liberal arts colleges (e.g., Bowdoin), community colleges (e.g., Mesa Community College) and, more recently, for-profit colleges (e.g., University of Phoenix). While these distinctions are meaningful, they are not sufficiently precise. Let’s consider just one category: state universities. Each of the 50 states in the United States has several universities it sponsors, at least in part. But even in the same state one finds that some state universities are “research universities,” usually measured by research expenditures and federal grants and contracts. Others have higher teaching expectations and lower research expectations, and still others focus almost exclusively on teaching.
As a result of differences in core functions, there is great divergence in the extent to which universities respond to social issues. Some universities view social engagement as a core function. Others give little attention to this function. Sadly, few universities provide much direct incentive to faculty for social engagement. Only a handful of universities give strong consideration in tenure, promotion, or salary decisions to a faculty member’s social engagement activities.
Another issue relates to one’s concept of social issues. Some faculty members, for example, develop workshops with needy groups or organizations and help them solve problems. If a university, through either faculty or students, works closely with a shelter for poor and homeless citizens, that is social engagement. But what about professors who do little outreach to other organizations and citizens but who focus mainly on research, and through that research produce knowledge and, ultimately, goods and services that contribute directly to, for example, new vaccines or environmental protection technologies? Is this not a response to social issues? What about those faculty who have little direct contact with outside organizations but who regularly teach hundreds of students from impoverished or poorly educated families? Is this not addressing a social issue?
I take the broader view. In my judgment, almost all universities contribute to social issues, but often in ways quite different from one another. Few of these contributions are systematically organized by university administrators. Most are undertaken by faculty initiative. Some faculty members focus their entire careers on contributing to social issues, whereas others have different professional goals and contribute only incidentally to social issues. My conclusion is that a great deal of social contribution comes from the ivory towers, and it would be useful to identify and evaluate these social contributions more carefully and to fully recognize and reward the individuals and institutions who are improving society and quality of life.
Insufficient public transmission of research outcomes
CSST: How can scholars balance the professionalism of academic research with public communication? Do academics have a responsibility to translate complex research into a language the public can understand? If so, what challenges do you see in this process of “translation?”
Bozeman: Historically, scholars have not been particularly active or successful in public communication. One reason is that many are not interested in communicating with the broader public. Most scholars are strongly interested in communication with their peers and, of course, enhancing their scholarly reputation. Again, this is partly the nature of reward systems. Peer recognition leads to rewards in pay, tenure, and promotion, whereas communication with the public may be personally gratifying but rarely leads to any university-based reward.
But it is also important to remember that researchers are trained during graduate programs and by mentors to engage in communication with peers. Early in their careers, scholars learn to “speak the language” of peers, a specialized language poorly understood by most citizens. Just as important, very few researchers receive any training designed to help them communicate effectively with the public and, thus, it is no surprise that researchers are usually not adept at such communication.
A less recognized but extremely important factor is that almost all research used by anyone other than researchers’ peers is “curated” knowledge, meaning someone other than the researcher translates findings for a nonspecialized public. For decades, most important and socially relevant research findings were reported by journalists, typically those with specialized training as science journalists. As both conventional mass media and professional science journalism continue to decline, the gap between those who produce research and those who wish to know about it is becoming wider each year.
Diverse paths of serving public interest
CSST: Many complex public issues, such as climate change and public health crises, often require interdisciplinary collaboration. How can we ensure that the voices of different disciplines are heard and that research outcomes can truly serve the public interest?
Bozeman: For the first part of this question, regarding interdisciplinary collaboration, I can provide either an extensive and complex response or a simple one. The complex response requires some attention to not only the dynamics of interdisciplinary collaboration, but also the very idea of discipline itself. Notions of what constitutes a discipline shift over time, as do the social processes through which disciplines emerge and develop. I shall avoid delving into the specifics of these changing constructs and social processes and instead choose the simple response to the question. First, almost every field of study has a strong multidisciplinary component. It is becoming increasingly difficult to preserve traditional disciplinary boundaries. Moreover, many of today’s interdisciplinary fields will be tomorrow’s academic disciplines.
Regarding the “voices of different disciplines,” I am not sure I agree with the premise, which seems to imply that we must make sure to hear the voices of all disciplines. To be sure, we should never hinder the work of scholars, but we do not need to pretend that their work necessarily contributes to social benefit or that those producing it are motivated by that objective. Much research, whether science, the social sciences, or the humanities, has little relevance to the public interest.
The value of seeking knowledge for “its own sake” seems to me to have merit. In the first place, esoteric, non-instrumental knowledge often helps reinforce and demonstrate the value of curiosity. An atmosphere of curiosity, the notion that one need not seek out only problems with direct public interest implications, may inspire and encourage others who seek knowledge, including those who will ultimately produce knowledge that has social value. There are many different routes by which research serves the public interest, and the routes are not always predictable.
Improving mechanisms of academic evaluation
CSST: Do you think the academic community should rethink its evaluation criteria to better reflect the public value of research? If so, how might new evaluation mechanisms be designed to ensure that scholarship is not only academically valuable, but also has a positive impact on society?
Bozeman: Not only do I think that the academic community should rethink the evaluation criteria that currently dominate, I have, on many occasions, consulted and done research on this set of issues. Research evaluation is a special interest of mine. The problem with current approaches grounded in scientometrics is that there are major issues they cannot address.
Scientometrics technologies have been advancing at a stunning pace and have given us many tools for evaluating conventional research productivity. But our ability to produce criteria that “reflect the public value of research” has proved much more challenging.
As is the case with most data-driven enterprises, we develop measures that are “partial indicators”—ones that capture some important phenomena but not others. Consider these limits of citation-based indicators. Research shows that most articles cited in a paper have never been read by those doing the citing. Research also shows that the astute have means of “gaming the system” and thus inflating their citation counts. In some cases, people listed as authors on academic papers neither contributed to the writing nor participated in the research—they are so-called “honorary authors.” The types of papers that receive the most citations in many fields are not major contributions to knowledge but rather literature reviews, especially ones that can, yes, be mined for gratuitous citations.
If we were to compare a list of “best scholars” based on citations and another based on professional colleagues’ assessments, the two lists would be very different. Both would be flawed. I have discussed only a few of the problems with scientometric evaluations. Here are some with peer assessment: seniority bias, gender bias, inattentiveness, personal acquaintance bias, conflating personality assessment with intellectual assessment, and language and nationality bias. I could easily go on.
My conclusion, one I find difficult, especially as a quantitative researcher, is that the best approaches to assessing social impacts or contributions to the public interest are, at least at this point in time, all qualitative. Some useful approaches are quite simple. During the time I was a university administrator and also engaged in research evaluation and the social study of science, I used conventional scientometric indicators. Here is my most valid “technique” for yearly faculty impact evaluation: “Please provide a list of your most important academic and social accomplishments and write a brief accompanying essay that can be used by our faculty evaluation committee to determine your percentage salary raise this year.” It worked better than numbers of articles and citation rates, and I received few complaints from anyone.
Editor: Yu Hui