In a world of dwindling resources, climate change, and overpopulation, human society has increasingly turned its attention toward accountability, performance, and impact monitoring. As a result, the 21st century has been defined by the incessant pursuit of quantifying everything. Perhaps this is best summarized in a single sentence by Michael Bennett (ASU), who suggests that “Western societies are desperate for new functional interpretations of everything.”
Consider Academic Analytics, whose expansive database is increasingly leveraged by universities to monitor their institutional performance. This ecosystem of measures is deeply entrenched in the mainstream forms of scholarship, including grants, awards, and publications. Such metrics, however, capture only a small subset of scholarship in domains such as arts and design. Moreover, even the mainstream scholarship areas lack comprehensive coverage. For instance, the proceedings of the New Interfaces for Musical Expression (NIME) international conference, despite having been assigned ISBNs and DOIs, and despite being the second-highest-ranked conference publication in the area of music (according to Google Scholar), remain conspicuously absent. As a result, according to Academic Analytics metrics, a number of disciplines deceptively appear to flatline in terms of their impact, painting an inaccurate picture of faculty productivity that may also adversely affect university resource-allocation decisions.
More profoundly, I argue that the lack of access to publications and their impact, and consequently of discipline-agnostic cross-pollination of scholarly research, may lead to the wasting of taxpayer money. The way scholarly communities are structured tends to promote the recruitment of internal reviewers, and thereby citations from within the community, at the expense of exploring the research field more holistically. In a society that is increasingly transdisciplinary, it only makes sense to encourage cross-pollination of scholarly work to minimize redundancy, particularly at public institutions that are funded in part through taxpayer sources. For this to be possible, we will need to rethink impact metrics, improve the visibility of and access to all scholarly outcomes, and invent ways to incentivize holistically leveraging existing research regardless of its disciplinary origins. Consider the following Association for Computing Machinery (ACM) publication I encountered while researching different scholarly communities and their use of High Density Loudspeaker Arrays (HDLAs). Now, consider the International Computer Music Conference (ICMC) publication that predates it by four years and offers a strikingly similar implementation of an aural Pong game. Perhaps because the ICMC conference is not commonly on the scholarly radar of this particular ACM community and/or its pool of reviewers, its scholarship effectively does not exist. Yet, if properly leveraged, it would not only enrich the target community but would also provide invaluable shortcuts toward the system’s implementation, leaving room for additional progress and innovation. For this to be possible, however, disciplines that currently lack standardized metrics need them.
This, in my opinion, is the fundamental challenge, or the hair-on-fire problem; the aforementioned lack of holistic scholarly representation within Academic Analytics is one of its many symptoms, and internal impact inflation is another. Tackling it may require a coalition of federal organizations, including the National Science Foundation, the National Endowment for the Arts, and the National Endowment for the Humanities.
In part due to personal interest in this topic, during my stint as the Virginia Tech College of Liberal Arts and Human Sciences (CLAHS) interim associate dean for research and graduate studies, I was tasked with exploring the development of metrics for disciplines within the College whose scholarship currently remains underrepresented, if not outright ignored, by the existing metrics. This complex task posed another challenge: the concern with trivializing the complexity of an unconventional (e.g., artistic) scholarly artifact by reducing it to a single number or a set of values. As such, for this initiative to be successful, it had to be a bottom-up, inclusive effort that leverages iterative participatory design. My graduate assistant and I spent most of fall 2018 designing the initial survey instrument: a stab in the dark that will serve as a starting or reference point as we continue to refine the means of collecting data and converting it into mutually agreeable metrics. I used this time also to expand the stakeholder base, including the College of Architecture and Urban Studies, the Provost’s Office, and the Faculty Senate.
It is no secret that all metrics are flawed. What is often missing from this discourse is the realization that having no metrics is considerably worse. Fully aware of how sensitive this topic may be among my colleagues, I set out to explore the metrics implementation using a custom-designed survey instrument that captures individual data while leaving plenty of room for new forms of scholarship and the accompanying metrics. The project is currently being piloted within the School of Performing Arts (SOPA); its findings are due to be analyzed and consolidated, and the results published, by the end of summer 2019. The results of this journey will be used to refine the instrument so that it can be used across multiple University units. Further, its impact may be far-reaching, informing College- and University-level metrics and faculty annual reports, as well as capturing newfound inter- and transdisciplinary forms of scholarship and providing a platform for a national discourse through organizations such as the Alliance for the Arts in Research Universities (a2ru).
Considering that my appointment as interim associate dean effectively ended in January 2019, and that the Virginia Tech Creativity+Innovation (C+I) Destination Area (DA) engages a majority of the disciplines currently lacking adequate visibility within Academic Analytics and/or means of capturing their impact, this initiative has now officially transitioned from CLAHS to C+I. I look forward to following up on the initiative’s progress later this year.