An upcoming article in Academy of Management Perspectives by Aguinis, Suarez-Gonzalez, Lannelongue, and Joo investigates the accuracy of citation counts as a measure of how impactful an academic’s work is, both in and out of academia. Short answer: while citation counts reflect the extent to which an academic’s work affects the work of his/her peers, they do not reflect the extent to which that work affects the world at large.
Business schools in particular place a great deal of importance on citation counts, often equating them with the importance of the scholar. A researcher with a high citation count might be headhunted from another business school in order to increase the prestige of the hiring institution.
The problem is that citation counts do not necessarily capture the actual importance of a person’s work in anything beyond scholarly circles. If the purpose of science is to help the world (and I’d argue that it is), then citation counts capture something altogether different: they reflect how valuable other scientists view your work to be in supporting their own ideas. This is not really what we want to know when we ask, “how important is this scholar’s work?”.
Many fields (especially business-related fields) worry that their research is not adopted by those who could benefit from it most. Often, research never makes it beyond journal articles and into practice. So if we are really concerned with identifying the most “impactful” scholars, do citation counts capture that? Do highly cited authors have a bigger impact on the world than less-cited authors?
To determine exactly how much citation counts reflect larger impact, the authors started with the 550 most cited authors in the Academy of Management. They searched Google for each author’s full name in quotation marks to pull up a list of Internet references to that author, then reviewed the first 50 pages of results to see how many actually referred to the author of interest. If more than 5% referred to someone else, they dropped that author from the analysis. This resulted in a final database of 391 scholars.
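The disambiguation rule is simple enough to state as code. Here is a minimal sketch of that filter; the function name and inputs are illustrative, not from the paper:

```python
def keep_author(misattributed_hits: int, reviewed_hits: int,
                threshold: float = 0.05) -> bool:
    """Keep an author only if at most `threshold` (5%) of the reviewed
    Google results refer to someone else with the same name."""
    if reviewed_hits == 0:
        return False  # no results reviewed, nothing to judge
    return misattributed_hits / reviewed_hits <= threshold

# 2 of 100 reviewed hits refer to a different person -> author kept
print(keep_author(2, 100))   # True
# 10 of 100 (>5%) refer to someone else -> author dropped
print(keep_author(10, 100))  # False
```

Applied across the 550 starting authors, a filter like this is what pares the sample down to the 391 whose search results are unambiguous.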
In that database, the number of Google listings did not correlate highly with citation counts: correlations ranged from .152 to .260, depending on whether or not .edu domains were included. In a multiple regression analysis controlling for years since earning the doctorate and the number of articles published, citation counts did not predict substantial incremental variance in Google listings among non-.edu domains (a delta-R² of about half of one percent).
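That delta-R² comes from a hierarchical (nested) regression: fit the controls first, then add citations and see how much the R² improves. A minimal sketch of that computation, using made-up data in place of the study’s actual variables (all values here are synthetic and illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 391  # same size as the final author database

# Synthetic stand-ins for the study's variables (not real data)
years_since_phd = rng.uniform(1, 40, n)
articles = rng.poisson(20, n).astype(float)
citations = rng.gamma(2.0, 500.0, n)
google_listings = 50 * years_since_phd + 10 * articles + rng.normal(0, 500, n)

def r_squared(X, y):
    """In-sample R^2 of an ordinary least squares fit with an intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Step 1: controls only; Step 2: controls plus citation count
r2_base = r_squared(np.column_stack([years_since_phd, articles]), google_listings)
r2_full = r_squared(np.column_stack([years_since_phd, articles, citations]),
                    google_listings)
delta_r2 = r2_full - r2_base  # incremental variance explained by citations
print(f"delta-R^2 = {delta_r2:.4f}")
```

In the study, this incremental figure was roughly .005, i.e., citations added almost nothing beyond seniority and productivity in predicting Google visibility.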
In summary, if we believe the number of references in Google to be an accurate metric for capturing impact on the world at large, citation counts do not reflect this value. Impact is clearly a more complicated construct than we typically consider it to be; future work should investigate better ways to capture it. It’s also worth noting that while this approach works for business, where research results should be directly adopted by managers, it would not work as well for fields where there are several steps between research and adoption. For example, just because a nuclear physicist does not appear much on Google doesn’t mean that his work didn’t help build a nuclear power plant.
At the least, my 17,000 results in Google put me around #300 of the 391 most-cited authors in the Academy of Management. Not too bad for 3 years out!