Tuesday, December 8, 2015


A quick look at Google Scholar the other day showed that my papers have now been cited over 1,000 times. The other citation metrics were interesting, too, like the h-index (currently 14, meaning that 14 of my papers have each been cited 14 times or more) and the i10-index (currently 19, meaning that 19 of my papers have each been cited 10 times or more). These numbers only take into account the peer-reviewed publications, but I do have quite a number of additional non-peer-reviewed works. If I count the peer-reviewed papers plus my other types of publications (abstracts, book reviews, annotated bibliographies, etc.), I now have over 100 publications!
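For the curious, both indices are simple to compute from a list of per-paper citation counts. Here is a minimal sketch in Python; the citation counts are made up for illustration and are not my actual numbers.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # this paper still clears the bar
        else:
            break
    return h

def i10_index(citations):
    """Number of papers with at least 10 citations."""
    return sum(1 for cites in citations if cites >= 10)

# Hypothetical citation counts for 15 papers (illustration only).
cites = [45, 33, 20, 18, 15, 12, 11, 10, 9, 7, 5, 3, 2, 1, 0]
print(h_index(cites))    # -> 9
print(i10_index(cites))  # -> 8
```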

As an aside, one commonly used publication metric is the 'impact factor.' Although this number is meant to evaluate journals as a whole, institutions often use it to evaluate individual researchers, treating the journals in which they publish as a proxy for their overall impact as scientists. This is, of course, highly controversial. One aspect that makes it even more problematic is that metrics like the impact factor can depend heavily on the database used to compile the raw citation data. My wife published a paper a few years back exploring this issue quantitatively (Gray & Hodkinson 2008). In a set of 50 journals, many rankings differed by double digits depending on the database used, calling into question the accuracy of any given journal's impact factor.

- Brendan



Gray, E., and S. Z. Hodkinson. 2008. Comparison of Journal Citation Reports and Scopus Impact Factors for Ecology and Environmental Sciences Journals. Issues in Science and Technology Librarianship. DOI: 10.5062/F4FF3Q9G.