Journal of the American Society for Information Science (JASIS)
48(10): 962-964, 1997
Validation of Citation Analysis
In their recent "Citation Content Analysis of a Botany Journal" (MacRoberts & MacRoberts, 1997), the MacRoberts continue to rail about the straw man they created a decade ago to discredit citation analysis. Repeating the irrelevant truism that published articles do not reflect all their historical influences, they continue to disregard all studies that demonstrate the utility and validation by peer judgment of citation analyses.
Rather than relying on their personal subjective judgment of significant influences, why not create a database of all the references cited not only in small journals like Sida and Castanea, but in all botany journals? Then we could learn who the most-cited authors and articles are today and how well they reflect the peer judgments of plant scientists as to the most important or influential authors and papers in the field. I explored this issue with botanists in 1977 (Garfield, 1977, p. 328). An update would be appropriate because the world of plant science has changed enormously in 20 years, as it did for botany in the postwar years. The appropriate question is not, and never was, whether each published article reflects all the influences ever experienced by the author. Rather, does the paper cite most of the relevant articles that led up to the current work?
In fact, in fast-moving fields this may involve only articles published in the last several years. Most research articles in fast-moving fields do not cite all historical influences, as a matter of editorial policy. The purpose of the bibliography today is not to provide exhaustive accounts of the literature, as was traditional in pre-war German-style scholarship. Its purpose in science journals is to provide references to the more immediate predecessors of the current work. It is remarkable how often historically important articles are cited even though they are not essential to understanding the current context of laboratory research. A recent example is the literature of apoptosis, which I have studied with the editor of a journal in that field. In spite of the exponential growth of the literature on programmed cell death, the core articles, many almost 50 years old, are still regularly cited (Garfield & Melino, 1997). As an expert in that field, Melino is well satisfied that citation analysis is supported by peer judgment.
Another concrete personal example is the field of citation analysis itself, to which the MacRoberts have devoted so much attention. It is reasonable to argue that my work has had significant influence on this field. Does that mean that my 1979 book, Citation Indexing, or one of my earlier articles, must be cited in every current discussion of citation analysis? The MacRoberts themselves cite their own recent work but omit any references to the large literature on citation analysis, including those concerning botany. Nor do they mention the large number of citation classics in botany or genetics identified by citation analysis.
In a letter from Michael MacRoberts (August 1, 1996), he claims that he was not even aware of the special issue of Scientometrics (12, November 1987) devoted to discussing his claims. The analysis by Harriet Zuckerman in that issue ought to be reprinted in JASIS if the journal wishes to deal seriously with the MacRoberts hypothesis (Zuckerman, 1987).
Naturally, their reiteration of the truism that individual articles do not cite all influences finds resonance with any author whose work has been deliberately or otherwise omitted from a bibliography. But that does not invalidate the significance of bibliometric analyses when enough literature is taken into account.
The MacRoberts team would perform a far more relevant service were they
to compile a citation index of all their historical sources and from this
determine who the most-cited authors are and then compare them to lists
of influential authors compiled by questionnaire or focus groups. It has
never been clear why they obsessively denigrate the field of citation analysis.
Ever since their first report (MacRoberts & MacRoberts, 1986),
the MacRoberts have looked for historical citation sources in current
work and, invariably, found these influences missing. If they are interested
in tracing the historical influences of earlier scientific figures on current
work, they might better begin with the earlier literature, not the current,
and create historiographs of their fields of interest. You will not determine
that Einstein has been one of the most influential figures in the history
of science simply by looking at references cited in the literature of 1997.
While there are a remarkable number of current references to his work,
one way to estimate that influence is through citation analysis as I proposed
back in 1963 (Garfield, 1963). However, all such analyses should always
be augmented by subjective analysis and human memory when possible.
Garfield, E. (1977). The 250 most-cited primary authors, 1961-1975. Part 1: How the names were selected. Current Contents, 49, Dec. 5, 1977; reprinted in Essays of an Information Scientist, Vol. 3 (pp. 306-336). Philadelphia, PA: ISI Press, 1980.
Garfield, E. (1963). Citation indexes in sociological and historical research. American Documentation, 14, 289-291.
Garfield, E., & Melino, G. (1997). The growth of the cell death field: An analysis from the ISI Science Citation Index. Cell Death and Differentiation, 4, in press.
MacRoberts, M. H., & MacRoberts, B. R. (1997). Citation content analysis of a botany journal. Journal of the American Society for Information Science, 48, 274-275.
MacRoberts, M. H., & MacRoberts, B. R. (1986). Quantitative measures of communication in science: A study of the formal level. Social Studies of Science, 16, 151-172.
Zuckerman, H. (1987). Citation analysis and the complex problem of intellectual influence. Scientometrics, 12, 329-338.