A Statistically Valid Definition of Bias Is Needed To Determine Whether the
Science Citation Index® Discriminates Against Third World Journals

Eugene Garfield, Ph.D.
Chairman Emeritus
Institute for Scientific Information
3501 Market Street, Philadelphia, PA 19104

 (Current Science, Vol:73(8), October 25, 1997)

From the inception of the Science Citation Index® in the sixties, the question of journal coverage has been widely, and often emotionally, discussed. Imagine the hubris of my suggesting in 1964 that "only" a few hundred core journals would satisfy the needs of most readers. Since then, I have regularly published data demonstrating this concentration effect (1). These periodic reports illustrate the well-known Law of Scattering (2) and the less well known Law of Concentration, which I first discussed in 1971 (3).
[Figure 1: SCI journal concentration graph]

Samuel Bradford's Law of Scattering was first developed to explain the dispersion of journal articles within specific fields. Later, I discovered that the same was true for science journals as a whole. In other fields of human endeavor, these phenomena are sometimes referred to as the 80/20 rule, Zipf's Law, the Pareto effect, and so on. Derek J. de Solla Price was well aware of these distributions, which have been studied by B. C. Brookes (4) and many other bibliometricians (5).
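The concentration effect described above can be sketched numerically. The following is an illustrative sketch only, assuming a simple Zipf-like (rank to the power minus one) distribution of output across journals; the journal count and exponent are assumptions for illustration, not ISI data.

```python
# Illustrative sketch of journal concentration under a Zipf-like law.
# Assumption: journal output falls off as 1/rank; 3300 journals is used
# only because that figure appears later in the article.

def zipf_shares(n_journals, exponent=1.0):
    """Return each journal's share of total output under rank**-exponent."""
    weights = [1.0 / (rank ** exponent) for rank in range(1, n_journals + 1)]
    total = sum(weights)
    return [w / total for w in weights]

def top_k_share(shares, k):
    """Cumulative share of output held by the k highest-ranked journals."""
    return sum(shares[:k])

shares = zipf_shares(3300)
print(round(top_k_share(shares, 100), 2))   # a small core holds most output
print(round(top_k_share(shares, 1000), 2))
```

Under these assumed parameters, the top 100 journals capture roughly 60% of output and the top 1000 roughly 86%, which is the qualitative shape (a long hyperbolic tail) that the Laws of Scattering and Concentration describe.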

Figure 1, reprinted from The Scientist article cited above, is important to the following discussion of alleged bias in ISI or other databases. All such discussions are essentially concerned with the tail of a long hyperbolic curve. Once the core journals are selected, the remainder of one's effort is spent selecting from the thousands of relatively small, low-impact journals published in both advanced and developing countries.

Allegations of bias in ISI journal coverage are not new, but an August 1995 article in Scientific American (6) (pp. 92-99) precipitated the discussion once again. The article raised some important questions about the treatment of Third World material in the West, but at the same time made unsubstantiated statements about ISI's selection policies.

A "Clarification" published in the October 1995 issue (p. 10) (7) is reprinted below. The editor of the journal involved also wrote to me apologizing and explaining the source of the error.
The following statement appeared in the "Letters to the Editor" section of Scientific American, October 1995, p. 10.


The article "Lost Science in the Third World" [Scientific American, August] reported several assertions made during interviews and later confirmed by Luis Benitez-Bribiesca, editor in chief of the journal Archives of Medical Research, regarding the Science Citation Index, a database produced by the Institute for Scientific Information (ISI). According to the ISI, it has never required that any journal, person or institution purchase an ISI product to qualify for inclusion in its indexes.

Furthermore, the ISI notes that it has never made a decision about indexing a journal until after at least three issues have appeared. No statement in the article is meant to imply that the extent to which a journal's articles are cited is the sole criterion for inclusion in an ISI product. Also, the $10,000 subscription price mentioned for the index is the approximate current price, not the price during the 1970s. The editors regret any misunderstanding resulting from ambiguities or misstatements in the article.



    1. Garfield, E. "The significant scientific literature appears in a small core of journals," The Scientist, Vol:10(17), p. 13,16, September 2, 1996.
    2. Bradford, S.C. "Sources of information on specific subjects," Engineering, 137, p. 85-6, 1934.
    3. Garfield, E. "Is the ratio between number of citations and publications cited a true constant?" Current Contents, #6, p. 5-7, February 9, 1976.
    4. Brookes, B.C. "The derivation and application of Bradford's Law," Journal of Documentation, 24, p. 247-65, 1968.
    5. Oluić-Vuković, V. "Bradford's distribution: From the classical bibliometric "law" to the more general stochastic models," Journal of the American Society for Information Science, 48(9): 833-42, 1997.
    6. Gibbs, W.W. "Lost science in the Third World," Scientific American, p. 92-99, August 1995.
    7. "Clarifications and Errata," Scientific American, p. 10, October 1995.

After the original Scientific American article appeared, I drafted a letter commenting upon the main thrust of the article. However, when the apology was published, I decided to withhold the letter. Then, following a recent discussion with Subbiah Arunachalam, I decided to submit the letter to Scientific American, which declined to publish it since two years have passed. I can understand that viewpoint, since Scientific American is a popular magazine and most readers will have forgotten the original article. However, the editor of Current Science asked me to publish the information for the benefit of Third World readers in particular. The references in that letter are listed separately from those cited above.

Text of letter to Scientific American...

Scientific American
415 Madison Avenue
New York, NY 10017-1111

I was glad to see your clarification note (Scientific American, October 1995, p. 10) for the article by W. W. Gibbs (Scientific American, August 1995, p. 92-99). His article gave the impression that indexing services discriminate against Third World journals, and he used extensive Science Citation Index® data to support this a priori position.

As the former Chairman and founder of the Institute for Scientific Information, I have devoted 40 years to espousing the cause of scientists in the Third World. The suggestion that the Science Citation Index is the reason for the failure of any journal, institution, or individual to achieve international peer acceptance is unwarranted. Undoubtedly, coverage of any journal in Current Contents® or SCI® helps to make a journal's articles better known, but that by itself is not sufficient to induce citation by other scientists. A variety of factors account for citation, not the least of which are quality, timing, and relevance to the current work of other scientists.

The history of science is replete with examples of delayed recognition (i), including the classic example of Gregor Mendel, but none of these cases was attributable to a lack of coverage by abstracting services. The work of Mendel was known and first quoted in the Encyclopedia Britannica in 1880 (ii), but its significance was not appreciated until this century. I have also documented the case of the Indian scientist Dr. Shambu Nath De (iii). His classic works on cholera were published in the Journal of Pathology and Bacteriology in the fifties. Even that did not guarantee full recognition in his lifetime. The fact is that thousands of articles published in leading journals are rarely cited, for many reasons. Undoubtedly, important factors are mobility and frequency of contact with peers outside the Third World.

Over 30 years ago, ISI developed the journal impact factor as just one method of supplementing the subjective appraisal of small journals by objective unobtrusive means. Thousands of new and established journals which were not originally covered in Current Contents® or the Science Citation Index eventually were selected in part because their articles somehow attracted the attention of scientists publishing in ISI-covered journals. Third World scientists routinely send reprints to colleagues in other countries. And relevant or significant material will eventually be cited in research or review articles.
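The journal impact factor mentioned above is, in its standard two-year form, a simple ratio. The following sketch illustrates the calculation; the journal figures used are invented for illustration, not drawn from ISI data.

```python
# Illustrative sketch of the standard two-year journal impact factor:
# citations received in a given year to a journal's articles from the
# previous two years, divided by the number of citable items the journal
# published in those two years.

def impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """Two-year impact factor as a citations-per-item ratio."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical journal: 80 citations in 1997 to its 1995-96 articles,
# of which it published 200.
print(impact_factor(80, 200))  # 0.4
```

The point of such an unobtrusive measure is that it supplements, rather than replaces, subjective appraisal of a journal.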

The procedures for selecting journals in CC®, SCI®, and the Social Sciences Citation Index® are meticulous and well documented (iv). Citation impact is but one of many criteria used to select journals. Thousands of small journals are included from countries and institutions around the world.

One of the many criteria for selection is the use of English, which has become the lingua franca of science. Any journal which claims international significance will, at minimum, include English titles and abstracts.

The Gibbs article characterizes SCI coverage of Third World science as declining simply because the coverage of some local journals has been discontinued. In fact, total article coverage has increased substantially because Third World scientists increasingly publish in international peer-reviewed journals, where their work is seen and read by peers worldwide. We have documented these trends at numerous conferences (v).

In 1981 SCI indexed 904 articles by Mexican authors. In 1994, coverage increased to 2478 articles. However, the impact of Mexican research relative to the rest of the world remained stable during this 14-year period; on average it is 50% below the world average. These and other data demonstrate that SCI's coverage of 3300 journals goes far beyond the needs of most of its subscribers. Indeed, the reality today, as it was 50 years ago, is that a small percentage of journals accounts for a major share of papers published and an even greater share of citations: 100 journals account for over 20% of the articles published and over 40% of the citations, and 1000 journals account for 70% of papers and 85% of the citations (see Figure 1).
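The "relative impact" comparison above is a ratio of citation rates. The following sketch shows the calculation; the input figures are invented for illustration and are not the Mexican data cited.

```python
# Illustrative sketch of relative citation impact: a country's citations
# per paper measured against the world's citations per paper. A value of
# 0.5 means the country sits 50% below the world average.

def relative_impact(country_citations, country_papers,
                    world_citations, world_papers):
    """Country cites-per-paper divided by world cites-per-paper."""
    country_rate = country_citations / country_papers
    world_rate = world_citations / world_papers
    return country_rate / world_rate

# Hypothetical: 2 cites/paper locally against 4 cites/paper worldwide.
print(relative_impact(200, 100, 400, 100))  # 0.5
```

Note that a country's article count can rise substantially, as in the Mexican example, while this ratio stays flat: the measure is normalized per paper.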

Any journal or indexing service has economic limitations. Even Medline, a government-subsidized enterprise, has limited resources and must limit coverage. To an outsider, selection decisions may seem as arbitrary as peer review of research grant applications does to some researchers.

The Zelinski quotation, featured in your article, implies that population should determine the number of journal articles to be covered by indexing services. On that basis, journals published in China, India, and Russia would receive the highest priority regardless of the quality or relevance of the material.

SCI is an index which already includes 650,000 source articles per year and over 15,000,000 cited references appearing in 3300 journals.

Many Third World countries suffer by publishing dozens of marginal journals whose reason for being is questionable. I have urged them to combine the best material into larger regional journals to achieve a critical mass. In addition, their local funding sources need to adopt stringent criteria for publication, including international peer review. The precedents for this are to be seen in the numerous European journals which have made many national journals essentially obsolete. Nevertheless, many local journals published in vernacular languages serve a useful purpose in reviewing the clinical and applied literature for the benefit of local physicians and industry. These serve a purpose similar to that of the thousands of American trade and state medical journals, which have little impact on research and are likewise not covered in SCI (vi).


    (i) Stent, G.S. "Prematurity and Uniqueness in Scientific Discovery," Scientific American, 227: 84-93 (1972).
    (ii) Garfield, E. "Would Mendel's Work Have Been Ignored if the Science Citation Index Was Available 100 Years Ago?," Current Contents, January 14, 1970, p. 4-5; reprinted in Essays of an Information Scientist (1962-1973), Volume 1, p. 69-70 (1979).
    (iii) Garfield, E. "Mapping Cholera Research and the Impact of Shambu Nath De of Calcutta," Current Contents, April 7, 1986, p. 3-11; reprinted in Essays of an Information Scientist (1986), Volume 9, p. 103-111 (1988).
    (iv) Garfield, E. "How ISI Selects Journals for Coverage: Quantitative and Qualitative Considerations," Current Contents, May 28, 1990, p. 5-13; reprinted in Essays of an Information Scientist (1990), Volume 13, p. 185-193 (1991).
    (v) Garfield, E. "Quantitative Analysis of the Scientific Literature and Its Implications for Science Policymaking in Latin America and the Caribbean," Bulletin of the Pan American Health Organization, 29(1): 87-95 (March 1995).
    (vi) Sanz, E., Aragon, I., and Mendez, A. "The Function of National Journals in Disseminating Applied Science," Journal of Information Science, 21(4): 319-323 (1995).

    End of Letter


In correspondence with numerous scholars, I have indicated that there has never been an objective definition of "bias" or discrimination with respect to secondary databases. The difficulty can be characterized as a chicken-and-egg situation. If all the world's journals were included in the ISI citation indexes, then we could more easily decide which should be given the highest priority of coverage. Since this is not possible, given the economic limitations, my recommended solution is that scientists in each country or region should gather whatever data are needed to evaluate their journals and give ISI and other database publishers prioritized lists of the journals they would recommend. Local citation indexes can be compiled, if necessary, to facilitate such evaluations.