Science, 250:1331-2, 1990
David P. Hamilton

Also see:
- Hamilton DP, "Research Papers: Who's Uncited Now?" Science, 251:25, 1991
- Pendlebury DA, "Science, Citation, and Funding" (letter to the editor), Science, 251:1410-1411, 1991

Publishing by -- and for? -- the Numbers

New evidence raises the possibility that a majority of scientific papers make negligible contributions to knowledge

Citations, according to the conventional wisdom, are the glue that binds a research paper to the body of knowledge in a particular field and a measure of the paper's importance. So what fraction of the world's vast scientific literature is cited at least once? Seventy percent? Eighty percent?

Guess again. Statistics compiled by the Philadelphia-based Institute for Scientific Information (ISI) indicate that 55% of the papers published between 1981 and 1985 in journals indexed by the institute received no citations at all in the 5 years after they were published. The figure was derived by ISI analyst David Pendlebury, who at the request of Science searched ISI's extensive database of scientific citations.

And that's the good news. ISI's database covers only the top science and social science journals -- some 4500 out of nearly 74,000 scientific titles listed in the Bowker/Ulrich's database, a commercial listing of all periodicals. "The conventional wisdom in the field is that 10% of the journals get 90% of the citations," says Pendlebury. "These are the journals that get read, cited, and have an impact."

Even those papers that do get cited aren't cited very often. An earlier ISI study of articles in the hard sciences (including medicine and engineering) published between 1969 and 1981 revealed that only 42% received more than one citation. (Because of database limitations at the time, that study didn't examine the number of uncited papers.) If a similar trend holds for 1981 to 1985, then as much as 80% of papers published during that period have never been cited more than once. Moreover, self-citation -- a practice in which authors cite their own earlier work -- accounts for between 5% and 20% of all citations, according to Pendlebury.
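The "as much as 80%" estimate can be reproduced with a back-of-the-envelope calculation, under one hypothetical reading of the two studies: apply the earlier finding (only 42% of papers received more than one citation) to just the cited fraction of the 1981-85 papers. The 42%-of-cited assumption is an illustrative reading, not something the article states directly:

```python
# Back-of-the-envelope check of the ~80% figure.
# Assumption (hypothetical): among cited papers, only 42% received
# more than one citation, mirroring the earlier 1969-1981 study.
uncited = 0.55                 # share of 1981-85 papers never cited (ISI)
cited = 1.0 - uncited          # share cited at least once
multi_of_cited = 0.42          # assumed share of cited papers with >1 citation
cited_exactly_once = cited * (1.0 - multi_of_cited)

at_most_once = uncited + cited_exactly_once
print(f"never cited more than once: {at_most_once:.0%}")  # prints "81%"
```

Under this reading, 55% + (45% x 58%) comes to about 81%, consistent with the article's "as much as 80%."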

Does this mean that more than half -- and perhaps more than three-quarters -- of the scientific literature is essentially worthless? Of nearly 20 academicians, federal officials, and science policy analysts contacted by Science, few were willing to state the case so harshly. But a majority agreed that the high percentage of uncited papers is certainly reason for concern. Chief among the explanations offered was that researchers are publishing far too many inconsequential papers in order to pad their resumes. A typical reaction is that voiced by Robert Park, Washington director of the American Physical Society: "My God! That is fascinating -- it's an extraordinarily large number. It really does raise some serious questions about what it is we're doing." Ray Bowen, assistant director for engineering at the National Science Foundation, agreed. "It does suggest that a lot of work is generally without utility in the short-term sense." Similarly, Frank Press, president of the National Academy of Sciences, noted that "There are obvious concerns which are worrisome, namely that the work is redundant, it's me-too type of follow-on papers, or the journals are printing too much."

A few officials, however, cautioned against interpreting the uncitedness figure as evidence of overpublication on two grounds: that even uncited papers can influence other researchers, and that the figure may be skewed because ISI databases include some foreign journals with minimal impact. "Maybe 10,000 people used the particular data from [an uncited article] because it was just sent out as an informal paper, or the numbers appeared in the traffic sent out over an electronic network," said Charles Brownstein, NSF's assistant director for computer and information science. "I just have no way to even begin to evaluate [the 55% figure]." Maxine Singer, president of the Carnegie Institution, posed her objection as a question: "So this includes a lot of journals published in countries of minimal scientific effort? It's very hard to evaluate a number unless you know what's in it." But even some who were reluctant to assign much importance to the statistic admitted that it surprised them. "It strikes me as a high figure -- I would have guessed one-third," said William Raub, acting director of the National Institutes of Health. "But I don't know what to make of it." Timothy Springer, a Harvard cancer researcher, was more direct. "It is higher than I'd have expected," he said. "It indicates that too much is published. A lot of us think too much is published."

If Springer is right, the publishing industry is at least partly responsible. The number of scholarly journals in all fields (scientific and others) has risen from 70,000 to 108,590 over the past 20 years, according to the Bowker/Ulrich's database. Crunched by rising subscription prices and the sheer number of titles, libraries have been unable to keep up with the flood of information. The average member of the Association of Research Libraries now holds only about 27,000 titles, roughly 26% of the total available.

To critics of the academic promotion system like University of Michigan president James Duderstadt, the growing number of journals and the high number of uncited articles simply confirm their suspicion that academic culture encourages spurious publication. "It is pretty strong evidence of how fragmented scientific work has become, and the kinds of pressures which drive people to stress number of publications rather than quality of publications," Duderstadt said.

Most of that pressure is rooted in the struggle for grants and promotions. "The obvious interpretation is that the publish or perish syndrome is still operating in force," said David Helfand, chairman of the astronomy department at Columbia University. (Helfand is best known outside his field for refusing to accept a tenured appointment at Columbia, instead preferring to work under a renewable five-year contract.) "You get a stack of 60 papers in the mail when you're on a tenure committee, and it's sort of stupid, because you know you're not going to read them all." Allen Bard, editor of the Journal of the American Chemical Society, added: "In many ways, publication no longer represents a way of communicating with your scientific peers, but a way to enhance your status and accumulate points for promotion and grants."

For just this reason, some universities have begun limiting the number of papers they will accept for evaluation. The Harvard Medical School, whose promotion committees will only review applicants' 5 to 10 most significant papers, is the most celebrated example, but other schools and some federal agencies seem to be following suit. New rules at NSF, for instance, allow scientists to submit no more than five publications with their grant applications. Even so, it may be a while before this trend moves beyond elite research universities. "At the state colleges and universities, where they believe publication is their road to credibility, there's still a great emphasis on the number of publications," says Vito Perrone, a Harvard School of Education researcher who has studied academic publishing for the Carnegie Foundation for the Advancement of Teaching.

Pendlebury says he plans further analysis of the citation data within the next few months. In particular, he intends to examine how the percentage of uncited papers varies between disciplines and between journals put out by commercial and nonprofit publishers, as well as the frequency of uncited papers in upper-echelon journals such as Nature, Science, Cell, the New England Journal of Medicine, and so forth. So far, there is only a hint as to what further analysis will reveal -- and it's bad news for social scientists. A preliminary ISI study conducted on papers published in the hard sciences in 1984 revealed that only 40% of them received no citations in the 4 years following publication, a fact which suggests that social science papers go uncited at a rate much greater than 55%.
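The inference about social science follows from simple averaging: if the overall uncitedness rate is 55% and the hard-science rate is only about 40%, the social-science rate must exceed 55% whatever the mix of papers. A minimal sketch, using a hypothetical 60/40 split between hard-science and social-science papers (the actual share is not given in the article):

```python
# Hypothetical back-out of the social-science uncitedness rate from the
# overall (55%) and hard-science (~40%) figures.
overall = 0.55       # all ISI-indexed papers, 1981-85
hard = 0.40          # hard sciences (preliminary ISI study, 1984 papers)
hard_share = 0.60    # assumed: fraction of indexed papers in hard sciences

# overall = hard_share*hard + (1 - hard_share)*social, solved for social:
social = (overall - hard_share * hard) / (1.0 - hard_share)
print(f"implied social-science uncited rate: {social:.1%}")  # prints "77.5%"
```

The exact result shifts with the assumed share, but because the hard-science rate sits below the overall average, the implied social-science rate is above 55% for any mix.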

One consequence of this phenomenon is that many researchers have become deeply suspicious of articles not published in so-called first-tier journals. "I routinely have to go into the 'deep literature' -- those journals I no longer have time to read on a daily basis -- and it is frequently a waste of time," says MIT biology professor Richard Young. If the bottom 80% of the literature "just vanished," he says, "I doubt the scientific enterprise would suffer." The ISI statistics would seem to give academics, university administrators, and government officials a great deal to think about.

David P. Hamilton