Chutes and Ladders:
NIH Scientists Discuss the Art and Strategy of Biomedical Publishing
by
Celia Hooper

Anyone who has been in biomedical research for more than five minutes is acutely aware of this fact of life: It is not just how much you publish, but also where you publish that counts. And in the last decade or so, another measure of professional status has assumed increased importance -- how often do other scientists cite your papers?

These facts of life were brought home to NIH last spring when the Institute for Scientific Information, which maintains a huge database for measuring citation rates, added its two cents to debates about the quality of NIH's intramural research program. In the March issue of its newsletter, Science Watch, ISI maintained that "Research papers by NIH scientists published over the last five years are failing to carry quite the same clout as those published during the early-to-mid-1980s." The unsigned article, entitled "Intramural Research at NIH: Cracks in the Crown Jewel," was based on ISI's analysis of citation data for 92,961 NIH papers published from 1981 to 1993. Graphs of the citation rates for papers from individual institutes sometimes showed inexplicable peaks and troughs; but the graph of citations per paper for NIH as a whole, relative to the biomedical baseline (expressed as percent above the average citation rate) was remarkably flat -- migrating vaguely between 85% and 75% above average.
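For readers curious about the arithmetic behind these figures: the metric is simply citations per paper for a set of papers, divided by the field-wide average, expressed as a percentage above that baseline. Here is a minimal sketch in Python, using invented counts rather than ISI's actual data:

# Hypothetical illustration of ISI's "percent above baseline" metric.
# All numbers are invented for the example; they are not ISI's data.
nih_citations = 350_000            # total citations to a set of NIH papers
nih_papers = 10_000                # number of NIH papers in that set
baseline_cites_per_paper = 20.0    # average citations per paper across biomedicine

nih_cites_per_paper = nih_citations / nih_papers                    # 35.0
percent_above = (nih_cites_per_paper / baseline_cites_per_paper - 1) * 100
print(f"{percent_above:.0f}% above the biomedical baseline")        # prints "75% ..."

By this measure, a flat curve between 75% and 85% means NIH papers consistently drew about 1.75 to 1.85 times as many citations as the average biomedical paper.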

It is difficult to know whether NIH should lose sleep over citation rates that are only 75% above average. What kind of blips are statistically significant in such a massive number crunch? What might lie behind changes in citation rates for a given institute? Could a few early blockbuster papers on AIDS temporarily elevate NIH's average citation rate? How quickly would the relocation of a few highly cited authors show up in the numbers? What if NIH researchers developed a predilection for sending their papers to PNAS (ranked 37th for citation impact in ISI's 1992 Science Journal Citation Reports) rather than Cell (ranked third)?

Stymied by these imponderables, The NIH Catalyst turned its attention to the more personal, immediate, and comprehensible issues. Where is the best place to publish these days? What journals possess the greatest cachet? What journals do intramural scientists actually read, and in which ones do they publish? What new journals are in ascendance? And what journals should be avoided?

The answers to these questions are to some extent subjective, so The Catalyst explored them accordingly. With disregard for the principles of experimental design, we faxed off a survey to members of The Catalyst's editorial board and to an unscientifically selected cross-section of researchers representing all of NIH's institutes. The approximately 50 returns that we received included responses from senior- and junior-level scientists; American- and foreign-born scientists; and intramural and extramural staff. Respondents were molecular and cellular biochemists, immunologists, geneticists, neurobiologists, and clinical and behavioral researchers.

We asked respondents to list the most prestigious journals in which to publish nonclinical and clinical research papers; the journals that they would rank in the second tier of respected general-interest publications; the top specialty journals in their areas of expertise; the journals they are most likely to read and publish in -- regardless of prestige; exciting new places to publish; and journals that are problematic -- slow to get papers reviewed and published, for example.

We summarize the responses to our query about the most prestigious journals as the "NIH Cachet Factor" (NCF) in Table 1. We also include data on these journals from ISI's 1992 Science Journal Citation Reports indicating their ranks in terms of citations. The ISI Ranks listed in Table 1 are the 1992 Impact Factor rankings. The tables also include information, provided by the journals, on the acceptance rate for papers and the average time it takes each journal to publish a paper once it is accepted and in final form. These numbers are approximate, and the average time to publication is affected by when, within the publishing cycle, the paper is accepted. The last two columns show page charges and the number of NIH scientists on the editorial boards of each journal. These numbers also are approximate.

We spotted some interesting differences between the list of journals in which our survey respondents publish (Table 2) and the journals with the highest NCF. Not surprisingly, workhorse journals, such as Cancer Research, Endocrinology, the Journal of Biological Chemistry, the Journal of Immunology, the Journal of Infectious Diseases, and the Journal of Virology, assume greater importance in the list of likely places in which NIH scientists publish than they do in the NCF rankings. Cell and Neuron -- both published by Benjamin Lewin's Cell Press in Cambridge, Mass. -- and Lancet, a British journal, although ranked as highly prestigious, were infrequently listed as journals in which respondents were likely to publish. One anonymous respondent said he would advise against submitting papers to Cell "unless you are a member of the 'club.'" Another observed, "Cell appears to have: a) somewhat of a sliding scale [in acceptance decisions], adjusted by lab of report origin, and b) some editorial pressure to 'hyperextend' hypotheses, leading to a fair percent of reports with exciting general conclusions that don't hold up." Molecular and Cellular Biology and Genes and Development -- both comparatively new journals -- appear to have risen more quickly in cachet than in popularity as outlets for NIH scientists' papers, and both were singled out by several survey respondents as exciting new journals in which to publish.

When it comes to precious reading time, there appeared to be three patterns governing which journals NIH scientists peruse: some scientists read the big-name, broad-interest journals; some focus on journals in their specialty area; and some go for a mixture of the two. Because many of the specialty journals were listed by only one or two respondents, they didn't make the summary list of journals ranked as top reading by NIH scientists (Table 3).

Survey respondents had many suggestions for exciting new places to publish -- including two publications by Cell Press, Neuron and Immunity. Several respondents recommended Nature's new monthly, Nature Genetics. Several others suggested Molecular Biology of the Cell and International Immunology. In neurobiology, respondents recommended NeuroReport, Neurobiology of Disease, Cerebral Cortex, and Synapse. One respondent noted that the Journal of Neurochemistry, although not new, was changing and "has made the transition to a good molecular neuroscience journal. This fills an important gap for reports that are significant, but not 'full' enough for Neuron...." In structural biology, Structure and Protein Science received endorsements. Mechanisms of Development and Developmental Dynamics were cited as good new development journals. Cancer, Molecular Carcinogenesis, Bone Marrow Transplantation, Endocrine Journal, Mammalian Genome, and Molecular Microbiology were also mentioned as good new specialty journals. One anonymous respondent recommended Current Biology as an attractive journal with the same scope as Cell but "not as capricious as Cell." Respondent Graeme Wistow of NEI bemoaned the lack of interesting new general-interest journals. "We need another good general journal," he wrote.

In responding to an open-ended request for other comments, respondents offered seasoned perspectives on the art and politics of publishing, and some excellent tips for scientists submitting papers. For example, NCI's Ira Pastan advised, "If it [a submitted paper] comes back, reformat it and send it right out to a comparable journal. Don't sit on it. Remember, not all editors are perfect." NIAID's Ron Germain wrote, "1-Do first-rate work and do it first!! 2-Get to know the editors where you want to publish. 3-Learn the journal's preferences (e.g., Science in the past has liked HIV-related work better than Nature...)." Peng Loh of NICHD advised, "Try to publish in the highest-impact journals." Jim Nagal of NIA urged, "Be realistic when evaluating the quality of your work and submit accordingly." Another respondent observed, "So many times, the reviews you receive will be opposite opinions [...] you begin to realize that the process is a crap shoot."

Several respondents commented on long-standing and widespread concerns about the publishing and review processes, citing, for example, the importance of having "rebuttable, accountable, documented reviews." Another respondent opined that papers should be published along with reviewers' comments. Others are concerned that women continue to be underrepresented on the editorial boards and boards of reviewing editors for many of the top journals. "One notable exception is the highly rated Journal of Cell Biology, with about 28% women on the board," wrote one respondent.

NINDS' Monique Dubois-Dalcq observed that some of the slower journals have excellent reputations and may ultimately be a better choice than the big-name journals. "For most of us who get reviewed and who review others, the crucial point is how fast and well the reviewers are working and how fair the process is. Even the best journals should have three reviewers whenever possible, or allow for a third review if there is some controversy." Another senior scientist has also observed that the best reviews don't always emanate from the most prestigious journals. "Some highly rated journals are not as critically reviewed as JBC, or even PNAS, which are more 'democratic' and merit-driven," he says.

Dubois-Dalcq notes that keeping the peer-review process working efficiently and effectively can be a challenge to journal management. "An editor should require concise, crisp, clear reviews with constructive criticism whenever possible -- this means work! Similarly, editors should drop reviewers who do not do their job in a timely manner." Wistow notes that, ironically, hot-shot scientist-editors may not always be up to the task: "Famous editors are sometimes too busy to take care of editorial responsibilities."

We also asked survey respondents to identify any journals that they would advise against publishing in. We summarize many of these comments in "A Young NIH Scientist's Breaking Into the Big Name Journals Blues Rap" (see box). Proceedings of the National Academy of Sciences, with its unusual method for acquiring papers (see footnote to Table 1), drew the most comment. One scientific director wrote, "Although I have published in PNAS in the past, I feel quite negatively about this journal. The articles vary widely in quality and rigor (much worse than other top-rank journals). In general this is due to extreme variability in rigor of the review process." A young scientist warned that some members of the Academy -- the only route to getting a paper into PNAS -- "are known to take in more papers than they can review and communicate" to the journal for publication. Another researcher may have suffered exactly this experience: "Since PNAS has no editorial board oversight, the review process can take forever if the sponsor is slow." Several respondents advised against publishing in PNAS but offered no explanation.

Several scientists criticized Science for perceived discrimination against all but the most famous authors. One young clinician wrote, "All of the major journals have a history of prejudice with some reviewers. This is frequently a problem, especially for young scientists without extensive name recognition." A senior immunologist advised against publishing in Science because, "except for 'hot' papers, [it's] SLOW and there is no re-review if the paper is initially rejected." Another warned that "Science may reject papers on grounds of space [limitations] even after recommended acceptance by reviewers." This respondent observed that, in general for researchers submitting papers, "It helps to be famous."

Two other journals high in NIH Cachet Factor rankings -- Nature and the Journal of Biological Chemistry -- also took hits from survey respondents. Nature was cited by two respondents as being very slow in its reviews. One respondent attacked Nature -- which, unlike most of the top journals, has no editorial board -- for having a "very arbitrary editorial policy." Two different respondents from NEI complained that JBC was also slow and had some troublesome, idiosyncratic editors.

In addition to the negative comments above, at least one respondent recommended avoiding publishing in, or reported negative experiences with, each of the following journals: Annals of Epidemiology, Biopolymers, Brain Research, Brain Research Bulletin, Calcified Tissue International, Cancer, Cancer Research, Cellular Immunology, Cell Growth and Differentiation, Experimental Eye Research, Gene, JAMA, Journal of Experimental Medicine, Journal of Immunology, Journal of Infectious Diseases, Lancet, Life Sciences, Matrix Biology, Neuroscience Letters, and Pediatrics. The two most common complaints were publication delays and arbitrary or overly picky editors requiring (or even providing!) extensive revisions. One scientist complained that the journal Blood charges a $50.00 fee, up front, when manuscripts are submitted, "And that doesn't entitle you to a review." NIDR's Hynda Kleinman did not single out any particular journal, but urged colleagues not to publish important results in meeting reports and proceedings. "It's very time consuming, and when people look at your c.v., they look at peer-reviewed articles, not symposia. Also, publications like that take much longer to publish and are never timely."

David Rodbard, Director of DCRT, predicted that in the not-too-distant future, "electronic publication, especially of preprints, will assume increasing importance. I think that all NIHers should have an option to put manuscripts into Gopher or Mosaic, either as a preprint or after acceptance [by a journal]. In this manner, we could get results to our colleagues weeks to months ahead of the printed form." Noting that physicists have started just such pre-publication exchanges of results, Rodbard acknowledged that there are some problems to resolve, including copyright, standardized formatting, and funding.

But until information superhighways can handle the traffic jam of information coming out of biomedical research labs today -- or until some other miracle comes to pass -- NIHers and the rest of the scientific community may just have to persevere and accept the problems and indignities in the status-conscious name-game of publishing. For, as Harvey Pollard of NIDDK explained, "The reason why people are more concerned with where an article is published, rather than its intrinsic merits, is that many readers cannot evaluate the latter any more outside their own narrow field."

A Young NIH Scientist's Breaking Into the Big Name Journals Blues Rap

Off in Bethesda, or down in NC,
Up in Frederick, or out in MT,
Just hang around and you'll hear the sad song:
Gosh dang journal's gone 'n done me wrong.

Giving my mentor zero defiance,
In May I sent my oeuvre to Science
Ten days later I let out a groan,
When they mailed back the envelope, "SENDER UNKNOWN."

Next I harkened to Nature's call,
Heeded and loved by one and all.
In a mere three months, the chaps told me, "Hey,
This belongs in a journal beginning with 'J'."

But my very hip mentor said, "Here's what we do:
Call my friend Ben -- he'll push it through."
So I phoned up Cell to say, "Look what we learned!"
But all my calls went unreturned.

The saddened boss said he wouldn't think bad of me
If at least we made Proceedings of the National Academy.
"So-so data are all that's needed,
Given the way their garden gets weeded."

So I called up a National Academician,
A colleague and friend in that envied position.
"Could you get this reviewed and communicated?"
Her answer suggested my life is ill-fated.

"I've already used my allotment this year
Of five papers I can have published there,
But your data and ideas sound great to me,
And I bet you could get into JBC."

Her advice possessed wisdom's very kernel,
So next it went to that favored journal.
When I started this tale, the month was May,
As JBC wrote back, the year'd slipped away.

Acceptance was contingent on a few little changes:
A few more experiments checking the ranges,
Clean up the data, rewrite conclusions,
Re-format references, make these exclusions.

By March, one very quirky reviewer-
's added demands were notably fewer.
By May, all was in total compliance
As a similar paper came out in Science.

--Celia Hooper