The Cooler: PLoS ONE and the Panic Over Impact

Image credit: Dave Gray/Flickr through a Creative Commons license.

Bob Lalasz directs science communications at The Nature Conservancy.

What is “impact” for science, anyway?

And could the ways we define “impact” explain why we have less of it than we think we should?

Case du jour: PLoS ONE, the world’s largest scientific journal. Its 2012 “impact factor” (the most widely used measure of a journal’s scientific influence, calculated by dividing the citations a journal’s papers from the previous two years receive in a given year by the number of papers it published over those two years) dropped a whopping 16% from 2011’s number, it was announced earlier this month.
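For readers who want to see that arithmetic laid out, here is a minimal sketch of the calculation. The citation and paper counts below are invented for illustration; they are not PLoS ONE’s (or anyone’s) actual figures.

```python
# Illustrative journal impact factor (JIF) calculation.
# All numbers are hypothetical, chosen only to show the arithmetic.

citations_in_2012_to_2010_2011_papers = 57_000  # citations received in 2012 by 2010-2011 papers
citable_items_2010_2011 = 18_000                # papers the journal published in 2010-2011

jif_2012 = citations_in_2012_to_2010_2011_papers / citable_items_2010_2011
print(f"Hypothetical 2012 impact factor: {jif_2012:.3f}")  # -> 3.167
```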

That magnitude of Journal Impact Factor (JIF) decline would be enough to make most scholarly publishers man the life rafts. Because publishing is so central to science, the building blocks of a scientific career (hirings, promotions, tenure, grants) depend at least indirectly, and often directly, on JIFs. So scientists need to publish in journals with relatively high impact factors…and hope those numbers don’t drop before they’re up for their next job.

If you’re a scientist, you’ve almost certainly at least peeked at a journal’s JIF before submitting to it; you might even get a cash bonus when you publish in a high-JIF journal such as Nature or Science. No wonder the annual June announcement of every journal’s JIF by Thomson Reuters, the official toter-up of the numbers, is followed in some circles like Selection Sunday for March Madness.

So it’s a safe bet no champagne was popped in the PLoS ONE offices when their new JIF was announced. But scientists — including conservation scientists — weren’t happy, either.

PLoS ONE — a free-to-read, online-only, fast-turnaround, data- and graphics-friendly science journal whose 2006 debut shook up a scholarly culture used to snooty editors, $1,000-and-up annual journal subscriptions and glacial manuscript reviews — has become a favorite submission destination for many researchers with papers that could make a splash.

I recommend it to Conservancy scientists for articles that have color graphics and potential media impact. PLoS ONE presents graphs and charts well, is followed closely by science media, and has a competent and aggressive media relations staff. It also doesn’t cost much to publish there, and costs even less for authors from developing countries.

Unfortunately, PLoS ONE’s JIF will continue to drop because of the very way it does business, according to Phil Davis, a publishing analyst and contributor to the group blog Scholarly Kitchen.

Davis argues that PLoS ONE’s first strong JIF in 2009 (4.351) brought a slew of submissions in 2010 from researchers looking to capitalize on that number. And since PLoS ONE now publishes tens of thousands of those submissions annually (23,464 last year, to be precise), it doesn’t have the tight editorial selectivity of a Nature or a Science, the kind needed to ensure it accepts only potentially high-citation papers. So its impact factor will keep dropping, because any one high-impact paper will be lost in a sea of thousands.
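To put Davis’s dilution argument in numbers (a toy illustration with assumed citation counts, not real data), one heavily cited paper barely moves the average when the denominator runs to tens of thousands of papers:

```python
# Toy illustration of the dilution effect Davis describes.
# Citation counts are assumptions; only the 23,464 figure comes from the post.

papers = 23_464              # papers PLoS ONE published in 2012, per the post
typical_citations = 3        # assumed citations for an average paper
blockbuster_citations = 500  # one hypothetical very highly cited paper

average_without = (papers * typical_citations) / papers
average_with = (papers * typical_citations + blockbuster_citations) / (papers + 1)
print(f"Average without the blockbuster: {average_without:.3f}")  # 3.000
print(f"Average with the blockbuster:    {average_with:.3f}")     # ~3.021
```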

In essence, Davis is saying, PLoS ONE will continue to be victimized by its early success and its all-comers philosophy. He predicts a decline in megajournals like PLoS ONE and a return to discipline-based journals that have sped up their review cycles and added altmetrics without corresponding declines in JIF.

Mother of jargon, is this the end of PLoS ONE?

A better question might be: Is this the end of JIF?

JIF’s value has been debated for years, but it’s never been under such systematic attack as now. The San Francisco Declaration on Research Assessment (DORA), a statement signed in May by more than 8,000 scholars and 300 institutions, calls for an end to using journal-based metrics such as JIF in decisions about scientific funding, appointments and promotions.

But here’s the shock: there’s a growing body of evidence that JIF isn’t a particularly good measure of impact — and might even point in the opposite direction.

As George Lozano writes for the London School of Economics blog, the strength of the relationship between a journal’s impact factor and the citation rates of its individual papers has been dropping since 1990 — when the Internet began untethering papers from journals and search made journal provenance largely moot.

And get this: Lozano says that, since 1991, the proportion of top papers not published in top JIF journals is increasing.

“If the pattern continues,” he writes, “the usefulness of the IF will continue to decline, which will have profound implications for science and science publishing. For instance, in their effort to attract high-quality papers, journals might have to shift their attention away from their IFs and instead focus on other issues, such as increasing online availability, decreasing publication costs while improving post-acceptance production assistance, and ensuring a fast, fair and professional review process.”

Bjorn Brembs, Katherine Button and Marcus Munafo top Lozano, with a long takeout on JIF in Frontiers in Human Neuroscience that blames journal rank for everything from the rise in scientific retractions and the decline effect to the unwillingness of many publishers to make their journals open-access or to cut subscription prices.

And if that evil list isn’t enough, ranking journals is just bad scientific practice, they argue: “Much like dowsing, homeopathy or astrology, journal rank seems to appeal to subjective impressions of certain effects, but these effects disappear as soon as they are subjected to scientific scrutiny.”

Think about that before you go back to the Journal of Obscurity and Editorial Neglect.

As I do a dozen or more times each year, I worked recently with a scientist to develop a communications plan for one of her new papers. She chose to publish it in a specialty journal that had the practitioner readership she wanted for her work; but that journal was print-based and subscription only, with very limited online-first features that we had to pay $3,000 to secure.

The journal also had zero resources for media relations, so we had to generate any coverage ourselves. Worst of all, nine months passed between the paper’s acceptance and its publication — it felt as if single-celled organisms had evolved into mammals in the interim.

And communication from the journal’s editorial staff about when it finally would appear was non-existent. The paper went virtually uncovered by media. We’ll have to wait a few years to find out about scholarly impact. It’s fair to say the scientist was disappointed in the experience.

For my money, if you’re a conservationist worried about PLoS ONE’s JIF, or JIF at all, that’s a sign you have bigger issues to worry about.

PLoS ONE is still a premier journal for communicating your work — especially interdisciplinary and media-worthy work — as opposed to using it to get validated. Conservation science has enough trouble getting attention; we don’t need to place imaginary boulders in the pathways we have.

Other interesting recent links on JIF and scientific impact:

Paul Wouters at The Citation Culture says there are plenty of journal impact indicators better than JIF, but that they shouldn’t rule the world, either.

What would a metric that evaluated the impact of research not just on science, but on practice and society look like? Birge Wolf and co-authors take a crack at it in the latest issue of GAIA.

There’s also always social media if you want to really make an impact.

Opinions expressed on Cool Green Science and in any corresponding comments are the personal opinions of the original authors and do not necessarily reflect the views of The Nature Conservancy. 

Correction, 2.10.2014: The sentence “And get this: Lozano says that, since 1991, the proportion of top papers not published in top JIF journals is declining” should have read “…is increasing.” It has been corrected.

Bob Lalasz is the director of science communications at The Nature Conservancy and the editor of the new Cool Green Science. A long-time editor and writer, he was previously the Conservancy's associate director of digital marketing. He now blogs here about the Conservancy's scientific research and on-the-ground work as well as larger conservation science and science communications issues.



Comments: The Cooler: PLoS ONE and the Panic Over Impact

  •  Comment from Theresa Liao

    I am not sure I would consider the drop in PLoS ONE’s JIF a surprise. I also don’t think JIF is a good indicator for PLoS ONE in the first place (so I personally don’t consider it “victimized”).

    1. When you have a large volume of published articles, the way you share these articles will have to be different – one cannot rely on people just “finding them.” Some mechanisms will have to be built in, maybe through an email list, subject directory, social media, etc.

    2. Unfortunately, the one or two articles I came across through PLoS ONE are not of the research quality I was hoping for. But that isn’t necessarily true for other articles.

    Perhaps this speaks to what’s changing in the bigger environment – where is academic publishing going? Will we see more journals like PLoS ONE? Or will we see people moving to free online repositories like FigShare or PeerJ pre-prints (granted, a pre-print repository is not a journal, and there is no peer review, but the line for me is getting awfully thin here…)? It’s an interesting time indeed.

    •  Comment from Bob Lalasz

      Hi, Theresa:

      Thanks for your comment. One thing I didn’t mention in my post was the relative conservatism of many conservation scientists with regards to publishing outlets. As with many disciplines, we like our usual journals — but PLoS ONE (even though its once-revolutionary aspects are now replicated in many places) became one of those. The drop in JIF could scare some of those scientists back to traditional outlets. I wrote this in hopes we could look past our habits.

      I also think the Lozano post in the LSE blog starts to put the lie to the argument that open access=lack of quality vs. subscription=quality. He found no correlation between individual papers with high IFs and the IFs of the journals in which they appeared. That should give one pause, at the least.

    •  Comment from Vishnu

      Hi Theresa,

      I agree that quite a few articles that appear in PLoS ONE are of substandard quality. However, that is absolutely true of journals like Science and PNAS as well! Editors choose papers that THEY think are interesting, sometimes famous authors get a pass even if the paper is not all that interesting (especially when direct submissions by its board members were allowed by PNAS).

      Finally, what really motivated me to submit a manuscript to PLoS ONE was that I think knowledge should be freely available to all. So open access is important to me as a principle. When I have the money as a researcher, I intend on paying the fees that other non-open access journals require to make articles open access.

    •  Comment from Francy Lisboa

      What did you mean by “low quality research”? Do you have expertise in all areas embraced by PLoS ONE?

  •  Comment from Alan

    I don’t think PLoS themselves are even worried about the JIF drop or increase, since they were founded on open access and have a different view of the JIF system.

    Only traditional scientists, and those who use JIF to assess the impact of researchers for promotions and grants, are worried.

    •  Comment from Wasim Khan

      I think this is very true indeed. Some researchers only concerned with the JIF system (especially those who see it as the only ‘golden ticket’ to an academic promotion) will inevitably avoid PLoS One.

      Personally I think PLoS One has cemented its place as the prominent high-throughput open access journal. Whether we will see more journals like it remains to be seen.

  •  Comment from Tilahun Nigatu

    I think this covers some of the basic problems associated with the impact factor: http://publication2application.org/2013/12/02/impact-factor-a-poor-quality-indicator-of-quality/

    Thanks,
    Tilahun

  •  Comment from Gary Cottrell

    You have this stated incorrectly:
    And get this: Lozano says that, since 1991, the proportion of top papers not published in top JIF journals is declining.

    You mean it is increasing!
