The careers of converts – how a transfer to BioMed Central affects the Impact Factors of established journals


Does open access increase the likelihood that articles will be cited, or cited more often, compared to articles published in subscription-based journals? The questions around such an ‘open access citation effect’ – its size, indeed its existence, and how it may relate to different open access models – have been discussed for many years.

A 2010 literature review by Alma Swan showed that the vast majority of relevant studies found evidence for the effect, and the growing number of such studies adds to our understanding of it and how it varies in relation to factors like academic disciplines, journal ranking, or open access models.

At BioMed Central we’ve seen a strong effect on the citations and Impact Factors of journals that move to BioMed Central when they are already well-established titles listed in the Journal Citation Reports.

The following report, based on my presentation at the most recent German Open Access Days, cannot claim to provide much more than anecdotal evidence, not least because of the sample size. However, the observable effects are large enough and of sufficient interest to be shared even at this early stage.

The results could perhaps be said to show a ‘BioMed Central citation effect’ rather than an open access citation effect; however, to the extent that BioMed Central can be taken as representative of open access publishers with high-quality services, the results may have wider relevance.

*****

Report: How moving to an open access model at BioMed Central affects established journals

Over the years, many journals have transferred to BioMed Central, in the process moving from subscription-based arrangements to the open access model. Of this larger group, so far five titles meet criteria that allow meaningful comparisons of the respective rankings before and after the conversion to open access. The journals we’ve looked at:

  • have been with BioMed Central for long enough that the calculation of their most recent (2012) Impact Factor is based entirely on open access articles, meaning that the year of transfer cannot be later than 2010;
  • have had Impact Factors for at least two years before their transfer;
  • have published no fewer than 30 articles per year in each of the two years either side of the transfer.
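To make the selection concrete, the three criteria can be expressed as a simple filter; here is a minimal sketch in Python, in which the record structure, field names and sample numbers are all invented for illustration rather than taken from BioMed Central’s actual data:

    # A minimal filter over hypothetical journal records. Field names and
    # numbers are placeholders, not BioMed Central's actual data.
    journals = [
        {"title": "Acta Veterinaria Scandinavica",
         "transfer_year": 2006,
         "if_years_before_transfer": 2,
         "articles_per_year": {2005: 38, 2006: 35, 2007: 40}},
        # ... further transferred titles ...
    ]

    def meets_criteria(journal):
        """Apply the three selection criteria listed above."""
        ty = journal["transfer_year"]
        # 'the two years either side of the transfer' is read here as the
        # year before and the year after the move.
        window = (ty - 1, ty + 1)
        return (
            ty <= 2010  # the 2012 IF then rests on open access articles only
            and journal["if_years_before_transfer"] >= 2
            and all(journal["articles_per_year"].get(y, 0) >= 30 for y in window)
        )

    sample = [j for j in journals if meets_criteria(j)]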

The focus of this piece is on the development of the (two-year) Impact Factor of journals converted to open access. This is for two reasons. First, as we are looking at journal-level trends, we avoid the serious distortions that arise when the Impact Factor is used to assess or make claims about individual articles or even individual researchers. Second, while other metrics could be used (and would show similar results), in practice the move to open access is often the preferred option expressly because the society and/or editors have declared their intention to improve the journal’s ranking.
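For readers who want the arithmetic spelled out: the standard JCR definition of the two-year Impact Factor for JCR year Y is

    \mathrm{IF}_Y = \frac{C_Y(Y-1) + C_Y(Y-2)}{N_{Y-1} + N_{Y-2}}

where C_Y(y) is the number of citations received in year Y by items the journal published in year y, and N_y is the number of citable items published in year y. This is also why the timing criterion above works out: for a journal that transferred at the start of 2010, the 2012 Impact Factor (released in summer 2013) draws on the 2010 and 2011 cohorts only, both published under open access.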


Impact Factors for Acta Veterinaria Scandinavica

The first example we will look at (in the graph above) shows the Impact Factor trend for Acta Veterinaria Scandinavica, a journal with a history of more than 50 years when it moved to BioMed Central in 2006.

In this and in the following graphs the vertical line indicates the first Impact Factor for which the calculation was based exclusively on articles published by BioMed Central, and the x-axis shows the years of the Impact Factor (JCR years), not calendar years. The latter is relevant as the Impact Factor of one year will be published in the summer of the following calendar year.

Since Acta Veterinaria Scandinavica converted to open access publication seven years ago, its Impact Factor has increased about four-fold, and as a consequence the journal has moved into the top third of the veterinary sciences category in Web of Science.

This qualitative growth was accompanied by a steady increase in the annual number of articles submitted and published: the 77 articles published in 2012 represent more than double the output in, as well as before, the year of the transfer.

The following graphs for Journal of Experimental & Clinical Cancer Research, Journal of Cardiovascular Magnetic Resonance, and Genetics Selection Evolution all convey a similar message: the first Impact Factor that is based solely on open access articles (vertical line) represents a significant rise – a doubling and more – compared to the years before the transfer to BioMed Central.

Impact Factors for Journal of Experimental & Clinical Cancer Research

Impact Factors for Journal of Cardiovascular Magnetic Resonance

Impact Factors for Genetics Selection Evolution

The exception to the emerging rule appears to be Journal of Biomedical Science (shown below), for which the Impact Factor remained virtually unchanged when the first open access Impact Factor (the 2011 one, published in 2012) was calculated. Instead, a smaller rise (but one still amounting to about 25%) kicked in a year later than the other examples would have led us to expect.

Impact Factors for Journal of Biomedical Science

Editors of converted titles tend to make an improvement in their journal’s ranking their top priority. The resulting editorial strategy then has to balance this qualitative growth against quantitative growth, i.e. an annual increase in the number of accepted articles. Quantitative growth is possible, and it can build sustainably on a journal’s rise through the ranks.

It is in this context that the apparently exceptional case of Journal of Biomedical Science finds its explanation. In 2013, all five journals covered above published at least as many articles as in the respective year preceding the transfer to BioMed Central, and often significantly more.

However, among the five titles, Journal of Biomedical Science was the only one that accepted significantly more articles in its first year (2009) as an open access, online-only journal than in its last year as a subscription-based journal.

While in the years from 2006 to 2008 the journal published a fairly steady number of between 70 and 80 articles per annum, this number jumped by about 50%, to 114, in 2009. It stands to reason that this sudden growth had the potential to delay the onset of the open access effect on the rise in Impact Factor. Then again, it could be argued that open access allowed a significant increase in the size of the journal, measured by numbers of articles published, without any detrimental effect on its Impact Factor.
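A toy calculation illustrates how such a jump can defer the rise. All figures below, other than the roughly 75 and the 114 articles per year, are invented for illustration; in particular, the per-article citation rate is a placeholder, not measured data.

    def impact_factor(cites_y1, cites_y2, items_y1, items_y2):
        """Two-year IF: citations received in the JCR year by the two
        preceding cohorts, divided by the size of those cohorts."""
        return (cites_y1 + cites_y2) / (items_y1 + items_y2)

    # Steady state: ~75 articles/year, each drawing ~2 citations in the JCR year.
    steady = impact_factor(150, 150, 75, 75)                 # = 2.00

    # 2009 jumps to 114 articles. If total citations to the enlarged cohort
    # stay flat at first (new articles take time to be noticed), the bigger
    # denominator drags the ratio down...
    diluted = impact_factor(150, 150, 75, 114)               # ~ 1.59

    # ...so an open access boost to citations can be absorbed by the larger
    # denominator, leaving the headline IF roughly unchanged.
    offset = impact_factor(150 * 1.26, 150 * 1.26, 75, 114)  # ~ 2.00

On numbers like these, the open access citation boost and the enlarged denominator roughly cancel each other out, which would be consistent with the flat 2011 figure followed by the delayed rise of about 25%.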

Journals that transferred to BioMed Central after 2010 were excluded from our analysis because of the criteria, outlined above, that allow a meaningful comparison of journals’ development before and after their conversion to open access. In every year since then, further journals have moved to BioMed Central that will meet those criteria as more Impact Factor years accrue. The sample size will continue to grow, and there will undoubtedly be more to report.


3 Comments

Witold Kieńć

That is great news! As recently as yesterday I was writing that the dictatorship of the IF holds back the growth of Open Access, because authors look for high-IF journals, which are usually toll access.

https://openscience.com/open-access-growing-faster-part-one-advantages-authors/

I hope that news like this will prompt journal owners to change their publishing model to Open Access, so that the problem disappears. In any case, this is also confirmation that OA is good for each author’s personal h-index, and for visibility in general.

Michael Callaham

This is interesting, but pretty anecdotal. I haven’t reviewed it in the last few years, but for many years before that the collective IF of all tracked journals was steadily rising. Have you examined some comparable journals that did NOT convert to OA, to see whether they show a similar pattern of apparent increase?

Stefan Busch

It’s true that there is general IF inflation (I would guess mostly because more journals are added to Web of Science, generating more citations tracked by the database), although I’m not aware of detailed statistics. I haven’t run our findings against a control group, no. That would be a sizeable project. To make it meaningful and escape vagueness about what constitutes “comparability”, it would probably mean identifying as a control group all journals in the relevant JCR subject categories that are “unconverted” research (as opposed to review) journals of the same minimum size as defined in the blog item.
I venture a guess that the IF increases for the control groups would not be anywhere near what we found for the converted titles. It’s open to testing.
