Actual acceptance and publication times in 4 out of 5 STM journals are slower than advertised!
Speed isn’t everything in academic publishing, but for research fields currently racing to find solutions to the challenges and crises facing the world today, it can make all the difference.
The growth of academic journal publishing continues at an accelerating rate. Data taken from Scopus® indicate that in 2000 just under 800,000 research articles were published. This grew to over 1.25 million in 2010, and by 2019 there were 2.06 million articles published, a total increase of 157% since the millennium.
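The growth figure quoted above is a simple relative increase. As a quick check, using only the article counts from the text:

```python
# Relative growth in annual research articles, using the Scopus
# figures quoted above (800,000 in 2000 vs 2.06 million in 2019).
articles_2000 = 800_000
articles_2019 = 2_060_000

growth = (articles_2019 - articles_2000) / articles_2000
print(f"Increase since 2000: {growth:.1%}")  # ~157.5%, i.e. the 157% cited
```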
And to meet this increase in research output, the number of journals publishing at least one article grew from 17,577 in 2012 to 22,808 in 2018, a rise of almost 30% in just six years.
So, where does a researcher choose to publish their findings? Key considerations include speed of publication, quality of peer review, and the timeliness and professionalism of editorial and publisher services. Rejection is the norm: one study calculated that most manuscripts are submitted between three and six times before acceptance and publication. So how do we avoid long delays and miserable publishing experiences?
Currently, advice seems to come from a single source: the previous experience of colleagues. With only a minority of journals displaying publishing performance metrics and transparent, standardised data, researchers are almost entirely reliant on word of mouth.
A group of young post-doc researchers located around the world have decided that the performance of the academic journals publishing their research needs to improve. They have created conpher to take those word-of-mouth recommendations and share them with everyone.
Simply put, conpher will collect personal journal publishing experiences from academics around the world and, together with publishers and other stakeholders, create a freely accessible database of comprehensive, transparent and verified facts, enabling researchers to make well-informed publication choices.
“What a great idea” – Richard Horton, Editor-in-Chief of The Lancet.
The conpher team believe that more journals are not necessarily better for academic research publishing standards; what is needed is quality. Researchers should be able to compare how journals are performing. Bad experiences need to be highlighted, mistakes avoided, and good experiences celebrated.
Ultimately, conpher wants to play a part in raising academic publishing standards.
John Marshall, leader of the conpher launch team, explained:
“To kick-start the conpher platform we pulled data from over 1.2 million articles published in the last two years from the National Institutes of Health’s PubMed database. A sample study of this data concluded that the average time from submission to publication was 182 days. conpher then took the actual acceptance and publication speeds of 1,000 journals and compared them to the advertised figures presented by the journal publishers. We discovered that for 4 out of 5 journals, the actual publication times recorded by PubMed were longer than the publishers were telling us. And perhaps even more alarmingly, 50% of the journals we compared had actual delays of more than 10 weeks beyond the publisher’s figures. I do not need to underline what an additional 10-week delay could mean to the success of a research breakthrough in our world today.
“There can be many reasons for acceptance and publication delays, but we believe performance can and should improve. Quality journals will emerge as the victors from this project, and under-performing journals can set targets for improvement.”
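The comparison described above can be sketched in a few lines. This is a minimal illustration only, not conpher’s actual pipeline: the journal names, the “actual” PubMed-derived day counts and the advertised figures below are all invented for demonstration.

```python
# For each journal, compare the actual submission-to-publication time
# (in days, e.g. derived from PubMed records) against the figure the
# publisher advertises. All values here are illustrative.
journals = [
    # (name, actual_days, advertised_days)
    ("Journal A", 210, 120),
    ("Journal B", 95, 100),
    ("Journal C", 260, 150),
    ("Journal D", 180, 90),
    ("Journal E", 155, 140),
]

TEN_WEEKS = 70  # 10 weeks expressed in days

slower_than_advertised = [j for j in journals if j[1] > j[2]]
big_delays = [j for j in journals if j[1] - j[2] > TEN_WEEKS]

print(f"{len(slower_than_advertised)} of {len(journals)} journals "
      f"are slower than advertised")
print(f"{len(big_delays)} of {len(journals)} run more than 10 weeks "
      f"over the advertised time")
```

With this toy data, 4 of the 5 journals are slower than advertised and 3 run more than 10 weeks over, mirroring the kind of gap the quote reports.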
conpher invites researchers around the world to share their experiences. It will only take you seconds but could save a colleague months of stress!