Prestigious Science Journals Rapidly Declining In Influence Due to Drug Company Funding

Karen Foster, Prevent Disease
Waking Times

The most prestigious peer-reviewed journals in the world are having less influence amongst scientists, according to a paper co-authored by Vincent Lariviere, a professor at the University of Montreal’s School of Library and Information Sciences.

Lariviere questions the relationship between journal “impact factor” and number of citations subsequently received by papers.

Medical journals can no longer be trusted to provide accurate, unbiased information about biomedicine, since they are funded almost entirely by drug companies. And it’s showing.

Richard Smith, the ex-editor of the British Medical Journal (BMJ), publicly criticized his former publication, saying the BMJ was too dependent on advertising revenue to be considered impartial. Smith estimates that between two-thirds and three-quarters of the trials published in major journals — Annals of Internal Medicine, Journal of the American Medical Association, Lancet and New England Journal of Medicine — are funded by the industry, while about one-third of the trials published in the BMJ are thus funded. He further adds that trials are so valuable to drug companies that they will often spend upwards of $1 million in reprint costs (which are a major additional source of revenue for medical journals).

Consumers trust medical journals to be the impartial and “true” source of information concerning a prescription drug, but few are privy to what is truly going on behind the scenes at both drug trials and medical journals. One result is that fewer high-impact studies are being referenced in subsequent research.

“In 1990, 45% of the top 5% most cited articles were published in the top 5% highest impact factor journals. In 2009, this rate was only 36%,” Lariviere said. “This means that the most cited articles are published less exclusively in high impact factor journals.” The proportion of these articles published in major scholarly journals has sharply declined over the last twenty years. His study was based on a sample of more than 820 million citations and 25 million articles published between 1902 and 2009. The findings were published in the Journal of the American Society for Information Science and Technology.

For each year analysed in the study, Lariviere evaluated the strength of the relationship between article citations in the two years following publication and the journal impact factor. He then compared the proportion of the most cited articles published in the highest impact factor journals. “Using various measures, the goal was to see whether the ‘predictive’ power of impact factor on citations received by articles has changed over the years,” Lariviere said. “From 1902 to 1990, major findings were reported in the most prominent journals,” he noted, “but this relationship is less true today.”

Lariviere and his colleagues George Lozano and Yves Gingras of UQAM’s Observatoire des sciences et des technologies also found that the decline in high impact factor journals began in the early 1990s, when the Internet experienced rapid growth within the scientific community. “Digital technology has changed the way researchers are informed about scientific texts. Historically, we all subscribed to paper journals. Periodicals were the main source for articles, and we didn’t have to look outside the major journals,” Lariviere noted. “Since the advent of Google Scholar, for example, the process of searching information has completely changed. Search engines provide access to all articles, whether or not they are published in prestigious journals.”

Impact factor as a measure of a journal’s influence was developed in the 1960s by Eugene Garfield, one of the founders of bibliometrics. “It is basically the average number of times a journal’s articles are cited over a two-year period,” Lariviere explained. “Initially, this indicator was used to help libraries decide which journals to subscribe to. But over time, it began to be used to evaluate researchers and determine the value of their publications.” The importance of impact factor is so ingrained in academia’s collective consciousness that researchers themselves use impact factor to decide which journals they will submit their articles to.
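The two-year average Lariviere describes can be expressed as a simple ratio. Here is a minimal sketch of that standard calculation; the function name and the example numbers are illustrative assumptions, not figures from the study:

```python
def impact_factor(citations_this_year, citable_items_prev_two_years):
    """Two-year impact factor: citations received this year to material
    published in the previous two years, divided by the number of
    citable items (articles, research notes) published in those years."""
    return citations_this_year / citable_items_prev_two_years

# Hypothetical journal: 200 citable articles published in 2021-2022,
# cited 500 times in 2023.
print(impact_factor(500, 200))  # 2.5
```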

Various experts in bibliometrics have criticized the use of impact factor as a measure of an academic journal’s visibility. A common criticism is that the indicator contains a calculation error. “Citations from all types of documents published by a journal are counted,” Lariviere said, “but they are divided only by the number of articles and research notes. Impact factor is thus overestimated for journals that publish a good deal of editorials, letters to the editor, and science news, such as Science and Nature.”
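The mismatch Lariviere describes is easy to see in numbers: citations to editorials and news items enter the numerator, but those items never enter the denominator. A short sketch with made-up, purely illustrative counts:

```python
# Hypothetical counts for one journal over a two-year window.
citations_to_articles = 400    # citations to articles and research notes
citations_to_editorials = 100  # citations to editorials, letters, news items
num_citable_articles = 200     # only articles/notes count in the denominator

# As reported: ALL citations divided by citable items only.
reported_if = (citations_to_articles + citations_to_editorials) / num_citable_articles

# A consistent ratio would restrict the numerator to the same item types.
consistent_if = citations_to_articles / num_citable_articles

print(reported_if)    # 2.5 -- inflated by the editorial citations
print(consistent_if)  # 2.0
```

The gap between the two ratios is the overestimation the criticism points at; it grows with the share of non-article material a journal publishes.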

Another criticism is that the time frame in which citations are counted in calculating impact factor is too short. “There are research areas in which knowledge dissemination is faster than it is in others,” Lariviere said. “We cannot, for example, expect to get the same kind of impact factor in engineering and biomedical sciences.” Yet journal impact factor is established in the two-year period following publication of articles regardless of the discipline.

The research results reveal some interesting points. On the one hand, journals are increasingly poor predictors of the number of citations an article can expect to receive. “Not only has the predictive power of impact factor declined, but also, impact factor is no longer suitable for evaluating research,” Lariviere argued. In his opinion, if we want to evaluate researchers and their work, it is best to use citations, which are a true measure of an article’s impact. “This indicator is more accurate. It is not an estimation based on the hierarchy of journals.” On the other hand, his work confirms that the dynamics of scholarly journals are changing, due especially to the open access to knowledge made possible by the Internet. “What then is the present function of scholarly journals?” Lariviere asked. “One remains: peer review.”

About the Author

Karen Foster is a holistic nutritionist and avid blogger with five kids and an active lifestyle that keeps her in pursuit of the healthiest path towards a life of balance.
