Discover the most talked about and latest scientific content & concepts.

Concept: Eugene Garfield


The Journal Impact Factor (JIF) is a single citation metric that is widely employed for ranking journals and choosing target journals, but is also misused as a proxy for the quality of individual articles and the academic achievements of authors. This article analyzes Scopus-based publication activity on the JIF and overviews some of its numerous misuses, global initiatives to overcome the ‘obsession’ with impact factors, and emerging strategies to revise the concept of scholarly impact. The growing number of articles on the JIF, most of which are in English, reflects the interest of experts in journal editing and scientometrics in its uses, misuses, and options for overcoming related problems. Displaying JIF values alone on journal websites is criticized by experts because these average metrics do not reflect the skewness of the citation distribution of individual articles. Emerging strategies suggest complementing the JIF with citation plots and alternative metrics that reflect the use of individual articles in terms of downloads and the distribution of related information through social media and networking platforms. It is also proposed to revise the original formula of the JIF calculation and to embrace the concept of the impact and importance of individual articles. The latter depends largely on the ethical soundness of journal instructions, proper editing and structuring of articles, efforts to promote related information through social media, and endorsements by professional societies.
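The two-year JIF formula referenced above is a simple ratio; as a minimal illustrative sketch (with invented example numbers, not data from any real journal):

```python
def journal_impact_factor(cites_to_prev_two_years, citable_items_prev_two_years):
    """Two-year Journal Impact Factor: citations received in year Y to items
    published in years Y-1 and Y-2, divided by the number of citable items
    published in those two years."""
    return cites_to_prev_two_years / citable_items_prev_two_years

# Hypothetical example: 600 citations in 2023 to articles from 2021-2022,
# which together contained 240 citable items.
jif = journal_impact_factor(600, 240)
print(round(jif, 2))  # 2.5
```

Because the JIF is an arithmetic mean, a handful of highly cited articles can dominate it, which is the skewness problem the abstract describes.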

Concepts: Academic publishing, Nature, Impact factor, Bibliometrics, Eugene Garfield


The incentive structure of a scientist’s life is increasingly mimicking economic principles. While intensely criticized, the journal impact factor (JIF) has taken on a role as the new currency for scientists. Successful goal-directed behavior in academia thus requires knowledge about the JIF. Using functional neuroimaging, we examined how the JIF, as a powerful incentive in academia, has shaped the behavior of scientists and the reward signal in the striatum. We demonstrate that the reward signal in the nucleus accumbens increases with higher JIF during the anticipation of a publication, and we found a positive correlation with the personal publication record (pJIF), supporting the notion that scientists have incorporated the predominant reward principle of the scientific community into their reward system. The implications of this behavioral adaptation within the ecological niche of the scientist’s habitat remain unknown, but it may also have effects that were not intended by the community.

Concepts: Ventral tegmental area, Eugene Garfield


Bibliometric indicators increasingly affect the careers, funding, and reputation of individuals, their institutions, and journals themselves. In contrast to author self-citations, little is known about the kinetics of journal self-citations. Here we hypothesized that they may show a generalizable pattern within particular research fields or across multiple fields. We thus analyzed self-cites to 60 journals from three research fields (multidisciplinary sciences, parasitology, and information science). We also hypothesized that the kinetics of journal self-citations and of citations received from other journals of the same publisher may differ from those of foreign citations. We analyzed journals published by the American Association for the Advancement of Science, the Nature Publishing Group, and Editura Academiei Române. We found that although the kinetics of journal self-cites is generally faster than that of foreign cites, it shows some field-specific characteristics. In information science journals in particular, the initial increase in the share of journal self-citations during post-publication year 0 was completely absent. Self-promoting journal self-citations in top-tier journals have negligible direct effects on bibliometric indicators, affecting just the immediacy index and marginally increasing the impact factor itself, as long as the affected journals are well established in their fields. In contrast, other forms of journal self-citation and citation stacking may severely affect the impact factor or other citation-based indices.
We identified here a network of three Romanian physics journals, Proceedings of the Romanian Academy, Series A, Romanian Journal of Physics, and Romanian Reports in Physics, which displayed a low to moderate ratio of journal self-citations but recently multiplied their impact factors, and which were mutually responsible for 55.9%, 64.7%, and 63.3%, respectively, of the citations to the three journals within the impact factor calculation window. They received almost no network self-cites prior to the impact factor calculation window, and their network self-cites decreased sharply after it. Journal self-citation and citation stacking require increased attention and elimination from citation indices.
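The network-citation shares reported above are simple proportions of citations arriving inside the impact factor calculation window; the sketch below uses hypothetical citation counts, not the actual data from the study:

```python
def network_citation_share(network_cites, total_cites_in_window):
    """Fraction of citations counted in the impact factor window that
    originate from journals in a suspected citation-stacking network."""
    return network_cites / total_cites_in_window

# Hypothetical counts: 190 of the 340 citations falling inside a journal's
# impact factor window come from its two partner journals in the network.
print(f"{network_citation_share(190, 340):.1%}")
```

A high share concentrated inside the window, with few network citations before or after it, is the signature pattern the study describes.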

Concepts: Academic publishing, Nature, Impact factor, Scientific journal, Bibliometrics, Immediacy index, Eugene Garfield, Nature Medicine


Standard area diagrams (SADs) have long been used as a tool to aid the estimation of plant disease severity, an essential variable in phytopathometry. Formal validation of SADs was not considered prior to the early 1990s, when considerable effort began to be invested in developing SADs and assessing their value for improving the accuracy of disease severity estimates in many pathosystems. Peer-reviewed literature post-1990 was identified, selected, and cataloged in bibliographic software for further scrutiny and extraction of scientometric, pathosystem-, and methodology-related data. A total of 105 studies (127 SADs) were found, authored by 327 researchers from 10 countries, mainly Brazil. The six most prolific authors published at least seven studies each. The scientific impact of a SAD article, based on annual citations after the year of publication, was affected by disease significance, the journal’s impact factor, and methodological innovation. The reviewed SADs encompassed 48 crops and 103 unique diseases across a range of plant organs. Severity was quantified largely by image analysis software, such as QUANT, APS-Assess®, or a LI-COR® leaf area meter. The most typical SADs comprised five to eight black-and-white drawings of leaf diagrams with severity increasing non-linearly. However, there was a trend towards using true-color photographs or stylized representations in a range of color combinations and more linear (equally spaced) increments of severity. A two-step SAD validation approach was used in 78/105 studies, for which linear regression was the preferred method, but a trend towards using Lin’s concordance correlation analysis and hypothesis tests to detect the effect of SADs on accuracy was apparent. Reliability measures, when obtained, mainly considered variation among, rather than within, raters. The implications of the findings and knowledge gaps are discussed.
A list of best practices for designing and implementing SADs and a webpage called SADBank for hosting SAD research data are proposed.
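Lin's concordance correlation analysis mentioned above measures agreement between estimated and actual severities on the identity line, not merely linear association. A minimal sketch, using hypothetical rater data rather than values from any reviewed study:

```python
import statistics

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient: 1 means perfect agreement
    between paired values; penalizes both scatter and systematic bias."""
    n = len(x)
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sx2 = sum((v - mx) ** 2 for v in x) / n   # population variance of x
    sy2 = sum((v - my) ** 2 for v in y) / n   # population variance of y
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n  # covariance
    return 2 * sxy / (sx2 + sy2 + (mx - my) ** 2)

# Hypothetical actual vs. estimated severities (% diseased leaf area)
actual = [5, 10, 20, 40, 60]
estimates = [6, 12, 18, 42, 55]
print(round(lins_ccc(actual, estimates), 3))
```

Unlike Pearson's correlation, the coefficient drops when a rater is consistently biased, which is why it suits SAD validation better than regression slope alone.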

Concepts: Scientific method, Regression analysis, Estimation theory, Academic publishing, Impact factor, 1990s, Bibliography, Eugene Garfield


ResearchGate has been regarded as one of the most attractive academic social networking sites for the scientific community. It has been trying to improve user-centered interfaces to gain more attractiveness among scientists around the world. The display of journal-related scientometric measures (such as impact factor, 5-year impact factor, cited half-life, and Eigenfactor) is an important feature of ResearchGate. Open access publishing has further increased the visibility of research work and eased access to research-related information. Moreover, the scientific community has been much interested in promoting its work and exhibiting its impact to others through reliable scientometric measures. However, with the growing market of publications and improvements in the field of research, this community has been victimized by cybercrime in the form of ghost journals, fake publishers, and magical impact measures. ResearchGate, in particular, has recently been lenient in its policies against this dark side of academic writing. Therefore, this communication aims to discuss concerns associated with leniency in ResearchGate policies and their impact on the scientific community.

Concepts: Scientific method, Academic publishing, Peer review, Impact factor, Scientific journal, Publishing, Eugene Garfield, Eigenfactor


In the past decade, researchers have made great progress in the field of orthopedics. However, the research status of different countries is unclear. To summarize the number of published articles, we assessed the cumulative impact factors in top orthopedic journals. The aims of the study were to measure: 1) the quality and quantity of publications in orthopedics-related journals from China and five other countries, and 2) the trend in the number of publications in orthopedics-related journals. The related journals were selected based on the 2014 Science Citation Index (SCI), and articles were searched in the PubMed database. To assess the quantity and quality of research output, we recorded the number of publications, including clinical trials, randomized controlled trials, meta-analyses, case reports, and reviews, as well as citations, impact factors, the number of articles in the top 10 journals, and the most popular journals. A total of 143,138 orthopedics articles were published from 2005 to 2014. The USA accounts for 24.9% (35,763/143,138) of the publications, followed by the UK (7878/143,138; 5.5%), Japan (7133/143,138; 5.0%), Germany (5942/143,138; 4.2%), China (4143/143,138; 2.9%), and France (2748/143,138; 1.9%). The ranking by accumulated impact factors is as follows: USA, UK, Japan, Germany, France, and China. The ranking by mean impact factor is USA, China, Germany, Japan, France, and UK; interestingly, the mean impact factor for Japan was similar to that of Germany in 2005-2014. The USA had the highest percentage of articles in the top 10 journals, while China had the lowest. The USA had the highest number of average citations, while Japan had the lowest. According to this study, we can conclude that the USA has been leading orthopedics research in the past 10 years. Although China still falls behind, it has made considerable progress in orthopedics research, not only in quantity but also in quality.

Concepts: European Union, Developed country, Academic publishing, Olympic Games, Impact factor, World War II, Institute for Scientific Information, Eugene Garfield


Introduction Hirschsprung disease (HD) is a congenital bowel innervation disorder that involves several clinical specialties. There is increasing interest in the topic, reflected by the number of annually published items. It is therefore difficult for a single scientist to survey all published items and to gauge their scientific importance or value. Thus, tremendous efforts have been made over the past decades to establish sustainable parameters for evaluating scientific work. This was the birth of scientometrics. Materials and Methods To quantify the global research activity in this field, a scientometric analysis was conducted. We analyzed the research output of countries, individual institutions, and authors, and their collaborative networks, using the Web of Science database. Density-equalizing maps and network diagrams were employed as state-of-the-art visualization techniques. Results The United States is the leading country in terms of published items (n = 685), institutions (n = 347), and cooperation (n = 112). However, although it dominates in quantity, the most intensive international networks between authors and institutions are not linked to the United States. By contrast, most European countries achieve the highest publication impact. Further analysis reveals the influence of international cooperation and associated phenomena on the research field of HD. Conclusion We conclude that the field of HD is constantly progressing. The importance of international cooperation in the scientific community is continuously growing.

Concepts: Scientific method, United States, Academic publishing, Science, Scientific literature, Technical report, Citation index, Eugene Garfield


“Not everything that can be counted counts. Not everything that counts can be counted.” William Bruce Cameron Journal metrics mania started over 50 years ago with the impact factor, which has since become well entrenched in publishing. Ask anyone where they would like to publish their research, and most will reply: in the journal with the highest impact factor. While this suggests quality and a degree of vetting by the scientific community, the impact factor has also been used to benchmark and compare journals. Impact factors are often used as a proxy for a journal’s quality and scientific prestige. However, is medicine dependent on a valuation system that may be grounded in falsity? Much about this measure is imperfect and destructive. Journals can manipulate the impact factor by refusing to publish articles, such as case reports, that are unlikely to be cited or, conversely, by publishing a large proportion of review articles, which tend to attract more citations. Another tactic is to publish articles that could be highly cited early in the year, thereby leaving more time to collect citations. Many use the impact factor as an important determinant of grants, awards, promotions, and career advancement, and also as a basis for an individual’s reputation and professional standing. Nevertheless, you should remember that the impact factor is not a measure of an individual article, let alone an individual scientist. As long as an article has been cited, the citation will contribute to the journal’s impact factor, regardless of whether the article’s premise is true or false, or whether the cited paper was being credited or criticised. Perversely, a weak paper that is being refuted will augment the impact factor, as will a retracted article, because although the article may have been retracted, citations of it will still count.
The impact factor has weathered many storms in the past, but criticism of it is increasing, as is interest in displacing it as the single metric used to measure an article’s influence. Many would like the scientific community to assess research on its merits rather than on the basis of the journal in which it is published. With the advent of social media, an article can now be commented on in real time through tweets, bookmarks, and blogs. In future, these measures will complement the impact factor, but they will probably not replace it. Despite its imperfections, the impact factor has been around for a long time, and although many alternative metrics have since emerged, nothing better is yet available. Perhaps it is the scientific community’s misuse of the impact factor that is the problem, and not the impact factor itself? In this article, Pippa Smart, the guest editor for this series, writes about ways to measure the impact of a journal and its published articles. JYOTI SHAH Commissioning Editor.

Concepts: Academic publishing, Nature, Impact factor, Scientific journal, Publishing, Eugene Garfield