
Concept: Journal Citation Reports


Objective: To examine how poor reporting and inadequate methods for key methodological features in randomised controlled trials (RCTs) have changed over the past three decades. Design: Mapping of trials included in Cochrane reviews. Data sources: Data from RCTs included in all Cochrane reviews published between March 2011 and September 2014 reporting an evaluation of the Cochrane risk of bias items: sequence generation, allocation concealment, blinding, and incomplete outcome data. Data extraction: For each RCT, we extracted the consensus risk of bias rating made by the review authors and identified the primary reference to extract publication year and journal. We matched journal names with Journal Citation Reports to obtain 2014 impact factors. Main outcome measures: We considered the proportions of trials rated by review authors at unclear and high risk of bias as surrogates for poor reporting and inadequate methods, respectively. Results: We analysed 20 920 RCTs (from 2001 reviews) published in 3136 journals. The proportion of trials with unclear risk of bias was 48.7% for sequence generation and 57.5% for allocation concealment; the proportions at high risk of bias were 4.0% and 7.2%, respectively. For blinding and incomplete outcome data, 30.6% and 24.7% of trials were at unclear risk and 33.1% and 17.1% were at high risk, respectively. Higher journal impact factor was associated with a lower proportion of trials at unclear or high risk of bias. The proportion of trials at unclear risk of bias decreased over time, especially for sequence generation, which fell from 69.1% in 1986-1990 to 31.2% in 2011-14, and for allocation concealment (70.1% to 44.6%). After excluding trials at unclear risk of bias, use of inadequate methods also decreased over time: from 14.8% to 4.6% for sequence generation and from 32.7% to 11.6% for allocation concealment. Conclusions: Poor reporting and inadequate methods have decreased over time, especially for sequence generation and allocation concealment. But more could be done, especially in lower impact factor journals.

Concepts: Scientific method, Randomized controlled trial, Academic publishing, Nature, Impact factor, Bibliometrics, Institute for Scientific Information, Journal Citation Reports


The National Institute of Academic Anaesthesia (NIAA) was founded in 2008 to lead a UK strategy for developing academic anaesthesia. We aimed to assess the distribution of applications and quantify the academic returns of NIAA-supported research grants, as this has hitherto not been analysed. We sought data on the baseline characteristics of all grant applicants and recipients. Every grant recipient from 2008 to 2015 was contacted to ascertain the status of their supported research projects. We also examined Google Scholar, the Scopus® database and InCites Journal Citation Reports for citation, author and journal metrics, respectively. In total, 495 research project applications were made, with 150 grants being awarded. Data on 121 out of 150 (80.7%) grant awards, accounting for £3.5 million, were collected, of which 91 completed studies resulted in 140 publications and 2759 citations. The median (IQR [range]) time to first or only publication was 3 (2-4 [0-9]) years. The overall cost per publication was £14,970 (£7457-£24,998 [£2212-£73,755]) and the cost per citation was £1515 (£323-£3785 [£70-£36,182]), with 1 (0-2 [0-8]) publication and 4 (0-25 [0-265]) citations resulting per grant. The impact factor of journals in which publications arose was 4.7 (2.5-6.2 [0-47.8]), with the highest impact arising from clinical and basic science studies, particularly in the fields of pain and peri-operative medicine. Grants were most frequently awarded to clinical and basic science categories of study, but in terms of specialty, critical care medicine and peri-operative medicine received the greatest number of grants. Superficially, there appeared to be a geographical disparity, with 123 (82%) grants being awarded to researchers in England, London receiving 48 (32%) of these. However, this was in proportion to the number of grant applications received by country or city of application, such that there was no significant difference in overall success rates.
There was no significant difference in productivity, in terms of publications and citations, from grants awarded to each city. The 150 grants were awarded to 107 recipients (identified as the most senior applicant for each grant), 27 of whom received two or more grants. Recipients had a median career total of 21 (8-76 [0-254]) publications and 302 (44-1320 [0-8167]) citations, with an h-index of 8 (3-22 [0-54]). We conclude that a key determinant of grant success is simply applying. This is the first study to report the distribution and scholarly output of individual anaesthesia research grants, particularly from a collaborative body such as the NIAA, and can be used as a benchmark to further develop academic anaesthesia in the UK and beyond.
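The recipient-level h-index reported above can be computed directly from a list of per-publication citation counts. A minimal sketch (the citation counts below are hypothetical):

```python
def h_index(citations):
    """Largest h such that at least h publications have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # still at least `rank` papers with >= `rank` citations
        else:
            break
    return h

# Hypothetical recipient with six publications
print(h_index([48, 20, 11, 5, 3, 1]))  # 4: four papers have >= 4 citations each
```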

Concepts: Academic publishing, Research, Impact factor, Ulysses S. Grant, Grant, Citation, Institute for Scientific Information, Journal Citation Reports


Research productivity and impact are often considered in professional evaluations of academics, and performance metrics based on publications and citations increasingly are used in such evaluations. To promote evidence-based and informed use of these metrics, we collected publication and citation data for 437 tenure-track faculty members at 33 research-extensive universities in the United States belonging to the National Association of University Fisheries and Wildlife Programs. For each faculty member, we computed 8 commonly used performance metrics based on numbers of publications and citations, and recorded covariates including academic age (time since Ph.D.), sex, percentage of appointment devoted to research, and the sub-disciplinary research focus. Standardized deviance residuals from regression models were used to compare faculty after accounting for variation in performance due to these covariates. We also aggregated residuals to enable comparison across universities. Finally, we tested for temporal trends in citation practices to assess whether the “law of constant ratios”, used to enable comparison of performance metrics between disciplines that differ in citation and publication practices, applied to fisheries and wildlife sub-disciplines when mapped to Web of Science Journal Citation Report categories. Our regression models reduced deviance by ¼ to ½. Standardized residuals for each faculty member, when combined across metrics as a simple average or weighted via factor analysis, produced similar results in terms of performance based on percentile rankings. Significant variation was observed in scholarly performance across universities, after accounting for the influence of covariates. In contrast to findings for other disciplines, normalized citation ratios for fisheries and wildlife sub-disciplines increased across years. Increases were comparable for all sub-disciplines except ecology. 
We discuss the advantages and limitations of our methods, illustrate their use when applied to new data, and suggest future improvements. Our benchmarking approach may provide a useful tool to augment detailed, qualitative assessment of performance.
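The covariate-adjustment step can be illustrated with a simplified stand-in: an ordinary least squares fit (rather than the deviance residuals from the paper's regression models), whose residuals are standardized so faculty can be compared after accounting for a covariate such as academic age. All names and data below are hypothetical:

```python
import statistics

def standardized_residuals(x, y):
    """OLS fit of y on x, then residuals scaled by their standard deviation,
    so individuals can be compared after accounting for the covariate x."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    intercept = my - slope * mx
    resid = [yi - (intercept + slope * xi) for xi, yi in zip(x, y)]
    sd = statistics.stdev(resid)
    return [r / sd for r in resid]

# Hypothetical: publication counts vs. years since Ph.D. for five faculty members
years = [3, 8, 12, 20, 30]
pubs = [10, 25, 30, 70, 90]
print(standardized_residuals(years, pubs))  # positive = above expectation for age
```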

Concepts: Regression analysis, Academic publishing, University, Academia, Dean, Faculty, College, Journal Citation Reports


We analysed the authorship policies of a random sample of 600 journals from the Journal Citation Reports database. Of the journals we sampled, 62.5% had an authorship policy. Having an authorship policy was positively associated with impact factor. Journals from the biomedical sciences and social sciences/humanities were more likely to have an authorship policy than journals from the physical sciences, engineering or mathematical sciences. Among journals with a policy, the most frequent type of policy was guidance on criteria for authorship (99.7%); followed by guidance on acknowledgments (97.3%); requiring that authors make substantial contributions to the research (94.7%); requiring that authors be accountable for the research as a whole (84.8%); guidance on changes in authorship (77.9%); requiring that authors give final approval to the manuscript (77.6%); requiring that authors draft or critically revise the manuscript (71.7%); providing guidance on corporate authorship (58.9%); prohibiting gift, guest or ghost authorship (31.7%); requiring authors to describe their contributions (5.3%); limiting the number of authors for some types of articles (4.0%); and requiring authors to be accountable for their part in the research (1.1%). None of the policies addressed equal contribution statements. Journals that do not have authorship policies should consider adopting or developing one.

Concepts: Academic publishing, Science, Impact factor, Scientific journal, Journal Citation Reports


The past three decades have witnessed a boost in science development in China; in parallel, more and more Chinese scientific journals are indexed in the Journal Citation Reports issued by Thomson Reuters (SCI). Evaluating the performance of these Chinese SCI journals is necessary and helpful for improving their quality. This study aimed to evaluate these journals by calculating various journal self-citation rates, which are important parameters influencing a journal's impact factor.
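A journal self-citation rate of the kind the study calculates is the share of a journal's incoming citations that the journal contributes itself. A minimal sketch with hypothetical counts:

```python
def self_citation_rate(self_cites, total_cites):
    """Self-cited rate: citations a journal receives from its own articles,
    divided by all citations it receives."""
    if total_cites == 0:
        return 0.0
    return self_cites / total_cites

# Hypothetical journal: 150 of 1,000 incoming citations are self-citations
rate = self_citation_rate(150, 1000)
print(f"{rate:.1%}")  # 15.0%
```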

Concepts: Academic publishing, Science, Nature, Impact factor, Scientific journal, Bibliometrics, Institute for Scientific Information, Journal Citation Reports


Publishing in peer-reviewed journals is essential for medical education researchers. Competition remains fierce for top journals, and authors are advised to consider impact factor (IF), audience, and alignment of focus. However, little is known about how authors balance these factors when making submission decisions. The authors aimed to explore decision-making around journal choice.

Concepts: Scientific method, Academic publishing, Science, Nature, Scientific journal, Journal Citation Reports, Academic journals


The purpose of our study was to identify and characterize the 100 most-cited articles in neuroimaging. Based on the database of Journal Citation Reports, we selected 669 journals that were considered potential outlets for neuroimaging articles. The Web of Science search tools were used to identify the 100 most-cited articles relevant to neuroimaging within the selected journals. The following information was recorded for each article: publication year, journal, category and impact factor of journal, number of citations, number of annual citations, authorship, department, institution, country, article type, imaging technique used, and topic. The 100 most-cited articles in neuroimaging were published between 1980 and 2012, with 1995-2004 producing 69 articles. Citations ranged from 673 to 4,384 and annual citations from 24.9 to 313.1. The majority of articles were published in radiology/imaging journals (n = 75), originated in the United States (n = 58), were original articles (n = 63), used MRI as the imaging modality (n = 85), and dealt with imaging technique (n = 45). The Oxford Centre for Functional Magnetic Resonance Imaging of the Brain at John Radcliffe Hospital (n = 10) was the leading institution, and Karl J. Friston (n = 11) was the most prolific author. Our study presents a detailed list and an analysis of the 100 most-cited articles in the field of neuroimaging, which provides insight into historical developments and allows for recognition of the important advances in this field.

Concepts: Brain, Medical imaging, Magnetic resonance imaging, Impact factor, Bibliometrics, Journal Citation Reports, Radcliffe Infirmary, John Radcliffe Hospital


Bibliometrics is a set of methods that can be used to quantitatively analyze academic literature and its changes over time. The objectives of this study were 1) to evaluate trends related to the academic performance of dental journals from 2003 to 2012 using bibliometric indices, and 2) to monitor the changes in the five dental journals with the highest and lowest impact factor (IF) published in 2003. Data for the subject category "Dentistry, Oral Surgery & Medicine" were retrieved from the Journal Citation Reports (JCR) published from 2003 to 2012. Linear regression analysis was used to determine statistical trends over the years, with each bibliometric indicator as the dependent variable and the JCR year as the predictor variable. A statistically significant rise was observed from 2003 to 2012 in the total number of dental journals, the number of articles (with the steepest rise for research articles), the number of citations, and the aggregate IF. The analysis of the five top and five bottom-tier dental journals revealed a rise in IF, albeit with wide variation in the magnitude of this rise. Although the IF of the top five journals remained relatively constant, the percentile ranks of the four lowest-ranking journals in 2003 increased significantly, with the sharpest rise noted for the British Journal of Oral & Maxillofacial Surgery. This study revealed significant growth of the dental literature in absolute terms, as well as upward trends for most of the citation-based bibliometric indices from 2003 to 2012.

Concepts: Scientific method, Statistics, Academic publishing, Impact factor, Dentistry, Oral and maxillofacial surgery, Bibliometrics, Journal Citation Reports


The mission of any academic orthopaedic training program can be divided into 3 general areas of focus: clinical care, academic performance, and research. Clinical care is evaluated on clinical volume, patient outcomes, and patient satisfaction, and is becoming increasingly focused on data-driven quality metrics. Academic performance of a department can be used to motivate individual surgeons, but objective measures are used to define a residency program. Annual in-service examinations serve as a marker of resident knowledge base, and board pass rates are clearly scrutinized. Research productivity, however, has proven harder to objectively quantify. In an effort to improve transparency and better account for conflicts of interest, bias, and self-citation, multiple bibliometric measures have been developed. Rather than using individuals' research productivity as a surrogate for departmental research, we sought to establish an objective methodology to better assess a residency program's ability to conduct meaningful research. In this study, we describe a process to assess the number and quality of publications produced by an orthopaedic residency department. This would allow chairmen and program directors to benchmark their current production and make measurable goals for future research investment. The main goal of the benchmarking system is to create an "h-index" for residency programs. To do this, we needed to create a list of relevant articles in the orthopaedic literature, so we used the Journal Citation Reports, which lists all orthopaedic journals that are given an impact factor rating every year. When we accessed the Journal Citation Reports database, there were 72 journals included in the orthopaedic literature section. To ensure only relevant, impactful journals were included, we selected journals with an impact factor greater than 0.95 and an Eigenfactor Score greater than 0.00095.
After excluding journals not meeting these criteria, we were left with 45 journals. We performed a Scopus search over a 10-year period of these journals and created a database of articles and their affiliated institutions. We performed several iterations of this to maximize the capture of articles attributed to institutions with multiple names. Based on this extensive database, we were able to analyze all allopathic US residency programs according to their quality research productivity. We believe this to be a novel methodology, creating a system by which residency program chairmen and directors can assess progress over time and make accurate comparisons with other programs.
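The journal-selection step described above amounts to a two-threshold filter over JCR records. A minimal sketch, with hypothetical journal names and scores (only the two thresholds are taken from the study):

```python
# Hypothetical JCR records: (journal, impact factor, Eigenfactor Score)
journals = [
    ("Journal A", 2.10, 0.0150),
    ("Journal B", 0.80, 0.0005),
    ("Journal C", 1.30, 0.0009),
    ("Journal D", 0.96, 0.0012),
]

def select_journals(records, min_if=0.95, min_eigenfactor=0.00095):
    """Keep only journals exceeding both thresholds, mirroring the study's filter."""
    return [name for name, jif, ef in records
            if jif > min_if and ef > min_eigenfactor]

print(select_journals(journals))  # ['Journal A', 'Journal D']
```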

Concepts: Academic publishing, The Surrogates, Impact factor, Impact event, Bibliometrics, Institute for Scientific Information, Journal Citation Reports, Scopus


The Journal of Psychology: Interdisciplinary and Applied is a leading international journal in psychology dating back to 1935. This study examines its publications since its creation using bibliometric analysis. The primary objective is to provide a complete overview of the key factors affecting the journal. This analysis includes such key issues as the publication and citation structure of the journal, its most cited articles, and the leading authors, institutions, and countries referenced in the journal. The work uses the Scopus database to classify the bibliographic material. Additionally, the analysis provides a graphical mapping of the bibliographic data using visualization-of-similarities viewer software, which applies several bibliometric techniques including co-citation, bibliographic coupling and co-occurrence of keywords. The Journal of Psychology is strongly connected to most of the current leading journals in psychology, and currently has a 5-year impact factor of 1.77 (Thomson Reuters, 2015 Journal Citation Reports).

Concepts: Academic publishing, Impact factor, Bibliography, Citation, Thomson Reuters, Bibliometrics, Reuters, Journal Citation Reports