SciCombinator

Discover the latest and most talked-about scientific content & concepts.

Concept: Citation

281

Despite their recognized limitations, bibliometric assessments of scientific productivity have been widely adopted. We describe here an improved method to quantify the influence of a research article by making novel use of its co-citation network to field-normalize the number of citations it has received. Article citation rates are divided by an expected citation rate that is derived from the performance of articles in the same field and benchmarked to a peer comparison group. The resulting Relative Citation Ratio is article-level and field-independent, and it provides an alternative to the invalid practice of using journal impact factors to identify influential papers. To illustrate one application of our method, we analyzed 88,835 articles published between 2003 and 2010 and found that the National Institutes of Health awardees who authored those papers occupy relatively stable positions of influence across all disciplines. We demonstrate that the values generated by this method strongly correlate with the opinions of subject matter experts in biomedical research and suggest that the same approach should be generally applicable to articles published in all areas of science. A beta version of iCite, our web tool for calculating Relative Citation Ratios of articles listed in PubMed, is available at https://icite.od.nih.gov.
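
A minimal sketch of the ratio at the core of the method, assuming a single field-expected citation rate in place of the co-citation-network benchmark the paper actually derives:

```python
# Minimal sketch of a Relative Citation Ratio calculation.
# Assumption: `field_rate` stands in for the expected citation rate that
# the paper derives from an article's co-citation network and benchmarks
# against a peer comparison group.

def relative_citation_ratio(citations: int, years_since_publication: int,
                            field_rate: float) -> float:
    """Article citations per year divided by the field-expected rate."""
    article_rate = citations / years_since_publication
    return article_rate / field_rate

# Example: 40 citations over 5 years in a field averaging 4 citations/year
print(relative_citation_ratio(40, 5, 4.0))  # -> 2.0, twice the field norm
```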

Concepts: National Institutes of Health, Research, Division, Mathematics, Quotient, Academic publishing, Citation, Impact factor

117

Some scholars add authors to their research papers or grant proposals even when those individuals contribute nothing to the research effort. Some journal editors coerce authors to add citations that are not pertinent to their work and some authors pad their reference lists with superfluous citations. How prevalent are these types of manipulation, why do scholars stoop to such practices, and who among us is most susceptible to such ethical lapses? This study builds a framework around how intense competition for limited journal space and research funding can encourage manipulation and then uses that framework to develop hypotheses about who manipulates and why they do so. We test those hypotheses using data from over 12,000 responses to a series of surveys sent to more than 110,000 scholars from eighteen different disciplines spread across science, engineering, social science, business, and health care. We find widespread misattribution in publications and in research proposals with significant variation by academic rank, discipline, sex, publication history, co-authors, etc. Even though the majority of scholars disapprove of such tactics, many feel pressured to make such additions while others suggest that it is just the way the game is played. The findings suggest that certain changes in the review process might help to stem this ethical decline, but progress could be slow.

Concepts: Publication, Reference, Citation, Peer review, Science, Research, Scientific method, Academic publishing

63

Citations to previous literature are extensively used to measure the quality and diffusion of knowledge. However, we know little about the different ways in which a study can be cited; in particular, are papers cited to point out their merits or their flaws? We developed a methodology to characterize “negative” citations using bibliometric data and natural language processing. We found that negative citations concerned higher-quality papers, were focused on a study’s findings rather than its theories or methods, and originated from scholars who were closer to the authors of the focal paper in terms of discipline and social distance, but not geographically. Receiving a negative citation was also associated with a slightly faster decline in citations to the paper in the long run.
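
As an illustration only, a keyword heuristic conveys the shape of the classification task; the study's actual natural language processing pipeline is not reproduced here, and the cue list below is hypothetical:

```python
# Illustrative sketch: flag citation sentences that may be negative.
# The cue list is invented for this example and is far cruder than the
# trained NLP approach the study describes.

NEGATIVE_CUES = {"fails to", "contradict", "overestimat",
                 "could not replicate", "flawed"}

def looks_negative(citation_sentence: str) -> bool:
    """Crude heuristic: does the citing sentence contain a critical cue?"""
    text = citation_sentence.lower()
    return any(cue in text for cue in NEGATIVE_CUES)

print(looks_negative("Our results contradict those of Smith et al. (2010)."))
# -> True
```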

Concepts: Citation, Real number, Natural language, Science, Linguistics, Scientific method

40

A number of new metrics based on social media platforms, grouped under the term “altmetrics”, have recently been introduced as potential indicators of research impact. Despite their current popularity, there is a lack of information regarding the determinants of these metrics. Using publication and citation data from 1.3 million papers published in 2012 and covered in Thomson Reuters' Web of Science, as well as social media counts from Altmetric.com, this paper analyses the main patterns of five social media metrics as a function of document characteristics (i.e., discipline, document type, title length, number of pages and references) and collaborative practices, and compares them to patterns known for citations. Results show that the presence of papers on social media is low, with 21.5% of papers receiving at least one tweet, 4.7% being shared on Facebook, 1.9% mentioned on blogs, 0.8% found on Google+ and 0.7% discussed in mainstream media. By contrast, 66.8% of papers have received at least one citation. Our findings show that both citations and social media metrics increase with the extent of collaboration and the length of the reference list. On the other hand, while editorials and news items are seldom cited, these document types are the most popular on Twitter. Similarly, while longer papers typically attract more citations, the opposite trend is seen on social media platforms. Finally, contrary to what is observed for citations, papers in the social sciences and humanities are the most often found on social media platforms. On the whole, these findings suggest that the factors driving social media mentions and citations are different. Therefore, social media metrics cannot actually be seen as alternatives to citations; at most, they may function as complements to other types of indicators.
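
A sketch of the coverage computation behind figures such as "21.5% of papers receiving at least one tweet", using invented records in place of the Web of Science and Altmetric.com data:

```python
# Sketch of per-platform coverage: the share of papers with at least one
# event on each metric. The `papers` records are hypothetical.

papers = [
    {"tweets": 3, "facebook": 0, "blogs": 0, "citations": 2},
    {"tweets": 0, "facebook": 1, "blogs": 0, "citations": 0},
    {"tweets": 1, "facebook": 0, "blogs": 1, "citations": 5},
]

def coverage(metric: str) -> float:
    """Fraction of papers with at least one event on the given metric."""
    return sum(1 for p in papers if p[metric] > 0) / len(papers)

for metric in ("tweets", "facebook", "blogs", "citations"):
    print(f"{metric}: {coverage(metric):.1%}")
```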

Concepts: Reference, Social media, Citation, Twitter, Mass media, Social sciences, Thomson Reuters, Sociology

38

Increasingly, scholarly articles contain URI references to “web at large” resources including project web sites, scholarly wikis, ontologies, online debates, presentations, blogs, and videos. Authors reference such resources to provide essential context for the research they report on. A reader who visits a web at large resource by following a URI reference in an article, some time after its publication, is led to believe that the resource’s content is representative of what the author originally referenced. However, due to the dynamic nature of the web, that may very well not be the case. We reuse a dataset from a previous study in which several authors of this paper were involved, and investigate to what extent the textual content of web at large resources referenced in a vast collection of Science, Technology, and Medicine (STM) articles published between 1997 and 2012 has remained stable since the publication of the referencing article. We do so in a two-step approach that relies on various well-established similarity measures to compare textual content. In the first step, we use 19 web archives to find snapshots of referenced web at large resources whose textual content is representative of the state of the resource around the time of publication of the referencing paper. We find that representative snapshots exist for about 30% of all URI references. In the second step, we compare the textual content of representative snapshots with that of their live web counterparts. We find that for over 75% of references the content has drifted away from what it was when referenced. These results raise significant concerns regarding the long-term integrity of the web-based scholarly record and call for the deployment of techniques to combat these problems.
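
A sketch of the second step, comparing snapshot and live text with one standard similarity measure (difflib's ratio) where the paper combined several; the strings and the drift threshold are illustrative:

```python
# Sketch of the drift comparison: archived snapshot text vs. the live
# page. The two strings and the 0.9 threshold are invented for this
# example; the study used several well-established similarity measures.

from difflib import SequenceMatcher

snapshot_text = "The project provides an open dataset of citation links."
live_text = "The project has been discontinued; the dataset is offline."

similarity = SequenceMatcher(None, snapshot_text, live_text).ratio()
print(f"similarity: {similarity:.2f}")
if similarity < 0.9:
    print("content appears to have drifted from the referenced state")
```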

Concepts: Hypertext Transfer Protocol, Semantic Web, Website, Internet, Citation, Reference, World Wide Web, Uniform Resource Identifier

35

The number of citations that papers receive has become significant in measuring researchers' scientific productivity, and such measurements are important when one seeks career opportunities and research funding. Skewed citation practices can thus have profound effects on academic careers. We investigated (i) how frequently authors misinterpret original information and (ii) how frequently authors inappropriately cite reviews instead of the articles upon which the reviews are based. To this end, we carried out a survey of ecology journals indexed in the Web of Science and assessed the appropriateness of citations of review papers. Reviews were cited significantly more often than regular articles. In addition, 22% of citations were inaccurate, and another 15% unfairly gave credit to the review authors for other scientists' ideas. These practices should be stopped, mainly through more open discussion among mentors, researchers and students.

Concepts: Parenthetical referencing, Reference, Bibliography, Research, Science, Citation, Acknowledgment

35

Scientific articles are retracted at increasing rates, with the highest rates among top journals. Here we show that a single retraction triggers citation losses throughout an author’s prior body of work. Compared to closely matched control papers, citations fall by an average of 6.9% per year for each prior publication. These chain reactions are sustained across an author's papers (a) published up to a decade earlier and (b) connected within the author's own citation network by up to 4 degrees of separation from the retracted publication. Importantly, however, citation losses among prior work disappear when authors self-report the error. Our analyses and results span the range of scientific disciplines.
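
A sketch of the degrees-of-separation measurement, assuming a toy citation network (the paper works with the author's full publication and citation records):

```python
# Sketch: shortest-path distance from the retracted paper to each prior
# paper in the author's citation network. The graph and paper IDs are
# hypothetical; per the study, papers within 4 degrees are exposed to
# citation losses.

import networkx as nx

G = nx.Graph()  # undirected view of citation links among the author's papers
G.add_edges_from([("retracted", "p1"), ("p1", "p2"),
                  ("p2", "p3"), ("p3", "p4"), ("p4", "p5")])

lengths = nx.single_source_shortest_path_length(G, "retracted")
within_four = [paper for paper, d in lengths.items() if 0 < d <= 4]
print(within_four)  # -> ['p1', 'p2', 'p3', 'p4']
```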

Concepts: Scientific literature, International Phonetic Alphabet, Retraction, Author, Citation, Academic publishing, Scientific method, Science

28

The number of citations a scholarly work receives is a common measure of its impact on the scientific literature; “citation classics” are the most highly cited works. The content of Suicide and Life-Threatening Behavior (SLTB) citation classics is described here. The impact of SLTB citation classics is compared to that of their counterparts in the journals having published the most suicide papers. All data are from the ISI electronic venue on the Web of Science and refer to the number of citations the top 1% of works received in each of ten journals from 1975 through August 10, 2011. Among all ten journals, SLTB ranked first in the number of works on suicide. The principal theme of half of the SLTB suicide classics was literature review. The median number of citations for SLTB citation classics (top 1%) was 121.5, with a range between 96 and 279 citations, but classics from general psychiatric journals received more citations, as anticipated. Journal impact factors explained 73% of the variance in classics’ citation counts across journals. On average, suicide classics received 30% more citations than all classics. Among a second group of five specialized suicide journals, however, SLTB ranked first in average 5-year impact. Although SLTB produced the highest number of suicide articles of any journal, SLTB’s citation classics received fewer citations than suicide classics in high-impact/prestige general journals. Future work is needed to assess what predicts which SLTB articles ultimately become citation classics.
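
A sketch of two computations mentioned above, with made-up data: the top-1% cutoff that defines a citation classic, and the share of variance in classics' citation counts explained by journal impact factor (an R²):

```python
# Sketch of (1) the top-1% threshold that defines a "citation classic"
# and (2) variance explained by a one-predictor linear relationship.
# All numbers are invented for illustration.

import numpy as np

citations = np.random.default_rng(0).poisson(10, size=1000)
threshold = np.percentile(citations, 99)
classics = citations[citations >= threshold]
print(f"top-1% threshold: {threshold:.0f}, classics: {classics.size}")

# R^2 of impact factor vs. classic citation counts (hypothetical values)
impact_factor = np.array([2.1, 3.5, 5.0, 7.2, 9.8])
classic_citations = np.array([100, 120, 160, 210, 260])
r = np.corrcoef(impact_factor, classic_citations)[0, 1]
print(f"variance explained: {r**2:.0%}")  # cf. the 73% reported above
```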

Concepts: Bibliography, Median, Journal, Citation impact, Citation, Reference, Academic publishing, Impact factor

26

Due to the increasing amount of scientific work and the typical delays in publication, promptly assessing the impact of scholarly work is a huge challenge. To meet this challenge, one solution may be to create and discover innovative indicators. The goal of this paper is to investigate whether Facebook likes for unpublished manuscripts that are uploaded to the Internet could be used as an early indicator of the future impact of the scientific work. To address our research question, we compared Facebook likes for manuscripts uploaded to the Harvard Business School website (Study 1) and the bioRxiv website (Study 2) with traditional impact indicators (journal article citations, Impact Factor, Immediacy Index) for those manuscripts that have been published as a journal article. Although based on our full sample of Study 1 (N = 170), Facebook likes do not predict traditional impact indicators, for manuscripts with one or more Facebook likes (n = 95), our results indicate that the more Facebook likes a manuscript receives, the more journal article citations the manuscript receives. In additional analyses (for which we categorized the manuscripts as psychological and non-psychological manuscripts), we found that the significant prediction of citations stems from the psychological and not the non-psychological manuscripts. In Study 2, we observed that Facebook likes (N = 270) and non-zero Facebook likes (n = 84) do not predict traditional impact indicators. Taken together, our findings indicate an interdisciplinary difference in the predictive value of Facebook likes, according to which Facebook likes only predict citations in the psychological area but not in the non-psychological area of business or in the field of life sciences. Our paper contributes to understanding the possibilities and limits of the use of social media indicators as potential early indicators of the impact of scientific work.
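
A sketch of the subsample analysis, restricting to manuscripts with one or more likes and testing whether likes track later citations; the rank-correlation choice and the numbers are assumptions, not the paper's exact models:

```python
# Sketch: keep only manuscripts with non-zero Facebook likes, then test
# whether likes are associated with later citation counts. The data are
# invented; a Spearman rank correlation stands in for the study's models.

from scipy.stats import spearmanr

likes = [0, 0, 1, 2, 4, 7, 12, 30]
citations = [3, 0, 1, 4, 6, 9, 15, 40]

nonzero = [(l, c) for l, c in zip(likes, citations) if l > 0]
rho, p = spearmanr([l for l, _ in nonzero], [c for _, c in nonzero])
print(f"rho={rho:.2f}, p={p:.3f}")
```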

Concepts: Future, Futurology, Prediction, Immediacy index, Citation, Scientific method, Impact factor, Academic publishing

18

Scientists and inventors can draw on an ever-expanding literature for the building blocks of tomorrow’s ideas, yet little is known about how combinations of past work are related to future discoveries. Our analysis parameterized the age distribution of a work’s references and revealed three links between the age of prior knowledge and hit papers and patents. First, works that cite literature with a low mean age and high age variance are in a citation “hotspot”; these works double their likelihood of being in the top 5% or better of cited works. Second, the hotspot is nearly universal in all branches of science and technology and is increasingly predictive of a work’s future citation impact. Third, a scientist or inventor is significantly more likely to write a paper in the hotspot when coauthoring than when working alone. Our findings are based on all 28,426,345 scientific papers in the Web of Science, 1945-2013, and all 5,382,833 U.S. patents, 1950-2010, and reveal new antecedents of high-impact science and the link between prior literature and tomorrow’s breakthrough ideas.
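
A sketch of the hotspot test, computing the mean and variance of a work's reference ages; the cutoffs are illustrative, since the paper benchmarks them within fields:

```python
# Sketch: a work sits in the citation "hotspot" when its references have
# a low mean age and a high age variance. The cutoff values here are
# invented; the paper derives them empirically, field by field.

from statistics import mean, pvariance

def is_hotspot(reference_ages, mean_cutoff=8.0, variance_cutoff=50.0):
    """Low mean age plus high age variance marks a citation hotspot."""
    return (mean(reference_ages) <= mean_cutoff
            and pvariance(reference_ages) >= variance_cutoff)

recent_plus_vintage = [1, 2, 3, 2, 4, 30, 1, 2]  # mostly new, one classic
print(is_hotspot(recent_plus_vintage))  # -> True
```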

Concepts: Technology, Invention, Scientist, Scientific method, Citation, Reference, Impact factor, Science