SciCombinator

Discover the most talked about and latest scientific content & concepts.

Concept: Bibliography

33

The usefulness of Google Scholar (GS) as a bibliographic database for biomedical systematic review (SR) searching is a subject of current interest and debate in research circles. Recent research has even suggested that GS might be used alone in SR searching. We challenge this assertion by testing whether GS can locate all studies included in 21 previously published SRs. We then examine the recall of GS, taking into account the maximum number of items that can be viewed, and test whether more complete searches created by an information specialist improve recall compared with the searches used in the 21 published SRs.

Concepts: Academic publishing, Medical research, Review, Bibliographic databases, Bibliographic database, Bibliography, Searching, Reference management software

32

The number of citations that papers receive has become significant in measuring researchers' scientific productivity, and such measurements are important when one seeks career opportunities and research funding. Skewed citation practices can thus have profound effects on academic careers. We investigated (i) how frequently authors misinterpret original information and (ii) how frequently authors inappropriately cite reviews instead of the articles upon which the reviews are based. To this end, we carried out a survey of ecology journals indexed in the Web of Science and assessed the appropriateness of citations of review papers. Reviews were cited significantly more often than regular articles. In addition, 22% of citations were inaccurate, and another 15% unfairly gave credit to the review authors for other scientists' ideas. These practices should be stopped, mainly through more open discussion among mentors, researchers and students.

Concepts: Science, Research, Bibliography, Reference, Citation, Parenthetical referencing, Acknowledgment

28

The number of citations a scholarly work receives is a common measure of its impact on the scientific literature; “citation classics” are the most highly cited works. The content of Suicide and Life-Threatening Behavior (SLTB) citation classics is described here. The impact of SLTB citation classics is compared to their counterparts in journals having published the most suicide papers. All data are from the ISI electronic venue on the Web of Science and refer to the number of citations the top 1% of works received in each of ten journals from 1975 through August 10, 2011. Among all ten journals, SLTB ranked first in the number of works on suicide. The principal theme of half of SLTB suicide classics was literature review. The median number of citations for SLTB citation classics (top 1%) was 121.5, with a range between 96 and 279 citations, but classics from general psychiatric journals received more citations, as anticipated. Journal impact factors explained 73% of the variance in classics’ citation counts across journals. On average, suicide classics received 30% more citations than all classics. Among a second group of five specialized suicide journals, however, SLTB ranked first in average 5-year impact. Although SLTB produced the highest number of suicide articles of any journal, SLTB’s citation classics received fewer citations than suicide classics in high-impact/prestige, general journals. Future work is needed to assess what predicts which SLTB articles ultimately become citation classics.

Concepts: Median, Academic publishing, Impact factor, Bibliography, Reference, Journal, Citation, Citation impact

27

Websites and online resources outside academic bibliographic databases can be significant sources for identifying literature, though there are challenges in searching and managing the results. These are pertinent to systematic reviews that are underpinned by principles of transparency, accountability and reproducibility. We consider how the conduct of searching these resources can be compatible with the principles of a systematic search. We present an approach to address some of the challenges. This is particularly relevant when websites are relied upon to identify important literature for a review. We recommend considering the process as three stages and having a considered rationale and sufficient recordkeeping at each stage that balances transparency with practicality of purpose. Advances in technology and recommendations for website providers are briefly discussed.

Concepts: Cancer staging, Stage, Website, Identification, Publishing, Bibliographic database, Bibliography, Searching

13

A Sleeping Beauty (SB) in science refers to a paper whose importance is not recognized for several years after publication. Its citation history exhibits a long hibernation period followed by a sudden spike of popularity. Previous studies suggest a relative scarcity of SBs. The reliability of this conclusion is, however, heavily dependent on identification methods based on arbitrary threshold parameters for sleeping time and number of citations, applied to small or monodisciplinary bibliographic datasets. Here we present a systematic, large-scale, and multidisciplinary analysis of the SB phenomenon in science. We introduce a parameter-free measure that quantifies the extent to which a specific paper can be considered an SB. We apply our method to 22 million scientific papers published in all disciplines of natural and social sciences over a time span longer than a century. Our results reveal that the SB phenomenon is not exceptional. There is a continuous spectrum of delayed recognition where both the hibernation period and the awakening intensity are taken into account. Although many cases of SBs can be identified by looking at monodisciplinary bibliographic data, the SB phenomenon becomes much more apparent with the analysis of multidisciplinary datasets, where we can observe many examples of papers achieving delayed yet exceptional importance in disciplines different from those where they were originally published. Our analysis emphasizes a complex feature of citation dynamics that so far has received little attention, and also provides empirical evidence against the use of short-term citation metrics in the quantification of scientific impact.

Concepts: Time, Scientific method, Science, Research, Empirical, Impact factor, Bibliography, Citation
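The parameter-free measure described in the abstract above can be sketched in code. The sketch below follows one published formulation of the "beauty coefficient" (Ke et al., PNAS 2015): each year's citation count is compared against a straight line drawn from the publication-year count to the peak-year count, so a long flat hibernation followed by a late spike scores high. Treat this as an illustration of the idea, not the exact production implementation.

```python
def beauty_coefficient(citations):
    """Parameter-free 'Sleeping Beauty' score for a citation history.

    citations[t] = citations received t years after publication.
    Sketch after Ke et al. (2015): sum, over the years up to the peak,
    of how far each year's count falls below the straight line joining
    the publication-year count to the peak-year count, normalized by
    that year's count.
    """
    c0 = citations[0]
    # Peak year: first year with the maximum citation count.
    tm = max(range(len(citations)), key=lambda t: citations[t])
    if tm == 0:
        return 0.0  # most cited immediately: no hibernation
    cm = citations[tm]
    slope = (cm - c0) / tm
    return sum((slope * t + c0 - citations[t]) / max(1, citations[t])
               for t in range(tm + 1))
```

A paper cited steadily, or one whose citations grow linearly to the peak, scores 0; nine silent years followed by a burst of 100 citations scores far higher, matching the "continuous spectrum of delayed recognition" the abstract describes.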

10

Authorship and citation practices evolve with time and differ by academic discipline. As such, indicators of research productivity based on citation records are naturally subject to historical and disciplinary effects. We observe these effects on a corpus of astronomer career data constructed from a database of refereed publications. We employ a simple mechanism to measure research output using author and reference counts available in bibliographic databases to develop a citation-based indicator of research productivity. The total research impact (tori) quantifies, for an individual, the total amount of scholarly work that others have devoted to his/her work, measured in the volume of research papers. A derived measure, the research impact quotient (riq), is an age-independent measure of an individual’s research ability. We demonstrate that these measures are substantially less vulnerable to temporal debasement and cross-disciplinary bias than the most popular current measures. The proposed measures of research impact, tori and riq, have been implemented in the Smithsonian/NASA Astrophysics Data System.

Concepts: Scientific method, Academic publishing, Research, Academia, Peer review, Bibliography, Interdisciplinarity, Astronomy
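The tori and riq indicators above lend themselves to a short sketch. Roughly, tori credits the cited author with each citation discounted by how many references the citing paper spreads its attention over and how many co-authors share the cited paper; riq then normalizes by career length. The data layout below (a list of per-paper dicts) is a simplified stand-in of my own, not the ADS schema, and the real implementation also handles details such as self-citations that are omitted here.

```python
def tori(papers):
    """Total research impact (tori) sketch, after Pepe & Kurtz (2012).

    Each citation contributes 1 / (refs in the citing paper
    x co-authors of the cited paper): the citing paper's attention is
    split across its reference list, and the cited paper's credit is
    split among its co-authors.

    papers: list of dicts such as
      {"n_authors": 2, "citing_ref_counts": [10, 40]}
    where citing_ref_counts holds the reference-list length of each
    paper citing this one (hypothetical field names).
    """
    total = 0.0
    for p in papers:
        for n_refs in p["citing_ref_counts"]:
            total += 1.0 / (n_refs * p["n_authors"])
    return total

def riq(papers, career_years):
    """Research impact quotient: sqrt(tori) normalized by career length."""
    return tori(papers) ** 0.5 / career_years
```

Because every citation is divided by the citing paper's reference-list length, fields with long reference lists (and long author lists) do not automatically inflate the score, which is the cross-disciplinary robustness the abstract claims.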

6

Reference management software programs enable researchers to more easily organize and manage large volumes of references typically identified during the production of systematic reviews. The purpose of this study was to determine the extent to which authors are using reference management software to produce systematic reviews; identify which programs are used most frequently and rate their ease of use; and assess the degree to which software usage is documented in published studies.

Concepts: Management, Computer program, Bibliography, Reference, Computer software, Citation, Technical communication, Usability

5

Measuring the usage of informatics resources such as software tools and databases is essential to quantifying their impact, value and return on investment. We have developed a publicly available dataset of informatics resource publications and their citation network, along with an associated metric (u-Index) to measure informatics resources' impact over time. Our dataset differentiates the context in which citations occur to distinguish between ‘awareness’ and ‘usage’, and uses a citing universe of open access publications to derive citation counts for quantifying impact. Resources with a high ratio of usage citations to awareness citations are likely to be widely used by others and have a high u-Index score. We have pre-calculated the u-Index for nearly 100,000 informatics resources. We demonstrate how the u-Index can be used to track informatics resource impact over time. The method of calculating the u-Index metric, the pre-computed u-Index values, and the dataset we compiled to calculate the u-Index are publicly available.

Concepts: Academic publishing, Investment, Bibliography, Citation, Rate of return, Style guide, The Chicago Manual of Style, ASA style
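The abstract above distinguishes 'usage' citations from 'awareness' citations but does not give the u-Index formula, so the sketch below only illustrates that underlying distinction: it classifies each citation by its context sentence using a hypothetical keyword list of my choosing and reports the usage-to-awareness ratio. It is not the published u-Index computation.

```python
def usage_awareness_split(citation_contexts,
                          usage_keywords=("we used", "was used",
                                          "performed with", "applied")):
    """Illustrative split of citations into 'usage' vs 'awareness'.

    citation_contexts: sentences in which a resource is cited.
    A citation counts as 'usage' if its context contains one of the
    (hypothetical) usage keywords; everything else is 'awareness'.
    Returns (usage_count, awareness_count, usage:awareness ratio).
    """
    usage = sum(1 for c in citation_contexts
                if any(k in c.lower() for k in usage_keywords))
    awareness = len(citation_contexts) - usage
    return usage, awareness, usage / max(1, awareness)
```

As the abstract notes, a resource with many usage citations relative to awareness citations is likely actively relied upon, so a high ratio here corresponds to a high u-Index score.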

5

Citation bias concerns the selective citation of scientific articles based on their results. We brought together all available evidence on citation bias across scientific disciplines and quantified its impact.

Concepts: Evidence-based medicine, Systematic review, Science, Meta-analysis, Bibliography, Citation

3

This article highlights the issue of wasteful publishing practices that primarily affect non-mainstream science countries and rapidly growing academic disciplines. Numerous start-up open access publishers with soft or nonexistent quality checks and huge commercial interests have created a global crisis in the publishing market. Their publishing practices have been thoroughly examined, leading to the blacklisting of many journals by Jeffrey Beall. However, it appears that some subscription journals are also falling short of adhering to the international recommendations of global editorial associations. Unethical editing agencies that promote their services in non-mainstream science countries create further problems for inexperienced authors. We suggest regularly monitoring the quality of already indexed journals and upgrading the criteria for covering new sources in the Emerging Sources Citation Index (Web of Science), Scopus, and specialist bibliographic databases. Regional awareness campaigns informing stakeholders in science communication about the importance of ethical writing, transparency of editing services, and permanent archiving can also help eradicate unethical publishing practices.

Concepts: Academic publishing, Ethics, Writing, Bibliographic databases, Publishing, Editing, Bibliography, Citation index