Background: Concern about the use of epinephrine as a treatment for out-of-hospital cardiac arrest led the International Liaison Committee on Resuscitation to call for a placebo-controlled trial to determine whether the use of epinephrine is safe and effective in such patients.

Methods: In a randomized, double-blind trial involving 8014 patients with out-of-hospital cardiac arrest in the United Kingdom, paramedics at five National Health Service ambulance services administered either parenteral epinephrine (4015 patients) or saline placebo (3999 patients), along with standard care. The primary outcome was the rate of survival at 30 days. Secondary outcomes included the rate of survival until hospital discharge with a favorable neurologic outcome, as indicated by a score of 3 or less on the modified Rankin scale (which ranges from 0 [no symptoms] to 6 [death]).

Results: At 30 days, 130 patients (3.2%) in the epinephrine group and 94 (2.4%) in the placebo group were alive (unadjusted odds ratio for survival, 1.39; 95% confidence interval [CI], 1.06 to 1.82; P=0.02). There was no evidence of a significant difference in the proportion of patients who survived until hospital discharge with a favorable neurologic outcome (87 of 4007 patients [2.2%] vs. 74 of 3994 patients [1.9%]; unadjusted odds ratio, 1.18; 95% CI, 0.86 to 1.61). At the time of hospital discharge, severe neurologic impairment (a score of 4 or 5 on the modified Rankin scale) had occurred in more of the survivors in the epinephrine group than in the placebo group (39 of 126 patients [31.0%] vs. 16 of 90 patients [17.8%]).

Conclusions: In adults with out-of-hospital cardiac arrest, the use of epinephrine resulted in a significantly higher rate of 30-day survival than the use of placebo, but there was no significant between-group difference in the rate of a favorable neurologic outcome because more survivors had severe neurologic impairment in the epinephrine group. (Funded by the U.K. National Institute for Health Research and others; Current Controlled Trials number, ISRCTN73485024.)
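The unadjusted odds ratio for 30-day survival can be reproduced from the counts given in the abstract. A minimal sketch, assuming the usual Wald interval on the log-odds scale (the trial's exact interval method is not stated here):

```python
import math

def odds_ratio_ci(a, n1, c, n2, z=1.96):
    """Unadjusted odds ratio of an event in group 1 vs. group 2,
    with a Wald confidence interval computed on the log-odds scale."""
    b, d = n1 - a, n2 - c  # non-events in each group
    or_ = (a / b) / (c / d)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # SE of log(OR)
    lo, hi = (math.exp(math.log(or_) + s * z * se) for s in (-1, 1))
    return or_, lo, hi

# 30-day survival: 130/4015 (epinephrine) vs. 94/3999 (placebo)
or_, lo, hi = odds_ratio_ci(130, 4015, 94, 3999)
print(round(or_, 2), round(lo, 2), round(hi, 2))  # 1.39 1.06 1.82
```

The result matches the abstract's reported 1.39 (95% CI, 1.06 to 1.82), confirming the headline figure is the simple unadjusted odds ratio.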
- Proceedings of the National Academy of Sciences of the United States of America
The origins of bread have long been associated with the emergence of agriculture and cereal domestication during the Neolithic in southwest Asia. In this study we analyze a total of 24 charred food remains from Shubayqa 1, a Natufian hunter-gatherer site located in northeastern Jordan and dated to 14.6-11.6 ka cal BP. Our finds provide empirical data to demonstrate that the preparation and consumption of bread-like products predated the emergence of agriculture by at least 4,000 years. The interdisciplinary analyses indicate the use of some of the “founder crops” of southwest Asian agriculture (e.g., Triticum boeoticum, wild einkorn) and root foods (e.g., Bolboschoenus glaucus, club-rush tubers) to produce flat bread-like products. The available archaeobotanical evidence for the Natufian period indicates that cereal exploitation was not common during this time, and it is most likely that cereal-based meals like bread became staples only once agriculture was firmly established.
Background: The recurrence score based on the 21-gene breast cancer assay predicts chemotherapy benefit if it is high and a low risk of recurrence in the absence of chemotherapy if it is low; however, there is uncertainty about the benefit of chemotherapy for most patients, who have a midrange score.

Methods: We performed a prospective trial involving 10,273 women with hormone-receptor-positive, human epidermal growth factor receptor 2 (HER2)-negative, axillary node-negative breast cancer. Of the 9719 eligible patients with follow-up information, 6711 (69%) had a midrange recurrence score of 11 to 25 and were randomly assigned to receive either chemoendocrine therapy or endocrine therapy alone. The trial was designed to show noninferiority of endocrine therapy alone for invasive disease-free survival (defined as freedom from invasive disease recurrence, second primary cancer, or death).

Results: Endocrine therapy was noninferior to chemoendocrine therapy in the analysis of invasive disease-free survival (hazard ratio for invasive disease recurrence, second primary cancer, or death [endocrine vs. chemoendocrine therapy], 1.08; 95% confidence interval, 0.94 to 1.24; P=0.26). At 9 years, the two treatment groups had similar rates of invasive disease-free survival (83.3% in the endocrine-therapy group and 84.3% in the chemoendocrine-therapy group), freedom from disease recurrence at a distant site (94.5% and 95.0%) or at a distant or local-regional site (92.2% and 92.9%), and overall survival (93.9% and 93.8%). The chemotherapy benefit for invasive disease-free survival varied with the combination of recurrence score and age (P=0.004), with some benefit of chemotherapy found in women 50 years of age or younger with a recurrence score of 16 to 25.

Conclusions: Adjuvant endocrine therapy and chemoendocrine therapy had similar efficacy in women with hormone-receptor-positive, HER2-negative, axillary node-negative breast cancer who had a midrange 21-gene recurrence score, although some benefit of chemotherapy was found in some women 50 years of age or younger. (Funded by the National Cancer Institute and others; TAILORx ClinicalTrials.gov number, NCT00310180.)
Over a decade ago, the Atacama humanoid skeleton (Ata) was discovered in the Atacama region of Chile. The Ata specimen carried a strange phenotype: 6-in stature, fewer than expected ribs, an elongated cranium, and accelerated bone age, leading to speculation that this was a preserved nonhuman primate, a human fetus harboring genetic mutations, or even an extraterrestrial. We previously reported that it was human by DNA analysis with an estimated bone age of about 6-8 yr at the time of demise. To determine the possible genetic drivers of the observed morphology, DNA from the specimen was subjected to whole-genome sequencing using the Illumina HiSeq platform with an average 11.5× coverage of 101-bp, paired-end reads. In total, 3,356,569 single-nucleotide variants (SNVs), 518,365 insertions and deletions (indels), and 1047 structural variations (SVs) were detected relative to the human reference genome. Here, we present the detailed whole-genome analysis showing that Ata is a female of human origin, likely of Chilean descent, and that its genome harbors mutations in genes (COL1A1, COL2A1, KMT2D, FLNB, ATR, TRIP11, PCNT) previously linked with diseases of small stature, rib anomalies, cranial malformations, premature joint fusion, and osteochondrodysplasia (also known as skeletal dysplasia). Together, these findings provide a molecular characterization of Ata’s peculiar phenotype, which likely results from multiple known and novel putative gene mutations affecting bone development and ossification.
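The sequencing depth quoted above implies a rough total read count via the standard coverage relation (coverage = reads × read length / genome length). A minimal sketch, assuming a ~3.1 Gb human genome (the genome size is an assumption; it is not stated in the abstract):

```python
def reads_for_coverage(coverage, genome_bp, read_bp):
    """Approximate number of reads needed for a target mean depth,
    using coverage = reads * read_length / genome_length."""
    return coverage * genome_bp / read_bp

# 11.5x mean coverage of a ~3.1 Gb genome with 101-bp reads
n_reads = reads_for_coverage(11.5, 3.1e9, 101)
print(f"~{n_reads / 1e6:.0f} million reads")  # ~353 million
```

This is only a back-of-the-envelope check; actual run yields also depend on duplicate, unmapped, and filtered reads.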
Exomoons are the natural satellites of planets orbiting stars outside our solar system, of which there are currently no confirmed examples. We present new observations of a candidate exomoon associated with Kepler-1625b using the Hubble Space Telescope to validate or refute the moon’s presence. We find evidence in favor of the moon hypothesis, based on timing deviations and a flux decrement from the star consistent with a large transiting exomoon. Self-consistent photodynamical modeling suggests that the planet is likely several Jupiter masses, while the exomoon has a mass and radius similar to Neptune. Since our inference is dominated by a single but highly precise Hubble epoch, we advocate for future monitoring of the system to check model predictions and confirm repetition of the moon-like signal.
Despite concerted international effort to track and interpret shifts in the abundance and distribution of Adélie penguins, large populations continue to be identified. Here we report on a major hotspot of Adélie penguin abundance identified in the Danger Islands off the northern tip of the Antarctic Peninsula (AP). We present the first complete census of Pygoscelis spp. penguins in the Danger Islands, estimated from a multi-modal survey consisting of direct ground counts and computer-automated counts of unmanned aerial vehicle (UAV) imagery. Our survey reveals that the Danger Islands host 751,527 pairs of Adélie penguins, more than the rest of the AP region combined, and include the third and fourth largest Adélie penguin colonies in the world. Our results validate the use of Landsat medium-resolution satellite imagery for the detection of new or unknown penguin colonies and highlight the utility of combining satellite imagery with ground and UAV surveys. The Danger Islands appear to have avoided recent declines documented on the Western AP and, because they are large and likely to remain an important hotspot for avian abundance under projected climate change, deserve special consideration in the negotiation and design of Marine Protected Areas in the region.
Dogs may be beneficial in reducing cardiovascular risk in their owners by providing social support and motivation for physical activity. We aimed to investigate the association of dog ownership with incident cardiovascular disease (CVD) and death in a register-based prospective nationwide cohort (n = 3,432,153) with up to 12 years of follow-up. Self-reported health and lifestyle habits were available for 34,202 participants in the Swedish Twin Register. Time-to-event analyses with time-updated covariates were used to calculate hazard ratios (HR) with 95% confidence intervals (CI). In single- and multiple-person households, dog ownership (13.1%) was associated with lower risk of death, HR 0.67 (95% CI, 0.65-0.69) and 0.89 (0.87-0.91), respectively; and CVD death, HR 0.64 (0.59-0.70) and 0.85 (0.81-0.90), respectively. In single-person households, dog ownership was inversely associated with cardiovascular outcomes (HR composite CVD 0.92, 95% CI, 0.89-0.94). Ownership of hunting-breed dogs was associated with the lowest risk of CVD. Further analysis in the Twin Register could not replicate the reduced risk of CVD or death but also gave no indication of confounding by disability, comorbidities or lifestyle factors. In conclusion, dog ownership appears to be associated with lower risk of CVD in single-person households and lower mortality in the general population.
- Proceedings of the National Academy of Sciences of the United States of America
Glyphosate, the primary herbicide used globally for weed control, targets the 5-enolpyruvylshikimate-3-phosphate synthase (EPSPS) enzyme in the shikimate pathway found in plants and some microorganisms. Thus, glyphosate may affect bacterial symbionts of animals living near agricultural sites, including pollinators such as bees. The honey bee gut microbiota is dominated by eight bacterial species that promote weight gain and reduce pathogen susceptibility. The gene encoding EPSPS is present in almost all sequenced genomes of bee gut bacteria, indicating that they are potentially susceptible to glyphosate. We demonstrated that the relative and absolute abundances of dominant gut microbiota species are decreased in bees exposed to glyphosate at concentrations documented in the environment. Glyphosate exposure of young workers increased mortality of bees subsequently exposed to the opportunistic pathogen Serratia marcescens. Members of the bee gut microbiota varied in susceptibility to glyphosate, largely corresponding to whether they possessed an EPSPS of class I (sensitive to glyphosate) or class II (insensitive to glyphosate). This basis for differences in sensitivity was confirmed using in vitro experiments in which the EPSPS gene from bee gut bacteria was cloned into Escherichia coli. All strains of the core bee gut species, Snodgrassella alvi, encode a sensitive class I EPSPS, and reduction in S. alvi levels was a consistent experimental result. However, some S. alvi strains appear to possess an alternative mechanism of glyphosate resistance. Thus, exposure of bees to glyphosate can perturb their beneficial gut microbiota, potentially affecting bee health and their effectiveness as pollinators.
In 1965, the Sugar Research Foundation (SRF) secretly funded a review in the New England Journal of Medicine that discounted evidence linking sucrose consumption to blood lipid levels and hence coronary heart disease (CHD). SRF subsequently funded animal research to evaluate sucrose’s CHD risks. The objective of this study was to examine the planning, funding, and internal evaluation of an SRF-funded research project titled “Project 259: Dietary Carbohydrate and Blood Lipids in Germ-Free Rats,” led by Dr. W.F.R. Pover at the University of Birmingham, Birmingham, United Kingdom, between 1967 and 1971. A narrative case study method was used to assess SRF Project 259 from 1967 to 1971 based on sugar industry internal documents. Project 259 found a statistically significant decrease in serum triglycerides in germ-free rats fed a high-sugar diet compared to conventional rats fed a basic PRM diet (a pelleted diet containing cereal meals, soybean meals, whitefish meal, and dried yeast, fortified with a balanced vitamin supplement and trace element mixture). The results suggested to SRF that gut microbiota have a causal role in carbohydrate-induced hypertriglyceridemia. A study comparing conventional rats fed a high-sugar diet to those fed a high-starch diet suggested that sucrose consumption might be associated with elevated levels of beta-glucuronidase, an enzyme previously associated with bladder cancer in humans. SRF terminated Project 259 without publishing the results. The sugar industry did not disclose evidence of harm from animal studies that would have (1) strengthened the case that the CHD risk of sucrose is greater than starch and (2) caused sucrose to be scrutinized as a potential carcinogen. The influence of the gut microbiota in the differential effects of sucrose and starch on blood lipids, as well as the influence of carbohydrate quality on beta-glucuronidase and cancer activity, deserve further scrutiny.
Background: Aspirin is a well-established therapy for the secondary prevention of cardiovascular events. However, its role in the primary prevention of cardiovascular disease is unclear, especially in older persons, who have an increased risk.

Methods: From 2010 through 2014, we enrolled community-dwelling men and women in Australia and the United States who were 70 years of age or older (or ≥65 years of age among blacks and Hispanics in the United States) and did not have cardiovascular disease, dementia, or disability. Participants were randomly assigned to receive 100 mg of enteric-coated aspirin or placebo. The primary end point was a composite of death, dementia, or persistent physical disability; results for this end point are reported in another article in the Journal. Secondary end points included major hemorrhage and cardiovascular disease (defined as fatal coronary heart disease, nonfatal myocardial infarction, fatal or nonfatal stroke, or hospitalization for heart failure).

Results: Of the 19,114 persons who were enrolled in the trial, 9525 were assigned to receive aspirin and 9589 to receive placebo. After a median of 4.7 years of follow-up, the rate of cardiovascular disease was 10.7 events per 1000 person-years in the aspirin group and 11.3 events per 1000 person-years in the placebo group (hazard ratio, 0.95; 95% confidence interval [CI], 0.83 to 1.08). The rate of major hemorrhage was 8.6 events per 1000 person-years and 6.2 events per 1000 person-years, respectively (hazard ratio, 1.38; 95% CI, 1.18 to 1.62; P<0.001).

Conclusions: The use of low-dose aspirin as a primary prevention strategy in older adults resulted in a significantly higher risk of major hemorrhage and did not result in a significantly lower risk of cardiovascular disease than placebo. (Funded by the National Institute on Aging and others; ASPREE ClinicalTrials.gov number, NCT01038583.)
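As a rough arithmetic check, the per-1000-person-year event rates quoted in the abstract can be turned into unadjusted rate ratios, which sit close to the reported Cox hazard ratios. This is only an illustrative approximation, not the trial's adjusted time-to-event analysis:

```python
def rate_ratio(rate_treated, rate_control):
    """Unadjusted incidence-rate ratio from event rates per 1000 person-years."""
    return rate_treated / rate_control

# Cardiovascular disease: 10.7 vs. 11.3 events per 1000 person-years
rr_cvd = rate_ratio(10.7, 11.3)
# Major hemorrhage: 8.6 vs. 6.2 events per 1000 person-years
rr_hem = rate_ratio(8.6, 6.2)
# Reported hazard ratios were 0.95 and 1.38; the small hemorrhage
# discrepancy reflects rounding of the rates and the Cox model.
print(round(rr_cvd, 2), round(rr_hem, 2))  # 0.95 1.39
```

That the crude ratios nearly reproduce the hazard ratios suggests the published figures are driven by the raw event rates rather than by heavy covariate adjustment.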