- Proceedings of the National Academy of Sciences of the United States of America
- Published 6 months ago
Around the world, increases in wealth have produced an unintended consequence: a rising sense of time scarcity. We provide evidence that using money to buy time can provide a buffer against this time famine, thereby promoting happiness. Using large, diverse samples from the United States, Canada, Denmark, and The Netherlands (n = 6,271), we show that individuals who spend money on time-saving services report greater life satisfaction. A field experiment provides causal evidence that working adults report greater happiness after spending money on a time-saving purchase than on a material purchase. Together, these results suggest that using money to buy time can protect people from the detrimental effects of time pressure on life satisfaction.
Sea surface temperature (SST) records are subject to potential biases due to changing instrumentation and measurement practices. Significant differences exist between commonly used composite SST reconstructions from the National Oceanic and Atmospheric Administration’s Extended Reconstruction Sea Surface Temperature (ERSST), the Hadley Centre SST data set (HadSST3), and the Japanese Meteorological Agency’s Centennial Observation-Based Estimates of SSTs (COBE-SST) from 2003 to the present. The update from ERSST version 3b to version 4 resulted in an increase in the operational SST trend estimate during the last 19 years from 0.07° to 0.12°C per decade, indicating a higher rate of warming in recent years. We show that ERSST version 4 trends generally agree with largely independent, near-global, and instrumentally homogeneous SST measurements from floating buoys, Argo floats, and radiometer-based satellite measurements that have been developed and deployed during the past two decades. We find a large cooling bias in ERSST version 3b and smaller but significant cooling biases in HadSST3 and COBE-SST from 2003 to the present, with respect to most series examined. These results suggest that reported rates of SST warming in recent years have been underestimated in these three data sets.
There is consistent evidence supporting the ergogenic effects of caffeine for endurance-based exercise. However, whether caffeine ingested through coffee has the same effects is still subject to debate. The primary aim of the study was to investigate the performance-enhancing effects of caffeine and coffee using a time trial performance test, while also investigating the metabolic effects of caffeine and coffee. In a single-blind, crossover, randomised counter-balanced study design, eight trained male cyclists/triathletes (Mean±SD: Age 41±7 y, Height 1.80±0.04 m, Weight 78.9±4.1 kg, VO2max 58±3 mL·kg⁻¹·min⁻¹) completed 30 min of steady-state (SS) cycling at approximately 55% VO2max followed by a 45 min energy-based target time trial (TT). One hour prior to exercise each athlete consumed drinks consisting of caffeine (5 mg CAF/kg BW), instant coffee (5 mg CAF/kg BW), instant decaffeinated coffee or placebo. The set workloads produced similar relative exercise intensities during the SS for all drinks, with no observed difference in carbohydrate or fat oxidation. Performance times during the TT were significantly faster (∼5.0%) for both caffeine and coffee when compared to placebo and decaf (38.35±1.53, 38.27±1.80, 40.23±1.98, 40.31±1.22 min respectively, p<0.05). The significantly faster performance times were similar for both caffeine and coffee. Average power for caffeine and coffee during the TT was significantly greater when compared to placebo and decaf (294±21 W, 291±22 W, 277±14 W, 276±23 W respectively, p<0.05). No significant differences were observed between placebo and decaf during the TT. The present study illustrates that both caffeine (5 mg/kg BW) and coffee (5 mg/kg BW) consumed 1 h prior to exercise can improve endurance exercise performance.
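The dosing and the ~5% effect size are easy to sanity-check from the means reported in the abstract. A quick arithmetic sketch (the function names are mine, not the study's; the input values are the abstract's reported means):

```python
def caffeine_dose_mg(body_mass_kg: float, dose_mg_per_kg: float = 5.0) -> float:
    """Absolute caffeine dose (mg) for a relative dose in mg per kg body weight."""
    return body_mass_kg * dose_mg_per_kg

def percent_improvement(baseline_min: float, treatment_min: float) -> float:
    """Percentage reduction in time-trial duration relative to baseline."""
    return 100.0 * (baseline_min - treatment_min) / baseline_min

# Mean body mass of the eight athletes was 78.9 kg, so the 5 mg/kg
# protocol corresponds to roughly 395 mg of caffeine per drink.
dose = caffeine_dose_mg(78.9)             # 394.5 mg

# Mean TT times: caffeine 38.35 min vs. placebo 40.23 min,
# i.e. an improvement of about 4.7%, in line with the ~5% reported.
gain = percent_improvement(40.23, 38.35)
```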
Precise modelling of the influence of climate change on Arabica coffee is limited; there are no data available for indigenous populations of this species. In this study we model the present and future predicted distribution of indigenous Arabica, and identify priorities in order to facilitate appropriate decision making for conservation, monitoring and future research. Using distribution data we perform bioclimatic modelling and examine future distribution with the HadCM3 climate model for three emission scenarios (A1B, A2A, B2A) over three time intervals (2020, 2050, 2080). The models show a profoundly negative influence on indigenous Arabica. In a locality analysis the most favourable outcome is a c. 65% reduction in the number of pre-existing bioclimatically suitable localities, and at worst an almost 100% reduction, by 2080. In an area analysis the most favourable outcome is a 38% reduction in suitable bioclimatic space, and the least favourable a c. 90% reduction, by 2080. Based on known occurrences and ecological tolerances of Arabica, bioclimatic unsuitability would place populations in peril, leading to severe stress and a high risk of extinction. This study establishes a fundamental baseline for assessing the consequences of climate change on wild populations of Arabica coffee. Specifically, it: (1) identifies and categorizes localities and areas that are predicted to be under threat from climate change now and in the short- to medium-term (2020-2050), representing assessment priorities for ex situ conservation; (2) identifies ‘core localities’ that could have the potential to withstand climate change until at least 2080, and therefore serve as long-term in situ storehouses for coffee genetic resources; (3) provides the location and characterization of target locations (populations) for on-the-ground monitoring of climate change influence. 
Arabica coffee is confirmed as a climate-sensitive species, supporting data and inference that existing plantations will be negatively impacted by climate change.
Feelings of loneliness are common among young adults, and are hypothesized to impair the quality of sleep. In the present study, we tested associations between loneliness and sleep quality in a nationally representative sample of young adults. Further, based on the hypothesis that sleep problems in lonely individuals are driven by increased vigilance for threat, we tested whether past exposure to violence exacerbated this association.
Objective To determine the frequency of prescriptions for short term use of oral corticosteroids, and adverse events (sepsis, venous thromboembolism, fractures) associated with their use. Design Retrospective cohort study and self-controlled case series. Setting Nationwide dataset of private insurance claims. Participants Adults aged 18 to 64 years who were continuously enrolled from 2012 to 2014. Main outcome measures Rates of short term use of oral corticosteroids defined as less than 30 days duration. Incidence rates of adverse events in corticosteroid users and non-users. Incidence rate ratios for adverse events within 30 day and 31-90 day risk periods after drug initiation. Results Of 1 548 945 adults, 327 452 (21.1%) received at least one outpatient prescription for short term use of oral corticosteroids over the three year period. Use was more frequent among older patients, women, and white adults, with significant regional variation (all P<0.001). The most common indications for use were upper respiratory tract infections, spinal conditions, and allergies. Prescriptions were provided by a diverse range of specialties. Within 30 days of drug initiation, there was an increase in rates of sepsis (incidence rate ratio 5.30, 95% confidence interval 3.80 to 7.41), venous thromboembolism (3.33, 2.78 to 3.99), and fracture (1.87, 1.69 to 2.07), which diminished over the subsequent 31-90 days. The increased risk persisted at prednisone equivalent doses of less than 20 mg/day (incidence rate ratio 4.02 for sepsis, 3.61 for venous thromboembolism, and 1.83 for fracture; all P<0.001). Conclusion One in five American adults in a commercially insured plan were given prescriptions for short term use of oral corticosteroids during a three year period, with an associated increased risk of adverse events.
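An incidence rate ratio of the kind reported here compares events per unit of person-time between two groups. A minimal sketch of the standard two-group calculation with a log-scale Wald 95% confidence interval (the counts below are invented for illustration and are not from the study; the paper's self-controlled case series estimates within-person rates somewhat differently):

```python
import math

def incidence_rate_ratio(events_exposed: int, persontime_exposed: float,
                         events_unexposed: int, persontime_unexposed: float):
    """Return (IRR, lower, upper) with a standard log-scale Wald 95% CI."""
    irr = (events_exposed / persontime_exposed) / (events_unexposed / persontime_unexposed)
    # Standard error of log(IRR) for Poisson counts: sqrt(1/a + 1/b).
    se_log = math.sqrt(1 / events_exposed + 1 / events_unexposed)
    lower = math.exp(math.log(irr) - 1.96 * se_log)
    upper = math.exp(math.log(irr) + 1.96 * se_log)
    return irr, lower, upper

# Hypothetical counts: 50 events over 1000 person-years among users
# vs. 100 events over 10000 person-years among non-users -> IRR = 5.0.
irr, lo, hi = incidence_rate_ratio(50, 1000, 100, 10000)
```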
Standard theories of decision-making involving delayed outcomes predict that people should defer a punishment, whilst advancing a reward. In some cases, such as pain, people seem to prefer to expedite punishment, implying that its anticipation carries a cost, often conceptualized as ‘dread’. Despite empirical support for the existence of dread, whether and how it depends on prospective delay is unknown. Furthermore, it is unclear whether dread represents a stable component of value, or is modulated by biases such as framing effects. Here, we examine choices made between different numbers of painful shocks to be delivered faithfully at different time points up to 15 minutes in the future, as well as choices between hypothetical painful dental appointments at time points of up to approximately eight months in the future, to test alternative models for how future pain is disvalued. We show that future pain initially becomes increasingly aversive with increasing delay, but does so at a decreasing rate. This is consistent with a value model in which moment-by-moment dread increases up to the time of expected pain, such that dread becomes equivalent to the discounted expectation of pain. For a minority of individuals pain has maximum negative value at intermediate delay, suggesting that the dread function may itself be prospectively discounted in time. Framing an outcome as relief reduces the overall preference to expedite pain, which can be parameterized by reducing the rate of the dread-discounting function. Our data support an account of disvaluation for primary punishments such as pain, which differs fundamentally from existing models applied to financial punishments, in which dread exerts a powerful but time-dependent influence over choice.
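One way to see how dread can make pain grow more aversive with delay, but at a decreasing rate: treat each moment of waiting as contributing dread equal to the exponentially discounted expectation of the pain, and add that accumulated dread to the discounted pain itself. This functional form and parameter are my own illustration of the qualitative pattern described in the abstract, not the paper's fitted model:

```python
import math

def disvalue_of_delayed_pain(pain: float, delay: float, k: float = 0.5) -> float:
    """Discounted pain plus dread accumulated while waiting.

    Dread at each moment tau before the pain is taken to be the
    exponentially discounted expectation of the pain, integrated
    over the waiting period:
        V(t) = p*exp(-k*t) + integral_0^t p*exp(-k*(t - tau)) dtau
             = p*exp(-k*t) + p*(1 - exp(-k*t)) / k
    """
    discounted_pain = pain * math.exp(-k * delay)
    accumulated_dread = pain * (1 - math.exp(-k * delay)) / k
    return discounted_pain + accumulated_dread

# For k < 1, disvalue rises with delay but at a decreasing rate,
# matching the pattern the abstract reports.
values = [disvalue_of_delayed_pain(1.0, t) for t in range(0, 6)]
```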
Background Most patients with locally advanced, unresectable, non-small-cell lung cancer (NSCLC) have disease progression despite definitive chemoradiotherapy (chemotherapy plus concurrent radiation therapy). This phase 3 study compared the anti-programmed death ligand 1 antibody durvalumab as consolidation therapy with placebo in patients with stage III NSCLC who did not have disease progression after two or more cycles of platinum-based chemoradiotherapy. Methods We randomly assigned patients, in a 2:1 ratio, to receive durvalumab (at a dose of 10 mg per kilogram of body weight intravenously) or placebo every 2 weeks for up to 12 months. The study drug was administered 1 to 42 days after the patients had received chemoradiotherapy. The coprimary end points were progression-free survival (as assessed by means of blinded independent central review) and overall survival (unplanned for the interim analysis). Secondary end points included 12-month and 18-month progression-free survival rates, the objective response rate, the duration of response, the time to death or distant metastasis, and safety. Results Of 713 patients who underwent randomization, 709 received consolidation therapy (473 received durvalumab and 236 received placebo). The median progression-free survival from randomization was 16.8 months (95% confidence interval [CI], 13.0 to 18.1) with durvalumab versus 5.6 months (95% CI, 4.6 to 7.8) with placebo (stratified hazard ratio for disease progression or death, 0.52; 95% CI, 0.42 to 0.65; P<0.001); the 12-month progression-free survival rate was 55.9% versus 35.3%, and the 18-month progression-free survival rate was 44.2% versus 27.0%. The response rate was higher with durvalumab than with placebo (28.4% vs. 16.0%; P<0.001), and the median duration of response was longer (72.8% vs. 46.8% of the patients had an ongoing response at 18 months). The median time to death or distant metastasis was longer with durvalumab than with placebo (23.2 months vs. 14.6 months; P<0.001). Grade 3 or 4 adverse events occurred in 29.9% of the patients who received durvalumab and 26.1% of those who received placebo; the most common adverse event of grade 3 or 4 was pneumonia (4.4% and 3.8%, respectively). A total of 15.4% of patients in the durvalumab group and 9.8% of those in the placebo group discontinued the study drug because of adverse events. Conclusions Progression-free survival was significantly longer with durvalumab than with placebo. The secondary end points also favored durvalumab, and safety was similar between the groups. (Funded by AstraZeneca; PACIFIC ClinicalTrials.gov number, NCT02125461.)
- Journal of the International Society of Sports Nutrition
- Published over 5 years ago
Creatine is one of the most popular and widely researched natural supplements. The majority of studies have focused on the effects of creatine monohydrate on performance and health; however, many other forms of creatine exist and are commercially available in the sports nutrition/supplement market. Regardless of the form, supplementation with creatine has regularly been shown to increase strength, fat-free mass, and muscle morphology with concurrent heavy resistance training more than resistance training alone. Creatine may be of benefit in other modes of exercise such as high-intensity sprints or endurance training. However, it appears that the effects of creatine diminish as the length of time spent exercising increases. Even though not all individuals respond similarly to creatine supplementation, it is generally accepted that its supplementation increases creatine storage and promotes a faster regeneration of adenosine triphosphate between high intensity exercises. These improved outcomes will increase performance and promote greater training adaptations. More recent research suggests that creatine supplementation in amounts of 0.1 g/kg of body weight combined with resistance training improves training adaptations at a cellular and sub-cellular level. Finally, although presently ingesting creatine as an oral supplement is considered safe and ethical, the perception of safety cannot be guaranteed, especially when administered for long periods of time to different populations (athletes, sedentary, patient, active, young or elderly).
So far, conservation scientists have paid little attention to synthetic biology; this is unfortunate as the technology is likely to transform the operating space within which conservation functions, and therefore the prospects for maintaining biodiversity into the future.