Concept: South Dakota
To promote optimal health and well-being, adults aged 18-60 years are recommended to sleep at least 7 hours each night (1). Sleeping <7 hours per night is associated with increased risk for obesity, diabetes, high blood pressure, coronary heart disease, stroke, frequent mental distress, and all-cause mortality (2-4). Insufficient sleep impairs cognitive performance, which can increase the likelihood of motor vehicle and other transportation accidents, industrial accidents, medical errors, and loss of work productivity that could affect the wider community (5). CDC analyzed data from the 2014 Behavioral Risk Factor Surveillance System (BRFSS) to determine the prevalence of a healthy sleep duration (≥7 hours) among 444,306 adult respondents in all 50 states and the District of Columbia. A total of 65.2% of respondents reported a healthy sleep duration; the age-adjusted prevalence of healthy sleep was lower among non-Hispanic blacks, American Indians/Alaska Natives, Native Hawaiians/Pacific Islanders, and multiracial respondents, compared with non-Hispanic whites, Hispanics, and Asians. State-based estimates of healthy sleep duration prevalence ranged from 56.1% in Hawaii to 71.6% in South Dakota. Geographic clustering of the lowest prevalence of healthy sleep duration was observed in the southeastern United States and in states along the Appalachian Mountains, and the highest prevalence was observed in the Great Plains states. More than one third of U.S. respondents reported typically sleeping <7 hours in a 24-hour period, suggesting an ongoing need for public awareness and public education about sleep health; worksite shift policies that ensure healthy sleep duration for shift workers, particularly medical professionals, emergency response personnel, and transportation industry personnel; and opportunities for health care providers to discuss the importance of healthy sleep duration with patients and address reasons for poor sleep health.
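The age-adjusted prevalences reported above come from direct standardization: each subgroup's age-specific prevalences are weighted by a standard population's age distribution so groups with different age structures can be compared. A minimal sketch of that calculation; the age bands, prevalence values, and weights below are illustrative stand-ins, not BRFSS estimates or the actual standard-population weights:

```python
# Direct age standardization: weight each age group's crude prevalence
# by the standard population's share of that age group.
# All prevalences and weights here are hypothetical, not BRFSS data.

def age_adjusted_prevalence(crude_by_age, std_weights):
    """Weighted average of age-specific prevalences (weights sum to 1)."""
    assert abs(sum(std_weights.values()) - 1.0) < 1e-9
    return sum(crude_by_age[g] * w for g, w in std_weights.items())

# Hypothetical age-specific prevalence of >=7 h sleep in one subgroup
crude = {"18-44": 0.62, "45-64": 0.66, "65+": 0.73}

# Hypothetical standard-population weights
weights = {"18-44": 0.50, "45-64": 0.33, "65+": 0.17}

print(round(age_adjusted_prevalence(crude, weights), 3))  # -> 0.652
```

The adjusted figure differs from the crude prevalence whenever a subgroup's age structure departs from the standard population's, which is exactly why the racial/ethnic comparisons above are reported age-adjusted.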
- Proceedings of the National Academy of Sciences of the United States of America
- Published about 7 years ago
In the US Corn Belt, a recent doubling in commodity prices has created incentives for landowners to convert grassland to corn and soybean cropping. Here, we use land cover data from the National Agricultural Statistics Service Cropland Data Layer to assess grassland conversion from 2006 to 2011 in the Western Corn Belt (WCB): five states including North Dakota, South Dakota, Nebraska, Minnesota, and Iowa. Our analysis identifies areas with elevated rates of grass-to-corn/soy conversion (1.0-5.4% annually). Across the WCB, we found a net decline in grass-dominated land cover totaling nearly 530,000 ha. With respect to agronomic attributes of lands undergoing grassland conversion, corn/soy production is expanding onto marginal lands characterized by high erosion risk and vulnerability to drought. Grassland conversion is also concentrated in close proximity to wetlands, posing a threat to waterfowl breeding in the Prairie Pothole Region. Longer-term land cover trends from North Dakota and Iowa indicate that recent grassland conversion represents a persistent shift in land use rather than short-term variability in crop rotation patterns. Our results show that the WCB is rapidly moving down a pathway of increased corn and soybean cultivation. As a result, the window of opportunity for realizing the benefits of a biofuel industry based on perennial bioenergy crops, rather than corn ethanol and soy biodiesel, may be closing in the WCB.
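Conversion rates like those above are derived by overlaying categorical land cover rasters from two years and counting pixels that change from grassland to corn or soy. A minimal sketch of that change detection, assuming tiny illustrative arrays and made-up class codes rather than real Cropland Data Layer tiles:

```python
import numpy as np

# Hypothetical class codes standing in for CDL land cover categories
GRASS, CORN, SOY = 176, 1, 5

# Toy land cover rasters for the start and end of the study interval
year_start = np.array([[176, 176,   1],
                       [176,   5, 176],
                       [176, 176, 176]])
year_end   = np.array([[  1, 176,   1],
                       [  5,   5, 176],
                       [176,   1, 176]])

# Pixels that were grass at the start and corn/soy at the end
grass_start = year_start == GRASS
converted = grass_start & np.isin(year_end, [CORN, SOY])

# Fraction of initial grassland converted over the interval
rate = converted.sum() / grass_start.sum()
print(f"{rate:.1%} of grassland pixels converted")
```

Dividing such a fraction by the number of years in the interval yields an annualized rate comparable to the 1.0-5.4% figures in the abstract.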
The rapid rise of unconventional oil production during the past decade in the Bakken region of North Dakota raises concerns about water contamination from the accidental release of oil and gas wastewater to the environment. Here, we characterize the major and trace element chemistry and isotopic ratios (⁸⁷Sr/⁸⁶Sr, δ¹⁸O, δ²H) of surface waters (n = 29) in areas impacted by oil and gas wastewater spills in the Bakken region of North Dakota. We establish geochemical and isotopic tracers that can identify Bakken brine spills in the environment. In addition to elevated concentrations of dissolved salts (Na, Cl, Br), spill waters also contained elevated concentrations of other contaminants (Se, V, Pb, NH₄) compared to background waters, and soil and sediment at spill sites had elevated total radium activities (²²⁸Ra + ²²⁶Ra) relative to background, indicating accumulation of Ra in impacted soil and sediment. We observed that inorganic contamination associated with brine spills in North Dakota is remarkably persistent, with elevated levels of contaminants observed at spill sites up to 4 years after the spill events.
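Tracer-based spill identification of this kind commonly rests on two-endmember mixing: given a conservative tracer such as chloride, the fraction of brine in an impacted sample follows from simple mass balance between the background water and the brine endmember. A minimal sketch; the endmember concentrations below are hypothetical, not values measured in this study:

```python
# Two-endmember mixing: estimate the fraction of brine in a sample
# from a conservative tracer (e.g., chloride). Concentrations below
# are hypothetical, not measured Bakken values.

def brine_fraction(sample, background, brine):
    """Mass-balance mixing fraction from a conservative tracer (mg/L)."""
    return (sample - background) / (brine - background)

# Hypothetical chloride concentrations (mg/L)
f = brine_fraction(sample=5_000.0, background=20.0, brine=100_000.0)
print(round(f, 4))  # -> 0.0498
```

Because produced brines are so concentrated relative to fresh surface water, even a mixing fraction of a few percent produces the strongly elevated salt concentrations described above.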
To evaluate the effectiveness of roadway policies for lighting and marking of farm equipment in reducing crashes in Illinois, Iowa, Kansas, Minnesota, Missouri, Nebraska, North Dakota, South Dakota, and Wisconsin.
In the past decade, severe weather and West Nile virus were major causes of chick mortality at American white pelican (Pelecanus erythrorhynchos) colonies in the northern plains of North America. At one of these colonies, Chase Lake National Wildlife Refuge in North Dakota, spring arrival by pelicans has advanced approximately 16 days over a period of 44 years (1965-2008). We examined phenology patterns of pelicans and timing of inclement weather through the 44-year period, and evaluated the consequence of earlier breeding relative to weather-related chick mortality. We found severe weather patterns to be random through time, rather than concurrently shifting with the advanced arrival of pelicans. In recent years, if nest initiations had followed the phenology patterns of 1965 (i.e., nesting initiated 16 days later), fewer chicks likely would have died from weather-related causes. That is, there would be fewer chicks exposed to severe weather during a vulnerable transition period that occurs between the stage when chicks are being brooded by adults and the stage when chicks from multiple nests become part of a thermally protective crèche.
Comparisons of paleofaunas from different facies are often hampered by the uncertainty in the variation of taphonomic processes biasing the paleoecological parameters of interest. By examining the taphonomic patterns exhibited by different facies in the same stratigraphic interval and area, it is possible to quantify this variation, and assess inter-facies comparability. The fossil assemblages preserved in Badlands National Park (BNP), South Dakota, have long been a rich source for mammalian faunas of the White River Group. To investigate the influence of the variation of taphonomic bias with lithology whilst controlling for the influence of changes in patterns of taphonomic modification with time, taphonomic and paleoecological data were collected from four mammal-dominated fossil assemblages (two siltstone hosted and two sandstone hosted) from a narrow stratigraphic interval within the Oligocene Poleslide Member of the Brule Formation, in the Palmer Creek Unit of BNP. Previous work in the region confirmed that the two major lithologies represent primarily aeolian- and primarily fluvial-dominated depositional environments, respectively. A suite of quantifiable taphonomic and ecological variables was recorded for each of the more than 800 vertebrate specimens studied here (857 specimens were studied in the field, 9 specimens were collected and are reposited at BNP). Distinctly different patterns of taphonomic biasing were observed between the aeolian and fluvial samples, albeit with some variability between all four sites. Fluvial samples were more heavily weathered and abraded, but also contained fewer large taxa and fewer tooth-bearing elements. No quantifiable paleofaunal differences in generic richness or evenness were observed between the respective facies. 
This suggests that while large vertebrate taxonomic composition in the region did vary with paleodepositional environment, there is no evidence of confounding variation in faunal structure, and therefore differences between the assemblages are attributed to differing preservational environments producing a taphonomic overprint on the assemblages. The lack of apparent taphonomic bias on paleofaunal structure suggests that such paleoecological data can be compared throughout the Poleslide Member, irrespective of lithology.
Based on geologic mapping, measured sections, and lithologic correlations, the local features of the upper and lower type areas of the Early Arikareean (30.8-20.6 million years ago) Sharps Formation are revised and correlated. The Sharps Formation above the basal Rockyford Member is divided into two members of distinct lithotypes. The upper 233 feet of massive siltstones and sandy siltstones is named the Gooseneck Road Member. The middle member, 161 feet of eolian volcaniclastic siltstones with fluvially reworked volcaniclastic lenses and sandy siltstone sheets, is named the Wolff Camp Member. An ashy zone at the base of the Sharps Formation is described and defined as the Rockyford Ash Zone (RAZ), in the same stratigraphic position as the Nonpareil Ash Zone (NPAZ) in Nebraska. Widespread marker beds of freshwater limestones at 130 feet above the base of the Sharps Formation and a widespread reddish-brown clayey siltstone at 165 feet above the base are described. The Brown Siltstone Beds of Nebraska are shown to be a southern correlative of the Wolff Camp Member and the Rockyford Member of the Sharps Formation. Early attempts to correlate strata in the Great Plains were slow to develop. Recognition of the implications of the paleomagnetic and lithologic correlations presented here will provide an additional datum to assist researchers in future biostratigraphic studies. Based on similar lithologies, the Sharps Formation, currently assigned to the Arikaree Group, should be reassigned to the White River Group.
- Conservation biology : the journal of the Society for Conservation Biology
- Published over 4 years ago
The contribution of renewable energy to meet worldwide demand continues to grow. Wind energy is one of the fastest growing renewable sectors, but new wind facilities are often placed in prime wildlife habitat. Long-term studies that incorporate a rigorous statistical design to evaluate the effects of wind facilities on wildlife are rare. We conducted a before-after-control-impact (BACI) assessment to determine if wind facilities placed in native mixed-grass prairies displaced breeding grassland birds. During 2003-2012, we monitored changes in bird density in 3 study areas in North Dakota and South Dakota (U.S.A.). We examined whether displacement or attraction occurred 1 year after construction (immediate effect) and the average displacement or attraction 2-5 years after construction (delayed effect). We tested for these effects overall and within distance bands of 100, 200, 300, and >300 m from turbines. We observed displacement for 7 of 9 species. One species was unaffected by wind facilities and one species exhibited attraction. Displacement and attraction generally occurred within 100 m and often extended up to 300 m. In a few instances, displacement extended beyond 300 m. Displacement and attraction occurred 1 year after construction and persisted at least 5 years. Our research provides a framework for applying a BACI design to displacement studies and highlights the erroneous conclusions that can be made without the benefit of adopting such a design. More broadly, species-specific behaviors can be used to inform management decisions about turbine placement and the potential impact to individual species. Additionally, the avoidance distance metrics we estimated can facilitate future development of models evaluating impacts of wind facilities under differing land-use scenarios.
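The BACI effect the authors estimate is, at its core, a difference-in-differences: the before-to-after change in bird density at impact sites minus the same change at control sites, which nets out regional trends affecting both. A minimal sketch with hypothetical densities; the study's actual analysis used more elaborate density models across multiple years and distance bands:

```python
# BACI (before-after-control-impact): the facility effect is the
# before-to-after change at impact sites minus the change at control
# sites. Densities below are hypothetical (birds per 100 ha).

def baci_effect(impact_before, impact_after, control_before, control_after):
    """Difference-in-differences estimate of the facility effect."""
    return (impact_after - impact_before) - (control_after - control_before)

# Hypothetical mean densities for one grassland species
effect = baci_effect(impact_before=24.0, impact_after=15.0,
                     control_before=22.0, control_after=21.0)
print(effect)  # -> -8.0; negative => displacement
```

Without the control term, the 1-unit regional decline here would be misattributed to the turbines; that is the kind of erroneous conclusion the authors note a non-BACI design invites.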
Plant and associated insect-damage diversity in the western U.S.A. decreased significantly at the Cretaceous-Paleogene (K-Pg) boundary and remained low until the late Paleocene. However, the Mexican Hat locality (ca. 65 Ma) in southeastern Montana, with a typical, low-diversity flora, uniquely exhibits high damage diversity on nearly all its host plants, when compared to all known local and regional early Paleocene sites. The same plant species show minimal damage elsewhere during the early Paleocene. We asked whether the high insect damage diversity at Mexican Hat was more likely related to the survival of Cretaceous insects from refugia or to an influx of novel Paleocene taxa. We compared damage on 1073 leaf fossils from Mexican Hat to over 9000 terminal Cretaceous leaf fossils from the Hell Creek Formation of nearby southwestern North Dakota and to over 9000 Paleocene leaf fossils from the Fort Union Formation in North Dakota, Montana, and Wyoming. We described the entire insect-feeding ichnofauna at Mexican Hat and focused our analysis on leaf mines because they are typically host-specialized and preserve a number of diagnostic morphological characters. Nine mine damage types attributable to three of the four orders of leaf-mining insects are found at Mexican Hat, six of them so far unique to the site. We found no evidence linking any of the diverse Hell Creek mines with those found at Mexican Hat, nor for the survival of any Cretaceous leaf miners over the K-Pg boundary regionally, even on well-sampled, surviving plant families. Overall, our results strongly relate the high damage diversity on the depauperate Mexican Hat flora to an influx of novel insect herbivores during the early Paleocene, possibly caused by a transient warming event and range expansion, and indicate drastic extinction rather than survivorship of Cretaceous insect taxa from refugia.
An Ehrlichia muris-like (EML) pathogen was detected among 4 patients in Minnesota and Wisconsin during 2009. We characterized additional cases clinically and epidemiologically. During 2004-2013, blood samples from 75,077 patients from all 50 U.S. states were tested by PCR targeting the groEL gene for Ehrlichia spp. and Anaplasma phagocytophilum. During 2007-2013, samples from 69 (0.1%) patients were positive for the EML pathogen; patients were from 5 states: Indiana (1), Michigan (1), Minnesota (33), North Dakota (3), and Wisconsin (31). Most (64%) patients were male; median age was 63 years (range 15-94); and all 69 patients reported likely tick exposure in Minnesota or Wisconsin. Fever, malaise, thrombocytopenia, and lymphopenia were the most common clinical findings. Sixteen (23%) patients were hospitalized (median 4 days); all recovered, and 96% received doxycycline. Infection with the EML pathogen should be considered for persons reporting tick exposure in Minnesota or Wisconsin.