Glyphosate, hard water and nephrotoxic metals: are they the culprits behind the epidemic of chronic kidney disease of unknown etiology in Sri Lanka?
- International Journal of Environmental Research and Public Health
- Published over 4 years ago
The current chronic kidney disease epidemic, the major health issue in the rice paddy farming areas of Sri Lanka, has been the subject of many scientific and political debates over the last decade. Although there is no agreement among scientists about the etiology of the disease, a majority have concluded that it is a toxic nephropathy. None of the hypotheses put forward so far coherently explains the totality of the clinical, biochemical, and histopathological findings, the unique geographical distribution of the disease, and its appearance in the mid-1990s. A strong association between the consumption of hard water and the occurrence of this special kidney disease has been observed, but the relationship has not been explained consistently. Here, we hypothesize an association between the disease and the use of glyphosate, the most widely used herbicide in the disease-endemic area, via its unique metal-chelating properties. The possible role played by glyphosate-metal complexes in this epidemic has not been given any serious consideration by investigators over the last two decades. Furthermore, it may explain similar kidney disease epidemics observed in Andhra Pradesh (India) and Central America. Although glyphosate alone does not cause an epidemic of chronic kidney disease, it appears to acquire the ability to destroy the renal tissues of thousands of farmers when it forms complexes with a localized geo-environmental factor (water hardness) and nephrotoxic metals.
We report the results of a study conducted with a simple multiplayer online game that simulates the spread of an infectious disease through a population composed of the players. We use our virtual epidemics game to examine how people respond to epidemics. The analysis shows that people's behavior is responsive to the cost of self-protection, the reported prevalence of disease, and their experiences earlier in the epidemic. Specifically, decreasing the cost of self-protection increases the rate of safe behavior. Higher reported prevalence also raises the likelihood that individuals will engage in self-protection, and the magnitude of this effect depends on how much time has elapsed in the epidemic. How often individuals acquired an infection when they did not engage in self-protection is another factor that determines whether they will invest in preventive measures later on. All else being equal, individuals who were infected at a higher rate are more likely to engage in self-protective behavior than those with a lower rate of infection. Lastly, holding everything else fixed, people's willingness to engage in safe behavior waxes or wanes over time, depending on the severity of the epidemic: when prevalence is high, people are more likely to adopt self-protective measures as time goes by; when prevalence is low, a 'self-protection fatigue' effect sets in whereby individuals become less willing to engage in safe behavior over time.
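The qualitative dependencies described in this abstract (protection rises with prevalence and past infection experience, falls with cost) can be sketched as a minimal logistic choice model. The functional form and all coefficient values here are illustrative assumptions, not the authors' fitted model:

```python
import math

def p_protect(cost, prevalence, past_infection_rate,
              b0=-1.0, b_cost=-2.0, b_prev=3.0, b_exp=2.0):
    """Illustrative logistic choice model (coefficients are made up):
    the probability of adopting self-protection rises with reported
    prevalence and past infection experience, and falls with cost."""
    z = b0 + b_cost * cost + b_prev * prevalence + b_exp * past_infection_rate
    return 1.0 / (1.0 + math.exp(-z))

# Cheaper protection -> more safe behavior, as the study finds.
cheap = p_protect(cost=0.1, prevalence=0.3, past_infection_rate=0.5)
costly = p_protect(cost=0.9, prevalence=0.3, past_infection_rate=0.5)
```

Any sign-consistent monotone link would reproduce the same qualitative predictions; the logistic is used only because it keeps probabilities in (0, 1).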
Mark Siedner and colleagues reflect on the early response to the Ebola epidemic and lessons that can be learned for future epidemics.
Increasing the durability of crop resistance to plant pathogens is one of the key goals of virulence management. Despite recognition of the importance of demographic and environmental stochasticity to the dynamics of an epidemic, their effects on the evolution of the pathogen and the durability of resistance have not received attention. We formulated a stochastic epidemiological model, based on the Kramer-Moyal expansion of the Master Equation, to investigate how random fluctuations affect the dynamics of an epidemic and how these effects feed through to the evolution of the pathogen and the durability of resistance. We focused on two hypotheses. Firstly, a previous deterministic model suggested that the effect of cropping ratio (the proportion of land area occupied by the resistant crop) on the durability of crop resistance is negligible: increasing the cropping ratio increases the area of uninfected host, but the resistance is more rapidly broken, and these two effects counteract each other. We tested the hypothesis that similar counteracting effects would occur when we account for demographic stochasticity, but found that the durability does depend on the cropping ratio. Secondly, we tested whether a superimposed external source of stochasticity (for example, due to environmental variation or to intermittent fungicide application) interacts with the intrinsic demographic fluctuations, and how such interaction affects the durability of resistance. We show that in the pathosystem considered here, large stochastic fluctuations in epidemics generally enhance extinction of the pathogen. This is more likely to occur at large cropping ratios and for particular frequencies of the periodic external perturbation (stochastic resonance). The results suggest possible disease control practices that exploit the natural sources of stochasticity.
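The central effect here, demographic stochasticity driving pathogen extinction even when a deterministic model predicts persistence, can be illustrated with a generic event-driven (Gillespie) SIR simulation. This is a standard textbook sketch, not the paper's Kramer-Moyal formulation, and all parameter values are arbitrary:

```python
import random

def gillespie_sir(S, I, beta, gamma, N, t_max=200.0, seed=1):
    """Event-driven (Gillespie) simulation of a stochastic SIR epidemic.
    With demographic noise, a chain of infection can fluctuate to
    extinction even when beta/gamma > 1."""
    rng = random.Random(seed)
    t = 0.0
    while I > 0 and t < t_max:
        rate_inf = beta * S * I / N   # S -> I events
        rate_rec = gamma * I          # I -> R events
        total = rate_inf + rate_rec
        t += rng.expovariate(total)   # time to next event
        if rng.random() < rate_inf / total:
            S -= 1
            I += 1
        else:
            I -= 1
    return I  # 0 means the pathogen went extinct

# Starting from a single case with beta/gamma = 1.5, stochastic
# extinction occurs in a substantial fraction of runs.
outcomes = [gillespie_sir(999, 1, beta=1.5, gamma=1.0, N=1000, seed=s)
            for s in range(100)]
```

In the corresponding deterministic model the epidemic always takes off when beta/gamma > 1; the stochastic runs show why a fluctuation-aware analysis can reach different conclusions about durability.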
How social structures, space, and behaviors shape the spread of infectious diseases using chikungunya as a case study
- Proceedings of the National Academy of Sciences of the United States of America
- Published almost 2 years ago
Whether an individual becomes infected in an infectious disease outbreak depends on many interconnected risk factors, which may relate to characteristics of the individual (e.g., age, sex), his or her close relatives (e.g., household members), or the wider community. Studies monitoring individuals in households or schools, aided by advances in statistical modeling, have helped elucidate the determinants of transmission in small social structures; but such an approach has so far largely failed to consider individuals in the wider context in which they live. Here, we used an outbreak of chikungunya in a rural community in Bangladesh as a case study to obtain a more comprehensive characterization of risk factors in disease spread. We developed Bayesian data-augmentation approaches to account for uncertainty in the source of infection, recall uncertainty, and unobserved infection dates. We found that the probability of chikungunya transmission was 12% [95% credible interval (CI): 8-17%] between household members but dropped to 0.3% (95% CI: 0.2-0.5%) for those living 50 m away. Overall, the mean transmission distance was 95 m (95% CI: 77-113 m). Females were 1.5 times more likely to become infected than males (95% CI: 1.2-1.8), virtually identical to the relative risk of being at home estimated from an independent human movement study in the country. Reported daily use of antimosquito coils had no detectable impact on transmission. This study shows how the complex interplay between the characteristics of an individual and his or her close and wider environment shapes infectious disease epidemics.
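The steep drop from 12% within a household to 0.3% at 50 m can be visualized with a naive exponential distance kernel drawn through those two reported point estimates. This is purely illustrative: the paper's actual fitted kernel, and the spatial layout of households (which drives the 95 m mean transmission distance), may differ.

```python
import math

# Two reported point estimates: 12% between household members (d ~ 0 m)
# and 0.3% at d = 50 m. An exponential kernel through them gives a
# decay length of roughly 13.6 m.
p0, p50, d50 = 0.12, 0.003, 50.0
decay_length = d50 / math.log(p0 / p50)

def p_transmit(d):
    """Illustrative pairwise transmission probability at distance d (m)."""
    return p0 * math.exp(-d / decay_length)
```

A decay length of about 14 m is consistent with the abstract's picture of transmission concentrated in and immediately around the household, matching the short flight range of the Aedes vector.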
Sustained and coordinated vaccination efforts have brought polio eradication within reach. Anticipating the eradication of wild poliovirus (WPV) and the subsequent challenges in preventing its re-emergence, we look to the past to identify why polio rose to epidemic levels in the mid-20th century, and how WPV persisted over large geographic scales. We analyzed an extensive epidemiological dataset, spanning the 1930s to the 1950s and spatially replicated across each state in the United States, to glean insight into the drivers of polio’s historical expansion and the ecological mode of its persistence prior to vaccine introduction. We document a latitudinal gradient in polio’s seasonality. Additionally, we fitted and validated mechanistic transmission models to data from each US state independently. The fitted models revealed that: (1) polio persistence was the product of a dynamic mosaic of source and sink populations; (2) geographic heterogeneity of seasonal transmission conditions accounts for the latitudinal structure of polio epidemics; (3) contrary to the prevailing “disease of development” hypothesis, our analyses demonstrate that polio’s historical expansion was straightforwardly explained by demographic trends rather than improvements in sanitation and hygiene; and (4) the absence of clinical disease is not a reliable indicator of polio transmission, because widespread polio transmission was likely in the multiyear absence of clinical disease. As the world edges closer to global polio eradication and continues the strategic withdrawal of the Oral Polio Vaccine (OPV), the regular identification of, and rapid response to, these silent chains of transmission is of the utmost importance.
Infectious diseases rarely end in extinction. Yet the mechanisms that explain how epidemics subside are difficult to pinpoint. We investigated host-pathogen interactions after the emergence of a lethal fungal pathogen in a tropical amphibian assemblage. Some amphibian host species are recovering, but the pathogen is still present and is as pathogenic today as it was almost a decade ago. In addition, some species have defenses that are more effective now than they were before the epidemic. These results suggest that host recoveries are not caused by pathogen attenuation and may be due to shifts in host responses. Our findings provide insights into the mechanisms underlying disease transitions, which are increasingly important to understand in an era of emerging infectious diseases and unprecedented global pandemics.
We assess how presymptomatic infection affects predictability of infectious disease epidemics. We focus on whether or not a major outbreak (i.e. an epidemic that will go on to infect a large number of individuals) can be predicted reliably soon after initial cases of disease have appeared within a population. For emerging epidemics, significant time and effort is spent recording symptomatic cases. Scientific attention has often focused on improving statistical methodologies to estimate disease transmission parameters from these data. Here we show that, even if symptomatic cases are recorded perfectly, and disease spread parameters are estimated exactly, it is impossible to estimate the probability of a major outbreak without ambiguity. Our results therefore provide an upper bound on the accuracy of forecasts of major outbreaks that are constructed using data on symptomatic cases alone. Accurate prediction of whether or not an epidemic will occur requires records of symptomatic individuals to be supplemented with data concerning the true infection status of apparently uninfected individuals. To forecast likely future behavior in the earliest stages of an emerging outbreak, it is therefore vital to develop and deploy accurate diagnostic tests that can determine whether asymptomatic individuals are actually uninfected, or instead are infected but just do not yet show detectable symptoms.
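The irreducible ambiguity described above can be made concrete with the standard branching-process approximation for a Markovian SIR model (a classical result, not necessarily the exact formulation used in the paper): each infectious lineage dies out with probability 1/R0 when R0 > 1, so the outbreak probability depends on the total number of current infections, including the presymptomatic ones that case records cannot see.

```python
def p_major_outbreak(R0, n_infected):
    """Branching-process approximation for a Markovian SIR model:
    each infectious lineage dies out with probability 1/R0 (R0 > 1),
    so with n current infections the epidemic takes off with
    probability 1 - (1/R0)**n. Below threshold it never takes off."""
    if R0 <= 1.0:
        return 0.0
    return 1.0 - (1.0 / R0) ** n_infected

# With R0 = 2 and 2 recorded symptomatic cases, the forecast depends
# on how many presymptomatic infections are also present: without
# diagnostic data on apparently healthy individuals, the answer is
# an interval, not a single number.
R0 = 2.0
low = p_major_outbreak(R0, 2)       # no hidden infections: 0.75
high = p_major_outbreak(R0, 2 + 3)  # three hidden infections: ~0.97
```

Even with R0 known exactly, the forecast spans 0.75 to roughly 0.97 in this toy case, which is the kind of ambiguity the abstract argues can only be resolved by testing asymptomatic individuals.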
In the new millennium, the centuries-old strategy of quarantine is becoming a powerful component of the public health response to emerging and reemerging infectious diseases. During the 2003 pandemic of severe acute respiratory syndrome (SARS), the use of quarantine, border controls, contact tracing, and surveillance proved effective in containing the global threat in just over 3 months. For centuries, these practices have been the cornerstone of organized responses to infectious disease outbreaks. However, the use of quarantine and other measures for controlling epidemic diseases has always been controversial because such strategies raise political, ethical, and socioeconomic issues and require a careful balance between public interest and individual rights. In a globalized world that is becoming ever more vulnerable to communicable diseases, a historical perspective can help clarify the use and implications of a still-valid public health strategy.
During outbreaks of high-consequence pathogens, airport screening programs have been deployed to curtail the geographic spread of infection. The effectiveness of screening depends on several factors, including pathogen natural history and epidemiology, human behavior, and characteristics of the source epidemic. We developed a mathematical model to understand how these factors combine to influence screening outcomes. We analyzed screening programs for six emerging pathogens in the early and late stages of an epidemic. We show that the effectiveness of different screening tools depends strongly on pathogen natural history and epidemiological features, as well as on human factors in implementation and compliance. For pathogens with longer incubation periods, detection of exposure risk dominates in growing epidemics, while fever becomes a better target in stable or declining epidemics. For pathogens with short incubation periods, fever screening drives detection in any epidemic stage. However, even in the most optimistic scenario, arrival screening will miss the majority of cases.
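The interaction between incubation period and epidemic stage can be illustrated with a toy calculation (not the paper's model; the fixed incubation period and exponential infection-age approximation are simplifying assumptions): in an epidemic growing at exponential rate r, most infected travellers were infected recently, so their time since infection is approximately Exponential(r), and fever screening can detect a traveller only if that time exceeds the incubation period.

```python
import random

def frac_febrile_at_arrival(incubation_days, growth_rate,
                            n=100_000, seed=0):
    """Toy Monte Carlo estimate of the fraction of infected travellers
    whom fever screening could even in principle detect: infection age
    is drawn as Exponential(growth_rate), and fever is assumed to
    appear only once a fixed incubation period has elapsed."""
    rng = random.Random(seed)
    detectable = sum(rng.expovariate(growth_rate) > incubation_days
                     for _ in range(n))
    return detectable / n

# Longer incubation -> far fewer travellers are febrile at arrival.
short = frac_febrile_at_arrival(incubation_days=2, growth_rate=0.2)
long_ = frac_febrile_at_arrival(incubation_days=10, growth_rate=0.2)
```

Even this crude sketch reproduces the abstract's qualitative finding: with a 10-day incubation period, the large majority of infected travellers in a growing epidemic pass through fever screening undetected.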