It is often suggested that coffee causes dehydration and that its consumption should be avoided or significantly reduced to maintain fluid balance. The aim of this study was to directly compare the effects of coffee consumption against water ingestion across a range of validated hydration assessment techniques. In a counterbalanced cross-over design, 50 male coffee drinkers (habitually consuming 3-6 cups per day) participated in two trials, each lasting three consecutive days. In addition to controlled physical activity, food, and fluid intake, participants consumed either 4×200 mL of coffee containing 4 mg/kg caffeine (C) or water (W). Total body water (TBW) was calculated pre- and post-trial via ingestion of deuterium oxide. Urinary and haematological hydration markers were recorded daily, in addition to nude body mass (BM) measurement. Plasma was analysed for caffeine to confirm compliance. There were no significant changes in TBW from beginning to end of either trial and no differences between trials (51.5±1.4 vs. 51.4±1.3 kg for C and W, respectively). No differences were observed between trials across any haematological markers or in 24 h urine volume (2409±660 vs. 2428±669 mL for C and W, respectively), USG, osmolality or creatinine. Mean urinary Na+ excretion was higher in C than in W (p = 0.02). No significant differences in BM were found between conditions, although a small progressive daily fall was observed within both trials (0.4±0.5 kg; p<0.05). Our data show that there were no significant differences across a wide range of haematological and urinary markers of hydration status between trials. These data suggest that coffee, when consumed in moderation by caffeine-habituated males, provides similar hydrating qualities to water.
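The 4 mg/kg dosing scheme implies simple per-subject arithmetic; a minimal sketch, assuming a hypothetical 75 kg participant (individual body masses are not reported in the abstract):

```python
# Dosing arithmetic for the coffee (C) condition described above.
# The 75 kg body mass is an illustrative assumption, not study data.
CAFFEINE_DOSE_MG_PER_KG = 4   # caffeine dose in the C trial
COFFEE_SERVINGS = 4           # 4 x 200 mL servings per day
SERVING_VOLUME_ML = 200

def daily_caffeine_mg(body_mass_kg: float) -> float:
    """Total daily caffeine delivered in the coffee condition."""
    return CAFFEINE_DOSE_MG_PER_KG * body_mass_kg

body_mass_kg = 75.0                            # assumed for illustration
total_mg = daily_caffeine_mg(body_mass_kg)     # 300.0 mg/day
per_serving_mg = total_mg / COFFEE_SERVINGS    # 75.0 mg per 200 mL serving
total_volume_ml = COFFEE_SERVINGS * SERVING_VOLUME_ML  # 800 mL fluid/day
```

At this assumed body mass, each 200 mL serving would carry about 75 mg of caffeine, in the range of a typical cup of brewed coffee.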
Natriuretic regulation of extracellular fluid volume homeostasis includes suppression of the renin-angiotensin-aldosterone system, pressure natriuresis, and reduced renal nerve activity, actions that concomitantly increase urinary Na+ excretion and lead to increased urine volume. The resulting natriuresis-driven diuretic water loss is assumed to control the extracellular volume. Here, we have demonstrated that urine concentration, and therefore regulation of water conservation, is an important control system for urine formation and extracellular volume homeostasis in mice and humans across various levels of salt intake. We observed that the renal concentration mechanism couples natriuresis with correspondent renal water reabsorption, limits natriuretic osmotic diuresis, and results in concurrent extracellular volume conservation and concentration of salt excreted into urine. This water-conserving mechanism of dietary salt excretion relies on urea transporter-driven urea recycling by the kidneys and on urea production by liver and skeletal muscle. The energy-intense nature of hepatic and extrahepatic urea osmolyte production for renal water conservation requires reprioritization of energy and substrate metabolism in liver and skeletal muscle, resulting in hepatic ketogenesis and glucocorticoid-driven muscle catabolism, which are prevented by increasing food intake. This natriuretic-ureotelic, water-conserving principle relies on metabolism-driven extracellular volume control and is regulated by concerted liver, muscle, and renal actions.
Background The epidemiologic characteristics of children and young adults with acute kidney injury have been described in single-center and retrospective studies. We conducted a multinational, prospective study involving patients admitted to pediatric intensive care units to define the incremental risk of death and complications associated with severe acute kidney injury. Methods We used the Kidney Disease: Improving Global Outcomes criteria to define acute kidney injury. Severe acute kidney injury was defined as stage 2 or 3 acute kidney injury (plasma creatinine level ≥2 times the baseline level or urine output <0.5 ml per kilogram of body weight per hour for ≥12 hours) and was assessed for the first 7 days of intensive care. All patients 3 months to 25 years of age who were admitted to 1 of 32 participating units were screened during 3 consecutive months. The primary outcome was 28-day mortality. Results A total of 4683 patients were evaluated; acute kidney injury developed in 1261 patients (26.9%; 95% confidence interval [CI], 25.6 to 28.2), and severe acute kidney injury developed in 543 patients (11.6%; 95% CI, 10.7 to 12.5). Severe acute kidney injury conferred an increased risk of death by day 28 after adjustment for 16 covariates (adjusted odds ratio, 1.77; 95% CI, 1.17 to 2.68); death occurred in 60 of the 543 patients (11.0%) with severe acute kidney injury versus 105 of the 4140 patients (2.5%) without severe acute kidney injury (P<0.001). Severe acute kidney injury was associated with increased use of mechanical ventilation and renal-replacement therapy. A stepwise increase in 28-day mortality was associated with worsening severity of acute kidney injury (P<0.001 by log-rank test). Assessment of acute kidney injury according to the plasma creatinine level alone failed to identify acute kidney injury in 67.2% of the patients with low urine output. 
Conclusions Acute kidney injury is common and is associated with poor outcomes, including increased mortality, among critically ill children and young adults. (Funded by the Pediatric Nephrology Center of Excellence at Cincinnati Children's Hospital Medical Center and others; AWARE ClinicalTrials.gov number, NCT01987921.)
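The reported counts let the crude figures be reproduced directly; a minimal sketch (this unadjusted odds ratio is expectedly larger than the covariate-adjusted OR of 1.77 reported above, which controls for 16 covariates):

```python
# Crude 28-day mortality and unadjusted odds ratio from the counts
# reported in the Results (60/543 severe AKI vs. 105/4140 without).
deaths_severe, n_severe = 60, 543
deaths_other, n_other = 105, 4140

mortality_severe = deaths_severe / n_severe   # ~0.110 (11.0%)
mortality_other = deaths_other / n_other      # ~0.025 (2.5%)

# Odds = deaths / survivors; crude OR is the ratio of the two odds.
odds_severe = deaths_severe / (n_severe - deaths_severe)
odds_other = deaths_other / (n_other - deaths_other)
crude_or = odds_severe / odds_other           # ~4.77, unadjusted
```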
Whole genome sequencing (WGS) is becoming available as a routine tool for clinical microbiology. If applied directly to clinical samples, this could further reduce diagnostic time and thereby improve control and treatment. A major bottleneck is the availability of fast and reliable bioinformatics tools. This study was conducted to evaluate the applicability of WGS directly on clinical samples and to develop easy-to-use bioinformatics tools for analysis of the sequencing data. Thirty-five random urine samples from patients with suspected urinary tract infections were examined using conventional microbiology, WGS of isolated bacteria, and direct sequencing of pellets from the urine. A rapid method for analyzing the sequence data was developed. Bacteria were cultivated from 19 samples, but in pure culture from only 17. WGS improved the identification of the cultivated bacteria, and almost complete agreement was observed between phenotypic and predicted antimicrobial susceptibility. Complete agreement was observed between species identification, multi-locus sequence typing, and phylogenetic relationships for the Escherichia coli and Enterococcus faecalis isolates when comparing the results of WGS of cultured isolates with sequencing directly from the urine samples. Sequencing directly from the urine enabled bacterial identification in polymicrobial samples. Additional putative pathogenic strains were observed in some culture-negative samples. WGS directly on clinical samples can provide clinically relevant information and drastically reduce diagnostic time. This may prove very useful, but the need for data analysis is still a hurdle to clinical implementation. To overcome this problem, a publicly available bioinformatics tool was developed in this study.
In 2015, scientists reported the emergence of the plasmid-encoded mcr-1 gene conferring bacterial resistance to the antibiotic colistin (1), signaling potential emergence of a pandrug-resistant bacterium. In May 2016, mcr-1-positive Escherichia coli was first isolated from a specimen from a U.S. patient (2) when a Pennsylvania woman was evaluated for a urinary tract infection. The urine culture and subsequent testing identified the gene in an extended-spectrum beta-lactamase (ESBL)-producing E. coli with reduced susceptibility to colistin. The patient had no international travel for approximately 1 year, no livestock exposure, and a limited role in meal preparation with store-bought groceries; however, she had multiple and repeated admissions to four medical facilities during 2016.
We describe a novel infection-responsive coating for urinary catheters that provides a clear visual early warning of Proteus mirabilis infection and subsequent blockage. The crystalline biofilms of P. mirabilis can cause serious complications for patients undergoing long-term bladder catheterisation. Healthy urine is around pH 6; bacterial urease increases urine pH, leading to the precipitation of calcium and magnesium deposits from the urine and resulting in dense crystalline biofilms on the catheter surface that block urine flow. The coating is a dual-layered system in which the lower poly(vinyl alcohol) layer contains the self-quenching dye carboxyfluorescein. This is capped by an upper layer of the pH-responsive polymer poly(methyl methacrylate-co-methacrylic acid) (Eudragit S100®). Elevation of urinary pH (>pH 7) dissolves the Eudragit layer, releasing the dye to provide a clear visual warning of impending blockage. Evaluation of prototype coatings using a clinically relevant in vitro bladder model system demonstrated that the coatings provide up to 12 h advance warning of blockage and are stable both in the absence of infection and in the presence of species that do not cause catheter blockage. At present, there are no effective methods to control these infections or provide warning of impending catheter blockage.
BACKGROUND: To investigate the effect of prostaglandin depletion by means of COX inhibition on cholinergically enhanced spontaneous contractions. METHODS: The urethra and bladder of 9 male guinea pigs (weight 270–300 g) were removed and placed in an organ bath with Krebs' solution. A catheter was passed through the urethra, through which the intravesical pressure was measured. The muscarinic agonist arecaidine, the non-selective COX inhibitor indomethacin, and PGE2 were subsequently added to the organ bath. The initial average frequency and amplitude of spontaneous contractions in the first 2 minutes after arecaidine application were labelled Fini and Pini, respectively. The steady-state frequency (Fsteady) and amplitude (Psteady) were defined as the average frequency and amplitude during the 5 minutes before the next washout. RESULTS: Application of 1 µM PGE2 increased the amplitude of spontaneous contractions without affecting frequency. 10 µM of indomethacin reduced amplitude but not frequency. The addition of indomethacin did not alter Fini after the first application (p = 0.7665). However, after the second wash, Fini was decreased (p = 0.0005). Fsteady, Psteady and Pini were not significantly different in any of the conditions. These effects of indomethacin were reversible by PGE2 addition. CONCLUSIONS: Blocking PG synthesis decreased the cholinergically stimulated autonomous contractions in the isolated bladder. This suggests that PG could modify the normal cholinergically evoked response. A combination of drugs inhibiting muscarinic receptors and PG function or production could therefore become an interesting focus of research into treatment for overactive bladder syndrome.
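The windowed summaries defined in the Methods (Fini/Pini over the first 2 minutes after arecaidine application, Fsteady/Psteady over the 5 minutes before washout) can be sketched as follows; the event times, amplitudes, and the 900 s washout time are illustrative assumptions, not data from the study:

```python
# Sketch of the windowed frequency/amplitude summaries from the Methods.
# Contraction events and the washout time below are hypothetical.
def window_stats(times_s, amplitudes, t_start, t_end):
    """Mean contraction frequency (per min) and amplitude in [t_start, t_end)."""
    events = [(t, a) for t, a in zip(times_s, amplitudes)
              if t_start <= t < t_end]
    duration_min = (t_end - t_start) / 60.0
    freq = len(events) / duration_min
    mean_amp = sum(a for _, a in events) / len(events) if events else 0.0
    return freq, mean_amp

# Illustrative contraction events: (time in s, amplitude in cmH2O)
times = [10, 40, 70, 100, 700, 760, 820, 880]
amps  = [4.2, 4.0, 3.9, 4.1, 3.0, 3.1, 2.9, 3.0]

f_ini, p_ini = window_stats(times, amps, 0, 120)          # first 2 min
f_steady, p_steady = window_stats(times, amps, 600, 900)  # 5 min before an assumed washout at 900 s
```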
BACKGROUND: Diabetic patients have a higher risk of bladder cancer and benign prostatic hyperplasia (BPH). Theoretically, BPH patients may have an increased risk of bladder cancer because residual urine in the bladder increases the contact time between urinary excreted carcinogens and the urothelium. However, whether BPH increases bladder cancer risk in patients with type 2 diabetes has not been studied. METHODS: The reimbursement databases of all Taiwanese diabetic patients on oral anti-diabetic agents or insulin from 1996 to 2009 were retrieved from the National Health Insurance. The entry date was set at 1 January 2006, and a total of 547,584 men with type 2 diabetes were followed up for bladder cancer incidence until the end of 2009. Incidences of bladder cancer by BPH status and duration were calculated, and adjusted hazard ratios (95% confidence intervals) were estimated by Cox regression. The effects of diabetes duration and of medications used for diabetic control on bladder cancer risk were also evaluated by Cox regression in BPH men. RESULTS: The incidences were 258.77 and 69.34 per 100,000 person-years for patients with and without BPH, respectively; adjusted hazard ratio 1.794 (1.572, 2.047). Among BPH patients, those who underwent surgical procedures for BPH had a higher incidence than those who did not (355.45 vs. 250.09 per 100,000 person-years); respective adjusted hazard ratios: 2.459 (1.946, 3.109) and 1.709 (1.492, 1.958). The significantly higher risk could be demonstrated for BPH of any duration: respective adjusted hazard ratios 1.750 (1.430, 1.605), 1.844 (1.543, 2.203), 2.011 (1.680, 2.406) and 1.605 (1.341, 1.921) for BPH <1, 1-3, 3-5 and ≥5 years versus patients without BPH. Sensitivity analyses for patients aged ≥60 years, and after excluding BPH patients with or without surgical procedures, respectively, yielded similar results.
In BPH men, diabetes duration was not significantly related to bladder cancer, but metformin was consistently associated with a significantly lower risk, with adjusted hazard ratios of 0.719 (0.590, 0.875) for all ages and 0.742 (0.604, 0.912) for age ≥60 years. CONCLUSIONS: BPH is a significant risk factor for bladder cancer in men with type 2 diabetes. Metformin may protect against bladder cancer in BPH men.
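The reported per-100,000 person-year rates yield a crude incidence-rate ratio that can be checked directly; a minimal sketch (the crude ratio is much larger than the adjusted HR of 1.794 above, illustrating how strongly the Cox model's covariates attenuate the comparison):

```python
# Crude incidence-rate ratio from the rates reported in the Results.
rate_bph = 258.77     # bladder cancer incidence with BPH (per 100,000 person-years)
rate_no_bph = 69.34   # incidence without BPH (per 100,000 person-years)

crude_irr = rate_bph / rate_no_bph   # ~3.73, unadjusted
```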
The Characterization of Feces and Urine: A Review of the Literature to Inform Advanced Treatment Technology
- Critical reviews in environmental science and technology
The safe disposal of human excreta is of paramount importance for the health and welfare of populations living in low-income countries, as well as for the prevention of pollution to the surrounding environment. On-site sanitation (OSS) systems are the most common means of treating excreta in low-income countries; these facilities aim to treat human waste at source and can provide a hygienic and affordable method of waste disposal. However, current OSS systems need improvement and require further research and development. Development of OSS facilities that treat excreta at, or close to, its source requires knowledge of the waste stream entering the system. Data regarding the generation rate and the chemical and physical composition of fresh feces and urine were collected from the medical literature as well as the treatability sector. The data were summarized, and statistical analysis was used to quantify the major factors that were a significant cause of variability. The impact of these data on biological processes, thermal processes, physical separators, and chemical processes was then assessed. Results showed that the median fecal wet mass production was 128 g/cap/day, with a median dry mass of 29 g/cap/day. Fecal output in healthy individuals was 1.20 defecations per 24-hr period, and the main factor affecting fecal mass was the fiber intake of the population. Fecal wet mass values were increased by a factor of 2 in low-income countries (high fiber intakes) in comparison with values found in high-income countries (low fiber intakes). Feces had a median pH of 6.64 and were composed of 74.6% water. Bacterial biomass is the major component (25-54% of dry solids) of the organic fraction of feces. Undigested carbohydrate, fiber, protein, and fat comprise the remainder, and the amounts depend on diet and diarrhea prevalence in the population. The inorganic component of the feces consists primarily of undigested dietary elements that also depend on dietary supply.
Median urine generation rates were 1.42 L/cap/day with a dry solids content of 59 g/cap/day. Variation in the volume and composition of urine is caused by differences in physical exertion and environmental conditions, as well as water, salt, and high-protein intakes. Urine has a pH of 6.2 and contains the largest fractions of nitrogen, phosphorus, and potassium released from the body. Urinary excretion of nitrogen was significant (10.98 g/cap/day), with urea the predominant constituent, making up over 50% of total organic solids. The dietary intake of food and fluid is the major cause of variation in both fecal and urine composition, and these variables should always be considered if the generation rate and the physical and chemical composition of feces and urine are to be accurately predicted.
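The median values reported above translate directly into design loads for a treatment system; a back-of-envelope sketch, assuming a hypothetical 1,000-person community (the community size is an illustration, not from the review):

```python
# Back-of-envelope OSS design loads from the review's median values.
# POPULATION is an assumed community size for illustration only.
POPULATION = 1000
FECAL_WET_G = 128    # median fecal wet mass, g/cap/day
FECAL_DRY_G = 29     # median fecal dry mass, g/cap/day
URINE_L = 1.42       # median urine generation, L/cap/day
URINE_N_G = 10.98    # urinary nitrogen excretion, g/cap/day

daily_fecal_wet_kg = POPULATION * FECAL_WET_G / 1000   # 128.0 kg wet feces/day
daily_fecal_dry_kg = POPULATION * FECAL_DRY_G / 1000   # 29.0 kg dry solids/day
daily_urine_m3 = POPULATION * URINE_L / 1000           # 1.42 m^3 urine/day
daily_urine_n_kg = POPULATION * URINE_N_G / 1000       # ~10.98 kg nitrogen/day
```

Scaling the medians this way gives a first-order sizing estimate; the variability factors the review identifies (fiber intake, diet, climate) would widen these figures considerably in practice.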
This study took a retrospective approach to investigate patients with catheter-associated urinary tract infection (CAUTI) over 2 years at a single hospital’s intensive care unit (ICU) to identify meaningful risk factors and causative organisms.