Background The level of anticoagulation in response to a fixed-dose regimen of warfarin is difficult to predict during the initiation of therapy. We prospectively compared the effect of genotype-guided dosing with that of standard dosing on anticoagulation control in patients starting warfarin therapy. Methods We conducted a multicenter, randomized, controlled trial involving patients with atrial fibrillation or venous thromboembolism. Genotyping for CYP2C9*2, CYP2C9*3, and VKORC1 (-1639G→A) was performed with the use of a point-of-care test. For patients assigned to the genotype-guided group, warfarin doses were prescribed according to pharmacogenetic-based algorithms for the first 5 days. Patients in the control (standard dosing) group received a 3-day loading-dose regimen. After the initiation period, the treatment of all patients was managed according to routine clinical practice. The primary outcome measure was the percentage of time in the therapeutic range of 2.0 to 3.0 for the international normalized ratio (INR) during the first 12 weeks after warfarin initiation. Results A total of 455 patients were recruited, with 227 randomly assigned to the genotype-guided group and 228 assigned to the control group. The mean percentage of time in the therapeutic range was 67.4% in the genotype-guided group as compared with 60.3% in the control group (adjusted difference, 7.0 percentage points; 95% confidence interval, 3.3 to 10.6; P<0.001). There were significantly fewer incidences of excessive anticoagulation (INR ≥4.0) in the genotype-guided group. The median time to reach a therapeutic INR was 21 days in the genotype-guided group as compared with 29 days in the control group (P<0.001). Conclusions Pharmacogenetic-based dosing was associated with a higher percentage of time in the therapeutic INR range than was standard dosing during the initiation of warfarin therapy. 
(Funded by the European Commission Seventh Framework Programme and others; ClinicalTrials.gov number, NCT01119300.)
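The abstract does not spell out how "percentage of time in the therapeutic range" was computed; a standard approach for INR data is the Rosendaal linear-interpolation method, which assumes the INR changes linearly between measurements. The sketch below is illustrative only, with hypothetical INR readings (the dates and values are not from the trial):

```python
from datetime import date

def ttr_rosendaal(readings, low=2.0, high=3.0):
    """Percent time in therapeutic range by Rosendaal linear interpolation.

    readings: list of (date, INR) tuples sorted by date. Between
    consecutive measurements the INR is assumed to change linearly,
    and the fraction of each interval spent in [low, high] is accumulated.
    """
    in_range_days = 0.0
    total_days = 0.0
    for (d0, y0), (d1, y1) in zip(readings, readings[1:]):
        span = (d1 - d0).days
        if span <= 0:
            continue
        total_days += span
        if y0 == y1:
            in_range_days += span if low <= y0 <= high else 0.0
            continue
        # Fraction of the segment y(t) = y0 + t*(y1 - y0), t in [0, 1],
        # that lies within [low, high].
        t_lo = (low - y0) / (y1 - y0)
        t_hi = (high - y0) / (y1 - y0)
        t0, t1 = sorted((t_lo, t_hi))
        frac = max(0.0, min(1.0, t1) - max(0.0, t0))
        in_range_days += frac * span
    return 100.0 * in_range_days / total_days

# Hypothetical INR series during warfarin initiation
readings = [
    (date(2013, 1, 1), 1.2),
    (date(2013, 1, 8), 2.4),
    (date(2013, 1, 22), 3.6),
    (date(2013, 2, 5), 2.6),
]
print(round(ttr_rosendaal(readings), 1))  # → 42.7
```

Because the method interpolates between visits, sparse measurement schedules can bias the estimate; trials typically standardise the INR testing schedule during the initiation period for this reason.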
BACKGROUND: Paralysis or amputation of an arm results in the loss of the ability to orient the hand and grasp, manipulate, and carry objects, functions that are essential for activities of daily living. Brain-machine interfaces could provide a solution to restoring many of these lost functions. We therefore tested whether an individual with tetraplegia could rapidly achieve neurological control of a high-performance prosthetic limb using this type of interface. METHODS: We implanted two 96-channel intracortical microelectrodes in the motor cortex of a 52-year-old individual with tetraplegia. Brain-machine-interface training was done for 13 weeks with the goal of controlling an anthropomorphic prosthetic limb with seven degrees of freedom (three-dimensional translation, three-dimensional orientation, one-dimensional grasping). The participant’s ability to control the prosthetic limb was assessed with clinical measures of upper limb function. This study is registered with ClinicalTrials.gov, NCT01364480. FINDINGS: The participant was able to move the prosthetic limb freely in the three-dimensional workspace on the second day of training. After 13 weeks, robust seven-dimensional movements were performed routinely. Mean success rate on target-based reaching tasks was 91·6% (SD 4·4) versus median chance level 6·2% (95% CI 2·0-15·3). Improvements were seen in completion time (decreased from a mean of 148 s [SD 60] to 112 s) and path efficiency (increased from 0·30 [0·04] to 0·38 [0·02]). The participant was also able to use the prosthetic limb to do skilful and coordinated reach and grasp movements that resulted in clinically significant gains in tests of upper limb function. No adverse events were reported. INTERPRETATION: With continued development of neuroprosthetic limbs, individuals with long-term paralysis could recover the natural and intuitive command signals for hand placement, orientation, and reaching, allowing them to perform activities of daily living.
FUNDING: Defense Advanced Research Projects Agency, National Institutes of Health, Department of Veterans Affairs, and UPMC Rehabilitation Institute.
Background We observed an apparent increase in the rate of device thrombosis among patients who received the HeartMate II left ventricular assist device, as compared with preapproval clinical-trial results and initial experience. We investigated the occurrence of pump thrombosis and elevated lactate dehydrogenase (LDH) levels, LDH levels presaging thrombosis (and associated hemolysis), and outcomes of different management strategies in a multi-institutional study. Methods We obtained data from 837 patients at three institutions, where 895 devices were implanted from 2004 through mid-2013; the mean (±SD) age of the patients was 55±14 years. The primary end point was confirmed pump thrombosis. Secondary end points were confirmed and suspected thrombosis, longitudinal LDH levels, and outcomes after pump thrombosis. Results A total of 72 pump thromboses were confirmed in 66 patients; an additional 36 thromboses in unique devices were suspected. Starting in approximately March 2011, the occurrence of confirmed pump thrombosis at 3 months after implantation increased from 2.2% (95% confidence interval [CI], 1.5 to 3.4) to 8.4% (95% CI, 5.0 to 13.9) by January 1, 2013. Before March 1, 2011, the median time from implantation to thrombosis was 18.6 months (95% CI, 0.5 to 52.7), and from March 2011 onward, it was 2.7 months (95% CI, 0.0 to 18.6). The occurrence of elevated LDH levels within 3 months after implantation mirrored that of thrombosis. Thrombosis was presaged by LDH levels that more than doubled, from 540 IU per liter to 1490 IU per liter, within the weeks before diagnosis. 
Thrombosis was managed by heart transplantation in 11 patients (1 patient died 31 days after transplantation) and by pump replacement in 21, with mortality equivalent to that among patients without thrombosis; among 40 thromboses in 40 patients who did not undergo transplantation or pump replacement, actuarial mortality was 48.2% (95% CI, 31.6 to 65.2) in the ensuing 6 months after pump thrombosis. Conclusions The rate of pump thrombosis related to the use of the HeartMate II has been increasing at our centers and is associated with substantial morbidity and mortality.
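"Actuarial mortality" here refers to a life-table-style estimate that accounts for patients whose follow-up ended before 6 months (censoring), rather than a simple proportion of deaths. A minimal Kaplan-Meier sketch on hypothetical follow-up data (not the study's data) illustrates the idea:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times: follow-up time for each patient.
    events: 1 if the patient died at that time, 0 if censored.
    Returns a list of (time, survival probability) at each death time.
    Patients censored at a death time are counted as at risk at that time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        at_this_time = sum(1 for tt, _ in data if tt == t)
        if deaths > 0:
            surv *= (n_at_risk - deaths) / n_at_risk
            curve.append((t, surv))
        n_at_risk -= at_this_time
        i += at_this_time
    return curve

# Hypothetical cohort: months of follow-up after pump thrombosis
times = [1, 2, 2, 3, 4, 5, 6, 6]   # 0 = censored before death was observed
events = [1, 1, 0, 1, 0, 1, 0, 0]  # 1 = death, 0 = censored
curve = kaplan_meier(times, events)
surv_6mo = curve[-1][1]
print(f"estimated 6-month mortality: {1 - surv_6mo:.0%}")  # → 60%
```

Note how the censored patients (events of 0) shrink the risk set without being counted as deaths; a naive 4/8 = 50% death proportion would understate mortality relative to the actuarial 60% estimate here.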
BACKGROUND: We sought to characterise the frequency, health outcomes and economic consequences of diagnostic errors in the USA through analysis of closed, paid malpractice claims. METHODS: We analysed diagnosis-related claims from the National Practitioner Data Bank (1986-2010). We describe error type, outcome severity and payments (in 2011 US dollars), comparing diagnostic errors to other malpractice allegation groups and inpatient to outpatient within diagnostic errors. RESULTS: We analysed 350 706 paid claims. Diagnostic errors (n=100 249) were the leading type (28.6%) and accounted for the highest proportion of total payments (35.2%). The most frequent outcomes were death, significant permanent injury, major permanent injury and minor permanent injury. Diagnostic errors more often resulted in death than other allegation groups (40.9% vs 23.9%, p<0.001) and were the leading cause of claims-associated death and disability. More diagnostic error claims were outpatient than inpatient (68.8% vs 31.2%, p<0.001), but inpatient diagnostic errors were more likely to be lethal (48.4% vs 36.9%, p<0.001). The inflation-adjusted, 25-year sum of diagnosis-related payments was US$38.8 billion (mean per-claim payout US$386 849; median US$213 250; IQR US$74 545-484 500). Per-claim payments for permanent, serious morbidity that was 'quadriplegic, brain damage, lifelong care' (4.5%; mean US$808 591; median US$564 300), 'major' (13.3%; mean US$568 599; median US$355 350), or 'significant' (16.9%; mean US$419 711; median US$269 255) exceeded those where the outcome was death (40.9%; mean US$390 186; median US$251 745). CONCLUSIONS: Among malpractice claims, diagnostic errors appear to be the most common, most costly and most dangerous of medical mistakes. We found roughly equal numbers of lethal and non-lethal errors in our analysis, suggesting that the public health burden of diagnostic errors could be twice that previously estimated. 
Healthcare stakeholders should consider diagnostic safety a critical health policy issue.
Low vitamin B-12 concentrations are frequently observed among older adults. Malabsorption is hypothesized to be an important cause of vitamin B-12 inadequacy, but serum vitamin B-12 may also be differently affected by vitamin B-12 intake depending on food source. We examined associations between dietary sources of vitamin B-12 (meat, fish and shellfish, eggs, dairy) and serum vitamin B-12, using cross-sectional data of 600 Dutch community-dwelling adults (≥65 years). Dietary intake was assessed with a validated food frequency questionnaire. Vitamin B-12 concentrations were measured in serum. Associations were studied over tertiles of vitamin B-12 intake using P for trend, by calculating prevalence ratios (PRs), and splines. Whereas men had significantly higher vitamin B-12 intakes than women (median (25th-75th percentile): 4.18 (3.29-5.38) versus 3.47 (2.64-4.40) μg/day), serum vitamin B-12 did not differ between the two sexes (mean ± standard deviation (SD): 275 ± 104 pmol/L versus 290 ± 113 pmol/L). Higher intakes of dairy, meat, and fish and shellfish were significantly associated with higher serum vitamin B-12 concentrations, with meat and dairy (predominantly milk) being the most potent sources. Egg intake did not significantly contribute to higher serum vitamin B-12 concentrations. Thus, dairy and meat were the most important contributors to serum vitamin B-12, followed by fish and shellfish.
Underwater noise from human activities appears to be rising, with ramifications for acoustically sensitive marine organisms and the functioning of marine ecosystems. Policymakers are beginning to address the risk of ecological impact, but are constrained by a lack of data on current and historic noise levels. Here, we present the first nationally coordinated effort to quantify underwater noise levels, in support of UK policy objectives under the EU Marine Strategy Framework Directive (MSFD). Field measurements were made during 2013-2014 at twelve sites around the UK. Median noise levels ranged from 81.5 to 95.5 dB re 1 μPa for one-third octave bands from 63 to 500 Hz. Noise exposure varied considerably, ranging from little anthropogenic influence at the Celtic Sea site to persistent vessel noise at several North Sea sites. Comparison of acoustic metrics found that the RMS level (conventionally used to represent the mean) was highly skewed by outliers, exceeding the 97th percentile at some frequencies. We conclude that environmental indicators of anthropogenic noise should instead use percentiles, to ensure statistical robustness. Power analysis indicated that at least three decades of continuous monitoring would be required to detect trends of similar magnitude to historic rises in noise levels observed in the Northeast Pacific.
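The skew the authors describe arises because sound pressure levels are logarithmic: the RMS level averages on the linear (mean-square-pressure) scale and then converts back to decibels, so a few loud events dominate it, while the median is robust. A small illustration with synthetic level data (the values are invented, not the UK measurements):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic one-third-octave band levels: quiet ambient noise around 85 dB
# plus occasional close vessel passages around 115 dB (values illustrative).
ambient = rng.normal(85.0, 2.0, size=990)
passages = rng.normal(115.0, 2.0, size=10)
levels_db = np.concatenate([ambient, passages])

# RMS level: average on the linear mean-square-pressure scale, then back to dB.
rms_db = 10.0 * np.log10(np.mean(10.0 ** (levels_db / 10.0)))

median_db = np.median(levels_db)
p97_db = np.percentile(levels_db, 97)

print(f"median {median_db:.1f} dB, 97th pct {p97_db:.1f} dB, RMS {rms_db:.1f} dB")
```

With only 1% of samples being vessel passages, the RMS level lands well above the 97th percentile of the observed levels, reproducing the counterintuitive behaviour the abstract reports and motivating percentile-based indicators.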
AIM: To describe and test a new technique to obtain midstream urine samples in newborns. DESIGN AND METHODS: This was a prospective feasibility and safety study conducted in the neonatal unit of University Infanta Sofía Hospital, Madrid. A new technique based on bladder and lumbar stimulation manoeuvres was tested over a period of 4 months in 80 admitted patients aged less than 30 days. The main variable was the success rate in obtaining a midstream urine sample within 5 min. Secondary variables were time to obtain the sample and complications. RESULTS: This technique was successful in 86.3% of infants. Median time to sample collection was 45 s (IQR 30). No complications other than controlled crying were observed. CONCLUSIONS: A new, quick and safe technique with a high success rate is described, whereby the discomfort and waste of time usually associated with bag collection methods can be avoided.
CMAJ: Canadian Medical Association Journal
BACKGROUND: A hip fracture causes bleeding, pain and immobility, and initiates inflammatory, hypercoagulable, catabolic and stress states. Accelerated surgery may improve outcomes by reducing the duration of these states and immobility. We undertook a pilot trial to determine the feasibility of a trial comparing accelerated care (i.e., rapid medical clearance and surgery) and standard care among patients with a hip fracture. METHODS: Patients aged 45 years or older who, during weekday, daytime working hours, received a diagnosis of a hip fracture requiring surgery were randomly assigned to receive accelerated or standard care. Our feasibility outcomes included the proportion of eligible patients randomly assigned, completeness of follow-up, and timeliness of accelerated surgery. The main clinical outcome, assessed by data collectors and adjudicators who were unaware of study group allocations, was a major perioperative complication (i.e., a composite of death, preoperative myocardial infarction, myocardial injury after noncardiac surgery, pulmonary embolism, pneumonia, stroke, and life-threatening or major bleeding) within 30 days of randomization. RESULTS: Of patients eligible for inclusion, 80% consented and were randomly assigned to groups (30 to accelerated care and 30 to standard care) at 2 centres in Canada and 1 centre in India. All patients completed 30-day follow-up. The median time from diagnosis to surgery was 6.0 hours in the accelerated care group and 24.2 hours in the standard care group (p < 0.001). A major perioperative complication occurred in 9 (30%) of the patients in the accelerated care group and 14 (47%) of the patients in the standard care group (hazard ratio 0.60, 95% confidence interval 0.26-1.39). INTERPRETATION: These results show the feasibility of a trial comparing accelerated and standard care among patients with hip fracture and support a definitive trial. Trial registration: ClinicalTrials.gov, no. NCT01344343.
BACKGROUND: Intrathecal baclofen (ITB) is an effective therapy for spasticity and dystonia in pediatric populations; however, there are associated infectious complications. METHODS: Patients who had an initial ITB device implanted at our center were followed to determine the proportion of patients with infectious and non-infectious complications, identify risk factors for infection and describe the clinical presentations, treatment and outcomes of infectious complications. RESULTS: Over the 15-year study period, 139 patients had an initial ITB device placed. The mean age at placement was 13.6 years (range, 6 months to 41 years). In the first year of follow-up, 83% had no complications or secondary procedures, 17% had at least one secondary procedure and 5% had an infectious complication. The median time until infection was 14 days (mean 33 ± 42 days). Patients with secondary spasticity or dystonia were more likely to have infections than patients with cerebral palsy (86% vs. 14%; p<0.0001). In the 94 patients with a first secondary procedure, 29% had at least one other procedure and 8% had an infection in the one-year follow-up. Overall, 24 patients had 27 infections; 22% were superficial, 33% deep and 45% organ space. Staphylococcus aureus was isolated in 50% of those with cultures obtained. Explantation was required in 59% of patients with an infection and differed by infection type: superficial (17%), deep (44%) and organ space (92%) (p=0.004). CONCLUSIONS: Infectious complications were relatively uncommon; however, when present, they frequently led to explantation of the ITB pump device.
Objective: We aimed to determine if previously identified adult obesity susceptibility loci were associated uniformly with childhood BMI across the BMI distribution. Design and Methods: Children were recruited through the Children’s Hospital of Philadelphia (n=7225). Associations between the following loci and BMI were assessed using quantile regression: FTO (rs3751812), MC4R (rs12970134), TMEM18 (rs2867125), BDNF (rs6265), TNNI3K (rs1514175), NRXN3 (rs10146997), SEC16B (rs10913469), and GNPDA2 (rs13130484). BMI z-score (age and gender adjusted) was modeled as the dependent variable, and genotype risk score (sum of risk alleles carried at the 8 loci) was modeled as the independent variable. Results: Each additional increase in genotype risk score was associated with an increase in BMI z-score at the 5th, 15th, 25th, 50th, 75th, 85th and 95th BMI z-score percentiles by 0.04 (±0.02, p=0.08), 0.07 (±0.01, p=9.58 × 10⁻⁷), 0.07 (±0.01, p=1.10 × 10⁻⁸), 0.09 (±0.01, p=3.13 × 10⁻²²), 0.11 (±0.01, p=1.35 × 10⁻²⁵), 0.11 (±0.01, p=1.98 × 10⁻²⁰), and 0.06 (±0.01, p=2.44 × 10⁻⁶), respectively. Each additional increase in genotype risk score was associated with an increase in mean BMI z-score by 0.08 (±0.01, p=4.27 × 10⁻²⁰). Conclusion: Obesity risk alleles were more strongly associated with increases in BMI z-score at the upper tail compared to the lower tail of the distribution.
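Quantile regression fits a separate slope at each percentile of the outcome by minimising the "pinball" (tilted absolute) loss rather than squared error. The study used established statistical software; the dependency-free sketch below, on simulated data (not the CHOP cohort, and with invented effect sizes), shows how the risk-score slope can differ between the tails of the outcome distribution:

```python
import numpy as np

rng = np.random.default_rng(1)

def pinball_loss(y, pred, q):
    """Tilted absolute loss; its minimiser is the q-th conditional quantile."""
    r = y - pred
    return np.mean(np.where(r >= 0, q * r, (q - 1) * r))

def quantile_slope(x, y, q, slopes=np.linspace(-0.2, 0.4, 241)):
    """Grid-search a line a + b*x minimising pinball loss at quantile q.

    Crude but self-contained; real analyses would use a proper
    quantile-regression solver (e.g. statsmodels QuantReg).
    """
    best = (np.inf, 0.0)
    for b in slopes:
        # For a fixed slope, the optimal intercept under pinball loss
        # is the q-quantile of the residuals y - b*x.
        a = np.quantile(y - b * x, q)
        loss = pinball_loss(y, a + b * x, q)
        if loss < best[0]:
            best = (loss, b)
    return best[1]

# Simulate: risk score 0-16; the per-allele effect on BMI z-score is larger
# in the upper tail of the noise, mimicking the pattern in the abstract.
n = 5000
score = rng.integers(0, 17, size=n)
noise = rng.normal(0.0, 1.0, size=n)
bmi_z = 0.05 * score + 0.06 * score * (noise > 1.0) + noise

b05 = quantile_slope(score, bmi_z, 0.05)
b95 = quantile_slope(score, bmi_z, 0.95)
print(f"slope at 5th pct {b05:.2f}, at 95th pct {b95:.2f}")
```

By construction the simulated 5th-percentile slope is near 0.05 and the 95th-percentile slope near 0.11, so the fitted tail slopes diverge just as the reported per-allele estimates do (0.04 at the 5th percentile vs 0.11 at the 75th-85th); ordinary least squares would report only a single averaged slope.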