Background
Venous disease is the most common cause of leg ulceration. Although compression therapy improves venous ulcer healing, it does not treat the underlying causes of venous hypertension. Treatment of superficial venous reflux has been shown to reduce the rate of ulcer recurrence, but the effect of early endovenous ablation of superficial venous reflux on ulcer healing remains unclear.

Methods
In a trial conducted at 20 centers in the United Kingdom, we randomly assigned 450 patients with venous leg ulcers to receive compression therapy and undergo early endovenous ablation of superficial venous reflux within 2 weeks after randomization (early-intervention group) or to receive compression therapy alone, with consideration of endovenous ablation deferred until after the ulcer was healed or until 6 months after randomization if the ulcer was unhealed (deferred-intervention group). The primary outcome was the time to ulcer healing. Secondary outcomes were the rate of ulcer healing at 24 weeks, the rate of ulcer recurrence, the length of time free from ulcers (ulcer-free time) during the first year after randomization, and patient-reported health-related quality of life.

Results
Patient and clinical characteristics at baseline were similar in the two treatment groups. The time to ulcer healing was shorter in the early-intervention group than in the deferred-intervention group; more patients had healed ulcers with early intervention (hazard ratio for ulcer healing, 1.38; 95% confidence interval [CI], 1.13 to 1.68; P=0.001). The median time to ulcer healing was 56 days (95% CI, 49 to 66) in the early-intervention group and 82 days (95% CI, 69 to 92) in the deferred-intervention group. The rate of ulcer healing at 24 weeks was 85.6% in the early-intervention group and 76.3% in the deferred-intervention group.
The median ulcer-free time during the first year after trial enrollment was 306 days (interquartile range, 240 to 328) in the early-intervention group and 278 days (interquartile range, 175 to 324) in the deferred-intervention group (P=0.002). The most common procedural complications of endovenous ablation were pain and deep-vein thrombosis.

Conclusions
Early endovenous ablation of superficial venous reflux resulted in faster healing of venous leg ulcers and more time free from ulcers than deferred endovenous ablation. (Funded by the National Institute for Health Research Health Technology Assessment Program; EVRA Current Controlled Trials number, ISRCTN02335796.)
Background
The level of anticoagulation in response to a fixed-dose regimen of warfarin is difficult to predict during the initiation of therapy. We prospectively compared the effect of genotype-guided dosing with that of standard dosing on anticoagulation control in patients starting warfarin therapy.

Methods
We conducted a multicenter, randomized, controlled trial involving patients with atrial fibrillation or venous thromboembolism. Genotyping for CYP2C9*2, CYP2C9*3, and VKORC1 (-1639G→A) was performed with the use of a point-of-care test. For patients assigned to the genotype-guided group, warfarin doses were prescribed according to pharmacogenetic-based algorithms for the first 5 days. Patients in the control (standard dosing) group received a 3-day loading-dose regimen. After the initiation period, the treatment of all patients was managed according to routine clinical practice. The primary outcome measure was the percentage of time in the therapeutic range of 2.0 to 3.0 for the international normalized ratio (INR) during the first 12 weeks after warfarin initiation.

Results
A total of 455 patients were recruited, with 227 randomly assigned to the genotype-guided group and 228 assigned to the control group. The mean percentage of time in the therapeutic range was 67.4% in the genotype-guided group as compared with 60.3% in the control group (adjusted difference, 7.0 percentage points; 95% confidence interval, 3.3 to 10.6; P<0.001). There were significantly fewer incidences of excessive anticoagulation (INR ≥4.0) in the genotype-guided group. The median time to reach a therapeutic INR was 21 days in the genotype-guided group as compared with 29 days in the control group (P<0.001).

Conclusions
Pharmacogenetic-based dosing was associated with a higher percentage of time in the therapeutic INR range than was standard dosing during the initiation of warfarin therapy.
(Funded by the European Commission Seventh Framework Programme and others; ClinicalTrials.gov number, NCT01119300.)
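The primary outcome above, percentage of time in the therapeutic INR range (TTR), is conventionally computed with the Rosendaal method: INR values are interpolated linearly between clinic visits, one value per day, and TTR is the fraction of days falling inside the 2.0 to 3.0 window. The abstract does not spell out the trial's exact algorithm, so the sketch below illustrates the conventional approach with hypothetical visit data.

```python
# Rosendaal-style time-in-therapeutic-range (TTR): interpolate the
# INR linearly between visits (one value per day) and report the
# fraction of days inside the therapeutic window. This is a sketch
# of the conventional method, not necessarily the trial's exact
# algorithm; the visit data below are hypothetical.

def ttr(days, inrs, low=2.0, high=3.0):
    """days: visit days in ascending order; inrs: INR at each visit."""
    in_range = 0
    total = 0
    for (d0, i0), (d1, i1) in zip(zip(days, inrs), zip(days[1:], inrs[1:])):
        span = d1 - d0
        for step in range(span):
            inr = i0 + (i1 - i0) * step / span  # daily interpolated INR
            total += 1
            if low <= inr <= high:
                in_range += 1
    return in_range / total if total else 0.0

# Hypothetical 12-week INR series for one patient:
print(ttr([0, 7, 21, 42, 84], [1.2, 2.4, 3.4, 2.6, 2.5]))  # 0.75
```

With this hypothetical series, 63 of 84 interpolated days fall inside the window, giving a TTR of 75%; the trial reported group means of 67.4% and 60.3% over the first 12 weeks.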
BACKGROUND: Paralysis or amputation of an arm results in the loss of the ability to orient the hand and grasp, manipulate, and carry objects, functions that are essential for activities of daily living. Brain-machine interfaces could provide a solution to restoring many of these lost functions. We therefore tested whether an individual with tetraplegia could rapidly achieve neurological control of a high-performance prosthetic limb using this type of interface.

METHODS: We implanted two 96-channel intracortical microelectrodes in the motor cortex of a 52-year-old individual with tetraplegia. Brain-machine-interface training was done for 13 weeks with the goal of controlling an anthropomorphic prosthetic limb with seven degrees of freedom (three-dimensional translation, three-dimensional orientation, one-dimensional grasping). The participant's ability to control the prosthetic limb was assessed with clinical measures of upper limb function. This study is registered with ClinicalTrials.gov, NCT01364480.

FINDINGS: The participant was able to move the prosthetic limb freely in the three-dimensional workspace on the second day of training. After 13 weeks, robust seven-dimensional movements were performed routinely. The mean success rate on target-based reaching tasks was 91·6% (SD 4·4) versus a median chance level of 6·2% (95% CI 2·0-15·3). Improvements were seen in completion time (decreased from a mean of 148 s [SD 60] to 112 s) and path efficiency (increased from 0·30 [0·04] to 0·38 [0·02]). The participant was also able to use the prosthetic limb to do skilful and coordinated reach and grasp movements that resulted in clinically significant gains in tests of upper limb function. No adverse events were reported.

INTERPRETATION: With continued development of neuroprosthetic limbs, individuals with long-term paralysis could recover the natural and intuitive command signals for hand placement, orientation, and reaching, allowing them to perform activities of daily living.
FUNDING: Defense Advanced Research Projects Agency, National Institutes of Health, Department of Veterans Affairs, and UPMC Rehabilitation Institute.
Background
Recombinant human tripeptidyl peptidase 1 (cerliponase alfa) is an enzyme-replacement therapy that has been developed to treat neuronal ceroid lipofuscinosis type 2 (CLN2) disease, a rare lysosomal disorder that causes progressive dementia in children.

Methods
In a multicenter, open-label study, we evaluated the effect of intraventricular infusion of cerliponase alfa every 2 weeks in children with CLN2 disease who were between the ages of 3 and 16 years. Treatment was initiated at a dose of 30 mg, 100 mg, or 300 mg; all the patients then received the 300-mg dose for at least 96 weeks. The primary outcome was the time until a 2-point decline in the score on the motor and language domains of the CLN2 Clinical Rating Scale (which ranges from 0 to 6, with 0 representing no function and 3 representing normal function in each of the two domains), which was compared with the rate of decline in 42 historical controls. We also compared the rate of decline in the motor-language score between the two groups, using data from baseline to the last assessment with a score of more than 0, divided by the length of follow-up (in units of 48 weeks).

Results
Twenty-four patients were enrolled, 23 of whom constituted the efficacy population. The median time until a 2-point decline in the motor-language score was not reached for treated patients and was 345 days for historical controls. The mean (±SD) unadjusted rate of decline in the motor-language score per 48-week period was 0.27±0.35 points in treated patients and 2.12±0.98 points in 42 historical controls (mean difference, 1.85; P<0.001). Common adverse events included convulsions, pyrexia, vomiting, hypersensitivity reactions, and failure of the intraventricular device. In 2 patients, infections developed in the intraventricular device that was used to administer the infusion, which required antibiotic treatment and device replacement.
Conclusions
Intraventricular infusion of cerliponase alfa in patients with CLN2 disease resulted in less decline in motor and language function than that in historical controls. Serious adverse events included failure of the intraventricular device and device-related infections. (Funded by BioMarin Pharmaceutical and others; CLN2 ClinicalTrials.gov numbers, NCT01907087 and NCT02485899.)
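The Methods define the rate-of-decline outcome explicitly: the drop in the motor-language score from baseline to the last assessment with a score above 0, divided by the length of follow-up in 48-week units. A minimal sketch of that computation, using hypothetical assessment data:

```python
# Rate of motor-language decline per 48-week period, as defined in
# the Methods: score drop from baseline to the last assessment with
# a score above 0, divided by follow-up time in 48-week units.
# The assessment data below are hypothetical.

def decline_rate(assessments):
    """assessments: (week, score) tuples in ascending week order."""
    week0, score0 = assessments[0]
    positives = [(w, s) for w, s in assessments if s > 0]
    week_last, score_last = positives[-1]  # last assessment with score > 0
    follow_up = (week_last - week0) / 48   # follow-up in 48-week units
    return (score0 - score_last) / follow_up

# Hypothetical patient scored every 24 weeks on the 0-6 scale:
print(round(decline_rate([(0, 6), (24, 5), (48, 4), (72, 2), (96, 0)]), 2))  # 2.67
```

Note that the week-96 assessment (score 0) is excluded by the definition, so follow-up here runs only to week 72; this hypothetical patient's rate of 2.67 points per 48 weeks is in the range reported for the historical controls (2.12±0.98).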
Background
We observed an apparent increase in the rate of device thrombosis among patients who received the HeartMate II left ventricular assist device, as compared with preapproval clinical-trial results and initial experience. In a multi-institutional study, we investigated the occurrence of pump thrombosis, elevated lactate dehydrogenase (LDH) levels presaging thrombosis (and associated hemolysis), and outcomes of different management strategies.

Methods
We obtained data from 837 patients at three institutions, where 895 devices were implanted from 2004 through mid-2013; the mean (±SD) age of the patients was 55±14 years. The primary end point was confirmed pump thrombosis. Secondary end points were confirmed and suspected thrombosis, longitudinal LDH levels, and outcomes after pump thrombosis.

Results
A total of 72 pump thromboses were confirmed in 66 patients; an additional 36 thromboses in unique devices were suspected. Starting in approximately March 2011, the occurrence of confirmed pump thrombosis at 3 months after implantation increased from 2.2% (95% confidence interval [CI], 1.5 to 3.4) to 8.4% (95% CI, 5.0 to 13.9) by January 1, 2013. Before March 1, 2011, the median time from implantation to thrombosis was 18.6 months (95% CI, 0.5 to 52.7), and from March 2011 onward, it was 2.7 months (95% CI, 0.0 to 18.6). The occurrence of elevated LDH levels within 3 months after implantation mirrored that of thrombosis. Thrombosis was presaged by LDH levels that more than doubled, from 540 IU per liter to 1490 IU per liter, within the weeks before diagnosis.
Thrombosis was managed by heart transplantation in 11 patients (1 patient died 31 days after transplantation) and by pump replacement in 21, with mortality equivalent to that among patients without thrombosis; among 40 thromboses in 40 patients who did not undergo transplantation or pump replacement, actuarial mortality was 48.2% (95% CI, 31.6 to 65.2) in the ensuing 6 months after pump thrombosis.

Conclusions
The rate of pump thrombosis related to the use of the HeartMate II has been increasing at our centers and is associated with substantial morbidity and mortality.
BACKGROUND: We sought to characterise the frequency, health outcomes and economic consequences of diagnostic errors in the USA through analysis of closed, paid malpractice claims.

METHODS: We analysed diagnosis-related claims from the National Practitioner Data Bank (1986-2010). We describe error type, outcome severity and payments (in 2011 US dollars), comparing diagnostic errors to other malpractice allegation groups and inpatient to outpatient cases within diagnostic errors.

RESULTS: We analysed 350 706 paid claims. Diagnostic errors (n=100 249) were the leading type (28.6%) and accounted for the highest proportion of total payments (35.2%). The most frequent outcomes were death, significant permanent injury, major permanent injury and minor permanent injury. Diagnostic errors more often resulted in death than other allegation groups (40.9% vs 23.9%, p<0.001) and were the leading cause of claims-associated death and disability. More diagnostic error claims were outpatient than inpatient (68.8% vs 31.2%, p<0.001), but inpatient diagnostic errors were more likely to be lethal (48.4% vs 36.9%, p<0.001). The inflation-adjusted, 25-year sum of diagnosis-related payments was US$38.8 billion (mean per-claim payout US$386 849; median US$213 250; IQR US$74 545-484 500). Per-claim payments for permanent, serious morbidity that was 'quadriplegic, brain damage, lifelong care' (4.5%; mean US$808 591; median US$564 300), 'major' (13.3%; mean US$568 599; median US$355 350), or 'significant' (16.9%; mean US$419 711; median US$269 255) exceeded those where the outcome was death (40.9%; mean US$390 186; median US$251 745).

CONCLUSIONS: Among malpractice claims, diagnostic errors appear to be the most common, most costly and most dangerous of medical mistakes. We found roughly equal numbers of lethal and non-lethal errors in our analysis, suggesting that the public health burden of diagnostic errors could be twice that previously estimated.
Healthcare stakeholders should consider diagnostic safety a critical health policy issue.
Low vitamin B-12 concentrations are frequently observed among older adults. Malabsorption is hypothesized to be an important cause of vitamin B-12 inadequacy, but serum vitamin B-12 may also be differently affected by vitamin B-12 intake depending on the food source. We examined associations between dietary sources of vitamin B-12 (meat, fish and shellfish, eggs, dairy) and serum vitamin B-12, using cross-sectional data from 600 Dutch community-dwelling adults (≥65 years). Dietary intake was assessed with a validated food frequency questionnaire. Vitamin B-12 concentrations were measured in serum. Associations were studied over tertiles of vitamin B-12 intake, using P-for-trend tests, prevalence ratios (PRs), and splines. Whereas men had significantly higher vitamin B-12 intakes than women (median (25th-75th percentile): 4.18 (3.29-5.38) versus 3.47 (2.64-4.40) μg/day), serum vitamin B-12 did not differ between the two sexes (mean ± standard deviation (SD): 275 ± 104 pmol/L versus 290 ± 113 pmol/L). Higher intakes of dairy, meat, and fish and shellfish were significantly associated with higher serum vitamin B-12 concentrations, with meat and dairy (predominantly milk) being the most potent sources. Egg intake did not contribute significantly to higher serum vitamin B-12 concentrations. Thus, dairy and meat were the most important contributors to serum vitamin B-12, followed by fish and shellfish.
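The tertile and prevalence-ratio part of the analysis above can be sketched as follows. The intake values, low-serum-B-12 flags, and the choice of cut-offs below are all hypothetical (the paper's exact thresholds and its spline and P-for-trend modeling are not reproduced); the PR here is simply the prevalence of low serum B-12 in the lowest intake tertile divided by that in the highest.

```python
# Sketch of a tertile / prevalence-ratio (PR) analysis: split intake
# into tertiles, then compare the prevalence of low serum B-12
# between the lowest and highest tertiles. All data below are
# hypothetical; the paper's spline and P-for-trend modeling are
# not reproduced here.

def tertile(value, sorted_values):
    """Return 0, 1, or 2 for the intake tertile `value` falls into."""
    n = len(sorted_values)
    t1, t2 = sorted_values[n // 3], sorted_values[2 * n // 3]
    return 0 if value < t1 else (1 if value < t2 else 2)

def prevalence_ratio(intakes, low_b12_flags):
    """PR of low serum B-12: lowest vs highest intake tertile."""
    cuts = sorted(intakes)
    groups = {0: [], 1: [], 2: []}
    for intake, flag in zip(intakes, low_b12_flags):
        groups[tertile(intake, cuts)].append(flag)
    prevalence = {t: sum(g) / len(g) for t, g in groups.items()}
    return prevalence[0] / prevalence[2]

intakes = [2.1, 2.8, 3.0, 3.5, 3.9, 4.2, 4.6, 5.0, 5.4]  # μg/day
low_b12 = [1, 1, 0, 1, 0, 0, 1, 0, 0]  # 1 = low serum B-12
print(round(prevalence_ratio(intakes, low_b12), 2))
```

In this toy example, low serum B-12 is twice as prevalent in the lowest intake tertile as in the highest (PR = 2.0), the direction of association the study reports for dairy, meat, and fish and shellfish.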
AIM: To describe and test a new technique to obtain midstream urine samples in newborns. DESIGN AND METHODS: This was a prospective feasibility and safety study conducted in the neonatal unit of University Infanta Sofía Hospital, Madrid. A new technique based on bladder and lumbar stimulation manoeuvres was tested over a period of 4 months in 80 admitted patients aged less than 30 days. The main variable was the success rate in obtaining a midstream urine sample within 5 min. Secondary variables were time to obtain the sample and complications. RESULTS: This technique was successful in 86.3% of infants. Median time to sample collection was 45 s (IQR 30). No complications other than controlled crying were observed. CONCLUSIONS: A new, quick and safe technique with a high success rate is described, whereby the discomfort and waste of time usually associated with bag collection methods can be avoided.
Underwater noise from human activities appears to be rising, with ramifications for acoustically sensitive marine organisms and the functioning of marine ecosystems. Policymakers are beginning to address the risk of ecological impact, but are constrained by a lack of data on current and historic noise levels. Here, we present the first nationally coordinated effort to quantify underwater noise levels, in support of UK policy objectives under the EU Marine Strategy Framework Directive (MSFD). Field measurements were made during 2013-2014 at twelve sites around the UK. Median noise levels ranged from 81.5 to 95.5 dB re 1 μPa for one-third octave bands from 63 to 500 Hz. Noise exposure varied considerably, from little anthropogenic influence at the Celtic Sea site to several North Sea sites with persistent vessel noise. Comparison of acoustic metrics found that the RMS level (conventionally used to represent the mean) was highly skewed by outliers, exceeding the 97th percentile at some frequencies. We conclude that environmental indicators of anthropogenic noise should instead use percentiles, to ensure statistical robustness. Power analysis indicated that at least three decades of continuous monitoring would be required to detect trends of similar magnitude to the historic rises in noise levels observed in the Northeast Pacific.
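The point about RMS versus percentile metrics can be illustrated numerically: the RMS is averaged on the linear (squared-pressure) scale before converting back to decibels, so a handful of loud vessel passages dominates it, while the median of the dB values barely moves. The sample values below are hypothetical.

```python
import math

# Why RMS levels are skewed by loud outliers while the median is
# robust: RMS is computed on linear pressure (so a few loud vessel
# passages dominate), whereas the median is taken on the dB values
# directly. The sample data below are hypothetical.

def db_to_pressure(db):
    return 10 ** (db / 20)  # dB re 1 μPa -> linear pressure

def pressure_to_db(p):
    return 20 * math.log10(p)

def rms_level_db(levels_db):
    """RMS sound pressure level: average on the squared-pressure scale."""
    pressures = [db_to_pressure(d) for d in levels_db]
    mean_square = sum(p * p for p in pressures) / len(pressures)
    return pressure_to_db(math.sqrt(mean_square))

def median_db(levels_db):
    s = sorted(levels_db)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

# 95 quiet ambient samples at 85 dB plus 5 vessel passages at 120 dB:
samples = [85.0] * 95 + [120.0] * 5
print(round(median_db(samples), 1))     # 85.0 -- robust to outliers
print(round(rms_level_db(samples), 1))  # 107.0 -- pulled up ~22 dB
```

Here just five loud passages lift the RMS roughly 22 dB above the median, which is why the authors recommend percentile-based indicators for monitoring.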
- CMAJ: Canadian Medical Association Journal (Journal de l'Association médicale canadienne)
BACKGROUND: A hip fracture causes bleeding, pain and immobility, and initiates inflammatory, hypercoagulable, catabolic and stress states. Accelerated surgery may improve outcomes by reducing the duration of these states and of immobility. We undertook a pilot trial to determine the feasibility of a trial comparing accelerated care (i.e., rapid medical clearance and surgery) and standard care among patients with a hip fracture.

METHODS: Patients aged 45 years or older who, during weekday, daytime working hours, received a diagnosis of a hip fracture requiring surgery were randomly assigned to receive accelerated or standard care. Our feasibility outcomes included the proportion of eligible patients randomly assigned, completeness of follow-up and timeliness of accelerated surgery. The main clinical outcome, assessed by data collectors and adjudicators who were unaware of study group allocations, was a major perioperative complication (i.e., a composite of death, preoperative myocardial infarction, myocardial injury after noncardiac surgery, pulmonary embolism, pneumonia, stroke, and life-threatening or major bleeding) within 30 days of randomization.

RESULTS: Of patients eligible for inclusion, 80% consented and were randomly assigned to groups (30 to accelerated care and 30 to standard care) at 2 centres in Canada and 1 centre in India. All patients completed 30-day follow-up. The median time from diagnosis to surgery was 6.0 hours in the accelerated care group and 24.2 hours in the standard care group (p < 0.001). A major perioperative complication occurred in 9 (30%) of the patients in the accelerated care group and 14 (47%) of the patients in the standard care group (hazard ratio 0.60, 95% confidence interval 0.26-1.39).

INTERPRETATION: These results show the feasibility of a trial comparing accelerated and standard care among patients with hip fracture and support a definitive trial. Trial registration: ClinicalTrials.gov, no. NCT01344343.