Concept: Blood transfusion
Jean-Pierre Allain and colleagues argue that, although unintended, the foreign aid provided for blood transfusion services in sub-Saharan Africa has produced serious negative outcomes, a situation that calls for reflection and rethinking.
Various hydroxyethyl starch (HES) preparations have been used for decades to augment blood volume. There has been concern recently regarding possible adverse outcomes when using HES in the intensive care setting, especially in patients with septic shock. However, the pharmacokinetic and pharmacodynamic properties of HES preparations depend on their chemical composition and source material. Thus, different clinical conditions could result in differing effectiveness and safety for these preparations. Consequently, we assessed the safety of tetrastarches when used during surgery, using a formal search that yielded 59 primary full publications of studies meeting a priori inclusion criteria; these studies randomly allocated 4529 patients, with 2139 patients treated with tetrastarch and 2390 patients treated with a comparator. There were no indications that the use of tetrastarches during surgery induces adverse renal effects, as assessed by change in or absolute concentrations of serum creatinine or need for renal replacement therapy (39 trials, 3389 patients); increased blood loss (38 trials, 3280 patients); allogeneic erythrocyte transfusion (20 trials, 2151 patients; odds ratio for HES transfusion 0.73 [95% confidence interval = 0.61-0.87], P = 0.0005); or increased mortality (odds ratio for HES mortality = 0.51 [0.24-1.05], P = 0.079).
INTRODUCTION: We previously derived and validated the AIMS65 score, a mortality prognostic scale for upper GI bleeding (UGIB). OBJECTIVE: To validate the AIMS65 score in a different patient population and compare it with the Glasgow-Blatchford risk score (GBRS). DESIGN: Retrospective cohort study. PATIENTS: Adults with a primary diagnosis of UGIB. MAIN OUTCOME MEASUREMENTS: Primary outcome: inpatient mortality. Secondary outcomes: composite clinical endpoint of inpatient mortality, rebleeding, and endoscopic, radiologic, or surgical intervention; blood transfusion; intensive care unit admission; rebleeding; length of stay; timing of endoscopy. The area under the receiver-operating characteristic curve (AUROC) was calculated for each score. RESULTS: Of the 278 study patients, 6.5% died and 35% experienced the composite clinical endpoint. The AIMS65 score was superior in predicting inpatient mortality (AUROC, 0.93 vs 0.68; P < .001), whereas the GBRS was superior in predicting blood transfusions (AUROC, 0.85 vs 0.65; P < .01). The 2 scores were similar in predicting the composite clinical endpoint (AUROC, 0.62 vs 0.68; P = .13) as well as the secondary outcomes. A GBRS of 10 and 12 or more maximized the sum of the sensitivity and specificity for inpatient mortality and rebleeding, respectively. The cutoff was 2 or more for the AIMS65 score for both outcomes. LIMITATIONS: Retrospective, single-center study. CONCLUSION: The AIMS65 score is superior to the GBRS in predicting inpatient mortality from UGIB, whereas the GBRS is superior for predicting blood transfusion. Both scores are similar in predicting the composite clinical endpoint and other outcomes in clinical care and resource use.
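The cutoff selection described above (choosing the score threshold that maximizes the sum of sensitivity and specificity, i.e. Youden's J statistic) can be sketched as follows. This is a minimal illustration of the technique, not the study's code; the scores and outcomes below are hypothetical toy data.

```python
# Hedged sketch: find the risk-score cutoff maximizing sensitivity + specificity
# (Youden's J), as done for the AIMS65 and GBRS cutoffs. Data are hypothetical.
scores = [0, 1, 1, 2, 2, 3, 4, 5]   # risk score per patient (hypothetical)
died   = [0, 0, 0, 1, 0, 1, 1, 1]   # outcome: 1 = inpatient death

def youden_cutoff(scores, outcomes):
    """Return the cutoff c such that classifying score >= c as positive
    maximizes J = sensitivity + specificity - 1."""
    pos = sum(outcomes)
    neg = len(outcomes) - pos
    best_cut, best_j = None, -1.0
    for cut in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, outcomes) if s >= cut and y == 1)
        tn = sum(1 for s, y in zip(scores, outcomes) if s < cut and y == 0)
        j = tp / pos + tn / neg - 1
        if j > best_j:
            best_cut, best_j = cut, j
    return best_cut

print(youden_cutoff(scores, died))
```

On this toy data the optimal cutoff happens to be 2; in practice the same computation is usually done over the thresholds of an ROC curve.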
Background Randomized, controlled trials have suggested that the transfusion of blood after prolonged storage does not increase the risk of adverse outcomes among patients, although most of these trials were restricted to high-risk populations and were not powered to detect small but clinically important differences in mortality. We sought to determine whether the duration of blood storage affects mortality after transfusion in a general population of hospitalized patients. Methods In this pragmatic, randomized, controlled trial conducted at six hospitals in four countries, we randomly assigned patients who required a red-cell transfusion to receive blood that had been stored for the shortest duration (short-term storage group) or the longest duration (long-term storage group) in a 1:2 ratio. Only patients with type A or O blood were included in the primary analysis, since pilot data suggested that our goal of achieving a difference in the mean duration of blood storage of at least 10 days would not be possible with other blood types. Written informed consent was waived because all the patients received treatment consistent with the current standard of care. The primary outcome was in-hospital mortality, which was estimated by means of a logistic-regression model after adjustment for study center and patient blood type. Results From April 2012 through October 2015, a total of 31,497 patients underwent randomization. Of these patients, 6761 who did not meet all the enrollment criteria were excluded after randomization. The primary analysis included 20,858 patients with type A or O blood. Of these patients, 6936 were assigned to the short-term storage group and 13,922 to the long-term storage group. The mean storage duration was 13.0 days in the short-term storage group and 23.6 days in the long-term storage group.
There were 634 deaths (9.1%) in the short-term storage group and 1213 (8.7%) in the long-term storage group (odds ratio, 1.05; 95% confidence interval [CI], 0.95 to 1.16; P=0.34). When the analysis was expanded to include the 24,736 patients with any blood type, the results were similar, with rates of death of 9.1% and 8.8%, respectively (odds ratio, 1.04; 95% CI, 0.95 to 1.14; P=0.38). Additional results were consistent in three prespecified high-risk subgroups (patients undergoing cardiovascular surgery, those admitted to intensive care, and those with cancer). Conclusions Among patients in a general hospital population, there was no significant difference in the rate of death among those who underwent transfusion with the freshest available blood and those who underwent transfusion according to the standard practice of transfusing the oldest available blood. (Funded by the Canadian Institutes of Health Research and others; INFORM Current Controlled Trials number, ISRCTN08118744 .).
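As a check on the reported estimate, the unadjusted odds ratio can be reconstructed from the published counts; note that the published value is adjusted for study center and blood type, so the crude figure only approximates it:

OR = [634/(6936 − 634)] / [1213/(13,922 − 1213)] = (634/6302) / (1213/12,709) ≈ 0.1006/0.0954 ≈ 1.05

which agrees with the reported odds ratio of 1.05.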
Myomectomy carries a risk of complications. To reduce this risk, medical pre-treatment can be applied to reduce fibroid size and thereby potentially decrease intra-operative blood loss, the need for blood transfusion, and the risk of emergency hysterectomy. The aim of this systematic review and meta-analysis is to study the effectiveness of medical pre-treatment with gonadotropin-releasing hormone agonists (GnRHa) or ulipristal acetate prior to laparoscopic or laparotomic myomectomy on intra-operative and post-operative outcomes.
BACKGROUND: Wuwei City has the highest prevalence of hepatitis B virus (HBV) in China. From 2007 to 2011, the average reported incidence rate of hepatitis B was 634.56/100,000 people. However, studies assessing the epidemic features and risk factors of HCV in the general population of Wuwei City are limited. METHODS: A total of 7189 people were interviewed and screened for HCV antibodies. HCV RNA and HCV genotypes were analyzed by PCR. Relevant information was obtained from the general population using a standardized questionnaire, and association and logistic regression analyses were conducted. RESULTS: The anti-HCV prevalence was 1.64% (118/7189), and HCV RNA was detected in 37.29% (44/118) of the anti-HCV positive samples. The current HCV infection rate was 0.61% (44/7189) in the Wuwei general population. The hepatitis C infection rate was generally higher in the plains regions (χ² = 27.54, P < 0.05), and the most predominant HCV genotypes were 2a (59.1%) and 1b (34.1%). The concurrent HCV and HBV infection rate was 1.37%, and a history of blood transfusion (OR = 17.9, 95% CI: 6.1 to 52.6, P < 0.001) was an independent risk factor for HCV positivity. CONCLUSIONS: Although Wuwei is a highly endemic area for HBV, the anti-HCV positive rate in the general population is low. More than one-third of HCV-infected people were unaware of their infection; this may become an important risk factor for hepatitis C prevalence in the general population. Maintaining blood safety is important in order to help reduce the burden of HCV infection in developing regions of China.
Background It is uncertain whether the duration of red-cell storage affects mortality after transfusion among critically ill adults. Methods In an international, multicenter, randomized, double-blind trial, we assigned critically ill adults to receive either the freshest available, compatible, allogeneic red cells (short-term storage group) or standard-issue (oldest available), compatible, allogeneic red cells (long-term storage group). The primary outcome was 90-day mortality. Results From November 2012 through December 2016, at 59 centers in five countries, 4994 patients underwent randomization and 4919 (98.5%) were included in the primary analysis. Among the 2457 patients in the short-term storage group, the mean storage duration was 11.8 days. Among the 2462 patients in the long-term storage group, the mean storage duration was 22.4 days. At 90 days, there were 610 deaths (24.8%) in the short-term storage group and 594 (24.1%) in the long-term storage group (absolute risk difference, 0.7 percentage points; 95% confidence interval [CI], -1.7 to 3.1; P=0.57). At 180 days, the absolute risk difference was 0.4 percentage points (95% CI, -2.1 to 3.0; P=0.75). Most of the prespecified secondary measures showed no significant between-group differences in outcome. Conclusions The age of transfused red cells did not affect 90-day mortality among critically ill adults. (Funded by the Australian National Health and Medical Research Council and others; TRANSFUSE Australian and New Zealand Clinical Trials Registry number, ACTRN12612000453886 ; ClinicalTrials.gov number, NCT01638416 .).
Background In the wake of the recent outbreak of Ebola virus disease (EVD) in several African countries, the World Health Organization prioritized the evaluation of treatment with convalescent plasma derived from patients who have recovered from the disease. We evaluated the safety and efficacy of convalescent plasma for the treatment of EVD in Guinea. Methods In this nonrandomized, comparative study, 99 patients of various ages (including pregnant women) with confirmed EVD received two consecutive transfusions of 200 to 250 ml of ABO-compatible convalescent plasma, with each unit of plasma obtained from a separate convalescent donor. The transfusions were initiated on the day of diagnosis or up to 2 days later. The level of neutralizing antibodies against Ebola virus in the plasma was unknown at the time of administration. The control group comprised 418 patients who had been treated at the same center during the previous 5 months. The primary outcome was the risk of death during the period from 3 to 16 days after diagnosis, with adjustments for age and the baseline cycle-threshold value on polymerase-chain-reaction assay; patients who had died before day 3 were excluded. The clinically important difference was defined as an absolute reduction in mortality of 20 percentage points in the convalescent-plasma group as compared with the control group. Results A total of 84 patients who were treated with plasma were included in the primary analysis. At baseline, the convalescent-plasma group had slightly higher cycle-threshold values and a shorter duration of symptoms than did the control group, along with a higher frequency of eye redness and difficulty in swallowing. From day 3 to day 16 after diagnosis, the risk of death was 31% in the convalescent-plasma group and 38% in the control group (risk difference, -7 percentage points; 95% confidence interval [CI], -18 to 4).
The difference was reduced after adjustment for age and cycle-threshold value (adjusted risk difference, -3 percentage points; 95% CI, -13 to 8). No serious adverse reactions associated with the use of convalescent plasma were observed. Conclusions The transfusion of up to 500 ml of convalescent plasma with unknown levels of neutralizing antibodies in 84 patients with confirmed EVD was not associated with a significant improvement in survival. (Funded by the European Union’s Horizon 2020 Research and Innovation Program and others; ClinicalTrials.gov number, NCT02342171 .).
Background Although transcatheter aortic-valve replacement (TAVR) is an accepted alternative to surgery in patients with severe aortic stenosis who are at high surgical risk, less is known about comparative outcomes among patients with aortic stenosis who are at intermediate surgical risk. Methods We evaluated the clinical outcomes in intermediate-risk patients with severe, symptomatic aortic stenosis in a randomized trial comparing TAVR (performed with the use of a self-expanding prosthesis) with surgical aortic-valve replacement. The primary end point was a composite of death from any cause or disabling stroke at 24 months in patients undergoing attempted aortic-valve replacement. We used Bayesian analytical methods (with a margin of 0.07) to evaluate the noninferiority of TAVR as compared with surgical valve replacement. Results A total of 1746 patients underwent randomization at 87 centers. Of these patients, 1660 underwent an attempted TAVR or surgical procedure. The mean (±SD) age of the patients was 79.8±6.2 years, and all were at intermediate risk for surgery (Society of Thoracic Surgeons Predicted Risk of Mortality, 4.5±1.6%). At 24 months, the estimated incidence of the primary end point was 12.6% in the TAVR group and 14.0% in the surgery group (95% credible interval [Bayesian analysis] for difference, -5.2 to 2.3%; posterior probability of noninferiority, >0.999). Surgery was associated with higher rates of acute kidney injury, atrial fibrillation, and transfusion requirements, whereas TAVR had higher rates of residual aortic regurgitation and need for pacemaker implantation. TAVR resulted in lower mean gradients and larger aortic-valve areas than surgery. Structural valve deterioration at 24 months did not occur in either group. Conclusions TAVR was a noninferior alternative to surgery in patients with severe aortic stenosis at intermediate surgical risk, with a different pattern of adverse events associated with each procedure. 
(Funded by Medtronic; SURTAVI ClinicalTrials.gov number, NCT01586910 .).
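The noninferiority conclusion in the SURTAVI abstract follows directly from the reported interval and margin:

upper bound of the 95% credible interval for the TAVR − surgery difference = 2.3 percentage points < 7 percentage points (the prespecified margin of 0.07),

so TAVR meets the noninferiority criterion, consistent with the reported posterior probability of noninferiority of >0.999.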
The characterization of the blood virome is important for the safety of blood-derived transfusion products and for the identification of emerging pathogens. We explored non-human sequence data from whole-genome sequencing of blood from 8,240 individuals, none of whom were ascertained for any infectious disease. Viral sequences were extracted from the pool of sequence reads that did not map to the human reference genome. Analyses sifted through close to 1 petabyte of sequence data and performed 0.5 trillion similarity searches. With a lower bound for identification of 2 viral genomes/100,000 cells, we mapped sequences to 94 different viruses, including sequences from 19 human DNA viruses, proviruses, and RNA viruses (herpesviruses, anelloviruses, papillomaviruses, three polyomaviruses, adenovirus, HIV, HTLV, hepatitis B, hepatitis C, parvovirus B19, and influenza virus) in 42% of the study participants. Of possible relevance to transfusion medicine, we identified Merkel cell polyomavirus in 49 individuals, papillomavirus in the blood of 13 individuals, parvovirus B19 in 6 individuals, and herpesvirus 8 in 3 individuals. The presence of DNA sequences from two RNA viruses was unexpected: the hepatitis C virus sequence is indicative of an integration event, while the influenza virus sequence resulted from immunization with a DNA vaccine. Age, sex, and ancestry contributed significantly to the prevalence of infection. The remaining 75 viruses mostly reflect extensive contamination from commercial reagents and the environment. These technical problems represent a major challenge for the identification of novel human pathogens, and distinguishing contaminants from real human viruses remains difficult. Increasing availability of human whole-genome sequences will contribute substantial amounts of data on the composition of the normal and pathogenic human blood virome.