Journal: Clinical and applied thrombosis/hemostasis : official journal of the International Academy of Clinical and Applied Thrombosis/Hemostasis
Patients with iliac deep vein thrombosis (DVT) have a poor prognosis and a high incidence of postthrombotic syndrome (PTS). We evaluated the effect of low-molecular-weight heparin (LMWH; tinzaparin) versus usual care (tinzaparin plus warfarin for ≥12 weeks at home) on the development of PTS according to DVT location (iliac/noniliac) in a retrospective analysis of the Home-LITE cohort (480 patients with proximal DVT). Patients with iliac DVT had an overall odds ratio of 0.53 (95% confidence interval [CI] 0.33, 0.83; P = .0079) for PTS (including ulcer data) in favor of tinzaparin. Patients with noniliac DVT had an odds ratio (0.79 [95% CI 0.67, 0.93], P = .0046) similar to that reported in the overall Home-LITE population (0.76 [95% CI 0.66, 0.89], P = .0004; including ulcer data), both in favor of tinzaparin. Long-term LMWH may be a suitable alternative for the prevention of PTS in patients with iliac DVT who are unlikely to undergo invasive thrombolysis.
Inflammation is a key feature of atherosclerosis and its clinical manifestations, and the leukocyte count has emerged as a marker of inflammation that is widely available in clinical practice. Because inflammation plays a key role in atherosclerosis and its end results, discovering new biomarkers of inflammation is important for improving diagnostic accuracy and providing prognostic information about coronary heart disease. In acute coronary syndromes and percutaneous coronary intervention, elevated counts of almost all white blood cell subtypes, including eosinophils, monocytes, neutrophils, and lymphocytes, as well as the neutrophil-lymphocyte ratio and the eosinophil-leukocyte ratio, constitute independent predictors of adverse outcomes. The eosinophil count and the eosinophil-leukocyte ratio, in particular, are emerging as novel biomarkers for risk stratification in patients with coronary artery disease. Because the presence of eosinophils denotes hypersensitivity inflammation, including the hypersensitivity associated with Kounis syndrome, recognizing this is essential for elucidating the etiology of inflammation, considering predictive and preventive measures, and applying the appropriate therapeutic methods.
There are steps that can be taken to achieve an optimal life for patients with hemophilia in developing countries, and awareness of the pattern of death among patients with hemophilia is a prerequisite for any health-care program. Owing to the lack of data on the pattern of death in patients with hemophilia from developing countries, the current study was undertaken to address the common causes of death, and their spectrum, among individuals with hemophilia A and B. To characterize the pattern of death in the northeast of Iran, we retrospectively collected demographic data on deceased patients with hemophilia A and B. Overall, among 379 people with hemophilia A and B, there were 46 deaths; 32 of these occurred in patients with severe forms of the disease. The results show that the patterns of death in the patients studied do not parallel those in some reports from developed countries: traumatic and spontaneous bleeding events were the main causes of death. The trend shows a decrease in deaths in the current decade, following improved therapeutic facilities. Evaluation of the causes of death in hemophilia can be a useful indicator for assessing the efficacy of health care for these patients.
It is unclear whether initial infection control or anticoagulant therapy exerts a greater effect on early changes in the Sequential Organ Failure Assessment (SOFA) score among patients with sepsis-induced disseminated intravascular coagulation (DIC). This retrospective propensity score cohort study evaluated whether the adequacy of infection control or anticoagulation therapy had a greater effect on early changes in the SOFA score among 52 patients with sepsis-induced DIC. Inadequate initial infection control was associated with a lower 28-day survival rate (odds ratio [OR]: 0.116, 95% confidence interval [CI]: 0.022-0.601; P = .010); however, adequacy of infection control was not associated with an early improvement in the SOFA score. In contrast, even after adjusting for inadequate initial infection control, administration of recombinant human soluble thrombomodulin was associated with an early improvement in the SOFA score (OR: 5.058, 95% CI: 1.047-24.450; P = .044). Therefore, early changes in the SOFA score within 48 hours after the DIC diagnosis were more strongly affected by the administration of recombinant human soluble thrombomodulin than by the adequacy of initial infection control.
There is no direct evidence comparing the 2 most commonly prescribed direct oral anticoagulants, apixaban and rivaroxaban, used for stroke prevention in nonvalvular atrial fibrillation (NVAF). A number of network meta-analyses (NMAs) of randomized controlled trials and real-world evidence (RWE) studies comparing the efficacy, effectiveness, and safety of apixaban and rivaroxaban have been published; however, a comprehensive review across the available body of evidence is lacking. In this study, we aimed to systematically review and evaluate the clinical outcomes of apixaban and rivaroxaban using data gleaned from both NMAs and RWE studies. The review identified 21 NMAs and 5 RWE studies. The data demonstrated that apixaban was associated with fewer major bleeding events than rivaroxaban, with no difference in the efficacy/effectiveness profiles between the treatments. Bleeding is a serious complication of anticoagulation therapy for the management of NVAF and is associated with increased rates of hospitalization, morbidity, mortality, and health-care expenditure. The majority of studies in this comprehensive evidence review suggest that apixaban carries a lower risk of major bleeding events than rivaroxaban in patients with NVAF.
Andexanet alfa is a recombinant factor Xa decoy protein designed to reverse bleeding associated with oral anti-Xa agents; it has also been reported to neutralize the effects of heparin-related drugs. This study examined the neutralization profiles of unfractionated heparin (UFH), enoxaparin, and the chemically synthesized pentasaccharide fondaparinux by andexanet alfa. Whole blood clotting studies were carried out using thromboelastography (TEG) and the activated clotting time (ACT). The anticoagulant profile of UFH, enoxaparin, and fondaparinux was studied using the activated partial thromboplastin time (aPTT), thrombin time (TT), and amidolytic anti-Xa and anti-IIa methods. Thrombin generation inhibition was studied using the calibrated automated thrombogram system. Reversal of each agent was studied by supplementing andexanet alfa at 100 µg/mL. In the TEG, andexanet alfa produced almost complete reversal of the anticoagulant effects of UFH and enoxaparin; however, it augmented the effects of fondaparinux. In the ACT, aPTT, and TT, UFH produced strong anticoagulant effects that were almost completely neutralized by andexanet alfa; enoxaparin produced milder anticoagulant responses that were partially neutralized, whereas fondaparinux did not produce any sizeable effects. In the anti-Xa and anti-IIa assays, UFH exhibited partial neutralization, whereas enoxaparin and fondaparinux showed none. All agents produced varying degrees of inhibition of thrombin generation, which were differentially neutralized by andexanet alfa. These results indicate that andexanet alfa can differentially neutralize the anticoagulant and antiprotease effects of UFH and enoxaparin in an assay-dependent manner, but it is incapable of neutralizing the anti-Xa effects of fondaparinux.
Acute traumatic coagulopathy (ATC) is an extremely common but silent killer; it presents early after trauma and affects approximately 30% of severely injured patients admitted to emergency departments (EDs). Because conventional coagulation indicators usually require more than 1 hour after admission to yield results, a limitation that often prevents clinicians from intervening within the optimal therapeutic window, it is vital to develop prediction models that can rapidly identify ATC; such models would also facilitate ancillary resource management and clinical decision support. Using the critical care Emergency Rescue Database together with additional data collected in the ED, a total of 1385 patients were analyzed; cases with an initial international normalized ratio (INR) >1.5 upon admission to the ED met the diagnostic criteria for ATC, and nontraumatic conditions with potentially disordered coagulation were excluded. A total of 818 individuals drawn from the Emergency Rescue Database formed the derivation cohort, which was split 7:3 into training and test data sets. A Pearson correlation matrix was used to identify likely key clinical features associated with ATC, and the data distributions were analyzed before suitable modeling tools were selected. Both machine learning (random forest) and traditional logistic regression were used for prediction modeling of ATC. After the models were built, a further 587 ED patients were collected as the validation cohort. The ATC prediction models incorporated red blood cell count, shock index, base excess, lactate, diastolic blood pressure, and pH. Of the 818 trauma patients filtered from the database, 747 (91.3%) did not present with ATC (INR ≤ 1.5) and 71 (8.7%) had ATC (INR > 1.5) upon admission to the ED.
Compared with the logistic regression model, the random forest model showed better accuracy (94.0%, 95% confidence interval [CI]: 0.922-0.954 vs 93.5%, 95% CI: 0.916-0.95), precision (93.3%, 95% CI: 0.914-0.948 vs 93.1%, 95% CI: 0.912-0.946), F1 score (93.4%, 95% CI: 0.915-0.949 vs 92%, 95% CI: 0.9-0.937), and recall (94.0%, 95% CI: 0.922-0.954 vs 93.5%, 95% CI: 0.916-0.95), but a lower area under the receiver operating characteristic curve (AU-ROC) (0.810, 95% CI: 0.673-0.918 vs 0.849, 95% CI: 0.732-0.944) for predicting ATC in trauma patients. Results were similar in the validation cohort: for the random forest model, classification accuracy, precision, F1 score, and recall were 0.916, 0.907, 0.901, and 0.917, with an AU-ROC of 0.830; for the logistic regression model, they were 0.905, 0.887, 0.883, and 0.905, with an AU-ROC of 0.858. We developed and validated a prediction model, based on objective and rapidly accessible clinical data, that identifies trauma patients at risk for ATC upon arrival at the ED. Beyond highlighting the value of initial ED laboratory tests and vital signs when combined with data analysis and modeling, our study illustrates a practical method that should greatly facilitate both early warning and targeted intervention for ATC.
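The classification metrics reported above (accuracy, precision, recall, and F1 score) all derive from a binary confusion matrix. A minimal Python sketch of how they are computed; the function name and example labels are illustrative, not the study's data:

```python
def classification_metrics(y_true, y_pred):
    """Accuracy, precision, recall, and F1 from binary (0/1) labels."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0  # also called sensitivity
    f1 = (2 * precision * recall / (precision + recall)
          if (precision + recall) else 0.0)
    return accuracy, precision, recall, f1

# Hypothetical example: 1 = ATC (INR > 1.5), 0 = no ATC
acc, prec, rec, f1 = classification_metrics([1, 1, 0, 0, 1], [1, 0, 0, 0, 1])
```

Unlike these threshold-dependent metrics, the AU-ROC summarizes discrimination across all probability thresholds, which is why a model can score better on accuracy and F1 yet worse on AU-ROC, as seen here.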
The aim of our study was to quantify risk factors for venous thromboembolism (VTE) during the puerperal period. This case-control study was conducted at Women’s Hospital, Zhejiang University, China, from January 2006 to December 2016; cases of VTE hospitalized within 1 week after delivery were identified according to International Classification of Diseases, Ninth Revision, Clinical Modification codes. Control postpartum women without VTE were randomly selected at a 4:1 ratio, matched on birth day, age, delivery mode, and number of fetuses. Clinical risk factors for postpartum VTE and coagulation parameters were analyzed. In binary logistic regression analysis, the independent variables significantly associated with postpartum VTE (all P < .05) were preeclampsia/eclampsia (odds ratio [OR], 2.89; 95% confidence interval [CI], 1.56-5.37) and postpartum hemorrhage (OR, 4.6; 95% CI, 1.71-12.40). D-dimer was the only biomarker significantly associated with postpartum VTE within 3 days after delivery (all P < .05). These findings show that preeclampsia/eclampsia and postpartum hemorrhage are important risk factors for early VTE during the puerperal period, and that an elevated D-dimer level is more informative than other coagulation parameters for suspecting early thrombotic disease after delivery.
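Odds ratios with 95% confidence intervals like those quoted throughout these abstracts can be computed from a 2 × 2 table of exposed/unexposed cases and controls using the standard log-OR normal approximation. A minimal Python sketch; the counts in the example are hypothetical, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio with a 95% CI from a 2x2 table:
    a = exposed cases,   b = exposed controls,
    c = unexposed cases, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, e.g. exposure = postpartum hemorrhage
or_, lo, hi = odds_ratio_ci(10, 20, 5, 40)  # OR = 4.0
```

Note that a matched case-control design such as this one is typically analyzed with conditional logistic regression; the sketch only illustrates how a crude OR and its CI are derived.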
We aimed to evaluate the outcomes of different treatment modalities for extremity venous thrombosis (VT) in neonates and infants, highlighting the current debate on the best approach to management. This retrospective study covered a 9-year period from January 2009 to December 2017. All treated patients were referred to the vascular and pediatric surgery departments from the neonatal intensive care unit. All patients underwent a thorough history-taking as well as general clinical and local examination of the affected limb. Patients were divided into 2 groups: group I received conservative treatment with unfractionated heparin (UFH) alone, whereas group II was treated with UFH plus warfarin. Sixty-three patients were included in this study: 36 males and 27 females, aged 3 to 302 days. Forty-one (65%) patients had VT in the upper limb, whereas the remaining 22 (35%) had lower extremity VT. Nonsurgical treatment was successful in 81% of patients; the remaining 19% underwent limb amputation due to established gangrene. Kaplan-Meier analysis revealed a highly significant increase in both mean and median survival times in the group treated with heparin plus warfarin compared with the heparin-only group (P < .001). Nonoperative treatment with anticoagulation or observation (ie, a wait-and-see policy) alone may be an easily applicable, effective, and safe modality for the management of VT in neonates and infants, especially in developing countries with poor or highly challenged resource settings.
This study evaluated whether rotational thromboelastometry (ROTEM; Tem International GmbH, Munich, Germany) FIBTEM maximum clot firmness (MCF) can be used to predict the plasma fibrinogen level in pediatric patients undergoing cardiac surgery. Linear regression was conducted to predict plasma fibrinogen level from FIBTEM MCF (0.05 level of significance), and a scatter plot with the fitted regression line was created. Fifty charts were retrospectively reviewed, and 87 independent measurements of FIBTEM MCF paired with plasma fibrinogen levels were identified for analysis. Linear regression analysis showed a significant positive linear relationship (P < .0001) between plasma fibrinogen level and MCF; both the intercept and the MCF slope were statistically significant (P < .0001). The estimated regression equation (predicted fibrinogen = 78.6 + 12.4 × MCF) indicates that a 1-mm increase in MCF corresponds to an average increase of 12.4 mg/dL in plasma fibrinogen level. This statistically significant positive linear relationship suggests that MCF can be used as a surrogate for the fibrinogen level, which is of clinical relevance for calculating patient-specific dosing of fibrinogen supplementation in this setting.
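The published regression equation can be applied directly at the bedside or in code. A short Python sketch using only the intercept and slope reported above; the function name and the example MCF value are illustrative:

```python
def predicted_fibrinogen(mcf_mm):
    """Estimate plasma fibrinogen (mg/dL) from FIBTEM MCF (mm)
    using the study's fitted regression: 78.6 + 12.4 * MCF."""
    return 78.6 + 12.4 * mcf_mm

# Illustrative example: an MCF of 10 mm predicts about 202.6 mg/dL
estimate = predicted_fibrinogen(10)
```

Because the equation is a population-level fit, individual predictions carry the residual error of the regression; it is a surrogate estimate, not a replacement for a laboratory fibrinogen assay when precision matters.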