Year: 2007 | Volume: 8 | Issue: 1 | Page: 2-5
Date of Web Publication: 17-Jun-2010
Source of Support: None, Conflict of Interest: None
How to cite this article: Cardiovascular News. Heart Views 2007;8:2-5.
Optimal medical therapy with or without PCI for stable coronary disease
In patients with stable coronary artery disease, it remains unclear whether an initial management strategy of percutaneous coronary intervention (PCI) with intensive pharmacologic therapy and lifestyle intervention (optimal medical therapy) is superior to optimal medical therapy alone in reducing the risk of cardiovascular events.
The authors conducted a randomized trial involving 2287 patients who had objective evidence of myocardial ischemia and significant coronary artery disease at 50 U.S. and Canadian centers. Between 1999 and 2004, they assigned 1149 patients to undergo PCI with optimal medical therapy (PCI group) and 1138 to receive optimal medical therapy alone (medical-therapy group). The primary outcome was death from any cause and nonfatal myocardial infarction during a follow-up period of 2.5 to 7.0 years (median, 4.6 years).
There were 211 primary events in the PCI group and 202 events in the medical-therapy group. The 4.6-year cumulative primary-event rates were 19.0% in the PCI group and 18.5% in the medical-therapy group (hazard ratio for the PCI group, 1.05; 95% confidence interval [CI], 0.87 to 1.27; P=0.62). There were no significant differences between the PCI group and the medical-therapy group in the composite of death, myocardial infarction, and stroke (20.0% vs 19.5%; hazard ratio, 1.05; 95% CI, 0.87 to 1.27; P=0.62); hospitalization for acute coronary syndrome (12.4% vs 11.8%; hazard ratio, 1.07; 95% CI, 0.84 to 1.37; P=0.56); or myocardial infarction (13.2% vs 12.3%; hazard ratio, 1.13; 95% CI, 0.89 to 1.43; P=0.33).
As an initial management strategy in patients with stable coronary artery disease, PCI did not reduce the risk of death, myocardial infarction, or other major cardiovascular events when added to optimal medical therapy.
Intracoronary streptokinase after primary percutaneous coronary intervention
Microvascular perfusion is often impaired after primary percutaneous coronary intervention (PCI). We proposed that in situ thrombosis might contribute to poor myocardial perfusion in this setting. To test this hypothesis, we evaluated the effect of low-dose intracoronary streptokinase administered immediately after primary PCI.
Forty-one patients undergoing primary PCI were randomly assigned to receive intracoronary streptokinase (250 kU) or no additional therapy. Two days later, cardiac catheterization was repeated, and coronary hemodynamic end points were measured with the use of a guidewire tipped with pressure and temperature sensors. In patients with anterior myocardial infarction, the deceleration time of coronary diastolic flow was measured with transthoracic echocardiography. At 6 months, angiography, echocardiography, and technetium-99m single-photon-emission computed tomography were performed.
Two days after PCI, all measures of microvascular function (means ±SD) were significantly better in the streptokinase group than in the control group, including coronary flow reserve (2.01±0.57 vs 1.39±0.31), the index of microvascular resistance (16.29±5.06 U vs 32.49±11.04 U), the collateral-flow index (0.08±0.05 vs 0.17±0.07), mean coronary wedge pressure (10.81±5.46 mm Hg vs 17.20±7.93 mm Hg), systolic coronary wedge pressure (18.24±6.07 mm Hg vs 33.80±11.00 mm Hg), and diastolic deceleration time (828±258 msec vs 360±292 msec). The administration of intracoronary streptokinase was also associated with a significantly lower corrected Thrombolysis in Myocardial Infarction frame count (the number of cine frames required for dye to travel from the ostium of a coronary artery to a standardized distal coronary landmark) at 2 days. At 6 months, however, there was no evidence of a difference between the two study groups in left ventricular size or function.
In our pilot trial, the administration of low-dose intracoronary streptokinase immediately after primary PCI improved myocardial reperfusion but not long-term left ventricular size or function. These findings require clarification in a larger trial.
Major hemorrhage and tolerability of warfarin in the first year of therapy among elderly patients with atrial fibrillation
Warfarin is effective in the prevention of stroke in atrial fibrillation but is underused in clinical care. Concerns exist that published rates of hemorrhage may not reflect real-world practice. Few patients ≥80 years of age were enrolled in trials, and studies of prevalent use largely reflect a warfarin-tolerant subset. We sought to define the tolerability of warfarin among an elderly inception cohort with atrial fibrillation.
Consecutive patients who started warfarin were identified from January 2001 to June 2003 and followed for 1 year. Patients had to be ≥65 years of age, have established care at the study institution, and have their warfarin managed on-site. Outcomes included major hemorrhage, time to termination of warfarin, and reason for discontinuation. Of 472 patients, 32% were ≥80 years of age, and 91% had ≥1 stroke risk factor. The cumulative incidence of major hemorrhage was 13.1 per 100 person-years for patients ≥80 years of age and 4.7 for those <80 years of age (P=0.009). The first 90 days of warfarin, age ≥80 years, and international normalized ratio (INR) ≥4.0 were associated with increased risk despite trial-level anticoagulation control. Within the first year, 26% of patients ≥80 years of age stopped taking warfarin; perceived safety issues accounted for 81% of these discontinuations. Rates of major hemorrhage and warfarin termination were highest among patients with CHADS2 scores (an acronym for congestive heart failure, hypertension, age ≥75, diabetes mellitus, and prior stroke or transient ischemic attack) of ≥3.
Rates of hemorrhage derived from younger noninception cohorts underestimate the bleeding that occurs in practice. This finding coupled with the short-term tolerability of warfarin likely contributes to its underutilization. Stroke prevention among elderly patients with atrial fibrillation remains a challenging and pressing health concern.
Robotic image-guided therapy for atrial fibrillation ablation
A robotic catheter navigation system has been developed that provides a significant degree of freedom of catheter movement. This study examines the feasibility of synchronizing this robotic navigation system with electroanatomic mapping and 3-dimensional computed tomography imaging to perform view-synchronized left atrial (LA) ablation.
This study consisted of a porcine experimental validation phase (9 animals) and a clinical feasibility phase (9 atrial fibrillation patients). Preprocedural computed tomography images were reconstructed to provide 3-dimensional surface models of the LA pulmonary veins and aorta. Aortic electroanatomic mapping was performed manually, followed by registration with the corresponding computed tomography aorta image using custom software. The mapping catheter was remotely manipulated with the robotic navigation system within the registered computed tomography image of the LA pulmonary veins.
The point-to-surface error between the LA electroanatomic mapping data and the computed tomography image was 2.1±0.7 mm and 1.6±0.1 mm in the preclinical and clinical studies, respectively. The catheter was remotely navigated into all pulmonary veins, the LA appendage, and circumferentially along the mitral valve annulus. In 7 of 9 animals, circumferential radiofrequency ablation lesions were applied periostially to ablate 11 pulmonary veins. In patients, all of the pulmonary veins were remotely electrically isolated in an extraostial fashion.
Adjunctive ablation included superior vena cava isolation in 6 patients, cavotricuspid isthmus ablation in 5 patients, and ablation of sites of complex fractionated activity and atypical LA flutters in 3 patients.
This study demonstrates the safety and feasibility of an emerging paradigm for atrial fibrillation ablation involving the confluence of 3 technologies: 3-dimensional imaging, electroanatomic mapping, and remote robotic navigation.
Risk of thromboembolism in heart failure (SCD-HeFT)
In patients with heart failure, rates of clinically apparent stroke range from 1.3% to 3.5% per year. Little is known about the incidence of and risk factors for thromboembolism in the absence of atrial fibrillation. In the Sudden Cardiac Death in Heart Failure Trial (SCD-HeFT), 2521 patients with moderate heart failure were randomized to receive amiodarone, an implantable cardioverter-defibrillator (ICD), or placebo.
The authors determined the incidence of stroke or peripheral or pulmonary embolism in patients with no history of atrial fibrillation (n=2114), the predictors of thromboembolism, and their relationship to left ventricular ejection fraction. Median follow-up was 45.5 months. Kaplan-Meier estimates (95% CIs) for the incidence of thromboembolism by 4 years were 4.0% (3.0% to 4.9%) overall, with 2.6% (1.1% to 4.1%) in patients randomized to amiodarone, 3.2% (1.8% to 4.7%) in patients randomized to ICD, and 6.0% (4.0% to 8.0%) in patients randomized to placebo (approximate rates of 0.7%, 0.8%, and 1.5% per year, respectively). By multivariable analysis, hypertension (P=0.021) and decreasing left ventricular ejection fraction (P=0.023) were significant predictors of thromboembolism; treatment with amiodarone or ICD was a significant predictor of thromboembolism-free survival (P=0.014 for treatment effect; hazard ratio [95% CI] versus placebo, 0.57 [0.33 to 0.99] for ICD; 0.44 [0.24 to 0.80] for amiodarone). Inclusion of atrial fibrillation during follow-up in the multivariable model did not affect the significance of treatment assignment as a predictor of thromboembolism.
In the SCD-HeFT patient cohort, which reflects contemporary treatment of patients with moderately symptomatic systolic heart failure, thromboembolic events occurred at a rate of 1.7% per year in the absence of antiarrhythmic therapy. Those treated with amiodarone or ICDs had a lower risk of thromboembolism than those given placebo. Hypertension at baseline and lower ejection fraction were independent predictors of risk.
Predictors of outcome in chronic thromboembolic pulmonary hypertension
Chronic thromboembolic pulmonary hypertension (CTEPH) is characterized by intraluminal thrombus organization and fibrous obliteration of pulmonary arteries. Recently, associated medical conditions such as splenectomy, ventriculoatrial shunt for the treatment of hydrocephalus, permanent central intravenous lines, inflammatory bowel disease, and osteomyelitis were found to be associated with the development of CTEPH. The study aim was to define the impact of these novel risk factors on survival.
Between January 1992 and December 2006, 181 patients diagnosed with CTEPH were tracked with the use of the center's customized computer database. A Cox regression model was used to examine relations between survival and associated medical conditions, age, sex, hemodynamic parameters, modified New York Heart Association functional class at diagnosis, CTEPH type, pulmonary endarterectomy, and anti-cardiolipin antibodies/lupus anticoagulant. During a median observation time of 22.1 (range, 0.03 to 152) months, the clinical end point of cardiovascular death or lung transplantation occurred in 48 cases (27%).
Pulmonary endarterectomy (hazard ratio, 0.14; 95% CI, 0.05 to 0.41; P=0.0003), associated medical conditions (hazard ratio, 3.17; 95% CI, 1.70 to 5.92; P=0.0003), and pulmonary vascular resistance (hazard ratio, 1.02; 95% CI, 1.00 to 1.04; P=0.04) were predictors of survival. Thirty-day postoperative mortality (24% versus 9%) and the incidence of postoperative pulmonary hypertension (92% versus 20%) were substantially higher in patients with associated medical conditions.
CTEPH-predisposing medical conditions, such as splenectomy, permanent central intravenous lines, and certain inflammatory disorders, predict poor survival in CTEPH.
Optimal treatment of obesity-related hypertension: the Hypertension-Obesity-Sibutramine (HOS) study
Current guidelines for the treatment of hypertension do not provide specific recommendations for obese hypertensive patients. To identify an optimal treatment regimen for obese hypertensive patients, we studied the interactions between a drug-based weight loss approach by sibutramine and different antihypertensive drug regimens.
This was a prospective, 16-week double-blind placebo-controlled randomized multicenter study in 171 obese hypertensive patients. After a 2-week run-in period, patients receiving 1 of the 3 antihypertensive combination therapies (felodipine 5 mg/ramipril 5 mg [n=57], verapamil 180 mg/trandolapril 2 mg [n=55], or metoprolol succinate 95 mg/hydrochlorothiazide 12.5 mg [metoprolol/hydrochlorothiazide; n=59]) were assigned randomly to sibutramine (15 mg) or placebo. Sibutramine treatment resulted in a significantly greater decrease in body weight, body mass index, and waist circumference and a significant increase in diastolic blood pressure during 24-hour blood pressure monitoring compared with placebo treatment. Sibutramine-induced weight loss and reduction of visceral obesity were markedly attenuated in the metoprolol/hydrochlorothiazide group compared with the other groups. Consistently, improvement in glucose tolerance and hypertriglyceridemia by sibutramine was abrogated in the cohort treated with metoprolol/hydrochlorothiazide compared with the other groups.
The present study demonstrates for the first time that an antihypertensive combination therapy regimen with angiotensin-converting enzyme inhibitors and calcium channel blockers is more advantageous than a β-blocker/diuretic-based regimen in supporting the weight-reducing actions and concomitant metabolic changes induced by sibutramine in obese hypertensive patients. These data may help to develop future comprehensive treatment strategies and guidelines for this high-risk patient population.
Clinical aspects and prognosis of brugada syndrome in children
Brugada syndrome is an arrhythmogenic disease characterized by an ECG pattern of ST-segment elevation in the right precordial leads and augmented risk of sudden cardiac death. Little is known about the clinical presentation and prognosis of this disease in children.
Thirty children affected by Brugada syndrome who were <16 years of age (mean, 8±4 years) were included. All patients displayed a type I ECG pattern before or after drug provocation challenge. Diagnosis of Brugada syndrome was made under the following circumstances: aborted sudden death (n=1), syncope of unexplained origin (n=10), symptomatic supraventricular tachycardia (n=1), suspicious ECG (n=1), and family screening for Brugada syndrome (n=17). Syncope was precipitated by fever in 5 cases. Ten of 11 symptomatic patients displayed a spontaneous type I ECG. An implantable cardioverter-defibrillator was implanted in 5 children; 4 children were treated with hydroquinidine; and 1 child received a pacemaker because of symptomatic sick sinus syndrome. During a mean follow-up of 37±23 months, 1 child experienced sudden cardiac death, and 2 children received an appropriate implantable cardioverter-defibrillator shock; all of them were symptomatic and had manifested a type I ECG spontaneously. One child had a cardioverter-defibrillator infection that required explantation of the defibrillator.
In the largest population of children affected by Brugada syndrome described to date, fever represented the most important precipitating factor for arrhythmic events, and as in the adult population, the risk of arrhythmic events was higher in previously symptomatic patients and in those displaying a spontaneous type I ECG.
A prospective study of trans fatty acids in erythrocytes and risk of coronary heart disease
High consumption of trans fat has been linked to the risk of coronary heart disease (CHD). We assessed the hypothesis that higher trans fatty acid contents in erythrocytes were associated with an elevated risk of CHD in a nested case-control study among US women.
Blood samples were collected from 32,826 participants of the Nurses' Health Study from 1989 to 1990. During 6 years of follow-up, 166 incident cases of CHD were ascertained and matched with 327 controls. Total trans fatty acid content in erythrocytes was significantly correlated with dietary intake of trans fat (correlation coefficient=0.44, P<0.01) and was associated with increased plasma low-density lipoprotein cholesterol (P for trend=0.06), decreased plasma high-density lipoprotein cholesterol concentrations (P for trend <0.01), and an increased plasma low-density lipoprotein to high-density lipoprotein ratio (P for trend <0.01).
After adjustment for age, smoking status, and other dietary and lifestyle cardiovascular risk factors, higher total trans fatty acid content in erythrocytes was associated with an elevated risk of CHD. The multivariable relative risks (95% confidence intervals) of CHD from the lowest to highest quartiles of total trans fatty acid content in erythrocytes were 1.0 (reference), 1.6 (0.7 to 3.6), 1.6 (0.7 to 3.4), and 3.3 (1.5 to 7.2) (P for trend <0.01). The corresponding relative risks were 1.0, 1.1, 1.3, and 3.1 (P for trend <0.01) for a total of 18:1 trans isomers and 1.0, 1.5, 2.5, and 2.8 (P for trend <0.01) for a total of 18:2 trans isomers.
These biomarker data provide further evidence that high trans fat consumption remains a significant risk factor for CHD after adjustment for covariates.
The impact of valve surgery on 6 month mortality in left-sided infective endocarditis
The role of valve surgery in left-sided infective endocarditis has not been evaluated in randomized controlled trials. We examined the association between valve surgery and all-cause 6-month mortality among patients with left-sided infective endocarditis.
A total of 546 consecutive patients with left-sided infective endocarditis were included. To minimize selection bias, the propensity score for undergoing valve surgery was used to match patients in the surgical and nonsurgical groups. To adjust for survivor bias, we matched the follow-up time so that each patient in the nonsurgical group survived at least as long as the time to surgery in the respective surgically treated patient. We also used valve surgery as a time-dependent covariate in different Cox models. A total of 129 (23.6%) patients underwent surgery within 30 days of diagnosis.
Death occurred in 99 of the 417 patients (23.7%) in the nonsurgical group versus 35 deaths among the 129 patients (27.1%) in the surgical group. Eighteen of 35 (51%) patients in the surgical group died within 7 days of valve surgery. In the subset of 186 cases (93 pairs of surgical versus nonsurgical cases) matched on the logit of their propensity score, diagnosis decade, and follow-up time, no significant association existed between surgery and mortality (adjusted hazard ratio, 1.3; 95% confidence interval, 0.5 to 3.1).
With a Cox model that incorporated surgery as a time-dependent covariate, valve surgery was associated with an increase in the 6-month mortality with an adjusted hazard ratio of 1.9 (95% confidence interval, 1.1 to 3.2). Because the proportionality hazard assumption was violated in the time-dependent analysis, we performed a partitioning analysis. After adjustment for early (operative) mortality, surgery was not associated with a survival benefit (adjusted hazard ratio, 0.92; 95% confidence interval, 0.48 to 1.76).
The results of our study suggest that valve surgery in left-sided infective endocarditis is not associated with a survival benefit and could be associated with increased 6-month mortality, even after adjustment for selection and survivor biases as well as confounders. Given the disparity between the results of our study and those of other observational studies, well-designed prospective studies are needed to further evaluate the role of valve surgery in endocarditis management.