Monday, May 24, 2021

Medicare Reimbursement Trends for Interventional Radiology Procedures: 2012 to 2020



Clinical question
What are the reimbursement trends for 20 common interventional radiology procedures between 2012 and 2020?

Take away point
Common interventional radiology procedures are experiencing significant reimbursement cuts by Medicare. Across all procedures, inflation-adjusted reimbursement declined by a mean of 2.8% per year and 18.7% over the eight-year period.

Reference
Schartz D, Young E. Medicare Reimbursement Trends for Interventional Radiology Procedures: 2012-2020. J Vasc Interv Radiol. 2021;32:447-452.

Click here for abstract

Study design
Retrospective review of all common interventional radiology procedures reimbursed by the Centers for Medicare and Medicaid Services (CMS) between 2012 and 2020.

Funding Source
No funding.

Setting
United States.

Figure


Summary


Medicare spending accounts for a large portion of total US health expenditure, and it will increase over time as the population ages. The Medicare compensation model is complicated. It is built on Current Procedural Terminology (CPT) codes, with the reimbursement rate for each procedure set by the Centers for Medicare and Medicaid Services (CMS) according to a formula that is updated annually and includes variables such as geographic practice cost indices and relative value units (RVUs). An RVU is broken down into three components: physician work (time, skill, length of training), practice/resource expenses, and malpractice. Many other surgical subspecialties, as well as diagnostic radiology, have shown a decline in Medicare reimbursement. The goal of this study was to assess Medicare reimbursement trends for common IR procedures over an eight-year period.
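
The payment calculation itself has a published structure: each RVU component is scaled by its geographic practice cost index (GPCI), the results are summed, and the total is multiplied by the annually updated conversion factor. A minimal Python sketch of that calculation is shown below; the RVU, GPCI, and conversion factor values are hypothetical placeholders, not figures from the study.

```python
# Sketch of the CMS Physician Fee Schedule payment formula described above.
# All numeric values are hypothetical and for illustration only.

def medicare_payment(work_rvu, pe_rvu, mp_rvu,
                     work_gpci, pe_gpci, mp_gpci,
                     conversion_factor):
    """Return the Medicare-allowed amount (in dollars) for a single CPT code.

    Each RVU component (physician work, practice expense, malpractice) is
    scaled by its geographic practice cost index (GPCI), summed, and then
    multiplied by the annually updated dollar conversion factor.
    """
    total_rvu = (work_rvu * work_gpci
                 + pe_rvu * pe_gpci
                 + mp_rvu * mp_gpci)
    return total_rvu * conversion_factor


# Hypothetical values for a single procedure
print(round(medicare_payment(4.0, 20.0, 1.0, 1.00, 1.05, 0.95, 36.09), 2))
```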

A retrospective review was performed of all common interventional radiology procedures reimbursed by the Centers for Medicare and Medicaid Services (CMS) between 2012 and 2020. Common procedures were identified as those deemed “common” by the Society of Interventional Radiology (see the figure above for the procedures included). The authors then used the Physician Fee Schedule look-up tool from CMS, and the CPT codes and their reimbursement data were analyzed. As in similar studies, only procedures performed at an ambulatory surgery center or hospital were included; office-based procedures were excluded. Facility fees for each procedure were averaged for each year. The consumer price index inflation calculator from the US Department of Labor’s Bureau of Labor Statistics was used to adjust for inflation, and a two-tailed t-test was used to compare the unadjusted reimbursement changes with the inflation-adjusted changes.
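
The core of this analysis is straightforward arithmetic: convert historical fees into current dollars with a cumulative CPI factor, compute percent changes, and compare the unadjusted and adjusted series. The sketch below illustrates one plausible implementation; the fee values and CPI factor are hypothetical, and the paired two-tailed t-test is an assumption about how the comparison could be run, not a statement of the authors' exact method.

```python
# Illustrative inflation adjustment and comparison; all values are hypothetical.
import numpy as np
from scipy import stats

fees_2012 = np.array([850.0, 1200.0, 430.0])   # hypothetical 2012 facility fees ($)
fees_2020 = np.array([780.0, 1150.0, 410.0])   # hypothetical 2020 facility fees ($)
cpi_factor_2012_to_2020 = 1.13                 # hypothetical cumulative CPI inflation

# Unadjusted percent change per procedure
unadjusted_change = (fees_2020 - fees_2012) / fees_2012 * 100

# Express 2012 fees in 2020 dollars, then recompute the percent change
fees_2012_adj = fees_2012 * cpi_factor_2012_to_2020
adjusted_change = (fees_2020 - fees_2012_adj) / fees_2012_adj * 100

# Two-tailed paired t-test comparing the unadjusted and adjusted changes
t_stat, p_value = stats.ttest_rel(unadjusted_change, adjusted_change)
print(unadjusted_change.round(1), adjusted_change.round(1), round(p_value, 4))
```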

Between 2012 and 2020, unadjusted reimbursement for common IR procedures decreased by a mean of 6.9% (95% CI, -13.5% to -12.9%). After adjusting for inflation, the mean decline was 18.7% (95% CI, -24.4% to -12.9%), with a mean yearly decline of 2.8%. Linear regression analysis showed a steady decline in reimbursement over time (R² = 0.97). Percutaneous drain placement for visceral abscess was the only procedure to show an increase in adjusted reimbursement during the study period (+7.5%). Additionally, procedures that underwent CPT code changes had the largest mean reduction in adjusted reimbursement, which is thought to be due to procedure bundling.
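
The trend statistic quoted here is an ordinary least-squares fit of mean adjusted reimbursement against year. The short sketch below shows how such a fit and its R² would be computed; the yearly values are hypothetical and chosen only to illustrate a steady decline.

```python
# Illustrative linear-regression trend of mean inflation-adjusted reimbursement
# by year; the fee values are hypothetical.
import numpy as np
from scipy import stats

years = np.arange(2012, 2021)
mean_adjusted_fee = np.array([1000, 978, 955, 930, 906, 882, 860, 837, 813])  # hypothetical ($)

fit = stats.linregress(years, mean_adjusted_fee)
print(f"slope = {fit.slope:.1f} dollars/year, R^2 = {fit.rvalue**2:.3f}")
```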

Congress has implemented steps to prevent drastic reimbursement cuts and to increase physician reimbursement by 0.5% annually. Despite this, CMS has recently issued a proposed Physician Fee Schedule rule that would reduce the CMS conversion factor (the dollar multiplier applied to each procedure's geographically adjusted RVU total). CMS also plans to decrease the total amount allowed for Medicare reimbursement of work RVUs, with an estimated total impact of an 8% reduction for the specialty.

Commentary


This study shows that common IR procedures are experiencing significant reimbursement cuts by Medicare. Despite steps taken by Congress to increase physician reimbursement, the opposite has occurred, and CMS has further reimbursement reductions planned. The authors discuss two main limitations. First, Medicare data may not be reflective of the specialty as a whole; smaller cohort studies in different geographic areas could provide comparative analysis. Second, the procedures deemed most common by SIR may not actually be the most frequently performed; further detail on how SIR designated these procedures as common would help resolve this ambiguity. Overall, this study is important for trainees and practicing interventional radiologists: it increases healthcare financial literacy, encourages engagement in advocacy and leadership, and shows how reimbursement trends compare with those of other specialties.

Post Author
Marissa Stumbras, MD
Interventional Radiology Resident, PGY2
Oregon Health & Science University

@MarissaStumbras





Monday, May 17, 2021

A Multicenter Global Registry of Paclitaxel Drug-Coated Balloon in Dysfunctional Arteriovenous Fistulae and Grafts: 6-Month Results



Clinical question: 
This study aims to assess the clinical utility and safety of the Lutonix drug-coated balloon (DCB; BD, Franklin Lakes, New Jersey) for the treatment of hemodynamically significant arteriovenous fistula (AVF) and graft (AVG) stenoses.

Take away point:
In patients with AVF or AVG stenoses, the 30-day primary safety endpoint was met in 95.5% of patients, while target lesion primary patency (TLPP) was 73.9% at 6 months. Access circuit primary patency (ACPP) was 71% at 6 months. Subgroup analysis showed significantly improved TLPP when the DCB was inflated for ≥120 seconds (P = .007). TLPP was also significantly better when predilation was performed than when only DCB angioplasty was performed (77% vs 48.6%, P = .0005).

Reference: 
Karnabatidis, D., Kitrou, P. M., Ponce, P., Chong, T. T., Pietura, R., Pegis, J. D., ... & Savio, D. (2021). A Multicenter Global Registry of Paclitaxel Drug-Coated Balloon in Dysfunctional Arteriovenous Fistulae and Grafts: 6-Month Results. Journal of Vascular and Interventional Radiology, 32(3), 360-368.

Click here for abstract


Study design: 
Prospective, observational, multicenter, single-arm registry

Funding source: 
Lutonix, a subsidiary of Becton, Dickinson and Company

Setting: 
25 centers across 12 countries in Europe and Asia within the Lutonix AV Global Registry (NCT02746159)

Figure



Summary:


Maintaining effective long-term AVF and AVG patency continues to represent a significant challenge. DCBs have emerged as a treatment tool to minimize the neointimal hyperplasia response involved in restenosis and thus maintain ACPP. Paclitaxel is a cytotoxic drug that inhibits microtubule disassembly. By applying cytotoxic agents to the myofibroblasts and smooth muscle cells that make up stenotic lesions, the process of restenosis is slowed. The Lutonix DCB is categorized as a low-dose (2 μg/mm²) paclitaxel-coated balloon and is available in diameters of 4-12 mm and lengths of 2-10 cm.
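
For a sense of scale, the total drug load on a single balloon follows directly from the stated dose density and the balloon's nominal surface area. The sketch below works through that arithmetic for one arbitrary balloon size within the stated range; the cylinder approximation ignores the tapered ends.

```python
# Rough estimate of total paclitaxel load on a low-dose (2 micrograms/mm^2) DCB.
# The balloon size is an arbitrary example within the stated 4-12 mm diameter
# and 2-10 cm length range; tapered ends are ignored.
import math

dose_density_ug_per_mm2 = 2.0
diameter_mm = 6.0
length_mm = 40.0  # 4 cm

surface_area_mm2 = math.pi * diameter_mm * length_mm
total_dose_ug = dose_density_ug_per_mm2 * surface_area_mm2
print(f"~{total_dose_ug / 1000:.2f} mg paclitaxel")  # roughly 1.5 mg for this size
```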

This study enrolled 320 patients with 392 treated lesions. Inclusion criteria included male or nonpregnant, non-breastfeeding female patients over the age of 18, presence of at least one clinical, physiological, or hemodynamic abnormality of the AVF or AVG, and treatable lesions as defined by the Lutonix instructions for use. Exclusion criteria included current participation in another investigational drug or device study, severe contrast allergy, and any other medical condition thought to affect data interpretation or worsen the patient's life expectancy. Of the enrolled patients, 309/320 (96.6%) and 287/320 (89.7%) completed 30-day and 60-day follow-up, respectively. A total of 33/320 patients (10.3%) were excluded, 17 of them because of death while enrolled. The most common clinical signs of access dysfunction were decreased access blood flow (40.6%), prolonged bleeding (31.3%), and elevated venous pressure (24.1%).

The study found that the Lutonix DCB had a device success rate of 100%, a clinical success rate of 99.4%, and a 30-day primary safety endpoint rate of 95.5% (95% CI: 92.5-97.5%). Kaplan-Meier survival analysis at the 3- and 6-month follow-up periods showed a 6-month TLPP of 78.1%, 61.9%, and 73.9% for AVF, AVG, and the combined cohort (AVF + AVG), respectively. Additionally, no significant difference in TLPP was seen between treated de novo and restenotic lesions (78.8% vs 67.6%, respectively; P = .13). TLPP at 6 months also differed across lesion sites on subgroup analysis. For example, AVF anastomotic and outflow vein stenoses had TLPPs of 83.2% and 83.8%, respectively, whereas lesions at the venous graft anastomosis and in the central veins had TLPPs of 55.6% and 65.0%, respectively. Furthermore, the authors found that lesions treated with conventional angioplasty before DCB treatment had a significantly higher TLPP than lesions that were not pretreated (77% vs 48.6%; P = .0005). Length of DCB inflation was also an important factor for effective treatment: lesions treated for 120-180 seconds had a significantly higher TLPP than those treated for 50-120 seconds (79.8% vs 67.9%, respectively; P = .007).
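
The patency figures above come from Kaplan-Meier survival analysis. The sketch below shows the general shape of such an analysis using the lifelines package; the follow-up durations and event indicators are hypothetical, not registry data.

```python
# Minimal Kaplan-Meier sketch for target lesion primary patency (TLPP);
# durations and event flags are hypothetical.
from lifelines import KaplanMeierFitter

# Months until loss of target lesion primary patency, or censoring at last follow-up
durations = [1.5, 3.0, 4.2, 6.0, 6.0, 6.0, 2.8, 5.1, 6.0, 6.0]
# 1 = patency lost (event observed), 0 = censored
events    = [1,   1,   0,   0,   0,   0,   1,   1,   0,   0]

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events, label="TLPP")

# Estimated probability of retained primary patency at 3 and 6 months
print(kmf.predict(3.0), kmf.predict(6.0))
```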

Commentary:


AVF and AVG maintenance in the ESRD population remains a difficult dilemma in a diverse patient population. The results of this prospective, observational study suggest that the Lutonix drug-coated balloon is safe to use, with TLPP through 6 months reaching 74%. This study is a welcome addition to the published trials of DCB use in dysfunctional AVFs and AVGs.

Despite the exclusion criterion of “life expectancy insufficient to allow for completion of the procedure and follow-up examinations,” mortality in the study population neared 5% during the 30-day follow-up period. This is comparable to other studies. Subgroup analysis also suggests that lesion location markedly affects TLPP, making well-powered future study designs extremely difficult.

Ultimately, the study reinforces conventional technical considerations for DCB use: (1) predilation before DCB treatment and (2) maintained DCB inflation of at least 120 seconds were both associated with improved TLPP.

Limitations of the study include the lack of a control group receiving conventional (non-drug-coated) balloon angioplasty, particularly as a comparison for mortality over the study period. Additional limitations include the absence of imaging analysis, with clinical evaluation alone serving as the TLPP measure.

Post Authors:
Murat Osman, MD
Rush University Medical Center
Integrated Vascular & Interventional Radiology Residency, Class of 2026
@Murat_Osman

David M. Tabriz, MD
Rush University Medical Center
Assistant Professor
Assistant Program Director, Integrated/Independent Vascular & Interventional Radiology Residency
@DrDaveTabriz



Monday, May 10, 2021

Transarterial Chemoembolization for the Palliation of Painful Bone Metastases Refractory to First-Line Radiotherapy



Clinical question
How do the efficacy and safety of transarterial chemoembolization (TACE) compare with re-radiotherapy (re-RT) in the palliative treatment of radiotherapy (RT)-failure bone metastases (BMs)?

Take-away point
TACE achieved a superior response rate and a longer duration of palliation in symptomatic RT-failure BMs, without significant adverse events.

Reference
Heianna, J, Makino, W, Toguchi, M, et al. Transarterial Chemoembolization for the Palliation of Painful Bone Metastases Refractory to First-Line Radiotherapy. J Vasc Interv Radiol. 2021;32(3):384-392

Click here for abstract

Study design
Retrospective cohort study

Funding source
Self-funded or unfunded

Setting
Single institution, University of Ryukyus

Figure

Figure 2. A 63-year-old woman with radiotherapy (RT)-failure bone metastasis (BM) of the third thoracic spine from thyroid cancer who underwent transarterial chemoembolization. (a) T2-weighted magnetic resonance imaging showing severe spinal cord compression due to spinal BM. (b) Thoracic contrast-enhanced CT showing spinal cord compression due to BM (black and white arrow). (c) Localized intraoperative cone-beam CT of the left costocervical artery showing enhancement corresponding to the lesion (white arrows). (d) Selective angiography of the left costocervical artery showing a tumor blush corresponding to the lesion (black arrow). The vessel, which dorsally extends from the left costocervical artery (broken arrow), nourished the dorsal skin, as observed via angiography (no image). (e) After transarterial chemoembolization, selective angiography of the left costocervical artery shows the disappearance of the tumor blush in the upper left half of the lesion. The arrowheads indicate microcoils placed at the proximal part of the abovementioned branch vessel to prevent anticancer agent inflow and embolic material migration to the skin and muscle. (f) Thoracic contrast-enhanced CT images performed 1 month after transarterial chemoembolization showing the shrinkage of the BM with the resolution of spinal cord compression. This lesion was judged to have partially responded due to a tumor reduction rate of 40%.

Summary


Radiotherapy (RT) has been shown to be effective in treating pain caused by bone metastases (BMs) in a majority of patients. Unfortunately, recurrent pain from BM regrowth has been shown to occur in up to 50% of cases. In cases where palliative surgical excision of painful tumors is contraindicated, re-radiotherapy (re-RT) has been employed, but available data demonstrate pain relief in only 58% of patients. Transarterial chemoembolization (TACE), normally employed to reduce intraoperative bleeding, has been shown in isolated reports to temporarily reduce pain from BMs in most patients. Its primary use for the palliative treatment of BM pain, however, has not been established.

In this retrospective cohort study, 50 patients with RT-refractory BMs from various primary tumor locations who underwent a second therapy over a 6-year period were analyzed. Twenty-three patients (46%) underwent repeat RT, while 27 patients (54%) underwent TACE. All patients had severe pain after initial therapy, with disease that was refractory to medical therapy or analgesia and surgically unresectable. Indications for TACE included a predicted cumulative dose greater than 120 Gy2 after re-RT as well as hypervascular bone tumors with adequate bone marrow, hepatic, and renal function. Patients also needed an Eastern Cooperative Oncology Group performance status of ≤3. Patient demographics were statistically similar, including the Katagiri score, which predicts survival of patients with BMs. Both re-RT and transarterial chemoembolization were performed with consistent methods between patients at a single center, although a variety of chemotherapeutic agents were employed based on the primary cancer type. Measured outcomes included technical success, index lesion pain assessment within 2 weeks and at regular follow-up appointments, disease progression by imaging, and adverse events. Pain was assessed using a Numerical Rating Scale (NRS) from 0 (no pain) to 10 (worst pain).

Primary tumor types included gastrointestinal (24%), urogenital (22%), head and neck (18%), sarcoma (16%), gynecological (10%), lung (4%), and breast, thymus, and gastrointestinal stromal tumors (2% each). Sixty-six percent of patients had intermediate-risk Katagiri prognostic scores. TACE-group tumors had a median diameter of 95 mm compared with 78 mm in the re-RT group (p = 0.04). The median initial RT dose to the lesion was 33 Gy in the re-RT group and 42 Gy in the TACE group (p = 0.003). NRS scores prior to treatment were 10 in all patients analyzed.

Technical success was achieved in 24 of 27 chemoembolization cases (89%). One treatment failure resulted from the tumor being fed by the anterior spinal artery, while two failures resulted from difficult catheterization secondary to atherosclerotic plaque burden. Pain relief was achieved in 92% and 57% of TACE and re-RT patients, respectively (p = 0.006), with a median duration of relief of 3 and 2 months, respectively (p = 0.02). Median NRS scores were 3 in the TACE group and 7 in the re-RT group (p < 0.001). Median hospital stay was 1 day in the TACE group and 14 days in the re-RT group (p < 0.001). Pain-free survival at 6 months was 51% in the TACE group and 30% in the re-RT group, a difference that did not reach statistical significance (p = 0.08).

No significant adverse events, including nerve injury, skin necrosis, muscle abscess, or death, were recorded in the TACE cohort. Grade 3/4 leukopenia was observed in 4 (14%) re-RT patients and no TACE patients (p = 0.16). Grade 3/4 thrombocytopenia was observed in 2 (7%) re-RT patients and no TACE patients (p = 0.54). No patients in either cohort required blood transfusions. One (4%) patient experienced transient nausea after TACE.

Commentary


This study provides a comparison between re-RT and TACE in patients undergoing subsequent therapy for painful BMs after initial RT. Despite its limited size, the investigation demonstrated a statistically significant increase in time without pain, along with a shorter hospital stay, for patients undergoing TACE versus re-RT. It also demonstrated a paucity of adverse events, signaling that this procedure is a safe alternative to RT when applicable. The study is limited, however, by its small sample size, which makes it difficult to draw comparisons between different primary cancer types. There were also differences in tumor characteristics between groups, including the hypervascularity required of TACE-treated BMs but not of BMs treated with re-RT. The promising reduction in pain over longer periods and the decreased hospital stays, however, preliminarily suggest that TACE is a valuable treatment modality for BM pain control in a palliative care setting. Larger multicenter trials will not only elucidate a higher-resolution safety profile for TACE but also establish its efficacy on a larger scale.

Post Author
Jared Edwards, MD
General Surgery Intern (PGY-1)
Department of General Surgery
Naval Medical Center San Diego, San Diego, CA

@JaredRayEdwards


Monday, May 3, 2021

The Effect of Preoperative Renal Failure on Outcomes Following Infrainguinal Endovascular Interventions for Peripheral Arterial Disease



Clinical question
What is the effect of preintervention renal failure on acute outcomes after lower extremity endovascular interventions for peripheral arterial disease (PAD)?

Take away point
Renal failure before PAD intervention is associated with greater morbidity and mortality.

Reference
Di Capua J, Reid NJ, Som A, An T, Lopez DB, So AJ, Di Capua C, and Walker GT. The Effect of Preoperative Renal Failure on Outcomes Following Infrainguinal Endovascular Interventions for Peripheral Arterial Disease. J Vasc Interv Radiol. 2021; 32:459-465. doi.org/10.1016/j.jvir.2020.10.020

Click here for abstract

Study design
Retrospective, multicenter database analysis of 6765 patients undergoing intervention for PAD.

Funding Source
No reported funding.

Setting
Multicenter database, American College of Surgeons National Surgical Quality Improvement Program (ACS-NSQIP), USA.

Summary


PAD occurs at twice the rate in patients with chronic kidney disease (CKD) or end-stage renal disease (ESRD) as in the general population, and patients with ESRD are known to suffer worse outcomes after open surgical revascularization procedures. These patients may also be at high risk during endovascular revascularization procedures. The authors performed a retrospective database analysis of 6765 adults undergoing lower extremity intervention for PAD, comparing 30-day outcomes in patients with and without preintervention renal failure.

Billing codes were used to identify patients in the ACS-NSQIP database from 2014 to 2017 who underwent lower extremity arterial interventions, primarily angioplasty and stent placement. Patients with missing data, patients with ASA class 6, and those undergoing procedures as part of transplant or trauma treatment were excluded. Patients were defined as having preintervention renal failure if they had elevated BUN and creatinine on two measurements (both within 90 days of intervention, with one <24 hours prior to intervention) or renal failure requiring hemodialysis, peritoneal dialysis, ultrafiltration, or hemodiafiltration <2 weeks prior to intervention.

Variables were collected before and during intervention, and complications were recorded up to 30 days after intervention. Univariate analysis was performed on pre- and intraprocedural characteristics. Multivariate logistic regression identified independent risk factors for each 30-day complication, adjusted for baseline characteristics. A multivariate linear regression model compared extended hospital length of stay (LOS) between groups. As a sensitivity analysis, a propensity score-matched model using the variables identified in the multivariate analysis was used to predict preintervention renal failure.
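
As a rough illustration of this analytic approach, the sketch below fits a logistic model for a 30-day outcome with renal failure as a covariate (the exponentiated coefficient is the adjusted odds ratio) and then estimates a propensity score for preintervention renal failure. The variable names and simulated data are hypothetical, not drawn from ACS-NSQIP.

```python
# Illustrative adjusted odds ratio and propensity score estimation on simulated data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "renal_failure": rng.integers(0, 2, n),
    "age": rng.normal(70, 8, n),
    "diabetes": rng.integers(0, 2, n),
})
# Simulated 30-day mortality with higher risk in renal failure patients
logit = -4 + 1.4 * df["renal_failure"] + 0.02 * (df["age"] - 70) + 0.3 * df["diabetes"]
df["mortality_30d"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Multivariate logistic regression: exp(coefficient) gives the adjusted odds ratio
X = sm.add_constant(df[["renal_failure", "age", "diabetes"]])
outcome_model = sm.Logit(df["mortality_30d"], X).fit(disp=0)
print("Adjusted OR for renal failure:", np.exp(outcome_model.params["renal_failure"]))

# Propensity score for renal failure, used for the matched sensitivity analysis
Xp = sm.add_constant(df[["age", "diabetes"]])
ps_model = sm.Logit(df["renal_failure"], Xp).fit(disp=0)
df["propensity"] = ps_model.predict(Xp)
```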

A total of 6765 patients met criteria, with 742 patients classified as having preintervention renal failure. Renal failure patients were significantly more likely to have critical limb ischemia (CLI) with tissue loss, have inpatient status, undergo nonelective procedures, have diabetes, experience dyspnea, have a dependent functional status, carry cardiac comorbidities, have an open wound, receive blood products, be classified as ASA 3-5, and experience longer procedure times (p<.01).

Adjusted analyses revealed that patients with preintervention renal failure were more likely to experience mortality, all-cause morbidity, extended LOS >1 day, pulmonary complications, perioperative transfusion, sepsis, reoperation, amputation, hospital readmission, major adverse cardiac events (MACE), and major adverse limb events (MALE) within the first 30 days postintervention. Of these, renal failure was an independent risk factor for mortality (OR=4.11), all-cause morbidity (OR=2.03), extended LOS (OR=1.53), sepsis (OR=2.37), reoperation (OR=1.84), amputation (OR=2.74), hospital readmission (OR=1.89), MACE (OR=3.50), and MALE (OR=1.97) (p<.001). Additionally, patients with preintervention renal failure experienced a 4.2-day longer hospital stay than their counterparts. Sensitivity analysis showed no significant differences in effect sizes between cohorts, reinforcing the validity of the covariate models.

Commentary


The authors investigate outcomes in patients with and without renal failure prior to endovascular lower extremity intervention for PAD using a national surgical database. Of the recorded acute postintervention complications, the greatest increase in odds was for mortality, which agrees with mortality data from smaller studies. Additionally, there was a significantly higher rate of cardiac comorbidity in patients with renal failure (8.5% vs 2.5%), as well as an independent increase in the odds of postintervention MACE. Similar studies have only seen increased risk in ESRD patients (versus CKD patients), which suggests patients with dialysis-dependent renal failure may be driving the authors' results. The third-highest odds ratio was for amputation after the procedure. The authors compare this with similar surgical data, noting that some studies have only found significance in long-term amputation rates, implying that the 30-day follow-up data may underestimate the true risk.

In addition to the short-term follow-up period, this study has several other limitations. Its retrospective, billing code-driven design did not allow for investigation of differences among procedural techniques, and the use of a surgical database may limit the applicability of the results to interventional radiology. Also, although the authors' aim was to specifically address preintervention renal failure, their definition of renal failure does not adhere to the diagnostic definitions of CKD or ESRD, which limits the overall generalizability and comparability of their results. Moreover, had established definitions of CKD and ESRD been used, a stratified analysis by CKD stage could have offered additional insight into periprocedural risks, particularly because other studies have found conflicting results in stratified analyses. The main strength of this study lies in its very large sample size (n=6765), which provides a robust complement to smaller-scale data.

Overall, this large-scale, national study demonstrates that patients with preprocedural renal failure are at risk for worse outcomes than their nonaffected counterparts and, as such, incur longer hospital stays. Interventionalists should be aware of these increased risks when offering and performing PAD interventions in this population.

Post Author
Catherine (Rin) Panick, MD
Resident Physician, Integrated Interventional Radiology
Dotter Interventional Institute
Oregon Health & Science University

@MdPanick