Pharmaceutical sciences
Permanent URI for this collection: https://digital.lib.washington.edu/handle/1773/25913
Recent Submissions
Comparing Healthcare Resource Utilization and Costs in HFpEF Patients Using SGLT2-Inhibitor Combination Treatments: A Retrospective Claims Analysis (2025-08-01). Lee, Jean; Hansen, Ryan N.

Background: Heart failure with preserved ejection fraction (HFpEF) represents an increasingly significant clinical burden. Recent treatment guidelines recommend sodium-glucose cotransporter-2 inhibitors (SGLT2i) as foundational therapy, with the addition of either mineralocorticoid receptor antagonists (MRAs) or angiotensin receptor-neprilysin inhibitors (ARNIs) as adjunctive therapy. However, real-world evidence comparing healthcare resource utilization (HCRU) and costs between these combination therapies remains limited. Methods: We conducted a retrospective cohort study using the Merative™ MarketScan® commercial and Medicare databases from 2019 to 2023 to compare HCRU and costs in adult patients (≥18 years) with HFpEF initiating SGLT2i+MRA vs. SGLT2i+ARNI. Patients required SGLT2i usage and continuous enrollment for 12 months pre- and post-index. Those using both MRA and ARNI concurrently were excluded. Outcomes assessed over 12 months post-initiation included inpatient (IP), emergency department (ED), and outpatient (OP) services, and pharmacy costs. Multivariable regression models adjusted for demographic and clinical covariates. Results: The study included 2,128 patients (1,418 MRA, 710 ARNI). Adjusted analyses showed that the ARNI group experienced significantly lower HCRU, including 9% fewer IP admissions (IRR, 0.91; 95% CI, 0.84-0.98; p=0.015), 20% shorter IP length of stay (IRR, 0.80; 95% CI, 0.72-0.89; p<0.001), 12% fewer ED visits (IRR, 0.88; 95% CI, 0.79-0.98; p=0.016), and 11% fewer OP service days (IRR, 0.89; 95% CI, 0.84-0.95; p<0.001).
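The IRRs above come from adjusted regression models; as a minimal illustration of what an incidence rate ratio measures, a crude (unadjusted) IRR can be computed from event counts and person-time. All numbers below are hypothetical, not the study's data:

```python
import math

def crude_irr(events_a, time_a, events_b, time_b):
    """Crude incidence rate ratio of group A vs. group B.

    events: count of admissions/visits; time: person-years at risk.
    """
    irr = (events_a / time_a) / (events_b / time_b)
    # Wald 95% CI on the log scale (assumes Poisson-distributed counts)
    se_log = math.sqrt(1 / events_a + 1 / events_b)
    lo = math.exp(math.log(irr) - 1.96 * se_log)
    hi = math.exp(math.log(irr) + 1.96 * se_log)
    return irr, lo, hi

# Hypothetical: 320 admissions over 700 person-years in one cohort
# vs. 500 admissions over 1,000 person-years in the comparator.
irr, lo, hi = crude_irr(320, 700, 500, 1000)
```

An IRR of 0.91, as reported for IP admissions, means the ARNI group accrued events at 91% of the MRA group's rate after adjustment.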
Unadjusted mean total healthcare costs were modestly lower in the ARNI group than in the MRA group, driven by reduced IP, ED, and OP costs, despite slightly higher pharmacy costs largely attributable to differences in drug prices. However, no statistically significant differences were observed in adjusted total healthcare costs across settings. Conclusions: In this real-world analysis, adjunctive ARNI therapy was associated with reduced HCRU but did not translate into lower overall healthcare costs compared to MRA-based combination therapy. These findings highlight the importance of considering both utilization patterns and drug costs when selecting combination treatments for HFpEF, supporting the need for individualized treatment strategies, particularly in patients with multiple comorbidities.

Healthcare Resource Utilization and Costs of Commercially Insured US Patients with Atopic Dermatitis Switching from First-line to Second-line Systemic Targeted Therapies (2025-08-01). Li, Kevin Haokun; Sullivan, Sean.

Background: The recent expansion of Food and Drug Administration (FDA)-approved treatment options for moderate-to-severe atopic dermatitis (AD) has notably improved clinical management. With the availability of these novel therapies, data on the frequency of therapy switching and on differences in healthcare resource utilization (HCRU) and costs between switchers and non-switchers are limited. Objective: To evaluate the frequency of treatment switching from first- to second-line systemic targeted therapies and compare HCRU and costs between switchers and non-switchers among commercially insured US patients with moderate-to-severe AD. Methods: We conducted a retrospective cohort study using MarketScan health insurance claims data. Adult patients with AD initiating a first systemic targeted therapy (index date) between January 1, 2022 and December 31, 2022 were identified and followed for at least one year from the index date.
Two cohorts were classified based on whether switching occurred over follow-up, defined as discontinuation of the first systemic targeted therapy and initiation of a second systemic targeted therapy. All-cause and AD-related HCRU outcomes, including hospitalizations, emergency department (ED) visits, and outpatient services, were compared between switchers and non-switchers during the follow-up period. Total healthcare costs, categorized into medical and drug costs, were also evaluated. Statistical significance was defined as a two-sided p-value < 0.05. Results: Within one year of initiating first-line systemic targeted therapy, 466 (5.8%) of the 8,063 patients with moderate-to-severe AD included in this study switched to a second-line systemic targeted therapy. Nearly all switchers (96.4%) had at least one AD-related outpatient service compared to 82.8% of non-switchers (p < 0.0001), and the mean number of visits was higher among switchers than non-switchers (5.12 vs. 3.20, p < 0.0001). AD-related hospitalizations and ED visits were rare. Mean total AD-related healthcare costs were also significantly higher among switchers than non-switchers ($63,245 vs. $53,004; p < 0.0001), with drug costs accounting for approximately 99% of AD-related healthcare expenditures in both groups. Discussion: We found that a small proportion (5.8%) of patients switched from first- to second-line systemic targeted therapy over a median follow-up of approximately one year. Patients who switched therapies incurred significantly higher AD-related outpatient service use and total healthcare costs than non-switchers, which may reflect worsening disease severity, inadequate response, or intolerance to first-line therapy. These findings underscore the importance of personalized considerations in selecting first-line systemic targeted therapy for patients with moderate-to-severe AD to reduce downstream economic burden.
As additional therapies become available, future research exploring reasons for treatment switching and patient factors influencing response will be critical to guide clinical and formulary decision-making in this evolving treatment landscape.

Healthcare Utilization and Cost Burden Associated with Depression among Patients with Alzheimer’s Disease - A Retrospective Cohort Analysis (2025-08-01). Yip, Olivia; Lee, Kyueun.

Background: Alzheimer’s disease (AD) is the most common cause of dementia in the United States. Patients with AD have increased use of healthcare services, including more frequent high-cost events such as hospitalizations and emergency department (ED) visits, compared to cognitively normal individuals. Neuropsychiatric symptoms are more common in individuals with AD than in the general population. Many patients exhibit symptoms consistent with depression, such as mood changes, social withdrawal, apathy, and suicidal ideation. Additionally, individuals with AD and comorbid depression tend to show more severe neuropathological changes, including greater accumulation of tau, amyloid, and vascular pathology, compared to those without depression. Depression and its related symptoms in AD patients contribute to greater caregiver burden, higher fall risk, and a greater likelihood of requiring costly interventions such as skilled nursing care and early institutionalization. To our knowledge, no studies have examined the healthcare burden associated with comorbid depression among patients with AD. Objective: In this study, we compared AD patients with depression to AD patients without depression with respect to patient characteristics, healthcare resource utilization, and costs to characterize the healthcare burden. Methods: We performed a retrospective cohort study using MarketScan® health insurance claims data.
AD patients with depression within three years of AD diagnosis and AD patients without depression were identified between January 1st, 2017, and December 31st, 2022, and followed for up to one year after their depression or proxy diagnosis. AD controls without depression were matched 6:1 to cases with depression. The primary outcomes of interest were all-cause healthcare resource utilization and expenditures during the follow-up period. Results: Over the 12-month follow-up, patients with AD and comorbid depression had substantially higher healthcare costs and utilization compared to those without depression. Adjusted models showed significantly higher total costs (+$13,089; 95% CI: $11,623, $14,554) and greater utilization, including more ED visits, office visits, drug claims, inpatient days, and hospital admissions, all statistically significant. Conclusion: The results of this study suggest that the financial burden following a depression diagnosis in AD patients is substantial. These findings highlight the need for more effective treatments that can mitigate resource use and economic burden in the management of depression among this vulnerable population. There is a clear unmet need for properly managing depression in patients with underlying AD.

Evaluating the Medicaid Subscription-based Payment Model in Hepatitis C (2025-08-01). Elsisi, Zizi; Basu, Anirban.

Background: Hepatitis C virus (HCV) remains a major contributor to liver cirrhosis, hepatocellular carcinoma, and liver transplantation in the United States, imposing substantial clinical and economic burdens. Although direct-acting antivirals (DAAs) offer cure rates exceeding 95%, access remains constrained due to high drug costs, limited screening uptake, and Medicaid-specific treatment restrictions.
In July 2019, Louisiana and Washington implemented subscription-based payment models (SBPMs), which decouple drug pricing from volume to promote broader treatment access through fixed-cost agreements. While theoretically promising, the real-world impact of SBPMs on HCV care delivery and their long-term societal value remains insufficiently evaluated. Methods: This study integrated retrospective claims-based analysis with decision-analytic modeling to evaluate the clinical and economic implications of SBPMs. Aim 1 employed a comparative effectiveness design with payer-complete claims data from the Komodo Health database (2018–2022), identifying Medicaid Managed Care (MMC) beneficiaries aged 18–64. Synthetic control methods were applied to construct counterfactuals for Louisiana and Washington using data from 14 comparator states, matched on pre-policy trends and state and patient characteristics. Primary outcomes included HCV screening, RNA testing, and DAA initiation and refill rates, with subgroup and geographic stratification. In Aim 2, a state-specific, lifetime Markov model was developed to simulate disease progression across health states representing chronic infection (F0–F4), decompensated cirrhosis, hepatocellular carcinoma, liver transplantation, background mortality, and liver-related mortality. The model was parameterized using real-world behavioral inputs from Aim 1. Outcomes were assessed from a societal perspective and included total costs, quality-adjusted life years (QALYs), and incremental net monetary benefit (INMB), using a willingness-to-pay (WTP) threshold of $150,000 per QALY. Results: Aim 1: In Louisiana, SBPM implementation led to statistically significant and sustained increases in RNA testing (+35.2 per 100,000 population; P<0.10), DAA initiation (+7.8 per 1,000), and refill rates (+24.4 per 1,000), with improvements consistent across age, sex, and comorbidity subgroups.
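The INMB metric used in Aim 2 follows directly from its definition, INMB = WTP × ΔQALYs − ΔCost. A minimal sketch using Louisiana's Aim 2 estimates (roughly 5,464 incremental QALYs and $506 million in savings, so ΔCost is negative):

```python
def inmb(delta_qalys, delta_cost, wtp=150_000):
    """Incremental net monetary benefit: WTP * ΔQALYs - ΔCost.

    A positive INMB means the strategy is cost-effective at the
    stated willingness-to-pay threshold.
    """
    return wtp * delta_qalys - delta_cost

# Louisiana: SBPM vs. traditional pricing. Cost savings of ~$506M
# enter as a negative incremental cost.
louisiana_inmb = inmb(delta_qalys=6_377_658 - 6_372_194,
                      delta_cost=-506_000_000)
# = 150,000 * 5,464 + 506,000,000 ≈ $1.33 billion,
# consistent with the reported $1.3B INMB
```

A cost-saving, QALY-gaining strategy like this is dominant: both terms push the INMB upward.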
Conversely, Washington experienced a significant decline in treatment uptake, with DAA initiations and refills decreasing by 3.9 and 12.9 per 1,000 patients, respectively (P<0.10), with no corresponding improvements in screening or RNA testing. Aim 2: In Louisiana, the SBPM generated 6,377,658 QALYs at a societal cost of $56.1 billion, compared to 6,372,194 QALYs and $56.6 billion under traditional pricing. The model projected $506 million in cost savings and an INMB of $1.3 billion, indicating that the SBPM strategy was highly cost-effective. In Washington, SBPM implementation resulted in slightly fewer QALYs (2,235,301 vs. 2,235,972) and $95.2 million in additional costs, yielding a negative INMB and suggesting that the strategy was not cost-effective in that context. Conclusion: SBPMs offer a scalable approach to improving access to curative HCV treatment and can yield substantial societal benefits when implemented effectively. However, the divergent outcomes in Louisiana and Washington underscore the importance of state-level implementation context. DAA financing reform alone is insufficient to achieve HCV elimination; investments in screening infrastructure, provider engagement, patient outreach, and real-time data monitoring are essential to maximize public health impact. These findings highlight the critical role of integrated policy and system-level interventions in advancing equitable and cost-effective HCV care.

Understanding the outpatient medication use and spending of cognitively impaired older adults under changing Medicare Part D policy (2025-01-23). Tabah, Ashley; Hansen, Ryan N.

Background: Between 2011 and 2020, the Affordable Care Act progressively closed the Medicare Part D coverage gap, reducing coinsurance from 100% to 25% to alleviate financial burdens and improve prescription drug access. Individuals with cognitive impairment face particularly high drug costs and are at greater risk for medication discontinuation.
Furthermore, adherence is essential in this population for managing their disease and comorbidities to reduce healthcare use and expenditures. We evaluated the effect of reduced coinsurance on out-of-pocket (OOP) costs and medication use for beneficiaries with cognitive impairment, as well as the association between medication adherence and inpatient (IP) and emergency department (ED) healthcare costs and visits. Methods: The study sample comprised Health and Retirement Study respondents with cognitive impairment, linked to Medicare claims (2006-2018). Aim 1 evaluated the causal effect of coverage gap closure on Part D OOP spending using a difference-in-differences (DID) event study approach. Aim 2 assessed the causal effect of coverage gap closure on medication use with a DID analysis. For these two aims, we estimated the change in the outcome for non-low-income-subsidy respondents relative to low-income-subsidy (LIS) respondents, who were not subject to the coverage gap. Aim 3 examined the association between adherence to AD medication and IP and ED utilization and costs using two-part models. Results: Closure of the coverage gap resulted in a significant reduction in annual OOP spending (2011 vs. 2010: -$134; 95% CI: -174 to -94; p<0.001) among beneficiaries with cognitive impairment and an increase in the probability of AD medication use among those with ADRD (4.3 percentage points [ppts]; 95% CI: 1.2-7.4; p=0.017). In terms of healthcare resource utilization and cost, adherence to AD drugs was associated with significant reductions in the probability of incurring IP (-2.4 ppts; 95% CI: -4.3 to -0.55 ppts; p=0.011) and ED healthcare costs (-6.4 ppts; 95% CI: -9.8 to -2.9 ppts; p<0.001), and of having an IP hospitalization (-2.3 ppts; 95% CI: -4.26 to -0.40 ppts; p=0.018) or ED visit (-6.4 ppts; 95% CI: -10.1 to -2.8 ppts; p<0.001).
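In its simplest 2x2 form, the difference-in-differences contrast used in Aims 1 and 2 subtracts the control group's pre/post change from the treated group's. The numbers below are made up for illustration (they are chosen so the toy estimate happens to match the reported -$134, but they are not the study's inputs):

```python
def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """2x2 difference-in-differences:
    (change in treated group) minus (change in control group)."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical mean annual OOP spending ($): non-LIS beneficiaries
# (exposed to the coverage-gap closure) vs. LIS beneficiaries
# (never subject to the gap, so they absorb the secular trend).
effect = did_estimate(treated_pre=1_200, treated_post=1_020,
                      control_pre=800, control_post=754)
# effect = -134: the policy-attributable change after netting out
# the trend seen in the unexposed group
```

The event-study version of this estimator adds one such contrast per year relative to the policy date, which is how the 2011-vs-2010 estimate in the Results is indexed.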
However, adherence was not significantly associated with the amount of IP costs incurred conditional on incurring any costs, or with the number of hospitalizations conditional on having any hospitalizations. In contrast, adherence was associated with a 19.3 ppt reduction in total non-zero ED costs (95% CI: -30.2 to -6.7 ppts; p<0.01) and a 15.9 ppt reduction in the number of ED visits (95% CI: -22.7 to -8.2 ppts; p<0.01). Conclusion: In this sample of cognitively impaired Medicare beneficiaries, we found that closure of the Medicare Part D coverage gap successfully reduced OOP spending and increased medication use. Furthermore, medication adherence was associated with a significant reduction in the probability of healthcare resource utilization and cost. This research lays a foundation for the study of other chronic conditions and of the effects of the OOP spending caps that will be implemented under the Inflation Reduction Act.

Real World Treatment Patterns and Outcomes Among Patients with Early Non-Small Cell Lung Cancer (2024-10-16). Deem, Jennifer; Carlson, Joshua J.

Background: Worldwide, about two million people are diagnosed with lung cancer each year, 85% of whom have non-small cell lung cancer (NSCLC). Over the last fifteen years, noteworthy progress has been made in treating advanced metastatic NSCLC with targeted systemic therapies. As early and comprehensive care can potentially improve and extend the lives of patients, attention is now turning toward Stages I–IIIA, or early NSCLC (eNSCLC), with recent approvals of neoadjuvant and adjuvant systemic therapies. With this rapidly changing treatment landscape, it is critical to understand how care is implemented and for whom, to appreciate the real-world adoption of innovative treatments in eNSCLC as they enter the market.
Methods: This retrospective observational study used Flatiron Health, a US nationwide electronic health record-derived and de-identified database spanning January 2019 - March 2024, to (1) describe eNSCLC patient demographic and clinical characteristics, (2) characterize real-world neoadjuvant and adjuvant treatment patterns, and (3) examine how these treatment patterns relate to long-term patient outcomes. Results: We studied 7,410 patients, mostly female (52.9%), with a mean age of 71.0 ± 8.5 years. Most were diagnosed at Stage I (n = 4,098), with the rest at Stages II and IIIA. About 65% received curative-intent treatment: surgery (50%), radiation (4.6%), or chemoradiation (10.7%). The rest did not receive definitive treatment. Neoadjuvant use was rare, and adjuvant use was limited. Most surgical patients, primarily at Stage I, received neither adjuvant nor neoadjuvant systemic therapy (62.1%). In contrast, smaller fractions of the definitive radiation and chemoradiation treatment groups went without neoadjuvant/adjuvant systemic therapies (24.6% and 50%, respectively). Immunotherapy monotherapy was the most common adjuvant therapy for patients undergoing definitive radiation or chemoradiation, while surgical patients received platinum chemotherapy. Survival outcomes were better for patients treated with adjuvant systemic therapy following definitive radiation or chemoradiation. Patients undergoing definitive radiation without neoadjuvant/adjuvant systemic therapy had lower survival rates, but adjuvant therapy improved these rates. A similar trend was observed in patients who received chemoradiation. Conclusions: The landscape of treatment possibilities for patients diagnosed with eNSCLC is expanding rapidly. However, our comprehension of how these advancements are integrated into clinical practice and their impact on patient outcomes is just starting to unfold.
A crucial initial step in improving patient outcomes is to confront and address the underutilization of neoadjuvant/adjuvant systemic therapy for eNSCLC patients.

Impact of Targeted Therapy on Healthcare Resource Utilization Among Patients with Atopic Dermatitis (2024-10-16). Ta, Richard C; Li, Jing.

Background: Atopic dermatitis (AD), the most prevalent inflammatory dermatologic condition, manifests as dry skin, erythema, and rashes. Although many targeted therapies have been approved, no studies have quantified their impact on healthcare resource utilization (HCRU) and related costs among patients diagnosed with AD. Objective: To assess the impact of targeted therapy on intensive HCRU (inpatient and emergency department [ED] visits) and costs between those who have and have not been prescribed a targeted therapy among patients with AD in the US. Methods: Retrospective cohort study using the MarketScan Commercial claims database. Subjects with AD who were and were not prescribed a targeted therapy between January 1st, 2017 and December 31st, 2022 were identified. A difference-in-differences analysis was used to assess the effect of targeted therapy on HCRU and costs among AD patients. The study period ran from January 1st, 2012, through December 31st, 2023. Results: We identified 157,966 individuals with AD, 148,646 of whom were not treated with a targeted therapy, while 9,320 received such treatment. Prescription of a targeted therapy was associated with a 0.016 (95% CI: -0.021, -0.010; p-value < 0.01) decrease in the average proportion of inpatient visits and a 0.017 (95% CI: -0.025, -0.008; p-value < 0.01) decrease in the average proportion of ED visits compared to those not on a targeted therapy.
On average, targeted therapy was associated with an increase of $25,219 (95% CI: 24,759, 25,679; p-value < 0.01) in total cost and $26,061 (95% CI: 25,734, 26,387; p-value < 0.01) in drug cost, along with a $909 (95% CI: -1,106, -711; p-value < 0.01) decrease in outpatient cost and a $251 (95% CI: -382, -119; p-value < 0.01) decrease in inpatient cost compared to those not on a targeted therapy. Conclusion: We found a significant decrease in the average proportion of AD patients requiring inpatient or ED visits when treated with targeted therapy compared to those not on such therapy, suggesting that targeted therapies may be effective in reducing intensive healthcare resource use. Further research should be conducted to understand how adherence to targeted therapy and treatment history can impact HCRU and cost.

Equitable access and reimbursement for pharmacy-based services: A case study of adult vaccinations (2024-09-09). Wittenauer, Rachel; Stergachis, Andy.

Community pharmacies are vital access points for healthcare in the United States. The COVID-19 pandemic highlighted the indispensable role of community pharmacists for patients and the healthcare system. However, not every neighborhood has good access to pharmacies, and pharmacies are facing increased financial and operational pressures that threaten their widespread availability. Despite the importance of community pharmacies, robust evidence at a national level is lacking on 1) the populations and locations that have low access to pharmacies, 2) whether this lack of access affects utilization of pharmacist-provided health services, and 3) whether any of the proposed policy solutions, such as provider status recognition for pharmacists at the federal level, provide a plausible path forward to bolster access to pharmacist-provided health services. This investigation is structured as three aims. First, I defined, mapped, and characterized the locations of all “pharmacy deserts” in the U.S.
at the census tract level. Pharmacy deserts are defined as areas that are both low-income (>20% of residents living below the federal poverty line, or median income <80% of that of the nearest metro area) and low-access (>1/3 of residents living outside a 1-, 5-, or 10-mile radius of any pharmacy, depending on urbanicity). I found that 15.8 million people (4.7% of the U.S. population) live in neighborhoods classified as pharmacy deserts. Further, the populations living in these neighborhoods had a higher prevalence of many known social determinants of health, such as lower educational attainment, racial/ethnic minority status, and lower health insurance coverage. These patterns were generally consistent across urban and rural areas and across all 50 states. Second, I evaluated whether pharmacy desert neighborhoods were associated with lower utilization of a key pharmacy-based health service: shingles vaccination. I acquired census-tract-level vaccination data from seven state Departments of Health and used propensity score matching to account for a variety of known confounding factors. The results of our primary analysis showed that pharmacy desert status was not associated with lower vaccination completion rates (0.4 fewer shingles vaccinations per 1,000 population, p = 0.83). However, our secondary analysis found that census tracts with low pharmacy access (as opposed to the two-part pharmacy desert definition that also requires low income) had reduced shingles vaccination completion rates (2.4 fewer vaccinations per 1,000 population, p = 0.004). This pattern indicates that lack of community pharmacy access may have direct health consequences for people living in these neighborhoods.
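The two-part desert definition can be expressed as a simple classifier over tract-level measures. The helper below is a hypothetical sketch of that logic only; the field names and the urbanicity-specific radius handling are illustrative assumptions, not the dissertation's code:

```python
def is_pharmacy_desert(pct_below_poverty, income_vs_metro,
                       pct_beyond_radius):
    """A census tract is a pharmacy desert if it is BOTH low-income
    AND low-access, per the definitions in the text.

    pct_below_poverty: share of residents below the federal poverty line
    income_vs_metro:   tract median income / nearest metro median income
    pct_beyond_radius: share living outside the 1-, 5-, or 10-mile
                       radius appropriate to the tract's urbanicity
    """
    low_income = pct_below_poverty > 0.20 or income_vs_metro < 0.80
    low_access = pct_beyond_radius > 1 / 3
    return low_income and low_access

# A tract with 25% poverty where 40% live beyond the radius qualifies;
# a low-access but higher-income tract does not (this conjunctive
# definition is exactly why the secondary, access-only analysis can
# find effects the desert analysis does not).
```

The two-condition AND is the crux of the primary-vs-secondary result: dropping the income condition changes which tracts are flagged, and only the access-only definition was associated with lower vaccination rates.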
Lastly, I used a national claims database to explore the effects of state-level provider status legislation on reimbursements for shingles and seasonal influenza vaccination visits at pharmacies. We found that, despite having the legal authority to do so, pharmacies submit very few claims to health insurance plans for vaccination services. Our dataset contained 2.3 million vaccination visits between 2021-2022, of which only 0.4% had any outpatient services claims billed during the visit, even in provider status states. This inhibits more robust evaluation of these policies' effects and may indicate important implementation barriers to address alongside these new authorities for pharmacists. In sum, this body of work provides evidence on the current state of access to pharmacies in the U.S., the negative effect of poor pharmacy access on shingles vaccination, and the potential utility of state-level provider status legislation in improving the profitability of vaccination services in community pharmacies.

Understanding Pre-Exposure Prophylaxis Use in the United States and the Potential Impact of Community Pharmacies (2024-09-09). Fulcher, Jacinda Tran; Hansen, Ryan N.

The US HIV epidemic is characterized by notable geographic, racial, and socioeconomic disparities and substantial health and economic burden. Pre-exposure prophylaxis (PrEP) is a safe, effective therapy to prevent the acquisition of HIV, but uptake remains low, partly due to inadequate access to PrEP providers. Community pharmacies are well-positioned to expand access for populations disproportionately impacted by the HIV epidemic. This dissertation leverages several modeling methods to examine population-level factors driving PrEP utilization and the potential impact of community pharmacies for expanding PrEP access nationwide.
First, I trained a robust set of machine learning models to explore population-level characteristics and social determinants of health that were most predictive of county-level PrEP use across the US. The best performing model highlighted the importance of county HIV prevalence and testing rates, access to healthcare facilities and providers, healthy lifestyle indicators (e.g., access to exercise facilities, obesity rates), and sociodemographic factors (e.g., racial composition, income, education). This exploratory, ecological analysis sets the stage for further investigations into the relationships between identified predictors and PrEP utilization, ultimately informing potential population-level strategies or policies to promote PrEP uptake. Furthermore, prompted by recent state legislation permitting community pharmacy-based PrEP, my second and third aims assessed the potential impact of a hypothetical federal policy empowering pharmacists to initiate PrEP for eligible individuals. My nationwide examination of geospatial access to PrEP providers and community pharmacies in 2022 demonstrated that community pharmacies could expand access in 78.2% to 94.3% of census tracts that currently lack PrEP access, potentially benefitting 34.7 to 41.0 million US residents and alleviating geographic and racial disparities in PrEP access. Lastly, I adapted an infectious disease model to simulate plausible pharmacy-based PrEP scenarios in the Atlanta metropolitan area, a region with high HIV burden. I concluded that community pharmacy-based PrEP could substantially improve health outcomes in terms of quality-adjusted life-years gained and HIV cases averted and would be cost-saving or cost-effective over a 50-year time horizon. 
Overall, the work presented in this dissertation provides insight into predictors of PrEP utilization in the US and the potential impact and value of community pharmacies for bridging gaps in PrEP access.

Identifying and Characterizing Commercially Insured HFpEF Patients with High vs. Low Healthcare Resource Utilization (2024-09-09). Earl, Jake; Hansen, Ryan.

Background: Heart failure with preserved ejection fraction (HFpEF) represents half of all heart failure (HF) diagnoses and is a growing public health concern. Despite therapeutic advancements, HFpEF contributes to substantial HF-related healthcare utilization and costs. Further investigation is needed to characterize these measures and identify potential associations. Objective: The objectives of this study were to characterize differences in healthcare resource utilization and costs between the top 90th and bottom 10th percentiles of total healthcare costs, examine the association between patient characteristics at diagnosis and the odds of being in the 90th percentile, and examine differences in utilization and costs over time between the high- and low-cost groups. Methods: We conducted a retrospective cohort study using data from the Merative™ MarketScan® Research Database, including commercially insured adults diagnosed with HFpEF between 2014 and 2021. Baseline characteristics, healthcare utilization, and costs were analyzed, and multivariable logistic regression was used to assess factors associated with higher costs. Healthcare resource use and costs over one-year follow-up were estimated using a Kaplan-Meier Sample Average (KMSA) approach with bootstrapping. Results: There were 24,078 HFpEF patients included in the study. High-cost patients exhibited significantly greater healthcare resource utilization, with an incremental average of 12 ED/urgent care visits, 3 inpatient admissions, and 29 days of hospital stay per year.
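The bootstrapping behind the KMSA cost estimates can be illustrated with a generic percentile bootstrap of a mean: resample patients with replacement and recompute the statistic each time. This is a simplified stand-in on toy data, not the KMSA estimator itself, which additionally weights costs by survival:

```python
import random
import statistics

def bootstrap_mean_ci(values, n_boot=2000, alpha=0.05, seed=0):
    """Percentile-bootstrap CI for a mean: resample with replacement,
    recompute the mean, and take the alpha/2 and 1-alpha/2 quantiles
    of the resampled means."""
    rng = random.Random(seed)
    boot_means = sorted(
        statistics.fmean(rng.choices(values, k=len(values)))
        for _ in range(n_boot)
    )
    lo = boot_means[int((alpha / 2) * n_boot)]
    hi = boot_means[int((1 - alpha / 2) * n_boot) - 1]
    return statistics.fmean(values), (lo, hi)

# Toy annual cost data ($); heavy right skew, as is typical for
# healthcare costs, is exactly why a bootstrap CI is attractive here.
costs = [1_700, 2_400, 9_800, 52_000, 3_100, 410_000, 7_600, 1_200]
mean_cost, (ci_lo, ci_hi) = bootstrap_mean_ci(costs)
```

The percentile interval makes no normality assumption, which matters for cost distributions dominated by a few very expensive patients.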
Mean total annual costs were $363,092 for the 90th percentile and $1,710 for the 10th percentile. Baseline characteristics associated with higher odds of being in the high-cost group included female sex, with odds 1.13 times (95% CI: 1.1, 1.2) those of males, and Charlson Comorbidity Index (CCI) score: compared to the lowest CCI score observed (one), a CCI score of two was associated with 3.27 times (95% CI: 3.0, 3.6) the odds, and a CCI score greater than two with 18.71 times (95% CI: 16.8, 20.8) the odds of belonging to the 90th percentile. Comorbidities associated with higher odds of being in the high-cost group included atrial fibrillation (AF), with a 3.54 times (95% CI: 2.8, 4.4) increase in odds. Drug classes that increased the odds of belonging to the 90th percentile were loop diuretics (2.18 times; 95% CI: 2.0, 2.4), ARNI (1.89 times; 95% CI: 1.1, 3.1), and SGLT2i (4.48 times; 95% CI: 3.0, 6.7). Factors associated with lower odds of being in the high-cost group included diabetes mellitus (OR 0.53; 95% CI: 0.4, 0.7), hypertension (OR 0.71; 95% CI: 0.6, 0.8), and chronic kidney disease (OR 0.62; 95% CI: 0.4, 0.9). Conclusion: Significant incremental differences in healthcare utilization and costs exist between high-cost and low-cost HFpEF patients, indicating an opportunity for improvement. Identifying and addressing factors associated with higher costs at diagnosis could improve outcomes and reduce healthcare expenditures.
Future real-world-evidence studies should focus on the impact of integrating SGLT2 inhibitors into clinical practice, and more clinical research is needed to determine the impact of this finding on managing AF in HFpEF patients.

Incorporating Equity into Healthcare Decision Making Around New Technologies (2024-09-09). Khor, Sara; Bansal, Aasthaa; Carlson, Josh.

Innovations in cancer care, such as the development of new pharmaceutical treatments, are widely recognized for their potential to extend lives and enhance quality of life. However, concerns persist regarding the equitable distribution of these innovations and their potential to exacerbate existing disparities in health outcomes. Moreover, the advent of cutting-edge clinical risk prediction tools presents a significant dilemma: while promising in guiding cancer treatment decisions, there is a pressing concern that these technologies could embed discriminatory biases and perpetuate racial disparities in health outcomes. Specifically, the inclusion of race as a predictive factor in these algorithms raises alarms about the potential for differential treatment across racial groups, further entrenching inequities within the healthcare system. This study aimed to generate evidence to support the incorporation of equity considerations into healthcare decision processes surrounding these innovative developments. Aim 1 utilized a quasi-experimental approach to evaluate the impact of cancer innovations on both population-level survival and health disparities across income levels. Our findings revealed improved survival rates in lung cancer and melanoma, juxtaposed with exacerbated disparities across income levels, suggesting a plausible causal link between new innovations and health disparities. Aim 2 employed a microsimulation model to examine the long-term health disparity implications of omitting race from a colon cancer survival prediction tool.
The model projected that omitting race from this tool for adjuvant chemotherapy recommendations could worsen survival for Black patients and widen the disparity gap. These findings underscore the importance of integrating equity considerations into the fabric of policies governing the evaluation, adoption, and diffusion of innovative healthcare solutions to mitigate their disparity impacts.Item type: Item , Risk Prediction and Value of Polygenic Risk Scores in Colorectal Cancer Screening(2024-09-09) Jiang, Shangqing; Veenstra, David. Risk prediction models based on common genetic variants, known as polygenic risk scores (PRS), have shown promise in guiding personalized screening for colorectal cancer (CRC). Continued efforts to improve PRS risk prediction are needed for clinical use, and understanding its added value in guiding CRC screening is needed to inform screening guidelines, clinical adoption, and reimbursement decisions. In Chapter 1, we assessed whether the clinical validity of PRS risk prediction models could be improved by a new approach, the multiple polygenic score (MPS) approach. This approach leverages PRSs developed for other diseases to enrich the risk prediction model. We first used machine learning models and large datasets to develop the MPS risk prediction models. We then used an independent dataset to validate the clinical validity of these models, measured by the area under the receiver operating characteristic curve (AUC). Our results showed that MPS was a statistically significant predictor of CRC risk. Additionally, the increment in AUC associated with the MPS approach was small but statistically significant. Our findings suggest that the MPS approach can improve PRS risk prediction models for CRC, although more efficient approaches that improve the AUC more substantially should be explored in the future. 
In Chapter 2, we developed a decision-analytic model to simulate the long-term clinical and economic value of population-level genomic screening to inform CRC screening. The genomic screening interventions included (1) population-level screening for PRS, (2) population-level screening for Lynch syndrome (LS), a rare but highly penetrant genetic syndrome associated with high risk of CRC (lifetime risk up to 70%), and (3) population-level screening for both PRS and Lynch syndrome. We compared these interventions with standard of care. We found that genomic screening for both Lynch syndrome and PRS was marginally cost-effective, whereas genomic screening for PRS alone or Lynch syndrome alone was unlikely to be cost-effective. Our study also found that potential harms associated with false reassurance, i.e., reduced screening due to negative genomic results, could nullify the clinical benefits of genomic screening. The findings suggest that both Lynch syndrome and PRS are important components of the value of population-level genomic screening. Additionally, our study emphasizes that proper risk communication with patients is critical to reduce the harm of false reassurance. Lastly, we found that genomic screening generated the largest clinical benefits when offered at age 0 years, and that delayed genomic screening had a greater negative impact on individuals with LS than on individuals with a high PRS. Our studies first help inform methodological development of PRS risk prediction in CRC. Future studies should continue to develop and examine new methods to improve the clinical validity of PRS efficiently. Our findings also help clarify the economic value of population-level genomic screening for CRC. 
Future studies should continue to assess the value of genomic screening for other diseases to facilitate the understanding of the value of genomics in disease screening and prevention.Item type: Item , Treatment Patterns and Patient Characteristics in Misdiagnosis of Bipolar I Disorder(2024-09-09) Haile, Filmon; Lee, Kyueun. Background: Bipolar I disorder (BP-I) presents significant diagnostic challenges and is often misdiagnosed as major depressive disorder (MDD) during initial evaluation. Misdiagnosis can lead to inappropriate treatment regimens, including the prescription of antidepressant monotherapy, which poses the risk of inducing manic episodes in BP-I patients. Understanding post-misdiagnosis treatment patterns among BP-I patients is necessary to improve diagnostic accuracy and treatment selection. Objective: This retrospective cohort study aimed to characterize treatment patterns among BP-I patients who were misdiagnosed with MDD during their misdiagnosis period and to assess their associations with the time until correct BP-I diagnosis. Methods: Using the MarketScan database, we identified two cohorts: BP-I patients with a history of MDD misdiagnosis, and BP-I patients without such a history. In the misdiagnosed group, we described the first and last treatment regimens during the misdiagnosis period. We performed multinomial logistic regression to investigate the associations between patient and provider characteristics and the first treatment regimen after misdiagnosis. We employed a Cox proportional hazards model to assess the associations between treatment patterns and time until BP-I diagnosis. We compared the first treatment regimen after BP-I diagnosis between the two groups. Results: Among 21,771 misdiagnosed BP-I patients, 28.5% received antidepressant monotherapy initially, with 18.8% continuing this regimen before BP-I diagnosis. Conversely, 13.3% persisted with antidepressant monotherapy post-BP-I diagnosis. 
In the non-misdiagnosed BP-I cohort, 11.2% initiated antidepressant monotherapy. Notably, mood stabilizer/anticonvulsant monotherapy post-misdiagnosis was associated with the highest hazard of correct BP-I diagnosis compared to antidepressant monotherapy (HR: 1.26, 95% CI: 1.19-1.34, p<0.001). Conclusion: Our findings highlight disparities in initial diagnoses between acute care providers, internal medicine, and family practice versus mental health facilities, psychiatrists, and nurse practitioners. This may reflect differences in diagnostic expertise and referral patterns. Notably, prevalent use of antidepressants and anxiolytics contravenes current guidelines, underscoring the need for improved clinical practice. The lack of screening tools for BP-I compared to MDD emphasizes the necessity for more comprehensive assessment tools to improve diagnostic accuracy.Item type: Item , Healthcare resource utilization and costs of Medicare-enrolled patients with HR+/HER2- metastatic breast cancer treated with CDK4/6i in the first-line setting(2024-09-09) Behan, Emma; Bansal, Aasthaa; Veenstra, David. Background: The introduction of cyclin-dependent kinase 4 and 6 inhibitors (CDK4/6i; palbociclib, ribociclib, and abemaciclib) has transformed the treatment landscape for patients with hormone-receptor-positive (HR+) and human epidermal growth factor receptor 2-negative (HER2-) metastatic breast cancer (MBC). To our knowledge, no studies have quantified healthcare resource utilization (HRU) or economic burden following CDK4/6i initiation in the Medicare population. 
Objective: The objective of this study was to describe HRU and quantify healthcare costs among Medicare-enrolled patients with HR+ HER2- MBC treated with CDK4/6i in the first-line setting. Methods: We conducted a retrospective cohort study of Medicare-enrolled HR+ HER2- MBC patients who initiated a CDK4/6i in the first-line setting between February 2, 2016, and December 31, 2022, using claims from the Merative MarketScan® database. We examined all-cause HRU by summarizing the number of inpatient (IP), outpatient (OP), and emergency room (ER) visits, as well as the length of stay, during the six months following CDK4/6i initiation. Additionally, we assessed all-cause healthcare costs, including IP, OP, ER, and pharmacy costs, over the one year following CDK4/6i initiation using the Kaplan-Meier sample average (KMSA) estimator to account for censoring. We reported total healthcare costs as the sum of IP, OP, ER, and pharmacy costs, providing insight into the economic burden associated with CDK4/6i treatment in this patient population. Results: A total of 901 patients met the inclusion criteria, with a mean age of 74 years (standard deviation [SD] 6.84) and a mean Charlson Comorbidity Index (CCI) score of 0.64 (SD 0.8). Most patients initiated palbociclib (n=804, 90%) at the index date, and most (n=634, 70%) had received systemic therapy before CDK4/6i initiation. Nearly 24% (n=214) had an inpatient admission in the six months following CDK4/6i initiation. Among patients with an inpatient admission, the mean number of admissions per patient was 1.65 (SD=0.98), with a mean length of stay per admission of 5.98 days (SD=6.25). Roughly 30% (n=271) of patients had an ER visit, with a mean of 2.1 (SD=1.54) visits per patient among those who had a visit. Most patients (n=868, 96.4%) had an outpatient service, and among those with an OP service, the mean number of days with outpatient services was 19.96 (SD=12.29). 
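The Kaplan-Meier sample average (KMSA) estimator mentioned in the methods weights each period's mean observed cost by the Kaplan-Meier probability of being alive at the start of that period, so that censored follow-up does not bias total costs downward. A simplified pure-Python sketch with illustrative data (function names and the period discretization are my own, not the study's code):

```python
def km_survival(times, events, horizon):
    """Probability of being alive at the start of each period 0..horizon-1.
    Subject i is observed for periods 0..times[i]-1; events[i]=1 means death
    at the end of the last observed period, 0 means censoring."""
    surv, s = [], 1.0
    for k in range(horizon):
        surv.append(s)
        at_risk = sum(1 for t in times if t > k)
        deaths = sum(1 for t, e in zip(times, events) if t == k + 1 and e == 1)
        if at_risk:
            s *= 1 - deaths / at_risk
    return surv

def kmsa_mean_cost(period_costs, times, events, horizon):
    """KMSA estimate of mean cumulative cost: for each period, weight the mean
    cost among subjects still under observation by the KM probability of being
    alive at the start of that period."""
    surv = km_survival(times, events, horizon)
    total = 0.0
    for k in range(horizon):
        observed = [c[k] for c, t in zip(period_costs, times) if t > k]
        if observed:
            total += surv[k] * sum(observed) / len(observed)
    return total
```

With no deaths or censoring this reduces to the ordinary mean of total costs; with incomplete follow-up it reweights each interval by the fraction of the cohort expected to still be alive.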
Mean total healthcare costs over the one-year period following CDK4/6i initiation were $62,228 (95% CI: $52,281, $73,029) per patient, with the main drivers being outpatient services ($31,686; 95% CI: $27,168, $36,925) and pharmacy costs ($22,727; 95% CI: $19,273, $25,931).Item type: Item , Exploring Demographic, Geographical, and Clinical Factors Associated with Persistence to High-Efficacy Therapies in Multiple Sclerosis(2024-09-09) Miller, Alexandra; Veenstra, David. Background: High-efficacy disease-modifying therapies (DMTs) in multiple sclerosis (MS) are defined as those that reduce relapses by 50%. A majority of these high-efficacy DMTs are administered intravenously at varying intervals, including alemtuzumab every 365 days, mitoxantrone every 90 days, natalizumab every 28 days, ocrelizumab every 182 days, and ublituximab every 168 days. While several studies have investigated rates of persistence on these high-efficacy DMTs, no studies to date have investigated patient factors associated with persistence on these high-efficacy infusion DMTs. Objective: To identify demographic, geographical, and clinical factors associated with persistence on high-efficacy infusion DMTs in MS at 12 months from initiation. Methods: We conducted a retrospective cohort study using the Merative™ MarketScan® Commercial Database. We identified patients diagnosed with MS who started high-efficacy infusion DMTs between January 1, 2018, and December 31, 2020. Persistence was defined as having no evidence of switching to a new therapy and no gap greater than 60 days beyond the recommended dosing interval. For each DMT, we used an adjusted multivariable logistic regression model to assess the association between the binary outcome of 12-month persistence and age, sex, level of rurality, region, health plan type, employment classification, Charlson Comorbidity Index score, mental health comorbidity status, length of MS diagnosis, and presence of a recent MS relapse event. 
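A gap-based persistence definition like the one described (no gap exceeding the recommended dosing interval plus a 60-day grace period) can be operationalized over claims dates. A minimal sketch; the function and its exact coverage rule are illustrative, not the study's code:

```python
def is_persistent(infusion_days, interval_days, grace_days=60, horizon_days=365):
    """True if no gap between consecutive infusions exceeds the recommended
    interval plus the grace period, and coverage reaches the horizon.
    infusion_days: sorted day offsets from the index infusion (day 0)."""
    allowed = interval_days + grace_days
    for prev, nxt in zip(infusion_days, infusion_days[1:]):
        if nxt - prev > allowed:
            return False  # refill gap too long: non-persistent
    # last infusion must still cover the end of the 12-month window
    return infusion_days[-1] + allowed >= horizon_days
```

For example, a natalizumab patient infused every 28 days through the year would be classified persistent, while a 120-day gap after the index infusion would not.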
We ran additional scenario analyses to compare the probability of persistence using varying persistence definitions found in the literature. Results: We found that a higher proportion of patients were persistent at 12 months on ocrelizumab (80.9%) versus natalizumab (66.3%). Among patients who survived and were persistent for 12 months, a higher proportion of patients were persistent at 24 months on natalizumab (57.5%) versus alemtuzumab (39.8%) or ocrelizumab (49.8%). Age, sex, region, health plan type, employment classification, mental health comorbidity status, and presence of a recent MS relapse event were not significantly associated with persistence on natalizumab or ocrelizumab at 12 months. The odds of persistence on ocrelizumab differed significantly by rurality, favoring patients living in rural areas over those living in urban areas (OR: 0.73, 95% CI: 0.54-0.98, p=0.039), and were significantly higher in patients who had been recently diagnosed with MS before starting an infusion DMT (OR: 1.39, 95% CI: 1.05-1.83, p=0.020). The odds of persistence on natalizumab differed significantly by comorbidity burden, favoring patients with fewer than two comorbidities (OR: 0.58, 95% CI: 0.35-0.96, p=0.041), and were significantly higher in patients who had been recently diagnosed with MS before starting an infusion DMT (OR: 2.24, 95% CI: 1.68-2.98, p<0.001). Conclusion: At 12 months post-index, ocrelizumab had a higher probability of persistence than natalizumab, which may be explained by its extended dosing interval of 182 days versus 28 days. In patients taking ocrelizumab, higher persistence was significantly associated with living in rural areas compared to urban areas and with recent MS diagnosis before starting an infusion DMT. In patients on natalizumab, higher persistence was significantly associated with having fewer than two comorbidities and with MS diagnosis within six months before DMT initiation. 
Future research could explore persistence trends among newer high-efficacy DMTs, including ofatumumab and ublituximab.Item type: Item , Evaluating Heterogeneity in Treatment Effects and Economic Value of Tumor-Agnostic Drugs(2023-09-27) Chen, Yilin; Carlson, Josh J. Tumor-agnostic drugs (TAD), also known as histology-independent treatments, have the potential to benefit patients who currently have limited therapeutic options. TAD typically receive accelerated approval based on basket trials, which comprise a small number of multi-cohort, single-arm studies. However, the evaluation of TAD poses major challenges for health technology assessment agencies, such as potential heterogeneity in treatment effect by tumor type, the lack of comparative data due to single-arm designs, and variable standard of care (SoC) across tumor types. Consequently, these challenges create significant uncertainty regarding the expected clinical and economic impact of TAD. In Aim 1, I used Bayesian hierarchical models (BHM) to assess heterogeneity in treatment outcomes across tumor types and to improve estimates of tumor-specific treatment outcomes from Phase II basket trials, which are crucial for healthcare decision-making. My analysis revealed high heterogeneity and uncertainty in survival endpoints, including median progression-free survival (PFS) and median overall survival (OS), although treatment effects were more similar when judged by the surrogate endpoint used at approval. Metrics such as the intra-class correlation can be used to quantify the variation between groups, which could inform a recommendation of TAD for use in all tumor types or in a restricted subset of patients. These findings are important because they demonstrate that BHM can reduce the uncertainty of estimates derived from basket trial evidence, potentially improving confidence in tumor-agnostic decision making despite small sample sizes in some tumor types. 
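The partial-pooling idea behind Bayesian hierarchical models for basket trials — small tumor cohorts borrow strength from the overall estimate — can be illustrated with a much simpler empirical-Bayes shrinkage sketch. This is a stand-in for intuition only, not the full BHM used in the study; the pseudo-sample size `m` is an assumed stand-in for the between-cohort variance a hierarchical model would actually estimate:

```python
def shrunken_rates(cohorts, m=10.0):
    """Shrink each tumor cohort's response rate toward the pooled rate.
    cohorts: list of (responders, n); m: prior 'pseudo-sample size'
    controlling how strongly small cohorts are pulled to the pooled mean."""
    total_r = sum(r for r, _ in cohorts)
    total_n = sum(n for _, n in cohorts)
    pooled = total_r / total_n
    out = []
    for r, n in cohorts:
        w = n / (n + m)  # small cohorts borrow more from the pooled estimate
        out.append(w * (r / n) + (1 - w) * pooled)
    return out
```

A 10-patient cohort with 3 responders is pulled noticeably toward the pooled rate, while a 60-patient cohort barely moves, mirroring how BHM stabilizes tumor-specific estimates from sparse basket-trial cohorts.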
The methods presented in this study can be applied to future assessments of TAD. In Aims 2 and 3, I addressed the comparative effectiveness and economic value of TAD supported only by single-arm trial evidence. To overcome the lack of comparators, I created eight external controls using observational data from the TriNetX electronic health record databases. A copula method was employed to simulate correlated trial samples while matching the dependence structure of the trial baseline covariates to that in the real-world population. Additionally, an inverse odds weighting approach was used to further balance baseline characteristics between the trial and external control arms. Weighted Cox regressions showed that patients with MSI-H/dMMR advanced/metastatic colorectal and endometrial cancers receiving pembrolizumab had significantly prolonged PFS, but not OS, compared with real-world patients receiving chemotherapy. This analysis demonstrated that incorporating external control data in early-phase trials may provide a more comprehensive understanding of the treatment effects of tumor-agnostic drugs than relying solely on single-arm trials. Finally, using adjusted efficacy inputs from Aim 1 and external controls from Aim 2, I assessed the economic value of pembrolizumab compared to SoC across eight tumor types to inform coverage and reimbursement decisions in the United States. A partitioned survival model with three health states (progression-free, post-progression, and death) was developed to evaluate the cost-effectiveness of pembrolizumab for previously treated patients with advanced or metastatic MSI-H/dMMR cancers. We found substantial variation in economic value across tumor types, with pembrolizumab being a cost-effective strategy for colorectal and endometrial cancers at a $150,000 willingness-to-pay per quality-adjusted life-year threshold, compared to SoC chemotherapies. 
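The cost-effectiveness judgment against a willingness-to-pay (WTP) threshold reduces to the incremental cost-effectiveness ratio (ICER), or equivalently the net monetary benefit. A minimal sketch with illustrative numbers (these are not the study's model inputs):

```python
def icer(delta_cost, delta_qaly):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
    return delta_cost / delta_qaly

def net_monetary_benefit(delta_cost, delta_qaly, wtp=150_000):
    """Positive NMB at the WTP threshold implies cost-effectiveness
    (equivalent to ICER <= WTP when the QALY gain is positive)."""
    return wtp * delta_qaly - delta_cost

# Illustrative: a treatment costing $100,000 more that adds 0.8 QALYs
ratio = icer(100_000, 0.8)              # dollars per QALY gained
nmb = net_monetary_benefit(100_000, 0.8)
```

At a $150,000/QALY threshold, an ICER of $125,000/QALY (positive NMB) would be deemed cost-effective, while tumor types with smaller QALY gains or higher incremental costs would not.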
However, pembrolizumab was not found to be cost-effective in treating the other assessed cancers. The value-of-stratification estimates suggest that recommendations for using pembrolizumab in specific patient populations, based on comparative effectiveness or net health benefit, could yield greater overall value to the healthcare system than a tumor-aggregated recommendation.Item type: Item , Impact of Carbidopa-Levodopa Enteral Suspension Initiation on Oral Medication Treatment Patterns in Persons with Parkinson’s Disease: A Retrospective Cohort Analysis(2023-09-27) Baldwin, Zachary Thomas; Devine, Beth. Introduction: Parkinson’s disease (PD) patients experience gradual worsening of symptoms as their disease progresses, necessitating complex polypharmacy regimens. These complex therapeutic regimens decrease quality of life and impair medication adherence, which may lead to poorer symptom management. At later stages of disease, more permanent solutions exist for management of PD, such as carbidopa-levodopa enteral suspension (CLES). Objective: The objective of this research was to characterize the impact of CLES on medication treatment patterns in persons with PD. Methods: We conducted a retrospective time-series analysis of a real-world claims database from 2015 to 2022 to evaluate medication utilization in adults with PD who initiated CLES (n=32). This population was propensity-score matched, with covariate balancing, to eligible non-CLES controls using 1:4 nearest-neighbor matching (n=128). The index date was the date CLES was prescribed. Outcome measures, pills per day (PPD) and levodopa-equivalent daily dose (LEDD), were created for each of the 12 months following index for all individuals. We ran generalized mixed-model regressions for all analyses to assess changes in outcomes over time. 
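The 1:4 nearest-neighbor propensity-score match described in the methods can be sketched as a greedy search over control propensity scores. This is a simplification for illustration; production matching typically adds calipers, covariate balance diagnostics, and a fitted propensity model, none of which are shown here:

```python
def match_one_to_k(treated_ps, control_ps, k=4):
    """Greedy 1:k nearest-neighbor matching on the propensity score,
    without replacement. Returns {treated index: [control indices]}."""
    available = set(range(len(control_ps)))
    matches = {}
    for i, ps in enumerate(treated_ps):
        # take the k remaining controls with the closest propensity scores
        picks = sorted(available, key=lambda j: abs(control_ps[j] - ps))[:k]
        matches[i] = picks
        available -= set(picks)
    return matches
```

Matching without replacement means each control serves at most one treated patient, which is why 32 CLES initiators yield exactly 128 matched controls.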
Results: PPD was reduced in CLES initiators compared to matched non-initiators, a result seen in the first month and sustained through 12 months after treatment initiation (p<0.01 for each month). The likelihood of taking 5 or more PPD was reduced by up to 94% in individuals initiating CLES compared to controls (OR 0.06, 95% CI 0.01, 0.37 in month 4 after treatment). Mean oral LEDD was reduced in CLES initiators compared to controls, with reductions ranging between 250 and 497 mg compared to baseline. The proportion of individuals taking 500 mg or more LEDD was significantly reduced within the first half of the year after treatment but was not found to be reduced after 6 months when compared to controls. Conclusion: Results suggest that CLES substantially reduced medication burden, improving overall pill burden as well as oral levodopa requirements for individuals with advanced PD, with the greatest impact seen within the first month after treatment.Item type: Item , Treatment Patterns of Atrial Fibrillation (AF) Patients After Bleeding on Direct-Acting Oral Anticoagulants (DOACs)(2023-08-14) Katta, Arvind; Hansen, Ryan. Background and Objective: Direct oral anticoagulants (DOACs) have vastly improved care for atrial fibrillation (AF) as well as prevention of strokes and heart attacks. Although clinical trials and observational studies have confirmed superior safety and reduced bleeding rates for DOACs compared to warfarin, the risk of bleeding, though relatively small, is not zero. Little is known about the drug-utilization patterns of patients after a major bleed on a DOAC. The aim of this study was to characterize real-world treatment patterns and evaluate the risk of treatment change among AF patients after experiencing a major bleed on a DOAC. Methods: This study was a retrospective cohort analysis conducted using the MarketScan Research Databases between October 1, 2015, and December 31, 2021. 
Patients were at least 18 years old and initiated a DOAC after AF diagnosis. Patients with a hospitalization for a major bleed were matched to patients who did not bleed through the follow-up period. Patients were followed until 12 months after index, first disenrollment, or a second hospitalization for a major bleed. Outcomes included discontinuation, treatment switching, dose change, and reinitiation of the same DOAC after discontinuation. Time-to-event analyses were conducted using Cox proportional hazards regression to estimate the risk of experiencing each outcome. Results: The most common first outcome after a major bleed in the bleed group (n = 2,087) was discontinuation (54.8%), followed by no change in therapy (34.7%). Of patients who discontinued after a major bleed, 16.5% reinitiated. The bleed group had a 3.84-fold increased risk (95% CI: 3.56 to 4.14, p < 0.0001) of treatment change compared to the control group (n = 6,261). Among the bleed group, patients with prior warfarin use had a 2.93-fold increased risk (95% CI: 1.83 to 4.68, p < 0.0001) of switching and trended toward a decreased risk of discontinuation (HR: 0.87, 95% CI: 0.73 to 1.03, p = 0.11) compared to those without prior warfarin use. Among the bleed group, those with high stroke risk had a 29% reduced risk (95% CI: 0.55 to 0.92, p = 0.01) of discontinuation compared to those with low stroke risk. Conclusions: Our study found that AF patients have a significantly increased risk of DOAC discontinuation, switching, or dose change after being hospitalized for a major bleed. Future studies focusing on DOAC safety after a major bleed may increase prescriber confidence in restarting anticoagulation.Item type: Item , Healthcare Resource Utilization and Costs Associated with Misdiagnosis of Migraine(2023-08-14) Kim, Jae Rok; Devine, Beth. BACKGROUND: Migraine is commonly misdiagnosed and undertreated and can be confused with other conditions that also cause facial pain or headache. 
The most common misdiagnoses for migraine are headache, sinusitis, and cervical pain. Misdiagnosis may lead to inappropriate and ineffective treatment, unnecessary consultations, and unnecessary diagnostic evaluations, culminating in untreated migraine and an economic burden for patients. This study evaluated healthcare resource utilization (HCRU) and costs among migraine patients with a prior misdiagnosis versus migraine patients without one. OBJECTIVE: To assess the impact of a migraine misdiagnosis on all-cause healthcare resource utilization and all-cause direct healthcare costs in migraine patients. METHODS: A retrospective claims analysis was conducted using data from the Merative™ MarketScan® Commercial and Medicare Supplemental Databases. Adults with a migraine diagnosis were identified from June 2018 to June 2019 and further classified into a misdiagnosis cohort or a correct-diagnosis cohort based on whether or not they had a prior potential misdiagnosis (PM). PM was defined as a prior diagnosis of headache, sinusitis, or cervical pain within the 2 years preceding the migraine diagnosis date. HCRU and direct healthcare costs were compared between the two groups, as well as between a subgroup with multiple misdiagnoses and the correct-diagnosis group. Outcomes were reported as incidence rate ratios (IRR), adjusted for age, gender, region, plan type, and comorbidities. RESULTS: In all, 3,841 migraine patients with a prior PM and 29,147 migraine patients without a prior PM met the inclusion criteria. Patients with PM had a significantly higher rate of inpatient admissions, emergency department (ED) visits, neurologist visits, outpatient visits, and prescription fills per month (IRR: 1.61, 1.92, 5.92, 1.67, and 1.52, respectively, all p<0.001) compared to patients without PM. 
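Incidence rate ratios like those reported here compare event rates per unit of person-time between cohorts. A minimal unadjusted sketch with illustrative counts (the study's IRRs were covariate-adjusted, which this does not show):

```python
def incidence_rate_ratio(events_exposed, persontime_exposed,
                         events_unexposed, persontime_unexposed):
    """Unadjusted IRR: ratio of event rates (events per person-month)
    between an exposed and an unexposed cohort."""
    rate_exposed = events_exposed / persontime_exposed
    rate_unexposed = events_unexposed / persontime_unexposed
    return rate_exposed / rate_unexposed

# Illustrative: 50 ED visits over 1,000 person-months vs 26 over 1,000
irr = incidence_rate_ratio(50, 1000, 26, 1000)
```

An IRR of about 1.92 would mean the exposed cohort accrues ED visits at nearly twice the monthly rate of the comparison cohort.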
Patients in the misdiagnosed cohort also had a significantly higher rate of healthcare cost accrual for inpatient admissions, ED visits, neurologist visits, outpatient visits, and prescription fills per month (IRR: 3.22, 2.66, 2.28, 2.06, and 1.36, respectively, all p<0.001) compared to patients without PM. CONCLUSION: Our study suggests that migraine patients with a prior PM have significantly higher rates of HCRU and cost accrual compared to migraine patients without a prior PM. Our results suggest that a misdiagnosis preceding an incident migraine diagnosis significantly increases HCRU and costs for migraine patients.Item type: Item , Productivity Loss Among Patients with Diabetic Macular Edema in Two Eyes: A Retrospective Commercial Claims Analysis in the United States(2023-08-14) Ko, Stella; Bansal, Aasthaa. Background: Diabetic retinopathy (DR) is a leading cause of blindness in the US, and its prevalence is increasing due to the rising incidence of diabetes. Diabetic macular edema (DME) is a common complication of DR that can cause central vision loss and impact workplace productivity. However, to the best of our knowledge, no studies to date have assessed missed worktime following a diagnosis of DME in both eyes. The implementation of International Classification of Diseases, 10th Revision (ICD-10) codes that differentiate disease laterality now enables study of the two-eye DME population specifically. Objective: To quantify productivity loss due to absenteeism in the year following diagnosis among commercially insured, non-elderly adults with newly diagnosed DME in two eyes in the US. Methods: We conducted a retrospective cohort study using MarketScan health insurance claims data to identify DME diagnoses, linked with the Health and Productivity Management database to capture reported productivity loss. Incident two-eye DME patients were identified between January 1, 2018, and December 31, 2019, and followed for up to one year after diagnosis. 
One-year productivity loss after two-eye DME diagnosis was calculated as the sum of days missed due to nonrecreational absenteeism, short-term disability, and long-term disability during the one-year follow-up period. The indirect cost attributable to workdays lost was calculated assuming an 8-hour workday and using the US average hourly wage (March 2023). A multivariable logistic regression was performed to describe the association between having any workdays lost and age, sex, region, health plan type, CCI score, receipt of anti-vascular endothelial growth factor (anti-VEGF) therapy, and frequency of anti-VEGF therapy. Results: In the year following two-eye DME diagnosis, patients with DME in both eyes lost an average of 7.7 workdays (95% CI: 5.13, 10.53), corresponding to an indirect cost of $2,044 (95% CI: $1,362, $2,795). Region and receipt of anti-VEGF therapy were associated with reporting any workdays lost among newly diagnosed two-eye DME patients, with almost 80% higher odds of having any workdays lost in patients who received anti-VEGF therapy, likely reflecting more severe disease, compared to those who did not receive the therapy (OR: 1.80, 95% CI: 1.05, 3.06). Discussion: We found that newly diagnosed two-eye DME patients lost an average of 7.7 workdays, and that receipt of anti-VEGF therapy was the observed clinical characteristic in the claims data associated with workdays lost. However, the results may not be generalizable to the broader US population, as this analysis primarily assessed full-time employees with employer-sponsored private health insurance and disability benefits.
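The absenteeism-to-cost conversion described above is straightforward arithmetic: workdays lost times hours per workday times an hourly wage. A sketch using an illustrative $33/hour wage (the study used the actual March 2023 US average hourly wage, which is not reproduced here):

```python
def indirect_cost(workdays_lost, hourly_wage, hours_per_day=8):
    """Indirect cost of absenteeism: days lost x hours/day x hourly wage."""
    return workdays_lost * hours_per_day * hourly_wage

# Illustrative: 7.7 workdays lost valued at an assumed $33/hour wage
cost = indirect_cost(7.7, 33.0)
```

With a wage near the published March 2023 average, 7.7 lost workdays lands close to the roughly $2,000 indirect cost reported above.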
