Title: The Diminishing Returns (DR) Property and Its Applications in Machine Learning
Author: Sadeghi, Omid
Contributor: Fazel, Maryam
Date issued: 2024-02-12
Year: 2023
File: Sadeghi_washington_0250E_26417.pdf
URI: http://hdl.handle.net/1773/51152
Description: Thesis (Ph.D.)--University of Washington, 2023
Format: application/pdf
Language: en-US
Subjects: Electrical engineering; Electrical and computer engineering
Type: Thesis

Abstract: Numerous tasks in machine learning involve objective functions that exhibit a Diminishing Returns (DR) property, i.e., the marginal gain of increasing the input gets smaller as the input gets larger. For instance, in document summarization, the goal is to select a small subset of sentences that represents the entire document; as the summary grows, the additional benefit of adding another sentence diminishes. This dissertation focuses on the class of set functions and continuous functions that exhibit the DR property, called submodular set functions and continuous DR-submodular functions, respectively. It presents several contributions to online and offline maximization problems in machine learning where the utility functions satisfy the DR property, with the main themes organized into three parts: (i) a study of online DR-submodular maximization under online budget constraints, and the design of primal-dual algorithms that not only perform well in terms of utility but also satisfy the online constraints; (ii) a characterization of the class of strongly DR-submodular functions, with techniques for their offline and online maximization that exploit the additional structure to obtain improved convergence rates and regret guarantees, respectively; and (iii) a study of offline and online submodular set function maximization under social and economic considerations (e.g., privacy and strategic behavior), and the design of differentially private and incentive-compatible algorithms for these problems.
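To make the DR property concrete, here is a minimal sketch (not from the dissertation) using a coverage function, a standard example of a submodular set function. The "sentences" and "topics" below are hypothetical; the point is that the marginal gain of adding an element to a smaller summary is at least as large as the gain of adding it to a larger one, i.e., f(A ∪ {x}) − f(A) ≥ f(B ∪ {x}) − f(B) whenever A ⊆ B.

```python
def coverage(universe_sets, S):
    """Number of distinct topics covered by the selected sentences."""
    if not S:
        return 0
    return len(set().union(*(universe_sets[i] for i in S)))

def marginal_gain(universe_sets, S, x):
    """Gain in coverage from adding element x to summary S."""
    return coverage(universe_sets, S | {x}) - coverage(universe_sets, S)

# Hypothetical sentences, each covering a set of topics.
sentences = {
    0: {"intro", "method"},
    1: {"method", "results"},
    2: {"results", "conclusion"},
}

small = {0}        # a smaller summary
large = {0, 1}     # a superset summary

g_small = marginal_gain(sentences, small, 2)  # gain of sentence 2 for the small summary
g_large = marginal_gain(sentences, large, 2)  # gain of sentence 2 for the larger summary
# Diminishing returns: g_small >= g_large
```

Here sentence 2 adds two new topics to the small summary but only one to the larger summary, illustrating the diminishing marginal gain that submodularity formalizes.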