Zabinsky, Zelda B.; Maneekul, Pariyakorn
2025-10-02
Maneekul_washington_0250E_28838.pdf
https://hdl.handle.net/1773/54040
Thesis (Ph.D.)--University of Washington, 2025

High-dimensional black-box optimization presents an increasingly prevalent challenge in modern science and engineering. This dissertation addresses the challenge through a novel interplay between optimization and machine learning methods, developing adaptive search algorithms that strike a balance between exploration, exploitation, and estimation. The proposed algorithms leverage machine learning techniques to construct surrogate models, thereby enhancing the efficiency of the optimization process. The dissertation proposes a multi-level Partitioning and Branch-and-Bound (PBnB) algorithm designed for level-set approximation, significantly enhancing the original PBnB algorithm. The multi-level PBnB algorithm employs importance sampling to strategically identify promising subregions of the partitioned search space. Its performance is further enhanced by integrating Gaussian processes as a surrogate model to guide local sampling exploration. During the process, the target level set is approximated by classifying subregions as pruned (no intersection with the target level set), maintained (contained within the target level set), or undecided. This enhanced version of the PBnB algorithm introduces an adaptive sampling probability that strategically directs samples to the most promising regions. Since this importance sampling introduces dependence among samples, a statistical method is applied to construct a confidence interval on the probability of correctly classifying a subregion as pruned or maintained. The contribution to the interplay of optimization and machine learning is the local sampling within each subregion.
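The subregion classification and adaptive sampling probability described above can be illustrated with a minimal sketch. This is not the dissertation's algorithm; the classification margin, the weight values, and all function names (`classify`, `adaptive_probs`) are illustrative assumptions, and the statistical confidence-interval machinery is omitted.

```python
# Illustrative sketch of one multi-level PBnB ingredient: subregions are
# classified against a target level delta from their sampled objective
# values, and an adaptive sampling probability favors undecided regions.
# The margin and weights below are hypothetical choices, not the thesis's.

def classify(samples, delta, margin):
    """Classify a subregion (level set taken as {x : f(x) <= delta})."""
    lo, hi = min(samples), max(samples)
    if lo > delta + margin:      # all samples well above the level:
        return "pruned"          # no apparent intersection with the level set
    if hi < delta - margin:      # all samples well below the level:
        return "maintained"      # subregion appears contained in the level set
    return "undecided"           # mixed evidence; keep refining

def adaptive_probs(labels):
    """Direct most of the remaining sampling budget to undecided subregions."""
    weights = [3.0 if s == "undecided" else 0.5 for s in labels]
    total = sum(weights)
    return [w / total for w in weights]
```

In a full implementation, the margin would be replaced by the statistically derived confidence bounds that account for the dependence introduced by importance sampling.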
We incorporate Gaussian processes and regularized quadratic regression, common and successful prediction methods in machine learning, for level-set approximation. The analysis of this multi-level PBnB algorithm quantifies the quality of the level-set approximation by deriving probability bounds on the volume of incorrectly pruned or maintained regions, which account for the effects of importance sampling. To address the challenges of high dimensionality, this dissertation introduces the Branching Adaptive Surrogate Search Optimization (BASSO) framework, which conceptualizes the use of branching and surrogate modeling for black-box optimization. BASSO generalizes multi-level PBnB and adapts it to optimization as opposed to level-set approximation. A finite-time analysis of BASSO proves that the expected number of BASSO function evaluations needed to first sample a point in the vicinity of the global optimum is linear in dimension, given that two strong assumptions are satisfied. The desired linearity result suggests an algorithm that is scalable to high dimensions in theory. This research explores several variations to implement BASSO and partially satisfy the two assumptions. In this part of the research, methods used in machine learning are introduced to improve the chance of sampling in the improving region. One BASSO implementation incorporates Gaussian processes as a surrogate model, and a second uses regularized quadratic regression as a surrogate model to predict where to sample next within a subregion. The surrogate model and the optimization algorithm work together to balance exploration and exploitation. The local surrogate model guides sampling within a subregion, while the adaptive subregion probabilities identify promising subregions. This interplay allows the system to effectively use both local subregion information (from the surrogate model) and global information (from the adaptive probabilities) to improve its search.
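One possible reading of surrogate-guided local sampling is sketched below: a small noise-free Gaussian-process regression with an RBF kernel, where the next point inside a subregion is chosen by a lower-confidence-bound rule that trades off low predicted mean (exploitation) against high predicted variance (exploration). The kernel, length scale, acquisition rule, and all names are assumptions for illustration, not the dissertation's implementation.

```python
import numpy as np

# Minimal GP surrogate sketch (RBF kernel, noise-free regression) for
# choosing the next sample inside a subregion. Hyperparameters and the
# lower-confidence-bound acquisition are illustrative assumptions.

def rbf(A, B, ls=0.5):
    """RBF kernel matrix between row-stacked points A (n,d) and B (m,d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def gp_posterior(X, y, Xq, jitter=1e-8):
    """Posterior mean and variance at query points Xq given data (X, y)."""
    K = rbf(X, X) + jitter * np.eye(len(X))
    Ks = rbf(X, Xq)
    mean = Ks.T @ np.linalg.solve(K, y)
    var = 1.0 - np.einsum("ij,ji->i", Ks.T, np.linalg.solve(K, Ks))
    return mean, np.maximum(var, 0.0)

def next_sample(X, y, candidates, kappa=2.0):
    """Pick the candidate minimizing the lower confidence bound."""
    mean, var = gp_posterior(X, y, candidates)
    return candidates[np.argmin(mean - kappa * np.sqrt(var))]
```

In the BASSO setting, such a local rule would operate only within one subregion, while the adaptive subregion probabilities decide which subregion gets sampled at all.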
Numerical experiments with BASSO provide insights into the gap between the theoretical ideal performance and the performance of the proposed implementations with machine learning techniques in tackling high-dimensional black-box problems. This dissertation also explores partitioning, clustering, and decomposition as techniques for high-dimensional optimization. While the proposed multi-level PBnB algorithm and BASSO framework focus on balancing exploration and exploitation for deterministic black-box optimization, this dissertation also considers estimation when dealing with a noisy black-box function. The dissertation extends the Single Observation Search Algorithm (SOSA) by incorporating insights from machine learning techniques. The original neighborhood-averaging technique for noisy function value estimation in SOSA is replaced with quadratic regression, extending the concept of basis expansion. Complementing this, the search strategy is improved by incorporating optimistic sampling, a concept drawn from reinforcement learning, to more effectively guide exploration. This research contributes to the interplay of optimization and machine learning by providing quadratic regression as an estimation method within a single-observation scheme and achieving convergence results while accounting for dependence between samples. Theoretical convergence results for this SOSA extension are presented, and numerical experiments on benchmark problems demonstrate performance gains over the baseline algorithm.  Finally, this dissertation identifies possible applications and future research opportunities arising from the interplay of optimization and machine learning in solving large-scale black-box noisy functions.
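The quadratic-regression estimation idea in the SOSA extension can be sketched as follows: instead of averaging noisy single observations in a neighborhood, fit a quadratic basis expansion by least squares and read off the fitted value at the point of interest. The one-dimensional basis and function name are illustrative assumptions; the dissertation's version and its convergence analysis handle the dependence between samples.

```python
import numpy as np

# Sketch of value estimation from single noisy observations via a
# quadratic basis expansion [1, x, x^2], fit by least squares, in the
# spirit of the SOSA extension described above (illustrative, 1-D only).

def quad_estimate(xs, ys, x0):
    """Estimate f(x0) from observations (xs, ys) by quadratic regression."""
    X = np.column_stack([np.ones_like(xs), xs, xs**2])  # design matrix
    coef, *_ = np.linalg.lstsq(X, ys, rcond=None)       # least-squares fit
    return coef[0] + coef[1] * x0 + coef[2] * x0**2     # fitted value at x0
```

With each point observed only once, the regression pools information across the neighborhood, which is what replaces the original neighborhood averaging.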
This includes a discussion of quantum computing approaches for global optimization, considering both their theoretical promise and practical challenges.

Format: application/pdf
Language: en-US
License: CC BY-NC
Keywords: Black-box optimization; Gaussian process; High-dimensional optimization; Machine learning; Quantum computing
Subjects: Operations research; Statistics; Engineering; Industrial engineering
Title: The Interplay of Optimization and Machine Learning to Solve Large-scale Black-box Noisy Functions
Type: Thesis