Towards Better Understanding of Algorithms and Complexity of Some Learning Problems
Authors
Yang, Xin
Abstract
We present several novel results on computational problems related to supervised learning. We focus on the computational resources required by algorithms to solve learning problems: running time, memory usage, and query complexity,
which is the number of positions in the input that the algorithm needs to examine.
Our contributions include: \begin{itemize} \item Time-space tradeoff lower bounds for problems of learning
from uniformly random labelled examples. Our methods yield bounds for learning concept classes of finite functions from random evaluations, even when
the sample space of random inputs is significantly smaller
than the concept class of functions and the function values are drawn from
an arbitrary finite set.
\item A simple and efficient algorithm for approximating the John Ellipsoid of a symmetric polytope.
Our algorithm is near-optimal in the sense that its time complexity matches that of the current best verification algorithm.
Experimental results suggest that our algorithm significantly outperforms existing algorithms.
\item The first algorithm for the total least squares problem, a variant of the ordinary least squares problem, that runs in time proportional to the sparsity of the input. At its core, our algorithm builds on recent advances in randomized linear algebra.
\item A generic space-efficient algorithm that is based on deterministic decision trees.
\item The first algorithm for the linear bandits problem with prior constraints.
\end{itemize}
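For context on the John Ellipsoid item above: for a symmetric polytope written as \(P = \{x : -1 \le Ax \le 1\}\), a well-known fixed-point weight iteration approximates the John ellipsoid. The sketch below illustrates that standard scheme only; it is an assumption for illustration and is not claimed to be the thesis's exact algorithm or to match its complexity guarantees.

```python
import numpy as np

def approx_john_ellipsoid(A, iters=100):
    """Approximate John ellipsoid of P = {x : -1 <= Ax <= 1}.

    Standard fixed-point iteration: each weight is rescaled by its
    leverage score a_i^T (A^T W A)^{-1} a_i.  Returns weights w; the
    ellipsoid is E = {x : x^T (A^T W A) x <= 1}.
    """
    m, n = A.shape
    w = np.full(m, n / m)  # weights sum to n, and stay so (trace argument)
    for _ in range(iters):
        M = A.T @ (w[:, None] * A)                   # A^T W A
        Minv = np.linalg.inv(M)                      # assumes full rank
        lev = np.einsum('ij,jk,ik->i', A, Minv, A)   # a_i^T M^{-1} a_i
        w = w * lev
    return w
```

The invariant `w.sum() == n` follows from `sum_i w_i * lev_i = tr(M^{-1} A^T W A) = tr(I_n) = n`, which is a convenient sanity check when experimenting with the iteration.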
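To make the total least squares item concrete: unlike ordinary least squares, TLS allows perturbations to both the coefficient matrix and the right-hand side. The sketch below is the classical dense SVD-based solution (taking the right singular vector of the smallest singular value of the stacked matrix \([A \mid b]\)), shown only to fix the problem definition; it is not the sparsity-dependent algorithm developed in the thesis.

```python
import numpy as np

def total_least_squares(A, b):
    """Classical TLS solve: min ||[E | r]||_F s.t. (A + E) x = b + r.

    The solution is read off the right singular vector associated with
    the smallest singular value of Z = [A | b].
    """
    n = A.shape[1]
    Z = np.column_stack([A, b])
    _, _, Vt = np.linalg.svd(Z)
    v = Vt[-1]               # right singular vector, smallest singular value
    # Generic case: last entry nonzero (otherwise the TLS problem is degenerate).
    return -v[:n] / v[n]
```

On a consistent system the smallest singular value of \([A \mid b]\) is zero, so TLS recovers the exact solution, which gives a quick correctness check.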
Description
Thesis (Ph.D.)--University of Washington, 2020
