Tyler Maunu & Sumeet Katariya
Date and Time: Feb 21, 2018 (12:30 PM)
Location: Orchard room (3280) at the Wisconsin Institute for Discovery Building
We present a mathematical analysis of a non-convex energy landscape for Robust Subspace Recovery. Under a deterministic condition, the only stationary point in a large neighborhood of an underlying subspace is the subspace itself. The same deterministic condition implies that a geodesic gradient descent method can exactly recover the underlying subspace with proper initialization. The condition is shown to hold with high probability for a certain statistical model of data.
Joint work with Teng Zhang and Gilad Lerman.
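As a rough illustration of the kind of method the first abstract describes, the following is a minimal sketch of gradient descent on an orthonormal-basis parametrization of a subspace, minimizing a least-absolute-deviations energy. The function name, step-size schedule, and the use of a QR retraction in place of an exact geodesic step are all assumptions for illustration, not the authors' algorithm.

```python
import numpy as np

def robust_subspace_gd(X, d, steps=200, lr=0.01, eps=1e-8):
    """Illustrative sketch (not the paper's algorithm): descend the
    least-absolute-deviations energy F(V) = sum_i ||(I - V V^T) x_i||
    over orthonormal bases V (n x d). Rows of X are the data points.
    A QR retraction stands in for the exact geodesic step."""
    n = X.shape[1]
    rng = np.random.default_rng(0)
    V, _ = np.linalg.qr(rng.standard_normal((n, d)))  # random initialization
    for _ in range(steps):
        R = X - X @ V @ V.T                   # residuals (I - V V^T) x_i as rows
        norms = np.linalg.norm(R, axis=1) + eps
        # Gradient of F with respect to V: -sum_i r_i x_i^T V / ||r_i||.
        # Each r_i is orthogonal to the columns of V, so G is already tangent.
        G = -(R / norms[:, None]).T @ X @ V
        V, _ = np.linalg.qr(V - lr * G)       # retract back to an orthonormal basis
    return V
```

The abstract's point is about the energy landscape of such an objective: under the stated deterministic condition, a properly initialized descent scheme of this type has no spurious stationary points to get stuck at near the underlying subspace.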
We consider the problem of sequential coarse ranking, where the goal is to sort items by their means into clusters of pre-specified sizes by adaptively sampling from their reward distributions. This setting is useful in many social science applications where an approximate rank of every item is desired. By contrast, a complete ranking of the items requires a large number of samples when the means are close, while finding only the top few items, as in recommender systems, reveals no information about the ranks of the other items. We propose a computationally efficient PAC algorithm, LUCBRank, to solve this problem, and derive an upper bound on its sample complexity. We also derive a nearly matching distribution-dependent lower bound. Experiments on synthetic data as well as real-world data arising from assessing the safety scores of street-view images show that LUCBRank performs better than state-of-the-art baseline methods, even when these methods have the advantage of knowing the underlying parametric model.
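To make the adaptive-sampling idea concrete, here is a hedged sketch of the LUCB principle for top-m identification, which is the kind of building block a coarse-ranking algorithm can apply at each cluster boundary. The function name, confidence radius, and stopping rule are illustrative assumptions, not the LUCBRank algorithm from the talk.

```python
import numpy as np

def lucb_top_m(arms, m, delta=0.05):
    """Illustrative LUCB-style top-m selection (not LUCBRank itself).
    `arms` is a list of callables returning rewards in [0, 1]."""
    K = len(arms)
    counts = np.zeros(K)
    sums = np.zeros(K)
    for i in range(K):                        # one initial pull per arm
        sums[i] += arms[i]()
        counts[i] = 1
    t = K
    while True:
        means = sums / counts
        # Hoeffding-style confidence radius (an assumed choice)
        rad = np.sqrt(np.log(4 * K * t * t / delta) / (2 * counts))
        top = np.argsort(-means)[:m]
        rest = np.setdiff1d(np.arange(K), top)
        weakest = top[np.argmin(means[top] - rad[top])]        # lowest LCB in top-m
        challenger = rest[np.argmax(means[rest] + rad[rest])]  # highest UCB outside
        if means[challenger] + rad[challenger] < means[weakest] - rad[weakest]:
            return set(top.tolist()), t       # boundary resolved with confidence
        for i in (weakest, challenger):       # sample the two critical arms
            sums[i] += arms[i]()
            counts[i] += 1
            t += 1
```

The sample-complexity behavior the abstract alludes to is visible here: arms whose means straddle a cluster boundary are sampled heavily, while arms far from any boundary are resolved cheaply, which is why coarse ranking can be much cheaper than a full ranking.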