The weekly SILO Seminar Series is made possible through the generous support of the 3M Company and its Advanced Technology Group


with additional support from the Analytics Group of the Northwestern Mutual Life Insurance Company


Hardware Optimizations for Optimization | Mirror Descent for Metric Learning

Victor Bittorf, CS Grad Student | Gautam Kunapuli, Bio-Stats Post Doc

Date and Time: Feb 05, 2013 (12:30 PM)
Location: Orchard room (3280) at the Wisconsin Institute for Discovery Building


Victor's Abstract:
We used systems programming techniques to tune convex optimization solvers. I will demonstrate how an understanding of the hardware can influence the runtime of Nonnegative Matrix Factorization (NMF).
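For context, a minimal sketch of the NMF computation being tuned, using the classic Lee–Seung multiplicative updates (the specific solver and hardware techniques from the talk are not shown here; the function name and parameters are illustrative):

```python
import numpy as np

def nmf_multiplicative(V, r, iters=200, eps=1e-10, seed=0):
    """Factor a nonnegative matrix V ~= W @ H with W, H >= 0,
    via Lee-Seung multiplicative updates. The inner products
    (matrix multiplies) dominate the runtime, which is why
    cache- and hardware-aware tuning pays off."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(iters):
        # H update: elementwise rescale by (W^T V) / (W^T W H)
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        # W update: elementwise rescale by (V H^T) / (W H H^T)
        W *= (V @ H.T) / (W @ (H @ H.T) + eps)
    return W, H
```

Each update is a ratio of matrix products, so memory layout and blocking of those products largely determine wall-clock time.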

Gautam's Abstract:
Most metric learning methods are characterized by diverse loss functions and projection methods, which naturally raises the question: is there a wider framework that can generalize many of these methods? Ever-present issues, in addition, are scalability to large data sets and kernelizability. We propose a unified approach to Mahalanobis metric learning: an online regularized metric learning algorithm based on the ideas of composite objective mirror descent (COMID). The metric learning problem is formulated as a regularized positive semidefinite matrix learning problem, whose update rules can be derived within the COMID framework. This approach aims to be scalable and kernelizable, and admits many different Bregman divergences and loss functions, which allows several different classes of algorithms to be tailored. The most novel contribution is the use of the trace norm, which yields a metric that is sparse in its eigenspectrum, thus performing feature selection simultaneously with metric learning.
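A minimal sketch of one such update, under illustrative assumptions: a hinge loss on the squared Mahalanobis distance of a labeled pair, a Euclidean Bregman divergence, and the trace-norm regularizer, whose proximal step on the PSD cone soft-thresholds the eigenvalues toward zero (which is what produces the eigenspectrum sparsity mentioned above). The function name, loss, and parameter choices are hypothetical, not the talk's exact algorithm:

```python
import numpy as np

def comid_trace_step(M, x_i, x_j, y, eta=0.1, lam=0.05, margin=1.0):
    """One COMID-style update for Mahalanobis metric learning.

    M      : current PSD metric matrix
    x_i,x_j: a pair of points; y = +1 if similar, -1 if dissimilar
    eta    : step size; lam : trace-norm regularization weight
    """
    d = x_i - x_j
    dist2 = d @ M @ d
    # Hinge loss max(0, y * (dist2 - margin)): similar pairs should be
    # within the margin, dissimilar pairs outside it.
    if y * (dist2 - margin) > 0:
        grad = y * np.outer(d, d)          # subgradient when loss is active
    else:
        grad = np.zeros_like(M)
    G = M - eta * grad                     # gradient (mirror) step
    # Prox of eta*lam*||.||_* on the PSD cone: shrink eigenvalues by
    # eta*lam and clamp at zero -- this zeroes out small eigenvalues,
    # performing feature selection in the eigenspectrum.
    w, U = np.linalg.eigh((G + G.T) / 2)   # symmetrize for numerical safety
    w = np.maximum(w - eta * lam, 0.0)
    return (U * w) @ U.T
```

Iterating this step over a stream of labeled pairs yields an online algorithm; swapping the loss or the Bregman divergence recovers other members of the family the abstract describes.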