New work presented at the Optimization workshop at NIPS 2015

At the beginning of November, our work “A Multilevel Acceleration for l1-regularized Logistic Regression”, on accelerating the solution of the l1-regularized logistic regression problem, was accepted to the Optimization workshop at NIPS 2015, and last week I presented it there. The workshop grew a lot this year, with about 50 posters covering several optimization topics.
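
As a quick reminder, the l1-regularized logistic regression problem, in its standard form for training pairs (x_i, y_i) with labels y_i in {-1, +1}, is

    \min_{w} \sum_{i=1}^{n} \log\bigl(1 + \exp(-y_i\, w^\top x_i)\bigr) + \lambda \|w\|_1,

where \lambda controls the sparsity of the solution w.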

This work was a collaboration between Eran Treister (Univ. of British Columbia) and myself (Intel Labs).

Released Code for the Block-Coordinate Descent for Inverse Covariance

I released the code for the paper “A Block-Coordinate Descent Approach for Large-scale Sparse Inverse Covariance Estimation”, presented at NIPS 2014. The algorithm includes a flag that enables the multilevel acceleration, which is very useful for large-scale problems with thousands to millions of variables. The code runs in Matlab and includes some functions written in C that require compilation. It also calls functions from METIS 5.0.2 to partition the neighbors in every sweep. The released version was tested on Windows, although it should work on other platforms as well.
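
To give a rough idea of the workflow (the file and function names below are placeholders, not the actual names in the release), building and running the code follows the usual Matlab MEX pattern:

    % Compile the C helpers into MEX binaries (placeholder file name).
    mex -largeArrayDims bcdic_core.c

    % Estimate a sparse inverse covariance A from the sample covariance S with
    % regularization lambda; the function name and the multilevel flag are placeholders.
    A = bcdic(S, lambda, 'multilevel', true);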

You are welcome to try it and contact me with any comments you may have. I would also like to know if anybody manages to run it on Linux or Mac.

Optimization Workshop OPT2014 at NIPS

Last week, I received the notification that the work with Eran Treister and Irad Yavneh was accepted to the Optimization workshop at NIPS 2014. This is a follow-up to the sparse inverse covariance work, in which we present an acceleration framework based on multilevel techniques. The framework reduces the number of computations by defining a hierarchy of levels and updating a subset of the active set of non-zero elements at each level. We tested the framework on the QUIC and BCD-IC algorithms with very interesting results, in particular for large-scale problems, where running times are reduced by up to 10x.
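
For context, both QUIC and BCD-IC minimize the standard l1-regularized inverse covariance (sparse maximum likelihood) objective

    \min_{A \succ 0} -\log\det(A) + \mathrm{tr}(SA) + \lambda \|A\|_1,

where S is the sample covariance matrix; the multilevel framework accelerates whichever base method is used to solve it.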

See you at NIPS 2014 and at the OPT2014 workshop.

NIPS 2014 – Accepted!

A few days ago, I received the notification that the work I submitted with my friend and colleague Eran Treister back in June was accepted to NIPS 2014. The conference will be held in Montreal, Canada, on December 8th through 11th. Our work presents a new algorithm for the Sparse Inverse Covariance Estimation problem in high dimensions, where memory is a limiting factor. In the paper we show that the algorithm is faster than previous methods on problems with thousands to millions of variables, and that it can run on a single server with 64GB of memory thanks to its reduced memory usage.