A few days ago, I received notification that the work I submitted with my friend and colleague Eran Treister back in June was accepted to NIPS 2014. The NIPS 2014 conference will be held in Montreal, Canada, December 8–11. Our work presents a new algorithm for the Sparse Inverse Covariance Estimation problem in high dimensions, where memory is a limiting factor. In the paper we show that the algorithm is faster than previous methods on problems with thousands to millions of variables, and that, thanks to its reduced memory usage, it can run on a single server with 64GB of memory.
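For readers unfamiliar with the problem: sparse inverse covariance estimation seeks an inverse covariance (precision) matrix that is both a good fit to the data and sparse, typically via an l1-penalized maximum-likelihood objective. Below is a small-scale sketch using scikit-learn's `GraphicalLasso` estimator; this is only an illustration of the problem setup, not the algorithm from our paper, and the dimensions and penalty value are arbitrary choices for the example.

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

# Sample from a Gaussian whose precision (inverse covariance) matrix is sparse:
# a tridiagonal matrix, which is diagonally dominant and hence positive definite.
rng = np.random.RandomState(0)
n_vars = 5
precision = np.eye(n_vars) * 2.0
for i in range(n_vars - 1):
    precision[i, i + 1] = precision[i + 1, i] = -0.5
covariance = np.linalg.inv(precision)
X = rng.multivariate_normal(np.zeros(n_vars), covariance, size=2000)

# Estimate a sparse precision matrix by l1-penalized maximum likelihood
# (the graphical lasso); alpha controls the sparsity of the estimate.
model = GraphicalLasso(alpha=0.05).fit(X)
est_precision = model.precision_

# Entries far from the tridiagonal band should be driven toward zero.
print(np.round(est_precision, 2))
```

At the scales discussed in the paper (thousands to millions of variables), even storing the dense precision or covariance matrix becomes the bottleneck, which is exactly the regime our memory-efficient algorithm targets.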