Paper Stack

Here you can enter your suggestions for the reading group.

Papers in all fields of Machine Learning are welcome!

7 thoughts on “Paper Stack”

  1. Randomized Smoothing for Stochastic Optimization (or, as a broader topic: algorithms for non-smooth optimization)

  2. Gaussian Processes for Machine Learning
    Rasmussen and Williams

    Chapters 2 and 5 of the book cover Gaussian process regression, and are an easy read.
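
    To make the Chapter 2 material concrete, here is a minimal numpy sketch of the standard GP regression predictive equations (posterior mean and variance with a squared-exponential kernel, computed via a Cholesky factorization). The kernel choice and hyperparameter names are illustrative, not from the book's code.

```python
import numpy as np

def rbf(a, b, length=1.0):
    # Squared-exponential kernel k(x, x') = exp(-(x - x')^2 / (2 * length^2)).
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xs, noise=1e-2, length=1.0):
    # Posterior mean and variance of GP regression at test inputs Xs,
    # given 1-D training inputs X and targets y.
    K = rbf(X, X, length) + noise * np.eye(len(X))   # noisy train covariance
    Ks = rbf(Xs, X, length)                          # test-train covariance
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))  # K^{-1} y
    mean = Ks @ alpha
    v = np.linalg.solve(L, Ks.T)
    var = rbf(Xs, Xs, length).diagonal() - np.sum(v ** 2, axis=0)
    return mean, var
```

    With a tiny noise term the posterior mean interpolates the training targets, which is a quick sanity check when reading the chapter.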

  3. Dimensionality reduction
    ==================
    1. Overview:
    [PDF] Dimensionality reduction: A comparative review
    LJP Van der Maaten, EO Postma… – Journal of Machine …, 2009 – Citeseer

    2. Semi-supervised local Fisher discriminant analysis for dimensionality reduction
    M Sugiyama, T Idé, S Nakajima, J Sese – Machine Learning, 2010

    3. A Unifying Probabilistic Perspective for Spectral Dimensionality Reduction: Insights and New Models
    ND Lawrence – The Journal of Machine Learning Research, 2012

    Alternatively, we could try to understand rough set theory (I have no idea what that is):
    Dimensionality reduction based on rough set theory: A review
    and follow a few of the papers it mentions.
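
    As a shared baseline for the comparative review, here is a minimal PCA sketch via the SVD; every method surveyed there is compared against it. The function name and interface are my own for illustration.

```python
import numpy as np

def pca(X, k):
    # Project n x d data onto its top-k principal components via SVD.
    Xc = X - X.mean(axis=0)            # center the data
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    Z = Xc @ Vt[:k].T                  # n x k low-dimensional embedding
    return Z, Vt[:k]                   # embedding and k x d component matrix
```

    For data that truly lies in a k-dimensional subspace, Z @ components reconstructs the centered data exactly, which is a useful sanity check before moving to the nonlinear methods in the review.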

  4. For a while, I’ve been curious to understand results in sparse signal recovery / compressed sensing / matrix completion. In particular, the general themes of “dual certificates” and “golfing schemes” feature as key ingredients in the correctness proofs for these optimization programs. A couple of influential papers in this vein:

    D. Gross. Recovering low-rank matrices from few coefficients in any basis. Available at http://arxiv.org/abs/0910.1879, 2009.

    E. Candes and Y. Plan. A probabilistic RIP-less theory of compressed sensing. Available at http://arxiv.org/abs/1011.3854, 2010.
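
    To get a feel for the recovery problems these proofs certify (not the dual-certificate machinery itself), here is a short numerical sketch of sparse recovery via iterative soft-thresholding (ISTA) applied to the L1-regularized least-squares problem; the parameter names are mine.

```python
import numpy as np

def ista(A, b, lam=0.05, steps=3000):
    # Iterative soft-thresholding for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    x = np.zeros(A.shape[1])
    t = 1.0 / np.linalg.norm(A, 2) ** 2     # step size 1/L, L = ||A||_2^2
    for _ in range(steps):
        g = A.T @ (A @ x - b)               # gradient of the smooth part
        z = x - t * g
        x = np.sign(z) * np.maximum(np.abs(z) - t * lam, 0.0)  # soft-threshold
    return x
```

    With a random Gaussian measurement matrix and far fewer measurements than unknowns (the regime the Candes–Plan paper analyzes), the iterates typically land on the true sparse support.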

  5. NIPS 2013
    Post the name of the paper you wish to present as a reply to this post.
    Paper presentations should be no longer than 10 minutes.
