Dimensionality reduction
==================
1. Overview:
[PDF] Dimensionality reduction: A comparative review
LJP Van der Maaten, EO Postma… – Journal of Machine …, 2009 – Citeseer
2. Semi-supervised local Fisher discriminant analysis for dimensionality reduction
M Sugiyama, T Idé, S Nakajima, J Sese – Machine learning, 2010
3. A Unifying Probabilistic Perspective for Spectral Dimensionality Reduction: Insights and New Models
ND Lawrence – The Journal of Machine Learning Research, 2012
We could instead try to understand rough set theory (I have no idea what that is):
Dimensionality reduction based on rough set theory: A review
And follow a few of the mentioned papers.
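Before diving into those surveys, it may help to have the baseline method they all compare against in hand. Here is a minimal PCA-via-SVD sketch (my own illustration, not taken from any of the papers above; the synthetic data and dimensions are arbitrary):

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the top-k principal directions."""
    Xc = X - X.mean(axis=0)                          # center the data
    # SVD of the centered data; rows of Vt are the principal axes
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                             # (n, k) low-dim coordinates

# Example: 3-D points lying near a 2-D plane, plus a little noise
rng = np.random.default_rng(0)
Z = rng.normal(size=(100, 2))
A = rng.normal(size=(2, 3))
X = Z @ A + 0.01 * rng.normal(size=(100, 3))
Y = pca(X, 2)
print(Y.shape)                                       # two coordinates per point
```

Since the data are essentially planar, the two retained coordinates should account for nearly all of the variance.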
For a while, I’ve been curious to understand results in sparse signal recovery / compressed sensing / matrix completion. In particular, the general themes of “dual certificates” and “golfing schemes” feature as key ingredients in the correctness proofs for the associated optimization programs. A couple of influential papers in this vein:
D. Gross. Recovering low-rank matrices from few coefficients in any basis. Available at http://arxiv.org/abs/0910.1879, 2009.
E. Candes and Y. Plan. A probabilistic RIP-less theory of compressed sensing. Available at http://arxiv.org/abs/1011.3854, 2010.
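For anyone who wants a feel for the setting before reading the proofs, here is a toy basis-pursuit sketch: minimize the ℓ1 norm subject to the measurements, written as a linear program. The problem sizes and the use of scipy's `linprog` are my own choices for illustration; the papers' contribution is the proof machinery, not this recipe.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """min ||x||_1  s.t.  Ax = b, as an LP with x = u - v, u, v >= 0."""
    m, n = A.shape
    c = np.ones(2 * n)                       # objective: sum(u) + sum(v)
    A_eq = np.hstack([A, -A])                # A u - A v = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None))
    u, v = res.x[:n], res.x[n:]
    return u - v

rng = np.random.default_rng(1)
n, m, s = 60, 30, 3                          # ambient dim, measurements, sparsity
x0 = np.zeros(n)
x0[rng.choice(n, s, replace=False)] = rng.normal(size=s)
A = rng.normal(size=(m, n)) / np.sqrt(m)     # Gaussian measurement matrix
x_hat = basis_pursuit(A, A @ x0)
print("max recovery error:", np.max(np.abs(x_hat - x0)))
```

With Gaussian measurements and this level of sparsity, ℓ1 minimization recovers the signal exactly with overwhelming probability, which is exactly the kind of statement the dual-certificate and golfing arguments are used to prove.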
Hessian Free
==================
http://link.springer.com/chapter/10.1007%2F978-3-642-35289-8_27?LI=true
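The heart of Hessian-free optimization is solving the Newton system with conjugate gradient, touching the Hessian only through Hessian-vector products. A rough sketch of one step (finite-difference HVPs for simplicity; real implementations use the Pearlmutter trick plus damping and backtracking):

```python
import numpy as np

def hvp(grad, x, v, eps=1e-6):
    """Hessian-vector product via finite differences of the gradient."""
    return (grad(x + eps * v) - grad(x - eps * v)) / (2 * eps)

def hessian_free_step(grad, x, cg_iters=50):
    """Approximate Newton step: solve H d = -g by conjugate gradient,
    using only Hessian-vector products (never forming H)."""
    g = grad(x)
    d = np.zeros_like(x)
    r = -g                       # residual of H d = -g at d = 0
    if np.linalg.norm(r) < 1e-10:
        return d
    p = r.copy()
    for _ in range(cg_iters):
        Hp = hvp(grad, x, p)
        alpha = (r @ r) / (p @ Hp)
        d += alpha * p
        r_new = r - alpha * Hp
        if np.linalg.norm(r_new) < 1e-10:
            break
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return d

# Quadratic test: f(x) = 0.5 x^T A x - b^T x, so grad(x) = A x - b
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
x = np.zeros(2) + hessian_free_step(grad, np.zeros(2))
print(np.allclose(A @ x, b, atol=1e-4))   # one HF step solves a quadratic
```

On a quadratic, a single Hessian-free step reduces to plain CG on the Newton system, which is why the check above lands on the exact minimizer.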
Randomized Smoothing for Stochastic Optimization (or, as a topic: algorithms for nonsmooth optimization)
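The core trick, as I understand it, is to replace a nonsmooth f with its Gaussian smoothing f_mu(x) = E[f(x + mu·u)], whose gradient can be estimated from function values alone. A toy sketch (the sample count, smoothing radius, and two-point estimator are my own choices for illustration):

```python
import numpy as np

def smoothed_grad(f, x, mu=1e-2, samples=2000, rng=None):
    """Monte-Carlo gradient of the Gaussian smoothing f_mu(x) = E[f(x + mu*u)],
    via the two-point estimator (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u."""
    if rng is None:
        rng = np.random.default_rng(0)
    g = np.zeros_like(x, dtype=float)
    for _ in range(samples):
        u = rng.normal(size=x.shape)
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / samples

# f(x) = ||x||_1 is nonsmooth at the axes; away from them, the smoothed
# gradient at x = (1, -1) should approach the subgradient (1, -1).
f = lambda x: np.sum(np.abs(x))
g = smoothed_grad(f, np.array([1.0, -1.0]))
print(g)
```

The estimate is noisy (Monte-Carlo error shrinks like 1/sqrt(samples)), but it lets plain stochastic gradient machinery run on objectives with no gradients at all.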
Gaussian Processes for Machine Learning
==================
Rasmussen and Williams
Chapters 2 and 5 of the book cover Gaussian process regression, and are an easy read.
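For reference, the core of Chapter 2 (posterior mean and variance under an RBF kernel, in the spirit of the book's Algorithm 2.1) fits in a few lines. This is a bare-bones rendering with hyperparameters fixed by hand rather than learned:

```python
import numpy as np

def gp_regression(X, y, X_star, lengthscale=1.0, signal=1.0, noise=0.01):
    """GP posterior mean and variance at X_star, with an RBF kernel."""
    def k(A, B):
        d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
        return signal**2 * np.exp(-0.5 * d2 / lengthscale**2)

    K = k(X, X) + noise**2 * np.eye(len(X))
    L = np.linalg.cholesky(K)                   # stable solve via Cholesky
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    Ks = k(X_star, X)
    mean = Ks @ alpha                           # posterior mean
    v = np.linalg.solve(L, Ks.T)
    var = np.diag(k(X_star, X_star)) - np.sum(v**2, axis=0)  # posterior variance
    return mean, var

# Noiseless sine data: with a small noise term the GP nearly interpolates
X = np.linspace(0, 5, 20)[:, None]
y = np.sin(X).ravel()
mean, var = gp_regression(X, y, X)
print("max |mean - y| at training points:", np.max(np.abs(mean - y)))
```

Chapter 5 then covers choosing `lengthscale`, `signal`, and `noise` by maximizing the marginal likelihood instead of fixing them as done here.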
http://arxiv.org/pdf/1304.7045v1.pdf
The results seem impressive and the main idea looks quite simple and elegant. I volunteer to lead the discussion.
NIPS 2013
==================
Please post the name of the paper you wish to present as a reply to this post.
Paper presentations should be no longer than 10 minutes.