Sent September 9:
This week we will go over the paper:
It covers primal-dual averaging, one of the methods mentioned in the review we read but didn't quite work out in detail.
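For those who want a head start, here is the basic (Nesterov-style) dual averaging update as a reminder; this is only a sketch of the generic scheme, and the exact primal-dual variant the paper analyzes may differ in its details. With subgradients g_s, a prox-function d, and a nondecreasing sequence \beta_t, each iterate solves

    x_{t+1} = \arg\min_{x \in X} \Big\{ \Big\langle \sum_{s=1}^{t} g_s,\, x \Big\rangle + \beta_t\, d(x) \Big\},

i.e., it minimizes the accumulated linear model of the objective plus a slowly growing regularizer.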
Sent August 26:
Reminder – As we discussed last week, we will talk about the second part of the optimization lecture on Thursday.
Sent August 13:
Sent August 5:
I want to dedicate a few sessions to the area of optimization in ML. The idea is to cover new results, but also to build a "map" of the area and draw the connections between its subfields.
To bring us all up to a common level, this week, instead of reading a paper on a specific algorithmic/theoretical result, I thought we should read a review of the subject. I couldn't find a good written one, but found a nice NIPS tutorial on the subject (so you don't even have to read :)).
Sent July 28:
Sent July 14:
Also, here is a motivational video (learning this task previously demanded hundreds of trials; this algorithm does it in 7):
Sent July 3:
It was suggested that we do a couple of sessions on Gaussian Processes. For next week, please read Chapters 2 and 5 of the book Gaussian Processes for Machine Learning, available at
http://www.gaussianprocess.org/gpml/.
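If you want the punchline of Chapter 2 in one formula before the reading: for GP regression with kernel k, training data (X, y), and noise variance \sigma_n^2, the posterior prediction at a test point x_* is Gaussian with

    \bar{f}_* = k_*^\top (K + \sigma_n^2 I)^{-1} y, \qquad
    \mathbb{V}[f_*] = k(x_*, x_*) - k_*^\top (K + \sigma_n^2 I)^{-1} k_*,

where K is the kernel matrix on the training inputs and k_* is the vector of kernel values between x_* and the training inputs.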
Sent June 16:
This week we will go over the paper:
Sent June 9:
Odalric will lead the discussion on the paper –
“Follow the Leader If You Can, Hedge If You Must”
http://arxiv.org/pdf/1301.0534v2.pdf, by Steven de Rooij, Tim van Erven, Peter D. Grünwald, Wouter M. Koolen.
This paper considers the online learning setting and looks for a way to optimally tune the Hedge algorithm, so as to obtain a (really) adaptive algorithm.
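For context, here is a minimal Python sketch of plain Hedge (exponential weights) with a fixed learning rate eta; the paper is precisely about choosing eta adaptively, which this sketch does not attempt, and the toy loss matrix below is made up for illustration.

import numpy as np

def hedge(losses, eta):
    """Plain Hedge over K experts.
    losses: (T, K) array of per-round expert losses in [0, 1].
    eta: fixed learning rate (the quantity the paper tunes adaptively).
    Returns the (T, K) sequence of probability vectors played."""
    T, K = losses.shape
    log_w = np.zeros(K)                  # uniform initial weights
    plays = np.empty((T, K))
    for t in range(T):
        p = np.exp(log_w - log_w.max())  # numerically stable softmax
        plays[t] = p / p.sum()
        log_w -= eta * losses[t]         # exponential weight update
    return plays

losses = np.random.default_rng(0).random((100, 3))  # toy losses
print(hedge(losses, eta=0.5)[-1])  # final distribution over experts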
Sent April 29:
This week we’ll be reading the paper “Sparse inverse covariance estimation with the graphical lasso” by Friedman et al. (http://www-stat.stanford.edu/~tibs/ftp/graph.pdf). The paper discusses the problem of estimating sparse graphs by applying a lasso penalty to the inverse covariance matrix. The connection to graphs: conditional independence between two variables (given all the others) may be deduced when the corresponding entry of the inverse covariance matrix is zero; for a reminder on the subject, see the attached tutorial.
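If you want to play with the estimator before the session, here is a minimal sketch using scikit-learn's GraphicalLasso, which solves the same l1-penalized problem as the paper; the data and the penalty value alpha below are made up for illustration.

import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))   # toy data: 200 samples, 5 variables

model = GraphicalLasso(alpha=0.1)   # alpha: strength of the l1 penalty
model.fit(X)

precision = model.precision_        # estimated sparse inverse covariance
# A zero at entry (i, j) is read as: variables i and j are conditionally
# independent given all the remaining variables.
print(np.round(precision, 3))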