Training Restricted Boltzmann Machines using Approximations to the Likelihood Gradient
published: July 29, 2008, recorded: July 2008, views: 12051
Description
A new algorithm for training Restricted Boltzmann Machines is introduced. The algorithm, named Persistent Contrastive Divergence, is different from the standard Contrastive Divergence algorithms in that it aims to draw samples from almost exactly the model distribution. It is compared to some standard Contrastive Divergence algorithms on the tasks of modeling handwritten digits and classifying digit images by learning a model of the joint distribution of images and labels. The Persistent Contrastive Divergence algorithm outperforms other Contrastive Divergence algorithms, and is equally fast and simple.
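To make the idea in the description concrete, here is a minimal sketch of Persistent Contrastive Divergence for a tiny binary RBM, written in plain Python. The network sizes, learning rate, and toy data are invented for illustration, and a practical implementation would use minibatches and vectorized linear algebra. The key difference from standard Contrastive Divergence is that the negative-phase Gibbs chain (`fantasy_v` below) persists across parameter updates instead of being restarted from the training data on every step, so its samples come from close to the model distribution.

```python
import math
import random

random.seed(0)

NV, NH = 4, 3  # visible/hidden unit counts (toy sizes, illustrative only)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# RBM parameters: weights plus visible and hidden biases
W = [[random.gauss(0.0, 0.01) for _ in range(NH)] for _ in range(NV)]
bv = [0.0] * NV
bh = [0.0] * NH

def hidden_probs(v):
    # P(h_j = 1 | v) for each hidden unit
    return [sigmoid(bh[j] + sum(v[i] * W[i][j] for i in range(NV)))
            for j in range(NH)]

def visible_probs(h):
    # P(v_i = 1 | h) for each visible unit
    return [sigmoid(bv[i] + sum(h[j] * W[i][j] for j in range(NH)))
            for i in range(NV)]

def sample(probs):
    return [1.0 if random.random() < p else 0.0 for p in probs]

# Persistent "fantasy particle": kept across updates, never reset to the data.
fantasy_v = [random.choice([0.0, 1.0]) for _ in range(NV)]

data = [[1.0, 1.0, 0.0, 0.0], [0.0, 0.0, 1.0, 1.0]]  # toy training patterns
lr = 0.05

for step in range(200):
    v_data = random.choice(data)
    h_data = hidden_probs(v_data)          # positive phase: driven by the data

    # Negative phase: continue the persistent Gibbs chain from where it left off.
    h_f = sample(hidden_probs(fantasy_v))
    fantasy_v = sample(visible_probs(h_f))
    h_f_probs = hidden_probs(fantasy_v)

    # Gradient step: data statistics minus model (fantasy) statistics.
    for i in range(NV):
        for j in range(NH):
            W[i][j] += lr * (v_data[i] * h_data[j] - fantasy_v[i] * h_f_probs[j])
    for i in range(NV):
        bv[i] += lr * (v_data[i] - fantasy_v[i])
    for j in range(NH):
        bh[j] += lr * (h_data[j] - h_f_probs[j])
```

Replacing the persistent-chain negative phase with one Gibbs step started from `v_data` would turn this into ordinary CD-1, which is what makes the comparison in the paper possible at essentially the same cost.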
Download slides: icml08_tieleman_trb_01.pdf (296.2 KB)
Download slides: icml08_tieleman_trb_01.ppt (835.0 KB)
Reviews and comments:
Nice lecture, thanks. I am implementing PCD in MATLAB, using the minibatch version, following Hinton's CD-1 code (the Science paper code) available on his website.
In that code, I have included the line below before the negative phase, so that the second batch's Gibbs chain is initialized from the previous model's hidden probabilities. Does that follow PCD? But when I visualize the reconstructed samples, they are not the same as the input samples. Could you please explain this to me?
if batch ~= 1
    poshidprobs = neghidprobs;  % carry over hidden probabilities from the previous batch
end
thanks
subha
Write your own review or comment: