Log-linear Models and Conditional Random Fields

author: Charles Elkan, Department of Computer Science and Engineering, UC San Diego
published: Nov. 19, 2008,   recorded: October 2008,   views: 119146

Watch videos:

Part 1 (1:01:51)
Part 2 (1:02:01)
Part 3 (47:18)
Part 4 (1:00:25)
Part 5 (53:35)
Part 6 (39:52)

Description

Log-linear models are a far-reaching extension of logistic regression, while conditional random fields (CRFs) are a special case of log-linear models suitable for so-called structured learning tasks. Structured learning means learning to predict outputs that have internal structure. For example, recognizing handwritten words is more accurate when the correlations between neighboring letters are used to refine predictions. This tutorial will provide a simple but thorough introduction to these new developments in machine learning that have great potential for many novel applications.
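For reference, here is a standard formulation of the model just described (my own summary, not quoted from the lecture): a log-linear model with feature functions F_j(x, y) and real-valued weights w_j defines the conditional distribution

    p(y \mid x; w) = \frac{\exp\left( \sum_j w_j F_j(x, y) \right)}{Z(x, w)},
    \qquad
    Z(x, w) = \sum_{y'} \exp\left( \sum_j w_j F_j(x, y') \right),

and a linear-chain CRF is the special case in which each feature function decomposes over positions of the output sequence, F_j(x, y) = \sum_i f_j(y_{i-1}, y_i, x, i).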

The tutorial will first explain what log-linear models are, with concrete examples but also with mathematical generality. Next, feature functions will be explained; these are the knowledge-representation technique underlying log-linear models. The tutorial will then present linear-chain CRFs, from the point of view that they are a special case of log-linear models. The Viterbi algorithm that makes inference tractable for linear-chain CRFs will be covered, followed by a discussion of inference for general CRFs. The presentation will continue with a general derivation of the gradient of log-linear models; this is the mathematical foundation of all log-linear training algorithms. Then, the tutorial will discuss two important special-case CRF training algorithms, one that is a variant of the perceptron method, and another called contrastive divergence. Last but not least, the tutorial will introduce publicly available software for training and using CRFs, and will explain a practical application of CRFs with hands-on detail.
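To make the Viterbi step concrete, the following is a minimal sketch of Viterbi decoding for a linear-chain CRF. It is an illustration under my own assumptions, not the tutorial's code: it assumes the weighted feature scores have already been collected into a per-position emission matrix and a label-pair transition matrix, and the function name and toy data are hypothetical.

import numpy as np

def viterbi_decode(emissions, transitions):
    """Most likely label sequence for a linear-chain CRF.

    emissions:   array of shape (n_positions, n_labels); emissions[i, y] is the
                 total score of features depending only on the label y at position i.
    transitions: array of shape (n_labels, n_labels); transitions[a, b] is the
                 total score of features depending on the label pair (a, b).
    Returns the highest-scoring label sequence as a list of label indices.
    """
    n, k = emissions.shape
    score = emissions[0].copy()              # best score of any path ending in each label at position 0
    backpointers = np.zeros((n, k), dtype=int)

    for i in range(1, n):
        # candidate[a, b] = best path ending in label a at i-1, then moving to label b at i
        candidate = score[:, None] + transitions + emissions[i][None, :]
        backpointers[i] = candidate.argmax(axis=0)
        score = candidate.max(axis=0)

    # Follow the back-pointers from the best final label.
    best = [int(score.argmax())]
    for i in range(n - 1, 0, -1):
        best.append(int(backpointers[i, best[-1]]))
    best.reverse()
    return best

# Hypothetical toy example: 4 positions, 3 labels, random scores.
rng = np.random.default_rng(0)
print(viterbi_decode(rng.normal(size=(4, 3)), rng.normal(size=(3, 3))))

The perceptron-style training mentioned above repeatedly needs exactly this kind of argmax decoding, while the gradient of the log-likelihood takes the standard "observed minus expected feature values" form, F_j(x, y) - E[F_j(x, y')] with the expectation taken under the model distribution p(y' | x; w).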


Reviews and comments:

Comment1 Hung Ngo, November 3, 2009 at 10:01 a.m.:

Lecture notes for this tutorial can be found at the author's homepage:

http://cseweb.ucsd.edu/~elkan/250B/ci...


Comment2 Pablo Barrio, February 13, 2010 at 12:11 a.m.:

There's something missing in the last video, unfortunately: the Collins perceptron.


Comment3 balaji, March 19, 2010 at 6:19 a.m.:

very informative tutorial! thanks a lot Prof Elkan!


Comment4 Rohan, December 17, 2010 at 2:13 p.m.:

Awesome lectures \m/


Comment5 signali, April 12, 2011 at 3:19 a.m.:

I also think there should be another lecture to cover all the material.


Comment6 Trung Huynh, May 19, 2011 at 12:59 p.m.:

Excellent lecture.


Comment7 Andreas, May 23, 2011 at 8:55 a.m.:

This tutorial is really an excellent introduction to conditional random fields. The pace is slow enough for everything to be pretty clear.

Thanks a lot for putting this online. There were only five people at the actual tutorial, but 3,000 views prove that a lot more people have benefited from the videos.


Comment8 KGD, June 16, 2011 at 7:01 a.m.:

Thank you Professor Elkan. Clear and well-paced tutorial.


Comment9 Jason Lin, July 12, 2011 at 4:29 a.m.:

Thanks to Professor Elkan. Wonderful lecture!


Comment10 James Wu, October 17, 2011 at 4:27 p.m.:

Before watching the video, I knew nothing about CRFs.
After the tutorial, I feel that I know a lot.


Comment11 Michele Filannino, February 17, 2012 at 6:49 p.m.:

Prof. Elkan, thank you very much for this lecture.

Bye,
michele.


Comment12 Ian, February 22, 2012 at 7:28 a.m.:

Thank you Professor Elkan,

I am not a mathematician by training and you make much of this simple enough for me to understand. It is a great help to me!


Comment13 cuong hoang, April 22, 2012 at 9:20 p.m.:

Actually, I don't clearly understand what he meant when he said that we do not need to model p(x).
Thank you, Professor Elkan, for an awesome tutorial!


Comment14 Abhishek Shivkumar, December 22, 2012 at 8:25 p.m.:

The best video on Conditional Random Fields I have ever seen. Perfect for a beginner who has no idea what a CRF is. I think one last video, part 7, is missing, because he hasn't ended the session in the 6th video.


Comment15 gz_ricky, May 9, 2013 at 6:48 a.m.:

Thanks, Elkan. Now I'm clearer about CRFs, and I will go on to learn about 2D CRF structures.


Comment16 abi_utem, August 25, 2014 at 9:50 a.m.:

Thank you, Elkan. You did a very wonderful tutorial, and uploading it extends the benefit to anyone who searches for this kind of topic.


Comment17 Ashish Kumar, October 2, 2014 at 6:41 a.m.:

Sorry, I cannot find the lecture notes. I think the link is dead. Is there an alternative link?


Comment18 dola, April 30, 2015 at 7:33 p.m.:

Lecture Notes are available on the lecture website:
http://cseweb.ucsd.edu/~elkan/250B/


Comment19 Arun Chauhan, March 2, 2017 at 12:28 p.m.:

Really, Rakesh Aggarwal is sitting among four.


Comment20 nikai, February 27, 2020 at 5:33 a.m.:

Your feedback helps me a lot. A very meaningful event; I hope everything will go well. https://candycrushsoda.co


Comment21 arianapham, April 6, 2020 at 6:50 a.m.:

This video helped me learn a lot of things.


Comment22 Burt Macklin, June 28, 2020 at 10:47 p.m.:

Great lecture. I learned a lot. Thank you

-Burt | https://selltohouseguys.com/


Comment23 Great Jobs, May 13, 2021 at 8:25 p.m.:

Thank you for such a sweet tutorial. All this time later, I've found it and love the end result. I appreciate the time you spent sharing your skills. https://www.recruitmentcave.com/news/...


Comment24 Newsflasharena, August 7, 2021 at 10:01 p.m.:

Professor Elkan, thank you very much.

I am not a mathematician by training, yet you make much of this understandable for me. It is really beneficial to me!
https://www.pastquestionsarena.com.ng...


Comment25 All About , October 17, 2021 at 8:14 p.m.:

I love the valuable info provided in your article.


Comment26 G.P, October 17, 2021 at 8:14 p.m.:

Thanks for sharing excellent, good information: https://globalpoint.in/
