Cost-sensitive learning based on Bregman divergences
author: Raúl Santos-Rodríguez,
Carlos III University of Madrid
published: Oct. 20, 2009, recorded: September 2009, views: 2536
Description
This paper analyzes the application of a particular class of Bregman divergences to the design of cost-sensitive classifiers for multiclass problems. We show that these divergence measures can be used to estimate posterior probabilities with maximal accuracy for probability values close to the decision boundaries. Asymptotically, the proposed divergence measures yield classifiers that minimize the sum of decision costs in non-separable problems, and that maximize a margin in separable MAP problems.
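The two ingredients named in the abstract can be sketched in a few lines: the general Bregman divergence for a convex generating function, and the cost-sensitive decision rule that turns estimated posteriors into the class minimizing expected cost. This is a minimal illustration, not the paper's specific divergence family; the generating function (negative entropy, which recovers the KL divergence) and the cost matrix below are assumptions chosen for the example.

```python
import numpy as np

def bregman_divergence(phi, grad_phi, p, q):
    """D_phi(p, q) = phi(p) - phi(q) - <grad phi(q), p - q>."""
    return phi(p) - phi(q) - grad_phi(q) @ (p - q)

# Negative entropy as the generating convex function; its Bregman
# divergence between probability vectors is the KL divergence.
neg_entropy = lambda p: np.sum(p * np.log(p))
grad_neg_entropy = lambda p: np.log(p) + 1.0

def cost_sensitive_decision(posteriors, cost_matrix):
    """Pick the class j minimizing expected cost sum_i C[i, j] * P(i | x)."""
    expected_costs = posteriors @ cost_matrix  # one expected cost per decision
    return int(np.argmin(expected_costs))

# Example posteriors over three classes and a (hypothetical) cost matrix
# C[i, j] = cost of deciding class j when the true class is i.
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.5, 0.3, 0.2])
C = np.array([[0.0,  1.0, 5.0],
              [1.0,  0.0, 1.0],
              [10.0, 1.0, 0.0]])

d = bregman_divergence(neg_entropy, grad_neg_entropy, p, q)
kl = np.sum(p * np.log(p / q))  # d and kl agree up to floating-point error

print(cost_sensitive_decision(p, C))  # prints 1
```

Note that even though class 0 has the largest posterior, the high cost of deciding class 0 when class 2 is true shifts the minimum-expected-cost decision to class 1, which is exactly the behavior a cost-sensitive classifier is meant to capture.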