Meta Content

Statistical inference / Kullback–Leibler divergence / Estimator / Normal distribution / Consistent estimator / M-estimator / Divergence / F-divergence / Bregman divergence / Statistics / Estimation theory / Statistical theory


IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 56, NO. 11, NOVEMBER 2010
Estimating Divergence Functionals and the Likelihood Ratio by Convex Risk Minimization

Document Date: 2010-11-03 13:13:21


File Size: 636.12 KB

City

Nice

Company

Neural Information Processing Systems

Country

France / United States

Currency

USD

Facility

University of Michigan / University of California

IndustryTerm

statistical applications / inner product / signal processing / optimization algorithm / dual solution / primal solution

NaturalFeature

Shannon

Organization

University of California, Berkeley / University of Michigan, Ann Arbor / National Science Foundation / Department of Electrical Engineering and Computer Science / Department of Statistics

Person

Martin J. Wainwright / Michael I. Jordan

Position

Associate Editor for Detection and Estimation

ProvinceOrState

Michigan

PublishedMedium

IEEE TRANSACTIONS ON INFORMATION THEORY

Technology

machine learning / simulation / 3-D / Digital Object Identifier / optimization algorithm

