Variational Bayesian methods / Statistical models / Expectation–maximization algorithm / Dirichlet process / Gibbs sampling / Mixture model / Bayesian inference / Exponential family / Kullback–Leibler divergence / Statistics / Bayesian statistics / Statistical theory
Date: 2015-03-12 00:16:23

Variational Inference for the Nested Chinese Restaurant Process
Chong Wang, Computer Science Department, Princeton University

Source URL: www.cs.columbia.edu

File Size: 216.01 KB

Similar Documents

Combining observation models in dual exposure problems using the Kullback-Leibler Divergence

DocID: 1nWoh

On Sparse Variational Methods and the Kullback-Leibler Divergence between Stochastic Processes. Alexander G. de G. Matthews, James Hensman, Richard E. Turner, Zoubin Ghahramani

DocID: 1mR5K

Using the Kullback-Leibler Divergence to Combine Image Priors in Super-Resolution Image Reconstruction. Proceedings of the 2010 IEEE 17th International Conference on Image Processing, September 26-29, 2010, Hong Kong

DocID: 1mLVW

Escort distributions minimizing the Kullback–Leibler divergence for a large deviations principle and tests of entropy level. Ann Inst Stat Math, 439–468

DocID: 1mioD

Robust Control and Model Misspecification. Lars Peter Hansen, Thomas J. Sargent, Gauhar A. Turmuhambetova, Noah Williams. September 26, 2005

DocID: 1gAK9