Mathematical analysis / Statistical theory / Randomness / Kullback–Leibler divergence / Thermodynamics / Divergence / Mutual information / Entropy / Cross entropy / Statistics / Information theory / Mathematics


A FAMILY OF DISCRIMINATIVE TRAINING CRITERIA BASED ON THE F-DIVERGENCE FOR DEEP NEURAL NETWORKS
Markus Nussbaum-Thom (1,2), Xiaodong Cui (1), Ralf Schlüter (2), Vaibhava Goel (1), Hermann Ney (2,3)
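The paper's topic, training criteria drawn from the f-divergence family, rests on a standard construction: for a convex function f with f(1) = 0, the f-divergence between distributions P and Q is D_f(P || Q) = Σ_x Q(x) f(P(x)/Q(x)), and choices of f recover familiar measures such as the KL divergence (f(t) = t log t). A minimal sketch of that relationship, with the probability vectors chosen arbitrarily for illustration:

```python
import math

def f_divergence(p, q, f):
    """Generic f-divergence: D_f(P || Q) = sum_x q(x) * f(p(x) / q(x))."""
    return sum(qx * f(px / qx) for px, qx in zip(p, q) if qx > 0)

# KL divergence is the f-divergence generated by f(t) = t * log(t).
kl_generator = lambda t: t * math.log(t) if t > 0 else 0.0

# Example distributions (illustrative values, not from the paper).
p = [0.7, 0.2, 0.1]
q = [0.5, 0.3, 0.2]

d_kl = f_divergence(p, q, kl_generator)

# Direct KL for comparison: sum_x p(x) * log(p(x) / q(x)).
d_direct = sum(px * math.log(px / qx) for px, qx in zip(p, q) if px > 0)
```

Here `d_kl` and `d_direct` agree, illustrating that KL is one member of the family; other generators f yield other divergences (and, per the paper's premise, other training criteria).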


File Size: 241.09 KB


City

Detroit / Belgrade / Lyon / Puerto de Andratx / New York

Company

Context-Dependent Deep Networks / DNN / Transcription Using Context-Dependent Deep Neural Networks / Deep Neural Networks / McGraw-Hill / IBM

Country

France / Australia / Spain


Event

Reorganization

Facility

Victoria University / RWTH Aachen University

IndustryTerm

neural network / neural networks / conversational and scripted telephony speech

Organization

Victoria University / Melbourne / Department of Defense U.S. Army Research Laboratory / RWTH Aachen University / Aachen / U.S. Government / Department of Defense

Person

Vaibhava Goel / Ralf Schlüter / Xiaodong Cui

Position

CPA / speaker / first author / senior chair / α-CPA

Product

C-0012

ProgrammingLanguage

R / C

ProvinceOrState

Michigan

PublishedMedium

IEEE Transactions on Information Theory


Technology

alpha / speech recognition / neural network
