HIRAOKA Kazuyuki (平岡 和幸)
Affiliation  Faculty of Business Administration, Department of Data Science
Position  Professor
Date of publication  2018/02
Publication type  Academic journal
Peer review  Refereed
Title  Necessary and sufficient conditions of proper estimators based on self density ratio for unnormalized statistical models
Authorship  Co-authored
Journal  Neural Networks
Publication category  International
Publisher  Elsevier BV
Volume/Issue/Pages  Vol. 98, pp. 263-270
Authors/Co-authors  Kazuyuki Hiraoka, Toshihiko Hamada, Gen Hori
Abstract  The largest family of density-ratio-based estimators is obtained for unnormalized statistical models under the assumption of properness. They do not require normalization of the probability density function (PDF) because they are based on the density ratio of the same PDF at different points. In contrast with most existing work, a single necessary and sufficient condition is given here, rather than merely sufficient conditions, for proper criteria for estimation. The condition implies that an extended Bregman divergence framework with data-dependent noise (Gutmann & Hirayama, 2011) gives the largest family of proper criteria in the present case. This properness yields consistent estimation as long as some mild conditions are satisfied. The present study shows that the above-mentioned framework gives an “upper bound” for attempts to extend Hyvärinen's score matching and therefore provides a perspective for studies in this direction.
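Illustrative note  The normalization-free property described in the abstract can be sketched with generic notation (an unnormalized model $\bar{p}_\theta$ and normalizer $Z(\theta)$, chosen here for illustration rather than taken from the paper). Writing $p_\theta(x) = \bar{p}_\theta(x)/Z(\theta)$, the density ratio of the same PDF at two points $x$ and $x'$ is

\[
\frac{p_\theta(x)}{p_\theta(x')}
= \frac{\bar{p}_\theta(x)/Z(\theta)}{\bar{p}_\theta(x')/Z(\theta)}
= \frac{\bar{p}_\theta(x)}{\bar{p}_\theta(x')},
\]

so estimation criteria built from such self density ratios can be evaluated without the (often intractable) normalizing constant $Z(\theta)$.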
DOI 10.1016/j.neunet.2017.11.018
ISSN 0893-6080 (print), 1879-2782 (online)
PMID 29288873