figshare
A Deep Generative Approach to Conditional Sampling

journal contribution
posted on 2021-12-10, 20:20 authored by Xingyu Zhou, Yuling Jiao, Jin Liu, Jian Huang

We propose a deep generative approach to sampling from a conditional distribution based on a unified formulation of the conditional distribution and the generalized nonparametric regression function via the noise-outsourcing lemma. The proposed approach learns a conditional generator so that a random sample from the target conditional distribution can be obtained by transforming a sample drawn from a reference distribution. The conditional generator is estimated nonparametrically with neural networks by matching appropriate joint distributions using the Kullback-Leibler divergence. An appealing aspect of our method is that it allows either or both of the predictor and the response to be high-dimensional, and it can handle both continuous and discrete predictors and responses. We show that the proposed method is consistent in the sense that the conditional generator converges in distribution to the underlying conditional distribution under mild conditions. Our numerical experiments with simulated and benchmark image data validate the proposed method and demonstrate that it outperforms several existing conditional density estimation methods. Supplementary materials for this article are available online.
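The noise-outsourcing idea behind the conditional generator can be illustrated with a toy example: if Y | X = x ~ N(x, 1), then G(x, η) = x + η with reference noise η ~ N(0, 1) reproduces the conditional distribution exactly. This is a minimal sketch of the representation only; the closed-form generator and the Gaussian target below are illustrative assumptions, not the paper's neural-network estimator or its KL-matching procedure:

```python
import numpy as np

def conditional_generator(x, eta):
    """Toy conditional generator G(x, eta) for the target Y | X = x ~ N(x, 1).

    By the noise-outsourcing lemma, a conditional distribution can be written
    as a deterministic transform of the predictor and independent reference
    noise; here that transform is assumed known in closed form.
    """
    return x + eta

rng = np.random.default_rng(0)
x = 2.0                                  # condition on X = 2
eta = rng.standard_normal(100_000)       # draws from the reference N(0, 1)
samples = conditional_generator(x, eta)  # draws from Y | X = 2

print(samples.mean(), samples.var())     # approximately 2.0 and 1.0
```

In the paper itself, `conditional_generator` is a neural network learned from data rather than a known formula, but sampling proceeds the same way: draw reference noise, then push it through the generator conditioned on the observed predictor.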

Funding

The work of X. Zhou and J. Huang is supported in part by the U.S. National Science Foundation (grant DMS-1916199). Y. Jiao is supported in part by the National Natural Science Foundation of China (No. 11871474) and by the research fund of KLATASDSMOE of China. The work of J. Liu is supported by the Duke-NUS Medical School (R-913-200-098-263) and by grant MOE2018-T2-2-006 from the Ministry of Education, Singapore.