
Generative Model Basic 2

by 에아오요이가야 2023. 12. 5.

Maximum Likelihood Learning

 - Given the parameters, how likely is the observed data under the model? We train in the direction that increases this likelihood.

 - What should we use as the metric? -> KL-divergence

 - Minimizing KL-divergence <=> maximizing the expected log-likelihood

 - The empirical log-likelihood is a (Monte Carlo) approximation of the expected log-likelihood.

 - ERM (empirical risk minimization) is the usual approach, but it suffers from overfitting.

*Jensen-Shannon divergence (GAN) and Wasserstein distance (WAE, AAE) are also used as metrics.
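The equivalence above can be made concrete with a toy example: for a Gaussian model, maximizing the empirical log-likelihood (the Monte Carlo stand-in for the expected log-likelihood) recovers the familiar closed-form MLE. A minimal sketch, assuming a synthetic 1-D dataset:

```python
import numpy as np

rng = np.random.default_rng(0)
# Samples from the (unknown) data distribution we want to model.
data = rng.normal(loc=2.0, scale=1.0, size=10_000)

def avg_log_likelihood(mu, sigma, x):
    """Empirical log-likelihood of a Gaussian model: the sample average
    of log p(x | mu, sigma), approximating the expected log-likelihood."""
    return np.mean(-0.5 * np.log(2 * np.pi * sigma**2)
                   - (x - mu) ** 2 / (2 * sigma**2))

# For a Gaussian, the maximum-likelihood parameters have a closed form:
mu_mle, sigma_mle = data.mean(), data.std()

# Any other parameter choice scores a lower empirical log-likelihood.
assert avg_log_likelihood(mu_mle, sigma_mle, data) >= avg_log_likelihood(0.0, 1.0, data)
```

Because the expected log-likelihood differs from the (negative) KL-divergence only by a constant that does not depend on the model, climbing one is climbing the other.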

 

Latent Variable Models (a Variational AutoEncoder is a generative model; a plain AutoEncoder is not)

 - Key limitation

    - The likelihood is intractable (hard to evaluate exactly).

    - The prior fitting term should be differentiable, hence it is hard to use diverse latent prior distributions.

    - In most cases, we use an isotropic Gaussian where we have a closed-form for the prior fitting term.
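For the isotropic-Gaussian case mentioned above, the prior fitting term KL( q(z|x) || N(0, I) ) has a well-known closed form. A minimal sketch of that term (batch and latent sizes are arbitrary choices for illustration):

```python
import numpy as np

def gaussian_prior_kl(mu, log_var):
    """Closed-form KL( N(mu, diag(exp(log_var))) || N(0, I) ), per data
    point — the differentiable 'prior fitting term' of the VAE objective."""
    return -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var), axis=-1)

# When the approximate posterior equals the standard-normal prior, the KL is 0.
mu = np.zeros((4, 8))        # batch of 4, latent dimension 8
log_var = np.zeros((4, 8))
assert np.allclose(gaussian_prior_kl(mu, log_var), 0.0)

# Any deviation from the prior makes the term strictly positive.
assert np.all(gaussian_prior_kl(mu + 0.5, log_var) > 0)
```

This closed form is exactly why the isotropic Gaussian is the default prior: with other priors the KL usually has no such expression, which is the limitation noted above.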

 

Generative Adversarial Networks

 - A minimax game between the generator and the discriminator.

 - The generator minimizes the value function; the discriminator maximizes it.
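The minimax structure can be seen directly in the GAN value function V(D, G) = E[log D(x)] + E[log(1 - D(G(z)))]. A small numeric sketch, where `d_real` and `d_fake` stand in for the discriminator's probability outputs on real and generated samples:

```python
import numpy as np

def gan_value(d_real, d_fake):
    """GAN value function V(D, G) = E[log D(x)] + E[log(1 - D(G(z)))].
    d_real / d_fake are discriminator outputs in (0, 1)."""
    return np.mean(np.log(d_real)) + np.mean(np.log(1.0 - d_fake))

# The discriminator maximizes V: confident, correct outputs give a higher value...
v_good_d = gan_value(np.full(8, 0.9), np.full(8, 0.1))
v_bad_d = gan_value(np.full(8, 0.6), np.full(8, 0.4))
assert v_good_d > v_bad_d

# ...while the generator minimizes V by pushing D(G(z)) toward 1.
v_fooled = gan_value(np.full(8, 0.9), np.full(8, 0.5))
assert v_fooled < v_good_d
```

At the optimal discriminator, minimizing this value over the generator is equivalent (up to constants) to minimizing the Jensen-Shannon divergence noted earlier.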

 

Diffusion Models - generate images from noise; their performance is markedly better

 - progressively generate images from noise

 - Diffusion(Forward) process

 - The reverse process is learned to denoise the perturbed image back into a clean image <- this is the part that is trained
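The forward (diffusion) process needs no learning at all: it just mixes the image with Gaussian noise according to a fixed schedule, and admits a closed-form sample at any step t. A minimal sketch, assuming a DDPM-style linear beta schedule (the schedule values here are an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed linear noise schedule over T steps (DDPM-style).
T = 1000
betas = np.linspace(1e-4, 0.02, T)
alpha_bar = np.cumprod(1.0 - betas)  # cumulative signal-retention factor

def forward_diffuse(x0, t):
    """Closed-form forward process:
    x_t = sqrt(alpha_bar_t) * x_0 + sqrt(1 - alpha_bar_t) * noise."""
    noise = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar[t]) * x0 + np.sqrt(1.0 - alpha_bar[t]) * noise

x0 = np.ones(1000)  # a toy "image" of constant pixels
# Early steps barely perturb the input; by the last step it is almost pure noise.
assert np.abs(forward_diffuse(x0, 0) - x0).mean() < 0.1
assert alpha_bar[-1] < 1e-2
```

The learned part is the reverse: a network is trained (typically by predicting the added noise) to invert these steps, progressively turning noise back into an image.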
