Suppose we are given images of dogs, and we want to learn a probability distribution p(x) over them.
- Generation : if we sample x_new ~ p(x), then x_new should look like a dog.
- Density estimation : p(x) should be high when x looks like a dog, and low otherwise (this also enables anomaly detection).
- A model that supports density estimation is also known as an explicit model.
- Then, how can we represent p(x)?
First, we need to know the basic probability distributions:
1. Bernoulli distribution : a (biased) coin flip, specified by a single parameter.
2. Categorical distribution : an m-sided die, specified by m - 1 parameters (the m probabilities must sum to 1).
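The two distributions above can be sketched in a few lines of numpy (the probability values here are arbitrary illustrations, not from the post):

```python
import numpy as np

rng = np.random.default_rng(0)

# Bernoulli: a biased coin flip, fully specified by one parameter p.
p = 0.7
coin = rng.random() < p  # True (heads) with probability p

# Categorical: an m-sided die. The m probabilities must sum to 1,
# so only m - 1 of them are free parameters.
probs = np.array([0.1, 0.2, 0.3, 0.4])  # m = 4 sides
face = rng.choice(len(probs), p=probs)

print(coin, face)
```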
Example
What if we model a single pixel of an RGB image?
- Number of cases? => 256 x 256 x 256
- How many parameters do we need to specify? => 256 x 256 x 256 - 1
And for n binary pixels X1, ..., Xn?
- Number of cases => 2 x 2 x ... x 2 = 2^n
- How many parameters do we need to specify? => 2^n - 1
-------> In any case, it is practically impossible to collect data covering every possible case.
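The parameter counts above can be checked directly (a trivial sketch; n = 784 is borrowed from the 28 x 28 example later in the post):

```python
# Parameter counts for fully general discrete distributions.
# A single RGB pixel: 256 * 256 * 256 possible values, so the
# categorical distribution over it needs 256**3 - 1 free parameters.
rgb_params = 256 ** 3 - 1

# n binary pixels with full dependence: 2**n joint states,
# hence 2**n - 1 free parameters.
n = 784  # e.g. a 28 x 28 binary image
full_joint_params = 2 ** n - 1

print(rgb_params)  # 16777215
print(full_joint_params)  # astronomically large
```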
What if X1, ..., Xn are all independent, so that p(x1, ..., xn) = p(x1) p(x2) ... p(xn)?
- Number of cases => still 2^n
- How many parameters do we need to specify? => n
Q. Why only n?
A. Each binary Xi is its own Bernoulli with a single parameter, and independence means no joint interactions need to be stored. However, full independence is too strong an assumption to model interesting distributions; the middle ground is conditional independence, which rests on three rules:
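Under the independence assumption, the whole model is just n Bernoulli parameters, and sampling is pixelwise (a minimal sketch; the uniform 0.5 parameters are an assumption for illustration):

```python
import numpy as np

# Under full independence, p(x1, ..., xn) = p(x1) * ... * p(xn):
# each binary pixel is its own Bernoulli, so n parameters suffice.
n = 784
theta = np.full(n, 0.5)  # one Bernoulli parameter per pixel (assumed 0.5 here)

rng = np.random.default_rng(0)
sample = (rng.random(n) < theta).astype(int)  # one "image" sampled pixelwise

n_params = theta.size
print(n_params)  # 784
```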
1. Chain rule : p(x1, ..., xn) = p(x1) p(x2 | x1) p(x3 | x1, x2) ... p(xn | x1, ..., x_{n-1})
2. Bayes' rule : p(x | y) = p(y | x) p(x) / p(y)
3. Conditional independence : if x is independent of y given z, then p(x | y, z) = p(x | z)
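Combining the chain rule with a Markov-style conditional-independence assumption (each Xi depends only on X_{i-1}; this specific assumption is an illustration, not stated above) gives a middle ground between 2^n - 1 and n parameters:

```latex
p(x_1,\dots,x_n) = p(x_1)\,p(x_2 \mid x_1)\,p(x_3 \mid x_2)\cdots p(x_n \mid x_{n-1})
```

Here p(x1) needs 1 parameter and each conditional p(x_{i+1} | x_i) needs 2 (one per value of x_i), so the total is 1 + 2(n - 1) = 2n - 1 parameters.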
Autoregressive Model
- Suppose we have a 28 x 28 binary-pixel image, x in {0, 1}^784.
- Our goal is to learn p(x) = p(x1, ..., x784).
- Then, how can we parametrize p(x)?
- Let's use the chain rule to factor the joint distribution.
- In other words,
- p(x1, ..., x784) = p(x1) p(x2 | x1) p(x3 | x1, x2) ... p(x784 | x1, ..., x783)
- This is called an autoregressive model.
- Note that we need an ordering of all random variables (e.g., a raster-scan order over the pixels).
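The factorization above can be sketched as pixel-by-pixel sampling. The conditional below is a toy stand-in (a hypothetical `conditional_prob` that leans toward repeating the previous pixel), not a trained model:

```python
import numpy as np

# Autoregressive sampling sketch:
# p(x_1, ..., x_n) = prod_i p(x_i | x_1, ..., x_{i-1}),
# each conditional being a Bernoulli over the next pixel.

def conditional_prob(prev_pixels):
    # Toy conditional: lean toward repeating the previous pixel.
    if len(prev_pixels) == 0:
        return 0.5
    return 0.8 if prev_pixels[-1] == 1 else 0.2

def sample_image(n=28 * 28, seed=0):
    rng = np.random.default_rng(seed)
    pixels = []
    for _ in range(n):  # fixed raster-scan ordering of the pixels
        p_i = conditional_prob(pixels)
        pixels.append(int(rng.random() < p_i))
    return np.array(pixels).reshape(28, 28)

img = sample_image()
print(img.shape)  # (28, 28)
```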
The first neural-network model of this kind: NADE (Neural Autoregressive Density Estimator)
: it can not only generate samples, but also compute the density of a new input.
- NADE is an explicit model that can compute the density of the given inputs.
- BTW, how can we compute the density of a given image? By multiplying the conditional probabilities from the chain-rule factorization.
- For continuous random variables, an approach such as a mixture of Gaussians (MoG) can be used for the output distribution.
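A rough sketch of how NADE computes a density: a shared hidden state is updated as pixels are consumed in order, and each conditional p(x_i = 1 | x_{<i}) comes from a sigmoid output. Shapes and random initialization here are illustrative assumptions, not the paper's trained setup:

```python
import numpy as np

rng = np.random.default_rng(0)
n, H = 784, 64                   # input dimension, hidden units (assumed)
W = rng.normal(0, 0.01, (H, n))  # shared input-to-hidden weights
c = np.zeros(H)                  # hidden bias
V = rng.normal(0, 0.01, (n, H))  # hidden-to-output weights
b = np.zeros(n)                  # output biases

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def nade_log_density(x):
    """log p(x) = sum_i log p(x_i | x_{<i}) for a binary vector x."""
    log_p = 0.0
    a = c.copy()                          # running activation for x_{<i}
    for i in range(n):
        h = sigmoid(a)                    # hidden state given x_1, ..., x_{i-1}
        p_i = sigmoid(b[i] + V[i] @ h)    # p(x_i = 1 | x_{<i})
        log_p += np.log(p_i if x[i] == 1 else 1.0 - p_i)
        a += W[:, i] * x[i]               # incorporate x_i for the next step
    return log_p

x = rng.integers(0, 2, size=n)
lp = nade_log_density(x)
print(lp)  # a finite (negative) log-density
```

Note the key design choice: the hidden activation `a` is updated incrementally, so all n conditionals share one set of weights rather than having a separate network per pixel.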