Learning \( P(x) \): we learn a probability distribution over the input \( x \).
Suppose that we are given images of dogs and we want to learn a probability distribution \( p(x) \) such that
- Generation: if we sample \( \tilde{x} \sim p(x) \), \( \tilde{x} \) should look like a dog.
- Density estimation: \( p(x) \) should be high if \( x \) looks like a dog, and low otherwise.
- Models that can do density estimation are also known as explicit models.
- Then, how can we represent \( p(x)\)?
You should know the basic probability distributions:
1. Bernoulli distribution: a coin flip
2. Categorical distribution: an m-sided die
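The parameter counts of these two distributions matter for what follows:
- Bernoulli (binary outcome): 1 parameter, \( p \), since \( P(X = 0) = 1 - p \).
- Categorical (\( m \) outcomes): \( m - 1 \) parameters \( p_1, \ldots, p_{m-1} \), since the probabilities must sum to 1.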
Example
What if we model a single pixel of an RGB image?
\( (r, g, b) \sim p(R, G, B) \)
- Number of cases? => \( 256 \times 256 \times 256 \)
- How many parameters do we need to specify? => \( 256 \times 256 \times 256 - 1 \)
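Written out, that is \( 256 \times 256 \times 256 - 1 = 16{,}777{,}215 \) parameters, just for the joint RGB distribution of a single pixel.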
What if we model \( n \) binary pixels \( X_1, \ldots, X_n \)?
- Number of cases => \( 2 \times 2 \times \cdots \times 2 = 2^n \)
- How many parameters do we need to specify? => \(2^n-1\)
-------> In any case, it is practically impossible to secure data that covers all of these cases.
What if \( X_1, \ldots, X_n \) are independent? Then \( P(X_1, \ldots, X_n) = P(X_1) \cdots P(X_n) \).
- Number of cases => \( 2^n\)
- How many parameters do we need to specify? => \( n \)
Q. Why did the parameter count drop from \( 2^n - 1 \) to \( n \)?
A. The independence assumption: each \( P(X_i) \) is a Bernoulli distribution with a single parameter. Full independence is too strong an assumption for images, though, so autoregressive models use conditional independence as a middle ground, built on three rules:
1. Chain rule
2. Bayes' rule
3. Conditional independence
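For reference, the three rules written out (standard probability identities, not specific to this lecture):
- Chain rule: \( P(x_1, \ldots, x_n) = P(x_1) P(x_2 \mid x_1) P(x_3 \mid x_{1:2}) \cdots P(x_n \mid x_{1:n-1}) \)
- Bayes' rule: \( P(x \mid y) = \dfrac{P(x, y)}{P(y)} = \dfrac{P(y \mid x) P(x)}{P(y)} \)
- Conditional independence: if \( x \perp y \mid z \), then \( P(x \mid y, z) = P(x \mid z) \), i.e., once \( z \) is known, \( y \) tells us nothing more about \( x \).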
Autoregressive Model
- Suppose we have a 28 x 28 binary image (784 pixels).
- Our goal is to learn \( P(X) = P(X_1, \ldots, X_{784}) \) over \( X \in \{0, 1\}^{784} \).
- Then, how can we parametrize \(P(x)\)?
- Let's use the chain rule to factor the joint distribution
- In other words,
- \( P(X_{1:784}) = P(X_1) P(X_2 \mid X_1) P(X_3 \mid X_{1:2}) \cdots \)
- This is called an autoregressive model.
- Note that we need an ordering of all random variables
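To make the factorization concrete, here is a minimal NumPy sketch of evaluating \( P(x) \) with the chain rule. The `conditional_prob` callable is a placeholder of my own, standing in for whatever network produces \( P(X_i = 1 \mid x_{1:i-1}) \); it is not part of the lecture.

```python
import numpy as np


def log_likelihood(x, conditional_prob):
    """Chain-rule log-likelihood of a binary image x (shape: (784,)):
        log P(x) = sum_i log P(x_i | x_{1:i-1})

    `conditional_prob(prefix)` is a hypothetical callable returning
    P(X_i = 1 | x_{1:i-1}) for the pixels observed so far.
    """
    log_p = 0.0
    for i in range(len(x)):
        p_i = conditional_prob(x[:i])  # P(X_i = 1 | previous pixels)
        log_p += np.log(p_i) if x[i] == 1 else np.log(1.0 - p_i)
    return log_p
```

Sampling works in the same order: draw \( x_1 \), then \( x_2 \) given \( x_1 \), and so on, one pixel at a time, which is why generation from autoregressive models is sequential.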
The first neural-network model of this kind: NADE (Neural Autoregressive Density Estimator)
: it can not only generate samples but also compute the density of new inputs.
- NADE is an explicit model that can compute the density of the given inputs
- BTW, how can we compute the density of a given image? By multiplying the conditional probabilities from the chain rule.
- For continuous random variables, we can use approaches such as a mixture of Gaussians (MoG).
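Below is a minimal NumPy sketch of how a NADE-style model evaluates the density of one binary image. The parameter names and shapes (`W`, `c`, `V`, `b`) are my own choices for illustration; the defining idea, a single shared weight matrix \( W \) whose first \( i-1 \) columns feed the \( i \)-th conditional, follows the standard NADE formulation.

```python
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def nade_density(x, W, c, V, b):
    """Density of a binary image x (shape: (D,)) under a NADE-style model.

    Assumed parameter shapes (illustrative, not from the lecture):
        W: (H, D)  shared input-to-hidden weights; column i is used once x_i is known
        c: (H,)    hidden bias
        V: (D, H)  hidden-to-output weights, one row per conditional
        b: (D,)    output biases

    Each conditional is
        h_i = sigmoid(W[:, :i] @ x[:i] + c)
        P(x_i = 1 | x_{<i}) = sigmoid(V[i] @ h_i + b[i])
    and the density is the product of all conditionals (chain rule).
    """
    D = x.shape[0]
    log_p = 0.0
    for i in range(D):
        h_i = sigmoid(W[:, :i] @ x[:i] + c)   # hidden state built from x_{<i}
        p_i = sigmoid(V[i] @ h_i + b[i])      # P(x_i = 1 | x_{<i})
        log_p += np.log(p_i) if x[i] == 1 else np.log(1.0 - p_i)
    return np.exp(log_p)
```

For continuous pixels, the Bernoulli output of each conditional would be replaced by something like a mixture of Gaussians, which is the MoG idea mentioned above.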