Maximum Likelihood Estimation - standard deviation tends to zero


I am trying to estimate a marginalized likelihood function. I use numerical integration to marginalize out a latent variable m from the joint likelihood f(X, m). I assume X follows a normal distribution and m follows a log-normal distribution.

Thus, f(X) = integral over m of f(X | m) * f(m) dm (i.e., the joint f(X, m) integrated over m).

I have N observations. For each observation I compute f(X) as above, and I sum the logs of these values to obtain the final log-likelihood.
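To make the setup concrete, here is a minimal sketch of the kind of computation I mean, written in Python with SciPy. The specific conditional model X | m ~ Normal(beta * m, sigma_x) is only an illustrative placeholder (the real f(X, m) is more involved), and the names beta, sigma_x, mu_m, sigma_m are purely hypothetical:

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad
from scipy.optimize import minimize

def marginal_density(x, beta, sigma_x, mu_m, sigma_m):
    """f(x) = integral over m of f(x | m) * f(m) dm, computed numerically."""
    def integrand(m):
        # f(x | m): normal density; f(m): log-normal density
        return (stats.norm.pdf(x, loc=beta * m, scale=sigma_x)
                * stats.lognorm.pdf(m, s=sigma_m, scale=np.exp(mu_m)))
    val, _ = quad(integrand, 0.0, np.inf)
    return val

def neg_log_likelihood(params, X):
    beta, log_sigma_x, mu_m, log_sigma_m = params
    # exp() keeps the scale parameters positive, as described above
    sigma_x, sigma_m = np.exp(log_sigma_x), np.exp(log_sigma_m)
    dens = np.array([marginal_density(x, beta, sigma_x, mu_m, sigma_m) for x in X])
    return -np.sum(np.log(dens + 1e-300))  # guard against log(0)

# Example usage with simulated data (purely illustrative)
rng = np.random.default_rng(0)
m_true = rng.lognormal(mean=0.5, sigma=0.3, size=200)
X = rng.normal(loc=2.0 * m_true, scale=0.5)
res = minimize(neg_log_likelihood, x0=[1.0, 0.0, 0.0, 0.0], args=(X,),
               method="Nelder-Mead")
print(res.x)
```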

The problem I am facing is that, while maximizing the log-likelihood, the scale parameter of f(m) (the marginal density of m) tends toward zero (not exactly zero, since I keep it positive by parameterizing it as exp(scale)). As the scale approaches zero, the marginal density at the mean value (the initial mean value I supply to the optimizer) becomes very large, which drives the final log-likelihood to a very high positive value and leaves the parameters of f(X, m) at unrealistic values.
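A small illustration of why this happens, using a log-normal with mu = 0 purely for demonstration: as the scale shrinks, the density piles up near the mode, so the summed log-density can be driven arbitrarily high by pushing the scale toward zero.

```python
import numpy as np
from scipy import stats

for s in [0.5, 0.05, 0.005]:
    mode = np.exp(0.0 - s**2)  # mode of LogNormal(mu=0, sigma=s)
    print(s, stats.lognorm.pdf(mode, s=s, scale=np.exp(0.0)))
# The density at the mode grows roughly like 1/s as s -> 0.
```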

I have tried different optimization algorithms, but there was no improvement.

I expect to arrive at reasonable parameter estimates while preventing the scale of the marginal density from collapsing to zero.
