Modeling multivariate beta distribution using pymc or pymc3


I am trying to compute Bayes-optimal predictions for non-trivial latent distributions, but my current implementation does not yield correct results.

The problem involves the coin-flip paradigm:

  1. Use the normalized sum of two Beta densities (with parameters alpha1, beta1, alpha2, beta2) as the prior pdf (see the sketch after this list).
  2. Draw a sample for the coin bias from this sum.
  3. Generate n samples.
  4. Start a pymc model.
  5. Define a prior distribution as the sum of two Betas with the ground-truth alphas and betas.
  6. Sample the trace.
  7. Compute the mean of trace['prior'], which should recover the coin bias from step 2.
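
For reference, steps 1 and 2 describe an equal-weight mixture of the two Betas (normalizing the sum of two pdfs is the same as averaging them), so the bias can also be drawn exactly, without a grid approximation. A minimal sketch; the name draw_mixture_bias is just illustrative:

import numpy as np

def draw_mixture_bias(alpha1, beta1, alpha2, beta2, rng=np.random):
    # flip a fair coin to pick one of the two components,
    # then draw the coin bias from that Beta
    if rng.random() < 0.5:
        return rng.beta(alpha1, beta1)
    return rng.beta(alpha2, beta2)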

However, this process returns a posterior mean that sits around 0.53 even when the coin bias is strongly skewed, and even after 100 observations.
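
For context, the prior means are available in closed form, so it is easy to check where each version of the prior is centered before any data arrive; a quick check with scipy:

import scipy.stats

m1 = scipy.stats.beta.mean(5.1, 2)   # mean of Beta(5.1, 2)  ~ 0.718
m2 = scipy.stats.beta.mean(15, 50)   # mean of Beta(15, 50)  ~ 0.231
print("mean of the sum:          ", m1 + m2)          # ~ 0.95
print("mean of the 50/50 mixture:", 0.5 * (m1 + m2))  # ~ 0.47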

Here is the code to implement this process:

import pymc3 as pm
import numpy as np
import scipy.stats
import matplotlib.pyplot as plt

# Step 1: define the parameters of the two Beta distributions
alpha1, beta1 = 5.1, 2
alpha2, beta2 = 15, 50

# make a pdf from two beta distributions as defined above using the scipy beta function
x = np.linspace(0, 1, 100)
y1 = scipy.stats.beta.pdf(x, alpha1, beta1)
y2 = scipy.stats.beta.pdf(x, alpha2, beta2)
y = y1 + y2

# normalize y so it sums to 1: a discrete pmf over the grid,
# i.e. an equal-weight mixture of the two Betas up to grid resolution
y /= y.sum()

# Step 2: draw one coin bias from this discrete prior
observed_bias = np.random.choice(x, size=1, p=y)

# plot the pdf
plt.figure()
plt.plot(x, y)
plt.axvline(observed_bias[0], color='red', linestyle='--', label=str(observed_bias))
plt.legend(loc='best')
plt.show()

# Step 3: generate n_obs coin flips with the drawn bias
n_obs = 100
observed_data_all = np.random.binomial(n=1, p=observed_bias, size=n_obs)
print ("observed_data_all: ", observed_data_all)
print ("observed_data_all value: ", np.sum(observed_data_all)/n_obs)

#########################################
traces = []
#for k in range(10,observed_data_all.shape[0]+1,1):
if True:
        observed_data = observed_data_all
        print ('')
        print ('*******************************************')
        print ("observed_data: ", observed_data)

# Step 4: start a pymc model with the ground-truth prior
        with pm.Model() as model:

            # Define the first Beta distribution
            beta1_ = pm.Beta('beta1', alpha=alpha1, beta=beta1)

            # Define the second Beta distribution
            beta2_ = pm.Beta('beta2', alpha=alpha2, beta=beta2)

# Step 5: define the prior as the sum of the two Betas
            prior = pm.Deterministic('prior', beta1_ + beta2_)
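            # note: beta1_ and beta2_ each lie in (0, 1), so their sum lies
            # in (0, 2); this is a different object from the normalized pdf
            # built above, and values above 1 are not valid Bernoulli p's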

            # define the Bernoulli likelihood on the observed flips
            likelihood = pm.Bernoulli('likelihood',
                                      p=prior,
                                      observed=observed_data)
            
            # Step 6: sample the posterior (this first, 16-chain pass
            # is redundant; only `trace` below is used)
            posterior = pm.sample(2000, tune=1000, cores=16)

            # run the trace again with a single chain
            trace = pm.sample(2000,
                              tune=1000,
                              cores=16,
                              chains=1)

        # keep the trace for later analysis
        traces.append(trace)

        # Step 7: compute the posterior mean/median of the prior node
        bayes_mean = np.mean(trace['prior'])
        bayes_median = np.median(trace['prior'])

        print ("True bias:", observed_bias)
        print("Bayesian Optimal Estimate (Mean):", bayes_mean)
        print("Bayesian Optimal Estimate (Median):", bayes_median)

# analysis of the accumulated traces would go here when sweeping over data sizes
print ("DONE...")

[Figure: the prior pdf over the grid, with the drawn bias marked by the dashed red line]

Output

observed_data_all:  [0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1 0 0 0 1 0 1 1 0 0 1 0 0 0 1 0 0 1 0 1 1 0 0 0 0 0 1 0 1 1 1 0 0 0 1 0 0 0 1 0 1 0 0 1 0 0 0 1 1 0 0 0 0 0 0 1 0 0
 1 1 0 0 1 0 1 0 1 1 1 1 0 0 0 1 0 1 1 0 0 0 0 0 0 0]
observed_data_all value:  0.31

Auto-assigning NUTS sampler...
Initializing NUTS using jitter+adapt_diag...
Initializing NUTS using jitter+adapt_diag...
Multiprocess sampling (16 chains in 16 jobs)
NUTS: [beta2, beta1]
...
There was 1 divergence after tuning. Increase `target_accept` or reparameterize.
...
...
There were 4 divergences after tuning. Increase `target_accept` or reparameterize.
Only one chain was sampled, this makes it impossible to run some convergence checks
True bias: [0.3030303]
Bayesian Optimal Estimate (Mean): 0.5377621481100548
Bayesian Optimal Estimate (Median): 0.5338028628562981
DONE...