My samplers do not converge to maximum likelihood #448

Open
tga54 opened this issue Nov 10, 2022 · 0 comments
tga54 commented Nov 10, 2022

General information:

  • emcee version: 3.0.2
  • platform: Ubuntu Linux
  • installation method (pip/conda/source/other?): source

Problem description:

Expected behavior: The samplers converge to the best-fit parameters that maximize the likelihood.

Actual behavior: The corner plot shows that the parameters converge to a set of values that does not maximize the likelihood. In addition, the best-fit parameters are not covered by the [16%, 84%] percentile range of the samples.

What have you tried so far?: Inside "log_likelihood" I save every set of parameters it is called with, together with the likelihood value it returns, to a file. I ran the MCMC sampling for 2000 steps with 12 chains and discarded 0 burn-in steps. I found that "get_chain" returns 24000 samples, while my file contains fewer than 20000. The samples returned by "get_chain" do not converge, whereas the samples saved in my file appear converged. When I checked the likelihood values, I found that the samples do not converge to the best-fit parameters that gave the maximum likelihood. I guess the reason is that the best-fit parameters are too close to the boundary of the parameter space defined in "log_prior", so the MCMC converged to a local maximum instead of the global maximum. Enlarging the parameter space would make the model unphysical. What should I do?
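
For reference, a minimal sketch of the check I am describing, assuming "sampler" is the emcee.EnsembleSampler instance (set up roughly as in the sketch after the minimal example below); the variable names and the percentile loop here are only illustrative:

import numpy as np

# Flattened chain and the log-probability emcee recorded for each sample.
# Note: get_log_prob returns log prior + log likelihood, not the likelihood alone.
flat_samples = sampler.get_chain(flat=True)     # shape (nsteps * nwalkers, ndim)
flat_logprob = sampler.get_log_prob(flat=True)  # shape (nsteps * nwalkers,)

# Sample with the highest log-probability seen by the sampler
best = np.argmax(flat_logprob)
print("max log-prob sample:", flat_samples[best], flat_logprob[best])

# [16%, 50%, 84%] percentiles per parameter, as used for the corner plot
for i in range(flat_samples.shape[1]):
    lo, med, hi = np.percentile(flat_samples[:, i], [16, 50, 84])
    print("parameter", i, ":", med, "+", hi - med, "-", med - lo)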

Minimal example:

import emcee

## I paste my "log_likelihood" here.
import csv
import os

import numpy as np


def log_likelihood(par, y, yerr):
    p1, p2, p3, p4, p5, p6 = par
    # model(...) is my model function, defined elsewhere
    model_y = model(p1, p2, p3, p4, p5, p6)
    likelihood = -0.5 * np.sum((y - model_y) ** 2 / yerr ** 2)
    # save the parameters passed to log_likelihood, plus the returned
    # likelihood, to a per-process CSV file
    w = [p1, p2, p3, p4, p5, p6, likelihood]
    pid = os.getpid()
    with open(str(pid) + '.csv', 'a') as f:
        csv_writer = csv.writer(f, dialect='excel')
        csv_writer.writerow(w)
    return likelihood


# sample code goes here...
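
The driver code is elided above; for completeness, a minimal sketch of how the run is set up to match the numbers quoted (6 parameters, 12 walkers, 2000 steps, no burn-in). The prior bounds, initial guess, and data arrays (lower_bounds, upper_bounds, p0_guess, y, yerr) are placeholders, not my real values:

import emcee
import numpy as np

ndim, nwalkers, nsteps = 6, 12, 2000

# placeholders for the real prior bounds, initial guess, and data
lower_bounds = np.full(ndim, -10.0)
upper_bounds = np.full(ndim, 10.0)
p0_guess = np.zeros(ndim)


def log_prior(par):
    # flat prior inside the (placeholder) bounds, -inf outside
    if np.all((par > lower_bounds) & (par < upper_bounds)):
        return 0.0
    return -np.inf


def log_probability(par, y, yerr):
    lp = log_prior(par)
    if not np.isfinite(lp):
        return -np.inf
    return lp + log_likelihood(par, y, yerr)


# start the 12 walkers in a small ball around the initial guess
p0 = p0_guess + 1e-4 * np.random.randn(nwalkers, ndim)

sampler = emcee.EnsembleSampler(nwalkers, ndim, log_probability, args=(y, yerr))
sampler.run_mcmc(p0, nsteps, progress=True)

# 12 walkers x 2000 steps = 24000 samples when flattened with no burn-in
flat_samples = sampler.get_chain(discard=0, flat=True)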