
Bekki, WTF are you doing? #22

Open
davidwhogg opened this issue Jul 3, 2013 · 2 comments
davidwhogg commented Jul 3, 2013
@dawsonri how are you so awesome at finding candidates? What method are you using and what are your tricks? If there is a paper, point us to it. If there isn't, can we help you write it?

dawsonri was assigned Jul 3, 2013
dawsonri commented Jul 3, 2013

The approach I'm using is finding some "candidates" and recovering some injections, but I'm actually hopeful that we can do better. The recovery rate of Dan's injected planets is about 28% overall -- see the updated document I just uploaded -- but only 2% for transits between 80 and 100 ppm. What I'm doing is not exactly like anything I've seen in the literature, but it's also not anything fundamentally new. I think it would make sense to include it (or an improved version) in the paper we are planning on comparing different detrending and search methods.

More details on what I'm doing (rough sketches of each step follow this list):
-- Detrend the data using a running median filter on segments (split up by large time gaps or flux jumps), removing the beginning and end of each segment
-- Use wavelets to interpolate missing data points
-- At each point, calculate and record how the likelihood changes (i.e., the delta likelihood) when a transit (box) of a pre-specified duration and depth is subtracted (basically placing a strong prior on Earth-like planets).
Note 1: For now I'm just using chi^2 as the likelihood, because the wavelet likelihood performed worse for recovery.
Note 2: I'm only doing this for data points for which neither the beginning nor the end of the transit window falls in a gap, so in practice the interpolation is only used for points missing within a transit duration.
-- For a range of trial periods from 200-500 days: fold, bin, and sum the "delta likelihood" from the previous step
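
A minimal sketch of the segment-wise running-median detrend in the first step. The gap threshold, window size, and trim length below are hypothetical placeholders rather than the values actually used, and the flux-jump splitting is omitted for brevity:

```python
import numpy as np
from scipy.signal import medfilt

def detrend_segments(time, flux, gap_days=0.5, window=49, trim=24):
    """Running-median detrend per segment (illustrative parameter choices)."""
    # split the light curve at large time gaps
    breaks = np.where(np.diff(time) > gap_days)[0] + 1
    out_t, out_f = [], []
    for seg_t, seg_f in zip(np.split(time, breaks), np.split(flux, breaks)):
        if len(seg_f) <= 2 * trim + window:
            continue  # segment too short to detrend reliably
        trend = medfilt(seg_f, kernel_size=window)
        resid = seg_f / trend - 1.0
        # drop the beginning and end of each segment, where the filter is unreliable
        out_t.append(seg_t[trim:-trim])
        out_f.append(resid[trim:-trim])
    return np.concatenate(out_t), np.concatenate(out_f)
```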
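The wavelet interpolation step could look something like the following iterative inpainting sketch using PyWavelets. This is one standard way to fill gaps with wavelets, not necessarily the exact scheme used here, and the shrinkage threshold is an arbitrary placeholder:

```python
import numpy as np
import pywt

def wavelet_inpaint(flux, mask, wavelet="db4", n_iter=50):
    """Fill gaps (mask == False) by iterating wavelet shrinkage + data replacement."""
    filled = np.where(mask, flux, np.median(flux[mask]))
    for _ in range(n_iter):
        coeffs = pywt.wavedec(filled, wavelet, mode="periodization")
        # soft-threshold detail coefficients to enforce a smooth, sparse fill
        thresh = 0.1 * max(np.abs(c).max() for c in coeffs[1:])
        coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft")
                                for c in coeffs[1:]]
        recon = pywt.waverec(coeffs, wavelet, mode="periodization")[:len(filled)]
        # keep the observed samples fixed; update only the gaps
        filled = np.where(mask, flux, recon)
    return filled
```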
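A sketch of the per-cadence delta chi^2 step, for a single pre-specified box depth and duration (both values below are illustrative, aimed at Earth-like transits); the gap/edge handling from Note 2 is omitted:

```python
import numpy as np

def delta_chi2_series(flux, sigma, depth_ppm=100.0, dur_cadences=13):
    """Chi^2 improvement from subtracting a fixed box transit at each cadence."""
    d = depth_ppm * 1e-6
    half = dur_cadences // 2
    n = len(flux)
    dll = np.full(n, np.nan)
    for i in range(half, n - half):
        window = flux[i - half : i + half + 1]
        s = sigma[i - half : i + half + 1]
        # flat (no-transit) model vs. a box dip of depth d centered on cadence i
        chi2_flat = np.sum((window / s) ** 2)
        chi2_box = np.sum(((window + d) / s) ** 2)
        dll[i] = chi2_flat - chi2_box  # positive where the box fits better
    return dll
```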
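And a sketch of the final fold-bin-sum search over the 200-500 day trial periods (the grid resolution and phase-bin count are arbitrary placeholders):

```python
import numpy as np

def period_search(time, dll, p_min=200.0, p_max=500.0,
                  n_periods=3000, n_bins=500):
    """Fold, bin, and sum the per-cadence delta likelihood over trial periods."""
    good = np.isfinite(dll)
    t, s = time[good], dll[good]
    periods = np.linspace(p_min, p_max, n_periods)
    best = np.empty(n_periods)
    for k, p in enumerate(periods):
        phase_bin = ((t % p) / p * n_bins).astype(int)
        summed = np.bincount(phase_bin, weights=s, minlength=n_bins)
        best[k] = summed.max()  # strongest phase bin at this trial period
    return periods, best
```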
I'd be interested to see how the recovery compares to a more traditional median filter + BLS (a possible baseline is sketched below).
I should also check how this approach compares to the Kepler pipeline search used to generate TCEs. I'm finding non-TCE signals in the "pound stars", but I also have three additional quarters of data.
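
For reference, a baseline along the lines of the median filter + BLS comparison could be assembled with astropy's BoxLeastSquares (modern tooling that postdates this thread); the duration grid here is hypothetical:

```python
import numpy as np
from astropy.timeseries import BoxLeastSquares

def bls_baseline(time, flux, durations=(0.3, 0.5, 0.7)):
    """BLS search over the same 200-500 day period range, on detrended flux."""
    model = BoxLeastSquares(time, flux)
    periodogram = model.power(np.linspace(200.0, 500.0, 3000), durations)
    i = np.argmax(periodogram.power)
    return periodogram.period[i], periodogram.power[i]
```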

saturnaxis commented
Hogg: From what I've experienced in TCERT, Bekki is doing most of what the Kepler WG is doing. The main difference, at least in these cases, is that the noise is not being characterized as well or as conservatively. The philosophy of TCERT in the past has been to vet only unambiguous cases, taking the least risk, but this is changing. So Dan FM's GP analysis will be what pushes us across the goal line for proper planet candidacy!

Billy

