In order for Kepler to achieve its required <20 PPM photometric precision for magnitude 12 and brighter stars,
instrument-induced variations in the CCD readout bias pattern (our "2D black image"), which are either fixed or slowly
varying in time, must be identified and the corresponding pixels either corrected or removed from further data
processing. The two principal sources of these readout bias variations are crosstalk between the 84 science CCDs and the
4 fine guidance sensor (FGS) CCDs, and a high-frequency amplifier oscillation on <40% of the CCD readout channels.
The crosstalk produces a synchronous pattern in the 2D black image with time-variation observed in <10% of individual
pixel bias histories. We will describe a method of removing the crosstalk signal using continuously-collected data from
masked and over-clocked image regions (our "collateral data"), and occasionally-collected full-frame images and
reverse-clocked readout signals. We use this same set to detect regions affected by the oscillating amplifiers. The
oscillations manifest as time-varying moiré patterns and rolling bands in the affected channels. Because this effect
reduces the performance in only a small fraction of the array at any given time, we have developed an approach for
flagging suspect data. The flags will provide the necessary means to resolve any potential ambiguity between
instrument-induced variations and real photometric variations in a target time series. We will also evaluate the
effectiveness of these techniques using flight data from background and selected target pixels.
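The flagging of rolling-band-suspect data described above can be illustrated with a minimal sketch. The function below is hypothetical, not the flight algorithm: it assumes per-row black-level time series from the over-clocked (trailing black) collateral region are available, subtracts a moving-median baseline in time, and flags samples that deviate beyond a robust threshold.

```python
import numpy as np

def flag_rolling_band(trailing_black, window=25, n_sigma=3.5):
    """Flag (cadence, row) samples whose trailing-black level deviates
    from a moving-median baseline in time, a signature of rolling-band
    noise from the oscillating amplifiers.

    trailing_black : 2D array, shape (n_cadences, n_rows), of per-row
        black-level estimates from the over-clocked collateral region.
    Returns a boolean array of the same shape (True = suspect).
    """
    n_cad, n_rows = trailing_black.shape
    baseline = np.empty_like(trailing_black)
    half = window // 2
    for i in range(n_cad):
        lo, hi = max(0, i - half), min(n_cad, i + half + 1)
        baseline[i] = np.median(trailing_black[lo:hi], axis=0)
    resid = trailing_black - baseline
    # robust per-row scatter via the median absolute deviation
    sigma = 1.4826 * np.median(np.abs(resid), axis=0)
    sigma = np.where(sigma > 0, sigma, np.inf)
    return np.abs(resid) > n_sigma * sigma
```

A moving median rather than a moving mean keeps a transient band from inflating its own baseline, so the flag stays sensitive to short-lived artifacts.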
Bruce Clarke, Christopher Allen, Stephen Bryson, Douglas Caldwell, Hema Chandrasekaran, Miles Cote, Forrest Girouard, Jon Jenkins, Todd Klaus, Jie Li, Chris Middour, Sean McCauliff, Elisa Quintana, Peter Tenenbaum, Joseph Twicken, Bill Wohler, Hayley Wu
The Kepler space telescope is designed to detect Earth-like planets around Sun-like stars using transit photometry by
simultaneously observing more than 100,000 stellar targets nearly continuously over a three-and-a-half year period. The
96.4-megapixel focal plane consists of 42 Charge-Coupled Devices (CCDs), each containing two 1024 × 1100 pixel
arrays. Since cross-correlations between calibrated pixels are introduced by common calibrations performed on each
CCD, downstream data processing requires access to the calibrated pixel covariance matrix to properly estimate
uncertainties. However, the prohibitively large covariance matrices corresponding to the ~75,000 calibrated pixels per
CCD preclude calculating and storing the covariance in standard lock-step fashion. We present a novel framework used
to implement standard Propagation of Uncertainties (POU) in the Kepler Science Operations Center (SOC) data
processing pipeline. The POU framework captures the variance of the raw pixel data and the kernel of each subsequent
calibration transformation, allowing the full covariance matrix of any subset of calibrated pixels to be recalled on the fly
at any step in the calibration process. Singular Value Decomposition (SVD) is used to compress and filter the raw
uncertainty data as well as any data-dependent kernels. This combination of POU framework and SVD compression
gives the downstream consumer access to the full covariance matrix of any subset of the calibrated pixels, which is
traceable to the pixel-level measurement uncertainties, all without having to store, retrieve, and operate on prohibitively
large covariance matrices. We describe the POU framework and SVD compression scheme and its implementation in the
Kepler SOC pipeline.
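The core idea of the POU framework can be sketched in a few lines, under simplifying assumptions: the calibration is represented as a single linear transform A acting on raw pixels with diagonal covariance, so the calibrated covariance is A·diag(var)·Aᵀ = F·Fᵀ with F = A·diag(σ). Storing a rank-truncated SVD of F lets any subset covariance be recalled on the fly without ever forming the full matrix. The function names are illustrative, not the pipeline's API.

```python
import numpy as np

def compress_factor(A, raw_var, rank):
    """Build a rank-`rank` factor F such that the calibrated-pixel
    covariance is approximately F @ F.T, without forming it.

    A : calibration transform, shape (n_cal, n_raw).
    raw_var : variances of the raw pixel measurements (diagonal input
        covariance), shape (n_raw,).
    """
    F = A * np.sqrt(raw_var)           # column scaling: A @ diag(sigma)
    U, s, _ = np.linalg.svd(F, full_matrices=False)
    return U[:, :rank] * s[:rank]      # compressed factor, n_cal x rank

def subset_cov(F, idx):
    """Recall the covariance of any subset of calibrated pixels."""
    Fi = F[idx]
    return Fi @ Fi.T
```

Only the (n_cal × rank) factor is stored; a subset covariance costs a small matrix product, which is what makes the full ~75,000-pixel covariance tractable in principle.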
Hayley Wu, Joseph Twicken, Peter Tenenbaum, Bruce Clarke, Jie Li, Elisa Quintana, Christopher Allen, Hema Chandrasekaran, Jon Jenkins, Douglas Caldwell, Bill Wohler, Forrest Girouard, Sean McCauliff, Miles Cote, Todd Klaus
KEYWORDS: Planets, Stars, Technetium, Binary data, Motion models, System on a chip, Statistical modeling, Charge-coupled devices, Data centers, Statistical analysis
We present an overview of the Data Validation (DV) software component and its context within the Kepler Science
Operations Center (SOC) pipeline and overall Kepler Science mission. The SOC pipeline performs a transiting planet
search on the corrected light curves for over 150,000 targets across the focal plane array. We discuss the DV strategy for
automated validation of Threshold Crossing Events (TCEs) generated in the transiting planet search. For each TCE, a
transiting planet model is fitted to the target light curve. A multiple planet search is conducted by repeating the transiting
planet search on the residual light curve after the model flux has been removed; if an additional detection occurs, a
planet model is fitted to the new TCE. A suite of automated tests is performed after all planet candidates have been
identified. We describe a centroid motion test to determine the significance of the motion of the target photocenter
during transit and to estimate the coordinates of the transit source within the photometric aperture; a series of eclipsing
binary discrimination tests on the parameters of the planet model fits to all transits and the sequences of odd and even
transits; and a statistical bootstrap to assess the likelihood that the TCE would have been generated purely by chance
given the target light curve with all transits removed.
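The statistical bootstrap mentioned above can be sketched as follows, assuming (hypothetically) that null single-event detection statistics have been drawn from the light curve with all transits removed, and that a multiple-event statistic combines n single-event statistics as their sum over √n. This is an illustration of the bootstrap idea, not the DV implementation.

```python
import numpy as np

def bootstrap_false_alarm(single_event_stats, n_transits, observed_mes,
                          n_trials=10000, seed=0):
    """Estimate the probability that a multiple-event statistic (MES) as
    large as `observed_mes` arises by chance from the transit-free
    light curve.

    single_event_stats : null single-event detection statistics sampled
        from the target light curve with all transits removed.
    """
    rng = np.random.default_rng(seed)
    # resample with replacement to build the null MES distribution
    draws = rng.choice(single_event_stats, size=(n_trials, n_transits))
    mes = draws.sum(axis=1) / np.sqrt(n_transits)
    return np.mean(mes >= observed_mes)
```

A small returned probability means the TCE is unlikely to be a pure chance alignment of noise at the observed period and epoch.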
Elisa Quintana, Jon Jenkins, Bruce Clarke, Hema Chandrasekaran, Joseph Twicken, Sean McCauliff, Miles Cote, Todd Klaus, Christopher Allen, Douglas Caldwell, Stephen Bryson
KEYWORDS: Calibration, Charge-coupled devices, Data modeling, Space operations, Electronics, Data processing, Planets, Space telescopes, Stars, System on a chip
We present an overview of the pixel-level calibration of flight data from the Kepler Mission performed within the Kepler
Science Operations Center Science Processing Pipeline. This article describes the calibration (CAL) module, which
operates on original spacecraft data to remove instrument effects and other artifacts that pollute the data. Traditional
CCD data reduction is performed (removal of instrument/detector effects such as bias and dark current), in addition to
pixel-level calibration (correcting for cosmic rays and variations in pixel sensitivity), Kepler-specific corrections
(removing smear signals which result from the lack of a shutter on the photometer and correcting for distortions induced
by the readout electronics), and additional operations that are needed due to the complexity and large volume of flight
data. CAL operates on long (~30 min) and short (~1 min) sampled data, as well as full-frame images, and produces
calibrated pixel flux time series, uncertainties, and other metrics that are used in subsequent Pipeline modules. The raw
and calibrated data are also archived in the Multi-mission Archive at Space Telescope (MAST) at the Space Telescope Science
Institute for use by the astronomical community.
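The Kepler-specific smear correction mentioned above can be illustrated with a minimal sketch. Because the photometer has no shutter, every science pixel accumulates extra signal from its entire column during readout; this smear can be estimated per column from collateral rows that see only smear and subtracted. The function below is a simplified illustration (it assumes bias and dark have already been removed and simply averages the two collateral estimates), not the CAL module itself.

```python
import numpy as np

def remove_smear(pixels, masked_smear, virtual_smear):
    """Subtract the per-column smear signal caused by shutterless readout.

    pixels : 2D science image (rows x cols), bias/dark already removed.
    masked_smear, virtual_smear : 2D collateral regions (a few rows x
        cols each) that accumulate only the smear signal.
    """
    # average the two collateral estimates column by column
    smear = 0.5 * (masked_smear.mean(axis=0) + virtual_smear.mean(axis=0))
    return pixels - smear  # broadcasts the per-column smear over rows
```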
KEYWORDS: Calibration, Photometry, Detection and tracking algorithms, Charge-coupled devices, Planets, Data storage, Space operations, System on a chip, Target detection, Digital filtering
We describe the Photometric Analysis (PA) software component and its context in the Kepler Science Operations Center
(SOC) Science Processing Pipeline. The primary tasks of this module are to compute the photometric flux and
photocenters (centroids) for over 160,000 long cadence (~thirty minute) and 512 short cadence (~one minute) stellar
targets from the calibrated pixels in their respective apertures. We discuss science algorithms for long and short cadence
PA: cosmic ray cleaning; background estimation and removal; aperture photometry; and flux-weighted centroiding. We
discuss the end-to-end propagation of uncertainties for the science algorithms. Finally, we present examples of
photometric apertures, raw flux light curves, and centroid time series from Kepler flight data. PA light curves, centroid
time series, and barycentric timestamp corrections are exported to the Multi-mission Archive at Space Telescope
(MAST) and are made available to the general public in accordance with the NASA/Kepler data
release policy.
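Flux-weighted centroiding, one of the PA algorithms named above, is a first-moment estimate: each aperture pixel's coordinates are weighted by its background-removed flux. A minimal sketch (illustrative, not the PA code):

```python
import numpy as np

def flux_weighted_centroid(pixel_values, rows, cols):
    """First-moment (flux-weighted) centroid of a target's aperture.

    pixel_values : background-removed calibrated fluxes of the aperture
        pixels; rows, cols : their CCD coordinates (same length).
    Returns (row_centroid, col_centroid).
    """
    total = pixel_values.sum()
    return (np.dot(pixel_values, rows) / total,
            np.dot(pixel_values, cols) / total)
```

First-moment centroids are cheap and robust, which suits computing them for every target at every cadence; their sensitivity to aperture truncation is one reason centroid shifts are interpreted with care downstream.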
Christopher Middour, Todd Klaus, Jon Jenkins, David Pletcher, Miles Cote, Hema Chandrasekaran, Bill Wohler, Forrest Girouard, Jay Gunter, Kamal Uddin, Christopher Allen, Jennifer Hall, Khadeejah Ibrahim, Bruce Clarke, Jie Li, Sean McCauliff, Elisa Quintana, Jeneen Sommers, Brett Stroozas, Peter Tenenbaum, Joseph Twicken, Hayley Wu, Doug Caldwell, Stephen Bryson, Paresh Bhavsar, Michael Wu, Brian Stamper, Terry Trombly, Christopher Page, Elaine Santiago
KEYWORDS: Space operations, Photometry, System on a chip, Data processing, Stars, Calibration, Charge-coupled devices, Data modeling, Software development, Planets
We give an overview of the operational concepts and architecture of the Kepler Science Processing Pipeline. Designed,
developed, operated, and maintained by the Kepler Science Operations Center (SOC) at NASA Ames Research Center,
the Science Processing Pipeline is a central element of the Kepler Ground Data System. The SOC consists of an office at
Ames Research Center, software development and operations departments, and a data center which hosts the computers
required to perform data analysis. The SOC's charter is to analyze stellar photometric data from the Kepler spacecraft
and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler
Science Processing Pipeline, including the hardware infrastructure, scientific algorithms, and operational procedures. We
present the high-performance, parallel computing software modules of the pipeline that perform transit photometry,
pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument
characterization. We show how data processing environments are divided to support operational processing and test
needs. We explain the operational timelines for data processing and the data constructs that flow into the Kepler Science
Processing Pipeline.
Jie Li, Christopher Allen, Stephen Bryson, Douglas Caldwell, Hema Chandrasekaran, Bruce Clarke, Jay Gunter, Jon Jenkins, Todd Klaus, Elisa Quintana, Peter Tenenbaum, Joseph Twicken, Bill Wohler, Hayley Wu
KEYWORDS: Stars, Photometry, Space operations, Data processing, Charge-coupled devices, Image compression, Motion measurement, System on a chip, Ka band, Detection and tracking algorithms
This paper describes the algorithms of the Photometer Performance Assessment (PPA) software component in the
Kepler Science Operations Center (SOC) Science Processing Pipeline. The PPA performs two tasks: one is to analyze
the health and performance of the Kepler photometer based on the long cadence science data downlinked via Ka band
approximately every 30 days; the second is to determine the attitude of the Kepler spacecraft with high precision at each
long cadence. The PPA component has demonstrated the capability to work effectively with the Kepler flight data.
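One way the per-cadence attitude determination can be sketched, under a simplifying small-angle assumption not taken from the paper, is as a linear least-squares fit of translation and roll offsets to the residuals between measured and predicted star centroids on the focal plane. The model and function below are illustrative only.

```python
import numpy as np

def fit_attitude_offsets(ref_row, ref_col, meas_row, meas_col):
    """Solve for small pointing offsets (d_row, d_col, d_roll) from
    measured vs. predicted star centroids, using a linearized model:

        meas_row ~ ref_row + d_row - d_roll * (ref_col - c0)
        meas_col ~ ref_col + d_col + d_roll * (ref_row - r0)

    where (r0, c0), the rotation center, is taken as the mean star
    position. All coordinates in pixels; d_roll in radians.
    """
    r0, c0 = ref_row.mean(), ref_col.mean()
    n = ref_row.size
    # stack row and column residual equations into one design matrix
    A = np.zeros((2 * n, 3))
    A[:n, 0] = 1.0                    # d_row term
    A[:n, 2] = -(ref_col - c0)        # roll term in row residuals
    A[n:, 1] = 1.0                    # d_col term
    A[n:, 2] = ref_row - r0           # roll term in column residuals
    b = np.concatenate([meas_row - ref_row, meas_col - ref_col])
    offsets, *_ = np.linalg.lstsq(A, b, rcond=None)
    return offsets  # (d_row, d_col, d_roll)
```

Averaging over many stars is what makes per-cadence attitude estimates far more precise than any single centroid measurement.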