PRIMA/PACMAN is scheduled for commissioning on Paranal in late 2008 as part of the VLTI. In this paper, we
discuss the important aspects of its astrometric data-reduction software. As for any software package, the top-level requirements, the interfaces to existing ESO software, and the data types, data levels, and data flow among the recipes dictate the overall design. In addition, the complexity of the PACMAN instrument, the long-term nature of astrometric observations, and the need to improve the algorithms as the understanding of the hardware improves impose additional requirements on the astrometric data-reduction software.
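To make the idea of data levels and recipe-to-recipe data flow concrete, the following minimal Python sketch shows one way such a chain could be organized; all class and function names are illustrative placeholders, not actual PACMAN recipe or product names, and the angle conversion uses a simple small-angle approximation.

# Minimal sketch of a recipe-chained data flow (illustrative names only).
from dataclasses import dataclass
from typing import List

@dataclass
class RawFrame:                 # level 0: raw detector/metrology samples
    time: float
    delay: float                # metrology delay reading [m]
    flux: float                 # fringe signal

@dataclass
class ReducedScan:              # level 1: one calibrated delay measurement
    time: float
    differential_delay: float   # [m]

def fringe_reduction(frames: List[RawFrame]) -> ReducedScan:
    """Recipe 1: collapse a batch of raw frames into one reduced scan."""
    t = sum(f.time for f in frames) / len(frames)
    d = sum(f.delay for f in frames) / len(frames)
    return ReducedScan(time=t, differential_delay=d)

def astrometric_fit(scans: List[ReducedScan], baseline_m: float) -> float:
    """Recipe 2: turn the mean differential delay into a separation angle [rad]."""
    mean_delay = sum(s.differential_delay for s in scans) / len(scans)
    return mean_delay / baseline_m      # level 2 product (small-angle approx.)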
PRIMA, the Phase-Referenced Imaging and Micro-arcsecond Astrometry facility for the Very Large Telescope Interferometer, is now nearing the end of its manufacturing phase. An intensive test period of the various sub-systems (star separators, fringe sensor units, and incremental metrology) and of their interactions in the global system will start in Garching as soon as they are delivered. The status and performance of the individual sub-systems are presented in this paper, as well as the proposed observation and calibration strategy to reach the challenging goal of high-accuracy differential astrometry at the 10 μas level.
The PRIMA facility will implement dual-star astrometry at the VLTI. We have formed a consortium that will build the PRIMA differential delay lines, develop an astrometric operation and calibration plan, and deliver astrometric data-reduction software. This will enable astrometric planet surveys with a target precision of 10 μas. Our scientific goals include determining orbital inclinations and masses for planets already known from radial-velocity surveys, searches for planets around stars that are not amenable to high-precision radial-velocity observations, and a search for large rocky planets around nearby low-mass stars.
MIDI (MID-infrared Interferometric instrument) gave its first N-band (8 to 13 micron) stellar interference fringes on the VLTI (Very Large Telescope Interferometer) at Cerro Paranal Observatory (Chile) in December 2002. A lot of work had to be done to transform it from a successful physics experiment into a premium science instrument, which has been offered to the worldwide community of astronomers since September 2003. The process of "paranalization", carried out by the European Southern Observatory (ESO) in collaboration with the MIDI consortium, has aimed to make MIDI simpler to use, more reliable, and more efficient. We describe in this paper these different aspects of paranalization (detailing the improvements brought to the observation software) and the lessons we have learnt. Some general rules for bringing an interferometric instrument into routine operation in an observatory can be drawn from the experience with MIDI. We also report our experience of the first "service mode" run of an interferometer (VLTI + MIDI), which took place in April 2004.
Knowledge of the dispersion due to (humid) air in the light path of the Very Large Telescope Interferometer (VLTI) is crucial to obtaining good science data from MIDI, PRIMA and GENIE. To calculate the refraction due to air at infrared wavelengths in the ducts and delay line tunnel, the temperature and humidity have to be monitored during observations. To accomplish these measurements, an easy-to-use and reliable system was assembled, based on commercially available components. In-house calibration of the system's four humidity and temperature sensors was done in Leiden. A test and calibration program was carried out to make sure that they work reliably and accurately and to determine the sensor characteristics. For this purpose a calibration box was designed which isolates the sensors from the environment, so that there is no exchange of air with the outside. Using constant-humidity salt solutions, the humidity in the box can be controlled. This allows the calibrations to be carried out for typical values of relative humidity and temperature at Cerro Paranal. Calibration of the sensors includes (1) reducing the systematic relative humidity differences between the sensors to less than 0.1%, and (2) reducing the systematic temperature differences between the sensors to less than 0.01 K. In this paper we present the outcome of the calibrations and the future of the sensors at Paranal.
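As a rough illustration of the relative-calibration step described above (the readings and the use of a simple mean reference are invented for this example and do not reproduce the actual Leiden procedure), per-sensor offsets can be estimated from simultaneous readings taken while all four sensors sit in the sealed box at a constant humidity:

# Illustrative relative calibration of four humidity sensors exposed to the
# same constant humidity in the sealed box.  The readings are made up.
readings_rh = {"sensor_1": 11.42, "sensor_2": 11.55,
               "sensor_3": 11.31, "sensor_4": 11.48}   # mean reported RH [%]

reference = sum(readings_rh.values()) / len(readings_rh)       # common reference
offsets = {name: rh - reference for name, rh in readings_rh.items()}

# Subtracting these offsets from later measurements removes the systematic
# sensor-to-sensor differences; the aim quoted above is a residual spread
# below 0.1 % in relative humidity (and 0.01 K for the temperature sensors).
corrected = {name: rh - offsets[name] for name, rh in readings_rh.items()}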
A search for extrasolar planets using the ESO VLTI PRIMA facility
will become feasible in 2007. An astrometric accuracy of 10 micro-arcseconds will allow us to detect sub-Uranus-mass planets around the nearest stars, as well as to conduct a planet search around stars of different ages. Most of the PRIMA hardware subsystems are currently being developed by industry. At the same time, a scientific Consortium has been formed that will deliver the differential delay lines and astrometric software for PRIMA to ESO.
In this paper we describe the planned efforts by the Consortium
related to the "PRIMA astrometry operations and software". These
activities include an overall "PRIMA astrometry error budget", a
"PRIMA astrometry calibration and observation strategy", the "PRIMA astrometry observation preparation tools" and the "PRIMA astrometry data reduction tools". We describe how all these components fit together in an overall approach to the flow of knowledge within the project. First by quantifying the fundamental limits of the VLTI infrastructure and the astronomical sources under study. Followed by elimination or suppression of the errors through either a hardware change to the system, software control of the system, or a proper calibration and observation strategy.
The ultimate goal is to be able to calibrate all PRIMA astrometric data acquired over the full lifetime of PRIMA (5 to 10 years) to a uniform accuracy of 10 micro-arcseconds. This will allow identification of long-term trends in the astrometric parameters due to planetary companions around nearby stars, and determination of the distances and proper motions of the selected sources.
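As an illustration of how such a budget constrains the design (the individual term names and values below are invented placeholders, not the actual PRIMA error budget), statistically independent error contributions are combined in quadrature and must sum to no more than the 10 micro-arcsecond goal:

# Quadrature combination of (assumed independent) astrometric error terms.
# Term names and magnitudes are placeholders for illustration only.
import math

error_terms_uas = {
    "metrology":           4.0,   # internal differential metrology
    "atmosphere":          6.0,   # residual anisoplanatism after averaging
    "baseline_knowledge":  5.0,   # wide-angle baseline calibration
    "instrumental_drifts": 4.0,   # uncalibrated long-term drifts
}

total_uas = math.sqrt(sum(v ** 2 for v in error_terms_uas.values()))
print(f"combined error: {total_uas:.1f} micro-arcseconds")   # ~9.6 for these values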
One of the goals of the VLTI PRIMA (Phase-Referenced Imaging and Micro-arcsecond Astrometry) facility will be to obtain high-accuracy astrometry (of the order of 10 micro-arcsec) for the measurement of the reflex motion due to planets. In order to achieve this, an offline astrometric Data Analysis Facility (DAF) is planned to perform a homogeneous and iterative analysis over several years of observations. This system will be part of the PRIMA Data Reduction Library (DRL), which also contains the online pipeline. The most important module of the DAF will be the Trend Analysis, which identifies and fits the systematic errors and feeds them back into the data reduction. This requires an infrastructure which allows comprehensive access to all raw and derived data and enough flexibility to easily introduce new algorithms into the system. We plan to realize this with a database, sophisticated middleware, and Application Programmers' Interfaces (APIs) for the algorithms and user-interface plug-ins. We present in this paper the requirements and preliminary design of the DAF, as well as the implementation issues concerning the integration with other modules of the DRL and ESO compliance.
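A minimal sketch of what an algorithm plug-in API of this kind could look like is given below; the class and function names are hypothetical and are not taken from the actual DRL or DAF design documents.

# Hypothetical plug-in interface for trend-analysis algorithms: a new
# algorithm is added by subclassing and registering it, without touching
# the database or middleware layers.  All names are illustrative.
from abc import ABC, abstractmethod

class TrendAnalysisPlugin(ABC):
    @abstractmethod
    def fit(self, observations):
        """Fit a systematic-error model to a set of reduced observations."""

    @abstractmethod
    def correct(self, observation):
        """Return the observation with the fitted systematic error removed."""

PLUGIN_REGISTRY = {}

def register_plugin(name, plugin_cls):
    """Make a plug-in available to the (hypothetical) DAF framework by name."""
    PLUGIN_REGISTRY[name] = plugin_cls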
NEVEC was initiated thanks to an opportunity in the Netherlands to reinstate instrumental efforts in astronomy through a funding program for 'Top Research Schools', which brought about the creation of NOVA. The considerable existing experience in radio-astronomical imaging through interferometry (the Westerbork Synthesis Radio Telescope started in 1970), together with the relatively small size at the time of ESO's VLTI Team, made it opportune to aim for a win-win situation through collaboration. An MoU between ESO and NOVA is presently in force, which stipulates that 10 of the 18 man-years funded by NOVA for NEVEC until 2005 [new personnel, in a university setting (Leiden) but on project money] shall be used on tasks that are mutually agreed between NOVA and ESO.
The tasks presently lie in the domains of observing modes, calibration, and modeling, as well as contributing to the commissioning of new instruments and thinking about future instruments. Another task, outside these 10 FTE, has been the data handling and analysis software for MIDI, and again contributing to its commissioning. Delivery of the first operational version in Heidelberg has just taken place (summer 2002), contributing to the successful Preliminary Acceptance in Europe for MIDI on September 10, 2002. The current state of 'products and deliveries' and the future outlook are reviewed.
The first science instrument for the Very Large Telescope Interferometer (VLTI), the mid-infrared instrument MIDI, will be commissioned in November 2002, with first fringes anticipated during that commissioning run on the 40-cm siderostats and the 8.2-meter Unit Telescopes. In this paper we describe the scientific and technical observing modes (also referred to as observation procedures) developed for MIDI and discuss in detail how an observing run with the instrument is planned.
MIDI is built by a consortium led by the Max Planck Institute for Astronomy (MPIA Heidelberg), with contributions from, among others, ASTRON (Dwingeloo, The Netherlands), Leiden Observatory, the University of Amsterdam, Paris Observatory, the University of Groningen, the Kiepenheuer-Institut für Sonnenphysik in Freiburg, the Thüringer Landessternwarte Tautenburg, and the Observatoire de la Côte d'Azur.
The mid-infrared interferometric instrument MIDI is currently undergoing testing in preparation for commissioning on the Very Large Telescope Interferometer (VLTI) at the end of 2002. It will perform interferometric observations over the 8 μm - 13 μm wavelength range, with a spatial resolution of 20 milliarcsec, a spectral resolution of up to 250, and an anticipated point-source sensitivity of N = 4 mag or 1 Jy for self-fringe tracking, which will be the only observing mode during the first months of operation. We describe the layout of the instrument and its performance during laboratory tests, both for broadband and spectrally resolved observing modes. We also briefly outline the planned guaranteed-time observations.
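The quoted 20 milliarcsec spatial resolution is consistent with the diffraction limit λ/B of the interferometer; the short check below assumes a 10 μm wavelength and a roughly 100 m baseline (the baseline length is our assumption, chosen as representative of the longer VLTI baselines):

# Angular resolution of an interferometer: theta ~ lambda / B.
import math

wavelength_m = 10e-6      # mid N band
baseline_m = 100.0        # assumed, representative of the longer VLTI baselines
theta_rad = wavelength_m / baseline_m
theta_mas = math.degrees(theta_rad) * 3600.0 * 1000.0
print(f"{theta_mas:.0f} mas")   # ~21 mas, consistent with the quoted resolution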
This paper describes an automated manufacturability assessment and yield prediction software platform that has been developed, installed, and used in a semiconductor manufacturing environment. The system is applied to characterize all incoming new products by extracting a large number of design attributes from their layout. By checking the similarity to products that have already shown acceptable levels of yield, the risk of starting high-volume production for new products is evaluated. The system also accurately predicts product yield at the functional-block and layer level, which is indispensable for understanding product-dependent yield loss and for rapid yield learning. The system proves to be a very valuable addition to conventional yield-analysis methodologies.
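A hedged sketch of the similarity check described above is shown below; the attribute names, values, and distance metric are invented for illustration and are not those of the actual platform.

# Illustrative similarity check: compare a new product's layout attributes to
# those of products with known acceptable yield, using a normalized distance.
import math

known_good = [
    {"metal_density": 0.42, "via_count_millions": 3.1, "min_pitch_nm": 90.0},
    {"metal_density": 0.47, "via_count_millions": 2.8, "min_pitch_nm": 90.0},
]
new_product = {"metal_density": 0.45, "via_count_millions": 3.0, "min_pitch_nm": 90.0}

def distance(a, b):
    """Normalized Euclidean distance between two attribute dictionaries."""
    return math.sqrt(sum(((a[k] - b[k]) / b[k]) ** 2 for k in b))

risk_score = min(distance(new_product, ref) for ref in known_good)
# A small score means the new design resembles a product that already yields
# well, so starting high-volume production is considered lower risk.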