Long-range telescopic video imagery of distant terrestrial scenes, aircraft, rockets, and other aerospace vehicles can be a powerful observational tool. But what about the associated acoustic activity? A new technology, Remote Acoustic Sensing (RAS), may provide a method to remotely listen to the acoustic activity near these distant objects. Local acoustic activity sometimes weakly modulates the ambient illumination in a way that can be remotely sensed. RAS is a new type of microphone that separates an acoustic transducer into two spatially separated components: 1) a naturally formed in situ acousto-optic modulator (AOM) located within the distant scene, and 2) a remote sensing readout device that recovers the distant audio. These two elements are passively coupled over long distances, at the speed of light, by naturally occurring ambient light energy or other electromagnetic fields. Stereophonic, multichannel, and acoustic beamforming configurations are all possible using RAS techniques, and when combined with high-definition video imagery, RAS can help provide a more cinema-like immersive viewing experience.
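The multichannel beamforming mentioned above can be illustrated with a generic delay-and-sum sketch. This is not the authors' implementation; it simply assumes one recovered audio trace per AOM and a known distance from the acoustic source to each AOM, and aligns the channels before summing (the function name, sample rate, and geometry are illustrative assumptions):

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air (approximate)

def delay_and_sum(channels, distances, fs, c=SPEED_OF_SOUND):
    """Steer a multichannel RAS array toward a source by delay-and-sum.

    channels:  2-D array (n_aoms, n_samples), one audio trace per AOM
    distances: distance (m) from the acoustic source to each AOM
    fs:        audio sample rate in Hz
    """
    channels = np.asarray(channels, dtype=float)
    # Sound reaches each AOM after a propagation delay of distance / c.
    delays = np.asarray(distances, dtype=float) / c
    # Advance each channel by its delay relative to the earliest arrival.
    shifts = np.round((delays - delays.min()) * fs).astype(int)
    n = channels.shape[1]
    out = np.zeros(n)
    for ch, s in zip(channels, shifts):
        out[:n - s] += ch[s:]
    return out / len(channels)
```

Summing aligned channels reinforces sound from the steered direction while uncorrelated noise in each channel averages down, which is the usual motivation for delay-and-sum beamforming.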
A practical implementation of a remote acousto-optic readout device can be a challenging engineering problem. The acoustic influence on the optical signal is generally weak and often accompanied by a strong bias term. The optical signal is further degraded by atmospheric seeing turbulence. In this paper, we consider two fundamentally different optical readout approaches: 1) a low-pixel-count photodiode-based RAS photoreceiver, and 2) audio extraction directly from a video stream. Most of our RAS experiments to date have used the first method for reasons of performance and simplicity. But there are potential advantages to extracting audio directly from a video stream. These advantages include the straightforward ability to work with multiple AOMs (useful for acoustic beamforming), simpler optical configurations, and a potential ability to use certain preexisting video recordings. However, doing so requires overcoming significant limitations, typically including much lower sample rates, reduced sensitivity and dynamic range, more expensive video hardware, and the need for sophisticated video processing. The ATCOM real-time image processing software environment provides many of the capabilities needed for researching video-acoustic signal extraction. ATCOM is currently a powerful tool for the visual enhancement of telescopic views distorted by atmospheric turbulence. To explore the potential of acoustic signal recovery from video imagery, we modified ATCOM to extract audio waveforms from the same telescopic video sources. In this paper, we demonstrate and compare both readout techniques for several aerospace test scenarios to better show where each has advantages.
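The video-stream readout idea can be sketched in a minimal form: spatially average a region of interest containing the AOM in each frame, then high-pass the resulting brightness trace to remove the strong bias term. This is a generic illustration, not the ATCOM implementation; the function, ROI convention, and filter window are assumptions:

```python
import numpy as np

def audio_from_frames(frames, roi, fps):
    """Recover an audio waveform from a video sequence.

    frames: iterable of 2-D numpy arrays (grayscale frames)
    roi:    (row0, row1, col0, col1) bounding the modulated AOM region
    fps:    video frame rate, which is also the audio sample rate
            (usable audio bandwidth is limited to fps / 2)
    """
    r0, r1, c0, c1 = roi
    # Spatially average the AOM region in each frame: acoustic
    # modulation appears as small brightness fluctuations over time.
    trace = np.array([f[r0:r1, c0:c1].mean() for f in frames])
    # Remove the DC bias and slow illumination drift with a simple
    # moving-average high-pass filter (~50 ms window, odd length).
    win = max(3, int(fps * 0.05)) | 1
    baseline = np.convolve(trace, np.ones(win) / win, mode="same")
    audio = trace - baseline
    # Normalize to [-1, 1] for playback or further processing.
    peak = np.abs(audio).max()
    return audio / peak if peak > 0 else audio
```

The sample-rate limitation discussed above is visible directly here: a 1000 fps camera caps the recovered audio at 500 Hz, far below what a photodiode readout can achieve.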
A conventional camera can be adapted for underwater use by enclosing it in a sealed waterproof pressure housing with a viewport. The viewport, as the optical interface between water and air, must account for both the camera and water optical characteristics while also providing a high-pressure water seal. Limited hydrospace visibility drives a need for wide-angle viewports. Practical optical interfaces between seawater and air vary from simple flat-plate windows to complex water-contact lenses. This paper first provides a brief overview of the physical and optical properties of the ocean environment, along with suitable optical materials. This is followed by a discussion of the characteristics of various afocal underwater viewport types, including flat windows, domes, and the Ivanoff corrector lens, a derivative of a Galilean wide-angle camera adapter. Several new and interesting optical designs derived from the Ivanoff corrector lens are presented, including a pair of very compact afocal viewport lenses that are compatible with both in-water and in-air environments, and an afocal underwater hyper-hemispherical fisheye lens.
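The need for wide-angle viewports can be made concrete with a small Snell's-law calculation for the simplest case, a flat-plate window. A flat port refracts every oblique ray at the water-air interface, compressing the camera's in-air field of view; the refractive index value and function below are illustrative assumptions:

```python
import math

N_WATER = 1.34  # approximate refractive index of seawater (assumption)
N_AIR = 1.0

def in_water_fov(in_air_fov_deg, n_water=N_WATER):
    """Full angular field of view seen through a flat-plate viewport.

    Each ray obeys Snell's law at the window,
    n_water * sin(theta_water) = n_air * sin(theta_air),
    so the camera's in-air half-angle maps to a smaller
    in-water half-angle.
    """
    half_air = math.radians(in_air_fov_deg / 2.0)
    half_water = math.asin(N_AIR * math.sin(half_air) / n_water)
    return math.degrees(2.0 * half_water)
```

For example, a lens with a 90 degree in-air field of view covers only about 64 degrees through a flat port, which is one motivation for dome ports and Ivanoff-type correctors that restore the in-air field of view underwater.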
Rockets and other high-altitude aerospace vehicles produce interesting visual and aural phenomena that can be remotely observed from long distances. This paper describes a compact, passive, and covert remote sensing system that can produce high-resolution sound movies at >100 km viewing distances. The telescopic high-resolution camera is capable of resolving and quantifying space launch vehicle dynamics, including plume formation, staging events, and payload fairing jettison. Flight vehicles produce sounds and vibrations that modulate the local electromagnetic environment. These audio-frequency modulations can be remotely sensed by passive optical and radio wave detectors. Acousto-optic sensing methods were primarily used, but an experimental radioacoustic sensor using passive micro-Doppler radar techniques was also tested. The synchronized combination of high-resolution flight vehicle imagery with the associated vehicle sounds produces a cinema-like experience that is useful in both an aerospace engineering and a Hollywood film production context. Examples of visual, aural, and radar observations of the first SpaceX Falcon 9 v1.1 rocket launch are shown and discussed.
A solar-illuminated glinting specular object can serve as an in situ sensor probe that is observable from long distances. Retroreflective objects produce bright glints when illuminated by coaxial illumination sources such as lasers. These glints are modulated in various ways by variations in the illumination source, the local probe environment, the intervening propagation paths, and the remote sensing system. The modulating signals can be recovered by using reflectivity detectors with temporal, spatial, wavelength, directivity, and polarization sensitivity. Clustered and moving specular probes provide additional information through geometry extraction, beamforming, and multisensor noise reduction. Experimental results are shown for omnidirectional specular imaging, atmospheric wake turbulence measurement, red-eye sensing, and acoustic sensing.
Alexander Graham Bell's photophone of 1880 was a simple free space optical communication device that used the sun to illuminate a reflective acoustic diaphragm. A selenium photocell located 213 m (700 ft) away converted the acoustically modulated light beam back into sound. A variation of the photophone is presented here that uses naturally formed free space acousto-optic communications links to provide passive multichannel long range acoustic sensing. This system, called RAS (remote acoustic sensor), functions as a long range microphone with a demonstrated range in excess of 40 km (25 miles).
Achieving high-resolution imagery of distant terrestrial objects from ground-based sensors presents a unique technical challenge. The entire optical path is fully immersed in a dense and turbulent atmosphere, resulting in a significant loss of scene contrast and resolution. Although there are strong similarities to the problems of high-resolution astronomical and space object imaging, there are also some significant differences. This paper describes the long horizontal-path seeing environment, two portable long-range imaging systems, MIST (Miniature Integrated Speckle imaging Telescope) and TFIC (Terrestrial Fusion Imaging Camera), and the associated image processing workflow. MIST was specifically designed to support long-range, high-resolution imaging research. TFIC is a very portable and compact high-resolution field imaging system. The TFIC image processing workflow uses a combination of luminance processing, speckle imaging, and image fusion. Representative high-resolution urban and marine environment imagery with horizontal path distances up to 128 km (80 miles) is shown.
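The speckle imaging step in such a workflow can be illustrated with a simplified lucky-imaging sketch: keep only the sharpest frames, register them by cross-correlation, and average. This is a generic stand-in under stated assumptions (frame stack as a 3-D numpy array, integer-pixel registration only), not the MIST/TFIC algorithm:

```python
import numpy as np

def lucky_stack(frames, keep_fraction=0.1):
    """Simplified lucky-imaging stack.

    frames: 3-D numpy array (n_frames, height, width), grayscale
    Selects the sharpest frames, registers each to the first by
    FFT cross-correlation, and averages the aligned result.
    """
    frames = np.asarray(frames, dtype=float)
    # Sharpness metric: mean squared gradient (higher = less blurred).
    gy, gx = np.gradient(frames, axis=(1, 2))
    sharpness = (gx ** 2 + gy ** 2).mean(axis=(1, 2))
    n_keep = max(1, int(len(frames) * keep_fraction))
    best = frames[np.argsort(sharpness)[-n_keep:]]

    ref = best[0]
    h, w = ref.shape
    f_ref = np.fft.fft2(ref)
    stack = np.zeros_like(ref)
    for frame in best:
        # Cross-correlation peak gives the integer shift to align
        # this frame with the reference.
        corr = np.fft.ifft2(f_ref * np.conj(np.fft.fft2(frame))).real
        dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
        # Wrap shifts into a signed range for np.roll.
        dy = (dy + h // 2) % h - h // 2
        dx = (dx + w // 2) % w - w // 2
        stack += np.roll(frame, (dy, dx), axis=(0, 1))
    return stack / n_keep
```

Averaging only the sharpest, co-registered frames trades exposure time for resolution, which is the essential idea behind speckle and lucky imaging through horizontal-path turbulence.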