A method for generating photo-real renderings of bioluminescence is developed for creatures from the abyss.
Bioluminescence results from a chemical reaction, with examples found in deep-sea marine environments including
algae, copepods, jellyfish, squid, and fish. In bioluminescence, the excitation energy is supplied by a chemical reaction,
not by a source of light. The greatest transparency window in seawater is in the blue region of the visible spectrum.
From small creatures like single-cell algae to large species such as the siphonophore Praya dubia (40 m), luminescent
phenomena can be produced by mechanical excitation from the disturbance of passing objects. Deep-sea fish like the
Pacific black dragonfish are covered with photophores along the upper and lower surfaces, which emit light when
disturbed. Other animals, like small squids, have several different types of light organs oscillating at different rates.
Custom shaders and material phenomena incorporate indirect-lighting techniques such as global illumination, final gathering, ambient
occlusion, and subsurface scattering to produce photo-real images. Species like the hydromedusae jellyfish produce
colors that are also generated by iridescence of thin tissues. The modeling and rendering of these tissues requires thin-film
multilayer stacks. These phenomena are simulated with semi-rigid-body dynamics in a procedural animation
environment.
environment. These techniques have been applied to develop spectral rendering of scenes outside the normal visible
window in typical computer animation render engines.
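The iridescence of thin tissues described above arises from wavelength-scale interference within a film. As a minimal sketch of the underlying optics, the Airy formula below gives the normal-incidence reflectance of a single thin film; the refractive indices and thickness used here are illustrative assumptions, not measured tissue data.

```python
import cmath
import math

def thin_film_reflectance(wavelength_nm, n_film, thickness_nm,
                          n_outside=1.33, n_inside=1.36):
    """Normal-incidence reflectance of a single thin film (Airy formula).

    n_outside / n_inside default to seawater-like and tissue-like indices;
    these values are illustrative assumptions, not measured data.
    """
    # Fresnel amplitude coefficients at the two interfaces
    r1 = (n_outside - n_film) / (n_outside + n_film)
    r2 = (n_film - n_inside) / (n_film + n_inside)
    # Phase accumulated in one round trip through the film
    delta = 4.0 * math.pi * n_film * thickness_nm / wavelength_nm
    phase = cmath.exp(1j * delta)
    r = (r1 + r2 * phase) / (1.0 + r1 * r2 * phase)
    return abs(r) ** 2

# Reflectance varies across the visible band, which reads as iridescent color
for wl in (450, 550, 650):
    print(wl, thin_film_reflectance(wl, n_film=1.56, thickness_nm=300.0))
```

Because the reflectance is a strong function of wavelength (and, in a fuller model, of viewing angle), evaluating it per-wavelength in a shader yields the shifting colors characteristic of iridescence.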
KEYWORDS: Skin, High dynamic range imaging, Light scattering, Camouflage, Scattering, Thin films, Tissue optics, Process modeling, 3D modeling, Digital imaging
High Dynamic Range Image (HDRI) rendering and animation of color in the camouflage of chameleons is developed
utilizing thin-film optics. Chameleons are a family of lizards with the ability to change their skin color. This change in
color is an expression of the physical and physiological conditions of the lizard, and plays a part in communication. The
different colors that can be produced depending on the species include pink, blue, red, orange, green, black, brown and
yellow. The modeling, simulation, and rendering of their skin color incorporate thin-film optical stacks. The
skin of a chameleon has four layers, which together produce various colors. The outer transparent layer contains
chromatophore cells with two kinds of pigment, yellow and red. Beneath it are two more layers that reflect light: one blue
and the other white. The innermost layer contains dark pigment granules, or melanophore cells, that influence the amount
of reflected light. All of these pigment cells can rapidly relocate their pigments, thereby changing the color of the
chameleon. Techniques like subsurface scattering, the simulation of volumetric scattering of light beneath the object's
surface, and final gathering are defined in custom shaders and material phenomena for the renderer. The workflow
developed to model the chameleon's skin is also applied to simulation and rendering of hair and fur camouflage, which
does not exist in nature.
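The layered-skin description above can be explored numerically with the transfer-matrix method, the standard way to evaluate thin-film optical stacks. The sketch below is a simplified normal-incidence model; the layer indices and quarter-wave thicknesses are hypothetical values chosen for illustration, not measured chameleon skin parameters.

```python
import cmath
import math

def matmul2(a, b):
    """Multiply two 2x2 complex matrices stored as nested lists."""
    return [[a[0][0]*b[0][0] + a[0][1]*b[1][0], a[0][0]*b[0][1] + a[0][1]*b[1][1]],
            [a[1][0]*b[0][0] + a[1][1]*b[1][0], a[1][0]*b[0][1] + a[1][1]*b[1][1]]]

def stack_reflectance(wavelength_nm, layers, n_in=1.0, n_out=1.4):
    """Normal-incidence reflectance of a thin-film stack via the
    transfer-matrix method. `layers` is a list of
    (refractive_index, thickness_nm) pairs, outermost first."""
    m = [[1.0, 0.0], [0.0, 1.0]]
    for n, d in layers:
        delta = 2.0 * math.pi * n * d / wavelength_nm
        # Characteristic matrix of one homogeneous layer
        layer = [[cmath.cos(delta), 1j * cmath.sin(delta) / n],
                 [1j * n * cmath.sin(delta), cmath.cos(delta)]]
        m = matmul2(m, layer)
    num = n_in * m[0][0] + n_in * n_out * m[0][1] - m[1][0] - n_out * m[1][1]
    den = n_in * m[0][0] + n_in * n_out * m[0][1] + m[1][0] + n_out * m[1][1]
    return abs(num / den) ** 2

# Hypothetical high/low-index quarter-wave pairs tuned near 500 nm; the
# indices and thicknesses are illustrative, not measured skin data.
pairs = [(1.83, 500 / (4 * 1.83)), (1.33, 500 / (4 * 1.33))] * 4
print(stack_reflectance(500.0, pairs))   # strong reflection at the design wavelength
print(stack_reflectance(650.0, pairs))   # much weaker off-peak
```

Changing the effective layer spacing in such a stack shifts the reflectance peak across the spectrum, which is one way a physically based shader can animate color change.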
KEYWORDS: Virtual reality, High dynamic range imaging, OpenGL, Human-machine interfaces, Visualization, Java, Cameras, Personal digital assistants, Nuclear weapons, Panoramic photography
A 3D workflow pipeline is presented for High Dynamic Range (HDR) image capture of projected scenes or objects for
presentation in CAVE virtual environments. The methods of HDR digital photography of environments vs. objects are reviewed. Samples of both types of virtual authoring, the actual CAVE environment itself and a sculpture, are shown. A series of software tools are incorporated into a pipeline called CAVEPIPE, allowing high-resolution objects and scenes to be composited together in natural illumination environments [1] and presented in our CAVE virtual reality environment. We also present a way to enhance the user interface for CAVE environments. Traditional methods of controlling navigation through virtual environments include gloves, HUDs, and 3D mouse devices. By integrating a wireless network that includes both WiFi (IEEE 802.11b/g) and Bluetooth (IEEE 802.15.1) protocols, the non-graphical
input control device can be eliminated. Wireless devices can then be added, including PDAs, smartphones,
Tablet PCs, portable gaming consoles, and Pocket PCs.
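HDR capture of the kind used in this pipeline typically merges a bracket of differently exposed photographs of the same scene into a single radiance map. The sketch below is a minimal version assuming a linear camera response (full Debevec-Malik merging also recovers the response curve); the hat-shaped weight down-weights pixels near the clipped ends of the 8-bit range.

```python
def merge_hdr(pixel_stacks, exposure_times):
    """Merge bracketed exposures into per-pixel radiance estimates.

    pixel_stacks: list of images; each image is a list of pixel values in
    [0, 255], all depicting the same static scene.
    exposure_times: shutter time (seconds) for each image, same order.
    Assumes a linear camera response, a simplification for illustration.
    """
    def weight(z):
        # Hat function: trust mid-range values, distrust near-clipped ones
        return z if z <= 127.5 else 255.0 - z

    n_pixels = len(pixel_stacks[0])
    radiance = []
    for p in range(n_pixels):
        num = den = 0.0
        for img, t in zip(pixel_stacks, exposure_times):
            z = img[p]
            w = weight(z)
            num += w * z / t      # each exposure votes z / t, weighted by reliability
            den += w
        radiance.append(num / den if den > 0 else 0.0)
    return radiance

# Synthetic bracket: true radiances 10 and 40, shots at 1 s, 2 s, 4 s
# (sensor values clip at 255)
times = [1.0, 2.0, 4.0]
truth = [10.0, 40.0]
shots = [[min(255.0, e * t) for e in truth] for t in times]
print(merge_hdr(shots, times))   # recovers [10.0, 40.0]
```

Long exposures resolve dark regions and short exposures resolve bright ones; the weighted average combines both into a radiance range far wider than any single photograph captures.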