KEYWORDS: Data modeling, Virtual reality, C++, Computer programming, Computing systems, Ultrasonography, Data storage, Augmented reality, Medical imaging, Systems modeling
Applications in the fields of virtual and augmented reality as well as image-guided medical applications make use of a
wide variety of hardware devices. Existing frameworks for interconnecting low-level devices and high-level application
programs do not exploit the full potential for processing events coming from arbitrary sources and are not easily generalizable.
In this paper, we introduce a new multi-modal event processing methodology that uses dynamically
typed event attributes for event passing between multiple devices and systems. The existing OpenTracker framework was
modified to incorporate a highly flexible and extensible event model, which can store data that is dynamically created
and arbitrarily typed at runtime. The main factors impacting the library's throughput were determined and the performance
was shown to be sufficient for most typical applications. Several sample applications were developed to take advantage
of the new dynamic event model provided by the library, thereby demonstrating its flexibility and expressive power.