Digital and remote education is of growing interest for internationalized education programs that combine state-of-the-art training with hybrid and blended elements. In optics and photonics in particular, but not exclusively, hands-on experience in training laboratories is a key ingredient of modern academic education that cannot easily be replaced. We propose a versatile platform for remote-controllable experiments with a focus on flexible implementation. We present a toolbox, the Extended Reality Twin Lab, that enables teachers and lecturers in academia with a personal commitment to advancing and innovating education methods and learning outcomes to build their own remotely controllable optics and photonics experiments. An open-source GitHub repository includes source code for the server, the respective web applications, and the embedded microcontrollers. It also contains the 3D-printable models used to create attachments for optical components commonly used in scientific labs. All parts are modularly designed to enable individual adaptation to a variety of experiments. We exemplify our approach with a fully remote-controllable Michelson interferometer that was readily integrated into an ongoing international master's degree curriculum. With this implementation, international students are now able to attend the course and acquire specific optical knowledge and lab training regardless of their physical location. Reviewing this running field experiment, we also discuss students' learning outcomes with respect to optical principles, experimentation, and instruments.
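The server-to-microcontroller command path of such a remote lab can be illustrated with a small simulation. The following sketch is purely hypothetical: the `MirrorStage` class, the JSON command names, and the step size are illustrative assumptions, not the actual Extended Reality Twin Lab API (which lives in the project's GitHub repository).

```python
import json


class MirrorStage:
    """Simulated stepper-driven mirror mount, e.g. one arm of a
    Michelson interferometer controlled over a web connection.
    NOTE: illustrative assumption, not the actual Twin Lab code."""

    def __init__(self, step_um: float = 0.1):
        self.step_um = step_um      # travel per motor step, micrometres (assumed)
        self.position_um = 0.0      # current mirror position

    def handle(self, message: str) -> str:
        """Apply one JSON command, e.g. {"cmd": "move", "steps": 50},
        and return a JSON status reply."""
        cmd = json.loads(message)
        if cmd.get("cmd") == "move":
            self.position_um += cmd["steps"] * self.step_um
            return json.dumps({"ok": True, "position_um": self.position_um})
        if cmd.get("cmd") == "home":
            self.position_um = 0.0
            return json.dumps({"ok": True, "position_um": 0.0})
        return json.dumps({"ok": False, "error": "unknown command"})
```

In a real deployment, the web application would send such messages to the server, which relays them to the microcontroller driving the stage; the JSON reply lets the remote student observe the mirror position.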
Traditional hyperspectral imagers rely on scanning either the spectral or the spatial dimension of the hyperspectral cube with spectral filters or line scanning, which can be time-consuming and generally requires precise moving parts, increasing system complexity. More recently, snapshot techniques have emerged that capture the full hyperspectral datacube in a single shot. However, some of these snapshot systems are bulky and complicated, making them difficult to apply in the real world. This paper therefore proposes a compact snapshot hyperspectral imaging system based on compressive sensing theory, which consists of an imaging lens, a light splitter, a microlens array, a metasurface-covered sensor, and an RGB camera. Light from the object first passes through the imaging lens, and a splitter then divides it equally into two paths. Light on one path passes through the microlens array and is modulated by a metasurface on the imaging sensor, while light on the other path is received directly by the RGB camera. This system has the following advantages: first, the metasurface supercells can be well designed and arranged to optimize the transfer matrix of the system; second, the microlens array guarantees that light is incident on the metasurface at small angles, which eliminates the transmittance error introduced by the incidence angle; third, the RGB camera provides side information that helps to ease the reconstruction.
We present an approach to developing XR teaching applications by implementing open spaces as a hub for the photonics community and for developers interested in XR technologies, and we give a first evaluation of technologies and platforms.