Aging Indoor Navigation
The Aging Indoor Navigation room showcases a suite of technologies developed to support our aging population as they continue to live and thrive independently in their own homes. The non-invasive alert system we have developed monitors movement around the house through the use of RFID tags and a low-power reader rather than more invasive camera-based systems. Designed to be user-configurable, the system responds to prolonged periods of inactivity by texting the user and, if the user doesn't respond, reaches out to user-approved caregivers or family members. Our demonstration system is installed in a mock-up apartment to give context to its use and portability.
Simulated Edge Detection
VEMI built the Simulated Edge Detection experience to test the feasibility and efficacy of augmented-reality edge detection and highlighting as an aid for visually impaired individuals. As the project has progressed from simulated tests to physical hardware leveraging real-time augmented reality, the simulation has become a useful demonstration of real-world applications related to accurate perception of the environment. It also provides insight into the agile, collaborative process practiced here at VEMI.
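To give a flavor of the underlying idea, a basic edge detector marks pixels where image brightness changes sharply. The NumPy sketch below uses a simple Sobel gradient filter; it is our own minimal stand-in for illustration, not the Lab's pipeline, which highlights edges in real time within an AR view.

```python
import numpy as np

def sobel_edges(img: np.ndarray, threshold: float = 1.0) -> np.ndarray:
    """Mark pixels whose Sobel gradient magnitude exceeds `threshold`.

    A deliberately simple illustration of edge detection; real AR
    systems use far more robust, hardware-accelerated methods.
    """
    gx_kernel = np.array([[-1, 0, 1],
                          [-2, 0, 2],
                          [-1, 0, 1]], dtype=float)
    gy_kernel = gx_kernel.T
    h, w = img.shape
    edges = np.zeros((h, w), dtype=bool)
    # Slide the 3x3 kernels over every interior pixel.
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            gx = float(np.sum(patch * gx_kernel))
            gy = float(np.sum(patch * gy_kernel))
            edges[y, x] = np.hypot(gx, gy) > threshold
    return edges
```

Run on a synthetic image with a vertical brightness step, the detector flags only the pixels along the step, which is exactly the structural outline an AR overlay would then highlight for the user.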
Driving Simulator
Collaboratively built in-house and frequently upgraded to meet the demands of new research studies, our driving simulator is as diverse and flexible as our students. Thanks to custom hardware built around the seat of a Crown Victoria, the driving simulator adds an immersive tactile experience to its visual and auditory simulations. As a research tool, it helps us investigate how we form and use cognitive maps of the spaces around us; how our perception and visual function (acuity, contrast, and field of view) are affected by driving conditions, distractions, and aging; and how effective compensatory augmentations would be before we build physical prototypes. The driving simulator is also a useful technical demonstration of the realism and reliability of virtual reality as a research tool: previous studies conducted in actual vehicles in real space have been replicated in the simulator with highly similar performance. Our most recent work uses the simulator to study the human interactions associated with safe and efficient driving of semi-autonomous and fully autonomous vehicles.
Architecture Demo
Continuing VEMI’s search for ways to leverage virtual reality as a visualization tool, the Lab built an Architecture Demo. VEMI’s goal for this project was to explore how consumer-level hardware can be combined with modern VR/AR techniques to provide a robust, cost-effective solution supporting a broad range of applications. To this end, the demo can be run on a head-mounted display (HMD) for full 3D visual immersion or on an iPad for a realistic glimpse into the virtual space, akin to looking through a camera’s viewfinder. Consumer-accessible visualization software like this Architecture Demo can be used to solve and prevent many problems related to spatial perception and cognition. For example:
- Contractors can use the software to determine potential problems in construction.
- Investors and stakeholders in a project can see what a building or addition will look like before beginning physical construction.
- Someone who is moving into a new living situation, whether it be their first apartment, fifth house, or an assisted care facility, can explore and familiarize themselves with their new environment before even setting foot on site.
Radiation Balance Visualizer
VEMI’s radiation balance visualizer was created in collaboration with the Climate Change Institute here at UMaine. Earth’s radiation energy balance is a cornerstone of geologic and climate studies, but the static, two-dimensional charts and graphs commonly used to visualize it struggle both to capture the enormity of the data involved and to make it easily digestible. Engineering immersive, three-dimensional environments as the setting for visual simulations of this balance makes the large quantity of data and its geographic context easier to understand. As such, this virtual reality experience is both an educational tool and a technical demonstration of how virtual reality can be used to visualize large quantities of data.
Disaster Demo
VEMI’s Disaster Demo showcases both how virtual reality can be used for safer disaster-readiness training and how augmented reality can assist emergency responders in the field. By combining spatial information with VEMI’s simulated augmented reality and edge detection, this demo shows participants room and location information and highlights the edges of the building’s structure around them, enabling wayfinding and navigation through billowing smoke and flames.
Lunar Habitat Demo
The Lunar Habitat Demo is a collaborative effort between VEMI and Dr. Ali Abedi’s WiSe-NET Lab (here at UMaine) showcasing virtual reality as a tool for visualizing and planning new environments. Static physical models have to be scaled down to be portable and accessible, but a virtual environment can be saved to a flash drive and shared with researchers, contractors, educators, and even interested students attending seminars and camps! By making cutting-edge spaces like this lunar habitat explorable from anywhere, virtual reality is helping educate and connect people with the innovation and development happening around them.
Alone In The Dark
Alone in the Dark emphasizes and explores VEMI’s focus on multimodal interaction with our environments. Relying primarily on their sense of hearing, participants advance through a fun narrative by exploring their environment and reacting to the events happening around them. In addition to highlighting the importance of spatialized audio in creating truly immersive experiences, Alone in the Dark also demonstrates how we can overlay cognitive maps of virtual spaces onto our perception of real spaces. Participants are often surprised by how accurate they are when asked to point to the real-world locations of virtual objects learned in the demo!