What does it feel like to be in the body of another? Researchers are using sensors, three-dimensional (3D) image capture, and electronic materials to build wearable experiences and skin-integrated interfaces that are pushing the boundaries of immersive technologies and cognition.
Novices traditionally learn complex procedures by observing experts. WEKIT (Wearable Experiences for Knowledge Intensive Training) stretches this concept by using augmented reality (AR) or virtual reality (VR) to enable trainees to become embodied in an expert.
WEKIT uses a bespoke 3D authoring tool and the HoloLens 2 wireless AR headset to capture experts' head, finger, and eye movements. Novices learn by wearing the captured experience, via the same technology, and following the experts' movements.
"They can see ghost hands, see a ghost of the performance of the expert, and thereby wear the expert," explains Carl Smith of Ravensbourne University, U.K., one of WEKIT's 13 partners. "You're not looking at the expert, you're in the body of the expert."
Between 2016 and 2019, pilot projects explored WEKIT applications in aircraft maintenance, astronaut training, and medical diagnostics. The consortium reported a 40% increase in knowledge transfer using the system.
WEKIT can integrate multimedia content, such as audio and video files, into the AR learning space, says Smith. "They're talking through an audio representation of what they're doing, a step-by-step guide."
WEKIT does more than capture objective performance. The researchers also developed a harness that monitors users' subjective states via heart rate and galvanic skin response. Comparing the two recordings shows whether expert and novice stress levels align during procedures.
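How closely a novice's arousal tracks an expert's could be quantified as a correlation between the two physiological traces. The sketch below is purely illustrative (the sample heart-rate data and function are assumptions, not WEKIT's actual analysis pipeline):

```python
import math

def pearson(xs, ys):
    """Pearson correlation between two equal-length signal traces."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative heart-rate traces (beats per minute) sampled at the
# same steps of a procedure.
expert_hr = [72, 75, 80, 86, 84, 78, 74]
novice_hr = [88, 92, 99, 107, 104, 96, 90]

# A high correlation suggests the novice's stress rises and falls at
# the same procedural steps as the expert's, even when the absolute
# levels differ.
alignment = pearson(expert_hr, novice_hr)
```

Here the novice runs hotter overall, but the traces rise and fall together, so the alignment score is high.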
The partners have created a spin-off company, WEKIT ECS, to develop further industry applications and exploit new hardware, such as AR contact lenses.
Embodiment aids technical learning, but can it also teach empathy?
Marte Roel Lesur is a doctoral researcher in cognitive neuropsychology at the University of Zurich, and cofounder of BeAnotherLab, a Barcelona, Spain-based multinational collective that investigates empathy from an embodied perspective. The interdisciplinary group was formed in 2012 by individuals with backgrounds as varied as computer science, digital arts, and conflict resolution.
Early experiments used low-resolution systems to generate the illusion of embodying another person, explains Lesur. "We thought that this would be a way to create exponential change in the world; if we would create an accessible technology that would allow anyone to step in the shoes of another."
The collective's Body Swap installation stays true to the philosophy of using accessible technology; it deploys head-mounted VR displays and webcams to allow users to exchange perspectives. "We literally just swap where the webcam is connected so my webcam would be connected to your headset and your webcam would be connected to my headset," says Lesur.
Having exchanged perspectives, users are then instructed to move slowly and follow each other, with no leader and no follower. "What is interesting is that very organically, they start synchronizing their movements," says Lesur. "When you synchronize vision and touch, you sort of accept this new body as your own."
The illusion is reinforced using multi-sensory feedback tools, including mirrors, and by assistants who interact with the participants during the experience. BeAnotherLab researchers believe the system has applications in health and education and are currently working with schools in France to build empathy around issues such as dyslexia and bullying.
Immersive technologies can produce a sense of embodiment, which can be enhanced by using the body as a sensory interface.
An international team of researchers, drawn from China's City University of Hong Kong, Shandong University, and Tsinghua University; the U.K.'s University of Bristol; the U.S.'s University of Arizona, University of Illinois at Urbana-Champaign, Northwestern University, and Pennsylvania State University; and Illinois-based companies Neurolux Corp., Psyonic Inc., and Wearifi Inc., has developed a wireless, battery-free platform that uses haptic interfaces to communicate sensory information via programmable actuators. The system is controlled via touchscreen, explains contributor John A. Rogers, an electronic materials professor at Northwestern University. "You can touch the screen with your finger and the pattern of touch is immediately reproduced on the skin interface device."
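Reproducing a touch pattern on the skin implies mapping screen coordinates onto the actuator array. A minimal sketch of such a mapping is shown below; the grid dimensions and function are assumptions for illustration, not the team's actual firmware:

```python
def touch_to_actuator(x, y, rows=4, cols=8):
    """Map a normalized touchscreen coordinate (0.0-1.0 on each
    axis) to the index of the nearest actuator in a rows x cols
    array laid out row-major on the skin patch."""
    if not (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0):
        raise ValueError("coordinates must be normalized to [0, 1]")
    col = min(int(x * cols), cols - 1)
    row = min(int(y * rows), rows - 1)
    return row * cols + col

# A finger drag across the middle of the screen becomes a sequence
# of actuator indices, felt as a stroke moving across the skin.
stroke = [(0.1, 0.5), (0.4, 0.5), (0.7, 0.5), (0.95, 0.5)]
pattern = [touch_to_actuator(x, y) for x, y in stroke]
```

Played back in order, the indices trace the same left-to-right path on the skin that the finger traced on the glass.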
Rogers has been researching how electronic materials can be integrated with skin for a decade. He has focused on devices such as monitoring patches for medical applications, "very, very thin, soft devices that can gently laminate under the surface of the skin."
This materials-oriented approach is now being applied to haptic interfaces for VR/AR, with the aim of full-body coverage. "It looks more like a second skin, almost like our electronic monitoring patches, but now with that haptic, tactile feedback capability built-in."
The system has multiple, ultrathin layers, one of which contains an array of actuators in silicone. "A current running into each one of these actuators causes a vibratory motion that's then imparted to the surface of the skin," says Rogers. Another layer provides the electronic functionality that directs the pattern of current, he says, and "The refresh rate is faster than your ability to sense changes in patterns of touch on the skin, so that's very fast electronics."
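Conceptually, each refresh cycle the control layer latches a new drive level onto every actuator at once. The sketch below shows what one such frame might look like; the patch dimensions and function names are hypothetical, not taken from the published device:

```python
from typing import List

def scan_out(pattern: List[List[float]]) -> List[float]:
    """Flatten one frame of a 2D vibration-intensity pattern
    (0.0-1.0 per actuator) into the per-channel drive levels the
    control electronics would latch in a single refresh cycle,
    clamping out-of-range values."""
    return [max(0.0, min(1.0, level)) for row in pattern for level in row]

# One refresh frame: a bright vibratory spot, with a softer halo,
# on a 3 x 4 actuator patch.
frame = [
    [0.0, 0.2, 0.2, 0.0],
    [0.2, 1.0, 1.0, 0.2],
    [0.0, 0.2, 0.2, 0.0],
]
drive = scan_out(frame)
```

Stream successive frames fast enough, as Rogers describes, and the skin perceives a continuously moving sensation rather than discrete updates.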
Thermal and more sophisticated touch sensations are being added to the platform. Because the system can be driven remotely, Rogers sees multiple applications, including touch shared via social media.