Introduction to Face-Mic
Researchers at Rutgers University-New Brunswick have published a study called "Face-Mic," which examines how voice command features on augmented and virtual reality (AR/VR) headsets could lead to major privacy leaks, often called "eavesdropping attacks." The study reveals that hackers could exploit popular AR/VR headsets with built-in motion sensors to record subtle, speech-associated facial dynamics and steal sensitive information communicated via voice commands.
How Face-Mic Works
The research shows that common AR/VR systems on the market, including the Oculus Quest 2, HTC Vive Pro, and PlayStation VR, are vulnerable to eavesdropping attacks. Hackers can use the built-in motion sensors, such as accelerometers and gyroscopes, to capture facial dynamics associated with live human speech. This information can then be used to derive sensitive data, including bank card numbers, Social Security numbers, phone numbers, PINs, transactions, birth dates, and passwords.
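As a rough illustration of this sensing channel, the sketch below is not from the study and uses hypothetical sampling rates and frequency bands; it simply shows how already-recorded accelerometer data might be band-pass filtered to isolate speech-related vibrations.

```python
# Illustrative sketch (not the authors' code): isolating speech-band vibrations
# from hypothetical headset IMU recordings with a band-pass filter.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000                 # assumed IMU sampling rate in Hz (hypothetical)
SPEECH_BAND = (20, 400)   # assumed band where speech-related facial vibrations live

def bandpass(signal, fs, low, high, order=4):
    """Butterworth band-pass filter applied forward and backward (zero phase)."""
    nyq = fs / 2
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, signal)

# accel: (n_samples, 3) accelerometer readings captured while the user speaks.
accel = np.random.randn(5 * FS, 3)  # placeholder data for the sketch

# Keep only the frequency range plausibly modulated by speech-related facial motion.
speech_vibrations = np.column_stack(
    [bandpass(accel[:, axis], FS, *SPEECH_BAND) for axis in range(3)]
)
print(speech_vibrations.shape)
```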
Security Vulnerabilities
The study, led by Yingying "Jennifer" Chen, associate director of WINLAB and graduate director of Electrical and Computer Engineering at Rutgers University-New Brunswick, demonstrates that Face-Mic can derive a headset wearer's sensitive information on four mainstream AR/VR headsets. The researchers found that both low-cost cardboard headsets and high-end headsets suffer from security vulnerabilities that reveal a user's sensitive speech and speaker information without permission.
Types of Vibrations Captured
The researchers studied three types of vibrations captured by AR/VR headsets' motion sensors: speech-associated facial movements, bone-borne vibrations, and airborne vibrations. They found that bone-borne vibrations are particularly rich in gender, identity, and speech information. By analyzing the facial dynamics captured by the motion sensors, the researchers showed that these headsets can expose a user's sensitive information without permission.
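The following sketch is a loose illustration of that kind of inference, assuming vibration traces have already been captured and filtered; the spectral features, classifier choice, and synthetic labels are stand-ins for illustration, not the study's actual pipeline.

```python
# Illustrative sketch: turning a vibration trace into crude spectral features
# and training a toy classifier, standing in for the kind of speaker/gender
# inference described in the study. All data and labels here are synthetic.
import numpy as np
from scipy.signal import spectrogram
from sklearn.ensemble import RandomForestClassifier

FS = 1000  # assumed IMU sampling rate (hypothetical)

def vibration_features(signal, fs=FS):
    """Average log-power per frequency bin: a rough spectral fingerprint."""
    freqs, _, sxx = spectrogram(signal, fs=fs, nperseg=256, noverlap=128)
    return np.log(sxx.mean(axis=1) + 1e-12)

# Synthetic "recordings": one vibration trace per utterance, with stand-in labels.
rng = np.random.default_rng(0)
X = np.array([vibration_features(rng.standard_normal(2 * FS)) for _ in range(40)])
y = rng.integers(0, 2, size=40)  # 0/1 placeholder labels (e.g., gender)

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(clf.predict(X[:5]))
```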
Risks of Eavesdropping Attacks
Eavesdropping attackers can derive simple speech content, including digits and words, and use it to infer sensitive information. Exposing such information could lead to identity theft, bank card fraud, and leakage of confidential and health care information. Once a user has been identified by a hacker, an eavesdropping attack can further expose the user's sensitive information and lifestyle, such as AR/VR travel histories, game/video preferences, and shopping preferences.
Prevention and Future Research
The researchers hope that their findings will raise awareness among the general public about AR/VR security vulnerabilities and encourage manufacturers to develop safer models. They suggest that manufacturers consider additional security measures, such as adding ductile, vibration-damping materials to the foam replacement cover and the headband, which can attenuate the speech-associated facial vibrations that would otherwise be captured by the built-in accelerometer and gyroscope.
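To make the idea concrete, the sketch below models a damping layer as a simple low-pass filter and compares how much speech-band energy would reach the IMU with and without it; the cutoff frequency, band limits, and sampling rate are assumptions for illustration only, not measurements from the study.

```python
# Illustrative sketch: modeling a damping layer as a low-pass filter and
# comparing speech-band energy at the IMU with and without damping.
# Cutoff and band values are assumptions, not measured properties.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000               # assumed IMU sampling rate (hypothetical)
DAMPING_CUTOFF_HZ = 15  # assumed: damping suppresses vibration content above ~15 Hz

def speech_band_energy(signal, fs, low=20, high=400):
    """Total spectral power inside the assumed speech vibration band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    band = (freqs >= low) & (freqs <= high)
    return spectrum[band].sum()

vibration = np.random.randn(5 * FS)          # placeholder facial-vibration trace
b, a = butter(4, DAMPING_CUTOFF_HZ / (FS / 2), btype="low")
damped = filtfilt(b, a, vibration)           # what a damped headset might record

print("undamped speech-band energy:", speech_band_energy(vibration, FS))
print("damped speech-band energy:  ", speech_band_energy(damped, FS))
```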
Conclusion
In conclusion, the Face-Mic study highlights the potential risks of using AR/VR headsets with voice command features. The research demonstrates that hackers can use built-in motion sensors to capture sensitive information, including bank card numbers and passwords. To prevent such attacks, manufacturers must develop safer models and consider additional security measures. As AR/VR technology continues to evolve, it is important to prioritize user privacy and security to prevent eavesdropping attacks and protect sensitive information.