Introduction to Next-Generation Augmented Reality
Augmented reality (AR) applications for mobile devices are about to get a whole lot smarter and more sophisticated, thanks to recent grants awarded to UC Santa Barbara computer science professors Matthew Turk and Tobias Höllerer. The professors envision next-generation AR that is more stable, realistic, and dynamically updated by users.
The Current State of Augmented Reality
Many mainstream AR applications depend on mobile device sensors and a static dataset layered over real-time visuals or GPS coordinates. Turk and Höllerer aim to change this by employing real-time computer vision for more stable presentation of 3D computer graphics that appear as if they are truly part of the physical world.
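The article doesn’t specify which vision algorithms the researchers use, but the heart of stable registration is projecting virtual geometry into the camera image through an estimated pose. A minimal sketch of the standard pinhole projection model, with made-up intrinsics and pose values for illustration:

```python
import numpy as np

def project_points(points_3d, K, R, t):
    """Project world-space 3D points into pixel coordinates
    using the pinhole model: x = K [R | t] X."""
    cam = R @ points_3d.T + t.reshape(3, 1)  # world -> camera frame
    px = K @ cam                             # camera -> image plane
    return (px[:2] / px[2]).T                # perspective divide

# Illustrative camera: 800px focal length, principal point at (320, 240).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
R = np.eye(3)                      # camera looking straight down +z
t = np.array([0.0, 0.0, 5.0])      # virtual anchor is 5 units ahead

anchor = np.array([[0.0, 0.0, 0.0]])  # virtual object at the world origin
print(project_points(anchor, K, R, t))  # lands at the image center (320, 240)
```

Re-estimating R and t from camera features every frame, rather than from drifting GPS and compass readings, is what keeps the rendered graphics pinned to the scene.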
Applications of Next-Generation AR
Imagine being able to explore a landscape architect’s design by placing virtual trees or walking through the grounds they plan to develop. A tourist at an archaeological site could explore a reconstruction of an ancient temple where it once stood. The possibilities are vast, and the UCSB team is conducting intensive research to make them a reality.
The Technology Behind Next-Generation AR
The team is working on coupling mobile computer vision capture with crowdsourced user data that could immediately discern whether a virtual object matches its real-world counterpart. They’ve termed it "anywhere" augmented reality. This technology would allow users to update information in real time, improving the experience for the next user.
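The researchers’ crowdsourcing mechanism isn’t described in the article. As a purely hypothetical illustration, one simple approach is a vote threshold: an annotation is trusted once enough users have confirmed it, and flagged for update once enough have disputed it (the function name, labels, and thresholds below are assumptions, not the team’s design):

```python
from collections import Counter

def consensus(votes, min_votes=3, threshold=0.7):
    """Decide whether an AR annotation still matches the physical
    scene, based on crowdsourced confirm/dispute votes."""
    if len(votes) < min_votes:
        return "unverified"              # too little evidence either way
    counts = Counter(votes)
    if counts["confirm"] / len(votes) >= threshold:
        return "verified"
    if counts["dispute"] / len(votes) >= threshold:
        return "stale"                   # flag for re-capture / update
    return "contested"

print(consensus(["confirm", "confirm", "confirm", "dispute"]))  # verified
```

A real system would likely weight votes by user reliability and recency, but the core idea is the same: the crowd keeps the virtual layer in sync with the changing physical world.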
Creating an Optimal User Interface
To achieve "anywhere augmentation," the researchers must first design an interface that developers can easily experiment with out of the box. With a recent $300,000 grant from the federal Office of Naval Research, Turk and Höllerer’s computer science research group will closely study user experience to create that optimal user interface for AR.
Collaborations and Funding
The project is a collaboration with Virginia Tech Professor Doug Bowman, and the team has also received $500,000 over three years from the National Science Foundation for a project that uses computer vision-based tracking and augmented reality to enhance remote collaboration (telecollaboration) in physical spaces.
Enhancing Remote Collaboration
Their research group is working on a project that allows users in different locations to view and place data within a target scene, extending two-dimensional tracking to a real-time three-dimensional scenario. This technology has the potential to revolutionize remote collaboration, letting people interact with one another and their environment in a more immersive way.
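The article doesn’t detail the tracking pipeline, but a standard building block for lifting 2D feature tracks into 3D is linear (DLT) triangulation from two calibrated views. A self-contained sketch, where the camera matrices and pixel coordinates are made-up values for illustration:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its 2D projections x1 and x2 in two
    cameras with 3x4 projection matrices P1 and P2 (linear DLT)."""
    A = np.array([
        x1[0] * P1[2] - P1[0],   # each observation gives two linear
        x1[1] * P1[2] - P1[1],   # constraints on the homogeneous point
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)  # null vector of A is the point
    X = Vt[-1]
    return X[:3] / X[3]          # dehomogenize

# Two views one unit apart along x, both looking down +z (identity intrinsics).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
x1, x2 = (0.125, 0.05), (-0.125, 0.05)   # the same feature seen in each view
print(triangulate(P1, P2, x1, x2))       # recovers the 3D point [0.5, 0.2, 4.0]
```

Running this continuously over tracked features is one way a shared 3D model of the target scene could be built up, so that data placed by one remote user appears correctly anchored for the others.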
The Future of Augmented Reality
Next-generation AR could make it possible to explore any physical environment, known or unknown, live from a remote location. Rapid advancements in mobile computing devices provide an ideal springboard for the technology. Applications for mobile, real-time augmented reality could have a significant impact on health, education, entertainment, and many other areas.
Conclusion
The work being done by the UCSB Four Eyes Lab has the potential to revolutionize the field of augmented reality. With their focus on the "four I’s" of Imaging, Interaction, and Innovative Interfaces, they are pushing the boundaries of what is possible with AR. As the technology continues to advance, we can expect to see more sophisticated and interactive AR applications that change the way we interact with the world around us.