Remember cyberspace? It’s now called the metaverse. It can be used to take a virtual walkthrough of a new house or building, or to remotely design and debug an assembly line. The possibilities are endless.
The metaverse isn’t the ubiquitous, all-enveloping, realistic simulation of The Matrix or a Star Trek holodeck, but hardware and software have progressed to the point where augmented reality (AR) and virtual reality (VR) are practical tools for collaboration.
Few of you use these tools for production work at this point. Nonetheless, the number is growing, and engineers and software developers should be aware of what’s possible so that they can determine when to take advantage of some amazing functionality.
A lot of pieces have been coming together, from GPUs capable of high frame rates and high resolutions to tiny displays that fit into a pair of glasses without feeling like you’re wearing an iron mask. Collaboration software has been expanding to take advantage of the hardware. We’re almost at the point where software will become the determining factor for success.
The surge in video conferencing brought on by COVID-19, which drew in almost everyone, is a good indication of the potential adoption of, and challenges for, AR/VR. A dozen different video-conferencing systems are in general use, with Zoom and Microsoft Teams leading the way.
Each of those systems lives within its own walled garden, with most providing a browser-based interface in addition to dedicated applications. All that’s needed is a sufficiently powerful PC. Microphones, speakers, cameras, and even touchscreens are optional. All are required for the full effect.
Entering a metaverse is more of a challenge, because having the right hardware is mandatory. Moreover, the sophistication of the hardware and matching software can significantly affect the experience. Unlike cars, where minimum standards guarantee a baseline experience, AR/VR hasn’t reached the point where differences in resolution, frame rate, and the like are insignificant.
Moving from one metaverse to another is a bit of a hassle these days, since it means at least switching software. The bigger challenge will be the differences in controls and experience within the virtual reality of the chosen environment. Unlike a car’s steering wheel, the way to navigate through a metaverse is often unique to the application or the virtual environment.
As with machine learning, digital twins, and self-driving vehicles, knowing what AR and VR are, how they can be applied, and when to take advantage of them requires a good understanding of the technology.