Apple isn’t often known for pioneering new technologies these days. Whether it’s folding smartphones, sophisticated audio tech or hidden cameras, Apple is rarely the first to market.
Instead, it waits. Years go by, the technology matures, and once competitors have released several versions of something, Apple swoops in with its own take: sophisticated, powerful, stylish and always very expensive.
Years after the first boom in virtual reality, Apple has finally played its hand and launched its first VR headset. It may be late to the party, but Apple has followed its usual playbook, and its first step into virtual reality is the best headset we’ve seen yet.
So what’s so good about Apple’s first take on virtual worlds? We asked David Reid, a professor of AI and spatial computing at Liverpool Hope University.
A fusion of realities
Unlike some previous attempts at usable virtual reality headsets, Apple’s Vision Pro uses a mixed reality format. This means the headset blends the virtual and real worlds rather than blacking out everything around you.
With a headset on, you can control a virtual floating monitor on your desk, play a game in your living room, or essentially make the real world a little more fun.
“The big selling point here is that Apple is looking to expand the meaning of the metaverse. It’s built into the world,” Reid says.
“There’s a concept known as spatial computing. It’s the idea that machines can hold and manipulate references to real objects in the world, and that’s what this headset is doing.”
That doesn’t mean the headset can’t do full virtual reality. By turning a dial on the side of the headset, users can change how much of the world around them is blocked out.
With access to both virtual and mixed reality, Apple is trying to give you the best of both worlds. They’re not the first to do this, but they’re the first to do it with this much processing power.
Power on top of power
Where Apple really stands out from the crowd is inside the headset. Apple says the Vision Pro uses two separate chips: one for processing graphics, running vision algorithms and powering software, and another focused entirely on processing input from the cameras, sensors and microphones.
With this dual setup, new images reach the displays within 12 milliseconds. In theory, this should result in a smoother virtual reality experience, with no jarring lag to pull you out of it.
“Essentially, it’s more powerful than a MacBook Pro. It’s basically like having a computer strapped to your face, and it even offers new ways to browse the internet,” says Reid.
“As creators and professionals, we will be able to do things that we cannot do on traditional computers. That’s where it will lead the way in VR.”
Beyond its powerful processors, Apple has introduced many other distinctive features. Gone are the controllers, replaced by Apple’s powerful eye tracking, which can determine exactly what you’re looking at.
The device has 12 cameras and five sensors. These track hand gestures and map the surrounding environment, and two of the cameras send approximately one billion pixels per second to the displays, recreating the real world around the user.
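As a rough sanity check on that one-billion-pixels-per-second figure, the arithmetic is just resolution × frame rate × number of cameras. The per-camera resolution and frame rate below are illustrative assumptions, not Apple’s published camera specs:

```python
# Back-of-the-envelope passthrough bandwidth estimate.
# The per-camera resolution and frame rate are assumptions for
# illustration only, not Apple's published specifications.
width, height = 3000, 2000   # ~6 megapixels per passthrough camera (assumed)
fps = 90                     # assumed passthrough frame rate
cameras = 2                  # the two main passthrough cameras

pixels_per_second = width * height * fps * cameras
print(f"{pixels_per_second / 1e9:.2f} billion pixels per second")  # ≈ 1.08
```

Plausible camera specs in this ballpark land right around the billion-pixels-per-second mark, which illustrates why a chip dedicated solely to sensor processing is useful.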
For an extra level of immersion, Apple has built speakers into each side of the headset. These dual-driver audio pods sit right next to your ears, analysing the acoustic properties of the room and adjusting the sound to suit the space.
“This thing has a dozen cameras and a complex tracking system, which requires a lot of processing power, but Apple has managed it. And Apple has added a lot of other high-end features on top,” says Reid.
The motion sickness problem
One big problem has plagued designs across every generation of virtual reality: motion sickness. Unsurprisingly, when two small screens sit inches away from your eyes showing fast-moving footage, people tend to get a little queasy.
So has Apple fixed this? “It’s better, but still not ideal,” says Reid. “The main problem with VR sickness is vergence-accommodation conflict (VAC).”
VAC is essentially the problem experienced when the brain receives mismatched cues: the eyes converge on a virtual 3D object at one apparent distance, while they must focus at the fixed focal distance of the displays.
This happens in virtual reality because of how close the displays sit to the eyes. There is currently no complete solution, but many companies (including Apple) are working on technologies to address it.
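To make the conflict concrete, here is a minimal sketch of the two mismatched signals, using assumed values for the interpupillary distance and the optics’ fixed focal distance (neither is an Apple spec): the vergence angle the eyes adopt for a virtual object, and the resulting focus mismatch in diopters.

```python
import math

IPD_M = 0.063          # assumed average interpupillary distance (63 mm)
DISPLAY_FOCAL_M = 1.3  # assumed fixed focal distance of the headset optics

def vergence_angle_deg(distance_m: float) -> float:
    """Angle the eyes rotate inward to fixate an object at this distance."""
    return math.degrees(2 * math.atan((IPD_M / 2) / distance_m))

def accommodation_conflict_diopters(virtual_distance_m: float) -> float:
    """Mismatch between where the eyes converge (the virtual object) and
    where they must focus (the fixed display plane), in diopters."""
    return abs(1 / virtual_distance_m - 1 / DISPLAY_FOCAL_M)

# A virtual object rendered 0.4 m away: the eyes converge as if it were
# close, but focus must stay at the display's fixed focal plane.
print(round(vergence_angle_deg(0.4), 1))               # ~9.0 degrees of vergence
print(round(accommodation_conflict_diopters(0.4), 2))  # ~1.73 diopters of conflict
```

The closer virtual content sits to the viewer relative to the optics’ focal distance, the larger that diopter mismatch grows, which is one reason VR design guidelines discourage placing objects very near the user’s face.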
“Apple has worked hard to reduce motion sickness as much as it can. By reducing lag and latency and using high-quality displays, Apple has created a best-in-class headset for motion sickness,” says Reid.
Motion sickness can still occur, and for some people it will always be a problem in virtual reality. But Apple seems to be ahead of the curve when it comes to minimising the problem.
About our expert David Reid
David Reid is Professor of AI and Spatial Computing at Liverpool Hope University. He specializes in virtual reality and the growing world of the metaverse.