Apple’s Reality Pro headset explained: What do AR, VR, and MR mean?

Apple’s about to unleash its most important new product in a decade. Reality Pro, as it is rumored to be called, will be a VR headset like none other. No, I mean an AR headset. Wait, make that an MR headset with a dial to switch between AR and VR at will…

Despite years of headsets from big-name corporations like Google, Microsoft, Sony, Facebook/Meta, and more, most people still don’t understand AR, VR, or MR. Before Apple unleashes Reality Pro on our faces, let’s clarify what these things mean and discuss what we can expect from Apple’s new platform.

Virtual Reality

Let’s start with the easy one. Virtual Reality (VR) products put screens in front of your eyes with lenses to expand their field of view. There are typically two screens (or one divided in half), one for each eye, each rendering the scene from that eye’s point of view to produce a true 3D effect.
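The per-eye trick can be illustrated with a little math. The toy sketch below, which assumes a simple pinhole camera model and an illustrative interpupillary distance (the numbers are not from any real headset spec), shows how the same 3D point lands at slightly different horizontal positions for each eye, and how that horizontal offset, or parallax, shrinks as objects get farther away. That difference is what your brain reads as depth.

```python
# Toy sketch of stereo rendering: the same 3D point projects to slightly
# different horizontal positions for each eye, producing binocular depth.
# IPD and focal length are illustrative assumptions, not real specs.

IPD = 0.063   # typical interpupillary distance in meters (assumed)
FOCAL = 800   # focal length in pixels for a simple pinhole camera model

def project(point, eye_x):
    """Project a 3D point (x, y, z in meters, camera looking down +z)
    into pixel coordinates for an eye offset horizontally by eye_x."""
    x, y, z = point
    return (FOCAL * (x - eye_x) / z, FOCAL * y / z)

point = (0.0, 0.0, 2.0)             # a point 2 meters straight ahead
left = project(point, -IPD / 2)
right = project(point, +IPD / 2)
disparity = left[0] - right[0]      # horizontal parallax in pixels
```

Nearby objects produce large disparity and distant ones almost none, which is why stereo headsets feel convincingly 3D up close but flat at a distance.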

The headset’s movement and position are tracked with gyroscopes and other sensors so that what you see on screen adjusts with every little movement of your head. To make you feel fully immersed and avoid motion sickness, this has to happen extremely rapidly, with a very low “motion-to-photon latency” (the time between a movement of your head and that change appearing on the displays).
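The latency budget is tight. A back-of-the-envelope sketch, using a commonly cited comfort threshold of roughly 20 ms (a rule of thumb, not an official spec) and purely illustrative stage timings, shows how quickly the milliseconds add up at a 90 Hz refresh rate:

```python
# Back-of-the-envelope motion-to-photon budget. The ~20 ms comfort
# threshold is a widely cited rule of thumb, and the per-stage timings
# below are illustrative assumptions, not measurements of any device.

COMFORT_THRESHOLD_MS = 20.0

stages_ms = {
    "sensor sampling": 1.0,    # IMU read + sensor fusion
    "pose prediction": 0.5,
    "render frame":    11.1,   # one full frame at 90 Hz
    "display scanout": 5.5,    # about half a frame, on average
}

total = sum(stages_ms.values())
within_budget = total <= COMFORT_THRESHOLD_MS
```

Note that rendering a single frame already eats more than half the budget, which is why real headsets lean on tricks like pose prediction and late-stage reprojection to stay comfortable.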

The end effect is that it looks like you’re in a completely computer-generated place. It can be cartoonish or realistic, but everything–the environment and everything in it–is made of computer graphics. That’s VR: no more real world, all virtual world.

Example products: Meta Quest, PlayStation VR, HTC Vive, Valve Index

VR headsets such as the Meta Quest 2 obscure your view of the world and replace it entirely with computer graphics.


Augmented Reality

Augmented Reality (AR) doesn’t replace the real world with a virtual one but rather incorporates digital objects into the real world. This can be done on a screen, like in an AR application on your iPhone, in a headset, or using a pair of glasses.

The basic idea is that you’re seeing the real world, but computer-generated things are added to it: icons, arrows, text, floating video screens, people, whatever.

An important note is that the graphics have to appear to be present in the real world. An AR object can stay put in the same location in the real world as you move around it, and may even interact with real-world geometry like the ground, tables, walls, or even people. Simply showing floating graphics on a transparent display in front of your eyes is not AR.

AR requires at least some rudimentary 3D mapping of your immediate environment, which is typically accomplished using multiple cameras and sensors like LIDAR. Many products bill themselves as AR but are more accurately called “heads-up displays” because they cannot actually integrate computer graphics into the real world. These products don’t make a 3D map of your surroundings to integrate graphics into it; they just float things in front of you. Google Glass, Nreal Air, and Vuzix Blade 2 are examples of this.
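The distinction is easy to see in code. In the toy sketch below (pure math, no real AR framework), a world anchor is stored in world coordinates, so its position on screen shifts as your head moves, exactly as a real object would. A heads-up display element is stored in screen coordinates and never moves at all:

```python
# Sketch of the difference between an AR anchor and a heads-up display.
# A world anchor lives in world coordinates, so its on-screen position
# shifts as the camera (your head) moves; a HUD element lives in screen
# coordinates and stays put. Purely illustrative pinhole-camera math.

FOCAL = 800  # pinhole focal length in pixels (assumed)

def screen_pos(world_point, camera_x):
    """Project a world point for a camera at (camera_x, 0, 0) looking +z."""
    x, y, z = world_point
    return (FOCAL * (x - camera_x) / z, FOCAL * y / z)

anchor = (0.0, 0.0, 2.0)   # virtual object pinned 2 m ahead in the room
hud = (100.0, 50.0)        # HUD element fixed in screen coordinates

before = screen_pos(anchor, camera_x=0.0)
after = screen_pos(anchor, camera_x=0.1)   # step 10 cm to the right

# The anchored object's screen position changes with head motion,
# while the HUD element is the same tuple before and after: still `hud`.
anchor_moved = before != after
```

Real AR systems do far more (occlusion, lighting, surface detection), but keeping objects pinned in world space as you move is the baseline that separates AR from a floating overlay.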

Pokémon Go is often cited as an AR app, and this often gets conflated with the fact that you play it out in the real world. That’s just location-based gaming, not AR. There is one small AR portion of the game: When you are throwing your Poké Ball to catch a Pokémon, you can toggle an “AR mode” that shows your target Pokémon in the real world on your phone screen. That part, and that part alone, qualifies as AR. (And most players turn it off.)

An AR headset could look just like a VR headset, completely blocking your view of the outside world, if it uses external cameras to feed real-time video of your surroundings to the screens within. This is how Apple’s Reality Pro is expected to work, and it contrasts with something like Microsoft HoloLens, which has a transparent visor with an integrated display, so you view the real world itself directly.

Still, as long as you’re seeing the real world around you in real-time, with computer-generated graphics integrated into it and not just floating in front of you, that’s considered augmented reality.

Example products: Microsoft HoloLens, Magic Leap, and the upcoming next-generation Snap Spectacles qualify as real AR glasses or headsets.

Microsoft’s HoloLens 2 is a true AR headset, but Apple’s device is said to be far more capable.


Mixed Reality

Here’s where it gets confusing. Apparently, augmented reality isn’t augmented enough, so the industry has coined a term it can’t even seem to clearly define, meant to separate real AR from all those heads-up displays, floating virtual screens, and phone-based AR apps.

Look up mixed reality (MR) and you’ll find a dozen different definitions full of not-very-precise phrases like, “brings together the real world and the virtual world in a more immersive way,” or “allows multiple users to interact together in a virtual space.” Some say it requires you to manipulate virtual objects with your own hands, while others say mixed reality can use controllers.

Intel says MR “brings together real world and digital elements” (which is what AR does). “It provides the ability to have one foot (or hand) in the real world, and the other in an imaginary place, breaking down basic concepts between real and imaginary, offering an experience that can change the way you game and work today.” Poetic, but not especially helpful.

Meanwhile, Microsoft says, “It liberates us from screen-bound experiences by offering instinctual interactions with data in our living spaces and with our friends.” I don’t know what an “instinctual interaction” with data is, but it sounds incredibly dull. Microsoft adds: “People may not even realize that the AR filters they use on Instagram are mixed reality experiences.”

So even among the biggest technology companies, there seems to be no agreement on how to define MR, or on what minimum standards qualify a product or experience to be labeled mixed reality.

Here’s a good way to think of it: Mixed reality is a superset of augmented reality. It’s AR with a higher minimum standard of interaction and immersion. It’s viewed with your own eyes (either through clear glasses or a headset with passthrough video from external cameras) rather than displayed on a phone or tablet screen. And it doesn’t just display information integrated into the real world, it allows you to interact with those virtual objects and for the objects to interact with the environment.

“Smart glasses” like Google Glass cannot integrate graphics with the real world; they just float info in your field of view.


What to expect from Apple Reality Pro

From what we know of Apple’s headset, it will provide an unparalleled VR, AR, and MR experience. It’s said to be a headset not unlike a pair of ski goggles, which will obscure your view of the outside world.

Multiple cameras will feed in a real-time view of the world around you, and a fleet of sensors will create a 3D map of your surroundings so that virtual objects can integrate and interact with it. You’ll control apps and manipulate objects with your hands, and internal sensors will even detect your eye movements.

A dial on the outside, similar to the Digital Crown on an Apple Watch, will let you control how much of the outside world you wish to see. Turn it one way and the world is blocked out, leaving only virtual environments: VR. Turn it the other way, and you see the complete real world (fed to the headset displays from multiple external cameras), with computer-generated graphics integrated (AR/MR).
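How the dial works under the hood is pure speculation at this point, but conceptually it could be as simple as a crossfade between two image layers: the passthrough camera feed and the rendered virtual scene. The toy sketch below illustrates that idea with a linear per-pixel blend; the function name, pixel values, and the blending approach itself are all assumptions for illustration, not anything Apple has described.

```python
# Speculative toy sketch of a "reality dial": a linear crossfade between
# the passthrough camera feed and a rendered virtual scene. Nothing here
# reflects Apple's actual implementation; it only illustrates the concept.

def blend_pixel(passthrough, virtual, dial):
    """dial = 0.0 -> full passthrough (AR/MR); dial = 1.0 -> full VR."""
    return tuple(
        round((1 - dial) * p + dial * v)
        for p, v in zip(passthrough, virtual)
    )

real_pixel = (200, 180, 160)    # RGB sample from the camera feed
virtual_pixel = (20, 40, 200)   # RGB sample from the rendered scene

full_ar = blend_pixel(real_pixel, virtual_pixel, 0.0)   # camera wins
full_vr = blend_pixel(real_pixel, virtual_pixel, 1.0)   # graphics win
halfway = blend_pixel(real_pixel, virtual_pixel, 0.5)   # a mix of both
```

In practice a headset would more likely expand or shrink the virtual environment’s coverage of your view rather than fade every pixel uniformly, but the dial-as-a-blend-factor idea is the same.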

We’ll know more about the Reality Pro’s capabilities when it is formally announced, likely on June 5 at the WWDC keynote. If the current rumors are true, it will set a new high bar for quality, detail, immersion, interaction…and price, possibly costing as much as $3,000.

