Apple Vision Pro Hands-On and 2nd Impressions: I Was (Partially) Wrong
It’s like Apple tried to make the future with today’s tools
Many months ago — eight, in fact — Apple announced Apple Vision Pro, a marvel of technological engineering and prowess that left me with some concerns, mainly about where this device would fit in my life. While I can’t answer that question yet — I’ve only used Apple Vision Pro for about a day now, on and off — I can say one thing: Apple Vision Pro is mind-blowing. When you first put it on, it’s off-putting, behaving unlike any other computer you’ve ever used. It’s deeply uncanny, making even the most seasoned computer users second-guess their every move. It takes a second to think about what you want your new computer to do, assess how you should make it do that thing, then perform the gesture to make it happen — three steps to perform one action. But after a while, Apple Vision Pro shines, becoming a gadget you simply feel immersed in. Time flies when you’re wearing Apple Vision Pro — minutes become hours and hours become afternoons. Apple Vision Pro is the most fun you’ll have using a computer in years. It’s like the iPhone all over again.
With that being said, the product doesn’t feel complete, and neither does it feel like it’s at its best. Apple has a lot of work to do: The software, visionOS, is half-baked, and the hardware is heavy, hot, and disorienting. There is no storage included in the box, the included bands are uncomfortable, and the cameras are lackluster at best. Even just placing the headset on your head is jarring: You have to unscrew the Fit Dial or undo the hook-and-loop fasteners — depending on the type of band you’re using — every time you want to use it, then once it’s on your head, you have to make adjustments to ensure weight is balanced properly across your cheeks and forehead. It’s finicky. The field of view, somehow, is narrower than that of other virtual reality headsets on the market, breaking immersion, and Apple Vision Pro is essentially unusable in the dark.
But you’ll notice that in that list, most of the glaring issues I have are with the hardware, not the software, and that is because visionOS — though it lacks strong application support from third-party developers as well as basic features like Home View customization — is great. The operating system feels beautiful and intuitive. Though it takes some getting used to after setup, you’ll quickly get the hang of the gestures and eye tracking after just a couple of hours, so much so that using Apple Vision Pro becomes second nature. You’ll find yourself zipping through the OS with no quibbles, using your hands and eyes to navigate through screens and across menus. visionOS is the Apple-iest software on any head-mounted computer, and I have many thoughts about it.
Over just one day, I’ve compiled how I feel about Apple Vision Pro so far. Here are those thoughts and feelings.
Setup
When you first get your Apple Vision Pro out of the box, it comes assembled with the Solo Knit Band, a fabric strap that loops around the back of your head with a “Fit Dial” that you loosen or tighten. You take Apple Vision Pro out of its cradle by holding the main unit’s aluminum frame, not the Light Seal — the fabric part that attaches magnetically to the main unit — or the speaker pods. This is because the Light Seal is prone to snapping off, since the magnet is relatively weak, which will result in you dropping the headset. This arrangement makes Apple Vision Pro a pain to carry and makes the $200 optional travel case a necessity. Apple Vision Pro does not come with any way to store it in the box: just a cover that slips over the front glass to prevent scratches and fingerprints when the device is not in use.
Inside, you’ll additionally receive an extra, thicker Light Seal cushion and another band — the Dual Loop Band — along with a nice keepsake booklet instruction manual, 30-watt wall charger, braided USB Type C cable, and detached aluminum battery. The battery is the second nuisance in the box, though I understand why it’s separate. Once you remove the cardboard inserts that keep the headset secure during shipping — and there are a lot of them — you need to plug the battery into a port on the left side of the unit, just to the right of the speaker pod. The connector at the end of the battery’s cord is shaped like a circle with an aluminum cap, and the cap has an LED that glows white when the headset is in use. The port on the headset has two gray circles around it, painted onto the silicone of the speaker pod band: a hollowed-out one and a fully filled-in one. You align the LED on the battery connector with the hollowed-out circle, press the connector into the port (which feels spring-loaded), then twist the cord a quarter turn clockwise to align it with the filled-in circle. You’ll then hear a snap, and the LED will blink white, indicating that it has been successfully connected. The spring then decompresses, and the aluminum connector pops out of its well, protruding from the body of the headset.
The battery itself is heavy and about as large as an iPhone 15 or 15 Pro, albeit much thicker. I’d say it’s about three iPhones thick, with rounded corners and edges and an embossed Apple logo at the top. Once you’ve secured the battery, you loosen the Fit Dial located on the right side of the band by twisting it counterclockwise. You then slip the Solo Knit Band over the back of your head, aligning the headset’s displays in front of your eyes and allowing the nosepiece — which seems like a piece of delicate, thin fabric that stretches around your nose — to rest on top of your nose. The device itself is heavy, with pressure distributed across your forehead, cheekbones, and neck. You’ll then hear a slightly modified version of the Mac’s boot-up chime, hearkening back to the product’s older, 40-year-old sibling. From there, the displays light up, with passthrough mode enabled by default, showing you your surroundings.
You’ll then see a cursive “hello” appear in your space, about 3 feet away from your current position, at around eye level. Below the animation — which plays in different languages similar to iOS and macOS — is a Get Started label, which prompts you to click the Digital Crown at the top-right of the headset to begin setup. Once you click it (it feels just like the Digital Crown on an Apple Watch or AirPods Max), visionOS prompts you to look at your iPhone to sign in with your Apple ID and connect to Wi-Fi. When you look at your iPhone, you probably won’t be able to read anything because the passthrough quality isn’t spectacular, as I lamented earlier and will describe later. However, you’ll see a sheet similar to the one you see when your iPhone discovers unpaired AirPods, although, in Apple Vision Pro’s case, the sheet is black with an Apple Vision Pro graphic. You tap the Get Started button, then continue looking at your iPhone through Apple Vision Pro. An augmented circle will appear around your iPhone, and it will slowly fill in with white until setup is complete. After that, visionOS signs into your Apple ID automatically, similar to when setting up a new Apple TV.
After preliminary setup is complete — including restoring from a backup if you have one — visionOS prompts you to adjust the distance between the displays to prevent visual discomfort. The screen gently fades to black, and two green, hollow circles appear in the center of your view, mimicking the shape of the displays on Apple Vision Pro. Text appears below the circles, telling you to press and hold the Digital Crown to adjust interpupillary distance, or IPD. Unlike other VR headsets, IPD is measured and adjusted automatically, with internal sensors and motors mounted inside Apple Vision Pro. You simply hold the Digital Crown down, and the circles on-screen will move closer together. Once the circles are filled, visionOS will guide you to double-click the Digital Crown to complete the adjustment. Then, a green checkmark will appear, indicating that IPD has been set. Once IPD is set, visionOS will enable passthrough again and display a medium-sized white dot in the center of your view, instructing you to look at it and pinch your thumb and index finger together. When you look at the dot, it becomes smaller, and when you tap your fingers, it animates. Then, six dots appear in a circle at the center of your view, and visionOS guides you to look at each one and tap your fingers together to begin setting up eye tracking. Once you’re done, visionOS adds a translucent layer of white to your surroundings, simulating brighter lighting, and then displays a series of orange-colored dots in a variety of shades, prompting you to look at each one and tap your fingers together. The same process repeats in even “brighter” lighting with teal dots; your surroundings are entirely invisible by this point, filled with white.
After basic setup is complete, visionOS shows you an instructional video illustrating how to use the basic gestures of visionOS: tapping, scrolling, and looking. Your eyes act as the cursor, and your fingers act as the buttons on the mouse. When you want to select something, look at it — and only it — then tap your index finger and thumb together. To scroll, you tap your fingers together and move your arm up or down, or right or left depending on how you want to scroll. After that, a welcome script appears, and visionOS takes you straight to the Home View. Welcome to Apple Vision Pro.
Hardware
The first thing you’ll notice from the second you put on Apple Vision Pro is the quality of the passthrough feature, where the cameras mounted outside of Apple Vision Pro display your surroundings through the two screens. Passthrough is grainy on Apple Vision Pro and also feels weirdly out of focus. It’s extremely jarring, especially if you’ve never worn another VR headset. It looks nothing like real life — colors are washed out, there is noise all over the frame, and most noticeable of all, there is an extreme amount of motion blur. Wearing Apple Vision Pro and moving around is not a pleasant experience. The entire frame jitters around, and up-close objects, like your iPhone, look fuzzy and out of focus. When you look toward the edges or have Apple Vision Pro fitted slightly lower than it should, there is an enormous amount of green tint and chromatic aberration coming from the sides of the displays. The color distortion is uncanny.
The next thing you’ll notice is the limited field of view. If you’re guided by the marketing materials, drop everything you know and listen to me: it’s nothing like Apple’s advertisements. When you first put Apple Vision Pro on, it’s like looking through a tunnel, where what would usually be your peripheral vision is covered by black borders. The borders, in my experience, are shaped like a capsule or oval, with a semi-transparent dividing line where your nose would be when you’re looking without Apple Vision Pro. The limited field of view, which I’d say is about 100 degrees horizontally — lower than the Meta Quest 3’s 110 degrees — breaks immersion for me and makes me feel like I’m looking through a headset rather than heavy glasses. The field of view issues coupled with the low-quality passthrough cameras make the “reality” part of Apple Vision Pro lackluster at best.
These hurdles are fundamental version-one hardware issues that I’m sure will be addressed in future versions. But that’s the entire story of this headset: a glimpse at the future with today’s tools. Today, batteries are large, unwieldy, and thick, so it’s impractical to put a battery inside the already heavy Apple Vision Pro. So, Apple had no choice but to detach it from the main unit and enclose it in a thick aluminum chassis. The battery makes you stop and think whenever you find yourself veering off into the immersiveness of Apple Vision Pro. You’re always conscious of where it is, if you need to pick it up, and if you’re pulling on the cord. In practice, I’ve found it best to just place the battery in your left pocket, even when stationary, as you won’t be worried about it when using Apple Vision Pro. However, be careful to take it out of your pocket when you’re done using the headset: walk away while it’s still tethered and you’ll drag the headset off with you, a mistake that will cost you over $2,000 to fix. (Yikes.)
Even without the battery, Apple Vision Pro is an extremely heavy piece of machinery. With both bands, the main unit rests on your cheekbones and nose and hugs your forehead tight. If you use the device for more than 30 minutes at a time and then take it off, you’ll feel some residual pressure on those areas. It’s not tenderness, per se — it didn’t hurt — but it feels like squeezing. It takes a second for you to return to normal; you still feel like you’re wearing the headset even 10 minutes after you take it off. That’s how heavy it is. The headset also generates a lot of heat, thanks to the M2 and R1 systems-on-a-chip and fans. When Apple Vision Pro is on and you use it for an extended period, you can feel warm air coming out from the exhausts at the top of the main unit, as well as some heat on the aluminum frame. My head didn’t feel hot, even after using the device for hours on end, but when I took it off, my head was sweaty, especially around my forehead. I wouldn’t consider the heat something that makes the headset feel uncomfortable — unlike the weight — but it’s strange.
The headbands themselves are disappointing. The pre-installed one, the Solo Knit Band, was the most comfortable for me, though many others I’ve seen disagree. It hugged the back of my head well, keeping the headset secure while moving around, and distributed the weight equally across my face. When I tried the Dual Loop Band — which has both a top strap and a back strap — I found that the back strap kept the headset from sitting lower on my head, leaving a gap between the Light Seal and my forehead. It did feel lighter on my cheeks, though, presumably because the weight was instead placed on the top of my head. But while it was comfortable, the headset didn’t fit right.
The Solo Knit Band is also easier to take on and off, though it’s an involved process for both bands — it’s nothing like how it’s portrayed in the marketing materials published by Apple. When you want to put on Apple Vision Pro, the Solo Knit Band is probably already tightened, so you have to rotate the Fit Dial to loosen the band, slip the headset on, and then tighten the band again by twisting the dial in the opposite direction — clockwise. Putting it on takes some practice in itself: At first, I tried to come in from under the headset and then slip on the band, but I’ve now defaulted to raising the headset above my head, stretching the band behind my head, then situating the main unit onto my forehead and cheeks before tightening the band. Then, I usually have to move the headset up or down — up if there is too much pressure on my cheeks, and down if the displays look blurry — which I usually do while I wait for visionOS to start up. Apple also says you should move the band up and down to distribute weight evenly across your face — the natural position of the band is to sit higher than your ears, but you want the band to be at ear level while also not being lower than the ears, since that would bend them.
For the first few times that I used Apple Vision Pro, the displays were extremely blurry, to the point where I couldn’t see anything but the user interface of visionOS — and even looking at that was unpleasant. To address this, I had to recalibrate the IPD in Settings. I did this a couple of times, and each time I did, I heard the motors move, indicating that the IPD was being adjusted. It turns out that as you change how the strap fits on your head, the displays shift relative to your pupils, and the system has to recalibrate. Failure to do so might make you go cross-eyed, which is what happened to me for the first couple of minutes. But even after I recalibrated my IPD, I still wasn’t able to see anything close up through the passthrough mode. This continues even as I write this; it seems like the minimum focal distance of the cameras that power Apple Vision Pro is around a foot, maybe a foot and a half from your face. I still can’t read my iPhone very clearly — I can make out large letters, like titles, but the passthrough seems good only for looking at people and your surroundings.
In low-light conditions, the passthrough fails. It still “works,” but that’s using a very broad definition of the term. visionOS never disables passthrough, so in the dark, the system just adds artificial light, which amplifies the grain. It’s incredibly off-putting, but luckily, this can be addressed by using an Environment. And in very bright conditions, like looking into a light, the displays automatically dim to protect your eyes. Looking at some external displays reveals flickering due to mismatched refresh rates. Passthrough is rendered in high dynamic range, though, so you can see both out a window and inside your home at the same time, for example.
Again, this is not a full review, and I haven’t tested every aspect of the hardware in every imaginable condition — I’ve only had an Apple Vision Pro for about a day. But one thing is for certain: Apple Vision Pro’s hardware screams “version 1.0” all day long. As you’ll hear in the visionOS section of these second impressions, this device is oddly futuristic — in a good way — but its hardware limits it beyond belief. The hardware, while not disappointing, has a ways to go, but for now, it acts as a hindrance to the fantastic software and feature set of this device.
visionOS
After you take off Apple Vision Pro, you’ll usually keep the battery plugged in unless you need to charge it, in which case you can disconnect it and charge it separately. (I haven’t done enough extensive testing to determine battery life. That is coming in the full review.) Apple Vision Pro’s displays are only powered on when (a) the battery is plugged in — obviously — and (b) Apple Vision Pro is on your head and, importantly, can see your eyes. When not in use, Apple Vision Pro stays on for about a minute — indicated by the battery connector’s LED staying lit — syncing apps and downloading data before it goes to sleep, similar to AirPods Max. (There’s no affordance for Find My, since visionOS doesn’t have Find My functionality.)
There is no power button to wake Apple Vision Pro — you just put it on, and the system wakes from sleep mode nearly instantaneously. There is no Apple logo when you wake it up, either; there is only one when you boot Apple Vision Pro. The only way to turn off the headset entirely is by disconnecting the battery — and connecting the battery again immediately powers on Apple Vision Pro. There is no way to make the device go to sleep while it’s on your head, and there is no way to make it fully turn off while the battery is connected — at least not that I’ve seen. Additionally, immediately disconnecting the battery will not prompt visionOS to save any of your data. There is no state restoration if you disconnect the battery — it’s equivalent to a shutdown on macOS, but with no warning to save your data because there is no reserve battery in the headset itself.
Once you place Apple Vision Pro on your head, the passthrough functionality kicks in. After the colors adjust — they start blue-ish and then are tuned to your environment — a window pops up around the bottom of the screen with an Optic ID logo. You do not have to look at this window for Optic ID to authenticate you; Optic ID scans your irises automatically and logs in without you having to lift a finger — no taps, no button clicks. Optic ID is configured during setup and is easily one of the most delightful experiences on visionOS. Configuring it takes about two seconds and all you have to do is look at a target placed in a window. Once it’s set up, it never fails unless your Apple Vision Pro is placed too high or too low on your head.
If Apple Vision Pro is fitted incorrectly, either with the wrong IPD, position on your face, or some other reason, visionOS will give you an alert notification before you can proceed with Optic ID. When you’re in motion or if the sensor array is covered — for example, if you have the front cover on, like when your Apple Vision Pro is not in use — you’ll get an alert that reads: “Tracking failed,” complete with an orange icon; if your surroundings are too dark, you’ll get an alert that your hands will not be cut out and overlaid on your application windows; if your Apple Vision Pro is too high, too low, too close, or otherwise misaligned with your eyes, you’ll get a notification allowing you to proceed, but asking you to adjust your device; and if your IPD is incorrect, you’ll be immediately taken to the display adjustment view to readjust, which is compulsory since not doing so can cause sickness.
Once Apple Vision Pro is unlocked, no Home View is displayed by default — it’s just your surroundings, akin to Transparency Mode on AirPods Pro. To show the Home View — a grid of alphabetically organized applications — you press the Digital Crown once, which spawns the Home View anywhere that you look. You can create a Home View anywhere you want, including on the ceiling or the floor, and once an app is opened, you can create a new Home View somewhere else to open a new app. The Home View isn’t customizable at all — there are no folders, no app re-organization, known colloquially as “Jiggle Mode,” and no ability to delete default, pre-installed apps — and is only organized by app name from the second page onward, with the first page dedicated to Apple apps and a folder of iPad apps running in Compatibility Mode. Swiping between pages just requires a tap of your fingers and a flick to the left or right, with dots appearing at the bottom of the view to indicate how many pages there are, similar to iOS. Looking at an app changes its icon’s appearance slightly, creating a subtle 3D effect where individual pieces of the icon are separated from each other with a minor drop shadow. It’s similar to the animation you see when selecting an app in tvOS but made for 3D.
visionOS, from the get-go, feels like the love child of macOS and tvOS: it’s an unholy combination that nobody would’ve ever expected, but that works remarkably well, feeling natural and intuitive. Like on tvOS, looking at a control makes it pop, with a subtle highlight around the button shape. Simply glancing past one casts a small glimmer of light onto it, too. Tapping on an icon by pinching your fingers compresses the button slightly before it bounces back, performing an action and dimming the icon while the action is in progress. But like on macOS, your eyes are an extraordinarily precise method of interaction, similar to a mouse cursor but just slightly less precise — a mouse cursor is exactly one pixel, whereas your gaze is many pixels. The effect is uncanny: Unlike on macOS or tvOS, you have to be actively looking at the element you’d like to select. Since Apple Vision Pro is a computer at the end of the day, you’ll tend to veer off, distracted by the icons next to the one you want to select — and if you tap while glancing at one of them, you’ll select that element, not the one you actually want. A simple glance won’t do — you have to actively look at what you want to tap while you’re tapping it. For the first couple of hours, it’s jarring, as everything is new to you. For example, I found myself accidentally pinching my fingers together while gesturing as I spoke, tapping whatever window I happened to be looking at. But after you settle in, it feels like second nature — like how a computer should always function. You make fewer mistakes and work more intentionally.
Your space is infinitely large in visionOS — even more expansive than your actual living space, because looking out through a window (definition №1: an opening in the wall that is fitted with glass to admit light) allows you to create a window (definition №3: a framed area on a screen for viewing information) that is as large as a tree. By default, visionOS apps open extremely large, taking up as much room as they want, but that’s because they’re far away when you open them for the first time. To move a window, you look at the bottom of it, where there are two icons: a dot, and a bar, arranged horizontally. Looking at the bar highlights it, and from there, you can tap your fingers together to move it left, right, up, down, and along the z-axis (back and front). To perform a z-axis move, you extend your arm outward or inward, but you don’t have to move it so much that it’s uncomfortable — only slightly. Like a physical object, moving a window outward (away) increases its size, and moving it inward decreases its size — both by an equal scale factor, respecting the aspect ratio of the window. To change the aspect ratio, you look at the bottom left or right corner to make the bar form around the corner. Then, you can tap and drag diagonally in any direction to change how wide, narrow, tall, or short a window is. To close a window, just tap the dot. It’s just like macOS, or iPadOS with Stage Manager enabled.
Once you place a window, it doesn’t move anywhere, even if you move. This means that you could place a window in the living room and walk to the kitchen, and the window would remain in its location in the living room. As you move closer to a window, it doesn’t “jiggle” around, even slightly. Windows feel like physical, translucent objects in your space, with the “glass” visual material and subtle drop shadows added to the floor or a table beneath a window. Using the augmented reality mode on iOS — even with iPhones 12 Pro and beyond equipped with lidar sensors — isn’t perfect, with virtual objects appearing to float around weirdly or make micro-movements as the camera moves, but that isn’t the case on Apple Vision Pro. It’s spectacular and just something you have to see for yourself. If you or someone else walks through a window, it fades and becomes semi-transparent, and then returns to its near-opaque state once there are no obstacles in view. Funnily, visionOS ignores physical objects as obstacles — it only prioritizes people — meaning that you can push a window into your bed or couch, for example. All parts of the window remain visible, with the window appearing overlaid on the physical object. Looking at a window from behind shows a white, almost opaque box, and looking at one from the side shows a razor-thin side profile of the window which you can look at up close.
Authenticating with Optic ID, tapping app icons, closing windows, taking screenshots, and opening the Home View all play unique, delightful sound effects. The one played while authenticating with Optic ID sounds like a key unlocking a padlock — it’s the equivalent of skeuomorphism for sound and is intensely satisfying. Other sounds are familiar: the screenshot sound is from macOS and the battery charging and keyboard clicking sounds are from iOS. These sounds come from the speaker pods by default, but you can change that to connected AirPods if you’d like. Control Center is where you adjust audio settings, put the device into Travel Mode or Airplane Mode, change Wi-Fi and Bluetooth connections, record the screen, enable Mac Virtual Display, AirPlay to an Apple TV, or adjust the Environment you’re in. (You can also adjust audio and Environments by turning the Digital Crown and looking at the element you’d like to change.) Navigating to Control Center on visionOS is one of the weirdest interactions on the operating system: You move your head up toward the ceiling and look for a translucent dot with a chevron pointing down on it. You then tap that, then tap the Control Center button. Control Center also shows you your battery percentage, the time, and the date.
visionOS is a delightful operating system to use — period. From the get-go, it looks entirely different than macOS or iOS because it’s designed to fit within your space, blending in with your surroundings. But as soon as you get used to it, you realize where it gets its roots: macOS and iOS. Users of those two platforms will feel at home on visionOS because visionOS is a natural extension of them, with cues taken from tvOS, watchOS, and iPadOS. The Mac feels like an extension of my limbs when I’m working on it — it’s that natural to me — and the iPhone feels like a piece of paper that morphs into whatever I want. It doesn’t feel like I’m navigating a computer when I use those two operating systems because I’m so in tune with their every move. I know what is going to happen if I click a button on macOS or tap an icon on iOS — and that familiarity carries over to visionOS with no extra training.
The tactile feedback that you feel when you physically press your iPhone screen or the haptic feedback you feel when you click a Magic Trackpad — that feedback is now your fingers touching together on visionOS. The “pinching” gesture is an intuitive combination of clicking down on a mouse and tapping a touchscreen with your finger. The “pinch-and-swipe” gesture has the inertia of the two- or three-finger swipes on Magic Trackpads with the intuitiveness of scrolling on iOS. visionOS is a seamless, natural, palpable blend of all of Apple’s operating systems, and that’s what makes it so fantastic. It’s a five-star interface, and though it lacks some basic features like app organization, editing in the Photos app, or turning off Spatial Audio (weird), it’s an astoundingly delightful interface to use — and hyper-futuristic.
My only gripe is this: visionOS has a steep learning curve. When you first use visionOS, there’ll be a temptation to move your hands up and tap things. Sometimes, you’ll be looking at something, reach for something different, and tap it — only to realize you’ve tapped the control you were looking at, not the one you reached out for. These mistakes — if you can call them that — come after decades of life, not just computer use. Our eyes have never worked as a pointing device; throughout human history, our gaze has always been independent of our limbs. On visionOS, your gaze is an extension of your limb, and thus your arms, hands, and eyes are no longer independent of one another. You’ll want to go fast on visionOS, even when you first begin using it, but you can’t unless you train yourself to look at something and then tap it. Once you master looking and tapping, you’ll naturally place your hands in your lap as you move through visionOS, rather than raising them to tap things in space. The only time you’ll raise your hands is if you want to move a window or scroll — two actions that require arm movement rather than just hand or finger movement.
There is no better example of this irregularity — the one where you have to look and tap rather than reach out and tap — than the virtual keyboard on visionOS, which is easily the most grueling interface on the entire operating system. There are two ways to control the virtual keyboard: physically pressing “keys” with your finger in the air or looking at a key and tapping your fingers together like any other control. The first one is truly asinine; it’s genuinely remarkable how bad it is. The second one, however, requires some retraining: If you know your way around a keyboard well, it’s pretty simple to just keep your hands in your lap and quickly glance at each letter, though going ultra-quick will inevitably yield some mistakes. Regardless, text input on Apple Vision Pro is arduous, and that is being generous. You’ll be much better off connecting a keyboard via Bluetooth — there is no need for a trackpad since eye tracking is that good — as the virtual keyboard is a true pain. Alternatively, some text fields have a microphone icon to their left that you can look at to begin dictation — though dictation is about as accurate as an iPhone’s, which is to say, not very accurate.
This issue could be easily addressed with the device everyone has next to them while using Apple Vision Pro: their iPhone. On tvOS, where text input is even worse than on visionOS, Apple sends a notification to your nearby iPhone so you can use its keyboard to type. That isn’t an option on visionOS, and it should be. The iPhone’s keyboard is great — much better than Apple Vision Pro’s — and Apple should bring it to visionOS as another continuity feature. Relatedly, Optic ID doesn’t unlock your iPhone, and there are times when I just want to do something on my iPhone, even with the mediocre passthrough quality. When you swipe up on iOS to go to the Home Screen with Apple Vision Pro on, Face ID silently fails — the headset is covering your face. Hilariously, there is an option under Settings → Optic ID & Passcode that lets you bypass Optic ID on Apple Vision Pro if a nearby iPhone has been unlocked with Face ID, similar to the “Unlock with Apple Watch” feature on iOS. A future software update should make it work the other way around, too: unlocking Apple Vision Pro with Optic ID should unlock the iPhone you pick up. I predict that Apple will fix these glaring issues, but for now, they’re omissions that make visionOS feel incomplete.
Experiences
First, hand occlusion, or hand cutouts: When your hand is over a window or experience in visionOS and your room is sufficiently lit, Apple Vision Pro will dynamically track your hand and cut it out of the frame so you can see it over windows. Apple’s marketing materials make hand occlusion look superb and immaculate, but that is only partially true — though passthrough quality and motion blur are noticeable enough that occlusion flaws hardly register anyway. In bright lighting, the cutouts are jumpy, and you can see about half a centimeter of a “halo” around your hand when it’s over a window, but it’s barely noticeable in practice. Even in immersive Environments, you can see your hands and arms — but no other body part — and move them freely. Oddly, hand occlusion does not extend to a physical keyboard, so if you need to look at your keyboard to type, you’re out of luck — visionOS only cuts out your hands. But if screen recordings of visionOS make hand occlusion look poor, know that its flaws are imperceptible in practice and won’t bother you — even in the most immersive of Environments.
Speaking of Environments, there are six VR versions: Haleakalā, Yosemite, Joshua Tree, Mount Hood, the Moon, and White Sands. Two more are labeled “Coming Soon,” and there are five lighting effects that change the way passthrough looks by adding a filter: Morning Light (blue), Spring Light (red), Summer Light (yellow), Fall Light (amber/orange), and Winter Light (another shade of blue). You access these Environments by pressing the Digital Crown to go to the Home View, then looking at the sidebar mounted to the left and selecting “Environments.” The lighting effects are useless, but the VR Environments are by far some of the most impressive aspects of visionOS. Each one is in HDR, and you can see your hands inside of them — great for writing. You control your level of immersion — i.e., how much of the Environment you want to see — by rotating the Digital Crown clockwise, which shows a circle that fills in as you become more immersed. Partial immersion shows a circular portal into the Environment at the center of your gaze — you can recenter by pressing and holding the Digital Crown — and full immersion takes you entirely into the Environment.
Environments are 3D and come in light and dark variants, which you can toggle from Control Center. If you begin to move around in an Environment, Apple Vision Pro won’t immediately break immersion, but as soon as it detects that you’re even remotely close to an object, like a wall, it’ll automatically blend your surroundings into the Environment via passthrough. Similarly, if you’re too close to an object from the start, visionOS will display a warning and refuse to let you enter the Environment. I don’t think it’s possible to accidentally bump into anything while Apple Vision Pro is strapped to your head; it does a great job of blending your surroundings in, even when you’re fully immersed. Your arms are always cut out in every Environment, and if someone approaches you while you’re in one, they’ll break through, with a ghost-like halo appearing at their location. You can’t miss someone waiting for your attention nearby. It’s almost too good, in fact: If you’re sitting in a living room with other people, or perhaps on an airplane, passthrough will activate whether or not the person is looking at you, in my experience, meaning you’ll see everyone around you all the time unless you actively look away from them.
Third-party apps can also display their own immersive experiences, and two stood out to me: Disney+ and Night Sky. Disney+ has a handful of them: the Disney+ Theater, Avengers Tower, the Scare Floor, and Tatooine. They’re all simply incredible — you have to see them to believe it. My favorite is the Disney+ Theater, where you sit in a chair in a theater with a giant, expansive skylight on the ceiling to look up at. Behind you are more rows of seats, and to your sides are massive red curtains, just like a beautiful real-world theater. Your hands are still cut out, so it really does feel like you’re sitting in the chair, with everything from virtual cupholders to tables and armrests. When you start a film, the lights go dark and the video player takes up the entire front wall of the virtual theater. It truly is incredible and makes you go, “Wow.” No wonder Apple got Bob Iger, Disney’s chief executive, to demonstrate Disney+ at the Worldwide Developers Conference.
Night Sky, meanwhile, lets you sit in a planetarium that surrounds you — you can look up, down, left, and right. Tapping on constellations, stars, planets, and rockets brings up more information in a window, and you can even walk around and explore. The only downside to these experiences is that you can’t bring in other apps like you can with the default Environments. Even though they look similar, they’re more akin to full-screen apps, whereas the Apple-provided Environments are more like desktop backgrounds.
The second “revolutionary” experience is in the Photos app: panoramas. Since the iPhone 5 and iOS 6, iPhone users have been able to take panoramic photographs just by holding their phone up and sweeping it horizontally. On Apple Vision Pro, those panoramas can be viewed around you, engulfing your view. They’re still 2D images, but they wrap around your entire field of view — as much as you can see through Apple Vision Pro and then some, both vertically and horizontally. My strongest recommendation, even for those who aren’t intent on buying an Apple Vision Pro right away, is to take more panoramic photographs on your iPhone. They are easily one of the most mind-blowing parts of the headset, and truly something only Apple can do. Even panoramas I took years ago on iPhones with inferior cameras look fantastic on Apple Vision Pro and transported me back to when they were taken.
For example, years ago, I took a panorama at an ice hockey game on a field trip — from the stands, out toward the rink and the opposite stands, tilted slightly up. Viewed on my iPhone or Mac, it wasn’t that impressive: you could see distortion, you couldn’t take in the entire frame, and it was so narrow that you had to zoom in for the picture to span the width of the screen. On visionOS, I just tapped the Immersive button (it’s shaped like a panorama) at the top right of the picture in the Photos app, and it took up my entire view. It felt exactly like I was there, standing at my seat, looking into the arena, Amway Center. It was captivating, and a must-feel experience for anyone.
The other, perhaps slightly less impressive experience is Spatial Video, which you can capture on an iPhone 15 Pro or on Apple Vision Pro itself. I haven’t tried capturing one on the headset, but I did view the iPhone videos I took over the holidays. I see the appeal and why someone would enjoy them: Each video feels like a portal in time, opening up in a blurred oval that you can peer into to feel the depth effect more prominently. People feel like they’re right there, standing in front of you, but the resolution and frame rate still make it feel like a video at the end of the day. Spatial Videos are shot at 30 frames per second in 1080p, which is too low a resolution for footage viewed what feels like only half a foot from your face.
Additionally, shooting a Spatial Video without moving your iPhone very much is difficult, and after viewing just a couple of videos, I felt slightly queasy and had to leave the Photos app to get my bearings. This is due to the way Spatial Videos are viewed: they sit very close to your face and take up your entire field of view, so any camera shake translates into nausea. They’re fascinating to view — you feel like you can touch objects in them — and I certainly will take more of them for later viewing, but in my opinion, the more compelling experience was viewing panoramas.
Aside from immersive experiences like Environments, 3D movies, panoramas, Spatial Videos, and AR views — all of which I’ll discuss at length in articles to come — regular, 2D apps also run on visionOS, and this is where the “spatial computing” part of this computer comes into play. 2D apps are where Apple Vision Pro shines — VR headsets have simulated virtual environments convincingly for years, but no headset on the market provides a computing experience as riveting as Apple Vision Pro’s. As I wrote earlier, windows on visionOS are massive by default, taking up your entire room, but you can make them smaller and keep a bunch of more compact, normal-sized windows in your space — all 360 degrees of it. You can even place windows in separate rooms, so if you’d like to put Safari in the living room and Crouton in the kitchen, for example, you can do so. The locations of your windows persist across sessions, too, so if you take Apple Vision Pro off and put it back on later, it’ll remember where your windows were. If you lose your windows, no worries: you can recenter them by pressing and holding the Digital Crown. Recentered windows keep their positions relative to each other — if Safari was to the right of Mail, it stays to the right of Mail — but the whole arrangement moves to wherever you’re looking. You can look anywhere, recenter, and your apps will follow you.
There are two kinds of apps on visionOS: apps made for visionOS and apps made for the iPad or iPhone. Both work excellently, contrary to what spokespeople for YouTube and Netflix have suggested, but visionOS-optimized apps work best. Apps made for visionOS are built out of the translucent glass material, which looks like a semi-transparent gray with white text. The contrast changes depending on the lighting in your room, and text stays readable and sharp. Sidebars expand as you look at them, too, and elements pop well, blending in with visionOS’ design language. Moving apps is seamless: they don’t have fixed size classes, so they move and resize freely, just like macOS apps, adapting to whatever size you set. Some apps with large amounts of text, or ones that display webpages — like Safari, Notes, and Mail — show a white background under the main content area rather than the glass material for increased legibility. Your notes will always have a white background, for example.
Most importantly, if you look at an element in a visionOS-optimized app, the element gets a brighter highlight, indicating that the system knows you’re looking at it — important feedback to see before you tap. visionOS apps also have a unique inset-style scroll bar and use less color in their interfaces, because small amounts of color are hard to discern when they blend into your surroundings, creating low contrast. visionOS-optimized apps are where Apple Vision Pro feels at home, and the windows themselves feel like real objects in your space. It’s remarkable how fun interacting with them is. More on these in the full review.
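For developers curious how apps adopt that design language, here’s a minimal SwiftUI sketch — my own illustration, not Apple’s sample code, with hypothetical names like `ReaderWindow` — of the two behaviors described above: the translucent glass backdrop and the gaze-driven highlight.

```swift
import SwiftUI

// A hypothetical visionOS window demonstrating the native design language.
struct ReaderWindow: View {
    var body: some View {
        VStack(alignment: .leading, spacing: 12) {
            Text("Inbox")
                .font(.title)

            // Standard controls like Button receive the gaze highlight
            // automatically, so the wearer sees "the system knows I'm
            // looking at this" feedback before tapping.
            Button("Reply") { /* handle tap */ }

            // Custom views opt in to the same highlight explicitly.
            Text("Custom control")
                .padding()
                .contentShape(.rect)
                .hoverEffect(.highlight)
        }
        .padding(32)
        // The semi-transparent "glass" material that visionOS-native
        // windows are built out of.
        .glassBackgroundEffect()
    }
}
```

Notably, the system renders these hover highlights itself; apps are never told where the wearer is looking until they actually pinch to tap, which is why gaze feels safe to use as the pointer.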
The second kind of app is an iPhone or iPad app running in Compatibility Mode, meaning the app runs unmodified, just as it would on an iOS or iPadOS device. These apps work fine, and there isn’t much to write home about. At the top right corner of an iPhone or iPad app, you’ll see an ornament that lets you change the app’s orientation — interestingly, even for iPhone apps that don’t support landscape. Closing and moving these apps works the same, including the scale effect when you move a window along the z-axis. Apps optimized for the iPad work best, in my experience, as they’re larger and support more size classes, à la Stage Manager on iPadOS. You can have as many iPad apps open as you want, and they all live in the “Compatible Apps” section of the Home View, with squircle icons. (visionOS apps have circular icons.)
One interesting, and perhaps inconvenient, side effect of these apps is that, because there is no dark mode on visionOS, iPhone and iPad apps are always in light mode by default. If an app has its own dark mode setting, you can enable that — and I prefer to, since dark mode seems to blend better with visionOS’ UI elements — but some apps, namely Apple-made ones like Calendar and Reminders, don’t, and are permanently stuck in light mode, which is a shame. Also, if an app is optimized for the iPad’s trackpad, you’ll see UI elements highlighted in gray as you look at them — similar to hovering over them with the trackpad cursor on iPadOS. But if an app isn’t optimized, or is optimized poorly, elements won’t highlight. They’ll still work when you look and tap your fingers together, but it’s sometimes confusing. Regardless, most iPad apps on visionOS work sufficiently well, and I’d rather use them than the equivalent websites, even though Safari also works well, even with small touch targets.
visionOS supports trackpads and some Bluetooth keyboards, but not Bluetooth mice — let alone USB mice. These peripherals act just like they would on iPadOS, including support for keyboard shortcuts, Spotlight (⌘ Space), and gestures. I find a Bluetooth keyboard a necessity for any semblance of work on Apple Vision Pro, but a trackpad is less useful: looking and tapping, even for selecting text, is a more fun, more useful way of driving Apple Vision Pro than a trackpad. If you do connect one, the circular cursor moves freely between apps rather than staying pinned to the middle of your view.
The best way to work on visionOS is actually not on visionOS at all, but on macOS, using the Mac Virtual Display feature. To enable Mac Virtual Display, you can either (a) open the lid of your Mac laptop and unlock it, go to the Home View on visionOS, then tap the augmented Connect button that appears above the screen; (b) go to Control Center on your Mac, select Screen Mirroring, then pick your Apple Vision Pro from the list, like you would an Apple TV; or (c) go to Control Center on Apple Vision Pro, choose Mac Virtual Display, then select a Mac from the list. You’ll need a Mac running macOS Sonoma or later, and one with Apple silicon to get the highest, 4K resolution; Macs with Intel processors are limited to 3K. After you connect your Mac — which takes about two to three seconds — its built-in display goes black, and a large, visionOS window-sized display appears in the center of your view. Using the bar at the bottom, just like any other visionOS window, you can bring it closer, making the display appear close to 32 inches, or push it farther away to take up the size of a wall. You can additionally look at a corner and drag to make it larger, though you can’t change the aspect ratio — it appears to be stuck at 16:9.
The display itself has extremely low latency — I couldn’t tell the difference between it and any other external display. It’s certainly nothing like using AirPlay to an Apple TV, and I could easily edit a video or type with it. It’s also extraordinarily high resolution, with crisp text and visuals, just as if I had a massive external display connected to my Mac, and it supports HDR playback. The display is, however, ever so slightly translucent — I can’t make out details behind it, but it’s not entirely solid either. It’s not a big deal, even for tasks that require precise color control, but it’s not like having a rich, bright, real OLED panel in front of you; black levels are close but not quite those of a MacBook Pro’s mini-LED display. The refresh rate seems to be 100 hertz, so if you’re coming from a MacBook Pro display, it’s slightly different, but again, the difference is negligible if you’re actually working rather than nitpicking.
Mac Virtual Display is special, though, for one reason: it doesn’t take over visionOS. If you want — and you’ll want to — you can place the display in the center of your view, look to the top or side, and summon the Home View with the Digital Crown to open a visionOS app, like Messages. Then, you can use Universal Control to move your Mac’s cursor from macOS into visionOS and interact with the visionOS app just as if you had connected a keyboard and trackpad to visionOS directly — then move the cursor back to macOS and continue working. If you get a Messages notification, it appears on both visionOS and macOS; you can act on the visionOS one by looking at it as it slides in at the top, tapping your fingers together, then dragging your cursor over from macOS to begin typing in the Messages app with your Mac’s keyboard. It’s no less than pure magic.
Mac Virtual Display makes Apple Vision Pro a fantastic tool to work with. While you’ll want to use pure visionOS for entertainment and casual computing to enjoy its unique experiences, there is no better way to enjoy spatial computing than with Mac Virtual Display. It’s every single one of your Mac apps, as well as the familiar Mac experience, blended with the wonder and novelty of spatial computing. You can work anywhere you want — Joshua Tree, the Moon, or any other Environment — with as big of a low-latency, beautiful screen as you want, along with an unlimited amount of computing space because you can create as many visionOS windows as you’d like. It has been such a joy to use Mac Virtual Display, and it works spectacularly.
The experiences I’ve had on visionOS over just a day are phenomenal. Apple Vision Pro is mind-blowing, and everything you do on it feels like magic. It’s the most “Apple” a VR product has ever been. It’s an infinite, immersive canvas with continuity features and 3D experiences that impress.
Conclusion
If you think what you just read is a lot, you’re wrong — that’s the short part. Apple Vision Pro is (a) an entirely new piece of hardware unlike any we’ve ever seen from Apple, (b) an entirely new computing platform running a groundbreaking, futuristic operating system, and (c) a product with many use cases — too many to count. That is a lot of ground to cover, and I simply can’t do it all in just one day.
There is still so much I haven’t covered: Personas, EyeSight, killer apps, app development, use cases — the list truly goes on. And trust me, I will cover all of it. But for now, take this away from this extremely long piece: Apple Vision Pro is like Apple tried to make the future with today’s tools. Everything Apple could control about this headset feels straight out of the future. This is the future of computing. In 1984, the Macintosh opened an entirely new computing paradigm that would lead us to the World Wide Web; in 2007, the iPhone opened an entirely new computing paradigm that would lead us to the personal internet; and now, in 2024 — the Macintosh’s 40th anniversary — Apple Vision Pro is bringing the computer into our spaces.
Just as the iPhone never replaced the desktop personal computer, and the Macintosh never replaced heavy-duty servers, Apple Vision Pro will not replace any device Apple currently makes. Instead, Apple Vision Pro and visionOS open an entirely new world of computing. My mind is still racing with thoughts, and I’ll save most of them for later — after I enjoy my new slice of the future for just a bit longer.