I’ve had the Apple Vision Pro for almost three weeks, and it’s not an easy product to talk about briefly.
Apple’s foray into the world of AR/VR headsets is… interesting. CGP Grey called it the most interesting tech product that he’s purchased in his entire life. Which says something given that he has purchased a nontrivial amount of technology. I find it interesting on a couple of levels, and I’m going to frame my thoughts into two groupings I will call The Bits (the technical aspects) and The Feels (the emotional experience). There’s overlap, but I think the split helps frame how I interact with and think about the device.
But first… the mindset. This is a $3500 (or more) early adopter product. It’s a brand-new hardware design running the first version of a new, eye-and-gesture-driven operating system. That doesn’t mean I expect it to be shoddy, but it means I anticipate it will have a few bugs and things might not always work the way I expect. Just because things don’t work as I first expected doesn’t (necessarily) mean that they’re wrong. In the 1980s I learned to compute with a keyboard. In the 1990s I supplemented that with a mouse. In the 2000s I began using touchscreen mobile interfaces. And now, a quarter of the way through the 21st century, I’m learning a new paradigm as I look at Apple Vision Pro for photographers.
The Bits
Viewport
I’ve heard a few folks comment that even if the internal screens are great (they are), it must be odd to look at the outside world through a display of what’s outside the headset. While this might be a new concept, as photographers we’re already used to it.
Your modern camera has an electronic viewfinder, doesn’t it? Looking “through” the Vision Pro is very similar to holding a mirrorless camera to one’s eye and looking “through” to the world. I haven’t used other VR headsets, but the consensus among those who have is that the Vision Pro’s displays are far higher in resolution and quality than those of any other device on the market, and the technical specifications of the internal screens bear that out. It looks best in well-lit areas. In low light you can see a bit of the graininess (digital noise) that we’re used to as photographers, but even in dimmer situations I found it to work well.
If you’ve looked at the world through a modern electronic viewfinder, looking at the world through Apple Vision Pro will be a familiar experience.
One quick aside: You’ll notice foveated rendering in screenshots of the visionOS interface. This means that where the eyes are focused is sharp, and other areas are blurry. Our eyes work the same way. There are some hoops to jump through for clearer screenshots, and I haven’t yet made those leaps.
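If you're curious how the idea works in the abstract, here's a toy sketch of foveated rendering in Python. This is my own illustration with made-up numbers, not anything resembling Apple's actual pipeline: the renderer keeps full resolution inside a small radius around the gaze point and spends fewer samples as angular distance (eccentricity) grows.

```python
def sample_density(eccentricity_deg, full_res_radius_deg=10.0):
    """Toy foveated-rendering falloff: full resolution near the gaze
    point, decreasing density farther away. The radius and falloff
    curve are illustrative values, not Apple's implementation."""
    if eccentricity_deg <= full_res_radius_deg:
        return 1.0
    # Simple inverse falloff beyond the "foveal" radius.
    return full_res_radius_deg / eccentricity_deg

for deg in (0, 10, 20, 40):
    print(f"{deg:>2} deg from gaze -> {sample_density(deg):.2f}x resolution")
```

Since screenshots capture a single frame of this output, everything away from wherever the wearer happened to be looking comes out soft.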
The Lenses
I will likely pull this section out into a separate blog post once the issues here are resolved.
I wish I could say that my Zeiss prescription inserts paired smoothly with the device and all was well, but that’s not the case. My inserts arrived a few days after the headset. With the inserts in, things get clearer because of the prescription, but there’s a pairing process with the headset so that the eye tracking features can adjust to the additional lenses. With the lenses in the headset, visionOS detects that there are new inserts and prompts you to pair them by looking at a custom QR code printed on the lens packaging.
When I tried to pair my lenses, after a couple of seconds I got a big white “X” in the display and then a message that the pairing had failed.
I tried multiple times, in various lighting conditions, with both sizes of the light seal cushions. It failed consistently every time. As I mentioned above, I’ll expand on this section once there’s a resolution, but the abbreviated version of what’s happened as of publication:
- Lots of time on the phone with various levels of AppleCare
- Being the first-ever Genius Bar customer for a Vision Pro issue at my local Apple Store
- Apple exchanging the headset for an entirely new one (after the exchange, the issue persisted)
- Lots of additional time on the phone with AppleCare
- Apple/Zeiss shipping me an entirely new set of custom lenses and associated QR code (with the new lenses, the issue persisted)
- Sending Apple various screenshots and photos of what I was experiencing, the lenses, and the QR codes
- Submitting all this information to Zeiss
At this point I’m waiting for Apple to contact me with the next steps. My understanding is that my issue is being watched closely by a lot of Apple folks and has been escalated to engineering, with Zeiss also being roped in for support. Incidentally, John Gruber had this same issue as noted in footnote 3 of his review. That point has been part of my discussions with Apple.
Although Apple has been very proactive about working toward a resolution, we don’t have one yet. Folks on my list will be the first to know.
The Weight
A lot has been said about how heavy the Vision Pro is and whether that was going to be bothersome.
Photographer Rick Sammon has quoted his father as saying “If a picture is so boring that you notice the noise, it’s a boring picture.” I’ll morph that to say that if a headset is so heavy that you notice the weight, it’s too heavy.
I haven’t really noticed the weight.
Photography Functions
Viewing Photos
Browsing and viewing photos with the Apple Vision Pro is awesome. I don’t just mean that in the way that we say “awesome” to mean that something is cool, nifty, dope, or whatever term the kids are using these days. I mean it in the way that viewing photos at the size you can with the Vision Pro gives me a sense of awe.
Many folks call up a professional photographer mistakenly thinking they’ll want 8×10″ prints (if they’re ordering prints at all). Professional photographers know that bigger is better. Even 16×20″ isn’t that big when hanging on a wall in most offices or living spaces. But who am I kidding? While big wall prints do look fabulous, in 2024, most photos are viewed on smartphones or other small digital displays. Apple Vision Pro turns that up to eleven. As I sit in a chair and look at images through the Vision Pro, I’m seeing them in a view that makes them appear as if they’re 7 feet tall.
One might think that the images that large would appear grainy or blurry. But they don’t. It turns out that Apple’s pretty good at this computational photography stuff.
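A bit of back-of-the-envelope math hints at why this works. The figures below are my assumptions, not official specs: roughly 3,660 horizontal pixels per eye across a field of view of about 100 degrees, and an image rendered to appear 7 feet tall from about 8 feet away.

```python
import math

# Assumed figures (not official Apple specs): ~3660 horizontal
# pixels per eye over a ~100-degree horizontal field of view.
pixels_per_eye_horizontal = 3660
fov_degrees = 100

pixels_per_degree = pixels_per_eye_horizontal / fov_degrees
print(f"~{pixels_per_degree:.0f} pixels per degree")

# A virtual photo that appears 7 feet tall from ~8 feet away
# subtends 2 * atan(3.5 / 8) of the vertical field of view.
apparent_height_ft = 7
viewing_distance_ft = 8
span_deg = math.degrees(
    2 * math.atan((apparent_height_ft / 2) / viewing_distance_ft))
print(f"Image spans ~{span_deg:.0f} degrees vertically")
```

Under those assumptions the image occupies a large but comfortably resolvable slice of the display, which is consistent with what I’m seeing: big without looking blocky.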
I’d be remiss if I talked about viewing photos and didn’t call out panoramas. We’ve been able to capture panoramic images easily since iOS 6 was released over 11 years ago. And of course photographers have been able to stitch together traditional images into panoramas manually. But viewing panoramas has always been… awkward. If you view them on a phone, or a computer monitor, the image fills up a narrow strip in the middle, at a relatively small size. If you zoom in, you lose a bunch of the image to a hard-edged crop at the side of your display. You can’t really appreciate the image because of the limitations of your viewing technology.
If I view a panoramic image such that it’s 7 feet tall, and very wide, that story changes. I’m looking at the center of the image by default, and I can turn my head to the side to see even more. Or I can reach and gesture to “move” the virtual image horizontally. Everyone I’ve spoken with who has used an Apple Vision Pro says they’re going to start capturing more panoramic images, and I’m right there with them.
Photo Editing
Apple’s visionOS Photos app doesn’t include the editing features that are present in the macOS, iOS, and iPadOS versions of the app. You are able to crop an image, or use the iOS “Markup” tools to annotate, but that’s it. If you’re looking for adjustments to exposure, contrast, or any sort of retouching… those features are simply missing. At this point we can’t say whether Apple doesn’t want you to edit images with Apple Vision Pro, or whether these features simply couldn’t be completed in time for visionOS 1.0. My suspicion is the latter. I wouldn’t be surprised to see photo editing, with new visionOS paradigms, arrive as a feature of visionOS 2.0, which will likely be announced in June at Apple’s Worldwide Developers Conference.
Until Apple updates visionOS to include native photo editing features, you’ll need to use third-party applications. Adobe Photoshop Lightroom (the iPad version) runs quite well on visionOS. A few of the controls are a bit touchy (small) for eye gaze manipulation, but overall I found it to be a decent experience when I tried it out. I also had a good experience using Photomator, whose controls seemed to be a bit less dense and thus more eye-friendly.
Here’s Lightroom in action:
Although there aren’t native visionOS photo editing tools, the third-party offerings you likely know and love seem to work fine. And that’s just the story right now, at launch. I would expect in the next six to nine months we’ll see some interesting photographic developments (pun intended) from various photography software vendors. I’ll be surprised if there’s nothing about Apple Vision Pro at Adobe MAX in October.
As a Camera
Given the number of cameras in the Apple Vision Pro, it’s not a surprise that you can use it as a camera to record either spatial photos or video. The top button (on the left side as you wear the device) brings up the camera app. Pressing the button then captures a photo or starts recording. You switch between photo and video modes by selecting them from the onscreen display.
The photos are… okay. Here are some examples, unedited other than resizing.
One question that’s been raised with the device (along with Google Glass, the Meta glasses, and others) is a question of privacy: could someone secretly record you with an Apple Vision Pro? Not easily, at least not if you’re looking at them. I made a video to show what it looks like when someone captures a still image or a video with the device:
Serious iPhoneographers become savvy with the built-in camera app, but many also use third-party camera apps for various reasons. Apps such as Halide, Obscura, and Photon provide nuanced camera controls that might intimidate a casual photographer but provide specialized functions for those in the know. There are no camera apps for Apple Vision Pro at this point; my understanding is that there simply aren’t APIs for developers to build them.
The consensus among various industry folks seems to be that it’s due to privacy concerns. The wearable nature of Apple Vision Pro makes this a very personal device, and presumably a camera app could capture images, intentionally or unintentionally, that include personal moments. Do I think this means we’ll never see third-party camera apps for visionOS? Not at all. Given the privacy concerns, this is an area where Apple can’t have serious flaws in the initial implementation, and I’m fine waiting a couple of versions of visionOS for them to figure out what that means.
If you’ve watched screen recordings from someone wearing the Vision Pro, you’ve probably noticed that head movement can knock things out of level or result in shaky video. The camera app features a level displayed on an image of the shutter button for still images, and there’s a crosshair-and-circle pattern displayed for video. These provide realtime feedback to help capture imagery that makes it look like you have your head on straight. Want to ensure your screenshots have that same level-ness? There’s an app for that. But the nature of the device on our never-quite-still heads means that it’s less than ideal as a capture device for still photos.
Photography Future
I’m beta testing a couple of applications specific to photography. I can’t share too much yet but think photographs that augment your reality. Fun stuff!
General Computing & Entertainment
It’s a Big-Ass Mac Display
I like big monitors and I cannot lie. Whether it’s editing photos, watching a video, or browsing Mastodon, if you give me the screen space I will use it all. My home office setup includes a big external monitor connected through a CalDigit Thunderbolt dock to my MacBook Pro. I also use the display on the MacBook Pro on a Twelve South HiRise stand at my desk. Those are Amazon links; as an Amazon Associate I earn from qualifying purchases.
You can use the Apple Vision Pro to create a huge virtual display for an Apple Silicon Mac. Not enough room in Photoshop for your image and all the toolbars and palettes that you’d like? Run it through Vision Pro:
The Vision Pro acts as the display, but you continue to use your existing peripherals with your Mac. As shown in the video just above, you can use any keyboard, trackpad, mouse, or even a Wacom tablet. The only time I found it to get funky was when I needed to look down at the keyboard for something I couldn’t touch-type. The angle of looking down coupled with some dimming resulted in a bit of fumbling. I suspect it’s the sort of thing that will become less of an obstacle as I spend more time with the device.
Using it as a big-ass Mac display at your home or office is one thing, but it can also be a big-ass Mac display elsewhere.
Hotel rooms or conferences come to mind. Whether you want a huge view of your company’s financial spreadsheets or want to have a big rehearsal for your Keynote presentation (yes, there’s a native visionOS version… more on that below), you’re set. All work and no play makes Aaron a dull(er than usual) boy, so don’t underestimate the entertainment value as well on the road. The Vision Pro gives you a large screen device that you can place anywhere. Don’t like that the hotel TV is directly in front of the bed? Sit on your hotel balcony and put the display floating in the air. Or bring it down to a cabana and fire up a window next to the pool.
Eyes and Pinches
An eye-and-pinch-driven operating system is a whole new thing. It took only a few seconds to understand how it all worked once I began using it. Actions such as selecting, opening, scrolling, and more have become second nature in the last two and a half weeks. Interestingly enough, while basic operations are at the point where I don’t have to consciously think about them anymore, the one interface challenge I’m still having may come from being too efficient.
More than once, I’ve found myself looking at something, wanting to select it, but before I tap my fingers to make the selection, my eyes have already moved onto the next thing I want to look at onscreen. As a result, the selection doesn’t happen, because I’m now looking elsewhere (and may have just selected something else entirely). Our eyes move faster than our hands, but our brains move faster than our eyes. I don’t know what would’ve been written on blogs or posted to YouTube had those existed in the first few weeks that mouse-driven computing hit the mainstream world, but I feel like we’re at a similar point with the eye-and-gesture spatial computing environment of Apple Vision Pro.
visionOS currently requires you to look at something as you select it. As it turns out, we don’t always do that. That said, this issue seems to be the exception rather than the rule, and the more that I use the device, the smoother this all becomes.
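The failure mode is essentially a timing race, which can be sketched as a toy model. This is illustrative Python, not visionOS code, and the element names are made up: gaze samples arrive on a timeline, and the system selects whatever the gaze rested on at the moment the pinch lands.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    t_ms: int      # timestamp of the gaze sample
    target: str    # UI element under the gaze at that moment

def selected_target(gaze_log, pinch_t_ms):
    """Toy model of gaze-plus-pinch selection: the system acts on
    the most recent gaze sample at or before the pinch timestamp."""
    current = None
    for sample in gaze_log:
        if sample.t_ms <= pinch_t_ms:
            current = sample.target
    return current

# The eyes reach "Send" at t=100ms, but by the time the fingers
# pinch at t=260ms the gaze has already jumped ahead.
gaze_log = [
    GazeSample(0, "Compose"),
    GazeSample(100, "Send"),   # user intends to select this...
    GazeSample(220, "Inbox"),  # ...but the eyes have moved on
]
print(selected_target(gaze_log, pinch_t_ms=260))  # → Inbox, not Send
```

The pinch lands on whatever the eyes reached last, which is exactly the "I already moved on" behavior I keep running into.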
It’s Not Just Eyes and Pinches
Using the Apple Vision Pro, your eyes direct what we’ve formerly known as an insertion point or a cursor. Pinching your fingers together is essentially equivalent to a click. But you can also use a traditional physical mouse, keyboard, or trackpad if you pair them with the headset via Bluetooth.
I haven’t found a compelling reason to use an external trackpad.
If you’re going to type anything more than a sentence at once, you’ll need the external keyboard (or rely on dictation). While I can appreciate the midair virtual keyboard that Apple’s engineers created in visionOS, it’s not something anyone will use for any significant typing. It’s fine for a one-line text message or typing a few form fields. For anything longer, you’ll want a physical keyboard. My current thoughts are that with the external keyboard, this device could likely replace my laptop for shorter travel.
As I think about input device possibilities, I have a concept for reinventing something we already know and love, but with a spatial twist. I’ll share more in a followup article or video.
It’s a Personal Theater
I’m far from a cinephile. In the last couple of years I’ve probably watched more hours of Ted Lasso and the Portland Timbers than everything else combined. That said, having explored watching entertainment through Apple Vision Pro, I must say that the headset easily provides the best experience possible at home. When I’m watching a TV series or a soccer match, it’s going to be through Apple Vision Pro. Folks go to theaters to see a movie BIG on a giant screen. When I recently needed to replace my parents’ television, I found that the size we needed for the space (no bigger than 40″) was smaller than the smallest option at my local Costco. Apparently size matters.
You can now watch movies, television, or online video in your home on a virtual screen that appears to be as big as a traditional movie theater. You can do that without having to deal with traffic, wondering why the person behind you is coughing so much, or paying $8 for a soda.
The big caveat to that statement is that it’s the best experience possible… for one person. The wearer of the device can have a huge virtual screen and amazing spatial audio, but that experience isn’t shared with someone else sitting next to them on the couch. If you want to sit closely with a partner watching something romantic, or sit with a couple friends watching your favorite sports rivalry, that’s not in the feature set for the first generation of the device. For that communal watching experience, you’ll need to rely on a traditional television and your existing media sources.
If you asked me on release day if I would take my Apple Vision Pro with me on an airplane, my immediate answer would’ve been that I wouldn’t unless I had a specific reason to take it and use it at my destination. It seemed like even if the experience was solid, it wouldn’t be worth the hassle (and space) to bring another device along. If you ask me now? I’ll tell you that I’m seriously considering it for an upcoming trip. It’s that good.
Biographer Walter Isaacson wrote about a conversation prior to Steve Jobs’ death:
‘I’d like to create an integrated television set that is completely easy to use,’ he told me. ‘It would be seamlessly synced with all of your devices and with iCloud.’ No longer would users have to fiddle with complex remotes for DVD players and cable channels. ‘It will have the simplest user interface you could imagine. I finally cracked it.’
I don’t know if a headset with screens and virtual projection was how Jobs thought he “cracked it” but… for personal viewing? Apple Vision Pro has cracked it.
Spatial Productivity
How well Apple Vision Pro works for productivity depends on what productivity means to you. For me, it’s photography, writing, speaking, and managing life in general.
I mentioned various photography functions above, but want to revisit things from a productivity angle and ask the question: do I foresee Apple Vision Pro being a key piece of equipment for my workflow as a professional and hobbyist photographer? I’d say it’s an emphatic yes when it comes to browsing, culling, and viewing images. Shortly before writing this section, I used Photoscope to clean up some near-duplicate images by reviewing and selecting the best and then deleting the rest.
I’ve already used the device to perform some general photo editing. I anticipate this will continue especially when traveling. At home, I doubt I’ll see improvements over my existing workflows.
I’m waffling on whether I’ll use this as a writing environment at home. While I can see using it on the road, I’m not sure that using Apple Vision Pro in my home office provides significant advantages over my well-designed, well-known environment with multiple monitors and other accessories.
We’ve known the traditional modes of our presentation software. You’re either in the slide builder mode where you construct and edit the presentation, or you’re in the presentation mode where the slides are projected and you can view a presenter window. The visionOS version of Keynote features an entirely new mode for rehearsing presentations. Keynote’s rehearsal view allows you to practice giving your presentation in one of two environments: the Steve Jobs Theater, or a conference room. You get to see the presenter view confidence monitor screen and can move it wherever you like.
This is pretty damn neat, and is something I’ve already used in preparation for a talk.
I think Casey Neistat nailed it with his observation starting at 5:07 in this video (it should start playing at that point).
A discussion about productivity on Apple Vision Pro wouldn’t be complete without highlighting one major productivity failure of the device: you can’t drink coffee from any sort of “regular” coffee mug. I’m baffled at how developers are even writing code for this. I write this with my tongue seriously in my cheek, but the fact that drinking essentially requires a straw shows how the headset forces changes to long-established patterns.
The Feels
It’s hard not to compare this brand-new tech to our existing tech. Whether it’s hardware design or software polish, my just-released Apple Vision Pro gets subconsciously compared to my iPhone 15 Pro Max and my MacBook Pro. Consciously, I am realizing similarities with the launch of the Apple Watch.
If you recall when the first Apple Watch hit the market, the hardware was slow. It was marketed as a fashion accessory, and Apple wanted us to know that we could send a “digital touch” to our friends. In time, software developers and watch-wearers found that the device is fantastic as a fitness tracker and for providing rich notifications and light interactions alongside the iPhone. The hardware got better. I feel like the Apple Vision Pro is in a similar infancy. Right now we (and Apple) think we know how we’re going to use the device, but in a few years, we will likely find some of those purposes abandoned and entirely new ones that have developed.
Doggone Peculiar
Do you know who really hasn’t figured out quite what to make of Apple Vision Pro? My dog.
Our big ol’ labrador is convinced that my pinches and hand movements are gestures for him, and he wonders why I don’t seem to care if he sits or lies down. He’s none too pleased when my spatial computing doesn’t lead to a meatspace treat.
He probably also thinks I look like a dork when I wear the headset but hey… join the club, buddy.
Fitting In
How does this fit into my computing environment, and where will I (and it) go from here?
Every day, I use a computer, a smartphone, a tablet, and a smart watch. A few times per month I’ll watch something through a streaming box to a smart TV. How does a headset fit into this mix? I can’t tell you how it will be in six months or a year, but I can share where things are as I approach three weeks. In no particular order:
- I doubt I will ever watch something by myself on my Apple TV again. Whether I’m watching a feature movie, a comedy series, or a sportsball game, the experience through Apple Vision Pro far exceeds that of my living room.
- I doubt I will regularly use it as a virtual display for my Mac when I’m in my home office. I know that I will regularly use it as a virtual display for my Mac when I’m on the road.
- That said, I’m wondering if I need to bring that Mac at all for certain types of trips.
- Browsing and viewing images on my big desktop screen feels cramped. It’s fantastic on Apple Vision Pro.
- It’s becoming my platform of choice to watch YouTube using Christian Selig’s Juno app. Will YouTube eventually make their own app? Probably. Am I happy to support Christian until they give me a compelling reason not to do so? Of course.
- I didn’t really touch on it above, but using it for video communications with the Persona feature is… funky. I’m looking forward to seeing this get cleaned up because once it does, the ability to be present on a video call while also being mobile (and not holding your iPhone in front of you) is promising.
Going into my purchase of Apple Vision Pro I figured it would be pretty cool right now, and that I’d be excited for the possibilities of what software developers would do once they started building apps that take advantage of the capabilities of visionOS. After three weeks with the device, I feel like that sentiment hasn’t changed. It’s pretty damn cool right now, and I can’t wait to see where the platform is in six to twelve months. Apple Vision Pro holds a lot of possibilities for photographers and other creators.
What questions do you have? Drop me a comment below.