With visionOS 26, the wonder of Apple’s AR is finally real

I have long complained that, for all its impressive hardware and software technology, the Apple Vision Pro is thus far a failure mostly because Apple doesn’t understand AR. Or Spatial Computing, or whatever it wants to call it.
That’s been the case since the Vision Pro arrived in February 2024, and the first major update, visionOS 2, did little to change it. But based on what Apple showed of visionOS 26 at WWDC, I feel like the company is finally starting to understand what a spatial computing experience should be.
Floating iPad window syndrome
Vision Pro is a device you’re supposed to wear on your head for long stretches, looking “through” it via camera passthrough, with rendered graphics incorporated into and onto real-world objects rather than just floating in front of you (the “heads-up display” approach of most smart glasses).
While Apple delivered all the underlying technology to make that happen, the finished product got in its own way at every turn. It’s too heavy to wear for very long, and the vast majority of apps are simply floating windows, as if Apple’s vision for “spatial computing” was filling your home with giant floating iPad apps.
With the exception of watching movies alone on a giant screen or the odd few-minutes-long spatial video release, the Vision Pro experience has been decidedly 2D. Sure, the 2D windows (apps, virtual Mac displays, whatever) float in 3D space, but the ability to integrate with the real world was entirely squandered.
visionOS 26 puts the space in Spatial Computing
With the new visionOS update coming this fall, it feels like someone at Apple got the memo about what apps are supposed to be like in augmented reality. I just don’t know why they weren’t developed this way from the start.
Widgets are now persistent 3D objects with customizable frames and depth. So your photos can look like actual digital picture frames, only they’re little windows into your spatial photos. You can essentially hang a calendar on the wall. Sure, it’s a calendar widget rather than the Calendar app, and it doesn’t look like a “real” calendar, but it’s at least a spatial experience.
The same goes for the Clock widget, which has new, detailed clock designs you can position flat against a wall so it looks something like a real clock, and the Weather widget, which looks a bit like a window.
I still think Apple should have a Clock app (not widgets) that gives you clocks of different sizes and shapes to place in your environment, from grandfather clocks to egg timers. But these changes at least show that Apple is acknowledging the obvious: spatial apps shouldn’t be 2D floating windows; they should be “objects” you place in your environment.
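For developers, it takes surprisingly little ceremony to join in. Here’s a minimal WidgetKit sketch of a wall-clock widget; WidgetKit itself is long established, while the visionOS 26 mounting and texture modifiers shown in the comments are based on what Apple previewed at WWDC and should be treated as illustrative rather than final API.

```swift
import SwiftUI
import WidgetKit

// One entry is enough: Text(entry.date, style: .time) keeps itself current.
struct ClockEntry: TimelineEntry {
    let date: Date
}

struct ClockProvider: TimelineProvider {
    func placeholder(in context: Context) -> ClockEntry {
        ClockEntry(date: .now)
    }

    func getSnapshot(in context: Context, completion: @escaping (ClockEntry) -> Void) {
        completion(ClockEntry(date: .now))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<ClockEntry>) -> Void) {
        completion(Timeline(entries: [ClockEntry(date: .now)], policy: .never))
    }
}

struct WallClockWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "WallClock", provider: ClockProvider()) { entry in
            // A self-updating time readout standing in for a real clock face.
            Text(entry.date, style: .time)
                .font(.system(size: 40, weight: .semibold, design: .rounded))
                .containerBackground(.fill.tertiary, for: .widget)
        }
        .configurationDisplayName("Wall Clock")
        .description("A clock you can pin to a wall.")
        // visionOS 26 additions previewed at WWDC (treat the names as illustrative):
        // .supportedMountingStyles([.recessed]) // sit flush against the wall
        // .widgetTexture(.paper)                // matte, poster-like finish
    }
}
```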
Of course, that’s just widgets. Apple is also building tools to create “spatialized” photos in Photos and Safari, as well as in third-party apps. Developers putting them in apps like Zillow should make the experience at least somewhat spatial, even though those apps are still just 2D floating windows.
Shared experiences
The other big problem with Vision Pro has been how solitary it is. Nobody else can see the digital objects you see, of course, but it was far too hard to have a shared experience with someone else even if they were in the same room as you, wearing a Vision Pro of their own.
With visionOS 26, you’ll at least be able to watch the same spatial movie in the same room at the same time, and developers can make their apps shared-area experiences too. This means not just you and a remote user looking at the same virtual chessboard, but two local users seeing the same virtual chessboard, in the same position and state, in the room you’re both occupying.
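The plumbing here is the same GroupActivities framework that already powers SharePlay; what’s new is that the system can align two headsets occupying the same physical room. A hypothetical chess app might define its shared session like this (the ChessMatch type and move message are my illustration, not Apple sample code):

```swift
import GroupActivities

// Hypothetical shared activity for a chessboard.
struct ChessMatch: GroupActivity {
    static let activityIdentifier = "com.example.chess.match"

    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Chess Match"
        meta.type = .generic
        return meta
    }
}

// Kick off sharing; the system prompts the user to start a session.
func startMatch() async throws {
    _ = try await ChessMatch().activate()
}

// Every participant — remote or, with visionOS 26, in the same room —
// receives the session and syncs state over its messenger.
func listenForMatches() async {
    for await session in ChessMatch.sessions() {
        let messenger = GroupSessionMessenger(session: session)
        session.join()

        // Receive moves from other players; MoveMessage would be a small
        // Codable struct of my own devising (e.g. from/to squares):
        // for await (move, _) in messenger.messages(of: MoveMessage.self) { apply(move) }
        _ = messenger
    }
}
```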
I don’t expect many households to spend what a single Vision Pro costs, much less two or three, but the core tools that let multiple users see and manipulate the same digital objects in the same space should have been there from the start.
Game controllers
Another area Apple seems to have seen the light is with game controllers. Look, real games, the kind you’d expect out of a several-thousand-dollar face computer, need more than just hand-waving. They need buttons and fast, millimeter-precise movement and orientation. In short, games need controllers.
Apple isn’t making its own controllers; it’s just adding support for Sony’s PSVR2 Sense controllers. That’s probably for the best, but it does mean Sony will have to start selling the PSVR2 controllers separately, which it doesn’t do now. And we don’t know what they’ll cost.
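On the software side, controllers reach apps through the long-standing GameController framework, so a Vision Pro game can presumably watch for a Sense controller the same way it would watch for any gamepad; the six-degrees-of-freedom tracking that makes these controllers interesting in mixed reality will come through additional visionOS APIs Apple has only previewed. A minimal sketch using the standard profile:

```swift
import GameController

// Listen for any controller connecting and read its inputs. The same
// GCController API covers today's gamepads; Apple says PSVR2 Sense
// controllers will surface through it as well.
final class ControllerWatcher {
    private var token: NSObjectProtocol?

    init() {
        token = NotificationCenter.default.addObserver(
            forName: .GCControllerDidConnect, object: nil, queue: .main
        ) { note in
            guard let controller = note.object as? GCController else { return }
            print("Connected: \(controller.vendorName ?? "unknown controller")")

            // Standard profile: thumbsticks, face buttons, triggers.
            controller.extendedGamepad?.valueChangedHandler = { gamepad, _ in
                if gamepad.buttonA.isPressed {
                    print("A pressed")
                }
                let x = gamepad.leftThumbstick.xAxis.value
                let y = gamepad.leftThumbstick.yAxis.value
                print("Left stick: (\(x), \(y))")
            }
        }
        // Begin scanning for wireless controllers.
        GCController.startWirelessControllerDiscovery {}
    }
}
```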
On the right track
Vision Pro still has a long way to go before it’s something I could recommend. There needs to be new, more affordable, and much lighter hardware, of course, but also quite a bit more on the operating system and app side. Way too much of the experience is still afflicted with “floating iPad app” syndrome.
But visionOS 26 at least gives me hope that Apple is starting to understand what Spatial Computing, or mixed reality, or whatever you wish to call it, is all about.
Sure, it has lots of quality-of-life improvements and refinements, but most of what Apple is building now seems to chase the idea that Spatial Computing is about persistent, shared digital objects, not about running 2D applications in windows hanging in the air.
The shift in philosophy has me genuinely excited to see where Apple will be in another year or two, with new hardware and visionOS 27 or 28. I haven’t felt that way since visionOS first shipped.