Four AR streaming shifts every business should watch

Cathy Davenport Lee
Kerry Justice
Lead Product Developer
With Apple’s Vision Pro now available in the UK and Google pushing ahead with Android XR, spatial computing has moved from hype to reality. But for streaming businesses, the most important story isn’t the hardware – it’s the changing ways people engage with content.

At Candyspace, we’ve been following this evolution closely, and 2025 is shaping up to be a pivotal year. Here are four clear shifts already underway that every streaming provider should be paying attention to.

1. Spatial content is starting to take hold

Most people using AR headsets today are still watching 2D content, just on a giant virtual screen – but that’s starting to change. Apple’s visionOS 2 supports spatial video natively, allowing immersive clips captured on iPhone 15 Pro or 3D cameras to be played back with real depth and dimension. The Vision Pro can simulate a 100-foot screen in your living room and lets you ‘pin’ screens wherever you want – watch Netflix while browsing Safari in a mountain cabin, or while flying above Mars. Early spatial clips, such as immersive Super Bowl segments, have also given us a glimpse of what’s coming: viewers report a lifelike sense of presence and perspective, hinting at a step change in how sport and live events could be consumed.

Google’s Android XR platform is also geared for immersive playback, with stereoscopic video support baked in. Its AI and interaction features are equally notable: integration with Gemini enables voice control of the UI, real-time look-and-search within scenes, and contextual queries.

As more creators and studios experiment with depth-aware production, spatial video is soon likely to move beyond novelty into something that audiences come to expect, especially in genres like sport, music, and behind-the-scenes features.

While full-length spatial series may be a way off, there’s real opportunity in short-form and companion content. For streaming businesses, starting with immersive extras or enhanced trailers could offer a low-risk, high-impact way to start learning what spatial storytelling looks like in practice.

2. Shared viewing is re-emerging in virtual form

Streaming has always had a social aspect – think live tweets, group chats, or second-screen commentary. But AR introduces the possibility of bringing people back into a shared viewing space, even when they’re apart.

Apple is developing shared virtual environments for the Vision Pro, and Meta has been experimenting with social viewing in VR for some time. These tools let people ‘sit’ in the same virtual room and watch content together. While the technology is still early, the emotional pull is obvious: shared reactions, collective experiences, and the return of appointment-style viewing.

This generates new creative and commercial possibilities for streaming providers. Genres like reality TV, live sport, and entertainment formats that drive conversation are especially well suited to co-viewing. Thinking about how to design for shared experiences – whether through synced playback, real-time reaction tools or spatial commentary – could help reintroduce a social layer that’s been missing from digital video.
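To make the synced-playback idea concrete, here is a minimal sketch in TypeScript of the core problem any co-viewing feature has to solve: keeping remote viewers within a fraction of a second of each other. All names here (`SyncState`, `applySync`) are invented for illustration and don’t correspond to any real platform API; a production system would layer this logic over whatever transport (e.g. WebSockets) the service already uses.

```typescript
// Illustrative sketch: keeping a remote viewer in sync with a 'host' viewer.
// The host periodically broadcasts its playback state; each receiver
// estimates where the host is *now* and hard-seeks only when drift is
// large enough to be noticeable.

interface SyncState {
  mediaTimeSec: number; // host's playback position when the update was sent
  sentAtMs: number;     // wall-clock time (ms) the host sent the update
  playing: boolean;
}

function applySync(
  local: { mediaTimeSec: number },
  update: SyncState,
  nowMs: number,
  driftToleranceSec = 0.5
): { seekTo: number | null } {
  // If the host is playing, account for the time the update spent in transit.
  const elapsedSec = update.playing ? (nowMs - update.sentAtMs) / 1000 : 0;
  const hostNow = update.mediaTimeSec + elapsedSec;
  const drift = Math.abs(hostNow - local.mediaTimeSec);
  // Small drift is left alone to avoid constant, jarring micro-seeks.
  return { seekTo: drift > driftToleranceSec ? hostNow : null };
}

// Example: host reported 100s of playback 2s ago; we're at 100.2s,
// so the host is now at ~102s and we're 1.8s behind.
const decision = applySync(
  { mediaTimeSec: 100.2 },
  { mediaTimeSec: 100, sentAtMs: 0, playing: true },
  2000
);
console.log(decision.seekTo); // 102
```

The drift tolerance is the key design choice: too tight and viewers see constant stutter; too loose and reactions in a shared room stop lining up with what’s on screen.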

3. Smart glasses will drive everyday AR use

As impressive as some emerging VR headsets are, they’re not built for everyday life. With a starting price of £3,499, the Vision Pro is still luxury tech: it’s bulky, tethered to a battery pack, and uncomfortable to use for more than a few minutes. Globally, the average person spends about 1 hour and 22 minutes a day streaming content (roughly 9.5 hours per week), and most people aren’t going to binge-watch TV in a VR headset. But a new wave of smart glasses could bring AR streaming into everyday routines far more quickly and accessibly.

Ray-Ban Meta smart glasses, for instance, combine classic style with solid hardware: good camera, clear audio, basic AI, and live-streaming capabilities, all in a familiar frame. At under £330 they’re being adopted today by creators and early tech adopters alike, and there are more devices on the way, with Samsung, Xiaomi and others working on lightweight, AR-enabled glasses.

This shift opens the door to shorter, more ambient content experiences: trailers, live snippets, news flashes, and companion clips designed to be consumed hands-free and on the go. For streaming businesses, that means thinking beyond the ‘lean-back’ format and exploring how content fits into new, more spontaneous viewing moments.

4. Content and UX need to adapt to spatial interaction

Spatial computing doesn’t only change what people watch; it changes how they interact with it. Interfaces designed for flat screens and remotes aren’t going to cut it in an AR environment where users navigate with gestures, eye movement or voice control.

Both Apple and Google are already releasing tools that allow developers to build richer, more responsive spatial experiences, but it will be up to streaming platforms to decide how menus, navigation, and even playback behave in this new world. From gesture-controlled scrubbing to layered content interfaces, there’s huge potential to rethink the viewing experience from the ground up. 

Getting this right won’t happen overnight, but platforms that start testing spatial UX now – whether through interactive menus, spatial overlays or voice-controlled browsing – will be better prepared to lead when these behaviours go mainstream.
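As one small example of what ‘spatial UX’ means in practice, the gesture-controlled scrubbing mentioned above boils down to mapping a physical movement onto a timeline position. The TypeScript sketch below is purely illustrative – the function name and sensitivity value are assumptions, not any platform’s API – but it captures the kind of tuning decision spatial interfaces demand.

```typescript
// Illustrative sketch: mapping a horizontal pinch-drag gesture to a
// playback scrub position. `dragDeltaX` is the drag distance normalised
// to the view width (-1..1); `sensitivity` controls how much of the
// timeline a full-width drag covers (an assumed tuning value).

function scrubFromDrag(
  startPositionSec: number,
  dragDeltaX: number,
  durationSec: number,
  sensitivity = 0.25 // full-width drag moves 25% of the timeline
): number {
  const offsetSec = dragDeltaX * sensitivity * durationSec;
  // Clamp so an over-enthusiastic gesture can't scrub past either end.
  return Math.min(durationSec, Math.max(0, startPositionSec + offsetSec));
}

// Example: 10 minutes into an hour-long film, drag half the view width.
console.log(scrubFromDrag(600, 0.5, 3600)); // 1050 (scrubs forward 7.5 min)
```

The interesting question isn’t the arithmetic – it’s choices like that sensitivity value, which in a headset has to feel right for arm-scale gestures rather than thumb-scale ones. That’s exactly the kind of behaviour worth prototyping and testing early.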

Now is the time to experiment

AR streaming won’t land all at once; it will arrive in stages, through new formats, new devices, and new expectations. The shift is already underway, and the businesses that benefit most will be the ones that start testing and adapting now.

Whether it’s exploring spatial content, enabling shared experiences, or rethinking how users navigate and interact with content, there are real opportunities to create value and deepen audience engagement. It doesn’t require a wholesale reinvention of your platform – but it does mean being curious, proactive and open to change.

‘We’re not just redesigning interfaces’, says Ian Palmer, Lead QA Engineer at Candyspace. ‘We’re reimagining how people experience stories together.’