The conversation around passthrough camera access is heating up within the XR community. While we’ve learned about the positions of Meta, Apple, and Pico, there’s keen interest in what moves Google will make regarding Android XR. After chatting directly with them, I’ve got the scoop: they’re planning a solution much like the one you’re familiar with on your smartphone. Stay tuned for the juicy details!
## Understanding the Camera Access Dilemma
Let me break it down for those who might be scratching their heads. Recent standalone VR headsets are essentially MR headsets: their front-facing RGB cameras capture the real world, and the system renders that feed to the user as passthrough. This technological trick is what gives rise to cool mixed reality applications like Cubism, Starship Home, and Pencil.
The frames these cameras capture are rendered through the headset's operating system. Developers eagerly want access to them, since they're a gateway to analyzing a user's surroundings with AI and computer vision. This could seriously enhance our reality! As I've argued before, camera access is key to unleashing true mixed reality, allowing apps to become fully aware of a user's context. For example, I managed to build an AI+MR app for interior design on Quest only by resorting to a sneaky camera trick. Such feats are nearly impossible without proper camera access.
Sounds fascinating, right? But there's a flip side: privacy. Granting camera access to less-than-honest developers could let them quietly harvest images of your surroundings, including personal data like your ID or credit cards lying around. Not to mention collecting images of people's faces or bodies for dubious purposes.
It’s a tightrope walk: safeguarding user privacy while also unlocking the vast potential of mixed reality.
## The XR Companies’ Approach
Initially, unrestricted camera access was no big deal. Longtime followers might remember my 2019 experiments with camera textures on the Vive Focus—think diminished reality and sound reactivity.
As mixed reality gained traction, companies got skittish and clamped down on access over privacy fears. Meta, Pico, HTC, Apple—all played it safe and locked developers out of camera frames.
Over time, developers made the case for why this feature is necessary and started pressing XR manufacturers to allow it again. Names like Cix Liv, Michael Gschwandtner, and yours truly campaigned together for clear, user-transparent access that would let us run object recognition algorithms and more. It puzzled us why XR devices were treated differently, given that smartphone apps obtain camera access with a simple permission request.
Our persistent push bore some fruit, with Meta committing to roll out passthrough camera access for developers this year. But what about Google's Android XR?
## Android XR Embracing Phone-Like Functionality
Globally, Android powers the majority of smartphones. On Android mobile devices, developers can request camera access by asking the user for permission. If it is granted, they specify the camera ID (typically 0 for the back camera, 1 for the front one) and can then access the frames.
Google intends for Android XR to gel seamlessly with Android apps, including adopting a similar camera access model. After a detailed email exchange with a Google representative, here’s the verbatim response regarding Android XR’s camera access:
“Developers can use existing camera frames with user permission in XR, much like they do on any Android app. By requesting camera_id=0 for the world-facing camera stream (akin to the ‘rear camera’), or camera_id=1 for the selfie-camera stream (the ‘front camera’), through standard Android Camera APIs like Camera2 and CameraX, developers can access these streams conditioned with user permission.”
So, Android developers can employ familiar classes from phone development to handle camera streams on XR devices. Capture frames, save media, perform ML analysis—it’s all on the table for headsets and glasses too. Cheers to that!
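To make this concrete, here's a minimal sketch of what requesting the world-facing stream could look like, using the standard CameraX classes the quote mentions. The CameraX calls are the real phone APIs; the assumption (based on Google's statement) is that they behave the same way on an Android XR headset, with the back-facing lens mapping to the world-facing stream.

```kotlin
// Sketch of standard CameraX usage, assumed to work unchanged on Android XR.
import android.Manifest
import android.content.pm.PackageManager
import androidx.activity.ComponentActivity
import androidx.camera.core.CameraSelector
import androidx.camera.core.ImageAnalysis
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.core.app.ActivityCompat
import androidx.core.content.ContextCompat

class PassthroughActivity : ComponentActivity() {

    private fun startWorldFacingStream() {
        // 1. Ask for the same CAMERA permission a phone app would request.
        if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED
        ) {
            ActivityCompat.requestPermissions(
                this, arrayOf(Manifest.permission.CAMERA), 0
            )
            return
        }

        val providerFuture = ProcessCameraProvider.getInstance(this)
        providerFuture.addListener({
            val provider = providerFuture.get()

            // 2. On XR, LENS_FACING_BACK should map to the world-facing
            //    stream (camera_id=0), per Google's statement.
            val selector = CameraSelector.Builder()
                .requireLensFacing(CameraSelector.LENS_FACING_BACK)
                .build()

            // 3. ImageAnalysis delivers frames for ML / computer vision.
            val analysis = ImageAnalysis.Builder().build().also {
                it.setAnalyzer(ContextCompat.getMainExecutor(this)) { frame ->
                    // run object recognition here, then release the frame
                    frame.close()
                }
            }

            provider.bindToLifecycle(this, selector, analysis)
        }, ContextCompat.getMainExecutor(this))
    }
}
```

Presumably, swapping in `LENS_FACING_FRONT` would return the avatar/selfie stream instead.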
The world-facing camera feed gives apps insight into the user's surroundings, while the front camera delivers an image stream containing the user's avatar, synthesized by avatar-provider apps from tracking data. This setup bridges XR functionality to the familiar Android phone experience: the "rear camera" shows the outside world, and the "selfie camera" shows the user, here represented by an avatar, in a clever nod towards Apple's approach on Vision Pro.
What remains critical is coherence across devices in permission handling, enhancing compatibility of Android apps on XR. Google’s choices here are strategic.
However, when it comes to accessing all raw camera streams, there’s a caveat:
“Currently, we’re not providing application access to non-standard (e.g., forward-facing, reconstructed inward) sensor data.”
This response implies non-standard streams aren’t available right now, though there’s hope for an eventual rollout, especially for enterprise users.
For Unity developers: Camera2 and CameraX are Android-native APIs, but we could potentially leverage Unity's WebCamTexture class to fetch camera frames on Android XR headsets. If that doesn't work, a JNI bridge to the native camera APIs could always provide a workaround.
## A Preview Caution for Android XR
Remember, Android XR is still in preview, with no official headsets released. Although unlikely, there’s room for changes before final release, so keep that in mind.
## Opening the Gates of Camera Access
As Google and Meta lean towards opening camera access, expect other companies to join in soon. It looks like 2025 might be the dawn of exciting possibilities in mixed reality. Can’t wait to see what creations will emerge!
(Please note: this post may contain affiliate links. Click them and I might earn a little extra at no cost to you. Check out my detailed disclosure for more information.)