Today, during the XR edition of The Android Show, Google showed off a bunch of updates and new features headed to its mixed reality OS. And while much of the news was geared toward developers, I got a chance to demo some of the platform's expanded capabilities on a range of hardware, including Samsung's Galaxy XR headset, two different reference designs and an early version of Xreal's Project Aura smart glasses, and I came away rather impressed. So here's a rundown of what I saw and how it will impact the rapidly growing ecosystem of head-mounted displays.
First up was one of Google's reference design smart glasses with a single waveguide RGB display built into its right lens. I've included a picture of it here, but try not to read too deeply into its design or aesthetics, as this device is meant to be a testbed for Android XR features and not an early look at upcoming models.

Try not to read too much into the appearance of Google's reference design smart glasses, as they're explicitly labeled as prototypes meant to test upcoming features in Android XR. (Sam Rutherford for Engadget)
After putting them on, I was able to ask Gemini to play some tunes on YouTube Music before answering a call simply by tapping on the touchpad built into the right side of the frames. And since the reference model also had onboard world-facing cameras, I could easily share my view with the person on the other end of the line.
Naturally, I was curious about how the glasses had the bandwidth to do all this, because in normal use they rely on a Bluetooth or Bluetooth LE connection. When asked, Max Spear, Group Product Manager for XR, shared that depending on the situation, the device can seamlessly switch between both Bluetooth and Wi-Fi, which was rather impressive because I couldn't even detect when that transition happened. Spear also noted that one of Google's focuses for Android XR is making it easier for developers to port over the apps people already know and love.

This is the image Google's reference design smart glasses created (via Gemini) when I asked it to transform a photo I took of some pantry shelves into a sci-fi kitchen. (Sam Rutherford for Engadget)
This means that for devices like the reference design I wore that feature a built-in display (or displays), the OS actually uses the same code meant for regular Android notifications (like quick replies) to create a minimalist UI, instead of forcing app makers to update every piece of software to be compliant with an ever-increasing variety of devices. Alternatively, for models that are super lightweight and rely strictly on speakers (like Bose Frames), Google has also designed Android XR so that you only need mics and voice controls to access all sorts of apps without the need for visual menus.
Meanwhile, if you're hoping to take photos with your smart glasses, there's a surprising amount of capability there, too. Not only was I able to ask Gemini to take a photo, the glasses were also able to send a higher-res version to a connected smartwatch, which is super helpful if you want to review the image before moving on to the next shot. And if you want to inject some creativity, you can ask Gemini to transform images into practically anything you can imagine via Nano Banana. In my case, I asked the AI to change a shot of a pantry into a sci-fi kitchen, and Gemini delivered with aplomb, including converting the room into a metal-clad setting complete with lots of light strips and a few bursts of steam.

This is how Google Maps will look on Android XR. Note that this is the flat 2D version instead of the more detailed stereoscopic view available on smart glasses with dual displays. (Sam Rutherford for Engadget)
However, one of the most impressive demos came when I asked Google's reference glasses to look at some of that same pantry environment and then use the ingredients to create a recipe based on my specifications (no tomatoes please, my wife isn't a fan). Gemini went down an Italian route by picking pasta, jarred banana peppers, bell peppers (which I thought was a somewhat unusual combination) and more, before launching into the first steps of the recipe. Unfortunately, I didn't have time to actually cook it, but as part of the demo, I learned that Gemini has been trained to understand human-centric gestures like pointing and picking things up. This allows it to better understand context without you needing to be super specific, which is one of those little but very impactful tricks that lets AI feel way less robotic.
Then I had a chance to see how Uber and Google Maps ran on the reference glasses, this time using models with both single and dual RGB displays. Surprisingly, even on the monocular version, Maps was able to generate a detailed map with the ability to zoom in and out. But when I switched over to the binocular model, I noticed a significant jump in sharpness and clarity, along with a higher-fidelity map featuring stereoscopic 3D images of buildings. Now, it may be a bit early to call this, and the perception of sharpness varies greatly between people based on their head shape and other factors, but after seeing that, I'm even more convinced that smart glasses with dual RGB displays are what the industry will settle on in the long run.

Though it was announced not long ago in late October, Samsung's Galaxy XR headset is already getting some new features thanks to updates coming to Android XR. (Sam Rutherford for Engadget)
The second type of device I used was the Samsung Galaxy XR, which I initially tried out when it was announced back in October. However, in the short time since, Google has cooked up several new features that really help expand the headset's capabilities. By using the goggles' exterior-facing cameras, I was able to play a game of I Spy with Gemini. Admittedly, this might sound like a small addition, but I think it will play a huge part in how we use devices running Android XR, because it allows the headset (or glasses) to better understand what you're looking at in order to provide more helpful contextual responses.
However, the biggest surprise came when I joined a virtual call with someone using one of Google's new avatars, called Likeness. Instead of the low-polygon cartoony characters we've seen before in places like Meta Horizon, Google's digital representations of people's faces are almost scary good. So good I had to double-check that they weren't real, and from what I've seen, they're even a step up from Apple's Personas. Google says that headsets like the Galaxy XR rely on internal sensors to track and respond to facial movements, while users will be able to create and edit their avatars using a standalone app due out sometime next year.

The person in the bottom right is using a Likeness, which during my demo looked surprisingly responsive and realistic. (Google)
Next, I got a chance to test out Android XR's PC connectivity by playing Stray on the Galaxy XR while it was tethered wirelessly to a nearby laptop. Not only did it run almost flawlessly with low latency, I was also able to use a paired controller instead of relying on hand-tracking or the laptop's mouse and keyboard. This is something I've been eagerly waiting to try, because it feels like Google has put a lot of work into making Android XR devices play nicely with other devices and OSes. Initially, you'll only be able to connect Windows PCs to the Galaxy XR, but Google says it's looking to support macOS systems as well.
Finally, I got to try out Xreal's Project Aura glasses to see how Android XR works on a device primarily designed to give you big virtual displays in a portable form factor. Unfortunately, because this was a pre-production unit, I wasn't able to take photos. That said, as far as the glasses go, I was really impressed with their resolution and sharpness, and the inclusion of electrochromic glass is a really nice touch, as it lets users change how heavily the lenses are tinted with a single touch. Alternatively, the glasses can also adjust the tint automatically based on whatever app you're using to give you a more or less isolated environment, depending on the situation. I also appreciate the Aura's increased 70-degree FOV, but if I'm nitpicking, I wish it were a bit bigger, as I occasionally found myself wanting a bit more vertical display space.

Unfortunately, I wasn't allowed to take photos of Xreal's Project Aura smart glasses, as the model I used was still an early pre-production unit. So here's a shot provided by Google instead. (Google / Xreal)
As a device that sits somewhere between lightweight smart glasses and a full VR headset, the Aura relies on a wired battery pack that also doubles as a touchpad and a hub for plugging in external devices like your phone, laptop or even game consoles.
While using the Aura, I was able to connect to a different PC and multitask in style, as the glasses can support multiple virtual displays while running several different apps at the same time. This allowed me to be on a virtual call with someone using a Likeness while I had two other virtual windows open on either side. I also played an AR game (Demeo) while I moved around in virtual space and used my hands to reposition the battlefield or pick up objects.
Now, I'll fully admit this is a lot, and it took me a bit to process everything. But upon reflection, I have a few takeaways from my time with the various Android XR devices and prototypes. More than any other headset or smart glasses platform out now, it feels like Google is doing a ton to embrace a growing ecosystem of devices. That's really important because we're still so early in the lifecycle of wearable gadgets with displays that no one has really figured out a truly polished design like we have for smartphones and laptops. And until we get there, a highly adaptable OS will go a long way toward supporting OEMs like Samsung, Xreal and others.
But that's not all. It's clear Google is focused on making Android XR devices easy to build for. That's because the company knows that without useful software that can highlight the components and features coming to next-gen spectacles, there's a chance that interest will remain rather niche, similar to what we've seen with the adoption of VR headsets. So in a way, Google is waging a battle on two fronts, which makes navigating uncharted waters that much more difficult.

A major focus for Android XR, while people are still figuring out how to make smart glasses, is supporting a wide variety of designs, including those with single displays, dual displays or models without any displays that rely on cameras and speakers. (Sam Rutherford for Engadget)
Google is putting a major emphasis on Android XR's ability to serve as a framework for future gadgets and to support and address developer needs. This mirrors the approach the company takes with regular Android and is the opposite of Apple's typical MO, because unlike the Vision Pro and visionOS, it appears Google is going to rely heavily on partners like Xreal, Warby Parker, Gentle Monster and others to create engaging hardware. Additionally, Google says it plans to support smart glasses that can be tethered to Android and iOS phones, as well as smartwatches from both ecosystems, though there will be some limitations for people using Apple devices due to inherent OS restrictions.
That's not to say there won't be Pixel glasses someday down the road, but at least for now, I think that's a smart approach, and possibly a lesson Google learned after releasing Google Glass over a decade ago. Meanwhile, hi-res and highly realistic avatars like Likenesses could be a turning point for virtual collaboration because, in a first for me, talking to a digital representation of someone else felt kind of natural. After my demos, I had a chance to talk to Juston Payne, Senior Director of Product Management for XR, who highlighted the difference between smart glasses and typical gadgets by saying "Smart glasses need to be great glasses first. They need to have the right form factor, good lenses with prescription support, they need to look good and they need to be easy to buy."
That's no simple task, and there's no guarantee that next-gen smart glasses and headsets will be a grand slam. But from what I've seen, Google is building a very compelling foundation with Android XR.