Monday, 22 September 2025

Samsung Android XR Headset Rumored to Release Next Month, Undercutting Apple Vision Pro


Samsung could launch its Project Moohan mixed reality headset, which aims to take on Apple Vision Pro in the prosumer XR segment, as soon as next month, according to a report from South Korea’s ETNews (Korean).

Citing industry sources, ETNews maintains that Samsung’s Mobile Experience division will unveil Moohan (Korean for ‘infinite’) in an online event on October 21st—likely the morning of October 22nd in Korea—that will include specs, pricing, and the ability to purchase the headset.

South Korea’s Newsworks previously reported that Samsung would unveil Moohan during an event scheduled for September 29th, with sales starting October 13th in a Korea-first debut; however, the latest ETNews report maintains the schedule was adjusted due to a shift in Samsung’s marketing strategy and final quality checks.

Samsung Project Moohan | Image courtesy Google

It’s uncertain whether Moohan is now aiming for a global launch out of the gate, or sticking to the previously reported Korea-first strategy. Moohan is expected to be priced between ₩2.5 million and ₩4 million (roughly $1,800 to $2,900 USD), which is seen as a way of undercutting Vision Pro ($3,500).

As the first XR headset running Google’s Android XR operating system, Moohan could serve as a foil to Vision Pro’s visionOS, which gives users access to most iOS apps in addition to standalone content created specifically for the device—a sharp contrast to Meta’s Horizon OS for Quest, which requires developers to manually port Android apps to the platform.

Android XR will not only give Moohan access to the massive library of Android smartphone apps and native XR content (much of it ostensibly ported from Quest), but also the ability to natively stream PC VR games, as Quest can.

The device is expected to integrate Google’s multimodal AI, in addition to supporting voice, hand and eye-tracking as input methods. We’re still waiting to hear about Moohan’s long-promised first-party motion controllers.

As for specs, Moohan is said to feature micro-OLED panels supplied by Samsung Display, packing in a pixel density of 3,800 ppi, running on Qualcomm’s Snapdragon XR2+ Gen 2 chipset with 16GB of RAM. Samsung has been tight-lipped on specs since Moohan was first unveiled late last year, so we’re waiting to hear more.

And it appears Samsung isn’t casting a very wide net with Moohan either, according to the report. The company is allegedly only targeting an initial shipment volume of around 100,000 units, with later targets adjusted according to early demand.


You can learn more by checking out our hands-on with Project Moohan from December 2024, which covers everything from comfort and display clarity to details on its Android XR operating system.




from Road to VR https://ift.tt/HekzhnW

Friday, 19 September 2025

Quest is Getting an Official ‘Discord’ App Next Year


Meta CTO Andrew Bosworth said last May that he would “love” for Quest to have some sort of Discord integration, although it seemed it was up to Discord to make the first move. At Connect this week, Meta announced Discord is officially coming to Quest next year.

While Quest users can technically log into Discord via Quest’s web browser, the experience is far from perfect. But in 2026, Discord says a native windowed app is coming to Quest, which ought to make multitasking even easier. Windowed apps let users keep apps in view while playing VR games, meaning you don’t need to exit or pause the game in progress.

Meta calls Discord on Quest a “massive opportunity for VR developers,” thanks to the messaging platform’s wide user base among gamers.

“Think about it: Discord is home to a highly engaged community of more than 200 million monthly active players who spend a combined 1.9 billion hours playing games each month across thousands of titles on PC alone,” Meta says in a developer blog post. “With the launch of the Discord app on Quest, VR devs will have an incredible discoverability engine at their disposal.”

Neither company has confirmed anything beyond basic multitasking; however, a native app release could point to some of the greater functionality that PC users already enjoy. Game presence support, or automatically showing what game you’re currently playing to other Discord users, could be on the list. We’re also hoping for improved performance, persistence, and possibly system-level notifications for things like messages and calls.

Still, some of the top wishlist items may be out of reach. While we’ll have to wait for the full release in 2026 to know for sure, it’s unlikely users will be able to stream their headset POV to Discord, either as a game stream or as a video chat input. Another reach may be access to Quest’s ‘avatar selfie cam’ for Discord video chats, which would require the company to integrate the Camera2 or CameraX APIs.
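For context, Camera2 and CameraX are Android’s standard camera frameworks. Below is a minimal sketch of what a CameraX integration looks like in an ordinary Android app; the activity name and preview wiring are illustrative assumptions, not anything Meta or Discord has announced, and camera access on Quest would ultimately depend on whatever Meta chooses to expose to third-party apps.

```kotlin
import android.os.Bundle
import androidx.appcompat.app.AppCompatActivity
import androidx.camera.core.CameraSelector
import androidx.camera.core.Preview
import androidx.camera.lifecycle.ProcessCameraProvider
import androidx.camera.view.PreviewView
import androidx.core.content.ContextCompat

// Hypothetical activity name; assumes the CAMERA permission has already been granted.
class SelfieCamActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        val previewView = PreviewView(this)
        setContentView(previewView)

        val cameraProviderFuture = ProcessCameraProvider.getInstance(this)
        cameraProviderFuture.addListener({
            val cameraProvider = cameraProviderFuture.get()

            // Route camera frames into the on-screen preview surface.
            val preview = Preview.Builder().build().also {
                it.setSurfaceProvider(previewView.surfaceProvider)
            }

            // A phone app picks the front or back camera here; a headset app
            // would need whichever camera the platform actually exposes.
            val selector = CameraSelector.DEFAULT_FRONT_CAMERA

            cameraProvider.unbindAll()
            cameraProvider.bindToLifecycle(this, selector, preview)
        }, ContextCompat.getMainExecutor(this))
    }
}
```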

Something else you probably shouldn’t expect: the ability to invite friends to VR games through Discord, which only a handful of desktop games currently support. Whatever the case, we’re holding out hope for the full-fat Discord user experience, which we should learn more about in the coming months.




from Road to VR https://ift.tt/GqPCVMF

Why Ray-Ban Meta Glasses Failed on Stage at Connect


Meta CEO Mark Zuckerberg’s keynote at this year’s Connect wasn’t exactly smooth—especially if you count the two big hiccups that sidetracked live demos for both the latest Ray-Ban Meta smart glasses and the new Meta Ray-Ban Display glasses.

Ray-Ban Meta (Gen 2) smart glasses essentially bring the same benefits as Oakley Meta HSTN, which launched back in July: longer battery life and better video capture.

One of the biggest features, though, is access to Meta’s large language model (LLM) assistant, Meta AI, which pops up when you say “Hey Meta,” letting you ask questions about anything from the weather to what the glasses’ camera can actually see.

As part of the on-stage demo of the glasses’ Live AI feature, which runs continuously rather than only when prompted, food influencer Jack Mancuso attempted to create a Korean-inspired steak sauce using the AI as a guide.

And it didn’t go well, as Mancuso struggled to get the Live AI back on track after missing a key step in the sauce’s preparation. You can see the full cringe-inducing glory for yourself, timestamped below:

And the reason behind it is… well, just dumb. Jake Steineman, Developer Advocate at Meta’s Reality Labs, explained what happened in an X post:

Unfortunate, yes. But also pretty foreseeable, especially considering the AI ‘wake word’ gaffe has been a thing since the earliest days of Google Nest (formerly Google Home) and Amazon Alexa devices.

Anyone with one of those friendly tabletop pucks has probably experienced what happens when a TV advert includes “Hey Google” or “Hey Alexa,” unwittingly commanding every device in earshot to tell them the weather, or even order items online.

What’s more surprising, though, is that there were enough people using a Meta product within earshot to screw with its servers. Meta AI isn’t like Google Gemini or Apple’s Siri—it doesn’t have OS-level access to smartphones. The only devices where it’s the default assistant are the company’s Ray-Ban Meta and Oakley Meta glasses (and Quest, if you opt in), conjuring the image of a room full of confused, bespectacled Meta employees waiting out of shot.

As for the Meta Ray-Ban Display glasses, which the company is launching in the US for $799 on September 30th, the hiccup was much more forgivable. Zuckerberg attempted to take a live video call from company CTO Andrew Bosworth, who, after several missed attempts, came on stage to do an ad hoc simulation of what the demo might have been like.

Those sorts of live product events are notoriously bad for both Wi-Fi and mobile connections, simply because of how many people are in the room, often with multiple devices per person. Still, Zuckerberg didn’t pull a Steve Jobs, who famously demanded everyone in attendance at the iPhone 4’s June 2010 unveiling turn off their Wi-Fi after an on-stage connection flub.

You can catch the Meta Ray-Ban Display demo below (obligatory cringe warning):




from Road to VR https://ift.tt/cusBOpA

Thursday, 18 September 2025

New Meta Developer Tool Enables Third-parties to Bring Apps to its Smart Glasses for the First Time


Today during Connect, Meta announced the Wearables Device Access Toolkit, which represents the company’s first steps toward allowing third-party experiences on its smart glasses.

If the name “Wearables Device Access Toolkit” sounds a little strange, it’s for good reason. Unlike a plain old SDK, which generally lets developers build apps that run on a specific device, apps made for Meta’s smart glasses don’t actually run on the glasses themselves.

The “Device Access” part of the name is the key; developers will be able to access sensors (like the microphone or camera) on the smart glasses, and then pipe that info back to their own app running on an Android or iOS device. After processing the sensor data, the app can then send information back to the glasses for output.

For instance, a cooking app running on Android (like Epicurious) could be triggered by the user saying “Hey Epicurious” to the smart glasses. Then, when the user says “show me the top rated recipe I can make with these ingredients,” the Android app could access the camera on the Meta smart glasses to take a photo of what the user is looking at, then process that photo on the user’s phone before sending back its recommendation as spoken audio to the smart glasses.

In this way, developers will be able to extend apps from smartphones to smart glasses, but not run apps directly on the smart glasses.
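Meta hasn’t published the toolkit’s actual API yet, so here is a purely hypothetical Kotlin sketch of the phone-side round trip described above. Every type and method name (GlassesSession, capturePhoto, speak, and so on) is invented for illustration and is not part of Meta’s SDK.

```kotlin
// Hypothetical phone-side flow: glasses sensors in, spoken audio out.
// None of these types come from Meta's toolkit; they only sketch the
// architecture described above.
interface GlassesSession {
    suspend fun capturePhoto(): ByteArray   // grab a frame from the glasses camera
    suspend fun speak(text: String)         // play a spoken response on the glasses
}

class RecipeAssistant(
    private val glasses: GlassesSession,
    private val recommendRecipe: suspend (ByteArray) -> String  // runs on the phone or a backend
) {
    // Imagined entry point, triggered when a wake phrase like
    // "Hey Epicurious" is routed from the glasses to the phone app.
    suspend fun onIngredientQuery() {
        val photo = glasses.capturePhoto()        // 1. access a sensor on the glasses
        val suggestion = recommendRecipe(photo)   // 2. process the data on the phone
        glasses.speak(suggestion)                 // 3. send the result back as audio
    }
}
```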

The likely reason for this approach is that Meta’s smart glasses have strict limits on compute, thermals, and battery life. And the audio-only interface on most of the company’s smart glasses doesn’t allow for the kind of navigation and interaction that users are used to with a smartphone app.

Developers interested in building for Meta’s smart glasses can now sign up for access to the forthcoming preview of the Wearables Device Access Toolkit.

As for what can be done with the toolkit, Meta showed a few examples from partners who are experimenting with the devices.

Disney, for instance, made an app which combines knowledge about its parks with contextual awareness of the user’s situation by accessing the camera to see what they’re looking at.

Golf app 18Birdies showed an example of contextually aware information on a specific golf course.

For now, Meta says only select partners will be able to bring their smart glasses integrations to the public, but it expects to open up access more broadly starting in 2026.

The examples shown so far used only voice output as the means of interacting with the user. While Meta says developers can also extend apps to the Ray-Ban Display glasses, it’s unclear at this point if apps will be able to send text, photo, or video back to the glasses, or integrate with the device’s own UI.




from Road to VR https://ift.tt/fUX473g

Wednesday, 17 September 2025

Meta Reveals Next-Gen Ray-Ban & New Oakley Vanguard Smart Glasses


Undoubtedly the smart glasses headliner of Meta Connect this year was the new $800 Meta Ray-Ban Display glasses, which pack a single display into a familiar Wayfarer-style package. Alongside them, though, Meta showed off two new smart glasses: the Oakley Meta Vanguard and the next generation of Ray-Ban Meta.

Oakley Meta Vanguard – $499 (available Oct 21)

Oakley Meta Vanguard | Image courtesy Meta

Before Meta and Essilor Luxottica released Oakley Meta HSTN in July, we were definitely envisioning something more like the new Oakley Meta Vanguard. But better late than never: Meta has just unveiled the sleek, blade-like frames, which it says are “built for high-intensity sports.”

Rated IP67 for dust and water resistance, Oakley Meta Vanguard is supposedly durable enough for sweaty workouts or rainy rides, targeting sports like cycling, snowboarding, and running.

Oakley Meta Vanguard | Image courtesy Meta

Notably, like many of Oakley’s traditional (non-smart) frames, the new smart glasses use Oakley’s Three-Point Fit system, which includes three interchangeable nose pads for a more secure fit, with Meta noting the frames are optimized for use with cycling helmets and hats.

They also include an onboard 12MP, 122° wide-angle camera sensor for capturing video up to 3K resolution, with modes including Slow Motion, Hyperlapse, and adjustable image stabilization.

And just like Ray-Ban Meta, they feature open-ear speakers, notably rated at six decibels louder than those of Oakley Meta HSTN, along with a wind-optimized five-mic array to provide clear audio for taking calls, using voice commands, or listening to music while training.

The newest Oakleys also integrate with Garmin, Strava, Apple Health, and Android Health Connect, delivering post-workout summaries and real-time stats through Meta AI. Athletes can check heart rate, progress, or other data hands-free with voice prompts.

Oakley Meta Vanguard | Image courtesy Meta

Available in four frame/lens color combinations, the glasses weigh 66g and offer up to nine hours of mixed use (or six hours of music playback) on a single charge, with an additional 36 hours via the charging case. Quick charging brings the glasses to 50% in just 20 minutes, Meta says.

Like all of the other Meta smart glasses on offer, they include 32GB of storage, enough for over 1,000 photos or 100 short videos, the company says.

Since they’re built for high-intensity sports, Meta is also introducing replaceable lenses, starting at $85. Here are all four models available for pre-order, including the lenses you’ll be able to mix and match later:

  • Oakley Meta Vanguard Black with PRIZM™ 24K
  • Oakley Meta Vanguard White with PRIZM™ Black
  • Oakley Meta Vanguard Black with PRIZM™ Road
  • Oakley Meta Vanguard White with PRIZM™ Sapphire

Oakley Meta Vanguard is now available for pre-order through Meta or Oakley, priced at $499 and launching October 21st.

They’ll be available in the US, Canada, UK, Ireland, France, Italy, Spain, Austria, Belgium, Australia, Germany, Sweden, Norway, Finland, Denmark, Switzerland, and the Netherlands. Meta says they should also eventually launch in Mexico, India, Brazil, and the United Arab Emirates later this year.

Ray-Ban Meta (Gen 2) – Starting at $379 (Now Available)

Ray-Ban Meta Wayfarer (Gen 2) | Image courtesy Meta

While the company considers its next Ray-Ban Meta glasses “Gen 2,” they’re technically the third generation, following the release of Ray-Ban Stories (under the Facebook brand) in 2021 and Ray-Ban Meta in 2023.

Naming scheme aside, the latest Ray-Ban Meta smart glasses deliver the same improvements seen in Oakley Meta HSTN, and essentially the same base functionality. They can play music, do real-time translation, and take hands-free calls, while also offering better photo and video capture than their predecessor.

The ultrawide 12MP camera sensor is rated for photo capture up to 3,024 × 4,032 pixels, and for video at 1200p/60 FPS, 1440p/30 FPS, and 3K/30 FPS—all in clips of up to three minutes in length.

Ray-Ban Meta Wayfarer (Gen 2) | Image courtesy Meta

Like Oakley Meta HSTN, Ray-Ban Meta (Gen 2) boasts up to eight hours of continuous use and an additional 48 hours from the charging case, which can also quick-charge the glasses to 50% in 20 minutes.

And it probably goes without saying, but all of Meta’s smart glasses make heavy use of its own Meta AI, which includes things like voice search queries (“Hey Meta!”), reading QR codes, suggesting recipes, saving notes, etc.

Ray-Ban Meta Skyler (Gen 2) | Image courtesy Meta

Additionally, the device includes Bluetooth 5.3, Wi-Fi 6, 32GB of storage, and an IPX4 water-resistance rating for light rain or splashes.

And like the 2023 model, the new Ray-Ban Meta smart glasses offer gads of frame and lens combinations: 27 in total across its Wayfarer and Skyler models, which include options for large or low nose bridges.

They’re also getting a price bump over the first gen, which launched in 2023 at $299. Ray-Ban Meta (Gen 2) starts at $379 for standard lens options, and will be available with polarized lenses ($409), Transitions lenses ($459), and prescription lenses (pricing varies).

You can find all of those models and lens combinations starting today over at Meta and Ray-Ban.com.


We’re currently on the ground at Meta Connect this year, so check back soon for all things XR.




from Road to VR https://ift.tt/iUKcLWl

How to Watch Meta Connect for All Things XR, Kicking Off Today @5PM PT


Meta Connect 2025 is nearly here, with company CEO Mark Zuckerberg only a few hours away from telling us what’s next for the most influential company currently working in XR.

The keynote is starting unusually late this year, coming today at 5PM PT / 8PM ET, when Zuckerberg will take the stage to share Meta’s latest developments in mixed reality, AI, the metaverse, and wearables.

Note: that last one may be a big focus this year, as the company seemingly leaked its lineup of next-gen smart glasses built with Essilor Luxottica yesterday, including a model with a single monocular display.

We’re currently at Meta Connect, so we’ll be reporting on all of the latest XR news to come from the event. You can follow along live via YouTube at 5PM PT to catch the big keynote:

Still, there’s no telling what else the company has in store, with many questions still lingering from last Connect, which came only a few months after Meta announced it was not only opening its Horizon OS to third parties for the first time, but also allowing a fleet of Quest-like headsets from Asus and Lenovo to come to market.

Like most years, we’re expecting to hear more about Quest games, experiences, and the company’s social VR platform, Horizon Worlds. Whatever the case, we’re always ready for a Connect curveball, so check back for our continued coverage of breaking news, previews, and more.




from Road to VR https://ift.tt/t6wjfOZ

‘Reach’ Hands-on – A Cinematic & Immersive VR Adventure I’ve Been Waiting For


Reach is the debut title from nDreams Elevation, coming to all major VR headsets next month. I got a chance to go hands-on with the PSVR 2 version during the first in-person VR Games Showcase event in London earlier this month, where Reach proved to be an immersive and surprisingly cinematic adventure I truly can’t wait to get my hands on.

Reach’s demo was one of the longest I’ve had at a live event like this, clocking in at around one full hour of gameplay. It’s pretty rare to be given so much time to poke around a game; the demo took me through the tutorial and a few disparate chapters, forming a sort of ad hoc vertical slice of the action. It feels polished. Dare I say, ‘AAA’-esque.

While the 15-minute tutorial focused on teaching me the game’s full-body movement scheme and basic bow-shooting mechanic, it ultimately turned out to be a clever bait-and-switch. My character isn’t actually fighting gun-toting baddies through a city; I’m really a movie stunt performer, making the opening moments a low-stakes taster for the real action yet to come.

‘Atlas’ | Image captured by Road to VR

That said, I don’t really know what the hell is actually happening in Reach at this point. From what I’ve played, you slip through a crack in the ground opened up by an earthquake, which thrusts you into some sort of fractured realm ruled by magical robots tussling over an ancient power grab. How I got there, or why I need to help the mysterious robot bro Atlas, has yet to be revealed.

Narrative intrigue aside, if I had to describe Reach with one word, it would be embodiment, i.e. you feel like your virtual body can do things you’d expect it to, and in a way that feels natural.

In Reach, you’ll be parkouring your way through some of the best level design I’ve seen in VR (at least in my hour-long demo), offering up an impressive amount of verticality at every turn that is intertwined with both 3D platforming and a host of environmental puzzles. It’s packed with cinematic moments, like sliding down near-vertical cliffs, ziplining over infinite drops, and blasting through the world as it crumbles around you.

Notably, jumping isn’t tied to a single button, but rather activated when you thrust your hands upward, mimicking the natural movement, which makes it feel very comfortable—and it’s just one of the ways you’ll move around.

Despite the frenetic nature of the game’s many parkour challenges, I never once felt the telltale wooziness that comes with disproportionate virtual movement. All of the standard locomotion options are there, but it’s clear the game is intent on making movement not only functional, but truly cinematic.

You’ll also be climbing and squeezing through tight spaces by ‘grabbing’ walls and ledges to move through the game’s multilevel, tower-like halls. The physics-based nature of your full-body avatar also extends to your virtual fingers, which are delightfully floppy and give the player dynamic hand poses (undoubtedly the best and most immersive approach).

That said, I didn’t get to do a ton of combat outside of the tutorial, which I learned is pretty representative of the arena-style areas I’ll see in the full game, ones that punctuate the game’s winding halls filled with patrolling baddies.

Still, the whole kit on offer feels suitably full-featured, including the main weapon: a bow with multiple arrow types, which you progressively unlock as you explore the game. Beyond your standard shot, arrows aren’t infinite, requiring you to craft them from items found in the environment.

You’re also armed with a Captain America-style shield, which you can throw and ping around corners as it bounces. Additionally, the shield pulls double duty as a locomotion device, clicking into bespoke slots and cleverly allowing you to traverse levels by tossing the shield and making death-defying jumps across wide chasms.

A grappling hook-style drone also lets you make stupidly far leaps from bespoke glowing green overhangs, which not only adds a real skill check as you explore, but also gives you a way to escape and dynamically fight back during larger arena-based battles. You’ll spot those glowing overhangs at key points throughout.

Found weapons are a thing too, although these are one-and-done pickups that you can use tactically and then toss away. Think pistols, shotguns, grenades—all of them suitably futuristic and equally un-reloadable. The idea is that you’ll rely mostly on your bow, with found weapons offering a bit of flavor as you grind through baddies.

But it’s not all combat. Every puzzle I encountered felt genuinely smart, and suitably massive to fit the game’s arcane, sci-fi vibe. Best of all, none of them were spoiled by simple solutions or a ‘helpful’ narrator telling me what to do or where to go. You’re simply placed in front of an enigmatic device, unsure of whether solving it will progress you through the story or lead you down a byway toward some new ability as a reward for your patience.

Walking out of Reach, I’m left with a few questions beyond the admittedly impressive bits I experienced. The story, for now, is a mystery. I’m also curious to learn how much of the combat will involve arenas, which I hope don’t veer into the standard set of time-killing, wave-based interactions. Bosses are also a big question mark, since I only ever saw one and didn’t fight it. Will they satisfyingly combine all of my learned skills, or simply act as narrative-forwarding set pieces? I can’t say for now.

It is, however, safe to say that I’m very impressed by the game’s locomotion, level design, and the immersion of the whole package. You won’t have to wait too long to get your hands on Reach, either.

Reach is slated to launch on October 16th on the Horizon Store for Quest 3 & 3S, Steam for PC VR headsets, and the PlayStation Store for PSVR 2, priced at $40. In the meantime, check out the gameplay trailer below:


Disclosure: VR Games Showcase covered travel and lodging expenses for one Road to VR writer to attend the event.




from Road to VR https://ift.tt/OqeGiAS