Pimax issued a statement on their forums stating that their 8K headset will not be shipped to all backers this month as previously claimed. The statement was also sent out via email.
The company explained that the 4K LCD panels used in the Pimax 8K have been rejected at a much higher rate since they enacted stricter QA testing.
So when will the remaining backers of the 8K model get their headsets? Pimax made clear that it would not be before the Chinese Spring Festival, which runs until February 10th. This means it will be weeks or months before all units are produced and shipped.
The 8K headset isn’t the company’s only offering, however; there is also the ‘Pimax 5K+’. Despite having lower-resolution (1440p) panels, the visual quality of the 5K+ is actually preferred by many. This is another reminder that resolution is only one spec of a panel. While marketing departments everywhere would have you believe otherwise, a lower-resolution panel can look superior if its other specs are better by a wide enough margin.
Image from Pimax of 5K+ “black dots” issue
The 5K+ headset isn’t free of QA issues either, however. Pimax now has a replacement program for a “black dots” issue, a flaw in some 5K+ units where, as the name suggests, black dots appear over the panel.
It’s now 16 months since the Pimax Kickstarter launched. While the company has finally shipped almost all of its 5K+ units, backers of the original 8K are still waiting. When we spoke to Pimax at CES, the company’s new Head of US Operations spoke of the many ways in which they plan to overcome their production, shipping, and support issues; let’s hope this is the year they finally deliver.
The added multiplayer functionality in a new Early Access build gives players the ability to share music-making sessions with others around the world. One player hosts a room and EXA keeps layouts synced for the various instruments as well as “items, playback states, metronome, and live ringer events.”
“The room can be made available publicly, can be hidden until a player enters the room name, or can be limited to your local network (LAN). The room creator can even put players into a ‘spectator’ mode by disabling some of their room permissions,” developer Zach Kinstner wrote in an update explaining the change.
Calling All Bands
A video further explains the syncing functionality and how it might work better over lower latency conditions. Players can talk to one another and record loops in any network condition — arranging instruments, adding sounds and building up compositions together. Loops, however, transfer to other players upon completion. That process could take several seconds for detailed loops with lots of data to transfer, according to Kinstner. Musicians can add live sounds on top of the loops via their shared instruments — just like a real-life band — in extremely low latency sessions, like over a local area network.
“When latency is low, each player’s ringer events can transfer fast enough for other players to hear the full ringer sounds at the correct time,” Kinstner explains. “In these conditions, you could conceivably play a live performance in EXA, with everyone playing their virtual instruments at the same time, rather than sharing loops. With higher latency levels, you won’t hear the full sound from a ringer event. For example, if an event reaches you 80 milliseconds late, you’ll miss the first 80 milliseconds of that ringer’s sound. As latency increases, it becomes more difficult for live performers to stay in sync with each other, and players should collaborate using recorded loops instead.”
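The trade-off Kinstner describes is simple to sketch: a remote ringer event that arrives late simply loses its opening slice of audio, and once latency exceeds the sound’s length nothing is heard at all. A minimal illustration in Python (the function name and figures are ours, not EXA’s actual code):

```python
def audible_portion(sound_duration_ms: float, latency_ms: float) -> float:
    """Return how much of a remote ringer's sound a listener hears.

    A sound arriving `latency_ms` late loses its first `latency_ms`
    of audio; nothing remains if latency exceeds the duration.
    """
    return max(0.0, sound_duration_ms - latency_ms)

# An event arriving 80 ms late on a 500 ms ringer sound:
print(audible_portion(500, 80))   # 420.0 — the first 80 ms are missed
print(audible_portion(500, 600))  # 0.0 — the sound is missed entirely
```

This is why high-latency sessions push players toward recorded loops, which transfer whole and play back in sync regardless of how long they took to arrive.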
EXA lists support for Oculus Rift, HTC Vive and Windows Mixed Reality headsets.
Every month we aim to round up each and every VR game release for you in one single place — this is January’s list. Don’t worry — we’ll continue highlighting the best ones at the end of each week too.
With the door closed on December and all of 2018, we’ve just been through another great month of VR content. Between Borderlands 2 VR, Contractors, and Blade and Sorcery, there were lots of titles to look forward to last month. Now with January and the new year upon us, it’s time to take a look at what’s coming down the line next.
And if you’re a VR game developer planning to release a game soon — let us know! You can get in touch with me directly by emailing david@uploadvr.com or hit all of the editorial team by emailing tips@uploadvr.com. Please contact us about your upcoming releases so that we can know what you’re working on and include you in release lists!
Rift, Vive, and Windows VR Game Releases For January 2019
Kosmos School is working to build a hands-on VR education platform for all grade levels.
Students taking classes with Kosmos School can design and launch rockets in VR as a way to study physics and chemistry, but the school hopes that students will soon be able to study all of their coursework in a virtual environment.
Right now, things are still small. Kosmos School is trying things out with a single class of just six students, ‘Intro to Rockets’, where students can run science experiments and work on engineering problems that would normally be too expensive or complicated to conduct in real life.
“We started with small group science classes, where students meet with our teachers in our virtual world and work collaboratively to learn,” said co-founder Can Olcer to VRScout.
In this class, students meet virtually over the course of two weeks to study physics, chemistry, and engineering to create and test rockets in order to better understand how they operate and what components they require to function.
The class is geared toward middle and high school students, with an emphasis on collaborating in real time regardless of the students’ physical location. Olcer sees this opportunity as a major advantage when it comes time to create a full school curriculum.
“VR is a fundamentally social technology,” Olcer said to VRScout. “The social aspect is key for a school, especially in primary and secondary education.”
Image Credit: Kosmos School
By socializing in VR, Kosmos can bring in homeschooled students who would typically be learning by themselves or in a very small group.
“With a VR school, we can solve two of education’s main problems today: access and cost,” Olcer said. “Cost is more tricky, but our school has a fundamentally different cost structure than a ‘physical’ school and essentially no limits for growth and economies of scale. Not only don’t we have to deal with costs of physical things, but teacher and student time can be used much more efficiently.”
Olcer’s vision speaks even louder in the wake of a six-day strike conducted by Los Angeles County teachers demanding higher wages and smaller classroom sizes.
Image Credit: Kosmos School
However, there are still a tremendous number of variables to account for before launching a full curriculum and enrolling students.
“VR is relatively new and the frictions around it make it difficult for me to stay focused for a longer period, for now,” said Daniel Abebe, who founded his own startup that helps provide funding to early-stage social entrepreneurs.
Kosmos will also have to account for differences in educational curricula across states, and for getting some parents on board with VR. Currently, though, the school offers a free info session where prospective parents can learn and ask questions about VR before enrolling their children.
Image Credit: Kosmos School
This isn’t the first use of VR as a medium for education; companies have started implementing VR for employee training in career fields ranging from police forces to fast food chains. Universities have also used VR for bringing in lecturers from other countries.
On top of providing remote education via a stand-alone virtual K-12 school, Kosmos hopes to eventually extend its tools to teachers in brick-and-mortar classrooms as well.
“We believe that in the future, a lot of the things we do today on smartphones and computers will be done in VR,” Olcer said to VRScout. “And schools are no exception to that.”
Stereoscopic 3D footage of another key sport is coming to virtual reality headsets thanks to a deal between the National Hockey League and NextVR.Read More
Animated characters are as old as human storytelling itself, dating back thousands of years to cave drawings that depict animals in motion. It was really in the last century, however—a period bookended by the first animated short film in 1908 and Pixar’s success with computer animation with Toy Story from 1995 onwards—that animation leapt forward. Fundamentally, this period of great innovation sought to make it easier to create an animated story for an audience to passively consume in a curated medium, such as a feature-length film.
Our current century could be set for even greater advances in the art and science of bringing characters to life. Digital influencers—virtual or animated humans that live natively on social media—will be central to that undertaking. Digital influencers don’t merely represent the penetration of cartoon characters into yet another medium, much as they sprang from newspaper strips to TV and the multiplex. Rather, digital humans on social media represent the first instance in which fictional entities act in the same plane of communication as you and I—regular people—do. Imagine if stories about Mickey Mouse were told over a telephone or in personalized letters to fans. That’s the kind of jump we’re talking about.
Social media is a new storytelling medium, much as film was a century ago. As with film then, we have yet to transmit virtual characters to this new medium in a sticky way.
Which isn’t to say that there aren’t digital characters living their lives on social channels right now. The pioneers have arrived: Lil’ Miquela, Astro, Bermuda, and Shudu are prominent examples. But they are still notable only for their novelty, not yet their ubiquity. They represent the output of old animation techniques applied to a new medium. This TechCrunch article did a great job describing the current digital influencer landscape.
So why haven’t animated characters taken off on social media platforms? It’s largely an issue of scale—it’s expensive and time-consuming to create animated characters and to depict their adventures. One 2017 estimate stated that a 60-90 second animation took about 6 weeks. An episode of animated TV takes between 1–3 months to produce, typically with large teams in South Korea doing much of the animation legwork. That pace simply doesn’t work in a medium that calls for new original content multiple times a day.
Yet the technical piece of the puzzle is falling into place, which is primarily what I want to talk about today. Traditionally, virtual characters were created by a team of experts—not scalable—in the following way:
Create a 3D model
Texture the model and add additional materials
Rig the 3D model skeleton
Animate the 3D model
Introduce character into desired scene
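Part of why the five steps above don’t scale is that they form a strictly sequential chain: every stage blocks the next, and each normally requires a specialist. A toy Python sketch of that pipeline shape (all names here are illustrative, not any real tool’s API):

```python
# Illustrative sketch of the traditional CGI character pipeline.
# Each function stands in for work that normally needs a specialist,
# and each stage depends on the output of the previous one.

def create_model():
    return {"mesh": "head_and_body"}

def texture(model):
    model["materials"] = ["skin", "cloth"]
    return model

def rig(model):
    model["skeleton"] = "humanoid_rig"
    return model

def animate(model):
    model["clips"] = ["walk", "wave"]
    return model

def place_in_scene(model, scene):
    scene["characters"].append(model)
    return scene

scene = {"characters": []}
character = animate(rig(texture(create_model())))
scene = place_in_scene(character, scene)
print(len(scene["characters"]))  # 1
```

Producing daily social content means running some version of this chain over and over, which is exactly the bottleneck the newer tools discussed below try to automate away.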
Today, there are generally three different types of virtual avatar: realistic high-resolution CGI avatars, stylized CGI avatars, and manipulated video avatars. Each has its strengths and pitfalls, and the fast-approaching world of scaled digital influencers will likely incorporate aspects of all three.
The digital influencers mentioned above are all high-resolution CGI avatars. It’s unsurprising that this tech has breathed life into the most prominent digital influencers so far—this type of avatar offers the most creative latitude and photorealism. You can create an original character and have her carry out varied activities.
The process for their creation borrows most from the old-school CGI pipeline described above, though accelerated through the use of tools like Daz3D for animation, Moka Studio for rigging, and Rokoko for motion capture. It’s old wine in new bottles. Naturally, it shares the same bottlenecks as the old-school CGI pipeline: creating characters in this way consumes a lot of time and expertise.
Stylized CGI avatars, on the other hand, have entered the mainstream. If you have an iPhone or use Snapchat, chances are you have one. Apple, Samsung, Pinscreen, Loom.ai, Embody Digital, Genies, and Expressive.ai are just some of the companies playing in this space. These avatars, while likely to spread ubiquitously a la Bitmoji before them, are limited in scope.
While they extend the ability to create an animated character to anyone who uses an associated app, that creation and personalization is circumscribed: the avatar’s range is limited for the purposes of what we’re discussing in this article. It’s not so much a technology for creating new digital humans as it is a tool for injecting a visual shorthand for someone into the digital world. You’ll use it to embellish your Snapchat game, but storytellers will be unlikely to use these avatars to create a spiritual successor to Mickey Mouse and Buzz Lightyear (though they will be a big advertising / brand partnership opportunity nonetheless).
Video manipulation—you probably know it as deepfakes—is another piece of tech that is speeding virtual or fictional characters into the mainstream. As the name implies, however, it’s more about warping reality to create something new. Anyone who has seen Nicolas Cage’s striking features dropped onto Amy Adams’ body in a Superman film will understand what I’m talking about.
Open source packages like this one allow almost anyone to create a deepfake (with some technical knowhow—your grandma probably hasn’t replaced her time-honored Bingo sessions with some casual deepfaking). It’s principally used by hobbyists, though recently we’ve seen startups like Synthesia crop up with business use cases. You can use deepfake tech for mimicry, but we haven’t yet seen it used for creating original characters. It shares some of the democratizing aspects of stylized CGI avatars, and there are likely many creative applications for the tech that simply haven’t been realized yet.
While none of these technology stacks on their own currently enable digital humans at scale, when combined they may make up the wardrobe that takes us into Narnia. Video manipulation, for example, could be used to scale realistic high-res characters like Lil’ Miquela through accelerating the creation of new stories and tableaux for her to inhabit. Nearly all of the most famous animated characters have been stylized, and I wouldn’t bet against social media’s Snow White being stylized too. What is clear is that the technology to create digital influencers at scale is nearing a tipping point. When we hit that tipping point, these creations will transform entertainment and storytelling.
Pick your avatar, customize your loadout, and extinguish your rivals in this homegrown jailbreak-themed shooter.
VR has become a beacon for low-budget breakout hits; the success of Beat Saber and Onward in this space would not be possible in the saturated indie market of traditional flatscreen gaming. As such, VR isn’t just an exciting frontier for futurists and gadget enthusiasts, it also represents a sweet spot—a golden window of opportunity for game developers looking to get in and shape the industry at a grassroots level.
One such hopeful pairing of developers is Ryan Schattner (Fat Moth Interactive) and James Nye (Potato Face Games), who’ve teamed up to create ‘In League’, a squad-based competitive shooter for which the development cycle has been all about throwing ideas at the wall and seeing what sticks.
“I grew up messing around with computers a whole lot,” Schattner explained to me over the phone. “When I got out of the military, I got into real estate. I was still into tech, making websites and stuff, then one day in 2014 I saw UE4 pop up and saw that it was offering all of these tools together for $20 a month. I bought it immediately, and I’ve been messing around with it ever since.”
“I was at college for a degree in Electrical Engineering,” Nye said to me during a separate phone call. “I pretty much only ever went to my math classes.”
“I ended up dropping out when I was 18 to found Potato Face Games with my friend Cope Williams,” Nye continued. “In high school at least, I was really into game dev and I made a couple of really small games. So when VR came out, I was like ‘Oh this is really cool, and I can develop for it!’.”
While Williams is still working with Nye at Potato Face Games over 3 years later, his attention is focused on running the Austin Virtual League, a local LAN-party organizer that hosts VR events and tournaments in and around Austin, TX.
‘In League’ is constantly receiving iterative content updates and patches from both Nye and Schattner, who are developing their vision for the game as it grows in popularity and scope.
Image Credit: Fatmoth Interactive / Potato Face Games
“With In League, I’ve been working on our next major patch. It includes bug fixes and a new mode,” Nye continued. “I’ve been playing a lot of Rainbow Six: Siege, and penetrative physics is one of my favorite parts of that.”
“You have Slayer mode, and then Hunt mode which is basically just Team Deathmatch,” Schattner told me. “And then there’s Hostage mode, which is single-elimination. That’s where the Rainbow Six: Siege inspiration comes in.”
But what the team found out is that it’s difficult to run a solely multiplayer-focused game this early into VR’s lifecycle. To solve that problem, they are working on giving players tons of singleplayer content to enjoy when they can’t find a match.
“We’re focusing more on missions with singleplayer and co-op gameplay until the VR userbase grows a bit more,” Schattner told me. “From a singleplayer standpoint, we have the arcade arena modes. Beyond that, it’s a matter of creating additional levels. Depending on how long it takes, I want to make 3 or 4 of those, and then there’s a procedural dungeon element to the larger levels. It generates cities and dungeons; each time you play it’ll be a bit different, kinda like Gunheart, but on a bit larger scale.”
Image Credit: Fatmoth Interactive / Potato Face Games
“We’re creating an entire 10, 15 levels uniquely that’ll take the two of us a very long time,” Schattner continued. “If I can create some variance each time you run through, it’ll be slightly different mission types for each point you hit. It keeps the replayability there, and it’s a scope we can actually accomplish.”
The duo’s secret weapon is their tester base, who’ve been giving them consistent feedback since the game first entered early-access.
“Once we started giving people beta test keys, we had about 60 members initially,” Nye told me. “Gameplay has just gotten tighter. We added attachments at one point, we added bots at another point. We originally only had one gamemode, but we’ve added three more game modes. For example the wave defense game mode was something we’d only initially thought of when we first added bots.”
“My military experience sort of played into In League’s development,” Schattner told me. “When you open a weapon drop in a match, the sound that goes off is exactly like the sound you hear when an incoming mortar round goes off in a live warzone.”
Given the wide availability of assets for sale on the UE4 Marketplace, the team has mostly been able to implement the gameplay features that they wanted.
Image Credit: Fatmoth Interactive / Potato Face Games
“For the weapon design choices, it was mostly based on the availability of assets,” Schattner told me. “If you look at Pavlov, they have some of the same asset packs that we do. We just want what people would expect; if you’ve played Call of Duty: Modern Warfare 2, you want that whole suite of weapons that players would expect.”
“I definitely wanted a claymore, definitely wanted a bear trap, definitely wanted landmines,” Schattner continued. “I want to put mortars in the game eventually. I have the assets for it, where you can drop it down the tube and it’ll bounce out.”
“When we started looking at a hostage mode, we needed something for people to defend themselves with,” Nye told me. “That’s what influenced the traps. But then we needed something for people to revive each other with, so that inspired the epi-pen.”
“If we didn’t have the asset for the epi-pen, it probably would have been something else,” Nye continued. “We needed something that people could revive each other with. Between asset availability and gameplay mechanics, it goes both ways.”
Image Credit: Fatmoth Interactive / Potato Face Games
But even though ‘In League’ uses many prefabricated assets, it does have a unique ‘look’ due to Schattner’s expertise in 3D modelling.
“I pretty much started out with 2D art and 3D art before I started programming,” Schattner told me. “Programming was the difficult part, but creating the models was tons of fun. The workflows nowadays are pretty easy for something that’s humanoid. So basically you can use an Adobe product to get a base model, and from there you can modify it however you want.”
“The character aesthetic is similar to Manhunt, where everybody’s a bad guy,” Schattner continued. “The game’s story is like the crowd spectating in Manhunt. Basically it’s like a person encouraging you to make snuff films, so that’s why you can down someone and get double points for executing them with a melee weapon. Everyone in the game is a different sect of insane people.”
The developers of ‘In League’ are managing to push the envelope on the VR game dev’s handbook by inventing their own solutions for features like climbing and downing other players.
Image Credit: Fatmoth Interactive / Potato Face Games
“Something that’s been joked about has been the ability to throw weapons,” Nye told me. “It’d be cool to add that in. I don’t feel it would happen enough to go into it with the current manpower we have. It’d take me little over a weekend to do, but it’s a matter of ‘Do I need to focus on that?’.”
“There’s OOB (out of body) movement, which is a way to locomote around,” Nye continued. “We use that really specially in In League that other VR FPS games don’t. When you get below a certain amount of health, you go into a crawl state where you move around and try to reach safety. It’s from third person, so you can crawl around the corner and not see yourself, but your friend can revive you because he has an epi-pen. But you have the challenge where you can’t see your own movement.”
“OOB was a whole separate challenge, but we ended up incorporating that into gameplay,” Nye continued. “It was a movement mode for a plugin we used called VR Expansion, and that was not the intended use for it. But we combined it with the downed-state gameplay mechanic and it turned into a really cool component.”
Image Credit: Fatmoth Interactive / Potato Face Games
“The most hardcore VR gamer is our target audience; the hardcore shooter fans,” Schattner told me. “There’s no babysitting, we don’t really help with anything. There’s no highlight over where you cock your gun. No teleports; it’s all free locomotion. You get the steeper learning curve plus the climbing and additional features you have to contend with.”
“The game design language in VR is definitely, absolutely still being written,” Nye continued. “Everything is wrong but you can do everything wrong the right way. You have to attack every idea. You have to try it; if it doesn’t work, it doesn’t work.”
You can purchase and download the early-access alpha of ‘In League’ on Steam for an MSRP of $20.
There’s a strand of VR madness that really works. Accounting+ embraces inevitable moral panics and judgment-free murder to create something entirely surreal. Job Simulator finds fun in the mundane, letting you live out your stupidest daydreams free from consequence. As the name implies, Mosh Pit Simulator has a slightly more traditional take on the zany possibilities of VR. It’s essentially a Goat Simulator wannabe inside a headset. I’m sorry to say the results are profoundly less interesting.
Don’t get me wrong, I had my giggles inside Mosh Pit Simulator’s creaky sandbox. Attaching missiles to a whale’s fin and then watching it corkscrew off into the sunset or punching rubbery humans through windows 50 stories high will always be at least a little funny. But it’s laughter I’ve already enjoyed in other, better games, and it wears too thin too fast.
If anything, this feels like a cautionary tale. Yes, there’s fun to be had being the last human on earth, but be careful what you wish for. Mosh Pit Simulator is set in a relatively small open world in which humans’ bones have been turned to rubber and their brains resemble mush. In the sandbox mode, you can summon missiles and rotators that will send them and other objects spiraling off into space. It’s broken more often than not; humans clip through walls, collisions end with objects disappearing, and the screen can stutter with how much it has to handle.
But any laughter you might get from it rings hollow across the game’s unsightly streets. These aren’t happy accidents; they’re glitches for the sake of glitches. Mosh Pit Simulator seems content with laughing at VR’s limitations rather than finding the deeper humor in what it does right. The world is also empty; there are some NPCs around but you have to summon most of them yourself in a shop. In the game’s single-player story (essentially a glorified tutorial), giant animals tour the town like clockwork. It fleshes the world out considerably. If the sandbox mode itself were this unpredictable I might find a reason to spend more than a few minutes inside it.
As it stands, this world feels dead and not intentionally so. There’s no audible impact when objects collide, making spectacular crashes feel lifeless. You can stick any two objects together but there often isn’t much point to it. The truth of the matter is that there just isn’t that much to do.
Now, I realize that I probably just don’t ‘get it’. I know that I’m being a Scrooge here and that people may mine hilarity from Mosh Pit. They’re probably the same people who find Drunkn Bar Fight funny (I don’t). And, hey, more power to you. This has enough ammunition to fuel a few hours of streaming madness for sure. But a VR game that’s ultimately better watched than it is played is not something I can recommend.
Mosh Pit Simulator’s current state is a bit of a disappointment, then. This is all just the start, though. The game’s kicking off a proposed six-month Early Access phase today. If Mosh Pit wants to become the true Garry’s Mod of VR it’s going to need a heck of a lot more substance. As it stands, this is a virtual playground in dire need of some life.
Oh and might I add: bah humbug.
Mosh Pit Simulator is available now on Oculus Rift, HTC Vive and Windows VR for $19.99.
VR startup Vreal is opening up the audience for its service more widely with the addition of a desktop mode for its streaming platform.
The Vreal service is integrated with a number of VR titles including Tilt Brush, Superhot, H3VR, Gorn, Blocks and Fantastic Contraption. The app allows folks to record their session in a virtual world for playback later. This new mode lets viewers navigate around a recorded scene to see the action from another angle without needing to put on a VR headset.
This new mode could be useful for folks who spend a lot of time in one of the compatible titles and want to grow an audience for those experiences. The app should let viewers get closer to the action than a traditional Twitch stream. In particular, creators in apps like Tilt Brush or Blocks might be able to explain how they are making something to future viewers who get right up to see every brush stroke.
The company uploaded the following video to demonstrate the new mode.
Vreal remains in early access on Steam. Earlier in January, the company added support for Gunheart, representing the first game built in Unreal Engine to get support for the service.
We’re curious to see what 2019 has in store for Vreal. There is still little in the way of details regarding next generation PC-based VR headsets and those new systems could have a major impact on adoption and usage of a streaming platform like Vreal. We’re expecting major updates in the coming months at events like Mobile World Congress, Game Developers Conference and even E3 which could reshape the market.
Facebook this week hosted their Q4 2018 earnings call, reporting their finances for the quarter. During the call Facebook CFO David Wehner stated:
Payments & Other Fees revenue was $274 million, up 42%. Sales of Oculus Go and the launch of Portal contributed to the revenue growth in the quarter.
This is the first time Oculus has been mentioned as a revenue source. In 2016 after the Rift launch Wehner had very different news, stating “It’s not going to be material to our financials this year.”
This seems to indicate that the Oculus Go is selling much better than Rift ever did. Remember these are revenue figures, not units, so multiple Go headsets need to be sold to generate the same revenue as one Rift.
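As a rough illustration of that revenue-versus-units point (the prices below are our assumptions, not figures from the earnings call — Go launched at $199, while Rift has sold at $399 since its 2017 price cut, and at $599 originally):

```python
# Rough illustration only; prices are assumptions, not from
# Facebook's earnings call.
GO_PRICE = 199
RIFT_PRICE = 399

def go_units_for_same_revenue(rift_units: int) -> float:
    """Go headsets needed to match the revenue of `rift_units` Rifts."""
    return rift_units * RIFT_PRICE / GO_PRICE

print(round(go_units_for_same_revenue(1), 2))  # 2.01 — about two Go sales per Rift sale
```

In other words, matching Rift revenue at Go’s price point takes roughly twice the unit volume, which makes the revenue milestone more notable, not less.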
At Oculus Connect 5, Oculus CTO John Carmack claimed Go headset sales “exceeded even my expectations,” and that he had been “the most optimistic.”
It’s important to note that “Payments & Other Fees” is the smallest section of Facebook’s revenue. The company’s main business is still advertising, which brought in over 98% of revenue.
Marketing Costs
The notable Oculus Go revenue didn’t come for free, however. Facebook’s total expenses were up by $1 billion, an increase of 62% compared to this time last year. When explaining this increase, Wehner included the marketing cost of Oculus Go:
In addition to continued investment in infrastructure, safety & security, and innovation, expenses were also driven by seasonal factors – including marketing efforts, notably the promotion of Portal and Oculus Go.
This likely refers to the celebrity marketing campaign Facebook ran in the fall. Wiz Khalifa, Jonah Hill, Adam Levine, Leslie Jones, and Awkwafina were enlisted in an effort to sell the standalone headset. While Facebook doesn’t break down its marketing expenses in detail, we don’t imagine those stars work for cheap.
Profitability
No comments were made about the profitability of Go, only about raw revenue. At $199, it’s likely the headset is sold at or near cost; the Oculus Store is where the profits should come from.
But this early in the VR market Facebook may not care about profit yet. In a 2016 earnings call, CEO Mark Zuckerberg described VR’s profitability as “a 10-year thing”. But what does seem likely from this week’s comments is that Oculus Go is selling better than Rift ever did.
America’s National Hockey League today announced a new VR video experience, captured during last weekend’s NHL All-Star Game, which is now available via the NextVR app. The league is further promising more VR highlights to come from “select 2019 NHL marquee events.”
Available today via the NextVR app on every major VR platform, NHL is releasing a VR video highlight reel from the All-Star Game, an exhibition matchup featuring top talent from the hockey league. The event was hosted last weekend in San Jose, CA, and marks roughly the halfway point of the NHL season.
NextVR is a leading producer of broadcast-focused VR video content, and shoots some of the top quality live-action content viewable in any VR headset, typically consisting of stereoscopic 180 degree footage up to 60 FPS. The company was on hand to capture highlights from the All-Star Game, and the footage is now available globally for free in a new NHL channel within the NextVR app.
And that new channel also heralds more NHL content to come from NextVR, the companies say. While specifics have yet to be revealed, “VR post game highlights from some of the NHL’s biggest events” are purportedly forthcoming. And though NHL’s VP of Business Development, Chris Golier, says that the VR content will make “fans feel like they are at a live NHL game,” so far no plans have been announced to stream full NHL games in VR, live or on-demand.
And while NextVR produces some content which makes its way to the social-enabled Oculus Venues app, this particular NHL content is currently only available through the company’s own NextVR app which lacks any social functionality.
SEE ALSO
NextVR's Latest Tech is Bringing New Levels of Fidelity to VR Video
The announcement with the NHL is good news for NextVR which has been in the VR space since the very beginning—and has repeatedly snagged opportunities to produce VR video content with top sports leagues—but has struggled to find real traction with its content. Earlier this month the company laid off a significant portion of its workforce, saying that it was “staffed for a pretty explosive growth curve” which didn’t pan out as expected.
The vast majority of Oculus Go apps are best played with a motion controller. But there’s no denying that some games and experiences simply play better with a gamepad. Good news, then; SteelSeries’ latest product is an Oculus Go gamepad designed for gamers.
The SteelSeries Stratus Duo launched this week for $59.99. It comes with all the usual bells and whistles; dual analog sticks, four face buttons, a d-pad, and shoulder buttons. SteelSeries is keen to mention that the kit works with both the Oculus Go and the Gear VR headsets, though. It connects via Bluetooth.
The Stratus Duo also comes with over 20 hours of rechargeable battery life. It weighs in at 245g. Inside the box you’ll find a wireless USB adapter (which you won’t need for VR) and a Micro-USB charging cable.
Outside of Go and Gear, the Stratus Duo also supports Windows PCs and Android devices. It’s also compatible with Steam games, so you could use it with your Vive, Rift and Windows VR headsets too. With a Stratus Duo in hand, you can play some Go games that require a controller like the Herobound series. Other games like Republique also just play better with a gamepad.
You can also use console gamepads you already own for Oculus Go gaming but support can be finicky. It might be the more expensive choice, but a dedicated controller is definitely the better way to go if you’re serious about an Oculus Go gamepad. Whether or not the Stratus Duo is the best option we can’t say just yet; we haven’t gone hands-on with it ourselves.