Thursday 30 November 2017

Virtual Reality Headset plus Remote by Hype by Cynoculars



Virtual Reality Headset plus Remote by Hype by Cynoculars
ID: 272962784567

Auction price: $9.50
Bid count:
Time left: 29d 20h

Buy it now: $9.50

December 1, 2017 at 10:01AM
via eBay http://ift.tt/2i5vvIv

VR BOX 2.0 Virtual Reality 3D Glasses Cardboard for Smartphones Movie Game TV



VR BOX 2.0 Virtual Reality 3D Glasses Cardboard for Smartphones Movie Game TV
ID: 391937441394

Auction price: $8.55
Bid count:
Time left: 9d 23h

Buy it now: $8.55

December 1, 2017 at 01:15PM
via eBay http://ift.tt/2kdKKmO

HTC VIVE - Virtual Reality System



HTC VIVE - Virtual Reality System
ID: 222739311971

Auction price: $550.00
Bid count:
Time left: 29d 23h

Buy it now: $550.00

December 1, 2017 at 01:15PM
via eBay http://ift.tt/2k9PYjw

VIULUX V1 VR Game Headset Virtual Reality Glasses Movie 1080P 5.5" OLED for PC



VIULUX V1 VR Game Headset Virtual Reality Glasses Movie 1080P 5.5" OLED for PC
ID: 152809653687

Auction price: $164.98
Bid count:
Time left: 29d 23h

Buy it now: $164.98

December 1, 2017 at 11:17AM
via eBay http://ift.tt/2isw2bf

3D VR Headset Virtual Reality Goggles for Movies Video Games 4.7-6.0" Smartphone



3D VR Headset Virtual Reality Goggles for Movies Video Games 4.7-6.0" Smartphone
ID: 272962881191

Auction price: $17.99
Bid count:
Time left: 29d 23h

Buy it now: $17.99

December 1, 2017 at 11:43AM
via eBay http://ift.tt/2zFmqNO

DOOM VFR On HTC Vive Does Not Include A Smooth Locomotion Option

http://ift.tt/2jz6lT3

DOOM VFR On HTC Vive Does Not Include A Smooth Locomotion Option

DOOM VFR is finally here! The game is launching for both Sony’s PlayStation VR (PSVR) and for PC VR on Steam. One of the biggest questions leading up to today's launch was whether or not the game would include full, smooth locomotion as a movement option. On the PSVR edition of the game, which we’ve already completed for review (the campaign took us about 3 1/2 hours to finish and there is no multiplayer) and livestreamed for two hours, there are three types of movement. You can play traditionally with the DualShock 4 gamepad plus headtracking while wearing the headset, you can use the PS Move controllers with teleporting and dash movements, or you can play with the PS Aim controller peripheral with either smooth, analog stick locomotion, dash movements, or teleportation.

But if you opt for the HTC Vive version with the included Vive wands as motion controllers, which is graphically superior and features full 360-degree roomscale tracking, you lose the option of smooth, direct movement for locomotion. It’s just like playing with the PS Move controllers in that you can only dash step or teleport — that’s it. Here is the officially provided controller guide sent over to us from Bethesda directly:

The decision seems a bit baffling. The Vive wand trackpad is more than capable of translating inputs into smooth locomotion, as seen in countless other VR games, so the absence of VR's most requested movement method seems odd. Make no mistake about it though: modders will very likely create smooth locomotion hacks and mods on their own, as has happened in the past. Meanwhile, you can currently play a modded version of DOOM 3 BFG Edition with full, smooth locomotion instead if you want.

You can read our full review of the PSVR version right here (it will be updated with thoughts on the Vive version in due time). While the review does not factor in playing on the Vive platform, the content of the game is still exactly the same on both devices. For comparison, here's a video of gameplay using the PS Aim gun peripheral on PSVR with full, smooth locomotion:

What do you think of this news? Perhaps Bethesda will patch in smooth movement for Vive as an option. You can purchase DOOM VFR on Steam with official HTC Vive support right now for $29.99. In the meantime, let us know your thoughts down in the comments below!




from UploadVR http://ift.tt/2i6hM4h
via IFTTT

HP Mixed Virtual Reality VR Headset with Controllers - Brand New - Sealed in Box



HP Mixed Virtual Reality VR Headset with Controllers - Brand New - Sealed in Box
ID: 162782555647

Auction price: $219.00
Bid count:
Time left: 29d 23h

Buy it now: $219.00

December 1, 2017 at 09:53AM
via eBay http://ift.tt/2j6ioaM

3D VR Glasses Virtual Reality Glasses Headset Bluetooth 3.0 Remote Controller



3D VR Glasses Virtual Reality Glasses Headset Bluetooth 3.0 Remote Controller
ID: 302549196577

Auction price: $19.95
Bid count:
Time left: 29d 23h

Buy it now: $19.95

December 1, 2017 at 08:38AM
via eBay http://ift.tt/2zCTkyC

Virtual Reality Baseball Headset



Virtual Reality Baseball Headset
ID: 253287619044

Auction price: $22.00
Bid count: 0
Time left: 9d 23h

Buy it now: $35.00

December 1, 2017 at 08:19AM
via eBay http://ift.tt/2kdkTeK

DOOM VFR Does Not Seem To Work On Oculus Rift At Launch

http://ift.tt/2i5NWgk

DOOM VFR Does Not Seem To Work On Oculus Rift At Launch

The burning question at the top of every Oculus Rift owner’s mind ever since E3 this year has been: Will DOOM VFR and Fallout 4 VR work on the Oculus Rift, natively, through Steam? Bethesda has been careful with their language ever since these games were announced and the Store pages clearly list only HTC Vive. But now, at long last, we finally have the answer to the first half of that question: No, it does not appear that you can play DOOM VFR with an Oculus Rift, at least at this moment. After we published our full review of the PSVR edition of the game, Bethesda sent us a download code for the Steam version. Officially, the Steam Store Page only lists HTC Vive as a supported headset and it looks like that was intentional.

Anyone who's successfully played a Vive game with a Rift in the past probably assumed DOOM VFR would "just work" with the Rift too, but we've found otherwise. When I load the game up on my Vive I get a series of splash screens and logos, photosensitivity warnings, and a loading screen. After that the main menu appears. When I try it using my Oculus Rift, however, the game boots me back to the SteamVR Home space in between the loading screen and the main menu.

It never finishes launching the game.

To understand the situation you have to know that Bethesda's parent company, ZeniMax, is currently locked in an ugly legal battle with Oculus's parent company, Facebook. We've covered that case and the ensuing trial plenty, but we don't know whether it was a contributing factor in preventing Rift access to the game. Ideally, support will arrive sooner rather than later.

Have you tried playing DOOM VFR on a Rift yet? Let us know down in the comments below! We will continue updating this story once we find out more.




from UploadVR http://ift.tt/2jA7DwN
via IFTTT

How to get the best fit for your Gear VR

http://ift.tt/2nr8EIf

Getting a good fit is the first step on a journey in VR.

Gear VR is one of the more accessible VR headsets available right now, as well as being a pretty comfortable fit. Before you go diving directly into VR though, you're going to want to adjust the headset so that it sits correctly on your head. After all, who wants to start enjoying VR only to move their head and have their headset slip off their face? This is how you ensure that never happens.

Adjust the straps

The big thing that you'll need to do is adjust the straps on your headset. Gear VR has two straps that you can adjust: the first attaches by velcro at the top of the headset, and the second attaches on the left and right of the headset, also with velcro. What you'll want to do is put on your headset, and then hold it in place.

Once you've got one hand holding your headset where you ideally want it to sit, you're going to use your other hand to adjust your straps. They can be a little bit tight at first, and it's disorienting because you can't actually see what you're doing. It isn't difficult though. All you need to do is tighten the straps to your preferred fit.

Your headset should be firmly pressed against your face, and shouldn't move when you turn your head. If it wobbles, you'll probably want to tighten it a bit more. This can be tricky if you wear glasses though, because you need to make sure that the headset is firmly in place without hurting yourself from your glasses digging into your nose. If this is your first time with Gear VR, you may need to focus your Gear VR to avoid any blurriness.

If you have issues getting the headset firmly seated at the front of your face by yourself, you can also tag in a friend and have them tighten the straps for you. The important thing is for it to be firm and flush against your face without feeling particularly tight. You're gonna be looking around, and depending on the experience, moving a bit, so you want to ensure that the headset is going to stay precisely where you've put it.

Why it matters

Having a firm fit with your Gear VR matters more than you think it might when you first try. While having your headset slip and fall when you're sitting in the living room playing a game might be aggravating, it isn't the biggest deal ever. However, places like Six Flags have started to introduce VR roller coasters that use Gear VR. Making sure that your headset is firmly attached will ensure that you don't lose it while rocketing 70 miles an hour down hills and through loops.

While most people will probably never experience one of these Gear VR coasters, the same theory applies at home. You want a firm fit that isn't going to slide off of your face if you're ducking a projectile, or turning around to make sure you can see everything surrounding you.

Step by step directions

  1. Hold your headset to your face where you want it to sit.
  2. Using your other hand, tighten the straps on the sides and top of your Gear VR.
  3. Remove your hands and move your head to make sure it's a firm fit.
  4. Adjust again as necessary.

Questions?

Getting a good fit with your Gear VR shouldn't take any more than a few minutes, and if you're the only one wearing the headset, it should be good to go for future VR adventures. Did you have issues adjusting your headset? Do you still have questions? Drop us a line below in the comments, or pop into our forums and let us know about it!



from VRHeads - Best VR Guide for Virtual Reality Headsets, Top Picks and How To http://ift.tt/2oUBvq7
via IFTTT

3D VR Headset with Magnetic Trigger, Lightweight Virtual Reality Goggles - 3rd G



3D VR Headset with Magnetic Trigger, Lightweight Virtual Reality Goggles - 3rd G
ID: 112665567149

Auction price: $17.34
Bid count:
Time left: 29d 23h

Buy it now: $17.34

December 1, 2017 at 07:08AM
via eBay http://ift.tt/2Bx8OFH

‘Justice League’ IMAX VR Experience to Debut Haptic VR Controller from Tactical Haptics

SUPERIOR Virtual Reality Headset, 3D Glasses VR Googles With Bluetooth Remote



SUPERIOR Virtual Reality Headset, 3D Glasses VR Googles With Bluetooth Remote
ID: 192381769354

Auction price: $45.42
Bid count:
Time left: 29d 23h

Buy it now: $45.42

December 1, 2017 at 05:56AM
via eBay http://ift.tt/2BznST2

View-Master Virtual Reality Starter Pack with Wildlife and Destinations packs



View-Master Virtual Reality Starter Pack with Wildlife and Destinations packs
ID: 122837321484

Auction price: $25.00
Bid count:
Time left: 29d 23h

Buy it now: $25.00

December 1, 2017 at 05:54AM
via eBay http://ift.tt/2AtFEJK

Best PlayStation VR Events You Can Watch Live

http://ift.tt/2kafdSP

Latest Updates on Upcoming Live Events

Coming December 7th, 2017: Live Game Awards on PlayStation VR

December 7th is fast approaching and so is everyone's excitement for the 2017 Game Awards. With the countdown now hitting less than 10 days, it's time to start planning how you're going to watch the live-streamed event for yourself. Luckily, that option is available to you directly through your PlayStation VR. Tune in this year to watch the annual awards ceremony and get a sneak peek at the games set to be released in the future.

See Game Awards Official page for 2017 Game Awards Timezone & Download Information

A whole new way to experience your favorite events.

Here we're going to discuss the different options available when it comes to watching live events on your PlayStation VR. If we hear about it, you'll be the first to know about it. From live sports and experiences to events and concerts, we'll make sure you stay updated!

Let the Excitement Begin!

What is NextVR?

NextVR is an app available to you on the PlayStation Store. Here you can find plenty of live entertainment for a variety of different interests. It's a video app you can use from the comfort of your couch without missing out on much of the VR experience: you get a 180° view of the actual event (the other 180° is an image filler).

Moving your head or body won't change angles much at all since the scenes aren't interactive, but don't let that discourage you; you're still going to have a great time with an up-close and personal experience of your favorite forms of entertainment. Using your VR headset to watch videos is one thing, and using it to watch the 3D videos NextVR has to offer is another.

Download NextVR Directly from the PlayStation Store Here!

NextVR: NFL

NextVR has partnered with the National Football League to bring you the best football highlights from five games of the 2017 season, with Super Bowl XLIV champion Reggie Bush hosting and Elika Sadeghi as commentator. It's a 3D experience from the front-row seats of your favorite sport. On December 10th, NextVR will release the game highlights of the Dallas Cowboys vs. the New York Giants and, until then, they have the highlights of four other games available to view now! Don't forget to tune back in on December 18th for the season recap as well!

NextVR: NBA

NBA Digital has made NextVR the Official League Pass VR Partner of the NBA. Instead of game highlights like they've done with the NFL, you'll have full games LIVE right there in the app! For International NBA League Pass subscribers, ALL NBA games will be live and at your fingertips through the NextVR Screening Room in the app.

In addition to that, 27 games filmed specifically for a fully immersive VR experience will be available to watch. They're going to film one game from each team's home arena so that you can experience every basketball court without ever having to leave home. For NBA League Pass holders, 7 of those 27 VR-experience games will be available to you.

See at NBA League Passes

NextVR: Live Dance Performances & More!

Not too big of a fan of football or basketball? That's okay, NextVR has plenty of other options available to cater to different interests. With four boxing event videos, all featuring Canelo Álvarez, you can stand right on the edge of the ring and watch amazing highlights of his fights, or watch one of the 11 ICC soccer game highlight videos.

Then there's the NBC World of Dance, with 17 dances and behind-the-scenes clips of some of our favorite dance crews like Keone & Mari, Stroll Groove, or The Lab. They create a unique experience of watching these recitals with many different views, including one from the crowd AND a 180° camera that circles the stage! It's the best of both worlds: you not only experience the fancy footwork up close and personal, but you also get a view from the crowd for the judging. The immersive feeling of actually having to turn your head from the contestants to the judges during commentary is phenomenal.

Download NextVR Directly from the PlayStation Store Here!

Hatsune Miku: VR Future Live

Now we all remember Hatsune Miku, the J-Pop Vocaloid that debuted in 2007, developed and distributed by Crypton Future Media. Well, lucky for us, now you can enjoy a 3-song concert exclusively through your PlayStation VR. In fact, SEGA, the publisher of the experience, came out with three total concerts for you to enjoy.

All of them have just about endless possibilities for where you can choose to stand during the performance, and you can change position at any time during the show. Miku is the only available performer in the first concert, but other popular Vocaloids like Kagamine Rin, Kagamine Len, and Megurine Luka are available in the second, while the third performance features Kaito and Meiko.

Download Directly from the PlayStation Store Here!

OR!

See at SEGA

Joshua Bell VR Experience

Sony took the virtual viewing of music performances by storm by having Joshua Bell, partnered with pianist Sam Haywood, perform Brahms' "Hungarian Dance No. 1" live and just for you. Not content to stop at a 360° video, they also used advanced audio technology so that the music follows your precise location during the experience.

As a classical music lover and fellow VRHead, I'm almost speechless by the performance that danced across my headset as if I were standing right in front of him. The video is available free on the PlayStation Store, and I absolutely recommend it.

Download The Joshua Bell VR Experience Directly from the PlayStation Store Here!

Live Performances on Youtube VR

No upcoming events or concerts to watch live? Don't let that discourage you. A quick search on YouTube for "VR Live Performances" will turn up almost endless concerts and performances, with choices ranging from Metallica to Hamilton. This isn't even counting the available content of 360° live-streamed events, sceneries, and semi-interactive music videos.

Download the YouTube App Directly from the PlayStation Store Here



from VRHeads - Best VR Guide for Virtual Reality Headsets, Top Picks and How To http://ift.tt/2BC4bKD
via IFTTT

Wonderstorm is a new TV and game studio from League of Legends and Uncharted 3 vets

http://ift.tt/2AqDf2p


You could make a Netflix Original series, or you could make a video game — or if you’re newly minted studio Wonderstorm, you make both. Its founders are Riot Games vets, and they’re partnering up with MWM (previously Madison Wells Media) to create an original animated Netflix series as well as a corresponding game for late 2018.

To start the studio, all three cofounders left careers at Riot, which develops the No. 1 multiplayer online battle arena game in the world, League of Legends. Executive producer Justin Richmond previously headed up a game development team at Riot. Before that, he was the game director for Naughty Dog’s critically acclaimed Uncharted 3: Drake’s Deception. And Wonderstorm president Justin Santistevan was Riot’s head of finance for research and development.

“Riot is an awesome place, and we worked with amazing and talented people there – and League has an incredible player community,” said Richmond in an email to GamesBeat. “We felt lucky to be a part of that. In a lot of ways, we were inspired by Riot to go out and build something ourselves, from the ground up. I think Wonderstorm will be more equally focused on both story and game than Riot is, and it obviously gives us a chance to build and own our vision with a small awesome team.”

Aaron Ehasz, Wonderstorm’s CEO, was Riot’s creative director. He also had an illustrious career in TV. He was co-executive producer and showrunner of Avatar: The Last Airbender, an animated series that won multiple accolades including a Peabody Award.

“Some of our favorite experiences when we were younger, and now for that matter, have been with worlds that have amazing characters and stories, but also great video games and other opportunities to play in that world,” said Richmond. “So we wanted our studio to be a two-headed beast that could deliver awesome experiences on both the story and game front. We got lucky that one of the first partners that believed in our creative vision was Netflix. Ultimately, we hope that our audience and players find our Netflix series engaging, and that it sparks their imagination — and that they take that with them so they can have a deeper and more engaging experience with an amazing game.”

As a founding investor, MWM will be executive producing Wonderstorm’s projects, including the Netflix Original series. Its previous projects include Wevr’s virtual reality experience Gnomes & Goblins.



from VentureBeat http://ift.tt/2it8sv1
via IFTTT

VR Deals Of The Week: PSVR, Games, Accessories And More

http://ift.tt/2hXbTGs

VR Deals Of The Week: PSVR, Games, Accessories And More

Welcome to our round-up of some of the best VR deals from around the web. Note that UploadVR may receive a commission from products listed in this article. For more details on affiliate links and other editorial practices, be sure to check out our Code of Ethics.

PlayStation VR

PlayStation VR Gran Turismo Sport Bundle For $299

Polyphony Digital made its PS4 debut with Gran Turismo Sport a few weeks back, and it included some pretty polished, if very light, VR content to boot. Basically, you can race any car on any track in 1 vs 1 AI races. It's not the most abundant of experiences, but it's still a heck of a lot of fun to play and, with a PSVR and camera for $299, you're basically getting it for free. You can't argue with free, can you?

PlayStation VR HMD For $277


If you just want the PSVR by itself you can get it for $73 off the suggested $350 retail price, but we're not sure why you'd want it without the camera and free game included in the bundle above.

PlayStation Move Controllers for $79

Apps and Games

DOOM VFR PS4 for $29.88

When we say “go to Hell,” we mean in the best way possible — in VR!

Gran Turismo Sport PS4 Digital Code for $39.59

Grab a digital version of this PS4 king of racers for around $40.

Steam Deals on VR Games and Apps

Accessories

PlayStation Gold Wireless Stereo Headset For $65

VR is best experienced with a pair of headphones, so why not keep things strictly Sony with these official PS4 headphones? There's also a hidden noise-canceling microphone so you can chat with your friends during online play.

$10 off VRGE VR Charging Dock with Free Shipping


Use code UPLOAD at checkout and get $10 off the VRGE VR dock and charging station.

Corwin E7 Active Noise Cancelling Headphones

Get the Corwin E7 Active Noise Cancelling Headphones for $39.99 by using code M8TUC9ZO at checkout.



from UploadVR http://ift.tt/2BzWtQY
via IFTTT

Virtual Reality Headset With Stereo Headphone For iPhone And Android Smartphones



Virtual Reality Headset With Stereo Headphone For iPhone And Android Smartphones
ID: 282755234741

Auction price: $37.85
Bid count:
Time left: 29d 23h

Buy it now: $37.85

December 1, 2017 at 04:40AM
via eBay http://ift.tt/2Bn9Djk

Virtual Reality VR Headset 3D IMAX Video Glasses for Android IOS iPhone Samsung



Virtual Reality VR Headset 3D IMAX Video Glasses for Android IOS iPhone Samsung
ID: 182937483789

Auction price: $5.99
Bid count:
Time left: 6d 23h

Buy it now: $5.99

December 1, 2017 at 04:00AM
via eBay http://ift.tt/2Bn5SdI

The 12 best PSVR games available today

http://ift.tt/eA8V8J

The PlayStation VR has eclipsed even Sony's expectations in its first year, in large part due to the strong lineup of games. From Job Simulator to Resident Evil 7, here are our picks for the 12 best PSVR games so far.

The post The 12 best PSVR games available today appeared first on Digital Trends.



from Virtual Reality–Digital Trends http://ift.tt/2vFWovb
via IFTTT

Google Cardboard 2nd Gen. VR Box Virtual Reality 3D Glasses Bluetooth Control



Google Cardboard 2nd Gen. VR Box Virtual Reality 3D Glasses Bluetooth Control
ID: 152809178946

Auction price: $8.99
Bid count:
Time left: 29d 23h

Buy it now: $8.99

December 1, 2017 at 03:04AM
via eBay http://ift.tt/2j43aTT

Monster of the Deep: Final Fantasy XV is fun but a little rough

http://ift.tt/2nhqd29

Fishing in VR with a Monster Hunting flavor.

Monster of the Deep is a spin-off VR game based on the minigame from Final Fantasy XV where Noctis goes fishing to gain AP. It's essentially a fishing game with the occasional bit of crossbow shooting thrown in for good measure, which winds up being a good bit of fun.

Unlike the main game, you play a nameless hunter in the world of Final Fantasy XV instead of its whiny hero Noctis. You do bump into others as you go, however; so far I've met Noctis and Cindy, the amazing mechanic who is entirely underdressed for this game, with more characters showing up as you play.

What are the Controls like?

Monster of the Deep uses both Move controllers to simulate fishing in the real world. You use your right hand to cast the line while using the left hand to make circular motions to simulate reeling the line in. I think it is supposed to be 1:1 movement, but the reeling doesn't feel very exact: whether you reel in fast or slow, the motion on screen seems to be the same. Once you hook a fish by pulling up on the controller, you have to fight it like you would a real fish. You get visual cues for when you need to pull left and right, all while reeling in or giving slack, to get that fish in the basket.

There are also two other objects in the game to use: a crossbow and a radar. Both of these feel much better to control than the fishing reel, which is an issue when the main game is fishing. The radar you simply unclip from your belt and press the trigger to make it ping, while the crossbow requires targeting like any other first-person shooter in VR. Of the three types of equipment in the game, the crossbow is certainly the best.

How's the Fishing?

This game is not going to be an adrenaline-filled thrill ride.

The main fishing game is actually quite a lot of fun. Like any fishing, in a game or in real life, there is a lot of waiting for a fish to bite. You can buy lures and strings to maximize your fishing experience and give yourself the best chance of some big catches. The bigger the catch, the faster your monster meter fills up so you can take care of the big Daemon living in the water. This game is not going to be an adrenaline-filled thrill ride. It's a slow, lazy fishing game, with monsters.

Casting the rod also takes a lot of time to master. It doesn't feel connected to what you are doing at all. I have tried to cast the rod with the same motion and it can fly miles into the distance or land just in front of me with seemingly no rhyme or reason. This makes casting into the correct areas extremely haphazard and frustrating. You are supposed to use the radar to locate the correct areas of the lake to cast your lure into, but with the randomness that seems to apply to your casting, it's hit-and-miss at best.

Is this... boring?

I can understand how you would think that, but sometimes that's kind of nice. Some of the best VR games are ones that allow you to take in grand vistas and enjoy the experience of being inside a world you may never have been in. The premise of Monster of the Deep, to inhabit a world you love and meet people you know from hours of gameplay, is a good one. It's nice to sit there with a virtual rod in the water and not catch fish, just like I do in the real world!

There are several different modes to spice the game up as well, such as tournament mode, where you fish for Gil; pleasure mode, where you fish for fun; and hunting mode, where you catch specific fish to gain prizes and Gil. Then there is the story mode.

Story Mode

Monster of the Deep has a story mode, sort of. You are a hunter, clearing out Daemons while enjoying some fishing. Something something storm, something something amnesia; it's pretty forgettable. The story mode does give you the chance to catch the big fish, though: by catching smaller fish you increase your Daemon meter, allowing you to battle the boss fish.

By using your crossbow to shoot the Daemon as it tries to attack, you can eventually knock its health down to the point where you can catch it like an ordinary fish. It adds a really nice, exciting touch and is the most fun part of the game.

At least it would be if it wasn't for the appalling graphical fidelity.

The worst graphics I have seen on any VR game, and I include games I've played on Google Cardboard.

The biggest issue with Monster of the Deep is the graphical fidelity. It's terrible, like really terrible. I don't say this lightly, but these are, by a long way, the worst graphics I have seen on any VR game, and I include games I've played on Google Cardboard. You can see from the picture above that it is pixelated, fuzzy, and just plain bad.

At first I thought it was an issue with my hardware, so I tried several other games, including Final Fantasy XV in theater mode, and Monster of the Deep is the only one that looks this bad. It's like playing Final Fantasy VIII one inch from your face, which makes it super disorienting. I'm sure this can be fixed by Square; it doesn't look like an insurmountable issue, but they need to fix it quickly or people will start to notice, and complain.

Conclusion

Monster of the Deep is a fun, relaxing fishing game for VR, with a crippling flaw. I enjoy the fishing aspect of it a lot: just hanging out, throwing a line in, and trying to reel in a fish, then battling the Daemon fish to the death really finishes it off. The graphical issues, though, are bad enough to stop me from recommending the game to anyone. When you are in VR, the graphics have to be above par to stop you from feeling sick, and Monster of the Deep falls short on that. The writing is illegible, the NPCs seem half-formed, and the textures feel flat. All of this makes it a game to avoid, at least until they update it.

Have you played Monster of the Deep, or have any questions about it? Let me know in the comments.

See at PlayStation Store



from VRHeads - Best VR Guide for Virtual Reality Headsets, Top Picks and How To http://ift.tt/2zCFagB
via IFTTT

Exclusive: How NVIDIA Research is Reinventing the Display Pipeline for the Future of VR, Part 2

http://ift.tt/2AugieI

In Part 1 of this article we explored the current state of CGI, game, and contemporary VR systems. Here in Part 2 we look at the limits of human visual perception and show several of the methods we’re exploring to drive performance closer to them in VR systems of the future.

Guest Article by Dr. Morgan McGuire

Dr. Morgan McGuire is a scientist on the new experiences in AR and VR research team at NVIDIA. He’s contributed to the Skylanders, Call of Duty, Marvel Ultimate Alliance, and Titan Quest game series published by Activision and THQ. Morgan is the coauthor of The Graphics Codex and Computer Graphics: Principles & Practice. He holds faculty positions at the University of Waterloo and Williams College.

Note: Part 1 of this article provides important context for this discussion; consider reading it before proceeding.

Reinventing the Pipeline for the Future of VR

We derive our future VR specifications from the limits of human perception. There are different ways to measure these, but to make the perfect display you'd need roughly the equivalent of 200 HDTVs updating at 240 Hz. This equates to about 100,000 megapixels per second of graphics throughput.

Recall that modern VR is around 450 Mpix/sec today. This means we need a 200x increase in performance for future VR. But with factors like high dynamic range, variable focus, and current film standards for visual quality and lighting in play, the more realistic need is a 10,000x improvement… and we want this with only 1ms of latency.
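
For a rough sense of where those figures come from, here is a back-of-envelope sketch (it assumes 1920×1080 pixels per "HDTV"; the ~450 Mpix/sec number for current headsets is the one quoted from Part 1):

```python
# Back-of-envelope throughput check (assumes a 1920x1080 panel per "HDTV").
HDTV_PIXELS = 1920 * 1080        # ~2.07 Mpix per display
DISPLAYS = 200
REFRESH_HZ = 240
CURRENT_VR_MPIX_PER_S = 450      # approximate figure for today's headsets (from Part 1)

future_mpix_per_s = DISPLAYS * HDTV_PIXELS * REFRESH_HZ / 1e6
print(f"perceptual-limit target: {future_mpix_per_s:,.0f} Mpix/s")              # ~99,533, i.e. ~100,000
print(f"raw gap vs. today: {future_mpix_per_s / CURRENT_VR_MPIX_PER_S:.0f}x")   # ~221x, i.e. ~200x
```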

We could theoretically accomplish this by committing increasingly greater computing power, but brute force simply isn’t efficient or economical. Brute force won’t get us to pervasive use of VR. So, what techniques can we use to get there?

Rendering Algorithms

Foveated Rendering
Our first approach to performance, the foveated rendering technique—which reduces the quality of images in a user’s peripheral vision—takes advantage of an aspect of human perception to generate an increase in performance without a perceptible loss in quality.

Because the eye itself only has high resolution right where you’re looking, in the fovea centralis region, a VR system can undetectably drop the resolution of peripheral pixels for a performance boost. It can’t just render at low resolution, though. The above images are wide field of view pictures shrunk down for display here in 2D. If you looked at the clock in VR, then the bulletin board on the left would be in the periphery. Just dropping resolution as in the top image produces blocky graphics and a change in visual contrast. This is detectable as motion or blurring in the corner of your eye. Our goal is to compute the exact enhancement needed to produce a low-resolution image whose blurring matches human perception and appears perfect in peripheral vision (Patney et al. and Sun et al.).
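
As a toy illustration of the underlying idea (not the contrast-preserving filter from Patney et al., which this sketch deliberately omits), a foveated renderer can choose a coarser shading rate as a pixel's angular distance from the gaze point grows; the breakpoints below are invented for illustration only:

```python
def shading_rate(eccentricity_deg: float) -> int:
    """Return the width of the pixel block covered by one shading sample.

    Illustrative only: full resolution near the gaze point, progressively
    coarser shading further into the periphery. The angular breakpoints
    here are made up, not taken from the published technique.
    """
    if eccentricity_deg < 5.0:     # roughly foveal region
        return 1                   # shade every pixel
    if eccentricity_deg < 20.0:    # near periphery
        return 2                   # one shading sample per 2x2 block
    return 4                       # far periphery: one sample per 4x4 block

# A pixel 30 degrees away from where the user is looking gets quarter-rate shading:
print(shading_rate(30.0))  # -> 4
```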

Light Fields
To speed up realistic graphics for VR, we’re looking at rendering primitives beyond just today’s triangle meshes. In this collaboration with McGill and Stanford we’re using light fields to accelerate the lighting computations. Unlike today’s 2D light maps that paint lighting onto surfaces, these are a 4D data structure that stores the lighting in space at all possible directions and angles.

They produce realistic reflections and shading on all surfaces in the scene and even dynamic characters. This is the next step of unifying the quality of ray tracing with the performance of environment probes and light maps.

Real-time Ray Tracing
What about true run-time ray tracing? The NVIDIA Volta GPU is the fastest ray tracing processor in the world, and its NVIDIA Pascal GPU siblings are the fastest consumer ones. At about 1 billion rays/second, Pascal is just about fast enough to replace the primary rasterizer or shadow maps for modern VR. If we unlock the pipeline with the kinds of changes I’ve just described, what can ray tracing do for future VR?

The answer is: ray tracing can do a lot for VR. When you’re tracing rays, you don’t need shadow maps at all, thereby eliminating a latency barrier. Ray tracing can also natively render red, green, and blue separately, and directly render barrel-distorted images for the lens. So, it avoids the need for the lens warp processing and the subsequent latency.

In fact, when ray tracing, you can completely eliminate the latency of rendering discrete frames of pixels so that there is no ‘frame rate’ in the classic sense. We can send each pixel directly to the display as soon as it is produced on the GPU. This is called ‘beam racing’ and eliminates the display synchronization. At that point, there are zero high-latency barriers within the graphics system.

Because there’s no flat projection plane as in rasterization, ray tracing also solves the field of view problem. Rasterization depends on preserving straight lines (such as the edges of triangles) from 3D to 2D. But the wide field of view needed for VR requires a fisheye projection from 3D to 2D that curves triangles around the display. Rasterizers break the image up into multiple planes to approximate this. With ray tracing, you can directly render even a full 360 degree field of view to a spherical screen if you want. Ray tracing also natively supports mixed primitives: triangles, light fields, points, voxels, and even text, allowing for greater flexibility when it comes to content optimization. We’re investigating ways to make all of those faster than traditional rendering for VR.
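
To make the "no flat projection plane" point concrete, here is a minimal sketch of how a ray generator could map each pixel of a 360-degree (equirectangular) output directly to a ray direction; the specific mapping is an assumption for illustration, not NVIDIA's implementation:

```python
import math

def equirect_ray_direction(px: int, py: int, width: int, height: int):
    """Map an output pixel to a unit ray direction covering the full sphere.

    A rasterizer needs a flat projection plane (or several of them), but a
    ray tracer simply aims one ray per pixel, so a 360-degree field of view
    falls straight out of the pixel-to-direction mapping.
    """
    yaw = (px + 0.5) / width * 2.0 * math.pi - math.pi        # -pi .. +pi
    pitch = math.pi / 2.0 - (py + 0.5) / height * math.pi     # +pi/2 .. -pi/2
    return (math.cos(pitch) * math.sin(yaw),   # x
            math.sin(pitch),                   # y (up)
            math.cos(pitch) * math.cos(yaw))   # z (forward)

# The centre pixel of a 1920x1080 output looks (almost exactly) straight ahead:
print(equirect_ray_direction(960, 540, 1920, 1080))
```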

In addition to all of the ways that ray tracing can accelerate VR rendering latency and throughput, a huge feature of ray tracing is what it can do for image quality. Recall from the beginning of this article that the image quality of film rendering is due to an algorithm called path tracing, which is an extension of ray tracing. If we switch to a ray-based renderer, we unlock a new level of image quality for VR.

Real-time Path Tracing
Although we can now ray trace in real time, there’s a big challenge for real-time path tracing. Path tracing is about 10,000x more computationally intensive than ray tracing. That’s why movies take minutes per frame to generate instead of milliseconds.

Under path tracing, the system first traces a ray from the camera to find the visible surface. It then casts another ray to the sun to see if that surface is in shadow. But there’s more illumination in a scene than directly from the sun. Some light is indirect, having bounced off the ground or another surface. So, the path tracer then recursively casts another ray at random to sample the indirect lighting. That point also requires a shadow ray cast, and its own random indirect light… the process continues until it has traced about 10 rays for each single path.
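
In code, that recursion looks roughly like the sketch below; the `scene` object and its intersection and sampling helpers are hypothetical stand-ins for what a real renderer would provide:

```python
def trace_path(ray, scene, depth=0, max_depth=5):
    """One sample of the recursion described above (illustrative only).

    `scene` is a hypothetical interface: intersect(), sun_visible(),
    sample_bounce() and sky_radiance() stand in for a real renderer's
    intersection and sampling routines.
    """
    hit = scene.intersect(ray)                      # 1 ray: find the visible surface
    if hit is None or depth == max_depth:
        return scene.sky_radiance(ray)

    # 1 ray: shadow ray toward the sun for direct lighting
    direct = hit.albedo * scene.sun_radiance if scene.sun_visible(hit.point) else 0.0

    # Recurse with a single random bounce to sample indirect lighting
    bounce_ray = scene.sample_bounce(hit)
    indirect = hit.albedo * trace_path(bounce_ray, scene, depth + 1, max_depth)

    return direct + indirect

# Each recursion level costs ~2 rays, so a ~5-bounce path costs ~10 rays; thousands
# of such paths per pixel is where the ~10,000x-over-ray-tracing figure comes from.
```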

But if there’s only one or two paths at a pixel, the image is very noisy because of the random sampling process. It looks like this:

Film graphics solves this problem by tracing thousands of paths at each pixel. All of those paths at ten rays each are why path tracing is a net 10,000x more expensive than ray tracing alone.

To unlock path tracing image quality for VR, we need a way to sample only a few paths per pixel and still avoid the noise from random sampling. We think we can get there soon thanks to innovations like foveated rendering, which makes it possible to only pay for expensive paths in the center of the image, and denoising, which turns the grainy images directly into clear ones without tracing more rays.

We released three research papers this year towards solving the denoising problem. These are the result of collaborations with McGill University, the University of Montreal, Dartmouth College, Williams College, Stanford University, and the Karlsruhe Institute of Technology. These methods can turn a noisy, real-time path traced image like this:

Into a clean image like this:

Using only milliseconds of computation and no additional rays. Two of the methods use the image processing power of the GPU to achieve this. One uses the new AI processing power of NVIDIA GPUs. We trained a neural network for days on denoising, and it can now denoise images on its own in tens of milliseconds. We’re increasing the sophistication of that technique and training it more to bring the cost down. This is an exciting approach because it is one of several new methods we’ve discovered recently for using artificial intelligence in unexpected ways to enhance both the quality of computer graphics and the authoring process for creating new, animated 3D content to populate virtual worlds.

Computational Displays

The displays in today’s VR headsets are relatively simple output devices. The display itself does hardly any processing; it simply shows the data that is handed to it. And while that’s fine for things like TVs, monitors, and smartphones, there’s huge potential for improving the VR experience by making displays ‘smarter’ about not only what is being displayed but also the state of the observer. We’re exploring several methods of on-headset and even in-display processing to push the limits of VR.

Solving Vergence-Accommodation Disconnect
The first challenge for a VR display is the focus problem, which is technically called the ‘vergence-accommodation disconnect’. All of today’s VR and AR devices force you to focus about 1.5m away. That has two drawbacks:

  1. When you’re looking at a very distant or close up object in stereo VR, the point where your two eyes converge doesn’t match the point where they are focused (‘accommodated’). That disconnect creates discomfort and is one of the common complaints with modern VR.
  2. If you’re using augmented reality, then you are looking at points in the real world at real depths. The virtual imagery needs to match where you’re focusing or it will be too blurry to use. For example, you can’t read augmented map directions at 1.5m while you’re looking 20m into the distance while driving.

We created a prototype computational light field display that allows you to focus at any depth by presenting light from multiple angles. This display represents an important break with the past because computation is occurring directly in the display. We’re not sending mere images: we’re sending complex data that the display converts into the right form for your eye. Those tiny grids of images that look a bit like a bug’s view of the world have to be specially rendered for the display, which incorporates custom optics—a microlens array—to present them in the right way so that they look like the natural world.

That first light field display was from 2013. Next week, at the ACM SIGGRAPH Asia 2018 conference, we’re presenting a new holographic display that uses lasers and intensive computation to create light fields out of interfering wavefronts of light. It is harder to visualize the workings here, but relies on the same underlying principles and can produce even better imagery.

We strongly believe that this kind of in-display computation is a key technology for the future. But light fields aren’t the only approach that we’ve taken for using computation to solve the focus problem. We’ve also created two forms of variable-focus, or ‘varifocal’ optics.

This display prototype projects the image using a laser onto a diffusing hologram. You look straight through the hologram and see its image as if it was in the distance when it reflects off a curved piece of glass:

We control the distance at which the image appears by moving either the hologram or the sunglass reflectors with tiny motors. We match the virtual object distance to the distance that you’re looking in the real world, so you can always focus perfectly naturally.

This approach requires two pieces of computation in the display: one tracks the user’s eye and the other computes the correct optics in order to render a dynamically pre-distorted image. As with most of our prototypes, the research version is much larger than what would become an eventual product. We use large components to facilitate research construction. These displays would look more like sunglasses when actually refined for real use.

Here’s another varifocal prototype, this one created in collaboration with researchers at the University of North Carolina, the Max Planck Institute, and Saarland University. This is a flexible lens membrane. We use computer-controlled pneumatics to bend the lens as you change your focus so that it is always correct.

Hybrid Cloud Rendering
We have a variety of new approaches for solving the VR latency challenge. One of them, in collaboration with Williams College, leverages the full spread of GPU technology. To reduce the delay in rendering, we want to move the GPU as close as possible to the display. Using a Tegra mobile GPU, we can even put the GPU right on your body. But a mobile GPU has less processing power than a desktop GPU, and we want better graphics for VR than today’s games… so we team the Tegra with a discrete GeForce GPU across a wireless connection, or even better, to a Tesla GPU in the cloud.

This allows a powerful GPU to compute the lighting information, which it then sends to the Tegra on your body to render final images. You get the benefit of reduced latency and power requirements while actually increasing image quality.

Reducing the Latency Baseline
Of course, you can’t push latency below the display’s frame interval. If the display updates at 90 FPS, then it is impossible to have latency less than 11 ms in the worst case, because that’s how long the display waits between frames. So, how fast can we make the display?

We collaborated with scientists at the University of North Carolina to build a display that runs at sixteen thousand binary frames per second. Here’s a graph from a digital oscilloscope showing how well this works for the crucial case of a head turning. When you turn your head, latency in the screen update causes motion sickness.

In the graph, time is on the horizontal axis. When the top, green line jumps, that is the time at which the person wearing the display turned their head. The yellow line is when the display updated. It jumps up to show the new image only 0.08 ms later… that’s about 500 times better than the 20 ms you experience in the worst case on a commercial VR system today.
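
A quick calculation shows why the refresh rate sets that latency floor, and why a 16,000 Hz binary display changes the picture (the loop below only considers the wait between refreshes, nothing else in the pipeline):

```python
# Worst-case wait between refreshes at different display update rates.
for hz in (90, 120, 1_000, 16_000):
    print(f"{hz:>6} Hz -> {1000.0 / hz:6.3f} ms between frames")
# 90 Hz     -> 11.111 ms  (the ~11 ms floor mentioned above)
# 16,000 Hz ->  0.063 ms  (consistent with the ~0.08 ms update measured on the scope)
```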

The renderer can’t run at 16,000 fps, so this kind of display works by Time Warping the most recent image to match the current head position. We speed that Time Warp process up by running it directly on the head-mounted display. Here’s an image of our custom on-head processor prototype for this:

Unlike regular Time Warp which distorts the 2D image or the more advanced Space Warp that uses 2D images with depth, our method works on a full 3D data set as well. The picture on the far right shows a case where we’ve warped a full 3D scene in real-time. In this system, the display itself can keep updating while you walk around the scene, even when temporarily disconnected from the renderer. This allows us to run the renderer at a low rate to save power or increase image quality, and to produce low-latency graphics even when wirelessly tethered across a slow network.

The Complete System

As a reminder, in Part 1 of this article we identified the rendering pipeline employed by today’s VR headsets:

Putting together all of the techniques just described, we can sketch out not just individual innovations but a completely new vision for building a VR system. This vision removes almost all of the synchronization barriers. It spreads computation out into the cloud and right onto the head-mounted display. Latency is reduced by 50-100x and images have cinematic quality. There’s a 100x perceived increase in resolution, but you only pay for pixels where you’re looking. You can focus naturally, at multiple depths.

We’re blasting binary images out of the display so fast that they are indistinguishable from reality. The system has proper focus accommodation, a wide field of view, low weight, and low latency…making it comfortable and fashionable enough to use all day.

By breaking ground in the areas of computational displays, varifocal optics, foveated rendering, denoising, light fields, binary frames and others, NVIDIA Research is innovating for a new system for virtual experiences. As systems become more comfortable, affordable and powerful, this will become the new interface to computing for everyone.

All of the methods that I’ve described can be found in deep technical detail on our website.

I encourage everyone to experience the great, early-adopter modern VR systems available today. I also encourage you to join us in looking to the bold future of pervasive AR/VR/MR for everyone, and recognize that revolutionary change is coming through this technology.

The post Exclusive: How NVIDIA Research is Reinventing the Display Pipeline for the Future of VR, Part 2 appeared first on Road to VR.



from Road to VR http://ift.tt/2j2SAfB
via IFTTT

Echo Arena’s Latest Update Is Subtle, But It Changes Everything

http://ift.tt/2qYd91M

Echo Arena’s Latest Update Is Subtle, But It Changes Everything

Echo Arena released earlier this year and we absolutely loved it at the time. The zero-G movement system is fluid and liberating, with some of the best competitive multiplayer action of any VR game. There haven’t been a whole lot of game-changing updates since it launched with just a single game mode and map, but this week a new update came out that’s made one big alteration to the flow of gameplay.

As a recap, the goal of the game is simple. You’re on a team of astronauts in a zero-G arena with wrist jets to help propel you around. You can push off of people and the environment to gain momentum as you vie for a floating disc that must be thrown into the other team’s goal. It’s a bit like Quidditch meets soccer without gravity. That’s really about it, but it’s so, so much fun.

Previously, after each point, the disc would respawn at the center of the arena for each team to launch forward and try to nab it in what was known as “jousting” for the disc. But with this latest update that’s all changed. After a team scores, the disc now spawns on the other team’s side of the arena, giving them the first possession right out of the gate. It sounds like a minor change, but it has dramatic results. You can see some different launch scenarios in the video above.

When playing Echo Arena, a large portion of the game is simply playing keep-away from the other team. Rarely do you hold the disc for longer than a few seconds before passing it or taking a shot on the goal because of how fast and frenetic the matches can get. Someone else can easily climb onto you and steal the disc or stun you with a punch to the face. As a result, simply chucking it down to the other end of the arena or bouncing it off of walls to create an opening for your teammates are all viable strategies.

What this change does is create a whole new paradigm for the start of the game. Instead of both teams dashing towards the middle to grab the disc after every point (that still happens at the start of each match), the receiving team is given a major advantage. You can strategize to all fly together in a tight formation to try and block opponents and stun them as you zip by. Or you can spread out and have one person grab the disc while the others go deep for a long pass.

On the defending team, you now have to decide whether to play defensively and wait for your opponents to approach with the disc, or to attack them head-on. Before, you wouldn’t have much of a strategy until after the disc was in play, but now you’ve got to decide your move before you even leave the launch tube. You can see how it impacts Overtime as well down in the video below.

The only real downside to this change is that it seems to have somehow introduced a bizarre bug (as of last night) that sometimes allows people to spawn into the arena before the match starts, in which case they can easily score before anyone else has even launched. That should get cleared up via a patch soon enough, if it hasn’t already. The update also introduced spectating and some other minor tweaks. Echo Combat, a first-person shooter expansion of the game, is coming next year too.

Are you still playing Echo Arena? It’s free to download on Oculus Home and you can read our full, scored review right here. Let us know what you think of it down in the comments below!




from UploadVR http://ift.tt/2i3WeFa
via IFTTT

Google Launches Social Awareness VR Initiative ‘Daydream Impact’

http://ift.tt/2i2TTdJ

Google Daydream will provide organizations with VR training and equipment rentals.

When it comes to social change, awareness is the first step on a long journey to impacting the causes and problems that we care about. Although there are a number of tools and strategies that can help drive awareness, few are more effective than VR at shining a light on important issues.

Whether it’s the onslaught of VR-for-awareness projects we saw take over Tribeca Film Festival this year or the experiences focused on women’s issues that will bring you to tears, VR is an extremely powerful medium that can drive true empathy and real change.

But the challenge is that these organizations and change-makers often lack the resources or knowledge on how to best use VR. That’s why Google is launching a new program dubbed Daydream Impact, with the hope of helping organizations and creators utilize virtual reality to take their programs to the next level.

Daydream Impact focuses on three common bottlenecks with VR creation today: a lack of training on how to create VR video, difficulties accessing camera equipment and tools to showcase their content, and little exposure to how VR has been used creatively to tackle big challenges.

So to start, Google is providing training through a VR filmmaking course on Coursera, which anyone can take. The course begins by outlining basic hardware requirements and pre-production checklists, and it shares tips for getting the best VR footage including best practices from other creators. The training also covers all the post-production work required to create the video and concludes with guidance on how to publish and promote the video.

Second, Google is launching a loaner program to give qualified projects access to equipment to capture and showcase VR pieces—this means a Jump Camera, an Expeditions kit, Google Daydream View and a Daydream-ready phone. You can apply for the program and successful applicants will have six months to capture and refine their work to showcase.

The program has already been piloted with a few organizations, including the Eastern Congo Initiative, which created a VR film on the struggles of the Congo and the resilience of its people; The Rising Seas project, which uses VR and simulations to experience changes in our coastline environments; Harmony Labs, which created three anti-bullying pieces to pilot in schools; and Springbok Cares, which is working to integrate VR into hospital environments to entertain patients and minimize anxiety during cancer treatment.

The World Wildlife Fund & Condition One, UNAIDS, the International Committee of the Red Cross, Starlight Children’s Foundation, Protect our Winters, and Novo Media will be sharing their upcoming projects and case studies in 2018.

The post Google Launches Social Awareness VR Initiative ‘Daydream Impact’ appeared first on VRScout.



from VRScout http://ift.tt/2zQtdIl
via IFTTT