GDC 2014

Three Demos, One VR Defining Moment With Project Morpheus

by Mike Futter on Mar 20, 2014 at 05:31 PM

GDC is rapidly drawing to a close, and the Game Informer team is bidding adieu to San Francisco tomorrow. Before we could leave the halls of the Moscone Center though, we had to make sure to stop by Sony's Project Morpheus demo.

We had the opportunity to test the device alongside Sony engineer Anton Mikhailov, who was on stage at the Project Morpheus unveiling. The first thing to know about Project Morpheus is that it is extremely easy to put on and take off by yourself.

Grasping the device by the front optic block, I put it to my face and slid the supporting strap over my head. The rear of the unit has a locking mechanism with fine adjustment that is slid forward, tightening the unit on the head. The bulk of the weight was on the top of my cranium, and I experienced no physical discomfort wearing it.

The first demo, The Deep, is a standing experience. I stepped into the body of a diver, descending in a cage. There is a flare gun, but this is largely to demonstrate the tracking for a first-person experience. It supports PlayStation Move, but I was given a DualShock 4. Even so, I was impressed by how my hand movements translated into the virtual space.

I was able to turn around completely, tracking the movement of a hungry shark circling the cage. I was a bit startled as it rammed into the protective enclosure and ripped off part of it with its jaws. The Deep is a good starting point for those who have never tried virtual reality: it doesn't require much movement, but it lets users get used to looking around their environment with their heads instead of an analog stick.

The rear of the Morpheus kit includes tracking lights. The dial at the top is for fine adjustments, and the slide button in the center is used to release the rear fitting for headset removal.

The second demo was a new, single-player build of EVE Valkyrie. I played the game at Gamescom last August, but this was my first experience with head tracking. The difference is that you can lean in and around the cockpit to follow enemy fighters that buzz by. I greatly enjoyed the virtual reality dogfighting, but it wasn't drastically different from what I've seen before.

The most exciting and transformational moment I had was with the final demo. During the unveiling, Sony announced its partnership with NASA to incorporate data captured by the Mars Curiosity rover into a virtual reality environment. The experience begins in orbit around the red planet, rendered from satellite data.

Mikhailov brought me down to the surface, where Curiosity was waiting for me. The terrain was displayed in two drastically different ways, with the ground I was standing on taken directly from the rover's stereoscopic cameras. In the distance, where Curiosity hadn't yet been, lower-resolution satellite data was used.

As the rover approached me, Mikhailov accidentally moved it too far, and it collided with me. The sensation of the high-resolution, very real-looking automaton passing through me was unlike anything I have ever experienced. It was disconcerting, and it was exhilarating to realize that the sensation was only possible because I had such a strong sense of "presence." This was not entirely dissimilar from my experience playing the new Oculus Rift Couch Knights demo, when one of the creatures jumped on "my" leg.

A major contributor to my ability to give myself over to the illusion was sound. Sony's 360-degree audio works as advertised, even on the stock set of non-isolating earphones. Turning away from the rover as it moved its arm relocated the sound in real time; it was clearly coming from behind me. The same effect worked when I knelt on the ground and the machine moved above me.

The end of the Mars demo rockets the player from the ground into the air for a bird's-eye view. My stomach dropped a bit at first, and when I came to a stop in the air, I continued to feel as if I were floating.

Rear of Project Morpheus when worn.

Sony is on the right track with Project Morpheus, and president of Worldwide Studios Shuhei Yoshida tells us that manufacturing developer kits is the next step. Unlike Oculus and others, Sony won't be selling the kits. It will approach development much as it does with the PlayStation 4.

In addition to a need for content before a retail launch, there are challenges to surmount on the technology side. The developer kit is good enough for the community that will be building experiences, but it isn't ready for the public. There is still work to be done reducing persistence, which when too high causes a smearing or blurring of the image during head movement.

The field of view, which is currently 90 degrees when wearing glasses (made possible by the adjustable optic block) and a bit more when not, isn't quite as good as Oculus' 100-degree FOV. Those 10 degrees matter, and I couldn't help but notice the blackness at the periphery.

I found Project Morpheus to be extremely comfortable, rivaling the Oculus Rift in its wearability. I've not worn either for longer than a few minutes at a time, though. Long-term use is going to be a big topic of discussion as virtual reality headsets get closer to launch.

"Persistence" is a word you'll be hearing a lot. Games often require users to shift their focus rapidly. Blurring caused by sudden, drastic movements detracts from the immersion and, I suspect, contributes to eye strain, hurting long-term wearability. The lower the persistence, the less smearing you'll see when you move your head.

Finally, there has been some conversation about what hardware the demos were running on. EVE Valkyrie and Thief, which will be on the show floor on the final day of GDC, are running on PC. Mars, The Deep, and Castle (which was unplayable due to significant Wi-Fi interference) are all running on PS4 developer kits.

Project Morpheus is a solid first step for Sony, and if developers are interested in creating unique content made specifically for VR, there is a real chance of it catching on. It's going to take more than simply porting 2D content, and success hinges on developers exploring virtual worlds from the inside instead of on a screen hanging on the wall.