Sunday, November 22, 2015

Thoughts on VR-gaming in 2016

A List recently had a good article about Nvidia and Virtual Reality. [ link ] 

It was quite interesting, and it feels like 2016 is going to be a critical year for VR gaming. A lot of releases are scheduled for Q1 alone.

Content might be the big winner. Let me elaborate:

Stereoscopy has been around for quite a while.

We are going to have plenty of flavors of VR devices doing essentially the same thing: displaying 3D images two inches in front of our eyes, much like the stereoscopy devices that came before them. Each will use a different hardware scheme to render the 3D images for content.

- Some will use high-end PCs that meet minimum specs for compatibility. The VR goggles will connect to the PC's video card, and cables will be required to carry the HD video stream to the goggles.

- Some will use goggles that are just shells surrounding smartphones. These will be more ubiquitous but will deliver a lower-fidelity experience. Even so, modern smartphones have the necessary hardware to provide a compelling VR experience.

- Some will embed the hardware within the goggles themselves, at a higher-priced entry point. These self-contained units will provide a specialized experience and won't need cables or other equipment.

Most will converge around a common controller scheme. The input method will resemble the interface between a console and a controller, basically consisting of trigger buttons and movement controls.

They will also converge around common game-development tools like Unreal and Unity. Existing studios are already geared up to make VR games, so look for a lot of familiar studios making VR content.

The VR hardware battle is shaping up to be a tight race, much like Blu-ray versus HD DVD. The big players are going to spend quite a bit of money attempting to become the standard for VR gaming.

I predict Oculus is going to take the early lead, winning with the premium experience. Adopters will be those willing to spend over $1,000 for a PC capable of driving the Oculus, or those who already have a gaming PC. A small number of people with high-end Samsung phones will also adopt. I'm not sure this lead will be a win for Oculus, though; they are going to spend an enormous amount of Facebook money for it.

Microsoft, Carl Zeiss, Google Cardboard, Nvidia, Sony, and others will need to focus on bringing a premium experience to the rest of the market: the 472 million iPhones that have been sold, plus the world of computers, Macs included, that don't meet Oculus specs. However, these players are behind in the battle over how you control the game, and they are also still struggling with just the visual experience.

Samsung sells more phones globally than Apple, but how many of those international customers are going to rush out and buy a $200+ accessory to play VR games developed for a Western audience? I predict lower sales for Samsung VR units than the market might expect. 

A break-out hardware leader might bring a premium VR experience to existing consoles (Xbox and PS4).

Sony is laying down that gauntlet with their PS4 'Morpheus' project. Watching the promo video just reinforced that these VR devices are essentially a phone screen, two plastic lenses, and a gyro.

Sony's Morpheus is just 1080p resolution with only a 100-degree field of view. Smartphones already push that many pixels, with a larger viewing area...
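As a quick sanity check on that claim, here is some back-of-the-envelope arithmetic, assuming the commonly cited panel resolutions (1080p for Morpheus, a QHD panel for a flagship phone of that era) rather than any official spec sheet:

```python
# Pixel counts, assuming commonly cited resolutions (not official specs).
morpheus_px = 1920 * 1080   # 1080p panel: 2,073,600 pixels
qhd_phone_px = 2560 * 1440  # QHD flagship phone: 3,686,400 pixels

# A QHD phone pushes roughly 1.8x the pixels of the Morpheus panel.
print(round(qhd_phone_px / morpheus_px, 2))
```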

With a name like Morpheus, I expected quite a bit more. Also, just imagine the umbilical cord running from your console to your couch. Unless someone has a wireless method to carry 120-FPS 4K video from console to headset, the experience is broken for the mass market.
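To put a rough number on why that wireless link is so hard, here is a back-of-the-envelope calculation of the uncompressed bandwidth it would need (24-bit color assumed; a real link would compress, but the order of magnitude is the point):

```python
# Uncompressed bandwidth for 4K at 120 FPS with 24-bit (3-byte) color.
width, height, fps, bytes_per_px = 3840, 2160, 120, 3
gbit_per_s = width * height * fps * bytes_per_px * 8 / 1e9

# ~23.9 Gbit/s -- far beyond 2015-era consumer wireless.
print(round(gbit_per_s, 1))
```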

While the hardware battle ensues, studios with great game ideas and experiences for VR have an amazing opportunity to capitalize.

I feel the content providers have a great chance to win in the VR game. 

Studios that develop engaging games that work across this hardware have a great chance at solid revenue and a market foothold. Whichever flavor of VR you adopt, you will have quality games to play and share with your friends.

The content race won't be about exclusive titles. Exclusives will be funded by the big VR players, but they are really just designed to sell hardware.

Successful content also won't be about pushing the most polygons and pixels to your VR device. Content should bring unique experiences that unleash the immersive power of VR. I'm much more excited to see games like Monument Valley in VR, versus Grand Theft Auto in VR. Think more like Myst, and less like Call-Of-Duty ... VR edition. 

I want the market to bring break-out games to VR; I'm excited to see game-changers. They will be the big winners, while the VR hardware makers battle tight margins and lower-than-expected adoption.

Most importantly, bring content across all the flavors of VR hardware.

What about the other side of VR? Watching movies through your 3D device? Netflix for VR? I'm not convinced the mass market wants to watch movies that way. Do parents really want their children to jack in and sit mindlessly on the couch? Is human interaction while watching a movie that important?

Is your significant-other going to be supportive of your nightly 2+ hours of jacking-in time? 

Do you want to bring your VR device into the bedroom? Jacked-in while your partner sleeps?

If watching movies in VR becomes a popular pastime, I'll vote for the device to come with a 'popcorn mode' so I can see my snacks.


Saturday, November 14, 2015

Human-robot interaction: Tesla making a right turn




Sooner than we might think, autonomous cars will be zipping around our cities. The drivers won't be drivers; they will be passengers, paying little or no attention to traffic. I can even imagine cars with no driver's seat at all, with everyone sitting around a table and socializing.

But what about the interaction between drivers, pedestrians, and cyclists? This interaction depends on all three being aware of each other; if they aren't, dangerous outcomes can result.

I'm meticulous about making eye contact with a driver as my 'interface' for crossing a street, especially when the car is making a right turn on red. I can't trust that the driver will glance in my direction before accelerating through the turn.

Safety dictates that you should make eye contact with a driver before walking in front of their car. How do you do that with an autonomous car?

Granted, even current autonomous cars won't run you over; they have sensors to prevent that. But if you walk in front of one with just the right timing, there will be a jolting reaction from the car: after accelerating into the turn, it will suddenly stop. That is alarming to the pedestrian and jolting to the passengers, an experience that would be better avoided.

Also, imagine you were standing behind a street pole in a way that the car's sensors didn't detect you. You then cross the street, thinking the car senses you. In this case too, the car won't run you over, but the reaction is still jolting.

Being able to make right turns at a red light helps ease traffic and is important for city congestion. But these turns require a delicate safety check by the driver. I have a simple idea to make them safer and less cumbersome for autonomous cars.



How about a simple light that tells the pedestrian what the car is sensing?
Red light = I'm an autonomous car but I don't see you. Therefore, I think it is okay to make a turn!

Wouldn't it be cooler if we put a bright light on the autonomous car that indicated the car's intent? The pedestrian would look at this light much as they would look toward the eyes of a driver. Position the light near where that human-to-human contact would have occurred.

If the light is green, it tells the pedestrian that the car sees them and they can walk. If the light is bright red, the pedestrian knows to use caution: the car doesn't see them. This takes away the guesswork and clearly shows what the AI is thinking.

But what about blind people? They won't see this light. Maybe they could carry a safety sensor that listens for a high-pitched sound, informing them whether they can proceed: the autonomous car emits the sound as an indication of its intent at that moment.
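As a sketch, the signaling logic could be as simple as the following. The names, colors, and tone frequency here are my own assumptions for illustration, not any real car's behavior:

```python
# Hypothetical intent-light logic for an autonomous car at a right turn.
def intent_signal(pedestrian_detected: bool, turning: bool) -> dict:
    """Return the light color and audio cue the car would emit."""
    if pedestrian_detected:
        # Green: "I see you; it is safe to cross." No warning tone.
        return {"light": "green", "tone_hz": 0}
    if turning:
        # Red: "I don't see anyone and I intend to turn" -- the
        # high-pitched tone doubles as the cue for blind pedestrians.
        return {"light": "red", "tone_hz": 15000}
    return {"light": "off", "tone_hz": 0}

print(intent_signal(pedestrian_detected=False, turning=True)["light"])  # red
```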



Friday, November 13, 2015

App UI: iPad Pro and Anki OverDrive

For this post, I would like to share my thoughts on designing for 'edge cases'. I'm suggesting a fix for something that might be considered an edge case. This is a product for younger children, in particular young boys.

How many children are going to have an expensive iPad Pro, iPad Air 2, Samsung Galaxy View, or another large device?

For edge cases like this, I still like to produce design solutions that solve for everyone.

However, these suggestions would need to be weighed against larger issues that might offer more value if addressed. For example, the battery life of the cars might be a much bigger issue: during games, your car's battery will die mid-battle, which is quite disruptive.





UX suggestions for larger screens (iPad Pro, Samsung Galaxy View, etc.)

A lot of games are going to have to consider alternative UI schemes for the new larger devices. Games and RC-control UIs generally utilize the entire screen, including products like the Parrot drones, Sphero's BB-8, and even those from a company I used to work for, Anki. I’m going to suggest solutions that would improve the experience for my former company's products, Anki OverDrive and Anki Drive.

OverDrive is amazing fun, I highly recommend giving it a try :) 


The iPad Pro weighs over a pound and has a large screen. Users will rest the weight of the device on their thighs, and they will sit down more. This is much different from playing with a lightweight phone.

In Anki OverDrive, lane switching is done by rotating the device, using its gyros. On a device this heavy, that will be fatiguing and should be rethought, or at least made an option you can enable or disable.

You also tap the buttons hundreds of times per game, and the distance between them will fatigue your hands.



Wouldn’t it be cooler to let the player tap their controls to wherever they want them, with less thumb travel required?

Where the player taps is where the UI mounts, and the UI can be moved by tapping in a new location. It is dynamic, and you control the placement.

I see much less hand fatigue and a much better gameplay experience. Can you imagine the thumb fatigue with the old UI?
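A minimal sketch of that tap-to-mount behavior, with a hypothetical cluster size and clamping so the controls never hang off screen:

```python
# Hypothetical tap-anchored controls: the cluster centers on the tap point,
# clamped so it stays fully within the screen bounds.
CLUSTER_W, CLUSTER_H = 240, 160  # assumed control-cluster size in points

def mount_controls(tap_x, tap_y, screen_w, screen_h):
    """Return the top-left corner where the control cluster mounts."""
    x = min(max(tap_x - CLUSTER_W / 2, 0), screen_w - CLUSTER_W)
    y = min(max(tap_y - CLUSTER_H / 2, 0), screen_h - CLUSTER_H)
    return x, y

# A tap near the corner still keeps the whole cluster visible:
print(mount_controls(10, 10, 1024, 768))  # (0, 0)
```

Tapping again in a new spot just re-runs the same placement, which is all the "dynamic" behavior requires.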


These devices are so large that younger folks might want to hold them in portrait mode, which provides a more controlled grip on the larger and heavier devices.

This orientation also allows for fewer accidental ‘home’ button presses. Accidentally pressing the ‘home’ button closes the app!


It also keeps your hands from covering the four speakers, for better sound presence. Covering the speakers mutes the sound, and sound feedback is a big part of this game.



About 10% of the world’s population is left-handed. Wouldn’t it also be cool to have a UI setting for a left-handed mode, allowing the player to pick which side the controls are on?