The VR sphere is fascinated by haptics and the notion that “feeling” is the ultimate immersive experience. Take the recently funded Kickstarter for the Holosuit, which promises a full-body haptic experience, as an example of just how fixated the VR world is on more haptics. As a result, the gaming industry and others now using VR applications seem to have blindly accepted the idea that more haptics is better. Some companies tout their haptics as if they were the most essential part of their user experience. But are they? New research is beginning to tell a different story. I think it’s time to take a step back from the haptics craze and survey where this feature truly creates a great experience. I can tell you that when it comes to VR applications in medicine, for example, more haptics is not always better.

Healthcare is a good industry for gauging the impact and usefulness of haptics, since haptics is widely studied in simulation and robotics. As an orthopaedic surgeon and CEO of a surgical simulation company using VR, I have had the opportunity to learn from haptic innovators and researchers and to work with cutting-edge technologies. What has surprised me over the past few years is the complexity of the topic, and how many of my initial assumptions about haptics turned out to be wrong. Certain forms of haptics are not as effective as one would think at improving skill transfer (the degree to which a skill learned in one context can be performed in another) and may not be worth the cost and user experience challenges.

When haptics backfire

Science Robotics published a study this year that caught the attention of the VR world because it challenged a lot of the thinking around haptics. It found there is a fine line to walk when adding touch to make a virtual world more immersive: More realistic touch actually “ruined” the sense of immersion. This is often the case with kinesthetic haptics, also described as “force feedback,” delivered by devices such as 3D Systems’ Touch 3D Stylus. With kinesthetic haptics, we are achieving near-full realism, but that’s not enough to fool our brains, creating an overall negative experience. Some experts think this phenomenon may be due to something called the “uncanny valley”: the psychological effect that leaves us feeling put off when we experience something that is nearly, but not completely, realistic. The Polar Express and Final Fantasy: The Spirits Within are great real-world examples of this phenomenon in action.

Grounded kinesthetic haptics are widely regarded as the gold standard for haptic feedback. They make us feel that we are touching or holding objects that are not actually there by applying force directly to the user through a device grounded to a table or floor. Though the high cost would imply high value, evidence has shown limited effectiveness in the world of simulation. A recent study split 20 medical students into haptic-trained and non-haptic-trained groups and ran them through a series of simulation tasks. At a cost of $30,000, one would hope the haptic-trained group would perform better. Unfortunately, haptics didn’t deliver; at the end of the study, there were no differences in performance between the two groups. Another study from last year showed that the haptic-trained group performed worse than a group that received no simulation training at all, concluding that “Poor mechanical performance of the simulated haptic feedback is believed to have resulted in a negative training effect.”

So if grounded kinesthetic haptics don’t improve learning, do they at least improve the overall user experience? Once again, the answer is no. One study of 20 surgeons, blinded to which instrument had kinesthetic haptics, found that they overwhelmingly preferred no haptics at all. Eighty-five percent of the surgeons in this study preferred, and felt they performed better with, a non-haptic simulator, and 70 percent stated the non-haptic simulator felt more realistic.

Is less actually more?

These results seem puzzling, as realistic force feedback should intuitively improve performance and overall experience. But grounded kinesthetic haptics in the world of simulation seem to add cost while decreasing overall usability. Still, things aren’t all bad for kinesthetic haptics, as there has been increasing interest in “ungrounded” kinesthetic devices, meaning devices that deliver sensations without being attached to a base and without restricting the user’s movements. An example of ungrounded kinesthetic haptics would be an exoskeleton, such as the groundbreaking gloves from HaptX, which physically displace your skin the same way a real object would when touched, closely replicating its texture, shape, and movement. Ungrounded kinesthetics seem to have an advantage over grounded, and there is exciting research underway in this area.

This leads us to the field of cutaneous haptics, which covers the sensations of vibration, light touch, and temperature. With recent developments in microelectromechanical systems (MEMS) and 3D printing, there have been huge advances in this field. A good example is the jump in quality between the rumble motor in the iPhone 6 and the Taptic Engine in the iPhone 7 (and later generations). Modern VR controllers, like the Oculus Touch, also include sophisticated MEMS-based cutaneous feedback systems. There is some research demonstrating the effectiveness of cutaneous haptics in surgical robotics: in one study, cutaneous feedback significantly improved the performance of surgical robot operators. What is surprising is that this lower-cost, seemingly more primitive form of haptics appears to be more effective.

What I find most interesting about our perception of touch is the way our brain steps in to fill the gaps. This phenomenon, called variously “sense substitution,” “digital synesthesia,” or even “phantom haptics,” has been an exciting area of study for haptic researchers. The da Vinci robot from Intuitive Surgical is probably the best real-world example of phantom haptics. The robot is used in one third of U.S. hospitals on hundreds of thousands of patients. Most first-time users are blown away by the intuitiveness of the system, and many comment on how realistic the haptics feel. Then they are consistently shocked, as I was, to learn that there is no haptic feedback on the da Vinci robot. Yet people swear they “felt” something, even after learning there is no feedback. So what’s happening here? Our brains are very good at filling in missing information, as optical illusions demonstrate.

New considerations for haptics

The momentum in haptics technology and research shows no signs of slowing. But that doesn’t mean we all need to buy expensive haptic hardware to improve our user experience. We’re gaining a better understanding of the value and impact of haptics that will allow for improved virtual reality and simulation experiences. The research so far supports two key takeaways for anyone in the space.

1. Focus more on desired outcomes than on how they are achieved. If having no haptics at all makes your experience more affordable, accessible, and easy to use, and you still achieve the same outcome, that seems like the optimal solution. If haptics does improve your desired outcome, make sure the associated costs are justified by a disproportionate gain in effect.

2. Consider aiming for “just enough” haptics. Shooting for 100 percent haptic realism might not be the best strategy because of uncanny valley issues. Aiming for less haptic realism may counterintuitively lead to a better outcome.

Exciting technologies down the road include skin-stretch controllers, like the one from Tactical Haptics, which convey the feeling of object weight and inertia quite well. I’m also excited about the exoskeleton glove/cutaneous hybrid from HaptX. As we continue to expand our understanding of the brain’s perception of touch, we will continue to develop and refine more effective and immersive VR applications.

Justin Barad, MD is cofounder and CEO of surgical training platform Osso VR. He is also an orthopaedic surgeon with a background in game development. He has written for medical technology site Medgadget for more than a decade, and has spoken at multiple conferences including TEDMED, CES, Exponential Medicine, and Health 2.0.