Immersion is one of the biggest names in gaming you’ve never heard of. When it comes to haptic feedback – the vibration you feel from controllers, mobile phones, and other devices – they’re the top of the heap.
Their work has found its way into Microsoft and Sony’s consoles, into mobile phones, and now, partnering with Nintendo for the first time, into the Nintendo Switch Joy-Con controllers. I (metaphorically) sat down with two guys from Immersion across two separate interviews to talk about the world of haptics and find out where it’s been, and where it’s going.
Why Do Haptics Matter?
Five or ten years ago, this was a much harder question to answer. Nowadays, things have changed. As you read this, you might even know the answer yourself.
“It used to be that we’d have to explain haptics to people,” said Matt Tullis, Director of Business Development for Games and VR at Immersion. “It’s got enough mindshare now that people know what haptics is.”
We see it influencing not just console games and mobile phones but virtual reality and other industries as well.
The idea behind haptics is right there in the name of the company: Immersion. Immersion in visual media has been on a trajectory that started way back with silent films and goes to what Chris Ullrich, Vice President of User Experience and Analytics, calls a “holodeck-like experience.”
There’s that apocryphal story of people bolting from the theater at the screening of the very first motion pictures, so startled were they by the movement. Even if it never happened quite that way, early audiences were astonished. Compared to photos and drawings, it was a whole new level of immersion. Every improvement to media since then has, in one way or another, pointed toward improving immersion, Ullrich explained. 4K and HDR video provide more detailed and colorful imagery, and technology like Dolby Atmos can immerse us in the sounds of whatever we’re watching.
Haptics play a role in immersion as well, helping transport not just our eyes and ears but some of our other senses to another place. Haptic feedback lets us touch and feel the intangible. The little motors that make our game controllers move in our hands are changing, and the feedback they’re delivering matters more than ever.
History of Haptics
Before we get in too deep, though, let’s go back in time and see where haptics, especially where games are concerned, got their start.
Haptics entered the game space in what Tullis called the “wheels and sticks” era, in the form of “Force Feedback” sticks and wheels for simulation games.
With the introduction of the Nintendo 64’s Rumble Pak and the original PlayStation DualShock controller, though, haptics entered the gaming mainstream.
“That was when we came out with our version of rumble,” Tullis said. “We went to market with that, and there was some litigation between ourselves and Microsoft and Sony back in [the early 2000s].” Microsoft would eventually settle, purchasing a 10 percent share in Immersion, while Sony would fight the suit, only to lose both the case and the appeal. Sony is now in an official agreement with Immersion.
Since then, things haven’t changed much. Nintendo had its Rumble Pak and the Sega Dreamcast even had something similar, but the familiar boomerang-shaped controllers established by Sony and later Microsoft have stuck around with only minor adjustments to each. Even the haptic motors inside have remained essentially the same.
The devices hidden in each corner of our controllers are called Eccentric Rotating Mass (ERM) motors. In short, they’re rotating electric motors with an off-center weight at the end.
From there, the next big leap for haptics was into mobile phones.
“In the last several years, we’ve been focused on what we can do with touch screens to… create better vibration as well as looking at things like pressure, pressure sensing, and those kinds of things,” Tullis explained.
Those big ERM motors weren’t going to fit into a mobile phone, and the need for something smaller brought about a new kind of motor: the Linear Resonant Actuator (LRA).
Where an ERM acts more like a traditional rotational motor, LRAs act more like a speaker. Inside the device, an electromagnet moves a weight pushed against a spring. Because they’re so much smaller, LRAs can produce a wider range of haptic sensations. They can start and stop much faster than those ERM motors, too.
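To see why the ERM design is so limited, it helps to look at the physics. An eccentric rotating mass generates force through its spin, which means vibration strength and vibration frequency come from the same knob: the motor’s speed. Here’s a back-of-the-envelope sketch of that coupling – the mass and radius values are purely illustrative, not the specs of any real controller motor:

```python
import math

def erm_force_newtons(mass_kg, radius_m, rpm):
    """Centripetal force generated by an eccentric rotating mass.

    For an ERM, vibration strength and frequency are coupled:
    both come from the same rotation speed, so a stronger rumble
    is necessarily also a higher-pitched one. (Values used below
    are illustrative, not specs of any real controller motor.)
    """
    omega = rpm * 2.0 * math.pi / 60.0  # rotation speed in rad/s
    return mass_kg * radius_m * omega ** 2

# Doubling the motor speed doubles the vibration frequency and
# quadruples the force -- you can't adjust one without the other.
slow = erm_force_newtons(0.001, 0.003, 3000)
fast = erm_force_newtons(0.001, 0.003, 6000)
```

An LRA driven at its fixed resonant frequency sidesteps this: the drive signal’s amplitude sets the vibration strength independently, which is part of why LRAs can deliver that wider range of sensations.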
New Spaces for Haptics
If ERM motors were the past of haptics, LRAs are the future – or at least part of it.
“The last few generations of consoles have been pretty stagnant in terms of their evolution of haptic capabilities,” Ullrich told me. They’re good for ambient effects, Ullrich said.
“You want to create an explosion, a sense of road texture,” Ullrich offered – the motors inside PlayStation and Xbox controllers are great at that. But, Ullrich said, “there’s… an upper limit on what you can do with rumble technology.”
Now, though, higher-fidelity actuators are working their way into other gaming tech, and this puts a common link between the Nintendo Switch and the burgeoning world of VR that, personally, I hadn’t thought of.
While Sony and Microsoft are still working with those boomerang-style controllers, Nintendo and the teams behind the Oculus Rift and HTC Vive are breaking controllers out into two separate pieces meant to be held with one in each hand. That change, coupled with the introduction of LRA motors, is introducing room for all sorts of ideas to enhance games with haptics.
“Within VR,” Tullis said, “it’s super understood that haptics is an important part of the experience.”
One suggestion both Tullis and Ullrich went to was a bow and arrow, a favorite weapon of game designers and green-clad superheroes. With two completely discrete devices delivering haptic feedback, you can communicate two wildly different sensations. In one hand, you have the bow itself – a heavier device that will offer some duller sensations. In the other, you have the bowstring, which can vibrate differently as you pull the string further back and increase the tension on the line. The idea of driving and shooting at the same time came up, too, because who doesn’t want to live out a Jason Statham movie?
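As a thought experiment, that bow and arrow could be sketched as two independent effect curves, one per hand. Everything below – the function names, the amplitude ranges, the frequencies – is hypothetical, not any real controller API:

```python
def bowstring_hand(draw):
    """Hypothetical effect for the hand pulling the string: more
    draw means a stronger, higher-pitched vibration, mimicking
    rising tension. `draw` runs from 0.0 (slack) to 1.0 (fully
    drawn); all numbers are made up for illustration."""
    draw = max(0.0, min(1.0, draw))
    amplitude = 0.2 + 0.8 * draw         # 0.2 .. 1.0
    frequency_hz = 80.0 + 120.0 * draw   # 80 Hz .. 200 Hz
    return amplitude, frequency_hz

def bow_hand(draw):
    """Hypothetical effect for the hand holding the bow: a duller,
    low-frequency rumble that barely changes with tension."""
    draw = max(0.0, min(1.0, draw))
    return 0.1 + 0.2 * draw, 40.0  # gentle amplitude, fixed 40 Hz
```

The point isn’t the specific numbers – it’s that with one actuator per hand, the two curves can diverge completely, something a single rumbling gamepad can’t do.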
It doesn’t have to be all murder weapons, though.
“Let’s say you’re playing Job Simulator,” Ullrich said. “You’re manipulating two things that look exactly the same but have different masses. You can know whether one weighs more, whether one is more sticky.” Ullrich, who comes from a background in simulation-related haptics, also mentioned virtual surgical training, where making surgeons feel like they’re interacting with tissues helps make the virtual training more immersive and more effective.
In a driving game, Ullrich says haptic effects can be detailed enough to tell the difference between concrete and asphalt, cracked roadways, and even grooved roads. Tullis made sure to call out the Impulse Triggers in the Xbox One controller as a way Microsoft has tried to add additional value to controller rumble, and indeed it’s something I look forward to whenever I play Forza Motorsport.
In those cases, the haptic feedback is actually delivering additional world information on top of providing some aesthetic value. The person playing knows better how to play the game than they would without the feedback.
And before we can even get into a game, we have to navigate menus to get there. Haptics help with that, too. We see it constantly with mobile phones, like with the iPhone’s 3D Touch, but it comes into VR as well. The VR art application Tilt Brush, for example, puts the menus around your wrists. Good haptic feedback is key to helping users navigate virtual menus that not only have no physical counterpart but have the further disconnect of being navigated by real hands in virtual space.
The Sensory Uncanny Valley
The hardest part of all this, then, is getting it just right.
“With haptics, you have to be able to do it well, or you can annoy people, or confuse them,” Tullis said.
Whether it’s the user experience team at Immersion or a designer on a big AAA game, someone has to put all the haptic feedback we already have in place and make it feel good, and they have a tough job.
It’s the tactile equivalent of what foley artists do in movies. A foley artist is the person who makes sound effects that match up with what’s happening on-screen. A punch in a movie is never just a punch. For Fight Club, foley artists experimented with “shattering chicken carcasses with baseball bats, cracking walnuts inside them,” according to sound designer Ren Klyce.
On the Reasonably Sound podcast, host Mike Rugnetta said that, with sound effects, “we don’t want to hear exactly the sound. We want some heightened, maybe even abstract version of the sound.” It’s not about accurately reproducing the sound, but about matching what we expect it to be.
So how could haptics perform this kind of mimicry?
“A gunshot in real life sounds nothing like a gunshot in the game,” Tullis said. “They’re adding other stuff because, in the case of the gunshot, they want you to feel like the gun is big. They don’t want it to be realistic, they want it to be surreal.
“Similarly, let’s say you have a complex reload sequence in a gun,” Tullis continued. “You have a couple different movements.”
Tullis paused for a moment and offered up this suggestion, which immediately took root in my mind:
“A gatling gun,” he said. Like those crank-powered ones in westerns. “The gun shooting, the deep feeling. It goes empty and you have that spinning. You want that to be a light, subtle feeling.” A click with no explosion.
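One way to think about authoring that sequence is as a flat timeline of segments, each with its own duration, strength, and pitch. This is just a sketch of the idea Tullis describes – it doesn’t reflect Immersion’s actual tooling, and every value is made up:

```python
# A hypothetical gatling-gun haptic sequence, expressed as a flat
# timeline of (duration_s, amplitude, frequency_hz) segments.
# None of this reflects Immersion's actual tools; it's just one
# way to think about authoring the effect Tullis describes.

FIRING = (0.05, 1.0, 60.0)      # deep, heavy thump per shot
DRY_SPIN = (0.08, 0.15, 180.0)  # light, subtle click -- no explosion

def gatling_sequence(shots, dry_clicks):
    """Build the full timeline: `shots` heavy pulses while there's
    ammo, then `dry_clicks` faint ticks as the barrel spins empty."""
    return [FIRING] * shots + [DRY_SPIN] * dry_clicks

timeline = gatling_sequence(shots=6, dry_clicks=4)
```

The contrast between the two segment types – strong and low while firing, faint and high once empty – is exactly the kind of distinction rumble-era motors struggle to convey.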
“If you’ve done it all well, if you’ve synchronized your haptics with your audio and visuals and so on, haptics can convince the subconscious part of your brain that whatever you’re doing is real,” Ullrich explained.
“When you pick up something in a virtual reality environment and it has no mass, or you put your hand through a wall and it’s not stopped from moving, or you hit something and there’s no tactile reaction whatsoever, a very deep part of your brain basically tells the upper part of your brain that whatever you’re doing is not real. It can’t suspend its disbelief,” he continued.
Bring on the Designers!
That’s where Immersion is trying to improve things. For a long time, Tullis explained, “putting haptics into a game has been a very code-focused endeavor.”
“It’s done programmatically. You go in and program; it’s a series of parameters.”
Immersion wants to bridge the gap between the hardware – those haptic motors – and designers. They want to give the paint brushes directly to the artists, instead of having them try to work with the coders to get it right through time-intensive coding.
To that end, Immersion has created a tool they’re calling TouchSense Force that hooks into stuff like the Unreal Engine and Pro Tools, giving designers a familiar environment in which to start implementing haptics, making it more like implementing sound effects.
TouchSense Force is officially hitting the Nintendo Switch and Oculus Touch controllers first, but it’ll be coming to other platforms and other game engines as well, both Tullis and Ullrich confirmed.
Immersion will not only be working to make their software as widely available to game creators as possible, but will also be offering things like consulting services to some of those designers. The next big revolution, essentially, is in the hands of designers.
TouchSense Force works with both the old ERM and new LRA motors, but Ullrich said he hopes that Oculus and Vive are “putting pressure on Sony and Microsoft to up their game on the touch side of things.”
“Immersion has TouchSense Force as a technology that we would hope to see in next-gen consoles as a way to continue the evolution of the tactile dimension of that experience,” Ullrich said.
The haptic feedback we’re used to is on the precipice of a big change for the better. The rumblings of something new are getting too big to ignore.