How can we listen for events if the user doesn’t have a mouse?

Touch events to the rescue!

The touch event interfaces are relatively low-level APIs that can be used to support application-specific multi-touch interactions, such as a two-finger gesture.

A multi-touch interaction starts when a finger (or stylus) first touches the contact surface. Other fingers may subsequently touch the surface and optionally move across the touch surface. The interaction ends when the fingers are removed from the surface. During this interaction, an application receives touch events during the start, move and end phases.

Adding touch event listeners is rather simple, but you have to support both touch and mouse events:
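Here is a minimal sketch of that dual-listener approach. The element id (`canvas`) and the helper name `getPoint` are assumptions for illustration, not taken from the original example:

```javascript
// Normalize coordinates from either a touch or a mouse event.
// Touch events carry a `touches` list; mouse events expose clientX/Y directly.
function getPoint(e) {
  if (e.touches && e.touches.length > 0) {
    return { x: e.touches[0].clientX, y: e.touches[0].clientY };
  }
  return { x: e.clientX, y: e.clientY };
}

// Wire up both listeners when running in a browser.
if (typeof document !== "undefined") {
  const el = document.getElementById("canvas"); // assumed element id
  el.addEventListener("touchmove", (e) => {
    e.preventDefault(); // keep the page from scrolling while dragging
    console.log("touch", getPoint(e));
  });
  el.addEventListener("mousemove", (e) => {
    console.log("mouse", getPoint(e));
  });
}
```

Note that both listeners funnel into the same normalization helper; that shared helper is the only thing keeping the two code paths from drifting apart.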

Try this example on your phone, then open the desktop version side by side and inspect the console.log output.

[Image: testing touch events on an iPhone]

Supporting both touch and mouse events can become bloated and hard to maintain, since you essentially have to write separate handlers for different devices.

Pointer events to the rescue!

Check out this awesome polyfill by the jQuery team, which enables pointer event support in most browsers.

Remember our example above, where we had two event listeners: one for the mouse and one for touch?

Well, instead of writing two event listeners, we can simply change one of them to pointermove and remove the other.
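As a sketch (the element id `canvas` and helper name `describePointer` are again assumptions), the two listeners collapse into one, and the event's `pointerType` property tells you which kind of device fired it:

```javascript
// Build a human-readable description of a pointer event.
// pointerType is "mouse", "touch", or "pen".
function describePointer(e) {
  return `${e.pointerType} at (${e.clientX}, ${e.clientY})`;
}

if (typeof document !== "undefined") {
  const el = document.getElementById("canvas"); // assumed element id
  // One listener handles mouse, touch, and stylus input alike.
  el.addEventListener("pointermove", (e) => {
    console.log(describePointer(e));
  });
}
```

Because pointer events already normalize the coordinate properties, the branching logic from the touch/mouse version disappears entirely.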

[Image: pointermove supports both touch and mouse]

It’s amazing what browsers can do today! Fingers crossed we get a fully native experience inside browsers someday.

React recently launched version 16.4 with native pointer events support.