We have become a culture addicted to screens … but it's time to think beyond the screen.

The rise of smartphones over the last eight years has set a new mobile precedent for how we think about interacting with the internet. But is pinching, squinting, zooming and typing into tiny keyboards really the "connected" user experience we've all been waiting for?

As connectivity, sensors and networked services pervade every aspect of our lives and an ever-increasing number of everyday objects, the screen seems less and less ideal as the mode of interaction. It is time for product manufacturers to think far beyond the screen, to stop leaning on a smartphone-centric paradigm, and to begin imagining the IoT user interface, and the experience it fosters -- or prevents -- differently.

Consider where we are today: in January 2016, Samsung released a smart refrigerator whose interface is, in effect, a giant smartphone mounted on the door.

This is, understandably, a difficult problem to solve. First, most product manufacturers are only now wrapping their heads around IoT and sensor applications. Converting from analog to digitally connected products is a massive shift in mindset and skill set. Hardware, software, data, compliance and user experience are, in and of themselves, new, complex and often unprecedented domains, never mind evolving ones. Second, advancements in technology are expanding the possibilities for what interaction even looks like. Typing evolved to touch with the advent of the smartphone, but we're suddenly seeing many new forms of interaction emerge, such as voice, gesture, movement and beyond. Third, ubiquitous connectivity and real-time data transfer are transforming the very role of the user interface as we have always understood it.

With so much technology under the hood, the challenge becomes: how to deliver complexity as simply as possible.

Five examples of emerging IoT user interface designs

To illustrate this kind of product reimagination, it's instructive to look at some clever companies that are redefining the user interface of their product or service, and doing so in a way that makes the product simpler to use.

1. Knock [anywhere] as an interface, via Knocki

The Knocki is a small wireless device that turns any ordinary surface into a control interface. The device triggers an action based on how many times a user knocks on the surface (just as one would knock on a door). Via a mobile app, users customize which specific actions each number of knocks triggers. Through integrations with a variety of connected devices and software services, users can program complex requests, like knocking twice to snooze the alarm and start the coffee maker, or simple ones, like knocking three times to locate a lost smartphone.

2. Button as an interface, via Logitech's POP Home Switch

Technology giant and remote-control leader Logitech recently released the POP Home Switch, a button that serves as a tappable home automation hub. With the proliferation of connected appliances, lightbulbs, door locks, speakers and all of their respective apps, the POP is designed to unite disparate devices. Users can place buttons anywhere in their homes and program up to three "recipes," each triggering a different action depending on which other devices are included in the recipe. Without having to dig through layers of apps, users can, for example, tap once to turn on the TV with adjusted lighting, tap twice to turn off the lights and lock the doors, and so on.

3. Hand as an interface, via Augumenta

Augumenta takes a different approach to the IoT user interface, combining hardware (connected eyewear), software, augmented reality and the body itself as an interface. When wearing the smart glasses, users' own hands become configurable dashboards, keypads, sliders, control knobs and more. The company has also developed software that uses hand gestures to operate machine controls. Augumenta offers its SDK to developers and is working with leading industrial device and controls suppliers to prove out industrial use cases, with a long-term objective of supporting consumer AR use cases.

4. In-ear digital assistance as an interface, via Sony's Xperia

Sony was one of the first entrants in the emerging market of in-ear digital assistance, which many are calling the next big consumer mobile category. The Xperia Ear earbuds look like any other wireless earbuds, except they speak quietly into users' ears, assisting with navigation, search, music, weather, scheduling, and sending and receiving messages. The devices are also equipped with accelerometers and gyroscopes, and can detect nodding gestures to confirm a command. Users customize settings for the device in its associated mobile app, including notifications, integrations and other sound preferences.

5. Talking pill bottle as an interface, via AdhereTech

AdhereTech makes a connected pill bottle whose design enables its business model: a multistakeholder platform for optimizing medication adherence. Patients pick up the preconfigured bottle (no setup required) from the pharmacist when they collect their medication, and drop it off for reuse or recycling when the prescription runs out. The bottle itself provides patients audio and visual reminders to take their medications, and simultaneously alerts medical professionals when patients fail to do so. Patients can select the medium through which they prefer to receive alerts -- SMS, mobile, landline or directly on the bottle. The product is also unique in combining hardware and software interfaces created for multiple user personas -- patients, doctors, nurses, pharmacists, researchers and so on.
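A common thread runs through these examples: each product reduces its interface to a user-configured table that maps a simple physical input (a knock count, a tap pattern, a missed dose) to a preset list of actions. A minimal sketch of that shared pattern, using entirely hypothetical names rather than any vendor's actual API, might look like this:

```python
# Hypothetical event-to-action bindings, configured by the user (e.g., via a
# companion app). The keys are (event type, value) pairs; the values are the
# commands to fire. All names here are illustrative assumptions.
BINDINGS = {
    ("knock", 2): ["alarm:snooze", "coffee_maker:start"],            # Knocki-style
    ("knock", 3): ["phone:ring"],
    ("tap", "double"): ["lights:off", "doors:lock"],                 # POP-style recipe
    ("dose", "missed"): ["patient:sms_reminder", "care_team:alert"], # AdhereTech-style
}

def dispatch(event, value):
    """Return the device commands bound to an input event (empty if unbound)."""
    return BINDINGS.get((event, value), [])
```

However the input is sensed -- vibration, button press, or a bottle cap left closed -- the user experience work largely reduces to keeping this mapping easy to configure and the set of inputs simple enough to remember.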