Lately, we’ve been talking about our predictions for technology in 2017 (#BigIdeas2017). My big prediction is not really a stretch. I think it’s clear that this is the year that voice interfaces for computers finally mature. As my friend Garrick Chow reported in his article “CES 2017: Alexa Leads the Way to the Smart Home”, voice-operated appliances from Amazon, Google, and Microsoft are going to be huge this year. Voice assistants have been around for years. But, I believe we are just now getting to the point where these tools really work as expected and are a joyous addition to your life. But, as this technology grows, I’m here to ask you – beg you – please don’t call it “she”.

When we hit big milestones like this, we have a responsibility to shape how these changes affect our future. To that end, I want everybody to keep in mind something very important. Computers are not people (despite what HBO’s “Westworld” would like you to believe). It’s easy to anthropomorphize digital assistants, especially when they can speak to you with a friendly voice. It’s easy to start treating your digital assistant like a person and referring to it as “she”. I believe that practice is not only counterproductive but dangerous.

I love my Google Home device. I can talk to it (not her) conversationally, just like another person in my house. It (not she) will obey my instructions and respond in a calm, soothing voice. It (not she) can turn lights on and off, set timers, or play news and podcasts. I can even ask the Google Home to play Star Trek: The Next Generation on my TV. You might remember that as the ’90s sci-fi TV series where a group of space explorers allowed a robot to act as a senior crew member on a starship and were repeatedly punished for that mistake.

As much as I love the Google Home and other voice-controlled devices, there’s a subtle danger to them. When I bring this up, people assume I’m afraid of the robot uprising from movies like “Terminator”, or human/computer romances from movies like “Her”. I’m not worried about those scenarios, which I still see as goofy sci-fi concepts. I’m much more worried about the real dangers of treating a tool like a person.

Let’s look at a scenario. Your grandmother loves to use Facebook on her Windows 7 computer. But, that computer is old and slow. It’s infested with bloatware and out-of-date applications. The best solution would be to upgrade to Windows 10. Or, better yet, start with a fresh, clean install of Windows 10. Or buy a new computer. These are logical options because that computer is an old appliance.

Fast-forward 10 years into the future. The Cortana voice assistant has become a robust, intelligent, indispensable component of Windows 10. In this possible future, everybody uses their computers conversationally. Using a computer is similar to talking to a compliant, friendly, and intelligent person. It might even adjust its personality to suit your preferences. You might start to think of Cortana as a person – maybe even a friend or family member. You might not do this consciously. It might just be a subtle shift in your perception. But, what happens when “she” becomes slow, out-of-date, and loaded with bloatware? Will you be amenable to replacing “her”, or wiping “her” operating system and doing a fresh install?

I watched people get angry when Apple changed the interface for Final Cut Pro. What happens when software publishers make updates to your “friend” Siri or Cortana? Allowing yourself to become invested in the personification of a piece of software means that people will eventually avoid software updates out of fear that they will affect the personality of a trusted companion. Developing a relationship with a piece of software, no matter how well it simulates human behavior, sets a dangerous precedent.

When you apply this formula to much more vital computer systems, the scenario becomes more frightening. What if an engineer refuses to update or replace a computer that manages an air traffic control system, water reclamation facility, or missile guidance system, simply because the personality of that computer was deemed more important than the flaws and glitches it has developed? This may seem far-fetched, but we set the tone for this future if we start referring to digital assistants by name or by personal pronoun. If we start to become attached to personalities in our digital assistants in 2017, what happens as our culture develops over the next 50 years?

People have goals and desires of their own. Computers are inanimate tools. They should be repaired when faulty, updated when capable of improvement, and replaced when broken. I accept characteristics in my friends and family that make me frustrated or unhappy, because those people have a right to happiness independent of my needs. Computers have no right to happiness – real or simulated. They exist to complete the tasks they were built for.

So, I’m asking you to celebrate technology in 2017. Enjoy your computers and digital assistants. Have fun and get work done with your smartphone. Benefit from clean water and efficient air travel. But, as you do, let’s try to develop a culture with a clear separation between real people and inanimate tools. It starts by referring to Cortana as “it”, not “she”. Don’t think of Siri as your friend; think of it as software. If your Google Home doesn’t work correctly, reset it to factory settings and start with a fresh setup. It’s a computer, not a person.

---------------

To learn more about software and technology that will make your day-to-day work easier and more fun, check out the series on LinkedIn Learning - Monday Productivity Pointers, which I co-author with Jess Stratton and Garrick Chow. Some relevant topics in this series include:

Start with a fresh install of Windows 10 to clean away bloatware.

Completely erase a computer or mobile device.

Remember, a simulated personality doesn't care if you erase it. But, you should be sure to back up your data before erasing or re-installing your device's operating system.