Is the way we bark out orders to digital assistants like Siri, Alexa and Google Assistant making us less polite? Prompted by growing concerns, two Brigham Young University information systems researchers decided to ask.

"Hey Siri, is the way we talk to you making humans less polite?"

OK, OK, they didn't ask Siri. Or Alexa. Instead they asked 274 people, and after surveying and observing those people, they found some good news: Artificially intelligent digital assistants are not making adult humans ruder to other humans. Yet.

"Worried parents and news outlets alike have fretted about how the personification of digital assistants affects our politeness, yet we have found little reason to worry about adults becoming ruder as a result of ordering around Siri or Alexa," said James Gaskin, associate professor of information systems at BYU. "In other words, there is no need for adults to say 'please' and 'thank you' when using a digital assistant."

Gaskin and lead author Nathan Burton actually expected to find the opposite -- that the way people treat AIs would carry over into their lives and interpersonal interactions. According to their assessment, digital assistants in their current form are not personified enough by adult users to affect human-to-human interactions.

But that may not be the case with children. Parental concerns have already prompted both Google and Amazon to make adjustments to their digital assistants, with both now offering features that thank and compliment children when they make requests politely.

Gaskin and Burton did not study children, but assessed young adults, who generally have already formed their behavioral habits. The researchers believe that if they repeated the study with kids, they would find different results.

They also say that as artificial intelligence becomes more anthropomorphic in form, such as the new Vector Robot -- which has expressive eyes, a moving head and arm-like parts -- the effects on human interactions will increase because people will be more likely to perceive the robots as having and understanding emotion.

"The Vector Robot appears to do a good job of embodying a digital assistant in a way that is easily personifiable," Burton said. "If we did the same type of study using a Vector Robot, I believe we would have found a much stronger effect on human interactions."

The research is being presented this week at the Americas Conference on Information Systems.