IBM has banned Siri on its corporate network, saying it can't trust the intelligent assistant to keep its virtual mouth shut.

Although the company has a strong bring-your-own-device (BYOD) policy, that policy has caused a few headaches on the corporate security side of things.

The computing giant is concerned that Siri, the voice-activated assistant exclusive to Apple's iPhone 4S, could allow Apple to snoop on its customers' queries and potentially let industrial secrets out of the bag.

IBM chief information officer Jeanette Horan told MIT's Technology Review that the company is "extraordinarily conservative" about computer security, and disabled Siri because the company is worried that the "spoken queries might be stored somewhere."

It's corporate cloud paranoia at its best, but it also makes perfect sense.

Siri uploads what you say to Apple's datacenters for processing. There, Apple's servers transcribe what the user has said and return the best results to the iPhone. All this happens in the space of a few seconds.

Looking at Apple's license agreement, which dictates the terms under which Siri uploads data --- well, it doesn't clearly say. Apple doesn't say who can access the data, how long it stores the data, or whether staff actively access it.

An Apple spokesperson did not respond to questions at the time of writing.

Apple's iOS software license agreement states [emphasis mine]:

"When you use Siri or Dictation, the things you say will be recorded and sent to Apple in order to convert what you say into text and, for Siri, to also process your requests. [...] By using Siri or Dictation, you agree and consent to Apple’s and its subsidiaries’ and agents’ transmission, collection, maintenance, processing, and use of this information, including your voice input and User Data, to provide and improve Siri, Dictation, and other Apple products and services."

Terms like these are pretty bog-standard. They basically allow Siri to work, and often mean nothing more than that --- certainly nothing untoward. Having said that, the language does leave it open to interpretation and, more specifically, leaves open the potential for Apple to do something untoward if it chooses to.

Siri also uploads other things:

"...such as your first name and nickname; the names, nicknames, and relationship with you (e.g., “my dad”) of your address book contacts; and song names in your collection (collectively, your “User Data”). All of this data is used to help Siri and Dictation understand you better and recognize what you say."

But Apple doesn't mind its iPhone 4S users unwittingly uploading personal data to its datacenters. If anything, the more the better: the more data it receives, the better Siri ultimately becomes. Siri still has its beta tag firmly in place more than eight months after its release.

There is an interesting clause, however, that effectively exonerates Apple from "doing a Google". On March 1, Google consolidated its 70-plus privacy policies into one, allowing Google to build up a greater, more specific profile of its users for advertising purposes.

Apple's get out clause says:

"All of this data is used to help Siri and Dictation understand you better and recognize what you say. It is not linked to other data that Apple may have from your use of other Apple services."

Google came under heavy fire following its policy consolidation. European data protection authorities are investigating Google because they believe the search giant broke E.U. privacy and data laws by merging its policies.

IBM is right to block off Siri, and it's right to take precautions. IBM also bans Dropbox and similar cloud services. Siri and the Dictation feature can be used to write emails, text messages, and store other information that IBM may not want being uploaded to Apple before it is downloaded back to the iPhone.

Having said that, IBM could be accused of double standards by not blocking access to Google, which stores personal user data and hands it --- albeit after it is anonymised --- to advertisers.

Until Apple 'fesses up and says clearly and definitively what happens to its users' data once it's taken into Siri's custody, users probably shouldn't tell it anything they may be protective of.

That includes the nuclear codes, Mr. President.

Image credit: Apple/CNET.
