In 2014, IT executives are going to have to make some very difficult decisions about privacy. Quite often when we talk about difficult decisions, we mean that we know what the right thing to do is, but it's just hard to bring ourselves to do it. In this case, though, part of the difficulty will be knowing what the right thing to do is. For that reason, every industry -- nay, every company -- will come to very different decisions based on the concerns of their employees and customers.

Of course, some companies have to face their privacy demons more than others. Yes, I'm looking at you, Google. Not that Google is likely to ever change how it handles privacy issues. (SAT time: Google is to privacy as (A) Osama bin Laden is to peaceful negotiations, (B) Lady Gaga is to rational thought or (C) Microsoft is to customer focus. Answer: (D) all of the above.) The reason I'm looking at Google is that it just displayed privacy ineptitude on an epic scale.

This particular Google privacy debacle started out looking like the opposite: a rare instance of Google helping its customers protect their privacy. Android 4.3 was released with a privacy control, known as App Ops, that looked wonderful. You could see a list of all of your apps that told you exactly what access they had, and there was an easy way to restrict whatever you wanted to restrict. Even the privacy advocates at the Electronic Frontier Foundation -- not exactly a group of Google fans -- applauded the effort.

And then it disappeared. As of Android 4.4.2, this short-lived privacy aid was gone, as though the Grinch had discovered that all of his profits were being donated to charities. And then we found out that the useful tool was never meant to be useful to customers. By way of explanation, Google Android engineer Dianne Hackborn posted: "That UI is (and it should be quite clear) not an end-user UI. It was there for development purposes. It wasn't intended to be available. The architecture is used for a growing number of things, but it is not intended to be exposed as a big low-level UI of a big bunch of undifferentiated knobs you can twiddle. For example, it is used now for the per-app notification control, for keeping track of when location was accessed in the new location UI, for some aspects of the new current SMS app control, etc."

You see? The idea that Google would make such intuitive privacy controls available to the people whose lives are being monitored was absurd to Google. Google views privacy as interference with profits. Its business model depends on extracting as much information as possible from consumers and businesses and selling it to other consumers and businesses.

Not that Google is alone in wanting to sell you minute-by-minute geolocation tracking of every employee, customer and supplier. Which brings us back to those 2014 plans. Businesses need to change the way they view privacy so they can make proper decisions about how far to let companies like Google go. Making these decisions on a case-by-case basis, which is how most IT executives handle privacy decisions today, won't work any longer. These decisions must be made and approved at the CEO and board level and then mandated through every department. (Especially marketing, given that no one in quite some time has seen so much as a hint of Jiminy Cricket or, for that matter, anyone's conscience, in a marketing meeting.)

Let's take a look at some of the key areas where you're going to need to champion some decision-making:

1. BYOD mobile policies

The idea of permitting (forcing?) employees to use their personal property for business activities has its pluses and minuses, but the trend is prevalent enough that it's all but inevitable for most enterprises. Security protocols are going to mandate that personal phones and tablets be backed up regularly, just like corporate-issued devices. This forces the privacy debate: How do you guarantee that these backups to IT servers (the cloud makes no difference here) don't expose personal information or images to corporate eyes?

The answer is going to be some form of partitioning, with corporate data and apps completely segregated from personal data and apps. That way, IT can silently access and back up everything on its side of the Mason-Dixon Line without privacy worries. And when the employee quits, is laid off or is fired? On the last day, everything on the corporate side of the device can be remotely wiped.
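That partitioning approach can be sketched in a few lines of Python. This is a purely illustrative model (the class and method names are mine, not any real MDM product's API; real implementations are containerization features such as Android work profiles or third-party MDM suites), but it shows the essential contract: IT can back up and wipe only the corporate side, and the personal side never leaves the device.

```python
# Illustrative sketch of BYOD data partitioning -- hypothetical names,
# not a real MDM API. Corporate and personal data live in separate stores.

class PartitionedDevice:
    """Models a personal device with segregated corporate and personal data."""

    def __init__(self):
        self.personal = {}   # IT never reads, backs up or wipes this side
        self.corporate = {}  # IT may silently back up and remotely wipe this side

    def store(self, name, data, corporate=False):
        """Save a file into the appropriate partition."""
        target = self.corporate if corporate else self.personal
        target[name] = data

    def backup_corporate(self):
        """Only the corporate partition is ever copied to IT servers."""
        return dict(self.corporate)

    def remote_wipe_corporate(self):
        """On the employee's last day: erase corporate data, leave personal data intact."""
        self.corporate.clear()


device = PartitionedDevice()
device.store("vacation_photo.jpg", b"...", corporate=False)
device.store("q3_forecast.xlsx", b"...", corporate=True)

snapshot = device.backup_corporate()   # holds only the corporate file
device.remote_wipe_corporate()         # corporate side erased; the photo survives
```

The design point is that the privacy guarantee is structural, not procedural: IT's backup and wipe operations simply cannot reach the personal partition, so no policy promise or employee trust is required.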

2. The myth that young consumers don't care about privacy

Why is this a myth? Because it has become conventional wisdom on the basis of some shoddy survey questions. If you ask any group of people, "How much do you value your privacy?" you are going to get very different answers than if you asked those people to rank specific examples of personal information that they wouldn't want to be exposed.

As the father of a teenage girl, I can tell you that teens do value privacy, but what they don't consider to be private is stunning. Social interactions (including the baby-making kind) are matters to be freely shared on social sites, as are mobile phone numbers. But bank account information and payment card activity are not things they want other people to know about. (Remember Blippy? It was a site that let shoppers publicize what they purchased. Turns out almost no one wanted to do that.)

You have to understand your key groups: employees and customers. What does each group consider private? How much do they care about each area? Is there anything that would make them surrender that particular privacy? You're going to find out that different employees (and customers) have very different concerns.

Then you have to review all of your privacy policies. For your employees, this would include your ability to access all company emails and phone calls (and, presumably, texts and Twitter exchanges and any other communication mechanism). Do you really need that information? If you do, is there a less intrusive way of getting it? You might conclude that less intrusion could prove to be a useful recruiting/retention tool, especially for developers and engineers. Examine your culture and have that discussion -- in a 2014 context -- with senior management.

3. Subpoenas and search warrants

An email vendor called Lavabit was a small player in the aftermath of Edward Snowden's revelations about the National Security Agency. When hit with a court order to turn over encryption keys, the company complied -- sort of. It delivered an 11-page printout in four-point type. Prosecutors complained, saying that the printout was illegible.

"To make use of these keys, the FBI would have to manually input all 2,560 characters, and one incorrect keystroke in this laborious process would render the FBI collection system incapable of collecting decrypted data," prosecutors wrote, according to Wired. The court eventually forced Lavabit to give the government the key in an electronic form. Lavabit then made an unusual move: It told its customers that it could no longer protect such communications and then shut down the service to prevent any more of its customers from unintentionally sharing data with government investigators.

Lavabit deserves credit for being true to its marketing message: that it cared about securing customer data. Principle trumped profit. This example raises the question: Should businesses decide where they will draw the line on legal requests and then publicize that decision as part of their privacy policy? Would such a move make for good public relations?

It might make sense for some companies. The standard disclaimer today states that the vendor will hand over anything that anyone can get a judge to sign off on. I'd guess that's the way 95% of companies should go, but for a few, taking a stand could be good for business. Certainly it's something that businesses should discuss.

4. How to publish your privacy policy

The way most companies present their privacy policies is a joke. They're in tiny type, with page after page of unintelligible legalese. Typically, they can be summed up as, "We can do anything we want, and there's nothing you can do about it. Just click on the Accept box, which will bind you to everything in here. Your only other option is to not use our site or app."

When you talk about privacy in 2014, this needs to be at the top of the agenda. Push for shorter privacy policies written in plain English. Anything really important should be in bold type and quite explicit. You should point out that judges are already pushing back against user-unfriendly privacy policies, and you can expect more of that.

Ask management, "Is our policy something we're proud of or ashamed of? If we're proud of it, why are we obscuring it with tiny type and legalese? If we're not proud of it, shouldn't we change it?" As things stand today, most privacy policies practically scream to customers and employees this message: "We have something nasty to hide here." And the truth is that most companies do have something to hide in those boilerplates. Make sure that your company doesn't. Or if it does, discuss it openly so that all senior executives understand the implications.

Privacy policies are going to have to be seen as core strategic documents in 2014. Anything less and you're going to find a lot more resistance than you've been used to. But there's also a positive reason to do a rethink: You stand to gain on rivals that pass up the chance for such strategic thinking.

Evan Schuman has covered IT issues for a lot longer than he'll ever admit. The founding editor of retail technology site StorefrontBacktalk, he's been a columnist for CBSNews.com, RetailWeek and eWeek. Evan can be reached at eschuman@thecontentfirm.com and he can be followed at twitter.com/eschuman. Look for his column every Tuesday.