Editor's Note: danah boyd is a social media researcher at Microsoft Research New England and a fellow at Harvard University's Berkman Center for Internet and Society. She recently completed her Ph.D. on social media, youth practices and other intersections between technology and society.

(CNN) -- Facebook's privacy problems have been in the news ... again.

Although complaints may have started deep in the blogosphere, even Time Magazine has made them its cover story.

In response, Facebook announced a new privacy model to address complaints that it's too darn hard to actually navigate privacy settings on Facebook. (Well, duh.)

It's not yet clear whether Facebook's changes will satiate the Facebook citizenry (let alone the rabid critics), but the conversations about privacy settings tend to emphasize only a fraction of the core concerns.

Facebook rightfully believes that it's important to give users control over their settings, to empower them to make decisions about what's accessible. But Facebook can also be condescending, suggesting that they know what's best for their users because they have so much data about them (which they do).

Unfortunately, Facebook's data tells them a lot about what people do but little about why they do it. They know that people aren't quitting Facebook, but that doesn't mean that users aren't frustrated or concerned (or would be if they understood what was happening).

Don't get me wrong -- the privacy settings are confounding even for the most experienced digerati and I'm very glad that they're addressing this fundamental issue.

But in doing so, I hope that they realize that the main reason for so much public outrage goes beyond privacy settings. The issue is fundamentally about trust and informed consent.

When people share information with Facebook, they become vulnerable to Facebook. They trust Facebook to respect their interests. Facebook has the power to expose people in ways that make their lives really miserable.

Because of that power, it's crucial that they stop telling users what's best for them and start engaging in a more meaningful dialogue.

Changing things and then forcing users to opt out is manipulative. Instead, they should be seeking informed consent -- actively working with users to help make sure that they understand what's at stake in their choices.

It is unacceptable for a company like Facebook to trick people into "consenting" to make their data more visible than they might think that it is.

People should be able to understand Facebook's changes and have choices available that allow them to make appropriate decisions. When Facebook changed its privacy settings in December, far too many people clicked on through without realizing that a few mouse clicks meant that they were exposing their status updates to the world.

Many people may believe they know exactly how not-private their Facebook profile or updates are. But do they? I recommend using ReclaimPrivacy to scan your privacy settings.

Keep in mind that "Everybody" is more than the people searching for you on Facebook -- this includes every company or individual who wants to use your information for any purpose.

In many ways, data shared with "Everybody" is more accessible than something simply posted on a public website and waiting for Google to find it. And keep in mind that "Friends-of-Friends" means more than the people that you'd invite to a birthday party.

I talked with a young woman who vowed that she'd never friend her mother; she didn't realize that if she friended her aunt and her aunt friended her mother, then "Friends-of-Friends" included her mother. Do people really know who is included in their "Friends-of-Friends"?

Facebook doesn't just need to fix its privacy settings. It needs to fix its attitude and repair its relationship with its users. Facebook isn't just a space for users to share; it is built on the backs of people and profits off of the data people entrust to it. An abusive relationship is simply unacceptable.

Facebook must go beyond paternalism and start empowering users to help guide the future of the service. It starts with committing to an opt-in approach to changes and developing features that allow users to have complete transparency as to how their data is exposed to, and used by, third parties.

It then requires innovating ways to actively engage participants. Facebook built a platform for mass sharing, but if it wants to change the world, it must also develop mechanisms for informed participation.

With great power comes great responsibility.