As we reported Monday, Facebook users' seven-day voting period on the new privacy policy came to an end with turnout far below the 30 percent threshold required to make the result binding. Yes, 668,872 people voted, and of those, 589,141 said they were opposed to the new documents. But Facebook confirmed today in a blog post that it had "decided to adopt the proposed updates" to the Statement of Rights and Responsibilities (SRR) and Data Use Policy.

There was, however, a silver lining for the Facebook activists who spelled out their beefs in comments. Facebook says it amended the documents based on user feedback. "For example, we added new language to clarify our proposed updates on sharing information with our affiliates and our privacy controls," wrote Elliot Schrage, VP of communications, public policy and marketing for Facebook. (You can see more of the changes made by Facebook following user comments here.)

"While participation in the vote was minimal, this experience illustrated the clear value of our notice and comment process," Shrage wrote.

If you've been following along, this means that Facebook will discontinue votes of this kind, though presumably it will keep asking for feedback in the form of likes and comments.

The whole process was a stark reminder of how huge Facebook is, not to mention how ineffective the 30 percent vote policy was from the start. As Schrage wrote:

We made substantial efforts to inform our users and encourage them to vote, both through emails and their news feeds. Despite these efforts and widespread media coverage, less than one percent of our user community of more than one billion participated. As stated in both policies, the results are advisory unless more than 30 percent of users vote.

Now that the policy is going into effect, it's high time you read it over. Right here is Facebook's full text of the Statement of Rights and Responsibilities, and here is its full data use policy.

If you don't have a magnifying glass and a lawyer handy, start with this brief explanation, from our very own Helen Popkin:

The new policy will allow Facebook to obtain data about you "from our affiliates or our advertising partners" (with whom you've already shared your personal info, such as websites, memberships, etc.), to "improve the quality of ads." Plenty of sites already do this, matching your info (which you've provided, technically of your own free will) to show you ads you're most likely to respond to, and to report to those ad partners how you did respond.

Other concerns were outlined by the Electronic Privacy Information Center (EPIC), a not-for-profit group focused on emerging privacy issues: less control over the kinds of messages users let Facebook send via @facebook.com email, potentially leading to increased spam; sharing of user data with "affiliates" such as the Facebook-owned Instagram; and the removal of the user voting policy, despite the fact that it was impossible for users to reach the high bar.

"Although Facebook’s existing voting mechanism set an unreasonably high participation threshold, scrapping the mechanism altogether raises questions about Facebook’s willingness to take seriously the participation of Facebook users," wrote EPIC, in an open letter to Facebook CEO Mark Zuckerberg.

As always, it's important to try to grasp as much of this stuff as possible, but it's also vital that you take care what you post on any social network. And if you don't like it, the best way to show that is to not use it.

This episode's final question, which we have put to Facebook, is whether it will do away with the ballot-inspired governance logo (seen above), now that user balloting will no longer be part of the site's governance.

Wilson Rothman is the Technology & Science editor at NBC News Digital. Catch up with him on Twitter at @wjrothman, and join our conversation on Facebook.