(CNN) -- Eli Pariser made his mark on the Internet as the executive director of MoveOn.Org, the liberal group that was perhaps the first to turn the Web into a tool for massive political action.

Now he's worried the Internet is becoming too polarized, politically and otherwise, because of tools used by some of the technology and social-media world's biggest players.

His new book, "The Filter Bubble: What the Internet Is Hiding from You," details the ways Facebook, Google, AOL and numerous other online hubs are quietly personalizing the Internet for their users.

The stated goal is to make it easier for Web users to find the things online that they like. (And, of course, to make it easier for advertisers to hawk things you're more likely to buy.)

But the end result, Pariser says, is a silent, subtle bubble that isolates users from new discoveries and insights that may fall outside of their usual tastes and interests.

Pariser stepped down as chief of MoveOn in 2008 but is still president of the group's board. He spoke to CNN.com on Tuesday, the day his book was released.

On "the filter bubble" and how it works

One of the things that's really interesting about the filter bubble is that it's invisible. You can't see how your Internet, the websites you visit, are different from what other people see. They are sort of slipping further and further apart.

A couple of years ago, when you Googled something, everyone got the same results. Now, when I've done these experiments, you can get dramatically different results. One person Googles "Egypt" and sees a lot of news about the protests; the other gets travel agents talking about trips to Egypt.

I'm basically trying to make visible this membrane of personalized filters that surrounds us wherever we go online and shapes what we see.

On why the "bubble's" silent nature is bad

It's one thing when you turn on MSNBC or Fox News. When you do that, you know what the editing rule is -- what kind of things you'd expect to see there and what kind of things you'd expect to be edited out. But with a Facebook news feed or Google News, you don't know who they think you are. You don't know what's been edited out. It can really distort your view of the world.

Sometimes the unexpected, serendipitous articles or discoveries are some of the very best moments when you learn about some whole new process or way of thinking or topic. It's sad if we lose that just so a few companies can get more ad clicks.

On how Facebook filters your content

Facebook decides what people see in their News Feed largely based on what they "like" -- what they click on. (Pariser said that's imperfect. For example, someone would be more apt to click "like" on a funny photo than on a news article about genocide in Rwanda.) What that means is that you become more likely to see the former than the latter.
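The like-driven ranking Pariser describes can be sketched in a few lines of code. This is purely an illustration of the feedback loop he criticizes, not Facebook's actual algorithm; the function names, story topics, and weights here are all invented:

```python
# Illustrative sketch of engagement-based feed ranking as Pariser
# describes it: stories resembling ones you've already "liked" get
# boosted, so funny photos crowd out hard news. All data is invented.

def rank_feed(stories, like_history):
    """Sort stories by how often you've liked their topic before."""
    def score(story):
        return like_history.get(story["topic"], 0)
    return sorted(stories, key=score, reverse=True)

feed = [
    {"title": "Genocide report", "topic": "news"},
    {"title": "Funny squirrel photo", "topic": "humor"},
]
likes = {"humor": 12, "news": 1}  # you click "like" on jokes, not news

print([s["title"] for s in rank_feed(feed, likes)])
# The squirrel photo ranks first: the feed drifts toward what you
# already click on, which is exactly the bubble Pariser warns about.
```

Because the ranking only ever reinforces past clicks, the loop narrows over time: the less hard news you see, the less you can like it, and the less of it you are shown.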

Mark Zuckerberg, I think not totally kidding, said a squirrel running through your front yard may be of more interest to you right now than people dying in Africa. He may have meant that as a defense of the News Feed. But to me that's a pretty strong critique. (The word-for-word quote, from David Kirkpatrick's book "The Facebook Effect": "A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.")

I learned this the hard way. I was really trying to cultivate a group of Facebook friends that were not like me, that had different views. And, all of a sudden, they were disappearing. Facebook was saying, "We know you better than you."

On Google's "filter bubble"

Google has an enormous amount -- 10 years' worth -- of aggregate data (through search, Gmail, Maps and other services). For me, it's gigabytes' worth of data. This is part of the strategy for these companies ... to store more and more of your info on their servers and figure out which groups of people are similar in what they like. Google has done an incredible job of that.

At times that can be handy. When I Google "pizza," my local pizza places come up. But I think it's much better for consumers than for citizens.

I had friends Google "BP" when the oil spill was happening. These are two women who were quite similar in a lot of ways. One got a lot of results about the environmental consequences of what was happening and the spill. The other one just got investment information and nothing about the spill at all.

On private pros vs. public cons

There are ways in which this stuff is very useful, in particular for consumers being able to find the products that fit their tastes. But for citizens, it's a real problem. Democracy actually requires that the whole public be able to see common problems and address them and step outside of their own sort of narrow self-interest to do so.

This makes every step of that much more complicated. The problems you see may not be the same problems that other people see. I think it's easier than ever to hear only what you want to hear. That doesn't make a good citizen.

On what can be done

Part of the solution is for these companies to realize that what they're doing is important in this way and they can't just say, "Don't mind us, we're just giving people what they want."

If you look at the history of how information flows, there was a time that newspapers were kind of in the place that Google and Facebook are now -- how do we get more people to buy a copy? Then there was a shift in the early 20th century. They needed to do better, and readers and consumers demanded that of them.

Now, what we need is for the people who are building these algorithms to do better. We need consumers who will hold their feet to the fire.