All the news yesterday out of Rupert Murdoch's interview with Sky News was about Murdoch's endorsement of Glenn Beck's claim that President Obama is a racist who hates white people. But the rest of the interview had some even more disturbing remarks in it -- especially early on, when talking about his plan to make everyone pay for their Internet content.

Rupert, Rupert, Rupert. He just doesn't understand how the Internet works. If he keeps actively trying to destroy the "fair use" of content, readers from all across the political spectrum will revolt against him -- even those on his own side. Murdoch hates Google and every other search engine because he thinks that by linking to his stories, they are kleptomaniacs robbing him. When asked why he doesn't just remove his websites from Google searches now, he replies that he will -- but only after he turns them all into pay-only sites. If he really feels they are ripping him off, then why doesn't he do it now? The answer is that he can't afford to. I dare him to do it.

newsroomamerica writes:

When challenged that his news organisations could just remove themselves from the search engines, he said: "I think we will. But that's when we start charging. We do it already with the Wall Street Journal. We have a wall, but it's not right to the ceiling: you get the first paragraph of each story. If you are not a paying subscriber of WSJ.com, you get a paragraph and a subscription form." Was this WSJ model what we can expect to see in other online publications? "Maybe, maybe. There's a doctrine called 'fair use', which we believe could be challenged in the courts and barred altogether. But it's OK, we are getting a lot of advertising revenue, so we will take that slowly." The doctrine of fair use defines the various purposes for which the reproduction of a particular work may be considered fair, such as news reporting, and is a content-gathering cornerstone for most mainstream media, including publications owned by Mr Murdoch.
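
The "wall but not right to the ceiling" model Murdoch describes is simple to sketch: subscribers get the whole article, everyone else gets the first paragraph and a pitch. This is a hypothetical illustration of that metering idea, not WSJ.com's actual code.

```python
# Hypothetical sketch of a WSJ-style partial paywall: full text for
# subscribers, first paragraph plus a subscription prompt for everyone else.
def article_view(article_text: str, is_subscriber: bool) -> str:
    if is_subscriber:
        return article_text
    # Non-subscribers see only the first paragraph.
    first_paragraph = article_text.split("\n\n")[0]
    return first_paragraph + "\n\n[Subscribe to read the full story]"

story = "Murdoch plans paywalls.\n\nThe details follow in paragraph two."
print(article_view(story, is_subscriber=True))   # whole story
print(article_view(story, is_subscriber=False))  # first paragraph + pitch
```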

The NY Times already tried the paywall approach and failed.

Jamie Holly emailed me and said:

When a search engine goes to a website, it reads a file called robots.txt. This is like an instruction manual for search engines on what to index and what to skip. You can view my robots.txt file here. So what does the robots.txt file on foxnews.com say? Well, look at that. Not only is Fox allowing Google, but they are giving Google specific directions on which files to read and index. By analogy, this is like inviting someone into your home, pointing out all your valuables and asking them to take them. You even help carry them out the door and wave goodbye with a big old smile, then go inside and call the police to report you've just been robbed.

And Murdoch going after "fair use" is really interesting. The big question under section 107 of the copyright law has always been factor (3): "the amount and substantiality of the portion used in relation to the copyrighted work as a whole." Fair use is allowed "for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research." Google only shows a headline and around 128 characters -- a lot of times not even a full sentence -- for its "news" service, which is also considered a very valuable research tool. If he thinks some judge would rule that as not being fair use, then he is dumber than I thought. I really hope Murdoch does go after Google legally on this. It would be so much fun to watch. Of course, the only lawyer I think would take Murdoch's case is Orly Taitz.
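
Jamie's robots.txt point is easy to check from code. Here's a minimal sketch using Python's standard-library robots.txt parser; the directives below are made-up examples of the allow/disallow pattern he describes, not the actual contents of foxnews.com's file.

```python
from urllib.robotparser import RobotFileParser

# Made-up robots.txt in the spirit Jamie describes: the site explicitly
# lets Googlebot into some paths and blocks it from others.
robots_txt = """\
User-agent: Googlebot
Allow: /sitemap_index.xml
Disallow: /printer_friendly_story/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot may fetch the path it was explicitly pointed at...
print(parser.can_fetch("Googlebot", "http://example.com/sitemap_index.xml"))
# ...but not the path the site disallowed.
print(parser.can_fetch("Googlebot", "http://example.com/printer_friendly_story/123"))
```

In other words, a publisher who really wanted out of Google's index could say so in two lines of robots.txt -- which is exactly what Fox has not done.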

Glenn Beck joins the Net Neutrality fight by standing with Rupert and the wealthy as usual. Beck says Net Neutrality would 'destroy the free market that created the Internet'. Oh really?

Yes ma'am, may I have another?

Does Murdoch really believe that every other content provider in one form or another will suddenly join up with him and boycott Google and turn the net into a pay-per-view outlet?

I can only imagine the fun hackers would have breaking his website security if he actually tried to implement it.