Rupert Murdoch, after a brief period when he seemed to understand that the internet was a new and exciting tool, has since changed his medication and now sees it as the evil of all evils. He has been pushing, vocally but not through action, to reinstate paywalls on his various media properties. The Wall Street Journal is one of the few major newspapers that still keeps most of its content behind a paywall.
Now Murdoch is claiming he will block Google from indexing the WSJ and his other media properties. Murdoch told Sky News Australia, “If they’re just search people… They don’t suddenly become loyal readers.” He explained that traffic from search engines involves no loyalty: visitors view a few headlines and leave.
Removing a site from Google takes just a few lines in a robots.txt file, something Google and other search engines make no attempt to hide. So why is Murdoch waiting?
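For reference, a sketch of what that would look like: placing a plain-text robots.txt file at the root of wsj.com with two directives is enough to ask Google's crawler to stay away.

```
# Tell Google's crawler (Googlebot) not to crawl any page on the site
User-agent: Googlebot
Disallow: /
```

Replacing `Googlebot` with `*` would shut out every well-behaved crawler, not just Google's.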
Maybe because even without loyalty, Murdoch knows traffic will drop significantly without search engines sending tons of free visitors. Even if 99 percent of those people never return, the 1 percent who stick around might become regulars. It’s up to Murdoch and his websites to give those users a reason to stay, and then to find ways to monetize that traffic. Murdoch has previously claimed that no news websites or blogs are making serious money, ignoring the sizable enterprises behind Gawker, Huffington Post, PerezHilton, TechCrunch and hundreds of others that have embraced the internet to find more cost-effective ways to engage audiences and produce compelling content.
Techdirt points out that for all Murdoch’s grandstanding, his own websites run aggregators that link to other people’s content in exactly the way he claims others are stealing his. When others aggregate content, it’s stealing; when Murdoch does it, it’s convenient? Maybe this will stop his crusade to overturn fair use in the courts, since he’d be culpable too.