In the ongoing war between spammers and those who would have a useful internet, a blow was struck against blog spam a couple of months ago that is only now receiving some press.
Slashdot just noticed this article/study today, but it’s been out since January. Essentially, Microsoft researchers have discovered tens of thousands of junk Web pages, created only to lure search-engine users to advertisements. We regular internet users have known about these pages for ages, and have been plagued by their existence for far too long, but as it turns out, the researchers noted that ‘the vast bulk of the junk listings was created from just two Web hosting companies and that as many as 68 percent of the advertisements sampled were placed by just three advertising syndicators.’
My question is: how will this affect e-commerce on the internet? It’s been three months since the takedown of these sites, about a generation in internet time, so in theory we should already be feeling the effects of these changes, positive or negative.
Theoretically, any improvement in the signal-to-noise ratio on the internet is a good thing, but in the end, all these sites ended up doing was promoting legitimate businesses. Still, click-through ratios on these spam sites can’t have been particularly good, so many a well-crafted pay-per-click campaign ended up buried by being placed on these types of sites.
I can’t be sure because I really don’t have the time to research this at the moment, but the type of advertiser that Project Strider seems to have been going after was probably the kind born out of necessity from AdSense’s one-affiliate-per-program rule. Again, this is all coming from slightly rusty memory, but I remember Perry Marshall sending out a notice a couple of years ago mentioning that Google had changed its program so that multiple affiliates of the same program could no longer compete against each other on the same keywords. Essentially, the most successful campaign was the one that didn’t get shut down (successful meaning the campaign with the highest CTR).
As the diagram shows, the advertisers that Project Strider was targeting were those using unique domain names as doorways to common affiliate programs and PPC sites. Obviously the technique evolved into simply creating lowbrow content-trap sites, but I think the genesis of the approach may have come from Google’s original policy.
Regardless, what does it mean for the industry and for the end user? For the end user, it should obviously mean an improved experience when searching the web. To be honest, marketing has moved so far forward that much of the information I go looking for is hidden behind several pay-me-for-the-info gates. For instance, if I’m looking for help with a particularly arcane programming roadblock, there are a number of forums out there that seem to have my question answered, but require a subscription to see the response. To me this is just as annoying as seeing the same question, without an answer, parroted on fifteen content-trap websites filled with nothing but links to other advertising sites.
As an internet marketing professional always looking for an easy buck, I think if nothing else this will force me to be more creative in my quest to create automated sources of content. I’ve spent several months working on a site that will surf RSS feeds and create articles distinctly different from, but on topic with, what large crowds in the blogosphere are talking about. I simply don’t have the time and resources (or maybe it’s just that I’ve some scruples that prevent me from) creating mindless content-trap sites like the ones the report describes.
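To give a rough idea of the kind of thing I mean, here’s a minimal sketch of the first step of such a site: scanning RSS item titles to find what the blogosphere is currently buzzing about. This is purely illustrative, not my actual code; the sample feed and stopword list are made up, and a real version would fetch live feeds over HTTP rather than parse a hardcoded string.

```python
import re
import xml.etree.ElementTree as ET
from collections import Counter

# Hypothetical sample feed; a real crawler would fetch live RSS URLs.
SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>Google AdSense policy change angers affiliates</title></item>
  <item><title>Affiliates react to new AdSense rules</title></item>
  <item><title>Spam doorway sites taken down</title></item>
</channel></rss>"""

# Tiny illustrative stopword list; a real one would be much longer.
STOPWORDS = {"to", "the", "a", "new", "of", "and"}

def trending_terms(rss_xml, top_n=3):
    """Count word frequency across <item> titles to spot hot topics."""
    root = ET.fromstring(rss_xml)
    words = []
    for item in root.iter("item"):
        title = item.find("title").text.lower()
        words += [w for w in re.findall(r"[a-z]+", title)
                  if w not in STOPWORDS]
    return Counter(words).most_common(top_n)

print(trending_terms(SAMPLE_RSS))
```

From there, the harder (and more interesting) part is generating genuinely distinct articles around those topics, which is exactly what separates this from a content-trap site.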
Bottom line: I’m not sure what it means, but I’m leaning towards ‘it’s a good thing.’ This isn’t a common practice across a large cross-section of marketers; it was simply the actions of a few companies, and its removal shouldn’t have anything but positive effects for marketers, e-commerce sites, and users.