Content is King!

Posted in Guest Posts, SEO Industry

Note from MOGmartin: this guest post was written by Jim Magary of Boomient Consulting.

Some great news came down from Google this week: the company announced it will take steps to reduce the amount of “content farm” content in its search results.

For those who don’t know what a content farm is: content farms are companies that produce content for the web at a rate of up to thousands of pieces per day. This content tends to be produced as cheaply as possible, written expressly to rank in the search engines for particular keywords, and not with the best interests of the site visitor in mind.

We’ve all seen these search results… generic, unhelpful articles on a particular topic, surrounded by ads for products related to our search. Google has determined that this type of low-quality content is not serving its users either, and it is correct. But the challenge remains: how do you identify low-quality content versus high-quality content when neither can overtly be called spam?

As someone who works in the SEO field, I spend a lot of time attempting to drive search visitors toward particular content, so I’m sensitive to any industry practice that evaluates content for quality; I wouldn’t want my content to be unfairly de-indexed by the search engines. However, believing myself to be a decent content writer who produces and supports well-researched, high-quality content on the web, I am in favor of Google doing everything it can to cut down on crappy content-farm webpages, so the announcement comes as a breath of fresh air.

In the past few years, the web has reached the point where regular users are adding content at an astonishing rate, and Google reports that it has seen over 1 trillion unique pages. It’s almost as if Google has done too good a job indexing the web: search results are now littered with spam sites and other junk that may minimally address a search query but comes nowhere close to professional writing.

It’s true that one person’s spam may be another’s truly satisfying meal, but even a cursory scan of web results will reveal content that has obviously been produced purely for SEO purposes and that reasonable people would agree is not worth reading. Many of these articles are too short, poorly written, and serve only to drive visitors to the numerous ads that appear on the page. Coming across this content in the search results does not serve the average Google user, who is usually looking for something authoritative, credible, and professionally produced.

The issue lies in the fact that many of the best content producers simply don’t know enough about SEO, leaving their sites devoid of common tactics such as putting a keyword in the title tag or optimizing the first hundred words of body copy to include a key phrase. They are being outfoxed by marketers who are increasingly savvy at reverse engineering Google’s algorithm to favor their own webpages. Google is good at what it does, but it still organizes results based on quantitatively measurable factors, which makes it difficult to introduce “quality” as an algorithmic component.
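To make those two tactics concrete, here is a minimal sketch of what that basic on-page optimization looks like. The key phrase, page title, and copy below are hypothetical, invented purely for illustration:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Keyword in the title tag: the most basic on-page signal
       (the phrase "organic coffee beans" is a made-up example) -->
  <title>Organic Coffee Beans: A Buyer's Guide</title>
  <!-- A human-readable description that can appear in the search snippet -->
  <meta name="description"
        content="How to choose organic coffee beans, from roast dates to certifications.">
</head>
<body>
  <!-- Keyword echoed in the main heading -->
  <h1>Choosing Organic Coffee Beans</h1>
  <!-- Key phrase worked naturally into the first hundred words of body copy -->
  <p>Organic coffee beans vary widely in quality, and the label alone tells
     you little about freshness or flavor. This guide covers what actually
     matters when you buy.</p>
</body>
</html>
```

Nothing exotic, which is exactly the point: a great writer who skips these basics can lose rankings to a mediocre writer who doesn’t.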

But the news from Matt Cutts & Co. is encouraging. Here are a few things that Google is doing, or will be tackling soon, according to the recent post:

  • Algorithm changes to eliminate “low-quality” sites
  • Redesigned document-level classifier to make it harder for spam content to rank well
  • Improved ability to detect hacked sites
  • Upcoming changes to crack down on “scraping” sites, which simply use content from other sites and don’t offer much else
  • New extension for Chrome allowing users to report spam sites on their own

For the SEO community, the mystery is still how Google will actually identify low-quality sites. Once this becomes clearer, marketers will, of course, react to it, as they do with every algorithmic change they can detect. The most likely effect, though, is higher-quality content, which costs more to produce, and that added cost will put many of the shadier marketers out of business. This bodes well for good sites and for ethical SEO consultants looking to promote true, quality content.

The fact that Google’s algorithm is open to reverse engineering at all is the whole reason the SEO industry exists, but the fact that Google has not yet isolated the “quality angle” of search results is why bottom-feeders like content farms can game the system with relative ease. If Google ever perfects search, the practice of SEO will gain much-needed credibility, because it will be harder to do, and the field will therefore be populated by individuals who are both intelligent marketers and authoritative content producers. To me, that sounds much better than the status quo.

I hope Google cracks this nut soon.  I’m looking forward to a web content industry focused on things like journalism, research, credibility, usefulness, and newsworthiness, not to mention good spelling and grammar. These are the things I like to see when I search the web, and I’m sure I’m not alone.  If content is King, then quality should be Queen.

Jim Magary is the founder of Boomient Consulting, a digital marketing firm focused on SEO, web development, and high-quality web content.
