My FrontPager

Bringing you all the Front Pages from Africa to the World


One of the biggest challenges facing content managers, small businesses and digital marketers is answering the question: what constitutes quality content, and how do we rank higher in core web search? This question has been asked since the beginning of the internet age, and unfortunately it doesn't look like it is going away anytime soon.
In December 2000, when Google launched its browser toolbar, it made relevant search results available from any webpage on the internet, and ranking has been a form of marketing ever since. The higher your rank, the more visible your business.
During this period, the tactics for ranking higher in core web searches were simple. All you had to do was two things:
  1. Use a lot of keywords in your articles (keyword stuffing); and
  2. Build a lot of inbound links to the content.
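To see what tactic 1 amounts to in practice, here is a toy keyword-density check in Python. This is a hypothetical sketch for intuition only, not anything Google actually runs; the sample sentences are invented.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that exactly match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

# A "stuffed" sentence versus naturally written copy (both made up):
stuffed = "buy shoes cheap shoes best shoes shoes online shoes"
natural = "our store sells comfortable footwear for every season"

print(round(keyword_density(stuffed, "shoes"), 2))  # → 0.56 (more than half the words)
print(round(keyword_density(natural, "shoes"), 2))  # → 0.0
```

An abnormally high density like the first result is exactly the pattern that once gamed rankings and that later filters learned to penalize.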
These tactics became known as SEO, and anyone with a little know-how could rank higher regardless of how poor or unoriginal their content was.
For those who don't know, an algorithm is a step-by-step process for solving a particular problem. Google's algorithms, then, are the processes Google uses to rank sites and return search results.
That has changed, though. Since the early 2000s, Google has been updating its algorithms, sometimes as many as 500 times a year. These changes were no doubt made to create a better user experience and to reward sites that post quality content. While many were little tweaks, some, like Google Panda, Google Penguin, Google Hummingbird and Google Pigeon, were among the major ones that changed the ranking landscape.
Panda was released in 2011 as a content quality filter, and it is fair to say it has revolutionized SEO and search ranking since then.
So I don't think it is unfair to say that a lot of people don't deserve to be number one in Google's rankings, or even on the first page. This is especially true for sites known for bad content or spam, keyword stuffing, abusive pagination or thin content, and those with a poor user experience that are only out there to exploit users. And if you think mere SEO, coupled with bad content, will help you, you are mistaken. The updates are periodic, arriving on average once every two months, so it has become increasingly hard, if not impossible, for bad content to escape the filter.
During one of the recent announcements of an algorithm quality update, Google's Matt Cutts revealed the reason for the many updates: "Many of the changes we make are so subtle that very few people notice them…This update is designed to reduce rankings for low-quality sites–sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites–sites with original content and information such as research, in-depth reports, thoughtful analysis and so on."
With this, it is clear that Google's core ranking algorithm can pick out bad content and webspam from many so-called "black hat" companies and webmasters, and this is particularly bad news for lazy, uncreative businesses and content managers. It is no longer enough to be good at SEO; you also need good content and good practices to scale.
And this brings us to the question: what, according to Google, separates good content from bad?
Good content has the following qualities:
  1. It is written by an expert or enthusiast who knows the topic well.
  2. It is not duplicated. There are no overlapping or redundant articles on the same or similar topics with slightly different keyword variations.
  3. It is free of spelling, stylistic and factual errors.
  4. It is factual enough for users to trust the information it contains.
For a more exhaustive list of quality assurances for online content, read Google's official blog post on the matter.
Some sites are more vulnerable to Panda than others, and every content developer or marketer should know why. The following are some of the reasons Google might deem your content poor and ultimately rank your site low.
  1. Unoriginal or duplicated content: Sites with original content tend to rank higher, while those with duplicated content do poorly.
  2. High bounce rate: If readers visit a page and don't click any other links on your site, Google assumes they are not finding what they are looking for, which it attributes to bad content.
  3. Low return visits: Visitors come to your site once and never come back. Google again assumes they aren't finding what they are looking for and ranks your content accordingly.
  4. High load time: If your site takes a long time to load, that makes for a poor user experience and, consequently, a low ranking.
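The bounce-rate and return-visit signals above are easy to compute from your own analytics data. The sketch below shows one way to do it in Python; the `Session` records, field names and sample numbers are all invented for illustration and are not real Analytics data.

```python
from dataclasses import dataclass

@dataclass
class Session:
    visitor_id: str     # hypothetical anonymized visitor identifier
    pages_viewed: int   # pages seen during this session
    load_time_ms: int   # how long the landing page took to load

def bounce_rate(sessions: list) -> float:
    """Share of sessions in which the visitor viewed exactly one page."""
    bounces = sum(1 for s in sessions if s.pages_viewed == 1)
    return bounces / len(sessions)

def return_visit_rate(sessions: list) -> float:
    """Share of distinct visitors who came back for a second session."""
    counts: dict = {}
    for s in sessions:
        counts[s.visitor_id] = counts.get(s.visitor_id, 0) + 1
    returning = sum(1 for c in counts.values() if c > 1)
    return returning / len(counts)

# Made-up sample data: visitor "a" returns once; "a" and "c" bounce.
sessions = [
    Session("a", 1, 4200),
    Session("b", 3, 900),
    Session("a", 2, 1100),
    Session("c", 1, 5600),
]

print(bounce_rate(sessions))        # → 0.5 (2 of 4 sessions bounced)
print(return_visit_rate(sessions))  # 1 of 3 visitors ("a") returned
```

A rising bounce rate or falling return-visit rate in numbers like these is the kind of pattern the post describes Google treating as a sign of weak content.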
Despite the numerous updates, the best advice for any marketer is to know what good content is and to focus on creating valuable, quality, shareable content. Improving the overall user experience also matters, as does using fewer keywords (except for brand names). A marketer can also use Google Analytics to see how the most recent algorithm updates affect his or her site's traffic and visibility.
