
SEOs beware: Google preps over-optimisation penalty

Posted 19 March 2012 13:22pm by Patricio Robles with 7 comments

Can you ever have too much of a good thing? According to Google, the answer is 'yes' when it comes to SEO.

In the past couple of years, the search giant has made a concerted effort to improve the quality of its index.

The measures taken are wide-ranging, from updates targeting content farms to the more recently announced penalty for pages with too many ads.

Now Google is apparently set to take its efforts one step further by targeting pages and sites that it deems over-optimised.

As detailed by Search Engine Land, Google's Matt Cutts has revealed that Google is hoping to "level the playing field" for sites with great content by penalising sites with lower quality but better SEO.

On a panel moderated by Search Engine Land's Danny Sullivan at SXSW, Cutts stated:

"All those people doing, for lack of a better word, over optimisation or overly SEO – versus those making great content and great site. We are trying to make GoogleBot smarter, make our relevance better, and we are also looking for those who abuse it, like too many keywords on a page, or exchange way too many links or go well beyond what you normally expect."

According to Cutts, "several engineers" are working on this update, and it could be rolled out as soon as the next few weeks.

Unfortunately for SEOs, beyond what Cutts mentioned (keyword stuffing and excessive link exchanges), Google has revealed no further detail on what constitutes "over-optimisation", leaving everyone to wait and see just how many sites get caught in the crosshairs of the update.
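For a sense of how crude the signal Cutts describes can be, keyword stuffing is often approximated with a simple keyword-density check. The sketch below is purely illustrative: the tokenisation and the 5% threshold are assumptions for the example, not anything Google has published about its algorithm.

```python
import re
from collections import Counter

def keyword_density(text, keyword, threshold=0.05):
    """Crude keyword-density check: flags text where the target keyword
    accounts for more than `threshold` of all words. The 5% cut-off is
    an arbitrary illustration, not a figure Google has disclosed."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0, False
    density = Counter(words)[keyword.lower()] / len(words)
    return density, density > threshold

# Example: a short, obviously stuffed snippet of page copy.
page = "Cheap widgets! Buy cheap widgets here. Cheap widgets shipped fast."
density, stuffed = keyword_density(page, "widgets")
print(f"density={density:.2%}, flagged={stuffed}")  # density=30.00%, flagged=True
```

The point of the commenter quoted below is essentially that signals this shallow are easy to game, which is why penalising the gaming rather than improving the signal feels like treating a symptom.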

Ironically, you could argue that Google's efforts to clean up its index seem to be over-complicating its algorithm, a hint that the company is trying to fix something it increasingly recognises has major flaws.

As one astute commenter on Search Engine Land observed:

"There is no such thing as 'over-optimisation.' Rather, Google's algorithm relies too much on signals that do not correlate to the quality and relevance of a given webpage. For example, just because something is in an H1 tag does NOT mean that the surrounding text is more likely to be high quality content about that keyword.

Penalising people for tricking something that is easily tricked isn't the solution. Revamping the system to improve its fidelity is."
