Last year (September 2014) Google introduced an update to its Panda algorithm. It was said that this would affect around 3-5% of search queries, and websites which did not meet the algorithm's guidelines would experience a drop in website traffic and search engine rankings. Although not directly affected by this, at Lilac James we have experienced an influx of enquiries from potential clients exclaiming 'my traffic has disappeared overnight'. When checking their sites we have seen that, yes, they are not meeting the algorithm guidelines: they have 'thin', low-quality, 'duplicate' content and in some cases machine-generated content, none of which follows current best practices.
However, it is not all negative. Most of our existing clients saw an improvement in ranking results and website traffic where they were recognised for having quality, relevant content; some even saw dramatic increases of over 100%. Why, you may ask? Because with this algorithm update Google started to reward small to medium-sized businesses over larger, well-known brands.
Advice for website owners
If you haven't already, we at Lilac James would recommend updating your content marketing strategy to reflect the algorithm changes. As part of our process we look at a website's content, adjusting it where needed so that it is informative, relevant to searchers and of a high quality. This way we can reassure website owners that their websites meet current best practices.
So what should you do? Firstly, run your website through software that can detect duplicate and thin content. We would recommend Siteliner.com, as you can use the software for free or pay on a $0.01-per-credit basis (each credit is equal to one page of your website). It can also detect broken links and incorrectly set-up redirects. Based on the findings, you can then adjust the pages on your site, removing or rewriting duplicate and thin content.
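To give a rough idea of what such software looks for, here is a short Python sketch that flags thin pages (too few words) and near-duplicate pages (high text overlap). This is purely illustrative: Siteliner's actual detection methods are not public, and the word-count threshold and the shingle-based similarity measure below are our own assumptions.

```python
# Illustrative sketch only: thresholds and the similarity measure are
# assumptions, not how Siteliner or Google actually detect these issues.

def shingles(text, k=5):
    """Split text into overlapping k-word shingles for similarity checks."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a, b):
    """Jaccard similarity between two shingle sets (0.0 to 1.0)."""
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

def audit_pages(pages, thin_words=200, dup_threshold=0.5):
    """Given {url: page_text}, return thin pages and near-duplicate pairs."""
    thin = [url for url, text in pages.items()
            if len(text.split()) < thin_words]
    dups = []
    urls = list(pages)
    for i, u in enumerate(urls):
        for v in urls[i + 1:]:
            if jaccard(shingles(pages[u]), shingles(pages[v])) >= dup_threshold:
                dups.append((u, v))
    return thin, dups
```

Pages flagged by a check like this are the ones worth rewriting or consolidating first.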
Is that it?
Well, according to recent news, it may no longer be as simple as that. The Daily Mail reported that Google is hoping to be able to rank websites based on the quality of the facts they contain. Although there are no plans for Google to use this newly created algorithm in the near future, we would recommend making sure your site content is factually correct.
How will the algorithm work?
The new algorithm would measure the trustworthiness of a website and its content by pulling facts from a number of its pages and comparing them against Google's Knowledge Vault. It would then jointly estimate the correctness and accuracy of the website it is crawling, giving it a Knowledge Trust Score. Pages and websites would then be ranked on this basis, with websites that have a high trust score ranking highest and those with a low trust score ranking lowest. It has not yet been confirmed whether those offering incorrect facts will be penalised, but we would not recommend that anyone risk it; after all, it is much harder to rank when recovering from a Google penalty.
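To illustrate the general idea (and only the idea: Google's research describes a far more sophisticated joint probabilistic model, and the facts and mini knowledge base below are invented examples), a trust score could in its simplest form be the fraction of a page's extracted facts that agree with a reference knowledge base:

```python
# Simplified illustration only: the facts and KNOWLEDGE_BASE entries are
# made-up examples, and the real approach is probabilistic, not a plain
# fraction of matching facts.

KNOWLEDGE_BASE = {
    ("paris", "capital_of"): "france",
    ("water", "boils_at_celsius"): "100",
}

def trust_score(extracted_facts):
    """Score a page as the fraction of its extracted (subject, predicate,
    object) facts that agree with the reference knowledge base. Facts the
    knowledge base does not cover are ignored; returns None if nothing
    could be checked."""
    checked = correct = 0
    for subject, predicate, obj in extracted_facts:
        known = KNOWLEDGE_BASE.get((subject, predicate))
        if known is None:
            continue  # no evidence either way for this fact
        checked += 1
        if known == obj:
            correct += 1
    return correct / checked if checked else None
```

A page claiming Paris is the capital of France and that water boils at 90°C would score 0.5 under this toy model: one checkable fact right, one wrong.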
Websites said to be affected by the algorithm, if it is rolled out, include forums. This raises the question: should we also think about our link-building processes? Will Google penalise websites which link to, or are linked from, factually incorrect websites? One thing we do know is that quality content is king, and always has been. So if you want to rank well and increase your website traffic, you should make sure that all your content, including third-party content linking back to you, is relevant, informative and of a high quality.