One of the key factors Google looks at when ranking a site is the number and quality of the sites that link to it. Google's ranking algorithms treat the entire internet as a giant web of connections: if a site has a lot of in-bound links from other high-ranking sites, Google will naturally consider it a high-quality site and rank it accordingly.
In the bad old days of SEO it was possible to manipulate Google's algorithms simply by generating a large number of in-bound links from other sites on the web. It was common to see link directories containing nothing other than millions of links to other sites (often paid for by those sites' owners). It was also fairly common to see link-sharing pages, as entirely unrelated sites exchanged links with each other simply to boost their in-bound link counts.
Of course Google was well aware of this tactic and in no way endorsed the approach. However, given the hundreds of millions of websites Google has to index on a regular basis, there was only a limited amount of computing power it could apply to its algorithms, and the situation persisted for several years.
However, starting with the infamous Penguin and Panda updates, Google was finally able to apply enough computing power to change its algorithms. Rather than simply looking at the volume of in-bound links, Google developed algorithms to measure a site's authority and the quality of the content it contained. As a result, the algorithms actually began to penalise sites with hundreds of in-bound links from low-quality sites.
This caused significant changes in search engine rankings, with some sites dropping tens or even hundreds of places overnight. To deal with the damage caused by poor-quality in-bound links, Google included a 'disavow' tool in its Webmaster Tools suite. This tool allows webmasters to disavow links from low-quality sites so that they are no longer counted by Google's algorithm.
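The disavow tool works by uploading a plain text file listing the links to be ignored, one entry per line. A minimal sketch of the format is below; the domain and URL shown are hypothetical examples, not real sites:

```text
# Lines beginning with # are comments and are ignored by Google.
# Disavow every in-bound link from an entire low-quality domain:
domain:spammy-directory.example

# Disavow a single specific page that links to the site:
http://link-farm.example/paid-links-page.html
```

Disavowing an entire domain with the `domain:` prefix is generally safer than listing individual URLs, since link farms tend to link from many pages at once.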
Over time more and more of the old black-hat approaches to SEO have been engineered out of Google's algorithms, and we are now in a position where the best strategy is to focus on generating high-quality content and acquiring high-quality in-bound links from other quality sites.
Webmasters can ensure their own site has a regular flow of high-quality content by regularly updating their copy and using news or blog systems to publish new material. However, it's very difficult for webmasters outside the web design or SEO industry to place content of their own on other high-quality sites. The task of creating high-quality in-bound links therefore falls mainly to SEO companies such as Webfuel, who can offer ongoing SEO campaigns that slowly build up the number and quality of sites linking back to a client's site.
If you'd like to talk to Webfuel about an in-bound link building campaign that will generate guaranteed results for your site's rankings, please give us a call on 01509 852 188.