One of the key factors Google looks at when ranking a site is the number and quality of sites that link to it. Google's ranking algorithms work by treating the entire internet as a giant web of connections. If a site has a lot of in-bound links from other high-ranking sites then Google will naturally consider it to be a high quality site and rank it accordingly.
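The intuition above can be sketched with a PageRank-style power iteration over a tiny invented link graph. The site names and link structure below are hypothetical, and this is only an illustration of the general idea, not Google's actual (and far more complex) algorithm:

```python
DAMPING = 0.85  # standard PageRank damping factor

# Hypothetical link graph: page -> pages it links out to.
links = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com"],
    "c.com": ["a.com"],
    "d.com": ["c.com"],
}

def pagerank(links, iterations=50):
    pages = list(links)
    n = len(pages)
    # Start every page with an equal share of rank.
    ranks = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - DAMPING) / n for p in pages}
        for page, outgoing in links.items():
            # Each page passes its rank on, split among its out-links.
            share = ranks[page] / len(outgoing)
            for target in outgoing:
                new[target] += DAMPING * share
        ranks = new
    return ranks

ranks = pagerank(links)
# c.com attracts the most in-bound links, so it ends up ranked highest.
print(max(ranks, key=ranks.get))  # → c.com
```

Note that rank flows *through* links: a link from an already well-ranked page is worth more than one from an obscure page, which is exactly why link quality matters as much as link count.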
In the bad old days of SEO it was possible to manipulate Google's algorithms simply by generating a large number of in-bound links from other sites on the web. It was common to see link directories containing nothing other than millions of links to other sites (often paid for by those sites' owners). It was also fairly common to see link sharing pages, where entirely unrelated sites exchanged links with each other simply to inflate their in-bound referral counts.
Of course Google was well aware of this tactic and in no way endorsed it. However, given the hundreds of millions of websites Google has to index on a regular basis, there was only a limited amount of computing power it could apply to its algorithms, and the situation persisted for several years.
However, starting with the infamous Penguin and Panda updates, Google was finally able to apply enough computing power to change its algorithms. Rather than simply looking at the volume of in-bound links, Google developed algorithms to measure a site's authority and the quality of the content it contained. As a result the algorithms actually started to penalise sites with hundreds of in-bound links from low quality sites.
This caused significant changes in search engine rankings, with some sites dropping tens or hundreds of places overnight. To deal with the damage caused by poor quality in-bound links, Google included a 'disavow' tool in its Webmaster Tools suite. This tool allows webmasters to disavow links from low quality sites so that they will no longer be counted by Google's algorithm.
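For reference, the disavow tool accepts a plain-text file with one entry per line. The domain names below are invented examples, but the `domain:` prefix (to disavow a whole site), single-URL lines, and `#` comments follow the format Google documents:

```text
# Paid directory listings we no longer want counted
domain:spammy-directory.example
domain:link-farm.example
# A single low quality page rather than a whole domain
http://cheap-links.example/seo-page.html
```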
Over time more and more of the old black-hat approaches to SEO have been engineered out of Google's algorithms, and we are now in a position where the best strategy is to focus on generating high quality content and acquiring high quality in-bound links from other quality sites.
Webmasters can ensure their own site has a regular flow of high quality content by regularly updating their copy and using news or blog systems to create fresh content. However, it's very difficult for webmasters outside the web design or SEO industry to place content of their own on other high quality sites. The task of creating high quality in-bound links now falls mainly to SEO companies such as Webfuel, who can offer ongoing SEO campaigns that slowly build up the number and quality of sites linking back to a client's site.
If you'd like to talk to Webfuel about an in-bound link building campaign that will generate guaranteed results for your site's rankings then please give us a call on 01509 852 188.