One of the key factors that Google looks at when ranking a site is the number and quality of sites that link to it. Google's ranking algorithms treat the entire internet as a giant web of connections: if a site has many in-bound links from other high-ranking sites, Google will naturally consider it to be a high-quality site and rank it accordingly.
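The link-graph idea above can be sketched with a simplified PageRank-style calculation. This is an illustrative toy, not Google's actual algorithm: the example pages, damping factor and iteration count are all assumptions chosen for demonstration.

```python
# Toy illustration of link-based ranking in the spirit of PageRank.
# The graph, damping factor and iteration count are illustrative only.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                # Each page passes a share of its rank to the pages it links to.
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            else:
                # A page with no out-links spreads its rank evenly.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

web = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],  # "d" links out but nothing links to it
}
ranks = pagerank(web)
```

Running this, page "c" ends up ranked highest because three other pages link to it, while "d", which no one links to, sinks towards the bottom; that is the intuition behind rewarding in-bound links from well-connected sites.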
In the bad old days of SEO it was possible to manipulate Google's algorithms simply by generating a large number of in-bound links from other sites on the web. It was common to see link directories containing nothing but millions of links to other sites (often paid for by those sites' owners). It was also fairly common to see link-sharing pages, as entirely unrelated sites exchanged links with each other simply to boost their in-bound link counts.
Of course, Google was well aware of this tactic and in no way endorsed the approach. However, given the hundreds of millions of websites that Google has to index on a regular basis, there was only a limited amount of computing power it could apply to its algorithms, and the situation persisted for several years.
However, starting with the infamous Panda and Penguin updates, Google was finally able to apply enough computing power to change its algorithms. Rather than simply looking at the volume of in-bound links, Google developed algorithms to measure a site's authority and the quality of the content it contained. As a result, the algorithms actually started to penalise sites with hundreds of in-bound links from low-quality sites.
This caused significant changes in search engine rankings, with some sites dropping tens or hundreds of places overnight. To deal with the damage caused by poor-quality in-bound links, Google included a 'disavow' tool in its Webmaster Tools suite. This tool allows webmasters to disavow links from low-quality sites so that those links are no longer counted by Google's algorithm.
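The disavow tool works by accepting a plain text file with one entry per line: either a full URL of a linking page, or a `domain:` line to disavow every link from a whole domain, with `#` lines treated as comments. A minimal sketch of such a file, using placeholder domains rather than any real site:

```
# Paid directory links we could not get removed
http://spam-directory.example.com/links.html
# Disavow every link from this domain
domain:low-quality-links.example.net
```

Disavowing is a last resort: it tells Google to ignore those links, so it should only list links you are confident are harming the site.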
Over time, more and more of the old black-hat approaches to SEO have been engineered out of Google's algorithms, and we are now in a position where the best strategy is to focus on generating high-quality content and acquiring high-quality in-bound links from other quality sites.
Webmasters can ensure their own site has a regular flow of high-quality content by regularly updating their copy and using news or blog systems to publish new material. However, it is very difficult for webmasters outside the web design or SEO industry to place content of their own on other high-quality sites. The task of building high-quality in-bound links now falls mainly to SEO companies such as Webfuel, who can offer ongoing SEO campaigns that gradually build up the number and quality of sites linking back to a client's site.
If you'd like to talk to Webfuel about an in-bound link building campaign that will deliver guaranteed results for your site's rankings, then please give us a call on 01509 852 188.