We’ve already looked at many of the ranking factors Google takes into consideration when deciding where to place your website in the search engine results pages (SERPs), as well as some of the most important ones you should be optimising for to improve your position in search.
Last time, we looked at click-through rate, security and page speed, and what you need to implement to ensure Google considers your website for one of its top spots. In case you missed it, you can read what we had to say here.
For everybody else, let’s dive straight in and look at more ranking factors that you need to optimise for.
Once again, it’s important to remember that there is no silver bullet when it comes to page rankings. Algorithm updates, competitor performance and other factors mean rankings can fluctuate on a daily basis. But that doesn’t mean you shouldn’t be optimising your website. If nothing else, little refinements here and there are a great way of improving your website’s overall performance.
Whilst it’s always important to make sure your website content is different in the sense that it gives the reader something new and dynamic to savour, it’s even more important to make it different in the sense that… well… it’s not duplicated.
Duplicated content, i.e. content that is simply replicated from someplace else on your website (or anywhere else online, for that matter), is one of the cardinal sins of search engine optimisation and will negatively impact your rankings. At least, that’s what’s always been implied.
Google has never actually specified that duplicated content will harm your rankings, but they have stipulated numerous times that it’s “probably more counterproductive than actually helping your website”.
It seems like there’s a fine line of semantics between ‘duplicated’ and ‘copied’. Whilst Google expects there to be some duplicated carryover across web pages (brands like to keep their messaging consistent, right?), content that is directly copied and is deceptive in its origin will be severely frowned upon.
Long story short, make it a point to populate your website with as much original content as you can. Avoid copying yourself and other people, and routinely check your website to see if any duplicated content does exist. Online tools such as SiteLiner are useful for identifying which pages on your site contain duplicated content, as well as specific areas where text is replicated.
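If you want a quick, do-it-yourself sanity check before reaching for a dedicated tool, the rough Python sketch below compares the visible text of a handful of your own pages and flags any pair that looks suspiciously similar. The URLs and the 80% threshold are placeholders rather than recommendations; tools like SiteLiner do this far more thoroughly.

```python
# A rough, illustrative check for near-duplicate copy across your own pages.
# The URLs below are placeholders; swap in pages from your own site.
from difflib import SequenceMatcher
from itertools import combinations

import requests
from bs4 import BeautifulSoup

PAGES = [
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/services",
]

def page_text(url: str) -> str:
    """Fetch a page and return its visible text, stripped of markup."""
    html = requests.get(url, timeout=10).text
    return BeautifulSoup(html, "html.parser").get_text(" ", strip=True)

texts = {url: page_text(url) for url in PAGES}

# Flag any pair of pages whose body text is more than ~80% identical.
for (url_a, text_a), (url_b, text_b) in combinations(texts.items(), 2):
    similarity = SequenceMatcher(None, text_a, text_b).ratio()
    if similarity > 0.8:
        print(f"Possible duplicate content: {url_a} vs {url_b} ({similarity:.0%})")
```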
A backlink is an incoming link that one website gets from another. Let’s say, for argument’s sake, that you run a successful donut business, and a prominent food review website posts a review on its blog that includes a link to your website. That’s a perfect example of a backlink.
More importantly, it’s a perfect example because it’s natural, relevant and comes from a ‘prominent’ source which is likely to have a high domain authority. These are all important factors when gaining backlinks, as Google factors in the quality of your links when ranking your website. This is because linking, in general, can be extremely useful to the end-user, and user experience is what Google is all about.
To give you an example of how not to do things: avoid building links with just any old website simply to increase your total number of backlinks, as links from low domain authority sites will dilute the overall quality of the link profile you have cultivated. At the same time, be wary of relying on links from a single domain, as many links from one domain carry much less weight than links spread across various domains.
A good way to kickstart things is to check what your competitors are doing. For this, there’s an online tool called SEO SpyGlass that enables you to compare your linking profile with your competitors’ and begin to formulate your own link-building strategy around these insights. Moz’s Link Intersect Tool does much the same thing, if you already have a Moz subscription.
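If you’d rather poke at the numbers yourself, here’s a small, illustrative Python sketch that counts referring domains from a CSV export of your backlinks. The file name and the "source_url" column are assumptions; adjust them to match whatever your tool of choice actually exports.

```python
# Quick look at how diverse a backlink profile is, using a CSV export of
# backlinks. The file name and "source_url" column are assumptions; change
# them to match the export format of the tool you use.
import csv
from collections import Counter
from urllib.parse import urlparse

referring_domains = Counter()

with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domain = urlparse(row["source_url"]).netloc
        referring_domains[domain] += 1

total_links = sum(referring_domains.values())
print(f"{total_links} backlinks from {len(referring_domains)} referring domains")

# Many links from a single domain carry less weight than the same number
# spread across different domains, so a very top-heavy list is worth a look.
for domain, count in referring_domains.most_common(10):
    print(f"{domain}: {count}")
```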
The way your website is built — the way your pages and subpages are organised, and the way information is thoughtfully structured for coherency — is as important to SEO as it is to the overall user experience. That’s because, at the end of the day, the two are intrinsically one and the same thing. If your website is a cluttered, unorganised jumble of pages that makes it difficult for humans to navigate, then don’t expect the web crawlers to fare any better.
The better your site structure, the easier it is for crawlers to index your site’s content and return it in the search results, and the better your position in those results will be.
Every good website is built on a solid foundation of pages that are structured in some form of ‘hierarchy’. This hierarchy forms the basis of your site’s navigation and URL structure, so it all begins right here.
The key is to ensure your page hierarchy is simple, minimal and well balanced. Each major category at the top needs to be distinct, while each sub-category needs to be directly related to its parent. Keep the number of main categories somewhere between 2 and 7, and try to keep the number of sub-categories roughly even across all main categories.
If your page hierarchy has a logical and coherent flow to it, then your URL structure will naturally follow suit — organised according to the way your pages are organised and with coherent wording and appropriate keyword coverage.
So let’s say, for example, you run Homer’s donut shop and your page hierarchy includes a ‘Location’ category with a sub-page for each branch.
The URL structure for the Shelbyville location would then look like this:
www.homersdonutshop.com/location/shelbyville
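To make the relationship between hierarchy and URLs concrete, here’s a minimal Python sketch that maps a simple page hierarchy onto URL paths. Only the location/shelbyville page comes from the example above; the other categories and the Springfield branch are invented purely for illustration.

```python
# An illustrative sketch of how a simple page hierarchy maps onto URL paths.
# Only location/shelbyville comes from the article; the rest is made up.
SITE = {
    "menu": ["donuts", "coffee"],
    "location": ["springfield", "shelbyville"],
    "about": [],
}

BASE = "www.homersdonutshop.com"

def urls(hierarchy: dict[str, list[str]], base: str) -> list[str]:
    """Build a URL for every category and sub-category in the hierarchy."""
    paths = []
    for category, children in hierarchy.items():
        paths.append(f"{base}/{category}")
        paths.extend(f"{base}/{category}/{child}" for child in children)
    return paths

for url in urls(SITE, BASE):
    print(url)
# ...prints, among others: www.homersdonutshop.com/location/shelbyville
```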
You want to make sure pages, especially the really important ones, aren’t buried too deep within your site structure. There’s a general rule of thumb that any page should never be more than 3 clicks away from your homepage — it’s not entirely gospel, but it’s a rule of thumb for a reason.
Web pages buried deep within your structure are a nuisance for users to find, which can lower your click-through rate and increase your bounce rate. They can also be problematic for web crawlers to index: search engines operate on a crawl ‘budget’, which can result in deeper pages not getting indexed at all. You’d be surprised at how much a simple tweak of URL structure could improve your position in search.
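If you’re curious how your own site measures up against the 3-click rule, the rough Python sketch below runs a small breadth-first crawl from the homepage and records how many clicks away each internal page sits. The starting URL is a placeholder, and a real crawl should also respect robots.txt and rate limits.

```python
# A rough sketch of checking click depth: breadth-first crawl from the
# homepage, recording how many clicks each internal page sits away from it.
# START_URL is a placeholder; a real crawl would also respect robots.txt.
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"
MAX_DEPTH = 3

site = urlparse(START_URL).netloc
depths = {START_URL: 0}
queue = deque([START_URL])

while queue:
    url = queue.popleft()
    if depths[url] >= MAX_DEPTH:
        continue
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    for link in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        target = urljoin(url, link["href"]).split("#")[0]
        # Only follow internal links we haven't seen yet.
        if urlparse(target).netloc == site and target not in depths:
            depths[target] = depths[url] + 1
            queue.append(target)

# Anything that doesn't appear here wasn't reached within MAX_DEPTH clicks
# and is a candidate for surfacing higher up in the structure.
for page, depth in sorted(depths.items(), key=lambda item: item[1]):
    print(depth, page)
```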
Get in touch to find out more.