Search Engine Optimisation can be a rather confusing concept to understand and implement. On the flip side, SEO can be simple. Once you are comfortable with the basics, you can become a wizard in the field in no time.
In order for your site to appear favourably in search engines, there are key elements to adhere to in order to maximise trustworthiness and ranking. Here I will delve deeper into the mystical and sometimes perplexing basics to give an insight into what you should be doing.
It is almost a given that webpages that contain targeted keyword(s) within their title tags generally rank higher in search engines. This is because search engines decipher the content and topic of a page using its title tags. The correct keyword(s) embedded in title tags helps search engines to associate the page with the search query and can result in your page ranking higher.
Keywords should be implemented somewhere near the beginning of the title tag. Keywords that match a search query will appear bolded in the search engine results page (SERP) and it is widely believed that this practice helps to improve click through rate (CTR).
Getting the length of your title tag right can be crucial. If it is too long, its efficacy will become diluted and it will not carry as much authority. On top of this, title tags that are excessive in length will be truncated with an ellipsis in SERPs, so part of your tag will not appear at all. Aim for title tags of around 54 characters in length, or use 512 pixels wide as a rough guide.
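Putting this into practice, a title tag for a hypothetical Derby web design agency (the business name and wording here are purely illustrative) might look like this, with the target keywords placed at the beginning:

```html
<head>
  <!-- Target keywords lead the tag; the whole title stays around 54 characters -->
  <title>Web Design Derby | The Web Design Group</title>
</head>
```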
Historically speaking, keywords within the H1 was a must for SEO and was synonymous with ranking well in past years. Many SEO whizzes still believe that placing keywords in the H1 can boost search engine rankings.
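Keeping with that convention, such a keyword-bearing heading might look like this (the wording is illustrative only):

```html
<!-- The page's main H1, carrying the target keywords -->
<h1>Web Design Services in Derby</h1>
```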
Perhaps the most obvious of keyword placements is in the actual copy of the page. You’re probably wondering why I state the obvious! Sometimes this vital aspect can be skipped, particularly if the content has been written by an outsourced copywriter.
Closely related to ensuring content is keyword inclusive is the idea of keyword density. Overuse of keywords can give a spammy feel to a webpage. Let’s say that I want to rank for “web design”:

“Web design requires many skills. Web design comprises of many elements. Web design areas include SEO, graphic design and coding. Many people want to make their website better through better web design.”
Not only is it a basic English no-no, trying to hammer home your keywords will reduce the quality of your content, and search engines may deem that you’ve attempted to cheat the system. I’d suggest a keyword density of around 3-5%.
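As a rough way to check this, keyword density can be computed by counting how often the phrase appears relative to the total word count. The helper below is a hypothetical sketch, not a standard SEO tool:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword phrase's density as a percentage of total words."""
    words = re.findall(r"[a-z']+", text.lower())
    kw_words = keyword.lower().split()
    n = len(kw_words)
    if not words or n == 0:
        return 0.0
    # Count occurrences of the keyword phrase across the word list.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw_words)
    # Each occurrence accounts for n words of the total.
    return 100.0 * hits * n / len(words)

copy = ("Web design requires many skills. Web design comprises of many "
        "elements. Web design areas include SEO, graphic design and coding. "
        "Many people want to make their website better through better web design.")
print(round(keyword_density(copy, "web design"), 1))
```

Run on the spammy sample above, it reports a density of 25%, far beyond the 3-5% range suggested.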
You probably know by now that keywords play a massive role in relation to SERPs, and a related idea follows on from this. Search queries are usually carried out using multiple words. Keyword proximity is the distance between the matching terms on a webpage.
For example, take the search “web design derby” and the returned sentence “The Web Design Group are a web design and development agency based in Derby”. The proximity between “web” and “design” is zero as they are placed side by side. The proximity of “design” and “Derby” is five as there are five words separating the two. Search engines deem higher proximities as less relevant to search queries, so this is definitely something to think about when creating your content.
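The counting rule above can be sketched in code. This `keyword_proximity` helper is purely illustrative, not a quote from any real tool, and simply takes the smallest separation between any pair of occurrences:

```python
def keyword_proximity(text: str, first: str, second: str) -> int:
    """Smallest number of words separating any occurrence of `first`
    from any occurrence of `second` (0 means they sit side by side)."""
    words = [w.strip(".,!?\"'").lower() for w in text.split()]
    pos_first = [i for i, w in enumerate(words) if w == first.lower()]
    pos_second = [i for i, w in enumerate(words) if w == second.lower()]
    return min(abs(i - j) for i in pos_first for j in pos_second) - 1

sentence = ("The Web Design Group are a web design and development "
            "agency based in Derby")
print(keyword_proximity(sentence, "web", "design"))    # 0 - adjacent
print(keyword_proximity(sentence, "design", "Derby"))  # 5
```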
More recently, search engines and Google in particular, have begun to penalise webpages with little content or that are weak in substance. There’s no guideline as to how many words should be on your page though. Use as many words as necessary to get your message across without going overboard and droning on.
An image “Alt” is generally used for a brief description of an image should the image fail to load correctly. It is widely accepted however, that an image “Alt” can help a search engine determine the subject of a webpage. Inserting keywords into these descriptions in an unforced way can boost your rankings.
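An alt attribute carrying keywords naturally might look like this (the filename and description are hypothetical):

```html
<img src="images/responsive-site.jpg"
     alt="Responsive web design shown on desktop and mobile">
```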
Make sure that your Meta keywords are implemented where applicable. These are no longer used by Google for searches but are still important for other search engines. Including them will not negatively impact your rankings within Google.
Proofreading text that is on your webpage is critical. The big search engines disapprove of poor English and can mark sites down for bad grammar and spelling.
Identical page titles and Meta Descriptions across different pages will be flagged as an HTML error and can result in duplication penalties from Google. Ensure that your pages have unique title tags and Meta Descriptions to avoid this.
Outbound links are part and parcel of link building. Linking to high quality, trusted sites that are relevant to your page can do you no end of good for your SEO strategy. If you have outbound links on your site that you are unsure about, you should add the NoFollow attribute to the anchor tag.
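A NoFollow link simply carries the rel attribute on its anchor tag, for example (the URL here is a placeholder):

```html
<a href="http://example.com/unverified-resource/" rel="nofollow">An unverified resource</a>
```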
Similar to the link building process, internal linking to pages on your site can have a beneficial outcome. Google has gone on record in the past as acknowledging that internal links boosts rankings.
404 errors are reported by search engines when they attempt to crawl links to discontinued pages. Fixing 404s can be as easy as using a 301 redirect that tells search engines that the page in question has been relocated to a new URL that should be indexed instead.
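On an Apache server, for instance, a 301 can be set up with a single line in the .htaccess file (the page paths here are hypothetical):

```apache
# Permanently redirect the discontinued page to its new URL
Redirect 301 /old-page.php http://domainname.com/new-page.php
```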
As previously mentioned, search engines are not keen on duplicate material. The canonical link tag can control duplicate content and tell Google the URL to be indexed. You may have two versions of /index.php indexed, for example:

http://domainname.com/index.php
http://domainname.com/index.php?id=122
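Placing a canonical link tag in the head of both versions tells search engines which URL should be indexed:

```html
<link rel="canonical" href="http://domainname.com/index.php">
```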
SEO relies heavily on the robots.txt file to provide instruction to search engine robots on crawling and indexing. The “Allow” and “Disallow” commands can be used to grant and deny access to areas of your site. Generally speaking, the “Allow” command will be used site wide at root level and “Disallow” to prevent indexing of areas such as admin pages and directories.
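A typical robots.txt along those lines might read as follows (the directory names are illustrative):

```text
User-agent: *
Allow: /
Disallow: /admin/
Disallow: /private-files/
```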
Search engines use the sitemap.xml file to crawl pages on your website. Your pages should be listed in this file and submitted to search engines in order to appear in SERPs.
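A minimal sitemap.xml listing two pages could look like this (the page URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://domainname.com/</loc></url>
  <url><loc>http://domainname.com/services.php</loc></url>
</urlset>
```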
A blocked page won’t appear in SERPs. It’s that simple. They won’t be indexed and the content you have will not be visible to people making a query. Make sure that you have no blocked pages across your website. This can be done by using the Meta Robots tag and the Robots.txt file.
If your Meta Robots tag looks like this, <meta name="robots" content="noindex, nofollow">, then with its “noindex” and “nofollow” commands the chances are your page will not rank.
Does your Robots.txt file read like this?

User-agent: *
Disallow: /page.php
Disallow: /directory/

If so, the Disallow commands are informing search engines not to crawl these areas of your site. A bad idea if you want these pages to be ranking well.
Gone are the days where every website used www. to begin their web address. You don’t have to have it in yours and it makes no difference to rankings but you cannot use both versions. Search engines will deem them as different websites and as previously mentioned, duplicate content is bad news for SEO.
http://www.domainname.com/ would be treated as an entirely different site to http://domainname.com/. Not only that, every single page of your site would be a duplicate of another, rendering it abysmal by SEO standards. Your hosting server can be set up with a redirect to avoid this.
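On an Apache server, for example, a .htaccess rule can 301-redirect every www. request to the non-www version (this assumes mod_rewrite is available; swap the two hostnames if you prefer the www. version):

```apache
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.domainname\.com$ [NC]
RewriteRule ^(.*)$ http://domainname.com/$1 [R=301,L]
```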
I built my first website in 1999 and have been working in web agencies since 2004. I have many years of experience in SEO, PPC and Web Design.
Unlike most agencies you may have spoken to, I provide an initial 3 month trial period for SEO and PPC. Following this period, we will sit down and review the performance and your business goals.
Contact me today to arrange your FREE no obligation SEO or PPC audit. I will highlight any potential stumbling blocks and discuss how we can overcome them.