Introduction – in simple terms, SEO is the process of increasing the number of visitors to a website via search engines. By optimising your website around the specific key phrases used by your target customers, it's possible for search engines to rank your website more highly than similar competitive sites that are not optimised. SEO should be viewed as one component of your overall professional internet marketing strategy, used ethically to improve the quality of your visitor experience in accordance with search engine guidelines and standards. The first step is to understand how search engines work…
Search Engine Basics – a search engine is a website that allows anybody to enter a search query and retrieve information from billions of web pages, files, videos, images and music files. Most people have heard of Google, Yahoo and MSN, but there are also literally hundreds of less well-known specialist search engines providing similar services. When you visit a search engine, results are traditionally displayed as blue links with a short description of each website, relating directly to the user's search query. Search engines evolved from large directory projects such as DMOZ and the Yahoo Business directory. In the early to mid 1990s, search engines started using crawling technology to trawl the ever-increasing number of websites being developed. Today, search results from Google, Yahoo and MSN also appear in other minor search engines such as AOL. 80% of people find information on the Internet via a search engine because search engines are easy to use, flexible and provide highly relevant links to Internet content.
How Do Search Engines Work? – search engines use automated mathematical algorithms to rank and compare web pages of similar content. The algorithms are highly complex and rely on search bots continually trawling the Internet to copy, or 'cache', every webpage they visit. Search bots automatically look for specific information when visiting a website, such as the robots.txt file, the sitemap.xml file and WHOIS data. They do this to find new content quickly and ensure that the listings presented to users are up to date and relevant. The data is stored by the search engine company in huge server data centres. The exact mathematical formulae of each search algorithm are jealously guarded by the search engines, so only analysis of historical data can be used to make some general assumptions about how their rankings work. In addition, each engine publishes webmaster guidelines giving general guidance on how to create a quality site and avoid techniques that may get a website banned from its listings by its moderators.
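To illustrate the first thing a search bot checks, Python's standard library includes a robots.txt parser. The rules below are invented for illustration; a real site publishes its own policy, and a polite crawler consults it before fetching any page.

```python
from urllib.robotparser import RobotFileParser

# A made-up robots.txt policy: block the /private/ area, allow the rest.
robots_txt = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A crawler asks before each fetch whether the path is permitted.
print(parser.can_fetch("MyBot", "https://example.com/products.html"))      # True
print(parser.can_fetch("MyBot", "https://example.com/private/data.html"))  # False
```

The same check is what well-behaved search bots apply at scale: disallowed paths are simply never cached.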
How Do Search Engines Present Relevant Results? – historically, the primary factor search engines used to rank websites was the number of links a website had from other websites, known as inbound links. As search engines grew more popular, link farms developed to try and manipulate the results, and to combat this the algorithms became more sophisticated. Today, links alone are less important; instead, the textual relevancy of the words, paragraphs, pages and the entire theme of a website is critical to achieving high search engine rankings. Search engines employ advanced anti-spam factors to ensure that users are presented with the most relevant, highest-quality results possible for their search. More recently, search engines have been diversifying into different means of search, such as image, video, universal and local search and product and price comparison, as well as developing free online applications such as calendars, spreadsheets and word processors.
Key Phrase Analysis & Selection – the next step is to identify the keywords related to your product or service that your target prospects are typing into search engines. Only then can you begin to effectively strategise, design and optimise a website around your buyers' needs and wants. Key phrase selection is the first and most important step in internet marketing. Why is this relevant? Search engines use mathematical algorithms to compare web pages in order to rank those pages against a user's search query. If you make incorrect assumptions (without researching) and target key phrases that don't interest buyers, your website will fail. Conversely, if you target the right combination of keywords and phrases (before you even design your website), you will maximise your chances of higher search rankings and create an opportunity to sell. The bigger the market for a particular product or service, the more competitive the online marketplace is for the related search terms. For instance, a quick check in Google reveals approximately 3.44 million search engine results for the term 'mortgage', yet only 0.217 million results for the phrase 'discounted commercial mortgage quote'. In other words, the former is roughly 16 times more competitive for a top search engine position than the latter. By using keyword selection tools, advertisers can identify which search terms are not only popular but also how competitive they are. For instance, there are approximately 37.2 million people typing 'mortgage' into all global search engines per year, yet only 0.8 million typing 'commercial mortgage'. Keyword tools are invaluable for identifying a range of niche search terms that can be used to optimise a website for higher search engine rankings and more website visitors.
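The competitiveness comparison above is simple arithmetic. A quick sketch using the article's own figures (these result counts are snapshots and will have changed):

```python
# Approximate Google result counts per phrase, taken from the text above.
result_counts = {
    "mortgage": 3_440_000,
    "discounted commercial mortgage quote": 217_000,
}

# Ratio of result counts: a rough proxy for relative competitiveness.
ratio = result_counts["mortgage"] / result_counts["discounted commercial mortgage quote"]
print(f"'mortgage' is ~{ratio:.0f}x more competitive than the niche phrase")
```

The same ratio can be computed for any pair of candidate phrases to prioritise the niches worth targeting first.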
These tools can also produce derivatives, synonyms and common spelling mistakes, as well as comparative competitiveness indices showing whether top search listings for a particular phrase are hard or easy to achieve.
Once you have used your market knowledge and keyword tools to validate the search volumes of phrases, make a list of your top 10 phrases, ranked by search volume. Invariably there are derivatives of your top ten target phrases. For instance, if your primary target term is ‘mortgage quote uk’, you might also identify ‘uk fixed rate mortgages’ and ‘mortgage broker uk’ as secondary phrases. Your prospects will type in hundreds of similar search queries to find a particular product or service. By creating optimised web pages with relevant content that includes these phrases, you maximise your chances of increased search engine traffic. You will need to continually update the list, check your site logs and statistics, and test using keyword tools. From this feedback, by checking which search phrases and entry pages were used to enter your site over time, you can easily see how your optimisation efforts are progressing. Key phrase selection should literally dictate the choice of domain name for new sites, website design, navigational structure, unique selling points and linking strategy. Always add a couple of optional fields to your contact form to get feedback from your website visitors: ‘which search engine did you use to find our site?’ and ‘what search term did you use?’
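Checking which search phrases brought visitors to your site can be sketched by parsing the query string of the referrer URLs in your server logs. Historically, Google passed the term in the `q` parameter and Yahoo in `p` (note that modern engines often strip search terms from referrers); the log entries below are hypothetical.

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical referrer URLs as they might appear in a web server log.
referrers = [
    "https://www.google.com/search?q=mortgage+quote+uk",
    "https://search.yahoo.com/search?p=uk+fixed+rate+mortgages",
    "https://www.google.com/search?q=mortgage+broker+uk",
]

def search_term(referrer: str):
    """Extract the search phrase from a referrer URL, if one is present."""
    query = parse_qs(urlparse(referrer).query)
    # 'q' is Google's search parameter; 'p' was Yahoo's.
    for key in ("q", "p"):
        if key in query:
            return query[key][0]
    return None

terms = [search_term(r) for r in referrers]
print(terms)
```

Tallying these extracted terms over time shows which of your target phrases are actually delivering visitors, closing the feedback loop described above.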
Competitor Analysis – assuming you have a product or service with some unique selling points, you need to analyse your online competition so you can present your USPs effectively. From an internet point of view, you need to analyse what other websites have achieved relative to your own. The first step is to identify who they are and make a list. This is simple enough: enter all of your primary and secondary key phrases into the major search engines and build up a list. Similar key phrases usually bring up the same websites, and you will very quickly understand which sites you need to knock off the top of the search engine results to succeed. Updating this list on a regular basis is as important as the initial analysis. By obtaining feedback from search results you can constantly re-analyse competitors in terms of additional links or content they have added, or analyse how they have restructured and reorganised their websites to make them more search engine friendly. Review each site carefully to analyse your competitors on the following basis:-
Keyword / Keyphrase Density Analysis – make a list of your competitors' keywords (as per above) to validate your own analysis.
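A keyword density check can be sketched as the proportion of a page's words taken up by occurrences of a phrase. The sample text and phrases below are invented; real tools work the same way over a competitor's rendered page text.

```python
import re

# Invented sample of a competitor page's visible text.
page_text = """Looking for a mortgage quote? Our UK mortgage brokers compare
fixed rate mortgage deals to find you the best mortgage quote available."""

def keyword_density(text: str, phrase: str) -> float:
    """Percentage of the page's words accounted for by the phrase."""
    words = re.findall(r"[a-z]+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count occurrences of the phrase as a consecutive run of words.
    hits = sum(
        words[i:i + n] == phrase_words
        for i in range(len(words) - n + 1)
    )
    return 100 * hits * n / len(words)

print(f"{keyword_density(page_text, 'mortgage'):.1f}%")
print(f"{keyword_density(page_text, 'mortgage quote'):.1f}%")
```

Running this over your own pages and your competitors' pages for the same target phrases makes the comparison concrete rather than impressionistic.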
Pagerank Checkers – seochat provides a useful tool to look up the PageRank of multiple sites at once, which helps if you have a large number of competitors.
Supplemental Page Checker – use tools to check the proportion of ‘less important, less highly ranked’ pages of a website in Google’s ‘Supplemental Index’ versus its main index.
Search Engine Exposure – rather than visiting each search engine individually, you can visit websites like netconcepts that provide tools to measure how many internal pages of a site have been cached by each major search engine.
Whois & Contact Forms – the WHOIS databases allow you to match website owners with real-world businesses that are (perhaps) using multiple sites to boost sales possibilities.
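Raw WHOIS output is plain text, so matching several domains to one owner can be sketched by scanning the response for registrant fields. The response below is a made-up example, and real field names vary widely between registries, so a practical tool needs to handle several label variants.

```python
# Made-up WHOIS response; real registries format their output differently.
sample_whois = """\
Domain Name: EXAMPLE-MORTGAGES.CO.UK
Registrant Organisation: Acme Mortgages Ltd
Registrant Country: GB
"""

def whois_field(response: str, field: str):
    """Return the value of a 'Key: value' line in WHOIS output, if present."""
    for line in response.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            if key.strip().lower() == field.lower():
                return value.strip()
    return None

print(whois_field(sample_whois, "Registrant Organisation"))
```

Collecting the registrant organisation for each competitor domain quickly reveals when several apparently separate sites belong to the same business.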