This page contains the specifics of the discussion held at the September 26th, 2007 internet roundtable event for business entrepreneurs in the Upper Valley.

The Fundamentals of a productive Search Engine Optimized Website
by Clifton McNaughton – Business Services Consultants

Domain Name

  • Short names are better than long ones
  • Don't use more than 3 hyphens in the name; more is considered spam
  • Do your homework before buying an existing domain
    • It may be banned
    • It may have a bad history
    • Better to start with a new one when possible

Site Structure

  • Most important pages on root level
    • Put keyword rich content at root level
  • Plan directories and file names
    • Use key phrases - big win!
    • Don't use underscores; separate words with hyphens.
    • Stay relevant - does it mean something to the visitor?
    • Long, but not too long
      • Consider whether it will be used in emails - you don't want it to wrap
      • Consider whether tracking codes will be added
      • Search engines don't like dynamic additions to the URL (% $ ? etc.)
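As a sketch of the naming rules above (the domain and paths are made up for illustration):

```
Good:  example-realty.com/vermont-homes/lakefront-listings.html
Bad:   example-realty.com/page2.php?id=381&sess=9f3a        (dynamic characters, no keywords)
Bad:   example-realty.com/vermont_homes/lakefront_listings.html   (underscores instead of hyphens)
```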

Page Optimization

  • Optimize Entry pages (2nd and 3rd level pages)
    • Search Engines don't always index the home page
    • Optimize and submit those pages
    • Make them stand alone like the home page
    • All entry pages should point to all other entry pages to encourage Search Engine spiders to follow
  • Ideally have pages linked from two different places (i.e. navigation and site map)
  • Have Unique Content
    • Page specific title
    • Separate pages only when there is separate content
  • Avoid Spam
    • Excessive doorway pages/Domains
    • Keyword stuffing
    • Hidden Text/Links
    • Link Farms/Massive Domain Interlinking
    • Cloaking

Site map

  • The site map is special and useful as an entry page
  • The site map is food for hungry spiders
  • It can have links that are not on the homepage
  • Robots will follow them and index the pages

404 error pages

  • Use logo and navigation
  • Or logo and a site map
  • Leave a way out for the spider
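One common way to serve a custom 404 page with your logo and navigation is a server directive. A minimal sketch, assuming an Apache server (the file name is illustrative):

```apache
# .htaccess - send visitors and spiders to a custom page that keeps navigation intact
ErrorDocument 404 /404.html
```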

Splash / Intro pages

  • Avoid the use of splash pages or intro pages
    • Usually no HTML, just a movie (which search engines don't understand)
    • A delay tactic disliked by users
    • Only 1 or 2 links for spider to follow if it can find them

Avoid Re-directs

  • Search engines see a re-direct as an attempt to spam them
  • Use techniques other than a re-direct for tracking
  • No meta tag re-directs or JavaScript re-directs
  • If it's a genuine re-direct, use a 301 redirect
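When a redirect is genuinely needed, a server-side 301 tells search engines the move is permanent, unlike meta-refresh or JavaScript tricks. A minimal sketch, assuming Apache and made-up paths:

```apache
# .htaccess - permanent (301) redirect, preferred over meta-tag or JavaScript re-directs
Redirect 301 /old-page.html http://www.example.com/new-page.html
```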

Good page structure

  • Avoid, when possible, the use of frames
  • Always have a text link navigation
  • Use a robots.txt file
  • Avoid Session Ids and long URLs

The rewards are in the details.

Keywords
Choose the right ones. Use the Overture keyword tool together with the number of results on Google to find out which keywords are searched for and how many other websites are targeting them. Be realistic about which keywords your website can rank for; if you have a small website, it will be difficult to rank well for a highly competitive keyword.

Frames
I personally advise people not to use frames unless absolutely necessary. Frames tend to cause problems with search engines, bookmarks and so on, because frames don't fit the conceptual model of the web (every page corresponds to a single URL). If you use them, make sure you use the NOFRAMES tag and link every content page to your frameset index page.

Clean code
If your code is messy, it can make it very difficult or even impossible for a search engine to properly see the content of the page.

Flash
At the moment only Google seems to be able to index Flash files, and how much or how little content it sees is unknown. Until search engine technology can handle Flash as standard, it is advisable to avoid it. You can use Flash elements embedded on a page; if you choose to have your page navigation in Flash, make sure you replicate it in plain HTML so that search engines can see it and follow it.

Dynamic URL
Although Google and Yahoo are able to crawl complicated URLs, it is still advisable to keep your URLs simple and avoid long query strings. Do not use session IDs in the URL, as these can either create a 'spider trap' where the spider indexes the page over and over again or, at worst, keep your pages from being indexed at all. If you do need to include parameters in the URL, limit them to two, with ten or fewer characters per parameter. The best SEO solution for dynamic URLs is to use mod_rewrite.
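The mod_rewrite approach can be sketched like this (Apache; the path and script names are illustrative):

```apache
RewriteEngine On
# Map the friendly URL /listings/123 onto the real dynamic script,
# so visitors and spiders never see the query string
RewriteRule ^listings/([0-9]+)$ /listing.php?id=$1 [L]
```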

Title tag
The title tag is probably the most important thing for improving your website's search engine results. Make sure you include the chosen keywords in it and place them near the beginning of the title tag.

Meta tags
There are several possible meta tags that may be included in the head of a web page but the only ones worth using for search engine optimization are the "Description meta tag" and the "Keywords meta tag".
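Putting the title tag and the two useful META tags together, a page head might look like this (the wording and names are illustrative):

```html
<head>
  <!-- Keywords near the start of the title; unique for every page -->
  <title>Vermont Lakefront Homes for Sale | Example Realty</title>
  <meta name="description" content="Browse lakefront homes for sale across Vermont, with photos, prices, and town-by-town listings.">
  <meta name="keywords" content="vermont lakefront homes, vermont real estate, lakefront listings">
</head>
```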

Navigation
Search engines follow links in order to find and index pages, and they love straight HTML links. When creating the navigation for your site, bear in mind that Flash navigation, JavaScript navigation, image maps and drop-down navigation will create problems for search engines.
If you use any of the above, make sure you replicate the navigation as standard HTML links in the footer of your page.

Content
When writing the content, speak the customers' language. Provide detailed information and make sure the content supports the subject. The keywords chosen for the titles should appear in the page content with a density of 4 to 8% - within a heading tag, in bold, in italics, in link text (Google loves it), and so on.
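The 4-to-8% density figure can be checked mechanically. A minimal sketch in Python - the formula (phrase occurrences times words in the phrase, over total words) is one common convention, and the sample text is made up:

```python
import re

def keyword_density(text, phrase):
    """Rough keyword-density estimate: occurrences of the phrase times the
    number of words in the phrase, divided by the total word count."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    if not words or n == 0:
        return 0.0
    # Slide a window over the page's words and count exact phrase matches
    hits = sum(
        1 for i in range(len(words) - n + 1)
        if words[i:i + n] == phrase_words
    )
    return 100.0 * hits * n / len(words)

sample = ("Vermont real estate listings. Browse Vermont real estate "
          "by town, price, and acreage.")
print(round(keyword_density(sample, "real estate"), 1))  # prints 30.8
```

A density that high would itself look like keyword stuffing on a full-length page; on real content, aim for the 4-8% range discussed above.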

Links
If possible use text links; they allow the use of keywords and are loved by Google. Don't use too many (Google recommends no more than 100 links on any given page), as that may lower the relevance of the page.

What to do

Select the Right Keywords
The foundation stone of your search engine optimization campaign is selecting the proper keywords; spend some time on keyword research to lay the base of the optimization process properly. Make sure your selected keywords match the context of your business. Keywords must be relevant and should reflect your site's content.

Proper META Tags
META tags are significant from several perspectives. First, they feed your site's topics to the search engines: on Google's result pages, the site description (the two lines below the link) is taken from the description META tag, so a properly written description takes visitors to your site. Second, META tags can also tell crawlers how to handle your pages (the robots META tag, which works alongside the robots.txt file), tell the browser how it should read a page, and more.

Proper TITLE Tag
The title tag is the most important tag, as it becomes the anchor - the bold link on the search engine results page. It is your selling point and reflects what your page is all about. It plays a major role in driving traffic to your site because it is the hook that gives a searcher a quick impression of your site, and it determines whether your site is worth visiting or not. So the title tag must reflect the page contents and should draw the attention of the visitor. The title tag should be different for every page in your site!

Keyword-rich Content
Content means the text and data presented on that particular page. The content must be relevant to the description of the page used in the META tags. It should be descriptive and must keep a keyword focus in order to build the relevancy of the page. Your SEO campaign must deploy good copy-writing techniques, which may include formatting keywords in bold text and using header tags to start new subjects on the page.

Use of the ALT Attribute
Search engine crawlers are not able to read images; they only read text, so images and graphics should use the ALT attribute of the IMG tag. It guides both the visitor and the search engines when an image is not displayed, and it is used to index the image in image search. The ALT attribute must be appropriate; it may be used to reinforce keywords, but keep in mind that using the ALT attribute merely to repeat your keywords may be treated as spam.
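For example, a descriptive ALT attribute on an image (the file name and text are illustrative):

```html
<!-- Descriptive and specific, not a pile of keywords -->
<img src="lakefront-home.jpg" alt="Lakefront home for sale in Burlington, Vermont">
```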

Relevant Anchored (and/or Titled Anchored) Linking
Proper linking is a crucial part of achieving high rankings, and improper or irrelevant linking may draw a penalty from search engines. So while linking your pages, take into account what text you use as the anchor text in the link (the text a user clicks to see the page). The anchor text must reflect the content of the page it links to. The title attribute should be used when navigation is image based. Contextual linking is now a Google favorite. These are links embedded in the content of the page, not in a navigation list. They are considered more relevant to the search results because they are surrounded by a discussion of a topic, and the link is assumed to give more relevant information about that topic. This makes your site an "authority" on that topic, and search engines want to rank authority websites at the top of their results.

Theme of Your Website
A website is a collection of related web pages that are inter-linked with each other. As a collection, it must have a common theme and page appearance. Remember that your website is for visitors/clients first, not for the search engines - but you optimize it for the search engines. Theme is about integrity between the web pages and the ease of finding them. Follow the basics of a good speech: it needs a title, a description, the details, and a summary with a chance to get more information.

The theme needs to be consistent. The navigation should be clear and easy to follow, and should not change unless you are directing the user to a call to action (purchase a product, get more information).

SE Friendly Site Structure
Your website must be easy for search engines to crawl and index; there should be no complex directory structure or links that cannot be followed. A good resource for seeing how your site will be viewed is the W3C, or World Wide Web Consortium. They offer three tools that are a must for knowing whether your website is healthy. First, use the markup validation service to check your code. There is a 'show outline' option that will display the page as an outline, letting you see how the search engines will view your site in the context of its theme.

Next, use the CSS validation service to see if your site is going to look good in all web browsers. Most documents on the Web are written in a computer language called HTML. This language can be used to create pages with structured information, links, and multimedia objects. For color, text, and layout, HTML uses a styling language called CSS, short for "Cascading Style Sheets". What this tool does is help people authoring CSS check, and fix if necessary, their style sheets.

Last, but equally important, is the link checker. This will go through your site and check every link to make sure it works.

Proper Sitemap
A sitemap is significant for visitors and search engines alike. It not only provides a road map so the visitor can go directly to the desired page without wasting time, but also guides a visitor who gets lost surfing different pages. The sitemap is also important for search engine discovery: when a search engine locates the sitemap it can crawl the links quickly, which helps a website rank better in the search engine results. You can also submit your sitemap directly to Google, helping to ensure all the pages are indexed.
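The sitemap you submit to Google is an XML file, distinct from the human-readable site map page. It looks roughly like this (the URLs and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-09-26</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/real-estate-websites.html</loc>
  </url>
</urlset>
```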

Quality Back Linking
The backbone of the whole search engine optimization process is back linking, and relevant, quality back linking is the key. Back linking involves getting other related websites to link to your website; if the linking text contains your keyword(s), it is even more beneficial. While getting links to your site, take precautions so you are not seen as spamming your links and penalized by search engines. Avoid link farms! A link pointing to your website should be unique and coupled with keywords in the anchor text to bring more credit from search engines; this not only gets traffic to your website but also brings higher rankings in search engine results pages. Also, as the links on a page grow older they become more valuable, especially to Google.

On-Going Content
Content is king in the search engine optimization process, because search engines give significantly more importance to the text written on the page. This is the details part of the theme structure: here you explain your product or service and also establish yourself as an authority on it. The integrity and consistency of the theme, and how relevant and unique the text is compared with the millions of other existing web pages, will determine your position in the search results. Search engines expect your content to be unique and consistent with all the other pages on the site. The content of any web page must be current and must keep growing, because fresh content brings more value to the credibility of your site with search engines.

The first thing most search engines look for is the robots.txt file. This file contains directives for the web crawler to follow: it says which areas of the site should be indexed and, just as importantly, which areas should not be. Any web pages that contain sensitive data, or information you do not want exposed to the public, should be restricted here. Without a robots.txt file the web crawler will attempt to index the whole site and could get caught on a page that has no navigation associated with it, such as an include page. Once stuck, the crawler will stop indexing and leave the site, which causes you two problems: first, the site will be considered poorly functional, and second, a lot of your valuable content may never be seen. The consequence is a poor ranking in the search engines.
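A minimal robots.txt along the lines described above (the directory names and domain are illustrative):

```
# Allow everything except areas that should stay out of the index
User-agent: *
Disallow: /includes/
Disallow: /admin/
Sitemap: http://www.example.com/sitemap.xml
```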

What not to do

Keyword Stuffing
Search engines are getting more and more intelligent in the way they crawl and filter the content of web pages. I recommended writing keyword-rich content, but it must not be a mess of keywords without context. Content or META tags stuffed full of keywords are surely spam and must be avoided. Keywords should be used in a reasonable manner in proper sentences, which gives them a relevant meaning in context. This not only helps enforce overall relevance but also automatically focuses the targeting of your keyword phrases. Using bold text on the page raises the importance of a word or phrase and automatically marks it as a keyword, so use care and consideration when bolding text, and be sure it is part of your keywords META tag.

Duplicate Content
A web page normally contains plenty of textual data, which is a key point for any search engine. Content plays a vital role in gaining and losing rankings. Webmasters often create several web pages to grow the content of the website by duplicating the content of older pages, or create multiple duplicate sites on different domains. When search engines encounter such a situation they simply penalize those websites for copying content; those who have tried to play tricks with search engine algorithms have eventually found themselves blacklisted and thrown completely out of the search results. Avoid this tactic.

Hidden Text
Similar to duplicating content, many people try to be clever by hiding text, either with a CSS hidden div or by matching the text color to the background color; this is considered a malicious approach. When webmasters stuff a lot of keywords into hidden text hoping to draw the crawler's attention in their favor, the result is the opposite: blacklisting. Search engines detect such spamming techniques very quickly, because hidden text is in fact invisible only to the visitor - it is quite obvious to the crawler and can easily be identified. In response, search engines heavily penalize or remove those sites or pages.

Heavy Graphics & Flash Animations
A website is normally a blend of text and images. A website that is only textual may bring you good search results but is boring to the user, so you do not get results from the page. If the website is loaded with lots of images, it has minimal visible content for a search engine unless the ALT text is used properly. Remember that search engine crawlers cannot read images, but they do read the alternate text on them. So if the website has heavy graphics in the body - navigation buttons and the like - it may not do well. To avoid this, convert those navigation buttons to text, or provide alternative textual navigation such as duplicating the links in the footer of the page, and add ALT attributes to images. Instead of images, CSS can be used to create a button look, which reduces the markup and provides compatibility across web browsers. Similarly, heavy Flash animation is not recommended. Most search engines cannot extract text from Flash animation, which causes a snag when indexing a website. Therefore Flash animation as navigation should be avoided: though it draws the visitor's attention, it is of no benefit from the search engine's perspective.
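The CSS-button idea above can be sketched like this: a plain text link styled to look like a button, so crawlers still see an ordinary HTML link (the class name and colors are made up):

```css
/* A text link styled as a button - no image, so the link text stays crawlable */
a.nav-button {
  display: block;
  padding: 6px 12px;
  background: #336699;
  color: #ffffff;
  text-decoration: none;
  border: 1px solid #224466;
}
```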

Wide-Ranging Markup & Heavy JavaScript
Search engines weigh the amount of markup against the amount of text on the page as a factor in their ranking. In a web page, the HTML should be kept lean by using CSS. The crawler will parse everything in a web page, and if there is extensive markup it may drown out the value of the content. Using HTML tags again and again for formatting simply makes it harder for the crawler to reach the content. The same goes for JavaScript: used in bulk, and at the beginning of the web page, it is not helpful. Use external JavaScript and CSS files to reduce the markup size.

Invalid Markup
Invalid markup means that the HTML code of the web page does not meet the defined standards and is erroneous. When the crawler finds errors in the markup, it diminishes the credibility of the site. The importance of valid markup is quite obvious: the content is part of the markup, and the markup is the backbone of your website. Therefore the markup of every web page must be validated, as discussed above.

Lengthy & Non-Friendly URLs
Search engines like simple and friendly URLs. Complex URLs that include long query strings are not appreciated by crawlers, while a URL made of plain, readable words is given more credit because it is understandable. In cases where the URL needs to go deeper to define variables, it is better to rewrite the query string into a simple path. This is simple for the visitor but requires more intelligence on the page to complete the query and match the content; the results, however, are amazingly positive in getting these pages indexed correctly. Therefore URLs must be simple, short, and keyword-relevant.

Improper Site Structure
Site structure is often exposed through internal linking. A proper, consistent website structure that shows the integrity between the pages is an important key to being selected by the search engines. Good, comprehensive internal linking between the pages of the website builds a cohesive organization; if keywords are used in the anchor text, the linked pages are more likely to get indexed properly and, more importantly, used by your visitors. After all, if the site is not easy to use, no one will use it!

Bad & Irrelevant Back Linking
Back linking is a critical factor in any search engine optimization campaign and is given high worth by the search engines. But back linking done without considering the factors involved will result in a penalty. Bad or irrelevant links are links that are out of context with the content of your page. For example, suppose a website related to real estate is linked with another website related to casinos: what is the relationship or degree of relevance between the two sites? Of course there is none. Such back linking is irrelevant and may be taken as spam.
On the other hand, if there is a page on your site about area attractions and entertainment, a casino link makes sense - but the back link must go to the attractions page, not the home page.

Multiple Mirror Websites
Just as duplicating content is highly discouraged by search engines, having multiple websites with the same content but different domain names - mirror websites - is considered spam. Some people also use redirection extensively, redirecting from one domain to another, which is not worth doing.

Doorway Pages
Doorway pages are pages stuffed with keywords and used to try to fool the search engines. They fool the crawler by putting a lot of keywords in anchor text just to build relevance; when the page is visited for those anchored keywords, a redirect is made to another web page. This is easily detectable by the search engines, so to avoid being penalized, never use doorway pages.
