You may have heard it said that having a search engine friendly site is key to success on the web. That’s fine and dandy, but what in the world is a search engine friendly site? Find out how to code your site to make a difference in how well you rank in the search engines. Learn what tags and attributes are essential to gaining those top rankings.
Even if you know nothing about HTML, this session will arm you with the knowledge to help your programming team get your site in shape.
This session is being presented by Stoney deGeyter, President, Pole Position Marketing. If your web site architecture is “jacked-up” you aren’t going anywhere (shows a car on jack stands with no tires). Good architecture helps search engines and users think less; poor architecture stops spiders and hinders usability. Ultimately, poor web site architecture can decrease conversions.
Duplicate content can plague you like a virus. For one thing, it will slow the spiders down and may even cause them to leave before indexing everything you’d like them to index. Regarding domain names, keep them short, make them memorable and use keywords if possible. Don’t forget about alternate domains – misspellings, abbreviations, phonetically similar names, various top level domains, etc. Redirect all alternate domains to your primary domain to avoid duplicate content issues. Stoney also talks about favicons, pointing out that they help with branding and draw attention to your domain.
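As a sketch of what redirecting an alternate domain might look like (assuming an Apache server; the misspelled domain here is hypothetical):

```apache
# In the virtual host config for a hypothetical alternate domain
# (e.g., a common misspelling), permanently redirect everything
# to the primary domain so only one version gets indexed.
<VirtualHost *:80>
    ServerName mydomian.com
    Redirect permanent / http://mydomain.com/
</VirtualHost>
```

A permanent (301) redirect tells the engines which domain is canonical, rather than letting both get crawled.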
Next he talks about search engine friendly URLs. Like domains, keep them short and use keywords if possible. On whether to show the ‘www’ or not, Stoney recommends setting up redirects to flip to one or the other so you do not get duplicate content issues. He also recommends not redirecting the main domain to a sub-page (e.g., mydomain.com/home.php).
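If you choose the non-www version, the redirect he describes might look like this (a sketch, again assuming Apache with mod_rewrite enabled and a hypothetical domain):

```apache
# Example .htaccess rule that strips the 'www' with a 301 redirect,
# so www.mydomain.com and mydomain.com don't get indexed as duplicates.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.mydomain\.com$ [NC]
RewriteRule ^(.*)$ http://mydomain.com/$1 [R=301,L]
```

Flip the condition and the target if you prefer to standardize on the www version instead.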
Another duplicate content issue is when you have secure and non-secure pages. He recommends that you disallow search engines from indexing secure (https) pages. In an e-commerce environment, don’t secure the process until users are ready to check out. Many sites secure the shopping cart, but users often go back to the shopping experience after adding items to their carts.
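Because crawlers request robots.txt separately for each protocol and hostname, one way to do this (a sketch, assuming your server can serve a different robots.txt over https than over http) is:

```text
# robots.txt served only at https://mydomain.com/robots.txt
# Blocks crawling of the secure versions of pages entirely,
# while the http robots.txt remains open.
User-agent: *
Disallow: /
```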
Different navigational products can present duplicate content issues. This happens when you have different URLs for essentially the same page. It is okay to have various ways to navigate to a product or brand page but either make sure the landing page is a consistent URL or make sure you have unique content for each different URL.
On using underscores or hyphens, Stoney recommends hyphens. One thing I will point out here that was not mentioned in the presentation: when words in the URL are separated with underscores, search engines see them as one word, whereas if you separate them with hyphens, they will see them as two or more words.
Make sure you have a custom 404 error page so users don’t see the default “Not Found” page.
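On Apache, this can be as simple as one directive (the error page path here is hypothetical):

```apache
# Serve a custom 404 page. Use a local server path, not a full URL,
# so the 404 status code is preserved instead of becoming a redirect.
ErrorDocument 404 /errors/not-found.html
```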
Stoney points out that a flat directory structure is better. What this essentially means is making sure all pages can be reached within one or two clicks. If you are going to have multiple pages for each section of your site, locate them in specific folders.
Next Stoney is going to move on from domains and cover document structure. In site hierarchy, make sure there is a natural flow of topics and subjects. Make sure title tags are unique for each page.
This seems so elementary, but there are still so many scenarios where I see duplicate title tags, or even worse, no title tags at all, throughout a site. You will have to decide whether to place the page description first or the brand first in the title tag. I prefer the keyword description first, followed by a pipe symbol and then the brand name.
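That keyword-first pattern might look like this (the page topic here is a hypothetical example):

```html
<title>Racing Tires and Wheels | Pole Position Marketing</title>
```

The keyword phrase leads, so it is what users scan first in the search results, and the brand still gets its mention at the end.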
Regarding meta descriptions, if targeting long-tail keywords, he recommends not having a meta description so the engine is forced to pull snippets from the content. Keywords meta tag = useless.
Regarding content, make sure each page has unique content to avoid duplicate content issues. Also take the opportunity to link to other pages within your content as much as possible. This allows the use of keyword-rich anchor text to describe the pages you are linking to. Use heading tags in an outline-style format.
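By outline style, think of headings nested the way a paper outline would be: one h1 for the page topic, with h2s and h3s for subtopics beneath it (a hypothetical example; the indentation is just for readability):

```html
<h1>Racing Tires</h1>
  <h2>Slicks</h2>
    <h3>Soft Compound</h3>
    <h3>Hard Compound</h3>
  <h2>Rain Tires</h2>
```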
Regarding cookies, don’t force users to accept them. Why? Because search engines do not eat cookies. Therefore they will not see the pages.
Finally, Stoney will talk about link structure. Back on the ‘www’ or no ‘www’ – pick one or the other when linking to pages. He also recommends using absolute URLs, especially if you have secure (https) pages.
Site maps are also very useful for both users and search engines. Use robots.txt to block access to pages you don’t want to show up in the organic results of search engines. Use the nofollow attribute to control the flow of link juice so you do not waste it on pages that are not important (e.g., privacy policies, legal statements, contact us, etc.).
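Putting those last two tips together, a sketch might look like this (the paths and page names are hypothetical):

```text
# robots.txt: keep unimportant pages out of the organic results
User-agent: *
Disallow: /legal/
Disallow: /privacy-policy.html
```

And on the linking side, nofollow goes on the anchor itself:

```html
<a href="/contact-us.html" rel="nofollow">Contact Us</a>
```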