Every Internet marketer worth their salt wants to know: what do web pages that rank well in Google have in common, and what distinguishes them from lower-ranking pages? To answer this question, SearchMetrics examined 300,000 URLs appearing in the top search result positions for the presence and extent of certain properties.
Unless you’ve been living under a rock for the last several years, it is common knowledge that traditional SEO has evolved to include many facets of online marketing. On-page SEO remains a foundational tactic, but the modern-day SEO practitioner has many other tools they need to employ.
High rankings in the SERPs are nice, but often they are not enough to ensure a successful online marketing effort. Rich snippets are part of increasingly enhanced SERPs designed to help users make decisions before they click. But how does one prepare a web site for rich snippets?
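One established way to make a page eligible for rich snippets is to embed schema.org structured data, for example as JSON-LD in the page's head. As a minimal sketch (the product name, rating value, and review count below are purely hypothetical examples, not data from any real site):

```python
import json

def product_jsonld(name, rating_value, review_count):
    """Build a schema.org Product JSON-LD snippet with an aggregate rating.

    Search engines can use markup like this to show star ratings
    and review counts as a rich snippet in the search results.
    """
    data = {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": name,
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": rating_value,
            "reviewCount": review_count,
        },
    }
    # Wrap the structured data in the script tag that goes in the page <head>.
    return '<script type="application/ld+json">%s</script>' % json.dumps(data)

snippet = product_jsonld("Example Widget", "4.5", 27)
print(snippet)
```

The same idea applies to other schema.org types (reviews, recipes, events, and so on); the key is that the markup must accurately describe content actually visible on the page.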
Another great infographic I discovered at SEOBook, this one showing how Google’s “fight with web spam,” combined with their business development team “making spam,” has made organic links less relevant than they used to be.
I am always flabbergasted by clients who pay good money for us to develop an SEO strategy for them and manage their marketing campaigns, and yet do absolutely nothing with it. They ignore requests for web site modifications and enhancements, as well as other marketing suggestions.
It is in these scenarios where SEO does not make sense.
Just wanted to highlight a post by Patrick Altoft at Blogstorm where he points out that in a recent survey from Forrester, search results are shown to be the third most trusted online information source. This is good news for search marketers everywhere, especially those practicing SEO-related techniques that help improve the visibility of web pages in the organic search results of engines such as Google and Yahoo.
As a final installment to my “back to the basics” series, this post will discuss some of the pitfalls and obstacles you may come across when developing an SEO strategy. These include duplicate content issues, potential problems with e-commerce sites and content management systems, and obstacles that Flash and AJAX technologies may pose.
This is the second installment of the “Back to the Basics” series I am currently writing. In case you missed it, the initial installment was about keyword research and how it is the foundation of any search engine optimization (SEO) effort. In this segment I will detail how to go about developing an SEO strategy for your web site.
This is a “back to the basics” style of post related to search engine optimization (SEO). I plan on doing a number of these over the next couple of weeks that will detail the entire SEO process — from laying the foundation with strategic keyword research to effectively monitoring your progress. So if you consider yourself “advanced” in SEO, you might not wish to read any further. My target audience for this post is the “newbie” — in other words, those who are just beginning their education in SEO or at least are fairly new at the practice.
I am so weary of non-SEO types who have some measure of influence spouting off their opinions of what SEO is, what it isn’t, whether it is growing or declining, and the like. It is no different than when celebrities use their clout and status to speak out on some subject as if they were authorities on the matter when in actuality they are not.
The Dallas Business Journal recently ran a story on one of our clients, Wasp Barcode Technologies, describing how they went from spending enormous amounts of money on PPC (AdWords) to focusing more on traditional SEO and link building. The strategy paid off — Wasp spent less and got better results. Their web traffic grew by 60%, topping 600,000 visits while they were able to cut external spending by 13% in the process.