SEO (define) has grown to be a bit like one of Harry Potter's adventures. It's a narrative rooted in reason, passion, magic, and a bit of humor. Though traditional wizards and witches are replaced with Web wizards, widgets, and wands, conjuring and concoctions with keystrokes and technological catenations, Harry Potter's fictional world and SEO are not entirely dissimilar.
SEO tactics, much like the travails in the Harry Potter saga, have grown more complex over time. There's an air of secrecy imbued in crafting SEO strategies, too, particularly for beginners just learning the tricks of the trade and for the vast online entities that attempt to conceal their efforts. And just like Lord Voldemort's soul, SEO has been severed into several distinct bits and pieces over the years.
Spiders Rule
Once upon a time, SEO was primarily achieved by understanding machine reading algorithms and initiating automatic submittals. Web wizards first began optimizing sites for search engines in the mid '90s, as the first engines were crawling and indexing information found in the earliest Web content.
Initially, all a Web wizard needed to do was submit a Web page to the various engines. The engines would respond by sending a spider to crawl the page. By further extracting links to other pages, spiders would crawl the entire site. The information on the page would be indexed for further referencing and referrals (except in Yahoo, where contextual algorithmic judgments were actually completed by Muggles).
Search engine spiders -- nearly as prodigious as Aragog's kin -- would read the text on a page and follow mathematical machinations to determine where the information would be positioned in the SERPs (define). Well-schooled Web wizards noticed how search engines responded to specific words found on their site pages. This was truly a time when content was king and reigned alone over an expanding realm of SERPs.
The Flaw in the Plan
Unfortunately, this same period could also be called the Golden Age of Spam, as it was possible to gain high SERP rankings by repeating keywords or keyword phrases many times over, or by stuffing pages with misspelled words, anything with "playboy" in it, general gibberish, and a smattering of gobbledegook, the official language of goblins.
Furthermore, early search engines relied too heavily on keyword metatags as an insightful guide to each page's content. By relying so heavily on factors exclusively within a Web wizard's control, early search engines suffered from abuse and ranking manipulation. Consequently, user experience suffered, too.
These events set Web wizards against the early search engine spiders. A battle ensued, not quite as great as the one between Dumbledore and Grindelwald, but it was a battle for relevancy all the same.
Spells and Charms
To provide more relevant results to their users, search engines had to adapt. Rather than serve unrelated pages or malware and pornography disguised as harmless bits of fluff, search engines responded by developing more complex ranking algorithms, taking into account additional factors beyond a Web page that were more difficult for Web wizards to contrive.
Prior to developing Google as graduate students at reality's Hogwarts, Stanford University, Larry Page and Sergey Brin developed a search engine that relied on a mathematical algorithm to rate Web pages through citation analysis. The number calculated by the algorithm, PageRank, is a function of the quantity and contextual strength of inbound links.
To this day, PageRank estimates the likelihood a given page will be reached by a search engine user who randomly surfs the Web and follows links from one page to another. In effect, this means some links are stronger than others, just as some spells and charms are stronger than others.
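The random-surfer idea described above can be sketched in a few lines of Python. The three-page link graph, the damping factor of 0.85, and the iteration count are illustrative assumptions for the sketch, not Google's actual implementation or values:

```python
# A minimal sketch of the random-surfer PageRank model. The link graph,
# damping factor, and iteration count below are illustrative assumptions.

def pagerank(links, damping=0.85, iterations=50):
    """Estimate PageRank for a dict mapping page -> list of outbound links."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # the surfer starts anywhere with equal odds
    for _ in range(iterations):
        # With probability (1 - damping) the surfer jumps to a random page;
        # otherwise she follows one of the current page's outbound links.
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outbound in links.items():
            if outbound:
                share = rank[page] / len(outbound)
                for target in outbound:
                    new_rank[target] += damping * share
            else:
                # Dangling page with no links: spread its rank evenly.
                for target in pages:
                    new_rank[target] += damping * rank[page] / n
        rank = new_rank
    return rank

# Hypothetical three-page site: A links to B, B to C, and C back to B.
# B collects two inbound links, so it ends up with the strongest "charm."
ranks = pagerank({"A": ["B"], "B": ["C"], "C": ["B"]})
```

Because B receives links from both A and C, it outranks the other pages, which is the sense in which some links are stronger than others: a link from a highly ranked page passes along more of its rank.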
As more sites reference a page by way of inbound links, its PageRank and search engine positioning are heightened. PageRank has charmed and intrigued the average search engine user for many years now, so naturally Yahoo and MSN Live Search have attempted to emulate Google's spell. And link-building, link-baiting, and social media marketing are now employed as critical SEO tactics. To inhibit the impact of darkly influenced link schemers, search engines consider a diverse array of undisclosed factors for their ranking algorithms.
Happy Endings or Disenchantment?
SEO wizards who employ overtly aggressive tactics can get their clients' Web sites banned from search results. This is highly disenchanting for potential clients and those pursuing the greater SEO good. Yet some search engines continue to reach out to the SEO industry and are frequent sponsors and guests at SEO conferences and seminars, where the dark arts mingle with the white.
In fact, with the advent of paid inclusion and PPC (define) revenues, some search engines now have a vested interest in the optimization community's overall health. Toward this end, the major search engines now provide helpful information and general guidelines that even Squibs can't ignore. While it's not as binding as the International Statute of Secrecy, SEO has come a long way from its humble, darker days to provide happy endings for search engines and search engine users alike.
Thursday, August 2, 2007
Wednesday, August 1, 2007
Search Engine Optimization: How to use SEO to get Listed
Search engine optimization is a means of making your website attractive to spiders, and if you know how to use SEO to get listed on the search engine indices, you have a fabulous way of getting free advertising at your fingertips.
Many people spend a lot of money on pay per click advertising because they cannot seem to get their websites listed on search engines such as Google and Yahoo. However, there are some simple modifications that they can make to render their web pages more likely to be listed on the search engine indices once their site has been visited.
A visit by any search engine is prized, and when it does happen you should be ready to make the most of it. Although not many search engines take much notice of meta tags these days, there is always the possibility that some might. There is no one single way to optimize a website; it is usually a combination of a number of factors that earns a page a high listing in a major search engine.
Your view should always be that if it doesn't hurt you to do it, then do it! You should therefore include Keyword and Description meta tags in your HTML on every single page of your website that you want the spiders to visit, and list the pages that you don't want visited in a robots.txt file, or use a robots meta tag (for example, nofollow to stop spiders following the links on the page concerned). This stops you from wasting useful robot time on your site by allowing robots to visit pages such as your 'Contact Page' or one full of affiliate links.
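As a minimal sketch, the meta tags described above might look like the following in a page's head section (the keyword and description values, page names, and site structure are all hypothetical):

```html
<head>
  <!-- Hypothetical Keyword and Description meta tags for a page you want indexed -->
  <meta name="keywords" content="model railways, oo gauge layouts, track plans">
  <meta name="description" content="A beginner's guide to planning and building OO gauge model railway layouts.">
  <title>Model Railway Layouts</title>
</head>
```

To keep spiders away from a hypothetical contact page, a robots.txt file at the site root containing the lines `User-agent: *` and `Disallow: /contact.html` does the job, while `<meta name="robots" content="nofollow">` in the page's head tells spiders not to follow the links on that particular page.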
You should also let the spiders know what your page is about. They can find that out by themselves simply by scanning your main text, but it is always better to include the main keyword for each page in the title of the page, and to place the title in H1 tags. You should also place any secondary keywords that you have listed in the keywords meta tag as subtitles in H2 tags.
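For instance, the title and heading structure described above might be sketched like this (the keyword phrases are hypothetical):

```html
<!-- Main keyword for the page appears in the title and again in the H1 heading -->
<title>Model Railway Layouts</title>

<h1>Model Railway Layouts</h1>
<p>Opening body text about planning and building model railway layouts...</p>

<!-- A secondary keyword from the keywords meta tag reused as an H2 subtitle -->
<h2>OO Gauge Track Plans</h2>
```

The main keyword appears in both the title and the H1 heading, while a secondary keyword from the keywords meta tag is reused as an H2 subtitle.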
This will improve the chances of your web page being listed for your main keyword, and perhaps even one or two minor long tail keywords if they have a reasonable demand and little supply. There are other modifications that you can make to your page, and also some design considerations to take into account, that can be just as important; in fact, many are even more important in leading to high listing positions for your web page.
It is important to remember that the first text on your page is regarded as being important, so make sure that the spider sees your titles and most of the body text in the first few hundred words it reads. If you fail to do that, then your chances of a high listing are dependent more on the number of incoming links you have than on the content of your web page.
Make sure that your content is relevant to the main keyword you are using for the page, and that the keyword is relevant to the general theme of your website. If you then include a good supply of relevant text associated with that keyword, you have an excellent chance of being listed.