Sunday, July 29, 2007

unavailable_after tag - Google Robots Exclusion Protocol

The ‘unavailable_after’ meta tag will soon be recognized by Google, according to Dan Crow, Director of Crawl Systems at Google. (From Loren Baker)

Google is coming out with a new tag called “unavailable_after” which will allow people to tell Google when a particular page will no longer be available for crawling. For instance, if you have a special offer on your site that expires on a particular date, you might want to use the unavailable_after tag to let Google know when to stop indexing it. Or perhaps you write articles that are free for a particular amount of time, but then get moved to a paid-subscription area of your site.

Two new features added to the protocol will help webmasters govern when an item should stop showing up in Google’s web search, as well as provide some control over the indexing of other data types.

One of the features, support for the unavailable_after tag, has been mentioned previously; Google’s Dan Crow made the initial disclosure.

He has followed that up with a full post on the official Google blog about the new tag. The unavailable_after META tag tells Googlebot when a page should be removed from Google’s search results:

“This information is treated as a removal request: it will take about a day after the removal date passes for the page to disappear from the search results. We currently only support unavailable_after for Google web search results.”

“After the removal, the page stops showing in Google search results, but it is not removed from our system.”
(Email from: David A. Utter)
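The tag itself goes in a page’s head section and is addressed to Googlebot. A minimal sketch, assuming the RFC 850 date format described in Google’s announcement (the page title and date here are hypothetical):

```html
<head>
  <title>Summer Special Offer</title>
  <!-- Asks Google to drop this page from web search results after the date below -->
  <meta name="googlebot" content="unavailable_after: 31-Aug-2007 23:59:59 EST">
</head>
```

Per the quote above, the page disappears from results roughly a day after the stated date, but it is not purged from Google’s systems.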

One of the major issues plaguing search engines right now is the ever-growing number of web documents available online. While no exact figures are available, there are billions of search results to sort through. But they can’t all be relevant, in both content and timeliness, can they?

Of course they’re not, and Google is hoping to solve this problem through the adoption of the unavailable_after META tag.
(From Sujan Patel: SEO Impact of Google’s unavailable_after META Tag)

Source :http://www.searchengineoptimizationcompany.ca

Things to Avoid in Search Engine Optimization




There are a few things you must avoid (or fix accordingly) when optimizing your site for search engine submission. These include the following and more:

Dead Links - Since search engines index your entire site by crawling through hypertext links, make sure you check for dead links before submitting.

Graphics and Image Maps - Search engines cannot read images, so be sure to include alternative text (alt) tags.
I recently had someone ask me why their site couldn’t get indexed by the search engines. I wasn’t surprised when I looked at it: 41 pages of pure images, not a shred of text on the site. That is the worst-case scenario, of course, but you should keep pages under 64k (max) total graphics and text. Anything more and you’re losing your search engine food, and the load time drives users away before the page ever renders.
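Adding alt text is a one-attribute change; a quick sketch (the filename and description are hypothetical):

```html
<!-- Without alt text, crawlers see nothing here -->
<img src="summer-sale-banner.gif">

<!-- With alt text, the image contributes indexable text -->
<img src="summer-sale-banner.gif" alt="Summer sale: 50% off all widgets">
```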

Frames - Many search engines aren’t frames-compatible. Meta tags and the &lt;noframes&gt; tag are important in this instance. Only AltaVista, Google, and Northern Light understand frames. If you use frames, make sure that your first content page is search engine friendly, and that it’s linked to the main pages of your site by standard text links. Submit this page to the search engines, not your frameset page.
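A rough sketch of a more search-friendly frameset page, using noframes as fallback content (the page names and text are hypothetical):

```html
<html>
<head>
  <title>Example Widgets - Home</title>
  <meta name="description" content="Handmade widgets and accessories">
</head>
<frameset cols="200,*">
  <frame src="nav.html" name="nav">
  <frame src="main.html" name="content">
  <!-- Crawlers that don't understand frames read this instead -->
  <noframes>
    <body>
      <p>Handmade widgets and accessories.</p>
      <p><a href="main.html">Enter the main site</a></p>
    </body>
  </noframes>
</frameset>
</html>
```

The noframes body gives non-frames crawlers both readable text and a standard link into the real content pages, as the tip above recommends.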

Password protection - Most search engines cannot index content behind a password-protected page unless you make special arrangements to provide password access.

Dynamic Pages - If your site uses database-generated requests, CGI scripts, etc., consider submitting static pointer pages to the search engines.

SPAMMING - Avoid resubmitting your pages repeatedly to search engines if your site does not get listed in the first few weeks. Allow at least six weeks before resubmission. Continual resubmission (such as that caused by automatic submission software) can cause your site to be penalized.

No Flashing! Nothing drives users away, never to return, like flashing text or abuse of animated GIFs. That scrolling banner text ranks right up there too.

Ban Those Banner Exchanges! Link Exchange is the great modern Internet myth of our time. I’ve talked to hundreds of people in the know about this subject, and the facts are simple: banner exchanges cost you repeat visitors in the short run, the medium run, and the long run. It’s like putting a DO NOT ENTER sign with a big skull and crossbones on your front door. Nothing spells Trailer Park like Link Exchange, and you’re left wondering why your hit rate slowly fades away. It is one thing if you are getting paid for it; it is another entirely if you are giving it away.

Cloaking, Doorway Pages, Mini-sites. It’s all the same. This and some of the tricks mentioned below are considered “SPAMMING” by search engines.

Hiding Text. Padding your page with “hidden” text, using fonts the same color as your background, will prompt search engines not to index those pages.

Tiny Text. Visible text, whose only purpose is to pad the page, will be penalized the same as using hidden text.

Banners and Links. If banners or links are the first things that the search engine spider comes across, it may leave your site and follow the link. Place banners and links to other web sites after your own content, or on a dedicated links page.

Source: http://bill-ray.com/?p=22