Category: Search Engine Optimization (SEO)

January 17th, 2010 by admin

Differences between Google PageRank and Alexa Rank

Google PageRank and Alexa Rank are the two most widely used ranking metrics among webmasters on the web, but they measure very different things. Here are some of the key differences between them.

1. Google PageRank is a measure of a page’s importance based on the number and quality of incoming links to your website, while Alexa Rank is computed from the traffic recorded for your website, as measured through the Alexa Toolbar. (A small illustrative sketch of the PageRank idea follows this list.)

2. Google PageRank is only published 3-4 times a year, while Alexa ranks your website based on its current traffic.

3. Google PageRank is not updated as frequently as Alexa Rank, so it can sometimes appear outdated.

4. Google PageRank doesn’t require any toolbar installed in your browser, but Alexa Rank is of little use without the Alexa Toolbar.

5. Google’s PageRank mechanism is hard to manipulate, while Alexa Rank can be inflated by installing the Alexa Toolbar on many computers and reloading a page frequently.

6. Google PageRank is shown as an image in the toolbar, while the Alexa Toolbar shows the rank as a number.

7. Google PageRank is scored from 1 to 10, where bigger is better, while Alexa Rank runs from the millions down to 1, where smaller is better.
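To make point 1 concrete, here is a minimal sketch of the idea behind PageRank: each page passes a share of its own score to the pages it links to, so a page linked from many well-scored pages ends up with a high score. The link graph, page names, damping factor and iteration count below are illustrative assumptions, not Google’s actual implementation; the published toolbar value is only a coarse bucketing of an internal score.

DAMPING = 0.85        # probability of following a link instead of jumping to a random page
ITERATIONS = 50       # plenty for this tiny graph to settle

# Hypothetical link graph: page -> pages it links to (purely illustrative).
links = {
    "home":  ["about", "blog"],
    "about": ["home"],
    "blog":  ["home", "about"],
    "extra": ["home"],            # links out, but nothing links to it
}

pages = list(links)
rank = {page: 1.0 / len(pages) for page in pages}     # start everyone with an equal score

for _ in range(ITERATIONS):
    new_rank = {page: (1.0 - DAMPING) / len(pages) for page in pages}
    for page, outgoing in links.items():
        share = DAMPING * rank[page] / len(outgoing)  # split this page's score across its links
        for target in outgoing:
            new_rank[target] += share
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda item: item[1], reverse=True):
    print(page, round(score, 3))

Run it and "home" comes out on top, because it collects links from every other page; that is the sense in which incoming links, rather than raw traffic, drive PageRank.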


January 17th, 2010 by admin

The robots.txt file is placed in your site’s main directory and gives directives to crawlers visiting your site. Depending on the rules it contains, certain pages or sections can be crawled or kept out of the crawl.

Using a Robots File Effectively

In general we want as much exposure as possible for our sites, but there is some content you don’t want indexed and listed on search engines. This is where robots.txt can be used effectively.

Definitions

User-agent: defines which bots the following rules apply to. * is a wildcard meaning all bots; a specific name such as Googlebot targets Google’s crawler.
Disallow: defines which folders or files will be excluded. An empty value means nothing is excluded, / means everything is excluded, and /foldername/ or /filename can be used to exclude specific paths.
Allow: works as the opposite of Disallow; it lists content that may be crawled. * is a wildcard.
Request-rate: defines the pages-per-seconds crawl ratio. For example, 1/20 means 1 page every 20 seconds.
Crawl-delay: defines how many seconds a crawler should wait between requests.
Visit-time: specifies between which hours you want your pages to be crawled.
Sitemap: points to the location of your sitemap file (you must use the complete URL for the file).
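To show these directives in context, here is a small illustrative fragment; the domain and paths are made up, and note that Request-rate and Visit-time are non-standard extensions that some major crawlers ignore.

User-agent: *
Disallow: /private/
Allow: /private/summary.html
Crawl-delay: 10
Request-rate: 1/20
Visit-time: 0100-0500
Sitemap: http://www.example.com/sitemap.xml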

Example

This is the robots.txt we use on our site:

User-agent: *
Disallow: /cms/feed/
Disallow: */feed/*
Disallow: /feed
Disallow: /cms/wp-content/
Disallow: /cms/wp-plugins/
Disallow: */wp-content/*
Disallow: /cms/wp-content/plugins/
Disallow: /cms/index.php
Sitemap: http://www.bestblogs.asia/sitemap.xml
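If you want to check how a parser reads these rules, Python’s standard library includes a robots.txt parser; the sketch below is only illustrative. The URLs being tested are made up, and only the non-wildcard rules are included, because the standard-library parser matches Disallow values as plain path prefixes and does not understand the * wildcards that crawlers such as Googlebot support.

from urllib.robotparser import RobotFileParser

# A cut-down copy of the rules above (wildcard lines omitted, since this
# parser treats Disallow values as plain path prefixes).
robots_txt = """\
User-agent: *
Disallow: /cms/feed/
Disallow: /cms/wp-content/
Disallow: /cms/index.php
Sitemap: http://www.bestblogs.asia/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Hypothetical URLs, used purely for illustration.
for url in ("http://www.bestblogs.asia/cms/feed/",
            "http://www.bestblogs.asia/cms/wp-content/themes/style.css",
            "http://www.bestblogs.asia/about/"):
    print(url, "->", "allowed" if parser.can_fetch("*", url) else "blocked")

The first two URLs come back blocked and the last one allowed, which matches what the Disallow prefixes above specify.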
