
How To Detect Search Engine Friendly Web Directories
By O.P. Mundae

 

 


Directories serve two purposes in your web development efforts. First, a link from a directory counts as a back link and so increases your link popularity. Second, some directories will send you targeted traffic.

Webmasters who use directories to build link popularity should first check whether a directory is search engine (SE) friendly, because only an SE-friendly directory will give their links any value.

So, how do you detect an SE-friendly directory? Here are some tips to help you do so.

1) Check the URL format of the directory. Some directories use URLs in the format

http://sitename//index.php?id=1&c=10... for their category pages. This is called a dynamic URL, and search engines find such pages difficult to crawl.

URLs for category pages should be in a static format.

Example: http://www.indexbizz.com/cat1.html . Search engines can easily crawl such pages, so your link will get crawled as well.
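The dynamic-versus-static distinction above can be checked programmatically. Here is a minimal Python sketch (the `is_dynamic_url` helper name is illustrative, not part of any directory software) that treats any URL carrying a query string as dynamic:

```python
from urllib.parse import urlparse

def is_dynamic_url(url):
    """Return True if the URL carries a query string (a dynamic URL),
    False if it looks like a static page."""
    return bool(urlparse(url).query)

# Illustrative examples based on the article's URLs:
print(is_dynamic_url("http://sitename//index.php?id=1&c=10"))  # True
print(is_dynamic_url("http://www.indexbizz.com/cat1.html"))    # False
```

This is only a heuristic: some directories also generate crawl-unfriendly URLs without query strings (session IDs in the path, for example), so a manual look at the category pages is still worthwhile.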

2) Your link should not be a JavaScript link. Some directories use JavaScript to jump from their category pages to your site. Search engines find such links difficult to crawl.
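One way to spot JavaScript links is to scan a category page's HTML for anchors whose href uses the javascript: scheme instead of a plain URL. A minimal sketch using Python's standard HTML parser (the `LinkChecker` class name and the sample HTML snippet are hypothetical):

```python
from html.parser import HTMLParser

class LinkChecker(HTMLParser):
    """Collects anchor hrefs, flagging those that use a javascript:
    scheme rather than a plain, crawlable URL."""
    def __init__(self):
        super().__init__()
        self.javascript_links = []
        self.plain_links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href", "")
            if href.lower().startswith("javascript:"):
                self.javascript_links.append(href)
            else:
                self.plain_links.append(href)

# Hypothetical category-page snippet for illustration:
checker = LinkChecker()
checker.feed('<a href="javascript:visit(42)">Site A</a>'
             '<a href="http://example.com/">Site B</a>')
print(len(checker.javascript_links))  # 1
print(len(checker.plain_links))       # 1
```

In practice you can also simply view the page source in your browser and look at how the listing links are written.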

3) The directory must not ban search engine robots from crawling its category pages. You can check whether it does by looking at the directory's robots.txt file, which sits at the root of the domain: just type /robots.txt after the directory's domain name. Read what the file says, for example:

User-agent: *
Disallow:
This allows all SE bots to crawl every folder and file in the directory.

User-agent: *
Disallow: /
This keeps all SE bots away from every file and folder in the directory.

User-agent: *
Disallow: /mydirectory/
This keeps all SE bots from crawling the 'mydirectory' directory and every file under it.
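The three robots.txt variants above can be tested with Python's standard urllib.robotparser. A minimal sketch, assuming a hypothetical category-page URL on the indexbizz.com domain and an illustrative `allowed` helper:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical category-page URL for illustration:
category_url = "http://www.indexbizz.com/mydirectory/page.html"

def allowed(robots_txt):
    """Return True if a crawler may fetch category_url under this
    robots.txt content."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return rp.can_fetch("Googlebot", category_url)

# The three variants from the article:
print(allowed("User-agent: *\nDisallow:"))               # True
print(allowed("User-agent: *\nDisallow: /"))             # False
print(allowed("User-agent: *\nDisallow: /mydirectory/")) # False
```

In normal use you would point RobotFileParser at the live file with set_url() and read() instead of parsing a string, but the string form makes it easy to check the rules quoted above.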

So be wise in your selection of SE-friendly directories, and put your efforts towards getting links from directories that fulfill the above conditions.


About the Author: Osan Mundae writes for http://www.indexbizz.com/

Source: www.isnare.com


COPYRIGHT © 2009-2015 HOW TO - ALL RIGHTS RESERVED

 
