wikispooks.org

WikiSpooks Home



Alexa stats for wikispooks.org

[Alexa traffic graph for wikispooks.org]

Site SEO for wikispooks.org

Tag counts:
H1: 0   H2: 0   H3: 1   H4: 1   H5: 0

Image: There is 1 image on this website; 0 images have alt attributes.
Frame: There are 0 embeds on this website.
Flash: There are 0 Flash objects on this website.
Size: 1,043 characters
Meta Description: No
Meta Keywords: No

Majestic Backlinks for wikispooks.org

[Majestic backlinks chart for wikispooks.org]

About wikispooks.org

Domain: wikispooks.org
MD5: 76d059400fe016cf36be0dea458cbd41
Charset: ISO-8859-1
Web server: Apache/2.4.51 (Unix)
IP Address: 79.170.40.167
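Site reports like this one typically derive the "MD5" field by hashing the domain string; that is an assumption here, not something the report states. A minimal sketch of computing such a digest:

```python
# Sketch: compute an MD5 digest of the domain string.
# Assumption: the report's MD5 field is a hash of the domain name;
# the report itself does not say what was hashed.
import hashlib

domain = "wikispooks.org"
digest = hashlib.md5(domain.encode("utf-8")).hexdigest()
print(digest)  # a 32-character lowercase hex string
```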
robots.txt for wikispooks.org:

# advertising-related bots:
User-agent: Mediapartners-Google*
Disallow: /

# Wikipedia work bots:
User-agent: IsraBot
Disallow:

User-agent: Orthogaffe
Disallow: /

# Crawlers that are kind enough to obey, but which we'd rather not have
# unless they're feeding search engines.
User-agent: UbiCrawler
Disallow: /

User-agent: DOC
Disallow: /

User-agent: Zao
Disallow: /

# Some bots are known to be trouble, particularly those designed to copy
# entire sites. Please obey robots.txt.
User-agent: sitecheck.internetseer.com
Disallow: /

User-agent: Zealbot
Disallow: /

User-agent: MSIECrawler
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: Fetch
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: WebZIP
Disallow: /

User-agent: linko
Disallow: /

User-agent: HTTrack
Disallow: /

User-agent: Microsoft.URL.Control
Disallow: /

User-agent: Xenu
Disallow: /

User-agent: larbin
Disallow: /

User-agent: libwww
Disallow: /

User-agent: ZyBORG
Disallow: /

User-agent: Download Ninja
Disallow: /

# Sorry, wget in its recursive mode is a frequent problem.
# Please read the man page and use it properly; there is a
# --wait option you can use to set the delay between hits,
# for instance.
#
User-agent: wget
Disallow: /

#
# The 'grub' distributed client has been *very* poorly behaved.
#
User-agent: grub-client
Disallow: /

#
# Doesn't follow robots.txt anyway, but...
#
User-agent: k2spider
Disallow: /

# Hits many times per second, not acceptable
# http://www.nameprotect.com/botinfo.html
User-agent: NPBot
Disallow: /

# A capture bot, downloads gazillions of pages with no public benefit
# http://www.webreaper.net/
User-agent: WebReaper
Disallow: /

User-agent: *
Disallow: /secure/
Disallow: /lists/
Disallow: /dpf/
Disallow: /w/


User-agent: googlebot
Disallow: /secure/
Disallow: /lists/
Disallow: /dpf/
Disallow: /w/
Disallow: /wiki/Special:Search
Disallow: /wiki/Special:Random


User-agent: *
Crawl-delay: 2
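The directives above can be exercised with Python's standard-library robots.txt parser. The snippet below copies only a subset of the rules from the file (the wget group and the generic `User-agent: *` group); "ExampleBot" is a hypothetical crawler name used for illustration.

```python
# Sketch: checking a subset of the wikispooks.org robots.txt rules
# with Python's stdlib parser. "ExampleBot" is a hypothetical name.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: wget
Disallow: /

User-agent: *
Disallow: /secure/
Disallow: /lists/
Disallow: /dpf/
Disallow: /w/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# A generic bot may fetch wiki pages but not the disallowed prefixes.
print(rp.can_fetch("ExampleBot", "https://wikispooks.org/wiki/Main_Page"))  # True
print(rp.can_fetch("ExampleBot", "https://wikispooks.org/secure/login"))    # False
# wget is blocked from the entire site by its own group.
print(rp.can_fetch("wget", "https://wikispooks.org/wiki/Main_Page"))        # False
```

Note that `urllib.robotparser` matches user-agent tokens case-insensitively, so a client identifying as "Wget" falls under the `wget` group.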