www.phex.org

Phex - Home

Phex is a peer-to-peer file sharing client for the Gnutella network. It is free software without spyware or adware, released under the GNU General Public License. Phex is based on Java technology and is therefore available for many different platforms, including Windows, Mac OS X, and Linux.


Alexa stats for www.phex.org

[Alexa traffic chart for www.phex.org]

Site SEO for www.phex.org

Tags: H1: 0, H2: 0, H3: 0, H4: 0, H5: 0
Images: 7 images on this site; 2 have alt attributes
Frames: 0 embed elements on this site
Flash: 0 Flash objects on this site
Size: 27,457 characters
Meta Description: Yes
Meta Keywords: Yes
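
The tag and image counts above are easy to reproduce. Below is a minimal sketch using Python's standard html.parser; it is my own illustration of how such numbers could be computed, not the analyzer's actual code, and it assumes the page really is served as ISO-8859-1 (as the report states).

from html.parser import HTMLParser
from urllib.request import urlopen

class TagCounter(HTMLParser):
    # Tallies H1-H5 headings and images, noting which images carry an alt attribute.
    def __init__(self):
        super().__init__()
        self.headings = {"h%d" % i: 0 for i in range(1, 6)}
        self.images = 0
        self.images_with_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.headings:
            self.headings[tag] += 1
        elif tag == "img":
            self.images += 1
            if any(name == "alt" for name, _ in attrs):
                self.images_with_alt += 1

html = urlopen("http://www.phex.org/").read().decode("iso-8859-1")
counter = TagCounter()
counter.feed(html)
print(counter.headings)
print(counter.images, "images,", counter.images_with_alt, "with alt attributes")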

Majestic Backlinks for www.phex.org

[Majestic backlinks chart for www.phex.org]

About www.phex.org

Domain: www.phex.org

MD5: c09264e0df9d433c50619e1f271df4bd
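
The MD5 field is presumably a digest of the domain name itself; the report does not say what was hashed, so this is an assumption, and the one-liner below may or may not reproduce the value shown.

import hashlib

# Assumption: the analyzer hashes the bare domain string. Whether this
# reproduces the value shown above is unverified.
print(hashlib.md5("www.phex.org".encode("ascii")).hexdigest())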

Keywords: Gnutella, Phex, file sharing, p2p, peer to peer, freeware, servent, network, java, LimeWire, BearShare, Morpheus, Napster, download, upload, swarming, Gnutella client, windows, linux, mac, osx, mp3, avi, divx, mpeg, jpg, gif, GWebCache, cross-platform, GPL

Charset: ISO-8859-1
Web server: Apache
IP Address: 85.13.135.89

robots.txt for www.phex.org

# robots.txt rules mainly like they are used at http://www.wikipedia.org
#
# Please note: There are a lot of pages on this site, and there are
# some misbehaved spiders out there that go _way_ too fast. If you're
# irresponsible, your access to the site may be blocked.

# advertising-related bots:
User-agent: Mediapartners-Google
Disallow: /

# Crawlers that are kind enough to obey, but which we'd rather not have
# unless they're feeding search engines.
User-agent: UbiCrawler
Disallow: /

User-agent: DOC
Disallow: /

User-agent: Zao
Disallow: /

# Some bots are known to be trouble, particularly those designed to copy
# entire sites. Please obey robots.txt.
User-agent: sitecheck.internetseer.com
Disallow: /

User-agent: Zealbot
Disallow: /

User-agent: MSIECrawler
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: Fetch
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: WebZIP
Disallow: /

User-agent: linko
Disallow: /

User-agent: HTTrack
Disallow: /

User-agent: Microsoft.URL.Control
Disallow: /

User-agent: Xenu
Disallow: /

User-agent: larbin
Disallow: /

User-agent: libwww
Disallow: /

User-agent: ZyBORG
Disallow: /

User-agent: Sproose
Disallow: /

User-agent: Speedy
Disallow: /

# urltrends.com
User-agent: Snappy
Disallow: /

User-agent: Yandex
Disallow: /

#
# Sorry, wget in its recursive mode is a frequent problem.
# Please read the man page and use it properly; there is a
# --wait option you can use to set the delay between hits,
# for instance.
#
User-agent: wget
Disallow: /
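#
# A hypothetical polite invocation, for illustration:
#   wget --recursive --wait=10 http://www.phex.org/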

#
# The 'grub' distributed client has been *very* poorly behaved.
#
User-agent: grub-client
Disallow: /

#
# Doesn't follow robots.txt anyway, but...
#
User-agent: k2spider
Disallow: /

#
# Hits many times per second, not acceptable
# http://www.nameprotect.com/botinfo.html
User-agent: NPBot
Disallow: /

# A capture bot, downloads gazillions of pages with no public benefit
# http://www.webreaper.net/
User-agent: WebReaper
Disallow: /

# WayBackMachine archiver
User-agent: ia_archiver
Disallow: /

# Bots we don't need to have...
User-Agent: YodaoBot
Disallow: /

User-Agent: baiduspider
Disallow: /

User-agent: Shim-Crawler
Disallow: /

User-agent: MJ12bot
Disallow: /

# added 18-03-07
User-agent: IRLbot 
Disallow: /

#
# Friendly, low-speed bots are welcome viewing article pages, but not
# dynamically-generated pages please.
#
# Inktomi's "Slurp" can read a minimum delay between hits; if your
# bot supports such a thing using the 'Crawl-delay' or another
# instruction, please let us know.
#
User-agent: *
Disallow: /w/
Disallow: /trap/
Disallow: /javadoc/
Disallow: /wiki/index.php/Special:Random
Disallow: /wiki/index.php/Special%3ARandom
Disallow: /wiki/index.php/Special:Search
Disallow: /wiki/index.php/Special%3ASearch
Disallow: /wiki/index.php/Special:Recentchanges
Disallow: /wiki/index.php/Special%3ARecentchanges
Disallow: /wiki/index.php?title=Special:Recentchanges
Disallow: /wiki/index.php?title=Special%3ARecentchanges
Disallow: /wiki/index.php/Special:Newpages
Disallow: /wiki/index.php/Special%3ANewpages
Disallow: /wiki/index.php?title=Special:Newpages
Disallow: /wiki/index.php?title=Special%3ANewpages
Disallow: /wiki/index.php?title=Special:Ipblocklist
Disallow: /wiki/index.php?title=Special%3AIpblocklist
Disallow: /wiki/index.php?title=Special:Contributions
Disallow: /wiki/index.php?title=Special%3AContributions
Disallow: /*&oldid
Disallow: /*Special:
Disallow: /*MediaWiki:

## *At least* 60 seconds please, preferably more :D
## 60 seconds was not enough for Yahoo Slurp again... 10 minutes!!!
Crawl-delay: 600
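
A well-behaved crawler can check these rules programmatically. The sketch below uses Python's standard urllib.robotparser; the user-agent MyResearchBot is a hypothetical placeholder, and path wildcards such as /*&oldid are a non-standard extension that not every parser (the standard-library one included) honors.

import urllib.robotparser

# Fetch and parse the robots.txt shown above.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("http://www.phex.org/robots.txt")
rp.read()

# "MyResearchBot" is a hypothetical user-agent; it matches only the
# catch-all "User-agent: *" record, so /w/, /trap/ and /javadoc/ are off limits.
agent = "MyResearchBot"
print(rp.can_fetch(agent, "http://www.phex.org/trap/"))   # expect False
print(rp.can_fetch(agent, "http://www.phex.org/"))        # expect True

# Crawl-delay support exists since Python 3.6; depending on how the parser
# associates the directive with the record above, this yields 600 or None.
print(rp.crawl_delay(agent))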
