inndata.at

inndata - Homepage

The Innsbruck-based systems house inndata Datentechnik GmbH operates a high-performance data center and the open industriedatenpool. Alongside e-commerce systems for the building materials trade, we develop and operate interchange, an innovat…


Alexa stats for inndata.at


Site Seo for inndata.at

Tags : H1: 3, H2: 2, H3: 0, H4: 3, H5: 0
Images : There are 12 images on this website and 2 of them have alt attributes
Frames : There are 0 embedded frames on this website.
Flash : There are 0 Flash objects on this website.
Size : 48,773 characters
Meta Description : Yes
Meta Keyword : Yes
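
The image figure above (12 images, 2 with alt attributes) is the kind of count that can be reproduced with a small HTML scan. Below is a minimal sketch using Python's standard library, assuming the check simply counts <img> tags and those carrying an alt attribute; the stats tool's actual method is not documented, so this is only an illustration.

from html.parser import HTMLParser
from urllib.request import urlopen

class ImgAltCounter(HTMLParser):
    # Count <img> tags and how many of them carry an alt attribute.
    def __init__(self):
        super().__init__()
        self.images = 0
        self.with_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.images += 1
            if any(name == "alt" for name, _ in attrs):
                self.with_alt += 1

# Fetch the homepage and feed it to the parser.
html = urlopen("http://www.inndata.at/").read().decode("utf-8", errors="replace")
counter = ImgAltCounter()
counter.feed(html)
print(f"{counter.images} images, {counter.with_alt} with alt attributes")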

Majestic Backlinks for inndata.at

About inndata.at

Domain

inndata.at

MD5

267ed718967165470678ff5421918a8b

Keywords

inndata, datentechnik, baustoffdaten, internet, publishing, sortimentsbildner, eCommerce, druck, multimedia, terminals, baumarkt, kundenkarten, edifact, elektromobilität

Charset UTF-8
Programming Language ASP.NET
Web server Microsoft-IIS/10.0
IP Address 83.175.127.153
#
# robots.txt for http://www.inndata.at/ 
#
# Unfortunately, the large number of pages leads to excessive traffic from some
# irresponsible spiders. We have to block this to keep the service available.
#
#

# try to throttle the bots to one request every 5 seconds
User-Agent: *
Crawl-Delay: 5

# ubicrawler 
User-agent: BUbiNG
Disallow: / 

# advertising-related bots:
User-agent: Mediapartners-Google*
Disallow: /

User-agent: SEOkicks-Robot
Disallow: /

User-agent: BLEXBot
Disallow: /

User-agent: AhrefsBot
Disallow: /


# id-Search.org
User-agent: IDBot
Disallow: /

# work bots:
User-agent: IsraBot
Disallow: /

User-agent: Orthogaffe
Disallow: /

# Crawlers that are kind enough to obey, but which we'd rather not have
# unless they're feeding search engines.
User-agent: UbiCrawler
Disallow: /

User-agent: DOC
Disallow: /

User-agent: Zao
Disallow: /

# Some bots are known to be trouble, particularly those designed to copy
# entire sites. Please obey robots.txt.
User-agent: sitecheck.internetseer.com
Disallow: /

User-agent: Zealbot
Disallow: /

User-agent: MSIECrawler
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: Fetch
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: WebZIP
Disallow: /

User-agent: linko
Disallow: /

User-agent: HTTrack
Disallow: /

User-agent: Microsoft.URL.Control
Disallow: /

User-agent: Xenu
Disallow: /

User-agent: larbin
Disallow: /

User-agent: libwww
Disallow: /

User-agent: ZyBORG
Disallow: /

User-agent: Download Ninja
Disallow: /

#
# Sorry, wget in its recursive mode is a frequent problem.
# Please read the man page and use it properly; there is a
# --wait option you can use to set the delay between hits,
# for instance.
#
User-agent: wget
Disallow: /

#
# The 'grub' distributed client has been *very* poorly behaved.
#
User-agent: grub-client
Disallow: /

#
# Doesn't follow robots.txt anyway, but...
#
User-agent: k2spider
Disallow: /

#
# Hits many times per second, not acceptable
# http://www.nameprotect.com/botinfo.html
User-agent: NPBot
Disallow: /

# A capture bot, downloads gazillions of pages with no public benefit
# http://www.webreaper.net/
User-agent: WebReaper
Disallow: /

User-agent: *
Disallow: /WebResource.axd
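
The robots.txt above throttles all crawlers to one request every five seconds and blocks a long list of SEO and site-copying bots outright. A well-behaved crawler can honor both directives with Python's standard urllib.robotparser; a minimal sketch follows, assuming the file served today still matches the listing above (the user-agent name "MyCrawler" is illustrative):

from urllib.robotparser import RobotFileParser

rp = RobotFileParser("http://www.inndata.at/robots.txt")
rp.read()  # fetch and parse the file shown above

# The generic "*" group sets Crawl-Delay: 5, so any agent not listed
# explicitly should pause about five seconds between requests.
print(rp.crawl_delay("MyCrawler"))   # 5, per the listing above

# Agents with "Disallow: /" are denied everything; others may fetch.
print(rp.can_fetch("HTTrack", "http://www.inndata.at/"))    # False
print(rp.can_fetch("MyCrawler", "http://www.inndata.at/"))  # True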