baukalkulation.at

Standardkalkulation Hochbau

BIM-compliant standard calculations for the main service groups in building construction, based on the Austrian standardized specification Leistungsbeschreibung Hochbau, version LB-HB 20.

Site SEO for baukalkulation.at

Tag: H1: 2, H2: 2, H3: 0, H4: 7, H5: 0
Image: there are 35 images on this website, and 0 of them have alt attributes
Frame: there are 0 embeds on this website
Flash: there are 0 Flash objects on this website
Size: 44,050 characters
Meta Description: Yes
Meta Keyword: Yes
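
These counts can be reproduced directly from the page's HTML. Below is a minimal sketch, assuming BeautifulSoup (bs4) is available; the URL and the exact counting conventions (counting only non-empty alt attributes, measuring size as characters of source) are assumptions, not the stats site's documented method.

import urllib.request
from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

# Fetch the page source (URL assumed; the site's charset is UTF-8 per the table below).
html = urllib.request.urlopen("https://www.baukalkulation.at/").read().decode("utf-8")
soup = BeautifulSoup(html, "html.parser")

# Heading counts, as in the Tag row above.
for level in range(1, 6):
    print(f"H{level}:", len(soup.find_all(f"h{level}")))

# Image count, and how many images carry a non-empty alt attribute.
images = soup.find_all("img")
print("images:", len(images), "with alt:", sum(1 for img in images if img.get("alt")))

# Page size in characters, as in the Size row.
print("size:", len(html))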

About baukalkulation.at

Domain: baukalkulation.at
MD5: 63e56cfb355f61a017fb8b3010939da1
Keywords: 000
Google Analytics: UA-compatible
Charset: UTF-8
Programming Language: ASP.NET
Web Server: Microsoft-IIS/10.0
IP Address: 83.175.127.150
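
The MD5 value is presumably just the hash of the domain string itself, a convention common on stats pages like this one; a minimal sketch under that assumption:

import hashlib

# Hashing the bare domain name; whether the stats page hashes exactly this
# string (no scheme, no "www.") is an assumption.
print(hashlib.md5(b"baukalkulation.at").hexdigest())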

robots.txt for baukalkulation.at

#
# robots.txt for http://www.icontent.at/ and partner websites
#
# Unfortunately, the sheer number of pages leads to excessive traffic from some
# irresponsible spiders, which we regrettably have to block to keep the service available.
#
#

# ubicrawler 
User-agent: BUbiNG
Disallow: / 

# advertising-related bots:
User-agent: Mediapartners-Google*
Disallow:


# attempt to slow bots down to one request every 8 seconds
User-Agent: *
Crawl-Delay: 8



# Pixray.com
User-agent: pixray
Disallow: /

User-agent: Pixray
Disallow: /

User-agent: Pixray-Seeker
Disallow: /


User-agent: SEOkicks-Robot
Disallow: /

User-agent: BLEXBot
Disallow: /

User-agent: AhrefsBot
Disallow: /


# id-Search.org
User-agent: IDBot
Disallow: /

# work bots:
User-agent: IsraBot
Disallow: /

User-agent: Orthogaffe
Disallow: /

# Crawlers that are kind enough to obey, but which we'd rather not have
# unless they're feeding search engines.
User-agent: UbiCrawler
Disallow: /

User-agent: DOC
Disallow: /

User-agent: Zao
Disallow: /

# Some bots are known to be trouble, particularly those designed to copy
# entire sites. Please obey robots.txt.
User-agent: sitecheck.internetseer.com
Disallow: /

User-agent: Zealbot
Disallow: /

User-agent: MSIECrawler
Disallow: /

User-agent: SiteSnagger
Disallow: /

User-agent: WebStripper
Disallow: /

User-agent: WebCopier
Disallow: /

User-agent: Fetch
Disallow: /

User-agent: Offline Explorer
Disallow: /

User-agent: Teleport
Disallow: /

User-agent: TeleportPro
Disallow: /

User-agent: WebZIP
Disallow: /

User-agent: linko
Disallow: /

User-agent: HTTrack
Disallow: /

User-agent: Microsoft.URL.Control
Disallow: /

User-agent: Xenu
Disallow: /

User-agent: larbin
Disallow: /

User-agent: libwww
Disallow: /

User-agent: ZyBORG
Disallow: /

User-agent: Download Ninja
Disallow: /

#
# Sorry, wget in its recursive mode is a frequent problem.
# Please read the man page and use it properly; there is a
# --wait option you can use to set the delay between hits,
# for instance.
#
User-agent: wget
Disallow: /

#
# The 'grub' distributed client has been *very* poorly behaved.
#
User-agent: grub-client
Disallow: /

#
# Doesn't follow robots.txt anyway, but...
#
User-agent: k2spider
Disallow: /

#
# Hits many times per second, not acceptable
# http://www.nameprotect.com/botinfo.html
User-agent: NPBot
Disallow: /

# A capture bot, downloads gazillions of pages with no public benefit
# http://www.webreaper.net/
User-agent: WebReaper
Disallow: /

User-agent: *
Disallow: /WebResource.axd
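
These directives only help against clients that actually check them. As a usage sketch, here is how a well-behaved crawler could honor both the per-agent Disallow rules and the Crawl-Delay: 8 from the wildcard group above, using Python's standard-library robotparser; the robots.txt URL and the bot name are illustrative.

import time
import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.baukalkulation.at/robots.txt")  # URL assumed
rp.read()

# Crawl-Delay: 8 applies to any agent not matched by a more specific group;
# crawl_delay() returns None when no delay is set.
delay = rp.crawl_delay("ExampleBot") or 0  # "ExampleBot" is a hypothetical user-agent

for url in ("https://www.baukalkulation.at/",
            "https://www.baukalkulation.at/WebResource.axd"):
    if rp.can_fetch("ExampleBot", url):
        print("would fetch", url)  # a real crawler would issue the request here
        time.sleep(delay)          # honor the crawl delay between hits
    else:
        print("disallowed:", url)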