Robots.txt is a visual editor for robot exclusion files and a log analyzer. It lets a user quickly and easily create the robots.txt files that instruct search engine spiders which parts of a Web site should not be indexed and made searchable by the general Web public, and then identify the spiders that do not obey those instructions. The program provides a way to log on to an FTP or local network server and select the documents and directories that are not to be made searchable.
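For reference, a robots.txt file follows the Robots Exclusion Protocol and sits at the site root; a minimal hand-written example (the paths here are hypothetical) looks like this:

    # Applies to all crawlers; keep these directories out of search indexes
    User-agent: *
    Disallow: /private/
    Disallow: /drafts/

This is the kind of file the editor generates visually, so the rules stay syntactically valid without hand-editing.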
Visually generate industry-standard robots.txt files
Identify malicious and unwanted spiders and ban them from your site (see the log-checking sketch after this list)
Direct search engine crawlers to the appropriate pages for multilingual sites
Keep spiders out of sensitive and private areas of your Web site
Upload the correctly formatted robots.txt file directly to your FTP server without switching out of Robots.txt Editor
Track spider visits
Create spider-visit reports in HTML, Microsoft Excel CSV, and XML formats
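Identifying a disobedient spider comes down to cross-referencing server log requests against the disallow rules, which the log analyzer automates. Below is a minimal sketch of that check in Python, assuming a combined-format access log; the file name and rules are illustrative assumptions, not the product's internals:

    # Flag user-agents that requested paths a robots.txt disallows.
    import re

    DISALLOWED = ("/private/", "/drafts/")  # rules from the example robots.txt above

    # Combined log format: host ident user [time] "request" status size "referer" "user-agent"
    LOG_LINE = re.compile(
        r'\S+ \S+ \S+ \[[^\]]+\] "(?:GET|HEAD) (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
    )

    def disobedient_agents(log_path):
        """Return the user-agent strings that fetched any disallowed path."""
        offenders = set()
        with open(log_path) as log:
            for line in log:
                match = LOG_LINE.match(line)
                if not match:
                    continue
                path, agent = match.groups()
                if path.startswith(DISALLOWED):  # str.startswith accepts a tuple
                    offenders.add(agent)
        return offenders

    if __name__ == "__main__":
        for agent in sorted(disobedient_agents("access.log")):
            print(agent)

Agents that turn up here are candidates for banning, e.g. by denying their addresses at the server.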
MacOS, Linux, PDA/Mobile
License | Price:
Shareware | 99.90 USD
Rating: 1 (rated 7 times)