Secure Computing Page Verifier Information

Secure Computing® has been securing the connections between people and information for over 20 years. Specializing in solutions that secure these connections, Secure Computing helps customers create a trusted environment both inside and outside their organizations. Please visit our Web site, www.securecomputing.com, for more information about our products. As part of our work to protect customers from Internet-borne threats, we have developed a crawler that scans public Web pages looking for malware. We endeavor to run this page verification tool in a manner that minimizes undue traffic for any Web site.

If you have questions or concerns about the way we are running this service, please email us at page_verifier@securecomputing.com. We have QA resources dedicated to auditing our crawling service and ensuring that we properly follow our policy.

We strive to honor the crawler rules in your site's robots.txt file. If you would like to tell us which parts of your site we should stay out of, you can add a rule to your robots.txt file. A tutorial on how to do this is available at http://www.outfront.net/tutorials_02/adv_tech/robots.htm.
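
For illustration, here is what such a rule might look like. This is a minimal sketch: the user-agent token "ExampleCrawler" is hypothetical (our crawler's actual token is not listed on this page), and a rule under "User-agent: *" applies to all compliant robots.

    # Keep all compliant robots out of two directories:
    User-agent: *
    Disallow: /private/
    Disallow: /tmp/

    # Keep one specific robot (hypothetical token) out of the entire site:
    User-agent: ExampleCrawler
    Disallow: /

Place the file at the root of your site (for example, http://www.example.com/robots.txt); compliant crawlers fetch it before requesting other pages.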

We at Secure Computing take our commitment to our products and users seriously, and we maintain a formal policy for our Web crawler. We keep a "do not call" list of Web sites, as requested by users. To be added to the list, email page_verifier@securecomputing.com; we will add your site and confirm the addition. The policy also requires that we respond personally to all emails submitted to page_verifier@securecomputing.com, record all feedback and related information, and track this activity.

The following FAQs should answer most questions:

  • Does this Web robot belong to Secure Computing?
    • Yes. The Secure Computing SmartFilter Tools Team operates our Web crawler and verifies that it follows the robot exclusion protocol.
    • Any questions we receive are personally reviewed and responded to. Our email address is page_verifier@securecomputing.com.
  • What is the purpose of this crawler?
    • The purpose of the crawler is to scan public Web pages looking for malware.
  • Why is the crawler targeting my Web site and what is it looking for?
    • Internet-borne threats change constantly, so we direct the crawler to scan public Web pages for the latest malware.
  • Is more information available about robots and the reasons for their visits?
    • Yes. The robot exclusion standard is documented at http://www.robotstxt.org, and the tutorial linked above explains how to apply it to your own site.
  • Does Secure Computing's robot adhere to the robot exclusion standard (support of the robots.txt file)?
    • Yes. We monitor and record all inquiries and, for each site we visit, verify whether the site had a robots.txt file and that we followed its rules. A sketch of this check appears below.
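
For readers curious how a compliant crawler applies the exclusion standard, the following is a minimal sketch in Python using the standard urllib.robotparser module. The user-agent token and URLs are hypothetical examples; this illustrates the protocol, not Secure Computing's actual crawler code.

    import urllib.robotparser

    # Hypothetical user-agent token; stands in for a real crawler's token.
    USER_AGENT = "ExampleCrawler"

    # Fetch and parse the site's robots.txt before crawling any pages.
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url("http://www.example.com/robots.txt")
    rp.read()

    # A compliant crawler checks every URL against the rules before requesting it.
    url = "http://www.example.com/private/page.html"
    if rp.can_fetch(USER_AGENT, url):
        print("robots.txt permits fetching", url)
    else:
        print("robots.txt disallows", url, "- skipping")

If a site has no robots.txt file at all, RobotFileParser treats every URL as fetchable, which is why the audit described above also records whether the file was present.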