In this tutorial, we will show you how to keep the most common web crawlers out of your visitor traffic log. A web crawler is an Internet bot that periodically browses the Web, typically to index web pages.
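Under the hood, services that filter robot visits typically recognize crawlers by inspecting the User-Agent header of each request. The sketch below is only an illustration of that general idea, not this service's actual filter rules; the pattern list and function name are assumptions.

```python
import re

# Illustrative substrings commonly found in crawler User-Agent strings.
# (Hypothetical list for demonstration, not the service's real filter.)
BOT_PATTERNS = re.compile(r"bot|crawler|spider|slurp", re.IGNORECASE)

def is_robot_visit(user_agent: str) -> bool:
    """Return True if the User-Agent string looks like a web crawler."""
    return bool(BOT_PATTERNS.search(user_agent))

print(is_robot_visit("Mozilla/5.0 (compatible; Googlebot/2.1)"))        # True
print(is_robot_visit("Mozilla/5.0 (Windows NT 10.0) Firefox/121.0"))    # False
```

A visit flagged this way would simply be skipped when writing to the traffic log, which is the effect the setting below enables.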
Navigate to the “My Projects” page. Locate the project for which you want to stop logging web crawlers and click its “edit” link.
Find the “Log Filter” drop-down menu and select “Do NOT Log Robot Visits”.
Scroll to the bottom of the page and click the “Update” button.
On the “My Projects” page, you can now confirm that the feature is enabled by the robot icon displayed under the project's status.