Why does the file "robots.txt" appear in my error log?

The file "robots.txt" is requested by search engines when indexing your site. It can be used to define search parameters, or to specify certain files to index. You are not required to provide a "robots.txt" file; however, you will see periodic errors in your error log if you do not supply one. If you are interested in the function and format of the "robots.txt" file, and other methods of controlling or disabling search engines (including META tags), please visit the Web Robots Pages. Typically, creating a blank "robots.txt" file in your document root directory will prevent "robots.txt" from appearing in your error log, and will also serve to prevent indexing by most search engines.