Bots are software applications that run automated tasks over the Internet. Some of them, however, are malicious: they crawl your web root and expose your content or data to the outside world. This should be prevented. In this article I explain how to identify a bad bot and block it using .htaccess.
Enable .htaccess as described in the previous post, then open the file for editing:
vi .htaccess
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^BadBot [OR]
RewriteCond %{HTTP_USER_AGENT} ^EvilScraper [OR]
RewriteCond %{HTTP_USER_AGENT} ^FakeUser
RewriteRule ^(.*)$ http://go.away/
Save and exit
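As a side note, a variation I often prefer (my own suggestion, not part of the recipe above) is to return a 403 Forbidden response instead of redirecting, which avoids sending any further traffic anywhere, and to add [NC] so the match is case-insensitive:

```
RewriteEngine on
RewriteCond %{HTTP_USER_AGENT} ^BadBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EvilScraper [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^FakeUser [NC]
# "-" leaves the URL unchanged; [F] sends 403 Forbidden, [L] stops further rules
RewriteRule ^.* - [F,L]
```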
So, what does this code do? The lines above tell your web server to check the User-Agent string of each incoming request against three patterns: BadBot, EvilScraper, and FakeUser. The [OR] flag chains the conditions, so if the string starts with any one of the three, the rule fires and the request is redirected to a non-existent site called "go.away", keeping the bot away from your content.
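To see how the anchored patterns behave, here is a small shell sketch that mirrors the matching Apache performs. The user-agent strings are hypothetical examples; the regex combines the three RewriteCond patterns, with ^ anchoring the match to the start of the string:

```shell
# Decide whether a given User-Agent string would be caught by the rules above.
check() {
  if printf '%s' "$1" | grep -qE '^(BadBot|EvilScraper|FakeUser)'; then
    echo blocked
  else
    echo allowed
  fi
}

check 'BadBot/2.1'          # blocked
check 'Mozilla/5.0 BadBot'  # allowed: "BadBot" is not at the start of the string
check 'EvilScraper'         # blocked
```

Note the second case: because the pattern is anchored with ^, a user agent that merely contains "BadBot" somewhere in the middle is not matched, only one that begins with it.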