Would you like to learn how to configure the Apache server to deny access to bad Bots and Crawlers? In this tutorial, we are going to configure the Apache server to block access from bad Bots and Crawlers.

• Ubuntu 20
• Ubuntu 19
• Ubuntu 18
• Apache 2.4.41

In our example, the Apache server is hosting the website WWW.GAMEKING.TIPS.


Tutorial Apache - Blocking bad Bots and Crawlers

Search the Apache log file for a list of User-agents.

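On Ubuntu, the Apache access log is usually stored at /var/log/apache2/access.log. Assuming the default combined log format, a command such as this one extracts the User-agent field and counts how many requests each one made.

awk -F'"' '{print $6}' /var/log/apache2/access.log | sort | uniq -c | sort -rn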

A list of suspect User-agents that accessed your website will be displayed.


Create a list of User-agents to block.

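As an illustration only, the list could look like the example below. These names belong to well-known crawlers and are used here as placeholders; pick the agents that you actually found in your log.

MJ12bot
AhrefsBot
SemrushBot
DotBot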

Optionally, this GitHub project offers a list of bad Bots and Crawlers.

Enable the required Apache modules.

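Our example uses URL rewriting to deny access. On Ubuntu, assuming Apache was installed from the official repository, the rewrite module can be enabled with the a2enmod helper.

sudo a2enmod rewrite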

Edit the Apache configuration file for the default website.

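On Ubuntu, the default website configuration file is usually located at /etc/apache2/sites-enabled/000-default.conf. The vi editor below is only an example; use the editor of your choice.

sudo vi /etc/apache2/sites-enabled/000-default.conf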

Add the following lines to this configuration file.

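Here is a minimal sketch of a rewrite-based block, assuming the website content is stored under /var/www/html. The agent names inside the RewriteCond pattern are placeholders; replace them with your own list.

<Directory /var/www/html>
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (MJ12bot|AhrefsBot|SemrushBot|DotBot) [NC]
    RewriteRule .* - [F,L]
</Directory>

The NC flag makes the match case-insensitive, and the F flag makes Apache answer the request with an HTTP 403 Forbidden error.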

Change the USER-AGENT values in the RewriteCond pattern to reflect your needs.


Here is the file, before our configuration.

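On a fresh Ubuntu installation, the stock 000-default.conf looks similar to the excerpt below, with the comment lines removed.

<VirtualHost *:80>
    ServerAdmin webmaster@localhost
    DocumentRoot /var/www/html
    ErrorLog ${APACHE_LOG_DIR}/error.log
    CustomLog ${APACHE_LOG_DIR}/access.log combined
</VirtualHost>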

Here is the file, after our configuration.

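After adding the block, the same file should look similar to this sketch. Again, the agent names are only examples.

<VirtualHost *:80>
    ServerAdmin webmaster@localhost
    DocumentRoot /var/www/html
    ErrorLog ${APACHE_LOG_DIR}/error.log
    CustomLog ${APACHE_LOG_DIR}/access.log combined

    <Directory /var/www/html>
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} (MJ12bot|AhrefsBot|SemrushBot|DotBot) [NC]
        RewriteRule .* - [F,L]
    </Directory>
</VirtualHost>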

Restart the Apache service.

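On Ubuntu, the Apache service is named apache2 and can be restarted with systemctl.

sudo systemctl restart apache2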

In our example, the Apache server will deny access to the Bots and Crawlers selected by the administrator.

From a remote Linux computer, test your configuration.

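The curl -A option sets the User-agent header, so you can impersonate one of the blocked Bots. The agent name below is one of the placeholders from our example, and the -I option shows only the response headers.

curl -I -A "MJ12bot" http://www.gameking.tips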

The command should return an HTTP 403 Forbidden error, confirming that the blocked User-agent was denied.

The Apache server will forbid access from specific USER-AGENT values.

From a remote Linux computer, try to access the website using any other USER-AGENT value.

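As before, the -A option sets the User-agent header. The value below is just an example of an agent that is not on the blocklist.

curl -I -A "MyTestAgent" http://www.gameking.tips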

This time, the command should return an HTTP 200 OK status, meaning that the website was delivered normally.

The Apache server will allow any other USER-AGENT value to access your website.

Congratulations! You learned how to configure the Apache server to deny access to bad Bots and Crawlers.