Blocking abusive User-Agent strings in Apache reduces log noise from commodity scanners, lowers wasted CPU on expensive routes, and keeps a single misbehaving bot from turning normal traffic into a slow-motion denial of service.
Apache exposes the User-Agent header to configuration directives during request processing. Using SetEnvIfNoCase from mod_setenvif, matching requests can be tagged with an internal environment variable and then filtered by Apache 2.4 authorization rules.
Because the User-Agent header is client-supplied, spoofing is trivial; treat blocking as a coarse filter, not a security boundary. Keep match patterns narrow, scope rules to the correct content directory for the site, and run a syntax test before reloading so a bad directive does not prevent Apache from applying changes.
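As a rough sketch of how the two halves combine, the following shape is what the steps below build up; the pattern "examplebot" and the variable name bad_ua are placeholders chosen for illustration.
# Tag requests whose User-Agent matches the pattern (case-insensitive).
SetEnvIfNoCase User-Agent "examplebot" bad_ua=1

# Refuse tagged requests for the given path; everything else stays allowed.
<Location "/">
    <RequireAll>
        Require all granted
        Require not env bad_ua
    </RequireAll>
</Location>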
Steps to block user agents in Apache:
- Enable the setenvif module when it is not already loaded.
$ sudo a2enmod setenvif
Module setenvif already enabled
Skip this step on distros that do not use a2enmod or where the module is already enabled.
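On layouts without a2enmod (for example a RHEL-style /etc/httpd tree), the module is typically pulled in with a LoadModule line in the main configuration; the module path below is an assumption and varies by distribution.
# Assumed module path; adjust to the distribution's module directory.
LoadModule setenvif_module modules/mod_setenvif.so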
- Create /etc/apache2/conf-available/block-user-agent.conf with a SetEnvIfNoCase rule that marks unwanted clients.
$ sudo tee /etc/apache2/conf-available/block-user-agent.conf >/dev/null <<'EOF'
SetEnvIfNoCase User-Agent "badbot|scanner|crawler" bad_ua=1
EOF
Use specific patterns to avoid blocking legitimate crawlers, uptime checks, or real browsers.
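If only a handful of known-bad clients need blocking, anchoring the pattern keeps accidental matches down; "EvilScanner" below is a made-up product token, not a recommended blocklist entry.
# Anchor to the start of the header and match one specific product token.
SetEnvIfNoCase User-Agent "^EvilScanner/" bad_ua=1
# Optionally tag requests that send an empty User-Agent header
# (this only applies when the header is present but blank).
SetEnvIfNoCase User-Agent "^$" bad_ua=1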
- Add an access rule that denies requests when bad_ua is set.
$ sudo tee -a /etc/apache2/conf-available/block-user-agent.conf >/dev/null <<'EOF'
<Directory "/var/www/">
    <RequireAll>
        Require all granted
        Require not env bad_ua
    </RequireAll>
</Directory>
EOF
Change /var/www/ to match the DocumentRoot path used by the site when content is served from a different directory.
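To limit the rule to a single site rather than the whole document root, the same RequireAll block can sit inside that virtual host; the ServerName and DocumentRoot below are placeholders for illustration.
<VirtualHost *:80>
    ServerName host.example.net
    DocumentRoot /var/www/html
    <Location "/">
        <RequireAll>
            Require all granted
            Require not env bad_ua
        </RequireAll>
    </Location>
</VirtualHost>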
- Enable the configuration snippet.
$ sudo a2enconf block-user-agent
Enabling conf block-user-agent.
To activate the new configuration, you need to run:
  systemctl reload apache2
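On servers without a2enconf, the snippet can instead be referenced from the main configuration file with an Include line; the file layout below is an assumption for a RHEL-style install.
# Added to httpd.conf, or place the snippet in a directory that is
# already included (conf.d/*.conf on many layouts).
IncludeOptional conf.d/block-user-agent.conf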
- Validate configuration syntax before reloading.
$ sudo apachectl configtest
Syntax OK
- Reload Apache to apply the change without dropping existing connections.
$ sudo systemctl reload apache2
Rollback: disable the snippet with
$ sudo a2disconf block-user-agent
and reload again.
- Verify the block rule by sending a request with a matching user agent.
$ curl -i -A 'badbot' -H 'Host: host.example.net' http://127.0.0.1/
HTTP/1.1 403 Forbidden
Date: Sat, 10 Jan 2026 05:43:37 GMT
Server: Apache/2.4.58 (Ubuntu)
Content-Length: 281
Content-Type: text/html; charset=iso-8859-1

<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>403 Forbidden</title>
</head><body>
<h1>Forbidden</h1>
<p>You don't have permission to access this resource.</p>
<hr>
<address>Apache/2.4.58 (Ubuntu) Server at host.example.net Port 80</address>
</body></html>
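The denial can also be confirmed from the server side; mod_authz_core typically records refused requests in the error log (as AH01630, "client denied by server configuration"), assuming the default Debian log location.
$ sudo tail -n 5 /var/log/apache2/error.log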
- Confirm normal requests are still allowed with a non-matching user agent.
$ curl -I -H 'Host: host.example.net' http://127.0.0.1/
HTTP/1.1 200 OK
Date: Sat, 10 Jan 2026 05:43:37 GMT
Server: Apache/2.4.58 (Ubuntu)
Last-Modified: Sat, 10 Jan 2026 05:32:07 GMT
ETag: "29af-64801f6762249"
Accept-Ranges: bytes
Content-Length: 10671
Vary: Accept-Encoding
Content-Type: text/html
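For ongoing monitoring, the same environment variable can drive conditional logging so blocked hits land in their own file; the log path below is an assumption.
# Log only requests that were tagged by the SetEnvIfNoCase rule.
CustomLog /var/log/apache2/blocked-user-agents.log combined env=bad_ua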
