Google Analytics does not filter out all bot traffic from the organic traffic reported in the panel. We can remedy this by adding a list of bot IPs that should not be allowed on our server to the .htaccess file in the site's root folder.
The problem with bot traffic
Bot traffic can significantly distort your Google Analytics data, making it difficult to understand your real user behavior. Common sources of bot traffic include:
- Cloud hosting providers (Digital Ocean, AWS, etc.) - bots often run from these servers
- SEO scrapers - tools that crawl your site to gather data
- Spam bots - attempting to submit forms or leave spam comments
- Competitor analysis tools - checking your content and keywords
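Before blocking anything, it helps to identify candidate bot IPs by counting requests per client IP in your server's access log. A minimal sketch, assuming the common Apache combined log format where the client IP is the first whitespace-separated field (the log lines below are fabricated examples):

```python
from collections import Counter

def top_client_ips(log_lines, n=5):
    """Count requests per client IP in Apache-style access log lines.

    Assumes the client IP is the first whitespace-separated field,
    as in Apache's common/combined log formats.
    """
    counts = Counter(line.split()[0] for line in log_lines if line.strip())
    return counts.most_common(n)

sample = [
    '104.236.1.1 - - [01/Jan/2024:00:00:00 +0000] "GET / HTTP/1.1" 200 512',
    '104.236.1.1 - - [01/Jan/2024:00:00:01 +0000] "GET /feed HTTP/1.1" 200 128',
    '203.0.113.9 - - [01/Jan/2024:00:00:02 +0000] "GET / HTTP/1.1" 200 512',
]
print(top_client_ips(sample))
```

An IP with an unusually high request count, especially one resolving to a hosting provider, is a candidate for blocking.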
Solution: .htaccess bot blocking
Add the following to your .htaccess file to block known bot IPs:
## Block known bot networks (Apache 2.2 "deny from" syntax;
## Apache 2.4 needs mod_access_compat for these directives)
deny from 78.139.5.228
## Hostname-based denies force a double reverse DNS lookup on every
## request, which is slow; prefer IP ranges where possible
deny from amazonaws.com grapeshot.co.uk lipperhey.com
## Digital Ocean ranges
deny from 104.236.0.0/16
deny from 159.203.0.0/16
deny from 165.227.0.0/16
## Amazon AWS ranges (if bots originate from there)
deny from 52.4.0.0/14
deny from 52.20.0.0/14
deny from 54.64.0.0/15
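If your server runs Apache 2.4 without mod_access_compat, the equivalent rules use `Require` directives from mod_authz_core. A sketch covering the same IP ranges as above:

```apache
# Apache 2.4 equivalent using mod_authz_core
<RequireAll>
    Require all granted
    Require not ip 78.139.5.228
    Require not ip 104.236.0.0/16
    Require not ip 159.203.0.0/16
    Require not ip 165.227.0.0/16
    Require not ip 52.4.0.0/14
    Require not ip 52.20.0.0/14
    Require not ip 54.64.0.0/15
</RequireAll>
```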
Prefer server-level blocking
Use .htaccess or your hosting firewall to block known bad bots. We do not recommend installing security plugins for this; server-level rules are more reliable and do not add plugin overhead.
Monitoring results
After implementing bot blocking:
- Check your Google Analytics for reduced bot traffic
- Monitor server logs for blocked requests
- Adjust rules as needed for new bot sources
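When reviewing logs, you can check whether a given IP falls inside one of your blocked CIDR ranges with Python's standard ipaddress module. A small sketch; the ranges mirror the .htaccess example above:

```python
import ipaddress

# CIDR ranges from the .htaccess example above
BLOCKED_RANGES = [ipaddress.ip_network(cidr) for cidr in [
    "104.236.0.0/16", "159.203.0.0/16", "165.227.0.0/16",
    "52.4.0.0/14", "52.20.0.0/14", "54.64.0.0/15",
]]

def is_blocked(ip_str):
    """Return True if the IP falls inside any blocked CIDR range."""
    ip = ipaddress.ip_address(ip_str)
    return any(ip in net for net in BLOCKED_RANGES)

print(is_blocked("104.236.55.2"))  # True  (inside 104.236.0.0/16)
print(is_blocked("8.8.8.8"))       # False
```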
Important notes
- Be careful not to block legitimate crawlers like Googlebot
- Test thoroughly before implementing in production
- Keep updated as bot IP ranges change frequently
- Consider using a CDN with bot protection like Cloudflare
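On the first note above: Google's documented way to confirm that a crawler is really Googlebot is a reverse DNS lookup on the requesting IP (the hostname should end in googlebot.com or google.com), followed by a forward lookup that must return the same IP. A sketch of that check; the lookup functions are injectable parameters (an assumption made here purely so the logic can be exercised without network access), defaulting to real DNS via the socket module:

```python
import socket

def is_verified_googlebot(ip,
                          reverse_lookup=lambda ip: socket.gethostbyaddr(ip)[0],
                          forward_lookup=socket.gethostbyname):
    """Verify a claimed Googlebot IP via reverse + forward DNS.

    A crawler is accepted only if its reverse-DNS name belongs to
    Google's crawler domains AND that name resolves back to the same IP.
    """
    try:
        host = reverse_lookup(ip)
    except OSError:
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return forward_lookup(host) == ip
    except OSError:
        return False
```

Running this check against IPs claiming a Googlebot user agent keeps you from accidentally denying Google's real crawler.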
By properly blocking bot traffic, you’ll have cleaner analytics data and potentially improved server performance.