Antibot Cloud

Universal website protection from bots: Antibot, Antispam, and Firewall for WordPress and PHP websites. Fast installation and setup.

Protecting websites since 2017.

What is Antibot Cloud?

A free open-source PHP script (GNU GPL v3) plus cloud data-processing services (API, SaaS model) that protect websites from malicious bots: they prevent behavioral-factor manipulation, reduce server load, and protect against spam, hacking attempts, brute-force attacks and vulnerability scanning, scraping (parsing), proxy servers, and content theft.

SEO-friendly - safe for indexing and search promotion. For Yandex, it improves behavioral factors and protects Yandex Metrica. UX-friendly - it does not interfere with users and lets you remove CAPTCHAs from forms.

Cloud processing runs on infrastructure in Europe and Russia in compliance with GDPR and 152-FZ requirements. A free trial period is available for the cloud service.

Pricing Plans

ONE

$10 per Month

1 domain with subdomains

Unlimited bot checks

Behavioral factors protection

Trial period 10 days

Personal support via Email

Personal support via Telegram

Personal support via WhatsApp

Sign Up

UNLIMITED

$150 per Month

Unlimited protected domains

Unlimited bot checks

Behavioral factors protection

No trial period

Personal support via Email

Personal support via Telegram

Personal support via WhatsApp

Sign Up

FREE

The Antibot script (GNU GPL v3) with local checks and local filtering rules is provided for free, without limitations or additional conditions. Some of the cloud functionality in the admin panel is not included.

By default, it does not protect the website from behavioral (browser) bots.

No personal support - community help only (forums, Telegram groups).

You create local filtering rules yourself by country, city, time zone, PTR, ASNum, ASName, User-Agent, browser language, subnets or individual IPv4 and IPv6 addresses, referrer, and other parameters.

How does Antibot Cloud work?

Allow access

Real users

Real users pass an automatic check and reach the website in 1-3 seconds. Most visitors are verified automatically without any actions. If the system detects a visitor as suspicious, they will need to confirm they are not a robot - press a button or pass a quick CAPTCHA.

A visitor is checked once. After that, as long as cookies are preserved and the User-Agent and IP do not change (they rarely do), Antibot will not show a repeated check.

Search engine bots

Search engine crawler bots (Googlebot, Bingbot, Yandexbot, etc.) get access without extra checks and do not "see" the Antibot script in action. Their authenticity is verified using the "IP-PTR-IP" scheme - fake "search bots" will fail the check and be blocked.
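The "IP-PTR-IP" scheme is forward-confirmed reverse DNS: resolve the visitor's IP to its PTR hostname, check that the hostname belongs to a known crawler domain, then resolve the hostname back and confirm it returns the same IP. A minimal Python sketch of the idea (the trusted-suffix list is illustrative, not Antibot's actual configuration):

```python
import socket

# Illustrative crawler domains; a real deployment would use each
# search engine's officially documented verification domains.
TRUSTED_SUFFIXES = (
    ".googlebot.com", ".google.com",        # Googlebot
    ".search.msn.com",                      # Bingbot
    ".yandex.ru", ".yandex.net", ".yandex.com",  # Yandexbot
)

def verify_crawler_ip(ip: str) -> bool:
    """Forward-confirmed reverse DNS: IP -> PTR -> IP."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)        # IP -> PTR hostname
    except OSError:
        return False                                 # no PTR record at all
    if not host.endswith(TRUSTED_SUFFIXES):
        return False                                 # PTR is not a crawler domain
    try:
        _, _, ips = socket.gethostbyname_ex(host)    # PTR hostname -> IPs
    except OSError:
        return False
    return ip in ips                                 # must resolve back to the same IP
```

A fake bot that merely sends a Googlebot User-Agent fails at the PTR step, because only Google controls reverse DNS for its own IP ranges.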

Good bots

You can allow the bots you need in the rules - you decide who to let through and who to block. Examples of useful bots: PageSpeed Insights/Lighthouse, link-preview bots (Telegram, Facebook, X, and other social networks), and RSS/feed readers.

Custom rules

To filter traffic, you can set custom rules - allow or deny access by country and city, time zone, PTR, ASNum/ASName, User-Agent, browser language, IP, referrer, and other fingerprint parameters.
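Conceptually, such filtering is a list of match conditions evaluated against the visitor's parameters. The sketch below illustrates the idea only; the field names and rule format are hypothetical, not Antibot's actual rule syntax:

```python
# Hypothetical deny rules; Antibot's real rule format and field names differ.
DENY_RULES = [
    {"field": "asname",     "contains": "hosting"},          # datacenter networks
    {"field": "user_agent", "contains": "python-requests"},  # scripted HTTP client
    {"field": "country",    "equals": "xx"},                 # placeholder country code
]

def is_denied(visitor: dict) -> bool:
    """Return True if any deny rule matches the visitor's parameters."""
    for rule in DENY_RULES:
        value = str(visitor.get(rule["field"], "")).lower()
        if "contains" in rule and rule["contains"] in value:
            return True
        if "equals" in rule and rule["equals"] == value:
            return True
    return False
```

Allow rules work the same way in reverse: a matching visitor skips further checks instead of being blocked.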

Block access

Spam

Spam bots automatically find contact forms, comments, forums, registrations, and other input fields to mass-send ads, phishing, and malicious links. This clogs databases and email, damages domain reputation, degrades content quality, and increases website load.

Hacking attempts

Hacker bots automatically scan a website for vulnerabilities, look for admin panels and common paths, brute-force passwords, and try to exploit XSS, SQL injection, file uploads, and other weak spots. The consequences include injected malicious code and redirects, page defacement, hidden SEO spam, data leaks, ranking drops, and blacklisting. Antibot stops such attacks at the entry point, blocking suspicious requests and reducing the risk of a hack.

Scraping

Scraping is the automated collection and copying of website content using bots. Doorways and proxy sites create duplicate pages, which reduces content uniqueness in search and adds extra server load. Besides copying text and images, scrapers often extract catalog data - prices, product pages, links, contacts, email, and other public information - to use it on third-party resources or for mailings.

Behavioral bots and antidetect browsers

Behavioral bots disguise themselves as regular visitors and imitate "real" actions on the website - page views, scrolling, clicks, and navigation. They often use antidetect browsers and proxy/VPN, and spoof browser fingerprints to bypass simple filters. As a result, analytics get distorted, CTR and conversion degrade, ad budgets are wasted (click fraud), bounce rate grows, and behavioral factors get worse.

Antibot Installation Guide

Software requirements: PHP 5.6 - 8.5.2, PHP extensions (classes): SQLite3, Mbstring, GMP, ZIP, GD, cURL, OpenSSL.
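Before running the installer, it helps to confirm the required PHP extensions are loaded. One way is to compare the output of `php -m` against the list above; the helper below is an illustrative sketch, not part of the Antibot distribution:

```python
import shutil
import subprocess

# Extensions listed in the software requirements above.
REQUIRED_EXTENSIONS = {"sqlite3", "mbstring", "gmp", "zip", "gd", "curl", "openssl"}

def missing_from(loaded: set) -> set:
    """Pure comparison: which required extensions are absent from `loaded`."""
    return REQUIRED_EXTENSIONS - {name.lower() for name in loaded}

def missing_php_extensions() -> set:
    """Run `php -m` and report required extensions that are not loaded."""
    php = shutil.which("php")
    if php is None:
        raise RuntimeError("php binary not found in PATH")
    out = subprocess.run([php, "-m"], capture_output=True, text=True, check=True)
    return missing_from({line.strip() for line in out.stdout.splitlines()})
```

An empty result from `missing_php_extensions()` means all listed extensions are available; otherwise install the missing ones via your hosting panel or package manager.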

Download the installer (the antibot_installer.php script) from the user panel. User Panel.

Upload the installer to your website (e.g., via SFTP) and run it in the browser at a URL like https://your-site/antibot_installer.php.

Include Antibot in the required scripts or your CMS. Integration methods.

If you have difficulties, check the documentation section (it answers 95% of common questions) or order professional installation and setup from support (paid or free). Documentation. Order installation and setup.