Take Control Back On The Web

How It Works

First, imagine we are one of these bots: we visit a page, follow every link we find ad hoc, then move on to the next site. Now imagine the website has a blackhole. A bot crawling every link would naturally fall into it, and once it visits the blackhole URL it is banned from then on.

What about Google, et al.? They follow the robots.txt protocol, so as long as the blackhole is disallowed there, they ignore it entirely. Any bot that listens to your robots.txt rules is therefore allowed to crawl the site in the fashion those rules define.
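To make that concrete, here is a minimal sketch in PHP of the kind of check the idea implies; the file names and paths are assumptions for illustration, not the actual PHP pack. Every page first looks the visitor's IP up in a blacklist and refuses service if it is listed.

```php
<?php
// blackhole-check.php - a minimal sketch, not the actual PHP pack.
// Include this at the top of every page to turn away blacklisted IPs.

$blacklistFile = __DIR__ . '/blackhole/blacklist.txt'; // hypothetical location

$visitorIp = $_SERVER['REMOTE_ADDR'] ?? '';

if ($visitorIp !== '' && is_readable($blacklistFile)) {
    $banned = file($blacklistFile, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    if (in_array($visitorIp, $banned, true)) {
        http_response_code(403);   // banned from here on in
        exit('Access denied.');
    }
}
```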

Stage 01: ARE U A B0T?!

Bots are supposed to check robots.txt, so search engines like Google will not visit the pages it disallows, as per the specification. With this in mind, we have a way of separating good bots from bad bots.
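For example, if the trap lived at a hypothetical path such as /blackhole/, the robots.txt entry that tells well-behaved crawlers to stay away might look like this:

```
User-agent: *
Disallow: /blackhole/
```

Good bots read this and never request the trap; bad bots either never read it or ignore it.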

Stage 02: BLACKHOLE

Now, imagine a blackhole: a random URL present only in the page markup and never visible anywhere on the website itself. So who would visit it? Bots that don't listen to robots.txt, of course!
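A minimal sketch of how such a link could be planted from a PHP template (the /blackhole/ path is again just an assumption): the link sits in the markup, hidden from human visitors and marked nofollow, so only crawlers that follow every link will ever request it.

```php
<?php
// A minimal sketch: print the trap link into the page markup.
// Human visitors never see it; link-harvesting bots will.
$trapUrl = '/blackhole/'; // hypothetical path, also disallowed in robots.txt

echo '<a href="' . htmlspecialchars($trapUrl, ENT_QUOTES) . '"'
   . ' rel="nofollow" style="display:none" aria-hidden="true">blackhole</a>';
```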

Stage 03: SRSLY, THEY'RE FRIED

Once a bot visits the blackhole, its IP address is blacklisted and all of its future attempts to reach your web resource are denied.
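A minimal sketch of what the trap page itself might do (again, file names are assumptions rather than the actual PHP pack): record the offending IP in the blacklist and answer with a 403, so the check shown earlier blocks every later request from that address.

```php
<?php
// blackhole/index.php - a minimal sketch of the trap page, not the actual PHP pack.
$blacklistFile = __DIR__ . '/blacklist.txt'; // hypothetical location

$visitorIp = $_SERVER['REMOTE_ADDR'] ?? '';

if ($visitorIp !== '') {
    // Append the offender; FILE_APPEND keeps earlier entries, LOCK_EX avoids races.
    file_put_contents($blacklistFile, $visitorIp . PHP_EOL, FILE_APPEND | LOCK_EX);
}

http_response_code(403);
exit('You have entered the blackhole.');
```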

Get The Code

PHP Pack

A set of files and documentation for PHP.
PHP Kit

Central API

Coming Soon - notify us of the bad bots you catch, and we will let you know what information we have gathered from all notifications.

Other Languages

Please contribute support for other languages, based on the PHP pack. We need your help!

Get Started