Site Killing Spam Bot

You might have noticed some problems accessing Chaos Manor Reviews (CMR) this month (September 2016) – although there aren’t too many visitors to the site. The site would not load at all, usually due to a timeout of the request to load the desired page. Then yesterday (23 Sep 2016), the site was not available at all. And neither was the Chaos Manor (CM) site.

That prompted a concerned email from Jerry to me about site access problems. So I hobbled over to my computer (dealing with a sciatica problem) and started troubleshooting.

The cause of both sites not being available was the hosting company (BlueHost – BH) suspending the hosting account due to the load on the shared servers. (It is common for web sites to ‘share’ a single server and its resources; that’s how a hosting place keeps the hosting costs low.)

Calling Support

I got onto the BlueHost support line via a chat (after a long delay; they were a bit overloaded, so chat responses were slow). I got the account un-suspended, and both the CM and CMR sites were available again.

Some background on how the hosting is set up here. Jerry uses BH as his hosting place. They are a very large hosting company, probably in the top 5, I’d guess. I’ve used their hosting (and their sister service, JustHost) for many years for my personal web sites and those of clients. The uptime is good, and support is good also.

A hosting account can support more than one domain. And, to keep costs down, hosting companies usually put more than one hosting account (with its associated domains) on a single server. So multiple domains – and multiple hosting accounts – are using the resources of a single computer.

This usually works well, especially for smaller domains – those with small numbers of daily visitors – since each one puts only a small load on the shared server.

Until one domain starts hogging the resources of a shared server. That causes the other domains on that server to slow down. It’s like a big truck (a domain with lots of concurrent visitors) on a one-lane road going up a hill. The truck is loaded down with visitors, and it slows down all the other vehicles (the other domains on that shared server). The guys in charge of the highway (the hosting place) don’t want the big truck taking up all the resources.

So when a domain starts taking up a higher percentage of the user ‘load’, the hosting place has a few options. They can increase the resources of that server, or they can move the high-load domain to a different server – maybe one with fewer domains sharing it, or more powerful hardware.

But that is not cost-efficient for the hosting place. If a domain needs higher capabilities, then the customer (the owner of the domain) should pay more for a better/faster/less-shared server.

The other alternative that the hosting place has is to suspend the high-load domain, which is what BH did to all of Jerry’s domains. That reduces the load on the shared server, and the other domains on that server get their normal resources.

BlueHost will send an email to the domain owner telling them that their accounts (and sites) are suspended. Which they did – Jerry got the email alert. The problem with that procedure is that there is no ‘warning’ of the impending ‘doom’ of domains going off-line. It’s like the traffic cop always giving you a ticket – no warnings.

Jerry gets lots of email, and he happened to notice that email from BH about his sites being suspended (off-line). He forwarded it to me (I do all the technical stuff for his sites), and I started working on the problem via an on-line chat.

BH wouldn’t (couldn’t) tell me what domain was causing the overload problem. At least during the first chat. All they would say is that a domain on Jerry’s account was overloading the shared server’s resources. And that the best solution they could come up with was to upgrade the hosting account to get a more powerful server, or a server that is dedicated to his accounts – either of which would cost more.

Jerry was OK with spending the extra money, but I really wanted to figure out the root of the problem. Just throwing more server resources at it wouldn’t fix the problem.

I did get them to un-suspend the sites, so they came back on line. And a second support chat with BH got me the information that the CMR domain was the source of the overload. With that issue resolved – Jerry’s domains back online – it was time to figure out the cause of the excess load on the CMR domain.

The Investigation Starts

Most hosting places have visitor analysis logs that you can look at via log-analyzing programs, most commonly AWSTATS and Webalizer. You can also download the ‘raw’ access logs, which you can analyze off-line with a program like Log Parser Lizard. I used the on-line Webalizer via the Control Panel (cPanel) on the BH account administration screens.
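If you’d rather crunch the downloaded raw logs yourself, a short script does the job. Here’s a rough sketch (TypeScript under Node.js – my choice for illustration; it expects the common Apache “combined” log format, so adjust the regex for your host’s logs):

```typescript
// Tally hits per requested page and per client IP from an Apache-style
// access log. A quick offline sanity check, not a replacement for Webalizer.
import { createReadStream } from "node:fs";
import { createInterface } from "node:readline";

// Matches: <ip> - - [date] "GET /page HTTP/1.1" ...
const logLine = /^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)[^"]*"/;

async function topHits(path: string): Promise<void> {
  const byIp = new Map<string, number>();
  const byPage = new Map<string, number>();

  const lines = createInterface({ input: createReadStream(path) });
  for await (const line of lines) {
    const m = logLine.exec(line);
    if (!m) continue;
    const [, ip, page] = m;
    byIp.set(ip, (byIp.get(ip) ?? 0) + 1);
    byPage.set(page, (byPage.get(page) ?? 0) + 1);
  }

  // Sort descending and keep the ten biggest talkers.
  const top10 = (counts: Map<string, number>) =>
    [...counts.entries()].sort((a, b) => b[1] - a[1]).slice(0, 10);

  console.log("Top pages:", top10(byPage));
  console.log("Top IPs:", top10(byIp));
}

topHits("access_log").catch(console.error);
```

That gives you the same “top pages” and “top IPs” numbers the Webalizer charts below summarize.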

It seemed to me that I could look at the most-accessed pages to see if there was one page being accessed by an automated process (bot). Follow along as I wander through the Webalizer stats for the current month to figure out the cause of the CMR site overload.

I first looked at the daily usage of the site. It’s easy to see that there was a big spike in usage on two particular days.

[Chart: daily usage for the month, showing big spikes on two days]

Now, the CMR site doesn’t get a whole lot of visitors, because there’s nothing new there. Jerry is the main author of posts for that site, and since his stroke he has had less time and energy to devote to writing CMR articles. (It does take a bit of time to write posts – this one took me a couple of hours.)

So a big spike in traffic is not real people accessing the site; it’s probably a bot of some sort. The spikes on the 18th and 22nd are well outside the normal range of visitors to CMR. I know this because I keep track of site statistics for all of the sites that I manage – my sites, Jerry’s sites, and my client sites.

This next chart verifies the numbers from the first one.

[Chart: usage summary verifying the daily numbers]

The conclusion from the above two charts: an indication of an automated (bot) process accessing the CMR site.

Based on that theory, this next chart shows the most-accessed pages.

[Chart: the site’s most-accessed pages]

Usually there will be a good spread of use across a site’s pages. The home page will be the most accessed, and other pages will show less activity than the home page.

The above chart shows that one particular page (the “Tell a Friend” page – “TAF”) was getting tons of visits, out of proportion to the other pages (which appear further down the chart, not shown here).

This was verified by looking at the “Total Entry Pages” and the “Total Exit Pages” – the next two charts.

[Chart: Total Entry Pages]

[Chart: Total Exit Pages]

This tells me that the TAF page is the entry and exit point of the visitors – they are only looking at that page, and not elsewhere in the site.

I know what the TAF page is – I know what all the pages are, which is a good thing for a web site admin. Those two charts verify the hypothesis: a bot is accessing the TAF page, and only that page, not other pages. You can see from the two charts above that a disproportionate number of visitors are accessing that TAF page.

So, who is the culprit? Is it a single user, or a bunch of visitors? This next chart, which shows the Top 30 IP addresses accessing the entire site, tells us that one IP address is the top visitor to the site. Note that the percentages are an indicator, but they cover the entire month.

[Chart: Top 30 IP addresses accessing the site]

And who is this visitor? Let’s ask the googles. I put that IP address into the search bar, and came up with this:

[Screenshot: search results locating the IP address]

Aha! A computer somewhere in Shandong, China is the high-volume visitor to CMR. The googles give me more verification that the visitor is a bot:

[Screenshot: reports identifying the IP address as a bot]

So we have good indicators that the excessive traffic to the CMR site is a bot:

· The visitor is only accessing one page, not any other part of the site.

· The visitor is coming from a known hacker location.

We could verify this by a deeper analysis of the ‘raw logs’, but I didn’t think it was necessary. The above charts show that the TAF page is getting far more traffic than would be normal.

The Form

Since I built the CMR site, I know what the TAF page is. It’s a way to let visitors send an email to someone they know, via a simple form:

· The ‘from’ email (the visitor)

· The ‘to’ email (who the visitor wants to send a message to)

· A short message

When a visitor fills out the form, the site sends an email to the ‘to’ (recipient) with a short message of “check out this site; I think you’ll like it” plus any additional message that is entered into the form. It’s a simple way for a visitor to recommend the CMR site.
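Conceptually, the form handler just turns a submission into an outgoing email. Here’s a minimal sketch of that flow (the real site uses Contact Form 7’s PHP mail handling, not this TypeScript; the names and addresses here are made up):

```typescript
// Sketch of the TAF mail flow: a visitor's submission becomes one email
// to the recipient, with a BCC so the site admin can monitor usage.
interface TafSubmission {
  fromEmail: string; // the visitor
  toEmail: string;   // who the visitor wants to tell about the site
  message: string;   // optional extra note
}

interface OutgoingMail {
  to: string;
  bcc: string;
  subject: string;
  body: string;
}

const ADMIN_BCC = "webguy@example.com"; // hypothetical monitoring address

function buildTafMail(s: TafSubmission): OutgoingMail {
  return {
    to: s.toEmail,
    bcc: ADMIN_BCC, // every submission is copied to the site admin
    subject: "Check out Chaos Manor Reviews",
    body: `Check out this site; I think you'll like it.\n\n${s.message}\n\n– ${s.fromEmail}`,
  };
}
```

That BCC is what let me keep an eye on how the form was being used.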

The form is set up via the “Contact Form 7” plugin for WordPress. A great plugin; it allows the site web guy to easily create contact forms. I had that particular form send me a BCC, so I could monitor the use of the form. It worked well, although not too many people used it. I was getting under 10 emails a week from this form.

Contact forms are often a target of email spammers. They fill in the form, and put some links in the message area, hoping that someone will click on their spammy link, which is how the spammer collects revenue – by people clicking the links in the spam email. (Don’t do that, even if you are curious. It’s a risky business – a great way to get your computer hacked.)

We Digress

A digression about comment forms and spammers.

“Comment spam” is a problem for web site owners. Spam comments clutter up valid comment areas, and they cause excess load on the site’s server resources (our problem here), slowing down valid access to the site.

There are some ways to try to limit form access to ‘human’ (non-bot) visitors. Captchas, hidden fields, and math problems (what is 2+3?) are among the possible ways to block comment bots.
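As a toy illustration of those simple checks – this is not from any particular plugin, and the field names are made up – here’s roughly what a honeypot field plus a math question amount to on the processing side:

```typescript
// Simple bot checks: a hidden "honeypot" field that humans never see
// (so any value in it means a bot), and a trivial math question.
interface Submission {
  message: string;
  website?: string;    // hidden honeypot field; should always be empty
  mathAnswer?: string; // visitor's answer to "what is 2+3?"
}

function looksHuman(s: Submission): boolean {
  // A bot auto-filling every field will put something in the honeypot.
  if (s.website && s.website.trim() !== "") return false;
  // A human (one who can add) answers 5.
  if (s.mathAnswer?.trim() !== "5") return false;
  return true;
}
```

The trouble is that these tricks are well known, so the smarter form-bots get past them.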

What is needed – and more effective – is a way for the form to ‘sense’ a human user.

I’ve done a bit of research into this, and came up with a solution that works. The basic premise of my solution is that bots cannot emulate a click on a form field. You also don’t want the bot to grab the ‘form process’ page – for you geeks, it’s the ‘action’ parameter in the ‘form’ tag – that processes the form submission.

My solution for that, which I use on high-volume sites, is to include code in the form that hides the ‘form action’ page, and also senses a user clicking on a field. The implementation is technical, but I’ve made it available for free on my “Form Spammer Trap” (FST) site at http://bit.ly/2cvhFi6. With this additional code in the form, a non-human (the bot) will be ignored (and sent to the Form Spammer Trap site). The process works great; I’ve never gotten any bot-submitted spam on a form that uses it.
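To give a feel for the technique, here’s a simplified sketch of the click-to-arm idea. This is not the actual FST code – the endpoint URLs and form ID are hypothetical – just the general shape of it:

```typescript
// The form ships with a decoy action; only a real mouse or keyboard event
// on one of its fields swaps in the genuine submission URL. Bots that POST
// the scraped action (or never fire events) miss the real handler.
document.addEventListener("DOMContentLoaded", () => {
  const form = document.getElementById("taf-form") as HTMLFormElement | null;
  if (!form) return;

  form.action = "/not-for-bots.php"; // decoy endpoint (hypothetical)
  let armed = false;

  const arm = () => {
    armed = true;
    form.action = "/process-taf.php"; // real endpoint (hypothetical)
  };

  // Headless form-posters typically never generate these events.
  form.querySelectorAll("input, textarea").forEach((field) => {
    field.addEventListener("click", arm, { once: true });
    field.addEventListener("keydown", arm, { once: true });
  });

  // Belt and suspenders: refuse submission if the form never armed.
  form.addEventListener("submit", (e) => {
    if (!armed) e.preventDefault();
  });
});
```

A bot that POSTs directly never fires a click or keypress on a field, so it never sees the real form-processing page.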

Since the CMR site is not a high-volume site, I didn’t bother implementing the FST code for the TAF page, although it is implemented on the Comment forms. It is a bit of work to implement on a Contact Form 7-type form, and that form wasn’t being used much. I’d get an occasional TAF BCC (I get a copy of each form submission just to monitor things), but not enough of them to worry about.

Until earlier this month.

The Initial Response

I started getting TAF BCC messages, several thousand at a time throughout the day. It was clear that the TAF form was being attacked by a form-bot. Since that form wasn’t being used much, my initial response was to remove the form from the site. I removed the TAF link from the menu, but the page itself was still there.

That initial step did not work; the form spam was still being sent by the bot, because it already knew about the page. And there was another result – since the form-bot was sending these messages to random email addresses, those emails were rejected by the recipients’ mail systems. So I was getting bounce messages – tons of them.

I use Gmail, and Gmail was putting all of those bounce messages in the Spam folder. Tons of them each day, for a couple of days. I then removed the page entirely from the site, which eventually (after a few days) stopped those bounce messages from landing in my Gmail spam folder.

That turned out to be a temporary reprieve, as you can see from the charts above. Even though the page no longer existed, the spam-bot was still trying to access it, and so was still causing a problem. Removing the page will stop future spam-bots from finding it, but this guy kept trying to load the page.

The Big Stop

Back to yesterday. I had identified the spam-bot by IP address, so I needed to block those IP addresses before the requests even got to the site. This was done using the IP address blocking feature in the BH Control Panel.

I knew that the CMR (and Chaos Manor) sites probably did not need visitors from the Shandong Hacking School, and that the hacking school would be using multiple IP addresses. So I decided to block the whole range of addresses from there. Here’s that screenshot:

[Screenshot: the blocked IP address ranges in the BH Control Panel]

There are a couple extra ones in there, but it looks like anyone from the hacker school will be blocked from accessing any of the CM/CMR pages.
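For the curious: under the hood, that cPanel feature amounts to Apache deny rules written into the site’s .htaccess file. Something like this (the address range shown is illustrative, not the actual one I blocked):

```apacheconf
# Roughly what cPanel's IP blocker writes (Apache 2.2-style syntax).
# One CIDR entry blocks a whole range; the range here is made up.
order allow,deny
deny from 198.51.100.0/24
allow from all
```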

Closing Thoughts

This may not be a final solution. I’ll need to keep monitoring the site stats to see if other IP addresses are being used by the spam-bots. And we may need to use some ‘cloud’ caching to reduce the load on the shared server.

It may also be advisable to move to a more expensive hosting plan with more powerful server resources. That’s what the BH support guys said, but I don’t think the extra cost is justified at the moment. If the CM/CMR sites get a lot more visitors (and it would have to be a significant increase over current levels), moving to a more expensive plan with more resources might be advised.

But I thought it was an interesting process to determine why a site was getting hammered. Perhaps it will help you CMR readers.

3 comments on “Site Killing Spam Bot”

  1. I have had this happen a couple of times. A plug-in called Cleantalk works well for me. I run several forums, and we often get targeted and overloaded by spam.

    • Thanks for the reply. I haven’t looked at how Cleantalk does it. But if it uses captchas, hidden fields, CSS tricks, or ‘what is 1+5’ questions, those are (IMHO) not very effective techniques.

      According to their plugin info, it’s a paid service. And it still stores the spam in the database.

      Which is why I wrote the FormSpammerTrap for Comments plugin, which I use on all the WP sites I manage or create. Although my sites are not high-volume, there are no spammy comments there. It doesn’t let the comment get into the database, so it is less intrusive on the site data (the ‘submit’ sends the spammer to the http://www.FormSpammerTrap.com site – so it might help your SEO and ‘site load’).

      And it’s free.

      Thanks….Rick…
