I’m about to move my domains to a new server, and I couldn’t help noticing that 3 of them were shifting a large amount of data, which I couldn’t explain.
Data has never been an issue before – the tariff I was on was unlimited – but the new cloud solution has a limit of 50Gb per month, with (very reasonable) charges if you go over that.
50Gb for a half-dozen personal websites – should be plenty eh?
But no. My stats were showing 60Gb as a usual usage, with peaks going up to 90Gb.
Now, I would be delighted if that many people were visiting my sites, but I know that not to be the case. So I spent this lunchtime looking at the logs.
Boy, am I embarrassed! 😀
A while back – 2017, I think – I started using Hetrix. I came across them as a solution to email server blacklisting – they monitored the lists for your domains and warned you if you appeared on any of them. But they also did an uptime monitor that would check your websites and shout if they went down.
I remember setting these up very conservatively – I thought I had it checking once an hour. I wanted to know about downtime, but it was not critical.
Sometime between then and now, the 1 hour option disappeared. Instead, it had been set to the default of 1 minute – for each of 3 websites.
But that wasn’t one poll for each website every minute. You had to specify 3 servers that would poll you – New York, London, Berlin. So that is a total of 3 polls of 3 websites every minute.
What were they polling? I just pointed them at the website, so it would hit the front page. According to my logs, each hit was about 140K.
So that is 9 hits a minute, each about 140K, times 60 minutes, times 24 hours, times 30 days. That’s 55Gb/month, without any other traffic.
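That arithmetic can be double-checked with a few lines of Python (a quick sketch using the figures above, and assuming “140K” means 140 kilobytes of 1024 bytes each):

```python
# Back-of-envelope check of the old monitoring traffic.
# Figures from the post: 3 polling servers x 3 websites, once a minute,
# roughly 140K per hit (the full front page).
hits_per_minute = 3 * 3               # 3 servers each polling 3 websites
bytes_per_hit = 140 * 1024            # ~140K per front-page fetch
monthly_bytes = hits_per_minute * bytes_per_hit * 60 * 24 * 30
print(f"{monthly_bytes / 1e9:.1f} GB/month")  # roughly 55 GB
```

Sure enough, that lands at about 55Gb a month before a single real visitor shows up.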
That has now changed. Unfortunately, I now have to specify 4 servers to poll me (another change that only surfaced when I made adjustments), but they are polling just one of my websites, on the basis that if that site is down, the whole server probably is. It is polling every 10 minutes, which is now the longest interval Hetrix allows. And it is no longer hitting the front page, but a page containing just the word hetrix. With the overhead of headers, that still comes out to about 200 bytes, but far better than 140K.
So the calculation now is 4 servers, times 1 website, times 200 bytes, times 60/10 minutes, times 24 hours, times 30 days. That comes to under 6Mb/month, a big improvement on 60Gb.
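The same quick check works for the new setup (again a sketch, using the figures above):

```python
# Back-of-envelope check of the new monitoring traffic.
# Figures from the post: 4 polling servers, 1 website, one poll each
# per 10 minutes, ~200 bytes per hit (tiny page plus headers).
polls_per_hour = 4 * (60 // 10)       # 4 servers, 6 polls per hour each
monthly_bytes = polls_per_hour * 200 * 24 * 30
print(f"{monthly_bytes / 1e6:.1f} MB/month")  # about 3.5 MB
```

About 3.5Mb a month – comfortably under that 6Mb figure, and a rounding error against the 50Gb allowance.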