Cloud Backups

I got started with Amazon Glacier today – it’s ultra-low-cost cloud storage, with the caveats that retrieving anything takes a few hours, and that the only real access to it is via a programmatic API. The first point is fine, since I’m looking to use it for backups of my home server and hopefully will never need to pull anything back. As for the second, there are already plenty of third-party clients written against the API! I am using CloudBerry Amazon S3 Explorer, which supports Glacier as well. I’ve already uploaded about 200 MB of RAW camera files… Now to downsize those to small images on my server and get rid of the big digital negatives that are backed up in Glacier… Fun stuff.
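
For the curious, the API really is simple. Here’s a rough sketch of what a scripted upload might look like using Amazon’s boto3 Python SDK (I’m using CloudBerry myself, so treat this as illustrative only – the vault name and file path are made-up placeholders):

    # Rough sketch of a programmatic Glacier upload via boto3.
    # The vault name and file path are hypothetical placeholders.
    import boto3

    glacier = boto3.client("glacier", region_name="us-east-1")

    with open("IMG_0001.CR2", "rb") as raw_file:
        response = glacier.upload_archive(
            vaultName="home-server-backups",
            archiveDescription="RAW camera files",
            body=raw_file,
        )

    # The archive ID is the only handle you get for later retrieval,
    # so store it somewhere safe.
    print(response["archiveId"])

Retrieval is the same idea in reverse, except you initiate a retrieval job and come back hours later for the result – which is exactly why this is for backups I hope to never touch.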

Ten Times Faster

The past few days, Comcrap seems to have been playing IP games, perhaps while doing maintenance on their network. On Sunday my external IP changed to a very different number, and I altered my DNS settings accordingly so people could still get to my webserver. Then on Monday morning it switched back! Oh, I guess that makes sense if they are doing maintenance (IPv6 support, PLEASE!) and need to isolate some of their infrastructure. The problem is that a few hours later, service to my house was dropped completely.

This is bad because, while I have a caching reverse proxy in the cloud, all the dynamic parts of my websites are hosted on my home server. In particular, my wife’s photography business, which is taking off, was not able to serve clients. Groan. Maybe I should bite the bullet and beef up the memory of the cloud server. Twice the “next to nothing” cost of my current cloud server profile (for doubling the memory) would, I guess, still be “near nothing”…
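
Side note on the IP shuffling: since the external IP can apparently change without warning, a little watcher script is handy. Here’s a rough sketch in Python (the checkip.amazonaws.com service and the state-file path are just example choices, not what I actually run) – drop it in cron and it nags you to update DNS:

    # Rough sketch: poll an external "what is my IP" service and report changes
    # so DNS can be updated. The service URL and state file are example choices.
    import urllib.request
    from pathlib import Path

    STATE_FILE = Path("/var/tmp/last_external_ip.txt")

    with urllib.request.urlopen("https://checkip.amazonaws.com", timeout=10) as resp:
        current_ip = resp.read().decode().strip()

    last_ip = STATE_FILE.read_text().strip() if STATE_FILE.exists() else None

    if current_ip != last_ip:
        print(f"External IP changed: {last_ip} -> {current_ip}; update DNS!")
        STATE_FILE.write_text(current_ip + "\n")
    else:
        print(f"External IP unchanged: {current_ip}")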

At any rate, while considering all this, I had the downed connection to deal with. The standard reboot-the-cable-modem routine did not solve the problem like it had in the past. Since my neighbor had Internet, a call to Comcrap was in order. The tech, Jeremy, was actually pretty nice, but after a while said he would have to call me back. Oh well.

While I was waiting for the callback, I figured now was the perfect time to install the new $22 5-port gigabit switch that I had bought from Fry’s, and move the WRT54G aside to serve only wifi clients. My servers and desktop will like being able to talk to each other fast! I did so, rearranged the wires, and cleaned things up a bit. Things were nice and fast now between my main server and my Untangle home gateway. Happy! Ten times faster.

Comcrap called back saying things should work now, but had me unplug the cable modem for a few minutes before plugging it back in. I tested with my laptop connected directly to it (I know, bad – the average unprotected PC facing the net gets hacked in 32 seconds or some such!), and the connection worked.

The other thing I wanted to address with Comcrap was what my speed should look like. I had tested it the other day with speedtest.net and gotten only 1.4 Mbps down and 0.3 Mbps up, whereas my neighbor was getting 30 down and 5 up! I went to speedtest again to give Jeremy the results, and lo and behold, I was getting 33 Mbps down! Wow! I wondered what they had fixed. Happily, I thanked him for his help and let him go.

Now, putting it all back together with the Untangle server as the gateway, I tested again… UGH – back to 1.4 Mbps downloads. I got the same when testing from my server. This meant only one thing: Untangle must be causing the problem. I went to the Untangle server and disabled the spam, virus, phish, and attack sensors, and also the web filtering, OpenVPN, and even the firewall, leaving it as pretty much a router with no blocks. STILL 1.4 Mbps down. UGH.

Poking around in the networking config, I saw the advanced link hiding in the corner, and sure enough, in there was the QoS (Quality of Service) section. Memories came back – I had set up QoS a long time ago to prevent certain clients on the internal network from hogging too much bandwidth (OK, teenagers downloading movies). The thing is, QoS asks you to tell it what your download and upload speeds are, which back then were 1.5 Mbps down and 0.367 Mbps up. With those numbers, QoS knows what rates to throttle connections back to. The downside was that QoS effectively imposed its own cap on ALL traffic based on what I had told it, and as Comcrap upgraded the service over the years, I never saw the improvement.

Lesson learned: either don’t use QoS, or remember to update its settings with accurate download/upload throughput numbers! Otherwise you will be capping your access unnecessarily.
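
If you want to catch this kind of drift without thinking about it, something like the following rough sketch could help. It assumes the speedtest-cli Python package is installed and is run from a machine outside the QoS gateway (like my laptop plugged straight into the modem); the configured limits here are just my old, stale Untangle values:

    # Rough sketch: measure actual throughput and compare it to the QoS limits
    # you configured, to spot stale settings. Assumes the speedtest-cli package.
    import speedtest

    # Hypothetical values copied from the QoS configuration, in Mbit/s.
    QOS_DOWN_MBPS = 1.5
    QOS_UP_MBPS = 0.367

    st = speedtest.Speedtest()
    st.get_best_server()
    down_mbps = st.download() / 1_000_000
    up_mbps = st.upload() / 1_000_000

    print(f"Measured: {down_mbps:.1f} Mbps down / {up_mbps:.1f} Mbps up")
    if down_mbps > QOS_DOWN_MBPS * 1.5 or up_mbps > QOS_UP_MBPS * 1.5:
        print("The line is much faster than the configured QoS limits -- update them!")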

Network and Server Monitoring

One of my passions is automated monitoring and correction of network and server problems. I have the most experience with SiteScope (primarily a commercial website monitoring tool that has branched out to cover protocols, application stacks, and whatever custom stuff you want) and Nagios, which is free, open source, and very, very configurable.

I would LOVE to form a company implementing these or similar monitoring tools. I’ve done this a lot at work, and a little bit on the side for a few friends and their companies.

I’ve been looking around, and it seems that in addition to Nagios, two other contenders are Zenoss and Cacti. I’ve heard good things about Zenoss, specifically how it is easier to set up than Nagios. I think I may check it out, though I am a fan of Nagios’s flexibility.
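
Part of what makes Nagios so flexible is how simple the plugin contract is: a check is just any program that prints a status line and exits with 0 (OK), 1 (WARNING), 2 (CRITICAL), or 3 (UNKNOWN). As a rough sketch, here’s what a custom website-response check might look like in Python (the URL and thresholds are made-up examples):

    #!/usr/bin/env python3
    # Rough sketch of a Nagios-style check plugin: print one status line and
    # exit with 0=OK, 1=WARNING, 2=CRITICAL. URL and thresholds are examples.
    import sys
    import time
    import urllib.request

    URL = "http://example.com/"
    WARN_SECONDS = 2.0
    CRIT_SECONDS = 5.0

    try:
        start = time.time()
        with urllib.request.urlopen(URL, timeout=10) as resp:
            status = resp.status
        elapsed = time.time() - start
    except Exception as exc:
        print(f"CRITICAL - {URL} unreachable: {exc}")
        sys.exit(2)

    if status != 200 or elapsed > CRIT_SECONDS:
        print(f"CRITICAL - HTTP {status} in {elapsed:.2f}s")
        sys.exit(2)
    if elapsed > WARN_SECONDS:
        print(f"WARNING - HTTP {status} in {elapsed:.2f}s")
        sys.exit(1)
    print(f"OK - HTTP {status} in {elapsed:.2f}s")
    sys.exit(0)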

Wireless Network Upgrade

I recently bought the Linksys pre-N router (WRT300N) and a laptop PC card. I hate to say it, but the 4x range that it promises is flat-out wrong. I get a better connection to my wireless-G router (WRT54G) up on the second floor. However, the throughput is amazing – 270 Mbit/s as opposed to 54. I just need to get it working well.

I see that they have come out with some firmware upgrades since I last checked a week ago, and the notes mention improved wireless power and performance. We’ll see what happens!