Linux

I’ve been using IPcop for years. I use it as a secondary router/firewall at work and it has been my main firewall at home for years. At home it used to run on the first computer I personally ever purchased: an IBM Aptiva AMD K6-2 233MHz system with about 128MB of RAM and an upgraded 20GB hard drive. After years of faithful service, I ran into some “trouble.” I walked into the basement to hear quite a loud whirring and whining from the corner where the computer was installed. The router was amazingly still working; I assume everything was running from memory and not the disk. I started searching for parts but realized that I no longer had an old ATA hard disk in my parts bin; everything I had was SATA or too large to be wasted on a small home firewall. Luckily my main wireless access point is a more than capable Linksys WRT54GS. It took a bit of conversion, but now I am running with the Linksys alone. IPcop was really nice, I just no longer see the need for it at home. It’s served me well, I continue to use it at work, and I can say nothing but good things about it.

I’ve also been messing with some Dallas 1-Wire temperature sensors. They are pretty simple to get working, as most of the drivers are already in the Linux kernel. There are a few software packages that let you interface with them directly; some work rather well, some not so well. Of the ones I’ve tried, owfs worked the best for me. Each sensor has a hardware address, and owfs allows you to query it directly to get the current temperature:

$ owread -F /10.8889C7000800/temperature
61.5875

That’s the current temperature in our basement. I have to do a little wiring to move the sensor away from our pellet stove; otherwise, when the stove is running, the reading jumps to an inaccurately high temperature. I’ve also been working with a few different programs to help graph the temperature trends. Once that is all working, I can get sensors for all over the house. Why? Really, why not. Because I can.
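For the graphing side, the first step is just getting readings into a file on a schedule. Here’s a minimal logging sketch; the sensor address is the one from above, but the log path and the CSV format are my own assumptions, so adjust to taste:

```shell
#!/bin/bash
# temp-log.sh -- append one timestamped reading per run (cron-friendly).
# Sensor address is from the example above; logfile path is an assumption.
sensor="10.8889C7000800"
logfile="${LOGFILE:-$HOME/templog.csv}"

if command -v owread >/dev/null 2>&1; then
    # owread -F returns Fahrenheit, as in the example above
    temp=$(owread -F "/$sensor/temperature")
    printf '%s,%s\n' "$(date +%s)" "$temp" >> "$logfile"
fi
```

Run it from cron every few minutes and you end up with a CSV that gnuplot or rrdtool can chew on later.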

It never fails. Yes, I had a small rant about being happy that my work machine was 4 years old and still working perfectly. Then a new dual-core machine “fell into my hands.” I couldn’t help it; when hardware sits around long enough at work for me to grab, I don’t hesitate. It is a 64-bit system, but I didn’t feel like installing Gentoo all over from scratch. This install is my first Gentoo install and continues to be my longest-maintained install to date. I first installed Gentoo 1.4 on December 26th, 2002, during that Christmas break. The system has slightly newer hardware since then, but has never been re-installed from scratch. That’s quite an accomplishment (in my book, anyway).

At any rate, it’s an HP system with an AMD Athlon(tm) 64 X2 Dual Core Processor 4000+ at 2.1GHz and 1GB of RAM (so far; this needs an upgrade). Some of the chips are ATI-based, so I had to get an nVidia-based video card; ATI and Linux just don’t play nice. Luckily, my old system was recently upgraded to SATA 3.0 with 2 additional drives added, so I just moved that setup over and added some extra drivers to the kernel. The hardest part seemed to be the dual-core setup, but setting up ACPI in the kernel took care of that.

One of my favorite open-source tools HAS to be mutt. In my opinion, it is the single greatest mail client that has ever existed. It’s small, super fast, and so configurable that it makes my head hurt. If you need something done, mutt can do it.

I’ve since moved all of my mail to my colo server. All my personal mail has already been delivered there for years, and recently I started POPs-ing my mail from work using fetchmail. I refuse to use MS Exchange’s webmail. It’s just awful to use no matter what anyone else says.

Anyway. The last thing I needed to make work was remote printing. There is a separate muttprint package that will take mutt’s generic text output, pretty it up, and send it to a printer. The obvious problem is that I don’t have a printer connected to my colo server. Even if I did, it would be a pain to drive to the facility to retrieve documents. After a bit of Google work along with some ssh magic, I’ve come up with the solution:


set print_command="muttprint -p -| ssh me@my_work_desktop_hostname \"lpr -P techlaser\""

That’s right, the power of the | (pipe, for you Windows n00bs). Encrypted printing from my colo to my office laser printer. Nice.
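One gotcha worth noting: mutt runs that print command non-interactively, so if ssh stops to ask for a password the print job just hangs. Key-based auth avoids that. A sketch of the setup (the key path here is only for illustration; normally you’d use the default ~/.ssh/id_rsa):

```shell
# Generate a passphrase-less keypair (illustrative path, not the default)
key=$(mktemp -u /tmp/colo_print_key.XXXXXX)
ssh-keygen -q -t rsa -N "" -f "$key"

# Then push the public half to the work desktop (same placeholder
# hostname as in the print_command above):
#   ssh-copy-id -i "$key.pub" me@my_work_desktop_hostname
```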

Not only can you play music directly from last.fm, but you can also scrobble your own tracks. Unfortunately, most of the software is (unsurprisingly) Windows-based.

Nuh uh!

Download, compile, and install cmus and grab post.fm. Yes, it is Perl, but it works anyway. Make the script executable and add your last.fm username and password. Next, start up cmus and type:

set status_display_program=/path/to/post.fm

Start playing tunes. Super simple, and it works in the background. The cmus interface is a little wonky and takes some time to get used to, but again, it works, and there is no need for yet another clunky GUI!

Leave it to Apple to do things differently. I swear a lot of their changes make no sense, but they make them anyway just to be different. If you have a MacBook and have no need for OS X (like myself), then you have already installed some flavor of Linux. Use Gentoo; it just makes sense.

However, the 13″ MacBook doesn’t have Page Up, Page Down, Home, or End keys. Apple uses the Up, Down, Left, and Right arrows in conjunction with their Function (fn) key. To me, that’s worthless for something as fundamental as Page Up and Page Down. Install xmodmap and put this in your config file:

keycode 108 = Delete
keycode 75 = Home
keycode 76 = End
keycode 95 = Page_Up
keycode 96 = Page_Down

Run xmodmap xmodmap.conf and you are all set. This remaps F9 to Home, F10 to End, F11 to Page Up, and F12 to Page Down. The first line also remaps the smaller Enter key near the arrow keys to Delete, since the MacBook doesn’t have that key either.
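To avoid re-running that by hand after every X restart, a common convention (the filename is just convention, not a requirement) is to save the keycode lines as ~/.Xmodmap and load them from your ~/.xinitrc before the window manager starts:

```
# ~/.xinitrc -- apply the custom key map at X startup
xmodmap ~/.Xmodmap
```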

But 2 enter keys. That sounds like a good idea…

I am EXTREMELY paranoid. Read that again: EXTREMELY paranoid. I tend to back up important data two or three times to multiple locations. It’s just not worth the headache of losing it. So kids, here is a quick and dirty bash script to dump all of your MySQL databases and compress them into individual archives. I know it’s not rocket science, but it works.

#!/bin/bash
password="mysqlrootpassword"
backupdir="/home/backup/mysqldumps/"
cd "$backupdir" || exit 1
# -N skips the "Database" header row, -B gives plain one-per-line output
for database in $(mysql --password="$password" -N -B -e "show databases"); do
  mysqldump --password="$password" "$database" | bzip2 > "$database.sql.bz2"
done
chmod 600 "$backupdir"/*

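Naturally, a backup script you have to remember to run isn’t much of a backup script, so the follow-up is a cron entry. The schedule and the script’s location below are assumptions; point it at wherever you saved the script:

```
# crontab -e -- dump all databases at 3:15 every morning
15 3 * * * /home/backup/mysqldump.sh
```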

scp resume

For years I wished there was a resume option for scp. After a bit of Google work, along with some testing, I now have one using our good friend rsync. Stick this in your .bashrc:

alias scpr="rsync --partial --progress --rsh=ssh"

Simple. If you are strange and don’t use bash, stick that in your specific shell’s startup file.