Linux

This couldn’t be easier. Here is how to extract the audio from .mp4 video files (YouTube and others) and convert it to mp3:

for file in *.mp4; do mplayer -vo null -ao pcm:file="$file.wav" "$file" && lame -b 192 "$file.wav" "${file%.mp4}.mp3"; rm "$file.wav"; done

Diggity done. Make sure you have mplayer and lame installed; otherwise, well, you know.

So I live in the past, whatever. Most of my music collection files are still mp3s, with a few oggs here and there. Jen had a free $25 gift card to iTunes, so I bought an album that neither Amazon nor my “russian connection” had on their lists. I felt so dirty, but it had to be done. Anyway, if you want to convert m4a files to mp3 from the command line, run this:

for i in *.m4a; do faad -o - "$i" | lame - "${i%.m4a}.mp3"; done

Diggity done. Obviously, make sure faad and lame are installed. Re-tag and you are all set.

Well, I finally picked up an Asus Eee PC 1000HA 10″ laptop. It ships with Windows XP Home, which is really fine with me, but you know I need Linux on it.

On another user’s recommendation, I decided to give Arch Linux a try. I know, you KNOW I wanted to use Gentoo, but frankly, I didn’t feel like compiling all of that on a 1.66GHz Intel Atom processor.

So far I like Arch; it’s easy to use and their wiki is really well done. With only a few hours of messing around, I have a 95% functional system working just about the way I want it.

However, I noticed that for some reason, I had problems getting bash-completion to work with my ssh known_hosts files. After opening it up, I noticed that the IP addresses and host names were all hashed. That kind of sucks. I don’t like performing any unnecessary typing! If this happens to you, check out /etc/ssh/ssh_config:

HashKnownHosts yes

Seems that was set for me. I hate that. Do yourself a favor and comment that out (with a #), or set it to no. That way you can bash-complete any and all hosts that you use on a daily basis.
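
If you’d rather not touch the system-wide file, the same setting can go in your per-user config instead (this is my suggestion, not part of the original fix):

```
# ~/.ssh/config
HashKnownHosts no
```

One caveat I’m fairly sure of: entries that are already hashed stay hashed; only newly added hosts are stored in the clear, so you may want to clear out ~/.ssh/known_hosts and reconnect to your usual hosts once.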

More on the Eee PC later…

I use my linode server for pretty much everything. However, one of its main jobs is to house all of my email needs. I have several domains which receive personal email, and it all collects on the server. For quite some time, I’ve used “fetchmail” to collect my work mail and deliver it via the local mail server. It would then be filtered and delivered to a second inbox. I had three total inboxes: one for my main account, a secondary for a backup account, and a third for my work email. The reason for the multiple mailboxes was to make it easier to set up email “profiles.” Using the console-based mutt email client, I have set up folder rules. If I am in the work inbox, mutt will use the specific settings that I use for work: mainly the reply-to address, outgoing name, and signature.
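
Those per-folder “profiles” in mutt are done with folder-hook; here’s a hedged sketch of what such rules look like (the mailbox names, addresses, and signature paths are made up for illustration, not from my actual setup):

```
# ~/.muttrc -- per-folder identity switching
# Default identity for everything; later hooks override it
folder-hook .     'set from="me@personal.example" realname="My Name" signature="~/.sig-personal"'
# Identity used while reading the work inbox
folder-hook =Work 'set from="me@work.example" realname="My Name" signature="~/.sig-work"'
```

The catch-all `.` hook has to come first so entering a non-work folder resets the settings back to the defaults.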

In a nutshell, fetchmail is really a pain. Its config file is about the stupidest setup that I’ve ever seen, and all in all, it seemed like a lot of extra steps for something that really should be pretty simple.

Enter getmail. It’s the same style of program, but it is much easier to install and doesn’t require a local mail server to filter and deliver mail. Instead, it can save to local folders using several different storage methods (mbox file, Maildir, etc.).

It was time to add another Inbox. This time, I needed to add one to Gmail. So, if you are looking to grab mail remotely from Gmail and store it somewhere, here is a quick and simple getmail config file that gets the job done:

[retriever]
type = SimplePOP3SSLRetriever
server = pop.gmail.com
username = username@gmail.com
password = password

[destination]
type = Maildir
path = ~/.maildir/.Gmail/

[options]
delete = true
verbose = 0
message_log = ~/.getmail/gmail.log
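
One thing worth noting: the Maildir destination expects the mailbox directory, with its cur/new/tmp subdirectories, to exist before the first run. A quick sketch matching the path in the [destination] section above:

```shell
# Create the Maildir layout getmail's Maildir destination writes into;
# the path mirrors the [destination] section (adjust to taste)
mkdir -p ~/.maildir/.Gmail/cur ~/.maildir/.Gmail/new ~/.maildir/.Gmail/tmp
```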

If that is the only mail you want getmail to fetch, create a ~/.getmail directory and name the file getmailrc. However, getmail can also gather mail from multiple locations using the --rcfile switch, i.e.:

getmail --rcfile ~/.getmail/getmailrc_gmail

Stick that in your crontab and you are off. Last, I just set up the Gmail folder in mutt, and it will periodically check that folder for new mail. For security’s sake, please make sure you chmod 600 everything in your ~/.getmail directory, or snoopy users can see your email passwords.
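
For the crontab part, an entry might look like this (the five-minute interval is my own choice, pick whatever polling frequency suits you):

```
# m    h dom mon dow  command
*/5  * *   *   *      getmail --rcfile ~/.getmail/getmailrc_gmail
```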

I am finally starting to outgrow my relationship with my current co-location service. In a nutshell, I built and configured this server and pay a company to run it in their facility. It’s been great for years and really has been incredibly cheap compared to most co-location costs. However, due to rising costs as well as the new technology of virtually hosted machines, I will be moving to a new facility.

The biggest problem I have at this time is remote access. It takes me roughly 1.5 hours to get to the facility, so if/when there is a problem, that really isn’t very convenient. I am currently sharing a connection with eric and he has been more than gracious to help with problems since he lives far closer to the facility than I do. However, I hate relying on other people, and more so, I hate to waste eric’s precious time, so it is time to move. The new system will allow me full console access to the machine, so if I have to reboot it or get at the boot process (failed kernel upgrade, etc.), I can do that via ssh or the web. Also, a virtually hosted system means that hardware failures will no longer be my problem; the facility takes care of that themselves. They will also handle hardware and systems upgrades; all I need to worry about is the OS and software.

Now the hard part. I have to start migrating all of my users and hosted web sites to the new system. I have what I believe will be a decent and rock-solid migration plan. I have already purchased the virtual machine and have begun getting the necessary software and daemons working there. After some testing, I will start to migrate the web data. This is actually going to take some time, because rsync-ing 12GB of data won’t happen immediately. After that is complete, I plan to start migrating user home directories, which include the mail setup. Again, I do not see a problem here, but transferring that much data across the Net will take some time. During that window, I will have to shut some things down and allow the backup mail server at my home to queue up incoming messages. Once the migration is complete, I can enable the mail system at the new facility and manually offload the queued messages from the backup server.

Again, this is the plan. I’m sure I will run into a hiccup here and there, but that’s expected. My main goal is no data loss. I would hate to lose any email messages that I have and I bet my users agree.

The good news is that once the migration is complete, I will have one hell of a server left over. That means I will be able to upgrade my server at home and will really be set.

gkrellm screenshot

If you have a *nix workstation you have probably either seen or used GKrellM. It’s a handy dandy program to give you up-to-the-second stats on most of the important data on your system: disk space/activity, network traffic, CPU and memory usage, processes, the works.

If you are looking to run it on remote headless machines, ssh is your friend and makes life easier. Install GKrellM on the remote box. In a local terminal, do the following:

ssh -N -f -L 19150:127.0.0.1:19150 user@host.com

Obviously, change the “user@host.com” to your user and host. You can also change the 19150 to pretty much anything, just make sure you match that in the gkrellmd.conf file. Once the gkrellmd daemon is started, you can then connect to it by running:
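
For reference, a minimal gkrellmd.conf on the remote box might look something like this (the update-hz value is just a sensible choice of mine, not from any particular default):

```
# /etc/gkrellmd.conf
# Port must match the one used in the ssh tunnel
port 19150
# Only accept connections from localhost, i.e. the tunnel endpoint
allow-host 127.0.0.1
# How many times per second to update stats
update-hz 3
```

Restricting allow-host to 127.0.0.1 means the daemon is only reachable through the tunnel, which is the point of the exercise.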

/usr/bin/gkrellm -s 127.0.0.1 -P 19150

Again, change the 19150 port to match whatever you used in the step above. Configs are stored locally and can be manually edited, or you can use the standard GUI setup. You can then run the above with different ports for different hosts.

Happy monitoring!

Thanks to Eric, I finally found mailing list management software that is not only easy to set up, but super easy to use. If you are looking for something easy yet powerful, check out mlmmj. Most of the configs as well as message storage are all flat text files, so everything is super fast and simple to configure. It only took a few minutes to get a new list set up and running, and most of that time would have been saved if I had actually read the documentation.

But come on, what fun is that?

OK, so I’m sure everyone knows that I am “not exactly a big fan of Windows.” We have all heard it before, so I won’t dwell on it. At this point, I am about 90% Windows free. Most of what’s left, however, is for work. The Novell network stuff still relies on Windows for basic management. The current version of ConsoleOne (Netware administrator) is written in Java and actually runs better on my Linux box than it does on any Windows machine I have tried. So I still run Windows XP in a VMware desktop on my Gentoo box. I’ve come to find a small warm spot in my heart for XP, almost enough to call it “not too terribly bad.” Don’t worry, that’s as good as it will get.

Anyway, enter Vista. I decided to give it a whirl in a fresh VMware session. All I can say is Oh. My. God. I gave the thing 2 virtual processors and 2.5GB of RAM and it was still awful. I personally think the new Windows Explorer (My Computer, etc.) design is terrible and totally difficult to use. I know the argument is that “it is just different,” but I don’t think so. It made simple things that I do hundreds of times a day so much harder. It only took me a half hour of playing around before I killed the virtual machine and wiped it into oblivion. It was that bad.

I guess if you really don’t have anything important to do with a computer (like Spider Solitaire) Vista is OK. But really, if you need to get stuff done, stick with XP.

I noticed there was an Openfire (OSS Jabber server suite) upgrade in Gentoo’s portage, so that gave me the shakes to upgrade. I don’t know why, I just like to upgrade things. I guess I like the challenge of fixing things if/when they break. Sure enough, it was broken.

So for any of you openfire users out there, here is the fix. I tried messing with a bunch of settings (after backing up my original MySQL database) and nothing seemed to help. After scrolling through all of the settings in the admin console, I noticed that the server no longer liked the self-rolled SSL certificates. A quick click and the server generated new ones. I restarted the openfire server and was back in business.

I am still one small version behind (3.4.4 versus the current 3.4.5), but I was just too lazy to create a custom ebuild today…

So, as we all know, I’ve used several tools to scrobble music to last.fm via the console. The problem was that it worked, well, sometimes. The old post.fm script would work great for hours, then just outright stop submitting tracks. Other times it would never work, and then submit 2 or 3 songs out of 10. I poked through the source a bit, but the extent of my perl knowledge consists of adding the line “#!/usr/bin/perl” to the top of the script. Seriously, that’s it. So thankfully Eric (who knows a LOT more than I do about perl) was poking around trying to figure it out as well. At any rate, we were both stuck, until he stumbled upon another script that works with cmus. The new script is called shell.fm-cmus and seems to be working MUCH better than the other. So far so good anyway…