Script to remove oldest file in a directory

I’ve created a script to manage the /var/spool/cups directory. I like saving cups spool files for as long as possible, but cups only lets one set the maximum number of spool files to keep; it doesn’t allow setting a minimum % of free disk space. The script below monitors the % of disk space used on the filesystem holding a given path, and whenever usage rises above the set maximum it deletes the oldest file in the directory until usage drops back below that threshold. Here’s the code:
SPOOL_DIR="/var/spool/cups" #location of cups spool
MAX_USAGE="80" #maximum % disk space used

while [[ $(df "$SPOOL_DIR" | tail -n1 | awk '{print $5}' | cut -d"%" -f1) -gt "$MAX_USAGE" ]] ; do
        # ls -t lists newest first, so tail -n1 picks the oldest file
        ls -t $(find "$SPOOL_DIR" -maxdepth 1 -type f) | tail -n1 | xargs rm -v
done
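One caveat: the `ls -t $(find …)` pipeline breaks on file names containing spaces. A space-safe sketch of the same "delete the oldest file" step, written as a hypothetical helper function and assuming GNU find (for the `-printf '%T@'` timestamp format):

```shell
#!/bin/bash
# Delete the oldest regular file directly inside the directory given as $1.
# find prints "epoch-timestamp path" per file; sort -n puts the oldest first.
delete_oldest() {
    local dir="$1" oldest
    oldest=$(find "$dir" -maxdepth 1 -type f -printf '%T@ %p\n' \
        | sort -n | head -n1 | cut -d' ' -f2-)
    # Only call rm if a file was actually found
    [ -n "$oldest" ] && rm -v "$oldest"
}
```

The `cut -d' ' -f2-` keeps everything after the first space, so paths with spaces come through intact.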

Batch convert MKV to MP4 using ffmpeg

I had a bunch of files in mkv format that I needed to convert to mp4 format so that minidlna could serve them to my Xbox 360. Here’s how they can be converted in batch using ffmpeg.
for f in *.mkv; do
        ffmpeg -i "$f" -vcodec copy -acodec aac -ac 2 -strict experimental "${f%.mkv}.mp4";
done
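If the batch is interrupted partway, re-running it would re-encode everything. A small guard skips any .mkv that already has a matching .mp4; the sketch below is a dry run (a hypothetical `convert_all` function that only prints each ffmpeg command instead of running it — drop the `echo` to actually convert):

```shell
#!/bin/bash
# Dry run: print the ffmpeg command for each .mkv lacking a .mp4 counterpart.
convert_all() {
    for f in *.mkv; do
        out="${f%.mkv}.mp4"
        if [ -e "$out" ]; then
            # Already converted on a previous run; note it on stderr and move on
            echo "skipping $f (already converted)" >&2
            continue
        fi
        echo ffmpeg -i "$f" -vcodec copy -acodec aac -ac 2 -strict experimental "$out"
    done
}
```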

Transfer a block device (or a file) over the network via ssh using dd.

Today I was converting some local virtual machine storage from LVM logical volumes to .img files on a remote host. One can copy block devices over the network using SSH with a command like this:
dd if=/dev/vg/logicalvolumehere bs=1500 | ssh user@remotehost dd of=/path/on/remote/host.img
… or one could pass it through gzip first to compress the data as it goes over the network.
dd if=/dev/vg/logicalvolumehere bs=1500 | gzip -c | ssh user@remotehost 'gzip -dc | dd of=/path/on/remote/host.img'
Running the transfer through gzip, I was able to get a speed of 16.6 MB/s or about 133 megabits/sec. over a 100 megabit network.
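The compress/decompress leg of that pipeline can be sanity-checked locally by swapping ssh out for a plain pipe. A sketch (the `roundtrip` helper and file names are made up for illustration) that pushes a file through the same dd and gzip stages on one machine, so the output can be compared against the source:

```shell
#!/bin/bash
# Local round trip of the compressed transfer pipeline:
# "dd | gzip -c" stands in for the sending side,
# "gzip -dc | dd" for the receiving side behind ssh.
roundtrip() {
    local src="$1" dst="$2"
    dd if="$src" bs=1500 2>/dev/null \
        | gzip -c \
        | gzip -dc \
        | dd of="$dst" bs=1500 2>/dev/null
}
```

Comparing the two files afterwards with cmp confirms the pipeline is byte-exact before trusting it over the network.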