He had been asking around for this for about a week. I had stumbled upon a command line that echoes data to the COM port, but according to him that wouldn't work either (again, the zero byte). So I opened Visual Basic (I have it installed on my work machine because I had to do some stuff with it in the past) and whipped up a form with an MSComm object that sent the command to switch the first four relays. The colleague runs VNC on his home machine, so after I sent the program to his home address, he could run it. His girlfriend was at home, and she could verify the LEDs on the relay card -- and it worked!
So when I got home, I refined the program to work on the command line, and it lets you set the ports in binary form. He just said on IRC that it all works, which is cool. I earned a sixpack of good beer! ;)
Also, work on the LED clock is slowly progressing. I needed a way to measure when a second has passed. The PIC runs at a clock frequency of 4 MHz, and every instruction takes 4 clock cycles. This means that 1,000,000 instructions are executed per second -- 1 MIPS! The PICs I use have a timer module, which basically counts every instruction. When the 8-bit timer register overflows, an interrupt is triggered -- so every 256 instructions, my interrupt fires. The timer can also be 'prescaled', so that only every 2nd, or 4th, all the way up to every 256th instruction is counted.
But the problem is, you can never get exactly one million if you have to count in increments of 256. I've been fiddling with this yesterday evening, but I couldn't get it to work...
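A quick sanity check of that arithmetic (a throwaway snippet of mine, with the numbers straight from above):

```python
# One second at 1 MIPS, with one timer overflow every 256 instructions.
INSTRUCTIONS_PER_SECOND = 1_000_000
TIMER_PERIOD = 256

# 256 doesn't divide 1,000,000 evenly, so no whole number of
# overflows lands exactly on the one-second mark:
print(INSTRUCTIONS_PER_SECOND // TIMER_PERIOD)  # → 3906 full overflows
print(INSTRUCTIONS_PER_SECOND % TIMER_PERIOD)   # → 64 instructions left over
```

Count 3906 overflows and your "second" is 64 instructions short; count 3907 and it's 192 too long.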
This morning, I found this page, which details a brilliant solution: put 1,000,000 in a register (actually three registers, since you need 24 bits), and every time the interrupt is called, subtract 256 from it. If the counter drops below 256, you know that by the next time the interrupt fires, the second will already have passed. So you do the stuff that needs to be done on the second now.
So far so good -- but you get an error. In the first second, you're off by 64 instructions. Not very much (that's 0.000064 seconds!), but over time it all adds up: in 27.7 hours, the error grows to 6.4 seconds, and that just won't do for a clock! So what does this algorithm do? It simply adds one million to the fraction still left over from the previous second -- so in your 2nd second, you have to do 1,000,064 instructions, and so on.
This means that the error of earlier seconds gets corrected in later seconds, and every second takes 1,000,000 instructions on average. So while any individual tick will be a fraction of a second off (imperceptible to us humans), the clock keeps correct time over the long run!
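To convince myself this actually works, here's a quick simulation of the trick in Python -- my own sketch of the algorithm as I understand it, not the actual PIC code. Each loop iteration stands for one timer interrupt, i.e. 256 executed instructions:

```python
# Simulating the fractional-second carry trick.
ONE_SECOND = 1_000_000   # instructions per second at 1 MIPS
STEP = 256               # instructions per timer overflow

countdown = ONE_SECOND
elapsed = 0       # total instructions executed so far
tick_marks = []   # instruction count at which each "second" fired

while len(tick_marks) < 1000:
    countdown -= STEP
    elapsed += STEP
    if countdown < STEP:           # next overflow would cross zero
        tick_marks.append(elapsed)
        countdown += ONE_SECOND    # carry the leftover fraction forward

# Per-tick error versus the ideal boundary (i+1) * 1,000,000:
errors = [t - (i + 1) * ONE_SECOND for i, t in enumerate(tick_marks)]
print(max(abs(e) for e in errors))  # → 192: bounded, never accumulates
```

The errors cycle through -64, -128, -192, 0 instructions: every fourth tick lands exactly on the million mark, so the drift cancels instead of accumulating.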
I've just implemented this code, and it works like a charm. Next stop: implementing seconds and minutes on my test board, to see whether it keeps time well.