I'm having sort of a slow-motion day. I managed to knock out most of the pain of the earlier
headache, but feel too wobbly to drive a car, so I've been working on things that I can do sitting
down and for which it won't matter if I go glassy-eyed for several minutes every so often. I was
banging on some sheet music, and then getting distracted by putting together a makefile so that I
can just type "make" (or more often, "make most") in that directory to have all the recently-edited
source files turned into PostScript, PDF, and MIDI. (I was tweaking the layout a lot, and when I'm
feeling better I'll need to go proofread them -- they're transcriptions of existing music, not
original compositions.) So I was typing "abcm2ps -n -j5 -O = somethingorother.abc" a whole lot.
Of course, once I made regenerating everything easier, my perfectionist streak started tweaking
the layout in finer detail, adding headers to each page from the makefile instead of with statements
inside the source files, etc.
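The makefile boils down to something like this sketch -- the variable and rule names here are a reconstruction, not the real file; only the abcm2ps flags are the ones I was actually typing:

```make
# Sketch only. Recipe lines must start with a tab, as usual for make.
ABCS := $(wildcard *.abc)
PS   := $(ABCS:.abc=.ps)
PDFS := $(ABCS:.abc=.pdf)
MIDS := $(ABCS:.abc=.mid)

# "make most" rebuilds only what's out of date, i.e. the recently-edited files
most: $(PS) $(PDFS) $(MIDS)

%.ps: %.abc
	abcm2ps -n -j5 -O $@ $<

%.pdf: %.ps
	ps2pdf $< $@

%.mid: %.abc
	abc2midi $< -o $@
```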
But I was getting some really confusing sounds from the MIDI files I generated, not the instruments
I thought I was selecting, and not in the octave I expected. So I decided the thing to do would be to
make a test file that would play each MIDI instrument for me so I could hear which sounds really went
with which number, despite what the "General MIDI Program Number/Instrument Name" list taped to my
wall says. A hundred and twenty-eight presets' worth. But I didn't want to do that much typing.
So I did what the somewhat- to very-hackish folks reading this will consider obvious, but which
still amused me anyhow: I tossed off a one-shot C program that wrote an ABC file to produce a score[*]
that looked like this:
( Not sure whether I need the cut -- it's only 520 pixels wide and 3K -- but better safe than sorry... )
(y'all can figure out what the rest of it looks like from that bit, I'm sure), then I told 'abc2midi' to
make sounds from it and I sat back and listened to the computer count for me, automatically changing
instruments each time the count went up by one.
It turns out that most of the instruments are in the right order, but off by one (the list I copied
starts at instrument #1; Windows thinks instrument #0 should make that sound), and a few of them just
don't sound (to me) very much like their descriptions. And 'abc2midi' seems to ignore the clef when
picking what octave to sound (at least the version I've got -- I need to check for upgrades more recent
than what I compiled it from), so "%%MIDI transpose -24" helped a lot there. (-12 probably would have
been technically correct, but I'm a guitarist; I'm used to hearing the bass parts an octave lower than
written anyhow.)
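In ABC terms the workaround looks something like this fragment (illustrative only -- it's not one of the actual transcriptions, just the relevant directives around a made-up bar):

```abc
% abc2midi honors these %%MIDI directives even though (in the
% version I have) it ignores the clef when picking the octave.
X:1
T:Bass-clef octave workaround
M:4/4
L:1/4
K:C clef=bass
% 32 is Acoustic Bass under the 0-based numbering Windows uses
%%MIDI program 32
% -12 would be technically correct; -24 suits a guitarist's ears
%%MIDI transpose -24
C, E, G, C, |]
```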
But now I'm wondering: for those of you who'd feel comfortable interpreting a binary number as a series of tones, would you find it easier/faster to parse in most-significant-bit-first order (as one
would write it, as I've done it here) or 'tother way 'round (possibly easier to add-as-you-go if you're
more "translating" than "recognizing", maybe)?
The C code is brute-force and ugly, BTW, reflecting my current "too spacey to think harder" state.
But it worked on the first-pass-that-counted (okay, I had two typos to fix after the first
compile. Pbbbbt!). It's not like it's a problem that requires any elegance or efficiency, not to count
to a hundred and twenty-eight. But I know there is a more elegant solution that I was too
fried to concoct.
[*] Note that I did not, of course, actually need to produce the typeset score except to show
it off to y'all. And before that, just because it amused me to look at it. The only meaningful output
in this case was the MIDI file. The C program produced a file that looked like ( this )