I'm having sort of a slow-motion day. I managed to knock out most of the pain of the earlier headache, but feel too wobbly to drive a car, so I've been working on things that I can do sitting down and for which it won't matter if I go glassy-eyed for several minutes every so often. I was banging on some sheet music, and then getting distracted by putting together a makefile so that I can just type "make" (or more often, "make most") in that directory to have all the recently-edited source files turned into PostScript, PDF, and MIDI. (I was tweaking the layout a lot, and when I'm feeling better I'll need to go proofread them -- they're transcriptions of existing music, not original compositions -- so I was typing "abcm2ps -n -j5 -O = somethingorother.abc" a whole lot.)
Of course, once I made regenerating everything easier, my perfectionist streak started tweaking the layout in finer detail, adding headers to each page from the makefile instead of with statements inside the source files, etc.
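A sketch of the sort of rules such a makefile might contain -- the target name and the ps2pdf step are my assumptions, and the abcm2ps flags are the ones quoted above; "make most" restricting itself to recently-edited files is left out, since make already rebuilds only what's out of date:

```make
# Hypothetical rules: turn every .abc file in this directory into
# PostScript (abcm2ps), PDF (ps2pdf), and MIDI (abc2midi).
ABC := $(wildcard *.abc)

all: $(ABC:.abc=.ps) $(ABC:.abc=.pdf) $(ABC:.abc=.mid)

%.ps: %.abc
	abcm2ps -n -j5 -O $@ $<

%.pdf: %.ps
	ps2pdf $< $@

%.mid: %.abc
	abc2midi $< -o $@

.PHONY: all
```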
But I was getting some really confusing sounds from the MIDI files I generated: not the instruments I thought I was selecting, and not in the octave I expected. So I decided the thing to do would be to make a test file that would play each MIDI instrument for me, so I could hear which sounds really went with which number, despite what the "General MIDI Program Number/Instrument Name" list taped to my wall says. A hundred and twenty-eight presets' worth. But I didn't want to do that much typing.
So I did what the somewhat- to very-hackish folks reading this will consider obvious, but which
still amused me anyhow: I tossed off a one-shot C program that wrote an ABC file to produce a score[*]
that looked like this:

[typeset score excerpt not reproduced here]
(y'all can figure out what the rest of it looks like from that bit, I'm sure), then I told 'abc2midi' to
make sounds from it and I sat back and listened to the computer count for me, automatically changing
instruments each time the count went up by one.
It turns out that most of the instruments are in the right order, but off by one (the list I copied starts at instrument #1; Windows thinks instrument #0 should make that sound), and a few of them just don't sound (to me) very much like their descriptions. And 'abc2midi' seems to ignore the clef when picking which octave to sound (at least the version I've got -- I need to check for upgrades more recent than what I compiled it from), so "%%MIDI transpose -24" helped a lot there. (-12 probably would have been technically correct, but I'm a guitarist; I'm used to hearing the bass parts an octave lower than written anyhow.)
But now I'm wondering: for those of you who'd feel comfortable interpreting a binary number as a series of tones, would you find it easier/faster to parse in most-significant-bit-first order (as one would write it, as I've done it here) or 'tother way 'round (possibly easier to add-as-you-go if you're more "translating" than "recognizing", maybe)?
The C code is brute-force and ugly, BTW, reflecting my current "too spacey to think harder" state. But it worked on the first-pass-that-counted (okay, I had two typos to fix after the first compile. Pbbbbt!). It's not like it's a problem that requires any elegance or efficiency, not to count to a hundred and twenty eight. But I know there is a more elegant solution that I was too fried to concoct.
[*] Note that I did not, of course, actually need to produce the typeset score except to show
it off to y'all. And before that, just because it amused me to look at it. The only meaningful output
in this case was the MIDI file. The C program produced a file that looked like
this:

X:99
T:Midi Instrument Test
C:D. Glenn Arthur Jr.
M:C
L:1/8
Q:1/8=120
K:C
%%MIDI transpose -12
%%MIDI program 0
CCCCCCCC | z8 |
%%MIDI program 1
CCCCCCCa | z8 |
%%MIDI program 2
CCCCCCaC | z8 |
%%MIDI program 3
CCCCCCaa | z8 |
%%MIDI program 4
CCCCCaCC | z8 |
%%MIDI program 5
CCCCCaCa | z8 |
%%MIDI program 6

and so on.
(no subject)
I forget exactly what it was -- I think the GM MIDI spec -- that I was reading recently*, but it mentioned that one numbering scheme has the instruments going from 0-255 (or was it 127?) like a uchar, and the other scheme goes from 1-256. Maybe one was the low-level MIDI bytes and the other was the official GM numbers, but the gist seemed to be that to convert between the human-friendly advertised numbers and what the lower level expected, you had to add or subtract one.
(*-because I keep wanting to make stuff to interface with midi, but the only real library options are a Perl one that uses Perl's foul pointery things, and numerous C++ libraries; the only thing I've seen for straight C is something that IIRC came with the ALSA libraries, that only understands a very simple type of midi file. As for making something myself, Midi seems just too detailed and hacked together for me to stand any chance of getting it right without spending months on it. Every so often I go through this searching and hmming and then giving up again. Stupid computers...)
(no subject)
I played the same file on another machine (and a different OS) and got the same mapping, ruling out the "this player is broken" diagnosis -- which leaves: (a) the list I copied from is wrong, (b) the standard is commonly implemented in two different ways, as you described, or (c) the version of abc2midi I'm using is wrong. Honestly, now that it's been mentioned (thanks, by the way), (b) isn't all that surprising. MIDI had to grow up kind of fast.
When you get tired of binary...
(no subject)
Demerits for my hack cred, but perhaps some merit for other geek cred.
(no subject)
So maybe Morse would have had more some-other-kind-of-geek cred, but not actually at the expense of hack cred.