MIDI is pretty fast.
Specifically, MIDI uses simple serial transmission at 31.25 kilobaud, 8N1: one start bit, eight data bits, no parity bit, and one stop bit, making ten bits total for each byte of data sent. It can therefore send 3,125 bytes per second. With each note-on event taking up to three bytes (subsequent notes on the same channel can take just two, thanks to running status), you can start just over a thousand notes a second, or roughly one note every millisecond.
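A quick back-of-the-envelope sketch of that arithmetic (the constant names here are mine, not part of any MIDI specification):

```python
# MIDI wire format: 31.25 kilobaud, 8N1 framing.
BAUD_RATE = 31_250      # bits per second on the wire
BITS_PER_BYTE = 10      # 1 start + 8 data + 0 parity + 1 stop

bytes_per_second = BAUD_RATE / BITS_PER_BYTE   # 3125 bytes per second
note_on_bytes = 3                              # status + note number + velocity

# Roughly one note-on per millisecond.
notes_per_second = bytes_per_second / note_on_bytes

print(bytes_per_second)    # 3125.0
print(notes_per_second)    # just over a thousand (~1041.7)
```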
As for the single-byte timing clock, assuming it's sent at the highest priority, it takes 320 microseconds (0.32 milliseconds) to transmit. Since MIDI is asynchronous, a transmitter that takes advantage of this by starting the byte immediately, without waiting for an internal clock pulse, would add a consistent 0.32 milliseconds of lag, which keeps the timing smooth.
However, in practice, I suspect most transmitting devices will act as if they're using a synchronous protocol, waiting for the next 31.25 kHz internal clock pulse before they begin sending the first byte of a new sequence. This is because it's simply easier to write a program that uses a consistent internal clock frequency, rather than switching it to a finer resolution in between data bursts.
So the transmitting device could lag by nothing (starting immediately), by up to 0.032 milliseconds (waiting for the next bit's transmission slot, i.e. the next clock pulse), or by up to 0.32 milliseconds (waiting for the next full byte's transmission slot, up to ten clock pulses away).
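The three scenarios can be written out numerically (a sketch with my own variable names, assuming the 31.25 kHz internal clock described above):

```python
# One bit period at 31.25 kilobaud, in microseconds.
BIT_PERIOD_US = 1_000_000 / 31_250     # 32 µs
BYTE_PERIOD_US = 10 * BIT_PERIOD_US    # 320 µs for a full 10-bit byte

# Worst-case extra wait before transmission begins, per strategy:
lag_async_us = 0.0                  # start immediately: no added lag
lag_bit_sync_us = BIT_PERIOD_US     # wait for the next bit slot: up to 32 µs
lag_byte_sync_us = BYTE_PERIOD_US   # wait for the next byte slot: up to 320 µs
```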
That gives us 960 microseconds to send the first note-on event that switches channel, 640 microseconds for each subsequent note-on event if running status is used, and 320 microseconds for each timing clock event; and with a reasonably simple implementation synchronised to a steady 31.25 kHz internal clock pulse, a wait of up to 32 microseconds before the start of any of those messages.
| MIDI Event | Number of bytes | Microseconds |
| --- | --- | --- |
| First note-on | 3 | 960 to 992 |
| Additional note-ons | 2 | 640 to 672 |
| Timing clock | 1 | 320 to 352 |
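The table's ranges fall out of the same two constants: message length times the 320-microsecond byte period, plus up to one 32-microsecond bit slot of synchronisation wait. A minimal sketch (message names and dictionary are mine):

```python
BYTE_US = 320    # one 10-bit byte at 31.25 kilobaud
JITTER_US = 32   # worst-case wait for the next bit slot

messages = {
    "First note-on": 3,        # status + note + velocity
    "Additional note-ons": 2,  # running status: note + velocity only
    "Timing clock": 1,         # single status byte
}

for name, nbytes in messages.items():
    lo = nbytes * BYTE_US          # transmission time alone
    hi = lo + JITTER_US            # plus worst-case sync wait
    print(f"{name}: {lo} to {hi} microseconds")
```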
Indeed, different MIDI devices seem to have different rates of lag and jitter, which are not inherent to the MIDI protocol, and not at all MIDI's fault. The Yamaha QY-100, for example, is usually right on time, but when it's off, it's generally off by about 320 microseconds in either direction, and that's one of the better examples. When the Roland MC-50 MkII is off, it's by about 2400 microseconds in either direction. The latter also seems to have a less stable clock, with timings clustering loosely around those distant points rather than hitting them exactly. The difference is so stark that when the QY-100 is a full 3.125 kHz cycle (320 microseconds) out, it's still roughly as tight as the MC-50 MkII when the latter is closest to its 417 Hz (2400 microsecond) cycle.
Having said all that, remember we're talking about microseconds (millionths of a second) adding up to the odd millisecond (thousandths of a second). MIDI should generally sound just fine, unless you're triggering lots of percussive sounds simultaneously.
Vince Clarke had a theory that using MIDI had changed his sound by loosening the timing. I have a counter-theory that MIDI changed his sound by letting him write polyphonic parts. He subsequently avoided MIDI when recording Erasure's Chorus. I agree it sounds great, but I think it would have sounded essentially the same had he used MIDI and simply chosen to avoid polyphony.
In summary, the inherent timing issues of the MIDI protocol are too small to be worth worrying about. Individual programmers' implementations of the protocol may be another matter.
- "Sequencer Tightness: Hardware vs. Software" A. Matthews, 2020
- "Vince Clarke" Paul Ireson, Sound On Sound, Dec 1991
- "MIDI Timing Delays" Martin Russ, Sound On Sound, Mar 1993