Since MIDI is becoming a major part of how many people DJ, it seems appropriate to understand as much as possible about the format. A few questions have popped up about MIDI latency, how to reduce it, and which cables or interfaces are best to use. Instead of shooting in the dark, we asked an expert, Florian Bomers, creator of Bome's MIDI Translator, the tough questions about MIDI.
Does MIDI have latency?
The original MIDI standard (“5-pin DIN”) is a serial protocol
where each byte takes 320 microseconds for transmission. Now a
typical MIDI message is 2 or 3 bytes, so it takes at least 0.6 to
0.9 milliseconds until the computer notices that you’ve done
something on a MIDI controller. This wouldn’t be sooo bad, but
today’s MIDI devices usually come with a USB or Firewire
connection, which, unfortunately, adds quite a bit of latency.
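Those per-byte numbers are easy to check with a little arithmetic: 5-pin DIN MIDI runs at 31,250 baud, and each byte is framed as 10 bits, which gives the 320 microseconds quoted above. A quick sketch (the function name is mine):

```python
# MIDI 1.0 over 5-pin DIN runs at 31,250 baud; each byte is sent as a
# 10-bit frame (1 start bit + 8 data bits + 1 stop bit), i.e. 320 us.
BAUD_RATE = 31_250
BITS_PER_BYTE = 10

def din_transmission_ms(num_bytes):
    """Time in milliseconds to push num_bytes over a 5-pin DIN cable."""
    return num_bytes * BITS_PER_BYTE / BAUD_RATE * 1000

print(f"{din_transmission_ms(1):.2f} ms")  # one byte: 0.32 ms
print(f"{din_transmission_ms(3):.2f} ms")  # a full 3-byte Note On: 0.96 ms
```

So the "0.6 to 0.9 milliseconds" above is the wire time for a 2- or 3-byte message alone, before any USB or driver overhead.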
Is that really so bad?
A total latency of under 10 milliseconds is usually unnoticeable,
though there is the additional problem of congestion: if two or
more MIDI messages need to be sent at the same time, they'll be
queued up, and the last one may arrive considerably later than the first.
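That queuing effect follows directly from the serial wire speed: on 5-pin DIN, each 3-byte message occupies the link for roughly 0.96 ms, so simultaneous messages stack up behind one another. A hypothetical sketch:

```python
# On 5-pin DIN, a 3-byte message occupies the wire for ~0.96 ms
# (10 bits per byte at 31,250 baud). If several messages become due
# at the same instant, each one waits for those ahead of it.
MSG_MS = 3 * 10 / 31_250 * 1000  # ~0.96 ms per 3-byte message

def queued_delay_ms(position):
    """Extra delay for the message at `position` in the queue (0 = first)."""
    return position * MSG_MS

# Four faders moved at once: the last CC message is already ~2.9 ms
# late before its own transmission even starts.
for i in range(4):
    print(f"message {i}: +{queued_delay_ms(i):.2f} ms")
```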
In your testing, have you found any MIDI devices that have more than 10 ms of latency?
No, but the entire system can easily exceed 10 milliseconds,
especially when considering audio latency.
Will that really affect a DJ's performance?
The less overall latency you have, the snappier it will feel
using the controller. With too much latency, you’ll have that
sluggish feel and it’ll be hard to do any precise mixing.
Is there a difference between how you connect your controller to the computer?
Yes, indeed. The cables don't have much impact per se 🙂 But the
type of connection does: typical USB and Firewire MIDI
connections add latency of 2-10 milliseconds. The standard MIDI
implementation for Firewire emulates the original 5-pin DIN speed
(i.e. on average, 1 byte per 320 microseconds) but typically adds
a little delay because MIDI data is piggybacked on audio packets.
Also, the faster the better if you also use the connection for audio.
Is there a measurable difference between USB 1.0 and 2.0?
I haven’t done any measurements with USB 2.0, though in theory it
could perform much better. Still, USB 2.0 is optimized for
throughput and not latency…
What is the best way to achieve the lowest overall MIDI latency?
This is really hard to say. With little MIDI traffic (e.g. you
use only one controller at a time and the controller doesn’t send
excessive amounts of MIDI data), I'd recommend using a PCI audio
card and connecting the controller directly to its MIDI IN jack —
if the controller provides 5-pin DIN MIDI OUT, that is.
Otherwise, I'd have to go and measure different USB and Firewire
controllers to find out their latency.
You mentioned there is one thing that's even worse than latency. What is that?
Indeed, there is jitter, i.e. how much the delay changes. If
every MIDI message takes exactly the same time to get to the
computer, there’s no jitter. In practice, however, there can be
enormous jitter: for example, let’s assume a connection where
MIDI data is piggybacked with audio packets. For the sake of the
example, an audio packet takes 1 millisecond for transmission,
and audio packets are sent every 10 milliseconds. Now it’s easy
to see that some lucky MIDI messages will take as little as 1
millisecond, but in worst case, a MIDI message has 11
milliseconds of latency (if it just missed an audio packet to piggyback on).
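The best and worst cases in that example can be sketched as a tiny model. The packet period and transmission time are the hypothetical numbers from the example above, not measurements of any real interface:

```python
# Hypothetical transport: MIDI bytes piggyback on audio packets that
# depart every 10 ms and take 1 ms to transmit.
PACKET_PERIOD_MS = 10.0
PACKET_TRANSMIT_MS = 1.0

def piggyback_latency_ms(arrival_ms):
    """Latency for a MIDI message generated at arrival_ms: it waits for
    the next packet departure, then rides along for the transmit time."""
    wait = (-arrival_ms) % PACKET_PERIOD_MS  # time until next departure
    return wait + PACKET_TRANSMIT_MS

best = piggyback_latency_ms(0.0)    # right at a departure: 1 ms
worst = piggyback_latency_ms(0.001) # just missed one: ~11 ms
print(best, worst, worst - best)    # the spread (~10 ms) is the jitter
```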
Why is jitter so bad?
The problem with jitter is that it affects rhythm. As a rule of
thumb you can say that a trained ear can distinguish rhythmic
deviations of 1 millisecond. Now if you have drum pads or you’re
scratching, or really anything for your live performance, you
want to hear exactly the same rhythm that you play! Too much
jitter will make your performance quite jerky…
As a side note: I’ve read a study where they found out that the
“feel” of a drummer is due to subtle deviations from the perfect
timing. For example, if the bass drum is hit a tad before the
actual beat (say, 2 milliseconds), there’ll be a forward-driving
feel to the performance. Now you can imagine that with jitter of
10 milliseconds, this feel will be replaced with arbitrary
non-rhythmic drums — and this translates to all instruments, not
just drums.
To summarize: higher latency will make it harder to perform, but
jitter will kill your music.
How do you measure jitter?
You cannot measure jitter directly. Instead, measure the delay for
many MIDI messages and then look at the maximum and minimum
delay: their difference is the jitter. I use modified MIDI THRU
boxes to accurately measure MIDI latency.
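Given a list of measured per-message delays, the jitter figure he describes is simply the maximum minus the minimum. A minimal sketch, with invented sample values:

```python
# Jitter = spread between the slowest and fastest observed delivery.
# The delays below are made up for illustration, not real measurements.
delays_ms = [2.1, 2.3, 2.0, 7.8, 2.2, 2.4]  # per-message latency in ms

def jitter_ms(delays):
    """Difference between the maximum and minimum observed delay."""
    return max(delays) - min(delays)

print(f"latency: {min(delays_ms):.1f}-{max(delays_ms):.1f} ms, "
      f"jitter: {jitter_ms(delays_ms):.1f} ms")
```

Note that a single outlier (the 7.8 ms sample here) dominates the jitter figure even when the average delay looks fine.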
How can a person ensure they have the lowest jitter possible?
I wish there was an easy way to do that. Since jitter is directly
related to latency, the same applies here: you need to optimize
all parts of the system: get a controller with low latency and
little jitter, don’t hose your USB/Firewire connection, always
use the latest drivers, and use software that is not known for bad MIDI timing.
Fortunately, the audio path usually doesn’t add jitter, so if you
have 5 milliseconds ASIO buffer size, it’s fixed latency and it
will not add jitter.
What level of jitter is acceptable?
Below 200 microseconds (or below 0.2 ms).
Does the new MIDI 2.0 spec solve many of these problems?
The MIDI Manufacturers Association (MMA) is currently working on
a new version of MIDI (working title: “HD”). Improvements for
latency, jitter and throughput are high on the list, plus higher
resolution (from 7 or 14 bits to e.g. 32 bits), more channels, more
message types and more!
Will MIDI be around for another 20+ years?
I’m sure that the unique properties of MIDI (realtime,
multi-purpose, efficient, low cost) are still very useful for
many decades to come. But whether that'll be via MIDI or something
else depends on the success of “HD”… well, as a member of the
“new MIDI” group at the MMA, I can only say: with HD, it’ll be a
better world 🙂