What is MIDI?
colin-williams-dev
Posted on December 19, 2022
**MUSICAL INSTRUMENT DIGITAL INTERFACE**
It's 1983. Electronic instruments are but a whisper and a dream, but for the wild-hearted there is hope. The genesis of the personal computer is about to be upon us. Soon computers will be able to communicate with each other across networks, but there is no means of sending data between musical instruments... What kind of data would two musical instruments even talk about? Well, certain electronic instruments such as keyboards generate a great deal of data representing the sound profile of each note that is played. But at this time, at the inception of MIDI, instruments were very limited in 'who' they could talk with and how...
Today, instruments can communicate with each other and with computers using communication protocols and code. How did this happen? Below I will touch on how we got here, why we needed it, and some technical specifics of how this communication is achieved.
ENTER MIDI
HISTORY
In 1983, the growing number and popularity of electronic keyboards (like a piano... not the one I'm typing this blog on...) necessitated a universal language to connect them. A man named Ikutaro Kakehashi, the president of Roland, dreamt of a world where electronic musical instruments could interact regardless of their manufacturer. At the time, complex synthesizers and keyboards were engineered by many different companies, and their circuitry and components were designed to communicate only with other equipment from the same maker. This became increasingly problematic as the industry grew and more and more instruments were manufactured. Kakehashi and Dave Smith, president of Sequential Circuits, set out to create a technical standard (a communications protocol, digital interface, and electrical connectors) that would become known as the Musical Instrument Digital Interface. The standard spread at such a rate that if you have heard music written or recorded after 1990, you have heard MIDI.
Its adoption coincided with the rise of the personal computer, and together they bolstered demand for digital music software and hardware, helping to drive a technological boom in the electronic and digital music marketplace.
Faced with the task of universalizing a language for all electronic instruments, the creators realized the only way for the language to be effective was to make it free for all, hence MIDI's open and ubiquitous presence as we know it today.
MIDI transmits an instantaneous data stream from an electronic instrument into a digital interface, such as a computer, which can then parse the information and render sound. That stream carries, among other things:
- instrument voicing / musical preset
- which musical notes are played
- the velocity of each played note (its 'attack')
- when each note is released
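To make this concrete, here's a minimal sketch (in Python, with hypothetical names) of how a receiving program might model that stream as a list of timestamped note events:

```python
from dataclasses import dataclass

@dataclass
class NoteEvent:
    """One parsed event from the MIDI stream (a hypothetical model, not the wire format)."""
    time_s: float   # when the event arrived
    program: int    # instrument voicing / preset (0-127)
    note: int       # which key was played (0-127)
    velocity: int   # how forcefully it was struck
    note_on: bool   # True when the key goes down, False when it is released

# A short phrase as the receiver might record it:
stream = [
    NoteEvent(0.00, program=0, note=60, velocity=100, note_on=True),
    NoteEvent(0.50, program=0, note=60, velocity=0,   note_on=False),
    NoteEvent(0.50, program=0, note=64, velocity=90,  note_on=True),
    NoteEvent(1.00, program=0, note=64, velocity=0,   note_on=False),
]
```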
TECHNICAL SPECIFICATIONS
MIDI messages are 8-bit words transmitted serially at a rate of 31.25 kbit/s. "This rate was chosen because it is an exact division of 1 MHz, the operational speed of many early microprocessors" (Peter Manning). One start bit and one stop bit are added to the beginning and end of each byte, for a total of ten bits transmitted per byte. This lets the information stream digitally so a computer can process and interact with it. Because MIDI data is serialized, only one event can be sent at a time, which can result in a roughly one-millisecond delay between notes. However, a single MIDI cable can carry up to sixteen channels of MIDI data, and early engineers working with MIDI realized that events could also be spread across multiple ports so they could be sent simultaneously. Because of this, it is common to see multiple MIDI OUT ports on instruments, often routed to the same device (whether for musical output or recording/tracking).
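As a quick sanity check on that timing, here's a small sketch (assuming the ten-bits-per-byte framing described above) of how long a typical three-byte message takes on the wire:

```python
BAUD_RATE = 31_250   # bits per second on a standard MIDI connection
BITS_PER_BYTE = 10   # 8 data bits + 1 start bit + 1 stop bit

def transmission_time_ms(num_bytes: int) -> float:
    """Time to serially transmit `num_bytes` MIDI bytes, in milliseconds."""
    return num_bytes * BITS_PER_BYTE / BAUD_RATE * 1000

# A Channel Voice message such as note-on is three bytes long:
print(transmission_time_ms(3))  # 0.96 -> the ~1 ms delay mentioned above
```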
MIDI ON THE COMPUTER
As the computer receives the MIDI information, it reads several specific pieces of meaning from the data.
- "status byte" - the first byte, determines the "type" of message being sent: "Channel Voice, Channel Mode, System Common, System Real-Time, and System Exclusive" (WIKIPEDIA). The receiving device has specified functionality which filters which data it will listen to using these types. I will describe the two most important message types further below.
- "parameter byes" - the next two bytes carry the parameters of the message.
CHANNEL VOICE MESSAGES
This is the message type that defines the way the output will sound. Channel Voice messages are responsible for emulating the way physical/analogue instruments sound. Here is a list of what Channel Voice data transmits (a byte-level sketch follows the list)...
- "note-on" - determines when the note will begin. Sent at the beginning of the message. "Monophonic" means each new note-on will terminate the previous note. The sound will be a singular tone.
- "omni On" - listens to all sixteen channels.
- "omni Off" - selectively ignores other channels.
- "note-num" - an integer between 0 - 127 representing Cโ1 to G9 (musical notes) or 8.175799 to 12543.85 Hz (tonal frequency) which will specify the note's pitch.
- "velocity val" - a number which represents how forcefully the note was played. This will effect the "attack" of the note on the digital interface side, which is the size of the soundwave upon the note's initialization.
- "note-off" - determines when the note will end. Sent at the end of the message. "Polyphonic" means the note-off will not terminate previous notes. The sound can be layered like a chord and contain many notes.
SYSTEM EXCLUSIVE MESSAGE (SysEx)
Unlike its equally (or more so...) important counterpart, which determines which note is played and how loud, this message type ensures that only the targeted device responds and that all others ignore it. Manufacturers use SysEx to create proprietary messages that control their equipment. Devices and instruments usually have a SysEx setting which reads these messages to route the proper lines of communication.
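As a rough illustration of the framing: a SysEx message starts with the status byte 0xF0, names the manufacturer, carries its data, and ends with 0xF7. The payload below is made up.

```python
def sysex(manufacturer_id: int, payload: bytes) -> bytes:
    """Frame a System Exclusive message: 0xF0, manufacturer ID, payload, 0xF7."""
    # Data bytes must stay below 0x80 so they are never mistaken for status bytes.
    assert all(b < 0x80 for b in payload), "SysEx data bytes must have the high bit clear"
    return bytes([0xF0, manufacturer_id]) + payload + bytes([0xF7])

# 0x41 is Roland's manufacturer ID; the two-byte payload here is invented for illustration.
message = sysex(0x41, bytes([0x10, 0x42]))
print(message.hex(" "))  # f0 41 10 42 f7
```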
- And last but not least, part of the MIDI data stream can control a clock on any digital device in its loop. This is called MIDI Sync, and it universally controls the tempo at which all connected devices run.
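For a sense of how that works: MIDI Sync is built on a real-time clock byte (0xF8) sent 24 times per quarter note, so the interval between clock bytes encodes the tempo. A small sketch of the arithmetic, assuming a chosen tempo in BPM:

```python
MIDI_CLOCK = 0xF8        # System Real-Time "Timing Clock" status byte
PULSES_PER_QUARTER = 24  # the clock byte is sent 24 times per quarter note

def clock_interval_s(bpm: float) -> float:
    """Seconds between successive 0xF8 clock bytes at a given tempo."""
    seconds_per_quarter = 60.0 / bpm
    return seconds_per_quarter / PULSES_PER_QUARTER

print(clock_interval_s(120))  # 0.0208... s, i.e. ~21 ms between clock pulses at 120 BPM
```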
WORKS CITED:
KevTheAudioGuy, "What is MIDI? A simple explanation in 5 minutes!" - https://www.youtube.com/watch?v=B9GtWG5-wec
Wikipedia, "MIDI - Technical specifications" - https://en.wikipedia.org/wiki/MIDI#Technical_specifications
Peter Manning, Electronic and Computer Music, 1985.