MIDI (Musical Instrument Digital Interface) is the foundational language that allows electronic music instruments, computers, and software to communicate with each other. Without MIDI, modern music production as we know it would not exist. Every software synthesizer, every drum machine plugin, every virtual instrument you use in your DAW responds to MIDI. Every MIDI keyboard, every pad controller, every arpeggiator, every DAW piano roll: all of it speaks MIDI.
Understanding MIDI is not optional for any producer working with electronic instruments, software synthesizers, or any kind of DAW-based production. This guide explains exactly what MIDI is, how it works, every key parameter and message type, how to use it effectively in your productions, and where the technology is heading with the introduction of MIDI 2.0.
Diagram: MIDI Signal Flow
What MIDI Is: Performance Data, Not Audio
The most important thing to understand about MIDI is what it isn't: MIDI is not audio. A MIDI signal contains no sound. It is a stream of digital instructions that tells a sound-generating device what to do.
Think of sheet music. A sheet of music notation tells a performer which notes to play, how long to hold them, how loudly to play them, and other performance details. But the sheet music itself makes no sound. The sound depends entirely on which instrument performs it: a piano, a violin, or a full orchestra all produce completely different sounds from the same notation.
MIDI works exactly the same way. A MIDI file or a MIDI clip in your DAW contains a sequence of instructions: note 60 (middle C) starts at 0:00, plays at velocity 84, and holds for 0.5 seconds. Note 64 starts at 0.25 seconds at velocity 72. These instructions produce different sounds depending on which instrument or plugin is assigned to receive them. The same MIDI pattern routed to a piano plugin sounds like a piano; routed to a synthesizer, it sounds like a synth; routed to a string library, it sounds like strings.
This flexibility is MIDI's superpower. You can compose and arrange an entire production using MIDI, then change every sound without changing a single note: just reassign the plugins. You can transpose a melody up a fifth by selecting all the MIDI notes and shifting them. You can double the tempo by scaling the timing of all events. You can quantize timing errors without re-recording anything. None of this is possible with recorded audio.
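These edits are possible because a MIDI clip is just structured data. A minimal sketch in plain Python (the event format and function names are illustrative, not any DAW's actual API):

```python
# Each event: (start_beat, note_number, velocity, length_beats) -- a
# simplified, hypothetical representation of a DAW MIDI clip.
clip = [
    (0.0, 60, 84, 0.5),   # middle C
    (0.5, 64, 72, 0.5),   # the E above it
]

def transpose(events, semitones):
    """Shift every note's pitch; timing and velocity are untouched."""
    return [(t, n + semitones, v, d) for (t, n, v, d) in events]

def scale_time(events, factor):
    """Scale start times and lengths; factor=0.5 doubles the tempo."""
    return [(t * factor, n, v, d * factor) for (t, n, v, d) in events]

up_a_fifth = transpose(clip, 7)       # C -> G, E -> B
double_tempo = scale_time(clip, 0.5)
```

Shifting every pitch by 7 semitones is the entire implementation of "transpose up a fifth"; scaling every timestamp is the entire implementation of a tempo change.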
The History of MIDI: A Brief Background
MIDI was developed in 1981-1982 by a coalition of electronic instrument manufacturers (Roland, Korg, Oberheim, Sequential Circuits, and others) who recognized that a universal communication standard would benefit the entire industry. Before MIDI, electronic instruments from different manufacturers couldn't talk to each other.
The MIDI 1.0 specification was finalized in 1983 and adopted almost immediately across the industry. The first public demonstration was at the 1983 NAMM show, where a Roland Jupiter-6 synthesizer communicated with a Sequential Circuits Prophet-600, two completely different instruments from competing companies, in real time over a MIDI cable. The crowd was astonished.
The MIDI 1.0 standard has remained essentially unchanged since 1983, a remarkable feat of engineering longevity. A MIDI keyboard from 1985 plugged into a modern DAW still works. This backward compatibility is both MIDI's greatest strength and its greatest limitation.
MIDI 2.0 was ratified in 2020 and is gradually being adopted by manufacturers and software developers. It addresses the limitations of MIDI 1.0 (primarily the 7-bit resolution of most parameters, which allows only 128 possible values) with 32-bit resolution for all parameters, bidirectional communication, improved timing, and per-note expression built into the core protocol.
MIDI Messages: Every Type Explained
MIDI communicates through messages. Every action (pressing a key, moving a knob, sustaining a note) generates a specific type of MIDI message. Understanding these message types is essential for using MIDI effectively.
Note On and Note Off
The most fundamental MIDI messages. Note On fires when a key is pressed (or a pad is hit), carrying three pieces of data: the MIDI channel (1-16), the note number (0-127, where 60 = middle C), and the velocity (1-127, representing how hard the note was played). Note Off fires when the key is released, ending the note. If a Note On message is sent without a corresponding Note Off (a stuck note), the synthesizer holds the note indefinitely until a Note Off or All Notes Off message arrives.
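On the wire, each of these messages is just three bytes: a status byte carrying the message type and channel, then two data bytes. A sketch of the MIDI 1.0 encoding (byte layouts are from the MIDI 1.0 specification; the helper functions themselves are illustrative):

```python
def note_on(channel, note, velocity):
    """Build the three raw bytes of a MIDI 1.0 Note On message.
    channel is 1-16 as musicians count it; on the wire it is 0-15."""
    return bytes([0x90 | (channel - 1), note & 0x7F, velocity & 0x7F])

def note_off(channel, note, velocity=0):
    """Note Off uses status 0x80. Many devices also treat a Note On
    with velocity 0 as an equivalent Note Off."""
    return bytes([0x80 | (channel - 1), note & 0x7F, velocity & 0x7F])

msg = note_on(1, 60, 84)   # middle C on channel 1 at velocity 84
```

The `& 0x7F` masks reflect a core MIDI 1.0 rule: data bytes never set their top bit, which is how receivers tell data apart from status bytes.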
Velocity
Velocity measures how quickly the key traveled from its resting position to fully depressed, a proxy for how hard you struck it. The value ranges from 1 (whisper soft) to 127 (maximum force). Most synthesizers and sample libraries use velocity to control dynamics: higher velocity produces a louder, brighter, or more intense timbre. Velocity 0 is treated as a Note Off message in MIDI 1.0.
Velocity is the most powerful humanization tool available for programmed MIDI. A drum pattern where every hit has velocity 100 sounds robotic. A pattern where kick hits range from 90-110, snare hits vary between 85-105, and hi-hats fluctuate between 60-90 sounds natural. Programming realistic velocity variation is one of the most important skills in electronic music production.
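As a sketch of what that variation looks like in code (plain Python with an illustrative clip format, not any real DAW's API):

```python
import random

def humanize_velocities(hits, lo, hi, seed=None):
    """Assign each drum hit a random velocity in [lo, hi].
    `hits` is a list of (beat, note) pairs -- a hypothetical format."""
    rng = random.Random(seed)
    return [(beat, note, rng.randint(lo, hi)) for (beat, note) in hits]

# Eighth-note closed hi-hats (GM note 42) across one 4/4 bar.
hats = [(i * 0.5, 42) for i in range(8)]
pattern = humanize_velocities(hats, 60, 90, seed=1)
# Every hit lands somewhere between velocity 60 and 90.
```

A real performance is not uniformly random (downbeats tend to be louder), but even this crude randomization breaks the machine-gun effect of identical velocities.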
Pitch Bend
Pitch bend messages change the pitch of all notes currently playing on a channel. The range is typically one or two semitones up or down by default (adjustable per instrument). Pitch bend is used for guitar-style bends on keyboard instruments, expressive lead synth lines, and subtle tuning adjustments for expressiveness.
In MIDI 1.0, pitch bend affects all notes on a channel simultaneously: you cannot bend one note while holding another. This is one of the primary limitations that MPE (MIDI Polyphonic Expression) was designed to address.
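Pitch bend is the one channel voice message with 14-bit resolution: its two data bytes combine into a value from 0 to 16383, where 8192 means no bend. A sketch of the encoding (the helper function is illustrative; the byte layout is from the MIDI 1.0 specification):

```python
def pitch_bend(channel, amount):
    """Encode a pitch bend message. `amount` is -8192..+8191,
    0 = no bend. The 14-bit value splits into two 7-bit data bytes."""
    value = amount + 8192            # shift into the 0..16383 wire range
    lsb = value & 0x7F               # low 7 bits
    msb = (value >> 7) & 0x7F        # high 7 bits
    return bytes([0xE0 | (channel - 1), lsb, msb])

center = pitch_bend(1, 0)        # wheel at rest: data bytes 0x00, 0x40
max_up = pitch_bend(1, 8191)     # wheel fully up: 0x7F, 0x7F
```

How many semitones a full bend spans is decided by the receiving instrument's bend range setting, which is why the same wheel motion bends two semitones on one synth and an octave on another.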
Channel Pressure (Aftertouch)
Aftertouch is generated by pressing down harder on already-held keys after the initial note velocity has been captured. Channel pressure (also called monophonic aftertouch) sends a single pressure value for the entire channel; pressing any key harder changes that one value for every held note. Polyphonic aftertouch sends independent pressure values per note, but it requires more bandwidth and fewer controllers support it. Aftertouch is commonly mapped to vibrato, filter cutoff, or volume for expressive playing.
Program Change
Program Change messages switch the active preset or patch on a synthesizer or sound module. Sending Program Change 5 on channel 1 tells the receiving instrument to switch to preset 5. Used in live performance to switch sounds mid-song, and in DAW production to control patch selection on hardware instruments.
MIDI Channels: Organizing Multiple Instruments
MIDI has 16 channels, numbered 1-16. Each channel carries independent MIDI data, allowing a single physical MIDI connection to control up to 16 different instruments or sounds simultaneously.
In a traditional hardware setup, a keyboard controller might send data on channel 1, a drum machine might send and receive on channel 10, and a separate synthesizer might receive on channel 3. A single MIDI cable carries all of this data simultaneously, and each device filters out the channels that don't concern it.
In a DAW, MIDI channels let you direct MIDI data from a single controller to specific virtual instrument tracks. If you're using a multitimbral instrument β one that can play multiple different sounds simultaneously on different channels β MIDI channels determine which part of that instrument receives which data.
Channel 10 has been reserved for percussion by the General MIDI (GM) standard since 1991. GM sound modules respond to channel 10 MIDI notes with specific drum sounds (note 36 = bass drum, note 38 = snare drum, note 42 = closed hi-hat, etc.). This convention is deeply embedded in legacy equipment but less relevant in modern production where each drum sound is typically a separate instrument track.
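Two details above can be shown concretely: the channel rides in the low four bits of every channel voice message's status byte, and General MIDI fixes which percussion sound each channel-10 note triggers. A small Python sketch (the drum table covers only the three notes mentioned above; function names are illustrative):

```python
def message_channel(status_byte):
    """Channel voice messages carry the channel in the status byte's
    low 4 bits: 0-15 on the wire, shown to users as 1-16."""
    return (status_byte & 0x0F) + 1

# A few General MIDI percussion assignments for channel 10.
GM_DRUMS = {
    36: "Bass Drum 1",
    38: "Acoustic Snare",
    42: "Closed Hi-Hat",
}

def gm_drum_name(note):
    """Percussion sound a GM module plays for a channel-10 note;
    None for notes outside this deliberately small map."""
    return GM_DRUMS.get(note)

# A Note On with status byte 0x99 is a Note On addressed to channel 10.
```

This is why a single cable can serve a whole rig: every device reads the channel bits and simply ignores messages addressed elsewhere.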
MIDI CC (Control Change): Automating Everything
MIDI Control Change (CC) messages control parameters on an instrument or software that aren't covered by note data. There are 128 MIDI CC numbers (0-127); many have standard assignments defined by the MIDI specification, while others are freely assignable.
The most commonly used MIDI CC values:
- CC1 (Modulation Wheel): Typically mapped to vibrato, tremolo, or other expression parameters. One of the most used CCs in performance.
- CC7 (Channel Volume): Controls the overall volume of the channel.
- CC10 (Pan): Stereo position. 0 = full left, 64 = center, 127 = full right.
- CC11 (Expression): Like volume but more nuanced; used for performance-level dynamic swells within a phrase.
- CC64 (Sustain Pedal): While active, released notes keep sounding; their Note Offs take effect only when the pedal lifts. 0-63 = off, 64-127 = on.
- CC74 (Brightness / Filter Cutoff): Often assigned to filter cutoff on synthesizers. Used extensively in sound design and performance.
- CC91 (Reverb Send Level): Standard reverb send.
- CC123 (All Notes Off): Silences all active notes on the channel, the MIDI equivalent of clearing stuck notes.
In a DAW, you can draw CC automation directly into the MIDI piano roll or in a dedicated automation lane. Automating CC1 (modulation) over a synth pad creates a natural-sounding vibrato that grows over the course of a held note. Automating CC74 (filter cutoff) creates classic filter sweep effects. Automating CC11 (expression) creates volume swells in orchestral mockups that match how real strings naturally perform crescendos and diminuendos.
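A drawn CC lane is ultimately transmitted as a stream of discrete three-byte Control Change messages. A sketch of a linear ramp, such as a filter sweep on CC74 (the helper names are illustrative; the byte layout is from the MIDI 1.0 specification):

```python
def control_change(channel, cc, value):
    """Raw bytes of a Control Change message (status 0xB0)."""
    return bytes([0xB0 | (channel - 1), cc & 0x7F, value & 0x7F])

def cc_ramp(channel, cc, start, end, steps):
    """A crude automation ramp: `steps` evenly spaced CC values, the
    form in which a drawn CC curve actually reaches the instrument."""
    msgs = []
    for i in range(steps):
        value = round(start + (end - start) * i / (steps - 1))
        msgs.append(control_change(channel, cc, value))
    return msgs

sweep = cc_ramp(1, 74, 0, 127, 8)   # filter sweep from closed to open
```

Real DAWs send far more than 8 steps per sweep, but each step is still capped at 128 distinct values in MIDI 1.0, which is why slow filter sweeps can audibly "stair-step" (and why MIDI 2.0's 32-bit controllers matter).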
MIDI Controllers: Hardware That Generates MIDI
A MIDI controller is any hardware device that generates MIDI data. Controllers produce no audio of their own; they're input devices that send instructions to software or hardware instruments.
Keyboard Controllers
The most common MIDI controller type. They range from 25-key mini controllers (portable, good for travel and melody input) to 49-key, 61-key, 76-key, and full 88-key weighted controllers. Better keyboard controllers add aftertouch, semi-weighted or fully weighted keys (important for pianists), and additional controls like knobs, faders, and pads for workflow control. The Arturia KeyLab series, Native Instruments Komplete Kontrol, and Nektar Impact are popular options at various price points.
Pad Controllers
Velocity-sensitive pads optimized for drum programming and finger drumming. The Akai MPC and Ableton Push are the definitive pad controller formats. Pads are better than keyboards for programming natural-sounding drum patterns because you can hit them with varying force quickly, mimicking the physical act of drumming. Many pad controllers also include step sequencers for pattern-based composition.
MIDI Connection Types
Traditional MIDI uses 5-pin DIN connectors, the circular sockets found on hardware synthesizers and drum machines. Data flows in one direction only: MIDI Out sends, MIDI In receives. A MIDI Thru port passes the incoming MIDI signal along unchanged for daisy-chaining multiple devices.
USB MIDI has largely replaced traditional DIN MIDI for computer-based production. USB-MIDI controllers connect directly to a computer without requiring a separate MIDI interface. Each controller appears to the computer as a MIDI device, and the DAW can route MIDI between all connected devices.
Bluetooth MIDI is increasingly common for wireless controllers and mobile applications. MIDI 2.0 defines new USB and network transports for its Universal MIDI Packet format, which will gradually supplement traditional DIN and USB connections.
Using MIDI in a DAW: Piano Roll, Quantization, and Humanization
Every major DAW (Ableton Live, Logic Pro, FL Studio, Pro Tools, Cubase, Studio One) handles MIDI through a piano roll editor: a grid display where the horizontal axis represents time and the vertical axis represents pitch. MIDI notes appear as blocks on this grid, with their horizontal position determining when they play, their vertical position determining pitch, and their length determining duration.
Recording MIDI
Connect your MIDI controller, create an instrument track with a virtual instrument plugin, arm the track for recording, and play. The DAW captures every MIDI event as it happens (note timing, pitch, and velocity) as blocks in the piano roll. Unlike audio recording, you can edit every aspect of the performance after the fact without re-recording.
Quantization
Quantization moves MIDI note positions to align with a rhythmic grid. Full quantization at 1/16 notes moves every note start time to the nearest 16th-note position: perfectly in time but potentially robotic-sounding. Most DAWs offer percentage quantization (50%, 70%) that nudges notes toward the grid while preserving some human timing variation. The swing setting delays every other 16th note, creating the shuffled groove that defines hip-hop and swing music.
The general rule: quantize drums and bass tightly for a locked, modern feel. Quantize melodic and harmonic content less aggressively to preserve the expressiveness of the performance. Quantization in this sense applies to MIDI only, never to recorded vocals or acoustic instruments.
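Percentage quantization is simple arithmetic: find the nearest grid line and move the note some fraction of the way toward it. A sketch (positions in beats; the function is illustrative, not any DAW's implementation):

```python
def quantize(start, grid=0.25, strength=1.0):
    """Move a note's start (in beats) toward the nearest grid line.
    grid=0.25 is a 16th note; strength=1.0 snaps fully, 0.5 moves
    the note halfway (percentage quantization)."""
    nearest = round(start / grid) * grid
    return start + (nearest - start) * strength

snapped = quantize(1.2)                # a late 16th pulled onto the grid
halfway = quantize(1.2, strength=0.5)  # keeps half the human "push"
```

Swing would be one further step on top of this: after snapping, delay every other grid position by a fixed fraction of the grid spacing.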
Velocity Editing
After recording, you can adjust the velocity of individual MIDI notes in the piano roll. Velocity is typically shown as bars below the note grid; taller bars represent higher velocity. Drawing different velocity values for each hit in a drum pattern is how programmed drums begin to sound human. The kick might peak at 110, the snare at 95, and the hi-hats fluctuate between 60 and 80 with every note slightly different.
Step Sequencing
An alternative to real-time MIDI recording, step sequencing programs notes one at a time into a pattern grid. FL Studio's step sequencer was one of the first DAW implementations and remains one of the best. Ableton Live's Session View supports pattern-based MIDI clips alongside audio clips. Step sequencing is faster and more visual for drum programming and repetitive patterns than playing in real time and then editing.
MPE and MIDI 2.0: The Future of MIDI
The limitation of MIDI 1.0 that MPE addresses: pitch bend and pressure are channel-wide messages. On a standard MIDI channel, pitch bend affects every note simultaneously. You cannot bend just one note of a chord; you bend everything. This limits the expressive capability of keyboard controllers compared to instruments like guitar, where each string has fully independent pitch control.
MPE (MIDI Polyphonic Expression) assigns each individual note to its own MIDI channel, giving each note independent pitch bend, pressure, and slide data. An MPE controller like the ROLI Seaboard or LinnStrument can send a rising pitch bend on one note while holding two others static, or apply pressure to one note for vibrato while the others remain straight. The result is expressive capability approaching that of a bowed string instrument from a keyboard form factor.
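The core trick of MPE can be sketched as a channel allocator: each new note claims its own member channel, so channel-wide messages like pitch bend now affect exactly one note. A toy Python sketch (simplified; in a real MPE lower zone, channel 1 is reserved as the master channel and channels 2-16 are the member channels):

```python
class MPEAllocator:
    """Toy sketch of MPE's per-note channel rotation."""

    def __init__(self, first=2, last=16):
        self.free = list(range(first, last + 1))
        self.held = {}                  # note number -> channel

    def note_on(self, note):
        channel = self.free.pop(0)      # assumes a channel is still free
        self.held[note] = channel
        return channel

    def note_off(self, note):
        channel = self.held.pop(note)
        self.free.append(channel)       # recycle for later notes
        return channel

mpe = MPEAllocator()
c1 = mpe.note_on(60)   # first held note gets channel 2
c2 = mpe.note_on(64)   # second gets channel 3: independent bends
```

A real implementation must also decide what happens when more notes are held than member channels exist; controllers typically steal or double up channels at that point.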
MIDI 2.0 goes further, building per-note expression into the core protocol rather than leaving it as an extension, adding 32-bit resolution to all parameters (compared to 7-bit in MIDI 1.0), and introducing bidirectional communication: instruments can tell the DAW their capabilities and accept configuration data back. The adoption curve for MIDI 2.0 is gradual (hardware and software both need to update), but it represents the most significant advancement in the protocol's 40-year history.
Practical MIDI Exercises
Exercise 1 (Beginner): Program a Drum Pattern with Velocity Variation. Open a drum plugin (or any sampler with drum sounds) and create a simple 4/4 beat in the step sequencer or piano roll: kick on beats 1 and 3, snare on 2 and 4, closed hi-hat on every 8th note. Now open the velocity editor and set every hit to the same velocity (say, 100). Listen to how robotic it sounds. Now vary the velocities: kick at 100-110, snare at 90-100, hi-hats alternating between 60-70 for upbeats and 80-90 for downbeats. Listen to the difference. This exercise teaches you more about feel and groove than any amount of reading.
Exercise 2 (Intermediate): CC Automation for Expressive Pads. Load a soft synth pad. Record or program a simple 4-bar chord progression in MIDI. Now open the piano roll's CC lane and draw CC1 (modulation) automation that starts at 0 and gradually rises to 80 over the course of the 4 bars, then drops back down for the repeat. Listen to how the modulation makes the pad feel alive and breathing rather than static. Try the same with CC74 (filter cutoff) for a classic filter-sweep effect.
Exercise 3 (Advanced): Humanize a Programmed Melody. Program a simple 8-bar melody in the piano roll, perfectly on the grid, at a flat velocity of 80. Quantized, straight, robotic. Now apply humanization: randomly shift each note's start time by ±10ms, randomize velocity between 65-95 on a note-by-note basis, and add a very slight pitch bend on a few strategic notes. The melody should now feel significantly more like a real performance. Most DAWs have a humanize or randomize function that can do this automatically; explore it, then compare to manual velocity editing.
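For reference, the randomization described in Exercise 3 is only a few lines of code. A sketch (the note format is illustrative; real DAWs expose this as a Humanize or Randomize command):

```python
import random

def humanize(notes, timing_ms=10, vel_lo=65, vel_hi=95, seed=None):
    """Jitter each note's start by up to +/- timing_ms and assign a
    random velocity. `notes` is a list of (start_seconds, pitch)
    pairs -- a hypothetical clip format, not any DAW's API."""
    rng = random.Random(seed)
    out = []
    for start, pitch in notes:
        jitter = rng.uniform(-timing_ms, timing_ms) / 1000.0
        out.append((max(0.0, start + jitter),       # never before time zero
                    pitch,
                    rng.randint(vel_lo, vel_hi)))
    return out

melody = [(i * 0.25, 60 + i) for i in range(8)]     # rigid grid, flat feel
played = humanize(melody, seed=3)                   # jittered and varied
```

The interesting part of the exercise is comparing this uniform randomness against deliberate, musical velocity shaping; random jitter removes the robotic feel, but intentional accents are what create groove.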