The most important fact about this is that there is not a one-to-one mapping of MIDI to music notation . . .

THOUGHTS

MIDI is what I call "digital", but music notation is "analog" . . .
Calling music notation "analog" is a good way to distinguish music notation from MIDI--which is "digital" by definition--but there is more to this . . .
The problem is that everything in MIDI is based on integer values, which are high-level simplifications of real values . . .
In MIDI, "Middle C" is assigned the integer value "60", and the durations of notes similarly are given integer values based on "ticks", where the "tick" value is based on the defined "ticks per quarter note" or "parts per quarter (PPQ)" . . .
Tempo in MIDI is based on the number of microseconds per quarter note, and it also is an integer number . . .
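As a sketch of the way these integer values fit together, the following uses an assumed PPQ of 480 and the common default tempo of 500,000 microseconds per quarter note (120 BPM)--illustrative values rather than anything fixed by a particular program:

```python
# A minimal sketch of how MIDI represents pitch, duration, and tempo
# as integers. PPQ = 480 and 500,000 microseconds per quarter note
# (120 BPM) are common defaults, not requirements of the standard.

MIDDLE_C = 60                  # MIDI note number assigned to Middle C
PPQ = 480                      # ticks per quarter note (resolution)
TEMPO_US = 500_000             # microseconds per quarter note (120 BPM)

def bpm(tempo_us: int) -> float:
    """Beats per minute implied by a microseconds-per-quarter tempo."""
    return 60_000_000 / tempo_us

def ticks_to_seconds(ticks: int) -> float:
    """Convert a tick count to real-world seconds at the current tempo."""
    return ticks * TEMPO_US / PPQ / 1_000_000

print(bpm(TEMPO_US))             # 120.0
print(ticks_to_seconds(480))     # a quarter note lasts 0.5 seconds
print(ticks_to_seconds(240))     # an eighth note lasts 0.25 seconds
```

Everything stays an integer--note numbers, ticks, microseconds--until the final conversion to seconds, which is where the "digital" simplification of "analog" reality happens . . .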
Music notation is different from MIDI; and from my perspective music notation primarily is what I call "binary" in the sense that without tuplets and unusual time signatures, note durations relate by factors of two, where for example a standard eighth note has half the duration of a standard quarter note . . .
As you know, a single dot appended to a note increases its duration by half, and a double-dotted note increases the duration of the note by half and then additionally by half of the half, which makes a double-dotted quarter note the duration of a quarter note plus an eighth note plus a sixteenth note . . .
More to the point, there are no standard third notes, fifth notes, and other odd notes, where "odd" in this context is used in the standard way in integer arithmetic . . .
Everything is an "even" value rather than an "odd" value for what I call the "standard note durations" . . .
If you want "odd" note durations, then you need to use tuplets, since tuplets make it possible to create "odd" or "non-binary" durations--again with "binary" used in the context of factors of two rather than the way "binary" is used in computer science . . .
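The dot and tuplet arithmetic can be sketched with exact fractions--the function names here are illustrative, not from any notation program:

```python
# A sketch of "binary" note durations as exact fractions of a whole
# note. Standard durations are powers of two; dots add successive
# halves; tuplets are the only way to get "odd" denominators.
from fractions import Fraction

QUARTER = Fraction(1, 4)
EIGHTH = Fraction(1, 8)
SIXTEENTH = Fraction(1, 16)

def dotted(duration: Fraction, dots: int = 1) -> Fraction:
    """Each dot adds half of the previous addition."""
    total, addition = duration, duration
    for _ in range(dots):
        addition /= 2
        total += addition
    return total

# Double-dotted quarter = quarter + eighth + sixteenth
assert dotted(QUARTER, dots=2) == QUARTER + EIGHTH + SIXTEENTH

def tuplet(duration: Fraction, actual: int, normal: int) -> Fraction:
    """One note of an actual:normal tuplet, e.g. a triplet eighth
    is 3 notes in the time normally taken by 2 eighths."""
    return duration * normal / actual

print(tuplet(EIGHTH, 3, 2))   # 1/12 -- an "odd" third of a quarter note
```

Note that 1/12 has a denominator that is not a power of two, which is exactly why a "third note" needs a tuplet rather than a standard duration . . .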
The first time this became obvious to me was when I started working on drumkit rhythm patterns, which in 2010 was a new experience, since I had never done anything with percussion notation . . .
I knew there was percussion notation; but since I mostly do everything "by ear", playing rhythm patterns on a real drumkit was something I did intuitively after decades of listening to drummers, mostly on records and radio, but also by watching drummers in live performances . . .
My primary instruments are electric bass, electric guitar, and drums; but while I always have been fascinated with drumming, I never had a drumkit until about 15 years ago, and getting one was a fascinating experience that in retrospect I should have had earlier . . .
When I had real musical groups, the drummers would leave their drumkits every once in a while when we practiced at my house; and this was an opportunity for me to get a bit of "stick time"; but it's not the same as having your own drumkit . . .
On the other hand, there is a benefit to waiting decades to get a real drumkit; and one of the benefits involves mathematics and physics, as well as knowing about ergonomics and tonality . . .
I started with a basic Pulse drumkit (kick drum, snare drum, tom-tom, floor tom, hi-hats, and two cymbals) . . .
After the basic drumkit started to make a bit of sense, I discovered Prof. Sound's "Drum Tuning Bible", which introduced the importance of pitch in a very clear way, since according to Prof. Sound each drum has a specific pitch to which it needs to be tuned for optimal tonality and so forth . . .
Prof. Sound's "Drum Tuning Bible"

Whether I would have realized the importance of pitch and tonality with respect to a drumkit without the knowledge I gained from Prof. Sound is another matter; but while I think I would have discovered it eventually, it was considerably easier and more obvious with the information from Prof. Sound . . .
As I became more attuned to pitch and tonality, I embarked on an effort that took about a year, which mostly was focused on expanding the basic drumkit with more drums and then more cymbals, which soon evolved into adding Latin percussion instruments--mostly cowbells and wood blocks but also lots of other Latin percussion instruments . . .
I arranged and positioned the various percussion instruments with great attention to ergonomics, which included stacking cymbals so that with one downward movement of a drumstick I could play several cymbals and typically at least one cowbell or wood block . . .
As the drumkit increased in size, I found myself having to make large motions to play some of the drums and cymbals, which led to another bit of physics and ergonomics and specifically mapped to making 22" drumsticks, which work very nicely . . .
I use 5/8" oak dowels to make 22" drumsticks, and generally they last a while, since I am very careful where I hit drums, cymbals, and Latin percussion instruments, which mostly is a matter of ergonomics and making everything as efficient as possible . . .
Along the way, I discovered and use advanced kick drum pedals that have the ability to double or triple the number of hits from a single foot movement--you get hits on both the downward and upward foot movements--which makes them "multipliers", where again the goal is to be able to play the most drum notes with the least number of physical motions . . .
Duallist D4 Dual Pedal (Duallist Drum Pedals)

This led to the "Really Big Drumkit" and then, when I added a second kick drum and more tom-toms, to the "Really Bigger Drumkit", which at present has what I consider to be a reasonable array of percussion instruments, although from the perspectives of mathematics, physics, and ergonomics, there is much more that can be done with respect to moving everything literally and physically to a higher or three-dimensional level, if you prefer . . .
Drums, cymbals, and Latin percussion instruments generally are designed to be played exclusively with downward motions, and there are good reasons for this; but for example, if you have two identical cowbells in a stack where the space between them is perhaps no greater than six inches, then you can play each one with a single up and down hit by placing the drumstick in the space between the two cowbells, which maps to playing two identical cowbell notes rapidly with a simpler drumstick motion (up-and-down or down-and-up) as contrasted to playing each note separately on a single cowbell with downward motions . . .
And who says you can't have several snare drums, some of which are mounted to the sides and overhead?
The Really Bigger Drumkit

[NOTE: I run the drumkit through cascading echo units to make it possible to "play" even more notes without actually needing to do anything other than adjusting the repeat speeds of the echo units, which I also do for electric guitar. It's mathematics and physics, really. For reference, the drumkit part for "Nuke Out" was played in real-time on the fly with no overdubbing, which was done before I switched primarily to music notation and virtual instruments. Here in the sound isolation studio, percussion is rhythmic and melodic when there is enough of it tuned to different pitches . . . ]
MAPPING MIDI TO MUSIC NOTATION

What about mapping MIDI to music notation? Since there is not a one-to-one mapping of MIDI to music notation, converting MIDI to music notation is more of an art than a science at present, although I think there is a way to improve the accuracy and the practicality of the conversion, which primarily is a matter of using a bit of what one might call "artificial intelligence" or better yet "practical common sense" . . .
Some of this is supported in NOTION for real-time MIDI recording and subsequent conversion of recorded MIDI to music notation; and there are various MIDI recording options which can improve the accuracy of the conversion to music notation . . .
Quantizing is another technique which can in some instances improve the production of "practical common sense" music notation . . .
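As a sketch of what quantizing does at the tick level--assuming a PPQ of 480 and a sixteenth-note grid, which are illustrative values rather than NOTION's actual internals:

```python
# A minimal sketch of quantizing: snapping recorded MIDI tick
# positions to the nearest grid line. PPQ = 480 and a sixteenth-note
# grid are assumptions for illustration.

PPQ = 480                  # ticks per quarter note
GRID = PPQ // 4            # sixteenth-note grid = 120 ticks

def quantize(tick: int, grid: int = GRID) -> int:
    """Round a tick position to the nearest multiple of the grid."""
    return round(tick / grid) * grid

# A slightly rushed note and a slightly late note both snap to the grid:
print(quantize(113))    # 120
print(quantize(492))    # 480
```

The trade-off is that the "human" timing variations disappear along with the clutter, which is why quantizing helps the notation look sensible but can make the playback feel mechanical . . .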
I think that most keyboard players have a very specific idea of the way the music they play should look when converted or transcribed to music notation . . .
Sometimes this happens, but it's more likely that it doesn't happen . . .
As computers become faster and more capable of doing elaborate series of algorithms, the idea of using "artificial intelligence" to improve the quality of MIDI-to-music-notation conversions becomes more practical, and intuitively I think there are ways to do this sensibly . . .
Based on experiments over the past few years, my perspective is that when NOTION converts recorded MIDI to music notation, it nearly never matches what I think I am playing . . .
In fairness, it probably is exactly what I played, but since it always looks like something I have no idea how to play, I think the conversions can be improved with help from more advanced computer algorithms . . .
It's important to understand that what you think you were playing might not be exactly what you actually played; and for me this certainly is the case--in part because, as you might know, I taught myself how to play grand piano (and keyboard synthesizers) by dreaming about it for 20 years . . .
Based on being able to let my unconscious mind ("id" in Freudian terminology) run the show for a while, I can play extraordinarily rapid series of notes without having any immediately conscious idea what I am doing; and if I have no immediately conscious idea what I am doing, then how can a computer algorithm make sense of it?
NOTION makes it possible to record the MIDI, and as best as I can determine NOTION is able to make some level of sense of what I am doing as it maps the MIDI to music notation; but making it look "pretty" is another matter . . .
Consider this bit of Surf.Whammy synthesizer lunacy which I played and recorded in NOTION in real-time on the fly on a 25-key Behringer mini-MIDI keyboard, which was done as an exercise in ReWire that probably nobody ever does, but so what . . .
The song is "Faster" (Techno Squirrels), and it was a demo song that came with Reason 6 (Propellerhead Software) or a version from around that time . . .
I like the song, and I use it for educational purposes to show how one can enhance a song--which in this example from a few years ago (2014) is focused on Reason 7.1 (Propellerhead Software)--by adding additional instruments in NOTION with virtual instruments and music notation, some of which is recorded as MIDI and then converted to music notation, with all this being done in a ReWire session where a Digital Audio Workstation (DAW) application--in this example Studio One Producer 2.6.3--is the ReWire host controller and both NOTION and Reason are ReWire slaves . . .
[NOTE: The original version of "Faster" does not have a horn section, Surf.Whammy synthesizer, and Hammond B-3 Organ; and as I recall I also did some structural arranging to make the song longer, which might be in this version. Mostly, the goal was to determine whether it's possible to do all this stuff at the same time in a ReWire session; and it certainly is . . . ]
SUMMARY

Converting recorded MIDI to music notation is as much an art as it is a science, and since there is not a one-to-one mapping of MIDI to music notation, there is a lot of art involved in the decision making . . .
Remember that what you think you are playing probably is not exactly what you are playing . . .
It's probably not exactly what you thought you were playing, because (a) you are human and (b) MIDI and the computer are much faster at determining what you actually are doing in real-time on the fly than you might imagine, at least with respect to tracking and recording what you are doing in microseconds . . .
With a bit of practice, humans can do a reasonable job of keeping track of things in the range of approximately 24 to 60 milliseconds--provided, as I did, we reprogram our brains to increase the number and density of neural pathways between the auditory cortex and the frontal eye fields regions . . .
But even when you can perceive and perhaps play notes as rapidly as one note every 24 milliseconds (approximately 42 notes per second), if your timing is off by 1 millisecond, this is an approximation when time is measured in microseconds, "ticks", and so forth . . .
And then there is the matter of the perceptual apparatus of the brain doing some fascinating things to reduce the complexity of rapidly occurring events--in particular the Haas Effect, where two identical sounds occurring very rapidly are perceived as being only one sound but louder than the two individual sounds, which as best as I can hypothesize is a primitive defense mechanism to make the rapid sounds of bear or tiger paws appear louder so that you can get out of the way or do something to protect yourself . . .
Haas Effect (Wikipedia)

The horizontal bars--sequencer overlays--come directly from the MIDI, and they are precise . . .
How it maps to music notation is reasonable, but if it's not what you expected, then at present the strategies are (a) to adjust the MIDI recording configuration, (b) to quantize, and (c) to adjust the music notation and sequencer overlays manually . . .
Lots of FUN!