
Quickest Way to Fix (or Prevent) This MIDI>Notation Conversion Error

Posted: Mon Aug 12, 2019 5:47 pm
by brundle-fly
Below are before and after screenshots of how Notion 5 converted recorded MIDI input to notation, and how it had to be corrected. The problem is that the correction took a lot of steps, and the added 16th note now plays with a much higher velocity than the original. I haven't yet figured out how to correct that.

Ideally, Notion would just get this right in the first place. Are there recording resolution settings that might fix this? If not, what steps would you take to get to the corrected notation with the fewest clicks, and have the added-and-tied 16th note play back with the correct velocity?

Before
Incorrect Notation.png


After
Corrected Notation.png

Re: Quickest Way to Fix (or Prevent) This MIDI>Notation Conversion Error

Posted: Tue Aug 13, 2019 4:19 am
by Surf.Whammy
The most important fact about this is that there is not a one-to-one mapping of MIDI to music notation . . . :ugeek:

THOUGHTS

MIDI is what I call "digital", but music notation is "analog" . . .

Calling music notation "analog" is a good way to distinguish music notation from MIDI--which is "digital" by definition--but there is more to this . . .

The problem is that everything in MIDI is based on integer values, which are high-level simplifications of real values . . .

In MIDI, "Middle C" is assigned the integer value "60", and the durations of notes similarly are given integer values based on "ticks", where the "tick" value is based on the defined "ticks per quarter note" or "parts per quarter (PPQ)" . . .

Tempo in MIDI is based on the number of microseconds per quarter note, and it also is an integer . . .
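
To make the integer arithmetic concrete, here is a minimal Python sketch; the PPQ of 480 and the tempo of 500,000 microseconds per quarter note (120 BPM) are assumed example values rather than anything taken from a specific MIDI file . . .

[code]
# Minimal sketch of how MIDI represents time with integers.
# PPQ and TEMPO are illustrative values, not from any particular file.

PPQ = 480          # ticks per quarter note (parts per quarter)
TEMPO = 500_000    # microseconds per quarter note (i.e., 120 BPM)

def ticks_to_seconds(ticks: int) -> float:
    """Convert a tick count to seconds for the given PPQ and tempo."""
    return ticks * TEMPO / PPQ / 1_000_000

def seconds_to_ticks(seconds: float) -> int:
    """Convert seconds back to ticks; note the rounding to an integer."""
    return round(seconds * 1_000_000 * PPQ / TEMPO)

middle_c = 60       # MIDI note number for Middle C
quarter = PPQ       # a quarter note lasts exactly PPQ ticks
eighth = PPQ // 2   # an eighth note is half that, still an integer

print(ticks_to_seconds(quarter))   # 0.5 seconds at 120 BPM
print(seconds_to_ticks(0.5))       # 480 ticks
[/code]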

Music notation is different from MIDI; and from my perspective music notation primarily is what I call "binary" in the sense that, without tuplets and unusual time signatures, note durations are related by factors of two, where for example a standard eighth note has half the duration of a standard quarter note . . .

As you know, a single dot appended to a note increases its duration by half, and a double-dotted note increases the duration of the note by half and then additionally by half of the half, which makes a double-dotted quarter note the duration of a quarter note plus an eighth note plus a sixteenth note . . .

More to the point, there are no standard third notes, fifth notes, and other odd notes, where "odd" in this context is used in the standard way in integer arithmetic . . .

Everything is an "even" value rather than an "odd" value for what I call the "standard note durations" . . .

If you want "odd" note durations, then you need to use tuplets, since tuplets make it possible to create "odd" or "non-binary" durations--again with "binary" used in the context of multiples of two rather than the way "binary" is used in computer science . . .
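
As a quick illustration of the "binary" durations, the dots, and why tuplets are needed for anything else, here is a small Python sketch; the PPQ of 480 is again just an assumed example value . . .

[code]
# Sketch of "binary" note durations versus tuplets, measured in ticks.
# PPQ = 480 is an assumed example value.

PPQ = 480

quarter   = PPQ        # 480 ticks
eighth    = PPQ // 2   # 240 ticks
sixteenth = PPQ // 4   # 120 ticks

# A dot adds half the note's value; a second dot adds half of that half.
dotted_quarter        = quarter + eighth                # 720 ticks
double_dotted_quarter = quarter + eighth + sixteenth    # 840 ticks

# An eighth-note triplet squeezes three notes into the span of one quarter
# note, so each triplet eighth is 480 / 3 = 160 ticks -- a duration you
# cannot build from the "binary" values above without a tuplet.
triplet_eighth = PPQ // 3    # 160 ticks

print(dotted_quarter, double_dotted_quarter, triplet_eighth)
[/code]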

The first time this became obvious to me was when I started working on drumkit rhythm patterns, which in 2010 was a new experience, since I had never done anything with percussion notation . . .

I knew there was percussion notation; but since I mostly do everything "by ear", playing rhythm patterns on a real drumkit was something I did intuitively after decades of listening to drummers, mostly on records and radio, but also by watching drummers in live performances . . .

My primary instruments are electric bass, electric guitar, and drums; but while I always have been fascinated with drumming, I never had a drumkit until about 15 years ago, and getting one was a fascinating experience that in retrospect I should have had earlier . . .

When I had real musical groups, the drummers would leave their drumkits every once in a while when we practiced at my house; and this was an opportunity for me to get a bit of "stick time"; but it's not the same as having your own drum kit . . .

On the other hand, there is a benefit to waiting decades to get a real drumkit; and one of the benefits involves mathematics and physics, as well as knowing about ergonomics and tonality . . .

I started with a basic Pulse drumkit (kick drum, snare drum, tom-tom, floor tom, hi-hats, and two cymbals) . . .

Once the drumkit started to make a bit of sense, I discovered Prof. Sound's "Drum Tuning Bible", which introduced the importance of pitch in a very clear way, since according to Prof. Sound each drum has a specific pitch to which it needs to be tuned for optimal tonality and so forth . . .

Prof. Sound's "Drum Tuning Bible"

Whether I would have realized the importance of pitch and tonality with respect to a drumkit without the knowledge I gained from Prof. Sound is another matter, but while I think I would have discovered it eventually, it was considerably easier and more obvious with the information from Prof. Sound . . . :+1

As I became more attuned to pitch and tonality, I expanded the basic drumkit and embarked on an effort that took about a year, which mostly was focused on expanding the basic drumkit with more drums, and then with more cymbals, which soon evolved into adding Latin percussion instruments--mostly cowbells and wood blocks but also lots of other Latin percussion instruments . . .

I arranged and positioned the various percussion instruments with great attention to ergonomics, which included stacking cymbals so that with one downward movement of a drumstick I could play several cymbals and typically at least one cowbell or wood block . . .

As the drumkit increased in size, I found myself having to make large motions to play some of the drums and cymbals, which led to another bit of physics and ergonomics and specifically mapped to making 22" drumsticks, which work very nicely . . .

I use 5/8" oak dowels to make 22" drumsticks, and generally they last a while, since I am very careful where I hit drums, cymbals, and Latin percussion instruments, which mostly is a matter of ergonomics and making everything as efficient as possible . . .

Along the way, I discovered and use advanced kick drum pedals that have the ability to double or triple the number of hits from a single foot movement--you get hits on both the downward and upward foot movements--which makes them "multipliers", where again the goal is to be able to play the most drum notes with the least number of physical motions . . .

Duallist D4 Dual Pedal (Duallist Drum Pedals)

This led to the "Really Big Drumkit" and then, when I added a second kick drum and more tom-toms, to the "Really Bigger Drumkit", which at present has what I consider to be a reasonable array of percussion instruments, although from the perspectives of mathematics, physics, and ergonomics, there is much more that can be done with respect to moving everything literally and physically to a higher or three-dimensional level, if you prefer . . .

Drums, cymbals, and Latin percussion instruments generally are designed to be played exclusively with downward motions, and there are good reasons for this; but for example, if you have two identical cowbells in a stack where the space between them is perhaps no greater than six inches, then you can play each one with a single up and down hit by placing the drumstick in the space between the two cowbells, which maps to playing two identical cowbell notes rapidly with a simpler drumstick motion (up-and-down or down-and-up) as contrasted to playing each note separately on a single cowbell with downward motions . . .

And who says you can't have several snare drums, some of which are mounted to the sides and overhead?

Image
The Really Bigger Drumkit

[NOTE: I run the drumkit through cascading echo units to make it possible to "play" even more notes without actually needing to do anything other than adjusting the repeat speeds of the echo units, which I also do for electric guitar. It's mathematics and physics, really. For reference, the drumkit part for "Nuke Out" was played in real-time on the fly with no overdubbing, which was done before I switched primarily to music notation and virtual instruments. Here in the sound isolation studio, percussion is rhythmic and melodic when there is enough of it tuned to different pitches. . . ]
[video]


MAPPING MIDI TO MUSIC NOTATION

What about mapping MIDI to music notation?

Since there is not a one-to-one mapping of MIDI to music notation, converting MIDI to music notation is more of an art than a science at present, although I think there is a way to improve the accuracy and the practicality of the conversion, which primarily is a matter of using a bit of what one might call "artificial intelligence" or better yet "practical common sense" . . .

Some of this is supported in NOTION for real-time MIDI recording and subsequent conversion of recorded MIDI to music notation; and there are various MIDI recording options which can improve the accuracy of the conversion to music notation . . .

Quantizing is another technique which can in some instances improve the production of "practical common sense" music notation . . .

I think that most keyboard players have a very specific idea of the way the music they play should look when converted or transcribed to music notation . . .

Sometimes this happens, but it's more likely that it doesn't happen . . .

As computers become faster and more capable of doing elaborate series of algorithms, the idea of using "artificial intelligence" to improve the quality of MIDI-to-music-notation conversions becomes more practical, and intuitively I think there are ways to do this sensibly . . .

Based on experiments over the past few years, my perspective is that when NOTION converts recorded MIDI to music notation, it almost never matches what I think I am playing . . .

In fairness, it probably is exactly what I played, but since it always looks like something I have no idea how to play, I think the conversions can be improved with help from more advanced computer algorithms . . .

It's important to understand that what you think you were playing might not be exactly what you actually played; and for me this certainly is the case--in part because, as you might know, I taught myself how to play grand piano (and keyboard synthesizers) by dreaming about it for 20 years . . .

Based on being able to let my unconscious mind ("id" in Freudian terminology) run the show for a while, I can play extraordinarily rapid series of notes without having any immediately conscious idea what I am doing; and if I have no immediately conscious idea what I am doing, then how can a computer algorithm make sense of it?

NOTION makes it possible to record the MIDI, and as best as I can determine NOTION is able to make some level of sense of what I am doing as it maps the MIDI to music notation; but making it look "pretty" is another matter . . .

Consider this bit of Surf.Whammy synthesizer lunacy which I played and recorded in NOTION in real-time on the fly on a 25-key Behringer mini-MIDI keyboard, which was done as an exercise in ReWire that probably nobody ever does, but so what . . .

The song is "Faster" (Techno Squirrels), and it was a demo song that came with Reason 6 (Propellerhead Software) or some version from around that time . . .

I like the song, and I use it for educational purposes to show how one can enhance a song--which in this example from a few years ago (2014) is focused on Reason 7.1 (Propellerhead Software)--by adding additional instruments in NOTION with virtual instruments and music notation, some of which is recorded as MIDI and then converted to music notation, with all this being done in a ReWire session where a Digital Audio Workstation (DAW) application--in this example Studio One Producer 2.6.3--is the ReWire host controller and both NOTION and Reason are ReWire slaves . . .

[NOTE: The original version of "Faster" does not have a horn section, Surf.Whammy synthesizer, and Hammond B-3 Organ; and as I recall I also did some structural arranging to make the song longer, which might be in this version. Mostly, the goal was to determine whether it's possible to do all this stuff at the same time in a ReWire session; and it certainly is . . . :+1 ]

[video]


SUMMARY

Converting recorded MIDI to music notation is as much an art as it is a science, and since there is not a one-to-one mapping of MIDI to music notation, there is a lot of art involved in the decision making . . .

Remember that what you think you are playing probably is not exactly what you are playing . . .

It's probably not exactly what you thought you were playing, because (a) you are human and (b) MIDI and the computer are much faster at determining what you actually are doing in real-time on the fly than you might imagine, at least with respect to tracking and recording what you are doing in microseconds . . .

With a bit of practice, humans can do a reasonable job of keeping track of things in the range of approximately 24 to 60 milliseconds--provided, as I did, we reprogram our brains to increase the number and density of neural pathways between the auditory cortex and the frontal eye fields regions . . .

But even when you can perceive and perhaps play notes as rapidly as one note every 24 milliseconds (approximately 42 notes per second), if your timing is off by 1 millisecond, this is an approximation when time is measured in microseconds, "ticks", and so forth . . .

And then there is the matter of the perceptual apparatus of the brain doing some fascinating things to reduce the complexity of rapidly occurring events--in particular the Haas Effect, where two identical sounds occurring very rapidly are perceived as being only one sound but louder than the two individual sounds, which as best as I can hypothesize is a primitive defense mechanism to make the rapid sounds of bear or tiger paws appear louder so that you can get out of the way or do something to protect yourself . . .

Haas Effect (Wikipedia)

The horizontal bars--sequencer overlays--come directly from the MIDI, and they are precise . . .

How it maps to music notation is reasonable, but if it's not what you expected, then at present the strategies are (a) to adjust the MIDI recording configuration, (b) to quantize, and (c) to adjust the music notation and sequencer overlays manually . . .

Lots of FUN! :)

Re: Quickest Way to Fix (or Prevent) This MIDI>Notation Conversion Error

Posted: Tue Aug 13, 2019 3:10 pm
by Surf.Whammy
If you have Studio One Professional, then another possibility is to record the MIDI in Studio One and then send it to NOTION . . . :)

THOUGHTS

This is easy to do, and Studio One and NOTION interact very well . . .

Experiment with the various Studio One MIDI recording options, including quantizing after the MIDI is recorded . . .

Lots of FUN! :-)

Re: Quickest Way to Fix (or Prevent) This MIDI>Notation Conversion Error

Posted: Tue Aug 13, 2019 3:27 pm
by brundle-fly
The MIDI can have perfectly quantized note starts and durations, and Notion still won't place the 16th note where you can see the MIDI note starts in the overlay of the 'before' screenshot.

Yes, conversion of MIDI to notation is not an exact science, but I don't think it's too much to ask to have a perfectly quantized 16th note notated in the correct position. Cakewalk's staff view gets it right, but is much less editable than Notion, and doesn't ultimately produce printed output that looks as nice.

Basically, what's needed is a setting in Notion, similar to the 'Display Resolution' setting in Cakewalk's staff view, that determines the smallest note value used to render the notation. If I set that resolution to an 8th note in Cakewalk, the result looks like what Notion does by default. But I'm not seeing a way to control this in Notion.
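
For illustration only, here is a rough Python sketch of what a 'Display Resolution' style rule could do--snap note starts to the chosen grid and never render a length smaller than that grid. The numbers and the snapping rule are assumptions, not how Cakewalk or Notion actually implement it.

[code]
# Hypothetical sketch of a "Display Resolution" rule: note starts are
# rounded to the chosen grid, and rendered lengths never fall below one
# grid step. Assumed PPQ of 480; not Cakewalk's or Notion's actual code.

PPQ = 480
SIXTEENTH = PPQ // 4   # 120 ticks
EIGHTH = PPQ // 2      # 240 ticks

def snap_start(ticks: int, resolution: int) -> int:
    """Round a note start to the nearest multiple of the display resolution."""
    return round(ticks / resolution) * resolution

def snap_length(ticks: int, resolution: int) -> int:
    """Round a note length to the grid, but never below one grid step."""
    return max(resolution, round(ticks / resolution) * resolution)

# A sixteenth note played slightly late and slightly short:
raw_start, raw_length = 130, 110

# With a sixteenth-note display resolution it shows up where it was played...
print(snap_start(raw_start, SIXTEENTH), snap_length(raw_length, SIXTEENTH))  # 120 120

# ...but with an eighth-note resolution it is absorbed into the eighth grid,
# which is roughly what the default rendering looks like.
print(snap_start(raw_start, EIGHTH), snap_length(raw_length, EIGHTH))        # 240 240
[/code]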

I see the following item in the 6.x change log:

"Improved rhythmic spelling on MIDI or Studio One import"

Anyone think that might address this?

There are other ways to 'skin this cat' such as exporting as .MXL from Cakewalk and importing it to Notion for 'polishing'. I just wanted to see if I could get this done easily within Notion, and maybe learn some editing tricks in the process.

I did, however, figure out how to correct the velocity of the inserted note, so that's progress. ;^)

P.S. I would argue that MIDI is closer to an analog representation of music than standard notation, certainly when it comes to representing rhythmic subtleties.

Re: Quickest Way to Fix (or Prevent) This MIDI>Notation Conversion Error

Posted: Tue Aug 13, 2019 11:34 pm
by Surf.Whammy
These are the "MIDI Record" options in NOTION Preferences on the Mac . . . :)

Image

THOUGHTS

I think these are the default settings, since on the MacBook, I am running the demo version of NOTION and have not adjusted anything . . .

I like your idea of having a granularity setting, where you can specify that everything needs to be at eighth note granularity or some similarly useful value . . .

This would be similar to quantizing but perhaps better . . .

This would map to adjusting the durations and timing of notes, but I like the idea conceptually . . .

In effect, it says "no matter how imprecise or precise I play, map it so that the smallest duration is the duration of an eighth note or perhaps a sixteenth note" . . .

I think there is an algorithm for doing something like this, and it might not be an elaborate algorithm . . .

The starts and ends of notes are available in the recorded MIDI, so it's just a matter of doing what one might call "arbitrary smoothing", where the starts and ends of notes are set to the granularity of eighth or sixteenth notes, for example . . .
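
Here is a rough Python sketch of that kind of "arbitrary smoothing"; the grid sizes and note values are made up purely for illustration, and this is not anything NOTION actually does internally . . .

[code]
# Sketch of "arbitrary smoothing": snap the start and end of every recorded
# note to a chosen grid (eighth or sixteenth notes), and never let a note
# collapse to zero length. Illustration only, with an assumed PPQ of 480.

PPQ = 480

def smooth(notes, grid):
    """notes: list of (start_tick, end_tick, pitch); grid: ticks per grid step."""
    smoothed = []
    for start, end, pitch in notes:
        new_start = round(start / grid) * grid
        new_end = round(end / grid) * grid
        if new_end <= new_start:          # keep at least one grid step
            new_end = new_start + grid
        smoothed.append((new_start, new_end, pitch))
    return smoothed

# A sloppily played eighth note followed by a short sixteenth:
recorded = [(5, 233, 60), (362, 478, 62)]

print(smooth(recorded, PPQ // 4))   # sixteenth-note grid: [(0, 240, 60), (360, 480, 62)]
print(smooth(recorded, PPQ // 2))   # eighth-note grid:    [(0, 240, 60), (480, 720, 62)]
[/code]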

I seem to recall that at one time there was an option to allow triplets, but perhaps not . . .

It's all about mathematics, so I think it's not too difficult to have an option to allow or disallow dotted notes--at least one dot, or perhaps two . . .

This might be similar to what "quantizing" does, but I think the significant difference is that it adds parameters or options to control the way quantizing is done . . .

I do not do a lot of MIDI recording, but when I have done MIDI recording in NOTION and then quantized, it didn't appear to make a noticeable difference . . .

The NOTION folks might have done experiments with this idea; but I have seen posts like yours, in various flavors, appear every once in a while over the past 10 years . . .

I'm not in the sound isolation studio, so my ability to do experiments is limited; but for reference there are some very useful utility applications on the Mac that let you examine a MIDI file and see the exact values of nearly everything, some of which is hexadecimal and not so easy to interpret, but (a) it's there and (b) it starts making sense the more you learn about the MIDI Specification . . .

There probably are MIDI file examining and editing utility applications for Windows, as well . . .
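
If you want to do that kind of inspection without a dedicated utility application, a few lines of Python with the third-party mido library will dump everything as plain text; the file name below is just a placeholder for whatever MIDI file you recorded . . .

[code]
# Inspect the exact values in a MIDI file using the third-party "mido"
# library (pip install mido). The file name is a placeholder.
import mido

mid = mido.MidiFile("recording.mid")
print("ticks per quarter note (PPQ):", mid.ticks_per_beat)

for i, track in enumerate(mid.tracks):
    print(f"Track {i}: {track.name!r}")
    for msg in track:
        # Each message prints its type, channel, note number, velocity,
        # and delta time in ticks -- the same integers discussed above.
        print("   ", msg)
[/code]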

Roll back the clock several years, and MIDI was a new thing for me, which initially made no sense at all . . .

I found some excellent online courses on MIDI at Groove3 and macProVideo; purchased them; and then worked through the courses, which was very helpful . . .

[NOTE: The courses from both companies are available for Mac and Windows, and when you purchase them you can download them, so they're not strictly "online". Instead, they are on your computer once you download them and install the player software and the data sets for the courses, which is convenient. As I recall, one of the courses was simpler and didn't go into as much technical detail as the other, but both of them were very helpful. The Mac Pro (Early 2009) does not have MIDI ports, but I use MOTU external MIDI and digital audio interfaces, and at least one of them had standard MIDI ports. I have a KORG Triton Music Workstation (88 keys, weighted grand-piano style), and it's 10 or perhaps 15 years old, so it only has standard MIDI connectivity. The newer 25-key Behringer mini-MIDI keyboard has both USB and standard MIDI--except that "newer" in this context actually is not the most current version . . . ]

MIDI Explained (Groove3)

MIDI Course Library (macProVideo)

Image

I selected this specific Behringer MIDI keyboard and controller because it has everything--including inputs for two pedals--but it's a keyboard and MIDI controller, not a synthesizer, per se, although it has audio line inputs and outputs (stereo) and a stereo headphone port . . .

[NOTE: It has an input for a power supply, but as I recall it did not come with a power supply. After doing some research, I found a Jim Dunlop power supply that matched the specifications and cost about $10 (US) . . . ]

Image

It does not have weighted piano-style keys, but the keys are velocity-sensitive; and it was affordable (less than $150 [US]) . . .

The primary use was to be able to explore all the MIDI stuff, and it's excellent for this purpose, since among other things it supports sending various MIDI commands and all that stuff, which is totally complex but nice to have when you are studying MIDI . . .

Behringer has a newer version that has weighted keys and is a synthesizer; and it has standard 5-pin MIDI ports, as well as a USB port, various other ins and outs, and so forth. As a synthesizer, it's monophonic, but I think it's just a regular MIDI keyboard when used only as a MIDI keyboard--but probably monophonic . . .

The newer version costs more, but it has weighted piano-style keys, which looks to be better than the plastic keys of the older model . . .

It also has MIDI-In, MIDI-Out, and MIDI-Pass-Through ports . . .

The older model only has MIDI-Out . . .

One of the features of the old model that I like is the ability to configure the keys so they are not velocity sensitive, which makes the keys "ON/OFF" rather than being touch-sensitive . . .

I found a laptop "arm" at Amazon that works very nicely and makes it possible to have the Behringer mini-MIDI keyboard next to the computer display and regular keyboard . . .

Image

I can use the KORG Triton Music Workstation, of course, but it's not so convenient since when I am sitting in front of the computer display the Triton is in back of me . . .

Lots of FUN! :)