I exported my project as a MIDI file and imported it into my DAW (Logic). Many note positions were slightly anticipated. E.g., in Notion I wrote the oboe's first note on the downbeat of the 5th bar (MIDI position 5 0 0 0); instead, in Logic, I see it as 4 4 4 236. Importing the MIDI file into Reaper gives the same result.
I tried to export the project as an XML file and to import it into Reason – since in Logic I can't – and the note positions were fixed. Now, given that I needed to continue this project in Logic, I exported a MIDI file from Reaper into Logic. Here all the instruments seem to have their correct note positions, except the Harp, which continues to present a similar anticipation issue (in Reaper it was correct this time).
Now, I can easily fix the wrong harp notes and go on with my project, but I wonder why this happened, and I'd like to avoid this workaround next time. Since in Notion I wrote the whole sheet in step-time mode or with the mouse – I haven't performed it or tweaked anything – shouldn't the MIDI file be neat and tidy?
You might try quantizing the score in NOTION first, before exporting it as MIDI . . .
[NOTE: This is how it looks in NOTION 6. I outlined it in red for clarity . . . ]
[NOTE: If you find computer stuff boring, you can skip this part and go directly to the postscript, where you will find another possible solution. I'm not trying to be boring; but this is the way my mind works; and once I get past the computer stuff, I tend to remember something useful or to have a good idea that doesn't require a bunch of computer stuff, which is easier to understand when you know (a) that I'm chatty and (b) that I drink a lot of very strong coffee, which (c) tends to make me even more chatty . . . ]
Whether quantizing to music notation will make a difference is another matter, but it might . . .
The problem is that there is no one-to-one mapping of music notation to MIDI, and vice-versa . . .
The duration and timing of notes in MIDI are specified differently from the way they are specified in music notation . . .
It's apples and oranges; and in practice this requires each application to use its own set of algorithms to do a "best guess" mapping of music notation to MIDI, going and coming . . .
You can read about the way note durations and timing are specified in standard MIDI at the following link:
Timing in MIDI Files (Prof. Christopher Dobrian, University of California at Irvine)
Consider this example of a Dorian scale starting on "Middle C" . . .
[NOTE: This is how it looks as music notation in Digital Performer 9 (MOTU) . . . ]
[NOTE: This is the corresponding MIDI as shown in MidiKit . . . ]
Digital Performer 9 shows the first note as "Middle C", but in the MIDI file the first note is C3 . . .
This is because there are two general ways to specify "Middle C"; and both of them are fine with MIDI . . .
One way says that "Middle C" is C3; and the other way says it's C4 . . .
Here in the sound isolation studio, "Middle C" is C4; and this is consistent with scientific pitch notation, which is the primary reason that I use C4 for "Middle C" . . .
Scientific Pitch Notation (Wikipedia)
If you look at the "Data" column value for the "C3" row, you will find "90 3C 57"; the "90" is the status byte for a Note On message on MIDI channel 1, the "3C" part is hexadecimal for the decimal number 60, which is the MIDI number for "Middle C", and the "57" is the velocity (decimal 87) . . .
The way an application assigns the scientific pitch notation value depends on the application; because as noted, MIDI doesn't care . . .
MIDI says that 60 is "Middle C"; so whatever maps to "Middle C" is fine with MIDI . . .
For reference, if you follow the generally accepted standard defined by the Acoustical Society of America, then "Middle C" is C4 . . .
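To make the naming business concrete, here is a small Python sketch that decodes the "90 3C 57" event from the MidiKit dump and names the note under either Middle C convention; the `note_name` helper is mine, not something from MidiKit or the MIDI specification:

```python
# Minimal sketch: decode the "90 3C 57" Note On event and name the note
# under either Middle C convention (C4 = scientific pitch / ASA; C3 = the
# convention MidiKit appears to use here).

def note_name(midi_number, middle_c_octave=4):
    """Name a MIDI note number; middle_c_octave selects the convention."""
    names = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    octave = midi_number // 12 - (5 - middle_c_octave)
    return f"{names[midi_number % 12]}{octave}"

status, note, velocity = bytes.fromhex("903C57")  # the Note On from the dump
assert status >> 4 == 0x9                         # high nibble 9 = Note On
print(note, note_name(note), note_name(note, 3))  # 60 C4 C3
print(velocity)                                   # 87 (hexadecimal 57)
```

Same note number, two different names; MIDI itself only ever sees the 60 . . .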
You also will note that the duration of each whole note is specified as being 1024 ticks . . .
It took about an hour or two, but after reading the information in "The Complete MIDI 1.0 Detailed Specification" about the structure of a MIDI file, I decided that in the Dorian scale MIDI file the tempo is set to 270 BPM by specifying the number of microseconds per quarter note as hexadecimal "03 64 0E", which is decimal 222,222; and if you divide decimal 60,000,000 by 222,222, you get 270 and a few extra bits that don't count . . .
From this, and knowing that a whole note is 1,024 ticks, I conclude that for this MIDI file (a) a quarter note has a duration of 256 ticks, which corresponds to 222,222 microseconds, and (b) a tick is approximately 868 microseconds, which is fabulous . . .
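The arithmetic can be checked in a few lines of Python (the hexadecimal value and the 1,024-ticks-per-whole-note figure are the ones from the MIDI file above):

```python
# The tempo and tick arithmetic for the Dorian scale MIDI file.
MICROSECONDS_PER_MINUTE = 60_000_000

tempo_us = 0x03640E              # "03 64 0E" -> 222,222 us per quarter note
bpm = MICROSECONDS_PER_MINUTE / tempo_us
print(tempo_us, round(bpm))      # 222222 270

ppq = 1024 // 4                  # whole note is 1,024 ticks, so quarter = 256
tick_us = tempo_us / ppq
print(ppq, round(tick_us))       # 256 868 (microseconds per tick)
```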
Where am I going with this?
I have a degree in Computer Science; and it took quite a while to wade through the MIDI specification just to discover how many microseconds are in a quarter note when the tempo is 270 BPM . . .
[NOTE: You can download the aforementioned, detailed MIDI 1.0 specification at the MIDI website. You need to register to download files. Registering doesn't cost anything, and they don't send you annoying emails. This is the official MIDI organization, so it's all good . . . ]
MIDI 1.0 Specification (MIDI.org)
Part of the reason it took me so long was that I looked at a hexadecimal dump of the MIDI file and for a while was puzzled by "FF 51 03 03 64 0E", which did not match what MidiKit showed for the Set Tempo command, which was "FF 51 03 64 0E" . . .
After pondering this for a while, I decided that the first "03" was providing the number of bytes or words that followed; so MidiKit didn't show it, based on the idea that it was extraneous information . . .
In the actual MIDI file, the value is "FF 51 03 03 64 0E", as you can see in the following image:
[NOTE: Per the specification, "FF 51" is the Set Tempo meta event; the first "03" is the length byte, giving the number of data bytes that follow; and "03 64 0E" is the hexadecimal value of the 24-bit number of microseconds per quarter note or beat . . . ]
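A small Python sketch of parsing that event (the byte layout follows the Set Tempo description above):

```python
def parse_set_tempo(event: bytes) -> int:
    """Parse a Set Tempo meta event: FF 51 03 tt tt tt."""
    assert event[0] == 0xFF and event[1] == 0x51, "not a Set Tempo meta event"
    length = event[2]                    # the first "03": three data bytes follow
    data = event[3:3 + length]
    return int.from_bytes(data, "big")   # 24-bit microseconds per quarter note

print(parse_set_tempo(bytes.fromhex("FF510303640E")))  # 222222
```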
Imagine how long it might take a team of software engineers to decide what note is the best fit when the duration is 759 ticks . . .
How much wiggle room do you allow?
What if it might be a note in a 7:5 ratio tuplet at a slow BPM tempo?
I can imagine answering those two questions requiring a series of team meetings and a lot of coffee and doughnuts . . .
When the numbers are all nice and pretty, it's probably not so difficult; but every application that works with MIDI and music notation is going to have a slightly different set of algorithms for converting (a) from music notation to MIDI and (b) from MIDI to music notation . . .
It's this way, because there is no exact one-to-one mapping of (a) music notation to MIDI and (b) MIDI to music notation . . .
It's as much an art as it is a science; so while it looks like Reaper did a pretty good job--at least better than Logic Pro--there were some differences . . .
MIDI TIME vs. DAW TIME
Making this all the more complex, (a) MIDI FILE TIME and (b) DAW TIME generally use different tick resolutions, even though both are relative (musical) time rather than wall clock time . . .

In DAW TIME, by common convention there are exactly 960 ticks per quarter note, although this is a convention rather than a rule.

In MIDI FILE TIME, the number of ticks per quarter note is whatever Pulses Per Quarter Note (PPQN) value is specified in the MIDI header, noting (a) that this is abbreviated to PPQ (Pulses Per Quarter) and (b) that the PPQN can be a different value depending on the application or device that created the MIDI sequence represented in the MIDI file . . .

The ticks themselves say nothing about wall clock time; the Set Tempo meta event is what maps them to microseconds. In the example (see above), the MIDI file tells us that the tempo is 270 beats per minute (BPM), which means that in 4/4 time each quarter note or beat lasts for 222,222 microseconds, or approximately 222 milliseconds, hence as shown in the calculation (see above), a measure is 888,888 microseconds . . .

However, when the MIDI file is imported to a DAW application and the transport is showing measures, beats, and ticks, the DAW's ticks generally are not the same as the MIDI file's ticks, because the DAW has to rescale the file's PPQN to its own internal resolution; and any rounding done during that rescaling can nudge a note a few ticks off the grid . . .
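To make the "a few ticks off" idea concrete, here is a toy model in Python. This is purely my speculation: the tempo is the one from the example above, but the detour through whole milliseconds is a hypothetical importer, not a claim about how Logic Pro X or any other DAW actually works:

```python
# Toy model of an importer that converts ticks to whole milliseconds and
# back; the integer truncation lands the note a few ticks early.
tempo_bpm = 270
beat_ms = 60_000 / tempo_bpm        # 222.222... ms per quarter note

start_beats = 16                    # downbeat of bar 5 in 4/4
ms = int(start_beats * beat_ms)     # truncated to 3555 ms
dst_ppq = 960                       # a common DAW internal resolution
tick = int(ms / beat_ms * dst_ppq)  # 15357
print(tick, start_beats * dst_ppq)  # 15357 15360: three ticks early
```

A note written exactly on a downbeat comes back slightly before it, which is the same flavor of anticipation described in the original question . . .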
When the Dorian scale MIDI sequence is imported to Digital Performer 9 (MOTU), you can see how this works by comparing wall clock or absolute time to samples and measures|beats|ticks, as shown in the following transport images . . .
[NOTE: Wall clock or absolute time is shown at the left in each image, and it's determined by the tempo and PPQ in the MIDI file, as it's specified in the MIDI file. It will change if the tempo is changed in the DAW application; but in these images it's based on the information in the original MIDI file. The sample rate for the selected VSTi virtual instrument (an Addictive Keys [XLN Audio] grand piano) is 44,100 samples per second, which is standard CD audio quality. You can verify this by dividing the total number of samples (372,383) by 44,100, which is approximately 8.44 with a bit of rounding. The actual value to some degree of precision is "8.4440589569161", but so what. For reference, the samples are not from the MIDI file. Instead, the samples are provided by Addictive Keys for its grand piano; and the exact number of samples is determined by the sample rate to which Digital Performer 9 is set, which in this example is standard CD audio quality (44,100 samples per second); so the number of samples is an exact match for the wall clock or absolute time and has little to do with MIDI TIME ticks or DAW TIME ticks, other than MIDI TIME ticks defining the initial conditions . . . ]
Digital Performer 9: Start Value for Dorian Scale MIDI Sequence Samples
Digital Performer 9: End Value for Dorian Scale MIDI Sequence Samples
Digital Performer 9: Start Value for Dorian Scale MIDI Sequence Measures|Beats|Ticks
Digital Performer 9: End Value for Dorian Scale MIDI Sequence Measures|Beats|Ticks
Based on this additional information--specifically that DAW TIME commonly is defined to have exactly 960 ticks per beat in measures|beats|ticks or bars|beats|ticks units, which generally differs from the PPQN in the MIDI file--I think this explains why in Logic Pro X the music notation converted from the MIDI sequence you imported does not align as precisely as the MusicXML you imported to Reaper and then converted to MIDI, exported, and so forth . . .
But it also could be the way the MIDI to music notation algorithms work in Logic Pro X . . .
Regardless, it's not so straightforward and simple; because there is not an exact one-to-one mapping of MIDI to music notation, and vice-versa . . .
[NOTE: Adding yet another twist to the various rules, in Reason (Propellerhead Software) the bars|beats|ticks format is extended to include 16ths, becoming bars.beats.16ths.ticks, since Reason uses dots instead of vertical bars as separators; a 16th is exactly 240 ticks in this scheme, remembering that these "ticks" are different from MIDI file ticks but are the same "ticks" that Digital Performer 9 and other DAW applications use in the corresponding conventionally defined measures|beats|ticks or bars|beats|ticks formats . . . ]
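As a sketch of how that extended format works, here is a Python helper that formats an absolute tick position the way Reason's transport appears to count (assuming 960 ticks per quarter note, 4/4 time, and 1-based bars, beats, and 16ths; the function name is mine):

```python
def to_reason_position(tick, ppq=960, beats_per_bar=4):
    """Format an absolute tick count as Reason-style bars.beats.16ths.ticks."""
    sixteenth = ppq // 4                        # 240 ticks per 16th
    bar, rest = divmod(tick, ppq * beats_per_bar)
    beat, rest = divmod(rest, ppq)
    s16, ticks = divmod(rest, sixteenth)
    return f"{bar + 1}.{beat + 1}.{s16 + 1}.{ticks}"

print(to_reason_position(0))      # 1.1.1.0 (start of the sequence)
print(to_reason_position(15360))  # 5.1.1.0 (downbeat of bar 5)
```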
Reason 9: Start Value for Dorian Scale MIDI Sequence
[NOTE: The tempo reverts to the default tempo at the end of the imported MIDI sequence, hence 120 BPM rather than 270 BPM . . . ]
Reason 9: End Value for Dorian Scale MIDI Sequence
For stuff like this, if you have a workaround that doesn't take a lot of time, then be happy . . .
Lots of FUN!
P. S. The current version of Logic Pro X (10.3.2) imports MusicXML, so the solution might be to upgrade NOTION 5 to NOTION 6 and then to upgrade Logic Pro X (10.3.1) to Logic Pro X (10.3.2). The upgrade to the current version of Logic Pro X is free, so check the Mac App Store for upgrades . . .
I did a quick experiment using a MusicXML file that I created several months ago in NOTION 6; and Logic Pro X (10.3.2) imported it with no problems, which is great . . .
[NOTE: Whether this will result in a better mapping is another matter, but it certainly might. For reference, Logic Pro X also exports MusicXML, which it will do based on the instruments you select; so it can be one instrument or a complete score. MusicXML might provide a straightforward back-and-forth solution, since NOTION 6 does this, too . . . ]
P. P. S. There's a typo in the following sentence from your post:
"I tried to export the project as an XML file and to import it into Reason . . . "
Correction: "Reason" should be "Reaper" . . .
Reason (Propellerhead Software) doesn't do music notation and MusicXML . . .
The Surf Whammys
Sinkhorn's Dilemma: Every paradox has at least one non-trivial solution!