Hi! I'm currently using Notion for an upcoming musical theatre production for orchestra augmentation, playing the parts I'm not able to staff live. In addition to Notion, I have two other Macs running Apple MainStage for two keyboards.
My question is: can I automate patch changes for the MainStage computers through Notion MIDI? I'd like to program the Notion score to send MIDI program change information to MainStage on the other computers. Do I need to use ReWire to Ableton to do something like this? I'm trying to lower the CPU usage as much as possible. Thanks!
alankenny wrote: In addition to Notion, I have two other Macs running Apple MainStage for two keyboards.
Based on what you described, you have three Apple computers--one running NOTION and two running MainStage (Apple) . . .
Applications on all three Apple computers will be generating audio, so you need a Digital Audio Workstation (DAW) application to mix the audio to send to the sound reinforcement system . . .
You could run a DAW application on the same Apple computer you are using to run NOTION, but I would not do this in a ReWire session. Start NOTION first, and then start the DAW application so that it's not a ReWire session . . .
You will need external digital audio and MIDI interfaces so the three Apple computers can communicate via MIDI and can generate and work with analog audio signals . . .
The two Apple computers running MainStage can use their external digital audio and MIDI interfaces to send their audio to the external digital audio and MIDI interface for the primary Apple computer, so that its DAW application can mix it and then generate the output signals for the sound reinforcement system . . .
I think each Apple computer needs its own external digital audio and MIDI interface. It's easier to understand this way, and the general rule is that one Apple computer controls its external digital audio and MIDI interface . . .
You might need some additional MIDI routers or whatever else is necessary to connect all the MIDI cables, in addition to the external digital audio and MIDI interfaces . . .
NOTION needs to be able to send MIDI to the two MainStage applications . . .
The two MainStage applications need to get their generated audio to the DAW application, and this can be done via analog output signals sent to the external digital audio and MIDI interface for the DAW application, which is on the same Apple computer where you are running NOTION . . .
The key to determining whether this will work is the ability of MainStage to recognize specific MIDI notes as being commands to do various things with patches . . .
For example, Kontakt 5 instruments usually have keyswitches that cause various things to happen . . .
NOTION can send notes via MIDI, and when the receiving application or instrument has specific notes defined to be keyswitches, a sequence of MIDI notes sent by NOTION can trigger the keyswitches, which then causes something to happen . . .
NOTION does not send MIDI program change commands directly. There might be ways to do it indirectly, but it's probably vastly complex, if it's possible at all . . .
In contrast, sending the MIDI command to play a note is trivial . . .
If you can configure an "instrument" in MainStage so that it does something specific when it receives an E1 (the low-pitch "E" string of an electric bass at standard reference tuning, where "Concert A" = 440 Hz), then you can write music notation on an External MIDI staff in NOTION to play E1 on the MainStage "instrument" by sending it the respective MIDI command to play the E1 note . . .
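To make the note-versus-program-change distinction concrete, here's a minimal Python sketch of the raw bytes behind the two MIDI messages in question. This is purely my illustration of the MIDI 1.0 wire format--NOTION and MainStage handle these bytes internally--and the channel and program numbers are arbitrary examples:

```python
# Raw MIDI channel-voice messages (status byte + data bytes), showing why
# the keyswitch workaround fits: NOTION can send note messages, and
# MainStage can be configured to treat a specific note as a patch switch.

def note_on(channel: int, note: int, velocity: int = 100) -> bytes:
    """Note On: status 0x90 | channel, then note number and velocity."""
    assert 0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127
    return bytes([0x90 | channel, note, velocity])

def program_change(channel: int, program: int) -> bytes:
    """Program Change: status 0xC0 | channel, then one program number."""
    assert 0 <= channel <= 15 and 0 <= program <= 127
    return bytes([0xC0 | channel, program])

E1 = 28  # the low-pitch "E" string of an electric bass is MIDI note 28

print(note_on(0, E1).hex())         # the kind of message NOTION can send
print(program_change(0, 5).hex())   # what NOTION cannot send directly
```

Both are three bytes or fewer on the wire; the difference is only the status byte, which is why a note-based keyswitch is such a convenient stand-in for a program change.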
This is what comes to mind at the moment, and it's not something about which I know a lot, but so what . . .
The strategy here in the sound isolation studio is to find simple solutions to otherwise complex problems . . .
For example, I do everything with soprano treble clefs, which I can do because in NOTION I can define a soprano treble clef such that it plays notes one or two octaves higher or lower than notated . . .
This way, I only need to be proficient in soprano treble clef notation, which is fantastic because it's the only one I understand. All the other clefs are strange and require me to do real-time mental mapping to determine what the note actually is, which is totally impractical . . .
When I see a note on a bass clef, it's not the note at the same position on a soprano treble clef . . .
It's off by two staff positions, and I never can remember whether it's down or up; so I have to stop and think about it, which then causes me to have a "team" meeting in my mind, and perhaps in 15 minutes I hold a vote and decide by consensus that it's probably downward, hence what looks to me like High C actually is Low A or something that truly makes no sense . . .
Here in the sound isolation studio there are 12 notes and 10 octaves, 2 of which humans can't hear, hence are useful only when you want to entertain bats, birds, cats, dogs, dolphins, porpoises, sea turtles, and whales--noting that I learned just a few days ago that dogs don't hear deep bass or anything lower than 40 Hz, which for the most part means that dogs can't hear the low-pitch "E" string of Paul McCartney's Hofner Beatle Bass, if he tunes it down a half-step . . .
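The half-step claim checks out with a bit of equal-temperament arithmetic, assuming "Concert A" = 440 Hz (A4, MIDI note 69) and the standard formula f(n) = 440 × 2^((n − 69) / 12):

```python
# Equal-temperament frequency of a MIDI note, referenced to A4 = 440 Hz
# (MIDI note 69): at standard tuning the bass low E is just above 40 Hz,
# and tuning it down a half-step drops it just below.

def freq(midi_note: int) -> float:
    return 440.0 * 2.0 ** ((midi_note - 69) / 12)

E1 = 28   # the low-pitch "E" string of an electric bass
Eb1 = 27  # the same string tuned down a half-step

print(round(freq(E1), 1))   # about 41.2 Hz -- above the ~40-Hz threshold
print(round(freq(Eb1), 1))  # about 38.9 Hz -- below it
```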
I also don't use dynamics, most articulations (except for guitar whammys and string bends), and all that stuff--except that I like glissandi, since you can use one to play a bunch of notes, which is very efficient and optimal . . .
If I want something loud, then I crank the volume slider in the DAW application when I am doing the producing and audio engineering . . .
If I want vibrato, then I select a set of samples where the musician is playing the instrument with vibrato . . .
If I need five different articulations or playing styles for violin, then I have five staves of violin and each staff has a set of samples for the specific articulation or playing style . . .
If I want plucked violin notes, then I put the music notation for plucked notes on the staff where the violin is being plucked . . .
If I want bowed violin notes, then I put the music notation for bowed violin on the staff where the violin is being bowed, and so forth . . .
I have more staves, but each one is simple; and this works for me . . .
I might have 50 NOTION scores for one song, which is fine with me, since with 20 staves per NOTION score I can have 1,000 instruments--not all at the same time, of course, but I do everything myself, and I can only do one thing at a time, hence it's all good . . .
Another primary bit of information is that by the time you run the audio through a sound reinforcement system that people can hear, a lot of the stuff one can notate in music notation doesn't matter, because it all blurs . . .
And then there's the matter of everything needing to occur flawlessly in real-time with no possibility of any of the applications running on the Apple computers crashing . . .
The simpler you can make the system, the more likely it will perform reliably in a real performance . . .
Do you really want to be loading patches during a live performance while you are managing three computers and an ensemble of musicians, singers, and other performers?
NOTION supports four MIDI ports (A, B, C, D), and each MIDI port has 16 channels, which in total maps to 64 channels . . .
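The 4 ports × 16 channels arithmetic can be sketched as a simple enumeration (the port letters and channel counts are from NOTION as described above; the tuple representation is just my illustration):

```python
# Enumerate NOTION's External MIDI address space: 4 ports (A-D), each
# with 16 channels, giving 64 distinct (port, channel) destinations.

ports = ["A", "B", "C", "D"]
destinations = [(port, channel) for port in ports for channel in range(1, 17)]

print(len(destinations))    # 64 destinations in total
print(destinations[0])      # first destination: port A, channel 1
print(destinations[-1])     # last destination: port D, channel 16
```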
The native NOTION instruments are reasonably light in terms of system resources, and you can have several of them with no problems; and I think you can have a lot of External MIDI staves; but there are limits to what NOTION can handle per score; and you want to keep everything within bounds . . .
There are a few vastly resource-intensive VSTi virtual instruments, for example a few of the MachFive 3 advanced, chromatically-sampled instruments that support scripting and all that stuff; and when I use one of them, it might be the only instrument in a NOTION score, although I usually have a handful of low-impact instruments so I can tell where I am in a song (for example a kick drum, snare drum, and bass, all native NOTION instruments) . . .
The resource-intensive VSTi virtual instruments are high-quality and produce remarkably realistic audio, so I intentionally ensure there are maximum system resources available in NOTION when I am using one of them . . .
I did about a minute of what I call "scouting around", and it appears that you can configure MainStage to do keyswitch-based patch switching . . .
These YouTube videos explain how to do this in MainStage . . .
This YouTube video provides some useful information on AppleScript, MainStage, and Logic Pro X, as well as a quick overview of a MOTU MIDI router or whatever . . .
I am presuming (a) that you can control MainStage with a MIDI keyboard by playing notes on it and (b) that playing notes on a MIDI keyboard is the same as using music notation on an External MIDI staff in NOTION to cause NOTION to send the corresponding MIDI notes to MainStage . . .
I think you can do what you want to do, although how you do it might be different, which mostly is a matter of determining (a) what each application can do and (b) how to use the applications to do what you need to happen . . .
SUGGESTION: DO SOME EXPERIMENTS!
Do a simple experiment . . .
Set up a patch in MainStage that you can trigger with a MIDI keyboard, and then try to trigger the patch with music notation on an External MIDI staff in NOTION . . .
Keep it simple for the experiment . . .
You will need to run the MIDI cables and all that connectivity stuff, but it's something you will need to do, regardless . . .
The goal of the experiment is to determine whether MIDI notes sent from NOTION look the same to MainStage running on a different Mac as the same notes played on a MIDI keyboard . . .
Lots of FUN!
The Surf Whammys
Sinkhorn's Dilemma: Every paradox has at least one non-trivial solution!