Discuss Notion Music Composition Software here.
To be as clear as possible, the strategy on the Studio One side is to use a set of ".song" files to consolidate existing Instrument Tracks and Audio Tracks by submixing and then exporting the submixes as audio clips (a.k.a., "stems") . . . :)

For reference, the way Instrument Tracks are "consolidated" begins by recording to corresponding Audio Tracks the audio generated by the respective Instrument Tracks and (if any) their effects plug-in Inserts and Automation lanes . . .

In this particular "ReWire MIDI" strategy, MIDI is not recorded to Instrument Tracks automagically . . .

If you need to record MIDI, then instead of using ReWire MIDI staves in the NOTION 6 score, you need to use External MIDI staves in the NOTION 6 score, which requires using a Virtual MIDI Cable and is not difficult to do; but External MIDI staves are different from ReWire MIDI staves, although for all practical purposes, with caveats for the difference, you can use them in similar ways . . .

ReWire MIDI staves are easier to connect to Studio One and using them does not require a Virtual MIDI Cable, because ReWire provides the necessary pipes . . .

[NOTE: A "pipe" is a low-level software construct for sending information back-and-forth between applications, but it also involves various aspects of hardware in the computer chips. The ReWire infrastructure provides a special set of "pipes" for this purpose, where for reference there are MIDI "pipes" and Audio "pipes" provided by the ReWire infrastructure. There also are what one can call Command "pipes" that are used to communicate various Transport events and instructions, although without doing a bit of reading to verify exactly how this aspect of ReWire works, I am comfortable writing that this type of "pipe" might not be what in UNIX is considered to be a pipe, but so what. There are communication channels, and some of them are more like pipes, while others are like a semaphore subsystem in low-level Windows. It's easier to conceptualize all this stuff in terms of being real-time communication channels, but I like the idea of pipes, which basically are various types of dedicated software "cables" or "wires" that have certain types of elevated privileges which are necessary to ensure timely processing. You don't need to understand all this low-level stuff to develop songs productively . . . ]

Image
ReWire Infrastructure Diagram (Propellerhead Software) ~ Pipes magnified for emphasis

[NOTE: This information is available to the public without NDA and other requirements. In ReWire terminology, Studio One is the "mixer application" in a ReWire session when it is the ReWire host controller; and NOTION 6 is the "panel application" in the ReWire session where it is a ReWire slave. To help put all this stuff into perspective, I have been doing low-level C/C++ programming for over 30 years, first in the Windows universe and more recently in the Mac OS X universe; and while I did not fall off a turnip truck, I am simply astounded by the way Studio One and NOTION make everything happen automagically. This is not the easy part of software engineering--instead, it's very complex. I can read about it and make sense of what generally is happening, but that's not the same as knowing how to code it. At present, making sense of Reason Rack Extensions is a daunting task here in the sound isolation studio, but there is an occasional bit of progress, where for the past year or two I have been stuck on discovering how to avoid a totally annoying "click" sound when one of my Rack Extension knobs is rotated. Part of the conceptual problem is that I have not yet developed a fully coherent mapping for (a) "lollipops", (b) sounds I can make with an electric guitar and drumkit, and (c) music notation, where "lollipops" are individual digital samples, each of which has exactly two parameters: (1) amplitude and (2) relative arrival time. I will prevail in this worthy goal, but it's taking a bit of time. In some respects, I think it might be easier to get a doctorate in Quantum Mechanics . . . ]
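For what it's worth, one common cause of that kind of click is applying a new knob value to the audio instantly instead of ramping toward it. A typical mitigation is per-sample parameter smoothing with a one-pole ramp, sketched below in Python for clarity, although a Rack Extension would do this in its own audio code . . .

Code: Select all

import math

class SmoothedParameter:
    """One-pole smoother: the audible value ramps toward the knob's target a
    little on every sample, so an abrupt knob change never produces an
    instantaneous jump in the audio (a frequent cause of a "click")."""

    def __init__(self, initial=0.0, smoothing_ms=20.0, sample_rate=44100.0):
        self.value = initial
        self.target = initial
        # Per-sample coefficient: a larger smoothing_ms means a slower, smoother ramp.
        self.coeff = 1.0 - math.exp(-1.0 / (smoothing_ms * 0.001 * sample_rate))

    def set_target(self, new_value):
        self.target = new_value            # call this when the knob moves

    def next_value(self):
        self.value += self.coeff * (self.target - self.value)
        return self.value                  # call this once per audio sample

gain = SmoothedParameter(initial=1.0)
gain.set_target(0.25)                      # the knob jumps from 1.0 down to 0.25
ramp = [round(gain.next_value(), 4) for _ in range(8)]
print(ramp)                                # a gradual ramp instead of a hard step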

ReWire Technical Information (Propellerhead Software)

With each iteration or layer, the instrumental, vocal, and orchestral complexity of the song increases . . .

When you keep all the ".song" files, audio clips, NOTION 6 ReWire MIDI scores, and so forth in a high-level project folder for the song, then each step or layer can be revisited at some later time (a) when you are focused on producing and mixing and (b) decide that something done earlier needs a bit of fine-tuning . . .

[NOTE: As a general rule, I think having 20 ReWire MIDI staves per NOTION 6 score is a convenient number; and it's useful to have a handful of ReWire MIDI staves common to all the NOTION 6 subscores, where these common ReWire MIDI staves are used to provide visual cues so that it's easier to determine where verses, choruses, bridges, and interludes begin and end. When you are composing counterpoint and harmonies, it's useful to include the relevant bass, rhythm chords, and melody staves, along with perhaps staves for kick drum and snare drum when you are working on some aspect of a "popular" genre song. By using only ReWire MIDI staves in your NOTION 6 subscores, it's very flexible with respect to the "reference" staves you decide to include . . . ]

Ideally, you want to refine your song development system so that everything always goes forward; but in this layered song development strategy, you always have the ability to revisit each layer . . .

Image

Based on my recent experiments using what I consider to be high-quality AUi and VSTi virtual instruments and AU and VST effects plug-ins, I think 10 new Instrument Tracks per Studio One ".song" layer is a good number that does not incur computing problems with respect to system resources and what one might call "peppy" real-time audio processing . . .

Done this way, you will add 10 new instruments and some reasonable number of vocal tracks per layer; so for example, if you need 1,000 instruments and 1,000 vocal tracks, then plan on having 100 or so ".song" layers. As a general rule, recorded Audio Tracks do not have the overhead and processing limitations of AUi and VSTi virtual instrument tracks, although there are limits to the total number of Audio Tracks . . .

One limit for Audio Tracks is the amount of processing they require; and the other limit for Audio Tracks is being able to manage them in a coherent way when you switch to producing and mixing (audio engineering) modes . . .

Let's say for example purposes that you have 100 Audio Tracks in a Studio One ".song", which you might be able to do . . .

How are you going to see all of them? :roll:

Perhaps you have 10 high-resolution displays and devise a way to spread the Studio One Mixing Board across all 10 high-resolution displays . . .

You can't see all of them at the same time with sufficient focus to be able to do anything actually productive; so while it might be an interesting way to spend a lot of money for no practical reason, it's mostly spinning wheels . . .

You certainly can hear and be consciously aware of several hundred tracks of audio, but when you switch to producing and mixing, you need to focus on the Gestalt, which is the high-level perspective that you create by focusing diligently on details as you create each layer . . .

Consolidating, submixing, and using a bit of high-level common sense is the strategy I recommend; and I recommend it, because it works . . .

As a guideline, I think you need to do about 10 songs this way to discover the specific set of procedures and rules that work best for what will become your complete system for digital music production . . .

It might take several months to do the first song this way; but the more times you develop songs this way, (a) the easier it becomes, (b) the less time it requires, and (c) the better it sounds, which is fabulous . . .

Fabulous! :+1

A BIT OF HISTORY

Roll back the clock by a half century, and everything was done on analog magnetic tape machines, which had a limited number of tracks . . . :reading:

For example, "Sgt. Pepper's Lonely Hearts Club Band" (Beatles) was recorded on 4--track analog magnetic tape machines in a strategy that involved recording a few things on one 4-track machine and then playing back as input to a second 4-track tape machine where the already recorded tracks were consolidated or submixed to one or two tracks, which then made two more tracks available for recording more instruments and vocals . . .

The problem with this strategy is that with each copying activity of audio from one analog magnetic tape machine to another, there was a potentially significant loss of amplitude and frequency information due to the physics of analog magnetic tape recording, where each copy adds noise and degrades the frequency response . . .

This problem was called "generational loss", where in the terminology "generation" referred to each time audio was copied and merged from one reel of analog magnetic tape to another reel of analog magnetic tape, with this "copying and merging" being done by a combination of tape machines and mixing boards . . .

Audio engineers and producers devised different strategies to compensate for "generational loss" when "bouncing" (or "ping-ponging"); and these strategies usually were sufficient to make the consequences of "copying and merging" transparent or unnoticed . . .

[NOTE: This is a vinyl record, but technically it's digital, because YouTube is digital, but so what. The only more accurate sound happens when you have the vinyl record and play it on a turntable through a calibrated, full-range studio monitor system. It's not a digital remaster, none of which sound anything like the vinyl records . . . ]

phpBB [video]


Today, we do everything in the digital music production universe; and there is no "electromagnetic generational loss" in the digital universe . . .

This makes it vastly easier to create songs that have virtual festivals of instruments and vocals, since there are no concerns about devising strategies for dealing with electromagnetic generational loss issues, because there are none . . .

In fact, when you are reasonably proficient with music notation and have a tasty set of high-quality virtual instruments and effects plug-ins, NOTION 6, and Studio One Professional, you can create your own "Sgt. Pepper's Lonely Hearts Club Band" extravaganza in your sound isolation studio, one track and one layer at a time, which is a bit beyond mind-boggling . . .

The procedures and rules are similar, and you need a well-defined and verified system for developing songs; but this definitely is something you can do; and if you play real instruments and sing, you can do this, too . . .

Lots of FUN! :)
Last edited by Surf.Whammy on Sun Mar 24, 2024 12:18 am, edited 1 time in total.

Surf.Whammy's YouTube Channel

The Surf Whammys

Sinkhorn's Dilemma: Every paradox has at least one non-trivial solution!
by Surf.Whammy on Fri May 11, 2018 2:49 am
"Surf Zot" is coming along nicely, but I want to add more instruments to show how the "ReWire MIDI" strategy is working; so I doubled the lead guitar and harmony guitar . . . :)

The new guitar tracks are panned to top-center, and they have the same music notation as the first set of lead and harmony guitars . . .

The differences are (a) that the doubled lead guitar is enhanced with a FabFilter Software Instruments "Timeless 2" flanger and (b) the doubled harmony guitar is enhanced with an IK Multimedia "T-RackS 5" CSR Classik Hall reverberation unit . . .

Image

[NOTE: In the "Advanced" section of Classik Hall, I set the low-frequency cut-off to 6 dB for frequencies below 1,000-Hz, but I probably will change this to 600-Hz for the cut-off frequency with a total cut-off, since low frequencies make reverberation muddy . . . ]
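For anyone who wants to experiment outside the plug-in, here is a rough Python sketch (using the third-party soundfile and scipy packages, with hypothetical file names) of high-passing a reverb send at 600-Hz; this is only a generic illustration of the idea, not what the Classik Hall unit does internally . . .

Code: Select all

import soundfile as sf
from scipy.signal import butter, sosfilt

# Hypothetical file names -- substitute your own reverb-send audio.
audio, fs = sf.read("reverb_send.wav")

# 4th-order Butterworth high-pass at 600 Hz: a steep roll-off below the corner
# frequency, roughly the "total cut-off" idea for keeping lows out of the reverb.
sos = butter(4, 600, btype="highpass", fs=fs, output="sos")
filtered = sosfilt(sos, audio, axis=0)

sf.write("reverb_send_highpassed.wav", filtered, fs)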

Image

Both of the doubled guitars are at a lower volume level (-6 dB), so what they do is a bit subtle . . .

Mostly, they add sustain and texture, but they also help position the lead and harmony guitars in the center of the mix ("top-center" when you listen with studio quality headphones) . . .

phpBB [video]


THOUGHTS

The NOTION 6 score now has 17 ReWire MIDI staves, but only the six new guitars are playing VSTi virtual instruments hosted in the Studio One Professional ".song", since the six new guitars are assigned to ReWire Bus 2 rather than to ReWire Bus 1, which makes doing songs in layers very easy . . .

So far, I can see all the music notation; but only the music notation on the ReWire Bus 2 staves is playing Instrument Tracks in the Studio One Professional ".song", which for reference is the second ".song" in the set of ".song" layers . . .

In the YouTube video (see above), you see the music notation for the "R. Rhythm Guitar", but what you hear is the recorded audio from the previous ".song" layer . . .

I am tempted to add some keyboard synthesizers and a Cyclop (Sugar Bytes) buzzy bass; but so far I am trying to control the tendency to add more stuff . . .

Nevertheless, now that I think about it, I might add some synthesized cat purring to one of the lead guitar tracks, which is something I did for the singing in "Sweet Hour of Prayer" and is explained in more detail in the corresponding forum topic (see below) . . .

Image

[NOTE: Cats purr at a frequency between 25-Hz and 150-Hz, which maps approximately to the time range of 7ms to 40ms per purr, depending on the way you do the mapping. At the lower frequency (25-Hz), there is one "purr" every 40 milliseconds, but at the higher frequency (150-Hz), there is one "purr" approximately every 7 milliseconds, which makes sense if you think about it a while. When you map it this way, cat purring is in the time range of the Haas Effect, except that it's not repeating what you are singing or playing. Instead, it's augmenting and enhancing with texture what you are playing or singing. Adding synthesized cat purring to an instrument or vocal track is Elvis Level Magic, and I like it! :+1 ]
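The frequency-to-time mapping is just the period of the purr, T = 1000 / f milliseconds, which a quick sketch confirms . . .

Code: Select all

# Period (in milliseconds) of a repeating event at frequency f (in Hz): T = 1000 / f
for f in (25, 150):
    print(f"{f} Hz -> one purr every {1000 / f:.1f} ms")

# 25 Hz -> one purr every 40.0 ms
# 150 Hz -> one purr every 6.7 ms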

Image

phpBB [video]


Project: "Sweet Hour of Prayer (PreSonus NOTION Forum)

Lots of FUN! :)
Last edited by Surf.Whammy on Sun Mar 24, 2024 12:19 am, edited 1 time in total.

Surf.Whammy's YouTube Channel

The Surf Whammys

Sinkhorn's Dilemma: Every paradox has at least one non-trivial solution!
by Surf.Whammy on Sat May 12, 2018 2:04 am
I added some synthesized cat purring to the lead guitar using Cyclop (Sugar Bytes) . . . :)

There was a lot of low-frequency material in the Cyclop preset, so I used a FabFilter Software Instruments "Pro-C" low-cut, mid-scoop, hi-cut filter to tailor the synthesized cat purring; and I lowered the already-recorded and submixed lead guitar stereo Audio Track by 1 dB to balance it with the newly added synthesized cat purring . . .

The low-frequency material was affecting the overall mix but mostly the perceived volume level of the Höfner "Beatle" Bass, hence the Pro-C filter, which is a producing and audio engineering technique . . .

Image

As soon as there are a few tracks in a song, everything starts interacting with respect to overlapping frequency ranges and volume levels . . .

The best way to control these interactions is to use filters, limiters, and "ducking" to partition instruments and vocals so that to a practical extent they have their own unique sonic spaces . . .

Otherwise, it's all a blur . . . :(

As explained earlier in this topic, the Rolling Stones control this primarily by not all playing at the same time, which certainly is an effective way to do this separately from whatever the producer and audio engineers decide to do . . .

Charlie Watts uses this technique for snare drum rimshots when he is playing a hi-hat and snare drum rimshot pattern . . .

Instead of playing both a hi-hat beat and a snare drum rimshot at the same time, he only plays the snare drum rimshot . . .

When the only thing you play is a snare drum rimshot, it moves to the front of the mix for an instant . . .

[NOTE: The Cyclop synthesized cat purring is easiest to hear in the Interlude, where during the bass solo it plays whole notes. Once you identify the synthesized cat purring, you can hear it doing a bit of mid-scooped tonal "quacking", which is one of the textural things it adds to the lead guitar tone, albeit in a mostly subtle way. This was mixed when I was listening to the music played through the calibrated, full-range studio monitor system here in the sound isolation studio, but I did a tiny bit of fine-tuning for headphone listening, so it's best enjoyed when listening with studio quality headphones like the SONY MDR-7506 (a personal favorite) . . . ]

phpBB [video]


THOUGHTS

When I was custom modding my Fender American Deluxe Stratocaster--The Fabulous Fifty Million Dollar Trinaural Stratocaster®--I added some mid-scoop tone controls from Rothstein Guitars, and they are very nice . . .

Image

Passive Midrange Controls (Rothstein Guitars)

Lots of FUN! :)
Last edited by Surf.Whammy on Sun Mar 24, 2024 12:20 am, edited 1 time in total.

Surf.Whammy's YouTube Channel

The Surf Whammys

Sinkhorn's Dilemma: Every paradox has at least one non-trivial solution!
by Surf.Whammy on Wed May 16, 2018 10:00 pm
I recorded a new version of "Surf Zot" with the synthesized cat purring done via Cyclop (Sugar Bytes) a bit more prominent, which includes a deeper IK Multimedia CSR Classik Hall reverb unit . . . :)

Image

Whether this is just something temporary to make the synthesized cat purring and mid-scooped quacking easier to hear is indeterminate at present . . . :P

Technically, it's more like (a) cat talking than (b) cat purring; so I am experimenting with different techniques for improving the synthesized cat purring . . .

[NOTE: This was mixed when I was listening to the song played through the calibrated, full-range studio monitor system here in the sound isolation studio, but it's best enjoyed when listening with studio quality headphones . . . ]

phpBB [video]


THOUGHTS

Strange as it might be, statistically it's probable that someone, somewhere is doing research on the best strategies for synthesizing cat purring; but here in the sound isolation studio the research is just beginning, which overall maps to discovering the rules slowly but surely . . .

My current hypothesis is that I need a lollipop slicer to create a stream of distinct pulses . . .

This might appear to be a bit wacky, but there actually are lollipop slicers for lead guitar . . .

[NOTE: A "lollipop" is a single sample, where for example at the standard CD sampling rate (44.1-KHz) there are exactly 44,100 "lollipops" per second. A "lollipop" has two parameters, (a) amplitude and (b) relative arrival time (i.e., when the sample was taken relative to the sampled sound) . . . ]

Image

Image

Image

SL-20 Slicer (BOSS)

phpBB [video]


I tried using a sine wave, but it doesn't sound like cat purring . . .

If sine waves are used, then it sounds more like an electric bass guitar than cat purring . . .

Rapid double-kick drums are a possibility, but I think they have similar problems with respect to being continuous rather than pulsing, although noise gates might provide a solution . . .
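Along those lines, here is a rough Python sketch (using the third-party numpy and soundfile packages, with hypothetical output file names) of one experiment worth trying: a 25-Hz sine versus short gated bursts of noise repeating 25 times per second, which gives distinct pulses rather than a continuous tone. It's only a sketch of the idea, not a finished purr synthesizer . . .

Code: Select all

import numpy as np
import soundfile as sf

fs = 44100
t = np.arange(int(fs * 2.0)) / fs          # two seconds of sample times

purr_rate = 25.0                           # purrs per second (low end of the feline range)
pulse_length = 0.012                       # each pulse lasts ~12 ms of the 40 ms cycle

# A plain 25 Hz sine is continuous, which is why it reads as "electric bass" rather than purring.
sine = 0.5 * np.sin(2.0 * np.pi * purr_rate * t)

# Gating short bursts of white noise at the purr rate produces distinct pulses instead of a tone.
phase = (t * purr_rate) % 1.0              # position within each purr cycle, 0..1
gate = (phase < pulse_length * purr_rate).astype(float)
pulses = 0.5 * gate * np.random.uniform(-1.0, 1.0, size=t.shape)

sf.write("purr_sine.wav", sine, fs)        # hypothetical output file names
sf.write("purr_pulses.wav", pulses, fs)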

At present, I am intrigued by the idea of simulated cat talking, which might be possible with some help from Realivox Blue (Realitone) . . .

Lots of FUN! :)
Last edited by Surf.Whammy on Sun Mar 24, 2024 12:21 am, edited 1 time in total.

Surf.Whammy's YouTube Channel

The Surf Whammys

Sinkhorn's Dilemma: Every paradox has at least one non-trivial solution!
by scunacc on Sun May 20, 2018 10:12 pm
Ooooodles of good stuff here - and well thought out approaches. I need to go back and read the details.

However, I do have a number of show stoppers for me that mean that Rewire isn't a viable solution even when doing the hosting in Studio One (or any host acting as a Rewire master) and driving via either MIDI over Rewire or external MIDI from Notion.

I would be interested to know (and, forgive me if you did indeed tackle it above and I missed it - I'll check back as time affords; I have every intention to do so) whether you have solutions for the following:

1.) Tempo control by Notion instead of Studio One. Doesn't matter what I set the tempo to in Notion, it's S1 that controls the tempo.

2.) When writing anything with changes in time signature, that's controlled completely by S1 when using Rewire. So, if I have a piece that changes from, say, 3/4 to 2/4 to 5/4 to 4/4 to 3/4 etc. (as I do in a classical guitar piece I'm currently writing), and I have it notated that way in Notion, S1 doesn't and can't have any concept of that, and will choose to use whatever single time signature you set - or, if you change time signatures manually in S1, whatever you change it to, but it can't be automatically controlled by Notion over Rewire.

3.) One cannot notate rits or accels in Notion - for similar reasons to the above - and expect them to work. S1 controls the tempo, not Notion.

4.) Repeats don't work and can't be used in Notion over Rewire. There's no way to sync them or unwind them in any sensible fashion.

Now, I could be missing something you've already found great solutions for :) but those have been the reasons I haven't spent any time on Rewire, as I want Notion to have the control over the score, and acting as it only can as a Rewire slave to S1, it can't be in control of any of those items, and I do all of the above commonly. Acting as a Rewire master to Reaper sort of works, but even then Reaper has a hard time catching up with the MIDI tempo changes. It can work, but I don't use that solution either.

For me, it's meant that my preferred solution is to work entirely with external MIDI (mostly using IAC on macOS), since that allows all of the above concerns to be resolved. Notion is in control of everything. Repeats get unwound of course, into sequential MIDI when recorded in S1, but at least they work.

The downside is one can't sync between S1 and Notion in terms of start/stop for recording and that means one will have an offset from the start of a bar in S1 vs. Notion (unless your hand-eye coordination is good :) ). At that point, exporting the MIDI from Notion and importing into S1 is the better solution, and then working with the MIDI directly in S1 - which preserves all the tempo track information too. Overall, for me, it's a much easier solution to work with than Rewire as a result.

However, I could be missing things and I'd love to know if I am and ways to solve them.

LMK! :)

The nice thing about the pure IAC/non-Rewire approach is that it also doesn't depend on S1 and any specifics of Rewire that S1 allows. (MIDI over Rewire for example). The exact same IAC (or equivalent LoopBe etc. Windows solution) will work with Pro Tools, Live, Reaper, Cubase, etc. Even Tracktion. I have templates for each of those (full orchestral or partial orchestral) that work with the exact same Notion IAC template in each case.

The one it won't work with is Logic, because Logic's demultiplexing of more than 16 MIDI inputs for recording is still "broken". One can route all 4 of Notion's IAC channels for up to 64 MIDI channels in Logic - and indeed connect and route them to different instruments using the MIDI environment that do indeed play back independently.

BUT, one can't record more than 16 simultaneously, and it gets messy if one is using more than 16 channels and does not disable the non-currently-recording items' connections. A disappointing situation, since Logic offers a lot of other nice features. 16 or fewer is OK with Logic, so it's fine with smaller chamber ensembles or the like, but not a full orchestra.

Anyhow, I do this mostly in S1 anyway since it offers the best mix of features of all the DAWs (and is my preferred DAW in any case :) ).

    Logic has the problems I mentioned above and no really useful folder nesting. Also, it's tricky to work with VCAs (they reset previous instantiations when you add more), and you can't group or move things in the same flexible way as in S1.

    Pro Tools only has groups and not nested folders for organization.

    Reaper has what I regard as more esoteric and more difficult to use MIDI and audio routing - otherwise is probably 2nd on the list of usability with IAC this way.

    Cubase isn't bad, but the routing is also less straightforward than S1's - ties 2nd with Reaper.

    Live's routing is OK, but it is... well... Live, and not really focused on orchestral production - even in track vs. clip mode. Live 10, I believe, has better folder nesting; I only have 9, and that only allows 2 levels.

    Surprisingly, Tracktion is fairly easy to use for this as well, as long as one can get on with the less than traditional interface. It has nicely nesting folders and fairly easy MIDI routing.


Anyhow - my $0.02. As I say, I'd be glad for your thoughts and musings on these issues and the solutions you have found for them :thumbup:

Kind regards

Derek.

Derek Jones
MacBook Pro 17" 16GB,15" 12GB,Sierra, MacBook 13",12GB HighSierra. SSD,Windows 7 64 8GB RAM, Win 10 64 Pro 64GB RAM, [email protected]
Notion 6.5.470, Studio One 3.5 & 4.X
Pro Tools 2019.5,Logic Pro X 10.4,Cubase 10,Ableton Live 9.7.6,Reaper 5,Tracktion 7
Sibelius Ultimate 2019.5,Finale 25,Overture 5,MuseScore 3
Vienna Ensemble & Instruments 7,SE Strings,EWQLSO & Play 6,Miroslav Philharmonik 1&2,Kontakt 5.8.0,Waves 10
16.0.2 FW,iConnectMID4+ on macOS (+ others aggregated),192 Mobile+DP88 on Windows
by Surf.Whammy on Mon May 21, 2018 9:30 am
scunacc wrote: Ooooodles of good stuff here - and well thought out approaches. I need to go back and read the details.

Glad you are enjoying the topic, and thanks for the kind words! :)

scunacc wrote: I would be interested to know . . . whether you have solutions for the following:

1.) Tempo control by Notion instead of Studio One. Doesn't matter what I set the tempo to in Notion, it's S1 that controls the tempo.

2.) When writing anything with changes in time signature, that's controlled completely by S1 when using Rewire. So, if I have a piece that changes from, say, 3/4 to 2/4 to 5/4 to 4/4 to 3/4 etc. (as I do in a classical guitar piece I'm currently writing), and I have it notated that way in Notion, S1 doesn't and can't have any concept of that, and will choose to use whatever single time signature you set - or, if you change time signatures manually in S1, whatever you change it to, but it can't be automatically controlled by Notion over Rewire.

3.) One cannot notate rits or accels in Notion - for similar reasons to the above - and expect them to work. S1 controls the tempo, not Notion.

4.) Repeats don't work and can't be used in Notion over Rewire. There's no way to sync them or unwind them in any sensible fashion.

The best solution I have at present is to avoid trying to do this stuff with music notation . . .

THOUGHTS

Instead of focusing on music notation, think about it in terms of being in a recording studio working on a song which will be played on the radio, television, or YouTube . . .

It's a different perspective, and I think it's the most practical perspective for so-called "popular" genres . . .

With the exception of repeats, you can do everything else in Studio One, generally on its Tempo Track or, when it needs to be done at the instrument or voice level rather than at the song level, by other more specific methods. This is the correct way to do it once you understand, accept, and embrace the strategy where (a) Studio One is the ReWire host controller and NOTION is the ReWire slave and (b) with perhaps only a few exceptions you do everything in ReWire sessions, although there are some things you can do solely within Studio One without needing to wander into NOTION, so long as at least some of the audio already is recorded . . .

[NOTE: The perspective here in the sound isolation studio is that "repeats" were created for two primary reasons, (1) to save paper and ink and (2) to annoy musicians and singers by forcing them to devise strategies for making performance notes based on which time the apparently exact same notes are being played or sung, which effectively makes it a type of I.Q. test where the musicians and singers with either (a) the best memories or (b) the best annotating systems get the highest scores. Explained another way, "repeats" are like the bass clef, and I avoid them diligently, mostly because a C on bass clef is an A, which forces me in real-time to map each bass clef note to treble clef for no logical reason other than someone with no adult supervision perhaps centuries ago thought it was a bright idea to move everything downward two notes on bass clef staves, which I suppose is diatonic in one way or another. When I look at a note on a bass clef staff, I have to remember that it's actually a different note and needs to be moved upward, where continuing with the example, when I see A, I have to remember that it's actually C, all of which takes too much time. Creating the bass clef led in turn to a series of team meetings and the subsequent creation of a virtual festival of clefs, none of which make intuitive sense to me. Once they did this and realized they had nothing to do, they switched focus to creating rules where the same note on a piano can have more than one name, where the definitive example is that C then also could be B#, A###, D♭♭, and so forth. Today at the dawn of the early-21st century, the descendants of these people are called "bureaucrats" . . . :P ]

Explained a different way, it's a matter of determining what is running the show:

(a) Digital Audio Workstation (DAW) application (Studio One)?

OR

(b) Music Notation (NOTION)?

If you want Music Notation to run the show, then what I call the "ReWire MIDI Strategy" will not work for you. To avoid confusion: although I call it the "ReWire MIDI Strategy", it also includes recording real instruments and real vocals, External MIDI from NOTION, and everything else one can do with Studio One . . .

Some background will be helpful in understanding my perspective . . .

I learned soprano treble clef as a child when I was in a liturgical boys choir, and even though over half a century later I am a baritone or tenor, I can sight-sing soprano treble clef in my mind because (a) it makes intuitive sense to me and (b) I have a well-developed mapping of it in my mind . . .

None of the other clefs make any sense to me, and overall I think they are among the most devious traps and roadblocks that were designed centuries ago for the primary purpose of discouraging the common folk from pursuing careers in music . . .

On the good side, although the common folk were discouraged for a while, soon thereafter they realized that music notation actually was not necessary to make music when one was not so fortunate to have wealthy benefactors, which in those days was the only way one could do certain types of music . . .

Today, everything is different in the sense that the common folk have ready and affordable access to symphonic orchestras and everything else--provided there is an AUi or VSTi virtual instrument and sampled sound library for whatever it might be . . .

If I want to compose and record a symphony, then I can do it here in the sound isolation studio . . .

And if I want to have a section featuring hand bells, crystal glasses, and someone banging on a washing machine with a sledge hammer, I just need to do a bit of shopping at Bolder Sounds (some of which I already have done) . . .

Handbells V2 for Kontakt 3+ (Bolder Sounds)

Crystal Glasses V3 for Kontakt 4+ (Bolder Sounds)

Washing Machine and Dryer Loops and Hits (Bolder Sounds)

The high-level perspective is focused on composing and recording songs the way it has been done in so-called "popular" music since the invention of the Edison wax cylinder recorder at the end of the 19th century, although for the most part early recordings were made in the recording studio essentially as one-track live recordings rather than in layers, as happened perhaps three or so decades later . . .

Music notation centricity is fine with me, but from my perspective it's not practical . . .

This does not imply that music notation centricity is impractical; but it does suggest strongly (a) that I consider most of it to be a vast waste of time and consequently (b) that I avoid most of it diligently . . .

For example, consider the music notation for the current version of "Surf Zot" . . .

[NOTE: Some of the guitar staves are hidden so the basic music notation is larger and easier to see, but it's all there. I lowered the volume level for the Sugar Bytes "Cyclop" synthesized cat purring and mid-scooped quacking, but it's easy to hear when you listen with studio quality headphones . . . ]

phpBB [video]


In particular, observe five things:

(1) All the clefs are soprano treble clefs . . .

This is made possible by specifying transposing in NOTION score setup, where for example the drumkit staves play their notes two octaves lower, as does the Höfner "Beatle" Bass staff, and the guitar staves play their notes one octave lower. The logic for this is that here in the sound isolation studio there are 12 notes and 10 or so octaves, only 2 of which humans can hear. This keeps the music notation simple and for me very intuitive . . .

(2) There are no tempo markers in the NOTION score . . .

The tempo is set and controlled in Studio One . . .

(3) There are no articulations . . .

There are a few times when using a handful of articulations is useful, but generally I do most of the work in a few ways. One way is to do dynamics with automation lanes, compressors, limiters, "ducking", filters, and so forth. Another way is to use specific instances of sampled sound libraries, where for example if I want a French Horn played in a certain way, then I create a staff for the specific articulation, playing style, dynamics, and so forth for that French Horn, so that the musician actually was playing the French Horn the way I want it to be played. This results in the most realistic audio, because only minimal computer processing is required beyond what was done to digitize the original audio and to create the set of sampled notes. It's also important that everything is chromatically sampled, since otherwise the "in-between" notes are computed, which generally is not so realistic. If you want the highest level of realism, then you need to use chromatically sampled libraries, where in some instances "round robin" sample variation makes the resulting audio more "human", which is fine with me . . .
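As a quick illustration of why the computed "in-between" notes tend to be less realistic: when a library is not chromatically sampled, the missing notes typically are produced by resampling a neighboring note, and the playback-rate ratio for k semitones in equal temperament is 2^(k/12), which also stretches or shrinks the sample . . .

Code: Select all

# Equal-tempered pitch ratio: shifting by k semitones means resampling by 2^(k/12).
def semitone_ratio(k: int) -> float:
    return 2.0 ** (k / 12.0)

# If a library sampled only C4 and the score asks for D4 (2 semitones up),
# the player has to speed playback up by roughly 12.2%, which also shortens
# and slightly "chipmunks" the sample -- one reason computed in-between notes
# tend to sound less realistic than chromatically sampled ones.
for k in (1, 2, 3):
    print(f"+{k} semitone(s): playback rate x {semitone_ratio(k):.4f}")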

(4) There are no repeats . . .

Even though everything is digital, I prefer the analogy, metaphor, or simile where one is working with a "Virtual Electromagnetic Tape Machine", albeit without the "generational loss" problem associated with electromagnetic tape and "bouncing" from one reel of tape to another to create more tracks or layers . . .

I "repeat" verses, choruses, bridges, interludes, and so forth; but I do it linearly, which here in the sound isolation studio maps to copying; inserting a bunch of empty measures at the desired location; and then pasting . . .

If there already is recorded material in Studio One, then I can do the same thing in Studio One for the timeline and respective audio tracks, which makes changing and updating the overall structure of a song reasonably straightforward, although it's best to define the overall structure for a song as early as possible . . .

[NOTE: The Interlude section in "Surf Zot" is an example of adding stuff in the middle of a song, and it was not difficult to do. What took a while was discovering what to do in the Interlude section and then expanding on the various ideas, which is what happens when I listen to a song over and over for a while. Sooner or later, I identify places where something new can be added; so it's an iterative type of thing . . . ]

Aside from being easier and straightforward, doing everything linearly enforces the principle that every measure is equally important even though one section of measures initially might be identical to another set of measures . . .

For reference, this is more of a producing thing than a composing thing; and in part this makes it easy to introduce subtle variations, which although subtle are very important . . .

The timeline is linear when you do it this way . . .

(5) With exceptions for an occasional cymbal accent and some tom-toms, each drum and cymbal is on a separate staff . . .

I love drumkits, and over the years I taught myself how to play drums; but I never learned music notation for drums; so making sense of percussion notation required a bit of work (several months) . . .

Image
The Really Bigger Drumkit

[NOTE: This was done when I was using real instruments, and it's one drumkit played in real-time on the fly--no overdubbing. It sounds like a "Wall of Drums", because I ran it through a set of cascading echo units with some deep and rich reverberation, which I also did for the rhythm guitar and lead guitar but with different settings, of course. I make my own drumsticks, since nobody sells 22" drumsticks. The hi-hat rig is modded so that it plays one set of Latin percussion instruments when the pedal goes downward but another set of stuff when the pedal goes upward. As you can see in the photo, the cymbals and Latin percussion instruments (mostly wood blocks and cowbells) are arranged in arcs, so that a set of them can be played in a single motion of a drumstick, which is the solution to a fascinating ergonomics and physics puzzle. "Nuke Out" was recorded in 2007, but I did a remix in 2014. It's a bit "hot" (i.e., "loud"), but that was four years ago, so it's all good . . . :P ]

phpBB [video]


Once I did a bit of experimenting with tuplets, everything started making intuitive sense; and now I have an intuitive mapping of drumsticks and pedals to music notation for drums, cymbals, and Latin percussion . . .

As you can see in the YouTube video for "Surf Zot" (above), I put the music notation for the left kick drum on its own staff, as I did for the right kick drum and nearly all the other instruments in the drumkit. Done this way, the music notation becomes "virtual drumsticks and pedals", which works for me . . .

I use a few favorite songs as references to gauge how the producing and audio engineering is coming along, and this is one of my current favorite reference songs . . .

phpBB [video]


THE PRACTICAL PERSPECTIVE

It's fine with me to do everything possible with music notation in the extreme; but the practical perspective is that when a song is played for an audience, (a) at minimum there is some amount of reverberation in the room and (b) the room has very specific acoustic behaviors which to no small extent are influenced by the audience . . .

[NOTE: In the mid-1950s and early-1960s, teenage girls had "beehive" hairdos, which from an acoustical perspective behaved like shrubbery and acted as a surreal type of upper midrange and high-frequency filter. Among other acoustical bits, this is the reason low frequencies carry farther than midrange and high frequencies. Low frequency sound waves travel through nearly everything, while midrange and high frequencies are easily damped by shrubbery . . . ]

The Romans embedded large wine bottles in the walls of their theaters and used sand to adjust the acoustic behaviors of the theaters based on the type of performances and the expected audiences. The sand-filled (or empty) wine bottles were what we now know to be Helmholtz resonators, and they were physical filters that removed "hot spots", standing waves, and other types of problematic acoustical phenomena . . .
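For reference, the resonant frequency of a Helmholtz resonator follows f = (c / 2π) × √(A / (V × L)), so partially filling a bottle with sand reduces the cavity volume V and raises the frequency it absorbs. Here is a rough calculation with assumed wine-bottle dimensions, ignoring neck end corrections . . .

Code: Select all

import math

def helmholtz_frequency(neck_area_m2, cavity_volume_m3, neck_length_m, c=343.0):
    """Resonant frequency of a Helmholtz resonator: f = (c / 2*pi) * sqrt(A / (V * L)).
    Simplified formula that ignores neck end corrections."""
    return (c / (2.0 * math.pi)) * math.sqrt(neck_area_m2 / (cavity_volume_m3 * neck_length_m))

# Very rough wine-bottle-like numbers (assumed purely for illustration):
neck_radius = 0.009                      # 9 mm neck radius
neck_area = math.pi * neck_radius ** 2
volume = 0.00075                         # 0.75 litres of cavity
neck_length = 0.08                       # 8 cm neck

print(f"~{helmholtz_frequency(neck_area, volume, neck_length):.0f} Hz")   # roughly low hundreds of Hz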

When it's a larger crowd, there will be a sound reinforcement system in the concert hall, and it will have a compressor-limiter and some other external sound processors, if only to protect itself and the audience . . .

When a song is broadcast on television or radio, the audio is run through various devices that ensure the broadcast audio always is kept within Federal Communications Commission (FCC) and other broadcast regulators' guidelines . . .

Some of these broadcast audio processors have especially "musical" characteristics and are used in recording studios to enhance instrument and vocal performances . . .

Here in the sound isolation studio, I use the T-RackS 5 "EQP-1A" processor with a custom-modded "FATT-1A" preset to enhance the deep bass texture of the Höfner "Beatle" Bass from SampleTank 2 (IK Multimedia) that I now use as a legacy instrument in SampleTank 3 (IK Multimedia) . . .

Image

The key bit of information in this context is that no matter what you do with elaborate music notation, dynamics, articulations, playing techniques, and so forth, once it's played through a sound reinforcement system or broadcast media, the folk(s) running sound are going to make it follow a different set of rules, some of which are required by law and others of which are dictated by common sense and the general goal of protecting the hearing of the audience and the safe operation of sound reinforcement systems, radios, televisions, and so forth . . .

There are rules for web streaming, but at present they generally are not mandated by federal law . . .

However, they follow the same general rules, at least most of the time . . .

It's more of a common sense controlling system for web streaming, which among other things is the reason I have a few favorite songs that I use for doing "ballpark" comparisons, which in part is based on the idea that the folks doing audio for Keith Urban probably know (a) exactly what they are doing and (b) why it works . . .

For reference, none of these songs are within European Broadcasting Union (EBU) specifications, so for broadcast the loudness levels need to be lowered, but so what . . .
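For reference, the EBU R 128 target is an integrated loudness of -23 LUFS; if you want to check a reference mix numerically, here is a small sketch using the third-party pyloudnorm and soundfile packages, with a hypothetical file name . . .

Code: Select all

import soundfile as sf
import pyloudnorm as pyln

# Hypothetical file name -- point this at the mix you want to check.
data, rate = sf.read("reference_mix.wav")

meter = pyln.Meter(rate)                       # ITU-R BS.1770 loudness meter
loudness = meter.integrated_loudness(data)     # integrated loudness in LUFS

print(f"integrated loudness: {loudness:.1f} LUFS")
print(f"gain to reach EBU R 128 (-23 LUFS): {-23.0 - loudness:+.1f} dB")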

SUMMARY

If you need to have the tempo controlled by NOTION, then the strategy I am using will not work . . .

The strategy I am explaining and demonstrating in this topic is focused on providing a practical way to compose, record, and produce songs using layered techniques that map nicely to the way songs have been developed and produced since at least the 1950s, albeit with a few updates that make it easier and more practical . . .

Even when one does everything possible with music notation, once the music--real or virtual--is performed for an audience in a concert hall that requires using a sound reinforcement system or is broadcast, the sound reinforcement engineers and broadcast engineers are going to do what they do, because (a) at minimum they want to protect their equipment and the audience and (b) on the broadcast side the rules have legal consequences, which can include loss of broadcasting licenses . . .

Lots of FUN! :)
Last edited by Surf.Whammy on Sun Mar 24, 2024 12:22 am, edited 1 time in total.

Surf.Whammy's YouTube Channel

The Surf Whammys

Sinkhorn's Dilemma: Every paradox has at least one non-trivial solution!
by scunacc on Mon May 21, 2018 11:38 am
Couple of thoughts in response.

1.) I compose in notation. I don't compose in DAWtation ;) What I write as score is the authoritative copy of the music. If the occasion arises, it could be given to an orchestra to play. It does not and should not - for my purposes and for the purposes of many others who write for musicians to play the music rather than just the computer - rely on a DAW at all.

2.) I'm perfectly happy writing raw MIDI in a DAW - but not for orchestral music. For synth, other keyboard, etc. - sure. I do that - or play it in and massage the MIDI in an instrument track. I'll even tweak MIDI right down at programmatic level if need be. But when I'm composing - writing orchestral music - that doesn't happen in a DAW. It happens with black dots on a page (virtual page now here in Notion, Sibelius, Finale, Overture, etc.). The orchestrated distribution and harmonic relationships along with temporal precision are readily visible in a score. Not so much with MIDI... (Hard to take in a 30 instrument MIDI score at a glance - much easier to do with notation, both when composing and when conducting). MIDI notes are nowhere near as concise a "notational" format as those in a traditional score. Other items - CC curves, additional technique changes among samples etc. are part of the interpretation, not the notation.

3.) Neither TV or YT are of interest to me in this context. Or movies. Or ads. What I'm using Notion for is pure orchestral composition and orchestration - for a live orchestra or virtual orchestra to play (and the music notation must support either without change).

4.) The problem here is very much that S1 is in control. If Notion was in control - as the Rewire master, which it can indeed be and almost works nicely with with Reaper this way - most of the problems would be resolved. If S1 would act as a Rewire slave I'd definitely use it that way.

Again, the tempo track is a target for me, not a compositional source. The tempo and time signature changes are in the notation. It actually takes longer to do physically in the S1 tempo track than it does to insert a rit or accel in the notation.

As I mentioned, I have a guitar (and flute) piece I'm working on that changes time signature and tempo frequently. I would not relish writing that compositionally in S1 natively, since I can't keep track of it visually as easily as in notation. Perfectly possible, just slower to work with compositionally. Not only that, I don't get the benefit of being able to work with the tablature at the same time - which I do in Notion - and which, even as a guitarist, is a useful help to me to think in terms of fingering and fret positioning.

5.) While repeats serve - and served - the purposes that you mention, they also serve another purpose not mentioned, and that is as a notational convenience. Additionally, when playing from notation as a live musician, having a repeat rather than turning a page can be very helpful too, if the section lends itself of course to having a repeat. As far as other clefs go, composers and orchestrators write in the normal clef and transposition of an instrument - if a transposing instrument - and get very quickly used to working with transposing instruments. It's not an issue. Hollywood writes everything in C. Not everyone works in or for Hollywood... ;) People who play across families of transposing instruments very much appreciate the single way of working in transposed score (although they can become adept at reading C score by sight, as in Hollywood, too - of course, but normally I mean) as it maps on to finger and register uniformity across the family.

6.) As a composer and orchestrator, I would not consider writing in anything but notation, therefore. And, I want to be able to use Notion with S1 to handle that when I use a virtual orchestra. It can do that just fine with MIDI over IAC (or LoopBe) - and I use it that way already - but not with Rewire; and it seems as though you are agreeing that the limitations I've observed in doing this the way I consider "proper" - from notation - are indeed not supportable by Rewire when it is not driven by Notion as the master. That's good. :thumbup: That's confirmation of what I had observed. It leads me to continue to consider the IAC-alone etc. solution to be the acceptable one in most circumstances for musicians who want to start with and continue with the score and the notation as the authoritative document.

7.) All the articulations, expressions, technique changes and dynamics I write in a score I want to sound in VIs. That happens just fine with Notion if one writes the appropriate rules. Doesn't mean that further tweaks aren't necessary in a DAW with sometimes more fine-grained choices of samples and then even further tweaking of those. I do that and use a DAW for this purpose - as opposed to Notion alone - for that very reason! :) If Notion supported better CC curve handling and more fine-grained rule mapping however, then I'd be more inclined to work in Notion alone and just do all my other non-orchestral, non-score-based work in S1 or other DAWs. (S1 is my main DAW but I do use others also).

8.) Modeling room characteristics is a whole different area. And one I particularly seek to take care about in terms of frequency propagation (low frequencies of bass instruments - double bass, contrabassoon, bass clarinet, tuba, etc.). This is partially handled by a good IR - and I use IR-1 among others as it offers some very helpful modeling features. Getting early reflections right is also important and helped considerably by having multiple IR points for a given space. I don't use Vienna's MIR but work with my own approach to placement, absorption, etc. But, as I say, this isn't really relevant to notation. The composer knows that their piece will be played in different venues with different characteristics and even different interpretation. The issue remains the same - the score is the authoritative document. I can't give an orchestra a DAW MIDI track to read ;) , but, I can give both my DAW and an orchestra my score "to read" and play.

Good discussion.

Kind regards

Derek.

Derek Jones
MacBook Pro 17" 16GB,15" 12GB,Sierra, MacBook 13",12GB HighSierra. SSD,Windows 7 64 8GB RAM, Win 10 64 Pro 64GB RAM, [email protected]
Notion 6.5.470, Studio One 3.5 & 4.X
Pro Tools 2019.5,Logic Pro X 10.4,Cubase 10,Ableton Live 9.7.6,Reaper 5,Tracktion 7
Sibelius Ultimate 2019.5,Finale 25,Overture 5,MuseScore 3
Vienna Ensemble & Instruments 7,SE Strings,EWQLSO & Play 6,Miroslav Philharmonik 1&2,Kontakt 5.8.0,Waves 10
16.0.2 FW,iConnectMID4+ on macOS (+ others aggregated),192 Mobile+DP88 on Windows
by Surf.Whammy on Mon May 21, 2018 3:44 pm
I make an effort to be as precise as possible with respect to terminology when writing about digital music production . . . :)

My observations in this post are intended to introduce a bit of clarity, because I am not certain we are discussing the same thing(s) . . .

REWIRE MIDI

scunacc wrote
. . . I compose in notation.

. . . I'm perfectly happy writing raw MIDI in a DAW - but not for orchestral music.

ReWire MIDI is a ReWire technology that is included in the ReWire infrastructure but currently is not implemented by all applications that support ReWire, either as ReWire host controllers or as ReWire slaves . . .

As best as I have been able to determine at present, Studio One Professional 3.5+ and NOTION 6 (current version) are the only ReWire-aware applications that support ReWire MIDI in a practical way . . .

In ReWire terminology, Studio One Professional is a "Mixer Application"; and when NOTION is running as a ReWire slave, NOTION is a "Panel Application" . . .

[NOTE: This is information available to the public. I enhanced it by adding a magnifying glass to highlight the "pipes" part of the diagram . . . ]

Image

[SOURCE: Diagram of a running ReWire session (Propellerhead Software) ]

This is the link to the public ReWire Technical Information overview:

ReWire Technical Information (Propellerhead Software)

Since NOTION also can function to a limited extent as a ReWire host controller, it can be a "Mixer Application", but there are limitations to what NOTION can do in this regard, primarily because NOTION is not a Digital Audio Workstation (DAW) application . . .

NOTION does not have audio tracks like a DAW application; and you cannot record anything other than MIDI input and NTempo information in NOTION. Additionally, MIDI input in NOTION is limited to one staff at a time . . .

ReWire MIDI staves in NOTION are similar to External MIDI staves, but there are several important differences:

(1) ReWire MIDI does not require a Virtual MIDI Cable, since it uses "pipes" in the ReWire infrastructure instead of a Virtual MIDI Cable . . .

(2) When you send MIDI from NOTION to Studio One Professional via External MIDI staves, Studio One Professional can record the MIDI . . .

(3) Using NOTION External MIDI staves requires one or more Virtual MIDI Cables; and on the Mac these are provided by the Mac OS X "Inter-Application Communication (IAC) Driver", which is controlled and managed via the "Audio MIDI Setup" application. In the Windows universe, creating and managing a Virtual MIDI Cable requires using a third-party utility program, since Windows does not provide Virtual MIDI Cables intrinsically . . .

(4) One might think that Studio One Professional can record MIDI sent to it by ReWire MIDI staves in a NOTION score, but this is not the case at present. If you need to record MIDI sent from NOTION in Studio One Professional, then you need to use External MIDI staves in NOTION to do this . . .

(5) While in certain scenarios, it's possible to view music in a NOTION score in a hybrid piano roll format (a.k.a., a "Sequencer" staff), this is not the way I do it . . .

This is the current version of "Surf Zot", and as you can see when you open it in NOTION 6.4.462 (current version for the Mac), it's music notation, not MIDI . . .

[NOTE: This NOTION score is approximately 4MB . . . ]

SW-N6-Surf-Zot-V13.notion

For reference, when you run NOTION by itself, it's in a ReWire session, as is the case when you run Studio One Professional by itself . . .

Explained another way, you might not think you are using ReWire, but you are . . . :o

NOTION AND STUDIO ONE PROFESSIONAL ~ EXTERNAL MIDI ~ VIRTUAL MIDI CABLE ~ NOT IN FORMAL REWIRE SESSION

scunacc wrote: As a composer and orchestrator, I would not consider writing in anything but notation therefore. And, I want to be able to use Notion with S1 to handle that when I use a virtual orchestra. It can do that just fine with MIDI over IAC (or LoopBe) - and I use it that way already - but not with Rewire.

"IAC" and "LoopBe" are Virtual MIDI Cable services, so the way I read this is that you appear to be saying that Studio One Professional does this when you are using External MIDI staves in a NOTION score to send MIDI to Studio One Professional via a Virtual MIDI Cable . . .

If you start NOTION first and then start Studio One Professional second, then NOTION is in a ReWire session with itself but not with Studio One Professional . . .

Studio One Professional does not function as a ReWire slave, but it is aware that NOTION is running in a separate ReWire session . . .

Done this way, you can send MIDI to Studio One Professional to play AUi (Mac only) and VSTi virtual instruments, but there is no automagic transport synchronization, which means that the NOTION transport operates independently of the Studio One Professional transport . . .
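For reference, an enabled IAC bus shows up as an ordinary MIDI port on the Mac, so any MIDI-capable application or library can send through it. Here is a small Python sketch using the third-party mido package, where the port name is an assumption that depends on how the IAC Driver is configured in Audio MIDI Setup; it only demonstrates the Virtual MIDI Cable idea and is not what NOTION does internally . . .

Code: Select all

import time
import mido

# List the available MIDI outputs; on macOS an enabled IAC bus shows up here,
# usually with a name like "IAC Driver Bus 1" (the exact name depends on your setup).
print(mido.get_output_names())

# Send a middle C through the virtual cable; anything listening on the same
# IAC bus (for example, a Studio One track armed for that input) receives it.
with mido.open_output("IAC Driver Bus 1") as port:   # assumed port name
    port.send(mido.Message("note_on", note=60, velocity=100, channel=0))
    time.sleep(0.5)
    port.send(mido.Message("note_off", note=60, velocity=0, channel=0))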

Nevertheless, when you have an Audio Track mapped in Studio One Professional to each Instrument Track in Studio One Professional, you can record both the MIDI and the generated audio in Studio One Professional . . .

The caveat is that firstly you need to click on the "Play" button in NOTION to start it sending MIDI to Studio One Professional; and then secondly you need to click on the "Record" button in Studio One Professional to start recording the MIDI and corresponding generated audio . . .

If you are going to do this with the goal of recording the MIDI and the generated audio in Studio One Professional, then the easy way to do it is to insert some empty measures at the start of the NOTION score, so that you can click the "Play" button on the NOTION transport and have a few measures doing nothing to create time to switch focus to Studio One Professional, where you will click its "Record" button (with the respective Instrument Tracks and Audio Tracks enabled for recording) . . .

After you have recorded the audio, you can do a bit of splicing and moving the recorded audio clips to align everything to a useful timeline . . .

I did a quick experiment, and the tempo information from NOTION is recorded in the MIDI in Studio One Professional, but it does not appear to change the tempo insofar as the Studio One Professional transport is concerned . . .

If this works for what you need to do, then great! :thumbup:

Technically, there is a ReWire session; but in this scenario it's NOTION in a ReWire session with itself; and Studio One Professional is not aware of the ReWire session--other than (a) it's aware that the ReWire session exists and (b) Studio One Professional knows it is not a participant in the ReWire session . . .

The transports not being synchronized makes this a bit awkward, but there is a way to align everything after it's recorded . . .

If you use AUi (Mac only) and VSTi virtual instruments, then the same limit in what I call the "ReWire MIDI" strategy applies, which for practical purposes is approximately 10 AUi or VSTi virtual instruments per Studio One Professional ".song" . . .

If you use native Studio One Professional virtual instruments the limit might be higher; but when you are using AUi and VSTi virtual instruments, there is a limit to the number of Instrument Tracks that Studio One Professional can handle in a ".song" . . .

So long as the NOTION staves have the same custom tempo data, it's not a problem to record the generated audio in layers, as I am doing in the "ReWire MIDI" examples for "Surf Zot" . . .

Lots of FUN! :)

by cellierjerome on Sat May 26, 2018 10:41 am
Surf.Whammy wroteI added an Interlude to "Surf Zot" in a ReWire session with Studio One Professional as the ReWire host controller and NOTION 6 as the ReWire slave, using ReWire MIDI staves in the NOTION 6 score and AU and VSTi virtual instruments and effects plug-ins hosted in Studio One Professional, which is very easy to do. The YouTube video shows the music notation, but Studio One Professional is controlling the audio and the ReWire session. NOTION 6 provides the music notation and sends it to Studio One Professional via ReWire MIDI staves . . . :)
Lots of FUN! :)

I just tested the same process with Cubase 9.5.30 and Notion 6, and it works great! :thumbup:

by cellierjerome on Sat May 26, 2018 11:06 am
Oh nooooooo, it seems like only channel 1 / bus 1 works in MIDI over ReWire ... in Cubase ...
I was able to record a MIDI track from Notion in Cubase, but only for the first track.
Is it the same for Studio One? Are Virtual MIDI Cables necessary to allow multi-channel MIDI recording in Cubase from Notion?
I tried to assign:
rewire_bus_1 / channel 1 to Violin 1
rewire_bus_1 / channel 2 to Violin 2
rewire_bus_1 / channel 3 to Violas, etc. ... => only track 1 is recordable in Cubase via MIDI.

I tried:
rewire_bus_1 / channel 1 to Violin 1
rewire_bus_2 / channel 2 to Violin 2
rewire_bus_3 / channel 3 to Violas, etc. ... => doesn't work

by scunacc on Sat May 26, 2018 12:58 pm
Surf.Whammy wroteI make an effort to be as precise as possible with respect to terminology when writing about digital music production . . . :)

My observations in this post are intended to introduce a bit of clarity, because I am not certain we are discussing the same thing(s) . . .



Couple of points of clarification.

Surf.Whammy wrote(4) One might think that Studio One Professional can record MIDI sent to it by ReWire MIDI staves in a NOTION score, but this is not the case at present. If you need to record MIDI sent from NOTION in Studio One Professional, then you need to use External MIDI staves in NOTION to do this . . .


It is entirely possible to record all Rewire MIDI channels sent from Notion in Studio One. One has to select the MIDI channel in the inspector to do so, but, once done it works just fine. At least, it does on macOS, and that without using any IAC at all - purely Rewire MIDI. I just retested this before posting this to make sure nothing had changed, and, it does indeed still work. (I had a string quartet in Notion with 4 Rewire channels, sent to 4 instrument tracks in S1 - each receiving from a different Notion Rewire channel on Rewire Bus 1. I then loaded the corresponding PSO instruments in Presence to test, and, hit record. It properly recorded the MIDI over Rewire from Notion in S1. )
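For readers following along, the "select the MIDI channel" step is just channel filtering: everything arrives on one ReWire MIDI bus, and each Instrument Track keeps only the messages whose channel matches. A rough Python illustration of the idea--assuming the mido package, with the port name being a placeholder rather than anything Studio One actually exposes--looks like this:

Code:
import mido

TARGET_CHANNEL = 1  # 0-based in mido, so this is "channel 2" (e.g., Violin 2) in DAW terms

with mido.open_input("Example ReWire Bus 1") as port:  # placeholder name, not a real Studio One port
    for msg in port:
        # Channel-voice messages carry a channel number; system and meta messages do not.
        if hasattr(msg, 'channel') and msg.channel == TARGET_CHANNEL:
            print(msg)  # this is the per-staff stream a single Instrument Track would record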

Surf.Whammy wroteAs best as I have been able to determine at present, Studio One Professional 3.5+ and NOTION 6 (current version) are the only ReWire-aware applications that support ReWire MIDI in a practical way . . .


Hmmm. What do you mean by ... "practical" here?

MIDI from Notion over Rewire works to Reaper as well (again I have just retested it and it is working with the most recent Notion - no IAC used, just MIDI over Rewire). In that case, I had another string quartet notated in Notion sending over 4 Rewire channels on Rewire Bus 1. That drove 4 MIDI channels in Reaper sending to 4 corresponding string instruments in VSL sending back to 4 audio tracks in Reaper.

(AFAIR it also works with Pro Tools, but I'm not currently getting Pro Tools to see Notion as a Rewire application - not sure why. Need to look at that. I don't use PT + Notion that way anyway - only over IAC :) From what I recall, Cubase does too - but I haven't tested it with Cubase recently either and another poster here indicated differently. I'll look at some point as time permits.)

Surf.Whammy wroteNOTION does not have audio tracks like a DAW application; and you cannot record anything other than MIDI input and NTempo information in NOTION. Additionally, MIDI input in NOTION is limited to one staff at a time . . .


While I agree Notion is not presently DAW-like in that respect (who knows - maybe one day; perhaps a bit like Overture 5 does) - nonetheless, it is possible to send audio (a mixdown) to Notion from Studio One. This can be useful to create a "backing track" further augmented in Notion in notation. But, anyhow, yes, that's why I currently use S1 as my DAW and Notion as my notation and composition program. I keep that functionality separate - although - being able to do all in one place is something that would be nice to happen in Notion one day IMO - again, as O5 seeks to accomplish.

Surf.Whammy wroteFor reference, when you run NOTION by itself, it's in a ReWire session, as is the case when you run Studio One Professional by itself . . .


... unless you turn Rewire off in the Notion preferences. This can be important to do esp. if you are using it without Rewire in conjunction with another Rewire master application (such as another DAW, whether S1, Pro Tools, Logic, Reaper, etc.) That will then allow the other application to start as a Rewire master instead of Notion being so. And, if turned off, neither does Notion act as a Rewire slave when started 2nd. Again that is very important when just using IAC as I normally do.

Surf.Whammy wroteNOTION AND STUDIO ONE PROFESSIONAL ~ EXTERNAL MIDI ~ VIRTUAL MIDI CABLE ~ NOT IN FORMAL REWIRE SESSION

scunacc wroteAs a composer and orchestrator, I would not consider writing in anything but notation therefore. And, I want to be able to use Notion with S1 to handle that when I use a virtual orchestra. It can do that just fine with MIDI over IAC (or LoopBe) - and I use it that way already - but not with Rewire.

"IAC" and "LoopBe" are Virtual MIDI Cable services, so the way I read this is that you appear to be saying that Studio One Professional does this when you are using External MIDI staves in a NOTION score to send MIDI to Studio One Professional via a Virtual MIDI Cable . . .

If you start NOTION first and then start Studio One Professional second, then NOTION is in a ReWire session with itself but not with Studio One Professional . . .


Only if Rewire is turned on in Notion's preferences - as I said above

Surf.Whammy wroteStudio One Professional does not function as a ReWire slave, but it is aware that NOTION is running in a separate ReWire session . . .


Ditto


Surf.Whammy wroteDone this way, you can send MIDI to Studio One Professional to play AUi (Mac only) and VSTi virtual instruments, but there is no automagic transport synchronization, which means that the NOTION transport operates independently of the Studio One Professional transport . . .

Nevertheless, when you have an Audio Track mapped in Studio One Professional to each Instrument Track in Studio One Professional, you can record both the MIDI and the generated audio in Studio One Professional . . .

The caveat is that firstly you need to click on the "Play" button in NOTION to start it sending MIDI to Studio One Professional; and then secondly you need to click on the "Record" button in Studio One Professional to start recording the MIDI and corresponding generated audio . . .


This is exactly what I do normally - just using IAC. Rewire turned off. Sending MIDI (and possibly Notion audio over Soundflower) to Studio One. The loss of "Pressing Play" sync is for me the only real downside in not using Rewire.

Surf.Whammy wroteIf you are going to do this with the goal of recording the MIDI and the generated audio in Studio One Professional, then the easy way to do it is to insert some empty measures at the start of the NOTION score, so that you can click the "Play" button on the NOTION transport and have a few measures doing nothing to create time to switch focus to Studio One Professional, where you will click its "Record" button (with the respective Instrument Tracks and Audio Tracks enabled for recording) . . .


Or, press record in S1 first, and then simply adjust the recorded MIDI for all tracks back to the start of the bar you want to start at in S1 after the fact. That way one preserves the score integrity as a score in Notion without extra unnecessary bars at the front (I actually have extra bars at the front of my Notion-over-IAC template but that's for another reason: to use a selection of different Notion Rules depending on which VI I choose for a part).
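If the recorded take is exported as a Standard MIDI File, the "adjust back to the start of the bar" step can also be scripted. A minimal Python sketch with the mido package--file names and the tick offset are placeholders, and it assumes 480 ticks per quarter note--looks like this:

Code:
import mido

def shift_back(path_in, path_out, offset_ticks):
    """Subtract a fixed number of ticks from the first event of every track."""
    mid = mido.MidiFile(path_in)
    for track in mid.tracks:
        if track:
            # Delta times are in ticks; clamping at zero avoids negative delta times.
            track[0].time = max(0, track[0].time - offset_ticks)
    mid.save(path_out)

# Example: pull everything back by one 4/4 bar, assuming 480 ticks per quarter note.
shift_back("take_raw.mid", "take_aligned.mid", offset_ticks=4 * 480)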

Surf.Whammy wroteTechnically, there is a ReWire session; but in this scenario it's NOTION in a ReWire session with itself; and Studio One Professional is not aware of the ReWire session--other than (a) it's aware that the ReWire session exists and (b) Studio One Professional knows it is not a participant in the ReWire session . . .


Again - only if Rewire is turned on in Notion.

Surf.Whammy wroteThe transports not being synchronized makes this a bit awkward, but there is a way to align everything after it's recorded . . .


That's the downside yes - but the other upsides are a winner for me.

Surf.Whammy wroteIf you use AUi (Mac only) and VSTi virtual instruments, then the same limit in what I call the "ReWire MIDI" strategy applies, which for practical purposes is approximately 10 AUi or VSTi virtual instruments per Studio One Professional ".song" . . .


Err, :) where are you getting that limit from - even as a practical limit? I can record MIDI - and hear the realtime VI rendering - with a complete orchestra. I can certainly load more instruments than 10 - even if I'm not recording MIDI from all 64 (IAC limit) simultaneously for every note. Sure - I have to set the buffer size high in S1 to do so, but, that's OK if I'm not playing something live. Just like one might do when mixing and not recording generally.

Surf.Whammy wroteIf you use native Studio One Professional virtual instruments the limit might be higher; but when you are using AUi and VSTi virtual instruments, there is a limit to the number of Instrument Tracks that Studio One Professional can handle in a ".song" . . .


That is not at all my experience :D . I can load dozens of VIs in a Studio One session. I answered a thread over on the S1 FB page with a video demonstrating that very fact about 6 weeks ago where I had about 100 VIs tracking concurrently. I would not normally do so, but even on my old Quad core Q9450 machine - that's 10 years old now - I can load 100 separate VIs in S1. (Those are algorithmic, not sample-based - to be sure - but even so - there's no inherent limit in S1. ) There may be practical limits from the standpoint of memory and CPU (perhaps that's what you were alluding to?) but not in S1 itself. If I am using Kontakt, Play, VSL then or other sample-based VIs then, yes, there is a lower practical limit - but that's first memory, then CPU (CPU usage can be partially mitigated with higher buffer sizes as I noted).

Anyhow - hope that's useful further followup as you work towards your publication. I agree it's good to be accurate :thumbup: and offer the above to help that end.

Kind regards

Derek.

by cellierjerome on Sat May 26, 2018 1:06 pm
YESSSSS, it works as expected for me in CUBASE: you just have to select all the MIDI tracks you want to play or record in Cubase, and it works!!!!! I'm soooooo happy I can score a string section in a Cubase project and have the score of that part directly in Notion. Amazing! :+1 :+1

by Surf.Whammy on Sun May 27, 2018 12:31 pm
scunacc wroteIt is entirely possible to record all Rewire MIDI channels sent from Notion in Studio One. One has to select the MIDI channel in the inspector to do so, but, once done it works just fine. At least, it does on macOS, and that without using any IAC at all - purely Rewire MIDI. I just retested this before posting this to make sure nothing had changed, and, it does indeed still work. (I had a string quartet in Notion with 4 Rewire channels, sent to 4 instrument tracks in S1 - each receiving from a different Notion Rewire channel on Rewire Bus 1. I then loaded the corresponding PSO instruments in Presence to test, and, hit record. It properly recorded the MIDI over Rewire from Notion in S1. )


Excellent! :+1

I did a quick experiment, and using the Studio One Professional "Inspector" to configure ReWire MIDI for recording MIDI on an Instrument Track works very nicely . . .

Image

Image

THOUGHTS

Until recently, my primary Digital Audio Workstation (DAW) application has been Digital Performer (MOTU), but after trying the "ReWire MIDI" strategy with Studio One Professional and NOTION, I have switched focus to Studio One Professional . . .

I did not know there was an "Inspector", or at least did not notice it or pay much attention to it . . .

This is very good, and it removes the odd bit about not being able to record ReWire MIDI as MIDI on Instrument Tracks in Studio One Professional . . .

I thought it was a bit odd, but nothing I tried worked; so I decided that since External MIDI works, using External MIDI when it's necessary to record MIDI for an Instrument Track in Studio One Professional was a good "workaround" . . .

Now there's no "workaround" needed, since using the Inspector in Studio One Professional to do the ReWire MIDI "wiring" for MIDI recording provides the necessary solution, which is excellent . . .

Lots of FUN! :)

P. S. I will reply to your other comments in a while, but this bit of new information is stellar, so I replied immediately . . .

by Surf.Whammy on Sun May 27, 2018 2:44 pm
scunacc wrote
Surf.Whammy wroteAs best as I have been able to determine at present, Studio One Professional 3.5+ and NOTION 6 (current version) are the only ReWire-aware applications that support ReWire MIDI in a practical way . . .

Hmmm. What do you mean by ... "practical" here?

In this context, "practical" maps to trying it with Digital Performer (MOTU) and Reason 9.5 (Propellerhead Software) but not being able to do it . . .

I also tried it with the demo version of Cubase Essentials but again with no success . . .

I did not try it with Reaper (Cockos), and I did not try it with the demo version of Live (Ableton) . . .

I could try it with the demo version of Cubase (full version), but it requires buying a Steinberg USB dongle, which I will not do; so all I can do on the Cubase side is try it with the demo version of Cubase Essentials every few months, which I mostly stopped doing . . .

I stopped trying to do snapshots of what every current version of everything can do at any given time; so here in the sound isolation studio, "practical" generally maps to Digital Performer and Studio One Professional, but I have Logic Pro X (Apple), which over the past few years I have decided is useless (too complicated to use) . . .

If it's ReWire in general and someone needs help, I usually will try to help; but for what I call the "ReWire MIDI" strategy, the "practical" aspect is that it works nicely with Studio One Professional 3.5.6 and NOTION 6.4.462, both of which are running in 64-bit mode, of course . . .

My goal is to provide a verified way to do something useful without needing to mess with a bunch of computer stuff; and in this instance I am very happy with the "ReWire MIDI" strategy when Studio One Professional is the DAW application and NOTION is the ReWire slave, which at least to some extent can be enhanced by adding Reason as an additional ReWire slave . . .

With your information on the correct way to record ReWire MIDI on an Instrument Track, this is a perfect solution for what I need to do; and I like it!

scunacc wroteMIDI from Notion over Rewire works to Reaper as well (again I have just retested it and it is working with the most recent Notion - no IAC used, just MIDI over Rewire). In that case, I had another string quartet notated in Notion sending over 4 Rewire channels on Rewire Bus 1. That drove 4 MIDI channels in Reaper sending to 4 MIDI corresponding string instruments in VSL sending back to 4 audio tracks in Reaper.

This is useful to know . . .

scunacc wrote(AFAIR it also works with Pro Tools, but I'm not currently getting Pro Tools to see Notion as a Rewire application - not sure why. Need to look at that. I don't use PT + Notion that way anyway - only over IAC. From what I recall, Cubase does too - but I haven't tested it with Cubase recently either and another poster here indicated differently. I'll look at some point as time permits.)

Interesting . . .

scunacc wrote
Surf.Whammy wroteNOTION does not have audio tracks like a DAW application; and you cannot record anything other than MIDI input and NTempo information in NOTION. Additionally, MIDI input in NOTION is limited to one staff at a time . . .

While I agree Notion is not presently DAW-like in that respect (who knows - maybe one day; (perhaps like a bit like Overture 5 does)) - nonetheless, it is possible to send audio (a mixdown) to Notion from Studio One. This can be useful to create a "backing track" further augmented in Notion in notation. But, anyhow, yes, that's why I currently use S1 as my DAW and Notion as my notation and composition program. I keep that functionality separate - although - being able to do all in one place is something that would be nice to happen in Notion one day IMO - again, as O5 seeks to accomplish.

OK . . .

scunacc wrote
Surf.Whammy wroteFor reference, when you run NOTION by itself, it's in a ReWire session, as is the case when you run Studio One Professional by itself . . .

... unless you turn Rewire off in the Notion preferences. This can be important to do esp. if you are using it without Rewire in conjunction with another Rewire master application (such as another DAW, whether S1, Pro Tools, Logic, Reaper, etc.) That will then allow the other application to start as a Rewire master instead of Notion being so. And, if turned off, neither does Notion act as a Rewire slave when started 2nd. Again that is very important when just using IAC as I normally do.

OK . . .

scunacc wrote
Surf.Whammy wroteNOTION AND STUDIO ONE PROFESSIONAL ~ EXTERNAL MIDI ~ VIRTUAL MIDI CABLE ~ NOT IN FORMAL REWIRE SESSION

scunacc wroteAs a composer and orchestrator, I would not consider writing in anything but notation therefore. And, I want to be able to use Notion with S1 to handle that when I use a virtual orchestra. It can do that just fine with MIDI over IAC (or LoopBe) - and I use it that way already - but not with Rewire.

"IAC" and "LoopBe" are Virtual MIDI Cable services, so the way I read this is that you appear to be saying that Studio One Professional does this when you are using External MIDI staves in a NOTION score to send MIDI to Studio One Professional via a Virtual MIDI Cable . . .

If you start NOTION first and then start Studio One Professional second, then NOTION is in a ReWire session with itself but not with Studio One Professional . . .

Only if Rewire is turned on in Notion's preferences - as I said above

OK . . .

scunacc wrote
Surf.Whammy wroteStudio One Professional does not function as a ReWire slave, but it is aware that NOTION is running in a separate ReWire session . . .

Ditto

OK . . .

scunacc wrote
Surf.Whammy wroteDone this way, you can send MIDI to Studio One Professional to play AUi (Mac only) and VSTi virtual instruments, but there is no automagic transport synchronization, which means that the NOTION transport operates independently of the Studio One Professional transport . . .

Nevertheless, when you have an Audio Track mapped in Studio One Professional to each Instrument Track in Studio One Professional, you can record both the MIDI and the generated audio in Studio One Professional . . .

The caveat is that firstly you need to click on the "Play" button in NOTION to start it sending MIDI to Studio One Professional; and then secondly you need to click on the "Record" button in Studio One Professional to start recording the MIDI and corresponding generated audio . . .

This is exactly what I do normally - just using IAC. Rewire turned off. Sending MIDI (and possibly Notion audio over Soundflower) to Studio One. The loss of "Pressing Play" sync is for me the only real downside in not using Rewire.

OK . . .

scunacc wrote
Surf.Whammy wroteIf you are going to do this with the goal of recording the MIDI and the generated audio in Studio One Professional, then the easy way to do it is to insert some empty measures at the start of the NOTION score, so that you can click the "Play" button on the NOTION transport and have a few measures doing nothing to create time to switch focus to Studio One Professional, where you will click its "Record" button (with the respective Instrument Tracks and Audio Tracks enabled for recording) . . .

Or, press record in S1 first, and then simply adjust the recorded MIDI for all tracks back to the start of the bar you want to start at in S1 after the fact. That way one preserves the score integrity as a score in Notion without extra unnecessary bars at the front (I actually have extra bars at the front of my Notion-over-IAC template but that's for another reason: to use a selection of different Notion Rules depending on which VI I choose for a part).

OK . . .

scunacc wrote
Surf.Whammy wroteTechnically, there is a ReWire session; but in this scenario it's NOTION in a ReWire session with itself; and Studio One Professional is not aware of the ReWire session--other than (a) it's aware that the ReWire session exists and (b) Studio One Professional knows it is not a participant in the ReWire session . . .

Again - only if Rewire is turned on in Notion.

OK . . .

scunacc wrote
Surf.Whammy wroteThe transports not being synchronized makes this a bit awkward, but there is a way to align everything after it's recorded . . .

That's the downside yes - but the other upsides are a winner for me.

OK . . .

scunacc wrote
Surf.Whammy wroteIf you use AUi (Mac only) and VSTi virtual instruments, then the same limit in what I call the "ReWire MIDI" strategy applies, which for practical purposes is approximately 10 AUi or VSTi virtual instruments per Studio One Professional ".song" . . .

Err, :) where are you getting that limit from - even as a practical limit? I can record MIDI - and hear the realtime VI rendering - with a complete orchestra. I can certainly load more instruments than 10 - even if I'm not recording MIDI from all 64 (IAC limit) simultaneously for every note. Sure - I have to set the buffer size high in S1 to do so, but, that's OK if I'm not playing something live. Just like one might do when mixing and not recording generally.


I do everything here in the sound isolation studio on a 2.8-GHz 8-core Mac Pro (Early 2008) with 32GB of system memory and fast 7200 RPM internal hard drives . . .

Perhaps with a faster Mac, the limits would occur later; but there still would be limits . . .

I observe what happens using Activity Monitor, and the 8 cores and system memory are never maxed out; so I think it's reasonable to suggest that it's a matter of the processing that needs to be done within strict time frames, although when everything is faster, more work can be done in less time . . .
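For anyone who prefers numbers that can be logged, the same observation can be scripted. A minimal Python sketch--assuming the psutil package, with the one-minute duration chosen arbitrarily--samples per-core CPU load and memory use while a session is playing . . .

Code:
import psutil  # assumes 'pip install psutil'

# Sample per-core CPU load and overall memory use once per second for one minute.
for _ in range(60):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # blocks for one second per sample
    mem = psutil.virtual_memory()
    print(f"cores: {per_core}  RAM used: {mem.percent:.1f}%")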

The goal is not to write a book on "How to spend several million dollars on computer software, hardware, instruments, and recording equipment, so you can do songs the same way they are done by people who spent several million dollars on computer software, hardware, instruments, and recording equipment" . . .

This is based on my testing, where I use a combination of AUi (Mac only) and VSTi virtual instruments . . .

There are a few AUi and VSTi virtual instruments that are particularly "heavy" in terms of system resources (computing and memory); and over the years I decided that it's easier to keep the number of VSTi virtual instruments in the range of 10 to 20 per NOTION score when the instruments are hosted in NOTION . . .

However, there are a handful of advanced virtual instruments in MachFive 3 (MOTU) that are chromatically sampled and are further enhanced with scripting capabilities from the UVI folks . . .

They are very nice, but when I use one of them I reduce the total number of VSTi virtual instruments in the respective NOTION subscore to less than five and sometimes just one . . .

I have everything set to the highest resolution, which along with chromatic sampling increases the impact on system resources . . .

This works for me, and it's part of my strategy for developing songs in layers . . .

When I started exploring the "ReWire MIDI" strategy where Studio One Professional hosts the AUi and VSTi virtual instruments, Studio One Professional started becoming overwhelmed at around 11 hosted instruments, especially when one of them was a "heavy" MachFive 3 virtual instrument; so I set the upper limit in the "ReWire MIDI" strategy to some combination of 10 AUi and VSTi virtual instruments being hosted in Studio One Professional . . .

Nevertheless, when I am working with "heavy" MachFive 3 virtual instruments, I lower the upper limit . . .

When the Mac Pro is overwhelmed (software or hardware), (a) it's easy to observe and (b) I reduce the processing and system memory usage by not having so many AUi and VSTi virtual instruments running at the same time . . .

It works nicely, and it's consistent with developing a song in layers . . .

For reference, a key aspect of the strategy for developing songs involves what I call "Sparkling" instruments, where the strategy is to spread the notes for one instrument over as many as eight (8) staves, each of which is panned to a specific panning location on what I call the "Rainbow Panning Arc" . . .

There are other ways to do this, but I prefer this way because I can do it very precisely with music notation in NOTION . . .

It's also possible to do it with automation in Studio One Professional, but doing it with automation requires more work and is not so practical for changing the specific panning location for each note, should that be necessary for a particular flavor of "Sparkling" . . .

This is important, because what usually is one instrument with music notation on one staff can become as many as eight (8) instruments with music notation on eight (8) staves, which makes it a multiplier . . .

If you have 50 instruments and you sparkle each one by spreading its notes over 8 staves, then you have 400 tracks of audio--which you can combine by doing submixes--but initially it's a lot of staves and tracks no matter how it's done . . .
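As a rough model of the multiplier--the function names here are placeholders for this sketch, not anything in NOTION or Studio One Professional--the note distribution and the panning arc can be described in a few lines of Python . . .

Code:
def rainbow_pan_positions(n_staves=8):
    """Spread the staves evenly across the stereo field, -1.0 (hard left) to +1.0 (hard right)."""
    if n_staves == 1:
        return [0.0]
    return [-1.0 + 2.0 * i / (n_staves - 1) for i in range(n_staves)]

def sparkle(notes, n_staves=8):
    """Deal successive notes round-robin onto n_staves separate staves."""
    staves = [[] for _ in range(n_staves)]
    for i, note in enumerate(notes):
        staves[i % n_staves].append(note)
    return staves

# 50 instruments "sparkled" over 8 staves each is 50 * 8 = 400 staves, hence 400 tracks of audio.
print(rainbow_pan_positions(8))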

Consider the current version of "Surf Zot" . . .

From one perspective at present, it's a four-piece instrumental group; but the only instrument that is just one stereo track is the electric bass . . .

The drumkit is spread over several individual drum and cymbal tracks; the rhythm guitar is spread over two tracks; and there are several tracks for the lead guitar, which includes the harmony the lead guitar is playing . . .

Four instruments but 17 separate tracks; and none of the instruments are what I consider to be "sparkled" other than some "flying lead guitars" and so forth, which mostly are what I call "overdubs" and "embellishments" . . .

phpBB [video]


So far, I have not done a song that has 400 separate and independent tracks, but it's certainly possible and, more importantly, practical . . .

From yet another perspective--specifically producing and mixing--it's not practical to work with more than perhaps 25 tracks at a time . . .

It's too much information, and while it might be possible to have 100 Audio Tracks in a Studio One Professional ".song", it's not practical from the perspective of producing and mixing . . .

At some point, you need to do submixes, which is the practical solution . . .

And this is part of what I call the "ReWire MIDI" strategy where a song is developed in layers . . .

scunacc wrote
Surf.Whammy wroteIf you use native Studio One Professional virtual instruments the limit might be higher; but when you are using AUi and VSTi virtual instruments, there is a limit to the number of Instrument Tracks that Studio One Professional can handle in a ".song" . . .

That is not at all my experience :D . I can load dozens of VIs in a Studio One session. I answered a thread over on the S1 FB page with a video demonstrating that very fact about 6 weeks ago where I had about 100 VIs tracking concurrently. I would not normally do so, but even on my old Quad core Q9450 machine - that's 10 years old now - I can load 100 separate VIs in S1. (Those are algorithmic, not sample-based - to be sure - but even so - there's no inherent limit in S1. ) There may be practical limits from the standpoint of memory and CPU (perhaps that's what you were alluding to?) but not in S1 itself. If I am using Kontakt, Play, VSL then or other sample-based VIs then, yes, there is a lower practical limit - but that's first memory, then CPU (CPU usage can be partially mitigated with higher buffer sizes as I noted).

The arbitrary limit of a combination of ten (10) AUi and VSTi virtual instruments per Studio One Professional ".song", along with some number of Audio Tracks, is what I consider to be practical . . .

There can be more Audio Tracks, but at some point there needs to be a bit of submixing of audio to keep the total number of tracks per ".song" manageable . . .

On the NOTION side, at present I have all 17 ReWire MIDI staves in one NOTION score, which is working nicely . . .

There probably is a practical limit, but it might be more of a visibility matter, at least while the number of ReWire MIDI staves is 20 or thereabout . . .

Even when the ReWire MIDI staves in a NOTION score are not playing AUi or VSTi instruments hosted in a Studio One Professional ".song", NOTION has to do some work . . .

The focus in this regard is to minimize the computing that NOTION and Studio One Professional need to do at any given time, hence what I call the "Ten Hosted Instruments" rule . . .

The common sense perspective is that since I do everything myself, I can't do more than a few things at a time; so it doesn't matter if I spread audio and instruments over several Studio One Professional ".song" files and several NOTION subscores . . .

scunacc wroteAnyhow - hope that's useful further followup as you work towards your publication. I agree it's good to be accurate :thumbup: and offer the above to help that end.

Kind regards

Derek.

Much appreciated--especially the information on how to record MIDI using ReWire MIDI data sent from NOTION, which is stellar in every respect! :+1

Lots of FUN! :)

by Surf.Whammy on Sun May 27, 2018 2:45 pm
cellierjerome wroteYESSSSS, it works as expected for me in CUBASE: you just have to select all the MIDI tracks you want to play or record in Cubase, and it works!!!!! I'm soooooo happy I can score a string section in a Cubase project and have the score of that part directly in Notion. Amazing! :+1 :+1


Excellent! :)

by scunacc on Sun May 27, 2018 4:33 pm
Surf.Whammy wroteSo far, I have not done a song that has 400 separate and independent tracks, but it's certainly possible and, more importantly, practical . . .

From yet another perspective--specifically producing and mixing--it's not practical to work with more than perhaps 25 tracks at a time . . .

It's too much information, and while it might be possible to have 100 Audio Tracks in a Studio One Professional ".song", it's not practical from the perspective of producing and mixing . . .


Couple of thoughts:

When working with orchestral scores, I currently, still, prefer to use keyswitch instruments in order to reduce the track density. Apart from anything else, it keeps the AETD*** changes on one track - as opposed to having one track for each AET (and sometimes D) change. That is very unnatural to me as it does not correspond with the score (which has one staff per instrument, not 20 - with one set of AET changes per staff - notated in notation not in MIDI :) ), but, I understand why people who work in DAWs primarily - and use orchestral instruments - do it that way. Since I work primarily in notation for composition, I don't. That doesn't mean I won't do that in a DAW - just that it's not my normal choice starting point. If I need to add tracks to handle specific AETD needs after the fact that aren't well-enough nuanced by the notated and keyswitch-triggered sample changes, then I'll do that. No problem. But I don't start there...
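For readers who have not used keyswitch instruments, the mechanism is simply an extra note outside the playing range sent immediately before a phrase; the sampler interprets it as "switch articulation" rather than as a pitch to play, which is why one track can carry all the AETD changes. A rough Python sketch--assuming the mido package, with the keyswitch note number and port name as placeholders that depend entirely on the sample library--looks like this:

Code:
import time
import mido

KEYSWITCH_STACCATO = 24  # placeholder; the real keyswitch note is defined by the sample library

with mido.open_output("Example Sampler Port") as out:  # placeholder port name
    # Tap the keyswitch first so the sampler changes articulation . . .
    out.send(mido.Message('note_on', channel=0, note=KEYSWITCH_STACCATO, velocity=1))
    out.send(mido.Message('note_off', channel=0, note=KEYSWITCH_STACCATO, velocity=0))
    # . . . then play the phrase on the same channel, and therefore on the same track.
    for pitch in (67, 69, 71):  # G4, A4, B4
        out.send(mido.Message('note_on', channel=0, note=pitch, velocity=90))
        time.sleep(0.25)
        out.send(mido.Message('note_off', channel=0, note=pitch, velocity=0))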

(However, there are folks who have track templates that reach into the thousands of tracks when fully expanded. Of course, they don't normally fully load them, and have stub loading in, say, multiple Kontakt instances and only force loading on use of a particular AET choice, and/or use disabled instances of their VIs, rather than actually loaded and consuming CPU. This is something that's done regularly in other DAWs - there are some very impressive templates for doing so available for both Logic and Reaper for example.)

One way to organize that is to use folders. S1 is very good at doing that. My normal orchestral track count including folders/folder-linked VCAs, buses and FX is around 179 tracks. That includes 8 instances of Play - each loaded with multiple EWQLSO KS samples - one of VSL - likewise loaded with multiple samples - and one of Kontakt - ditto. I need to experiment with 4.0 to see what the load balancing is like when splitting these out into separate (i.e. more) instances instead of using them as multis. That doesn't mean I'm only using 10 VIs. I'm using 10 VI hosts with multiple samples each as multis...

Surf.Whammy wroteThere are a few AUi and VSTi virtual instruments that are particularly "heavy" in terms of system resources (computing and memory); and over the years I decided that it's easier to keep the number of VSTi virtual instruments in the range of 10 to 20 per NOTION score when the instruments are hosted in NOTION . . .


... and my normal IAC template uses 58 I think it is - at last count - separate IAC channels for MIDI from Notion to S1 for a full orchestral score. So, that's 58 effective simultaneous VIs - and would be if I broke them out into individual instruments instead of using multis.

Granted - not all notes are sending MIDI on all channels simultaneously - even in what would otherwise be considered tutti passages that's rare - but that covers the normal range of orchestral instruments.

For choir and organ, I have it set up to send audio from Notion (using Notion's own samples there) to S1 over Soundflower - although either could be replaced with other VIs and hosted in S1 as well - although I would be cutting it close with maxing out IAC MIDI channels at 64 then (and that's another slight disadvantage of using IAC over Rewire, but it hasn't bothered me just yet). I also don't think it's a built-in limitation AFAIK - I believe it's possible to use more than 4 IAC channels, just that Notion currently only supports using 4.

Normally I don't host instruments in Notion though when I'm doing the mixing in S1. I host in S1.

Surf.Whammy wroteThe goal is not to write a book on "How to spend several million dollars on computer software, hardware, instruments, and recording equipment, so you can do songs the same way they are done by people who spent several million dollars on computer software, hardware, instruments, and recording equipment" . . .


:) Understood - although I have given thought to using VSL Pro for further spreading to other machines. Again a common scenario and not that expensive - except if one also needs additional licenses for the actual VIs depending on use cases.

Surf.Whammy wroteFor reference, a key aspect of the strategy for developing songs involves what I call "Sparkling" instruments, where the strategy is to spread the notes for one instrument over as many as eight (8) staves, each of which is panned to a specific panning location on what I call the "Rainbow Panning Arc" . . .


Since I use "normal" orchestral seating (for various versions of normal depending on who / what one is emulating in terms of seating positions), I specifically narrow instruments in the 2D field mostly - not spread them. For me increased track counts would come from using specifically added AET choices as I mentioned above. Even then, the tracks are more manageable when using nested folders. I'll try and include some pics of my normal template setup in a while. (I still don't quite get when I can include a full screen pic and when not in the forum posts here - seems size limited - may have to link to them).

Surf.Whammy wroteAt some point, you need to do submixes, which is the practical solution . . .

And this is part of what I call the "ReWire MIDI" strategy where a song is developed in layers . . .


Part of my goal in setting things up the way I have is to be able to audition the approximate final mix without having to do any submixing. I.e. to at the very least be able to play from Notion, through S1 to the S1-hosted VIs without any dropouts and listen to the complete score. That I can do with the full orchestral template over IAC, and, still retain the complete control over the score in Notion - as I mentioned above being the goal and without having to submit Notion to Rewire control from S1 in terms of tempo, time signature, tempo changes, etc.

That includes most of the final FX as well - not just the VIs.

When that auditioning is ready for actual mixing, I can then send or record the MIDI to S1 from Notion, incorporate the tempo track from that MIDI into S1 and then everything's sync'ed up (with of course, the previously mentioned smidgen of work required to adjust the starting point of the imported MIDI to a bar start since there's no sync that way - as in Rewire.)

Kind regards

Derek

-----------------
*** AETD = Articulation, Expression, Technique and Dynamics

by scunacc on Sun May 27, 2018 4:45 pm
cellierjerome wroteYESSSSS, it works as expected for me in CUBASE: you just have to select all the MIDI tracks you want to play or record in Cubase, and it works!!!!! I'm soooooo happy I can score a string section in a Cubase project and have the score of that part directly in Notion. Amazing! :+1 :+1


... and this is one of the reasons why, although it would be fantastic to see even better integration of Notion within S1 - even as an embedded subsystem so as to completely dispense with IAC, Rewire, LoopBe etc. and have it all work natively - it should not be the sole end goal and Notion should remain as a standalone program (quite apart from the fact that it gives PreSonus a separate revenue stream and is on a par with Avid and Steinberg doing the same).

Notion is a great standalone scoring and composition program - and has been for a good while. Yes, it has differences from Sibelius, Finale, Dorico, Overture, MuseScore etc. but what makes it useful for me is the ability to rapidly score in it compared with those others. It's still the fastest for me - and I think many others - to work in for raw composition.

Now, if it added some of the compositional aid plugin features of Sibelius and some of the layout capabilities from Finale, and the DAW-like and finer-grain AETD manipulation of O5 and the cross-platform (OK, so I work on Linux too ;) ) ability of MuseScore and the expression maps and AI of Dorico - I'd be very very happy indeed. But I'm pretty happy now. :D

Additionally, the ability to score on the go in the bidirectionally compatible iOS version and move to and fro from iPad (or even iPhone!) to desktop and back makes Notion the stand-out winner for me for mobile composition. Nothing else currently touches it. :thumbup:

Kind regards

Derek.

by Surf.Whammy on Thu May 31, 2018 2:54 am
I added a Stratocaster played through a tremolo unit and an echo unit to add sustain to the lead guitar . . . :)

phpBB [video]


THOUGHTS

The notes are the same as the lead guitar notes, so this new track is adding texture . . . :ugeek:

There now are 19 ReWire MIDI staves in the NOTION score; so before I add anything else, I will do some submixing and start a new Studio One Professional layer (".song") . . .

[NOTE: There are 10 tracks of lead guitar, including the harmony guitar, but it's what I consider to be one lead guitar. There are some "flying" phrases, "sparkled" phrases, synthesized cat purring, mid-scooped quacking, tremolo, and sustained echo textures. In the real world, this maps to an effects pedal rig; so this mostly is a way to do the same thing, but with more intimate control over everything . . . ]

On a related note, I have been learning about the loudness standards proposed by the Advanced Television Systems Committee (ATSC), European Broadcast Union (EBU), and International Telecommunication Union (ITU) . . .

Waves has a Loudness Meter plug-in that does the required measurements for the various standards; and since this loudness stuff has been around for a while--news to me--the price for the Waves plug-in is reasonable. When all this first appeared, the Waves plug-in had a MSRP of $399 (US), but now it's discounted to $59 (US), which is what I consider to be reasonable and affordable here in the sound isolation studio . . .

I did the 10-day demo, and it works; so I plan to buy it in a few weeks . . .

Image

As best as I can determine, the broadcast standards for loudness are different from the loudness standards for music streamed on the web; but it's an interesting thing to explore . . .

There are standards for everything, including iTunes and YouTube; so I think it makes sense to explore this . . .
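For anyone who wants to check these measurements without a plug-in, the ITU-R BS.1770 / EBU R128 integrated loudness calculation is available in open-source form. A minimal Python sketch--assuming the soundfile and pyloudnorm packages, with the file name as a placeholder--looks like this . . .

Code:
import soundfile as sf
import pyloudnorm as pyln  # assumes 'pip install soundfile pyloudnorm'

data, rate = sf.read("final_mix.wav")       # placeholder file name; stereo floating-point samples

meter = pyln.Meter(rate)                    # an ITU-R BS.1770 loudness meter
loudness = meter.integrated_loudness(data)  # integrated loudness in LUFS

print(f"Integrated loudness: {loudness:.1f} LUFS")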

My current strategy is to use hit songs on YouTube to determine volume and loudness levels, which generally is useful to set the maximum volume level; but I like meters, since meters don't have personal beliefs and opinions . . .

Volume and loudness are not the same . . .

Loudness is perceived, so it's different from volume . . .

The general rule is that to make something sound twice as loud, the power needs to increase by roughly a factor of 10 (about +10 dB), which among other things is the reason the "decibel" unit is logarithmic . . .

When I was testing the Waves WLM Loudness Meter plug-in, I had to set the output ceiling for the T-RackS 5 Brickwall limiter (IK Multimedia) to -6dB, which at the maximum listening level is what one might describe as pleasant loudness; but compared to "Blue Ain't Your Color" (Keith Urban), it's not so loud . . .

In this version of "Surf Zot", I increased the T-RackS 5 Brickwall Limiter output ceiling to -3dB, which is more consistent with YouTube levels but nevertheless is not so loud . . .
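The decibel arithmetic behind these ceilings is easy to check: a perceived doubling of loudness is roughly +10 dB, which is a factor of 10 in power but only about 3.16 in amplitude, and a brickwall ceiling of -6 dB or -3 dB corresponds to roughly 50 percent or 71 percent of digital full scale. A small Python sketch of the conversions:

Code:
def db_to_amplitude(db):
    """Convert a level in dB to a linear amplitude ratio (20 * log10 convention)."""
    return 10 ** (db / 20.0)

def db_to_power(db):
    """Convert a level in dB to a linear power ratio (10 * log10 convention)."""
    return 10 ** (db / 10.0)

print(db_to_power(10.0))       # 10.0  -> +10 dB is ten times the power (roughly "twice as loud")
print(db_to_amplitude(10.0))   # ~3.16 -> but only about 3.16 times the amplitude
print(db_to_amplitude(-6.0))   # ~0.50 -> a -6 dB ceiling is about half of digital full scale
print(db_to_amplitude(-3.0))   # ~0.71 -> a -3 dB ceiling is about 71% of digital full scale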

My perspective is that the "standard" is defined by hit songs on YouTube . . .

Lots of FUN! :)

by Surf.Whammy on Sun Jun 10, 2018 5:53 am
After listening to the song for a while, I decided to add three more electric guitars to the Interlude . . . :)

phpBB [video]


THOUGHTS

It's becoming a bit more difficult to find sonic spaces in the song where I can add more instrumentation; but there were a few places in the Interlude that had sonic space, so I added some "sparkled" electric guitars playing what I suppose are "Post-Baroque embellishments" . . .

They are not Baroque "shakes"; so they are more like the embellishments Mozart did to expand on the Baroque style of making something simple a bit "fancy" . . .

I also did some producing and adjusted a few volume levels . . .

[NOTE: There is plenty of motion, so it's best to listen with studio quality headphones . . . ]

At present, I am planning to add a few tracks of Realivox Blue (Realitone) singing something similar conceptually to the "Echo Metal" guitars but not so rapidly; and since this Studio One Professional ".song" layer is hosting 11 VSTi virtual instruments, I will do submixes and start a new ".song" layer, since 11 VSTi virtual instruments is a good upper limit for a "song" layer done this way . . .

I also will start a new NOTION subscore, since Realivox Blue needs to be hosted in NOTION rather than in Studio One Professional, at least while I am composing the phrases Realivox Blue will sing . . .

After I record the Realivox Blue singing on Audio Tracks in Studio One Professional, I will switch to Melodyne (Celemony) to fine-tune it, which will be an interesting experiment . . .

Getting Realivox Blue to sound very precisely "human" requires a bit of work; but I have discovered most of the rules and developed a few techniques that make it easier, where in some respects the most important rule is to add a set of consonants and other useful phonemes after the end of the actual song so I can use them to over-enunciate via copying and pasting once everything is in Melodyne, which is necessary because Melodyne has its own private Clipboard . . .

[NOTE: Most of this is explained in my topic on Realivox Blue, so no need to explain it here . . . ]

Project: Realivox Blue (PreSonus NOTION Forum)

I have been studying "Red Cold River" (Breaking Benjamin), and I like the screaming "Run" bits; so I might have Realivox Blue do some screaming . . .

[NOTE: This is significantly louder than the volume level for "Surf Zot", so turn-down the volume before listening to "Red Cold River". I probably will pump "Surf Zot" later; but at present I'm keeping it within broadcast television levels or thereabout. Apparently the so-called "Youth of Today" have a lot of angst--which I think is a bit disturbing--but I like the "headbanger" aspects of "Red Cold River"; so it's all good, and it is not inconsistent with the Asynchronous Melodic Death Metal Nouveau genre I created for one of my pretend musical groups, KnightKocK, which on one level is a parody of Gwar, the stupidest Metal group in the known universe (but they sound a bit like KISS, so I like them) . . . :punk: ]

phpBB [video]


As a bit of a puzzle, I am pondering the idea of adding a vocal melody and composing some lyrics . . .

"Surf Zot" already has a reasonably complex lead guitar melody; so the "puzzle" aspect is fitting a vocal melody on top of the instruments and whatever I decide to do with Realivox Blue . . .

At present, one possibility is to fit a vocal melody and lyrics between the lead guitar phrases in a "whack a mole" style, where just when you don't expect it, someone starts singing and then just as quickly stops singing . . .

It's a totally stupid idea, but I like it . . . :P

When you think about it, every song that has a vocal melody pretty much focuses everything on consistent melodic singing with pauses only for dramatic emphasis and the typical "lead guitar solo" . . .

Lots of FUN! :)

by Surf.Whammy on Tue Jun 19, 2018 4:22 am
After a bit more listening and thinking, I added Realivox Blue (Realitone) singing some very simple counterpoint through a long, syrupy FabFilter Software Instruments "Timeless 2" echo unit . . . :)

There also is a new flavor of the first measure of the Intro . . .

[FOLLOW-UP NOTE: After listening to this a few times, it's obvious that I was focusing on the music notation for one of the lead guitars--not on the song--so most of the Realivox Blue singing starts at the wrong times; but it's OK, since I will focus on the song and correct everything. This is all part of discovering the proper melody . . . ]

phpBB [video]


THOUGHTS

The music notation for Realivox Blue is on a ReWire MIDI staff in the NOTION 6 score, and Studio One Professional 3.5.6 is hosting Kontakt 5 (Native Instruments) and Realivox Blue, which runs in Kontakt 5 as a virtual instrument . . .

As you can see in the YouTube video, this works nicely; and so far there is only one NOTION 6 score . . .

There are two Studio One Professional ".song" layers; and the audio from the first ".song" has been exported and then imported into the second ".song" layer . . .

The Realivox Blue singing is more of a sketch; but I like the way it sounds, so after listening to it for a while it probably will become a melody . . .

Currently, Realivox Blue is singing what I call "safe" notes, which is a way to sketch a melody in an instrumentally complex song, which makes a bit of a puzzle . . .

Start with the "safe" notes, and then embellish . . .

After a bit more work on this first Realivox Blue part, I will record it and then get it into Melodyne to fine-tune the enunciation and make it sound very human (but ethereal) . . .

Lots of FUN! :)

