A possible solution for using Reason as a VST
NOTE TO PHeads: I would sincerely like to know Propellerhead's opinion on this one, so please acknowledge if any of the devs has actually read it. A "yes, we've read this" from the Propellerhead team would be really appreciated, as I'm doubtful whether the devs actually read this forum nowadays (I know you used to earlier).
Hear me out on this one and please don't start flaming. It's just a thought; if you don't like it, you can ignore it without starting a flame war.
Differences between using Reason and a VSTi:

With a VSTi:
1. Drag, drop and play.

With Reason (via ReWire):
1. Start the host, then start Reason.
2. Create a MIDI track in the host.
3. Create a device in Reason.
4. Route the Reason device's output to the hardware interface.
5. Route the host's MIDI track output to the device.
6. Create an audio track and bring the hardware interface output into its channel.
Ableton has removed a few steps and made the process shorter with the introduction of the "External Instrument" device.
Now this may not seem like a long process, but doing it day in, day out turns most people off; at least it does me. I feel Reason's routing capabilities are phenomenal, but that doesn't mean the user should have to go through such a long process every time. The routing should get the creative juices flowing, not the other way around.
The proposal: a "VST Mode" switch in Preferences (I don't really care if they call it anything else).

What happens when Reason runs in this mode?
1. The default routing of devices changes.
In Reason, a new device is not routed to the mixer but straight to the hardware interface.
In Record, it either stops creating channel strips and routes by default to the hardware interface, or it still creates channel strips but routes their direct outs to the hardware interface when an instrument is created.
2. A new protocol is developed; call it RE-whatever.
What happens with this protocol?
1. A reasondevice.dll is added to the VST plugins folder.
Description of the Reason VST device:
It's very much like the External Instrument in Ableton, but with a difference. When loaded, it sends a command to Reason to create a Combinator. It automatically routes MIDI to the Combi and routes the hardware interface input to the audio out (remember, the default routing changes in this mode, so the device is automatically routed to the hardware interface).
2. It can access ReFills and the Combi patches inside them.
GUI of the Reason VST device:
Basically it just shows up as a Combinator faceplate, but it works like a MIDI controller: moving knobs on this device moves the knobs on the Combinator loaded in Reason. Loading a patch in it doesn't actually load anything locally; it just tells Reason to load it.
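To make the idea concrete, here is a rough sketch of what the plugin-to-Reason control protocol described above could look like. Everything here is an assumption for illustration: the message names, the JSON wire format, and the rack model are made up, since no such API exists in Reason.

```python
# Hypothetical sketch of the plugin <-> Reason control protocol.
# Message names, the JSON wire format, and the rack model are all
# assumptions for illustration -- no such API exists in Reason.
import json


def make_msg(command, **params):
    """Encode one control message as a JSON string (assumed wire format)."""
    return json.dumps({"cmd": command, **params})


class MockReasonRack:
    """Stands in for Reason's side of the protocol: it owns the actual
    Combinator; the plugin shell in the host only sends commands."""

    def __init__(self):
        self.combinators = {}   # combi_id -> {"patch": ..., "knobs": {...}}
        self.next_id = 1

    def handle(self, raw):
        msg = json.loads(raw)
        if msg["cmd"] == "create_combinator":
            combi_id = self.next_id
            self.next_id += 1
            # In "VST mode" the new Combi would be auto-routed to the
            # hardware interface rather than the mixer (see point 1).
            self.combinators[combi_id] = {"patch": None, "knobs": {}}
            return combi_id
        if msg["cmd"] == "load_patch":
            self.combinators[msg["combi"]]["patch"] = msg["path"]
        elif msg["cmd"] == "set_knob":
            # Faceplate knob moves in the host are mirrored here.
            self.combinators[msg["combi"]]["knobs"][msg["knob"]] = msg["value"]


# The plugin's life cycle as the post describes it:
reason = MockReasonRack()
combi = reason.handle(make_msg("create_combinator"))    # on plugin load
reason.handle(make_msg("load_patch", combi=combi, path="Bass.cmb"))
reason.handle(make_msg("set_knob", combi=combi, knob="Rotary 1", value=64))
```

The key design point is that the plugin holds no state of its own: every knob move and patch load is just a message, and Reason's rack remains the single source of truth.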
So now the process is just like with VSTis: drag and drop the Reason VST, it automatically creates the Combi in Reason, and you can load up any patch and start playing.
Doing it only for the Combi means the Props don't have to create this for every device; but since any device can be put in a Combi, the user can access any instrument. And as Reason upgrades, the Combi can gradually be upgraded to have more controls, so people can create complex patches, map the relevant controls onto the Combi faceplate, and access it like a VST in their host.
Again, I ask that nobody starts a war over this one; you have the choice to ignore it if you don't approve of the suggestion.
EDIT: Additionally, the device could check whether Reason is running and, if not, start it automatically.
This method also cuts down on having to save two files; you really don't need to save the Reason file.
E.g. in Ableton you save a set with five Reason VSTs; when you load this set, it automatically starts Reason and loads up the Combis in the Reason rack with their presets.
Also, in this mode (VST mode, instrument mode, plugin mode...) Reason could run quietly in the system tray, invisible, but you could open the rack if you need to.
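The "start Reason if it isn't running" part of the edit could be sketched like this on Windows. The executable path and process name are guesses, and the process-list source is injected so the logic can be tested without Reason installed:

```python
# Sketch of the "auto-start Reason" idea. The install path and process
# name are hypothetical; on Windows the running-process list could come
# from `tasklist`, but here it is injectable for testability.
import subprocess


def is_running(process_name, list_processes):
    """True if any entry from list_processes() contains process_name."""
    return any(process_name.lower() in p.lower() for p in list_processes())


def ensure_reason_running(exe_path="C:/Program Files/Reason/Reason.exe",
                          process_name="Reason.exe",
                          list_processes=None,
                          launch=subprocess.Popen):
    if list_processes is None:
        # Assumed source of truth on Windows: one CSV row per process.
        list_processes = lambda: subprocess.check_output(
            ["tasklist", "/FO", "CSV", "/NH"], text=True).splitlines()
    if is_running(process_name, list_processes):
        return "already running"
    launch([exe_path])   # fire and forget; the plugin then connects to it
    return "launched"
```

In the real plugin this check would run once when the first Reason VST device is instantiated in the host.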
I agree that the ReWire experience is pretty horrible. At the time it first came out it was something that allowed you to do things that couldn't be done before, but these days there are so many other options that I don't think most people have the patience. Certainly I doubt there are many users that are new to music making that choose a ReWire based setup, rather they either go for ReRec standalone or go down the VST route.
Exactly. But if something like this were implemented, one wouldn't have to choose. It would be like working with SynthEdit or Reaktor etc.: make your patch, save the Combi, and load it up in your host. The difference here would be that Reason is running in the background; in fact this would be even better, as you could edit the patch while working on the track. It would make ReWire a good experience. Hope the PHeads are listening.
You don't need to do this for every device, just the Combinator. It would make Reason a synth-editing tool, and you'd play the patch in your host.
BTW, to those who think this is useful: do give this a small bump once in a while, you know, to keep it alive.
I'd love to even see a "MIDI Out" device! Reason brings audio into it now, so why can't we have a simple MIDI out device to control our hardware synths? You're not selling your instruments inside Reason as add-ons, so what would it hurt? I feel my Reason collects dust simply for this reason, and it has been replaced by Ableton :( I'd say this VST idea would bring Reason back into my mix :)
Can I just point out that the loading of VSTs and ReWire devices is host-specific. Currently I'm using Reaper as my DAW, and it's a single right-click and select for both processes.
The rest of the point is obviously to do with Reason itself and the fact that once it's opened you still have to set it up, which is a fair enough comment.
I have to say, though, that I'm not so keen on this idea, simply because VSTs have proven to be very easy for hackers to target. I don't know why this is, but it's a pretty safe bet that whatever the protection method is for standalone versions, it can generally be bypassed within the .dll context.
This would somewhat defeat the purpose of having the Ignition Key to protect the software, when those of the ilk to do so could just use the back-door method.
I'll still back the call for MIDI Out, as it would allow me to trigger my VSTis from Reason's sequencer without endangering the main program.
I must stress that I like the thinking behind the post; I just don't think it's viable in today's world.
Just wondering: doesn't this mean you'd have to have a VST that can run standalone? Because I don't think all VSTs have that option.
The basic premise is that you would have the Reason "MIDI Out Device channel 1" going out to say Reaper/Live Lite/Cubase LE's MIDI input. You then set up whatever VSTi/AU you wish and the MIDI editing and recording is handled in Reason.
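That premise can be sketched as a simple channel-filtering forwarder: everything arriving from the hypothetical "MIDI Out Device channel 1" port is passed on to the VSTi, and everything else is dropped. Plain tuples stand in for real MIDI events here; this is not an actual MIDI API:

```python
# Sketch of the "MIDI Out Device" premise: Reason's sequencer emits
# events tagged with a channel, and the host forwards channel-1 events
# to the chosen VSTi. Events are (channel, status, data1, data2) tuples,
# a stand-in for real MIDI messages.

def forward_channel(events, channel=1):
    """Keep only events on the given channel, as the host's MIDI input
    routing would, and hand them on to the plugin."""
    return [e for e in events if e[0] == channel]


reason_output = [
    (1, "note_on", 60, 100),   # C4 from Reason's sequencer, channel 1
    (2, "note_on", 64, 100),   # another device on channel 2 -- dropped
    (1, "note_off", 60, 0),
]
to_vsti = forward_channel(reason_output, channel=1)
```

All the actual editing and recording stays in Reason; the host's only job is to deliver the filtered stream to the plugin.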
Of course you could say, "why not just use the host's editing?" Well, as a Reason user of over a decade now who learned sequencing on Reason, it's fair to say that my editing skills in this regard are FAST on Reason. As with muscle memory on a guitar or keyboard, I don't have to think about it at all; I just go ahead and do it almost on an unconscious level.
Standalone is the exact same but minus the host sequencer.
Same goes for me. I love sequencing in Reason, or, to put it differently, I am so used to it that it works much quicker for me than in e.g. Ableton.
But one more thing: how would the audio be routed back into Reason? Or would that still be handled manually, by rendering and exporting from the host? Or, when the VST is in standalone mode, how would you record the audio?
There are a couple of different possibilities here. First there's the physical possibility, which requires two different hardware units relying on independent ASIO drivers. Pretty simple thinking: you set one program to one of them, the other to the other, and link them. Ta-da!
Now, a second interface isn't strictly required for those who don't have the cash or space for it; you can do it internally by duplexing the driver and recording across the soundcard. This involves no extra hardware, but it's a bit trickier to set up, and depending on your specs you may have some latency issues. However, this latency can be edited out pretty easily in the audio stages later on, and it shouldn't affect realtime playing. It's also a lot easier to do this on Mac than on PC, from what I can gather. I'm on PC myself and used to use Jack, which works well enough, but a lot of people report chronic latency issues which are probably spec-related; I'm in no position to troubleshoot for them remotely, but I usually try to help. The Mac version of this involves Soundflower or the original incarnation of Jack, which is apparently kept up to date more often than its Windows counterpart.
Nowadays I have a second interface, but the coup de grâce method on Windows, if you don't have one, is to get hold of a copy of Cubase LE (any version; usually about 20 quid tops on eBay). This, or indeed any Cubase/Nuendo install, has a very specific ASIO driver packaged with the installer which does most of the work for you, and if you twin this with ASIO4All, you essentially have two virtual interfaces. There's also no latency with this method, and it's a breeze to set up, whereas Jack requires a bit of tweaking and is actually initiated by opening two utility programs each time you want to use it, which can of course be a PITA! Jack is due an update on Windows to bring it up to speed on 64-bit, so I imagine many of its issues will have been fixed when that becomes available; but as it's freeware, you may need to be patient.
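For the Jack route, the wiring step described above boils down to connecting one application's output ports to the other's input ports with `jack_connect`. A small sketch that builds those commands; the client and port names here ("cubase", "reason", "out_1", ...) are made up, and in practice you'd list the real ones with `jack_lsp` first:

```python
# Builds the jack_connect command lines for a stereo link between two
# applications. Client and port names are hypothetical; run `jack_lsp`
# to discover the real ones on your system.
import subprocess


def stereo_link(src_client, dst_client, dry_run=True):
    """Return (and optionally run) the jack_connect commands for a
    left/right pair between src_client and dst_client."""
    cmds = []
    for src_port, dst_port in [("out_1", "in_1"), ("out_2", "in_2")]:
        cmd = ["jack_connect",
               f"{src_client}:{src_port}",
               f"{dst_client}:{dst_port}"]
        cmds.append(cmd)
        if not dry_run:
            subprocess.run(cmd, check=True)  # needs a running Jack server
    return cmds


commands = stereo_link("cubase", "reason")   # dry run: just show the wiring
```

With `dry_run=False` and a running Jack server, this would perform the same connections you'd otherwise drag together in a patchbay GUI like QjackCtl.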
Hope I haven't meandered too much for this to be readable. :)
Thanks for the explanation :) That's much more complicated than I expected, or maybe just as complicated as I feared; that's why I asked. I absolutely hate messing around with drivers, setting things up, hardware, cables etc.
I had a much simpler image in my mind concerning MIDI out: sending MIDI to a standalone plugin and recording directly into Reason, ta-da.