I second the recommendation for the Enttec DMX-USB Pro with QLC+.
I've worked with MyDMX 2.0 and it was an all-around frustrating and limited experience for busking; I would be bald from pulling my hair out if I had to do a preprogrammed show with it.
I've also used the DMXIS hosted in Ableton and while it was a step up from MyDMX, it still struck me more as a tool for musicians who want lighting as an afterthought rather than a tool for controlling lights.
My current personal show setup uses Ableton for all the live music, and uses LoopMIDI to send MIDI out from Ableton into QLC+, so I get all the benefits of Ableton's beat synchronization with all the benefits of QLC+'s more robust programming environment. Feel free to PM me if you have any questions about it, I love proselytizing this system to other musicians.
Yep, I use loopMIDI (free, x86/x64, Windows XP/7/8 compatible) as my virtual MIDI cable.
Once you have GlovePIE chatting to the virtual MIDI cable, it all comes down to basic scripting (something I fumbled my way through using other people's scripts as examples, plus Google and the official GlovePIE documentation) and assigning your virtual MIDI cable as an input device in your DAW.
If you are looking to bridge VCV into Ableton, there are a number of ways to do it besides VCV Bridge; it's mostly a matter of configuring software.
I use Windows and have a number of applications outside my DAW, VCV included, that handle MIDI and audio and that I use in tandem with my DAW.
For MIDI, I use <strong>loopMIDI</strong> which creates a virtual MIDI device that allows me to route MIDI between applications. I can send MIDI to loopMIDI from Reaper and then set MIDI input in VCV to use loopMIDI.
For audio, I use an audio interface that has several virtual channels. Specifically, I use an Apollo Twin USB. This allows me to send the output audio of VCV to the virtual channels, which I then set as input channels in my DAW for recording. I believe the same can be done with Focusrite interfaces as well.
If you don't have an audio interface, there are a number of software applications that do basically the same thing: <strong>Virtual Audio Cable</strong> and <strong>Soundflower</strong> are two examples.
I will note that using an audio interface with virtual channels does work better in my experience, because you don't have to change the ASIO device in your DAW to something else and then futz with sample rate and buffer settings.
No trouble at all. I write small sketches like that all the time when I have some idea of what interesting thing I could do with MIDI - it's really easy to work with MIDI messages. I already started a little bit just because you got me interested, and I already have the basics working: https://soundcloud.com/x2mirko/tremolo-picking-a-rhodes/s-IlNbu
In case you do want to try it: it requires a virtual MIDI cable, like loopMIDI, to route the MIDI into the program and then back out of it. It's basically like installing a virtual MIDI port that you can then route MIDI to and from normally in your DAW. So what you'd need to do is install that MIDI cable, route MIDI from your DAW to one virtual MIDI port, and then route MIDI from another virtual MIDI port to your synth. It's not all that hard, but if that sounds like too much trouble, it's probably better to try something like this at a more "adventurous" time - don't feel obligated to try anything just because I put in a bit of effort. I would've programmed that thing anyway after you put the thought into my head, and now I'll play with it some more :D
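To give an idea of how small a sketch like that can be, here's a rough outline of the tremolo-picking idea in plain Python. I'm leaving out the actual port I/O (in practice you'd read from one loopMIDI port and write to another with something like python-rtmidi) and only showing the event generation; the function name, rate, and timings are just my illustration, not the actual script from the track above.

```python
# Turn one held note into a stream of rapid note-on/note-off pairs
# ("tremolo picking"). 0x90 = note on, 0x80 = note off, channel 1.

def tremolo_events(note, velocity, hold_time, rate_hz=16.0):
    """Expand a held note into (time, (status, data1, data2)) tuples."""
    interval = 1.0 / rate_hz
    gate = interval * 0.5          # each pick sounds for half the interval
    events = []
    t = 0.0
    while t < hold_time:
        events.append((t, (0x90, note, velocity)))   # note on
        events.append((t + gate, (0x80, note, 0)))   # note off
        t += interval
    return events

# holding middle C for a quarter second at 16 picks/sec -> 4 on/off pairs
events = tremolo_events(note=60, velocity=100, hold_time=0.25, rate_hz=16.0)
```

The real version would schedule these against a clock and send each tuple out the second virtual port as it comes due.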
The manual doesn't give details of how to get it to work in Cubase, but it's just a 64-bit VST (plus a standalone program), so I can't see why it wouldn't.
It does rely on a piece of software called loopMIDI though so it may be worth exploring if that works on your operating system before buying. Worked fine for me on Windows 10 after a bit of fiddling.
Like /u/KrabbyPattyFormula, I've never found any way to do it in Live, but if it's really important to you, you can always download a MIDI editor like this free open-source one for Windows and do the manual editing outside of Live. You just right-click your MIDI clip in Live and export it to your desktop or elsewhere, then open it in your editor. I actually have MidiEditor set up to output playback to a loopMIDI port, which I can set as the MIDI In on the track in Live. This saves you from having to bounce back and forth exporting and reopening every time you make changes. If you're on a Mac, there are other free editors out there, and I believe iOS has virtual MIDI ports like loopMIDI built in somehow.
You can use software like loopMIDI to create virtual MIDI tunnels and then it's very simple to write into it. I used it in the past to read input from a socket and push it into a DAW.
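As a sketch of the "write into it" part: the simplest thing is to translate whatever arrives on the socket into raw MIDI bytes and push those out the loopMIDI port (with a library such as python-rtmidi; the port-opening step is left out here). The little text protocol below is my invention purely for illustration, not a standard.

```python
# Map a toy text protocol (e.g. lines read from a socket) to raw MIDI
# bytes, ready to be sent to a loopMIDI port and picked up by a DAW.

def text_to_midi(line):
    """'note <pitch> <velocity>' -> note-on, 'off <pitch>' -> note-off."""
    parts = line.split()
    if parts[0] == "note":
        return bytes([0x90, int(parts[1]), int(parts[2])])  # note on, ch 1
    if parts[0] == "off":
        return bytes([0x80, int(parts[1]), 0])              # note off, ch 1
    raise ValueError("unknown command: " + line)

# e.g. a command received over the socket:
msg = text_to_midi("note 60 100")
```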
The best approach would be to let the user select an output in your app: either audio, or a MIDI port chosen from a list to write into, along with a recommendation to use a virtual MIDI program like loopMIDI or LoopBe.
I use a USB port at the moment and two software workarounds called Hairless MIDI and loopMIDI. The Arduino sends MIDI messages at 115200 baud; however, the computer doesn't see a MIDI device, it sees a serial device. So Hairless MIDI picks up the serial comms, turns them into MIDI messages, and sends them to loopMIDI. loopMIDI receives the MIDI messages and turns them into MIDI inputs. Then in FL Studio (or any program that can use MIDI inputs) I select loopMIDI as a MIDI input.
loopMIDI and Hairless MIDI are free programs and great to use while you're still changing code on the Arduino.
Soon I will change the Arduino MIDI out for an Arduino Nano; then I can do a firmware update which will let the PC see the Arduino as a MIDI device, and the software workarounds won't be needed. But I want my code 100% done before I do this.
Some links:
loopMIDI - http://www.tobias-erichsen.de/software/loopmidi.html
hairless MIDI - http://projectgus.github.io/hairless-midiserial/#downloads
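For the curious, the serial-to-MIDI step Hairless handles is conceptually simple: the Arduino writes raw MIDI bytes down the serial line, and the bridge groups them back into messages. A minimal parser for plain 3-byte channel messages (my simplification; no running status, no SysEx, no 2-byte messages like program change) might look like:

```python
# Group raw serial bytes into (status, data1, data2) MIDI messages.
# A status byte (high bit set) starts a new message; two data bytes
# (high bit clear) complete it.

def parse_midi_stream(data):
    messages = []
    buf = []
    for b in data:
        if b & 0x80:          # status byte starts a new message
            buf = [b]
        elif buf:
            buf.append(b)
            if len(buf) == 3:
                messages.append(tuple(buf))
                buf = []
    return messages

# two note-on messages as they might arrive over the serial port
stream = bytes([0x90, 60, 100, 0x90, 64, 100])
```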
I use ReaTune for audio-to-MIDI, and because you can't overdub with "record output (MIDI)", I add a MIDI hardware output via loopMIDI (http://www.tobias-erichsen.de/software/loopmidi.html) and, on a second track, set the MIDI input to the loopMIDI port and record in overdub mode.
It sounds to me like you should be using a DAW. You can use a virtual MIDI cable to route the output of Synthesia to the MIDI input on a DAW. There are a few different programs that can do this, I've used this one: http://www.tobias-erichsen.de/software/loopmidi.html
On Windows, install loopMIDI. In Finale, set Playback To > MIDI System; in Reaper, enable the MIDI device (the port name from loopMIDI) and set the track's MIDI input to that port and channel. You can then play your Finale scores in real time through Reaper's virtual instruments and press record in Reaper. And if you want to connect the timelines, check whether your Finale version supports ReWire.
Okay, maybe try this: http://www.tobias-erichsen.de/software/loopmidi.html
Or, if you have two midi devices, you could do a hardware loopback.
If no working solution pops up, you may have an easier time switching to a sound card or operating system that comes with virtual MIDI loopback drivers (Linux is pretty nifty on that front; batteries included with most hardware ;-)
Eh, it's glitchy as fuck. It used to be pretty solid, but since the 64-bit transition it's been flaky. I've been moaning about it on the FL forums but no one else seems to care (the irony being that I bought the Producer Edition of FL just before the transition, so... yeah, not cool).
A quick fix I use is loopMIDI which I use to trigger samplers in FL from Cubase (although I only use FL as a glorified sample browser so YMMV).
If you produce in Maschine, Traktor is your choice for playing out.
It's also not too complicated to sync Maschine to Traktor on either Windows or OS X. I use Windows 10 and run a Traktor/Maschine set up.
To sync the MIDI, you only need loopMIDI. There are tutorials on YouTube on how to use it, but essentially you keep Traktor as your master clock and then slave Maschine to it, so whatever happens in Maschine is synced to Traktor's tempo.
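Under the hood this is just MIDI beat clock: the master sends a timing-clock byte (0xF8) 24 times per quarter note, and the slave derives the tempo from the spacing between ticks. The arithmetic is simple enough to sketch:

```python
# MIDI beat clock arithmetic: 24 pulses per quarter note (per the MIDI
# spec), so tick spacing and tempo are each other's inverse.

PPQN = 24  # timing-clock (0xF8) pulses per quarter note

def tick_interval(bpm):
    """Seconds between clock ticks at a given tempo."""
    return 60.0 / (bpm * PPQN)

def bpm_from_interval(seconds):
    """What the slaved device computes from the measured tick spacing."""
    return 60.0 / (seconds * PPQN)

# at 125 BPM the master emits a tick every 20 ms
spacing = tick_interval(125)
```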
Then you have two approaches. If you don't want to use Maschine live in itself, you can export your patterns into the Traktor Remix decks, as simple as drag and drop (the little waveform+ button at the top right of the pattern editor).
I however use Maschine as a "deck" in Traktor. I use JACK to route the audio out from Maschine into a Traktor "live input" on Deck D. This allows me to then control the Maschine through a mixer channel and also apply Traktor effects to it. Since the Traktor effects sync to Traktor's clock, which Maschine is also synced to, it all works really well.
Sorry for the essay!
EDIT: Forgot to mention, if you want to pursue this route on a budget then your best bet is to buy one of these controllers. This will give you 4 channel mixer control in Traktor, and also allow you to have rudimentary control of the remix decks and other Traktor functions. (Although not as well as the NI F1 will).
That's a really good idea. I remember talking to an ex-colleague who had taken some DSP courses... his problem with them was that it was pure theory and left him with no real understanding of how, when and why to apply it. (I'm self taught so I don't know what any of these courses are like). By playing around with signals and listening to the results I think you'll get a better feel for what's actually going on.
What OS are you running under? MIDI control isn't actually too hard to implement, at least under Windows (not done any MIDI stuff under Linux). And if you use a MIDI loopback driver (such as this one for Windows) you can use the sequencer within a DAW to control your synth.
A higher level environment might be beneficial to aid your understanding and allow you to try out algorithms and ideas without having to worry about low level details too much (as opposed to sticking with C++/Java). As well as Pure Data or CPython, I'd recommend having a look at at least one of GNU Octave (or MATLAB if you have it), Python (with SciPy and NumPy), Reaktor or MaxMSP... any of these will provide a higher level environment suited to DSP, each with their own pros and cons... for example, being able to display pretty graphs may be more or less important to you than playing audio in real time.
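As a taste of that "play around and listen" approach, here's about the smallest DSP experiment there is, in plain Python (no libraries needed; any of the environments above would make this a one-liner). It's a one-pole lowpass filter; sweep the coefficient between 0 and 1 and you can hear the cutoff move.

```python
# One-pole lowpass: y[n] = y[n-1] + a * (x[n] - y[n-1]).
# Smaller a = heavier smoothing = darker sound.

def one_pole_lowpass(samples, a):
    out = []
    y = 0.0
    for x in samples:
        y = y + a * (x - y)
        out.append(y)
    return out

# feed it a step: the output rises gradually toward 1.0 instead of jumping,
# which is exactly the smoothing you'd hear on audio
stepped = one_pole_lowpass([1.0] * 8, a=0.5)
```

Writing the result out as a WAV (the stdlib `wave` module will do) and listening to a filtered noise burst teaches you more about what `a` means than the transfer function ever did for me.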
One advantage to either using something like Pure Data, Reaktor etc. or actually developing your own VSTs, is that you can easily add in interactivity (a VST without a GUI will still be given a set of knobs or sliders by the DAW software). Again, this may not be so important to you.
Oh yeah... it's not massively active, but subscribe to /r/dsp if you haven't already. Have fun!