Get Plane9, it's free and it syncs fucking awesome visuals to the music you're playing. And there are so many different visuals you can choose from. Also, it doesn't matter where you are listening from.
Just updated my music visualizer Plane9 (now at v2.3.1.2) to use the 0.7 SDK, and I must say the new runtime/direct mode is a great improvement! The Oculus Share page hasn't been updated yet though.
As always, it's difficult to even come close to the quality of Iñigo Quilez, but here is my attempt at a noisewarp scene. Works as a screensaver & music visualizer; the time domain moves to the music. Still quite a lot of work left to match the beautiful colors that Iñigo Quilez has been able to produce. If you want to try out the scene you need to grab Plane9 first, or just wait, since someone will surely port it to WebGL soon enough.
Updated: Added emboss post-processing and fixed the colors a bit. Runs at 180fps on dual 1600x1200 monitors on a GeForce 460.
Plane9 is a 3D visualizer where you never have to settle for just one view ever again. It features over 250 predefined scenes to choose from. But it doesn't end there, since the scenes can be combined with one another to form a near endless supply of new views to experience. 37 transitions are used to form a continuous experience when moving from scene to scene.
The visualizer can be used as a Winamp plugin, a Windows Media Player plugin, a screensaver or an Oculus Rift VR visualizer. In its standalone modes it is sound sensitive and reacts to whatever you're currently listening to, be it from Spotify, iTunes or any other sound source; it can even react to what you record from a microphone or other input.
For 1.3 there are only Plane9 and the one in Virtual Desktop so far. Apart from that /u/excelynx has confirmed that he is working on a new version of Visir which he wanted to finish this weekend, but apparently did not.
I created such a noise-based simulation a few years ago and it can actually look quite good (Fire) considering how cheap it is. But it still doesn't hold a candle to a real fire simulation I did a while ago (better fire). If you start using real blackbody colors it improves even further, but I don't have a video of that one yet.
Plane9 installs itself as a Winamp and Windows Media Player visualizer as well as a sound-sensitive screensaver and it has several good presets (you can easily set it to only show you the presets you want).
If you're using Winamp then MilkDrop 2 is pretty much the way to go but Ryan Geiss has a few other Winamp plugins you should check out on his site.
Multiple blur passes are the fastest way to get a large bloom radius. In step 2, doing the blur with a normal 1000-pixel kernel, even with a split vertical/horizontal blur, would take ages. Using downscaled versions is quick and gives you a nice bonus: you can adjust the weight of each blur radius/layer, which is what UE4 also supports. I have some short notes on my blog. Basically the papers you want to look at are a GDC paper from 2004 by Masaki Kawase and 'Unreal 4 - The Technology Behind the Elemental Demo'.
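The weighting idea can be sketched like this (Python, with 1-D lists standing in for downscaled render targets; all names are mine, and a real implementation does this on GPU textures with a separable Gaussian rather than this box blur):

```python
def box_blur(img, radius):
    """Average over a sliding window - a cheap stand-in for a Gaussian."""
    out = []
    for i in range(len(img)):
        window = img[max(0, i - radius):i + radius + 1]
        out.append(sum(window) / len(window))
    return out

def downsample(img):
    """Halve the resolution by averaging pixel pairs."""
    return [(img[i] + img[i + 1]) / 2.0 for i in range(0, len(img) - 1, 2)]

def upsample(img, size):
    """Nearest-neighbour upsample back to full resolution."""
    return [img[min(i * len(img) // size, len(img) - 1)] for i in range(size)]

def bloom(highlights, weights):
    """Blur progressively smaller copies and blend them back with per-layer
    weights: a small blur on a quarter-res layer acts like a huge blur at
    full resolution, which is the whole trick."""
    result = [0.0] * len(highlights)
    layer = highlights
    for w in weights:
        blurred = box_blur(layer, 1)
        full = upsample(blurred, len(highlights))
        result = [r + w * f for r, f in zip(result, full)]
        layer = downsample(blurred)
    return result

# A single bright pixel spreads into a wide, weighted glow.
glow = bloom([0.0] * 8 + [4.0] + [0.0] * 7, [0.5, 0.3, 0.2])
```

Tuning the weights list is exactly the per-layer adjustment mentioned above.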
Cellular automata can be quite mesmerizing if tweaked a bit, like Jonathan McCabe did to create fractal Turing patterns. In its simplest form it's a multi-scale cellular automata simulation, where you simulate the cells with several different spacings.
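A toy version of the multi-scale rule (1-D and in Python for brevity; all names are mine and this is nothing like McCabe's actual implementation, just the core idea: each scale is an activator/inhibitor blur pair, and every cell follows the scale where the two differ the least):

```python
import random

def blur(img, radius):
    """Wrapping box blur - each blur radius is one 'spacing'/scale."""
    n = len(img)
    return [sum(img[(i + d) % n] for d in range(-radius, radius + 1))
            / (2 * radius + 1) for i in range(n)]

def turing_step(cells, scales, step=0.05):
    """One update: blur at every scale, and nudge each cell toward the
    activator/inhibitor difference of its least-varying scale."""
    pairs = [(blur(cells, a), blur(cells, i)) for a, i in scales]
    out = []
    for idx in range(len(cells)):
        best = min(pairs, key=lambda p: abs(p[0][idx] - p[1][idx]))
        delta = best[0][idx] - best[1][idx]
        out.append(cells[idx] + (step if delta > 0 else -step))
    # Renormalise to [0, 1] so the pattern never saturates.
    lo, hi = min(out), max(out)
    return [(v - lo) / (hi - lo) for v in out] if hi > lo else out

random.seed(1)
cells = [random.random() for _ in range(64)]
for _ in range(20):
    cells = turing_step(cells, [(1, 3), (4, 9)])
```

With a 2-D grid and more scale pairs this produces the organic, endlessly-folding patterns in question.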
Actually there are 2 developers working on music visualizer apps: http://www.plane9.com/ and http://valynxstudio.com/visir/. Unfortunately I haven't been able to try either since I have no Vive, but I've read from a few people who have tested them on the Vive that they work, just with no motion controller support as of right now.
My Plane9 supports 21:9 aspect ratios and multimonitor setups as well. Most of the 260 scenes adjust well to this, but not all of them. It has a Matrix trails scene as well.
For anyone trying this I would recommend doing it in a shader instead for massive speed boosts. A few examples: Classic fire uses 'fractal Brownian motion' noise whose value is sent through a simple 1D gradient; animating the y & z dimensions creates the motion. 3d fireball is a 3D simplex volume texture (16x16x16 pixels) that is raymarched through.
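The classic-fire recipe, sketched in Python (collapsed to 1-D noise for brevity; the real scene uses 2-D/3-D noise with the extra dimensions animated, and all names here are mine):

```python
import math

def value_noise(x):
    """Hash-based 1-D value noise, linearly interpolated."""
    def hash01(i):
        h = math.sin(i * 127.1) * 43758.5453
        return h - math.floor(h)
    i = math.floor(x)
    f = x - i
    return hash01(i) * (1.0 - f) + hash01(i + 1.0) * f

def fbm(x, octaves=4):
    """Fractal Brownian motion: sum noise octaves at doubling frequency
    and halving amplitude."""
    total, amp, freq = 0.0, 0.5, 1.0
    for _ in range(octaves):
        total += amp * value_noise(x * freq)
        amp *= 0.5
        freq *= 2.0
    return total

def fire_color(t):
    """Simple 1-D gradient: black -> red -> yellow."""
    t = max(0.0, min(1.0, t))
    return (min(1.0, 2.0 * t), max(0.0, 2.0 * t - 1.0), 0.0)

def fire_pixel(x, y, time):
    """Scroll the noise coordinate with time to animate; cool off with height."""
    heat = fbm(x * 3.0 + (y + time) * 2.0) * (1.0 - y)
    return fire_color(heat)
```

In a pixel shader this whole thing is a handful of lines, which is why it runs so much faster than a CPU simulation.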
Jelly is created in Quartz Composer. Quartz Composer is huge and deeply integrated into macOS, so there aren't any Windows ports of it.
Having said that, I do have some knowledge about programming and it's on my list of visualizers I want to add to my own visualizer. But matching how good it looks in iTunes is going to be difficult, to say the least.
I haven't tried it myself, but I've been recommended one called Plane9.
It supports VR natively and will actually take you into the visualizer which is supposedly very trippy.
Check it here
I have created one called Plane9; it's free, standalone and still being updated. The other option would be one of the commercial versions from SoundSpectrum.
There aren't many developers left that make screensavers. I'm however one of them, so if you want something new and still in development you can try Plane9. Others I can recommend that many like are the Really Slick ones.
If you specifically target 4.5 you would exclude a very large share of users, since not only do they need a quite new card, they also need to have updated drivers for it. My own stats currently lump 4.5 into the "other" category, so the share of my music visualizer's users that actually have OpenGL 4.5 is lower than 13%. I would say target 3.3, as suggested by others.
Plane9 will sync to the refresh rate of the monitors, so it should do 120Hz, and it also supports multiple monitors. However, running at 120Hz over 3 monitors will require one beefy setup, so pick some low-resource scenes. /Author of Plane9
For the Windows users here, you can try my Plane9. Start the configuration window, click new playlist, and select the "monster spectrum" scene with the "black" scene as background. Then run it in windowed mode from the start menu and it will work with iTunes or any other sound source.
If you want to improve the look of the fluid, here is a simple system that produces quite nice results while being very fast.
I applied that effect to a variation of the fluid painting scene. The result can be seen here. It rendered 10000 particles at 130fps on a 1600x1200 display (debug build). But you certainly don't need that many particles to make the effect convincing.
Interesting ideas and very true. People always respond well to some form of story, but as you say, dynamically composing these stories while still showing some nice effects with a "wow" feeling is difficult at best. It's going to be very interesting to see what you end up with.
The idea behind Plane9 is more for the general user. Something you can install, and when the screensaver/winamp plugin kicks in it will show you something nice that reacts to the music you're playing, while at the same time allowing anyone to create new scenes and upload them to the site for anyone else to use and build upon. Since all scenes created are independent and can be dynamically composed with one another, creating some form of story flow in Plane9 will be very difficult. The next version will add a lot of new features (with over 30 transitions being one of them) that will certainly help in this regard, especially since the software can do more than just cross fade. It can apply the next scene to any object, say a simple rotating cube that takes you from one scene to the next. This will help in creating a sense of continuity. Question is if it's enough.
But the importance of the story can easily be seen at demoparties. Usually the, possibly quite ugly, demo that has a story wins over the demos that were technical masterpieces. It's the ones that combine the two that are amazing. The popular demo is one such demo; Debris is another one.
If you have any other thoughts on the subject of story from a VDJ perspective I'm very happy to hear them since it's a fascinating topic.
It's just a scene for the Plane9 visualizer, so you need to do the following.
I got a TSU9600 Philips color remote control quite a while back, and some Z-Wave devices to control the lights in my apartment. Of course I wanted to control one with the other, but the protocol the remote used wasn't open. It left me no choice but to reverse engineer it. I did it by creating a proxy program that talked to an RSX9400 extender. That allowed me to analyze everything the remote did while it believed it was talking to a real extender. The next step was to make it believe my own program was a high end RSX9600 extender, the point being that the TSU9600 could send raw serial commands to an RSX9600 extender. After some digging I got it working and wrote a Homeseer plugin for the rest of the world to enjoy. The end result is that you can configure, for example, a button on the remote to send the serial command 'lights on' to the Homeseer plugin, which picks up the command and allows any action in Homeseer to be triggered.
The second nerdiest thing I have done would be Plane9, a scene-based music visualizer and sound-sensitive screensaver.
You mean like Turing patterns? A half-decent multiscale Turing patterns approximation in realtime. Another biological-looking scene.
You might want to check out Plane9. It has about 140 free scenes that combine to form new variations, supports multiple monitors, is power-state aware (stops the screensaver if the monitor goes into powersave mode) and reacts to any music you have playing. Reallyslick is also well worth checking out.
Realtime shader compiling is very useful when doing shader coding, since you can quickly try things. But the danger is that if you accidentally write something like "for (;;) { }" you will lock up your graphics driver. Usually the driver recovers after a while though, at least Nvidia's does. If you want to try a realtime shader compiling editor, the only other one I know about is Plane9.
For those who wonder what a 2D fluid can be used for, here are two very simple ideas that could be fun to add to a Flash/Java fluid. Wave machine - this just adjusts the gravity of the fluid, so it's much simpler than it seems. Move to the music - this adjusts the gravity left/right depending on the beat and pushes the fluid down for stronger beats.
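Both ideas really are just a couple of lines; a sketch (function and parameter names are mine):

```python
import math

def music_gravity(beat_strength, phase, base_g=9.8):
    """Return the (gx, gy) gravity to feed the fluid solver this frame."""
    # 'Wave machine': tilt gravity sideways over time.
    gx = math.sin(phase) * beat_strength * base_g * 0.5
    # 'Move to the music': push the fluid down harder on strong beats.
    gy = base_g * (1.0 + beat_strength)
    return gx, gy
```

Feed it a beat strength in [0, 1] from your audio analysis and advance the phase each frame, and the fluid starts sloshing to the music.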
Fluid simulation is based on the Grand Kot one that has been floating around.
if you have windows, check out plane9. it’s a screensaver/visualizer that you can customize and has a lot of cool and trippy themes. some are kinda old school graphics-wise, which gives it that nostalgic feel.
Since, at least on Windows, Milkdrop runs through Winamp, you'd be better served to just play whatever you'd be using Foobar for in Winamp instead. Winamp and Milkdrop won't really be straining any resources for a livestream. Looking into it, you could use Plane9 to do something similar. I might add that into the main post.
For YouTube videos you could use my Plane9. Just use the studio application that it comes with, and add something in the form of "Visuals created using Plane9 (http://www.plane9.com)" to the description of your video. It has 250 scenes, so I would hope at least one would fit your music.
If you like the abstract kind of visualizer then you can't go wrong with Milkdrop, which comes with the Winamp music player. You can ask Winamp to sample from what you hear if you want to use another player. Another option is my own Plane9 visualizer. The scenes are however not usually abstract, so it's a matter of taste which you prefer.
I had postponed a major update release of my OpenGL Plane9 music visualizer pending the SDK release. Since I just added 0.6 support, are there any good features in 0.7 that require code changes to be used, or will they all just work with any 0.6 application?
There are, as far as I know, just a few visualizer developers left.
I would guess the problem is that on mobile devices people would rather listen to music for longer than view some visuals. For desktops there is little interest, since there is no money in creating visualizers, so all who try it commercially wither away. So if you're going to create one you have to do it purely for the love of it.
The best would be to use the official look of the Monstercat visualizer as offered by ConfirmedSFW; if that falls through, you can use my version of it, Monster spectrum. It's realtime, but the visualizer can be set up to record a movie from an mp3 file instead.
If you're after less abstract scenes, like iTunes has, then you can see if my Plane9 fits you. Just select "sample from microphone" in the configuration window. If it doesn't fit the bill then I suggest you go with dcurry431's suggestion.
http://www.plane9.com/ there is this screensaver program with some nice presets including awesome looking music visualizers and a cool matrix one. Some of them are really broken for multiple screens though.
I'm not sure if your laptop has good enough hardware, but you could try mine, called Plane9. It's still actively being developed and it also reacts to what you're currently listening to.
It took a while longer than planned, but Plane9 featuring this jello scene has now been released. If you like that one, I think you should also take a look at Wormhole; it's quite the trip.
Some scenes from my music visualizer/screensaver Plane9 can probably give it something to do, especially Cloud flight and Glass field.
That does indeed sound cool and something I would be interested in also. Plane9 does have a galaxy scene, although it's a much simpler one than the one you describe. It is at least in 3D and not a slideshow.
Vertex shaders aren't actually that fun (programs that are run for each vertex of a polygon). The real fun starts with pixel shaders: programs that are run for every pixel of a polygon. A few examples can be found at my Plane9 scenes page. But it's when you see things like a realtime glowing spike ball created in a single shader that you understand just how powerful shaders are. Oh and yes, all scenes move to the music too.
Depends on what you're looking for. For general programming I would recommend Processing (example creations). If you want to quickly hack together some shaders (not music enabled), start hacking in your WebGL-enabled browser at glsl.heroku.com. If you also want music to drive it you can take a look at Plane9. A good start would be to open the editor and bring up the Demoscene->Metaice scene. It's simple but very nice.
Depending on your needs in the OpenGL application, you could possibly just fake the whole thing using a shader and some Perlin noise. It would of course not match the quality of an offline-rendered, hand-modeled disco ball, but it might be enough for what you require.
Good work! Maybe a mirror/radial blur effect would help make it look more "full", for example like sound cube but reversed, so the effect moves into the screen. Just copy the screen, fade it out a bit and render a smaller copy of it. Also, I have to agree with the others that it doesn't feel like it matches the beat, even with the angle changes, but that might just be YouTube. As a visualizer programmer myself I know how difficult it is, since defining what we humans hear as a beat is notoriously hard.
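The copy-fade-shrink feedback loop, sketched out (Python, with a 1-D row of pixels standing in for the screen; names are mine, and on the GPU this is just a fullscreen quad sampling last frame's texture):

```python
def feedback_step(prev_frame, fresh, fade=0.9, shrink=0.8):
    """One frame of the feedback trick: fade the previous frame, draw it
    scaled down toward the centre, then add the new content on top."""
    n = len(prev_frame)
    out = []
    for i in range(n):
        # Destination pixel i samples the previous frame further from the
        # centre, so the old image appears shrunk and recedes into the screen.
        src = int(round((i - n / 2) / shrink + n / 2))
        old = prev_frame[src] * fade if 0 <= src < n else 0.0
        out.append(old + fresh[i])
    return out

frame = [0.0] * 8
frame = feedback_step(frame, [0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0])
frame = feedback_step(frame, [0.0] * 8)   # the spark fades and recedes
```

Run it every frame and anything bright leaves a trail that shrinks away into the distance.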
Another free one that can be worth checking out is Plane9 Matrix trails. Supports multiple monitors and reacts to the music that is playing for extra effect.
Quite a lot of the scenes in Plane9 are just one big CgFx shader, and new ones are uploaded every now and then. Converting a CgFx shader to OpenGL is basically just renaming float4 to vec4 and so on. It's a music visualizer and screensaver, so you have some use for your new shader.
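The mechanical renaming part can even be scripted; a rough sketch (the rename table is illustrative, not complete - semantics, saturate() and matrix conventions still need hand work):

```python
import re

# A few of the common Cg -> GLSL spellings.
RENAMES = {"float2": "vec2", "float3": "vec3", "float4": "vec4",
           "lerp": "mix", "frac": "fract", "tex2D": "texture2D"}

def cg_to_glsl(src):
    """Swap whole-word Cg identifiers for their GLSL equivalents."""
    pattern = r"\b(" + "|".join(RENAMES) + r")\b"
    return re.sub(pattern, lambda m: RENAMES[m.group(1)], src)

ported = cg_to_glsl("float4 c = lerp(tex2D(s, uv), base, frac(t));")
```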
If you just want to play around with some shaders you could try Plane9. You have to use the inbuilt (shader) editor though, so it might not be what you're looking for. But quite a few scenes use just a shader to do the effect.
I was trying to find a way to do this for quite a while for my own visualizer Plane9, but I couldn't afford a preprocessing step, so it turned out to be next to impossible. Finding the beat humans feel as the beat proved to be difficult, and I haven't been able to find any algorithm that can do it in a satisfactory way, realtime or otherwise.
In the end I had to settle for the old detection from Winamp. The results can be seen in the video (skip in 35 seconds or so). I do hope that someone releases a good algorithm for this someday, since it would make for some much nicer visuals.
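For reference, the baseline everyone starts from is instant-vs-average energy detection; a sketch with names of my own (to be clear: this is the common textbook approach, not the Winamp detection mentioned above):

```python
from collections import deque

def make_beat_detector(history_len=43, sensitivity=1.3):
    """Flag a beat when the current frame's energy clearly exceeds the
    recent average. This reacts to loudness, not rhythm - which is exactly
    why it often misses the beat humans actually feel."""
    history = deque(maxlen=history_len)

    def detect(samples):
        energy = sum(s * s for s in samples) / len(samples)
        average = sum(history) / len(history) if history else energy
        history.append(energy)
        return energy > sensitivity * average

    return detect

detect = make_beat_detector()
calm = [detect([0.01] * 64) for _ in range(10)]  # quiet frames: no beats
hit = detect([0.8] * 64)                          # sudden loud frame: beat
```

With ~43 frames of history at 43 frames/second you compare against roughly the last second of audio; tuning `sensitivity` trades false positives against missed beats.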
Plane9 might work for you. In its screensaver mode it reacts to any sound/music that is playing through the speakers. So to get it to work with a microphone you would need to configure it so that whatever the microphone hears is played through your speakers. If you can't do that then the program won't work for your problem.
If you have a somewhat fast graphics card you could try Plane9. Under Win7 & Vista the screensaver will react to the beat of the music. An editor and Winamp & WMP plugins are also included.
Do this, but then don't use normal alpha-blended textures. Instead use distance field textures. It's a lot simpler than it sounds and the result is just amazing. Here's an example of a font writer that uses a single one-component 256x256 texture with all ASCII chars encoded. Check the paper by Valve for the shader, and the tool used to create the texture.
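The heart of the distance-field shader is just a narrow smoothstep around the 0.5 distance value; here it is in Python for clarity (function names mine):

```python
def smoothstep(edge0, edge1, x):
    """Same curve as GLSL's smoothstep built-in."""
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def sdf_alpha(distance_sample, smoothing=0.05):
    """The texture stores distance to the glyph edge (0.5 = exactly on the
    edge); a narrow smoothstep around 0.5 turns that into crisp,
    resolution-independent anti-aliased coverage."""
    return smoothstep(0.5 - smoothing, 0.5 + smoothing, distance_sample)
```

Because the threshold is applied per-pixel at render time, the glyphs stay sharp at any magnification, which is why one tiny 256x256 texture is enough for a whole font.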
The idea is that you can try and explore the demo effects in the editor, for example how an android in a shader works. The shader is included, and most scenes are just one big shader. It's one of the few programs where you have a use for some strange post-processing effect that didn't fit in the demo you just made. The only extra you get with a key is the 10 extra scenes on top of the 115 you get for free. The software isn't crippled in terms of what a scene can do, so you don't get anything extra at all if you just want to use it for experimentation; then you're better off just using the free version. Try to redo the fire effect if you want to. It's 3 lines of pixel shader code and a gradient.
Maybe a galaxy works; if that's not international I don't know what is. Or some swirling colors, though those might be distracting. Bubbles are quite soothing.