I have been playing around with various environments on and off for the last 5 years or so. I have tried diving into SuperCollider (http://www.audiosynth.com/) a few times but have never put anything hardcore together. I have also played with ChucK and Pure Data. Anyone play around with any of these?
I didn't get a chance to play with supercollider when I was studying music. Wish I did, but the class was after I left school. I knew a couple guys who were into that stuff, but not me personally. I'm getting good enough at other concepts in electronic music that the idea of designing my own soft synths is a long term goal.
Every now and again I find myself thinking something like "being able to program a sine or square wave into my filter frequency and make a table for how that interacts with MIDI data would be neat." I find myself knowing what I want to do from the perspective of altering the sound, but not what to do in the environment I use to actually make the sound change. That'll be something I learn later. It would be really cool to be able to design custom soft synths from the ground up.
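That sine/square-into-the-filter idea is basically an LFO mapped to cutoff, with MIDI scaling the depth. Here's a rough sketch of what that table could look like, just as a loose illustration; the names (`lfo_table`, `cutoff_for`) and the Hz ranges are made up, not any real synth's API:

```ruby
# Hypothetical sketch: build a one-cycle LFO wavetable (sine or square)
# and map a MIDI CC value (0-127) to a filter-cutoff offset in Hz.
TABLE_SIZE = 256

def lfo_table(shape)
  (0...TABLE_SIZE).map do |i|
    phase = i.to_f / TABLE_SIZE              # 0.0 up to (but not including) 1.0
    case shape
    when :sine   then Math.sin(2 * Math::PI * phase)
    when :square then phase < 0.5 ? 1.0 : -1.0
    end
  end
end

# The CC value scales the LFO depth around a base cutoff.
def cutoff_for(table, step, cc_value, base_hz: 1000.0, max_depth_hz: 800.0)
  depth = (cc_value / 127.0) * max_depth_hz
  base_hz + table[step % TABLE_SIZE] * depth
end

table = lfo_table(:sine)
cutoff_for(table, 64, 127)   # sine peak at step 64 -> 1800.0 Hz
```

In a real patch you'd step through the table at the LFO rate and feed the result to the filter every control block, but the table-plus-MIDI-scaling part is really that small.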
I want an orchestra of them all undulating in different waves at maximal volume at some 21st century avant-garde concert, and to figure out the acoustics of the room, and to program it to make the room shake. Get some super low bass tone at that frequency and shake the damn building like Tesla, and then make horrible sounds that sound like the speakers getting tortured, then drenched in reverb that modulates in and out of being compatible with the natural reverb of the hall, while the horrible screeching noises build and the room's shaking grows louder and louder. Then I want the speakers to catch fire and the sprinkler system to douse the audience before the low bass causes the ceiling to fall.
Damn I gotta figure out some music software design.
What's kinda crazy about approaching sound design, at least from a programmer's point of view, is all the concepts you have to learn about sound, then learning how other programmers decided to implement those concepts, and then sort of piecing together the universal approaches in a specific area. This was my major holdup, along with the really, really bad documentation for SuperCollider. On top of that, most of the code is insane when you try to take it apart and learn from it. I actually thought of digging into the C++ and learning it that way, but that code is pretty bad too. What other stuff are you into right now?
It's not programming, but all the stuff I'm working on right now is a kind of additive minimalism. I would say my influences on this project were Fennesz, Squarepusher, and Philip Glass.
I play relatively simple melodic patterns, repetitively but with variations in rhythm based on additive techniques. Then I polyrhythmically layer these patterns and add lots of knob twiddling and effects. I like to find patterns where you can play with people's harmonic perception by slightly altering the pattern additively. For example, if you have 10 eighth notes in 5/4 that imply C major, you can add a 3/8 group that repeats the d-b-d part, and with no other change, the perception of harmony shifts from an embellished C chord to an embellished G chord, and the groove changes, but the pattern stays basically the same. You can structure long pieces in various forms that way, maintaining a link to old sounds like fugue or early Renaissance music (whatever the source material for the polyrhythmic lines is), and consciously manipulate form. Sometimes I edit away most of it and just leave the few moments when the way the patterns interact creates really interesting music. Then I drench the simple lines in electronic manipulations.
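The additive trick is easy to picture as list surgery. A rough Ruby sketch of the 5/4 example, where the note names and the splice position are just illustrative (the point is only that repeating one cell grows 10 eighths into 13):

```ruby
# 10 eighth notes in 5/4, outlining C major (illustrative pitches).
base = %w[c e g e d b d c e g]

# Additive variation: repeat the d-b-d cell right after where it occurs,
# growing the pattern to 13 eighths -- a 3/8 group against the old 5/4 feel.
cell = %w[d b d]
idx  = (0..base.length - cell.length).find { |i| base[i, cell.length] == cell }

added = base[0..idx + cell.length - 1] + cell + base[(idx + cell.length)..-1]

added.length  # => 13
```

Same material, one extra cell, and the repeated d-b-d leaning on G-chord tones is what tilts the ear toward G.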
It's kind of a homebrew musical style I've been dreaming up in my spare time. I don't actually know how to play any of the instruments I'm using (except keys and laptop), and I'm new to Digital Performer for recording, so I get to learn all that as I go too.
google giles bowkett archaeopteryx. did a presentation today @ GoRuCo (Ruby conf in New York). started the presentation with a system autogenerating drum and bass rhythms through Reason.
not audio software; midi software. heavily lambda-fied. essentially a meta-sequencer, which allows you to write probabilistic templates, each of which can then generate infinite numbers of grooves conforming to the template.
obviously this is based on "This Is Your Brain On Music" and the nature of musical structure, in that our brains rapidly extract the meta-structures of music. therefore you should not compose in structures, but in meta-structures. currently an Archaeopteryx composition consists of a probability matrix and code. the probability matrix governs whether drums play frequently or infrequently; the code defines strategies for beat mutation.
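a loose sketch of the probability-matrix-plus-mutation-code idea, NOT Archaeopteryx's actual API (the names here are invented): each drum gets a per-sixteenth probability of firing, rolling the dice gives you a fresh bar conforming to the template, and a mutation strategy drifts the template over time.

```ruby
# One bar of sixteenths: each number is the probability that drum fires
# on that step. These particular values are just a plausible-looking groove.
MATRIX = {
  kick:  [0.9, 0.1, 0.2, 0.1, 0.6, 0.1, 0.2, 0.1, 0.9, 0.1, 0.2, 0.1, 0.5, 0.1, 0.3, 0.1],
  snare: [0.0, 0.1, 0.1, 0.1, 0.9, 0.1, 0.1, 0.2, 0.0, 0.1, 0.1, 0.1, 0.9, 0.2, 0.3, 0.3],
  hat:   Array.new(16, 0.8),
}

# Roll the dice once per step: true means the drum fires on that sixteenth.
def generate_bar(matrix, rng: Random.new)
  matrix.transform_values { |row| row.map { |p| rng.rand < p } }
end

# One possible mutation strategy: nudge every probability a little,
# clamped to [0, 1], so the template drifts while staying a template.
def mutate(matrix, amount: 0.1, rng: Random.new)
  matrix.transform_values do |row|
    row.map { |p| (p + rng.rand(-amount..amount)).clamp(0.0, 1.0) }
  end
end

bar = generate_bar(MATRIX)
```

every call to `generate_bar` is a different groove, but they all "feel" like the same matrix — which is the meta-structure point exactly. the real thing then pushes the resulting hits out as MIDI.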