
01 May

[radical] signs of life

Through responsive dance, [radical] signs of life externalizes the mind’s non-hierarchical distribution of thought. Music is generated from the dancers’ muscles and blood flow via biophysical sensors that capture sound waves from the performers’ bodies. This data also triggers complex neural patterns that are projected onto multiple screens as 3D imagery. As the audience interacts with the images produced, they enter into a dialogue with the dancers.

16 Apr

Matthew Davidson aka Stretta

Matthew Davidson aka Stretta is a talented guy. He’s an accomplished graphic artist and video producer/editor, but we talked to him about his music. Stretta’s music is lush, modest and dreamy in the tradition of Brian Eno, but it definitely has a character of its own. Stretta comes from a tradition of modular synthesis that led him to discover Max/MSP.

The following is reposted from an interview by Marsha Vdovin.

Tell me a little about your background.

I grew up in Iowa, and in 1988 I was trying to figure out what college to go to. I applied to one school, and found myself out on the East Coast, at Berklee College of Music. While I was at Berklee, one of my professors introduced me to Max. That was 1992. I’ve been using Max ever since.

I was interested in music and technology, so growing up in Iowa, pre-internet, all the information I had access to was books and magazines. It’s not at all like the hot and cold running information that we have on tap these days, where you can be anywhere in the world and learn about any subject very quickly.

As an example, I recently developed an interest in photography. This is a subject I knew nothing about. With the internet and the instant turnaround of digital photography, being able to see other people’s work and inspect the metadata, I was able to learn a great deal in a short amount of time. Today, taking up any new interest like electronic music is far easier than when I started. I can tell you that much.

Digital photography opened up so much for me. I was able to do it without spending money, which was incredible. And people all over the world were able to see my work, again without me spending money.

Right. The spending-money thing is analogous to what life was like before digital recording. A reel of tape costs money, so when you’re rolling, when you’ve hit the record button, there’s money at stake. It was the same thing with photography: you were burning film. Now it doesn’t cost anything to drop the shutter, and it doesn’t cost anything to play with digital audio. This accelerates the learning process.

I loved your Way-Geeky Time Line.

[Laughs.] You’ve done your research. Looking back, I realized that computers helped me express myself, so the correlation between operating systems or computers and what was happening in my life was significant. The first computer I had access to at home was an Apple Lisa. The first time I used it, it was like touching the future. It was like someone got in a time machine, kidnapped a computer, and brought it back to the present day. I’d never experienced anything like that before.

I guess you could apply the oft-used term “paradigm shift.” I hate to use that term, but I can’t really think of anything better to describe what it was like going from computers with a green phosphor screen to a black-and-white bitmap display where you click on objects and open them up. It’s not hyperbole to say that that changed my life.

I only had that machine for three months, then it was replaced by a 128K Macintosh. I was definitely one of the very early Mac users, and I’ve been fortunate in my choice of careers and work, as I’ve never had to use a Windows machine. Even in the dark days of the ’90s.

Did you take to Max right away?

I remember the night I was first exposed to Max. Afterwards, I stood outside Berklee, put my head back, and looked up at the sky, imagining how far this thing went. I recognized it, and I knew it was one of those things I could spend years playing with and never really see the end of its potential.

I’ve been very fortunate to watch Max evolve: sprouting audio, making all these technological leaps, and then the leap to OS X. It continues to be more capable, while retaining its essential core.

My favorite toy growing up was Legos, and I see a commonality. People I talk to who are into modular synthesizers, or into Max, there’s this commonality of “Did you play with Legos when you grew up?” “Yeah.” So it’s like that. It’s like Legos for music.

I like that granularity of control. It sits in this weird space, between commercial music applications and programming languages. Max is somewhere in between these two things. It allows you to create and customize your environment without programming and compiling.

I’m not a programmer—there is something about procedural languages, text-based, linear thinking that I don’t get along with. Max is non-linear, it moves in all directions, it’s real time. If you’re a guitarist, you understand how guitar pedals and patch cords work. You plug this into this and this other thing. I think this is a metaphor that is compatible with musicians.

If you understand these things, then understanding Max comes intuitively. When your creations evolve, and they tend to get more complex, you look back at it and you think, “How did I even understand this to begin with?” Because it looks really complicated. But then you break it down into smaller parts, and you can see how everything works.

So, is Max your primary music-making tool?

No. I would be surprised to hear anyone say that it is, simply because we live in this age where we have so many amazing tools available to musicians. There has never been a better time, from a technological standpoint, to be a musician. There are people who can dedicate themselves monk-like to a particular tool—Charles Cohen comes to mind. He’s been using a Buchla Music Easel for forty years. That’s his thing, and he knows it inside out. I admire that. We need people like that, who can dedicate themselves to an instrument, but I don’t have that kind of dedication.

Have you gone the Jitter road? Have you combined your photography with Max?

[Laughs.] No. Like I said, Max is one of those things that you could spend the rest of your life dedicating yourself to the possibilities, and not exhaust them all. Based on my interest in video and photography, I am definitely interested in Jitter, but I haven’t come close to exhausting all the ideas I have for audio and MIDI within Max yet.

I think if someone came to me and said, “You know, we want you to do a live performance, and we want there to be video,” yeah, [laughs] I would fast-track my Jitter education.

Also, I don’t think there are enough video-y applications for the Monome. The Monome is very audio-centric right now, and there isn’t any good reason for that. The Monome, in conjunction with Jitter, would be very powerful.

What is it about the Monome that draws you to it?

Probably it was all the years of Max prior to it. You spend all this time with Max, and then you think to yourself, “Gosh, I really wish I had a controller to go along with this, to provide input and feedback.” People would come out with controllers, and they would be overly specific, or they wouldn’t do the thing that you wanted to do.

Then you started seeing people building their own controllers. Do-it-yourself kits became available, like the iCube, where you could hook up sensors and other analog sources and it would provide a MIDI output. That was a good move forward.

But when I saw the Monome, I just thought, “Oh, of course. I know exactly what I would do with that.” I think that’s partly why Max has been the default language of choice for Monome developers. They’re very well suited for each other. There are no labels of any sort, there’s no pre-determined, prescribed usage to the Monome. It is exactly what you’re looking for if you’ve done anything in Max at all in the past.

How did the Max 5 change affect you?

I was using Max 4 up until about two or three months ago. I knew about Max 5, I knew what was going on with the environment, and I thought it was a very necessary, gutsy move for the company. From what I could tell, it took at least two solid years of engineering to redo the user interface from the ground up, with a completely new framework, while adding no new features or capabilities to the software. That’s the right way to do things.

If they were a larger company, they would find a way to screw it up. “You want to do what? For how long? That’s ridiculous.” But the change from Max 4 to Max 5 is as significant as the change from OS 9 to OS X.

I was talking to Nick Rothwell as recently as September, telling him that I thought it was time for me to move to Max 5. He said, “Well, once you start using Max 5, you’re never going to go back.” Intellectually I believed him, but deep down, I was like, “Yeah, well, we’ll just see about that,” because it is a big change. And oh, he was right. [Laughs.] I have a Max 5 license on one computer and a Max 4 license on another computer. I can’t bring myself to use Max 4 anymore.

I took to it in a fairly short amount of time. I think the main change, in terms of capabilities for Max 5, is being able to think in metrical units. You can think in terms of 16th notes and 8th notes, and you don’t have to worry about milliseconds, or converting to samples. That makes everything a lot easier. The idea of a global transport, and having access to metrical units, is a really big deal for me. That was huge.
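
To make that concrete for readers who never had to do the conversion by hand: in a millisecond-based world, every tempo change means recomputing every duration. A quick sketch of the arithmetic that metrical units hide (plain Python for illustration, not Max):

```python
def note_ms(bpm: float, division: int = 16) -> float:
    """Duration of one 1/division note in milliseconds at a given tempo.

    A quarter note lasts 60000 / bpm ms; a 16th note is a quarter
    of that, an 8th note half, and so on.
    """
    quarter_ms = 60000.0 / bpm           # one beat (quarter note)
    return quarter_ms * 4.0 / division   # scale to the requested division

# At 120 BPM a 16th note is 125 ms; change the tempo and everything shifts.
for bpm in (120, 93):
    print(bpm, [round(note_ms(bpm, d), 2) for d in (4, 8, 16)])
```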

What’s your favorite object?

The coll object.

And why is that?

I use it in every single patch. It’s familiar, like an old friend. I know it, and I know how to use it. I’m constantly learning new things about it. I think if you’re doing anything that manipulates or stores little bits of data, you have to get comfortable with the coll object.

It seems to be pretty fast. I don’t have any problem extracting data from it in a timely fashion. If you have a coll object and a metro, you have the entire basis of a whole variety of step sequencers with a timed beat. You can do all sorts of magic with just those two things.
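
For readers who don’t patch in Max: coll stores values at numbered indexes, and metro fires a trigger at a fixed interval. A rough Python analogy of the step-sequencer pattern he’s describing (an illustration of the idea, not Max code):

```python
import time

# Stand-in for coll: values stored at numbered indexes (MIDI notes here).
steps = {0: 60, 1: 64, 2: 67, 3: 72}  # a C major arpeggio

# Stand-in for metro: a fixed-interval clock driving the sequence.
INTERVAL_MS = 125  # 16th notes at 120 BPM

for tick in range(16):                # four passes through the pattern
    note = steps[tick % len(steps)]   # the counter feeds the coll lookup
    print(f"tick {tick:2d}: play MIDI note {note}")
    time.sleep(INTERVAL_MS / 1000)
```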

I don’t think it’s very sexy if you look at it. The object that I really liked before coll was table. It was more limited and approachable than coll, but it had a graphical interface. You had two-dimensional data that you could manipulate directly with the mouse. But the coll object is a lot more flexible. With the Monome, the face of the coll object is now tangible.

Often, I’ll peek inside patches of other developers to see how they do things. Sometimes, they’re doing some sort of complex mathematical abstraction, which is satisfying from an intellectual point of view, but I’m more likely to simply dump the values I want into a data object like coll. It kind of feels like cheating, but it gets the job done.

Have you been working in Max for Live?

Yeah. Most of the work I’ve been doing recently has been in Max for Live.

I think if Max by itself had a weak point, it would be that it doesn’t have a decent timeline. A timeline is one of those features that represents infinite mission creep. Ultimately, what you want is a full-featured DAW. So, putting Max inside a mature DAW is the best solution here.

Prior to Max for Live, most of the things I made were of interest only to me, due to the dependencies involved. In the beginning, the dependencies were racks of hardware. At Berklee, I had codified what I learned about harmony into software, but to make it do anything you had to use external synthesizers and sound generators. Nothing ever made it out of the lab.

Later, you could use soft synths, but that still involved a lot of setup. You had to load the virtual instruments and effects, perform complicated routings, and deal with sync issues. It wasn’t really plug-and-play. I couldn’t take this, and then give it to someone else, and have it be as useful for them.

Now, with Max for Live, the things that I make are suddenly portable to other people. I can make these little tools, these little performance things, that take real-time input and then output something that’s musically interesting.

That also has ramifications for live performance. I did a recent video using Max 5. It involved a software harmonizer, effects, recording multiple tracks into a DAW, and complex MIDI routings, and that’s like, four different applications, all combined. It took a good hour or so to set this one performance piece up. So, it’s not easy for me to reproduce that performance again, let alone string together a set of pieces to perform. Now, with Max for Live, you can put all these combinations of elements together, all of your soft synths, all your routing, all your effects in this one environment, and save it. Then you can recall it. I can’t tell you what a huge thing that is.

Max for Live also addresses the issue of a DAW trying to be all things to all people…

But they try to be.

Well, they try to be, and that’s where the user interface breaks down. The application sprouts these weird appendages, and after two years of that you end up with something that becomes incomprehensible and unmaintainable. Especially if you’re not willing to take the time to go in and refine the user interface, or piss off your existing user base by throwing out old, crusty features that a small percentage of your user base relies on. But if you jettisoned that code, then you could bring your DAW forward, develop faster and make your code more reliable.

So what Max for Live does for Live users is it allows people to create this customized environment to do the things that they need to do, without bringing the entire DAW down.

So you can see how Max and Live need each other. Max gets a fully featured timeline, and Live gets a mature environment for user customization.

So, you’re giving away your Max for Live ‘Monome suite’?

It’s free for anyone to download. I’m beta testing a new release right now that adds support for multiple Monomes. So if you have multiple Monomes, you can have one that’s switching between these applications, and another one that’s switching between another set of applications.

I’m replacing all the user interface objects with Live objects, and that enables parameters to be stored and automated. I just sent out a beta of that yesterday, and I’ll hopefully be getting some bug reports and actually making that an official release in the near future.

So, what is Stretta?

Stretta began as a vanity record label. I bought the domain back around 1996. It became clear to me that people weren’t buying music, so the idea of a record label really didn’t make sense anymore. At the same time, I was noticing the importance of personal branding on the internet, because there are so many forces competing for attention. If you release something, you’ll see a huge spike of interest that falls off rapidly. It doesn’t matter if you spent two years or two days working on something; you’ll see the same spike, and then everyone moves on to the next thing. From that I concluded that the better strategy is to release smaller things on a more consistent basis, and this is where having a memorable brand becomes useful.

‘Matthew Davidson’ is not very memorable, and it is kind of long. So, since I already had the domain—and short, pronounceable domain names are a rare commodity these days—I use Stretta. It is short and memorable and consistent across all these social media platforms.

16 Apr

Korinsky Studios 3845 m/s Sound Installation

Korinsky Studio consists of Abel, Carlo and Max Korinsky. They focus on a shared passion: exploring the possibilities of sound on vertical surfaces. 3845 m/s, their newest installation, runs on their own software in a former coal power plant in Berlin. See the Korinsky Studio website for more information about their work.


Documentary about the work of Berlin-based art collective “Korinsky – Atelier für vertikale Flächen” and their sound installation 3845 m/s.

14 Apr

Missing: An Interactive Installation by The xx at Sonos Studio

Missing, an interactive installation by The xx, Kyle McDonald, Aramique and Matt Mets, explores the concept of the album “Coexist” through the relationship of man and machine. Fifty robotic Sonos players follow movement inside Missing’s dark, emotional landscape.

Visit the Sonos Studio (145 N. La Brea in Los Angeles) from Nov. 15th – Dec. 23rd to see the installation in person, and sign up for e-mail invites to upcoming events: http://www.sonos.com/studio

12 Apr

electric dharma wheels

Download this track for free at https://soundcloud.com/stretta/holocene

Hardware: monome arc4. Software: electric dharma wheels. More about the monome arc4, including how to buy one, can be found at http://monome.org.

I received a production-run arc4 with the final firmware on Friday. This signaled a mad scramble to update my work for it and for the latest serialosc with arc support, so everything will be ready when people start receiving their units in a few days. So what do I do on Saturday? Make a new app, of course. Sure, that totally sounds like the responsible thing to do.

After receiving the arc4, I thought it might be a good idea to produce an example that demonstrates a ‘bank’ of encoder values that you can switch between. That gave birth to an application idea involving triggering modal notes from a pool of probabilities across three octaves of scale degrees. There is a separate bank of pitches depending on clockwise or counterclockwise rotation so you can shift the harmony with a simple gesture. The weighting of scale degrees is programmable and editable in real time on screen or with a MIDI controller. This allows for a more controlled structuring of compositional development over longer periods of time. The speed of the rotation determines how often a note is triggered, and can also be used as a modulation parameter for the FM synthesis engine.
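
The patch itself isn’t shown in the post, but the core mechanism (a weighted pool of scale degrees spanning three octaves, with the weights editable in real time) is easy to sketch. Here is a hypothetical Python illustration; the scale, weights, and names are mine, not taken from the app:

```python
import random

# One octave of a Dorian scale in semitones; the app spans three octaves.
DORIAN = [0, 2, 3, 5, 7, 9, 10]
ROOT = 48  # MIDI note C3

# Editable per-degree weights: a higher weight makes that degree more likely.
weights = [5, 1, 3, 2, 4, 1, 2]

def trigger_note() -> int:
    """Draw a scale degree from the weighted pool, in a random octave."""
    degree = random.choices(range(len(DORIAN)), weights=weights, k=1)[0]
    octave = random.randrange(3)  # three octaves of scale degrees
    return ROOT + 12 * octave + DORIAN[degree]

# Rotation speed would set how often this fires; here we just take 8 draws.
print([trigger_note() for _ in range(8)])
```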

Relevant synthesis parameters are also editable on the arc as the notes are triggered. The state of these parameters is overlaid on the LEDs, so interesting patterns emerge when this mode is engaged. There was a really awesome bug where switching editing modes also transposed the output modally, so I built in a score feature that allows you to advance a programmed chord progression with a button push.

A sit-the-arc-in-your-lap-and-doodle app has been on my mind a lot and I have at least three good starts in this area, but other priorities have often pushed these out of the way. The prototype arc2 I had lacked the mounting bracket for the USB cable and the logic board was floating free inside the enclosure, so I always had to use it (carefully) on a stationary, flat surface. It is really nice to have an arc that can be moved around or used in the lap. My cat disagrees.

I recorded this video, holocene, as a demonstration of this app, which I’m calling electric dharma wheels. This is the raw output from the electric dharma wheels, with some Eos reverb added after the fact.

09 Apr

Will Wright & Brian Eno

Will Wright and Brian Eno Play with Time from The Long Now Foundation on FORA.tv

Game designer Will Wright and musician Brian Eno discuss the generative systems used in their respective creative works. This clip features original music by Brian Eno.

Will Wright and Brian Eno on “Playing with Time.”

In a dazzling duet Will Wright and Brian Eno give an intense clinic on the joys and techniques of “generative” creation.

Back in the 1970s both speakers got hooked by cellular automata such as Conway’s “Game of Life,” where just a few simple rules could unleash profoundly unpredictable and infinitely varied dynamic patterns. Cellular automata were the secret ingredient of Wright’s genre-busting computer game “SimCity” in 1989. Eno was additionally inspired by Steve Reich’s “It’s Gonna Rain,” in which two identical 1.8-second tape loops beat against each other out of phase for a riveting 20 minutes. That idea led to Eno’s “Music for Airports” (1978), and the genre he named “ambient music” was born.
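
For anyone who hasn’t met Conway’s Life: the whole system is a handful of neighbour-counting rules on a grid, yet it produces exactly the unpredictable dynamics both speakers describe. A minimal sketch in Python:

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Game of Life over a sparse set of live cells."""
    # Count live neighbours for every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = life_step(glider)
print(sorted(glider))  # after 4 steps the glider has moved one cell diagonally
```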

The Long Now Foundation was established in 01996 to develop the Clock and Library projects, as well as to become the seed of a very long-term cultural institution. The Long Now Foundation hopes to provide a counterpoint to today’s “faster/cheaper” mindset and promote “slower/better” thinking. We hope to creatively foster responsibility in the framework of the next 10,000 years. – The Long Now Foundation

09 Apr

Computers, Data, and Humanity

Generative Art – Computers, Data, and Humanity | Off Book | PBS

An intriguing combination of programmers, artists, and philosophers, these creators embrace a process that delegates essential decisions to computers, data sets, or even random variables. This allows important metaphors to arise in their work, calling attention to the relationship between humans and the computers that surround us, the mountains of information we generate, and the powerful impact that technology has on our relationships with each other.

Featuring:
Luke Dubois, Generative Composer
Scott Draves, Generative Artist
Will Wright, Game Designer

Music by:
Codex Machine
Luke Dubois
Revolution Void
Tryad
Reno Project

09 Apr

Years by Bartholomäus Traubeck

A record player that plays slices of wood.
Modified record player, wood, sleeves. 2011

A tree’s annual rings are analysed for their strength, thickness and rate of growth. This data serves as the basis for a generative process that outputs piano music. It is mapped to a scale that is in turn defined by the overall appearance of the wood (ranging from dark to light, and from strong to light texture). The foundation of the music certainly lies in the defined ruleset of the programming and hardware setup, but the data acquired from each tree interprets that ruleset very differently.
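
As a purely hypothetical sketch of that kind of mapping (Traubeck’s piece was built in vvvv; the scale, data, and mapping choices below are invented for illustration):

```python
# Hypothetical sketch: ring measurements in, scale-constrained notes out.
A_MINOR = [57, 59, 60, 62, 64, 65, 67]  # one octave of A minor, MIDI numbers

# (thickness_mm, growth_rate) per annual ring, innermost first -- made-up data.
rings = [(1.2, 0.8), (2.0, 1.1), (0.7, 0.5), (3.1, 1.6), (1.5, 0.9)]

max_thickness = max(t for t, _ in rings)

for thickness, rate in rings:
    # Thicker rings map to higher degrees of the scale...
    degree = round(thickness / max_thickness * (len(A_MINOR) - 1))
    # ...and faster growth maps to louder notes.
    velocity = min(127, int(40 + rate * 50))
    print(f"play MIDI note {A_MINOR[degree]} at velocity {velocity}")
```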

Thanks to Land Salzburg, Schmiede, Pro-ject Audio, Rohol Furniere, Karla Spiluttini, Ivo Francx, vvvv.

Bartholomäus Traubeck
traubeck.com

08 Apr

Imogen Heap Performance with Musical Gloves Demo

“It’s the sustain! It’s never done that before!” Imogen Heap breaks out of a captivating performance of a song written just three weeks ago for a piece of tech she’s had to wait two-and-a-half years to get her hands on.

Covering Heap’s hands, arms and back are a series of wires. Two LEDs blink on the back of her hands. She adjusts a setting on her computer and composes herself in the centre of the stage, eager to continue the performance. Despite the minor hitch, the Wired 2012 audience are still captivated by the award-winning musician — if anything, the error only makes her passion for the new technology all the more obvious.

Heap told Wired 2012 that before she got her hands on her “magical gloves”, she would make music with an array of instruments and virtual instruments, along with Ableton music software: “Basically, inside this software I can play virtual instruments and loop things, add layers and textures that I spend hours working on in my basement. But I wanted to bring those sounds on stage with me. I strapped keyboards onto me, had microphones attached to my wrists so that I can mic up wine glasses or guitars or whatever I wanted to record. The problem was, how could I do this on the move?

“A lot of what I do, like adding a huge reverb to an instrument, is done by pressing a button on a keyboard — which isn’t very exciting. You can’t even see what I’m doing,” Heap said, picking up a synth and pressing said button. “I could be checking my email for all you know. Fifty percent of the show gets hidden. I wondered how to make it fluid on the stage without these buttons — I wanted to make a gesture like this [she throws her arm out in a wide arc] to add the reverb so you could see and hear the sound, which is much more interesting than turning a pot around.”

Heap encountered a technology that would inspire her own musical mittens when she visited the MIT Media Lab two-and-a-half years ago. There she met Elly Jessop, whose gesture gloves left an impression on Heap: “What she’d done was simple, or rather the idea was simple,” says Heap. “[Elly] sang a note and moved her hand — a gesture that let her control the grain of the note. She could change vibrato or select a harmony with movement. When I saw this combination of music and movement intuitively combined, I wanted to get involved.”

The device that Heap is wearing on the Wired 2012 stage has taken her and a team from the University of the West of England several years to develop. As well as the tools that she’s wearing, there’s an Xbox Kinect at the back of the stage that translates Heap’s position into different effects and layers. “It’s like the floor is my playground, so I can walk into different sections to control the sound — I can step into a choir of ‘mes’.” Heap steps into a section of the stage and sings a note, which is instantly harmonised by an invisible choir.

“So it’s not just a controller, it’s really an instrument,” she explains. “The way we program the gestures is the same as playing an instrument in 3D space. As I walk around the stage you can see that I’m walking into a different set of effects. My proximity to the audience is also part of the performance — so when I’m further away from the audience the sound is a lot bigger, but when I get closer to the audience it becomes more intimate.”
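
None of the mapping code is public here, so purely as a toy illustration of the idea she describes (stage position selecting an effect zone, distance from the audience scaling the sound; the zones and parameters are invented):

```python
ZONES = {
    "stage_left":  "harmonizer",      # e.g. the 'choir of mes'
    "centre":      "looper",
    "stage_right": "granular_delay",
}

def zone_for(x: float) -> str:
    """Map stage width (0.0 left to 1.0 right) onto three effect zones."""
    if x < 1 / 3:
        return "stage_left"
    if x < 2 / 3:
        return "centre"
    return "stage_right"

def reverb_size(depth: float) -> float:
    """Further from the audience (0.0 near, 1.0 far) means a bigger reverb."""
    return 0.2 + 0.8 * depth

# e.g. position values derived from a Kinect skeleton
x, depth = 0.5, 0.9
print(ZONES[zone_for(x)], f"reverb size {reverb_size(depth):.2f}")
```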

Having demoed the technology, Heap began her performance — a remarkable mix of song and movement. Despite the occasional glitch, it was an incredible spectacle. Should you have the chance to see Heap with her new technology, you’re in for a treat.

15 Mar

CocoRosie

I was looking around for new music the other day and came across CocoRosie. I found this video and was totally blown away by their unique style and creativity.