new video: leftright

I've had the idea to do this kind of thing for a long time, so I'm glad to have it done.

A short video. I walked back and forth in front of the camera many times, then sorted the frames of the video by the position of my red hat. The audio is a short guitar bit I recorded, cut into pieces; the pieces were looped to create several layers of sound. The stereo effect comes from delaying the two channels of each piece by varying amounts, rather than the more standard method of varying their relative amplitudes.
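The frame-sorting idea can be sketched like this, a minimal version using NumPy: find the strongly red pixels in each frame, take the mean x coordinate of that region, and sort the frames by it. The thresholds here are made up for illustration; real footage would need tuning (and probably HSV thresholding rather than raw RGB).

```python
import numpy as np

def red_centroid_x(frame, red_thresh=150, other_thresh=100):
    """Mean x coordinate of strongly red pixels in an RGB frame.

    Thresholds are hypothetical placeholders, not the values actually used.
    Frames with no red pixels return +inf so they sort to the end.
    """
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    mask = (r > red_thresh) & (g < other_thresh) & (b < other_thresh)
    ys, xs = np.nonzero(mask)
    return xs.mean() if xs.size else float("inf")

def sort_frames_left_to_right(frames):
    """Reorder frames by the horizontal position of the red region."""
    return sorted(frames, key=red_centroid_x)
```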

I had intended to make the audio the same way as the video: do a stereo recording (using a stereo mic setup) of some sounds moving back and forth, then cut the sound into pieces and sort them by "leftness". I figured out how to measure the lag between the two channels, but with my setup the lag was too short, and it didn't actually correlate with the stereo position very well. I'll have to try it again with a pair of fairly widely separated microphones so the lag is a little larger.
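One standard way to measure that lag is cross-correlation: slide one channel against the other and find the offset where they line up best. This is a sketch of the general technique, not necessarily the code I used:

```python
import numpy as np

def estimate_lag(left, right):
    """Estimate the sample offset between two channels via cross-correlation.

    Returns d such that left[n] best matches right[n - d]; a positive
    result means the left channel lags the right one by d samples.
    """
    corr = np.correlate(left, right, mode="full")
    return int(np.argmax(corr)) - (len(right) - 1)
```

With a typical stereo mic spacing of a few centimeters, the maximum interchannel delay is well under a millisecond (sound travels roughly 34 cm per millisecond), i.e. only a handful of samples at 44.1 kHz, which is why a wider microphone spacing should make the estimate more usable.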

dueling axes: two-center bipolar version

I wrote some code in Processing so that I can participate in a future Dueling Axes event with BEAMS. In these events, audio performers generate visuals by using their two channels of input to independently control the x and y coordinates of a point, oscilloscope-style. I decided to try other coordinate systems; this video is my two-center bipolar proof of concept. Here, the audio channels are just a guitar and a slightly delayed copy of it.
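In two-center bipolar coordinates, the two input channels are treated as distances r1 and r2 from two fixed foci rather than as x and y directly. With foci at (-c, 0) and (c, 0), the Cartesian point follows from x = (r1² - r2²) / (4c) and y² = r1² - (x + c)². A minimal sketch of that conversion (in Python rather than Processing; the real sketch would also need to scale the audio samples into a usable range of radii):

```python
import math

def two_center_bipolar_to_xy(r1, r2, c=1.0):
    """Convert two-center bipolar coordinates (r1, r2) to Cartesian (x, y).

    r1 and r2 are distances from foci at (-c, 0) and (c, 0).
    Returns the upper-half-plane solution; (x, -y) is the mirror point.
    Raises ValueError when no real point lies at both distances.
    """
    x = (r1 ** 2 - r2 ** 2) / (4 * c)
    y_sq = r1 ** 2 - (x + c) ** 2
    if y_sq < 0:
        raise ValueError("no real point for these radii")
    return x, math.sqrt(y_sq)
```

Equal radii land the point on the y axis, which is one reason feeding in a signal and a slightly delayed copy of it makes a nice test: the point hovers near the axis and wobbles with the delay.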