Wired: Your images often have an organic basis. Could you say something about how you make them?
Buggy G Riphead: I like to start with simple things, things I see around me. So much computer art is parasitic on itself. Because the experience of making it is all to do with sitting in front of a screen, the work keeps coming back to the computer. You get this stale imagery of metal and circuit boards.
Much of my work is done outside. I go trekking and hill-walking, travelling through the real world, experiencing it, and then committing my experiences to digital form - so I can begin to manipulate them with the help of a computer.
Wired: And what are you trying to achieve?
Riphead: I try to bring out the hyperreality of simple natural objects by approaching them at macro- or micro-scales, from unusual angles, things like that. At the moment I'm fascinated by decaying London. I've been photographing corrosion, blooms of mould, concrete worn away by water over time. The colours are amazing.
Wired: You describe what you do as "ultramedia". How is that different from, say, hypermedia?
Riphead: For me, hypermedia describes the logical processes you go through to access information: a nonlinear way of navigating from one piece of data to the next. Ultramedia has larger ambitions. The aim is to travel through data in realtime, to watch it unfold before your eyes. You could make an analogy with embryology. What I want is to experience datasets, to go into them and restructure them just as I would any other aspect of my reality.
Wired: An ambitious project. What steps are you taking towards realising it?
Riphead: At EBV we're developing something called Boombox, which we hope will be a piece of software for the PlayStation, the new Sony console. The PlayStation has just been launched in Japan and it's a really exciting machine. With its dedicated geometry engine and high-powered graphics processor, it could potentially become an ultramedia platform for the home. But so far the software they're bringing out is just higher-res versions of the usual games, and we feel that doesn't exploit the machine's full potential.
Wired: So you're creating what?
Riphead: Boombox is a kind of audiovisual synthesiser, a "synaesthesia machine" based on my visual datasets and FSOL's sonic ones. Imagine travelling down a pulsating tunnel or over a rippling fractal landscape, controlling abstract shapes or objects to which you can assign functions - say, use them to manipulate the topography, the colour or the nature of the soundscape which accompanies you.
Wired: Give me an example.
Riphead: Perhaps you have four spheres hovering over a texture-mapped sea. If you fire or crash them onto the surface, depending on where they land, you'll get a particular global deformation of the five-part ambient harmony surrounding you. There will be a massive range of different zones. And the best thing is, you'll be able to save your trip and dump the whole lot onto VHS to play back later.
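Boombox's internals have never been published, but the mapping Riphead describes - a sphere's landing spot on the surface deforming a five-part harmony - can be sketched in a few lines. Everything here (the base frequencies, the two mappings, the function names) is invented purely for illustration:

```python
# Purely hypothetical sketch: map a sphere's normalised landing
# position (x, y) on the sea surface to a "global deformation" of a
# five-part ambient harmony. All names and mappings are invented.

BASE_HARMONY_HZ = [110.0, 165.0, 220.0, 275.0, 330.0]  # five voices

def deform_harmony(x, y, base=BASE_HARMONY_HZ):
    """x, y in [0, 1]. Returns a (frequency, amplitude) pair per voice."""
    voices = []
    for i, freq in enumerate(base):
        semitones = (x - 0.5) * 2 * (i + 1)      # x bends pitch, more so for upper voices
        ratio = 2 ** (semitones / 12)            # equal-tempered semitone ratio
        amp = max(0.0, 1.0 - y * i / len(base))  # y fades the upper voices
        voices.append((freq * ratio, amp))
    return voices
```

A landing dead centre (x = 0.5, y = 0) leaves the harmony untouched; in the scheme Riphead outlines, each of the "massive range of different zones" would presumably supply its own pair of mappings.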
Wired: So Boombox is going to be some kind of high-powered audiovisual composition tool?
Riphead: No, it's entertainment - though within certain parameters, yes, you'll be able to compose with it. It will be designed so you can do stuff just by blundering around and pressing buttons. But equally, you'll be able to take deeptime excursions into the submenus, picking out and sequencing exactly what you want to see. Once you've sorted out the basic tools, you have the basis for all sorts of ultramedia packages. It could even become a DJ thing, people being able to mix imagescapes like music.
Wired: What else have EBV got planned?
Riphead: Well, there's the film for one thing. We made a short pilot for it, called Lifeforms, and we're trying to get finance for Yage, the full-length version. Lifeforms was just a dry run; the real thing will be way better. We want to integrate live action and computer animation. There'll be actors, a narrative of sorts. I see it as a confluence of all kinds of different ways of seeing; kind of a cross between 2001 and Performance.
Wired: Confluence, as an idea, seems very important to you.
Riphead: Certainly. The convergence of different media is one aspect, but confluence is also to do with a way of working, of being with other people. EBV is a confluence of creative talents, not just in procedures and time resources, but also in ways of imagining, of minds. I can't think of anyone better than FSOL to be collaborating with, because their music adheres to these ideas. It corresponds to my own logic of creation. Somehow I don't think I could work with David Bowie or Tina Turner.
Wired: How about future tech? For example, what do you think are the possibilities for digitising touch? Is that something you'd like to add to your armoury?
Riphead: I think it's a long way off, and personally I'm convinced that gloves and bodysuits aren't the way it will go. But anything could happen. At a conference recently, a Sony executive was talking about "digitising thought". His English wasn't too good, and it was hard to work out what he meant by the concept, but it caught my imagination anyway. You could monitor alpha waves or other emissions and use them to modify 3D images - the first true mindscapes, a direct feedback loop between brain and screen. Now that's an idea.
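The feedback loop Riphead imagines can be caricatured in code: a measured alpha-band (8-12 Hz) level steers one parameter of a 3D scene on every frame. The "EEG" below is a stand-in signal, and every name and mapping is invented for illustration - nothing here comes from any real brain-computer interface:

```python
# Hypothetical sketch of a brain-to-screen feedback loop: an
# alpha-band level nudges the camera zoom of a 3D scene each frame.
# alpha_level() is a fake signal standing in for a real measurement.
import math

def alpha_level(t):
    # Stand-in for alpha-band power at time t, normalised to [0, 1].
    return 0.5 + 0.5 * math.sin(2 * math.pi * 0.1 * t)

def step_zoom(zoom, t, gain=0.05):
    # Feedback: higher alpha (relaxation) pulls the camera outward;
    # the viewer watches the image respond to their own state.
    target = 1.0 + alpha_level(t)          # desired zoom in [1, 2]
    return zoom + gain * (target - zoom)   # smooth approach each frame

zoom = 1.0
for frame in range(250):                   # ten seconds at 25 fps
    zoom = step_zoom(zoom, frame / 25.0)
```

The loop keeps the zoom within its target band while tracking the slowly oscillating signal - a toy version of the "direct feedback loop between brain and screen".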
Hari Kunzru (email@example.com) is a London-based freelance writer with interests in art, technology and culture. Buggy G Riphead can be reached at firstname.lastname@example.org, and Future Sound of London are at email@example.com. An FSOL Web site is on the way, but in the meantime try: hyperreal.com/music/artists/fsol/www/index.html