Entries in sonification (4)

Monday, May 25, 2015

Dolmen


During this year’s Sonic Acts, one of the installations that stuck with me was Mario de Vega’s DOLMEN. It hung in the main hall of the Muziekgebouw and made a whole lot of noise.

Consisting of various high-frequency receivers, radio scanners and custom electronics based on logarithmic detectors, it is an intervention that explores the boundaries of human perception, as well as the social, political and physical impact of telecommunications technology, by making the wireless signals in a space audible. If you stood below it and made a call, you could hear the normally inaudible carrier wave being amplified by de Vega’s installation. In that way it was also semi-interactive, and it made you acutely aware of all the different signals we surround ourselves with: not only the waves of cellular phones were audible, but also the carrier waves of the radios on the ships floating by.
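For a rough sense of what those logarithmic detectors do, here is a loose software analogy in Python. It assumes complex IQ samples from something like a software-defined radio and is in no way de Vega’s actual electronics: the instantaneous RF power is converted into a dB-scale envelope, and that envelope is played back directly as audio, so a phone keying up its carrier below the installation becomes an audible change in level.

```python
# Loose software analogy of a logarithmic detector, not de Vega's hardware:
# convert the instantaneous power of received RF samples to a dB-scale envelope
# and use that envelope directly as an audio signal.
import numpy as np

def rf_power_to_audio(iq: np.ndarray, rf_rate: int, audio_rate: int = 44100) -> np.ndarray:
    """Turn a block of complex IQ samples into a normalised log-power audio envelope."""
    power_db = 10 * np.log10(np.abs(iq) ** 2 + 1e-12)   # logarithmic detection
    decimation = max(1, rf_rate // audio_rate)
    envelope = power_db[::decimation]                    # crude downsample to audio rate
    envelope = envelope - envelope.mean()                # remove the DC offset
    peak = np.max(np.abs(envelope))
    return (envelope / peak if peak > 0 else envelope).astype(np.float32)
```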

Friday, April 3, 2015

Sonify... Earthquakes Worldwide

In the “Sonify…” series, we look at different ways of sonifying data. This time: earthquakes! We often think of sonification as an algorithm that translates data into an often abstract, digital sound. R x2 by Moscow-based media artist Dmitry Morozov, a.k.a. ::vtol::, is different in that respect.

R x2 is a kinetic sound sculpture that collects data on shocks in the earth’s crust and captures every earthquake above magnitude 0.1 on the Richter scale. On an average day there are up to 200 of these quakes.

The data is converted into signals that control motors connected to a set of acoustic Thunder Drums. A Thunder Drum has a spring attached to the drum skin; when it is shaken, the spring moves and creates a continuous resonance through the body of the instrument, not unlike the rumble of thunder. That rumble fits the character of a sonified earthquake quite well.
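The post doesn’t include ::vtol::’s code, but the pipeline is easy to sketch. Below is a minimal Python approximation, assuming the public USGS earthquake feed as the data source and a hypothetical drive_motor() function standing in for whatever motor-controller interface the installation actually uses.

```python
# A minimal sketch of the general idea, not ::vtol::'s actual code: poll a public
# earthquake feed (here the USGS GeoJSON summary) and map each quake's magnitude
# to a drive level for a motor striking a drum. drive_motor() is a hypothetical
# stand-in for the hardware layer (e.g. a serial link to a motor controller).
import time
import requests

USGS_FEED = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_hour.geojson"

def drive_motor(level: float) -> None:
    """Placeholder for the hardware side: 0.0 = rest, 1.0 = hardest shake."""
    print(f"shaking drum at {level:.2f}")

def magnitude_to_level(mag: float, max_mag: float = 8.0) -> float:
    """Map Richter magnitude to a 0..1 motor drive level."""
    return max(0.0, min(mag / max_mag, 1.0))

seen = set()
while True:
    quakes = requests.get(USGS_FEED, timeout=10).json()["features"]
    for quake in quakes:
        if quake["id"] in seen:
            continue
        seen.add(quake["id"])
        mag = quake["properties"]["mag"] or 0.0
        if mag > 0.1:                      # same threshold the piece uses
            drive_motor(magnitude_to_level(mag))
            time.sleep(1.0)                # let the spring resonate
    time.sleep(60)                         # poll roughly once a minute
```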

Thursday, December 25, 2014

Sonify... Wi-Fi

With the rise of “the cloud” and other services that rely on wireless internet, we live under the impression that Wi-Fi is constantly around us. Yet if you travel as much as I do, you’ll still often find yourself hunting for Wi-Fi. And even when it is around, the very fact that it’s wireless makes it a somewhat intangible technology.
In the “Sonify…” series, we look at different ways of sonifying data. This time: Wi-Fi signals. Phantom Terrains is a collaboration between science fiction writer Frank Swain and sound artist Daniel Jones. Swain has been slowly going deaf since his teens, and he has been thinking about using his hearing aids in different ways.
Phantom Terrains works by receiving wireless signals on a hacked iPhone and sending the sound to Swain’s hearing aids, connected via Bluetooth. After a few months of testing and experimentation, they have released the first audio of their sonification. Wi-Fi is a very rich signal, almost as rich in data as our physical surroundings, and in urban life it is almost constantly around us, so Jones made sure it sounds like something that would not be too intrusive and could be around you all day. Distant networks are heard as a gentle clicking, which ticks more frequently as the wearer gets closer.
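The Phantom Terrains code isn’t public as far as I know, but the clicking layer of the mapping can be sketched in a few lines of Python: take a signal-strength reading (RSSI), map it to a click rate, and render a click train to audio. How the RSSI is obtained is platform-specific and left out here.

```python
# A rough sketch of the mapping described above, not the Phantom Terrains code:
# turn a Wi-Fi signal strength (RSSI in dBm) into a train of clicks whose rate
# rises as the network gets stronger/closer.
import wave
import numpy as np

SAMPLE_RATE = 44100

def rssi_to_click_rate(rssi_dbm: float) -> float:
    """Map a typical RSSI range (-90 dBm far, -30 dBm near) to 1..20 clicks/s."""
    strength = (rssi_dbm + 90) / 60          # 0.0 (weak) .. 1.0 (strong)
    strength = max(0.0, min(strength, 1.0))
    return 1.0 + 19.0 * strength

def click_train(rssi_dbm: float, seconds: float = 2.0) -> np.ndarray:
    rate = rssi_to_click_rate(rssi_dbm)
    samples = np.zeros(int(SAMPLE_RATE * seconds), dtype=np.float32)
    click = np.hanning(64).astype(np.float32)  # a short, soft tick
    for t in np.arange(0, seconds, 1.0 / rate):
        start = int(t * SAMPLE_RATE)
        samples[start:start + len(click)] += click[: len(samples) - start]
    return samples

# Example: a nearby network (-40 dBm) ticks much faster than a distant one (-85 dBm).
audio = np.concatenate([click_train(-85), click_train(-40)])
with wave.open("wifi_clicks.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(SAMPLE_RATE)
    f.writeframes((audio * 32767).astype(np.int16).tobytes())
```

Rendered this way, a nearby network is a fast ticking and a distant one an occasional soft click, which roughly matches the description above.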
I think the idea of an application in which you could choose between different layers of sonified data is very promising. What other signals could we sonify?
Monday, December 8, 2014

Sonify... Wikipedia

Sonification, and especially data sonification, is still an underused technique. I’ve been quite interested in sonification, and have heard both very useful and utterly rubbish applications. I’ve been trying to wrap my head around which sonifications work, and which don’t.

In the “Sonify…” posts, I will look at different ways of sonifying data. This time: sonifying Wikipedia. Listen to Wikipedia by Hatnote is a sonification and visualisation of the changes being made to Wikipedia. Hatnote is Mahmoud Hashemi and Stephen LaPorte, both interested in “Wiki life”.

“Listen to Wikipedia” sonifies edits to Wikipedia articles in real time. Bell sounds indicate additions and string plucks indicate subtractions, and the pitch changes according to the size of the edit. It’s worth noting that Wikipedia is maintained by both bots and humans, and it’s only through web experiments like this that we can see or hear that labour force.
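Hatnote’s own implementation runs in the browser, but the same mapping can be sketched against Wikimedia’s public EventStreams feed: each edit event carries the old and new article lengths, so the sign of the change picks the sound and the size of the change sets the pitch. The play_sound() function below is a hypothetical placeholder for an actual synth.

```python
# A small sketch of the same mapping, not Hatnote's own code: read Wikipedia's
# public recent-changes stream and turn every edit into a sound event, with the
# size of the edit deciding the pitch. The EventStreams URL is Wikimedia's real
# SSE endpoint; play_sound() is a hypothetical placeholder.
import json
import requests

STREAM = "https://stream.wikimedia.org/v2/stream/recentchange"

def play_sound(kind: str, size: int) -> None:
    """Placeholder: 'bell' for additions, 'pluck' for subtractions."""
    # Bigger edits -> lower pitch, clamped to a rough musical range.
    pitch_hz = max(110.0, 880.0 / (1 + size / 500))
    print(f"{kind:<5} edit of {size:>6} bytes -> {pitch_hz:.0f} Hz")

with requests.get(STREAM, stream=True, timeout=60) as resp:
    for line in resp.iter_lines():
        if not line or not line.startswith(b"data:"):
            continue
        change = json.loads(line[len(b"data:"):])
        length = change.get("length") or {}
        delta = (length.get("new") or 0) - (length.get("old") or 0)
        if delta > 0:
            play_sound("bell", delta)
        elif delta < 0:
            play_sound("pluck", -delta)
```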

What do you think? Is this a good sonification of the data of Wikipedia?