MEDIA TECHNOLOGY [Digital Media PhD] | Professor Nuno Correia



B-wind! A RTiVISS experience

What is the feeling of becoming the wind, an invisible power with a visible physical effect on trees? Will you cherish the leaves, or will you trigger a hurricane? B-wind!

B-wind! works with particle systems for responsive, expressive wind effects, as if translating motion into abstract poetry. The proposal is tied to the motto of the “butterfly effect”: the wind waves provoked by a subtle flicker of motion have a “hurricane effect” in a remote place.
A prototyped component adds the power to apply the wind effect to the forest itself: the motion tracking in the installation has a real, amplified effect on real trees, in real time, through a microcontroller driving a fan that produces wind visible within the frame of the video. Confronted with such possibilities, multiple questions arose during the user experience: is this pleasant? Is it strange? Will the participants “spread their wings” and feel the freedom to cherish the trees? Will they explore the superpower of generating wind, somehow “competing against the machine”? Or will they, like children bursting with energy, join the celebration of nature without processing causes or consequences – just being... the wind!

Technically, B-wind! is realized as two interconnected installation spaces. The first presents the user with a projection of a live video stream of a remote forest space. A camera records the user's body motion in real time, and custom video-processing software written in openFrameworks with OpenCV analyses this motion data and uses it to render particle effects showing the user's influence over the live video stream.

Simultaneously, it sends control signals to wind generators at the remote forest location. There, custom software, also written in openFrameworks, receives the network control signals and forwards them to an Arduino-based electronic circuit that controls the speed and motion of an array of fans. A video stream captured at the remote location is streamed in real time to the installation space using Darwin Streaming Server and decoded on site by a custom GStreamer pipeline.

B-wind! was born in the context of the RTiVISS research project, which explores networking practices and actions that help change current behaviour regarding environmental protection, promoting new activities that move society towards more inclusive modes of production and of sharing knowledge for the design of a better world.


B-WIND! Interactive installation
at AZ Labs @ O Espaço do Tempo

LIVE STREAM from the exhibition »


AMARAL, Bruno, FIGUEIREDO, Rita [2007], Blow me, installation, Prémio Mapa, Universidade Católica

GONÇALVES, André [2007], I Thought Some Daisies Might Cheer You Up, installation, Interactivos?, Media Lab Madrid

MADEIRA, Rui [2009], Parque, interactive installation, Parque da Paz – Almada, Portugal

SOMMERER, Christa, MIGNONNEAU, Laurent [1993], Interactive Plant Growing, installation, ZKM – Center for Art and Media

WATSON, Theodore, GOBEILLE, Emily [2009], Funky Forest – Interactive Ecosystem

WOOLFORD, Kirk [from 2008], Shooting the Wind

WOOLFORD, Kirk [2007], Will.0.W1sp – Installation Overview, in Proceedings of the 15th ACM International Conference on Multimedia, Augsburg, Germany, pp. 379-380

WOOLFORD, Kirk [2008], Bhaptic

O'SHEA, Chris [2009], Hand from Above, written using openFrameworks and OpenCV

Forest Fire Detection Robot
European Year of Creativity and Innovation




Video Based Environments – State of the Art
Foreseeing B-wind Interactive Installation

Interactive experiences engaged with the use of technologies are increasingly embodied in video-based environments. Whereas real-time video has mainly been used as a functional tool for surveillance, targeted at informational and safety purposes, this resource holds enormous potential for artistic exploration, so far only glimpsed in a few significant, though technologically timid, art experiments.
With the development of the real-time video interactive installation B-wind in mind, this paper highlights the state of the art of video-based environments. From a bird's-eye view of the concepts behind selected case-study artworks to an insight into the technologies they apply, this research aims to support the B-wind experience, to open new perspectives on it, and to describe the technologies underlying it.

Live video, video streaming, real-time processing, motion capture, particle systems, experimental design, interactive installations, sustainability, social responsibility.




I. Theme proposed 2010/04/27: "B-wind: a RTiVISS experience"

II. System Architecture

B-wind! [v2.0] interactive installation generic version


B-wind! [v1.0] version presented at "AZ labs @ O Espaço do Tempo" exhibition

The system developed involves three main components:
1. a working prototype of the fan turning on and rotating according to X-axis mouse movements, upgraded to a version driven by video tracking of the user captured by the installation's webcam;
2. the real-time video capture, and the code for processing the streamed video in openFrameworks;
3. particle-field effects generated from the user's performance at the local installation, captured by the same tracking webcam, and embedded in the streamed video of the remote location.


III. B-wind! Software issues
Tools used: openFrameworks, GStreamer, openCV, Darwin Streaming Server, QuickTime Broadcaster
Going deeper into the implementation of the project, we can now specify some of the functionalities of the applications featured in B-wind.
The video is captured at the forest by a network webcam and, using QuickTime Broadcaster, streamed over a wireless network to the server where the open-source Darwin Streaming Server has previously been installed (FBAUL's).
From there it is finally sent to the installation. At that exhibition destination, openFrameworks extracts the encoded H.264 video from its container, decodes it, and converts it to RGB, drawing the streamed video (currently with a lag of about 9 seconds caused by the network, in tests with a good frame rate).
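The receiving side of this chain can be approximated with a single GStreamer launch line. The pipeline below is a hypothetical config fragment for illustration only: element names follow GStreamer 0.10 (current at the time), and the RTSP address is a placeholder, not the project's actual server:

```shell
# Hypothetical receiving pipeline: pull the RTSP stream published by
# Darwin Streaming Server, depayload and decode the H.264 video, then
# convert the frames for display/processing.
gst-launch-0.10 rtspsrc location=rtsp://example-streaming-server/bwind.sdp \
  ! rtph264depay ! ffdec_h264 ! ffmpegcolorspace ! autovideosink
```

In the installation, the final sink would instead hand RGB frames to the openFrameworks application so that particle effects can be drawn over them.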

Finally, effects such as particle fields can be embedded after being processed in openFrameworks, making up the video that is projected at the interactive installation. openFrameworks is the tool chosen for computer vision and real-time video manipulation: because its code is optimized, its processing rates provide more responsiveness than other applications.
On image processing and computer vision, in order to use video tracking as the input data for the particle source, movements and gestures are mapped as wind actions. Drawing storyboards for movement indexing and mapping is part of the method; these cataloguing processes resemble choreographic notation.
The movements in the videos created for this indexing are analysed by openFrameworks for pixel changes, so that movement is the only action in the scene. Everything that is stable is subtracted, allowing the movement to be identified and isolated. Mapping instances of wind is done iteratively throughout the development of the project. Lexical explorations – the breeze, the hurricane, the gust, the gale – and their correspondences create visuals and sounds in response to motion. Generically, these motion inputs are analysed, inform the system of the changes that occurred, and then trigger the creation of the corresponding desired output.

For streaming and recording the video, a prototype with GStreamer is fully working on Linux, but on Mac OS X this is still a complex issue, since it requires custom intervention in the operating system's libraries. So far, this information has been passed to the openFrameworks developers so that a simpler way of using GStreamer on OS X can be developed.
Work in progress encompasses the software code for Arduino and openFrameworks to control the fan by mapping the mouse X position (/rtiviss_wip/bwind_fan_control), video streaming using GStreamer (/rtiviss_wip/bwind_streaming), and motion-tracking experiments (/rtiviss_wip/bwind_tracker).
As for the hardware for recording and streaming, we will be using the AXIS 211W Network Camera, a wireless outdoor IP camera that allows high-performance video to be sent directly to the server over a wireless network.
The image quality obtained with progressive scan at 30 frames per second is delivered in VGA resolution (640x480 pixels) [4].

Further developments on optimization and media at B-wind category of the RTiVISS project blog »

IV. B-wind! Physical computing for a big gust
Working with the Arduino microcontroller, sensors, motors, servos and fans
Overcoming space limitations, this experience also recalls telepresence. A challenging idea is the power to apply the wind effect in the forest itself: the visual motion tracking in the installation will have a real, amplified effect on the real trees, in real time, by triggering physical devices that produce a visible wind effect within the frame shown on the video screen. Microcontrollers such as the Arduino, communicating with wind-generating machines, can be a departure point for accomplishing this behaviour. Previous work with wind fans has been essayed, although in different contexts and at a different scale.
Reinforcing the wind's visual effects with real wind in the installation is a feature to be implemented, embracing this redundancy to enhance the visual impact of the image perceived by the user.
The B-wind physical-computing prototype is also an iterative process, now passing from a first working version to an intermediate standalone setup able to deliver a more powerful output.

First prototype and tests at AZ residency – part I:
B-wind fan control first prototype and circuit

Working fan prototype and motion tracking:

Blog posts on the developments during AZ residency parts I and II – and on further developments in optimization and media related to the complete prototype presented at the AZ Labs @ O Espaço do Tempo exhibition – are in the B-wind category of the RTiVISS project blog »

V. Present(ation) » Future developments
The most recent iterations are currently taking place at the AZ Labs @ O Espaço do Tempo exhibition.
The B-wind! project is being constantly updated HERE »
