To manifest our idea we built a working physical model controlled by an Arduino, as well as a 3D fly-through to better demonstrate the aesthetics of the building.
Programming 'Public' : The feeds explained.
Each of the four physical modules is driven by a servo motor. Using the pre-installed Servo library we are able to move a mechanical arm through 180 degrees; perfect for the motion we need in our model. Each module has its own motor and accompanying piston to move itself.
Module #1 The Mind : For the mind module we have chosen to use Twitter, because we believe Twitter is arguably a collective consciousness of society. By targeting specific tweets and hashtags about the building, we can create conditions that use this data to move the building automatically. In addition, the building will also tweet about its current formation, making it a blogject (an object that blogs).
In the image below you can see that the building has blogged information about its current state. This allows the building to have a digital as well as a physical presence and therefore reinforces its interaction with the public.
For the actual programming of this I used the Twitter4J library in conjunction with Processing to gain access to our Twitter account. Using this library I was able to update the status with pre-written tweets and collect tweets from across Twitter; specifically tweets containing '#the1public'.
Module #2 The Body : For the body of the building we have chosen a light sensor as its data input. We want the amount of light entering the building to directly affect one of the modules, because light is the source of all life, from humans to photosynthesising plants and microbial life. Since we are trying to give a space a life of its own, we believe a light source is fundamental to achieving that. I programmed the light sensor to respond to three different light conditions, each of which directly affects one of the modules.
Module #3 The Soul : For the soul we decided to use a sound sensor for its respective module. When people speak they are expressing their thoughts and emotions in a verbal manner; this can be considered the human soul. I programmed 'Public' so that it reacts to noise levels within itself.
Module #4 The Heart : We decided to use ultrasonic sound sensors, which work the same way sonar does, because we wanted to detect close interaction with the playscape. I programmed this sensor so that the distance a person stands from the sensor directly affects the height of its respective module.
With all four of these inputs combined into one sketch, each independently drives its respective motor. These motors in turn directly affect the formation of the building.
Fallbacks and visualisations
Even though we had done rigorous testing throughout the project, when it came to fitting all four modules in place the servos were simply not strong enough. This is mainly due to the high amount of friction created when the modules sit flush against one another. Because of this I decided to mock up a Processing visualisation to demonstrate where the modules would be on the model. The visualisation is directly affected by the data inputs from the light, sound and ultrasonic sensors, and from Twitter. Jessi Dimmock and I programmed the Processing sketch, which pulls in data from the Arduino using Firmata.
Below are a series of images documenting the model build for 'Public', which we all built together. Jon Moore produced an instruction manual for the entire build process, which can be viewed in detail here : http://blog.that-website.co.uk/#post60
1. The base frame
2. The ground floor is attached.
3. The building takes its normal shape.
4. Testing the servo motors with a single module.
5. Final form of the model including the interactive playscape.
6. The sensors attached to the model.
Below I’ve recorded the motors being moved by their respective feeds.
‘Public’ is a dynamic and contemporary approach to the architecture of an art gallery. It is a physical space that is constantly redefined by several digital inputs, which are driven by human interaction. This human interaction causes each compartment to reconfigure overnight, creating a living building that can be experienced differently each day.
The umbrella concept of this project is a building that monitors the public and how they interact with each other within the space. The building itself automatically adjusts to the habits of the public.
In order for the building to become a living organism, we believe there are four critical areas that should be included:
A mind – User interaction with Twitter, specifically targeting hashtags and tweets about the space; secondly, allowing the building to tweet automatically, enabling it to become a blogject (an object that blogs).
A body – Sensors within the gallery monitoring light levels.
A heart – An interactive outdoor playscape that uses ultrasonic sound sensors to gather data about how people interact with it.
A soul – A sound sensor that detects the sound levels from public interaction.
Each of these areas will be fed by live data, which will drive the mechanics of the independent modules. This enables multiple configurations of the building, allowing for different external and internal experiences.
Why are we doing this?
We want to explore how far the relationship between architecture and technology can be taken.
We want to deliver an experience that changes every time the public visit.
We believe that ubiquitous computing can be used to not only make buildings smarter but enable them to become a living organism that adapts to the public automatically.
Below is my initial concept drawing of the model we plan to build. Here we considered using shipping containers as repurposed architecture to fulfil our vision:
Below is the second and final design for our model. We abandoned the shipping container idea because, if our idea were built for real, the modules would be purpose-built.