The installation uses twelve flat-screen computer monitors that surround the viewer from all directions in a room. The monitors sit black and dormant when no one is present, with a light hung from the ceiling illuminating a circular spot on the floor.
When a viewer enters the space, the spotlight dissipates and the monitors come to life. Each monitor contains a single eye, filling the screen, staring at the viewer from all sides. After a few moments one of the monitors changes to a mouth, which says, “I can see through you.” Other monitors become mouths repeating the phrase in other languages: Polish, Thai, Laotian, Russian, French, Italian, Hindi, Farsi, and Spanish, with two voices speaking in English. The languages are representative of my friends and community.
As the viewer remains, the voices become more frequent, overlapping and building into a cacophony of sound. The image above shows the same monitor in both active states: the first showing the eye, the second showing the mouth speaking “I can see through you” in one of the languages.
The installation puts viewers at the center of attention: their presence is required for transmission, their physicality in the space makes them the fulcrum, and the images and sounds from all sides focus their attention, eliminating distractions.
This is a video excerpt from the installation, a small snippet that would play on one of the twelve monitors. The other monitors would contain eyes, and as you hear the different languages spoken, those monitors would change to mouths saying “I can see through you” in that particular language. In this snippet the model is speaking Polish.
But what of perception: how does the viewer interpret the statement? This is left to the viewer and their filter, their vision of reality and meaning. A judgment inevitably manifests, good or bad, right or wrong, meaningful or meaningless. We like to dissect and categorize, value and judge; it is the way we filter reality. The higher the perceived value of a thought or experience, the more it percolates to the top, into our conscious minds and our projections into the world.
This is where I would maintain that awareness plays a critical role. The greater a viewer’s awareness, the more capable that viewer is of contemplating alternate possibilities or variations of possibility, and this leads to less dogmatically defined judgments as it becomes impossible to limit or reduce variables into incontrovertible truths.
“The dot represents attention and the circle awareness. In these respective positions both are centered in relation to each other. Awareness can expand without losing center or its balanced relationship with attention and become more inclusive simultaneously. Attention can be focused, as fine as possible, in any direction and can probe all aspects of awareness without losing its balanced relationship to awareness.” – Pauline Oliveros, Sonic Meditations, 1974
The dot becomes the viewer at the center of attention in the installation. The declaration “I can see through you” makes the viewer’s perceptions another point of attention. The viewer brings only his or her own judgments to the experience; his or her level of openness determines the possible variations in perception.
The installation responds dynamically to the viewer's presence. When a viewer enters the space, their presence activates all twelve monitors; when they leave, all twelve monitors go black again. If the viewer returns within a short time, the video resumes where it left off; if they return after a longer period, the video starts from the beginning.
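The resume-or-restart behavior can be sketched as a small piece of state logic. This is a hypothetical illustration only: the `RESUME_WINDOW` threshold is an assumption, and the installation's actual timing lives inside its MaxMSP patch.

```python
RESUME_WINDOW = 30.0  # assumed: seconds a viewer may be absent before restart


class PlaybackState:
    """Tracks video position and decides whether to resume or restart."""

    def __init__(self):
        self.position = 0.0   # seconds into the video
        self.left_at = None   # timestamp when the viewer last left

    def viewer_left(self, now):
        # Record the moment the space becomes empty; monitors go black.
        self.left_at = now

    def viewer_entered(self, now):
        # Restart from the beginning after a long absence,
        # otherwise resume from the saved position.
        if self.left_at is not None and now - self.left_at > RESUME_WINDOW:
            self.position = 0.0
        self.left_at = None
        return self.position
```

The same rule in MaxMSP would typically be a timer object gating a choice between a stored playhead value and zero.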
The mechanics of the installation are controlled by MaxMSP, a visual programming language. The image above shows the MaxMSP “patch” I used for the installation. The computer running this software is connected to a microcontroller, the Make Controller Application Board v2, which in turn interfaces with a proximity sensor. I built and programmed the installation components.
The MaxMSP patch is built on Zach Poff’s Multiscreener patch, modified to take sensor data from a MaxBotix LV-EZ1 proximity sensor interfaced through the Make Controller Application Board v2. The proximity sensor detects a participant’s presence in the installed space via inaudible sonar pings; it activates or deactivates the videos and controls the intensity of the overhead light in the middle of the room. The eleven other machines run MaxMSP patches that sync their video with this lead machine. Network latency introduces some “play” that leaves the audio tracks slightly out of phase with one another, producing a very full and dynamic soundscape.
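One way the sonar readings could gate the installation on and off is sketched below. This is an assumption-laden illustration, not the actual patch logic: the distance thresholds and smoothing factor are invented, and in the real installation this work is done inside MaxMSP.

```python
PRESENT_CM = 250   # assumed: readings closer than this mean a viewer is present
ABSENT_CM = 300    # assumed: readings farther than this mean the space is empty
SMOOTHING = 0.5    # exponential smoothing factor for noisy sonar pings


class PresenceGate:
    """Turns a stream of raw sonar distances into a stable on/off state."""

    def __init__(self):
        self.filtered = None
        self.present = False

    def update(self, distance_cm):
        # Smooth raw readings so a single noisy ping cannot flicker the state.
        if self.filtered is None:
            self.filtered = distance_cm
        else:
            self.filtered = (SMOOTHING * distance_cm
                             + (1 - SMOOTHING) * self.filtered)
        # Hysteresis: separate enter/leave thresholds prevent rapid toggling
        # when a viewer hovers near the edge of the sensed zone.
        if not self.present and self.filtered < PRESENT_CM:
            self.present = True    # activate videos, dim the overhead light
        elif self.present and self.filtered > ABSENT_CM:
            self.present = False   # blank the monitors, raise the light
        return self.present
```

The gap between the two thresholds plays the same role a debounce or speedlim object would in a Max patch: it keeps a viewer standing at the boundary from toggling the whole room on and off.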
This is the Make Controller v2.0 application board used in the project.