Sunny Park

#Interactive Installation #Immersive Space Design #Kinect #Motion Tracking

Memory Horizon

Jun 2016 - Dec 2016 | Programmer, Interaction Designer

Memory Horizon is an immersive media installation that captures the nature of human memory in 3D space through visual, auditory, and motion interaction. As a programmer, I built a multi-modal interface using Arduino, Processing, and the Kinect platform, and contributed to the overall spatial design.


Skills.

Processing, Arduino, Rapid Prototyping, Project Management

Abstract.

“See the Unseen.”

Memory is a collective term for the psychic functions behind impression, perception, and conception: the phenomenon by which a human or an animal stores what it has experienced for a certain term and later reproduces or restructures it. Our memories exist beneath the surface of consciousness, stored as fragments with their own keywords, and are restructured into a personal story at the very moment of recall. Through this active cognitive process called memory, we break down the wall between the past and the present and, as Dasein, experience a moment that transcends the everyday.

Memory Horizon intends to express the beauty of memory crossing between reality and fantasy through stereoscopic spatialization, rediscovering the aesthetic value embedded in ‘memory’ as a process of storing and revealing the experiential past.

The mechanical movement of a clock located at the center of the space reveals the limits of the horizontal dimension of time while physically indicating the mediating energy emitted at the border between existence and non-existence. As the agent who carries fragments of memory and externalizes the subconscious, and as the creator of a new story, the audience constructs new meaning within the extraordinary space of the work.

We expect Memory Horizon to serve as a field for experiencing a special multisensory space that projects the archetypal beauty of memory.

Workflow.

The Kinect located at the front-center of the space detects the user's motion; in reaction to these inputs, screen effects are displayed, the lights grow brighter, the clock gears move faster, and the audio effects raise their tempo as well.
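
As a rough sketch of this pipeline (not the actual project code), a Processing draw loop reads the tracked position each frame and fans it out to the three output channels. The function names are illustrative, and the mouse stands in for the Kinect so the sketch runs without the sensor:

```java
// Minimal Processing skeleton of the interaction pipeline. The mouse stands in
// for the Kinect, and the three update functions are stubs sketched in the
// sections below.
void setup() {
  size(800, 600, P3D);
}

void draw() {
  background(0);
  float dx = mouseX;                               // horizontal position (stand-in)
  float dz = map(mouseY, 0, height, 800, 3500);    // depth in mm (assumed range)

  updateScreen(dx, dz);   // particle spheres on the screen
  updateLights(dz);       // LED brightness, sent to the Arduino over serial
  updateClock(dz);        // stepper motor speed and clock-sound tempo
}

void updateScreen(float dx, float dz) { /* see Visualization */ }
void updateLights(float dz)           { /* see Lighting & Sound */ }
void updateClock(float dz)            { /* see Immersive Space Design */ }
```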

Motion Recognition.

Based on the Kinect's sensing mechanism, various interactions are developed within the viewing space of the installation using both the horizontal position (dx) and depth (dz) values. The change in horizontal position is mapped to the width of the screen, and this value is continuously reflected in the X coordinate of the panel. The depth value is an equally important input: the change in the target's depth drives the Z coordinate of the particle sphere displayed on the screen, the brightness of the five LED lights, and the tempo of the sound effects.
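
A minimal sketch of that mapping, using Processing's map() function, is shown below; the input and output ranges are placeholders rather than the installation's calibrated values, and the mouse again stands in for the Kinect:

```java
// Illustrative mapping of the two Kinect inputs (dx, dz) to the four output
// parameters. All ranges are placeholder values, not the installation's.
void setup() {
  size(800, 600);
}

void draw() {
  background(0);
  float dx = mouseX;                               // horizontal position (stand-in)
  float dz = map(mouseY, 0, height, 800, 3500);    // depth in mm (assumed range)

  float panelX     = map(dx, 0, width, -width / 2, width / 2);  // X coordinate of the panel
  float sphereZ    = map(dz, 800, 3500, 300, -600);             // Z of the particle sphere
  float brightness = map(dz, 800, 3500, 255, 30);               // LED intensity
  float tempo      = map(dz, 800, 3500, 1.8, 0.8);              // sound playback rate

  // Visual check: a panel whose X position and brightness follow dx and dz;
  // sphereZ and tempo would drive the 3D scene and the soundtrack in the full system.
  fill(brightness);
  rect(width / 2 + panelX - 50, height / 2 - 50, 100, 100);
}
```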

Immersive Space Design.

The main object in Memory Horizon is a mechanical clock that runs on a combination of springs and gears without electronics. Stepper motors A and B are installed on the two wheels that move the clock needles (Fig. 5). Each motor is attached to the central axis of its gear wheel and rotates the gear at a single speed (rpm value). As the viewer approaches the clock, the motors spin faster; as the viewer moves away, they slow down.
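
This behaviour can be illustrated with a small Processing simulation (not the installation's firmware): two needles stand in for the wheels driven by motors A and B, and their rotation speed is derived from the viewer's depth. The depth and rpm ranges are assumptions, and the mouse replaces the Kinect:

```java
// Simulation only: two needles rotate faster as the viewer's depth (dz)
// decreases, mimicking the wheels driven by stepper motors A and B.
float angleA = 0;
float angleB = 0;

void setup() {
  size(400, 400);
}

void draw() {
  background(0);
  float dz  = map(mouseY, 0, height, 800, 3500);   // mouse stands in for Kinect depth
  float rpm = map(dz, 800, 3500, 60, 5);           // closer viewer -> faster rotation

  float degPerFrame = rpm * 360.0 / (60 * 60);     // assuming roughly 60 fps
  angleA += radians(degPerFrame);                  // wheel A (e.g. minute needle)
  angleB += radians(degPerFrame / 12);             // wheel B (e.g. hour needle)

  translate(width / 2, height / 2);
  stroke(255);
  line(0, 0, 150 * cos(angleA), 150 * sin(angleA));
  line(0, 0, 90 * cos(angleB), 90 * sin(angleB));
}
```
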
In response to the audience's real-time location values (dz values), the screen, lighting, and sound produce the following interaction effects:

Output.

Visualization

On the screen, real-time images are rendered as a visual representation of particles. Each of the three spheres is composed of thousands of particles surrounding a core sphere. The sphere's dimensions change interactively in response to the audience's position, using the horizontal position value (dx) and the depth value (dz) detected by the Kinect. The horizontal position value is mapped to the screen width, which determines the X coordinate of the particles, while the depth value is mapped to the camera depth so that viewers experience a stronger sense of immersion as they approach the sensor.
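
A single such sphere can be sketched in Processing as follows (one sphere rather than three, for brevity); the particle count, radius, and mapping ranges are illustrative assumptions, and the mouse stands in for the Kinect values:

```java
// Illustrative particle sphere: thousands of points scattered around a core
// sphere, with its screen position driven by dx and dz. Values are assumptions.
int NUM = 3000;
PVector[] particles = new PVector[NUM];

void setup() {
  size(800, 600, P3D);
  // Random points on a shell of roughly radius 150 around the core.
  for (int i = 0; i < NUM; i++) {
    particles[i] = PVector.random3D().mult(150 + random(-10, 10));
  }
}

void draw() {
  background(0);
  float dx = mouseX;                              // horizontal position (stand-in)
  float dz = map(mouseY, 0, height, 800, 3500);   // depth in mm (assumed range)

  float x = map(dx, 0, width, -200, 200);         // horizontal position -> X
  float z = map(dz, 800, 3500, 300, -600);        // closer viewer -> sphere comes forward

  translate(width / 2 + x, height / 2, z);
  noStroke();
  fill(255);
  sphere(40);                                     // core sphere

  stroke(255, 180);
  for (PVector p : particles) {
    point(p.x, p.y, p.z);                         // surrounding particles
  }
}
```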

Lighting & Sound

To maximize the immersive experience, not only the screen but also the lighting and sound reflect the Kinect data and respond in real time. Five LED strip lights are installed along the bottom of the installation and are controlled by Arduino boards; Processing and Arduino are connected through serial communication.
The viewer's depth value is sent to the board as a string and written out as an analogue signal (analogWrite), so that the overall intensity of the LEDs changes in real time. The clock soundtrack is likewise controlled by the user's location, using the Sound library provided by Processing.
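
A Processing-side sketch of this link might look like the following; the serial port, audio file name, and value ranges are assumptions, and the Arduino firmware that parses the string and calls analogWrite is not shown:

```java
import processing.serial.*;
import processing.sound.*;

// Illustrative Processing-side sketch: send a depth-derived LED brightness to
// the Arduino as a newline-terminated string, and drive the playback rate of a
// looping clock sound from the same depth value. Port, file name, and ranges
// are assumptions for illustration.
Serial arduino;
SoundFile clockLoop;

void setup() {
  size(200, 200);
  arduino = new Serial(this, Serial.list()[0], 9600);   // first available port (assumed)
  clockLoop = new SoundFile(this, "clock_tick.wav");    // hypothetical file name
  clockLoop.loop();
}

void draw() {
  float dz = map(mouseY, 0, height, 800, 3500);         // mouse stands in for Kinect depth

  int brightness = (int) map(dz, 800, 3500, 255, 30);   // closer -> brighter LEDs
  arduino.write(brightness + "\n");                     // Arduino parses this and calls analogWrite

  float rate = map(dz, 800, 3500, 1.8, 0.8);            // closer -> faster ticking
  clockLoop.rate(rate);
}
```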

Final Simulation and Exhibition

- Art&Technology Conference (2016.12)
- 2017 HCI Korea (2017.02)


Achievements.

2017.02

Creative Awards, HCI Korea 2017

2016.12

Selected Artwork, Art&Technology Conference