Thursday, 28 May 2009

Final Etude - Critical Evaluation

I have just finished my journey with this project. The final work is close to the planned outcome and is ready to present for the 3D Hybrids module. Below is my Critical Evaluation of the project.

The Portal – Critical Evaluation

3D Hybrids




          'Traditional interface boundaries between human and machine can be transcended even while re-affirming our corporeality, and Cartesian notions of space as well as illustrative realism can effectively be replaced by more evocative alternatives.' These words by Char Davies are very magical to me, as she grasps the idea of modern romanticism, in which humans escape through technology from the reality they have created and evolve by applying that technology to their bodies. Modern romanticism was introduced long before Char Davies's 'Osmose' and 'Immersence'. Twentieth-century writers like William Gibson and Philip K. Dick wrote about the downsides of technological development rather than its saving role. Nevertheless, modern societies are more likely to accept the carousel ride of technological innovation than to refuse it. As long as hi-tech gadgets are not used against another person, we are comfortable with automation.

          I am a big fan of the LHC project taking place underground in Switzerland. This is a case where technology is used to help us answer the biggest questions ever asked: What is life? What is reality? As an artist, I join the discussion on this topic and try to give my own answer. More specifically, I am suggesting a reality that can be adjusted, changed, morphed. Our eyesight represents the reality that surrounds us, and we interpret every object we see. Moreover, every one of us understands space and time in more or less the same way: we cannot be in two different places at the same time. Taking part in that discussion, my project suggests a controversial answer - parallel worlds. I would like to say that space and time can bend to create a magical window into a parallel world. This window can be opened in any place we like and on any surface we like.


          The parallel world is seen through a camera placed at point B, while the observer wears VR goggles with another camera attached at point A. In the purest sense, it is an implementation of two fully three-dimensional spaces onto one two-dimensional surface, which is understood as a portal. The extra feature (which gives it a more real feel) is that the user's position, tracked at point A, is transferred to the camera position at point B. The purpose of this feature is to give the view through the portal a natural, head-coupled perspective. This was the crucial part of my experiment and also the most complicated. Although I did not achieve everything I was planning at the beginning, I am quite pleased with the effect, considering the amount of work I have put into it. The chromakeying was ready to be assembled, since I had finished it last semester, so I had to focus on position detection in space and its interaction with the camera's view. I started by preparing the devices and tools needed to achieve what I had planned. Given the Wiimote's popularity among digital artists, I decided to include it in my project without hesitation. To control the camera view, I used RC servo motors, which I had wanted to work with for a long time. The Wiimote was the perfect interactivity tool from the beginning, and internet forums are full of information about its possible uses. I was only concerned about its ability to read the X, Y and Z axes. I had no problem with the readings, as the mini webcam I purchased last semester had infrared LEDs built in.
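The tracking-to-servo maths itself is simple once the Wiimote readings are in hand. Below is a minimal Python sketch of one way the mapping could work: the 1024x768 grid is the Wiimote IR camera's real resolution, but the linear mapping and the function names are my own assumptions for illustration, not the actual patch.

```python
# Hypothetical sketch: map a Wiimote IR blob position to pan/tilt
# servo angles. The Wiimote's IR camera reports blob coordinates on
# a 1024x768 grid; hobby RC servos accept angles of roughly 0-180
# degrees. The linear mapping below is an assumption, not the patch.

IR_WIDTH, IR_HEIGHT = 1024, 768   # Wiimote IR camera resolution
SERVO_MIN, SERVO_MAX = 0, 180     # standard RC servo range in degrees

def ir_to_servo(x, y):
    """Linearly map an IR blob (x, y) to (pan, tilt) servo angles."""
    pan = SERVO_MIN + (x / IR_WIDTH) * (SERVO_MAX - SERVO_MIN)
    tilt = SERVO_MIN + (y / IR_HEIGHT) * (SERVO_MAX - SERVO_MIN)
    return round(pan), round(tilt)

# A blob in the centre of the sensor centres both servos.
print(ir_to_servo(512, 384))  # -> (90, 90)
```

In practice the readings would be smoothed before being sent to the servos, but the core idea is just this scaling from sensor space to servo space.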


          I am very proud of what I have achieved. Moreover, I have learned something about myself: I am stubborn and can think logically surprisingly well. I have many ideas and plan every step in my mind before actually doing it. Patience and stoic calm decided my success as an artist and inventor.


Tuesday, 26 May 2009

Sketches and Max/MSP Patch

Hello! As I promised in previous posts, here are the sketches of the project:
Also, just to remind you that the original idea came from 'The Neovision' project, I am posting this short video:


The Neovision uses a distance sensor to control the playback of a QuickTime movie. No Flash used...
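For anyone curious how a distance reading can drive a playhead, here is a rough Python sketch of the scaling involved. In the real Neovision this happens inside Max/MSP; the sensor range and frame count below are assumptions for illustration only.

```python
# Rough sketch of the Neovision idea: a distance reading drives the
# playhead of a movie. The sensor's usable range (in cm) and the
# movie's frame count are assumed values, not the real project's.

SENSOR_NEAR, SENSOR_FAR = 10, 80   # assumed usable sensor range, cm
TOTAL_FRAMES = 250                 # assumed movie length in frames

def distance_to_frame(cm):
    """Clamp the reading to the sensor range, then scale it to a frame index."""
    cm = max(SENSOR_NEAR, min(SENSOR_FAR, cm))
    t = (cm - SENSOR_NEAR) / (SENSOR_FAR - SENSOR_NEAR)
    return int(t * (TOTAL_FRAMES - 1))

# Standing close scrubs to the start; stepping back scrubs to the end.
print(distance_to_frame(10), distance_to_frame(80))  # -> 0 249
```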

Monday, 18 May 2009

Prototype

This is a sketch of the two servos controlling a webcam. Undoubtedly, this is how I am going to put these parts together: connect the servos to the Arduino with a breadboard and hide the whole skeleton inside a toy. Still waiting for the goggles though...
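To give an idea of how the computer side could talk to the Arduino, here is a hypothetical Python sketch of a tiny serial protocol: one sync byte followed by the two servo angles. The protocol, port name, and baud rate are my own inventions for illustration, not the final wiring.

```python
# Hypothetical serial protocol for the pan/tilt rig: a sync header
# byte followed by one byte per servo. Keeping angles below 0xFF
# lets the Arduino sketch re-synchronise on the header if bytes are
# dropped. All of this is an assumption, not the built prototype.

HEADER = 0xFF  # sync byte; servo angles stay below this value

def servo_packet(pan, tilt):
    """Build a 3-byte packet: header, pan angle, tilt angle (0-179)."""
    for angle in (pan, tilt):
        if not 0 <= angle <= 179:
            raise ValueError("servo angle out of range")
    return bytes([HEADER, pan, tilt])

# With pyserial the packet would be written out something like:
#   import serial
#   port = serial.Serial("/dev/ttyUSB0", 9600)  # port/baud are assumptions
#   port.write(servo_packet(90, 90))            # centre both servos
```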

Thursday, 14 May 2009

Etude 7

Hello! It looks like I have had a busy month... I was stuck with my project for a long period. But nothing is lost, and I made a huge step forward just by buying myself some time to rethink my ideas and imagine the final results. Now, after days and days of research and sleepless nights, I can say that it all paid off and I have cracked it!
Before I began the 3D Hybrids module I was pretty sure which direction I was going in. I also knew that something unexpected would happen with The Neovision glove. And it did happen: the glove is not a glove anymore. It might be a great surprise to some people who expected me to develop its touch-sensing capabilities. Now I can say that would have been a less eventful approach.
I began by collecting materials and sensors. After a short internet search I was sure that I needed two servo motors to control the movement. Moreover, I needed a subject to control, and it was not long before I decided upon a toy, in this case a teddy bear... cute :) The one thing I was concerned about all along was the display: how the user will experience the fourth dimension (see Etude 6). For this purpose I always wanted to use VR goggles, a thoughtful choice because they display the image right in front of your eyes. I guess it also comes from the fact that as a child I saw 'The Lawnmower Man' and was fascinated by the idea of virtual reality allowing people to broaden their perception and intelligence. I am still not sure whether I will receive the VR goggles from my University. If I do not, I will have to use a projector instead: the image from the camera controlled by the servos will be sent to the projector and displayed onto a wall, or onto a 2D plane built specially for the occasion. For the time of the exhibition, the teddy bear with the built-in camera will be placed somewhere outside the Uni building (hopefully security will not stand in the way). For the user to properly experience my project, the toy needs an interesting view: 180 degrees of eventful space. Moreover, it has to be displayed in a strategic place, such as the Uni main entrance; the promotional value for the exhibition speaks for itself. Two main tasks remain: attaching LEDs to the camera so that they represent eyes, and adding sound coming out of the teddy bear (thanks to a small speaker inside its belly). The user wearing the goggles will be able to hear what the toy 'hears' and speak through it.
It will all be possible thanks to my Max/MSP patch, which sends and receives data via the internet or a LAN. I guess that makes it a kind of Skype-like device, but with the extended features of camera movement control and an aspect of wide perception.
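As a rough illustration of what the patch does on the network side, here is a Python sketch of shuttling camera-control data over UDP, much as Max's udpsend and udpreceive objects do. The port number and the JSON message format are assumptions of mine, not the patch's actual wire format.

```python
# Minimal sketch of the network layer: camera-control messages
# serialised to JSON and carried over UDP between the tracking
# machine (sender) and the teddy-bear rig (receiver). Port and
# message shape are illustrative assumptions.

import json
import socket

PORT = 7400  # assumed port, chosen arbitrarily

def encode_control(pan, tilt):
    """Serialise a camera-control message as JSON bytes."""
    return json.dumps({"pan": pan, "tilt": tilt}).encode()

def decode_control(data):
    """Recover (pan, tilt) from a received datagram."""
    msg = json.loads(data.decode())
    return msg["pan"], msg["tilt"]

# Sender side (tracking/goggles machine) would then do:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.sendto(encode_control(90, 45), ("192.168.0.2", PORT))
```

The receiver simply binds a UDP socket on the same port, decodes each datagram, and forwards the angles to the servos; UDP suits this because a lost control packet is immediately superseded by the next one.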
Soon I will upload the sketches of my project so stay tuned!