Thursday, 31 December 2009

Wiimote sensor bar build-up, stage 1

At the moment I'm at the stage of building a Wiimote IR sensor bar that will be attached to the goggles so that it can work with the Wiimote's camera. I have already purchased the LEDs I need.

And here is the tutorial on how to build the sensor bar.
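A quick sanity check before soldering (rough numbers only - I am assuming typical IR LEDs with a forward voltage of about 1.5 V and a 20 mA operating current, powered from a 3 V battery pack; check the datasheet of the LEDs you actually buy). The current-limiting resistor per LED works out to

$$R = \frac{V_{\text{supply}} - V_f}{I_f} = \frac{3\ \text{V} - 1.5\ \text{V}}{0.02\ \text{A}} = 75\ \Omega,$$

so the nearest standard value above that, 82 Ω, keeps things on the safe side.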

My feelings and expectations for the project are very pessimistic at the moment because I have a problem with the goggles. Unfortunately, they seem to be broken :( Only one screen is working; the other one is not responding/is off. I will carry on with the next stage as I planned, but the final effect will be a far cry from what I expected. After the sensor bar is done, the next stage will involve adjusting the servo motor movement. Wish me luck!

Happy New Year!!!


Sunday, 20 December 2009

Wiimote sensor bar

Well, you may ask why, oh why, did I put this video here... It is funny and to the point. I have been installing Max/MSP Jitter and Arduino again after they had been absent from my computer for a good few months. After a while, when everything was there, I tested my patch and everything that goes with it: cameras, Wiimote, goggles and Arduino. The first camera appears to be broken... hmm, it should not be a big problem, it only costs 5 pounds to get a new one. The second camera, the one inside the goggles, shows a blurry picture, so a few more adjustments need to be made. Next, the Wiimote is not detecting the IR LEDs of the camera inside the goggles, even though the LEDs stick out of the goggles' cover. I have spent 6 hours today trying to figure out what the problem is. The funniest part, which also relates to the video, is the Wiimote detecting my ceiling lamp as if it were an IR LED. Hah, so much to be done tomorrow. If the bug isn't fixed quickly, the fallback option is to build a Wiimote sensor bar. Simple and convenient.

Tuesday, 8 December 2009

Promo 3D animation

This is the promotional piece that will be part of a promo website. 

Tuesday, 1 December 2009

Ortho-goggles revisited

Hello everyone! The project I have been describing in detail on these pages has entered the next stage of development. There have already been improvements in the aesthetic design. The camera placed on the goggles, responsible for showing the surroundings, has been mounted inside the goggles. Moreover, some additional features have been added to the overall look of the device itself. The cover had to be taken off in order to fit the camera, so a new cover had to be developed. Plus, the visitors to the TVU Degree Show 'Digital Piracy' exhibition previously proved not to be convinced by its fragile structure. Today, the structure is more stable and user friendly.

The device that I am creating is a digital tool for artists. Its form is far more sophisticated than a brush or pencil, and in terms of usability it is also more valuable. This device allows us to manipulate the perspective of the image. It becomes a tool for an artist in terms of vision and imagination. Additionally, it represents the technique of projecting a fully three-dimensional object onto a 2D plane, a technique called isometric projection. It is a method of representing three-dimensional objects in two dimensions where the X, Y and Z axes are equally foreshortened and the angle between any two of the axes is 120 degrees.
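For reference, one standard way of writing this projection (with the Z axis taken as the vertical direction on the page) maps a 3D point (x, y, z) to screen coordinates (u, v) as follows; the square-root factor is the equal foreshortening mentioned above and is often dropped in isometric drawing:

$$\begin{pmatrix} u \\ v \end{pmatrix} = \sqrt{\tfrac{2}{3}}\,\begin{pmatrix} \cos 30^\circ & -\cos 30^\circ & 0 \\ -\sin 30^\circ & -\sin 30^\circ & 1 \end{pmatrix}\begin{pmatrix} x \\ y \\ z \end{pmatrix}$$

Under this mapping the X, Y and Z axes land on page directions 120 degrees apart, all with the same projected length, which is exactly the property described above.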

The project consists of two cameras, AV goggles, two RC hobby motors, a Wiimote controller and a projection plane. The first camera is attached to the goggles and the second to the RC motors. The position of the second camera depends on the position of the first. Both move along the X and Y axes thanks to the Wiimote controller placed under the projection plane. The Wiimote controller's role is to act as a vantage point and to read the position of an object in 2D or 3D space; in this case it is the position of the first camera in 2D space that is being read. Thanks to these values from the controller, the position data can be sent on to the RC motors, which control the second camera. And so the position of the first camera is mirrored by the position of the second camera.
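To make the last step concrete, here is a minimal sketch of the kind of firmware that could sit on the Arduino side - not the actual patch or code used in this project. It assumes Max/MSP forwards each Wiimote IR reading over the serial port as two ASCII integers per line (e.g. "512 384"), and that the pan and tilt servos sit on pins 9 and 10; all of those details are assumptions:

```cpp
// Hypothetical Arduino sketch: mirror the tracked camera position onto the
// second camera's pan/tilt servos, driven by IR coordinates sent from Max/MSP.
#include <Servo.h>

Servo panServo;   // horizontal axis
Servo tiltServo;  // vertical axis

void setup() {
  Serial.begin(9600);
  panServo.attach(9);    // pan servo signal on pin 9 (assumption)
  tiltServo.attach(10);  // tilt servo signal on pin 10 (assumption)
}

void loop() {
  if (Serial.available()) {
    // Simple blocking parse of "x y\n"; fine for a sketch like this.
    long x = Serial.parseInt();   // Wiimote IR camera: 0..1023 horizontally
    long y = Serial.parseInt();   // Wiimote IR camera: 0..767 vertically
    // Map the IR-camera coordinates onto the servos' 0..180 degree range.
    int pan  = map(constrain(x, 0, 1023), 0, 1023, 0, 180);
    int tilt = map(constrain(y, 0, 767),  0, 767,  0, 180);
    panServo.write(pan);
    tiltServo.write(tilt);
  }
}
```

On the Max side the same numbers would simply be formatted into a line and pushed out through the serial object.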



Thursday, 28 May 2009

Final Etude - Critical Evaluation

I have just finished my journey with this project. The final work is close to the planned result and is ready to present for the 3D Hybrids module. I include my Critical Evaluation of the project below.

The Portal – Critical Evaluation

3D Hybrids




'Traditional interface boundaries between human and machine can be transcended even while re-affirming our corporeality, and Cartesian notions of space as well as illustrative realism can effectively be replaced by more evocative alternatives.' These words by Char Davies are very magical to me, as she grasps the idea of modern romanticism, in which humans escape through technology from the reality they have created and evolve by applying this technology to their bodies. Modern romanticism was introduced long before Char Davies's 'Osmose' and 'Immersence'. Twentieth-century writers like William Gibson and Philip K. Dick talked about the downsides of technological development rather than its saving role. Nevertheless, modern societies are more likely to accept the carousel ride of technological innovation than to deny it. As long as the hi-tech gadgets are not used against another person, we are comfortable with automation. I am a big fan of the LHC project that is taking place underground in Switzerland. This is a case where technology is used to help us answer the biggest questions ever asked - What is life? What is reality? As an artist, I am joining the discussion on this topic and trying to give my own answer. More specifically, I am suggesting a reality that can be adjusted, changed, morphed. As human eyesight represents the reality that surrounds us, we interpret every object, and every one of us understands space and time in more or less the same way. We cannot be in two different places at the same time. Taking part in that discussion, my project suggests a controversial answer - parallel worlds. I would like to say that space and time can bend to create a magical window into a parallel world. This window can be opened in any place we like and on any surface we like.


The parallel world is seen through the camera placed at point B, and the observer wears VR goggles with another camera attached to them at point A. In the pure sense, it is an implementation of two fully three-dimensional spaces onto one two-dimensional space, which is understood as a portal. The extra feature (which gives it a more real feel) is the user's position tracking at point A being transferred to the camera position at point B. The purpose of this feature is to represent a natural orthographic view. This was the crucial part of my experiment and also the most complicated. Although I did not achieve everything I was planning at the beginning, I am quite pleased with the effect, considering the amount of work I have put into it. The chromakeying was ready to be assembled, since I had finished it last semester, so I could focus on position detection in space and its interaction with the camera's orthographic view. I started by preparing the devices and tools needed to achieve what I had planned. Given the Wiimote's popularity among digital artists, I decided without hesitation to include it in my project. To control the camera view, I used RC servo motors, which I had wanted to work with for a long time. The Wiimote was the perfect interactivity tool from the beginning, and internet forums are full of information about its possible uses. I was only concerned about its ability to read the X, Y and Z axes. In the end I had no problem with the readings, as the mini webcam I purchased last semester had infrared LEDs built in.
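To give a feel for how the tracking at point A turns into camera angles at point B (a first-order sketch, not the exact mapping used in the patch): if the Wiimote reports the tracked IR dot at pixel (x, y) on its 1024 x 768 sensor, the pan and tilt offsets are roughly

$$\theta_{\text{pan}} \approx \left(\tfrac{x}{1023} - \tfrac{1}{2}\right)\varphi_h, \qquad \theta_{\text{tilt}} \approx \left(\tfrac{y}{767} - \tfrac{1}{2}\right)\varphi_v,$$

where φ_h and φ_v stand for the horizontal and vertical fields of view of the Wiimote's IR camera. These are the angles the servos at point B then have to reproduce.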


I am very proud of what I have achieved. Moreover, I have learned something about myself - I am stubborn and can think logically surprisingly well. I have many ideas and plan every step in my mind before actually doing it. Patience and stoic calm decided my success as an artist and inventor.


Tuesday, 26 May 2009

Sketches and Max/MSP Patch

Hello! As I promised in previous posts, I am showing the sketches of the project. Here they are:
Also, just to remind you that the original idea came from 'The Neovision' project, I am releasing this short video:


The Neovision uses a distance sensor to control the playback of a QuickTime movie. No Flash used...

Monday, 18 May 2009

Prototype

This is a sketch of the two servos controlling a webcam. Undoubtedly, this is how I am going to put these parts together: connect the servos to the Arduino via a breadboard and hide the whole skeleton in a toy. Still waiting for the goggles though...
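Before mounting the webcam it is worth bench-testing the two servos on the breadboard. A throwaway sketch along these lines (the pin numbers are my assumption) just sweeps both servos back and forth so the wiring and range of movement can be checked:

```cpp
// Hypothetical bench-test sketch: sweep the pan and tilt servos back and
// forth to verify the wiring before mounting the webcam on them.
#include <Servo.h>

Servo pan;
Servo tilt;

void setup() {
  pan.attach(9);    // pan servo signal wire on pin 9 (assumption)
  tilt.attach(10);  // tilt servo signal wire on pin 10 (assumption)
}

void loop() {
  // Sweep from 0 to 180 degrees and back, both servos together.
  for (int angle = 0; angle <= 180; angle += 2) {
    pan.write(angle);
    tilt.write(angle);
    delay(20);
  }
  for (int angle = 180; angle >= 0; angle -= 2) {
    pan.write(angle);
    tilt.write(angle);
    delay(20);
  }
}
```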

Thursday, 14 May 2009

Etude 7

Hello! It looks like I have had a busy month... I was stuck with my project for quite a while. But nothing is lost, and I made a huge step forward just by buying myself some time. I had to give myself room to rethink my ideas and imagine the final result. Now, after days and days of research and sleepless nights, I can say that it all paid off and I cracked it!
Before I began the 3D Hybrids module I was pretty sure which direction I was going in. I also knew that something unexpected would happen with The Neovision glove. And it did happen. The glove is not a glove anymore. This might be a great surprise to some people who expected me to keep developing its touch-sensing capabilities. Now I can say that would have been a less eventful approach.
I began by collecting materials and sensors. After a short internet search I was sure that I needed two servo motors to control the movement. Moreover, I needed a subject to control, and it was not long before I decided upon a toy, in this case a teddy bear... cute :) The one thing I was concerned about the whole time was the display - how the user would experience the fourth dimension (see Etude 6). For this purpose I always wanted to use VR goggles - a thoughtful choice, because they display the image right in front of your eyes. I guess it also comes from the fact that as a child I saw 'The Lawnmower Man' and was fascinated by the idea of virtual reality allowing people to broaden their perception and intelligence. I am still not sure whether I will receive the VR goggles from my university. If I don't get them, I will have to use a projector instead: the image from the servo-controlled camera will be sent to the projector and displayed on a wall or on a 2D plane built specially for the occasion. For the duration of the exhibition, the teddy bear with the built-in camera will be placed somewhere outside the Uni building (hopefully security measures will not stand in the way). In order to experience my project, the user has to have an interesting view - 180 degrees of eventful space. Moreover, the toy has to be displayed somewhere strategic, such as the Uni's main entrance; the promotional purpose of the exhibition speaks for itself. Two of the main tasks coming up soon are attaching LEDs to the camera so that they represent eyes, and adding sound coming out of the teddy bear (thanks to a small speaker inside its belly). The user wearing the goggles will be able to hear what the toy 'hears' and speak through it.
It will all be possible thanks to my Max/MSP patch, which sends and receives data via the internet or a LAN. I guess that makes it some kind of Skype-like device, but with the extended features of camera movement control and a wider field of perception.
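Max itself usually handles this with its udpsend and udpreceive objects, but to make the idea concrete, here is a rough C++ sketch of the kind of message that could travel over the LAN: one datagram carrying the pan and tilt values. The port, address and message format are my own assumptions rather than the patch's actual protocol, and this POSIX-sockets version is for illustration only:

```cpp
// Illustrative only: a tiny UDP sender showing the kind of data the patch
// could exchange over the LAN -- two integers (pan, tilt) per datagram.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>

int main() {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) { perror("socket"); return 1; }

    sockaddr_in dest{};
    dest.sin_family = AF_INET;
    dest.sin_port = htons(7400);                           // assumed port
    inet_pton(AF_INET, "192.168.0.10", &dest.sin_addr);    // assumed receiver

    int pan = 90, tilt = 45;                               // example camera angles
    char msg[32];
    std::snprintf(msg, sizeof(msg), "%d %d", pan, tilt);
    sendto(sock, msg, std::strlen(msg), 0,
           reinterpret_cast<sockaddr*>(&dest), sizeof(dest));
    close(sock);
    return 0;
}
```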
Soon I will upload the sketches of my project so stay tuned!

Friday, 17 April 2009

Etude 6 - 4D

Orthogonality
In the familiar 3-dimensional space that we live in, there are three pairs of cardinal directions: up/down (altitude), north/south (latitude), and east/west (longitude). These pairs of directions are mutually orthogonal: they are at right angles to each other. Mathematically, they lie on three coordinate axes, usually labelled x, y, and z. The z-buffer in computer graphics refers to this z-axis, representing depth in the 2-dimensional imagery displayed on the computer screen.
A space of four spatial dimensions has an additional pair of cardinal directions which is orthogonal to the other three. This additional pair of directions lies on a fourth coordinate axis perpendicular to the x, y, and z axes, usually labelled w. Attested terms for these extra directions include ana/kata.

(http://en.wikipedia.org/wiki/Fourth_dimension)

This Easter I came up with the idea of including servo motors in my project, as I am not sure whether my previous idea of 3D video implemented on a 2D plane will actually work. Even if it does, do I have enough time to finish it? Many questions arose, and among them one in particular - what is the purpose of my project?

But, hey, the viewer's perspective of a 2D space changes the perspective of seeing another 2D space inside it... because the image inside is just a projection. The visible barrier between the real world, in which we move in space and time, and the other (virtual, distant) one is the 2D plane (mirror, window, portal) - a flat, physical boundary between two fully dimensional worlds. Still, it is possible for our minds to travel in two spaces at the same time. Expanding the sense of sight, we can control an eye-like camera to see more. Like an eye, the camera can turn left/right and up/down and zoom in/out. I am using two servo motors and an Arduino Diecimila to control the X, Y and Z axes of a webcam. Its positioning is controlled by my movement: the Wiimote detects my position in space and transfers the data to a different location, to where the camera is. The two of us, me and the cam, can be in two distant locations at the same time, but this needs a connection to the software and an internet server. In other words, the eye-like webcam becomes my eye. Moreover, I can see two different locations at the same time, in the same perspective.

Again, is it 3D space in 3D space? Or...


Friday, 3 April 2009

Digital Piracy - Etude 4

For this module I will be creating a 3D animation of a beach, with sea waves hitting the shore, a nice blue sky and the sun behind the horizon :) This idea was not pulled out of the blue. Recently, our group of digital artists decided that the theme for the Degree Show is 'Digital Piracy'. What we (as a group of contemporary artists) are trying to achieve is to explain to the visitors of the show what we see as digital piracy in the 21st century. It is about sharing code as open source and sharing experience as a community, as opposed to stealing software and abusing copyright law. We want to be understood as a fast-growing, enthusiastic community of modern artists and not as hackers or impostors. As Alexei Shulgin, a Russian-born contemporary artist, explains in his book 'Read Me: Software Art & Cultures', 'software art is a practice that regards software as a cultural phenomenon that defines one of the principal domains of our existence today'.
Alexei Shulgin is an artist who created 'Super-i Real Virtuality System'.





Shulgin's idea incorporates a neat design - VR goggles and a small image-processing box; it does not even need a computer. The visual effects are rather simple, producing something like the filters used in Adobe Photoshop. Hmmm, I wonder how much money he spent buying the goggles. It bugs me because I am trying to get something cheaper, if less extravagant.

Over this week and the next few days, I will be creating a 3D animation which I will then render into a short movie. As written above, it is a beach :) I am more concerned about the possibilities for the projection. Hmm, maybe some of this will help - Throwable Display for Gaming. A rather complicated idea, but it did not take long for Johnny Lee to recreate it in a piece called 'Foldable Interactive Displays'.



More on the topic - these are useful sites for any creative 3D artist.

Thursday, 26 March 2009

The Neovision - Etude 2

Hi again! I am writing after two busy weeks during which I was building websites and studying. Now it is time to release all the information that has been piling up in my browser for weeks. I have been thinking about the possibilities of projecting a 3D image onto a 2D plane - in other words, the process of creating a 3D scene and then setting up an orthographic (2D) view of it. Max/MSP Jitter can easily handle orthographic views thanks to OpenGL and its jit.gl.render object; it is a matter of adding the '@ortho 1' attribute to jit.gl.render. Now, that is all great, but in my case I am using DirectX as the renderer, and so I have to look for a different solution.
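For anyone wondering what '@ortho 1' actually changes: loosely speaking, a perspective camera divides by depth, while an orthographic one simply discards it, which is why objects stop shrinking with distance. In simplified form (f standing for the focal length):

$$\text{perspective: } (x, y, z) \mapsto \left(\frac{f\,x}{z}, \frac{f\,y}{z}\right) \qquad \text{orthographic: } (x, y, z) \mapsto (x, y)$$

Both OpenGL and DirectX express this as a different 4x4 projection matrix internally, so the concept carries over even if the renderer changes.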

Thanks to the forums on the Cycling '74 website, I have found a recipe that seems useful for my project. Recipe 25: Raging Swirl is a patch that controls the x, y and z axes of the image. It performs manipulations of a 3-dimensional matrix and visualizes 3D data as a geometrical figure. Still, it is experimentation with that particular patch that will lead me to specific results. Moreover, in order to achieve these results I will need an IR LED sensor bar. This bar, in combination with the Wii Remote, is the image-manipulation tool without which I cannot fully experiment with the patches.

I do not want to buy the Wii sensor bar, mainly because I would be spending money on something that can be built from cheap parts. The whole process of building your own 'ugly' sensor bar is described here.

It seems like I am ready to begin the extended build of my Neovision project. Finally, after a few weeks of research, I have it all planned step by step. This way is really more convenient: I research for weeks to plan my very first step, which leads me all the way to the last. Oh, but no, I have not decided upon my last step yet. I am still not sure whether I should realise it as a head-mounted device or a glove. It is a hard choice to make when the project has to be 100% ready in May, when the Degree Show starts. On one hand, the safe solution would be to exhibit the project in the form of a glove, but that would lack the effect of immersive interaction with the animation. In the case of a head-mounted device, the effect would be immensely astonishing, but it would cost me a stressful month of preparation. If I go for the head-mounted device design, this is how it will look:



Hack Cheap Video Glasses - video powered by Metacafe

Monday, 9 March 2009

The Neovision - Etude 1

Hi everyone! My journey with 3D hybrids began at the Kinetica Art Fair exhibition. That is when I realised what I want from my new project. Although I did not see many video art projects at the exhibition, I realised that I have a basic understanding of how many of these projects are made. My previous flirtation with distance sensors, LEDs and programming was strong enough to make me understand what the process of making electronic art looks like. Now I know that I will never lose this instinct.

Last semester I was developing my idea of a glove-like device, the Neovision. The final artefact looked different from, and much better than, what I predicted at the beginning. I enhanced its responsiveness but at the same time had to leave some of the ideas behind. A quick reminder - The Neovision. Among its capabilities are movie playback control using the URM37 ultrasonic sensor, and 'chromakeying'.
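As a reminder of how the playback control works at the hardware end (an illustrative sketch, not the original Neovision firmware): the Arduino just has to turn a distance reading into a number that Max/MSP can map onto the movie's playback position. This version assumes the URM37 is wired in its pulse-width output mode and read with pulseIn(); the pin and the microseconds-per-centimetre scale are assumptions to be checked against the sensor's datasheet:

```cpp
// Illustrative sketch: read a distance from the ultrasonic sensor's
// pulse-width output and print it over serial, where Max/MSP can map it
// onto the QuickTime movie's playback position.
const int PULSE_PIN = 7;          // sensor pulse-output pin (assumption)
const float US_PER_CM = 50.0;     // assumed PWM scale; check the datasheet

void setup() {
  pinMode(PULSE_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  // Width of the low pulse encodes the distance; 0 means timeout.
  unsigned long widthUs = pulseIn(PULSE_PIN, LOW, 100000UL);
  if (widthUs > 0) {
    float distanceCm = widthUs / US_PER_CM;
    Serial.println(distanceCm);   // picked up by the patch over serial
  }
  delay(50);
}
```

Inside the patch, that incoming value just needs scaling to the length of the movie.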

This semester I am going to enhance the Neovision's vision :-) by adding Wii Remote functionality and applying 3D graphics. In most cases of chroma keying, the camera is static and the background is projected onto coloured material. In the case of the Neovision, the camera is dynamic and the image/video can be projected onto any chosen colour - so it could be a wall, a door, a desk or a computer screen. At the moment, the Neovision operates without responding to the camera's Y and Z axes; the only response comes from the ultrasonic sensor's X-axis distance from the object.
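For completeness, the keying itself comes down to a per-pixel test (this is the generic rule; Jitter's jit.chromakey object wraps it with tolerance and fade attributes, and the threshold τ below is my own notation): a pixel whose colour c(p) is close enough to the key colour k is replaced by the projected video, otherwise the live camera pixel is kept:

$$\text{out}(p) = \begin{cases} \text{video}(p) & \text{if } \lVert c(p) - k \rVert < \tau \\ \text{camera}(p) & \text{otherwise} \end{cases}$$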

This phase of the project will change that, and the viewer will be able to interact with the projection more freely. Johnny Lee found the perfect way to achieve this effect, thanks to his experimentation with the Wii Remote and sensor bar. As he describes it: 'This effectively transforms your display into a portal of a virtual environment'.


Kinetica Art Fair 2009