Sunday, November 9, 2008

Week 3

03 Nov
A new week and some good news. We finally managed to snap the sub-images together!! And soon after, we had integrated rotation into the application. Today, the whole of IHPC had a seminar to attend, so we could hardly see anyone in the office.

Gave Kevin an update before lunch and I guess it went smoothly. Lunch was special today because we had Crystal Jade. It was expensive of course, but luckily we had just gotten our pay and it's a 'once in a while' thing, so I guess it's alright. So at the end of the day, this is how our application looked.
04 Nov
Ming Hong and Kevin again went for a seminar and we were left in the lab on our own for almost the whole day. Did not really do much coding today, just changed the initial array of arrays into a vector of vectors so as to store the positions of the sub-images more easily compared to an array. With a vector of vectors, the index of a sub-image tells us the original position of that image, which made the algorithms look less complex.
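
For the record, here is a minimal sketch of the idea. The struct, the grid size and the variable names are my own invention for illustration, not the actual project code.

#include <vector>

// Hypothetical sketch: positions[r][c] holds the current on-screen position
// of the sub-image whose original (home) slot is row r, column c, so the
// index itself records where each piece belongs.
struct Position {
    float x;
    float y;
};

int main() {
    const int rows = 4, cols = 4;

    std::vector< std::vector<Position> > positions(
        rows, std::vector<Position>(cols));

    // Example: move the piece whose home slot is (1, 2).
    positions[1][2].x = 150.0f;
    positions[1][2].y = 80.0f;

    return 0;
}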

05 Nov
The puzzle game was more or less considered complete, so we tested it with lightdraw using X11. Before that, we had added a winning message at the end of the game so that it would look more complete.

Testing on lightdraw looked successful, so we went on to try integrating Open Sound Control, but we knew very little about it, so we did some research and asked Ming Hong about it. We had to change the makefile just to integrate OSC into the application. Being stuck and really bored trying to solve the integration problems, we went to play with the blackboard that Ming Hong created and this is what we drew. Cool huh?

Not forgetting the other team that is also working hard on their app. They are able to check for collisions and make the ball bounce more naturally.

06 Nov
After some explanation from Ming Hong on how OSC actually works, we started editing our code so that it would take the coordinates of the laser instead of the mouse. It was not as easy as we initially thought, because OSC does not have separate events such as mouse up and mouse down the way a normal mouse would, so we had to think of alternatives to allow the laser to work as similarly to the mouse as possible.
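
One way to think about it (purely an illustrative sketch, not our actual code; the one-second timeout and the handler names are assumptions): the laser only produces a stream of coordinates while it is visible, so a 'press' can be inferred when coordinates start arriving and a 'release' when they stop for a short while.

#include <cstdio>
#include <ctime>

// Hypothetical sketch: derive mouse-like press/drag/release events from a
// bare stream of laser coordinates.
struct LaserState {
    bool   down;      // are we currently treating the laser as "pressed"?
    float  x, y;      // last known laser position
    time_t lastSeen;  // when the last coordinate packet arrived
};

static LaserState laser = { false, 0.0f, 0.0f, 0 };

// Stand-ins for the application's existing mouse handlers.
void onPress(float x, float y)   { std::printf("press %.0f %.0f\n", x, y); }
void onDrag(float x, float y)    { std::printf("drag %.0f %.0f\n", x, y); }
void onRelease(float x, float y) { std::printf("release %.0f %.0f\n", x, y); }

// Called whenever a laser coordinate arrives over OSC.
void laserCoordinate(float x, float y) {
    laser.x = x;
    laser.y = y;
    laser.lastSeen = time(0);
    if (!laser.down) {
        laser.down = true;
        onPress(x, y);   // first packet after a gap acts like mouse down
    } else {
        onDrag(x, y);    // subsequent packets act like mouse motion
    }
}

// Called regularly from the main loop.
void laserPoll() {
    // No packets for about a second: treat it as the laser being lifted.
    if (laser.down && time(0) - laser.lastSeen > 1) {
        laser.down = false;
        onRelease(laser.x, laser.y);
    }
}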

We had a hard time trying to edit the code, as it became kind of disorganised with all the mouse events together in the same loop. However, at the end of the day, we managed to display the squares for rotation and rotate them.

07 Nov
Since we had managed to rotate and display the squares, we went on to edit the code for dragging and snapping. By lunch time, we had only managed to drag the images but not snap them. It was only after lunch that we found out the reason: the program never 'went through' the snapping code.
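
For illustration, snapping of this kind basically comes down to a distance check when a piece is released. A rough sketch follows; the threshold, struct and function names are made up, not our actual code.

#include <cmath>
#include <cstdio>

// Hypothetical sketch: when a dragged piece is released, snap it onto its
// target slot if it is close enough, otherwise leave it where it was dropped.
struct Piece {
    float x, y;              // current position
    float targetX, targetY;  // position of the slot it belongs to
};

bool snapIfClose(Piece &p, float threshold) {
    float dx = p.x - p.targetX;
    float dy = p.y - p.targetY;
    if (std::sqrt(dx * dx + dy * dy) <= threshold) {
        p.x = p.targetX;  // close enough: snap exactly onto the slot
        p.y = p.targetY;
        return true;
    }
    return false;         // too far away: no snap
}

int main() {
    Piece p = { 103.0f, 98.0f, 100.0f, 100.0f };
    if (snapIfClose(p, 10.0f))
        std::printf("snapped to (%.0f, %.0f)\n", p.x, p.y);
    return 0;
}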

Today was special too: we had our lunch at VivoCity as Kevin had to settle his visa.
And finally, at the end of the day, we had completed integrating the application with OSC. However, the application currently supports only a single user. Let's hope we are able to support multiple users by next week. Here is a video showcase of the application.


Reflection of the week:
This week, we learned more about Open Sound Control (OSC), which is used for the integration with lightdraw. Basically, the application listens on a port for packets containing the coordinates of the point where the laser is aimed, so that these can replace the mouse coordinates. The problem we encountered is that a laser has limited gestures, unlike the mouse, which has a left and right button. Also, OSC does not have separate events like the mouse up, mouse down and mouse click events that the mouse has. Hence we had to find ways to make the laser behave as if it were producing normal mouse events.
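
To give an idea of what 'listening on a port' looks like in code, here is a rough sketch using the liblo OSC library. Liblo is only my example choice; the post does not say which library lightdraw actually uses, and the "/laser/position" address and port 7770 are invented for this sketch.

#include <cstdio>
#include <unistd.h>
#include <lo/lo.h>

// Hypothetical sketch: receive laser coordinates over OSC and hand them to
// the application in place of mouse coordinates.
int laser_handler(const char *path, const char *types, lo_arg **argv,
                  int argc, lo_message msg, void *user_data)
{
    float x = argv[0]->f;  // laser x coordinate
    float y = argv[1]->f;  // laser y coordinate
    std::printf("laser at (%.2f, %.2f)\n", x, y);
    // The real application would update the cursor position here instead.
    return 0;
}

int main() {
    // Listen for OSC packets on UDP port 7770 (example value).
    lo_server_thread st = lo_server_thread_new("7770", NULL);
    lo_server_thread_add_method(st, "/laser/position", "ff",
                                laser_handler, NULL);
    lo_server_thread_start(st);

    // Keep the process alive while the server thread handles packets.
    while (true)
        sleep(1);
    return 0;
}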

Another thing is that we have become more used to coding with a text editor, thanks to the everyday 'training' we get in our lab. That is a skill that students of this generation lack because of their over-dependence on IDEs.
