Thursday, December 16, 2010

Final Exam: Matchmoving a Cellphone

Intro

One of the major reasons I decided to take this course was to learn how VFX artists in movies, videos, and other media successfully place CG elements among live action. Movies like Transformers, The Terminator, and many more show CG characters full of life interacting with live-action elements in a very realistic way.

It was to answer this question that my final project idea came to fruition. Of course, I did not set out to make a production-quality 3D action film. I knew from the very beginning that I wanted to do something extremely simple that showcased my ability to composite the elements.

After some initial research, it turned out that the technique artists use (at least one of them) is called match-moving. In match-moving, a live-action video is shot and certain points in it are followed by a tracking program. The program then calculates tracking information from those points and uses it to solve for a 3D camera. In other words, the program creates a 3D camera that animates almost exactly the same way as the real-world camera (of course it is an estimate, and only as accurate as the quality of the tracks). The software then creates a 3D scene (in Maya, in this case) with the virtual camera already imported and keyed. It is then up to the user to import any objects or scenes they want included in the final shot.

I was able to successfully do this for a smartphone resting on top of a live-action table.

Production
The program used for matchmoving is called MatchMover. MatchMover has been included in the Maya software package since version 2010. I learned the basics of the program using the help section under its menu bar.


-Filming base footage.
The first thing needed is to film the actual live-action shot where you want the compositing to take place. MatchMover provides very high-quality tracking, but much of that quality depends on the quality of the original footage. Therefore it is very important to have good footage with proper lighting and setup.

When filming my phone, I had a lot of trouble finding good track points because my phone is fairly uniform by design; its plain, simple surface has no natural features that make for good tracking points. You can see here that I added pieces of green construction paper to serve as the points I wanted to track, similar to green screening.

-Bringing it into Matchmover
MatchMover only works with image sequences, so I had to import the shot into After Effects and export it as a TIFF image sequence. Once that was done, I opened it in MatchMover via File > Load Image Sequence. To actually matchmove a 3D object, MatchMover let me either import a .obj file or create a primitive myself to serve as the object (I chose the former). At this point I was not working with my main model just yet; it is only necessary to import or create an object so that the tracking points can be set. (Note: it is very important for the object used at this stage to be roughly the same size and shape as the real model being replaced. For example, a cylinder should be used if you were trying to matchmove a model of a soda can.)
This is done by first switching MatchMover into 3D mode (clicking the "3D" icon toward the top-left corner), then right-clicking anywhere in the workspace and selecting "Import Scene."

-Setting up tracking points
Once my object was imported, I rotated it to an angle similar to the live-action object being replaced (in this case, my phone). Then I switched from light mode to full mode. To make a long story short, I connected a tracking point on each of the green construction-paper cutouts to the corresponding point (more or less) on the 3D object; there were seven in total. Once the points were set up, I used the Analyze command (similar to what we learned in Combustion) to run the track for each point (press Fn+F3 to do this). I also fixed tracks along the timeline by changing the pixel being tracked to a better, sharper one. MatchMover does an amazing job of providing smooth tracks, so you don't have to fine-tune much.
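To give a feel for what the tracker is doing on each of those construction-paper markers, here is a toy pattern tracker in Python/NumPy. This is not MatchMover's actual algorithm, just a minimal sketch of the idea: slide a small reference patch over the frame and pick the position with the lowest squared error, which is essentially how a point tracker follows a pixel pattern from frame to frame.

```python
import numpy as np

def track_pattern(frame, pattern):
    """Find the (row, col) where `pattern` best matches `frame` by
    sliding it over every position and scoring the squared error.
    A toy version of what a matchmoving tracker does per frame."""
    ph, pw = pattern.shape
    fh, fw = frame.shape
    best, best_pos = None, (0, 0)
    for r in range(fh - ph + 1):
        for c in range(fw - pw + 1):
            err = np.sum((frame[r:r + ph, c:c + pw] - pattern) ** 2)
            if best is None or err < best:
                best, best_pos = err, (r, c)
    return best_pos

# Tiny synthetic example: "place" a distinctive marker, then find it.
frame = np.zeros((20, 20))
pattern = np.arange(1.0, 10.0).reshape(3, 3)  # a distinctive 3x3 patch
frame[5:8, 7:10] = pattern                    # marker placed at (5, 7)
pos = track_pattern(frame, pattern)           # -> (5, 7)
```

A real tracker also refines to sub-pixel accuracy and searches only a small window around the last known position, but the matching idea is the same; it also explains why my plain phone surface tracked so badly: a uniform patch matches equally well everywhere.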

-Where the magic took place - solving for 3D camera
Once all of my tracks were analyzed and I was happy with them, it was time to solve for the 3D camera. I made sure to save the file first in case it didn't work properly. I selected 3D Tracking > Solve for Camera from the menu bar. If everything goes smoothly, the solver computes the rotation, translation, and scaling of the camera over the course of the shot, just as it would if it had really been there in live action. To see this clearly, I selected View > Lock on Camera to look through the camera's perspective. The process went smoothly for me, and the matchmoving was done.
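For the curious, the solve boils down to reprojection: for every frame, the solver searches for a rotation R and translation t such that a pinhole-camera projection of the 3D points lands on top of the 2D tracks. Here is a minimal sketch of that projection model in Python/NumPy (the camera values are made up for illustration, not taken from my project or from MatchMover):

```python
import numpy as np

def project(points3d, R, t, f):
    """Pinhole projection: move points into camera space with a
    rotation R and translation t, then divide by depth and scale by
    the focal length f. A camera solver searches for the R and t
    (per frame) that make these projections match the 2D tracks."""
    cam = points3d @ R.T + t             # world space -> camera space
    return f * cam[:, :2] / cam[:, 2:3]  # perspective divide

# Hypothetical camera: no rotation, pulled back 5 units, focal length 50.
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])
pts = np.array([[0.0, 0.0, 0.0],   # a point at the origin
                [1.0, 0.0, 0.0]])  # a point one unit to the right
uv = project(pts, R, t, f=50.0)    # -> [[0, 0], [10, 0]]
```

The gap between these projected positions and the tracked positions is the reprojection error, and "solving" means minimizing it over the whole shot. That is also why the solve is only as good as the tracks: noisy tracks pull the solution away from the true camera path.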

-Putting them all together
Finally, MatchMover let me export the project as a Maya scene. To my surprise, the scene it generated contained a keyframed camera with an image plane attached that played back the image sequence imported earlier. MatchMover also placed empty placeholders indicating where the model should go in order to look convincing (these were the seven tracking points I had created earlier). All I had to do was import my real model and line it up with the placeholders. I also had to scale it along different axes so that my virtual phone (positioned in front of the image plane) properly covered my real phone from the camera's perspective. This is why I mentioned the importance of using a stand-in object very similar in size.

And there you have it: my report on how this was made. If you would like to try this yourself, I highly recommend opening MatchMover and its Help > Contents menu. It is very simple to follow, and it is also short.

Tracking Project

For this assignment I wanted to replicate a cool composite video I saw on a tutorial website. The video depicted a midday sky with a large planet in full view: a classic sci-fi shot of a large neighboring planet as seen by a viewer on the ground.

Here are some of the images for this project. Tracking was used to move the planet, which I made myself in Photoshop (a PSD file), relative to the camera. I tracked the roof of one of the townhouses and attached the planet's position to that track. This is how I created the animation of the planet coming in from out of view.



This was done in After Effects.

2D composite

Tuesday, November 16, 2010

Particle Exercise

This particle exercise was done using Combustion. I made a total of four particle systems: grass meadow, firefly, comet, and stars. Here are some screenshots; all animations are defaults with some tweaks.


Tuesday, November 9, 2010

Intro

Hi, my name is Luis Herrera and I am a student at TCNJ, though not for much longer. As part of my Interactive Multimedia major, I have learned a lot of technical skills in planning, developing, and producing various multimedia projects. My strong points are 3D computer graphics and texturing (plus some animation). I am also fond of video production and editing. I enjoy VFX, especially for live-action video, and look forward to improving my skills in it.

Rotoscoping Video

Here I wanted to create a more cartoony rotoscoped animation of a tennis ball bounce, lasting about 30 frames. Each frame was done by hand.

This was done in Photoshop by painting over the layer on each frame.






Thursday, October 28, 2010

Midterm Production Log

Overall

The midterm project I created was aimed at showing green screen's ability to match a 3D scene and overlay live action inside it. The project I had in mind was a volcano crater with a magma pit. The scene depicts me walking over to the edge of the pit and peeking down at the lava, simulating the scene in The Lord of the Rings trilogy where the protagonist Frodo destroys the ring.

Production


My strategy was to create the scene with a real emphasis on lighting. I wanted to show dynamic lighting in which the moon's blue glow and the lava's yellow glow mix together. Although I was able to do this in the 3D scene, I was unable to do it in the live-action part. Even though this was my first official green screen project, I believe the lighting was not bad overall, though it should be noted that it was not great either. My strategy going into the project was to do my best in the preproduction and production stages so that post-production would be easier. Of course, things rarely work out the way one wants.

From the beginning, I put a lot of value in planning. This is why the storyboard became an important asset: I followed it in my cinematography. I also decided to create the 3D scene first, rightfully so, since I needed to know the timing of the shots. The trick with compositing live action and 3D is getting the timing right during the interaction. For instance, if I wanted to throw an animated ball, my hand and arm motions would have to match the speed at which the ball moves as I throw it; this is essential for a realistic effect. By rendering the 3D footage first, I was able to figure out how long each shot had to be to synchronize with the animated background.

After finishing the background movie, I shot the live-action footage. Because of limited green screen space, I planned the shots to be at 90 degrees to the camera; in other words, no diagonal shots were possible at all. This actually turned out well, as I shot from two angles, the back and the side, making compositing easy. The back shot was a bit tricky to create. Originally I intended to place the camera high above me at an angle facing my back, with me walking into view from off-screen. However, since I worked on the project by myself, I could not place the camera higher than my tripod allowed. What resulted was a shot fairly close to eye level, limiting the effect of moving in depth away from the camera; this was later solved in post-production. The second shot was a close-up side view, fairly simple and straightforward to film.

Post Production

Post-production was where the bulk of my work was done. The first order of business was to key out the green screen background. To do this I used the Keylight effect in AE and raised the tolerance a little (because the green screen was not perfectly even; there were highlights and shadows). The second stage was to use the alpha and "purify" the key, making sure my figure was pure white and the green screen pure black; this also included blurring the edges a little to scale the mask. The third stage was color correction: I color corrected the live-action footage to make myself fit the background, suppressing the blue and increasing the red and green to make yellow, and darkening the footage a bit as well. The final stage was to animate the footage so that it appears to come in from the bottom, heading toward the middle of the screen. Scaling it down for the depth effect was also necessary.
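The keying idea can be sketched in a few lines of Python/NumPy. This is not Keylight's actual algorithm (which is far more sophisticated), just an illustration: a pixel is keyed out when its green channel exceeds the other two by more than a threshold, and lowering that threshold admits the less-pure greens caused by highlights and shadows on the screen, loosely analogous to raising the screen tolerance.

```python
import numpy as np

def green_key(rgb, threshold=40):
    """Toy chroma key: return an alpha matte that is 0 (transparent)
    where a pixel reads as green screen and 255 (opaque) elsewhere.
    A pixel counts as green screen when its green channel exceeds
    both red and blue by more than `threshold`. Lowering the
    threshold keys out less-pure greens (screen shadows/highlights),
    loosely analogous to raising the tolerance in Keylight."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    greenness = g - np.maximum(r, b)   # how much green dominates
    return np.where(greenness > threshold, 0, 255).astype(np.uint8)

# Tiny example: one pure green-screen pixel, one "actor" pixel.
frame = np.array([[[0, 255, 0], [200, 120, 90]]], dtype=np.uint8)
matte = green_key(frame)               # -> [[0, 255]]
```

The "purify" stage from the post corresponds to pushing this matte toward pure black and white and softening its edges, so the composite has no grey fringe around the figure.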

The second shot, the side shot, was done exactly like the first back shot, with the exception of the animation. If you look closely, I programmed the 3D camera in the 3D scene to zoom in for this shot so that it reads convincingly as a close-up; this was a small effect I was particularly proud of. The main aspect of this shot was the glow. Originally I wanted light shining on my face as I peered into the lava pit. Because I had no lights available (I could not get into the lighting room), I resorted to mimicking the effect in AE, which was a lot harder and took much longer than I expected. After several failed attempts to alter the color hues using masks, I could not get a realistic-looking result until I came across the Glow effect, which proved the most favorable and was chosen for the shot.

My plan was to station the glow in one spot and have me "lean" under it, giving the effect of a directional light setting my face aglow. The first implementation was a bit tricky: I could not make the glow affect my face without also affecting the background. To prevent this, I duplicated the live-action layer and masked one copy so that the glow would affect only a small portion of my face; the original layer remained visible around the mask on the overlaid copy. This gave the effect of glowing only one part of my body.
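The duplicated-layer trick can also be sketched in Python/NumPy. This is only a rough stand-in for AE's Glow effect; the point is the masking, where the blurred "glow" is added back only where the mask is set, so it brightens the face region without touching the rest of the frame.

```python
import numpy as np

def box_blur(img, k=3):
    """Crude blur: average each pixel over a k x k neighbourhood
    (edges handled by repeating border pixels). A stand-in for the
    Gaussian blur a real glow would use."""
    pad = k // 2
    p = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(k):
        for dx in range(k):
            out += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def masked_glow(img, mask, strength=0.5):
    """Add a blurred copy of the image back onto itself, but only
    where `mask` is 1: the duplicated-layer trick from the post,
    brightening the masked region and leaving the rest untouched."""
    glow = box_blur(img) * strength
    return np.clip(img + glow * mask, 0, 255)

# Flat grey frame; glow only the centre pixel.
frame = np.full((5, 5), 100.0)
mask = np.zeros((5, 5))
mask[2, 2] = 1
lit = masked_glow(frame, mask)   # centre becomes 150, rest stays 100
```

In AE terms, `mask` is the mask drawn on the duplicated layer and `strength` plays the role of the glow intensity; animating the mask (or holding it still and "leaning" into it, as in the shot) moves the lit region across the face.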
The last planned shot was not filmed due to lack of time.

What worked

-green screen
-animation
-timing
-color correction
-Glow

What didn't work

-lighting
-camera angle

Final thoughts

I thoroughly enjoyed this project. I had been eager to do a green screen project for some time. I believe it came out quite well considering I had never done this before and had mostly my imagination to work with. Despite the lighting and camera challenges, I improvised and overcame them quite nicely. I learned how powerful compositing programs like AE and Combustion really are, because I was able to do so much with such simple shots. I really liked how the glow came out in the second shot, but what impressed me most was how well the footage was timed; this made my project believable by a generous stretch of the imagination. I also liked how well the compositing came out, although I could not achieve the blue-and-yellow light mixture I wanted (due to lack of equipment); the green screen really held its own for being pieces of crumpled-up green paper. I am happy with how this project turned out overall and look forward to my next one.