This was part of the project where we underestimated the time needed to create all the sounds and edit them together to fit the film. In a previous post I explained how we went about recording the individual sounds ready to be edited together.
This all had to be done after the final edit of the film was finished, as we were syncing the sound to the video in Soundtrack Pro. A nice feature of Soundtrack is that you can import the video into the program, which makes lining the sound up with the picture a lot easier. That said, there were still no shortcuts to editing the sound.
Because we had recorded the sounds with the robot's movements in mind, it was a lot easier to find the right sound for each movement. There was still some editing needed to make the sounds fit the movements better; this mainly consisted of changing the pitch and EQ of some sounds so they sounded more like something a robot of his size would make.
Another aspect that took a lot of time to edit was the volume of each clip. Normally when shooting a film we would record the sound at the same time as the footage, which helps when setting the levels because the sound is recorded at the same distance as the camera. For our film we had to work out what level the sounds should sit at in each clip depending on how close the camera is to the subject. We also had to work out how many different sounds there should be in each clip to make it all sound realistic.
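One rough rule of thumb for judging level against camera distance is the inverse square law: every doubling of the distance drops the level by about 6 dB. A minimal sketch of that calculation (the function name and reference distance are our own illustration, not taken from any audio tool):

```python
import math

def attenuation_db(reference_distance, camera_distance):
    """Rough level change (in dB) for a sound heard from camera_distance,
    relative to how loud it was when recorded at reference_distance.
    Inverse square law: roughly -6 dB per doubling of distance."""
    return -20.0 * math.log10(camera_distance / reference_distance)

# A sound recorded at 1 m, heard from a camera 4 m away:
# two doublings of distance, so roughly -12 dB.
print(round(attenuation_db(1.0, 4.0), 1))  # → -12.0
```

In practice we set the faders by ear, but a guide figure like this gives a consistent starting point across clips.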
In the final edit we ended up with 40 different tracks, all made up of the separate sounds that form the film's soundtrack. For each sound we had to set the pan so that it sat mostly central between left and right, and in some shots we tried to pan the sound to match the direction of the robot and other sounds in the clip. As I said previously, we underestimated the time needed to create and edit each sound, and ending up with such a large number of tracks also made it harder to balance each one correctly. Eventually we got the sound edited to the best of our ability. The biggest lesson was the time needed: the sound design could have been a project in itself, let alone animating and filming a short video as well.
Sound plays a huge part in the success of our short film; without it the robot would look completely out of place. From watching films such as Transformers, I, Robot and A.I., I found that without the sound of the robots moving, the CGI still looks stuck on top of the footage. Therefore we knew the same would be true of our film. As we had already filmed and animated most of our shots, we had a rough idea of what we wanted the robot to sound like and what other sounds were needed in the background.
With all this fresh in our heads we drove down to the road we filmed on in Pool. Armed with a microphone, we proceeded to record the sounds we needed, mostly the sea, birds, footsteps and car noises. The car noises were the hardest to record, mainly because of how loud the road was, but also because passing cars insisted on beeping their horns when they saw the microphone. Because of this we had to be a bit more covert with our recordings and sit in the car with the window open.
After spending most of the day in Pool recording many different sounds, we came back home to record some of the less natural sounds for the robot. These were the ones we really wanted to get right; they would make the film. We started with the noises the robot would make when walking: his legs, arms, head, etc. These sounds had to be quite electronic but also have quite a smooth movement to them, so we experimented with different objects such as a drill, an electric razor, a food blender and a toy helicopter.
For each shot we have made there are many stages that have to be completed in order to get the best results. The first stage is to turn the movie clip into an image sequence; this has to be done because Maya will only read TIFF image sequences, and if we need to motion track the scene, Match Mover will only read image sequences too. This was quite awkward at first as we had never used image sequences before. To create them we used a piece of software called MPEG Streamclip, which allowed us to select the length of clip we wanted and then export it in any format we wanted.
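For the software to pick the frames up as a single sequence, the files need a consistent zero-padded naming pattern (e.g. shot01.0001.tif, shot01.0002.tif, …). A small sketch of generating names in that pattern (the base name and padding width here are illustrative, not tied to MPEG Streamclip's output):

```python
def sequence_names(basename, ext, start, count, padding=4):
    """Generate zero-padded image sequence filenames, e.g. shot01.0001.tif,
    so the frames are detected as one continuous sequence."""
    return [f"{basename}.{frame:0{padding}d}.{ext}"
            for frame in range(start, start + count)]

print(sequence_names("shot01", "tif", 1, 3))
# → ['shot01.0001.tif', 'shot01.0002.tif', 'shot01.0003.tif']
```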
The next step in the process is to import the image sequence and the robot into Maya, then move the camera so that the base grid lines up with the floor of the image sequence. This has to be done so that when the robot is placed in the scene he looks as if he is standing on the floor of the clip. Once this is done we create polygon planes that fit the floor and any other part of the scene that the robot will interact with or that a shadow might be cast on, for example the floor, the top of a step and a side wall.
This makes the animation a lot easier to get right because there are floor planes to line the robot up with. So naturally the next step in the process is to animate the robot. Before starting any animation in the scenes we decided it would be best to animate a walk cycle and a run cycle so we could just import them when needed instead of animating them for every shot. This saved a lot of time in the animation process.
Once all the animation is done we have to put lights into the scene and adjust their settings so that the shadows look realistic. In order to have the shadows cast on the ground of the scene we apply a material called Use Background to the polygon planes. This material lets the polygon behave like any other object, but it becomes transparent when rendered, allowing the shadows to fall on the plane while the background shows through.
Next we add texture to the robot and smooth him. We found it best to smooth the robot twice so that all the edges are rounded and look correct. Smoothing changes how the light reacts on the robot and can alter the shadows in some cases, so we had to go back and check that the shadows still looked as we originally wanted them.
Because there is a lot of detail on the robot's casing, for example the seam down his side and the springs in his middle, a simple high-quality colour render didn't show the inner shadows on the robot as well as we were hoping. To get these shadows we found we had to render the whole robot with a separate material called Ambient Occlusion. This calculates the internal shadows and creates a mask of the robot that includes them; we combine both renders later in the process. When rendering the Ambient Occlusion pass there must be no other objects in the scene, such as lights and polygon planes, but we keep the camera and the animation.
Next is rendering and waiting…
Once both clips have finished rendering we import them into After Effects along with the movie clip image sequence, layering them with the movie clip on the bottom, then the colour render, then the Ambient Occlusion.
In doing this the colour version of the render is completely covered by the Ambient Occlusion, which is white and grey. We still want to keep this layer because it holds the internal shadows, so we add a blend mode to the Ambient Occlusion called Multiply, which blends it with the colour version, giving the final product realistic internal shadows as well as the original colour and floor shadows. The brightness and contrast can be adjusted on each clip if needed.
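The Multiply blend itself is just a per-pixel multiplication of the two layers, with channel values normalised to 0–1: where the occlusion pass is pure white the colour render passes through unchanged, and where it is grey the colour is darkened. A minimal sketch of the idea (plain Python, not After Effects' actual implementation):

```python
def multiply_blend(color, ao):
    """Multiply blend of a colour pixel by an ambient occlusion pixel.
    Channel values are floats in the 0.0-1.0 range."""
    return tuple(c * a for c, a in zip(color, ao))

white_ao = (1.0, 1.0, 1.0)    # unoccluded area: colour unchanged
crevice_ao = (0.5, 0.5, 0.5)  # occluded crevice: colour darkened
print(multiply_blend((1.0, 0.5, 0.25), white_ao))    # → (1.0, 0.5, 0.25)
print(multiply_blend((1.0, 0.5, 0.25), crevice_ao))  # → (0.5, 0.25, 0.125)
```

This is why the white areas of the occlusion pass leave the colour render untouched while the grey crevices darken it, which is exactly the internal shadowing we were after.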
In some of our shots we wanted the camera to move, which meant tracking the shot in Match Mover.
When we did some test tracks we set up the shot with easy tracking markers, which made it a lot easier to get a smooth and successful track. The only downside was that the markers were obviously placed in the scene and very visible, taking focus away from the main part of the scene: the robot.
Because of this we decided not to place tracking markers in the scene when filming, but to use the natural right angles and areas of high contrast that make good tracking points for the final track in Match Mover.
Unfortunately this didn't always work as well as we had hoped in some shots. So instead of creating a manual track, where we input the tracking points ourselves and adjust them to fit the scene perfectly, we decided to try Match Mover's Automatic Tracking function. This tended to work very well: it would find many tracking points and build up a good spread throughout the shot. In some cases, though, it was more of a hindrance than a benefit, mainly because it would track points we didn't want tracked, such as people, cars and other objects moving in the background, anything that is not constantly static in the shot.

A lot of the time this would throw off the track and change the whole geometry of the scene, making it jump around. Obviously this couldn't happen once the robot was in the scene, because he wouldn't sit right and would slide around without any animation. To stop this we decided to keep using the automatic track but edit the tracking points in Match Mover. We started with another automatic setting in the program that cleaned up the tracking points, getting rid of ones that were too short or jumped around too much. After that we went through each track point by hand, found the ones that were tracked to moving objects in the background, people and cars, and deleted them. Once this was done we ended up with a solid track and a scene we could start animating in.
The shot above is a screenshot of the track after we cleaned it up and removed most of the unwanted tracking points.
The shot above is another screenshot of the same scene, but before I did any cleaning up of the track. As you can see there are a lot more tracking points in the scene, and far fewer of them are green, which marks the highest-quality tracks. All of these small errors help to throw the track off and ruin the animation we place in it.
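The automatic cleanup pass described above, dropping tracks that are too short-lived or that jump too far between frames, can be sketched like this (the thresholds, data layout and function name are our own illustration, not Match Mover's actual implementation):

```python
def clean_tracks(tracks, min_length=10, max_jump=15.0):
    """Keep only tracking points that look like static scenery.
    Each track is a list of (x, y) pixel positions, one per frame.
    Drops tracks that are too short or that jump too far in one frame
    (likely passing cars, people, or simply bad tracks)."""
    kept = []
    for track in tracks:
        if len(track) < min_length:
            continue  # too short-lived to be reliable
        jumps = [((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
                 for (x1, y1), (x2, y2) in zip(track, track[1:])]
        if jumps and max(jumps) > max_jump:
            continue  # moved too far in one frame: probably not static
        kept.append(track)
    return kept

static = [(100 + i * 0.5, 200.0) for i in range(30)]       # steady drift
passing_car = [(50 + i * 40.0, 300.0) for i in range(30)]  # fast mover
print(len(clean_tracks([static, passing_car])))  # → 1 (only the static point)
```

The manual pass is then only needed for the leftovers this kind of filter can't catch, such as slow-moving people.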
The final step before rendering the sequence out of Maya is to smooth the model. This rounds off every part of the robot, making him look a lot more realistic.
With the texture added and the edges smoothed, the robot and his animation are ready to be rendered out and taken into After Effects to be colour corrected.
Adding texture to the robot is one of the final stages of the animation process, and is also by far the most satisfying! When animating we have to use a low-poly version of the robot, which makes it a lot quicker to animate and to render out test shots. We have also found that it stops the software from crashing (which it does a lot!).
So once the animation is complete we get to have fun and make the robot look like a real robot rather than an edgy grey object moving on the screen. To add texture we use a rendering window called Hypershade, which lets us select a texture, edit it and then assign it to a specific selected object.
For our robot we chose three different textures: one for the main body (white), one for the links and moving parts (black metallic), and another for the inside of the eyes (turquoise), to which we assigned a slight glow to give them a light-like appearance.
To make it easier to get the same texture settings in every scene, we exported the Hypershade settings, which can be imported at any time and assigned to the robot in each individual scene.