When starting your first computer animation, it can be hard to see which tools to use for which purpose and how those tools are integrated into the complete workflow. Beyond that, a general understanding of the workflows for creating the various assets inside the different tools is helpful as well.
A great thing about university projects compared to personal work is that there is a fixed deadline. The following chart visualizes the different tasks of the project and the time slots in which they should be done.
Note that all projects are different so that the chart may only apply roughly to yours.
- Crash Course – canceled due to corona
- Treatment – submission in week two
- Tutorials – first four weeks should be spent mostly on learning the software
- Animatic – if your scene consists of a lot of objects, create a layout with basic geometry to test your cameras and object placement
- Modeling / Animation / Shading / Lighting – depending on your scene
- Rendering – start with the first test renderings early to see if there are problems in your scene
- Sound / Mastering – add sound, intro, credits and effects in post
Blender will be the main tool for production. It is used to create almost the whole scene, with objects, lights, animations, simulations and cameras. If there is no need for fine self-made image textures, even all of the materials can be done here.
If custom textures are desired, parts of the scene can be exported from Blender and then imported into Substance Painter using the *.fbx file format. Substance Painter is a powerful tool, with a lot of great predefined materials, that makes it easy to create very complex and good-looking textures. The final textures can then be saved as bitmaps (*.png, *.jpg, …) to be used on the object in Blender.
After the complete scene is modeled, shaded and animated, and all cameras and lights are in place, RenderPal is used to manage the clients in the render farm to efficiently render all frames as single images (*.png, *.exr) onto the server.
Single images are used because it is not predictable which client renders which frame at which time, so writing a video file directly is practically impossible. Another reason for single images is to avoid any form of compression at this point.
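Because the farm writes each frame as a separate numbered image, it is worth checking the output folder for gaps before mastering. A minimal sketch of such a check — the filename pattern `frame_0001.png` and the helper name are illustrative assumptions, not part of any tool mentioned above:

```python
import re

def missing_frames(filenames, first, last):
    """Return frame numbers in [first, last] with no matching rendered file.

    Expects filenames like 'frame_0042.png' or 'frame_0042.exr'.
    """
    pattern = re.compile(r"frame_(\d+)\.(?:png|exr)$")
    rendered = set()
    for name in filenames:
        match = pattern.search(name)
        if match:
            rendered.add(int(match.group(1)))
    return [n for n in range(first, last + 1) if n not in rendered]

# Example: frames 1-5 expected, frame 3 was never delivered by the farm.
files = ["frame_0001.png", "frame_0002.png", "frame_0004.png", "frame_0005.png"]
print(missing_frames(files, 1, 5))  # -> [3]
```

Rendering the missing frames again is much cheaper than discovering the gap while cutting the footage.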
The last step in the process is to cut the footage into the right order, to add sound and to create the final video file. This is done in DaVinci Resolve (or directly in Blender if you like).
Note that game engines are only mentioned here to show the corresponding pipeline for using assets in a game.
Note that the order of steps in the following list can differ. For example, it’s best to add a basic light and camera setup early on to have more control over the look and feel of the objects. But keep in mind that there are some dependencies and restrictions between the different tasks, mostly concerning the geometry. For unwrapping and animation the geometry should be finished, as it can take a tremendous amount of work to change or remodel an already textured/rigged/animated object.
- Start with a basic object
- Use operations to add geometry and/or add new basic objects to create the desired form (an object in the scene can consist of several Blender objects)
- Use modifiers (avoid applying them unless absolutely necessary)
- Shade smooth and define hard edges
- Unwrap the object when done modeling
- Add shaders and textures
- Create and attach a bone structure for animation (if object will be animated)
- Animate object
- Place all objects in the scene
- Add lights and cameras
- Set up the render engine and render the scene
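Inside Blender, the steps above can also be driven from the scripting workspace. The following is a rough sketch using Blender’s Python API (`bpy`) — it only runs inside Blender, and all names, locations and settings are illustrative assumptions, not a recipe for your scene:

```python
import bpy  # Blender's built-in Python API; only available inside Blender

# Start with a basic object and add a modifier (kept live, not applied)
bpy.ops.mesh.primitive_cube_add(size=2)
obj = bpy.context.object
obj.modifiers.new(name="Subdivision", type='SUBSURF')

# Shade smooth, then unwrap once the modeling is finished
bpy.ops.object.shade_smooth()
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.uv.smart_project()  # quick automatic unwrap; manual seams give more control
bpy.ops.object.mode_set(mode='OBJECT')

# Add a simple node-based material (editable later in the Shader Editor)
mat = bpy.data.materials.new(name="ExampleMaterial")
mat.use_nodes = True
obj.data.materials.append(mat)

# Basic light and camera setup
bpy.ops.object.light_add(type='SUN', location=(4, -4, 6))
bpy.ops.object.camera_add(location=(0, -8, 4), rotation=(1.1, 0, 0))
bpy.context.scene.camera = bpy.context.object

# Set up the render engine and render the animation as single images
scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.render.image_settings.file_format = 'PNG'
scene.render.filepath = "//frames/frame_"
bpy.ops.render.render(animation=True)
```

In a real project most of these steps are done interactively in the viewport; the script simply mirrors the order of the list above.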
Substance Painter is a lot like Photoshop for 3D models, with many predefined materials already on board. It also has the capability to bake high-poly structures into bitmap textures to be used on low-poly objects. Very handy for game makers, but not necessary for offline-rendered animations.
- Import one or more objects with correctly unwrapped UVs
- Bake object data into textures
- Use materials, filters, alphas etc. with the layer stack to paint the object
- Export the bitmaps to be used in a shader