Pre-Visualisation for Production Planning and Motion Control
The live-action film and TV industry has been slow to embrace digital convergence. Despite the onset of digital broadcasting, most production methods in use today look little different from those of the 1960s. Post-production and animation houses have adopted new technology and are reaping the benefits in advances in animation and special effects. However, their developments are proprietary, solving only the problems they themselves perceive. There is little holistic, far-sighted development of the kind that would ensure the competitiveness of the whole industry on a global scale.
The project aims to provide a software framework that enables improved resource planning, workflow and creativity throughout the complete production process for the whole industry. The consortium supporting the project consists of key industry players.
The most important component of the project is the development of a coherent software toolkit (API) for pre-visualisation planning and the control of motion systems. This will enable rapid development of new software to meet industry needs and will facilitate interoperability. At its simplest, the API gives control over specific motion-control rigs. At its most ambitious, it enables the motorisation and control of vastly more complicated set-ups, such as motorised dollies and cranes, which have radically different motion profiles.
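A common interface of this kind might let the same planning code drive rigs with very different motion profiles. The sketch below is purely illustrative: the class and method names (Rig, TrackDolly, Crane, move_to) are assumptions for this example, not part of the actual API.

```python
from abc import ABC, abstractmethod


class Rig(ABC):
    """Common interface a pre-visualisation tool could program against."""

    @abstractmethod
    def move_to(self, x: float, y: float, z: float) -> tuple:
        """Command the camera to a position; return the pose actually reached."""


class TrackDolly(Rig):
    """A dolly constrained to a straight track along the x axis."""

    def __init__(self):
        self.position = (0.0, 0.0, 0.0)

    def move_to(self, x, y, z):
        # A dolly can only travel along its track: y and z are ignored.
        self.position = (x, self.position[1], self.position[2])
        return self.position


class Crane(Rig):
    """A crane with full 3-axis reach but a limited arm length."""

    ARM_REACH = 5.0

    def __init__(self):
        self.position = (0.0, 0.0, 0.0)

    def move_to(self, x, y, z):
        # Clamp the request to the sphere the crane arm can reach.
        r = (x * x + y * y + z * z) ** 0.5
        if r > self.ARM_REACH:
            scale = self.ARM_REACH / r
            x, y, z = x * scale, y * scale, z * scale
        self.position = (x, y, z)
        return self.position


# The same planning loop drives both rigs, despite their different profiles.
for rig in (TrackDolly(), Crane()):
    pose = rig.move_to(6.0, 8.0, 0.0)
```

The point of the abstraction is that each rig reports the pose it can actually reach, so a planner can detect where a requested move exceeds a rig's capabilities.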
It is recognised that there is great value in being able to visually understand a process before committing resources to it. The pre-visualisation component of the project addresses this with 3D virtual-environment software that will enable users to download, over the Internet, calibrations and characteristics of studios world-wide and to create virtual sets within the chosen production environment. It will allow them to manipulate a choice of professional-standard motion and lighting equipment within that set and to experiment to achieve maximum performance and creativity. It will also provide workflow information. Ultimately, it will be used to direct automated motion equipment remotely, both in real time and off-line, in both the studio and live-action sectors.
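To make the idea of downloadable studio calibrations concrete, the sketch below shows what such a record might contain and how a planner could check that equipment fits within the studio volume. The JSON field names and the studio figures are invented for illustration; no real schema is implied.

```python
import json

# A hypothetical calibration record as it might be downloaded for a studio.
# All field names and values here are assumptions, not a real schema.
calibration_json = """
{
  "studio": "Stage 4",
  "floor_size_m": [30.0, 20.0],
  "grid_height_m": 12.0
}
"""


def fits_in_studio(calib: dict, x: float, y: float, height: float) -> bool:
    """Check that equipment placed at (x, y) with the given height stays
    inside the studio volume described by the calibration record."""
    width, depth = calib["floor_size_m"]
    return 0.0 <= x <= width and 0.0 <= y <= depth \
        and height <= calib["grid_height_m"]


calib = json.loads(calibration_json)
ok = fits_in_studio(calib, 10.0, 5.0, 4.0)   # inside the studio volume
bad = fits_in_studio(calib, 35.0, 5.0, 4.0)  # beyond the studio floor
```

A real calibration would also carry rig mount points, lens data and lighting-grid detail, but even this minimal bounds check illustrates how planning decisions can be made against downloaded studio data before anyone enters the building.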
The research and development will build upon the API framework and on our current research into virtual environments and discontinuity meshing, which should provide effective detail for planning purposes with reduced processing overheads; the savings will be used to support improved algorithms for equipment manipulation.
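One way to see where the saving comes from: discontinuity meshing subdivides geometry only where illumination actually changes abruptly, leaving the rest of the scene coarsely meshed. The sketch below is a deliberately simplified 2D instance, assuming a single point light and one occluding edge; it finds the floor positions where the shadow boundary falls, which is where a planning mesh would need refinement.

```python
def shadow_discontinuities(light, occluder, floor_y=0.0):
    """Project the endpoints of an occluding segment from a point light
    onto a horizontal floor line, returning the x positions where the
    illumination is discontinuous (a minimal 2D sketch of the idea)."""
    lx, ly = light
    points = []
    for ox, oy in occluder:
        # Ray from the light through the occluder endpoint, hit the floor.
        t = (floor_y - ly) / (oy - ly)
        points.append(lx + t * (ox - lx))
    return sorted(points)


# A light above and left of a horizontal occluding edge.
d0, d1 = shadow_discontinuities((0.0, 10.0), [(1.0, 5.0), (3.0, 5.0)])
# The planning mesh only needs refinement between d0 and d1; everywhere
# else a coarse mesh suffices, which is the source of the processing saving.
```

The full 3D technique handles area lights and penumbra boundaries, but the principle is the same: spend detail only at the discontinuities.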
Motion control is in its infancy but is crucial to the animation and special effects the industry requires. It provides the accuracy and repeatability essential for producing special effects; a rig is, in effect, a robot with a camera for an eye. We intend to explore the application of calibration and move correction to enhance motion-rig performance, and to define a vision system that detects errors in rig movement and corrects them dynamically. We will also apply this research to 3D move matching, so that the pre-visualisation system knows where live-action equipment is at all times.
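The dynamic-correction idea can be sketched as a simple closed loop: compare the pose the vision system measures against the planned pose, and nudge the next motor command by a fraction of the error. Everything below is illustrative, assuming a rig that consistently undershoots and an arbitrary proportional gain, not a tuned controller.

```python
def correction_step(command, planned, measured, gain=0.5):
    """Adjust the next motor command by a fraction of the error between
    the planned pose and the pose the vision system measured.
    The gain of 0.5 is an illustrative choice, not a tuned value."""
    return tuple(c + gain * (p - m)
                 for c, p, m in zip(command, planned, measured))


def rig_response(command):
    """Stand-in for a real rig that undershoots every axis by 10%."""
    return tuple(0.9 * c for c in command)


planned = (10.0, 0.0, 2.0)
command = planned
for _ in range(20):
    measured = rig_response(command)        # vision-system feedback
    command = correction_step(command, planned, measured)
# After a few iterations the measured pose converges on the planned pose,
# even though the rig's raw response is systematically wrong.
```

In a real system the "measured" pose would come from optical tracking of the rig, and the same measurements would feed 3D move matching, since both problems reduce to knowing where the equipment actually is.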

