CHANGES ARE COMING TO THE FILM world, changes that will leave a long-lasting impact and that the film industry is largely unaware of yet.
Why talk about the film industry in a game professionals’ magazine? Because these changes are originating from the game world and will leave an indelible mark on how both games and films are made in the next decade.
WHERE IT STARTED
Two years ago, a new technology initiative got under way at LucasArts. The goal was a robust internal development engine for the next generation of LucasArts games. LucasArts marched out into the world on a mission: to assess the state of the industry's technology and find the best solutions for internal development.
We assessed middleware engines. We assessed proprietary engines. We talked to development teams in the U.S., Europe, and Asia. In the end, the solution came from a somewhat surprising source: our own backyard.
As it turned out, Industrial Light & Magic (ILM), a Lucasfilm company, had long been working with a proprietary tool called Zeno. Zeno helps ILM build and populate digital stages for its films. It contains texture controls, particle generators, and systems for digital actors. Its one drawback is that it's rooted in offline rendering: each frame is rendered individually in a time-consuming process, so making changes is slow going.
In time, LucasArts brought to the table the concept of runtime viewing. Two years into the collaborative development of Zeno, LucasArts is well on its way toward its original goal: a robust game engine populated with ILM's computer-effects wizardry. ILM, meanwhile, is gaining a robust real-time pre-visualization tool that saves time and money and aids pre-production in the creation of advanced animatics. Both LucasArts' game developers and ILM's film teams now use the same technology, with standardized terminology, a standardized workflow pipeline, and standardized tool sets.
WHERE IT’S GOING
The environment we've been working in for the past two years is one in which many game developers may find themselves in the near future. With standardization and the cross-pollination of tools (which puts game and film design on an even technological playing field), assets can be tossed back and forth between teams instantaneously throughout production. The digitization of filmmaking and visual effects lets game development teams build levels out of scenes the film team is still working on, while effects artists, texture artists, and animators can service the design needs of both teams simultaneously.
But how does this affect audio? The convergence of film and game authoring technology has different implications for audio professionals than for our graphics-centric brethren. Film audio exists almost exclusively in post-production and will most likely always be that way; game sound is rarely a linear experience. The notion that audio simply comes in at the end of a film project, however, is misleading. As budgets balloon, the film world is placing ever-increasing importance on detailed and sophisticated pre-visualizations.
The audio capabilities of current pre-visualization tools are often extraordinarily basic, if implemented at all. Usually, if a pre-visualization program has any audio functionality, it's the ability to play back one or two linear tracks of audio.
ILM began using Zeno's pre-visualization tool with that kind of limited functionality. But the group then collaborated with the same programming staff that was creating the audio tool sets for LucasArts' audio engine, and as a result the pre-visualization tool's audio capabilities are now far more robust. It even shares key commands, UI design, and workflow conventions with the game audio engine.
At its heart, converging technology is about efficiency, economy, and expanded professional opportunity. What's critical to remember is that, while film audio and game audio are often separate beasts, the audio pros who create content for each often aren't.
When industry tool sets do merge, the big question for audio professionals will be whether they can easily hop between film and game projects without having to learn new software. If they can, game audio professionals will find themselves available for pre-production work on films, rather than simply coming onto a project late in a post-production role.
Film audio professionals, in turn, will find themselves versed in software that lets them step out of the world of linear audio and into the world of interactive audio. All audio professionals will find themselves with a wider pool of job opportunities, ones previously closed to them by the different skill sets each industry required.
While bi-media development environments are currently confined to the realm of proprietary software, commercially available options are inevitable. It won't be long before dual-use development engines become industry mainstays alongside the more specialized programs already common to both film and game development, such as Maya, the Havok physics engine, and Pro Tools.