The future of animation is now! Far from offering just fast renders, real-time technology is driving a revolution in collaborative workflows, from which projects like Ada and Christmas with the Moonies have emerged. Now Blue Zoo has opened a dedicated Real-time department with Head of Real-Time, Phil Stewart, and Unreal Technical Artist, Jack Sainsbury (who is now based in Canada!). Curious how an international Real-time department works? We asked Phil and Jack a few questions about just that...
How did the new Real-time department get started?
Phil: Blue Zoo have been experimenting with Unreal Engine for a while now - we’ve done a couple of short films using it, ‘Ada’ and ‘Christmas with the Moonies’ - and the tech is getting so good, and the potential gains so large, that it was decided it was worth spending resources pushing it forwards and seeing how well it could work in production. I was in Pipeline at the time and had created our Unreal pipeline and tools, so I got asked to set up the new department.
Jack: I was originally employed as an Unreal Technical Artist to work on ‘Ada’ and subsequently ‘Christmas with the Moonies’. Seeing the potential of the engine to speed up aspects of production, I began working with the Services department, where advertising and client work is handled and relies on much faster turnaround and iteration times. With the ongoing success of these projects, a Real-time department was established, and it’s now entering a phase where it’s being considered for larger-scale production.
Christmas with the Moonies dir. by Will Cook
What makes Real-time animation different?
Phil: The 3D studio at Blue Zoo uses Redshift to render their frames, which is actually one of the fastest offline renderers on the market, but it still takes between 1 and 40 minutes to render a frame - our lighting artists are very used to sending a shot off and having to wait until the following morning to get the results back. Real-time renderers (such as Unreal Engine) are designed to process 60 frames a second - what you’d see in a computer game. When we use it we’re pushing the renderer to its limits, so we actually only get around 20 frames per second (or less), but this is still hugely fast compared to more traditional methods. It means lighting artists can test dozens of versions per hour instead of waiting overnight to get a single version back. More iterations mean more beautiful compositions, faster fixes and happier lives.
Jack: Iteration time is the most significant improvement compared to standard rendering - a Redshift render can spend hours performing the lighting calculations, while with Unreal, saving the rendered frame to disk actually takes longer than the rendering itself. We’re also looking into collaborative workflows, using source control to streamline multiple people working on the same scene, and per-show post-effects and materials libraries, all built within Unreal.
How and why did you personally start working with Real-time software?
Phil: My masters was in ‘Animation for Computer Games’, so I’ve always known my way around game engines as well as other animation software. When Blue Zoo started to look into Unreal Engine I set up the pipeline and worked on the first projects here, so I got more and more familiar with it. I’ve also been a keen follower of emerging technology my entire life, so the other aspects of real-time - like motion capture, VR etc. - were something I was interested in and wanted to develop for the company.
Jack: I’ve been interested in real-time graphics for a long time, and ended up focusing a lot of my university units on projects developed in game engines. Their potential for use in offline production has always been discussed, but it’s only just becoming viable thanks to the focus from companies like Epic Games on RTX features and on bringing game engines up to an industry level for use as compositing and lighting tools.
Where did you study?
Phil: My undergrad (Computing Visualisation) and masters (Animation for Computer Games) were at Sheffield Hallam, but I also spent a year in Germany during my placement year from uni, studying and then working for a small animation company in Reutlingen. At the start of my career I did a few internships, which really helped me prepare and find a place in the industry.
Jack: I studied at Bournemouth (NCCA - Computer Visualization & Animation).
What’s a typical day for you in the Real-time department?
Phil: This varies hugely - recently I’ve been working on an advert for the commercials department, which involved working in Unreal Engine setting up the shots, lighting, FX, all sorts. Other times I’m testing Blender and new Unreal versions, or experimenting with new animation techniques - like motion capture using a Kinect and an iPhone.
Ada dir. by Dane Winn
What do you think is in store for the future of Real-time animation?
Phil: With the way computer hardware is improving, it’s easy to see that real-time renderers will become the industry standard within the next decade, so it’s vital we stay on top of this tech wave to remain competitive in the future. And with the entry cost of motion capture dropping and the development of virtual production, animation techniques could easily change.
Jack: I think the future looks very promising - there’s already discussion about GPU render farms designed for running raytraced renders, and optimistically within the next 6-7 years we’ll see a paradigm shift from some of the larger companies in the industry. Alongside this rendering transition, collaborative work environments, VR and virtual production are already starting to be explored, which could completely change the way we develop CG, bringing the virtual world closer to a traditional film set.
Now that Jack lives in Canada, how does the department work?
Phil: Ha - we’re still working this out, but it’s going well so far. We stay in touch over chat and VoIP calls, and Jack can log into his machine here using remote desktop, or use his home machine and push files up and down via our version control software - depending on what project we’re working on.
Jack: The time difference allows for 3-4 hours of overlap each day, during which we stay in contact over chat, and I’ve been video calling in for meetings. I’ve been using remote desktop to access files on the system.
Do you have any advice for anyone who wants to get into Real-time animation?
Phil: It’s the future, so it’s definitely worth getting on board - Unreal Engine is free to download and experiment with, Blender is too, and there are plenty of other ways to learn.
Jack: The skills you’d need for offline production roles are very much transferable. The industry is constantly changing and I believe it’s essential to keep learning with it - the split between games and film/TV is becoming narrower and narrower. Get started with Unreal, Unity or Blender, or even just follow real-time developments at SIGGRAPH or the NVIDIA conferences. Get out there and get learning - it’s the way the work is going.
A 3D scan of Phil and Jack from when they were testing the Kinect