
Digital Domain Flexes its New ML Cloth System for DC’s Blue Beetle

Digital Domain talks about visualising the DC superhero Blue Beetle and creating his intelligent, powerful suit, using the team’s machine learning cloth tool in a feature film for the first time.

[Image: Blue Beetle final comp, shot 1066]

As lead VFX vendor, Digital Domain contributed nearly 700 shots across more than six sequences in the new superhero film Blue Beetle, and was responsible for the Blue Beetle character himself. The team also created the asset for the villain Carapax, the epic final battle scene, several environments and other work. However, because Blue Beetle is the star, and the focus of viewers’ eyes in every shot he appears in, creating his 3D asset called for creativity and storytelling, as well as carefully coordinated effects from several different teams at Digital Domain.

Led by VFX Supervisor Jay Barton, they started this part of the project with inspiration from sources like the DC comic books, animated cartoons and video games. Working from a scan and photography of the practical suit the actor wore on set, Digital Domain created a precise digital replica. From there, they incorporated animatable suit extensions that built out the costume with more detail, complexity and personality.

With all of these elements in place, they were ready to enhance the practical suit as necessary for each sequence, or completely replace parts or all of it for action that couldn’t be captured in camera, such as flying and fight sequences.

Design Language

From the outset, the Blue Beetle asset was a challenge. The production’s vision for the character was absolutely photoreal, and in only a few shots is the hero Jaime Reyes simply wearing his practical suit powered down, without any augmentation – most shots either had CG additions or a complete CG replacement. “The look of the costume and its extensions had to be indistinguishable from one shot to the next,” Jay said. “But even so, the suit’s looks and the way Blue Beetle’s weapons are deployed and used walk the fine line separating the photoreal and physically accurate from the fantastical and surprising. In that respect, he was always a really fun character to work on!”

The practical suit that the team based their work on was very intricately designed in its details and material finishes. Organic, other-worldly textures help to reinforce the alien aspect of its origin. “Working within the same design language, we created CG suit extensions and FX that supported this organic alien technology. Our work included animation techniques to help reinforce the story point that, at first, all of that technological detail is a surprise to Jaime when it suddenly deploys, or does things without his participation. But by the end, of course, Jaime and the suit are working closely together to do bigger, more powerful things.”

[Images: Blue Beetle shot 1023, final comp and plate QC]

In-Camera, In Post – or Both

Determining exactly when in the story the suit would have to be replaced, partly or fully, with digital elements was a continuous process. As on most movies, the initial plan was to capture as much in camera as possible. “This is actually fantastic, since everyone on set is involved in trying to make it as great as it can be on the day, although the realities of filming mean that sometimes stunt rigging can affect the suit so much that it becomes unusable for the final picture. Obviously, what an actor can physically do can also be a limitation when the character should be extremely powerful,” Jay said.

“Still, just trying these things out on set gives us great reference into the intent of the shot. It can be an organic process during the shoot and during the editorial process to figure out which shots worked as intended and which ones needed replacement. Of course, since Blue Beetle has extraordinary powers, many sequences were started and completed as entirely CG shots.”

[Image: Blue Beetle final comp, shot 1233]

Overall, the Art Department undertook a lot of exploration to create designs that, while unique, were still instantly recognisable as belonging in Blue Beetle’s world. When it came to weapons, for instance, a couple of them resembled their comic book counterparts very closely, but the rest followed the originals in concept only. “We had a lot of fun with some of these weapons and could be creative with how they deployed and their shapes, which were true to the essence of the comics but fit this new design aesthetic.”

Digi-Double

Various moments arose during filmmaking that made having a CG version of Blue Beetle’s actor, Xolo Maridueña, very useful. The model was built from scans and photographs provided by production, and was used for Blue Beetle action shots when he isn’t wearing his helmet, and for transition shots from Jaime to Blue Beetle. An alternate Blue Beetle model, based on the stunt performer’s body, was also created for some of the hand-to-hand stunt shots.

[Images: Blue Beetle shot 1244, final comp and plate QC]

“It needed to be accurate and precise, and indistinguishable from our actor, but in fact, the animators had fun with our full-CG Blue Beetle, especially early in his transformation. The director Ángel Manuel Soto really wanted everyone to feel that Jaime was being taken on a ride with little to no control of the suit’s actions. We worked to create shots that sold the power of the suit itself, combined with the whimsical nature of a scared but excited kid being taken on the ride of his life.”

Expressive Extensions

The animated suit extensions needed to be expressive and creative, which made developing a consistent approach for the work a challenge. Jay said, “While there was a framework and consistent technique for each of these, the feeling between shots could be quite different. Our animation supervisor Arda Uysal and FX supervisor Eddie Smith worked together to come up with unique variations that helped sell the emotion of these elements, depending on how they fit into the story. Sometimes they are created to surprise us, sometimes out of desperation and, in the end, with a very deliberate focus.”

The self-illumination of various elements on the suit was also important, and the team approached it by keeping it as physically accurate as possible. Because it pops more at night and in dark scenes, it not only becomes a very prominent graphical element in the frame, it is also essential for helping the audience read Blue Beetle’s actions, since his suit is otherwise so dark.

“The suit’s eyes have a complex inner structure, like an illuminated alien camera, and their outer surface is reflective, causing them to look quite different from shot to shot. Nevertheless, they are such a large part of the character’s personality that we took time to do a lot of adjustment to make them feel consistent.”

The Costume and ML Cloth

Creating a realistic tactile quality for Blue Beetle’s costume was a special task in itself that the Digital Domain team tackled by developing a cloth system that uses machine learning and a custom neural network to learn how cloth should react to movement. In this case, the ML Cloth system learned how Blue Beetle's rubber-like suit should fold and wrinkle with the superhero's actions.

The traditional methods used for such a material are time-consuming and expensive, and run as a separate phase in the pipeline. With this new tool, the team was able to see the cloth simulation in the same program their animators regularly use, and have it all work in near real-time, giving the animators immediate feedback for their decisions and allowing supervisors to review the results quickly.
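
To make that concrete, here is a minimal sketch of the idea, assuming a small PyTorch MLP. The names, sizes and architecture are illustrative placeholders, not Digital Domain’s actual ML Cloth implementation; the point is that each frame becomes a single network forward pass rather than a physics solve, which is what makes interactive feedback possible.

```python
# Minimal sketch of a learned pose-to-cloth deformer (hypothetical
# names and sizes; not Digital Domain's actual architecture).
import torch
import torch.nn as nn

NUM_JOINTS = 60        # joints in the rig pose (placeholder)
NUM_VERTICES = 4_000   # vertices on the costume mesh (placeholder)

class ClothDeformer(nn.Module):
    """Maps a flattened pose vector to per-vertex cloth offsets."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_JOINTS * 6, 512),    # 6 rotation values per joint
            nn.ReLU(),
            nn.Linear(512, 512),
            nn.ReLU(),
            nn.Linear(512, NUM_VERTICES * 3),  # xyz offset per vertex
        )

    def forward(self, pose: torch.Tensor) -> torch.Tensor:
        return self.net(pose).view(-1, NUM_VERTICES, 3)

deformer = ClothDeformer().eval()

# Evaluating a frame is one forward pass -- no time stepping and no
# collision solve -- so it can keep up with interactive playback.
with torch.no_grad():
    pose = torch.randn(1, NUM_JOINTS * 6)  # stand-in for a real rig pose
    offsets = deformer(pose)               # shape: (1, NUM_VERTICES, 3)
    # The rig would add these offsets to the skinned rest positions.
```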

The team originally planned to use the tool only on midground or distant shots, but after seeing its accuracy and speed, even on close-ups, they used it throughout the film. The tool was initially tested on Marvel Studios’ She-Hulk: Attorney at Law, but Blue Beetle marks the first time it was used on a feature film.

Cloth with a Memory

John-Mark Gibbons, an Associate Lead Software Engineer at Digital Domain, is currently leading the development of ML Cloth. He talked about how the tight-fitting nature of Blue Beetle’s costume made it a good candidate for ML Cloth. “Although Blue Beetle’s costume is complex, with different materials and small details, it involves no free-flying pieces that need to flap in the wind or react to gravity – it fits closely to his body. It reacts to his movements by stretching, wrinkling and forming folds it did not originally have.”

Several factors make traditional cloth simulations time-consuming for VFX artists. For one, they are easily derailed by points that get trapped inside other geometry. Jay said, “This leads to sticking and odd bunching up, especially when characters like superheroes do some pretty outrageous actions while fighting or travelling at impossible speeds. It’s also difficult to make a rig that works in all circumstances, calling for a lot of back and forth between modelling, rigging and cloth to get to a version that is close 90% of the time but still requires a lot of tweaking.”

John-Mark described the differences the new system has made for the animation team, allowing them to simplify and speed up development of the animation rig. “We create a rig for the animators containing the ML Cloth deformations, along with the standard rig they expect. This not only gives them a preview of the final deformations, but also removes the need for simulation, thereby removing many of the problems that occur when real-world physics are applied to impossible motion.

“Long, fast-moving shots are especially difficult to simulate, requiring a significant amount of artistry from CFX artists. Unlike a time-dependent simulation, ML Cloth simply remembers how to deform the various sections of the body, given a pose. So, as long as the training data is clean and covers the character’s range of motion, we can trust the approximated results will not crumple, tangle or explode.”
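
The statelessness John-Mark describes has a practical consequence worth spelling out: because the deformation is a pure function of the pose, the frames of a shot can be evaluated in any order, or even all at once as a single batch, and a problem on one frame cannot propagate to the next. Continuing the hypothetical sketch above:

```python
# Pose-driven deformation has no solver state, so a whole shot can be
# evaluated as one batch. (Continues the hypothetical ClothDeformer
# sketch above.)
with torch.no_grad():
    shot_poses = torch.randn(240, NUM_JOINTS * 6)  # 240 frames at once
    shot_offsets = deformer(shot_poses)            # (240, NUM_VERTICES, 3)

# A time-dependent simulation must instead march through the frames,
# so an error on frame N contaminates every frame after it:
#   state = initial_state
#   for frame in range(240):
#       state = solver.step(state, poses[frame])
```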

Furthermore, animators can preview the final deformations in their scenes. This encourages early detection of issues in the simulation rig, and brings to light problematic animation poses that would likely cause issues in downstream departments.

Training the Network

To begin the process of ‘teaching’ the neural network about the nature of the suit’s rubbery cloth and how it should respond to the character’s motion, the CFX team still had to research and craft simulated deformations for the suit. “Fortunately, we have very talented artists who researched, built and maintained a CFX simulation suit,” John-Mark said. “Our goal was to preserve as much of the original artistry as we could, while swapping out the simulation for the approximated deformations in the rig.

“We showed the network many examples of the Blue Beetle costume in many different poses. It then learned to mimic the simulation accurately by iteratively refining and improving its understanding of the deformations, until the approximated results could be mistaken for the simulation results.”
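
In broad strokes, that amounts to supervised regression: pairs of rig poses and CFX-simulated cloth shapes become the training set, and the network is refined until its error against the simulation is negligible. A hedged sketch of such a loop, continuing the example above (the data, optimiser and hyperparameters are placeholders, not the studio’s recipe):

```python
# Sketch of the training idea: regress from pose to the CFX-simulated
# vertex offsets. All data and hyperparameters are placeholders.
import torch
from torch.utils.data import DataLoader, TensorDataset

# Stand-ins for data extracted from the training clips:
#   poses       -- (N, NUM_JOINTS * 6) rig poses
#   sim_offsets -- (N, NUM_VERTICES, 3) offsets from the CFX simulation
poses = torch.randn(2_000, NUM_JOINTS * 6)
sim_offsets = torch.randn(2_000, NUM_VERTICES, 3)

loader = DataLoader(TensorDataset(poses, sim_offsets),
                    batch_size=64, shuffle=True)
optimiser = torch.optim.Adam(deformer.parameters(), lr=1e-4)

deformer.train()
for epoch in range(10):  # iterate until predictions match the simulation
    for pose_batch, target_batch in loader:
        pred = deformer(pose_batch)
        loss = torch.nn.functional.mse_loss(pred, target_batch)
        optimiser.zero_grad()
        loss.backward()
        optimiser.step()
```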

[Images: Blue Beetle shot 1072, final comp and plate QC]

They were actively rebuilding much of the pipeline early on while simultaneously working on the Blue Beetle training, which makes it hard to say how long it took before the network was ready for production. However, once they had a stable version of the tool, training typically took about eight hours from start to finish.

John-Mark said, “When it is time to train ML Cloth for a project, we ask the animation leads to put together some animation clips of the sorts of motion the character is likely to perform. It is important that these training poses cover the full range of the character’s motion so that ML Cloth can properly approximate relevant animation poses in production.

“This asset-specific animation is combined with some standard range of motion animation, which is then used to train the model. We test the results by having it perform animation that didn’t appear in the training set, telling us if the model generalises to new motion or if we need to train further, add animation, or tweak parameters.”
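
One way to picture that test, continuing the sketch above: hold entire clips out of training, run them through the network, and compare against a fresh simulation of the same motion. The metric and tolerance here are illustrative placeholders:

```python
# Sketch of the generalisation check: evaluate on animation the network
# never saw during training. The tolerance is a placeholder.
deformer.eval()
with torch.no_grad():
    test_poses = torch.randn(120, NUM_JOINTS * 6)  # held-out clip
    test_sim = torch.randn(120, NUM_VERTICES, 3)   # its simulated result

    pred = deformer(test_poses)
    # Mean per-vertex error for each frame, in scene units.
    per_frame_err = (pred - test_sim).norm(dim=-1).mean(dim=-1)

if per_frame_err.max() > 0.1:
    print("Poor generalisation: add clips, train further or tweak parameters.")
else:
    print("Approximation matches the simulation on unseen motion.")
```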

Building the Trained Rig

When training is complete, the team can build a new Maya rig automatically by stripping out the parts of the animation rig that are now approximated using ML Cloth, and injecting the trained model into the rig in its place so that no functionality is lost. The animators can then choose to load whichever rig they prefer in their scene as they work.
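
As a rough picture of what that automated build step might look like in Maya: the maya.cmds calls below are Maya’s real Python API, but the mlClothDeformer plugin, its attributes and the naming convention are invented here for illustration.

```python
# Hypothetical sketch of injecting a trained model into a Maya rig.
# The 'mlClothDeformer' node and its attributes are invented; only
# maya.cmds itself is real Maya API.
from maya import cmds

def build_ml_cloth_rig(costume_mesh, weights_path):
    # Strip out the rig parts the trained model now approximates
    # (assumes they were tagged when the animation rig was built).
    for node in cmds.ls("*_clothSimPlaceholder*") or []:
        cmds.delete(node)

    # Attach the trained network as a deformer on the costume mesh.
    cmds.loadPlugin("mlClothDeformer", quiet=True)
    deformer = cmds.deformer(costume_mesh, type="mlClothDeformer")[0]
    cmds.setAttr(deformer + ".weightsFile", weights_path, type="string")
    return deformer

build_ml_cloth_rig("blueBeetle_costume", "/jobs/bb/mlCloth/suit_v012.pt")
```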

“Although they are free to work however they prefer, we recommend that animators use the ML Cloth rig because it gives them a chance to pick up on problematic poses and flag simulation issues before they interrupt production,” said John-Mark. “Regardless of whether or not the animators used the ML Cloth rig as they worked, the ML Cloth deformations will be cached and passed to the next department, just like all other cached products for the shot. The shot-specific ML Cloth results are then evaluated in lighting.”
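
The handoff itself is standard pipeline practice; purely for illustration, and assuming Alembic as the cache format, the export might look something like this (paths and frame range are placeholders):

```python
# Hedged sketch of caching the evaluated ML Cloth mesh for downstream
# departments, assuming Alembic. Paths and frame range are placeholders.
from maya import cmds

cmds.loadPlugin("AbcExport", quiet=True)
cmds.AbcExport(j="-frameRange 1001 1240 "
                 "-root |blueBeetle_costume "
                 "-file /jobs/bb/sh1240/cache/mlCloth_v003.abc")
```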

Jay believes that one of the system’s main advantages is that, once it is set up, the animators can monitor the cloth on their animation rigs as they work. Sometimes an odd pose that looks correct from the camera can be detected by the way the cloth reacts, and even when a good pose causes issues, the cloth can still be tweaked.

digitaldomain.com

Words: Adriene Hurst, Editor