
Digital Domain Goes Into Machine Learning Mode For Blue Beetle

Pushing the boundaries of digital humans has become standard practice for Digital Domain, with the title character of Blue Beetle providing an opportunity to fully test ML Cloth, a custom cloth system driven by machine learning. The goal was to have the rubber-like superhero suit fold and wrinkle in accordance with the actions being performed by Jaime Reyes (Xolo Maridueña), who has been fused with a scarab created from alien technology. The initial tests for ML Cloth began with She-Hulk: Attorney at Law. “What we found is that the implementation didn’t take into account animators’ preferences, so it was almost like software engineering for software engineers,” explains John-Mark Gibbons, Associate Lead Software Engineer at Digital Domain. “It was too confusing for animators to use, so we threw away most of the workflow from She-Hulk and reimagined it with artists as the centerpiece of the tool. We were hoping for success, and on Blue Beetle we got a massive success. We’re excited with the outcome of that and to see it progress into future shows.”

When it comes to the training dataset, everything is customized. “We don’t have a general trained model that knows how cloth works,” Gibbons notes. “What we’re doing is learning how a specific asset deforms.” There is no web scraping involved. “We still do traditional setups and rigs with certain types of muscle and volume control to get the underlying body to move the way we want it to,” states Jay Barton, Visual Effects Supervisor at Digital Domain. “We still did a traditional cloth rig on top of that. In this case we did a lot of training calisthenics to try to encapsulate the possible range of motion inside of cloth settings that have already been established and work. Once we have that, the training and machine learning takes new animation, runs it through the setup and comes up with a cloth simulation specific to that animation, but trained on our initial stuff. It’s all in-house and in the same pot swirling around without having a whole lot of departments vet it, touch it and move it on. Once you have all of that training data finished, animators while they’re animating can go, ‘What does this look like with cloth?’ And rerun their animation and see the cloth right there in their preview. It’s not like an AI cloth. It’s specifically machine learning.”
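
The article does not detail how ML Cloth is built internally, but the per-asset approach Gibbons and Barton describe, training on calisthenics animation paired with traditional cloth simulation and then inferring cloth for new animation, can be sketched roughly. The following is a minimal, hypothetical PyTorch example; the network architecture, joint count, vertex count and function names are assumptions for illustration, not Digital Domain's implementation.

```python
# Hypothetical sketch only: a tiny per-asset "pose -> cloth offsets" model,
# trained on calisthenics animation frames paired with traditional cloth-sim
# results. Names, shapes and architecture are illustrative assumptions and
# not Digital Domain's ML Cloth implementation.
import torch
import torch.nn as nn

NUM_JOINTS = 60          # assumed rig size
NUM_CLOTH_VERTS = 4000   # assumed suit mesh resolution for the sketch

class ClothDeformer(nn.Module):
    """Predicts per-vertex cloth offsets from joint rotations for one asset."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_JOINTS * 3, 512), nn.ReLU(),
            nn.Linear(512, 512), nn.ReLU(),
            nn.Linear(512, NUM_CLOTH_VERTS * 3),
        )

    def forward(self, pose):                       # pose: (batch, joints * 3)
        return self.net(pose).view(-1, NUM_CLOTH_VERTS, 3)

def train_asset_model(poses, sim_offsets, epochs=200):
    """poses: (frames, joints * 3) from calisthenics animations.
    sim_offsets: (frames, verts, 3) produced by the traditional cloth rig.
    Returns a model trained for this one asset only."""
    model = ClothDeformer()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(poses), sim_offsets)
        loss.backward()
        opt.step()
    return model

# Toy data standing in for real calisthenics frames and sim output.
poses = torch.randn(300, NUM_JOINTS * 3)
sim_offsets = torch.randn(300, NUM_CLOTH_VERTS, 3)
model = train_asset_model(poses, sim_offsets, epochs=5)

# At animation time, a new pose can be pushed through the trained model to
# preview approximate cloth deformation without running a full simulation.
with torch.no_grad():
    preview_offsets = model(torch.randn(1, NUM_JOINTS * 3))
```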

“What we found is that the implementation [of in-house cloth system ML Cloth] didn’t take into account animators’ preferences, so it was almost like software engineering for software engineers. It was too confusing for animators to use, so we threw away most of the workflow from She-Hulk and reimagined it with artists as the centerpiece of the tool. We were hoping for success, and on Blue Beetle we got a massive success. We’re excited with the outcome of that and to see it progress into future shows.”

—John-Mark Gibbons, Associate Lead Software Engineer, Digital Domain

Efficiency was improved by adopting ML Cloth. “A great example that happened on the show was animators could turn it on in their scene file and while they were working noticed it doing something that was odd,” Barton recalls. “Ran it up the food chain. Everybody had a quick meeting about it. The lead cloth artist tweaked some settings in how it worked. Everybody looked at it in shot and said, ‘That seems right.’ Reran the training data, and all subsequent shots had that fix without us having to go through the traditional ‘do the cloth, do the settings on top of the animation, render it,’ and it gets into a review where I’m sitting in the screening room saying, ‘That cloth looks off.’ That can take weeks. This was ironed out in a matter of hours. I never had to see it. It was fantastic.” Shots have to be added to the training dataset in order for there to be improvements in the machine learning. “The important thing is that the model has to know what types of animation the rig is performing,” Gibbons remarks. “We can train it on ranges of motion, but from asset to asset you will have more specific and targeted animation that a character tends to do a lot of. As the show progresses, we become more aware that this character has his arms up a lot more than our training set covers, so maybe we should have more of those types of shots in the training.”
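
Gibbons’ point about coverage, noticing that the character “has his arms up a lot more than our training set covers,” suggests a simple check that could flag poses a model has not been trained on. Below is a hypothetical sketch of such a check using nearest-neighbor distances in pose space; the threshold heuristic, shapes and names are assumptions for illustration, not a description of Digital Domain’s pipeline.

```python
# Hypothetical sketch: flag frames in a new shot whose poses sit far from
# anything in the training set, so similar shots can be added to the
# training data and the model retrained. All names and numbers here are
# illustrative assumptions, not Digital Domain's pipeline.
import numpy as np

def nearest_train_distance(train_poses, shot_poses):
    """For each shot frame, the distance to its nearest training pose.
    train_poses: (N, D), shot_poses: (M, D) flattened joint rotations."""
    d = np.linalg.norm(shot_poses[:, None, :] - train_poses[None, :, :], axis=-1)
    return d.min(axis=1)

def nearest_other_distance(poses):
    """For each training frame, the distance to its nearest other frame."""
    d = np.linalg.norm(poses[:, None, :] - poses[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1)

def coverage_gaps(train_poses, shot_poses, threshold):
    """Indices of shot frames farther than `threshold` from any training pose."""
    return np.where(nearest_train_distance(train_poses, shot_poses) > threshold)[0]

# Toy data standing in for real rig poses (300 training frames, 60-D poses).
rng = np.random.default_rng(0)
train_poses = rng.normal(size=(300, 60))
shot_poses = rng.normal(size=(120, 60))
shot_poses[40:60] += 3.0   # pretend these frames are "arms up", unlike training

# Derive a threshold from how spread out the training poses already are.
threshold = np.percentile(nearest_other_distance(train_poses), 99)
print(coverage_gaps(train_poses, shot_poses, threshold))   # mostly frames 40-59
```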

“[W]e did a lot of training calisthenics to try to encapsulate the possible range of motion inside of cloth settings that have already been established and work. Once we have that, the training and machine learning takes new animation, runs it through the setup and comes up with a cloth simulation specific to that animation, but trained on our initial stuff. … Once you have all of that training data finished, animators while they’re animating can go, ‘What does this look like with cloth?’ And rerun their animation and see the cloth right there in their preview. It’s not like an AI cloth. It’s specifically machine learning.”

—Jay Barton, Visual Effects Supervisor, Digital Domain

Practical suits were constructed that the digital versions had to match. “Even when we would replace the suit, the intention was to use as much practical in-camera [footage] of real actors in real suits where we could,” Barton states. “Certainly, there were things that the practical suit would do that we didn’t want it to do, or situations where something that matched the practical wouldn’t be wanted on our end because the action is so much more fantastical, crazy and over the top than what a human can do. The practical suit was made out of foam, rubber and harder plastic pieces that could deform. We had an actor and stuntmen putting it through its paces.” Scans were taken of Xolo Maridueña with and without the suit, as well as of his stunt doubles. “We did have a lot of stuff to base an anatomically-correct model and rig [on] to underlie the cloth simulation,” Barton adds. Suit transformations were meant to be different each time to reflect the character arc of the protagonist. “When the first transformation happens, Jaime Reyes is not a willing participant, and by the end of the film he is in control of the suit,” Barton notes. “There was a lot of exploration in how we wanted to do that and what aspects would relate across the various transformations in the film and what aspects would be different as he gains control.”

“Even when we would replace the [practical] suit, the intention was to use as much practical in-camera [footage] of real actors in real suits where we could. Certainly, there were things that the practical suit would do that we didn’t want it to do, or situations where something that matched the practical wouldn’t be wanted on our end because the action is so much more fantastical, crazy and over the top than what a human can do.”

—Jay Barton, Visual Effects Supervisor, Digital Domain

In order for ML Cloth to work properly, three departments collaborated closely with each other. “You have animation defining how the character moves,” Gibbons remarks. “You have rigging, which is setting up the rig for the animators, but then you have CFX that has to make sure their rigs are deforming in the ways animation expects. What ML Cloth allows us to do is to tighten that connection between those departments so that it’s not giving up ownership of anything. All of the departments still own completely what they’re doing, and the CFX rig still has to be at the high bar that we expect, but we’re able to expand communication between the departments to get good results out of ML Cloth.” ML Cloth eased the worry factor.

“Certainly, by the next time that I get to use [ML Cloth], whether it’s some other character or animation tools that it helps with, or muscle deformation and simulation or cloth, I will be looking at it from the beginning: how can we set this up to use it in ways we haven’t thought of before?”

—Jay Barton, Visual Effects Supervisor, Digital Domain

“As you’re going through a project, you get to the end and everybody is running around crazy,” Barton observes. “You’re trying to get the last 200 or 300 shots finished in a matter of weeks. There are so many places and little things where the ball gets dropped. ML Cloth was one of the things I did not have to think about once it was stable. That freed me up to concentrate on other things. This was an unexpected bonus on this film. Certainly, by the next time that I get to use this, whether it’s some other character or animation tools that it helps with, or muscle deformation and simulation or cloth, I will be looking at it from the beginning: how can we set this up to use it in ways we haven’t thought of before?”