Facial mocap comes to Unreal Engine via new iPhone app


A new iOS app for Unreal Engine uses your iPhone to capture your facial expressions and animate an onscreen character in real time. Live Link Face is designed to work both in professional game production settings, like a soundstage with actors in full mocap suits, and in amateur ones, such as a single artist at a desk, according to a blog post from Unreal Engine developer Epic Games. The app is available now on Apple’s App Store.

The app uses Apple’s augmented reality platform, ARKit, and the iPhone’s front-facing TrueDepth camera (introduced with the iPhone X in 2017) to capture facial expressions and stream the data to Unreal Engine. It also captures head and neck movement.
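ARKit represents a captured expression as a set of named blendshape coefficients between 0 and 1 (for example, how open the jaw is, or how closed an eyelid is), and an engine applies each coefficient as a weight on a matching morph target of the character mesh. The sketch below illustrates that general linear-blending idea in Python; the mesh, the delta values, and the two blendshape names are made up for the example, and this is not Epic's or Apple's actual implementation.

```python
# Illustrative only: applies ARKit-style blendshape weights to a tiny
# "mesh" of vertex positions via linear morph-target blending.
# The vertex data and deltas below are invented for this example.

NEUTRAL = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0)]  # toy 2D "face" mesh

# Per-blendshape displacement of each vertex at full weight (1.0).
MORPH_DELTAS = {
    "jawOpen":      [(0.0, -0.2), (0.0, -0.2), (0.0, 0.0)],
    "eyeBlinkLeft": [(0.0, 0.0), (0.0, 0.0), (0.0, -0.1)],
}

def apply_blendshapes(weights):
    """Blend the neutral mesh with weighted morph-target deltas."""
    out = [list(v) for v in NEUTRAL]
    for name, w in weights.items():
        for i, (dx, dy) in enumerate(MORPH_DELTAS[name]):
            out[i][0] += w * dx
            out[i][1] += w * dy
    return [tuple(v) for v in out]

# One frame of capture data: coefficients in [0, 1], as ARKit reports them.
frame = {"jawOpen": 0.5, "eyeBlinkLeft": 1.0}
print(apply_blendshapes(frame))
```

Streaming a frame then amounts to shipping the small dictionary of coefficients across the network, which is far cheaper than sending mesh or video data.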

To support collaborative work, Live Link Face uses multicast networking to synchronize with multiple devices at once. Epic promises “robust timecode support and precise frame accuracy” to further support multi-device synchronization, and Tentacle Sync support allows the app to connect with a master clock on stage.
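Timecode matters here because footage from several devices can only be aligned in post if every frame carries a stamp from a shared clock. As a rough illustration of what a frame stamp encodes (my own sketch, not Epic's code), converting between absolute frame counts and SMPTE-style HH:MM:SS:FF timecode at an assumed fixed frame rate looks like:

```python
# Illustrative sketch: non-drop-frame SMPTE-style timecode at a fixed,
# integer frame rate. Real productions also handle drop-frame rates
# like 29.97 fps; this sticks to 24 fps for clarity.

FPS = 24  # assumed frame rate for the example

def frames_to_timecode(frame_count, fps=FPS):
    """Convert an absolute frame count to an HH:MM:SS:FF string."""
    ff = frame_count % fps
    total_seconds = frame_count // fps
    ss = total_seconds % 60
    mm = (total_seconds // 60) % 60
    hh = total_seconds // 3600
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

def timecode_to_frames(tc, fps=FPS):
    """Inverse conversion, so stamps from two devices can be compared."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 3600 + mm * 60 + ss) * fps) + ff

print(frames_to_timecode(90000))  # prints "01:02:30:00" at 24 fps
```

Two recordings stamped against the same master clock can then be lined up by comparing frame counts, which is the "precise frame accuracy" a shared timecode source buys you.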

The post doesn’t mention whether an Android equivalent is in the works. That may be because it would be near-impossible to guarantee a mocap app worked across the dozens of different front-camera setups found in Android phones. Targeting the iPhone means Epic can build against the TrueDepth camera alone, without worrying about hardware variation.

Unreal Engine is one of the most widely used development tools for games and other media, powering games as big as Fortnite and productions like The Mandalorian. Epic plans to keep Unreal broadly accessible, saying that “continued democratization of real-time tools for virtual production is one of the primary goals of Unreal Engine development.”