UE5 Tutorial: How to Combine Live Link and Body Animations

Reading Time: 6 minutes

Published: April 20, 2025 | Last Updated: May 16, 2025

This guide walks you through building a hybrid character animation setup in Unreal Engine 5.x. You’ll use Live Link Face on an iPhone to stream real-time facial data and blend it with a body running pre-animated motion (from Sequencer or mocap, for example) on a non-MetaHuman modular character. I haven’t tested this with MetaHumans. It’s excellent for video game cutscenes and animated movies.

Here’s a video explaining everything, and below that, a written tutorial of all the steps.

Here’s a working blueprint for a modular character:

I’ve tried and tested this setup. It even works live: you can have an animation drive the body while you animate and record your Live Link facial blendshapes in real time as the body moves:

(Video: Live Link face animation and body animation running together)
Here, I use Sequencer to drive the body animation while running Live Link in real time. As you can see, the head and neck jump up and down a bit, but they don’t do that once I’ve recorded the facial animation and play both back from Sequencer. So this is still useful for testing your facial animations live in a scene where you’ve already made the body animations.

I can’t take full credit for this setup—the final pieces of the puzzle are all thanks to the kind help of Ian Ferrari at the Unreal Engine Forum, who helped me create this blueprint. Without Ian’s help, I would have pulled out the remaining grey hair on my head in frustration at not getting this to work correctly.

1. Enable the Right Plugins in Unreal Engine

Go to Edit > Plugins and enable:

  • Live Link
  • Apple ARKit
  • Apple ARKit Face Support
  • UDP Messaging

Restart the engine when prompted.
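
Alternatively, you can enable the same plugins by editing your project’s .uproject file directly. This is a minimal sketch, assuming the internal plugin names below are right for your engine version (they match what I’d expect in 5.x, but verify against Edit > Plugins if the engine complains):

    {
      "Plugins": [
        { "Name": "LiveLink",              "Enabled": true },
        { "Name": "AppleARKit",            "Enabled": true },
        { "Name": "AppleARKitFaceSupport", "Enabled": true },
        { "Name": "UdpMessaging",          "Enabled": true }
      ]
    }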

2. Connect Your iPhone to Unreal

On your PC, open Command Prompt (type CMD in the Windows search bar) and type ipconfig. Find the IPv4 Address; it usually looks like 192.168.X.X.
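
The output will look something like this (the addresses here are placeholders I’ve made up; use whatever your machine reports):

    C:\Users\you> ipconfig

    Wireless LAN adapter Wi-Fi:

       IPv4 Address. . . . . . . . . . . : 192.168.1.23
       Subnet Mask . . . . . . . . . . . : 255.255.255.0
       Default Gateway . . . . . . . . . : 192.168.1.1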

On your iPhone, open the Live Link Face app. Tap the gear icon, go to Live Link > Add Target, and enter your PC’s IP address. Also, make sure Stream Head Rotation is turned ON.

Back in UE5, open Window > Virtual Production > Live Link. Your phone should appear in the Subjects list. If it doesn’t, double-check your Wi-Fi and firewall.
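
If you’re comfortable with C++, here’s a small debugging sketch (my own addition, not part of the original setup) that logs every subject the Live Link client currently sees, using ILiveLinkClient from the LiveLinkInterface module. It’s handy for confirming the phone actually reaches the editor:

    #include "CoreMinimal.h"
    #include "ILiveLinkClient.h"
    #include "Features/IModularFeatures.h"

    // Logs every Live Link subject the editor currently sees. Call it from a
    // debug hook (e.g., an editor utility) to confirm the iPhone shows up.
    void LogLiveLinkSubjects()
    {
        IModularFeatures& Features = IModularFeatures::Get();
        if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
        {
            UE_LOG(LogTemp, Warning, TEXT("Live Link client not available - is the plugin enabled?"));
            return;
        }

        ILiveLinkClient& Client =
            Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

        // true = include disabled subjects, false = skip virtual subjects.
        for (const FLiveLinkSubjectKey& Key : Client.GetSubjects(true, false))
        {
            UE_LOG(LogTemp, Log, TEXT("Live Link subject: %s"), *Key.SubjectName.Name.ToString());
        }
    }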

3. Create the Animation Blueprint

Right-click in the Content Browser and select Animation > Animation Blueprint. Choose your character’s skeleton and name the file something like ABP_HeadBlend (mine is called PollyLiveLink):

Animation Blueprint

You’ll assign this AnimBP to your character later in Sequencer. For now, it’s ready to use Live Link data and body animation together in the editor:

Blueprint Assigned in Editor

Tip: Enable Update Animation in Editor in the skeletal mesh’s Details panel to see real-time head movement while scrubbing:

Update Animation in Editor

4. Step-by-Step: UE5 EventGraph Setup for Live Link Head Rotation

In the EventGraph:

  • Add Evaluate Live Link Frame → Subject Name = iPhone (mine is called “Jan”)
  • Compile and save

Set Up Head Rotation in the EventGraph

  • Add three Get Property Value nodes → Yaw, Pitch, Roll (rename them HeadYaw, HeadPitch, and HeadRoll).
  • Add a Map Range Clamped for each:
    • Yaw: In -1 to 1 → Out -90 to 90
    • Roll: In -1 to 1 → Out 70 to -70
    • Pitch: In -1 to 1 → Out 60 to -60

  • Use Make Rotator → Connect Yaw, Roll, Pitch.
  • Create a new variable HeadRotation (Rotator) → Set it using the Make Rotator node.
  • Add a Sequence Node → Wire Event Blueprint Update Animation to it.
  • Then 0 → Evaluate Live Link Frame
  • Then 1 → Cast to BP_ModularCharacter
  • Get Owning Actor → Cast → Target = Leader Bone Mesh Component
  • Use Set Morph Target for body mesh
  • Wire Evaluate Live Link Frame → Property Value nodes
  • Valid Frame → Set Head Rotation (the math these nodes compute is sketched below)
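
For reference, here’s the same math as a small C++ sketch. This is my own translation of the Blueprint nodes, not code from the original setup; the curve names and ranges match the list above:

    #include "CoreMinimal.h"

    // Illustrative translation of the EventGraph math: remap the -1..1 head
    // curves streamed by Live Link Face into degrees, then build a rotator.
    FRotator ComputeHeadRotation(float HeadYaw, float HeadPitch, float HeadRoll)
    {
        // Map Range Clamped per axis, matching the ranges listed above.
        const float Yaw   = FMath::GetMappedRangeValueClamped(
            FVector2D(-1.f, 1.f), FVector2D(-90.f, 90.f), HeadYaw);
        const float Roll  = FMath::GetMappedRangeValueClamped(
            FVector2D(-1.f, 1.f), FVector2D(70.f, -70.f), HeadRoll);  // inverted range flips the axis
        const float Pitch = FMath::GetMappedRangeValueClamped(
            FVector2D(-1.f, 1.f), FVector2D(60.f, -60.f), HeadPitch); // inverted range flips the axis

        // Make Rotator: FRotator takes (Pitch, Yaw, Roll) in that order.
        return FRotator(Pitch, Yaw, Roll);
    }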

5. Step-by-Step: Live Link Head on Animated Body (AnimGraph Setup)

If you haven’t already created it in step 3: right-click → Animation > Animation Blueprint → choose the face skeleton → name it ABP_HeadBlend.
In the AnimGraph:

  • Add a Live Link Pose node → Connect to Final Animation Pose
  • Set Subject Name to match your iPhone

Finish the AnimGraph

  • Drag Skeletal Mesh → Add Copy Pose from Mesh
  • Create Save Cached Pose → Name it InputCache
  • Add Local to Component → Connect Live Link Pose
  • Add two Transform (Modify) Bone nodes → Chain them
  • Multiply HeadRotation by a scalar (e.g., 0.7 or 0.5) → Connect to the Transform nodes (see the sketch after this list)
  • Add Component to Local and Default Slot
  • Add Layered Blend per Bone:
    • Base Pose = Cached Pose
    • Blend Pose 0 = Default Slot
    • Add branch filter → Bone = neck_01 (or the correct name for your rig)
    (see screenshot below)
  • Add another Use Cached Pose → Connect it to Live Link Pose Input
  • Compile and save everything
Layered Blend Per Bone Neck Set
My relevant neck bone is called Neck_01, but yours may differ, especially if your character has a long neck with several bones.
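
To make the idea behind the two chained Transform (Modify) Bone nodes concrete, here’s a tiny C++ sketch of the same thing. The bone names and weights are my assumptions; tune them to your rig:

    #include "CoreMinimal.h"

    // Illustrative only: each Transform (Modify) Bone node receives a scaled
    // copy of the Live Link head rotation, so the motion is shared between
    // the neck and head instead of hinging entirely at one joint.
    void SplitHeadRotation(const FRotator& HeadRotation,
                           FRotator& OutNeckRotation,  // applied to neck_01
                           FRotator& OutHeadRotation)  // applied to the head bone
    {
        OutNeckRotation = HeadRotation * 0.5f; // the neck takes the smaller share
        OutHeadRotation = HeadRotation * 0.7f; // the head takes the larger share
    }

If your character’s neck has more bones, add one Transform (Modify) Bone node per bone and taper the scalar down the chain.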

6. Test in Sequencer

Add your character to Sequencer. Set a body animation track if needed, but let the AnimBP handle body motion if you’re using the Sequence Player node. First, here’s a screenshot of the setup:

Set Up Sequencer

  • Create or open a Level Sequence
  • Click Track → Actor to Sequencer → Select BP_ModularCharacter
  • Head Mesh: Set Animation Mode to Use Animation Blueprint and select ABP_HeadBlend
  • Body Mesh: Use the Leader Pose component with any baked animation
  • (Optional) Record Live Link and import it as baked animation for clean editing
  • Press Play — face mocap and body animation should now play together in real time
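
If you’re curious what those bindings amount to, here’s a hedged C++ sketch doing the equivalent in BeginPlay. Every name in it (AModularCharacterSketch, BodyMesh, HeadMesh, TorsoMesh, HeadBlendAnimClass) is invented for illustration; the tutorial setup itself lives entirely in the editor:

    #include "CoreMinimal.h"
    #include "GameFramework/Actor.h"
    #include "Components/SkeletalMeshComponent.h"
    #include "Animation/AnimInstance.h"
    #include "Templates/SubclassOf.h"
    #include "ModularCharacterSketch.generated.h"

    UCLASS()
    class AModularCharacterSketch : public AActor
    {
        GENERATED_BODY()

    public:
        UPROPERTY(VisibleAnywhere) USkeletalMeshComponent* BodyMesh = nullptr;  // plays the baked body animation
        UPROPERTY(VisibleAnywhere) USkeletalMeshComponent* HeadMesh = nullptr;  // runs the Live Link blend AnimBP
        UPROPERTY(VisibleAnywhere) USkeletalMeshComponent* TorsoMesh = nullptr; // a modular part following the body

        UPROPERTY(EditAnywhere) TSubclassOf<UAnimInstance> HeadBlendAnimClass;  // e.g. ABP_HeadBlend

        virtual void BeginPlay() override
        {
            Super::BeginPlay();

            // Head mesh: Use Animation Blueprint + ABP_HeadBlend.
            HeadMesh->SetAnimationMode(EAnimationMode::AnimationBlueprint);
            HeadMesh->SetAnimInstanceClass(HeadBlendAnimClass);

            // Modular parts follow the body mesh via the Leader Pose component
            // (SetLeaderPoseComponent is the UE5.1+ name for this call).
            TorsoMesh->SetLeaderPoseComponent(BodyMesh);
        }
    };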

Summing Up

This setup lets you drive the head in real time with an iPhone while your character performs pre-animated body motion. It’s fast, flexible, and works cleanly in Sequencer. To record the head animation, use Take Recorder or record directly into the level sequence; both work.

I hope you find this helpful. Feel free to share, and happy animating 🙂

Read Next: my cyberpunk short film Echoes of Love, where I use this setup for multiple characters.

By Jan Sørup

Jan Sørup is an indie filmmaker, videographer, and photographer from Denmark. He owns filmdaft.com and the Danish company Apertura, which produces video content for big companies in Denmark and Scandinavia. Jan has a background in music, has drawn webcomics, and is a former lecturer at the University of Copenhagen.
