
MegaMocapVR

Introduction

This project is a motion capture and VTuber solution for Unreal Engine that uses SteamVR tracked objects to drive humanoid characters. Currently the project is designed to work with the following hardware using Live Link:

  • Vive Trackers (2-7 per body)
  • SteamVR HMDs for immersive, in-VR capture
  • HTC or Valve Base Stations (required to track the Vive Trackers)
  • Index Controllers (other controllers require additional inputs to be set up)
  • iPhone X and up (for facial tracking)

The system scales from a simple iPhone-only VTuber setup to 3-9 SteamVR tracked devices for full-body setups. I find the sweet spot is 5 trackers and 2 controllers combined with an iPhone for facial tracking.

The skeletal meshes designed to work with the rig are:

  • Metahumans
  • UE4 Mannequin
  • UE5 Mannequin
  • Character Creator 3+

With UE5's IK Retargeting, it is easy to transfer motion data between skeletons, even at runtime! This opens up the system to any skeleton you can imagine, and if that's not enough, the entire system is exposed through Blueprints so you can add your own custom skeleton setups!

This framework heavily relies on SteamVR in versions prior to 5.2, and OpenXR in 5.2 and later. MMVR requires Live Link and Control Rig, so the Unreal Engine editor is required for operation.

Brief Overview

MegaMocapVR requires 3 items in your scene to run:

  • MMVR_PlayerPawn_BP
  • MMVR_SpectatorCamera_BP
  • A Blueprint with your Actor

When running the game in editor (note: the play mode CANNOT be Simulate), the MMVR_MocapActor_Component in your Actor Blueprint determines which skeletal mesh you want to motion capture, and sends that information to the MMVR_PlayerPawn_BP if the 'Is Active Mocap Target' bool on the component is set to true. The component also supplies the important bone names for calibration via a selectable MMVR_ActorRigProfile_Enum.
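
The component itself ships as a Blueprint, but if it helps to see its shape in code, a minimal C++ equivalent might look like the sketch below. The property and enum names mirror the ones described on this page; the class names and enum entries are illustrative assumptions, not the project's actual API.

```cpp
// Illustrative C++ stand-in for MMVR_MocapActor_Component (the real component is a Blueprint).
#pragma once

#include "CoreMinimal.h"
#include "Components/ActorComponent.h"
#include "MMVRMocapActorComponent.generated.h"

// Stands in for MMVR_ActorRigProfile_Enum; entries assumed from the supported rigs.
UENUM(BlueprintType)
enum class EMMVRActorRigProfile : uint8
{
	Metahuman,
	UE4Mannequin,
	UE5Mannequin,
	CharacterCreator3
};

UCLASS(ClassGroup=(MMVR), meta=(BlueprintSpawnableComponent))
class UMMVRMocapActorComponent : public UActorComponent
{
	GENERATED_BODY()

public:
	// When true, the MMVR_PlayerPawn_BP treats this actor as the capture target.
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category="MMVR")
	bool bIsActiveMocapTarget = false;

	// Selects the skeletal mesh and the bone names used during calibration.
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category="MMVR")
	EMMVRActorRigProfile RigProfile = EMMVRActorRigProfile::UE5Mannequin;

	// Which local player (0-3) drives this actor in local multiplayer.
	UPROPERTY(EditAnywhere, BlueprintReadWrite, Category="MMVR")
	int32 ControllingPlayerIndex = 0;
};
```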

If the user is possessing the MMVR_PlayerPawn_BP (either through auto possessing player 0 or using the included MMVR_GameBase_BP game mode), then the player pawn will enter Calibration Mode and teleport your VR playspace to the centre of the Actor Blueprint. If SteamVR devices are connected using Live Link XR, and those device names are populated in the player pawn, then the user will see their body movements drive visual representations of the trackers in the virtual world.
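
In code terms, entering Calibration Mode amounts to finding the flagged target actor and recentring the playspace on it. Below is a minimal sketch of that step, reusing the hypothetical UMMVRMocapActorComponent from the previous sketch; the real logic lives in MMVR_PlayerPawn_BP's Blueprint graph.

```cpp
// Illustrative sketch: teleport the VR playspace to the active mocap target.
#include "GameFramework/Pawn.h"
#include "Kismet/GameplayStatics.h"

static void TeleportPlayspaceToMocapTarget(APawn* PlayerPawn)
{
	TArray<AActor*> Candidates;
	UGameplayStatics::GetAllActorsOfClass(
		PlayerPawn->GetWorld(), AActor::StaticClass(), Candidates);

	for (AActor* Candidate : Candidates)
	{
		const UMMVRMocapActorComponent* Mocap =
			Candidate->FindComponentByClass<UMMVRMocapActorComponent>();
		if (Mocap && Mocap->bIsActiveMocapTarget)
		{
			// Centre the playspace on the Actor Blueprint, as described above.
			PlayerPawn->SetActorLocationAndRotation(
				Candidate->GetActorLocation(), Candidate->GetActorRotation());
			break;
		}
	}
}
```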

The user will then calibrate their tracker positions to the character's skeleton by lining up their body with the in-game character. Holding down both B buttons on the Index controller (consult your Input Settings in Unreal Engine's Project Settings for alternative keys) will offset the SteamVR devices to match the location of the bone goals they are set to control. While in Actor Mode, the location and rotation of Live Link XR SteamVR trackers will be sent to the actor's animation blueprint via the MMVR_Interface and used in a control rig to drive the animation of your character.
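
Conceptually, calibration caches each bone goal's transform relative to its tracker at the moment the B buttons are held, and Actor Mode then reapplies that cached offset to the live tracker pose every frame. A minimal sketch of that math using Unreal's FTransform (illustrative only, not the project's actual code):

```cpp
#include "CoreMinimal.h"

// Cached per goal at calibration time: the goal bone expressed in tracker space.
static FTransform TrackerToGoalOffset;

// Called while both B buttons are held: record where the bone goal sits
// relative to the tracker that will drive it.
static void Calibrate(const FTransform& TrackerWorld, const FTransform& GoalBoneWorld)
{
	TrackerToGoalOffset = GoalBoneWorld.GetRelativeTransform(TrackerWorld);
}

// Called every frame in Actor Mode: re-derive the goal transform from the
// live tracker pose and hand it to the control rig.
static FTransform EvaluateGoal(const FTransform& TrackerWorld)
{
	// In Unreal's convention, Offset * Tracker applies the local offset first,
	// then the tracker's world transform.
	return TrackerToGoalOffset * TrackerWorld;
}
```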

First Time Set-Up

This project was built using Unreal Engine 4.26.2, and has branches for 4.27 and 5.0.3. The title of each commit will tell you which engine version is fully compatible with that commit. To use the program, you will need to install the following plugins:

  • Live Link
  • Live Link XR
  • Apple ARKit Face Support
  • Control Rig
  • Full Body IK
  • Blueprint Utilities
  • OSC
  • SteamVR (Only works in versions below 5.1.1)
  • OpenXR
  • OpenXRViveTracker (Can cause crashes in 5.3.2 when running in play mode without the -XRTrackingOnly flag)

Import the MegaMocapVR folder into your UE project's 'Content' folder using Windows Explorer. Dragging files from Windows Explorer into Unreal Engine's editor will not work. Also, this folder needs to be in the root of the 'Content' folder: moving it may break references and cause the program to fail. In the root of the MegaMocapVR folder, you will see two important actor Blueprints for making your scenes:

  • MMVR_PlayerPawn_BP
  • MMVR_SpectatorCamera_BP

Optionally, you can right-click MMVR_EditorWidget and select 'Run Editor Utility Widget' to bring up a custom menu that shows debug information, extra commands, and a helpful interface that allows a person stationed at the computer to control the scene.

You can either change your game mode to MMVR_GameBase_BP to spawn in as the MMVR_PlayerPawn_BP, or drag the MMVR_PlayerPawn_BP into a scene and set it to auto possess Player 0. You must be spawned into the player pawn for the system to work.

Having this Player Pawn in your scene will allow you to easily change which Live Link XR tracked device will control the various goals of the character, as well as other public bools like ‘Debug Strings Hud On.’

Now you will need an actor. Currently, the system is designed to work with Metahuman, UE5 Mannequin, UE4 Mannequin and Character Creator 3+ models, although it can drive other skeletons with IK Retargeting. To make a mocap-ready actor, create a blueprint containing the skeletal mesh you wish to use, ensuring that it is facing forward in X. You may need to make the skeletal mesh a child of another root object to obtain this orientation.

Now, add the animation blueprint appropriate to the character you are making (CC3, Metahuman, UE5 or UE4 Mannequin, or a duplicated IK-retargeted animation BP that has been assigned to your skeleton). You can find premade Anim BPs under the MegaMocapVR/AnimBPs folder. Make sure the skeletal mesh in your blueprint is set to 'Use Animation Blueprint' under Animation Mode.

NOTE: If you are importing a UE4, UE5 or CC3 skeleton, it's best to import using an existing skeleton in the MegaMocapVR folder.

If you create a new skeleton on import, the animation BPs included with this system might not recognize it as the same skeleton type, and after calibrating you will not see any movement (or the character will just start sliding backwards). If you are unable to import the model using one of the pre-existing skeletons, you can also right-click an animation blueprint and retarget it to work with your character, or right-click a mesh to re-assign which skeleton it uses.

Next, add the MMVR_MocapActor_Component to the blueprint and choose from the dropdowns which skeleton rig the character is using, whether it is the Mocap Target, and which player is meant to control the model (NOTE: you can actually have up to 4 MMVR_PlayerPawn_BPs in the system for local multiplayer, with each driving a different character).

Now, add the MMVR_SpectatorCamera_BP to the level in a location that gives you a full, unobstructed view of the actor. This camera will be your viewport for calibration and operation of your character.

On your first project setup, power up your Vive Trackers one at a time and connect to them using Live Link as an XR source. This will let you note which Vive Tracker has which serial number, which you will enter on the MMVR_PlayerPawn_BP. Record these and physically label your devices or the straps they are attached to. Using OpenXR in projects 5.2 or greater, these trackers will also need to be assigned roles using SteamVR's Manage Trackers menu.

NOTE: Sometimes Live Link will not add a device if it is partially occluded or not valid at the time you add an XR source. When adding sources, it is best to ‘Add’ when the person wearing the trackers is in the center of the playspace (or at the very least, not partially occluded by a desk).

The best way to get tracking data and VR inputs in non-VR play modes is to launch the engine with the -XRTrackingOnly command-line flag. Learn more about how to do this here: https://www.youtube.com/watch?v=GKsPjufVPwg
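
For example, on Windows you might launch the editor with the flag from a command prompt or a shortcut target like the line below (the paths are placeholders; the binary is UnrealEditor.exe in UE5 and UE4Editor.exe in UE4):

```
UnrealEditor.exe "C:\Projects\MyProject\MyProject.uproject" -XRTrackingOnly
```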

MegaMocapVR uses Control Rig to attach goals to the VR tracked objects (Vive Trackers, controllers and/or HMD). When calibrating, the system automatically handles the offset between the tracker and the desired goal, but it is still advised to get the tracker as close as possible to where the relevant bone is on the real-world actor. Trackers can be placed on the front or back of the torso depending on what motion is needed. The goals available are:

  • Left Foot
  • Right Foot
  • Pelvis
  • Chest (Spine_03 / Spine_05)
  • Head
  • Left Elbow
  • Right Elbow
  • Left Hand
  • Right Hand

The Chest, Pelvis, Feet and Elbow trackers are optional, although the chest and pelvis are recommended. If you do not have feet trackers, the system will default to Upper Body Only Mode, although you can still get foot motion if you are using the MMVR_MovementActor_BP or if your actor is using the Character Class. If you don't have a tracker for a body part, delete the Live Link device name out of the Live Link Tracker Set Struct in the MMVR_PlayerPawn_BP. This will set the NoPelvisTracker, NoChestTracker, NoElbowTrackers and NoFeetTrackers bools depending on which fields you delete.
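
In effect, an empty device name in the struct means "no tracker for this goal". A C++ sketch of how those bools could be derived (the struct fields and bool names mirror this page; everything else is illustrative):

```cpp
#include "CoreMinimal.h"

// Illustrative stand-in for the Live Link Tracker Set Struct on MMVR_PlayerPawn_BP.
struct FMMVRLiveLinkTrackerSet
{
	FName Pelvis;
	FName Chest;
	FName LeftElbow;
	FName RightElbow;
	FName LeftFoot;
	FName RightFoot;
};

// Deleted (empty) fields flip the corresponding "No...Tracker" bools.
static void UpdateMissingTrackerFlags(const FMMVRLiveLinkTrackerSet& Set,
                                      bool& bNoPelvisTracker, bool& bNoChestTracker,
                                      bool& bNoElbowTrackers, bool& bNoFeetTrackers)
{
	bNoPelvisTracker = Set.Pelvis.IsNone();
	bNoChestTracker  = Set.Chest.IsNone();
	bNoElbowTrackers = Set.LeftElbow.IsNone() && Set.RightElbow.IsNone();
	bNoFeetTrackers  = Set.LeftFoot.IsNone() && Set.RightFoot.IsNone();
}
```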

Opening the MMVR_PlayerPawn_BP will allow you to set persistent default values for what tracker will control which body part. As long as you attach the same tracker to the same part of the performer’s body, you will not need to set these values again, and can power on and connect all Vive Trackers at once using Live Link in the future. Also, you can add your XR devices in Live Link XR and save a Live Link preset, which can be added to Project Settings/Plugins - Live Link. This will add your devices on engine start-up.

If you are using an iPhone to capture facial performance, update the Default Value of the LiveLink_iPhone variable. Below you can see a Glide Gear POV 100 DSLR Helmet Cam Rig with a magic arm and phone mount.

Now we are ready to calibrate and embody the character! Run the game in editor. If you wish to use the program with a VR HMD, turn on the bool 'VRMode HMDCapture' on the MMVR_PlayerPawn_BP and launch the game using Unreal Engine's VR playmode.

The MMVR_PlayerPawn_BP is designed to enter Calibration Mode on Begin Play, although this can be changed by setting the MMVR Player Enum during the Event Begin Play.

On entering Calibration Mode, the MMVR_PlayerPawn_BP will teleport so that the user's VR volume centres on the Actor in the level whose MMVR_MocapActor_Component has 'Is Active Mocap Target' enabled. The Player Pawn's size will also be scaled to match the Mocap Actor's height based on the PlayerHeight float in MMVR_PlayerPawn_BP.

NOTE: The PlayerHeight variable in the player pawn is saved when the actor completes calibration. You can use the VR controller triggers or the MMVR_EditorWidget to scale the user up and down to match the virtual actor. It's advised to scale the user rather than the actor, as scaling actors can cause calibration issues with Control Rig.
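
Scaling the user is just a uniform scale on the pawn, which scales the whole VR playspace with it. A minimal sketch of the idea, assuming ActorHeight is measured from the target skeletal mesh in the same units as PlayerHeight (names illustrative):

```cpp
#include "GameFramework/Pawn.h"

// Uniformly scale the VR pawn so the performer's height matches the virtual actor.
static void ScalePawnToActor(APawn* PlayerPawn, float PlayerHeight, float ActorHeight)
{
	if (PlayerHeight > KINDA_SMALL_NUMBER)
	{
		// Scaling the pawn (not the actor) avoids the Control Rig calibration
		// issues mentioned in the note above.
		PlayerPawn->SetActorScale3D(FVector(ActorHeight / PlayerHeight));
	}
}
```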

The performer will now line up their trackers as closely as possible with the virtual actor's body. It is helpful to clip the trackers into the body slightly, then back off so that they rest on the surface, similar to how they are placed on the real-world actor. If the performer is not facing the computer monitor when calibrating, you can change the rotation offset of the VR playspace using the 'Rotate Actor when Entering Calibration' float on the MMVR_PlayerPawn_BP.

NOTE: If you have two operators, one person can press F8 to exit the MMVR_SpectatorCamera and enter a flycam pawn to get a closer look at the alignment of various parts. The left joystick click on the index controller will also toggle a 'calibration camera sequence' that can help you line up with the character.

For a solo operator to calibrate, they must hold both 'B' buttons down on the Index controllers. This saves the calibration for that player index and character. If tracker placement hasn't changed, the actor can load this calibration and begin motion capture instantly on startup.

You are all ready to start making motion!

The Scene Locator

There are two different ways to capture motion using MegaMocapVR. When there is no active MMVR_SceneLocator_BP in the level, the mocap actor defaults to a mode that updates its Actor root based on its hip position. While not one-to-one, this roughly mimics 'in-place' animations, and allows for forced locomotion and blending in and out of animations without the virtual actor popping back to its origin on playback. The downside to this mode is some slight foot sliding.

If a MMVR_SceneLocator_BP is active in the scene, the current mocap actor will update their root to match the transform of the locator. This mode is root-motion, and is useful for keeping multiple actors spatially in sync. Using this workflow, it would be easy to parent the actors to the locator and zero out their transforms if their position is ever lost, and animation can be done on the MMVR_SceneLocator_BP to move the entire scene to a new location.
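
Both modes reduce to a per-frame choice of where the mocap actor's root goes. A sketch of that branch, with all names illustrative:

```cpp
#include "GameFramework/Actor.h"

// Per-frame root update for the mocap actor, sketching the two modes above.
static void UpdateMocapActorRoot(AActor& MocapActor,
                                 const AActor* SceneLocator, // MMVR_SceneLocator_BP, if any
                                 const FVector& HipWorldLocation)
{
	if (SceneLocator)
	{
		// Scene Locator mode: root motion; the root follows the locator,
		// keeping multiple actors spatially in sync.
		MocapActor.SetActorTransform(SceneLocator->GetActorTransform());
	}
	else
	{
		// Default mode: follow the hip on the ground plane, roughly mimicking
		// 'in-place' animation so the actor never pops back to its origin.
		FVector Root = HipWorldLocation;
		Root.Z = MocapActor.GetActorLocation().Z; // preserve current height
		MocapActor.SetActorLocation(Root);
	}
}
```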

By default, the system will make the MMVR_SceneLocator_BP the centre of all users' VR playspaces (in local multiplayer with several MMVR_PlayerPawn_BPs, each will teleport to the same spot during calibration). This should ensure that the actors move in direct relation to each other, but both character blueprints will need to be placed close to the MMVR_SceneLocator_BP to allow for calibration.

In Game Controls

Controls are a work in progress, but functional for most uses.

Holding the 'A' button on the Index controller will pull up a Half-Life: Alyx-inspired radial menu centered on the hand that called up the menu. Moving your hand up, left, right or down and then releasing the 'A' button will select different options depending on which hand is being used.

Important functions for changing the mode are on the left hand, and camera switching is on the right hand. Each camera can be toggled between a socketed mode that follows a specific bone on the mocap actor and a static mode that warps to the bone location but then stays at that transform.

Holding down the B button and pressing either trigger will disable tracker control for the current actor and move the player to another actor to control.

Feel free to add your own controls, and customize the system to your liking. If you wish to contribute additional functionality to the project, I welcome all submissions!

The Sound Generator

This thing is pretty cool. Try it out, I guess!

CameraControl

Put a tracker on your camera with the LED light at the top and the Vive logo facing away from you, and turn on the 'tracker controlled' bool. Use an Xbox controller to move your camera around the scene.

Fixing Metahuman Recordings

If you find that the head or hands are offset during playback of your Metahuman recordings, you will have to change the 'Retarget Source' under the animation asset.

Navigate to the Animation folder under your recording's subscenes.

Next, open the animation asset and set its 'Retarget Source' to the skeletal mesh your Metahuman is using. If you are unsure what this is, open your Metahuman blueprint, highlight the 'Body' component and find the skeletal mesh in the Details panel. An example name is 'm_med_nrw_body'.

Fixing Strange Character Deformation (Metahumans, etc.)

If your Metahuman skeleton deforms when you embody the character (or if any character visibly deforms), it could be a problem with the control rig's preview mesh. The control rigs in MegaMocapVR often use the FBIK node, which bases its deformation on the initial pose, and it appears the system gets its initial pose from the preview mesh rather than from the skeleton using the blueprint.

To fix this, open the relevant control rig for your character under MegaMocap/ControlRigs and select the Preview Scene Settings tab. Under Preview Mesh, clear out the default and add the skeletal mesh that your model is using. This is mostly an issue for Metahumans, as there is a great variety of skeleton sizes the characters can come in as.

Roadmap and Closing Words

Thank you so much for your interest in this project! Around 6 years ago I started learning to code with the intention of making filmmaking tools to use in Virtual Reality, and I'm proud to have finally accomplished this goal. I think there are many things that can be added to and improved in this framework, and I hope I can add more functionality as my skills grow, but as a new developer I know that many things are currently beyond my abilities. That is why I am making this project open source: so that others can use this tool and add to it, making it more useful to all of us! If you make a cool feature, or decouple any systems to support a greater variety of hardware, I would be honored if you submitted changes to this project.

Another reason I have made this project free is that I have been inspired by the generosity of Epic Games. Using Unreal Engine and all the incredible tools Epic has made more affordable has had a huge impact on my art and my life. Metahumans are also such an incredible gift, and with them currently being free, it just feels right to make this free as well!

If you found this project useful and wish to help fund my Vive Tracker/VR accessory buying addiction, you can support me on Ko-fi here: https://ko-fi.com/megasteakman

Also, if you have any relevant motion capture hardware you want me to support and can spare sending me a copy, I can try to incorporate it into the framework. Just send me a message!