7 Ways To Animate Characters For VRChat
By Vavassor
Look, I just wanna animate a character for my VRChat world. Or maybe make a custom action or locomotion for my avatar. And I already got body tracking hardware, like Vive trackers. So why not use motion capture?
Recording in VRChat
ShaderMotion
So far, I've used ShaderMotion for animations.
ShaderMotion is a system to stream body motion using video. You can live-stream performers and see them move in-game! This tool can also be used to record motion.
Check out How to animate NPCs in your world with MOCAP for a full guide. But here's a quick outline.
- Set up an avatar to use as the performer and upload them to VRChat.
- In VRChat, record animations and use video capture software such as Open Broadcaster Software (OBS) to record video of the game, which contains the animation data.
- In Unity editor, set up a version of the character to act out your animation.
- Play back the video in play mode and use one of the included scripts, UdonPlayer or MotionPlayer, to animate your character.
- Use the AnimationRecorder tool to record that movement into a .anim file, which is the final animation.
- Edit the animation. AnimationRecorder runs slowly for me, which makes the character's movement slow, so I have to either edit the animation file or increase the playback speed in the animator controller.
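That playback-speed fix in the last step can also be done with a quick script. Here's a minimal sketch in Python that scales the keyframe `time:` values inside a .anim file, which is plain YAML text. The file names are placeholders for your own recording, and it assumes the only lowercase `time:` fields in the file are keyframe times.

```python
import re

def rescale_anim_times(anim_text: str, speed: float) -> str:
    """Divide every keyframe 'time:' value by `speed`.

    Unity .anim files are YAML text where each keyframe has a line
    like '      time: 0.5'. Speeding playback up 2x means halving
    every time value.
    """
    def shrink(match: re.Match) -> str:
        t = float(match.group(2))
        return f"{match.group(1)}{t / speed:g}"

    return re.sub(r"(\btime:\s*)(-?\d+(?:\.\d+)?)", shrink, anim_text)

# Tiny demo on a keyframe-like snippet:
sample = "      time: 1.5\n      value: {x: 0, y: 0.2, z: 0}\n"
print(rescale_anim_times(sample, speed=2.0))
```

To use it on a real recording, read your .anim file, run it through `rescale_anim_times`, and write the result to a new file (keep a backup; a stray lowercase `time:` field outside the curves would get rescaled too).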
The resulting animation looks pretty decent. But … it's so complicated! There are also two separate recording steps, which can introduce timing or smoothness issues if your computer's performance varies while recording, on top of the limitations of transmitting data over video.
So I've been wondering about other tools, or options that support face tracking.
ShaderMotion can record lip sync and blinking. It seems to support 8 facial expressions (following the "circumplex model") that are each mapped to an individual blend shape. But my test avatar uses multiple blend shapes for different parts of the face, so I wasn't able to record facial expressions.
Also, if you want to edit in other software, you may want something besides a .anim file, or access to Unity's import settings for model animations, which are only available for model files. In that case, you can use a tool like Animation Converter to get a .fbx file.
ShaderMotion is a good choice if you want to record animations for any VRChat world. The ShaderMotion filming tutorial by Minkis shows how you can even use it to record dance videos with multiple recordings of a dancer.
MotionCaptureLabo
There's a VRChat world for recording animation called MotionCaptureLabo. You're given an avatar to record with, and the tool exports the animation as text that you copy. Then you paste the text into a Unity tool called MotionDecoder, which converts it into an animation.
It's fun, but a bit limited. The animation suffers from a low frame rate, and it only gets choppier if you raise the frame rate. VRChat limits the number of characters in a text field, which means only so much data can be transferred out of the game by copying text.
HyakuashiUdonMotionRecorder
You can record animations in your own VRChat world using prefabs from HyakuashiUdonMotionRecorder (HUMR). Place the prefabs in your VRChat world project, and while playing, they'll export animation data into VRChat's log files. Then you use a Unity tool to convert the logs into animation files.
It only supports body movement and not blend shapes. So no facial expression or lip sync. However, the body and hand recording is fantastic quality!
You can make animations that interact with the environment in your VRChat world. And it may be the best option for dance recording because of the quality and because it uses VRChat's character movement (like inverse kinematics). Characters move and feel different outside of VRChat, so recording in-game keeps that feeling.
Motion Capture Software
Virtual Motion Recorder
For Unity animations, you can use an app called Virtual Motion Recorder. It exports .anim files directly.
The app uses VRM, a file format for humanoid models. So if you need facial expressions in your animation, you should convert your VRChat avatar to VRM. Otherwise, it includes preset models for recording body movement.
The animation quality is great! I didn't find a way to change the controls for expressions, which made acting a bit tough, but otherwise it was fine.
Mocap Fusion [VR]
Mocap Fusion is a professional app for motion capture using VR hardware. It supports tons of options for trackers, sensors, finger tracking, eye tracking, and face tracking.
Lots of capability! Most VRChat avatars, environments, and props can be converted for use in-app (they have video tutorials), so it has plenty of flexibility to record different scenarios.
I usually use Vive wands and didn't find a way to customize hand poses, so I'd recommend other controllers for hand movement.
Face puppeteering is important to me too, and it has a good setup. You can bind emotes to joysticks or the trackpad, or bind trigger buttons to sets of blend shapes.
Multi-Tool Setups
Virtual Motion Capture
Ok, so Virtual Motion Capture (VMC) is software for controlling a character using VR hardware. It's often used for VTubing. It uses VRM characters and supports lip sync, face tracking, and gesture-based facial expressions.
The cool thing is: the paid version on Patreon or Fanbox supports sending and receiving motion data to and from other tools using the VMC protocol.
So even though VMC doesn't record animation by itself, there are tools that support the VMC protocol that do!
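For anyone curious what's actually on the wire, the VMC protocol is just OSC messages sent over UDP. Here's a minimal sketch of encoding one bone-transform message using only Python's standard library; the bone name and values are placeholders.

```python
import struct

def _osc_string(s: str) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    b = s.encode("utf-8") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def vmc_bone_message(bone: str, pos, rot) -> bytes:
    """Encode a VMC protocol bone transform as a raw OSC message.

    Per the VMC protocol spec, /VMC/Ext/Bone/Pos carries a bone name,
    a position (x, y, z), and a rotation quaternion (x, y, z, w),
    all as big-endian 32-bit floats.
    """
    return (
        _osc_string("/VMC/Ext/Bone/Pos")
        + _osc_string(",sfffffff")   # OSC type tags: 1 string, 7 floats
        + _osc_string(bone)
        + struct.pack(">7f", *pos, *rot)
    )

# Placeholder pose: head at 1.5 m up, identity rotation.
msg = vmc_bone_message("Head", (0.0, 1.5, 0.0), (0.0, 0.0, 0.0, 1.0))

# To actually send it (VMC's default receiving port is 39539):
# import socket
# socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("127.0.0.1", 39539))
```

In practice you'd just let the tools below speak the protocol for you, but it helps to know it's plain OSC when debugging a connection.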
EasyVirtualMotionCaptureForUnity
Using EasyVirtualMotionCaptureForUnity (EVMC4U), you can directly control a character in a Unity app with VMC, and record that movement into .anim files with a separate tool called EasyMotionRecorder.
So with these three tools together, you can record! But the setup was a little confusing to learn.
The quality I got in my test was similar to ShaderMotion: it speeds up and slows down slightly, which may have been caused by the load on my computer.
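If a recording drifts like that, one generic cleanup (not a feature of EVMC4U itself, as far as I know) is to resample the keyframes onto a uniform time grid. A minimal Python sketch with linear interpolation:

```python
def resample_uniform(samples, fps=30.0):
    """Resample irregularly-timed (time, value) pairs onto a fixed
    frame rate using linear interpolation.

    Recording under variable system load can leave keyframes spaced
    unevenly; resampling onto a uniform grid evens the timing back
    out. `samples` must be sorted by time and non-empty.
    """
    times = [t for t, _ in samples]
    out = []
    step = 1.0 / fps
    t = times[0]
    i = 0
    while t <= times[-1] + 1e-9:
        # Advance to the segment containing t.
        while i + 1 < len(times) and times[i + 1] < t:
            i += 1
        t0, v0 = samples[i]
        t1, v1 = samples[min(i + 1, len(samples) - 1)]
        if t1 == t0:
            v = v0  # Degenerate segment: hold the value.
        else:
            a = (t - t0) / (t1 - t0)
            v = v0 + a * (v1 - v0)
        out.append((t, v))
        t += step
    return out

# Two keyframes a second apart, resampled to 2 fps:
frames = resample_uniform([(0.0, 0.0), (1.0, 2.0)], fps=2.0)
```

Real curves hold vectors or quaternions per bone rather than a single float (quaternions want spherical interpolation), but the timing fix works the same way per channel.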
Blender addon for VMCProtocol
Blender addon for VMCProtocol (VMC4B) is similar to EVMC4U, but controls a character in a Blender project using VMC motion. This option is cool because it might be the most flexible for VRM characters: you could render videos directly in Blender, or export your character to any major game engine!
Also I had so much fun lol
So What Am I Using?
I can totally see myself using Mocap Fusion and HUMR more. I'm also recording at least one other person, and with HUMR they wouldn't need any setup or extra programs; they can just record and send me the log files. Maybe use VMC4B for videos, or if I need some fast record-to-edit iteration time? I'll see!!
Finally, here's a lil side by side comparison of the same animation using each tool.
Extra Apps
I wasn't able to cover Glycon, which is unique for having a Quest standalone version.
Also, there's a way to record animations in the VTubing app Warudo. However, I couldn't get exported .fbx files to import into Blender. But the feature is mostly meant for playback inside the app anyway, so that's understandable!