Motion Transfer

The Motion Transfer node (internally called “Animation”) transfers motion from a reference video onto a character image. The character in the image mimics the movements from the reference video.

Inputs

Handle ID  Data Type  Label
image-in   Image      Character
video-in   Video      Reference

Outputs

Handle ID  Data Type  Label
video-out  Video      Video

Available engines

Engine ID          Label                     Cost (per second)
kling-2.6-motion   Kling 2.6 Motion Control  $0.08 (standard) / $0.12 (pro)
fal-dreamactor-v2  DreamActor v2             $0.09

How it works

  1. Provide a character image — the person or character to animate
  2. Provide a reference video — the motion source (e.g., a dancing TikTok, a person talking)
  3. The AI maps the reference motion onto the character
  4. The output is a video of your character performing the reference motion
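The flow above can be sketched as a small node graph, using the handle IDs from the Inputs and Outputs tables. The `Node` class and `connect` method here are illustrative assumptions, not the product's actual SDK; the handle IDs (`image-in`, `video-in`, `video-out`) and the internal node type ("animation") are from this page.

```python
# Hypothetical sketch of wiring the Motion Transfer node.
# The Node/connect API is an assumption; handle IDs match the tables above.

class Node:
    def __init__(self, node_type, **params):
        self.node_type = node_type
        self.params = params
        self.inputs = {}  # input handle ID -> (upstream node, upstream handle ID)

    def connect(self, handle, source, source_handle):
        self.inputs[handle] = (source, source_handle)
        return self

# Upstream media nodes (node type "media" and file paths are assumptions)
character = Node("media", path="character.png")        # the character image
reference = Node("media", path="dance_reference.mp4")  # the motion source

# The Motion Transfer node (internally "Animation"), engine per the table above
motion = Node("animation", engine="kling-2.6-motion", tier="standard")
motion.connect("image-in", character, "media-out")  # Character input
motion.connect("video-in", reference, "media-out")  # Reference input

# A downstream node would read the resulting video from the "video-out" handle
```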

Advanced options

Parameter  Options         Default
Tier       Standard / Pro  Standard
Seed       Any integer     Random

The tier setting applies only to Kling 2.6 Motion Control; DreamActor v2 has no advanced options and uses a flat per-second rate.

Credit cost

Engine            5s Standard  5s Pro
Kling 2.6 Motion  4 credits    6 credits
DreamActor v2     5 credits    —
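The credit figures above follow from the per-second rates, assuming 1 credit = $0.10 and that partial credits round up (both inferred from the tables, not stated on this page):

```python
# Sketch of the credit math implied by the pricing tables above.
# Assumptions: 1 credit = $0.10; partial credits round up.
# Rates are kept as integer cents to avoid floating-point rounding.

RATES_CENTS = {
    ("kling-2.6-motion", "standard"): 8,   # $0.08/s
    ("kling-2.6-motion", "pro"): 12,       # $0.12/s
    ("fal-dreamactor-v2", None): 9,        # $0.09/s, flat rate (no tiers)
}

def credits(engine, tier, seconds, cents_per_credit=10):
    """Credit cost for `seconds` of output, rounding partial credits up."""
    total_cents = RATES_CENTS[(engine, tier)] * seconds
    return -(-total_cents // cents_per_credit)  # ceiling division

print(credits("kling-2.6-motion", "standard", 5))  # 4
print(credits("kling-2.6-motion", "pro", 5))       # 6
print(credits("fal-dreamactor-v2", None, 5))       # 5 ($0.45 rounds up)
```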

Tips

  • Use a clear, front-facing character image for best results
  • The reference video should have clear, visible body movements
  • DreamActor v2 specializes in facial animation and body movement
  • Combine with TikTok Scraper to use trending motions as references
  • Use Fixed Context Media to keep the same character across multiple videos

Example use cases

  • Making an AI-generated character dance to a TikTok trend
  • Animating a brand mascot with natural body movements
  • Creating talking-head videos with AI characters